How to mitigate common data migration challenges when implementing SAP SuccessFactors Employee Central

By Permanand Singh


 

Data migration is like integration in many ways. Both move data from one system to another, and both are complicated activities that depend heavily on the system configuration. Decisions about project scope, budget, timing and resources are often made without fully understanding their impact on data migration and integration. As a result, any large, global Employee Central project will likely experience multiple data migration challenges. Here are some mitigation approaches for the more common ones:

1) Constantly changing Employee Central configuration

2) Managing and maintaining multiple versions of extraction programs and data files

3) Ownership of issues

4) Inadequate resource allocation, visibility and urgency associated with data migration activities

5) Multiple source and target environments

 

Constantly changing Employee Central configuration

Based on the standard three-iteration configuration process used by SAP SuccessFactors, the system configuration should only change at a few defined points. In reality, the configuration is constantly changing. Even during the brief periods between iteration cycles, when the client is reviewing the configuration, the system may be modified to address critical issues that are preventing the client from proceeding with their validation. Additionally, it is not only the configuration of the employee and foundation structures that impacts data migration, but also the foundation and picklist values, business rules and user permissions.

The mitigation approach here is to use a separate SuccessFactors instance for testing data migration. This ensures a stable environment for validating the extraction files. It also significantly reduces the issues related to configuration changes and makes the development and testing of the data migration (ETL) programs more efficient. You can use an instance clone or instance sync to update the configuration in the migration instance as needed, so that it mirrors the test/dev environment.

Managing and maintaining multiple versions of extraction programs and data files

The overall complexity of data migration depends directly on the number of migration files, the number of data sources and the complexity of the conversion logic. The conversion logic will be impacted by country-specific configurations. If these are significant, you may need a separate file for each country/object in order to complete the data loads effectively. This quickly adds up to a large number of extraction programs and data files. For example, if there are 20 countries and the data is sourced from three different data sources, that alone can result in 60 Job Information programs/files.

To reduce the number of extraction programs, use an ETL tool that supports country-specific conversion logic. This allows you to use one extraction program per object, per data source; within each extraction program, different conversion rules can be applied for each country. Instead of the 60 extraction programs from the example above, you are down to only three. To manage the number of files, use a proper naming convention, or use an ETL tool that can call the SuccessFactors APIs and upload the data directly into the system without the need for individual files.
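The country-specific approach above can be sketched in a few lines: one set of default conversion rules per object, with per-country overrides applied only where a country actually differs. All field names, country codes and rules below are illustrative assumptions, not actual SuccessFactors or client configuration.

```python
from datetime import datetime

# Default conversion rules mapping a source record to (illustrative)
# Employee Central Job Information fields.
DEFAULT_RULES = {
    "userId": lambda r: r["employee_id"],
    # Assume most legacy files use US-style dates.
    "startDate": lambda r: datetime.strptime(r["hire_date"], "%m/%d/%Y")
                                   .strftime("%Y-%m-%d"),
    "employeeClass": lambda r: {"S": "Salaried", "H": "Hourly"}
                               .get(r["emp_type"], "Unknown"),
}

# Country-specific overrides: only the rules that differ per country.
COUNTRY_RULES = {
    "DEU": {  # hypothetical: German source files already use ISO dates
        "startDate": lambda r: r["hire_date"],
    },
    "FRA": {  # hypothetical: French legacy system uses different class codes
        "employeeClass": lambda r: {"C": "Salaried", "O": "Hourly"}
                                   .get(r["emp_type"], "Unknown"),
    },
}

def convert_record(record):
    """Apply the default rules, overridden by the record's country rules."""
    rules = {**DEFAULT_RULES, **COUNTRY_RULES.get(record["country"], {})}
    return {field: rule(record) for field, rule in rules.items()}
```

One program per object/source then simply iterates `convert_record` over the extract; adding a new country means adding an entry to `COUNTRY_RULES`, not a new program.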

Ownership of Issues

There will be a lot of ambiguity about who is responsible for resolving data migration issues, because many issues will not be related to the actual data conversion programs. An issue can be a data quality problem, incomplete foundation or validation data in Employee Central, or even a configuration, permission or business rule issue in EC. Issues will be assigned to the wrong teams and individuals, where they can languish before being escalated as critical.

There are several approaches to addressing this. First, use a RACI chart to explicitly define ownership of key activities, and use a process diagram to communicate the troubleshooting process. For example, all data quality issues can be the client's responsibility, while the EC configuration team owns any configuration issues or missing validation data. Next, ensure that data migration issues are tracked in a log that all relevant personnel can access. This log also needs to be reviewed and prioritized in the same way as the configuration issue log.

Additionally, having a data migration project manager/lead with oversight of the data migration activities, documentation and issue management will also help ensure that the migration team is not bogged down and overwhelmed with items they cannot resolve.
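The RACI-based routing described above can be baked directly into the issue log, so each logged issue is pre-assigned to its owning team instead of landing with whoever happens to see it first. The categories and team names below are assumptions for this sketch; a real project would mirror its own RACI chart.

```python
# Illustrative RACI-style mapping from issue category to owning team.
ISSUE_OWNERS = {
    "data_quality":     "Client data team",
    "configuration":    "EC configuration team",
    "foundation_data":  "EC configuration team",
    "permissions":      "EC configuration team",
    "conversion_logic": "Data migration team",
}

def log_issue(log, issue_id, category, description):
    """Append an issue to the shared log, pre-assigned to its RACI owner.

    Anything that does not match a known category defaults to the data
    migration lead for triage, so no issue sits unowned.
    """
    owner = ISSUE_OWNERS.get(category, "Data migration lead")
    log.append({
        "id": issue_id,
        "category": category,
        "owner": owner,
        "description": description,
        "status": "Open",
    })
```

The point is not the data structure but the rule it encodes: ownership is decided at logging time by category, not negotiated later per issue.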

Inadequate resource allocation, visibility and urgency associated with data migration activities

An EC project cannot go live until data migration is complete. Chances are, however, that when you look at your core EC team, data migration will not be adequately represented or staffed. And even if multiple people are assigned to the data migration team, you need to ensure they are the right people: there is no substitute for team members who understand the client's data and how it will be used in the new EC system.

To avoid data migration being the cause of delays to your project, here are some mitigation steps:

– Ensure that key data migration activities are on the critical path of the overall project plan

– Provide a comprehensive overview of the data migration process and milestones to the client team early in the process, and highlight the risks if data migration is not successful

– Include multiple data migration cycles throughout the implementation, so the focus is not just on the production migration

– Maintain a risk log and provide updates on data migration at the steering committee level

Multiple source and target environments

Most large companies will use multiple SuccessFactors environments (Dev/Test, QA, Parallel, Prod) and will have a similar landscape for their legacy source systems. This creates quite a few challenges for the data migration team. On the SuccessFactors side, the configuration, permissions and foundation data can differ in each environment; on the source systems, there can be similar discrepancies. The end result is that the data migration testing cycles may not accurately reflect what you will experience during the production data migration.

From a logical perspective, it would seem to make sense to pair up the legacy environments with the appropriate SuccessFactors instances and test data migration from one to the other. In reality, this is not efficient. Instead, try to complete most of your data migration testing using production data, or masked production data, from the source system into the designated SuccessFactors data migration instance. Since the latter is a separate instance, you can restrict access to the production data to only a few users. This way, by the time data migration is done in the Parallel and Prod environments, most of the issues, whether data quality or data transformation, will already have been addressed.
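When masked production data is used, the masking should be deterministic so that the same employee masks to the same value across every file, and joins between objects still work. A minimal sketch, using a salted hash as the pseudonym (the field names and salt are assumptions; real projects should manage the salt as a secret and follow their own data protection rules):

```python
import hashlib

# Fields to mask before loading production data into the migration
# instance. Which fields count as sensitive is project-specific.
SENSITIVE_FIELDS = {"last_name", "national_id", "bank_account"}

def mask_value(value, salt="project-secret"):
    """Replace a value with a stable pseudonym.

    The same input always produces the same output, so records for one
    employee stay linkable across Job Information, Compensation, etc.
    """
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return "MASK-" + digest[:10].upper()

def mask_record(record):
    """Mask only the sensitive fields, leaving keys and join fields intact."""
    return {k: (mask_value(v) if k in SENSITIVE_FIELDS else v)
            for k, v in record.items()}
```

Because the pseudonyms are stable, the masked data still exercises the same volumes, duplicates and edge cases as production, which is exactly what the migration test cycles need.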

Conclusion

Utilizing the above mitigation approaches will improve your chances of a timely go-live. However, since each project is unique, this is by no means an exhaustive list; this blog only covers some of the more common challenges that apply to most projects. If you have experienced other data migration challenges, please feel free to share the impact and how you resolved them.

About the author

Permanand Singh