Data Migration from Legacy to Cloud

Category: IT development and operations

Our client decided to replace their legacy HCM application (PeopleSoft) with HCM Cloud. For many companies nowadays there is a business case to replace legacy applications with a new application in the cloud.

One of the main reasons is that legacy systems were designed to be modified by IT, and over the years this has been done extensively. At the same time, the software supplier (in our case Oracle) releases regular updates and upgrades to its product. It takes more and more time and effort to keep pace with the supplier's updates and upgrades while also maintaining the whole burden of customizations.

Stuck with PeopleSoft 8.0

The customizations have to be coded again and again with every update and upgrade. Clients either choose to repeat this effort over and over, spending a lot of time and money in the process, or they choose to skip the supplier's updates and upgrades so that their customizations are not overwritten. Our client had chosen the latter strategy, and so they were stuck with PeopleSoft 8.0, which is by now quite outdated.

Reasons to move to the cloud

Furthermore, it is very hard for the business to stay in control of all processes if IT is making so many changes. Often, when something is changed in one part of the application, a completely different part suddenly stops working. These are all reasons to move to the cloud: the supplier is in charge of the whole application and can be held accountable if something is not working.

While customizations are strictly limited in the cloud, a lot of configuration is possible to meet the client's specific requirements. There are also quite a few localizations: country-specific setup to comply with local legislation. Last but not least, a cloud application has the same version for all clients, and updates and upgrades are rolled out to all sites. This means you share all developments in the cloud with many other clients, which is why a cloud application is more cost-effective than a custom application.

Uploading data

I was hired in the role of HCM Data Analyst to map the data in PeopleSoft to where it should land in the cloud data model. We did this with a team of four people, working at the intersection of HCM process knowledge and IT. There was also a team responsible for uploading the data into the cloud itself. The chosen cloud application, Workday, offers two tools for uploading data: the iLoad and the EIB. The iLoad is typically used for the initial load and was what we used most of the time; data is uploaded from an Excel file in a format specified by the supplier. The EIB is a tool that uploads an Excel file in a certain format to a web service, and it can be used for daily data loads from spreadsheets if needed.

Sometimes the spreadsheet for an iLoad has a complex structure, meaning it is not always row-based. There can be a sequence of rows with different contexts: data in a certain row lands in a different place on the screen than data in another row in the same column. The supplier therefore requires an ETL tool to create the Excel sheets that will be loaded into the cloud. As our ETL tool we used Pentaho from Hitachi.
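To make the "sequence of rows with different contexts" concrete, here is a minimal sketch in Python. The row layout, record-type markers and field names are hypothetical simplifications for illustration, not Workday's actual template format:

```python
# Hypothetical simplification of a non-row-based load sheet: the first
# cell of each row says which record type the row carries, so the same
# column means different things depending on that context.
rows = [
    ["WORKER",  "E1001", "Rossi, Maria"],        # column 3 = worker name
    ["ADDRESS", "HOME",  "Via Roma 1, Milano"],  # column 3 = address line
    ["ADDRESS", "WORK",  "Corso Italia 5, Milano"],
    ["WORKER",  "E1002", "Bianchi, Luca"],
    ["ADDRESS", "HOME",  "Via Dante 9, Torino"],
]

def group_by_worker(rows):
    """Attach each ADDRESS row to the most recent WORKER row above it."""
    workers = []
    for row_type, key, value in rows:
        if row_type == "WORKER":
            workers.append({"emplid": key, "name": value, "addresses": {}})
        elif row_type == "ADDRESS":
            workers[-1]["addresses"][key] = value
    return workers

workers = group_by_worker(rows)
```

Because the meaning of a row depends on the rows above it, a plain row-by-row export cannot produce such a sheet, which is exactly why an ETL tool was required.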


In Pentaho you can include steps that contain SQL, steps that contain JavaScript, and steps that contain mapping/translation tables in Excel. We used the last type, for instance, for addresses in Italy. In the target setup an Italian address needs a province, while in PeopleSoft this field is usually empty. From the city, however, it is possible to derive the province. We therefore created an Excel table with all the Italian cities in scope and their provinces, so this step worked as a lookup for the provinces. Because I also know SQL and JavaScript, I could help with these steps when needed, to make sure the functional requirements were met in the Pentaho transformations.
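The lookup logic of that Pentaho step can be sketched in plain Python. The city/province pairs below are a few illustrative samples, not the actual in-scope list, and the field names are assumptions:

```python
# Sketch of the Excel mapping/translation step: a city-to-province
# lookup used to fill in the province missing from the source data.
# Sample pairs only; the real table covered all in-scope Italian cities.
CITY_TO_PROVINCE = {
    "Milano":  "MI",
    "Torino":  "TO",
    "Napoli":  "NA",
    "Firenze": "FI",
}

def fill_province(address):
    """Derive a missing province from the city, leaving filled values alone."""
    if not address.get("province"):
        address["province"] = CITY_TO_PROVINCE.get(address["city"])
    return address

fill_province({"city": "Milano", "province": None})  # province becomes "MI"
```

In Pentaho this same mapping lived in an Excel sheet wired into the transformation as a lookup step, which kept the translation table editable by non-developers.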

Although Pentaho can handle all sorts of input, we decided to use tables in a SQL Server database. For connectivity we used JDBC, which was not only the recommended option but also proved much faster and more stable than ODBC.

Filtering in SQL Server

We decided to work with complete dumps of the PeopleSoft tables, loading them from PeopleSoft production into the SQL Server database. In SQL Server we did the filtering, because for most tables history was out of scope for the data migration project. We also did some of the conversions in SQL Server already, because we didn't want the queries in Pentaho to become unreadably complex. This way, small chunks of conversion code lived in separate SQL scripts and the rest was in Pentaho, cutting the overall complexity into smaller, readable pieces so that we could always trace back what had happened to the data in case of doubt.
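As a hedged illustration of the history filtering, here is the core idea rebuilt in Python rather than SQL. PeopleSoft stores history as effective-dated rows (key plus EFFDT); with history out of scope, only the latest row per employee survives. The sample data is made up:

```python
# Sketch of "drop the history": keep only the most recent
# effective-dated row per employee, as our SQL Server filtering did.
from datetime import date

rows = [
    {"EMPLID": "E1001", "EFFDT": date(2001, 1, 1), "DEPTID": "HR"},
    {"EMPLID": "E1001", "EFFDT": date(2005, 6, 1), "DEPTID": "FIN"},
    {"EMPLID": "E1002", "EFFDT": date(2003, 3, 1), "DEPTID": "IT"},
]

def current_rows_only(rows):
    """Keep only the row with the highest EFFDT per EMPLID."""
    latest = {}
    for row in rows:
        emplid = row["EMPLID"]
        if emplid not in latest or row["EFFDT"] > latest[emplid]["EFFDT"]:
            latest[emplid] = row
    return list(latest.values())
```

In SQL Server the same effect is typically achieved with a window function or a correlated `MAX(EFFDT)` subquery; isolating it in its own script kept the downstream Pentaho transformations simple.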

6 prototypes and many compliments

We worked with six prototypes. In each prototype we did the complete functional setup (done by the Functional Implementation Team) and the data migration, and afterwards we held validation sessions with the business to see whether everything was in place or whether parts of the data migration, setup or eligibility needed tweaking. In the end we received many compliments from the business for the quality of the data migration. We did a great job!

About the author: Stefan van Glabbeek

Oracle Fusion Apps HCM functional consultant at Qualogy. Also a senior functional and technical PeopleSoft HCM consultant with 14 years of experience. Over 18 years of experience in IT.
