DATA MIGRATION: the waves to success
Data is the insurance industry's lifeblood, powering everything from underwriting to claims processing. As insurers modernize their operations and adopt new core insurance platforms, data migration has become an essential step in simplifying their application landscape. Migrating data from legacy systems to new platforms can be complex, risky, and lengthy, with potential impacts on business operations, customer service, and compliance. In this post, we'll discuss best practices and our approach for managing the process to ensure a controlled and successful outcome. Whether you're a technology leader at an insurance company, a consultant, or a vendor, this post will provide valuable insights into this critical exercise. Without a successful data migration, the benefits of application simplification simply cannot be realized.
We developed a framework that will guide any project along its migration trajectory. As the title suggests, these are the waves to success.
The framework contains two main waves: the first covers mapping, transforming, and loading the data; the second covers all the testing and reconciliation. Let's start with a deep dive into the first cycle: the Migration Sprints.
We deliberately chose a sprint-based approach, not because it is fashionable, but because the data itself dictates the requirements: the sooner you discover what kind of data exists, the better you can adapt to it while transforming the data towards the target system.
Our approach follows this reasoning throughout the cycles and the lifetime of the project.
We start off with the Mapping Designs and Transformations. In this phase, the team works together to profile the data and work out how the target fields should be filled in with the source data. Missing or incorrect data is also a crucial factor, and the earlier it is discovered, the better.
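To make the profiling step concrete, here is a minimal sketch. The field names, the sample records, and the "missing means None or empty string" convention are assumptions for illustration, not part of our actual tooling.

```python
# Minimal data-profiling sketch: for each source field, count how many
# records are missing a value and how many distinct values exist --
# the kind of summary a mapping workshop starts from.

def profile(records, fields):
    """Return {field: (missing_count, distinct_count)} for the given records."""
    summary = {}
    for field in fields:
        values = [r.get(field) for r in records]
        missing = sum(1 for v in values if v in (None, ""))
        distinct = len({v for v in values if v not in (None, "")})
        summary[field] = (missing, distinct)
    return summary

# Hypothetical source extract
source_policies = [
    {"policy_no": "P-001", "premium": 120.0, "postcode": "1000"},
    {"policy_no": "P-002", "premium": None,  "postcode": "1000"},
    {"policy_no": "P-003", "premium": 95.5,  "postcode": ""},
]

print(profile(source_policies, ["policy_no", "premium", "postcode"]))
# premium is missing once; postcode is missing once and has one distinct value
```

A summary like this immediately surfaces the missing and suspect data the paragraph above talks about, before any mapping rule is written.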
After the rules have been validated with the SMEs, we use a simulation (a loader) to mimic the actual loading: the trial run. During the trial run, data quality scripts filter out unnecessary data and apply the data transformation logic. The data then goes through the loader (which contains the essential business logic of the target system), but nothing is inserted into the target database. This keeps cycle times very short, which means you can iterate often and incorporate feedback loops.
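The trial run described above can be sketched as a small pipeline. Everything here is illustrative: the quality rules, the field mapping, and the loader's business-rule check are stand-ins, not the real target system's logic.

```python
# Trial-run sketch: quality filters and transformations run for real,
# but the "loader" only validates business rules -- it commits nothing
# to the target database.

def quality_filter(records):
    """Drop records that fail basic data-quality checks (illustrative rules)."""
    return [r for r in records if r.get("policy_no") and r.get("premium") is not None]

def transform(record):
    """Map a source record onto the (hypothetical) target schema."""
    return {
        "PolicyNumber": record["policy_no"].strip().upper(),
        "AnnualPremium": round(float(record["premium"]), 2),
    }

def trial_load(records):
    """Run loader validations; count successes, commit nothing."""
    loaded, rejected = 0, []
    for r in records:
        if r["AnnualPremium"] > 0:      # stand-in for target business logic
            loaded += 1
        else:
            rejected.append(r["PolicyNumber"])
    return loaded, rejected

source = [
    {"policy_no": "p-001", "premium": "120.00"},
    {"policy_no": "",      "premium": "50.00"},   # fails quality check
    {"policy_no": "p-003", "premium": "0"},       # rejected by loader rule
]
clean = quality_filter(source)
loaded, rejected = trial_load([transform(r) for r in clean])
print(loaded, rejected)   # 1 ['P-003']
```

Because no database write happens, a run like this can be repeated many times per sprint, which is exactly what makes the feedback loop fast.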
To see how well we are performing, we keep track of some Key Performance Indicators (KPIs). As mentioned earlier, we hold regular mapping workshops. Sometimes we set default values for more complex fields, but overall, we track the Mapping Design Ratio. This KPI shows the progress of mapping source data to target data.
Another important KPI is the Load Ratio. Here we compare what we put through the loader to what comes out: in other words, how much has been successfully loaded?
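As a rough sketch, the two KPIs can be computed as simple ratios. The formulas below are our reading of them for illustration (mapped target fields over all target fields, and successfully loaded records over records attempted); the field names are hypothetical.

```python
# Sketch of the two KPIs: Mapping Design Ratio and Load Ratio.

def mapping_design_ratio(target_fields, mapped_fields):
    """Share of target fields that already have a validated mapping."""
    return len(set(mapped_fields) & set(target_fields)) / len(set(target_fields))

def load_ratio(records_sent, records_loaded):
    """Share of records that made it through the loader successfully."""
    return records_loaded / records_sent if records_sent else 0.0

target = ["PolicyNumber", "AnnualPremium", "StartDate", "RiskClass"]
mapped = ["PolicyNumber", "AnnualPremium", "StartDate"]

print(f"Mapping Design Ratio: {mapping_design_ratio(target, mapped):.0%}")  # 75%
print(f"Load Ratio: {load_ratio(10_000, 9_600):.0%}")                       # 96%
```

Tracked sprint over sprint, both ratios should trend towards 100% as mappings mature and loader rejections are resolved.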
Cycle 2: Testing & Reconciliation
Once the mappings have been fine-tuned and the figures are looking great, it is time to begin one of the most important phases of the migration: testing & reconciliation. Just because the data has landed in the right place doesn't necessarily mean that the logic behind it works as it should.
What is data migration testing, actually? It is a process that ensures the accuracy, completeness, and consistency of data during the migration. It covers the entire chain: data extraction, data transformation, and data loading. The goal is to verify that the data has been migrated correctly and is available for use in the new system. There are many types of testing to be done, all equally important. The figure below gives a high-level overview of the types of testing.
Data Migration Testing
Within this category, we distinguish 2 types of testing:
- Static Testing
This type is purely visual. The goal is to check whether data in the source system is identical in the target system (according to the rules defined in a mapping workshop). Among the things we check are premiums, names, addresses, and critical data used for the calculation of premiums & reserves.
- Dynamic Testing
Also known as functional testing, here we aim to ensure that data can be managed in the target application: can we create an amendment on this policy, and will it generate the expected premium, for example? It is important to note that functional testing also covers regression.
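The static-testing comparison can be automated rather than done purely by eye. Here is a minimal sketch: the record key, the field names, and the sample data are assumptions for the example.

```python
# Static-testing sketch: compare critical fields record by record between
# source and target, and report every mismatch for investigation.

def static_compare(source, target, key, fields):
    """Return a list of (key, field, source_value, target_value) mismatches."""
    target_by_key = {r[key]: r for r in target}
    mismatches = []
    for s in source:
        t = target_by_key.get(s[key])
        if t is None:
            mismatches.append((s[key], "<record missing>", None, None))
            continue
        for f in fields:
            if s.get(f) != t.get(f):
                mismatches.append((s[key], f, s.get(f), t.get(f)))
    return mismatches

src = [{"policy_no": "P-001", "premium": 120.0, "name": "Jans"},
       {"policy_no": "P-002", "premium": 80.0,  "name": "Peeters"}]
tgt = [{"policy_no": "P-001", "premium": 120.0, "name": "Jans"},
       {"policy_no": "P-002", "premium": 85.0,  "name": "Peeters"}]

print(static_compare(src, tgt, "policy_no", ["premium", "name"]))
# [('P-002', 'premium', 80.0, 85.0)]
```

An empty result means the compared fields match the mapping rules; anything else is a finding for the next sprint.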
Application Testing on Migrated Data
Application Testing is run after the initial Data Migration Testing has been done. Within this category, we have two types of testing:
- System Integration Testing (SIT)
It is worth noting that SITs are performed before User Acceptance Testing. The main objective of SITs is to identify and resolve any defects or issues that may arise during the integration of various system components. It involves testing the interactions and interfaces between different modules to ensure that they work seamlessly together and meet the requirements of the system.
- User Acceptance Testing (UAT)
UATs are typically conducted after the completion of system integration testing (SIT), where the entire system is tested in a simulated production environment. The purpose of UATs is to ensure that the software is user-friendly, meets business needs, and operates as expected in a real-world environment.
The testing process involves creating test cases and scenarios that mimic real-world use cases and user interactions.
Data reconciliation is the process of comparing data in the source and target systems to ensure they are consistent. The process involves verifying the accuracy and completeness of data in both systems, identifying any discrepancies, and resolving them. Data reconciliation is essential because it ensures that the data in the new system is accurate and consistent with the data in the source system.
What if there were no reconciliation? Well, without a proper recon report, data inconsistencies can go unnoticed, leading to errors and to business decisions based on inaccurate data. Data reconciliation ensures that the data in the new system is reliable and accurate, minimizing the risks associated with data migration.
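A recon report can start very small. The sketch below compares record counts and premium totals between source and target; the field name and sample data are hypothetical, and a real report would of course cover many more measures.

```python
# Reconciliation sketch: compare record counts and an amount total between
# source and target, flagging any gap that needs investigation.

def recon_report(source, target, amount_field="premium"):
    """Return {measure: (source_value, target_value, matches)}."""
    src_count, tgt_count = len(source), len(target)
    src_sum = round(sum(r[amount_field] for r in source), 2)
    tgt_sum = round(sum(r[amount_field] for r in target), 2)
    return {
        "count": (src_count, tgt_count, src_count == tgt_count),
        "sum":   (src_sum, tgt_sum, src_sum == tgt_sum),
    }

src = [{"premium": 120.0}, {"premium": 80.0}, {"premium": 95.5}]
tgt = [{"premium": 120.0}, {"premium": 80.0}]   # one record went missing

report = recon_report(src, tgt)
print(report)
# {'count': (3, 2, False), 'sum': (295.5, 200.0, False)}
```

Here the report immediately exposes a missing record and a premium shortfall, exactly the kind of silent inconsistency reconciliation exists to catch.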
Data Reconciliation is crucial for any business, and we're committed to providing valuable insights on this topic. Stay tuned for our upcoming blog post, where we'll dive deep into the best practices for successful reconciliation. Be sure to follow us on LinkedIn so you don't miss out on this opportunity.