Data migration in IT transformation projects 101


The solutions engineer’s perspective on the complicated, time-consuming, and challenging process called data migration.

Data migration is a key step in IT transformation projects. Without proper planning, it is also highly likely to veer off course and need correction.

So, what is data migration? Essentially, it’s the process of transferring data from one system to another. In reality, it’s a complicated, time-consuming, and challenging process that requires careful planning and precise execution – even more so when dealing with large amounts of data, complex and interdependent data structures, and data that underpins business processes.

Why is the success of data migration so important?

 Before we go into “how”, let’s talk about “why” for a moment. When moving to a new system, a lot of focus is given to the selection and design of the new system, but data migration is perceived as an ad hoc activity that “the IT team just needs to get through so we can finally leverage the capabilities of our new system”.  

Thinking like this is a costly mistake.

Imagine moving to a new house. You’ve spent months selecting it, planning, and designing the interior, and then booked a truck to move your furniture. When the truck arrives, you realize that with the amount of stuff you have and the size of the truck, it will take five trips to move everything. So now it will take five days instead of one to move everything.  

You then find that some of your furniture does not fit through the door of the new place, some does not fit the room size, and some items got broken during transportation and need to be fixed or replaced. Eventually, you end up having to rent your existing place for another month at a higher price because you could not complete the move on time – and it will take a few months for you to unclutter and organize things in your new place.  

You get the idea. 

Migration is the foundation of transformations that involve a transition to a new system or platform. Organizations need to get it right to unlock the full potential of the new system and achieve their goals, because a system is only as good as the quality of the data in it.


Staging the data migration situation

An example of a transformation project that involves migration is the transition from legacy OSS/BSS systems to a new digital commerce and subscription management platform like CloudBlue.

This change would enable a digital marketplace and a subscription consumption model, automate fulfilment, and expand the portfolio by offering more products.

So, you need to migrate existing accounts and subscriptions to a new system. Where do you start? What challenges should you anticipate? 

The data migration process consists of four major phases: 

  1. Scoping and planning: the most important phase of migration. It involves understanding what exactly you need to migrate, discovering technical and business requirements and constraints, choosing the right strategy, researching possible data migration software, tools, and solutions, and aligning on strategy and execution with stakeholders.
  2. Execution: its success depends heavily on the first phase, because good planning and an appropriate solution help the migration run without failures.
  3. Verification: ensure that everything was done according to plan by running verification tests.
  4. Cutover: in this phase, you switch to the new system. 

Depending on the data migration strategy, phases 2-4 can be performed once, or they can form an iterative process, with these phases repeated for each migration batch.
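When the process is iterative, phases 2-4 effectively become a loop over migration batches, with cutover of each batch gated on its verification. Here is a schematic sketch in Python; the three phase functions are placeholders for whatever your tooling actually does:

# Schematic view of an iterative migration: execute, verify, and cut over batch by batch.
# `execute`, `verify`, and `cutover` are placeholders for your actual tooling.
def run_iterative_migration(batches, execute, verify, cutover):
    for batch in batches:
        execute(batch)            # phase 2: move this batch of objects
        if not verify(batch):     # phase 3: run verification tests for the batch
            raise RuntimeError(f"verification failed for batch {batch}; stop and investigate")
        cutover(batch)            # phase 4: switch management of these objects to the new system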

Planning the data migration

The first questions you would usually ask when planning a data migration are “What do we need to migrate?” and “How many objects?”. From there, expand them (a small scoping sketch follows the list):

  • As you are migrating customer accounts that belong to resellers, do you need to migrate the resellers as well? Or will new resellers be created in the new system? 
  • Will you migrate all subscriptions or only subscriptions for specific products? 
  • Will you migrate only active accounts/subscriptions or inactive/disabled ones too?
  • If you need to migrate inactive accounts/subscriptions, will you migrate all of them or just the ones that have been inactive for a certain period?  
  • Are there any other dependent objects you need to migrate together with accounts, such as past orders, invoices, terms, conditions, acceptance history, etc?  
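As a rough illustration, an early scoping report can be as simple as counting source objects by type and status before any mapping work starts. The sketch below assumes a hypothetical source-system client that exposes list_accounts() and list_subscriptions() methods; the real API calls will differ:

# Minimal scoping sketch: count source objects by type and status.
# `source` and its list_* methods are hypothetical placeholders for whatever
# API or database access the legacy system actually offers.
from collections import Counter

def scoping_report(source):
    counts = Counter()
    for account in source.list_accounts():
        counts[("account", account["status"])] += 1
    for subscription in source.list_subscriptions():
        counts[("subscription", subscription["status"])] += 1
    return counts

The resulting counts feed directly into scope decisions, for example whether terminated subscriptions stay behind or migrate anyway for reporting and audit purposes.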

To ensure you don’t miss anything regarding types of objects, their properties, and dependencies, use multiple data sources: look at the existing and new systems’ design documents, UI, data models, API specifications, and database schemas.


Data mapping, validation, and transformation

This is where you map entities that you need to migrate from the old system (accounts, subscriptions, users, user roles, etc) to ones in the new system.  

And this is not just about mapping objects; it’s also about mapping properties. Some, like the account’s company name, contact details, and address, will be straightforward. But what about properties that have no direct counterpart in the new system? Or what if the new system has a mandatory property that the old system lacks – what logic should you use to decide what value to set for it?

Data consistency checks and validation are also an important part of this process. If you think that all the data in the system you’ve been using for the past five years is accurate and consistent, you’re in for a surprise. Old bugs, accounts created before validation checks were added, and workarounds involving direct data manipulation that bypassed validation checks are just a few of the reasons for “broken” data in the system.

It’s much easier to find and fix such data inconsistencies in advance in the old system, or as part of data transformation before or during migration, than in the middle of it. Even worse is discovering these issues weeks or months after the migration is done, once they have already caused problems in the new system.
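To make this concrete, mapping, validation, and transformation rules are easiest to manage as small, testable functions rather than ad hoc scripts. The sketch below is a simplified illustration with made-up field names (the legacy cust_name, the new system’s mandatory segment); your actual mapping will come from the design documents and API specifications mentioned earlier:

# Simplified mapping/validation sketch; all field names are hypothetical.
def map_account(old):
    """Map a legacy account record to the new system's format, collecting data issues."""
    errors = []
    if not old.get("cust_name"):
        errors.append("missing company name")   # "broken" data in the source system
    if not old.get("country"):
        errors.append("missing country")        # mandatory in the new system
    new = {
        "name": (old.get("cust_name") or "").strip(),
        "address": {"country": old.get("country"), "city": old.get("city")},
        # Property with no value in the old system: the default here is a
        # business decision agreed with stakeholders, not a technical one.
        "segment": old.get("segment", "SMB"),
    }
    return new, errors

record, errors = map_account({"cust_name": " Acme Ltd ", "country": "NL", "city": "Amsterdam"})
if errors:
    print("fix before migration:", errors)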

Data migration approach 101

This is where you will decide what high-level approach will be used for data migration: 

  1. Bulk export data from the source system, copy the exported data to the target environment, apply data transformations, and import the data into the new system.
  2. Use data migration tools that process each object one by one by retrieving the object and its properties from the source system API, applying transformation rules, and creating an object in the new system using its API. 
  3. Combination of #1 and #2. 

The choice will depend on the migration requirements and the capabilities of the old and new systems.
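As an illustration of approach #2, the per-object flow usually boils down to a read-transform-write loop with per-object error handling, so that one broken record does not stop the whole batch. The source and target clients and their methods below are hypothetical stand-ins for the real APIs:

# Sketch of approach #2: migrate object by object through the two systems' APIs.
# `source`, `target`, and their methods are hypothetical placeholders.
def migrate_accounts(source, target, account_ids, transform):
    results = {"migrated": [], "failed": []}
    for account_id in account_ids:
        try:
            old = source.get_account(account_id)      # read from the legacy API
            new, errors = transform(old)              # apply mapping/validation rules
            if errors:
                raise ValueError(f"validation failed: {errors}")
            created = target.create_account(new)      # write through the new system's API
            results["migrated"].append((account_id, created["id"]))
        except Exception as exc:
            # Record the failure and keep going; failed objects are retried or
            # rolled back later according to the rollback plan.
            results["failed"].append((account_id, str(exc)))
    return results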


Data co-existence and cutover processes 

A large migration can take days, weeks, or even months to complete. This means there will be a period when some objects are managed by the old system and some by the new one. Planning co-existence and cutover is crucial for the success of the migration. You need to make sure that the migration design and procedures address the following:

  • Will it be one-off migration where everything is migrated in one shot and the old system is shut down, or will you do it in batches? Will you need to move the management of accounts and subscriptions from the old system to the new system batch by batch?
  • Once you migrate an object, do you want to make it possible to manage this object from the old system and synchronize these changes to the new system? Or do you want to make sure that once an object is migrated, the old system cannot change it, and all changes can only be done from the new system?
  • If the decision is that once an object is migrated, it should not be changed from the old system – how do you achieve this? Just sending an email asking customers to log in to the new system is not enough; some customers will miss it. If possible, this should be enforced as a restriction at the technical level: for example, by marking all users of the account as inactive so that no one can log in and make changes from the UI (see the sketch after this list).
  • For systems that handle account/subscription billing, make sure the migration aligns dates and billing cadence so that there is neither missed billing (a gap where neither system bills a period) nor double billing (an overlap where some period is billed by both the old and new systems).
  • What is your rollback plan in case migration of an object fails and you need to restore it to its original state in the old system?
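For example, the “freeze the object in the old system” rule can be enforced immediately after a successful migration, together with a snapshot that supports rollback. The API calls below are hypothetical placeholders; the point is that the restriction is technical, not just an email:

# Sketch: block further changes in the old system after migration and keep
# enough information to roll back. All API calls are hypothetical placeholders.
def freeze_in_old_system(source, account_id):
    snapshot = source.get_account(account_id)               # keep a copy for rollback
    for user in source.list_users(account_id):
        source.update_user(user["id"], {"active": False})   # no more logins or UI changes
    source.tag_account(account_id, "migrated")              # visible marker for support staff
    return snapshot

def rollback(source, account_id, snapshot):
    # Restore the original state if the migration of this object fails downstream.
    source.update_account(account_id, snapshot)
    for user in source.list_users(account_id):
        source.update_user(user["id"], {"active": True})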

Requirements for systems configuration

When designing and configuring the new system, the focus is on the future. This makes it easy to overlook what’s required to bring over existing customers from the old system, who may be using “legacy” products.

Think back to the “furniture does not fit through the door of a new house” analogy. 

Let’s say you are migrating customers with Microsoft 365 subscriptions. Some customers have subscriptions based on offers that are now discontinued and no longer sold but still work. If you don’t have those offers configured in the new system – can you migrate customers with subscriptions that use these offers? Do you need to configure these offers in the new system but mark them as not available for new sale?  

Then come the final questions: what will change for these customers, are they aware and informed about it – and do you have a compatible offer configured in the new system for each legacy offer?
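A simple pre-flight check can surface this gap before the first batch runs: compare the offers referenced by the subscriptions in scope against the catalog configured in the new system. The field names and sample offer IDs below are made up for illustration:

# Pre-flight check sketch: which offers do in-scope subscriptions reference
# that are not configured in the new system's catalog? All names are hypothetical.
def missing_offers(source_subscriptions, target_catalog_offer_ids):
    referenced = {sub["offer_id"] for sub in source_subscriptions}
    return sorted(referenced - set(target_catalog_offer_ids))

gaps = missing_offers(
    source_subscriptions=[{"offer_id": "O365-E3-LEGACY"}, {"offer_id": "O365-E5"}],
    target_catalog_offer_ids=["O365-E5"],
)
print(gaps)  # ['O365-E3-LEGACY'] -> configure it (marked "not for new sale") or map it to a compatible offer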

Connectivity requirements across systems

Depending on how the migration will be executed, and on the location and network configuration of the old system, the new system, and the migration tools (same network, separate isolated networks, or one on-premises and the other in the cloud), you will have different connectivity requirements:

  • Do you need additional configuration in the old/new system to expose the APIs required for migration activities?
  • Is a direct connection possible, or do you need a VPN? What throughput can you expect, given the amount of data that will be sent over the link? (A rough estimate helps here; see the sketch below.)
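The throughput question in particular deserves a quick calculation before you commit to a migration window. A rough sketch with made-up numbers (500 GB of export data over a 100 Mbps VPN link):

# Rough transfer-time estimate: how long does a bulk export take over a given link?
def transfer_hours(data_gb, link_mbps, efficiency=0.7):
    # `efficiency` discounts protocol overhead, VPN encryption, and competing traffic.
    effective_mbps = link_mbps * efficiency
    seconds = (data_gb * 8 * 1024) / effective_mbps    # GB -> megabits, then divide by the rate
    return seconds / 3600

print(round(transfer_hours(500, 100), 1), "hours")     # ~16.3 hours; too long for a one-night window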

Data security considerations

Here we assume that the new system is already set up to meet security requirements, and we need to make sure that the migration processes align with those requirements too:

If you are using API over a public network:

  • Is communication with the API encrypted using a cryptographic protocol that meets your requirements? This is less likely to be a problem on the new system’s side, but you’d be surprised how many legacy systems still expose their API over plain HTTP or use SSL 2.0 or SSL 3.0.
  • Is the authentication mechanism strong enough?  

If not, do you have to switch from public API access to a VPN, or can you put an API gateway in front to take care of the authentication and encryption requirements?

  • If you are doing a bulk export and uploading data to an intermediate server for further processing before it is imported into the target system: where is this server located, and is that in line with data residency requirements? Is the data encrypted at rest on this server? How do you make sure the exported data is removed from the server once it’s no longer needed?
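As a quick sanity check for the encryption point above, you can inspect which protocol version an endpoint actually negotiates using only the Python standard library. The hostname below is a placeholder, and a handshake failure here is itself a signal that the endpoint supports only outdated protocols:

# Sketch: check which TLS version a legacy API endpoint negotiates.
import socket
import ssl

def negotiated_tls_version(host, port=443):
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()    # e.g. "TLSv1.2" or "TLSv1.3"

print(negotiated_tls_version("legacy-api.example.com"))    # placeholder hostname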

What is the timeframe for data migration?

“How long will it take?” can be answered in the context of the overall migration and project timeline, or in the context of the daily migration batches that need to fit within allocated maintenance windows.

You can use existing performance estimates or benchmark results to calculate durations, but relying on them alone is risky: the performance of your system can differ significantly from abstract benchmark results.

A better approach is to estimate durations by running test migrations in the staging environment, or pilot batches in the production environment, measuring how long they take and extrapolating to the planned batch size. Do these tests at a comparable volume: if you are migrating batches of 10,000 accounts, a test migration of five accounts is not a reliable basis for an estimate, because it will not reveal bottlenecks such as API throttling, exhausted worker threads, or database performance degradation.
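The extrapolation itself is simple arithmetic; the real work is getting a representative pilot measurement. A minimal sketch, assuming the pilot batch is large enough to expose throttling and other bottlenecks:

# Back-of-the-envelope duration estimate from a pilot batch of representative size.
def estimate_batch_hours(pilot_objects, pilot_minutes, planned_batch_size, safety_factor=1.3):
    per_object_minutes = pilot_minutes / pilot_objects
    # The safety factor covers validation, verification, reporting, and troubleshooting
    # time that a raw throughput number does not include.
    return planned_batch_size * per_object_minutes * safety_factor / 60

# Example: a 2,000-account pilot took 90 minutes; how long will a 10,000-account batch take?
print(round(estimate_batch_hours(2000, 90, 10000), 1), "hours")  # ~9.8 hours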

Planning migration batches

If you are running a large migration, chances are you won’t be able to complete it all in one run and need to break it down into batches. 

When planning batches, consider the following: 

  • Batch size, migration performance estimates, and how much time you are given to complete this batch. There will usually be a maintenance or migration window where you can perform these activities to avoid/minimize the impact on day-to-day activities. 
  • When planning batch size vs migration duration, allocate time for additional activities that may be required depending on your migration design, such as batch validation, post-batch migration verification, troubleshooting problems, generating post-migration reports, etc. 
  • Object dependencies. For example, you have an account with two subscriptions in the old system. Does your migration and co-existence design allow you to migrate the account today and its subscriptions two days later, or do they all have to be migrated within the same migration window, so that once the window closes, the account and its subscriptions are all managed by the new system only? (A batch-packing sketch follows this list.)
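If the design requires an account and its subscriptions to move within the same window, batch planning reduces to grouping objects by account and packing whole groups into batches. A minimal sketch with hypothetical record shapes and a made-up size cap:

# Batch-packing sketch: keep each account together with all of its subscriptions
# in the same batch, and cap batches by total object count. Shapes are hypothetical.
def plan_batches(accounts, subscriptions_by_account, max_objects_per_batch):
    batches, current, current_size = [], [], 0
    for account_id in accounts:
        group_size = 1 + len(subscriptions_by_account.get(account_id, []))
        if current and current_size + group_size > max_objects_per_batch:
            batches.append(current)
            current, current_size = [], 0
        current.append(account_id)
        current_size += group_size
    if current:
        batches.append(current)
    return batches

print(plan_batches(["a1", "a2", "a3"], {"a1": [1, 2], "a2": [3], "a3": [4, 5, 6]}, 5))
# [['a1', 'a2'], ['a3']]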
     

Data migration is a critical process that requires careful planning, preparation, and execution. If you want to talk about any of the information shared, please get in touch: igor.safonov@cloudblue.com.

