
Data migration success - your top 3 questions answered

David Mead · 7 minute read · Data migration

We recently hosted a packed-out Data Migration Roundtable event that focused on some research that we commissioned from Data Migration Pro to explore ‘the state of the nation’ regarding modern data migration projects.

With the research findings as the backbone of the session, numerous ideas, recommendations and, of course, challenges were shared amongst the group.

In this post, I want to address some of the questions that we simply didn’t get a chance to cover fully on the day.

I invited back our regular guest expert, Dylan Jones, to go through the questions and provide some practical answers and tips.


Q1: How can we best manage suppliers and vendors when multiple systems are in play on the migration?

Dylan: This is a great question, particularly as the research found that the majority of modern data migration projects involve suppliers in some fashion; in fact, only 30% of migrations were carried out without any external support.

I think there are a few things to consider when managing suppliers:

Tip 1: Carry out a Pre-Migration Impact Assessment (PMIA)

This will help you get a handle on:

  • What are your data quality issues likely to be?
  • What type of migration are you facing?
  • What are the risks (for you and suppliers)?
  • What technology will be required?
  • What type of supplier will be required at each phase?

On one PMIA, we found that the original supplier selection and technology procurement were completely unsuited to the type of migration required.

Implementing a PMIA can really help you figure out your supplier strategy.

Tip 2: Adopt a common framework

If you have multiple suppliers on a project it can get messy trying to fit everyone’s data migration methodologies (and ideologies!) into one stack.

I find it easier if the customer takes the lead and standardises on one common framework, e.g. PDM v2, and then educates all the suppliers on the terminology, deliverables and roadmap stages.

You all need to be singing from the same hymn sheet. I accept that many customers will lack this knowledge, so speak to someone who can explain how a common framework can smooth the communication between the various (and often conflicting) supplier teams.

Tip 3: Create a unified project ‘hub’

If you can, get all the project teams into one physical location; it helps enormously. In addition, create a virtual project hub online where everyone can communicate freely.

Closed email is a killer on projects; you need open communication.

This doesn’t need to be expensive. I’ve used tools like Basecamp on projects with 50 team members, 3 suppliers and 4 different time-zones!

Tip 4: Be crystal clear about who does what, when and how

As a customer, you may be forgiven for thinking that your supplier will do all the heavy lifting.

However, you won’t get off that lightly.

Most projects require the customer to manage data quality, but senior management may brush that off as a sideshow: “our data has been OK up to now, right? Why do we need to assess and clean it up?”

The reality is that your data was never designed for the pressures of the new target system. If ignored, this can leave you with a huge change request late in the project, so use your PMIA (see above) to figure out the tasks involved.

Likewise, who will do the testing? Who will sign off the decommissioning plan? What data is to be archived for legal reasons?

Break down all the project tasks in advance, combine with your PMIA findings, and figure out a natural split of accountabilities and deliverables amongst your supplier partners.

Q2: Why is data migration always treated as the ‘bridesmaid’ on major transformation programmes?

Dylan: Love this question. It got a few laughs and nods of recognition at the roundtable.

A big challenge is awareness.

I remember on one project, the programme manager demanded to know why it would take several months to migrate 10GB of data between multiple systems when “…they could get 16GB on their USB memory stick and move it wherever they wanted to!”.

I like to think we’ve moved on from that viewpoint, but the recent research uncovered countless stories of business sponsors and programme managers who had seriously underestimated how complex a modern data migration can be.

For example, the research found that in 55% of the projects surveyed, the project management team did not have a good understanding of data migration best practice.

If the data migration project leadership don’t understand best practice, there’s a fair chance that the wider programme has an even poorer understanding!

So, the key is to educate early and build communication into the programme, right from the outset.

The PMIA can help a great deal with this. By demonstrating ‘what lies ahead’ based on a thorough data impact assessment, you can start to communicate the value of applying a robust data migration approach.


Q3: We are about to embark on a phased data migration. What are some simple recommendations for a smoother project?

Dylan: Phased migrations are increasingly the norm.

The days of the terrifying ‘Long Weekend Big-Bang’ migration are numbered because the risks are simply too great, not to mention the logistics of getting a window of opportunity when your business never shuts down, even for a day.

Here are some tips for a successful phased data migration:

Tip 1: Enrich your legacy data with ‘identifiable metadata’

One of the big problems with a phased migration is keeping track of what data has gone where. You need to be really clear on what has been migrated at each phase. To do this, I typically like to enrich the legacy data with the following metadata:

  • Record creation date
  • Record update date
  • Record sequence number
  • Record location

Sometimes you have to get creative and add this metadata to existing comments fields because the DBA team are protective of their legacy systems. Despite these challenges, the value of this metadata for phased migrations can’t be overstated: you need a solid understanding of the state of your data so that you can perform rollbacks and keep source and target synchronised over the course of the project.
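
As a rough illustration of what I mean, here’s a minimal sketch in Python using an in-memory SQLite staging copy. The table, column and location names are invented for the example rather than taken from any real project:

    # Minimal sketch: tagging legacy records with migration metadata so each
    # phase can be tracked, rolled back or re-synchronised. All names are
    # illustrative only.
    import sqlite3
    from datetime import datetime, timezone

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # A stand-in legacy table; in practice this would be a staging copy of the source.
    cur.execute("""
        CREATE TABLE legacy_customer (
            customer_id   INTEGER PRIMARY KEY,
            name          TEXT,
            region        TEXT,
            created_date  TEXT,
            updated_date  TEXT
        )
    """)
    cur.executemany(
        "INSERT INTO legacy_customer VALUES (?, ?, ?, ?, ?)",
        [
            (1, "Acme Ltd", "North", "2012-03-01", "2015-06-10"),
            (2, "Bravo plc", "South", "2014-11-20", "2016-01-05"),
        ],
    )

    # Enrich with identifiable metadata: sequence number, source location,
    # and the phase/timestamp of migration (NULL until the record moves).
    cur.execute("ALTER TABLE legacy_customer ADD COLUMN record_seq INTEGER")
    cur.execute("ALTER TABLE legacy_customer ADD COLUMN record_location TEXT")
    cur.execute("ALTER TABLE legacy_customer ADD COLUMN migration_phase TEXT")
    cur.execute("ALTER TABLE legacy_customer ADD COLUMN migrated_at TEXT")
    cur.execute(
        "UPDATE legacy_customer SET record_seq = customer_id, record_location = 'CRM_EU_01'"
    )

    # When phase 1 (say, the 'North' region) is migrated, stamp those records so
    # later phases, rollbacks and source/target reconciliation know exactly what
    # has already moved.
    cur.execute(
        "UPDATE legacy_customer SET migration_phase = ?, migrated_at = ? WHERE region = ?",
        ("phase-1", datetime.now(timezone.utc).isoformat(), "North"),
    )

    for row in cur.execute(
        "SELECT customer_id, region, migration_phase, migrated_at FROM legacy_customer"
    ):
        print(row)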

Tip 2: Create robust data migration data models

Quite often, your data migration may start with a flurry of ETL mapping activity as coders race to get the project completed by hacking away at the thousands of source-to-target data transformations required.

That mapping work should come later. First, carry out your detailed data quality rules discovery and measurement activity.

Find out whether your data will support the intended migration. Do all those relationships and transformations make sense given the quality, structure and relationships found in the legacy data?
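
To make the idea concrete, here’s a small, illustrative sketch of rule measurement in Python; the records, keys and rules are entirely invented:

    # Minimal sketch: measuring data quality rules before any ETL mapping starts.
    # The legacy records, reference keys and rules below are invented examples.
    legacy_accounts = [
        {"account_id": "A001", "customer_id": 1, "status": "OPEN"},
        {"account_id": "A002", "customer_id": None, "status": "open"},
        {"account_id": "A003", "customer_id": 9, "status": "CLOSED"},
    ]
    legacy_customer_keys = {1, 2, 3}  # customer keys known to the legacy CRM

    rules = {
        "customer_id is populated":
            lambda r: r["customer_id"] is not None,
        "customer_id resolves to a known customer":
            lambda r: r["customer_id"] in legacy_customer_keys,
        "status uses the agreed code set":
            lambda r: r["status"] in {"OPEN", "CLOSED"},
    }

    # Measure each rule so you know, with numbers, whether the data will actually
    # support the planned relationships and transformations.
    for name, rule in rules.items():
        passed = sum(1 for r in legacy_accounts if rule(r))
        print(f"{name}: {passed}/{len(legacy_accounts)} records pass")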

Finally, once you’ve understood the quality, structure, meaning and relationships of the data, transform this understanding into accurate legacy data models. Start from the top: conceptual models first, then logical, and finally physical models.

Then map your common model to the target platform and identify how it will support each phase of the migration.

The challenge with phased migrations is that the phasing options can change. One minute the business will ask for migration by customer account segment, then they’ll change it to region, then they’ll change it to the date of creation. If you don’t have accurate models in place, you’ll struggle to understand what the impact of these changes will be.
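
As a rough sketch of why the models matter, you want phase scoping to be a parameter sitting on top of a stable model rather than something baked into every transformation. The field names and criteria below are invented for illustration:

    # Minimal sketch: phase scoping as a swappable predicate over a stable model.
    # Field names and phasing criteria are illustrative only.
    from datetime import date

    def phase_filter(criterion, value):
        """Return a predicate selecting the records in scope for one phase."""
        if criterion == "segment":
            return lambda r: r["segment"] == value
        if criterion == "region":
            return lambda r: r["region"] == value
        if criterion == "created_before":
            return lambda r: r["created"] < value
        raise ValueError(f"Unknown phasing criterion: {criterion}")

    records = [
        {"id": 1, "segment": "SME", "region": "North", "created": date(2012, 3, 1)},
        {"id": 2, "segment": "Corporate", "region": "South", "created": date(2016, 1, 5)},
    ]

    # The business changes its mind about how to phase; the mappings and models
    # stay fixed and only the scoping predicate is swapped.
    for criterion, value in [
        ("segment", "SME"),
        ("region", "South"),
        ("created_before", date(2015, 1, 1)),
    ]:
        in_scope = [r["id"] for r in records if phase_filter(criterion, value)(r)]
        print(criterion, value, "->", in_scope)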

Hopefully, these answers will help you on your next data migration project.

If you want to learn more about data migration best practices, be sure to visit our Data Migration Leaders hub where you can check out the latest research from Data Migration Pro and a host of other useful resources.
