
6 compelling reasons to retain a post-migration data quality initiative


Dylan Jones · 5 minute read · Data quality

You know by now that a data quality strategy is vital for your data migration project. As a quick recap, the reason data migration projects overrun so often is partly due to:

  • Poor data quality in the legacy environment
  • Lack of insight to the structure, meaning and use of legacy data (and how this will impact the target)
  • Failure to implement the right data discovery, data quality and data monitoring environment during migration

This isn’t speculation; it’s fact borne out by independent studies from the likes of Bloor Research and hundreds of interviews with practitioners.

What many business leaders don’t realise, however, is the advantage of keeping the ball rolling with data quality after the migration ends.

Ask yourself

Why would we invest all that time, cost and energy into protecting such a critical asset only to watch it slide into disrepair post-migration?

It doesn’t make financial sense, operational sense or common sense. Agreed?

The purpose of this article is to outline 6 reasons why implementing data quality both during AND after your migration is the right choice. It all stems from a perfect opportunity that migration projects present.

Data migration - a unique meeting of minds (and technology)

Having spoken to hundreds of data quality leaders I’ve discovered there are 6 common elements to every successful data quality initiative:

  • Skills & expertise
  • Knowledge
  • Methodologies/frameworks
  • Technology/tools
  • Demand & desire
  • Investment & ownership

If you remove any of those elements, it becomes far more difficult to launch or sustain your data quality initiative. Fortunately, your migration project offers the perfect environment for all of these elements to come together.

That’s the good news.

However, what do business leaders typically do after the migration project terminates?

  • Skills and expertise: released to other projects
  • Knowledge: analysis, discovery and documentation – deleted
  • Methodologies: no re-use on other initiatives
  • Technology: licenses released or not extended
  • Demand & desire: momentum falls
  • Investment & ownership: ceases after the migration

That’s the bad news. Especially for users, customers and the balance sheet.

Remember why your organisation was introducing that new target system? It was probably for one (or more) of these reasons:

  • Better customer service
  • Lower operating costs
  • Increased competitive advantage
  • Easier compliance and governance
  • Increased productivity

Data quality management has been proven time and time again to support each of those objectives, so throwing away the platform you’ve created is astonishingly short-sighted.

Remember how much effort it took to resolve data quality issues in your legacy environment because it had been neglected? That’s exactly what will happen if you ignore data quality in the new target environment. Over time the data will become more and more degraded, costing the organisation in stranded assets, customer churn, financial anomalies and compliance failures.

The problem is compounded by the fact that many new systems still have to share information with existing legacy systems, many of which still contain poor-quality data. So implementing a longer-term data quality strategy – focusing on your target environment first, then building out capabilities across your entire legacy estate – is definitely a smart move.

What data quality facets can be reused after the data migration project?

So you’re bought into the concept, but now you need to convince stakeholders of the value of retaining your data quality resources long term. To do this, you can share the useful list below:

Skills and expertise

  • Data profiling, data discovery, data quality assessment, data quality rules, data cleansing and data quality monitoring skills gained within the project team, e.g. project leaders, team members and even stakeholders
  • A lot of the stewardship tasks will pass to the target environment but your data stewards will already have been identified (even if they don’t traditionally present themselves as a data steward!)
  • Most important: Business users are fully engaged in the data quality management life cycle so you’ve got a great head start


Knowledge

  • Fully documented Data Quality Rules are created
  • Extensive data profiling creates accurate metadata for data models, relationships, structures and value ranges/formats/rules
  • Business glossaries and details of data meaning/definitions created
  • Specifications of target system interfaces, structures, functions
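To make the first two points concrete, here is a minimal sketch of what a documented, reusable data quality rule might look like. The rule names, field names and sample records below are hypothetical illustrations, not taken from any specific tool or project:

```python
import re

# A small library of documented data quality rules: each rule pairs a
# human-readable description with a check function, so rules built during
# the migration can keep running against the target system afterwards.
RULES = {
    "customer_id is a 6-digit code": lambda v: bool(re.fullmatch(r"\d{6}", v or "")),
    "email contains a single '@'": lambda v: (v or "").count("@") == 1,
}

def profile(records, field, rule_name):
    """Run one rule over a field and report pass/fail counts plus nulls."""
    check = RULES[rule_name]
    values = [r.get(field) for r in records]
    nulls = sum(v is None for v in values)
    passed = sum(check(v) for v in values if v is not None)
    return {
        "records": len(values),
        "nulls": nulls,
        "passed": passed,
        "failed": len(values) - nulls - passed,
    }

# Hypothetical legacy customer extract: one malformed ID, one missing email.
customers = [
    {"customer_id": "104233", "email": "ann@example.com"},
    {"customer_id": "9X1",    "email": "bob@example.com"},
    {"customer_id": "550771", "email": None},
]

print(profile(customers, "customer_id", "customer_id is a 6-digit code"))
# {'records': 3, 'nulls': 0, 'passed': 2, 'failed': 1}
```

The same rule definitions can be scheduled against the target system post-migration, turning one-off migration checks into ongoing data quality monitoring.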

Data quality methodology

  • Data Quality Management components of your data migration methodology can be recycled and extended with data quality frameworks (e.g. the Data Quality Rules process of PDMv2)
  • Data migration introduces an accountability framework (often for the first time)
  • It becomes easier to introduce a data quality methodology because everyone has already been engaged in the process

Data quality technology

  • You typically need data profiling/discovery, data cleansing and data quality management tools for a data migration project – these can now be retained
  • Tools such as Pandora perform data discovery, data quality, data migration, reporting and data archival functions, and long-term usage helps justify the investment
  • There are often adapters and interfaces created for the target environment; these can be reused and quality-controlled via the data quality tool
  • Any servers or physical infrastructure can simply be retained and ported over to the new data quality management environment (in the case of Pandora this can be greatly scaled down)


  • Without data quality management your migration will fail (or take far longer, deliver far lower-quality data and burn far more resources)
  • By emphasising the benefits of reuse you can overcome those short-sighted ‘We can’t justify funding data quality for a one-off IT investment’ objections
  • If you apply best-practice data migration approaches, you will get all the foundational elements of a data quality management program that can be sustained over the longer term

What are your views on the importance of post-migration data quality?