Did you know that 38% of data migration projects fail?*
While this statistic paints a bleak picture of wasted effort and a drain on resources and budget, it is worth remembering that a data migration is a challenging project and the risks involved are high. Being aware of the common hurdles that could derail your project will, however, increase the likelihood of achieving a smooth transition of data to your new system.
You’ve probably noticed we’ve got a new name. So, whilst we fight over the last few branded coffee mugs and pens, and reminisce about the old QuickAddress days, we’ll also be celebrating our new name, ‘Experian Data Quality.’
But why are we doing this?
At the beginning of November I shared with you a few simple tips to help you get started with your data quality programme. That blog was titled ‘Top Tips for Data Quality.’ If you haven’t had the chance to read it, you can do so here.
This blog is, if you like, part two of that special. I didn’t want to overload you with all of my tips at once, so here are my next five. Hopefully these will help you ensure you have a strategy and the tools in place to consistently deliver good data quality.
As well as usability and performance enhancements, the latest version of Experian Pandora brings significant new capabilities such as the user-configurable Outliers Report, which automatically highlights “unusual” data, and reference table functionality which enables the licensed re-use of data for cleansing and enrichment.
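Experian Pandora’s own implementation isn’t described here, but the general idea behind an outliers report — automatically flagging “unusual” values in a column — can be sketched with a standard interquartile-range (Tukey) check. This is a minimal illustration, not the product’s actual method; the function name and the `k=1.5` threshold are assumptions:

```python
def iqr_outliers(values, k=1.5):
    """Flag values outside [Q1 - k*IQR, Q3 + k*IQR] (Tukey's rule)."""
    s = sorted(values)
    n = len(s)

    def quantile(q):
        # Linear interpolation between the two closest ranks.
        pos = q * (n - 1)
        lo, hi = int(pos), min(int(pos) + 1, n - 1)
        return s[lo] + (pos - lo) * (s[hi] - s[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    # Preserve the original order of the flagged values.
    return [v for v in values if v < low or v > high]
```

Running this over a column of, say, order values would surface the handful of entries worth a manual review, which is the spirit of the user-configurable report described above.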
Big data is becoming an increasingly important tool for many organisations, but new research by IBM has found that a large number of insurance companies are failing to take advantage of it.
The advent of big data has changed the way organisations work forever, yet there is still some misunderstanding about what this asset can do for marketing. This is about more than generalisations; it is about getting to the crux of the matter and delivering focused projects with defined results.
I recently attended the IRM Enterprise Data and BI conference held from 4th to 6th November in London. This conference was a great opportunity to not only present, but also to gather knowledge from many data practitioners.
One of my favourite sessions was Defining Data Quality Dimensions presented by Nicola Askham and Denise Cook, which gave the audience an opportunity to understand and review their recent white paper covering the six primary dimensions for data quality assessment.
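To make the idea of data quality dimensions concrete, here is a minimal sketch of how a few of them might be measured over a set of records. The dimension names follow the white paper’s terminology, but the metrics, field names and email pattern below are illustrative assumptions, not anything from the paper itself:

```python
import re

# Toy customer records: one duplicate, one invalid email, one missing.
records = [
    {"email": "a@example.com"},
    {"email": "a@example.com"},   # duplicate value
    {"email": "not-an-email"},    # fails the validity pattern
    {"email": None},              # missing entirely
]

def completeness(rows, field):
    """Proportion of rows where the field is populated."""
    filled = sum(1 for r in rows if r.get(field))
    return filled / len(rows)

def uniqueness(rows, field):
    """Proportion of populated values that are distinct."""
    vals = [r[field] for r in rows if r.get(field)]
    return len(set(vals)) / len(vals) if vals else 1.0

def validity(rows, field, pattern):
    """Proportion of populated values matching a format rule."""
    vals = [r[field] for r in rows if r.get(field)]
    ok = sum(1 for v in vals if re.fullmatch(pattern, v))
    return ok / len(vals) if vals else 1.0
```

Scoring each dimension separately like this is what makes the assessment actionable: a low completeness score and a low validity score point to very different remediation work.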
I thought I’d share with you a few simple tips to help you get started with your data quality programme. Data quality doesn’t have to be complicated. Really, it’s all about having the right people, processes and technology in place. I’ve tried to simplify it into the five steps below; follow these and in no time you’ll have got to grips with your data.
Getting an entry in the dictionary is an indicator that something has gone from niche to near universal. That was the story of 2013, the year big data became ubiquitous.
The data world is buzzing with articles, posts and presentations asking organisations where they are with their big data strategy. It seems as though the world has barely started taking data quality seriously for basic master and reference data, and yet we are already out to conquer the next data challenge: lots of data!