Not too long ago, in the late noughties, I was consulting with a large international business on one of their data-driven projects. Issues had been found in the quality of the data, and further analysis was needed to understand the extent and cause of the problem. The barrier I faced, however, was that extracting the data could only be done during the quarterly development schedule, which was over two months away!
So in this case, even analysing the problem required moving mountains. This was a time when data quality technology was procured and used primarily by IT departments, with a strong dependence on specialist skills and timescales driven by software development lifecycles. After cycles of escalation we got the matter resolved, but by then the problem had grown worse, and the cost of finding and preventing it had increased. This was classic reactive data quality practice, underpinned by poor data governance.
Today, data and data quality are climbing up the corporate agenda. The reactive data quality machine may still be present, but it is fast losing out to a changing business appetite. The 2014 Experian Data Quality Global Research revealed that nearly 99% of companies surveyed have a data quality strategy.1 As times have moved on, the context, scope and user expectations have changed; data quality practices must be nimble and agile when responding to business needs. No longer can businesses twiddle their thumbs and sit on problems for over two months. Regulators are breathing down our necks, and competitors are stealing customers over the smallest failures in data quality. Organisations Gartner has surveyed estimate that poor-quality data is costing them, on average, $14.2 million annually.2 That is not a figure to take lightly. Businesses today require more responsive data quality practices.
Our new paper “2014: Key Trends Driving the Change In Data Quality Technology” features the Gartner research note, “The State of Data Quality: Current Practices and Evolving Trends.” It looks at the changing nature of data quality as a practice and the technology that has evolved with it.
We follow this with an Experian note on how these trends are changing the nature of the technology, with examples from across the industry. In particular, we look at two of Gartner's findings:
- Gartner reports: “Those planning to deploy data quality tools over the next 12 months cited information governance programmes as their most common intended use case, at 57%”.2
- Gartner states: “CDOs, information governance teams and other roles in the business will also become more involved. Vendors’ technology solutions are beginning to reflect this change.”2
With these changing trends, it is imperative that businesses review their current data quality technology and identify whether gaps are already opening up between capability and expectations. The paper also lists specific actions for businesses, advising on what to look out for when facing each of these trends.
At Experian, we have seen the rise of these trends and are continually adapting our data quality portfolio. The most notable introduction is Experian Pandora, which puts business users in control of data quality, enabling them to analyse, improve and control the quality of their data. The platform takes particular account of the changing nature of data, the widening data quality audience, the expectations set by external drivers such as governance, and the rise of data profiling and visualisation.
Which of these trends are changing the way you plan and execute the data quality strategy within your organisation? Are you observing any different trends, and are they making your life easier or more difficult? We are keen to hear from businesses out there, so please use the comments box below to share your experiences.
1 “Global Data Quality Research 2014”, an independent market research report commissioned by Experian Data Quality and produced by Dynamic Markets.
2 Gartner, “The State of Data Quality: Current Practices and Evolving Trends”, Ted Friedman and Saul Judah, 11 December 2013.