Janani is a Principal Consultant assisting customers with their unique, industry-specific data requirements, including data migrations, data quality management and data governance. She also delivers thought-leadership content at internal and external seminars, events and webinars, as well as through authoring white papers. Janani has over 15 years' experience in the IT consulting industry, working with clients in the UK and US to implement complex data management and analytics solutions. In her free time, Janani enjoys massively multiplayer online role-playing games (MMORPGs) and reading sci-fi and fantasy books.
At the heart of the definition of data governance is the word “management”, and for me it is very evident that data governance needs technology to provide oversight of data processes, the related planning activities, and the monitoring of actions and outcomes.
Data ownership is an increasingly hot topic among those in charge of data quality within the business community. This article looks at how, in today's data-driven world, most of us have some sort of relationship with data. Janani considers how different data quality roles interact, and how to define responsibilities and ownership in a way that contributes to wider organisational goals.
*As featured by Computing
With data moving higher up the business agenda, pressure is mounting on those responsible to deliver on its demands. As a result, a number of challenges have emerged for Chief Information Officers (CIOs) and Chief Data Officers (CDOs) alike, caused by a lack of suitable technology. This ultimately has a time and cost impact on resources, on their ability to support each other, and on their ability to deliver on data strategy.
According to the latest Experian Data Quality global research, almost 63% of organisations lack a coherent, centralised approach to data quality. Looking at the numbers behind the research, more than half (51%) say individual departments still adopt their own strategy, and 92% of the businesses surveyed find data quality challenging. This points to the lack of a consistent, centralised approach as one of the underlying problems. It is not just about having a data quality strategy in place; it is about executing it in a streamlined, consistent and efficient manner that creates a more mature data quality organisation.
Not too long ago, in the late noughties, I was consulting with a large, international business on one of their data-driven projects. Issues were found in the quality of the data, and further analysis was needed to understand the extent and cause of the problem. The barrier I faced, however, was that extracting the data could only be done during the quarterly development schedule, which was over two months away.
Data migrations are a fairly common occurrence in today’s businesses. They are no longer a once-in-a-blue-moon activity; data is migrated practically on a daily basis. We migrate data when we acquire new information, when we merge and demerge operations, or when we move data around to get a more complete picture of our customers, products and financial position. The labels change, but the principles of a migration remain the same.
"How do I build a successful business case for data quality?"
This is a question I frequently hear from businesses at our data quality events and seminars. Yet in an age where data quality has climbed the corporate agenda, it mystifies me that this remains a recurring topic.
Did you know that 38% of data migration projects fail?*
While this statistic paints a bleak picture of wasted effort and a drain on resources and budget, it is important to note that a data migration is a challenging project and the risks involved are high. However, being aware of the common hurdles that could derail your project will increase the likelihood of a smooth transition of data to your new system.
I recently attended the IRM Enterprise Data and BI Conference, held from 4th to 6th November in London. The conference was a great opportunity not only to present, but also to gather knowledge from many data practitioners.
One of my favourite sessions was “Defining Data Quality Dimensions”, presented by Nicola Askham and Denise Cook, which gave the audience an opportunity to understand and review their recent white paper covering the six primary dimensions for data quality assessment.
The data world is buzzing with articles, posts and presentations asking organisations where they are with their big data strategy. It seems the world had barely started taking data quality seriously for basic master and reference data, and now we are out to conquer the next data challenge: lots of data!