
A recap of our 'Improve your data usage in 2016' webinar

Every year we conduct a research study into data quality and data management trends. Yesterday we held a webinar to introduce the most important findings and conclusions from our 2016 global data management benchmark report. In case you missed it, we’ve recapped the three most important points we touched on, along with a list of actions you can take to begin improving your data management initiatives now.

Today’s data landscape is virtually unrecognizable compared to 10 years ago. Given the current pace of technological innovation, it’s not far-fetched to say that the next 10 years will see even more changes in how organizations and individuals interact with and use data. The three topics we covered in our webinar were: building customer relationships through better data, data quality challenges, and how data management is evolving.

1) Building customer relationships through better data

Over the past year, we’ve seen customer data driving this data obsession. Why? Personalization. Businesses across all industries want to use customer data to provide a more tailored experience. However, many organizations aren’t where they need to be, simply because they either can’t easily access their customer data (due to multiple collection channels and disparate systems) or can’t easily make sense of it (due to inaccuracies).

Individuals and companies both want data to be leveraged for their benefit. The biggest reasons for maintaining accurate data have to do with this mutual benefit, along with increasing efficiency, improving customer satisfaction, and now, also for complying with regulations. Organizations harboring large amounts of inaccurate data are undermining their ability to provide excellence in customer service and experience.

What you can do: At the very least, implement some front-end safeguards against human error, which is consistently the top contributor to data inaccuracies. That means adding verification solutions at the places where consumers interact with your business the most, such as the point of sale, call centers, and website checkouts.
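As a minimal sketch of this idea, the check below flags obviously malformed contact fields before they enter your systems. The function name, fields, and regex rules are illustrative assumptions, not from the report; a production deployment would typically call a dedicated email/address verification service rather than rely on pattern checks alone.

```python
import re

def validate_checkout_fields(email: str, zip_code: str) -> list[str]:
    """Return a list of problems found in user-entered contact fields.

    A front-end safeguard against typos at checkout: catch records that
    are clearly malformed before they are stored.
    """
    problems = []
    # Basic shape check: something@something.something, no spaces.
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        problems.append("email looks malformed")
    # U.S. ZIP: five digits, optionally ZIP+4.
    if not re.fullmatch(r"\d{5}(-\d{4})?", zip_code):
        problems.append("ZIP code looks malformed")
    return problems
```

Checks like this don’t guarantee the data is correct, but they stop the most common keying errors at the point of entry, which is where they are cheapest to fix.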

2) Data quality challenges

We’ve observed that over the past 12 months, nearly all types of data error have increased in frequency. This can be attributed to many factors, including more businesses entering the market, more data (both structured and unstructured) for businesses to collect and manage, and more technology that businesses must learn to use correctly. Incomplete, outdated, and duplicate data are the three most common data errors within organizations today, with duplicate data increasing the most, jumping 19 percent in the past 12 months. Both internal and external challenges are roadblocks to improving data quality; however, internal challenges such as a lack of knowledge and employee resources are sizeable issues businesses must face today.
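Since duplicates were the fastest-growing error type, here is a hedged sketch of the basic technique for finding them: normalize each record to a canonical key, then group records that share a key. The record shape and normalization rules are assumptions for illustration; real deduplication tools add fuzzy matching on top of this.

```python
def normalize(record: dict) -> tuple:
    """Canonical key for a customer record: trimmed, lowercased name and email."""
    return (record["name"].strip().lower(), record["email"].strip().lower())

def find_duplicates(records: list[dict]) -> list[list[dict]]:
    """Group records whose normalized keys collide; groups of 2+ are duplicates."""
    groups: dict[tuple, list[dict]] = {}
    for record in records:
        groups.setdefault(normalize(record), []).append(record)
    return [group for group in groups.values() if len(group) > 1]
```

Even this crude normalization catches the duplicates created by casing and whitespace differences across collection channels, which is often a large share of the problem.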

What you can do: Keeping up to date with industry benchmarks is very important in this dynamic data environment. It ensures that you stay on top of the challenges your competitors face and have a roadmap of initiatives to come, so nothing blindsides you when planning data projects.

3) How is data management evolving?

Many organizations are siloed in the way they collect, manage, and define data. When we operate like this, we’re getting in our own way on the path to better data quality. Oftentimes data management is driven by multiple stakeholders at a department-by-department level, which means there are many inconsistencies around the measurement and use of various technologies within the organization. However, looking forward, businesses are framing data as a holistic, company-wide imperative, with ownership not living solely within the IT side of the organization. 

Businesses want to use data quickly and efficiently; they want it freed from silos and standardized, and they want to make sure it's secure. Assigning a central data owner as well as implementing technology to regularly monitor and visualize data is what organizations should look to achieve when creating their data quality strategies.
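One concrete form that regular monitoring can take is tracking simple data quality metrics over time, such as the completeness of required fields. The sketch below is an illustrative assumption of how such a metric might be computed, not a method from the report.

```python
def completeness(records: list[dict], required_fields: list[str]) -> float:
    """Fraction of records in which every required field is present and non-empty.

    A simple data quality metric a central data owner could track and
    chart on a regular schedule to spot degradation early.
    """
    if not records:
        return 0.0
    complete = sum(
        1 for record in records
        if all(record.get(field) for field in required_fields)
    )
    return complete / len(records)
```

Run on a schedule and plotted, a handful of metrics like this gives the whole organization a shared, visible picture of data health instead of departmental guesswork.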

What you can do: U.S. organizations have a long way to go along this data quality sophistication curve. Most sit along the 'reactive' portion of the curve, where no central employee takes responsibility for data and data is still hoarded in departmental silos. Twenty-four percent of businesses sit along the 'proactive' portion, and only 19 percent are 'optimized' in the way they handle data. To begin, look at properly staffing your business with more data-centric roles. The chief data officer title in particular will grow in prominence as businesses see the value of having their data managed by a central owner. Other positions in high demand are data analysts, data scientists, and data warehouse specialists.

Data management and data quality practices are quickly accelerating, and knowing what’s going on with your data is the first step to improving upon it. It’s all about embedding data into your culture and getting the entire organization to acknowledge their part to play in a holistic data strategy.

Take a look at our webinar on SlideShare below if you missed it.

To learn how to create a more holistic data quality strategy for your organization, check out our white paper.
