
Realizing your data's full potential

2019 Trends in global data management

Watch the webinar to see our experts’ analysis on data management trends, and then share some takeaways about how to unlock your data's full potential in 2019.

Hi everyone, thank you for joining us on today’s webinar “Realizing data's full potential: Trends in global data management practices.” My name is Sean Coombs, and I lead content marketing and thought leadership here at Experian’s data quality business. On the line with me, I have my colleague Erin Haselkorn. Erin leads our global analyst relations and market insights program.

Thanks for joining me today, Erin!

Each year, Erin and I oversee Experian’s global research study into the data management and data quality spaces. This is roughly the eighth year that we’ve done this study. We are excited to share with you all some of the key findings from this year’s report. During our webcast today, Erin and I are going to cover the key trends that stood out from a global perspective, share our experts’ analysis on those trends, and then leave you with some takeaways as we talk about unlocking data’s full potential.

Because much of today’s presentation is based on research that we’ve conducted, we wanted to start off with a brief run-down of the survey methodology. That will help you understand the context of the data as we move through today’s session. With that little bit of housekeeping out of the way, we’ll jump right into the topic of customer experience and discuss how data plays a critical role in driving these efforts. After that, we’ll share our findings around trust in data and how that relates to the quality of data in your systems. Lastly, having ownership over data assets is what makes so much of this possible, so we’re going to have a conversation around how data responsibilities are beginning to align more closely with the business.

At the end of today’s webinar, we will have some time to answer your questions, so please feel free to ask questions throughout – we’ll try to get to as many as we can at the end.

Now, before we jump into the findings, I wanted to talk about the study itself to give you some background. Each year, we conduct a global survey of data practitioners to ask them a series of questions around data management practices and trends. Some of the questions are unique to that year’s study while about half of the questions are repeats, and that gives us some cool year-over-year benchmarks for comparison.

So about six months ago, we surveyed a little over a thousand data professionals from the four countries you see here: the US, UK, Australia, and Brazil. And we were specifically looking for folks who have visibility and knowledge over their data management practices. Among those we spoke to, more than a quarter work in an IT-related function, 13 percent are Chief Data Officers, 10 percent work in Sales, and so on… And when it comes to these folks’ level of seniority, 19 percent are at the C-level and about 69 percent are either at the director or manager levels.

You can see, it’s a pretty diverse survey population and consistent with past years.

So now that you have a little background on the study itself, I wanted to cover off on some high-level themes that stood out from the data.

The first one we see here is that customer experience is once again the key driver behind the majority of data initiatives. In fact, 98% of the companies we surveyed say that they use data to improve the customer experience. Erin will go into this in more detail in just a few slides, but it’s interesting to see customer experience come to the forefront, as businesses globally are facing increased competitive pressures.

The second trend we see is that data management capabilities are no longer an ideal future-state, but a strategic imperative today. 82% of businesses globally recognize the reality that improving data management can help them to achieve their business growth objectives. Whether you’re trying to do advanced analytics, improve marketing efficiencies, comply with certain regulations, what have you…the success of these objectives really comes down to your ability to manage the data.

This ties in nicely with the third trend here: a lack of control over data. Organizations nowadays are dealing with an epidemic of information overload. Data is coming from a variety of internal, and even external, sources: data that is unstandardized, unverified, and siloed across the business. On top of that, when we try to make sense of it all, most businesses rely on IT to run reports and just fix the data. Well, guess what? It’s not working. 70% of businesses say not having direct control over their data impacts their ability to meet strategic objectives.

So we’re seeing a shift in data responsibility – our fourth trend here. 75% of those we surveyed believe that the responsibility for data quality should ultimately lie with the business, with occasional help from IT. Now does this mean IT should take their hands off the wheel? Of course not. But we are beginning to see a realization that you cannot adequately manage the quality of the data without the business context and an understanding of how the data was created, how it was transformed over time, and how it is intended to be used now and in the future.

Now, I’m going to hand things over to Erin, and she’s going to share our analysis around Customer Experience.

As we all know, organizations are working diligently on their customer experience. One key way a lot of organizations are looking to improve in this area is to better understand those consumers. If you know how customers commonly engage with your business, the frequent products they purchase, general demographic information and more, then you can better engage with them and tailor your message for a one-to-one type of experience. This is especially important in the digital era where customer experience is really the key competitive advantage.

Gaining this customer insight isn't easy, but it is a key priority for many businesses over the next year. Here are some of the highlights of the research in this area, and then we will dive into these in more detail.

First, as I mentioned, we see customer experience as a top priority for businesses. This directly ties to key revenue-generating initiatives, so it is not surprising that it is at the top of the list. The challenge in achieving this objective is that inaccurate data is getting in the way. When you have data on your customers that is inaccurate, incomplete, or dispersed across the business with no way of pulling it together, you can't fully leverage this valuable data asset to understand the direction your business should take. That is why we see 69% saying inaccurate data is undermining their ability to provide an excellent customer experience.

Finally, with that desire to understand the customer in more detail, we see the definition around single customer view changing. It used to be that SCV was just one central data repository that people went to. It was the master data for the business with a very specific definition and purpose. That is still how some people define it today. However, we see other definitions coming forward. As individual departments and people try to leverage data to understand the customer for a specific business outcome, we see that single customer view needs to change based on the demands of that specific business purpose. That changes a great deal what you need to do to manage information and make that view more flexible.

But let's get into the data around these areas in a little more detail.

First, we have the top priorities for businesses over the next year. You can see customer experience at the top, followed by data security, gaining cost efficiencies, and managing talent and workforce development. Clearly not all of these areas directly relate to data, but many are aided by a better understanding of customers, employees, and the business as a whole.

Now it is important to note that the figures on the screen are global. When we isolate just US business respondents, we did see a slight shift: the US put more of an emphasis on data security than customer experience, by 6 percentage points (59% vs 53%). Customer experience is still very important, but many US organizations are especially concerned about data security.

However, these two are still significantly higher in percentage than gaining cost efficiencies and reducing risk, which can be considered more operational in nature. With the increasing competition and the evolving digital marketplace, it is not surprising to us that customer experience is at the top considering it is a key differentiator between brands.

Organizations are working to improve the customer experience in a number of ways, to the point that we see 98% of businesses using data to improve the customer experience.

Some of the common ways they are leveraging data can be seen on the graph. These include areas such as customer service communication, better handling of customer complaints, product delivery and fulfilment, personalization, customer on-boarding, and more.

Each of these efforts requires a vast repository of consumer data, and more than that, the data on hand must be high-quality, trustworthy, and paint a holistic view of each consumer. However, that picture of the consumer through data often doesn't exist within businesses today.

However, we do have a customer example of where this has worked well.

There is a leisure company that was looking to develop a deeper understanding of their customers and create a single customer view. They had dozens of data sources, all being fed into a legacy CRM. That data was then used to run marketing campaigns to customers, work on B2B customer campaigns, help make decisions around operations based on customer insight, etc.

The problem is that when they entered data, the CRM could not tell whether a customer already existed. That resulted in large amounts of duplicate data and a host of inaccuracies. They couldn’t get a complete view of the client. Sound familiar?
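
As an illustrative aside (this sketch is ours, not part of the client's actual platform), here is a minimal Python example of the matching problem being described: a CRM that compares records only as exact strings treats trivially different re-entries as new customers, while even a crude normalization step reveals the duplicate.

```python
def normalize(record):
    """Build a crude match key: collapse whitespace and case in the name,
    and trim/lowercase the email. Real entity resolution would also use
    fuzzy matching, address standardization, and survivorship rules."""
    name = "".join(record["name"].lower().split())
    email = record["email"].strip().lower()
    return (name, email)

def find_duplicates(records):
    """Group records that share a normalized match key; return groups of 2+."""
    groups = {}
    for rec in records:
        groups.setdefault(normalize(rec), []).append(rec)
    return [g for g in groups.values() if len(g) > 1]

# Hypothetical CRM rows: the first two are the same person, re-keyed by hand.
crm = [
    {"name": "Jane Smith", "email": "jane@example.com"},
    {"name": "jane  smith", "email": "JANE@example.com "},
    {"name": "Bob Jones", "email": "bob@example.com"},
]

dupes = find_duplicates(crm)
# Exact-string comparison would see three distinct customers;
# normalization exposes the one duplicate pair.
```

The point is not the specific rules but the gap: without some matching layer in front of the repository, every manual variation becomes a "new" customer record.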

They started using an Experian data management platform to put data in the hands of the business. They could meet their business goals around understanding their customer, but also develop trust in their data. Within 6 hours, the client was creating business rules and transforming data to help get that understanding.

So they are using data management to help them increase the understanding of their clients, to put in a repeatable workflow that could be used consistently across the different data sources, and then develop that trust around their customer data so they could confidently make decisions.

Now that is one example of how to overcome a few challenges to gain access to usable data. But let's look at what those common challenges are.

When we look at some of the challenges around leveraging data, poor quality data, customer wants/needs changing and legacy systems are all areas tied for causing challenges.

There is more information coming in about customers than ever before. Organizations are struggling under an information overload so it is not surprising to see poor data quality towards the top of the list. When we look at leveraging data, to achieve general initiatives, we find that information is often incomplete, organizations lack a single customer view and there is a big lack of skills when it comes to managing and manipulating data. Again, a lot of these come back to key data management issues.

Frankly, we believe that data quality concerns are going to continue to plague organizations, in one way or another, until they can implement the right processes and technology to scale with modern data demands. Most data management and governance programs aren't flexible enough to account for the changing needs of the business. They have to be put more in the hands of a business user and they have to become more flexible.

One of the big challenges we see is that the question of 'who is your customer' is not always easy to answer. Different departments may have different answers based on what their needs are and the messages they are trying to get across. That is leading to some debate around the definition of SCV.

We see this year that there is a fair amount of variance around what people mean when they say single customer view. Some respondents believe it is based on individual departments, others believe there is one central repository, and still a few believe that a SCV is dependent on the individual needs of the user.

Given that particular definition of SCV, we see half of companies saying they have an SCV. A good percentage are also working on one today.

First off, I think the number of folks that have a SCV is fairly high. I think people are being generous around the data they have. Most of it is still flawed and incomplete, and since matching and pulling data together is one of the hardest aspects of data management, I think that is a very generous statistic.

That said, I do think this data makes a very interesting point that we are moving in a direction of a contextual customer view more than a traditional technical view of the customer. The old definition doesn't apply as much anymore and people want a SCV that is based on the context of their purpose for leveraging the data. It is certainly not one size fits all.

This is more indicative of a general trend toward the need of the individual over the needs of the masses. Consumers want and expect things to be personalized. Getting an accurate view of the customer for a particular business purpose is imperative in adjusting to this individual desire.

There are a few other trends around SCV I want to highlight quickly. First is where an SCV typically resides. Again, we are seeing more companies and individuals highlighting a variety of views across the business. You can see from the percentages that these don't all add up to 100. A central CRM is still a very popular place for an SCV, but number two is that there are different views across the business.


Finally, what are some of the common challenges around creating an SCV? Because this certainly isn't an easy task. There are often too many different data sources, the volume of information is really high, there are limited IT resources, and then finally poor data quality.

I will point out that the first 3 of these are not going away any time soon. We are going to gather more information on the consumer and the volume of data isn't going to drop, it is only going to go up. And as more information is leveraged by more business units or departments, the number of repositories is going to increase. That said, these areas can be mitigated with better data management practices and by putting more tools and resources into the business units that are actually leveraging the data.

As a key take-away for this section, it is important to remember that data, and customer data in particular, is a key asset in improving the customer experience, a key initiative for many this year. However, data management processes haven't kept up to make these goals a reality. You need to take a good hard look at the people, processes, and technology around data to ensure trust, but also create a fair amount of flexibility in whatever you put in place to accommodate the needs of individual business units.

Be sure to think about the right data talent, and think about the need for context: one size is not going to fit all. Then finally, when purchasing technology, you need to think about the business. They need to feel comfortable interfacing with software on their own to apply data rules, transformations, and logic that provide a holistic and contextual view of the data.

Achieving business objectives, like improving the customer experience, really comes down to trusting the data you hold. Yet, our research revealed that despite the fact that many business users think data quality is already handled by IT, companies lack the trusted data they need to operate in today’s digital environment.

On average, we’re seeing that companies believe 29% of the data they hold is inaccurate in some way. And if you’re thinking that’s a lot, you should know that this number has remained consistent year-over-year for at least the last three years that we’ve asked this question. When we dug a little deeper into the data, we found that those in more senior positions consider a greater percentage of their data to be inaccurate. C-level executives, for example, believe 39% of their organization’s data is inaccurate. This distrust can have real consequences for businesses—hamstringing efforts around data analytics and business intelligence at the highest levels.

Now, moving to the middle stat up here, we’re seeing that 69% of organizations say they’re struggling to turn data into useful insight because of the volume, variety and speed of information. Now, this goes back to this idea of information overload and being able to flex data management programs to meet more modern data demands. We’ll talk more about this stat in a few slides.

Last here, you’ll see that despite an increasing demand for data and insight, we have not seen organizational data management maturity improve in the past three years – actually, most remain relatively immature in their data management practices.

The figure on this slide shows the data management sophistication of the organizations in our study. While these numbers have remained relatively flat from prior years, it is encouraging that as organizations become more mature in data management, they tend to have lower perceived percentages of inaccurate data. Those who are the least sophisticated (Inactive) believe 39% of their data is inaccurate, and that is contrasted with the most sophisticated organizations (Optimized), who believe 21% of their data is inaccurate. So it’s good to see that those who have invested in the people, processes, and tools to manage their data are experiencing greater perceived levels of data accuracy—and by extension, greater trust around data.

Now, no conversation about trusted data can ignore the issues associated with bad data. Our study shows that 95% of organizations see impacts in their organization from poor data quality. Namely, these impacts come in the form of wasted resources and added costs, holding back key business initiatives, delays in data migration projects, and – importantly – negatively affecting the customer experience. There are plenty of examples of poor data quality in organizations today, in all industries, but the challenge we face is identifying and resolving data quality issues before they cause problems or significantly delay strategic projects.

Despite an increasing demand for data and insight, most organizations remain relatively immature in their data management practices. As such, human error continues to be the largest contributor to this level of inaccuracy – and this becomes more prevalent with the increasing volume of data. Whether you’re collecting information manually entered by people outside your organization, such as consumers, or you have employees who are either creating or transforming data, human error is inevitable, in my opinion.
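
As a hypothetical illustration of catching that human error at the point of entry (the specific fields and rules here are our own assumptions, not findings from the study), a basic validation check on a manually entered customer record might look like this:

```python
import re

# Crude email shape check: something@something.something.
# Production systems would use a real validation service, not a regex.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_customer(record):
    """Return a list of data quality issues found in one entered record."""
    issues = []
    if not record.get("name", "").strip():
        issues.append("missing name")
    if not EMAIL_RE.match(record.get("email", "")):
        issues.append("invalid email")
    if not record.get("postcode", "").strip():
        issues.append("missing postcode")
    return issues
```

Rejecting or flagging a record at entry time is far cheaper than reconciling the duplicate or incomplete record it would otherwise become downstream.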

Now, challenges like the number of different data sources and volume of data are not going away any time soon, either. In fact, they will only get worse as more digital channels and data assets become available. To turn this level of inaccurate information around, it is crucial for businesses to have the right people, processes, and technology to manage data and make sure it is sound, complete, valid, accurate and reliable.

Now, it’s not all bad news, and there are very good reasons why so many organizations are looking to their data as the future of business. 99% of companies acknowledge that being data-driven gives them a competitive advantage!

From driving improved customer experiences to making business processes more efficient, being data-driven provides a number of tangible business benefits.

Interestingly, when we sliced the data, those at the lower end of the data maturity spectrum put more emphasis on data as a means of driving efficiency. And those who are more mature in data management place a higher emphasis on growth initiatives, like using data to improve the customer experience or providing better insight for decision making.

And of course, data needs to be high-quality in order to achieve these objectives. Here are some reasons why organizations today maintain high-quality data.

High atop the list you’ll see things like increasing efficiency, reducing risk and fraud, cost savings, and so on.

Having trust in data enables a host of benefits. However, practices around data management haven’t really kept pace with changing data usage.

Our advice? Organizations need to invest in data management to drive innovation and keep up with exploding data volumes and sources.

With that – I’d like to pass the baton back to Erin to talk about data ownership.

For this final section, we'll talk about the changing nature of data ownership, which is something I personally am very passionate about. The previous two sections spoke about data related to customer experience and how you build trust in information (given that we currently lack the accurate data we need). However, to improve in those two areas, in some ways you need to establish a certain degree of ownership so you know who is making the improvements. Nothing is going to get better if you just keep pointing blame at another department or kicking the can down the road. And with digital transformation and this new reliance on data, the idea of control is being exposed more and more as an issue.

Here are a few initial trends we'll cover off in this section.

First, right now data is primarily managed through IT for the vast majority of companies we spoke with. However, that is not the ideal state. Currently, the ownership of data (or lack thereof) is creating the large amounts of inaccurate data and the challenges highlighted in the previous sections. Therefore, 75% think that responsibility for data quality should ultimately lie within the business, letting the people who use the data take more control and ownership over the information.

That lack of control is causing an issue. 70% say that not having direct control over data impacts their ability to meet strategic objectives. When you think about it, requesting data or insights from IT and then waiting days or weeks for an answer just doesn't work in today's environment.

Finally, when IT manages data, the biggest challenge is that they may produce results that are too centered on one part of the business, or they may not think about the end objective or business outcome. 56% of respondents say the IT department doesn't fully understand the data management needs of the business. That isn't surprising when you think about how much IT has on its plate; they are so busy, and data management can be difficult and time-consuming.

Let's first look at how information is managed today. When we think about control, we traditionally think of a single department having sole responsibility for managing and providing access to data. The bulk of data management is happening within IT, although for 51% of businesses some individual departments manage their own data.

What this trend and information is telling us is we are starting to see a growing desire to make a shift to a more decentralized model. At a basic level, you are removing the ringfence and giving responsibility and access back to the business users who need the data to do their day jobs.

There is a great quote in the Harvard Business Review that says "Companies that want to compete in the age of data need to do three things: share data tools, spread data skills, and spread data responsibility".

We see from the stats like 75% think data quality responsibility should ultimately lie with the business, and 56% think that IT doesn't fully understand the needs of the business, that IT and the business are disconnected when it comes to data. This dangerous disconnect hurts the effectiveness of analytics and decision making. There is a clear appetite for responsibility to lie within the business.

Now it is important to note that right now, we are seeing similar challenges regardless of management style. You can see that all companies face lengthy delays in gaining insight, they lack trust in their data, they don’t have the customer insight they need, etc. That said, I do think this data is a sign that we are changing. Having departments manage the data is new and we are still figuring out how this decentralized system will work. I think until some of these processes become more formal, we are still going to see these similar issues. However, as we become more mature, they will smooth out.

And as companies move up the level of data management maturity, they cite fewer challenges for the business related to managing data.

As we continue to think about this decentralized model, it plays into the data management projects organizations are thinking about as well as the factors when choosing technology.

First, we see a lot of projects around analytics, data integration, and operational data quality. Many of these, especially analytics, tend to reside more within particular lines of business. Data integration is often also associated with moving more data into the cloud, which in theory makes it more accessible to a broader audience. One thing the data also told us is that those with higher levels of revenue optimism have more data management projects planned.

When people are looking at technology for these projects, it is important they think about the business user. You can see that the top two are around ease of use and having technology that can work with the existing suite of products. Most people aren't ripping out everything IT has in place when they are looking to make data more accessible to the business. They are really looking to augment their existing structure with tools that allow the business to develop their own SCV or get data ready for an operational standpoint. That means these tools need to be easy to use, but very flexible in the environment you already have.

With all this data, what we recommend is that organizations need to evaluate and move away from pure IT ownership. Ideally, they start looking more towards a decentralized approach.

Keep in mind, this approach is less straightforward and can pose risk if clear rules are not set in areas of regulatory compliance, data quality, ethics, security, etc. What you should do is ideally have a strong chief data officer, who does not replace the IT department, but delivers strategic direction around data to ensure the right people have the right tools to manage or access relevant data and deliver the best outcomes for the business.

We are starting to see this shift, but organizations need to remember that data management solutions need to fit an increasingly diverse group of stakeholders who want control of their mission-critical data. Think about the strategic objectives for leveraging your data, the data leadership you have in place, and then finally whether you have the right technology that can be leveraged by the business themselves, not just an overly technical IT department.

Thanks Erin. To summarize what we discussed today, I think it’s important to be in the mindset that we’re all going through this rapid shift to digital, and our data is quickly becoming more and more critical to our success in this area. Key business initiatives, like improving customer experience, are hinging on our ability to leverage data appropriately.

Yet legacy ways of managing data simply cannot scale to meet the demands of data-driven business today, and inaccurate data (or rather, data that is perceived to be inaccurate) is eroding trust. We’re also seeing the responsibility for data start to gravitate towards the business, though not completely. This is resulting in new data leaders.

And lastly, businesses are investing in data management technology—but they really need to think about the bigger picture: how they can really weave together their people, processes, and tools in a scalable way to really be set up for success in the digital economy.