Every company that regularly works with large volumes of information has an interest in maintaining data quality. If a business relies on clusters of data about customers, clients or general economic trends to make decisions, it's essential that IT staff work to cleanse that data to avoid taking wrong turns in the analytical process.
The first step toward ensuring quality is using address management tools to verify contact information. When consumers enter data manually into forms, it's subject to numerous problems, including typographical errors, outdated contacts and duplicate entries. Email verification solutions and other quality assurance resources can help on that front.
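To make the idea concrete, here is a minimal sketch of the kind of checks such tools perform. The records, field names and regular expression are all hypothetical, and the syntax check is deliberately simple: commercial verification services also confirm that a domain actually accepts mail, which no regex can do.

```python
import re

# Hypothetical contact records illustrating common manual-entry problems:
# inconsistent casing (a duplicate in disguise) and a typo in a domain.
contacts = [
    {"name": "Ann Lee", "email": "ann.lee@example.com"},
    {"name": "Ann Lee", "email": "ANN.LEE@EXAMPLE.COM"},  # duplicate, differs only by case
    {"name": "Bo Chu",  "email": "bo.chu@example,com"},   # typo: comma instead of a dot
]

# A very loose syntax check: something@something.something
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def clean(records):
    """Split records into valid, deduplicated entries and rejected ones."""
    seen, valid, rejected = set(), [], []
    for rec in records:
        email = rec["email"].strip().lower()  # normalize before comparing
        if not EMAIL_RE.match(email):
            rejected.append(rec)              # flag for manual review
        elif email not in seen:               # drop exact duplicates
            seen.add(email)
            valid.append({**rec, "email": email})
    return valid, rejected

valid, rejected = clean(contacts)
```

Even this toy version catches the two problems in the sample data: the second record collapses into the first once casing is normalized, and the third is rejected outright.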
Aside from faulty contact information, though, companies come across other problems when trying to ensure data quality. According to Information Management, firms need to put more philosophical thought into the way they gather and analyze information. Michele Goetz, an analyst at Forrester Research, believes that data isn't leading to positive outcomes often enough.
"In the world of marketing data science, IT may not be completely relieved of its data quality duties, but the game has changed," Goetz said. "Lack of or reduced IT data quality services are not always a barrier to big data when in context of relevant quality customer insight. Yet, IT still needs to support and certify data quality in the access and integration of data."
Here are two problems that sometimes arise in big data initiatives.
Goetz cited Panera Bread as one example of a company that had trouble working with complete sets of data. The restaurant chain recently attempted to analyze customers' spending habits by monitoring the use of their credit and debit cards, but that strategy didn't account for customers who paid in cash. What if those customers had demographic characteristics that skewed the sample?
There's also the question of how much time to put into gathering information. Say a chain like Panera Bread wants to gather information as quickly as possible so it can adjust its prices, so it totals up its financial data over a span of two weeks. Is that enough time, or is that sample prone to fluctuations in data? Is a month more likely to yield reliable results? Is a year?
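The intuition behind those questions can be illustrated with a small simulation. Everything here is hypothetical, with no connection to Panera Bread's actual figures: a year of synthetic daily sales is generated with random noise, then averaged over windows of different lengths to show that shorter windows produce more volatile estimates.

```python
import random
import statistics

# Simulate 364 days of daily sales for a hypothetical store:
# a $1,000 baseline with random day-to-day fluctuation.
random.seed(42)
daily_sales = [1000 + random.gauss(0, 150) for _ in range(364)]

def window_averages(data, window):
    """Average daily sales within each non-overlapping window of `window` days."""
    return [statistics.mean(data[i:i + window])
            for i in range(0, len(data) - window + 1, window)]

# Compare how much the "average daily sales" figure swings
# depending on whether you measure over 2 weeks, 4 weeks or a quarter.
for days in (14, 28, 91):
    means = window_averages(daily_sales, days)
    spread = statistics.stdev(means)
    print(f"{days:3d}-day windows: averages vary by +/- {spread:.1f}")
```

Because the standard error of a mean shrinks roughly with the square root of the sample size, the two-week estimates scatter noticeably more than the quarterly ones, which is exactly the trade-off between speed and reliability the questions above are getting at.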
There are no easy answers to these questions. Ensuring data quality is not an exact science; it's a philosophical endeavor, and every company has its own approach.