There's a general consensus among business leaders that using more data to drive pivotal decisions is the wave of the future. To that end, it's important to maintain data quality, and executives now understand the value of tools such as address management solutions for verifying the accuracy of individuals' contact information and other key details.
There's still some dissonance, though, over exactly how important quality should be when undertaking big data initiatives. How accurate is accurate enough? Is 90 percent sufficient? Is 95, or 98? How much time, money and energy are companies willing to dedicate to this task when they could focus those resources on other areas that could lead to business growth?
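To make the "how accurate is accurate enough?" question concrete, here is a minimal sketch (not from the article; the field names and threshold are illustrative assumptions) of scoring a batch of contact records for completeness and comparing the result against a chosen quality bar:

```python
# Illustrative example: measure what fraction of contact records have all
# required fields filled in, then compare against a business-chosen threshold.

REQUIRED_FIELDS = ("name", "street", "city", "postal_code")  # assumed schema

def completeness_score(records):
    """Return the fraction of records with every required field non-empty."""
    if not records:
        return 0.0
    complete = sum(
        1 for r in records
        if all(r.get(field, "").strip() for field in REQUIRED_FIELDS)
    )
    return complete / len(records)

records = [
    {"name": "A. Smith", "street": "1 Main St", "city": "Springfield",
     "postal_code": "12345"},
    {"name": "B. Jones", "street": "", "city": "Shelbyville",
     "postal_code": "67890"},  # missing street -> incomplete
]

score = completeness_score(records)
threshold = 0.95  # is 95 percent good enough? That is the business decision.
print(f"completeness: {score:.0%}, meets bar: {score >= threshold}")
```

Completeness is only one dimension of data quality; accuracy and timeliness would need their own checks, which is precisely why a shared set of standards matters.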
Information Management recently highlighted this debate, explaining that while data quality has become a subject of grave importance, the business world must place the issue under the microscope to agree on categories of data quality and standards to which analysts should adhere. Dan Myers, who manages enterprise data management initiatives for Farmers Insurance, wrote about the importance of addressing this issue.
"I believe we'd be throwing out the baby with the bathwater if we dismissed the writing of multiple authors on the dimensions of data quality just because there isn't a current consensus," Myers wrote. "Now is the right time for the data quality industry to finalize a set of standards, much like the accounting field has done with the Generally Accepted Accounting Principles."
Attention to this issue has greatly increased in recent years. TechTarget reported that according to Gartner, the level of focus has changed from a mindset of neglect to one of "something has to be done about this."
What exactly should be done is a difficult question to answer. But by sitting down and engaging in candid discussions about today's business landscape, companies can make better decisions about how they verify their data moving forward.