Every company that works with large volumes of customer data has an interest in cleansing that information. When businesses collect data from consumers - addresses, phone numbers, financial details or anything else - imperfections are inevitable. Technical malfunctions and human error combine to create countless inconsistencies that can hamper an enterprise.
In fixing these mistakes, there's a delicate balance to be struck. If companies don't put enough emphasis on correcting their data, they can make glaring mistakes, from contacting people at the wrong addresses to offering ill-suited deals and discounts. However, if they focus too heavily on data quality, they risk overextending their resources. So just how high should their standards be?
According to Smart Data Collective, data quality follows a bell curve - there are low, average and high performers, and most companies fall into the "average" range. It's these businesses that stand to gain the most ROI from improving data quality. Low performers are largely a lost cause, while high performers are already in good shape without any extra effort. Those in the middle, however, have work to do.
Smart Data Collective cited data quality expert William McKnight, author of the book "Information Management: Strategies for Gaining a Competitive Advantage with Data." McKnight argues that no company will ever successfully eliminate all of its data quality imperfections. Quality, therefore, isn't about the absence of all defects - it's a matter of "the absence of intolerable defects." How each company defines "intolerable" is a matter of judgment.
"[Data quality] is the absence of defects that see us falling short of a standard in a way that would have real, measurable negative business impact," McKnight stated. "Those negative effects could see us mistreating customers, stocking shelves erroneously, creating foolish marketing campaigns or missing chances for expansion. Proper data quality management is also a value proposition that will ultimately fall short of perfection, yet will provide more value than it costs."
There are always opportunities to cleanse data. Companies can clean up their information at the point of collection, when they first make sales, or they can eliminate imperfections after the fact. Either way, the key is to commit to a deliberate approach rather than treat cleansing as an afterthought.
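As a minimal sketch of those two approaches, the hypothetical Python below normalizes one common field - phone numbers - either at the point of collection (rejecting bad input immediately) or after the fact (batch-scrubbing existing records and flagging defects). The field names, functions and the 10-digit US format are illustrative assumptions, not anything prescribed by the article.

```python
import re

NON_DIGITS = re.compile(r"\D")  # matches anything that isn't a digit

def clean_phone(raw):
    """Return a normalized 10-digit phone string, or None for an 'intolerable' defect."""
    digits = NON_DIGITS.sub("", raw or "")
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]  # drop a leading US country code (assumption)
    return digits if len(digits) == 10 else None

# Approach 1: cleanse at the point of collection - bad input never enters the system.
def collect(raw):
    phone = clean_phone(raw)
    if phone is None:
        raise ValueError(f"invalid phone number: {raw!r}")
    return phone

# Approach 2: cleanse after the fact - scrub existing records, flagging defects for review.
def scrub(records):
    cleaned, defects = [], []
    for rec in records:
        phone = clean_phone(rec.get("phone"))
        if phone is None:
            defects.append(rec)        # intolerable defect: route to manual review
        else:
            cleaned.append({**rec, "phone": phone})
    return cleaned, defects
```

The point of the sketch is the trade-off McKnight describes: the `defects` list embodies "intolerable" records worth spending resources on, while cosmetic variations (parentheses, dashes, spacing) are normalized cheaply and automatically.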
The goal should be to move along that bell curve - to transcend the "average" and become a high performer when it comes to data quality.