Companies across all industries go to great lengths to ensure data quality. Collecting massive amounts of information on consumers and their habits can be tremendously useful, as it helps managers make informed decisions about the future of their enterprises, but without quality, data is essentially worthless. That's why many businesses use address management tools to verify that every scrap of information they collect is accurate and usable.
Information Week raises a question in response to this movement, though. Has all the emphasis on accuracy gone too far? Is it possible that data quality has become overrated?
Columnist Rajan Chandras admits the idea sounds heretical. But he also notes that sometimes, "close enough" is sufficient. It might be less important to ensure 100 percent quality in data than to proceed quickly with information that's, say, 70 or 80 percent accurate.
"There are a slew of use cases where granular data quality doesn't matter much," Chandras writes. "Typical examples include summary-level and statistical reporting/analytics. If a trucking company is looking to identify most frequently used or most-profitable routes, for example, individual discrepancies in transportation records don't really matter."
It's possible that what matters more isn't data quality but data velocity, as Information Week recently posited. Keeping information current and free of errors is a challenge for companies, but it might be even more vital that they act quickly, winning the analytics race against their competitors.
Data quality should not be forgotten, however. Chandras writes that in many cases, we still aren't taking it seriously enough: collecting data is every company's obsession, but cleaning it is often relegated to a mere afterthought. This needs to change.
At the same time, though, overemphasizing data quality can be just as harmful as neglecting it. There's a delicate balance to be struck between quality and velocity, and companies must proceed carefully.