As we approach the dead of winter and the weather grows progressively colder, the words "polar vortex" are on the tip of every tongue. With the road closures, flight cancellations and work stoppages brought on by the cold fronts sweeping across North America, the weather has become a chief topic of conversation from coast to coast.
Except here's the thing about cold weather - there's often a difference between the actual temperature and what people perceive it to be. Because the wind blows harder in some locales than others, there are often wild disparities in what the temperature "feels like," even within a tight radius. When people complain about how cold they feel, they're not really talking about temperature - they're referring to the wind chill factor.
The same logic can apply to data quality. Often, there's a disconnect between the actual accuracy of a company's data and IT officials' perception of it. That's why, according to Smart Data Collective, businesses need to begin thinking about their "data quality wind chill factor."
The MIKE2.0 Governance Association points out that data quality, just like temperature, is an exact science. A person's address in a database either is correct or it isn't, and there are objective measures to tell the difference. And yet all too often, companies set their data quality criteria based not on factual evidence, but on gut feel.
"If you want your organization's data quality to be warm and cozy for all of your users, make sure you consider what data quality feels like from their business perspective, perhaps supplementing objective data quality metrics with a subjective data quality chill factor that's customized for each user," the source advised.
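The idea of pairing an objective measurement with a per-user "chill factor" can be sketched in a few lines of code. The example below is purely illustrative - the function names, the address-validation rule and the `chill_factor` parameter are hypothetical, not anything prescribed by the source - but it shows how a measured accuracy score might be adjusted by a subjective factor customized for each user.

```python
# Illustrative sketch (hypothetical names): an objective data quality score
# paired with a per-user subjective "chill factor" adjustment.

def objective_accuracy(records, is_valid):
    """Fraction of records that pass an objective validity check."""
    if not records:
        return 1.0
    return sum(1 for r in records if is_valid(r)) / len(records)

def perceived_quality(accuracy, chill_factor):
    """Scale the objective score by a user-specific factor in [0, 1]:
    1.0 means the user experiences the data exactly as measured;
    lower values mean the data 'feels' colder than the raw number."""
    return accuracy * chill_factor

addresses = [
    {"street": "10 Main St", "zip": "02101"},
    {"street": "5 Elm Ave", "zip": ""},       # missing zip: objectively invalid
    {"street": "77 Oak Rd", "zip": "94105"},
]

# A simple objective rule: non-empty street and a five-digit zip code.
valid = lambda rec: bool(rec["street"]) and len(rec["zip"]) == 5

score = objective_accuracy(addresses, valid)       # 2 of 3 records pass
felt = perceived_quality(score, chill_factor=0.8)  # what it "feels like"
print(f"measured: {score:.2f}, feels like: {felt:.2f}")
```

The point of the sketch is the separation of concerns: the measurement stays objective and repeatable, while the adjustment captures how a particular group of users actually experiences the data.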
It's good advice. Companies need to be keenly aware of their actual levels of data quality - not just their biased, subjective impressions of it.