Keeping data quality mistakes out of end-results

Rachel Wheeler

Excitement about the prospects of big data extends across industries: marketing teams hope to achieve better returns on their campaign investments, banks look to reduce the risks associated with their decisions and retailers aim to find sales tactics that delight customers. One thing is certain: spending on analytics tools has grown since big data hit the scene and is expected to continue as latecomers catch up with market leaders. What isn't yet clear is whether big data adopters fully understand the value of data quality and have invested in address management systems to complement their databases.

Pictures might be skewing reality
Big data may have gained its 'sexy' image because of the slick visualizations that have accompanied its rise into the mainstream, according to information and communications technology professional Jim Stikeleather, who recently authored an article for Harvard Business Review. Infographics have become a common way to combine technical skills and statistical information, but their purpose can be just that shallow. Stikeleather worries that some companies are creating eye-catching graphics in lieu of passing along valuable information.

For a data visualization to be considered necessary and beneficial to its audience, creators must ensure they have strong data quality, understand the context and eliminate any cues that could create biases, he adds. The adage "garbage in, garbage out" applies to information compiled for illustrative graphics, and letting mistakes slip through at the beginning can leave end results flawed and unusable.

Although companies like to think they have accurate and complete information on hand, this is not necessarily the case. Nearly 95 percent of businesses believe the contact data they hold contains inaccuracies of some sort, according to Experian QAS' findings.

Assigning responsibility for data quality mistakes
Nobody likes to take the blame when mistakes are found, but responsible parties must be identified so that issues can be addressed and preventive tools introduced.

In his latest blog post for The Data Roundtable, Jim Harris writes that data quality issues are not the product of advanced technology: they have been plaguing business operations since information first began passing from originators to recipients. In one famous case, Philadelphia wool dealer Frank Primrose filed a lawsuit against the Western Union Telegraph Company in 1887 because the words "buy" and "bought" were misconstrued en route to an agent in Kansas. The mistake cost Primrose $20,000.

The court sided with the telegraph company because the back of the telegram explained that responsibility for mistakes falls on senders unless they take steps to verify the message before transmission, Harris added.