
Data quality must be measured to assess its effectiveness

Rachel Wheeler

March 17, 2010


Quantified measurements of data quality should be put in place to reduce the impact that poor data has on US organizations, it has been reported.

Writing for Information Management, Ed Wrazen cites recent research carried out by industry analyst Gartner as a reason to take data quality seriously.

The firm averaged the losses estimated by 140 separate businesses, finding that poor data quality was costing companies an average of $8.2 million every year.

According to the article, measurements need to be made to assess the extent to which data quality issues are affecting the day-to-day running of businesses.

Mr Wrazen argues that a program of data quality metrics should be implemented to raise awareness of problems and opportunities.

"A facility to monitor and measure data quality over time is fundamental too. Only then is it possible to prove that investments in data quality are making a difference," he writes.

Meanwhile, Silicon Republic recently reported that poor data quality could be costing the US over six per cent of its GDP.

Posted by Paul Newton
