Data quality problems remain an issue for some organizations, it has been suggested, despite the area being a focal point for many data management professionals and data governance organizations.
Writing for Smart Data Collective, Sean McLowry claims that the lack of guidelines that would allow data practitioners to measure the losses caused by inadequate information is hampering progress.
"Regulatory agencies, executive management and data governance organizations are lacking a standard, objective and scientifically defined way to articulate data quality requirements and measure data quality improvement progress," he argues.
Because of this, Mr McLowry states that many businesses see only limited progress and success from their data quality schemes.
The importance of implementing effective data quality plans was highlighted earlier this year by research conducted by Forbes and SAP.
According to the results of the study, data-related problems are costing some large companies more than $20 million a year.
Of those questioned, 82 per cent believe that bad data can lead to expensive mistakes, while 61 per cent claim that possessing bad information negatively affects their business processes.
Posted by Paul Newman