Long-term data quality strategies should focus on ensuring that future information is captured accurately, it has been suggested.
Writing for B-Eye-Network, David Loshin, president of consultancy Knowledge Integrity, explained that currently schemes tend to focus on correcting information already stored on systems.
This, Mr Loshin argues, fails to address the root cause of data quality errors and will not ensure that data quality levels improve in the long term.
He said: "There is some flaw in our thinking about process improvement and data quality management and it may center on the concept of validating data with respect to previously identified errors."
"The challenge is not monitoring for errors that you already know about - it is monitoring for errors that you don't know about."
In addition, the industry expert recommended that businesses discuss what type of issues occur and consider ways they could be prevented.
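The distinction Mr Loshin draws, correcting records already in storage versus validating them as they are captured, can be sketched in a few lines. The schema and rules below are purely illustrative assumptions, not from the article: a record is checked at the point of entry, and anything that fails a rule is rejected before it ever reaches the store.

```python
def validate_record(record):
    """Return a list of rule violations; an empty list means the record is clean."""
    errors = []
    # Illustrative rules only: real capture checks would reflect the
    # issues the business has discussed and agreed to prevent.
    if "@" not in record.get("email", ""):
        errors.append("invalid email")
    if not record.get("name", "").strip():
        errors.append("missing name")
    return errors


def capture(store, record):
    """Accept a record into the store only if it passes validation at entry."""
    errors = validate_record(record)
    if errors:
        return errors  # rejected at capture; no bad data is stored
    store.append(record)
    return []


store = []
capture(store, {"name": "Ada", "email": "ada@example.com"})  # accepted
capture(store, {"name": "", "email": "nowhere"})             # rejected
```

Under this approach the stored data set stays clean by construction, rather than being cleansed after the fact, which is the shift in emphasis the article describes.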
Jim Harris recently claimed on his OCDQ blog that organizations looking to improve their data quality should ensure that they invest in the right tools.
Posted by Rachel Wheeler