Data quality has been highlighted as one of the crucial factors in the success of IBM's attempts to develop techniques for automating business decisions.
William Pulleyblank, chief of the company's center for business optimization, tells ZDNet.com that the firm is developing a number of algorithms to enable computerized decision-making.
He says the technology faces a number of landmines that could disrupt its implementation, including potential issues with data quality.
"Any automated process is only as good as the data being used. If the data has errors in it, IBM's algorithms won't work as well," he tells the news provider.
Mr Pulleyblank says that by implementing filtering, "noisy data" can be minimised and the accuracy of the decisions can be increased.
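The kind of filtering Mr Pulleyblank describes can be sketched in a few lines. This is an illustrative example only, not IBM's actual pipeline: it drops records whose values are missing or fall outside a plausible range before they reach an automated decision step (the `filter_noisy` function and the order data are invented for illustration).

```python
# Illustrative sketch (not IBM's method): screen out "noisy" records
# before they feed an automated decision process.

def filter_noisy(records, field, lo, hi):
    """Keep only records whose `field` value is numeric and within [lo, hi]."""
    clean = []
    for rec in records:
        value = rec.get(field)
        if isinstance(value, (int, float)) and lo <= value <= hi:
            clean.append(rec)
    return clean

orders = [
    {"id": 1, "amount": 120.0},
    {"id": 2, "amount": -50.0},   # negative amount: likely an entry error
    {"id": 3, "amount": None},    # missing value
    {"id": 4, "amount": 75.5},
]

clean_orders = filter_noisy(orders, "amount", 0.0, 10_000.0)
print([rec["id"] for rec in clean_orders])  # → [1, 4]
```

Even a crude range check like this removes the records most likely to skew an automated decision, which is the effect Mr Pulleyblank attributes to filtering.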
Andy Hayler, chief executive officer of the Information Difference, recently told IT-Director.com that demand for address data quality within master data management initiatives is increasing.
He stated that tools enabling address checking, as well as data quality firewalls, are becoming more readily available.