Are companies measuring the costs of data quality when they invest in big data?

Predictive analytics has finally taken off now that technology costs have dropped and awareness has grown. It's been a long time coming for big data, according to Harish Kotadia in an article for the Smart Data Collective. Kotadia writes that as a Ph.D. student in the 1990s, he was determined to help companies use data to improve the services they provided to their clients.

Even as the technology developed and companies gained access to information through customer relationship management (CRM) programs and business intelligence (BI) systems, adoption was limited due to lack of expertise and budgets, he explains. 

Now, however, big data has reached the mainstream, and implementation is surging as companies realize they can afford technology that enables them to anticipate upcoming trends and prepare for them in advance. Yet they may not necessarily invest in the complementary data quality measures that will ensure those efforts succeed.

Unfortunately, many companies put off these additional purchases until they have experienced the consequences of neglecting data quality. The majority of questions posed to Ted Friedman, vice president and distinguished analyst with Gartner's Information Management team, following a recent webinar dealt with the costs of data quality, as reported by IT Business Edge.

Friedman told the source that many people recognize the fundamental issues involved but choose not to invest until they know exactly how much poor data quality will cost them. That cost can be difficult to quantify, but bad information drains valuable resources and productivity, he explains.