Data quality should be part of the plan, not an afterthought

Rachel Wheeler
As companies embark on big data plans to capitalize on a trend expected to take off this year, they may find themselves swamped with information. Businesses eager to adopt new analytics solutions are harnessing more data and speeding up processing, according to a recent survey by SAS.

The study, conducted by the Economist Intelligence Unit for the business analytics firm, found that two-thirds of the 752 international respondents were now collecting information about customers through the internet. To accommodate that influx, 65 percent said that in the past year they had increased the speed at which they were able to process that data.

When companies execute big data strategies effectively, they stand to benefit from faster decision-making and real-time information. Those gains shouldn't come at the price of quality, however. Many business executives press ahead with their plans, eager to capture the advantages, only to fall short of their goals because the proper tools were never put in place to structure and verify the data.

"You think that data quality is like taking out the garbage," Grant Ingersoll, chief scientist for LucidWorks, told the audience at a recent big data conference, as reported by Forbes. "It's something you take for granted until it doesn't work, until the garbage man doesn't show up and you got a whole bunch of trash sitting on your front step for weeks on end."