Big data has several qualities that set it apart from classic, structured information, and one of them is speed: companies can capture input and turn it into insight faster than ever. Under these circumstances, IT leaders may wonder whether data quality efforts still make sense. According to Information Management contributor Michele Goetz, they do, but under a new set of usage rules.
Goetz noted that companies may not want to filter their incoming information as meticulously as they once did. That is acceptable, as big data programs are often designed to tolerate lower data integrity. She added, however, that data management practices are worth keeping, because they give the organization a baseline picture of data quality against which new results can be compared.
Whether data needs to be cleansed at all is now an open question, Goetz argued. She suggested that firms make the call based on how far results deviate from expectations and what the risks are in each particular case.
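Goetz's deviation-and-risk heuristic could be sketched in code. The function below, including its name, thresholds, and risk weighting, is purely an illustrative assumption and not a method Goetz prescribes: it flags a batch of data for cleansing only when its observed error rate drifts far enough from the expected baseline, scaled by how risky the downstream use is.

```python
# Illustrative sketch only: a hypothetical rule for deciding whether
# a batch of incoming records warrants cleansing, based on how far
# its quality metrics deviate from an expected baseline and how
# costly bad data would be downstream. Names and thresholds are
# assumptions, not part of any published methodology.

def should_cleanse(observed_error_rate, expected_error_rate,
                   risk_weight, tolerance=0.05):
    """Return True when deviation, scaled by risk, exceeds tolerance.

    observed_error_rate:  fraction of bad records in the new batch
    expected_error_rate:  baseline fraction from past quality checks
    risk_weight:          1.0 for low-stakes analytics; higher when
                          decisions built on the data carry real cost
    """
    deviation = abs(observed_error_rate - expected_error_rate)
    return deviation * risk_weight > tolerance

# A batch close to baseline, feeding a low-risk dashboard: skip cleansing.
print(should_cleanse(0.06, 0.05, risk_weight=1.0))   # False
# The same deviation feeding a high-stakes report: cleanse first.
print(should_cleanse(0.06, 0.05, risk_weight=10.0))  # True
```

The point of the sketch is that the same deviation can justify different answers: the baseline quality picture supplies the expectation, and the business context supplies the risk weight.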
Companies are likely searching for analytics practices that will maximize the value of their data resources, unstructured and otherwise. Data mining expert Dan Abbott noted on his blog that companies early in advanced analytics projects should carefully define their targets to ensure those efforts succeed.