Don't let speed and volume trump data quality

Paul Newman

In today's data-driven world, speed and size seem to be of the utmost importance. Businesses are rapidly investing in big data solutions so they can harvest information faster and more effectively than their competitors. As more companies look to emerging analytics tactics for better insights, they will find that without data quality, their velocity and volume can fall short, according to a post by Louis Lovas for the High Frequency Traders blog. 

"Market data comes in many shapes, sizes and encodings. It continually changes and requires corrections and an occasional tweak. The challenge of achieving timely data quality in this data dump is dealing with the vagaries of multiple data sources and managing a sea of reference data," Lovas writes. 
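To illustrate the kind of problem Lovas describes, the sketch below (hypothetical, not from the article) reconciles price records from two imaginary feeds that use different field names, then applies basic quality checks against a small reference-data set. The feed names, schemas, and rules are all illustrative assumptions.

```python
# Illustrative sketch: two hypothetical feeds with different schemas are
# normalized to one common shape, then validated against reference data.

REFERENCE_SYMBOLS = {"AAPL", "MSFT"}  # hypothetical reference data

def normalize(record, source):
    """Map a source-specific record onto a common schema."""
    if source == "feed_a":   # feed A encodes records as {"sym": ..., "px": ...}
        return {"symbol": record["sym"], "price": float(record["px"])}
    if source == "feed_b":   # feed B encodes records as {"ticker": ..., "last": ...}
        return {"symbol": record["ticker"], "price": float(record["last"])}
    raise ValueError(f"unknown source: {source}")

def quality_check(record):
    """Return a list of rule violations; empty means the record passes."""
    errors = []
    if record["symbol"] not in REFERENCE_SYMBOLS:
        errors.append("unknown symbol")
    if record["price"] <= 0:
        errors.append("non-positive price")
    return errors

raw = [
    ({"sym": "AAPL", "px": "189.30"}, "feed_a"),
    ({"ticker": "MSFT", "last": "411.20"}, "feed_b"),
    ({"ticker": "XXXX", "last": "-1"}, "feed_b"),  # fails both checks
]

clean, rejected = [], []
for rec, src in raw:
    norm = normalize(rec, src)
    (rejected if quality_check(norm) else clean).append(norm)
```

The point is not the specific rules but that normalization and validation happen before the data is used, so bad records are quarantined rather than silently mixed into downstream analytics.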

If businesses don't assign people to data quality responsibilities, they often end up pointing fingers after issues and inaccuracies emerge, Jim Harris writes in an entry for the Obsessive-Compulsive Data Quality blog. 

Harris explains that when it comes to rooting out the source of data problems, many professionals involved in the project will refuse to assume responsibility for the mistakes, take ownership of the information, or be held accountable for data governance.