The three-step process for improving data quality

Richard Jones Archive

While companies have made great strides in data collection in recent years, gathering more information about their customers and the economy at large than ever before, they still have plenty of work to do on data quality. When pulling knowledge from a variety of disparate sources, every organization needs to be careful about accuracy. Everyone, from retail chains to institutions of higher learning, wants to distill the truth and discard the noise.

Unfortunately, data quality problems have many ways of creeping into an organization's records. They can crop up at the point of collection, or much later. IT leaders constantly need to have their guard up.

According to Smart Data Collective, companies need to have a multi-step process for ensuring data quality. Data governance expert Timo Elliott explained that this procedure must begin by setting standards for quality and preparing to apply them fairly across the board.

"The right approach to all these problems is to have a data quality 'firewall' that filters data rather like internet traffic," Elliott stated. "And you can't create that firewall unless you first have a definition of what 'good data' looks like ... Ultimately, only the business knows what defines business-ready data, therefore IT has to collaborate with them to create the business rules."
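The firewall idea Elliott describes can be pictured as a set of business-defined rules applied to every incoming record, with only clean records passing through. A minimal sketch, assuming hypothetical field names and rules (nothing here comes from the article itself):

```python
# A sketch of a data quality "firewall": business rules, expressed as
# predicates, filter incoming records much like a network firewall
# filters traffic. Rule names and fields are illustrative assumptions.
RULES = {
    "has_email": lambda r: "@" in r.get("email", ""),
    "valid_age": lambda r: isinstance(r.get("age"), int) and 0 < r["age"] < 120,
}

def firewall(records):
    """Split records into accepted and rejected; rejected ones carry
    the names of the rules they failed, for later review."""
    accepted, rejected = [], []
    for record in records:
        failures = [name for name, rule in RULES.items() if not rule(record)]
        if failures:
            rejected.append((record, failures))
        else:
            accepted.append(record)
    return accepted, rejected
```

Defining the rules in one place mirrors Elliott's point: the business supplies the definition of "good data," and IT encodes and enforces it.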

How can quality improve?
It's not easy for companies to ramp up their attention to data quality, but they can make tangible improvements in three main ways.

The first is to shore up data migration. When information is first trickling into a database - from a mobile app, a social media page or a customer phone call, for instance - companies need to make sure it's recorded accurately and there are no technical errors in the process.
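Recording data accurately at the point of entry usually means checking each record against an expected schema before it is written. A hedged sketch, with hypothetical field names:

```python
# Point-of-entry validation: before a record from an app, web form, or
# call-center screen reaches the database, confirm required fields are
# present and correctly typed. The schema below is an assumed example.
REQUIRED_FIELDS = {"customer_id": int, "name": str, "signup_date": str}

def validate_on_ingest(record):
    """Return a list of problems; an empty list means the record is clean."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"wrong type for {field}")
    return problems
```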

Secondly, data governance officials need to make sure they're not making any mistakes in the day-to-day movement of data throughout various points in their infrastructure. At every access point along the way, impurities need to be sniffed out and eliminated.
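One common way to catch errors introduced while data moves between systems is to compare row counts and content fingerprints before and after each hop. A minimal sketch of that idea, using standard-library hashing:

```python
import hashlib
import json

def checksum(records):
    """Order-insensitive fingerprint of a batch of records."""
    digests = sorted(
        hashlib.sha256(json.dumps(r, sort_keys=True).encode()).hexdigest()
        for r in records
    )
    return hashlib.sha256("".join(digests).encode()).hexdigest()

def verify_transfer(source, destination):
    """True only if row counts and content fingerprints match after a hop,
    flagging any records that were dropped, duplicated, or altered."""
    return len(source) == len(destination) and checksum(source) == checksum(destination)
```

Running a check like this at every access point along the pipeline is one way to sniff out impurities before they spread downstream.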

Thirdly, and perhaps most importantly, employees need to be constantly vigilant about human error when transferring data. It's common for workers to make typos, enter outdated information, leave values missing or incorrect, or create duplicate entries for a single person. All of these mistakes must be kept to a minimum.
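Duplicate entries in particular can often be caught automatically by normalizing the fields people type by hand before comparing records. A small sketch, assuming hypothetical name and email fields:

```python
# Catch a common human error: the same person entered twice with minor
# differences in capitalization or spacing. Fields are assumed examples.
def normalize(record):
    """Canonical key for a person: trimmed, lowercased name and email."""
    return (
        record.get("name", "").strip().lower(),
        record.get("email", "").strip().lower(),
    )

def deduplicate(records):
    """Keep the first record seen for each normalized identity."""
    seen = {}
    for record in records:
        seen.setdefault(normalize(record), record)
    return list(seen.values())
```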

This quest for data quality is not a one-time deal. It requires constant attention from workers and supervision from management at any organization.