
Merge and validate files for better data quality

Rachel Wheeler
Various departments within an organization may gather and store their own information about customer interactions and operations, particularly at larger companies. As a result, independent databases often emerge that do not interact with each other or share data about clients. This can lead to duplicate records, a problem that only worsens if the company ever decides to merge files or share data across divisions.

For instance, if a retail company's customer service and order fulfillment departments do not share updates to client listings, contact data quality can drop sharply. Even when they do enter files into the same database, failing to validate and consolidate that information lets multiple records with contradictory details proliferate, as in the sketch below.
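
To make that second failure mode concrete, here is a minimal sketch in Python of how such contradictions can be detected. The schema and matching rule are hypothetical, chosen only for illustration: records are grouped by a normalized email address, and groups whose phone numbers disagree are flagged.

from collections import defaultdict

def find_conflicting_records(records):
    """Group customer records by a normalized email key and flag
    groups whose phone numbers disagree (hypothetical schema)."""
    groups = defaultdict(list)
    for rec in records:
        key = rec.get("email", "").strip().lower()
        if key:
            groups[key].append(rec)

    conflicts = {}
    for key, recs in groups.items():
        if len(recs) > 1:
            # Two records for the same customer that disagree on phone
            # number are exactly the "contradicting details" problem.
            phones = {r.get("phone") for r in recs if r.get("phone")}
            if len(phones) > 1:
                conflicts[key] = recs
    return conflicts

records = [
    {"email": "Ann@Example.com", "phone": "555-0100"},
    {"email": "ann@example.com", "phone": "555-0199"},
]
print(find_conflicting_records(records))

In practice the matching key would be fuzzier than an exact email comparison, but even this simple grouping surfaces contradictions that a raw database merge would silently preserve.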

John Schmidt, writing for the Informatica Blog, explains that simply compiling all of your disparate sources of data into one place will create more problems than it solves.

"One version of the truth isn't achieved by putting all your data in one big system or one big database - that's impossible," Schmidt explains. Rather, companies have to take a systematic approach to collecting duplicate files and merging the information to create a single, "true" record. He suggests regularly measuring data quality to keep tabs on the state of a company's records and setting up a program for governing the information and enforcing policies for maintaining it.ADNFCR-16001315-ID-800796071-ADNFCR
