Moving data from one system to another, or integrating systems to create a more complete picture of operations, can be critical for companies. According to eWeek, however, such projects can raise problems unless the employees involved account for variables like data quality.
The news provider suggested that before making any definitive moves to transfer data, workers should survey the information and apply quality management processes. Deduplication software could be critical for this step. Performing these actions, according to eWeek, will ensure that the next group of users derives significant value from the data.
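The deduplication step described above can be sketched in a few lines. This is a minimal illustration, not any specific vendor's software; the customer records, their fields, and the matching rule (normalized name plus email) are all assumptions for the example.

```python
# Minimal sketch of record deduplication as a pre-migration quality step.
# The record fields and matching key are illustrative assumptions.

def normalize(record):
    """Build a matching key: trim whitespace and lowercase each field."""
    return (record["name"].strip().lower(), record["email"].strip().lower())

def deduplicate(records):
    """Keep only the first occurrence of each normalized record."""
    seen = set()
    unique = []
    for rec in records:
        key = normalize(rec)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

customers = [
    {"name": "Ada Lovelace", "email": "ada@example.com"},
    {"name": "ada lovelace ", "email": "ADA@example.com"},  # near-duplicate
    {"name": "Grace Hopper", "email": "grace@example.com"},
]
print(len(deduplicate(customers)))  # → 2
```

Real deduplication tools go further, using fuzzy matching to catch typos and variant spellings, but the principle is the same: resolve duplicates before the data reaches its new home.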
eWeek also reported that data quality management is especially important when taking information from legacy hardware. Shifting files from servers and mainframes into a more flexible cloud setting is becoming more common, meaning the need to ensure those files are consistent and accurate is a widespread concern affecting nearly every industry.
Though nearly all types of firms have some data quality needs, in certain companies the requirements are acute. Intelligent Utility recently reported the results of a Utility Analytics Institute webcast. The participants, data managers from utility providers, urged managers in the field to set aside resources to keep data quality high.