An ongoing study by IDC estimated that 2.8 zettabytes of information were created and replicated in 2012, a figure projected to jump to approximately 40 zettabytes by 2020. That amounts to more than 5,000 gigabytes of data for every man, woman and child on Earth, yet only 0.5 percent of it is currently being analyzed.
"As the volume and complexity of data barraging businesses from all angles increases, IT organizations have a choice: they can either succumb to information-overload paralysis, or they can take steps to harness the tremendous potential teeming within all of those data streams," IT expert Jeremy Burton said.
IDC noted that if organizations decide to leverage this data, decision-makers need to take the appropriate steps to maintain data quality without jeopardizing security. While accurate information can be extremely useful for a number of mission-critical operations in the private sector, inaccurate data can dramatically hinder a task's effectiveness.
A separate study by Bloor Research found that most companies understand the importance of data quality but sometimes struggle to ensure their information is accurate. With advanced analytic tools, however, decision-makers can readily evaluate files and verify their accuracy.