Look for data decay when archiving information

Rachel Wheeler

Bad data quality can cause serious problems for users. A recent Finextra article suggested that information management issues may have been at play during the 2008 credit crisis. Because important figures such as banks' interest rates were stored as unstructured data in transaction agreements, stakeholders may have found it difficult to accurately assess risks.

This seems counterintuitive given the current sentiment, which is defined by an overarching excitement about the promise of big data as a way to mitigate risks and achieve favorable outcomes. Yet incomplete, inaccurate and outdated information can add new obstacles to organizations' analytics efforts.

In one example, IT professionals at the University of Bristol in the United Kingdom knew there was great potential in the piles of data that had been stored in their systems, InformationWeek reports.

"We always suspected a lot of that data we were backing up constantly was no longer active," Harvey Ditchfield, a senior systems operator for the university, told the news outlet. "I was surprised at how old some of this was and how long it had been since anyone had touched it."

Data has a relatively short shelf life, depending on the information it contains. Names and addresses change frequently, which means data analysts need to regularly purge outdated records or their insights may be contaminated.
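One common approach to catching this kind of decay is to track when each record was last verified and flag anything past a freshness cutoff for review or purging. The sketch below illustrates the idea with a hypothetical record layout and a one-year cutoff; the field names and threshold are assumptions, not part of any specific system mentioned above.

```python
from datetime import datetime, timedelta

# Hypothetical record layout: each contact record carries a
# "last_verified" timestamp recording when its details were confirmed.
records = [
    {"name": "A. Smith", "address": "12 High St", "last_verified": datetime(2024, 11, 2)},
    {"name": "B. Jones", "address": "4 Mill Ln", "last_verified": datetime(2021, 6, 15)},
]

def split_by_freshness(records, max_age_days=365, today=None):
    """Separate records into fresh and stale sets by last-verified age."""
    today = today or datetime.now()
    cutoff = today - timedelta(days=max_age_days)
    fresh = [r for r in records if r["last_verified"] >= cutoff]
    stale = [r for r in records if r["last_verified"] < cutoff]
    return fresh, stale

# Records in the "stale" set are candidates for re-verification or removal
# before they contaminate downstream analytics.
fresh, stale = split_by_freshness(records, today=datetime(2025, 1, 1))
```

In practice the cutoff would vary by field: addresses decay faster than names, so a real pipeline might apply different thresholds per attribute rather than a single record-level one.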