Big data has changed the game. Companies can now store as much information as they need and use more of it productively than ever before. These shifts in the status quo have affected the science of maintaining data quality. Data Roundtable contributor Dylan Jones recently stated that data management is an important process for keeping IT complexity from spiraling out of control.
As companies take on more data sources, there is a persistent risk that IT will grow unchecked and become confusing and difficult to manage. Jones presented data management and data quality tools as an antidote to this mounting confusion, stating that if companies follow industry standards when pursuing data quality, simplicity will naturally follow.
Jones offered an example to make his point: a utilities provider that was unable to mine its more than 30 network inventory storage systems for insight. The company's main failing, according to Jones, was a lack of data management.
A specific plan for managing big data is sometimes overlooked. According to Network World contributor Jill Dyche, some firms are so fixated on finding ways to capture and scan their massive new information stores that they lose focus on data management best practices.