
Big data brings big changes

Rachel Wheeler
The volume of information produced by businesses today has exploded as more organizations leverage cloud computing, mobile technologies and social networks to store, analyze and access customer records. Ensuring big data quality requires the right tools for the job.

According to a new study by InformationWeek Reports, nearly half of survey respondents have more than 500 terabytes of records stored within their organizations, with financial transactions and email acting as the top two drivers behind the volume explosion. However, only one-third of businesses differentiate between traditional files and "big data," meaning only a fraction of companies are using data quality tools and management solutions to accommodate evolving information demands.

InformationWeek Reports analysts said that companies with more than 30 terabytes of information should begin developing big data strategies, including implementing data quality tools that can classify and categorize different types of records. Decision-makers and IT departments also need to increase bandwidth to minimize latency, a major problem that can emerge when organizations begin managing massive volumes of information.
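To make "classify and categorize different types of records" concrete, here is a minimal rule-based sketch in Python. The category names and matching patterns are hypothetical assumptions for illustration only; production data quality tools rely on metadata, schemas and content inspection rather than simple pattern matching.

```python
import re

# Hypothetical rules mapping content patterns to record categories.
# These patterns are illustrative assumptions, not anything the
# InformationWeek study prescribes.
CATEGORY_RULES = [
    ("financial_transaction", re.compile(r"\bTXN-\d+\b")),
    ("email", re.compile(r"[\w.+-]+@[\w-]+\.\w{2,}")),
]

def categorize(record: str) -> str:
    """Return the first category whose pattern matches the record."""
    for category, pattern in CATEGORY_RULES:
        if pattern.search(record):
            return category
    return "uncategorized"

print(categorize("2014-03-01 TXN-4815 129.99 USD"))  # financial_transaction
print(categorize("From: jane.doe@example.com"))      # email
print(categorize("Customer called about billing."))  # uncategorized
```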

Many firms are adopting cloud computing technologies to reduce latency and lag. According to the study, 32 percent of survey respondents plan to use hosted environments but have not yet made the migration. Meanwhile, another 30 percent of organizations are already testing applications or fully utilizing the cloud to improve data quality management and accessibility.

According to an InfoWorld report, cloud computing environments offer businesses the ability to store large quantities of information for a low price. Cloud environments can easily scale up or down to meet capacity demands and can be accessed virtually anywhere in the world, making them ideal solutions for managing big data.

Another element of big data that decision-makers need to address is complexity. True big data involves multiple information types, including structured, unstructured and semistructured, and can consist of anything from individual log files to sparse or inconsistent information, the study said. Traditional data management tools cannot turn unstructured or semistructured files into meaningful information that a company can then use to its advantage. As a result, organizations often encounter problems when they first tackle big data without the appropriate tools.
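As a rough sketch of why these information types need different handling, the following Python snippet probes a raw record: valid JSON is treated as semistructured, a multi-field delimited row as structured, and everything else as unstructured free text. The heuristics and sample records are assumptions for illustration, not a description of any particular vendor's tool.

```python
import csv
import io
import json

def classify_record(raw: str) -> str:
    """Best-effort classification of a single raw text record."""
    text = raw.strip()
    # Semistructured: the record parses as a JSON object or array.
    try:
        parsed = json.loads(text)
        if isinstance(parsed, (dict, list)):
            return "semistructured"
    except json.JSONDecodeError:
        pass
    # Structured: a comma-delimited row with more than one field.
    row = next(csv.reader(io.StringIO(text)), [])
    if len(row) > 1:
        return "structured"
    # Everything else: free text such as email bodies or log messages.
    return "unstructured"

samples = [
    "2014-03-01,TXN-4815,129.99,USD",               # transaction row
    '{"from": "a@example.com", "subject": "Q1"}',   # JSON email metadata
    "Customer called about a billing question.",    # free-text note
]
for s in samples:
    print(classify_record(s), "<-", s)
```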

These complexities are likely to increase over time as information evolves and businesses continue to create large quantities of data. Problems often arise when companies first deploy data-intensive systems, such as analytics or mining applications, InformationWeek Reports noted.
