Though big data storage and use are relatively new ideas in the IT world, workers have adapted to them. This has been out of necessity, as companies have faced the challenge of either adding big data programs or falling behind their early-adopter peers. In a short span of time, companies have taken the concepts behind Hadoop and NoSQL databases to heart, making their large data stores available at high speed. According to BeyeNetwork contributor and software industry veteran Barry Devlin, however, there may be a shift on the horizon.

Online giants change course
The future, according to Devlin, is in the hands of the usual suspects in the big data technology world: Google and Facebook. These companies, developed in the digital age, have massive stores of information and have spawned many of the technological approaches that have become industry standard in recent years. Devlin noted that the software being used at these offices involves new methods of storing big data.
Hadoop and NoSQL databases, according to Devlin, are actually quite simple in execution. He framed the current adoption wave as one driven by economics: companies searching for affordable storage paired with an open source development environment. Compared with those modest ambitions, the new database projects in progress at Google in particular look advanced, with the systems becoming faster than their current equivalents.
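The simplicity Devlin points to is easiest to see in the MapReduce model that underpins Hadoop. As a rough illustration only (this is a single-process Python sketch, not Hadoop itself or any system Devlin describes), the classic word-count job reduces to two small functions:

```python
from collections import defaultdict

def map_phase(documents):
    # Map step: emit a (word, 1) pair for every word in every document.
    # In Hadoop this work would be spread across many machines.
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Reduce step: sum the counts emitted for each word key.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data", "big stores of data"]
print(reduce_phase(map_phase(docs)))
# → {'big': 2, 'data': 2, 'stores': 1, 'of': 1}
```

The appeal Devlin identifies is that this pattern asks very little of the programmer: write a map function and a reduce function, and the framework handles distribution. The newer Google systems he cites aim well beyond that baseline.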
Devlin noted that recent reports about Google's big data capacity have focused on the company's efforts to return to a database environment, albeit one capable of processing huge data sets at high speed. He noted that Google has spent the past three years attempting to change the storage paradigm and has published research on its efforts. He stated that the new technologies underway, namely Dremel, Caffeine, Pregel and Spanner, will be important elements of the tech hype cycle in the near future.

Storage and management
One of the open questions about the future of data usage concerns the problems companies already have managing and using their storage systems. TechTarget recently reported that many companies struggle to use their various data sources effectively, with adverse effects on data quality and integrity. These faults often stem from misconceptions brought on by the hype cycle. The source noted that many big data users are so eager to harness flashy external data reservoirs that they ignore very useful internal archives they already have access to.