Expertise and quality can assuage the big data dilemma

Paul Newman

Organizations and companies alike are throwing money at big data, an all-too-familiar course of action meant to remove stumbling blocks and smooth the road to better outcomes. However, investors often find that no matter how deep their pockets, they still need the right people and solutions, such as data quality tools, to achieve their strategic goals.

Take the United States government, for example. A survey by Booz Allen Hamilton and the Government Business Council found that federal departments are optimistic about big data technology but do not currently have the skilled staff members needed to achieve their aims.

While some people attribute these problems to a lack of expertise in the nascent field of big data analytics, others believe the issues are actually rooted in scope, according to FCW.

"In the government sector, you get told you're doing cloud, you're doing big data, you're doing Hadoop, and you're having this technology chasing questions or problems," Jim Campbell, corporate systems engineer at HP, told the source. "But before you do that, you have to define the scope of what you're trying to define. All other questions fall in place once you've done that."

Federal government entities might take note of Los Angeles: the city narrowed its big data scope and implemented a new Automated Traffic Surveillance and Control System that is expected to improve traffic flow, cut drive times and reduce pollution.