Good data quality helping the healthcare industry identify solutions

Rachel Wheeler
The United States healthcare industry is going through a number of changes. This year, the Affordable Care Act was upheld as constitutional, which means healthcare providers will soon be experiencing an influx of patients who were previously uninsured. At the same time, doctors are implementing new tech tools, such as electronic health records, that will improve accuracy and efficiency.

When combined, these two factors mean the healthcare industry is soon going to have a massive amount of data on its hands, according to Federal Computer Week.

In fact, the Demystifying Big Data report by TechAmerica Foundation's Federal Big Data Commission found that in 2009, the healthcare sector generated 150 exabytes of data. That is more than the U.S. government's 848 petabytes, and far more than the roughly five exabytes it would take to represent all of the words ever spoken by human beings.

Rather than focusing on how big this data is, Shahid Shah suggests in an article for Health IT News that healthcare facilities focus their resources on data quality, identifying the most actionable and practical information.

When they do so, they might be able to find unexpected correlations that can lead to major medical discoveries, such as one recently reported by Genetic Engineering & Biotechnology News. A study by deCODE Genetics and Illumina revealed a connection between an immune system gene variation and a common form of Alzheimer's disease.

"So-called big data research has evolved to a new level of sophistication due to new research tools, access to expanded and high-quality genomic data sets and certainly the profound analytic skill level of investigators now combining sequence data and biological knowledge to find drug targets," said Kari Stefansson, CEO of deCODE Genetics.