The cost of care is skyrocketing. Could big data, the internet of things (IoT), and internet-based technologies be the cure?
From Martin Shkreli’s notorious price hike of the HIV drug Daraprim, to highly publicized government programs like the Affordable Care Act (ACA), to sparring between presidential frontrunners, healthcare costs have been a hot topic in the national media of late. But amid all the chaos and confusion, there’s one thing that most people can agree upon: the price of healthcare is far too high.
There is statistical evidence to support this assertion: the amount that the average American worker spends on healthcare has increased by 134% over the past decade, according to Aon. And premiums, says the Kaiser Family Foundation, have jumped 303% over the same period, far outpacing both wage growth and inflation. They’re expected to grow another 4.1% this year.
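The decade-long figures above can be restated as compound annual growth rates, which makes them easier to compare with wage growth and inflation. A minimal sketch (the function name and the ten-year window are illustrative assumptions, not from the cited reports):

```python
def annualized_growth(total_increase_pct, years=10):
    """Convert a cumulative percentage increase into a compound annual rate."""
    return ((1 + total_increase_pct / 100) ** (1 / years) - 1) * 100

# A 134% rise over a decade compounds to roughly 8.9% per year;
# a 303% rise compounds to roughly 15.0% per year.
print(round(annualized_growth(134), 1))  # 8.9
print(round(annualized_growth(303), 1))  # 15.0
```

Either rate dwarfs typical annual wage growth, which is the heart of the affordability problem.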
As a result of these skyrocketing costs, more and more average consumers have been forced to forgo treatment — a trend that, unfortunately, doesn’t seem to be slowing down. However, internet technologies are painting a brighter picture, helping the industry both drive down the primary contributing factors of high costs and improve the overall quality of care. But like all major institutional changes, progress will be gradual.
Factors Driving the Crisis of Cost
Sir Andrew Witty, CEO of pharma giant GlaxoSmithKline, was candid about the cost of care crisis in a recent interview with NPR. “I think this [problem] is telling us things have to change… People are concerned about the affordability of healthcare overall.” While it’s clear that the industry needs to evolve, he explains that “The problem in the U.S., bluntly speaking, is there’s no transparency around what the real prices of everything [are],” alleging that as much as 40% of a drug’s list price is diverted away from the companies performing drug R&D (the pharma companies themselves).
Witty says that, to lower the costs of clinical research, these companies need to be held to much higher standards of care. He argues that the creation of accountable care organizations (which act as watchdogs of sorts), increased cooperation between private and public entities, and increased transparency are all positive trends that have been encouraged by the ACA.
And with regard to R&D costs, Witty has a point; according to the Tufts Center for the Study of Drug Development, it takes an average of $2.6 billion to develop a new drug. Moreover, FDA drug approval follows a trend half-jokingly referred to as Eroom’s Law (“Moore’s Law” spelled backwards), wherein the number of drugs approved by the FDA per billion dollars spent tends to halve every nine years (adjusting for inflation) — indeed, FDA approvals have dropped 40% since 2005.
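Eroom’s Law is just exponential decay with a nine-year half-life. A minimal sketch of the trend (the function name and starting rate are illustrative assumptions, not published figures):

```python
def approvals_per_billion(initial_rate, years, halving_period=9.0):
    """R&D productivity under Eroom's Law: output halves every `halving_period` years."""
    return initial_rate * 0.5 ** (years / halving_period)

# Starting from an illustrative 1.0 approvals per $1B of R&D spend:
print(approvals_per_billion(1.0, 9))   # 0.5  after one half-life
print(approvals_per_billion(1.0, 18))  # 0.25 after two half-lives
```

Compounded over decades, this is why each new approval costs so much more than the last.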
Despite this daunting data, there are signs of a silver lining: just as the tech sector has revitalized many struggling industries, it also has the potential to significantly impact healthcare. Internet technologies are increasing transparency and lowering clinical research costs — and, ultimately, the prices patients pay for drugs and therapies.
Big Data and the IoT Create Connective Solutions
One huge barrier to driving down pharmaceutical costs is that drug companies and clinical researchers rarely share information — the data needed to innovate may already exist, but it’s usually comfortably siloed in another organization and thus out of reach. Big data and the internet of things (IoT), however, have started bridging the gaps. As one NCBI-indexed review puts it, “The industry has identified a new frontier that might provide the insights needed to turn the ship around and allow the industry to return to sustainable growth.”
The potential effect is substantial. McKinsey estimates that big data applications could reduce healthcare spending by $300 billion to $450 billion, chipping away as much as 17% of the average $2.6 trillion spent annually on U.S. healthcare. How? In clinical research, for instance, end-to-end data integration and highly targeted digital advertising could enable clinicians to identify and recruit qualified trial participants more accurately and efficiently — thereby shortening enrollment periods and reducing overall trial costs by significant margins. What’s more, trial performance and outcomes can be tracked and optimized in real time, and electronically captured data can then easily be shared with other organizations for renewed, cost-effective R&D.
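A quick sanity check of the McKinsey figures cited above — $300–450 billion in savings against roughly $2.6 trillion in annual U.S. spending (variable names are illustrative):

```python
total_spend_b = 2600  # $2.6 trillion annual U.S. healthcare spend, in billions
savings_low_b, savings_high_b = 300, 450  # McKinsey's estimated savings range

low_pct = savings_low_b / total_spend_b * 100
high_pct = savings_high_b / total_spend_b * 100
print(f"Estimated savings: {low_pct:.1f}%-{high_pct:.1f}% of annual spending")
```

The upper bound works out to about 17.3%, matching the “as much as 17%” figure.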
Impacts Are Becoming Tangible
Many tech and pharmaceutical companies have already taken notice. For example, Pfizer and IBM recently partnered to monitor and collect real-time symptom data from clinical Parkinson’s patients by equipping them with IoT-connected medical devices. Peter Bergethon, head of quantitative medicine at Pfizer, told Forbes, “We need to understand not just why we’re making someone symptomatically better, but we also need to identify earlier on who needs the drug and if we’ll be able to make a difference in the disease progression.”
In other areas, professionals are using these technologies to prevent disease. A Toronto-based firm, BlueDot, successfully used big data disease modeling to accurately predict the spread of the Zika virus. In an area rife with uncertainty and unknowns — professionals don’t have open access to patient information for obvious privacy reasons — this is a huge victory.
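To give a flavor of what “disease modeling” means in the simplest case, here is a toy SIR (susceptible–infected–recovered) model — a textbook illustration only; BlueDot’s actual methods are proprietary and draw on far richer signals (flight data, climate, news feeds), and every parameter below is an arbitrary assumption:

```python
def sir_step(s, i, r, beta, gamma):
    """Advance one time step: beta = transmission rate, gamma = recovery rate."""
    new_infections = beta * s * i
    new_recoveries = gamma * i
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

s, i, r = 0.99, 0.01, 0.0  # fractions of the population
for _ in range(50):
    s, i, r = sir_step(s, i, r, beta=0.5, gamma=0.1)
print(f"After 50 steps, about {r:.0%} of the population has been infected")
```

Real epidemic forecasting layers data ingestion and statistical inference on top of dynamics like these, but the core idea — projecting spread forward from current conditions — is the same.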
Where all of this digital innovation will ultimately take the healthcare industry is, as yet, unknown — but the key takeaway is that regardless of this uncertainty, costs will become more predictable, and hence more manageable. As Sir Andrew Witty said in his interview, “The number one way to reduce costs in pharma R&D is to fail less often. Many people compare it to putting a man on the moon — at least they could see the moon.” Healthcare’s cost problem may be harder to see than the moon, but the internet is beginning to light the way forward.