In the 21st century, the push toward digitizing information exploded, except in one major industry: health care. Featured in a recent Wall Street Journal piece on electronic medical records, Dr. Jiban Khuntia, assistant professor of information systems, explained why the shift took so long and what finally led to it.
When you walk into a doctor’s office, you often expect fast answers. However, given how medical data is currently captured and stored, 75 to 80 percent of the information contained in electronic health records cannot be held in the databases hospitals already use, meaning that such data cannot be accessed easily. McKesson, a health IT firm, hopes to put this vast amount of data to use by developing an advanced algorithm that restructures the raw data, increasing the speed and quality of care that medical professionals can provide to patients.
It may seem surprising that such a large pool of data continues to go largely unused. As Dr. Khuntia explained, however, the adoption of virtual records and analytics tools did not begin until recently. It was in 2009, in the early days of the Obama administration, that health-care organizations began to shift toward new health-centered information technology, Dr. Khuntia noted. This was due in part to the monumental legislation being fiercely debated at the time: the Affordable Care Act, more commonly known as “Obamacare.” Amid the debate came increasing pressure on health companies to reduce health-care costs, which ultimately led them to explore IT advances within the field as a way to provide “value-based care” (high-quality, fast, affordable care), which the legislation incentivized with higher provider reimbursements, rather than simply paying providers based on the raw number of patients treated (i.e., fee-for-service models).
Dr. Khuntia recently sat down to expand on the value-based care models discussed in the article. “These models center on patient outcomes and care quality improvements at a lower cost,” he said. “Providers need to assess where they stand in current quality and outcomes performance.”
To do that, providers will need to collect, organize, and analyze the right data to track their performance measures. They will also need an information system infrastructure that links clinical and financial data. Clinicians on the front lines of care can then use the analysis to review and modify processes and workflows in ways that improve performance. Simple changes, such as alert-based systems that help prevent falls or enforce hand-washing protocols in patient rooms, may reduce hospital mortality. In the same vein, improved discharge-instruction procedures may reduce readmissions.
But measuring the impact of these processes requires specialized knowledge of data and analysis. The current motivation for investing in data and analytical technologies within an information systems architecture thus stems largely from the switch from fee-for-service to value-based reimbursement. The current health-care landscape demands lowering costs and improving quality at the same time. There is a push from policymakers, through value-based models, and a pull from patients to approach the task with a thorough understanding of clinical quality measures and the costs of delivering care. This push and pull is the path toward an optimally efficient health-care system, and the enabling solution to this complex challenge is data and analytics.
Since 2009, as more and more health-care organizations have looked for ways to mine existing data, opportunities for health information technology have continued to grow, allowing companies like McKesson to help improve the health-care system for all.
For more information, read the full Wall Street Journal story.