News, Analysis, Trends, Management Innovations for
Clinical Laboratories and Pathology Groups

Hosted by Robert Michel

Mass General Brigham Joins with Best Buy Health to Create Country’s Largest Hospital-at-Home Program

Clinical laboratories with mobile phlebotomy programs are positioned to benefit as demand for at-home blood draws increases

Hospital-at-Home (HaH) models of remote healthcare continue to pick up speed. The latest example comes from the 793-bed Mass General Brigham (MGB) health system, which partnered with Best Buy Health to build the largest HaH program in the nation, according to Becker’s Hospital Review. This means clinical laboratories will have new opportunities to provide mobile phlebotomy home-draw services for MGB’s HaH patients.

Headquartered in Somerville, Mass., MGB presented its new “Home Hospital” program at the World Medical Innovation Forum (WMIF) in September.

“The health system now has a capacity for acute hospital care at home of 70 patients and is currently treating about 50 to 60 a day. The goal is to move to 10% of Mass General Brigham’s overall capacity, or about 200 to 300 patients,” Becker’s reported.

Best Buy Health provides MGB’s Home Hospital patients with computer tablets and Internet access, Becker’s noted.

“Healthcare is fragmented, the technology doesn’t always connect. Technology is our expertise,” said Chemu Lang’at, COO, Best Buy Health, during the WMIF presentation.

The hospital is the most expensive site of care in the US healthcare industry. Thus, preventing patients from needing to be hospitalized—or treating them in their homes—could reduce the cost of care considerably for both patients and multihospital systems.

“It’s been estimated that 30% of inpatient care will move to the home in the next five years, representing $82 billion in revenue. This is a tremendous opportunity,” said Heather O’Sullivan, MS, RN, A-GNP, Mass General Brigham’s President of Healthcare at Home, during MGB’s presentation at the World Medical Innovation Forum in September, according to Becker’s Hospital Review. MGB’s HaH program offers clinical laboratories new opportunities to provide mobile phlebotomy services to the health system’s Hospital-at-Home patients. (Photo copyright: Mass General Brigham.)

Hospital-at-Home

Proponents of HaH call it a “sustainable, innovative, and next-generation healthcare model. [It is] person-centered medical care that keeps patients out of the hospital, away from possible complications, and on to better outcomes,” RamaOnHealthcare reported.

Some of the biggest payoffs of HaH include:

• Cost Savings: Anne Klibanski, MD, President and CEO, MGB, described the HaH program as “a way the health system could stay afloat and thrive amid financial challenges affecting the industry, with lower costs and better outcomes for patients at home,” Becker’s Hospital Review reported.

• Increased Capacity: Having an HaH program can help alleviate bed shortages by treating many conditions in patients’ homes rather than in the ER. “The program … typically treats patients with conditions like COPD flare-ups, heart failure exacerbations, acute infections and complex cellulitis,” Becker’s reported.

“It’s not typically comfortable to be cared for in the emergency room,” said O’Neil Britton, MD, MGB’s Chief Integration Officer, at WMIF.

• Decreased Staff Exhaustion: “Clinicians have described getting an extra level of joy from treating patients at home,” said Jatin Dave, MD, CMO, MassHealth, at WMIF. He added that this could provide one solution to healthcare burnout, Becker’s noted.

• Lab Connection: Clinical laboratories have the opportunity to meet the need for mobile phlebotomists to draw blood specimens from HaH patients in their homes.

• Patient Satisfaction: “The data suggests that for populations studied in multiple areas, [HaH] is a safe service with high-quality care, low readmission rates, low escalation rates, low infection rates and—bottom line—patients love it,” Adam Groff, MD, co-founder of Maribel Health, told RamaOnHealthcare.

HaH Program Going Forward

Britton told the WMIF audience that MGB hopes to “expand the program for surgery, oncology, and pain management patients, recently admitting its first colorectal surgery patient,” Becker’s reported.

However, the future of MGB’s HaH program is not assured. “The Centers for Medicare and Medicaid Services (CMS) waiver to provide acute hospital care at home expires at the end of 2024. A bill to extend the program recently passed a House committee,” Becker’s reported.

Dave said at WMIF that he “hopes the home will one day provide a ‘single infrastructure’ for all levels of care: from primary to inpatient care to skilled nursing,” Becker’s Hospital Review noted, adding, “The home is where, in the long run, we can have this full continuum.”

Hospital-at-Home programs are not new. In “Best Buy Health and Atrium Health Collaborate on a Hospital-at-Home Program, Leveraging the Electronics Retailer’s ‘Specially Trained’ Geek Squad, Omnichannel Expertise,” Dark Daily covered how Best Buy Health had partnered with 40-hospital Atrium Health in an HaH program that the healthcare system plans to scale nationally.

And in “Orlando Health’s New Hospital-in-the-Home Program Brings Quality Healthcare to Patients in the Comfort of their Homes,” we reported how 3,200-bed Orlando Health had launched its Hospital Care at Home program to provide patients in central Florida acute care outside of traditional hospital settings.

Taken together, these examples offer a snapshot of where the HaH movement in the US currently stands, with Mass General Brigham showing that this mode of healthcare is delivering results and helping patients. Clinical laboratories across the nation should track efforts by hospitals and health systems in their areas to establish and expand hospital-at-home programs.

—Kristin Althea O’Connor

Related Information:

How Mass General Brigham Built the Largest ‘Hospital at Home’

‘Society Will Greatly Benefit’ from the Transformative Hospital-at-Home Movement

Are Hospital at Home Programs Forgetting about the Patient?

Best Buy Health and Atrium Health Collaborate on a Hospital-at-Home Program, Leveraging the Electronics Retailer’s ‘Specially Trained’ Geek Squad, Omnichannel Expertise

Orlando Health’s New Hospital-in-the-Home Program Brings Quality Healthcare to Patients in the Comfort of their Homes

Could Biases in Artificial Intelligence Databases Present Health Risks to Patients and Financial Risks to Healthcare Providers, including Medical Laboratories?

Clinical laboratories working with AI should be aware of ethical challenges being pointed out by industry experts and legal authorities

Experts are voicing concerns that using artificial intelligence (AI) in healthcare could present ethical challenges that need to be addressed. They say databases and algorithms may introduce bias into the diagnostic process, and that AI may not perform as intended, posing a potential for patient harm.

If true, the issues raised by these experts would have major implications for how clinical laboratories and anatomic pathology groups might use artificial intelligence. For that reason, medical laboratory executives and pathologists should be aware of possible drawbacks to the use of AI and machine-learning algorithms in the diagnostic process.

Is AI Underperforming?

AI’s ability to improve diagnoses, precisely target therapies, and leverage healthcare data is predicted to be a boon to precision medicine and personalized healthcare.

For example, Accenture (NYSE:ACN) says that hospitals will spend $6.6 billion on AI by 2021. This represents an annual growth rate of 40%, according to a report from the Dublin, Ireland-based consulting firm, which states, “when combined, key clinical health AI applications can potentially create $150 billion in annual savings for the United States healthcare economy by 2026.”
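As a back-of-the-envelope check on those figures, a constant 40% annual growth rate simply means multiplying each year’s spend by 1.4. The short Python sketch below works backward from the roughly $6.6 billion 2021 figure to an implied base-year value; the seven-year compounding window is an assumption chosen for illustration, not a detail taken from the Accenture report.

```python
# Illustrative only: constant 40% compound annual growth, working backward
# from the reported ~$6.6 billion figure for 2021. The seven-year window
# is an assumption for illustration, not a figure from the Accenture report.

CAGR = 0.40          # 40% annual growth rate cited in the report
END_VALUE_B = 6.6    # projected AI spend in 2021, in billions of dollars
YEARS = 7            # assumed compounding period

# Implied base-year value if growth is a constant 40% per year
base_value_b = END_VALUE_B / (1 + CAGR) ** YEARS
print(f"Implied base-year spend: ${base_value_b:.2f}B")

# Year-by-year trajectory from that implied base
value = base_value_b
for year in range(1, YEARS + 1):
    value *= 1 + CAGR
    print(f"Year {year}: ${value:.2f}B")
```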

But are healthcare providers too quick to adopt AI?

Accenture defines AI as a “constellation of technologies from machine learning to natural language processing that allows machines to sense, comprehend, act, and learn.” However, some experts say AI is not performing as intended, and that it introduces biases in healthcare worthy of investigation.

Keith Dreyer, DO, PhD, is Chief Data Science Officer at Partners Healthcare and Vice Chairman of Radiology at Massachusetts General Hospital (MGH). At a World Medical Innovation Forum on Artificial Intelligence covered by HealthITAnalytics, he said, “There are currently no measures to indicate that a result is biased or how much it might be biased. We need to explain the dataset these answers came from, how accurate we can expect them to be, where they work, and where they don’t work. When a number comes back, what does it really mean? What’s the difference between a seven and an eight or a two?” (Photo copyright: Healthcare in Europe.)

What Goes in Limits What Comes Out

Could machine learning lead to machine decision-making that puts patients at risk? Some legal authorities say yes, especially when computer algorithms are built on limited data sources and questionable methods, lawyers warn.

Pilar Ossorio, PhD, JD, Professor of Law and Bioethics at the University of Wisconsin Law School (UW), told Health Data Management (HDM) that genomics databases, such as Genome-Wide Association Studies (GWAS), house data predominantly about people of Northern European descent, and that could be a problem.

How can AI provide accurate medical insights for people when the information going into databases is limited in the first place? Ossorio pointed to lack of diversity in genomic data. “There are still large groups of people for whom we have almost no genomic data. This is another way in which the datasets that you might use to train your algorithms are going to exclude certain groups of people altogether,” she told HDM.

She also sounded the alarm about making decisions about women’s health when data driving them are based on studies where women have been “under-treated compared with men.”

“This leads to poor treatment, and that’s going to be reflected in essentially all healthcare data that people are using when they train their algorithms,” Ossorio said during a Machine Learning for Healthcare (MLHC) conference covered by HDM.
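One practical way a laboratory’s data science team could check for the kind of dataset bias Ossorio describes is to report a model’s accuracy separately for each demographic subgroup rather than as a single aggregate number. The following Python sketch is a hypothetical illustration using scikit-learn; the markers, groups, and simulated data are invented for the example and are not drawn from GWAS or any other real database.

```python
# Minimal sketch: audit a trained classifier's accuracy by demographic group.
# All data, column names, and groups are hypothetical, for illustration only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 2000

# Simulate a dataset in which one ancestry group is heavily under-represented,
# mirroring the imbalance Ossorio describes in genomic databases.
df = pd.DataFrame({
    "marker_1": rng.normal(size=n),
    "marker_2": rng.normal(size=n),
    "ancestry": rng.choice(["group_a", "group_b"], size=n, p=[0.95, 0.05]),
})
# Hypothetical outcome whose relationship to the markers differs by group.
signal = np.where(df["ancestry"] == "group_a", df["marker_1"], -df["marker_1"])
df["outcome"] = (signal + rng.normal(scale=0.5, size=n) > 0).astype(int)

X = df[["marker_1", "marker_2"]]
y = df["outcome"]
X_train, X_test, y_train, y_test, grp_train, grp_test = train_test_split(
    X, y, df["ancestry"], test_size=0.3, random_state=0)

model = LogisticRegression().fit(X_train, y_train)

# A single aggregate accuracy hides the disparity; per-group accuracy exposes it.
print("Overall accuracy:", accuracy_score(y_test, model.predict(X_test)))
for group in ["group_a", "group_b"]:
    mask = grp_test == group
    acc = accuracy_score(y_test[mask], model.predict(X_test[mask]))
    print(f"Accuracy for {group}: {acc:.2f}")
```

In this simulated data the under-represented group’s accuracy falls well below the headline figure, which is exactly the kind of gap a single aggregate metric would hide.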

How Bias Happens 

Bias can enter healthcare data in three forms: by humans, by design, and in its usage. That’s according to David Magnus, PhD, Director of the Stanford Center for Biomedical Ethics (SCBE) and Senior Author of a paper published in the New England Journal of Medicine (NEJM) titled, “Implementing Machine Learning in Health Care—Addressing Ethical Challenges.”

The paper’s authors wrote, “Physician-researchers are predicting that familiarity with machine-learning tools for analyzing big data will be a fundamental requirement for the next generation of physicians and that algorithms might soon rival or replace physicians in fields that involve close scrutiny of images, such as radiology and anatomical pathology.”

In a news release, Magnus said, “You can easily imagine that the algorithms being built into the healthcare system might be reflective of different, conflicting interests. What if the algorithm is designed around the goal of making money? What if different treatment decisions about patients are made depending on insurance status or their ability to pay?”

In addition to the possibility of algorithm bias, the authors of the NEJM paper have other concerns about AI affecting healthcare providers:

  • “Physicians must adequately understand how algorithms are created, critically assess the source of the data used to create the statistical models designed to predict outcomes, understand how the models function and guard against becoming overly dependent on them.
  • “Data gathered about patient health, diagnostics, and outcomes become part of the ‘collective knowledge’ of published literature and information collected by healthcare systems and might be used without regard for clinical experience and the human aspect of patient care.
  • “Machine-learning-based clinical guidance may introduce a third-party ‘actor’ into the physician-patient relationship, challenging the dynamics of responsibility in the relationship and the expectation of confidentiality.”

“We need to be cautious about caring for people based on what algorithms are showing us. The one thing people can do that machines can’t do is step aside from our ideas and evaluate them critically,” said Danton Char, MD, Lead Author and Assistant Professor of Anesthesiology, Perioperative, and Pain Medicine at Stanford, in the news release. “I think society has become very breathless in looking for quick answers,” he added. (Photo copyright: Stanford Medicine.)

Acknowledge Healthcare’s Differences

Still, the Stanford researchers acknowledge that AI can benefit patients, and that healthcare leaders can learn from other industries, such as car companies, which have already test-driven AI.

“Artificial intelligence will be pervasive in healthcare in a few years,” said Nigam Shah, PhD, co-author of the NEJM paper and Associate Professor of Medicine at Stanford, in the news release. He added that healthcare leaders need to be aware of the “pitfalls” that have happened in other industries and be cognizant of data.

“Be careful about knowing the data from which you learn,” he warned.

AI’s ultimate role in healthcare diagnostics is not yet fully known. Nevertheless, it behooves clinical laboratory leaders and anatomic pathologists who are considering using AI to address issues of quality and accuracy in the lab data they generate, and to be aware of potential biases in the data collection process.

—Donna Marie Pocius

Related Information:

Accenture: Healthcare Artificial Intelligence

Could Artificial Intelligence Do More Harm than Good in Healthcare?

AI Machine Learning Algorithms Are Susceptible to Biased Data

Implementing Machine Learning in Healthcare—Addressing Ethical Challenges

Researchers Say Use of AI in Medicine Raises Ethical Questions
