Oct 18, 2018 | Instruments & Equipment, Laboratory Instruments & Laboratory Equipment, Laboratory Management and Operations, Laboratory News, Laboratory Operations, Laboratory Pathology, Management & Operations
Many of the newer patient-centered technologies are those based on the concept of remote patient monitoring (RPM).
RPM is proving so beneficial for patients and healthcare professionals, it has touched off a new wave of innovation—that of microsampling blood collection technology.
More technicians, and the clinicians who rely upon them, are adopting patient-centric technologies to improve the quality of patient care and thus support better clinical outcomes. Included in this proliferation of new technologies are those based on the concept of remote blood sampling using microsampling.
Remote patient monitoring through microsampling blood collection makes many aspects of healthcare less invasive and intrusive, giving patients the ability to participate in their own care from the comfort and privacy of home. Expenses associated with healthcare travel and long wait times are minimized, patients take more control over their treatment, and they are often happier than those who must travel to have illnesses and chronic conditions monitored.
Dark Daily is pleased to offer a recently published free White Paper that gives laboratory professionals valuable, informative insights into how the field-changing technology of microsampling can answer the challenges of changing remote patient monitoring requirements.
“How to Create a Patient-centered Lab with Breakthrough Blood Collection Technology: How to Save Time and Increase Profitability by Using Modular Technology to Improve Access Features, Automate Reporting & Expand Efficiencies” details the ways in which new microsampling blood collection methods facilitate a more patient-centric lab, and provide a user-friendly alternative to older, more intrusive or cumbersome methods.
In addition, this complimentary White Paper provides labs with a practical, step-by-step roadmap to new microsampling technology adoption, deployment, and success.
Download the free White Paper
At DarkDaily.com, readers can access free publications on a variety of topics tailored specifically to the needs of laboratory administrators, lab managers, pathologists, and lab industry consultants.
Oct 17, 2018 | Laboratory Management and Operations, Laboratory News, Laboratory Operations, Laboratory Pathology, Laboratory Sales and Marketing, Laboratory Testing
Shift from fee-for-service to value-based reimbursement is fueling increase in joint ventures and co-branded insurance products, creating opportunities for nimble clinical laboratories and anatomic pathology groups
As healthcare moves from fee-for-service to value-based reimbursement, health insurers and providers are joining forces at a steadily increasing rate, with nearly three-quarters of partnered products in early 2018 being joint ventures or fully co-branded insurance products. This trend presents an opportunity for clinical laboratories to help providers become more effective in their use of laboratory tests as they aim for better patient outcomes and lower treatment costs.
While health systems’ integration with insurance services is not new, the rollout of the Affordable Care Act (ACA) in 2014 and its emphasis on value-based reimbursement helped create renewed interest in vertical integration, notes Becker’s Hospital Review.
According to consulting firm Oliver Wyman, the number of payer-provider partnerships has grown rapidly over the past six years, with 73% of the 22 insurance products launched in the first quarter of 2018 being joint ventures or co-branded offerings.
In comparison:
- 22% of partnerships were joint ventures or co-branded in 2014;
- 33% in 2015;
- 57% in 2016; and,
- 71% in 2017.
Of the 22 new payer-provider partnerships announced this year, 20 product announcements explicitly emphasized value-based compensation, while such compensation was implied but not mentioned in the remaining two product-based partnerships.
“Payers and providers continue to be interested in forming product-based partnerships,” Oliver Wyman stated when releasing the new data. “Our analysis … continues to show a steady increase of trend toward deeper partnership, with more co-branding, greater levels of value-based financial alignment, and other forms of closer collaboration and joint ventures.”
Oliver Wyman cited several “notable” new entrants among this year’s partnerships.
In addition, Oliver Wyman noted that national payers Aetna and Cigna added to their growing rosters of joint ventures in 2018.
Speaking with Healthcare Dive, Tom Robinson, Partner, Health and Life Sciences at Oliver Wyman, described this year’s new ventures as varying in type, size, location, and model. He noted that 50/50 joint ventures with co-branding have gained in popularity; however, accountable care organizations (ACOs), pay-for-performance, and bundled-payment models also are being formed. Robinson believes these vertical integrations offer opportunities for innovation.
“The point of these partnerships is to create something new, rather than just building the same old offerings with a narrow network,” Robinson said. “Successful partnerships will take the opportunity to innovate around the product and experience now that the incentives, insight, investment and integration are all for it.”
In the video above, Oliver Wyman Health and Life Sciences Partner Tom Robinson discusses the emerging trend of payer-provider partnerships and highlights the unique challenges and opportunities of these joint ventures. (Photo and caption copyright: Oliver Wyman.)
Lower Costs, Improved Access Through Payer-Provider Partnerships
In announcing Blue Cross Blue Shield of Rhode Island (BCBSRI) and Lifespan’s launch of the coordinated healthcare plan BlueCHiP Direct Advance, BCBSRI President and Chief Executive Kim Keck pointed to the plan’s ability to drive down healthcare costs.
“We hear a consistent theme from our members—they want more affordable health plan options—and through our collaboration with Lifespan we are doing that,” Keck stated in a news release. “BlueCHiP Direct Advance is an innovative product that features Lifespan’s vast network of providers who are positioned to more effectively manage and coordinate a patient’s care. And, our partnership allows us to offer this new product at a cost that is 10% lower than our comparable plans.”
When Allina Health System of Minnesota and Aetna last year announced their partnership plans, Allina Chief Executive Penny Wheeler, MD, praised the ability of “payer-provider” partnerships to improve care coordination and increase access to preventive care.
Jim Schowalter, MPP, President and Chief Executive of the Minnesota Council of Health Plans, told the Star Tribune the joint venture between the for-profit insurer and local health system would accelerate the shift within the state to value-based care.
“This is another effort in our state that moves us away from old fee-for-service systems,” Schowalter stated. “Working together, doctors and insurers can deliver better personal care and hold down medical expenses.”
While the future of the ACA and other healthcare reforms is uncertain, clinical laboratories and anatomic pathology groups should expect healthcare networks and insurers to continue to find ways of partnering. That means pathologists can expect to have an expanded role in helping providers improve patient outcomes and reduce healthcare spending.
—Andrea Downing Peck
Related Information:
Analysis: Payers and Providers Continue to Partner
Providers Becoming Payors: Should Hospitals Start Their Own Health Plans?
Payer-provider Partnerships on Record Pace
Blue Cross and Blue Shield of Rhode Island and Lifespan Partner to Bring Lower Cost Option to Rhode Island Residents in 2018
Security Health Plan Adds Mayo Clinic Health System to Provider Network
New Partnership Expands WellCare Members’ Access to UNC Health Alliance
Allina Health and Aetna to Launch Insurance Company in Minnesota
Oct 15, 2018 | Instruments & Equipment, Laboratory Instruments & Laboratory Equipment, Laboratory Management and Operations, Laboratory News, Laboratory Operations, Laboratory Pathology, Laboratory Testing, Management & Operations
Silicon Valley startup is using gene sequencing to identify free-floating genetic material shed by tumors into the bloodstream
There has been plenty of excitement about the new diagnostic technologies designed to identify circulating tumor cells in blood samples. Now, a well-funded Silicon Valley startup has developed a blood test that it says holds promise for detecting early-stage lung and other cancers.
Though experimental, the screening test—which uses gene sequencing to identify in the bloodstream cancer-signaling genetic material shed by tumors—would be a boon for clinical laboratories and health networks. It also could play a role in advancing precision medicine treatments and drug therapies.
GRAIL, a Menlo Park, Calif., life sciences company, presented its initial findings at the 2018 American Society of Clinical Oncology Annual Meeting in Chicago. Its lung cancer data is part of GRAIL’s ongoing Circulating Cell-Free Genome Atlas (CCGA) study, which aims to enroll 15,000 participants and investigate 20 different types of cancers.
“We’re excited that the initial results for the CCGA study show it is possible to detect early-stage lung cancer from blood samples using genome sequencing,” said lead study author Geoffrey Oxnard, MD, Dana-Farber Cancer Institute and Associate Professor of Medicine at Harvard Medical School, in a Dana-Farber news release.
“There is an unmet need globally for early-detection tests for lung cancer that can be easily implemented by healthcare systems,” lead study author Geoffrey Oxnard, MD (above), said in the Dana-Farber news release. “These are promising early results and the next steps are to further optimize the assays and validate the results in a larger group of people.” (Photo copyright: Dana-Farber Cancer Institute.)
According to the news release, researchers in this initial analysis explored the ability of three different prototype sequencing assays, each with 98% specificity, to detect lung cancer in blood samples.
“The initial results showed that all three assays could detect lung cancer with a low rate of false positives (in which a test indicates a person has cancer when there is no cancer),” the Dana-Farber news release noted.
Identifying Disease Risk Before Symptoms Appear
Screening tests help identify individuals who are not displaying disease symptoms but may be at high risk for developing a disease. GRAIL’s goal is to develop a test with a specificity of 99% or higher, meaning no more than one out of every 100 people screened would receive a false-positive result.
Otis Brawley, MD, Chief Medical and Scientific Officer at the American Cancer Society, points out that specificity is important when developing a population-based screening test that ultimately would be given to large portions of the general public based on age, medical history, or other factors.
“I am much more concerned about specificity than sensitivity [true positive rate], and [GRAIL] exhibited extremely high specificity,” Brawley told Forbes. “You don’t want a lot of false alarms.”
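The arithmetic behind Brawley’s concern is worth spelling out. The short sketch below uses illustrative numbers only (not GRAIL’s study data, and an assumed sensitivity) to show how a one-point change in specificity plays out when a test is applied to a large screened population.

```python
# Illustrative only: how specificity translates into false alarms in
# population-wide screening. Inputs are assumptions, not GRAIL study data.

def screening_outcomes(population, prevalence, sensitivity, specificity):
    """Return expected true positives and false positives for a screening test."""
    with_disease = population * prevalence
    without_disease = population - with_disease
    true_positives = with_disease * sensitivity             # diseased, correctly flagged
    false_positives = without_disease * (1 - specificity)   # healthy, wrongly flagged
    return true_positives, false_positives

# 100,000 screened adults, 1% prevalence, hypothetical 70% sensitivity
for spec in (0.98, 0.99):
    tp, fp = screening_outcomes(100_000, 0.01, 0.70, spec)
    print(f"specificity {spec:.0%}: ~{tp:,.0f} true positives, ~{fp:,.0f} false positives")
```

With these assumed inputs, 98% specificity yields nearly 2,000 false positives per 100,000 people screened, while 99% cuts that roughly in half, which is why a single percentage point matters so much in population screening.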
Some cancer experts have a wait-and-see reaction to GRAIL’s initial results, due in part to the small sample size included in the sub-study. Benjamin Davies, MD, Associate Professor of Urology at the University of Pittsburgh School of Medicine, and an expert on prostate cancer screening, told Forbes the early data was “compelling,” but the number of patients in the study was too small to generate excitement.
Oxnard, however, believes the initial results validate the promise of GRAIL’s blood screening test project.
“I was a skeptic two years ago,” Oxnard, a GRAIL consultant, told Forbes. “I think these data need to put a lot of the skepticism to rest. It can be done. This is proof you can find cancer in the blood, you can find advanced cancer, therefore this has legs. This has a real future. It’s going to be many steps down the line, but this deserves further investigation and should move forward.”
Next Steps
Researchers next plan to verify the initial results in an independent group of 1,000 CCGA participants as part of the same sub-study. They then will attempt to optimize the assays before validating them in a larger data set from CCGA, the Dana-Farber news release explained.
Illumina, a sequencing-technology developer, formed GRAIL in 2016, with participating investments from Bill Gates, Bezos Expeditions, and Sutter Hill Ventures. Since then, GRAIL has attracted other high-flying investors, including Amazon, Merck, Johnson & Johnson, and Bristol-Myers Squibb.
Forbes notes that as of 2018 GRAIL has raised $1.6 billion in venture capital and has a $3.2 billion valuation, according to private market data firm Pitchbook. Last year, GRAIL merged with Hong Kong-based Cirina Ltd., a privately held company also focused on the early detection of cancer.
While GRAIL’s projects hold promise, anatomic pathologists and clinical laboratories may be wise to temper their enthusiasm until more research is done.
“We all would like to dream that someday you’d be able to diagnose cancer with a blood test,” Eric Topol, MD, Executive Vice President and Professor of Molecular Medicine at Scripps Research, told Forbes. Topol says he’s “encouraged” by GRAIL’s methodical approach, but warns: “We’re at the earliest stage of that.”
—Andrea Downing Peck
Related Information:
Biotech Firm GRAIL Takes the First Steps in Its Quest for a Blood Test for Cancer
Blood Test Shows Potential for Early Detection of Lung Cancer
Detection via Blood-Based Screening
Illumina Launches GRAIL, Focused on Blood-Based Cancer Screening
GRAIL and Cirina Combine to Create Global Company Focused on Early Detection of Cancer
Oct 12, 2018 | Compliance, Legal, and Malpractice, Instruments & Equipment, Laboratory Instruments & Laboratory Equipment, Laboratory Management and Operations, Laboratory News, Laboratory Operations, Laboratory Pathology, Laboratory Testing, Management & Operations
Diagnostic medical laboratories may perform the DNA sequencing for genetic tests correctly, but there are issues with how companies analyze the information
In 2017, some 12 million people paid to spit in a tube and have their genetic data analyzed, according to Technology Review. Many companies offer this type of DNA testing, and each of them works with one or more clinical laboratories to get the actual sequencing performed. For example, Ancestry.com, one of the largest direct-to-consumer genetic data testing companies, works with both Quest Diagnostics and Illumina.
In the case of Quest Diagnostics, the clinical laboratory company does the actual sequencing for Ancestry. But the analysis of the genetic data for an individual and its interpretation is performed by Ancestry’s team.
There are critics of the booming direct-to-consumer genetic testing business, but it’s not due to the quality of the sequencing. Rather, critics cite other issues, such as:
- Privacy concerns;
- How the physical samples are stored and used;
- Who owns the data; and,
- That this branch of genetics is an area of emerging study and not clearly understood.
What Does All That Genetic Data Mean?
The consumer DNA testing market was worth $359 million in 2017 and is projected to grow to $928 million by 2023, according to a report from Research and Markets. Those numbers represent a lot of spit, and an enormous amount of personal health information. As of now, roughly one in every 25 adults in the US has access to their genetic data. But what does all that data mean?
The answer depends, in large part, on who you ask. Many reporters, scientists, and others have taken multiple DNA tests from different companies and received entirely different results. In some cases, sequencing data from a single sample submitted to different companies for analysis has rendered dramatically different results.
“There is a wild-west aspect to all of this,” Erin Murphy, a New York University law professor and genetics specialist who focuses on privacy implications, told McClatchy. “It just takes one person in a family to reveal the genetic information of everyone in the family,” she notes. (Photo copyright: New York University.)
It’s All About the Database
Although some people purchase kits from multiple companies, the majority of people take just one test. Each person who buys genetic analysis from Ancestry, for example, consents to having his/her data become part of Ancestry’s enormous database, which is used to perform the analyses that people pay for. There are some interesting implications to how these databases are built.
First, they are primarily made up of paying customers, which means that the vast majority of genetic datasets in Ancestry’s database come from people who have enough disposable income to purchase the kit and analysis. It may not seem like an important detail, but it shows that the comparison population is not the same as the general population.
Second, because the analyses compare the sample DNA to DNA already in the database, it matters how many people from any given area have taken the test and are in the database. An article in Gizmodo describes one family’s experience with DNA testing and some of the pitfalls. The author quotes a representative from the company 23andMe as saying, “Different companies have different reference data sets and different algorithms, hence the variance in results. Middle Eastern reference populations [for example] are not as well represented as European, an industry-wide challenge.”
The same is true for any population where not many members have taken the test for a particular company. In an interview with NPR about trying to find information about her ancestry, journalist Alex Wagner described a similar problem, saying, “There are not a lot of Burmese people taking DNA tests … and so, the results that were returned were kind of nebulous.”
Wagner’s mother and grandmother both immigrated to the US from Burma in 1965, and when Wagner began investigating her ancestry, she, both of her parents, and her grandmother, all took tests from three different direct-to-consumer DNA testing companies. To Wagner’s surprise, her mother and grandmother both had results that showed they were Mongolian, but none of the results indicated Burmese heritage. In the interview she says that one of the biggest things she learned through doing all these tests was that “a lot of these DNA test companies [are] commercial enterprises. So, they basically purchase or acquire DNA samples on market-demand.”
As it turns out, there aren’t many Burmese people taking DNA tests, so there’s not much reason for the testing companies to pursue having a robust Burmese or even Southeast Asian database of DNA.
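The effect Wagner and the 23andMe representative describe can be shown with a toy model. The sketch below is not any company’s actual algorithm; it simply assigns a made-up genotype to whichever hypothetical reference population gives it the highest likelihood, to illustrate that when a sample’s true population is missing from the reference panel, the call quietly falls back to the closest remaining reference.

```python
# Toy illustration (not any vendor's algorithm): assign a genotype to the
# reference population that gives it the highest likelihood under
# Hardy-Weinberg proportions. All frequencies and genotypes are made up.
import math

# Hypothetical alternate-allele frequencies at five markers.
REFERENCE_PANELS = {
    "Population A": [0.10, 0.80, 0.30, 0.60, 0.20],
    "Population B": [0.15, 0.70, 0.35, 0.55, 0.25],  # close to A, but distinct
    "Population C": [0.70, 0.20, 0.80, 0.10, 0.90],
}

def log_likelihood(genotype, freqs):
    """genotype holds alternate-allele counts (0, 1, or 2) at each marker."""
    ll = 0.0
    for g, p in zip(genotype, freqs):
        probs = {0: (1 - p) ** 2, 1: 2 * p * (1 - p), 2: p ** 2}
        ll += math.log(probs[g])
    return ll

def assign(genotype, panels):
    """Return the reference population that best explains the genotype."""
    return max(panels, key=lambda pop: log_likelihood(genotype, panels[pop]))

sample = [0, 1, 1, 1, 0]  # hypothetical customer genotype

# With Population B in the panel, the sample is assigned to B.
print(assign(sample, REFERENCE_PANELS))

# Drop Population B (an under-represented group) and the same sample is
# still "assigned" -- just to the closest remaining reference, Population A.
without_b = {k: v for k, v in REFERENCE_PANELS.items() if k != "Population B"}
print(assign(sample, without_b))
```

Nothing in that second result tells the customer the best match was merely the closest reference available, which is the gap created by the companies’ differing databases.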
Who Owns Your Genetic Data?
As is often the case when it comes to technological advances, existing law hasn’t quite caught up with the market for ancestry DNA testing. There are some important unanswered questions, such as who owns the data that results from a DNA analysis?
An investigation conducted by the news organization McClatchy found that Ancestry does allow customers to request their DNA information be deleted from the company’s database, and that they can request their physical sample be destroyed as well. The author writes, “But it is a two-step process, and customers must read deep into the company’s privacy statement to learn how to do it. Requests for DNA data elimination can be made online, but the company asks customers to call its support center to request destruction of their biological sample.”
Another concern is hacking or theft. Ancestry and similar companies take steps to protect customers’ information, such as using barcodes rather than names and encryption when samples are sent to labs. Nevertheless, there was an incident in 2017 in which hackers infiltrated a website owned by Ancestry called RootsWeb. “The RootsWeb situation was certainly unfortunate,” Eric Heath, Ancestry’s Chief Privacy Officer, told McClatchy. He added that RootsWeb was a “completely separate system” from the Ancestry database that includes DNA information.
What We Don’t Know
The biggest pitfall for consumers may be how much geneticists still don’t know about this kind of DNA analysis. Adam Rutherford, PhD, is a British geneticist who was interviewed for the Gizmodo story. He said that the real problem with companies like Ancestry is that people have a basic, fundamental misunderstanding of what can be learned from a DNA test.
“They’re not telling you where your DNA comes from in the past. They’re telling you where on Earth your DNA is from today,” Rutherford told Gizmodo.
Science evolves, of course, and genetic testing has much evolving to do. The author of the Gizmodo piece writes, “It’s not that the science is bad. It’s that it’s inherently imperfect.” There aren’t any best practices for analyzing DNA data yet, and companies like Ancestry aren’t doing much to make sure their customers understand that fact.
Nevertheless, issues surrounding genetic testing, the resulting data, and its storage, interpretation, and protection, continue to impact clinical laboratories and anatomic pathology groups.
—Dava Stewart
Related Information:
2017 Was the Year Consumer DNA Testing Blew Up
Quest Diagnostics and Ancestry DNA Collaborate to Expand Consumer DNA Testing
Illumina, Secret Giant of DNA Sequencing, Is Bringing Its Tech to the Masses
Global $928 Million Consumer DNA (Genetic) Testing Market 2018-2023 with 23andMe, Ancestry, Color Genomics and Gene by Gene Dominating
How DNA Testing Botched My Family’s Heritage, and Probably Yours, Too
A Journalist Seeks Out Her Roots but Finds Few Answers in the Soil
Ancestry Wants Your Spit, Your DNA and Your Trust. Should You Give Them All Three?
Oct 10, 2018 | Coding, Billing, and Collections, Compliance, Legal, and Malpractice, Laboratory Management and Operations, Laboratory News, Laboratory Operations, Laboratory Pathology, Laboratory Testing, Management & Operations
Protecting patient privacy is of critical importance, and yet researchers reidentified data using only a few additional data points, casting doubt on the effectiveness of existing federally required data security methods and sharing protocols
Clinical laboratories and anatomic pathologists know the data generated by their diagnostics and testing services constitute most of a patient’s personal health record (PHR). They also know federal law requires them to secure their patients’ protected health information (PHI) and any threat to the security of that data endangers medical laboratories and healthcare practices as well.
Therefore, recent coverage in The Guardian, which reported on how easily so-called “deidentified data” can be reidentified with just a few additional data points, should be of particular interest to clinical laboratory and health network managers and stakeholders.
Risky Balance Between Data Sharing and Privacy
In December 2017, University of Melbourne (UM) researchers Chris Culnane, PhD, Benjamin Rubinstein, and Vanessa Teague, PhD, published a report with the Cornell University Library detailing how they reidentified data listed in an open dataset of Australian medical billing records.
“We found that patients can be re-identified, without decryption, through a process of linking the unencrypted parts of the record with known information about the individual such as medical procedures and year of birth,” Culnane stated in a UM news release. “This shows the surprising ease with which de-identification can fail, highlighting the risky balance between data sharing and privacy.”
In a similar study published in Scientific Reports, Yves-Alexandre de Montjoye, PhD, a computational privacy researcher, used location data on 1.5 million people from a mobile phone dataset collected over 15 months and identified 95% of the people in the anonymized dataset using just four unique data points. With only two unique data points, he could identify 50% of the people in the dataset.
“Location data is a fingerprint. It’s a piece of information that’s likely to exist across a broad range of data sets and could potentially be used as a global identifier,” de Montjoye told The Guardian.
The problem is exacerbated by the fact that everything we do online these days generates data—much of it open to the public. “If you want to be a functioning member of society, you have no ability to restrict the amount of data that’s being vacuumed out of you to a meaningful level,” Chris Vickery, a security researcher and Director of Cyber Risk Research at UpGuard, told The Guardian.
This privacy vulnerability isn’t restricted to users of the Internet and social media. In 2013, Latanya Sweeney, PhD, Professor and Director at Harvard’s Data Privacy Lab, performed a similar analysis on approximately 579 participants in the Personal Genome Project who had provided their ZIP code, date of birth, and gender to be included in the dataset. Of those analyzed, she named 42% of the individuals, and the Personal Genome Project later confirmed 97% of her submitted names, according to Forbes.
In testimony before the Privacy and Integrity Advisory Committee of the Department of Homeland Security (DHS), Latanya Sweeney, PhD (above), Professor and Director at Harvard’s Data Privacy Lab stated, “One problem is that people don’t understand what makes data unique or identifiable. For example, in 1997 I was able to show how medical information that had all explicit identifiers, such as name, address and Social Security number removed could be reidentified using publicly available population registers (e.g., a voter list). In this particular example, I was able to show how the medical record of William Weld, the Governor of Massachusetts of the time, could be reidentified using only his date of birth, gender, and ZIP. In fact, 87% of the population of the United States is uniquely identified by date of birth (e.g., month, day, and year), gender, and their 5-digit ZIP codes. The point is that data that may look anonymous is not necessarily anonymous. Scientific assessment is needed.” (Photo copyright: US Department of Health and Human Services.)
These studies reveal that, regardless of attempts to create security standards such as the Privacy Rule in the Health Insurance Portability and Accountability Act of 1996 (HIPAA), the sheer amount of data available on the Internet makes it relatively easy to reidentify data that has been deidentified.
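A simple exercise shows why Sweeney’s result is possible: in any “deidentified” table, count how many records share each combination of quasi-identifiers. The sketch below runs that count on a fabricated five-row dataset (not real patient data) to show that date of birth, gender, and five-digit ZIP alone can leave most rows unique, and a unique row can be linked back to a person by anyone who finds those three attributes in a public source such as a voter roll.

```python
# Toy illustration of why "deidentified" records can be re-identified:
# count how many rows share each combination of quasi-identifiers.
# All data below is fabricated.
import pandas as pd

deidentified = pd.DataFrame({
    "birth_date": ["1967-03-14", "1967-03-14", "1982-11-02", "1990-07-21", "1990-07-21"],
    "gender":     ["F", "M", "F", "M", "M"],
    "zip5":       ["02138", "02138", "02139", "60201", "60201"],
    "diagnosis":  ["hypertension", "asthma", "diabetes", "asthma", "hypertension"],
})

quasi_identifiers = ["birth_date", "gender", "zip5"]
group_sizes = deidentified.groupby(quasi_identifiers).size()

# A group of size 1 means that combination of attributes points to exactly
# one row; anyone who knows those attributes can link the row to a person.
unique_rows = (group_sizes == 1).sum()
print(group_sizes)
print(f"{unique_rows / len(deidentified):.0%} of rows are unique on DOB + gender + ZIP alone")
```

Sweeney’s 87% figure is essentially this calculation carried out at national scale against census data.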
The Future of Privacy in Big Data
“Open publication of deidentified records like health, census, tax or Centrelink data is bound to fail, as it is trying to achieve two inconsistent aims: the protection of individual privacy and publication of detailed individual records,” Dr. Teague noted in the UM news release. “We need a much more controlled release in a secure research environment, as well as the ability to provide patients greater control and visibility over their data.”
While studies are mounting to show how vulnerable deidentified information might be, there’s little in the way of movement to fix the issue. Nevertheless, clinical laboratories should consider carefully any decision to sell anonymized (AKA, blinded) patient data for data mining purposes. The data may still contain enough identifying information to be used inappropriately. (See Dark Daily, “Coverage of Alexion Investigation Highlights the Risk to Clinical Laboratories That Sell Blinded Medical Data,” June 21, 2017.)
Should regulators and governments address the issue, clinical laboratories and healthcare providers could find more stringent regulations on the sharing of data—both identified and deidentified—and increased liability and responsibility regarding its governance and safekeeping.
Until then, any healthcare professional or researcher should consider the implications of deidentification for both patients and businesses, should the data they share be used in unexpected and potentially malicious ways.
—Jon Stone
Related Information:
‘Data Is a Fingerprint’: Why You Aren’t as Anonymous as You Think Online
Research Reveals De-Identified Patient Data Can Be Re-Identified
Health Data in an Open World
The Simple Process of Re-Identifying Patients in Public Health Records
Harvard Professor Re-Identifies Anonymous Volunteers in DNA Study
How Someone Can Re-Identify Your Medical Records
Trading in Medical Data: Is this a Headache or An Opportunity for Pathologists and Clinical Laboratories
Coverage of Alexion Investigation Highlights the Risk to Clinical Laboratories That Sell Blinded Medical Data
Oct 8, 2018 | Laboratory Management and Operations, Laboratory News, Laboratory Operations, Laboratory Pathology, Laboratory Testing
Both health systems will use their EHRs to track genetic testing data and plan to bring genetic data to primary care physicians
Clinical laboratories and pathology groups face a big challenge in how to get appropriate genetic and molecular data into electronic health record (EHR) systems in ways that are helpful for physicians. Precision medicine faces many barriers and this is one of the biggest. Aside from the sheer enormity of the data, there’s the question of making it useful and accessible for patient care. Thus, when two major healthcare systems resolve to accomplish this with their EHRs, laboratory managers and pathologists should take notice.
NorthShore University HealthSystem in Illinois and Geisinger Health System in Pennsylvania and New Jersey are working to make genetic testing part of primary care. And both reached similar conclusions regarding the best way for primary care physicians to make use of the information.
One area of common interest is pharmacogenomics.
At NorthShore, two genetic testing programs—MedClueRx and the Genetic and Wellness Assessment—provide doctors with more information about how their patients metabolize certain drugs and whether or not their medical and family histories suggest they need further, more specific genetic testing.
“We’re not trying to make all of our primary care physicians into genomic experts. That is a difficult strategy that really isn’t scalable. But we’re giving them enough tools to help them feel comfortable,” Peter Hulick, MD, Director of the Center for Personalized Medicine at NorthShore, told Healthcare IT News.
By contrast, Geisinger has made genomic testing an automated part of primary care. When patients visit their primary care physicians, they are asked to sign a release and undergo whole genome sequencing. An article in For the Record describes Geisinger’s program:
“The American College of Medical Genetics and Genomics classifies 59 genes as clinically actionable, with an additional 21 others recommended by Geisinger. If a pathogenic or likely pathogenic variant is found in one of those 80 genes, the patient and the primary care provider are notified.”
William Andrew Faucett (left) is Director of Policy and Education, Office of the Chief Scientific Officer at Geisinger Health; and Peter Hulick, MD (right), is Director of the Center for Personalized Medicine at NorthShore University HealthSystem. Both are leading programs at their respective healthcare networks to improve precision medicine and primary care by including genetic testing data and accessibility to it in their patients’ EHRs. (Photo copyrights: Geisinger/NorthShore University HealthSystem.)
The EHR as the Way to Access Genetic Test Results
Both NorthShore and Geisinger chose their EHRs as the vehicle for making important genetic information accessible to primary care physicians, as well as the avenue for tracking that information over time.
Hulick told Healthcare IT News that NorthShore decided to make small changes to its existing Epic EHR to enable seemingly simple but actually complex actions, such as tracking the results of a genetic test within the EHR. According to Hulick, making genetic test results trackable creates a “variant repository,” also known as a Clinical Data Repository.
“Once you have that, you can start to link it to other information that’s known about the patient: family history status, etc.,” he explained. “And you can start to build an infrastructure around it and use some of the tools for clinical decision support that are used in other areas: drug/drug interactions, reminders for flu vaccinations, and you can start to build on those decision support tools but apply them to genomics.”
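For laboratory informaticists, the kind of genomics-aware decision support Hulick describes can be sketched in a few lines. The example below is an illustration built on assumptions, not NorthShore’s actual Epic build: it stores reportable results in a per-patient “variant repository” and checks a drug order against a pharmacogenomic rule. The CYP2C19 poor-metabolizer/clopidogrel pairing appears only because it is a commonly cited example; real rules would come from clinical guidelines.

```python
# Minimal sketch (an assumption, not NorthShore's Epic implementation):
# a per-patient "variant repository" plus a decision-support check that
# fires when a drug order matches a stored pharmacogenomic finding.
from dataclasses import dataclass
from typing import Dict, List, Set, Tuple

@dataclass
class VariantResult:
    gene: str
    finding: str       # phenotype as reported by the lab
    report_date: str

# Illustrative rule table: (gene, finding) -> drugs that should trigger an alert.
PGX_RULES: Dict[Tuple[str, str], Set[str]] = {
    ("CYP2C19", "poor metabolizer"): {"clopidogrel"},
}

# The "variant repository": reportable results keyed by patient identifier.
variant_repository: Dict[str, List[VariantResult]] = {
    "patient-001": [VariantResult("CYP2C19", "poor metabolizer", "2018-06-12")],
}

def check_order(patient_id: str, drug: str) -> List[str]:
    """Return alert messages for a proposed drug order, if any."""
    alerts = []
    for result in variant_repository.get(patient_id, []):
        if drug.lower() in PGX_RULES.get((result.gene, result.finding), set()):
            alerts.append(
                f"{result.gene} {result.finding} on file ({result.report_date}); "
                f"review pharmacogenomic guidance before prescribing {drug}."
            )
    return alerts

print(check_order("patient-001", "clopidogrel"))  # alert fires
print(check_order("patient-001", "lisinopril"))   # no alert
```

The point of the structure is the one Hulick makes: once results live in a repository keyed to the patient, the same alerting machinery already used for drug/drug interactions and vaccination reminders can be pointed at genomic findings.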
Like NorthShore, Geisinger is also using its EHR to make genetic testing information available to primary care physicians when a problem variant is identified. The health system uses EHR products from both Epic and Cerner and is working with both companies to streamline and simplify the processes related to genetic testing. When a potentially problematic variant is found, it is listed in the EHR’s problem list, similar to other health issues.
Geisinger has developed a reporting system called GenomeCOMPASS, which notifies patients of their results and provides related information. It also enables patients to connect with a geneticist. GenomeCOMPASS has a physician-facing side where primary care doctors receive the results and have access to more information.
Andrew Faucett, Senior Investigator (Professor) and Director of Policy and Education, Office of the Chief Scientific Officer at Geisinger, compares the interpretation of genetic testing to that of any other kind of medical testing. “If a patient gets an MRI, the primary care physician doesn’t interpret it—the radiologist does,” he told For the Record, adding, “Doctors want to help patients follow the recommendations of the experts.”
The Unknown Factor
Even though researchers regularly make new discoveries in genomics, physicians practicing today have had little, if any, training on how to incorporate genetics into their patients’ care. Combine that lack of knowledge and training with the current lack of EHR interoperability, and the challenges of using genetic testing for precision medicine multiply to a staggering degree.
One thing that is certain: the scientific community will continue to gather knowledge that can be applied to improving the health of patients. Medical pathology laboratories will play a critical role in both testing and helping ensure results are useful and accessible, now and in the future.
—Dava Stewart
Related Information:
Introducing “Genomics and Precision Health”
How NorthShore Tweaked Its Epic EHR to Put Precision Medicine into Routine Clinical Workflows
Precise, Purposeful Health Care
Next-Generation Laboratory Information Management Systems Will Deliver Medical Laboratory Test Results and Patient Data to Point of Care, Improving Outcomes, Efficiency, and Revenue