Oct 26, 2018 | Digital Pathology, Instruments & Equipment, Laboratory Instruments & Laboratory Equipment, Laboratory Management and Operations, Laboratory News, Laboratory Operations, Laboratory Pathology, Laboratory Testing
Future EHRs will focus on efficiency, machine learning, and cloud services—improving how physicians and medical laboratories interact with the systems to support precision medicine and streamlined workflows
When the next generation of electronic health record (EHR) systems reaches the market, it will offer advanced features that include cloud-based services and the ability to collect data from and communicate with patients through mobile devices. These developments will give clinical laboratories and anatomic pathology groups new opportunities to create value with their lab testing services.
Proposed Improvements and Key Trends
Experts with EHR developers Epic Systems, Allscripts, Accenture, and drchrono spoke recently with Healthcare IT News about future platform initiatives and trends they feel will shape their next generation of EHR offerings.
They include:
- Automation analytics and human-centered design for increased efficiency and to help reduce physician burnout;
- Improved feature parity across mobile and computer EHR interfaces to provide patients, physicians, and medical laboratories with access to information across a range of technologies and locations;
- Integration of machine learning and predictive modeling to improve analytics and allow for better implementation of genomics-informed medicine and population health features; and
- A shift toward cloud-hosted EHR solutions with support for application programming interfaces (APIs) designed for specific healthcare facilities that reduce IT overhead and make EHR systems accessible to smaller practices and facilities.
Should these proposals move forward, future generations of EHR platforms could transform from simple data storage/retrieval systems into critical tools physicians and medical laboratories use to facilitate communications and support decision-making in real time.
And, cloud-based EHRs with access to clinical labs’ APIs could enable those laboratories to communicate with and receive data from EHR systems with greater efficiency. This would eliminate yet another bottleneck in the decision-making process, and help laboratories increase volumes and margins through reduced documentation and data management overhead.
Cloud-based EHRs and Potential Pitfalls
Cloud-based EHRs rely on cloud computing, where IT resources are shared among multiple entities over the Internet. Such EHRs are highly scalable and allow end users to save money by hiring third-party IT services, rather than maintaining expensive IT staff.
Kipp Webb, MD, provider practice lead and Chief Clinical Innovation Officer at Accenture, told Healthcare IT News that several EHR vendors are only a few years away from releasing cloud-based inpatient/outpatient EHR systems capable of meeting the needs of full-service medical centers.
While such a system would mean existing health networks would no longer need private infrastructure and dedicated IT teams to manage EHR system operations, a major shift in how next-gen systems are deployed and maintained could raise interoperability and data transmission concerns, at least in the short term.
Yet, the transition also could lead to improved flexibility and connectivity between health networks and data providers—such as clinical laboratories and pathologist groups. This would be achieved through application programming interfaces (APIs) that enable computer systems to talk to each other and exchange data much more efficiently.
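What such an API-based exchange might look like can be sketched in miniature. The example below builds an HL7 FHIR-style REST query that a laboratory system could send to a cloud EHR to fetch a patient's pending lab orders. The endpoint URL and patient ID are hypothetical, and while the resource and parameter names follow FHIR conventions, this is an illustration, not any vendor's actual API.

```python
from urllib.parse import urlencode

def build_fhir_query(base_url: str, resource: str, **params: str) -> str:
    """Build a FHIR-style REST query URL for a given resource type."""
    return f"{base_url}/{resource}?{urlencode(params)}"

# Hypothetical cloud EHR endpoint and patient identifier
url = build_fhir_query(
    "https://ehr.example.com/fhir",
    "ServiceRequest",        # the FHIR resource type used for lab orders
    patient="12345",
    category="laboratory",
)
print(url)
```

In practice, a lab's middleware would send this request (with authentication) and parse the returned resources, rather than maintaining a custom point-to-point interface for each health network.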
“Perhaps one of the biggest ways having a fully cloud-based EHR will change the way we as an industry operate will be enabled API access,” Daniel Kivatinos, COO and founder of drchrono, told Healthcare IT News. “You will be able to add other partners into the mix that just weren’t available before when you have a local EHR install only.”
Paul Black, CEO of Allscripts, believes these changes will likely require more than upgrading existing software or hardware. “The industry needs an entirely new approach to the EHR,” he told Healthcare IT News. “We’re seeing a huge need for the EHR to be mobile, cloud-based, and comprehensive to streamline workflow and get smarter with every use.” (Photo copyright: Allscripts.)
Reducing Physician Burnout through Human-Centered Design
As Dark Daily reported last year, EHRs have been identified as contributing to physician burnout, increased dissatisfaction, and decreased face-to-face interactions with patients.
Carl Dvorak, President of Epic Systems, notes that next-gen EHR changes, combined with increased automation, hold the potential to streamline the communication of orders, laboratory testing data, and information relevant to patient care. They could help physicians reach treatment decisions faster and give laboratories more insight, so they can suggest appropriate testing pathways for each episode of care.
“[Automation analytics] holds the key to unlocking some of the secrets to physician well-being,” Dvorak told Healthcare IT News. “For example, we can avoid work being unnecessarily diverted to physicians when it could be better managed by others.”
Black cites similar benefits, saying, “We believe using human-centered design will transform the way physicians experience and interact with technology, as well as improve provider wellness.”
Some might question the success of the first wave of EHR systems. Though built primarily to address healthcare reform requirements, these systems provided critical feedback and data to EHR developers focused not simply on fulfilling regulatory requirements, but on meeting the needs of patients and care providers as well.
If these next-generation systems can help improve the quality of data recording, storage, and transmission, while also reducing physician burnout, they will have come a long way from the early EHRs. For medical laboratory professionals, these changes will likely affect how orders are received and how lab results are reported back to doctors. Thus, these developments are worth monitoring.
—Jon Stone
Related Information:
Next-Gen EHRs: Epic, Allscripts and Others Reveal Future of Electronic Health Records
Next-Gen IT Infrastructure: A Nervous System Backed by Analytics and Context
EHR Systems Continue to Cause Burnout, Physician Dissatisfaction, and Decreased Face-to-Face Patient Care
Oct 22, 2018 | Digital Pathology, Instruments & Equipment, Laboratory Instruments & Laboratory Equipment, Laboratory Management and Operations, Laboratory News, Laboratory Pathology, Laboratory Testing
UK study shows how LDTs may one day enable physicians to identify patients genetically predisposed to chronic disease and prescribe lifestyle changes before medical treatment becomes necessary
Could genetic predisposition lead to clinical laboratory-developed tests (LDTs) that enable physicians to assess patients’ risk for specific diseases years ahead of onset of symptoms? Could these LDTs inform treatment/lifestyle changes to help reduce the chance of contracting the disease?
A UK genetic study of blood pressure involving one million people reveals such tests could one day exist.
Researchers at Queen Mary University of London and Imperial College London uncovered 535 new gene regions affecting hypertension in the largest ever worldwide genetic study of blood pressure, according to a news release.
They also confirmed 274 loci (gene locations) and replicated 92 loci for the first time.
“This is the most major advance in blood pressure genetics to date. We now know that there are over 1,000 genetic signals which influence our blood pressure. This provides us with many new insights into how our bodies regulate blood pressure and has revealed several new opportunities for future drug development,” said Mark Caulfield, MD, Professor of Clinical Pharmacology at Queen Mary University of London and Director of the National Institute for Health Research Barts Biomedical Research Centre, in the news release.
The researchers believe “this means almost a third of the estimated heritability for blood pressure is now explained,” the news release noted.
Clinical Laboratories May Eventually Get a Genetic Test Panel for Hypertension
Of course, more research is needed. But the study suggests a genetic test panel for hypertension may be in the future for anatomic pathologists and medical laboratories. Physicians might one day be able to determine their patients’ risks for high blood pressure years in advance and advise treatment and lifestyle changes to avert medical problems.
By involving more than one million people, the study also demonstrates how ever-growing pools of data will be used in research to develop new diagnostic assays.
The researchers published their study in Nature Genetics.
The video above summarizes research led by Queen Mary University of London and Imperial College London, which found over 500 new gene regions that influence people’s blood pressure, in the largest global genetic study of blood pressure to date. Click here to view the video. (Photo and caption copyright: Queen Mary University of London.)
Genetics Influence Blood Pressure More Than Previously Thought
In addition to identifying hundreds of new genetic regions influencing blood pressure, the researchers compared people with the highest genetic risk of high blood pressure to those with the lowest. Based on all genetic variants, those in the highest-risk group had:
- around a 13 mm Hg higher blood pressure;
- 3.34 times the odds of hypertension; and
- 1.52 times the odds of poor cardiovascular outcomes.
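For readers unfamiliar with odds ratios, the arithmetic behind a figure like 3.34 can be sketched as follows. The counts below are invented for illustration (chosen so the ratio lands at 3.34); they are not the study's actual data.

```python
def odds(cases: int, non_cases: int) -> float:
    """Odds of the outcome within one group."""
    return cases / non_cases

def odds_ratio(high_cases: int, high_non: int,
               low_cases: int, low_non: int) -> float:
    """Ratio of the odds in the high-risk group to the odds in the low-risk group."""
    return odds(high_cases, high_non) / odds(low_cases, low_non)

# Hypothetical hypertension counts: high- vs. low-genetic-risk groups
result = odds_ratio(500, 500, 250, 835)
print(round(result, 2))  # 3.34
```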
“We identify 535 novel blood pressure loci that not only offer new biological insights into blood pressure regulation, but also highlight shared genetic architecture between blood pressure and lifestyle exposures. Our findings identify new biological pathways for blood pressure regulation with potential for improved cardiovascular disease prevention in the future,” the researchers wrote in Nature Genetics.
Other Findings Link Known Genes and Drugs to Hypertension
The UK researchers also revealed the Apolipoprotein E (ApoE) gene’s relation to hypertension. This gene has been associated with both Alzheimer’s and coronary artery diseases, noted Lab Roots. The study also found that Canagliflozin, a drug used in type 2 diabetes treatment, could be repurposed to also address hypertension.
“Identifying genetic signals will increasingly help us to split patients into groups based on their risk of disease,” Paul Elliott, PhD, Professor, Imperial College London Faculty of Medicine, School of Public Health, and co-lead author, stated in the news release. “By identifying those patients who have the greatest underlying risk, we may be able to help them to change lifestyle factors which make them more likely to develop disease, as well as enabling doctors to provide them with targeted treatments earlier.”
Working to Advance Precision Medicine
The study shares new and important information about how genetics may influence blood pressure. By acquiring data from more than one million people, the UK researchers also may be setting a new expectation for research about diagnostic tests that could become part of the test menu at clinical laboratories throughout the world. The work could help physicians and patients understand risk of high blood pressure and how precision medicine and lifestyle changes can possibly work to prevent heart attacks and strokes among people worldwide.
—Donna Marie Pocius
Related Information:
Study of One Million People Leads to World’s Biggest Advance in Blood Pressure Genetics
Researchers Find 535 New Gene Regions That Influence Blood Pressure
Genetic Analysis of Over One Million Identifies 535 New Loci Associated with Blood Pressure Traits
The Facts About High Blood Pressure
High Blood Pressure Breakthrough: Over 500 Genes Uncovered
Study of a Million People Reveals Hypertension Genes
Oct 18, 2018 | Instruments & Equipment, Laboratory Instruments & Laboratory Equipment, Laboratory Management and Operations, Laboratory News, Laboratory Operations, Laboratory Pathology, Management & Operations
Many of the newer patient-centered technologies are those based on the concept of remote patient monitoring (RPM).
RPM is proving so beneficial for patients and healthcare professionals, it has touched off a new wave of innovation—that of microsampling blood collection technology.
Download the free White Paper
More technicians, and the clinicians who rely upon them, are adopting patient-centric technologies to improve the quality of patient care and support better clinical outcomes. Among these new technologies are those based on the concept of remote blood sampling using microsampling technology.
Remote patient monitoring through microsampling blood collection makes many aspects of healthcare less invasive and intrusive, allowing patients to participate in their care from the comfort and privacy of home. Expenses associated with healthcare travel and long wait times are minimized, and patients take more control over their treatment and are often happier than those who must travel to have illnesses and chronic conditions monitored.
Dark Daily is pleased to offer a recently published free White Paper that shares with laboratory professionals valuable and informative insights on how the field-changing technology of microsampling can answer the challenges of changing remote patient requirements.
“How to Create a Patient-centered Lab with Breakthrough Blood Collection Technology: How to Save Time and Increase Profitability by Using Modular Technology to Improve Access Features, Automate Reporting & Expand Efficiencies” details the ways in which new microsampling blood collection methods facilitate a more patient-centric lab, and provide a user-friendly alternative to older, more intrusive or cumbersome methods.
In addition, this complimentary White Paper provides labs with a practical, step-by-step roadmap to new microsampling technology adoption, deployment, and success.
Download the free White Paper
At DarkDaily.com, readers can access free publications on a variety of topics tailored specifically to the needs of laboratory administrators, lab managers, pathologists, and lab industry consultants.
Oct 15, 2018 | Instruments & Equipment, Laboratory Instruments & Laboratory Equipment, Laboratory Management and Operations, Laboratory News, Laboratory Operations, Laboratory Pathology, Laboratory Testing, Management & Operations
Silicon Valley startup is using gene sequencing to identify in the bloodstream free-floating genetic material shed by tumors
There has been plenty of excitement about the new diagnostic technologies designed to identify circulating tumor cells in blood samples. Now, a well-funded Silicon Valley startup has developed a blood test that it says holds promise for detecting early-stage lung and other cancers.
Though experimental, the screening test—which uses gene sequencing to identify in the bloodstream cancer-signaling genetic material shed by tumors—would be a boon for clinical laboratories and health networks. It also could play a role in advancing precision medicine treatments and drug therapies.
GRAIL, a Menlo Park, Calif., life sciences company, presented its initial findings at the 2018 American Society of Clinical Oncology Annual Meeting in Chicago. Its lung cancer data is part of GRAIL’s ongoing Circulating Cell-Free Genome Atlas (CCGA) study, which aims to enroll 15,000 participants and investigate 20 different types of cancers.
“We’re excited that the initial results for the CCGA study show it is possible to detect early-stage lung cancer from blood samples using genome sequencing,” said lead study author Geoffrey Oxnard, MD, Dana-Farber Cancer Institute and Associate Professor of Medicine at Harvard Medical School, in a Dana-Farber news release.
“There is an unmet need globally for early-detection tests for lung cancer that can be easily implemented by healthcare systems,” lead study author Geoffrey Oxnard, MD (above), said in the Dana-Farber news release. “These are promising early results and the next steps are to further optimize the assays and validate the results in a larger group of people.” (Photo copyright: Dana-Farber Cancer Institute.)
According to the news release, researchers in this initial analysis explored the ability of three different prototype sequencing assays, each with 98% specificity, to detect lung cancer in blood samples:
“The initial results showed that all three assays could detect lung cancer with a low rate of false positives (in which a test indicates a person has cancer when there is no cancer),” the Dana-Farber news release noted.
Identifying Disease Risk Before Symptoms Appear
Screening tests help identify individuals who are not displaying disease symptoms but may be at high risk for developing a disease. GRAIL’s goal is to develop a test with a specificity of 99% or higher, meaning no more than one out of 100 people without cancer would receive a false-positive result.
Otis Brawley, MD, Chief Medical and Scientific Officer at the American Cancer Society, points out that specificity is important when developing a population-based screening test that ultimately would be given to large portions of the general public based on age, medical history, or other factors.
“I am much more concerned about specificity than sensitivity [true positive rate], and [GRAIL] exhibited extremely high specificity,” Brawley told Forbes. “You don’t want a lot of false alarms.”
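The practical difference between 98% and 99% specificity becomes concrete at population scale. A back-of-envelope sketch (the population size is arbitrary):

```python
def expected_false_positives(n_without_disease: int, specificity: float) -> int:
    """Expected false alarms among people who do NOT have the disease."""
    return round(n_without_disease * (1 - specificity))

for spec in (0.98, 0.99):
    print(f"specificity {spec:.0%}: "
          f"{expected_false_positives(100_000, spec)} false positives per 100,000")
```

Halving the false-positive rate (from 2% to 1%) halves the number of healthy people sent for needless follow-up, which is why Brawley and others weight specificity so heavily for population screening.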
Some cancer experts have a wait-and-see reaction to GRAIL’s initial results, due in part to the small sample size included in the sub-study. Benjamin Davies, MD, Associate Professor of Urology at the University of Pittsburgh School of Medicine, and an expert on prostate cancer screening, told Forbes the early data was “compelling,” but the number of patients in the study was too small to generate excitement.
Oxnard, however, believes the initial results validate the promise of GRAIL’s blood screening test project.
“I was a skeptic two years ago,” Oxnard, a GRAIL consultant, told Forbes. “I think these data need to put a lot of the skepticism to rest. It can be done. This is proof you can find cancer in the blood, you can find advanced cancer, therefore this has legs. This has a real future. It’s going to be many steps down the line, but this deserves further investigation and should move forward.”
Next Steps
Researchers next plan to verify the initial results in an independent group of 1,000 CCGA participants as part of the same sub-study. They then will attempt to optimize the assays before validating them in a larger data set from CCGA, the Dana-Farber news release explained.
Illumina, a sequencing-technology developer, formed GRAIL in 2016 with participating investments from Bill Gates, Bezos Expeditions, and Sutter Hill Ventures. Since then, GRAIL has attracted other high-profile investors, including Amazon, Merck, Johnson & Johnson, and Bristol-Myers Squibb.
Forbes notes that as of 2018 GRAIL has raised $1.6 billion in venture capital and has a $3.2 billion valuation, according to private market data firm Pitchbook. Last year, GRAIL merged with Hong Kong-based Cirina Ltd., a privately held company also focused on the early detection of cancer.
While GRAIL’s projects hold promise, anatomic pathologists and clinical laboratories may be wise to temper their enthusiasm until more research is done.
“We all would like to dream that someday you’d be able to diagnose cancer with a blood test,” Eric Topol, MD, Executive Vice President and Professor of Molecular Medicine at Scripps Research, told Forbes. Topol says he’s “encouraged” by GRAIL’s methodical approach, but warns: “We’re at the earliest stage of that.”
—Andrea Downing Peck
Related Information:
Biotech Firm GRAIL Takes the First Steps in Its Quest for a Blood Test for Cancer
Blood Test Shows Potential for Early Detection of Lung Cancer
Detection via Blood-Based Screening
Illumina Launches GRAIL, Focused on Blood-Based Cancer Screening
GRAIL and Cirina Combine to Create Global Company Focused on Early Detection of Cancer
Oct 12, 2018 | Compliance, Legal, and Malpractice, Instruments & Equipment, Laboratory Instruments & Laboratory Equipment, Laboratory Management and Operations, Laboratory News, Laboratory Operations, Laboratory Pathology, Laboratory Testing, Management & Operations
Diagnostic medical laboratories may sequence DNA genetic tests correctly, but there are issues with how companies analyze the information
In 2017, some 12 million people paid to spit in a tube and have their genetic data analyzed, according to Technology Review. Many companies offer this type of DNA testing, and each of them works with one or more clinical laboratories to get the actual sequencing performed. For example, Ancestry.com, one of the largest direct-to-consumer genetic data testing companies, works with both Quest Diagnostics and Illumina.
In the case of Quest Diagnostics, the clinical laboratory company does the actual sequencing for Ancestry. But the analysis of the genetic data for an individual and its interpretation is performed by Ancestry’s team.
There are critics of the booming direct-to-consumer genetic testing business, but it’s not due to the quality of the sequencing. Rather, critics cite other issues, such as:
- Privacy concerns;
- How the physical samples are stored and used;
- Who owns the data; and,
- That this branch of genetics is an area of emerging study and not clearly understood.
What Does All That Genetic Data Mean?
The consumer DNA testing market was worth $359 million in 2017 and is projected to grow to $928 million by 2023, according to a report from Research and Markets. Those numbers represent a lot of spit, and an enormous amount of personal health information. Roughly one in every 25 adults in the US now has access to his or her genetic data. But what does all that data mean?
The answer depends, in large part, on whom you ask. Many reporters, scientists, and others have taken DNA tests from multiple companies and received entirely different results. In some cases, a single sample submitted to different companies for analysis has rendered dramatically different results.
“There is a wild-west aspect to all of this,” Erin Murphy, a New York University law professor and genetics specialist who focuses on privacy implications, told McClatchy. “It just takes one person in a family to reveal the genetic information of everyone in the family,” she notes. (Photo copyright: New York University.)
It’s All About the Database
Although some people purchase kits from multiple companies, the majority of people take just one test. Each person who buys genetic analysis from Ancestry, for example, consents to having his/her data become part of Ancestry’s enormous database, which is used to perform the analyses that people pay for. There are some interesting implications to how these databases are built.
First, they are primarily made up of paying customers, which means that the vast majority of genetic datasets in Ancestry’s database come from people who have enough disposable income to purchase the kit and analysis. It may not seem like an important detail, but it means the comparison population is not the same as the general population.
Second, because the analyses compare the sample DNA to DNA already in the database, it matters how many people from any given area have taken the test and are in the database. An article in Gizmodo describes one family’s experience with DNA testing and some of the pitfalls. The author quotes a representative from the company 23andMe as saying, “Different companies have different reference data sets and different algorithms, hence the variance in results. Middle Eastern reference populations [for example] are not as well represented as European, an industry-wide challenge.”
The same is true for any population where not many members have taken the test for a particular company. In an interview with NPR about trying to find information about her ancestry, journalist Alex Wagner described a similar problem, saying, “There are not a lot of Burmese people taking DNA tests … and so, the results that were returned were kind of nebulous.”
Wagner’s mother and grandmother both immigrated to the US from Burma in 1965, and when Wagner began investigating her ancestry, she, both of her parents, and her grandmother, all took tests from three different direct-to-consumer DNA testing companies. To Wagner’s surprise, her mother and grandmother both had results that showed they were Mongolian, but none of the results indicated Burmese heritage. In the interview she says that one of the biggest things she learned through doing all these tests was that “a lot of these DNA test companies [are] commercial enterprises. So, they basically purchase or acquire DNA samples on market-demand.”
As it turns out, there aren’t many Burmese people taking DNA tests, so there’s not much reason for the testing companies to pursue having a robust Burmese or even Southeast Asian database of DNA.
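The database effect described above can be illustrated with a toy model: assign a sample to whichever reference population has the nearest allele frequencies. All population names, sites, and frequencies below are invented, and real ancestry algorithms are far more sophisticated, but the dependence on what happens to be in the reference panel is the same.

```python
def closest_population(sample_freqs: dict, reference_panels: dict) -> str:
    """Pick the reference population whose allele frequencies are nearest the sample."""
    def distance(panel):
        # Squared-difference distance across the measured sites
        return sum((sample_freqs[k] - panel[k]) ** 2 for k in sample_freqs)
    return min(reference_panels, key=lambda name: distance(reference_panels[name]))

sample = {"siteA": 0.8, "siteB": 0.3}  # hypothetical customer sample

# Company 1 has a nearby reference group; Company 2 does not
panel_company1 = {"European": {"siteA": 0.2, "siteB": 0.6},
                  "Mongolian": {"siteA": 0.7, "siteB": 0.4}}
panel_company2 = {"European": {"siteA": 0.2, "siteB": 0.6}}

print(closest_population(sample, panel_company1))  # matches the nearby group
print(closest_population(sample, panel_company2))  # forced to a distant label
```

The second company is not "wrong" by its own lights; it simply has no closer population to offer, which is the situation Wagner describes for customers of Burmese descent.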
Who Owns Your Genetic Data?
As is often the case when it comes to technological advances, existing law hasn’t quite caught up with the market for ancestry DNA testing. There are some important unanswered questions, such as who owns the data that results from a DNA analysis?
An investigation conducted by the news organization McClatchy found that Ancestry does allow customers to request their DNA information be deleted from the company’s database, and that they can request their physical sample be destroyed as well. The author writes, “But it is a two-step process, and customers must read deep into the company’s privacy statement to learn how to do it. Requests for DNA data elimination can be made online, but the company asks customers to call its support center to request destruction of their biological sample.”
Another concern is hacking or theft. Ancestry and similar companies take steps to protect customers’ information, such as using barcodes rather than names and encryption when samples are sent to labs. Nevertheless, there was an incident in 2017 in which hackers infiltrated a website owned by Ancestry called RootsWeb. “The RootsWeb situation was certainly unfortunate,” Eric Heath, Ancestry’s Chief Privacy Officer, told McClatchy. He added that RootsWeb was a “completely separate system” from the Ancestry database that includes DNA information.
What We Don’t Know
The biggest pitfall for consumers may be how much geneticists still don’t know about DNA analysis. Adam Rutherford, PhD, a British geneticist interviewed for the Gizmodo story, said the real problem with companies like Ancestry is that people have a basic, fundamental misunderstanding of what can be learned from a DNA test.
“They’re not telling you where your DNA comes from in the past. They’re telling you where on Earth your DNA is from today,” Rutherford told Gizmodo.
Science evolves, of course, and genetic testing has much evolving to do. The author of the Gizmodo piece writes, “It’s not that the science is bad. It’s that it’s inherently imperfect.” There aren’t any best-practices for analyzing DNA data yet, and companies like Ancestry aren’t doing much to make sure their customers understand that fact.
Nevertheless, issues surrounding genetic testing, the resulting data, and its storage, interpretation, and protection, continue to impact clinical laboratories and anatomic pathology groups.
—Dava Stewart
Related Information:
2017 Was the Year Consumer DNA Testing Blew Up
Quest Diagnostics and Ancestry DNA Collaborate to Expand Consumer DNA Testing
Illumina, Secret Giant of DNA Sequencing, Is Bringing Its Tech to the Masses
Global $928 Million Consumer DNA (Genetic) Testing Market 2018-2023 with 23andMe, Ancestry, Color Genomics and Gene by Gene Dominating
How DNA Testing Botched My Family’s Heritage, and Probably Yours, Too
A Journalist Seeks Out Her Roots but Finds Few Answers in the Soil
Ancestry Wants Your Spit, Your DNA and Your Trust. Should You Give Them All Three?
Oct 3, 2018 | Laboratory Instruments & Laboratory Equipment, Laboratory Management and Operations, Laboratory News, Laboratory Operations, Laboratory Pathology, Laboratory Testing, Management & Operations
Next step is to design Web portal offering low-cost ‘polygenic risk score’ to people willing to upload genetic data received from DNA testing companies such as 23andMe
Pathologists and other medical professionals have long predicted that multi-gene diagnostic tests which examine thousands of specific gene sequences might one day hold the key to assessing disease risk, diagnosing diseases, and guiding precision medicine treatment decisions. Now, a research team from the Broad Institute, Massachusetts General Hospital (MGH), and Harvard Medical School has brought that prediction closer to reality.
Their study, published last month in Nature Genetics, found that a genome analysis called polygenic risk scoring can identify individuals with a high risk of developing one of five potentially deadly diseases:
- Coronary artery disease;
- Atrial fibrillation;
- Type 2 diabetes;
- Inflammatory bowel disease; and,
- Breast cancer.
Polygenic Scoring Predicts Risk of Disease Among General Population
To date, most genetic testing has been “single gene,” focusing on rare mutations in specific genes such as those causing sickle cell disease or cystic fibrosis. This latest research indicates that polygenic predictors could be used to discover heightened risk factors in a much larger portion of the general population, enabling early interventions to prevent disease before other warning signs appear: the ultimate goal of precision medicine.
“We’ve known for a long time that there are people out there at high risk for disease based just on their overall genetic variation,” senior author Sekar Kathiresan, MD, co-Director of the Medical and Population Genetics Program at the Broad Institute, and Director, Center for Genomic Medicine at Massachusetts General Hospital, said in a Broad Institute news release. “Now, we’re able to measure that risk using genomic data in a meaningful way. From a public health perspective, we need to identify these higher-risk segments of the population, so we can provide appropriate care.”
“What I foresee is in five years, each person will know this risk number—this ‘polygenic risk score’—similar to the way each person knows his or her cholesterol,” Sekar Kathiresan, MD (above), Co-Director of the Medical and Population Genetics Program at the Broad Institute, and Director, Center for Genomic Medicine at Massachusetts General Hospital, told the Associated Press (AP). He went on to say a high-risk score could lead to people taking other steps to lower their overall risk for specific diseases, while a low-risk score “doesn’t give you a free pass” since an unhealthy lifestyle can lead to disease as well. (Photo copyright: Massachusetts General Hospital.)
The researchers conducted the study using data from more than 400,000 individuals in the United Kingdom Biobank. They created a risk score for coronary artery disease by looking for 6.6 million single-letter genetic changes that are more prevalent in people who have had early heart attacks. Of the individuals in the UK Biobank dataset, 8% were more than three times as likely to develop the disease compared to everyone else, based on their genetic variation.
In absolute terms, only 0.8% of individuals with the very lowest polygenic risk scores had coronary artery disease, compared to 11% for people with the highest scores, the Broad Institute news release stated.
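Conceptually, a polygenic risk score is a weighted sum across many variant sites: each variant's weight is the effect size estimated from association data, multiplied by how many copies of the risk allele a person carries. A minimal sketch follows; the variant IDs and effect sizes are invented, and real scores like the study's span millions of variants with weights fit to biobank-scale data.

```python
# Hypothetical per-variant effect sizes (real scores use millions of variants)
EFFECT_SIZES = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}

def polygenic_score(genotype: dict) -> float:
    """genotype maps variant ID -> risk-allele count (0, 1, or 2)."""
    return sum(EFFECT_SIZES[variant] * count
               for variant, count in genotype.items())

# One hypothetical individual's genotype at the three sites
score = polygenic_score({"rs0001": 2, "rs0002": 1, "rs0003": 0})
print(round(score, 2))  # 0.19
```

Individuals are then ranked by score, which is how the study can speak of the top 8% of the distribution carrying more than triple the average risk.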
“The results should be eye-opening for cardiologists,” Charles C. Hong, MD, PhD, Director of Cardiovascular Research at the University of Maryland School of Medicine, told the AP. “The only disappointment is that this score applies only to those with European ancestry, so I wonder if similar scores are in the works for the large majority of the world population that is not white.”
In its news release, the Broad Institute noted the need for additional studies to “optimize the algorithms for other ethnic groups.”
The Broad Institute’s results suggest, however, that as many as 25 million people in the United States may be at more than triple the normal risk for coronary artery disease. And millions more may be at similar elevated risk for the other conditions, based on genetic variations alone.
Reanalyzing Data from DNA Testing Companies
The researchers are building a website that would enable users to receive a low-cost polygenic risk score for many common diseases by reanalyzing data users previously received from DNA testing companies such as 23andMe.
Kathiresan told Forbes his goal is for the 17 million people who have used genotyping services to submit their data to the web portal he is building. He told the magazine he’s hoping “people will be able to get their polygenic scores for about as much as the cost of a cholesterol test.”
Some Experts Not Impressed with Broad Institute Study
But not all experts believe the Broad Institute/MGH/Harvard Medical School study deserves so much attention. Ali Torkamani, PhD, Director of Genomics and Genome Informatics at the Scripps Research Translational Institute, offered a tepid assessment of the Nature Genetics study.
In an article in GEN that noted polygenic risk scores were receiving “the type of attention reserved for groundbreaking science,” Torkamani said the recent news is “not particularly” a big leap forward in the field of polygenic risk prediction. He described the results as “not a methodological advance or even an unexpected result,” noting his own group had generated similar data for type 2 diabetes in their analysis of the UK dataset.
Nevertheless, Kathiresan is hopeful the study will advance disease treatment and prevention. “Ultimately, this is a new type of genetic risk factor,” he said in the news release. “We envision polygenic risk scores as a way to identify people at high or low risk for a disease, perhaps as early as birth, and then use that information to target interventions—either lifestyle modifications or treatments—to prevent disease.”
This latest research indicates healthcare providers could soon be incorporating polygenic risk scoring into routine clinical care. Not only would doing so mean another step forward in the advancement of precision medicine, but clinical laboratories and pathology groups also would gain new tools to help diagnose disease and guide treatment decisions.
—Andrea Downing Peck
Related Information:
Genome-wide Polygenic Scores for Common Diseases Identify Individuals with Risk Equivalent to Monogenic Mutations
Predicting Risk for Common Deadly Diseases from Millions of Genetic Variants
Multigene Test May Find Risk for Heart Disease and More
A Harvard Scientist Thinks He Has a Gene Test for Heart Attack Risk. He Wants to Give It Away Free
Why Do Polygenic Risk Scores Get So Much Hype?