Jul 6, 2018 | Instruments & Equipment, Laboratory Instruments & Laboratory Equipment, Laboratory News, Laboratory Operations, Laboratory Pathology, Laboratory Testing
Analysis performed by this new biosensor could help identify inflammatory bowel diseases, cancer, and other chronic diseases, and contribute to influencing the best treatment options, a critical aspect of personalized medicine
Anatomic pathologists and clinical laboratories have long known that disease, as the saying goes, “is written in the blood.” How to spot the disease has been the challenge.
Now, researchers at Finland’s Aalto University have developed a cutting-edge plasmonic biosensor that uses the intense light absorption and reflective properties of plasmonic materials to discern refractive changes between healthy and diseased exosomes—even with the naked eye!
This opens the door to a plethora of non-invasive health tests similar to home pregnancy tests. Should such tests prove accurate and affordable, medical laboratories could have new tools in their fight to end chronic disease.
New Rules for Differentiating Healthy and Diseased Human Exosomes
The Aalto researchers produced the biosensor by depositing plasmonic metaparticles (in this work, silver nanoparticles coated with dielectric layers) on a black metal surface capable of absorbing electromagnetic radiation. With it, abnormalities can be distinguished by the color the particles reflect against the black background.
“We exploited it as the basis of new design rules to differentiate diseased human serum exosomes from healthy ones in a simple manner with no need [for] any specialized equipment”, Dr. Abdou Elsharawy, PhD, Postdoctoral Researcher at Kiel University in Kiel, Germany, stated in an Aalto University news release.
Researchers at Aalto University in Finland have developed a method for “visualizing the specular reflection color by a blackbody substrate. The carriers containing Ag nanoparticles [shown above] are covered with various dielectrics of AlN [aluminum nitride], SiO2 [silicon dioxide], and the composites thereof that are placed on a black background to enhance the reflectivity contrast of various colors at a normal angle of incidence.” This has resulted in a tool that medical laboratories could use to differentiate between healthy and diseased exosomes in human blood. (Photo and caption copyrights: Aalto University.)
Dr. Mady Elbahri, PhD, Professor, Nanochemistry and Nanoengineering, Department of Chemistry and Materials Science at Aalto University, indicated that there is no need to use sophisticated fabrication and patterning methods with the biosensor as bulk biodetection of samples can be seen with the naked eye.
“It is extraordinary that we can detect diseased exosomes by the naked eye. The conventional plasmonic biosensors are able to detect analytes solely at a molecular level. So far, the naked-eye detection of biosamples has been either rarely considered or unsuccessful,” Elbahri noted in the news release.
Exosomes Critical to Many Human Bodily Processes
Exosomes are cell-derived vesicles present in many, and perhaps all, eukaryotic fluids, including blood, urine, and the culture medium of cell cultures. These small bundles of material are released from the cell's outer membrane and contain everything from proteins to ribonucleic acid (RNA), including messenger RNA (mRNA). They are important indicators of health conditions.
There is mounting evidence that exosomes have exclusive functions and perform a significant role in bodily processes like coagulation, intercellular signaling, and waste management.
Interest in the clinical applications of exosomes is increasing, along with their potential for use in prognosis, development of therapies, and as biomarkers for diseases. But exosomes are rare, and distinguishing them from the many other elements in bodily fluids has proven difficult.
Thus, the Aalto study has strong implications for clinical laboratories and anatomic pathology groups. More research and regulatory approval will be needed before use of this new tool comes to fruition. However, any method that accurately and inexpensively identifies chronic disease biomarkers will impact the medical laboratory and anatomic pathology professions and is worth watching.
—JP Schlingman
Related Information:
Plasmonic Biosensors Enable Development of New Easy-to-use Health Tests
Plasmonic Biosensor to Detect Exosomes with Naked Eye
Plasmonic Metaparticles on a Blackbody Create Vivid Reflective Colors for Naked‐Eye Environmental and Clinical Biodetection
Plasmonic Biosensors
Jun 27, 2018 | Digital Pathology, Instruments & Equipment, Laboratory Instruments & Laboratory Equipment, Laboratory Management and Operations, Laboratory News, Laboratory Operations, Laboratory Pathology, Laboratory Testing, Management & Operations
Using GPIIb/IIIa inhibition and ion chelation, researchers have developed a “universal” method for preserving blood up to 72 hours while keeping it viable for advanced rare-cell applications
Through microfluidics and automation, clinical laboratories and anatomic pathologists have been able to detect ever-smaller quantities of biomarkers and other indicators of chronic disease.
However, preserving sample quality is an essential part of analytical accuracy. This is particularly true in precision oncology and other specialties where isolating rare cells (aka, low abundance cells), such as circulating tumor cells (CTCs), is a key component to obtaining information and running diagnostics.
Publishing their findings in Nature, researchers at Massachusetts General Hospital Center for Engineering in Medicine (MGH-CEM) have developed a whole blood stabilization method that is ideal for rare-cell applications and preserves sample integrity for up to 72 hours.
Should further testing validate their findings and methodology, this advance could allow greater use of central laboratories and other remote testing facilities that previously were impractical choices due to distance and sample transit time.
Keeping Blood Alive Is Not Easy
“At Mass. General, we have the luxury of being so integrated with the clinical team that we can process blood specimens in the lab typically within an hour or two after they are drawn,” stated lead author Keith Wong, PhD, former Research Fellow, MGH-CEM, and now Senior Scientist at Rubius Therapeutics, Boston, in a Mass General press release. “But to make these liquid biopsy technologies routine lab tests for the rest of the world, we need ways to keep blood alive for much longer than several hours, since these assays are best performed in central laboratories for reasons of cost-effectiveness and reproducibility.”
Study authors Wong and co-lead author Shannon Tessier, PhD, Investigator at MGH-CEM, noted that current FDA-approved blood stabilization methods for CTC assays use chemical fixation—a process that can result in degradation of sensitive biomolecules and kill the cells within the sample.
Without stabilization, however, breakdown of red cells, activation of leukocytes (white blood cells), and clot formation can render the results of analyzing a sample useless, or create issues with increasingly sensitive equipment used to run assays and diagnostics.
“We wanted to slow down the biological clock as much as possible by using hypothermia, but that is not as simple as it sounds,” says Tessier. “Low temperature is a powerful means to decrease metabolism, but a host of unwanted side effects occur at the same time.”
Researchers started by using hypothermic treatments to slow degradation and cell death. However, this created another obstacle—aggressive platelet coagulation. By introducing glycoprotein IIb/IIIa inhibitors, they found they could minimize this aggregation.
Keith Wong, PhD (left), a former Research Fellow, MGH-CEM, and now Senior Scientist at Rubius Therapeutics in Boston; and Shannon Tessier, PhD (right), Investigator at MGH-CEM, co-authored a study to develop a whole blood stabilization method that preserves sample integrity for up to 72 hours, making it possible to transport blood specimens further distances to central clinical laboratories for processing. (Photo copyrights: LinkedIn.)
Prior to microfluidic processing of their test samples, researchers applied a brief calcium chelation treatment. The result was efficient sorting of rare CTCs from blood drawn up to 72 hours prior, while keeping RNA intact and retaining cell viability.
“The critical achievement here,” says Tessier, “is that the isolated tumor cells contain high-quality RNA that is suitable for demanding molecular assays, such as single-cell qPCR, droplet digital PCR, and RNA sequencing.”
Their testing involved 10 patients with metastatic prostate cancer. Sample integrity was verified by comparing CTC analysis results between fresh samples and preserved samples from the same patients using MGH-CEM’s own microfluidic CTC-iChip device.
Results showed a 92% agreement across 12 cancer-specific gene transcripts. For AR-V7, their preservation method achieved 100% agreement. “This is very exciting for clinicians,” declared David Miyamoto, MD, PhD, of Massachusetts General Hospital Cancer Center in the press release. “AR-V7 mRNA can only be detected using CTCs and not with circulating tumor DNA or other cell-free assays.”
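Measuring that agreement amounts to a simple paired comparison: for each gene transcript, did the preserved sample give the same call as the fresh sample from the same patient? The short Python sketch below illustrates that calculation; the transcript names and detection calls are hypothetical placeholders, not the study's actual panel or data.

```python
# Minimal sketch: per-transcript concordance between fresh and preserved samples.
# Transcript names and detection calls below are illustrative placeholders,
# not the study's actual gene panel or results.

def concordance(fresh_calls: dict, preserved_calls: dict) -> float:
    """Fraction of shared transcripts with the same detection call in both samples."""
    shared = fresh_calls.keys() & preserved_calls.keys()
    if not shared:
        return 0.0
    matches = sum(fresh_calls[t] == preserved_calls[t] for t in shared)
    return matches / len(shared)

# Hypothetical paired calls (True = transcript detected in the CTC fraction)
fresh = {"AR-V7": True, "KLK3": True, "FOLH1": False}
preserved_72h = {"AR-V7": True, "KLK3": True, "FOLH1": True}

print(f"Agreement: {concordance(fresh, preserved_72h):.0%}")  # e.g., 67% in this toy example
```

In the study itself, this kind of per-transcript comparison was run across 12 cancer-specific transcripts and all 10 patients, which is where the 92% overall and 100% AR-V7 agreement figures come from.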
Methodology Concerns and Future Confirmations
“Moving forward, an extremely exciting area in precision oncology is the establishment of patient-specific CTC cultures and xenograft models for drug susceptibility,” the study authors noted. “The lack of robust methods to preserve viable CTCs is a major roadblock towards this Holy Grail in liquid biopsy. In our preliminary experiments, we found that spiked tumor cells in blood remain highly viable (>80%) after 72 hours of hypothermic preservation.”
Despite this, they also acknowledge limitations on their current findings. The first is the need for larger-scale validation, as their testing involved a 10-patient sample group.
Second, they note that further studies will be needed to “more completely characterize whole-transcriptome alterations as a result of preservation, and to what extent they can be stabilized through other means, such as further cooling (e.g., non-freezing sub-zero temperatures) or metabolic depression.”
Researchers also note that their approach has multiple advantages for regulatory approval and further testing: GPIIb/IIIa inhibitors are low-cost and already approved for clinical use; implementation requires no modification of existing isolation assays; and cold-chain protocols are already in place, allowing easy adaptation to the needs of pathology groups, medical laboratories, and other diagnostics providers handling samples.
While still in its early stages, the methods introduced by the researchers at MGH-CEM show potential to allow both the facilities collecting samples and the clinical laboratories processing them greater flexibility and increased accuracy, as high-sensitivity assays and diagnostics continue to power the push toward personalized medicine and expand laboratory menus across the industry.
—Jon Stone
Related Information:
Whole Blood Stabilization for the Microfluidic Isolation and Molecular Characterization of Circulating Tumor Cells
Improved Blood Stabilization Should Expand Use of Circulating Tumor Cell Profiling
Genentech Scientists Zero In on “Liquid Biopsies” as a Way to Replace Tissue Biopsies in Breast Cancer
University of Michigan Researchers Use “Labyrinth” Chip Design in Clinical Trial to Capture Circulating Tumor Cells of Different Cancer Types
Super-Fast Microscope Captures Circulating Tumor Cells with High Sensitivity and Resolution in Real Time
Jun 13, 2018 | Instruments & Equipment, Laboratory Instruments & Laboratory Equipment, Laboratory Management and Operations, Laboratory News, Laboratory Operations, Laboratory Pathology, Laboratory Testing
Access to vast banks of genomic data is powering a new wave of assessments and predictions that could offer a glimpse at how genetic variation might impact everything from Alzheimer’s Disease risk to IQ scores
Anatomic pathology groups and clinical laboratories have become accustomed to performing genetic tests for diagnosing specific chronic diseases in humans. Thanks to costs significantly lower than just a few years ago, whole-genome sequencing and genetic DNA testing are on the path to becoming almost commonplace in America. BRCA1 and BRCA2 breast cancer gene screenings are examples of specific genetic tests for specific diseases.
However, a much broader type of testing—called polygenic scoring—has been used to identify certain hereditary traits in animals and plants for years. Also known as a genetic-risk score or a genome-wide score, polygenic scoring is based on thousands of genes, rather than just one.
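At its core, a polygenic score is a weighted sum: each variant's risk-allele count (0, 1, or 2 copies) is multiplied by an effect size estimated from genome-wide association studies, and the products are added up. The Python sketch below illustrates that arithmetic with made-up variant IDs, weights, and genotypes; real scores aggregate thousands to millions of variants.

```python
# Minimal sketch of a polygenic score: sum of risk-allele dosages (0, 1, or 2)
# weighted by per-variant effect sizes. Variant IDs, weights, and genotypes are
# made up for illustration; real scores use GWAS-derived weights.

effect_sizes = {          # per-allele weights (hypothetical)
    "rs0000001": 0.12,
    "rs0000002": -0.05,
    "rs0000003": 0.30,
}

genotype = {              # risk-allele dosage for one person (hypothetical)
    "rs0000001": 2,
    "rs0000002": 1,
    "rs0000003": 0,
}

score = sum(effect_sizes[v] * genotype.get(v, 0) for v in effect_sizes)
print(f"Polygenic score: {score:.2f}")

# The "report card" percentile comes from ranking this score against scores
# computed the same way for a reference population (values hypothetical).
reference_scores = [0.05, 0.10, 0.15, 0.19, 0.22, 0.25, 0.31, 0.40]
percentile = 100 * sum(s <= score for s in reference_scores) / len(reference_scores)
print(f"Percentile vs. reference population: {percentile:.0f}th")
```

Ranking an individual's score against a reference population is what produces the percentile-style risk estimates described in the rest of this article.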
Now, researchers in Cambridge, Mass., are looking into whether it can be used in humans to predict a person’s predisposition to a range of chronic diseases. This is yet another example of how relatively inexpensive genetic tests are producing data that can be used to identify and predict how individuals get different diseases.
Assessing Heart Disease Risk through Genome-Wide Analysis
Sekar Kathiresan, MD, Co-Director of the Medical and Population Genetics program at the Broad Institute of MIT and Harvard and Director of the Center for Genomics Medicine at Massachusetts General Hospital (Mass General); and Amit Khera, MD, Cardiology Fellow at Mass General, told MIT Technology Review that “the new scores can now identify as much risk for disease as the rare genetic flaws that have preoccupied physicians until now.”
“Where I see this going is that, at a young age, you’ll basically get a report card,” Khera noted. “And it will say for these 10 diseases, here’s your score. You are in the 90th percentile for heart disease, 50th for breast cancer, and the lowest 10% for diabetes.”
However, as the MIT Technology Review article points out, predictive genetic testing, such as that under development by Khera and Kathiresan, can be performed at any age.
“If you line up a bunch of 18-year-olds, none of them have high cholesterol, none of them have diabetes. It’s a zero in all the columns, and you can’t stratify them by who is most at risk,” Khera noted. “But with a $100 test we can get stratification [at the age of 18] at least as good as when someone is 50, and for a lot of diseases.”
Sekar Kathiresan, MD (left), Co-Director of the Medical and Population Genetics program at the Broad Institute of MIT and Harvard and Director of the Center for Genomics Medicine at Massachusetts General Hospital; and Amit Khera, MD (right), Cardiology Fellow at Mass General, are researching ways polygenic scores can be used to predict the chance a patient will be prone to develop specific chronic diseases. Anatomic pathology biomarkers and new clinical laboratory-performed genetic tests will likely follow if their research is successful. (Photo copyrights: Twitter.)
Polygenic Scores Show Promise for Cancer Risk Assessment
Khera and Kathiresan are not alone in exploring the potential of polygenic scores. Researchers at the University of Michigan’s School of Public Health examined how well polygenic scores predicted diagnoses, including squamous cell carcinoma, among more than 28,000 genotyped patients.
“Looking at the data, it was surprising to me how logical the secondary diagnosis associations with the risk score were,” Bhramar Mukherjee, PhD, John D. Kalbfleisch Collegiate Professor of Biostatistics, and Professor of Epidemiology at U-M’s School of Public Health, stated in a press release following the publication of the U-M study, “Association of Polygenic Risk Scores for Multiple Cancers in a Phenome-wide Study: Results from The Michigan Genomics Initiative.”
“It was also striking how results from population-based studies were reproduced using data from electronic health records, a database not ideally designed for specific research questions and [which] is certainly not a population-based sample,” she continued.
Additionally, researchers at the University of California San Diego School of Medicine (UCSD) recently published findings in Molecular Psychiatry on their use of polygenic scores to assess the risk of mild cognitive impairment and Alzheimer’s disease.
The UCSD study highlights one of the unique benefits of polygenic scores. A person’s DNA is established in utero. However, predicting predisposition to specific chronic diseases prior to the onset of symptoms has been a major challenge to developing diagnostics and treatments. Should polygenic risk scores prove accurate, they could provide physicians with a list of their patients’ health risks well in advance, providing greater opportunity for early intervention.
Future Applications of Polygenic Risk Scores
In the January issue of the British Medical Journal (BMJ), researchers from UCSD outlined their development of a polygenic assessment tool to predict the age-of-onset of aggressive prostate cancer. As Dark Daily recently reported, for the first time in the UK, prostate cancer has surpassed breast cancer in numbers of deaths annually and nearly 40% of prostate cancer diagnoses occur in stages three and four. (See, “UK Study Finds Late Diagnosis of Prostate Cancer a Worrisome Trend for UK’s National Health Service,” May 23, 2018.)
An alternative to PSA-based testing, and the ability to differentiate aggressive and non-aggressive prostate cancer types, could improve outcomes and provide healthcare systems with better treatment options to reverse these trends.
While the value of polygenic scores should increase as algorithms and results are honed and verified, they also will most likely add to concerns raised about the impact genetic test results are having on patients, physicians, and genetic counselors.
And, as the genetic testing technology of personalized medicine matures, clinical laboratories will increasingly be required to protect and distribute much of the protected health information (PHI) they generate.
Nevertheless, when the data produced is analyzed and combined with other information—such as anatomic pathology testing results, personal/family health histories, and population health data—polygenic scores could isolate new biomarkers for research and offer big-picture insights into the causes of and potential treatments for a broad spectrum of chronic diseases.
—Jon Stone
Related Information:
Forecasts of Genetic Fate Just Got a Lot More Accurate
Polygenic Scores to Classify Cancer Risk
Association of Polygenic Risk Scores for Multiple Cancers in a Phenome-Wide Study: Results from the Michigan Genomics Initiative
Polygenic Risk Score May Identify Alzheimer’s Risk in Younger Populations
Use of an Alzheimer’s Disease Polygenic Risk Score to Identify Mild Cognitive Impairment in Adults in Their 50s
New Polygenic Hazard Score Predicts When Men Develop Prostate Cancer
Polygenic Hazard Score to Guide Screening for Aggressive Prostate Cancer: Development and Validation in Large Scale Cohorts
UK Study Finds Late Diagnosis of Prostate Cancer a Worrisome Trend for UK’s National Health Service
Jun 8, 2018 | Instruments & Equipment, Laboratory Instruments & Laboratory Equipment, Laboratory Management and Operations, Laboratory Pathology, Laboratory Testing
As standard masks are used, they collect exhaled airborne pathogens that remain infectious in the masks’ fibers, making the masks themselves a potential source of infection when handled
Surgical-style facial masks harbor a secret—viruses that could be infectious to the people wearing them. However, masks can become effective virus killers as well. At least that’s what researchers at the University of Alberta (UAlberta) in Edmonton, Canada, have concluded.
If true, such a re-engineered mask could protect clinical laboratory workers from exposure to infectious diseases, such as SARS (Severe Acute Respiratory Syndrome), MERS (Middle East Respiratory Syndrome), and Swine Influenza.
“Surgical masks were originally designed to protect the wearer from infectious droplets in clinical settings, but it doesn’t help much to prevent the spread of respiratory diseases such as SARS or MERS or influenza,” Hyo-Jick Choi, PhD, Assistant Professor in UAlberta’s Department of Chemical and Materials Engineering, noted in a press release.
So, Choi developed a mask that effectively traps and kills airborne viruses.
Clinical Laboratory Technicians at Risk from Deadly Infectious Diseases
The global outbreak of SARS in 2003 is a jarring reminder of how infectious diseases impact clinical laboratories, healthcare workers, and patients. To prevent spreading the disease, Canadian-based physicians visited with patients in hotel rooms to keep the virus from reaching their medical offices, medical laboratory couriers were turned away from many doctors’ offices, and hospitals in Toronto ceased elective surgery and non-urgent services, reported The Dark Report—Dark Daily’s sister publication. (See The Dark Report, “SARS Challenges Met with New Technology,” April 14, 2003.)
UAlberta materials engineering professor Hyo-Jick Choi, PhD, (right) and graduate student Ilaria Rubino (left) examine filters treated with a salt solution that kills viruses. Choi and his research team have devised a way to improve the filters in surgical masks, so they can trap and kill airborne pathogens. Clinical laboratory workers will especially benefit from this protection. (Photo and caption copyright: University of Alberta.)
How Current Masks Spread Disease
How do current masks spread infectious disease? According to UAlberta researchers:
- A cough or a sneeze transmits airborne pathogens such as influenza in aerosolized droplets;
- Virus-laden droplets can be trapped by the mask;
- The virus remains infectious and trapped in the mask; and,
- Risk of spreading the infection persists as the mask is worn and handled.
“Aerosolized pathogens are a leading cause of respiratory infection and transmission. Currently used protective measures pose potential risk of primary and secondary infection and transmission,” the researchers noted in their paper, published in Scientific Reports.
That’s because today’s loose-fitting masks were designed primarily to protect healthcare workers against large respiratory particles and droplets. They were not designed to protect against infectious aerosolized particles, according to the Centers for Disease Control and Prevention (CDC).
In fact, the CDC informed the public that masks they wore during 2009’s H1N1 influenza virus outbreak provided no assurance of infection protection.
“Face masks help stop droplets from being spread by the person wearing them. They also keep splashes or sprays from reaching the mouth and nose of the person wearing the face mask. They are not designed to protect against breathing in very small particle aerosols that may contain viruses,” a CDC statement noted.
Pass the Salt: A New Mask to Kill Viruses
Choi and his team took on the challenge of transforming the filters found on many common protective masks. They applied a coating of salt that, upon exposure to virus aerosols, recrystallizes and destroys pathogens, Engineering360 reported.
“Here we report the development of a universal, reusable virus deactivation system by functionalization of the main fibrous filtration unit of surgical mask with sodium chloride salt,” the researchers penned in Scientific Reports.
The researchers exposed their altered mask to the influenza virus. It demonstrated higher filtration efficiency than conventional masks, explained Contagion Live. In addition, viruses that came into contact with the salt-coated fibers lost infectivity more rapidly than viruses on untreated masks.
How Does it Work?
Here’s how the masks work, according to the researchers:
- Aerosol droplets carrying the influenza virus contact the treated filter;
- The droplet absorbs salt on the filter;
- The virus is exposed to increasing concentration of salt; and,
- The virus is damaged when salt crystallizes.
“Salt-coated filters proved highly effective in deactivating influenza viruses regardless of [influenza] subtypes,” the researchers wrote in Scientific Reports. “We believe that [a] salt-recrystallization-based virus deactivation system can contribute to global health by providing a more reliable means of preventing transmission and infection of pandemic or epidemic diseases and bioterrorism.”
Other Reports on Dangerous Exposure for Clinical Laboratory Workers
This is not the first time Dark Daily has reported on dangers to clinical laboratory technicians and ways to keep them safe.
In “Health of Pathology Laboratory Technicians at Risk from Common Solvents like Xylene and Toluene,” we reported on a 2011 study that determined medical laboratory technicians who handle common solvents were at greater risk of developing auto-immune connective tissue diseases.
And more recently, in “Europe Implements New Anatomic Pathology Guidelines to Reduce Nurse Exposure to Formaldehyde and Other Toxic Histology Chemicals,” we shared information on new approaches to protect nurses from contact with toxic chemicals, such as formalin, toluene, and xylene.
The UAlberta team may have come up with an inexpensive, simple, and effective way to protect healthcare workers and clinical laboratory technicians. Phlebotomists, laboratory couriers, and medical technologists also could wear the masks as protection from accidental infection and contact with specimens. It will be interesting to follow the progress of this special mask with its salty filter.
—Donna Marie Pocius
Related Information:
Researcher Turns “SARS Mask” into a Virus Killer
Universal Reusable Virus Deactivation System for Respiratory Protection
Understanding Respiratory Protection Options in Healthcare
H1N1 Flu and Masks
Arming Surgical Masks to Kill Viruses
New Surgical Mask Designed to Kill Viruses
SARS Challenges Met with New Technology
Toronto Hospital Labs Cope with SARS Impact
Europe Implements New Anatomic Pathology Guidelines to Reduce Nurse Exposure to Formaldehyde and Other Toxic Histology Chemicals
Health of Laboratory Technicians at Risk from Common Solvents Like Xylene and Toluene
Jun 6, 2018 | Digital Pathology, Instruments & Equipment, Laboratory Management and Operations, Laboratory News, Laboratory Operations, Laboratory Pathology, Laboratory Testing
In what could be a major boon to clinical laboratories and healthcare providers, researchers found that fears of rampant testing and ballooning spending due to results of whole-genome sequencing may be less of a concern than opponents claim
Clinical laboratory testing and personalized medicine (AKA, precision medicine) continue to reshape how the healthcare industry approaches treating disease. And, whole-genome sequencing (WGS) has shown promise in helping in vitro diagnostic (IVD) companies develop specific treatments for specific patients’ needs based on their existing conditions and physiology.
At first blush, this would seem to be a good thing. However, there has been controversy over cost and unintended consequences after patients who received their test results experienced negative encounters with physicians and genetic counselors. The impact on their lives and on their caregivers has not always been positive. (See Dark Daily, “Consumers Buying Genealogy Gene Sequencing Tests in Record Numbers; Some Experts Concerned Data Could Be Misinterpreted,” May 14, 2018.)
Nevertheless, WGS development and the ensuing controversy continues. This has motivated researchers at Brigham and Women’s Hospital (BWH) in Boston to engage in a study that compares the upfront costs of WGS to the downstream costs of healthcare, in an attempt to determine if and how whole-genome sequencing does actually impact the cost of care.
Are Doctors Acting Responsibly?
The MedSeq Project study, published in Genetics in Medicine, a journal of the American College of Medical Genetics and Genomics, involved 200 people—100 of them healthy, the other 100 diagnosed with cardiomyopathy. Roughly half of each group underwent whole-genome sequencing, while the other half used family history to guide treatments and procedures. The project then collected data on downstream care costs for the next six months for each group to compare how whole-genome sequencing might impact the final totals.
“Whole genome sequencing is coming of age, but there’s fear that with these advancements will come rocketing healthcare costs,” lead author Kurt Christensen, PhD, Instructor of Medicine in the Division of Genetics at BWH, stated in a press release.
“Our pilot study is the first to provide insights into the cost of integrating whole-genome sequencing into the everyday practice of medicine,” noted Kurt Christensen, PhD, lead author of the Brigham and Women’s Hospital study. “Our data [provides] reassurance that physicians seem to be responding responsibly and that we’re not seeing evidence of dramatically increased downstream spending.” (Photo copyright: ResearchGate.)
Clinical Laboratory Testing Largest Difference in Cost/Services Rendered
Within the healthy volunteer group, patients who based treatment decisions solely on their family medical history averaged $2,989 in medical costs over the next six months. Those who received WGS incurred $3,670 in costs.
Services also remained relatively consistent between both groups: the WGS group averaged 5.5 outpatient lab tests and 8.4 doctor visits over the period, while the family history group averaged 4.4 outpatient lab tests and 6.9 doctor visits.
Within the cardiology patient group, however, the dynamic flipped. WGS recipients averaged $8,109 in spending, while the family history group averaged $9,670. The study authors attribute this difference to treatments some patients received while hospitalized for concerns unrelated to the study.
When removing hospitalizations from the data set, the WGS group averaged $5,392, while the family history group averaged $4,962—a result similar to that of the healthy group.
Utilization of services was also similar. The WGS group averaged 7.8 doctor visits, while the family history group averaged 7.2 visits. However, the spread in outpatient lab testing was wider than for any other group in the study: WGS patients averaged 9.5 tests compared with 6.5 for the family history group.
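The comparison above reduces to differences in group averages. The Python sketch below simply re-tabulates the cost figures quoted in this article and prints the WGS-minus-family-history difference for each cohort; no additional data are assumed.

```python
# Re-tabulating the six-month average costs quoted above and computing the
# WGS-minus-family-history difference for each cohort. Figures are as reported
# in the article; this is a presentation aid, not new data.

groups = {
    "Healthy cohort": {"WGS": 3670, "Family history": 2989},
    "Cardiology cohort": {"WGS": 8109, "Family history": 9670},
    "Cardiology, excl. hospitalizations": {"WGS": 5392, "Family history": 4962},
}

for name, costs in groups.items():
    diff = costs["WGS"] - costs["Family history"]
    sign = "+" if diff >= 0 else "-"
    print(f"{name}: WGS ${costs['WGS']:,} vs. family history ${costs['Family history']:,} "
          f"(difference: {sign}${abs(diff):,})")
```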
Unanswered Questions
In their report, the study’s authors acknowledged a range of questions still unanswered by their initial research.
First, the project took place at a facility in which physicians were educated in genetics, had contacts familiar with genetics, and had the support of a genome resource center. The level of experience with genetics may also have prevented additional spending by tempering responses to results.
Although the whole-genome sequencing that took place during the project uncovered genetic variants known to or likely to cause disease within the healthy population, this did not trigger the wave of testing or panic many opponents of genetic sequencing predicted.
Authors also acknowledge that a longer, larger study would offer more conclusive results, and researchers are planning a five-year study to verify their initial findings. However, study co-author Robert Green, MD, Director of the Genomes2People Research Program at BWH, told STAT, “… downstream medical costs of sequencing may be far more modest than the common narrative suggests.”
Further Research Needed
The BWH researchers acknowledged that monetary cost is only one facet of the impact of genetic sequencing results. “Patient time costs were not assessed,” the study authors pointed out. “Nor were the effects of disclosure on participants’ family members, precluding a complete analysis from a societal perspective.”
Lastly, they noted that while the sample size sufficed to verify their results, diversity was lacking. In particular, they mentioned that the participant pool was “more educated and less ethnically diverse than the general population.”
The cost of genetic sequencing and similar technologies continues to drop as automation and innovation make the process more accessible to clinicians and healthcare providers. This could further shape longer studies of the overall cost of sequencing and other genetics-based tools.
For medical laboratories, these results offer evidence to both payers and physicians of the value of services relative to the overall cost of care, a critical concern as margins continue to shrink and regulations focus on efficiency across a broad spectrum of healthcare-related service industries.
—Jon Stone
Related Information:
Genetic Sequencing: Low Rate of Downstream Costs Demonstrate It’s Worth the Investment
Getting Your Genome Sequenced Might Not Make You Spend More on Health Care
Sequencing Patients’ Genomes Might Not Break the Health Care Bank, Study Finds
Studies Show How Clinical Whole-Exome Sequencing May Forever Change the Future Practice of Medicine while Giving Pathologists a New Opportunity to Deliver Value
Consumers Buying Genealogy Gene Sequencing Tests in Record Numbers; Some Experts Concerned Data Could Be Misinterpreted
Jun 4, 2018 | Digital Pathology, Instruments & Equipment, Laboratory Instruments & Laboratory Equipment, Laboratory Management and Operations, Laboratory News, Laboratory Pathology, Management & Operations
New scientific insights from these studies represent progress in the effort to develop a clinical laboratory test that would enable physicians to diagnose Alzheimer’s Disease earlier and with greater accuracy
Most medical laboratory professionals are aware that, for more than 30 years, in vitro diagnostic (IVD) developers and pharmaceutical researchers have sought the Holy Grail of clinical laboratory testing: an accurate test for Alzheimer’s disease that is minimally invasive and produces information that is actionable by clinicians at a reasonable cost. Such a test could spark a revolution in the diagnosis and treatment of this debilitating disease and would improve the lives of tens of thousands of people each year.
Now, two different research studies being conducted in Germany and Japan may have developed such tests that use blood samples. The tests detect specific biomarkers found in Alzheimer’s patients and one day could enable physicians to diagnose the disease in its preclinical stages.
German Test Identifies Amyloid-Beta Biomarker
The test under development at Ruhr University in Bochum, Germany, detects the presence of amyloid-beta, a component of amyloid plaque (AKA, amyloid-β plaques), which has consistently been found in Alzheimer’s patients, according to United Press International (UPI).
A healthy brain contains amyloid-beta, too. However, in a person with Alzheimer’s disease, the amyloid-beta is misfolded, forming sheet-like structures that are toxic to nerve cells, the researchers explained in a press release.
The test works with small amounts of blood plasma and employs an immuno-infrared-sensor, also developed at Ruhr University. The sensor measures the amounts of both pathological (the misfolded kind) and healthy amyloid-beta in the blood.
Amyloid plaques can start to form decades prior to the onset of Alzheimer’s symptoms, making them identifiable biomarkers that can be used as a “preselection funnel in two‐step diagnostics,” the researchers noted.
“The use of the immuno‐infrared‐sensor as an initial screening funnel to identify people who should undergo further diagnostics and eventually take part in clinical trials on therapeutics targeting Aβ misfolding might already be an important step forward because subjects with early AD stages are hard to identify,” the researchers note. “To our knowledge, there is today no other plasma test available, which has been tested both in an AD research cohort and in the general population.”
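Conceptually, the “preselection funnel” the researchers describe is a two-stage screen: a cheap, minimally invasive first test flags people whose readout crosses a threshold, and only those flagged proceed to costlier confirmatory diagnostics such as PET imaging or cerebrospinal fluid analysis. The Python sketch below illustrates that flow; the misfolded-fraction readout and the cutoff value are hypothetical and do not reproduce the Bochum sensor’s actual output or decision threshold.

```python
# Two-step "preselection funnel" sketch: a low-cost first-line test flags
# subjects for costlier confirmatory diagnostics (e.g., PET or CSF analysis).
# The readout (fraction of misfolded amyloid-beta) and the cutoff below are
# hypothetical, not the Bochum sensor's actual output or threshold.

SCREEN_CUTOFF = 0.5  # hypothetical: flag if the misfolded fraction exceeds this

def first_line_screen(misfolded_fraction: float) -> bool:
    """Step 1: flag subjects whose plasma readout exceeds the screening cutoff."""
    return misfolded_fraction > SCREEN_CUTOFF

subjects = {"A": 0.35, "B": 0.62, "C": 0.48, "D": 0.71}  # hypothetical readouts

flagged = [sid for sid, frac in subjects.items() if first_line_screen(frac)]
print(f"Refer for confirmatory diagnostics: {flagged}")  # ['B', 'D']
```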
Klaus Gerwert, PhD, (left) Chair of Biophysics at Ruhr University in Bochum, Germany, and Dr. Katsuhiko Yanagisawa, PhD, (right) molecular biologist and Director of the Center for Development of Advanced Medicine for Dementia in Obu City, Japan, both lead research teams that developed tests for identifying amyloid-β biomarkers in early onset Alzheimer’s patients. More research must be conducted before these assays could be offered by clinical laboratories. (Photo copyrights: International Max Planck Research School in Chemical and Molecular Biology/Nagoya University School of Medicine.)
Another Blood Test Finds Amyloid-Beta
Interestingly, just a few months ahead of the German researchers’ paper, scientists at the Center for Development of Advanced Medicine for Dementia (CAMD) in Obu City, Japan, published their own paper on a similar blood test they developed that also identifies high levels of amyloid-beta in patients with Alzheimer’s.
However, according to a news release, the Japanese study involved the use of immunoprecipitation and mass spectrometry to measure amyloid-beta related fragments in the blood.
The study, which was published in Nature, involved 373 people: 121 Japanese in the discovery cohort set and 252 Australians in the validation data set. The test found amyloid-beta levels in the brain with 90% accuracy, The Scientist reported.
“These results demonstrate the potential clinical utility of plasma biomarkers in predicting brain amyloid-β burden at an individual level. These plasma biomarkers also have cost-benefit and scalability advantages over current techniques, potentially enabling broader clinical access and efficient population screening,” the researchers wrote in their paper.
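A headline figure such as the 90% accuracy reported above comes from comparing each participant’s plasma-biomarker prediction against their PET-determined amyloid status and counting matches. The Python sketch below shows that calculation on made-up labels; it does not use the study’s data or its exact composite-biomarker definition.

```python
# Sketch: overall accuracy of a plasma-based prediction against PET-determined
# amyloid status. The labels below are made up for illustration only.

pet_positive    = [True, True, False, False, True, False, True, False, True, True]
plasma_predicts = [True, True, False, True,  True, False, True, False, True, True]

correct = sum(p == q for p, q in zip(pet_positive, plasma_predicts))
accuracy = correct / len(pet_positive)
print(f"Accuracy vs. PET: {accuracy:.0%}")  # 90% in this toy example
```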
Previous Alzheimer’s Research
These studies are not the first to seek biomarkers that could detect the early-onset of Alzheimer’s disease. In 2016, Dark Daily reported on two other studies: one conducted at Rowan University School of Osteopathic Medicine (RowanSOM) and another by IVD company Randox Laboratories. (See Dark Daily, “Two Different Research Teams Announce Tests for Alzheimer’s Disease That Could Be Useful for Clinical Laboratories after Clearance by the FDA,” November 30, 2016.)
Nevertheless, as of 2018, Alzheimer’s disease has impacted the lives of approximately 5.7 million Americans of all ages, according to the Alzheimer’s Association. And yet, doctors currently have only expensive positron emission tomography (PET) brain scans and invasive cerebrospinal fluid (CSF) analysis to identify the disease, generally in the latter stages of its development.
Thus, a less invasive, inexpensive test that accurately identifies biomarkers found in the majority of people during the early stages of the disease would be a boon to physicians who treat chronic neurodegenerative disease, medical laboratories that perform the tests, and, of course, the thousands of people each year who are diagnosed and suffer with this debilitating condition.
—Donna Marie Pocius
Related Information:
Blood Test Can Detect Alzheimer’s Years Before Symptoms
New Blood Test Useful to Detect People at Risk of Developing Alzheimer’s Disease
Blood Test Detects Alzheimer’s Before Symptoms Appear
Blood Test May Detect Very Early Alzheimer’s
Simple Blood Test Spots Dementia Protein
High Performance Plasma Amyloid-Beta Biomarkers for Alzheimer’s Disease
Researchers Develop Potential Blood Test for Alzheimer’s Disease
Japan Researchers Develop Cheap and Easy Way to Diagnose Alzheimer’s
Two Different Research Teams Announce Tests for Alzheimer’s Disease That Could Be Useful for Clinical Laboratories After Clearance by the FDA