News, Analysis, Trends, Management Innovations for
Clinical Laboratories and Pathology Groups

Hosted by Robert Michel

Genomics England Increases Goal of Whole Genome Sequencing Project from 100,000 to 500,000 Sequences in Five Years

Genomic sequencing continues to benefit patients through precision medicine, clinical laboratory diagnostics, and pharmacogenomic therapies

EDITOR’S UPDATE—Jan. 26, 2022: Since publication of this news briefing, officials from Genomics England contacted us to explain the following:

  • The “five million genome sequences” was an aspirational goal mentioned by then Secretary of State for Health and Social Care Matt Hancock, MP, in an October 2, 2018, press release issued by Genomics England.
  • As of this date a spokesman for Genomics England confirmed to Dark Daily that, with the initial goal of 100,000 genomes now attained, the immediate goal is to sequence 500,000 genomes.
  • This goal was confirmed in a tweet posted by Chris Wigley, CEO at Genomics England.

In accordance with this updated input, we have revised the original headline and information in the news briefing that follows.

What better proof of progress in whole human genome sequencing than the announcement that the United Kingdom’s 100,000 Genomes Project has not only achieved that milestone, but will now increase the goal to 500,000 whole human genomes? This should be welcome news to clinical laboratory managers, as it means their labs will be positioned as first-line providers of genetic data in support of clinical care.

Many clinical pathologists here in the United States are aware of the 100,000 Genomes Project, launched in England in 2012 in partnership with the National Health Service (NHS). Genomics England’s new goal of sequencing 500,000 whole human genomes is meant to pioneer a “lasting legacy for patients by introducing genomic sequencing into the wider healthcare system,” according to Technology Networks.

The importance of personalized medicine and the power of precise, accurate diagnoses cannot be overstated. This announcement by Genomics England will be of interest to diagnosticians worldwide, especially doctors who diagnose and treat patients with chronic and life-threatening diseases.

Building a Vast Genomics Infrastructure

Genetic sequencing launched the era of precision medicine in healthcare. Through genomics, drug therapies and personalized treatments have been developed that improve outcomes for many patients, especially those suffering from cancer and other chronic diseases. And the role of genomics in healthcare continues to expand, as Dark Daily has covered in numerous ebriefings.

In the US, the National Institutes of Health’s (NIH’s) Human Genome Project completed the first whole human genome sequence in 2003. That achievement opened the door to a new era of precision medicine.

Genomics England, which is wholly owned by the Department of Health and Social Care in the United Kingdom, was formed in 2012 with the goal of sequencing 100,000 whole genomes of patients enrolled in the UK National Health Service. That goal was met in 2018, and Genomics England now aims to sequence 500,000 genomes.

Richard Scott, MD, PhD

“The last 10 years have been really exciting, as we have seen genetic data transition from being something that is useful in a small number of contexts with highly targeted tests, towards being a central part of mainstream healthcare settings,” Richard Scott, MD, PhD (above), Chief Medical Officer at Genomics England, told Technology Networks. Much of the progress has found its way into clinical laboratory testing and precision medicine diagnostics. (Photo copyright: Genomics England.)

Genomics England’s initial goals included:

  • To create an ethical program based on consent,
  • To set up a genomic medicine service within the NHS to benefit patients,
  • To make new discoveries and gain insights into the use of genomics, and
  • To begin the development of a UK genomics industry.

To gain the greatest benefit from whole genome sequencing (WGS), a substantial data infrastructure must exist. “The amount of data generated by WGS is quite large and you really need a system that can process the data well to achieve that vision,” said Richard Scott, MD, PhD, Chief Medical Officer at Genomics England.

In early 2020, Weka, developer of the WekaFS, a fully parallel and distributed file system, announced that it would be working with Genomics England on managing the enormous amount of genomic data. When Genomics England reached 100,000 sequenced genomes, it had already gathered 21 petabytes of data. The organization expects to have 140 petabytes by 2023, notes a Weka case study.
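
For readers who want a sense of where figures like that come from, a quick back-of-the-envelope calculation is sketched below. The per-genome storage figure is an assumption used only for illustration (roughly 200 GB of raw and aligned data for one 30x whole genome); it is not a number from Weka or Genomics England.

```python
# Rough check on the data volumes quoted above; the per-genome size is an
# illustrative assumption, not a figure from Weka or Genomics England.
gb_per_genome = 200                                # assumed raw + aligned data for one 30x genome
genomes = 100_000
petabytes = genomes * gb_per_genome / 1_000_000    # 1 PB = 1,000,000 GB (decimal)
print(f"{petabytes:.0f} PB")                       # -> 20 PB, in the same ballpark as the 21 PB cited
```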

Putting Genomics England’s WGS Project into Action

WGS has significantly impacted the diagnosis of rare diseases. For example, Genomics England has contributed to projects that look at tuberculosis genomes to understand why the disease is sometimes resistant to certain medications. Genomic sequencing also played an enormous role in fighting the COVID-19 pandemic.

Scott notes that COVID-19 provides an example of how sequencing can be used to deliver care. “We can see genomic influences on the risk of needing critical care in COVID-19 patients and in how their immune system is behaving. Looking at this data alongside other omics information, such as the expression of different protein levels, helps us to understand the disease process better,” he said.

What’s Next for Genomics Sequencing?

As the research continues and scientists begin to better understand the information revealed by sequencing, other areas of scientific study like proteomics and metabolomics are becoming more important.

“There is real potential for using multiple strands of data alongside each other, both for discovery—helping us to understand new things about diseases and how [they] affect the body—but also in terms of live healthcare,” Scott said.

Along with expanding the target of Genomics England to 500,000 genomes sequenced, the UK has published a National Genomic Strategy named Genome UK. This plan describes how the research into genomics will be used to benefit patients. “Our vision is to create the most advanced genomic healthcare ecosystem in the world, where government, the NHS, research and technology communities work together to embed the latest advances in patient care,” according to the Genome UK website.

Clinical laboratory professionals with an understanding of diagnostics will recognize WGS’ impact on the healthcare industry. By following genomic sequencing initiatives, such as those coming from Genomics England, pathologists can keep their labs ready to take advantage of new discoveries and insights that will improve outcomes for patients.

Dava Stewart

Related Information:

The 100,000 Genomes Project

Genome Sequencing in Modern Medicine: An Interview with Genomics England

WekaIO Accelerates Five Million Genomes Project at Genomics England

Genomics England Improved Scale and Performance for On-Premises Cluster

Whole Genome Sequencing Increases Rare Disorder Diagnosis by 31%

Genome UK: The Future of Healthcare

UCLA’s Virtual Histology Could Eliminate Need for Invasive Biopsies for Some Skin Conditions and Cancers

Though the new technology could speed diagnoses of cancers and other skin diseases, it would also greatly reduce dermatopathology biopsy referrals and revenue

What effect would elimination of tissue biopsies have on dermatopathology and clinical laboratory revenue? Quite a lot. Dermatologists alone account for a significant portion of skin biopsies sent to dermatopathologists. Thus, any new technology that can “eliminate the need for invasive skin biopsies” would greatly reduce the number of histopathological referrals and reduce revenue to those practices.

Nevertheless, one such new technology may have been created by the Ozcan Research Group in a proof-of-concept study conducted at the University of California, Los Angeles (UCLA).

Called Virtual Histology, the technology applies artificial intelligence (AI) deep learning methods to reflectance confocal microscopy (RCM) images “to rapidly perform virtual histology of in vivo, label-free RCM images of normal skin structure, basal cell carcinoma, and melanocytic nevi with pigmented melanocytes, demonstrating similar histological features to traditional histology from the same excised tissue,” the UCLA scientists wrote in their study, published in the Nature peer-reviewed journal Light: Science and Applications.

Aydogan Ozcan, PhD

“What if we could entirely bypass the biopsy process and perform histology-quality staining without taking tissue and processing tissue in a noninvasive way? Can we create images that diagnosticians can benefit from?” asked Aydogan Ozcan, PhD (above), Chancellor’s Professor of Electrical and Computer Engineering at UCLA’s Samueli School of Engineering, one of the scientists who developed UCLA’s new virtual histology method, during an interview with Medical Device + Diagnostic Industry (MD+DI). (Photo copyright: Nature.)

Could Skin Biopsies be Eliminated?

The UCLA researchers believe their innovative deep learning-enabled imaging framework could possibly circumvent the need for skin biopsies to diagnose skin conditions.

“Here, we present a deep learning-based framework that uses a convolutional neural network to rapidly transform in vivo RCM images of unstained skin into virtually-stained hematoxylin and eosin-like images with microscopic resolution, enabling visualization of the epidermis, dermal-epidermal junction, and superficial dermis layers.

“This application of deep learning-based virtual staining to noninvasive imaging technologies may permit more rapid diagnoses of malignant skin neoplasms and reduce invasive skin biopsies,” the researchers added in their published study.

“This process bypasses several standard steps typically used for diagnosis, including skin biopsy, tissue fixation, processing, sectioning, and histochemical staining,” Aydogan Ozcan, PhD, Chancellor’s Professor of Electrical and Computer Engineering at UCLA’s Samueli School of Engineering, told Optics.org.

AI and Deep Learning in Dermatopathology

According to the published study, the UCLA team trained their neural network under an adversarial machine learning scheme to transform grayscale RCM images into virtually stained 3D microscopic images of normal skin, basal cell carcinoma, and pigmented melanocytic nevi. The new images displayed similar morphological features to those shown with the widely used hematoxylin and eosin (H&E) staining method.
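
For readers curious about what an “adversarial machine learning scheme” looks like in practice, the sketch below shows a minimal, generic setup for this kind of image-to-image virtual staining: a generator maps a one-channel RCM image to a three-channel stained-looking image, while a discriminator learns to distinguish generated images from real stained references. This is an illustrative outline only, assuming PyTorch; the network architectures, loss weights, and training details are assumptions, not the UCLA team’s published implementation.

```python
# Minimal sketch of adversarial (GAN-style) virtual staining, assuming PyTorch.
# Architectures, loss weights, and the data pipeline are illustrative only.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a 1-channel (grayscale RCM) image to a 3-channel virtually stained image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, 3, padding=1), nn.Sigmoid(),  # RGB output in [0, 1]
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Scores whether a 3-channel image looks like a real histology stain."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(128, 1),
        )
    def forward(self, x):
        return self.net(x)

def train_step(gen, disc, rcm, stained, g_opt, d_opt, l1_weight=100.0):
    """One adversarial update: the generator tries to fool the discriminator
    while also staying close (L1) to the registered ground-truth stain."""
    bce = nn.BCEWithLogitsLoss()
    # --- discriminator update ---
    d_opt.zero_grad()
    fake = gen(rcm).detach()
    d_loss = bce(disc(stained), torch.ones(stained.size(0), 1)) + \
             bce(disc(fake), torch.zeros(fake.size(0), 1))
    d_loss.backward()
    d_opt.step()
    # --- generator update ---
    g_opt.zero_grad()
    fake = gen(rcm)
    g_loss = bce(disc(fake), torch.ones(fake.size(0), 1)) + \
             l1_weight * nn.functional.l1_loss(fake, stained)
    g_loss.backward()
    g_opt.step()
    return d_loss.item(), g_loss.item()
```

In setups like this, the L1 term keeps the generated stain pixel-wise close to the registered reference image, while the adversarial term pushes it toward realistic staining color and texture.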

“In our studies, the virtually stained images showed similar color contrast and spatial features found in traditionally stained microscopic images of biopsied tissue,” Ozcan told Photonics Media. “This approach may allow diagnosticians to see the overall histological features of intact skin without invasive skin biopsies or the time-consuming work of chemical processing and labeling of tissue.”

The framework covers different skin layers, including the epidermis, dermal-epidermal junction, and superficial dermis. It images deeper into tissue without being invasive and can be performed quickly.

“The virtual stain technology can be streamlined to be almost semi real time,” Ozcan told Medical Device + Diagnostic Industry (MD+DI). “You can have the virtual staining ready when the patient is wrapping up. Basically, it can be within a couple of minutes after you’re done with the entire imaging.”

Currently, medical professionals rely on invasive skin biopsies and histopathological evaluations to diagnose skin diseases and cancers. These diagnostic techniques can result in unnecessary biopsies, scarring, multiple patient visits and increased medical costs for patients, insurers, and the healthcare system.

Improving Time to Diagnosis through Digital Pathology

Another advantage of this virtual technology, the UCLA researchers claim, is that it can provide better images than traditional staining methods, which could improve the ability to diagnose pathological skin conditions and help alleviate human error.

“The majority of the time, small laboratories have a lot of problems with consistency because they don’t use the best equipment to cut, process, and stain tissue,” dermatopathologist Philip Scumpia, MD, PhD, Assistant Professor of Dermatology and Dermatopathology at UCLA Health and one of the authors of the research paper, told MD+DI.

“What ends up happening is we get tissue on a histology slide that’s basically unevenly stained, unevenly put on the microscope, and it gets distorted,” he added, noting that this makes it very hard to make a diagnosis.  

Scumpia also added that this new technology would allow digital images to be sent directly to the pathologist, which could reduce processing and laboratory times.

“With electronic medical records now and the ability to do digital photography and digital mole mapping, where you can obtain a whole-body imaging of patients, you could imagine you can also use one of these reflectance confocal devices. And you can take that image from there, add it to the EMR with the virtual histology stain, which will make the images more useful,” Scumpia said. “So now, you can track lesions as they develop.

“What’s really exciting too, is that there’s the potential to combine it with other artificial intelligence, other machine learning techniques that can give more information,” Scumpia added. “Using the reflectance confocal microscope, a clinician who might not be as familiar in dermatopathology could take images and send [them] to a practitioner who could give a more expert diagnosis.”

Faster Diagnoses but Reduced Revenue for Dermatopathologists, Clinical Labs

Ozcan noted that there’s still a lot of work to be done in the clinical assessment, validation, and blind testing of their AI-based staining method. But he hopes the technology can be propelled into a useful tool for clinicians.

“I think this is a proof-of-concept work, and we’re very excited to make it move forward with further advances in technology, in the ways that we acquire 3D information [and] train our neural networks for better and faster virtual staining output,” he told MD+DI.

Though this new technology may reduce the need for invasive biopsies and expedite the diagnosis of skin conditions and cancers—thus improving patient outcomes—what effect might it have on dermatopathology practices?

More research and clinical studies are needed before this new technology becomes part of the diagnosis and treatment processes for skin conditions. Nevertheless, should virtual histology prove viable and gain adoption, it could greatly reduce the number of skin biopsy referrals to pathologists, dermatopathologists, and clinical laboratories, and with them a significant portion of those practices’ revenue.

JP Schlingman

Related Information:

Virtual Histology Eliminates Need for Invasive Skin Biopsies

UCLA Deep-learning Reduces Need for Invasive Biopsies

AI Imaging Method Provides Biopsy-free Skin Diagnosis

Light People: Professor Aydogan Ozcan

Histology Process Bypasses Need for Biopsies, Enables Diagnoses

Reflection-Mode Virtual Histology Using Photoacoustic Remote Sensing Microscopy

Introduction to Reflectance Confocal Microscopy and Its Use in Clinical Practice

Biopsy-free In Vivo Virtual Histology of Skin Using Deep Learning

Can This New Tech Reduce the Need for Skin Biopsies?

An Unlikely Pandemic Pairing: Facemasks Embedded with Ostrich Antibodies That Detect COVID-19 under UV Light

Japanese scientists who developed the detection method hope to use it to create ‘easy testing kits that anyone can use’

What do ostriches and humans have in common during the current COVID-19 pandemic? The unexpected answer is that ostrich antibodies can be used to identify humans infected with COVID-19. If the approach proves viable in healthcare settings, new clinical laboratory tests could be developed based on wearable diagnostic technologies that pathologists would interpret for doctors and patients.

This insight was the result of research conducted at Japan’s Kyoto Prefectural University (KPU). The KPU scientists found that a paper facemask coated with ostrich antibodies will fluoresce under ultraviolet (UV) light in the presence of the SARS-CoV-2 coronavirus.

Yasuhiro Tsukamoto, PhD

According to Study Finds, scientists at Kyoto Prefectural University in Japan have created a removable mask filter that, when sprayed with a fluorescent dye coated with antibodies extracted from ostrich eggs, will glow under UV light when COVID-19 is detected. The discovery by Yasuhiro Tsukamoto, PhD (above), President of Kyoto Prefectural University, and his researchers could lead to development of low-cost, at-home COVID-19 testing kits using the same ostrich-antibody-based technique. (Photo copyright: Kyoto Prefectural University/Reuters.)

The KPU scientists conducted a small study with 32 COVID-19 patients over a 10-day span. The surgical-style masks the patients wore glowed around the nose and mouth areas when later examined under UV light, and became dimmer over time as the patients’ viral loads decreased.

“The ostrich antibody for corona placed on the mouth filter of the mask captures the coronavirus in coughing, sneezing, and water,” the researchers explained in Study Finds.

Tsukamoto himself learned he had contracted COVID-19 after wearing a prototype mask and noticing it glowed under UV light. A PCR test later confirmed his diagnosis, Kyodo News reported.

The KPU team “hopes to further develop the masks so they will glow automatically, without special lighting, if the [COVID-19] virus is detected,” Reuters noted in its coverage of the ostrich-antibody masks.

Making Medicine from Ostrich Antibodies

A profile in Audubon noted that Tsukamoto, who also serves as a veterinary medicine professor at Kyoto Prefectural University, has made ostriches the focus of his research since the 1990s as he looks for ways to harness the dinosaur-like bird’s properties to fight human infections. He maintains a flock of 500 captive ostriches. Each female ostrich can produce 50 to 100 eggs per year over a 50-year life span.

Tsukamoto’s research focuses on customizing the antibodies in ostrich eggs by injecting females with inactive viruses, allergens, and bacteria, and then extracting the antibodies to develop medicines for humans. Antibodies form in the egg yolks in about six weeks and can be collected without harming the parent or young.

“The idea of using ostrich antibodies for therapeutics in general is a very interesting concept, particularly because of the advantages of producing the antibodies from eggs,” Ashley St. John, PhD, an Associate Professor in Immunology at Duke-NUS Medical School in Singapore, told Audubon.

While more clinical studies will be needed before ostrich-antibody masks reach the commercial marketplace, Tsukamoto’s team is planning to expand their experiment to 150 participants with a goal of receiving Japanese government approval to begin selling the glowing COVID-detection masks as early as 2022. But they believe the ostrich-antibody technique ultimately may lead to development of an inexpensive COVID-19 testing kit.

“We can mass-produce antibodies from ostriches at a low cost. In the future, I want to make this into an easy testing kit that anyone can use,” Tsukamoto told Kyodo News.

Harvard, MIT Also Working on COVID-19 Detecting Facemask

Not to be outdone, scientists at the Massachusetts Institute of Technology (MIT) and Harvard University are participating in a similar effort to create a facemask capable of detecting COVID-19.

According to Fast Company, the MIT/Harvard COVID-19-detecting masks use the same core technology as previous paper tests for Ebola and Zika that utilize proteins and nucleic acids embedded in paper that react to target molecules.

New facemask

Fast Company explained that the mask wearer launches a test by pushing a button to release a small water reservoir embedded in the mask (above). Droplets from the wearer’s breath are then analyzed by sensors in the mask, which could be adapted to test for new COVID variants or other respiratory pathogens. In addition to eliminating the use of a nasal swab, the mask-based testing system may compete with clinical laboratory-based tests. (Photo copyright: Felice Frankel/MIT.)

“Our system just allows you to add on laboratory-grade diagnostics to your normal mask wearing,” said Peter Q. Nguyen, PhD, lead author of a study published in Nature Biotechnology, titled, “Wearable Materials with Embedded Synthetic Biology Sensors for Biomolecule Detection.” Nguyen is a research scientist at the Wyss Institute for Biologically Inspired Engineering at Harvard.

“They would especially be useful in situations where local variant outbreaks are occurring, allowing people to conveniently test themselves at home multiple times a day,” he told Fast Company.

“It’s on par [with the] specificity and sensitivity that you will get in a state-of-the-art [medical] laboratory, but with no one there,” Luis Ruben Soenksen, PhD, Venture Builder in Artificial Intelligence and Healthcare at MIT and one of the co-authors of the Nature Biotechnology study, told Fast Company.

Wearable Diagnostics

This isn’t the first time unlikely sources have led to useful diagnostic information. In “Researchers in Japan Have Developed a ‘Smart’ Diaper Equipped with a Self-powered Biosensor That Can Monitor Blood Glucose Levels in Adults,” Dark Daily reported on another Japanese research team that developed self-powered wearable biosensors in undergarments that could detect blood glucose levels in individuals with diabetes as well as “smart diapers” that detect urine changes in babies.

As the definition of “wearable diagnostic technology” broadens, pathologists and clinical laboratory scientists may see their roles expand to include helping consumers interpret data collected by point-of-care testing technology as well as performing, evaluating, and interpreting laboratory test results that come from non-traditional sources. 

Andrea Downing Peck

Related Information:

Wearable Materials with Embedded Synthetic Biology Sensors for Biomolecule Detection

Face Mask Made with Ostrich Extract Detects COVID-19 by Glowing Under UV Light

How the Biggest Birds on Earth Could Help Fend Off Epidemics

Scientists Use Ostrich Cells to Make Glowing COVID Detection Masks

Japan Researchers Use Ostrich Cells to Make Glowing COVID-19 Detection Masks

This Mask Glows If You Have COVID

This New Face Mask Tests You for COVID while Protecting You from It

Researchers in Japan Have Developed a ‘Smart’ Diaper Equipped with a Self-powered Biosensor That Can Monitor Blood Glucose Levels in Adults

University of Colorado Researchers Develop Miniature Colonoscopy Robot Capable of Collecting Biopsies and Transmitting Diagnostic Information in Real Time

GI pathologists will be interested in how the Endoculus device uses tank-like treads to traverse the gastrointestinal tract, where it can capture images and perform biopsies

Gastroenterologists (GIs) may soon gain a useful new tool for gathering both biopsies and diagnostic information when examining the gastrointestinal tract. Ongoing development of a new robotic device promises both capabilities, using technology that will be of interest to GI pathologists and clinical laboratory scientists.

Researchers at the University of Colorado Boulder’s Advanced Medical Technologies Laboratory (AMTL) have developed a capsule-sized robotic device called “Endoculus” which they believe could eventually replace traditional endoscopes used in colonoscopies and endoscopies.

The minute robotic device uses tank-like treads to traverse the colon. While there, it can capture live images and perform biopsies under the control of a gastroenterologist. The researchers believe the robotic technology will benefit GIs performing the colonoscopies as well as the pathologists called upon to analyze biopsies.

Gregory Formosa, PhD

“Currently, endoscopy consists of a gastroenterologist using a semi-rigid, long rope-like device and endoscope to propel through your colon manually,” Gregory Formosa, PhD (above) a member of the AMTL team that developed Endoculus, said in a YouTube video describing the device. “We think that a robotic capsule endoscope can replace conventional endoscopes by making them faster, safer, and more robust than a human operator can do currently with traditional techniques,” he added. (Photo copyright: University of Colorado.)

AMTL researcher Gregory Formosa, PhD, said the team’s goal is to “have a capsule-sized robot that can actively traverse [a patient’s] entire gastrointestinal tract and send out diagnostics in real time, as well as autonomously navigate itself to localize problematic areas within [the] intestinal tract.”

Formosa noted that colorectal cancer is “the third-most fatal and diagnosed cancer in the United States.” But if caught at an early stage, these cancers are “95% treatable,” he added. “So, if we can get people screened early, we definitely can reduce the fatality rate of colorectal cancers significantly.”

The AMTL research team, now led by mechanical engineering professor Mark Rentschler, PhD, described an early prototype of the device in a Surgical Endoscopy paper, titled, “Surgical Evaluation of a Novel Tethered Robotic Capsule Endoscope Using Micro-Patterned Treads.” The researchers have since followed that with additional papers in IEEE journals and presentations at the IEEE International Conference on Intelligent Robots and Systems.

The Endoculus device

Currently about the size of a C battery, Endoculus (above) is a “fully packed medical device, complete with a camera, an air pump for inflating the colon, a water pump for cleaning, and a tool port for holding biopsy snares,” states a University of Colorado news story, titled, “A Robot May One Day Perform Your Colonoscopy.” (Photo copyright: University of Colorado.)

How Endoculus Works

One key to the device is its four treads, which are designed for traction on digestive tissue.

“You have to forget about everything you know from a locomotion standpoint because driving around inside the body is very different than driving around in a car,” said Rentschler in the University of Colorado news story. “The environment is highly deformable. It’s very slick. There are sharp peaks that you have to go over.”

The university news story noted the current availability of ingestible “pill cams” that can take photos as they travel through the digestive system. But once swallowed, their movements cannot be controlled.

“For our robots to be able to reach those regions that [can be] reached with a pill-cam—but also be able to stop and look around—that could be a big paradigm shift in the way we view these procedures,” said Micah Prendergast, PhD, an AMTL research team member.

Could Biopsies Be Diagnosed In Situ with Endoculus?

The researchers currently view Endoculus as a potentially better way to perform conventional biopsies. But could it lead to bigger advancements?

“Researchers continue to develop devices to help various specialist physicians—in this case GIs—do more when treating patients,” said Dark Daily Publisher and Editor-in-Chief Robert Michel. “This device fits that description. It is designed to improve the ability of GIs to evaluate the colon. Not only does this device do that, but it can also collect a biopsy at sites of interest. In this way, it is a device that can be a benefit to pathologists who will analyze the biopsy.

“With improvements in digital cameras and associated AI-powered analytical tools, the day might not be far off when a device like this can use the camera and artificial intelligence to diagnose the tissue of interest in situ,” he added. “This might create the opportunity for pathologists to be present in the exam room during the procedure, or even viewing the images remotely.

“Not only would that eliminate the need to collect a tissue specimen that must then be sent to a pathology lab, but it would create a new opportunity for pathologists to add value to patient care while shortening the time to diagnosis for the tissue of interest during these procedures,” Michel noted.

Stephen Beale

Related Information:

Colon Explorer for Automatic Imaging and Biopsying of Polyps

This Tiny Robot Tank Could One Day Help Doctors Explore Your Intestine

A Robot May One Day Perform Your Colonoscopy

Mitchell Cancer Institute in Alabama Combines New Robotic Method for Detecting and Excising Biopsies with Rapid On-site Evaluation (ROSE) to Speed Diagnosis of Lung Cancer

Combining robotic-assisted bronchoscopy with rapid on-site evaluation by cytopathologists enables cancer evaluation and diagnosis in one procedure

New technologies are making it possible to both collect a tissue biopsy and diagnose lung cancer during the same procedure. Cytopathologists are essential in this unique approach, which has the potential to greatly shorten the time required to diagnose lung cancer.

At USA Health Mitchell Cancer Institute in Alabama, a team consisting of pulmonology, pathology, surgical, and medical oncology specialists can diagnose lung cancer significantly faster thanks to the combining of a robotic-assisted bronchoscopy (RAB) system with rapid on-site evaluation of biopsies (ROSE) by a cytopathologist during the same procedure.

The RAB platform was created by Auris Health in Redwood City, Calif. According to a USA Health news release, the Auris Health Monarch “enables physicians to see inside the lung and biopsy hard-to-reach nodules using a flexible endoscope. When combined with rapid on-site evaluation (ROSE) it allows for diagnosis at the time of bronchoscopy.”

USA Health says it is the only academic health system in Alabama to combine the Auris Health Monarch (Monarch) with ROSE to diagnose lung cancer in a single procedure. 

“Ninety-nine percent of the time we make a diagnosis—negative or positive (at time of bronchoscopy). We don’t have to do repeat procedures,” said Elba Turbat-Herrera, MD, Director of Pathological Services at USA Health’s Mitchell Cancer Institute (MCI) and Professor, MCI Interdisciplinary Clinical Oncology, in an exclusive interview with Dark Daily.

The American Society for Cytopathology defines ROSE as “a clinical service provided for patients where a pathologist, or in certain settings, an experienced and appropriately qualified cytotechnologist provides immediate real‐time evaluation of a fine needle aspiration (FNA) biopsy or touch imprints of a core biopsy.”

As a cytopathologist, Turbat-Herrera performs ROSE during procedures at USA Health. “I think we have improved diagnostics very much. With the Monarch equipment, one can see where the needle is traveling in the bronchial tube. It is more precise,” Turbat-Herrera explained.

Patients Benefit from Robotic-assisted Bronchoscopy

Traditionally, anatomic pathologists receive core (tissue sampling) biopsies and fine-needle aspiration biopsies from doctors looking to determine if a lung nodule may be cancerous. But the procedures to secure the biopsies are invasive and stressful for patients waiting for results from clinical laboratories. And some nodules are difficult for surgeons to reach, which can delay care to patients.

Brian Persing, MD

“The Monarch and ROSE technologies represent a huge step forward in lung bronchoscopy. Being able to see directly inside the lung and evaluate samples immediately provides the most advanced care for patients,” said Brian Persing, MD (above), Medical Oncologist, Mitchell Cancer Institute, and Assistant Professor of Interdisciplinary Clinical Oncology at the University of South Alabama College of Medicine, in the news release. (Photo copyright: University of South Alabama.)

Currently, more than 112 US healthcare providers use the Monarch robotic-assisted bronchoscopy (RAB) platform, which garnered US Food and Drug Administration (FDA) clearance in 2018, the USA Health news release noted.

The Monarch platform, according to USA Health, “integrates robotics, micro-instrumentation, endoscope design, and data science into one platform to empower physicians.”

Monarch's "controller-like interface"

Monarch’s “controller-like interface” (seen above) enables physicians to operate the endoscope and access small and “hard-to-reach” lung nodules. “The Monarch platform,” Duluth News Tribune explained, “is an endoscope guided by a handheld controller very similar to an Xbox controller. As the Monarch Platform drives through the lungs, the camera and other diagrams on a screen help the physician locate the nodule, then collect the biopsy with better accuracy and precision.” (Photo copyright: Jed Carlson/Superior Telegram/Duluth News Tribune.)

Eric Swanson, a pulmonologist at Essentia Health-St. Mary’s Medical Center in Duluth, Minn., calls Monarch a game changer. “It’s a big, big upgrade from what we had before,” Swanson told the Duluth News Tribune. “(Before), you’d just pass a small catheter through a regular bronchoscope, and you turn it and hope you land in the right spot.”

The Monarch platform has enabled USA Health to step up diagnosis of lung cancer, as compared to FNA (fine needle aspiration) biopsy on its own, according to Turbat-Herrera.

“With FNA alone, you try to get (sample tissue), and you are not sure. Now, if it is there, you should get it because the (Monarch) equipment helps you get there. Our role in pathology is to help guide the hand of the pulmonologist: ‘you don’t have what we need,’ or ‘keep going in that area of the lung,’” she said, adding that physicians have been able to reach tiny lesions.

High Incidence of Lung Cancer

The American Cancer Society says lung cancer is the second most common cancer, with an estimated 235,760 new lung cancer cases and 131,880 deaths from the disease in 2021.

It’s hoped that healthcare providers’ investment in new robotic technology—such as Monarch and others—may shorten the time required to diagnose lung cancer and eventually save lives.

Providers such as USA Health go a step further by integrating ROSE with RAB. The robotic technology, coupled with rapid on-site evaluation by a cytopathologist, immediately provides an assessment of sample adequacy and a cancer diagnosis, averting repeat biopsy procedures and benefiting patients as well.

This is yet another example of how a new technology in one field can have a benefit for anatomic pathologists.   

Donna Marie Pocius

Related Information:

USA Health Mitchell Cancer Institute Offers State-of-the-Art Lung Cancer Diagnosis

FDA Clears Auris Health’s Robotic Monarch Platform for Endoscopy

New Robotic Diagnostic Device Searches for Lung Cancer

High Diagnostic Yield in Sampling Sub-Centimeter Peripheral Pulmonary Nodules with Robotic-Assisted Bronchoscopy

ASC Rapid On‐Site Evaluation (ROSE) Position Statement

Dermatopathologists May Soon Have Useful New Tool That Uses AI Algorithm to Detect Melanoma in Wide-field Images of Skin Lesions Taken with Smartphones

MIT’s deep learning artificial intelligence algorithm demonstrates how similar new technologies and smartphones can be combined to give dermatologists and dermatopathologists valuable new ways to diagnose skin cancer from digital images

Scientists at the Massachusetts Institute of Technology (MIT) and other Boston-area research institutions have developed an artificial intelligence (AI) algorithm that detects melanoma in wide-field images of skin lesions taken on smartphones. And its use could affect how dermatologists and dermatopathologists diagnose cancer.

The study, published in Science Translational Medicine, titled, “Using Deep Learning for Dermatologist-Level Detection of Suspicious Pigmented Skin Lesions from Wide-Field Images,” demonstrates that even a common device like a smartphone can be a valuable resource in the detection of disease.

According to an MIT press release, “The paper describes the development of an SPL [Suspicious Pigmented Lesion] analysis system using DCNNs [Deep Convolutional Neural Networks] to more quickly and efficiently identify skin lesions that require more investigation, screenings that can be done during routine primary care visits, or even by the patients themselves. The system utilized DCNNs to optimize the identification and classification of SPLs in wide-field images.”
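
As a rough illustration of what a DCNN-based lesion classifier looks like in code, the sketch below maps a batch of lesion image crops to suspicion probabilities and flags those above a threshold for closer review. It is a generic, hedged outline assuming PyTorch; the architecture, class labels, and 0.5 threshold are illustrative assumptions, not the model described in the paper.

```python
# Minimal sketch of a DCNN that scores lesion crops as suspicious or not.
# Architecture and labels are illustrative assumptions, not the MIT model.
import torch
import torch.nn as nn

class LesionClassifier(nn.Module):
    def __init__(self, num_classes=2):  # e.g., nonsuspicious vs. suspicious
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):  # x: (batch, 3, height, width) lesion crops
        h = self.features(x).flatten(1)
        return self.classifier(h)

model = LesionClassifier()
crops = torch.rand(4, 3, 128, 128)           # four hypothetical lesion crops
probs = torch.softmax(model(crops), dim=1)   # per-crop suspicion probabilities
flagged = probs[:, 1] > 0.5                  # crops to refer for closer review
```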

The MIT scientists believe their AI analysis system could help dermatologists, dermatopathologists, and clinical laboratories detect melanoma, a deadly form of skin cancer, in its early stages using smartphones at the point of care.

Luis Soenksen, PhD

“Our research suggests that systems leveraging computer vision and deep neural networks, quantifying such common signs, can achieve comparable accuracy to expert dermatologists,” said Luis Soenksen, PhD (above), Venture Builder in Artificial Intelligence and Healthcare at MIT and first author of the study, in an MIT press release. “We hope our research revitalizes the desire to deliver more efficient dermatological screenings in primary care settings to drive adequate referrals.” The MIT study demonstrates that dermatologists, dermatopathologists, and clinical laboratories can benefit from using common technologies like smartphones in the diagnosis of disease. (Photo copyright: Wyss Institute Harvard University.)

Improving Melanoma Treatment and Patient Outcomes

Melanoma develops when pigment-producing cells called melanocytes start to grow out of control. The cancer has traditionally been diagnosed through visual inspection of SPLs by physicians in medical settings. Early-stage identification of SPLs can drastically improve the prognosis for patients and significantly reduce treatment costs. It is common to biopsy many lesions to ensure that every case of melanoma can be diagnosed as early as possible, thus contributing to better patient outcomes.

“Early detection of SPLs can save lives. However, the current capacity of medical systems to provide comprehensive skin screenings at scale are still lacking,” said Luis Soenksen, PhD, Venture Builder in Artificial Intelligence and Healthcare at MIT and first author of the study, in the MIT press release.

The researchers trained their AI system by using 20,388 wide-field images from 133 patients at the Gregorio Marañón General University Hospital in Madrid, as well as publicly available images. The collected photographs were taken with a variety of ordinary smartphone cameras that are easily obtainable by consumers.

They taught the deep learning algorithm to examine various features of skin lesions such as size, circularity, and intensity. Dermatologists working with the researchers also visually classified the lesions for comparison.
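
Such handcrafted descriptors have standard definitions in image analysis. As one hedged example (the function below and its use of scikit-image are illustrative assumptions, not the study’s code), size, circularity, and mean intensity can be computed from a segmented lesion mask like this:

```python
# Sketch of computing size, circularity, and intensity for a segmented lesion.
# The definitions below are common image-analysis conventions used for
# illustration; they are not taken from the MIT paper.
import numpy as np
from skimage import measure

def lesion_features(mask, gray_image):
    """mask: boolean array marking lesion pixels; gray_image: grayscale image of the same shape."""
    props = measure.regionprops(mask.astype(int))[0]
    area = props.area                                     # lesion size in pixels
    circularity = 4 * np.pi * area / props.perimeter**2   # 1.0 for a perfect circle
    mean_intensity = float(gray_image[mask].mean())       # average brightness inside the lesion
    return {"size": area, "circularity": circularity, "intensity": mean_intensity}
```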

Smartphone image of pigmented skin lesions

When the algorithm is “shown” a wide-field image like that above taken with a smartphone, it uses deep convolutional neural networks to analyze individual pigmented lesions and screen for early-stage melanoma. The algorithm then marks suspicious images as either yellow (meaning further inspection should be considered) or red (indicating that further inspection and/or referral to a dermatologist is required). Using this tool, dermatopathologists may be able to diagnose skin cancer and excise it in-office long before it becomes deadly. (Photo copyright: MIT.)

“Our system achieved more than 90.3% sensitivity (95% confidence interval, 90 to 90.6) and 89.9% specificity (89.6 to 90.2%) in distinguishing SPLs from nonsuspicious lesions, skin, and complex backgrounds, avoiding the need for cumbersome individual lesion imaging,” the MIT researchers noted in their Science Translational Medicine paper.
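
For context on how such figures are derived, sensitivity and specificity are simple ratios over a confusion matrix, and the narrow confidence intervals reflect the large number of lesions evaluated. The sketch below uses made-up counts and a normal-approximation interval purely for illustration; it does not reproduce the study’s data or its statistical method.

```python
# Sensitivity, specificity, and a normal-approximation 95% confidence interval.
# The counts below are placeholders, not the study's data.
import math

def rate_with_ci(successes, total, z=1.96):
    p = successes / total
    half_width = z * math.sqrt(p * (1 - p) / total)
    return p, (p - half_width, p + half_width)

tp, fn = 903, 97    # hypothetical suspicious lesions: correctly vs. incorrectly flagged
tn, fp = 899, 101   # hypothetical nonsuspicious lesions: correctly vs. incorrectly cleared

sensitivity, sens_ci = rate_with_ci(tp, tp + fn)   # TP / (TP + FN)
specificity, spec_ci = rate_with_ci(tn, tn + fp)   # TN / (TN + FP)
print(f"sensitivity {sensitivity:.1%} CI {sens_ci}, specificity {specificity:.1%} CI {spec_ci}")
```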

In addition, the algorithm agreed with the consensus of experienced dermatologists 88% of the time and concurred with the opinions of individual dermatologists 86% of the time, Medgadget reported.

Modern Imaging Technologies Will Advance Diagnosis of Disease

According to the American Cancer Society, about 106,110 new cases of melanoma will be diagnosed in the United States in 2021. Approximately 7,180 people are expected to die of the disease this year. Melanoma is less common than other types of skin cancer but more dangerous as it’s more likely to spread to other parts of the body if not detected and treated early.

More research is needed to substantiate the effectiveness and accuracy of this new tool before it could be used in clinical settings. However, the early research looks promising and smartphone camera technology is constantly improving. Higher resolutions would further advance development of this type of diagnostic tool.

In addition, MIT’s algorithm enables in situ examination and possible diagnosis of cancer. Therefore, a smartphone so equipped could enable a dermatologist to diagnose and excise cancerous tissue in a single visit, without the need for biopsies to be sent to a dermatopathologist.

Currently, dermatologists refer many skin biopsies to dermatopathologists and anatomic pathology laboratories. An accurate diagnostic tool that uses modern smartphones to characterize suspicious skin lesions could become quite popular with dermatologists and affect the flow of referrals to medical laboratories.

JP Schlingman

Related Information:

Software Spots Suspicious Skin Lesions on Smartphone Photos

An Artificial Intelligence Tool That Can Help Detect Melanoma

Using Deep Learning for Dermatologist-level Detection of Suspicious Pigmented Skin Lesions from Wide-field Images
