Cedars-Sinai Researchers Determine Smartphone App Can Assess Stool Form as Well as Gastroenterologists and Better than IBS Patients

Artificial intelligence performed BSS assessments with higher sensitivity and specificity than patients’ self-reports, and on par with trained gastroenterologists

In a recent study conducted by scientists at Cedars-Sinai Medical Center in Los Angeles, researchers evaluated a smartphone application (app) that uses artificial intelligence (AI) to assess and characterize digital images of stool samples. The app, it turns out, matched the accuracy of participating gastroenterologists and exceeded the accuracy of study patients’ self-reports of stool specimens, according to a news release.

Though smartphone apps are technically not clinical laboratory tools, anatomic pathologists and medical laboratory scientists (MLSs) may be interested to learn how health information technology (HIT), machine learning, and smartphone apps are being used to assess different aspects of individuals’ health, independent of trained healthcare professionals.

The issue the Cedars-Sinai researchers were investigating is the accuracy of patient self-reporting. Because poop can be more complicated than meets the eye, patients often find it difficult to be specific when asked to describe their bowel movements. Thus, a smartphone app that enables patients to accurately assess their stools, in cases where the function of the digestive tract is relevant to diagnosis and treatment, would be a boon to precision medicine treatments for gastroenterological diseases.

The scientists published their findings in the American Journal of Gastroenterology, titled, “A Smartphone Application Using Artificial Intelligence Is Superior to Subject Self-Reporting when Assessing Stool Form.”

Mark Pimentel, MD

“This app takes out the guesswork by using AI—not patient input—to process the images (of bowel movements) taken by the smartphone,” said gastroenterologist Mark Pimentel, MD (above), Executive Director of Cedars-Sinai’s Medically Associated Science and Technology (MAST) program and principal investigator of the study, in a news release. “The mobile app produced more accurate and complete descriptions of constipation, diarrhea, and normal stools than a patient could, and was comparable to specimen evaluations by well-trained gastroenterologists in the study.” (Photo copyright: Cedars-Sinai.)

Pros and Cons of Bristol Stool Scale

In their paper, the scientists discussed the Bristol Stool Scale (BSS), a traditional diagnostic tool that classifies stool forms into seven categories (a brief code sketch of the scale follows the list below). The seven types of stool are:

  • Type 1: Separate hard lumps, like nuts (difficult to pass).
  • Type 2: Sausage-shaped, but lumpy.
  • Type 3: Like a sausage, but with cracks on its surface.
  • Type 4: Like a sausage or snake, smooth and soft (average stool).
  • Type 5: Soft blobs with clear cut edges.
  • Type 6: Fluffy pieces with ragged edges, a mushy stool (diarrhea).
  • Type 7: Watery, no solid pieces, entirely liquid (diarrhea). 
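
For clinical laboratory professionals who build or evaluate software around such scales, the BSS maps naturally onto a simple enumeration. The following Python sketch is illustrative only; the names and the constipation/normal/diarrhea grouping rule are assumptions made for this article, not code from the Dieta Health app or the study.

```python
from enum import IntEnum

class BristolStoolScale(IntEnum):
    """Bristol Stool Scale types 1-7 (illustrative encoding)."""
    SEPARATE_HARD_LUMPS = 1   # difficult to pass
    LUMPY_SAUSAGE = 2
    CRACKED_SAUSAGE = 3
    SMOOTH_SOFT_SAUSAGE = 4   # "average" stool
    SOFT_BLOBS = 5
    MUSHY_PIECES = 6          # diarrhea
    ENTIRELY_LIQUID = 7       # diarrhea

def broad_category(bss_type: BristolStoolScale) -> str:
    """Group BSS types into the constipation/normal/diarrhea buckets
    commonly used in IBS reporting (a simplification, not the study's rule)."""
    if bss_type <= BristolStoolScale.LUMPY_SAUSAGE:
        return "constipation"
    if bss_type >= BristolStoolScale.MUSHY_PIECES:
        return "diarrhea"
    return "normal"

print(broad_category(BristolStoolScale(6)))  # -> "diarrhea"
```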

In an industry guidance report on irritable bowel syndrome (IBS) and associated drugs for treatment, the US Food and Drug Administration (FDA) said the BSS is “an appropriate instrument for capturing stool consistency in IBS.”

But even with the BSS, things can get murky for patients. Inaccurate self-reporting of stool forms by people with IBS and diarrhea can make proper diagnoses difficult.

“The problem is that whenever you have a patient reporting an outcome measure, it becomes subjective rather than objective. This can impact the placebo effect,” Pimentel told Healio.

Thus, according to the researchers, AI algorithms can help with diagnosis by systematically performing the assessments for patients, News Medical reported.

30,000 Stool Images Train New App

To conduct their study, the Cedars-Sinai researchers tested an AI smartphone app developed by Dieta Health. According to Health IT Analytics, the app, whose AI was trained on 30,000 annotated stool images, characterizes digital images of bowel movements using five parameters (a simple data-structure sketch follows the list):

  • BSS,
  • Consistency,
  • Edge fuzziness,
  • Fragmentation, and
  • Volume.
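
A structured record for those five outputs might look like the Python sketch below. The field names, score ranges, and the simple diarrhea rule are assumptions for illustration; they are not Dieta Health’s actual data model.

```python
from dataclasses import dataclass

@dataclass
class StoolAssessment:
    """Hypothetical container for the five image-derived parameters
    described in the study (not Dieta Health's actual schema)."""
    bss_type: int          # Bristol Stool Scale type, 1-7
    consistency: float     # assumed scale: 0 (hard) to 1 (liquid)
    edge_fuzziness: float  # assumed scale: 0 (sharp edges) to 1 (fuzzy)
    fragmentation: float   # assumed scale: 0 (single piece) to 1 (many fragments)
    volume: float          # relative volume estimate, assumed units

    def suggests_diarrhea(self) -> bool:
        """Simple illustrative rule: BSS types 6-7 suggest diarrhea."""
        return self.bss_type >= 6

sample = StoolAssessment(bss_type=6, consistency=0.8, edge_fuzziness=0.7,
                         fragmentation=0.9, volume=0.5)
print(sample.suggests_diarrhea())  # True
```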

“The app used AI to train the software to detect the consistency of the stool in the toilet based on the five parameters of stool form. We then compared that with doctors who know what they are looking at,” Pimentel told Healio.

AI Assessments Comparable to Doctors, Better than Patients

According to Health IT Analytics, the researchers found that:

  • AI assessments of stool were comparable to gastroenterologists’ assessments on BSS, consistency, fragmentation, and edge fuzziness scores.
  • AI and gastroenterologists had moderate-to-good agreement on volume.
  • AI outperformed study participants’ self-reports based on the BSS, achieving 95% accuracy compared with the patients’ 89%.

Additionally, the AI outperformed the patients’ self-reports in specificity and sensitivity as well (the short sketch after this list shows how these two metrics are computed):

  • Specificity (ability to correctly report a negative result) was 27% higher.
  • Sensitivity (ability to correctly report a positive result) was 23% higher.
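
Both metrics come from a standard confusion matrix, as the short Python sketch below shows. The counts used here are made up for illustration; they are not the Cedars-Sinai study’s data.

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Sensitivity (recall): share of actual positives correctly reported."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Specificity: share of actual negatives correctly reported."""
    return true_neg / (true_neg + false_pos)

# Illustrative counts only -- not the study's confusion matrix.
print(f"sensitivity = {sensitivity(true_pos=90, false_neg=10):.2f}")  # 0.90
print(f"specificity = {specificity(true_neg=85, false_pos=15):.2f}")  # 0.85
```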

“A novel smartphone application can determine BSS and other visual stool characteristics with high accuracy compared with the two expert gastroenterologists. Moreover, trained AI was superior to subject self-reporting of BSS. AI assessments could provide more objective outcome measures for stool characterization in gastroenterology,” the Cedars-Sinai researchers wrote in their paper.

“In addition to improving a physician’s ability to assess their patients’ digestive health, this app could be advantageous for clinical trials by reducing the variability of stool outcome measures,” said gastroenterologist Ali Rezaie, MD, study co-author and Medical Director of Cedars-Sinai’s GI Motility Program in the news release.

The researchers plan to seek FDA review of the mobile app.

Opportunity for Clinical Laboratories

Anatomic pathologists and clinical laboratory leaders may want to reach out to referring gastroenterologists to find out how they can help better serve gastroenterology patients. As the Cedars-Sinai study suggests, AI smartphone apps can perform BSS assessments as well as or better than humans and may be useful tools in the pursuit of precision medicine treatments for patients suffering from painful gastrointestinal disorders.

—Donna Marie Pocius

Related Information:

Smartphone Application Using Artificial Intelligence is Superior to Subject Self-Reporting When Assessing Stool Form

Study: App More Accurate than Patient Evaluation of Stool Samples

Industry Guidance Report: Irritable Bowel Syndrome—Clinical Evaluation of Drugs

Artificial Intelligence-based Smartphone App for Characterizing Stool Form

AI Mobile App Improves on “Subjective” Patient-Reported Stool Assessment in IBS

Artificial Intelligence App Outperforms Patient-Reported Stool Assessments

UCLA’s Virtual Histology Could Eliminate Need for Invasive Biopsies for Some Skin Conditions and Cancers

Though the new technology could speed diagnoses of cancers and other skin diseases, it would also greatly reduce dermatopathology biopsy referrals and revenue

What effect would elimination of tissue biopsies have on dermatopathology and clinical laboratory revenue? Quite a lot. Dermatologists alone account for a significant portion of skin biopsies sent to dermatopathologists. Thus, any new technology that can “eliminate the need for invasive skin biopsies” would greatly reduce the number of histopathological referrals and reduce revenue to those practices.

Nevertheless, one such new technology may have been created by the Ozcan Research Group in a proof-of-concept study conducted at the University of California, Los Angeles (UCLA).

Called Virtual Histology, the technology applies artificial intelligence (AI) deep learning methods to reflectance confocal microscopy (RCM) images “to rapidly perform virtual histology of in vivo, label-free RCM images of normal skin structure, basal cell carcinoma, and melanocytic nevi with pigmented melanocytes, demonstrating similar histological features to traditional histology from the same excised tissue,” the UCLA scientists wrote in their study, published in the Nature peer-reviewed journal Light: Science and Applications.

Aydogan Ozcan, PhD

“What if we could entirely bypass the biopsy process and perform histology-quality staining without taking tissue and processing tissue in a noninvasive way? Can we create images that diagnosticians can benefit from?” asked Aydogan Ozcan, PhD (above), Chancellor’s Professor of Electrical and Computer Engineering at UCLA’s Samueli School of Engineering, one of the scientists who developed UCLA’s new virtual histology method, during an interview with Medical Device + Diagnostic Industry (MD+DI). (Photo copyright: Nature.)

Could Skin Biopsies be Eliminated?

The UCLA researchers believe their innovative deep learning-enabled imaging framework could possibly circumvent the need for skin biopsies to diagnose skin conditions.

“Here, we present a deep learning-based framework that uses a convolutional neural network to rapidly transform in vivo RCM images of unstained skin into virtually-stained hematoxylin and eosin-like images with microscopic resolution, enabling visualization of the epidermis, dermal-epidermal junction, and superficial dermis layers.

“This application of deep learning-based virtual staining to noninvasive imaging technologies may permit more rapid diagnoses of malignant skin neoplasms and reduce invasive skin biopsies,” the researchers added in their published study.

“This process bypasses several standard steps typically used for diagnosis, including skin biopsy, tissue fixation, processing, sectioning, and histochemical staining,” Ozcan told Optics.org.

AI and Deep Learning in Dermatopathology

According to the published study, the UCLA team trained their neural network under an adversarial machine learning scheme to transform grayscale RCM images into virtually stained 3D microscopic images of normal skin, basal cell carcinoma, and pigmented melanocytic nevi. The new images displayed similar morphological features to those shown with the widely used hematoxylin and eosin (H&E) staining method.
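
For readers curious about what such an adversarial scheme looks like in code, the PyTorch sketch below pairs a generator that maps single-channel RCM patches to three-channel H&E-like images with a discriminator that judges whether an image pair looks real. It is a generic conditional-GAN outline with assumed tensor shapes, network depths, and loss weights; it is not the UCLA team’s actual architecture or training code.

```python
import torch
import torch.nn as nn

# Generator: one-channel (grayscale) RCM patch -> three-channel H&E-like patch.
# A real system would likely use a U-Net; this shallow stack is only a placeholder.
generator = nn.Sequential(
    nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3, kernel_size=3, padding=1), nn.Sigmoid(),
)

# Discriminator: judges whether an (RCM, stained) pair looks like a real match.
# The final Linear layer assumes 64x64 input patches (two stride-2 convs -> 16x16).
discriminator = nn.Sequential(
    nn.Conv2d(1 + 3, 32, kernel_size=3, stride=2, padding=1), nn.LeakyReLU(0.2),
    nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.LeakyReLU(0.2),
    nn.Flatten(), nn.Linear(64 * 16 * 16, 1),
)

adv_loss = nn.BCEWithLogitsLoss()
pix_loss = nn.L1Loss()  # keeps the virtual stain close to the paired real H&E image
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def train_step(rcm: torch.Tensor, he_target: torch.Tensor) -> None:
    """One adversarial update on a batch of paired (RCM, H&E) image patches."""
    fake = generator(rcm)

    # Discriminator update: real pairs labeled 1, generated pairs labeled 0.
    d_opt.zero_grad()
    real_logit = discriminator(torch.cat([rcm, he_target], dim=1))
    fake_logit = discriminator(torch.cat([rcm, fake.detach()], dim=1))
    d_loss = (adv_loss(real_logit, torch.ones_like(real_logit))
              + adv_loss(fake_logit, torch.zeros_like(fake_logit)))
    d_loss.backward()
    d_opt.step()

    # Generator update: fool the discriminator while staying close to the target stain.
    g_opt.zero_grad()
    fake_logit = discriminator(torch.cat([rcm, fake], dim=1))
    g_loss = (adv_loss(fake_logit, torch.ones_like(fake_logit))
              + 100.0 * pix_loss(fake, he_target))
    g_loss.backward()
    g_opt.step()

# Random tensors stand in for a batch of four 64x64 image patches.
train_step(torch.rand(4, 1, 64, 64), torch.rand(4, 3, 64, 64))
```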

“In our studies, the virtually stained images showed similar color contrast and spatial features found in traditionally stained microscopic images of biopsied tissue,” Ozcan told Photonics Media. “This approach may allow diagnosticians to see the overall histological features of intact skin without invasive skin biopsies or the time-consuming work of chemical processing and labeling of tissue.”

The framework covers different skin layers, including the epidermis, the dermal-epidermal junction, and the superficial dermis. It images deeper into tissue without being invasive and can be performed quickly.

“The virtual stain technology can be streamlined to be almost semi real time,” Ozcan told Medical Device + Diagnostic Industry (MD+DI). “You can have the virtual staining ready when the patient is wrapping up. Basically, it can be within a couple of minutes after you’re done with the entire imaging.”

Currently, medical professionals rely on invasive skin biopsies and histopathological evaluations to diagnose skin diseases and cancers. These diagnostic techniques can result in unnecessary biopsies, scarring, multiple patient visits and increased medical costs for patients, insurers, and the healthcare system.

Improving Time to Diagnosis through Digital Pathology

Another advantage of this virtual technology, the UCLA researchers claim, is that it can provide better images than traditional staining methods, which could improve the ability to diagnose pathological skin conditions and help alleviate human error.

“The majority of the time, small laboratories have a lot of problems with consistency because they don’t use the best equipment to cut, process, and stain tissue,” dermatopathologist Philip Scumpia, MD, PhD, Assistant Professor of Dermatology and Dermatopathology at UCLA Health and one of the authors of the research paper, told MD+DI.

“What ends up happening is we get tissue on a histology slide that’s basically unevenly stained, unevenly put on the microscope, and it gets distorted,” he added, noting that this makes it very hard to make a diagnosis.  

Scumpia also added that this new technology would allow digital images to be sent directly to the pathologist, which could reduce processing and laboratory times.

“With electronic medical records now and the ability to do digital photography and digital mole mapping, where you can obtain a whole-body imaging of patients, you could imagine you can also use one of these reflectance confocal devices. And you can take that image from there, add it to the EMR with the virtual histology stain, which will make the images more useful,” Scumpia said. “So now, you can track lesions as they develop.

“What’s really exciting too, is that there’s the potential to combine it with other artificial intelligence, other machine learning techniques that can give more information,” Scumpia added. “Using the reflectance confocal microscope, a clinician who might not be as familiar in dermatopathology could take images and send [them] to a practitioner who could give a more expert diagnosis.”

Faster Diagnoses but Reduced Revenue for Dermatopathologists, Clinical Labs

Ozcan noted that there’s still a lot of work to be done in the clinical assessment, validation, and blind testing of their AI-based staining method. But he hopes the technology can be propelled into a useful tool for clinicians.

“I think this is a proof-of-concept work, and we’re very excited to make it move forward with further advances in technology, in the ways that we acquire 3D information [and] train our neural networks for better and faster virtual staining output,” he told MD+DI.

Though this new technology may reduce the need for invasive biopsies and expedite the diagnosis of skin conditions and cancers—thus improving patient outcomes—what effect might it have on dermatopathology practices?

More research and clinical studies are needed before this new technology becomes part of the diagnosis and treatment processes for skin conditions. Nevertheless, should virtual histology become popular and viable, it could greatly reduce the number of skin biopsy referrals to pathologists, dermatopathologists, and clinical laboratories, thus diminishing a significant portion of their revenue.

—JP Schlingman

Related Information:

Virtual Histology Eliminates Need for Invasive Skin Biopsies

UCLA Deep-learning Reduces Need for Invasive Biopsies

AI Imaging Method Provides Biopsy-free Skin Diagnosis

Light People: Professor Aydogan Ozcan

Histology Process Bypasses Need for Biopsies, Enables Diagnoses

Reflection-Mode Virtual Histology Using Photoacoustic Remote Sensing Microscopy

Introduction to Reflectance Confocal Microscopy and Its Use in Clinical Practice

Biopsy-free In Vivo Virtual Histology of Skin Using Deep Learning

Can This New Tech Reduce the Need for Skin Biopsies?

Researchers in Five Countries Use AI, Deep Learning to Analyze and Monitor the Quality of Donated Red Blood Cells Stored for Transfusions

By training a computer to analyze blood samples, and then automating the expert assessment process, the AI processed months’ worth of blood samples in a single day

New technologies and techniques for acquiring and transporting biological samples for clinical laboratory testing receive much attention. But what of the quality of the samples themselves? Blood products are expensive, as hospital medical laboratories that manage blood banks know all too well. Thus, any improvement to how labs store blood products and confidently determine their viability for transfusion is useful.

One such improvement is coming out of Canada. Researchers at the University of Alberta (U of A), in collaboration with scientists and academic institutions in five countries, are looking into ways artificial intelligence (AI) and deep learning can be used to efficiently and quickly analyze red blood cells (RBCs). The results of the study may alter the way donated blood is evaluated and selected for transfusion to patients, according to an article in Folio, a U of A publication, titled, “AI Could Lead to Faster, Better Analysis of Donated Blood, Study Shows.”

The study, which uses AI and imaging flow cytometry (IFC) to scrutinize the shape of RBCs, assess the quality of stored blood, and remove human subjectivity from the process, was published in Proceedings of the National Academy of Sciences (PNAS), titled, “Objective Assessment of Stored Blood Quality by Deep Learning.”

Improving Blood Diagnostics through Precision Medicine and Deep Learning

“This project is an excellent example of how we are using our world-class expertise in precision health to contribute to the interdisciplinary work required to make fundamental changes in blood diagnostics,” said Jason Acker, PhD, a senior scientist at Canadian Blood Services’ Centre for Innovation, Professor of Laboratory Medicine and Pathology at the University of Alberta, and one of the lead authors of the study, in the Folio article.

The research took more than three years to complete and involved 19 experts from 12 academic institutions and blood collection facilities located in Canada, Germany, Switzerland, the United Kingdom, and the US.

Jason Acker, PhD

“Our study shows that artificial intelligence gives us better information about the red blood cell morphology, which is the study of how these cells are shaped, much faster than human experts,” said Jason Acker, PhD (above), Senior Research Scientist, Canadian Blood Services, and Professor of Laboratory Medicine and Pathology at the University of Alberta, in an article published on the Canadian Blood Services website. “We anticipate this technology will improve diagnostics for clinicians as well as quality assurance for blood operators such as Canadian Blood Services in the coming years,” he added. Clinical laboratories in the US may also benefit from this new blood viability process. (Photo copyright: University of Alberta.)

To perform the study, the scientists first collected and manually categorized 52,000 red blood cell images. Those images were then used to train an algorithm that mimics the way a human mind works. The computer system was next tasked with analyzing the shape of RBCs for quality purposes. 
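
That general workflow (expert-labeled images used to fit a supervised image classifier) can be sketched in a few lines of PyTorch. The class names, network size, and image dimensions below are placeholders for illustration; this is not the study’s actual model or data pipeline.

```python
import torch
import torch.nn as nn

# Hypothetical shape classes; the study's own category names may differ.
RBC_CLASSES = ["smooth_disc", "crenated_disc", "crenated_sphere", "smooth_sphere"]

# Small CNN mapping a single-channel cell image to class scores.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, len(RBC_CLASSES)),  # assumes 64x64 input crops
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_batch(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One optimization step on a batch of expert-annotated cell images."""
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Random tensors stand in for annotated imaging-flow-cytometry crops.
images = torch.rand(8, 1, 64, 64)
labels = torch.randint(0, len(RBC_CLASSES), (8,))
print(train_batch(images, labels))
```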

Removing Human Bias from RBC Classification

“I was happy to collaborate with a group of people with diverse backgrounds and expertise,” said Tracey Turner, a senior research assistant in Acker’s laboratory and one of the authors of the study, in a Canadian Blood Services (CBS) article. “Annotating and reviewing over 52,000 images took a long time, however, it allowed me to see firsthand how much bias there is in manual classification of cell shape by humans and the benefit machine classification could bring.”

According to the CBS article, a red blood cell lasts about 115 days in the human body and the shape of the RBC reveals its age. Newer, healthier RBCs are shaped like discs with smooth edges. As they age, those edges become jagged and the cell eventually transforms into a sphere and loses the ability to perform its duty of transporting oxygen throughout the body. 

Blood donations are processed, packed, and stored for later use. Once outside the body, the RBCs begin to change their shape and deteriorate. RBCs can only be stored for a maximum of 42 days before they lose the ability to function properly when transfused into a patient. 

Scientists routinely examine the shape of RBCs to assess the quality of the cell units for transfusion to patients and, in some cases, diagnose and assess individuals with certain disorders and diseases. Typically, microscope examinations of red blood cells are performed by experts in medical laboratories to determine the quality of the stored blood. The RBCs are classified by shape and then assigned a morphology index score. This can be a complex, time-consuming, and laborious process.
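
A morphology index of this kind is commonly computed as a weighted average over the shape categories, with healthier, disc-shaped cells weighted more heavily. The weights and category names in the sketch below are illustrative assumptions, not values taken from the PNAS paper.

```python
# Illustrative weights: healthier, disc-shaped cells score higher.
SHAPE_WEIGHTS = {
    "smooth_disc": 1.00,
    "crenated_disc": 0.80,
    "crenated_sphere": 0.40,
    "smooth_sphere": 0.20,
}

def morphology_index(counts: dict) -> float:
    """Weighted average shape score for a counted population of red blood cells."""
    total = sum(counts.values())
    return sum(SHAPE_WEIGHTS[shape] * n for shape, n in counts.items()) / total

# Example: a mostly healthy unit with some storage-related shape change.
counts = {"smooth_disc": 700, "crenated_disc": 200,
          "crenated_sphere": 80, "smooth_sphere": 20}
print(round(morphology_index(counts), 3))  # 0.896
```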

“One of the amazing things about machine learning is that it allows us to see relationships we wouldn’t otherwise be able to see,” Acker said. “We categorize the cells into the buckets we’ve identified, but when we categorize, we take away information.”

Human analysis, apparently, is subjective and different professionals can arrive at different results after examining the same blood samples. 

“Machines are naive of bias, and AI reveals some characteristics we wouldn’t have identified and is able to place red blood cells on a more nuanced spectrum of change in shape,” Acker explained.

The researchers discovered that the AI could accurately analyze and categorize the quality of the red blood cells. This ability to perform RBC morphology assessment could have critical implications for transfusion medicine.

“The computer actually did a better job than we could, and it was able to pick up subtle differences in a way that we can’t as humans,” Acker said.

“It’s not surprising that the red cells don’t just go from one shape to another. This computer showed that there’s actually a gradual progression of shape in samples from blood products, and it’s able to better classify these changes,” he added. “It radically changes the speed at which we can make these assessments of blood product quality.”

More Precision Matching Blood Donors to Recipients

According to the World Health Organization (WHO), approximately 118.5 million blood donations are collected globally each year. There is a considerable contrast in the level of access to blood products between high- and low-income nations, which makes accurate assessment of stored blood even more critical. About 40% of all blood donations are collected in high-income countries that are home to only about 16% of the world’s population.

More studies and clinical trials will be necessary to determine if U of A’s approach to using AI to assess the quality of RBCs can safely transfer to clinical use. But these early results promise much in future precision medicine treatments.

“What this research is leading us to is the fact that we have the ability to be much more precise in how we match blood donors and recipients based on specific characteristics of blood cells,” Acker stated. “Through this study we have developed machine learning tools that are going to help inform how this change in clinical practice evolves.”

The AI tools being developed at the U of A could ultimately benefit patients, as well as blood collection centers and hospitals where clinical laboratories typically manage blood banking services, by making the process of matching transfusion recipients to donors more precise and ultimately safer.

—JP Schlingman

Related Information:

Objective Assessment of Stored Blood Quality by Deep Learning

Machines Rival Expert Analysis of Stored Red Blood Cell Quality

Breakthrough Study Uses AI to Analyze Red Blood Cells

Machine Learning Opens New Frontiers in Red Blood Cell Research

AI Could Lead to Faster, Better Analysis of Donated Blood, Study Shows

Blood Safety and Availability
