The proof-of-concept experiment showed data can be encoded in DNA and retrieved using automated systems, a development that could eventually prove significant for clinical laboratories
It may seem far-fetched, but computer scientists and research groups have worked for years to determine whether it is possible to store data on deoxyribonucleic acid (DNA). Now, Microsoft Research (MR) and the University of Washington (UW) have achieved just that, and the implications of their success could be far-reaching.
Clinical pathologists are increasingly performing genetic DNA sequencing in their medical laboratories to identify biomarkers for disease, help clinicians understand their patients’ risk for a specific disease, and track the progression of a disease. The ability to store data in DNA would take that to another level and could have an impact on diagnostic pathology. Pathologists familiar with DNA sequencing may find a whole new area of medical service open to them.
The MR/UW researchers recently demonstrated a fully automated system that encoded data into DNA and then recovered the information as digital data. “In a simple proof-of-concept test, the team successfully encoded the word ‘hello’ in snippets of fabricated DNA and converted it back to digital data using a fully automated end-to-end system,” Microsoft stated in a news release.
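The news release does not spell out the exact mapping used in the demonstration, but the basic idea of writing bits as bases is easy to sketch. The Python below is a simplified illustration only; the real MR/UW pipeline adds error correction and avoids sequences that are difficult to synthesize and sequence, such as long runs of a single base.

```python
# Minimal sketch: encode/decode bytes as DNA bases, two bits per
# nucleotide. Real systems (including the MR/UW work) add error
# correction and avoid homopolymer runs; this is only the core idea,
# not the researchers' actual scheme.

BITS_TO_BASE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Map each byte to four nucleotides, two bits at a time."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):          # most-significant pair first
            bases.append(BITS_TO_BASE[(byte >> shift) & 0b11])
    return "".join(bases)

def decode(strand: str) -> bytes:
    """Reverse the mapping: four nucleotides back into one byte."""
    data = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BASE_TO_BITS[base]
        data.append(byte)
    return bytes(data)

strand = encode(b"hello")
print(strand)                   # CGGACGCCCGTACGTACGTT
assert decode(strand) == b"hello"
```

In practice, data is also split across many short strands with address indices, since DNA synthesis only produces short sequences reliably; the MR/UW system automates that synthesis, storage, and sequencing round trip end to end.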
DNA’s Potential Storage Capacity and Why We Need It
Thus far, the challenge of using DNA for data storage has been the lack of an easy way to encode and retrieve the information. That, however, appears to be changing rapidly. Several major companies have invested heavily in the research, and consumer offerings are expected to follow.
At Microsoft Research, consumer interest in genetic testing has helped drive the research into using DNA for data storage. “As people get better access to their own DNA, why not also give them the ability to read any kind of data written in DNA?” asked Doug Carmean, an Architect at Microsoft, during an interview with Wired.
Scientists are interested in using DNA for data storage because humanity is creating more data than ever before, and the pace is accelerating. Currently, most archival data is stored on tape, which is inexpensive but has drawbacks: tape degrades and has to be replaced every 10 years or so. DNA, on the other hand, can last for thousands of years.
“DNA won’t degrade over time like cassette tapes and CDs, and it won’t become obsolete,” Yaniv Erlich, PhD, Chief Science Officer at MyHeritage, an online genealogy platform located in Israel, and Associate Professor, Columbia University, told Science.
Tape also takes up an enormous amount of physical space compared to DNA. A single gram of DNA can hold 215 petabytes (215 million gigabytes) of data. Wired puts the storage capacity of DNA into perspective: “Imagine formatting every movie ever made into DNA; it would be smaller than the size of a sugar cube. And it would last for 10,000 years.”
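For perspective, the density figure above supports some simple back-of-envelope arithmetic. The sketch below uses only numbers quoted in this article, including the 160-zettabyte annual projection cited next:

```python
# Back-of-envelope arithmetic from figures quoted in this article:
# ~215 petabytes per gram of DNA, and a projected 160 zettabytes
# of data produced per year by 2025.

PB_PER_GRAM = 215            # petabytes storable in one gram of DNA
ZB_PER_YEAR = 160            # projected annual data production by 2025
PB_PER_ZB = 1_000_000        # 1 zettabyte = 1,000,000 petabytes

grams = ZB_PER_YEAR * PB_PER_ZB / PB_PER_GRAM
print(f"{grams:,.0f} g, about {grams / 1000:,.0f} kg of DNA per year")
# ~744,186 g: less than a metric ton to archive a year of global data
```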
Victor Zhirnov, Chief Scientist at Semiconductor Research Corporation, says the worries over storage space aren’t simply theoretical. “Today’s technology is already close to the physical limits of scaling,” he told Wired, which stated, “Five years ago humans had produced 4.4 zettabytes of data; that’s set to explode to 160 zettabytes (each year!) by 2025. Current infrastructure can handle only a fraction of the coming data deluge, which is expected to consume all the world’s microchip-grade silicon by 2040.”
MIT Technology Review agrees, stating, “Humanity is creating information at an unprecedented rate—some 16 zettabytes every year. And this rate is increasing. Last year, the research group IDC calculated that we’ll be producing over 160 zettabytes every year by 2025.”
Heavy Investment by Major Players
The whole concept may seem like something out of a science fiction story, but the fact that businesses are investing real dollars in it is evidence that DNA data storage will likely become a reality in the near future. Currently, there are a few barriers, but work is underway to overcome them.
First, the cost of synthesizing DNA in a laboratory for the specific purpose of data storage must fall for the solution to become viable. Second, the sequencing process used to read the stored information must also become less expensive. And third, there is the problem of how to efficiently extract the data stored in the DNA.
In a paper published in ASPLOS ‘16, the MR/UW scientists wrote: “Today, neither the performance nor the cost of DNA synthesis and sequencing is viable for data storage purposes. However, they have historically seen exponential improvements. Their cost reductions and throughput improvements have been compared to Moore’s Law in Carlson’s Curves … Important biotechnology applications such as genomics and the development of smart drugs are expected to continue driving these improvements, eventually making data storage a viable application.”
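To see what sustained exponential improvement implies, consider a purely hypothetical illustration; this is not Carlson’s actual cost data, just the compounding logic behind such curves:

```python
# Purely hypothetical illustration of exponential improvement;
# Carlson's actual synthesis/sequencing cost data is not used here.
cost = 1.0                      # relative cost today
for year in range(10):
    cost *= 0.5                 # assume cost halves each year
print(f"relative cost after 10 years: {cost:.5f}")   # 0.00098 (~1/1024)
```

A technology whose cost halves annually becomes roughly a thousand times cheaper in a decade, which is why the researchers expect today’s uneconomical synthesis and sequencing to eventually support data storage.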
Automation appears to be the final piece of the puzzle. Currently,
too much human labor is necessary for DNA to be used efficiently as data
storage.
It may take some time before DNA becomes a viable medium for
data storage. However, savvy pathology laboratory managers should be aware of,
and possibly prepared for, this coming opportunity.
While it’s unlikely the average consumer will see much difference in how they save and retrieve data, medical laboratories may find themselves very much in demand for their expertise in sequencing DNA and interpreting gene sequences.
CDC estimates that 92% of cancers caused by HPV could be eliminated in the US if HPV vaccination recommendations in this country are followed
Medical laboratories in the United States once processed as many as 55 million Pap tests each year. However, the need for cervical cancer screening tests is diminishing, primarily because human papillomavirus (HPV) vaccination can effectively eliminate new cases of cervical cancer. At least, that’s what’s happening in Australia.
When it was introduced in 2007, Australia’s nationwide publicly funded HPV vaccination program included only girls; it was extended to boys in 2013. Today, it is credited with helping slash the country’s cervical cancer rates.
Research published in The
Lancet Public Health (Lancet) predicts cervical cancer could be
eliminated in Australia by 2028 if current vaccination rates and screening
programs continue. Cervical cancer would be classified as effectively
eliminated once there are four or fewer new cases per 100,000 women each year.
These developments will be of interest to pathologists and cytotechnologists in the United States.
“From the beginning, I think the [Australian] government
successfully positioned the advent of HPV vaccination as a wonderful package
that had a beneficial effect for the population,” Karen
Canfell, PhD, Director, Cancer Research Division at Cancer Council New
South Wales, Australia, and Adjunct Professor, University
of Sydney, told the Texas
Tribune. “It was celebrated for that reason, and it was a great public
health success.”
In addition to high vaccination rates, the Lancet
study notes that last year Australia transitioned from cytology-based cervical screening
every two years for women aged 18 to 69 years, to primary HPV testing every
five years for women aged 25 to 69 and exit testing for women aged 70 to 74
years.
“Large-scale clinical trials and detailed modelling suggest
that primary HPV screening is more effective at detecting cervical
abnormalities and preventing cervical cancer than screening with cytology at
shorter intervals,” the Lancet study states.
The incidence of cervical cancer in Australia now stands at
seven cases per 100,000. That’s about half the global average. The country is
on pace to see cervical cancer officially considered a “rare” cancer by 2020,
when rates are projected to drop to fewer than six new cases per 100,000 women.
US Cervical Cancer Rates
In Texas, meanwhile, the state’s failure to embrace HPV
vaccination is being blamed for slowing potential improvements in cervical
cancer rates. In 2007, Texas lawmakers rejected legislation that would have
mandated girls entering sixth grade be vaccinated for HPV. The Texas Tribune
reports that, in the decade that followed, vaccination rates remained stagnant
with only about 40% of Texans between 13 and 17 years old having been vaccinated
for HPV by 2017.
Though Texas has a population of similar size to Australia’s, the state’s low vaccination rates have meant its cervical cancer rates have shown little improvement. Statistics compiled by the federal Centers for Disease Control and Prevention (CDC) show that Texas’ age-adjusted rate of new cervical cancer cases sits at 9.2 per 100,000 women, unchanged since 2006.
Texas has the fifth highest rate of cervical cancer in the
nation, according to the CDC.
Lois Ramondetta,
MD, Professor of Gynecologic Oncology at MD Anderson Cancer Center in Houston,
told the Texas Tribune the state ignored an opportunity that Australia
seized. “[Australia] embraced the vaccine at that time, and our fear kind of
began around then,” Ramondetta said. “Really, vaccination in general has just
gone down the tube since then.”
CDC Study Pushes HPV Vaccination Recommendations in US
Texas is not the only state failing to capitalize on the HPV vaccine’s cancer-preventing promise. In a news release announcing a recent study, the CDC stated that 92% of cancers caused by HPV could be eliminated if HPV vaccine recommendations were followed. CDC published the study in its Morbidity and Mortality Weekly Report.
HPV is a common virus that is linked not only to cervical cancer but also to cancers of the penis, head, and neck, as well as conditions like genital warts. Though the CDC recommends children get the two-dose vaccine at ages 11-12, the study findings indicate that only 51% of teens ages 11 to 17 have received the recommended doses of HPV vaccine, up two percentage points from 2017 to 2018.
“A future without HPV cancers is within reach, but urgent action is needed to improve vaccine coverage rates,” Brett Giroir, MD, Assistant Secretary for Health, US Department of Health and Human Services (HHS), stated in the CDC news release. “Increasing HPV vaccination coverage to 80% has been and will continue to be a priority initiative for HHS, and we will continue to work with our governmental and private sector partners to make this a reality.”
Can Australia Eliminate Cervical Cancer?
University of Queensland Professor Ian Frazer, MD, who co-authored the Lancet Public Health study, believes Australia is on the verge not only of eliminating cervical cancer, but also of eradicating HPV itself.
“Because this human papillomavirus only infects humans, and
the vaccine program prevents the spread of the virus, eventually we’ll get rid
of it, like we did with smallpox,” Frazer told The
Age.
“It’s not going to happen in my lifetime,” he added. “But it
could happen in the lifetime of my kids if they go about it the right way.”
If Australia’s combination of high HPV vaccination rates and its new HPV screening program succeeds in effectively eliminating cervical cancer, clinical laboratories in this country should expect stepped-up efforts to increase HPV vaccination rates in the United States. A renewed focus on reducing, and ultimately eliminating, cervical cancer could lead to fewer or less-frequently performed Pap tests as part of cervical cancer screening protocols.
Genetic data captured by this new technology could lead to a new understanding of how different types of cells exchange information and would be a boon to anatomic pathology research worldwide
What if it were possible to map the interior of cells and view their genetic sequences using chemicals instead of light? Might that spark an entirely new way of studying human physiology? That’s what researchers at the Massachusetts Institute of Technology (MIT) believe. They have developed a new approach to visualizing cells and tissues that could enable the development of entirely new anatomic pathology tests that target a broad range of cancers and diseases.
Scientists at MIT’s Broad Institute and McGovern Institute for Brain Research developed this new technique, which they call DNA Microscopy. They published their findings in Cell, titled, “DNA Microscopy: Optics-free Spatio-genetic Imaging by a Stand-Alone Chemical Reaction.”
Joshua Weinstein, PhD, a postdoctoral associate at the Broad Institute and first author of the study, said in a news release that DNA microscopy “is an entirely new way of visualizing cells that captures both spatial and genetic information simultaneously from a single specimen. It will allow us to see how genetically unique cells—those comprising the immune system, cancer, or the gut for instance—interact with one another and give rise to complex multicellular life.”
The news release goes on to state that the new technology “shows
how biomolecules such as DNA and RNA are organized in cells and tissues,
revealing spatial and molecular information that is not easily accessible
through other microscopy methods. DNA microscopy also does not require
specialized equipment, enabling large numbers of samples to be processed
simultaneously.”
New Way to Visualize Cells
The MIT researchers saw an opportunity for DNA microscopy to capture genomic-level cell information. They claim that DNA microscopy images cells from the inside, enabling the capture of more data than traditional light microscopy allows. Their new technique is a chemically encoded approach to mapping cells that derives critical genetic insights from the organization of the DNA and RNA in cells and tissue.
And that type of genetic information could lead to new precision medicine treatments for chronic disease. New Atlas notes that “Speeding the development of immunotherapy treatments by identifying the immune cells best suited to target a particular cancer cell is but one of the many potential applications for DNA microscopy.”
In their published study, the scientists note that “Despite enormous progress in molecular profiling of cellular constituents, spatially mapping [cells] remains a disjointed and specialized machinery-intensive process, relying on either light microscopy or direct physical registration. Here, we demonstrate DNA microscopy, a distinct imaging modality for scalable, optics-free mapping of relative biomolecule positions.”
How DNA Microscopy Works
The New York Times (NYT) notes that the advantage of DNA microscopy is “that it combines spatial details with scientists’ growing interest in—and ability to measure—precise genomic sequences, much as Google Street View integrates restaurant names and reviews into outlines of city blocks.”
And Singularity Hub notes that “DNA microscopy uses only a pipette and some liquid reagents. Rather than monitoring photons, here the team relies on ‘bar codes’ that chemically tag onto biomolecules. Like cell phone towers, the tags amplify, broadcasting their signals outward. An algorithm can then piece together the captured location data and transform those GPS-like digits into rainbow-colored photos. The results are absolutely breathtaking. Cells shine like stars in a nebula, each pseudo-colored according to their genomic profiles.”
“We’ve used DNA in a way that’s mathematically similar to photons in light microscopy,” Weinstein said in the Broad Institute news release. “This allows us to visualize biology as cells see it and not as the human eye does.”
In their study, researchers used DNA microscopy to tag RNA molecules and map locations of individual human cancer cells. Their method is “surprisingly simple,” New Atlas reported. Here’s how it’s done, according to the MIT news release (a minimal sketch of the reconstruction step follows this list):
Small synthetic DNA tags (dubbed “barcodes” by the MIT team) are added to biological samples;
The “tags” latch onto molecules of genetic material in the cells;
The tags are then replicated through a chemical reaction;
The tags combine and create more unique DNA labels;
The scientists use a DNA sequencer to read the resulting tag combinations;
A computer algorithm decodes the data and converts it to images displaying the biomolecules’ positions within the cells.
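For intuition about that final reconstruction step, the Python sketch below is a loose analogy only, not the algorithm published in Cell. It simulates molecules at hidden positions whose pairwise “interaction” counts decay with distance, the way diffusing barcode products would, then recovers the relative layout with classical multidimensional scaling (using numpy and scikit-learn):

```python
# Loose analogy to DNA microscopy's reconstruction step, not the
# published algorithm: infer relative positions from pairwise signals.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(seed=0)
true_xy = rng.uniform(0, 10, size=(30, 2))   # hidden molecule positions

# Interaction counts fall off with distance, as diffusing DNA
# "barcode" products would between nearby molecules.
dist = np.linalg.norm(true_xy[:, None, :] - true_xy[None, :, :], axis=-1)
counts = np.exp(-dist)

# Invert the decay model to estimate distances, then embed in 2D.
est_dist = -np.log(np.clip(counts, 1e-12, None))
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
recovered_xy = mds.fit_transform(est_dist)   # correct up to rotation/flip
print(recovered_xy.shape)                    # (30, 2)
```

As the comment notes, the recovered map is only defined up to rotation and reflection, which is why the technique yields relative, not absolute, biomolecule positions.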
“The first time I saw a DNA microscopy image, it blew me away,” said Aviv Regev, PhD, a biologist at the Broad Institute, a Howard Hughes Medical Institute (HHMI) Investigator, and co-author of the MIT study, in an HHMI news release. “It’s an entirely new category of microscopy. It’s not just a technique; it’s a way of doing things that we haven’t ever considered doing before.”
Precision Medicine Potential
“Every cell has a unique make-up of DNA letters or genotype. By capturing information directly from the molecules being studied, DNA microscopy opens up a new way of connecting genotype to phenotype,” said Feng Zhang, PhD, MIT Neuroscience Professor,
Core Institute Member of the Broad Institute, and
Investigator at the McGovern Institute for Brain Research at MIT, in the HHMI
news release.
In other words, DNA microscopy could someday have applications in precision medicine. The MIT researchers, according to Stat, plan to expand the technology further to include immune cells that target cancer.
The Broad Institute has applied for a patent on DNA
microscopy. Clinical laboratory and anatomic pathology group leaders seeking
novel resources for diagnosis and treatment of cancer may want to follow the MIT
scientists’ progress.
First used to track cryptocurrencies such as Bitcoin, blockchain is finding its way into tracking and quality control systems in healthcare, including clinical laboratories and big pharma
Four companies were selected by the US Food and Drug Administration (FDA) to participate in a pilot program that will utilize blockchain technology to create a real-time monitoring network for pharmaceutical products. The companies selected by the FDA include: IBM (NYSE:IBM), Merck (NYSE:MRK), Walmart (NYSE:WMT), and KPMG, an international accounting firm. Each company will bring its own distinct expertise to the venture.
This important project to utilize blockchain technologies in
the pharmaceutical distribution chain is another example of prominent
healthcare organizations looking to benefit from blockchain technology.
Clinical laboratories and health insurers also are collaborating on blockchain projects. A recent intelligence briefing from The Dark Report, the sister publication of Dark Daily, describes collaborations between multiple health insurers and Quest Diagnostics to improve their provider directories using blockchain. (See, “Four Insurers, Quest Developing Blockchain,” July 1, 2019.)
Improving Traceability and Security in Healthcare
Blockchain continues to intrigue federal officials, health network administrators, and health information technology (HIT) developers looking for ways to accurately and efficiently track inventory, improve information access and retrieval, and increase the accuracy of collected and stored patient data.
In the FDA’s February press release announcing the pilot program, Scott Gottlieb, MD, who resigned as the FDA’s Commissioner in April, stated, “We’re invested in exploring new ways to improve traceability, in some cases using the same technologies that can enhance drug supply chain security, like the use of blockchain.”
Congress created this latest program, which is part of the federal US Drug Supply Chain Security Act (DSCSA) enacted in 2013, to identify and track certain prescription medications as they are disseminated nationwide. However, once fully tested, similar blockchain systems, essentially tamper-evident, hash-chained ledgers (see the sketch after the list below), could be employed in all aspects of healthcare, including clinical laboratories, where critical supplies, fragile specimens, timing, and quality control are all in play.
The FDA hopes the electronic framework being tested during
the pilot will help protect consumers from counterfeit, stolen, contaminated, or
harmful drugs, as well as:
reduce the time needed to track and trace product inventory;
enable timely retrieval of accurate distribution information;
increase the accuracy of data shared among the network members; and
help maintain the integrity of products in the distribution chain, including ensuring products are stored at the correct temperature.
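The property that makes blockchain attractive for this kind of tracking, an append-only record in which tampering with any entry invalidates everything after it, can be illustrated in a few lines of Python. This generic hash-chain sketch is not the FDA pilot’s IBM-based implementation:

```python
# Generic sketch of the hash-chaining idea behind blockchain; the FDA
# pilot runs on IBM's platform, and this is not that implementation.
import hashlib, json, time

def make_block(record: dict, prev_hash: str) -> dict:
    """Bundle a shipment record with the previous block's hash."""
    block = {"record": record, "prev_hash": prev_hash, "ts": time.time()}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

# Each hand-off in the supply chain appends a block that commits to
# the entire history before it.
chain = [make_block({"event": "manufactured", "lot": "A123"}, "0" * 64)]
chain.append(make_block({"event": "shipped", "temp_ok": True}, chain[-1]["hash"]))
chain.append(make_block({"event": "dispensed"}, chain[-1]["hash"]))

def verify(chain: list) -> bool:
    """Recompute every hash; any edited record breaks the links."""
    prev = "0" * 64
    for b in chain:
        body = {k: v for k, v in b.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if b["prev_hash"] != prev or b["hash"] != hashlib.sha256(payload).hexdigest():
            return False
        prev = b["hash"]
    return True

print(verify(chain))        # True
chain[1]["record"]["temp_ok"] = False
print(verify(chain))        # False: tampering is detectable
```

Because each block’s hash covers the previous block’s hash, altering a shipment record anywhere in the history is immediately detectable by every participant, which is the “immutable record” the pilot’s participants describe below.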
Companies in the FDA’s Blockchain Pilot
IBM, a leading blockchain provider, will serve as the
technology partner on the project. The tech giant has implemented and provided
blockchain applications to clients for years. Its cloud-based platform provides
customers with end-to-end capabilities that enable them to develop, maintain,
and secure their networks.
“Blockchain could provide an important new approach to further improving trust in the biopharmaceutical supply chain,” said Mark Treshock, Global Blockchain Solutions Leader for Healthcare and Life Sciences at IBM, in a news release. “We believe this is an ideal use for the technology because it can not only provide an audit trail that tracks drugs within the supply chain; it can track who has shared data and with whom, without revealing the data itself. Blockchain has the potential to transform how pharmaceutical data is controlled, managed, shared and acted upon throughout the lifetime history of a drug.”
Merck, known as MSD outside of the US and Canada, is
a global pharmaceutical company that researches and develops medications and
vaccines for both human and animal diseases. Merck delivers health solutions to
customers in more than 140 countries across the globe.
“Our supply chain strategy, planning and logistics are built around the customers and patients we serve,” said Craig Kennedy, Senior Vice President, Global Supply Chain Management at Merck, in the IBM news release. “Reliable and verifiable supply helps improve confidence among all the stakeholders—especially patients—while also strengthening the foundation of our business.”
Kennedy added that transparency is one of Merck’s primary
goals in participating in this blockchain project. “If you evaluate today’s
pharmaceutical supply chain system in the US, it’s really a series of handoffs
that are opaque to each other and owned by an individual party,” he said,
adding, “There is no transparency that provides end-to-end capabilities. This
hampers the ability for tracking and tracing within the supply chain.”
Walmart, the world’s largest company by revenue, will be distributing drugs through its pharmacies and care clinics for the project. Walmart has successfully experimented with using blockchain technology for other products and hopes this new collaboration will benefit its customers, as well.
“With successful blockchain pilots in pork, mangoes, and leafy greens that provide enhanced traceability, we are looking forward to the same success and transparency in the biopharmaceutical supply chain,” said Karim Bennis, Vice President of Strategic Planning of Health and Wellness at Walmart, in the IBM news release. “We believe we have to go further than offering great products that help our customers live better at everyday low prices. Our customers also need to know they can trust us to help ensure products are safe. This pilot, and US Drug Supply Chain Security Act requirements, will help us do just that.”
KPMG, a multinational professional services network based in the Netherlands, will provide expertise on compliance issues to the venture.
“Blockchain’s innate ability within a private, permissioned
network to provide an ‘immutable record’ makes it a logical tool to deploy to
help address DSCSA compliance requirements,” said Arun Ghosh, US Blockchain
Leader at KPMG, in the IBM news release. “The ability to leverage existing
cloud infrastructure is making enterprise blockchain increasingly affordable
and adaptable, helping drug manufacturers, distributors, and dispensers meet
their patient safety and supply chain integrity goals.”
The FDA’s blockchain project is scheduled to be completed in
the fourth quarter of 2019, with the end results being published in a DSCSA
report. The participating organizations will evaluate the need for and plan any
future steps at that time.
Blockchain is a new and relatively untested technology
within the healthcare industry. However, projects like those supported by the
FDA may bring this technology to the forefront for healthcare organizations,
including clinical laboratories and pathology groups. Once proven, blockchain
technology could have significant benefits for patient data accuracy and
security.
Scientists worldwide engaged in research to develop a biomarker for dementia are predicting success, though some say additional research will be needed
Could a blood test for Alzheimer’s disease soon be on clinical laboratory test menus nationwide? Perhaps so. A recent Associated Press (AP) article that was picked up by NBC News and other healthcare publications reported that experimental test results presented during the Alzheimer’s Association International Conference (AAIC) in July suggest the Holy Grail of dementia tests—one where the specimen can be collected in a doctor’s office during a routine screening exam—may be close at hand.
The AP story noted that “half a dozen research groups gave new results on various experimental tests, including one that seems 88% accurate at indicating Alzheimer’s risk.” And Richard Hodes, MD, Director of the National Institute on Aging, told AP, “In the past year, we’ve seen a dramatic acceleration in progress [on Alzheimer’s tests]. This has happened at a pace that is far faster than any of us would have expected.”
This could be a boon for medical laboratories seeking ways to contribute more value to patient care, especially given that Alzheimer’s disease accounts for as many as 70% of all dementia cases.
Plasma Biomarker for Predicting Alzheimer’s
One of the experimental blood tests presented at the AAIC involved a 2018 study into “the potential clinical utility of plasma biomarkers in predicting brain amyloid-β burden at an individual level. These plasma biomarkers also have cost-benefit and scalability advantages over current techniques, potentially enabling broader clinical access and efficient population screening,” the researchers stated in an article they published in Nature.
AP also reported that Japanese scientists at the AAIC presented results of a validation test conducted on 201 people who had either Alzheimer’s, other types of dementia, or few or no symptoms. They found that the test “correctly identified 92% of people who had Alzheimer’s and correctly ruled out 85% who did not have it, for an overall accuracy of 88%.”
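Those figures are the standard screening-test metrics of sensitivity (92%), specificity (85%), and overall accuracy (88%). The sketch below shows how the three relate, using hypothetical confusion-matrix counts chosen only to approximate the reported numbers; the study’s actual per-group breakdown is not reproduced here:

```python
# Hypothetical confusion-matrix counts chosen only to approximate the
# reported 92% sensitivity, 85% specificity, 88% overall accuracy;
# the study's actual per-group counts are not reproduced here.
tp, fn = 92, 8      # of 100 hypothetical Alzheimer's cases
tn, fp = 85, 15     # of 100 hypothetical controls

sensitivity = tp / (tp + fn)                 # correctly identified cases
specificity = tn / (tn + fp)                 # correctly ruled-out controls
accuracy = (tp + tn) / (tp + fn + tn + fp)   # overall agreement

print(f"sensitivity={sensitivity:.0%}, specificity={specificity:.0%}, "
      f"accuracy={accuracy:.1%}")            # 92%, 85%, 88.5%
```

Note that overall accuracy depends on how many cases versus controls are in the cohort, which is why sensitivity and specificity are usually reported separately.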
Akinori Nakamura, MD, PhD, of the National Center for
Geriatrics and Gerontology in Obu, Japan, was a member of the research team and
first author of the research paper. He told the AP that the test results “closely
matched those from the top tests used now—three types of brain scans and a
mental assessment exam.”
Koichi Tanaka is a Japanese engineer who won the Nobel Prize in Chemistry. He heads the Koichi Tanaka Research Lab at Shimadzu Corp. (OTCMKTS:SHMZF) in Kyoto, Japan, and was on the team that developed the amyloid-beta biomarker test presented at AAIC. He told Bloomberg, “Our finding overturned the common belief that it wouldn’t be possible to estimate amyloid accumulation in the brain from blood. We’re now being chased by others, and the competition is intensifying.”
But Tanaka cautions that the test needs further study before
it is ready for clinical use, and that for now “it belongs in the hands of drug
developers and research laboratories,” Bloomberg reported.
Other Studies into Developing an Alzheimer’s Biomarker
Alzheimer’s is usually diagnosed after symptoms such as memory loss appear. To arrive at their diagnoses, doctors often rely on medical history, brain imaging (MRI, CT, PET), and measurement of amyloid in spinal fluid.
An article published on Alzforum, a website and news service dedicated to research on and treatment of Alzheimer’s and related disorders, noted a study by King’s College London researchers who, using mass spectrometry, “found a panel of biomarkers that predicted with almost 90% accuracy whether cognitively normal people had a positive amyloid scan.”
Nicholas Ashton, PhD, neuroscientist and Wallenberg Postdoctoral Fellow at University of Gothenburg in Sweden, and first author of the King’s College study, explained that “Amyloid-burden and neurofilament light polypeptide (NFL) peptides were important in predicting Alzheimer’s, but alone they weren’t as predictable as when we combined them with novel proteins related to amyloid PET.”
The researchers published their study earlier this year in Science Advances. “Using an unbiased mass spectrometry approach, we have found and replicated with high accuracy, specificity, and sensitivity a plasma protein classifier reflecting amyloid-beta burden in a cognitively unimpaired cohort,” the researchers wrote.
Meanwhile, researchers at Washington University School of Medicine in St. Louis, along with the German Center for Neurodegenerative Diseases, a member of the Helmholtz Association, stated in a news release that a blood test they developed works by detecting leaks of NFL into the bloodstream before the onset of symptoms. When the protein is found in cerebrospinal fluid, it could be a sign that Alzheimer’s may develop, as well as point to other neurodegenerative conditions such as multiple sclerosis, brain injury, or stroke, the researchers stated.
“This is something that would be easy to incorporate into a screening test in a neurology clinic,” Brian Gordon, PhD, Assistant Professor of Radiology at Washington University’s Mallinckrodt Institute of Radiology, and an author of the study, stated in the news release.
These parallel studies into screening for Alzheimer’s by
researchers worldwide are intriguing. The favorable results suggest that
someday there may be a screen for Alzheimer’s using a clinical laboratory blood
test.
With Alzheimer’s affecting nearly six million Americans of all ages, such an assay would enable clinical laboratories to help many people.
Though the field of oncology has some AI-driven tools, overall, physicians report the reality isn’t living up to the hype
Artificial intelligence (AI) has been heavily touted as the next big thing in healthcare for nearly a decade. Much ink has been devoted to the belief that AI would revolutionize how doctors treat patients. That it would bring about a new age of point-of-care clinical decision support tools and clinical laboratory diagnostic tests. And it would enable remote telemedicine to render distance between provider and patient inconsequential.
But nearly 10 years after IBM’s Watson defeated two human contestants on the game show Jeopardy, some experts believe AI has under-delivered on the promise of a brave new world in medicine, noted IEEE Spectrum, a website and magazine dedicated to applied sciences and engineering.
In the years since Watson’s victory on Jeopardy, IBM (NYSE:IBM) has announced
almost 50 partnerships, collaborations, and projects intended to develop
AI-enabled tools for medical purposes. Most of these projects did not bear
fruit.
However, IBM’s most publicized medical partnerships revolved around the field of oncology and the expectation that Watson could analyze data and patients’ records to help oncologists devise personalized and effective cancer treatment plans. Success in helping physicians more accurately diagnose different types of cancer would require anatomic pathologists to understand this new role for Watson and how the pathology profession should respond to it, strategically and tactically.
But Watson and other AI systems often struggled to
understand the finer points of medical text. “The information that physicians
extract from an article, that they use to change their care, may not be the
major point of the study,” Mark
Kris, MD, Medical Oncologist at Memorial
Sloan Kettering Cancer Center, told IEEE Spectrum. “Watson’s
thinking is based on statistics, so all it can do is gather statistics about
main outcomes. But doctors don’t work that way.”
Ultimately, IEEE Spectrum reported, “even today’s
best AI struggles to make sense of complex medical information.”
“Reputationally, I think they’re in some trouble,” Robert Wachter, MD, Professor and Chair, Department of Medicine, University of California, San Francisco, told IEEE Spectrum. “They came in with marketing first, product second, and got everybody excited. Then the rubber hit the road. This is an incredibly hard set of problems, and IBM, by being first out, has demonstrated that for everyone else.”
Overpromising and Underdelivering
In 2016, MD Anderson Cancer Center canceled a project with IBM Watson after spending $62 million on it, Becker’s Hospital Review reported. That project was supposed to use natural language processing (NLP) to develop personalized treatment plans for cancer patients by comparing databases of treatment options with patients’ electronic health records.
“We’re doing incredibly better with NLP than we were five
years ago, yet we’re still incredibly worse than humans,” Yoshua Bengio, PhD,
Professor of Computer Science at the University
of Montreal, told IEEE Spectrum.
The researchers hoped that Watson would be able to examine
variables in patient records and keep current on new information by scanning
and interpreting articles about new discoveries and clinical trials. But Watson
was unable to interpret the data as humans can.
IEEE Spectrum reported that “The realization that
Watson couldn’t independently extract insights from breaking news in the
medical literature was just the first strike. Researchers also found that it
couldn’t mine information from patients’ electronic health records as they’d
expected.”
Researchers Lack Confidence in Watson’s Results
In 2018, the team at MD Anderson published a paper in The
Oncologist outlining their experiences with Watson and cancer
care. They found that their Watson-powered tool, called Oncology
Expert Advisor, had “variable success in extracting information from
text documents in medical records. It had accuracy scores ranging from 90% to
96% when dealing with clear concepts like diagnosis, but scores of only 63% to
65% for time-dependent information like therapy timelines.”
A team of researchers at the University of Nebraska Medical Center (UNMC) has experimented with Watson for genomic analytics in breast cancer patients. After treating the patients, the scientists identify mutations using their own tools, then enter that data into Watson, which can quickly pick out mutations that have drug treatments available.
“But the unknown thing here is how good are the results,” Babu Guda, PhD, Professor and Chief Bioinformatics and Research Computing Officer at UNMC, told Gizmodo. “There is no way to validate what we’re getting from IBM is accurate unless we test the real patients in an experiment.”
Guda added that IBM needs to publish the results of studies
and tests performed on thousands of patients if they want scientists to have
confidence in Watson tools.
“Otherwise it’s very difficult for researchers,” he said.
“Without publications, we can’t trust anything.”
Computer Technology Evolving Faster than AI Can Utilize It
Watson’s inability to produce results for medical uses may be exacerbated by the fact that the cognitive computing technologies that were cutting edge back in 2011 are no longer the state of the art today.
IEEE Spectrum noted that professionals in both
computer science and medicine believe that AI has massive potential for
improving and enhancing the field of medicine. To date, however, most of AI’s
successes have occurred in controlled experiments with only a few AI-based
medical tools being approved by regulators. IBM’s Watson has only had a few
successful ventures and more research and testing is needed for Watson to prove
its value to medical professionals.
“As a tool, Watson has extraordinary potential,” Kris told IEEE
Spectrum. “I do hope that the people who have the brainpower and computer
power stick with it. It’s a long haul, but it’s worth it.”
Meanwhile, the team at IBM Watson Health continues to forge ahead. In February 2019, Healthcare IT News interviewed Kyu Rhee, MD, Vice President and Chief Health Officer at IBM Corp. and IBM Watson Health. He outlined the directions IBM Watson Health would emphasize at the upcoming annual meeting of the Healthcare Information and Management Systems Society (HIMSS).
IBM Watson Health is “using our presence at HIMSS19 this
year to formally unveil the work we’ve been doing over the past year to
integrate AI technology and smart, user-friendly analytics into the provider
workflow, with a particular focus on real-world solutions for providers to start
tackling these types of challenges head-on,” stated Rhee. “We will tackle these
challenges by focusing our offerings in three core areas. First, is management
decision support. These are the back-office capabilities that improve
operational decisions.”
Clinical laboratory leaders and anatomic pathologists may or
may not agree about how Watson is able to support clinical care initiatives.
But it’s important to note that, though AI’s progress toward its predicted
potential has been slow, it continues nonetheless and is worth watching.