News, Analysis, Trends, Management Innovations for
Clinical Laboratories and Pathology Groups

Hosted by Robert Michel


Wiley Launches Paper Mill Detection Tool after Losing Millions Due to Fraudulent Journal Submissions

Groups representing academic publishers are taking steps to combat paper mills that write the papers and then sell authorship spots

Clinical laboratory professionals rely on peer-reviewed research to keep up with the latest findings in pathology, laboratory medicine, and other medical fields. They should thus be interested in new efforts to combat the presence of “research paper mills,” defined as “profit oriented, unofficial, and potentially illegal organizations that produce and sell fraudulent manuscripts that seem to resemble genuine research,” according to the Committee on Publication Ethics (COPE), a non-profit organization representing stakeholders in academic publishing.

“They may also handle the administration of submitting the article to journals for review and sell authorship to researchers once the article is accepted for publication,” the COPE website states.

In a recent example of how paper mills impact scholarly research, multinational publishing company John Wiley and Sons (Wiley) announced in The Scholarly Kitchen last year that it had retracted more than 1,700 papers published in journals from the company’s Hindawi subsidiary, which specializes in open-access academic publishing.

“Often journals will invite contributions to a special issue on a specific topic and this provides an opening for paper mills to submit often many publications to the same issue,” explained a June 2022 research report from COPE and the International Association of Scientific, Technical and Medical Publishers (STM).

“In Hindawi’s case, this is a direct result of sophisticated paper mill activity,” wrote Jay Flynn, Wiley’s Executive Vice President and General Manager, Research, in a Scholarly Kitchen guest post. “The extent to which our processes and systems were breached required an end-to-end review of every step in the peer review and publishing process.”

In addition, journal indexer Clarivate removed 19 Hindawi journals from its Web of Science list in March 2023, due to problems with their editorial quality, Retraction Watch reported.

Hindawi later shut down four of the journals, which had been “heavily compromised by paper mills,” according to a blog post from the publisher.

Wiley also announced at that time that it would temporarily pause Hindawi’s special issues publishing program due to compromised articles, according to a press release.

“We urgently need a collaborative, forward-looking and thoughtful approach to journal security to stop bad actors from further abusing the industry’s systems, journals, and the communities we serve,” wrote Jay Flynn (above), Wiley EVP and General Manager, Research and Learning, in an article he penned for The Scholarly Kitchen. “We’re committed to addressing the challenge presented by paper mills and academic fraud head on, and we invite our publishing peers, and the many organizations that work alongside us, to join us in this endeavor.” Clinical laboratory leaders understand the critical need for accurate medical research papers. (Photo copyright: The Scholarly Kitchen.)

Using AI to Detect Paper Mill Submissions

Wiley acquired Hindawi in 2021 in a deal valued at $298 million, according to a press release, but the subsidiary has since become a financial drain for the company.

The journals earn their revenue by charging fees to authors. But in fiscal year 2024, which began last fall, “Wiley expects $35-40 million in lost revenue from Hindawi as it works to turn around journals with issues and retract articles,” Retraction Watch reported, citing an earnings call.

Wiley also revealed that it would stop using the Hindawi brand name and bring the subsidiary’s remaining journals under its own umbrella by the middle of 2024.

To combat the problem, Wiley announced it would launch an artificial intelligence (AI)-based service called Papermill Detection in partnership with Sage Publishing and the Institute of Electrical and Electronics Engineers (IEEE).

The service will incorporate tools to detect signs that submissions originated from paper mills, including similarities with “known papermill hallmarks” and use of “tortured phrases” indicating that passages were translated by AI-based language models, according to a press release.

These tools include:

  • Papermill Similarity Detection: Checks for known papermill hallmarks and compares content against existing papermill papers.
  • Problematic Phrase Recognition: Flags unusual alternatives to established terms (a simple illustrative sketch follows this list).
  • Unusual Publication Behavior Detection: Identifies irregular publishing patterns by paper authors.
  • Researcher Identity Verification: Helps detect potential bad actors.
  • Gen-AI Generated Content Detection: Identifies potential misuse of generative AI.
  • Journal Scope Checker: Analyzes the article’s relevance to the journal.
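
To make the “Problematic Phrase Recognition” idea concrete, here is a minimal, hypothetical sketch of how a tortured-phrase check might work: it scans a manuscript for documented odd substitutions of standard technical terms (for example, “profound learning” in place of “deep learning”). The phrase table, function name, and output format below are illustrative assumptions only and do not reflect Wiley’s actual Papermill Detection implementation.

```python
import re

# Small illustrative sample of "tortured phrases": odd synonym swaps for
# established technical terms that often signal automated paraphrasing.
# This table is hypothetical and not Wiley's actual detection list.
TORTURED_PHRASES = {
    "counterfeit consciousness": "artificial intelligence",
    "profound learning": "deep learning",
    "bosom peril": "breast cancer",
    "colossal information": "big data",
}

def flag_problematic_phrases(text: str) -> list[dict]:
    """Return any known tortured phrases found in a manuscript's text."""
    findings = []
    lowered = text.lower()
    for phrase, expected_term in TORTURED_PHRASES.items():
        for match in re.finditer(re.escape(phrase), lowered):
            findings.append({
                "phrase": phrase,
                "expected_term": expected_term,
                "offset": match.start(),
            })
    return findings

if __name__ == "__main__":
    sample = ("We apply profound learning and counterfeit consciousness "
              "to classify bosom peril subtypes.")
    for hit in flag_problematic_phrases(sample):
        print(f"Flagged '{hit['phrase']}' (likely '{hit['expected_term']}') "
              f"at offset {hit['offset']}")
```

In practice, a phrase check like this would be only one signal among many; production screening combines it with document similarity, metadata analysis, and human editorial review.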

The company said that the new service will be available through Research Exchange, Wiley’s manuscript submission platform, as early as next year.

Other Efforts to Spot Paper Mill Submissions

Previously, STM announced the launch of the STM Integrity Hub, with a mission “to equip the scholarly communication community with data, intelligence, and technology to protect research integrity,” Program Director Joris van Rossum, PhD, told The Scholarly Kitchen.

In 2023, the group announced that the hub would integrate Papermill Alarm from Clear Skies, a paper mill detection tool launched in 2022 with a focus on cancer research. It uses a “traffic-light rating system for research papers,” according to a press release.
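
As a rough illustration of how a traffic-light rating could be surfaced to editors, the short sketch below maps a numeric paper-mill risk score produced by some upstream model to a green/amber/red label. The thresholds, labels, and function name are invented for illustration and are not Clear Skies’ actual scoring logic.

```python
# Hypothetical traffic-light rating: convert a paper-mill risk score
# (0.0-1.0) from an upstream model into a reviewer-facing label.
# Thresholds are illustrative assumptions, not Clear Skies' values.
def traffic_light_rating(risk_score: float) -> str:
    if not 0.0 <= risk_score <= 1.0:
        raise ValueError("risk_score must be between 0.0 and 1.0")
    if risk_score < 0.2:
        return "green"   # no paper-mill indicators detected
    if risk_score < 0.6:
        return "amber"   # some indicators; editor should take a closer look
    return "red"         # strong indicators; escalate to an integrity team

print(traffic_light_rating(0.05))  # green
print(traffic_light_rating(0.75))  # red
```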

In its coverage of the launch of Wiley’s Papermill Detection service, Retraction Watch suggested that one key to addressing the problem would be to reduce incentives for authors to use paper mills. Those incentives boil down to the pressure placed on many scientists, clinicians, and students to publish manuscripts, according to the research report from STM and COPE.

In one common scenario, the report noted, a paper mill will submit a staff-written paper to multiple journals. If the paper is accepted, the company will list it on a website and offer authorship spaces for sale.

“If a published paper is challenged, the ‘author’ may sometimes back down and ask for the paper to be retracted because of data problems, or they may try to provide additional supporting information including a supporting letter from their institution which is also a fake,” the report noted.

All of this serves as a warning to pathologists and clinical laboratory professionals to carefully evaluate the journals that publish studies in the areas of healthcare and laboratory medicine research that interest them.

—Stephen Beale

Related Information:

Potential “Paper Mills” and What to Do about Them: A Publisher’s Perspective

Up to One in Seven Submissions to Hundreds of Wiley Journals Flagged by New Paper Mill Tool

Guest Post: Addressing Paper Mills and a Way Forward for Journal Security

Paper Mills Research Report from COPE and STM

Wiley Paused Hindawi Special Issues amid Quality Problems, Lost $9 Million in Revenue

‘The Situation Has Become Appalling’: Fake Scientific Papers Push Research Credibility to Crisis Point

Publisher Retracts More than a Dozen Papers at Once for Likely Paper Mill Activity

STM Integrity Hub Incorporates Clear Skies’ Papermill Alarm Screening Tool

The New STM Integrity Hub

Upholding Research Integrity in the Age of AI

Large Dutch Survey Shines Light on Fraud and Questionable Research Practices in Medical Studies Published in Scientific Journals

About half of nearly 7,000 respondents admitted to sloppy practices, which suggests that pathologists and clinical lab professionals may want to be skeptical about the findings of many papers published in medical journals

It may surprise pathologists and medical laboratory professionals to learn that more than 10% of surveyed medical and life-science researchers admitted to falsifying or fabricating data at least once! This was one finding of a study conducted to assess the quality and accuracy of the scientific papers published in journals.

The research was conducted as the National Survey on Research Integrity (NSRI), a nationwide survey of academic researchers in the Netherlands.

In its coverage of the NSRI’s findings, Nature wrote, “Between October and December 2020, study authors contacted nearly 64,000 researchers at 22 universities in the Netherlands, 6,813 of whom completed the survey.”

According to Nature, “An estimated 8% of scientists who participated in an anonymous survey of research practices at Dutch universities confessed to falsifying and/or fabricating data at least once between 2017 and 2020. More than 10% of medical and life-science researchers admitted to committing this type of fraud, the survey found.”

Gowri Gopalakrishna, PhD, an epidemiologist and public health policy scientist with the Amsterdam University Medical Center (AUMC) who helped lead the NSRI study, “thinks that the percentage of researchers who confessed to falsifying or fabricating data could be an underestimate,” Nature reported.

Thousands of Researchers Admit to ‘Questionable Research Practices’

Conducted online, the NSRI received responses from nearly 7,000 academics and researchers across a wide range of disciplines. About half admitted to engaging in “questionable research practices” (QRPs), 4.3% admitted to fabrication of data, and 4.2% admitted to falsification of data.

The NSRI presented its survey results in two preprints:

  • Prevalence of Questionable Research Practices, Research Misconduct and Their Potential Explanatory Factors: A Survey Among Academic Researchers in the Netherlands
  • Prevalence of Responsible Research Practices and Their Potential Explanatory Factors: A Survey Among Academic Researchers in the Netherlands

The NSRI study authors wrote that QRPs included “subtle trespasses such as not submitting valid negative results for publication, not reporting flaws in study design or execution, selective citation to enhance one’s own findings and so forth.”

An article in Science, titled, “Landmark Research Integrity Survey Finds Questionable Practices Are Surprisingly Common,” notes that the NSRI survey organizers took steps to ensure anonymity of respondents. “So, we have good reason to believe that our outcome is closer to reality than that of previous studies,” Gopalakrishna said.

Publish or Perish

Survey organizers originally sought responses from more than 60,000 researchers, but “many institutions refused to cooperate for fear of negative publicity,” Science reported.

The authors cited “publication pressure,” otherwise known as the “publish or perish” reward system, as the top factor driving questionable research practices. Respondents were “less likely” to engage in questionable research practices, data falsification, or fabrication if they subscribed to scientific norms and perceived a high likelihood of being detected.

According to the NSRI findings, within academic ranks, PhD candidates and junior researchers were “most likely” to engage in QRPs, as were male researchers and those involved in empirical research.

Epidemiologist Gowri Gopalakrishna, PhD (above), a post-doctoral researcher and the project secretary for the NSRI, told Science that advocates for research integrity should pay more attention to “sloppy research practices” as opposed to outright misconduct. “We need to have a positive environment where mistakes can happen, and where there is more focus on responsible conduct, slower science, and taking time for good quality research,” she said. Clinical laboratory professionals would likely agree with Gopalakrishna’s assessment.  (Photo copyright: University of Amsterdam Medical Center.)

Tracking Retractions

Retraction Watch, a blog founded in 2010 by medical journalists Ivan Oransky, MD, and Adam Marcus, offers a day-to-day barometer on research integrity. As the name indicates, the blog tracks research studies that have been retracted due to scientific misconduct or other reasons. In 2018, the bloggers launched a searchable database with more than 18,000 papers or conference abstracts that had been retracted.

An analysis by Science, titled, “What a Massive Database of Retracted Papers Reveals about Science Publishing’s ‘Death Penalty’,” looked at about 10,500 retracted journal articles in the database. It found that about half of those retractions involved scientific misconduct, including fabrication, falsification, and plagiarism. Nearly 40% were withdrawn “because of errors, problems with reproducibility, and other issues,” the analysis noted.

The data also indicates that a relatively small number of authors—about 500—accounted for about 25% of the retractions in journals.
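
For readers who want to explore retraction data themselves, the sketch below shows one way a summary like the Science analysis could be approximated from a CSV export of a retraction database using pandas. The file name and the “Reason” and “Author” column names are assumptions for illustration; the real Retraction Watch database has its own schema and licensing terms.

```python
import pandas as pd

# Hypothetical sketch: summarize retraction reasons and author concentration
# from a CSV export of a retraction database. The file name and the
# "Reason"/"Author" columns are illustrative assumptions, and the code
# assumes one author per row.
df = pd.read_csv("retractions_export.csv")

# Share of retractions whose stated reason mentions a misconduct category.
misconduct_terms = "fabrication|falsification|plagiarism"
misconduct_share = (
    df["Reason"].str.contains(misconduct_terms, case=False, na=False).mean()
)
print(f"Retractions citing misconduct: {misconduct_share:.0%}")

# How concentrated are retractions among the most-retracted authors?
top_500 = df["Author"].value_counts().head(500)
print(f"Top 500 authors account for {top_500.sum() / len(df):.0%} of retractions")
```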

In addition to the blog, Oransky and Marcus penned a column for STAT, titled, “The Watchdogs,” in which they called attention to scientific misconduct and suggested possible solutions.

Tips From a Media Watchdog

Gary Schwitzer, founder and Publisher of HealthNewsReview.org, a media watchdog website, offers additional insights. Schwitzer is a longtime medical journalist who also taught health journalism and media ethics at the University of Minnesota.

“Not all studies are the same and no study should necessarily be equated with the truth,” Schwitzer said in a video embedded on the website. People “often lose sight of the fact that journals were meant to be forums for discussions among scientists, not a source of daily news.”

The website includes a guide, “Tips for Analyzing Studies, Medical Evidence and Health Care Claims,” as well as a tip sheet for evaluating claims about medical tests.

The NSRI’s research is the latest in a long line of studies scrutinizing published scientific research, some of which have found “cooked” data and outright fraud. This suggests that pathologists and clinical laboratory professionals should follow the saying caveat emptor (“Let the buyer beware”) when absorbing research published in scientific journals or presented at meetings.

Stephen Beale

Related Information:

Prevalence of Questionable Research Practices, Research Misconduct and Their Potential Explanatory Factors: A Survey Among Academic Researchers in the Netherlands

Prevalence of Responsible Research Practices and Their Potential Explanatory Factors: A Survey Among Academic Researchers in the Netherlands

Largest Study Ever on Research Integrity Launches, Aimed at All Researchers in the Netherlands

Prevalence of Research Misconduct and Questionable Research Practices: A Systematic Review and Meta-Analysis

A Huge Database of Scientific Retractions Is Live. That’s Great for Science

The Real Plague Affecting Science? It Isn’t Fraud

Academic Journals, Journalists Perpetuate Misinformation in Their Handling of Research Retractions, a New Study Finds

What a Massive Database of Retracted Papers Reveals about Science Publishing’s ‘Death Penalty’

Why Our Peer Review System Is a Toothless Watchdog

Science Isn’t Broken. It’s Just a Hell of a Lot Harder Than We Give It Credit For

Q/A with Dr. Ivan Oransky from Retraction Watch

The Science of This Pandemic Is Moving at Dangerous Speeds

Op-ed: Covering Science at Dangerous Speeds

The Watchdogs: We’ll Sniff Out Scientific Misconduct

Why Most Published Research Findings Are False

Tips for Analyzing Studies, Medical Evidence and Health Care Claims

There’s a Way to Spot Data Fakery. All Journals Should Be Using It

Does Science Self-Correct? What We’ve Learned at Retraction Watch

Retractions, Post-Publication Peer Review and Fraud

Ivan Oransky Co-Founder of Retraction Watch Discusses Scientific Research Integrity
