19 January 2026

Ensuring credible science: Navigating the risks of questionable journals in the age of open access

Human Sciences Research Council (HSRC)

In short

• Predatory journals often mimic credible open-access titles and bypass genuine peer review.
• Publishing in questionable outlets harms researcher credibility, limits visibility and can affect ratings and funding.
• Even indexed journals may be unreliable, requiring careful verification beyond lists and AI tools.

The term “publish or perish” originated more than a century ago to describe the imperative to disseminate important academic work. Over time, it has come to reflect the pressure on researchers to publish frequently to sustain or advance their careers and secure research funding.

In South Africa, publication output is closely tied to academic recognition, with highly published researchers often receiving better National Research Foundation (NRF) ratings. In his 2021 article “The Publish or Perish Paradox”, Nicolas Sinclair described the growing tension between the need to publish and the potential decline in research quality when quantity takes precedence over meaningful inquiry, academic integrity and the advancement of knowledge.

Exploiting the open access model

Over recent decades, open-access publishing—providing free, unrestricted online access to research—has greatly increased the visibility and accessibility of scholarship. By removing traditional gatekeepers, open access has empowered scholars who might previously have been excluded from established publishing platforms.

Open access has also broadened access to global research and elevated the visibility of African scholarship through platforms such as SciELO SA and AJOL. However, high article processing charges (APCs) still limit participation, underscoring the need for sustainable, low-cost publishing models that advance true equity.

Unfortunately, this same model has been exploited. Questionable or predatory publishers pose as legitimate open-access outlets, but their main focus is collecting APCs—often at low rates—from researchers who are eager to publish quickly. These journals typically promise rapid publication without genuine peer review, editing or quality control. While they claim to support open access, their true motive is profit rather than advancing knowledge.

The dangers of publishing in questionable journals

Publishing in a questionable or predatory journal can seriously harm the reputation of researchers and their institutions. Such papers often go uncited and are not recognised by the broader academic community. If the journal is not reliably indexed in reputable databases such as Scopus, Web of Science or PubMed, other researchers may never find the article. As specialist librarian Denise Nicholson noted at an HSRC webinar in 2021, these are “dead papers”—the work is unlikely to be republished elsewhere, and reputable journals generally do not accept previously published articles, particularly if such articles appeared in questionable journals.

In a 2019 statement, the NRF reminded researchers that they are responsible for avoiding predatory publishing and “unethical editorial practices” and for selecting “authentic and credible” publishing platforms. Those who publish in or cite questionable sources risk losing access to funding and strong NRF ratings.

Navigating a complex landscape

Researchers must also look beyond journal lists. A study by Nguyen Minh Duc and colleagues in Vietnam showed that questionable journals can infiltrate reputable databases, and the authors proposed a checklist to identify them. Similarly, library experts at Karolinska Institutet and Maastricht University recommend using multiple verification tools, as even indexed journals can later be delisted for failing to meet quality standards. Conversely, legitimate journals can sometimes be mislabelled as predatory, for example after a name change. Avoiding such errors makes the vetting process complex and time-consuming.

AI detection: promise and limits

Recent research by Han Zhuang, Lizhen Liang and Daniel Acuña explored how artificial intelligence (AI) could help to identify questionable journals. Their model analysed more than 15,000 titles based on editorial board composition, website design, citation patterns and publication volume. It flagged more than 1,000 potentially questionable journals. However, the authors caution that AI is not yet foolproof: false negatives and false positives remain likely. While AI can accelerate journal vetting, human expertise and ethical judgement remain indispensable.

How to recognise a questionable journal

1. Indexing: Confirm that the journal appears on lists accredited by the Department of Higher Education and Training (DHET), such as the Directory of Open Access Journals (DOAJ), the International Bibliography of the Social Sciences (IBSS), South Africa’s Scientific Electronic Library Online (SciELO SA) or the Norwegian Register for Scientific Journals, Series and Publishers. Inclusion alone does not guarantee quality—some predatory journals have slipped through.

2. Editor and editorial board: Verify that the editor and board members are recognised experts with verifiable institutional affiliations. Look out for duplicated names across multiple journals, unverifiable credentials or fictitious members.

3. Mimicking: Predatory publishers often mimic reputable titles by altering wording and punctuation slightly. Conduct a quick online comparison to detect imitation titles.

4. Scope: Legitimate journals have a defined disciplinary focus. Titles that claim to publish across unrelated disciplines—such as medicine, engineering and agriculture—often lack the expertise to maintain quality.

5. Impact factor: Verify the journal’s impact factor via trusted databases such as Journal Citation Reports or Scopus. Beware of self-assigned or fabricated metrics.

6. Website and content details: Poor grammar, excessive punctuation, unprofessional fonts and personal email domains (e.g. Gmail or Yahoo) are warning signs. Use Google Maps to verify the publisher’s listed address; false claims of UK or US locations are common.

7. Peer review and output volume: Peer review is the cornerstone of scholarly publishing. A two-week turnaround or unusually high publication volume signals a lack of quality control.

8. Fees: Article processing charges (APCs) should be transparent and reasonable. Avoid journals that request additional fees after acceptance.

9. Copyright and licensing: Legitimate open-access journals provide clear copyright information and use recognised Creative Commons licences.

10. Solicitation and flattery: Treat flattery and solicitations promising fast publication with suspicion.

11. Verifying sources: Avoid citing questionable journals or unverified or unaccredited sources. Prioritise primary peer-reviewed research.

12. Conferences: Be cautious of for-profit “academic” conferences that cover unrelated disciplines and use grand titles—featuring words like Global or International—to mimic credibility.

Additional sources: Mouton & Valentine (2017), Elmore & Weston (2020) and O’Donnell (2012)
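Parts of the checklist above lend themselves to a first automated pass. The sketch below is a minimal, hypothetical Python scorer that flags a few of the warning signs from simple journal metadata; the field names, thresholds and example record are illustrative assumptions, not an established vetting standard.

```python
# Hypothetical checklist scorer: flags common warning signs in journal metadata.
# Field names and thresholds are illustrative assumptions, not a standard.

def journal_warning_flags(journal: dict) -> list[str]:
    """Return a list of warning signs found in a journal metadata record."""
    flags = []
    if not journal.get("indexed_in"):            # e.g. {"DOAJ", "Scopus"}
        flags.append("not found in any recognised index")
    if journal.get("review_days", 999) < 14:     # suspiciously fast peer review
        flags.append("promises peer review in under two weeks")
    contact = journal.get("contact_email", "")
    if contact.split("@")[-1] in {"gmail.com", "yahoo.com"}:
        flags.append("personal email domain instead of institutional address")
    if journal.get("fee_disclosed") is False:    # hidden or surprise APCs
        flags.append("article processing charges not disclosed up front")
    if len(journal.get("disciplines", [])) > 5:  # implausibly broad scope
        flags.append("claims to cover many unrelated disciplines")
    return flags

# Fictitious example record that trips every check above.
suspect = {
    "title": "Global International Journal of Advanced Everything",
    "indexed_in": set(),
    "review_days": 7,
    "contact_email": "editor@gmail.com",
    "fee_disclosed": False,
    "disciplines": ["medicine", "engineering", "agriculture",
                    "law", "physics", "theology"],
}
print(journal_warning_flags(suspect))
```

A scorer like this can only triage: a journal with no flags still needs the human verification steps described above, and a flagged journal may have an innocent explanation, such as a recent name change.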

HSRC support for research integrity

The HSRC’s eResearch Knowledge Unit (eRKU) plays a central role in supporting the research process and ensuring that the organisation produces credible knowledge that can be used by the research community and policymakers.

“The eRKU is dedicated to raising awareness among researchers regarding their responsibility to use sound professional judgement in every aspect of their work and resulting publications,” says Lillian Santi, director of the eRKU. “This proactive approach ensures that research is reproducible, transparent and ethically sound. Ultimately, each researcher is responsible for the quality, credibility and impact of their work within the broader scientific domain.”

While acknowledging the potential of advanced tools, the eRKU advises that the integration of AI in research be approached with caution and critical oversight. “AI should serve as an augmenting tool, not a substitute for the human-led ethical and methodological rigour that underpins high-quality scholarship,” Santi adds.

Research contacts and acknowledgements

This article was written by Shingirirai Muzondo (principal information consultant) and Hanlie Baudin (manager of digital scholarship services) in the HSRC’s eResearch Knowledge Unit (eRKU), with Antoinette Oosthuizen, editor of the HSRC Review. The eRKU supports researchers and stakeholders with access to subscribed and open-access information, datasets, literature searches, training and data management. For more information, contact Shingirirai Muzondo at smuzondo@hsrc.ac.za or Hanlie Baudin at hbaudin@hsrc.ac.za.