Why science needs a stronger voice: Navigating trust, AI, and the pressure to publish

Mar 23, 2025

The future of public trust in science depends not only on integrity but also on communication. Here’s how we can lead through the complexity.

Trust in science has long been a cornerstone of progress. But what happens when that trust is shaken - when papers are retracted, peer review falters, and AI enters the lab before guidelines are even written? Having worked closely with scientists throughout my career as a journalist, communications professional and public speaking coach, I’ve witnessed both the power and the pitfalls of modern research unfold in real time.

The pressure to publish and its consequences

For many scientists, publishing is not just a goal - it’s currency. Careers, funding, and reputation often hinge on the next journal article. Friends and colleagues in academia describe the relentless drive to produce as motivating on the one hand and, on the other, overwhelming and stressful.

According to a recent article in The Conversation, scientific misconduct is rising. From image manipulation and plagiarism to outright data fabrication, misconduct has serious consequences - not only for the reputations of individual researchers but for the wider trust the public places in science. When misconduct makes headlines, it can erode public confidence at a time when we most need evidence-based decision-making to counter influencer voices that aren’t grounded in data and evidence.

Scientific integrity is not just a matter for the lab - it’s a public good

A case in point: German anaesthetist Joachim Boldt, who fabricated data in dozens of studies - over 200 of his publications have since been retracted. That’s a record I, as a German-Australian, am obviously not proud of. Closer to home, here in Melbourne, one of the most highly regarded research institutions came under scrutiny when a study it published was found to likely describe an experiment that was never actually conducted. And it’s not the only one: some of our renowned universities have generated similar headlines.

Peer review is designed to catch flaws and uphold quality, and I saw it working early in my career, when I was responsible for commissioning reviews. But even this system is under strain today. In Australia, institutions aren’t legally required to publish the results of internal misconduct investigations, meaning public trust is sometimes left in the dark.

None of this means science is broken, but ...

And with ever-increasing submission numbers, many journals are struggling to keep up. Peer review rings and fraudulent reviewer identities have surfaced globally - sometimes enabling false studies to slip through the cracks. There is talk of “peer review fatigue” and “faked peer review”, driven by heavy workloads in academia. None of this means science is broken, but it does mean we need to be more honest, transparent, and proactive in communicating the process, not just the outcome.

Enter AI: A blessing and a curse

AI is transforming science at warp speed - from data analysis to drafting literature reviews. When used with integrity, it enhances productivity and helps surface new patterns and findings. But without standards, AI risks being misused.

Entire fake research papers generated by large language models (LLMs) have already been submitted to journals. And unlike humans, AI can replicate errors - or outright fabrications - at scale. This is a double-edged sword.

As a communicator, I’ve also seen this play out in my field. AI-generated content is everywhere. But there’s a clear distinction between those who personalise and contextualise AI outputs and those who copy-paste - increasingly, that distinction separates the good from the average. Without human insight, AI-driven comms risk becoming what I call “communications fraud.” But without AI, how can we as communicators cope with the workload, keep up with developments, and stay at the forefront of our field?

I’m worried about everyone not using AI - and at the same time, worried about everyone using it. 

This is where communication professionals can - and must - play a vital role, and scientists must engage us early. It’s no longer enough to write a media release after a discovery has been made. We need to be embedded earlier in the research journey, helping scientists frame their findings, share their limitations, make the process transparent and build relationships with the public before crises strike.

The antidote to misinformation isn’t silence. It’s strategy.

Science is not just about data; it’s about meaning. It’s about connecting breakthroughs in the lab with real lives outside it. And that connection, now more than ever, needs to be built on trust, transparency, and proactive storytelling. Trust in science doesn't rest on what we discover but on how we explain it.

A call to action for the research sector

If you're a scientist, start seeing communication not as an add-on or an afterthought but as part of your responsibility. If you're a leader in research or health tech, invest in communication early and often. And if you're a fellow communications professional, this is our moment to lead with integrity, empathy, and courage.

By integrating these insights - ethical AI use, scientific transparency, and early engagement with communications - we can strengthen public trust, support integrity, and build a scientific culture ready for the future.

Written by Claudia Raab - Diakla Founder

“If you’ve ever struggled to explain the true value of what you do - or felt your ideas weren’t getting the attention they deserve - I built Diakla for you.”

