
In science we trust? Six takeaways from a Science and Society conference

EMBL's latest Science and Society conference brought together researchers, ethicists, communicators, policy professionals, and more to discuss the thorny question of trust in science

Credit: Aleksandra Krolik/EMBL

By Simona Gioè and Shreya Ghosh

In an era when incredible scientific breakthroughs and rapid technological progress exist against a backdrop of mis- and disinformation, it is critical to delve deeper into the root causes of trust (or distrust) in science. Understanding these causes is a necessary first step towards nurturing such trust, so as to create a societal relationship more grounded in scientific reasoning and rational thought, as well as a more reliable scientific establishment that justifies its public funding.

To further explore the subject of trust in science, both within the scientific community and among society at large, the EMBL Bioethics Team organised a symposium titled ‘In science we trust?’. Held at EMBL Heidelberg in June 2025, the conference was attended by researchers, historians, policy professionals, ethicists, students, and more. 

Here are six main takeaways from the two days of sessions: 

1. The COVID-19 pandemic exposed pre-existing gaps in societal trust in science

The conference opened with a keynote by virologist Ralf Bartenschlager (Universitätsklinikum Heidelberg), who shared his experience serving as president of the Gesellschaft für Virologie (GfV) – the primary academic society for virology in German-speaking countries – during the COVID-19 pandemic. A recurrent theme throughout the conference, the pandemic emerged as a natural case study for societal (mis)trust in science and scientists, shedding light on several failure points and weaknesses in the relationship between science and society.

For example, despite the success of vaccine development strategies during the pandemic, well-meaning but misguided communication attempts from a few academics sometimes bolstered pre-existing misconceptions about the scientific community and how science works. 

During her address, science historian Caitjan Gainty (King’s College London, UK) also noted that the dialogue between scientists and members of society has too often deteriorated into rhetoric, leaving little space for nuance or opportunities for questioning, and in doing so, denying a fundamental aspect of the scientific process itself.

2. Distrust in science arises from a range of factors

Another theme frequently touched upon during the conference was how cases of scientific misconduct and fraud erode trust in science, and how these are much more prevalent than is ordinarily perceived. During his talk, Csaba Szabó (University of Fribourg, Switzerland) spoke about the reproducibility crisis in biomedical research, taking recently publicised cases of image manipulation as an example. The problem is exacerbated by the ‘publish or perish’ culture and a poor incentive structure that doesn’t reward quality control or research aimed at ensuring reproducible results.

Other causes of mistrust include a perceived sense of elitism in the scientific community, as Bartenschlager discussed in his keynote address. Gainty also discussed how scientists’ insistence on being seen as ‘authorities’ prevents meaningful engagement and discussion with non-scientists, by positing science as a collection of facts rather than an iterative process of discovery, and by not leaving enough room for healthy scepticism. Poor communication, involving overreliance on jargon and a disregard for the risks of misinterpretation, further widens this distance.

On a related note, Ben Bleasdale from the Campaign for Science & Engineering (CASE), UK, used results from a survey of over 30,000 UK residents to show that while investment in R&D is seen as important, it is nevertheless perceived as a luxury against the backdrop of the multiple global crises we are currently experiencing. This further emphasises how science and technology are often seen as something ‘separate’ from citizens’ ordinary day-to-day lives.

3. To build trust, scientists must first focus on making science trustworthy 

Science is not a monolith, and it does not happen in a vacuum. It is carried out by individuals operating within, and often on behalf of, society, and as such, it needs public buy-in to survive and fulfil its purposes. So, how can scientists work together towards gaining societal trust in science?

According to Maura Hiney (UCD Institute for Discovery, Ireland), public perceptions of the morals and character of scientists are even stronger determinants of their credibility than their qualifications. Public communication about science should therefore be honest and acknowledge the challenges of research and the problems in research culture, rather than focusing disproportionately on success stories.

How the scientific community reacts to and addresses these challenges also matters. Multiple solutions were debated during the conference: more transparency during the peer-review process, better systems for detecting and reporting scientific misconduct and fraud, and more effective whistleblower protection mechanisms were brought up again and again.

Discussions about retractions dominated a particularly lively Q&A session, which explored the stigma attached to them, different ways of handling the process from the publisher’s side, and the need to ensure that retracted research ceases to have an impact altogether rather than merely becoming harder to find. The role of open science in the context of societal trust was also examined and often described as a double-edged sword: it improves transparency and facilitates access to science on one hand, but also floods the research landscape with massive amounts of data without effective systems for quality control.

4. Policy-level solutions are important but not enough

These problems are not new. Systemic pressures, especially those related to how research output is assessed, have long influenced academic culture and have often been linked to the emergence of misconduct. In response, a range of preventative measures has been introduced over time, including research integrity guidelines at the national and institutional levels, while several grassroots, volunteer-led initiatives for detecting and reporting scientific fraud have been gaining traction.

But cultural change is slow, and problems persist. Szabó drew attention to the fact that most current policy-level initiatives lack enforcement mechanisms that result in real consequences for those involved in questionable practices. He advocated for better infrastructure to support the self-correcting nature of science, including the professionalisation of peer reviewers and “sleuths”, and for the introduction of punitive measures in response to misconduct and fraud to increase accountability.

Responsible research assessment practices, as discussed by Rebecca Lawrence (F1000 / DORA, UK), can also be a step in the right direction, moving away from evaluation models that mainly reward publication efficiency and scale and towards a ‘publish-review-curate’ model that better supports ethics and integrity checks.

5. AI has the potential to both help and hinder in the quest towards building trust in science

In recent years, generative AI has come into the spotlight repeatedly as a force for both boosting and hindering the ethical conduct of scientific practice. During a spirited panel discussion, Perihan Elif Ekmekci (TOBB University, Turkey), Resham Kotecha (The Open Data Institute, UK), and Mihalis Kritikos (European Commission, Belgium) spoke about the transformative potential of large language models (LLMs). 

On one hand, AI can make previously inaccessible datasets more accessible, be trained to better detect image manipulation and other forms of scientific fraud, or assist in fact-checking published claims. On the other hand, AI can perpetuate existing biases and, by influencing the way researchers form hypotheses, potentially contribute to the formation of scientific monocultures.

During the meeting, several speakers discussed a recent uptick in AI-assisted fraudulent practices, including the wholesale generation of synthetic datasets, ‘journal hijacking’ (where malicious actors impersonate a legitimate journal and create a counterfeit website for fraudulent purposes), and ‘paper mills’, in which falsified, AI-written scientific papers flood the peer-review system with very little to distinguish them from bona fide scientific information.

To better navigate this rapidly shifting scenario, AI literacy is critical, not only among scientists and the public, but also among policymakers. It is also important to have guardrails in place, and to update the current ethical review infrastructure, which is not set up to handle the many facets of AI technology.

6. Participatory dialogue is key in bridging the trust gap

While both policy-level and grassroots solutions were discussed during the meeting, the participants kept returning to the importance of open communication, dialogue, and the breaking down of silos and barriers to understanding. A panel discussion focused on science communication highlighted how important such initiatives are for both engaging the public’s interest and enhancing their trust in science. Bleasdale used the results from the CASE survey to emphasise the importance of ‘place’ (science that is ‘local’ is often perceived as more trustworthy and important) and of ‘purpose’ (the underlying aims or potential benefits of the research being conducted). Several speakers also emphasised the importance of involving the public in the decision-making process for new policies, at least at the consultation stage.


Over the two days of sessions, the symposium provided substantial food for thought for the scientific community, bringing together a group of individuals with expertise across domains to discuss and share their experiences of studying and trying to bridge the trust gap. Many solutions were discussed, along with evidence from initiatives around the world that have made an impact in strengthening trust in science and scientists. The speakers emphasised that this is the start of a dialogue that must continue, and the conference provided a springboard for discussions that will, in the coming years, help build a stronger foundation for a trust-based relationship between scientists and society.


Tags: event, heidelberg, science and society, science communication
