
Increasing public trust in science

Anteater Insider podcast explores the importance of integrity and transparency in academic research

A 2023 Pew Research Center survey found that the share of Americans who believe science has had a mostly positive effect on society has declined as public trust in scientists continues to wane. Fifty-seven percent of respondents said science has had a mostly positive impact on society, a figure that has dropped 16 points since before the beginning of the COVID-19 outbreak.

This loss of trust could have a profound impact on U.S. universities, which receive billions in research funding from public sources annually. What is driving this decline? Is it warranted? And what actions can the scientific community, particularly U.S. universities, take in response?

In this edition of the Scholarly Values Anteater Insider podcast, Duncan Pritchard, Distinguished Professor of philosophy and chair of the Year of Scholarly Values Committee, discusses the pressing issues facing the scientific community with Pramod Khargonekar, vice chancellor for research and Distinguished Professor of electrical engineering & computer science. Khargonekar, who leads UC Irvine’s $700 million research enterprise, addresses these questions and much more.

To get the latest episodes of the Anteater Insider podcast or The UCI Podcast delivered automatically, subscribe at Apple Podcasts or Spotify.

TRANSCRIPT

Duncan Pritchard:

So let’s begin by asking whether you agree that there is a loss of trust in science. And if you do agree with that, what do you think is the cause?

Pramod Khargonekar:

The Pew survey, as well as other surveys, reveals that there is a loss of trust in science and scientists. Now, having said this, the loss does not appear to be specific to science; it is broadly similar to the loss of trust in many other civic, cultural and governmental institutions. If you look at the survey results, public trust in science is roughly comparable to public trust in the military, which has traditionally enjoyed a high level of trust. And trust in scientists is much higher than trust in journalists, business leaders and elected officials, although you might say that’s a bit of a low bar for us to be comparing ourselves to. So yes, I would say there is a loss of trust in science and scientists, but let’s dissect it a little more deeply.

So again, if you look at the survey results you pointed out and other similar surveys, what comes across is that the public generally has high levels of confidence in scientists’ intelligence, competence and honesty. In fact, in one survey, 89 percent of people felt scientists were very intelligent. The public certainly believes that scientists work on real-world problems, that they are competent in what they do and that they do their work honestly. Where I think things begin to go awry are three areas in which the public has a much less rosy view of science and scientists. Let me lay those out: values, biases, and a focus on publications and grants. In the surveys I have seen, less than half the public believes that scientists share their values; they feel scientists’ values differ from their own. Similarly, less than half of the public believes that scientists can overcome their human and political biases. And a majority of the public believes that scientists prioritize getting grants and publishing, even if it means compromising on quality. Considering these findings, the real loss seems to lie in the areas of values and the public’s perception of scientists as biased in a specific direction, along with the sense that incentive systems drive scientists to focus on publications and grants, leading to potential compromises.

Duncan Pritchard:

That’s a really interesting point you make, and I think you’re right about this. There’s a general crisis of confidence in expertise, and science just falls under that general umbrella. I think your diagnosis would also explain why there’s a general skepticism about expertise: the experts have values and subjective biases, such that they don’t really represent us, and that’s why we can’t take what they say at face value. And people think scientists may have political agendas. This may be part of what went on with COVID-19, when many thought the scientists were siding with one political agenda over another, or something like that. If that is the right diagnosis, can we do something about this? What can we do to correct it?

Pramod Khargonekar:

That’s a big question, Duncan, and I think we can start by thinking about things we might be able to do. First, we need to regain trust, because trust is the fundamental thing. I’ll give you an example. Some data suggest that groups of people who had lost trust in, say, scientists or scientific and federal institutions were less likely to take the COVID-19 vaccine and were more susceptible to misinformation about the vaccine. So there’s a vicious cycle between loss of trust and making decisions that are clearly not in one’s best interest, and that cycle feeds itself. I don’t know what the causal relationship within that vicious cycle is, but from my point of view, the priority has to be to regain the public’s trust in what we do and how we do it, and trust that we do our work while minimizing biases. We need to bring open-mindedness and robust processes of peer review, of analyzing the evidence, of gathering full data, and all the things that we in the scientific and research community hold as paramount in how we conduct scientific research. And we need to come up with communication strategies, working with the public, so that we can rebuild this trust. Because I think that’s the problem we need to solve.

Duncan Pritchard:

It’s interesting, isn’t it? You mentioned how a certain group is most likely to view scientists with mistrust, and its members will be the ones most susceptible to further misinformation that confirms their bias against science. That’s a product of the information age: it reinforces viewpoints regardless of whether they’re correct. And it’s very difficult to do anything about that without actually interfering with the platforms that provide this information, which in itself will be viewed as a political act, right? It would itself make people suspicious of the information, which might indeed be correct about science. That’s the dilemma all experts face, not just scientists. How do you correct for that without being seen as meddling in people’s free speech and free access to information?

I suppose we agree that we have to build this trust. In practical terms, though, I’d like to pin you down on this. For example, you mentioned grants a moment ago, and I can see how people would detect a kind of causal relationship there: scientists have to get grants, and many grants come through industry or through public bodies. Those bodies have their own agendas, so that’s going to dictate a certain kind of agenda for science. No wonder, then, that science is biased; it has its agendas and so forth. But it’s hard to imagine how we would do science without funding from these bodies, right? So, just in practical terms, how might we present science as trustworthy, given that there are these structural features of the scientific enterprise that could be construed as making it suboptimal in terms of trustworthiness?

Pramod Khargonekar:

Let me begin by saying what we should not do. We should not aim for uncritical trust and belief in science among the public. That would be counter to the core principles of the scientific method, which relies upon evidence, testing, analysis, debate, skepticism, objectivity and so on. We should expect the public to be critical of science, and we should earn their trust in the same way that we earn trust with each other when we conduct scholarly work. I would say, let’s not go down the path of manipulating public opinion, because I think that will be counterproductive, and in my opinion, it won’t work. We need to treat the public with respect and understand their point of view before we go about trying to change it.

Now, you asked about practical things we can do. Researchers, scientific organizations, funding agencies and universities need to renew our commitment to how we conduct, communicate and critique research. We should correct published records when errors are found or when we run across occasional cases of research misconduct, and that should be communicated openly and with transparency so people can see that we don’t hide biased work, misleading work or data fabrication. All of those things detract from trust in science. We should also demonstrate to the public that we have vigorous debate in the scientific community and that we do not all take the so-called party line. We should find a way to communicate the fact that scientists disagree with each other, sometimes vehemently.

If the public saw more of that give-and-take in the scientific process, they might better appreciate the fact that the scientific process is built on differing opinions, on questioning each other, and on dislodging the existing dogmas or paradigms by which research is conducted. Finally, I will say the following based on my understanding of science communication: if you ask any individual to choose between a scientifically valid fact and belonging to the community they see themselves as part of, almost invariably they will choose the community over what we think is objective fact. We have to come to grips with this; it’s a social phenomenon. I’m not a psychologist or a social scientist, so all I’m saying is that this is what the field of science communication has taught us.

We must find ways of respecting the fact that people belong to certain social groups or communities, whether based on where they live, their profession, their belief systems and so forth. We should not ask people to give up their social structures. Rather, we should meet them where they are, understand their point of view, and find ways, with respect, honesty and transparency, to communicate how science has reached a particular conclusion. Again, without faulting anybody publicly, I think the public began to see the scientific community as manipulating them in certain emergency situations, and that was very counterproductive. Whether that perception was deserved or not is a whole different debate that’s going on in public right now. But we need to step away from any idea that we can manipulate public opinion.

Duncan Pritchard:

It’s very interesting you say that, because on several occasions over the last few years, science has been shown to be doing its job properly, but this has been misrepresented as a sort of crisis of science. For example, there was the replication crisis in social psychology, and there have been a number of high-profile cases where scientists have been found to have manipulated data and so on. But the fact that we’ve discovered this, that we’ve been studying it, that it’s been brought to the fore, should be a win for science. It should be a win for the objectivity of science. Instead, it gets cast as a “how much more of this is there?” kind of narrative. It’s a tricky thing in this information age to get that across.

It’s a good thing that we’ve discovered a replication crisis and that we’re now doing the replications. It’s a good thing that we’re exposing data fraud and things like that. And related to this, you talk about scientists being open about the fact that they disagree, but often, I think, when the public sees scientists disagree, as with COVID-19, people treat that as a reason not to see science as objective anymore. It seems like we need more education about what science is and what it offers, so that people can understand that scientific disagreement is compatible with the objectivity of science. Does that sound right to you?

Pramod Khargonekar:

It sounds right. And I think the situation is maybe a tad worse than you portray, Duncan. It’s in the nature of science for there to be uncertainty about the ultimate fact or ultimate truth, right? We are always using Karl Popper’s formulation that no hypothesis is proved to be true; we only have not falsified it yet. And we know that there is inherent uncertainty in scientific research; nothing, in general, is known with one hundred percent accuracy. I think the public has a hard time, and even scientists have a hard time, dealing with the uncertainty of knowledge. So how do we effectively communicate this nature of science, that it’s always questioning, changing, improving? Maybe we need to be doing more of that in the university as we educate our undergraduates, so that they have an understanding of the nature of the scientific process and of how this gradual improvement in our understanding of the universe, and of how the world works, comes about.

So I don’t think there is a magic bullet here, but perhaps we can begin at the university level, perhaps even at the K-12 level. The world has completely changed in the last, say, 15 to 20 years; now you’ve got myriad sources of information, and everybody’s a publisher, everybody’s an influencer. So how do we teach people to look for reliable sources of information? How do you weigh different sources of information? How do you fight confirmation bias? How do you look for a contradictory opinion, and how do you weigh it? These are all fundamental scholarly values, Duncan, and we need to do a better job of putting them into university curricula at least, but perhaps also into K-12 and into materials for the general public. The main thing is to do this with respect. If people feel like you’re looking down on them, it will be counterproductive.

Duncan Pritchard:

That’s music to my ears, of course, because that kind of training is effectively what I do. As philosophers, we try to help people think clearly and rationally for themselves: don’t leap to conclusions, don’t just accept things because that’s what you’re told, and so forth. Try to sift through the evidence. Look at what the scientific method is and what it offers. What is it, exactly, to uncover a scientific truth as opposed to another kind of truth? One way of construing your answer is that we need more of that kind of philosophical training in our curriculum. Thank you, Pramod, for joining the Anteater Insider podcast. That was a very interesting conversation.

Pramod Khargonekar:

Thank you, Duncan, for inviting me and having this wonderful discussion. I hope we can make progress on these critically important issues.