
Not Cool Ep 26: Naomi Oreskes on trusting climate science

Published
26 November, 2019
Naomi Oreskes discusses her new book and explains why we should trust climate science

It’s the Not Cool series finale, and by now we’ve heard from climate scientists, meteorologists, physicists, psychologists, epidemiologists and ecologists. We’ve gotten expert opinions on everything from mitigation and adaptation to security, policy and finance. Today, we’re tackling one final question: why should we trust them? Ariel is joined by Naomi Oreskes, Harvard professor and author of seven books, including the newly released Why Trust Science? Naomi lays out her case for why we should listen to experts, how we can identify the best experts in a field, and why we should be open to the idea of more than one type of "scientific method." She also discusses industry-funded science, scientists’ misconceptions about the public, and the role of the media in proliferating bad research.

Topics discussed include:

  • Why Trust Science?
  • 5 tenets of reliable science
  • How to decide which experts to trust
  • Why non-scientists can't debate science
  • Industry disinformation
  • How to communicate science
  • Fact-value distinction
  • Why people reject science
  • Shifting arguments from climate deniers
  • Individual vs. structural change
  • State- and city-level policy change

We have people out there who are just doing everything in their power to keep the fossil fuel economy alive and to continue to make profits by selling fossil fuels, come hell or literally high water. That tells me that this is fundamentally a political problem, that we have to fight the political power of the fossil fuel industry.

~ Naomi Oreskes

Transcript

Ariel Conn: Hi everyone. I’m Ariel Conn, host of Not Cool, a climate podcast. Today doesn’t just mark the 26th episode of the show. This is also the last interview we’re releasing for the series. Though tomorrow, we’ll also release a brief epilogue where I’ll highlight some of the more interesting things that I learned, as well as some of the things we didn’t get to cover that I would have liked to talk about. 

But for our last interview, I’m honored to have Naomi Oreskes joining the show. Naomi has written extensively about climate change and the scientific consensus surrounding climate change. Her most recent book is Why Trust Science?, which is what we’ll primarily discuss on this show, since the question pertains so directly to climate change. 

Naomi is a Professor of the History of Science and Affiliated Professor of Earth and Planetary Sciences at Harvard University. She is a world-renowned geologist, historian and public speaker, as well as a leading voice on the role of science in society and the reality of anthropogenic climate change.

Naomi is author or co-author of seven books, and over 150 articles, essays and opinion pieces, including Merchants of Doubt, The Collapse of Western Civilization, Discerning Experts, Why Trust Science?, and Science on a Mission: American Oceanography from the Cold War to Climate Change. Merchants of Doubt, which she co-authored with Erik Conway, was the subject of a documentary film of the same name, and has been translated into nine languages. A new edition of Merchants of Doubt, with an introduction by Al Gore, will be published in 2020.

Naomi, thank you so much for joining us. 

Naomi Oreskes: You're welcome. Nice to be with you. 

Ariel Conn: All right, so your book is Why Trust Science? Why did you write the book? What problems are you hoping to identify or address? 

Naomi Oreskes: That's easy to answer because this book, more than some things I've done, had a very specific beginning. After Erik Conway and I published Merchants of Doubt in 2010, I went on the lecture circuit, found myself giving a lot of public lectures all across the country, and making a point to try to accept invitations from places that might not always be friendly. And often I would give these very well-constructed talks where I would explain the history of climate science, and all the details about how scientists have come to understand it, and what the evidence was, and how the evidence had been collected by a lot of different people in different places over a long period of time. And once I was giving a talk and a man stood up afterwards and he said, "Well, that's all very well and good, but why should we trust the science?" And in that moment I thought, fair enough. And I started thinking about that, and so a couple of years later when I was invited to do the Tanner Lectures on Human Values at Princeton, I thought, “That would be a good topic. Let me see if I can try to answer that question.” 

Ariel Conn: In the book I found five tenets of reliable science that you list, and those were consensus, method, evidence, values and humility. Can you just briefly explain what you meant by each of those — and if there's something I missed, mention that as well? 

Naomi Oreskes: Sure. Although I think if it's okay with you, I might back up. Those five things kind of fall out at the end of the book as important themes that I ended up thinking were important things for us to think about when we think about the reliability of science. Before we get there, there's a more basic argument about the basis for trust in science, and that's essentially two things. One is the argument for expertise. Even though it's very fashionable these days to be skeptical of experts, the reality is that there's good reason to support experts. And the reason is that they have knowledge and information and experience that we don't have. So just like we need plumbers to fix our plumbing — sometimes we can fix it ourselves, but a lot of times we can't — or if the electricity goes on the fritz, most of us are not in a position to safely fix our own electricity. 

So we call in an electrician, and most of the time that's a good thing. Scientists are our experts on the natural world. They are the people who have studied it, who have specialized training and knowledge, and who know things that we don't know but that we need to rely on — just like we rely on our plumbers or electricians or dentists or doctors or nurses, whatever. And without expertise, civilization would come to a standstill. You wake up in the morning and you turn on the radio and you listen to the radio, you listen to the weather. You get in your car and you drive to work; when you get to work, your office hopefully has been cleaned the night before. I mean, everything we do, there are other people around us helping us out, and in many cases those people are experts. 

We don't really think about that. But when you begin to think about expertise in that way, you realize that it's foolish to be disrespectful of experts. A little healthy skepticism here and there — some electricians are crooks, you know — but in general, most experts do jobs that we need them to do, and scientists are our natural world experts. The second part of the argument is the argument for the critical vetting of claims. In science, it's not enough to be an expert, do work and say, “Okay, I've spent the last 15 years studying these mineral deposits in Chile and now I'm going to tell you all about them.” No, there's a second step. And that second step is the critical vetting of the claims by the community of other experts. I have to present my evidence, I have to show the data, my colleagues get to ask questions, and they get to ask tough questions. And then I submit it for publication, where there's another round of questioning, and if my colleagues are not satisfied with my arguments, I have to fix them. 

This process is the key to yielding claims that have been vetted. They've been tested, and in general, history tells us that by and large they turn out to be pretty reliable. So that's the basic structure and framework for why we should trust science. Out of that basic framework, in the book I look at the larger history of different attempts to try to understand science, but then I also look specifically at a couple of cases where we would say in hindsight that scientists did get it wrong, and ask the question, well, what can we learn from those experiences? And so I pull out five things that I think are important. 

The first and most important is the whole notion of consensus. So again, it's fashionable in some quarters to criticize consensus, to say that science isn't about consensus. But actually science is about consensus, because that's what you get after you go through this whole process — or maybe you don't get it, but when you do have consensus, that's when we say, okay, we know something. 

And what I found in my examination of examples where supposedly scientists had got it wrong was that actually in every one of those cases we find that there actually wasn't a consensus, that even at the time there was significant disagreement within the scientific community. This means it's very important for us to look closely at some issue that might be contested — like climate change — to really find out: is this contested by scientists within the scientific community, or is the contestation political contestation, which is what we see in the case of climate change, or is it some kind of social or cultural contestation, which is what we see in the case of vaccinations? These are two very different things, and they need to be addressed in different ways. So consensus is really important. 

The second thing is method. And a big part of the book is to refute the popular conventional wisdom that there is a scientific method. Historians and philosophers have been saying for a long time now that that's wrong. We have a huge amount of evidence from the history of science that scientists actually use very diverse methods. So when we look at science, what we have to accept is that there are different tools for different kinds of problems, but that's okay. And what is not okay is when we get fetishistic about method, when we insist that there's only one right way to do science. And this was on full display just a few weeks ago in the recent debate that erupted over the question of whether or not it's healthy to eat red meat. And it comes up in all the cases that I looked at in my study too: that where you have scientists going wrong, it's often where they become obsessive — I use the word fetishistic — about a certain method, and then that blinds them to important evidence that comes from other places. 

And sometimes this is done in a principled way, that scientists persuade themselves that only a randomized clinical trial is legitimate. Or sometimes it's exploited cynically, let's say by the tobacco industry, that used it to try to deny a significant part of the evidence of the harms of tobacco. And we have seen this just two weeks ago: the people who are now claiming that red meat is fine and you should eat as much red meat as you want, have, in my opinion, cynically exploited the whole notion of RCT. So, this may be more detailed than you want, but in their paper they used a methodology that was designed to prioritize RCTs on the grounds that randomized clinical trials are the gold standard in epidemiology and clinical trials. Well, that's true; if you can do an RCT, then it's definitely a good thing to do. But there are many problems for which RCTs are not suitable. 

And nutrition is probably the most important example because you can't do a double-blinded clinical trial; very difficult to randomize a population; people lie about what they eat — I mean, there's all kinds of reasons why it's hard to do RCTs in nutrition. However, we have an awful lot of good information from other sources: population studies, cohort group studies, animal studies. It's true, these other methods are not as good as RCTs, but if you can't do an RCT, then obviously you have to rely on other information. And we now know that these authors did in fact have links to the food industry. I think this was a cynical ploy to rule out a lot of important evidence and to say, "Oh, we have no good evidence that red meat is bad for you, and therefore you should just keep on eating red meat." So this can be exploited cynically, or it can be used in a kind of authentic, but I think misguided, way. So a really important message of my book is don't be fetishistic about method. Accept the fact that evidence comes in a lot of different sizes, shapes and colors. It's not always perfect, but the fact that evidence is imperfect is no reason to throw it out. 

Ariel Conn: Do you see scientists recognizing that there's lots of different types of methods, or do you think this is something that scientists need to be reminded about as well? 

Naomi Oreskes: Both. I think there are a lot of scientists that do recognize it. Certainly, scientists who are in fields like nutrition, who realize the difficulty of doing RCTs — they certainly know and accept that you need to be able to do other things like animal studies, for example. But I think there are other scientists who need to be reminded. I think it's easy to fall into the trap of thinking a certain method is the gold standard, a certain method is better — and then because it's better, you become a little bit narrow-minded or fixated about the idea that that is what you should be working for. And again, if you can do it, great, but if you can't do it then you have to be willing to accept that there are other ways. 

Just as there are many different methods, so evidence comes in lots of sizes, shapes and colors. And the key thing in science is as much as possible to be open minded to all of the evidence, to look at the weight of evidence and not to discount evidence just because it's not in your preferred form. And so, many of the cases that I looked at in my book where we see scientists in hindsight making mistakes or going awry, we find people discounting evidence because they didn't like where it came from, or they didn't like who was supplying it. And in hindsight, many times we see that this discounted evidence was actually correct. And so a big part of my argument is to be open-minded about evidence. It doesn't mean that you accept any old claim, but it does mean that you realize evidence is complicated and it may sometimes come from places that you didn't always initially expect. So being open-minded about evidence and weighing the evidence and really being able to encompass the full availability of evidence: the evidence in my study supports that that's the appropriate approach. 

Ariel Conn: And then values is actually one that I thought was really interesting, because I do think that that's one that's especially important when trying to communicate the science to such a diverse group of people. 

Naomi Oreskes: Many scientists think that to be scientific is to suppress your personal values, and that it's inappropriate to talk about values in science because if you do, then it will appear that you're not objective. And I think this is mistaken on a couple of levels. I think it's mistaken as a matter of fact, because the reality is that scientists are people, we're human beings, and we all have values and preferences — and it is simply not possible to expunge your values. The idea of being a completely value-neutral objective scientist is simply a fantasy. Maybe someday in the future there'll be a robot that could do that, but no human being can do that. So if you set that as your goal, you will necessarily fall short. And then if people discover you actually have values — which of course you do — then it's, “Aha, so you really do have values!”

And then you can be potentially discredited by people who would like to discredit you. Or even if nobody's deliberately trying to discredit you, it just may come across as inauthentic or dishonest because you say, “Oh, I have no values,” and your audience is thinking, “Yeah, right.” It also comes out of my experience trying to communicate to a wide range of different people about climate change, and one of the things that I've discovered is that not only is it impossible to completely hide your values, but it's not even a good idea. Because what I've found in my own work was that when I would talk to people about me — who I was, and my own personal values — very often people resonated with that.

Even people who might have been skeptical to begin with about climate change, when I would actually talk about why I got involved in this issue, why it matters to me, why I act on climate, to use the hashtag, I found that often people would suddenly be listening more closely. Then I become a person just like them, grappling with a complicated issue, caring about my children just like them, and I think that opens up a space to make a human connection with people that maybe otherwise you might think you have nothing in common with them. 

That happens a lot with scientists. They think, “How can I possibly talk to somebody who thinks the earth is 6,000 years old?” Or, “How can I possibly talk to somebody who hasn't vaccinated their children?” This is one way to answer that question: to say, “Well, look, you have values, they have values, and it turns out actually many of those values may overlap. Those people who aren't vaccinating their children, they love their children just as much as you do, but they have some kind of conceptual framework that has made them think that not vaccinating their children is an appropriate expression of their love. And if you can find a way to say, ‘Well, I love my children too and here's why I vaccinate my children’ — sometimes that can open up a space that would otherwise not be there.”

Ariel Conn: I'm going to come back to this in a minute, but let's finish with this list. All of these really tie nicely, link nicely together. So humility is the last one. 

Naomi Oreskes: Right. Well, humility is the last one; in a way, it's the first one, because it's something I've been thinking about for a long time, but I always keep coming back to it. As an academic, it's a tricky thing because you do all this work and then people press you to say, “Here's my conclusion, and here it is in a soundbite without any qualifications or caveats.” And of course we know that's not right, because everything we've done is potentially subject to revision. So a long time ago when I first started writing about climate change, my very first article on it was my 2004 article on the scientific consensus on climate change. And one of the things I specifically said in that article, after demonstrating that there was indeed a consensus, was that of course the history of science tells us that we might be wrong. And I wrote, “If history proves anything, if it teaches anything, it teaches humility.”

So we should always be aware of the possibility that in the future we will learn new things and that our ideas may need to be revised, and we should therefore not be too self-satisfied, not become auto-intoxicated. But at the same time, if we need to make an important decision, then it makes sense to make our decision based on the best information we have. So we can be clear and firm about what we know while at the same time still recognizing that yes, it's possible that we will revise this in the future, but it's rather unlikely that we're going to find out that there's no climate change, because all of the available evidence tells us there is. 

Ariel Conn: As I was reading this, I initially went into the book with the title Why Trust Science? — I assumed it was geared towards the public. And maybe that was the goal, but as I was reading it, it seemed to me an awful lot like really the audience who needs to be reading this are scientists. And so I was curious who you intended this for.

Naomi Oreskes: Yeah, that's a good question. Sometimes when you write you have a very specific audience in mind, and sometimes it's a little bit less specific. I think in this case, who I had in mind were the people who come to my talks. When I give public talks, a lot of people come; I mean, I've lectured to many, many thousands of people across this country and I've written op-ed pieces that have been read by an order of magnitude more. And I don't know who all those people are, but there's a lot of people out there who read my books, and most of those people are not scientists, most of those people do not have PhDs and most of them are not academics. But they're educated enough to be engaged in a serious conversation about science. So whoever they are, I'm very grateful to them, and this book is for them. 

There is some audience out there of educated people who care about these issues, and it does include scientists. I think many scientists need to have these broader, more philosophical conversations about the nature of science and the relationship between facts and values, for example, but I don't see scientists as primarily my audience. I see it primarily as all those nice people in Iowa and North Dakota and Utah who have come to my talks and listened to me on public radio and might be listening to this podcast.

Ariel Conn: To keep up with all scientific fields — that's impossible for scientists, let alone the general public. So then when it comes to people who don't have a scientific background, who can't read these peer-reviewed papers: how do they reasonably keep up with what information they need to keep up with, and find reliable mediators and communicators that they can trust? 

Naomi Oreskes: Well of course part of the answer is, they don't. I don't expect any ordinary citizen to keep up with science writ large. I mean, I do this for a living and I still wouldn't claim that I keep up with all science. I keep up very closely with earth and environmental science, and to a lesser extent with a set of issues that sort of overlap with it that include nutrition and health and smoking and a few other things. But the whole point of the book — and of course one reason I worked really hard to keep it under 250 pages — was to say that there are some kinds of principles that you can begin to think about that can help you and give you guidance. And so one of the principles is the principle of expertise. It doesn't take a lot of work to ask the question — if some so-called expert is on television or on the radio or being quoted in the newspaper — it doesn't take a lot of work to say, "Well, who is this person?" And if the person is talking about science, but they're not a scientist, that should be a red flag. 

I have noticed in my own work that an enormous number of people who get quoted on television, on the radio, making scientific claims — or I should say really making anti-scientific claims; claiming there's no climate change or claiming that vaccines cause autism — these people are not scientists. So, obvious example: Jenny McCarthy saying that vaccines caused her son's autism. She's an actress. She may be a fine actress; she may be a fine person. I have no question whatsoever that she loves her son. She's not an expert about autism. And her experience as a parent does not make her an expert on the causation of autism. 

It may make her an expert on the experience of frustrated parents who are trying to grapple with a society that doesn't do a really good job of helping people who have autism. And that's totally legitimate. If you were doing a program about the difficulty of being a parent of autistic children, then it would be absolutely legitimate to invite Jenny McCarthy on that program. But if the program is about the causes of autism, then she's not the appropriate expert. And I think once you begin to think in those terms, a lot of things get sorted out. 

So you asked about the audience for the book: part of the audience for the book is you, and journalists, because I think journalists have really dropped the ball on this one. And the number of times that I have seen journalists interviewing some shill for the oil industry or somebody from a libertarian think tank like the Cato Institute — and this is something that has been driving me crazy for years: climate scientists get up and talk about climate change and then a journalist will interview someone from the Cato Institute challenging the science. That's completely inappropriate. If you want to talk about the policy aspects — like now that we know that this is happening, what do we do about it? — fair enough, invite the libertarians. But then you should also invite Greenpeace. The appropriate counter to Cato is Greenpeace. The appropriate counter to science is not Cato; it would be other scientists, if there's an actual debate. 

In the case of the causes of autism, yes, we don't know what causes autism. So there, let's say you had some guy on who has a theory that autism is caused by sugar — this is a real thing; I was just reading about this the other day. I have no idea if that's true or not, so don't quote me as saying that sugar causes autism, but if you thought it was an interesting hypothesis and you wanted to invite him on board and you wanted to see what evidence he had, then the appropriate counter is some other scientist who thinks that the cause of autism — I mean, I don't know, who doesn't think it's sugar. There are real scientific debates, and that's when you get scientists together. But when it's a political debate, then you don't invite a political person to counter scientific evidence. That's what philosophers would call a category mistake. 

Ariel Conn: Do you think we're sort of falling into — I don't know if fallacy is quite the right word — this idea that we need to get both sides of an argument, even when there isn't another side?

Naomi Oreskes: Yeah, absolutely. And I talk about this in the book. So it is a fallacy; it's the fallacy of false equivalence — the idea that every story has two sides. And you hear this all the time, but it's not true. I believe I say this in the book; I've certainly said this many times in other places: if there's a genuine scientific debate, then typically there's a lot more than two sides. My first book was on the debate over continental drift in the early 20th century, and in that debate there were five or six or seven major schools of thought about how to explain continental motions. And in my book I have a diagram that I love, an illustration that came out of the classroom notes of a geologist in the 1920s who studied tectonics with one of the world's most famous seismologists, Beno Gutenberg. 

And in that class he listed 21 different hypotheses to explain the motion of continents. So if you were doing some kind of TV show — we didn't have television in those days, but imagine we did — it would be reasonable to invite six or eight or even ten scientists on board to talk about all these different hypotheses. But when scientists come to an agreement on something and conclude that something is known — like we would now say we know the continents move — then the whole notion of sides makes no sense. And the whole framework of sides is very unhelpful. And so I think what a lot of journalists have done is they've taken a framework out of politics, where the notion of sides in a two-party system is sort of understandable — although even then you might argue that it's not all that helpful, but at least you have a political debate; you have Democrats and Republicans. Okay. That's a framework that we understand, and has some logical relationship to the political system that we live in. But the framework of two sides has no logical relationship to science. 

Ariel Conn: I've often wondered — and suspected, frankly — that one of the things that hurts science, and people's understanding of science, is maybe poor reporting of health-related issues, but even just reporting of health-related issues. We're constantly getting new updates to our scientific understanding of health, and sometimes that means that what we previously thought turns out to be false. My theory is that what happens is most people aren't really making a strong differentiation between health science and any other type of science. And so when they see all these issues related to health science and all the inconsistencies and changes, they then turn to all the other sciences and say, "Well, all science is like that." 

Naomi Oreskes: I think that's correct. When I was giving these public lectures and people would sometimes say, "Why should we trust the science," the other thing people would sometimes say is, "Well, why should we trust the science when we know that scientists are always changing their minds or always getting it wrong?" I got that a lot. So sometimes I would ask people, “Well, what science in particular are you thinking of?” And it was interesting: very often people would have no answer to that question. It was just a sort of vague sense that scientists are always changing their mind. But if they did have an answer, it was always nutrition, always. And this is why I started getting more interested in nutrition, to try to get a better understanding of what was going on there. So I think you're right, I think there's a few things going on. I think one is that nutrition is a very tough science; it's hard to pin down. 

So it means that you have a lot of suboptimal studies, and that means that drawing very firm conclusions is difficult. And it also means that new studies could challenge existing ones. It is true that shifting nutritional advice can give the impression that science is very unstable. And I do think you're right, I think people who don't follow other areas of science might falsely generalize from that to, say, climate science. Because if you look at climate science, you see that actually the advice from climate science has been extraordinarily stable for about 30 years now. There's actually been no reversals of fortune in climate science. 

But it's not just that. There's also, I think, two other things that are in a way worse than that. And one is the confounding effects of industry disinformation. One of the things that we're increasingly understanding is that the industry has actually been responsible for some of the misleading and confusing information. We now know that there was a lot of industry funding of research designed to blame fat for nutritional problems in order to distract attention from the harms of sugar. And this has been quite well-documented now. We also know that there's a lot of industry funding by the meat industry, and this appears to play a role in the papers that came out two weeks ago saying that meat is fine and just keep on eating. So whenever you have an industry with a vested interest in defending a product that is potentially harmful, if that industry gets involved in funding distracting research, then that muddies the waters very quickly. And we know that this has happened in nutrition.

And then the third thing has to do with the way journalists report on this. So a new study comes out, and if the new study contradicts accepted wisdom, often journalists will run with that story because this is exciting, and this is what happened with the meat story. Some papers get published that say the evidence for the harms of meat is weak — that was their claim; therefore, since people like meat, they should just keep on eating it. Now that was a very illogical claim, because even if the evidence is weak, it's still evidence — this gets back to the argument on evidence — and whether people like something and whether it's good for them are two different questions. We could say, “Yes, it's true. People like meat, and if they want to eat meat knowing that it's bad for them, it's a free country and they should have the right to do that.” And you could say the same thing about smoking, right? Yes, smoking is bad for you, but if knowing how bad it is, you still choose to smoke — well, there we have to have the caveat that actually it does harm other people, so you also have a bunch of restrictions there, but it's perfectly legitimate for you to smoke if you make that decision as a grownup. 

But those papers deliberately conflated those two different things, and to me that's highly suspicious. To say, "Well, you should keep on eating meat because you like it,” is to me an illegitimate argument in a paper that's supposed to be about the question of whether or not it's harmful. There's some real industry conflation here — distraction, disinformation — and that's made it very, very hard for people to figure out what's going on. But now add the journalists to it. This set of papers comes out with an aggressive press release from the journal, which is a whole other issue which we could talk about or not, depending on if you want to go there — there were a lot of layers in this beef issue. But the journalists ran with it, and had big headlines, “Maybe red meat's fine after all; keep on eating meat.” And so of course this got a huge amount of attention because it was unexpected, because it seemed to go against everything we know. 

And then within a couple of days — I mean the New York Times did a big spread on this and the very next day they're like, "Well, actually it turns out some of the authors might be linked to the beef industry, so maybe it might not be as good a study as we thought." So right away, immediately, you can imagine a meat lover says, "Oh my God, first it's bad for me, then it’s good for me, now we're not sure. Guess what? I'm going to just keep on eating hamburgers." Which of course is exactly what the industry wants. So journalists are actually playing into the hands of the industry when they do this, and one of the things we know from our work is that the industry knows that and they do this deliberately. They deliberately try to get the media to cover confounding or distracting studies in the hopes that people will be confused, and therefore they'll say, "Okay, well it's too confusing, so I just keep on smoking, I just keep on eating meat, I just keep on eating sugar." Because that's what they want. 

Ariel Conn: The journalists are clearly a problem there. And I guess the other question I have, then, is to me it doesn't seem that people who are not into science really understand the difference between profit-based industry science and, say, academic science. 

Naomi Oreskes: Yeah. I think that the distinction between profit-based science and academic science is not appreciated by most people — and frankly, here's where now we have enough blame to go around. I mean here's where academia and universities have played a role too. Because of the decrease in funding for scientific research in the last 30 years, universities have been very aggressive about soliciting industry partnerships, industry funding, and have really encouraged faculty to actively seek out private sector support. And that's not necessarily bad. There are certainly examples of private sector support for research that are good; I always do full disclosure, my PhD was funded in part by a mining company. So it's not that industry funding is necessarily bad or corrupting, but it can be bad. If we're going to have industry funding, we have to be much more careful, in my opinion, about how we do it, what our disclosure rules are. 

So for example, the Annals of Internal Medicine does have a disclosure policy, but it's a very weak one. And so what we found out in this beef story was that these authors did file a disclosure claiming they had no conflict of interest, but very quickly it came out that actually they did have conflicts of interest. But technically, they had in fact followed the disclosure policy of the Annals of Internal Medicine — they had followed the letter of the law, if not the spirit. 

And the journal said, "Well, we don't check; we rely on the good faith of the authors to reveal potential conflicts." I mean, I get it — the journal probably doesn't have the capacity. But if they're not checking, then how do we enforce it? And so my view is — two views on this. One is that, actually, a certain amount of checking would be in order. I think for a study like this that was so contra-suggestive, so counterintuitive, and which to my mind actually had a number of red flags that we could talk about if you're interested — but I think a study like that, the journal actually did have an obligation to do some due diligence. Because it only took the New York Times 24 hours to find out that a lot of these authors did in fact have industry connections, so that tells us that the journal could have found that out pretty quickly too. 

So I think that for really consequential papers having to do with public health, that seem to be surprising, I think journals do have an obligation to do some due diligence. However, I also think that the journal is right to say that they can't really be expected to be policing the disclosures of people. So I think that if it comes out after the fact that people have not disclosed relevant conflicts of interest, I think the papers should be retracted. Because unless there's some consequence to malfeasance, then the policy isn't really worth the paper it's written on. 

Ariel Conn: Could that be something that's asked of the reviewers of the paper, to also just do a quick Google search of the authors? 

Naomi Oreskes: Yes. I think it would be reasonable to ask reviewers to at least take into consideration whether there could be, or appear to be, the suggestion of potential conflicts. I will tell you — I have it right here on my desk, so I will read you something that I think is a red flag. The editorial is called Meat Consumption and Health: Food for Thought, and this ran in the Annals of Internal Medicine in the same issue that published the paper saying that there was no good evidence of any harm in eating as much red meat as you want. And the editorial was basically saying how great these papers were. So that already is sort of taking sides in a way that I think was unhelpful. I think it might've been more helpful to have an editorial that was a little bit more neutral, to say, “These papers will be controversial; they go against tremendous amounts of evidence we have, but it's right that we should have a serious discussion of them. It's right that they're being published.”

But here's the interesting thing. He says, “It may be time to stop producing observational research in this area.” This is the exact same thing that the tobacco industry did. The tobacco industry tried to claim that the whole science of epidemiology was no good because you could never prove causation, and therefore that epidemiological studies should not be allowed to be used. Well, we can laugh at that now, because obviously we know that epidemiology is really important and we know that it played a crucial role. But at the time, many people took that argument seriously and epidemiologists were put in this very defensive position where then they had to explain why their science was legitimate. The same thing has happened now: all the people who've done observational studies, now they have to defend the legitimacy of observational studies. So that's a huge red flag to me. Nobody should be saying that. A person should say, “Sometimes observational studies are the best thing we have. If that's all we have, then we have to use it because that's what we have.”

And the same with animal studies. Animal studies obviously are imperfect; no animal model is exactly the same as people. But we use animal models because there are all kinds of things we can't do to people, and we know that there are some animals — particularly pigs, for example — whose metabolism is a lot like people's. So it's not perfect, but we can get really good information out of animal studies. I raise this because there are now people in the industry saying we shouldn't use animal studies, that EPA should not be allowed to use animal studies to prove toxicity. So now if you eliminate animal studies, what's left? 

Ariel Conn: The issues that you're talking about with people who now have to defend their science and all of that, I think that ties back to the question I asked much earlier — for you to talk about the difference between facts and values and what scientists need to understand about communicating both. 

Naomi Oreskes: Well, the fact-value distinction is one about which hundreds of articles and books have been written. So people have spent a lot of time trying to sort this out, and I don't pretend that I can solve a problem that hundreds of people before may have not solved. But in a broad way, I think we can say the following thing: there are certain kinds of questions about the natural world that we believe science can answer. And that's why science exists: to answer questions about the laws of nature, about the origins of species, whether continents move. And these are questions that exist, in a sense, prior to us. The continents don't care what we think about them. Gravity doesn't care if you're a Republican or a Democrat. Acid rain falls on golf courses as well as organic farms. So nature exists prior to us — I mean, of course we have changed it in the Anthropocene, but it exists prior to us and it operates according to what we conventionally call the laws of nature. 

And the purpose of science is to understand what those laws of nature are, using the word law broadly. And traditionally, we view that project as being distinct from what we would call value questions such as what is the meaning of life? Is there a God? Is it better to smoke and be happy than not smoke and be unhappy? You know, anything that has to do with our preferences — what we like or don't like, what we want to be true, what we care about. And there's a long, large literature that traditionally has supported the notion that these two domains can be cleanly separated, and it's the job of science only to answer the fact questions. 

Now, in recent years that's been challenged for a variety of reasons, one of which is the one we've already discussed, that in reality human beings are not really able to separate these things so cleanly. And particularly, if I study something like coral reefs, it may be the case that I am motivated to study them because they're threatened and because I think they're beautiful and because I care about coral reefs, I want to study them. And therefore, my scientific work is actually intercalated with my values that tell me that coral reefs are beautiful and should be preserved. 

So the traditional scientific response to that is to say that I have to try to unpack those two things — even though in reality they're intercalated — and when I talk about my science, I should only talk about the fact piece. But I think one of the things we've learned is that it doesn't really work. We are motivated by the values and our listeners are motivated by the values, and in fact, when you bring up the values — like the fact that coral reefs are amazingly beautiful and awe-inspiring and might even actually make you believe in God — when you talk about those things, then you suddenly find that people who might not otherwise care about coral reefs are engaged: if you start talking about the facts, they're totally glazed over, but you start talking about the values, suddenly they're listening. So we can actually connect with people through the values piece, which has led me to believe the following — it's not to say that facts and values are the same thing, I'm not arguing that, and it's not to say that there isn't some virtue in still trying to be clear about what parts of the story are factual and what parts are value-based, but it's to say that some integration of the two could turn out to actually be a useful thing, particularly when we're trying to communicate to people why the science matters to them and to their lives. 

Ariel Conn: What do you think are some of the biggest misconceptions that scientists have regarding public perception of science? 

Naomi Oreskes: I think one perception is that people are stupid. A lot of scientists wouldn't say this out loud because they know it's not socially adept to say that, but I think a lot of scientists think that the reason why people don't understand science is because they're stupid or ignorant. 

Ariel Conn: I feel like we do get that a lot against people who don't want to vaccinate their kids. They’re constantly attacked for their intelligence. 

Naomi Oreskes: Right. But that's false, and we have lots of evidence that it’s false. And I always say scientists are actually being very unscientific when they say that, because they're not actually looking at the evidence. We have a lot of evidence that there are many highly educated people who reject science in particular domains — for example, particularly in climate science. One of the things we know from the evidence is that among Republicans, the more educated you are, the more likely you are to reject climate science. That's a little bit of a scary statistic, especially for those of us who believe in education. 

Ariel Conn: Are there theories about why? 

Naomi Oreskes: Yeah, there are. The theories are that the more highly educated Republicans are more likely to read The Wall Street Journal or Forbes, some of these magazines that have promoted climate change denial. Or they're more motivated: maybe you run a business that's dependent upon cheap fossil fuels. Then you are more motivated to try to find denialist arguments, so you go on the internet and you look for those arguments and you find them. There could be a number of different explanations for it, but that tells us this is not a problem of ignorance. Many of these people are intelligent, and the same with vaccine rejection. 

Seth Mnookin has a great book called The Panic Virus. One of the things he talks about is the families of children with autism who blame it on their vaccinations. And many of these people are not uneducated, but they are sad. Their children have a situation that's very hard to deal with and they don't know how to help their children. And frankly, modern medicine doesn't have a lot to offer them. And they're looking for someone to blame. And so if someone comes along and says, “Well, it's the vaccinations,” and sure enough the child's autism developed just around the time that they got their vaccinations, it seems to make sense. It's not that they're uneducated or stupid, but they're in a situation in which they've become motivated or incentivized to look for other explanations. 

The other big misconception is the one we started with, that there's a general crisis of trust in science — that's false, all the evidence shows us that's not true. But people do reject science in specific areas where they think the implications of science threaten their worldview or their economic interests or their religious beliefs. And that's crucially important because it tells us that you will not reach those people simply by giving them more factual information. 

This is what sociologists call implicatory denial. You deny something because you don't like its implications. The only way to address that is to talk about the implications. Here evolution is a good example: we know that many people who reject evolutionary theory reject it because they think it means that life is meaningless. If life is produced by a random purposeless process, then my life is meaningless. That's not true. I mean, life can have all kinds of meanings, and there are many atheists who think that life is very meaningful. There are many evolutionary biologists who are religious believers. So there are many ways to adjust the question of meaning within the framework of evolutionary biology. 

And there are people who've done this. John Howden is very articulate on this; Kenneth Miller from Brown University. There've been studies: at Arizona State University, they've done some very nice work where they show that if you take children — or I should say college-age students — who come from, say, evangelical Christian backgrounds and who begin with this idea that evolutionary theory is something that they do not like, but then you talk to them about this question of meaning and how biologists who are religious believers find meaning even in the face of random evolution, this can be transformative for many of these young people. So if you correctly diagnose the problem, and you don't blame it on stupidity or superstition but you actually engage with what it actually is, then you realize that there are options that can work. 

Ariel Conn: Climate change, I think, does hit all three of those points that you mentioned in terms of why people don't trust science. As we're doing this interview, The Guardian just came out — well, I read it in The Guardian, I don't know which paper published it first — but a discussion about how arguments against addressing climate change are shifting. We've established that climate change is happening, and so now the shift is to this idea that well, if it's already happening, it's too late for us to do anything, so there's no point in trying to address climate change. I guess I have a two part question for you there. One, what's your response to that? 

Naomi Oreskes: I mean that's just idiotic. Anybody who says that actually is either an idiot or they’re a shill for the industry, because that's just an extremely foolish thing to say, because we know that the more greenhouse gases in the environment, the worse it gets. Every day that we continue to delay, the problem gets worse. But if we get it under control, there's still the opportunity to avoid the worst-case disaster scenario. So that's just a factually incorrect argument. 

Ariel Conn: I think the next part is we're likely to continue to see the sort of evolving argument against climate change. Do you have ideas for how we can try to minimize that? 

Naomi Oreskes: What people have to understand about this is that it's not that the arguments are evolving; it’s that the climate change deniers have a Rolodex of different arguments that they pull out at different times. And we have seen many arguments go away and then come back. So you could say, “Well, nobody now would say there's no climate change because you can't just say that with a straight face.” But actually, back in 1997, people said the exact same thing and, guess what, it came back with a vengeance. And we have seen this now; if you've been following this issue for 20 years as I have been now, this is what you see. 

It means A, you can't get complacent. You can't believe that, okay, we've solved the outright denial of climate change; now it's just this fatalism or nihilism. No, that rationale will come back. In fact, it's still there on the internet. The blaming it on other things, we've seen that argument come and go. It's like the game of whack-a-mole. These arguments pop up and they get beaten down by scientists, so they go away for a while and they shift your attention to a bunch of other arguments, and then they say, "Oh, but wait, what about this one?" And then it comes back. This is part of how we know how cynical it really is, because if these were good faith arguments, they wouldn't return to an old discredited argument. And yet we see that happening all the time. 

Ariel Conn: Do you think we might do better if, rather than talking about all of the awful things that climate change is going to cause and all of the changes everyone needs to make, et cetera — I mean, we still need to talk about all of that — do you think we should be focusing more on how much better things would be if we do this? 

Naomi Oreskes: I think that's a “yes, and” or “yes, but” question. I think you're right and certainly Nick Stern, with whom I recently wrote an op-ed piece, is very strong on making that argument, that there's a positive vision of what a new economy based on renewable energy can look like, and it involves in many ways a much better life. I mean think about all the time we waste in traffic every day. How is that a good thing? And this is why I put the cartoon in the book: well, what if it's a big hoax and we create a better world for nothing? If we address the climate change issue, we can make a lot of other things better too. I definitely agree with that, and I have been making that case for a long time and I think a lot of people have. But at the same time, one of the things we know is that even though that positive vision has been painted by a lot of people, we still don't get rid of the denial. 

And that's because we have people out there who are just doing everything in their power to keep the fossil fuel economy alive and to continue to make profits by selling fossil fuels, come hell or literally high water. That tells me that this is fundamentally a political problem, that we have to fight the political power of the fossil fuel industry. And a lot of scientists don't like to talk about that because they don't want to be political. They went into science because they like science and they like things that are not value-laden. And now you say to them, “Well, actually this is a political problem.” That's a hard sell for a lot of scientists. So my view is, okay, that's fine. I mean, let's face it, there aren't really that many scientists in the world anyway. The scientists can keep doing science, because that's what they love and that's what they're good at. 

But for the rest of us, the message is we have to become politically activated, we have to fight against this industry that has basically bought Congress and is doing something that most of us don't want, that's not in our interest — it's not in our economic interests, it's not in the interest of our health, certainly not in the interest of the future prosperity of our children and our grandchildren. This is a really, really bad thing. And the only people who want it are the people who either are profiting from it or who have been misled by the arguments of the people who are profiting from it. And so we have to fight that. We have to find the disinformation and we have to fight what's essentially the corruption of our political system. So that's something that's hard. A lot of people don't want to hear that, especially people who thought they were coming to hear a talk about science, who thought they were going to listen to a podcast about science. But that's the lesson of all this, is that fundamentally it's about politics, and it's about fighting for a democracy that actually represents the interests of the American people. 

Ariel Conn: That is a perfect connection to the last question I have for you. What do we do? I think I've interviewed almost 30 people at this point, 30 experts — maybe even more because I've had multiple people in a couple of the shows — all talking about climate change, the threat, what we do about it. And honestly I feel a little bit more comfortable knowing some of the things we should be doing. I think it's easier to identify individual actions we should be taking. But in terms of getting political and trying to help get power away from the fossil fuels, that seems really daunting. 

Naomi Oreskes: It is daunting, but it has to be done, because the whole personal responsibility thing — that's a very tough one. Obviously, absolutely everyone should do what they personally can do, and if you have it in your power to make changes in your life, then you by all means should do it. I have solar panels on my roof, and I have greatly reduced the amount of meat I eat, and I'm trying as much as possible, when I get invited to conferences in far away places, to ask them if I could Skype instead. We definitely have to do the personal things, because otherwise we can feel incredibly disempowered, like there's nothing we can do except change the entire political system, right?

Ariel Conn: Uh-huh.

Naomi Oreskes: It is important to do those personal things. But as I like to say, I can change my light bulbs by myself, but I can't change my electricity grid. There are big structural issues that have to be addressed and we cannot solve this problem simply by changing our light bulbs. What you can do that's a little less daunting is, a lot of this could be done on the state level. Here in Massachusetts we have a renewable portfolio standard that's making a difference. I was able to do my solar panels in part because we got tax credits due to the renewable portfolio standard. The same thing in California, the same thing in New Jersey. We know that these policies make a difference, and they make a difference in part because they help empower people to do the right thing on the personal level. 

And it turns out that on the state level, it's a whole lot easier than on the federal level for a variety of different reasons. And when you get involved in state politics, I don’t want to say that it's always great, but I think that for many people there's a sense that our state government is more responsive than the federal government. In many states, not all, but in many states there's a feeling that it’s not as utterly bought as the Senate is right now, let's say. That can be a place to start. And also cities: cities have tremendous power because the vast majority of carbon emissions come from cities, because carbon emissions are linked to economic activity and most economic activity is taking place in cities. 

And here's something to feel good about: most cities in this country are progressive. Polls show the people who live in New York, Boston, Philadelphia, Chicago, Los Angeles, San Francisco, Denver, Seattle — all of the major cities in this country — all the major population centers are filled with people who want action on climate change. And if those cities would install renewable portfolio standards, carbon taxes, it would make a giant difference. And one of the things we know from history is that when cities and states act, sometimes the federal government follows. 

If we think about air pollution regulations, Clean Air Act, all of those great laws that clean up the air in the United States: in the 1960s, people were dropping dead in the streets from air pollution. This is something a lot of people have forgotten, or maybe young people never knew, but literally dropping dead on the streets of Los Angeles from air pollution. That doesn't happen anymore. We fixed that. And we fixed it through sensible laws. But who led the way? It wasn't the federal government; it was the state of California. And when the state of California began to move on air pollution control, the federal government followed. And I think that's what we're going to see here too. I mean, right now we already have California moving in a big way. We have New York, Massachusetts, New Jersey, some other states also starting to move. I think if enough of the states begin to move, we will see the federal government follow. Maybe not in this administration, but quite possibly in just a year or two from now. So that's the optimistic, don't-feel-depressed pep talk with which we can end the podcast. 

Ariel Conn: I like it. Is there anything else that you think is important to mention that we didn't get into? 

Naomi Oreskes: So it's a both-and answer. Yes, you should absolutely do what you can do on a personal level. And this is one reason why the nutrition thing is so important, because food is the thing that, as individuals, we have the most control over. We can start changing the way we eat tonight. And if you cut back on beef — and this is something that frankly the beef industry doesn't want you to know, but it's true — this is the total absolute win, win, win solution because A, it's better for your health; B, it's better for the planet; and C, it's cheaper. A healthy vegetarian meal costs a whole lot less money, in general, than a beef meal. So nutrition is a really, really good place to start. And then you can build out from there and then you'll feel better and you'll be healthier, so you'll be more empowered to take political action. 

Ariel Conn: All right. Well, thank you so much. I personally really enjoyed your book. I encourage everyone to read it. I will also add, there was a question I was going to ask about some of the different sciences that you looked at, and since we didn't get into that, I will mention that they're the Limited Energy Theory, continental drift, eugenics, birth control and depression, and the debate about dental floss. And I'm hoping that will help entice people to check out your book.

Naomi Oreskes: Great. Ariel, very nice speaking with you.

Ariel Conn: Yeah, you too.

I truly hope you’ve enjoyed these interviews and this episode specifically. As I mentioned at the beginning, we’ll round out the Not Cool climate podcast series with a short final episode recapping some of what we covered and some things I wish we’d covered. Thank you so much for listening, and as always, if you enjoyed this episode, please like it, share it, and maybe even leave a good review.
