...how to look out for vaccine misinformation?

Ever Wonder? / June 9, 2021

Portrait of Tara Haelle, an independent science journalist (image: Jim Coventry)

A few weeks ago, we spoke to journalist Tara Haelle (@tarahaelle), who’s written about vaccines and vaccine hesitancy for a decade. If you haven’t already listened to her first episode, where she breaks down how to make sense of all the news around COVID vaccines, please go check it out. We had such a great conversation with Tara that we couldn’t fit everything in a single episode—so now we’re airing more.

There’s so much news coming out about COVID vaccines, it can be hard to keep up. And it doesn’t help that our best efforts to make rational decisions can be thwarted at every turn. It turns out that our own brains can sometimes mislead us, making it harder to judge whether what we’re learning is likely to be true or not.

Do you ever wonder how to look out for vaccine misinformation?

Tara had some great advice for dealing with misinformation. And she explained why humans are so bad at evaluating risk and what we can try to do to be better at it.

Have a question you've been wondering about? Send an email or voice recording to the podcast team to tell us what you'd like to hear in future episodes.

Subscribe to our show on Apple Podcasts, Spotify, or Google Podcasts. To see a full list of episodes, visit our show’s webpage.


Perry Roth-Johnson (00:06):

Hello! This is Ever Wonder? from the California Science Center. I'm Perry Roth-Johnson. A few weeks ago, we spoke to journalist Tara Haelle, who's written about vaccines and vaccine hesitancy for a decade. If you haven't already listened to her first episode, where she breaks down how to make sense of all the news around COVID vaccines, please go check it out. We had such a great conversation with Tara that we couldn't fit everything in a single episode—so now we're airing more. There's so much news coming out about COVID vaccines that it can be hard to keep up. And it doesn't help that our best efforts to make rational decisions can be thwarted at every turn. It turns out that our own brains can sometimes mislead us, making it harder to judge whether what we're learning is likely to be true or not. Do you ever wonder how to look out for vaccine misinformation? Tara had some great advice for dealing with misinformation. And she explained why humans are so bad at evaluating risk and what we can try to do to be better at it. Take a listen.

Devin Waller (01:11):

Evaluating risk has been an important part of navigating the pandemic in our daily lives. Um, we've had to make risk assessments about what activities and choices to make in order to slow the spread of this disease. And now with the vaccines available, we have to evaluate the benefits versus risks of getting vaccinated. So, first of all, why are humans so bad at evaluating risks to begin with? And how can we get better at that?

Tara Haelle (01:36):

I love the way you phrased that, because it acknowledges right off the bat that we are really, really lousy at assessing risks. We just are. Oh, well, okay—we're good at assessing the risks that our ancestors had to worry about. We're good at hearing a lion roar and knowing that we need to get into that cave fast. Okay? That kind of risk we're good at, because that's how our brains developed to assess risk. But today there are not lions chasing us into caves. We don't have to worry about that anymore. We have to decide if getting onto a giant, several-ton piece of metal that supposedly stays in the air by itself to go from New York to L.A. is safe. And if I described it that way, people are like, "Oh no, that's not safe!" But hey, airplanes are safer than cars. So, um, our brains rely more heavily on concrete things. And when you say that there is a 1 in 500,000 chance of something happening, that's abstract. How do we, well, even more abstract than that is if we say there's a 0.025% chance of something happening, right?

Tara Haelle (02:40):

What does that mean? Right? And sometimes what helps us is to use an analogy. So if someone said there's a 1 in 500,000 chance of something happening, I think about where I went to high school, which is Arlington, Texas, and that city is about 350,000 people. And I think, okay, so let me add another 150,000 onto that. Well, I don't know how big Mansfield is next door. Let's say Mansfield is 150,000 people. So 1 in 500,000 is one person out of the entire combined population of Arlington and Mansfield. Okay, well, that actually makes me understand it more. Now I actually have something to grab onto that makes sense to me. So you really have to use that. Um, and also, our sense of risk is affected by those same biases I've talked about before. And one of the most common ones is availability bias. Availability bias means that we fixate more on something that we heard about more recently, or that sounds more dramatic. And that stands out as a bigger risk than the more mundane things that we're more familiar with. So if you're going to the beach and you watched Jaws a couple of weeks earlier (which I don't advise, by the way), your friend is probably going to have to drag you, kicking and screaming, into that water, because you're going to be worried that there's a shark in the water. Never mind the fact that sharks are mostly out around dawn and dusk and are more scared of you than you are of them. Or the fact that we kill about a hundred million sharks a year and they kill maybe five of us a year, right? Those stats, those numbers, aren't what's in your head. What's in your head is, "Oh my God, I'm going to die of a shark attack!"

Tara Haelle (04:14):

And also not in your head is the fact that the undertow might cause you to drown or that the sunburn is gonna, you know, give you a higher risk of skin cancer.

Tara Haelle (04:25):

Or that simply driving there, you have a risk of getting into a car accident. All of those are much, much, much greater risks than a shark encounter, but that's not what you're thinking about, because it's not the dramatic thing that stands out in your head. And that's happened a lot with vaccines—with the vaccines and autism thing. "Well, I heard that my next door neighbor's cousin's daughter got autism after she had the vaccine." First of all, we know now that that was a coincidence. We know that there's no connection. But even if we didn't know that yet, you haven't heard from the other 200,000 daughters of cousins of neighbors on your block who got a vaccine and nothing happened, right? You've only heard about that one. So our brains latch onto those risks, and it warps our sense of risk. How can we get better at it? Um, one is what I said already: using those analogies to try and make some sense of it. Um, try to focus, and this is hard. Try to focus on the numbers rather than the emotions, or examine your emotions and try to pinpoint where those emotions are coming from. And I say that's hard because it's kind of like telling somebody who's having an anxiety attack, "Have you tried not having anxiety?" Right?

Tara Haelle (05:45):

If you're uptight or anxious about something and someone says, "Calm down," what's the first thing you do? "I'm not going to calm down!" And you get more angry, right? So I hesitate to say that because, you know, easier said than done, right? Um, but it is hard, and you have to compare risks. Um, with the blood clots with the Johnson & Johnson vaccine, we do know that those were almost certainly caused by the vaccine. And they also occur—um, what was it—six cases in 7 million shots, or something like that. Compare that to people who get COVID and are not hospitalized: one in 100 to 125 of them will have a blood clot. When you're hospitalized for COVID, it's one in 20: for every 20 people who are in the hospital for COVID, one of them is going to get a blood clot. If you're in the ICU, it's one in five. (Source: https://elemental.medium.com/everything-we-know-about-blood-clots-and-the-johnson-johnson-covid-19-vaccine-so-far-69015afe0296)
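[Editor's note: as a back-of-the-envelope sketch (not from the episode; the inputs are the approximate figures Tara quotes above, taking the 1-in-125 end of her non-hospitalized range), the comparison can be made concrete:]

```python
# Rough comparison of the blood clot rates quoted above.
vaccine_clot_rate = 6 / 7_000_000    # ~6 cases per 7 million J&J shots
covid_mild_clot_rate = 1 / 125       # non-hospitalized COVID (quoted as 1 in 100 to 125)
covid_hospital_clot_rate = 1 / 20    # hospitalized COVID
covid_icu_clot_rate = 1 / 5          # ICU COVID

# How many times more likely is a clot from even mild COVID than from the vaccine?
ratio = covid_mild_clot_rate / vaccine_clot_rate

print(f"Vaccine clot risk: about {vaccine_clot_rate * 1_000_000:.1f} per million shots")
print(f"Clot risk from non-hospitalized COVID is roughly {ratio:,.0f}x the vaccine's")
```

Even using the more conservative end of the quoted range, the clot risk from a mild COVID case works out to thousands of times the reported vaccine risk, which is the kind of side-by-side comparison Tara is recommending.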

Tara Haelle (06:53):

So, you know, put that into perspective. Compare those risks. And it's hard to do that, but that's what you have to do. You have to really look at those risks by the numbers and then say, "It is totally rational in a human sense for me to be scared that I'm going to be the one in a million, but the odds are much higher that I'm going to get COVID and develop a blood clot." And that's hard to do. I don't have an easy solution to that, because it is a natural human inclination to worry that you're going to be the one. That makes sense. I'm not going to say that's irrational. I mean, I guess it is technically irrational, but it is built into our DNA. So, you know, you're not crazy if that's the way you think; you just have to find ways to overcome it.

Perry Roth-Johnson (07:38):

Yeah. It's not unreasonable. I mean, I've been in those conversations personally. There's another analogy I heard that I liked because it involved such a common drug. I think it was Dr. Francis Collins, the head of the NIH, on Meet the Press. He said you're less likely to get a blood clot from J&J—I'm paraphrasing here—than to be struck by lightning next year, and that the risk of aspirin inducing a significant intestinal bleed is much higher than what we're talking about here. And I take aspirin all the time!

Tara Haelle (08:11):

Yeah, exactly. It's always a benefit-risk tradeoff. I think one thing that helps with risk is to keep in mind that there's no such thing as a risk-free option. Another example of a risk is—actually, if you'll indulge me for a minute, I'm going to read a quote from someone that I think your audience will know, no matter who's in the audience. One of our founding fathers, the inventor of loads of things, quite the dilettante in France: Benjamin Franklin. Benjamin Franklin had the opportunity to inoculate his son against smallpox. Now, vaccination is a form of inoculation, but this was different. This was variolation, which predated vaccination, and it did carry risks, many more risks than vaccination. But his quote was, "In 1736, I lost one of my sons, a fine boy of four years old, by the smallpox taken in the common way." In other words, he caught smallpox in the wild and died. "I long regretted bitterly and still regret that I had not given it to him by inoculation." And he says, you know, for those who are making the same decision, know that the pain is just as bad either way. And what he was demonstrating is something called omission bias, which is when you believe that you'll escape the risk by omitting the action. If you don't do the action, you feel less responsible if something bad happens to you, because you chose not to act. "So if I don't get the vaccine and I get COVID, well, that's just bad luck. But if I get the vaccine and I get an adverse event, well, that's my fault because I got the vaccine." That's faulty thinking, because not acting is also an active choice. And the best comparison is seatbelts. If you put on the seatbelt and then you get injured in an accident as a result of having put it on, the right response is not "I shouldn't have put on the seatbelt," right? Because, you know, far more often, the seatbelt's going to save your life.
It's not that no one's ever been hurt by a seatbelt, but the relative risk definitely favors wearing one. So that's a really hard one. Just keep in mind that there is no such thing as a risk-free choice in life. Staying in your house nonstop—well, not during a pandemic. I'm not going to use that example. Scratch that.

Perry Roth-Johnson (10:30):

That's okay.

Tara Haelle (10:30):

I can't use that example anymore.

Devin Waller (10:30):

That's fair. Yeah.

Tara Haelle (10:30):

Drinking water, you could choke on the water. And a lot of times, the way to overcome those fears is to do something right away. You know, if you fall off your bike, you get back on the bike as soon as you can. If you go into the water and swallow it and it gets up your nose in the pool, the best thing to do is to get back in the pool, as long as someone's there with you. One time I was eating fruit at a bicycle workshop and I choked on some honeydew. It was stuck in my throat and I couldn't breathe, and someone had to do the Heimlich on me. And the very first thing I did after it came out was take some breaths, drink some water, and then eat another piece of honeydew, because I knew that if I didn't, I would never touch honeydew again. Um, so, you know, you have to examine your own fears and really think about, "Why am I scared of this? And even if it feels this way, what's the rational response?"

Devin Waller (11:30):

There's been an onslaught of misinformation and even disinformation online. What should people look for to avoid falling into these traps?

Tara Haelle (11:39):

Well, first of all, it's helpful for people to understand the difference between misinformation and disinformation. Misinformation is just what it sounds like: it's inaccurate information, regardless of intent. It's just not factual. It may be accidentally inaccurate, it may be intentionally inaccurate, but it's just not factual information. Disinformation is an intentional attempt to mislead. The person misleading you either knows that it's not accurate information and is intentionally lying, or they have made a conscious effort not to find contradictory information. They haven't made an intellectually honest attempt to confirm that it's accurate, right? They've been neglectful. Um, so that's one thing to keep in mind. And there are lots of motivations for disinformation agents. The two biggest ones are usually political and financial. You know, in the anti-vaccine community, there's a lot of money in being an anti-vaccine advocate. There's a lot of money in it. In terms of being able to recognize it, the first thing you should do is cross-check anything you find. What are other outlets saying? Be wary if you're only seeing it in a certain type of news, like only on super liberal blogs or super conservative blogs, or if you're seeing it on a site that's selling something. If I find a doctor who is selling something on their site, that doesn't mean that doctor is not a good doctor or is pseudo-scientific, but I'm not going to trust what's on their site, because I can't be sure that what they're telling me isn't connected to what they're selling. So I look at what they're selling; what is their motivation? It's sort of like being back in school with Author's Purpose: "What's the purpose of the writer? Why is this person communicating this information? What do they have to gain from it?"
And a lot of people think... this is kind of ironic, because there are people who think that journalists have this, um, agenda, right? And there are some journalists who have agendas, and there are opinion journalists out there. Um, there are journalism outlets that might have a particular point of view. But in terms of straight news, the average journalist's goal is to serve the public by providing them with accurate and thorough information, and then getting a paycheck for doing that. And that paycheck doesn't come from a company beyond the one that's hired them to write the story. So if I write something for the Washington Post, I want my paycheck from the Washington Post. Um, now, the Washington Post is owned by Jeff Bezos, but I don't have anything to do with him. He doesn't know me. I don't know him. It's only my editors at the Washington Post that I'm dealing with. So: thinking about Author's Purpose, cross-checking, using fact-checkers, and looking for things that seem too good to be true or too scary. Now, that last one is tricky, because at the start of the pandemic, we were all terrified.

Devin Waller (14:43):

Uh huh. Yeah.

Tara Haelle (14:43):

And frankly, we had good reason to be. So, you know, I'm not saying that just because the news is scary, that automatically means it's not true, but it means have a higher threshold of skepticism about it. In the case of the pandemic, the reason I was like, "Okay, there's good reason to be scared," is because the infectious disease experts I followed were scared. In fact, a friend asked me, oh gosh, this was maybe six or eight months ago. She said, "Tara, how do I know when to worry about the variants?" Or something like that. And I said, "When Anthony Fauci worries, then you worry." She said, "Okay." And it was funny, because about a month ago she sent me a text. She sent me a headline with Fauci, and she was like, "Fauci is worried. Now I'm worried." And I said, "You have a right to be."

Tara Haelle (15:29):

So you find the experts and see when they worry; then you worry. If they're feeling good, then you should feel good. So I think where it gets most challenging with misinformation is when it has to do with political or ideological things, where it's not a matter of fact or non-fact. There are facts that can support one position and facts that can support another position, but neither of those positions can be considered the "right" position. When you have things like that, it's really hard to pick apart when you might be misinformed, because you might be following someone who shares your values and your ideas and worldview and ideology. That doesn't mean they're not misleading you sometimes, or misleading themselves and sharing that with you without realizing it. So I would encourage people to regularly examine their own biases. We all have biases. You cannot escape bias. It is a human trait. In fact, dogs have biases, okay? It's how they get trained.

Perry Roth-Johnson (16:28):

I've never thought about that.

Tara Haelle (16:32):

Well, I mean, you train them with association, right? So it's a reality. It's in our DNA. Okay, it's in our neurons. We can't not have bias. What we can do is be aware that that bias is there, look for it, and then find ways to examine it. That doesn't mean counter it, necessarily, but examine it. "Am I being as thoughtful about this issue as I could be?" Expose yourself to people who are different from you and people who have different ideas from you who are respected in their circles, okay? If you're a Republican, who do the Democrats support? If you're a Democrat, who do the Republicans look up to? And look at what they're saying. And I'm not saying you won't sometimes get angry or upset or frustrated by looking at it. But expose yourself to that so that you know you're not staying in an echo chamber, and that will help you. Um, one of the most insidious types of bias is confirmation bias, which means that you are unconsciously looking for things, or clicking on things, or exposing yourself to things that already fit with what you believe. So it reaffirms, it confirms, what you already believe. So you have to make a conscious effort to look for things that do not necessarily fit with what you agree with. It doesn't mean it's going to change your mind, but you have to examine them and make sure that you're not leaving out things you hadn't thought of. Otherwise, you won't be exposed to new information that might change your mind. Um, I laugh because someone in my family would often joke, "Tara is always right. Tara thinks she's always right." And I actually got really offended when they said that, because it's a good day when I find out I'm wrong. It's good to find out you're wrong. It's good to tell people you're wrong. One time someone asked on Twitter, "Who has ever admitted they were wrong on something? Find some tweets." And I went back and easily found like 15 tweets.
And said, "Look, I was wrong about this. And I was wrong about that. And I was..." I like being wrong, because it means that I'm still learning. I'm still bringing in new information. And I hate to use the word "arrogant," but I am humble enough to realize that I am never going to know everything. And sometimes I'm going to have ideas based on information that's limited to a certain point, and then when I get more information, that might change what I believe. And changing my mind is not a weakness. It doesn't make me a flip-flopper, like we hear in politics sometimes. It means that I'm constantly taking in new information and re-evaluating what I think and what I believe based on that new information. And guess what? That's how science works! And, you know, that's why scientists change their minds. Um, you know, I never understood the idea of blaming a politician for being a flip-flopper, because to me, a politician changing their mind is evidence that they're taking in new information and reconsidering things. And I think that's a positive attribute. So, when I was working on my book, The Informed Parent, there were two topics that I knew I had my own confirmation bias on. I knew that I felt strongly about them, and I knew there were opposing opinions in the community. And when I wrote those sections, I set out to prove myself wrong. I acted like I was a debater preparing to argue the opposite side of what I believed, and I explicitly looked for evidence to support the other side. As it turned out, one of my beliefs was further affirmed; I felt even more strongly about it after doing that exercise. On the other one, I did a 180. I realized I had been wrong, and I completely changed my perspective on it, and it changed the way I wrote about it. So I think that's an important exercise, and it shows that it can happen both ways.
You know, in one case I found more evidence that I didn't even know about that supported what I thought. Um, but on the other one, I found evidence that I wasn't aware of and flaws in the evidence that I had been relying on that I hadn't considered. Um, and both of those made me a better science consumer. And both of those made me a better journalist.

Perry Roth-Johnson (20:46):

Got it. That's our show, and thanks for listening! Until next time, keep wondering. Ever Wonder? from the California Science Center is produced by me, Perry Roth-Johnson, along with Devin Waller. Liz Roth-Johnson is our editor. Theme music provided by Michael Nickolas and Pond5. We'll drop new episodes every other Wednesday. If you're a fan of the show, be sure to subscribe and leave us a rating or review on Apple Podcasts—it really helps other people discover our show. Have a question you've been wondering about? Send an email or voice recording to everwonder@californiasciencecenter.org, to tell us what you'd like to hear in future episodes.