Ten Strong Suggestions for Surviving the Age of Misinformation:

by A.J. Jacobs

 

My original title for this post was…

“Ten Commandments for Surviving the Age of Misinformation.” 

But then I realized the word “Commandment” was counterproductive. One of my favourite strategies for avoiding misinformation is to be wary of absolutist thinking. Hence, the full and official title for this post is: 

“Ten Strong Suggestions For Surviving the Age of Misinformation — Suggestions that Will Inevitably Have Exceptions and Nuances But that Might Serve as a Good Initial Guide.” 

Catchy, I know! 

 

I developed many of these strategies a couple of years ago while working on a book called Fact-checking My Life. The book was a reaction to the knowledge crisis we’re facing now: People don’t agree on basic facts. The media has splintered into different realities. Trust in science has tumbled. Misinformation and disinformation are everywhere. 

 

So, I decided to try an experiment. I’d attempt to figure out how to distinguish truth from falsity. I’d examine my most basic beliefs and attempt to determine how I know they’re true. I’d ask myself: How do I know the earth is round? How do I know my wife Julie loves me? How do I know that the Atlantic Monthly is more reliable than Newsmax?

 

After three months of research, I put this book project on hold. Why? Partly for my mental health. Immersing myself in the epistemic crisis was making me stressed and overwhelmed. This was mid-pandemic, so I was already stressed and overwhelmed. I decided to switch to another topic: puzzles, which were having a renaissance thanks to the worldwide quarantine. 

 

Maybe one day I’ll return to writing and researching Fact-checking My Life, but in the meantime, I thought I’d share some strategies I discovered while working on that project.

 

Here are my Ten Strong Suggestions: 

 

THOU SHALT APPROACH TRUTH-SEEKING AS A PUZZLE 

In the past, when I was talking to someone I disagreed with, I saw it as a battle. I went on the offensive, trying to beat my opponent down with evidence and statistics. That approach worked approximately zero per cent of the time. In fact, it was usually counterproductive: both of us just got angrier and more entrenched. 

 

Nowadays, when talking to someone from the other side of the political spectrum, I try to frame the discussion as a puzzle, not a war of words. It’s a mystery we can investigate together. I ask questions such as: Why does she believe what she believes? Why do I believe what I believe? What evidence, if any, would change her mind? And what would change my mind? Even if we disagree on some points, what do we agree on? And given those agreements, are there practical actions we could take that would work for both of us? 

 

I once heard a child psychologist use the phrase: “Get curious, not furious.” I love that phrase, and not just because it rhymes. I think it’s wise. I believe this curiosity-based approach is much more likely to result in effective, positive solutions. 

 

In fact, research shows this puzzle-solving approach might be the best way to actually evolve our views. One promising technique, called deep canvassing, involves having participants ask the puzzling question: Why do we believe what we believe?

I owe a debt of thanks to the puzzler mindset. Even though I put the truth project on hold in favour of puzzles, puzzles still inform my epistemology. 

 

 

THOU SHALT PAY HEED TO EXPERTS (PLURAL) BUT BE SKEPTICAL OF ANY ONE EXPERT (SINGULAR) 

I am a fan of expertise. But I’m not a fan of diehard devotion to a single expert. 

I first learned this while writing my book Drop Dead Healthy. You can usually find a study or scientist to back up almost any premise (see supporters of the Twinkie Diet, in which Twinkies supposedly help you shed pounds and lower your cholesterol). 

 

I prefer to look at the big picture. What do the majority of legitimate scientists say? It’s sort of the Rotten Tomatoes approach to judging scientific knowledge. Rotten Tomatoes is a site that aggregates movie reviews from dozens of critics. If a movie has a 90 per cent rating, it means that 90 per cent of reviewers liked the film. I find looking at Rotten Tomatoes more helpful than reading a single reviewer who might be the outlier who loved Winnie-The-Pooh: Blood and Honey (a horror movie about Christopher Robin’s murderous pudgy bear). 

 

For science, I gravitate toward meta-studies. Or even meta-meta-studies. These are studies in which researchers pool the results of many individual studies to see where the weight of evidence lies. 

 

It’s not a foolproof strategy. Meta-studies can have biases too. And sometimes the scientific establishment is mistaken. Sometimes that maverick scientist is correct. I’m guessing Galileo would have had a low Scientific Rotten Tomatoes score among his natural philosopher peers. 

 

So I don’t want to fetishize this heuristic. But in general, especially now that science has become so complicated that it’s hard for a layperson to judge the merits of a particular study, it’s a decent guide. 

 

NOTE: When trying to assess health and science reporting, I’ve found the Cochrane organization helpful. It conducts meta-studies and evaluates scientific data. 

 

THOU SHALT GIVE PERCENTAGES TO YOUR BELIEFS 

I am 90 per cent sure it’s a good idea to assign percentages to your beliefs. For instance, there are some beliefs I’m almost certain about. I’m 99.999 per cent sure the theory of evolution is true. That’s based on the evidence I’ve read and the experts I’ve talked to. But I’m not 100 per cent sure evolution is true. There could always be shocking new evidence that overturns my belief. That’s the point of science — it must be falsifiable. 

 

On the other hand, many of my other beliefs are much lower on the percentage scale. Based on the studies I’ve read, I’m 70 per cent confident that coffee is a net positive for my health. But I could be wrong. If I read a study tomorrow suggesting that coffee can damage our hearing, I might adjust my level of confidence to 65 per cent. (Note: there is no such study.)

 

In other words, most of my beliefs are not deep-seated. They are shallow-seated. They are open to being changed based on evidence. (My one true deep-seated belief is that we should be kind to each other.)

 

The fancy term for this type of thinking is Bayesian, named for Thomas Bayes, an eighteenth-century statistician and minister. And I’m a Bayes fanboy. Bayesian thinking involves adjusting your prior beliefs in light of new evidence. It’s about being nimble, about degrees of belief instead of absolutism. 
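The coffee example above can be sketched with Bayes’ rule itself. This is a toy illustration with made-up numbers (the likelihoods are hypothetical, not from any real study), just to show how a 70 per cent belief shifts modestly rather than collapsing when one new piece of evidence arrives:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of a belief H after seeing evidence E,
    via Bayes' rule: P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Illustrative numbers only: start 70% confident coffee is a net positive,
# then read a study reporting a harm. Assume such a study is somewhat more
# likely to appear if coffee really is harmful (0.6) than if it isn't (0.5).
prior = 0.70
posterior = bayes_update(prior, p_evidence_if_true=0.5, p_evidence_if_false=0.6)
print(round(posterior, 2))  # prints 0.66 -- the belief drops a little, not to zero
```

The nimbleness is in the arithmetic: one study with weak evidential force nudges the percentage a few points, which is exactly the 70-to-65-ish adjustment described above.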

 

Thinking in percentages is particularly important when trying to predict the future. I recommend two good books that discuss this: Superforecasting by Philip E. Tetlock and Dan Gardner, and Thinking in Bets by Annie Duke. 

 

To use a poker term, I’ve gone all in on putting percentages on my beliefs and predictions. I even use it in my interactions with Julie. 

“When will you be home?” Julie will text me. 

“There’s a 70 per cent chance I’ll be home by 6:30 p.m. And a 30 per cent chance it will be later.” 

There’s a 90 per cent chance she rolls her eyes at this. But I stand by it. 

 

THOU SHALT BE WARY OF HUMAN MEMORY   

More than a decade ago, I wrote an article for Esquire magazine about the fallibility of human memory. Here’s the first paragraph of that article… 

 

There are a lot of unreliable things in this world — WiFi service on Amtrak, meth addicts, Tour de France winners. But the most unreliable of them all? The human memory. It's a disaster. It's bad enough that we forget the vast majority of our lives. Study after study has shown that when we do recall an event, our memory of it is almost always self-serving, warped — or totally fabricated.

 

I stand by that paragraph. Except for the stuff about WiFi service on Amtrak, which has really improved. Also, the Tour de France winners are probably more honest nowadays (the article was written at the time of the Lance Armstrong scandal). 

 

The article itself was about an experiment I did: For three months, I recorded every moment of my life with a tiny video camera attached to my face. Part of my impetus for doing the experiment was to resolve arguments with Julie. If we got in a spat in which she said, “You never told me you were going out to dinner!” I could say, “Well, let’s look at the videotape!” 

It was a Black Mirror episode before Black Mirror existed.

 

In the end, recording one’s life is a mixed bag. It has some positives (I love some of the footage of my kids). But it comes with a ton of negatives (when I played back my arguments with Julie, it just got her more agitated). 

 

But the experiment did reinforce how often my memory is wrong. I’m talking about both recent memories and childhood memories (I have a vivid memory of visiting Epcot with my family. It is totally fabricated). 

 

Study after study shows the problems with memory. People are bad at identifying criminals in a lineup. Memories are easily manipulated. There’s a classic study in which people watch a film of a car crash and then have to estimate the speed of the cars. Their estimates vary depending on whether the questioner uses the word “smashed” or “bumped.” 

 

Accepting that your memory is fallible can be disturbing, but it can also be liberating. I’ve found it helpful in many ways. It makes me less stubborn and less likely to double down on a false claim. For instance, here’s how I used to answer a question from Julie. 

Julie: “Did I give you the car keys?” 

Former AJ: “Nope, absolutely not.” 

Versus how I do now. 

Julie: “Did I give you the car keys?” 

Current AJ: “I don’t have a memory of it, but it’s possible.”

And then I’ll spend a few seconds searching for keys in my backpack. 

Sometimes I even find them there. 

 

THOU SHALT EMBRACE SOURCES THAT ARE OPEN ABOUT THEIR MISTAKES  

Let me start this section by practicing what I preach: I’ve made a lot of mistakes in my books over the years. 

Some are typos: In the first edition of my book The Know-It-All, I misspelled the name of hockey legend Wayne Gretzky. 

I spelled it with an “s”: “Wayne Gretsky.” 

Wow. My Canadian readers were not happy. (I always thought Canadians were nice, and maybe they are, but not when you misspell their athletic icons). 

 

Some of my mistakes are statements that seemed true at the time but have subsequently been discredited. In Drop Dead Healthy, I made some now-dubious claims about the benefits of flossing. I made it sound like flossing was a magic bullet that prevented heart disease by cleansing bacteria in your mouth. Since then, I’ve discovered those claims are weakly supported and likely overstated. (Though if you floss, please keep flossing; I don’t want to get in trouble with your hygienist.) 

 

And sometimes, I’m mistaken about a bigger point. For instance, I’ve written and spoken about the importance of delusional optimism when starting any project. I argued that you need to be convinced your book/business/movie would be a smash success, or you wouldn’t have the motivation to persist. 

 

But I’ve tempered that position. I think delusional optimism can sometimes help. But it’s not necessary in all cases. And sometimes, it’s best to balance the delusional optimism with cold reality. (For more on this, I recommend Julia Galef’s book, The Scout Mindset). 

 

As I type this, I realise I need a section on my website that lists all the incorrect or misleading information in my books and articles. So look for that soon(ish). 

 

Regardless, I encourage everyone to admit their mistaken beliefs. It should not be seen as a weakness. It should not be derided as “flip-flopping.” We need to reward people who evolve their thinking and admit past mistakes. Humans — at least those in current society — are drawn to certitude. We need to fight against that urge.  

 

My sense is that those politicians and media outlets that routinely admit they’re wrong are more reliable in the first place. They’re more likely to care about the truth, so they’re less likely to make false statements. (I’d love to see an empirical study on this.) 

 

Now, as I mentioned in the extended headline, life is complicated. There are always exceptions and nuances. So sometimes, when a politician claims their previous stance was “wrong,” it’s not a show of courage. It’s a way to pander to voters or curry favour with a bully in their political party. But in general, I’m a fan of those who fess up to their errors. 

 

So that’s Strong Suggestion Number Five. Originally, I was going to put all ten suggestions in a single post. But I’ve come to realise I was wrong — that post would be too long. Plus I’m already behind deadline. 

So I’ve decided to split this post into two parts. This is Part One of the Strong Suggestions. Part Two will come in a future newsletter.