Episode 32 – Cognitive Bias and Global Warming, or… the Story of Cattle Prods and Ice Cream Shops

March 10, 2021


If only we were as rational as we think we are! It turns out that we’re all subject to cognitive biases, those errors in thinking that influence how we process the complex information we encounter in daily life. Jason, Rob, and Asher take a tour of ice cream shops, Scandinavian DMVs, and the chess team to explain such cognitive biases as the Dunning-Kruger effect, confirmation bias, default effect, and sunk cost bias. Listen as your hosts try to overcome their own biases to uncover how human irrationality has driven us into a sustainability crisis where climate change meets overshoot. Super-brainy brain scientist™ Dr. Peter Whybrow joins the program to shed light on why we act the way we do and to propose ways to work with our reflexive side, restructure some of our institutions, and act with an eye toward the long term.

 


Show Notes

Listen to Episode 32

Transcript

Asher Miller 0:00
Hi, I’m Asher Miller.

Jason Bradford 0:02
I’m Jason Bradford.

Rob Dietz 0:03
And I’m Rob Dietz. Welcome to Crazy Town where Margaret Atwood, George Orwell and Aldous Huxley tell happy fun stories around the campfire. Today’s topic is cognitive bias and how it leads into all sorts of irrational behavior. And stay tuned for an interview with renowned psychiatrist, neuroscientist, and author, Dr. Peter Whybrow.

Jason Bradford 0:27
Okay, I’m going to tell a story. It’s not a single incident. This is sort of like what always seems to happen when we would go out with my kids. And a good example is the ice cream shop. Okay, so I got twin boys, so I treat them exactly the same. Okay, love them equally.

Asher Miller 0:42
He’s just saying that because they listen, right?

Jason Bradford 0:45
Well, yeah, but they’re fraternal twins.

Rob Dietz 0:48
Oh, so you could treat one way better than the other, like a social experiment.

Jason Bradford 0:52
One’s in the closet; one we hug, whatever.

Asher Miller 0:54
That explains why one is so much darker skinned than the other.

Jason Bradford 0:58
Yeah, exactly. So anyway, we go out to an ice cream shop, and one of my boys walks in — Curtis. And within 20 seconds, they’re serving him up. It’s chocolate. He’s got a chocolate, chocolate, chocolate, chocolate, whatever. He knows what he wants. He gets in, gets out. He’s licking, he’s going, he’s happy. Davis, the other one — he’s taking an embarrassing number of samples. And he’s doing every permutation possible…

Asher Miller 1:23
And there’s a line out the door.

Jason Bradford 1:25
Right? And we’re all done, you know, and he’s still there, trying to figure this out. And so this introduced [the idea that] when you have kids, you really see how, I didn’t do anything, but they have totally different ways of thinking and moving through the world.

Asher Miller 1:44
Right, it’s clearly not a nurture situation.

Rob Dietz 1:47
I’m still stuck on what is an embarrassing number of samples to try.

Asher Miller 1:52
1,427 combinations.

Rob Dietz 1:54
Yeah, I’d like one of every single thing you have!

Asher Miller 1:58
Think about it: 39 flavors times each other?

Jason Bradford 2:01
I know, so anyway, I’m bringing this up, because we’re exploring hidden drivers this season. And I think one of the things we should talk about are called cognitive biases.

Rob Dietz 2:12
Yeah. And before we get into cognitive biases, I want to explain to our listeners this idea for Season Three of hidden drivers. And what we’re doing is we’re looking at: what are the things that push us like a cow at the end of a cattle prod into Crazy Town? What are these hidden things about how we think? Hidden things in the culture? Hidden things about the world and the way it works that have pushed us into Crazy Town, where we’re consuming too many resources, taking a bite out of the earth faster than it can handle it.

Asher Miller 2:44
Yeah. Great explanation, Rob, about what we’re doing this season, but a cattle prod is a hidden driver?

Rob Dietz 2:52
Yeah, it’s a hidden cattle prod.

Asher Miller 2:55
Not the best metaphor! I think you gotta work on that. Okay, so cognitive biases… I think it’s a great idea to start with that, Jason. But before we get into them, can we at least define what we mean by cognitive biases?

Jason Bradford 3:07
Yeah, go for it.

Asher Miller 3:08
I’m just gonna go back and forth, by the way between bias-es and bias-ees, because I can’t…

Jason Bradford 3:12
I don’t know – to-may-to, to-mah-to.

Asher Miller 3:14
So a cognitive bias, as far as I understand, is this pattern of thinking that deviates from what’s considered rational or normal, right? Is that fair?

Jason Bradford 3:27
Yeah, well, it may be normal because a lot of people do it. But yes, it doesn’t adhere to the notion of logic and reason and…

Asher Miller 3:29
Right.

Jason Bradford 3:29
Right. It deviates from that.

Rob Dietz 3:38
So this is a fancy way of saying that we’re idiots.

Jason Bradford 3:41
Well, I don’t know. I mean, let’s get back to that story, okay, and explain something important about it.

Rob Dietz 3:50
I’m not calling your kids, by the way, okay? Let’s just get that straight.

Jason Bradford 3:55
Because what my kids were doing was bringing up a classic conundrum. To get by in the world, we ultimately need to make decisions, we make choices. But how do we do this? Because our time and our information are not infinite. We don’t have enough information. We don’t have an infinite amount of time.

Rob Dietz 4:14
Well, Davis might have an infinite amount of information with the tastings.

Jason Bradford 4:18
Yeah, exactly.

Asher Miller 4:19
Or he thinks he does.

Jason Bradford 4:20
Right. So this is what’s interesting: what Curtis is doing is very quickly taking a bit of information. He’s looking quickly at what’s available for flavors, and then he’s letting his feelings, based on his past experience, guide him very quickly to resolution. And others, like Davis, are computationally processing and trying to optimize. It’s the idea of: I need to optimize the decision, irrespective of time and information. It’s almost impossible, right? So we can see there are these trade-offs at work here. And what Curtis is doing is emphasizing a quick process, called heuristics in psychology. He’s letting a lot of his unconscious feelings lead him to a resolution.

Rob Dietz 5:08
So heuristics… I don’t like these big fancy terms, as you know. So I like to think of it as rule of thumb, maybe or just some way of coming to a quick answer.

Jason Bradford 5:20
Yes, it’s coming to a good enough answer. And so It’s actually called bounded rationality. That’s the bias. It’s saying it’s a good enough, it may not be perfect, but that’s good enough. And bounded rationality is one of these cognitive biases. But we’re not saying it’s necessarily irrational, in a sense. It makes sense in a situation like when you’re in an ice cream shop. For God’s sakes, you’re not gonna spend an hour.

Rob Dietz 5:45
It’s nice to have some bounded rationality in this situation.

Jason Bradford 5:49
Now, what you’re seeing with Davis is he’s suffering what’s called choice overload. That’s another cognitive bias. Okay? If there were only six flavors available, it’s a lot easier for Davis. Okay. So anyway, heuristics can usually work.

Asher Miller 6:03
So you never take him to grocery stores is what you’re saying.

Jason Bradford 6:05
Oh, he’s so amazing in the grocery store! Oh, he’s still really good. He’s 21 now, and when he shops, he’s looking at every ingredient and comparing things. It takes him frickin’ forever!

Asher Miller 6:16
So it’s a full-day outing.

Jason Bradford 6:17
Yeah, it’s really incredible. So anyway, I was talking to Davis about this. And I said, “You know, hey, I’m gonna do the show, and I want to use this example.” And I said, “The problem with you Davis is that if the lion starts chasing you on the savanna, you’re going to start thinking: okay, if I take this path through the ravine, and I can climb this tree, but this one goes across the river… you’re already dead.”

Rob Dietz 6:38
Meanwhile, the rest of us just start running. And we’re already dead too.

Asher Miller 6:43
And Curtis is riding on the lion’s back.

Jason Bradford 6:46
So you can see there are different modes of thinking to use at different times, and different ways we should be going about our lives. And some people are going to emphasize heuristics, but heuristics generally lead to these cognitive biases. So that’s what is important: our past influences the heuristics we’re going to use. And it’s not always a good guide to how we should be going about the present, the problem in front of us, and this is called status quo bias. So we may not truly recognize novel situations, and then we make poor heuristic decisions as a result, and that’s called plan continuation bias.

Rob Dietz 7:29
Okay, I’m glazing over here, we got so many biases…

Jason Bradford 7:33
There are so many!

Asher Miller 7:33
Oh, by the way, if you want to… I think Wikipedia has a full list of 185.

Jason Bradford 7:33
I counted 203, but I don’t know, maybe I double counted here. So yeah, you can go to the wiki — we’ll have links in our show notes — and you can see the hundreds of biases out there.

Asher Miller 7:49
My goal is to use as many of them in a single day as possible.

Jason Bradford 7:56
Hey, why don’t we do an exercise? You guys throw out one of your favorite cognitive biases.

Rob Dietz 8:01
Okay, let’s talk about but before we do that exercise, can we just highlight the Pacific treefrog that’s chirping in the background. Is that what they do? Chirp? Croak, I guess.

Jason Bradford 8:10
Something like that. So that’s a resident treefrog.

Rob Dietz 8:12
I’m gonna go scare it away real quick while you continue.

Jason Bradford 8:16
So the treefrog will go off now and then. It’s a happy feature of our studio nowadays.

Asher Miller 8:22
Yeah, I don’t know why we’re shutting it up.

Jason Bradford 8:23
Yeah, it’s good guy. Okay,

Asher Miller 8:26
Hey, that actually worked, Rob!

Jason Bradford 8:27
You just got to walk over there.

Asher Miller 8:29
You walk over and wave at it? It’s more like a motion-sensing frog.

Rob Dietz 8:34
I can intimidate the hell out of Pacific treefrogs. It’s the only animal on earth that’s scared of me.

Jason Bradford 8:39
Okay, who’s got a cognitive bias?

Rob Dietz 8:43
I’ll come in with one. I don’t know very many, but this one I’ve known for a while. It’s the confirmation bias. And this is the tendency to interpret things, to focus on things, or remember information in ways that confirm your own preconceptions. So you already think a certain way, and any bit of information, like on social media, that comes your way that says you’re right… you’re kind of like, “That’s right. I’m right.” And it’s like this upward spiral. And if your information is bad, or you had a an incorrect view, you just you keep doubling down on it, getting more and more faith in it.

Jason Bradford 9:22
Yeah, a big problem nowadays, obviously, with social media and living in different realities almost.

Rob Dietz 9:28
Yeah, I can tell you I used to be a technological optimist and…

Asher Miller 9:33
And then you met us

Rob Dietz 9:34
Well, no, it was before I met you. But yeah, you’re like confirmation bias the other way. There is no optimism when it comes to technology! But if you believe that technology is going to save us, there’s a ton of stuff out there that will confirm that bias.

Asher Miller 9:50
You’re making a joke, but I’m glad you brought that up, which is: something for us at PCI that we need to constantly test is our own confirmation. bias, right? We sit here, and we can judge other people’s and see very clearly other people’s confirmation biases. But I think we all have to be testing our own.

Jason Bradford 10:10
Yes. Very healthy to do that.

Rob Dietz 10:12
Yeah. Yeah.

Asher Miller 10:13
Okay, well, I’ve got one, which is maybe not quite as serious. And I think we may have talked about this before, but the Dunning-Kruger effect.

Jason Bradford 10:22
That’s my favorite.

Rob Dietz 10:23
Oh, it’s great one.

Asher Miller 10:24
So for folks who are not familiar with the Dunning-Kruger effect, I think it’s named after two psychologists who did this study of behavior. And basically what it means…

Jason Bradford 10:33
Their names are Dunning and Kruger.

Asher Miller 10:36
Thanks.

Rob Dietz 10:37
It’s like Sherlock Holmes over here.

Asher Miller 10:39
No, that’s actually first name/last name. No, it’s actually two people. So in any case, they did a bunch of different studies, basically assessing how people rated how well they did on tests, compared to how they actually did. So they did things around people performing in a debate or taking a survey, an assessment of their emotional intelligence. And what this effect basically is, is that people who score the lowest tend to overestimate how well they did the most. And the people who actually performed the best tended to underestimate how well they did.

Rob Dietz 11:20
So basically, people who know the least think they’re the smartest person in the room.

Asher Miller 11:24
Yeah. And the gap is pretty significant. It’s astonishing. Now we see this in the real world all the time. Our listeners know people in their life, who suffer from this effect.

Rob Dietz 11:36
I’m gonna ask for you guys to be quiet from now on. And I’ll just deliver the rest of this podcast, because I’m pretty sure I know all of this more than you guys.

Jason Bradford 11:44
This happens on social media too, where some of us are experts in certain things, at a scale where you would say we’re like a chess master. Okay? And then someone walks in who’s on the second-grade chess team and starts talking about something. Meanwhile, you’re out there like a chess-master-level expert.

Rob Dietz 12:03
They’re actually on the checkers team.

Jason Bradford 12:06
It’s so easy for you just to say, “Wait a second, you have no idea what you’re talking about. I do. Let me…” But no, they’re gonna double down, and they’re gonna be like, “No!!!” This happens all the time.

Asher Miller 12:18
We see this in our own lives, but we also see this a lot in politics. Think about the Trump administration. The Trump administration was the Dunning-Kruger effect writ large, right? All these people coming in, pretending that they’re experts on things, you know, on COVID, on the economy, on foreign policy. Let’s make Jared figure out the Israeli-Palestinian solution because he’s a Jew.

Jason Bradford 12:44
He was in charge of the COVID response for a while too — that went well!

Rob Dietz 12:46
Well, maybe history will reframe it as the Dunning-Kruger administration. So I want to list one more of these cognitive biases, because this one’s fascinating to me too. I don’t know if it’s as funny as Dunning-Kruger, but this one’s called the default effect. And I got introduced to this in one of Dan Ariely’s books. I think it was Predictably Irrational. He’s a behavioral economist, really fascinating — a professor and a good writer too. He talked about organ donors in paired countries that have similar cultures, and how you get these really wild swings in how much of the population was willing to donate their organs. So in Denmark and Sweden, they looked at this. They’re trying to figure out why it is that in Sweden, 86% of people say, “Yes, I’m going to donate my organs if I get in an auto wreck,” and in Denmark, it’s only 4%.

Jason Bradford 12:58
Jeez!

Rob Dietz 13:07
And very similar cultures. So the reason was, when you go down to the Department of Motor Vehicles to get your driver’s license, there are two boxes: you want to be a donor or not. In Denmark, the default checked box is “not,” and in Sweden, the default is “yes.” So basically, people just go with the default. And you could actually expand this to a lot of stuff in how we live our lives. If you grew up learning something like that the earth is the center of the universe, that’s your default. And when new information comes along to change that, you tend to stick with your default.

Asher Miller 14:27
Not to bring this back to politics, but we see that with political party affiliation too. I vote Democrat because I’ve always voted Democrat, or Republican because I’ve always done that, or my family did.

Jason Bradford 14:41
Well, thanks, guys, for those. Those are okay, but since I put this episode together, we’re gonna talk about mine now.

Rob Dietz 14:49
Now, is this real? Are you “Dunning-Krugering” it right now? Can’t tell anymore.

Jason Bradford 14:55
We introduced this key concept with my kids, the fact that we go through the world needing to make quick decisions. And that means we operate through heuristics. And that’s this major path to having biases in our decision making. And I think a really big one is the sunk cost bias. Are you guys familiar with the sunk cost bias?

Asher Miller 15:20
Yeah, I live my life based on the sunk cost bias.

Rob Dietz 15:24
I often think about it with gambling, like, “Oh, I’m already in the hole, so I gotta climb out of this hole.”

Jason Bradford 15:30
Yeah, double down. Right. So yeah, that’s the idea. Sometimes it’s called the psychology of previous investment. We’ve already gone a long way down a certain path, and even if it’s not working so well, it’s really hard to back out.

Asher Miller 15:47
There are so many examples that I think in people’s personal lives, you can look at things as significant as: I went to college to study this major. I hate it, right?

Rob Dietz 15:58
Econ major!

Asher Miller 15:59
I signed up for law school, spent this money, and I hate it. I gotta ride this thing out. It can be as simple as, “Oh, I’m not feeling well tonight. But I bought tickets to go to this theater or concert,” and then I spread COVID to the rest of the world.

Rob Dietz 16:18
You almost made me cry with your example of someone going to law school, and they’re gonna ride it out and do the next 50 years of their life in a profession they hate.

Asher Miller 16:29
I think a lot of people, some of the decisions they make in their life, including really big ones like that, are sunk costs.

Rob Dietz 16:36
If you broaden that out socially, too, it’s pretty amazing how it applies to what we’re doing together as a whole community or a society. So we’ve already invested in a huge fossil fuel infrastructure. And the idea that we need to do something different… “No, we’re gonna keep investing in this thing that we’ve already poured so much into.”

Jason Bradford 17:01
We had in season one, the car culture episode, where the transportation expert at UC Davis was talking about…

Rob Dietz 17:07
a billion cars!

Jason Bradford 17:08
I think it was 2 billion, right? Like, we already have a billion. So yeah, it’s just nuts.

Asher Miller 17:12
But I think what’s important about that is: not only are we not recognizing that maybe we’re making these decisions, these ongoing investments, because of sunk costs. We rationalize them in different ways, so we actually can’t imagine something other than the car. And so our thinking about what’s possible: that’s also back to the status quo thing that you were talking about. We just default to whatever that was.

Jason Bradford 17:16
Yeah. It’s interesting how many of these biases are somewhat related. There’s a Venn diagram.

Rob Dietz 17:43
They compound each other. I feel like, even if I realize I’m doing something stupid in a sort of a confirmation bias way, I also better be careful to make sure I’m not also doing a sunk cost at the same time.

Jason Bradford 17:58
Yeah. What we talk about on our show is the fact that there are these giant predicaments, these huge problems about the environment. And they’re so enormous, they’re hard to wrap your head around. And I just learned that there’s actually a word for this. Environmental philosopher Timothy Morton came up with a name. He calls them hyperobjects, right? And I think you’ve heard about it before.

Rob Dietz 18:23
I have now — you’re kind of a hyperobject, Jason!

Asher Miller 18:26
Only when he has coffee.

Jason Bradford 18:28
Your dog’s a hyperobject — that’s for sure. Holy mackerel! Okay. So here’s what I think there’s a good way to introduce hyperobjects with a parable. You remember, there’s that parable of the two blind men who have to describe an elephant. One’s got a tail and one’s got a trunk. And they’resaying, “Oh, what I see or what I feel is this.”

Rob Dietz 18:50
An elephant’s like a snake?

Jason Bradford 18:52
Yeah. Right. Hyperobjects are like that. In a sense, they’re so big (kind of like an elephant — that’s why I think it’s a good parable) that it’s almost impossible for anyone’s senses to grasp them. We did not evolve to grasp things of this scale, either in space or in time. And so we haven’t evolved the perception or the cognitive abilities to really understand them and do something about them. It’s like what I said about Davis: you’re going to get killed by the lion, and Curtis is going to start running right away. We evolved in situations where, when there were threats, they were usually things in our environment that we could look at, react to, and do something about, right? Whereas hyperobjects are not at that kind of scale.

Rob Dietz 19:39
Right. It’s like you can’t see it. You can’t smell it. You can’t hear it. It’s not immediately perceptible.

Jason Bradford 19:45
Or if you do, you can’t understand it. So maybe you’re saying global warming is a good example. The temperature changes so much from day to day and season to season and to perceive global warming…

Asher Miller 19:58
Gradual change — one degree Celsius over 150 years…

Rob Dietz 20:03
And like you said, spatially, over 150 miles or thousands of miles.

Jason Bradford 20:08
Right. And so there’s a lot of good examples of hyperobjects in our world. And these are the sorts of things that we actually need to have the ability to grapple with.

Asher Miller 20:18
And the existential threats that we face are… maybe most of them are hyperobjects.

Rob Dietz 20:25
Yeah. Think about the size of the global economy, and the fact that we’re trying to grow it and grow it. If that’s not a hyperobject — something that you don’t really perceive day to day. When I think of the economy, I think about, “Oh, how much money do I have? Can I afford to go down to the store and get this?” I don’t think about, “Wow, this thing is growing at the expense of the ecosystems that contain it.” That’s maybe a good way to think of it.

Asher Miller 20:50
It really is a scale thing. Think about what we’ve experienced with the coronavirus pandemic, when you have millions of people dying around the world, hundreds of thousands of people dying in the United States, where we are. And you hear these numbers of how many new infections there are every day. These are numbers that are really hard to conceive of, and they don’t necessarily create the same emotional responses as when you hear the story of one person who’s a neighbor, or someone else you can imagine or have had contact with.

Jason Bradford 21:24
Yes. And so I think there are tons of reasons why our cognitive biases, our use of heuristics, and our evolutionary history don’t allow us to deal with these well. And there’s a good example I saw. I was reading those lists on Wikipedia. It was the bias of attribute substitution. And that’s the notion that when something is too computationally complex, we use a simple heuristic to make a judgment about it. So the example might be, “Well, I like warmer winters, so global warming is not a bad thing.” Okay? And you hear this from people in Canada, or Russia, or Minnesota.

Rob Dietz 22:00
Oh, they may be right in Canada, right? So that’s really interesting, though. You bring up the heuristics for making decisions or these rules of thumb: it seems incredibly irrational that we do that. But you’ve got to think, based on the way we perceive the world and the way we evolved, that on some level, it’s a really logical way to get by, to think in a lot of different situations.

Asher Miller 22:26
It came from somewhere, right?

Rob Dietz 22:27
Yeah, but I mean, it could actually be rational on some level.

Jason Bradford 22:31
There’s actually a guy who is a critic of calling cognitive bias illogical. He says, “No, no, no, you just have to understand the context.” And so he has a really interesting term. He’s a German, named Gerd — I’m so sorry — Gerd Gigerenzer. Something like that.

Rob Dietz 22:52
Okay. All right. I’ll take your word for it, because I don’t know him

Jason Bradford 22:55
He looks like a very, very interesting fellow. He calls heuristics part of the adaptive toolbox. There are modes of thinking, and heuristic modes are not necessarily bad. He says that we have to think about the notion of ecological rationality. So you go back to the ice cream shop. I really like Curtis’s mode.

Asher Miller 23:19
Of course you do. You don’t want to be in this store for an hour and a half.

Jason Bradford 23:23
Right. I want Davis’s mode if we’re building a bridge.

Rob Dietz 23:27
No, I want a chocolate bridge.

Jason Bradford 23:33
We have to deal with the fact that we have different modes of thinking, some are going to be very biased, but very useful. And some are going to be really logical and really appropriate, especially for dealing with hyperobjects.

Rob Dietz 23:47
This is actually a good metaphor — the toolbox rather than the cattle prod. If we each understood what was in our toolbox, and we understood which tool fit which scenario that we’re trying to think through…

Asher Miller 24:05
I have the best tools. My toolbox is the absolute best.

Rob Dietz 24:08
Do you have any hyperobject tools? Asher Dunning over here and Jason Kruger!

Asher Miller 24:16
On a serious note, I actually think it’s true about applying different tools for different situations. But I think it also starts with recognizing that we have these biases and trying, in certain moments, especially ones of significant importance, like when we’re trying to deal with hyperobjects, to step back a little bit. And maybe in those moments, heuristics are bad. A quick, default kind of assumption or resolution is maybe a thing to avoid in situations when you have a very important decision or matter to deal with. And the rest of the time, just fuck it.

Jason Bradford 24:54
Yeah. Hey, use your gut, man, use your gut!

Asher Miller 24:57
You’re the best baby!

Rob Dietz 24:59
It’s all chocolate!

Asher Miller 25:11
Stay tuned for our George Costanza Memorial do-the-opposite segment, where we discuss things we can do to get the hell out of Crazy Town.

Jason Bradford 25:18
Now you don’t have to just listen to the three of us blather on anymore.

Rob Dietz 25:21
We’ve actually invited someone intelligent on the program to provide inspiration.

Rob Dietz 25:26
So as we said, for season three, we’re exploring the hidden drivers that have pushed us into Crazy Town. But we don’t just want to explain the situation and leave it there. We want to give our listeners something they can do differently. So that’s why we’re running the George Costanza do-the-opposite segment.

Rob Dietz 26:59
So I was wondering if either of you two has a story or an example of someone doing the opposite when it comes to cognitive biases?

Asher Miller 27:35
Yeah, the first thing that comes to mind is this guy I know, Steve Lambert, who’s a pretty amazing artist and activist. I hope he doesn’t mind me sharing this story. You know how we sometimes get hate mail, right?

Rob Dietz 27:51
Yeah, yeah. Well, especially you.

Asher Miller 27:53
Especially me, yeah.

Rob Dietz 27:54
We went over that last time with your Luciferian/Satan-worshipping stuff.

Asher Miller 27:59
Right. And you see this all the time in online discourse — people attacking each other, trolling each other. And the instinct is to go right back at that person and attack them. Steve had done this project where he built this enormous mobile neon sign. And it was the word capitalism. And then below it says, “works for me,” and there was a way to vote. This is interactive art, so people go up to it and see: capitalism works for me, yes or no? And it has a counter. He’s doing this project, and he’s taking it around the country. He took it to Times Square.

Rob Dietz 28:39
Wow, you could have a default bias problem here. We’re in the hotbed of capitalism here.

Asher Miller 28:46
So he gets this email — I think it was in response to the project — this long email, this screed from this guy, who’s telling him he’s the worst person in the world for attacking capitalism. He must be communist — all this stuff — a pages-long email. And when Steve replied, he decided, “I’m going to do the opposite. I’m not going to go right back at this guy. And I’m also not going to just ignore him.” He just wrote one thing back to him and said, “Have you ever been in love?” And the guy replied, and his reply was, “What are you — what are you talking about? I know what you’re doing.”

Asher Miller 29:26
And he goes off again, this long thing. And then Steve replies one more time: “It’s wonderful, isn’t it?” And the guy comes back a third time. He’s like, “Yeah, yeah, it actually is.”

Asher Miller 29:48
Jason, you’re the expert on cognitive biases. So we’d have to try to figure out what was the cognitive bias with this guy. Maybe it’s attacking something that’s not part of his anti-affirmation bias, right. But there was Steve doing the opposite of the butting-heads rams going after each other.

Rob Dietz 30:07
What’s really interesting to me in that story, besides how funny it is, is that Steve actually found a way to kick this guy out of his bias, out of his worldview, out of his problematic way of thinking, which is really what doing the opposite is here. I mean, we can all say, “Recognize when you’re throwing good money after bad or when you’re in a sunk cost situation, and don’t do that.” That would be sort of the opposite. But it’s not really helpful, because what’s helpful is, how do you actually get to the point where you realize you’re in a sunk cost situation, or where you’re in a confirmation bias situation. And so maybe there’s some little nugget in there about how Steve got Captain Angry to stop ranting?

Jason Bradford 30:59
The irony is, as we’ve been talking about how we’ve got to maybe tone down our emotions, the emotions are sort of the heuristics, right? It’s the gut feeling. But what Steve did was: he actually made an emotional connection to the person, instead of going after the person’s logic. Whatever reasoning they were applying, instead of doing that, Steve said, “I’m going to open up an emotional connection.” And that’ll allow that person maybe to listen. So there’s a little bit of an irony here. If someone feels that you care about them, and you’re a real person and you’ve got an emotional connection, you can share something. You can share the fact that you know what love is, and maybe then you can have a conference — I mean a conversation.

Rob Dietz 31:51
Or a conference!

Jason Bradford 31:52
A conversation that actually gets to the real points and the issues you want to deal with.

Asher Miller 32:00
I’m just gonna warn our listeners. If you send us screeds about what we’re doing wrong on the podcast…

Rob Dietz 32:07
We’ll have Steve respond, right?

Asher Miller 32:09
You’re gonna get a very loving email.

Asher Miller 32:25
Dr. Peter Whybrow is Director Emeritus of the Jane and Terry Semel Institute for Neuroscience and Human Behavior at the University of California, Los Angeles, and the Judson Braun Distinguished Professor of Psychiatry and Biobehavioral Sciences at UCLA’s David Geffen School of Medicine. Dr. Whybrow is an international authority on depression, manic-depressive disease, and the effects of thyroid hormones on the brain and human behavior. He is the author of numerous scientific papers and six books, including American Mania: When More Is Not Enough and The Well-Tuned Brain: Neuroscience and the Life Well Lived. He’s explored deeply how the evolutionary development and function of our brains meet American culture, the economy, and planetary limits, which is why I’m so pleased to be speaking with him today. Peter, welcome to Crazy Town.

Peter Whybrow 33:13
Well, thank you so much for inviting me. I’m a big fan of the programs you put together.

Asher Miller 33:19
Thank you. So on this season of the podcast, we’re talking about hidden drivers that make us act in ways that are a bit insane, considering the existential threats that we face. A lot of these, of course, do come down to individual and collective behavior. Earlier, Jason, Rob, and I talked about cognitive biases, like confirmation bias, sunk cost bias, and even the Dunning-Kruger effect, which I particularly enjoy, and how they lead us to make decisions that are sending humanity over the edge. But cognitive biases are just one example of how humans are not quite as rational, or not as rational as often as we think we are. Can you talk about the roots of that, especially in terms of our evolutionary neurobiology?

Peter Whybrow 34:01
Yes, I think that’s, for me, the nub of the issue. In fact, the several books that I’ve written about this have that at their core, which is that you have to be able to understand the evolution of the biology of the human brain before you can really understand why it works the way it does. And it’s a layered process. The core of it, we share with the lizard — really, really ancient, millions and millions of years old. On top of that come the mammals, which arrived about the same time as the dinosaurs were here. And that’s the little core — the cortex, but the first cortex — which brings us some sort of sense of being human, because it’s the part of the brain that deals with attachment, which is very unusual in other animals. And so you find that this is the beginning of life where families begin, etc. Then on top of that, very recently, comes the cortex itself, which has grown out of those underlying systems. Of course, dogs and cats and all mammals have a cortex, but they don’t have the complex cortex that we have, especially the frontal part of the brain, where we have the capacity for not only reflexive thought, but also reflective thought. We’re able to take abstract concepts and think them through and behave based upon our conclusions. Now, this is a very sophisticated way of thinking about things. And it only goes back, as far as we can see in the record (the cave records, for example), close to 80,000 to 100,000 years. So this brain is an evolution in itself. And so we have to ask ourselves, how does that evolution bear upon the struggles and the challenges that, for example, you face and handle extraordinarily well, in terms of public awareness, at the Post Carbon Institute?
We have to first get to understand what the core of the brain does, and how that differs, and is similar to the reflective part of the brain, because it’s only when the reflexive (that is, the old brain) and the new brain (the reflective brain) work in synchrony, that things really begin to work well. And we’ve built society in a way that doesn’t necessarily foster the interaction and the balance of those two parts of our being.

Asher Miller 36:45
So you’ve talked about this a bit as the old and the new brain. The brain is sort of a system, a complex system, in that our brains should function — ideally would function — as a system, sort of connecting these different parts of our brains. Can you talk more about that?

Peter Whybrow 37:03
Yes, as a system, they should function. And they do function in many, many people. It evolves through the natural rearing of a child. And so when a child is first born, we know that they’re very primitive, of course, using only the core of their brain to feed themselves. The mammal suckles, and that’s about the only thing they do for the first few days, etc. But amazingly quickly, they begin to show their cognitive abilities. And those abilities range, largely in the beginning, from organizing the motor system, so that they slowly begin to be able to grasp something, for example, and they slowly walk, etc. All these are learned behaviors; they’re learned within the framework of the biology which builds the individual, but that comes quite rapidly. And so by the age of one, or one and a half, the child is walking and making noise, and they begin to use that noise to mimic their parents. And slowly speech occurs. And so suddenly, they move out of this sense of being a biological person, just like any other mammal, to something which is entirely distinct, and human in our own reference. So that becomes really a way in which the two parts of the brain join: the old brain, which is what drives things in the early part of that development, becomes joined with the reflective part, which then begins to assess the environment and draw conclusions from it. Then you begin to see the extraordinary ability that we have, as humans, which is totally distinct from that of other mammals. For example, in the present circumstance, where many of us are actually schooling children at home, you get to see this. They have a vibrant life beyond the intellectual, which is the reflexive life. And then the reflective life is what you have to meld, and that’s the task of early childhood and adolescence. Now, it’s also the task of society — that doesn’t just come through family membership.
That comes through living in a society which takes that evolution seriously and works on it in a way which produces a reflective person. Now, you asked me earlier, when we were preparing for the chat, how do these things meld together? Well, they meld very well, if the person is given the opportunity to see that there’s a lot of fun in life being reflexive, but in order to better and better understand other people, and to move oneself forward, you have to be reflective; you have to place things in the context of what this means looking forward. In other words, you’re putting yourself into the future in order to understand what is the best activity you should do in the present. We’re doing it with the help of both a primitive brain and an evolved brain. Now, if you put those two things together, we have extraordinary capacity to do all sorts of things as human beings. And we’ve seen that. Look at the way in which, over the last 300 years, the industrial revolution has brought us a completely different way of living. That is amazing. And we have all our literature and ability to figure out science, etc. We’re very smart animals. The problem is that we still have buried inside this ancient, primitive brain. I talked about this little story of my friend who runs a restaurant near where I live in LA, a French restaurant. One evening early in the New Year, we were having dinner, and we ate this wonderful meal. But we were mindful of the fact that it was early January, and we all wanted to lose a little weight. We had escargot and wine, and it was good. The owner comes around and says, “I have this wonderful dessert for you.” He’s French, speaking in this wonderful accent. We say, “No, thank you so much. We’re full. We’re going home; we’ve had enough to eat.” He looks crestfallen. He goes away, comes back, and brings us the bill. And then suddenly he appears with this plate of cheesecake, and he puts it in the middle of the table.
And we say, “No, no, no, no.” He says, “Oh, just try it for me. Try it for me!” He gives us three forks, and about 45 seconds later, the whole thing has disappeared. Now you ask yourself, “How does this happen?” The rational side of ourselves was saying, “No, we’re not going to eat that — we’re already full. And no, we want to lose some weight.” The thing is put in front of you, and suddenly all that long-term planning goes out of the window, and you regress to this completely reflexive creature. The smell of the thing is sugary, and you know what it’s going to taste like. You wolf it down. It’s from the ancient way in which, as mammals, we used to eat everything we possibly could find, because we didn’t know when we were going to find it again. And so that is the old brain saying, “Hey, let’s discount the future. We’re going to eat this stuff. It’s just here, and it’s free! Yes!” That is an example of how, despite our best efforts, reflective, long-term planning gets hijacked by reflexive, ancient modalities, which are embedded in the brain. If you then take that little example, and you extrapolate it into the way in which we run society, we have to ask ourselves, “What are we doing when we have invented, especially so in the US, a culture — a market culture — which totally rewards on the short term?” It doesn’t matter whether we’re burning oil or we’re buying ourselves cheesecake; it works on the short term. And yet we know, because we’re sentient creatures, that we’re driving ourselves down a path which has a cliff edge, which we’re going to go over very soon. But how do we stop that? How do we, in the moment, recognize that in the long term, we’re destroying ourselves? There’s an interesting quote that I have in the book The Well-Tuned Brain:

Peter Whybrow 44:04
“Watch your habits, for they become your character; and watch your character, for it becomes your destiny.” We have the built-in habit that we’re interested in short-term advantage. The whole of our business enterprise is built on that idea. But the long-term survival of the human race? The planet will do just fine, but we could destroy ourselves quite easily, because of the short-term fascination we have and the addiction we have to short-term issues, even when we know that we are in fact going down a perilous path.

Asher Miller 44:48
I find it so fascinating to think about how, especially in the last few hundred years, we’ve actually used our — if you want to call it the rational brain, the new brain — to figure out how to develop all of these technological advances, harnessing energy sources that we didn’t harness before. We celebrate our capacity to do that, and we tend to think of ourselves as a highly rational species. But in a sense, what we’ve actually done was use that part of our brains to provide more opportunity for the reflexive parts of us to go hog wild. And we actually have a system, as you pointed out, that rewards that behavior. Even though we say that we’re so rational — we tend to think of Homo economicus as this rational creature, everyone pursuing their self-interest in this market — we know that that’s not true. So you talked about habit formation, and that leading to character, and you pose this question: we know that we’ve set up a system that plays off of these deeply embedded behaviors — this evolutionary biology in our brains — to consume everything, to look for short-term benefits. And we’ve created a system that rewards that. But we know that we have to shift that behavior collectively. What is your answer to that question? How do we go about doing that? How do we create the conditions for new habits as a society?

Peter Whybrow 46:28
It’s the core question of our time — the one that you’re asking, and I think that’s what you’ve devoted your professional life to bringing to public attention. And I think that’s very, very important. Michael Lewis, whom I know a bit, is a very popular social critic and author, as you know. He just wrote a book called The Fifth Risk, in which he defines the fifth risk as the risk a society runs when it falls into the habit of responding to long-term risks with short-term solutions. And we do that repeatedly. I’m not particularly politically interested, but I think you could argue that the last president has done us a great service, because he has focused upon the short term with complete obliviousness to the long term. And people are beginning to say, “Boy, I don’t think this works.” And so the answer to your question, in part, is that it’s not a moral issue; it is an issue for reflection. And we need to build that more carefully into the cautionary tales and the positive tales that we provide for young people as they go through the school systems that we build. And that can be done. And it is done, but not in a consistent way. We build the educational system at the moment largely around the idea that you’ve got to get a job and be able to support yourself financially. But that’s not the only way in which we have to manage each other’s worlds; we have to be able to help each other in all sorts of ways, along the lines of helping realize that if we do things together collectively, and we do them in a way that is constructive for all of us, as opposed to just one of us… If you read Adam Smith, going back to the beginnings of our western competitive market culture, he did not say that the world is based upon individual success and competition. What he said was that people should have an opportunity to participate in the larger economy, but it should be collectively something which is valuable to everybody.
And if you read, particularly, his first book — the one before The Wealth of Nations, called The Theory of Moral Sentiments — he writes that as the basis for why it is that a market society really works. We have tended to let that go out of gear. If we think about the market society in a rational way, and bring together what’s good for people, as well as what’s good for culture, then I think we can get that back. But it’s a different way of thinking and teaching about a market society than we have at the moment. So we’ve got to see the collective as something that reinforces the advantage of rational reflective thinking, but not at the expense of reflexive thinking. There’s no reason why you can’t have a fun life and an interesting life at the same time as you’re thoughtful about managing the future and managing your family and making sure that your children grow up to be thoughtful, careful adults, because that will benefit them as well as benefiting society. So the answer to your question is a much greater focus upon education, as one part of it. The other part of it is not to have a society that essentially celebrates short-term gain, which is what we have at the moment. And that’s not such an easy effort. But I think if we could begin to say, there are various forms of short-term gain — short-term gain is not just material gain. It is loving somebody and bringing them to a maturity where they themselves can love others. That’s a short-term gain for every parent I know; it must be for you — it certainly has been for me. But at the same time, it’s investing in what the society needs in terms of intimacy and caring, and so on. So if we’re willing to go down into an understanding of evolutionary biology, that, in some ways, can help us enormously in crafting a society which will take into account our many assets on the reflective side, while also understanding that the reflexive part of us can be self-destructive.
We’re smart enough to figure that out, but we’ve got to have some way in which people can quantify it and put it in their own understanding. If they know that they are complicated creatures, then I think for the future, we have an opportunity to help them see how that complication, if it’s working together in an integrated way, can conserve both the culture and the individual.

Asher Miller 51:41
Part of what I’m getting from this conversation, which is really helpful (thank you), is that being more reflective, in part, is recognizing how reflexive we actually are. It’s not trying to overcome, or somehow tamp down and control, the old part of our brain. We’re not going to evolve the biology of our brain overnight, so we have to recognize that we’re not as (quote-unquote) rational as we think we are. And we have to learn to balance those pieces of ourselves. The other thing I’m hearing from you, and taking away from this conversation, is that the formation of habits is very important. A lot of that begins in early development, so we should be focusing on that. And we also should be thinking about it as a system. So when we think of — let’s say — the metaphor that you use of the cheesecake, it’s not so much saying to people, “It’s your fault if you eat the cheesecake.” It’s the system that puts the cheesecake in front of you, in a sense. Maybe individually, we’re going to have that reflexive response to these short-term stimuli. But maybe collectively, we can create conditions that put those into some sort of balance, or put them within a boundary of some kind, so that we do not run amok. So it is the system itself. And we could use the rational, new-brain part of ourselves collectively to create the right conditions. Is that a fair way of putting it?

Peter Whybrow 53:16
In eating the cheesecake, we thoroughly enjoyed it. And I’m not a Scrooge — I wouldn’t want to ruin the opportunity to have enjoyable things like that, which are given to us by our own collective brains or whatnot. I perfectly agree with what you’re saying. And that is that the rational individual embraces both parts, including the old part of ourselves, which enjoys life. When you find something really good, and it gives you pleasure, you should tuck into it, but you don’t do it to your self-destruction. You could eat caramel cheesecake in January, maybe a couple of times, but if you ate it every day for the whole year, you would be in really bad shape at the end of it, and most of us realize that. So if you use that as an extended metaphor, then what we’re doing is exactly what you were just saying, which is trying to take the best parts of us in terms of evolution, and think about how to integrate them. We wouldn’t have all the many wonders that we have in the world if we didn’t have the extraordinary brain that we do have. We can sort of comfort ourselves, perhaps, with the fact that fossil fuels were a necessity at the time, and that time has passed. And now we need to step forward. We have the intelligence to do that, and it’s beginning to show that we do. The problem that has constrained that has been the market idea and the short-term notion that we will never be able to return to success and affluence if we get rid of one of the dominant underlying drivers of our energy systems. Maybe we’re not smart enough to think that through yet, but we can think it through. And as long as we keep at it, we’re going to be okay, and we’ll still have a good life. And we can still eat cheesecake. I mean, why not?

Asher Miller 55:08
On that note, thank you very much, Dr. Peter Whybrow. I really appreciate you taking the time.

Peter Whybrow 55:13
Well, thank you for inviting me. And good luck with all these wonderful things you do there. Thank you.

Jason Bradford 55:29
Thanks for listening to this episode of Crazy Town.

Asher Miller 55:32
Yeah, if by some miracle, you actually got something out of it, please take a minute and give us a positive rating or leave a review on your preferred podcast app.

Rob Dietz 55:40
And thanks to all our listeners, supporters and volunteers, and special thanks to our producer, Melody Travers.

Rob Dietz 55:59
Hey, you guys this week, the episode is brought to you by a very special product. It’s Krugerade. It’s the energy drink that lifts you up into the Dunning-Kruger atmosphere.

Asher Miller 56:11
Oh, awesome. You mean so if I absolutely suck at something, I just take a sip and then I think I’m like the best ever?

Rob Dietz 56:18
Yeah, you’re actually not the best ever, but you’ll feel like it!

Asher Miller 56:22
That’s all that matters is how I feel.

Rob Dietz 56:23
Right! You can go out there and be an expert on anything. I mean, Jason, you have a PhD in conservation biology, plant ecology — doesn’t matter…

Jason Bradford 56:34
Evolutionary biology.

Rob Dietz 56:35
Because here’s the thing: you’ll have a PhD in whatever you want after you drink Krugerade.

Jason Bradford 56:39
Now what I want to understand, though, is: should I take this instead of Gatorade before I go play tennis, and will it make sure I win?

Asher Miller 56:48
No, you’ll think you’ll win.

Rob Dietz 56:50
You’ll know that you’re the winner.

Asher Miller 56:52
Even if you got completely trounced!

Jason Bradford 56:54
And got blown out on the court — I’ll think I won. Shit, I want this.

Rob Dietz 56:58
Krugerade. Get your drinks, wherever drinks are available.