Introduction to Cognitive Bias: Crash Course Scientific Thinking #1
Historical Context of Geocentric Model
Two thousand years ago, people looked up at the sky, and they saw that everything up there seemed to move. So, naturally, they figured the Earth was staying still and everything else was rotating around us. It was a story that just made sense to millions of people, and it stuck around well into the 16th century. If I'd been alive then, I would have believed this story. I mean, even today, I feel like I must be at the center of something. That idea, of course, was wrong, but we didn't believe it forever. We found ways to step out of our old stories and find something much more interesting.
Introduction to Crash Course Scientific Thinking
Hi, I'm Hank Green, and this is Crash Course Scientific Thinking. Science. It is a never-ending quest for knowledge, a way of interrogating our universe to figure out how it works, a tool to guide us when our intuition isn't enough. And also, it can be quite fun. Sometimes you get to blow stuff up.
Evolution of Scientific Understanding Post-Copernicus
In the years since Copernicus put forth the theory that the Earth revolves around the Sun, we've learned that some questions are just too big, too complex, or too bizarre to trust our gut with. When we rely on intuition alone to answer those big, complicated questions, our brains fall prey to cognitive biases, predictable weaknesses in the way we've evolved to think.
Pattern Recognition in Human Evolution
Our brains are very good at finding patterns. We've evolved this skill because it's super helpful for survival. It helped our ancestors spot the telltale signs of predators and recognize when certain plants might be poisonous. We have always paid attention to and learned from our world. Those pattern recognition skills have also been linked to some very special human qualities, like our ability to imagine and invent. Like, it's what made me notice that "Hank" and "anglerfish" sound vaguely similar, so that I could invent something called the Hankler fish. Bad puns are still good pattern recognition. It's also why we are so good at telling stories, because really, that's all a story is: a recognizable pattern of information. And more importantly, our highly evolved pattern recognition skills allow our brains to apply mental shortcuts, or heuristics, that help us solve simple problems quickly and make life livable.
Heuristics and Mental Shortcuts
They are the brain's way of copy-pasting stories we already have onto new information so that we don't expend a bunch of brain power in every direction all the time. They are why I don't have to stop and think about what will happen if I touch this hot stovetop. My brain picked up the pattern "touching hot things: bad" long ago, and it keeps resurfacing it whenever I need it to keep me safe. And that's all well and good for avoiding hot things, but those same mental shortcuts also open us up to cognitive bias.
Nature of Cognitive Bias
Now, cognitive bias isn't inherently bad. And that's good, because everybody's got it. And I'm not talking about explicit bias here, where someone is consciously aware that they are discriminating against a person. Cognitive biases happen unconsciously. They are implicit biases: our decision-making is influenced by beliefs and patterns we aren't even consciously aware of.
Availability Bias Example with Air Travel
Consider this modern-day example of a cognitive bias skewing many people's perception of risk. In early 2025, after a mid-air collision between a commercial airplane and a military helicopter, people started paying a lot of attention to every near miss and airport mishap. And it felt like planes were crashing every day. The media ran with this, and the algorithms amplified it. At the time, 65% of Americans said they felt more anxious about flying. But when we took a look at the actual data, the number of accidents compared to the same time in 2024 remained the same, and flying remained an incredibly safe form of travel. So why did so many of us feel like it wasn't? Well, for efficiency, our brains often put more weight on the most readily available information around. We call this, wait for it, availability bias: when people make judgments based on the information that's most easily available. The truth is, you're far more likely to be in a car crash than a plane crash. But when plane crashes do happen, we hear about them a lot, especially in today's algorithm-driven news cycle. So that information is way easier to call to mind than the fact that over 120 people in the US die in car accidents every day. Availability bias is a big way our brains mislead us, but probably the biggest is confirmation bias.
Confirmation Bias and Learning Styles Myth
That's our brain's tendency to accept information that agrees with things we already believe and filter out stuff that contradicts it. Like, if someone already believed that flying was dangerous, then the news stories about the 2025 crash likely reinforced that belief. Or here's an example you might be familiar with. Have you ever been told that you are a visual learner, or maybe that you learn best by listening to other people? In a study published in the Journal of Educational Psychology, more than 90% of participants said that people learn better when they're taught using the learning style that best suits them. A similar survey of colleges in the US revealed that out of the 39 surveyed, 29 of them teach learning style theory as part of their guidance for teachers. But here's the kicker. There is no scientific evidence to support the idea of personalized learning styles. So why is this myth so prevalent? Researchers have pointed out that it persists at least in part thanks to confirmation bias. As one researcher put it, "People are obviously different and learning styles appear to offer educators a way to accommodate individual learner differences." So, someone who believes they've seen these methods have a positive impact might reject evidence against them, just like I might and sometimes do reject evidence that the butt is not part of the legs.
Other Cognitive Biases
And these are just a few of the cognitive biases we all have. We haven't even gotten into how we cling to first impressions. That's anchoring bias. Or how we tend to believe the events of the past were predictable. That's hindsight bias. Ultimately, we all want the world to make sense. But when we rely on these simple shortcuts for the wrong things, they can keep us from being open to evidence.
Science as a Method to Overcome Biases
So, what do we do about this? Well, over the last few centuries, humans have developed a new way of looking at the world. One that doesn't just explain what feels right, but tests what is right. This is what people are usually talking about when they say the word science. Not the body of knowledge, but the systems used to interrogate the universe. From astronomy to zoology, science is a way of building knowledge that's durable, communal, and long-lasting. And to help us learn more, I think we need a little sage advice. Let's give it up for science, everyone.
Sage Advice Introduction
Hi, Sage, everybody. This is Sage. Hello, I'm Sage the Bad Naturalist. I'm a dork, a painter, a creator of the YouTube channel Sage the Bad Naturalist. I make videos about fungi, plants, research papers, and learning something new even when science goes wrong. And I'm here to help Hank spread some sage advice. I love that we're talking about cognitive bias today, Hank, because the process of science is actually designed to overcome biases from methods to reliance on evidence and especially the fact that science is communal. It gets vetted by a whole community, not just one guy in a bathtub shouting 'Eureka!' No shame to Archimedes, of course. I personally love peer review, which we'll talk about in a later episode, but Sage, what is your favorite bias-busting science method?
Randomized Controlled Trials
For me, it has to be randomized controlled trials. It's a multi-step process for research that's used a lot in testing new medicine, and each step was designed to reduce the chance of bias. Say scientists want to test a new diabetes medicine. Well, there's a lot of potential for bias in that process. Like, they might accidentally influence the results of the trial by their selection of the participants. That's why we say everyone has cognitive bias, even scientists. Exactly. So to avoid those biases, scientists select the members at random. And when they're testing a new drug, there's all kinds of potential for confirmation bias. So they sort some of the members into a control group that gets either no treatment, a placebo, which looks like the real drug but doesn't actually do anything, or an older, proven drug. That way they can compare the results. I love a randomized controlled trial because it is such a good example of the ways that scientists have recognized their own potential for bias and designed their research to reduce it as much as possible. Like how, sometimes, when researchers need to eliminate bias even further, they do double-blind studies, where neither the participants nor the scientists know who is in the treatment group and who is in the control group. Right. Scientific thinking at work.
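The random-assignment step Sage describes can be sketched in a few lines of Python. This is only an illustration of the idea, not anything from the episode: the function name, the participant IDs, and the 100-person sample size are all invented for the example.

```python
import random

def randomize_assignment(participants, seed=None):
    """Randomly split participants into treatment and control groups,
    so that neither the researchers' expectations nor participant
    traits decide who gets the new drug."""
    rng = random.Random(seed)  # seeded only so the demo is repeatable
    shuffled = participants[:]  # copy; leave the caller's list intact
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"treatment": shuffled[:half], "control": shuffled[half:]}

# Hypothetical participant IDs for a diabetes-drug trial.
people = [f"P{i:03d}" for i in range(1, 101)]
groups = randomize_assignment(people, seed=42)
print(len(groups["treatment"]), len(groups["control"]))  # 50 50
```

In a double-blind study, the keys of that mapping would be hidden from both participants and researchers until the data are collected, which is the extra layer of bias protection Sage mentions.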
We've come so far from thinking we're the center of the solar system. Science high five. And that's been your Sage advice.
Personal Strategies to Combat Cognitive Bias
Thanks, Sage. And here's the coolest part. You can watch out for cognitive bias in your own thinking, too. One of the easiest and most important things you can do to fight back against bias is just understanding that it's real and accepting that you have it. Being aware of bias gives you the chance to look out for its influence on your decisions. Anybody who says they don't have any biases is just waving a huge red flag. Another way is to interact with lots of people, especially people who are different from you. Bias likes to tell us that our experience is the only reality. But to really understand the world, we need community. Scientists do this too. They are endlessly testing and vetting each other's claims. Without expertise from a diversity of scientists, the scientific process would fail. And by that same token, you can't overcome your biases on your own either. Unlike those gut feelings we get, science requires evidence. So whenever we consume science news that goes against an idea or experience we believe to be true, it's good to remember that our cognitive biases might be working against us.
Cognitive Bias and Flexibility
That ties into another big bias buster: cognitive flexibility, your ability to imagine options or explanations beyond your gut reaction. In other words, being able to say, "You know what? Maybe I was wrong." I know, right? In this economy? On this internet? You can do it, though. It's good for you. By the way, isn't it just extremely wild that our brains can think of ways to outsmart the ways they think?
Understanding Scientific Thinking
Throughout this series, we're going to talk a lot more about how the scientific process works, so that when we see news stories about science, we'll better understand what's going on behind the scenes. And that knowledge is going to help us all respond better to the science on our social media feeds, in our group chats, and at our dinner tables. Remember, we all have cognitive biases. They're not something to be ashamed of. They're just our brain's way of solving problems faster. But the world is very weird. So often our mental shortcuts don't work. That's where scientific thinking comes in. Science relies on evidence evaluated by a community of experts, and it has systems that are designed to reduce bias. It's not perfect, but it's one of the best tools we have.
Next Episode Preview
Next time, we're going to explore the wild world of statistics. I'll see you then.
Production Partnership and Resources
This episode of Crash Course Scientific Thinking was produced in partnership with HHMI BioInteractive, bringing real science stories to thousands of high school and undergrad life science classrooms. If you're a teacher, visit their website for resources that explore the topics we discussed in today's video.
Closing Credits and Support
Thanks for watching this episode of Crash Course Scientific Thinking, which was filmed in Missoula, Montana, and was made with the help of all of these nice people. If you want to help keep Crash Course free for everyone forever, you can join our community on Patreon.