Kahneman wouldn’t ask you what you think. He’d ask you how confident you are about what you think. Then he’d show you why the confidence itself is the problem.
He’d ask you for a number. “What percentage of African countries are in the United Nations?” But first, he’d spin a wheel. The wheel would land on 65. Or 10. It was rigged; he’d chosen the number in advance. Then he’d ask for your answer. And your answer would be anchored to the number on the wheel, even though the wheel had nothing to do with African countries, even though you knew it was random, even though you were warned. Your brain would use the irrelevant number as a starting point anyway.
He demonstrated this in 1974 with Amos Tversky. They called it anchoring. It was one of dozens of cognitive biases they catalogued over nearly three decades of collaboration: systematic errors that aren’t bugs in human thinking but features. The brain doesn’t make mistakes randomly. It makes the same mistakes, the same way, for the same reasons, across cultures, across education levels, across expertise. Experts are just as anchored as amateurs. They’re just more confident about it.
Why He Asks That
Kahneman’s entire career was a sustained investigation of one question: why do smart people make predictable errors? Not random errors — predictable ones. The kind you can see coming if you know where to look. He was looking for the seams in human rationality, the places where the hardware glitches in reliable, replicable ways.
He found them everywhere. Loss aversion: you feel the pain of losing $100 roughly twice as intensely as the pleasure of gaining $100. The planning fallacy: you underestimate how long a project will take, even when you know you’ve underestimated every previous project. WYSIATI — “What You See Is All There Is” — the brain’s tendency to build complete stories from incomplete information and then feel confident about the story.
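That “roughly twice” is not a loose metaphor. Kahneman and Tversky made it precise in prospect theory’s value function; a standard statement, using the median parameter estimates from their 1992 cumulative prospect theory paper (exact values vary across studies), is:

```latex
% Prospect theory value function (Tversky & Kahneman, 1992).
% Median parameter estimates; individual studies report different values.
v(x) =
\begin{cases}
  x^{\alpha}                & x \ge 0 \quad \text{(gains)} \\[4pt]
  -\lambda \, (-x)^{\alpha} & x < 0   \quad \text{(losses)}
\end{cases}
\qquad \alpha \approx 0.88, \quad \lambda \approx 2.25
```

The loss-aversion coefficient λ is the “roughly twice”: a loss of $100 is weighted about 2.25 times as heavily as a gain of $100.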
He’d demonstrate these on you. Not to embarrass you. Because the demonstration is the lesson. Reading about cognitive bias doesn’t fix it. Experiencing it might. He told an interviewer that after fifty years of studying bias, his own judgment was only marginally better than anyone else’s. The biases are wired in. Knowing about them helps you build systems that compensate. It doesn’t make the biases go away.
What Happens When You Answer
You’d give him an answer — any answer, about any topic — and he’d ask a follow-up that would make you revise. Not because he’d given you new information. Because the follow-up question changed the frame.
“How happy are you with your life?” Simple question. But if he’d asked you about your dating life first, your life satisfaction answer would be lower than if he’d asked about your work first. The preceding question primes the frame. You think you’re answering independently. You’re not. The experiment has been run and replicated; Kahneman walks through it in Thinking, Fast and Slow. The effect is robust.
He spoke softly. Israeli accent, never quite lost after decades at Princeton. The voice was gentle and the content was devastating. He’d describe the architecture of your own poor judgment the way a friendly doctor describes your X-ray — with concern, without alarm, with the assumption that knowing the diagnosis is better than not knowing, even when there’s no cure.
The Thing He’d Teach You Without You Realizing It
The real lesson wasn’t about bias. It was about the difference between thinking fast and thinking slow. System 1 — fast, automatic, effortless. System 2 — slow, deliberate, effortful. System 1 does most of the work. System 2 thinks it’s in charge. The gap between those two statements is where every important mistake lives.
He and Tversky worked together for nearly thirty years. Tversky died in 1996. Kahneman won the Nobel Prize in Economics in 2002, the first psychologist to win it, and said publicly that Tversky deserved it equally and would have shared it had he been alive. The Nobel committee doesn’t award posthumously. Kahneman spent the rest of his career making sure people knew the work was joint.
He’d want you to leave the conversation slower. Not sadder. Slower. More willing to check your first answer. More suspicious of your confidence. More aware that the feeling of certainty and the fact of certainty are different things, and that the brain doesn’t tell you which one you’re experiencing.
He proved that confidence is noise, not signal. That the brain builds stories from fragments and calls them truth. He’d show you yours — gently, precisely, and without offering a fix. The awareness is the fix.
Talk to Daniel Kahneman — but check your confidence at the door. He’s been measuring it.