Julia Galef thinks we should be more like scouts than soldiers

Illustration of Julia Galef by Rebecca Clarke for Vox

Rationalist Julia Galef believes everyday people will benefit from assessing all sides of a debate, rather than just their own.

I first met Julia Galef in 2012 at the inaugural weekend workshop of the Center for Applied Rationality (CFAR), a nonprofit that Galef co-founded that same year to teach the concepts and practical skills of human rationality.

For CFAR, this means refining techniques for reasoning more accurately, understanding the world, and making plans that work (and happen on schedule!). At the retreat, Galef advised a room full of philosophy students and programmers (and me, a random intensive care unit nurse) on how to think about probabilities and uncertainty in real-life contexts.

A decade later, Galef continues to write about and advocate for an idea that is more important than ever in our irrational age: that we shouldn’t decide in advance which conclusions must be defended, and that we need to stay open to uncertainty.

She and the other founders of CFAR — who met via the rationality blog LessWrong (for which I’ve written) — believe that human intelligence alone isn’t enough to address the biggest problems facing the world. While intelligence is responsible for the astounding advances in technology, prosperity, and quality of human lives over the past several thousand years, it can also be applied to causing massive harm.

As the Nobel Prize-winning economist and psychologist Daniel Kahneman argued in his influential 2011 book Thinking, Fast and Slow, human beings are prone to systematic errors: making assumptions or jumping to conclusions, seeing what we expect to see, and being consistently overoptimistic about plans and deadlines.

LessWrong was a space to discuss ways of mitigating and working around these very human flaws, but discussions there tended to be abstract. CFAR, by contrast, hoped to provide concrete, immediately useful training. In the 2012 workshop I attended, we learned to apply Bayes’ theorem — the formalized math for changing your mind based on new information about an uncertain situation — to our thinking and carefully mapped out the different, sometimes conflicting motivations and goals involved in major life decisions.
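To make that concrete, here is a minimal sketch in Python of the kind of update Bayes’ theorem describes; the scenario and all of the numbers are invented for illustration and are not taken from CFAR’s curriculum.

```python
# A minimal sketch of a Bayes' theorem update, using made-up numbers.
# Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)
#   prior               = P(H), how likely the hypothesis seemed before the evidence
#   likelihood_if_true  = P(E | H), how likely the evidence is if the hypothesis is true
#   likelihood_if_false = P(E | not H), how likely the evidence is if it is false

def bayes_update(prior: float, likelihood_if_true: float, likelihood_if_false: float) -> float:
    """Return the updated (posterior) probability of a hypothesis after seeing evidence."""
    evidence = likelihood_if_true * prior + likelihood_if_false * (1 - prior)
    return likelihood_if_true * prior / evidence

# Hypothetical example: you give a project a 30% chance of shipping on time.
# A demo goes well; suppose demos go this well in 80% of on-time projects
# but only 40% of late ones (invented numbers).
posterior = bayes_update(prior=0.30, likelihood_if_true=0.80, likelihood_if_false=0.40)
print(f"Updated chance of shipping on time: {posterior:.2f}")  # prints about 0.46
```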

Galef came to the project with a varied background: After completing an undergraduate degree in statistics, she did social science research at Harvard and MIT and international economics work for Harvard Business School, then spent several years in New York as a freelance journalist. Shortly before CFAR came together, she co-launched the podcast Rationally Speaking, where she interviews a wide range of experts on topics related to rationality and effective altruism.

Galef moved on from CFAR in 2016 and pivoted to working on a book about rationality, The Scout Mindset: Why Some People See Things Clearly and Others Don’t. Published last year, the book delves into the importance of curiosity and of genuinely trying to learn the details of a situation rather than fighting for a particular side. In Galef’s view, this means acting more like a scout surveying the battlefield of debate than like a single-minded soldier.

Galef doesn’t claim to have solved the challenges of acting rationally even for herself. But as she said in a Vox interview after the book’s release: “Even when you’re motivated to try to improve your own reasoning and decision-making, just having the knowledge itself isn’t all that effective. The bottleneck is more like wanting to notice the things that you’re wrong about, wanting to see the ways in which your decisions have been imperfect in the past, and wanting to change it.”

