Sketchplanations

Explaining the world one sketch at a time

Have great conversations about ideas through simple and insightful sketches.

In a Book: Big Ideas Little Pictures

5-star rated on Amazon!

Absorb big ideas with crystal-clear understanding through this collection of 135 visual explanations. The collection includes 24 exclusive new sketches and enhanced versions of classic favourites, and each page shares life-improving ideas through beautifully simple illustrations.

Perfect for curious minds and visual learners alike.

See inside the book

Hi, I'm Jono 👋

I'm an author and illustrator creating one of the world's largest libraries of hand-drawn sketches explaining the world—sketch-by-sketch.

Sketchplanations have been shared millions of times and used in books, articles, classrooms, and more. Learn more about the project, search for a sketch you like, or see recent sketches below.

Recent sketches

The McNamara Fallacy, also known as the quantitative fallacy, where we measure what is easy and ignore what is hard

The McNamara Fallacy

The McNamara Fallacy is a belief in easy-to-measure quantitative metrics at the expense of hard-to-measure qualitative factors. Robert McNamara was president of Ford Motor Company and later US Secretary of Defense during the war in Vietnam. He was highly intelligent and excelled at dealing with data and using it to inform strategy.

Coined by social scientist Daniel Yankelovich, the McNamara Fallacy, also called the quantitative fallacy, involves:

- Prioritising easier-to-measure quantitative metrics, and
- Disregarding hard-to-measure qualitative factors as unimportant

The fallacy can progress as follows:

1. Measure what can be measured
2. Disregard what we can't measure
3. Assume what cannot be measured is not important
4. Conclude that what can't be measured doesn't exist

In the words of Yankelovich:

"The fallacy is: If you're confronted by a complex problem that is full of intangibles, you decide to measure only those aspects of the problem that lend themselves to easy quantification, either because you find other aspects difficult to measure or because you assume that they can't be very important or don't even exist. … It is a short, fatal step from the statement, 'There are many intangibles and imponderables that we can't put on our computers,' to the statement, 'Let's measure what we can and forget about the intangibles.'"

Yankelovich cites working with Ford during McNamara's time and sharing research that included both quantitative and qualitative factors. As the research was assessed, the qualitative data on the meanings people gave to small cars were discarded, and the less significant quantitative and demographic data were retained.

Examples of the McNamara Fallacy

I first learned about the fallacy from a reader's article about one of the easiest-to-measure aspects of a bike: its weight. All things being equal, we'd probably prefer a lighter bike, but other aspects like maintenance, reliability, and handling are important yet harder to measure, report on, and compare. As a result, weight often comes to the fore at the expense of the others.

Other examples might include:

- Perhaps your hiring time is down, but how is the fit of the people you're bringing in?
- Maybe more people are visiting your website, but they aren't the right people for your service.
- We can calculate a country's GDP, but GDP doesn't account for human labour without a monetary transaction—like a home-cooked meal—or vital work done by nature, like filtering water, sequestering carbon, or lifting spirits.
- If food in a can gives you all the nutrients you need, what are you missing by skipping family meals?

A commonly cited example of the McNamara Fallacy is the US military's approach to measuring progress in the Vietnam War.

The McNamara Fallacy and the Vietnam War

As US Secretary of Defense from 1961–1968, McNamara employed techniques similar to those he had used successfully in business to assess the progress of the war in Vietnam. If wars were won by inflicting damage on the enemy, then metrics measuring the damage inflicted should be decent proxies. In particular, body count evolved into the primary measure of progress.

Reliance on purely quantitative metrics had significant shortcomings. Enemy body count was more easily quantified than enemy morale, political support, or resolve. And because so much rested on this measure, many officers believed it was also often inflated (see Goodhart's Law and Campbell's Law below). Measuring the tons of explosives dropped is easier than measuring the reduction in capability they caused. Counting your troops is easier than measuring their abilities. According to the numbers, the US was winning the war, yet it failed to overcome the resistance of North Vietnam.

Related Ideas to the McNamara Fallacy

An over-reliance on quantitative metrics quickly leads to several other related problems:

Goodhart's Law
When a measure becomes a target, it ceases to be a good measure. For example, assessing the progress of a war by the number of enemy dead may lead to increased killing to inflate the numbers.

Campbell's Law
The more a metric counts for real decisions, the greater the pressure for corruption, and the more it distorts the situation it's intended to measure. For example, if reducing crime rates matters for law-enforcement jobs, it creates an incentive to under-report cases or downgrade crimes.

The Streetlight Effect
Looking where it's easiest to look, rather than where it matters. Also known as the drunkard's search or looking under the lamppost, the Streetlight Effect is named after the economists' joke of a person scrabbling on the ground for their car keys under a street light. When asked where they lost them, the person says they dropped them "over there," but the light's much better over here. For example, optimising an existing product because it is known and does well, rather than working on an uncertain new product.

You Get What You Measure
The instrument you use to measure affects what you see. "For example, in school it is easy to measure training and hard to measure education, and hence you tend to see on final exams an emphasis on the training part and a great neglect of the education part." — Richard Hamming

The Cobra Effect
When an implemented policy backfires, causing the opposite of the intended outcome. Named after a British attempt to reduce cobras in Delhi by introducing a bounty on dead cobras. Seeing the lucrative bounty, people started farming cobras, thereby increasing their numbers. The Streisand Effect is a well-documented case of a cobra effect: when Barbra Streisand tried to suppress images of the coast that included her house, the attention drawn to her attempts brought thousands of people to look at images they would otherwise never have considered.

All of these are examples of the broader Law of Unintended Consequences, which comes from trying to regulate a complex system with a simple one.

More Related Ideas to the McNamara Fallacy

If the ideas above aren't enough, you might also see:

- In theory, practice is the same as theory, but not in practice
- All models are wrong but some are useful
- Reliability vs validity: you can hit the target but miss the point
- Analytics maturity
- What gets measured gets better
- The map is not the territory (no sketch!)

For a super discussion on the measurement of all sorts of things, including acute risks (being run over by a bus) and chronic risks (smoking), I recommend the fun podcast episode: Microlives & The Art of Uncertainty with Sir David Spiegelhalter.

The article about bike weight is The McNamara Fallacy and Bikes by Peter Verdone.

Daniel Yankelovich, who named the McNamara Fallacy, stressed no disrespect to "one of our most distinguished citizens," Robert McNamara, "a brilliant and dedicated man who brings a vital intensity to bear on his work." Some of the complexities of the war and McNamara are covered in the Academy Award-winning documentary The Fog of War, which makes fascinating and, at times, uncomfortable viewing.
All models are wrong, but some are useful - George Box statistics quote example and meaning with a map and simple model

All models are wrong, but some are useful — George Box quote

One of my favourite statisticians' quotes is British statistician George Box's point that:

"All models are wrong, but some are useful." — George Box

His point, as I understand it, is simply that any model is inherently a simplification and approximation of reality. It will never capture reality in its entirety. This applies to the formula of expected revenue and expenses that you drag down cells in a spreadsheet. And it applies to our largest, immensely complex weather and climate models with millions of individual variables and data points. They can never be complete. And while we should keep the limitations of all our models, forecasts, predictions, and explanations in mind, that doesn't stop them from, many times, being useful.

While you can, and many people do, argue with the premise of models being "wrong," I like the quote because it reminds me to be humble and duly sceptical of any models I read about or put together. While their accuracy varies enormously, they are all fallible. It also reminds me that any model has a purpose—emphasising some features of reality and ignoring others.

On Maps

While a map is not a model in the statistical sense that George Box likely had in mind when he uttered the phrase, I find that maps are a great example of selective focus. In the sketch, you can see the messy complexity of the world, the traditional map where buildings and roads become marks on paper, and the simplification of a bus route's sequential stops. Other times, maps may focus on:

- Human geography: streets, buildings, shops
- Tourist attractions
- Relief, elevation, and geographic features
- Water courses and watersheds
- Traffic and travel times
- Political boundaries

And any number of other things. They are all models representing some aspects more accurately at the expense of others. As the saying goes, "The map is not the territory."

Other Examples

Some other examples of simplified, yet still useful, models:

- Weather forecasts: imagine trying to condense all the complexities of a day's weather in the UK into a single icon. Not so easy. And yet, it can still help you decide whether to bring an umbrella or postpone a sailing trip.
- The Mercator projection: useful for sailors; massively distorted at high latitudes.
- Naismith's Rule: for estimating walking time in the mountains based on distance and elevation.
- An orthographic drawing of a house or product: may highlight relevant features for construction—wiring, plumbing, structural integrity—while ignoring textures, colours, imperfections, or cost.
- The Bohr model of the atom: electrons orbiting like planets around the nucleus. Imperfect, but useful.
- Normal distributions: such as you might see in people's heights, are rarely, if ever, perfectly normal and yet can still be useful approximations.
- Supply and demand curves: approximate buyer and supplier behaviour while making assumptions about rationality and the availability of information.
- The London Underground map: famously simplified to perpendicular and angled lines, it transforms travelling on the Tube while confusing walking distances.

A model is a tool, and its usefulness depends on what you are trying to do. If you're trying to plot courses on a flat map, the Mercator projection is extremely helpful. If you're trying to understand how large Greenland is compared to other countries, it's misleading. Box advocated using economical models that allow us to interrogate practice while staying aware of where they may be importantly wrong.

It's not lost on me that I present a lot of models simpler than reality in Sketchplanations. I think that's because, when we consciously acknowledge their imperfections and limitations, a good model, like a good framework, can be just so helpful. As Larry Keeley said, "Building a good framework is like cutting cubes out of fog."

Related Ideas to All Models Are Wrong

Also see:

- Without data, you're just another person with an opinion — W. Edwards Deming
- The Metrics Onion
- VUCA: Volatility, Uncertainty, Complexity, Ambiguity
- Sneaky averages
- Measures of central tendency: mean, median, mode
- Correlation is not causation
- In theory, practice is the same as theory, but not in practice
- Sampling bias
- Chihuahua syndrome
- Science and engineering: what's the difference?
- The mathematics of everyday life

Rick Wicklin wrote a helpful write-up of his investigations on the quote: Did George Box say, "All models are wrong, but some are useful"? (Spoiler: yes.)
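Naismith's Rule, mentioned among the examples above, is itself a tidy instance of a wrong-but-useful model. In its commonly quoted form, allow one hour for every 5 km of distance plus one hour for every 600 m of ascent. A minimal sketch of that arithmetic, with a function name of my own choosing:

```python
def naismith_time_hours(distance_km: float, ascent_m: float) -> float:
    """Estimate walking time using Naismith's Rule (common metric form):
    1 hour per 5 km of distance, plus 1 hour per 600 m of ascent."""
    return distance_km / 5 + ascent_m / 600

# A 10 km hill walk with 900 m of climbing:
# 10/5 + 900/600 = 2 + 1.5 hours
print(naismith_time_hours(10, 900))  # 3.5
```

Like any model, it ignores terrain, fitness, weather, and rest stops, yet it's still a useful planning baseline.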
Iceberg model systems thinking diagram showing Events above the surface and Patterns, Structures, and Mental Models below the surface

The Iceberg Model

Icebergs make powerful metaphors because much of what matters is hidden. Despite knowing the maths, it's still genuinely surprising, and sometimes hard to believe, that around 90% of an iceberg is beneath the surface. Icebergs are, therefore, a super metaphor for when visible outcomes are shaped by unseen forces—where the small amount we see is built on top of a whole lot that's out of sight. Even better if this "whole lot more" is unexpected. In my experience, this fits a lot of things. It's a fair representation, for example, of the unseen work that goes into near enough anything.

Icebergs have been mobilised in the name of better understanding psychology, culture, and, in this case, systems. I like the model because it prompts us to question what could be "under the surface" and, in terms of observed behaviour, what's driving what we see. This version of the iceberg model comes from the work and teaching of the systems thinker Donella Meadows and the Academy for Systems Change.

Layers of the Systems Thinking Iceberg Model

The four layers of this iceberg are:

Above the surface

Events
What is happening? What are the facts? What behaviours do you see?

Below the surface

Patterns
Are the events connected by trends over time? Do events relate to each other?

Structures
What influences the patterns you see? Think about routines, incentives, environments, organisations, policies, and rituals.

Mental Models
What are the values, assumptions, and beliefs people hold that shape the system?

The deeper down you go, the more leverage you have to create change. For example, changing someone's attitudes and beliefs about eating will have far-reaching effects on their behaviours around meals.

Example of the Iceberg Model

What might this look like in practice?

Event: A child steals their sibling's toys.
Patterns: The behaviour only started recently. It tends to happen after swimming lessons or before bed.
Structures: They're tired and hungry after swimming, lowering their tolerance and patience. Parents are often busy preparing dinner. The younger child gets more attention.
Mental Models: Older children are expected to be more self-sufficient. "When I act up, I get attention." Overscheduling activities.

Or:

Event: A colleague is late to a meeting, again.
Patterns: They used to be on time for meetings. Being late to the first meeting of the morning is most common.
Structures: Meetings rarely start on time. There are no consequences for being late. Some people have many meetings, others have few, and so don't pay close attention to the calendar.
Mental Models: They don't value the meetings. They don't understand the impact their lateness has on others. "Promptness is respect and required" vs "promptness doesn't matter if you get results."

In each case, looking below the surface reveals what's really driving the behaviour.

How to Use the Iceberg Model

Start with an event or situation you want to understand, then work down through the iceberg, asking yourself questions at each level and writing down the answers. It doesn't have to be a big exercise.

- Write down patterns you see about the event and those that might be related to it.
- Think about what structures might be causing it. Consider physical things, organisations, policies, and rituals that may all be contributing. Write these down.
- Then dig into the source of these structures. What beliefs enable them? What values align with them? What attitudes or expectations keep them going?

You don't need to be exhaustive — even a quick pass can reveal useful insights. Many solutions focus on the events, but lasting change often comes from addressing issues below the surface.

I made some blank versions of these, one on the blue background and one on white, that you can use to fill out your own icebergs: Iceberg model templates.

Related Ideas to the Iceberg Model

Systems, thinking, and systems thinking:

- Pace layers
- Rich pictures
- 9 Windows
- Powers of 10
- The S-Curve
- Data - Information - Knowledge - Wisdom

Icebergs:

- Overnight success
- Iceberg orientation — for the 90% bit
- Buoyancy

I created this iceberg model illustration as one of a series of visuals for Kaine Ugwu about Systems Thinking.
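The "maths" behind the 90% figure is Archimedes' principle: a floating body displaces its own weight of water, so the submerged fraction of an iceberg equals the ratio of ice density to seawater density. A quick sketch, using typical density values:

```python
# Archimedes: a floating body displaces its own weight of water,
# so submerged fraction = density of ice / density of seawater.
DENSITY_ICE = 917        # kg/m^3, typical glacial ice
DENSITY_SEAWATER = 1025  # kg/m^3, typical seawater

submerged_fraction = DENSITY_ICE / DENSITY_SEAWATER
print(round(submerged_fraction, 2))  # 0.89 -- roughly 90% below the surface
```

The exact fraction varies with ice and water conditions, which is why "about 90%" is the honest way to say it.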
What is uitwaaien: Dutch word meaning walking in the wind to clear your head, with worries blown away on a windy beach

Uitwaaien: The Dutch Word for Walking in the Wind

Have you ever found something rather wonderful about a walk in a strong wind? You might appreciate the Dutch concept of uitwaaien.

What does "uitwaaien" mean?

Uitwaaien is a Dutch word meaning to go out into the wind, often for a walk or bike ride, to clear your head and refresh your mind. It combines two words: uit, meaning "out," and waaien, meaning "to blow," as wind blows. How to pronounce uitwaaien: roughly o-ut-vye-en.

In English, the idea of uitwaaien is the refreshing feeling of going out into a strong wind and letting it clear your head. Like getting an airing out, I picture it blowing your worries away. A breath of fresh air.

We like to go to the southwest of England in autumn and winter. A walk on the coast or up a tor in Dartmoor, getting battered around with a strong wind in your face, pulls you into the roar and immerses you in it. It's bracing and never fails to give a satisfying, relaxed calm when you finally get back inside and exhale. And there are plenty of windy spots across the Netherlands, where the word uitwaaien comes from, to enjoy a regular walk in the wind and clear your head.

A walk in nature always has the power to soothe, calm, and clear your head, be it the three-day effect, forest bathing, or solvitur ambulando. But nature combined with a strong wind seems to have an extra power.

Uitwaaien is one of many beautiful foreign words that capture a feeling English describes only with a longer phrase. I've linked some others below.

I also made prints of Uitwaaien, with words and without words.

Related Ideas About Walking in Nature

Also see:

- The three-day effect
- Forest bathing
- Solvitur ambulando
- Apricity: the warmth of winter sun
- 5 Ways to Wellbeing
- How to Instantly Feel Better

Some other super non-English words:

- Rückenfigur
- Wabi-sabi
- Kaffikok: the distance before you need a cup of coffee
- Tsundoku: buying books and letting them pile up without reading them
- Vorfreude: the pleasure of anticipation
- Schadenfreude: pleasure at someone else's misfortune

Grateful to my Dutch friend for sanity-checking all this 🙏
Coffee: the glorious solution to the coffee sleep cycle with poor sleep caused by coffee, leading to tiredness, solved by coffee

Coffee: The Glorious Solution to The Coffee–Sleep Cycle

The genius of coffee: it's the solution to a problem of its own making. Too much coffee leads to poor sleep, which leads to waking up tired, which leads to craving that glorious coffee fix to make the tired go away. An ingenious self-reinforcing cycle. Few other substances or systems manage the elegance and simplicity of this cycle. Imagine if wine cured the effects of drinking too much wine.

The stark clarity of this coffee–sleep cycle was made clear to me in Michael Pollan's recent This Is Your Mind on Plants, which devotes a third of the book to caffeine. Before coffee and tea made their way to Western Europe with their doses of caffeine, Pollan explains, it was common to drink ale and beer throughout the day, starting at breakfast. The result, surely, was that many people went about their duties in a light haze. The replacement of the fuzziness of alcohol with the clarity, focus, and alertness that caffeine brings may genuinely have played a major part in human progress since. Intriguing.

Although many people—though not me—can fall asleep after a late coffee, there's evidence that it still affects sleep quality. Yet despite knowing what coffee is doing in perpetuating my need for it, I, and millions of others, happily rejoin the cycle each day.

As I sketched this, I wondered if it were clearer simply as The Coffee–Sleep Cycle (prints). What do you think?

Right. About finished this. I'm off for a coffee.

More Coffee-related Ideas

Also see:

- Kaffikok: the distance you can go before you need a cup of coffee
- When drinking tea, just drink tea
- Half caff: adjust your daily caffeine intake
- Nappuccino
- The half-life of caffeine
- Clear the fog with a redeye
- Americano - Long black
- Make an Irish coffee
- Make Vietnamese coffee
- Microlife: the unit of life
- The virtuous cycle of exercise and sleep
- Painkillers and vitamins
- Addiction
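The half-life of caffeine, one of the related sketches above, is why a late coffee can still disturb sleep: caffeine decays exponentially, with a half-life commonly quoted around five hours (it varies a lot between people). A rough sketch of that decay, with my own function name and an assumed five-hour half-life:

```python
def caffeine_remaining_mg(dose_mg: float, hours: float, half_life_h: float = 5.0) -> float:
    """Exponential decay: remaining = dose * 0.5 ** (hours elapsed / half-life)."""
    return dose_mg * 0.5 ** (hours / half_life_h)

# A 200 mg mid-afternoon coffee, ten hours later (two half-lives):
# 200 * 0.5**2 = 50 mg still circulating at bedtime.
print(caffeine_remaining_mg(200, 10))  # 50.0
```

Two half-lives on and a quarter of the dose is still in your system, which helps explain the cycle the sketch describes.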
Pyrrhic saving concept showing shoppers leaving a megastore with an overloaded trolley under deal signs, captioned “If we save any more money, we’ll be ruined” – spending more to save money example

Pyrrhic Saving: "Saving" money by spending more

Perhaps you've heard of a Pyrrhic victory: a victory so costly that it feels almost like a defeat. The name comes from the tale of King Pyrrhus of Epirus in 279 BCE. On defeating the Romans, yet seeing the loss of so many of his best warriors, he declared, "If we are victorious in one more battle, we shall be ruined!" A Pyrrhic victory is an apparent gain that comes at too high a cost.

What Is a Pyrrhic Saving?

A Pyrrhic saving is a saving that costs you more than it benefits you—spending more to save money you wouldn't otherwise have spent. "If we save any more, we shall be ruined!"

Ever bought something you didn't need just because it was on sale? Or chosen the bundle when you only wanted one? These are Pyrrhic savings: a money trap disguised as a bargain.

Examples of Pyrrhic Savings

Some personal Pyrrhic saving examples:

- A £20 voucher that has you spending another £40 to use it.
- Low-cost memberships that make you buy more than you would have done (Prime?).
- Going to buy one digital download and choosing the discounted pack of three, then never using the other two.
- A free kids' meal that results in eating out as a family when you would otherwise have stayed in.

In business, it can look similar:

- Getting the larger office space that's cheaper per square foot but largely sits empty.
- Upgrading to the higher software tier because it's a better deal, yet barely using the extra features.
- Bulk-purchasing inventory at a discount that ties up cash and costs you to store.
- Locking into a multi-year contract at a saving when you would otherwise have switched.

Saving money is satisfying. And sometimes a saving genuinely pays you back. But it could be worth asking: is this truly a saving — or money better left for another day?

Related Ideas to Pyrrhic Saving

Also see:

- Pyrrhic Victory
- The Subscription Trap
- Contentment
- Sales aren't always the bargains they seem
- The decoy price
- The mathematics of everyday life
- Sneaky casinos
- Spend better
- Debt's vicious cycle
- Euphemisms for losing money
- Mental accounting
- Bull market, bear market
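The test for a Pyrrhic saving is simple arithmetic: compare what you actually paid with what you would have spent without the "deal." A small sketch of that check, with names of my own choosing:

```python
def net_cost_of_deal(planned_spend: float, actual_spend: float, discount: float) -> float:
    """Money out of pocket beyond what you planned to spend:
    (actual spend - discount) - planned spend. Positive means a Pyrrhic saving."""
    return (actual_spend - discount) - planned_spend

# A £20-off voucher that only applies on a £60 basket, when you'd planned to buy nothing:
print(net_cost_of_deal(0, 60, 20))  # 40 -- the "saving" cost you £40
```

If the result is positive, the discount has you spending more than you otherwise would have: a saving in name only.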