The most surprising thing about back-of-the-envelope estimation is that it’s not about getting the exact number, but about demonstrating a structured thought process and understanding of scale.

Let’s say you’re asked to estimate the number of piano tuners in Chicago. You don’t pull out a calculator. Instead, you break it down:

  1. Population: Chicago has roughly 3 million people.
  2. Households: Assume an average household size of 2.5 people. So, 3,000,000 / 2.5 = 1,200,000 households.
  3. Piano Ownership: Not every household has a piano. Let’s guess 1 in 20 households has a piano. So, 1,200,000 / 20 = 60,000 pianos.
  4. Tuning Frequency: Pianos need tuning, but not constantly. Maybe once a year on average. So, 60,000 tunings per year.
  5. Tuner Capacity: How many tunings can one tuner do? A tuning takes about 2 hours. A tuner works maybe 8 hours a day, 5 days a week, 50 weeks a year. That’s 8 * 5 * 50 = 2,000 working hours per year. If a tuning takes 2 hours, that’s 2,000 / 2 = 1,000 tunings per year per tuner.
  6. Number of Tuners: 60,000 tunings per year / 1,000 tunings per tuner per year = 60 piano tuners in Chicago.
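The six steps above translate directly into a few lines of arithmetic. The numbers below are the same assumptions stated in the list, not real data:

```python
# Back-of-the-envelope piano tuner estimate (all inputs are the guesses above).
population = 3_000_000            # Chicago population, roughly
household_size = 2.5              # people per household (assumption)
piano_fraction = 1 / 20           # households with a piano (guess)
tunings_per_piano_per_year = 1    # tuning frequency (assumption)
hours_per_tuning = 2              # time per tuning (assumption)

households = population / household_size              # 1,200,000
pianos = households * piano_fraction                  # 60,000
demand = pianos * tunings_per_piano_per_year          # 60,000 tunings/year

annual_work_hours = 8 * 5 * 50                        # 2,000 hours per tuner
capacity = annual_work_hours / hours_per_tuning       # 1,000 tunings/tuner/year

tuners = demand / capacity
print(round(tuners))  # → 60
```

Writing the estimate as code makes every assumption a named variable, which is exactly the habit the interview is testing.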

This process isn’t about the precision of "1 in 20 households" or "2 hours per tuning." It’s about the logic: population → units of interest → demand → supply. Each step is a reasonable assumption that can be defended. The interviewer is looking for your ability to:

  • Deconstruct a large problem: Break it into smaller, manageable pieces.
  • Make reasonable assumptions: State them clearly and justify them.
  • Perform simple arithmetic: Multiply and divide powers of 10.
  • Think about constraints and capacity: Understand how much work can be done.

Here’s another example: Estimate the number of Google searches per day.

  1. World Population: ~8 billion people.
  2. Internet Users: Not everyone uses the internet. Let’s say 60% are online: 8 billion * 0.6 = 4.8 billion users.
  3. Active Searchers: Not every internet user searches Google daily. Let’s assume 10% of internet users search Google per day. That’s 4.8 billion * 0.1 = 480 million people searching per day.
  4. Searches per Person: Some people search once, some many times. Let’s average it to 3 searches per person per day.
  5. Total Searches: 480 million people * 3 searches/person = 1.44 billion searches per day.
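The same pattern applies: start from a known large number and apply ratios. Again, every input below is one of the assumptions from the list, not a measured figure:

```python
# Back-of-the-envelope Google search estimate (all inputs are assumptions).
world_population = 8e9            # ~8 billion people
internet_fraction = 0.6           # share of people online (assumption)
daily_searcher_fraction = 0.1     # internet users who search daily (assumption)
searches_per_person = 3           # average daily searches per searcher (assumption)

internet_users = world_population * internet_fraction          # 4.8 billion
daily_searchers = internet_users * daily_searcher_fraction     # 480 million
total_searches = daily_searchers * searches_per_person         # 1.44 billion

print(f"{total_searches:.2e} searches/day")  # → 1.44e+09 searches/day
```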

You can refine this. What if you consider different regions? What if you assume different search frequencies for developed vs. developing countries? The goal is to show you can build a model. Google doesn’t publish exact figures, but public estimates put the real number at several billion searches per day (trillions per year), so your initial estimate of ~1.5 billion is within an order of magnitude — a good start — and the interviewer might ask you to refine it based on specific information. For instance, if they point out that many people in India have smartphones but limited data plans, you might adjust the "active searchers" or "searches per person" for that segment.

The core idea is to move from a known, large number (population, internet users) to the unknown (searches, piano tuners) by applying logical ratios and constraints. You’re not expected to know the exact ratio of piano tuners to households, but you are expected to know that most households don’t have a piano and that tuners have limited working hours.

The real power comes when you start to think about the sensitivity of your assumptions. If you change the "piano tuning frequency" from once a year to once every two years, how does your final number change? If you change the "searches per person" from 3 to 5, what’s the impact? This demonstrates a deeper understanding of the system you’re modeling, showing you can identify the most impactful variables.
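One lightweight way to explore that sensitivity is to wrap the estimate in a function whose parameters are the assumptions, then vary one at a time. This is a sketch using the piano tuner numbers from earlier; the defaults are the same guesses, and the function name is just illustrative:

```python
def estimate_tuners(population=3_000_000, household_size=2.5,
                    piano_fraction=0.05, tunings_per_piano_per_year=1.0,
                    hours_per_tuning=2, annual_work_hours=2_000):
    """Estimate piano tuners; every parameter is an explicit assumption."""
    pianos = population / household_size * piano_fraction
    demand = pianos * tunings_per_piano_per_year        # tunings needed per year
    capacity = annual_work_hours / hours_per_tuning     # tunings one tuner can do
    return demand / capacity

baseline = estimate_tuners()                                       # 60.0
every_two_years = estimate_tuners(tunings_per_piano_per_year=0.5)  # 30.0
print(baseline, every_two_years)  # → 60.0 30.0
```

Halving the tuning frequency halves the answer, because the model is linear in that variable; spotting which assumptions the result is most sensitive to is the "deeper understanding" the paragraph above describes.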

The next challenge in this vein is often a variation on the theme, like estimating the storage needed for all photos taken by users of a specific app, or the number of flights per year between two cities.

Want structured learning?

Take the full System Design course →