When I was young, I read several books about the Titanic’s sinking. I remember reading that when people were boarding the lifeboats, there were many panicked individuals who forced their way into the boats, pushing aside the young, the elderly, and women. Little me thought to himself, “I wouldn’t have done that, I would have let the others board first”, as if I were some kinder, more rational human.
There are more examples of young me thinking like this: I would do better than average on a certain test, I would be good at a multiplayer video game, I would still be the fastest kid at my new school, and more. Typical childlike thinking, but a pattern still often found among adults. Why do people think like this, assuming they will be the special one, the one that succeeds, the one that becomes the exception? Sure, sometimes you’re part of that small elite group. But for the most part, we’re all typical.
This flawed thinking is neatly captured by the principle of mediocrity, which states that what we are, see, and experience is likely typical. The principle is derived from the Copernican principle, which is used in astronomy and states that our position in the universe is typical. For instance, the Earth is just one insignificant planet of the eight that go around the sun. Scaling up, the sun isn’t a rare object; there are at least 100 billion other stars in our galaxy. Then there are hundreds of billions of other galaxies in the universe.
You sometimes hear that there are more stars in the universe than grains of sand on Earth; it’s a great illustration of how small we are. We shouldn’t expect to be a special outlier in a set on the order of 10²¹ sand grains. Whether or not the topic is space, the principle of mediocrity tells us to be humble and expect to be an average outcome.
Origins of the principle
The Copernican principle was born out of a series of events in the history of astronomy, each one lowering humanity’s stature a little further. Going through them also helps show what the principle can say about future events.
The Geocentric model
Millennia ago, we thought that we were at the center of the universe. After all, we don’t feel the Earth moving, and objects like the sun and moon seem to go around us, so we must be at the center. Additionally, the world seemed flat, because if it were round, then surely we would be able to see the ground slope away. The geocentric model of the universe had everything revolving around a stationary Earth at the center: the moon, the planets, the sun, and all other celestial bodies. Luckily, the flat-Earth part dissolved rather quickly and a spherical planet took its place.
The geocentric model was convincing to all civilizations, no matter how advanced or rudimentary, for nearly two thousand years. It put the Earth and humans at the center; the heavens were filled with perfect divine objects made of a pure substance called aether. They all orbited around the Earth, performing elaborate, spaghetti-strand orbital dances to explain the retrograde motion seen from Earth. Even the sun, widely worshipped as a divine figure since the dawn of humanity, was revolving around us.
Obviously, in hindsight it is easy to mock geocentrism endlessly for its outdated views. The model was created when astronomy didn’t exist and astrology was dominant, religion was involved in every field of study, and people believed that the universe was made from air, fire, water, and dirt. The important part is the anthropocentrism that went into the geocentric model: humans sit at the center of the whole universe while the divine and almighty gods and heavens revolve around us. If gods are almighty and we revere them, shouldn’t it have been the other way around?
The Heliocentric model
In the 16th century, the geocentric model saw its greatest threat yet: Nicolaus Copernicus’s heliocentric model, which put the sun at the center of the universe and made Earth just one of the planets revolving around it, alongside the other planets and the stars. The heliocentric model elegantly explained several problems the geocentric model had: retrograde motion is caused by the Earth’s own movement, seasons exist because Earth’s axis isn’t perfectly upright, and more.
The heliocentric idea is actually far older than the 16th century; it originated around the same era as the geocentric model, and other astronomers had proposed their own heliocentric models before Copernicus made the idea widespread.
The heliocentric model still had two big problems:
- The sun is at the center of the universe
- All orbits are perfectly circular (later corrected by Johannes Kepler)
What I find particularly interesting about the heliocentric model is the effect it had on the status of humans: demoted from sitting at the center of the universe to orbiting the sun. Still close to the center, but no longer the most important figure. So great was the damage that the Church suppressed Galileo’s work as heresy. The heliocentric model marked the beginning of religion’s retreat from astronomy; mathematics, instead, became the way to explain celestial events.
Galactocentrism
In the 1800s, Friedrich Bessel became the first astronomer to determine the distance to another star. The distance, about 11 light years, was so vast that the known universe suddenly swelled from the scale of Saturn’s orbit to something tens of thousands of times larger. Now the sun was no longer at the center of the universe; it too was just another tiny blip in a much larger galactic system. The Milky Way became the then-known universe, and we now know it contains roughly 100 billion stars and is about 100,000 light years in diameter.
In this new universe, the sun and Earth were once again assumed to be at the center of the galaxy. As with the flaws of the geocentric model, the assumptions behind galactocentrism came from the limits of the technology and beliefs of the time. Ultimately, the anthropocentric mindset had persisted once again. Soon after, astronomers realized that the solar system sits in the outskirts of the galaxy, nowhere near the center; another Copernican realization.
Island universes
In the early 1900s, the Milky Way universe was enlarged again when Edwin Hubble observed distant galaxies lying millions of light years beyond the Milky Way. The term “island universe” refers to the hypothesis that certain nebulae, then understood to be inside the Milky Way, were actually separate galaxies. For example, the famous Andromeda galaxy was then called the Andromeda nebula and was thought to sit within the Milky Way. With telescopes becoming ever more powerful, we’ve peered deep into the universe, which is now an astounding 46 billion light years in radius.
The acentric model
Long story short, the universe doesn’t seem to have a center. It is an established fact that the universe is expanding; if you were to accelerate the flow of time, you would see distant galaxies zooming away from the Milky Way. If you were in the Andromeda galaxy, you would see the same thing. Even if you were in a galaxy 46 billion light years away, you would still see the same thing. If you drew evenly spread dots on a balloon and inflated it, where would the center be from a dot’s perspective? Every place is the center, and space expands away from every observer; therefore there is no absolute center of the universe. With Albert Einstein’s work on relativity showing that there is no absolute reference frame in the universe, the assumption that humans stand at the center has completely disintegrated.
Humans started out studying the night sky by assuming they were at the center of the universe. Then they realized the sun was. Then the sun was the center of a much larger galaxy. Then the sun was in the outskirts of the galaxy, but the galaxy was still the entire universe. Then other galaxies were confirmed to exist and the scale of the universe suddenly expanded. And finally, we’ve learned that there is no center of the universe; no absolute reference frame.
This is a story filled with repeated devaluations of where we stand; from being the center of a divine dance to being in the “middle” of nowhere. Out of these demotions, the principle of mediocrity was born; we aren’t special, we should expect what we experience to be the norm. It’s a general rule that applies to all kinds of situations too, not just astronomy.
Applying the principle
Using the principle of mediocrity is a rather down-to-Earth way to approach any event in your life.
Imaginary parties
Let’s say I was invited to a party. I’m not fond of them, but let’s ignore that for now. I can imagine what the party will be like: maybe I’ll end up meeting some cool people, have lots of fun, and enjoy the night. Or I could end up being awkward for two hours with one can of beer, forced to listen to hip-hop blasted from junk speakers, watching other people actually have fun before leaving early and regretting ever having left my house.
What’s more likely out of these two situations? Probably the latter. The party likely won’t be as bad as the second scenario, but there’s a good chance that the ideal story of the first case won’t play out. Maybe I’ll just have a can of beer and talk to one or two people, then go home and neither regret nor enjoy the experience; that sounds more realistic.
Everyone has these moments of expecting something great to happen, like what I described in the opening paragraphs of this article. Sure, the great thing sometimes happens. More often than not, though, it doesn’t; your experience ends up mediocre, typical, nothing unusual.
Fall of the Berlin Wall
There was a clever use of the principle of mediocrity I read about: Princeton University professor Richard Gott predicted when the Berlin Wall would fall. The wall was constructed in 1961 and Gott visited it in 1969, when it was eight years old. Reasonably assuming that the timing of his visit was nothing unusual (not near the very beginning or the very end of the wall’s existence), he estimated with 50% confidence that the wall would stand for at least another 2.67 years and at most another 24. The wall fell 20 years later, in 1989.
This kind of guess is nothing rigorous, and a higher confidence level stretches the time span out significantly. For instance, modern humans have existed for around 200,000 years. Using the mediocrity principle, we can say that we are living at a typical point in our species’ lifetime and estimate, with 95% confidence, that we should go extinct somewhere between 5,000 years and 7.8 million years from now.
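Here is a minimal sketch of the arithmetic behind those numbers, assuming the usual form of Gott’s argument (the function name and structure below are mine, not from his paper): if something has already existed for a time t, then with confidence c its remaining lifetime lies between t·(1−c)/(1+c) and t·(1+c)/(1−c).

```python
# A small sketch of Gott's "delta t" argument, using only the numbers quoted above.

def gott_interval(age, confidence):
    """With the given confidence, the remaining lifetime of something that has
    already existed for `age` lies between age*(1-c)/(1+c) and age*(1+c)/(1-c)."""
    c = confidence
    return age * (1 - c) / (1 + c), age * (1 + c) / (1 - c)

# Berlin Wall: 8 years old at Gott's 1969 visit, 50% confidence
print(gott_interval(8, 0.50))        # -> (~2.67, 24.0) more years

# Modern humans: ~200,000 years old, 95% confidence
print(gott_interval(200_000, 0.95))  # -> (~5,128, 7,800,000) more years
```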
Another good example is the age of the Earth: 4.6 billion years. In about 5 to 6 billion years, the Earth will be destroyed by the sun. You can see how we’re living fairly close to the middle of the planet’s lifetime, which is very typical (although Earth will already be uninhabitable in a comparatively short 1 billion years). You can find out more about these fun estimates here.
Flaws of the principle
One of the most debatable applications of the principle is alien life. The mediocrity principle tells us that planets, stars, and galaxies are very common; there are more stars than grains of sand on Earth. So if we happen to live in a rather average region of space, shouldn’t aliens be common even if the chance of abiogenesis is one in a billion? This is the reasoning behind the well-known Fermi paradox: why don’t we see extraterrestrial life?
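As a rough back-of-the-envelope illustration of that reasoning (both numbers below are assumptions for the sake of argument, not measurements):

```python
# Expected count of life-bearing systems under the illustrative numbers above.
stars_in_milky_way = 100e9   # ~100 billion stars in our galaxy
p_abiogenesis = 1e-9         # hypothetical one-in-a-billion chance per star

print(stars_in_milky_way * p_abiogenesis)  # -> 100.0 expected life-bearing systems
```

Even with such a pitifully small chance, mediocrity-style reasoning still predicts around a hundred living systems in the Milky Way alone, which is what makes the silence puzzling.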
The Fermi paradox is unsolvable with our current knowledge and technology; it will take many generations to resolve. The application of mediocrity to alien life clearly shows the principle’s weakness: it would tell you that aliens should be abundant. But if the chance of life spawning is so incredibly, pitifully low, that alone could explain why we don’t see anyone out there. We might be alone in the galaxy, maybe even in the whole universe.
Ultimately, the principle of mediocrity is just that: a principle. It’s not a mathematical law or a proven theory, just a generally applicable heuristic that works surprisingly often. The smart people at NASA have made mistakes following this principle, and so have the astronomers of history who assumed other stars to be sun-like. So when mediocrity suggests that your next social gathering will likely be less magnificent than you hope, just remember that you can still be the odd one out.