13.8 billion years ago, our universe began. Then followed particles… then galaxies… then stars… then planets. One of these planets, formed 4.5 billion years ago, witnessed the birth of our first human ancestors around 3.2 million years ago. Over those years, our brains evolved to understand emotions, communicate with others, create meaning out of trivial matters, and read online articles pondering their own existence. This seems to be an accurate summary of our existence, right?
As the dawn of the twentieth century came in sight, the great Ludwig Boltzmann revolutionized our understanding of pretty much everything. Etched on the stone above his grave is the equation for entropy: S = k log W. You must’ve heard of entropy in science fiction shows, depressing literature, or a song by The 1975. But what is entropy? At its simplest, entropy is a measure of the number of possible rearrangements of a system that leave it looking the same. A system with a large set of such rearrangements has high entropy. Conversely, a system with few rearrangements has low entropy.
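If you like seeing ideas in code, Boltzmann's formula is easy to play with. Here is a minimal sketch, not anything from Boltzmann himself: the two-halves-of-a-box setup and all the names are illustrative assumptions. We count the rearrangements W of 100 particles split between the two halves of a box and feed the count into S = k log W.

```python
import math

# Boltzmann's constant, in joules per kelvin
k_B = 1.380649e-23

def entropy(W):
    """Boltzmann entropy S = k log W, where W is the number of rearrangements."""
    return k_B * math.log(W)

# Toy system: 100 particles, each sitting in the left or right half of a box.
N = 100

# Exactly one way to have every particle on the left: low entropy.
W_all_left = math.comb(N, 0)     # = 1

# Enormously many ways to have a 50/50 split: high entropy.
W_even_split = math.comb(N, 50)  # ≈ 1.0e29

print(entropy(W_all_left))       # 0.0 — a single arrangement, the lowest possible entropy
print(entropy(W_even_split))     # far larger — vastly more rearrangements look the same
```

The point of the toy model is only the comparison: the evenly mixed state has astronomically more rearrangements than the perfectly sorted one, so it has higher entropy.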
Imagine yourself the night before your assignment is due. You’re sitting at your table dreading the 500-word essay you must write. Once again, you stare at the blank page on your laptop and cannot believe you let it come to this point. In that moment, your document has low entropy. You cannot make changes to the page without changing its look. In other words, there are no rearrangements of your page. Fast forward to the morning, and you have somehow managed to complete your assignment. In front of you is the glorious sight of a full page of sentences. Now, you can replace words with their synonyms, add a sentence, change tenses, or remove a sentence. At a glance, the page looks no different. Your document now has many rearrangements; it has high entropy.
With a clearer understanding of entropy, let us move towards the second law of thermodynamics, which states that any isolated system is likely to go from low entropy to high entropy. Likely is the key word here because, unlike most other laws we study, the second law of thermodynamics is probabilistic. Said differently, it is overwhelmingly likely that entropy will increase over time, but it is possible for it to decrease. Why, then, do we call it a law? Because the likelihood of a system’s entropy falling is so incredibly low that you never have to worry about it. That is, unless you have an eternity to wait.
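A toy simulation makes the probabilistic reading concrete. The sketch below is purely illustrative (the hopping rule, particle count, and function names are my assumptions): start 100 particles in the left half of a box and let one randomly chosen particle hop sides each step.

```python
import random

def simulate(n_particles=100, n_steps=2000, seed=0):
    """Start with every particle in the left half (low entropy) and let
    one randomly chosen particle hop to the other side each step."""
    rng = random.Random(seed)
    left = [True] * n_particles        # True = particle is in the left half
    counts = []
    for _ in range(n_steps):
        i = rng.randrange(n_particles)
        left[i] = not left[i]          # the chosen particle switches sides
        counts.append(sum(left))       # record how many sit on the left
    return counts

counts = simulate()
# The count drifts from 100 toward the 50/50 split — the macrostate with by
# far the most rearrangements — and then merely fluctuates around it. A
# return to "all 100 on the left" is allowed, just spectacularly unlikely.
print(counts[0], counts[-1])
```

Nothing in the hopping rule prefers the right half; the drift toward 50/50 happens simply because there are vastly more arrangements near the even split than near the sorted one. That is the second law in miniature.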
Consider a box containing the broken shards of a picture frame. We can shuffle around the broken pieces, and the box won’t look any different. According to our definition of entropy, the box has high entropy. If you spend a lifetime staring at it, you will not witness any changes to the broken wood, the shattered glass, or the torn picture inside. But give it an arbitrary number of lifetimes, and all the pieces will come back together to form a complete frame, all with that picture of your Golden Retriever inside. The probabilistic nature of the second law of thermodynamics assures us of this. No matter how low the probability is (and it is extremely low), entropy will decrease when given an arbitrary amount of time. Now, the frame will almost immediately shatter back into a higher-entropy state, but for a moment, entropy did decrease, and that is all we need moving forward.
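How low is “extremely low”? Here is a back-of-envelope sketch with invented numbers: treat the frame as 100 shards, and count “reassembled” as one particular ordering out of all the ways the shards could be arranged.

```python
import math

n_shards = 100

# "Reassembled" = one particular ordering out of all n! possible orderings
# (ignoring orientation and position within the box, to keep it simple).
w_total = math.factorial(n_shards)
p_reassemble = 1 / w_total

print(p_reassemble)  # ≈ 1.07e-158 per shuffle
# Shuffling once per second since the Big Bang (~4.4e17 seconds) barely
# dents these odds — yet given unlimited time, the reassembly must happen.
```

The per-shuffle probability is unimaginably small, but it is not zero, and that distinction is exactly what the rest of the argument leans on.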
For our final stop, let’s leave our surroundings for nothingness. There are no humans, planets, stars, or galaxies. Just empty space. Random particles are moving around, but there are no structures. Considering the number of rearrangements this scene can take, we can be sure that entropy is high. Now, we wait an arbitrary amount of time, during which the particles will rearrange themselves into low-entropy states an arbitrary number of times. While we wait, an elephant may pop up out of nowhere, or maybe your favourite pair of socks you lost years ago, or even the CN Tower! Again, we have an arbitrary amount of time, so anything that can happen will happen. In one of these instances, the particles come together to form the exact configuration of your brain, with all your memories and beliefs. This will be a Boltzmann brain. This will be you. It will believe what you believe, remember what you remember, feel what you feel, and think what you think. It will remember reading the previous sentence, it will remember opening this article, and it will remember agreeing with its origin story at the beginning of this article. So now it is time to ponder: are you real? Or are you just a spontaneous assembly of particles?