Abhishek Sharma

What is Entropy, and Why?

Updated: Aug 25, 2020


 

All of us who have been through some science education have heard this term: ENTROPY.

And I am sure most of us never quite got it, and even if we did, we probably never developed the intuition for why such a concept exists.

And the nerds among us have probably seen this concept popping up in all sorts of areas, from physics to computer science to information theory.

I am sure I don't need to motivate my target audience to dig deeper into this concept. That said, I will not give any definition of it; instead, I will try to show you how one arrives at such a concept through very intuitive, analogous examples.


And one more thing: I will not be providing any historical context either.

 

With that out of the way, let's start with a straightforward example.

Suppose you want to travel to a faraway city to meet a friend, and in your universe there is only one mode of travel, say teleportation (or train, whichever you prefer to imagine). Your friend will know exactly how you got there, because he/she knows there is only one option.


Now let's complicate things for your friend. Suppose there are two possible routes you can take to reach him/her (for example, two different trains that go to your friend's city). Now your friend is not sure anymore; he/she is split fifty-fifty between the two. And if there are more modes of travel, flight, road, boat, teleportation, etc., each with different routes, you can see how quickly your friend gets extremely confused and starts losing more and more information about your actual mode of travel.


Well, what is happening here?

From this example you will notice that the more options there are to choose from, the lower the probability of picking the correct one, or, in other words, the less information there is about the true path. Examples like this are very general in nature; they hold in every field and every sense.



Let's take a physics example now: a bunch of molecules in a closed container (a gas in a box), and you want to know the temperature of the gas/system. Temperature is essentially a number stamp that tells us how fast those molecules are moving on average (it is actually related to the average kinetic energy, but let's take the simpler assumption for the sake of argument). So the devil is in the details: a single average speed of the molecules can be achieved by any number of combinations of the individual speeds of the different molecules.


Let me be more specific. Consider three molecules whose average speed is one unit, where each molecule can only have a speed of 0, 1, 2 or 3 units. An average speed of one unit can be achieved in three unique ways: all three have a speed of one unit, so the average = (1+1+1)/3 = 1; one has a speed of 2, another a speed of 1 and the last a speed of 0, so the average = (2+1+0)/3 = 1; or one has a speed of 3 units and the other two have a speed of zero, so the average = (3+0+0)/3 = 1. Note that I am not counting (0,3,0) or (0,0,3) as combinations separate from (3,0,0), because gas molecules are indistinguishable from each other, so it doesn't matter who comes first and who comes last.
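If you want to double-check that counting, here is a minimal Python sketch (my own, not part of the original argument) that enumerates the unordered speed assignments:

```python
from itertools import combinations_with_replacement

# Enumerate the unordered speed assignments (molecules are
# indistinguishable) of 3 molecules with allowed speeds 0, 1, 2, 3
# whose average speed is exactly 1 unit.
allowed_speeds = [0, 1, 2, 3]
microstates = [combo
               for combo in combinations_with_replacement(allowed_speeds, 3)
               if sum(combo) / 3 == 1]

print(microstates)       # [(0, 0, 3), (0, 1, 2), (1, 1, 1)]
print(len(microstates))  # 3
```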



The point of the matter is that this is the same kind of situation your friend was in. We don't know which combination the molecules are in, because all of these combinations give the same result. And as you may have already noticed, the number of equally likely combinations increases if I increase the number of speed choices the molecules have, similar to the number of routes your friend could take. Now, this measure of missing information, or the measure of the number of combinations that represent the same outcome, is called ENTROPY. That's right: entropy is nothing but a measure of how many unique combinations your system can take to describe the same physical state.


In technical language, these unique combinations are called Microstates, and the physical state they represent (the average value in our case) is called the Macrostate of those Microstates. Now, I know the language is not perfect and rather confusing, but it is what it is.

Also, notice that if the average speed is zero, then there exists one and only one combination that achieves it: all molecules at zero speed, so there is only one microstate. Mathematically, entropy S is defined as S = k_B log(Ω), where Ω (Omega) is the number of microstates and k_B is a constant known as the Boltzmann constant.


So for a state of zero average speed we have only one microstate, and log(1) = 0, so that is a state of zero entropy.
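To make the formula concrete, here is a tiny Python sketch of S = k_B log(Ω), using the standard SI value of the Boltzmann constant:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def entropy(num_microstates):
    """Boltzmann entropy: S = k_B * log(Omega)."""
    return K_B * math.log(num_microstates)

print(entropy(1))  # 0.0 -- a single microstate means zero entropy
print(entropy(3))  # ~1.5e-23 J/K -- the three-microstate example above
```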

 

Now, that was the What part of the question; coming to the Why part. Why do we need such a notion? Why do we need to track how many ways a system can give the same physical output? Well, this is the tricky part, and not so obvious to understand.

All of us must have heard the claim that entropy always increases, more popularly known as The Second Law of Thermodynamics. But have you ever asked why? How the hell do these senseless molecules know that they have to switch to a state with a higher number of microstates in it? Well, the short answer is: they don't. It just happens, purely statistically.


It is like this: you are meeting a new Indian every second, and after a few hours you claim that Indians want to be brown for some reason, because you have met more brown-skinned people than red or blue or purple or white ones. You see the misconception there: it is not that Indians want to be brown; it is just that the population of brown-skinned people is the largest, so of course you will find most Indians brown rather than other shades. A similar situation holds here: any system is more likely to be found in a Macrostate with a larger number of Microstates, and most likely to be found in the state with the maximum possible number of Microstates, because that is statistically more probable.


Let me give you one more analogy. Consider a bag of 50 balls, of which 40 are red, 8 are green and 2 are blue, and you don't know this colour distribution. I ask you what colour the balls in the bag are, and you try to answer by picking ten balls at random, one at a time. What do you think you will answer, given the distribution I just described? Most of the time you will see red balls, evidently, because that colour has the largest number of options available.
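As a quick sanity check, here is a minimal Python sketch of the ball-drawing experiment (my own, drawing ten balls without replacement):

```python
import random
from collections import Counter

# The bag: 40 red, 8 green, 2 blue -- red has the most "options".
bag = ["red"] * 40 + ["green"] * 8 + ["blue"] * 2

draws = random.sample(bag, 10)  # pick ten balls at random, one at a time
print(Counter(draws))           # typically dominated by "red"
```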


Now, coming back to the gas in a box: the colour represents the Macrostate (temperature, pressure, etc.), and the number of balls of a particular colour represents the number of Microstates in that Macrostate. So if you measure the "colour" of that gas, most of the time you will find it in the colour with the maximum number of balls, i.e., in maximum entropy.


Fig. 1: Gas molecules changing Microstates.


But that still doesn't explain why entropy increases. For example, why doesn't a system that starts in a very low-entropy state just stay there? Well, our gas molecules are continuously moving and bumping into each other, and thus changing speeds, which means changing the distribution of speeds, which means changing the Macrostate. This is like drawing a new ball from the bag. As shown in figure 1, the cubes of different colours represent particular Macrostates, and their sizes indicate the number of Microstates they hold. The gas molecules keep randomly changing their Microstate (their distribution of speeds), which is like randomly drawing a differently coloured ball, and as you can see, red will come up more often than the others because there are more red balls. So when we measure the Macrostate of the system, we will find it in the red state more often than in any other.
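Here is a toy simulation of that picture (my own simplified model, not from the article): four molecules carry integer units of speed, and each "collision" moves one unit of speed from one randomly chosen molecule to another, conserving the total. We then count how often each overall shape of the speed distribution shows up:

```python
import random
from collections import Counter

random.seed(0)                    # reproducible run
speeds = [4, 0, 0, 0]             # start concentrated: one fast molecule
shape_counts = Counter()

for _ in range(100_000):
    i, j = random.sample(range(4), 2)  # pick two distinct molecules
    if speeds[i] > 0:                  # a "collision": pass one unit i -> j
        speeds[i] -= 1
        speeds[j] += 1
    shape_counts[tuple(sorted(speeds, reverse=True))] += 1

for shape, count in shape_counts.most_common():
    print(shape, count)
# Each shape shows up roughly in proportion to the number of distinct
# arrangements (Microstates) it has: spread-out shapes like (2, 1, 1, 0)
# and (3, 1, 0, 0) dominate, while (4, 0, 0, 0) shows up far less often.
```

Notice that the concentrated starting shape doesn't matter in the long run; the counts settle in proportion to each shape's number of Microstates.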


Now, this does not mean the system can't attain the yellow colour, the one with the fewest Microstates; of course it can, but it is a game of numbers. In real life, a gas has billions and billions of molecules, which can arrange themselves in trillions of ways, and the difference between the number of Microstates of the most probable Macrostate and that of the second most probable one is so massive that you rarely ever see the system in a Macrostate that is not the most probable one, forget about the least probable one.


So as time goes on, the system will most likely be found in the maximum-entropy state, that is, the state with the maximum number of Microstates. We call it the equilibrium state.

That is also the reason heat always flows towards the absence of it: that is the most probable state for the system. I know that's not obvious from the discussion above. Think of it like this: when the system is in a state where the heat is concentrated in a very few molecules, i.e., only a few molecules are moving very fast and the rest are very slow, that Macrostate can be achieved by a relatively small number of combinations of molecules compared to the Macrostate in which the heat is distributed more uniformly among them, so the system quickly ends up in the more uniform Macrostate as time goes on.
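To put rough numbers on that, here is a small sketch comparing the two kinds of Macrostate. Note one assumption: unlike the earlier counting example, I label the molecules here so that each ordering counts separately; it is the relative sizes that matter:

```python
from itertools import permutations

def num_arrangements(speed_profile):
    """Number of distinct ways to hand this speed profile
    to individually labelled molecules."""
    return len(set(permutations(speed_profile)))

concentrated = (4, 0, 0, 0)  # all the energy in one fast molecule
spread       = (2, 1, 1, 0)  # the same total energy shared around

print(num_arrangements(concentrated))  # 4
print(num_arrangements(spread))        # 12 -- three times as many ways
```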


Explaining heat flow as a consequence of statistics requires a much deeper discussion, so that will be the next episode of this entropy series; stay tuned for that.


That is why we need this concept to really understand the universe around us: the whole universe is also a system that is continuously changing its Macrostate, and the Macrostate with the maximum number of Microstates is, statistically speaking, inevitable, so the universe is dying towards a state of maximum entropy, and there is no return from that.


I think that is enough for this article. I have just scratched the surface of the topic, and believe it or not, it is deeply connected to the notion of time and of time travel to the past. But that's a topic for another day. I hope you have made it this far, and that I have increased your interest in how you see the world around you.


 


