The Arrow of Time: Entropy and Irreversibility
by Michelle Ng
Source: https://www.flickr.com/photos/timaz/4430250834/
Why does a bowl of hot soup cool down when you leave it out at room temperature? Why doesn’t the room heat up the soup? At first glance this question is simple. ‘I have common sense,’ you think, ‘it’s because the room is much colder than the soup; surely the room can’t heat up the soup.’
Next question. Why does time flow forwards?
These two questions may seem to be entirely unrelated: how is an aromatic, delicious dish in any way connected to an abstract concept that many people struggle to define? However, on closer inspection we will see that there is an intrinsic relationship between the two—these two questions are tied together by a concept called entropy. The room being colder than the soup may seem like an easy answer, but it is not the full story: why does energy always flow from a high to a low temperature in the first place? To take this a step further, why is this process irreversible?
This article explores the idea of irreversibility and how this can be explained by entropy.
ENTROPY
Entropy is often said to be ‘a measure of disorder’, and the second law of thermodynamics is often quoted as ‘in a spontaneous process, the entropy of an isolated system always increases’, meaning that processes in the universe always turn order into disorder if we don’t do anything to intervene. [1] A bowl of soup is at a much higher temperature than the room; since temperature is directly proportional to the average kinetic energy of molecules, the molecules in the soup have a higher average kinetic energy than the air molecules in the room. As the soup cools down, the molecules of the soup collide with the molecules of the air, causing the soup molecules to transfer energy to the air. Since the air molecules gain energy, they move faster and more ‘messily’—the air becomes more ‘disordered’. In contrast, as soup molecules lose energy, they move slower and the soup becomes more ordered. There are many more air molecules than soup molecules, so the increase in disorder of the air outweighs the decrease in disorder of the soup. This is in agreement with the second law of thermodynamics, so the soup cools down instead of the other way round.
The requirement of increasing disorder explains myriad phenomena: chemical reactions, the diffusion of fluids, thermal conduction. Processes and events occur in an order such that entropy increases: time flows in the direction of increasing entropy. This is why we perceive time to be a one-way street, and why we cannot retrieve the past. If time were to run backwards—if processes started reversing themselves and the soup started being heated up by the room—entropy would decrease, which is not allowed.
However, the mere statement that ‘disorder always increases’ does not illustrate how the universe knows what is ‘messy’ and ‘disordered’; just because the air particles move faster doesn’t mean the universe will know that they are ‘messier’. ‘Messy’ seems like a term that is subjective to us humans; how can this idea be used as a law of physics?
Moreover, this statement does not explain why ordered systems tend towards disorder over time in the first place—what is the fundamental reason behind increasing entropy? The simple answer to this is that a disordered system is more probable than an ordered system. [1] To explain this, we can consider the idea of thermodynamic macrostates and microstates.
THE ENERGY IN A SINGLE BODY
A macrostate is what we can observe about what is happening to a system and its energy. This can be described by a set of macroscopic properties such as temperature, pressure and volume. [2] For instance, initially, a bowl of hot soup is at a high temperature and has a large amount of energy: this is the macrostate of the soup. The air’s macrostate, however, is that it is at room temperature. After some time, the bowl of soup cools down and will be in a lower energy macrostate. (And to a very minuscule extent, the air heats up and has a higher energy macrostate.)
On the other hand, a microstate refers to the specific details about the behaviour of all the atoms of the system and describes how the energy of the system is distributed between the atoms. [2] There are often multiple microstates that correspond to the same macrostate because there are many ways to distribute energy amongst a number of atoms. Such a set of microstates is known as an ensemble; microstates in the same ensemble only differ on an atomic level. [3]
This idea is only possible because of quantum physics, which introduced the theory that the energy of a system is made up of discrete ‘packets’ of energy (called quanta) that can be thought of as particles. Because of this, energy is no longer viewed as a continuum with infinite possible values (which would lead to infinite microstates). Instead, atoms can only have an integer number of energy quanta, so there is only a finite number of unique microstates corresponding to one macrostate.
This may seem abstract, but an example can be considered: a system has 3 quanta of energy in total which are shared between 2 atoms, atom A and atom B. Overall, the fact that the system has 3 quanta in total does not change: this is the ‘macrostate’ of the system. But how many ways are there to distribute the energy? There are 4 ways to do so (see Figure 1). Each of these ways is a ‘microstate’; all the microstates correspond to the same macrostate because no matter the distribution, the total energy remains the same.
Figure 1: A table illustrating the possible ways to distribute 3 quanta of energy between 2 atoms in a system.
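The microstates in Figure 1 can also be enumerated by brute force. Below is a minimal Python sketch (illustrative, not from the original article; the function name is made up):

```python
from itertools import product

def microstates(total_quanta, n_atoms):
    """List every way to distribute `total_quanta` indivisible quanta
    among `n_atoms` distinguishable atoms."""
    return [split for split in product(range(total_quanta + 1), repeat=n_atoms)
            if sum(split) == total_quanta]

# The macrostate: 3 quanta shared between atom A and atom B.
states = microstates(3, 2)
print(states)       # [(0, 3), (1, 2), (2, 1), (3, 0)]
print(len(states))  # 4 microstates, as in Figure 1
```

Each tuple is one microstate: the same total energy, distributed differently between the atoms.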
The fundamental assumption of statistical mechanics states that over time, the probability of each microstate for a system in a given macrostate is equal [4]: there is an equal chance of finding the 2-atom system in any of its possible microstates listed above at a given time (see the last column of Figure 1).
As the total quanta of energy in a single system increases, the number of possible microstates for the system increases. (Consider distributing 4 quanta of energy between 2 atoms: you can have (4,0), (3,1), (2,2), (1,3), (0,4) = 5 microstates, which is greater than before.) Similarly, if the system consists of more atoms between which you can distribute energy, the number of microstates also increases. (Consider distributing 3 quanta between 3 atoms instead of 2: (3,0,0), (1,2,0), (1,0,2), (1,1,1), (2,1,0)... etc. The number of possible microstates is greater than before.) The number of possible microstates for a given macrostate is called the multiplicity, Ω, of the macrostate. [5] Using the theory of permutations and combinations, this relationship can be described mathematically: [3]
Ω = (q + N − 1)! / (q!(N − 1)!)

where
q = the total number of quanta of energy in the system
N = the number of atoms in the system.
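This multiplicity formula (the standard ‘stars and bars’ counting result) can be sanity-checked against brute-force enumeration. The sketch below is illustrative, not from the article:

```python
from math import comb
from itertools import product

def multiplicity(q, n_atoms):
    # Ω = (q + N − 1)! / (q!(N − 1)!), i.e. the binomial C(q + N − 1, q)
    return comb(q + n_atoms - 1, q)

def brute_force(q, n_atoms):
    # Count the distributions directly by enumerating them
    return sum(1 for split in product(range(q + 1), repeat=n_atoms)
               if sum(split) == q)

print(multiplicity(3, 2))  # 4, matching Figure 1
print(multiplicity(4, 2))  # 5: (4,0), (3,1), (2,2), (1,3), (0,4)
print(multiplicity(3, 3))  # 10
assert all(multiplicity(q, n) == brute_force(q, n)
           for q in range(7) for n in range(1, 4))
```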
ENERGY TRANSFER BETWEEN 2 INTERACTING BODIES
When 2 systems are in contact with each other, collisions occur between the molecules of the 2 systems, transferring energy between them. These energy transfers happen randomly; however, probability can be used to predict the most likely macrostates reached by the 2 systems. It turns out that one particular macrostate is much more probable than all the others. [6]
To illustrate this, an arbitrary example can be used again; this time, there are 2 systems, each with 2 atoms. Each system has a different starting number of energy quanta: for instance, system 1 starts with 2 quanta and system 2 starts with 1 quantum, so the two systems have 3 quanta altogether. The two systems can give or take quanta from each other at random. After some time, what is the most probable split of quanta between the 2 systems?
In this case, we are effectively arranging a total of 3 (2+1) quanta between the 2 systems. This time, there are multiple ways to separate the quanta between the two systems—there are multiple macrostates (describing how energy is distributed between the systems) that give a total of 3 (see Figure 2). However, within each individual system, there are also many possible ways to arrange the energy between the atoms inside it—each macrostate has many microstates (describing how energy is distributed between the atoms within each system).
Furthermore, as shown in the previous section, the number of possible microstates for a system in a certain macrostate depends on both the total number of quanta the system has in that macrostate and the number of atoms in the system. Therefore, each possible macrostate has a different number of possible microstates corresponding to it. Since every individual microstate has an equal chance of occurring (based on the fundamental assumption of statistical mechanics),

the probability of a given macrostate = the number of possible microstates × the probability of a single microstate (which is the same for all microstates).
This means that ultimately, some macrostates are more probable than others (see Figure 2):
Figure 2. A table illustrating the possible ways to distribute 3 quanta of energy between 2 systems, each with 2 atoms.
As you can see, it is more likely for there to be a 1/2 split between the systems (P = 12/20) than a 3/0 split (P = 8/20)—it is more likely for the quanta to be distributed evenly between the 2 systems than for all the energy to be in one object and none in the other.
In general, when 2 systems come into contact, energy transfer occurs randomly. Over time, the 2 systems will be found in the most likely macrostate, which is the one with the greatest multiplicity, i.e. where Ω1 × Ω2 is at a maximum. This maximum point can be found mathematically.
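This counting can be sketched in a few lines of Python (illustrative; the names are made up), here for 3 quanta shared between two systems of 2 atoms each:

```python
from math import comb

def multiplicity(q, n):
    # Ways to share q quanta among n atoms: C(q + n - 1, q)
    return comb(q + n - 1, q)

total_q, n1, n2 = 3, 2, 2
# Multiplicity of each macrostate: q1 quanta in system 1, the rest in system 2
omegas = {q1: multiplicity(q1, n1) * multiplicity(total_q - q1, n2)
          for q1 in range(total_q + 1)}
total = sum(omegas.values())   # 20 equally likely microstates overall
probs = {q1: w / total for q1, w in omegas.items()}
print(omegas)                  # {0: 4, 1: 6, 2: 6, 3: 4}
print(probs[1] + probs[2])     # P(1/2 split) = 12/20
print(probs[0] + probs[3])     # P(3/0 split) = 8/20
```

Since every microstate is equally likely, a macrostate’s probability is just its multiplicity divided by the total count of microstates.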
In these examples, the systems are very small, with only 2 atoms each and 3 quanta in total, so the difference in probability between macrostates is not very large (12/20 compared to 8/20). In reality, objects are made up of many atoms sharing many more energy quanta, and the disparity between the different macrostates’ probabilities is much greater (see Figure 3). One macrostate is significantly more likely than the others; even though the probabilities of the other macrostates are non-zero, their values are extremely small. That is not to say these macrostates would never be seen, but if they were observed, it would be very rare and only for a very small instant of time [1] before the system moves to a more probable state; over a long period of time, we would mostly see the system in the most probable macrostate.
Figure 3. A graph showing the probability distribution for the different macrostates of an object in contact with another object (with equal numbers of atoms), with the fraction of total quanta in one object on the x-axis (the macrostate of the object) and the corresponding probability on the y-axis. The highest probability occurs where the fraction of quanta in one of the objects is 0.5, i.e. the energy is equally distributed between the 2 objects. As the number of total quanta (labelled N on the graph) increases, the peak becomes narrower, i.e. that macrostate becomes far more probable than the others. Strictly, the graph should be a series of dots, because energy quanta take discrete values, but because the number of quanta is so great in reality, the discrete distribution looks continuous when presented graphically.
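The narrowing of the peak can be seen numerically. The sketch below (illustrative; the 10% window is an arbitrary choice) computes how much of the probability sits near the even split as the systems grow:

```python
from math import comb

def multiplicity(q, n):
    return comb(q + n - 1, q)

def macrostate_probs(total_q, n):
    """Probability of each macrostate (q1 quanta in system 1) when total_q
    quanta are shared between two systems of n atoms each."""
    weights = [multiplicity(q1, n) * multiplicity(total_q - q1, n)
               for q1 in range(total_q + 1)]
    total = sum(weights)
    return [w / total for w in weights]

# Probability that system 1 holds within 10% of half the quanta:
for total_q, n in [(6, 2), (60, 20), (600, 200)]:
    p = macrostate_probs(total_q, n)
    near_half = sum(p[q1] for q1 in range(total_q + 1)
                    if abs(q1 / total_q - 0.5) <= 0.1)
    print(f"q = {total_q}, N = {n} per system: P(near-even split) = {near_half:.3f}")
```

As the numbers of quanta and atoms grow, the printed probability climbs towards 1: almost all of the probability concentrates in the macrostates near the even split.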
A MICROSCOPIC DEFINITION OF ENTROPY
What does any of this have to do with entropy? The entropy of a system is directly linked to the number of possible microstates corresponding to the system’s given macrostate. Entropy is formally defined as [7]

S = kB ln Ω

where
S = the entropy of the system
kB = the Boltzmann constant
Ω = the number of possible microstates corresponding to a given macrostate.

The greater the number of possible microstates for a given macrostate, the greater the entropy of a system in that macrostate.
The natural logarithm of the multiplicity of a system, ln Ω, is used in the definition of entropy rather than the multiplicity itself. This is partly because the multiplicity can range from extremely small to extremely large numbers, and taking the logarithm makes these numbers easier to handle. [1] Logarithms also have a property that allows the entropies of separate systems to add together to give the total entropy when the 2 systems are considered together, which makes calculations more convenient: [4]

Stotal = kB ln(Ω1 × Ω2) = kB ln Ω1 + kB ln Ω2 = S1 + S2
Hence when Ω1 × Ω2 is at a maximum, the sum of the entropies of the 2 systems is at a maximum. If the 2 systems are (1) an object and (2) the rest of the universe, this is when the total entropy of the universe is at a maximum. This means that energy transfer occurs so as to maximise the total entropy—which is why the total entropy increases spontaneously without intervention.
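A quick numerical check of this additivity (an illustrative sketch; the system sizes are arbitrary):

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(q, n):
    return comb(q + n - 1, q)

def entropy(q, n):
    # S = kB ln Ω
    return K_B * log(multiplicity(q, n))

q1, q2, n = 40, 20, 10
s_joint = K_B * log(multiplicity(q1, n) * multiplicity(q2, n))  # kB ln(Ω1 Ω2)
s_sum = entropy(q1, n) + entropy(q2, n)                         # S1 + S2
print(abs(s_joint - s_sum) < 1e-30)  # True: ln(Ω1 Ω2) = ln Ω1 + ln Ω2
```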
The state that maximises total entropy depends on the number of atoms in each system and can be found mathematically. In general, however, rather than having all the quanta concentrated in one region, energy will tend to ‘spread out’, but in such a way that more quanta eventually end up in the body with more atoms, because this arrangement maximises the multiplicity of the 2 systems combined (see Figure 4).
Figure 4. A graph showing how the entropy of 2 interacting systems changes as quanta move between them, with entropy (S = kB ln Ω) on the y-axis and the number of quanta in System 1 on the x-axis. In this case, System 2 (green) has a greater number of atoms, as shown by its greater maximum entropy. The dark blue line represents the total entropy and is equal to the sum of the values of the light blue and green lines; it has a maximum value where its gradient is 0. Here the maximum occurs where system 1 has fewer quanta than system 2, because system 2 has more atoms.
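The maximum described in Figure 4 can be located numerically. In this sketch (illustrative; entropy is measured in units of kB to avoid tiny numbers), system 2 has more atoms, and the entropy-maximising split gives each system quanta roughly in proportion to its number of atoms:

```python
from math import comb, log

def multiplicity(q, n):
    return comb(q + n - 1, q)

def total_entropy(q1, total_q, n1, n2):
    """Total entropy, in units of kB, when q1 quanta sit in system 1
    and the remaining total_q - q1 sit in system 2."""
    return log(multiplicity(q1, n1)) + log(multiplicity(total_q - q1, n2))

total_q, n1, n2 = 100, 30, 70  # system 2 has more atoms
best_q1 = max(range(total_q + 1),
              key=lambda q1: total_entropy(q1, total_q, n1, n2))
print(best_q1)  # close to 30, i.e. roughly total_q * n1 / (n1 + n2)
```

The larger system ends up holding more of the energy at equilibrium, exactly as the green curve in Figure 4 suggests.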
ENTROPY AND IRREVERSIBILITY
Applying this newfound concept to the soup scenario, the reason the hot soup transfers energy to the cold room is that the air has a far greater number of atoms. If some of the soup’s quanta of energy are transferred and ‘spread out’ to the room, then although the soup’s own multiplicity decreases, the air’s multiplicity increases far more, outweighing the decrease in the soup’s entropy. This situation has the greatest multiplicity, so it is by far the most probable outcome.
The same reasoning applies to spontaneous processes like diffusion, which can only occur one way. Diffusion is the net movement of molecules down a concentration gradient; after the molecules have spread out to fill the container, it is very unlikely for them to suddenly congregate back together again. The internal energy of a gas is very high and its molecules have high velocities—in such a situation, the case where the molecules are spread out in the container has a far greater multiplicity than the case where all the molecules are packed into one region. [1]
Using the reverse argument, if natural processes were to happen in reverse order, energy transfer would need to occur in a way such that entropy decreases, which is astronomically unlikely. A bowl of cooled-down soup cannot suddenly be reheated by the room because this would require the total entropy to decrease. Hence, this process is irreversible, and time—which is how we perceive the order of events—can only flow forwards.
WHY DISORDER?
If entropy is based on multiplicity and probability at the most fundamental level, where did the idea of ‘disorder’ originate? Why do we use ‘order to disorder’ to simplify the concept? In fact, correlating entropy with disorder is not the most accurate way to define it:
Firstly, increasing entropy does not always mean going from order to disorder; there are some situations where this is not the case. When water boils to form steam, the entropy of the system increases as the energy of the molecules increases; however, one could argue that the system goes from order (pure water) to disorder (a ‘messy’ mixture of water and steam) and then back to order (pure steam). [8] This points to a second reason why ‘disorder’ is not an accurate definition: ‘disorder’ is ultimately a subjective concept. One could say a mixture of water and steam is more disordered than pure water or steam because it is a mixture; one could also argue that pure steam is more disordered because its molecules are moving faster and more randomly. Hence ‘disorder’ is a relative term.
Perhaps the reason why the scientific community has made the association between entropy and disorder is because of a historical factor: the kinetic theory of gases and the theory of entropy were developed at around the same time. [8] During this time, it was found that gases have high energies and therefore high entropy (because there are many more ways to distribute a large amount of energy). In the kinetic theory of gases, gas molecules are also modelled to be in ‘rapid irregular motion’—a concept that, to the human mind, has connotations of disorder. Therefore the link between irregularity, disorder and high energy was made. Nowadays, in general, ‘order to disorder’ is still seen as an easier way to visualise many processes, so this association is still prevalent today.
In Figure 5 below, one could argue that the gas diffuses because the molecules are going from an orderly arrangement to a disorderly one, but a more accurate explanation is that the configuration on the right has a much lower probability of occurring than the one on the left.
IS TIME AN ILLUSION?
Even though we have considered how entropy is related to time, the nature of time remains a mystery because different theories have conflicting implications about what time really is. On one hand, the conclusion reached using this multiplicity approach is that time’s arrow points in the direction of increasing entropy. In this sense, time truly does flow one way.
On the other hand, special relativity established that time is relative and that there is no common present shared by different observers; the theory of relativity also unified space and time into a 4-dimensional space-time in which only the speed of light is the same for all observers. [9] This gave way to the theory of a ‘block universe’, in which the events of the past, present and future all already exist in space-time. Since we are living in space-time, we cannot see this 4-dimensional universe; we cannot see all these events simultaneously. This is akin to an ant on a sheet of paper amongst many other ants: living on a 2-dimensional surface, it cannot see the other ants, whereas a 3-dimensional human can see all the ants on the paper at once. [9] Physicists who believe in the block universe think that time is an illusion. In this theory, time does not flow, and we only ‘see’ time flow one way because we are living in space-time itself—we can only see a subset of all the events in the universe.
Moreover, another interpretation of quantum physics predicts something different. Entities are neither waves nor particles but can behave like waves or particles in different circumstances. [10] According to Heisenberg’s uncertainty principle, the position and momentum of an entity (known as conjugate variables) cannot both be known accurately at the same time; the same applies to energy and time. Instead, the state of an entity is described probabilistically by its wave function. The Copenhagen interpretation is that, without observing an entity, it is impossible to make any statement about what state the entity is in. The act of observing or interacting with something causes its wave function to collapse and ‘forces’ the entity to choose a state, producing what we observe. [10] This has led some physicists to believe that the microstates and macrostates of a system arise from our interactions with it, and so entropy is relative to us—the flow of time and the idea of increasing entropy come from us and are not absolute properties of the universe. [11]
To this day, the reason for the flow of time remains an unanswered question, and we do not know why we have memories of the past but not the future. Answering this question requires finding a way to resolve the contradictions between quantum mechanics and general relativity, currently our two most significant theories. This is why we need to study physics!
BIBLIOGRAPHY
[1] "Entropy". Hyperphysics.Phy-Astr.Gsu.Edu, 2021, http://hyperphysics.phy-astr.gsu.edu/hbase/Therm/entrop.html.
[2] “3.1 Microstates and Macrostates”. Department of Physics and Astronomy - The University of Manchester, 2021, https://theory.physics.manchester.ac.uk/~judith/stat_therm/node55.html
[3] Sekerka, R. F. Thermal Physics. 2015.
[4] Chabay, Ruth W. Matter and Interactions. John Wiley, 2017.
[5] Physics.Usu.Edu, 2015, http://www.physics.usu.edu/torre/3700_Spring_2015/Lectures/03.pdf.
[6] “3.1 Microstates and Macrostates”. Department of Physics and Astronomy - The University of Manchester, 2021, https://theory.physics.manchester.ac.uk/~xian/thermal/chap3.pdf
[7] "Isaac Physics". Isaac Physics, https://isaacphysics.org/concepts/cc_entropy.
[8] “Entropy as Disorder: History of a Misconception”. AAPT Physics Education, https://aapt.scitation.org/doi/10.1119/1.5126822
[9] “What Is A Block Universe?". Plus.Maths.Org, 2021, https://plus.maths.org/content/what-block-time.
[10] Gribbin, John. In Search Of Schrödinger's Cat. 1984.
[11] Rovelli, Carlo et al. The Order Of Time. 2017.