Time is a topic not often discussed in pre-university physics, yet it is intrinsic to many of the deeper elements of the field. Moreover, by exploring time, we can expand our understanding of entropy as we try to answer questions such as: Where does the delineation between the past and the future reside? Is this nexus between what was and what can be the present? What distinguishes movement through time?
Firstly, what is time? Time is a seemingly elusive concept. Classically, time is thought of as absolute: all processes unfold over the same duration for everyone, everywhere, and time is treated as a background parameter external to the system itself. However, Einstein’s theory of relativity reveals that time is not a separate parameter; space and time are not independent but interwoven into a single fabric called space-time. Like the axes of a graph, space and time are perpendicular dimensions, together forming the 4D structure of the universe that contains all matter and energy across all time.
Around objects with mass, from stars to people, space-time curves, much as the surface of water deforms around an object resting on it: nearby flow lines bend and converge toward the object. This warping of space-time creates curved geodesics, paths that are locally straight within a curved geometry. Just as lines of longitude on a globe converge at the poles yet appear parallel when projected onto a flat map, paths through curved space-time can look bent from a flat perspective, and the curvature governs how objects move. The closer you are to a massive object, like the Earth, the more space-time is warped. This distortion means the duration of events passes at different rates depending on your position: clocks at higher altitude run slightly faster than clocks at lower altitude. Although the difference is tiny, it is measurable with precise atomic clocks.
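The altitude effect can be estimated with the standard weak-field approximation, in which the fractional rate difference between two clocks separated by a height h near the Earth's surface is roughly gh/c². A minimal sketch (the function name and the 400 m tower height are illustrative choices, not from the text):

```python
# Weak-field gravitational time dilation near the Earth's surface:
# a clock raised by height h runs faster by a fraction of about g*h / c^2.

G_ACCEL = 9.81         # m/s^2, surface gravity (assumed constant over h)
C = 299_792_458.0      # m/s, speed of light

def fractional_rate_difference(height_m: float) -> float:
    """Approximate fractional rate gain of a clock height_m above another."""
    return G_ACCEL * height_m / C**2

# Illustrative case: a clock at the top of a 400 m tower vs. ground level.
per_second = fractional_rate_difference(400.0)          # ~4.4e-14
gain_per_year = per_second * 365.25 * 24 * 3600         # ~1.4 microseconds
```

A difference of order 10⁻¹⁴ per second is far below everyday perception, yet well within the resolution of modern atomic clocks, which is why the effect is directly measurable.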
Thus, the duration of time is neither universal nor independent; it varies with your location. This challenges the notion of a single, shared present. Yet time still seems to flow as we travel a one-way street from past to future. The physics of the universe, however, does not depend on this flow, which raises the question of why we perceive such a pervasive illusion. In most cases, the physical laws describing the universe do not distinguish between past and future; the exception arises when heat is involved. The thermodynamics of the universe is what singles out the arrow of time.
This brings us to the concept of entropy. We will explore entropy from the perspective of statistical mechanics. In classical thermodynamics, the microscopic details of a system are not considered. Instead, the behaviour of a system is described in terms of macroscopic thermodynamic variables such as temperature and pressure. In statistical mechanics, properties are defined by the statistics of the motions of the microscopic constituents of a system. Entropy quantifies the number of microscopic configurations or microstates, Ω, consistent with the macroscopic quantities that characterize the system.
Imagine a messy pile of cards compared to three organized stacks. We would say that there are more possible arrangements or states for the cards in the messy pile compared to the organized stacks. As there are more possible states for the messy pile, it has higher entropy. Entropy can be seen as a measure of the uncertainty or disorder of a system that cannot be determined from a macroscopic view. For a given set of macroscopic variables, entropy measures the degree to which the probability of the system is spread over different possible microstates.
Mathematically, entropy is a logarithmic measure of the number of states that have a significant probability of realizing a given macrostate. Equivalently, it is proportional to the negative expected value of the logarithm of the probability pᵢ that the system is in microstate i:

S = −kB Σᵢ pᵢ ln pᵢ
Under the fundamental assumption that every microstate is equally probable (i.e. pᵢ = 1/Ω, where Ω is the number of microstates), the previous equation reduces to the natural logarithm of the number of microstates, multiplied by the Boltzmann constant kB:
S = kB ln Ω
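The reduction from the general formula to Boltzmann's can be checked numerically. As an illustrative sketch (the ten-coin macrostate is my own example, not from the text), take the macrostate "five heads out of ten coins": its microstates are the ways of choosing which coins show heads, and with equal a priori probabilities the Gibbs sum reduces to Boltzmann's expression:

```python
import math

K_B = 1.380649e-23  # J/K, Boltzmann constant

def boltzmann_entropy(omega: int) -> float:
    """S = k_B ln(Omega), for Omega equally likely microstates."""
    return K_B * math.log(omega)

def gibbs_entropy(probs) -> float:
    """S = -k_B sum(p ln p), the general statistical form."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Macrostate: "5 heads out of 10 coins". Its microstates are the
# distinct choices of which 5 coins show heads.
omega = math.comb(10, 5)            # 252 microstates
uniform = [1.0 / omega] * omega     # equal a priori probabilities

# With uniform probabilities the Gibbs formula equals Boltzmann's:
assert math.isclose(gibbs_entropy(uniform), boltzmann_entropy(omega))
```

The same counting logic underlies the card-pile picture above: the "messy" macrostate is compatible with vastly more arrangements than the sorted one, so its Ω, and hence its entropy, is larger.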
How does entropy, as a measure of the number of microstates a system could occupy, define the direction of time? Only processes that change the entropy of a system single out a distinct direction in time.
Imagine a swing oscillating without losing energy, or a ball bouncing perfectly elastically. Played in reverse, these motions would look exactly the same. It is only when energy is dissipated, the swing losing height or the ball bouncing lower, that the motion becomes irreversible. This dissipation increases the overall entropy of the system, and that increase gives rise to our perception of time as a movement from order to disorder. The second law of thermodynamics states that the entropy of a closed system, such as the universe as a whole, tends to increase. The only distinction between the past and the future is that the universe was in a state of lower entropy in the past.
However, as stated, entropy counts how many microstates correspond to a given macrostate, yet every individual configuration is unique, so the count depends on which macroscopic variables we choose to track. The increase in disorder is therefore only apparent from a limited, coarse-grained perspective. The movement of heat and the transition from lower entropy in the past are noticeable only from this incomplete view. If you could track every detail of the microscopic state of the universe, the apparent flow of time would disappear, as both the ‘past’ and the ‘future’ could be determined from the present state.
In conclusion, “there is no spoon.” Time has lost its familiar aspects, but it remains part of our human experience. Like the distinction between up and down, time is an illusion in the sense that it has no intrinsic meaning in outer space. I will leave you with a quote from Einstein on the death of his closest friend: "Now he has departed from this strange world a little ahead of me. That signifies nothing. For those of us who believe in physics, the distinction between past, present, and future is only a stubbornly persistent illusion."