Prologue to Entropy
Physicist Arnold Sommerfeld (1868–1951), one of the founding fathers of atomic theory, said about thermodynamics, the branch of physics that incorporates the concept of entropy:
Thermodynamics is a funny subject. The first time you go through it, you don’t understand it at all. The second time you go through it, you think you understand it, except for one or two small points. The third time you go through it, you know you don’t understand it, but by that time you are so used to it, it doesn’t bother you anymore.
This is true of many more things than we would like to imagine, but it is especially true of our subject matter, namely entropy.
Entropy is a physical quantity, yet it is different from any other quantity in nature. It is definite only for systems in a state of equilibrium, and it tends to increase: in fact, entropy’s tendency to increase is the source of all change in our universe.
Since the concept of entropy was first discovered during a study of the efficiency of heat engines, the law that defines entropy and its properties is called the second law of thermodynamics. (This law, as its name implies, is part of the science called thermodynamics, which deals with energy flows.) Despite its nondescript name, however, the second law is actually something of a super-law, governing nature as it does.
It is generally accepted that thermodynamics has four laws, designated by numbers, though the numbering reflects neither their relative importance nor the order of their discovery.
- The first law, also known as the law of conservation of energy, states that the energy in an isolated system remains constant.
- The second law is the one with which this book is concerned, and we shall soon go back to discuss it.
- The third law states that at approximately -273°C (absolute zero), an object’s entropy reaches its minimum value (and that it is impossible to cool anything to that temperature, much less below it).
- There is another law, called the zeroth law, which deals with the meaning of thermal equilibrium between bodies.
The second law of thermodynamics concerns the inherent uncertainty in nature, which is independent of forces. Because the specific nature of particles and of forces is not intrinsic to the second law, any calculation based on this law in one area of physics will apply to other areas as well.
The main quantity behind the second law of thermodynamics is entropy – which is a measure of uncertainty. The forces deployed by entropy are not as strictly determined as those described by other laws of nature. They are expressed as a tendency of changes to occur, in a manner somewhat analogous to human will. In order for entropy to increase, energy must flow; and any flow of energy that leads to change increases entropy. This means that any action in nature – whether “natural” or man-made – increases entropy. In other words, entropy’s tendency to increase is the tendency of nature, including us humans, to make energy flow.
The importance of entropy is immeasurable, yet it is a safe bet that the average educated person, who is no doubt familiar with such terms as relativity, gravity, evolution and other scientific concepts, may never have heard of it, or else misunderstands it. Even those familiar with the concept of entropy admit that while they may understand its mathematical properties, they cannot always comprehend its meaning. So what, then, is entropy? We hope that after reading this book you will understand what it is, why it is significant, and how it is reflected in everything around us.
This book has two main parts:
The first part deals with the historical development of the second law of thermodynamics, from its beginning in the early 19th century to the first decades of the 20th. Here, the physical aspects of the concept of entropy will be discussed, along with the energy distributions that arise due to entropy’s propensity to increase.
The second part of this book deals with the effects of entropy in communications, computers and logic (studied since the beginning of the 1940s), along with the influence entropy has had on various social phenomena.
The first part is concerned with thermodynamics, which is, in essence, the study of energy flows. Every physical entity, whatever its nature, ultimately involves energy. We consume energy in order to live and we release energy when we die. Our thoughts are energy flowing through our neurons. We put in energy to communicate and consume energy to keep warm. Since energy cannot be created (or destroyed), it has to come, or flow, from elsewhere. Why does energy flow, and what principles guide it? The reason is this: there is a theoretical quantity that is not directly measurable, called entropy, and it tends to increase. In order for it to increase, energy must flow.
The first person who quantified this flow of energy, in 1824, was a young French aristocrat and engineer called Sadi Carnot. He understood that in order to obtain work – for example, to apply a force in order to move an object from one place to another in space – heat must flow from a hot area to a colder one. He calculated the maximum amount of work that can be obtained by transferring a given amount of energy between objects having two different temperatures, and thus laid the foundations of thermodynamics.
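Stated in modern notation (Carnot’s own formulation predated the absolute temperature scale), the bound he found is that an engine drawing an amount of heat $Q_{\text{hot}}$ from a reservoir at absolute temperature $T_{\text{hot}}$, and discarding the rest at $T_{\text{cold}}$, can deliver at most

$$ W_{\max} = Q_{\text{hot}}\left(1 - \frac{T_{\text{cold}}}{T_{\text{hot}}}\right). $$

No engine operating between those two temperatures, however ingenious, can do better.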
About forty years later, in 1865, physicist Rudolf Clausius, a fanatic Prussian nationalist, formulated the laws of thermodynamics and defined a mysterious quantity that he named “entropy.” Entropy, as Clausius defined it, is the ratio between the heat transferred to a body and the body’s temperature.
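In modern notation, and for heat absorbed gently (reversibly) at a fixed absolute temperature, Clausius’s definition reads

$$ \Delta S = \frac{Q}{T}, $$

where $\Delta S$ is the change in the body’s entropy, $Q$ the heat it absorbs, and $T$ its absolute temperature.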
About twelve years later, in 1877, an eccentric Austrian, Ludwig Boltzmann, and an unassuming American, Josiah Willard Gibbs, derived an equation for entropy – concurrently but independently. They described entropy as the lack of information associated with a statistical system. At that point, the second law obtained some unexpected significance: it was now understood that uncertainty in nature has a tendency to increase.
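The equation itself, in the form later engraved on Boltzmann’s tombstone, is

$$ S = k \log W, $$

where $W$ is the number of microscopic configurations consistent with what we observe and $k$ is the constant now named after Boltzmann. Gibbs’s more general version, $S = -k \sum_i p_i \ln p_i$, weighs each configuration by its probability $p_i$: the less we know about which configuration the system is in, the larger $S$.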
At approximately the same time, Scottish estate-owner James Clerk Maxwell, an illustrious figure in the history of modern science – comparable only to Galileo, Newton, or Einstein – calculated the energy distribution in gases and demonstrated how energy is distributed among the gas particles in accordance with the second law.
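The flavor of Maxwell’s result can be stated without derivation: in a gas at absolute temperature $T$, the probability that a particle occupies a state of energy $E$ falls off exponentially,

$$ p(E) \propto e^{-E/kT}, $$

so modest energies are common and large ones exponentially rare.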
Some forty years later, as the 20th century began, science found itself addressing the problem of the electromagnetic radiation that is emitted by all objects, having discovered that it is determined solely by their temperature. But there was an anomaly: although the radiation equation had already been formulated by Maxwell, the intensity of the radiation as a function of temperature refused to behave according to the laws of thermodynamics as understood at the time.
This “black body radiation” problem, as it was known, was solved in 1901 by Max Planck, a German intellectual. He showed that if the energy distribution is calculated according to the second law of thermodynamics while assuming that radiation energy is quantized (that is, comes in discrete amounts), the theoretically predicted distribution matches the one observed experimentally. Planck’s discovery started the quantum revolution, a storm that raged in the world of physics over the following decades. Planck’s energy distribution turned out to contain Maxwell’s distribution as a special case. Later on, Planck’s distribution was generalized even further, and today it is known as the Bose–Einstein distribution (unfortunately, discussing it lies beyond the scope of this book).
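In its standard modern form, Planck’s result for the average energy carried by radiation of frequency $\nu$ at temperature $T$ is

$$ \langle E \rangle = \frac{h\nu}{e^{h\nu/kT} - 1}, $$

where $h$ is the constant Planck introduced. When $h\nu$ is small compared with $kT$, the expression reduces to the classical value $kT$ – the sense in which the older distribution is a special case of Planck’s.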
Up to this point, entropy had been discussed in purely physical contexts. Part Two of this book discusses “logical entropy” and shows how the boundaries of entropy were expanded to include the field of signal transmission in communications. This began when, during the Second World War, an American mathematician and gadget aficionado called Claude Shannon demonstrated that the uncertainty inherent in a data file is its entropy. What was unique here was that the entropy Shannon calculated was purely logical: that is, it lacked any physical dimension. In the second part of our book we shall explain under what conditions physical entropy becomes logical entropy, and devote a special section to numerical files and Benford’s Law, which states that the distribution of leading digits in a random numerical file is uneven. We shall see that this uneven distribution results from the second law.
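Both formulas can be previewed here in their standard forms. Shannon’s entropy of a file whose symbols occur with probabilities $p_i$ is

$$ H = -\sum_i p_i \log_2 p_i \qquad \text{(measured in bits)}, $$

and Benford’s Law gives the probability that a number’s leading digit is $d$:

$$ P(d) = \log_{10}\left(1 + \frac{1}{d}\right), $$

so the digit 1 leads roughly 30 percent of the time while 9 leads less than 5 percent.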
The significance of the tendency of logical entropy to increase is manifested in a spontaneous increase of information, analogous to the spontaneous increase of ordered phenomena (life, buildings, roads, etc.) around us. Now, if the second law is indeed responsible for this creation of order, its signature should show up in the statistical distributions of these phenomena.
We shall examine the distribution of links in networks, of wealth among people, and of answers to polls, and we shall see that calculations obtained from logical thermodynamics do in fact confirm rules of thumb that have been known for some time. In economics, the Pareto rule (also known as the 80:20 Law) is quite famous. In linguistics, there is Zipf’s Law. These rules, which physicists classify under the general name of “power laws,” are all derived from calculations made by Planck, building on Boltzmann’s work, more than a century ago.
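In their simplest form, such power laws state that the frequency or size of an item falls off as a power of its rank or magnitude,

$$ p(x) \propto x^{-\alpha}, $$

Zipf’s Law being the familiar case in which the frequency of the $r$-th most common word in a language is roughly proportional to $1/r$.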
So it seems that contrary to Einstein’s famous quote that “God does not play dice with the world,” logical entropy and its expression in probability distributions prove that God does, indeed, play dice, and actually plays quite fairly.
This book has three layers: historical background, intuitive explanation and formal analysis.
The historical aspect includes the biographies of the heroes of the second law – those scientists whose names are perhaps not as familiar to the general public as those of singers, actors, artists and other celebrities, but who, as long as humankind aspires to know more, will be engraved forever in science’s hall of fame. Our heroes are Carnot (a Frenchman), Clausius (a Prussian), Boltzmann (an Austrian), Maxwell (a Scotsman), Planck (a German), and Gibbs and Shannon (Americans). Their biographies reveal the historical forces that informed their various research efforts: the industrial revolution, science’s golden age, and the age of information. It is important, however, to remember that these scientists did not work in a vacuum, so brief biographical sketches of others who worked alongside them or were otherwise affiliated with their work are given at the end of this book.
The second layer in this book presents intuitive explanations, those which scientists sometimes contemptuously dismiss as “hand-waving arguments.” Yet these intuitive explanations represent the essence of science and the essence of culture. Of course, there is no alternative in science to formal mathematical analysis, but such analysis has value only so long as it quantifies the verbal explanation, which consists of words, mental images and often “gut feelings” – the reason such explanations are called “intuitive.” Every mathematician knows that behind his or her pure mathematical equations resides intuitive understanding, and everyone accepts that the laws under which we live are guided by intuition.
Finally, there is the third, formal layer, contained in the appendixes at the end of the book, which provides mathematical derivations for readers with a technical or scientific background. If you have studied engineering, economics or the physical sciences, you will be able to examine these mathematical analyses directly, without having to consult monographs on the various subjects.
The derivations were chosen so that they would match the intuitive explanations in the book.
This book was written jointly by a husband-and-wife team. He is a scientist in the relevant fields: thermodynamics, optics and information science. She is an educator and welfare worker, a translator and an editor. The process that led to this work was an ongoing dialogue between two different worlds, which, we hope, will make our book an enjoyable, educational, and enlightening experience. It is intended not only for the general public interested in the nature of things, but also for students, engineers and scientists.