Free Will and Entropy
Why do we want more and more money regardless of how much we already have? Why do we hate to be manipulated and to lose? Why do twenty percent of the people own eighty percent of the wealth? Why, in most languages, does the most common word appear about twice as often as the second most common word? Why does the digit “1” appear in company balance sheets roughly six and a half times as often as the digit “9”? Why does nature hate “bubbles”? The cause of all these phenomena is the very same law that makes water flow from high to low, and heat flow from a hot place to a cold one. This law, which for historical reasons is called the Second Law of Thermodynamics, states that there is a never-decreasing quantity called entropy. Entropy represents the uncertainty of a system in a hypothetical equilibrium in which everybody and everything have equal opportunities but slim chances to win; in other words, the majority have little and a few have a lot. The book describes the historical evolution of the understanding of entropy, alongside the biographies of the scientists who contributed to its definition and to the exploration of its effects in numerous domains, including the exact sciences, communication theory, economics and sociology. This book should be of interest to a broad audience, from scientists, engineers and students to the general public.
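To make the quoted figures concrete: the word-frequency and leading-digit claims follow directly from Zipf's law and Benford's law. A minimal Python sketch (the function names are our own, chosen for illustration) reproduces both the roughly 6.5-to-1 digit ratio and the 2-to-1 word-frequency ratio:

    import math

    # Benford's law: the probability that a number's leading digit is d
    # is P(d) = log10(1 + 1/d), so "1" leads about 30% of the time.
    def benford(d):
        return math.log10(1 + 1 / d)

    print(benford(1) / benford(9))   # ~6.58: digit "1" vs digit "9"

    # Zipf's law: word frequency is roughly inversely proportional to rank,
    # so the most common word appears about twice as often as the second.
    def zipf_frequency(rank):
        return 1 / rank

    print(zipf_frequency(1) / zipf_frequency(2))   # 2.0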
Synopsis
The book provides a panoramic view of entropy and the second law of thermodynamics. Entropy shows up in a wide variety of contexts, including physics, information theory and philosophy. In physics it is conceived of as a measure of disorder; in information theory the same expression is conceived of as a measure of uncertainty; in philosophy and the social sciences entropy is used as a metaphor for decay and death. Since the second law of thermodynamics states that the universe’s entropy never decreases, it implies that disorder, uncertainty and decay all tend to increase. This contradicts our experience on this planet, on which highly complex and “ordered” entities constantly come into being and proliferate. Moreover, although the expressions of entropy in information theory (Shannon’s entropy) and in physics (Gibbs’s entropy) are identical, they are sometimes interpreted as opposites, and some scholars claim that the identity of Shannon’s and Gibbs’s expressions is merely coincidental (the sketch below makes this identity concrete).

The book describes the multiple faces of entropy in a unified way: entropy is a measure of uncertainty. It is shown that in sparse systems, where the number of particles is much smaller than the number of possible sites in which they can be arranged, uncertainty manifests as disorder, whereas in dense systems, where there are many more particles than possible sites, uncertainty manifests as complexity. Therefore the tendency of entropy to increase (namely, the second law of thermodynamics) increases disorder in sparse systems but increases complexity in dense systems. We interpret this increase in complexity in nature as a spontaneous generation of “order”.

Entropy differs from most physical quantities in being a statistical quantity. Its major effect is to drive statistical systems toward the most stable distribution that can exist in equilibrium, and this driving force is interpreted in this book as the physical origin of “free will” in nature. The book describes in detail some important distributions in nature in order to support the claim that a law of physics, the second law of thermodynamics, also prevails in communication networks, social structures, human decisions and human will.
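As a concrete anchor for the claimed identity: Shannon's entropy of a probability distribution is H = -Σ pᵢ ln pᵢ, and Gibbs's entropy is the same expression multiplied by Boltzmann's constant. A minimal Python sketch (the toy distributions are our own illustrative choices) computes both and shows that uncertainty peaks when all outcomes have equal probability:

    import math

    K_B = 1.380649e-23   # Boltzmann's constant in J/K

    def shannon_entropy(p):
        # H = -sum(p_i * ln p_i): Shannon's measure of uncertainty
        return -sum(pi * math.log(pi) for pi in p if pi > 0)

    def gibbs_entropy(p):
        # S = -k_B * sum(p_i * ln p_i): the identical expression in physics,
        # scaled by Boltzmann's constant
        return K_B * shannon_entropy(p)

    uniform = [0.25, 0.25, 0.25, 0.25]   # "equal opportunities"
    skewed  = [0.70, 0.20, 0.05, 0.05]   # "a few have a lot"

    print(shannon_entropy(uniform))   # ln 4 ~ 1.386, the maximum for four outcomes
    print(shannon_entropy(skewed))    # ~ 0.871, lower uncertainty
    print(gibbs_entropy(uniform))     # the same number scaled by k_B

The uniform distribution maximizes the entropy, matching the blurb's picture of equilibrium as the state in which everybody has equal opportunities; skewing the probabilities toward a few outcomes lowers the uncertainty.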