Entropy Blog

Economic Inequality

The inequality of wealth among people is a major issue in all societies. It is believed to determine the happiness of people and, therefore, the political stability of countries. Can the second law describe the economic inequality that exists in developed countries? The answer is positive. When we apply the Planck-Benford distribution, namely,

E(n) = ln(1 + 1/n) / ln(N + 1)

where n is the income rank out of N ranks and E(n) is the relative income of that rank, to the wealth distribution in the OECD countries, we obtain remarkable agreement:

Parameter                                      OECD average    Second Law
Gini index                                     0.32            0.327
Relative poverty                               9%              9%
Ratio between the upper and lower deciles      9.6             7.5
Wealth of the upper percentile                 18%             15%
Wealth of the upper ten percentiles            50%             52%
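
For readers who want to check the numbers, here is a minimal Python sketch. The explicit form E(n) = ln(1 + 1/n)/ln(N + 1) and the choice of N = 100 income percentiles are assumptions made for this illustration, not quotes from the paper; with them the script reproduces the top-percentile and top-decile shares in the table.

import math

def planck_benford_share(n, N):
    # Relative income of rank n out of N ranks (assumed Planck-Benford form);
    # the shares sum to 1 because the numerators telescope to ln(N + 1).
    return math.log(1 + 1 / n) / math.log(N + 1)

N = 100  # one rank per income percentile (an assumption of this sketch)
shares = [planck_benford_share(n, N) for n in range(1, N + 1)]
print(f"sum of shares            : {sum(shares):.3f}")       # 1.000
print(f"wealth of top percentile : {shares[0]:.1%}")          # ~15%
print(f"wealth of top decile     : {sum(shares[:10]):.1%}")   # ~52%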

The Planck-Benford distribution also yields very good agreement with the executive salaries of big companies. If N is the number of employees in a company, then according to the second law the highest salary in the company should be about SQRT(N) times the average salary. We calculated the ratio between the actual top salaries in Fortune 100 companies and the expected theoretical values, and found that the average ratio is 0.87. These results are explained in full in the recently published paper.

 

The Physical Meaning of Money

Is there a physical meaning to money? The book "Entropy God's Dice Game" ends with a statement that freedom is entropy, equal opportunity is the second law, and the microstate in which we exist is our destiny, determined in God's dice game. The reasoning behind this somewhat bombastic statement is that the greater the freedom, the greater the number of microstates available to us, and therefore the higher the entropy. Entropy has its maximum value when there are no constraints and each state (and therefore each microstate) has equal probability. It is also argued in the book that in networks the number of links of a node represents its wealth. For example, in our society, Bill Gates is linked to, that is to say has access to, many people due to his wealth.

In the paper "Money, Information and Heat in Social Networks Dynamics", recently published in Mathematical Finance Letters, this argument is investigated further. In this paper a network is defined as a microcanonical ensemble of states and links. The states are the possible pairs of nodes in the net, and the links are communication bits exchanged between the nodes. This network is an analogue of people trading with money. The approach is consistent with the intuition of the book that wealth is entropy (information), and therefore money is entropy, which can be interpreted as the freedom to choose from many options. Therefore, the physical meaning of money is entropy. In this case, money transfer is simulated by the transfer of bits, which is heat (energy transferred). By analogy with a boson gas, we define for this network model: entropy, volume, pressure and temperature.

We show that these definitions are consistent with Carnot efficiency (the second law) and with the ideal gas law. Therefore, if we have two large networks, a hot one and a cold one with temperatures TH and TC, and we remove Q bits (money) from the hot network and transfer them to the cold network, we can keep W bits as profit. The profit is bounded by the Carnot formula, W < Q(1 - TC/TH). In addition, it is shown that when two such networks are merged, the entropy increases. This explains the tendency of economic and social networks to merge. The temperature of a network derived in the model is the average number of links available per state.
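
As a small numerical illustration of the bound (the temperatures and Q below are made-up values, not taken from the paper):

def max_profit_bits(q, t_hot, t_cold):
    # Carnot bound: at most W = Q*(1 - TC/TH) of the Q transferred bits
    # can be kept as profit when bits (money) flow from the hot network to the cold one.
    return q * (1 - t_cold / t_hot)

# Example: a hot network with 8 links per state and a cold one with 2.
print(max_profit_bits(1000, t_hot=8.0, t_cold=2.0))  # 750.0 bits at most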

Kolmogorov Complexity and Entropy

Kolmogorov complexity and entropy are often described as related quantities, without an explanation of how they are related. In order to explore the connection between them, we have to understand that information is a transfer of energetic content from Bob to Alice. Bob knows the content of his message, and therefore it carries no entropy (uncertainty) for him. However, it does carry entropy for Alice, who is ignorant of its content. If Bob sends Alice N bits, the entropy for Alice is N ln 2, no matter what the message is. Bob would like to send his message with the minimum number of bits possible. Why? Because sending a short message saves energy and time, and also exposes the message to less noise. In our book we claim that Bob compresses his files because he obeys the Clausius inequality (the second law of thermodynamics). Regardless of Bob's reasons for compressing his file, the complexity of his message determines how efficiently he can do it.

Suppose Bob wants to send Alice the first N digits of the number Pi, compressed to the Shannon limit. Pi is an irrational number, and since Bob cannot find any periodicity in it (the digits of irrational numbers look random), Pi cannot be compressed by conventional methods. Nevertheless, it can be calculated by a simple function, and therefore Bob can send Alice a generating function of Pi, which will enable Alice to produce the N digits of Pi at her leisure.
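
To make this concrete, here is one possible generating program, Gibbons' unbounded spigot algorithm, used purely as an illustration and not necessarily the method Bob would choose. A fixed, short piece of code produces as many digits of Pi as Alice wishes, so sending the program plus the number N is a far shorter message than sending N seemingly random digits.

from itertools import islice

def pi_digits():
    # Gibbons' unbounded spigot algorithm: yields the decimal digits of Pi one by one.
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            yield n
            q, r, t, k, n, l = (10 * q, 10 * (r - n * t), t, k,
                                (10 * (3 * q + r)) // t - 10 * n, l)
        else:
            q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                (q * (7 * k + 2) + r * l) // (t * l), l + 2)

print(list(islice(pi_digits(), 10)))  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]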

In order to send Alice a method of generating Pi, Bob has to find out how "complex" Pi is, or in other words, the simplest and shortest way of generating the digits of Pi. Hence, the minimum number of operations required to generate a given content is the complexity of the content, while the length of the file that carries the content is its entropy.

Unified evolution

Evolution theory was suggested as an answer to one of the most intriguing questions: How was the variety of biological species on earth created? Contemporary evolution theory is based on biological and chemical changes. Many believe that life started from some primordial chemical soup.

Does evolution have a deeper root than chemistry? Is there a physical law that is responsible for evolution? Our book advocates a positive answer.

There is a semantic differentiation between "natural things" and "artificial things": natural things are created by nature, and artificial things are made by us. Is there a justification for this egocentric view of the world? Here it is argued that both "we" and the "things" we make are all part of nature and are subject to the same laws. From a holistic point of view, computers, telephones, cars, roads, etc. are all created spontaneously in nature by the same invisible hand that created us. There are no special laws of nature for human actions. A unified evolution theory should explain the origin of our passion to make tools, to develop technology, to create social networks, to trade and to seek "justice".

Here we claim that entropy, in its propensity to increase, is the invisible hand of both our life and our actions.

Understanding Uncertainty

Probability is a well-established branch of mathematics of high importance. In mathematics, probabilities are calculated consistently with a set of axioms. Sometimes uncertainty is defined by statisticians according to the rules of probability.

For example: suppose Bob plans to dine with Alice in the evening, and there is a 1/10 chance that he will not be available. Since the total probability is 1 (Kolmogorov's second axiom), there is a 9/10 chance that they will dine together and a 1/10 chance that they will not. If there is a 1/2 chance that Bob will not be available, the total probability is still 1, but now it comprises a probability of 1/2 for a joint dinner. In general, if the probability that Bob will not be available is p, the probability of the joint dinner is 1-p.

In this example, some statisticians may say that the uncertainty of having a joint dinner is 10% in the first case and 50% in the second. This is not correct.

Uncertainty is defined by the Shannon entropy, and its expression for the joint dinner is

-p ln p - (1-p) ln(1-p).

Usually engineers use the logarithm in base 2, and then the uncertainty is expressed in bits. If p=1/2, the uncertainty is 1 bit (one or zero). If p=1/10, the uncertainty is about 0.47 bit, namely, a little less than half a bit. Entropy is a physical quantity that is a function of a mathematical quantity p; but unlike mathematical quantities, which exist in a formal mathematical space defined by its axioms, entropy is bounded by a physical law, the second law of thermodynamics. Namely, entropy tends to increase to its maximum.
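
A minimal sketch of this calculation in Python:

import math

def binary_entropy_bits(p):
    # Shannon uncertainty -p*log2(p) - (1 - p)*log2(1 - p), expressed in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy_bits(0.5))  # 1.0 bit
print(binary_entropy_bits(0.1))  # ~0.47 bit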

The maximum value of S in our example is ln 2, obtained when p=1/2. Does this mean that nature prefers the chance of Bob not being available for dinner with Alice to be 1/2, where the entropy is at its maximum? The answer, surprisingly for a mathematician, is yes! If we examine many events of this kind, we will see a (bell-like) distribution with a peak at the value p=1/2.

Similarly, the average of many polls in which one picks, randomly, 1 out of 3 choices will be a distribution of 50%:29%:21% and not 33%:33%:33%, as would be expected from a simple probability calculation. Laws of nature (the second law) can tell us something about the probabilities of probabilities. The function that describes the most probable distribution of the various events is called the distribution function.

The distribution functions in nature that result from the tendency of entropy to maximize are, among others:

  • Bell-like distributions for humans: mortality, heights, IQ, etc.
  • Long-tail distributions for humans: Zipf's law in networks and texts, Benford's law in numbers, Pareto's law for wealth, etc.

Benford law

Benford's law is about the uneven distribution of digits in random decimal files. It was discovered by Simon Newcomb at the end of the 19th century, when he noticed that the earlier pages of logarithm books were consistently more worn than the later ones. The phenomenon was rediscovered by Frank Benford in 1938.

Newcomb found and stated the law in its most general form by declaring that the mantissa is uniformly distributed. Benford set out to check the law empirically and also successfully guessed its equation for the first digits: ρ(n) = log10[(n+1)/n], namely, the probability ρ(n) of digit n (n = 1, 2, …, 9) is monotonically decreasing, such that digit 9 is found about 6.5 times less often than digit 1. The law is also called "the first digit law". Benford showed that it holds for many naturally generated decimal files.
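
A short Python sketch that tabulates this first-digit formula:

import math

# Benford first-digit probabilities rho(n) = log10((n + 1)/n) for n = 1..9.
for n in range(1, 10):
    print(n, round(math.log10((n + 1) / n), 3))
# Digit 1 appears with probability ~0.301 and digit 9 with ~0.046,
# i.e. digit 9 is roughly 6.5 times rarer than digit 1.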

Misconception: Benford's law applies only to the first digits of numbers.

NOT TRUE. Benford's law holds for the first, second, third, or any other digit order of decimal data. The law was originally stated mostly in the first-digit sense, which does not include the digit 0. Second and higher digit orders naturally incorporate the digit 0 as a distinct possibility.

Benford's law applies to any decimal file that is compressed to the Shannon limit. In a binary file at the Shannon limit, all the digits excluding the 0's are 1's. In a 0, 1, 2 counting system the ratio between the digits 1 and 2 is 63:37, and in a 0, 1, 2, 3 counting system the ratios between the digits 1, 2 and 3 are 50:29:21. In the same way, a compressed decimal file follows the Benford distribution.
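
These ratios follow from the same rule written for a general base b, ρ(n) = log_b[(n+1)/n], which is the standard base-b form of Benford's law; a quick check:

import math

def digit_probabilities(base):
    # Benford-type probability of digit n (n = 1 .. base-1) in the given base.
    return [math.log(1 + 1 / n, base) for n in range(1, base)]

print([round(p, 2) for p in digit_probabilities(3)])  # [0.63, 0.37]       -> 63:37
print([round(p, 2) for p in digit_probabilities(4)])  # [0.5, 0.29, 0.21]  -> 50:29:21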

Why does calculating the Shannon limit not give us information about the 0's? Strictly speaking, zero has no entropy and therefore it does not count. Or, put formally, entropy is logarithmic, and this is also the reason why the changes in the frequencies of the digits are logarithmic (exactly like the distances on a slide rule).

Why is entropy logarithmic? Because that IS the way God plays dice!