Kolmogorov complexity and entropy are described as related quantities, without explaining how. To explore the connection between them, we have to understand that information is a transfer of energetic content from Bob to Alice. Bob knows the content of his message, so it carries no entropy (uncertainty) for him. It does, however, carry entropy for Alice, who is ignorant of its content. If Bob sends Alice N bits, the entropy for Alice is N ln 2, regardless of what the message says. Bob would like to send his message with the minimum possible number of bits. Why? Because sending a short message saves energy and time and also exposes the message to less noise. In our book we claim that Bob compresses his files because he obeys the Clausius inequality (the second law of thermodynamics). Regardless of Bob's reasons for compressing his file, the complexity of his message determines how efficiently he can do it.
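As a small numerical illustration of the N ln 2 figure (my own sketch, not taken from the book): if Alice regards all 2^N possible N-bit messages as equally likely before receiving one, the Shannon entropy of the message from her point of view is N ln 2 nats, i.e. N bits. The short Python check below makes this explicit for a small N.

```python
import math

def entropy_nats(probabilities):
    """Shannon entropy in nats: H = -sum(p * ln p)."""
    return -sum(p * math.log(p) for p in probabilities if p > 0)

N = 8                                   # message length in bits
outcomes = 2 ** N                       # number of possible N-bit messages
alice_view = [1 / outcomes] * outcomes  # Alice treats every message as equally likely
print(entropy_nats(alice_view))         # ~5.545 nats
print(N * math.log(2))                  # N ln 2, the same value
```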
Suppose Bob wants to send Alice the first N digits of the number Pi, compressed to its Shannon limit. Pi is an irrational number, and since Bob cannot find any periodicity in its digits (they show no discernible pattern), Pi cannot be compressed by conventional methods. Nevertheless, it can be calculated by a simple function, and therefore Bob can send Alice a generating function for Pi, which will enable Alice to produce the N digits of Pi at her leisure.
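Such a generating program can indeed be very short. As an illustrative sketch (not the book's own construction), the Python generator below implements the well-known unbounded spigot algorithm of Gibbons and yields the decimal digits of Pi one at a time; Bob could send a recipe like this instead of the digits themselves.

```python
def pi_digits():
    """Yield the decimal digits of Pi one at a time.

    Classic unbounded spigot algorithm (Gibbons), which views the
    Leibniz series as a sequence of linear fractional transformations.
    """
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            # The next digit is now determined: emit it and rescale the state.
            yield n
            q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
        else:
            # Consume one more term of the series to narrow the interval.
            q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l,
                                k + 1, (q * (7 * k + 2) + r * l) // (t * l), l + 2)


if __name__ == "__main__":
    from itertools import islice
    print(list(islice(pi_digits(), 10)))  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
```

Alice can run this program for as long as she likes, producing as many digits as she needs from a message whose length does not depend on N.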
In order to send Alice a method of generating Pi, Bob has to find out how "complex" Pi is, or in other words, the simplest and shortest way of generating its digits. Hence, the minimum number of operations required to generate a given content is the complexity of that content, while the length of the file that carries the content is its entropy.
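The distinction can be made concrete by comparing the size of the content with the size of its recipe. The sketch below assumes the hypothetical pi_digits() generator from the previous example is defined in a saved module: the file holding the first N digits grows with N, while the generating program keeps a fixed length.

```python
import inspect
from itertools import islice

# Length of the "recipe": the source code of the generator sketched above
# (inspect.getsource needs the function to live in a saved .py file).
program_length = len(inspect.getsource(pi_digits))

for N in (10, 100, 1000):
    # Length of the "content": a file holding the first N decimal digits of Pi.
    file_length = len("".join(str(d) for d in islice(pi_digits(), N)))
    print(f"N={N}: file={file_length} characters, program={program_length} characters")

# The file length (the entropy side) grows with N,
# while the program length (the complexity side) does not.
```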