A Coherent View on Entropy

Informational entropy is often identified with physical entropy. This is surprising because the two quantities are defined differently and, furthermore, the former is a subjective quantity while the latter is an objective one. We describe the problems and then present a possible view that reconciles the two entropies: the informational entropy of a system is interpreted as the physical entropy of a whole composed of both the system and "memories" containing information about the system.


I. INTRODUCTION
The second law of thermodynamics is one of the most concrete laws in nature [1, 2, 3, 4, 5]. The law says that a quantity S, the physical entropy, of an isolated system does not decrease. Boltzmann gave a microscopic interpretation of the physical entropy [1, 2, 3, 4, 5]:

S = log_2 W.   (1)

Here W is the number of microstates belonging to a given macrostate. It is meaningful to talk about the physical entropy of a specific microstate: it is defined by the number of other microstates belonging to the same macrostate [6]. Therefore, S is an objectively defined quantity and is observer independent. On the other hand, Shannon proposed a quantity H, the informational entropy, to measure the randomness of an entity [7]:

H = -Σ_i p_i log_2 p_i.   (2)

Here p_i denotes the probability that the entity is the i-th one. Note that the informational entropy H is a subjectively defined quantity; the probability, or information, must be someone's. We call this 'someone' an 'observer'.

Proposition 1: Without an observer for whom the probabilities p_i are given, H cannot be determined and is meaningless.
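As an illustration (ours, not the paper's), Shannon's informational entropy H = -Σ_i p_i log_2 p_i is straightforward to compute; the following Python sketch shows that H depends entirely on the probabilities a given observer assigns:

```python
import math

def informational_entropy(probs):
    """Shannon's informational entropy H = -sum_i p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An observer certain of the outcome assigns H = 0 bits;
# an observer assigning equal odds to two outcomes gets H = 1 bit.
certain = informational_entropy([1.0])
coin = informational_entropy([0.5, 0.5])
```

Two observers with different knowledge of the same entity assign different probability lists and hence different values of H, which is exactly the subjectivity stated in Proposition 1.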
However, the informational entropy H is often identified with the physical entropy S [2, 3, 4, 5, 8, 9]. This is quite surprising because the former is a subjective quantity while the latter is an objective one, so the identification should be problematic. The aim of this paper is to examine the usual identification and provide a coherent resolution of the problem: H can be interpreted as S of the whole that contains both the original system for which H is given and memories that are correlated with the system. Here the memories play the role of the observer. This paper is organized as follows. Section II discusses why the identification is problematic. Section III describes how informational entropy can be interpreted properly in terms of objective entropy. In Section IV, a few related questions are discussed. The conclusion is given in Section V. The same problem exists in both classical and quantum mechanics; however, let us first pose the problem in a quantum form, which may be more familiar to current physicists.
II. THE PROBLEM

In quantum mechanics, von Neumann introduced an analog of the informational entropy,

N = -Tr(ρ log_2 ρ),   (3)

where ρ is the density operator. If p_i is the eigenvalue of ρ with the associated eigenvector |i⟩, we have

ρ = Σ_i p_i |i⟩⟨i|,   (4)

where the |i⟩'s are orthonormal. The density operator in Eq. (4) corresponds to an ensemble of the quantum states |i⟩ with probabilities p_i. Now the von Neumann entropy N can also be expressed as

N = -Σ_i p_i log_2 p_i.   (5)

Note that the von Neumann entropy of a pure state is zero while that of a mixed state is positive. However, a state that is pure to one observer can appear mixed to another. Thus, the von Neumann entropy is observer dependent because it is a type of informational entropy (Proposition 1). Nevertheless, the von Neumann entropy is identified with the physical entropy S (for example, see Ref. [2]). How, then, can an observer-dependent quantity, the von Neumann entropy, be identified with an objective quantity, the physical entropy? Does it mean that physical entropy is also observer dependent? The answer to the latter question is no. The problem can be posed in a different form: 'What is the entropy of the universe (if the von Neumann entropy is identified with the physical entropy)?' There is no one outside the universe for whom the probabilities p_i are defined.
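To make the pure-versus-mixed distinction concrete, here is a minimal Python sketch (an illustration of ours, restricted for simplicity to 2×2 density matrices with real entries) that computes the von Neumann entropy N = -Tr(ρ log_2 ρ) from the eigenvalues of ρ:

```python
import math

def shannon_bits(probs):
    """H = -sum_i p_i * log2(p_i) in bits, skipping (near-)zero eigenvalues."""
    return -sum(p * math.log2(p) for p in probs if p > 1e-12)

def von_neumann_entropy_2x2(rho):
    """Von Neumann entropy of a real symmetric 2x2 density matrix.

    The eigenvalues follow from the trace and determinant:
    lam = (tr +/- sqrt(tr^2 - 4 det)) / 2.
    """
    (a, b), (c, d) = rho
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    eigs = [(tr + disc) / 2, (tr - disc) / 2]
    return shannon_bits(eigs)

# Pure state |+><+|: eigenvalues {1, 0}, so N = 0.
plus = [[0.5, 0.5], [0.5, 0.5]]
# Maximally mixed state I/2: eigenvalues {1/2, 1/2}, so N = 1 bit.
mixed = [[0.5, 0.0], [0.0, 0.5]]
```

An observer who can access only part of a larger entangled system would describe it by a matrix like `mixed` and assign nonzero entropy, while an observer holding the full pure state assigns zero, which is the observer dependence discussed above.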
Another argument for the identification?: One may give another reason why the informational entropy is the same as the physical entropy. Consider a macrostate to which W different microstates belong. Practically, an observer knows nothing about which microstate is the actual one. Assuming equal probability p_i = 1/W for each microstate gives H = -Σ_i (1/W) log_2(1/W) = log_2 W, which recovers Eq. (1). However, this argument implies no more than 'the informational entropy in the case of zero knowledge is the same as the physical entropy'; the two are not the same in the case of non-zero knowledge.

The problem stated above is that informational entropy is not meaningful without an observer for whom the probabilities p_i are defined. To solve the problem, it should be noted that there are objects other than the system that may be correlated with the system, and the more an object is correlated with the system, the more information about the system the object has. Here the object plays the role of the observer. The object can be anything that is correlated with the system to some degree, e.g., a human brain, a computer memory, or stones.
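The zero-knowledge argument can be checked numerically. In this Python sketch (an illustration, not part of the original argument), a uniform distribution over W microstates reproduces log_2 W, while any non-zero knowledge lowers H below that value:

```python
import math

def informational_entropy(probs):
    """H = -sum_i p_i * log2(p_i) in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

W = 8
# Zero knowledge: every microstate is equally likely, so H = log2 W.
uniform = [1 / W] * W
# Partial knowledge: one microstate is strongly favored, so H < log2 W.
peaked = [0.9] + [0.1 / (W - 1)] * (W - 1)
```

Only in the zero-knowledge case does the informational entropy coincide with the Boltzmann value log_2 W; with partial knowledge the two quantities come apart, as the text argues.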
III. THE COHERENT VIEW

Proposition 2 (coherent explanation): The informational entropy N of a system should be interpreted as the physical entropy S of the whole composed of the system and an object containing information on the system, namely N(system) = S(system + memory).
Let us illustrate Proposition 2 [10]. Consider the informational entropy N of a system of n quantum bits (qubits) [11], where n is a positive integer. For each system-qubit there is a corresponding qubit, which is the object discussed above; this new qubit plays the role of a memory. Therefore, there are n memory-qubits. Here we consider only those cases where each qubit is in one of two orthogonal states, |0⟩ and |1⟩. Two extreme cases are considered.
In the first case, the system and memory are perfectly correlated; that is, the memory has perfect information about the system. Each qubit of the system may be either |0⟩ or |1⟩, but each memory-qubit is in the same state as the corresponding system-qubit [12]. For example, a possible configuration is (|0⟩,|0⟩), (|1⟩,|1⟩), (|0⟩,|0⟩), (|0⟩,|0⟩), (|1⟩,|1⟩), ..., where the first (second) entry in each parenthesis denotes a system-qubit (memory-qubit). Therefore, the total number of states is W = 2^n, and from Eq. (1) we have

S_1 = log_2 2^n = n.   (6)

In the second case, the system and memory are not correlated; that is, the memory has no information about the system. For example, (|0⟩,|1⟩), (|1⟩,|1⟩), (|0⟩,|1⟩), (|0⟩,|0⟩), (|1⟩,|0⟩), ... . The total number of states is W = 2^{2n} and

S_2 = log_2 2^{2n} = 2n.   (7)

Therefore, S_1 (S_2) is the physical entropy of a case where the memory has perfect (no) information on the system. Comparing the two,

S_1 = n < 2n = S_2,   (8)

so a case of perfect information has smaller physical entropy than a case with no information. This is a coherent explanation, in terms of physical entropy, of how the loss of information in a memory means an increase in physical entropy.
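Under the restriction that each qubit is in |0⟩ or |1⟩, the state counting for the two extreme cases can be reproduced by a brute-force enumeration. The following Python sketch is an illustration of the argument, not code from the paper:

```python
import math
from itertools import product

n = 3  # number of system qubits (and equally many memory qubits)

# Perfectly correlated case: every memory qubit copies its system qubit,
# so a joint state is fixed by the system configuration alone -> 2**n states.
correlated = [(s, s) for s in product("01", repeat=n)]

# Uncorrelated case: system and memory configurations are independent
# -> 2**n * 2**n = 2**(2n) states.
uncorrelated = [(s, m)
                for s in product("01", repeat=n)
                for m in product("01", repeat=n)]

S1 = math.log2(len(correlated))    # physical entropy, perfect information: n
S2 = math.log2(len(uncorrelated))  # physical entropy, no information: 2n
```

The enumeration makes the ledger explicit: destroying the correlation does not change the system alone, but it doubles the entropy of the system-plus-memory whole from n to 2n bits.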

IV. DISCUSSION
As is well known [2, 3, 4, 5], there is another problem related to the identification of informational entropy with physical entropy. The problem, which is often regarded as a paradox, is that 'the entropy does not increase' [2, 4, 5]. The argument is as follows. Because the Schrödinger equation is deterministic, an observer can track the quantum state [13]; that is, one who knows the initial quantum state knows the quantum state at any later time. This trackability, together with the unitarity of quantum mechanics, implies that the informational entropy N is constant [2]. This is also the origin of the famous problem involved with black-hole evaporation [8, 9]. The resolution of the paradox in terms of the coherent view is as follows: the capacity of a memory is limited, so it cannot perfectly track the state of the system; that is, the correlation degrades naturally. This explanation appears to be related to the resolution of the paradox by the 'coarse graining' idea [3, 5].
With the coherent view, the case of multiple observers can be dealt with. It is simple to see that the more observers there are for a given system, the larger the increase in entropy due to the 'information loss' of the observers. When there is no observer, there is no increase in entropy due to information loss.

V. CONCLUSION
Informational entropy is often identified with physical entropy. This is problematic because the former is a subjective quantity while the latter is an objective one. This paper described the problems and reconciled the two entropies. An explanation was provided in terms of physical entropy by which we can coherently understand how a 'loss of information' of a subject (here, a memory) induces an increase in entropy. We emphasize that physical entropy should be defined objectively in any case.