Hidden Information, Energy Dispersion and Disorder: Does Entropy Really Measure Disorder?
Driss Soubane1,2,3*, Mohamed El Garah4, Mohamed Bouhassoune5,6, Ahmed Tirbiyine7, Abdelaziz Ramzi8, Said Laasri3
1Advanced Laser Light Source (ALLS), Centre Energie, Matériaux et Télécommunications Institut National de la Recherche Scientifique INRS-EMT, Varennes, Canada.
2Laboratoire des Nanomatériaux pour l’Energie et l’Environnement LN2E, FPS, Cadi Ayyad University, Marrakesh, Morocco.
3Laboratoire des Sciences de l’Ingénieur pour l’Energie, ENSAJ, Université Chouaibdoukkali, El Jadida, Morocco.
4University of Strasbourg, CNRS, ISIS UMR, Strasbourg, France.
5Département de Physique, FPS, Cadi Ayyad University, Marrakesh, Morocco.
6Laboratoire de Dynamique et Optique des Matériaux, FS, Université Mohamed Premier, Oujda, Morocco.
7Equipe des Matériaux Supraconducteurs, Université Ibn Zohr, Faculté des Sciences, Agadir, Morocco.
8Department of Electrical and Computer Engineering, Université de Sherbrooke, Sherbrooke, Canada.
DOI: 10.4236/wjcmp.2018.84014

Abstract

Despite its appearance in physics around 1850, the second law of thermodynamics is still attracting efforts at clarification. More specifically, since its definition and introduction fifteen years later (1865), entropy has been the subject of various interpretations. Hence, in the physical sciences, and notably at different levels of education, its concept has proved relatively difficult to decipher unambiguously. In this work, we re-introduce the notion of entropy from the classical, quantum and information-theoretic viewpoints. The controversy over the misconception, stated by many scientists, of entropy as a measure of disorder is addressed as well, in order to arrive at a less confusing physical interpretation of entropy. Over time, an increase of entropy, a quantitative quantity, is most often associated with a rise of disorder, a non-quantitative quantity with no value-returning mathematical equation, rather than with a continuous increase of hidden data. In other words, linking disorder to hidden data typically raises more confusion than clarification. Here, we shed more light on both concepts in order to find an acceptable interpretation of entropy.


1. Introduction

The statistical interpretation of entropy was introduced by Ludwig Boltzmann in 1877 [1]: entropy as disorder in energy space. At that time, Democritus’s suggestion of an atomistic model was not unanimously accepted by physicists, and Aristotelian physics still prevailed. It is worth noting that, from a statistical viewpoint, Boltzmann linked the entropy S of a closed complex system to the natural logarithm of its number of microstates according to the equation:

$S = k \ln N$ (1)

where k is the Boltzmann constant, k = 1.38 × 10⁻²³ J∙K⁻¹, named after Boltzmann even though it was actually introduced by Max Planck, and N is the number of microstates, i.e., the statistical weight of the macrostate under consideration.
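To make Equation (1) concrete, the following Python sketch (an illustration added here, not part of the original derivation) evaluates S = k ln N for a hypothetical system of n independent two-state particles, for which N = 2ⁿ; the particle numbers chosen are arbitrary.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(num_microstates: float) -> float:
    """Boltzmann entropy S = k ln N for a system with N microstates."""
    return K_B * math.log(num_microstates)

# Hypothetical system: n independent two-state particles have N = 2**n microstates.
for n in (10, 100, 1000):
    print(f"n = {n:4d} particles -> S = {boltzmann_entropy(2.0 ** n):.3e} J/K")
```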

L. Boltzmann struggled desperately to have his atomistic model recognized before, sadly, taking his own life. His work paved the way for the statement made in 1902 by Josiah Willard Gibbs in his book [2] that entropy is maximal at thermodynamic equilibrium.

Unlike chemists, who readily adopted the modern atomic model theorized in 1808 by their British colleague John Dalton, physicists remained patiently cautious, waiting for solid evidence such as A. Einstein’s theory of Brownian motion in 1905 [3] [4] before joining the atomist mainstream. For an isolated system, Rudolf Clausius established in 1865 an inequality stating that entropy can only increase or, at best (for a reversible transformation), remain constant. Entropy cannot, therefore, decrease over time:

$\oint \dfrac{\delta Q}{T} \leq 0$ (2)

where $\delta Q$ is the differential heat transfer and $T$ is the absolute temperature at the boundary where the heat transfer occurs; the equality $dS = \delta Q / T$ holds for a reversible process only.
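As a hedged numerical illustration of why entropy can only be generated, the short Python sketch below applies dS = δQ/T to each of two idealized heat reservoirs exchanging a fixed amount of heat irreversibly; the reservoir temperatures and the transferred heat are assumed values chosen purely for illustration.

```python
def entropy_generated(q: float, t_hot: float, t_cold: float) -> float:
    """Total entropy change when heat q flows irreversibly from t_hot to t_cold.

    Each reservoir is taken large enough that its temperature stays constant,
    so dS = delta_Q / T can be applied directly to each of them.
    """
    ds_hot = -q / t_hot    # the hot reservoir loses heat q
    ds_cold = q / t_cold   # the cold reservoir receives the same heat q
    return ds_hot + ds_cold

# Assumed values: 1000 J flowing from a 400 K reservoir to a 300 K reservoir.
# The total entropy change is positive, as required for a spontaneous process.
print(f"Delta S_total = {entropy_generated(1000.0, 400.0, 300.0):.3f} J/K")  # ~0.833 J/K
```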

2. Entropy Association with Disorder and Missing Information

2.1. Entropy and Disorder

Entropy is a thermodynamic property that is sometimes misused as a measure of disorder [5] [6]. Whereas energy and momentum are conserved quantities, entropy can be continuously generated. This generated entropy then measures the irreversibility of a system, or how inefficient a machine is. Unintentionally, a confusing pitfall has arisen from defining and confounding entropy with a measure of disorder of a system. Following this logic, the universe, or multiverse, for example, would be driven towards chaos as time evolves owing to natural entropy creation. Entropy can consequently be regarded as an arrow of time (Figure 1).

Figure 1. Schematic representation of entropy as a time arrow.

Presumably, the observable expanding universe is not necessarily heading towards chaos, even if its entropy is rising over time. In this sense, the Big Bang, 13.8 billion years ago, presumably corresponded to a state of minimal entropy. The entropy of the universe increases in a spontaneous process and remains unchanged in an equilibrium process; it can be created but not destroyed. To this end, disorder is an unacceptable term with which to introduce entropy. Various criticisms [7] [8] [9] have been reported in the literature; notably, F. L. Lambert [7] was likely the first to point out the fallibility of “disorder” and how misleading it is for the interpretation of entropy. In more appropriate terms, the entropy change is better taught as a dispersal of energy among the microscopic states of a system at a specific absolute temperature T.

2.2. Entropy and Missing Information

Shannon, whose result was later generalized in the Shannon–McMillan–Breiman theorem, found that Boltzmann’s mathematical formula defines a useful quantity in information theory. In the beginning, he was not thoroughly confident about naming this new quantity entropy [10]. Entropy in this sense naturally measures the lack of information about a system. Over time, the information contained in an isolated system can only be destroyed, never created; in exactly the opposite way, entropy can only be generated, never destroyed, so it naturally increases. Entropy can subsequently be perceived as a measure of the missing information. Moreover, and peculiarly, at equilibrium the conserved quantities are the only determined information; for an isolated system, the conserved values are, for example, momentum and electric charge. In effect, for an isolated system, the equilibrium position is the favored position, with maximum entropy, and any shift away from this position would produce a drop of entropy. On the other hand, if the system is not isolated, the equilibrium state is not necessarily consistent with the maximum of its own entropy, even though the entropy of the whole system plus its surroundings is at its maximum. Moreover, according to Boltzmann’s Equation (1), equilibrium is associated with the maximum number of microstate distributions.
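A minimal sketch of this information-theoretic reading, assuming a small, arbitrary number of microstates: the Shannon entropy H = −Σ p ln p is maximal, and equal to ln N (echoing Boltzmann’s Equation (1) up to the factor k), when all microstates are equally likely, i.e., when the missing information is largest, and it drops when one state dominates.

```python
import math

def shannon_entropy(probabilities) -> float:
    """Shannon entropy H = -sum(p ln p), in natural units (nats)."""
    return -sum(p * math.log(p) for p in probabilities if p > 0.0)

n = 8  # assumed number of accessible microstates, chosen only for illustration

uniform = [1.0 / n] * n              # equilibrium-like: every microstate equally likely
peaked = [0.93] + [0.01] * (n - 1)   # constrained: one microstate dominates

print(f"H(uniform) = {shannon_entropy(uniform):.3f} nats, ln(n) = {math.log(n):.3f}")
print(f"H(peaked)  = {shannon_entropy(peaked):.3f} nats")
```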

Likewise, from a quantum physics perspective, information can never be destroyed; it is conserved. This might be perceived as a paradox unless we consider the holographic principle, which allows the data of a three-dimensional (3D) object to be encoded in two dimensions (2D). According to Max Born, the squared modulus of a system’s wavefunction gives the probability of locating a particle. It follows that detecting such a particle at many different positions may be logically admitted, until the particle’s wavefunction collapses. For a particle of mass m moving in a potential V(r), the wavefunction ψ(r, t) is associated with the de Broglie matter wave and is a solution of the time-dependent Schrödinger equation:

$i\dfrac{h}{2\pi}\dfrac{\partial \psi(\mathbf{r},t)}{\partial t} = \left[ -\dfrac{h^{2}}{8\pi^{2}m}\nabla^{2} + V(\mathbf{r}) \right] \psi(\mathbf{r},t)$

It is localized in space and its mathematical expression is:

$\psi(\mathbf{r},t) = \int A(\mathbf{k})\, e^{i(\mathbf{k}\cdot\mathbf{r} - \omega t)}\, d\mathbf{k}$

where k is the wave vector (or wavenumber), h = 6.626 × 10⁻³⁴ J∙s is Planck’s constant, ω is the angular frequency, t is the time and A(k) is the amplitude of the wave.
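Purely as an illustrative numerical sketch (not taken from the original text), the Python snippet below builds such a localized wave packet in one dimension by superposing plane waves with a Gaussian amplitude A(k); the free-particle dispersion ω = ħk²/2m, the electron mass and the packet width are all assumptions made for the example.

```python
import numpy as np

HBAR = 1.054571817e-34  # reduced Planck constant h/2pi, J*s
MASS = 9.109e-31        # assumed particle mass (an electron), kg

def wave_packet(x, t, k0=1.0e10, sigma_k=1.0e9):
    """Evaluate psi(x, t) = integral A(k) exp(i(kx - wt)) dk numerically in 1D,
    with a Gaussian amplitude A(k) centred on k0 and free-particle dispersion."""
    k = np.linspace(k0 - 6 * sigma_k, k0 + 6 * sigma_k, 2001)
    a_k = np.exp(-((k - k0) ** 2) / (2 * sigma_k ** 2))  # Gaussian amplitude A(k)
    omega = HBAR * k ** 2 / (2 * MASS)                   # assumed dispersion w(k)
    integrand = a_k * np.exp(1j * (np.outer(x, k) - omega * t))
    return integrand.sum(axis=1) * (k[1] - k[0])         # simple Riemann sum over k

x = np.linspace(-5e-8, 5e-8, 1001)
prob = np.abs(wave_packet(x, t=0.0)) ** 2  # Born rule: |psi|^2
print("peak of |psi|^2 located at x =", x[np.argmax(prob)], "m")  # localized near x = 0
```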

The holographic principle in this way most likely reconciles Newtonian determinism with the skepticism derived from Heisenberg’s uncertainty principle, since many copies of the same 3D object can be seen at many different locations at the same time in its 2D form.

Historically, Laplace’s demon [11] states that, by knowing only the position and the velocity of an object at any given time, we may be able to determine its evolution backwards into the past and forwards into the future. From a macroscopic viewpoint, the state of a gas is determined by measuring its pressure, volume and temperature. To this macro-state corresponds a set of different microstates, or arrangements. So, releasing a gas into a larger volume, for example, logically involves an increase in the number of particle arrangements, i.e., a higher number of microstates, synonymous with a rise of entropy, as the worked sketch below illustrates. Visually, though, we may notice an increase of order.
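As a hedged worked example of this argument, the sketch below counts how the microstates grow when an ideal gas expands freely: each particle can occupy V₂/V₁ times more space, so N grows by (V₂/V₁)ⁿ and Equation (1) gives ΔS = k n ln(V₂/V₁). The mole of gas and the doubling of volume are illustrative assumptions.

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number

def free_expansion_entropy(n_particles: float, volume_ratio: float) -> float:
    """Entropy change of an ideal gas whose volume grows by volume_ratio.

    Each particle can occupy volume_ratio times more space, so the microstate
    count N grows by volume_ratio**n_particles; Equation (1) then gives
    Delta S = k * n_particles * ln(volume_ratio).
    """
    return K_B * n_particles * math.log(volume_ratio)

# Assumed case: one mole of ideal gas doubling its volume (V2/V1 = 2).
print(f"Delta S = {free_expansion_entropy(N_A, 2.0):.2f} J/K")  # ~ R ln 2 = 5.76 J/K
```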

Disorder and order are not countable quantities but rather qualitative ones. Analogously, the expansion of the observable universe and the increase of its entropy are likewise the result of an extra number of microstates. Anisotropic collisions of gas atoms lead to chaotic agitation and hence to a loss of data.

In effect, information can be regarded as a degree of certainty, or the probability that an event occurs. Lower entropy can then be correlated with how confident we are that a process will happen. As a result, a highly predictable transformation is consistent with lower entropy. Following this logic, randomness implies higher entropy, and entropy turns into an indicator of randomness, or of a lack of confident information. It then measures the amount of missing information.
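To illustrate this link between predictability and entropy, here is a minimal sketch with an assumed two-outcome (coin-like) event: its Shannon entropy is largest when both outcomes are equally likely and falls to zero when the outcome is certain.

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy (in bits) of a two-outcome event with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # the outcome is certain: no missing information
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

# The more predictable the outcome, the lower the entropy.
for p in (0.5, 0.9, 0.99, 1.0):
    print(f"p = {p:4.2f} -> H = {binary_entropy(p):.3f} bits")
```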

In this sense, the second law of thermodynamics expresses the natural tendency towards a loss of information in a spontaneous natural transformation.

Additionally, while disorder gives an indication of a lack of pattern or an apparent absence of organization, a precise way to measure it remains incontrovertibly controversial and is unlikely to be modeled by an accurate, value-returning mathematical equation. As a result, combining the classical, quantum and information-theoretic definitions of entropy [12] [13] with the above notions of order and disorder shows that these notions are inconsistent: the definitions of entropy in classical, quantum and information theories are mathematically explicit, accurate and quantitative, whereas order and disorder are qualitative and carry no precise mathematical value.

3. Conclusion

Through this work, we raised the debate about the confusion between quantitative entropy and non-quantitative disorder. We conclude that disorder may not be the most accurate information that can be drawn from studying the evolution of entropy: entropy is not, and cannot be, exclusively disorder, and measuring a degree of disorder through a rise of entropy is therefore doomed. Additionally, we showed that a lack of information and a loss of data describe the increase of entropy more appropriately. This is more relevant for providing a correct interpretation of entropy from a thermodynamic perspective.

Conflicts of Interest

The authors declare no conflicts of interest.

References

[1] Harris, S. (2004) An Introduction to the Theory of the Boltzmann Equation. Courier Corporation.
[2] Gibbs, J.W. (1902) Elementary Principles in Statistical Mechanics: Developed with Special Reference to the Rational Foundation of Thermodynamics. Dover Publications.
[3] Einstein, A. (1956) Investigations on the Theory of the Brownian Movement. Courier Corporation.
[4] Feynman, R.P. (1964) The Brownian Movement. The Feynman Lectures on Physics, 1, 41.
[5] Arnheim, R. (1974) Entropy and Art: An Essay on Disorder and Order. University of California Press, Oakland, CA.
[6] Baranger, M. (2000) Chaos, Complexity, and Entropy. New England Complex Systems Institute, Cambridge.
[7] Lambert, F.L. (1999) Shuffled Cards, Messy Desks, and Disorderly Dorm Rooms-Examples of Entropy Increase? Nonsense! Journal of Chemical Education, 76, 1385.
https://doi.org/10.1021/ed076p1385
[8] Lambert, F.L. (2002) Entropy Is Simple, Qualitatively. Journal of Chemical Education, 79, 1241.
https://doi.org/10.1021/ed079p1241
[9] Lambert, F.L. (2012) The Misinterpretation of Entropy as “Disorder”. Journal of Chemical Education, 89, 310-310.
https://doi.org/10.1021/ed2002708
[10] Shannon, C.E. (1948) A Mathematical Theory of Communication. Bell System Technical Journal, 27, 379-423, 623-656.
[11] Laplace, P.S. (1951) A Philosophical Essay on Probabilities. Translated from the French 6th Edition by F.W. Truscott and F.L. Emory, Dover Publications, New York.
[12] Denbigh, K.G. (1989) Note on Entropy, Disorder and Disorganization. The British Journal for the Philosophy of Science, 40, 323-332.
https://doi.org/10.1093/bjps/40.3.323
[13] Wright, P.G. (1970) Entropy and Disorder. Contemporary Physics, 11, 581-588.
https://doi.org/10.1080/00107517008202196

Copyright © 2024 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.