Editorial

Diversity and Entropy

Shu-Kun Lin
Molecular Diversity Preservation International (MDPI), Matthaeusstrasse 11, CH-4057 Basel, Switzerland
Submission received: 28 January 1999 / Accepted: 5 February 1999 / Published: 11 February 1999
(This article belongs to the Special Issue Gibbs Paradox and Its Resolutions)
Entropy has been launched as a scientific journal to provide an advanced forum for the community of entropy and information researchers.
There are many types of entropy reported in the scientific literature [1]. The great diversity in concept and definition can cause tremendous problems. My own humble suggestion regarding the two main kinds of entropy is the following: 1. Any information-theoretic entropy (Shannon's entropy [2], H) should be defined in such a way that its relation to information is clear. 2. Any theory of thermodynamic entropy (classical entropy, S, the entropy of Clausius, Gibbs, Boltzmann and Planck) should conform to the second law of thermodynamics.
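For readers approaching these two quantities for the first time, their standard textbook forms may serve as a point of reference (given here only as an illustration):

H = -\sum_i p_i \log_2 p_i  (Shannon's information-theoretic entropy of a discrete probability distribution {p_i}, measured in bits),

S = k_B \ln W  (Boltzmann's thermodynamic entropy, where W is the number of accessible microstates and k_B is Boltzmann's constant).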
For information-theoretic entropy, if one uses entropy and information interchangeably, as has often happened even among some physicists [3], then for any well-defined system and process no meaningful intellectual discussion is possible [3].
A famous thermodynamic entropy theory is Dr. Ilya Prigogine's dissipative structure theory. It has been presented by most of my respected teachers as unbelievably important, beautiful and useful. Therefore, 20 years ago, as a young student of chemistry, I wanted to understand Prigogine's theory, and I studied all kinds of related mathematics and physics, including several graduate courses in physics, to prepare myself. Now, after more than 20 years (first 10 years of theoretical investigation, then several years of diverse experimental practice in chemistry laboratories), I have a clear opinion regarding this entropy theory. Its main problem is that it does not conform to the second law of thermodynamics [4]. It is therefore no surprise that an honest chemist (or, for that matter, any other educated chemist, physicist, biologist, etc.) will tell you that he has never found an application of this entropy theory in chemistry (or in biology, physics, engineering, ...) [5].
However, the messy and confusing situation surrounding entropy-related studies also presents opportunities: clearly there are still many very interesting studies to pursue. One immediate task, for instance, is to investigate whether information-theoretic entropy and thermodynamic entropy are compatible, i.e., whether both must satisfy a definition in which information is related to entropy loss, what relation information-theoretic entropy has to the second law of thermodynamics, and whether there is any correlation between information-theoretic entropy and thermodynamic entropy. To make life easier for students of younger generations, entropy-related concepts need to be clarified and well defined, and the relations of entropy to many other concepts need to be studied. These reasons alone would justify our launching of this new journal, Entropy.
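One standard point of contact between the two quantities, noted here only as an illustration of the kind of correlation in question, is that the Gibbs form of the thermodynamic entropy,

S = -k_B \sum_i p_i \ln p_i,

has the same functional form as Shannon's H; when H is measured in bits, the two differ by the constant factor k_B \ln 2, so that one bit corresponds to roughly 9.57 \times 10^{-24} J/K of thermodynamic entropy. Whether this formal correspondence reflects a deeper physical identity is exactly the kind of question the journal hopes to see examined.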
Nevertheless, diversity is good in another sense: the very diverse areas involved in the adoption and application of the entropy concept, and of the various good theories of entropy, will certainly generate a very active scientific forum. Our journal Entropy will strongly support this forum, publishing high-quality papers from areas as diverse as physics, chemistry, biology, economics and philosophy.

Acknowledgments

I am very grateful to Dr. Georgi P. Gladyshev (http://endeav.org; [email protected]) for his numerous useful comments. The author would also like to thank Dr. Thomas D. Schneider (http://www-lecb.ncifcrf.gov/~toms/; [email protected]) for English corrections.

References and Notes

  1. Hillman, C. Entropy on the World Wide Web. http://www.mdpi.org/entropy/entropyweb/. Denbigh, K.G. Brit. J. Phil. Sci. 1989, 40, 323–332 (http://www.endeav.org/evolut/text/denbig1/denbig1e.htm). Lowe, J.P. Entropy: conceptual disorder. J. Chem. Educ. 1988, 65, 403–406. Lambert, F.L. The second law of thermodynamics. http://www.secondlaw.com/index.html.
  2. (a) Shannon, C.E. A mathematical theory of communication. Bell Sys. Tech. J. 1948, 27, 379–423; 623–656. (b) Claude E. Shannon's classic 1948 paper [2a] is now available electronically: Shannon, C.E. A Mathematical Theory of Communication. http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html.
  3. Lin, S.-K. Understanding structural stability and process spontaneity based on the rejection of the Gibbs paradox of entropy of mixing. J. Mol. Struc. (Theochem) 1997, 398, 145–153. Lin, S.-K. Gibbs paradox of entropy of mixing: Experimental facts, its rejection, and the theoretical consequences. J. Theoret. Chem. 1996, 1, 135–150 (available in pdf format at http://www.mdpi.org/lin/lin-rpu.htm). Lin, S.-K. Molecular diversity assessment: Logarithmic relations of information and species diversity and logarithmic relations of entropy and indistinguishability after rejection of Gibbs paradox of entropy of mixing. Molecules 1996, 1, 57–67 (available in pdf format at http://www.mdpi.org/lin/lin-rpu.htm). Lin, S.-K. Correlation of entropy with similarity and symmetry. J. Chem. Inf. Comp. Sci. 1996, 36, 367–376.
  4. (a) Based on his observation of the so-called "order out of chaos" examples, Prigogine himself also questioned the validity of the second law of thermodynamics; see Prigogine's comments in the paragraph regarding the second law of thermodynamics in his Nobel lecture [4b]. (b) Prigogine, I. Time, structure, and fluctuations. Science 1978, 201, 777–785. (c) For an early critique, see: Lin, S.-K. Time symmetry and thermodynamics. Comp. Math. Applic. Int'l. J. 1991, 22(12), 67–76.
  5. However, in keeping with our editorial policy, contributions on Prigogine's dissipative structure theory are still welcome.
