Uncertainty in entropy
DOI: https://doi.org/10.53727/rbhc.v12i1.47
Keywords: Communication, entropy, Shannon, information
Abstract
Claude E. Shannon, in his seminal 1948 paper, founded Information Theory, the precursor of the information age, whose impacts and ramifications have steadily grown ever since. Among the concepts of Shannon’s Mathematical Theory of Communication, entropy is the one with the most divergent interpretations. Our goal in this article is therefore to develop some historical considerations on the concept of entropy, without any intention of exhausting the subject, working with the “uncertainty” in its definition. The pun is deliberate, since we argue that scientific and technological innovations in Information Theory stem from the interplay between the broad scope and the precise description of the concept of entropy.
License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.