Terrapin Station wrote: ↑May 16th, 2020, 9:57 am
Not to mention, by the way, that "all energy is information" is nonsense in the first place. Anytime I encounter "information" talk in a context like that my eyes glaze over, because the notion is so ill-defined, including by Claude Shannon, whose work on that is a bunch of gobbledygook.
Shannon was well aware that…
QUOTE>
"The word 'information' has been given many different meanings by various writers in the general field of information theory. It is likely that at least a number of these will prove sufficiently useful in certain applications to deserve further study and permanent recognition. It is hardly to be expected that a single concept of information would satisfactorily account for the numerous possible applications of this general field."
(Shannon, Claude E. "The Lattice Theory of Information." 1950. Reprinted in
Claude Elwood Shannon: Collected Papers, edited by N. J. A. Sloane and Aaron D. Wyner, 180-183. Piscataway, NJ: IEEE Press, 1993. p. 180)
<QUOTE
QUOTE>
"In the twentieth century various proposals for formalization of concepts of information were made:
Qualitative Theories of Information
Semantic Information: Bar-Hillel and Carnap developed a theory of semantic Information (1953). Floridi (2002, 2003, 2011) defines semantic information as well-formed, meaningful and truthful data. Formal entropy based definitions of information (Fisher, Shannon, Quantum, Kolmogorov) work on a more general level and do not necessarily measure information in meaningful truthful datasets, although one might defend the view that in order to be measurable the data must be well-formed (for a discussion see section 6.6 on Logic and Semantic Information). Semantic information is close to our everyday naive notion of information as something that is conveyed by true statements about the world.
Information as a state of an agent: the formal logical treatment of notions like knowledge and belief was initiated by Hintikka (1962, 1973). Dretske (1981) and van Benthem & van Rooij (2003) studied these notions in the context of information theory, cf. van Rooij (2003) on questions and answers, or Parikh & Ramanujam (2003) on general messaging. Also Dunn seems to have this notion in mind when he defines information as “what is left of knowledge when one takes away belief, justification and truth” (Dunn 2001: 423; 2008). Vigo proposed a Structure-Sensitive Theory of Information based on the complexity of concept acquisition by agents (Vigo 2011, 2012).
Quantitative Theories of Information
Nyquist’s function: Nyquist (1924) was probably the first to express the amount of “intelligence” that could be transmitted, given a certain line speed of a telegraph system, in terms of a log function: W = k log m, where W is the speed of transmission, k is a constant, and m is the number of different voltage levels one can choose from.
Fisher information: the amount of information that an observable random variable X carries about an unknown parameter θ upon which the probability of X depends (Fisher 1925).
The Hartley function: (Hartley 1928, Rényi 1961, Vigo 2012). The amount of information we get when we select an element from a finite set S under uniform distribution is the logarithm of the cardinality of that set.
Shannon information: the entropy, H, of a discrete random variable X is a measure of the amount of uncertainty associated with the value of X (Shannon 1948; Shannon & Weaver 1949).
Kolmogorov complexity: the information in a binary string x is the length of the shortest program p that produces x on a reference universal Turing machine U (Turing 1937; Solomonoff 1960, 1964a,b, 1997; Kolmogorov 1965; Chaitin 1969, 1987).
Entropy measures in Physics: Although they are not in all cases strictly measures of information, the different notions of entropy defined in physics are closely related to corresponding concepts of information. We mention Boltzmann Entropy (Boltzmann, 1866) closely related to the Hartley Function (Hartley 1928), Gibbs Entropy (Gibbs 1906) formally equivalent to Shannon entropy and various generalizations like Tsallis Entropy (Tsallis 1988) and Rényi Entropy (Rényi 1961).
Quantum Information: The qubit is a generalization of the classical bit and is described by a quantum state in a two-state quantum-mechanical system, which is formally equivalent to a two-dimensional vector space over the complex numbers (Von Neumann 1932; Redei & Stöltzner 2001)."
Information:
https://plato.stanford.edu/entries/information/
<QUOTE
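To make the quantitative notions in that list a bit more concrete, here is a small Python sketch of my own (not from the SEP entry; the function names and toy examples are just mine) computing the Hartley function and Shannon entropy for a couple of simple distributions. Both come out in bits when the logarithm is taken to base 2, and they coincide exactly when the distribution is uniform.
CODE>
import math

def hartley(n_outcomes: int) -> float:
    # Hartley information of selecting one element from a set of size
    # n_outcomes under a uniform distribution: log2 of the cardinality.
    return math.log2(n_outcomes)

def shannon_entropy(probs: list[float]) -> float:
    # Shannon entropy H(X) = -sum_i p_i * log2(p_i) of a discrete
    # random variable X with the given probabilities.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair eight-sided die: the two measures agree.
print(hartley(8))                   # 3.0 bits
print(shannon_entropy([1/8] * 8))   # 3.0 bits

# A heavily biased coin: the Shannon entropy falls below the Hartley
# value, reflecting the reduced uncertainty about the outcome.
print(hartley(2))                   # 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # about 0.469 bits
<CODE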
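Kolmogorov complexity, by contrast, is not computable in general. A common (and admittedly crude) way to get a feel for it is to use an off-the-shelf compressor as an upper-bound proxy for “the length of the shortest program that reproduces the string”. Again, this is only a sketch of the idea, not a definition:
CODE>
import os
import zlib

def compressed_size(data: bytes) -> int:
    # Length of the zlib-compressed data, used here only as a rough
    # stand-in for the (uncomputable) Kolmogorov complexity K(x).
    return len(zlib.compress(data, 9))

regular = b"ab" * 500          # highly regular: a short "program" suffices
random_ish = os.urandom(1000)  # no structure for the compressor to exploit

print(compressed_size(regular))     # small, far below 1000 bytes
print(compressed_size(random_ish))  # close to (or even above) 1000 bytes
<CODE
The point of both sketches is simply that these are well-defined, measurable quantities, whatever one thinks of the looser metaphysical uses of the word "information".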