Types of Entropies

**Boltzmann Entropy**

* __Idea__: For a given macrostate, it is
the statistical quantity *S* = *k*_{B} ln Ω,
where *k*_{B} is the Boltzmann constant
(it can be omitted to get a dimensionless entropy), and Ω is the number of microstates
compatible with the macrostate.
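A minimal numerical illustration (not from the text, with hypothetical numbers): for *N* coin flips, take the macrostate to be "*k* heads"; the number of compatible microstates is the binomial coefficient Ω = C(*N*, *k*), and the dimensionless Boltzmann entropy is *S* = ln Ω (with *k*_{B} omitted, as noted above).

```python
from math import comb, log

def boltzmann_entropy(N, k):
    """Dimensionless S = ln Omega for the macrostate 'k heads out of N'."""
    omega = comb(N, k)   # number of microstates compatible with the macrostate
    return log(omega)

# The entropy is largest for the most probable macrostate, k = N/2:
S_half = boltzmann_entropy(100, 50)
S_edge = boltzmann_entropy(100, 10)
assert S_half > S_edge
```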

* __And ignorance__: Generally, one views
entropy as a measure of our ignorance of the microscopic state of a system; This seems
to make entropy a subjective quantity which, for a given system, depends on how much we
want to find out, and could be decreased if we just measured something more; This is
so, but in practice it is not a real problem, because *S* ~ ln Ω, and the
kind of measurements we could think of to decrease our subjective entropy might lower
Ω by a factor of, say, 100; After taking the log, this becomes a very small amount
to subtract from the previous entropy, and makes no real difference; Therefore, in
practice one does not need to introduce a definition of entropy different from the usual
one (< RDS, 1.02.1985 meeting).
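The arithmetic behind this remark can be sketched with hypothetical but typical numbers: a macroscopic system has ln Ω of order 10^{23}, while a measurement lowering Ω by a factor of 100 subtracts only ln 100.

```python
from math import log

# S = ln Omega for a macroscopic system, in units of k_B (order-of-magnitude value)
S_before = 1e23
# entropy decrease from a measurement that lowers Omega by a factor of 100
delta_S = log(100)
relative_change = delta_S / S_before
print(relative_change)   # ~ 4.6e-23: a completely negligible fraction
```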

@ __General references__:
Jaynes AJP(65)may [vs Gibbs entropy];
Kalogeropoulos MPLB(08)-a0804 [variation with respect to energy].

@ __Examples__: Swendsen AJP(06)mar [colloids, clarification];
Yoshida PRA(20)-a1909 [quantum field systems].

**Information Theoretical (Shannon) Entropy**
> s.a. Brudno's Theorem; hamiltonian dynamics;
information; *H* Theorem;
Landauer's Principle.

$ __Def__: The Shannon uncertainty; For
a mixed state *ρ* = ∑_{n} *p*_{n} \(|n\rangle\langle n|\),
where {\(|n\rangle\)} is a complete set of states,

*S* = −*k*_{B} ∑_{n} *p*_{n} ln *p*_{n} ;

It is equivalent to the Boltzmann-Gibbs definition of entropy under equilibrium
conditions, when it corresponds to *N* equally probable microscopic states.

* __Remark__: Actually, we could use the log
with any base, or any other convex function
*f*(*p*_{n}/*q*_{n}),
with *q*_{n} the equilibrium probabilities;
For any such *f* the entropy would increase; Other properties like additivity
put constraints on *f*.

* __Remark__: Then one bit of information
corresponds to an entropy of ln 2 (in units of *k*_{B}).
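A short sketch (the function name is an assumption, not from the text) computing *S* = −∑_{n} *p*_{n} ln *p*_{n} with *k*_{B} = 1, checking two statements above: a uniform distribution over *N* equally probable states gives *S* = ln *N*, matching the Boltzmann-Gibbs value at equilibrium, and one bit corresponds to ln 2.

```python
from math import log

def shannon_entropy(probs):
    """Dimensionless Shannon entropy -sum p ln p (terms with p = 0 drop out)."""
    return -sum(p * log(p) for p in probs if p > 0)

N = 8
uniform = [1.0 / N] * N
assert abs(shannon_entropy(uniform) - log(N)) < 1e-12   # S = ln N

one_bit = [0.5, 0.5]
assert abs(shannon_entropy(one_bit) - log(2)) < 1e-12   # one bit = ln 2
```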

@ __General references__:
Fahn FP(96) [and thermodynamics];
Fa JPA(98) [generalization];
Chakrabarti & Chakrabarty IJMMS(05)qp [axiomatic];
Maroney qp/07 [Gibbs-Von Neumann, motivation];
Ladyman et al SHPMP(08);
Wilde in(13)-a1106 [quantum Shannon theory];
Baccetti & Visser JSM(13)-a1212 [states with infinite entropy];
Weilenmann et al PRL(16)-a1501 [and thermodynamic entropy];
Anza & Vedral SRep(17)-a1509 [thermodynamical meaning, and thermal equilibrium];
Carcassi et al EJP-a1912 [characterization].

@ __And correlations__: Van Drie mp/00;
Gu et al JPA(08)qp/06.

@ __Wehrl information entropy__: Miranowicz et al JPA(01)qp;
Piatek & Leonski JPA(01) [entanglement and correlations].

> __Online resources__: see
Wikipedia page.

**Rényi Entropy** > s.a. entanglement entropy;
statistical mechanical systems; XY Chain.

* __Idea__: For a mixed
quantum state described by a density matrix *ρ*, it is defined as
*S*_{n}^{R} = *k*_{B} (ln ∑_{i} *p*_{i}^{n}) / (1−*n*),
where the *p*_{i} are the eigenvalues of *ρ* (probabilities) and *n* is a parameter,
often taken to be an integer; It is used as a measure of the entanglement
of the system with its environment.

* __Special cases__: In the limit *n* → 1 it
gives the von Neumann entropy, and for *n* = 2 it measures the mixedness of the system.

@ __References__:
Harremoës PhyA(06) [operational];
Parvan & Biró PLA(10) [Rényi statistics in canonical and microcanonical ensembles];
Oikonomou PhyA(11) [multinomial coefficients method];
Baez a1102 [as free energy of Gibbs state];
Adesso et al PRL(12)-a1203 [and quantum information for Gaussian states];
Linden et al PRS(13) [Rényi entropic inequalities];
Mosonyi & Ogawa CMP(15)-a1309 [operational interpretation];
Dong nComm(16)-a1601 [quantum field theory, area law];
Bebiano et al a1706 [as basis for quantum thermodynamics];
Cresswell a1709 [entanglement timescale];
Bellomo et al SRep-a1710 [operational interpretation];
Kendall & Kempf a2004 [and interaction between systems].

@ __Rényi mutual information__: Schnitzer a1406 [for widely separated identical compound systems];
Hayashi & Tomamichel JMP(16)-a1408 [operational interpretation].

@ __For specific systems__:
Romera & Nagy PLA(08) [atoms];
Swingle et al PRB(13)-a1211 [Fermi gases and liquids];
Lee et al JHEP(15)-a1407,
Lashkari PRL(14)-a1404 [conformal field theory];
Dowker a1512
[free scalar fields, charged Rényi entropies];
Sugino & Korepin IJMPB(18)-a1806 [highly-entangled spin chains];
Olendski EJP(19)-a1811 [and Tsallis entropy, examples];
> s.a. dirac quantum field theory.

@ __Vs von Neumann entropy__: Fannes & Van Ryn JPA(12)-a1205 [for fermions];
Fannes a1310 [monotonicity of the von Neumann entropy].

@ __Generalizations__: Müller-Lennert et al JMP(13)-a1306;
Nishioka JHEP(14),
Hama et al JHEP(14)-a1410 [supersymmetric];
Johnson a1807;
Mukhamedov QIP(19)-a1905 [on C*-algebras].

**Other Entropies** > s.a. Kolmogorov-Sinai Entropy;
non-extensive statistics [Tsallis entropy]; quantum entropy
and entanglement entropy [or geometric].

* __Relative entropy__: The relative entropy
of a probability distribution *p* with respect to another probability distribution
*q*; For discrete distributions, one definition is the Kullback-Leibler divergence
(often called a distance, although it is not symmetric in *p* and *q*),
\(d(p,q)\):= ∑_{i} \(p_i \log_2(p_i/q_i)\);
> s.a. MathWorld page;
Wikipedia page.
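A minimal sketch of the Kullback-Leibler definition above, using log base 2 as in the formula (the function name is an assumption):

```python
from math import log2

def kl_divergence(p, q):
    """d(p, q) = sum_i p_i log2(p_i / q_i), in bits; terms with p_i = 0 drop out."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
assert kl_divergence(p, p) == 0                     # vanishes when p = q
assert kl_divergence(p, q) != kl_divergence(q, p)   # not symmetric
```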

* __Metric entropy of families of metric spaces__:
The asymptotic behavior of covering numbers.

* __Shore-Johnson axioms__: Consistency conditions
ensuring that probability distributions inferred from limited data by maximizing the entropy satisfy
the multiplication rule of probability for independent events; They are satisfied by the Boltzmann-Gibbs
form of the entropy, but not by non-additive entropies.

@ __Proposals__: Lubkin IJTP(87) [entropy of measurement];
Kaniadakis et al PhyA(04) [from deformed log's];
Petz a1009 [quasi-entropy];
Polkovnikov AP(11) [diagonal entropy];
Rastegin JSP(11)-a1106 [unified entropies];
Biró PhyA(13) [deformed entropy formulas for systems with finite reservoirs];
Kalogeropoulos AMP-a1705 [\(\delta\)-entropy];
Portesi et al EPJst(18)-a1802 [generalized (*h*, *φ*)-entropies].

@ __Classical mechanics__: Brun & Hartle PRE(99)qp/98 [histories];
McLachlan & Ryland JMP(03)mp/02 [algebraic].

@ __Relative entropy__: Narnhofer & Thirring Fiz(87);
Baez & Fritz TAC-a1402 [Bayesian characterization];
Anshu et al IEEE(16)-a1404 [operational interpretation];
Lashkari PRL(14)-a1404 [in conformal field theory];
Rajagopal et al PRA(14) [Tsallis, and conditional entropy];
Leditzky a1611-PhD [and quantum information theory];
Kalogeropoulos PhyA-a1905 [relative *q*-entropy];
Dutta & Guha a1908
[Tsallis, modification for classical information theory];
> s.a. entropy bound; entropy in quantum theory;
modified thermodynamics; non-equilibrium systems.

@ __Dynamical entropy__: Connes et al CMP(87)*;
Hudetz LMP(88);
Hudetz JMP(94) [topological entropy];
Benatti et al JPA(04) [and discrete chaos];
Segre a1004 [and deformation quantization].

@ __Algebraic entropy__:
Bellon & Viallet CMP(99) [discrete-time systems];
Viallet IJGMP(08).

@ __Localization entropy__: Schroer ht/01 [and area law].

@ __Related topics__:
Addison & Gray JPA(01) [extensivity];
Pérez-Madrid PhyA(04) [Gibbs entropy and irreversibility];
Edwards JSP(04) [granular or glassy systems];
Souza & Tsallis PhyA(04) [concavity and stability];
Malik & Lopez-Mobilia a2004 [based on the level of irreversibility of a process];
Mollabashi et al a2011 [pseudo-entropy];
> s.a. Coarse Graining;
modified thermodynamics [relativistic]; Topos Theory.


send feedback and suggestions to bombelli at olemiss.edu – modified 8 feb 2021