Types of Entropies
Boltzmann Entropy
  * Idea: For a given macrostate, it is
    the statistical quantity \(S = k_{\rm B} \ln\Omega\), where \(k_{\rm B}\)
    is the Boltzmann constant (it can be omitted to get a dimensionless entropy),
    and \(\Omega\) the number of microstates compatible with the macrostate;
    See the numerical sketch below.
  * And ignorance: Generally, one views
    entropy as a measure of our ignorance of the microscopic state of a system; This seems
    to make entropy a subjective quantity which, for a given system, depends on how much we
    want to find out, and could be decreased if we just measured something more; This is
    so, but in practice it is not a real problem, because \(S \sim \ln\Omega\), and the
    kind of measurements we could think of to decrease our subjective entropy might lower
    \(\Omega\) by a factor of, say, 100; After taking the log, this becomes a very small
    amount to subtract from the previous entropy, and it makes no real difference; Therefore,
    in practice one does not need to introduce a definition of entropy different from the
    usual one (< RDS, 1.02.1985 meeting).
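  * Numerical sketch: A minimal Python illustration of the definition and of the
    robustness argument above, with \(k_{\rm B}\) set to 1 (dimensionless entropy)
    and an arbitrary illustrative value for \(\ln\Omega\); Since \(\Omega\) itself
    is far too large to store for a macroscopic system, one works with its log:

      import math

      # Boltzmann entropy S = k_B ln(Omega), with k_B = 1 (dimensionless);
      # for a macroscopic system ln(Omega) is of order 10^23, so we store
      # ln(Omega) directly rather than Omega itself (illustrative value).
      ln_Omega = 1.0e23
      S = ln_Omega

      # A measurement lowering Omega by a factor of 100 subtracts only
      # ln(100) ~ 4.6 from S, a negligible fraction of S itself.
      delta_S = math.log(100)
      print(S, delta_S, delta_S / S)   # fractional change ~ 5 * 10^-23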
  @ General references:
    Jaynes AJP(65)may [vs Gibbs entropy];
    Kalogeropoulos MPLB(08)-a0804 [variation with respect to energy].
  @ Examples: Swendsen AJP(06)mar [colloids, clarification];
    Yoshida PRA(20)-a1909 [quantum field systems].
Information Theoretical (Shannon) Entropy
  > s.a. Brudno's Theorem; hamiltonian dynamics;
  information; H Theorem;
  Landauer's Principle.
  $ Def: The Shannon uncertainty; For
    a mixed state \(\rho = \sum_n p_n |n\rangle\langle n|\),
    where \(\{|n\rangle\}\) is a complete set of states,
\(S = -k_{\rm B} \sum_n p_n \ln p_n\);
    It is equivalent to the Boltzmann-Gibbs definition of entropy under equilibrium
    conditions, when it corresponds to N equally probable microscopic states
    (see the numerical sketch below).
  * Remark: Actually, we could use the log
    with any base, or any other convex function \(f(p_n/q_n)\),
    with \(q_n\) the equilibrium probabilities;
    For any such f the entropy would increase; Other properties, like additivity,
    put constraints on f.
  * Remark: With the natural log, one bit of information
    corresponds to an entropy \(\ln 2\).
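  * Numerical sketch: A short Python check of the definition above, with
    \(k_{\rm B} = 1\), showing that for N equally probable states it reduces
    to \(\ln N\), and that two equally probable outcomes (one bit) give \(\ln 2\):

      import math

      def shannon_entropy(p):
          # S = -sum_n p_n ln(p_n), with k_B = 1; zero probabilities contribute 0
          return -sum(q * math.log(q) for q in p if q > 0)

      N = 8
      print(shannon_entropy([1.0 / N] * N), math.log(N))   # uniform: S = ln N
      print(shannon_entropy([0.5, 0.5]), math.log(2))      # one bit: S = ln 2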
  @ General references:
    Fahn FP(96) [and thermodynamics];
    Fa JPA(98) [generalization];
    Chakrabarti & Chakrabarty IJMMS(05)qp [axiomatic];
    Maroney qp/07 [Gibbs-Von Neumann, motivation];
    Ladyman et al SHPMP(08);
    Wilde in(13)-a1106 [quantum Shannon theory];
    Baccetti & Visser JSM(13)-a1212 [states with infinite entropy];
    Weilenmann et al PRL(16)-a1501 [and thermodynamic entropy];
    Anza & Vedral SRep(17)-a1509 [thermodynamical meaning, and thermal equilibrium];
    Carcassi et al EJP-a1912 [characterization].
  @ And correlations: Van Drie mp/00;
    Gu et al JPA(08)qp/06.
  @ Wehrl information entropy: Miranowicz et al JPA(01)qp;
    Piatek & Leonski JPA(01) [entanglement and correlations].
  > Online resources: see
    Wikipedia page. 
Rényi Entropy > s.a. entanglement entropy;
  statistical mechanical systems; XY Chain.
  * Idea: For a mixed
    quantum state described by a density matrix ρ, it is defined as
\(S_n^{\rm R} = k_{\rm B}\, \ln\bigl(\sum_i p_i^{\,n}\bigr) / (1-n)\),
    where the \(p_i\) are the eigenvalues of ρ (probabilities), n is an integer, and
    it is used as a measure of the entanglement of the system with its environment
    (see the numerical sketch below).
  * Special cases: In the limit n → 1 it
    gives the von Neumann entropy, and for n = 2 it measures the mixedness of the
    system, \(S_2^{\rm R} = -k_{\rm B} \ln{\rm tr}\,\rho^2\), minus the log of the purity.
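  * Numerical sketch: A minimal Python illustration of the definition and of the
    two special cases, with \(k_{\rm B} = 1\) and an arbitrary illustrative set of
    eigenvalues of ρ; The n → 1 limit is approached numerically:

      import math

      def renyi_entropy(p, n):
          # S_n^R = ln(sum_i p_i^n) / (1 - n), with k_B = 1; requires n != 1
          return math.log(sum(q**n for q in p)) / (1.0 - n)

      def von_neumann_entropy(p):
          # S = -sum_i p_i ln(p_i), the n -> 1 limit of the Renyi entropy
          return -sum(q * math.log(q) for q in p if q > 0)

      p = [0.5, 0.3, 0.2]                 # illustrative eigenvalues of rho
      print(renyi_entropy(p, 1 + 1e-6))   # approaches the von Neumann value
      print(von_neumann_entropy(p))
      print(renyi_entropy(p, 2))          # n = 2: -ln(purity) = -ln(tr rho^2)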
  @ References:
    Harremoës PhyA(06) [operational];
    Parvan & Biró PLA(10) [Rényi statistics in canonical and microcanonical ensembles];
    Oikonomou PhyA(11) [multinomial coefficients method];
    Baez a1102 [as free energy of Gibbs state];
    Adesso et al PRL(12)-a1203 [and  quantum information for Gaussian states];
    Linden et al PRS(13) [Rényi entropic inequalities];
    Mosonyi & Ogawa CMP(15)-a1309 [operational interpretation];
    Dong nComm(16)-a1601 [quantum field theory, area law];
    Bebiano et al a1706 [as basis for quantum thermodynamics];
    Cresswell a1709 [entanglement timescale];
    Bellomo et al SRep-a1710 [operational interpretation];
    Kendall & Kempf a2004 [and interaction between systems].
  @ Rényi mutual information: Schnitzer a1406 [for widely separated identical compound systems];
    Hayashi & Tomamichel JMP(16)-a1408 [operational interpretation].
  @ For specific systems:
    Romera & Nagy PLA(08) [atoms];
    Swingle et al PRB(13)-a1211 [Fermi gases and liquids];
    Lee et al JHEP(15)-a1407,
    Lashkari PRL(14)-a1404 [conformal field theory];
    Dowker a1512
      [free scalar fields, charged Rényi entropies];
    Sugino & Korepin IJMPB(18)-a1806 [highly-entangled spin chains];
    Olendski EJP(19)-a1811 [and Tsallis entropy, examples];
    > s.a. dirac quantum field theory.
  @ Vs von Neumann entropy: Fannes & Van Ryn JPA(12)-a1205 [for fermions];
    Fannes a1310 [monotonicity of the von Neumann entropy].
  @ Generalizations: Müller-Lennert et al JMP(13)-a1306;
    Nishioka JHEP(14),
    Hama et al JHEP(14)-a1410 [supersymmetric];
    Johnson a1807;
    Mukhamedov QIP(19)-a1905 [on C*-algebras].
Other Entropies > s.a. Kolmogorov-Sinai Entropy;
  non-extensive statistics [Tsallis entropy]; quantum entropy
  and entanglement entropy [or geometric].
  * Relative entropy: The relative entropy
    of a probability distribution p with respect to another probability distribution
    q; For discrete distributions, one definition is the Kullback-Leibler distance
    \(d(p,q) := \sum_i p_i \log_2(p_i/q_i)\) (see the numerical sketch after this list);
    > s.a. MathWorld page;
    Wikipedia page.
  * Metric entropy of families of metric spaces:
    The asymptotic behavior of covering numbers.
  * Shore-Johnson axioms: Consistency conditions
    ensuring that probability distributions inferred from limited data by maximizing the entropy satisfy
    the multiplication rule of probability for independent events; They are satisfied by the Boltzmann-Gibbs
    form of the entropy, but not by non-additive entropies.
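  * Numerical sketch: A short Python illustration of the Kullback-Leibler distance
    above, with two arbitrary discrete distributions; \(d(p,q) \ge 0\), with
    equality iff p = q:

      import math

      def kl_distance(p, q):
          # d(p,q) = sum_i p_i log2(p_i / q_i); assumes q_i > 0 wherever p_i > 0
          return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

      p = [0.5, 0.25, 0.25]
      q = [1.0 / 3, 1.0 / 3, 1.0 / 3]
      print(kl_distance(p, q))   # positive, since p differs from q
      print(kl_distance(p, p))   # zero, d(p,p) = 0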
  @ Proposals: Lubkin IJTP(87) [entropy of measurement];
    Kaniadakis et al PhyA(04) [from deformed log's];
    Petz a1009 [quasi-entropy];
    Polkovnikov AP(11) [diagonal entropy];
    Rastegin JSP(11)-a1106 [unified entropies];
    Biró PhyA(13) [deformed entropy formulas for systems with finite reservoirs];
    Kalogeropoulos AMP-a1705 [\(\delta\)-entropy];
    Portesi et al EPJst(18)-a1802 [generalized (h, φ)-entropies].
  @ Classical mechanics: Brun & Hartle PRE(99)qp/98 [histories];
    McLachlan & Ryland JMP(03)mp/02 [algebraic].
  @ Relative entropy: Narnhofer & Thirring Fiz(87);
    Baez & Fritz TAC-a1402 [Bayesian characterization];
    Anshu et al IEEE(16)-a1404 [operational interpretation];
    Lashkari PRL(14)-a1404 [in conformal field theory];
    Rajagopal et al PRA(14) [Tsallis, and conditional entropy];
    Leditzky a1611-PhD [and quantum information theory];
    Kalogeropoulos PhyA-a1905 [relative q-entropy];
    Dutta & Guha a1908
      [Tsallis, modification for classical information theory];
    > s.a. entropy bound; entropy in quantum theory;
      modified thermodynamics; non-equilibrium systems.
  @ Dynamical entropy: Connes et al CMP(87)*;
    Hudetz LMP(88);
    Hudetz JMP(94) [topological entropy];
    Benatti et al JPA(04) [and discrete chaos];
    Segre a1004 [and deformation quantization].
  @ Algebraic entropy:
    Bellon & Viallet CMP(99) [discrete-time systems];
    Viallet IJGMP(08).
  @ Localization entropy: Schroer ht/01 [and area law].
  @ Related topics:
    Addison & Gray JPA(01) [extensivity];
    Pérez-Madrid PhyA(04) [Gibbs entropy and irreversibility];
    Edwards JSP(04) [granular or glassy systems];
    Souza & Tsallis PhyA(04) [concavity and stability];
    Malik & Lopez-Mobilia a2004 [based on the level of irreversibility of a process];
    Mollabashi et al a2011 [pseudo-entropy];
    > s.a. Coarse Graining;
      modified thermodynamics [relativistic]; Topos Theory.