Abstract: We derive a generalization of the Second Law of Thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian Second Law can be written as $\Delta H(\rho_m, \rho) + \langle \mathcal{Q}\rangle_{F|m}\geq 0$, where $\Delta H(\rho_m, \rho)$ is the change in the cross entropy between the original phase-space probability distribution $\rho$ and the measurement-updated distribution $\rho_m$, and $\langle \mathcal{Q}\rangle_{F|m}$ is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the Second Law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of the Jarzynski equality. We demonstrate the formalism using simple analytical and numerical examples.
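As a toy illustration of the ingredients named in the abstract, the sketch below computes the cross entropy $H(\rho_m, \rho) = -\sum_i \rho_{m,i} \log \rho_i$ and a Bayesian measurement update for a two-state system evolving under a stochastic map. Every number here (the transition matrix, the measurement likelihood) is invented for illustration; this is not the paper's formalism, and the generalized heat term $\langle \mathcal{Q}\rangle_{F|m}$ is not computed.

```python
import numpy as np

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_i p_i log q_i, in nats."""
    return -np.sum(p * np.log(q))

# Toy two-state system (all numbers are invented for illustration).
rho0 = np.array([0.5, 0.5])        # original initial distribution rho

# Column-stochastic transition matrix for one step of stochastic evolution.
T = np.array([[0.9, 0.2],
              [0.1, 0.8]])
rho_f = T @ rho0                   # evolved distribution at the final time

# Measurement at the final time with outcome m: likelihood P(m | state).
L = np.array([0.8, 0.3])

# Bayesian updates on outcome m, at the final and initial times.
rho_m_f = L * rho_f
p_m = rho_m_f.sum()                # total probability of outcome m
rho_m_f /= p_m
rho_m_0 = rho0 * (L @ T)           # propagate the likelihood backward
rho_m_0 /= rho_m_0.sum()

# Change in cross entropy between updated and original distributions.
dH = cross_entropy(rho_m_f, rho_f) - cross_entropy(rho_m_0, rho0)
print(f"Delta H(rho_m, rho) = {dH:+.4f} nats")
# The Bayesian Second Law states dH + <Q>_{F|m} >= 0: the generalized
# heat term (not computed in this sketch) bounds any decrease in dH.
```

Note that $\Delta H$ alone can be negative here; the inequality in the abstract holds only once the expected heat flow is added, which is precisely the point of the Bayesian formulation.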