From Source Coding to Isomorphism: A Constructive Proof of Ornstein’s Theorem
Abstract
In ergodic theory, the Isomorphism Theorem of Donald Ornstein stands as a landmark result, establishing that entropy rate is a complete invariant for the class of Bernoulli schemes. This theorem provides a profound structural counterpart to Shannon's Source Coding Theorem from information theory; while both find their answer in the entropy rate, one governs the limits of data compression and the other determines when two processes are abstractly the same. This thesis presents a self-contained, pedagogical proof of Ornstein's theorem by developing the elegant and powerful constructive method of Michael Keane and Meir Smorodinsky. Their approach yields a result stronger than mere existence: it produces a \emph{finitary} isomorphism---a map determined by finite, data-dependent windows---which makes the connection to the algorithmic nature of information theory explicit. The work is structured to build this argument from first principles, beginning with the necessary preliminaries from measure and information theory, followed by a development of Shannon's theorem to provide context and intuition. The core of the thesis is the detailed presentation of the Keane-Smorodinsky proof, from the initial entropy-matching reduction to the construction of the isomorphism through the hierarchical framework of skeletons, fillers, and societies. By presenting these two pillars of information theory and ergodic theory in parallel, this thesis aims to illuminate the deep unity between the statistical limits of communication and the measure-theoretic classification of dynamical systems.
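For reference, the central statement summarized above can be written compactly. The notation below is not fixed by the abstract itself: we write $B(p)$ for the Bernoulli scheme with probability vector $p$ over a finite alphabet, and $\cong$ for measure-theoretic isomorphism.

```latex
% Sketch of the statement, under the notational assumptions above.
% Entropy (rate) of a Bernoulli scheme B(p):
\[
  H(p) = -\sum_{i} p_i \log p_i .
\]
% Ornstein's theorem: entropy is a complete isomorphism invariant
% for Bernoulli schemes.
\[
  B(p) \cong B(q) \iff H(p) = H(q).
\]
```

The Keane-Smorodinsky refinement strengthens the forward direction: when the entropies agree, the isomorphism can be chosen finitary, i.e., determined almost everywhere by finite windows of the sample sequence.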