Dr. Liu's thesis, " Information Theory From a Functional Viewpoint" was supervised by Professors Sergio Verdú and Paul Cuff. The abstract reads as follows:
A perennial theme of information theory is to find new methods to determine the fundamental limits of various communication systems. Traditional methods have focused on the notion of "sets": the method of types concerns the cardinality of subsets of the typical sets; the blowing-up lemma bounds the probability of the neighborhood of decoding sets; the information-spectrum approach uses the likelihood threshold to define sets. This thesis promotes the idea of deriving the fundamental limits using functional inequalities, where the central notion is "functions" instead of "sets". Canonically, a functional inequality follows from the entropic definition of an information measure by convex duality. For example, the Gibbs variational formula follows from the Legendre transform of the relative entropy. The first part of the thesis proposes a new methodology for deriving converse (impossibility) bounds based on more sophisticated versions of such duality and the reverse hypercontractivity of Markov semigroups. This methodology is broadly applicable to source and channel networks and common randomness generation, and in particular resolves the optimal scaling of the second-order rate for "side-information problems," which was previously open. As a second example, the Legendre transform of the so-called $E_{\gamma}$ metric gives rise to a functional inequality which is useful for change-of-measure arguments in non-asymptotic information theory. Thus, by analyzing $E_{\gamma}$-resolvability we obtain non-asymptotic bounds for source coding, the wiretap channel, and mutual covering. Along the way, we derive general convex duality results allowing us to provide a unified treatment of many inequalities and information measures, such as the Brascamp-Lieb inequality and its reverse, strong data processing inequalities, hypercontractivity and its reverse, transportation-cost inequalities, and Rényi divergences. Capitalizing on such dualities, we show how information-theoretic methods are better tools for proving certain properties of functional inequalities, such as Gaussian optimality: the antithesis of the functional approach to information theory!
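For reference, the two dualities named in the abstract can be stated concretely (in our notation, not quoted from the thesis). The Gibbs variational formula expresses the log moment-generating functional as the Legendre transform of relative entropy: for any bounded measurable $f$,
$$\log \mathbb{E}_P\!\left[e^{f}\right] \;=\; \sup_{Q \ll P}\Big\{ \mathbb{E}_Q[f] - D(Q\|P) \Big\},$$
where the supremum is over probability measures $Q$ absolutely continuous with respect to $P$. The $E_{\gamma}$ metric is the standard generalization of total variation: for $\gamma \ge 1$,
$$E_{\gamma}(P\|Q) \;=\; \sup_{A}\big[\, P(A) - \gamma\, Q(A) \,\big],$$
with the supremum over measurable sets $A$; taking $\gamma = 1$ recovers the total variation distance.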
Jingbo Liu received the B.E. degree from Tsinghua University, Beijing, China, in 2012, and the M.A. and Ph.D. degrees from Princeton University, Princeton, NJ, USA, in 2014 and 2018, respectively, all in electrical engineering. His research interests include signal processing, information theory, coding theory, high-dimensional statistics, and related fields. He received the best undergraduate thesis award at Tsinghua University (2012). He gave a semi-plenary presentation at the 2015 IEEE International Symposium on Information Theory, Hong Kong, China. He was a recipient of the Princeton University Wallace Memorial Honorific Fellowship in 2016.