Analytic information theory studies problems of information theory using analytic techniques from combinatorics, algorithmics, and discrete mathematics. Following Hadamard's precept, we tackle these problems with complex-analysis methods such as generating functions, Rice's formula, the Mellin transform, Fourier series, ``Gleichverteilung'' (i.e., uniform distribution mod 1), the saddle point method, analytic poissonization and depoissonization, and singularity analysis. As Andrew Odlyzko observed: ``analytic methods are extremely powerful and when they apply, they often yield estimates of unparalleled precision.'' In this talk we focus on one facet of information theory, namely source coding (better known as data compression) and, more specifically, the redundancy rate problem. This problem is an ideal candidate for analytic information theory since it investigates the second-order asymptotics of code performance. We survey some techniques used to derive our recent analytic results on: (i) the maximal minimax redundancy for memoryless sources, Markov sources, and renewal processes; and (ii) the relationship between the average and the maximal minimax redundancy. We concentrate on the latter problem.
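For context (this sketch is illustrative and not part of the abstract): the maximal minimax redundancy of a class of sources is commonly expressed through Shtarkov's sum, and for the binary memoryless class the analytic methods above yield the leading asymptotics R*_n = (1/2) log2(n pi / 2) + O(1/sqrt(n)). The snippet below, assuming this standard setup, evaluates the Shtarkov sum exactly and compares it with that first-order term.

```python
import math

def maximal_minimax_redundancy_bits(n: int) -> float:
    """Maximal minimax redundancy (in bits) of the binary memoryless
    class of sources over n symbols, via Shtarkov's sum:

        R*_n = log2 sum_{k=0}^{n} C(n,k) (k/n)^k ((n-k)/n)^(n-k)
    """
    # Python evaluates 0.0 ** 0 as 1.0, which matches the k = 0
    # and k = n boundary terms of the sum.
    s = sum(math.comb(n, k) * (k / n) ** k * ((n - k) / n) ** (n - k)
            for k in range(n + 1))
    return math.log2(s)

# First-order asymptotics; the full expansion (with constant and
# vanishing terms) is what the analytic machinery delivers.
n = 1000
exact = maximal_minimax_redundancy_bits(n)
approx = 0.5 * math.log2(n * math.pi / 2)
print(f"exact = {exact:.4f} bits, first-order asymptotic = {approx:.4f} bits")
```

Even at n = 1000 the exact value and the leading term differ only in the second decimal place, which is why second-order (constant and vanishing) terms are the interesting part of the redundancy rate problem.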