**Abstract**

We study three distinct and important problems at the intersection of information theory and estimation theory.

The first problem we tackle is known in the literature as the amplitude-constrained Gaussian channel. It is well known that for a peak-power-constrained Gaussian channel the capacity-achieving input distribution is discrete with finitely many mass points. However, due to an unfortunate shortcoming of the prior proof technique, a bound on the number of mass points in the capacity-achieving input distribution was previously inaccessible. Here, we provide an alternative proof of the finiteness of the number of mass points of the capacity-achieving input distribution, while producing the first firm upper bound on the number of mass points. We also generalize this novel proof technique to multi-dimensional settings, as well as to the amplitude-constrained Gaussian mean estimation problem.

The second problem we resolve lies in the realm of channel resolvability and error exponents. Using simple but non-trivial techniques, we establish the exact exponents for the soft-covering phenomenon of a memoryless channel under the total variation metric when random i.i.d. and random constant-composition channel codes are used. Moreover, we provide alternative representations of these exponents in terms of the alpha-mutual information, relating these two seemingly unrelated mathematical concepts in a very pleasing manner.

Lastly, we turn our attention to universal lossless compression. We characterize the redundancy for universal lossless compression of discrete memoryless sources in Campbell's setting as a minimax Rényi divergence, which we show to be equal to the maximal alpha-mutual information via a generalized redundancy-capacity theorem. We place particular emphasis on the analysis of the asymptotics of the minimax Rényi divergence, which we determine up to a term vanishing in the blocklength, thereby building a bridge between the asymptotics of minimax regret and minimax redundancy.