Interest in the informational content of truncation motivates the study of the residual entropy function, that is, the entropy of a right-truncated random variable regarded as a function of the truncation point. In this note we show that, under mild regularity conditions, the residual entropy function characterizes the probability distribution. We also derive relationships among residual entropy, monotonicity of the failure rate, and stochastic dominance. Information-theoretic measures of distance between distributions are also revisited from a similar perspective. In particular, we study the residual divergence between two positive random variables and investigate some of its monotonicity properties. The results are relevant to information theory, reliability theory, search problems, and experimental design.
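To make the central object concrete, the following is an illustrative sketch (not taken from the paper) of the residual entropy function for an exponential distribution. It assumes the standard definition for a right-truncated variable: the truncated density is f(x)/F(t) on (0, t), so the entropy is H(t) = log F(t) - (1/F(t)) * ∫₀ᵗ f(x) log f(x) dx, evaluated numerically here; the function name and numerical scheme are choices made for this example.

```python
import math

def residual_entropy_exponential(t, lam=1.0, n=20000):
    """Entropy of an Exp(lam) variable right-truncated at t (illustrative sketch).

    The right-truncated density is f(x)/F(t) on (0, t), giving
        H(t) = log F(t) - (1/F(t)) * integral_0^t f(x) log f(x) dx,
    evaluated here by the trapezoid rule with n subintervals.
    """
    F_t = 1.0 - math.exp(-lam * t)  # CDF of Exp(lam) at the truncation point
    h = t / n

    def g(x):
        fx = lam * math.exp(-lam * x)  # exponential density (positive on [0, t])
        return fx * math.log(fx)

    # Trapezoid rule for the integral of f log f over (0, t)
    s = 0.5 * (g(0.0) + g(t)) + sum(g(i * h) for i in range(1, n))
    integral = s * h
    return math.log(F_t) - integral / F_t
```

For Exp(1), H(t) increases with the truncation point t and tends to the full entropy, which equals 1; e.g. `residual_entropy_exponential(10.0)` is already close to 1. This monotone behavior as a function of t is exactly the kind of property the note relates to failure-rate monotonicity and stochastic dominance.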