Rényi's entropy and divergence of order α are given operational characterizations in terms of block coding and hypothesis testing, as so-called β-cutoff rates, with α = 1/(1 + β) for entropy and α = 1/(1 − β) for divergence. Out of several possible definitions of mutual information of order α (for a channel W and input distribution P) we adopt I_α(P, W) = min_Q Σ_x P(x) D_α(W(·|x)∥Q). This admits an interpretation as a β-cutoff rate, with α = 1/(1 − β) (at least for α ≥ 1/2), and so does max_P I_α(P, W), the 'Rényi capacity.' Geometrically, the β-cutoff rate for a discrete memoryless source or channel is the r-axis intercept of the tangent of slope β to the curve e(r), where e(r) is the exponent of the probability of error, resp. of correct decoding, for the best codes of rate r, according as r is an achievable rate or not. The ordinary cutoff rate of a DMC is the β-cutoff rate with β = −1. The β-cutoff rate for hypothesis testing has a similar geometric representation, with e(τ) the exponent of convergence of the probability of type 2 error to 0 or 1, for the best tests of sample size n → ∞ with probability exp(−nτ) of type 1 error.
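As a companion illustration (not part of the paper), the adopted definition I_α(P, W) = min_Q Σ_x P(x) D_α(W(·|x)∥Q) can be evaluated numerically: compute the order-α Rényi divergence in closed form and minimize over output distributions Q. The function names, the softmax parametrization of Q, and the use of SciPy's general-purpose optimizer are all my assumptions; this is a sketch, not the paper's method.

```python
import numpy as np
from scipy.optimize import minimize

def renyi_divergence(p, q, alpha):
    """D_alpha(p || q) = (1/(alpha - 1)) log sum_y p(y)^alpha q(y)^(1-alpha), in nats."""
    mask = p > 0  # terms with p(y) = 0 contribute nothing (avoids 0 * inf)
    return np.log(np.sum(p[mask] ** alpha * q[mask] ** (1.0 - alpha))) / (alpha - 1.0)

def renyi_mutual_information(P, W, alpha):
    """I_alpha(P, W) = min over Q of sum_x P(x) D_alpha(W(.|x) || Q).

    Q is parametrized as a softmax of an unconstrained vector z, so the
    minimization over the probability simplex becomes an unconstrained one.
    """
    ny = W.shape[1]

    def objective(z):
        q = np.exp(z - z.max())
        q /= q.sum()
        return sum(P[x] * renyi_divergence(W[x], q, alpha)
                   for x in range(len(P)) if P[x] > 0)

    # non-uniform starting point so the optimizer does real work
    res = minimize(objective, np.linspace(0.0, 1.0, ny), method="BFGS")
    return res.fun

# Sanity checks: for a noiseless (identity) channel, D_alpha(W(.|x) || Q)
# collapses to -log Q(x), so the minimizing Q is P and I_alpha equals the
# Shannon entropy H(P); a constant channel carries no information.
P = np.array([0.5, 0.5])
print(renyi_mutual_information(P, np.eye(2), alpha=2.0))             # ~ log 2
print(renyi_mutual_information(P, np.full((2, 2), 0.5), alpha=2.0))  # ~ 0
```

The softmax reparametrization is a standard trick for optimizing over the simplex; for small alphabets a general quasi-Newton method suffices, since the objective is smooth in Q whenever α > 0, α ≠ 1.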