We consider the task of estimating a conditional density using i.i.d. samples from a joint distribution. For joint density estimation, minimax rates are known for general classes in terms of their (metric) entropy, a fundamental and well-studied notion of statistical capacity. However, applying these results to estimating conditional densities can yield suboptimal rates due to their dependence on uniform entropy, which is infinite when the covariate space is unbounded and suffers from the curse of dimensionality. We resolve this problem for well-specified models, obtaining matching upper and lower bounds on the minimax Kullback–Leibler risk in terms of the empirical Hellinger entropy of the conditional density class. In contrast to uniform entropy, empirical entropy provides the correct dependence on the size of the covariate space. We only require that the conditional densities are bounded above, but not that they are bounded below or otherwise satisfy any tail conditions.