Non-negative matrix factorization (NMF) approximates a given matrix as the product of two non-negative factors. Multiplicative update algorithms deliver reliable results, but they converge slowly on high-dimensional data and may get stuck away from local minima. Gradient descent methods behave better, but they apply only to smooth losses; for non-smooth losses such as the Kullback-Leibler (KL) divergence, such methods are, surprisingly, lacking. In this paper, we propose a first-order primal-dual algorithm for non-negative decomposition problems (where one of the two factors is fixed) with the KL divergence. All required updates are available in closed form, and we provide an efficient heuristic for selecting the step-sizes. Through alternating optimization, our algorithm extends readily to NMF; on synthetic and real-world data, it is either faster than existing algorithms, reaches better local optima, or both.
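To make the setting concrete, the following is a minimal sketch of a generic first-order primal-dual (Chambolle-Pock-style) iteration for the non-negative decomposition subproblem min_{h >= 0} D_KL(v || W h) with W fixed, using the well-known closed-form proximal operator of the conjugate of the KL term. The step-size choice below is the standard 1/||W|| rule, not the heuristic proposed in the paper, and all function names are illustrative.

```python
import numpy as np

def kl_div(v, z, eps=1e-12):
    """KL divergence D(v || z) = sum_i v_i log(v_i / z_i) - v_i + z_i."""
    return np.sum(v * np.log((v + eps) / (z + eps)) - v + z)

def primal_dual_kl(v, W, n_iter=500):
    """Generic Chambolle-Pock sketch for min_{h >= 0} D_KL(v || W h),
    with W fixed. Illustrative only: uses the standard step-size rule
    tau * sigma * ||W||^2 <= 1, not the paper's heuristic."""
    m, r = W.shape
    L = np.linalg.norm(W, 2)       # spectral norm of W
    tau = sigma = 1.0 / L
    h = np.ones(r)                 # primal variable
    h_bar = h.copy()               # extrapolated primal variable
    y = np.zeros(m)                # dual variable
    for _ in range(n_iter):
        # dual step: closed-form prox of sigma * f* for f(z) = sum z - v log z
        u = y + sigma * (W @ h_bar)
        y = 0.5 * (1.0 + u - np.sqrt((u - 1.0) ** 2 + 4.0 * sigma * v))
        # primal step: projection onto the non-negative orthant
        h_new = np.maximum(h - tau * (W.T @ y), 0.0)
        # extrapolation
        h_bar = 2.0 * h_new - h
        h = h_new
    return h
```

Alternating this update over the two factors (re-solving for each factor with the other held fixed) gives an NMF scheme of the kind described in the abstract.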