A random variable $X$ with characteristic function $f_X$ is said to be monotone if $|f_X|$ is decreasing on $[0,\pi]$. Examples of monotone random variables include those with a Bernoulli, Poisson, or geometric distribution, and a sum of independent monotone random variables is again monotone. \par
The authors consider a sum $X=\sum_{j=0}^{k-1}X_j$ of independent monotone integer-valued random variables with means $\mu_j=\Bbb{E}[X_j]$ and variances $\sigma_j^2={\rm Var}(X_j)$. Letting $\mu$ and $\sigma^2$ be the mean and variance of $X$ (so that, by independence, $\sigma^2=\sum_{j=0}^{k-1}\sigma_j^2$), their main result is the following estimate of the probability mass function of $X$: $$ \left|\Bbb{P}(X=\mu+t\sigma)-\frac{1}{\sqrt{2\pi}\sigma}e^{-t^2/2}\right|\leq c\left(\frac{\sum_{j=0}^{k-1}\Bbb{E}\left[|X_j-\mu_j|^3\right]}{\sigma^3}\right)^2, $$ for some universal constant $c$ and all $t$ such that $\mu+t\sigma$ is an integer; the rate this yields in a simple special case is sketched below. \par
The authors also present a similar bound for the mass function of $X$ under the stronger assumption that a non-trivial exponential tilting of $X$ exists and that every such tilting is monotone; this property is referred to as strong monotonicity. The advantage of the latter bound is that it can perform better than the estimate above away from the mean of $X$.
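\par For context, the exponential tiltings referred to in the last paragraph are, in the standard formulation (the reviewer's sketch; the paper's precise normalization may differ), the random variables $X_\theta$ with mass function $$ \Bbb{P}(X_\theta=n)=\frac{e^{\theta n}\,\Bbb{P}(X=n)}{\Bbb{E}\left[e^{\theta X}\right]},\qquad n\in\Bbb{Z}, $$ defined for those $\theta\in\Bbb{R}$ with $\Bbb{E}[e^{\theta X}]<\infty$; "non-trivial" presumably refers to the existence of such a $\theta\neq 0$.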
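\par As a sanity check on the rate in the main estimate (the reviewer's illustration, not taken from the paper), take $X_0,\dots,X_{k-1}$ i.i.d. Bernoulli with parameter $p\in(0,1)$, a monotone case by the examples above. Then $\Bbb{E}\left[|X_j-p|^3\right]=p(1-p)^3+(1-p)p^3=p(1-p)\left[p^2+(1-p)^2\right]$ and $\sigma^2=kp(1-p)$, so the right-hand side of the estimate becomes $$ c\left(\frac{k\,p(1-p)\left[p^2+(1-p)^2\right]}{\left(k\,p(1-p)\right)^{3/2}}\right)^2=\frac{c\left[p^2+(1-p)^2\right]^2}{k\,p(1-p)}, $$ which is of order $k^{-1}$, while the Gaussian main term is of order $\sigma^{-1}\asymp k^{-1/2}$.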