Single-cell RNA sequencing (scRNA-seq) is one of the most exciting recent technological breakthroughs, one that many believe will revolutionise medical research. At the same time, the technology has created urgent data-mining challenges. Compared with data from earlier bulk sequencing technologies, scRNA-seq data are noisier and more biased. One particular challenge is dropouts, where a low amount of mRNA in a cell leads to a zero read count for an expressed gene. Recently, an appealing idea has been to embrace dropouts and exploit their patterns for analysis, which has led to surprising results. In this paper, we take this idea further and focus on the problem of recovering gene dynamics from single-cell data. We show that dropouts can mislead the most commonly used model into producing the wrong dynamics. We propose a solution with two components: a nonlinear dynamical model based on neural ordinary differential equations (ODEs), and a hurdle distribution that adapts to potential signals in dropout patterns. We provide empirical evidence demonstrating the advantages of the proposed model over a state-of-the-art method for scRNA-seq analysis.
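To make the hurdle idea concrete, the following is a minimal sketch (not the paper's actual model) of a hurdle likelihood for counts: a Bernoulli "gate" generates zeros (dropouts) with some probability, and positive counts are modelled by a zero-truncated Poisson. All function and parameter names here are illustrative assumptions.

```python
import numpy as np
from math import lgamma

def hurdle_log_likelihood(x, p_zero, rate):
    """Hurdle-model log-likelihood (illustrative sketch).

    A Bernoulli gate produces a zero count (dropout) with probability
    `p_zero`; conditional on being positive, counts follow a
    zero-truncated Poisson with mean parameter `rate`.
    """
    x = np.asarray(x, dtype=float)
    ll = np.empty_like(x)
    zero = x == 0

    # Zero observations: contributed entirely by the dropout gate.
    ll[zero] = np.log(p_zero)

    # Positive observations: gate says "not dropped", then
    # zero-truncated Poisson: P(X=k | X>0) = Poisson(k; rate) / (1 - e^{-rate}).
    k = x[~zero]
    log_pois = np.array([v * np.log(rate) - rate - lgamma(v + 1) for v in k])
    ll[~zero] = np.log(1.0 - p_zero) + log_pois - np.log1p(-np.exp(-rate))
    return ll.sum()
```

A dataset dominated by zeros then assigns a higher likelihood to larger dropout probabilities, which is the signal a dropout-aware model can exploit.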