Eye dynamics, a typical expression of brain activity, are an emerging modality for promising smart health applications. The electrooculogram (EOG) – a natural bioelectric signal generated during eye movements – has, once decoded, great potential to reveal the user's intent and enable voice-free communication for patients with amyotrophic lateral sclerosis (ALS). ALS patients usually lose physical movement abilities, including speech and handwriting, but fortunately can still move their eyes. In this study, we propose a novel deep transfer learning-empowered system, called "eyeSay", which leverages both deep learning and transfer learning for intelligent eye EOG-to-speech translation. More specifically, we design a multi-stage convolutional neural network (CNN), named CNN-word, to analyze eye-written words. Moreover, to reveal fundamental patterns of eye movements, we build a transferable feature extractor, CNN-stroke, upon eye strokes, the building blocks of an eye-written word. We then transfer the CNN-stroke model to the eye-word learning task in an innovative way: CNN-stroke serves as an additional branch of CNN-word and generates a stroke probability map. The resulting boostCNN-word model, enhanced by this transferable feature extractor, greatly improves eye-word decoding performance. This study will directly contribute to voice-free communication for ALS patients and advance the ubiquitous eye EOG-based smart health area.
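To make the branch-fusion idea concrete, the sketch below illustrates, in minimal NumPy, how a pretrained stroke-classification branch could feed a stroke probability map into the word classifier. All layer sizes, weights, and function names here are hypothetical stand-ins (random, untrained parameters), not the actual eyeSay architecture; the point is only the fusion pattern: word features and the stroke probability map are concatenated before the final classification layer.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w, b):
    """Valid 1-D convolution: x (T, Cin), w (K, Cin, Cout), b (Cout)."""
    K, Cin, Cout = w.shape
    T = x.shape[0] - K + 1
    out = np.empty((T, Cout))
    for t in range(T):
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1])) + b
    return out

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical sizes: 4 stroke classes, 10 words, kernel width 5,
# 2 EOG channels (horizontal and vertical).
NUM_STROKES, NUM_WORDS, K = 4, 10, 5

# Stand-in weights; in the real system CNN-stroke would be pretrained
# on stroke data and transferred frozen into the word model.
w_s = rng.normal(size=(K, 2, NUM_STROKES)); b_s = np.zeros(NUM_STROKES)
w_f = rng.normal(size=(K, 2, 8));           b_f = np.zeros(8)
w_out = rng.normal(size=(8 + NUM_STROKES, NUM_WORDS)); b_out = np.zeros(NUM_WORDS)

def stroke_probability_map(eog):
    """CNN-stroke branch: per-window probabilities over stroke classes."""
    return softmax(conv1d(eog, w_s, b_s))

def boost_cnn_word(eog):
    """Word branch fused with the stroke probability map."""
    feats = np.maximum(conv1d(eog, w_f, b_f), 0)        # ReLU word features
    fused = np.concatenate([feats, stroke_probability_map(eog)], axis=1)
    pooled = fused.mean(axis=0)                          # global average pool
    return softmax(pooled @ w_out + b_out)               # word probabilities

eog = rng.normal(size=(100, 2))  # 100 samples of 2-channel EOG (synthetic)
p = boost_cnn_word(eog)          # probability over the word vocabulary
```

In practice each branch would be a trained multi-layer CNN; the sketch keeps one convolution per branch purely to show where the transferred stroke branch plugs into the word classifier.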