In real-world translation scenarios, particularly safety-critical ones in the aviation domain, neural machine translation (NMT) is expected to produce translations that respect the idiomatic expressions and constraints provided by aviation experts. This problem is generally addressed by fine-tuning an NMT model on a parallel corpus. However, such corpora are usually scarce and noisy, so the conventional pre-training and fine-tuning paradigm may not perform well in this setting. Recently, a series of prompt-tuning works have been proposed to address the few-shot and zero-shot problems in downstream tasks. Inspired by them, we design a prompt-based method for NMT: with the help of injected domain knowledge, the NMT model can strictly follow word-level constraints. Specifically, we propose two well-designed modules that satisfy translation constraints in the aviation domain, namely the preservation of words and the observance of terminology. We conduct extensive experiments on a real-world large corpus of English and Chinese aviation maintenance text and evaluate our method with four metrics: missing or redundant information, incorrect formats, bad word or sentence patterns, and unconventional expressions. The experimental results show that our prompt methods enable translations to satisfy domain-specific lexical constraints in low-resource settings, using a pre-trained Transformer as the backbone model across all tasks.