Manual summarization of long text is labor-intensive and time-consuming, so automatic text summarization methods are essential for information retrieval, text semantic mining, and question answering. In this paper, we propose an end-to-end automatic text summarization model (BSSA) based on Bidirectional Encoder Representations from Transformers (BERT) and the sequence-to-sequence framework. First, the BERT model extracts text features as one part of the encoder. Second, a long short-term memory (LSTM) network generates semantic features of the text as the other main part of the encoder. Finally, a decoder with an attention mechanism generates the abstractive summary. Experimental results on both Chinese and English datasets show that the BSSA model achieves significant improvements over the baseline models.
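
The sketch below illustrates the three-stage structure described above (BERT feature extraction, LSTM semantic encoding, attention-based decoding) in PyTorch with the Hugging Face `transformers` library. It is a minimal sketch under assumptions, not the paper's implementation; the class name `BSSASummarizer`, the single-layer LSTMs, and the hyperparameter values are illustrative only.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class BSSASummarizer(nn.Module):
    """Illustrative BERT + LSTM encoder with an attention-equipped LSTM decoder."""

    def __init__(self, bert_name="bert-base-chinese", hidden_size=768, vocab_size=21128):
        super().__init__()
        # Step 1: BERT extracts contextual token features (one part of the encoder).
        self.bert = BertModel.from_pretrained(bert_name)
        # Step 2: an LSTM over the BERT features yields the semantic features of the text.
        self.encoder_lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        # Step 3: an LSTM decoder with attention over the encoder states generates the summary.
        self.embedding = nn.Embedding(vocab_size, hidden_size)
        self.decoder_lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.attention = nn.MultiheadAttention(hidden_size, num_heads=1, batch_first=True)
        self.out_proj = nn.Linear(hidden_size * 2, vocab_size)

    def forward(self, input_ids, attention_mask, decoder_input_ids):
        # Encode the source text with BERT, then refine it with the encoder LSTM.
        bert_out = self.bert(input_ids=input_ids,
                             attention_mask=attention_mask).last_hidden_state
        enc_out, (h, c) = self.encoder_lstm(bert_out)
        # Decode: embed the previous summary tokens and run the decoder LSTM,
        # initialized with the encoder's final state.
        dec_emb = self.embedding(decoder_input_ids)
        dec_out, _ = self.decoder_lstm(dec_emb, (h, c))
        # Attend over encoder states at every decoding step (padding positions masked out).
        context, _ = self.attention(dec_out, enc_out, enc_out,
                                    key_padding_mask=~attention_mask.bool())
        # Combine decoder state and attention context to predict the next summary token.
        logits = self.out_proj(torch.cat([dec_out, context], dim=-1))
        return logits
```

In this reading, the encoder is the composition of BERT and the LSTM, while the decoder produces the abstractive summary token by token, conditioning on the attention context at each step; training would use teacher forcing with a cross-entropy loss over `logits`.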