Sequential transfer learning (STL) techniques aim to improve learning on a target task by leveraging knowledge from a related domain through pre-trained representations. Approaches based on these techniques have achieved state-of-the-art results on a wide range of natural language processing (NLP) tasks. In the context of event detection and key sentence extraction, we explore STL-based techniques using the latest generation of pre-trained language representations, namely ALBERT, BERT, DistilBERT, ELECTRA, OpenAI GPT2, RoBERTa, and XLNet. Experiments are conducted as part of our contribution to the CLEF 2019 ProtestNews Track, which aims to classify and identify protest events in English-language news from India and China. Averaged results show that an STL-based method with OpenAI GPT2 outperforms prevailing methods in this domain, achieving the best performance across both the event detection and key sentence extraction tasks. In addition, OpenAI GPT2 obtains the best results on the majority of the datasets tested, compared to the best system presented at the CLEF 2019 ProtestNews challenge.