In 2018 Google released BERT (Bidirectional Encoder Representations from Transformers), a pretrained language model that scored SOTA results on a range of natural language processing (NLP) tasks and revolutionized the research field. Similar transformer-based models such as OpenAI's GPT-2 and Baidu's ERNIE followed. In October 2019 Facebook AI came up with BART, a new