In 2018, Google published the Bidirectional Encoder Representations from Transformers (BERT) model. BERT produces contextualised word representations that are also bidirectional: each word's vector is conditioned on the tokens to both its left and its right. As a result, the same word appearing in different contexts can receive entirely different vectors depending on the meaning it carries there, and contextualised representations capture this nuance. For example, "bank" in "river bank" and "bank" in "bank account" end up with different vectors, because the model reads the surrounding context in both directions before assigning each word its representation.
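The intuition can be sketched with a toy, untrained self-attention step (not BERT itself; the embeddings and vocabulary below are hypothetical illustrations). Every token attends to every other token, left and right, so the output vector for "bank" depends on its full bidirectional context and differs between the two sentences even though its static embedding is identical:

```python
import numpy as np

# Hypothetical static 4-d embeddings for a tiny vocabulary (illustrative only).
VOCAB = {
    "river":   np.array([1.0, 0.0, 0.0, 0.0]),
    "bank":    np.array([0.0, 1.0, 0.0, 0.0]),
    "money":   np.array([0.0, 0.0, 1.0, 0.0]),
    "deposit": np.array([0.0, 0.0, 0.0, 1.0]),
}

def contextualise(sentence):
    """Return per-token vectors after one round of (untrained) self-attention.

    Each token attends to every token in the sentence -- to its left and to
    its right -- so its output vector depends on the whole context.
    """
    X = np.stack([VOCAB[w] for w in sentence])   # (n_tokens, d) static vectors
    scores = X @ X.T                             # dot-product attention scores
    weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    return weights @ X                           # context-mixed vectors

ctx_a = contextualise(["river", "bank"])
ctx_b = contextualise(["money", "bank", "deposit"])

# The static embedding for "bank" is the same in both sentences, but its
# contextualised vector differs because its neighbours differ.
vec_a = ctx_a[1]   # "bank" in "river bank"
vec_b = ctx_b[1]   # "bank" in "money bank deposit"
print(np.allclose(vec_a, vec_b))
```

Real BERT stacks many trained attention layers and uses learned projections, but the mechanism that makes the same word's vector context-dependent is the same: the representation is a mixture over the whole sentence, not a fixed lookup.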