BERT

References
Tags: concept
Sources: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Related notes:
Language Model
Updates:
April 8th, 2021: created note.
Notes {{word-count}}
Summary:
Key points:
BERT is a pre-trained bidirectional Transformer encoder. It is also a Language Model that has been widely studied and used since the paper was published.
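As a concrete illustration of the key point above, a minimal sketch of using BERT as an encoder; the Hugging Face `transformers` library and the `bert-base-uncased` checkpoint are assumptions, since the note does not name an implementation:

```python
# Minimal sketch (assumption: Hugging Face `transformers` + PyTorch installed;
# the note itself does not name an implementation).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

inputs = tokenizer("BERT is a bidirectional Transformer encoder.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per input token, computed with bidirectional
# self-attention: shape (batch_size, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, seq_len, 768])
```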