BERT
-
Masked Language Model
Predicts randomly masked tokens from the surrounding context; the primary pre-training objective for bidirectional encoders such as BERT.
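A minimal sketch of how MLM training inputs are built: randomly replace a fraction of tokens with a mask symbol and keep the originals as prediction targets. This is a simplification (real BERT masking uses an 80/10/10 split between mask, random token, and unchanged token); the function name and `[MASK]` string are illustrative.

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", mask_prob=0.15, seed=0):
    # Simplified BERT-style masking: each token is independently
    # replaced with mask_token with probability mask_prob.
    # labels holds the original token at masked positions, None elsewhere,
    # so the loss is computed only on masked positions.
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)
            labels.append(tok)   # model must reconstruct this token
        else:
            masked.append(tok)
            labels.append(None)  # position is not scored
    return masked, labels

tokens = "the cat sat on the mat".split()
masked, labels = mask_tokens(tokens, mask_prob=0.3)
```

The model then sees `masked` as input and is trained to predict the original token wherever `labels` is not `None`.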
-
BERT
Bidirectional Encoder Representations from Transformers; a bidirectional transformer encoder pre-trained with masked language modeling (and next-sentence prediction), foundational for many NLP tasks.