Description of the bert-language-model tag
BERT (Bidirectional Encoder Representations from Transformers) is a method of pre-training language representations that obtains state-of-the-art results on a wide array of Natural Language Processing (NLP) tasks.
The academic paper and Google's original implementation are linked under References below.
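As a minimal sketch of using a pre-trained BERT model, the example below loads one via the Hugging Face `transformers` library (an assumption here; the original Google implementation linked below is a separate TensorFlow codebase) and extracts contextual token embeddings:

```python
# Minimal sketch, assuming the Hugging Face `transformers` library
# and the publicly available "bert-base-uncased" checkpoint.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence and run it through the model.
inputs = tokenizer("BERT reads text bidirectionally.", return_tensors="pt")
outputs = model(**inputs)

# `last_hidden_state` holds one contextual vector per input token:
# shape (batch_size, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)
```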
References
BERT paper: https://arxiv.org/abs/1810.04805
BERT implementation: https://github.com/google-research/bert