🌐 Entity

BERT (language model)

Series of language models developed by Google AI

📊 Rating

1 news mention · 👍 0 likes · 👎 0 dislikes

📌 Topics

  • Artificial Intelligence (1)
  • Healthcare Technology (1)
  • Natural Language Processing (1)

🏷️ Keywords

transformer (1) · named entity recognition (1) · entity linking (1) · biomedical NLP (1) · SympTEMIST (1) · RoBERTa (1) · SapBERT (1) · clinical text (1)

📖 Key Information

Bidirectional Encoder Representations from Transformers (BERT) is a language model introduced in October 2018 by researchers at Google. It learns to represent text as a sequence of vectors via self-supervised learning and uses an encoder-only transformer architecture.
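The description above can be illustrated with a toy sketch of one bidirectional self-attention step, the core operation of BERT's encoder-only architecture. This is a minimal illustration in NumPy with random weights, not trained BERT parameters; the dimensions and token list are made up for the example.

```python
import numpy as np

# Toy sketch of bidirectional self-attention (the core of BERT's
# encoder). All weights are random illustrative values, not a trained model.

rng = np.random.default_rng(0)
d = 8                                   # toy hidden size (BERT-base uses 768)
tokens = ["the", "cat", "sat"]
X = rng.normal(size=(len(tokens), d))   # one embedding vector per token

# Query/key/value projections (randomly initialized for the sketch)
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv

# Scaled dot-product attention: each token attends to every token,
# left and right context alike, which is what makes BERT bidirectional.
scores = Q @ K.T / np.sqrt(d)
weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
contextual = weights @ V                # one context-aware vector per token

print(contextual.shape)                 # (3, 8): a sequence of vectors
```

The output has one vector per input token, matching the description of BERT representing text as a sequence of vectors.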

📰 Related News (1)

🔗 External Links