Entity
BERT (language model)
Series of language models developed by Google AI
Rating
1 news mention · 0 likes · 0 dislikes
Topics
- Artificial Intelligence (1)
- Healthcare Technology (1)
- Natural Language Processing (1)
Keywords
transformer (1) · named entity recognition (1) · entity linking (1) · biomedical NLP (1) · SympTEMIST (1) · RoBERTa (1) · SapBERT (1) · clinical text (1)
Key Information
Bidirectional Encoder Representations from Transformers (BERT) is a language model introduced in October 2018 by researchers at Google. It learns to represent text as a sequence of vectors using self-supervised learning, and it uses the encoder-only transformer architecture.
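As a rough sketch of what "encoder-only" means here, the core operation is bidirectional self-attention: every token's output vector is a weighted mix of all tokens in the sequence, with no causal masking. The toy sizes and random weights below are illustrative stand-ins, not BERT's actual parameters.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Project token embeddings into queries, keys, and values
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # pairwise token affinities
    # Softmax over all tokens -- no mask, so attention is bidirectional
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # one output vector per input token

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8  # toy dimensions, far smaller than BERT's
X = rng.normal(size=(seq_len, d_model))  # stand-in token embeddings
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # the sequence of contextual vectors, one per token
```

A full BERT encoder stacks many such attention layers (multi-headed, with feed-forward sublayers and layer normalization), but the input-to-output shape, a vector per token, is the same.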
Related News (1)
- Team Fusion@SU@BC8 SympTEMIST track: transformer-based approach for symptom recognition and linking
arXiv:2604.06424v1 Announce Type: cross Abstract: This paper presents a transformer-based approach to solving the SympTEMIST named entity recognition...