- Attention Is All You Need
  Paper • 1706.03762 • Published • 110
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  Paper • 1810.04805 • Published • 25
- RoBERTa: A Robustly Optimized BERT Pretraining Approach
  Paper • 1907.11692 • Published • 9
- DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
  Paper • 1910.01108 • Published • 21
Taufiq Dwi Purnomo
taufiqdp
AI & ML interests
SLM, VLM
Recent Activity
- liked a model 2 days ago: black-forest-labs/FLUX.2-klein-9B
- liked a model 2 days ago: google/translategemma-4b-it
- upvoted a paper 2 days ago: TranslateGemma Technical Report