# NLP

## embedding

### ELMo (NAACL 2018)

### BERT (2018)
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- Google AI Blog: Open Sourcing BERT: State-of-the-Art Pre-training for Natural Language Processing
- [R] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)
Tasks:
- Sentence Pair Classification
- Single Sentence Classification
- Question Answering
- Single Sentence Tagging
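The four task types above differ mainly in how the input is packed: single-sentence tasks (classification, tagging) feed one segment, while sentence-pair tasks (pair classification, question answering) feed two segments joined by `[SEP]` and distinguished by segment ids, as described in the BERT paper. A minimal sketch of that packing, with tokenization simplified to whitespace splitting (real BERT uses WordPiece), and with `pack_bert_input` as a hypothetical helper name:

```python
def pack_bert_input(tokens_a, tokens_b=None):
    """Build the token and segment-id sequences BERT expects.

    Single-sentence tasks pass one sequence; sentence-pair tasks
    (pair classification, question answering) pass two.
    NOTE: illustrative sketch only -- not the official implementation.
    """
    tokens = ["[CLS]"] + tokens_a + ["[SEP]"]
    segment_ids = [0] * len(tokens)               # segment A ids
    if tokens_b is not None:
        tokens += tokens_b + ["[SEP]"]
        segment_ids += [1] * (len(tokens_b) + 1)  # segment B ids
    return tokens, segment_ids

# Single Sentence Classification / Tagging: one segment
toks, segs = pack_bert_input("the movie was great".split())
# Sentence Pair Classification / Question Answering: two segments
toks2, segs2 = pack_bert_input("who wrote hamlet ?".split(),
                               "shakespeare wrote it .".split())
```

The `[CLS]` position's final hidden state is used for the classification tasks, while tagging and span-based QA read out per-token hidden states.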