Posts
Updates and Generalization Shortcuts
SpanBERT: Improving Pre-training by Representing and Predicting Spans (and a bit on pre-processing techniques)
Learning Which Features Matter: RoBERTa Acquires a Preference for Linguistic Generalizations (Eventually)
How Can We Accelerate Progress Towards Human-like Linguistic Generalization?
Intermediate-Task Transfer Learning with Pretrained Models for Natural Language Understanding: When and Why Does It Work?
Implicit Representations of Meaning in Neural Language Models