Adapters for the paper "M2QA: Multi-domain Multilingual Question Answering".
We evaluate two setups: MAD-X+Domain and MAD-X².
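Both setups compose adapters by stacking. Below is a minimal sketch of the MAD-X+Domain configuration using the `adapters` library; the domain and QA adapter paths are placeholders for illustration, not the paper's exact checkpoints.

```python
# Sketch of the MAD-X+Domain setup: a language adapter, a domain adapter,
# and a QA task adapter stacked per layer. Paths marked "placeholder"
# are assumptions, not the paper's released checkpoints.
from adapters import AutoAdapterModel
import adapters.composition as ac

model = AutoAdapterModel.from_pretrained("xlm-roberta-base")

lang = model.load_adapter("AdapterHub/xlm-roberta-base-de-wiki_pfeiffer")  # MAD-X language adapter
domain = model.load_adapter("path/to/domain-adapter")  # placeholder: domain adapter
task = model.load_adapter("path/to/qa-adapter")        # placeholder: QA task adapter

# Activate the stack: language -> domain -> task.
model.active_adapters = ac.Stack(lang, domain, task)
```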
AI & ML interests: Parameter-Efficient Fine-Tuning
Adapters from the paper "What to Pre-Train on? Efficient Intermediate Task Selection" (Poth et al., 2021)
-
What to Pre-Train on? Efficient Intermediate Task Selection
Paper β’ 2104.08247 β’ Published -
AdapterHub/roberta-base-pf-imdb
Text Classification β’ Updated β’ 4 -
AdapterHub/roberta-base-pf-conll2003
Token Classification β’ Updated β’ 4 β’ 1 -
AdapterHub/bert-base-uncased-pf-anli_r3
Text Classification β’ Updated β’ 6
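These task adapters ship with matching prediction heads, so they can be used directly for inference. A minimal sketch with the `adapters` library, using the IMDb adapter listed above:

```python
# Minimal inference sketch for the IMDb sentiment task adapter.
import torch
from transformers import AutoTokenizer
from adapters import AutoAdapterModel

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoAdapterModel.from_pretrained("roberta-base")

# Loads the adapter weights and the matching classification head from the Hub.
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-imdb")
model.set_active_adapters(adapter_name)

inputs = tokenizer("A quietly devastating, beautifully acted film.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1))  # predicted sentiment class id
```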
Adapters from the paper "AdapterFusion: Non-Destructive Task Composition for Transfer Learning" (Pfeiffer et al., 2021)
MAD-X language adapters from the paper "MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer" (Pfeiffer et al., 2020) for BERT and XLM-RoBERTa; a transfer sketch follows the list.
- Paper: MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer (arXiv:2005.00052)
- AdapterHub/xlm-roberta-base-de-wiki_pfeiffer
- AdapterHub/bert-base-multilingual-cased-mhr-wiki_houlsby
- AdapterHub/xlm-roberta-large-sw-wiki_pfeiffer
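In MAD-X, zero-shot cross-lingual transfer trains a task adapter on top of a source-language adapter and then swaps in a target-language adapter at inference. A sketch with one of the language adapters above; the task adapter path is a placeholder:

```python
# Sketch of MAD-X zero-shot transfer: keep the task adapter fixed, swap
# the language adapter. The task adapter path is a placeholder.
from adapters import AutoAdapterModel
import adapters.composition as ac

model = AutoAdapterModel.from_pretrained("xlm-roberta-base")

de = model.load_adapter("AdapterHub/xlm-roberta-base-de-wiki_pfeiffer")
task = model.load_adapter("path/to/task-adapter")  # placeholder

# Evaluate on German: language adapter below, task adapter on top.
model.active_adapters = ac.Stack(de, task)
```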
Adapters from the paper "Lifting the Curse of Multilinguality by Pre-training Modular Transformers"
Adapters for the paper "M2QA: Multi-domain Multilingual Question Answering".
We evaluate 2 setups: MAD-X+Domain and MAD-XΒ²
MAD-X language adapters from the paper "MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer" for BERT and XLM-RoBERTa.
-
MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
Paper β’ 2005.00052 β’ Published β’ 1 -
AdapterHub/xlm-roberta-base-de-wiki_pfeiffer
Updated β’ 2 -
AdapterHub/bert-base-multilingual-cased-mhr-wiki_houlsby
Updated β’ 1 -
AdapterHub/xlm-roberta-large-sw-wiki_pfeiffer
Updated β’ 1
Adapters from the paper "What to Pre-Train on? Efficient Intermediate Task Selection" (Poth et al., 2021)
-
What to Pre-Train on? Efficient Intermediate Task Selection
Paper β’ 2104.08247 β’ Published -
AdapterHub/roberta-base-pf-imdb
Text Classification β’ Updated β’ 4 -
AdapterHub/roberta-base-pf-conll2003
Token Classification β’ Updated β’ 4 β’ 1 -
AdapterHub/bert-base-uncased-pf-anli_r3
Text Classification β’ Updated β’ 6
Adapters from the paper "AdapterFusion: Non-Destructive Task Composition for Transfer Learning" (Pfeiffer et al., 2021)
Adapters from the paper "Lifting the Curse of Multilinguality by Pre-training Modular Transformers"