Timo Schick
NLP Researcher
Verified email at fb.com
Title · Cited by · Year
Exploiting cloze questions for few shot text classification and natural language inference
T Schick, H Schütze
arXiv preprint arXiv:2001.07676, 2020
1045 · 2020
BLOOM: A 176B-parameter open-access multilingual language model
BS Workshop, TL Scao, A Fan, C Akiki, E Pavlick, S Ilić, D Hesslow, ...
arXiv preprint arXiv:2211.05100, 2022
706 · 2022
It's not just size that matters: Small language models are also few-shot learners
T Schick, H Schütze
arXiv preprint arXiv:2009.07118, 2020
658 · 2020
Beyond the imitation game: Quantifying and extrapolating the capabilities of language models
A Srivastava, A Rastogi, A Rao, AAM Shoeb, A Abid, A Fisch, AR Brown, ...
arXiv preprint arXiv:2206.04615, 2022
439 · 2022
Toolformer: Language models can teach themselves to use tools
T Schick, J Dwivedi-Yu, R Dessì, R Raileanu, M Lomeli, L Zettlemoyer, ...
arXiv preprint arXiv:2302.04761, 2023
345 · 2023
Atlas: Few-shot learning with retrieval augmented language models
G Izacard, P Lewis, M Lomeli, L Hosseini, F Petroni, T Schick, ...
arXiv preprint arXiv 2208, 2022
187* · 2022
Self-diagnosis and self-debiasing: A proposal for reducing corpus-based bias in NLP
T Schick, S Udupa, H Schütze
Transactions of the Association for Computational Linguistics 9, 1408-1424, 2021
182 · 2021
Augmented language models: a survey
G Mialon, R Dessì, M Lomeli, C Nalmpantis, R Pasunuru, R Raileanu, ...
arXiv preprint arXiv:2302.07842, 2023
165 · 2023
Automatically identifying words that can serve as labels for few-shot text classification
T Schick, H Schmid, H Schütze
arXiv preprint arXiv:2010.13641, 2020
142 · 2020
Generating datasets with pretrained language models
T Schick, H Schütze
arXiv preprint arXiv:2104.07540, 2021
128 · 2021
Rare words: A major problem for contextualized embeddings and how to fix it by attentive mimicking
T Schick, H Schütze
Proceedings of the AAAI Conference on Artificial Intelligence 34 (05), 8766-8774, 2020
98 · 2020
Few-shot text generation with natural language instructions
T Schick, H Schütze
Proceedings of the 2021 Conference on Empirical Methods in Natural Language …, 2021
78 · 2021
Few-shot text generation with pattern-exploiting training
T Schick, H Schütze
arXiv preprint arXiv:2012.11926, 2020
75 · 2020
Unnatural instructions: Tuning language models with (almost) no human labor
O Honovich, T Scialom, O Levy, T Schick
arXiv preprint arXiv:2212.09689, 2022
70 · 2022
PEER: A collaborative language model
T Schick, J Dwivedi-Yu, Z Jiang, F Petroni, P Lewis, G Izacard, Q You, ...
arXiv preprint arXiv:2208.11663, 2022
54 · 2022
Attentive mimicking: Better word embeddings by attending to informative contexts
T Schick, H Schütze
arXiv preprint arXiv:1904.01617, 2019
46 · 2019
BERTRAM: Improved word embeddings have big impact on contextualized model performance
T Schick, H Schütze
arXiv preprint arXiv:1910.07181, 2019
39 · 2019
True few-shot learning with prompts—a real-world perspective
T Schick, H Schütze
Transactions of the Association for Computational Linguistics 10, 716-731, 2022
37 · 2022
Task-aware retrieval with instructions
A Asai, T Schick, P Lewis, X Chen, G Izacard, S Riedel, H Hajishirzi, ...
arXiv preprint arXiv:2211.09260, 2022
30 · 2022
Learning semantic representations for novel words: Leveraging both form and context
T Schick, H Schütze
Proceedings of the AAAI Conference on Artificial Intelligence 33 (01), 6965-6973, 2019
30 · 2019