Thomas Wolf
Co-founder at HuggingFace
Verified email at polytechnique.edu - Homepage
Title · Cited by · Year
Transformers: State-of-the-art natural language processing
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ...
Proceedings of the 2020 conference on empirical methods in natural language …, 2020
Cited by 14448* · 2020
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
V Sanh, L Debut, J Chaumond, T Wolf
arXiv preprint arXiv:1910.01108, 2019
Cited by 7160 · 2019
Multitask prompted training enables zero-shot task generalization
V Sanh, A Webson, C Raffel, SH Bach, L Sutawika, Z Alyafeai, A Chaffin, ...
arXiv preprint arXiv:2110.08207, 2021
Cited by 1398 · 2021
BLOOM: A 176B-parameter open-access multilingual language model
T Le Scao, A Fan, C Akiki, E Pavlick, S Ilić, D Hesslow, R Castagné, ...
Cited by 1350 · 2023
Transfer learning in natural language processing
S Ruder, ME Peters, S Swayamdipta, T Wolf
Proceedings of the 2019 conference of the North American chapter of the …, 2019
Cited by 707 · 2019
StarCoder: may the source be with you!
R Li, LB Allal, Y Zi, N Muennighoff, D Kocetkov, C Mou, M Marone, C Akiki, ...
arXiv preprint arXiv:2305.06161, 2023
Cited by 513* · 2023
TransferTransfo: A transfer learning approach for neural network based conversational agents
T Wolf, V Sanh, J Chaumond, C Delangue
arXiv preprint arXiv:1901.08149, 2019
Cited by 511 · 2019
Datasets: A community library for natural language processing
Q Lhoest, AV Del Moral, Y Jernite, A Thakur, P Von Platen, S Patil, ...
arXiv preprint arXiv:2109.02846, 2021
Cited by 479* · 2021
Movement pruning: Adaptive sparsity by fine-tuning
V Sanh, T Wolf, A Rush
Advances in neural information processing systems 33, 20378-20389, 2020
Cited by 400 · 2020
Two-dimensional superconductivity at a Mott insulator/band insulator interface LaTiO3/SrTiO3
J Biscaras, N Bergeal, A Kushwaha, T Wolf, A Rastogi, RC Budhani, ...
Nature communications 1 (1), 89, 2010
Cited by 343 · 2010
Natural language processing with transformers
L Tunstall, L Von Werra, T Wolf
O'Reilly Media, Inc., 2022
Cited by 305 · 2022
Diffusers: State-of-the-art diffusion models
P Von Platen, S Patil, A Lozhkov, P Cuenca, N Lambert, K Rasul, ...
Cited by 292 · 2022
A hierarchical multi-task approach for learning embeddings from semantic tasks
V Sanh, T Wolf, S Ruder
Proceedings of the AAAI conference on artificial intelligence 33 (01), 6949-6956, 2019
Cited by 263 · 2019
Zephyr: Direct distillation of LM alignment
L Tunstall, E Beeching, N Lambert, N Rajani, K Rasul, Y Belkada, ...
arXiv preprint arXiv:2310.16944, 2023
Cited by 245 · 2023
Open LLM Leaderboard
E Beeching, C Fourrier, N Habib, S Han, N Lambert, N Rajani, ...
Hugging Face, 2023
Cited by 200 · 2023
The Stack: 3 TB of permissively licensed source code
D Kocetkov, R Li, LB Allal, J Li, C Mou, CM Ferrandis, Y Jernite, M Mitchell, ...
arXiv preprint arXiv:2211.15533, 2022
Cited by 179 · 2022
Scaling data-constrained language models
N Muennighoff, A Rush, B Barak, T Le Scao, N Tazi, A Piktus, S Pyysalo, ...
Advances in Neural Information Processing Systems 36, 2024
Cited by 119 · 2024
Large-scale transfer learning for natural language generation
S Golovanov, R Kurbanov, S Nikolenko, K Truskovskyi, A Tselousov, ...
Proceedings of the 57th Annual Meeting of the Association for Computational …, 2019
Cited by 103 · 2019
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. CoRR abs/1910.01108 (2019)
V Sanh, L Debut, J Chaumond, T Wolf
URL: http://arxiv.org/abs/1910.01108, 2019
Cited by 102 · 2019
Grounding large language models in interactive environments with online reinforcement learning
T Carta, C Romac, T Wolf, S Lamprier, O Sigaud, PY Oudeyer
International Conference on Machine Learning, 3676-3713, 2023
Cited by 91 · 2023
Articles 1–20