Julien Launay
Extreme-Scale Team Lead @ HuggingFace
Verified email at huggingface.co · Homepage
Title
Cited by
Year
BLOOM: A 176B-Parameter Open-Access Multilingual Language Model
TL Scao, A Fan, C Akiki, E Pavlick, S Ilić, D Hesslow, R Castagné, ...
arXiv preprint arXiv:2211.05100, 2022
1148 · 2022
The RefinedWeb Dataset for Falcon LLM: Outperforming Curated Corpora with Web Data Only
G Penedo, Q Malartic, D Hesslow, R Cojocaru, H Alobeidli, A Cappelli, ...
Thirty-seventh Conference on Neural Information Processing Systems Datasets …, 2023
380* · 2023
The Falcon Series of Open Language Models
E Almazrouei, H Alobeidli, A Alshamsi, A Cappelli, R Cojocaru, M Debbah, ...
arXiv preprint arXiv:2311.16867, 2023
238* · 2023
What Language Model Architecture and Pretraining Objective Works Best for Zero-Shot Generalization?
T Wang, A Roberts, D Hesslow, T Le Scao, HW Chung, I Beltagy, ...
International Conference on Machine Learning, 22964-22984, 2022
93 · 2022
What Language Model to Train if You Have One Million GPU Hours?
TL Scao, T Wang, D Hesslow, L Saulnier, S Bekman, MS Bari, ...
arXiv preprint arXiv:2210.15424, 2022
78 · 2022
Direct Feedback Alignment Scales to Modern Deep Learning Tasks and Architectures
J Launay, I Poli, F Boniface, F Krzakala
Advances in Neural Information Processing Systems 33, 2020
65 · 2020
Principled Training of Neural Networks with Direct Feedback Alignment
J Launay, I Poli, F Krzakala
arXiv preprint arXiv:1906.04554, 2019
32 · 2019
Hardware Beyond Backpropagation: a Photonic Co-Processor for Direct Feedback Alignment
J Launay, I Poli, K Müller, G Pariente, I Carron, L Daudet, F Krzakala, ...
arXiv preprint arXiv:2012.06373, 2020
17 · 2020
A Holistic Assessment of the Carbon Footprint of Noor, a Very Large Arabic Language Model
I Lakim, E Almazrouei, I Abualhaol, M Debbah, J Launay
Proceedings of BigScience Episode #5 – Workshop on Challenges & Perspectives …, 2022
14 · 2022
Adversarial robustness by design through analog computing and synthetic gradients
A Cappelli, R Ohana, J Launay, L Meunier, I Poli, F Krzakala
ICASSP 2022-2022 IEEE International Conference on Acoustics, Speech and …, 2022
10 · 2022
LightOn Optical Processing Unit: Scaling-up AI and HPC with a Non von Neumann co-processor
C Brossollet, A Cappelli, I Carron, C Chaintoutis, A Chatelain, L Daudet, ...
arXiv preprint arXiv:2107.11814, 2021
9 · 2021
Photonic differential privacy with direct feedback alignment
R Ohana, H Medina, J Launay, A Cappelli, I Poli, L Ralaivola, ...
Advances in Neural Information Processing Systems 34, 22010-22020, 2021
8 · 2021
Method and system for machine learning using optical data
I Poli, J Launay, K Müller, G Pariente, I Carron, L Daudet
US Patent 11,137,289, 2021
6 · 2021
ROPUST: Improving Robustness through Fine-tuning with Photonic Processors and Synthetic Gradients
A Cappelli, R Ohana, J Launay, L Meunier, I Poli
ICML 2021 Workshop on Adversarial Machine Learning, 2021
6 · 2021
Light-in-the-loop: using a photonics co-processor for scalable training of neural networks
J Launay, I Poli, K Müller, I Carron, L Daudet, F Krzakala, S Gigan
arXiv preprint arXiv:2006.01475, 2020
6 · 2020
PAGnol: An Extra-Large French Generative Model
J Launay, EL Tommasone, B Pannier, F Boniface, A Chatelain, A Cappelli, ...
arXiv preprint arXiv:2110.08554, 2021
5 · 2021
Is the Number of Trainable Parameters All That Actually Matters?
A Chatelain, A Djeghri, D Hesslow, J Launay
I (Still) Can't Believe It's Not Better! Workshop at NeurIPS 2021, 27-32, 2022
4 · 2022
Analysis of factors affecting the performance of BIPV panels
J Launay, EWM Lee, R Bennacer, RKK Yuen
The European Physical Journal Applied Physics 84 (1), 10902, 2018
3 · 2018
Method and system for distributed training using synthetic gradients
J Launay, I Poli, K Müller, G Pariente, I Carron, L Daudet
US Patent App. 17/117,925, 2022
2 · 2022
AlGhafa Evaluation Benchmark for Arabic Language Models
E Almazrouei, R Cojocaru, M Baldo, Q Malartic, H Alobeidli, D Mazzotta, ...
Proceedings of ArabicNLP 2023, 244-275, 2023
1 · 2023