Gowtham Ramesh
Title
Cited by
Year
Samanantar: The Largest Publicly Available Parallel Corpora Collection for 11 Indic Languages
G Ramesh, S Doddapaneni, A Bheemaraj, M Jobanputra, R AK, ...
Transactions of the Association for Computational Linguistics 10, 145-162, 2022
58* · 2022
A Primer on Pretrained Multilingual Language Models
S Doddapaneni, G Ramesh, A Kunchukuttan, P Kumar, MM Khapra
arXiv preprint arXiv:2107.00676, 2021
21 · 2021
Towards Building ASR Systems for the Next Billion Users
T Javed, S Doddapaneni, A Raman, KS Bhogale, G Ramesh, ...
Proceedings of the AAAI Conference on Artificial Intelligence 36 (10), 10813 …, 2022
7 · 2022
SuperShaper: Task-Agnostic Super Pre-training of BERT Models with Variable Hidden Dimensions
V Ganesan, G Ramesh, P Kumar
arXiv preprint arXiv:2110.04711, 2021
1 · 2021
Articles 1–4