Ben Peters
Instituto de Telecomunicações
Verified email at uw.edu
Title
Cited by
Year
Sparse sequence-to-sequence models
B Peters, V Niculae, AFT Martins
arXiv preprint arXiv:1905.05702, 2019
60 · 2019
Massively multilingual neural grapheme-to-phoneme conversion
B Peters, J Dehdari, J van Genabith
arXiv preprint arXiv:1708.01464, 2017
29 · 2017
Interpretable structure induction via sparse attention
B Peters, V Niculae, AFT Martins
Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and …, 2018
11 · 2018
IT–IST at the SIGMORPHON 2019 Shared Task: Sparse Two-headed Models for Inflection
B Peters, AFT Martins
Proceedings of the 16th Workshop on Computational Research in Phonetics …, 2019
10 · 2019
One-Size-Fits-All Multilingual Models
B Peters, AFT Martins
Proceedings of the 17th SIGMORPHON Workshop on Computational Research in …, 2020
9 · 2020
Smoothing and Shrinking the Sparse Seq2Seq Search Space
B Peters, AFT Martins
arXiv preprint arXiv:2103.10291, 2021
2 · 2021
Articles 1–6