Ben Peters
Instituto de Telecomunicações
Verified email at uw.edu
Title
Cited by
Year
Sparse sequence-to-sequence models
B Peters, V Niculae, AFT Martins
arXiv preprint arXiv:1905.05702, 2019
Cited by 138 · 2019
Massively multilingual neural grapheme-to-phoneme conversion
B Peters, J Dehdari, J van Genabith
arXiv preprint arXiv:1708.01464, 2017
Cited by 45 · 2017
One-size-fits-all multilingual models
B Peters, AFT Martins
Proceedings of the 17th SIGMORPHON Workshop on Computational Research in …, 2020
Cited by 19 · 2020
Interpretable structure induction via sparse attention
B Peters, V Niculae, AFT Martins
Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and …, 2018
Cited by 15 · 2018
IT–IST at the SIGMORPHON 2019 shared task: Sparse two-headed models for inflection
B Peters, AFT Martins
Proceedings of the 16th Workshop on Computational Research in Phonetics …, 2019
Cited by 11 · 2019
Smoothing and shrinking the sparse seq2seq search space
B Peters, AFT Martins
arXiv preprint arXiv:2103.10291, 2021
Cited by 9 · 2021
Beyond characters: Subword-level morpheme segmentation
B Peters, AFT Martins
Proceedings of the 19th SIGMORPHON Workshop on Computational Research in …, 2022
Cited by 3 · 2022
DeepSPIN: Deep Structured Prediction for Natural Language Processing
AFT Martins, B Peters, C Zerva, C Lyu, G Correia, M Treviso, P Martins, ...
Proceedings of the 23rd Annual Conference of the European Association for …, 2022
Cited by 1 · 2022