| Title | Authors | Venue | Cited by | Year |
|---|---|---|---|---|
| Sparse sequence-to-sequence models | B Peters, V Niculae, AFT Martins | arXiv preprint arXiv:1905.05702 | 175 | 2019 |
| Massively multilingual neural grapheme-to-phoneme conversion | B Peters, J Dehdari, J van Genabith | arXiv preprint arXiv:1708.01464 | 52 | 2017 |
| One-size-fits-all multilingual models | B Peters, AFT Martins | Proceedings of the 17th SIGMORPHON Workshop on Computational Research in … | 19 | 2020 |
| Interpretable structure induction via sparse attention | B Peters, V Niculae, AFT Martins | Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and … | 16 | 2018 |
| Smoothing and shrinking the sparse Seq2Seq search space | B Peters, AFT Martins | arXiv preprint arXiv:2103.10291 | 13 | 2021 |
| IT–IST at the SIGMORPHON 2019 shared task: Sparse two-headed models for inflection | B Peters, AFT Martins | Proceedings of the 16th Workshop on Computational Research in Phonetics … | 11 | 2019 |
| Beyond characters: Subword-level morpheme segmentation | B Peters, AFT Martins | Proceedings of the 19th SIGMORPHON Workshop on Computational Research in … | 7 | 2022 |
| DeepSPIN: Deep Structured Prediction for Natural Language Processing | AFT Martins, B Peters, C Zerva, C Lyu, G Correia, M Treviso, P Martins, ... | Proceedings of the 23rd Annual Conference of the European Association for … | 1 | 2022 |