Marjan Ghazvininejad
Research Scientist, FAIR (Facebook AI Research)
Verified email at fb.com
Title
Cited by
Year
BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension
M Lewis
arXiv preprint arXiv:1910.13461, 2019
10779, 2019
Multilingual denoising pre-training for neural machine translation
Y Liu
arXiv preprint arXiv:2001.08210, 2020
1772, 2020
A knowledge-grounded neural conversation model
M Ghazvininejad, C Brockett, MW Chang, B Dolan, J Gao, W Yih, ...
Proceedings of the AAAI conference on artificial intelligence 32 (1), 2018
653, 2018
Mask-predict: Parallel decoding of conditional masked language models
M Ghazvininejad, O Levy, Y Liu, L Zettlemoyer
arXiv preprint arXiv:1904.09324, 2019
580, 2019
Generating Topical Poetry
M Ghazvininejad, X Shi, Y Choi, K Knight
Empirical Methods in Natural Language Processing, 2016
196, 2016
Hafez: an Interactive Poetry Generation System
M Ghazvininejad, X Shi, J Priyadarshi, K Knight
Proceedings of ACL, Demo Track, 2017
194, 2017
Detecting hallucinated content in conditional neural sequence generation
C Zhou, G Neubig, J Gu, M Diab, P Guzman, L Zettlemoyer, ...
arXiv preprint arXiv:2011.02593, 2020
180, 2020
Towards controllable story generation
N Peng, M Ghazvininejad, J May, K Knight
Proceedings of the First Workshop on Storytelling, 43-49, 2018
174, 2018
Pre-training via paraphrasing
M Lewis, M Ghazvininejad, G Ghosh, A Aghajanyan, S Wang, ...
Advances in Neural Information Processing Systems 33, 18470-18481, 2020
159, 2020
In-context examples selection for machine translation
S Agrawal, C Zhou, M Lewis, L Zettlemoyer, M Ghazvininejad
arXiv preprint arXiv:2212.02437, 2022
154, 2022
A review on language models as knowledge bases
B AlKhamissi, M Li, A Celikyilmaz, M Diab, M Ghazvininejad
arXiv preprint arXiv:2204.06031, 2022
145, 2022
DeLighT: Deep and light-weight transformer
S Mehta, M Ghazvininejad, S Iyer, L Zettlemoyer, H Hajishirzi
arXiv preprint arXiv:2008.00623, 2020
131, 2020
Non-autoregressive machine translation with disentangled context transformer
J Kasai, J Cross, M Ghazvininejad, J Gu
International conference on machine learning, 5144-5155, 2020
116*, 2020
Aligned cross entropy for non-autoregressive machine translation
M Ghazvininejad, V Karpukhin, L Zettlemoyer, O Levy
International Conference on Machine Learning, 3515-3523, 2020
112, 2020
Training on synthetic noise improves robustness to natural noise in machine translation
V Karpukhin, O Levy, J Eisenstein, M Ghazvininejad
Proceedings of the 5th Workshop on Noisy User-generated Text (W-NUT 2019), 42-47, 2019
106, 2019
Improving zero and few-shot abstractive summarization with intermediate fine-tuning and data augmentation
AR Fabbri, S Han, H Li, H Li, M Ghazvininejad, S Joty, D Radev, ...
arXiv preprint arXiv:2010.12836, 2020
103, 2020
Natural language to code translation with execution
F Shi, D Fried, M Ghazvininejad, L Zettlemoyer, SI Wang
arXiv preprint arXiv:2204.11454, 2022
87, 2022
BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension
M Lewis, Y Liu, N Goyal, M Ghazvininejad, A Mohamed, O Levy, ...
arXiv preprint arXiv:1910.13461, 2019
73, 2019
Prompting contrastive explanations for commonsense reasoning tasks
B Paranjape, J Michael, M Ghazvininejad, L Zettlemoyer, H Hajishirzi
arXiv preprint arXiv:2106.06823, 2021
71, 2021
Semi-autoregressive training improves mask-predict decoding
M Ghazvininejad, O Levy, L Zettlemoyer
arXiv preprint arXiv:2001.08785, 2020
65, 2020
Articles 1–20