Intermediate-Task Transfer Learning with Pretrained Models for Natural Language Understanding: When and Why Does It Work? Y Pruksachatkun, J Phang, H Liu, PM Htut, X Zhang, RY Pang, C Vania, et al. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020. Cited by 25.
Unsupervised Evaluation Metrics and Learning Criteria for Non-Parallel Textual Transfer. RY Pang, K Gimpel. Proceedings of the 3rd Workshop on Neural Generation and Translation, 138-147, 2019. Cited by 9*.
ENGINE: Energy-Based Inference Networks for Non-Autoregressive Machine Translation. L Tu, RY Pang, S Wiseman, K Gimpel. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020. Cited by 7.
Consistency of a Recurrent Language Model With Respect to Incomplete Decoding. S Welleck, I Kulikov, J Kim, RY Pang, K Cho. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, 2020. Cited by 5.
Improving Joint Training of Inference Networks and Structured Prediction Energy Networks. L Tu, RY Pang, K Gimpel. Proceedings of the 4th Workshop on Structured Prediction for NLP, 62-73, 2020. Cited by 3.
The Daunting Task of Real-World Textual Style Transfer Auto-Evaluation. RY Pang. arXiv preprint arXiv:1910.03747, 2019. Cited by 3*.
Text Generation by Learning from Off-Policy Demonstrations. RY Pang, H He. International Conference on Learning Representations, 2021.