13 results for "Reid, Machel"
Search Results
2. On the Role of Parallel Data in Cross-lingual Transfer Learning
3. PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining
4. On the Impact of Data Augmentation on Downstream Performance in Natural Language Processing
5. A Few Thousand Translations Go a Long Way! Leveraging Pre-trained Models for African News Translation
6. M2D2: A Massively Multi-Domain Language Modeling Dataset
7. Learning to Model Editing Processes
8. Variational Inference for Learning Representations of Natural Language Edits
9. LEWIS: Levenshtein Editing for Unsupervised Text Style Transfer
10. AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages
11. Low-Resource Machine Translation Using Cross-Lingual Language Model Pretraining
12. Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers
13. VCDM: Leveraging Variational Bi-encoding and Deep Contextualized Word Representations for Improved Definition Modeling
Discovery Service for Jio Institute Digital Library