153 results for "Dayiheng Liu"
Search Results
2. Effective Approaches to Neural Query Language Identification
3. An Empirical Study of Parameter Efficient Fine-tuning on Vision-Language Pre-train Model.
4. Rationales for Answers to Simple Math Word Problems Confuse Large Language Models.
5. How Abilities in Large Language Models are Affected by Supervised Fine-tuning Data Composition.
6. Knowledge Enhanced Pre-training for Cross-lingual Dense Retrieval.
7. MoNMT: Modularly Leveraging Monolingual and Bilingual Knowledge for Neural Machine Translation.
8. Talk Funny! A Large-Scale Humor Response Dataset with Chain-of-Humor Interpretation.
9. Towards a Unified View of Preference Learning for Large Language Models: A Survey.
10. Qwen2.5-Coder Technical Report.
11. Qwen2.5-Math Technical Report: Toward Mathematical Expert Model via Self-Improvement.
12. Qwen2-VL: Enhancing Vision-Language Model's Perception of the World at Any Resolution.
13. Qwen2 Technical Report.
14. Hallucination Detection: Robustly Discerning Reliable Answers in Large Language Models.
15. MAPO: Boosting Large Language Model Performance with Model-Adaptive Prompt Optimization.
16. DotaMath: Decomposition of Thought with Code Assistance and Self-correction for Mathematical Reasoning.
17. MAPO: Boosting Large Language Model Performance with Model-Adaptive Prompt Optimization.
18. Noisy Pair Corrector for Dense Retrieval.
19. Dynamic Voting for Efficient Reasoning in Large Language Models.
20. Unifying Discrete and Continuous Representations for Unsupervised Paraphrase Generation.
21. Bridging the Domain Gaps in Context Representations for k-Nearest Neighbor Neural Machine Translation.
22. Tailor: A Soft-Prompt-Based Approach to Attribute-Based Controlled Text Generation.
23. Fantastic Expressions and Where to Find Them: Chinese Simile Generation with Multiple Constraints.
24. Hallucination Detection: Robustly Discerning Reliable Answers in Large Language Models.
25. Competency-Aware Neural Machine Translation: Can Machine Translation Know its Own Translation Quality?
26. Dangling-Aware Entity Alignment with Mixed High-Order Proximities.
27. Bridging the Gap between Training and Inference: Multi-Candidate Optimization for Diverse Neural Machine Translation.
28. Should We Rely on Entity Mentions for Relation Extraction? Debiasing Relation Extraction with Counterfactual Analysis.
29. Self-supervised Product Title Rewrite for Product Listing Ads.
30. Unsupervised Preference-Aware Language Identification.
31. GCPG: A General Framework for Controllable Paraphrase Generation.
32. UniTE: Unified Translation Evaluation.
33. Attention Mechanism with Energy-Friendly Operations.
34. Alibaba-Translate China's Submission for WMT 2022 Quality Estimation Shared Task.
35. Alibaba-Translate China's Submission for WMT2022 Metrics Shared Task.
36. Frequency-Aware Contrastive Learning for Neural Machine Translation.
37. KGR4: Retrieval, Retrospect, Refine and Rethink for Commonsense Generation.
38. EMMA-X: An EM-like Multilingual Pre-training Algorithm for Cross-lingual Representation Learning.
39. CoupGAN: Chinese couplet generation via encoder-decoder model and adversarial training under global control.
40. Prediction, selection, and generation: a knowledge-driven conversation system.
41. OccuQuest: Mitigating Occupational Bias for Inclusive Large Language Models.
42. Towards Fine-Grained Information: Identifying the Type and Location of Translation Errors.
43. Qwen Technical Report.
44. PolyLM: An Open Source Polyglot Large Language Model.
45. Interactive Natural Language Processing.
46. How Abilities in Large Language Models are Affected by Supervised Fine-tuning Data Composition.
47. Mask Attention Networks: Rethinking and Strengthen Transformer.
48. GLGE: A New General Language Generation Evaluation Benchmark.
49. Bridging Subword Gaps in Pretrain-Finetune Paradigm for Natural Language Generation.
50. Towards User-Driven Neural Machine Translation.