
Harnessing Earnings Reports for Stock Predictions: A QLoRA-Enhanced LLM Approach

Authors:
Ni, Haowei
Meng, Shuchen
Chen, Xupeng
Zhao, Ziqing
Chen, Andi
Li, Panfeng
Zhang, Shiyao
Yin, Qifu
Wang, Yuanqing
Chan, Yuxi
Publication Year:
2024

Abstract

Accurate stock market predictions following earnings reports are crucial for investors. Traditional methods, particularly classical machine learning models, struggle with these predictions because they cannot effectively process and interpret the extensive textual data contained in earnings reports and often overlook nuances that influence market movements. This paper introduces an advanced approach that employs Large Language Models (LLMs) fine-tuned with a novel combination of instruction-based techniques and quantized low-rank adaptation (QLoRA) compression. Our methodology integrates 'base factors', such as financial metric growth and earnings transcripts, with 'external factors', including recent market index performance and analyst grades, to create a rich, supervised dataset. This comprehensive dataset enables our models to achieve superior predictive performance in terms of accuracy, weighted F1, and Matthews correlation coefficient (MCC), particularly in comparison with benchmarks such as GPT-4. We specifically highlight the efficacy of the llama-3-8b-Instruct-4bit model, which shows significant improvements over baseline models. The paper also discusses expanding the output to include a 'Hold' option and extending the prediction horizon, aiming to accommodate various investment styles and time frames. This study not only demonstrates the power of integrating cutting-edge AI with fine-tuned financial data but also paves the way for future research in enhancing AI-driven financial analysis tools.

Comment: Accepted by the 2024 6th International Conference on Data-driven Optimization of Complex Systems
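
For readers who want a concrete sense of the setup, a minimal sketch of 4-bit QLoRA instruction fine-tuning in the spirit described above could look like the following (Python with the Hugging Face transformers, peft, and bitsandbytes libraries; the base model identifier, prompt template, dataset field names, and LoRA hyperparameters are illustrative assumptions rather than the authors' actual configuration):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# Assumed base model; the paper's "llama-3-8b-Instruct-4bit" suggests a 4-bit-quantized Llama 3 8B Instruct.
model_id = "meta-llama/Meta-Llama-3-8B-Instruct"

# 4-bit NF4 quantization of the frozen base weights (the "Q" in QLoRA).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, quantization_config=bnb_config, device_map="auto")
model = prepare_model_for_kbit_training(model)

# Low-rank adapters on the attention projections; rank and alpha are illustrative choices.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

# Each supervised example pairs an instruction built from the 'base' and 'external'
# factors with an Up/Down label; the field names and template here are hypothetical.
def build_prompt(example):
    return (
        "### Instruction:\nPredict the post-earnings stock movement (Up or Down).\n"
        f"### Input:\n{example['factors']}\n"
        f"### Response:\n{example['label']}"
    )

Freezing the 4-bit base weights and training only the low-rank adapters is what makes fine-tuning an 8B-parameter model feasible on a single GPU, which is the practical appeal of QLoRA for a dataset of this kind.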

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2408.06634
Document Type:
Working Paper