
Can Low-Rank Knowledge Distillation in LLMs be Useful for Microelectronic Reasoning?

Authors:
Rouf, Nirjhor
Amin, Fin
Franzon, Paul D.

Publication Year:
2024

Abstract

In this work, we present empirical results regarding the feasibility of using offline large language models (LLMs) in the context of electronic design automation (EDA). The goal is to investigate and evaluate a contemporary language model's (Llama-2-7B) ability to function as a microelectronic Q&A expert, as well as its reasoning and generation capabilities in solving microelectronic-related problems. Llama-2-7B was tested across a variety of adaptation methods, including a novel low-rank knowledge distillation (LoRA-KD) scheme. Our experiments produce both qualitative and quantitative results.

Comment: 4 pages, 2 figures, 2 tables, The First IEEE International Workshop on LLM-Aided Design (LAD'24)
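The abstract does not detail the LoRA-KD scheme itself, but the general idea of combining low-rank adaptation with knowledge distillation can be sketched: freeze a pretrained layer, attach a trainable low-rank (LoRA) update, and train only those low-rank parameters against a teacher model's softened output distribution. The sketch below in PyTorch is an illustrative assumption of this generic pattern, not the authors' implementation; all names (LoRALinear, distillation_loss, rank, alpha, temperature) are hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LoRALinear(nn.Module):
    """A frozen linear layer plus a trainable low-rank (LoRA) update."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # keep the pretrained weights fixed
        # Low-rank factors: in_features -> rank -> out_features
        self.lora_a = nn.Linear(base.in_features, rank, bias=False)
        self.lora_b = nn.Linear(rank, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)  # update starts at zero
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions."""
    t = temperature
    return F.kl_div(
        F.log_softmax(student_logits / t, dim=-1),
        F.softmax(teacher_logits / t, dim=-1),
        reduction="batchmean",
    ) * (t * t)

# Toy usage: distill a "teacher" projection into a LoRA-adapted student.
vocab, hidden = 100, 32
teacher = nn.Linear(hidden, vocab)
student = LoRALinear(nn.Linear(hidden, vocab), rank=4)
opt = torch.optim.AdamW(
    [p for p in student.parameters() if p.requires_grad], lr=1e-3
)
x = torch.randn(16, hidden)
with torch.no_grad():
    teacher_logits = teacher(x)
loss = distillation_loss(student(x), teacher_logits)
loss.backward()
opt.step()

Only the low-rank factors receive gradients here, which is what makes this style of distillation cheap enough to run against a 7B-parameter base model on modest hardware.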

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2406.13808
Document Type:
Working Paper