
Web2Code: A Large-scale Webpage-to-Code Dataset and Evaluation Framework for Multimodal LLMs

Authors :
Yun, Sukmin
Lin, Haokun
Thushara, Rusiru
Bhat, Mohammad Qazim
Wang, Yongxin
Jiang, Zutao
Deng, Mingkai
Wang, Jinhong
Tao, Tianhua
Li, Junbo
Li, Haonan
Nakov, Preslav
Baldwin, Timothy
Liu, Zhengzhong
Xing, Eric P.
Liang, Xiaodan
Shen, Zhiqiang
Publication Year :
2024

Abstract

Multimodal large language models (MLLMs) have shown impressive success across modalities such as image, video, and audio in a variety of understanding and generation tasks. However, current MLLMs are surprisingly poor at understanding webpage screenshots and generating their corresponding HTML code. To address this problem, we propose Web2Code, a benchmark consisting of a new large-scale webpage-to-code dataset for instruction tuning and an evaluation framework for the webpage understanding and HTML code translation abilities of MLLMs. For dataset construction, we leverage pretrained LLMs to enhance existing webpage-to-code datasets as well as generate a diverse pool of new webpages rendered into images. Specifically, the inputs are webpage images and instructions, while the responses are the webpage's HTML code. We further include diverse natural language QA pairs about the webpage content in the responses to enable a more comprehensive understanding of the web content. To evaluate model performance in these tasks, we develop an evaluation framework for testing MLLMs' abilities in webpage understanding and web-to-code generation. Extensive experiments show that our proposed dataset is beneficial not only to our proposed tasks but also in the general visual domain, while previous datasets result in worse performance. We hope our work will contribute to the development of general MLLMs suitable for web-based content generation and task automation. Our data and code will be available at https://github.com/MBZUAI-LLM/web2code.

Comment: Website at https://mbzuai-llm.github.io/webpage2code/
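To make the described dataset format concrete, here is a minimal sketch of what a single instruction-tuning record could look like, following the abstract's description (webpage image and instruction as input; HTML code plus natural-language QA pairs as the response). The field names and values are illustrative assumptions, not the dataset's actual schema:

```python
# Hypothetical Web2Code-style instruction-tuning record.
# Field names ("image", "instruction", "response", "qa_pairs") are
# assumptions for illustration; consult the official repository for
# the real schema.
record = {
    "image": "webpage_00001.png",  # rendered webpage screenshot
    "instruction": "Generate the HTML code for this webpage.",
    "response": "<html><body><h1>Hello</h1></body></html>",
    "qa_pairs": [  # natural-language QA about the page content
        {"question": "What is the main heading?", "answer": "Hello"},
    ],
}

# A model trained on such records maps (image, instruction) -> response,
# while the QA pairs supervise broader webpage understanding.
print(record["instruction"])
print(record["qa_pairs"][0]["answer"])
```

In practice, many such records would be serialized (e.g., as JSON Lines) and paired with the rendered screenshot files.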

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2406.20098
Document Type :
Working Paper