
Graph Deep Active Learning Framework for Data Deduplication

Authors :
Huan Cao
Shengdong Du
Jie Hu
Yan Yang
Shi-Jinn Horng
Tianrui Li
Source :
Big Data Mining and Analytics, Vol 7, Iss 3, Pp 753-764 (2024)
Publication Year :
2024
Publisher :
Tsinghua University Press, 2024.

Abstract

With the advent of the big data era, increasing amounts of duplicate data are expressed in different forms. To reduce redundant data storage and improve data quality, data deduplication technology has never been more important than it is today. It is usually necessary to join multiple data tables and identify different records that point to the same entity, especially in multi-source data deduplication. Active learning trains the model by selecting the data items with the greatest information divergence, reducing the amount of data to be annotated, and thus offers unique advantages for big data annotation. However, most current active learning methods only address classical entity matching and are rarely applied to data deduplication tasks. To fill this research gap, we propose a novel graph deep active learning framework for data deduplication. The framework combines similarity algorithms with the bidirectional encoder representations from transformers (BERT) model to extract deep similarity features of multi-source data records, and it is the first to introduce a graph active learning strategy that builds a clean graph to select the data to be labeled and to delete duplicate data while retaining the most informative records. Experimental results on real-world datasets demonstrate that the proposed method outperforms state-of-the-art active learning models on data deduplication tasks.
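The "deep similarity features" mentioned in the abstract can be illustrated with a minimal sketch: embedding two candidate records with a pretrained BERT encoder and scoring them with cosine similarity. This is only an illustration of the general idea, not the authors' pipeline; the model name ("bert-base-uncased"), the mean-pooling step, and the example records are assumptions made for the sketch.

import torch
from transformers import AutoTokenizer, AutoModel

# Assumed pretrained encoder; the paper's actual model and fine-tuning are not shown here.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(record_text: str) -> torch.Tensor:
    """Mean-pool BERT's last hidden states over the record's tokens."""
    inputs = tokenizer(record_text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state.mean(dim=1).squeeze(0)

# Hypothetical records from two sources that may refer to the same entity.
rec_a = "Acme Corp, 12 Main St, Springfield, phone 555-0100"
rec_b = "ACME Corporation, 12 Main Street, Springfield"

similarity = torch.nn.functional.cosine_similarity(embed(rec_a), embed(rec_b), dim=0)
print(f"deep similarity feature: {similarity.item():.3f}")

In the framework described by the abstract, pairwise scores of this kind would feed a graph-based active learning step that selects which record pairs to label; that selection strategy is specific to the paper and is not reproduced in this sketch.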

Details

Language :
English
ISSN :
20960654
Volume :
7
Issue :
3
Database :
Directory of Open Access Journals
Journal :
Big Data Mining and Analytics
Publication Type :
Academic Journal
Accession number :
edsdoj.7418ce9816034b249a0f1e842d8d83cd
Document Type :
article
Full Text :
https://doi.org/10.26599/BDMA.2023.9020040