An Attribute-Aware Attentive GCN Model for Attribute Missing in Recommendation
- Author
- Fan Liu, Lei Zhu, Chenghao Liu, Zhiyong Cheng, and Liqiang Nie
- Subjects
- Information retrieval, Computer science, Recommender system, Computer Science Applications, Computational Theory and Mathematics, Node (computer science), Graph (abstract data type), Feature learning, Information Systems
- Abstract
As important side information, attributes have been widely exploited in existing recommender systems for better performance. Prior studies usually use a default value (i.e., "other") to represent a missing attribute, resulting in sub-optimal performance. To address this problem, in this paper, we present an attribute-aware attentive graph convolution network. In particular, we first construct a graph in which users, items, and attributes are three types of nodes and their associations are edges. Thereafter, we leverage the graph convolution network to characterize the complicated interactions among users, items, and attributes. Furthermore, to learn the node representations, we adopt a message-passing strategy that aggregates the messages passed from the other directly linked types of nodes (e.g., a user or an attribute). In this way, we are capable of incorporating associated attributes to strengthen the user and item representation learning, and thus naturally solve the attribute missing problem. Given that the attributes of an item influence different users' preferences for that item differently, we design a novel attention mechanism that filters the message passed from an item to a target user by considering the attribute information. Extensive experiments on several publicly accessible datasets demonstrate the superiority of our model over several state-of-the-art methods.
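The abstract's core idea can be illustrated with a minimal, hypothetical sketch (not the authors' implementation): items pass attribute-enriched messages to users, and an attention weight conditioned on those messages filters each item's contribution. All names, sizes, and the toy graph below are illustrative assumptions.

```python
# Hypothetical sketch of attribute-aware message passing with attention.
# This is NOT the paper's code; it only mirrors the idea in the abstract.
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding size (assumed)

# Toy heterogeneous graph: 2 users, 3 items, 2 attributes (all assumed).
user_emb = rng.normal(size=(2, d))
item_emb = rng.normal(size=(3, d))
attr_emb = rng.normal(size=(2, d))

# Edges: user -> items interacted with; item -> attributes it carries.
user_items = {0: [0, 1], 1: [1, 2]}
item_attrs = {0: [0], 1: [0, 1], 2: []}  # item 2's attributes are missing

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def item_message(i):
    """Item representation enriched by its linked attribute nodes.

    A missing attribute simply contributes no message, so no default
    'other' value is ever needed.
    """
    attrs = item_attrs.get(i, [])
    if not attrs:
        return item_emb[i]
    return item_emb[i] + attr_emb[attrs].mean(axis=0)

def user_update(u):
    """Aggregate item messages, weighted by user-specific attention."""
    items = user_items[u]
    msgs = np.stack([item_message(i) for i in items])
    # Attention score: similarity between the user and each
    # attribute-aware item message, so the same item can be weighted
    # differently for different users.
    alpha = softmax(msgs @ user_emb[u])
    return user_emb[u] + alpha @ msgs

new_user0 = user_update(0)
print(new_user0.shape)  # one d-dimensional updated user embedding
```

In a real model the embeddings would be trainable, the attention would use learned projection matrices, and several propagation layers would be stacked; this sketch only shows how linking attribute nodes directly into the graph sidesteps the missing-attribute placeholder.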
- Published
- 2022