
TA-BLSTM: Tag Attention-based Bidirectional Long Short-Term Memory for Service Recommendation in Mashup Creation

Authors :
Jianxun Liu
Min Shi
Yufei Tang
Source :
IJCNN
Publication Year :
2019
Publisher :
IEEE, 2019.

Abstract

The service-oriented architecture makes it possible for developers to create value-added Mashup applications by composing multiple available Web services. Due to the overwhelming number of Web services online, it is often hard and time-consuming for developers to find the ones they need in the entire service repository. In the past, various approaches aimed at recommending Web services for automatic Mashup creation have been proposed, e.g., TF-IDF, collaborative filtering, and topic model-based methods, which rely on the original service descriptions given by service providers. However, most traditional methods fail to capture the function-related features of services, since words in service descriptions usually correspond to different intent aspects (e.g., functional and non-functional). To tackle this problem, we propose a tag attention-based recurrent neural network model for Web service recommendation. The model consists of two Siamese bidirectional Long Short-Term Memory (LSTM) networks, which jointly learn two embeddings representing the functional features of Web services and the functional requirements of Mashups. In addition, by treating the tags of services as functional context information, the model learns to assign attention scores to words in service descriptions according to their importance to the intent, so that words revealing the functional properties of a Web service receive special attention. We compare our approach with state-of-the-art methods (e.g., RTM, Word2vec) on a real-world dataset crawled from ProgrammableWeb, and the experimental results demonstrate the effectiveness of the proposed model.
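The tag-attention idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the dimensions, the dot-product scoring, and the function names are all illustrative assumptions. It shows how a tag-derived context vector can produce a softmax-normalized attention weight for each word in a service description, and how the weighted sum of word vectors yields a function-focused service embedding:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def tag_attention_embedding(word_vecs, tag_vec):
    """Score each word vector against the tag context vector
    (a stand-in for the paper's learned attention), normalize the
    scores with softmax, and return the attention-weighted sum as
    the service embedding, along with the attention weights."""
    scores = word_vecs @ tag_vec        # one relevance score per word
    weights = softmax(scores)           # attention distribution over words
    embedding = weights @ word_vecs     # weighted service embedding
    return embedding, weights

# Toy example: 6 description words with 16-dim embeddings and one
# tag-derived context vector (all randomly generated for illustration).
rng = np.random.default_rng(0)
words = rng.normal(size=(6, 16))
tag = rng.normal(size=16)
emb, w = tag_attention_embedding(words, tag)
```

In the paper's setting, the word vectors would come from the BiLSTM's hidden states and the scoring function would be learned jointly with the rest of the network; the fixed dot product here only illustrates how tag context steers attention toward function-related words.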

Details

Database :
OpenAIRE
Journal :
2019 International Joint Conference on Neural Networks (IJCNN)
Accession number :
edsair.doi...........04907e888db6a0e2d3d15c3fd073ad20