10 results for "Oloulade, Babatounde Moctard"
Search Results
2. Adaptive graph contrastive learning with joint optimization of data augmentation and graph encoder
- Author
- Wu, Zhenpeng, Chen, Jiamin, Al-Sabri, Raeed, Oloulade, Babatounde Moctard, and Gao, Jianliang
- Published
- 2024
3. Graph neural architecture prediction
- Author
- Gao, Jianliang, Oloulade, Babatounde Moctard, Al-Sabri, Raeed, Chen, Jiamin, Lyu, Tengfei, and Wu, Zhenpeng
- Published
- 2024
4. GM2NAS: multitask multiview graph neural architecture search
- Author
- Gao, Jianliang, Al-Sabri, Raeed, Oloulade, Babatounde Moctard, Chen, Jiamin, Lyu, Tengfei, and Wu, Zhenpeng
- Published
- 2023
5. Neural predictor-based automated graph classifier framework
- Author
- Oloulade, Babatounde Moctard, Gao, Jianliang, Chen, Jiamin, Al-Sabri, Raeed, and Lyu, Tengfei
- Published
- 2023
6. Transformer-Based Graph Convolutional Network for Sentiment Analysis
- Author
- AlBadani, Barakat, Shi, Ronghua, Dong, Jian, Al-Sabri, Raeed, and Oloulade, Babatounde Moctard
- Subjects
- sentiment analysis, graph neural network, deep learning, NLP transformer, Technology, Engineering (General). Civil engineering (General), TA1-2040, Biology (General), QH301-705.5, Physics, QC1-999, Chemistry, QD1-999
- Abstract
Sentiment analysis is an essential research topic in the field of natural language processing (NLP) and has attracted the attention of many researchers in the last few years. Recently, deep neural network (DNN) models have been used for sentiment analysis tasks, achieving promising results. Although these models can analyze sequences of arbitrary length, utilizing them in the feature extraction layer of a DNN increases the dimensionality of the feature space. More recently, graph neural networks (GNNs) have achieved promising performance in different NLP tasks. However, previous models cannot be transferred to a large corpus and neglect the heterogeneity of textual graphs. To overcome these difficulties, we propose a new Transformer-based graph convolutional network for heterogeneous graphs called Sentiment Transformer Graph Convolutional Network (ST-GCN). To the best of our knowledge, this is the first study to model the sentiment corpus as a heterogeneous graph and learn document and word embeddings using the proposed sentiment graph transformer neural network. In addition, our model offers an easy mechanism to fuse node positional information for graph datasets using Laplacian eigenvectors (see the sketch after this entry). Extensive experiments on four standard datasets show that our model outperforms the existing state-of-the-art models.
- Published
- 2022
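Illustrative note on entry 6: the abstract above mentions fusing node positional information via Laplacian eigenvectors. The following is a minimal NumPy sketch of that general idea, not the authors' ST-GCN code; the toy graph, the function name, and the choice of k are assumptions for illustration.

```python
import numpy as np

def laplacian_positional_encoding(adj: np.ndarray, k: int) -> np.ndarray:
    """Return k non-trivial eigenvectors of the symmetric normalized
    Laplacian L = I - D^{-1/2} A D^{-1/2} as node positional features."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    nonzero = deg > 0
    d_inv_sqrt[nonzero] = deg[nonzero] ** -0.5
    lap = np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    # eigh returns eigenvalues in ascending order for symmetric matrices
    eigvals, eigvecs = np.linalg.eigh(lap)
    # Skip the trivial near-constant eigenvector (eigenvalue ~0), keep the next k
    return eigvecs[:, 1:k + 1]

# Toy usage on a 4-node path graph; in a text graph these features would be
# concatenated to word/document node embeddings before the GNN layers.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
pos_enc = laplacian_positional_encoding(A, k=2)
print(pos_enc.shape)  # (4, 2)
```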
7. Auto-GNAS: A Parallel Graph Neural Architecture Search Framework.
- Author
- Chen, Jiamin, Gao, Jianliang, Chen, Yibo, Oloulade, Babatounde Moctard, Lyu, Tengfei, and Li, Zhao
- Subjects
- GRAPH algorithms, SEARCH algorithms, GENETIC algorithms, LINEAR acceleration, PARALLEL programming, COMPUTER architecture
- Abstract
Graph neural networks (GNNs) have received much attention as they have recently been applied successfully to non-Euclidean data. However, manually designed graph neural networks often fail to achieve satisfactory performance on a given graph dataset. With the rise of automated machine learning, graph neural architecture search can effectively construct GNNs that achieve the expected model performance. The challenge is to obtain the optimal GNN architecture efficiently and automatically in a vast search space. Existing search methods evaluate GNN architectures serially, which severely limits system efficiency. To address these problems, we develop an Automatic Graph Neural Architecture Search framework (Auto-GNAS) with parallel estimation that implements an automatic graph neural search process requiring almost no manual intervention (a toy parallel-evaluation sketch follows this entry). In Auto-GNAS, we design a search algorithm with multiple genetic searchers. Each searcher can simultaneously use evaluation feedback, information entropy, and search results from other searchers through a sharing mechanism to improve search efficiency. As far as we know, this is the first work to use parallel computing to improve the system efficiency of graph neural architecture search. Experiments on real-world datasets show that Auto-GNAS obtains competitive model performance and better search efficiency than other search algorithms. Since the parallel estimation ability of Auto-GNAS is independent of the search algorithm, we extend Auto-GNAS with different search algorithms for scalability experiments. The results show that Auto-GNAS with varying search algorithms achieves nearly linear acceleration as computing resources increase.
- Published
- 2022
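Illustrative note on entry 7: a toy sketch of the parallel-estimation idea described above, where several candidate architectures are evaluated concurrently and the shared results guide the next generation of a genetic search. The operation list, the stand-in fitness function, and the pool size are assumptions; this is not the Auto-GNAS implementation.

```python
import random
from multiprocessing import Pool

OPS = ["gcn", "gat", "sage", "gin"]          # toy operation candidates per layer
NUM_LAYERS, POP_SIZE, GENERATIONS = 2, 8, 3

def evaluate(arch):
    # Stand-in fitness: a real system would train the candidate GNN and
    # return its validation accuracy instead of a pseudo-random score.
    random.seed(hash(arch) % (2 ** 32))
    return arch, random.random()

def mutate(arch):
    i = random.randrange(NUM_LAYERS)
    return tuple(random.choice(OPS) if j == i else op for j, op in enumerate(arch))

if __name__ == "__main__":
    population = [tuple(random.choice(OPS) for _ in range(NUM_LAYERS)) for _ in range(POP_SIZE)]
    history = {}                              # shared evaluation feedback across generations
    with Pool(processes=4) as pool:
        for _ in range(GENERATIONS):
            results = pool.map(evaluate, population)   # parallel estimation of candidates
            history.update(dict(results))
            # Keep the best half seen so far and mutate them to refill the population
            elite = sorted(history, key=history.get, reverse=True)[:POP_SIZE // 2]
            population = elite + [mutate(a) for a in elite]
    best = max(history, key=history.get)
    print("best architecture:", best, "score:", round(history[best], 3))
```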
8. A Principal Neighborhood Aggregation-Based Graph Convolutional Network for Pneumonia Detection.
- Author
- Guail, Akram Ali Ali, Jinsong, Gui, Oloulade, Babatounde Moctard, and Al-Sabri, Raeed
- Subjects
- CONVOLUTIONAL neural networks, PNEUMONIA, X-ray imaging, CHILD mortality, DEEP learning
- Abstract
Pneumonia is one of the main causes of child mortality in the world and has been reported by the World Health Organization (WHO) to be the cause of one-third of child deaths in India. Designing an automated classification system to detect pneumonia has therefore become a worthwhile research topic. Numerous deep learning models have attempted to detect pneumonia by applying convolutional neural networks (CNNs) to X-ray radiographs, as they are essentially images, and have achieved strong performance. However, they fail to capture higher-order feature information across the X-ray images because the image domain does not always exhibit spatially regular locality, which makes defining a spatial kernel filter for X-ray images non-trivial. This paper proposes a principal neighborhood aggregation-based graph convolutional network (PNA-GCN) for pneumonia detection. In PNA-GCN, we propose a new graph-based feature construction that uses transfer learning to extract features and then builds a graph from the images. We then propose a graph convolutional network with principal neighborhood aggregation, integrating multiple aggregation functions with degree scalers in a single layer to capture more effective information and exploit the underlying properties of the graph structure (a small aggregation sketch follows this entry). The experimental results show that PNA-GCN performs best in the pneumonia detection task on a real-world dataset against state-of-the-art baseline methods.
- Published
- 2022
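Illustrative note on entry 8: a small NumPy sketch of principal neighborhood aggregation, the building block named in the abstract, combining several aggregators (mean, max, min, std) with degree scalers. The guard for isolated nodes and all names here are illustrative assumptions, not the PNA-GCN code.

```python
import numpy as np

def pna_aggregate(x, adj):
    """Aggregate neighbor features with several aggregators and degree scalers.

    x:   (n, f) node features; adj: (n, n) binary adjacency matrix.
    Returns an (n, 3 * 4 * f) array: {identity, amplify, attenuate} x {mean, max, min, std}.
    """
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    # Normalisation constant over the graph; epsilon guards an all-isolated graph
    delta = max(np.mean(np.log(deg + 1)), 1e-6)
    outputs = []
    for i in range(n):
        neigh = x[adj[i] > 0]
        if len(neigh) == 0:
            neigh = np.zeros((1, x.shape[1]))
        aggs = np.concatenate([neigh.mean(0), neigh.max(0), neigh.min(0), neigh.std(0)])
        # The +2 keeps the attenuation scaler finite for isolated nodes (original PNA uses log(d+1))
        scalers = [1.0, np.log(deg[i] + 1) / delta, delta / np.log(deg[i] + 2)]
        outputs.append(np.concatenate([s * aggs for s in scalers]))
    return np.stack(outputs)

x = np.random.rand(5, 3)
adj = (np.random.rand(5, 5) > 0.5).astype(float)
np.fill_diagonal(adj, 0)
print(pna_aggregate(x, adj).shape)   # (5, 36)
```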
9. AutoAMS: Automated attention-based multi-modal graph learning architecture search.
- Author
- Al-Sabri, Raeed, Gao, Jianliang, Chen, Jiamin, Oloulade, Babatounde Moctard, and Wu, Zhenpeng
- Subjects
- GRAPH neural networks, SENTIMENT analysis, REPRESENTATIONS of graphs, SARCASM
- Abstract
Multi-modal attention mechanisms have been successfully used in multi-modal graph learning for various tasks. However, existing attention-based multi-modal graph learning (AMGL) architectures rely heavily on manual design, requiring huge effort and expert experience. Meanwhile, graph neural architecture search (GNAS) has made great progress toward automatically designing graph-based learning architectures. However, it is challenging to directly adopt existing GNAS methods to search for better AMGL architectures because their search spaces focus only on designing graph neural network architectures and their search objectives ignore multi-modal interactive information between modalities and long-term content dependencies within different modalities. To address these issues, we propose an automated attention-based multi-modal graph learning architecture search (AutoAMS) framework, which can automatically design optimal AMGL architectures for different multi-modal tasks. Specifically, we design an effective attention-based multi-modal (AM) search space consisting of four sub-spaces, which can jointly support the automatic search of multi-modal attention representations and the other components of a multi-modal graph learning architecture. In addition, a novel search objective based on an unsupervised multi-modal reconstruction loss and a task-specific loss is introduced to search and train AMGL architectures (a sketch of such a combined objective follows this entry). The search objective can extract global features and capture multi-modal interactions from multiple modalities. The experimental results on multi-modal tasks show strong evidence that AutoAMS is capable of designing high-performance AMGL architectures.
Highlights:
• AutoAMS searches for attention-based multi-modal graph learning (AMGL) architectures.
• The AM search space supports finding multi-modal attention and graph learning components.
• A search objective using unsupervised and task-specific losses helps find optimal AMGL architectures.
• AutoAMS surpasses state-of-the-art methods in sarcasm and sentiment detection tasks.
- Published
- 2024
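Illustrative note on entry 9: a hedged PyTorch sketch of the kind of combined search objective described above, adding an unsupervised multi-modal reconstruction loss to a task-specific loss. The tiny encoder/decoders, tensor shapes, and the weighting factor alpha are assumptions for illustration, not the AutoAMS objective itself.

```python
import torch
import torch.nn as nn

class ToyMultiModalModel(nn.Module):
    """Fuses two modalities, reconstructs each from the fused code, and classifies."""
    def __init__(self, d_text=16, d_image=16, d_fused=8, n_classes=2):
        super().__init__()
        self.encoder = nn.Linear(d_text + d_image, d_fused)
        self.dec_text = nn.Linear(d_fused, d_text)
        self.dec_image = nn.Linear(d_fused, d_image)
        self.classifier = nn.Linear(d_fused, n_classes)

    def forward(self, text, image):
        fused = torch.relu(self.encoder(torch.cat([text, image], dim=-1)))
        return self.classifier(fused), self.dec_text(fused), self.dec_image(fused)

def combined_loss(model, text, image, labels, alpha=0.5):
    # Task-specific loss plus an unsupervised reconstruction loss over both modalities
    logits, rec_text, rec_image = model(text, image)
    task = nn.functional.cross_entropy(logits, labels)
    recon = nn.functional.mse_loss(rec_text, text) + nn.functional.mse_loss(rec_image, image)
    return task + alpha * recon

model = ToyMultiModalModel()
text, image = torch.randn(4, 16), torch.randn(4, 16)
labels = torch.randint(0, 2, (4,))
loss = combined_loss(model, text, image, labels)
loss.backward()
print(float(loss))
```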
10. Decoupled differentiable graph neural architecture search.
- Author
- Chen, Jiamin, Gao, Jianliang, Wu, Zhenpeng, Al-Sabri, Raeed, and Oloulade, Babatounde Moctard
- Subjects
- GRAPH neural networks, TIME complexity, DISCRIMINATION against overweight persons
- Abstract
Differentiable graph neural architecture search (GNAS) efficiently and automatically designs graph neural networks (GNNs) with excellent performance for different graph data distributions. Given a GNN search space containing multiple candidate GNN component operations, a differentiable GNAS method builds a mixed supernet using learnable architecture parameters multiplied by the candidate operations. When the mixed supernet finishes optimization, it is pruned based on the best architecture parameters to efficiently identify the optimal GNN architecture in the search space. However, the multiplicative relationship between the architecture parameters and the candidate operations introduces a coupled optimization bias into the weight optimization of the supernet's candidate operations, and this bias degrades differentiable GNAS performance. To solve this problem, we propose Decoupled Differentiable Graph Neural Architecture Search (D2GNAS). It uses the Gumbel distribution as a bridge to decouple the weight optimization of the supernet's candidate operations from the architecture parameters, constructing a decoupled differentiable GNN architecture sampler (a sketch of Gumbel-based sampling follows this entry). The sampler selects promising GNN architectures by treating the architecture parameters as sampling probabilities, and it is further optimized through validation gradients derived from the sampled architectures. Simultaneously, D2GNAS builds a single-path supernet with a pruning strategy to compress the supernet progressively and further improve search efficiency. We conduct extensive experiments on multiple benchmark graphs. The findings demonstrate that D2GNAS outperforms all established baselines, both manual GNN and GNAS methods, in terms of performance. Additionally, D2GNAS has lower time complexity than previous differentiable GNAS methods and, in a fair GNN search space, achieves an average 5x efficiency improvement. Codes are available at https://github.com/AutoMachine0/D2GNAS.
- Published
- 2024
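Illustrative note on entry 10: a small NumPy sketch of the Gumbel-based decoupling idea summarized above, where architecture parameters act as sampling probabilities and a single operation per layer is drawn with the Gumbel-max trick instead of mixing all candidates multiplicatively. Operation names and parameter values are illustrative assumptions, not D2GNAS itself.

```python
import numpy as np

OPS = ["gcn", "gat", "sage", "gin"]              # candidate operations per layer

def gumbel_max_sample(logits, rng):
    """Draw one index per row using the Gumbel-max trick: argmax(logits + Gumbel noise)."""
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    return np.argmax(logits + gumbel, axis=-1)

rng = np.random.default_rng(0)
# Learnable architecture parameters (logits), one row per layer of the supernet
arch_params = rng.normal(size=(3, len(OPS)))

# Sample a discrete single-path architecture; only its weights would be optimized
# this step, so weight updates are decoupled from the architecture-parameter mixture.
indices = gumbel_max_sample(arch_params, rng)
sampled_arch = [OPS[i] for i in indices]
print(sampled_arch)
```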