118 results for "Yi, Pan"
Search Results
2. A fine acquisition algorithm based on fast three-time FRFT for dynamic and weak GNSS signals
- Author
-
Yi Pan, Sheng Zhang, Xiao Wang, Manhao Liu, and Yiran Luo
- Subjects
General Medicine - Published
- 2023
- Full Text
- View/download PDF
3. Multiview Subspace Clustering via Low-Rank Symmetric Affinity Graph
- Author
-
Wei Lan, Tianchuan Yang, Qingfeng Chen, Shichao Zhang, Yi Dong, Huiyu Zhou, and Yi Pan
- Subjects
Artificial Intelligence, Computer Networks and Communications, Software, Computer Science Applications - Published
- 2023
- Full Text
- View/download PDF
4. Microbe-Disease Association Prediction Using RGCN through Microbe-Drug-Disease Network
- Author
-
Yueyue Wang, Xiujuan Lei, and Yi Pan
- Subjects
Applied Mathematics, Genetics, Biotechnology - Published
- 2023
- Full Text
- View/download PDF
5. DEFNet: Dual-Branch Enhanced Feature Fusion Network for RGB-T Crowd Counting
- Author
-
Wujie Zhou, Yi Pan, Jingsheng Lei, Lv Ye, and Lu Yu
- Subjects
Mechanical Engineering, Automotive Engineering, Computer Science Applications - Published
- 2022
- Full Text
- View/download PDF
6. BIIoVT: Blockchain-Based Secure Storage Architecture for Intelligent Internet of Vehicular Things
- Author
-
Pradip Kumar Sharma, Yi Pan, Jong Hyuk Park, and Sushil Kumar Singh
- Subjects
Vehicular ad hoc network, Distributed database, Emerging technologies, Computer science, Cloud computing, Computer security, Computer Science Applications, Distributed hash table, Human-Computer Interaction, Hardware and Architecture, Smart city, The Internet, Electrical and Electronic Engineering, Information exchange - Abstract
Today, the rapid growth of vehicles connected to the Internet enables various services for consumers, including traffic management, traffic safety, and entertainment. The Vehicular Ad-Hoc Network (VANET) is one of the most prominent and emerging technologies on the Internet of Vehicular Things (IoVT). This technology fulfills requirements such as robust information exchange and infotainment among vehicles in the smart city environment. Still, it faces challenges of centralization, storage, security, and privacy, because all city vehicular networks send vehicle- and road-related data directly to the cloud. This article proposes BIIoVT, a Blockchain-based Secure Storage Architecture for the Intelligent Internet of Vehicular Things, to mitigate the above-mentioned issues. Blockchain provides security and privacy at each city's vehicular network and decentralized storage at the cloud layer with a Distributed Hash Table (DHT). It also examines how the vehicular network offers a secure platform. The validation results of the proposed architecture show an outstanding balance of secure storage and efficiency for the IoVT compared to existing methods.
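The DHT-backed decentralized storage described above can be illustrated with a minimal consistent-hashing sketch in Python; the node names, the `SimpleDHT` class, and its API are hypothetical illustrations, not the paper's actual design:

```python
import bisect
import hashlib

def ring_hash(key: str) -> int:
    # Map a key onto the DHT ring using SHA-1.
    return int(hashlib.sha1(key.encode()).hexdigest(), 16)

class SimpleDHT:
    """Minimal consistent-hashing ring: each record is stored on the
    first node clockwise from its hash, as in a classic DHT."""
    def __init__(self, nodes):
        self.ring = sorted((ring_hash(n), n) for n in nodes)

    def locate(self, record_key: str) -> str:
        h = ring_hash(record_key)
        idx = bisect.bisect(self.ring, (h, ""))
        return self.ring[idx % len(self.ring)][1]

# Hypothetical usage: route a vehicle's road report to a storage node.
dht = SimpleDHT(["cloud-node-1", "cloud-node-2", "cloud-node-3"])
print(dht.locate("vehicle42/road-report/2022-01-01"))
```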
- Published
- 2022
- Full Text
- View/download PDF
7. Guest Editorial Special Issue on Explainable Deep Learning for Medical Image Processing and Analysis
- Author
-
Yu-Dong Zhang, Juan Manuel Górriz, Yi Pan, and Oscar Cordon
- Subjects
Computational Mathematics, Control and Optimization, Artificial Intelligence, Computer Science Applications - Published
- 2023
- Full Text
- View/download PDF
8. MMCo-Clus – An Evolutionary Co-clustering Algorithm for Gene Selection
- Author
-
Sudipta Acharya, Yi Pan, Joshua Zhexue Huang, Sumit Mishra, and Laizhong Cui
- Subjects
Biclustering, Set (abstract data type), Biological data, Computational Theory and Mathematics, Computer science, Benchmark (computing), Sorting, Feature selection, Algorithm, Plot (graphics), Computer Science Applications, Information Systems, Visualization - Abstract
In the era of big data, analysis of high-dimensional data-sets often suffers from the curse of dimensionality. In this article, we propose a dimensionality-reduction method that selects features from a high-dimensional gene expression (GE) data-set using a multi-objective optimization based multi-view co-clustering algorithm (named MMCo-Clus). A popular evolutionary technique, the Non-dominated Sorting Genetic Algorithm-II (NSGA-II), is used as the underlying optimization strategy of the proposed method. First, we construct two views of the chosen data-set, utilizing knowledge from two different biological data sources. Next, we develop the MMCo-Clus algorithm on the constructed views to identify a set of good co-clustering solutions. Finally, based on a consensus operation over the co-clustering outcomes, a small number of the most relevant and non-redundant features are extracted from the original feature-space. The reduced dimension formed by the new feature-space decreases the computational burden and the noise level of the original data. For experimental analysis, we chose three benchmark GE data-sets. Our feature selection strategy's effectiveness is evaluated through sample-classification accuracy, accompanied by visualization tests such as cluster profile/Eisen plots and a biological significance test. A thorough comparative analysis with existing feature selection algorithms using external and internal evaluation metrics supports the potency of the proposed method.
- Published
- 2022
- Full Text
- View/download PDF
9. Edge-Based Video Surveillance With Graph-Assisted Reinforcement Learning in Smart Construction
- Author
-
Wei Xiao, Yi Pan, Jinshen Chen, Lixin Zhou, Shu Yang, Zhongxing Ming, and Laizhong Cui
- Subjects
Computer Networks and Communications, Computer science, Distributed computing, Latency (audio), Computer Science Applications, Scheduling (computing), Hardware and Architecture, Signal Processing, Reinforcement learning, Graph (abstract data type), Enhanced Data Rates for GSM Evolution, Applications of artificial intelligence, Latency (engineering), Edge computing, Information Systems - Abstract
The smart construction site is developing rapidly with the intelligentization of industrial management. Intelligent devices are being widely deployed in the construction industry to support artificial intelligence applications. Video surveillance is a core function of smart construction, demanding both high accuracy and low latency. The challenge is that the computation and networking resources on a construction site are often limited, and inefficient scheduling policies create congestion in the network and bring additional delay that is unbearable for real-time surveillance. Adaptive video configuration and edge computing have been proposed to improve accuracy and reduce latency with limited resources. However, optimizing video configuration and task scheduling in edge computing involves several factors that often interfere with each other, which significantly decreases the performance of video surveillance. In this paper, we present an edge-based solution for video surveillance on smart construction sites assisted by a Graph Neural Network. It leverages the distributed computing model to realize flexible allocation of resources. A graph-assisted hierarchical reinforcement learning algorithm is developed to capture the features of the mobile edge network and optimize the scheduling policy with a Deep Q-Network. We implement and test the proposed solution in commercial residential buildings of a Fortune Global 500 real estate company, and observe that the proposed algorithm maintains reliable accuracy while keeping delay low. We further conduct a case study that demonstrates the superiority of the proposed solution by comparing it with traditional mechanisms.
- Published
- 2022
- Full Text
- View/download PDF
10. RTT-Based Rogue UAV Detection in IoV Networks
- Author
-
Jie Chen, Ying He, Nilesh Chakraborty, Yao Chao, Sumit Mishra, Yi Pan, Jianqiang Li, and Chengwen Luo
- Subjects
Computer Networks and Communications, Computer science, Adversary, Computer Science Applications, Information sensitivity, Hardware and Architecture, Margin (machine learning), Signal Processing, Point (geometry), The Internet, Enhanced Data Rates for GSM Evolution, Communications protocol, Information Systems, Communication channel, Computer network - Abstract
Unmanned Aerial Vehicles (UAVs) are being used in different emerging domains to accomplish many critical tasks. However, due to constraints like battery life and computational resources, a UAV on a mission (M-UAV) often needs assistance from an edge/cloud server that is reachable from its location. A connection between an M-UAV and an edge server can be established via an access point (AP). Therefore, before sharing any sensitive information with the edge server, it is essential for an M-UAV to determine the legitimacy of the selected AP. Recent work in this direction indicates that a rogue UAV (R-UAV) can successfully mimic a legitimate AP to intercept the communication channel. Hence, a robust detection mechanism is needed to address such a threat scenario. In this paper, considering one of the emerging domains, Internet of Vehicles (IoV) networks, we first show that communication in IoV networks can benefit from the presence of M-UAVs. However, as the link between the M-UAV and the edge server can be intercepted by an R-UAV, the adversary may access sensitive information from the IoV networks. Following this, we propose a timing-based algorithm for identifying the presence of rogue APs (or R-UAVs) in the channel. The M-UAV executes the timing-based algorithm, and the detection method requires no auxiliary hardware and no modification to the network protocols. Supported by an extensive evaluation study, we show that without any rigid restriction on the M-UAV's speed (e.g., limiting it to almost static), the proposed approach significantly enhances detection accuracy (by margins of at least 29.7% and 16.65%) compared to state-of-the-art methods.
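The timing-based idea can be sketched as a simple statistical test on measured round-trip times; the baseline calibration, the threshold k, and the sample values below are illustrative assumptions, not the paper's actual algorithm or parameters:

```python
from statistics import mean

def looks_rogue(rtt_samples_ms, baseline_mean_ms, baseline_std_ms, k=3.0):
    """Flag an AP when its mean RTT deviates from the calibrated
    legitimate baseline by more than k standard deviations; a
    relaying R-UAV adds an extra wireless hop that inflates RTT."""
    return abs(mean(rtt_samples_ms) - baseline_mean_ms) > k * baseline_std_ms

# Illustrative: baseline learned from known-good APs (2.0 +/- 0.3 ms).
probe = [4.1, 4.4, 3.9, 4.6, 4.2]
print(looks_rogue(probe, baseline_mean_ms=2.0, baseline_std_ms=0.3))  # True
```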
- Published
- 2022
- Full Text
- View/download PDF
11. Cryptanalysis of a Honeyword System in the IoT Platform
- Author
-
Mithun Mukherjee, Mohammad Shojafar, Yi Pan, Jianqiang Li, and Nilesh Chakraborty
- Subjects
Password, Matching (statistics), Authentication, Cover (telecommunications), Computer Networks and Communications, Computer science, Computer security, Computer Science Applications, Domain (software engineering), Hardware and Architecture, Signal Processing, Benchmark (computing), Cryptanalysis, Internet of Things, Information Systems - Abstract
Passwords are one of the most well-known authentication methods for accessing many Internet of Things (IoT) devices. The usage of passwords, however, carries several drawbacks and emerging vulnerabilities on the IoT platform. Many solutions have been proposed to tackle these limitations, but most of these defense strategies suffer from the platform's limited computational power and memory capacity and cannot be deployed on it directly. Motivated by this consideration, the goal of this paper is fivefold. First, we analyze the feasibility of implementing a honeyword-based defense strategy to prevent the latest server-side threat on passwords in the IoT domain. Second, we perform a thorough cryptanalysis of a recently developed honeyword-based method to evaluate how well it prevents the threat, and explore the best way to incorporate it into the IoT platform. Third, we verify that a honeyword-based solution can be added to the IoT infrastructure by following specific guidelines. Fourth, we propose a generic attack model, namely the matching attack, which utilizes the compromised password-file to check whether any legacy-UI approach meets the essential flatness security criterion. Last, we compare the matching attack's performance with that of a benchmark method over the legacy-UI model and confirm that the examined schemes are 5%-22% more vulnerable to our attack than to the others.
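The matching attack can be pictured as ranking the k sweetwords of a compromised account with an attacker-side popularity model and guessing the top candidate; the scoring model and the values below are illustrative assumptions:

```python
def matching_attack(sweetwords, popularity):
    """Guess the sweetword the attacker's password model rates most
    likely to be real. A perfectly flat honeyword scheme would hold
    the attacker to a 1/k success rate; matching exploits any skew."""
    return max(sweetwords, key=lambda w: popularity.get(w, 0.0))

# Illustrative: the real password hidden among two honeywords.
account = ["dragon12", "drag0n17", "dargon21"]
model = {"dragon12": 3e-4, "drag0n17": 1e-6, "dargon21": 1e-7}
print(matching_attack(account, model))  # 'dragon12'
```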
- Published
- 2022
- Full Text
- View/download PDF
12. Predicting Drug-Drug Interactions Based on Integrated Similarity and Semi-Supervised Learning
- Author
-
Zhang Yayan, Yi Pan, Fang-Xiang Wu, Jianxin Wang, Cheng Yan, and Guihua Duan
- Subjects
Drug, Computer science, Semi-supervised learning, Machine learning, Cross-validation, Genetics, Humans, Drug Interactions, Drug reaction, Least-Squares Analysis, Applied Mathematics, Cosine similarity, Pharmaceutical Preparations, Drug development, Learning methods, Supervised Machine Learning, Artificial intelligence, Classifier (UML), Algorithms, Biotechnology - Abstract
A drug-drug interaction (DDI) is an association between two drugs in which the pharmacological effects of one drug are influenced by the other. Positive DDIs can usually improve the therapeutic effects for patients, but negative DDIs are the major cause of adverse drug reactions and can even result in drug withdrawal from the market and patient death. Therefore, identifying DDIs has become a key component of drug development and disease treatment. In this study, we propose a novel method to predict DDIs based on integrated similarity and semi-supervised learning (DDI-IS-SL). DDI-IS-SL integrates drug chemical, biological, and phenotype data to calculate the feature similarity of drugs with the cosine similarity method. The Gaussian Interaction Profile kernel similarity of drugs is also calculated from known DDIs. A semi-supervised learning method (the Regularized Least Squares classifier) is then used to calculate the interaction probability scores of drug-drug pairs. In terms of 5-fold cross validation, 10-fold cross validation, and de novo drug validation, DDI-IS-SL achieves better prediction performance than the other comparison methods. In addition, the average computation time of DDI-IS-SL is shorter than that of the other methods. Finally, case studies further demonstrate the performance of DDI-IS-SL in practical applications.
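A minimal numpy sketch of the pipeline's three named ingredients (cosine feature similarity, the Gaussian interaction profile kernel, and a regularized least squares scorer) follows; the matrix shapes and mixing weights are toy assumptions:

```python
import numpy as np

def cosine_sim(F):
    # Pairwise cosine similarity of drug feature vectors (rows of F).
    G = F / np.clip(np.linalg.norm(F, axis=1, keepdims=True), 1e-12, None)
    return G @ G.T

def gip_kernel(Y):
    # Gaussian interaction profile kernel built from known DDIs in Y.
    gamma = Y.shape[0] / np.sum(Y * Y)
    sq = np.sum(Y ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * Y @ Y.T
    return np.exp(-gamma * d2)

def rls_scores(K, Y, lam=1.0):
    # Regularized least squares: scores = K (K + lam I)^{-1} Y.
    return K @ np.linalg.solve(K + lam * np.eye(K.shape[0]), Y)

# Toy data: 4 drugs with 5-dim integrated feature vectors.
rng = np.random.default_rng(0)
F = rng.random((4, 5))
Y = (rng.random((4, 4)) > 0.5).astype(float)    # known interactions
K = 0.5 * cosine_sim(F) + 0.5 * gip_kernel(Y)   # integrated similarity
print(rls_scores(K, Y).round(2))
```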
- Published
- 2022
- Full Text
- View/download PDF
13. Adaptive Zone-Assisted Iterative Localization in Energy-Efficient Wireless Sensor Networks
- Author
-
Chun-Yi Wei and Hsuan-Yi Pan
- Subjects
Computer science, Real-time computing, Electrical and Electronic Engineering, Instrumentation, Wireless sensor network, Efficient energy use - Published
- 2021
- Full Text
- View/download PDF
14. A Deep Learning Framework for Gene Ontology Annotations With Sequence- and Network-Based Information
- Author
-
Yi Pan, Min Zeng, Hong Song, Fang-Xiang Wu, Min Li, Fuhao Zhang, and Yaohang Li
- Subjects
InterPro, Convolutional neural network, Deep learning, Subsequence, Genetics, Word2vec, Amino Acid Sequence, Protein Interaction Maps, Sequence, Artificial neural network, Applied Mathematics, Computational Biology, Proteins, Molecular Sequence Annotation, Pattern recognition, Gene Ontology, Embedding, Artificial intelligence, Algorithms, Software, Biotechnology - Abstract
Knowledge of protein functions plays an important role in biology and medicine. With the rapid development of high-throughput technologies, a huge number of proteins have been discovered, yet a great number of them lack functional annotations. A protein usually has multiple functions, and some functions or biological processes require the interaction of multiple proteins. Additionally, Gene Ontology provides a useful classification of protein functions and contains more than 40,000 terms. We propose a deep learning framework called DeepGOA to predict protein functions from protein sequences and protein-protein interaction (PPI) networks. For protein sequences, we extract two types of information: sequence semantic information and subsequence-based features. We use the word2vec technique to numerically represent protein sequences, and utilize a Bi-directional Long Short-Term Memory network (Bi-LSTM) and a multi-scale convolutional neural network (multi-scale CNN) to obtain the global and local semantic features of protein sequences, respectively. Additionally, we use the InterPro tool to scan protein sequences for subsequence-based information, such as domains and motifs. This information is then fed into a neural network to generate high-quality features. For the PPI network, the DeepWalk algorithm is applied to generate its embeddings. The two types of features are concatenated to predict protein functions. To evaluate the performance of DeepGOA, several different evaluation methods and metrics are used. The experimental results show that DeepGOA outperforms DeepGO and BLAST.
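The sequence branch can be sketched in a few lines of PyTorch (toy vocabulary and dimensions; the word2vec initialization, multi-scale CNN, InterPro features, and DeepWalk branch are omitted here):

```python
import torch
import torch.nn as nn

class SeqBranch(nn.Module):
    """Embed amino-acid tokens and run a Bi-LSTM to obtain a global
    semantic feature of the protein, then score GO terms."""
    def __init__(self, vocab=26, emb=32, hidden=64, n_go_terms=10):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.bilstm = nn.LSTM(emb, hidden, batch_first=True,
                              bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_go_terms)

    def forward(self, tokens):               # tokens: (batch, seq_len)
        out, _ = self.bilstm(self.embed(tokens))
        feat = out.mean(dim=1)               # pool over positions
        return torch.sigmoid(self.head(feat))  # multi-label GO scores

seqs = torch.randint(0, 26, (2, 100))        # two toy protein sequences
print(SeqBranch()(seqs).shape)               # torch.Size([2, 10])
```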
- Published
- 2021
- Full Text
- View/download PDF
15. ECG Biometrics Based on Attention Enhanced Domain Adaptive Feature Fusion Network
- Author
-
Yi Pan, Yujuan Si, Wei Fan, and Yang Zhang
- Published
- 2023
- Full Text
- View/download PDF
16. CoronaPep: An Anti-Coronavirus Peptide Generation Tool
- Author
-
Aamir Mehmood, Aman Chandra Kaushik, Gurudeeban Selvaraj, Xiaofeng Dai, Dong-Qing Wei, and Yi Pan
- Subjects
2019-20 coronavirus outbreak, COVID-19 Vaccines, Coronavirus disease 2019 (COVID-19), Computer science, viruses, Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), Peptide, Genome, Viral, Computational biology, Antiviral Agents, Genome, Viral Proteins, Genetics, Humans, Databases, Protein, Pandemics, Coronavirus, Host Microbial Interactions, SARS-CoV-2, Applied Mathematics, COVID-19, Computational Biology, virus diseases, COVID-19 Drug Treatment, Drug Design, Target protein, Peptides, Software, Biotechnology - Abstract
Novel coronavirus (COVID-19) infections have now taken the shape of a global pandemic, demanding urgent vaccine design. The current work reports an anti-coronavirus peptide scanner tool that discerns anti-coronavirus targets in the form of peptides. The proposed CoronaPep tool features fast fingerprinting of anti-coronavirus targets, a task of high prominence in current bioinformatics research. Anti-coronavirus target protein sequences reported from the current outbreak are scanned against anti-coronavirus target data-sets via CoronaPep, which provides precision-based anti-coronavirus peptides. This tool is designed specifically for coronavirus data and can predict peptides from a whole genome, a gene, or a protein list. Besides, it is relatively fast, accurate, and user-friendly, and can generate maximum output from limited information. The availability of tools like CoronaPep will immeasurably benefit researchers in the disciplines of oncology and structure-based drug design.
- Published
- 2021
- Full Text
- View/download PDF
17. Deletion Detection Method Using the Distribution of Insert Size and a Precise Alignment Strategy
- Author
-
Junwei Luo, Fang-Xiang Wu, Zhen Zhang, Juan Shang, Jianxin Wang, Yi Pan, and Min Li
- Subjects
Genome, Human, Computer science, Applied Mathematics, Breakpoint, Computational Biology, Genomics, Pattern recognition, Sequence Analysis, DNA, Insert (molecular biology), Structural variation, Mutagenesis, Insertional, Distribution (mathematics), Genomic Structural Variation, Genetics, Humans, Human genome, Artificial intelligence, Sequence Alignment, Gene Deletion, Biotechnology - Abstract
Homozygous and heterozygous deletions commonly exist in the human genome. For structural variation detection tools, it is important to determine whether a deletion is homozygous or heterozygous. However, sequencing errors, micro-homologies, and micro-insertions prevent common alignment tools from identifying accurate breakpoint locations and often result in false structural variation calls. In this study, we present a novel deletion detection tool called Sprites2. Compared with Sprites, Sprites2 makes the following modifications: (1) the distribution of insert size is used in Sprites2, which identifies the type of deletion and improves the accuracy of deletion calls; (2) a precise alignment method based on AGE (an algorithm that simultaneously aligns the 5' and 3' ends of two sequences) is adopted in Sprites2 to identify breakpoints, which helps resolve the problems introduced by sequencing errors, micro-homologies, and micro-insertions. To test and verify the performance of Sprites2, simulated and real datasets were used in our experiments, and Sprites2 was compared with five popular tools. The experimental results show that Sprites2 improves the performance of deletion detection. Sprites2 can be downloaded from https://github.com/zhangzhen/sprites2 .
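The insert-size signal can be sketched as follows: read pairs spanning a deletion map with inflated insert sizes, and the fraction of such discordant pairs hints at zygosity (the cutoffs below are illustrative assumptions, not Sprites2's actual parameters):

```python
def classify_deletion(insert_sizes, lib_mean, lib_std, het_band=(0.2, 0.8)):
    """Pairs spanning a deletion show insert sizes inflated by roughly
    the deletion length; ~50% discordant pairs suggests heterozygous,
    ~100% homozygous (toy thresholds)."""
    cutoff = lib_mean + 3 * lib_std
    frac = sum(s > cutoff for s in insert_sizes) / len(insert_sizes)
    if frac < het_band[0]:
        return "no deletion", frac
    return ("heterozygous" if frac <= het_band[1] else "homozygous"), frac

# Illustrative library (mean 400 bp, std 50 bp) over a ~1 kb deletion.
sizes = [410, 395, 1405, 420, 1388, 1420, 380, 1402]
print(classify_deletion(sizes, lib_mean=400, lib_std=50))
```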
- Published
- 2021
- Full Text
- View/download PDF
18. Periodic-Aware Intelligent Prediction Model for Information Diffusion in Social Networks
- Author
-
Xiaokang Zhou, Yi Pan, Zijia Luo, and Wei Liang
- Subjects
Topic model, Ubiquitous computing, Social computing, Social network, Computer Networks and Communications, Computer science, Deep learning, Big data, Mobile computing, Data science, Computer Science Applications, Data modeling, Control and Systems Engineering, Artificial intelligence - Abstract
Due to the rapid development of information and communication technologies and several emerging computing paradigms, such as ubiquitous computing, social computing, and mobile computing, modeling information diffusion has become an increasingly significant issue in the big data era. In this study, we focus on a periodic-aware intelligent prediction method based on comprehensive modeling of user and contagion features, which can be applied to support information diffusion across social networks in accordance with users' adoption behaviors. In particular, the Dynamically Socialized User Networking (DSUN) model and a sentiment-Latent Dirichlet Allocation (LDA) topic model, which consider a series of social factors, including user interests and social roles, semantic topics, and sentiment polarities, are constructed and integrated to facilitate the information diffusion process. A periodic-aware perception mechanism using reinforcement learning, with a newly designed reward rule based on topic distribution, is then designed to detect and classify different periods into the so-called routine period and emergency period. Finally, a deep learning scheme based on multi-factor analysis is developed for adoption behavior prediction within the identified periods. Experiments using real-world data demonstrate the effectiveness and usefulness of the proposed model and method in heterogeneous social network environments.
- Published
- 2021
- Full Text
- View/download PDF
19. A Novel Drug Repositioning Approach Based on Collaborative Metric Learning
- Author
-
Fang-Xiang Wu, Jianxin Wang, Huimin Luo, Cheng Yan, Yi Pan, and Min Li
- Subjects
Drug, Computer science, Association (object-oriented programming), ENCODE, Machine learning, Toxicogenetics, Task (project management), Machine Learning, Genetics, Humans, Models, Statistical, Applied Mathematics, Drug Repositioning, Computational Biology, Metric space, Drug development, Metric (mathematics), Artificial intelligence, Algorithms, Biotechnology - Abstract
Computational drug repositioning, which is an efficient approach for finding potential indications of drugs, has been used to increase the efficiency of drug development. The drug repositioning problem is essentially a top-K recommendation task that recommends the most likely diseases for drugs based on drug- and disease-related information. Therefore, many recommendation methods can be adapted to drug repositioning. The collaborative metric learning (CML) algorithm can produce distance metrics that capture the important relationships among objects, and has been widely used in recommendation domains. By applying CML to drug repositioning, a joint metric space is learned to encode drugs' relationships with different diseases. In this study, we propose a novel computational drug repositioning method using collaborative metric learning, CMLDR, to predict novel drug-disease associations based on known drug- and disease-related information. Specifically, the proposed method learns latent vectors of drugs and diseases by metric learning, and then predicts the association probability of a drug-disease pair from the learned vectors. Comprehensive experimental results show that CMLDR outperforms the other state-of-the-art drug repositioning algorithms in terms of precision, recall, and AUPR.
- Published
- 2021
- Full Text
- View/download PDF
20. On Designing a Lesser Obtrusive Authentication Protocol to Prevent Machine-Learning-Based Threats in Internet of Things
- Author
-
Chengwen Luo, Fei Chen, Mamoun Alazab, Jianqiang Li, Samrat Mondal, Nilesh Chakraborty, Yi Pan, and Huihui Wang
- Subjects
Password, Authentication, Computer Networks and Communications, Computer science, Node (networking), Access control, Usability, Machine learning, Computer Science Applications, Hardware and Architecture, Authentication protocol, Signal Processing, Artificial intelligence, State (computer science), Protocol (object-oriented programming), Information Systems - Abstract
In the era of the Internet of Things (IoT), people access many applications through smartphones to control smart devices. Such a centralized node must therefore follow a robust access control mechanism so that an intruder cannot control the connected devices. Recent reports suggest that passwords can be used as an authentication factor for accessing smart setups. However, this static information can be compromised by different machine learning (ML)-empowered attack mechanisms. Alarmingly, the sensors used in an IoT setup can also expose this static information to adversaries. Password-based authentication that uses a challenge-response strategy is an effective solution for handling such threat scenarios. In this article, we first show that no existing usable challenge-response protocol is safe to use in a public area network. Following this, we propose a challenge-response protocol that is more secure to use in the public domain. Using eight classifiers, we show that a learning-based threat specific to our protocol has only a marginal impact on the method's security standard. The discussion in this article also suggests that the proposed protocol has usability and security advantages over the existing state of the art (e.g., it reduces the number of interactions between the user and verifier by a factor of 0.5).
- Published
- 2021
- Full Text
- View/download PDF
21. A Gene Rank Based Approach for Single Cell Similarity Assessment and Clustering
- Author
-
Feng Luo, Jianxin Wang, Fang-Xiang Wu, Yunpei Xu, Yi Pan, and Hong-Dong Li
- Subjects
Cell type, Computer science, Population, Correlation, Mice, Similarity (network science), Databases, Genetic, Genetics, Animals, Cluster Analysis, Humans, Cluster analysis, Sequence Analysis, RNA, Applied Mathematics, Rank (computer programming), Computational Biology, Pattern recognition, Gene Ontology, Key (cryptography), Unsupervised learning, Artificial intelligence, Single-Cell Analysis, Transcriptome, Algorithms, Biotechnology - Abstract
Single-cell RNA sequencing (scRNA-seq) technology provides quantitative gene expression profiles at single-cell resolution. As a result, researchers have established new ways to explore cell population heterogeneity and the genetic variability of cells. One current research direction for scRNA-seq data is to identify different cell types accurately through unsupervised clustering. However, scRNA-seq data analysis is challenging because of the high noise level, high dimensionality, and sparsity of the data. Moreover, the impact of multiple latent factors on gene expression heterogeneity and on the ability to accurately identify cell types remains unclear. Overcoming these challenges to reveal the biological differences between cell types has become the key to analyzing scRNA-seq data. For these reasons, unsupervised learning for cell population discovery based on scRNA-seq data has become an important research area. A cell similarity assessment method plays a significant role in cell clustering. Here, we present BioRank, a new cell similarity assessment method based on annotated gene sets and gene ranks. To evaluate its performance, we cluster cells with two classical clustering algorithms using the similarities between cells obtained by BioRank. In addition, BioRank can be used by any clustering algorithm that requires a similarity matrix. Applying BioRank to 12 public scRNA-seq datasets, we show that it is better than, or at least as good as, several popular similarity assessment methods for single-cell clustering.
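A rank-based cell similarity in the spirit of BioRank can be sketched by restricting each cell's profile to annotated gene sets and comparing cells with Spearman correlation of within-set gene ranks; the gene sets and equal weighting here are placeholder assumptions, and BioRank's exact scoring may differ:

```python
import numpy as np
from scipy.stats import spearmanr

def rank_similarity(expr, gene_sets):
    """expr: (n_cells, n_genes); gene_sets: index arrays of annotated
    sets. Averaging rank correlations over sets yields a similarity
    matrix that is robust to scRNA-seq noise and scale."""
    n = expr.shape[0]
    sim = np.zeros((n, n))
    for genes in gene_sets:
        rho, _ = spearmanr(expr[:, genes], axis=1)  # cells as variables
        sim += np.atleast_2d(rho)
    return sim / len(gene_sets)

rng = np.random.default_rng(1)
expr = rng.random((5, 30))                    # 5 cells, 30 genes
sets = [np.arange(0, 10), np.arange(10, 30)]  # toy annotated gene sets
S = rank_similarity(expr, sets)
print(S.shape)  # (5, 5); feed S to any similarity-based clusterer
```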
- Published
- 2021
- Full Text
- View/download PDF
22. On Understanding the Impact of RTT in the Mobile Network for Detecting the Rogue UAVs
- Author
-
Yi Pan, Jianqiang Li, Yao Chao, Jie Chen, Pan Ziying, Nilesh Chakraborty, and Chengwen Luo
- Subjects
Computer Networks and Communications, Computer science, Real-time computing, Spotting, Information sensitivity, Artificial Intelligence, Hardware and Architecture, Server, Cellular network, Wireless, Enhanced Data Rates for GSM Evolution, Communications protocol, Wireless sensor network - Abstract
In recent times, Unmanned Aerial Vehicles (UAVs) have attracted significant attention for performing various operations without constant intervention from human users. Due to power and computing constraints, however, it is difficult for a UAV to perform all tasks independently. Hence, to achieve its goals, a UAV may share various sensitive data with the nearest edge servers through some access points (APs). Prior to sending any sensitive information through an AP, it is very important for a UAV to determine the authenticity of the selected AP. In this paper, we apply a timing-based algorithm that lets UAVs spot rogue APs while selecting an edge server. The timing-based algorithm requires no additional hardware and no change to the network protocol for detection, which in turn reduces the effort on the UAV's side for detecting a breach. An extensive experimental study shows that, with a little manipulation of its speed, a UAV can detect the presence of a rogue AP almost every time, with only a 6% false-positive rate.
- Published
- 2020
- Full Text
- View/download PDF
23. Deep Fuzzy Neural Networks for Biomarker Selection for Accurate Cancer Detection
- Author
-
Xueli Xiao, Yi Pan, Thosini Bamunu Mudiyanselage, and Yan-Qing Zhang
- Subjects
Artificial neural network, Computer science, Applied Mathematics, Small number, Feature extraction, Ambiguity, Fuzzy control system, Machine learning, Hybrid algorithm, Support vector machine, Computational Theory and Mathematics, Artificial Intelligence, Control and Systems Engineering, Artificial intelligence, Noise (video) - Abstract
Different biomedical computing methods for cancer-specific gene recognition have been developed in recent years. Currently, building an open-box machine learning system that discovers explainable knowledge from gene expression data is a difficult research problem due to the large number of genes, the small number of samples, and noise. Fuzzy systems can deal with data ambiguity and noise and extract meaningful knowledge from gene data. In this article, we create a new deep fuzzy neural network that handles the uncertainty in gene data to generate useful knowledge for specific disease diagnosis. A new hybrid algorithm is designed to preprocess data and select informative genes for accurate cancer detection. Experiments on six different cancer datasets indicate that the new method has better and more reliable performance than conventional classification methods with different gene selection methods.
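The fuzzy front end of such a network can be sketched as Gaussian membership functions that fuzzify each gene's expression into low/medium/high degrees before the dense layers; the centers and widths below are toy assumptions, not the paper's learned parameters:

```python
import numpy as np

def gaussian_membership(x, centers, sigmas):
    """Fuzzify features: x is (n_samples, n_genes), centers/sigmas are
    (n_genes, n_terms); output (n_samples, n_genes, n_terms) holds the
    degree to which each expression value is low/medium/high."""
    d = x[:, :, None] - centers[None, :, :]
    return np.exp(-0.5 * (d / sigmas[None, :, :]) ** 2)

x = np.array([[0.1, 0.9], [0.5, 0.4]])       # 2 samples, 2 genes
centers = np.tile([0.0, 0.5, 1.0], (2, 1))   # low / medium / high
sigmas = np.full((2, 3), 0.25)
print(gaussian_membership(x, centers, sigmas).shape)  # (2, 2, 3)
```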
- Published
- 2020
- Full Text
- View/download PDF
24. Rethinking Fast and Friendly Transport in Data Center Networks
- Author
-
Geyong Min, Tao Zhang, Jiawei Huang, Jianer Chen, Kai Chen, Yi Pan, and Jianxin Wang
- Subjects
Computer Networks and Communications, Computer science, Distributed computing, Computer Science Applications, Convergence (routing), Transport layer, Bandwidth (computing), Overhead (computing), Data center, The Internet, Electrical and Electronic Engineering, Protocol (object-oriented programming), Software - Abstract
The sustained growth of bandwidth has been an inevitable trend in current Data Center Networks (DCN). However, the dramatic expansion of link capacity poses a remarkable challenge to the transport layer protocols of DCN: how to converge quickly and enable data flows to utilize the high bandwidth effectively. Meanwhile, a new protocol should be compatible with traditional TCP, because applications with old TCP versions are still widely deployed. It is therefore important to achieve a trade-off between aggressiveness and TCP-friendliness in protocol design. In this article, we first empirically investigate why the existing typical data center TCP variants fail to guarantee both fast convergence and TCP-friendliness. Then, we design a new transport protocol for DCN, namely Fast and Friendly Converging (FFC), which makes independent decisions and self-adjustments by retrieving a two-dimensional congestion notification from both RTT and ECN. We further present a mathematical model to analyze its competing behavior and converging process. The results from simulation experiments and a real implementation show that FFC can achieve fast convergence, thus benefiting flow completion time. Moreover, when coexisting with traditional TCP, FFC presents moderate behavior, while introducing trivial deployment overhead only at the end-hosts.
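A toy window-update rule combining the two congestion signals (ECN marking fraction and RTT inflation) is sketched below; the thresholds and gains are illustrative assumptions, not FFC's actual control law:

```python
def update_cwnd(cwnd, ecn_frac, rtt, rtt_min, beta=0.8, gain=1.0):
    """Back off multiplicatively when either signal indicates
    congestion; otherwise grow toward the spare bandwidth, scaled
    by how close the measured RTT is to the propagation delay."""
    if ecn_frac > 0.1 or rtt > 1.5 * rtt_min:
        return max(1.0, cwnd * (1 - beta * max(ecn_frac, 0.5)))
    return cwnd + gain * cwnd * (rtt_min / rtt)   # fast convergence

cwnd = 10.0
for ecn, rtt in [(0.0, 110), (0.0, 115), (0.4, 200), (0.0, 112)]:
    cwnd = update_cwnd(cwnd, ecn, rtt, rtt_min=100)
    print(round(cwnd, 1))
```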
- Published
- 2020
- Full Text
- View/download PDF
25. Special Issue Editorial: Intelligent Data Analysis for Sustainable Computing
- Author
-
Geyong Min, Nektarios Georgalas, Yi Pan, and Yulei Wu
- Subjects
Control and Optimization, Renewable Energy, Sustainability and the Environment, Computer science, Perspective (graphical), Computational intelligence, Cloud computing, Data science, Green computing, Computational Theory and Mathematics, Hardware and Architecture, Key (cryptography), Special section, Software - Abstract
The ten papers in this special section are devoted to the most recent developments and research outcomes addressing the theoretical and practical aspects of computational intelligence solutions in sustainable computing, and aim to present the latest innovative ideas targeted at the corresponding key challenges, from either a methodological or an application perspective.
- Published
- 2020
- Full Text
- View/download PDF
26. An Efficient and Compacted DAG-Based Blockchain Protocol for Industrial Internet of Things
- Author
-
Ke Xu, Shu Yang, Laizhong Cui, Ziteng Chen, Mingwei Xu, and Yi Pan
- Subjects
Blockchain, SIMPLE (military communications protocol), Computer science, Data structure, Directed acyclic graph, Computer Science Applications, Control and Systems Engineering, Industrial Internet, Electrical and Electronic Engineering, Protocol (object-oriented programming), Throughput (business), Information Systems - Abstract
The Industrial Internet of Things (IIoT) has been widely used in many fields, and blockchain is considered promising for addressing its issues. However, current blockchains have limited throughput. In this article, we devise an efficient and secure blockchain protocol, Compacted Directed Acyclic Graph (CoDAG), based on a compacted directed acyclic graph in which blocks are organized in levels and width. Newly generated blocks in the CoDAG are placed appropriately and point to the blocks of the previous level, making it a well-connected channel. Transactions in the network are confirmed within a deterministic period, and the CoDAG keeps a simple data structure at the same time. We also illustrate possible adversarial attack strategies and prove that our protocol is resistant to them. Furthermore, we design a CoDAG-based IIoT architecture to improve the efficiency of IIoT systems. Experimental results show that CoDAG achieves 164× Bitcoin's throughput and 77× Ethereum's throughput.
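The level-and-width organization can be sketched as a small data structure in which every block of level i references all blocks of level i-1; the field names and the fixed width are assumptions for illustration:

```python
class CoDAGLedger:
    """Toy compacted-DAG ledger: blocks fill levels of fixed width,
    and each block points to every block of the previous level,
    keeping the graph well connected."""
    def __init__(self, width=2):
        self.width = width
        self.levels = [[{"id": "genesis", "parents": []}]]

    def append_block(self, block_id):
        # Open a new level when the current one is full (the genesis
        # block occupies level 0 on its own).
        if len(self.levels) == 1 or len(self.levels[-1]) == self.width:
            self.levels.append([])
        parents = [b["id"] for b in self.levels[-2]]
        self.levels[-1].append({"id": block_id, "parents": parents})

ledger = CoDAGLedger(width=2)
for i in range(5):
    ledger.append_block(f"blk{i}")
print([[b["id"] for b in lvl] for lvl in ledger.levels])
# [['genesis'], ['blk0', 'blk1'], ['blk2', 'blk3'], ['blk4']]
```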
- Published
- 2020
- Full Text
- View/download PDF
27. A Decentralized and Trusted Edge Computing Platform for Internet of Things
- Author
-
Laizhong Cui, Shu Yang, Zhong Ming, Yi Pan, Mingwei Xu, and Ziteng Chen
- Subjects
Distrust, Computer Networks and Communications, Computer science, Domain Name System, Computer Science Applications, Incentive, Hardware and Architecture, Server, Signal Processing, Trusted Platform Module, Internet of Things, Edge computing, Information Systems, Computer network - Abstract
With the development of the Internet of Things (IoT), edge computing is becoming more and more prevalent. However, edge computing needs a large number of edge servers to reduce communication latency, which brings additional costs to the system. Although some idle computing resources exist at the edge, their owners distrust each other and lack incentives to contribute them to the system. In this article, we propose a new edge computing platform, the Decentralized and Trusted Edge Computing platform (DeTEC), which provides a unified interface to users, resolves a user's requests to the most appropriate edge server through a domain name server, and returns the computational results to the IoT user. To build a trustworthy system, DeTEC integrates blockchain technology with edge computing so that the contributions of each participant can be accounted for and rewarded. We formulate the task allocation problem, taking both node capacity and reward fairness into consideration, and solve it with a heuristic algorithm. Finally, to guarantee the trustworthiness of computational results, we use a police patrol model and optimize the system's overall reward. We implement DeTEC based on an open source project and conduct comprehensive experiments to test its performance. The results show that the DeTEC system works well in the IoT scenario.
- Published
- 2020
- Full Text
- View/download PDF
28. MEC: Misassembly Error Correction in Contigs based on Distribution of Paired-End Reads and Statistics of GC-contents
- Author
-
Yi Pan, Junwei Luo, Xingyu Liao, Binbin Wu, Jianxin Wang, Fang-Xiang Wu, and Min Li
- Subjects
Contig, Computer science, Applied Mathematics, Sequence assembly, Repetitive Regions, Computational biology, Genome, Genetics, Statistical analysis, Error detection and correction, Biotechnology - Abstract
De novo assembly tools aim to reconstruct genomes from next-generation sequencing (NGS) data. However, assembly tools usually generate a large number of contigs containing many misassemblies, which are caused by repetitive regions, chimeric reads, and sequencing errors. Since they can improve the accuracy of assembly results, detecting and correcting misassemblies in contigs is appealing, yet challenging. In this study, a novel method called MEC is proposed to identify and correct misassemblies in contigs. Based on the insert size distribution of paired-end reads and a statistical analysis of GC-contents, MEC identifies misassemblies more accurately. We evaluate MEC with the metrics (NA50, NGA50) on four datasets, compare it with the most available misassembly correction tools, and carry out experiments to analyze the influence of MEC on scaffolding results, showing that MEC reduces misassemblies effectively and yields quantitative improvements in scaffolding quality. MEC is publicly available at https://github.com/bioinfomaticsCSU/MEC .
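The two signals MEC combines can be sketched as a per-window outlier screen over spanning-pair insert sizes and GC content; the z-score cutoff and window values are illustrative assumptions:

```python
import numpy as np

def suspicious_windows(insert_means, gc_contents, lib_mean, lib_std,
                       z_cut=3.0):
    """A contig window is a misassembly candidate when the mean insert
    size of read pairs spanning it, or its GC content, is a strong
    outlier against the library/genome-wide distribution."""
    ins_z = np.abs((insert_means - lib_mean) / lib_std)
    gc_z = np.abs((gc_contents - gc_contents.mean()) / gc_contents.std())
    return np.where((ins_z > z_cut) | (gc_z > z_cut))[0]

# Toy contig of 6 windows; window 3 shows an inflated insert size.
ins = np.array([402.0, 398.0, 405.0, 780.0, 401.0, 399.0])
gc = np.array([0.41, 0.42, 0.40, 0.43, 0.41, 0.42])
print(suspicious_windows(ins, gc, lib_mean=400, lib_std=50))  # [3]
```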
- Published
- 2020
- Full Text
- View/download PDF
29. miRTRS: A Recommendation Algorithm for Predicting miRNA Targets
- Author
-
Fang-Xiang Wu, Wei Lan, Yi Pan, Min Li, Hui Jiang, and Jianxin Wang
- Subjects
Models, Genetic, Computer science, Applied Mathematics, Feature extraction, Computational Biology, Cross-validation, Mirna target, MicroRNAs, Prediction algorithms, Prediction methods, microRNA, Genetics, Humans, Gene sequence, Algorithm, Algorithms, Biotechnology - Abstract
microRNAs (miRNAs) are small, important non-coding RNAs that regulate gene expression at the transcriptional and post-transcriptional levels by binding to their targets (genes). Predicting miRNA targets is an important problem in biological research, as identifying them with biological experiments is expensive and time-consuming. Many computational methods have been proposed to predict miRNA targets. In this study, we develop a novel method, named miRTRS, for predicting miRNA targets based on a recommendation algorithm. miRTRS can predict targets for an isolated (new) miRNA using miRNA sequence similarity, as well as isolated (new) targets for a miRNA using gene sequence similarity. Furthermore, unlike supervised machine learning methods, miRTRS does not need to select negative samples. We use 10-fold cross validation and independent datasets to evaluate the performance of our method, and compare miRTRS with two of the most recently published methods for miRNA target prediction. The experimental results show that miRTRS outperforms the competing prediction methods in terms of AUC and other evaluation metrics.
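A toy scorer in the spirit of this recommendation idea is sketched below: known associations are propagated through miRNA sequence similarity and gene sequence similarity, so isolated (new) miRNAs or genes still receive scores. The matrix names and mixing weight are assumptions, not miRTRS's actual formulation:

```python
import numpy as np

def predict_targets(A, sim_mirna, sim_gene, alpha=0.5):
    """A: (n_mirna, n_gene) known associations; propagate them through
    both similarity matrices so rows/columns with no known entries
    (isolated miRNAs or genes) inherit scores from similar ones."""
    return alpha * (sim_mirna @ A) + (1 - alpha) * (A @ sim_gene)

rng = np.random.default_rng(2)
A = (rng.random((3, 4)) > 0.6).astype(float)     # known associations
Sm = 0.5 * np.eye(3) + 0.5 * rng.random((3, 3))  # miRNA seq similarity
Sg = 0.5 * np.eye(4) + 0.5 * rng.random((4, 4))  # gene seq similarity
print(predict_targets(A, Sm, Sg).round(2))
```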
- Published
- 2020
- Full Text
- View/download PDF
30. Stochastic Load Balancing for Virtual Resource Management in Datacenters
- Author
-
Zhipeng Cai, Yi Pan, Lei Yu, Haiying Shen, Yi Liang, and Liuhua Chen
- Subjects
Computer Networks and Communications, Computer science, Cloud computing, Dynamic priority scheduling, Load balancing (computing), Virtualization, Network topology, Computer Science Applications, Load management, Network Load Balancing Services, Hardware and Architecture, Virtual machine, Software, Information Systems - Abstract
Cloud computing offers a cost-effective and elastic computing paradigm that facilitates large-scale data storage and analytics. By deploying virtualization technologies in the datacenter, the cloud enables efficient resource management and isolation for various big data applications. Since hotspots (i.e., overloaded machines) can degrade the performance of these applications, virtual machine migration has been utilized for load balancing in datacenters to eliminate hotspots and guarantee Service Level Agreements (SLAs). However, previous load balancing schemes make migration decisions based on deterministic resource demand estimation and workload characterization, without considering their stochastic properties. By studying real-world traces, we show that the resource demand and workload of virtual machines are highly dynamic and bursty, which can cause these schemes to make inefficient migrations for load balancing. To address this problem, we propose a stochastic load balancing scheme that provides a probabilistic guarantee against resource overloading with virtual machine migration, while minimizing the total migration overhead. Our scheme effectively addresses the prediction of the distribution of resource demand and the multidimensional resource requirements with stochastic characterization. Moreover, as opposed to previous works that measure migration cost without considering the network topology, our scheme explicitly takes into account the distance between the source physical machine and the destination physical machine of a virtual machine migration. Trace-driven experiments show that our scheme outperforms previous schemes in terms of SLA violations and migration cost.
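The probabilistic overload guarantee can be sketched with a normal approximation: sum the per-VM demand distributions on a host and require the overload probability to stay under a target. This is a simplification of the paper's stochastic characterization, with toy values:

```python
from math import sqrt
from statistics import NormalDist

def overload_prob(vm_means, vm_stds, capacity):
    """Approximate total demand of independent VMs as Normal with
    summed means and variances; return P(demand > capacity)."""
    mu = sum(vm_means)
    sigma = sqrt(sum(s * s for s in vm_stds))
    return 1.0 - NormalDist(mu, sigma).cdf(capacity)

def placement_ok(vm_means, vm_stds, capacity, eps=0.05):
    # Accept a placement only if the overload risk is below eps.
    return overload_prob(vm_means, vm_stds, capacity) < eps

# Toy host with 16 CPU units hosting three bursty VMs.
print(placement_ok([4.0, 5.0, 3.5], [1.0, 1.5, 0.8], capacity=16))
```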
- Published
- 2020
- Full Text
- View/download PDF
31. GUEST EDITORIAL: Special Issue on Social Sensing and Privacy Computing in Intelligent Social Systems
- Author
-
Neil Y. Yen, Yulei Wu, Victor C. M. Leung, Fei Hao, Yi Pan, and Juanjuan Li
- Subjects
Social network, Computer science, Cartel, Data science, Variety (cybernetics), Human-Computer Interaction, Power (social and political), Social system, Modeling and Simulation, Social media, Sensing system, Social Sciences (miscellaneous) - Abstract
The dramatic spread of online social network services, such as Facebook, Twitter, Instagram, and Google+, has led to increasing awareness of the power of incorporating social elements into a variety of data-centric applications. In recent years, these applications have combined various sensors with social media platforms to continuously collect massive data that can be directly associated with human interactions. This phenomenon has led to the creation of numerous social sensing systems, such as Biketastic, BikeNet, CarTel, and Pier, which use social sensors (i.e., users) for a variety of sensing applications. Social sensing has become an emerging and promising sensing paradigm that relies on the voluntary cooperation of users equipped with embedded or integrated sensors.
- Published
- 2020
- Full Text
- View/download PDF
32. Editorial: Computational Genomics and Molecular Medicine for Emerging COVID-19
- Author
-
Gurudeeban Selvaraj, Aman Chandra Kaushik, Yi Pan, and Dong-Qing Wei
- Subjects
Coronavirus disease 2019 (COVID-19), Computer science, Applied Mathematics, Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), Pharmacogenomics, Pandemic, Computational genomics, Genetics, Genomics, Disease, Data science, Molecular medicine, Biotechnology - Abstract
The papers in this special section focus on computational genomics and molecular medicine for emerging COVID-19. In 2020, the World Health Organization declared coronavirus disease (COVID)-19 a pandemic, which has devastated socio-economic life around the world. The disease is caused by the zoonotic single-strand RNA virus known as SARS-CoV-2. To overcome the pandemic, diagnostic and therapeutic products need to be developed in the short term. Developing therapeutics for infectious diseases, especially viral diseases, is always a challenging task for the scientific community. However, high-performance computational resources, artificial intelligence, and machine-learning algorithms can make the process affordable through the usage of genomics, proteomics, pharmacogenomics, and chemical data. The special section received strong research articles related to computational genomics, molecular medicine, and COVID-19 from reputed scientists around the world. The included articles employ machine learning, molecular dynamics, and computer-aided drug design techniques, and emphasize viral genomics, mutation, drug targets, drug candidates, and patient data.
- Published
- 2021
- Full Text
- View/download PDF
33. A graph convolution network-based model for prioritizing personalized cancer driver genes of individual patients
- Author
-
Wei Peng, Piaofang Yu, Wei Dai, Xiaodong Fu, Li Liu, and Yi Pan
- Subjects
Biomedical Engineering, Pharmaceutical Science, Medicine (miscellaneous), Bioengineering, Electrical and Electronic Engineering, Computer Science Applications, Biotechnology - Published
- 2023
- Full Text
- View/download PDF
34. A Study on the Digital Forensic Investigation Method of Clever Malware in IoT Devices
- Author
-
Dohyun Kim, Yi Pan, and Jong Hyuk Park
- Subjects
social engineering malware, General Computer Science, Computer science, IoT security, Digital forensics, IoT malware, Computer security, Digital forensic investigation, Web page, General Materials Science, Android (operating system), Hacker, Social engineering (security), General Engineering, Phishing, IoT device forensics, malware investigation, Malware, The Internet, Mobile device - Abstract
As IoT devices are always connected to mobile devices or other computing devices via the Internet, clever malware targeting IoT devices, or the computing devices connected to them, is emerging. Therefore, effective IoT security research is needed to respond to hacking attacks by such malware. This paper studies the method of identifying and analyzing malware combined with social engineering from the perspective of digital forensics. It classifies and analyzes the characteristics of intelligent malware and proposes a method for quickly identifying and analyzing malware that has secretly intruded into devices running Android or Linux OS, using digital forensics techniques. Moreover, the paper proves the method's effectiveness by applying it to two actual malware cases. The research outcomes will be useful in responding to increasingly clever malware attacking IoT devices.
- Published
- 2020
- Full Text
- View/download PDF
35. Block5GIntell: Blockchain for AI-Enabled 5G Networks
- Author
-
Abir El Azzaoui, Sushil Kumar Singh, Yi Pan, and Jong Hyuk Park
- Subjects
blockchain, General Computer Science, Standardization, Computer science, General Engineering, security, Energy consumption, artificial intelligence, privacy, Computer security, 5G networks, Cellular network, General Materials Science, smart contract, 5G - Abstract
Nowadays, the 5G network is considered one of the main pillars of various industries, including the Internet of Things (IoT), smart cities, virtual reality, and many more. Unlike previous network generations, 5G utilizes complex digital technologies such as massive Multiple Input Multiple Output (mMIMO) and runs over higher radio frequencies. The introduction of new technologies and advanced features in the 5G network raises new challenges for network operators, and merging Artificial Intelligence (AI) is one of the effective solutions to address these complexities. However, an AI-enabled 5G network engenders security concerns and requires improvement to meet the standardization and qualification of the new network generation. To mitigate these dilemmas, Blockchain must be integrated. Blockchain, as a decentralized methodology, provides secure sharing of information and resources among the various nodes of 5G environments. Blockchain can support other technologies, such as AI-based 5G, to create smarter, more efficient, and more secure cellular networks. In this article, we present a comprehensive intelligence and secure data analytics framework for 5G networks based on the convergence of Blockchain and AI, named "Block5GIntell". We depict the applications of Blockchain and AI in 5G networks separately, and we argue for the support that Blockchain can provide to AI in creating smart and secure 5G networks based on our proposed framework. To support our proposition, we present an energy-saving case study using Blockchain for AI-enabled 5G. The simulation shows an overall 20% decrease in energy consumption at the RAN level.
- Published
- 2020
- Full Text
- View/download PDF
36. Big Data Transmission in Industrial IoT Systems With Small Capacitor Supplying Energy
- Author
-
Junzhou Luo, Guangchun Luo, Xiaolin Fang, Weiwei Wu, Zhipeng Cai, and Yi Pan
- Subjects
Competitive analysis, Computer science, Network packet, Approximation algorithm, Energy storage, Computer Science Applications, Transmission (telecommunications), Control and Systems Engineering, Electrical and Electronic Engineering, Online algorithm, Energy harvesting, Energy (signal processing), Information Systems, Computer network, Data transmission - Abstract
Transmission is crucial for big data analysis and learning in industrial Internet of Things (IoT) systems, and transmitting data with limited energy is a challenge. This paper studies the problem of data transmission in energy harvesting systems in which a capacitor supplies energy and the energy receiving rate varies over time: the rate slows as the capacitor stores more energy. Based on this characteristic, we study how to transmit more data when the energy receiving time is not continuous. Given many packets arriving at different time instances, there is a tradeoff between transmitting a packet right away and saving the energy to transmit future packets. We formalize two problems. The first is how to minimize the total completion time when there is enough energy to transmit all the packets. The second is how to transmit as many packets as possible when the energy is not enough to transmit them all. For the first problem, we give a $1+\alpha$ approximation offline algorithm when all the information about the packets and the energy receiving periods is known in advance, and an online algorithm with competitive ratio $\max \lbrace 2,\beta \rbrace$ when the information is not known in advance. For the second problem, we study three cases and give a $6+\lceil \frac{h}{b/R} \rceil$ approximation offline algorithm for the general case. We also prove that no online algorithm with a constant competitive ratio exists.
- Published
- 2019
- Full Text
- View/download PDF
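To make the transmit-or-wait tradeoff in entry 36 concrete, here is a minimal Python sketch of a shortest-packet-first greedy scheduler. It assumes a toy linear recharge model and unit-energy-per-bit cost; the paper's capacitor recharges nonlinearly (more slowly as it fills), and its $1+\alpha$ and $\max\lbrace 2,\beta \rbrace$ algorithms are more involved than this heuristic.

```python
import heapq

def greedy_schedule(packets, power_cost=1.0, recharge=0.5, cap=10.0):
    """Shortest-packet-first heuristic for the transmit-or-wait tradeoff.

    packets  -- list of (arrival_time, size); sending size s takes s time
                and costs power_cost * s energy.
    recharge -- energy gained per unit time while idle, capped at `cap`
                (toy linear model; the paper's capacitor refills nonlinearly).
    Returns the completion time of each transmitted packet.
    """
    packets = sorted(packets)
    t, energy, i = 0.0, cap, 0
    ready, done = [], []
    while i < len(packets) or ready:
        while i < len(packets) and packets[i][0] <= t:
            heapq.heappush(ready, packets[i][1])
            i += 1
        if ready and energy >= power_cost * ready[0]:
            size = heapq.heappop(ready)   # send the shortest ready packet
            t += size
            energy -= power_cost * size
            done.append(t)
        else:
            # Wait for the next arrival and/or enough harvested energy.
            t_next = packets[i][0] if i < len(packets) else t
            if ready:
                t_need = t + (power_cost * ready[0] - energy) / recharge
                t_next = t_need if i >= len(packets) else min(t_next, t_need)
            energy = min(cap, energy + recharge * (t_next - t))
            t = t_next
    return done

print(greedy_schedule([(0.0, 4.0), (1.0, 2.0), (2.0, 8.0)]))  # [4.0, 6.0, 22.0]
```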
37. BridgeTaint: A Bi-Directional Dynamic Taint Tracking Method for JavaScript Bridges in Android Hybrid Applications
- Author
-
Yi Pan, Yan Qin, Weiping Wang, Junyang Bai, Jianxin Wang, and Shigeng Zhang
- Subjects
021110 strategic, defence & security studies ,Java ,Computer Networks and Communications ,business.industry ,Computer science ,Interoperability ,0211 other engineering and technologies ,02 engineering and technology ,JavaScript ,Communications security ,Embedded system ,Android (operating system) ,Safety, Risk, Reliability and Quality ,business ,computer ,Code injection attacks ,computer.programming_language - Abstract
Hybrid applications (apps) are becoming increasingly popular due to their cross-platform capabilities and high performance. These apps use the JavaScript (JS) bridge communication scheme to interoperate between native code and Web code. Although the bridge scheme greatly extends the functionality of hybrid apps by enabling cross-language invocations, it may also introduce new security issues, e.g., cross-language code injection attacks and privacy leaks. In this paper, we propose BridgeTaint, a bi-directional dynamic taint tracking method that can detect bridge security issues in hybrid apps. BridgeTaint tracks tainted data differently from existing approaches: it records the taint information of sensitive data when the data are transmitted through the bridge, and uses a cross-language taint mapping method to restore the taint tags of the corresponding data on the other side. This design enables BridgeTaint to track tainted data dynamically during app execution and to analyze hybrid apps developed with frameworks, which existing solutions based on static code analysis cannot do. Based on BridgeTaint, we implement the BridgeInspector tool to detect cross-language privacy leaks and code injection attacks in hybrid apps using JS bridges, and we develop a benchmark called BridgeBench for bridge communication security testing. The experimental results on BridgeBench and 1172 apps from the Android market demonstrate that BridgeInspector can effectively detect potential privacy leaks and cross-language code injection attacks in hybrid apps using bridge communications. (A toy illustration of cross-bridge taint mapping follows this entry.)
- Published
- 2019
- Full Text
- View/download PDF
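The cross-language taint mapping idea in entry 37 can be mocked in a few lines. BridgeTaint itself instruments the Android WebView bridge; the class and method names below are hypothetical, and the "bridge" here is just JSON serialization, so this only illustrates the record-tags-on-send, restore-tags-on-receive idea.

```python
import json

class BridgeTaintMap:
    """Toy cross-language taint map (all names hypothetical).

    When tainted data leaves one runtime through the bridge, its taint tags
    are recorded against a message id; when the peer runtime unpacks the
    message, the tags are restored onto the corresponding value.
    """
    def __init__(self):
        self._tags = {}        # message id -> set of taint tags
        self._next_id = 0

    def send(self, value, tags):
        """Sending side: serialize the value and remember its tags."""
        msg_id = self._next_id
        self._next_id += 1
        self._tags[msg_id] = set(tags)
        return json.dumps({"id": msg_id, "payload": value})

    def receive(self, message):
        """Receiving side: restore the tags recorded for this payload."""
        msg = json.loads(message)
        return msg["payload"], self._tags.pop(msg["id"], set())

# usage: the taint tag survives the (mock) native -> JS hop
bridge = BridgeTaintMap()
wire = bridge.send("device-imei-123", tags={"IMEI"})
value, tags = bridge.receive(wire)
assert "IMEI" in tags
```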
38. Hippocampal Segmentation in Brain MRI Images Using Machine Learning Methods: A Survey
- Author
-
Yi Pan, Jin Liu, Xu Tian, Wei Lan, and Rui Guo
- Published
- 2021
- Full Text
- View/download PDF
39. On Overcoming the Identified Limitations of a Usable PIN Entry Method
- Author
-
Jianqiang Li, Samrat Mondal, Fei Chen, Yi Pan, and Nilesh Chakraborty
- Subjects
General Computer Science ,Computer science ,observation-attack ,02 engineering and technology ,Computer security ,computer.software_genre ,Login ,USable ,Password strength ,Personal identification number ,0202 electrical engineering, electronic engineering, information engineering ,General Materials Science ,Protocol (object-oriented programming) ,Authentication ,key-logger-attack ,General Engineering ,PIN ,020206 networking & telecommunications ,defense ,Identification (information) ,human-intelligence-factor ,020201 artificial intelligence & image processing ,lcsh:Electrical engineering. Electronics. Nuclear engineering ,lcsh:TK1-9971 ,computer - Abstract
In the domain of password security, research has made significant progress in handling threats whose mitigation requires a human intelligence factor. In spite of strong theoretical foundations, most of these defense mechanisms cannot be used in practice because humans are limited in processing complex information; only a few proposals in this field have shown promise of being deployable. This paper focuses on one such method, proposed by Roth et al. in 2004, which provides adequate user-friendliness for entering a Personal Identification Number (PIN) securely in the presence of human shoulder surfers. Notably, the background algorithm that validates users' responses runs in linear time on a search space of cardinality 5, so the validation process puts little load on the authenticating device; such a human identification protocol can therefore also be integrated into IoT infrastructure for more secure client-side login. Although the method remained secure for almost ten years after its release, recent proposals have revealed some serious vulnerabilities in it. In this paper, we attempt to preserve this user-friendly form of authentication. We first discuss the importance of the targeted PIN entry method in the domain of usable security and give a brief overview of its identified limitations. We then take several initiatives to fix the identified vulnerabilities of Roth et al.'s proposal by revising its working principle, while leaving the login procedure and the usability standard of the method unaffected. (A sketch of the round-based PIN entry idea follows this entry.)
- Published
- 2019
- Full Text
- View/download PDF
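For readers unfamiliar with the scheme discussed in entry 39, the sketch below shows the round structure of a Roth-et-al.-style black/white PIN entry for a single digit: each round randomly splits the digits 0-9 into two colour groups, the user reports only the colour of their digit, and intersecting the matching groups narrows the candidates. This is a simplified illustration of the original 2004 idea, not the hardened variant the paper proposes.

```python
import random

def identify_digit(user_digit, rounds=4):
    """One digit of a black/white round-based PIN entry (sketch).

    Each round randomly splits 0-9 into two colour groups; the user only
    reports the colour of their digit. Intersecting the matching groups
    halves the expected candidates, so ~log2(10) = 4 rounds usually suffice.
    """
    candidates = set(range(10))
    for _ in range(rounds):
        black = set(random.sample(range(10), 5))
        answer = "black" if user_digit in black else "white"  # user's response
        candidates &= black if answer == "black" else set(range(10)) - black
    return candidates

print(identify_digit(7))  # typically {7}; run extra rounds if several remain
```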
40. A Narrowband Anti-Jamming Acquisition Algorithm Based on All-Phase Processing for BOC Signals
- Author
-
Zhang Tianqi, Gang Zhang, Zhongtao Luo, and Yi Pan
- Subjects
General Computer Science ,Finite impulse response ,Computer science ,020208 electrical & electronic engineering ,Fast Fourier transform ,General Engineering ,Phase (waves) ,acquisition ,020206 networking & telecommunications ,all-phase processing ,02 engineering and technology ,Filter (signal processing) ,Signal ,Narrowband ,Transmission (telecommunications) ,Binary offset carrier modulation ,0202 electrical engineering, electronic engineering, information engineering ,General Materials Science ,lcsh:Electrical engineering. Electronics. Nuclear engineering ,narrowband interference (NBI) ,lcsh:TK1-9971 ,Algorithm ,Binary offset carrier (BOC) signal - Abstract
The accurate transmission of a binary offset carrier (BOC) signal is often degraded by narrowband interference (NBI), and a novel anti-jamming acquisition algorithm based on an all-phase processing finite impulse response (apFIR) filter and a partially matched filter-all-phase fast Fourier transform (PMF-apFFT) is proposed in this paper. First, an apFIR filter with precisely placed, arbitrary notch points is constructed; the PMF-apFFT method is then applied to acquire the contaminated BOC signals. Simulation and analysis show that the proposed algorithm can effectively suppress NBI while completing a correct pseudo-random (PN) code phase search and an accurate Doppler frequency estimate, and it shows superior performance in mean square error (MSE) and detection probability. (A sketch of the all-phase FFT preprocessing follows this entry.)
- Published
- 2019
- Full Text
- View/download PDF
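As background for entry 40, the following NumPy sketch shows the textbook all-phase FFT (apFFT) preprocessing on which PMF-apFFT builds: 2N-1 samples are weighted by a triangular convolution window and folded into N points, so the phase at the peak bin equals the signal phase at the centre sample. The apFIR notch filter and the paper's full acquisition chain are omitted.

```python
import numpy as np

def apfft(x, N):
    """All-phase FFT preprocessing: takes 2N-1 samples centred on n = 0.

    The samples are weighted by a normalised triangular convolution window
    and folded modulo N, so the resulting N-point spectrum carries the
    signal's phase at the centre sample, independent of frame offset.
    """
    assert len(x) == 2 * N - 1
    w = np.convolve(np.ones(N), np.ones(N))   # triangular window, length 2N-1
    y = x * (w / w.sum())
    folded = y[N - 1:].copy()                 # samples n = 0 .. N-1
    folded[1:] += y[:N - 1]                   # fold in samples n = -(N-1) .. -1
    return np.fft.fft(folded)

# The phase of the peak bin recovers the carrier phase at the centre sample.
N = 64
n = np.arange(-(N - 1), N)
x = np.exp(1j * (2 * np.pi * 5.0 / N * n + 0.7))
print(np.angle(apfft(x, N)[5]))               # ~0.7
```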
41. Improving Alzheimer's Disease Classification by Combining Multiple Measures
- Author
-
Fang-Xiang Wu, Bin Hu, Jianxin Wang, Zhenjun Tang, Yi Pan, and Jin Liu
- Subjects
Male ,Databases, Factual ,Feature extraction ,Computed tomography ,02 engineering and technology ,computer.software_genre ,03 medical and health sciences ,Mri image ,0302 clinical medicine ,Alzheimer Disease ,Image Interpretation, Computer-Assisted ,0202 electrical engineering, electronic engineering, information engineering ,Genetics ,medicine ,Humans ,Cognitive Dysfunction ,Cognitive impairment ,Feature set ,Aged ,Mathematics ,Aged, 80 and over ,Multiple kernel learning ,medicine.diagnostic_test ,business.industry ,Applied Mathematics ,Brain ,Disease classification ,Pattern recognition ,Magnetic Resonance Imaging ,Female ,020201 artificial intelligence & image processing ,Artificial intelligence ,Data mining ,business ,Classifier (UML) ,computer ,Algorithms ,030217 neurology & neurosurgery ,Biotechnology - Abstract
Several anatomical magnetic resonance imaging (MRI) markers for Alzheimer's disease (AD) have been identified. Cortical gray matter volume, cortical thickness, and subcortical volume have been used successfully to assist the diagnosis of Alzheimer's disease, including its early warning and developing stages, e.g., mild cognitive impairment (MCI), comprising MCI converted to AD (MCIc) and MCI not converted to AD (MCInc). To date, these anatomical MRI measures have mainly been used separately, so the full potential of anatomical MRI scans for AD diagnosis may not yet have been exploited. Meanwhile, most studies focus either on morphological features of regions of interest (ROIs) or on interregional features, without combining the two. To further improve the diagnosis of AD, we propose a novel approach that extracts both ROI features and interregional features based on multiple measures from MRI images to distinguish AD, MCI (including MCIc and MCInc), and healthy controls (HC). First, we construct six individual networks for each subject based on six different anatomical measures (i.e., CGMV, CT, CSA, CC, CFI, and SV) and the Automated Anatomical Labeling (AAL) atlas. Then, from each individual network, we extract all node (ROI) features and edge (interregional) features, denoted as the node feature set and the edge feature set, respectively; this yields six node feature sets and six edge feature sets across the six anatomical measures. Next, the features within each feature set are ranked by $F$-score in descending order, and the top-ranked features of each set are fed to the MKBoost algorithm to obtain the best classification accuracy, which yields the optimal feature subset and the corresponding classifier for each node or edge feature set. Afterwards, to investigate the classification performance with node features alone, we propose a weighted multiple kernel learning (wMKL) framework that combines the six optimal node feature subsets into a single classifier for AD classification; the performance with edge features alone is obtained analogously. Finally, we combine all six optimal node feature subsets and six optimal edge feature subsets to further improve the classification performance. Experimental results show that the proposed method outperforms some state-of-the-art methods in AD classification and demonstrate that different measures contain complementary information. (A sketch of $F$-score feature ranking follows this entry.)
- Published
- 2018
- Full Text
- View/download PDF
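The $F$-score ranking step in entry 41 is a standard filter criterion for binary problems. A minimal NumPy sketch under the usual definition follows (not code from the paper; the data here are random placeholders):

```python
import numpy as np

def f_scores(X, y):
    """F-score of each feature for a binary labelling (sketch).

    F(i) = ((mean_pos_i - mean_i)^2 + (mean_neg_i - mean_i)^2)
           / (var_pos_i + var_neg_i), with unbiased class variances.
    A higher F means the feature separates the two classes better.
    """
    pos, neg = X[y == 1], X[y == 0]
    m, mp, mn = X.mean(0), pos.mean(0), neg.mean(0)
    denom = pos.var(0, ddof=1) + neg.var(0, ddof=1)
    return ((mp - m) ** 2 + (mn - m) ** 2) / denom

# rank features and keep the top k, as in the paper's per-feature-set step
X = np.random.randn(100, 20)
y = np.random.randint(0, 2, 100)
top_k = np.argsort(f_scores(X, y))[::-1][:5]
print(top_k)
```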
42. Tuning the Aggressive TCP Behavior for Highly Concurrent HTTP Connections in Intra-Datacenter
- Author
-
Jianer Chen, Tao Zhang, Jiawei Huang, Geyong Min, Yi Pan, and Jianxin Wang
- Subjects
Web server ,Computer Networks and Communications ,Computer science ,Transmission Control Protocol ,Network packet ,Distributed computing ,ComputerSystemsOrganization_COMPUTER-COMMUNICATIONNETWORKS ,Congestion window ,020206 networking & telecommunications ,02 engineering and technology ,computer.software_genre ,Computer Science Applications ,Packet loss ,0202 electrical engineering, electronic engineering, information engineering ,Overhead (computing) ,020201 artificial intelligence & image processing ,Electrical and Electronic Engineering ,computer ,Queue ,Software ,Data transmission - Abstract
Modern data centers host diverse Hypertext Transfer Protocol (HTTP) based services, which employ persistent Transmission Control Protocol (TCP) connections to send HTTP requests and responses. However, the ON/OFF pattern of HTTP traffic disturbs the growth of the TCP congestion window, potentially triggering packet loss at the beginning of an ON period, and transmission performance degrades further under the severe congestion caused by concurrent transfers of HTTP responses. In this paper, we provide the first extensive study of the root cause of the performance degradation of highly concurrent HTTP connections in data center networks. We further present the design and implementation of TCP-TRIM, which employs probe packets to smooth the aggressive growth of the congestion window in persistent TCP connections, and leverages congestion detection and control at the end host to limit the growth of the switch queue length under highly concurrent TCP connections. The experimental results of at-scale simulations and real implementations demonstrate that TCP-TRIM reduces the completion time of HTTP responses by up to 80%, while requiring deployment only at the end hosts, with little overhead. (A toy model of the restart-window choice follows this entry.)
- Published
- 2017
- Full Text
- View/download PDF
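Entry 42's core question is what congestion window a persistent connection should use when an ON period begins. The toy function below contrasts three policies; the "trim" rule here is the halve-per-idle-RTT decay of RFC 2861 congestion-window validation, used purely as a stand-in for smoother schemes, and is not TCP-TRIM's probe-based mechanism.

```python
def restart_cwnd(cwnd_before_off, idle_rtts, mode="trim", init_cwnd=10):
    """Toy policies for the congestion window at the start of an ON period.

    'burst' -- reuse the stale window: fast, but can overflow switch queues
               and cause the start-of-ON losses described in entry 42.
    'reset' -- fall back to the initial window: safe but wastes capacity.
    'trim'  -- decay the stale window by half per idle RTT (the RFC 2861
               rule, shown only as a stand-in for smoother schemes).
    """
    if mode == "burst":
        return cwnd_before_off
    if mode == "reset":
        return init_cwnd
    return max(init_cwnd, cwnd_before_off >> min(idle_rtts, 8))

print([restart_cwnd(256, 3, m) for m in ("burst", "reset", "trim")])  # [256, 10, 32]
```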
43. Alzheimer’s Disease Classification Based on Individual Hierarchical Networks Constructed With 3-D Texture Features
- Author
-
Yi Pan, Jianxin Wang, Fang-Xiang Wu, Jin Liu, and Bin Hu
- Subjects
Male ,0301 basic medicine ,Computer science ,Feature extraction ,Biomedical Engineering ,Pharmaceutical Science ,Medicine (miscellaneous) ,Bioengineering ,Sensitivity and Specificity ,Cross-validation ,Pattern Recognition, Automated ,03 medical and health sciences ,Imaging, Three-Dimensional ,0302 clinical medicine ,Neuroimaging ,Alzheimer Disease ,Connectome ,medicine ,Humans ,Dementia ,Cognitive Dysfunction ,Computer vision ,Electrical and Electronic Engineering ,Aged ,Multiple kernel learning ,business.industry ,Node (networking) ,Brain ,Reproducibility of Results ,Pattern recognition ,medicine.disease ,Magnetic Resonance Imaging ,Computer Science Applications ,030104 developmental biology ,Pattern recognition (psychology) ,Female ,Artificial intelligence ,Nerve Net ,Alzheimer's disease ,business ,Algorithms ,030217 neurology & neurosurgery ,Biotechnology - Abstract
Brain networks play an important role in representing abnormalities in Alzheimer's disease (AD) and mild cognitive impairment (MCI), where MCI comprises MCIc (MCI converted to AD) and MCInc (MCI not converted to AD). In our previous study, we proposed an AD classification approach based on individual hierarchical networks constructed with 3-D texture features of brain images; however, that approach used only the edge features of the networks and ignored the node features. In this paper, we propose a framework that combines multiple kernels to exploit both edge features and node features for AD classification. The proposed approach was evaluated with MRI images of 710 subjects (230 healthy controls (HC), 280 MCI (including 120 MCIc and 160 MCInc), and 200 AD) from the Alzheimer's Disease Neuroimaging Initiative database using ten-fold cross validation. Experimental results show that the proposed method is not only superior to existing AD classification methods but also efficient and promising for clinical diagnosis of AD via MRI images. Furthermore, the results indicate that 3-D texture can detect subtle texture differences between tissues in AD, MCI, and HC, and that texture features of MRI images may be related to the severity of AD cognitive impairment. These results suggest that 3-D texture is a useful aid in AD diagnosis. (A sketch of the multi-kernel combination follows this entry.)
- Published
- 2017
- Full Text
- View/download PDF
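The multi-kernel combination in entries 41 and 43 rests on the fact that a convex combination of valid kernel matrices is itself a valid kernel. A minimal sketch follows; the fixed weights here are placeholders, whereas a wMKL-style framework learns them.

```python
import numpy as np

def combine_kernels(kernels, weights):
    """Weighted sum of precomputed kernel matrices (sketch).

    A convex combination K = sum_i w_i * K_i of positive semidefinite
    kernels is itself a valid kernel, which is the basic fact that
    multiple-kernel methods (edge kernels + node kernels) build on.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                       # keep the combination convex
    return sum(wi * Ki for wi, Ki in zip(w, kernels))

# usage with any learner that accepts a precomputed kernel, e.g.:
# from sklearn.svm import SVC
# K = combine_kernels([K_edge, K_node], [0.6, 0.4])
# clf = SVC(kernel="precomputed").fit(K, y)
```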
44. Searching Genome-Wide Multi-Locus Associations for Multiple Diseases Based on Bayesian Inference
- Author
-
Jing Zhang, Ding-Zhu Du, Yi Pan, Zhipeng Cai, and Xuan Guo
- Subjects
0301 basic medicine ,0206 medical engineering ,Single-nucleotide polymorphism ,Locus (genetics) ,Genomics ,Genome-wide association study ,02 engineering and technology ,Computational biology ,Biology ,computer.software_genre ,Bayesian inference ,Polymorphism, Single Nucleotide ,03 medical and health sciences ,Genetics ,Humans ,SNP ,Genetic Predisposition to Disease ,Genetic association ,Applied Mathematics ,Computational Biology ,Bayes Theorem ,Epistasis, Genetic ,030104 developmental biology ,Epistasis ,Data mining ,computer ,Algorithms ,020602 bioinformatics ,Genome-Wide Association Study ,Biotechnology - Abstract
Taking advantage of high-throughput single nucleotide polymorphism (SNP) genotyping technology, large genome-wide association studies (GWASs) hold promise for unraveling the complex relationships between genotypes and phenotypes. Current multi-locus methods are insufficient to detect interactions with diverse genetic effects on diverse diseases. Moreover, statistical tests for high-order epistasis ($\geq 2$ SNPs) raise huge computational and analytical challenges, because the computation grows exponentially with the cardinality of the SNP combinations. In this paper, we provide a simple, fast, and powerful method, named DAM, that uses Bayesian inference to detect genome-wide multi-locus epistatic interactions in multiple diseases. Experimental results on simulated data demonstrate that our method is powerful and efficient. We also apply DAM to two GWAS datasets from the WTCCC, i.e., Rheumatoid Arthritis and Type 1 Diabetes, and identify some novel findings. We therefore believe that our method is suitable and efficient for the full-scale analysis of multi-disease-related interactions in GWASs. (A sketch of a Bayesian scoring function for SNP subsets follows this entry.)
- Published
- 2017
- Full Text
- View/download PDF
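Entry 44's search needs a score for how well a candidate SNP subset predicts disease status. A common Bayesian choice, sketched below, is the Dirichlet-multinomial log marginal likelihood of the case/control labels within each joint-genotype cell; this is a generic score in the spirit of DAM, not necessarily its exact scoring function.

```python
import numpy as np
from scipy.special import gammaln

def log_marginal(genotypes, labels, alpha=1.0):
    """Dirichlet-multinomial log marginal likelihood of a SNP subset (sketch).

    genotypes -- (n_samples, k) array with entries 0/1/2; each row is mapped
                 to one of 3^k joint-genotype cells.
    labels    -- binary disease status (0 = control, 1 = case).
    For each cell, the case/control counts contribute
    log Gamma(2a) - log Gamma(2a + n) + sum_j [log Gamma(a + n_j) - log Gamma(a)],
    the closed-form marginal under a symmetric Dirichlet(a, a) prior.
    """
    k = genotypes.shape[1]
    cells = (genotypes * (3 ** np.arange(k))).sum(1)
    score = 0.0
    for c in np.unique(cells):
        counts = np.bincount(labels[cells == c], minlength=2)
        score += (gammaln(2 * alpha) - gammaln(2 * alpha + counts.sum())
                  + np.sum(gammaln(alpha + counts) - gammaln(alpha)))
    return score

# usage (hypothetical data): higher scores favour more predictive SNP subsets
g = np.random.randint(0, 3, size=(200, 2))
y = np.random.randint(0, 2, 200)
print(log_marginal(g, y))
```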
45. Analysis of an improved acquisition method for high-dynamic BOC signal
- Author
-
Zhang Tianqi, Gang Zhang, Yi Pan, and Zhongtao Luo
- Subjects
Computer science ,0202 electrical engineering, electronic engineering, information engineering ,020206 networking & telecommunications ,020201 artificial intelligence & image processing ,02 engineering and technology ,Signal
- Published
- 2016
- Full Text
- View/download PDF
46. A New Method for Predicting Protein Functions From Dynamic Weighted Interactome Networks
- Author
-
Yaohang Li, Jianxin Wang, Min Li, Bihai Zhao, Fang-Xiang Wu, Xueyong Li, and Yi Pan
- Subjects
0301 basic medicine ,Saccharomyces cerevisiae Proteins ,0206 medical engineering ,Biomedical Engineering ,Pharmaceutical Science ,Medicine (miscellaneous) ,Bioengineering ,02 engineering and technology ,Biology ,computer.software_genre ,Interactome ,Protein–protein interaction ,Domain (software engineering) ,03 medical and health sciences ,Annotation ,Protein Interaction Mapping ,Protein function prediction ,Protein Interaction Maps ,Electrical and Electronic Engineering ,Databases, Protein ,Computational Biology ,Construct (python library) ,Function (mathematics) ,Protein engineering ,Computer Science Applications ,ComputingMethodologies_PATTERNRECOGNITION ,030104 developmental biology ,Data mining ,computer ,Algorithms ,020602 bioinformatics ,Biotechnology - Abstract
Automated annotation of protein function is challenging: as the number of sequenced genomes grows rapidly, the overwhelming majority of proteins can only be annotated computationally. Under new conditions or stimuli, not only the number and location of proteins change, but also their interactions. This dynamic character of protein interactions, however, is not considered by existing function prediction algorithms. Taking the dynamic nature of protein interactions into consideration, we construct a dynamic weighted interactome network (DWIN) by integrating a protein-protein interaction (PPI) network with time-course gene expression data, protein domain information, and protein complex information. We then propose a new approach that predicts protein functions from the constructed dynamic weighted interactome network: for an unknown protein, the method visits the dynamic networks at different time points, scores the functions contributed by all neighbors, and selects the top N candidate functions to annotate the protein. Experiments on PPI datasets were conducted to evaluate the effectiveness of the proposed approach in predicting unknown protein functions; the evaluation results demonstrate that it outperforms competing methods. (A sketch of neighbor-vote scoring over a dynamic network follows this entry.)
- Published
- 2016
- Full Text
- View/download PDF
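The neighbor-based scoring in entry 46 can be sketched as a weighted vote repeated over the time points of the dynamic network. The data layout below (dicts for networks, sets for annotations) is an assumption for illustration, not the paper's implementation.

```python
from collections import Counter

def predict_functions(protein, dwin, annotations, top_n=3):
    """Neighbor-vote function prediction over a dynamic weighted network (sketch).

    dwin        -- list of networks, one per time point; each is a dict
                   mapping protein -> {neighbor: edge_weight}.
    annotations -- dict mapping protein -> set of known function terms.
    Each neighbor at each time point votes for its functions with its edge
    weight; the top_n highest-scoring terms are returned, in the spirit of
    (not identical to) the paper's DWIN-based scheme.
    """
    scores = Counter()
    for net in dwin:
        for nbr, w in net.get(protein, {}).items():
            for term in annotations.get(nbr, ()):
                scores[term] += w
    return [term for term, _ in scores.most_common(top_n)]

nets = [{"P1": {"P2": 0.9, "P3": 0.4}}, {"P1": {"P3": 0.8}}]
ann = {"P2": {"GO:0006915"}, "P3": {"GO:0008150"}}
print(predict_functions("P1", nets, ann, top_n=2))  # ['GO:0008150', 'GO:0006915']
```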
47. Inferring Metabolite-disease Association Using Graph Convolutional Networks
- Author
-
Xiujuan Lei, Yi Pan, and Jiaojiao Tie
- Subjects
Computational model ,business.industry ,Applied Mathematics ,Node (networking) ,Association (object-oriented programming) ,Computational Biology ,Pattern recognition ,Fingerprint recognition ,Cross-validation ,Similarity (network science) ,Genetics ,Artificial intelligence ,Noise (video) ,business ,Algorithms ,Heterogeneous network ,Biotechnology - Abstract
Biological experiments are time-consuming and laborious, so effective computational models can substantially reduce the experimental burden. Most existing computational models rely on biological similarity and network-based methods that cannot exploit the topological structure of the metabolite-disease association graph. We propose a novel method based on graph convolutional networks to infer potential metabolite-disease associations, named MDAGCN. We first calculate three kinds of metabolite similarity and three kinds of disease similarity; the final similarity for each metabolite and each disease is obtained by integrating its three similarities and filtering out noisy similarity values. The metabolite similarity network, the disease similarity network, and the known metabolite-disease association network are then used to construct a heterogeneous network. Finally, this information-rich heterogeneous network is fed into the graph convolutional networks, which aggregate information over each node's neighborhood to obtain new node features, from which the potential associations between metabolites and diseases are inferred. Experimental results show that MDAGCN achieves more reliable results in cross validation and case studies than other existing methods. (A sketch of a single graph-convolution layer follows this entry.)
- Published
- 2021
- Full Text
- View/download PDF
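Entry 47's model is built from graph-convolution layers. A single layer in the standard Kipf-Welling form, H' = ReLU(D^{-1/2}(A+I)D^{-1/2} H W), can be sketched in NumPy as follows; MDAGCN's actual architecture, similarity fusion, and training loop are not reproduced here.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer, H' = ReLU(D^-1/2 (A+I) D^-1/2 H W) (sketch).

    A -- (n, n) adjacency of the heterogeneous network
    H -- (n, d) node features; W -- (d, d') trainable weights
    Each node's new feature aggregates its neighbors' (and its own) features,
    which is how GCN-based models learn from the association graph.
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # symmetric normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# usage: layers stack like any neural layer, H2 = gcn_layer(A, H1, W1), ...
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
H = np.random.randn(3, 4)
W = np.random.randn(4, 2)
print(gcn_layer(A, H, W).shape)               # (3, 2)
```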
48. Guest Editors’ Introduction to the Special Section on Bioinformatics Research and Applications
- Author
-
Yi Pan, Ion I. Mandoiu, and Alexander Zelikovsky
- Subjects
Phylogenetic inference ,business.industry ,Computer science ,Applied Mathematics ,Kernel canonical correlation analysis ,Genetics ,Special section ,Review process ,business ,Bioinformatics ,Health informatics ,Biotechnology - Abstract
This special section includes a selection of papers presented at the Eighth International Symposium on Bioinformatics Research and Applications (ISBRA), which was held in Dallas, Texas, on 21-23 May 2012. The ISBRA symposium provides a forum for the exchange of ideas and results among researchers, developers, and practitioners working on all aspects of bioinformatics and computational biology and their applications. In 2012, 66 papers were submitted in response to the call for papers, of which 26 appeared in the ISBRA proceedings published as volume 7292 of Springer's Lecture Notes in Bioinformatics series. Extended versions of nine symposium papers were invited and accepted for publication in this special section following a rigorous review process. The selected papers cover a broad range of bioinformatics topics, including biological networks, computational complexity of problems in structural biology and genomics, and phylogenetic inference and analysis. Herein, we briefly introduce each of them.
- Published
- 2019
- Full Text
- View/download PDF
49. Guest Editorial Special Issue on Advanced Computational Technologies in Mobile Edge Computing for the Internet of Things
- Author
-
Hsiao-Hwa Chen, Vincenzo Piuri, Yi Pan, and Jong Hyuk Park
- Subjects
Mobile edge computing ,Emergency management ,Computer Networks and Communications ,Computer science ,business.industry ,Big data ,Cloud computing ,Context (language use) ,Energy consumption ,Network interface ,Computer security ,computer.software_genre ,Computer Science Applications ,Hardware and Architecture ,Signal Processing ,Scalability ,The Internet ,business ,Internet of Things ,computer ,Information Systems - Abstract
Nowadays, we interact with many different smart devices in our daily lives, for different purposes and in different contexts. Most of these devices are connected to the Internet and are therefore commonly referred to as the Internet of Things (IoT). Mobile edge computing (MEC) has recently emerged as a technique that moves computing and storage resources from the cloud to the edge of the network. MEC supports IoT devices to improve their efficiency and scalability; helps to reduce latency for real-time applications, bandwidth bottlenecks, and energy consumption; and delivers contextual information processing. MEC offers many features and capabilities, such as access to a multitude of network interfaces (from 4G and 5G to Wi-Fi), support for device mobility, device context, geolocation awareness, and geographical distribution. Such attributes can support the real-time processing requirements of Internet of Everything applications, such as patient care, disaster management and detection (e.g., earthquakes), and flood monitoring. However, to fully exploit the potential of MEC in IoT applications, many challenges must be addressed: issues related to IoT big data, effective management of data storage and computing, privacy and security concerns, and innovative and emerging communication paradigms (e.g., 5G) all require new architectures, applications, and methods.
- Published
- 2019
- Full Text
- View/download PDF
50. Distributed Strain and Vibration Sensing System Based on Phase-Sensitive OTDR
- Author
-
Xizhang Wang, Xin-hua Zhang, Zhenqing Sun, J. Hua, Yi Pan, Ling Zhou, and Fengqiu Wang
- Subjects
Materials science ,Optical fiber ,business.industry ,Physics::Optics ,Polarization-maintaining optical fiber ,Optical time-domain reflectometer ,Distributed acoustic sensing ,Graded-index fiber ,Atomic and Molecular Physics, and Optics ,Electronic, Optical and Magnetic Materials ,law.invention ,Optics ,Fiber Bragg grating ,law ,Fiber optic sensor ,Dispersion-shifted fiber ,Electrical and Electronic Engineering ,business - Abstract
A system based on phase-sensitive optical time domain reflectometry is proposed for simultaneous strain and vibration sensing. The strain of the fiber is detected by comparing the backscatter signal patterns obtained at different laser frequencies, while the vibration of the fiber is detected simultaneously from the signals at any single laser frequency. During the measurement, the frequencies of the probe optical pulses are modulated sequentially in ascending or descending order. Using the signals generated by optical pulses of the same frequency, fiber vibration is detected with a fast response; using signals at different frequencies, fiber strain is detected with high resolution. In our experiment, a sensing system with 2-m spatial resolution, a vibration frequency measurement range of up to 1 kHz, and 10-nε strain resolution is realized over a 9-km sensing fiber length. (A sketch of the frequency-shift estimation follows this entry.)
- Published
- 2015
- Full Text
- View/download PDF
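The strain measurement in entry 50 amounts to finding how far the backscatter pattern at a given fiber position has shifted along the laser-frequency axis between two scans. A minimal cross-correlation sketch follows; the array layout and the linear strain-to-shift conversion are assumptions for illustration, not the paper's processing chain.

```python
import numpy as np

def frequency_shift(trace_ref, trace_new, freq_step_hz):
    """Estimate the laser-frequency shift that realigns two backscatter patterns.

    In frequency-scanned phase-sensitive OTDR, strain shifts the backscatter
    pattern along the laser-frequency axis; cross-correlating the pattern
    before and after loading gives the shift, which is proportional to strain.
    trace_* -- (n_freqs,) backscatter power at one fiber position across the scan.
    """
    xc = np.correlate(trace_new - trace_new.mean(),
                      trace_ref - trace_ref.mean(), mode="full")
    lag = np.argmax(xc) - (len(trace_ref) - 1)   # lag in scan steps
    return lag * freq_step_hz                    # shift in Hz; strain ~ shift / coeff

# usage with synthetic data: a pattern shifted by 3 scan steps of 1 MHz
rng = np.random.default_rng(0)
ref = rng.standard_normal(200)
new = np.roll(ref, 3)
print(frequency_shift(ref, new, 1e6))            # ~3e6 Hz
```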