177 results for "She, Yang"
Search Results
2. Phase‐shift Perfluoropentane Nanoemulsions Enhance Pulsed High‐intensity Focused Ultrasound Ablation in an Isolated Perfused Liver System and Their Potential Value for Cancer Therapy
- Author
-
Xu Chao, Feng Wu, Jianzhong Zou, Guo-Guan Wang, Lu-Yan Zhao, and Bing-She Yang
- Subjects
Fluorocarbons ,Necrosis ,Radiological and Ultrasound Technology ,business.industry ,medicine.medical_treatment ,Ultrasound ,Ablation ,High-intensity focused ultrasound ,PLGA ,chemistry.chemical_compound ,Liver ,chemistry ,Neoplasms ,Perfused liver ,medicine ,Animals ,High-Intensity Focused Ultrasound Ablation ,Radiology, Nuclear Medicine and imaging ,Rabbits ,medicine.symptom ,business ,Nuclear medicine ,Ultrasound energy ,Saline - Abstract
PURPOSE To investigate whether phase-shift perfluoropentane (PFP) nanoemulsions can enhance pulsed high-intensity focused ultrasound (HIFU) ablation. METHODS PFP was encapsulated by poly(lactic-co-glycolic acid) (PLGA) to form a nanometer-sized droplet (PLGA-PFP), which was added to an isolated perfused liver system. Meanwhile, phosphate-buffered saline (PBS) was used as a control. The perfused liver was exposed to HIFU (150 W, t = 3/5/10 s) at various duty cycles (DCs). The ultrasound images, cavitation emissions, and temperature were recorded. Rabbits with subcutaneous VX2 tumors were exposed to HIFU (150 W) at various DCs with or without PLGA-PFP. After ablation, necrosis volume and energy efficiency factor were calculated. Pathologic characteristics were observed. RESULTS Compared to the PBS control, PLGA-PFP nanoemulsions markedly enhanced HIFU-induced necrosis volume in both perfused livers and subcutaneous VX2 tumor-bearing rabbits (P
- Published
- 2021
3. Fusing wearable and remote sensing data streams by fast incremental learning with swarm decision table for human activity recognition
- Author
-
Simon Fong, Kelvin K. L. Wong, Ying Wu, Xuqi Li, Tengyue Li, and Xin-She Yang
- Subjects
Computer science ,business.industry ,Data stream mining ,Wearable computer ,Swarm behaviour ,020206 networking & telecommunications ,Feature selection ,Ranging ,02 engineering and technology ,Sensor fusion ,Activity recognition ,Hardware and Architecture ,Signal Processing ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Computer vision ,Artificial intelligence ,Decision table ,business ,Software ,Information Systems - Abstract
Human activity recognition (HAR) by machine learning finds wide application, ranging from posture monitoring for healthcare and rehabilitation to the detection of suspicious or dangerous actions for security surveillance. Infrared cameras such as the Microsoft Kinect and wearable sensors have been the two most widely adopted devices for collecting data on bodily movements. These two types of sensors are generally categorized as contactless sensing and contact sensing, respectively. Due to hardware limitations, each sensor type has its inherent shortcomings. The most common problem associated with contactless sensing such as Kinect is the distance and indirect angle between the camera and the subject, while wearable sensors are limited in recognizing complex human activities. In this paper, a novel data fusion framework is proposed for combining data collected from both sensors with the aim of enhancing HAR accuracy. Kinect is able to capture details of bodily movements from complex activities, but its accuracy depends heavily on the angle of view; a wearable sensor is relatively primitive in gathering spatial data but reliable for detecting basic movements. Fusing the data from the two sensor types enables them to complement each other through their unique strengths. In particular, a new scheme using incremental learning with a decision table coupled with swarm-based feature selection is proposed in our framework for achieving fast and accurate HAR by fusing the data of the two sensors. Our experimental results show that HAR accuracy could be improved from 23.51% to 68.35% in a case of an almost 90-degree slanted view of Kinect sensing while a wearable sensor is used at the same time. The swarm-based feature selection is shown, in general, to enhance HAR performance compared to a standard feature selection method. The experimental results reported here contribute to the possibilities of using hybridized sensors from a machine learning perspective.
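As a rough illustration of the feature-level fusion and incremental learning described above, the sketch below concatenates per-window Kinect skeleton features with wearable-sensor features and trains an incremental classifier in mini-batches. It is a minimal, hypothetical example using scikit-learn's SGDClassifier as a stand-in rather than the swarm decision table proposed in the paper; the feature dimensions and window sizes are assumptions, not taken from the original work.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

def fuse_windows(kinect_feats, wearable_feats):
    """Concatenate per-window feature vectors from the two sensor streams."""
    return np.hstack([kinect_feats, wearable_feats])

# Hypothetical streaming data: 5 activity classes, batches of 32 windows.
rng = np.random.default_rng(0)
classes = np.arange(5)
clf = SGDClassifier()  # incremental learner stand-in

for batch in range(20):                      # simulated data-stream batches
    kinect = rng.normal(size=(32, 60))       # e.g., 20 joints x 3 coordinates
    wearable = rng.normal(size=(32, 6))      # e.g., 3-axis accel + gyro statistics
    y = rng.choice(classes, size=32)
    X = fuse_windows(kinect, wearable)
    clf.partial_fit(X, y, classes=classes)   # fast incremental update

print("last-batch training accuracy:", clf.score(X, y))
```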
- Published
- 2020
4. Data-Driven Optimization for Transportation Logistics and Smart Mobility Applications [Guest Editorial]
- Author
-
Joshue Perez Rastelli, Javier Del Ser, Javier J. Sánchez Medina, Xin-She Yang, Eneko Osaba, Eleni I. Vlahogianni, and Antonio D. Masegosa
- Subjects
Government ,Technological change ,business.industry ,Mechanical Engineering ,Post-industrial society ,Environmental economics ,Natural resource ,Computer Science Applications ,Information and Communications Technology ,Sustainable management ,Public transport ,Automotive Engineering ,Business ,Social capital - Abstract
The articles in this special section focus on data driven optimization for transportation and smart mobility applications. We live in an era of major societal and technological changes. Transportation de-carbonization and postindustrial demographic trends, such as massive migrations and an aging society, generate new challenges for cities, making the efficient and sustainable management of services and resources more necessary than ever. Cities must evolve, transform, and become smart to cope with these realities. According to the literature, a city can be referred to as smart when investments in human and social capital and traditional (transportation) and modern [information and communications technology (ICT)] communication infrastructure fuel sustainable economic growth and high quality of life, with a wise management of natural resources, through participatory government.
- Published
- 2020
5. White Learning: A White-Box Data Fusion Machine Learning Framework for Extreme and Fast Automated Cancer Diagnosis
- Author
-
Xingshi He, Xin-She Yang, Tengyue Li, Jinan Fiaidhi, Simon Fong, Sabah Mohammed, and Liansheng Liu
- Subjects
Artificial neural network ,business.industry ,Computer science ,Deep learning ,Conditional probability ,02 engineering and technology ,Machine learning ,computer.software_genre ,Computer Science Applications ,Data modeling ,Hardware and Architecture ,020204 information systems ,Black box ,0202 electrical engineering, electronic engineering, information engineering ,Artificial intelligence ,White box ,Hidden Markov model ,business ,computer ,Software ,Interpretability - Abstract
With deep learning as a data modeling tool, it is hard to understand how a predicted result arises from the model's inner workings; it is generally regarded as a black box and is not interpretable. Often in medical applications, physicians need to understand why a model predicts a result. A Bayesian network (BN), on the other hand, is a probabilistic graph in which nodes represent the variables and arcs represent the conditional dependences between the variables. In this article, a white learning framework is proposed that advocates three levels of fusing black-box deep learning with white-box BN, offering both predictive power and interpretability. A case of breast cancer classification is conducted as an experiment. From the results, it is observed that white learning, which combines black-box and white-box machine learning, has an edge in performance over either BN alone or deep learning alone. The white learning framework has the benefits of interpretability and high predictive power, making it suitable for critical decision-making tasks where a reliable prediction is as important as knowing how the outcome is predicted. The predicted output generated by white learning can be traced back via the conditional probability at each node. It is therefore anticipated that in the future, especially in the medical domain, white learning, which has the benefits of both black-box and white-box learning, will be highly valued and grow in popularity.
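For context on the traceability claim above, a Bayesian network factorizes the joint distribution of its variables into local conditional probabilities, one per node given its parents, which is what allows a prediction to be traced node by node. This is the standard textbook factorization, not a formula taken from the article itself:

\[
P(X_1, X_2, \ldots, X_n) = \prod_{i=1}^{n} P\big(X_i \mid \mathrm{Pa}(X_i)\big),
\]

where \(\mathrm{Pa}(X_i)\) denotes the parent nodes of \(X_i\) in the directed acyclic graph.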
- Published
- 2019
6. Utilizing Biology-Guided Radiotherapy for Coronary Artery Avoidance During Free-Breathing External Beam Radiation Delivery
- Author
-
O.M. Oderinde, A. Da Silva, Sibo Tian, T. Cornwell, Xin-She Yang, M. Owens, Shervin M. Shirvani, and Kristin Higgins
- Subjects
Cancer Research ,Radiation ,business.industry ,medicine.medical_treatment ,External beam radiation ,medicine.disease ,Coronary arteries ,Radiation therapy ,medicine.anatomical_structure ,Left coronary artery ,Oncology ,Right coronary artery ,medicine.artery ,medicine ,Radiology, Nuclear Medicine and imaging ,Radiation treatment planning ,Nuclear medicine ,business ,Lung cancer ,Artery - Abstract
PURPOSE/OBJECTIVE(S) Radiation-induced cardiotoxicity is associated with the dose delivered to the coronary arteries (CA). Breath-hold and respiratory gating may help reduce CA dose in some individuals, but patients with cardiopulmonary comorbidities may not be candidates for these strategies. Biology-guided radiotherapy (BgRT) uses outgoing tumor PET emissions to deliver a tracked dose distribution to a moving target during the normal breathing cycle, which may reduce dose to sensitive structures like the CA without the need for additional motion management techniques. To understand dosimetric implications for the CA, we conducted a planning study utilizing the RefleXion X1 to develop plans for lung tumors in the biology-guided mode. MATERIALS/METHODS Right coronary artery (RCA) and left coronary artery (LCA) branches were delineated in three lung cancer patients (P1 = left upper lung tumor, P2 = right upper lung tumor, and P3 = right lower lung tumor). BgRT plans were created with the AAPM TG-101 50 Gy in 5-fraction protocol using a research version of the RefleXion treatment planning system (TPS), and dose-volume parameters of the heart, RCA, and LCA were analyzed. Using the Pearson correlation model, the correlations between (1) the mean heart dose (MHD) and tumor location, (2) the MHD and the RCA dose, (3) the MHD and the LCA dose, and (4) the tumor-to-ipsilateral/contralateral branches and the mean coronary dose were evaluated. RESULTS The BgRT-PTVs were 27.3, 14, and 66.7 cm3 for P1, P2, and P3, respectively. The average MHD was 2.15 Gy (range: 0.48-5.03 Gy). The MHD was highly correlated with the tumor location (R = -0.99, P = 0.037). The RCA and LCA received an average mean dose of 1.46 Gy (range: 0.3-3.21 Gy) and 1.01 Gy (range: 0.75-1.45 Gy), respectively. The RCA branches in P1, P2, and P3 received a V1 (volume receiving 1 Gy) of 0.50%, 40.70%, and 91.10%, respectively. The LCA branches in P1, P2, and P3 received a V1 of 45.70%, 37.20%, and 58.00%, respectively. The LCA V15 was less than 10% in all cases, meeting the constraint recently proposed by Atkins et al. The MHD had a correlation coefficient of 1.00 (P = 0.06) and 0.98 (P = 0.14) with the mean RCA and LCA doses, respectively. There were no significant correlations between the target-to-ipsilateral/contralateral branches and the mean coronary dose. CONCLUSION Tracked dose distributions utilizing biology guidance were associated with robust CA avoidance without the requirement for gating or breath-holding maneuvers. This observation also held for MHD, which was correlated with CA dose in this investigation. AUTHOR DISCLOSURE O.M. Oderinde: Stock Options; RefleXion Medical. T. Cornwell: Stock Options; RefleXion Medical. M. Owens: Stock Options; RefleXion Medical. S. Tian: None. X. Yang: None. K.A. Higgins: Research Grant; RefleXion Medical. Consultant; AstraZeneca, Varian, Precisca. Advisory Board; Genentech; NRG Oncology. A. Da Silva: None. S.M. Shirvani: Employee; Sutter Health. Stock; RefleXion Medical. Stock Options; RefleXion Medical. Manage clinical, medical and scientific affairs; RefleXion Medical.
- Published
- 2021
7. Dosimetric Comparison of Single-Isocenter and Multiple-Isocenter Techniques for Two-Lesion Lung SBRT Using the RefleXion High-Speed Ring-Gantry System
- Author
-
O.M. Oderinde, Sibo Tian, Y. Voronenko, Xin-She Yang, A. Da Silva, Shervin M. Shirvani, and Kristin Higgins
- Subjects
Cancer Research ,Radiation ,Lung ,business.industry ,Isocenter ,Collimator ,law.invention ,Target dose ,Lesion ,medicine.anatomical_structure ,Oncology ,law ,Medicine ,Dosimetry ,Radiology, Nuclear Medicine and imaging ,medicine.symptom ,business ,Nuclear medicine ,Radiation treatment planning ,Revolutions per minute - Abstract
PURPOSE/OBJECTIVE(S) Due to the technical challenges that accompany stereotactic body radiotherapy (SBRT) to multiple lesions, a single-isocenter treatment planning technique for multi-target scenarios may optimize the final, aggregate dose distribution. The RefleXion™ X1 is a new ring-gantry based system with several features that may optimize dosimetry in a multi-target SBRT scenario, including high-speed rotation (60 revolutions per minute), a high-speed multi-leaf collimator (100 Hz), and a variable-pitch couch. A comprehensive dosimetric comparison of a multiple-isocenter versus a single-isocenter technique for multiple targets has not been performed. Therefore, this investigation performed such a comparison in two-lesion lung SBRT cases. MATERIALS/METHODS Three patients with two synchronous lung lesions, each with a maximum dimension of 5 cm, underwent single-isocenter and two-isocenter planning with the RefleXion treatment planning system using a Collapsed Cone Convolution Superposition algorithm. Treatment doses were in accordance with AAPM TG-101 criteria (50 Gy in 5 fractions) and planned such that 95% of the planning target volume (PTV) would receive at least 95% of the prescription dose. In the single-isocenter plans, the isocenter was placed between the two lesions, whereas the two-isocenter plans localized the isocenters on each target with a limitation in the IEC-X direction to avoid couch-bore collisions. Conformity index (CI), homogeneity index (HI), R50%, and D2cm were calculated. The volume of normal lung receiving 5 Gy (V5), V10, V15, V20, the mean lung dose, and dose to other OARs were evaluated. In addition, the treatment time to deliver each plan was recorded. RESULTS The mean isocenter-to-tumor distance in XYZ space was 8.53 cm (range: 7.66-10.62 cm). The mean combined PTV was 42.47 cc (range: 9.7-97.8 cc). There was no significant difference in target coverage, maximum target dose, CI, HI, R50%, V20, V15, or V10 between single-isocenter and two-isocenter lung SBRT plans. On average, single-isocenter plans increased OAR dose, especially to the carina and esophagus, by 5.17% (range: 2.91-6.52%) and 15.62% (range: 6.44-22.97%), respectively. With the RefleXion X1 set at 850 MU/min, the average difference in total beam-on time was approximately 33 seconds between the two approaches. However, set-up time for the second isocenter in the two-isocenter plan would likely prolong the total treatment delivery time. CONCLUSION This study shows that a single-isocenter technique for the scenario of two lung lesions is dosimetrically feasible and clinically efficient with the RefleXion X1 in SBRT mode. Single-isocenter treatment delivery techniques may be a desirable option for efficient radio-ablative therapies directed at multiple metastatic targets.
- Published
- 2021
8. Simulated Annealing
- Author
-
Xin-She Yang
- Subjects
0209 industrial biotechnology ,Materials processing ,Materials science ,business.industry ,Annealing (metallurgy) ,02 engineering and technology ,01 natural sciences ,Crystal ,Condensed Matter::Materials Science ,Random search ,010104 statistics & probability ,020901 industrial engineering & automation ,Simulated annealing ,0103 physical sciences ,0202 electrical engineering, electronic engineering, information engineering ,Optoelectronics ,020201 artificial intelligence & image processing ,0101 mathematics ,business ,010306 general physics ,Global optimization - Abstract
Simulated annealing (SA) is a trajectory-based, random search technique for global optimization. It mimics the annealing process in materials processing when a metal cools and freezes into a crystalline state with minimum energy and larger crystal sizes so as to reduce the defects in metallic structures. The annealing process involves the careful control of temperature and its cooling schedule. SA has been successfully applied in many areas.
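To make the trajectory-based search described above concrete, here is a minimal simulated annealing sketch with a geometric cooling schedule and the usual Metropolis acceptance rule. It is a generic illustration (the objective, step size and cooling rate are arbitrary choices), not code from the chapter.

```python
import math
import random

def simulated_annealing(f, x0, T0=1.0, alpha=0.95, steps=2000, sigma=0.1):
    """Minimize f starting from x0 using a geometric cooling schedule."""
    x, fx, T = x0, f(x0), T0
    best, fbest = x, fx
    for _ in range(steps):
        cand = [xi + random.gauss(0.0, sigma) for xi in x]    # random move
        fc = f(cand)
        # Metropolis criterion: accept improvements, sometimes accept worse moves.
        if fc < fx or random.random() < math.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        T *= alpha                                             # cool down
    return best, fbest

# Example: minimize the 2-D sphere function.
print(simulated_annealing(lambda v: sum(t * t for t in v), [2.0, -3.0]))
```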
- Published
- 2021
9. Data Mining and Deep Learning
- Author
-
Xin-She Yang
- Subjects
Computer science ,business.industry ,Deep learning ,Artificial intelligence ,business ,Data science - Published
- 2021
10. How is artificial intelligence applied in solid tumor imaging?
- Author
-
Jian-She Yang and Qiang Wang
- Subjects
Computer science ,business.industry ,Artificial intelligence ,Solid tumor ,business - Published
- 2021
11. COEBA: A Coevolutionary Bat Algorithm for Discrete Evolutionary Multitasking
- Author
-
Andrés Iglesias, Javier Del Ser, Eneko Osaba, Akemi Gálvez, and Xin-She Yang
- Subjects
FOS: Computer and information sciences ,0209 industrial biotechnology ,Optimization problem ,Computer science ,Computer Science - Artificial Intelligence ,Evolutionary algorithm ,Context (language use) ,02 engineering and technology ,Travelling salesman problem ,Evolutionary computation ,Article ,Bat algorithm ,Traveling salesman problem ,020901 industrial engineering & automation ,Evolutionary multitasking ,0202 electrical engineering, electronic engineering, information engineering ,Human multitasking ,Neural and Evolutionary Computing (cs.NE) ,Metaheuristic ,business.industry ,Process (computing) ,Computer Science - Neural and Evolutionary Computing ,Multifactorial optimization ,Artificial Intelligence (cs.AI) ,Transfer optimization ,020201 artificial intelligence & image processing ,Artificial intelligence ,business - Abstract
Multitasking optimization is an emerging research field which has attracted a lot of attention in the scientific community. The main purpose of this paradigm is to solve multiple optimization problems or tasks simultaneously by conducting a single search process. The main catalyst for reaching this objective is to exploit possible synergies and complementarities among the tasks to be optimized, helping each other by virtue of the transfer of knowledge among them (thereby being referred to as Transfer Optimization). In this context, Evolutionary Multitasking addresses Transfer Optimization problems by resorting to concepts from Evolutionary Computation for simultaneously solving the tasks at hand. This work contributes to this trend by proposing a novel algorithmic scheme for dealing with multitasking environments. The proposed approach, coined the Coevolutionary Bat Algorithm, finds its inspiration in concepts from both co-evolutionary strategies and the metaheuristic Bat Algorithm. We compare the performance of our proposed method with that of its Multifactorial Evolutionary Algorithm counterpart over 15 different multitasking setups, composed of eight reference instances of the discrete Traveling Salesman Problem. The experimentation and results stemming therefrom support the main hypothesis of this study: the proposed Coevolutionary Bat Algorithm is a promising metaheuristic for solving Evolutionary Multitasking scenarios. (13 pages, 0 figures; paper submitted and accepted at the 11th workshop on Computational Optimization, Modelling and Simulation (COMS 2020), part of the International Conference on Computational Science (ICCS 2020).)
- Published
- 2020
12. Neighborhood Information-based Probabilistic Algorithm for Network Disintegration
- Author
-
Qian Li, Sanyang Liu, and Xin-She Yang
- Subjects
FOS: Computer and information sciences ,0209 industrial biotechnology ,Computer science ,Computer Science - Artificial Intelligence ,Distributed computing ,02 engineering and technology ,020901 industrial engineering & automation ,Betweenness centrality ,90C59, 90C26, 05C82 ,Artificial Intelligence ,Robustness (computer science) ,0202 electrical engineering, electronic engineering, information engineering ,FOS: Mathematics ,Neural and Evolutionary Computing (cs.NE) ,Mathematics - Optimization and Control ,Social and Information Networks (cs.SI) ,business.industry ,General Engineering ,Probabilistic logic ,Computer Science - Neural and Evolutionary Computing ,Computer Science - Social and Information Networks ,Complex network ,Computer Science Applications ,Randomized algorithm ,Artificial Intelligence (cs.AI) ,Optimization and Control (math.OC) ,020201 artificial intelligence & image processing ,The Internet ,business ,Centrality - Abstract
Many real-world applications can be modelled as complex networks, and such networks include the Internet, epidemic disease networks, transport networks, power grids, protein-folding structures and others. Network integrity and robustness are important to ensure that crucial networks are protected and undesired harmful networks can be dismantled. Network structure and integrity can be controlled by a set of key nodes, and to find the optimal combination of nodes in a network to ensure network structure and integrity can be an NP-complete problem. Despite extensive studies, existing methods have many limitations and there are still many unresolved problems. This paper presents a probabilistic approach based on neighborhood information and node importance, namely, neighborhood information-based probabilistic algorithm (NIPA). We also define a new centrality-based importance measure (IM), which combines the contribution ratios of the neighbor nodes of each target node and two-hop node information. Our proposed NIPA has been tested for different network benchmarks and compared with three other methods: optimal attack strategy (OAS), high betweenness first (HBF) and high degree first (HDF). Experiments suggest that the proposed NIPA is most effective among all four methods. In general, NIPA can identify the most crucial node combination with higher effectiveness, and the set of optimal key nodes found by our proposed NIPA is much smaller than that by heuristic centrality prediction. In addition, many previously neglected weakly connected nodes are identified, which become a crucial part of the newly identified optimal nodes. Thus, revised strategies for protection are recommended to ensure the safeguard of network integrity. Further key issues and future research topics are also discussed., Comment: 25 pages, 13 figures, 2 tables
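As a point of reference for the comparison described above, the sketch below implements the simplest baseline mentioned in the abstract, high degree first (HDF): repeatedly remove the current highest-degree node and track the size of the largest connected component. It is a hedged illustration of that baseline on a random graph, not an implementation of the proposed NIPA.

```python
import networkx as nx

def high_degree_first(G, k):
    """Remove k nodes, always deleting the current highest-degree node,
    and report the largest-connected-component size after each removal."""
    G = G.copy()
    sizes = []
    for _ in range(k):
        node = max(G.degree, key=lambda nd: nd[1])[0]   # highest-degree node
        G.remove_node(node)
        sizes.append(len(max(nx.connected_components(G), key=len)))
    return sizes

# Example on a random network with 200 nodes.
G = nx.erdos_renyi_graph(200, 0.03, seed=1)
print(high_degree_first(G, 10))
```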
- Published
- 2020
13. Firefly algorithm and flower pollination algorithm
- Author
-
Xin-She Yang and Yuxin Zhao
- Subjects
Pollination ,business.industry ,Computer science ,MathematicsofComputing_NUMERICALANALYSIS ,Firefly algorithm ,business ,Swarm intelligence ,Algorithm ,Multi-objective optimization ,Subdivision - Abstract
The firefly algorithm is a swarm intelligence-based algorithm, and its nonlinearity in search mechanisms can usually lead to subdivision and multiswarms, which means that it can be potentially more effective than single-swarm algorithms. This chapter introduces the main ideas of the firefly algorithm, followed by the introduction of the flower pollination algorithm. Both implementation details and examples will be presented to show how these algorithms work. Suggestions on modifications and multiobjective optimization will also be discussed.
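To complement the description above, here is a minimal sketch of the core firefly move, in which each firefly is pulled toward brighter (better) fireflies with an attractiveness that decays with distance, plus a small random perturbation. It is a generic single-objective illustration with arbitrary parameter values, not the chapter's own implementation.

```python
import numpy as np

def firefly_step(X, f, beta0=1.0, gamma=1.0, alpha=0.1):
    """One iteration of the basic firefly move for a population X (n x d),
    minimizing objective f (lower value = brighter firefly)."""
    n, d = X.shape
    brightness = np.array([f(x) for x in X])
    X_new = X.copy()
    for i in range(n):
        for j in range(n):
            if brightness[j] < brightness[i]:          # j is brighter than i
                r2 = np.sum((X[i] - X[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)      # distance-dependent attraction
                X_new[i] += beta * (X[j] - X[i]) + alpha * np.random.uniform(-0.5, 0.5, d)
    return X_new

# Example: a few iterations on the 2-D sphere function.
X = np.random.uniform(-5, 5, size=(15, 2))
for _ in range(20):
    X = firefly_step(X, lambda x: np.sum(x ** 2))
print("best value:", min(np.sum(x ** 2) for x in X))
```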
- Published
- 2020
14. Nature-inspired computation and swarm intelligence: a state-of-the-art overview
- Author
-
Xin-She Yang and Mehmet Karamanoglu
- Subjects
Optimization problem ,Computer science ,business.industry ,Computation ,State (computer science) ,Artificial intelligence ,Nature inspired ,business ,Swarm intelligence - Abstract
Nature-inspired algorithms can be flexible and efficient for solving optimization problems. There is a wide spectrum of nature-inspired algorithms in the literature, and most of these algorithms are based on swarm intelligence. This chapter provides an overview of some widely used algorithms for optimization. Their main characteristics will be discussed in comparison with traditional algorithms such as gradient-based algorithms. Some open problems concerning swarm intelligence and nature-inspired computation will also be highlighted.
- Published
- 2020
15. Soft Computing for Swarm Robotics: New Trends and Applications
- Author
-
Xin-She Yang, Eneko Osaba, Andrés Iglesias, and Javier Del Ser
- Subjects
Soft computing ,General Computer Science ,Computer science ,business.industry ,Autonomous agent ,Swarm robotics ,Robotics ,Context (language use) ,Computational intelligence ,02 engineering and technology ,01 natural sciences ,Data science ,Swarm intelligence ,010305 fluids & plasmas ,Theoretical Computer Science ,Modeling and Simulation ,0103 physical sciences ,0202 electrical engineering, electronic engineering, information engineering ,Robot ,020201 artificial intelligence & image processing ,Artificial intelligence ,business - Abstract
Robotics has experienced meteoric growth over the last decades, reaching unprecedented levels of distributed intelligence and self-autonomy. Today, a myriad of real-world scenarios can benefit from the application of robots, such as structural health monitoring, complex manufacturing, efficient logistics or disaster management. Related to this topic, there is a paradigm connected to Swarm Intelligence that is attracting significant interest from the Computational Intelligence community. This branch of knowledge is known as Swarm Robotics, which refers to the development of tools and techniques to ease the coordination of multiple small-sized robots towards the accomplishment of difficult tasks or missions in a collaborative fashion. The success of Swarm Robotics applications comes from the efficient use of the smart sensing, communication and organization functionalities with which these small robots are endowed, allowing for collaborative information sensing, operation and knowledge inference from the environment. The numerous industrial and social applications that can be addressed efficiently by virtue of swarm robotics unleash a vibrant research area focused on distributing intelligence among autonomous agents with simple behavioral rules and communication schedules, yet potentially capable of realizing the most complex tasks. In this context, we present an overview of recent contributions reported around this paradigm, which serves as an exemplary excerpt of the potential of Swarm Robotics to become a major research catalyst of the Computational Intelligence arena in years to come.
- Published
- 2020
16. Navigation, Routing and Nature-Inspired Optimization
- Author
-
Yuxin Zhao and Xin-She Yang
- Subjects
Animal navigation ,Optimization problem ,Computer science ,business.industry ,Particle swarm optimization ,Firefly algorithm ,Artificial intelligence ,Routing (electronic design automation) ,Cuckoo search ,business ,Swarm intelligence ,Bat algorithm - Abstract
Navigation abilities are crucial for survival in nature, and there are a wide range of sophisticated abilities concerning animal navigation and migration. Many applications are related to navigation and routing problems, which are in turn related to optimization problems. This chapter provides an overview of navigation in nature, navigation and routing problems as well as their mathematical formulations. We will then introduce some nature-inspired algorithms for solving optimization problems with discussions about their main characteristics and the ways of solution representations.
- Published
- 2020
17. Multi-species Cuckoo Search Algorithm for Global Optimization
- Author
-
Suash Deb, Xin-She Yang, and Sudhanshu K. Mishra
- Subjects
FOS: Computer and information sciences ,0209 industrial biotechnology ,Optimization problem ,Computer science ,Cognitive Neuroscience ,Survival of the fittest ,02 engineering and technology ,Nonlinear programming ,020901 industrial engineering & automation ,90C26, 78M32 ,Genetic algorithm ,FOS: Mathematics ,0202 electrical engineering, electronic engineering, information engineering ,Local search (optimization) ,Neural and Evolutionary Computing (cs.NE) ,Cuckoo search ,Mathematics - Optimization and Control ,Global optimization ,business.industry ,Computer Science - Neural and Evolutionary Computing ,Computer Science Applications ,Nonlinear system ,Optimization and Control (math.OC) ,Benchmark (computing) ,020201 artificial intelligence & image processing ,Computer Vision and Pattern Recognition ,business ,Algorithm - Abstract
Many optimization problems in science and engineering are highly nonlinear, and thus require sophisticated optimization techniques to solve. Traditional techniques such as gradient-based algorithms are mostly local search methods, and often struggle to cope with such challenging optimization problems. Recent trends tend to use nature-inspired optimization algorithms. This work extends the standard cuckoo search (CS) by using the successful features of cuckoo-host co-evolution with multiple interacting species, and the proposed multi-species cuckoo search (MSCS) intends to mimic multiple species of cuckoos that compete for the survival of the fittest and co-evolve with host species, with solution vectors encoded as position vectors. The proposed algorithm is then validated on 15 benchmark functions as well as five nonlinear, multimodal design case studies in practical applications. Simulation results suggest that the proposed algorithm can be effective for finding optimal solutions, and in this case all optimal solutions are achievable. The results for the test benchmarks are also compared with those obtained by other methods such as the standard cuckoo search and genetic algorithm, which demonstrates the efficiency of the present algorithm. Based on numerical experiments and case studies, we can conclude that the proposed algorithm can be more efficient in most cases, leading to a potentially very effective tool for solving nonlinear optimization problems. (Comment: 15 pages, 1 figure)
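For reference, the standard cuckoo search that MSCS extends generates new candidate nests via Lévy flights and abandons a fraction of the worst nests. The update rule below is the widely cited standard formulation, not the multi-species variant proposed in the paper:

\[
x_i^{t+1} = x_i^{t} + \alpha \oplus \mathrm{L\acute{e}vy}(\lambda), \qquad \mathrm{L\acute{e}vy}(\lambda) \sim u = t^{-\lambda}, \quad 1 < \lambda \le 3,
\]

where \(\alpha > 0\) is a step-size scaling factor, \(\oplus\) denotes entry-wise multiplication, and a fraction \(p_a\) of the worst nests is abandoned and replaced at each generation.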
- Published
- 2018
18. Bat-inspired algorithms with natural selection mechanisms for global optimization
- Author
-
Mohammed A. Awadallah, Xin-She Yang, Mohammed Azmi Al-Betar, Hossam Faris, Osama Ahmad Alomari, and Ahamad Tajudin Khader
- Subjects
0209 industrial biotechnology ,Mathematical optimization ,Rank (linear algebra) ,business.industry ,Cognitive Neuroscience ,Population-based incremental learning ,02 engineering and technology ,Swarm intelligence ,Computer Science Applications ,020901 industrial engineering & automation ,Artificial Intelligence ,Genetic algorithm ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Local search (optimization) ,business ,Algorithm ,Global optimization ,Bat algorithm ,Selection (genetic algorithm) ,Mathematics - Abstract
In this paper, alternative selection mechanisms in the bat-inspired algorithm for global optimization problems are studied. The bat-inspired algorithm is a recent swarm-based intelligent system which mimics the echolocation system of micro-bats. In the bat-inspired algorithm, the bats randomly fly around the best bat locations found during the search so as to improve their hunting of prey. In practice, one bat location from a set of best bat locations is selected. Thereafter, that best bat location is used in a local search with a random walk strategy to inform other bats about the prey location. This selection mechanism can be improved using other natural selection mechanisms adopted from advanced algorithms such as the Genetic Algorithm. Therefore, six selection mechanisms are studied for choosing the best bat location: global-best, tournament, proportional, linear rank, exponential rank, and random. Consequently, six versions of the bat-inspired algorithm are proposed and studied: global-best bat-inspired algorithm (GBA), tournament bat-inspired algorithm (TBA), proportional bat-inspired algorithm (PBA), linear rank bat-inspired algorithm (LBA), exponential rank bat-inspired algorithm (EBA), and random bat-inspired algorithm (RBA). Using two sets of global optimization functions, the bat-inspired versions are evaluated and the sensitivity of each version to its parameters is studied. Our results suggest that the selection mechanisms have positive effects on the performance of the classical bat-inspired algorithm (GBA). For comparative evaluation, eighteen methods are compared using 25 IEEE CEC2005 functions. The results show that the bat-inspired versions with various selection schemes observing the “survival-of-the-fittest” principle are largely competitive with established methods.
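As a minimal sketch of one of the selection mechanisms listed above, the following function implements generic tournament selection over candidate "best bat locations" ranked by fitness. It is an illustrative stand-in under the usual definition of tournament selection, not code from the paper, and the population representation is assumed.

```python
import random

def tournament_select(population, fitness, k=3, minimize=True):
    """Pick k candidates at random and return the fittest of them.

    population: list of candidate solutions (e.g., best bat locations)
    fitness:    list of fitness values aligned with population
    """
    contenders = random.sample(range(len(population)), k)
    if minimize:
        best = min(contenders, key=lambda i: fitness[i])
    else:
        best = max(contenders, key=lambda i: fitness[i])
    return population[best]

# Example: choose a guiding location among five candidate bat positions.
pop = [[0.1, 0.4], [0.9, 0.2], [0.5, 0.5], [0.3, 0.8], [0.7, 0.1]]
fit = [2.3, 0.8, 1.5, 3.1, 0.9]
print(tournament_select(pop, fit, k=3))
```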
- Published
- 2018
19. Feasibility of Using FDG in the Stereotactic Ablative Setting for Tracked Dose Delivery With BgRT: Results from a Prospective Study of Serial Inter-Fraction PET/CTs
- Author
-
Xin-She Yang, A. Da Silva, P. Olcott, Jeffrey D. Bradley, Shervin M. Shirvani, S. Mazin, Kristin Higgins, Sibo Tian, David M. Schuster, Ila Sethi, and Taofeek K. Owonikoko
- Subjects
Cancer Research ,Dose delivery ,Radiation ,business.industry ,medicine.medical_treatment ,SABR volatility model ,Clinical trial ,Radiation therapy ,Oncology ,Planned Dose ,Ablative case ,medicine ,Radiology, Nuclear Medicine and imaging ,Fraction (mathematics) ,Nuclear medicine ,business ,Prospective cohort study - Abstract
Purpose/Objective(s) Biology-guided radiotherapy (BgRT) is a new radiation modality that utilizes real-time limited time sampled FDG PET images of a tumor to deliver a dynamically tracked dose distribution. Since the delivered radiotherapy dose is dependent on the FDG distribution at each fraction, the ability to deliver the planned dose despite variations in daily FDG signal must be established. This study evaluates the predicted dose distribution for BgRT based on serial FDG scans obtained over a course of stereotactic ablative radiotherapy (SABR). Materials/Methods Six patients with 1 (n = 5) or 2 (n = 1) lung tumors at least 2 cm in diameter with SUVmax ≥ 6 treated with SABR (50 Gy in 5 fractions) were selected for this investigation. These patients received 3 PET/CT scans as part of an IRB approved prospective clinical trial: within 2 weeks of treatment start (PET1), between fractions 1 and 2 (PET2), and between fractions 4 and 5 (PET3). A simulation tool was developed to convert PET1 into a simulated BgRT planning PET image accounting for differences in system sensitivity, acquisition time, reconstruction method and geometry between the diagnostic PET/CT system and the BgRT machine. This same tool was used to convert PET2 and PET3 into simulated pre-scan PET images, short PET acquisitions made just prior to a BgRT delivery fraction to ensure the current FDG distribution is safe for treatment. BgRT plans designed to deliver 50 Gy in 5 fractions following RTOG 0813 dose constraints were developed for each of the 7 targets using the simulated planning PET images as input. Expected variations in the dynamically calculated dose delivery are presented as bounds on the dose-volume histogram (DVH). For each plan, the predicted DVH for fractions 2 and 5 was calculated from the simulated pre-scan PET images and compared to the bounded DVH of the BgRT plan. The pass %, defined as the percentage of points on the predicted DVH curves falling within the planned DVH bounds, was automatically calculated; the pass % must be ≥ 95% to proceed with BgRT delivery. Results All 7 BgRT plans met target coverage objectives and normal tissue constraints. The mean pass % for the 14 pre-scans was 98.6%, with 13 exceeding the 95% threshold required for BgRT. There was no statistically significant difference between pass % at fraction 2 versus fraction 5 (two-tailed paired t-test, P = 0.44). The pass % was 92.2% for 1 pre-scan. In this case, the target SUVmax had decreased by 55.5% of the baseline value. Conclusion This investigation provides initial evidence that FDG scans remain stable enough over a course of ablative radiotherapy to enable dynamically tracked dose delivery with BgRT. Significant changes in FDG biodistribution that would prevent dose from being delivered within the pre-approved bounds of the BgRT plan can be identified through analysis of a pre-scan PET image. A prospective clinical trial to confirm these results is in preparation. Clinical trial number NCT03493789.
- Published
- 2021
20. EDITORIAL: Special Issue of 2018 India International Congress on Computational Intelligence
- Author
-
Xin-She Yang, Ka-Chun Wong, and Suash Deb
- Subjects
Engineering ,Artificial Intelligence ,business.industry ,Management science ,International congress ,Computational Science and Engineering ,Computational intelligence ,business ,Software - Published
- 2020
21. White learning methodology: A case study of cancer-related disease factors analysis in real-time PACS environment
- Author
-
Shirley W. I. Siu, Sabah Mohammed, Xin-She Yang, Simon Fong, Liansheng Liu, and Tengyue Li
- Subjects
Clustering high-dimensional data ,Computer science ,Health Informatics ,Breast Neoplasms ,Machine learning ,computer.software_genre ,030218 nuclear medicine & medical imaging ,Machine Learning ,03 medical and health sciences ,Naive Bayes classifier ,0302 clinical medicine ,Black box ,Humans ,business.industry ,Deep learning ,Bayesian network ,Statistical model ,Bayes Theorem ,Computer Science Applications ,Graph (abstract data type) ,Artificial intelligence ,White box ,business ,computer ,030217 neurology & neurosurgery ,Software ,Algorithms - Abstract
Background and Objective: A Bayesian network is a probabilistic model whose prediction accuracy may not be among the highest in the machine learning family. Deep learning (DL), on the other hand, possesses higher predictive power than many other models. However, how reliable the result is, how it is deduced, and how interpretable a DL prediction is to users all remain obscure; DL functions like a black box. As a result, many medical practitioners are reluctant to use deep learning as the only tool for critical machine learning applications, such as an aiding tool for cancer diagnosis. Methods: In this paper, a framework of white learning (WL) is proposed which takes advantage of both black-box learning and white-box learning. Usually, black-box learning gives a high standard of accuracy and white-box learning provides an explainable directed acyclic graph. In our design, there are three stages of white learning, namely loosely coupled WL, semi-coupled WL and tightly coupled WL, based on the degree of fusion between the white-box and black-box learners. A case of loosely coupled WL is tested on a breast cancer dataset. This approach uses deep learning and an incremental version of the Naive Bayes network. White learning is broadly defined as a systemic fusion of machine learning models which results in an explainable Bayes network that can uncover the hidden relations between features and class, together with a deep learner that gives a higher prediction accuracy than other algorithms. We designed a series of experiments for this loosely coupled WL model. Results: The simulation results show that, using WL compared to standard black-box deep learning, the levels of accuracy and kappa statistics could be enhanced by up to 50%. The performance of WL also appears more stable in extreme conditions such as noise and high-dimensional data. The relations found by the Bayesian network in WL are more concise and stronger in affinity too. Conclusion: The experimental results deliver positive signals that WL can output both high classification accuracy and an explainable relations graph between features and class.
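To make the "loosely coupled" idea concrete, the sketch below trains a neural network (as the black-box learner) and an incrementally updatable Naive Bayes model (as the white-box learner) on the same data and averages their class probabilities. This is only an assumed, minimal interpretation of loose coupling using scikit-learn stand-ins (MLPClassifier, GaussianNB); it is not the authors' PACS pipeline or their exact fusion rule.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

black_box = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                          random_state=0).fit(X_tr, y_tr)
white_box = GaussianNB()                       # supports incremental updates
white_box.partial_fit(X_tr, y_tr, classes=np.unique(y))

# Loosely coupled fusion: average the two models' predicted probabilities.
proba = 0.5 * black_box.predict_proba(X_te) + 0.5 * white_box.predict_proba(X_te)
y_pred = proba.argmax(axis=1)
print("fused accuracy:", (y_pred == y_te).mean())
```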
- Published
- 2019
22. Firefly Algorithm and Its Variants in Digital Image Processing: A Comprehensive Review
- Author
-
Nilanjan Dey, Simon Fong, Xin-She Yang, Jyotismita Chaki, and Luminița Moraru
- Subjects
Optimization problem ,Computer science ,business.industry ,Ant colony optimization algorithms ,Particle swarm optimization ,Image processing ,Machine learning ,computer.software_genre ,Digital image processing ,Genetic algorithm ,Firefly algorithm ,Artificial intelligence ,business ,computer ,Metaheuristic - Abstract
The significance and requirements of digital image processing arise from two main areas of application: the improvement of visual information for human interpretation and the encoding of scene data for the independent perception of machines. However, humans are often involved in such processing to manually tune the parameters, which takes a long time and remains an unresolved issue. Nature is a brilliant and enormous source of inspiration for resolving difficult and complicated problems in computer science, as it possesses incredibly diverse, vibrant, flexible, complicated, and intriguing phenomena. In practice, selecting the optimum parameters for any technique is an optimization problem. Nature-inspired algorithms are metaheuristics that imitate the workings of nature to solve optimization issues, leading to a new era in computing. There are several dozen classical metaheuristic optimization algorithms reported in the literature, such as the genetic algorithm, ant colony optimization, and particle swarm optimization. Owing to its efficacy and success in solving various digital image analysis problems, the firefly algorithm, a metaheuristic inspired by the flashing behaviour of fireflies in nature, has been used in various image analysis optimization studies. This work is dedicated to a comprehensive review of the firefly algorithm for solving optimization problems in various steps of digital image analysis, such as image preprocessing, segmentation, compression, feature selection, and classification. Various applications of the firefly algorithm in image analysis are also discussed in this review. Key issues and future research directions will also be highlighted.
- Published
- 2019
23. Improved tabu search and simulated annealing methods for nonlinear data assimilation
- Author
-
Elias D. Nino-Ruiz and Xin-She Yang
- Subjects
0209 industrial biotechnology ,Computer science ,business.industry ,02 engineering and technology ,Tabu search ,Nonlinear system ,020901 industrial engineering & automation ,Data assimilation ,Simulated annealing ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Ensemble Kalman filter ,Local search (optimization) ,business ,Gradient descent ,Algorithm ,Software - Abstract
Nonlinear data assimilation can be a very challenging task. Four local search methods are proposed for nonlinear data assimilation in this paper. The methods work as follows: at each iteration, the observation operator is linearized around the current solution, and a gradient approximation of the three-dimensional variational (3D-Var) cost function is obtained. Then, samples along potential steepest-descent directions of the 3D-Var cost function are generated, and the acceptance/rejection criteria for such samples are similar to those used in the Tabu Search and Simulated Annealing frameworks. In addition, such samples can be drawn within certain sub-spaces so as to reduce the computational effort of computing search directions. Once a posterior mode is estimated, matrix-free ensemble Kalman filter approaches can be implemented to estimate posterior members. Furthermore, the convergence of the proposed methods is theoretically proven under the necessary assumptions and conditions. Numerical experiments have been performed using the Lorenz-96 model. The numerical results show that the cost function values can, on average, be reduced by several orders of magnitude by using the proposed methods. Moreover, the proposed methods converge faster to posterior modes when sub-space approximations are employed to reduce the computational effort across iterations.
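For readers unfamiliar with the objective being minimized, the 3D-Var cost function referred to above has the standard form below. The notation is the textbook one assumed here, not copied from the paper: background state \(x_b\), background error covariance \(B\), observations \(y\), (nonlinear) observation operator \(H\), and observation error covariance \(R\):

\[
J(x) = \tfrac{1}{2}\,(x - x_b)^{\mathsf T} B^{-1} (x - x_b)
     + \tfrac{1}{2}\,\big(y - H(x)\big)^{\mathsf T} R^{-1} \big(y - H(x)\big).
\]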
- Published
- 2019
24. Introduction to Optimization
- Author
-
Xingshi He and Xin-She Yang
- Subjects
0301 basic medicine ,business.industry ,Computer science ,Ranging ,01 natural sciences ,Swarm intelligence ,010101 applied mathematics ,03 medical and health sciences ,030104 developmental biology ,0103 physical sciences ,Artificial intelligence ,0101 mathematics ,business ,Engineering design process ,010303 astronomy & astrophysics - Abstract
Optimization is part of many university courses because of its importance in many disciplines and applications such as engineering design, business planning, computer science, data mining, machine learning, artificial intelligence and industries. The techniques and algorithms for optimization are diverse, ranging from the traditional gradient-based algorithms to contemporary swarm intelligence based algorithms. This chapter introduces the fundamentals of optimization and some of the traditional optimization techniques.
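As a compact statement of the kind of problem these techniques address, a general constrained optimization problem can be written in the standard form below (standard notation, added here for context):

\[
\min_{x \in \mathbb{R}^d} \; f(x)
\quad \text{subject to} \quad g_i(x) \le 0, \; i = 1,\ldots,m, \qquad h_j(x) = 0, \; j = 1,\ldots,p.
\]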
- Published
- 2019
25. Neural networks and deep learning
- Author
-
Xin-She Yang
- Subjects
Network architecture ,ComputingMethodologies_PATTERNRECOGNITION ,Quantitative Biology::Neurons and Cognition ,Artificial neural network ,business.industry ,Computer science ,Deep learning ,Computer Science::Neural and Evolutionary Computation ,Key (cryptography) ,ComputingMethodologies_GENERAL ,Artificial intelligence ,Extension (predicate logic) ,business - Abstract
This chapter introduces the key concepts of artificial neural networks and their extension to deep learning. Topics include neural networks, network architectures, optimizers, convolutional neural networks, and Boltzmann machines.
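As a one-line reminder of the building block these topics share, a fully connected layer computes an affine map followed by a nonlinearity; this standard equation is included here for context and is not quoted from the chapter:

\[
a^{(l)} = \sigma\!\big(W^{(l)} a^{(l-1)} + b^{(l)}\big),
\]

where \(W^{(l)}\) and \(b^{(l)}\) are the layer's weights and biases and \(\sigma\) is an activation function such as ReLU or the sigmoid.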
- Published
- 2019
26. Data mining techniques
- Author
-
Xin-She Yang
- Subjects
Naive Bayes classifier ,ComputingMethodologies_PATTERNRECOGNITION ,Computer science ,business.industry ,Big data ,Decision tree ,Data mining ,business ,computer.software_genre ,computer ,Random forest - Abstract
This chapter introduces some of the most widely used techniques for data mining, including the nearest-neighbor algorithm, the k-means algorithm, decision trees, random forests, Bayesian classifiers, and others. Special techniques such as CURE and BFR for mining big data are also briefly introduced.
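As one concrete example among the techniques listed, k-means seeks cluster assignments and centroids that minimize the within-cluster sum of squared distances; the standard objective is shown below for context (notation assumed, not taken from the chapter):

\[
\min_{\{C_k\},\,\{\mu_k\}} \; \sum_{k=1}^{K} \sum_{x_i \in C_k} \lVert x_i - \mu_k \rVert^2,
\]

where \(C_k\) is the set of points assigned to cluster \(k\) and \(\mu_k\) is that cluster's centroid.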
- Published
- 2019
27. Support vector machine and regression
- Author
-
Xin-She Yang
- Subjects
Statistics::Theory ,Computer science ,business.industry ,Machine learning ,computer.software_genre ,Regression ,Statistics::Computation ,Support vector machine ,Statistics::Machine Learning ,Nonlinear system ,ComputingMethodologies_PATTERNRECOGNITION ,Statistics::Methodology ,Artificial intelligence ,business ,computer - Abstract
This chapter introduces powerful tools for classification and regression, including linear support vector machines, nonlinear support vector machines, and support vector regression.
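For reference, the soft-margin linear SVM named above solves the standard primal problem below; this is the textbook formulation, included for context rather than quoted from the chapter:

\[
\min_{w,\,b,\,\xi} \;\; \tfrac{1}{2}\lVert w \rVert^2 + C \sum_{i=1}^{n} \xi_i
\quad \text{s.t.} \quad y_i\,(w^{\mathsf T} x_i + b) \ge 1 - \xi_i, \;\; \xi_i \ge 0,
\]

where \(C > 0\) trades off margin width against training errors \(\xi_i\).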
- Published
- 2019
28. Comparison of bio-inspired algorithms applied to the coordination of mobile robots considering the energy consumption
- Author
-
Nunzia Palmieri, Floriano De Rango, Salvatore Marano, and Xin-She Yang
- Subjects
0209 industrial biotechnology ,Computer science ,business.industry ,Distributed computing ,Ant colony optimization algorithms ,Swarm robotics ,Particle swarm optimization ,Swarm behaviour ,Mobile robot ,02 engineering and technology ,Swarm intelligence ,020901 industrial engineering & automation ,Artificial Intelligence ,0202 electrical engineering, electronic engineering, information engineering ,Robot ,020201 artificial intelligence & image processing ,Firefly algorithm ,Ant robotics ,Artificial intelligence ,business ,Metaheuristic ,Software - Abstract
Many applications related to autonomous mobile robots require exploring an unknown environment in search of static targets, without any a priori information about the environment topology or target locations. Targets in such rescue missions can be fires, mines, human victims, or dangerous materials that the robots have to handle. In these scenarios, some cooperation among the robots is required to accomplish the mission. This paper focuses on the application of different bio-inspired metaheuristics for the coordination of a swarm of mobile robots that have to explore an unknown area in order to rescue and cooperatively handle some distributed targets. This problem is formulated by first defining an optimization model and then considering two sub-problems: exploration and recruiting. First, the environment is incrementally explored by the robots using a modified version of ant colony optimization. Then, when a robot detects a target, a recruiting mechanism is carried out to recruit a certain number of robots to deal with the found target together. For this latter purpose, we have proposed and compared three approaches based on three different bio-inspired algorithms (Firefly Algorithm, Particle Swarm Optimization, and Artificial Bee Algorithm). A computational study and extensive simulations have been carried out to assess the behavior of the proposed approaches and to analyze their performance in terms of the total energy consumed by the robots to complete the mission. Simulation results indicate that the firefly-based strategy usually provides superior performance and can reduce the wastage of energy, especially in complex scenarios.
- Published
- 2019
29. Evolutionary intelligence techniques for humanized computing
- Author
-
Xin-She Yang, Suresh Chandra Satapathy, and Vikrant Bhateja
- Subjects
Mathematics (miscellaneous) ,Artificial Intelligence ,business.industry ,Computer science ,Cognitive Neuroscience ,Computer Vision and Pattern Recognition ,Artificial intelligence ,business - Published
- 2021
30. The effects of different levels of superoxide dismutase in Modena on boar semen quality during liquid preservation at 17°C
- Author
-
Gong-She Yang, Hao Li, Jian-Hong Hu, Le Wang, Liang Guodong, Yun-Hui Ma, Yang-Yi Hao, and Xiao-Gang Zhang
- Subjects
endocrine system ,Semen ,medicine.disease_cause ,Superoxide dismutase ,03 medical and health sciences ,chemistry.chemical_compound ,0302 clinical medicine ,medicine ,Food science ,Hydrogen peroxide ,Acrosome ,Sperm motility ,chemistry.chemical_classification ,Reactive oxygen species ,030219 obstetrics & reproductive medicine ,biology ,urogenital system ,business.industry ,0402 animal and dairy science ,04 agricultural and veterinary sciences ,General Medicine ,Malondialdehyde ,040201 dairy & animal science ,Biotechnology ,chemistry ,biology.protein ,General Agricultural and Biological Sciences ,business ,Oxidative stress - Abstract
This study was conducted to investigate the influence of superoxide dismutase (SOD) on the quality of boar semen during liquid preservation at 17°C. Semen samples from 10 Duroc boars were collected, pooled, divided into five equal parts, and diluted with Modena containing different concentrations (0, 100, 200, 300 and 400 U/mL) of SOD. During liquid preservation at 17°C, sperm motility, acrosome integrity, membrane integrity, total antioxidant capacity (T-AOC) activity, malondialdehyde (MDA) content and hydrogen peroxide (H2O2) content were measured and analyzed every 24 h. Meanwhile, the effective survival time of boar semen during preservation was evaluated and analyzed. The results indicated that different concentrations of SOD in Modena showed different protective effects on boar sperm quality. Modena supplemented with SOD decreased the effects of reactive oxygen species on boar sperm quality during liquid preservation compared with the control group. The group with 200 U/mL SOD added showed higher sperm motility, membrane integrity, acrosome integrity, effective survival time and T-AOC activity, as well as lower MDA and H2O2 content. In conclusion, the addition of SOD to Modena improved boar sperm quality by reducing oxidative stress during liquid preservation at 17°C, and the optimum concentration was 200 U/mL.
- Published
- 2016
31. Economic dispatch using chaotic bat algorithm
- Author
-
T. Raghunathan, T. Jayabarathi, B.R. Adarsh, and Xin-She Yang
- Subjects
Engineering ,Mathematical optimization ,business.industry ,Metaheuristic optimization ,020209 energy ,Mechanical Engineering ,Chaotic ,Economic dispatch ,02 engineering and technology ,Building and Construction ,Pollution ,Industrial and Manufacturing Engineering ,Evolutionary computation ,General Energy ,Transmission (telecommunications) ,Power Balance ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Electrical and Electronic Engineering ,business ,Bat algorithm ,Civil and Structural Engineering - Abstract
This paper presents the application of a new metaheuristic optimization algorithm, the chaotic bat algorithm for solving the economic dispatch problem involving a number of equality and inequality constraints such as power balance, prohibited operating zones and ramp rate limits. Transmission losses and multiple fuel options are also considered for some problems. The chaotic bat algorithm, a variant of the basic bat algorithm, is obtained by incorporating chaotic sequences to enhance its performance. Five different example problems comprising 6, 13, 20, 40 and 160 generating units are solved to demonstrate the effectiveness of the algorithm. The algorithm requires little tuning by the user, and the results obtained show that it either outperforms or compares favorably with several existing techniques reported in literature.
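For context, the classical economic dispatch problem referred to above minimizes total fuel cost subject to a power balance constraint and generator limits; the standard formulation is shown below. The quadratic cost form is the usual textbook assumption, and the prohibited-zone, ramp-rate and multiple-fuel constraints handled in the paper are omitted for brevity:

\[
\min_{P_1,\ldots,P_N} \; \sum_{i=1}^{N} \big(a_i P_i^2 + b_i P_i + c_i\big)
\quad \text{s.t.} \quad \sum_{i=1}^{N} P_i = P_D + P_L, \qquad P_i^{\min} \le P_i \le P_i^{\max},
\]

where \(P_D\) is the total demand and \(P_L\) the transmission loss.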
- Published
- 2016
32. Community detection in networks using bio-inspired optimization: Latest developments, new results and perspectives with a selection of recent meta-heuristics
- Author
-
Xin-She Yang, Javier Del Ser, David Camacho, Eneko Osaba, and Miren Nekane Bilbao
- Subjects
Community detection ,Network partition ,business.industry ,Computer science ,Computation ,Swarm intelligence ,Evolutionary computation ,Machine learning ,computer.software_genre ,Partition (database) ,Bio-inspired computation ,Artificial intelligence ,Heuristics ,business ,computer ,Metaheuristic ,Software - Abstract
Detecting groups within a set of interconnected nodes is a widely addressed problem that can model a diversity of applications. Unfortunately, detecting the optimal partition of a network is a computationally demanding task, usually conducted by means of optimization methods. Among them, randomized search heuristics have been proven to be efficient approaches. This manuscript is devoted to providing an overview of community detection problems from the perspective of bio-inspired computation. To this end, we first review the recent history of this research area, placing emphasis on milestone studies contributed in the last five years. Next, we present an extensive experimental study to assess the performance of a selection of modern heuristics over weighted directed network instances. Specifically, we combine seven global search heuristics based on two different similarity metrics and eight heterogeneous search operators designed ad-hoc. We compare our methods with six different community detection techniques over a benchmark of 17 Lancichinetti–Fortunato–Radicchi network instances. Ranking statistics of the tested algorithms reveal that the proposed methods perform competitively, but the high variability of the rankings leads to the main conclusion: no clear winner can be declared. This finding aligns with community detection tools available in the literature that hinge on a sequential application of different algorithms in search for the best performing counterpart. We end our research by sharing our envisioned status of this area, for which we identify challenges and opportunities which should stimulate research efforts in years to come.
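Although the abstract does not specify which partition quality metrics were used, the most common objective in community detection is Newman's modularity, shown here only for context:

\[
Q = \frac{1}{2m} \sum_{i,j} \left( A_{ij} - \frac{k_i k_j}{2m} \right) \delta(c_i, c_j),
\]

where \(A\) is the adjacency matrix, \(k_i\) the degree of node \(i\), \(m\) the number of edges, and \(\delta(c_i, c_j) = 1\) when nodes \(i\) and \(j\) belong to the same community.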
- Published
- 2020
33. Handling dropout probability estimation in convolution neural networks using meta-heuristics
- Author
-
João Paulo Papa, Gustavo Henrique de Rosa, Xin-She Yang, Universidade Estadual Paulista (Unesp), and Middlesex Univ
- Subjects
0209 industrial biotechnology ,Computer science ,Context (language use) ,Computational intelligence ,02 engineering and technology ,Overfitting ,Machine learning ,computer.software_genre ,Meta-heuristic optimization ,Convolutional neural network ,Regularization (mathematics) ,Theoretical Computer Science ,020901 industrial engineering & automation ,0202 electrical engineering, electronic engineering, information engineering ,Dropout (neural networks) ,Artificial neural network ,business.industry ,Dropout ,Deep learning ,Identification (information) ,Convolutional neural networks ,020201 artificial intelligence & image processing ,Geometry and Topology ,Artificial intelligence ,business ,computer ,Software - Abstract
Deep learning-based approaches have been paramount in recent years, mainly due to their outstanding results in several application domains, ranging from face and object recognition to handwritten digit identification. Convolutional neural networks (CNNs) have attracted considerable attention since they model the intrinsic and complex brain working mechanisms. However, one main shortcoming of such models concerns their overfitting problem, which prevents the network from predicting unseen data effectively. In this paper, we address this problem by properly selecting a regularization parameter known as dropout in the context of CNNs using meta-heuristic-driven techniques. As far as we know, this is the first attempt to tackle this issue using this methodology. Additionally, we also take into account a default dropout parameter and a dropout-less CNN for comparison purposes. The results revealed that optimizing dropout-based CNNs is worthwhile, mainly due to the ease of finding suitable dropout probability values without needing to set new parameters empirically.
- Published
- 2018
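The record above treats the dropout probability as a decision variable for a meta-heuristic. The sketch below illustrates that wrapper idea in the most generic form: a small population of candidate dropout values is refined against a validation score. The `validation_error` function is a mock surface standing in for an actual CNN training run, and the population update is a generic move-toward-the-best rule, not the specific meta-heuristics used in the paper.

```python
import random

def validation_error(dropout_p):
    """Stand-in for training a CNN with the given dropout probability and
    returning its validation error; a real run would train a network here.
    The quadratic bowl below is only a mock surface for demonstration."""
    return (dropout_p - 0.35) ** 2 + random.gauss(0, 1e-3)

def optimize_dropout(n_agents=8, n_iter=30, seed=0):
    random.seed(seed)
    # Each agent is one candidate dropout probability in [0, 0.9]
    agents = [random.uniform(0.0, 0.9) for _ in range(n_agents)]
    scores = [validation_error(p) for p in agents]
    best_p, best_s = min(zip(agents, scores), key=lambda t: t[1])
    for _ in range(n_iter):
        for i, p in enumerate(agents):
            # Move toward the best candidate plus a small random perturbation
            # (the generic population-update pattern shared by many meta-heuristics)
            trial = min(0.9, max(0.0, p + 0.5 * (best_p - p) + random.gauss(0, 0.05)))
            s = validation_error(trial)
            if s < scores[i]:
                agents[i], scores[i] = trial, s
                if s < best_s:
                    best_p, best_s = trial, s
    return best_p

print("selected dropout probability:", round(optimize_dropout(), 3))
```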
34. New directional bat algorithm for continuous optimization problems
- Author
-
Asma Chakri, Xin-She Yang, Mohamed Benouaret, and Rabia Khelif
- Subjects
FOS: Computer and information sciences ,0209 industrial biotechnology ,Computer science ,Human echolocation ,02 engineering and technology ,90C30, 68W20 ,Machine learning ,computer.software_genre ,Swarm intelligence ,020901 industrial engineering & automation ,Artificial Intelligence ,0202 electrical engineering, electronic engineering, information engineering ,FOS: Mathematics ,Neural and Evolutionary Computing (cs.NE) ,Mathematics - Optimization and Control ,Bat algorithm ,Continuous optimization ,business.industry ,General Engineering ,Computer Science - Neural and Evolutionary Computing ,Computer Science Applications ,Optimization and Control (math.OC) ,Benchmark (computing) ,020201 artificial intelligence & image processing ,Artificial intelligence ,business ,Algorithm ,computer ,Premature convergence - Abstract
The bat algorithm (BA) is a recent optimization algorithm based on swarm intelligence and inspired by the echolocation behavior of bats. One of the issues in the standard bat algorithm is the premature convergence that can occur due to the low exploration ability of the algorithm under some conditions. To overcome this deficiency, directional echolocation is introduced into the standard bat algorithm to enhance its exploration and exploitation capabilities. In addition to directional echolocation, three other improvements have been embedded into the standard bat algorithm to enhance its performance. The new approach, namely the directional Bat Algorithm (dBA), has then been tested using several standard and non-standard benchmarks from the CEC'2005 benchmark suite. The performance of dBA has been compared with ten other algorithms and BA variants using non-parametric statistical tests. The statistical test results show the superiority of the directional bat algorithm.
- Published
- 2018
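As a rough editorial sketch of the directional idea (one common reading of it, not the paper's exact formulation, and without the three additional improvements mentioned in the abstract): each bat emits pulses in two directions, toward the best bat and toward a randomly chosen bat, and uses the second direction only when that bat is fitter. The objective, bounds and parameters below are placeholders.

```python
import numpy as np

def sphere(x):
    return float(np.dot(x, x))          # simple benchmark objective

def directional_bat(obj=sphere, dim=10, n_bats=30, n_iter=1000,
                    fmin=0.0, fmax=2.0, lb=-10.0, ub=10.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_bats, dim))
    fit = np.array([obj(x) for x in X])
    best, best_f = X[fit.argmin()].copy(), fit.min()
    for _ in range(n_iter):
        for i in range(n_bats):
            k = rng.integers(n_bats)                       # a randomly chosen bat
            f1, f2 = rng.uniform(fmin, fmax, 2)            # two pulse frequencies
            if fit[k] < fit[i]:
                # pulses in two directions: toward the best bat and a better random bat
                Xnew = X[i] + (best - X[i]) * f1 + (X[k] - X[i]) * f2
            else:
                # otherwise only the direction of the best bat is exploited
                Xnew = X[i] + (best - X[i]) * f1
            Xnew = np.clip(Xnew, lb, ub)
            fnew = obj(Xnew)
            if fnew <= fit[i]:
                X[i], fit[i] = Xnew, fnew
                if fnew < best_f:
                    best, best_f = Xnew.copy(), fnew
    return best_f

print(directional_bat())
```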
35. Smart Trends in Systems, Security and Sustainability
- Author
-
Amit Joshi, Atulya K. Nagar, and Xin-She Yang
- Subjects
Sustainability ,Business ,Environmental economics - Published
- 2018
36. Navigability analysis of magnetic map with projecting pursuit-based selection method by using firefly algorithm
- Author
-
Yongxu He, Yan Ma, Xin-She Yang, Ligang Wu, and Yuxin Zhao
- Subjects
Matching (statistics) ,Basis (linear algebra) ,Computer science ,business.industry ,Cognitive Neuroscience ,Computation ,Particle swarm optimization ,Machine learning ,computer.software_genre ,Computer Science Applications ,Earth's magnetic field ,Artificial Intelligence ,Projection pursuit ,Firefly algorithm ,Artificial intelligence ,business ,computer ,Algorithm ,Selection (genetic algorithm) - Abstract
The performance of geomagnetic-aided navigation is closely related to the selection of the geomagnetic matching area. This paper studies a multi-parameter method for selecting the geomagnetic matching area based on the projection pursuit model, and then adopts the firefly algorithm, particle swarm optimization and differential evolution to obtain the optimal projection direction. We then compare the optimization results comprehensively and evaluate the navigation performance in each geomagnetic matching area to provide a basis for selecting the matching area. The implementation results show that, when there are large differences between the characteristic parameters, the firefly algorithm is significantly more computationally efficient than the other two optimization algorithms. Moreover, under the same experimental conditions, the optimal geomagnetic matching area obtained by the presented method has the minimum matching position error and the best navigation performance.
- Published
- 2015
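To make the "optimal projection direction" step concrete, here is a minimal firefly-algorithm sketch that searches for a direction maximising a projection index over a matrix of per-area feature parameters. The projection index (dispersion of the projections) and the feature matrix are illustrative stand-ins; the paper's actual multi-parameter index is not reproduced here.

```python
import numpy as np

def projection_index(direction, data):
    """Stand-in projection-pursuit index: dispersion of the data projected
    onto the normalised direction (the real index combines several criteria)."""
    d = direction / (np.linalg.norm(direction) + 1e-12)
    return (data @ d).std()

def firefly(data, n_fireflies=20, n_iter=200, beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    X = rng.normal(size=(n_fireflies, dim))
    I = np.array([projection_index(x, data) for x in X])     # brightness = index value
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if I[j] > I[i]:                              # dimmer firefly moves toward brighter one
                    r2 = np.sum((X[i] - X[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    X[i] += beta * (X[j] - X[i]) + alpha * rng.normal(size=dim)
                    I[i] = projection_index(X[i], data)
        alpha *= 0.97                                        # gradually reduce randomness
    k = I.argmax()
    return X[k] / np.linalg.norm(X[k]), I[k]

# Hypothetical matrix of per-candidate-area geomagnetic feature parameters (rows = areas)
features = np.random.default_rng(1).normal(size=(50, 4))
direction, score = firefly(features)
print(direction, score)
```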
37. Color Image Segmentation By Cuckoo Search
- Author
-
Achintya Das, Xin-She Yang, Partha Pratim Sarkar, and Sudarshan Nandy
- Subjects
Segmentation-based object categorization ,Computer science ,business.industry ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Scale-space segmentation ,Particle swarm optimization ,Pattern recognition ,Image segmentation ,Machine learning ,computer.software_genre ,Theoretical Computer Science ,Computational Theory and Mathematics ,Artificial Intelligence ,Computer Science::Computer Vision and Pattern Recognition ,Firefly algorithm ,Segmentation ,Artificial intelligence ,business ,Cuckoo search ,Cluster analysis ,computer ,Software - Abstract
In this paper, a clustering-based color image segmentation technique is proposed, and the clustering is optimized by the cuckoo search method. The proposed approach consists of a two-phase segmentation process. In the first phase, cluster centres are optimized using the cuckoo search algorithm, and in the second phase, empty and frequent clusters are removed and merged according to pre-defined rules. This cluster-centre-based clustering technique is then used to find the optimum centre within a cluster, while cuckoo search is applied to find the optimum cluster centre for each segment in the image. The proposed method is compared with the genetic algorithm (GA), dynamic control particle swarm optimization (DCPSO) and firefly algorithm based color image segmentation methods over five benchmark color images. The parameters of the proposed method are tuned through empirical testing. The results demonstrate that the proposed method can be an effective tool for image segmentation.
- Published
- 2015
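The sketch below illustrates the first phase described above in generic form: cuckoo search with Lévy flights optimizing a set of RGB cluster centres against a simple quantization-error objective. The objective, the random "pixel" data and all parameter values are assumptions made for the illustration, not the paper's settings.

```python
import numpy as np
from math import gamma, sin, pi

def quantization_error(centres, pixels):
    # Mean distance of each pixel to its nearest cluster centre (to be minimised)
    d = np.linalg.norm(pixels[:, None, :] - centres[None, :, :], axis=2)
    return d.min(axis=1).mean()

def levy_step(shape, rng, beta=1.5):
    # Mantegna's algorithm for Levy-distributed step lengths
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return rng.normal(0, sigma, shape) / np.abs(rng.normal(0, 1, shape)) ** (1 / beta)

def cuckoo_search_centres(pixels, k=4, n_nests=15, n_iter=200, pa=0.25, seed=0):
    rng = np.random.default_rng(seed)
    nests = rng.uniform(0, 255, (n_nests, k, 3))             # each nest = k RGB centres
    fit = np.array([quantization_error(n, pixels) for n in nests])
    best = nests[fit.argmin()].copy()
    for _ in range(n_iter):
        i = rng.integers(n_nests)
        new = np.clip(nests[i] + 0.01 * levy_step((k, 3), rng) * (nests[i] - best), 0, 255)
        fn = quantization_error(new, pixels)
        j = rng.integers(n_nests)
        if fn < fit[j]:
            nests[j], fit[j] = new, fn
        # abandon a fraction pa of the worst nests and rebuild them randomly
        worst = fit.argsort()[-max(1, int(pa * n_nests)):]
        nests[worst] = rng.uniform(0, 255, (len(worst), k, 3))
        fit[worst] = [quantization_error(n, pixels) for n in nests[worst]]
        best = nests[fit.argmin()].copy()
    return best

pixels = np.random.default_rng(2).uniform(0, 255, (500, 3))   # stand-in for image pixels
print(cuckoo_search_centres(pixels).round(1))
```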
38. Lévy flight artificial bee colony algorithm
- Author
-
Karm Veer Arya, Harish Sharma, Xin-She Yang, and Jagdish Chand Bansal
- Subjects
0209 industrial biotechnology ,education.field_of_study ,Mathematical optimization ,Engineering ,business.industry ,Population ,Probabilistic logic ,ComputingMilieux_LEGALASPECTSOFCOMPUTING ,02 engineering and technology ,Swarm intelligence ,Computer Science Applications ,Theoretical Computer Science ,Artificial bee colony algorithm ,020901 industrial engineering & automation ,Lévy flight ,Control and Systems Engineering ,Convergence (routing) ,0202 electrical engineering, electronic engineering, information engineering ,Memetic algorithm ,020201 artificial intelligence & image processing ,Optimisation algorithm ,education ,business ,Hardware_LOGICDESIGN - Abstract
The artificial bee colony (ABC) optimisation algorithm is a relatively simple and recent population-based probabilistic approach for global optimisation. The solution search equation of ABC is significantly influenced by a random quantity, which helps exploration at the cost of exploitation of the search space. In ABC, there is a high chance of skipping the true solution due to its large step sizes. In order to balance diversity and convergence in ABC, a Lévy-flight-inspired search strategy is proposed and integrated with ABC. The proposed strategy, named Lévy Flight ABC (LFABC), provides both local and global search capability simultaneously, which can be achieved by tuning the Lévy flight parameters and thus automatically tuning the step sizes. In LFABC, new solutions are generated around the best solution, which helps to enhance the exploitation capability of ABC. Furthermore, to improve the exploration capability, the number of scout bees is increased. Experiments on 20 test problems of different complexities and five real-world engineering optimisation problems show that the proposed strategy outperforms the basic ABC and recent variants of ABC, namely Gbest-guided ABC, best-so-far ABC and modified ABC, in most of the experiments.
- Published
- 2015
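A small illustration of the contrast described in the abstract: the standard ABC candidate move perturbs one dimension with a uniform step toward a random neighbour, whereas a Lévy-flight variant generates trials around the best food source with heavy-tailed step lengths. The `scale` factor and the exact form of the Lévy move are assumptions, not the paper's equations.

```python
import numpy as np
from math import gamma, sin, pi

def levy(beta, size, rng):
    # Mantegna's algorithm for heavy-tailed Levy-distributed step lengths
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return rng.normal(0, sigma, size) / np.abs(rng.normal(0, 1, size)) ** (1 / beta)

def abc_candidate(x_i, x_k, rng):
    # Standard ABC move: perturb one randomly chosen dimension with a uniform step
    j = rng.integers(len(x_i))
    v = x_i.copy()
    v[j] = x_i[j] + rng.uniform(-1, 1) * (x_i[j] - x_k[j])
    return v

def lfabc_candidate(x_best, rng, beta=1.5, scale=0.1):
    # Levy-flight-style trial generated around the best food source, mixing many
    # small steps with occasional long jumps (step sizes set by beta and scale)
    return x_best + scale * levy(beta, x_best.shape[0], rng)

rng = np.random.default_rng(0)
x_best = np.array([0.3, -1.2, 0.8])
print(abc_candidate(x_best, np.array([1.0, 0.0, 0.5]), rng))
print(lfabc_candidate(x_best, rng))
```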
39. A heuristic optimization method inspired by wolf preying behavior
- Author
-
Suash Deb, Xin-She Yang, and Simon Fong
- Subjects
Incremental heuristic search ,Mathematical optimization ,Optimization problem ,business.industry ,Heuristic ,Computer science ,Computation ,Tabu search ,Parallel metaheuristic ,Artificial Intelligence ,Search algorithm ,Metaheuristic algorithms ,Local search (optimization) ,Artificial intelligence ,business ,Metaheuristic ,Software - Abstract
Optimization problems can become intractable when the search space undergoes tremendous growth. Heuristic optimization methods have therefore been created that can search the very large spaces of candidate solutions. These methods, also called metaheuristics, are the general skeletons of algorithms that can be modified and extended to suit a wide range of optimization problems. Various researchers have invented a collection of metaheuristics inspired by the movements of animals and insects (e.g., firefly, cuckoos, bats and accelerated PSO) with the advantages of efficient computation and easy implementation. This paper studies a relatively new bio-inspired heuristic optimization algorithm called the Wolf Search Algorithm (WSA) that imitates the way wolves search for food and survive by avoiding their enemies. The WSA is tested quantitatively with different values of parameters and compared to other metaheuristic algorithms under a range of popular non-convex functions used as performance test problems for optimization algorithms, with superior results observed in most tests.
- Published
- 2015
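The sketch below gives one simplified reading of the wolf-preying metaphor in the record above: each wolf moves toward a fitter companion within its visual radius, otherwise it random-walks, and with some probability it jumps away to "escape an enemy". The objective, visual radius, step size and escape probability are placeholder assumptions rather than the paper's tested settings.

```python
import numpy as np

def rastrigin(x):
    return float(10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))

def wolf_search(obj=rastrigin, dim=5, n_wolves=25, n_iter=500,
                radius=2.0, step=0.5, pa=0.25, lb=-5.12, ub=5.12, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_wolves, dim))
    fit = np.array([obj(x) for x in X])
    for _ in range(n_iter):
        for i in range(n_wolves):
            # look for a fitter companion inside this wolf's visual radius
            dists = np.linalg.norm(X - X[i], axis=1)
            visible = np.where((dists < radius) & (fit < fit[i]))[0]
            if len(visible):
                j = visible[fit[visible].argmin()]
                Xnew = X[i] + step * (X[j] - X[i]) + 0.1 * rng.normal(size=dim)
            else:
                Xnew = X[i] + step * rng.normal(size=dim)       # prey passively (random walk)
            if rng.random() < pa:                               # escape from a perceived enemy
                Xnew = X[i] + 4 * rng.uniform(-radius, radius, dim)
            Xnew = np.clip(Xnew, lb, ub)
            fnew = obj(Xnew)
            if fnew < fit[i]:
                X[i], fit[i] = Xnew, fnew
    return fit.min()

print(wolf_search())
```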
40. The Selection of Commercial Residential Building Energy Saving Reconstruction Object Research
- Author
-
Jing Wang and Zhan She Yang
- Subjects
Life-cycle hypothesis ,Architectural engineering ,Engineering ,business.industry ,General Medicine ,Appropriate technology ,Resource (project management) ,Economic evaluation ,Architecture ,Human resources ,business ,Energy (signal processing) ,Simulation ,Efficient energy use - Abstract
Building energy conservation in our country started only in the 1990s and therefore remains at a low level. Although some progress has been achieved, energy-saving reconstruction has moved slowly on the whole, especially for residential buildings. One of the reasons for this situation is the lack of funding. Given the enormous number of buildings needing energy-saving reconstruction and the limited funds, human resources and material resources, it is necessary to make a selection before reconstruction, weeding out the projects that are too poor in safety, functionality and energy-saving potential to be worth reconstructing, as well as those that already show good energy-saving performance, meet the energy-efficiency design standards and were not built long ago. The key point is to choose, in planned batches, the existing buildings that are in urgent need of performance improvement, economically rational to retrofit and technically feasible. Meanwhile, the reconstruction should use appropriate technology, combined with local climate characteristics.
- Published
- 2014
41. Quaternion-based deep belief networks fine-tuning
- Author
-
Gustavo Henrique de Rosa, Xin-She Yang, Danillo Roberto Pereira, João Paulo Papa, Universidade Estadual Paulista (Unesp), and Middlesex Univ
- Subjects
Hypercomplex number ,Theoretical computer science ,Harmony Search ,Fitness landscape ,business.industry ,Computer science ,Deep learning ,Binary image ,Deep Belief Networks ,020206 networking & telecommunications ,02 engineering and technology ,Quaternion ,Deep belief network ,0202 electrical engineering, electronic engineering, information engineering ,Harmony search ,020201 artificial intelligence & image processing ,Artificial intelligence ,Representation (mathematics) ,business ,Software - Abstract
Deep learning techniques have been paramount in recent years, mainly due to their outstanding results in a number of applications. In this paper, we address the issue of fine-tuning the parameters of Deep Belief Networks by means of meta-heuristics in which real-valued decision variables are described by quaternions. Such approaches essentially perform optimization in fitness landscapes that are mapped to a different representation based on hypercomplex numbers, which may generate smoother surfaces. We can therefore map the optimization process onto a new space representation that is more suitable for learning parameters. We also propose two approaches based on Harmony Search and quaternions that outperform the state-of-the-art results obtained so far in three public datasets for the reconstruction of binary images.
- Published
- 2017
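To illustrate the quaternion representation idea in the record above, the sketch below runs a plain Harmony Search over variables encoded as 4-component quaternions and decodes each one to a bounded real hyper-parameter before evaluation. The norm-plus-tanh decoding, the mock loss function and the Harmony Search parameters are placeholder choices; the original work defines its own mapping.

```python
import numpy as np

def quaternion_to_real(q, lb, ub):
    """Map a 4-component quaternion to a bounded real decision variable.
    Here the quaternion norm is squashed through tanh; treat this as a placeholder."""
    return lb + (ub - lb) * np.tanh(np.linalg.norm(q))

def harmony_search_quat(obj, n_vars, lb, ub, hm_size=20, n_iter=2000,
                        hmcr=0.9, par=0.3, bw=0.1, seed=0):
    rng = np.random.default_rng(seed)
    HM = rng.normal(size=(hm_size, n_vars, 4))                    # harmonies live in quaternion space
    def decode(h): return np.array([quaternion_to_real(q, lb, ub) for q in h])
    fit = np.array([obj(decode(h)) for h in HM])
    for _ in range(n_iter):
        new = np.empty((n_vars, 4))
        for v in range(n_vars):
            if rng.random() < hmcr:                               # memory consideration
                new[v] = HM[rng.integers(hm_size), v]
                if rng.random() < par:                            # pitch adjustment
                    new[v] += bw * rng.normal(size=4)
            else:
                new[v] = rng.normal(size=4)                       # random improvisation
        f = obj(decode(new))
        worst = fit.argmax()
        if f < fit[worst]:
            HM[worst], fit[worst] = new, f
    return decode(HM[fit.argmin()]), fit.min()

# Example: tune two hypothetical DBN hyper-parameters (learning rate, momentum) on a mock loss
mock_loss = lambda x: (x[0] - 0.05) ** 2 + (x[1] - 0.9) ** 2      # stand-in for reconstruction error
print(harmony_search_quat(mock_loss, n_vars=2, lb=0.0, ub=1.0))
```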
42. Why the Firefly Algorithm Works?
- Author
-
Xingshi He and Xin-She Yang
- Subjects
0209 industrial biotechnology ,020901 industrial engineering & automation ,business.industry ,Computer science ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Firefly algorithm ,02 engineering and technology ,Artificial intelligence ,ComputerSystemsOrganization_PROCESSORARCHITECTURES ,business ,Swarm intelligence - Abstract
Firefly algorithm is a nature-inspired optimization algorithm and there have been significant developments since its appearance about 10 years ago. This chapter summarizes the latest developments about the firefly algorithm and its variants as well as their diverse applications. Future research directions are also highlighted.
- Published
- 2017
43. On Efficiently Solving the Vehicle Routing Problem with Time Windows Using the Bat Algorithm with Random Reinsertion Operators
- Author
-
Iztok Fister, Roberto Carballedo, Javier Del Ser, Eneko Osaba, Pedro Lopez-Garcia, and Xin-She Yang
- Subjects
Engineering ,Mathematical optimization ,060102 archaeology ,Heuristic (computer science) ,business.industry ,Node (networking) ,06 humanities and the arts ,02 engineering and technology ,Travelling salesman problem ,Vehicle routing problem ,0202 electrical engineering, electronic engineering, information engineering ,Benchmark (computing) ,Combinatorial optimization ,020201 artificial intelligence & image processing ,0601 history and archaeology ,Minification ,business ,Bat algorithm - Abstract
An evolutionary and discrete variant of the Bat Algorithm (EDBA) is proposed for solving the Vehicle Routing Problem with Time Windows (VRPTW). The EDBA not only presents an improved movement strategy but also combines it with diverse heuristic operators to deal with this type of complex problem. One of the main new concepts is to unify, in the same operators, the search process and the minimization of the number of routes and the total distance. This hybridization is achieved by using selective node extractions and subsequent reinsertions. In addition, the new approach analyzes all the routes that compose a solution with the intention of enhancing the diversification ability of the search process. In this study, several variants of the EDBA are presented and tested in order to measure the quality of both the metaheuristic algorithms and their operators. The benchmark experiments have been carried out using the 56 instances that compose the 100-customer Solomon benchmark. Two statistical tests have also been carried out to analyze the results and draw proper conclusions.
- Published
- 2017
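The core move mentioned above, selective node extraction followed by reinsertion, can be sketched as follows on a list-of-routes solution representation: pull one customer out of a random route and greedily reinsert it at the cheapest position over all routes. Time-window and capacity feasibility checks, which the real operators must enforce, are deliberately omitted here for brevity, and the coordinates are random placeholders.

```python
import random
from math import hypot

def route_cost(route, coords, depot=(0.0, 0.0)):
    pts = [depot] + [coords[c] for c in route] + [depot]
    return sum(hypot(a[0] - b[0], a[1] - b[1]) for a, b in zip(pts, pts[1:]))

def total_cost(solution, coords):
    return sum(route_cost(r, coords) for r in solution if r)

def extract_and_reinsert(solution, coords, rng=random):
    """One simplified 'selective extraction + reinsertion' move."""
    sol = [r[:] for r in solution]
    donors = [i for i, r in enumerate(sol) if r]
    ri = rng.choice(donors)
    node = sol[ri].pop(rng.randrange(len(sol[ri])))       # extract a random customer
    best = None
    for i, r in enumerate(sol):                           # cheapest-insertion scan over all routes
        for pos in range(len(r) + 1):
            cand = r[:pos] + [node] + r[pos:]
            delta = route_cost(cand, coords) - route_cost(r, coords)
            if best is None or delta < best[0]:
                best = (delta, i, pos)
    _, i, pos = best
    sol[i].insert(pos, node)
    return [r for r in sol if r]                          # drop any route emptied by the extraction

coords = {c: (random.random() * 10, random.random() * 10) for c in range(1, 9)}
solution = [[1, 2, 3, 4], [5, 6, 7, 8]]
print(total_cost(solution, coords), total_cost(extract_and_reinsert(solution, coords), coords))
```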
44. How Meta-heuristic Algorithms Contribute to Deep Learning in the Hype of Big Data Analytics
- Author
-
Simon Fong, Suash Deb, and Xin-She Yang
- Subjects
Bridging (networking) ,Artificial neural network ,business.industry ,Process (engineering) ,Computer science ,Deep learning ,Feature vector ,Big data ,Context (language use) ,02 engineering and technology ,010501 environmental sciences ,01 natural sciences ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Algorithm design ,Artificial intelligence ,business ,0105 earth and related environmental sciences - Abstract
Deep learning (DL) is one of the most rapidly emerging types of contemporary machine learning techniques; it mimics the cognitive patterns of the animal visual cortex to learn new abstract features automatically through deep, hierarchical layers. DL is believed to be a suitable tool for extracting insights from very large volumes of so-called big data. Nevertheless, one of the three "V"s of big data is velocity, which implies that learning has to be incremental as data accumulate rapidly; DL must be both fast and accurate. By technical design, DL extends the feed-forward artificial neural network with many hidden layers of neurons, forming a deep neural network (DNN). The training process of a DNN suffers from inefficiency because of the very long training time required. Obtaining the most accurate DNN within a reasonable run-time is a challenge, given that there are potentially many parameters in the DNN model configuration and high dimensionality of the feature space in the training dataset. Meta-heuristics have a history of optimizing machine learning models successfully. How well meta-heuristics could be used to optimize DL in the context of big data analytics is the thematic topic we ponder in this paper. As a position paper, we review the recent advances in applying meta-heuristics to DL, discuss their pros and cons, and point out some feasible research directions for bridging the gaps between meta-heuristics and DL.
- Published
- 2017
45. Sequence optimization for integrated radar and communication systems using meta-heuristic multiobjective methods
- Author
-
Hans-Jurgen Zepernick, Momin Jamil, and Xin-She Yang
- Subjects
Mathematical optimization ,Engineering ,Signal processing ,business.industry ,Autocorrelation ,Sequence optimization ,020206 networking & telecommunications ,02 engineering and technology ,Communications system ,law.invention ,Task (project management) ,Maxima and minima ,law ,Face (geometry) ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Radar ,business - Abstract
In real-world engineering problems, several conflicting objective functions often have to be optimized simultaneously. Typically, the objective functions of these problems are too complex to solve using derivative-based optimization methods. The integration of navigation and radar functionality with communication applications is such a problem. Designing sequences for these systems is a difficult task, further complicated by the following factors: (i) conflicting requirements on autocorrelation and crosscorrelation characteristics; (ii) the associated cost functions may be irregular and have several local minima. Traditional or gradient-based optimization methods face challenges or are unsuitable for such a complex problem. In this paper, we pose the simultaneous optimization of the autocorrelation and crosscorrelation characteristics of Oppermann sequences as a multiobjective problem. We compare the performance of prominent state-of-the-art multiobjective evolutionary meta-heuristic algorithms in designing Oppermann sequences for integrated radar and communication systems.
- Published
- 2017
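The two conflicting objectives named in the abstract can be computed directly for any candidate sequences, as in the sketch below: the peak autocorrelation sidelobe of a sequence and the peak crosscorrelation between two sequences, both to be minimised by a multiobjective optimizer. The random unit-modulus sequences below are generic placeholders, not Oppermann sequences, whose specific construction is not reproduced here.

```python
import numpy as np

def aperiodic_correlation(a, b):
    """Aperiodic (cross-)correlation over all lags with nonzero overlap, normalised by length."""
    n = len(a)
    vals = []
    for k in range(-(n - 1), n):
        if k >= 0:
            vals.append(np.sum(a[k:] * np.conj(b[:n - k])))
        else:
            vals.append(np.sum(a[:n + k] * np.conj(b[-k:])))
    return np.array(vals) / n

def max_autocorr_sidelobe(s):
    ac = np.abs(aperiodic_correlation(s, s))
    ac[len(s) - 1] = 0.0                 # remove the main lobe at zero lag
    return ac.max()

def max_crosscorrelation(s1, s2):
    return np.abs(aperiodic_correlation(s1, s2)).max()

# Two hypothetical unit-modulus (polyphase) candidate sequences
rng = np.random.default_rng(0)
s1 = np.exp(1j * 2 * np.pi * rng.random(32))
s2 = np.exp(1j * 2 * np.pi * rng.random(32))
# The two conflicting objectives a multiobjective optimizer would minimise:
print(max_autocorr_sidelobe(s1), max_crosscorrelation(s1, s2))
```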
46. Trigonometry
- Author
-
Xin-She Yang
- Subjects
Computer science ,business.industry ,Calculus ,Robotics ,Degree (angle) ,Artificial intelligence ,Trigonometry ,Radian ,business ,Complex number ,Robotic arm ,Engineering mathematics ,Law of cosines - Abstract
Trigonometry is an essential part of engineering mathematics. For example, in robotics, trigonometry can be useful in calculating the positions of robotic arms, rotations as well as other quantities. In addition, trigonometrical functions are also intrinsically related to complex numbers. This chapter introduces the fundamentals of trigonometrical functions.
- Published
- 2017
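The robotic-arm example mentioned in this record is a classic application of these functions. The sketch below, a standard textbook construction rather than material from the chapter itself, computes the end-effector position of a planar two-link arm with sines and cosines, and recovers the joint angles from a target position using the law of cosines (one elbow branch only).

```python
from math import cos, sin, acos, atan2, hypot

def forward_kinematics(l1, l2, theta1, theta2):
    """End-effector position of a planar two-link arm (angles in radians)."""
    x = l1 * cos(theta1) + l2 * cos(theta1 + theta2)
    y = l1 * sin(theta1) + l2 * sin(theta1 + theta2)
    return x, y

def inverse_kinematics(l1, l2, x, y):
    """Joint angles reaching (x, y), via the law of cosines (one elbow branch)."""
    d = hypot(x, y)
    cos_t2 = (d**2 - l1**2 - l2**2) / (2 * l1 * l2)
    theta2 = acos(max(-1.0, min(1.0, cos_t2)))
    theta1 = atan2(y, x) - atan2(l2 * sin(theta2), l1 + l2 * cos(theta2))
    return theta1, theta2

t1, t2 = inverse_kinematics(1.0, 0.8, 1.2, 0.9)
print(forward_kinematics(1.0, 0.8, t1, t2))   # recovers (1.2, 0.9)
```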
47. Laplace Transforms
- Author
-
Xin-She Yang
- Subjects
Laplace transform ,Differential equation ,business.industry ,MathematicsofComputing_NUMERICALANALYSIS ,Mathematics::Spectral Theory ,Z-transform ,Automation ,Control system ,ComputingMethodologies_SYMBOLICANDALGEBRAICMANIPULATION ,Mathematics::Metric Geometry ,Applied mathematics ,business ,Scaling ,Mathematics - Abstract
Laplace transforms are an important tool with many applications in engineering such as control system and automation. This chapter introduces the fundamentals of Laplace transforms, their properties and applications in solving differential equations.
- Published
- 2017
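A one-line worked example of the kind of application this record describes (a standard textbook case, not taken from the chapter): solving the initial value problem $y'(t) + 2y(t) = 1$, $y(0) = 0$ by transforming, solving algebraically and inverting,
\[
sY(s) - y(0) + 2Y(s) = \frac{1}{s}
\;\Longrightarrow\;
Y(s) = \frac{1}{s(s+2)} = \frac{1}{2}\left(\frac{1}{s} - \frac{1}{s+2}\right)
\;\Longrightarrow\;
y(t) = \tfrac{1}{2}\bigl(1 - e^{-2t}\bigr).
\]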
48. Optimum Tuning of Mass Dampers by Using a Hybrid Method Using Harmony Search and Flower Pollination Algorithm
- Author
-
Xin-She Yang, Gebrail Bekdaş, and Sinan Melih Nigdeli
- Subjects
0301 basic medicine ,Damping ratio ,Mathematical optimization ,Engineering ,Pollination ,business.industry ,Computation ,Process (computing) ,02 engineering and technology ,021001 nanoscience & nanotechnology ,03 medical and health sciences ,030104 developmental biology ,Tuned mass damper ,Convergence (routing) ,Harmony search ,Local search (optimization) ,0210 nano-technology ,business ,Algorithm - Abstract
In this study, a new approach is proposed for the optimization of tuned mass dampers positioned on top of seismically excited structures. The use of metaheuristic algorithms is a well-known and effective technique for the optimum tuning of parameters such as mass, period and damping ratio. The aim of the study is to develop a new methodology that improves the computational capacity and the precision of the final results. For that reason, harmony search (HS) and the flower pollination algorithm (FPA) are hybridized through a probability-based approach. In the methodology, the global and local search processes of HS are used together with the global and local pollination stages of FPA, so four different types of generation are used. These four types have the same chance at the start of the optimization process, and the probability of a type is reduced each time it is chosen. If an improvement in the optimization objective is obtained, the probability of the effective type is increased. The proposed method converges effectively, improving the optimization objective compared with the classical FPA.
- Published
- 2017
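The adaptive operator-selection rule described above can be illustrated as follows: four generation types start with equal probability, a chosen type loses a little probability, and it earns probability back whenever it improves the objective. The decay and reward magnitudes below, and the mock "improvement" signal, are illustrative guesses, not values from the study.

```python
import random

GEN_TYPES = ["HS_global", "HS_local", "FPA_global", "FPA_local"]

def pick_and_adapt(probs, improved_last, last_choice, decay=0.05, reward=0.10):
    """Roulette-wheel selection over the four generation types with adaptive weights."""
    if last_choice is not None:
        probs[last_choice] = max(0.05, probs[last_choice] - decay)   # chosen type loses probability
        if improved_last:
            probs[last_choice] += reward                             # effective type is rewarded
    total = sum(probs.values())
    r, acc = random.random() * total, 0.0
    for name, p in probs.items():
        acc += p
        if r <= acc:
            return name
    return name

probs = {g: 1.0 for g in GEN_TYPES}       # all four types start with the same chance
choice = None
for step in range(5):
    improved = random.random() < 0.3      # stand-in for "the new candidate improved the objective"
    choice = pick_and_adapt(probs, improved, choice)
    print(step, choice, {k: round(v, 2) for k, v in probs.items()})
```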
49. Analysis of the Deformation State of Different Forms of Guardrails
- Author
-
Kai Yang, Fu She Yang, Wen Wen Tang, and Huan Wen Shi
- Subjects
Engineering ,business.industry ,Finite element software ,Energy absorption ,Transverse shear deformation ,General Medicine ,Structural engineering ,Deformation (meteorology) ,Static analysis ,business ,Finite element method ,Beam (structure) - Abstract
Guardrails consisting of a metal post of profile Σ and a protective beam of profile W are commonly used. In order to guide the choice of the form of motor-vehicle road guardrails, this article used the finite element software ANSYS to perform static analysis on six kinds of guardrails, made up of two forms of beam and three types of post. The results showed that the guardrail composed of MS-A and MST-2 had the largest transverse deformation, with a value of 7.21 cm. It can be concluded that the guardrail composed of MS-A and MST-2 has the better deformation and energy-absorption capacity, making it the optimal choice for the road guardrail.
- Published
- 2014
50. Feature Selection in Life Science Classification: Metaheuristic Swarm Search
- Author
-
Simon Fong, Jinyan Li, Suash Deb, and Xin-She Yang
- Subjects
business.industry ,Computer science ,Swarm behaviour ,Feature selection ,Machine learning ,computer.software_genre ,Swarm intelligence ,Computer Science Applications ,Statistical classification ,Hardware and Architecture ,Feature (computer vision) ,Combinatorial search ,Artificial intelligence ,Data mining ,business ,Metaheuristic ,computer ,Software ,Selection (genetic algorithm) - Abstract
The purpose of classification in medical informatics is to predict the presence or absence of a particular disease as well as disease types from historical data. Medical data often contain irrelevant features and noise, and an appropriate subset of the significant features can improve classification accuracy. Therefore, researchers apply feature selection to identify and remove irrelevant and redundant features. The authors propose a versatile feature selection approach called Swarm Search Feature Selection (SS-FS), based on stochastic swarm intelligence. It is designed to overcome NP-hard combinatorial search problems such as the selection of an optimal feature subset from an extremely large array of features--which is not uncommon in biomedical data. SS-FS is demonstrated to be a feasible computing tool in achieving high accuracy in classification via testing with two empirical biomedical datasets. This article is part of a special issue on life sciences computing.
- Published
- 2014
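The wrapper idea in the record above can be sketched generically: a swarm of binary feature masks drifts toward the best mask found so far while random bit flips keep the combinatorial search exploring. The `fitness` function here is a mock stand-in for training a classifier on the selected columns and returning cross-validated accuracy; the "relevant" feature set and all parameters are hypothetical.

```python
import random

N_FEATURES = 12
RELEVANT = {0, 3, 5, 8}          # hypothetical ground-truth informative features

def fitness(mask):
    """Stand-in for wrapper evaluation: a real run would train a classifier on
    the selected columns and return its cross-validated accuracy."""
    chosen = {i for i, b in enumerate(mask) if b}
    hits = len(chosen & RELEVANT)
    noise = len(chosen - RELEVANT)
    return hits - 0.2 * noise            # reward relevant features, lightly penalise extras

def swarm_search_fs(n_agents=10, n_iter=50, flip_rate=0.2, seed=3):
    random.seed(seed)
    swarm = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(n_agents)]
    best = max(swarm, key=fitness)
    for _ in range(n_iter):
        for a in range(n_agents):
            cand = [(best[i] if random.random() < 0.5 else swarm[a][i])   # drift toward the best mask
                    for i in range(N_FEATURES)]
            cand = [(1 - b if random.random() < flip_rate else b) for b in cand]  # random bit flips
            if fitness(cand) > fitness(swarm[a]):
                swarm[a] = cand
                if fitness(cand) > fitness(best):
                    best = cand
    return [i for i, b in enumerate(best) if b]

print("selected feature indices:", swarm_search_fs())
```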