30,935 results
Search Results
2. A fully-automated paper ECG digitisation algorithm using deep learning.
- Author
-
Wu, Huiyi, Patel, Kiran Haresh Kumar, Li, Xinyang, Zhang, Bowen, Galazis, Christoforos, Bajaj, Nikesh, Sau, Arunashis, Shi, Xili, Sun, Lin, Tao, Yanda, Al-Qaysi, Harith, Tarusan, Lawrence, Yasmin, Najira, Grewal, Natasha, Kapoor, Gaurika, Waks, Jonathan W., Kramer, Daniel B., Peters, Nicholas S., and Ng, Fu Siong
- Subjects
DEEP learning ,ELECTROCARDIOGRAPHY ,ELECTRONIC paper ,ATRIAL fibrillation ,ALGORITHMS ,HEART failure ,HEART rate monitors - Abstract
There is increasing focus on applying deep learning methods to electrocardiograms (ECGs), with recent studies showing that neural networks (NNs) can predict future heart failure or atrial fibrillation from the ECG alone. However, large numbers of ECGs are needed to train NNs, and many ECGs currently exist only in paper format, which is not suitable for NN training. We developed a fully-automated online ECG digitisation tool to convert scanned paper ECGs into digital signals. Using automated horizontal and vertical anchor point detection, the algorithm segments the ECG image into separate images for the 12 leads, and a dynamical morphological algorithm is then applied to extract the signal of interest. We then validated the performance of the algorithm on 515 digital ECGs, of which 45 were printed, scanned and redigitised. The automated digitisation tool achieved 99.0% correlation between the digitised signals and the ground-truth ECG (n = 515 standard 3-by-4 ECGs) after excluding ECGs with overlapping lead signals. Without exclusion, average correlation ranged from 90% to 97% across the leads of all 3-by-4 ECGs. There was a 97% correlation for the 12-by-1 and 3-by-1 ECG formats after excluding ECGs with overlapping lead signals. Without exclusion, the average correlation of some leads in 12-by-1 ECGs was 60–70%, and the average correlation of 3-by-1 ECGs reached 80–90%. For ECGs that were printed, scanned, and redigitised, our tool achieved 96% correlation with the original signals. We have developed and validated a fully-automated, user-friendly, online ECG digitisation tool. Unlike other available tools, it does not require any manual segmentation of ECG signals. Our tool can facilitate the rapid and automated digitisation of large repositories of paper ECGs so that they can be used for deep learning projects. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
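The abstract above describes segmenting the scanned image into per-lead sub-images and then extracting each lead's trace. As a rough illustration of the extraction step only — a minimal column-wise ink-centroid trace in Python, not the authors' dynamical morphological algorithm; the function name and threshold are invented here:

```python
import numpy as np

def trace_lead(image, baseline_row=None):
    """Toy trace extraction for one grayscale ECG lead image
    (2-D array, 0 = black ink, 255 = white paper): per column, take
    the row-centroid of clearly dark pixels, then express the trace
    as deflection from a baseline (median row if not given)."""
    img = np.asarray(image, dtype=float)
    ink = 255.0 - img                        # invert so ink weighs most
    ink[ink < 0.5 * ink.max()] = 0.0         # drop faint grid/paper pixels
    rows = np.arange(img.shape[0])[:, None]
    mass = ink.sum(axis=0)
    y = np.where(mass > 0,
                 (rows * ink).sum(axis=0) / np.maximum(mass, 1e-9),
                 np.nan)                     # NaN where a column has no ink
    if baseline_row is None:
        baseline_row = np.nanmedian(y)
    return baseline_row - y                  # positive deflection = upward
```

Real scans additionally need grid removal, calibration-pulse scaling (pixels to mV), and the lead-overlap handling the authors mention; none of that is attempted in this sketch.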
3. A Machine Learning Model to Predict Citation Counts of Scientific Papers in Otology Field.
- Author
-
Alohali, Yousef A., Fayed, Mahmoud S., Mesallam, Tamer, Abdelsamad, Yassin, Almuhawas, Fida, and Hagr, Abdulrahman
- Subjects
DECISION trees ,SERIAL publications ,NATURAL language processing ,BIBLIOMETRICS ,MACHINE learning ,REGRESSION analysis ,RANDOM forest algorithms ,CITATION analysis ,DESCRIPTIVE statistics ,PREDICTION models ,ARTIFICIAL neural networks ,MEDICAL research ,MEDICAL specialties & specialists ,ALGORITHMS - Abstract
One of the most widely used measures of scientific impact is the number of citations. However, due to its heavy-tailed distribution, the citation count is fundamentally difficult to predict, although predictions can be improved. This study investigated the factors, and the parts of a paper, that influence the citation count of a scientific paper in the otology field. To that end, this work proposes a new solution that uses machine learning and natural language processing to process English text and return a predicted citation count. Several algorithms are implemented in this solution, including linear regression, boosted decision trees, decision forests, and neural networks. The application of neural network regression revealed that a paper's abstract has more influence than its other parts on the citation counts of otological articles. The solution was developed in visual programming, using Microsoft Azure Machine Learning at the back end and Programming Without Coding Technology at the front end. We recommend using machine learning models to improve the abstracts of research articles in order to attract more citations. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
4. Efficient and Effective Academic Expert Finding on Heterogeneous Graphs through (k, P)-Core based Embedding.
- Author
-
YUXIANG WANG, JUN LIU, XIAOLIANG XU, XIANGYU KE, TIANXING WU, and XIAOXUAN GOU
- Subjects
COMMUNITIES ,SEMANTICS ,ALGORITHMS - Abstract
Expert finding is crucial for a wealth of applications in both academia and industry. Given a user query and a trove of academic papers, expert finding aims at retrieving from those papers the most relevant experts for the query. Existing studies focus on embedding-based solutions that consider academic papers’ textual semantic similarities to a query via document representation and extract the top-n experts from the most similar papers. Beyond implicit textual semantics, however, papers’ explicit relationships (e.g., co-authorship) in a heterogeneous graph (e.g., DBLP) are critical for expert finding, because they help improve the representation quality. Despite their importance, the explicit relationships of papers generally have been ignored in the literature. In this article, we study expert finding on heterogeneous graphs by considering both the explicit relationships and implicit textual semantics of papers in one model. Specifically, we define the cohesive (k, P)-core community of papers w.r.t. a meta-path P (i.e., relationship) and propose a (k, P)-core based document embedding model to enhance the representation quality. Based on this, we design a proximity graph-based index (PG-Index) of papers and present a threshold algorithm (TA)-based method to efficiently extract the top-n experts from papers returned by PG-Index. We further optimize our approach in two ways: (1) we boost effectiveness by considering the (k, P)-core community of experts and the diversity of experts’ research interests, to achieve high-quality expert representation from paper representation; and (2) we streamline expert finding, going from “extract top-n experts from top-m (m > n) semantically similar papers” to “directly return top-n experts”. The process of returning a large number of top-m papers as intermediate data is avoided, thereby improving the efficiency. Extensive experiments using real-world datasets demonstrate our approach’s superiority. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
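The threshold algorithm (TA) mentioned in the abstract for extracting top-n results is, in its classic form (Fagin et al.), an early-stopping scan over score-sorted lists. A minimal sketch of that classic form, not the paper's exact PG-Index-integrated variant:

```python
import heapq

def threshold_algorithm(sorted_lists, n):
    """Fagin-style threshold algorithm (TA): given several lists of
    (item, score) pairs, each sorted by descending score, return the
    top-n items by summed score. Stops early once the n-th best
    aggregated score reaches the threshold (the sum of the scores at
    the current scan depth), so deep list tails are never read."""
    lookup = [dict(lst) for lst in sorted_lists]   # random access per list
    best = {}                                      # item -> aggregated score
    depth = 0
    while True:
        threshold = 0.0
        progressed = False
        for lst in sorted_lists:
            if depth < len(lst):
                progressed = True
                item, score = lst[depth]
                threshold += score
                if item not in best:               # first sighting: aggregate
                    best[item] = sum(d.get(item, 0.0) for d in lookup)
        top = heapq.nlargest(n, best.items(), key=lambda kv: kv[1])
        if not progressed or (len(top) >= n and top[-1][1] >= threshold):
            return top
        depth += 1
```

In the expert-finding setting, each list might hold the papers most similar to the query under one representation, with experts scored by their papers' similarities; that mapping is an assumption here, not spelled out in the abstract.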
5. Investigators from Midwest Orthopaedics at Rush Target Machine Learning (Paper 19: Evidence-based Machine Learning Algorithm To Predict Failure Following Cartilage Preservation Procedures In the Knee)
- Subjects
Data warehousing/data mining ,Algorithm ,Data mining ,Paper machines ,Algorithms ,Machine learning ,Papermaking machinery - Abstract
2023 MAY 28 (NewsRx) -- By a News Reporter-Staff News Editor at Medical Devices & Surgical Technology Week -- Fresh data on Machine Learning are presented in a new report. [...]
- Published
- 2023
6. FDA RELEASES TWO DISCUSSION PAPERS TO SPUR CONVERSATION ABOUT ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING IN DRUG DEVELOPMENT AND MANUFACTURING
- Subjects
United States. Food and Drug Administration ,Artificial intelligence ,Machine learning ,Computer science ,Algorithms ,Pharmaceutical industry ,News, opinion and commentary ,Algorithm ,Artificial intelligence - Abstract
SILVER SPRING, MD -- The following information was released by the U.S. Food and Drug Administration (FDA): By: Patrizia Cavazzoni, M.D., Director of the Center for Drug Evaluation and Research [...]
- Published
- 2023
7. Cost Optimal Production-Scheduling Model Based on VNS-NSGA-II Hybrid Algorithm—Study on Tissue Paper Mill.
- Author
-
Zhang, Huanhuan, Li, Jigeng, Hong, Mengna, Man, Yi, and He, Zhenglei
- Subjects
PAPER mills ,FLOW shop scheduling ,PRODUCTION scheduling ,INDUSTRIAL costs ,ALGORITHMS - Abstract
With the development of the customization concept, small-batch, multi-variety production will become one of the major production modes, especially for fast-moving consumer goods. However, this production mode has two issues: high production costs and long manufacturing periods. To address these issues, this study proposes a multi-objective optimization model for the flexible flow-shop (FFS) that optimizes production scheduling, maximizing production efficiency by minimizing production cost and makespan. The model is based on hybrid algorithms that combine a fast non-dominated sorting genetic algorithm (NSGA-II) with a variable neighborhood search algorithm (VNS). In this model, NSGA-II is the main algorithm used to calculate the optimal solutions, and VNS improves the quality of the solutions obtained by NSGA-II. The model is verified on a real-world example of a typical FFS, a tissue papermaking mill. The results show that the scheduling model can reduce production costs by 4.2% and makespan by 6.8% compared with manual scheduling. The hybrid VNS-NSGA-II model also outperforms NSGA-II alone in both production cost and makespan. Hybrid algorithms are a good solution for multi-objective optimization issues in flexible flow-shop production scheduling. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
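NSGA-II, the main algorithm in the hybrid model above, is built around fast non-dominated sorting of candidate schedules into Pareto fronts. A minimal sketch of that sorting step for two minimization objectives such as (cost, makespan); the function name is illustrative:

```python
def fast_nondominated_sort(points):
    """Deb et al.'s fast non-dominated sort for minimization: partition
    solutions (tuples of objective values, e.g. (cost, makespan)) into
    Pareto fronts of indices; front 0 is the non-dominated set."""
    def dominates(p, q):
        # p dominates q: no worse in every objective, better in at least one
        return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

    n = len(points)
    dominated_by = [[] for _ in range(n)]   # indices that i dominates
    dom_count = [0] * n                     # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(points[i], points[j]):
                dominated_by[i].append(j)
            elif dominates(points[j], points[i]):
                dom_count[i] += 1
        if dom_count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]                      # drop the trailing empty front
```

Front 0 is the set of (cost, makespan) trade-offs from which a scheduler would select; in the full hybrid, VNS would then refine those candidates.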
8. Research on the Fusion of Hybrid Fuzzy Clustering Algorithm and Computer Automatic Test Paper Composition Algorithm.
- Author
-
Kan, Baopeng
- Subjects
COMPUTERS ,COMPUTER algorithms ,FUZZY algorithms ,COMPUTER workstation clusters ,ALGORITHMS ,HIGHER education exams - Abstract
In order to improve the effect of intelligent automatic test paper composition, this paper combines a hybrid fuzzy clustering algorithm with a computer automatic test paper composition algorithm. A computer automatic test paper composition system based on the hybrid fuzzy clustering algorithm is constructed, with the clustering method serving as the system's basic algorithm and improved according to the actual needs of intelligent paper composition. In addition, an intelligent algorithm takes the relevant constraint parameters as input and, combined with the original parameters, selects the most suitable test questions from the database and combines them into test papers. Finally, the system structure is built around the requirements of intelligent test paper composition. Experimental research shows that the proposed system has good test paper composition functionality and can effectively promote the progress of intelligent examination modes in colleges and universities. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
9. New Findings from Northeast Forestry University Update Understanding of Information Science (Airank: an Algorithm On Evaluating the Academic Influence of Papers Based On Heterogeneous Academic Network)
- Subjects
Information science ,Algorithms ,Algorithm ,Computers - Abstract
2023 MAR 14 (VerticalNews) -- By a News Reporter-Staff News Editor at Information Technology Newsweekly -- Data detailed on Information Technology - Information Science have been presented. According to news [...]
- Published
- 2023
10. Tools and algorithms for the construction and analysis of systems: a special issue on tool papers for TACAS 2021.
- Author
-
Jensen, Peter Gjøl and Neele, Thomas
- Subjects
ALGORITHMS ,SOFTWARE verification ,INTEGRATED circuit verification ,SYSTEMS software ,CONFERENCES & conventions - Abstract
This special issue contains six revised and extended versions of tool papers that appeared in the proceedings of TACAS 2021, the 27th International Conference on Tools and Algorithms for the Construction and Analysis of Systems. The issue is dedicated to the realization of algorithms in tools and to studies of the application of these tools for analysing hardware and software systems. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
11. Socio‐technical issues in the platform‐mediated gig economy: A systematic literature review: An Annual Review of Information Science and Technology (ARIST) paper.
- Author
-
Dedema, Meredith and Rosenbaum, Howard
- Subjects
INFORMATION science ,TECHNOLOGY ,CORPORATE culture ,ALGORITHMS ,ECONOMICS - Abstract
The gig economy and gig work have grown quickly in recent years and have drawn much attention from researchers in different fields. Because the platform mediated gig economy is a relatively new phenomenon, studies have produced a range of interesting findings; of interest here are the socio‐technical issues that this work has surfaced. This systematic literature review (SLR) provides a snapshot of a range of socio‐technical issues raised in the last 12 years of literature focused on the platform mediated gig economy. Based on a sample of 515 papers gathered from nine databases in multiple disciplines, 132 were coded that specifically studied the gig economy, gig work, and gig workers. Three main socio‐technical themes were identified: (1) the digital workplace, which includes information infrastructure and digital labor that are related to the nature of gig work and the user agency; (2) algorithmic management, which includes platform governance, performance management, information asymmetry, power asymmetry, and system manipulation, relying on a diverse set of technological tools including algorithms and big data analytics; (3) ethical design, as a relevant value set that gig workers expect from the platform, which includes trust, fairness, equality, privacy, and transparency. A social informatics perspective is used to rethink the relationship between gig workers and platforms, extract the socio‐technical issues noted in prior research, and discuss the underexplored aspects of the platform mediated gig economy. The results draw attention to understudied yet critically important socio‐technical issues in the gig economy that suggest short‐ and long‐term opportunities for future research directions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. A mathematical foundation for foundation paper pieceable quilts.
- Author
-
Leake, Mackenzie, Bernstein, Gilbert, Davis, Abe, and Agrawala, Maneesh
- Subjects
QUILTS ,QUILTING ,PATCHWORK quilts ,SEWING patterns ,ALGORITHMS - Abstract
Foundation paper piecing is a popular technique for constructing fabric patchwork quilts using printed paper patterns. But, the construction process imposes constraints on the geometry of the pattern and the order in which the fabric pieces are attached to the quilt. Manually designing foundation paper pieceable patterns that meet all of these constraints is challenging. In this work we mathematically formalize the foundation paper piecing process and use this formalization to develop an algorithm that can automatically check if an input pattern geometry is foundation paper pieceable. Our key insight is that we can represent the geometric pattern design using a certain type of dual hypergraph where nodes represent faces and hyperedges represent seams connecting two or more nodes. We show that determining whether the pattern is paper pieceable is equivalent to checking whether this hypergraph is acyclic, and if it is acyclic, we can apply a leaf-plucking algorithm to the hypergraph to generate viable sewing orders for the pattern geometry. We implement this algorithm in a design tool that allows quilt designers to focus on producing the geometric design of their pattern and let the tool handle the tedious task of determining whether the pattern is foundation paper pieceable. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
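The acyclicity check and leaf-plucking described above have a standard counterpart for general hypergraphs: the GYO (ear-removal) reduction, in which "leaf" hyperedges are repeatedly plucked and the removal order yields a candidate processing order. A sketch of that standard reduction, which matches the paper's construction in spirit only (the paper's dual hypergraph and sewing-order rules have additional domain constraints):

```python
def gyo_acyclic(hyperedges):
    """GYO reduction: a hypergraph (list of vertex sets) is acyclic
    iff repeatedly removing 'ear' hyperedges empties it. An edge is an
    ear when all of its shared vertices lie inside one other edge.
    Returns (is_acyclic, removal_order); in the quilt analogy the
    removal order plays the role of a candidate sewing order."""
    edges = [set(e) for e in hyperedges]
    order = []
    changed = True
    while edges and changed:
        changed = False
        for i, e in enumerate(edges):
            others = edges[:i] + edges[i + 1:]
            # vertices of e that also appear in some other edge
            shared = {v for v in e if any(v in f for f in others)}
            if not others or any(shared <= f for f in others):
                order.append(edges.pop(i))   # pluck the ear (leaf)
                changed = True
                break
    return (not edges), order
```

A path-like hypergraph reduces to empty (acyclic), while a triangle of pairwise-overlapping edges gets stuck, mirroring the paper's claim that pieceability is equivalent to acyclicity of the dual hypergraph.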
13. Performance Evaluation of the Extractive Methods in Automatic Text Summarization Using Medical Papers.
- Author
-
Kus, Anil and Aci, Cigdem Inan
- Subjects
PERFORMANCE evaluation ,TEXT summarization ,MEDICAL sciences ,ALGORITHMS ,SEMANTICS - Abstract
Copyright of Gazi Journal of Engineering Sciences (GJES) / Gazi Mühendislik Bilimleri Dergisi is the property of Gazi Journal of Engineering Sciences and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2023
- Full Text
- View/download PDF
14. Scientific papers and artificial intelligence. Brave new world?
- Author
-
Nexøe, Jørgen
- Subjects
COMPUTERS ,MANUSCRIPTS ,ARTIFICIAL intelligence ,MACHINE learning ,DATA analysis ,MEDICAL literature ,MEDICAL research ,ALGORITHMS - Published
- 2023
- Full Text
- View/download PDF
15. Discussion paper: implications for the further development of the successfully in emergency medicine implemented AUD2IT-algorithm.
- Author
-
Przestrzelski, Christopher, Jakob, Antonina, Jakob, Clemens, and Hoffmann, Felix R.
- Subjects
DOCUMENTATION ,CURRICULUM ,HUMAN services programs ,EMERGENCY medicine ,EXPERIENCE ,MEDICAL records ,ELECTRONIC publications ,ALGORITHMS ,PATIENTS' attitudes - Abstract
The AUD2IT-algorithm is a tool for structuring the data collected during emergency treatment. Its goal is, on the one hand, to structure the documentation of the data and, on the other, to give the report a standardised data structure for the handover of an emergency patient. The AUD2IT-algorithm was developed to provide residents with a documentation aid that helps structure medical reports without getting lost in unimportant details or forgetting important information. The sequence of anamnesis, clinical examination, differential diagnosis, technical diagnostics, interpretation, and therapy is an academic classification rather than a description of the real workflow; in a real setting, most of these steps take place simultaneously. The application of the AUD2IT-algorithm should therefore also follow the real processes. A big advantage of the AUD2IT-algorithm is that it can serve as a structure for the entire treatment process and can also be used as a handover protocol within this process, ensuring that the current state of knowledge is shared at each team time-out. The PR-E-(AUD2IT)-algorithm makes it possible to document a treatment process that, in principle, need not be limited to the field of emergency medicine; it could also be used and further developed in outpatient treatment, for example in the preparation and allocation of needed resources at the general practitioner's. The algorithm is a standardised tool that can be used by healthcare professionals at any level of training and gives the user a sense of security in their daily work. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. Digitalized Control Algorithm of Bridgeless Totem-Pole PFC with a Simple Control Structure Based on the Phase Angle.
- Author
-
Lee, Gi-Young, Park, Hae-Chan, Ji, Min-Woo, and Kim, Rae-Young
- Subjects
ELECTRIC current rectifiers ,ELECTRONIC paper ,PHASE-locked loops ,ALGORITHMS ,ANGLES ,VOLTAGE - Abstract
Compared to the conventional boost power factor correction (PFC) converter, a totem-pole bridgeless PFC has high efficiency because it does not have an input diode rectifier stage, but a current spike may occur when the polarity of the grid voltage changes. This paper proposes a digital control algorithm for bridgeless totem-pole PFC with a simple control structure based on the phase angle of grid voltage. The proposed algorithm has a PI-based double-loop control structure and performs DC-link voltage and input inductor current control. Rectifying switches operate based on the proposed rectification algorithm using phase angle information calculated through a single-phase phase-locked loop (PLL) to prevent current spikes. The feed-forward duty ratio value is calculated according to the polarity of the grid voltage and added to the double-loop controller to perform appropriate power factor control. The performance and feasibility of the proposed control algorithm are verified through a 3 kW hardware prototype. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
17. Reports from Zhejiang Normal University Highlight Recent Findings in Engineering Software (Paper Intrusion Detection Approach for Cloud and IoT Environments Using Deep Learning and Capuchin Search Algorithm)
- Subjects
Engineering -- Computer programs ,Security software ,Algorithms ,Network security software ,Algorithm ,Engineering software - Abstract
2023 MAR 8 (VerticalNews) -- By a News Reporter-Staff News Editor at Computer Weekly News -- Current study results on Engineering - Engineering Software have been published. According to news […]
- Published
- 2023
18. Automated analysis of pen-on-paper spirals for tremor detection, quantification, and differentiation.
- Author
-
Rajan, Roopa, Anandapadmanabhan, Reghu, Nageswaran, Sharmila, Radhakrishnan, Vineeth, Saini, Arti, Krishnan, Syam, Gupta, Anu, Vishnu, Venugopalan Y., Pandit, Awadh K., Singh, Rajesh Kumar, Radhakrishnan, Divya M, Singh, Mamta Bhushan, Bhatia, Rohit, Srivastava, Achal, Kishore, Asha, and Padma Srivastava, M. V.
- Subjects
STATISTICS ,RESEARCH ,CONFIDENCE intervals ,ANALYSIS of variance ,TASK performance ,HANDWRITING ,ACCELEROMETERS ,DYSTONIA ,MOVEMENT disorders ,TREMOR ,DRAWING ,DESCRIPTIVE statistics ,PARKINSON'S disease ,SENSITIVITY & specificity (Statistics) ,DATA analysis ,RECEIVER operating characteristic curves ,DATA analysis software ,ALGORITHMS - Abstract
OBJECTIVE: To develop an automated algorithm to detect, quantify, and differentiate between tremors using pen-on-paper spirals. METHODS: Patients with essential tremor (n = 25), dystonic tremor (n = 25), Parkinson’s disease (n = 25), and healthy volunteers (HV, n = 25) drew free-hand spirals. The algorithm derived the mean deviation (MD) and tremor variability from scanned images. MD and tremor variability were compared with 1) the Bain and Findley scale, 2) the Fahn–Tolosa–Marin tremor rating scale (FTM–TRS), and 3) the peak power and total power of the accelerometer spectra. Inter- and intra-loop widths were computed to differentiate between the tremors. RESULTS: MD was higher in the tremor group (48.9±26.3) than in HV (26.4±5.3; p < 0.001). A cut-off value of 30.3 had 80.9% sensitivity and 76.0% specificity for the detection of tremor [area under the curve: 0.83; 95% confidence interval (CI): 0.75, 0.91; p < 0.001]. MD correlated with the Bain and Findley ratings (rho = 0.491, p < 0.001), FTM–TRS part B (rho = 0.260, p = 0.032), and accelerometric measures of postural tremor (total power, rho = 0.366, p < 0.001; peak power, rho = 0.402, p < 0.001). The minimum detectable change was 19.9%. Inter-loop width distinguished Parkinson’s disease spirals from dystonic tremor (p < 0.001, 95% CI: 54.6, 211.1), essential tremor (p = 0.003, 95% CI: 28.5, 184.9), and HV (p = 0.036, 95% CI: -160.4, -3.9). CONCLUSION: The automated analysis of pen-on-paper spirals generated robust variables to quantify tremor and putative variables to distinguish tremor types from each other. SIGNIFICANCE: This technique may be useful for epidemiological surveys and follow-up studies on tremor. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
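The mean-deviation (MD) measure above is derived by the authors' algorithm from scanned spiral images. As an illustrative stand-in only — assuming a known spiral centre and an Archimedean spiral model, neither of which is stated in the abstract, and with an invented function name:

```python
import numpy as np

def spiral_mean_deviation(x, y, centre=(0.0, 0.0)):
    """Fit an Archimedean spiral r = a + b*theta to drawn points (least
    squares on the unwrapped winding angle) and return the mean absolute
    radial deviation from the fit -- a toy analogue of an MD-style
    smoothness metric, with the spiral centre assumed known."""
    x = np.asarray(x, dtype=float) - centre[0]
    y = np.asarray(y, dtype=float) - centre[1]
    r = np.hypot(x, y)
    theta = np.unwrap(np.arctan2(y, x))            # continuous winding angle
    A = np.column_stack([np.ones_like(theta), theta])
    (a, b), *_ = np.linalg.lstsq(A, r, rcond=None)
    return float(np.mean(np.abs(r - (a + b * theta))))
```

A clean spiral yields a near-zero deviation, while a tremulous (wobbly-radius) spiral yields a larger one, which is the intuition behind using such a metric to separate tremor from healthy drawings.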
19. Tachyum Unveils Air Defense Superiority Using Prodigy AI White Paper
- Subjects
Air defenses ,Algorithms ,Antiairborne warfare ,Algorithm ,Business ,Business, international - Abstract
LAS VEGAS -- Tachyum[TM] today released details of Prodigy([R]), the world's first universal processor, for advanced military and defense avionics applications in a new white paper 'Air Dominance Powered by [...]
- Published
- 2023
20. Evolution of federated learning based on multi-objective optimization (基于多目标优化的联邦学习进化).
- Author
-
胡智勇, 于千城, 王之赐, and 张丽丝
- Subjects
FEDERATED learning ,ALGORITHMS ,PRIVACY - Abstract
Copyright of Application Research of Computers / Jisuanji Yingyong Yanjiu is the property of Application Research of Computers Edition and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
21. Classification of forensic hyperspectral paper data using hybrid spectral similarity algorithms.
- Author
-
Devassy, Binu Melit, George, Sony, Nussbaum, Peter, and Thomas, Tessamma
- Subjects
SPECTRAL imaging ,FORGERY ,ALGORITHMS ,FORENSIC sciences ,CLASSIFICATION ,CONFIDENCE intervals ,CLASSIFICATION algorithms - Abstract
Document forgeries that involve modification of the materials used, such as ink and paper, leave evidence of the malpractice performed. Forensic specialists use different techniques to identify and classify these samples; the preferred methods are nondestructive, to avoid any potential damage to the original specimen under investigation. Hyperspectral imaging has already been explored in several application domains and is used as a powerful method in forensic investigations to extract information about the materials under examination. To classify the material information precisely and exploit the potential of the hyperspectral imaging technique, we probed several hybrid spectral similarity measures for classifying different commonly used paper samples; a quantitative comparison of these methods is presented in this article. The hybrid spectral similarity algorithms were tested on the forensic analysis of paper data: we compared the classification capabilities of various hybrid spectral similarity algorithms on hyperspectral data from 40 different paper samples. The overall accuracy (OA), kappa K̂, Z‐score of kappa (ZK̂), and the 95% confidence interval of kappa (CI(K̂)) are used for comparison. SID‐SAM and SID‐SCA produced overall accuracies of 88% and 87%, respectively, the highest among the hybrid spectral similarity measures tested. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
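SID-SAM, the best-performing hybrid measure in the abstract, combines spectral information divergence (SID) with the spectral angle mapper (SAM) by scaling SID with tan(SAM) — a standard formulation in the hyperspectral literature. A sketch of that standard form; the epsilon guard is an implementation choice here, not from the paper:

```python
import numpy as np

def sid_sam(x, y, eps=1e-12):
    """Hybrid SID-SAM spectral similarity: spectral information
    divergence (SID) scaled by tan of the spectral angle (SAM).
    Lower is more similar; identical or proportional spectra give ~0.
    The eps guard avoids log(0) on zero bands."""
    x = np.asarray(x, dtype=float) + eps
    y = np.asarray(y, dtype=float) + eps
    p, q = x / x.sum(), y / y.sum()             # spectra as probability vectors
    sid = np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))
    cos_a = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    sam = np.arccos(np.clip(cos_a, -1.0, 1.0))  # spectral angle in radians
    return float(sid * np.tan(sam))
```

SID-SCA, the other top measure in the study, would substitute the spectral correlation angle for SAM in the same template; a classifier then assigns each pixel's spectrum to the paper class with the lowest score.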
22. Selected Papers of the 32nd International Workshop on Combinatorial Algorithms, IWOCA 2021.
- Author
-
Flocchini, Paola and Moura, Lucia
- Subjects
EULERIAN graphs ,ALGORITHMS ,APPROXIMATION algorithms ,WEB hosting - Abstract
IWOCA (International Workshop on Combinatorial Algorithms) is an annual conference series covering all aspects of combinatorial algorithms. Among other results, the selected papers give fixed-parameter tractable algorithms for a problem parameterized by various structural parameters, a greedy loop-free algorithm for exhaustive generation, a successor algorithm that runs in constant amortized time, and results for the fixed-spin generalization of that problem. [Extracted from the article]
- Published
- 2023
- Full Text
- View/download PDF
23. A review paper of optimal resource allocation algorithm in cloud environment.
- Author
-
Patadiya, Namrata and Bhatt, Nirav
- Subjects
RESOURCE allocation ,LITERATURE reviews ,SERVICE level agreements ,ALGORITHMS ,ELECTRONIC data processing ,CLOUD computing - Abstract
Cloud computing has become a popular approach for processing data and running computationally expensive services on a pay-as-you-go basis. Due to the ever-increasing demand for cloud-based apps, appropriately allocating resources according to user requests while meeting the service-level agreements between customers and service providers has become increasingly complex. An efficient and versatile resource allocation method is required to deploy these assets properly and meet user needs, and the task of distributing resources has become more arduous as user demand has increased. How to design optimal solutions for this problem is one of the key areas of research. In this paper, a literature review of proposed dynamic resource allocation approaches is presented. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
24. Awarded papers focus on solving basic algorithmic problems, with applications in bioinformatics
- Subjects
Computational biology ,RNA ,College teachers ,Algorithms ,Business ,Health ,Health care industry ,Algorithm - Abstract
2022 NOV 13 (NewsRx) -- By a News Reporter-Staff News Editor at Medical Letter on the CDC & FDA -- The prize for the best research article of the Workshop [...]
- Published
- 2022
25. Aspira Women's Health Announces Publication of Paper Validating OvaWatch Algorithm in the Detection of Ovarian Cancer
- Subjects
Algorithm ,Cancer diagnosis ,Algorithms ,Women's health ,Ovarian cancer ,Women -- Health aspects ,Cancer -- Diagnosis - Abstract
AUSTIN: Aspira Women's Health Inc. has issued the following news release: Aspira Women's Health Inc. ('Aspira'), a bio-analytical based women's health company focused on gynecologic disease, today announced the online [...]
- Published
- 2022
26. New White Paper Centers AI as a Critical Component of 5G Telecoms Network Success
- Subjects
Algorithms ,Algorithm ,Telecommunications industry - Abstract
(GlobeNewswire) - Citing artificial intelligence (AI) and 5G as cornerstone technologies of this decade, a new whitepaper released today by InterDigital, Inc. (NASDAQ: IDCC) and written by ABI Research details [...]
- Published
- 2022
27. Development and Validation of an Algorithm for the Digitization of ECG Paper Images.
- Author
-
Randazzo, Vincenzo, Puleo, Edoardo, Paviglianiti, Annunziata, Vallan, Alberto, and Pasero, Eros
- Subjects
DIGITIZATION ,DIGITAL images ,ELECTROCARDIOGRAPHY ,HEART rate monitors ,PEARSON correlation (Statistics) ,MEASUREMENT errors ,HEART beat ,ALGORITHMS - Abstract
The electrocardiogram (ECG) signal describes the heart's electrical activity, enabling the detection of several health conditions, including cardiac system abnormalities and dysfunctions. Nowadays, most patient medical records are still paper-based, especially those made in past decades. The importance of collecting digitized ECGs is twofold: firstly, all medical applications can be easily implemented with an engineering approach if the ECGs are treated as signals; secondly, paper ECGs can deteriorate over time, so a correct evaluation of the patient's clinical evolution is not always guaranteed. The goal of this paper is the realization of an automatic conversion algorithm from paper-based ECGs (images) to digital ECG signals. The algorithm involves a digitization process tested on an image set of 16 subjects, including subjects with pathologies. The quantitative analysis of the digitization method is carried out by evaluating the repeatability and reproducibility of the algorithm. The digitization accuracy is evaluated both on the entire signal and on six ECG time parameters (R-R peak distance, QRS complex duration, QT interval, PQ interval, P-wave duration, and heart rate). Results demonstrate the algorithm's efficiency, with an average Pearson correlation coefficient of 0.94, and measurement errors of the ECG time parameters are always less than 1 mm. Given the promising experimental results, the algorithm could be embedded into a graphical interface, becoming a measurement and collection tool for cardiologists. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
28. ARTIFICIAL INTELLIGENCE IN THE DETECTION OF BID-RIGGING PRACTICES: THE PIONEERING ROLE OF THE ACCO (LA INTEL·LIGÈNCIA ARTIFICIAL EN LA DETECCIÓ DE LES PRÀCTIQUES DE BID RIGGING: EL PAPER CAPDAVANTER DE L'ACCO).
- Author
-
Jiménez Cardona, Noemí
- Subjects
GOVERNMENT purchasing, ARTIFICIAL intelligence, ANTITRUST law, SOFTWARE development tools, CARTELS - Abstract
Copyright of Revista Catalana de Dret Públic is the property of Revista Catalana de Dret Public and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2022
- Full Text
- View/download PDF
29. Utilizing tables, figures, charts and graphs to enhance the readability of a research paper.
- Author
-
Divecha C. A., Tullu M. S., and Karande S.
- Subjects
GRAPHIC arts, READABILITY (Literary style), SERIAL publications, RESEARCH methodology, COPYRIGHT, MEDICAL research, ALGORITHMS - Abstract
The authors offer observation on utilizing tables, figures, charts and graphs to help understand the research presented in a simple manner but also engage and sustain the reader's interest. Topics discussed include benefits provided by the use of tables/figures/charts/graphs, general methodology of design and submission, and copyright issues of using material from government publications/public domain.
- Published
- 2023
- Full Text
- View/download PDF
30. Special issue "Discrete optimization: Theory, algorithms and new applications".
- Author
-
Werner, Frank
- Subjects
MATHEMATICAL optimization, METAHEURISTIC algorithms, ONLINE algorithms, LINEAR matrix inequalities, ALGORITHMS, ROBUST stability analysis, NONLINEAR integral equations - Abstract
This document is an editorial for a special issue of the journal AIMS Mathematics on the topic of discrete optimization. The issue includes 21 papers covering a range of subjects, including molecular trees, network systems, variational inequality problems, scheduling, image restoration, spectral clustering, integral equations, convex functions, graph products, optimization algorithms, air quality prediction, humanitarian planning, inertial methods, neural networks, transportation problems, emotion identification, fixed-point problems, structural engineering design, single machine scheduling, and ensemble learning. The papers present new theoretical results, algorithms, and applications in these areas. The guest editor expresses gratitude to the journal staff and reviewers and hopes that readers will find inspiration for their own research. [Extracted from the article]
- Published
- 2024
- Full Text
- View/download PDF
31. Semantic Segmentation of Examination Papers Based on Subspace Multi-Scale Feature Fusion (基于子空间多尺度特征融合的试卷语义分割).
- Author
-
夏源祥, 刘 渝, 楚程钱, 万永菁, and 蒋翠玲
- Subjects
PYRAMIDS, ALGORITHMS, HANDWRITING, CLASSIFICATION, SUBSPACES (Mathematics) - Abstract
Copyright of Journal of East China University of Science & Technology is the property of Journal of East China University of Science & Technology Editorial Office and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2023
- Full Text
- View/download PDF
32. 23‐1: Invited Paper: Visualization of Color‐Gamut Coverage‐Gamut Ring Intersection.
- Subjects
VISUALIZATION, COLOR, ALGORITHMS - Abstract
The D50 CIELAB gamut volume was standardized as a suitable and unified gamut size measurement methodology. The color gamut volume is visualized using proportionate gamut rings in a two‐dimensional diagram. Gamut ring intersection assists the visual comparison of the size and shape of the color gamut coverage between a display and a reference. This paper introduces a desktop application with an intuitive graphical interface for visualizing the gamut rings and describes the algorithm of the gamut ring intersection. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
33. Selected Papers of the 31st International Workshop on Combinatorial Algorithms, IWOCA 2020.
- Author
-
Gąsieniec, Leszek, Klasing, Ralf, and Radzik, Tomasz
- Subjects
MATHEMATICAL proofs, ALGORITHMS, CHARTS, diagrams, etc., POLYNOMIAL time algorithms, ONLINE algorithms, GRAPH labelings, HAMMING distance - Published
- 2022
- Full Text
- View/download PDF
34. Digital marginalization, data marginalization, and algorithmic exclusions: a critical southern decolonial approach to datafication, algorithms, and digital citizenship from the Souths.
- Author
-
Chaka, Chaka
- Subjects
CITIZENSHIP, DECOLONIZATION, ELECTRONIC paper, ALGORITHMS, COMMUNITIES, CHIEF information officers - Abstract
This paper explores digital marginalization, data marginalization, and algorithmic exclusions in the Souths. To this effect, it argues that underrepresented users and communities continue to be marginalized and excluded by digital technologies, by big data, and by algorithms employed by organizations, corporations, institutions, and governments in various data jurisdictions. Situating data colonialism within the Souths, the paper contends that data ableism, data disablism, and data colonialism are at play when data collected, collated, captured, configured, and processed from underrepresented users and communities is utilized by mega entities for their own multiple purposes. It also maintains that data coloniality, as opposed to data colonialism, is impervious to legal and legislative interventions within data jurisdictions. Additionally, it discusses digital citizenship (DC) and its related emerging regimes. Moreover, the paper argues that digital exclusion transcends the simplistic haves versus the have nots dualism as it manifests itself in multiple layers and in multiple dimensions. Furthermore, it characterizes how algorithmic exclusions tend to perpetuate historical human biases despite the pervasive view that algorithms are autonomous, neutral, rational, objective, fair, unbiased, and non-human. Finally, the paper advances a critical southern decolonial (CSD) approach to datafication, algorithms, and digital citizenship by means of which data coloniality, algorithmic coloniality, and the coloniality embodied in DC have to be critiqued, challenged, and dismantled. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
35. Edge computing-enabled green multisource fusion indoor positioning algorithm based on adaptive particle filter.
- Author
-
Li, Mengyao, Zhu, Rongbo, Ding, Qianao, Wang, Jun, Wan, Shaohua, and Ma, Maode
- Subjects
ADAPTIVE filters, PROBABILITY density function, ALGORITHMS, EDGE computing, FILTER paper, PARTICLE swarm optimization - Abstract
Edge computing enables portable devices to provide smart applications, and indoor positioning techniques offer accurate location-based indoor navigation and personalized smart services. To achieve high positioning accuracy, an indoor positioning algorithm based on a particle filter requires a large number of sample particles to approximate the probability density function, which leads to additional computational cost and high fusion delay. Focusing on real-time and accurate positioning, an edge computing-enabled green multi-source fusion indoor positioning algorithm called APFP, based on an adaptive particle filter, is proposed in this paper. APFP considers both pedestrian dead reckoning (PDR) signals in mobile terminals and the received signal strength indication (RSSI) of Bluetooth, and effectively merges the error-free accumulation of trilateral positioning and the accurate short-range positioning of PDR, which enables mobile terminals to adaptively perform particle filtering to reduce computing time and power consumption while ensuring positioning accuracy. Detailed experimental results show that, compared with the traditional particle filter algorithm and the map-constrained algorithm, the proposed APFP reduces fusion computing cost by 59.89% and 54.37%, respectively. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
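As a rough illustration of the particle-filter fusion idea the abstract above describes (not the authors' APFP algorithm), here is a minimal 1-D sketch that fuses a PDR step estimate with a Bluetooth-RSSI range to a single beacon; the corridor layout, noise model, particle count, and beacon position are all invented for the example:

```python
import math
import random

random.seed(0)

def particle_filter_step(particles, weights, pdr_step, rssi_range, beacon, noise=0.5):
    """One predict/update cycle: move particles by the PDR step estimate,
    then reweight them by how well their distance to the beacon matches
    the RSSI-derived range, and resample."""
    # Predict: apply the dead-reckoned displacement plus process noise.
    particles = [p + pdr_step + random.gauss(0, noise) for p in particles]
    # Update: Gaussian likelihood of the RSSI range given each particle.
    weights = [w * math.exp(-((abs(p - beacon) - rssi_range) ** 2) / (2 * noise ** 2))
               for p, w in zip(particles, weights)]
    total = sum(weights) or 1e-12
    weights = [w / total for w in weights]
    # Resample (multinomial) to concentrate particles on likely positions.
    particles = random.choices(particles, weights=weights, k=len(particles))
    weights = [1.0 / len(particles)] * len(particles)
    return particles, weights

# Pedestrian starts near 0 m and walks 1 m per step; a beacon sits at 10 m.
particles = [random.uniform(-2, 2) for _ in range(500)]
weights = [1.0 / 500] * 500
true_pos = 0.0
for _ in range(5):
    true_pos += 1.0
    particles, weights = particle_filter_step(
        particles, weights, pdr_step=1.0,
        rssi_range=abs(true_pos - 10.0), beacon=10.0)
print(sum(particles) / len(particles))  # estimate close to the true 5.0 m
```

The RSSI range alone is ambiguous (positions 5 m and 15 m both match a 5 m range), which is exactly why fusing it with the PDR motion model disambiguates the estimate.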
36. An Automatic Paper Recommendation Algorithm Based on Multi-View Fusion TextRCNN (多视图融合TextRCNN的论文自动推荐算法).
- Author
-
杨秀璋, 武帅, 杨琪, 项美玉, 李娜, 周既松, and 赵小明
- Subjects
CONVOLUTIONAL neural networks, DEEP learning, MACHINE learning, AUTOMATIC classification, ACCURACY of information, ALGORITHMS - Abstract
Copyright of Journal of Computer Engineering & Applications is the property of Beijing Journal of Computer Engineering & Applications Journal Co Ltd. and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2023
- Full Text
- View/download PDF
37. Aspira Women's Health Announces Publication of Paper Validating OvaWatch(TM) Algorithm in the Detection of Ovarian Cancer
- Subjects
Women -- Health aspects, Cancer -- Diagnosis, Ovarian cancer, Algorithms, Algorithm, Banking, finance and accounting industries, Business - Abstract
AUSTIN, Texas, Jun 16, 2022 (GLOBE NEWSWIRE via COMTEX) -- Aspira Women's Health Inc. (NASDAQ: AWH) ('Aspira'), a bio-analytical based women's health company focused on gynecologic disease, today announced [...]
- Published
- 2022
38. Explainable Rules and Heuristics in AI Algorithm Recommendation Approaches--A Systematic Literature Review and Mapping Study.
- Author
-
García-Peñalvo, Francisco José, Vázquez-Ingelmo, Andrea, and García-Holgado, Alicia
- Subjects
ARTIFICIAL intelligence, LITERATURE reviews, SOFTWARE engineering, ALGORITHMS, HEURISTIC, SOFTWARE engineers - Abstract
The exponential use of artificial intelligence (AI) to solve and automate complex tasks has catapulted its popularity, generating some challenges that need to be addressed. While AI is a powerful means to discover interesting patterns and obtain predictive models, the use of these algorithms comes with a great responsibility, as an incomplete or unbalanced set of training data or an improper interpretation of the models' outcomes could result in misleading conclusions that ultimately could become very dangerous. For these reasons, it is important to rely on expert knowledge when applying these methods. However, not every user can count on this specific expertise; non-AI-expert users could also benefit from applying these powerful algorithms to their domain problems, but they need basic guidelines to get the most out of AI models. The goal of this work is to present a systematic review of the literature to analyze studies whose outcomes are explainable rules and heuristics for selecting suitable AI algorithms given a set of input features. The systematic review follows the methodology proposed by Kitchenham and other authors in the field of software engineering. As a result, 9 papers that tackle AI algorithm recommendation through tangible and traceable rules and heuristics were collected. The reduced number of retrieved papers suggests a lack of reporting of explicit rules and heuristics when testing the suitability and performance of AI algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
39. Research on an Academic Paper Recommendation Algorithm with Side-Information Embedding (边信息嵌入的学术论文推荐算法研究).
- Author
-
沈小烽, 刘柏嵩, 吴俊超, and 钱江波
- Subjects
RANDOM walks, QUALITY factor, PROBLEM solving, ALGORITHMS, COSINE function, SEMANTICS - Abstract
Copyright of Journal of Computer Engineering & Applications is the property of Beijing Journal of Computer Engineering & Applications Journal Co Ltd. and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2022
- Full Text
- View/download PDF
40. Using Paper Texture for Choosing a Suitable Algorithm for Scanned Document Image Binarization.
- Author
-
Lins, Rafael Dueire, Bernardino, Rodrigo, Barboza, Ricardo da Silva, and De Oliveira, Raimundo Correa
- Subjects
DOCUMENT imaging systems, HISTORICAL source material, TEXTURES, ALGORITHMS - Abstract
The intrinsic features of documents, such as paper color, texture, aging, translucency, and the kind of printing, typing or handwriting, are important with regard to how to process and enhance their image. Image binarization is the process of producing a monochromatic image from its color version. It is a key step in the document processing pipeline. The recent Quality-Time Binarization Competitions for documents have shown that no binarization algorithm is good for every kind of document image. This paper uses a sample of the texture of scanned historical documents as the main document feature to select which of 63 widely used algorithms, applied to five different versions of the input images (315 document image-binarization schemes in total), provides a reasonable quality-time trade-off. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
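The competitions cited in the record above compare many binarization algorithms; Otsu's global threshold is one of the classic candidates such a texture-based selector might choose among. A self-contained sketch of Otsu's method (toy pixel values, not the paper's selector):

```python
def otsu_threshold(gray):
    """Global Otsu threshold for 8-bit grayscale values:
    pick the cut that maximises the between-class variance."""
    hist = [0] * 256
    for v in gray:
        hist[v] += 1
    total = len(gray)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var, w_bg, sum_bg = 0, -1.0, 0, 0.0
    for t in range(256):
        w_bg += hist[t]            # pixels <= t form the "background" class
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mu_bg = sum_bg / w_bg
        mu_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mu_bg - mu_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Two well-separated intensity clusters: ink (~40) and paper (~200).
pixels = [38, 40, 42, 41, 39] * 20 + [198, 200, 202, 201, 199] * 20
t = otsu_threshold(pixels)
binary = [0 if v <= t else 255 for v in pixels]
```

The threshold lands between the two clusters, sending every ink pixel to 0 and every paper pixel to 255.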
41. A Paper Defect Detection Method Based on Improved YOLOv5 (基于改进YOLOv5的纸病检测方法).
- Author
-
张开生 and 关凯凯
- Subjects
CLASSIFICATION algorithms, LIGHT sources, FEATURE extraction, ALGORITHMS, SPEED - Abstract
Copyright of China Pulp & Paper is the property of China Pulp & Paper Magazines Publisher and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2022
- Full Text
- View/download PDF
42. Data from Norwegian University of Science and Technology (NTNU) Provide New Insights into Information Technology (Classification of Forensic Hyperspectral Paper Data Using Hybrid Spectral Similarity Algorithms)
- Subjects
Algorithms, Algorithm, Computers - Abstract
2022 FEB 1 (VerticalNews) -- By a News Reporter-Staff News Editor at Information Technology Newsweekly -- Investigators publish new report on Information Technology. According to news reporting originating in Gjovik, [...]
- Published
- 2022
43. ITERATIVE ALGORITHMS FOR VARIATIONAL INCLUSIONS IN BANACH SPACES.
- Author
-
ANSARI, QAMRUL HASAN, BALOOEE, JAVAD, and PETRUŞEL, ADRIAN
- Subjects
BANACH spaces, LIPSCHITZ continuity, PAPER arts, DIFFERENTIAL inclusions, ALGORITHMS - Abstract
The present paper is in two parts. In the first part, we prove the Lipschitz continuity of the proximal mapping associated with a general strongly H-monotone mapping and compute an estimate of its Lipschitz constant under some mild assumptions imposed on the mapping H involved in the proximal mapping. We provide two examples to show that a maximal monotone mapping need not be general H-monotone for a single-valued mapping H from a Banach space to its dual space. A class of multi-valued nonlinear variational inclusion problems is considered, and by using the notion of proximal mapping and Nadler's technique, an iterative algorithm with mixed errors is suggested to compute its solutions. Under some appropriate hypotheses imposed on the mappings and parameters involved in the multi-valued nonlinear variational inclusion problem, the strong convergence of the sequences generated by the proposed algorithm to a solution of the aforesaid problem is verified. The second part of this paper investigates and analyzes the notion of Cn-monotone mappings defined and studied in [S.Z. Nazemi, A new class of monotone mappings and a new class of variational inclusions in Banach spaces, J. Optim. Theory Appl. 155(3)(2012) 785-795]. Several comments related to the results and algorithm that appeared in the above-mentioned paper are given. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
44. Finance journal rankings: a paper affiliation methodology.
- Author
-
Docampo, Domingo and Safón, Vicente
- Subjects
PYTHON programming language, BIBLIOMETRICS, CLASSIFICATION, ALGORITHMS - Abstract
Purpose: In this paper, the authors use a new methodology, called paper affiliation index, to create finance journal ranking using expert judgment and research impact, both of which are based on secondary, objective measures, thus making it possible to produce lists every year without human manipulation at virtually no cost. Design/methodology/approach: Bibliometrics. Python implementation. Findings: A new ranking with 65 finance journals. Research limitations/implications: This procedure helps to reduce bias and to deal with known problems associated with current methodologies. The data used in the methodology comes from public sources; the procedure is therefore easily replicable. This methodology is not subject-dependent and thus can be transferred to other realms of knowledge. Once the bibliometric institutional data has been gathered, the procedure is not computationally costly: a Python implementation of the algorithm executes the whole computation in a few seconds. Results seem to correct the pernicious Matthew effect which is so evident in citation-based methods. Originality/value: The institutional classification created includes all institutions that have contributed papers to the field of finance. The procedure helps to reduce bias and to deal with known problems associated with current methodologies. The data used in the methodology comes from public sources, the procedure is therefore easily replicable. The methodology is not subject-dependent and thus can be transferred to other realms of knowledge. Once the bibliometric institutional data has been gathered, the procedure is not computationally costly. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
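The abstract above notes that, once the bibliometric data is gathered, the whole ranking is a few-second Python computation. The scoring below is a hypothetical stand-in (the actual paper affiliation index formula is not given in the record), showing only the general shape of blending an impact signal with an institutional signal; the journal names, numbers, and `alpha` weight are invented:

```python
# Hypothetical records: (journal, citations_per_paper, share_from_top_institutions)
journals = [
    ("Journal A", 3.1, 0.62),
    ("Journal B", 5.4, 0.35),
    ("Journal C", 1.2, 0.10),
]

def affiliation_style_score(cites, top_share, alpha=0.5):
    """Blend a citation signal with an institutional-prestige signal.
    Both inputs are min-max normalised by the caller; alpha weights them."""
    return alpha * cites + (1 - alpha) * top_share

max_c = max(j[1] for j in journals)
max_s = max(j[2] for j in journals)
ranking = sorted(
    journals,
    key=lambda j: affiliation_style_score(j[1] / max_c, j[2] / max_s),
    reverse=True,
)
print([name for name, _, _ in ranking])  # ['Journal A', 'Journal B', 'Journal C']
```

Because both inputs are normalised and blended, a journal with moderate citations but strong institutional affiliation (Journal A) can outrank a purely citation-dominant one (Journal B), which mirrors the abstract's point about correcting the citation-driven Matthew effect.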
45. Evaluation of fog application placement algorithms: a survey.
- Author
-
Smolka, Sven and Mann, Zoltán Ádám
- Subjects
ALGORITHMS, EDGE computing, ENERGY consumption, FOG, SERVER farms (Computer network management) - Abstract
Recently, the concept of cloud computing has been extended towards the network edge. Devices near the network edge, called fog nodes, offer computing capabilities with low latency to nearby end devices. In the resulting fog computing paradigm (also called edge computing), application components can be deployed to a distributed infrastructure, comprising both cloud data centers and fog nodes. The decision of which infrastructure nodes should host which application components has a large impact on important system parameters like performance and energy consumption. Several algorithms have been proposed to find a good placement of applications on a fog infrastructure. In most cases, the proposed algorithms were evaluated experimentally by the respective authors. In the absence of a theoretical analysis, a thorough and systematic empirical evaluation is of key importance for being able to make sound conclusions about the suitability of the algorithms. The aim of this paper is to survey how application placement algorithms for fog computing are evaluated in the literature. In particular, we identify good and bad practices that should, respectively, be utilized or avoided when evaluating such algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
46. SOFTWARE DEFECT PREDICTION APPROACHES REVISITED.
- Author
-
Shebl, Khaled S., Afify, Yasmine M., and Badr, Nagwa
- Subjects
SEMANTICS, DATABASES, ALGORITHMS, COMPUTER software testing, MACHINE learning - Abstract
A crucial field in software development and testing is Software Defect Prediction (SDP), because the quality, dependability, efficiency, and cost of the software are all improved by forecasting software defects at an earlier stage. Many existing models predict defects to facilitate the software testing process for testers. A comprehensive review of these models from different perspectives is crucial to help new researchers enter this field and learn about its latest developments. Algorithms, method types, datasets, and tools were the only perspectives discussed in the current literature; a comprehensive study that takes into account a wide spectrum of viewpoints has not yet been published. Examining the development and advancement of SDP-related studies is the goal of this literature review. It provides a comprehensive and updated state of the art that satisfies all stated criteria. Out of 591 papers retrieved from 6 reputable databases, 73 papers were eligible for analysis. This review addresses relevant research questions regarding techniques and method types, data details, tools, code syntax, semantics, and structural and domain information. The motivation for conducting this comprehensive review is to equip readers with the necessary information and keep them informed about the software defect prediction domain. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
47. Special Issue on papers from the 2019 Workshop on Models and Algorithms for Planning and Scheduling Problems.
- Author
-
Khuller, Samir
- Subjects
SCHEDULING, ALGORITHMS, ONLINE algorithms - Abstract
The paper "Well-behaved Online Load Balancing Against Strategic Jobs" by Li, Li and Wu considers a truthful online load-balancing problem with the objective of makespan minimization on related machines. The 2019 workshop on models and algorithms for planning and scheduling problems was held in Renesse (The Netherlands). [Extracted from the article]
- Published
- 2023
- Full Text
- View/download PDF
48. Smart Random Walk Distributed Secured Edge Algorithm Using Multi-Regression for Green Network.
- Author
-
Saba, Tanzila, Haseeb, Khalid, Rehman, Amjad, Damaševičius, Robertas, and Bahaj, Saeed Ali
- Subjects
RANDOM walks, ALGORITHMS, ARTIFICIAL intelligence, INTERNET of things, ELECTRONIC paper, INTERNET traffic - Abstract
Smart communication has significantly advanced with the integration of the Internet of Things (IoT). Many devices and online services are utilized in the network system to cope with data gathering and forwarding. Recently, many traffic-aware solutions have explored autonomous systems to attain intelligent routing and flow of internet traffic with the support of artificial intelligence. However, inefficient usage of nodes' batteries and long-range communication degrades the connectivity time for the deployed sensors with the end devices. Moreover, trustworthy route identification is another significant research challenge in formulating a smart system. Therefore, this paper presents a smart Random walk Distributed Secured Edge algorithm (RDSE), using a multi-regression model for IoT networks, which aims to enhance the stability of the chosen IoT network with the support of an optimal system. In addition, by using secured computing, the proposed architecture increases the trustworthiness of smart devices with the least node complexity. The proposed algorithm differs from other works in terms of the following factors. Firstly, it uses a random walk to form the initial routes with certain probabilities and later, by exploring a multi-variant function, attains long-lasting communication with a high degree of network stability. This helps to improve the optimization criteria for the nodes' communication and efficiently utilizes energy in combination with mobile edges. Secondly, the trusted factors successfully identify the normal nodes even when the system is compromised. Therefore, the proposed algorithm reduces data risks and offers a more reliable and private system. In addition, simulation-based testing reveals the significant performance of the proposed algorithm in comparison to existing work. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
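The "random walk to form the initial routes with certain probabilities" in the abstract above can be sketched generically (this is not the RDSE algorithm itself; the topology and the trust/energy scores below are invented, and the hop weight `trust * residual_energy` is an illustrative assumption):

```python
import random

random.seed(1)

# Hypothetical IoT topology: each node maps neighbours to (trust, residual_energy).
graph = {
    "A": {"B": (0.9, 0.8), "C": (0.4, 0.9)},
    "B": {"D": (0.8, 0.7)},
    "C": {"D": (0.5, 0.6)},
    "D": {},
}

def weighted_random_walk(graph, start, sink, max_hops=10):
    """Build a candidate route by walking from `start`, choosing each next
    hop with probability proportional to trust * residual energy."""
    route, node = [start], start
    for _ in range(max_hops):
        if node == sink:
            return route
        nbrs = graph[node]
        if not nbrs:
            return None  # dead end: no route found on this walk
        names = list(nbrs)
        weights = [t * e for t, e in nbrs.values()]
        node = random.choices(names, weights=weights, k=1)[0]
        route.append(node)
    return None

route = weighted_random_walk(graph, "A", "D")
print(route)  # either ['A', 'B', 'D'] or ['A', 'C', 'D']
```

Biasing the walk towards trusted, well-charged neighbours makes the high-trust path through B more likely while still occasionally exploring the alternative via C.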
49. A BPNN Model-Based AdaBoost Algorithm for Estimating Inside Moisture of Oil–Paper Insulation of Power Transformer.
- Author
-
Liu, Jiefeng, Ding, Zheshi, Fan, Xianhao, Geng, Chuhan, Song, Boshu, Wang, Qingyin, and Zhang, Yiyi
- Subjects
POWER transformers, TRANSFORMER insulation, MOISTURE, ALGORITHMS, MACHINE learning, CLASSIFICATION algorithms - Abstract
The traditional method for transformer moisture diagnosis is to establish empirical equations between feature parameters extracted from frequency domain spectroscopy (FDS) and the transformer’s moisture content. However, the established empirical equation may not be applicable to a novel testing environment, resulting in an unreliable evaluation result. In this regard, it is acknowledged that FDS combined with machine learning is more suitable for estimating moisture content in a variety of test environments. Nonetheless, the accuracy of the estimation results obtained using the existing method is limited by the algorithm’s inability to generalize. To address this issue, we propose an AdaBoost algorithm-enhanced back-propagation neural network (BP_AdaBoost). This study creates a database by extracting feature parameters from the FDS that characterize the insulation states of the prepared samples. Then, using the BP_AdaBoost algorithm and the newly constructed database, the moisture estimation models are trained. Finally, the results of the estimation are discussed in terms of laboratory and field transformers. By comparing the proposed BP_AdaBoost algorithm to other intelligence algorithms, it is demonstrated that it not only performs better in generalization, but also maintains a high level of accuracy. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
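As a generic illustration of the AdaBoost-for-regression idea behind BP_AdaBoost described above: the sketch uses regression stumps as the weak learners where the paper uses back-propagation networks, toy (x, y) pairs in place of FDS features and moisture content, and the AdaBoost.R2 reweighting scheme as a plausible stand-in for the authors' unstated boosting variant:

```python
import math

def fit_stump(xs, ys, weights):
    """Weighted regression stump: split at a threshold, predict the
    weighted mean of y on each side (a stand-in for the BP-network
    weak learner used in the paper)."""
    best = None
    for t in sorted(set(xs)):
        left = [(w, y) for x, y, w in zip(xs, ys, weights) if x <= t]
        right = [(w, y) for x, y, w in zip(xs, ys, weights) if x > t]
        if not left or not right:
            continue
        lmean = sum(w * y for w, y in left) / sum(w for w, _ in left)
        rmean = sum(w * y for w, y in right) / sum(w for w, _ in right)
        err = sum(w * (y - (lmean if x <= t else rmean)) ** 2
                  for x, y, w in zip(xs, ys, weights))
        if best is None or err < best[0]:
            best = (err, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda x, t=t, a=lmean, b=rmean: a if x <= t else b

def adaboost_r2(xs, ys, rounds=10):
    """Minimal AdaBoost.R2: up-weight hard examples each round, keep each
    learner's confidence log(1/beta), predict with the weighted median."""
    n = len(xs)
    w = [1.0 / n] * n
    learners = []
    for _ in range(rounds):
        h = fit_stump(xs, ys, w)
        errs = [abs(h(x) - y) for x, y in zip(xs, ys)]
        emax = max(errs) or 1e-12
        losses = [e / emax for e in errs]
        lbar = sum(wi * li for wi, li in zip(w, losses))
        if lbar >= 0.5:          # weak learner no longer better than chance
            break
        beta = max(lbar / (1 - lbar), 1e-12)
        learners.append((math.log(1 / beta), h))
        if lbar == 0:
            break
        w = [wi * beta ** (1 - li) for wi, li in zip(w, losses)]
        s = sum(w)
        w = [wi / s for wi in w]

    def predict(x):
        preds = sorted((h(x), a) for a, h in learners)
        half = sum(a for _, a in preds) / 2
        acc = 0.0
        for p, a in preds:
            acc += a
            if acc >= half:
                return p
        return preds[-1][0]
    return predict

# Toy data: a moisture-like response rising with a dielectric feature value.
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
ys = [1.0, 1.0, 1.0, 1.0, 3.0, 3.0, 3.0, 4.0]
model = adaboost_r2(xs, ys, rounds=5)
```

The early-exit when the average loss reaches 0.5 is the standard AdaBoost.R2 stopping rule; the weighted median combines learners so that a single badly-fit round cannot drag the prediction arbitrarily far.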
50. Special Issue "Scheduling: Algorithms and Applications".
- Author
-
Werner, Frank
- Subjects
METAHEURISTIC algorithms, FLOW shop scheduling, OPTIMIZATION algorithms, ALGORITHMS, ASSEMBLY line balancing, JOB applications - Abstract
The paper [[10]] considers an assignment problem and some modifications which can be converted to routing, distribution, or scheduling problems. This special issue of Algorithms is dedicated to recent developments of scheduling algorithms and new applications. For one of the problems considered, a hybrid metaheuristic algorithm is presented which combines a genetic algorithm with a so-called spotted hyena optimization algorithm. [Extracted from the article]
- Published
- 2023
- Full Text
- View/download PDF