34,411 results
Search Results
2. A fully-automated paper ECG digitisation algorithm using deep learning.
- Author
-
Wu, Huiyi, Patel, Kiran Haresh Kumar, Li, Xinyang, Zhang, Bowen, Galazis, Christoforos, Bajaj, Nikesh, Sau, Arunashis, Shi, Xili, Sun, Lin, Tao, Yanda, Al-Qaysi, Harith, Tarusan, Lawrence, Yasmin, Najira, Grewal, Natasha, Kapoor, Gaurika, Waks, Jonathan W., Kramer, Daniel B., Peters, Nicholas S., and Ng, Fu Siong
- Subjects
DEEP learning ,ELECTROCARDIOGRAPHY ,ELECTRONIC paper ,ATRIAL fibrillation ,ALGORITHMS ,HEART failure ,HEART rate monitors - Abstract
There is increasing focus on applying deep learning methods to electrocardiograms (ECGs), with recent studies showing that neural networks (NNs) can predict future heart failure or atrial fibrillation from the ECG alone. However, large numbers of ECGs are needed to train NNs, and many ECGs are currently only in paper format, which is not suitable for NN training. We developed a fully-automated online ECG digitisation tool to convert scanned paper ECGs into digital signals. Using automated horizontal and vertical anchor point detection, the algorithm automatically segments the ECG image into separate images for the 12 leads, and a dynamical morphological algorithm is then applied to extract the signal of interest. We then validated the performance of the algorithm on 515 digital ECGs, of which 45 were printed, scanned and redigitised. The automated digitisation tool achieved 99.0% correlation between the digitised signals and the ground truth ECG (n = 515 standard 3-by-4 ECGs) after excluding ECGs with overlap of lead signals. Without exclusion, average correlation ranged from 90 to 97% across the leads on all 3-by-4 ECGs. There was a 97% correlation for 12-by-1 and 3-by-1 ECG formats after excluding ECGs with overlap of lead signals. Without exclusion, the average correlation of some leads in 12-by-1 ECGs was 60–70%, and the average correlation of 3-by-1 ECGs reached 80–90%. For ECGs that were printed, scanned, and redigitised, our tool achieved 96% correlation with the original signals. We have developed and validated a fully-automated, user-friendly, online ECG digitisation tool. Unlike other available tools, it does not require any manual segmentation of ECG signals. Our tool can facilitate the rapid and automated digitisation of large repositories of paper ECGs to allow them to be used for deep learning projects. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
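As a deliberately naive illustration of the per-lead signal-extraction step described in the entry above (this is not the authors' dynamical morphological algorithm): take the darkest pixel per column of a single-lead crop and convert rows to millivolts using grid calibration. The `mm_per_px` value is a hypothetical input that would come from detecting the ECG grid.

```python
import numpy as np
from PIL import Image

def extract_trace(lead_image_path, mm_per_px, mv_per_mm=0.1):
    """Extract a 1-D signal from a single-lead ECG crop by taking,
    for each pixel column, the row of the darkest pixel.
    mv_per_mm = 0.1 assumes the standard 10 mm/mV gain."""
    img = np.asarray(Image.open(lead_image_path).convert("L"), dtype=float)
    rows = img.argmin(axis=0)           # darkest pixel per column
    baseline = np.median(rows)          # crude isoelectric-line estimate
    return (baseline - rows) * mm_per_px * mv_per_mm
```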
3. A Machine Learning Model to Predict Citation Counts of Scientific Papers in Otology Field.
- Author
-
Alohali, Yousef A., Fayed, Mahmoud S., Mesallam, Tamer, Abdelsamad, Yassin, Almuhawas, Fida, and Hagr, Abdulrahman
- Subjects
DECISION trees ,SERIAL publications ,NATURAL language processing ,BIBLIOMETRICS ,MACHINE learning ,REGRESSION analysis ,RANDOM forest algorithms ,CITATION analysis ,DESCRIPTIVE statistics ,PREDICTION models ,ARTIFICIAL neural networks ,MEDICAL research ,MEDICAL specialties & specialists ,ALGORITHMS - Abstract
One of the most widely used measures of scientific impact is the number of citations. However, due to their heavy-tailed distribution, citation counts are fundamentally difficult to predict, although predictions can be improved. This study was aimed at investigating the factors and article sections that influence the citation count of a scientific paper in the otology field. Therefore, this work proposes a new solution that utilizes machine learning and natural language processing to process English text and output a paper's predicted citation count. Different algorithms are implemented in this solution, such as linear regression, boosted decision tree, decision forest, and neural networks. The application of neural network regression revealed that papers' abstracts have more influence on the citation numbers of otological articles than their other sections. This new solution has been developed in visual programming, using Microsoft Azure Machine Learning at the back end and Programming Without Coding Technology at the front end. We recommend using machine learning models to improve the abstracts of research articles to get more citations. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
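The entry above gives only the model families used, so the following is a hedged sketch of one such text-to-citations regression in scikit-learn (the authors used Microsoft Azure Machine Learning instead); the abstracts and counts are toy placeholders.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline

# Toy stand-ins for real (abstract, citation-count) pairs.
abstracts = ["cochlear implant outcomes in adults ...",
             "vestibular schwannoma imaging follow-up ...",
             "tinnitus therapy randomized trial ...",
             "otitis media antibiotic management ..."]
citations = [12, 45, 7, 30]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),      # text -> sparse features
    MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
)
model.fit(abstracts, citations)
print(model.predict(["machine learning for hearing aids ..."]))
```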
4. Efficient and Effective Academic Expert Finding on Heterogeneous Graphs through (k, P)-Core based Embedding.
- Author
-
YUXIANG WANG, JUN LIU, XIAOLIANG XU, XIANGYU KE, TIANXING WU, and XIAOXUAN GOU
- Subjects
COMMUNITIES ,SEMANTICS ,ALGORITHMS - Abstract
Expert finding is crucial for a wealth of applications in both academia and industry. Given a user query and a trove of academic papers, expert finding aims at retrieving the most relevant experts for the query from the academic papers. Existing studies focus on embedding-based solutions that consider academic papers’ textual semantic similarities to a query via document representation and extract the top-n experts from the most similar papers. Beyond implicit textual semantics, however, papers’ explicit relationships (e.g., co-authorship) in a heterogeneous graph (e.g., DBLP) are critical for expert finding, because they help improve the representation quality. Despite their importance, the explicit relationships of papers generally have been ignored in the literature. In this article, we study expert finding on heterogeneous graphs by considering both the explicit relationships and implicit textual semantics of papers in one model. Specifically, we define the cohesive (k, P)-core community of papers w.r.t. a meta-path P (i.e., relationship) and propose a (k, P)-core based document embedding model to enhance the representation quality. Based on this, we design a proximity graph-based index (PG-Index) of papers and present a threshold algorithm (TA)-based method to efficiently extract the top-n experts from papers returned by PG-Index. We further optimize our approach in two ways: (1) we boost effectiveness by considering the (k, P)-core community of experts and the diversity of experts’ research interests, to achieve high-quality expert representation from paper representation; and (2) we streamline expert finding, going from “extract top-n experts from top-m (m > n) semantically similar papers” to “directly return top-n experts”. The process of returning a large number of top-m papers as intermediate data is avoided, thereby improving efficiency. Extensive experiments using real-world datasets demonstrate our approach’s superiority. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
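The threshold algorithm (TA) mentioned in the entry above is the classic Fagin-style aggregation scheme; here is a minimal generic sketch, not the paper's PG-Index-coupled variant. Each list is a descending (item, score) ranking (e.g., per-meta-path relevance), and random access is simulated with dictionaries.

```python
import heapq

def threshold_algorithm(ranked_lists, n, agg=sum):
    """Fagin-style TA sketch. ranked_lists: one descending (item, score)
    list per source. Stops as soon as the current n-th best aggregate
    reaches the threshold formed by aggregating the scores last seen
    under sorted access."""
    lookups = [dict(lst) for lst in ranked_lists]
    seen, top = {}, []
    for depth in range(max(len(lst) for lst in ranked_lists)):
        last_scores = []
        for lst in ranked_lists:
            if depth >= len(lst):
                last_scores.append(0.0)
                continue
            item, score = lst[depth]
            last_scores.append(score)
            if item not in seen:                 # random access into every list
                seen[item] = agg(lk.get(item, 0.0) for lk in lookups)
        top = heapq.nlargest(n, seen.items(), key=lambda kv: kv[1])
        if len(top) == n and top[-1][1] >= agg(last_scores):
            break
    return top

lists = [[("a", 0.9), ("b", 0.8), ("c", 0.1)],
         [("b", 0.7), ("a", 0.6), ("c", 0.5)]]
print(threshold_algorithm(lists, n=2))   # top-2 items with aggregate scores
```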
5. Wet Paper Coding-Based Deep Neural Network Watermarking
- Author
-
Xuan Wang, Yuliang Lu, Xuehu Yan, and Long Yu
- Subjects
Neural Networks, Computer ,Electrical and Electronic Engineering ,Biochemistry ,Instrumentation ,deep neural network ,watermarking ,wet paper encoding ,embedding rate ,Atomic and Molecular Physics, and Optics ,Algorithms ,Computer Security ,Analytical Chemistry - Abstract
In recent years, the wide application of deep neural network models has brought serious risks of intellectual property infringement. Embedding a watermark in a network model is an effective solution for protecting intellectual property rights. Although researchers have proposed schemes for adding watermarks to models, these schemes cannot prevent attackers from adding to or overwriting the original information, and their embedding rates cannot be quantified. To address these problems, this paper designs a tamper-proof watermarking scheme with a high embedding rate. We employ wet paper coding (WPC), in which important parameters of the model are regarded as wet blocks and the remaining unimportant parameters are regarded as dry blocks. To obtain the important parameters more easily, we propose an optimized probabilistic selection strategy (OPSS). OPSS defines an unimportant-level function and sets an importance threshold to select the important parameter positions and to ensure that the original function is not affected after the model parameters are changed. We regard the important parameters as an unmodifiable part and only modify the part consisting of the unimportant parameters. We selected the MNIST, CIFAR-10, and ImageNet datasets to test the performance of the model after adding a watermark and to analyze its fidelity, robustness, and embedding rate against comparison schemes. Our experiments show that the proposed scheme has high fidelity and strong robustness, along with a high embedding rate and the ability to prevent malicious tampering.
- Published
- 2022
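A toy sketch of the wet/dry idea in the entry above, under loud assumptions: OPSS is approximated by a simple magnitude quantile, and the embedding is plain parity quantization on the dry parameters. Real wet paper coding instead solves a linear system so the decoder never needs to know which positions were wet; this sketch shares the mask only for simplicity.

```python
import numpy as np

def wet_mask(params, importance_quantile=0.9):
    """Toy stand-in for the paper's OPSS: treat the largest-magnitude
    parameters as 'wet' (important, unmodifiable)."""
    return np.abs(params) >= np.quantile(np.abs(params), importance_quantile)

def embed_bits(params, wet, bits, step=1e-3):
    """Write one bit into the parity of each dry parameter's quantization
    index; the perturbation is at most `step` per parameter."""
    out = params.copy()
    dry = np.flatnonzero(~wet)
    for i, b in zip(dry, bits):
        q = int(np.round(out[i] / step))
        if (q & 1) != int(b):
            q += 1
        out[i] = q * step
    return out

def extract_bits(params, wet, n_bits, step=1e-3):
    dry = np.flatnonzero(~wet)
    return [int(np.round(params[i] / step)) & 1 for i in dry[:n_bits]]

rng = np.random.default_rng(0)
weights = rng.normal(size=1000)              # stand-in for model parameters
wet = wet_mask(weights)
marked = embed_bits(weights, wet, bits=[1, 0, 1, 1, 0])
print(extract_bits(marked, wet, n_bits=5))   # -> [1, 0, 1, 1, 0]
```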
6. Cost Optimal Production-Scheduling Model Based on VNS-NSGA-II Hybrid Algorithm—Study on Tissue Paper Mill.
- Author
-
Zhang, Huanhuan, Li, Jigeng, Hong, Mengna, Man, Yi, and He, Zhenglei
- Subjects
PAPER mills ,FLOW shop scheduling ,PRODUCTION scheduling ,INDUSTRIAL costs ,ALGORITHMS - Abstract
With the development of the customization concept, small-batch and multi-variety production will become one of the major production modes, especially for fast-moving consumer goods. However, this production mode has two issues: high production cost and long manufacturing periods. To address these issues, this study proposes a multi-objective optimization model for the flexible flow-shop (FFS) to optimize production scheduling, maximizing production efficiency by minimizing production cost and makespan. The model is based on hybrid algorithms, which combine the fast non-dominated sorting genetic algorithm (NSGA-II) and a variable neighborhood search (VNS) algorithm. In this model, NSGA-II is the main algorithm for computing the optimal solutions, and VNS improves the quality of the solutions obtained by NSGA-II. The model is verified on a typical real-world FFS, a tissue papermaking mill. The results show that the scheduling model can reduce production costs by 4.2% and makespan by 6.8% compared with manual scheduling. The hybrid VNS-NSGA-II model also outperforms plain NSGA-II in both production cost and makespan. Hybrid algorithms are a good solution for multi-objective optimization issues in flexible flow-shop production scheduling. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
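A minimal sketch of the hybrid structure described in the entry above, assuming a job-sequence encoding and a two-objective (cost, makespan) evaluator. The full NSGA-II machinery (crowding distance, tournament selection) is omitted, and the neighborhood moves are generic placeholders rather than the paper's operators; in the paper, NSGA-II supplies the population and VNS polishes its solutions.

```python
import random

def dominates(a, b):
    """a, b are (cost, makespan) tuples; a dominates b if it is no worse
    in both objectives and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def swap_move(seq):
    s = list(seq)
    i, j = random.sample(range(len(s)), 2)
    s[i], s[j] = s[j], s[i]
    return s

def insert_move(seq):
    s = list(seq)
    i, j = random.sample(range(len(s)), 2)
    s.insert(j, s.pop(i))
    return s

def vns_refine(sol, evaluate, neighborhoods, iters=200):
    """Variable neighborhood search: try increasingly 'larger' moves and
    restart from the first neighborhood whenever one Pareto-improves."""
    best, best_f, k = sol, evaluate(sol), 0
    for _ in range(iters):
        if k >= len(neighborhoods):
            break
        cand = neighborhoods[k](best)
        cand_f = evaluate(cand)
        if dominates(cand_f, best_f):
            best, best_f, k = cand, cand_f, 0   # improvement: back to first move
        else:
            k += 1                              # escalate neighborhood
    return best

def evaluate(seq, p1=(3, 1, 4, 1, 5), p2=(2, 7, 1, 8, 2)):
    """Toy two-machine flow shop: returns (total flow time, makespan)."""
    c1 = c2 = total = 0
    for j in seq:
        c1 += p1[j]
        c2 = max(c1, c2) + p2[j]
        total += c2
    return (total, c2)

best = vns_refine(list(range(5)), evaluate, [swap_move, insert_move])
print(best, evaluate(best))
```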
7. Socio‐technical issues in the platform‐mediated gig economy: A systematic literature review: An Annual Review of Information Science and Technology (ARIST) paper.
- Author
-
Dedema, Meredith and Rosenbaum, Howard
- Subjects
INFORMATION science ,TECHNOLOGY ,CORPORATE culture ,ALGORITHMS ,ECONOMICS - Abstract
The gig economy and gig work have grown quickly in recent years and have drawn much attention from researchers in different fields. Because the platform-mediated gig economy is a relatively new phenomenon, studies have produced a range of interesting findings; of interest here are the socio-technical issues that this work has surfaced. This systematic literature review (SLR) provides a snapshot of the socio-technical issues raised in the last 12 years of literature focused on the platform-mediated gig economy. From a sample of 515 papers gathered from nine databases in multiple disciplines, the 132 papers that specifically studied the gig economy, gig work, and gig workers were coded. Three main socio-technical themes were identified: (1) the digital workplace, which includes information infrastructure and digital labor as they relate to the nature of gig work and user agency; (2) algorithmic management, which includes platform governance, performance management, information asymmetry, power asymmetry, and system manipulation, relying on a diverse set of technological tools including algorithms and big data analytics; and (3) ethical design, a value set that gig workers expect from the platform, which includes trust, fairness, equality, privacy, and transparency. A social informatics perspective is used to rethink the relationship between gig workers and platforms, extract the socio-technical issues noted in prior research, and discuss the underexplored aspects of the platform-mediated gig economy. The results draw attention to understudied yet critically important socio-technical issues in the gig economy that suggest short- and long-term opportunities for future research. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
9. Research on the Fusion of Hybrid Fuzzy Clustering Algorithm and Computer Automatic Test Paper Composition Algorithm.
- Author
-
Kan, Baopeng
- Subjects
COMPUTERS ,COMPUTER algorithms ,FUZZY algorithms ,COMPUTER workstation clusters ,ALGORITHMS ,HIGHER education exams - Abstract
In order to improve intelligent automatic test paper composition, this paper combines a hybrid fuzzy clustering algorithm with a computer automatic test paper composition algorithm and constructs a test paper composition system on that basis. The hybrid fuzzy clustering method serves as the basic algorithm of the system and is improved according to the actual needs of intelligent paper composition. In addition, an intelligent algorithm takes the relevant constraint parameters as input and, combined with the original parameters, selects the most suitable test questions from the database and assembles them into test papers. Finally, the system structure is constructed based on the requirements of intelligent test paper composition. Experimental research shows that the proposed computer automatic test paper composition system based on the hybrid fuzzy clustering algorithm composes test papers well and can effectively promote the progress of intelligent examination modes in colleges and universities. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
10. Tools and algorithms for the construction and analysis of systems: a special issue on tool papers for TACAS 2021.
- Author
-
Jensen, Peter Gjøl and Neele, Thomas
- Subjects
ALGORITHMS ,SOFTWARE verification ,INTEGRATED circuit verification ,SYSTEMS software ,CONFERENCES & conventions - Abstract
This special issue contains six revised and extended versions of tool papers that appeared in the proceedings of TACAS 2021, the 27th International Conference on Tools and Algorithms for the Construction and Analysis of Systems. The issue is dedicated to the realization of algorithms in tools and to studies of the application of these tools for analysing hardware and software systems. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
11. Visualization Display System of Gannan Hakka Paper-Cut Works Based on Computer Graphics Algorithm
- Author
-
Xingping Li
- Subjects
Article Subject ,General Computer Science ,General Mathematics ,General Neuroscience ,Computer Graphics ,Image Processing, Computer-Assisted ,General Medicine ,Algorithms - Abstract
Today, computer graphics and graphic image processing techniques are widely used in daily life and industrial production, and the development of computers has brought more convenience to our daily lives. In order to give full play to the value of computers, this paper takes the Hakka paper-cut art with local characteristics as its starting point: it first reviews the art's development history, artistic characteristics, compositional forms, expression techniques, cultural connotations, paper-cut patterns, and the symbolic meanings of its folk customs, and then designs a visualization display system for Gannan Hakka paper-cut works based on computer graphics. In addition, the system provides a solution for integrating Gannan Hakka paper-cut art into Jiangxi native-product packaging design and provides a reference for the theory and practice of modern native-product packaging design.
- Published
- 2022
12. Performance Evaluation of the Extractive Methods in Automatic Text Summarization Using Medical Papers.
- Author
-
Kus, Anil and Aci, Cigdem Inan
- Subjects
PERFORMANCE evaluation ,TEXT summarization ,MEDICAL sciences ,ALGORITHMS ,SEMANTICS - Published
- 2023
- Full Text
- View/download PDF
14. Numerical simulations of paper-based electrophoretic separations with open-source tools
- Author
-
Santiago Marquez Damian, Federico Schaumburg, Nicolas Franck, Gabriel Gerlero, and Pablo A. Kler
- Subjects
Finite volume method ,Computer science ,Clinical Biochemistry ,Microfluidics ,Paper based ,Models, Theoretical ,Supercomputer ,Biochemistry ,Toolbox ,Analytical Chemistry ,Image (mathematics) ,Computational science ,Open source ,Lab-On-A-Chip Devices ,Compatibility (mechanics) ,Algorithms ,Software - Abstract
A new tool for the solution of electromigrative separations in paper-based microfluidic devices is presented. The implementation is based on a recently published complete mathematical model for describing these types of separations, and was developed on top of the open-source toolbox electroMicroTransport, based on OpenFOAM®, inheriting all its features such as native 3D problem handling, support for parallel computation, and a GNU GPL license. The presented tool includes full support for paper-based electromigrative separations (including EOF and the novel mechanical and electrical dispersion effects), compatibility with a well-recognized electrolyte database, and a novel algorithm for computing and controlling the electric current in arbitrary geometries. Additionally, installation on any operating system is possible thanks to a novel installation option in the form of a Docker image. A validation example with data from the literature is included, and two further application examples are provided, including a 2D free-flow IEF problem, which demonstrates the capabilities of the toolbox for dealing with computational and physicochemical modeling challenges simultaneously. This tool will enable efficient and reliable numerical prototyping of paper-based electrophoretic devices to accompany the contemporary fast growth of paper-based microfluidics.
- Published
- 2021
15. Discussion paper: implications for the further development of the successfully in emergency medicine implemented AUD2IT-algorithm.
- Author
-
Przestrzelski, Christopher, Jakob, Antonina, Jakob, Clemens, and Hoffmann, Felix R.
- Subjects
DOCUMENTATION ,CURRICULUM ,HUMAN services programs ,EMERGENCY medicine ,EXPERIENCE ,MEDICAL records ,ELECTRONIC publications ,ALGORITHMS ,PATIENTS' attitudes - Abstract
The AUD2IT-algorithm is a tool for structuring the data collected during an emergency treatment. Its goal is, on the one hand, to structure the documentation of the data and, on the other hand, to provide a standardised data structure for the report during handover of an emergency patient. The AUD2IT-algorithm was developed to provide residents with a documentation aid that helps to structure medical reports without getting lost in unimportant details or forgetting important information. The sequence of anamnesis, clinical examination, consideration of differential diagnoses, technical diagnostics, interpretation and therapy is an academic classification rather than a description of the real workflow; in a real setting, most of these steps take place simultaneously. Therefore, the application of the AUD2IT-algorithm should also follow the real processes. A big advantage of the AUD2IT-algorithm is that it can be used as a structure for the entire treatment process and is also fully usable as a handover protocol within this process, to ensure that the current state of knowledge is shared at each team time-out. The PR-E-(AUD2IT)-algorithm makes it possible to document a treatment process that, in principle, does not have to be limited to the field of emergency medicine. The PR-E-(AUD2IT)-algorithm could also be used and further developed in outpatient treatment; one example could be the preparation and allocation of needed resources at the general practitioner's. The algorithm is a standardised tool that can be used by healthcare professionals at any level of training, and it gives the user a sense of security in their daily work. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. A mathematical foundation for foundation paper pieceable quilts.
- Author
-
Leake, Mackenzie, Bernstein, Gilbert, Davis, Abe, and Agrawala, Maneesh
- Subjects
QUILTS ,QUILTING ,PATCHWORK quilts ,SEWING patterns ,ALGORITHMS - Abstract
Foundation paper piecing is a popular technique for constructing fabric patchwork quilts using printed paper patterns. But the construction process imposes constraints on the geometry of the pattern and the order in which the fabric pieces are attached to the quilt. Manually designing foundation paper pieceable patterns that meet all of these constraints is challenging. In this work we mathematically formalize the foundation paper piecing process and use this formalization to develop an algorithm that can automatically check if an input pattern geometry is foundation paper pieceable. Our key insight is that we can represent the geometric pattern design using a certain type of dual hypergraph where nodes represent faces and hyperedges represent seams connecting two or more nodes. We show that determining whether the pattern is paper pieceable is equivalent to checking whether this hypergraph is acyclic, and if it is acyclic, we can apply a leaf-plucking algorithm to the hypergraph to generate viable sewing orders for the pattern geometry. We implement this algorithm in a design tool that allows quilt designers to focus on producing the geometric design of their pattern and let the tool handle the tedious task of determining whether the pattern is foundation paper pieceable. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
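The acyclicity test in the entry above lends itself to a compact sketch. The reduction below is a GYO-style elimination on the dual hypergraph (faces as nodes, seams as hyperedges): repeatedly pluck a face lying on at most one remaining seam; if every face can be plucked, the hypergraph is acyclic and reversing the pluck order gives one candidate sewing order. The paper's exact leaf definition and order-generation details may differ.

```python
def pluck_order(faces, seams):
    """faces: iterable of face ids; seams: list of sets of faces joined
    by each seam. Returns a pluck order if the hypergraph is acyclic
    (pattern pieceable), else None."""
    faces = set(faces)
    seams = [set(s) for s in seams]
    order, changed = [], True
    while changed and faces:
        changed = False
        for f in list(faces):
            if sum(f in s for s in seams) <= 1:   # f is a 'leaf' face
                faces.discard(f)
                for s in seams:
                    s.discard(f)
                seams = [s for s in seams if len(s) > 1]  # drop exhausted seams
                order.append(f)
                changed = True
    return order if not faces else None

# Three faces joined by one seam, plus one more face sewn on: acyclic.
print(pluck_order([1, 2, 3, 4], [{1, 2, 3}, {3, 4}]))   # e.g. [1, 2, 3, 4]
# A triangle of pairwise seams is cyclic, hence not pieceable.
print(pluck_order([1, 2, 3], [{1, 2}, {2, 3}, {1, 3}]))  # None
```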
17. Digitalized Control Algorithm of Bridgeless Totem-Pole PFC with a Simple Control Structure Based on the Phase Angle.
- Author
-
Lee, Gi-Young, Park, Hae-Chan, Ji, Min-Woo, and Kim, Rae-Young
- Subjects
ELECTRIC current rectifiers ,ELECTRONIC paper ,PHASE-locked loops ,ALGORITHMS ,ANGLES ,VOLTAGE - Abstract
Compared to the conventional boost power factor correction (PFC) converter, a totem-pole bridgeless PFC has high efficiency because it does not have an input diode rectifier stage, but a current spike may occur when the polarity of the grid voltage changes. This paper proposes a digital control algorithm for a bridgeless totem-pole PFC with a simple control structure based on the phase angle of the grid voltage. The proposed algorithm has a PI-based double-loop control structure and performs DC-link voltage and input inductor current control. The rectifying switches operate based on the proposed rectification algorithm, using phase angle information calculated through a single-phase phase-locked loop (PLL), to prevent current spikes. The feed-forward duty ratio value is calculated according to the polarity of the grid voltage and added to the double-loop controller to perform appropriate power factor control. The performance and feasibility of the proposed control algorithm are verified through a 3 kW hardware prototype. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
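A schematic sketch of the double-loop structure described in the entry above. All gains, ratings, and the feed-forward form are illustrative assumptions, not values from the paper: the outer PI turns the DC-link voltage error into a current-amplitude reference, the inner PI tracks the inductor current, and the PLL phase angle selects the rectifying-leg polarity.

```python
import math

class PI:
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt, self.acc = kp, ki, dt, 0.0
    def step(self, err):
        self.acc += err * self.dt
        return self.kp * err + self.ki * self.acc

DT = 1 / 50_000                        # hypothetical 50 kHz control loop
v_loop = PI(kp=0.05, ki=5.0, dt=DT)    # outer DC-link voltage loop
i_loop = PI(kp=0.10, ki=50.0, dt=DT)   # inner inductor current loop

def control_step(theta, v_grid, i_l, v_dc, v_dc_ref=400.0):
    """One digital control step of a totem-pole PFC (sketch).
    theta: grid phase angle from a single-phase PLL, in radians."""
    positive_half = math.sin(theta) >= 0.0       # rectifying-leg polarity
    i_amp_ref = v_loop.step(v_dc_ref - v_dc)     # outer loop -> amplitude ref
    i_ref = i_amp_ref * abs(math.sin(theta))     # shape reference to |grid|
    duty_ff = 1.0 - abs(v_grid) / v_dc           # boost feed-forward term
    duty = duty_ff + i_loop.step(i_ref - abs(i_l))
    return positive_half, min(max(duty, 0.0), 1.0)

print(control_step(theta=0.5, v_grid=156.0, i_l=2.0, v_dc=395.0))
```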
18. MIRSIG position paper: the use of image registration and fusion algorithms in radiotherapy
- Author
-
Nicholas, Lowther, Rob, Louwe, Johnson, Yuen, Nicholas, Hardcastle, Adam, Yeo, and Michael, Jameson
- Subjects
Radiological and Ultrasound Technology ,Radiotherapy Planning, Computer-Assisted ,Image Processing, Computer-Assisted ,Biomedical Engineering ,Biophysics ,Humans ,Radiotherapy Dosage ,Radiology, Nuclear Medicine and imaging ,Instrumentation ,Algorithms ,Biotechnology - Abstract
The report of the American Association of Physicists in Medicine (AAPM) Task Group No. 132 published in 2017 reviewed rigid image registration and deformable image registration (DIR) approaches and solutions to provide recommendations for quality assurance and quality control of clinical image registration and fusion techniques in radiotherapy. However, that report did not include the use of DIR for advanced applications such as dose warping or warping of other matrices of interest. Considering that DIR warping tools are now readily available, discussions were hosted by the Medical Image Registration Special Interest Group (MIRSIG) of the Australasian College of Physical Scientists & Engineers in Medicine in 2018 to form a consensus on best practice guidelines. This position statement authored by MIRSIG endorses the recommendations of the report of AAPM task group 132 and expands on the best practice advice from the ‘Deforming to Best Practice’ MIRSIG publication to provide guidelines on the use of DIR for advanced applications.
- Published
- 2022
19. Society of Skeletal Radiology– white paper. Guidelines for the diagnostic management of incidental solitary bone lesions on CT and MRI in adults: bone reporting and data system (Bone-RADS)
- Author
-
Connie Y. Chang, Hillary W. Garner, Shivani Ahlawat, Behrang Amini, Matthew D. Bucknor, Jonathan A. Flug, Iman Khodarahmi, Michael E. Mulligan, Jeffrey J. Peterson, Geoffrey M. Riley, Mohammad Samim, Santiago A. Lozano-Calderon, and Jim S. Wu
- Subjects
Adult ,Humans ,Radiology, Nuclear Medicine and imaging ,Radiology ,Tomography, X-Ray Computed ,Magnetic Resonance Imaging ,Algorithms - Abstract
The purpose of this article is to present algorithms for the diagnostic management of solitary bone lesions incidentally encountered on computed tomography (CT) and magnetic resonance imaging (MRI) in adults. Based on a review of the current literature and expert opinion, the Practice Guidelines and Technical Standards Committee of the Society of Skeletal Radiology (SSR) proposes a bone reporting and data system (Bone-RADS) for incidentally encountered solitary bone lesions on CT and MRI with four possible diagnostic management recommendations (Bone-RADS1, leave alone; Bone-RADS2, perform different imaging modality; Bone-RADS3, perform follow-up imaging; Bone-RADS4, biopsy and/or oncologic referral). Two algorithms for CT based on lesion density (lucent or sclerotic/mixed) and two for MRI allow the user to arrive at a specific Bone-RADS management recommendation. Representative cases are provided to illustrate the usability of the algorithms.
- Published
- 2022
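The four management categories above map naturally onto a small decision function. The sketch below is a toy: the two flags are illustrative placeholders for the many findings (pain, cortical involvement, size, and so on) that the published CT/MRI algorithms actually branch on.

```python
from enum import IntEnum

class BoneRADS(IntEnum):
    LEAVE_ALONE = 1          # Bone-RADS1
    DIFFERENT_MODALITY = 2   # Bone-RADS2
    FOLLOW_UP_IMAGING = 3    # Bone-RADS3
    BIOPSY_OR_REFERRAL = 4   # Bone-RADS4

def triage_ct_lucent(has_aggressive_features: bool,
                     classic_benign_appearance: bool) -> BoneRADS:
    """Toy triage for a lucent lesion on CT; hypothetical branching only."""
    if has_aggressive_features:
        return BoneRADS.BIOPSY_OR_REFERRAL
    if classic_benign_appearance:
        return BoneRADS.LEAVE_ALONE
    return BoneRADS.FOLLOW_UP_IMAGING

print(triage_ct_lucent(False, True))   # BoneRADS.LEAVE_ALONE
```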
20. Scientific papers and artificial intelligence. Brave new world?
- Author
-
Nexøe, Jørgen
- Subjects
COMPUTERS ,MANUSCRIPTS ,ARTIFICIAL intelligence ,MACHINE learning ,DATA analysis ,MEDICAL literature ,MEDICAL research ,ALGORITHMS - Published
- 2023
- Full Text
- View/download PDF
21. Computerized automated algorithm-based analyses of digitized paper ECGs in Brugada syndrome
- Author
-
Antoine Leenhardt, Pierre Maison-Blanche, Isabelle Denjoy, Fabio Badilini, Pierre-Léo Laporte, Fabrice Extramiana, and Martino Vaglio
- Subjects
Adult ,Male ,Acute effects ,Sudden death ,Electrocardiography ,QRS complex ,Internal medicine ,Humans ,Repolarization ,Cardiovascular diseases ,Brugada syndrome ,Class I antiarrhythmic drug ,Middle Aged ,Increased risk ,Automated algorithm ,Cardiology ,Female ,Cardiology and Cardiovascular Medicine ,Anti-Arrhythmia Agents ,Algorithms ,Software - Abstract
Background: Brugada syndrome is a rare inherited arrhythmic syndrome with a coved type 1 ST-segment elevation on ECG and an increased risk of sudden death. Many studies have evaluated risk stratification performance based on ECG-derived parameters. However, since historical Brugada patient cohorts consist mostly of paper ECGs, most studies have been based on manual ECG parameter measurements. We hypothesized that it would be possible to run automated algorithm-based analysis on paper ECGs. We aimed: 1) to validate the digitization process for paper ECGs in Brugada patients; and 2) to quantify the acute class I antiarrhythmic drug effect on relevant ECG parameters in Brugada syndrome. Methods: A total of 176 patients (30% female, 43 ± 13 years old) with induced type 1 Brugada syndrome ECG were included in the study. All of the patients had paper ECGs before and during class I antiarrhythmic drug challenge. Twenty patients also had a digital ECG, whose printouts were used to validate the digitization process. Paper ECGs were scanned and then digitized using ECGScan software, version 3.4.0 (AMPS, LLC, New York, NY, USA) to obtain FDA HL7 XML format ECGs. Measurements were performed automatically using the Bravo (AMPS, LLC, New York, NY, USA) and Glasgow algorithms. Results: ECG parameters obtained from digital and digitized ECGs were closely correlated (r = 0.96 ± 0.07, R2 = 0.93 ± 0.12). Class I antiarrhythmic drugs significantly increased the global QRS duration (from 113 ± 20 to 138 ± 23 ms). Conclusions: Automated algorithm-based measurements of depolarization and repolarization parameters from digitized paper ECGs are reliable and could quantify the acute effects of class I antiarrhythmic drug challenge in Brugada patients. Our results support using computerized automated algorithm-based analyses of digitized paper ECGs to establish risk stratification decision trees in Brugada syndrome.
- Published
- 2021
22. Evolution of federated learning based on multi-objective optimization.
- Author
-
胡智勇, 于千城, 王之赐, and 张丽丝
- Subjects
FEDERATED learning ,ALGORITHMS ,PRIVACY - Published
- 2024
- Full Text
- View/download PDF
23. Automated analysis of pen-on-paper spirals for tremor detection, quantification, and differentiation.
- Author
-
Rajan, Roopa, Anandapadmanabhan, Reghu, Nageswaran, Sharmila, Radhakrishnan, Vineeth, Saini, Arti, Krishnan, Syam, Gupta, Anu, Vishnu, Venugopalan Y., Pandit, Awadh K., Singh, Rajesh Kumar, Radhakrishnan, Divya M, Singh, Mamta Bhushan, Bhatia, Rohit, Srivastava, Achal, Kishore, Asha, and Padma Srivastava, M. V.
- Subjects
STATISTICS ,RESEARCH ,CONFIDENCE intervals ,ANALYSIS of variance ,TASK performance ,HANDWRITING ,ACCELEROMETERS ,DYSTONIA ,MOVEMENT disorders ,TREMOR ,DRAWING ,DESCRIPTIVE statistics ,PARKINSON'S disease ,SENSITIVITY & specificity (Statistics) ,DATA analysis ,RECEIVER operating characteristic curves ,DATA analysis software ,ALGORITHMS - Abstract
OBJECTIVE: To develop an automated algorithm to detect, quantify, and differentiate between tremors using pen-on-paper spirals. METHODS: Patients with essential tremor (n = 25), dystonic tremor (n = 25), Parkinson’s disease (n = 25), and healthy volunteers (HV, n = 25) drew free-hand spirals. The algorithm derived the mean deviation (MD) and tremor variability from scanned images. MD and tremor variability were compared with 1) the Bain and Findley scale, 2) the Fahn–Tolosa–Marin tremor rating scale (FTM–TRS), and 3) the peak power and total power of the accelerometer spectra. Inter- and intra-loop widths were computed to differentiate between the tremors. RESULTS: MD was higher in the tremor group (48.9±26.3) than in HV (26.4±5.3; p < 0.001). A cut-off value of 30.3 had 80.9% sensitivity and 76.0% specificity for the detection of tremor [area under the curve: 0.83; 95% confidence interval (CI): 0.75, 0.91; p < 0.001]. MD correlated with the Bain and Findley ratings (rho = 0.491, p < 0.001), FTM–TRS part B (rho = 0.260, p = 0.032), and accelerometric measures of postural tremor (total power, rho = 0.366, p < 0.001; peak power, rho = 0.402, p < 0.001). The minimum detectable change was 19.9%. Inter-loop width distinguished Parkinson’s disease spirals from dystonic tremor (p < 0.001, 95% CI: 54.6, 211.1), essential tremor (p = 0.003, 95% CI: 28.5, 184.9), or HV (p = 0.036, 95% CI: -160.4, -3.9). CONCLUSION: The automated analysis of pen-on-paper spirals generated robust variables to quantify tremors and putative variables to distinguish them from each other. SIGNIFICANCE: This technique may be useful for epidemiological surveys and follow-up studies on tremor. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
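The entry above does not define MD beyond "mean deviation from scanned images", so the following is only a plausible analogue: fit an ideal Archimedean spiral to the digitized drawing (points assumed ordered along the stroke) and report the mean absolute radial residual.

```python
import numpy as np

def mean_deviation(x, y, centre=None):
    """Unwrap the drawing to polar coordinates around `centre` (default:
    the point centroid), least-squares fit r = a + b*theta, and return
    the mean absolute radial residual."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    cx, cy = centre if centre is not None else (x.mean(), y.mean())
    r = np.hypot(x - cx, y - cy)
    theta = np.unwrap(np.arctan2(y - cy, x - cx))   # assumes ordered stroke
    A = np.column_stack([np.ones_like(theta), theta])
    (a, b), *_ = np.linalg.lstsq(A, r, rcond=None)
    return float(np.mean(np.abs(r - (a + b * theta))))

t = np.linspace(0, 6 * np.pi, 500)
x, y = (5 + 2 * t) * np.cos(t), (5 + 2 * t) * np.sin(t)
print(mean_deviation(x, y, centre=(0.0, 0.0)))   # ~0 for an ideal spiral
```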
24. Hybrid Methods of Bibliographic Coupling and Text Similarity Measurement for Biomedical Paper Recommendation
- Author
-
Hongmei, Guo, Zhesi, Shen, Jianxun, Zeng, and Na, Hong
- Subjects
Cluster Analysis ,Algorithms - Abstract
The amount of available scientific literature is increasing, and studies have proposed various methods for evaluating document-document similarity in order to cluster or classify documents for science mapping and knowledge discovery. In this paper, we propose hybrid methods for bibliographic coupling (BC) and linear evaluation of text or content similarity: we combined BC with BM25, Cosine, and PMRA to compare their performances with single methods in paper recommendation tasks using TREC Genomics Track 2005 datasets. For paper recommendation, BC and text-based methods complement each other, and hybrid methods performed better than single methods. The combinations of BC with BM25 and BC with Cosine performed better than BC with PMRA. The performances were best when the weights of BM25, Cosine, and PMRA were 0.025, 0.2, and 0.2, respectively, in the hybrid methods. For paper recommendation, the combinations of BC with text-based methods were better than BC or text-based methods used alone. The choice of method should depend on the actual data and research needs. In the future, the underlying reasons for the differences in performance and the specific part or type of information they complement in text clustering or recommendation need to be examined.
- Published
- 2022
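A minimal sketch of the linear combination described in the entry above. The normalization of the coupling strength is an assumption (the record does not specify it), and `text_sim` stands in for whichever text measure (BM25, Cosine, or PMRA) is being combined; w_text = 0.2 mirrors the best reported Cosine/PMRA weight, with 0.025 reported for BM25.

```python
def bibliographic_coupling(refs_a, refs_b):
    """BC strength: shared references, normalized here to [0, 1]."""
    shared = len(set(refs_a) & set(refs_b))
    return shared / max(1, min(len(set(refs_a)), len(set(refs_b))))

def hybrid_similarity(doc_a, doc_b, text_sim, w_text=0.2):
    """Linear combination of coupling and a pluggable text measure."""
    return bibliographic_coupling(doc_a["refs"], doc_b["refs"]) \
        + w_text * text_sim(doc_a["text"], doc_b["text"])

doc_a = {"refs": {"r1", "r2", "r3"}, "text": "expert finding on graphs"}
doc_b = {"refs": {"r2", "r3", "r4"}, "text": "graph based retrieval"}
jaccard = lambda s, t: len(set(s.split()) & set(t.split())) / \
                       len(set(s.split()) | set(t.split()))
print(hybrid_similarity(doc_a, doc_b, jaccard))   # toy text measure
```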
25. A review paper of optimal resource allocation algorithm in cloud environment.
- Author
-
Patadiya, Namrata and Bhatt, Nirav
- Subjects
RESOURCE allocation ,LITERATURE reviews ,SERVICE level agreements ,ALGORITHMS ,ELECTRONIC data processing ,CLOUD computing - Abstract
Cloud computing has become a popular approach for processing data and running computationally expensive services on a pay-as-you-go basis. Due to the ever-increasing demand for cloud-based apps, appropriately allocating resources according to user requests while meeting the service-level agreements between customers and service providers has become increasingly complex. An efficient and versatile resource allocation method is required to deploy these assets properly and meet user needs, and the task of distributing resources has become more arduous as user demand has increased. Designing optimal solutions for this problem is one of the key areas of interest for research experts. In this paper, a literature review of proposed dynamic resource allocation approaches is presented. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
26. Classification of forensic hyperspectral paper data using hybrid spectral similarity algorithms.
- Author
-
Devassy, Binu Melit, George, Sony, Nussbaum, Peter, and Thomas, Tessamma
- Subjects
SPECTRAL imaging ,FORGERY ,ALGORITHMS ,FORENSIC sciences ,CLASSIFICATION ,CONFIDENCE intervals ,CLASSIFICATION algorithms - Abstract
Document forgeries that involve modification of the materials used, such as ink and paper, provide evidence of any malpractice being performed. Forensic specialists use different techniques to identify and classify these samples; however, the preferred approach is to use nondestructive techniques to avoid any potential damage to the original specimen under investigation. Hyperspectral imaging has already been explored in several application domains and used as a powerful method in forensic investigations to extract information about the materials under examination. To precisely classify the material information and utilize the hyperspectral imaging technique's potential, we probed the ability of several hybrid spectral similarity measures to classify different commonly used paper samples, comparing the classification capabilities of these algorithms on hyperspectral data of 40 different paper samples. A comparison of these methods is quantitatively presented in this article. The overall accuracy (OA), kappa (K̂), Z‐score of kappa (ZK̂), and the 95% confidence interval of kappa (CI(K̂)) are used for comparison. SID‐SAM and SID‐SCA produced overall accuracies of 88% and 87%, respectively, the highest among the hybrid spectral similarity measures tested. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
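SID-SAM, the best performer in the entry above, is a standard hybrid of Spectral Information Divergence and the Spectral Angle Mapper, commonly computed as SID × tan(SAM); a minimal sketch for two spectra stored as nonnegative NumPy vectors:

```python
import numpy as np

def sam(x, y):
    """Spectral Angle Mapper: angle between spectra, in radians."""
    c = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return float(np.arccos(np.clip(c, -1.0, 1.0)))

def sid(x, y, eps=1e-12):
    """Spectral Information Divergence: symmetric KL divergence between
    the spectra normalized to probability distributions."""
    p, q = x / x.sum(), y / y.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps)))
                 + np.sum(q * np.log((q + eps) / (p + eps))))

def sid_sam(x, y):
    """Hybrid measure: smaller value means more similar spectra."""
    return sid(x, y) * np.tan(sam(x, y))

x = np.array([0.20, 0.40, 0.30, 0.10])
y = np.array([0.25, 0.35, 0.30, 0.10])
print(sid_sam(x, y))   # small value -> similar paper spectra
```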
27. Selected Papers of the 32nd International Workshop on Combinatorial Algorithms, IWOCA 2021.
- Author
-
Flocchini, Paola and Moura, Lucia
- Subjects
EULERIAN graphs ,ALGORITHMS ,APPROXIMATION algorithms ,WEB hosting - Abstract
IWOCA (International Workshop on Combinatorial Algorithms) is an annual conference series covering all aspects of combinatorial algorithms. Among the selected papers, the authors give fixed-parameter tractable algorithms for a problem parameterized by various structural parameters, as well as a greedy loop-free algorithm for exhaustive generation, a successor algorithm that runs in constant amortized time, and results for the fixed-spin generalization of that problem, among other algorithms. [Extracted from the article]
- Published
- 2023
- Full Text
- View/download PDF
28. Letter on the results of the BASiNET method in the paper 'A systematic evaluation of computational tools for lncRNA identification'
- Author
-
Fabrício Martins Lopes and Matheus H Pimenta-Zanon
- Subjects
Computational Biology ,RNA, Long Noncoding ,Molecular Biology ,Algorithms ,Information Systems - Abstract
This letter points out a conceptual error made by the authors of a published paper, which presents a review and evaluation of computational methods for lncRNA identification. The error was made in the execution of the BASiNET method, when an example file (toy model), made available by the BASiNET authors to show how a classification model could be stored in a file for later use, was treated as a trained model for evaluation. In this letter, the error is contextualized, the correct use of the BASiNET method is described, and the results of its correct execution on one of the datasets used in the review article are presented. The results clearly show the misuse of the method and present its correct use, so that it can be fairly compared with other methods in the literature and so that the misuse is not replicated by new studies.
- Published
- 2022
29. 'Re-Materialized' Medical Data: Paper-Based Transmission of Structured Medical Data Using QR-Code, for Medical Imaging Reports
- Author
-
Arthur, Lauriot Dit Prevost, Raphaël, Bentegeac, Audrey, Dequesnes, Adrien, Billiau, Emmanuel, Baudelet, Rémi, Legleye, Marc-Antoine, Hubaut, Michel, Cassagnou, Philippe, Puech, Rémi, Besson, and Emmanuel, Chazard
- Subjects
Diagnostic Imaging ,Radiography ,Information Storage and Retrieval ,Smartphone ,Algorithms - Abstract
Although paper-based transmission of medical information might seem outdated, it has proven efficient and remains structurally safe from massive data leaks. As part of the ICIPEMIR project for improving medical imaging reports, we explored the idea of structured data storage within a medical report by embedding the data themselves in a QR code (rather than a URL to the data). Three different datasets from ICIPEMIR were serialized and then encoded in a QR code. We compared four compression algorithms to reduce file size before QR encoding. YAML was the most concise (character-sparing) format and allowed a 2633-character serialized file to be embedded within a QR code. The best compression rate was obtained with gzip, with a compression ratio of 2.32 in 15.7 ms. Data were easily extracted and decompressed from a digital QR code using a simple command line. The YAML file was also successfully recovered from the printed QR code with both Android and iOS smartphones. The minimal detected size was 3 × 3 cm.
- Published
- 2022
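The round trip described above is easy to sketch in Python, assuming the third-party PyYAML and qrcode (with Pillow) packages and a toy report; the project's actual schema and tooling are not specified in this record.

```python
import gzip
import yaml      # PyYAML, assumed installed
import qrcode    # qrcode package with Pillow backend, assumed installed

report = {
    "patient_id": "ANON-0042",          # hypothetical structured report
    "modality": "radiography",
    "findings": ["no acute fracture", "mild degenerative change"],
}

payload = yaml.safe_dump(report).encode("utf-8")   # YAML: most concise format tested
blob = gzip.compress(payload)                      # gzip gave the best ratio (2.32)
qrcode.make(blob).save("report_qr.png")            # embed the data, not a URL

# Decoding side (after scanning the QR code back to bytes):
recovered = yaml.safe_load(gzip.decompress(blob).decode("utf-8"))
assert recovered == report
```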
30. Reply to 'Describing center of pressure movement in stabilometry by ellipse area approximation' from Agnieszka Gołąb concerning the paper 'A Review of Center of Pressure (COP) Variables to Quantify Standing Balance in Elderly People: Algorithms and Open Access Code'
- Author
-
Flavien, Quijoux and Alice, Nicolaï
- Subjects
Access to Information ,Review Literature as Topic ,Movement ,Humans ,Postural Balance ,Algorithms ,Aged - Abstract
Letter to the Editor concerning "Describing center of pressure movement in stabilometry by ellipse area approximation" from Agnieszka Gołąb.
- Published
- 2022
31. ARTIFICIAL INTELLIGENCE IN THE DETECTION OF BID RIGGING PRACTICES: THE LEADING ROLE OF THE ACCO.
- Author
-
Jiménez Cardona, Noemí
- Subjects
GOVERNMENT purchasing ,ARTIFICIAL intelligence ,ANTITRUST law ,SOFTWARE development tools ,CARTELS - Published
- 2022
- Full Text
- View/download PDF
32. Development and Validation of an Algorithm for the Digitization of ECG Paper Images.
- Author
-
Randazzo, Vincenzo, Puleo, Edoardo, Paviglianiti, Annunziata, Vallan, Alberto, and Pasero, Eros
- Subjects
DIGITIZATION ,DIGITAL images ,ELECTROCARDIOGRAPHY ,HEART rate monitors ,PEARSON correlation (Statistics) ,MEASUREMENT errors ,HEART beat ,ALGORITHMS - Abstract
The electrocardiogram (ECG) signal describes the heart's electrical activity, allowing the detection of several health conditions, including cardiac system abnormalities and dysfunctions. Nowadays, most patient medical records are still paper-based, especially those made in past decades. The importance of collecting digitized ECGs is twofold: firstly, all medical applications can be easily implemented with an engineering approach if the ECGs are treated as signals; secondly, paper ECGs can deteriorate over time, so a correct evaluation of the patient's clinical evolution is not always guaranteed. The goal of this paper is the realization of an automatic conversion algorithm from paper-based ECGs (images) to digital ECG signals. The algorithm involves a digitization process tested on an image set of 16 subjects, including subjects with pathologies. The quantitative analysis of the digitization method is carried out by evaluating the repeatability and reproducibility of the algorithm. The digitization accuracy is evaluated both on the entire signal and on six ECG time parameters (R-R peak distance, QRS complex duration, QT interval, PQ interval, P-wave duration, and heart rate). Results demonstrate the algorithm's efficiency: the average Pearson correlation coefficient is 0.94, and the measurement errors of the ECG time parameters are always less than 1 mm. Due to the promising experimental results, the algorithm could be embedded into a graphical interface, becoming a measurement and collection tool for cardiologists. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
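The headline metric above (average Pearson correlation of 0.94 between digitized and reference signals) can be computed in a few lines; the linear resampling to a common length is an assumption of this sketch, not a detail from the paper.

```python
import numpy as np

def digitization_quality(digitized, ground_truth):
    """Pearson correlation between a digitized trace and its digital
    ground truth, after resampling both to a common length."""
    n = min(len(digitized), len(ground_truth))
    grid = np.linspace(0, 1, n)
    a = np.interp(grid, np.linspace(0, 1, len(digitized)), digitized)
    b = np.interp(grid, np.linspace(0, 1, len(ground_truth)), ground_truth)
    return float(np.corrcoef(a, b)[0, 1])

t = np.linspace(0, 1, 1000)
reference = np.sin(2 * np.pi * 5 * t)                 # toy 'digital' ECG
scanned = reference[::2] + 0.01 * np.random.randn(500)
print(digitization_quality(scanned, reference))       # close to 1.0
```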
34. SchlossLab/Sovacool_OptiFit_mSphere_2022: OptiFit paper 1.0.0
- Author
-
Sovacool, Kelly L., Westcott, Sarah L., M. Brodie Mumphrey, Dotson, Gabrielle A., and Schloss, Patrick D.
- Subjects
amplicon sequencing ,microbiome ,bioinformatics ,microbial ecology ,algorithms - Abstract
The paper accompanying the OptiFit algorithm is now out in mSphere! https://journals.asm.org/doi/10.1128/msphere.00916-21 Assigning amplicon sequences to operational taxonomic units (OTUs) is an important step in characterizing microbial communities across large data sets. A notable difference between de novo clustering and database-dependent reference clustering methods is that OTU assignments from de novo methods may change when new sequences are added. However, one may wish to incorporate new samples into previously clustered data sets without clustering all sequences again, such as when comparing across data sets or deploying machine learning models. Existing reference-based methods produce consistent OTUs but only consider the similarity of each query sequence to a single reference sequence in an OTU, resulting in assignments that are worse than those generated by de novo methods. To provide an efficient method to fit sequences to existing OTUs, we developed the OptiFit algorithm. Inspired by the de novo OptiClust algorithm, OptiFit considers the similarity of all pairs of reference and query sequences to produce OTUs of the best possible quality. We tested OptiFit using four data sets with two strategies: (i) clustering to a reference database and (ii) splitting the data set into a reference and query set, clustering the references using OptiClust, and then clustering the queries to the references. The result is an improved implementation of reference-based clustering. OptiFit produces OTUs of a quality similar to that of OptiClust at faster speeds when using the split data set strategy. OptiFit provides a suitable option for users requiring consistent OTU assignments at the same quality as afforded by de novo clustering methods.
- Published
- 2022
- Full Text
- View/download PDF
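OptiFit itself iteratively optimizes the Matthews correlation coefficient over all sequence pairs; the sketch below is only a greedy caricature of "fitting queries to existing reference OTUs", using a user-supplied pairwise distance function and the conventional 0.03 distance threshold.

```python
def fit_to_otus(query_ids, reference_otus, dist, threshold=0.03):
    """Place each query sequence in the OTU where it is within
    `threshold` of the most members; open a new OTU when it is close
    to none (the de novo fallback for unfittable queries)."""
    otus = [set(o) for o in reference_otus]
    for q in query_ids:
        scores = [sum(dist(q, m) <= threshold for m in otu) for otu in otus]
        best = max(range(len(otus)), key=scores.__getitem__, default=None)
        if best is None or scores[best] == 0:
            otus.append({q})
        else:
            otus[best].add(q)
    return otus

# Toy distance: scaled absolute difference of integer ids.
dist = lambda a, b: abs(a - b) / 100
print(fit_to_otus([12, 55, 99], [{10, 11}, {53, 54}], dist))
# -> [{10, 11, 12}, {53, 54, 55}, {99}]
```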
35. Special issue "Discrete optimization: Theory, algorithms and new applications".
- Author
-
Werner, Frank
- Subjects
MATHEMATICAL optimization ,METAHEURISTIC algorithms ,ONLINE algorithms ,LINEAR matrix inequalities ,ALGORITHMS ,ROBUST stability analysis ,NONLINEAR integral equations - Abstract
This document is an editorial for a special issue of the journal AIMS Mathematics on the topic of discrete optimization. The issue includes 21 papers covering a range of subjects, including molecular trees, network systems, variational inequality problems, scheduling, image restoration, spectral clustering, integral equations, convex functions, graph products, optimization algorithms, air quality prediction, humanitarian planning, inertial methods, neural networks, transportation problems, emotion identification, fixed-point problems, structural engineering design, single machine scheduling, and ensemble learning. The papers present new theoretical results, algorithms, and applications in these areas. The guest editor expresses gratitude to the journal staff and reviewers and hopes that readers will find inspiration for their own research. [Extracted from the article]
- Published
- 2024
- Full Text
- View/download PDF
36. Utilizing tables, figures, charts and graphs to enhance the readability of a research paper.
- Author
-
Divecha C. A., Tullu M. S., and Karande S.
- Subjects
GRAPHIC arts ,READABILITY (Literary style) ,SERIAL publications ,RESEARCH methodology ,COPYRIGHT ,MEDICAL research ,ALGORITHMS - Abstract
The authors offer observations on utilizing tables, figures, charts and graphs to help present research in a simple manner while also engaging and sustaining the reader's interest. Topics discussed include the benefits provided by the use of tables/figures/charts/graphs, the general methodology of design and submission, and copyright issues around using material from government publications/the public domain.
- Published
- 2023
- Full Text
- View/download PDF
37. Introduction to the papers and posters of TWG11: Algorithmics
- Author
-
Hodgen, Jeremy (ed.), Bolondi, Giorgio (ed.), Weber, Christof, Medova, Janka, Rafalska, Maryna, Kortenkamp, Ulrich, and Modeste, Simon
- Subjects
teaching and learning of algorithms ,algorithmic thinking ,Algorithms - Abstract
In CERME12, our working group "Algorithmics" started its work as a newly established TWG. Since algorithms have always been at the heart of mathematics and their importance has been steadily increasing since the beginnings of theoretical computer science, the design and analysis of algorithms – called algorithmics (Traub 1964, Knuth 1985) – lies at the intersection of mathematics and computer science. For this reason, on the one hand, various algorithms and algorithmic activities have their traditional place in mathematics curricula at all levels. At the school level, mathematics and computer science have interacted since the 1980s, when many schools set up labs with computers equipped with programming software. On the other hand, many questions arise in the context of teaching and learning algorithms: a first, more applied group of questions aims at algorithms in mathematics education and curricula, and a second, more theoretical group of questions seeks to clarify the concepts of algorithm and algorithmic thinking.
- Published
- 2022
- Full Text
- View/download PDF
38. Semantic segmentation of examination papers based on subspace multi-scale feature fusion.
- Author
-
夏源祥, 刘 渝, 楚程钱, 万永菁, and 蒋翠玲
- Subjects
PYRAMIDS ,ALGORITHMS ,HANDWRITING ,CLASSIFICATION ,SUBSPACES (Mathematics) - Published
- 2023
- Full Text
- View/download PDF
39. Selected Papers of the 31st International Workshop on Combinatorial Algorithms, IWOCA 2020.
- Author
-
Gąsieniec, Leszek, Klasing, Ralf, and Radzik, Tomasz
- Subjects
MATHEMATICAL proofs ,ALGORITHMS ,CHARTS, diagrams, etc. ,POLYNOMIAL time algorithms ,ONLINE algorithms ,GRAPH labelings ,HAMMING distance - Published
- 2022
- Full Text
- View/download PDF
40. Digital marginalization, data marginalization, and algorithmic exclusions: a critical southern decolonial approach to datafication, algorithms, and digital citizenship from the Souths.
- Author
-
Chaka, Chaka
- Subjects
CITIZENSHIP ,DECOLONIZATION ,ELECTRONIC paper ,ALGORITHMS ,COMMUNITIES ,CHIEF information officers - Abstract
This paper explores digital marginalization, data marginalization, and algorithmic exclusions in the Souths. To this effect, it argues that underrepresented users and communities continue to be marginalized and excluded by digital technologies, by big data, and by algorithms employed by organizations, corporations, institutions, and governments in various data jurisdictions. Situating data colonialism within the Souths, the paper contends that data ableism, data disablism, and data colonialism are at play when data collected, collated, captured, configured, and processed from underrepresented users and communities is utilized by mega entities for their own multiple purposes. It also maintains that data coloniality, as opposed to data colonialism, is impervious to legal and legislative interventions within data jurisdictions. Additionally, it discusses digital citizenship (DC) and its related emerging regimes. Moreover, the paper argues that digital exclusion transcends the simplistic haves versus the have nots dualism as it manifests itself in multiple layers and in multiple dimensions. Furthermore, it characterizes how algorithmic exclusions tend to perpetuate historical human biases despite the pervasive view that algorithms are autonomous, neutral, rational, objective, fair, unbiased, and non-human. Finally, the paper advances a critical southern decolonial (CSD) approach to datafication, algorithms, and digital citizenship by means of which data coloniality, algorithmic coloniality, and the coloniality embodied in DC have to be critiqued, challenged, and dismantled. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
41. 23‐1: Invited Paper: Visualization of Color‐Gamut Coverage‐Gamut Ring Intersection.
- Subjects
VISUALIZATION ,COLOR ,ALGORITHMS - Abstract
The D50 CIELAB gamut volume was standardized as a suitable and unified gamut size measurement methodology. The color gamut volume is visualized using proportionate gamut rings in a two‐dimensional diagram. Gamut ring intersection assists the visual comparison of the size and shape of the color gamut coverage between a display and a reference. This paper introduces a desktop application with an intuitive graphical interface for visualizing the gamut rings and describes the algorithm of the gamut ring intersection. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
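The gamut-rings construction behind the abstract above maps a 3-D CIELAB gamut volume onto concentric rings whose enclosed area equals the volume accumulated up to each lightness slice. A minimal Python sketch of that area-to-volume mapping (not the paper's application or its ring-intersection algorithm; the slice volumes and names below are invented for illustration):

import numpy as np

def ring_radii(slice_volumes):
    # Ring n encloses the gamut volume accumulated up to lightness slice n,
    # so its radius satisfies pi * r_n**2 == cumulative volume.
    cumulative = np.cumsum(slice_volumes)
    return np.sqrt(cumulative / np.pi)

# Hypothetical per-lightness-slice volumes for a display and a reference.
display_slices   = np.array([5, 30, 80, 120, 150, 160, 140, 100, 50, 10], float)
reference_slices = np.array([8, 40, 95, 140, 170, 180, 160, 120, 65, 15], float)

r_display   = ring_radii(display_slices)
r_reference = ring_radii(reference_slices)

# Because area is proportional to volume, the ratio of outermost ring areas
# equals the ratio of total gamut volumes.
print(f"volume coverage: {display_slices.sum() / reference_slices.sum():.1%}")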
42. Edge computing-enabled green multisource fusion indoor positioning algorithm based on adaptive particle filter.
- Author
-
Li, Mengyao, Zhu, Rongbo, Ding, Qianao, Wang, Jun, Wan, Shaohua, and Ma, Maode
- Subjects
ADAPTIVE filters ,PROBABILITY density function ,ALGORITHMS ,EDGE computing ,FILTER paper ,PARTICLE swarm optimization - Abstract
Edge computing enables portable devices to provide smart applications, and indoor positioning offers accurate location-based indoor navigation and personalized smart services. To achieve high positioning accuracy, an indoor positioning algorithm based on a particle filter requires a large number of sample particles to approximate the probability density function, which leads to additional computational cost and high fusion delay. Focusing on real-time and accurate positioning, this paper proposes an edge computing-enabled green multi-source fusion indoor positioning algorithm, called APFP, based on an adaptive particle filter. APFP considers both pedestrian dead reckoning (PDR) signals from mobile terminals and the received signal strength indication (RSSI) of Bluetooth, effectively merging the freedom from cumulative error of trilateral positioning with the accurate short-range positioning of PDR. This enables mobile terminals to perform particle filtering adaptively, reducing computing time and power consumption while preserving positioning accuracy. Detailed experimental results show that, compared with the traditional particle filter algorithm and a map-constrained algorithm, the proposed APFP reduces fusion computing cost by 59.89% and 54.37%, respectively. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
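For context on the adaptive particle filter named in the abstract above: each step propagates weighted samples with the PDR displacement, reweights them against the Bluetooth/RSSI position fix, and resamples with a particle count adapted to how concentrated the posterior is. The sketch below is a generic illustration under our own assumptions; the noise levels and the spread-based adaptation heuristic are ours, not the paper's:

import numpy as np

rng = np.random.default_rng(0)

def adaptive_pf_step(particles, weights, pdr_step, rssi_pos,
                     rssi_sigma=2.0, n_min=100, n_max=2000):
    # Predict: shift every particle by the PDR displacement plus drift noise.
    particles = particles + pdr_step + rng.normal(0.0, 0.3, particles.shape)

    # Update: reweight by a Gaussian likelihood around the RSSI position fix.
    d2 = np.sum((particles - rssi_pos) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / rssi_sigma ** 2)
    weights = weights / weights.sum()

    # Adapt (illustrative heuristic, not the paper's criterion): a tightly
    # concentrated posterior needs fewer particles than a spread-out one.
    spread = np.sqrt(np.average(d2, weights=weights))  # weighted RMS distance
    n_new = int(np.clip(n_min * max(spread, 1.0), n_min, n_max))

    # Resample n_new particles in proportion to their weights.
    idx = rng.choice(len(particles), size=n_new, p=weights)
    particles = particles[idx]
    weights = np.full(n_new, 1.0 / n_new)
    return particles, weights, particles.mean(axis=0)  # posterior mean estimate

# One fusion step from 1000 particles spread over a 20 m x 20 m floor.
particles = rng.uniform(0.0, 20.0, (1000, 2))
weights = np.full(1000, 1.0 / 1000)
particles, weights, estimate = adaptive_pf_step(
    particles, weights,
    pdr_step=np.array([0.6, 0.1]),   # one PDR stride, metres
    rssi_pos=np.array([12.0, 5.0]))  # Bluetooth trilateration fix, metres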
43. An automatic paper recommendation algorithm based on multi-view fusion TextRCNN.
- Author
-
杨秀璋, 武帅, 杨琪, 项美玉, 李娜, 周既松, and 赵小明
- Subjects
CONVOLUTIONAL neural networks ,DEEP learning ,MACHINE learning ,AUTOMATIC classification ,ACCURACY of information ,ALGORITHMS - Abstract
Copyright of Journal of Computer Engineering & Applications is the property of Beijing Journal of Computer Engineering & Applications Journal Co Ltd. and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2023
- Full Text
- View/download PDF
44. Machine learning methods for prediction of cancer driver genes: a survey paper
- Author
-
Renan Andrades and Mariana Recamonde-Mendoza
- Subjects
Genomics (q-bio.GN) ,FOS: Computer and information sciences ,Computer Science - Machine Learning ,J.3 ,Computational Biology ,Oncogenes ,Machine Learning (cs.LG) ,Machine Learning ,Neoplasms ,FOS: Biological sciences ,Mutation ,Humans ,Quantitative Biology - Genomics ,Molecular Biology ,Algorithms ,Information Systems - Abstract
Identifying the genes and mutations that drive the emergence of tumors is a critical step to improving our understanding of cancer and identifying new directions for disease diagnosis and treatment. Despite the large volume of genomics data, the precise detection of driver mutations and their carrying genes, known as cancer driver genes, from the millions of possible somatic mutations remains a challenge. Computational methods play an increasingly important role in discovering genomic patterns associated with cancer drivers and developing predictive models to identify these elements. Machine learning (ML), including deep learning, has been the engine behind many of these efforts and provides excellent opportunities for tackling remaining gaps in the field. Thus, this survey aims to perform a comprehensive analysis of ML-based computational approaches to identify cancer driver mutations and genes, providing an integrated, panoramic view of the broad data and algorithmic landscape within this scientific problem. We discuss how the interactions among data types and ML algorithms have been explored in previous solutions and outline current analytical limitations that deserve further attention from the scientific community. We hope that by helping readers become more familiar with significant developments in the field brought by ML, we may inspire new researchers to address open problems and advance our knowledge towards cancer driver discovery.
- Published
- 2021
45. Explainable Rules and Heuristics in AI Algorithm Recommendation Approaches--A Systematic Literature Review and Mapping Study.
- Author
-
García-Peñalvo, Francisco José, Vázquez-Ingelmo, Andrea, and García-Holgado, Alicia
- Subjects
ARTIFICIAL intelligence ,LITERATURE reviews ,SOFTWARE engineering ,ALGORITHMS ,HEURISTIC ,SOFTWARE engineers - Abstract
The exponential use of artificial intelligence (AI) to solve and automate complex tasks has catapulted its popularity, generating challenges that need to be addressed. While AI is a powerful means to discover interesting patterns and obtain predictive models, the use of these algorithms comes with great responsibility: an incomplete or unbalanced set of training data, or an improper interpretation of a model's outcomes, could lead to misleading and ultimately dangerous conclusions. For these reasons, it is important to rely on expert knowledge when applying these methods. However, not every user can count on such expertise; non-AI-expert users could also benefit from applying these powerful algorithms to their domain problems, but they need basic guidelines to get the most out of AI models. The goal of this work is to present a systematic review of the literature analyzing studies whose outcomes are explainable rules and heuristics for selecting suitable AI algorithms given a set of input features. The systematic review follows the methodology proposed by Kitchenham and other authors in the field of software engineering. As a result, 9 papers that tackle AI algorithm recommendation through tangible and traceable rules and heuristics were collected. The small number of retrieved papers suggests that explicit rules and heuristics are rarely reported when testing the suitability and performance of AI algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
46. Research on an academic paper recommendation algorithm with side-information embedding.
- Author
-
沈小烽, 刘柏嵩, 吴俊超, and 钱江波
- Subjects
RANDOM walks ,QUALITY factor ,PROBLEM solving ,ALGORITHMS ,COSINE function ,SEMANTICS - Abstract
Copyright of Journal of Computer Engineering & Applications is the property of Beijing Journal of Computer Engineering & Applications Journal Co Ltd. and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2022
- Full Text
- View/download PDF
47. Using Paper Texture for Choosing a Suitable Algorithm for Scanned Document Image Binarization.
- Author
-
Lins, Rafael Dueire, Bernardino, Rodrigo, Barboza, Ricardo da Silva, and De Oliveira, Raimundo Correa
- Subjects
DOCUMENT imaging systems ,HISTORICAL source material ,TEXTURES ,ALGORITHMS - Abstract
The intrinsic features of documents, such as paper color, texture, aging, translucency, and the kind of printing, typing, or handwriting, are important with regard to how their images should be processed and enhanced. Image binarization is the process of producing a monochromatic image from a color version given as input. It is a key step in the document processing pipeline. The recent Quality-Time Binarization Competitions for documents have shown that no binarization algorithm is good for every kind of document image. This paper uses a sample of the texture of a scanned historical document as the main feature for selecting which of 63 widely used algorithms, applied to five different versions of the input image (315 document image-binarization schemes in total), provides a reasonable quality-time trade-off. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
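The selection idea in the abstract above (measure the paper's texture, then route the scan to a suitable binarization algorithm) can be illustrated with a toy two-way rule. This is a deliberate simplification: the paper ranks 63 algorithms across 315 schemes, whereas the texture descriptor, the threshold of 10 grey levels, and the two candidate algorithms below are illustrative assumptions, not the paper's selection rule. A sketch using scikit-image:

import numpy as np
from skimage.filters import threshold_otsu, threshold_sauvola

def texture_descriptor(patch):
    # Crude texture summary of a blank background patch; a stand-in for
    # the richer texture features used in the paper.
    return float(patch.mean()), float(patch.std())

def binarize_by_texture(image, background_patch):
    # Route the scan to a binarization algorithm based on paper texture.
    mean, std = texture_descriptor(background_patch)
    if std < 10:
        # Smooth, uniform paper: a global Otsu threshold is usually enough.
        return image > threshold_otsu(image)
    # Aged or heavily textured paper: a local (Sauvola) threshold copes
    # better with an uneven background; True pixels are background/paper.
    return image > threshold_sauvola(image, window_size=25)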
48. A paper-defect detection method based on improved YOLOv5.
- Author
-
张开生 and 关凯凯
- Subjects
CLASSIFICATION algorithms ,LIGHT sources ,FEATURE extraction ,ALGORITHMS ,SPEED - Abstract
Copyright of China Pulp & Paper is the property of China Pulp & Paper Magazines Publisher and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2022
- Full Text
- View/download PDF
49. ITERATIVE ALGORITHMS FOR VARIATIONAL INCLUSIONS IN BANACH SPACES.
- Author
-
ANSARI, QAMRUL HASAN, BALOOEE, JAVAD, and PETRUŞEL, ADRIAN
- Subjects
BANACH spaces ,LIPSCHITZ continuity ,PAPER arts ,DIFFERENTIAL inclusions ,ALGORITHMS - Abstract
The present paper is in two parts. In the first part, we prove the Lipschitz continuity of the proximal mapping associated with a general strongly H-monotone mapping and compute an estimate of its Lipschitz constant under some mild assumptions imposed on the mapping H involved in the proximal mapping. We provide two examples to show that a maximal monotone mapping need not be general H-monotone for a single-valued mapping H from a Banach space to its dual space. A class of multi-valued nonlinear variational inclusion problems is considered, and by using the notion of proximal mapping and Nadler's technique, an iterative algorithm with mixed errors is suggested to compute its solutions. Under appropriate hypotheses imposed on the mappings and parameters involved in the multi-valued nonlinear variational inclusion problem, the strong convergence of the sequences generated by the proposed algorithm to a solution of the aforesaid problem is verified. The second part of this paper investigates and analyzes the notion of Cn-monotone mappings defined and studied in [S.Z. Nazemi, A new class of monotone mappings and a new class of variational inclusions in Banach spaces, J. Optim. Theory Appl. 155(3) (2012) 785-795]. Several comments are given on the results and the algorithm that appeared in the above-mentioned paper. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
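For readers unfamiliar with the proximal mapping discussed in the abstract above, a standard formulation from the H-monotone operator literature reads as follows; the paper's exact hypotheses and constants may differ:

% Generic proximal (resolvent) mapping from the H-monotone literature;
% the paper's precise assumptions and constants may differ.
\[
  R^{H}_{M,\lambda}(x) = (H + \lambda M)^{-1}(x), \qquad \lambda > 0,
\]
% and if H is \gamma-strongly monotone, the proximal mapping is Lipschitz:
\[
  \bigl\| R^{H}_{M,\lambda}(x) - R^{H}_{M,\lambda}(y) \bigr\|
  \le \frac{1}{\gamma}\, \| x - y \|.
\]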
50. Research on Denoising Algorithm of Composite Thermal Wave Detection Image Based on Improved Total Variation.
- Author
-
Shen, Xuehui and Huang, Qingjiu
- Subjects
IMAGE denoising ,IMAGE intensifiers ,COMPUTER vision ,ALGORITHMS ,HOUGH transforms ,INFORMATION filtering ,RECOMMENDER systems - Abstract
Improving the resolution and removing the noise of composite infrared thermal wave detection images have become important research topics in machine vision. This paper therefore proposes a composite image enhancement algorithm based on edge-adaptive directional total variation. Using the improved adaptive directional total variation model to construct the guidance image for the guided filter provides the filter with better structural information, and this process is iterated to eliminate noise and the staircase (ladder) effect. Experimental results show that this algorithm is superior to other advanced methods in both objective evaluation indices and subjective vision. The proposed algorithm not only effectively eliminates noise and the staircase effect but also highlights image detail and structure information, demonstrating its effectiveness. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
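The core loop of the abstract above (construct a guidance image with a total variation model, run a guided filter, iterate) can be sketched as follows. This is a simplified stand-in, not the paper's method: it uses isotropic Chambolle TV from scikit-image instead of the edge-adaptive directional TV, plus a textbook box-filter guided filter, with illustrative parameter values:

import numpy as np
from scipy.ndimage import uniform_filter
from skimage.restoration import denoise_tv_chambolle

def guided_filter(guide, src, radius=4, eps=1e-3):
    # Classic guided filter (He et al.): smooth `src` while preserving
    # the edge structure of `guide`.
    box = lambda x: uniform_filter(x, size=2 * radius + 1)
    m_I, m_p = box(guide), box(src)
    cov_Ip = box(guide * src) - m_I * m_p
    var_I = box(guide * guide) - m_I * m_I
    a = cov_Ip / (var_I + eps)
    b = m_p - a * m_I
    return box(a) * guide + box(b)

def tv_guided_denoise(noisy, iters=3, tv_weight=0.1):
    # Iterate: a TV-denoised estimate serves as the guidance image for the
    # next guided-filter pass over the original noisy input, suppressing
    # both residual noise and the staircase effect of plain TV.
    out = noisy.copy()
    for _ in range(iters):
        guidance = denoise_tv_chambolle(out, weight=tv_weight)
        out = guided_filter(guidance, noisy)
    return out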