158,514 results
Search Results
2. A fully-automated paper ECG digitisation algorithm using deep learning.
- Author
Wu, Huiyi, Patel, Kiran Haresh Kumar, Li, Xinyang, Zhang, Bowen, Galazis, Christoforos, Bajaj, Nikesh, Sau, Arunashis, Shi, Xili, Sun, Lin, Tao, Yanda, Al-Qaysi, Harith, Tarusan, Lawrence, Yasmin, Najira, Grewal, Natasha, Kapoor, Gaurika, Waks, Jonathan W., Kramer, Daniel B., Peters, Nicholas S., and Ng, Fu Siong
- Subjects
DEEP learning, ELECTROCARDIOGRAPHY, ELECTRONIC paper, ATRIAL fibrillation, ALGORITHMS, HEART failure, HEART rate monitors
- Abstract
There is increasing focus on applying deep learning methods to electrocardiograms (ECGs), with recent studies showing that neural networks (NNs) can predict future heart failure or atrial fibrillation from the ECG alone. However, large numbers of ECGs are needed to train NNs, and many ECGs exist only in paper format, which is not suitable for NN training. We developed a fully-automated online ECG digitisation tool to convert scanned paper ECGs into digital signals. Using automated horizontal and vertical anchor point detection, the algorithm segments the ECG image into separate images for the 12 leads, and a dynamical morphological algorithm is then applied to extract the signal of interest. We then validated the performance of the algorithm on 515 digital ECGs, of which 45 were printed, scanned and redigitised. The automated digitisation tool achieved 99.0% correlation between the digitised signals and the ground truth ECG (n = 515 standard 3-by-4 ECGs) after excluding ECGs with overlapping lead signals. Without exclusion, average correlation ranged from 90 to 97% across the leads on all 3-by-4 ECGs. There was a 97% correlation for 12-by-1 and 3-by-1 ECG formats after excluding ECGs with overlapping lead signals. Without exclusion, the average correlation of some leads in 12-by-1 ECGs was 60–70%, and the average correlation for 3-by-1 ECGs reached 80–90%. For ECGs that were printed, scanned, and redigitised, our tool achieved 96% correlation with the original signals. We have developed and validated a fully-automated, user-friendly, online ECG digitisation tool. Unlike other available tools, it does not require any manual segmentation of ECG signals. Our tool can facilitate the rapid and automated digitisation of large repositories of paper ECGs so that they can be used for deep learning projects. [ABSTRACT FROM AUTHOR]
- Published
- 2022
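The headline numbers in this abstract are per-lead Pearson correlations between the digitised trace and the ground-truth digital signal. A minimal sketch of that metric (the function name and the synthetic "lead" below are illustrative, not from the paper):

```python
import numpy as np

def lead_correlation(digitised: np.ndarray, ground_truth: np.ndarray) -> float:
    """Pearson correlation between a digitised lead and its ground truth.
    Both signals are assumed to be resampled to a common length first."""
    d = digitised - digitised.mean()
    g = ground_truth - ground_truth.mean()
    return float(d @ g / (np.linalg.norm(d) * np.linalg.norm(g)))

# A lightly corrupted synthetic lead still correlates near 1 with the original.
t = np.linspace(0.0, 1.0, 500)
lead = np.sin(2 * np.pi * 5 * t)
noisy = lead + 0.05 * np.random.default_rng(0).standard_normal(t.size)
print(lead_correlation(noisy, lead))
```

A correlation near 1 on a noisy copy mirrors how the 99% figure should be read: agreement of waveform shape, not sample-exact equality.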
3. Fracture liaison service: a position statement of the Croatian Society for Physical and Rehabilitation Medicine of the Croatian Medical Association.
- Author
Grazio, Simeon, Nikolić, Tatjana, Schnurrer-Luke-Vrbanić, Tea, Poljičanin, Ana, and Grubišić, Frane
- Abstract
Copyright of Lijecnicki Vjesnik is the property of Croatian Medical Association and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
4. A Machine Learning Model to Predict Citation Counts of Scientific Papers in Otology Field.
- Author
Alohali, Yousef A., Fayed, Mahmoud S., Mesallam, Tamer, Abdelsamad, Yassin, Almuhawas, Fida, and Hagr, Abdulrahman
- Subjects
DECISION trees, SERIAL publications, NATURAL language processing, BIBLIOMETRICS, MACHINE learning, REGRESSION analysis, RANDOM forest algorithms, CITATION analysis, DESCRIPTIVE statistics, PREDICTION models, ARTIFICIAL neural networks, MEDICAL research, MEDICAL specialties & specialists, ALGORITHMS
- Abstract
One of the most widely used measures of scientific impact is the number of citations. However, due to its heavy-tailed distribution, the citation count is fundamentally difficult to predict, although prediction accuracy can be improved. This study investigated the factors influencing the citation count of a scientific paper in the otology field. The proposed solution uses machine learning and natural language processing to process English text and output a predicted citation count. Several algorithms are implemented, including linear regression, boosted decision trees, decision forests, and neural networks. Neural network regression revealed that a paper's abstract has the greatest influence on the citation count of otological articles. The solution was developed visually, using Microsoft Azure Machine Learning at the back end and Programming Without Coding Technology at the front end. We recommend using machine learning models to improve the abstracts of research articles in order to attract more citations. [ABSTRACT FROM AUTHOR]
- Published
- 2022
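The study frames citation counts as a regression target on features derived from a paper's text. As a hedged sketch of that framing only, here is ordinary least squares on toy features; the feature set, the numbers, and the model choice are illustrative (the paper compares linear regression, boosted decision trees, decision forests, and neural networks):

```python
import numpy as np

# Toy per-paper features: [abstract length (words), title length (words),
# years since publication].  All values are illustrative, not the study's data.
X = np.array([[180, 9, 5], [250, 12, 3], [120, 7, 8], [300, 15, 2], [210, 10, 6]], float)
y = np.array([14.0, 22.0, 9.0, 28.0, 18.0])  # toy citation counts

A = np.hstack([X, np.ones((len(X), 1))])      # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # ordinary least squares fit

def predict_citations(features) -> float:
    """Predict a citation count from the toy linear model."""
    return float(np.append(np.asarray(features, float), 1.0) @ coef)

print(predict_citations([200, 10, 4]))
```

The heavy-tailed distribution the abstract mentions is exactly why such linear fits underperform on real citation data; the study's stronger models address that.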
5. Efficient and Effective Academic Expert Finding on Heterogeneous Graphs through (k, P)-Core based Embedding.
- Author
Wang, Yuxiang, Liu, Jun, Xu, Xiaoliang, Ke, Xiangyu, Wu, Tianxing, and Gou, Xiaoxuan
- Subjects
COMMUNITIES, SEMANTICS, ALGORITHMS
- Abstract
Expert finding is crucial for a wealth of applications in both academia and industry. Given a user query and a trove of academic papers, expert finding aims at retrieving the most relevant experts for the query from the academic papers. Existing studies focus on embedding-based solutions that consider academic papers' textual semantic similarity to a query via document representation and extract the top-n experts from the most similar papers. Beyond implicit textual semantics, however, papers' explicit relationships (e.g., co-authorship) in a heterogeneous graph (e.g., DBLP) are critical for expert finding, because they help improve the representation quality. Despite their importance, the explicit relationships of papers have generally been ignored in the literature. In this article, we study expert finding on heterogeneous graphs by considering both the explicit relationships and the implicit textual semantics of papers in one model. Specifically, we define the cohesive (k, P)-core community of papers w.r.t. a meta-path P (i.e., a relationship) and propose a (k, P)-core based document embedding model to enhance the representation quality. Based on this, we design a proximity graph-based index (PG-Index) of papers and present a threshold algorithm (TA)-based method to efficiently extract the top-n experts from the papers returned by PG-Index. We further optimize our approach in two ways: (1) we boost effectiveness by considering the (k, P)-core community of experts and the diversity of experts' research interests, to achieve high-quality expert representation from paper representation; and (2) we streamline expert finding, going from "extract top-n experts from top-m (m > n) semantically similar papers" to "directly return top-n experts", avoiding the return of a large number of top-m papers as intermediate data and thereby improving efficiency. Extensive experiments using real-world datasets demonstrate our approach's superiority. [ABSTRACT FROM AUTHOR]
- Published
- 2023
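The (k, P)-core extends the classic k-core to meta-path-induced graphs. A sketch of the underlying peeling idea on a plain graph (the (k, P)-core itself additionally restricts neighbourhoods to those reachable via the meta-path P, so this is only the shared core of the technique):

```python
from collections import defaultdict

def k_core(edge_list, k):
    """Return the node set of the k-core: the maximal subgraph in which
    every node keeps at least k neighbours.  Nodes with fewer than k
    neighbours are peeled off repeatedly until the subgraph stabilises."""
    adj = defaultdict(set)
    for u, v in edge_list:
        adj[u].add(v)
        adj[v].add(u)
    changed = True
    while changed:
        changed = False
        for node in list(adj):
            if len(adj[node]) < k:
                for nb in adj.pop(node):   # remove node and detach it everywhere
                    adj[nb].discard(node)
                changed = True
    return set(adj)

# Triangle a-b-c is a 2-core; the pendant node d is peeled away.
edges = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")]
print(sorted(k_core(edges, 2)))  # ['a', 'b', 'c']
```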
6. Cost Optimal Production-Scheduling Model Based on VNS-NSGA-II Hybrid Algorithm—Study on Tissue Paper Mill.
- Author
Zhang, Huanhuan, Li, Jigeng, Hong, Mengna, Man, Yi, and He, Zhenglei
- Subjects
PAPER mills, FLOW shop scheduling, PRODUCTION scheduling, INDUSTRIAL costs, ALGORITHMS
- Abstract
With the development of the customization concept, small-batch, multi-variety production is becoming one of the major production modes, especially for fast-moving consumer goods. However, this production mode has two issues: high production costs and long manufacturing periods. To address these issues, this study proposes a multi-objective optimization model for the flexible flow-shop (FFS) that optimizes production scheduling, maximizing production efficiency by minimizing production cost and makespan. The model is based on a hybrid algorithm that combines the fast non-dominated sorting genetic algorithm (NSGA-II) with a variable neighborhood search algorithm (VNS). NSGA-II is the main algorithm used to compute the optimal solutions; VNS improves the quality of the solutions obtained by NSGA-II. The model is verified on a typical real-world FFS, a tissue papermaking mill. The results show that the scheduling model can reduce production costs by 4.2% and makespan by 6.8% compared with manual scheduling. The hybrid VNS-NSGA-II model also outperforms plain NSGA-II in both production cost and makespan. Hybrid algorithms are a good solution for multi-objective optimization problems in flexible flow-shop production scheduling. [ABSTRACT FROM AUTHOR]
- Published
- 2022
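NSGA-II ranks candidate schedules by Pareto dominance over the two objectives (production cost, makespan). A minimal sketch of that dominance test, with toy schedule values:

```python
def pareto_front(solutions):
    """Return the non-dominated set for two minimisation objectives.
    A solution is dominated if some other solution is at least as good
    in both objectives and different (hence strictly better in one)."""
    front = []
    for s in solutions:
        dominated = any(o[0] <= s[0] and o[1] <= s[1] and o != s for o in solutions)
        if not dominated:
            front.append(s)
    return front

# Toy (cost, makespan) pairs for candidate schedules.
schedules = [(100, 50), (90, 60), (110, 40), (120, 70), (95, 55)]
print(pareto_front(schedules))
```

Only (120, 70) is dominated here; NSGA-II builds successive fronts this way, and the VNS step then perturbs front members to search their neighbourhoods.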
7. Tools and algorithms for the construction and analysis of systems: a special issue on tool papers for TACAS 2021.
- Author
Jensen, Peter Gjøl and Neele, Thomas
- Subjects
ALGORITHMS, SOFTWARE verification, INTEGRATED circuit verification, SYSTEMS software, CONFERENCES & conventions
- Abstract
This special issue contains six revised and extended versions of tool papers that appeared in the proceedings of TACAS 2021, the 27th International Conference on Tools and Algorithms for the Construction and Analysis of Systems. The issue is dedicated to the realization of algorithms in tools and the studies of the application of these tools for analysing hard- and software systems. [ABSTRACT FROM AUTHOR]
- Published
- 2023
8. Research on the Fusion of Hybrid Fuzzy Clustering Algorithm and Computer Automatic Test Paper Composition Algorithm.
- Author
Kan, Baopeng
- Subjects
COMPUTERS, COMPUTER algorithms, FUZZY algorithms, COMPUTER workstation clusters, ALGORITHMS, HIGHER education exams
- Abstract
To improve intelligent automatic test paper composition, this paper studies a computer-based automatic test paper composition algorithm built on a hybrid fuzzy clustering algorithm. A test paper composition system is constructed around this clustering method, which serves as the system's basic algorithm and is improved according to the practical needs of intelligent paper composition. The system takes the relevant constraint parameters as input and, combined with the original parameters, selects the most suitable questions from the database and combines them into test papers. Finally, the system structure is designed around the requirements of intelligent test paper composition. Experiments show that the proposed system composes test papers effectively and can promote the adoption of intelligent examination modes in colleges and universities. [ABSTRACT FROM AUTHOR]
- Published
- 2022
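The abstract describes selecting questions from a database under constraint parameters. A greedy stand-in for that selection step (the real system first groups questions with hybrid fuzzy clustering; the difficulty-matching rule below is an illustrative assumption, not the paper's algorithm):

```python
def compose_paper(questions, n, target_difficulty):
    """Pick the n questions whose difficulty is closest to the target.
    Sorting is stable, so ties keep database order."""
    ranked = sorted(questions, key=lambda q: abs(q["difficulty"] - target_difficulty))
    return ranked[:n]

# Toy question bank: id and a difficulty in [0, 1].
bank = [{"id": i, "difficulty": d} for i, d in enumerate([0.2, 0.45, 0.5, 0.8, 0.55])]
print([q["id"] for q in compose_paper(bank, 3, 0.5)])  # [2, 1, 4]
```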
9. A mathematical foundation for foundation paper pieceable quilts.
- Author
Leake, Mackenzie, Bernstein, Gilbert, Davis, Abe, and Agrawala, Maneesh
- Subjects
QUILTS, QUILTING, PATCHWORK quilts, SEWING patterns, ALGORITHMS
- Abstract
Foundation paper piecing is a popular technique for constructing fabric patchwork quilts using printed paper patterns. But, the construction process imposes constraints on the geometry of the pattern and the order in which the fabric pieces are attached to the quilt. Manually designing foundation paper pieceable patterns that meet all of these constraints is challenging. In this work we mathematically formalize the foundation paper piecing process and use this formalization to develop an algorithm that can automatically check if an input pattern geometry is foundation paper pieceable. Our key insight is that we can represent the geometric pattern design using a certain type of dual hypergraph where nodes represent faces and hyperedges represent seams connecting two or more nodes. We show that determining whether the pattern is paper pieceable is equivalent to checking whether this hypergraph is acyclic, and if it is acyclic, we can apply a leaf-plucking algorithm to the hypergraph to generate viable sewing orders for the pattern geometry. We implement this algorithm in a design tool that allows quilt designers to focus on producing the geometric design of their pattern and let the tool handle the tedious task of determining whether the pattern is foundation paper pieceable. [ABSTRACT FROM AUTHOR]
- Published
- 2021
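Hypergraph acyclicity is the paper's pieceability criterion. One standard way to test hypergraph acyclicity is GYO-style reduction, sketched below; the paper defines its own condition on the seam hypergraph and derives sewing orders by leaf-plucking, so treat this as an analogy rather than the paper's algorithm:

```python
def is_acyclic(hyperedges):
    """GYO-style reduction: repeatedly (1) drop vertices that occur in only
    one hyperedge, (2) drop hyperedges contained in another (or emptied).
    The hypergraph is acyclic iff everything reduces away."""
    edges = [set(e) for e in hyperedges]
    changed = True
    while changed:
        changed = False
        verts = set().union(*edges) if edges else set()
        for v in verts:
            containing = [e for e in edges if v in e]
            if len(containing) == 1:          # v is a "leaf" vertex
                containing[0].discard(v)
                changed = True
        for i, e in enumerate(edges):
            if not e or any(i != j and e <= f for j, f in enumerate(edges)):
                edges.pop(i)                  # e is redundant: pluck it
                changed = True
                break
    return not edges

# A "path" of seams reduces away; a triangle of pairwise seams does not.
print(is_acyclic([{1, 2}, {2, 3}, {3, 4}]), is_acyclic([{1, 2}, {2, 3}, {1, 3}]))
```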
10. Risk-adapted early detection of prostate cancer 2.0: position paper of the German Society of Urology 2024.
- Author
Michel, Maurice Stephan, Gschwend, Jürgen E., Wullich, Bernd, Krege, Susanne, Bolenz, Christian, Merseburger, Axel S., Krabbe, Laura-Maria, Schultz-Lampel, Daniela, König, Frank, Haferkamp, Axel, and Hadaschik, Boris
- Subjects
MORTALITY prevention, RISK assessment, BIOPSY, PROSTATE-specific antigen, EARLY detection of cancer, PROSTATE tumors, MAGNETIC resonance imaging, ALGORITHMS
- Abstract
Copyright of Die Urologie is the property of Springer Nature and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
11. Ethical aspects of extracorporeal circulatory support systems (ECLS): consensus paper of the DGK, DGTHG, and DGAI.
- Author
Dutzmann, Jochen, Grahn, Hanno, Boeken, Udo, Jung, Christian, Michalsen, Andrej, Duttge, Gunnar, Muellenbach, Ralf, Schulze, P. Christian, Eckardt, Lars, Trummer, Georg, and Michels, Guido
- Subjects
EXTRACORPOREAL membrane oxygenation, DECISION making, RESUSCITATION, LIFE support systems in critical care, INFORMED consent (Medical law), CARDIAC arrest, CARDIAC pacemakers, ALGORITHMS
- Abstract
Copyright of Die Anaesthesiologie is the property of Springer Nature and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
12. Hybrid Methods of Bibliographic Coupling and Text Similarity Measurement for Biomedical Paper Recommendation.
- Author
Hongmei Guo, Zhesi Shen, Jianxun Zeng, and Na Hong
- Subjects
BIBLIOGRAPHY, CONFERENCES & conventions, CITATION analysis, BIBLIOGRAPHICAL citations, RESEARCH funding, CONTENT analysis, ALGORITHMS
- Abstract
The amount of available scientific literature is increasing, and studies have proposed various methods for evaluating document-document similarity in order to cluster or classify documents for science mapping and knowledge discovery. In this paper, we propose hybrid methods that linearly combine bibliographic coupling (BC) with text or content similarity: we combined BC with BM25, Cosine, and PMRA and compared their performance with that of the single methods in paper recommendation tasks using the TREC Genomics Track 2005 datasets. For paper recommendation, BC and text-based methods complement each other, and the hybrid methods outperformed the single methods. The combinations of BC with BM25 and BC with Cosine performed better than BC with PMRA. Performance was best when the weights of BM25, Cosine, and PMRA in the hybrid methods were 0.025, 0.2, and 0.2, respectively. The choice of method should depend on the actual data and research needs. In future work, the underlying reasons for the differences in performance, and the specific part or type of information each method contributes in text clustering or recommendation, need to be examined. [ABSTRACT FROM AUTHOR]
- Published
- 2021
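The hybrid scores combine bibliographic coupling with a text-similarity measure under a weight. A sketch under the assumption of a simple linear combination (the exact combination form and any normalisation are the paper's; the default w = 0.2 mirrors the reported best-performing Cosine weight):

```python
import math

def bibliographic_coupling(refs_a, refs_b):
    """BC strength: number of references the two papers share."""
    return len(set(refs_a) & set(refs_b))

def cosine(text_a, text_b):
    """Cosine similarity over simple term-frequency vectors."""
    ta, tb = text_a.lower().split(), text_b.lower().split()
    vocab = list(set(ta) | set(tb))
    va = [ta.count(w) for w in vocab]
    vb = [tb.count(w) for w in vocab]
    dot = sum(x * y for x, y in zip(va, vb))
    na = math.sqrt(sum(x * x for x in va))
    nb = math.sqrt(sum(x * x for x in vb))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_score(refs_a, refs_b, text_a, text_b, w=0.2):
    """Assumed linear blend of citation-based and text-based evidence."""
    return bibliographic_coupling(refs_a, refs_b) + w * cosine(text_a, text_b)
```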
13. Automated analysis of pen-on-paper spirals for tremor detection, quantification, and differentiation.
- Author
Rajan, Roopa, Anandapadmanabhan, Reghu, Nageswaran, Sharmila, Radhakrishnan, Vineeth, Saini, Arti, Krishnan, Syam, Gupta, Anu, Vishnu, Venugopalan Y., Pandit, Awadh K., Singh, Rajesh Kumar, Radhakrishnan, Divya M, Singh, Mamta Bhushan, Bhatia, Rohit, Srivastava, Achal, Kishore, Asha, and Padma Srivastava, M. V.
- Subjects
STATISTICS, RESEARCH, CONFIDENCE intervals, ANALYSIS of variance, TASK performance, HANDWRITING, ACCELEROMETERS, DYSTONIA, MOVEMENT disorders, TREMOR, DRAWING, DESCRIPTIVE statistics, PARKINSON'S disease, SENSITIVITY & specificity (Statistics), DATA analysis, RECEIVER operating characteristic curves, DATA analysis software, ALGORITHMS
- Abstract
OBJECTIVE: To develop an automated algorithm to detect, quantify, and differentiate between tremor types using pen-on-paper spirals. METHODS: Patients with essential tremor (n = 25), dystonic tremor (n = 25), Parkinson's disease (n = 25), and healthy volunteers (HV, n = 25) drew free-hand spirals. The algorithm derived the mean deviation (MD) and tremor variability from scanned images. MD and tremor variability were compared with 1) the Bain and Findley scale, 2) the Fahn–Tolosa–Marin tremor rating scale (FTM–TRS), and 3) the peak power and total power of the accelerometer spectra. Inter- and intra-loop widths were computed to differentiate between the tremor types. RESULTS: MD was higher in the tremor group (48.9±26.3) than in HV (26.4±5.3; p < 0.001). A cut-off value of 30.3 had 80.9% sensitivity and 76.0% specificity for the detection of tremor [area under the curve: 0.83; 95% confidence interval (CI): 0.75, 0.91; p < 0.001]. MD correlated with the Bain and Findley ratings (rho = 0.491, p < 0.001), FTM–TRS part B (rho = 0.260, p = 0.032), and accelerometric measures of postural tremor (total power, rho = 0.366, p < 0.001; peak power, rho = 0.402, p < 0.001). The minimum detectable change was 19.9%. Inter-loop width distinguished Parkinson's disease spirals from dystonic tremor (p < 0.001, 95% CI: 54.6, 211.1), essential tremor (p = 0.003, 95% CI: 28.5, 184.9), and HV (p = 0.036, 95% CI: -160.4, -3.9). CONCLUSION: The automated analysis of pen-on-paper spirals generated robust variables to quantify tremor and putative variables to distinguish tremor types from each other. SIGNIFICANCE: This technique may be useful for epidemiological surveys and follow-up studies on tremor. [ABSTRACT FROM AUTHOR]
- Published
- 2023
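MD measures how far the drawn spiral deviates from an ideal one. A sketch of that idea in polar coordinates, fitting an Archimedean spiral by least squares (the paper's MD is computed from traced pixels of the scanned drawing, so this shows only the geometric core):

```python
import numpy as np

def mean_deviation(theta, r):
    """Fit an ideal Archimedean spiral r = a + b*theta by least squares and
    return the mean absolute deviation of the drawn radius from it."""
    A = np.column_stack([np.ones_like(theta), theta])
    (a, b), *_ = np.linalg.lstsq(A, r, rcond=None)
    return float(np.mean(np.abs(r - (a + b * theta))))

# A perfect spiral deviates ~0; a "tremulous" one with an oscillatory
# component deviates measurably.  All values are synthetic.
theta = np.linspace(0.0, 6 * np.pi, 400)
ideal = 1.0 + 0.5 * theta
rng = np.random.default_rng(1)
tremulous = ideal + 0.3 * np.sin(40 * theta) + 0.05 * rng.standard_normal(400)
print(mean_deviation(theta, ideal), mean_deviation(theta, tremulous))
```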
14. Digitalized Control Algorithm of Bridgeless Totem-Pole PFC with a Simple Control Structure Based on the Phase Angle.
- Author
Lee, Gi-Young, Park, Hae-Chan, Ji, Min-Woo, and Kim, Rae-Young
- Subjects
ELECTRIC current rectifiers, ELECTRONIC paper, PHASE-locked loops, ALGORITHMS, ANGLES, VOLTAGE
- Abstract
Compared to the conventional boost power factor correction (PFC) converter, a totem-pole bridgeless PFC has high efficiency because it does not have an input diode rectifier stage, but a current spike may occur when the polarity of the grid voltage changes. This paper proposes a digital control algorithm for bridgeless totem-pole PFC with a simple control structure based on the phase angle of grid voltage. The proposed algorithm has a PI-based double-loop control structure and performs DC-link voltage and input inductor current control. Rectifying switches operate based on the proposed rectification algorithm using phase angle information calculated through a single-phase phase-locked loop (PLL) to prevent current spikes. The feed-forward duty ratio value is calculated according to the polarity of the grid voltage and added to the double-loop controller to perform appropriate power factor control. The performance and feasibility of the proposed control algorithm are verified through a 3 kW hardware prototype. [ABSTRACT FROM AUTHOR]
- Published
- 2023
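The control structure is a PI-based double loop with a feed-forward duty term. A minimal sketch of the outer voltage loop; the gains and the boost-type feed-forward form d = 1 - |v_grid|/v_dc are illustrative assumptions, not values from the paper:

```python
class PI:
    """Discrete-time PI controller, the building block of the double-loop
    structure (outer DC-link voltage loop, inner inductor current loop)."""
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, error):
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

voltage_loop = PI(kp=0.5, ki=20.0, dt=1e-4)  # gains chosen for illustration only

def duty_cycle(v_dc_ref, v_dc, v_grid):
    """Feed-forward duty from the grid/DC-link ratio, plus the PI correction."""
    feed_forward = 1.0 - abs(v_grid) / v_dc
    return feed_forward + voltage_loop.update(v_dc_ref - v_dc)

# At steady state (v_dc on reference), only the feed-forward term remains.
print(duty_cycle(400.0, 400.0, 311.0))
```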
15. Classification of forensic hyperspectral paper data using hybrid spectral similarity algorithms.
- Author
Devassy, Binu Melit, George, Sony, Nussbaum, Peter, and Thomas, Tessamma
- Subjects
SPECTRAL imaging, FORGERY, ALGORITHMS, FORENSIC sciences, CLASSIFICATION, CONFIDENCE intervals, CLASSIFICATION algorithms
- Abstract
Document forgeries that involve modification of the materials used, such as ink and paper, provide evidence of malpractice. Forensic specialists use different techniques to identify and classify such samples; the preferred methods are nondestructive, to avoid damaging the original specimen under investigation. Hyperspectral imaging has been explored in several application domains and is a powerful method in forensic investigations for extracting information about the materials under examination. To classify this material information precisely and exploit the potential of hyperspectral imaging, we tested several hybrid spectral similarity measures on the classification of commonly used paper samples, and a quantitative comparison is presented in this article. We compared the classification capabilities of the hybrid algorithms on hyperspectral data of 40 different paper samples, using the overall accuracy (OA), kappa (K̂), the Z-score of kappa (Z(K̂)), and the 95% confidence interval of kappa (CI(K̂)) for comparison. SID-SAM and SID-SCA produced overall accuracies of 88% and 87%, respectively, the highest among the hybrid spectral similarity measures tested. [ABSTRACT FROM AUTHOR]
- Published
- 2022
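SID-SAM, the best-performing measure in the abstract, is a standard hybrid: spectral information divergence scaled by the tangent of the spectral angle. A sketch for strictly positive reflectance spectra:

```python
import numpy as np

def sam(x, y):
    """Spectral angle (radians) between two reflectance spectra."""
    c = x @ y / (np.linalg.norm(x) * np.linalg.norm(y))
    return float(np.arccos(np.clip(c, -1.0, 1.0)))

def sid(x, y):
    """Spectral information divergence: symmetric KL divergence between
    the spectra normalised to probability distributions (entries > 0)."""
    p = x / x.sum()
    q = y / y.sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

def sid_sam(x, y):
    """SID weighted by tan(SAM): sensitive to both spectral shape (angle)
    and probabilistic divergence, which is why the hybrid discriminates
    similar paper spectra better than either measure alone."""
    return sid(x, y) * np.tan(sam(x, y))
```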
16. Optimization of Texture Rendering of 3D Building Model Based on Vertex Importance.
- Author
Shen, Wenfei, Huo, Liang, Shen, Tao, Zhang, Miao, and Li, Yucai
- Subjects
TEXTURE mapping, DATA modeling, CURVATURE, ALGORITHMS
- Abstract
In 3D building models, the large number of texture maps of different sizes increases data loading and the number of draw batches, which greatly reduces rendering efficiency. This paper therefore proposes a texture-set mapping method based on vertex importance. First, the texture maps are merged using a 2D bin-packing (boxing) algorithm and a series of Mipmap texture levels is generated. Then the curvature, texture variability, and location information of each vertex are calculated, normalized, and weighted to obtain the importance of each vertex. Finally, textures at different Mipmap levels are remapped according to vertex importance. Experiments show that the algorithm reduces the amount of texture data while avoiding the rendering pressure that a still-large merged texture would impose, thereby improving the rendering efficiency of the model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
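The abstract's vertex importance is a normalised, weighted sum of curvature, texture variability, and location. A sketch with illustrative weights (the paper derives its own weighting; the min-max normalisation is an assumption consistent with "normalized, and weighted"):

```python
import numpy as np

def vertex_importance(curvature, texture_var, position, weights=(0.4, 0.3, 0.3)):
    """Min-max normalise each per-vertex factor to [0, 1], then blend them
    with the given weights into a single importance score per vertex."""
    def norm(v):
        v = np.asarray(v, float)
        span = v.max() - v.min()
        return (v - v.min()) / span if span else np.zeros_like(v)
    w1, w2, w3 = weights
    return w1 * norm(curvature) + w2 * norm(texture_var) + w3 * norm(position)

# Higher-importance vertices would then be remapped to finer Mipmap levels.
print(vertex_importance([0.1, 0.5, 0.9], [2.0, 4.0, 6.0], [10.0, 20.0, 30.0]))
```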
17. Discussion paper: implications for the further development of the successfully in emergency medicine implemented AUD2IT-algorithm.
- Author
Przestrzelski, Christopher, Jakob, Antonina, Jakob, Clemens, and Hoffmann, Felix R.
- Subjects
DOCUMENTATION, CURRICULUM, HUMAN services programs, EMERGENCY medicine, EXPERIENCE, MEDICAL records, ELECTRONIC publications, ALGORITHMS, PATIENTS' attitudes
- Abstract
The AUD2IT-algorithm is a tool for structuring the data collected during emergency treatment. The goal is, on the one hand, to structure the documentation of that data and, on the other, to give the report a standardised data structure for the handover of an emergency patient. The AUD2IT-algorithm was developed to give residents a documentation aid that helps them structure medical reports without getting lost in unimportant details or forgetting important information. The sequence of anamnesis, clinical examination, differential diagnosis, technical diagnostics, interpretation, and therapy is an academic classification rather than a description of the real workflow; in a real setting, most of these steps take place simultaneously. The application of the AUD2IT-algorithm should therefore also follow the real processes. A big advantage of the AUD2IT-algorithm is that it can serve as a structure for the entire treatment process and can also be used as a handover protocol within this process, ensuring that the current state of knowledge is available at each team-timeout. The PR-E-(AUD2IT)-algorithm makes it possible to document a treatment process that, in principle, need not be limited to emergency medicine. It could also be used and developed further in outpatient treatment; one example is the preparation and allocation of needed resources at the general practitioner's. The algorithm is a standardised tool that can be used by healthcare professionals at any level of training and gives users a sense of security in their daily work. [ABSTRACT FROM AUTHOR]
- Published
- 2024
18. Socio‐technical issues in the platform‐mediated gig economy: A systematic literature review: An Annual Review of Information Science and Technology (ARIST) paper.
- Author
Dedema, Meredith and Rosenbaum, Howard
- Subjects
INFORMATION science, TECHNOLOGY, CORPORATE culture, ALGORITHMS, ECONOMICS
- Abstract
The gig economy and gig work have grown quickly in recent years and have drawn much attention from researchers in different fields. Because the platform mediated gig economy is a relatively new phenomenon, studies have produced a range of interesting findings; of interest here are the socio‐technical issues that this work has surfaced. This systematic literature review (SLR) provides a snapshot of a range of socio‐technical issues raised in the last 12 years of literature focused on the platform mediated gig economy. Based on a sample of 515 papers gathered from nine databases in multiple disciplines, 132 were coded that specifically studied the gig economy, gig work, and gig workers. Three main socio‐technical themes were identified: (1) the digital workplace, which includes information infrastructure and digital labor that are related to the nature of gig work and the user agency; (2) algorithmic management, which includes platform governance, performance management, information asymmetry, power asymmetry, and system manipulation, relying on a diverse set of technological tools including algorithms and big data analytics; (3) ethical design, as a relevant value set that gig workers expect from the platform, which includes trust, fairness, equality, privacy, and transparency. A social informatics perspective is used to rethink the relationship between gig workers and platforms, extract the socio‐technical issues noted in prior research, and discuss the underexplored aspects of the platform mediated gig economy. The results draw attention to understudied yet critically important socio‐technical issues in the gig economy that suggest short‐ and long‐term opportunities for future research directions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
19. Evolution of federated learning based on multi-objective optimization.
- Author
胡智勇, 于千城, 王之赐, and 张丽丝
- Subjects
FEDERATED learning, ALGORITHMS, PRIVACY
- Abstract
Copyright of Application Research of Computers / Jisuanji Yingyong Yanjiu is the property of Application Research of Computers Edition and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
20. A review paper of optimal resource allocation algorithm in cloud environment.
- Author
Patadiya, Namrata and Bhatt, Nirav
- Subjects
RESOURCE allocation, LITERATURE reviews, SERVICE level agreements, ALGORITHMS, ELECTRONIC data processing, CLOUD computing
- Abstract
Cloud computing has become a popular approach for processing data and running computationally expensive services on a pay-as-you-go basis. Due to the ever-increasing demand for cloud-based applications, allocating resources appropriately according to user requests, while meeting the service-level agreements between customers and service providers, has become increasingly complex. An efficient and versatile resource allocation method is required to deploy these assets properly and meet user needs, and designing optimal solutions for this problem is a key area of research. This paper presents a literature review of proposed dynamic resource allocation approaches. [ABSTRACT FROM AUTHOR]
- Published
- 2023
21. The influence of memory, sample size effects, and filter paper material on online laser-based plant and soil water isotope measurements.
- Author
Cui, Jiangpeng, Tian, Lide, Gerlein‐Safdi, Cynthia, and Qu, Dongmei
- Subjects
ISOTOPES, INFRARED spectroscopy, CAVITY-ringdown spectroscopy, ALGORITHMS, FILTER paper
- Abstract
Rationale: The recent development of isotope ratio infrared spectroscopy (IRIS) was quickly followed by the addition of online extraction and analysis systems, making it faster and easier to measure soil and plant water isotopes. However, memory and sample size effects limit the efficiency and accuracy of these new setups. In response, this study presents a scheme dedicated to estimating and eliminating these two effects. Methods: Memory effect was determined by injecting two standard waters alternately. Each standard was injected nine times in a row and analyzed using induction module cavity ring-down spectroscopy (IM-CRDS). Memory coefficients were calculated using a new 'multistage jump' algorithm. Sample size effects were evaluated by injecting water volumes ranging from 1 μL to 6 μL. Finally, the influence of cellulose filter paper on the isotopic measurements, the memory, and the sample size effect was evaluated by comparing it with glass filter paper. Results: Memory effects were detected for both δ18O and δ2H values, with the latter being stronger. Isotopic differences between replicates of the same plant or soil sample showed a clear decrease after memory correction. A small water volume effect was found only when the injected water volume was larger than 3 μL. However, while the correction method performed well for laboratory-made samples, it did not for field samples, due to the heterogeneity of the isotopic composition of the samples. Stronger memory and water volume effects were found for cellulose filter paper. Conclusions: The memory coefficients and the water volume-isotope relationship improved the consistency and accuracy of both laboratory and field data. Our results indicate that cellulose filter paper may not be a suitable medium for measuring standard waters and evaluating memory and water volume effects. Finally, a detailed correction and calibration protocol is suggested, along with notes on best practices to obtain good-quality IM-CRDS data. Copyright © 2017 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
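The abstract above reports memory coefficients but does not give the 'multistage jump' algorithm itself. As a minimal sketch only, the common one-step carry-over model of analyzer memory can be inverted as follows; the coefficient `alpha`, the function name, and all isotope values are illustrative assumptions, not from the paper.

```python
def correct_memory(measured, alpha):
    """Invert a one-step memory (carry-over) model:
    measured[i] = (1 - alpha) * true[i] + alpha * true[i - 1].
    The first injection is taken at face value."""
    true = [measured[0]]
    for m in measured[1:]:
        true.append((m - alpha * true[-1]) / (1.0 - alpha))
    return true

# Simulate two standard waters injected alternately with 10% carry-over.
true_values = [-8.0, -8.0, -12.0, -12.0, -8.0]
alpha = 0.10
measured = [true_values[0]]
for prev, cur in zip(true_values, true_values[1:]):
    measured.append((1 - alpha) * cur + alpha * prev)

recovered = correct_memory(measured, alpha)  # matches true_values
```

Under this model the correction is exact; in practice `alpha` must first be estimated from the alternating standard injections, which is what the paper's scheme addresses.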
22. Performance Evaluation of the Extractive Methods in Automatic Text Summarization Using Medical Papers.
- Author
-
Kus, Anil and Aci, Cigdem Inan
- Subjects
PERFORMANCE evaluation ,TEXT summarization ,MEDICAL sciences ,ALGORITHMS ,SEMANTICS - Abstract
Copyright of Gazi Journal of Engineering Sciences (GJES) / Gazi Mühendislik Bilimleri Dergisi is the property of Gazi Journal of Engineering Sciences and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2023
- Full Text
- View/download PDF
23. 39‐3: Invited Paper: Kirameki Display: Technical Approaches to Represent Real Texture with Light Fields.
- Author
-
Sumi, Naoki, Edo, Keiko, Shibazaki, Minoru, and Hagino, Shuji
- Subjects
MATERIALS texture ,DESIGN software ,SOFTWARE architecture ,ALGORITHMS - Abstract
We have developed a "Kirameki display" that can represent a real texture of materials. ("Kirameki" means a shining/glittering sense in Japanese.) In addition, we have investigated an improvement of "texture representation" by optimization of both an optical design and a software algorithm. Lastly, we discuss the future technical fields for texture representation on displays. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Paper-based 3D printing of anthropomorphic CT phantoms: Feasibility of two construction techniques.
- Author
-
Jahnke, Paul, Schwarz, Stephan, Ziegert, Marco, Schwarz, Felix Benjamin, Hamm, Bernd, and Scheel, Michael
- Subjects
ALGORITHMS ,COMPUTED tomography ,DIAGNOSTIC imaging ,HEAD ,COMPUTERS in medicine ,IMAGING phantoms ,RESEARCH funding ,PILOT projects ,THREE-dimensional printing ,MEDICAL artifacts - Abstract
Objectives: To develop and evaluate methods for assembling radiopaque printed paper sheets to realistic patient phantoms for CT dose and image quality testing.Methods: CT images of two patients were radiopaque printed with aqueous potassium iodide solution (0.6 g/ml) on paper. Two methods were developed for assembling the paper sheets to head and neck phantoms. (1) Printed sheets were fed to a paper-based 3D printer along with corresponding 3D printable STL files. (2) Paper stacks of 5-mm thickness were glued with toner, cut to the patient shape and assembled to a phantom. In a sample application study, both phantoms were examined with five different tube current settings. Images were reconstructed using filtered-back projection (FBP) and iterative reconstruction (AIDR 3D) with three strength levels. Dose length product (DLP), signal-to-noise ratios (SNR) and contrast-to-noise ratios (CNRs) were analysed. Data were analysed using 2-way analysis of variance (ANOVA).Results: Both methods achieved anthropomorphic phantoms with detailed patient anatomy. The 3D printer yielded a precise reproduction of the external patient shape, but caused visible glue artefacts. Gluing with toner avoided these artefacts and yielded more flexibility with regard to phantom size. In the sample application study, non-inferior SNR and CNR and up to 83.7% lower DLP were achieved on the phantoms with AIDR 3D compared with FBP.Conclusions: Two methods for assembling radiopaque printed paper sheets to phantoms of individual patients are presented. The sample application demonstrates potential for simulation of patient imaging and systematic CT dose and image quality assessment.Key Points: • Two methods were developed to create realistic CT phantoms of individual patients from radiopaque printed paper sheets. 
• Analysis of five tube current and four reconstruction settings on two radiopaque 3D printed patient phantoms yielded non-inferior SNR and CNR and up to 83.7% lower dose with iterative reconstruction in comparison with filtered back projection. • Radiopaque 3D printed phantoms can simulate patients and allow systematic analysis of CT dose and image quality parameters. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
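The SNR and CNR figures quoted in the abstract above are standard region-of-interest statistics. A minimal sketch of how they are computed is below; the pixel values and ROI definitions are invented for illustration and are not the paper's measurement protocol.

```python
from statistics import mean, stdev

def snr(roi):
    """Signal-to-noise ratio of a region of interest (ROI)."""
    return mean(roi) / stdev(roi)

def cnr(roi, background):
    """Contrast-to-noise ratio between an ROI and a background region."""
    return abs(mean(roi) - mean(background)) / stdev(background)

tissue = [100, 102, 98, 100]      # illustrative attenuation samples
background = [50, 52, 48, 50]
contrast = cnr(tissue, background)
```

Note that `statistics.stdev` is the sample standard deviation; image-analysis software sometimes uses the population form, so reported values can differ slightly.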
25. Selected Papers of the 32nd International Workshop on Combinatorial Algorithms, IWOCA 2021.
- Author
-
Flocchini, Paola and Moura, Lucia
- Subjects
EULERIAN graphs ,ALGORITHMS ,APPROXIMATION algorithms ,WEB hosting - Abstract
They give fixed parameter tractable algorithms for the problem parameterized by various structural parameters. The authors give a greedy loop-free algorithm for the exhaustive generation, a successor algorithm that runs in constant amortized time, among other algorithms, as well as results for the fixed spin generalization of this problem. IWOCA (International Workshop on Combinatorial Algorithms) is an annual conference series covering all aspects of combinatorial algorithms. [Extracted from the article]
- Published
- 2023
- Full Text
- View/download PDF
26. Development and Validation of an Algorithm for the Digitization of ECG Paper Images.
- Author
-
Randazzo, Vincenzo, Puleo, Edoardo, Paviglianiti, Annunziata, Vallan, Alberto, and Pasero, Eros
- Subjects
DIGITIZATION ,DIGITAL images ,ELECTROCARDIOGRAPHY ,HEART rate monitors ,PEARSON correlation (Statistics) ,MEASUREMENT errors ,HEART beat ,ALGORITHMS - Abstract
The electrocardiogram (ECG) signal describes the heart's electrical activity, making it possible to detect several health conditions, including cardiac system abnormalities and dysfunctions. Nowadays, most patient medical records are still paper-based, especially those made in past decades. The importance of collecting digitized ECGs is twofold: firstly, all medical applications can be easily implemented with an engineering approach if the ECGs are treated as signals; secondly, paper ECGs can deteriorate over time, so a correct evaluation of the patient's clinical evolution is not always guaranteed. The goal of this paper is the realization of an automatic conversion algorithm from paper-based ECGs (images) to digital ECG signals. The algorithm involves a digitization process tested on an image set of 16 subjects, including some with pathologies. The quantitative analysis of the digitization method is carried out by evaluating the repeatability and reproducibility of the algorithm. The digitization accuracy is evaluated both on the entire signal and on six ECG time parameters (R-R peak distance, QRS complex duration, QT interval, PQ interval, P-wave duration, and heart rate). Results demonstrate the algorithm's effectiveness, with an average Pearson correlation coefficient of 0.94 and measurement errors of the ECG time parameters always less than 1 mm. Given these promising experimental results, the algorithm could be embedded into a graphical interface, becoming a measurement and collection tool for cardiologists. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
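The 0.94 average Pearson correlation reported above is the standard agreement measure between a digitised trace and its ground truth. For context, a self-contained sketch of computing it; the signals here are short synthetic placeholders, not ECG data from the paper.

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

ground_truth = [0.0, 0.5, 1.2, 0.3, -0.2, 0.0]
digitised = [v * 1.05 + 0.01 for v in ground_truth]  # affine copy of the trace
r = pearson(digitised, ground_truth)
```

An affine distortion with positive gain (amplitude scale plus baseline offset, as a scanner might introduce) still yields r = 1, which is why correlation is paired with the absolute time-parameter errors in the abstract.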
27. Scientific papers and artificial intelligence. Brave new world?
- Author
-
Nexøe, Jørgen
- Subjects
COMPUTERS ,MANUSCRIPTS ,ARTIFICIAL intelligence ,MACHINE learning ,DATA analysis ,MEDICAL literature ,MEDICAL research ,ALGORITHMS - Published
- 2023
- Full Text
- View/download PDF
28. Paper currency defect detection algorithm using quaternion uniform strength.
- Author
-
Gai, Shan, Xu, Xiaolin, and Xiong, Bangshu
- Subjects
ALGORITHMS ,QUATERNIONS ,MONEY ,IMAGE registration ,MATHEMATICAL convolutions - Abstract
In this paper, we propose a novel paper currency defect detection algorithm using quaternion uniform strength. We first build a paper currency image preprocessing framework that includes intensity balancing, paper currency location, and geometric correction. We then propose a global–local paper currency image registration algorithm that moves key areas within a certain range, which effectively eliminates false differences. Finally, the quaternion uniform strength is calculated using a quaternion convolution edge detector, and the defect degree of the paper currency is determined using the quaternion uniform color difference. The proposed algorithm is tested on datasets from five currencies: CNY, USD, EUR, VND, and RUB. The experimental results demonstrate that the proposed algorithm yields better results than existing state-of-the-art paper currency defect detection techniques. A demo of the proposed paper currency defect detection algorithm will be made publicly available. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
29. 24‐1: Invited Paper: A Novel Algorithm for Eliminating Abnormal Detection Data of Ultra‐Large 95inch 8K OLED Panels with External Compensation.
- Author
-
Feng, Xuehuan, Bao, Wenchao, Meng, Song, Zhang, Yao, Li, Yongqian, Peng, Yuqing, Yu, Jianwei, and Dong, Xue
- Subjects
ALGORITHMS ,ORGANIC light emitting diodes - Abstract
In this paper, we present a new algorithm that eliminates abnormal detection data in our OLED panels caused by damaged detection TFTs. During the manufacture of oversize OLED TVs, particles cause many serious problems, especially when they damage detection TFTs; these damaged TFTs cause one or even several columns of detection data to become abnormal. The new "abnormal detection data forbidden algorithm" substitutes the adjacent normal column's detection data for a column with obviously abnormal detection data, and the new "mosaic algorithm" scatters the detection data of columns whose data is not obviously abnormal but appears as thin lines visible to the human eye when the panel is operating. In summary, the new integrated algorithm is very important for improving the image quality and stable operation of our products. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
30. Social and content aware One-Class recommendation of papers in scientific social networks.
- Author
-
Wang, Gang, He, XiRan, and Ishuga, Carolyne Isigi
- Subjects
INFORMATION technology ,SOCIAL networks ,SPARSE graphs ,HYBRID computers (Computer architecture) ,HYBRID power systems - Abstract
With the rapid development of information technology, scientific social networks (SSNs) have become the fastest and most convenient way for researchers to communicate with each other. Many published papers are shared via SSNs every day, resulting in the problem of information overload. How to recommend personalized and highly valuable papers to researchers is becoming more urgent. However, when recommending papers in SSNs, only a small number of positive instances are available, leaving a vast amount of unlabelled data in which negative instances and potential unseen positive instances are mixed together; this is naturally a One-Class Collaborative Filtering (OCCF) problem. Therefore, considering the extreme data imbalance and data sparsity of this OCCF problem, a hybrid approach of Social and Content aware One-class Recommendation of Papers in SSNs, termed SCORP, is proposed in this study. Unlike previous approaches proposed to address the OCCF problem, social information, which has been shown to play a significant role in recommendation in many domains, is applied both in the profiling of content-based filtering and in collaborative filtering to achieve superior recommendations. To verify the effectiveness of the proposed SCORP approach, a real-life dataset from CiteULike was employed. The experimental results demonstrate that the proposed approach is superior to all of the compared approaches, thus providing a more effective method for recommending papers in SSNs. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
31. Utilizing tables, figures, charts and graphs to enhance the readability of a research paper.
- Author
-
Divecha C. A., Tullu M. S., and Karande S.
- Subjects
GRAPHIC arts ,READABILITY (Literary style) ,SERIAL publications ,RESEARCH methodology ,COPYRIGHT ,MEDICAL research ,ALGORITHMS - Abstract
The authors offer observation on utilizing tables, figures, charts and graphs to help understand the research presented in a simple manner but also engage and sustain the reader's interest. Topics discussed include benefits provided by the use of tables/figures/charts/graphs, general methodology of design and submission, and copyright issues of using material from government publications/public domain.
- Published
- 2023
- Full Text
- View/download PDF
32. ARTIFICIAL INTELLIGENCE IN THE DETECTION OF BID-RIGGING PRACTICES: THE PIONEERING ROLE OF THE ACCO.
- Author
-
Jiménez Cardona, Noemí
- Subjects
GOVERNMENT purchasing ,ARTIFICIAL intelligence ,ANTITRUST law ,SOFTWARE development tools ,CARTELS - Abstract
Copyright of Revista Catalana de Dret Públic is the property of Revista Catalana de Dret Public. (Copyright applies to all Abstracts.)
- Published
- 2022
- Full Text
- View/download PDF
33. Selected Papers of the 31st International Workshop on Combinatorial Algorithms, IWOCA 2020.
- Author
-
Gąsieniec, Leszek, Klasing, Ralf, and Radzik, Tomasz
- Subjects
MATHEMATICAL proofs ,ALGORITHMS ,CHARTS, diagrams, etc. ,POLYNOMIAL time algorithms ,ONLINE algorithms ,GRAPH labelings ,HAMMING distance - Published
- 2022
- Full Text
- View/download PDF
34. Edge computing-enabled green multisource fusion indoor positioning algorithm based on adaptive particle filter.
- Author
-
Li, Mengyao, Zhu, Rongbo, Ding, Qianao, Wang, Jun, Wan, Shaohua, and Ma, Maode
- Subjects
ADAPTIVE filters ,PROBABILITY density function ,ALGORITHMS ,EDGE computing ,FILTER paper ,PARTICLE swarm optimization - Abstract
Edge computing enables portable devices to provide smart applications, and indoor positioning techniques offer accurate location-based indoor navigation and personalized smart services. To achieve high positioning accuracy, an indoor positioning algorithm based on a particle filter requires a large number of sample particles to approximate the probability density function, which leads to additional computational cost and high fusion delay. Focusing on real-time and accurate positioning, an edge computing-enabled green multi-source fusion indoor positioning algorithm called APFP, based on an adaptive particle filter, is proposed in this paper. APFP considers both pedestrian dead reckoning (PDR) signals from mobile terminals and the received signal strength indication (RSSI) of Bluetooth, and effectively merges the error-free accumulation of trilateral positioning with the accurate short-range positioning of PDR, enabling mobile terminals to adaptively perform particle filtering and thereby reduce computing time and power consumption while ensuring positioning accuracy. Detailed experimental results show that, compared with the traditional particle filter algorithm and the map-constrained algorithm, the proposed APFP reduces fusion computing cost by 59.89% and 54.37%, respectively. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
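The fusion filter in the abstract above is not specified in detail, but the trilateral (RSSI trilateration) step it builds on can be sketched: subtracting one circle equation from the other two linearises the system, leaving a 2×2 solve. The beacon positions and distances below are illustrative, not from the paper.

```python
def trilaterate(beacons, distances):
    """2-D position from three beacons by linearising the circle equations
    (x - xi)^2 + (y - yi)^2 = di^2 and solving the resulting 2x2 system."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    d1, d2, d3 = distances
    # Subtracting equation 1 from equations 2 and 3 removes x^2 and y^2.
    a11, a12 = 2 * (x1 - x2), 2 * (y1 - y2)
    b1 = d2**2 - d1**2 + x1**2 - x2**2 + y1**2 - y2**2
    a21, a22 = 2 * (x1 - x3), 2 * (y1 - y3)
    b2 = d3**2 - d1**2 + x1**2 - x3**2 + y1**2 - y3**2
    det = a11 * a22 - a12 * a21  # Cramer's rule on the 2x2 system
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Beacons at three corners; true position (3, 4) recovered from exact ranges.
est = trilaterate([(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)],
                  (5.0, 65 ** 0.5, 45 ** 0.5))
```

With noisy RSSI-derived ranges the linear solve gives a biased fix, which is precisely why the paper fuses it with PDR inside a particle filter.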
35. An Automatic Paper Recommendation Algorithm Based on Multi-View Fusion TextRCNN.
- Author
-
杨秀璋, 武帅, 杨琪, 项美玉, 李娜, 周既松, and 赵小明
- Subjects
CONVOLUTIONAL neural networks ,DEEP learning ,MACHINE learning ,AUTOMATIC classification ,ACCURACY of information ,ALGORITHMS - Abstract
Copyright of Journal of Computer Engineering & Applications is the property of Beijing Journal of Computer Engineering & Applications Journal Co Ltd. (Copyright applies to all Abstracts.)
- Published
- 2023
- Full Text
- View/download PDF
36. COAP 2019 Best Paper Prize: Paper of S. Gratton, C. W. Royer, L. N. Vicente, and Z. Zhang.
- Subjects
APPLIED mathematics ,ALGORITHMS ,SCIENTIFIC computing ,PRIZES (Contests & competitions) ,ALGORITHMIC randomness - Abstract
Each year, the editorial board of Computational Optimization and Applications selects a paper from the preceding year's publications for the Best Paper Award. This derivative-free algorithm relies on randomly generated directions and is analyzed from a probabilistic viewpoint, leading to complexity guarantees for both deterministic and probabilistic versions of the method. First, following recent developments in nonconvex optimization, complexity results have become increasingly popular in derivative-free optimization [8]. [Extracted from the article]
- Published
- 2020
- Full Text
- View/download PDF
37. Semantic Segmentation of Examination Papers Based on Subspace Multi-Scale Feature Fusion.
- Author
-
夏源祥, 刘 渝, 楚程钱, 万永菁, and 蒋翠玲
- Subjects
PYRAMIDS ,ALGORITHMS ,HANDWRITING ,CLASSIFICATION ,SUBSPACES (Mathematics) - Abstract
Copyright of Journal of East China University of Science & Technology is the property of Journal of East China University of Science & Technology Editorial Office. (Copyright applies to all Abstracts.)
- Published
- 2023
- Full Text
- View/download PDF
38. 23‐1: Invited Paper: Visualization of Color‐Gamut Coverage‐Gamut Ring Intersection.
- Subjects
VISUALIZATION ,COLOR ,ALGORITHMS - Abstract
The D50 CIELAB gamut volume was standardized as a suitable and unified gamut size measurement methodology. The color gamut volume is visualized using proportionate gamut rings in a two‐dimensional diagram. Gamut ring intersection assists the visual comparison of the size and shape of the color gamut coverage between a display and a reference. This paper introduces a desktop application with an intuitive graphical interface for visualizing the gamut rings and describes the algorithm of the gamut ring intersection. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
39. Digital marginalization, data marginalization, and algorithmic exclusions: a critical southern decolonial approach to datafication, algorithms, and digital citizenship from the Souths.
- Author
-
Chaka, Chaka
- Subjects
CITIZENSHIP ,DECOLONIZATION ,ELECTRONIC paper ,ALGORITHMS ,COMMUNITIES ,CHIEF information officers - Abstract
This paper explores digital marginalization, data marginalization, and algorithmic exclusions in the Souths. To this effect, it argues that underrepresented users and communities continue to be marginalized and excluded by digital technologies, by big data, and by algorithms employed by organizations, corporations, institutions, and governments in various data jurisdictions. Situating data colonialism within the Souths, the paper contends that data ableism, data disablism, and data colonialism are at play when data collected, collated, captured, configured, and processed from underrepresented users and communities is utilized by mega entities for their own multiple purposes. It also maintains that data coloniality, as opposed to data colonialism, is impervious to legal and legislative interventions within data jurisdictions. Additionally, it discusses digital citizenship (DC) and its related emerging regimes. Moreover, the paper argues that digital exclusion transcends the simplistic haves versus the have nots dualism as it manifests itself in multiple layers and in multiple dimensions. Furthermore, it characterizes how algorithmic exclusions tend to perpetuate historical human biases despite the pervasive view that algorithms are autonomous, neutral, rational, objective, fair, unbiased, and non-human. Finally, the paper advances a critical southern decolonial (CSD) approach to datafication, algorithms, and digital citizenship by means of which data coloniality, algorithmic coloniality, and the coloniality embodied in DC have to be critiqued, challenged, and dismantled. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
40. A Multi-Metric Model for analyzing and comparing extractive text summarization approaches and algorithms on scientific papers.
- Author
-
DURSUN, Mehmet Ali and SERTTAŞ, Soydan
- Subjects
TEXT summarization ,EDUCATION research ,ALGORITHMS ,AUTOMATIC summarization ,STATISTICS - Abstract
In today's world, where data and information are increasingly proliferating, text summarization and technologies play a critical role in making large amounts of text data more accessible and meaningful. In business, the news industry, academic research, and many other fields, text summarization helps make quick decisions, access information faster, and manage resources more effectively. Additionally, text summarization research is conducted to further improve these technologies and develop new methods and algorithms to provide better summarization of texts. Therefore, text summarization and research in this field are of great importance in the information age. In this study, a new operating model for text summarization that can be applied to different algorithms is proposed and evaluated. Sixteen summarization algorithms covering six approaches (statistical, graph-based, content-based, pointer-based, position-based, and user-oriented) were implemented and tested on 50 different full-text article datasets. Four evaluation criteria (BLEU, Rouge-N, Rouge-L, METEOR) were used to assess the similarity between the generated summaries and the original summaries. The performance of the algorithms within each approach was averaged and the overall best-performing algorithm was selected. This best algorithm was subjected to further analysis through Topic Modelling and Keyword Extraction to identify key topics and keywords within the summarised text. The proposed model provides a standardized workflow for developing and thoroughly testing summarization algorithms across datasets and evaluation metrics to determine the most appropriate summarization approach. This study demonstrates the effectiveness of the model on a variety of algorithm types and text sources. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
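Of the four evaluation criteria named in the abstract above, ROUGE-N is the most compact to illustrate: it is clipped n-gram overlap divided by the reference's n-gram count. A minimal sketch follows; the whitespace tokenisation and example sentences are simplifications, not the study's evaluation pipeline.

```python
from collections import Counter

def rouge_n(candidate, reference, n=1):
    """ROUGE-N recall: clipped n-gram overlap / reference n-gram count."""
    def ngrams(text):
        toks = text.lower().split()
        return Counter(tuple(toks[i:i + n]) for i in range(len(toks) - n + 1))
    cand, ref = ngrams(candidate), ngrams(reference)
    overlap = sum(min(cnt, cand[g]) for g, cnt in ref.items())
    return overlap / max(sum(ref.values()), 1)

score = rouge_n("the cat sat on the mat", "the cat slept on the mat")
```

Here five of the six reference unigrams are matched (clipping keeps a repeated "the" from being over-counted), giving 5/6; production evaluations typically add stemming and use a packaged scorer rather than this sketch.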
41. The Economic Optimization of Pulp and Paper Making Processes Using Computational Intelligence.
- Author
-
Pant, Millie, Thangaraj, Radha, and Singh, V. P.
- Subjects
COMPUTATIONAL intelligence ,ALGORITHMS ,INDUSTRIAL efficiency ,PAPER industry ,MANUFACTURING processes - Abstract
In this paper we present an application of two Computational Intelligence algorithms, namely the Particle Swarm Optimization (PSO) algorithm and Differential Evolution (DE), for finding an optimal solution to two optimization problems that occur in the paper industry. The first problem deals with the economic optimization of a hypothetical but realistic Kraft pulping process, and the second with the optimization of a boiler load allocation problem. Both problems form an integral part of the papermaking process. The simulation results show the efficiency and time effectiveness of DE and PSO. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
42. Quantifying the impact of scholarly papers based on higher-order weighted citations.
- Author
-
Bai, Xiaomei, Zhang, Fuli, Hou, Jie, Lee, Ivan, Kong, Xiangjie, Tolba, Amr, and Xia, Feng
- Subjects
CITATION analysis ,SCHOLARLY publishing ,BIBLIOMETRICS ,SIMULATION methods & models ,ALGORITHMS - Abstract
Quantifying the impact of a scholarly paper is of great significance, yet the effect of the geographical distance between cited papers has not been explored. In this paper, we examine 30,596 papers published in Physical Review C and identify the relationship between citations and the geographical distances between author affiliations. Subsequently, a relative citation weight is applied to assess the impact of a scholarly paper. A higher-order weighted quantum PageRank algorithm is also developed to address the behavior of multiple-step citation flow. Capturing the citation dynamics with higher-order dependencies reveals the actual impact of papers, including necessary self-citations that are sometimes excluded in prior studies. Quantum PageRank is utilized in this paper to help differentiate nodes whose PageRank values are identical. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
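The higher-order weighted quantum PageRank of the abstract above is beyond a short sketch, but its classical starting point, PageRank over a citation graph with weighted edges, looks like this. The toy graph, the damping factor, and the idea that weights encode distance-based relative citation weight are illustrative assumptions.

```python
def weighted_pagerank(graph, d=0.85, iters=50):
    """Power iteration for PageRank on a weighted directed graph.
    graph: {citing_paper: {cited_paper: weight}}; a weight could encode,
    e.g., a relative citation weight derived from geographic distance."""
    nodes = set(graph) | {v for out in graph.values() for v in out}
    n = len(nodes)
    pr = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        new = {u: (1.0 - d) / n for u in nodes}  # teleportation term
        for u, out in graph.items():
            total = sum(out.values())
            for v, w in out.items():
                new[v] += d * pr[u] * w / total  # weight-proportional share
        pr = new
    return pr

# A cites B twice as heavily as C; B and C each cite A back.
ranks = weighted_pagerank({"A": {"B": 2.0, "C": 1.0},
                           "B": {"A": 1.0}, "C": {"A": 1.0}})
```

Because every node here has outgoing edges, total rank mass is conserved; a fuller implementation would also redistribute the mass of dangling (uncited-citing) nodes.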
43. Research on an Academic Paper Recommendation Algorithm with Side-Information Embedding.
- Author
-
沈小烽, 刘柏嵩, 吴俊超, and 钱江波
- Subjects
RANDOM walks ,QUALITY factor ,PROBLEM solving ,ALGORITHMS ,COSINE function ,SEMANTICS - Abstract
Copyright of Journal of Computer Engineering & Applications is the property of Beijing Journal of Computer Engineering & Applications Journal Co Ltd. (Copyright applies to all Abstracts.)
- Published
- 2022
- Full Text
- View/download PDF
44. Using Paper Texture for Choosing a Suitable Algorithm for Scanned Document Image Binarization.
- Author
-
Lins, Rafael Dueire, Bernardino, Rodrigo, Barboza, Ricardo da Silva, and De Oliveira, Raimundo Correa
- Subjects
DOCUMENT imaging systems ,HISTORICAL source material ,TEXTURES ,ALGORITHMS - Abstract
The intrinsic features of documents, such as paper color, texture, aging, translucency, and the kind of printing, typing or handwriting, are important with regard to how to process and enhance their image. Image binarization is the process of producing a monochromatic image, taking its color version as input; it is a key step in the document processing pipeline. The recent Quality-Time Binarization Competitions for documents have shown that no single binarization algorithm is good for every kind of document image. This paper uses a sample of the texture of scanned historical documents as the main document feature to select which of 63 widely used algorithms, applied to five different versions of the input images (totaling 315 document image-binarization schemes), provides a reasonable quality-time trade-off. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
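The paper above selects among 63 binarization algorithms; as background on what any one of them does, here is a sketch of a representative global-threshold method in the spirit of Otsu's (maximising between-class variance), operating on a flat list of 0–255 grayscale pixels. This is generic textbook material, not one of the paper's 63 schemes specifically.

```python
def otsu_threshold(pixels):
    """Pick the threshold that maximises between-class variance."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * hist[i] for i in range(256))
    sum_b, w_b = 0.0, 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_b += hist[t]           # background weight (pixels <= t)
        if w_b == 0:
            continue
        w_f = total - w_b        # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b
        m_f = (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal "page": dark ink at 10, bright paper at 200.
pixels = [10] * 100 + [200] * 100
t = otsu_threshold(pixels)
binary = [1 if p > t else 0 for p in pixels]  # 1 = paper, 0 = ink
```

Global methods like this fail on stained or translucent historical paper, which is exactly the situation where the paper's texture-based algorithm selection earns its keep.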
45. A Paper Defect Detection Method Based on Improved YOLOv5.
- Author
-
张开生 and 关凯凯
- Subjects
CLASSIFICATION algorithms ,LIGHT sources ,FEATURE extraction ,ALGORITHMS ,SPEED - Abstract
Copyright of China Pulp & Paper is the property of China Pulp & Paper Magazines Publisher. (Copyright applies to all Abstracts.)
- Published
- 2022
- Full Text
- View/download PDF
46. ITERATIVE ALGORITHMS FOR VARIATIONAL INCLUSIONS IN BANACH SPACES.
- Author
-
ANSARI, QAMRUL HASAN, BALOOEE, JAVAD, and PETRUŞEL, ADRIAN
- Subjects
BANACH spaces ,LIPSCHITZ continuity ,PAPER arts ,DIFFERENTIAL inclusions ,ALGORITHMS - Abstract
The present paper has two parts. In the first part, we prove the Lipschitz continuity of the proximal mapping associated with a general strongly H-monotone mapping and compute an estimate of its Lipschitz constant under some mild assumptions imposed on the mapping H involved in the proximal mapping. We provide two examples to show that a maximal monotone mapping need not be general H-monotone for a single-valued mapping H from a Banach space to its dual space. A class of multi-valued nonlinear variational inclusion problems is considered, and by using the notion of proximal mapping and Nadler's technique, an iterative algorithm with mixed errors is suggested to compute its solutions. Under some appropriate hypotheses imposed on the mappings and parameters involved in the multi-valued nonlinear variational inclusion problem, the strong convergence of the sequences generated by the proposed algorithm to a solution of the aforesaid problem is verified. The second part of this paper investigates and analyzes the notion of Cn-monotone mappings defined and studied in [S.Z. Nazemi, A new class of monotone mappings and a new class of variational inclusions in Banach spaces, J. Optim. Theory Appl. 155(3)(2012) 785-795]. Several comments are given on the results and the algorithm that appeared in the above-mentioned paper. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
47. Explainable Rules and Heuristics in AI Algorithm Recommendation Approaches--A Systematic Literature Review and Mapping Study.
- Author
-
García-Peñalvo, Francisco José, Vázquez-Ingelmo, Andrea, and García-Holgado, Alicia
- Subjects
ARTIFICIAL intelligence ,LITERATURE reviews ,SOFTWARE engineering ,ALGORITHMS ,HEURISTIC ,SOFTWARE engineers - Abstract
The exponential use of artificial intelligence (AI) to solve and automate complex tasks has catapulted its popularity, generating some challenges that need to be addressed. While AI is a powerful means to discover interesting patterns and obtain predictive models, the use of these algorithms comes with great responsibility: an incomplete or unbalanced set of training data, or an improper interpretation of the models' outcomes, could lead to misleading conclusions that ultimately could become very dangerous. For these reasons, it is important to rely on expert knowledge when applying these methods. However, not every user can count on this specific expertise; non-AI-expert users could also benefit from applying these powerful algorithms to their domain problems, but they need basic guidelines to get the most out of AI models. The goal of this work is to present a systematic review of the literature analyzing studies whose outcomes are explainable rules and heuristics for selecting suitable AI algorithms given a set of input features. The systematic review follows the methodology proposed by Kitchenham and other authors in the field of software engineering. As a result, 9 papers that tackle AI algorithm recommendation through tangible and traceable rules and heuristics were collected. The small number of retrieved papers suggests a lack of explicitly reported rules and heuristics when testing the suitability and performance of AI algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
48. Finance journal rankings: a paper affiliation methodology.
- Author
-
Docampo, Domingo and Safón, Vicente
- Subjects
PYTHON programming language ,BIBLIOMETRICS ,CLASSIFICATION ,ALGORITHMS - Abstract
Purpose: In this paper, the authors use a new methodology, called paper affiliation index, to create finance journal ranking using expert judgment and research impact, both of which are based on secondary, objective measures, thus making it possible to produce lists every year without human manipulation at virtually no cost. Design/methodology/approach: Bibliometrics. Python implementation. Findings: A new ranking with 65 finance journals. Research limitations/implications: This procedure helps to reduce bias and to deal with known problems associated with current methodologies. The data used in the methodology comes from public sources; the procedure is therefore easily replicable. This methodology is not subject-dependent and thus can be transferred to other realms of knowledge. Once the bibliometric institutional data has been gathered, the procedure is not computationally costly: a Python implementation of the algorithm executes the whole computation in a few seconds. Results seem to correct the pernicious Matthew effect which is so evident in citation-based methods. Originality/value: The institutional classification created includes all institutions that have contributed papers to the field of finance. The procedure helps to reduce bias and to deal with known problems associated with current methodologies. The data used in the methodology comes from public sources, the procedure is therefore easily replicable. The methodology is not subject-dependent and thus can be transferred to other realms of knowledge. Once the bibliometric institutional data has been gathered, the procedure is not computationally costly. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
49. "Re-Materialized" Medical Data: Paper-Based Transmission of Structured Medical Data Using QR-Code, for Medical Imaging Reports.
- Author
-
LAURIOT DIT PREVOST, Arthur, BENTEGEAC, Raphaël, DEQUESNES, Audrey, BILLIAU, Adrien, BAUDELET, Emmanuel, LEGLEYE, Rémi, HUBAUT, Marc-Antoine, CASSAGNOU, Michel, PUECH, Philippe, BESSON, Rémi, and CHAZARD, Emmanuel
- Subjects
ELECTRONIC data interchange ,DIAGNOSTIC imaging ,TELECONFERENCING ,MEDICAL informatics ,ALGORITHMS ,TELEMEDICINE - Abstract
Although paper-based transmission of medical information might seem outdated, it has proven efficient and remains structurally safe from massive data leaks. As part of the ICIPEMIR project for improving medical imaging reports, we explored the idea of structured data storage within a medical report, by embedding the data themselves in a QR-Code (rather than a URL pointing to the data). Three different datasets from ICIPEMIR were serialized, then encoded in a QR-Code. We compared 4 compression algorithms to reduce file size before QR encoding. YAML was the most concise (character-sparing) format and allowed a 2633-character serialized file to be embedded in a QR-Code. The best compression rate was obtained with gzip, with a compression ratio of 2.32 in 15.7 ms. Data were easily extracted and decompressed from a digital QR-Code using a simple command line. The YAML file was also successfully recovered from the printed QR-Code with both Android and iOS smartphones. The minimum detectable size was 3×3 cm. [ABSTRACT FROM AUTHOR]
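The serialize-compress-encode pipeline described in this abstract can be sketched with Python's standard library. This is a minimal illustration, not the ICIPEMIR implementation: the YAML field names below are invented for the example, and actual QR generation would use a dedicated library (e.g. `qrcode`); here only the gzip compression step and the QR capacity check are shown.

```python
import gzip

# Hypothetical serialized report data in YAML form; the field names
# are illustrative and do not come from the ICIPEMIR datasets.
yaml_payload = "\n".join(
    f"finding_{i}: {{organ: liver, size_mm: {10 + i}, conclusion: benign}}"
    for i in range(60)
)

raw = yaml_payload.encode("utf-8")
compressed = gzip.compress(raw)
ratio = len(raw) / len(compressed)
print(f"raw={len(raw)} B, compressed={len(compressed)} B, ratio={ratio:.2f}")

# A version-40 QR-Code in binary mode holds at most 2,953 bytes,
# so the compressed payload must stay under that limit to fit
# in a single code.
assert len(compressed) <= 2953
```

Repetitive structured text such as YAML compresses well with gzip, which is consistent with the ratio of 2.32 reported by the authors; decoding is the reverse path (`gzip.decompress` on the bytes read back from the QR-Code).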
- Published
- 2021
- Full Text
- View/download PDF
50. Guest editorial: AI for computational audition—sound and music processing.
- Author
-
Li, Zijin, Wang, Wenwu, Zhang, Kejun, and Zhu, Mengyao
- Subjects
ARTIFICIAL intelligence ,INTERDISCIPLINARY research ,TRANSVERSAL lines ,ALGORITHMS - Abstract
Nowadays, the application of artificial intelligence (AI) algorithms and techniques is ubiquitous and cross-cutting. Fields that take advantage of AI advances include sound and music processing. Advances in interdisciplinary research may yield new insights that further improve AI methods in this field. This special issue aims to report recent progress and spur new research lines in AI-driven sound and music processing, especially within interdisciplinary research scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF