Search Results (127,481 results)
2. Automatic Test Paper Generation Technology for Mandarin Based on Hilbert Huang Algorithm.
- Author
- Wang, Lei
- Subjects
- ARTIFICIAL neural networks, ALGORITHMS, COMPUTER engineering, EMPLOYEE rights, HUMAN resources departments
- Abstract
With the development of computer technology, automatic test paper generation systems have gradually become an effective tool for detecting and maintaining national machine security and protecting the rights and interests of workers. Using artificial neural networks for online scoring, this article achieved multi-level oral scoring for different question types. Based on the specific situation and evaluation index requirements, a reasonable and efficient analysis module, in line with the hierarchical structure and module requirements of national conditions, was designed to complete the research on automatic test paper generation technology, helping to better manage and allocate human resources and improve production efficiency. Functional testing of the technical module showed that the scalability of the system was over 82%. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
3. Paper Perfect: Robert Lang and the Science of Origami
- Author
- Foer, Joshua
- Published
- 2014
4. Cooperative Multiobjective Decision Support for the Paper Industry
- Author
- Murthy, Sesh, Akkiraju, Rama, Goodwin, Richard, Keskinocak, Pinar, Rachlin, John, Wu, Frederick, Yeh, James, Fuhrer, Robert, Kumaran, Santhosh, Aggarwal, Alok, Sturzenbecker, Martin, Jayaraman, Ranga, and Daigle, Robert
- Published
- 1999
5. Special Issue Paper: Robust Solutions and Risk Measures for a Supply Chain Planning Problem under Uncertainty
- Author
- Poojari, C. A., Lucas, C., and Mitra, G.
- Published
- 2008
6. Special Issue: "2022 and 2023 Selected Papers from Algorithms' Editorial Board Members".
- Author
- Werner, Frank
- Subjects
- EDITORIAL boards, ALGORITHMS, OPTIMIZATION algorithms, DIFFERENTIAL evolution, QUADRATIC assignment problem, MACHINE learning, TABU search algorithm
- Abstract
This document is a special issue of the journal Algorithms, featuring selected papers from the journal's editorial board members from 2022 and 2023. The issue includes 16 research papers covering a range of topics such as game theory, fault detection in cellular networks, optimization algorithms, machine learning, cryptocurrency trading, and more. Each paper presents its own unique research findings and methodologies. The issue aims to showcase the diverse research interests and expertise of the journal's editorial board members. [Extracted from the article]
- Published
- 2024
- Full Text
- View/download PDF
7. A fully-automated paper ECG digitisation algorithm using deep learning.
- Author
- Wu, Huiyi, Patel, Kiran Haresh Kumar, Li, Xinyang, Zhang, Bowen, Galazis, Christoforos, Bajaj, Nikesh, Sau, Arunashis, Shi, Xili, Sun, Lin, Tao, Yanda, Al-Qaysi, Harith, Tarusan, Lawrence, Yasmin, Najira, Grewal, Natasha, Kapoor, Gaurika, Waks, Jonathan W., Kramer, Daniel B., Peters, Nicholas S., and Ng, Fu Siong
- Subjects
- DEEP learning, ELECTROCARDIOGRAPHY, ELECTRONIC paper, ATRIAL fibrillation, ALGORITHMS, HEART failure, HEART rate monitors
- Abstract
There is increasing focus on applying deep learning methods to electrocardiograms (ECGs), with recent studies showing that neural networks (NNs) can predict future heart failure or atrial fibrillation from the ECG alone. However, large numbers of ECGs are needed to train NNs, and many ECGs currently exist only in paper format, which is not suitable for NN training. We developed a fully-automated online ECG digitisation tool to convert scanned paper ECGs into digital signals. Using automated horizontal and vertical anchor point detection, the algorithm automatically segments the ECG image into separate images for the 12 leads, and a dynamical morphological algorithm is then applied to extract the signal of interest. We then validated the performance of the algorithm on 515 digital ECGs, of which 45 were printed, scanned and redigitised. The automated digitisation tool achieved 99.0% correlation between the digitised signals and the ground truth ECG (n = 515 standard 3-by-4 ECGs) after excluding ECGs with overlap of lead signals. Without exclusion, the average correlation ranged from 90 to 97% across the leads on all 3-by-4 ECGs. There was a 97% correlation for 12-by-1 and 3-by-1 ECG formats after excluding ECGs with overlap of lead signals. Without exclusion, the average correlation of some leads in 12-by-1 ECGs was 60–70%, and the average correlation for 3-by-1 ECGs reached 80–90%. For ECGs that were printed, scanned, and redigitised, our tool achieved 96% correlation with the original signals. We have developed and validated a fully-automated, user-friendly, online ECG digitisation tool. Unlike other available tools, it does not require any manual segmentation of ECG signals. Our tool can facilitate the rapid and automated digitisation of large repositories of paper ECGs to allow them to be used for deep learning projects. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
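The core signal-recovery step described in the abstract above, tracing each lead's ink column by column after segmentation, can be sketched in a few lines. This is a minimal illustration with assumed calibration parameters (`px_per_mm` and `mv_per_mm` are hypothetical defaults), not the authors' dynamical morphological algorithm:

```python
import numpy as np

def extract_trace(binary, px_per_mm=10.0, mv_per_mm=0.1):
    """Recover a 1-D signal from a binarised single-lead ECG image.

    binary: 2-D bool array, True where the trace ink is.
    For each column, the trace position is taken as the mean row index
    of ink pixels; columns with no ink (pen lifts, binarisation gaps)
    are linearly interpolated. Rows increase downwards, so the sign is
    flipped to turn pixel position into amplitude.
    """
    h, w = binary.shape
    rows = np.arange(h)
    pos = np.full(w, np.nan)
    for x in range(w):
        ink = binary[:, x]
        if ink.any():
            pos[x] = rows[ink].mean()
    valid = ~np.isnan(pos)
    pos = np.interp(np.arange(w), np.flatnonzero(valid), pos[valid])
    baseline = np.median(pos)          # crude isoelectric-line estimate
    return (baseline - pos) / px_per_mm * mv_per_mm
```

A real pipeline would first remove the grid and detect the calibration pulse to set the two scale factors; here they are simply passed in.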
8. Fracture Liaison Service of the Croatian Society for Physical and Rehabilitation Medicine of the Croatian Medical Association – a Position Statement.
- Author
- Grazio, Simeon, Nikolić, Tatjana, Schnurrer-Luke-Vrbanić, Tea, Poljičanin, Ana, and Grubišić, Frane
- Published
- 2024
- Full Text
- View/download PDF
9. A Machine Learning Model to Predict Citation Counts of Scientific Papers in Otology Field.
- Author
- Alohali, Yousef A., Fayed, Mahmoud S., Mesallam, Tamer, Abdelsamad, Yassin, Almuhawas, Fida, and Hagr, Abdulrahman
- Subjects
- DECISION trees, SERIAL publications, NATURAL language processing, BIBLIOMETRICS, MACHINE learning, REGRESSION analysis, RANDOM forest algorithms, CITATION analysis, DESCRIPTIVE statistics, PREDICTION models, ARTIFICIAL neural networks, MEDICAL research, MEDICAL specialties & specialists, ALGORITHMS
- Abstract
One of the most widely used measures of scientific impact is the number of citations. However, due to their heavy-tailed distribution, citation counts are fundamentally difficult to predict, although predictions can be improved. This study investigated the factors influencing the citation count of a scientific paper in the otology field. It proposes a new solution that uses machine learning and natural language processing to process English text and output a predicted citation count. Different algorithms are implemented in this solution, such as linear regression, boosted decision trees, decision forests, and neural networks. The application of neural network regression revealed that papers' abstracts have more influence on the citation numbers of otological articles. This solution was developed in visual programming using Microsoft Azure machine learning at the back end and Programming Without Coding Technology at the front end. We recommend using machine learning models to improve the abstracts of research articles to attract more citations. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
10. Computing Science: Pencil, Paper, and Pi
- Author
- Hayes, Brian
- Published
- 2014
11. Development and Validation of an Algorithm for the Digitization of ECG Paper Images.
- Author
- Randazzo, Vincenzo, Puleo, Edoardo, Paviglianiti, Annunziata, Vallan, Alberto, and Pasero, Eros
- Subjects
- DIGITIZATION, DIGITAL images, ELECTROCARDIOGRAPHY, HEART rate monitors, PEARSON correlation (Statistics), MEASUREMENT errors, HEART beat, ALGORITHMS
- Abstract
The electrocardiogram (ECG) signal describes the heart's electrical activity, allowing it to detect several health conditions, including cardiac system abnormalities and dysfunctions. Nowadays, most patient medical records are still paper-based, especially those made in past decades. The importance of collecting digitized ECGs is twofold: firstly, all medical applications can be easily implemented with an engineering approach if the ECGs are treated as signals; secondly, paper ECGs can deteriorate over time, therefore a correct evaluation of the patient's clinical evolution is not always guaranteed. The goal of this paper is the realization of an automatic conversion algorithm from paper-based ECGs (images) to digital ECG signals. The algorithm involves a digitization process tested on an image set of 16 subjects, also with pathologies. The quantitative analysis of the digitization method is carried out by evaluating the repeatability and reproducibility of the algorithm. The digitization accuracy is evaluated both on the entire signal and on six ECG time parameters (R-R peak distance, QRS complex duration, QT interval, PQ interval, P-wave duration, and heart rate). Results demonstrate the algorithm efficiency has an average Pearson correlation coefficient of 0.94 and measurement errors of the ECG time parameters are always less than 1 mm. Due to the promising experimental results, the algorithm could be embedded into a graphical interface, becoming a measurement and collection tool for cardiologists. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
12. Cost Optimal Production-Scheduling Model Based on VNS-NSGA-II Hybrid Algorithm—Study on Tissue Paper Mill.
- Author
- Zhang, Huanhuan, Li, Jigeng, Hong, Mengna, Man, Yi, and He, Zhenglei
- Subjects
- PAPER mills, FLOW shop scheduling, PRODUCTION scheduling, INDUSTRIAL costs, ALGORITHMS
- Abstract
With the development of the customization concept, small-batch and multi-variety production will become one of the major production modes, especially for fast-moving consumer goods. However, this production mode has two issues: high production cost and the long manufacturing period. To address these issues, this study proposes a multi-objective optimization model for the flexible flow-shop to optimize the production scheduling, which would maximize the production efficiency by minimizing the production cost and makespan. The model is designed based on hybrid algorithms, which combine a fast non-dominated genetic algorithm (NSGA-II) and a variable neighborhood search algorithm (VNS). In this model, NSGA-II is the major algorithm to calculate the optimal solutions. VNS is to improve the quality of the solution obtained by NSGA-II. The model is verified by an example of a real-world typical FFS, a tissue papermaking mill. The results show that the scheduling model can reduce production costs by 4.2% and makespan by 6.8% compared with manual scheduling. The hybrid VNS-NSGA-II model also shows better performance than NSGA-II, both in production cost and makespan. Hybrid algorithms are a good solution for multi-objective optimization issues in flexible flow-shop production scheduling. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
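The non-dominated filtering at the heart of NSGA-II, used in the hybrid model above, is easy to show for the paper's two objectives (production cost and makespan, both minimised). This is a toy sketch of the first Pareto front only, not the paper's full VNS-NSGA-II model:

```python
def dominates(a, b):
    """True if solution a is at least as good as b on every objective
    (both minimised) and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset: the first front NSGA-II builds
    before crowding-distance sorting and selection."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]
```

NSGA-II repeats this ranking over successive fronts each generation; the paper then hands promising individuals to a variable neighborhood search for local improvement.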
13. A reviewer-reputation ranking algorithm to identify high-quality papers during the review process.
- Author
- Gao, Fujuan, Fenoaltea, Enrico Maria, Zhang, Pan, and Zeng, An
- Subjects
- ALGORITHMS, CITATION networks, REPUTATION, RESEARCH personnel, BIPARTITE graphs, BEES algorithm
- Abstract
With the exponential growth in the number of academic researchers, it is crucial for editors of scientific journals to identify the highest-quality papers. While several measures exist to evaluate a paper's impact post-publication, the challenge of determining the potential impact of a manuscript during the review process remains an understudied issue. In this paper, we propose a reviewer-reputation ranking algorithm to identify high-quality papers based on paper citations, where a reviewer's reputation is computed from the correlation between their past ratings and the current number of citations received by the papers they have evaluated. During the review process, reviewers with high reputation scores are given more weight to determine the quality of papers. We test the algorithm on an artificial network with 200 reviewers and 600 papers, as well as on the American Physical Society (APS) data set, including in the analysis 308,243 papers and 274,154 mutual citations. We compare our approach with two existing methods, demonstrating that our algorithm significantly outperforms the others in identifying manuscripts with the highest quality. Our findings can help improve the impact of scientific journals, thereby contributing to academic and scientific progress. • We propose an algorithm to identify the papers with the highest quality from a large number of submissions. • We compare our new algorithm with other existing methods of aggregating user ratings in various online services. • We test our algorithm both with an artificial network and with the empirical data of the APS data set. • We show that our algorithm outperforms the other methods in identifying the papers with the highest quality. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
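The reputation weighting the abstract describes, a reviewer's past rating-citation correlation used to weight their current ratings, can be sketched as follows. The function names are illustrative, and clipping negative correlations to zero is an assumption of this sketch, not a detail taken from the paper:

```python
import numpy as np

def reputation(past_ratings, past_citations):
    """Reviewer reputation as the Pearson correlation between the
    reviewer's past ratings and the citations those papers later
    received; clipped at zero so anti-correlated reviewers get no weight."""
    r = np.corrcoef(past_ratings, past_citations)[0, 1]
    return max(r, 0.0)

def paper_score(ratings, reputations):
    """Reputation-weighted rating of a submission during review."""
    w = np.asarray(reputations, dtype=float)
    if w.sum() == 0:
        return float(np.mean(ratings))
    return float(np.dot(ratings, w) / w.sum())
```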
14. S-integral points on elliptic curves - Notes on a paper of B. M. M. de Weger
- Author
- HERRMANN, Emanuel and PETHŐ, Attila
- Published
- 2001
15. Cigarette paper as evidence: Forensic profiling using ATR-FTIR spectroscopy and machine learning algorithms.
- Author
- Kapoor, Muskaan, Sharma, Akanksha, and Sharma, Vishal
- Subjects
- CIGARETTES, FORENSIC sciences, FOURIER transform infrared spectroscopy, MACHINE learning, ALGORITHMS
- Abstract
This research highlights the underestimated significance of cigarette paper as evidence at crime scenes. The primary objective is to distinguish cigarette paper from similar-looking alternatives, addressing the first research objective. The second objective involves identifying cigarette paper brands using attenuated total reflectance Fourier transform infrared (ATR-FTIR) spectroscopy and machine learning (ML) algorithms. Accurate differentiation of cigarette paper from normal paper is emphasized. ATR-FTIR spectroscopy, coupled with principal component analysis (PCA) for dimensionality reduction, is employed for brand identification. Among fifteen ML algorithms compared, the CatBoost classifier excels for both objectives. This research presents a non-destructive, effective method for studying cigarette paper, contributing valuable insights to crime scene investigations. [Display omitted] • Forensic evaluation of cigarette paper utilizing ATR-FTIR spectroscopy and Machine learning algorithms. • Peak characterization and differentiation-distinguishing cigarette paper from other types. • Machine learning algorithm comparison: assessing discrimination across nine cigarette brands. • External validation of the dominant algorithm using unknown samples. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. A Paper-and-Pencil gcd Algorithm for Gaussian Integers
- Author
- Szabó, Sándor
- Published
- 2005
- Full Text
- View/download PDF
17. PROBLEMATIC ISSUES OF H-INDEX CAPTURING – HOW TO WRITE PAPERS AND MAKE LIFE EASIER FOR THE ALGORITHMS.
- Author
- Tumanishvili, George G.
- Subjects
- AUTHOR-publisher relations, TRACKING algorithms, ALGORITHMS, PERIODICAL publishing, ENGLISH language
- Abstract
Differentiation/ranking of authors in the scientific sphere is carried out according to their influence on particular field(s) of science. The impact of a contribution is measured by the influence a text has on the development of the field/issue, which can be gauged by the usage of its ideas in other researchers' works, given as citations. Nowadays, citations are tracked by specific algorithms and citation management systems that have access to various databases, catalogues, and bibliography systems through metadata. In the presented article, I discuss the problems that authors face while writing texts in languages other than English and publishing them in periodicals. The most popular (often indicative) texts/authors still fail to be captured/cached by the algorithms, creating an imbalance between the actual number of citations made by other scholars and the cached h-index displayed by the algorithm. The paper discusses the causes of the problem and suggests solutions for both authors and publishers. [ABSTRACT FROM AUTHOR]
- Published
- 2020
18. An Approach to Automatic Reconstruction of Apictorial Hand Torn Paper Document.
- Author
- Lotus, Rayappan, Varghese, Justin, and Saudia, Subash
- Subjects
- AUTOMATION, PAPER, ARCHAEOLOGY, FORENSIC sciences, ALGORITHMS
- Abstract
Digital automation in the reconstruction of apictorial hand-torn paper documents increases efficacy and reduces human effort. Reconstruction of torn documents is important in various fields such as archaeology, art conservation, and forensic science. The devised novel technique for hand-torn paper documents consists of pre-processing, feature extraction, and reconstruction phases. Torn fragments' boundaries are simplified as polygons using the Douglas-Peucker polyline simplification algorithm. Features such as Euclidean distance and the number of sudden changes in contour orientation are extracted. Our matching criteria identify the matching counterparts. The proposed features curtail ambiguity and enrich efficacy in reconstruction. Reconstruction results for hand-torn paper documents favour the proposed methodology. [ABSTRACT FROM AUTHOR]
- Published
- 2016
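The Douglas-Peucker simplification step named in the abstract above keeps the point farthest from the chord between a polyline's endpoints and recurses while that distance exceeds a tolerance. A compact recursive version (the tolerance `eps` is a free parameter, chosen per application):

```python
import math

def rdp(points, eps):
    """Ramer-Douglas-Peucker polyline simplification.

    points: list of (x, y) tuples. Keeps the interior point farthest
    from the endpoint chord and recurses on both halves whenever that
    distance exceeds eps; otherwise collapses the span to its endpoints.
    """
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]

    def dist(p):
        # perpendicular distance from p to the chord (x1,y1)-(x2,y2)
        px, py = p
        num = abs((y2 - y1) * px - (x2 - x1) * py + x2 * y1 - y2 * x1)
        den = math.hypot(x2 - x1, y2 - y1)
        return num / den if den else math.hypot(px - x1, py - y1)

    dmax, imax = max((dist(p), i) for i, p in enumerate(points[1:-1], 1))
    if dmax <= eps:
        return [points[0], points[-1]]
    return rdp(points[:imax + 1], eps)[:-1] + rdp(points[imax:], eps)
```

On torn-fragment contours this turns noisy pixel boundaries into polygons whose vertices (the "sudden changes in contour orientation") can be compared across fragments.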
19. Developments in the Design of Experiments, Correspondent Paper
- Author
- Atkinson, A. C.
- Published
- 1982
- Full Text
- View/download PDF
20. Invited Papers
- Author
- Aldous, D. J., Arratia, R. A., Barbour, A. D., Bollobás, B., Diaconis, P., Donnelly, P. J., Erdős, P., Harper, L. H., Harris, B., Kolchin, V. F., Odlyzko, A. M., Pitman, J. W., Pittel, B. G., Shepp, L. A., Spencer, J., Takács, L., Vatutin, V. A., Vershik, A. M., Viskov, O. V., and Wilf, H. S.
- Published
- 1992
- Full Text
- View/download PDF
21. AI GODS, JEANS GODS, AND THRIFT GODS: RESPONDING TO RESPONSES TO THE BLESSED BY THE ALGORITHM PAPER (SINGLER 2020).
- Author
- Singler, Beth
- Subjects
- GODS, ARTIFICIAL intelligence, ALGORITHMS, THRIFT institutions
- Published
- 2023
- Full Text
- View/download PDF
22. Tools and algorithms for the construction and analysis of systems: a special issue on tool papers for TACAS 2021.
- Author
- Jensen, Peter Gjøl and Neele, Thomas
- Subjects
- ALGORITHMS, SOFTWARE verification, INTEGRATED circuit verification, SYSTEMS software, CONFERENCES & conventions
- Abstract
This special issue contains six revised and extended versions of tool papers that appeared in the proceedings of TACAS 2021, the 27th International Conference on Tools and Algorithms for the Construction and Analysis of Systems. The issue is dedicated to the realization of algorithms in tools and to studies applying these tools to the analysis of hardware and software systems. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
23. Research on the Fusion of Hybrid Fuzzy Clustering Algorithm and Computer Automatic Test Paper Composition Algorithm.
- Author
- Kan, Baopeng
- Subjects
- COMPUTERS, COMPUTER algorithms, FUZZY algorithms, COMPUTER workstation clusters, ALGORITHMS, HIGHER education exams
- Abstract
In order to improve the effect of intelligent automatic test paper composition, this paper combines the hybrid fuzzy clustering algorithm to study the computer automatic test paper composition algorithm. In this paper, a computer automatic test paper composition system based on hybrid fuzzy clustering algorithm is constructed. Moreover, the hybrid fuzzy clustering method used in this paper is used as the basic algorithm of the system, and the algorithm is improved according to the actual needs of intelligent paper composition. In addition, this paper uses an intelligent algorithm to input the relevant constraint parameters and combines the original parameters to select the most suitable test questions from the database and combine them into test papers. Finally, this paper constructs the system structure based on the requirements of intelligent test paper composition. The experimental research shows that the computer automatic test paper composition system based on the hybrid fuzzy clustering algorithm proposed in this paper has a good test paper composition function, which can effectively promote the progress of the intelligent examination mode in colleges and universities. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
24. VIBRANT-WALK: An algorithm to detect plagiarism of figures in academic papers.
- Author
- Parmar, Shashank and Jain, Bhavya
- Subjects
- PLAGIARISM, COMPUTER algorithms, ALGORITHMS, COMPUTER vision, RANDOM walks
- Abstract
Detecting plagiarism in academic papers is crucial for maintaining academic integrity, preserving the originality of published work, and safeguarding intellectual property. While existing applications excel at text plagiarism detection, they fall short when it comes to image plagiarism. This paper introduces a novel algorithm, named "VIBRANT-WALK," designed to detect image plagiarism in academic manuscripts. The challenge of identifying plagiarized images is formidable, requiring a unique approach. Traditional Computer Vision algorithms, proficient in image similarity tasks, face limitations in determining whether an image has been previously used in an article. To address this, the proposed algorithm leverages a repository of all published article pages, focusing on absolute identicality rather than image similarity. The algorithm comprises two stages. In the first stage, a "Vibrancy Matrix" is created through image preprocessing, aiding in contour determination. The second stage involves pixel-by-pixel comparison with images from published manuscripts. To enhance efficiency, the algorithm initiates comparisons from the pixel with the highest score in the Vibrancy Matrix, followed by pixel comparisons through random walks, significantly reducing complexity. To conduct the study, a custom dataset was compiled from 69 research articles, capturing snapshots of each page and figure. Overall, we present 485 unique test cases where we can test the accuracy and efficiency of the algorithm. The lack of publicly available datasets necessitated this approach. The proposed algorithm outperformed the existing models and algorithms in this field by achieving an overall accuracy of 94.8% on the collated dataset, identifying 460 instances of plagiarism out of the 485 test cases. The algorithm also demonstrated a 100% accuracy rate in avoiding false positives. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
25. The Folded Paper Size Illusion: Evidence of Inability to Perceptually Integrate More Than One Geometrical Dimension.
- Author
- Carbon, Claus-Christian
- Subjects
- PAPER sizing, PERCEPTUAL illusions, SENSORIMOTOR integration, COGNITION, ALGORITHMS, PSYCHOPHYSICS
- Abstract
The folded paper-size illusion is as easy to demonstrate as it is powerful in generating insights into perceptual processing: First take two A4 sheets of paper, one original sized, another halved by folding, then compare them in terms of area size by centering the halved sheet on the center of the original one! We perceive the larger sheet as far less than double (i.e., 100%) the size of the small one, typically only being about two thirds larger--this illusion is preserved by rotating the inner sheet and even by aligning it to one or two sides, but is dissolved by aligning both sheets to three sides, here documented by 88 participants' data. A potential explanation might be the general incapability of accurately comparing more than one geometrical dimension at once--in everyday life, we solve this perceptual-cognitive bottleneck by reducing the complexity of such a task via aligning parts with same lengths. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
26. Automated analysis of pen-on-paper spirals for tremor detection, quantification, and differentiation.
- Author
- Rajan, Roopa, Anandapadmanabhan, Reghu, Nageswaran, Sharmila, Radhakrishnan, Vineeth, Saini, Arti, Krishnan, Syam, Gupta, Anu, Vishnu, Venugopalan Y., Pandit, Awadh K., Singh, Rajesh Kumar, Radhakrishnan, Divya M., Singh, Mamta Bhushan, Bhatia, Rohit, Srivastava, Achal, Kishore, Asha, and Padma Srivastava, M. V.
- Subjects
- STATISTICS, RESEARCH, CONFIDENCE intervals, ANALYSIS of variance, TASK performance, HANDWRITING, ACCELEROMETERS, DYSTONIA, MOVEMENT disorders, TREMOR, DRAWING, DESCRIPTIVE statistics, PARKINSON'S disease, SENSITIVITY & specificity (Statistics), DATA analysis, RECEIVER operating characteristic curves, DATA analysis software, ALGORITHMS
- Abstract
OBJECTIVE: To develop an automated algorithm to detect, quantify, and differentiate between tremors using pen-on-paper spirals. METHODS: Patients with essential tremor (n = 25), dystonic tremor (n = 25), Parkinson's disease (n = 25), and healthy volunteers (HV, n = 25) drew free-hand spirals. The algorithm derived the mean deviation (MD) and tremor variability from scanned images. MD and tremor variability were compared with 1) the Bain and Findley scale, 2) the Fahn–Tolosa–Marin tremor rating scale (FTM–TRS), and 3) the peak power and total power of the accelerometer spectra. Inter- and intra-loop widths were computed to differentiate between the tremors. RESULTS: MD was higher in the tremor group (48.9±26.3) than in HV (26.4±5.3; p < 0.001). The cut-off value of 30.3 had 80.9% sensitivity and 76.0% specificity for the detection of tremor [area under the curve: 0.83; 95% confidence interval (CI): 0.75, 0.91; p < 0.001]. MD correlated with the Bain and Findley ratings (rho = 0.491, p < 0.001), FTM–TRS part B (rho = 0.260, p = 0.032), and accelerometric measures of postural tremor (total power, rho = 0.366, p < 0.001; peak power, rho = 0.402, p < 0.001). The minimum detectable change was 19.9%. Inter-loop width distinguished Parkinson's disease spirals from dystonic tremor (p < 0.001, 95% CI: 54.6, 211.1), essential tremor (p = 0.003, 95% CI: 28.5, 184.9), and HV (p = 0.036, 95% CI: -160.4, -3.9). CONCLUSION: The automated analysis of pen-on-paper spirals generated robust variables to quantify tremor and putative variables to distinguish tremors from each other. SIGNIFICANCE: This technique may be useful for epidemiological surveys and follow-up studies on tremor. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
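One plausible way to compute an MD-like variable from a digitised spiral is to fit an ideal Archimedean spiral to the drawing and average the radial residuals. This is a rough analogue under assumed conventions (polar coordinates, least-squares fit), not the authors' published formula:

```python
import numpy as np

def spiral_mean_deviation(theta, r):
    """Fit an ideal Archimedean spiral r = a + b*theta by least squares
    and return the mean absolute radial deviation of the drawing from it
    (in the same units as r). A steady hand gives a value near zero;
    tremor shows up as oscillatory residuals around the fitted spiral."""
    A = np.column_stack([np.ones_like(theta), theta])
    (a, b), *_ = np.linalg.lstsq(A, r, rcond=None)
    return float(np.mean(np.abs(r - (a + b * theta))))
```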
27. Digitalized Control Algorithm of Bridgeless Totem-Pole PFC with a Simple Control Structure Based on the Phase Angle.
- Author
- Lee, Gi-Young, Park, Hae-Chan, Ji, Min-Woo, and Kim, Rae-Young
- Subjects
- ELECTRIC current rectifiers, ELECTRONIC paper, PHASE-locked loops, ALGORITHMS, ANGLES, VOLTAGE
- Abstract
Compared to the conventional boost power factor correction (PFC) converter, a totem-pole bridgeless PFC has high efficiency because it does not have an input diode rectifier stage, but a current spike may occur when the polarity of the grid voltage changes. This paper proposes a digital control algorithm for bridgeless totem-pole PFC with a simple control structure based on the phase angle of grid voltage. The proposed algorithm has a PI-based double-loop control structure and performs DC-link voltage and input inductor current control. Rectifying switches operate based on the proposed rectification algorithm using phase angle information calculated through a single-phase phase-locked loop (PLL) to prevent current spikes. The feed-forward duty ratio value is calculated according to the polarity of the grid voltage and added to the double-loop controller to perform appropriate power factor control. The performance and feasibility of the proposed control algorithm are verified through a 3 kW hardware prototype. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
28. A BPNN Model-Based AdaBoost Algorithm for Estimating Inside Moisture of Oil–Paper Insulation of Power Transformer.
- Author
- Liu, Jiefeng, Ding, Zheshi, Fan, Xianhao, Geng, Chuhan, Song, Boshu, Wang, Qingyin, and Zhang, Yiyi
- Subjects
- POWER transformers, TRANSFORMER insulation, MOISTURE, ALGORITHMS, MACHINE learning, CLASSIFICATION algorithms
- Abstract
The traditional method for transformer moisture diagnosis is to establish empirical equations between feature parameters extracted from frequency domain spectroscopy (FDS) and the transformer’s moisture content. However, the established empirical equation may not be applicable to a novel testing environment, resulting in an unreliable evaluation result. In this regard, it is acknowledged that FDS combined with machine learning is more suitable for estimating moisture content in a variety of test environments. Nonetheless, the accuracy of the estimation results obtained using the existing method is limited by the algorithm’s inability to generalize. To address this issue, we propose an AdaBoost algorithm-enhanced back-propagation neural network (BP_AdaBoost). This study creates a database by extracting feature parameters from the FDS that characterize the insulation states of the prepared samples. Then, using the BP_AdaBoost algorithm and the newly constructed database, the moisture estimation models are trained. Finally, the results of the estimation are discussed in terms of laboratory and field transformers. By comparing the proposed BP_AdaBoost algorithm to other intelligence algorithms, it is demonstrated that it not only performs better in generalization, but also maintains a high level of accuracy. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
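The boosting idea AdaBoost contributes in the record above, reweighting training samples so later learners concentrate on the examples earlier ones got wrong, is easiest to see on a toy 1-D classification task with threshold stumps. The paper instead boosts a back-propagation network for moisture regression, so treat this only as an illustration of the weighting loop, not the BP_AdaBoost model:

```python
import numpy as np

def adaboost_stumps(X, y, rounds=5):
    """Minimal AdaBoost on 1-D data with labels y in {-1, +1}.

    Each round fits the best threshold stump under the current sample
    weights, then upweights the misclassified samples so the next
    round focuses on them. Returns the weighted-vote predictor."""
    n = len(X)
    w = np.full(n, 1.0 / n)
    ensemble = []                           # (alpha, threshold, polarity)
    for _ in range(rounds):
        best = None
        for thr in X:
            for pol in (1, -1):
                pred = pol * np.where(X > thr, 1, -1)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, thr, pol)
        err, thr, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = pol * np.where(X > thr, 1, -1)
        w *= np.exp(-alpha * y * pred)          # upweight mistakes
        w /= w.sum()
        ensemble.append((alpha, thr, pol))

    def predict(x):
        s = sum(a * p * np.where(x > t, 1, -1) for a, t, p in ensemble)
        return np.sign(s)
    return predict
```

In the paper's setting the stump is replaced by a BP neural network and the reweighting is adapted to regression error.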
29. Optimization of Texture Rendering of 3D Building Model Based on Vertex Importance.
- Author
- Shen, Wenfei, Huo, Liang, Shen, Tao, Zhang, Miao, and Li, Yucai
- Subjects
- TEXTURE mapping, DATA modeling, CURVATURE, ALGORITHMS
- Abstract
In 3D building models, a large number of texture maps of different sizes increases the number of model data loads and drawing batches, which greatly reduces the drawing efficiency of the model. Therefore, this paper proposes a texture set mapping method based on vertex importance. Firstly, based on the 2D space boxing algorithm, the texture maps are merged and a series of Mipmap texture maps is generated. Then the vertex curvature, texture variability, and location information of each vertex are calculated, normalized, and weighted to obtain the importance of each vertex. Finally, textures at different Mipmap levels are remapped according to the importance of the vertices. Experiments show that the algorithm reduces the amount of texture data on the one hand and, on the other, avoids the rendering pressure caused by the still-large amount of data after merging, thereby improving the rendering efficiency of the model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
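The importance computation described above, normalise each per-vertex quantity and take a weighted sum, then map importance to a Mipmap level, can be sketched directly. The weights and the level mapping below are illustrative assumptions, not the paper's calibrated values:

```python
import numpy as np

def vertex_importance(curvature, variability, location, weights=(0.4, 0.4, 0.2)):
    """Per-vertex importance as a weighted sum of min-max-normalised
    curvature, texture variability, and location scores."""
    def norm(v):
        v = np.asarray(v, dtype=float)
        span = v.max() - v.min()
        return (v - v.min()) / span if span else np.zeros_like(v)
    w1, w2, w3 = weights
    return w1 * norm(curvature) + w2 * norm(variability) + w3 * norm(location)

def mipmap_level(importance, max_level):
    """Map importance in [0, 1] to a Mipmap level: the most important
    vertices get level 0 (full-resolution texture)."""
    return np.round((1.0 - importance) * max_level).astype(int)
```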
30. SDP-Based Bounds for the Quadratic Cycle Cover Problem via Cutting-Plane Augmented Lagrangian Methods and Reinforcement Learning: INFORMS Journal on Computing Meritorious Paper Awardee.
- Author
- de Meijer, Frank and Sotirov, Renata
- Subjects
- REINFORCEMENT learning, COMBINATORIAL optimization, TRAVELING salesman problem, ALGORITHMS, SEMIDEFINITE programming, MACHINE learning, DIRECTED graphs
- Abstract
We study the quadratic cycle cover problem (QCCP), which aims to find a node-disjoint cycle cover in a directed graph with minimum interaction cost between successive arcs. We derive several semidefinite programming (SDP) relaxations and use facial reduction to make these strictly feasible. We investigate a nontrivial relationship between the transformation matrix used in the reduction and the structure of the graph, which is exploited in an efficient algorithm that constructs this matrix for any instance of the problem. To solve our relaxations, we propose an algorithm that incorporates an augmented Lagrangian method into a cutting-plane framework by utilizing Dykstra's projection algorithm. Our algorithm is suitable for solving SDP relaxations with a large number of cutting-planes. Computational results show that our SDP bounds and efficient cutting-plane algorithm outperform other QCCP bounding approaches from the literature. Finally, we provide several SDP-based upper bounding techniques, among which is a sequential Q-learning method that exploits a solution of our SDP relaxation within a reinforcement learning environment. Summary of Contribution: The quadratic cycle cover problem (QCCP) is the problem of finding a set of node-disjoint cycles covering all the nodes in a graph such that the total interaction cost between successive arcs is minimized. The QCCP has applications in many fields, among which are robotics, transportation, energy distribution networks, and automatic inspection. Besides this, the problem has a high theoretical relevance because of its close connection to the quadratic traveling salesman problem (QTSP). The QTSP has several applications, for example, in bioinformatics, and is considered to be among the most difficult combinatorial optimization problems nowadays. After removing the subtour elimination constraints, the QTSP boils down to the QCCP. 
Hence, an in-depth study of the QCCP also contributes to the construction of strong bounds for the QTSP. In this paper, we study the application of semidefinite programming (SDP) to obtain strong bounds for the QCCP. Our strongest SDP relaxation is very hard to solve by any SDP solver because of the large number of involved cutting-planes. Because of that, we propose a new approach in which an augmented Lagrangian method is incorporated into a cutting-plane framework by utilizing Dykstra's projection algorithm. We emphasize an efficient implementation of the method and perform an extensive computational study. This study shows that our method is able to handle a large number of cuts and that the resulting bounds are currently the best QCCP bounds in the literature. We also introduce several upper bounding techniques, among which is a distributed reinforcement learning algorithm that exploits our SDP relaxations. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
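Dykstra's projection algorithm, the building block this cutting-plane method relies on, can be sketched generically. This is a textbook illustration in two dimensions, not the paper's SDP-scale implementation; the two toy sets below are assumptions for demonstration.

```python
import numpy as np

def dykstra(x0, projections, n_iter=200):
    """Dykstra's algorithm: project x0 onto the intersection of convex sets,
    each given by its own (easy) projection operator. Unlike plain alternating
    projection, the correction terms make it converge to the true projection."""
    x = np.asarray(x0, dtype=float).copy()
    corrections = [np.zeros_like(x) for _ in projections]
    for _ in range(n_iter):
        for k, proj in enumerate(projections):
            y = proj(x + corrections[k])
            corrections[k] = x + corrections[k] - y   # remember what was cut off
            x = y
    return x

# Two toy sets: the half-space {v : v[0] >= 0.5} and the unit ball.
proj_half = lambda v: np.array([max(v[0], 0.5), v[1]])
proj_ball = lambda v: v / max(1.0, np.linalg.norm(v))
```

Projecting (2, 2) onto the intersection of these two sets lands on the boundary of the unit ball at roughly (0.707, 0.707).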
31. An FPGA Implementation of the Log-MAP Algorithm for a Dirty Paper Coding CODEC.
- Author
-
Lopes, Paulo A. C. and Gerald, José A. B.
- Subjects
- *
BIT rate , *VIDEO coding , *ALGORITHMS , *GATE array circuits , *CODECS , *DECODING algorithms - Abstract
This work describes the log-MAP (BCJR) algorithm implementation of a close-to-capacity dirty paper coding CODEC. The CODEC consists of eight deep-pipeline processors. It decodes blocks of 975 bits in 26.9 ms using less than 9.7% of a low-cost FPGA (and no DSP blocks). Two pipelines, for alpha and beta, calculate the gamma values (of the BCJR) on the fly to reduce storage requirements. The final log-likelihood ratio (LLR) is calculated together with alpha, reusing intermediate results. The number of bits used by the different signals of the processor is easily configurable; it was set to six bits for the channel measurement signals and eight bits for log-probability signals such as alpha and beta. The CODEC clock was 100 MHz. The achieved bit rate is 36.2 Kbps per CODEC, but multiple CODECs can fit into a single chip. The CODEC is 3.49 dB from the channel capacity. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
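The core operation that distinguishes log-MAP from the cheaper max-log-MAP approximation is the Jacobian logarithm (max-star). A minimal reference-model sketch, not the FPGA implementation itself:

```python
import math

def max_star(a, b):
    """Jacobian logarithm: ln(e^a + e^b) = max(a, b) + ln(1 + e^{-|a - b|}).
    The correction term ln(1 + e^{-|a-b|}) is exactly what log-MAP keeps
    and max-log-MAP drops."""
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))
```

Hardware implementations typically replace the `log1p` correction term with a small lookup table over the quantized difference |a - b|, which is consistent with the fixed-point widths (six and eight bits) the abstract reports.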
32. Comments on a Paper by Romesh Saigal: "A Constrained Shortest Route Problem"
- Author
-
Rosseel, Marc
- Published
- 1968
33. Comments on the Paper: 'Heuristic and Special Case Algorithms for Dispersion Problems' by S. S. Ravi, D. J. Rosenkrantz, and G. K. Tayi
- Author
-
Tamir, Arie
- Published
- 1998
34. Scientific papers and artificial intelligence. Brave new world?
- Author
-
Nexøe, Jørgen
- Subjects
COMPUTERS ,MANUSCRIPTS ,ARTIFICIAL intelligence ,MACHINE learning ,DATA analysis ,MEDICAL literature ,MEDICAL research ,ALGORITHMS - Published
- 2023
- Full Text
- View/download PDF
35. Using Hidden Markov Models for paper currency recognition
- Author
-
Hassanpour, Hamid and Farahabadi, Payam M.
- Subjects
- *
HIDDEN Markov models , *PAPER money , *ALGORITHMS , *PATTERN recognition systems , *FEATURE extraction , *MATERIALS texture - Abstract
Accurate characterization is an important issue in paper currency recognition systems. This paper proposes a robust paper currency recognition method based on the Hidden Markov Model (HMM). By employing an HMM, the texture characteristics of paper currencies are modeled as a random process. The proposed algorithm can be used to distinguish paper currencies from different countries. A similarity measure is used for classification in the proposed algorithm. To evaluate its performance, experiments were conducted on more than 100 denominations from different countries. The results indicate 98% accuracy in recognizing paper currency. [Copyright Elsevier]
- Published
- 2009
- Full Text
- View/download PDF
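The HMM likelihood computation underlying such a recognizer can be sketched with the scaled forward algorithm. This is a generic illustration under assumptions: it uses discrete (quantized) texture symbols and a plain maximum-likelihood decision, whereas the paper uses its own feature extraction and a similarity measure.

```python
import numpy as np

def hmm_loglik(obs, pi, A, B):
    """Scaled forward algorithm for a discrete HMM.
    pi: initial state probabilities, A: state transitions,
    B[state, symbol]: emission probabilities, obs: symbol indices."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    alpha = alpha / c
    ll = np.log(c)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        alpha = alpha / c
        ll += np.log(c)            # accumulate log of scaling factors
    return ll

def classify(obs, models):
    """Pick the denomination whose HMM assigns the observation sequence
    the highest log-likelihood."""
    return max(models, key=lambda name: hmm_loglik(obs, *models[name]))
```

With one trained HMM per denomination (the hypothetical `models` dict maps a name to its `(pi, A, B)` triple), a quantized texture sequence is assigned to the best-scoring model.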
36. Objective paper structure comparison: Assessing comparison algorithms
- Author
-
Berger, Charles E.H. and Ramos, Daniel
- Subjects
- *
PAPER analysis , *ALGORITHMS , *PROVENANCE trials , *COMPARATIVE studies , *DATABASES , *INVARIANTS (Mathematics) , *LIGHT transmission - Abstract
More than just being a substrate, paper can also provide evidence for the provenance of documents. An earlier paper described a method to compare paper structure, based on the Fourier power spectra of light transmission images. Good results were obtained by using the 2D correlation of images derived from the power spectra as a similarity score, but the method was very computationally intensive. Different comparison algorithms are evaluated in this paper, using information theoretical criteria. An angular invariant algorithm turned out to be as effective as the original one but 4 orders of magnitude faster, making the use of much larger databases possible. [Copyright Elsevier]
- Published
- 2012
- Full Text
- View/download PDF
37. Social and content aware One-Class recommendation of papers in scientific social networks.
- Author
-
Wang, Gang, He, XiRan, and Ishuga, Carolyne Isigi
- Subjects
INFORMATION technology ,SOCIAL networks ,SPARSE graphs ,HYBRID computers (Computer architecture) ,HYBRID power systems - Abstract
With the rapid development of information technology, scientific social networks (SSNs) have become the fastest and most convenient way for researchers to communicate with each other. Many published papers are shared via SSNs every day, resulting in information overload, so recommending personalized and highly valuable papers to researchers is becoming increasingly urgent. However, when recommending papers in SSNs, only a small number of positive instances are available, leaving a vast amount of unlabelled data in which negative instances and potential unseen positive instances are mixed together; this is naturally a One-Class Collaborative Filtering (OCCF) problem. Therefore, considering the extreme data imbalance and data sparsity of this OCCF problem, a hybrid approach of Social and Content aware One-class Recommendation of Papers in SSNs, termed SCORP, is proposed in this study. Unlike previous approaches to the OCCF problem, social information, which has been shown to play a significant role in recommendation across many domains, is applied both in the profiling of content-based filtering and in the collaborative filtering to achieve superior recommendations. To verify the effectiveness of the proposed SCORP approach, a real-life dataset from CiteULike was employed. The experimental results demonstrate that the proposed approach is superior to all compared approaches, thus providing a more effective method for recommending papers in SSNs. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
38. Utilizing tables, figures, charts and graphs to enhance the readability of a research paper.
- Author
-
Divecha, C. A., Tullu, M. S., and Karande, S.
- Subjects
GRAPHIC arts ,READABILITY (Literary style) ,SERIAL publications ,RESEARCH methodology ,COPYRIGHT ,MEDICAL research ,ALGORITHMS - Abstract
The authors offer observations on utilizing tables, figures, charts and graphs to present research simply while engaging and sustaining the reader's interest. Topics discussed include the benefits of using tables/figures/charts/graphs, general design and submission methodology, and copyright issues when using material from government publications or the public domain.
- Published
- 2023
- Full Text
- View/download PDF
39. Critical Appraisal of a Machine Learning Paper: A Guide for the Neurologist.
- Author
-
Vinny, Pulikottil W., Garg, Rahul, Srivastava, M. V. Padma, Lal, Vivek, and Vishnu, Venugoapalan Y.
- Subjects
- *
DEEP learning , *NEUROLOGISTS , *EVIDENCE-based medicine , *MACHINE learning , *BENCHMARKING (Management) , *TERMS & phrases , *ARTIFICIAL neural networks , *PREDICTION models , *ALGORITHMS - Abstract
Machine learning (ML), a form of artificial intelligence (AI), is being increasingly employed in neurology. Reported performance metrics often match or exceed the efficiency of average clinicians. The neurologist is easily baffled by the underlying concepts and terminologies associated with ML studies. The superlative performance metrics of ML algorithms often hide the opaque nature of its inner workings. Questions regarding ML model's interpretability and reproducibility of its results in real-world scenarios, need emphasis. Given an abundance of time and information, the expert clinician should be able to deliver comparable predictions to ML models, a useful benchmark while evaluating its performance. Predictive performance metrics of ML models should not be confused with causal inference between its input and output. ML and clinical gestalt should compete in a randomized controlled trial before they can complement each other for screening, triaging, providing second opinions and modifying treatment. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
40. ARTIFICIAL INTELLIGENCE IN THE DETECTION OF BID-RIGGING PRACTICES: THE PIONEERING ROLE OF THE ACCO.
- Author
-
Jiménez Cardona, Noemí
- Subjects
GOVERNMENT purchasing ,ARTIFICIAL intelligence ,ANTITRUST law ,SOFTWARE development tools ,CARTELS - Abstract
Copyright of Revista Catalana de Dret Públic is the property of Revista Catalana de Dret Public and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2022
- Full Text
- View/download PDF
41. CAI versus Paper and Pencil — discrepancies in students' performance
- Author
-
HATIVA, NIRA
- Published
- 1988
42. Variations on a Theme in Paper Folding.
- Author
-
Polster, Burkard
- Subjects
- *
PAPER folding (Graphic design) , *APPROXIMATION theory , *ANGLES , *ALGORITHMS , *POLYGONS , *MATHEMATICS - Abstract
Summarizes the construction of paper folding. Method for approximating rational subdivisions or arbitrary angles and line segments; Angle-folding algorithm; Approximating angles, regular polygons and star polygons; Dissection of angles into equal parts.
- Published
- 2004
- Full Text
- View/download PDF
43. Research on image processing algorithm of immune colloidal gold test paper detection.
- Author
-
Guang Yang, Tiefeng Wang, and Peng Zhang
- Subjects
- *
COLLOIDAL gold , *QUALITY control , *AUTOMATIC identification , *ALGORITHMS - Abstract
To better solve the problem of automatically identifying the quality control line and the detection line on colloidal gold test strips, this paper proposes collecting the image of the color-rendered strip with a CMOS sensor, preprocessing the acquired data, converting the RGB image to grayscale, building a cloud model in the CIELAB/HSV/HSL space, and applying an improved AdaBoost algorithm to determine the positions of the detection line and the quality control line. Compared with the traditional template matching method, this improves recognition accuracy. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
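The front half of such a pipeline, grayscale conversion followed by locating the two dark line bands, can be sketched simply. This is an assumed baseline using intensity profiles, not the paper's cloud-model/AdaBoost detector; the band count and minimum-gap parameter are illustrative.

```python
import numpy as np

def find_line_rows(rgb, n_lines=2, min_gap=10):
    """Locate the control and test lines as the darkest row bands of a strip
    image (rows x cols x 3)."""
    gray = rgb @ np.array([0.299, 0.587, 0.114])    # BT.601 grayscale conversion
    profile = gray.mean(axis=1)                     # mean brightness per row
    rows = []
    for r in np.argsort(profile, kind="stable"):    # darkest rows first
        if all(abs(r - q) >= min_gap for q in rows):
            rows.append(int(r))                     # keep bands far enough apart
        if len(rows) == n_lines:
            break
    return sorted(rows)
```

On a synthetic white strip with two dark bands, the function recovers the band positions; a real detector must also cope with uneven illumination and faint lines, which is where the paper's learned approach comes in.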
44. Papers Published in Technical Journals and Conference Proceedings.
- Subjects
- *
CONFERENCE proceedings (Publications) , *PERIODICALS , *TECHNOLOGY , *BLIND source separation , *ALGORITHMS - Published
- 2024
45. Digital marginalization, data marginalization, and algorithmic exclusions: a critical southern decolonial approach to datafication, algorithms, and digital citizenship from the Souths.
- Author
-
Chaka, Chaka
- Subjects
CITIZENSHIP ,DECOLONIZATION ,ELECTRONIC paper ,ALGORITHMS ,COMMUNITIES ,CHIEF information officers - Abstract
This paper explores digital marginalization, data marginalization, and algorithmic exclusions in the Souths. To this effect, it argues that underrepresented users and communities continue to be marginalized and excluded by digital technologies, by big data, and by algorithms employed by organizations, corporations, institutions, and governments in various data jurisdictions. Situating data colonialism within the Souths, the paper contends that data ableism, data disablism, and data colonialism are at play when data collected, collated, captured, configured, and processed from underrepresented users and communities is utilized by mega entities for their own multiple purposes. It also maintains that data coloniality, as opposed to data colonialism, is impervious to legal and legislative interventions within data jurisdictions. Additionally, it discusses digital citizenship (DC) and its related emerging regimes. Moreover, the paper argues that digital exclusion transcends the simplistic haves versus the have nots dualism as it manifests itself in multiple layers and in multiple dimensions. Furthermore, it characterizes how algorithmic exclusions tend to perpetuate historical human biases despite the pervasive view that algorithms are autonomous, neutral, rational, objective, fair, unbiased, and non-human. Finally, the paper advances a critical southern decolonial (CSD) approach to datafication, algorithms, and digital citizenship by means of which data coloniality, algorithmic coloniality, and the coloniality embodied in DC have to be critiqued, challenged, and dismantled. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
46. Quantifying the impact of scholarly papers based on higher-order weighted citations.
- Author
-
Bai, Xiaomei, Zhang, Fuli, Hou, Jie, Lee, Ivan, Kong, Xiangjie, Tolba, Amr, and Xia, Feng
- Subjects
CITATION analysis ,SCHOLARLY publishing ,BIBLIOMETRICS ,SIMULATION methods & models ,ALGORITHMS - Abstract
Quantifying the impact of a scholarly paper is of great significance, yet the effect of geographical distance between cited papers has not been explored. In this paper, we examine 30,596 papers published in Physical Review C and identify the relationship between citations and the geographical distances between author affiliations. Subsequently, a relative citation weight is applied to assess the impact of a scholarly paper. A higher-order weighted quantum PageRank algorithm is also developed to address the behavior of multiple-step citation flow. Capturing citation dynamics with higher-order dependencies reveals the actual impact of papers, including necessary self-citations that are sometimes excluded in prior studies. Quantum PageRank is utilized in this paper to help differentiate nodes whose PageRank values are otherwise identical. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
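The classical weighted PageRank that this paper builds on can be sketched with power iteration. This is a baseline illustration under assumptions, not the paper's higher-order quantum variant: citation weights enter through a column-normalized matrix, and papers that cite nothing simply leak mass, which is repaired by a final renormalization.

```python
import numpy as np

def weighted_pagerank(W, d=0.85, tol=1e-12):
    """Power iteration with per-citation weights.
    W[i, j] >= 0 is the weight of the citation from paper j to paper i."""
    n = W.shape[0]
    out = W.sum(axis=0)
    out[out == 0] = 1.0              # papers citing nothing: avoid 0/0;
    M = W / out                      # their mass leaks and is renormalized below
    r = np.full(n, 1.0 / n)
    while True:
        r_next = d * (M @ r) + (1.0 - d) / n
        if np.abs(r_next - r).sum() < tol:
            return r_next / r_next.sum()
        r = r_next
```

In a toy three-paper network where paper 0 receives the heaviest-weighted citations, paper 0 ends up with the largest rank.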
47. Using Paper Texture for Choosing a Suitable Algorithm for Scanned Document Image Binarization.
- Author
-
Lins, Rafael Dueire, Bernardino, Rodrigo, Barboza, Ricardo da Silva, and De Oliveira, Raimundo Correa
- Subjects
DOCUMENT imaging systems ,HISTORICAL source material ,TEXTURES ,ALGORITHMS - Abstract
The intrinsic features of documents, such as paper color, texture, aging, translucency, the kind of printing, typing or handwriting, etc., are important with regard to how to process and enhance their image. Image binarization is the process of producing a monochromatic image having its color version as input. It is a key step in the document processing pipeline. The recent Quality-Time Binarization Competitions for documents have shown that no binarization algorithm is good for any kind of document image. This paper uses a sample of the texture of the scanned historical documents as the main document feature to select which of the 63 widely used algorithms, using five different versions of the input images, totaling 315 document image-binarization schemes, provides a reasonable quality-time trade-off. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
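To make the binarization step concrete: one classical global algorithm of the kind that appears in pools like the 63 evaluated here is Otsu's method. A generic sketch, not the paper's texture-based selection procedure:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's global threshold: pick the gray level that maximizes the
    between-class variance of the image histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                        # probability mass of dark class
    mu = np.cumsum(p * np.arange(256))          # cumulative mean
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b[~np.isfinite(sigma_b)] = 0.0        # empty classes contribute nothing
    return int(np.argmax(sigma_b))

def binarize(gray):
    """Map pixels above the Otsu threshold to 1, the rest to 0."""
    return (gray > otsu_threshold(gray)).astype(np.uint8)
```

The point of the paper is precisely that no single such algorithm wins on every document, so a texture sample of the scan is used to choose among many candidates; Otsu here is just one member of that candidate pool.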
48. ITERATIVE ALGORITHMS FOR VARIATIONAL INCLUSIONS IN BANACH SPACES.
- Author
-
ANSARI, QAMRUL HASAN, BALOOEE, JAVAD, and PETRUŞEL, ADRIAN
- Subjects
BANACH spaces ,LIPSCHITZ continuity ,PAPER arts ,DIFFERENTIAL inclusions ,ALGORITHMS - Abstract
The present paper has two parts. In the first part, we prove the Lipschitz continuity of the proximal mapping associated with a general strongly H-monotone mapping and compute an estimate of its Lipschitz constant under some mild assumptions imposed on the mapping H involved in the proximal mapping. We provide two examples to show that a maximal monotone mapping need not be a general H-monotone mapping for a single-valued mapping H from a Banach space to its dual space. A class of multi-valued nonlinear variational inclusion problems is considered, and by using the notion of the proximal mapping and Nadler's technique, an iterative algorithm with mixed errors is suggested to compute its solutions. Under some appropriate hypotheses imposed on the mappings and parameters involved in the multi-valued nonlinear variational inclusion problem, the strong convergence of the sequences generated by the proposed algorithm to a solution of the aforesaid problem is verified. The second part of this paper investigates and analyzes the notion of Cn-monotone mappings defined and studied in [S.Z. Nazemi, A new class of monotone mappings and a new class of variational inclusions in Banach spaces, J. Optim. Theory Appl. 155(3)(2012) 785-795]. Several comments on the results and the algorithm that appeared in the above-mentioned paper are given. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
49. Canadian Association of Radiologists White Paper on De-identification of Medical Imaging: Part 2, Practical Considerations.
- Author
-
Parker, William, Jaremko, Jacob L., Cicero, Mark, Azar, Marleine, El-Emam, Khaled, Gray, Bruce G., Hurrell, Casey, Lavoie-Cardinal, Flavie, Desjardins, Benoit, Lum, Andrea, Sheremeta, Lori, Lee, Emil, Reinhold, Caroline, Tang, An, and Bromwich, Rebecca
- Subjects
- *
ALGORITHMS , *ARTIFICIAL intelligence , *DATA encryption , *DATABASE management , *DIAGNOSTIC imaging , *HEALTH services accessibility , *MACHINE learning , *MEDICAL protocols , *DICOM (Computer network protocol) , *COVID-19 pandemic - Abstract
The application of big data, radiomics, machine learning, and artificial intelligence (AI) algorithms in radiology requires access to large data sets containing personal health information. Because machine learning projects often require collaboration between different sites or data transfer to a third party, precautions are required to safeguard patient privacy. Safety measures are required to prevent inadvertent access to and transfer of identifiable information. The Canadian Association of Radiologists (CAR) is the national voice of radiology committed to promoting the highest standards in patient-centered imaging, lifelong learning, and research. The CAR has created an AI Ethical and Legal standing committee with the mandate to guide the medical imaging community in terms of best practices in data management, access to health care data, de-identification, and accountability practices. Part 2 of this article will inform CAR members on the practical aspects of medical imaging de-identification, strengths and limitations of de-identification approaches, list of de-identification software and tools available, and perspectives on future directions. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
50. Physics driven behavioural clustering of free-falling paper shapes.
- Author
-
Howison, Toby, Hughes, Josie, Giardina, Fabio, and Iida, Fumiya
- Subjects
- *
PHYSICS , *SET functions , *MACHINE learning , *PHENOMENOLOGICAL theory (Physics) , *CONTINUUM mechanics - Abstract
Many complex physical systems exhibit a rich variety of discrete behavioural modes. Often, the system complexity limits the applicability of standard modelling tools. Hence, understanding the underlying physics of different behaviours and distinguishing between them is challenging. Although traditional machine learning techniques could predict and classify behaviour well, typically they do not provide any meaningful insight into the underlying physics of the system. In this paper we present a novel method for extracting physically meaningful clusters of discrete behaviour from limited experimental observations. This method obtains a set of physically plausible functions that both facilitate behavioural clustering and aid in system understanding. We demonstrate the approach on the V-shaped falling paper system, a new falling paper type system that exhibits four distinct behavioural modes depending on a few morphological parameters. Using just 49 experimental observations, the method discovered a set of candidate functions that distinguish behaviours with an error of 2.04%, while also aiding insight into the physical phenomena driving each behaviour. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF