17 results
Search Results
2. An Operations Research-Based Teaching Unit for Grade 11: The ROAR Experience, Part II
- Author
-
Colajanni, Gabriella, Gobbi, Alessandro, Picchi, Marinella, Raffaele, Alice, and Taranto, Eugenia
- Abstract
In this paper, we continue describing the project and the experimentation of "Ricerca Operativa Applicazioni Reali" (ROAR; in English, Real Applications of Operations Research), a three-year project for higher secondary schools introduced in a previous paper. ROAR is composed of three teaching units, addressed to Grades 10, 11, and 12, respectively, whose main aim is to improve students' interest, motivation, and skills related to Science, Technology, Engineering, and Mathematics disciplines by integrating mathematics and computer science through operations research. In that previous paper, we reported on the design and implementation of the first unit, which started in Spring 2021 at the scientific high school IIS Antonietti in Iseo (Brescia, Italy), in a Grade-10 class. Here, we focus on the second unit, carried out in Winter/Spring 2022 with the same students, now in a Grade-11 class. In particular, we describe objectives, prerequisites, topics and methods, the organization of the lectures, the digital technologies used, and a challenging final project. Moreover, we analyze the feedback from students and teachers involved in the experimentation.
- Published
- 2024
- Full Text
- View/download PDF
3. Representation of Learning in the Post-Digital: Students' Dropout Predictive Models with Artificial Intelligence Algorithms
- Author
-
Zanellati, Andrea, Macauda, Anita, Panciroli, Chiara, and Gabbrielli, Maurizio
- Abstract
Within the scientific debate on the post-digital and education, we present a position paper describing a research project aimed at the design of a predictive model for students' low achievement in mathematics in Italy. The model is based on the INVALSI data set, an Italian large-scale assessment test, and we use decision trees as the classification algorithm. In designing this tool, we aim to overcome the use of economic, social, and cultural context indices as the main factors for predicting the occurrence of a learning gap. Indeed, we want to include a suitable representation of students' learning in the model, by exploiting the data collected through the INVALSI tests. We resort to a knowledge-based approach to address this issue; specifically, we try to understand what knowledge is introduced into the model through the representation of learning. In this sense, our proposal allows an encoding of students' learning that is transferable to different student cohorts. Furthermore, the encoding methods may be applied to other large-scale assessment tests. Hence, we aim to contribute to the debate on knowledge representation in AI tools for education.
- Published
- 2023
- Full Text
- View/download PDF
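The record above describes using decision trees over INVALSI data to classify students at risk of low achievement. As a purely illustrative sketch (the score feature, labels, and thresholds below are invented, not the authors' model or data), the core step of a decision-tree learner is choosing the split that minimizes Gini impurity:

```python
# Hypothetical sketch: how a decision-tree learner picks a split by Gini
# impurity, the core step behind classifiers like the one described above.

def gini(labels):
    """Gini impurity of a list of 0/1 class labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 1.0 - p * p - (1.0 - p) ** 2

def best_split(xs, ys):
    """Return (threshold, weighted impurity) of the best split on one feature."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        n = len(ys)
        score = len(left) / n * gini(left) + len(right) / n * gini(right)
        if score < best[1]:
            best = (t, score)
    return best

# Toy data: a single score feature vs. a binary "low achievement" label.
scores = [35, 42, 48, 55, 61, 70, 78, 85]
low    = [ 1,  1,  1,  1,  0,  0,  0,  0]
threshold, impurity = best_split(scores, low)
print(threshold, impurity)  # → 55 0.0 (a clean split between 55 and 61)
```

A full tree recurses this split search on each side until leaves are pure or a depth limit is hit.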
4. An Operations Research-Based Teaching Unit for Grade 10: The ROAR Experience, Part I
- Author
-
Colajanni, Gabriella, Gobbi, Alessandro, Picchi, Marinella, Raffaele, Alice, and Taranto, Eugenia
- Abstract
We introduce "Ricerca Operativa Applicazioni Reali" (ROAR; in English, "Real Applications of Operations Research"), a three-year project for higher secondary schools. Its main aim is to improve students' interest, motivation, and skills related to Science, Technology, Engineering, and Mathematics disciplines by integrating mathematics and computer science through operations research. ROAR offers examples and problems closely connected with students' everyday life or with the industrial reality, balancing mathematical modeling and algorithmics. The project is composed of three teaching units, addressed to grades 10, 11, and 12. The implementation of the first teaching unit took place in Spring 2021 at the scientific high school IIS Antonietti in Iseo (Brescia, Italy). In particular, in this paper, we provide a full description of this first teaching unit in terms of objectives, prerequisites, topics and methods, organization of the lectures, and digital technologies used. Moreover, we analyze the feedback received from students and teachers involved in the experimentation, and we discuss advantages and disadvantages related to distance learning that we had to adopt because of the COVID-19 pandemic.
- Published
- 2023
- Full Text
- View/download PDF
5. Prediction of Deformation Caused by Landslides Based on Graph Convolution Networks Algorithm and DInSAR Technique.
- Author
-
Khalili, M. A., Guerriero, L., Pouralizadeh, M., Calcaterra, D., and Di Martire, D.
- Subjects
LANDSLIDES, MACHINE learning, SYNTHETIC aperture radar, ALGORITHMS, DEEP learning, K-nearest neighbor classification
- Abstract
Around the world, the occurrence of landslides has become one of the greatest threats to human life, property, infrastructure, and natural environments. Despite extensive research and discussions on the spatiotemporal dependence of landslide displacements, there is still a lack of understanding concerning the factors that appear to control displacement distribution in landslides, because of their significant variations. This paper implements a Graph Convolutional Network (GCN) to predict displacement following the Moio della Civitella landslide in southern Italy and to identify factors that may affect the distribution of movement following the landslide. An interferometric technique, known as permanent scatterer interferometry (PSI), has been developed based on Synthetic Aperture Radar (SAR) satellite imagery to derive permanent scatterer (PS) points that can be used to represent the deformation of landslides. This study utilized a GCN regression model applied to PS points and data reflecting geological and geomorphological factors to extract the interdependency between paired data points, resulting in an adjacency matrix with values in the interval [0, 0.8). The proposed model outperforms conventional machine learning and deep learning algorithms such as linear regression (LR), K-nearest neighbors (KNN), support vector regression (SVR), decision tree, lasso, and artificial neural network (ANN). The absolute error between the actual and predicted deformation, used to evaluate the proposed model, is less than 2 millimeters for most test set points. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
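The abstract above mentions deriving an adjacency matrix with values in the interval [0, 0.8) from the interdependency between paired data points. As a hypothetical, stdlib-only sketch of how such a weighted adjacency matrix might be assembled from pairwise similarity (the coordinates, Gaussian length scale, and 0.8 cap below are invented for illustration; this is not the study's actual construction):

```python
import math

# Illustrative sketch only: one way to build the kind of weighted adjacency
# matrix a GCN consumes, from pairwise similarity between monitoring points.

def adjacency(points, length_scale=1.0, cap=0.8):
    """Gaussian similarity of point pairs, zero diagonal, values in [0, cap)."""
    n = len(points)
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d2 = sum((a - b) ** 2 for a, b in zip(points[i], points[j]))
            w = math.exp(-d2 / (2 * length_scale ** 2))
            A[i][j] = min(w, cap - 1e-9)  # keep weights strictly below the cap
    return A

pts = [(0.0, 0.0), (0.5, 0.0), (5.0, 5.0)]
A = adjacency(pts)
print(A[0][1], A[0][2])  # near points get a high weight, far points near zero
```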
6. Primary Care of the (Near) Future: Exploring the Contribution of Digitalization and Remote Care Technologies through a Case Study.
- Author
-
Pennestrì, Federico and Banfi, Giuseppe
- Subjects
MEDICAL consultation, HEALTH services administration, HEALTH services accessibility, DIGITAL technology, PUBLIC administration, PRIMARY health care, NATIONAL health services, MEDICAL care research, FINANCIAL management, TELEMEDICINE, LONG-term health care, ALGORITHMS
- Abstract
The Italian Government planned to invest €15 billion of European funds in National Health Service digitalization and primary care enhancement. The critical burden brought by the pandemic upon hospital care means these investments could no longer be delayed, considering the extraordinary backlogs of many treatments and the ordinary gaps of fragmented long-term care, in Italy and abroad. National guidelines have been published to standardize interventions across the Italian regions, and telemedicine is frequently mentioned as a key innovation to achieve both goals. The professional resources needed to run the facilities introduced in primary care are defined with great precision, but no details are given on how digitalization and remote care technologies must be implemented in this context. Building on this policy case, this paper focuses on what contribution digitalization and telemedicine can offer to specific primary care innovations, drawing from implemented technology-driven policies which may support the effective stratification, prevention and management of chronic patient needs, including anticipatory healthcare, population health management, adjusted clinical groups, chronic care management, quality and outcomes frameworks, patient-reported outcomes and patient-reported experience. All these policies can benefit significantly from digitalization and remote care technology, provided that some risks and limitations are considered by design.
- Published
- 2023
- Full Text
- View/download PDF
7. Smart Design of Hip Replacement Prostheses Using Additive Manufacturing and Machine Learning Techniques.
- Author
-
Milone, Dario, D'Andrea, Danilo, and Santonocito, Dario
- Subjects
COMPUTER simulation, TOTAL hip replacement, RESEARCH evaluation, HIP joint, GAIT in humans, MACHINE learning, REGRESSION analysis, ARTIFICIAL joints, COMPARATIVE studies, SURVEYS, PROSTHESIS design & construction, THREE-dimensional printing, ALGORITHMS, KINEMATICS
- Abstract
The field of additive manufacturing, particularly 3D printing, has ushered in a significant transformation in the realm of joint arthritis treatment through prosthetic surgery. This innovative technology allows for the creation of bespoke prosthetic devices that are tailored to meet the specific needs of individual patients. These devices are constructed using high-performance materials, including titanium and cobalt-chrome alloys. Nevertheless, the routine physical activities of patients, such as walking, sitting, and running, can induce wear and tear on the materials comprising these prosthetic devices, subsequently diminishing their functionality and durability. In response to this challenge, this research has endeavored to leverage novel techniques. The primary focus of this study lies in the development of an algorithm designed to optimize hip replacement procedures via the mechanical design of the prosthesis. This optimization process exploits the capabilities of machine learning algorithms, multi-body dynamics, and finite element method (FEM) simulations. The paramount innovation in this methodology is the capacity to design a prosthetic system that intricately adapts to the distinctive characteristics of each patient (weight, height, gait cycle). The primary objective of this research is to enhance the performance and longevity of prosthetic devices by improving their fatigue strength. The evaluation of load distribution on the prosthetic device, facilitated by FEM simulations, anticipates a substantial augmentation in the useful life of the prosthetic system. This research holds promise as a notable advancement in prosthetic technology, offering a more efficacious treatment option for patients suffering from joint arthritis. The aim of this research is to make meaningful contributions to the enhancement of patient quality of life and the long-term performance of prosthetic devices.
- Published
- 2024
- Full Text
- View/download PDF
8. Assessing the performance of the Gaussian Process Regression algorithm to fill gaps in the time-series of daily actual evapotranspiration of different crops in temperate and continental zones using ground and remotely sensed data.
- Author
-
De Caro, Dario, Ippolito, Matteo, Cannarozzo, Marcella, Provenzano, Giuseppe, and Ciraolo, Giuseppe
- Subjects
KRIGING, EVAPOTRANSPIRATION, STANDARD deviations, MACHINE learning, ALGORITHMS
- Abstract
The knowledge of crop evapotranspiration is crucial for several hydrological processes, including those related to the management of agricultural water sources. In particular, the estimation of actual evapotranspiration fluxes within fields is essential to managing irrigation strategies to save water and preserve water resources. Among the indirect methods to estimate actual evapotranspiration, ETa, the eddy covariance (EC) method allows continuous measurement of latent heat flux (LE). However, the time series of EC measurements are sometimes characterized by a lack of data due to sensor malfunctions. To this aim, Machine Learning (ML) techniques could represent a powerful tool to fill possible gaps in the time series. In this paper, an ML technique based on the Gaussian Process Regression (GPR) algorithm was applied to fill gaps in daily actual evapotranspiration. The technique was tested in six different plots, two in Italy, three in the United States of America, and one in Canada, with different crops and climatic conditions, in order to assess the suitability of the ML model in various contexts. The available climate variables were not the same for each site; therefore, the performance of the method was investigated on the basis of the available information. Initially, comparisons between ground and reanalysis data, and between two different satellite products, were conducted where both databases were available. Then, the GPR model was tested. The mean and covariance functions were set by considering a database of climate variables, soil water status measurements, and remotely sensed vegetation indices. Five different combinations of variables were then analyzed to verify the suitability of the ML approach when limited input data are available or when the weather variables are replaced with reanalysis data. Cross-validation was used to assess the performance of the procedure.
The model performances were assessed based on statistical indicators: Root Mean Square Error (RMSE), coefficient of determination (R2), Mean Absolute Error (MAE), regression coefficient (b), and Nash-Sutcliffe efficiency coefficient (NSE). The quite high NSE values and low RMSE values confirm the suitability of the proposed algorithm.
• GPR algorithm is suitable to fill gaps in daily ETa time series.
• The best m(x) and k(x,x') functions required by the GPR algorithm were identified.
• The best results were obtained when the dataset included climate data, SWC and VIs.
• The use of ERA5-L data and VIs retrieved by Sentinel 2 or MODIS is a good alternative.
• GPR algorithm was tested for different crops in continental and temperate zones.
- Published
- 2023
- Full Text
- View/download PDF
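The record above fills gaps in a daily evapotranspiration series with Gaussian Process Regression. A minimal, self-contained sketch of the idea (pure-Python linear algebra; the RBF kernel, length scale, noise level, and toy series below are invented, not the study's tuned mean/covariance functions):

```python
import math

# Minimal GPR sketch for gap filling: posterior mean m(x*) = k*' (K+s^2 I)^-1 y.

def rbf(a, b, ell=2.0):
    return math.exp(-((a - b) ** 2) / (2 * ell ** 2))

def solve(A, b):
    """Gaussian elimination with partial pivoting; returns x with A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, x_star, noise=1e-6):
    """GP posterior mean at x_star given training pairs (xs, ys)."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, ys)
    return sum(rbf(x_star, xs[i]) * alpha[i] for i in range(n))

# Daily ET-like series (mm/day) with a missing value on day 3.
days = [0, 1, 2, 4, 5, 6]
et   = [3.1, 3.4, 3.8, 4.1, 3.9, 3.6]
print(round(gp_predict(days, et, 3), 2))  # a smooth interpolation near 4.0
```

In practice the study conditions on climate variables, soil water content, and vegetation indices rather than on time alone; the mechanics of the posterior mean are the same.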
9. An AI-based algorithm for the automatic evaluation of image quality in canine thoracic radiographs.
- Author
-
Banzato, Tommaso, Wodzinski, Marek, Burti, Silvia, Vettore, Eleonora, Muller, Henning, and Zotti, Alessandro
- Subjects
ARTIFICIAL intelligence, RADIOGRAPHS, ALGORITHMS, FOREIGN bodies, MEDICAL equipment, DATABASES
- Abstract
The aim of this study was to develop and test an artificial intelligence (AI)-based algorithm for detecting common technical errors in canine thoracic radiography. The algorithm was trained using a database of thoracic radiographs from three veterinary clinics in Italy, which were evaluated for image quality by three experienced veterinary diagnostic imagers. The algorithm was designed to classify the images as correct or as having one or more of the following errors: rotation, underexposure, overexposure, incorrect limb positioning, incorrect neck positioning, blurriness, cut-off, or the presence of foreign objects or medical devices. The algorithm was able to correctly identify errors in thoracic radiographs with an overall accuracy of 81.5% in latero-lateral and 75.7% in sagittal images. The most accurately identified errors were limb mispositioning and underexposure, both in latero-lateral and sagittal images. The accuracy of the developed model in the classification of technically correct radiographs was fair in latero-lateral and good in sagittal images. The authors conclude that their AI-based algorithm is a promising tool for improving the accuracy of radiographic interpretation by identifying technical errors in canine thoracic radiographs.
- Published
- 2023
- Full Text
- View/download PDF
10. A Machine Learning Approach for Predicting Capsular Contracture after Postmastectomy Radiotherapy in Breast Cancer Patients.
- Author
-
Bavaro, Domenica Antonia, Fanizzi, Annarita, Iacovelli, Serena, Bove, Samantha, Comes, Maria Colomba, Cristofaro, Cristian, Cutrignelli, Daniela, De Santis, Valerio, Nardone, Annalisa, Lagattolla, Fulvia, Rizzo, Alessandro, Ressa, Cosmo Maurizio, and Massafra, Raffaella
- Subjects
SURGICAL complication risk factors, SUPPORT vector machines, DECISION trees, CONTRACTURE (Pathology), LYMPHADENECTOMY, EPIDERMAL growth factor receptors, MACHINE learning, MAMMAPLASTY, RANDOM forest algorithms, CANCER patients, RISK assessment, CYTOCHEMISTRY, ESTROGEN receptors, BREAST implants, RADIATION doses, CELL proliferation, RESEARCH funding, MASTECTOMY, PREDICTION models, SENSITIVITY & specificity (Statistics), RADIOTHERAPY, TUMOR markers, MENOPAUSE, COMBINED modality therapy, BREAST tumors, PROGESTERONE receptors, ALGORITHMS, TUMOR grading, SYMPTOMS, BLOOD
- Abstract
In recent years, immediate breast reconstruction after mastectomy surgery has steadily increased in the treatment pathway of breast cancer (BC) patients due to its potential impact on both the morpho-functional and aesthetic type of the breast and the quality of life. Although recent studies have demonstrated how recent radiotherapy techniques have allowed a reduction of adverse events related to breast reconstruction, capsular contracture (CC) remains the main complication after post-mastectomy radiotherapy (PMRT). In this study, we evaluated the association of the occurrence of CC with some clinical, histological and therapeutic parameters related to BC patients. We first performed bivariate statistical tests and then evaluated the prognostic predictive power of the collected data by using machine learning techniques. Out of a sample of 59 patients referred to our institute, 28 patients (i.e., 47%) showed contracture after PMRT. As a result, only estrogen receptor status (ER) and molecular subtypes were significantly associated with the occurrence of CC after PMRT. Different machine learning models were trained on a subset of clinical features selected by a feature importance approach. Experimental results have shown that the collected features have a non-negligible predictive power. The extreme gradient boosting classifier achieved an area under the curve (AUC) value of 68% and accuracy, sensitivity, and specificity values of 68%, 64%, and 74%, respectively. Such a support tool, after further suitable optimization and validation, would allow clinicians to identify the best therapeutic strategy and reconstructive timing.
- Published
- 2023
- Full Text
- View/download PDF
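The record above reports AUC, accuracy, sensitivity, and specificity for its gradient-boosting classifier. A hedged sketch of how those metrics are computed from a classifier's outputs (the labels and scores below are invented; this is not the study's model or data):

```python
# How sensitivity, specificity, and AUC are derived from predictions.

def sens_spec(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

def auc(y_true, scores):
    """Probability a random positive outscores a random negative (ties = 1/2)."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

y      = [1, 1, 1, 1, 0, 0, 0, 0]          # 1 = contracture occurred (toy)
scores = [0.9, 0.8, 0.7, 0.3, 0.2, 0.1, 0.4, 0.6]
y_hat  = [1 if s >= 0.5 else 0 for s in scores]
print(sens_spec(y, y_hat), auc(y, scores))  # → (0.75, 0.75) 0.875
```

Note that sensitivity and specificity depend on the 0.5 threshold chosen here, while AUC summarizes performance over all thresholds.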
11. A Method of Estimating Time-to-Recovery for a Disease Caused by a Contagious Pathogen Such as SARS-CoV-2 Using a Time Series of Aggregated Case Reports.
- Author
-
Koutsouris, Dimitrios-Dionysios, Pitoglou, Stavros, Anastasiou, Athanasios, and Koumpouros, Yiannis
- Subjects
DISEASE progression, COMPUTER software, COVID-19, CONFIDENCE intervals, TIME, CONVALESCENCE, WORLD health, EPIDEMICS, TIME series analysis, DESCRIPTIVE statistics, SENSITIVITY & specificity (Statistics), PREDICTION models, COVID-19 pandemic, ALGORITHMS
- Abstract
During the outbreak of a disease caused by a pathogen with unknown characteristics, the uncertainty of its progression parameters can be reduced by devising methods that, based on rational assumptions, exploit available information to provide actionable insights. In this study, performed a few (~6) weeks into the outbreak of COVID-19 (caused by SARS-CoV-2), one of the most important disease parameters, the average time-to-recovery, was calculated using data publicly available on the internet (daily reported cases of confirmed infections, deaths, and recoveries), which were fed into an algorithm that matches confirmed cases with deaths and recoveries. Unmatched cases were adjusted based on the matched cases calculation. The mean time-to-recovery, calculated from all globally reported cases, was found to be 18.01 days (SD 3.31 days) for the matched cases and 18.29 days (SD 2.73 days) when taking the adjusted unmatched cases into consideration as well. The proposed method used limited data and provided experimental results in the same region as clinical studies published several months later. This indicates that the proposed method, combined with expert knowledge and informed calculated assumptions, could provide a meaningful calculated average time-to-recovery figure, which can be used as an evidence-based estimation to support containment and mitigation policy decisions, even at the very early stages of an outbreak.
- Published
- 2023
- Full Text
- View/download PDF
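The record above matches confirmed cases against reported recoveries to estimate average time-to-recovery from aggregated counts. An illustrative sketch of that matching idea (the daily counts are invented, and the first-in-first-out matching rule is an assumption for illustration, not necessarily the paper's exact algorithm):

```python
from collections import deque

# Confirmed cases enter a FIFO queue and are matched, in report order,
# against daily recoveries; the day difference is the time-to-recovery.

def mean_time_to_recovery(daily_cases, daily_recoveries):
    queue = deque()          # report day of each still-unresolved case
    durations = []
    for day, (cases, recs) in enumerate(zip(daily_cases, daily_recoveries)):
        for _ in range(cases):
            queue.append(day)
        for _ in range(recs):
            if queue:
                durations.append(day - queue.popleft())
    return sum(durations) / len(durations) if durations else None

cases      = [2, 1, 0, 0, 0, 0, 0]   # toy daily confirmed-case counts
recoveries = [0, 0, 0, 0, 2, 1, 0]   # toy daily recovery counts
print(mean_time_to_recovery(cases, recoveries))  # → 4.0
```

A real implementation would also match deaths and adjust the cases that remain unmatched at the end of the series, as the abstract describes.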
12. Online Questionnaire with Fibromyalgia Patients Reveals Correlations among Type of Pain, Psychological Alterations, and Effectiveness of Non-Pharmacological Therapies.
- Author
-
Demori, Ilaria, Molinari, Elena, Rapallo, Fabio, Mucci, Viviana, Marinelli, Lucio, Losacco, Serena, and Burlando, Bruno
- Subjects
ALTERNATIVE medicine -- Evaluation, WELL-being, RESEARCH, RELIABILITY (Personality trait), PAIN, RESEARCH evaluation, MULTIVARIATE analysis, DIET, MANN Whitney U Test, FIBROMYALGIA, TREATMENT effectiveness, CRONBACH'S alpha, QUESTIONNAIRES, RESEARCH funding, DESCRIPTIVE statistics, SCALE analysis (Psychology), CHI-squared test, STATISTICAL correlation, ANXIETY, RELAXATION techniques, DATA analysis software, SOCIODEMOGRAPHIC factors, PSYCHOTHERAPY, ALGORITHMS, COMORBIDITY
- Abstract
Fibromyalgia (FM) is a chronic pain syndrome with an unclear etiology. In addition to pain, FM patients suffer from a diverse array of symptoms and comorbidities, encompassing fatigue, cognitive dysfunction, mood disorders, sleep deprivation, and dizziness. Due to the complexity of FM, its diagnosis and treatment are highly challenging. The aim of the present work was to investigate some clinical and psychological characteristics of FM patients, and to uncover possible correlations with pharmacological and non-pharmacological therapies. We conducted a cross-sectional, questionnaire-based study aimed at evaluating pain, psychological traits, and the self-perceived effectiveness of pharmacological and non-pharmacological treatments in an Italian population of FM patients. Descriptive statistics, correlation, and inference analyses were performed. We found a prevalence of a neuropathic/nociplastic type of pain, which correlated with psychological traits such as anxiety, low mood, psychophysical discomfort, and the inability to relax. The pain type and psychological traits proved to play a role in determining the self-perceived effectiveness of therapeutic interventions. Patients reported a better response to non-pharmacological therapies, particularly dietary interventions, relaxation techniques, and psychotherapy, rather than to pharmacological interventions. Overall, our data indicate that, for better outcomes, the type of pain and psychological traits should be considered in tailor-made treatments that include non-pharmacological protocols as a complement to the use of drugs.
- Published
- 2022
- Full Text
- View/download PDF
13. 3-D spatial cluster analysis of seismic sequences through density-based algorithms.
- Author
-
Piegari, Ester, Herrmann, Marcus, and Marzocchi, Warner
- Subjects
BIG data, SEQUENCE analysis, ALGORITHMS, MACHINE learning, CLUSTER analysis (Statistics), STATISTICS, EARTHQUAKE hazard analysis
- Abstract
With seismic catalogues becoming progressively larger, extracting information becomes challenging and calls upon using sophisticated statistical analysis. Data are typically clustered by machine learning algorithms to find patterns or identify regions of interest that require further exploration. Here, we investigate two density-based clustering algorithms, DBSCAN and OPTICS, for their capability to analyse the spatial distribution of seismicity and their effectiveness in discovering highly active seismic volumes of arbitrary shapes in large data sets. In particular, we study the influence of varying input parameters on the cluster solutions. By exploring the parameter space, we identify a crossover region with optimal solutions in between two phases with opposite behaviours (i.e. only clustered and only unclustered data points). Using a synthetic case with various geometric structures, we find that solutions in the crossover region consistently have the largest clusters and best represent the individual structures. For identifying strong anisotropic structures, we illustrate the usefulness of data rescaling. Applying the clustering algorithms to seismic catalogues of recent earthquake sequences (2016 Central Italy and 2016 Kumamoto) confirms that cluster solutions in the crossover region are the best candidates to identify 3-D features of tectonic structures that were activated in a seismic sequence. Finally, we propose a list of recipes that generalizes our analyses to obtain such solutions for other seismic sequences.
- Published
- 2022
- Full Text
- View/download PDF
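The record above applies DBSCAN and OPTICS to 3-D seismicity. A compact, stdlib-only DBSCAN sketch showing the density-based clustering at its core (the `eps`, `min_pts`, and toy hypocentre coordinates are invented; real catalogues need a spatial index rather than this brute-force neighbour search):

```python
import math

# Classic DBSCAN: core points (>= min_pts neighbours within eps) seed
# clusters that expand through density-connected neighbours; the rest is noise.

def dbscan(points, eps, min_pts):
    """Return a label per point: 0, 1, ... for clusters, -1 for noise."""
    def neighbors(i):
        return [j for j, q in enumerate(points)
                if math.dist(points[i], q) <= eps]
    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1           # provisionally noise; may join later
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise reached from a core point: border
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb = neighbors(j)
            if len(nb) >= min_pts:   # j is itself a core point: keep expanding
                queue.extend(nb)
    return labels

# Two tight 3-D clusters plus one isolated point.
pts = [(0, 0, 0), (0.1, 0, 0), (0, 0.1, 0),
       (5, 5, 5), (5.1, 5, 5), (5, 5.1, 5),
       (20, 20, 20)]
print(dbscan(pts, eps=0.5, min_pts=3))  # → [0, 0, 0, 1, 1, 1, -1]
```

The paper's "crossover region" corresponds to sweeping `eps` and `min_pts` between the all-clustered and all-noise extremes.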
14. Redistribution of garbage codes to underlying causes of death: a systematic analysis on Italy and a comparison with most populous Western European countries based on the Global Burden of Disease Study 2019.
- Author
-
Monasta, Lorenzo, Alicandro, Gianfranco, Pasovic, Maja, Cunningham, Matthew, Armocida, Benedetta, Murray, Christopher J L, Ronfani, Luca, Naghavi, Mohsen, and Collaborators, GBD 2019 Italy Causes of Death
- Subjects
CAUSES of death, DEVELOPED countries, WASTE management, GLOBAL burden of disease, AGE distribution, WORLD health, HEALTH outcome assessment, COMPARATIVE studies, SEX distribution, DESCRIPTIVE statistics, ALGORITHMS
- Abstract
Background: The proportion of reported causes of death (CoDs) that are not underlying causes can be relevant even in high-income countries and seriously affect health planning. The Global Burden of Disease (GBD) study identifies these 'garbage codes' (GCs) and redistributes them to underlying causes using evidence-based algorithms. Planners relying on vital registration data will find discrepancies with GBD estimates. We analyse these discrepancies through the analysis of GCs and their redistribution. Methods: We explored the case of Italy, at the national and regional levels, and compared it to nine other Western European countries with similar population sizes. We analysed differences between official data and GBD 2019 estimates for the period 1990–2017, for which we had vital registration data for most of the selected countries. Results: In Italy, in 2017, 33 000 deaths were attributed to unspecified type of stroke and 15 000 to unspecified type of diabetes, together making up a fourth of the overall garbage. Significant heterogeneity exists in the overall proportion of GCs, their type (unspecified or impossible underlying causes), and the size of specific GCs among regions in Italy and among the selected countries. We found no pattern between the overall level of garbage and the relevance of specific GCs. Even locations performing below average show interestingly lower levels for certain GCs compared to better-performing countries. Conclusions: This systematic analysis suggests that the heterogeneity in GC levels and causes, paired with a more detailed analysis of local practices, strengths and weaknesses, could be a positive element in a strategy for the reduction of GCs in Italy.
- Published
- 2022
- Full Text
- View/download PDF
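The record above concerns redistributing garbage-code deaths to underlying causes. A toy sketch of the simplest form of that idea, proportional redistribution (the cause names and counts are invented; the GBD algorithms are evidence-based and far more elaborate, conditioning on age, sex, and location):

```python
# Deaths assigned to a garbage code (e.g. "unspecified stroke") are
# reallocated to plausible underlying causes in proportion to the deaths
# already observed for those causes.

def redistribute(garbage_deaths, target_counts):
    total = sum(target_counts.values())
    return {cause: count + garbage_deaths * count / total
            for cause, count in target_counts.items()}

observed = {"ischaemic stroke": 600, "haemorrhagic stroke": 400}
adjusted = redistribute(100, observed)
print(adjusted)  # → {'ischaemic stroke': 660.0, 'haemorrhagic stroke': 440.0}
```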
15. Handwriting Declines With Human Aging: A Machine Learning Study.
- Author
-
Asci, Francesco, Scardapane, Simone, Zampogna, Alessandro, D'Onofrio, Valentina, Testa, Lucia, Patera, Martina, Falletti, Marco, Marsili, Luca, and Suppa, Antonio
- Subjects
NEUROLOGICAL disorders, PREDICTIVE tests, HANDWRITING, MACHINE learning, SMARTPHONES, MANN Whitney U Test, AGING, DESCRIPTIVE statistics, ARTIFICIAL neural networks, RECEIVER operating characteristic curves, SENSITIVITY & specificity (Statistics), DATA analysis software, AGRAPHIA, ALGORITHMS, LONGITUDINAL method, TELEMEDICINE, DISEASE complications
- Abstract
Background: Handwriting is an acquired complex cognitive and motor skill resulting from the activation of a widespread brain network. Handwriting therefore may provide biologically relevant information on health status. Also, handwriting can be collected easily in an ecological scenario, through safe, cheap, and largely available tools. Hence, objective handwriting analysis through artificial intelligence would represent an innovative strategy for telemedicine purposes in healthy subjects and people affected by neurological disorders. Materials and Methods: One hundred and fifty-six healthy subjects (61 males; 49.6 ± 20.4 years) were enrolled and divided according to age into three subgroups: younger adults (YA), middle-aged adults (MA), and older adults (OA). Participants performed an ecological handwriting task that was digitalized through smartphones. Data underwent the DBNet algorithm for measuring and comparing the average stroke sizes in the three groups. A convolutional neural network (CNN) was also used to classify handwriting samples. Lastly, receiver operating characteristic (ROC) curves and sensitivity, specificity, positive and negative predictive values (PPV, NPV), accuracy and area under the curve (AUC) were calculated to report the performance of the algorithm. Results: Stroke sizes were significantly smaller in OA than in MA and YA. The CNN classifier objectively discriminated YA vs. OA (sensitivity = 82%, specificity = 80%, PPV = 78%, NPV = 79%, accuracy = 77%, and AUC = 0.84), MA vs. OA (sensitivity = 84%, specificity = 56%, PPV = 78%, NPV = 73%, accuracy = 74%, and AUC = 0.7), and YA vs. MA (sensitivity = 75%, specificity = 82%, PPV = 79%, NPV = 83%, accuracy = 79%, and AUC = 0.83). Discussion: Handwriting progressively declines with human aging. The effect of physiological aging on handwriting abilities can be detected remotely and objectively by using machine learning algorithms.
- Published
- 2022
- Full Text
- View/download PDF
16. Machine-learning based vulnerability analysis of existing buildings.
- Author
-
Ruggieri, Sergio, Cardellicchio, Angelo, Leggieri, Valeria, and Uva, Giuseppina
- Subjects
EARTHQUAKE hazard analysis, DATA warehousing, TEST reliability, ALGORITHMS, RISK assessment
- Abstract
The paper presents a machine-learning based framework, named VULMA (VULnerability analysis using MAchine-learning), for the vulnerability analysis of existing buildings. The underlying idea is to provide an indication of seismic vulnerability by exploiting available photographs, which can be properly processed to provide input data for empirical vulnerability algorithms. To this scope, a complete processing pipeline has been defined, which consists of four consecutive modules offering different and specific services. The first module, Street VULMA, performs the image gathering starting from the raw data; the second module, Data VULMA, provides a means for data labelling and storage; the third module, Bi VULMA, uses the collected data to train several machine-learning models for image classification; the fourth module, In VULMA, performs a ranking of the images and their analysis, and consequently assigns the vulnerability index. The proposed procedure has been employed on the existing building portfolio in an extended area of the municipality of Bisceglie, Puglia, Southern Italy, for which all the modules have been tested and, above all, the machine-learning models of Bi VULMA have been trained. Afterwards, to test the efficiency and reliability of the proposed tools, the entire procedure was applied to five case study buildings. The results in terms of vulnerability index were compared with the manual computations performed by the authors applying the same algorithm. Although the proposed tool could be improved or modified in some of its modules, the obtained results show the good effectiveness of VULMA, which opens new scenarios in the field of vulnerability assessment procedures and risk mitigation strategies.
• Proposal of a framework for the vulnerability analysis of existing buildings starting from a photo: VULMA;
• Definition of the four modules characterizing VULMA: Street VULMA, Data VULMA, Bi VULMA and In VULMA;
• Application and training of the proposed procedure on a dataset extracted from a municipality of Southern Italy, and testing and validation of the tool;
• Assessment and proposal of VULMA as a new instrument for the definition of the vulnerability response of individual buildings and for the seismic risk estimate at large scale.
- Published
- 2021
- Full Text
- View/download PDF
17. Automation of the peripheral resistance valve in a hydro-mechanical cardiovascular pulse duplicator system.
- Author
-
Rampazzo, Mirco, Manzoni, Eleonora, Lionello, Michele, Di Micco, Luigi, and Susin, Francesca Maria
- Subjects
PRESSURE drop (Fluid dynamics), VALVES, AUTOMATION, MEDICAL equipment, ALGORITHMS
- Abstract
This paper considers the modernization of an existing non-commercial Pulse Duplicator in use at the Healing Research Laboratory at the University of Padova, Italy. The system reproduces the human systemic circulation and is used to test heart medical devices. The focus of this study is the full automation of a crucial system component, the manual peripheral resistance valve, which is replaced by a motorized one. First, under certain technological constraints, the problem of automatically adjusting the valve setting is tackled by using a Sliding Mode Extremum Seeking Control (ESC) method. This approach guarantees the fundamental pressure drop needed to simulate the peripheral resistance to flow in the human systemic circulation in various system configurations and operating conditions. Then, the Sliding Mode ESC algorithm is embedded in an Arduino board driving the motorized valve. Finally, experimental tests are performed to assess the effectiveness of the motorized valve.
- Published
- 2021
- Full Text
- View/download PDF
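The record above automates a valve so that a simulated pressure drop tracks a target. A greatly simplified stand-in for that control idea, using a sign-based (sliding-mode-flavoured) update loop; the real system runs a Sliding Mode Extremum Seeking Controller on hardware, and the toy plant model (`delta_p = flow * resistance`), gains, and targets below are all invented:

```python
# Iteratively adjust a valve's resistance with a sign-based update until the
# simulated pressure drop matches a target; the update direction depends only
# on the sign of the error, so the state chatters tightly around the target.

def tune_valve(flow, target_dp, r0=1.0, step=0.05, iters=400):
    r = r0
    for _ in range(iters):
        dp = flow * r                      # toy plant: linear pressure drop
        error = target_dp - dp
        r += step * (1 if error > 0 else -1 if error < 0 else 0)
        r = max(r, 0.0)                    # a valve cannot go negative
    return r

flow, target = 5.0, 80.0                   # arbitrary units
r = tune_valve(flow, target)
print(round(flow * r, 1))                  # settles close to the 80.0 target
```

A true extremum seeking controller needs no plant model at all: it perturbs the input and infers the gradient of a measured objective, which is what makes it suitable for the hardware setup the abstract describes.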
Discovery Service for Jio Institute Digital Library