40 results
Search Results
2. The Challenges of Algorithm Management: The Spanish Perspective.
- Author
-
Perez del Prado, Daniel
- Subjects
ALGORITHMS, LABOR laws, ARTIFICIAL intelligence - Abstract
This paper focuses on how Spain's labour and employment law is dealing with technological disruption and, particularly, with algorithm management, looking for a harmonious equilibrium between traditional structures and profound changes. It pays special attention to the different actors affected and the most recent normative changes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. International External Validation of Risk Prediction Model of 90-Day Mortality after Gastrectomy for Cancer Using Machine Learning.
- Author
-
Dal Cero, Mariagiulia, Gibert, Joan, Grande, Luis, Gimeno, Marta, Osorio, Javier, Bencivenga, Maria, Fumagalli Romario, Uberto, Rosati, Riccardo, Morgagni, Paolo, Gisbertz, Suzanne, Polkowski, Wojciech P., Lara Santos, Lucio, Kołodziejczyk, Piotr, Kielan, Wojciech, Reddavid, Rossella, van Sandick, Johanna W., Baiocchi, Gian Luca, Gockel, Ines, Davies, Andrew, and Wijnhoven, Bas P. L.
- Subjects
MORTALITY risk factors, GASTRECTOMY, RISK assessment, RANDOM forest algorithms, PREDICTION models, STOMACH tumors, RECEIVER operating characteristic curves, SURGERY, PATIENTS, FISHER exact test, LOGISTIC regression analysis, HEMOGLOBINS, CANCER patients, HOSPITALS, DESCRIPTIVE statistics, AGE distribution, RESEARCH methodology, RESEARCH, COMBINED modality therapy, MACHINE learning, DATA analysis software, CONFIDENCE intervals, SERUM albumin, ALGORITHMS - Abstract
Simple Summary: A 90-day mortality predictive model for curative gastric cancer resection based on the Spanish EURECCA Esophagogastric Cancer database was externally validated using the GASTRODATA registry. The externally validated model showed a modestly worse performance compared to the original model, nevertheless maintaining its discriminating ability in clinical practice. Background: Radical gastrectomy remains the main treatment for gastric cancer, despite its high mortality. A clinical predictive model of 90-day mortality (90DM) risk after gastric cancer surgery based on the Spanish EURECCA registry database was developed using a machine learning algorithm. We performed an external validation of this model based on data from an international multicenter cohort of patients. Methods: A cohort of patients from the European GASTRODATA database was selected. Demographic, clinical, and treatment variables in the original and validation cohorts were compared. The performance of the model was evaluated using the area under the curve (AUC) for a random forest model. Results: The validation cohort included 2546 patients from 24 European hospitals. The advanced clinical T- and N-category, neoadjuvant therapy, open procedures, total gastrectomy rates, and mean volume of the centers were significantly higher in the validation cohort. The 90DM rate was also higher in the validation cohort (5.6%) vs. the original cohort (3.7%). The AUC in the validation model was 0.716. Conclusion: The externally validated model for predicting the 90DM risk in gastric cancer patients undergoing gastrectomy with curative intent continues to be as useful as the original model in clinical practice. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. Confidence of a k-Nearest Neighbors Python Algorithm for the 3D Visualization of Sedimentary Porous Media.
- Author
-
Bullejos, Manuel, Cabezas, David, Martín-Martín, Manuel, and Alcalá, Francisco Javier
- Subjects
PYTHON programming language, K-nearest neighbor classification, POROUS materials, CONFIDENCE, ECONOMIC decision making, ALGORITHMS - Abstract
In a previous paper, the authors implemented a machine learning k-nearest neighbors (KNN) algorithm and Python libraries to create two 3D interactive models of the stratigraphic architecture of the Quaternary onshore Llobregat River Delta (NE Spain) for groundwater exploration purposes. The main limitation of this previous paper was its lack of routines for evaluating the confidence of the 3D models. Building from the previous paper, this paper refines the programming code and introduces an additional algorithm to evaluate the confidence of the KNN predictions. A variant of the Similarity Ratio method was used to quantify the KNN prediction confidence. This variant used weights that were inversely proportional to the distance between each grain-size class and the inferred point to work out a value that played the role of similarity. While the KNN algorithm and Python libraries demonstrated their efficacy for obtaining 3D models of the stratigraphic arrangement of sedimentary porous media, the KNN prediction confidence verified the certainty of the 3D models. In the Llobregat River Delta, the KNN prediction confidence at each prospecting depth was a function of the available data density at that depth. As expected, the KNN prediction confidence decreased according to the decreasing data density at lower depths. The obtained average-weighted confidence was in the 0.44−0.53 range for gravel bodies at prospecting depths in the 12.7−72.4 m b.s.l. range and was in the 0.42−0.55 range for coarse sand bodies at prospecting depths in the 4.6−83.9 m b.s.l. range. In a couple of cases, spurious average-weighted confidences of 0.29 in one gravel body and 0.30 in one coarse sand body were obtained. These figures were interpreted as the result of the quite different weights of neighbors from different grain-size classes at short distances. The KNN algorithm confidence has proven its suitability for identifying these anomalous results in the supposedly well-curated grain-size database used in this study. The introduced KNN algorithm confidence quantifies the reliability of the 3D interactive models, which is a necessary stage to make decisions in economic and environmental geology. In the Llobregat River Delta, this quantification clearly improves groundwater exploration predictability. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
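The inverse-distance weighting described in the abstract of record 4 can be sketched in a few lines. This is an illustrative reading only, not the authors' exact routine: the neighbor list, the class labels, and the `knn_confidence` helper are hypothetical, and the paper's Similarity Ratio variant may differ in detail.

```python
from collections import defaultdict

def knn_confidence(neighbors):
    """Inverse-distance-weighted similarity ratio for a KNN prediction.

    `neighbors` holds (grain_size_class, distance) pairs for the k
    nearest samples around the inferred point. Each neighbor's weight
    is inversely proportional to its distance, so close samples
    dominate. Returns (predicted_class, confidence in [0, 1]).
    """
    eps = 1e-9  # guard against division by zero for coincident points
    weights = defaultdict(float)
    for cls, dist in neighbors:
        weights[cls] += 1.0 / (dist + eps)
    predicted = max(weights, key=weights.get)
    confidence = weights[predicted] / sum(weights.values())
    return predicted, confidence
```

Under this reading, a low confidence flags inferred points where neighbors of different grain-size classes sit at similarly short distances, which is consistent with the anomalous 0.29 and 0.30 figures the authors report.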
5. A K-Nearest Neighbors Algorithm in Python for Visualizing the 3D Stratigraphic Architecture of the Llobregat River Delta in NE Spain.
- Author
-
Bullejos, Manuel, Cabezas, David, Martín-Martín, Manuel, and Alcalá, Francisco Javier
- Subjects
K-nearest neighbor classification, SUPERVISED learning, PYTHON programming language, ALGORITHMS, MACHINE learning, SEDIMENTARY structures, PLIOCENE Epoch - Abstract
The k-nearest neighbors (KNN) algorithm is a non-parametric supervised machine learning classifier, which uses proximity and similarity to make classifications or predictions about the grouping of an individual data point. This ability makes the KNN algorithm ideal for classifying datasets of geological variables and parameters prior to 3D visualization. This paper introduces a machine learning KNN algorithm and Python libraries for visualizing the 3D stratigraphic architecture of sedimentary porous media in the Quaternary onshore Llobregat River Delta (LRD) in northeastern Spain. A first HTML model showed a consecutive 5 m-equispaced set of horizontal sections of the granulometry classes created with the KNN algorithm from 0 to 120 m below sea level in the onshore LRD. A second HTML model showed the 3D mapping of the main Quaternary gravel and coarse sand sedimentary bodies (lithosomes) and the basement (Pliocene and older rocks) top surface created with Python libraries. These results reproduce well the complex sedimentary structure of the LRD reported in recent scientific publications and prove the suitability of the KNN algorithm and Python libraries for visualizing the 3D stratigraphic structure of sedimentary porous media, which is a crucial stage in making decisions in different environmental and economic geology disciplines. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
6. KNN and adaptive comfort applied in decision making for HVAC systems.
- Author
-
Aparicio-Ruiz, Pablo, Barbadilla-Martín, Elena, Guadix, José, and Cortés, Pablo
- Subjects
THERMAL comfort, DECISION making, SUPPORT vector machines, ALGORITHMS, AIR conditioning, HEATING & ventilation industry - Abstract
The decision making of a suitable heating, ventilating and air conditioning system's set-point temperature is an energy and environmental challenge in our society. In the present paper, a general framework to define such temperature based on a dynamic adaptive comfort algorithm is proposed. Due to the fact that the thermal comfort of the occupants of a building has different ranges of acceptability, this method is applied to learn such comfort temperature with respect to the running mean temperature and therefore to decide the suitable range of indoor temperature. It is demonstrated that this solution makes it possible to dynamically build an adaptive comfort algorithm, an algorithm based on human thermal adaptability, without applying the traditional theory. The proposed methodology based on the K-Nearest-Neighbour algorithm was tested and compared with data from an experimental thermal comfort field study carried out in a mixed mode building in the south-western area of Spain and with the Support Vector Machine method. The results show that the K-Nearest-Neighbour algorithm represents the pattern of thermal comfort data better than the traditional solution and that it is a suitable method to learn the thermal comfort area of a building and to define the set-point temperature for a heating, ventilating and air-conditioning system. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
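As a minimal sketch of the idea in record 6: a KNN regressor can map the running mean outdoor temperature to a comfort temperature learned from field-study votes, and the result can serve as the HVAC set-point. The function name, data layout, and plain averaging rule below are assumptions for illustration, not the paper's implementation.

```python
def knn_comfort_setpoint(history, running_mean_temp, k=5):
    """Estimate a comfort set-point from field-study data with KNN.

    `history` is a list of (running_mean_outdoor_temp, comfort_temp)
    pairs from a thermal-comfort field study. The set-point for the
    current running mean temperature is the average comfort vote of
    the k most similar historical conditions.
    """
    # Rank historical conditions by distance to the current condition.
    nearest = sorted(history,
                     key=lambda pair: abs(pair[0] - running_mean_temp))[:k]
    return sum(comfort for _, comfort in nearest) / len(nearest)
```

In practice the comfort band around this set-point would still be derived from the occupants' acceptability ranges, as the abstract notes.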
7. ALGORITHMIC (IN)VISIBILITY TACTICS AMONG IMMIGRANT TIKTOKERS.
- Author
-
JARAMILLO-DENT, DANIELA
- Subjects
SCIENTIFIC literature, IMMIGRANTS, SOCIAL media, DIGITAL video - Abstract
It is well established in scientific literature that immigrants are excluded from their own stories, which are often instrumentalized to fulfill specific communicative, othering intentions. In this sense, migrant agency and voice are, in many cases, absent from narratives related to their life experiences and subject to various symbolic, digital, and material borders. Moreover, although social media has been recognized as a prime space for self-representation across different segments of society, immigrants are often excluded from these spaces due to the risks that sharing certain information publicly represents to them. In this article I draw from a 16-month digital ethnography and inductive, multimodal content analysis of videos created by 53 Latin American immigrant tiktokers in the United States and Spain. This enables the conceptualization of their algorithmic (in)visibility practices, which refer to the set of strategies deployed by immigrant content creators on social media (and possibly other marginalized and vulnerable populations) to negotiate the conspicuousness of their controversial content with the aim of avoiding its deletion from the platform. The findings unveil three exemplary algorithmic (in)visibility practices that include content reuse and re-upload, vernacular visibility, and partial deplatforming. I find that these strategies shift between collective and individual approaches to achieve selective visibility and concealed conspicuousness within algorithmic moderation systems. [ABSTRACT FROM AUTHOR]
- Published
- 2022
8. Estimation of COVID-19 epidemic curves using genetic programming algorithm.
- Author
-
Anđelić, Nikola, Šegota, Sandi Baressi, Lorencin, Ivan, Mrzljak, Vedran, and Car, Zlatan
- Subjects
HIGH performance computing, COVID-19, CONVALESCENCE, MACHINE learning, INFECTIOUS disease transmission, RESEARCH funding, STATISTICAL models, ALGORITHMS - Abstract
This paper investigates the possibility of the implementation of Genetic Programming (GP) algorithm on a publicly available COVID-19 data set, in order to obtain mathematical models which could be used for estimation of confirmed, deceased, and recovered cases and the estimation of epidemiology curve for specific countries with a high number of cases, such as China, Italy, Spain, and the USA, as well as on the global scale. The conducted investigation shows that the best mathematical models produced for estimating confirmed and deceased cases achieved R² scores of 0.999, while the models developed for estimation of recovered cases achieved an R² score of 0.998. The equations generated for confirmed, deceased, and recovered cases were combined in order to estimate the epidemiology curve of specific countries and on the global scale. The estimated epidemiology curve for each country obtained from these equations is almost identical to the real data contained within the data set. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
9. Bidders Recommender for Public Procurement Auctions Using Machine Learning: Data Analysis, Algorithm, and Case Study with Tenders from Spain.
- Author
-
García Rodríguez, Manuel J., Rodríguez Montequín, Vicente, Ortega Fernández, Francisco, and Villanueva Balsera, Joaquín M.
- Subjects
GOVERNMENT purchasing, MACHINE learning, ALGORITHMS, RECOMMENDER systems, RANDOM forest algorithms, DATA analysis - Abstract
Recommending the identity of bidders in public procurement auctions (tenders) has a significant impact in many areas of public procurement, but it has not yet been studied in depth. A bidders recommender would be a very beneficial tool because a supplier (company) can search for appropriate tenders and, vice versa, a public procurement agency can automatically discover unknown companies which are suitable for its tender. This paper develops a pioneering algorithm to recommend potential bidders using a machine learning method, particularly a random forest classifier. The bidders recommender is described theoretically, so it can be implemented or adapted to any particular situation. It has been successfully validated with a case study: an actual Spanish tender dataset (free public information) which has 102,087 tenders from 2014 to 2020 and a company dataset (nonfree public information) which has 1,353,213 Spanish companies. Quantitative, graphical, and statistical descriptions of both datasets are presented. The results of the case study were satisfactory: the winning bidding company was within the recommended group of companies in 24% to 38% of the tenders, according to different test conditions and scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
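The recommendation step described in record 9 reduces to ranking companies by a classifier's predicted probability for a given tender. The sketch below assumes a hypothetical `model_proba` callable standing in for the paper's trained random forest; the function names and the feature layout are invented for illustration.

```python
def recommend_bidders(model_proba, tender_features, companies, top_n=5):
    """Rank companies by predicted probability of bidding on a tender.

    `model_proba(tender_features, company_features)` stands in for a
    trained classifier (the paper uses a random forest); it returns
    the predicted probability that the company bids on the tender.
    `companies` maps company id to its feature dict.
    """
    scored = [(company_id, model_proba(tender_features, feats))
              for company_id, feats in companies.items()]
    # Highest predicted probability first.
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [company_id for company_id, _ in scored[:top_n]]
```

In practice `model_proba` would be the predict-probability method of a classifier fitted on historical tender–bidder pairs from the kind of datasets the paper describes.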
10. A Method of Estimating Time-to-Recovery for a Disease Caused by a Contagious Pathogen Such as SARS-CoV-2 Using a Time Series of Aggregated Case Reports.
- Author
-
Koutsouris, Dimitrios-Dionysios, Pitoglou, Stavros, Anastasiou, Athanasios, and Koumpouros, Yiannis
- Subjects
DISEASE progression, COMPUTER software, COVID-19, CONFIDENCE intervals, TIME, CONVALESCENCE, WORLD health, EPIDEMICS, TIME series analysis, DESCRIPTIVE statistics, SENSITIVITY & specificity (Statistics), PREDICTION models, COVID-19 pandemic, ALGORITHMS - Abstract
During the outbreak of a disease caused by a pathogen with unknown characteristics, the uncertainty of its progression parameters can be reduced by devising methods that, based on rational assumptions, exploit available information to provide actionable insights. In this study, performed a few (~6) weeks into the outbreak of COVID-19 (caused by SARS-CoV-2), one of the most important disease parameters, the average time-to-recovery, was calculated using data publicly available on the internet (daily reported cases of confirmed infections, deaths, and recoveries), and fed into an algorithm that matches confirmed cases with deaths and recoveries. Unmatched cases were adjusted based on the matched cases calculation. The mean time-to-recovery, calculated from all globally reported cases, was found to be 18.01 days (SD 3.31 days) for the matched cases and 18.29 days (SD 2.73 days) taking into consideration the adjusted unmatched cases as well. The proposed method used limited data and provided experimental results in the same range as clinical studies published several months later. This indicates that the proposed method, combined with expert knowledge and informed calculated assumptions, could provide a meaningful calculated average time-to-recovery figure, which can be used as an evidence-based estimation to support containment and mitigation policy decisions, even at the very early stages of an outbreak. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
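One plausible reading of the case-matching step described in record 10 is first-in-first-out matching of reported recoveries to still-open confirmed cases. The sketch below implements that reading; the FIFO rule and the function name are assumptions, not the paper's exact algorithm, and the adjustment of unmatched cases is omitted.

```python
from collections import deque

def mean_time_to_recovery(daily_confirmed, daily_recovered):
    """Match recoveries to confirmed cases FIFO and average the delays.

    Both inputs are lists of daily counts aligned on the same calendar.
    Each recovery reported on a given day is matched with the earliest
    still-open confirmed case, and the implied delays are averaged.
    """
    open_cases = deque()  # confirmation day of each still-open case
    delays = []
    for day, (new_cases, new_recoveries) in enumerate(
            zip(daily_confirmed, daily_recovered)):
        open_cases.extend([day] * new_cases)
        for _ in range(new_recoveries):
            if not open_cases:
                break  # a recovery with no matchable confirmed case
            delays.append(day - open_cases.popleft())
    return sum(delays) / len(delays) if delays else None
```

For example, two cases confirmed on day 0 with recoveries reported on days 2 and 3 yield a mean time-to-recovery of 2.5 days under this matching.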
11. A study of differential microRNA expression profile in migraine: the microMIG exploratory study.
- Author
-
Gallardo, V. J., Gómez-Galván, J. B., Asskour, L., Torres-Ferrús, M., Alpuente, A., Caronna, E., and Pozo-Rosich, P.
- Subjects
RESEARCH, MONONUCLEAR leukocytes, MIGRAINE, RESEARCH methodology, MICRORNA, INTERVIEWING, CASE-control method, RANDOM forest algorithms, GENETIC markers, GENE expression profiling, QUESTIONNAIRES, FACTOR analysis, RESEARCH funding, CLUSTER analysis (Statistics), HEADACHE, WOMEN'S health, LONGITUDINAL method, ALGORITHMS, EPIGENOMICS - Abstract
Background: Several studies have described potential microRNA (miRNA) biomarkers associated with migraine, but studies are scarcely reproducible primarily due to the heterogeneous variability of participants. Increasing evidence shows that disease-related intrinsic factors together with lifestyle (environmental factors), influence epigenetic mechanisms and in turn, diseases. Hence, the main objective of this exploratory study was to find differentially expressed miRNAs (DE miRNA) in peripheral blood mononuclear cells (PBMC) of patients with migraine compared to healthy controls in a well-controlled homogeneous cohort of non-menopausal women. Methods: Patients diagnosed with migraine according to the International Classification of Headache Disorders (ICHD-3) and healthy controls without familial history of headache disorders were recruited. All participants completed a very thorough questionnaire and structured-interview in order to control for environmental factors. RNA was extracted from PBMC and a microarray system (GeneChip miRNA 4.1 Array chip, Affymetrix) was used to determine the miRNA profiles between study groups. Principal components analysis and hierarchical clustering analysis were performed to study samples distribution and random forest (RF) algorithms were computed for the classification task. To evaluate the stability of the results and the prediction error rate, a bootstrap (the .632+ rule) was run throughout the whole procedure. Finally, a functional enrichment analysis of selected targets was computed through protein–protein interaction networks. Results: After RF classification, three DE miRNA distinguished study groups in a very homogeneous female cohort, controlled by factors such as demographics (age and BMI), life-habits (physical activity, caffeine and alcohol consumptions), comorbidities and clinical features associated with the disease: miR-342-3p, miR-532-3p and miR-758-5p. Sixty-eight target genes were predicted which were linked mainly to enriched ion channels and signaling pathways, neurotransmitter and hormone homeostasis, infectious diseases and circadian entrainment. Conclusions: A novel 3-miRNA signature (miR-342-3p, miR-532-3p and miR-758-5p) has been found differentially expressed between controls and patients with migraine. Enrichment analysis showed that these pathways are closely associated with known migraine pathophysiology, which could lead to the first reliable epigenetic biomarker set. Further studies should be performed to validate these findings in a larger and more heterogeneous sample. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
12. An Innovative JavaScript-Based Framework for Teaching Backtracking Algorithms Interactively.
- Author
-
Nasralla, Moustafa M.
- Subjects
JAVASCRIPT programming language, ALGORITHMS, CONCEPT learning, ENGINEERING education, EIGENFUNCTIONS, DIGITAL learning - Abstract
Algorithm fundamentals are useful to learn at different levels of engineering education. One of the most difficult concepts to teach and understand is backtracking algorithms with proper bounding functions. This article proposes a framework to implement interactive online tools showing examples of backtracking algorithms in which students can graphically observe the execution step by step. This approach is illustrated with the n-queens problem with students from Prince Sultan University, Saudi Arabia, and Complutense University of Madrid, Spain. The results show 6.67% increased learning on a backtracking exercise in the experimental group over the control group, in which the algorithms were automatically validated with DOMjudge software (an automated system used to run programming contests). The proposed framework was evaluated as easy to use, with a score of 74.5% in the validated System Usability Scale (SUS); easy to learn, with a score of 6.22 out of 7 in the validated Usefulness, Satisfaction, and Ease-of-Use (USE) scale; and with a general satisfaction of 5.97 out of 7 in the validated USE scale. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
13. Day- and night-time aerosol optical depth implementation in CÆLIS.
- Author
-
González, Ramiro, Toledano, Carlos, Román, Roberto, Fuertes, David, Berjón, Alberto, Mateos, David, Guirado-Fuentes, Carmen, Velasco-Merino, Cristian, Carlos Antuña-Sanchez, Juan, Calle, Abel, E. Cachorro, Victoria, and M. de Frutos, Ángel
- Subjects
OPTICAL depth (Astrophysics), AEROSOLS, OBSERVATIONS of the Moon, ALGORITHMS, QUALITY control - Abstract
The University of Valladolid (UVa, Spain) has managed a calibration center of the AErosol RObotic NETwork (AERONET) since 2006. The CÆLIS software tool, developed by UVa, was created to manage the data generated by the AERONET photometers, for calibration, quality control and data processing purposes. This paper exploits the potential of this tool in order to obtain products like the aerosol optical depth (AOD) and Angstrom exponent (AE), which are of high interest for atmospheric and climate studies, as well as to enhance the quality control of the instruments and data managed by CÆLIS. The AOD and cloud screening algorithms implemented in CÆLIS, both based on AERONET version 3, are described in detail. The obtained products are compared with the AERONET database. In general, the differences in daytime AOD between CÆLIS and AERONET are far below the expected uncertainty of the instrument, with mean differences ranging between −1.3×10⁻⁴ at 870 nm and 6.2×10⁻⁴ at 380 nm. The standard deviations of the differences range from 2.8×10⁻⁴ at 675 nm to 8.1×10⁻⁴ at 340 nm. The AOD and AE at night-time calculated by CÆLIS from Moon observations are also presented, showing good continuity between day and night-time for different locations, aerosol loads and moon phase angles. Regarding cloud screening, around 99.9 % of the observations classified as cloud-free by CÆLIS are also assumed cloud-free by AERONET; this percentage is similar for the cases considered as cloud-contaminated by both databases. The obtained results point out the capability of CÆLIS as a processing system. The AOD algorithm provides the opportunity to use this tool with other instrument types and to retrieve other aerosol products in the future. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
14. Breast Density Analysis Using an Automatic Density Segmentation Algorithm.
- Author
-
Oliver, Arnau, Tortajada, Meritxell, Lladó, Xavier, Freixenet, Jordi, Ganau, Sergi, Tortajada, Lidia, Vilagran, Mariona, Sentís, Melcior, and Martí, Robert
- Subjects
BREAST, ALGORITHMS, MAMMOGRAMS, BREAST tumors, DIAGNOSTIC imaging, LONGITUDINAL method, COMPUTERS in medicine, PROBABILITY theory, REGRESSION analysis, RESEARCH funding, T-test (Statistics), EVALUATION research, DESCRIPTIVE statistics, ANATOMY - Abstract
Breast density is a strong risk factor for breast cancer. In this paper, we present an automated approach for breast density segmentation in mammographic images based on a supervised pixel-based classification and using textural and morphological features. The objective of the paper is not only to show the feasibility of an automatic algorithm for breast density segmentation but also to prove its potential application to the study of breast density evolution in longitudinal studies. The database used here contains three complete screening examinations, acquired 2 years apart, of 130 different patients. The approach was validated by comparing manual expert annotations with automatically obtained estimations. Transversal analysis of the breast density analysis of craniocaudal (CC) and mediolateral oblique (MLO) views of both breasts acquired in the same study showed a correlation coefficient of ρ = 0.96 between the mammographic density percentage for left and right breasts, whereas a comparison of both mammographic views showed a correlation of ρ = 0.95. A longitudinal study of breast density confirmed the trend that dense tissue percentage decreases over time, although we noticed that the decrease in the ratio depends on the initial amount of breast density. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
15. The Empirically Corrected EP-TOMS Total Ozone Data Against Brewer Measurements at El Arenosillo (Southwestern Spain).
- Author
-
Antón, Manuel, Vilaplana, José Manuel, Kroon, Mark, Serrano, Antonio, Parias, Marta, Cancillo, María Luisa, and de la Morena, Benito A.
- Subjects
OZONE, SPECTROMETERS, SPECTRORADIOMETER, SATELLITE geodesy - Abstract
This paper focuses on the validation of the empirically corrected total ozone column (TOC) data provided by the Earth Probe Total Ozone Mapping Spectrometer (EP-TOMS) using ground-based measurements recorded by a well-calibrated Brewer spectroradiometer located at El Arenosillo (Spain). In addition, satellite TOC observations derived from the Ozone Monitoring Instrument (OMI) with the TOMS algorithm are also used in this paper. The agreement between EP-TOMS TOC data and Brewer measurements is excellent (R² ~ 0.92) even for the period 2000-2005 when a higher EP-TOMS instrument degradation occurred. Despite its low magnitude, the EP-TOMS-Brewer relative differences depend on the solar zenith angle (SZA), showing a clear seasonal cycle with amplitude between ±2% and ±4%. Conversely, OMI-Brewer relative differences show a constant negative value around -1% with no significant dependence on SZA. No significant dependence of the ground-based to satellite-based differences with respect to the EP-TOMS scene or to the OMI crosstrack position is observed for either satellite retrieval algorithm. Finally, the TOC estimated by the two satellite instruments has also been compared, showing a good agreement (R² ~ 0.88). Overall, we conclude that the empirical correction of the EP-TOMS data record provides a reprocessed set of high quality. However, EP-TOMS data after year 2000 should not be used in calculations of global ozone trends due to remaining errors in the data set and because it is no longer an independent data set. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
16. Control Algorithm for Coordinated Reactive Power Compensation in a Wind Park.
- Author
-
Díaz-Dorado, E., Carrillo, C., and Cidrás, J.
- Subjects
WIND power plants, WIND power, ALGORITHMS, POWER resources, WIND turbines, INDUCTION generators, CAPACITOR banks, DYNAMIC programming, SIMULATION methods & models, REACTIVE power - Abstract
The penetration level of wind energy is continuously growing, and it is especially relevant in European countries such as Denmark, Germany, and Spain. For this reason, grid codes in different countries have been recently revised, or are now under revision, in order to integrate this energy in the network taking into account the security of supply. This paper is related to reactive compensation, which is one aspect usually included in these codes. On the other hand, a great number of installed wind parks are formed by fixed-speed wind turbines equipped with induction generators. The typical scheme for reactive compensation in this kind of wind park is based on capacitor banks locally controlled in each machine. This configuration makes it very difficult to follow the requirements of the new grid codes. To overcome this problem, a configuration with a central controller that coordinates the actuation over all the capacitor steps in the wind park is proposed in this paper. A central controller algorithm that is based on dynamic programming is presented and evaluated by means of simulation. At this time, the proposed scheme has been installed at the Sotavento Experimental Wind Park (Spain) and it is currently being tested. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
17. Application of the GoRoSo Feedforward Algorithm to Compute the Gate Trajectories for a Quick Canal Closing in the Case of an Emergency.
- Author
-
Soler, Joan, Gómez, Manuel, Rodellar, José, and Gamazo, Pablo
- Subjects
CANALS, RIVERS, OPEN-channel flow, QUADRATIC programming, FEEDFORWARD control systems - Abstract
The canal delivery system in the Left Hemidelta area of the Ebro River in Spain consists of a tree-shaped net of open canals. The overall system can be quickly isolated in the case of an emergency by closing the upstream pool. Transients in which the initial state is hydraulically far from the final state are difficult to handle and cannot be carried out in a single gate movement if the canal lining is to be protected. Therefore, they have to be as smooth as possible. GoRoSo is a feedforward control algorithm for irrigation canals based on sequential quadratic programming. With this tool, it is possible to calculate the gate trajectories that smoothly carry the canal from the initial state to the final state by keeping the water depth constant at checkpoints. The paper shows the efficient implementation of GoRoSo in both the closure and opening operations of the canal delivery system. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
18. Multiple time scales in modeling the incidence of infections acquired in intensive care units.
- Author
-
Wolkewitz, Martin, Cooper, Ben S., Palomar-Martinez, Mercedes, Alvarez-Lerma, Francisco, Olaechea-Astigarraga, Pedro, Barnett, Adrian G., and Schumacher, Martin
- Subjects
INTENSIVE care units, INFECTION risk factors, NOSOCOMIAL infections, CRITICAL care medicine, HOSPITAL admission & discharge, DISEASE prevalence, METHICILLIN-resistant staphylococcus aureus, ALGORITHMS, COMPARATIVE studies, CROSS infection, LENGTH of stay in hospitals, MATHEMATICAL models, RESEARCH methodology, MEDICAL cooperation, RESEARCH, RESEARCH funding, RISK assessment, STAPHYLOCOCCAL diseases, TIME, THEORY, EVALUATION research, DISEASE incidence, PROPORTIONAL hazards models, PHYSIOLOGY - Abstract
Background: When patients are admitted to an intensive care unit (ICU) their risk of getting an infection will be highly depend on the length of stay at-risk in the ICU. In addition, risk of infection is likely to vary over calendar time as a result of fluctuations in the prevalence of the pathogen on the ward. Hence risk of infection is expected to depend on two time scales (time in ICU and calendar time) as well as competing events (discharge or death) and their spatial location. The purpose of this paper is to develop and apply appropriate statistical models for the risk of ICU-acquired infection accounting for multiple time scales, competing risks and the spatial clustering of the data.Methods: A multi-center data base from a Spanish surveillance network was used to study the occurrence of an infection due to Methicillin-resistant Staphylococcus aureus (MRSA). The analysis included 84,843 patient admissions between January 2006 and December 2011 from 81 ICUs. Stratified Cox models were used to study multiple time scales while accounting for spatial clustering of the data (patients within ICUs) and for death or discharge as competing events for MRSA infection.Results: Both time scales, time in ICU and calendar time, are highly associated with the MRSA hazard rate and cumulative risk. When using only one basic time scale, the interpretation and magnitude of several patient-individual risk factors differed. Risk factors concerning the severity of illness were more pronounced when using only calendar time. These differences disappeared when using both time scales simultaneously.Conclusions: The time-dependent dynamics of infections is complex and should be studied with models allowing for multiple time scales. For patient individual risk-factors we recommend stratified Cox regression models for competing events with ICU time as the basic time scale and calendar time as a covariate. 
The inclusion of calendar time and stratification by ICU allow ICU-level effects, such as local outbreaks or prevention interventions, to be accounted for indirectly. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
19. Estimation of RVoG Scene Parameters by Means of PolInSAR With TanDEM-X Data: Effect of the Double-Bounce Contribution.
- Author
-
Romero-Puig, Noelia, Lopez-Sanchez, Juan M., and Ballester-Berman, J. David
- Subjects
CROPS ,ALGORITHMS ,PADDY fields ,BISTATIC radar ,GROUND vegetation cover - Abstract
This article evaluates the effect of the double-bounce (DB) decorrelation term that appears in single-pass bistatic acquisitions, as in the TanDEM-X system, on the inversion of scene parameters by means of polarimetric SAR interferometry (PolInSAR). The retrieval of all scene parameters involved in the Random Volume over Ground (RVoG) model (i.e., ground topography, vegetation height, extinction, and ground-to-volume ratios) is affected by this term when the radar response from the ground is dominated by the DB. The estimation error in all these parameters is analyzed by means of simulations over a wide range of system configurations and scene variables for both agricultural crop and forest scenarios. Simulations demonstrate that the inclusion of the DB term, which complicates the inversion algorithm, is necessary for incidence angles shallower than 30° to achieve an estimation error below 10% in vegetation height and to avoid a significant underestimation of the ground-to-volume ratios. At steep incidences, this decorrelation term does not affect the estimation of vegetation height and ground-to-volume ratios. Regarding the extinction, this parameter is intrinsically not well estimated, since most retrieved values are close to the initial guesses employed for the optimization algorithm, regardless of whether the DB decorrelation term is used. Finally, these findings are compared with experimental results from TanDEM-X data acquired over rice fields in Spain for the available system parameters (baseline and incidence angle) of the acquired data set. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
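The RVoG inversion in entry 19 rests on the volume-only coherence of a vegetation layer with exponential extinction. As a rough, assumption-laden sketch (the standard RVoG volume integral evaluated numerically, not the authors' full inversion with the DB term, and with invented parameter values):

```python
import cmath
import math

def volume_coherence(height_m, extinction_np_per_m, kz_rad_per_m, inc_angle_rad, n=2000):
    """Numerically integrate the RVoG volume-only coherence:

    gamma_v = ∫ exp(2*sigma*z/cos(theta)) * exp(1j*kz*z) dz
              / ∫ exp(2*sigma*z/cos(theta)) dz,  for z in [0, height].
    """
    dz = height_m / n
    num = 0j
    den = 0.0
    for i in range(n):
        z = (i + 0.5) * dz                     # midpoint rule
        w = math.exp(2.0 * extinction_np_per_m * z / math.cos(inc_angle_rad))
        num += w * cmath.exp(1j * kz_rad_per_m * z) * dz
        den += w * dz
    return num / den

# A taller volume decorrelates more: |gamma_v| drops as vegetation height grows.
g_low  = volume_coherence(5.0,  0.1, 0.15, math.radians(35))
g_high = volume_coherence(20.0, 0.1, 0.15, math.radians(35))
```

The complex argument of `g_low` carries the volume's phase-center height, which is what makes coherence usable for vegetation-height retrieval.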
20. A genome-wide analysis of copy number variation in Murciano-Granadina goats.
- Author
-
Guan, Dailu, Martínez, Amparo, Castelló, Anna, Landi, Vincenzo, Luigi-Sierra, María Gracia, Fernández-Álvarez, Javier, Cabrera, Betlem, Delgado, Juan Vicente, Such, Xavier, Jordana, Jordi, and Amills, Marcel
- Subjects
GOAT breeds ,GOATS ,ATP-binding cassette transporters ,GENE targeting ,DNA copy number variations ,ALGORITHMS ,SECRETION ,GENETIC transduction - Abstract
Background: In this work, our aim was to generate a map of the copy number variations (CNV) segregating in a population of Murciano-Granadina goats, the most important dairy breed in Spain, and to ascertain the main biological functions of the genes that map to copy number variable regions. Results: Using a dataset that comprised 1036 Murciano-Granadina goats genotyped with the Goat SNP50 BeadChip, we were able to detect 4617 and 7750 autosomal CNV with the PennCNV and QuantiSNP software, respectively. By applying the EnsembleCNV algorithm, these CNV were assembled into 1461 CNV regions (CNVR), of which 486 (33.3% of the total CNVR count) were consistently called by PennCNV and QuantiSNP and used in subsequent analyses. In this set of 486 CNVR, we identified 78 gain, 353 loss and 55 gain/loss events. The total length of all the CNVR (95.69 Mb) represented 3.9% of the goat autosomal genome (2466.19 Mb), whereas their size ranged from 2.0 kb to 11.1 Mb, with an average size of 196.89 kb. Functional annotation of the genes that overlapped with the CNVR revealed an enrichment of pathways related with olfactory transduction (fold-enrichment = 2.33, q-value = 1.61 × 10⁻¹⁰), ABC transporters (fold-enrichment = 5.27, q-value = 4.27 × 10⁻⁴) and bile secretion (fold-enrichment = 3.90, q-value = 5.70 × 10⁻³). Conclusions: A previous study reported that the average number of CNVR per goat breed was ~20 (978 CNVR/50 breeds), which is much smaller than the number we found here (486 CNVR). We attribute this difference to the fact that the previous study included multiple caprine breeds that were represented by small to moderate numbers of individuals. Given the low frequencies of CNV (in our study, the average frequency of CNV is 1.44%), such a design would probably underestimate the levels of the diversity of CNV at the within-breed level. We also observed that functions related with sensory perception, metabolism and embryo development are overrepresented in the set of genes that overlapped with CNV, and that these loci often belong to large multigene families with tens, hundreds or thousands of paralogous members, a feature that could favor the occurrence of duplications or deletions by non-allelic homologous recombination. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
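Entry 20 keeps only the CNVRs consistently called by both PennCNV and QuantiSNP. A toy version of that consensus step (plain interval intersection on one chromosome, far simpler than the actual EnsembleCNV algorithm, with invented coordinates) looks like:

```python
def consensus_regions(calls_a, calls_b):
    """Return the intersections of two sorted, non-overlapping interval lists.

    Each call is a (start, end) tuple on the same chromosome; a region is kept
    only where both tools report a CNV, mirroring the idea of retaining CNVRs
    supported by two independent callers.
    """
    out, i, j = [], 0, 0
    while i < len(calls_a) and j < len(calls_b):
        lo = max(calls_a[i][0], calls_b[j][0])
        hi = min(calls_a[i][1], calls_b[j][1])
        if lo < hi:                        # the two calls overlap
            out.append((lo, hi))
        if calls_a[i][1] < calls_b[j][1]:  # advance the interval that ends first
            i += 1
        else:
            j += 1
    return out

# Hypothetical calls from two tools (coordinates in bp, made up):
penncnv   = [(100, 500), (900, 1200), (2000, 2600)]
quantisnp = [(300, 700), (1100, 1500)]
print(consensus_regions(penncnv, quantisnp))  # → [(300, 500), (1100, 1200)]
```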
21. Players’ selection for basketball teams, through Performance Index Rating, using multiobjective evolutionary algorithms.
- Author
-
Pérez-Toledano, Miguel Ángel, Rodriguez, Francisco J., García-Rubio, Javier, and Ibañez, Sergio José
- Subjects
EVOLUTIONARY algorithms ,BASKETBALL teams ,SPORTS competitions ,SPORTS administration ,BIOLOGICAL evolution ,DIFFERENTIAL evolution - Abstract
In any sport, the selection of players for a team is fundamental to its subsequent performance. Many factors condition the selection process, from the characteristics of the sport discipline to financial limitations, including a long list of restrictions associated with the environment of the competitions in which the team takes part. All of this makes the process of selecting a roster of players very complex, as it is affected by multiple variables and in many cases marked by a great deal of subjectivity. The purpose of this article was to objectively select the players for a basketball team using an evolutionary algorithm, the Non-dominated Sorting Genetic Algorithm II (NSGA-II), which uses stochastic search methods based on the imitation of natural biological evolution. The sample was composed of the players from the teams competing in the top Spanish basketball league, the Association of Basketball Clubs (ACB). To assess the quality of the solutions obtained, the results were compared with the ACB teams playing in the same competition as the players used in the study. The results make it possible to obtain different solutions for composing teams that render financial resources profitable and take into account the restrictions of the competition and of each sport's management. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
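Entry 21 builds rosters with NSGA-II, whose core step is non-dominated sorting. A minimal sketch of the first Pareto front for two objectives (maximise a performance rating, minimise salary cost), with made-up players rather than real ACB data:

```python
def pareto_front(players):
    """Return the non-dominated set when maximising rating and minimising cost.

    players: list of (name, performance_index_rating, salary_cost). Player p
    dominates q if p is at least as good on both objectives and strictly
    better on one; the survivors form the first front that NSGA-II's
    non-dominated sorting would build.
    """
    def dominates(p, q):
        better_rating = p[1] >= q[1]
        cheaper = p[2] <= q[2]
        strictly = p[1] > q[1] or p[2] < q[2]
        return better_rating and cheaper and strictly

    return [p for p in players
            if not any(dominates(q, p) for q in players if q is not p)]

# Hypothetical (name, rating, cost-in-millions) tuples:
roster = [("A", 18.0, 1.2), ("B", 15.0, 0.7), ("C", 14.0, 0.9), ("D", 18.0, 1.5)]
print([name for name, _, _ in pareto_front(roster)])  # → ['A', 'B']
```

C is dominated by B (worse rating, higher cost) and D by A (equal rating, higher cost), so only A and B survive; NSGA-II would then rank the remaining players into subsequent fronts.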
22. Photogrammetric Methodology for the Production of Geomorphologic Maps: Application to the Veleta Rock Glacier (Sierra Nevada, Granada, Spain).
- Author
-
de Matías, Javier, de Sanjosé, José Juan, López-Nicolás, Gonzalo, Sagüés, Carlos, and Guerrero, José Jesús
- Subjects
PHOTOGRAMMETRY ,GEOMORPHOLOGICAL mapping ,CARTOGRAPHY ,GLACIERS ,GEODETIC observations ,ALGORITHMS - Abstract
In this paper we present a stereo feature-based method using SIFT (Scale-invariant feature transform) descriptors. We use automatic feature extractors, matching algorithms between images and techniques of robust estimation to produce a DTM (Digital Terrain Model) using convergent shots of a rock glacier. The geomorphologic structure observed in this study is the Veleta rock glacier (Sierra Nevada, Granada, Spain). This rock glacier is of high scientific interest because it is the southernmost active rock glacier in Europe and it has been analyzed every year since 2001. The research on the Veleta rock glacier is devoted to the study of its displacement and cartography through geodetic and photogrammetric techniques. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
23. Survey of Visual and Force/Tactile Control of Robots for Physical Interaction in Spain.
- Author
-
Garcia, Gabriel J., Corrales, Juan A., Pomares, Jorge, and Torres, Fernando
- Subjects
ROBOTICS ,REMOTE sensing ,DETECTORS ,TACTILE sensors ,FORCE & energy ,TORQUE ,ALGORITHMS ,ARCHITECTURE - Abstract
Sensors provide robotic systems with the information required to perceive the changes that happen in unstructured environments and modify their actions accordingly. The robotic controllers which process and analyze this sensory information are usually based on three types of sensors (visual, force/torque and tactile) which identify the most widespread robotic control strategies: visual servoing control, force control and tactile control. This paper presents a detailed review on the sensor architectures, algorithmic techniques and applications which have been developed by Spanish researchers in order to implement these mono-sensor and multi-sensor controllers which combine several sensors. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
24. Humanoid robot RH-1 for collaborative tasks: a control architecture for human-robot cooperation.
- Author
-
Monje, Concepción A., Pierro, Paolo, and Balaguer, Carlos
- Subjects
ROBOTICS ,ROBOT kinematics ,ALGORITHMS ,VIRTUAL reality ,AUTONOMOUS robots ,UNIVERSITIES & colleges - Abstract
The full-scale humanoid robot RH-1 has been developed entirely at the University Carlos III of Madrid. In this paper we present an advanced control system for this robot so that it can perform tasks in cooperation with humans. The collaborative tasks are carried out in a semi-autonomous way and are intended to be put into operation in real working environments where humans and robots share the same space. Before presenting the control strategy, the kinematic model and a simplified dynamic model of the robot are presented. All the models and algorithms are verified by several simulations and experimental results. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
25. Supply Estimation Using Coevolutionary Genetic Algorithms in the Spanish Electrical Market.
- Author
-
De La Cal MarÍn, Enrique A. and Sánchez Ramos, Luciano
- Subjects
ALGORITHMS ,CONFIGURATIONS (Geometry) ,ELECTRICITY ,ELECTRIC utilities ,ELECTRIC generators ,ECONOMIC competition ,SUPPLY & demand ,ECONOMIC models ,ECONOMIC statistics - Abstract
The price of electrical energy in Spain has not been regulated by the government since 1998; instead, it is determined by the supply from the generators in a competitive market, the so-called "electrical pool". A genetic method for analyzing data from this new market is presented in this paper. The ultimate objective is to determine the individual supply curves of the competitive agents. Adopting the point of view of game theory, different genetic algorithm configurations using coevolutionary and non-coevolutionary strategies combined with scalar and multi-objective fitness are compared. The results obtained are a first step toward inferring the optimal individual strategies in the Spanish electrical market from data, in terms of perfect oligopolistic behavior. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
26. An Automatic Algorithm to Date the Reference Cycle of the Spanish Economy.
- Author
-
Camacho, Maximo, Gadea, María Dolores, and Gómez-Loscos, Ana
- Subjects
GAUSSIAN distribution ,BUSINESS cycles ,ECONOMIC indicators ,MARKOV processes ,ALGORITHMS ,RECESSIONS - Abstract
This paper provides an accurate chronology of the Spanish reference business cycle by adapting a multiple change-point model. In that approach, each combination of peaks and troughs dated in a set of economic indicators is assumed to be a realization of a mixture of bivariate Gaussian distributions, whose number of components is estimated from the data. The means of each of these components refer to the dates of the reference turning points. The transitions across the components of the mixture are governed by a Markov chain that is restricted to enforce left-to-right transition dynamics. In the empirical application, seven recessions in the period from February 1970 to February 2020 are identified, which are in high concordance with the timing of the turning point dates established by the Spanish Business Cycle Dating Committee (SBCDC). [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
27. Statistical Analysis and Machine Learning Prediction of Fog-Caused Low-Visibility Events at A-8 Motor-Road in Spain.
- Author
-
Cornejo-Bueno, Sara, Casillas-Pérez, David, Cornejo-Bueno, Laura, Chidean, Mihaela I., Caamaño, Antonio J., Cerro-Prada, Elena, Casanova-Mateo, Carlos, and Salcedo-Sanz, Sancho
- Subjects
STATISTICS ,MACHINE learning ,PARETO distribution ,MAXIMUM likelihood statistics ,ALGORITHMS ,PEARSON correlation (Statistics) - Abstract
This work presents a full statistical analysis and accurate prediction of low-visibility events due to fog at the A-8 motor-road in Mondoñedo (Galicia, Spain). The analysis covers two years of study, considering visibility time series and exogenous variables collected in the zone most affected by extreme low-visibility events. This paper thus has a two-fold objective: first, we carry out a statistical analysis to estimate the best-fitting probability distributions for the fog event duration, using the Maximum Likelihood method and an alternative method known as the L-moments method. This statistical study allows the low-visibility depth to be associated with the event duration, showing a clear relationship, which can be modeled with distributions for extremes such as the Generalized Extreme Value and Generalized Pareto distributions. Second, we apply a neural network approach, trained by means of the ELM (Extreme Learning Machine) algorithm, to predict the occurrence of low-visibility events due to fog from atmospheric predictive variables. This study provides a full characterization of fog events at this motor-road, in which orographic fog is predominant, causing important traffic problems throughout the year. We also show how the ELM approach is able to obtain highly accurate low-visibility event predictions, with a Pearson correlation coefficient of 0.8, within a half-hour time horizon, enough to initialize protocols aiming at reducing the impact of these extreme events on the traffic of the A-8 motor-road. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
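Entry 27 fits extreme-value distributions to fog-event durations using the L-moments method. A minimal sketch of a Generalized Pareto fit via sample L-moments (Hosking's estimators and moment relations, checked on simulated exponential data rather than the paper's visibility series):

```python
import random

def sample_l_moments(xs):
    """First two sample L-moments (Hosking's unbiased estimators)."""
    xs = sorted(xs)
    n = len(xs)
    b0 = sum(xs) / n
    b1 = sum((i / (n - 1)) * x for i, x in enumerate(xs)) / n
    return b0, 2 * b1 - b0          # l1 (L-mean), l2 (L-scale)

def fit_gpd_lmom(excesses):
    """Method-of-L-moments fit of a Generalized Pareto to threshold excesses.

    Uses the relations l1 = alpha/(1+k) and l2 = alpha/((1+k)(2+k)),
    giving shape k = l1/l2 - 2 and scale alpha = l1*(1+k); k > 0 means a
    bounded tail and k = 0 the exponential limit (Hosking's convention).
    """
    l1, l2 = sample_l_moments(excesses)
    k = l1 / l2 - 2.0
    alpha = l1 * (1.0 + k)
    return k, alpha

# Sanity check on the exponential limit (k = 0): simulated, not real fog data.
random.seed(42)
data = [random.expovariate(1.0 / 3.0) for _ in range(20000)]  # mean duration 3.0
k_hat, alpha_hat = fit_gpd_lmom(data)
```

For an exponential sample with mean 3, l1 ≈ 3 and l2 ≈ 1.5, so the fitted shape should be near 0 and the scale near 3.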
28. Corneal Stability following Hyperopic LASIK with Advanced Laser Ablation Profiles Analyzed by a Light Propagation Study.
- Author
-
Gharaibeh, Almutez M., Villanueva, Asier, Mas, David, Espinosa, Julian, and Alió, Jorge L.
- Subjects
CORNEA physiology ,ALGORITHMS ,CORNEAL topography ,HYPEROPIA ,SCIENTIFIC observation ,POSTOPERATIVE period ,REGRESSION analysis ,SURGEONS ,VISUAL acuity ,LASIK ,STATISTICAL reliability ,RETROSPECTIVE studies - Abstract
Purpose. To assess anterior corneal surface stability 12 months following hyperopic LASIK correction with a light propagation algorithm. Setting. Vissum Instituto Oftalmológico de Alicante, Universidad Miguel Hernández, Alicante, Spain. Methods. This retrospective consecutive observational study includes 37 eyes of 37 patients treated with a 6th-generation excimer laser platform (Schwind Amaris). Hyperopic LASIK was performed in all of them by the same surgeon (JLA), and all completed the 12-month follow-up. Corneal topography was analyzed with a light propagation algorithm to assess the stability of the corneal outcomes over one year of follow-up. Results. Between three and twelve months postoperatively, an objective corneal power (OCP) regression of 0.39 D and 0.41 D was found for the 6 mm and 9 mm central corneal zones, respectively. Subjective outcomes at the end of the follow-up period were as follows: 65% of eyes had a spherical equivalent within ±0.50 D; 70% of eyes had an uncorrected distance visual acuity of 20/20 or better; 86% of eyes had the same or better corrected distance visual acuity. In terms of stability, 0.14 D of regression was found. No statistically significant differences were found for any of the study parameters evaluated at different postoperative moments over the 12-month period. Conclusions. Light propagation analysis confirms corneal surface stability following modern hyperopic LASIK with 6th-generation excimer laser technology over a 12-month period. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
29. Strategic noise map of a major road carried out with two environmental prediction software packages.
- Author
-
Arana, M., Martin, R. San, Martin, M. L. San, and Aramendía, E.
- Subjects
NOISE ,ENVIRONMENTAL monitoring ,VEGETATION monitoring ,QUANTITATIVE research ,INTEGRATED software ,ALGORITHMS ,COMPUTER programming ,COMPUTER software - Abstract
The main objective of this study is to analyze the differences found in the results of noise mapping using two of the most popular software packages for the prediction of environmental noise. The location selected for the comparative study is the area encompassed by the ring road that surrounds the city of Pamplona, evaluated on a grid with a total of approximately 6 × 10⁵ points. In fact, and as the Environmental Noise Directive points out, it is a major road designated by a Member State (Spain). Configuration of the calculation parameters (discretization of the sources, ground absorption, reflection order, etc.) was made as equivalent as the programs allow. In spite of that, a great number of differences appear in the findings. Although in 95.5% of the points the difference in the noise level calculated by the two programs was less than 3 dB, this general statistic concealed some great differences. These are due to the various algorithms that the programs implement to evaluate noise levels. Most differences pertain to highly screened or remote receivers. In the former, the algorithm of visibility is the main cause of such differences. In the latter, differences are mainly brought about by a different implementation of propagation under homogeneous and favorable atmospheric conditions in the two software systems. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
30. Predicting the onset of hazardous alcohol drinking in primary care: development and validation of a simple risk algorithm.
- Author
-
Bellón, Juan Ángel, de Dios Luna, Juan, King, Michael, Nazareth, Irwin, Motrico, Emma, GildeGómez-Barragán, María Josefa, Torres-González, Francisco, Montón-Franco, Carmen, Sánchez-Celaya, Marta, Díaz-Barreiros, Miguel Ángel, Vicens, Catalina, and Moreno-Peral, Patricia
- Subjects
ALCOHOL drinking ,PRIMARY care ,CLINICAL prediction rules ,ALGORITHMS ,CHILD sexual abuse ,SMOKING ,PREVENTION of alcoholism ,PSYCHOLOGY of alcoholism ,ALCOHOLISM ,COMPARATIVE studies ,LONGITUDINAL method ,RESEARCH methodology ,MEDICAL cooperation ,PRIMARY health care ,PROGNOSIS ,QUESTIONNAIRES ,RESEARCH ,RISK assessment ,EVALUATION research ,BEHAVIOR disorders - Abstract
Background: Little is known about the risk of progressing to hazardous alcohol use in abstinent or low-risk drinkers. Aim: To develop and validate a simple brief risk algorithm for the onset of hazardous alcohol drinking (HAD) over 12 months for use in primary care. Design and Setting: Prospective cohort study in 32 health centres from six Spanish provinces, with evaluations at baseline, 6 months, and 12 months. Method: Forty-one risk factors were measured and multilevel logistic regression and inverse probability weighting were used to build the risk algorithm. The outcome was new occurrence of HAD during the study, as measured by the AUDIT. Results: From the lists of 174 GPs, 3954 adult abstinent or low-risk drinkers were recruited. The 'predictAL-10' risk algorithm included just nine variables (10 questions): province, sex, age, cigarette consumption, perception of financial strain, having ever received treatment for an alcohol problem, childhood sexual abuse, AUDIT-C, and interaction AUDIT-C*Age. The c-index was 0.886 (95% CI = 0.854 to 0.918). The optimal cutoff had a sensitivity of 0.83 and specificity of 0.80. Excluding childhood sexual abuse from the model (the 'predictAL-9'), the c-index was 0.880 (95% CI = 0.847 to 0.913), sensitivity 0.79, and specificity 0.81. There was no statistically significant difference between the c-indexes of predictAL-10 and predictAL-9. Conclusion: The predictAL-10/9 is a simple and internally valid risk algorithm to predict the onset of hazardous alcohol drinking over 12 months in primary care attendees; it is a brief tool that is potentially useful for primary prevention of hazardous alcohol drinking. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
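The predictAL validation in entry 30 reports a c-index of 0.886. For a binary outcome the c-index reduces to the AUC and can be computed directly from concordant case/control pairs; the scores below are invented for illustration:

```python
def c_index(scores, outcomes):
    """Concordance (c) statistic for a binary outcome.

    The probability that a randomly chosen case (outcome 1) receives a higher
    risk score than a randomly chosen non-case; ties count one half. For a
    binary outcome this equals the AUC.
    """
    cases = [s for s, y in zip(scores, outcomes) if y == 1]
    controls = [s for s, y in zip(scores, outcomes) if y == 0]
    pairs = len(cases) * len(controls)
    wins = sum(1.0 if c > d else 0.5 if c == d else 0.0
               for c in cases for d in controls)
    return wins / pairs

# Hypothetical predicted risks and observed 12-month HAD onsets:
scores   = [0.9, 0.8, 0.6, 0.4, 0.3, 0.1]
outcomes = [1,   1,   0,   1,   0,   0]
c = c_index(scores, outcomes)   # 8 of the 9 case-control pairs are concordant, so c = 8/9
```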
31. A Novel Information Theoretical Criterion for Climate Network Construction.
- Author
-
Cornejo-Bueno, Sara, Chidean, Mihaela I., Caamaño, Antonio J., Prieto-Godino, Luis, and Salcedo-Sanz, Sancho
- Subjects
WIND speed ,CLIMATOLOGY ,CONSTRUCTION ,ALGORITHMS ,FORECASTING ,WIND power plants - Abstract
This paper presents a novel methodology for Climate Network (CN) construction based on the Kullback-Leibler divergence (KLD) among Membership Probability (MP) distributions, obtained from the Second Order Data-Coupled Clustering (SODCC) algorithm. The proposed method is able to obtain CNs with emergent behaviour adapted to the variables being analyzed, and with a low number of spurious or missing links. We evaluate the proposed method on a problem of CN construction to assess differences in wind speed prediction at different wind farms in Spain. The considered problem presents strong local and mesoscale relationships, but weak synoptic-scale relationships, which have a direct influence on the CN obtained. We carry out a comparison of the proposed approach with a classical correlation-based CN construction method. We show that the proposed approach based on the SODCC algorithm and the KLD constructs CNs with an emergent behaviour consistent with the physics underlying the wind speed prediction data, unlike the correlation-based method, which produces spurious and missing links. Furthermore, it is shown that the climate network construction method facilitates the evaluation of symmetry properties in the resulting complex networks. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
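Entry 31 links wind farms by the Kullback-Leibler divergence between membership-probability distributions. A minimal sketch of that building block (the symmetrisation step is our assumption for illustration, not necessarily the paper's exact criterion, and the distributions are invented):

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q), in nats, between two discrete
    membership-probability distributions (assumed normalised); eps guards
    against zeros in q."""
    return sum(pi * math.log(pi / max(qi, eps)) for pi, qi in zip(p, q) if pi > 0)

def symmetric_kld(p, q):
    """KLD is asymmetric, so a link criterion would typically symmetrise it
    before thresholding candidate network edges."""
    return kl_divergence(p, q) + kl_divergence(q, p)

# Two hypothetical membership-probability distributions for two wind farms:
p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]
d = symmetric_kld(p, q)   # small d → similar farms → candidate network link
```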
32. Effect of country-of-origin contextual factors and length of stay on immigrants' substance use in Spain.
- Author
-
Sordo, L., Indave, B. I., Vallejo, F., Belza, M. J., Sanz-Barbero, B., Rosales-Statkus, M., Fernández-Balbuena, S., and Barrio, G.
- Subjects
SUBSTANCE abuse ,ALGORITHMS ,CONFIDENCE intervals ,EMIGRATION & immigration ,LENGTH of stay in hospitals ,POISSON distribution ,RESEARCH funding ,STATISTICAL sampling ,DATA analysis software ,DESCRIPTIVE statistics - Abstract
Background: Factors explaining disparities in the risk of substance use between immigrants and natives, and between immigrant subgroups, are poorly understood. We aimed to describe such disparities and identify some explanatory factors in Spain. Methods: Participants were residents aged 15-64 years from the 2005-07 nationally representative surveys. Outcomes were prevalences of alcohol, tobacco, sedative-hypnotic, cannabis and other illegal substance use. Immigrants were classed as recent if <5 years of Spanish stay and long term if ≥10 years. Country-of-origin income per capita and population level of substance use were taken from international databases. Adjusted prevalence ratios (aPRs) and percent change from Poisson regression with robust variance were used to estimate risk disparities and effects of immigration variables. Results: Most immigrants had lower substance use than natives, although it generally increased with increasing Spanish stay, especially for illegal substances. This lower risk could be partially explained by country-of-origin contextual factors, such as a lower level of income or substance use, and religious or cultural factors such as Islam. By origin, recent immigrant aPRs and convergence-divergence risk patterns were, respectively, as follows: lower aPRs with upward convergence (often incomplete) toward natives' risk in immigrants from the Muslim area, Eastern Europe and Latin America excluding the South Cone; lower/similar aPRs with upward overtaking or divergent patterns in South-Cone Americans; and similar/higher aPRs with stable or upward divergent patterns in non-Eastern Europeans. Conclusion: Spain is a host context that seems to facilitate increased substance use among immigrants, even those from countries with prevalences close to Spain's. However, country-of-origin context is important in explaining disparities in substance use among immigrants. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
33. Local implementation of a syndromic influenza surveillance system using emergency department data in Santander, Spain.
- Author
-
Schrell, S., Ziemann, A., Garcia-Castrillo Riesgo, L., Rosenkötter, N., Llorca, J., Popa, D., and Krafft, T.
- Subjects
EARLY medical intervention ,PUBLIC health surveillance ,ALGORITHMS ,STATISTICAL correlation ,HOSPITAL emergency services ,RESEARCH methodology ,POISSON distribution ,REACTION time ,RESEARCH funding ,TIME series analysis ,SYSTEMS development ,PREDICTIVE validity ,RETROSPECTIVE studies ,RECEIVER operating characteristic curves ,SEASONAL influenza ,MEDICAL coding - Abstract
Background: We assessed the local implementation of syndromic surveillance (SyS) as part of the European project ‘System for Information on, Detection and Analysis of Risks and Threats to Health’ in Santander, Spain. Methods: We applied a cumulative sum algorithm to emergency department (ED) chief complaints for influenza-like illness in the seasons 2010–11 and 2011–12. We fine-tuned the algorithm using a receiver operating characteristic analysis to identify the optimal trade-off between sensitivity and specificity, and defined alert criteria. We assessed the timeliness of the SyS system in detecting the onset of the influenza season. Results: The ED data correlated with the sentinel data. With the best algorithm settings we achieved 70/63% sensitivity and 89/95% specificity for 2010–11/2011–12. At least 2 consecutive days of signals defined an alert. In 2010–11 the SyS system alerted 1 week before the sentinel system, and in 2011–12 in the same week. Data from the ED are available on a daily basis, providing an advantage in timeliness compared with the weekly sentinel data. Conclusions: ED-based SyS in Santander complements sentinel influenza surveillance by providing timely information. Local fine-tuning and definition of alert criteria are recommended to enhance validity. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
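Entry 33's detector is a cumulative sum (CUSUM) on daily ED counts, with an alert raised after at least 2 consecutive signal days. A hedged sketch with illustrative reference value `k` and threshold `h` (not the tuned Santander settings, and with invented counts):

```python
def cusum_alerts(counts, baseline, k=0.5, h=3.0, run=2):
    """One-sided CUSUM on daily ILI chief-complaint counts.

    S_t = max(0, S_{t-1} + (x_t - baseline_t - k)); a day signals when
    S_t > h, and an alert is raised once `run` consecutive days have
    signalled, matching the 'at least 2 consecutive days' rule.
    """
    s, streak, alerts = 0.0, 0, []
    for day, (x, mu) in enumerate(zip(counts, baseline)):
        s = max(0.0, s + (x - mu - k))
        streak = streak + 1 if s > h else 0
        if streak == run:               # first day the alert criterion is met
            alerts.append(day)
    return alerts

baseline = [5.0] * 10                       # expected daily ILI visits
counts   = [5, 6, 4, 9, 10, 11, 12, 5, 5, 5]  # an outbreak starts on day 3
print(cusum_alerts(counts, baseline))       # → [4]
```

The statistic first crosses `h` on day 3 and the 2-day rule fires on day 4, one day later: the price paid for the reduced false-alarm rate.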
34. Abnormal quality detection and isolation in water distribution networks using simulation models.
- Author
-
Nejjari, F., Pérez, R., Puig, V., Quevedo, J., Sarrate, R., Cugueró, M. A., Sanz, G., and Mirats, J. M.
- Subjects
WATER distribution ,ALGORITHMS ,CHLORINE ,RESIDUAL charges - Abstract
The article discusses the identification of abnormal quality for water distribution networks through a fault isolation algorithm in Barcelona, Spain. It notes that chlorine measurements and sensitivity analysis are the basis for a localization method on distribution. It mentions that a fault sensitivity matrix can be correlated with residual charges by the algorithm.
- Published
- 2012
- Full Text
- View/download PDF
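Entry 34 localises abnormal-quality faults by correlating chlorine-sensor residuals with a fault sensitivity matrix. A toy sketch of that idea, with an invented 3-sensor, 3-fault matrix rather than the Barcelona network's:

```python
def pearson(u, v):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return cov / (su * sv)

def isolate_fault(residuals, sensitivity):
    """Pick the candidate fault whose sensitivity-matrix column correlates
    best with the observed residual vector.

    sensitivity[i][j]: expected effect of fault j on sensor i.
    """
    n_faults = len(sensitivity[0])
    cols = [[row[j] for row in sensitivity] for j in range(n_faults)]
    scores = [pearson(residuals, col) for col in cols]
    return max(range(n_faults), key=lambda j: scores[j])

S = [[1.0, 0.1, 0.0],
     [0.8, 0.9, 0.1],
     [0.2, 0.7, 1.0]]
r = [0.95, 0.85, 0.30]          # observed residuals resemble column 0
print(isolate_fault(r, S))      # → 0
```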
35. Extreme normalised residuals of daily temperatures in Catalonia (NE Spain): sampling strategies, return periods and clustering process.
- Author
-
Serra, C., Martínez, M. D., Lana, X., and Burgueño, A.
- Subjects
STATISTICAL maps ,TEMPERATURE ,ALGORITHMS - Abstract
Extreme normalised residuals, defined as departures from the average values, of 65 daily maximum (Tmax) and minimum (Tmin) temperature series recorded in Catalonia (NE Spain) during 1950–2004 are analysed. Similarly to the sampling strategies applied to long dry spells, the partial duration series (PDS) offer some advantages in comparison with the annual extreme series. Instead of using a common percentile threshold for all temperature series, PDS are chosen according to the mean excess plot procedure. Series of extreme residuals are modelled, in terms of the L-moments formulation, by the generalised Pareto distribution. Extreme residuals of Tmax and Tmin are estimated for return periods ranging from 2 to 50 years and their spatial distribution is represented for selected return periods of 2, 5, 10, 25 and 50 years. Two daily extreme temperature events, a hot episode (in August) and a cold episode (in February), are simulated taking into account the average Tmax (Tmin) for a day in August (February), their standard deviations and the extremes for a 50-year return period. Both simulations are compared with outstanding real episodes recorded on August 13th 2003 and February 11th 1956. Additionally, a spatial regionalisation of Catalonia into several clusters, in terms of the extreme residuals for return periods from 2 to 50 years, is done. A principal component analysis is applied to the extreme residual curves characterising every temperature series and, using the principal components as variables, the regionalisation is obtained by applying the average linkage clustering algorithm. Finally, each cluster is characterised by its average extreme residual curve for return periods ranging from 2 to 50 years at 1-year intervals. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
36. Effects of climate change on the distribution of Iberian tree species.
- Author
-
Benito Garzón, Marta, Sánchez de Dios, Rut, and Sainz Ollero, Helios
- Subjects
CLIMATE change ,MULTIPURPOSE trees ,FOREST dynamics ,FOREST ecology ,ALGORITHMS - Abstract
Question: Will the predicted climate changes affect species distribution in the Iberian Peninsula? Location: Iberian Peninsula (Spain and Portugal). Methods: We modelled current and future tree distributions as a function of climate, using a computational framework that made use of one machine learning technique, the random forest (RF) algorithm. This algorithm provided good predictions of the current distribution of each species, as shown by the area under the corresponding receiver operating characteristics (ROC) curves. Species turnover, richness and the change in distributions over time to 2080 under four Intergovernmental panel on climate change (IPCC) scenarios were calculated using the species map outputs. Results and Conclusions: The results show a notable reduction in the potential distribution of the studied species under all the IPCC scenarios, particularly so for mountain conifer species such as Pinus sylvestris, P. uncinata and Abies alba. Temperate species, especially Fagus sylvatica and Quercus petraea, were also predicted to suffer a reduction in their range; also sub-mediterranean species, especially Q. pyrenaica, were predicted to undergo notable decline. In contrast, typically Mediterranean species appeared to be generally more capable of migration, and are therefore likely to be less affected. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
37. A New Methodology to Study Street Accessibility: A Case Study of Avila (Spain).
- Author
-
Curado, Manuel, Rodriguez, Rocio, Jimenez, Manuel, Tortosa, Leandro, and Vicent, Jose F.
- Subjects
ALGORITHMS ,MUNICIPAL services ,ECONOMIC models ,ECONOMIC impact ,FACTOR structure ,STREETS ,LOCAL transit access - Abstract
Taking into account that accessibility is one of the most strategic and determining factors in economic models, and that accessibility and tourism affect each other, the study and improvement of one of them involves the development of the other. Using network analysis, this study presents an algorithm for labeling the difficulty of the streets of a city using different accessibility parameters. We combine network structure and accessibility factors to explore the association between innovative behavior within the street network and the relationships with the commercial activity in a city. Finally, we present a case study of the city of Avila, locating the most inaccessible areas of the city using centrality measures and analyzing the effects, in terms of accessibility, on the commerce and services of the city. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
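The idea of locating poorly accessible areas with centrality measures can be sketched on a toy street graph: intersections are nodes, and a low closeness centrality flags hard-to-reach areas. This is not the paper's algorithm, only an illustration; the graph and node names are invented.

```python
# Toy sketch: closeness centrality on a hypothetical street network.
# Low closeness = far from everything else = a candidate inaccessible area.
from collections import deque

# Invented street intersections and their connections (undirected graph).
streets = {
    "plaza": ["north", "south", "east"],
    "north": ["plaza", "gate"],
    "south": ["plaza"],
    "east": ["plaza", "gate"],
    "gate": ["north", "east"],
}

def closeness(graph, source):
    """Closeness centrality: (n - 1) / sum of shortest-path distances (BFS)."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nbr in graph[node]:
            if nbr not in dist:
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    total = sum(d for node, d in dist.items() if node != source)
    return (len(graph) - 1) / total if total else 0.0

scores = {node: closeness(streets, node) for node in streets}
least_accessible = min(scores, key=scores.get)
print(least_accessible)  # "south": the dead end off the plaza scores lowest
```

The paper additionally weights streets by accessibility parameters (slope, width, obstacles); in a sketch like this, those would become edge weights in the shortest-path computation.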
38. Contribution of Driving Efficiency to Vehicle-to-Building.
- Author
-
Borge-Diez, David, Ortega-Cabezas, Pedro Miguel, Colmenar-Santos, Antonio, and Blanes-Peiró, Jorge Juan
- Subjects
APPLICATION program interfaces ,ELECTRIC power consumption ,SOCIAL groups ,BRAIN-computer interfaces ,ALGORITHMS - Abstract
Energy consumption in the transport sector and in buildings is of great concern. This research aims to quantify how eco-routing, eco-driving and eco-charging can increase the amount of energy available for vehicle-to-building. To do this, the working population was divided into social groups (freelancers, local workers and commuters) residing in two cities in different climate zones (Alcalá de Henares, Spain and Jaén, Spain), since each group uses electric vehicles differently. An algorithm based on the Here® application program interface and neural networks was implemented to acquire data on the stochastic usage of EVs by each social group. Finally, the increase in the amount of energy available for vehicle-to-building was assessed using the algorithm. The results per day were as follows: the proposed algorithm yielded a reduction ranging from 0.6 kWh to 2.2 kWh, depending on the social group, and facilitated an increase in energy available for vehicle-to-building ranging from 13.2 kWh to 33.6 kWh, depending on the social group. The results show that current charging policies are not compatible with all social groups and do not consider the renewable energy contribution to the total electricity demand. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
39. Free journal-ranking tool enters citation market.
- Author
-
Butler, Declan
- Subjects
DATABASES ,INTERNET ,STATISTICS ,BIBLIOGRAPHICAL citations ,ALGORITHMS ,RESEARCH ,DATA mining - Abstract
The article reports on the launch of an Internet database, called the SCImago Journal & Country Rank database, allowing users to generate on-the-fly citation statistics of published research papers for free. The open-access database calculates papers' impact factors using an algorithm. It is collaborating with Amsterdam-based science publisher Elsevier. SCImago is a data-mining and visualization group in Spain. The group ranks journals and countries using citation metrics such as the popular Hirsch index. It also includes the SCImago Journal Rank (SJR).
- Published
- 2008
- Full Text
- View/download PDF
40. Clinical validation of automatic local activation time annotation during focal premature ventricular complex ablation procedures.
- Author
-
Acosta, Juan, Soto-Iglesias, David, Fernández-Armenta, Juan, Frutos-López, Manuel, Jáuregui, Beatriz, Arana-Rueda, Eduardo, Fernández, Marcos, Penela, Diego, Alcaine, Alejandro, Cano, Lucas, Pedrote, Alonso, and Berruezo, Antonio
- Subjects
ARRHYTHMIA diagnosis ,ACTION potentials ,ALGORITHMS ,ARRHYTHMIA ,CATHETER ablation ,COMPARATIVE studies ,HEART beat ,HEART function tests ,RESEARCH methodology ,MEDICAL cooperation ,RESEARCH ,RESEARCH evaluation ,SIGNAL processing ,TIME ,EVALUATION research ,TREATMENT effectiveness ,PREDICTIVE tests - Abstract
Aims: Current navigation systems incorporate algorithms for automatic identification of local activation time (LAT). However, data about their utility and accuracy in premature ventricular complex (PVC) ablation procedures are scarce. This study analyses the accuracy of an algorithmic method based on automatic annotation of the maximal negative slope of the unipolar electrogram within the window demarcated by the bipolar electrogram, compared with conventional manual annotation, during PVC ablation procedures.Methods and results: Forty patients with successful ablation of focal PVC in three centres were included. Electroanatomical activation maps obtained with the automatic system (WF-map) were compared with manual annotation maps (M-map). Correlation and concordance of LAT obtained with both methods were assessed at 3536 points. The distance between the earliest activation site (EAS) and the effective radiofrequency application point (e-RFp) was determined in the M-map and WF-map. The distance between WF-EAS and M-EAS was assessed. Successful ablation sites included left ventricular outflow tract (LVOT; 55%), right ventricular outflow tract (40%), and tricuspid annulus (5%). Good correlation was observed between the two annotation approaches (r = 0.655; P < 0.0001). Bland-Altman analysis revealed a systematically delayed detection of LAT by the WF-map (bias 33.8 ± 30.9 ms), which was higher in the LVOT than in the right ventricle (42.6 ± 29.2 vs. 27.2 ± 30.5 ms, respectively; P < 0.0001). No difference in EAS-eRFp distance was observed between M-map and WF-map (1.8 ± 2.8 vs. 1.8 ± 3.4 mm, respectively; P = 0.986). The median (interquartile range) distance between WF-EAS and M-EAS was 2.2 (0-6) mm.Conclusion: Good correlation was found between M-map and WF-map. Local activation time detection was systematically delayed in WF-map, especially in LVOT. Accurate identification of e-RFp was achieved with both annotation approaches. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
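The Bland-Altman comparison this abstract reports (a systematic bias between automatic and manual LAT annotation) works as follows: the bias is the mean of the pairwise differences, and the limits of agreement are bias ± 1.96 × SD of those differences. A minimal sketch with invented toy LAT values, not the study data:

```python
# Sketch of a Bland-Altman agreement analysis between two annotation methods.
# The LAT values below are toy numbers, not from the study.
import numpy as np

manual_lat = np.array([10.0, 22.0, 35.0, 41.0, 55.0])  # ms, hypothetical M-map
auto_lat   = np.array([44.0, 54.0, 70.0, 73.0, 90.0])  # ms, hypothetical WF-map

diff = auto_lat - manual_lat           # automatic minus manual, per point
bias = diff.mean()                     # systematic offset (here: a delay)
sd = diff.std(ddof=1)                  # sample SD of the differences
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
print(f"bias = {bias:.1f} ms, limits of agreement = {loa[0]:.1f} to {loa[1]:.1f} ms")
```

A consistently positive bias, as in the study's 33.8 ± 30.9 ms, means the automatic method annotates LAT later than the manual reference; a clinician would subtract (or tolerate) that offset rather than treat the maps as interchangeable.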