67 results for "Francisco-Javier Gimeno-Blanes"
Search Results
2. Sentiment Analysis of Political Tweets From the 2019 Spanish Elections
- Author
- Margarita Rodriguez-Ibanez, Francisco-Javier Gimeno-Blanes, Pedro Manuel Cuenca-Jimenez, Cristina Soguero-Ruiz, and Jose Luis Rojo-Alvarez
- Subjects
- Sentiment analysis, twitter, social networking sites, text analysis, lexicon, politics, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
The use of sentiment analysis methods has increased in recent years across a wide range of disciplines. Despite the potential impact of opinion formation during political elections, few studies have focused on the analysis of sentiment dynamics and their characterization from statistical and mathematical perspectives. In this paper, we apply a set of basic methods to analyze the statistical and temporal dynamics of sentiment in political campaigns and assess their scope and limitations. To this end, we gathered thousands of Twitter messages mentioning political parties and their leaders, posted several weeks before and after the 2019 Spanish general election. We then followed a twofold analysis strategy: (1) statistical characterization using indices derived from well-known temporal and information metrics, including entropy, mutual information, and the Compounded Aggregated Positivity Index, which allow the estimation of changes in the density function of sentiment data (a minimal numerical sketch of these indices follows this entry); and (2) feature extraction from nonlinear intrinsic patterns in terms of manifold learning, using autoencoders and stochastic embeddings. The results show that both the indices and the manifold features provide an informative characterization of the sentiment dynamics throughout the election period. We found measurable variations in sentiment behavior and polarity across the political parties and their leaders, and we observed different dynamics depending on the parties' positions on the political spectrum, their presence at the regional or national level, and their nationalist or globalist aspirations.
- Published
- 2021
- Full Text
- View/download PDF
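The information indices named in this abstract (entropy and mutual information; the Compounded Aggregated Positivity Index is the authors' own construction and is not reproduced here) reduce to a few lines of NumPy once the score series is histogram-binned. A minimal sketch on synthetic scores in [-1, 1]; the bin count, range, and data are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def entropy_bits(x, bins=20):
    """Shannon entropy (bits) of a score series via histogram binning."""
    p, _ = np.histogram(x, bins=bins, range=(-1, 1))
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information_bits(x, y, bins=20):
    """MI(X;Y) in bits from a joint 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins, range=[(-1, 1), (-1, 1)])
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask]))

rng = np.random.default_rng(0)
scores_a = np.tanh(rng.normal(0.1, 0.5, 5000))                 # synthetic party A scores
scores_b = np.tanh(0.6 * scores_a + rng.normal(0, 0.4, 5000))  # correlated party B scores

print(f"H(A)    = {entropy_bits(scores_a):.2f} bits")
print(f"MI(A;B) = {mutual_information_bits(scores_a, scores_b):.2f} bits")
```

A drop in entropy across the election window would indicate sentiment concentrating around fewer score values; nonzero mutual information between two parties' series indicates shared dynamics.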
3. On the Statistical and Temporal Dynamics of Sentiment Analysis
- Author
- Margarita Rodriguez-Ibanez, Francisco-Javier Gimeno-Blanes, Pedro Manuel Cuenca-Jimenez, Sergio Munoz-Romero, Cristina Soguero, and Jose Luis Rojo-Alvarez
- Subjects
- Sentiment analysis, machine learning techniques, sentiment dictionaries, social networking, public opinions, Twitter, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
Despite the broad interest in and use of sentiment analysis nowadays, most of the conclusions in the current literature are driven by simple statistical representations of sentiment scores. On that basis, sentiment evaluation currently consists of encoding and aggregating emotional information from a number of individuals and their population trends. We hypothesized that the stochastic processes that sentiment analysis systems aim to measure will exhibit nontrivial statistical and temporal properties. We established an experimental setup consisting of analyzing the short text messages (tweets) of six user groups of different natures (universities, politics, musicians, communication media, technological companies, and financial companies), each group including ten users with high-intensity, regular generation of traffic on social networks. Statistical descriptors were checked to converge at about 2000 messages per user, for which messages from the last two weeks were compiled using a custom-made tool. The messages were subsequently processed for sentiment scoring with different lexicons currently available and widely used. Not only were the temporal dynamics of the resulting score time series per user scrutinized, but also their statistical description as given by the score histogram, the temporal autocorrelation, the entropy, and the mutual information (autocorrelation is sketched after this entry). Our results showed that the actual dynamic range of lexicons is in general moderate, and hence not much resolution is given within their end-of-scales. We found that seasonal patterns were more present in the time evolution of the number of tweets, but to a much lesser extent in the sentiment intensity. Additionally, we found that the presence of retweets added negligible effects over standard statistical modes, while it hindered informational and temporal patterns. The Compounded Aggregated Positivity Index developed in this work proved to be characteristic of industries and, at the same time, an interesting way to identify singularities among peers. We conclude that the temporal properties of messages provide information about the sentiment dynamics, which differs across lexicons and users, but commonalities can be exploited in this field using appropriate temporal digital processing tools.
- Published
- 2020
- Full Text
- View/download PDF
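The abstract's contrast between seasonality in tweet volume and in sentiment intensity comes down to comparing sample autocorrelations at the daily lag. A sketch on synthetic hourly series over two weeks (the data and the 24-hour framing are assumptions for illustration):

```python
import numpy as np

def autocorr(x, max_lag):
    """Biased sample autocorrelation of a 1-D series, lags 0..max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    full = np.correlate(x, x, mode="full")[x.size - 1:]
    return full[:max_lag + 1] / full[0]

rng = np.random.default_rng(1)
hours = np.arange(24 * 14)                                    # two weeks, hourly bins
tweets = 50 + 30 * np.sin(2 * np.pi * hours / 24) + rng.poisson(5, hours.size)
sentiment = 0.05 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.3, hours.size)

print(f"24 h autocorrelation, tweet volume: {autocorr(tweets, 48)[24]:+.2f}")
print(f"24 h autocorrelation, sentiment:    {autocorr(sentiment, 48)[24]:+.2f}")
```

With these toy amplitudes the volume series shows a strong daily peak while the sentiment series barely does, mirroring the reported finding.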
4. Towards Organization Management Using Exploratory Screening and Big Data Tests: A Case Study of the Spanish Red Cross
- Author
- Margarita Rodriguez-Ibanez, Sergio Munoz-Romero, Cristina Soguero-Ruiz, Francisco-Javier Gimeno-Blanes, and Jose Luis Rojo-Alvarez
- Subjects
- Big Data, machine learning, organization management, organization efficiency, prediction model, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
With the emergence of information and communication technologies, a large amount of data has become available to organizations, which creates expectations about its value and content for management purposes. However, the exploratory analysis of available organizational data based on emerging Big Data technologies is still developing in terms of operative tools for solid and interpretable data description. In this work, we addressed the exploratory analysis of organization databases at early stages, where little quantitative information is available about their efficiency. Categorical and metric single-variable tests are proposed and formalized in order to provide a mass criterion to identify regions in forms with clusters of significant variables. Bootstrap resampling techniques are used to provide nonparametric criteria for establishing easy-to-use statistical tests (a toy bootstrap test is sketched after this entry), so that single-variable tests are each represented on a visual and quantitative statistical plot, whereas all the variables in a given form are jointly visualized in the so-called chromosome plots. More detailed profile plots offer deep comparison knowledge for categorical variables across the organization's physical and functional structures, while histogram plots for numerical variables incorporate the statistical significance of the variables under study for preselected Pareto groups. Performance grouping is addressed by identifying two or three groups according to a representative empirical distribution of some convenient grouping feature. The method is applied to perform a Big Data exploratory analysis on the follow-up forms of the Spanish Red Cross, based on the number of interventions and on a by-record basis. Results showed that a simple one-variable, blind-knowledge exploratory Big Data analysis, such as the one developed in this paper, offers unbiased comparative graphical and numerical information that characterizes organizational dynamics in terms of applied resources, available capacities, and productivity. In particular, the graphical and numerical outputs of the present analysis proved to be a valid tool for isolating underlying overloaded or under-performing resources in complex organizations. As a consequence, the proposed method provides a systematic and principled way to analyze efficiency in complex organizations, which, combined with organizational internal knowledge, could leverage and validate efficient decision-making.
- Published
- 2019
- Full Text
- View/download PDF
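A minimal form of the bootstrap single-variable test described above: resample under the pooled (null) distribution and ask how often a resampled mean difference exceeds the observed one. The group names and Poisson toy data are hypothetical placeholders for the follow-up-form variables:

```python
import numpy as np

def bootstrap_mean_diff_test(a, b, n_boot=10_000, seed=0):
    """Nonparametric bootstrap test for a difference in group means.
    Returns the observed difference and a two-sided p-value."""
    rng = np.random.default_rng(seed)
    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])      # null: both groups share one distribution
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        ra = rng.choice(pooled, size=a.size, replace=True)
        rb = rng.choice(pooled, size=b.size, replace=True)
        diffs[i] = ra.mean() - rb.mean()
    return observed, np.mean(np.abs(diffs) >= abs(observed))

rng = np.random.default_rng(42)
high_activity = rng.poisson(12, 200).astype(float)   # hypothetical Pareto group
low_activity = rng.poisson(9, 800).astype(float)
diff, p = bootstrap_mean_diff_test(high_activity, low_activity)
print(f"mean difference = {diff:.2f}, bootstrap p = {p:.4f}")
```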
5. On the Black-Box Challenge for Fraud Detection Using Machine Learning (II): Nonlinear Analysis through Interpretable Autoencoders
- Author
- Jacobo Chaquet-Ulldemolins, Francisco-Javier Gimeno-Blanes, Santiago Moral-Rubio, Sergio Muñoz-Romero, and José-Luis Rojo-Álvarez
- Subjects
- credit fraud detection, explainable machine learning, interpretability, autoencoders, Technology, Engineering (General). Civil engineering (General), TA1-2040, Biology (General), QH301-705.5, Physics, QC1-999, Chemistry, QD1-999
- Abstract
The use of artificial intelligence (AI) in the global economy has recently intensified, owing to the strong capability that it has demonstrated for analysis and modeling in many disciplines. This situation is accelerating the shift towards a more automated society, where these new techniques can be consolidated as a valid tool to face the difficult challenge of credit fraud detection (CFD). However, tight regulations make it difficult for financial entities to remain compliant while using modern techniques. From a methodological perspective, autoencoders have demonstrated their effectiveness in discovering nonlinear features across several problem domains. However, autoencoders are opaque and often seen as black boxes. In this work, we propose an interpretable and model-agnostic methodology for CFD. This type of approach offers a double advantage: on the one hand, it can be applied together with any machine learning (ML) technique; on the other hand, it offers the necessary traceability between inputs and outputs, hence escaping the black-box model. We first applied the state-of-the-art feature selection technique defined in the companion paper. Second, we proposed a novel technique, based on autoencoders, capable of evaluating the relationship between the input and output of a sophisticated ML model for every sample submitted to analysis, through a single transaction-level explanation (STE) approach. This technique allows each instance to be analyzed individually by applying small fluctuations to the input space and evaluating how the output responds, thereby shedding light on the underlying dynamics of the model (a toy sensitivity sketch of this idea follows this entry). Based on this, an individualized transaction ranking (ITR) can be formulated, leveraging the contributions of each feature through STE. These rankings represent a close estimate of the most important features playing a role in the decision process. The results obtained in this work were consistent with previously published papers and showed that certain features, such as living beyond one's means, a lack or absence of a transaction trail, and car loans, have a strong influence on the model outcome. Additionally, this proposal using the latent space outperformed, in terms of accuracy, our previous results, which already improved on prior published papers, by 5.5% and 1.5% for the datasets under study, from baselines of 76% and 93%. The contribution of this paper is twofold: a new, outperforming CFD classification model is presented, and at the same time we developed a novel methodology, applicable across classification techniques, that allows black-box models to be opened up, erasing the dependencies and, eventually, undesirable biases. We conclude that it is possible to develop an effective, individualized, unbiased, and traceable ML technique, not only to comply with regulations, but also to cope with transaction-level inquiries from clients and authorities.
- Published
- 2022
- Full Text
- View/download PDF
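The perturbation idea behind STE (nudge one input of a fitted model and watch the output) can be sketched without the paper's autoencoder machinery. A generic gradient-boosting classifier on synthetic data stands in for the "sophisticated ML model" below, and `transaction_ranking` is a hypothetical helper, not the authors' STE/ITR implementation:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 6))
y = (1.5 * X[:, 0] - 2.0 * X[:, 2] + 0.3 * rng.normal(size=2000) > 0).astype(int)
model = GradientBoostingClassifier(random_state=0).fit(X, y)

def transaction_ranking(model, x, eps=0.1):
    """Rank features for ONE sample by the fraud-probability shift caused by
    small symmetric input fluctuations (a finite-difference sensitivity)."""
    scores = np.empty(x.size)
    for j in range(x.size):
        up, down = x.copy(), x.copy()
        up[j] += eps
        down[j] -= eps
        p_up = model.predict_proba(up[None, :])[0, 1]
        p_down = model.predict_proba(down[None, :])[0, 1]
        scores[j] = abs(p_up - p_down) / (2 * eps)
    return np.argsort(scores)[::-1], scores

order, scores = transaction_ranking(model, X[0])
print("feature ranking for transaction 0:", order)   # features 0 and 2 should lead
```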
6. On the Black-Box Challenge for Fraud Detection Using Machine Learning (I): Linear Models and Informative Feature Selection
- Author
- Jacobo Chaquet-Ulldemolins, Francisco-Javier Gimeno-Blanes, Santiago Moral-Rubio, Sergio Muñoz-Romero, and José-Luis Rojo-Álvarez
- Subjects
- credit fraud detection, explainable machine learning, interpretability, feature selection, Technology, Engineering (General). Civil engineering (General), TA1-2040, Biology (General), QH301-705.5, Physics, QC1-999, Chemistry, QD1-999
- Abstract
Artificial intelligence (AI) is rapidly shaping the global financial market and its services due to the strong capability that it has shown for analysis and modeling in many disciplines. Especially remarkable is the potential that these techniques could offer to the challenging reality of credit fraud detection (CFD); but it is not easy, even for financial institutions, to keep in strict compliance with non-discriminatory and data protection regulations while extracting all the potential that these powerful new tools can provide. This reality effectively restricts nearly all possible AI applications to simple and easy-to-trace neural networks, preventing more advanced and modern techniques from being applied. The aim of this work was to create a reliable, unbiased, and interpretable methodology to automatically evaluate CFD risk. We therefore propose a novel methodology to address the mentioned complexity when applying machine learning (ML) to the CFD problem, one that uses state-of-the-art algorithms capable of quantifying the information of the variables and their relationships. This approach offers a new form of interpretability to cope with this multifaceted situation. First, a recently published feature selection technique, the informative variable identifier (IVI), is applied, which is capable of distinguishing among informative, redundant, and noisy variables (a rough stand-in for this step is sketched after this entry). Second, a set of innovative recurrent filters defined in this work is applied, which aim to minimize the training-data bias, namely, the recurrent feature filter (RFF) and the maximally-informative feature filter (MIFF). Finally, the output is classified using compelling ML techniques, such as gradient boosting, support vector machines, linear discriminant analysis, and linear regression. These models were applied first to a synthetic database, for better descriptive modeling and fine-tuning, and then to a real database. Our results confirm that our proposal yields valuable interpretability by identifying the informative features' weights that link original variables with final objectives. Informative features were living beyond one's means, a lack or absence of a transaction trail, and unexpected overdrafts, which is consistent with other published works. Furthermore, we obtained 76% accuracy in CFD, which represents an improvement of more than 4% on the real databases compared to other published works. We conclude that with the presented methodology we not only reduce dimensionality, but also improve accuracy and trace relationships between input and output features, bringing transparency to the ML reasoning process. The results obtained here were used as a starting point for the companion paper, which reports on extending the interpretability to nonlinear ML architectures.
- Published
- 2022
- Full Text
- View/download PDF
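The informative variable identifier (IVI) is the authors' own algorithm and is not reproduced here. As a loose stand-in for its output categories, the sketch below tags columns as informative, redundant, or noisy using mutual information with the label plus pairwise correlation; the thresholds and data are illustrative assumptions:

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
n = 3000
informative = rng.normal(size=(n, 2))
redundant = informative[:, [0]] + 0.05 * rng.normal(size=(n, 1))  # near-copy of x0
noise = rng.normal(size=(n, 2))
X = np.hstack([informative, redundant, noise])
y = (informative[:, 0] + informative[:, 1] > 0).astype(int)

mi = mutual_info_classif(X, y, random_state=0)
corr = np.corrcoef(X, rowvar=False)

for j in range(X.shape[1]):
    # "Redundant" here: carries label information but is nearly collinear
    # with an earlier column.
    dup = any(abs(corr[j, k]) > 0.95 for k in range(j))
    kind = "noisy" if mi[j] < 0.01 else ("redundant" if dup else "informative")
    print(f"x{j}: MI={mi[j]:.3f} -> {kind}")
```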
7. On the Differential Analysis of Enterprise Valuation Methods as a Guideline for Unlisted Companies Assessment (II): Applying Machine-Learning Techniques for Unbiased Enterprise Value Assessment
- Author
- Germania Vayas-Ortega, Cristina Soguero-Ruiz, Margarita Rodríguez-Ibáñez, José-Luis Rojo-Álvarez, and Francisco-Javier Gimeno-Blanes
- Subjects
- company valuation, enterprise value, machine learning, feature selection, supervised techniques, non-linear regression techniques, Technology, Engineering (General). Civil engineering (General), TA1-2040, Biology (General), QH301-705.5, Physics, QC1-999, Chemistry, QD1-999
- Abstract
The search for an unbiased company valuation method to reduce uncertainty, whether or not it is automatic, has been a relevant topic in social sciences and business development for decades. Many methods have been described in the literature, but consensus has not been reached. In the companion paper we aimed to review the assessment capabilities of the traditional company valuation model based on the company's intrinsic value, using the Discounted Cash Flow (DCF). In this paper, we capitalize on the potential of exogenous information combined with Machine Learning (ML) techniques. To do so, we performed an extensive analysis to evaluate the predictive capabilities of up to 18 different ML techniques (a minimal benchmarking sketch follows this entry). Endogenous variables (features) related to value creation (DCF) proved to be crucial elements for the models, while the incorporation of exogenous, industry- and country-specific ones incrementally improves the ML performance. Bagging Trees, Support Vector Regression, and Gaussian Process Regression methods consistently provided the best results. We concluded that an unbiased model can be created based on endogenous and exogenous information to build a reference framework with which to price and benchmark Enterprise Value for valuation and credit risk assessment.
- Published
- 2020
- Full Text
- View/download PDF
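Benchmarking many regressors on tabular features, as done here for 18 techniques, follows a standard cross-validation pattern. The three families reported as best performers are instantiated below on synthetic data; the hyperparameters and features are illustrative assumptions, not the paper's setup:

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 5))   # hypothetical endogenous + exogenous features
ev = 3 * X[:, 0] + np.sin(X[:, 1]) + 0.5 * X[:, 4] + 0.2 * rng.normal(size=n)

models = {
    "Bagging Trees": BaggingRegressor(random_state=0),
    "SVR": SVR(C=10.0),
    "GPR": GaussianProcessRegressor(alpha=0.05),
}
for name, model in models.items():
    rmse = -cross_val_score(model, X, ev, cv=5,
                            scoring="neg_root_mean_squared_error").mean()
    print(f"{name:14s} CV RMSE = {rmse:.3f}")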
8. On the Differential Analysis of Enterprise Valuation Methods as a Guideline for Unlisted Companies Assessment (I): Empowering Discounted Cash Flow Valuation
- Author
- Germania Vayas-Ortega, Cristina Soguero-Ruiz, José-Luis Rojo-Álvarez, and Francisco-Javier Gimeno-Blanes
- Subjects
- stock market, private equity, valuation, cash flow, discounted cash flow, enterprise value, Technology, Engineering (General). Civil engineering (General), TA1-2040, Biology (General), QH301-705.5, Physics, QC1-999, Chemistry, QD1-999
- Abstract
The Discounted Cash Flow (DCF) method is probably the most widespread approach used in company valuation, its main drawback being its known extreme sensitivity to key variables such as the Weighted Average Cost of Capital (WACC) and Free Cash Flow (FCF) estimates, which are not unquestionably obtained. In this paper we propose an unbiased and systematic DCF method that allows us to value private equity by leveraging stock market evidence, based on a twofold approach: first, the use of the inverse method assesses the existence of a coherent WACC that compares positively with market observations; second, different FCF forecasting methods are benchmarked and shown to correspond with actual valuations. We use financial historical data for 42 companies in five sectors, extracted from Eikon-Reuters. Our results show that WACC and FCF forecasting are not coherent with market expectations over time, across sectors, or across market regions when only historical and endogenous variables are taken into account. The best estimates are found when exogenous variables, operational normalization of the input space, and data-driven linear techniques are considered (Root Mean Square Error of 6.51). Our method suggests that FCFs and their positive alignment with Market Capitalization and the subordinate enterprise value are the most influential variables (the standard DCF identity is sketched after this entry). The fine-tuning of the methods presented here, along with an exhaustive analysis using nonlinear machine-learning techniques, is developed and discussed in the companion paper.
- Published
- 2020
- Full Text
- View/download PDF
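For reference, the textbook identity the paper stress-tests is EV = sum over t of FCF_t / (1 + WACC)^t plus a discounted Gordon-growth terminal value. A small sketch with made-up numbers:

```python
def discounted_cash_flow(fcf, wacc, g):
    """Enterprise value from explicit FCF forecasts plus a Gordon terminal
    value: EV = sum FCF_t/(1+w)^t + TV/(1+w)^T, TV = FCF_T*(1+g)/(w-g).
    Textbook form only; the paper's contribution is how w and FCF are set."""
    assert wacc > g, "WACC must exceed the perpetual growth rate"
    pv_explicit = sum(cf / (1 + wacc) ** t for t, cf in enumerate(fcf, start=1))
    terminal = fcf[-1] * (1 + g) / (wacc - g)
    return pv_explicit + terminal / (1 + wacc) ** len(fcf)

# Five-year forecast (arbitrary monetary units), 9% WACC, 2% perpetual growth.
print(f"EV = {discounted_cash_flow([100, 110, 118, 124, 130], 0.09, 0.02):,.1f}")
```

The extreme sensitivity noted in the abstract is easy to verify: re-running this toy with an 8% WACC instead of 9% moves the valuation by more than 15%.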
9. Enabling Heart Self-Monitoring for All and for AAL—Portable Device within a Complete Telemedicine System
- Author
- Andrés-Lorenzo Bleda, Francisco-Manuel Melgarejo-Meseguer, Francisco-Javier Gimeno-Blanes, Arcadi García-Alberola, José Luis Rojo-Álvarez, Javier Corral, Ricardo Ruiz, and Rafael Maestre-Ferriz
- Subjects
- ECG, arterial blood pressure, sensors, e-health, portability, atrial fibrillation detector, QRS detector, Chemical technology, TP1-1185
- Abstract
During the last decades there has been a rapidly growing elderly population, and the number of patients with chronic heart-related diseases has exploded. Many of them (such as those with congestive heart failure or some types of arrhythmias) require close medical supervision, thus imposing a heavy burden on healthcare costs in most western economies. Specifically, continuous or frequent Arterial Blood Pressure (ABP) and electrocardiogram (ECG) monitoring are important tools in the follow-up of many of these patients. In this work, we present a novel remote, non-ambulatory, and clinically validated heart self-monitoring system, which allows ABP and ECG monitoring to effectively identify clinically relevant arrhythmias. The system integrates digital transmission of the ECG and tensiometer measurements within a patient-comfortable support that is easy to recharge and runs multi-function software, all adapted for elderly people. The main novelty is that both physiological variables (ABP and ECG) are simultaneously measured in an ambulatory environment, which to the best of our knowledge is not readily available in the clinical market. Different processing techniques were implemented to analyze the heart rhythm, including pause detection, rhythm alterations, and atrial fibrillation (a toy rhythm-screening sketch follows this entry), hence allowing early detection of these diseases. Our results achieved clinical quality both for in-lab hardware testing and for ambulatory scenario validations. The proposed active assisted living (AAL) sensor-based system is an end-to-end multidisciplinary system, fully connected to a platform and tested by the clinical team from beginning to end.
- Published
- 2019
- Full Text
- View/download PDF
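A drastically simplified version of the rhythm analyses mentioned above (pause detection and irregularity screening from RR intervals) is sketched below. The thresholds and synthetic RR series are assumptions for illustration, not the device's validated clinical algorithms:

```python
import numpy as np

def rhythm_flags(rr_ms, pause_ms=3000, irregular_cv=0.15):
    """Toy rhythm screening from RR intervals in milliseconds: flag pauses
    longer than pause_ms, and flag high overall RR variability (coefficient
    of variation above irregular_cv) as a possibly irregular rhythm."""
    rr = np.asarray(rr_ms, dtype=float)
    cv = rr.std() / rr.mean()
    return {
        "pause_indices": np.flatnonzero(rr > pause_ms),
        "rr_cv": round(float(cv), 3),
        "possibly_irregular": bool(cv > irregular_cv),
    }

rng = np.random.default_rng(3)
rr = rng.normal(800, 40, 300)   # ~75 bpm, roughly regular sinus rhythm
rr[120] = 3400                  # one inserted pause
print(rhythm_flags(rr))
```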
10. Electrocardiographic Fragmented Activity (II): A Machine Learning Approach to Detection
- Author
- Francisco-Manuel Melgarejo-Meseguer, Francisco-Javier Gimeno-Blanes, María-Eladia Salar-Alcaraz, Juan-Ramón Gimeno-Blanes, Juan Martínez-Sánchez, Arcadi García-Alberola, and José Luis Rojo-Álvarez
- Subjects
- ECG, fragmentation detection, multivariate techniques, fibrosis detection, machine learning, Technology, Engineering (General). Civil engineering (General), TA1-2040, Biology (General), QH301-705.5, Physics, QC1-999, Chemistry, QD1-999
- Abstract
Hypertrophic cardiomyopathy is, given its prevalence, a comparatively common disease, and it is related to the risk of suffering sudden cardiac death, heart failure, and stroke. The illness is characterized by the excessive deposition of collagen among healthy myocardium cells. This situation, medically known as fibrosis, constitutes an effective conduction obstacle in the myocardial electrical path and, when severe enough, can appear as additional peaks or notches in the QRS, clinically termed fragmentation. Nowadays, fragmentation detection is performed by visual inspection, but the fragmented QRS can be confused with the noise present in the electrocardiogram (ECG). Fibrosis detection, in turn, is performed by magnetic resonance imaging with late gadolinium enhancement, the main drawback of this technique being its cost in terms of time and money. In this work, we propose two automatic algorithms, one for fragmented QRS detection and another for fibrosis detection. For this purpose, we used four different databases, including the surrogate database described in the companion paper plus three additional ones: one composed of more accurate surrogate ECG signals, and two composed of real and affected subjects as labeled by expert clinicians. The first real-world database contains fragmented QRS records and the second contains records with fibrosis; both were recorded at Hospital Clínico Universitario Virgen de la Arrixaca (Spain). To analyze the scope of these datasets in depth, we benchmarked several classifiers, namely Neural Networks, Support Vector Machines (SVM), Decision Trees, and Gaussian Naïve Bayes (NB) (a small SVM sketch follows this entry). For the fragmentation dataset, the best results were 0.94 sensitivity, 0.88 specificity, 0.89 positive predictive value, 0.93 negative predictive value, and 0.91 accuracy, obtained when using an SVM with a Gaussian kernel. For the fibrosis databases, more limited accuracy was reached, with 0.47 sensitivity, 0.91 specificity, 0.82 positive predictive value, 0.66 negative predictive value, and 0.70 accuracy when using Gaussian NB. Nevertheless, this is the first time that fibrosis detection has been attempted automatically from ECG postprocessing, paving the way towards improved algorithms and methods. We therefore conclude that the proposed techniques could offer clinicians a valuable support tool for both fragmentation and fibrosis diagnoses.
- Published
- 2019
- Full Text
- View/download PDF
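As a toy analogue of the fragmentation classifier, the sketch below synthesizes QRS-like beats, adds small notches to a random half, extracts simple shape descriptors, and scores the Gaussian-kernel SVM reported above as the best model. Every design choice here (beat model, features, notch frequency) is an illustrative assumption:

```python
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
qrs = np.exp(-((t - 0.5) ** 2) / 0.002)          # smooth QRS-like bump

def beat_features(beat):
    """Per-beat descriptors: high-frequency energy and shape statistics."""
    hf = np.diff(beat, n=2)                      # crude high-pass
    return [np.sum(hf ** 2), kurtosis(beat), skew(beat), beat.std()]

X, y = [], []
for _ in range(300):
    fragmented = int(rng.integers(0, 2))
    notch = 0.08 * fragmented * np.sin(2 * np.pi * 40 * t)
    beat = qrs + notch + rng.normal(0, 0.01, t.size)
    X.append(beat_features(beat))
    y.append(fragmented)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print(f"CV accuracy: {cross_val_score(clf, np.array(X), np.array(y), cv=5).mean():.2f}")
```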
11. Electrocardiographic Fragmented Activity (I): Physiological Meaning of Multivariate Signal Decompositions
- Author
- Francisco-Manuel Melgarejo-Meseguer, Francisco-Javier Gimeno-Blanes, María-Eladia Salar-Alcaraz, Juan-Ramón Gimeno-Blanes, Juan Martínez-Sánchez, Arcadi García-Alberola, and José-Luis Rojo-Álvarez
- Subjects
- ECG, fragmentation analysis, multivariate techniques, ICA, PCA, fragmentation detection, Technology, Engineering (General). Civil engineering (General), TA1-2040, Biology (General), QH301-705.5, Physics, QC1-999, Chemistry, QD1-999
- Abstract
Recent research has proven the existence of a statistical relation between fragmented QRS and several highly prevalent diseases, such as cardiac sarcoidosis, acute coronary syndrome, arrhythmogenic cardiomyopathies, Brugada syndrome, and hypertrophic cardiomyopathy. One in five hundred people suffers from hypertrophic cardiomyopathy. The relation between fragmentation and arrhythmias drives the objective of this work, which is to propose a valid method for QRS fragmentation detection. To that end, we followed a two-stage approach. First, we identified the features that best characterize fragmentation by analyzing the physiological interpretation of multivariate approaches, such as principal component analysis (PCA) and independent component analysis (ICA). Second, we created an invariant transformation method for the multilead electrocardiogram (ECG) by scrutinizing the statistical distributions of the PCA eigenvectors and of the ICA transformation arrays, in order to anchor the desired elements in the suitable leads in the feature space. A complete database was compiled, incorporating real fragmented ECGs, surrogate registers created by synthetically adding fragmented activity to real non-fragmented ECG registers, and standard clean ECGs. Results showed that the creation of beat templates together with the application of PCA over eight independent leads achieves a 0.995 fragmentation enhancement ratio and a 0.07 dispersion coefficient (the lead-stacking PCA idea is sketched after this entry). In the case of ICA over twelve leads, the results were a 0.995 fragmentation enhancement ratio and a 0.70 dispersion coefficient. We conclude that the algorithm presented in this work constructs a new paradigm by creating a systematic and powerful tool for clinical anamnesis and evaluation based on the multilead ECG. This approach consistently consolidates the inconspicuous elements present in multiple leads onto designated variables in the output space, hence offering additional, valid visual and non-visual information to the standard clinical review, and opening the door to more accurate automatic detection and a statistically valid systematic approach for a wide number of applications. In this direction, the companion paper presents further developments applying this technique to fragmentation detection.
- Published
- 2019
- Full Text
- View/download PDF
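The core intuition, that activity shared across leads concentrates in the leading principal components, shows up in a few lines. Eight synthetic "leads" below share one QRS-plus-fragmentation template with lead-specific gains; this toy omits the paper's eigenvector-distribution analysis and invariance machinery entirely:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
qrs = np.exp(-((t - 0.5) ** 2) / 0.002)                              # shared QRS bump
frag = 0.05 * np.sin(2 * np.pi * 60 * t) * (np.abs(t - 0.5) < 0.1)   # notches near the peak

# Each lead sees the common template with its own gain plus sensor noise.
n_leads = 8
gains = rng.uniform(0.5, 1.5, n_leads)
leads = np.stack([g * (qrs + frag) + rng.normal(0, 0.02, t.size) for g in gains])

pca = PCA(n_components=n_leads)
pca.fit(leads.T)                                  # samples x leads
print("variance ratio of first component:",
      f"{pca.explained_variance_ratio_[0]:.3f}")  # nearly all shared morphology
```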
12. Full recovery of Arundo donax particleboard from swelling test without waterproofing additives
- Author
- Jose-Antonio Flores-Yepes, Jose-Joaquin Pastor-Perez, Francisco-Javier Gimeno-Blanes, Isabel Rodriguez-Guisado, and María-José Frutos-Fernandez
- Subjects
- Particleboards, Wood remainder, Shredder, Shredding machine, Particle board, Wood waste, Giant reed, Arundo donax, Biotechnology, TP248.13-248.65
- Abstract
This paper presents the development of particleboard based on common reed, reproducing the industry-standard manufacturing process applied to wood chipboard. One of the main properties of the resulting board was its resistance to water, due to the hydrophobic properties of the common reed, despite no melamine or any other waterproofing additive being incorporated. The boards were produced using 2 mm and 4 mm sieves for fibre selection, manufacturing pressures of 3 N/mm2 and 25 N/mm2, and a urea formaldehyde resin content ranging from 5.2% to 13% by volume (8 to 20% in liquid form). Standard destructive tests were performed. It was found that under certain applied conditions, namely high pressure and an adequate resin proportion (a pressure of over 3 N/mm2 and over 15% liquid resin), Arundo donax L. particleboard demonstrated full recovery from the swelling test. This finding highlights an unmatched swelling-recovery property of the designed board, which makes it suitable for use in high-humidity environments without the need for special resins or waterproofing processes.
- Published
- 2012
13. On the Beat Detection Performance in Long-Term ECG Monitoring Scenarios
- Author
- Francisco-Manuel Melgarejo-Meseguer, Estrella Everss-Villalba, Francisco-Javier Gimeno-Blanes, Manuel Blanco-Velasco, Zaida Molins-Bordallo, José-Antonio Flores-Yepes, José-Luis Rojo-Álvarez, and Arcadi García-Alberola
- Subjects
- QRS detection, ECG, long-term monitoring, Holter, 7-day, Chemical technology, TP1-1185
- Abstract
Despite the wide literature on R-wave detection algorithms for ECG Holter recordings, long-term monitoring applications are bringing new requirements, and it is not clear that the existing methods can be straightforwardly used in those scenarios. Our aim in this work was twofold: first, we scrutinized the scope and limitations of existing methods for Holter monitoring when moving to long-term monitoring; second, we proposed and benchmarked a beat detection method with adequate accuracy and usefulness in long-term scenarios. A longitudinal study was made with the most widely used waveform analysis algorithms, which allowed us to tune the free parameters of the required blocks, and a transversal study analyzed how these parameters change when moving to different databases. With all the above, the extension to long-term monitoring was proposed and analyzed on a database of 7-day Holter recordings, using optimized simultaneous multilead processing. We considered both our own and public databases. In this new scenario, noise-avoidance mechanisms are more important due to the amount of noise present in these recordings; moreover, computational efficiency is a key parameter for exporting the algorithm to clinical practice. The method based on a polling function (sketched after this entry) outperformed the others in terms of accuracy and computational efficiency, yielding 99.48% sensitivity, 99.54% specificity, 99.69% positive predictive value, 99.46% accuracy, and 0.85% error on the MIT-BIH arrhythmia database. We conclude that the method can be used in long-term Holter monitoring systems.
- Published
- 2018
- Full Text
- View/download PDF
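The polling idea (accept a beat only when enough leads agree within a small window) can be sketched with SciPy's generic peak finder standing in for the per-lead detector. The thresholds, tolerance, and majority rule below are illustrative assumptions, not the paper's tuned parameters:

```python
import numpy as np
from scipy.signal import find_peaks

def polling_beats(leads, fs, tol_ms=80, min_votes=None):
    """Fuse per-lead peak detections: keep candidates confirmed by at least
    min_votes leads within +/- tol_ms (a simple polling function)."""
    if min_votes is None:
        min_votes = len(leads) // 2 + 1                     # simple majority
    tol = int(fs * tol_ms / 1000)
    per_lead = [find_peaks(x, height=0.5 * x.max(), distance=int(0.25 * fs))[0]
                for x in leads]
    beats = []
    for c in np.unique(np.concatenate(per_lead)):           # ascending candidates
        votes = sum(np.any(np.abs(p - c) <= tol) for p in per_lead)
        if votes >= min_votes and (not beats or c - beats[-1] > tol):
            beats.append(int(c))
    return np.array(beats)

fs = 250
t = np.arange(0, 10, 1 / fs)
clean = np.sum([np.exp(-((t - b) ** 2) / 1e-3)
                for b in np.arange(0.5, 10, 0.8)], axis=0)  # one beat every 0.8 s
rng = np.random.default_rng(0)
leads = [clean + rng.normal(0, 0.15, t.size) for _ in range(3)]
print(f"{polling_beats(leads, fs).size} beats detected")    # expect 12
```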
14. Anomaly Detection From Low-Dimensional Latent Manifolds With Home Environmental Sensors.
- Author
- Francisco Manuel Melgarejo-Meseguer, Andrés Lorenzo Bleda, Sergio Eduardo Abbenante, Francisco Javier Gimeno-Blanes, Estrella Everss-Villalba, Sergio Muñoz-Romero, José Luis Rojo-Álvarez, and Rafael Maestre-Ferriz
- Published
- 2024
- Full Text
- View/download PDF
15. An Embedding Approach for Biomarker Identification in Hypertrophic Cardiomyopathy.
- Author
- Arash Kazemi-Díaz, Luis Bote-Curiel, María Sabater-Molina, Juan-Ramón Gimeno-Blanes, Salvador Sala-Pla, Francisco Javier Gimeno-Blanes, Sergio Muñoz-Romero, and José Luis Rojo-Álvarez
- Published
- 2023
- Full Text
- View/download PDF
16. Signal Processing and Machine Learning Automated Evaluation of Phrenic Nerve Affectation by Cardiac Stimulation.
- Author
- Roberto Mateos-Gaitán, Antonio Gil-Izquierdo, Francisco Javier Gimeno-Blanes, Francisco Manuel Melgarejo-Meseguer, Carmen Muñoz-Esparza, José Luis Rojo-Álvarez, Arcadi García-Alberola, and Juan José Sánchez-Muñoz
- Published
- 2023
- Full Text
- View/download PDF
17. Multicomponent Organization Analysis in Spatial Domains of Atrial Fibrillation.
- Author
- Francisco Manuel Melgarejo-Meseguer, Román A. Lara-Cueva, Francisco Javier Gimeno-Blanes, Sergio Muñoz-Romero, Arcadi García-Alberola, Juan José Sánchez-Muñoz, Omer Berenfeld, and José Luis Rojo-Álvarez
- Published
- 2023
- Full Text
- View/download PDF
18. High-Dimensional Feature Characterization of Single Nucleotide Variants in Hypertrophic Cardiomyopathy.
- Author
- Dafne Lozano, Luis Bote, Concha Bielza, Pedro Larrañaga, María Sabater-Molina, Juan Ramón Gimeno, Sergio Muñoz, Francisco Javier Gimeno-Blanes, and José Luis Rojo-Álvarez
- Published
- 2023
- Full Text
- View/download PDF
19. Cybersecurity Alert Prioritization in a Critical High Power Grid With Latent Spaces.
- Author
- Juan Ramón Feijoo-Martínez, Alicia Guerrero-Curieses, Francisco Javier Gimeno-Blanes, Mario Castro-Fernandez, and José Luis Rojo-Álvarez
- Published
- 2023
- Full Text
- View/download PDF
20. Generalization and Regularization for Inverse Cardiac Estimators.
- Author
- Francisco Manuel Melgarejo-Meseguer, Estrella Everss-Villalba, Miriam Gutiérrez-Fernández-Calvillo, Sergio Muñoz-Romero, Francisco Javier Gimeno-Blanes, Arcadi García-Alberola, and José Luis Rojo-Álvarez
- Published
- 2022
- Full Text
- View/download PDF
21. Manifold analysis of the P-wave changes induced by pulmonary vein isolation during cryoballoon procedure.
- Author
- Laura Martinez-Mateu, Francisco Manuel Melgarejo-Meseguer, Sergio Muñoz-Romero, Francisco Javier Gimeno-Blanes, Arcadi García-Alberola, Sara Rocher Ventura, Javier Saiz, and José Luis Rojo-Álvarez
- Published
- 2023
- Full Text
- View/download PDF
23. On the Influence of Heart Rate and Coupling Interval Prematurity on Heart Rate Turbulence.
- Author
- Óscar Barquero-Pérez, Carlos Figuera, Rebeca Goya-Esteban, Inmaculada Mora-Jiménez, Francisco Javier Gimeno-Blanes, Pablo Laguna, Juan Pablo Martínez, Eduardo Gil, Leif Sörnmo, Arcadio García-Alberola, and José Luis Rojo-Álvarez
- Published
- 2017
- Full Text
- View/download PDF
24. QRS Fragmentation Index as a New Discriminator for Early Diagnosis of Heart Diseases.
- Author
- Francisco Manuel Melgarejo-Meseguer, Mariela Salar-Alcaraz, Zaida Molins-Bordallo, Francisco Javier Gimeno-Blanes, Estrella Everss-Villalba, José-Antonio Flores-Yepes, José Luis Rojo-Álvarez, and Arcadi García-Alberola
- Published
- 2017
- Full Text
- View/download PDF
25. Morphological Analysis on Single Lead Contactless ECG Monitoring Based on a Beat-Template Development.
- Author
- Jesús Hernández-Ortega, Francisco Javier Gimeno-Blanes, José Luis Rojo-Álvarez, José-Antonio Flores-Yepes, Andrés Lorenzo Bleda-Tomás, Rafael Maestre-Ferriz, José-María López-Ayala, Juan-Ramón Gimeno-Blanes, and Arcadio García-Alberola
- Published
- 2014
26. QRS Delineation Algorithms Comparison and Model Fine Tuning for Automatic Clinical Classification.
- Author
- Antonio Casañez-Ventura, Francisco Javier Gimeno-Blanes, José Luis Rojo-Álvarez, José-Antonio Flores-Yepes, Juan-Ramón Gimeno-Blanes, José-María López-Ayala, and Arcadi García-Alberola
- Published
- 2013
27. Computational Efficiency and Accuracy for QRS Detection Algorithms on Clinical Long Term Multilead Monitoring.
- Author
- Francisco Manuel Melgarejo-Meseguer, Estrella Everss-Villalba, Jan Široký, Francisco Javier Gimeno-Blanes, José-Antonio Flores-Yepes, Manuel Blanco-Velasco, José Luis Rojo-Álvarez, and Arcadi García-Alberola
- Published
- 2016
28. Clinical Severity of Noise in ECG.
- Author
- Estrella Everss-Villalba, Francisco Manuel Melgarejo-Meseguer, Francisco Javier Gimeno-Blanes, Salvador Sala-Pla, Manuel Blanco-Velasco, José Luis Rojo-Álvarez, and Arcadio García-Alberola
- Published
- 2016
29. Deal Effect Curve and Promotional Models - Using Machine Learning and Bootstrap Resampling Test.
- Author
- Cristina Soguero-Ruíz, Francisco Javier Gimeno-Blanes, Inmaculada Mora-Jiménez, María Pilar Martínez-Ruiz, and José Luis Rojo-Álvarez
- Published
- 2012
30. Statistical nonlinear analysis for reliable promotion decision-making.
- Author
- Cristina Soguero-Ruíz, Francisco Javier Gimeno-Blanes, Inmaculada Mora-Jiménez, María Pilar Martínez-Ruiz, and José Luis Rojo-Álvarez
- Published
- 2014
- Full Text
- View/download PDF
31. On the differential benchmarking of promotional efficiency with machine learning modelling (II): Practical applications.
- Author
- Cristina Soguero-Ruíz, Francisco Javier Gimeno-Blanes, Inmaculada Mora-Jiménez, María Pilar Martínez-Ruiz, and José Luis Rojo-Álvarez
- Published
- 2012
- Full Text
- View/download PDF
33. Noise Maps for Quantitative and Clinical Severity Towards Long-Term ECG Monitoring.
- Author
- Estrella Everss-Villalba, Francisco Manuel Melgarejo-Meseguer, Manuel Blanco-Velasco, Francisco Javier Gimeno-Blanes, Salvador Sala-Pla, José Luis Rojo-Álvarez, and Arcadi García-Alberola
- Published
- 2017
- Full Text
- View/download PDF
37. Aditivo para fabricación de ladrillos de yeso: macizos, huecos o semihuecos mediante vibrocompresión
- Author
- José Antonio Flores Yepes, Joaquín Julian Pastor Pérez, Luis Miguel Serna Jara, Juan Manuel Berná Serna, Antonio Martínez Gabarrón, and Francisco Javier Gimeno Blanes
- Abstract
The Engineering Department of Miguel Hernández University at the Orihuela campus has been researching the development of new materials, ways of formulating additives for them, and their application to society for more than fourteen years. The additive in question is a fluidizer, superplasticizer, and setting retarder for gypsum, composed of colloidal silicon dioxide plus citric acid, in proportions of 1-3% silicon dioxide and 97-99% citric acid. This new patented additive can be added during the mixing process in either solid or liquid phase, acting on the setting process of the gypsum and yielding good results. It is applicable to white gypsum, black or dark gypsum (black plaster), and fine plaster (escayola). Dosages range from 0.06 g to 1.2 g per kg of gypsum, with setting times between 15 and 240 minutes. The water dosage for mixing is variable, and the reduction of mixing water is an advantage; it is set at 0.2 kg of water per 1 kg of gypsum. In this way, a kneaded gypsum is obtained which, formed into small balls of 2 to 8 mm in diameter, can be used to manufacture precast elements by pressing. In brick manufacturing, CO2 emissions are reduced by around 80%, since the calcination of the gypsum can be carried out at about 250 °C (the temperature can be lower, or higher if the firing time is to be shortened). The drying time of ceramic brick (about 8 hours at 100 °C) plus its firing time (about 8-14 hours at 900 °C) is avoided. The material obtained lends itself directly to the manufacture of hollow or semi-hollow bricks for industrial use, owing to its excellent compressive behavior (mean strength of 19.23 N/mm2), in addition to the environmental and economic benefits.
- Published
- 2019
40. On the feasibility of tilt test outcome early prediction using ECG and pressure parameters.
- Author
-
Francisco Javier Gimeno-Blanes, José Luis Rojo-álvarez, Antonio J. Caamaño, José-Antonio Flores-Yepes, and Arcadi García-Alberola
- Published
- 2011
- Full Text
- View/download PDF
41. Cardiac Fibrosis Detection Applying Machine Learning Techniques to Standard 12-Lead ECG
- Author
-
Arcadio García-Alberola, Francisco-Manuel Melgarejo-Meseguer, María-Eladia Salar-Alcaraz, Francisco-Javier Gimeno-Blanes, Juan-Ramon Gimeno-Blanes, and José Luis Rojo-Álvarez
- Subjects
Cardiac fibrosis ,Myocardial fibrosis ,Fibrosis ,Hypertrophic cardiomyopathy ,Linear classifier ,Kurtosis ,Cardiology ,Internal medicine ,cardiovascular diseases - Abstract
Hypertrophic cardiomyopathy (HCM) is a genetically transmitted myocardial disorder that affects 0.2% of the population. Several ECG findings have been related to the presence of fibrosis in other cardiac diseases, but data for HCM in this setting are lacking. Our hypothesis is that fibrosis affects the electrical cardiac propagation in patients with HCM in a relatively specific way, and that this effect may be detected with suitable postprocessing applied to the ECG signals. We used 43 standard 12-lead ECGs from patients with a previous clinical diagnosis of HCM. Principal Component Analysis (PCA) was applied by combining the ECG leads oriented to different anatomic regions, hence assessing the potential fibrosis effects in the resulting leads for postprocessing convenience. Linear classifiers of the Support Vector Machine type were used with several statistics extracted from the resulting PCA components, including normalized power, standard deviation, kurtosis, skewness, and local maxima. Results reached 75.0% sensitivity, 80.0% specificity, 85.7% positive predictive value, 66.7% negative predictive value, and 76.9% accuracy in our database. There is evidence that myocardial fibrosis can be detected in patients with HCM by postprocessing their ECG signals.
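A minimal sketch of the feature-plus-linear-SVM stage follows, assuming the PCA components have already been extracted; the statistics listed match the abstract (normalized power, standard deviation, kurtosis, skewness, local-maxima count), but the data here is synthetic and the helper name `component_features` is hypothetical.

```python
# Minimal sketch: linear SVM on summary statistics of PCA components.
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def component_features(component):
    """Summary statistics of one PCA component (1-D signal)."""
    power = np.sum(component ** 2) / len(component)        # normalized power
    n_local_maxima = np.sum((component[1:-1] > component[:-2]) &
                            (component[1:-1] > component[2:]))
    return [power, component.std(), kurtosis(component),
            skew(component), n_local_maxima]

# Hypothetical data: one leading PCA component per patient plus a label.
rng = np.random.default_rng(2)
X = np.array([component_features(rng.standard_normal(400)) for _ in range(43)])
y = rng.integers(0, 2, size=43)     # 1 = fibrosis suspected, 0 = not

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(X, y)
```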
- Published
- 2018
- Full Text
- View/download PDF
42. On the Beat Detection Performance in Long-Term ECG Monitoring Scenarios
- Author
-
Estrella Everss-Villalba, Francisco-Manuel Melgarejo-Meseguer, Zaida Molins-Bordallo, Francisco-Javier Gimeno-Blanes, Jose-Antonio Flores-Yepes, José Luis Rojo-Álvarez, Arcadi García-Alberola, and Manuel Blanco-Velasco
- Subjects
Computer science ,QRS detection ,ECG ,Beat detection ,long-term monitoring ,Holter ,Holter monitoring ,7-day ,Sensitivity (control systems) ,Noise (signal processing) ,Data mining ,Electrical and Electronic Engineering ,Instrumentation - Abstract
Despite the wide literature on R-wave detection algorithms for ECG Holter recordings, long-term monitoring applications are bringing new requirements, and it is not clear that the existing methods can be straightforwardly used in those scenarios. Our aim in this work was twofold: first, we scrutinized the scope and limitations of existing methods for Holter monitoring when moving to long-term monitoring; second, we proposed and benchmarked a beat detection method with adequate accuracy and usefulness in long-term scenarios. A longitudinal study was made with the most widely used waveform analysis algorithms, which allowed us to tune the free parameters of the required blocks, and a transversal study analyzed how these parameters change when moving to different databases. With all the above, the extension to long-term monitoring in a database of 7-day Holter recordings was proposed and analyzed, using optimized simultaneous multilead processing. We considered both our own and public databases. In this new scenario, noise-avoidance mechanisms are more important due to the amount of noise present in these recordings; moreover, computational efficiency is a key parameter for exporting the algorithm to clinical practice. The method based on a polling function outperformed the others in terms of accuracy and computational efficiency, yielding 99.48% sensitivity, 99.54% specificity, 99.69% positive predictive value, 99.46% accuracy, and a 0.85% error rate for the MIT-BIH arrhythmia database. We conclude that the method can be used in long-term Holter monitoring systems.
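One plausible reading of the polling idea, sketched under assumptions (the published algorithm is not reproduced here): detect candidate R-peaks lead by lead, then accept a beat only when a majority of leads report a peak within a small tolerance window. Thresholds and the function name `polled_beats` are illustrative.

```python
# Minimal sketch of majority polling across leads for beat detection.
import numpy as np
from scipy.signal import find_peaks

def polled_beats(leads, fs, tol_ms=80, min_votes=None):
    """`leads` is an (n_leads, n_samples) array; returns consensus peaks."""
    n_leads = leads.shape[0]
    min_votes = min_votes or (n_leads // 2 + 1)
    tol = int(tol_ms * fs / 1000)
    candidates = []
    for x in leads:
        x = np.abs(x - np.median(x))                       # crude rectification
        peaks, _ = find_peaks(x, distance=int(0.25 * fs),
                              height=2.0 * np.std(x))
        candidates.append(peaks)
    consensus = []
    # Poll: count how many leads see a peak within +/- tol samples.
    for p in np.unique(np.concatenate(candidates)):
        votes = sum(np.any(np.abs(c - p) <= tol) for c in candidates)
        if votes >= min_votes and (not consensus or p - consensus[-1] > tol):
            consensus.append(p)
    return np.array(consensus)

# Example: 8 synthetic leads, one minute at 250 Hz.
rng = np.random.default_rng(7)
beats = polled_beats(rng.standard_normal((8, 250 * 60)), fs=250)
```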
- Published
- 2018
43. Noise Maps for Quantitative and Clinical Severity Towards Long-Term ECG Monitoring
- Author
-
Manuel Blanco-Velasco, Francisco-Manuel Melgarejo-Meseguer, Salvador Sala-Pla, José Luis Rojo-Álvarez, Estrella Everss-Villalba, Francisco-Javier Gimeno-Blanes, and Arcadi García-Alberola
- Subjects
Engineering ,noise bars ,Signal-to-noise ratio ,Electrocardiography ,noise clinical severity ,Noise measurement ,ECG ,Holter ,long-term monitoring ,Gold standard (test) ,noise maps ,Noise ,Categorization ,external event recorder ,Ambulatory electrocardiography ,Artifacts ,Algorithms - Abstract
Noise and artifacts are inherent contaminating components and are particularly present in Holter electrocardiogram (ECG) monitoring. The presence of noise is even more significant in long-term monitoring (LTM) recordings, as these are collected for several days while patients follow their daily activities; hence, strong artifact components can temporarily impair the clinical measurements from the LTM recordings. Traditionally, the presence of noise has been dealt with as a problem of removing non-desirable components by means of several quantitative signal metrics, such as the signal-to-noise ratio (SNR), but current systems do not provide any information about the true impact of noise on the ECG clinical evaluation. As a first step towards an alternative to classical approaches, this work assesses ECG quality under the assumption that an ECG has good quality when it is clinically interpretable. Therefore, our hypotheses are that it is possible (a) to create a clinical severity score for the effect of noise on the ECG, (b) to characterize its consistency in terms of its temporal and statistical distribution, and (c) to use it for signal quality evaluation in LTM scenarios. For this purpose, a database of external event recorder (EER) signals was assembled and labeled from a clinical point of view, for use as the gold standard of noise severity categorization. These devices are assumed to capture the signal segments most prone to be corrupted with noise during long-term periods. Then, the ECG noise is characterized by comparing these clinical severity criteria with conventional quantitative metrics taken from traditional noise-removal approaches, and noise maps are proposed as a novel representation tool to achieve this comparison. Our results showed that none of the benchmarked quantitative noise measurement criteria represent an accurate enough estimation of the clinical severity of the noise. A case study of long-term ECG is reported, showing the statistical and temporal correspondences and properties with respect to the EER signals used to create the gold standard for clinical noise. The proposed noise maps, together with the statistical consistency of the characterization of the noise clinical severity, pave the way towards forthcoming systems that provide noise maps of clinical severity, allowing the user to process different ECG segments with different techniques and in terms of different measured clinical parameters.
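To make the conventional-metric side concrete, here is a minimal sketch of a windowed SNR "map" over an ECG record. The clinical-severity labeling described in the abstract has no simple formula and is not reproduced; the 0.5-40 Hz "signal" band and the function name `snr_map` are assumptions for illustration.

```python
# Minimal sketch: per-window SNR estimate (dB) as a quantitative noise map.
import numpy as np
from scipy.signal import butter, filtfilt

def snr_map(ecg, fs, win_s=5.0):
    """Per-window SNR, taking the 0.5-40 Hz band as the 'signal' part."""
    b, a = butter(4, [0.5, 40.0], btype="band", fs=fs)
    clean = filtfilt(b, a, ecg)                 # band-limited reference
    noise = ecg - clean                         # residual treated as noise
    win = int(win_s * fs)
    n_win = len(ecg) // win
    snr = np.empty(n_win)
    for i in range(n_win):
        s = clean[i * win:(i + 1) * win]
        n = noise[i * win:(i + 1) * win]
        snr[i] = 10 * np.log10(np.sum(s ** 2) / (np.sum(n ** 2) + 1e-12))
    return snr

# Example: 5 minutes of placeholder samples at 250 Hz.
rng = np.random.default_rng(8)
per_window_snr = snr_map(rng.standard_normal(250 * 300), fs=250)
```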
- Published
- 2017
44. QRS Fragmentation Index as a New Discriminator for Early Diagnosis of Heart Diseases
- Author
-
Zaida Molins-Bordallo, José Luis Rojo-Álvarez, Jose-Antonio Flores-Yepes, Mariela Salar-Alcaraz, Estrella Everss-Villalba, Francisco-Javier Gimeno-Blanes, Arcadi García-Alberola, and Francisco-Manuel Melgarejo-Meseguer
- Subjects
Acute coronary syndrome ,Brugada syndrome ,QRS complex ,QRS fragmentation ,Discriminator ,Fiducial marker ,Pattern recognition (psychology) ,Internal medicine ,Cardiology - Abstract
In the past few years, the presence of fragmentation in the QRS complex has been demonstrated to be related to diseases such as myocardial fibrosis, cardiac sarcoidosis, arrhythmogenic cardiomyopathies, acute coronary syndrome, and Brugada syndrome, among others. The detection of fragmentation in the QRS is usually carried out manually, which represents a subjective pattern recognition task that demands effort from the clinician, increasing with the number of patients. These problems make fragmentation detection a good candidate for automation. In this work, we used a database with over six thousand 12-lead ECGs from Hospital Virgen de la Arrixaca de Murcia (Spain), which were digitally recorded with GE MAC5000 equipment. Records from affected and non-affected patients were extracted for computerized analysis. Clinical supervision was performed for gold-standard development and for signal classification. Fragmentation detection algorithms were developed using first- and second-derivative calculations on the pre-qualified segments of the signal, after fiducial point detection. The obtained results were 96.88% sensitivity, 72.92% specificity, and 82.50% accuracy. These results confirm that it is possible to automatically detect fragmentation, constituting a relevant tool to pre-qualify patients for further diagnostic tests, and opening new opportunities for computerized diagnosis.
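The derivative-based idea can be sketched as counting slope reversals (extra peaks and notches) inside a pre-qualified QRS window. This is a simplified illustration; the published method also relies on fiducial-point detection and clinically tuned thresholds, and the amplitude guard used here is an assumption.

```python
# Minimal sketch: count first-derivative sign changes within a QRS segment.
import numpy as np

def qrs_notch_count(qrs, min_amplitude=0.02):
    """Count slope reversals within a QRS segment (1-D array, mV)."""
    d1 = np.diff(qrs)                                   # first derivative
    reversals = np.where(np.diff(np.sign(d1)) != 0)[0] + 1
    # Keep only reversals with a non-negligible local excursion, so that
    # low-amplitude noise ripples are not counted as fragmentation.
    strong = [i for i in reversals
              if np.ptp(qrs[max(0, i - 3):i + 4]) > min_amplitude]
    return len(strong)

# A smooth beat yields one reversal (the R peak); counts above a
# clinician-validated threshold would flag the beat as fragmented.
qrs = np.sin(np.linspace(0, np.pi, 40))                 # placeholder QRS
print(qrs_notch_count(qrs))
```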
- Published
- 2017
- Full Text
- View/download PDF
45. On the Influence of Heart Rate and Coupling Interval Prematurity on Heart Rate Turbulence
- Author
-
Inmaculada Mora-Jiménez, Francisco-Javier Gimeno-Blanes, Juan Pablo Martinez, Oscar Barquero-Perez, José Luis Rojo-Álvarez, Pablo Laguna, Rebeca Goya-Esteban, Leif Sörnmo, Arcadi García-Alberola, Carlos Figuera, and Eduardo Gil
- Subjects
Adult ,Male ,Female ,Aged ,Middle aged ,Humans ,Myocardial infarction ,Baroreflex ,Heart rate turbulence ,Electrocardiography ,Heart rate ,Risk factors ,Internal medicine ,Signal processing, computer-assisted ,Regression analysis ,Ventricular premature complexes ,Linear models ,Nonlinear regression ,Cardiology ,Biomedical engineering - Abstract
Objective: Heart rate turbulence (HRT) has been successfully explored for cardiac risk stratification. While HRT is known to be influenced by the heart rate (HR) and the coupling interval (CI), nonconcordant results have been reported on how the CI influences HRT. The purpose of this study is to investigate HRT changes in terms of CI and HR by means of a specially designed protocol. Methods: A dataset was acquired from 11 patients with structurally normal hearts, for which the CI was altered by different pacing trains and the HR by isoproterenol during an electrophysiological study (EPS). The protocol was designed so that, first, the effect of HR changes on HRT and, second, the combined effect of HR and CI could be explored. As a complement to the EPS dataset, a database of 24-h Holter recordings from 61 acute myocardial infarction (AMI) patients was studied for the purpose of risk assessment. Data analysis was performed using different nonlinear ridge regression models, and the relevance of the model variables was assessed using resampling methods. The EPS subjects, with and without isoproterenol, were analyzed separately. Results: The proposed nonlinear regression models were found to account for the influence of HR and CI on HRT, both in patients undergoing EPS without isoproterenol and in low-risk AMI patients, whereas this influence was absent in high-risk AMI patients. Moreover, model coefficients related to CI were not statistically significant ($p > 0.05$) in EPS subjects with isoproterenol. Conclusion: The observed relationship between CI and HRT, in agreement with the baroreflex hypothesis, was statistically significant ($p < 0.05$) when decoupling the effect of HR and normalizing the CI by the HR. Significance: The results of this study can help to provide new risk indicators that take into account the physiological influence on HRT, as well as to model how this influence changes under different cardiac conditions.
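A minimal sketch of a nonlinear ridge regression of an HRT summary on HR and CI follows, using kernel ridge regression as one common instance of the family; the paper's exact formulation and resampling analysis are richer, and the data and parameter values here are synthetic assumptions.

```python
# Minimal sketch: nonlinear (kernel) ridge regression of HRT on HR and CI.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(3)
hr = rng.uniform(55, 95, size=200)               # mean heart rate, bpm
ci = rng.uniform(0.4, 0.8, size=200)             # coupling interval / RR
hrt = 0.05 * hr - 8.0 * ci + rng.normal(0, 0.5, size=200)   # toy response

X = np.column_stack([hr, ci])
model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.05)
model.fit(X, hrt)

# The fitted surface can then be used to decouple the HR effect from the
# CI effect, e.g. by predicting over a grid of CI values at fixed HR.
grid = np.column_stack([np.full(50, 70.0), np.linspace(0.4, 0.8, 50)])
ci_effect_at_hr70 = model.predict(grid)
```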
- Published
- 2017
46. Electrocardiographic Fragmented Activity (II): A Machine Learning Approach to Detection
- Author
-
José Luis Rojo-Álvarez, Francisco-Javier Gimeno-Blanes, María-Eladia Salar-Alcaraz, Francisco-Manuel Melgarejo-Meseguer, Arcadi García-Alberola, Juan Martínez-Sánchez, and Juan-Ramon Gimeno-Blanes
- Subjects
Computer science ,ECG ,fragmentation detection ,fibrosis detection ,machine learning ,multivariate techniques ,Artificial neural network ,Hypertrophic cardiomyopathy ,Decision tree ,Naive Bayes classifier ,Support vector machine ,QRS complex ,Gaussian function ,Medical diagnosis ,cardiovascular diseases - Abstract
Hypertrophic cardiomyopathy is, given its prevalence, a comparatively common disease related to the risk of suffering sudden cardiac death, heart failure, and stroke. This illness is characterized by the excessive deposition of collagen among healthy myocardium cells. This situation, medically known as fibrosis, constitutes an effective conduction obstacle in the electrical path of the myocardium, and when severe enough, it can be outlined as additional peaks or notches in the QRS, clinically termed fragmentation. Nowadays, fragmentation detection is performed by visual inspection, but the fragmented QRS can be confused with the noise present in the electrocardiogram (ECG). On the other hand, fibrosis detection is performed by magnetic resonance imaging with late gadolinium enhancement, the main drawback of this technique being its cost in terms of time and money. In this work, we propose two automatic algorithms, one for fragmented QRS detection and another for fibrosis detection. For this purpose, we used four different databases, including the surrogate database described in the companion paper and three additional ones: one composed of more accurate surrogate ECG signals, and two composed of records from real, affected subjects as labeled by expert clinicians. The first real-world database contains QRS-fragmented records and the second one contains records with fibrosis, both recorded at Hospital Clínico Universitario Virgen de la Arrixaca (Spain). To analyze the scope of these datasets in depth, we benchmarked several classifiers, namely Neural Networks, Support Vector Machines (SVM), Decision Trees, and Gaussian Naïve Bayes (NB). For the fragmentation dataset, the best results were 0.94 sensitivity, 0.88 specificity, 0.89 positive predictive value, 0.93 negative predictive value, and 0.91 accuracy when using an SVM with a Gaussian kernel. For the fibrosis databases, more limited accuracy was reached, with 0.47 sensitivity, 0.91 specificity, 0.82 positive predictive value, 0.66 negative predictive value, and 0.70 accuracy when using Gaussian NB. Nevertheless, this is the first time that fibrosis detection is attempted automatically from ECG postprocessing, paving the way towards improved algorithms and methods. Therefore, we can conclude that the proposed techniques could offer a valuable tool to clinicians for supporting both fragmentation and fibrosis diagnoses.
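The benchmarking loop can be sketched generically: several classifier families evaluated with the clinical metrics quoted above. Feature extraction from the ECG is assumed already done; the placeholder data and the `clinical_metrics` helper are illustrative, not the paper's setup.

```python
# Minimal sketch: cross-validated classifier benchmark with clinical metrics.
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

def clinical_metrics(y_true, y_pred):
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    # max(..., 1) guards against empty denominators in this toy setting.
    return dict(sens=tp / max(tp + fn, 1), spec=tn / max(tn + fp, 1),
                ppv=tp / max(tp + fp, 1), npv=tn / max(tn + fn, 1),
                acc=(tp + tn) / len(y_true))

rng = np.random.default_rng(4)
X = rng.standard_normal((120, 10))           # placeholder feature vectors
y = rng.integers(0, 2, size=120)

models = {"SVM-RBF": SVC(kernel="rbf"), "GaussianNB": GaussianNB(),
          "Tree": DecisionTreeClassifier(), "MLP": MLPClassifier(max_iter=500)}
for name, model in models.items():
    y_pred = cross_val_predict(model, X, y, cv=5)
    print(name, clinical_metrics(y, y_pred))
```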
- Published
- 2019
- Full Text
- View/download PDF
47. On the differential benchmarking of promotional efficiency with machine learning modelling (II): Practical applications
- Author
-
Francisco-Javier Gimeno-Blanes, José Luis Rojo-Álvarez, María Pilar Martínez-Ruiz, Cristina Soguero-Ruiz, and Inmaculada Mora-Jiménez
- Subjects
Artificial neural network ,Computer science ,Feature extraction ,Benchmarking ,Machine learning ,Support vector machine ,Artificial Intelligence ,Multilayer perceptron ,Kernel (statistics) ,Data mining - Abstract
The assessment of promotional sales with models constructed by machine learning techniques is arousing interest due, among other reasons, to the current economic situation, which leads to a more complex environment of simultaneous and concurrent promotional activities. An operative model diagnosis procedure was previously proposed in the companion paper, which can readily be used both for agile decision making on the architecture and implementation details of the machine learning algorithms, and for differential benchmarking among models. In this paper, a detailed example of model analysis is presented for two representative databases with different promotional behaviour, namely, a non-seasonal category (milk) and a heavily seasonal category (beer). The performance of four well-known machine learning techniques of increasing complexity is analyzed in detail: k-Nearest Neighbours, General Regression Neural Networks, Multilayer Perceptrons (MLP), and Support Vector Machines (SVM) are differentially compared. The present paper evaluates these techniques through the experiments described for both categories, applying the methodological findings obtained in the companion paper. We conclude that some elements included in the architecture are not essential for a good performance of the machine learning promotional models, such as the semiparametric nature of the kernel in SVM models, whereas others can be strongly dependent on the database, such as the convenience of multiple-output models in MLP regression schemes. Additionally, the specific behaviour of certain categories and product ranges determines the need to establish suitable and specific procedures for better prediction and feature extraction.
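A minimal sketch of the differential benchmarking idea follows. GRNN has no scikit-learn implementation, so only three of the four families named above are shown; the synthetic promotional features and all parameter values are assumptions, not the paper's data or tuning.

```python
# Minimal sketch: cross-validated comparison of regression models for
# promotional-sales prediction.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(5)
X = rng.uniform(0, 1, size=(300, 6))     # e.g., price cut, display, feature flags
y = 100 + 80 * X[:, 0] + 30 * X[:, 1] * X[:, 2] + rng.normal(0, 5, 300)

models = {"kNN": KNeighborsRegressor(n_neighbors=7),
          "MLP": MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000),
          "SVR": SVR(kernel="rbf", C=100.0)}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: R2 = {scores.mean():.3f} +/- {scores.std():.3f}")
```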
- Published
- 2012
- Full Text
- View/download PDF
48. Arundo donax chipboard based on urea-formaldehyde resin using under 4mm particles size meets the standard criteria for indoor use
- Author
-
J.J. Pastor, M.J. Frutos, I. Rodríguez-Guisado, A. Martinez-Gabarron, Francisco-Javier Gimeno-Blanes, and J.A. Flores
- Subjects
Moisture ,Urea-formaldehyde ,Arundo donax ,Raw material ,Pulp and paper industry ,Horticulture ,Particle board ,Environmental science ,Hammer ,Weed ,Agronomy and Crop Science ,Renewable resource - Abstract
Common reed (Arundo donax) is a fast-growing perennial plant, considered in many countries a weed or invasive plant. The development of new materials in recent years has led to common cane proliferation and an invasion of crops and water channels, significantly increasing elimination and control costs. In this study, it is proposed that particleboard be manufactured from common reed (A. donax). The objective is twofold: on the one hand, it allows control of a weed; on the other, it helps reduce the high dependence on imported timber and boards by using a readily and annually renewable resource. The fibers of common reed possess high tensile strength (200 N/mm² between knots), providing much higher values than wood. This fact, together with an important moisture and aging resistance, encourages us to propose it as a suitable raw material for the production of particleboards. In this study we propose the manufacture of particleboard with urea-formaldehyde resin, which is currently used in the wood chipboard industry. To perform this study, A. donax fibers obtained with a hammer shredder were classified according to size and slenderness. Following standard industrial chipboard manufacturing processes, boards were developed to compare their behaviour against wood chipboards. Fifteen test batches with different proportions of pre-selected fibers were manufactured. The new boards were subjected to mechanical stress tests, and other physical properties were also measured. UNE-EN specific wood chipboard standards were applied. Results showed that particleboard could be manufactured using A. donax by limiting the particle size with 4 mm sieves. Finally, it can be stated that making common reed panels also involves the reuse of a plant currently considered a weed, turning it into a raw material for industrial applications and leaving the door open to future development and research areas.
- Published
- 2011
- Full Text
- View/download PDF
49. Pressure Impact on Common Reed Particleboards Manufacturing Procedure
- Author
-
M.J. Frutos, Francisco-Javier Gimeno-Blanes, A. Martinez-Gabarron, J.J. Pastor, and J.A. Flores
- Subjects
pressure in chipboard ,Wood waste ,wood remainder ,Engineering ,Waste management ,Arundo donax L. ,Urea-formaldehyde ,ISO standards ,Pulp and paper industry ,Particle board ,system engineering procedures ,giant reed ,common reed ,particleboards ,shredder ,shredding machine - Abstract
The present work elaborates on systems and engineering procedures to create particleboards based on common reed. Particleboard was manufactured using the shredding blades usually applied in the wood-particleboard industry. Diverse types of boards were manufactured keeping the same proportion of urea-formaldehyde resin (for indoor use), but with different particle proportions according to ISO standard sieves, and varying the pressure from 3 N/mm² to 25 N/mm². Density, bending resistance, and elasticity tests were performed, and results were compared against commercial wood particleboards. The results allowed us to state that particle size and pressure play a significant role in common reed particleboard properties. Common reed particleboards were classified according to UNE-EN standards, taking into account the relevant standard tests carried out as well as additional classification rules.
- Published
- 2011
- Full Text
- View/download PDF
50. Denoising of Heart Rate Variability signals during tilt test using independent component analysis and multidimensional recordings
- Author
-
Estrella Everss, J. Hernandez-Ortega, Arcadio García-Alberola, F. Alonso-Atienza, Jesus Requena-Carrion, José Luis Rojo-Álvarez, and Francisco-Javier Gimeno-Blanes
- Subjects
Telecommunications ,Computer science ,Signal reconstruction ,Noise reduction ,Cardiology ,Gold standard (test) ,Signal ,Independent component analysis ,Distortion ,Median filter ,Heart rate variability - Abstract
Vasovagal syncope (VVS) is the most frequent cause of loss of consciousness. In addition to its clinical usefulness, the tilt test is a good-quality physiological gold standard for the spectral analysis of Heart Rate Variability (HRV). Noise removal in HRV signals is problematic due to the presence of ectopic beats and non-stationary short-term trends. Given that current tilt test systems simultaneously record several physiological signals, we hypothesize that independent component analysis (ICA) may separate physiological components from mostly-noise components, so that denoising can be properly done. Four-dimensional recordings (HR, systolic/diastolic blood pressure, and ejection volume) were obtained during 50 tilt tests. After ICA decomposition, a 5th-order median filter was applied to the noisiest component prior to signal reconstruction. In order to check the denoising performance, a gold standard was made by an expert manually removing ectopic beats and artifacts from the original signals. For comparison purposes, a 5th-order median filter was also applied separately to the HR signal. The spectral analysis showed that denoising multidimensional recordings with ICA during the tilt test yields HRV signals with lower distortion in the HF band.
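The denoising chain can be sketched as follows, assuming the four simultaneous recordings are stacked as columns of `X` (samples x 4): ICA decomposition, median filtering of the noisiest component, and reconstruction. Ranking component "noisiness" by kurtosis is an assumption of this sketch, not the paper's stated criterion.

```python
# Minimal sketch: ICA decomposition, median filter on the noisiest
# component, then reconstruction back to signal space.
import numpy as np
from scipy.signal import medfilt
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

def ica_denoise(X, kernel_size=5):
    ica = FastICA(n_components=X.shape[1], random_state=0)
    S = ica.fit_transform(X)                     # independent components
    noisiest = np.argmax(kurtosis(S, axis=0))    # impulsive-artifact proxy
    S[:, noisiest] = medfilt(S[:, noisiest], kernel_size=kernel_size)
    return ica.inverse_transform(S)              # reconstructed recordings

# Example with 4 synthetic channels (HR, SBP, DBP, ejection volume):
rng = np.random.default_rng(6)
X = rng.laplace(size=(5000, 4))                  # super-Gaussian toy sources
X_denoised = ica_denoise(X)
```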
- Published
- 2007
- Full Text
- View/download PDF