791 results for "ROBUST control"
Search Results
152. Bilinear low-rank coding framework and extension for robust image recovery and feature representation.
- Author
- Zhang, Zhao, Yan, Shuicheng, Zhao, Mingbo, and Li, Fan-Zhang
- Subjects
- *ROBUST control, *FEATURE extraction, *PROBLEM solving, *ERROR correction (Information theory), *DATA analysis
- Abstract
We mainly study the low-rank image recovery problem by proposing a bilinear low-rank coding framework called Tensor Low-Rank Representation. For enhanced low-rank recovery and error correction, our method constructs a low-rank tensor subspace to reconstruct given images along row and column directions simultaneously by computing two low-rank matrices alternately from a nuclear norm minimization problem, so both column and row information of data can be effectively preserved. Our bilinear approach seamlessly integrates the low-rank coding and dictionary learning into a unified framework. Thus, our formulation can be treated as enhanced Inductive Robust Principal Component Analysis with noise removed by low-rank representation, and can also be considered as the enhanced low-rank representation with a clean informative dictionary via low-rank embedding. To enable our method to include outside images, the out-of-sample extension is also presented by regularizing the model to correlate image features with the low-rank recovery of the images. Comparison with other criteria shows that our model exhibits stronger robustness and enhanced performance. We also use the outputted bilinear low-rank codes for feature learning. Two unsupervised local and global low-rank subspace learning methods are proposed for extracting image features for classification. Simulations verified the validity of our techniques for image recovery, representation and classification. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
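The nuclear norm minimization named in the abstract above is conventionally solved with singular value soft-thresholding. The sketch below shows only that generic operator, not the authors' bilinear tensor formulation; the test matrix is invented.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of the nuclear
    norm, the basic building block of low-rank recovery solvers."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt  # soft-threshold the spectrum

# A rank-1 matrix plus small dense noise: thresholding removes the noise directions.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 1)) @ rng.standard_normal((1, 20)) \
    + 0.01 * rng.standard_normal((20, 20))
L = svt(M, tau=0.5)
```

With the noise singular values well below `tau`, the recovered matrix is exactly rank one.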
153. Effects of corruption on efficiency of the European airports.
- Author
- Randrianarisoa, Laingo Manitra, Bolduc, Denis, Choo, Yap Yin, Oum, Tae Hoon, and Yan, Jia
- Subjects
- *CORRUPTION, *AIRPORTS, *DATA analysis, *ROBUST control, *PRODUCTION (Economic theory)
- Abstract
The effect of corruption on airport productive efficiency is analyzed using unbalanced panel data on major European airports from 2003 to 2009. We first compute the residual (or net) variable factor productivity using the multilateral index number method and then apply a robust cluster random-effects model to evaluate the importance of corruption. We find strong evidence that corruption has negative impacts on airport operating efficiency, and that the effects depend on the ownership form of the airport. The results suggest that airports under mixed public–private ownership with a private majority achieve lower levels of efficiency when located in more corrupt countries. They even operate less efficiently than fully or majority government-owned airports in high-corruption environments. We control for economic regulation, competition level and other airport characteristics. Our empirical results survive several robustness checks, including different control variables and three alternative corruption measures: the International Country Risk Guide (ICRG) corruption index, the Corruption Perception Index (CPI) and the Control of Corruption Index (CCI). The empirical findings have important policy implications for the management and ownership structuring of airports operating in countries that suffer from higher levels of corruption. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
154. Derivation of spatiotemporal data for cyclists (from video) to enable agent-based model calibration.
- Author
- Osowski, Chris and Waterson, Ben
- Subjects
- SPATIOTEMPORAL processes, DATA analysis, MULTIAGENT systems, ROBUST control, IMAGE processing, CYCLISTS
- Abstract
Increasing cycling's share of urban travel is widely desired. To support the delivery of improved infrastructure, robust modelling is needed, and modelling such agents demands real-world calibration. Data to enable this are expensive to obtain by conventional means. This paper presents and demonstrates a process for the analysis of video to provide such data. This analysis yields spatiotemporal data for experimentally observed cyclists from which velocity information (amongst other things) can be derived. With further refinement, this process can be used in the analysis of existing and future highway video data. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
155. Robust Conclusions in Mass Spectrometry Analysis.
- Author
- Zoppis, Italo, Dondi, Riccardo, Borsani, Massimiliano, Gianazza, Erica, Chinello, Clizia, Magni, Fulvio, and Mauri, Giancarlo
- Subjects
- ROBUST control, MASS spectrometry, DATA analysis, BIOLOGICAL databases, DECISION making, RENAL cell carcinoma
- Abstract
A central issue in biological data analysis is that uncertainty, resulting from different factors of variability, may change the effect of the events being investigated. Therefore, robustness is a fundamental step to be considered. Robustness refers to the ability of a process to cope well with uncertainties, but the different ways to model both the processes and the uncertainties lead to many alternative conclusions in the robustness analysis. In this paper we apply a framework that allows us to deal with such questions for mass spectrometry data. Specifically, we provide robust decisions when testing hypotheses over a case/control population of subject measurements (i.e. proteomic profiles). To this end, we formulate (i) a reference model for the observed data (i.e., graphs), (ii) a reference method to provide decisions (i.e., tests of hypotheses over graph properties) and (iii) a reference model of variability to represent sources of uncertainty (i.e., random graphs). We apply these models to a real case study, analyzing the mass spectrometry profiles of the most common type of Renal Cell Carcinoma, the Clear Cell variant. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
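The abstract above describes testing hypotheses over case/control populations. As a hedged illustration of the generic shape of such a test, the sketch below runs a two-sample permutation test on invented summary values; the paper's statistics are graph-derived properties, not plain means.

```python
import random

def permutation_test(case, control, n=10000, seed=0):
    """Two-sample permutation test on the difference of means: shuffle
    group labels and count splits at least as extreme as observed."""
    rng = random.Random(seed)
    mean = lambda g: sum(g) / len(g)
    observed = abs(mean(case) - mean(control))
    pooled = list(case) + list(control)
    hits = 0
    for _ in range(n):
        rng.shuffle(pooled)
        hits += abs(mean(pooled[:len(case)]) - mean(pooled[len(case):])) >= observed
    return hits / n  # fraction of label shufflings at least as extreme

# Invented case/control 'profile summaries' with a clear group difference:
p_value = permutation_test([1.2, 1.4, 1.1, 1.5, 1.3] * 4, [0.2, 0.4, 0.1, 0.5, 0.3] * 4)
```

Because the two invented groups barely overlap, almost no shuffled split reproduces the observed difference, so the p-value is near zero.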
156. Calibrated tree counting on remotely sensed images of planted forests.
- Author
- Pont, David, Kimberley, Mark O., Brownlie, Rod K., Sabatia, Charles O., and Watt, Michael S.
- Subjects
- *REMOTE sensing, *ROBUST control, *DATA analysis, *FOREST surveys, *AIRBORNE lasers
- Abstract
The development of robust and accurate methods for counting trees from remotely sensed data could provide substantial cost savings in forest inventory. A new methodology that provides a framework for calibrating tree detection algorithms to obtain accurate tree counts for even-aged stands is described. The methodology was evaluated using two tree detection algorithms and two operators using airborne laser scanning (ALS) and orthophotograph images for four Pinus radiata D.Don stands ranging in age between 5 and 32 years with stand densities ranging between 204 and 826 stems ha⁻¹. For application of the methodology to ALS images the error of estimate on the total count was 4.7% when calibration counts from actual ground plots were used and 10.5% when calibration counts from virtual plots on the image were used. For orthophotographs, the error of estimate was 6.1% using ground calibration plots and 24.3% using calibration counts from virtual plots. The described methodology was shown to be robust to variations in the process from the two operators and two algorithms evaluated. The measure of accuracy determined using the methodology can be used to provide an objective basis for evaluating a wide range of tree counting and detection processes in future research. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
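The abstract above does not spell out the calibration formula, so the sketch below shows the simplest plausible reading: a ratio estimator that rescales the detector's image-wide count by the ground-truth/detector ratio observed on calibration plots. The function name and all numbers are invented for illustration.

```python
def calibrated_count(algo_total, plot_algo_counts, plot_true_counts):
    """Hypothetical ratio calibration: rescale the detector's total count
    by the ground-count/detector-count ratio on calibration plots."""
    return algo_total * sum(plot_true_counts) / sum(plot_algo_counts)

# A detector reports 900 stems in total but undercounts on two field plots.
estimate = calibrated_count(900, plot_algo_counts=[45, 50], plot_true_counts=[50, 55])
```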
157. Marker Codes on BPMR Write Channel With Data-Dependent Written-in Errors.
- Author
- Wu, Tong and Armand, Marc A.
- Subjects
- *DATA analysis, *SYNCHRONIZATION, *PARITY-check matrix, *ROBUST control, *COMPUTER simulation
- Abstract
The performance of bit-patterned media recording is limited, in part, by written-in errors in the write channel, which has recently been modeled by a channel introducing data-dependent insertion, deletion, and substitution (DIDS) errors. The Davey-MacKay (DM) construction has shown promising error performance on the DIDS channel, where synchronization errors are sparse and burst error lengths before and after each synchronization error are short. In this paper, we consider marker codes with a nonbinary low-density parity-check code as the outer code for the DIDS channel. Our contributions are twofold. First, we propose two computationally efficient inner decoding schemes for the DM and marker code constructions. One is suitable when the burst errors in the DIDS channel are short, and the other when the burst errors are long. Second, our computer simulations show that marker codes can increasingly outperform DM codes as the burst error length increases, and thus provide greater robustness against burst errors on the DIDS channel. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
158. Can They See It? The Functional Field of View Is Narrower in Individuals with Autism Spectrum Disorder.
- Author
- Song, Yongning, Hakoda, Yuji, Sanefuji, Wakako, and Cheng, Chen
- Subjects
- *AUTISM spectrum disorders, *SOCIAL perception, *BRAIN imaging, *ROBUST control, *DATA analysis
- Abstract
Although social cognitive deficits have long been thought to underlie the characteristic and pervasive difficulties with social interaction observed in individuals with autism spectrum disorder (ASD), several recent behavioral and neuroimaging studies have indicated that visual perceptual impairments might also play a role. People with ASD show a robust bias towards detailed information at the expense of global information, although the mechanisms that underlie this phenomenon remain elusive. To address this issue, we investigated the functional field of view in a group of high-functioning children with autism (n = 13) and a paired non-ASD group (n = 13). Our results indicate that the ability to correctly detect and identify stimuli sharply decreases with greater eccentricity from the fovea in people with ASD. Accordingly, a probe analysis revealed that the functional field of view in the ASD group was only about 6.62° of retinal eccentricity, compared with 8.57° in typically developing children. Thus, children with ASD appear to have a narrower functional field of view. These results challenge the conventional hypothesis that the deficit in global processing in individuals with ASD is solely due to weak central coherence. Alternatively, our data suggest that a narrower functional field of view may also contribute to this bias. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
159. A robust ensemble approach to learn from positive and unlabeled data using SVM base models.
- Author
- Claesen, Marc, De Smet, Frank, Suykens, Johan A.K., and De Moor, Bart
- Subjects
- *ROBUST control, *MACHINE learning, *DATA analysis, *SUPPORT vector machines, *SET theory
- Abstract
We present a novel approach to learn binary classifiers when only positive and unlabeled instances are available (PU learning). This problem is routinely cast as a supervised task with label noise in the negative set. We use an ensemble of SVM models trained on bootstrap resamples of the training data for increased robustness against label noise. The approach can be considered in a bagging framework which provides an intuitive explanation for its mechanics in a semi-supervised setting. We compared our method to state-of-the-art approaches in simulations using multiple public benchmark data sets. The included benchmark comprises three settings with increasing label noise: (i) fully supervised, (ii) PU learning and (iii) PU learning with false positives. Our approach shows a marginal improvement over existing methods in the second setting and a significant improvement in the third. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
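A minimal sketch of the bagging idea described in the abstract above, assuming scikit-learn is available: SVMs trained on the positives against bootstrap resamples of the unlabeled set (treated as noisy negatives), with decision scores averaged. This is a simplified reading of the approach, not the authors' implementation, and the toy data are invented.

```python
import numpy as np
from sklearn.svm import SVC

def pu_bagging_scores(X_pos, X_unl, n_models=10, seed=0):
    """Average decision scores of SVMs trained on positives vs
    bootstrap resamples of the unlabeled set (treated as negatives)."""
    rng = np.random.default_rng(seed)
    scores = np.zeros(len(X_unl))
    for _ in range(n_models):
        idx = rng.integers(0, len(X_unl), size=len(X_pos))  # bootstrap "negatives"
        X = np.vstack([X_pos, X_unl[idx]])
        y = np.array([1] * len(X_pos) + [0] * len(idx))
        clf = SVC(kernel="rbf").fit(X, y)
        scores += clf.decision_function(X_unl)
    return scores / n_models

rng = np.random.default_rng(1)
X_pos = rng.normal(loc=2.0, size=(30, 2))            # labeled positives
X_unl = np.vstack([rng.normal(2.0, size=(20, 2)),    # hidden positives
                   rng.normal(-2.0, size=(40, 2))])  # true negatives
s = pu_bagging_scores(X_pos, X_unl)
```

Averaging over resamples is what gives the robustness to label noise: any single bootstrap may contain hidden positives among its "negatives", but their influence washes out across the ensemble.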
160. Low-Dimensional Non-Rigid Image Registration Using Statistical Deformation Models From Semi-Supervised Training Data.
- Author
- Onofrey, John A., Papademetris, Xenophon, and Staib, Lawrence H.
- Subjects
- *IMAGE registration, *ROBUST control, *IMAGE analysis, *DATA analysis, *MEDICAL imaging systems
- Abstract
Accurate and robust image registration is a fundamental task in medical image analysis applications, and requires non-rigid transformations with a large number of degrees of freedom. Statistical deformation models (SDMs) attempt to learn the distribution of non-rigid deformations, and can be used both to reduce the transformation dimensionality and to constrain the registration process. However, high-dimensional SDMs are difficult to train given orders of magnitude fewer training samples. In this paper, we utilize both a small set of annotated imaging data and a large set of unlabeled data to effectively learn an SDM of non-rigid transformations in a semi-supervised training (SST) framework. We demonstrate results applying this framework towards inter-subject registration of skull-stripped, magnetic resonance (MR) brain images. Our approach makes use of 39 labeled MR datasets to create a set of supervised registrations, which we augment with a set of over 1200 unsupervised registrations using unlabeled MRIs. Through leave-one-out cross validation, we show that SST of a non-rigid SDM results in a robust registration algorithm with significantly improved accuracy compared to standard, intensity-based registration, and does so with a 99% reduction in transformation dimensionality. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
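The abstract above does not state how the statistical deformation model is parameterized; the sketch below uses plain PCA over flattened deformation fields as one conventional, hypothetical reading of "reduce the transformation dimensionality", with synthetic stand-in data.

```python
import numpy as np

def deformation_pca(D, k):
    """Hypothetical SDM sketch via PCA: mean deformation field plus the
    top-k principal directions of the (flattened) training fields."""
    mean = D.mean(axis=0)
    _, _, Vt = np.linalg.svd(D - mean, full_matrices=False)
    return mean, Vt[:k]

def project(d, mean, basis):
    """Low-dimensional coefficients of one deformation field."""
    return basis @ (d - mean)

# 10 synthetic 'fields' of 50 values that truly live in a 2-D subspace.
rng = np.random.default_rng(0)
D = rng.standard_normal((10, 2)) @ rng.standard_normal((2, 50))
mean, basis = deformation_pca(D, k=2)
coeffs = project(D[0], mean, basis)
reconstruction = mean + coeffs @ basis
```

Because the synthetic fields are exactly rank two about their mean, two coefficients reconstruct each field perfectly, illustrating the dimensionality reduction the paper reports.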
161. Video super-resolution with registration-reliability regulation and adaptive total variation.
- Author
- Zhang, Xinfeng, Xiong, Ruiqin, Ma, Siwei, Li, Ge, and Gao, Wen
- Subjects
- *IMAGE registration, *HIGH resolution imaging, *PIXELS, *IMAGE processing, *ROBUST control, *MATHEMATICAL regularization, *DATA analysis
- Abstract
In super-resolution that constructs a high-resolution (HR) image from a set of low-resolution (LR) reference images, it is crucial to align the LR reference images in order to efficiently exploit the pixels therein. However, due to the existence of complex local motion, ideal registration is difficult to acquire. In this paper, we present a robust video super-resolution scheme with registration-reliability regulation and content adaptive total variation regularization, which make the scheme resilient to registration failures. In order to handle ill-registered pixels, we propose a registration-reliability regulated data-fidelity term, which assigns smaller weights to the pixels with larger locally-averaged registration residuals. In addition, a content adaptive total variation based on structure tensor, which is used to estimate image local structures, is proposed to regularize the super-resolved images. The structure tensor is derived not only from the gradients of local patches but also the nonlocal similar patches. Experimental results show that the proposed scheme can remarkably improve both the objective and subjective quality of the video super-resolution results. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
162. Joint movement similarities for robust 3D action recognition using skeletal data.
- Author
- Pazhoumand-Dar, Hossein, Lam, Chiou-Peng, and Masek, Martin
- Subjects
- *ROBUST control, *THREE-dimensional imaging, *IMAGE recognition (Computer vision), *DATA analysis, *HUMAN behavior, *COMPARATIVE studies
- Abstract
Human action analysis based on 3D imaging is an emerging topic. This paper presents an approach for the problem of action recognition using information from a number of action descriptors calculated from a skeleton fitted to the body of a tracked subject. In the proposed approach, a novel technique that automatically determines discriminative sequences of relative joint positions for each action class is employed. In addition, we use an extended formulation of the longest common subsequence algorithm as a similarity function, which allows the classifier to reliably find the best match for extracted features from noisy skeletal data. The proposed approach is evaluated using two existing datasets from the literature, one captured using a Microsoft Kinect camera and the other using a motion capture system. The experimental results show that the approach outperforms existing skeleton-based algorithms in terms of its classification accuracy and is more robust in the presence of noise when compared to the dynamic time warping algorithm for human action recognition. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
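The paper uses an extended longest common subsequence formulation as its similarity function; the sketch below is only the classic dynamic-programming LCS with a length-normalized score, as a baseline illustration of why LCS matching tolerates noisy frames in a symbol sequence.

```python
def lcs_length(a, b):
    """Longest common subsequence length via dynamic programming."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[-1][-1]

def lcs_similarity(a, b):
    """Length-normalized LCS score in [0, 1]: a noisy frame lowers the
    score only slightly instead of breaking the whole alignment."""
    return lcs_length(a, b) / max(len(a), len(b))

# Two symbolic pose sequences that differ by one noisy frame.
sim = lcs_similarity("ABCD", "ABXD")
```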
163. Toward an Empirically Robust Science of Human Development.
- Author
- Duncan, Greg J.
- Subjects
- REPLICATION (Experimental design), ROBUST control, SCIENTIFIC method, EXPERIMENTAL psychology, SENSITIVITY (Personality trait), DATA analysis
- Abstract
Replications and robustness checks are key elements of the scientific method and a staple in many disciplines. My wish is for prioritizing both explicit replications and, especially, the lowest of low-hanging fruit: within-study robustness checks. I provide recommendations for editorial policies that encourage these practices and describe ways of promoting these practices in graduate training. While some of my recommendations might affect the form and substance of developmental research articles, I argue that their scientific benefits are key for advancing the field. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
164. Discussion of 'Multivariate functional outlier detection'.
- Author
- Arribas-Gil, Ana and Romo, Juan
- Subjects
- ROBUST control, OUTLIER detection, MULTIVARIATE analysis, DATA analysis
- Published
- 2015
- Full Text
- View/download PDF
165. Robust hand tracking via novel multi-cue integration.
- Author
- Zhang, Xiaoqin, Li, Wei, Ye, Xiuzi, and Maybank, Stephen
- Subjects
- *ROBUST control, *INFORMATION theory, *COMPUTER algorithms, *DATA analysis, *COMPUTER research
- Abstract
In this paper, we present a robust real-time hand tracking system via multi-cue integration. In practice, the motion information of the hand, such as optical flow, is hard to exploit, because images of hands lack texture. As a result, the integration of the color and motion cues using conventional integration algorithms is difficult. Here, we integrate the motion and color cues from a novel feature point selection view. The hand is tracked using feature points, and the integration is realized during the feature point generation and selection process. In the generation process, a bounding box estimated by the color cue is used to provide a region for the feature point generation. Then, the RCD (Representative, Compact and Diverse) criteria are proposed to control the feature point selection process. After the selection process, the feature points are tracked using estimates of the motion of each feature point. The centroid of the feature points in each frame is adopted as the position of the hand. The experimental results show that our integration algorithm outperforms tracking algorithms that only use a single cue. Also the proposed tracking algorithm is more robust in complex environments than other state-of-the-art algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
166. Sea lions' (Zalophus californianus) use of human pointing gestures as referential cues.
- Author
- Malassis, Raphaëlle and Delfour, Fabienne
- Subjects
- *POINTING (Gesture), *SEA lions, *ROBUST control, *SOCIAL perception, *DATA analysis
- Abstract
This experiment investigated the ability of four human-socialized sea lions to exploit human communicative gestures in three different object-choice tasks based on directional cues emitted by their caretakers. In Study 1, three of the tested subjects were able to generalize their choice of the pointed target to variations of the basic pointing gestures (i.e., cross-body point, elbow point, foot point, and gaze only), from the very first trials. Study 2 showed that the subjects could follow the pointing gestures geometrically and select the correct target among four possible targets, two on each side of the informant. In Study 3, we tested the robustness of their tendency to follow a pointing gesture by hiding targets behind barriers. Without any training, one subject was able to follow pointing gestures towards targets that were not visible at the moment of decision, despite the presence of another visible and directly accessible target. Taken together, these results suggest that the sea lions were able to use the referential property of the human pointing gesture, because they were able to rely on extrapolating precise linear vectors along different pointing body parts in order to identify a precise object rather than merely a general direction. These findings support previous arguments that some non-domesticated species might have as great an ability to respond appropriately to pointing gestures as domesticated dogs. The potential roles of human socialization and specific features of wild sea lions' ecology are discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
167. AUTOMATED GATING OF PORTABLE CYTOMETER DATA BASED ON SKEW t MIXTURE MODELS.
- Author
- WANG, XIANWEN, CHEN, FENG, CHENG, ZHI, DU, YAOHUA, and WU, TAIHU
- Subjects
- *FLOW cytometry, *MATHEMATICAL models, *DATA analysis, *CELL populations, *ROBUST control, *EXPECTATION-maximization algorithms
- Abstract
A major component of flow cytometry (FCM) data analysis involves gating, which is the process of identifying homogeneous groups of cells. With the rapid development of the portable flow cytometer, manual gating techniques have been unable to meet the demand for accurate and rapid analysis of samples. To provide a practical application for portable devices, we propose a flexible, statistical model-based clustering approach for identifying cell populations in FCM data. This approach, which mimics the manual gating process, employs a finite mixture model with a density function of the skew t distribution and estimates parameters via an expectation maximization algorithm. Analysis of data from an experiment on a patient's peripheral blood samples has shown that the proposed methodology yields better results in terms of robustness against outliers than current state-of-the-art automated gating methods, has more flexibility in clustering symmetric data and leads to lower misclassification rates when handling highly asymmetric data (the misclassification rate of the skew t method is 0.06442). The method we propose will improve data analysis of portable flow cytometers, especially when the users have no professional training. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
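A full skew t EM is lengthy, so the sketch below substitutes Gaussian components to illustrate the model-based gating loop the abstract describes (E-step responsibilities, M-step parameter updates). It is a simplified stand-in on invented 1-D data, not the paper's estimator.

```python
import numpy as np

def em_two_gaussians(x, iters=50):
    """Simplified model-based gating: EM for a two-component 1-D
    Gaussian mixture (the paper uses skew t components instead)."""
    mu = np.array([x.min(), x.max()])          # spread initial means
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update mixing weights, means, and standard deviations
        n = r.sum(axis=0)
        pi = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n)
    return pi, mu, sigma

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(6, 1, 200)])
pi, mu, sigma = em_two_gaussians(x)
```

Gating then amounts to assigning each event to the component with the highest responsibility.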
168. Robust Multi-person Tracking for Real-Time Intelligent Video Surveillance.
- Author
- Jin-Woo Choi, Daesung Moon, and Jang-Hee Yoo
- Subjects
- VIDEO surveillance, ROBUST control, REAL-time programming, DATA analysis, MONTE Carlo method
- Abstract
We propose a novel multiple-object tracking algorithm for real-time intelligent video surveillance. We adopt particle filtering as our tracking framework. Background modeling and subtraction are used to generate a region of interest. A two-step pedestrian detection is employed to reduce the computation time of the algorithm, and an iterative particle repropagation method is proposed to enhance its tracking accuracy. A matching score for greedy data association is proposed to assign the detection results of the two-step pedestrian detector to trackers. Various experimental results demonstrate that the proposed algorithm tracks multiple objects accurately and precisely in real time. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
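The abstract above names particle filtering as the tracking framework. The sketch below is a generic one-dimensional predict/update/resample cycle with invented noise parameters and measurements, not the proposed multi-person tracker.

```python
import numpy as np

def particle_filter_step(particles, weights, measurement, rng,
                         motion_std=1.0, meas_std=2.0):
    """One predict-update-resample cycle of a 1-D particle filter."""
    # Predict: diffuse particles with motion noise
    particles = particles + rng.normal(0, motion_std, size=len(particles))
    # Update: reweight particles by the measurement likelihood
    weights = weights * np.exp(-0.5 * ((particles - measurement) / meas_std) ** 2)
    weights /= weights.sum()
    # Resample: draw particles in proportion to their weights
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

rng = np.random.default_rng(0)
particles = rng.uniform(-50, 50, 1000)          # uninformed initial belief
weights = np.full(1000, 1e-3)
for z in [10.0, 10.5, 11.0, 11.5]:              # noisy observations of a slow target
    particles, weights = particle_filter_step(particles, weights, z, rng)
est = particles.mean()
```

After a few observations the particle cloud collapses around the target, and the posterior mean tracks it.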
169. A multi-approach strategy in climate attribution studies: Is it possible to apply a robustness framework?
- Author
- Pasini, Antonello and Mazzocchi, Fulvio
- Subjects
- CLIMATE change, ROBUST control, GLOBAL warming, EFFECT of human beings on climate change, DATA analysis
- Abstract
Attribution studies investigate the causes of recent global warming. For a few decades the scientific community has generally adopted dynamical models – the so-called Global Climate Models (GCMs) – for such investigations. These models show the essential role of anthropogenic forcings in driving the temperature behaviour of the last half century. More recently, other (data-driven) methodological approaches have also been adopted for attribution studies. This allows the scientific community to compare the results coming from these different approaches and possibly to increase their robustness. To that end, the paper explores the possibility of applying a robustness framework, so far used only in the case of multi-model GCM ensembles, to a strategy including models from different methodological orientations, assessing such an application especially in the light of the independence issue. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
170. A hybrid clustering algorithm based on missing attribute interval estimation for incomplete data.
- Author
- Zhang, Li, Bing, Zhaohong, and Zhang, Liyong
- Subjects
- *DATA analysis, *PARTICLE swarm optimization, *ALGORITHMS, *ROBUST control, *FUZZY logic
- Abstract
Partially missing data sets are a prevailing problem in clustering analysis. We propose a hybrid algorithm combining fuzzy clustering with particle swarm optimization (PSO) for incomplete data clustering, and missing attributes are represented as intervals. Furthermore, we develop a neighbor interval reconstruction (NIR) method based on pre-classification results that estimates the nearest-neighbor interval of missing attribute using the nearest-neighbor rule, which avoids endpoints of intervals determined by different species information, thereby improving the accuracy of missing attribute intervals and enhancing the robustness of missing attribute imputation. Then, the PSO and fuzzy c-means hybrid algorithm are used for clustering the interval-valued data set, and the global optimization ability of the PSO can improve the accuracy of clustering results compared with gradient-based optimization methods. The experimental results for several UCI data sets show the superiority of the proposed NIR hybrid algorithm. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
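As a hedged baseline for the hybrid algorithm described above, the sketch below implements plain fuzzy c-means only, without the PSO layer or the missing-attribute intervals, on invented two-blob data.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100):
    """Plain fuzzy c-means: alternate soft memberships U and centers V
    (the paper wraps this in PSO and interval-valued missing data)."""
    V = X[np.linspace(0, len(X) - 1, c).astype(int)]   # spread initial centers
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - V[None], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1))
        U = inv / inv.sum(axis=1, keepdims=True)       # standard FCM membership update
        W = U ** m
        V = (W.T @ X) / W.sum(axis=0)[:, None]         # fuzzily weighted centers
    return U, V

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (40, 2)), rng.normal(10, 0.5, (40, 2))])
U, V = fuzzy_c_means(X)
```

The soft memberships in `U` are what the paper's pre-classification step builds on; a gradient-free optimizer such as PSO can replace the alternating update to avoid local minima.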
171. Limits to robustness and reproducibility in the demarcation of operational taxonomic units.
- Author
- Schmidt, Thomas S. B., Matias Rodrigues, João F., and Mering, Christian
- Subjects
- *MICROBIAL ecology, *NUCLEOTIDE sequence, *ROBUST control, *RNA analysis, *DATA analysis, *MICROBIOLOGY
- Abstract
The demarcation of operational taxonomic units (OTUs) from complex sequence data sets is a key step in contemporary studies of microbial ecology. However, as biologically motivated 'optimal' OTU-binning algorithms remain elusive, many conceptually distinct approaches continue to be used. Using a global data set of 887,870 bacterial 16S rRNA gene sequences, we objectively quantified biases introduced by several widely employed sequence clustering algorithms. We found that OTU-binning methods often provided surprisingly non-equivalent partitions of identical data sets, notably when clustering to the same nominal similarity thresholds; and we quantified the resulting impact on ecological data description for a well-defined human skin microbiome data set. We observed that some methods were very robust to varying clustering thresholds, while others were found to be highly susceptible even to slight threshold variations. Moreover, we comprehensively quantified the impact of the choice of 16S rRNA gene subregion, as well as of data set scope and context, on algorithm performance. Our findings may contribute to an enhanced comparability of results across sequence-processing pipelines, and we arrive at recommendations towards higher levels of standardization in established workflows. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
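One reason OTU partitions differ across tools, as the abstract above reports, is that common greedy centroid binning depends on input order and threshold. The sketch below is a toy illustration of that general scheme, not any specific pipeline, with a deliberately crude identity measure.

```python
def greedy_cluster(seqs, threshold, similarity):
    """Greedy centroid binning: each sequence joins the first existing
    centroid it matches at >= threshold, else it seeds a new OTU."""
    centroids, bins = [], []
    for s in seqs:
        for i, c in enumerate(centroids):
            if similarity(s, c) >= threshold:
                bins[i].append(s)
                break
        else:
            centroids.append(s)   # the seed sequence acts as the centroid
            bins.append([s])
    return bins

def identity(a, b):
    """Crude per-position identity for equal-length sequences."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

otus = greedy_cluster(["AAAA", "AAAT", "TTTT", "TTTA"], 0.75, identity)
```

Reordering `seqs` or nudging `threshold` can change which sequences seed centroids, which is exactly the reproducibility limit the paper quantifies.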
172. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow.
- Author
- Kawalia, Amit, Motameny, Susanne, Wonczak, Stephan, Thiele, Holger, Nieroda, Lech, Jabbari, Kamel, Borowski, Stefan, Sinha, Vishal, Gunia, Wilfried, Lang, Ulrich, Achter, Viktor, and Nürnberg, Peter
- Subjects
- *HIGH performance computing, *NUCLEOTIDE sequence, *LIFE sciences, *BIOINFORMATICS, *DATA analysis, *ROBUST control
- Abstract
Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
173. Coverage-based resampling: Building robust consolidated decision trees.
- Author
- Ibarguren, Igor, Pérez, Jesús M., Muguerza, Javier, Gurrutxaga, Ibai, and Arbelaitz, Olatz
- Subjects
- *RESAMPLING (Statistics), *ROBUST control, *DECISION trees, *DATA mining, *DATA analysis, *MACHINE learning
- Abstract
The class imbalance problem has attracted a lot of attention from the data mining community recently, becoming a current trend in machine learning research. The Consolidated Tree Construction (CTC) algorithm was proposed as an algorithm to solve a classification problem involving a high degree of class imbalance without losing the explaining capacity, a desirable characteristic of single decision trees and rule sets. CTC works by resampling the training sample and building a tree from each subsample, in a similar manner to ensemble classifiers, but applying the ensemble process during the tree construction phase, resulting in a unique final tree. In the ECML/PKDD 2013 conference the term “Inner Ensembles” was coined to refer to such methodologies. In this paper we propose a resampling strategy for classification algorithms that use multiple subsamples. This strategy is based on the class distribution of the training sample to ensure a minimum representation of all classes when resampling. This strategy has been applied to CTC over different classification contexts. A robust classification algorithm should not just be able to rank in the top positions for certain classification problems but should be able to excel when faced with a broad range of problems. In this paper we establish the robustness of the CTC algorithm against a wide set of classification algorithms with explaining capacity. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
174. A Framework for Meta-Analysis of Veterinary Drug Pharmacokinetic Data Using Mixed Effect Modeling.
- Author
-
Li, Mengjie, Gehring, Ronette, Lin, Zhoumeng, and Riviere, Jim
- Subjects
- *
PHARMACOKINETICS , *VETERINARY drugs , *META-analysis , *DATA analysis , *DATA mining , *ROBUST control - Abstract
Combining data from available studies is a useful approach to interpret the overwhelming amount of data generated in medical research from multiple studies. Paradoxically, in veterinary medicine, lack of data requires integrating available data to make meaningful population inferences. Nonlinear mixed-effects modeling is a useful tool to apply meta-analysis to diverse pharmacokinetic (PK) studies of veterinary drugs. This review provides a summary of the characteristics of PK data of veterinary drugs and how integration of these data may differ from human PK studies. The limits of meta-analysis include the sophistication of data mining, and generation of misleading results caused by biased or poor quality data. The overriding strength of meta-analysis applied to this field is that robust statistical analysis of the diverse sparse data sets inherent to veterinary medicine applications can be accomplished, thereby allowing population inferences to be made. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association J Pharm Sci 104:1230-1239, 2015 [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
175. Outlier detection and robust normal-curvature estimation in mobile laser scanning 3D point cloud data.
- Author
-
Nurunnabi, Abdul, West, Geoff, and Belton, David
- Subjects
- *
OUTLIER detection , *ROBUST control , *CURVATURE , *OPTICAL scanners , *THREE-dimensional imaging , *DATA analysis , *FEATURE extraction - Abstract
This paper proposes two robust statistical techniques for outlier detection and for robust estimation of saliency features, such as surface normal and curvature, in laser scanning 3D point cloud data. One is based on a robust z-score and the other uses a Mahalanobis-type robust distance. The methods couple the ideas of point-to-plane orthogonal distance and local surface point consistency to obtain Maximum Consistency with Minimum Distance (MCMD). The methods estimate the best-fit plane from the most probable outlier-free and most consistent point set in a local neighbourhood; the normal and curvature derived from this plane are then highly robust to noise and outliers. Experiments are performed to show the performance of the algorithms compared to several existing well-known methods (from computer vision, data mining, machine learning and statistics) using synthetic and real laser scanning datasets of complex (planar and non-planar) objects. Results for plane fitting, denoising, sharp feature preserving and segmentation are significantly improved. The algorithms are demonstrated to be significantly faster, more accurate and robust. Quantitatively, for a sample size of 50 with 20% outliers the proposed MCMD_Z is approximately 5, 15 and 98 times faster than the existing methods uLSIF, RANSAC and RPCA, respectively. The proposed MCMD_MD method can tolerate 75% clustered outliers, whereas RPCA and RANSAC can only tolerate 47% and 64% outliers, respectively. In terms of outlier detection, for the same dataset, MCMD_Z has an accuracy of 99.72%, a 0.4% false positive rate and a 0% false negative rate; for RPCA, RANSAC and uLSIF, the accuracies are 97.05%, 47.06% and 94.54%, respectively, and they have higher misclassification rates than the proposed methods. The new methods have potential for local surface reconstruction, fitting, and other point cloud processing tasks. [ABSTRACT FROM AUTHOR]
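The robust z-score variant (MCMD_Z) can be sketched roughly as follows, assuming an SVD plane fit and a median/MAD z-score on point-to-plane distances. The iterative search for the maximally consistent subset, which is central to the actual algorithm, is omitted here, and the 2.5 threshold is an assumed cutoff.

```python
import numpy as np

def mcmd_z(points, threshold=2.5):
    """Flag local outliers: plane fit via SVD (normal = direction of least
    variance), then a robust z-score on signed point-to-plane distances."""
    centered = points - np.median(points, axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]                       # smallest singular direction
    d = centered @ normal                 # signed orthogonal distances
    med = np.median(d)
    mad = np.median(np.abs(d - med))
    rz = 0.6745 * (d - med) / (mad + 1e-12)   # robust z-score
    return np.abs(rz) > threshold

rng = np.random.default_rng(0)
plane = np.c_[rng.uniform(0, 1, (50, 2)), rng.normal(0, 0.01, 50)]  # z ~ 0
pts = np.vstack([plane, [0.5, 0.5, 1.0]])   # one gross off-plane outlier
flags = mcmd_z(pts)
```

The median/MAD pair is what gives the z-score its breakdown resistance; the classical mean/standard-deviation version would be inflated by the very outlier it is trying to flag.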
- Published
- 2015
- Full Text
- View/download PDF
176. Co-occurrence probability-based pixel pairs background model for robust object detection in dynamic scenes.
- Author
-
Liang, Dong, Kaneko, Shun'ichi, Hashimoto, Manabu, Iwata, Kenji, and Zhao, Xinyue
- Subjects
- *
PROBABILITY theory , *PIXELS , *ROBUST control , *OBJECT tracking (Computer vision) , *DYNAMICAL systems , *DATA analysis - Abstract
An illumination-invariant background model for detecting objects in dynamic scenes is proposed. It is robust in the cases of sudden illumination fluctuation as well as burst motion. Unlike the previous works, it uses the co-occurrence differential increments of multiple pixel pairs to distinguish objects from a non-stationary background. We use a two-stage training framework to model the background. First, joint histograms of co-occurrence probability are employed to screen supporting pixels with high normalized correlation coefficient values; then, K-means clustering-based spatial sampling optimizes the spatial distribution of the supporting pixels; finally the background model maintains a sensitive criterion with few parameters to detect foreground elements. Experiments using several challenging datasets (PETS-2001, AIST-INDOOR, Wallflower and a real surveillance application) prove the robust and competitive performance of object detection in various indoor and outdoor environments. [ABSTRACT FROM AUTHOR]
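A single pixel pair from such a model can be illustrated as follows. The co-occurrence-histogram screening of supporting pixels and the K-means spatial sampling are omitted, and the deviation threshold `k` is an assumed parameter, so this is only a sketch of the differential-increment idea.

```python
import numpy as np

def train_pair_model(seq_p, seq_q):
    # Learn the expected differential increment between target pixel p
    # and a strongly co-occurring supporting pixel q.
    diff = np.asarray(seq_p) - np.asarray(seq_q)
    return diff.mean(), diff.std() + 1e-6

def is_foreground(p_val, q_val, model, k=4.0):
    # p is foreground if the pair's increment deviates from the model;
    # a global illumination change shifts p and q together and cancels out.
    mu, sigma = model
    return abs((p_val - q_val) - mu) > k * sigma

rng = np.random.default_rng(1)
base_q = 100 + rng.normal(0, 1, 200)           # supporting pixel, training
base_p = base_q + 20 + rng.normal(0, 1, 200)   # target pixel, offset of +20
model = train_pair_model(base_p, base_q)

jump = 60                                       # sudden illumination jump
still_bg = is_foreground(120 + jump, 100 + jump, model)  # both pixels shift
is_obj = is_foreground(180, 100, model)         # only the target changes
```

This is why the model is illumination-invariant: the decision depends only on the increment between the pair, not on the absolute intensities.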
- Published
- 2015
- Full Text
- View/download PDF
177. Robust Modeling of Differential Gene Expression Data Using Normal/Independent Distributions: A Bayesian Approach.
- Author
-
Ganjali, Mojtaba, Baghfalaki, Taban, and Berridge, Damon
- Subjects
- *
ROBUST control , *GENE expression , *DATA analysis , *CELL differentiation , *MARKOV chain Monte Carlo - Abstract
In this paper, the problem of identifying differentially expressed genes under different conditions using gene expression microarray data, in the presence of outliers, is discussed. For this purpose, the robust modeling of gene expression data using some powerful distributions known as normal/independent distributions is considered. These distributions include the Student’s t and normal distributions which have been used previously, but also include extensions such as the slash, the contaminated normal and the Laplace distributions. The purpose of this paper is to identify differentially expressed genes by considering these distributional assumptions instead of the normal distribution. A Bayesian approach using the Markov Chain Monte Carlo method is adopted for parameter estimation. Two publicly available gene expression data sets are analyzed using the proposed approach. The use of the robust models for detecting differentially expressed genes is investigated. This investigation shows that the choice of model for differentiating gene expression data is very important. This is due to the small number of replicates for each gene and the existence of outlying data. Comparison of the performance of these models is made using different statistical criteria and the ROC curve. The method is illustrated using some simulation studies. We demonstrate the flexibility of these robust models in identifying differentially expressed genes. [ABSTRACT FROM AUTHOR]
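The normal/independent family underlying these models can be illustrated with the Student's t case, which arises as a scale mixture of normals; the mixing variable lambda plays the role of a per-observation down-weight in the Bayesian hierarchy, so outliers inflate their own variance instead of distorting the location. The parameter values below are arbitrary.

```python
import numpy as np

# Student's t as a normal/independent mixture:
#   lambda ~ Gamma(nu/2, rate=nu/2),  x | lambda ~ Normal(mu, sigma^2 / lambda)
rng = np.random.default_rng(42)
nu, mu, sigma, n = 5.0, 0.0, 1.0, 200_000

lam = rng.gamma(shape=nu / 2, scale=2.0 / nu, size=n)  # rate nu/2 == scale 2/nu
x = rng.normal(mu, sigma / np.sqrt(lam))

# The mixture matches the variance of a t_nu distribution: sigma^2 * nu / (nu - 2)
sample_var = x.var()
theory_var = sigma**2 * nu / (nu - 2)
```

The slash and contaminated normal distributions mentioned in the abstract arise from the same construction with different mixing laws for lambda, which is what makes MCMC for the whole family tractable via the conditional normal step.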
- Published
- 2015
- Full Text
- View/download PDF
178. Folded-concave penalization approaches to tensor completion.
- Author
-
Cao, Wenfei, Wang, Yao, Yang, Can, Chang, Xiangyu, Han, Zhi, and Xu, Zongben
- Subjects
- *
COMPUTATIONAL complexity , *MATHEMATICAL optimization , *MACHINE learning , *PROBLEM solving , *DATA analysis , *ROBUST control - Abstract
The existing studies involving matrix or tensor completion problems commonly work under the nuclear norm penalization framework due to the computational efficiency of the resulting convex optimization problem. Folded-concave penalization methods have shown surprisingly strong performance in sparse learning problems due to their attractive practical and theoretical properties. To bring these advantages to tensor data, we propose a new tensor completion model via folded-concave penalties for estimating missing values. Two typical folded-concave penalties, the minimax concave penalty (MCP) and the smoothly clipped absolute deviation (SCAD) penalty, are employed in the new model. To solve the resulting nonconvex optimization problem, we develop a local linear approximation augmented Lagrange multiplier (LLA-ALM) algorithm which combines a two-step LLA strategy to search for a local optimum of the proposed model efficiently. Finally, we provide numerical experiments with phase transitions, synthetic data sets, and real image and video data sets to exhibit the superiority of the proposed model over the nuclear norm penalization method in terms of accuracy and robustness. [ABSTRACT FROM AUTHOR]
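The MCP penalty and its scalar proximal (firm-thresholding) operator can be sketched as follows; the elementwise forms shown are the standard ones, and the parameter choices are illustrative rather than taken from the paper.

```python
import numpy as np

def mcp_penalty(t, lam, gamma):
    """Minimax concave penalty, elementwise: linear near zero, constant
    (flat) beyond gamma*lam, which is the source of near-unbiasedness."""
    t = np.abs(t)
    return np.where(t <= gamma * lam,
                    lam * t - t**2 / (2 * gamma),
                    gamma * lam**2 / 2)

def mcp_threshold(z, lam, gamma):
    """Proximal (firm-thresholding) operator of the MCP for gamma > 1:
    small entries are zeroed, large entries pass through unshrunk."""
    az = np.abs(z)
    shrunk = np.sign(z) * (az - lam) / (1 - 1 / gamma)
    return np.where(az <= lam, 0.0, np.where(az <= gamma * lam, shrunk, z))

z = np.array([0.05, 0.5, 3.0])
out = mcp_threshold(z, lam=0.2, gamma=3.0)   # -> 0, mild shrinkage, pass-through
```

Applied to the singular values inside a completion iteration, the pass-through regime is exactly what lets folded-concave penalties avoid the systematic shrinkage bias of the nuclear norm on large singular values.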
- Published
- 2015
- Full Text
- View/download PDF
179. A New Robust Regression Method Based on Particle Swarm Optimization.
- Author
-
Cagcag, Ozge, Yolcu, Ufuk, and Egrioglu, Erol
- Subjects
- *
PARTICLE swarm optimization , *ROBUST control , *PREDICTION theory , *REGRESSION analysis , *LEAST squares , *DATA analysis - Abstract
Regression analysis is one of the methods most widely used in prediction problems. Although many methods exist for parameter estimation in regression analysis, the ordinary least squares (OLS) technique is the most commonly used among them. However, OLS is highly sensitive to outlier observations, so the literature suggests robust techniques when the data set includes outliers. Moreover, in a prediction problem, techniques that reduce the influence of outliers and use the median, rather than the mean error, as the target function are more successful in modeling such data. In this study, a new parameter estimation method based on particle swarm optimization is proposed; it minimizes the median of the absolute relative errors, i.e., the differences between observed and predicted values divided by the observed values. The performance of the proposed method was evaluated in a simulation study by comparing it with OLS and several other robust methods from the literature. [ABSTRACT FROM AUTHOR]
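The median-of-absolute-relative-errors objective combined with a basic global-best PSO can be sketched as follows. The swarm size, inertia and acceleration coefficients are assumed values, not those of the study.

```python
import numpy as np

def pso_robust_fit(x, y, n_particles=40, iters=300, seed=0):
    # Global-best PSO minimizing the MEDIAN of absolute relative errors,
    # so a single gross outlier cannot dominate the objective.
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-10, 10, (n_particles, 2))  # particles = (slope, intercept)
    vel = np.zeros_like(pos)

    def cost(p):
        yhat = p[:, 0, None] * x + p[:, 1, None]
        return np.median(np.abs(y - yhat) / np.abs(y), axis=1)

    pbest, pbest_c = pos.copy(), cost(pos)
    g = pbest[pbest_c.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 2))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
        vel = np.clip(vel, -4, 4)
        pos = pos + vel
        c = cost(pos)
        better = c < pbest_c
        pbest[better], pbest_c[better] = pos[better], c[better]
        g = pbest[pbest_c.argmin()].copy()
    return g

x = np.arange(1.0, 21.0)
y = 2.0 * x + 1.0
y[5] = 120.0                      # one gross outlier
a, b = pso_robust_fit(x, y)       # should land near slope 2, intercept 1
```

The median objective is non-smooth, which is exactly why a derivative-free optimizer such as PSO is used in place of the normal-equations machinery of OLS.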
- Published
- 2015
- Full Text
- View/download PDF
180. Chemical rank estimation for second-order calibration by discrete Fourier transform coupled with robust statistical analysis.
- Author
-
Yu, Yong-Jie, Fu, Hai-Yan, Gu, Hui-Wen, Li, Yong, Kang, Chao, Wang, Yi-Peng, and Wu, Hai-Long
- Subjects
- *
FOURIER transform infrared spectroscopy , *ROBUST control , *CHEMICAL decomposition , *EIGENVECTORS , *DATA analysis , *QUANTITATIVE research - Abstract
The accurate estimation of the underlying number of components in complex samples is critical in data analysis. A new chemometric strategy was developed in this study to determine accurately the number of underlying components in complex samples. First, discrete Fourier transformation was used to project the eigenvectors from the singular value decomposition to the frequency space. A robust statistical analysis based on an iterative t-test was then employed to eliminate the outliers in the Fourier coefficients of each eigenvector. Finally, ANOVA was used to differentiate the meaningful components from noise. Simulated and published fluorescence datasets were used to demonstrate the strategy. Results indicate that the proposed strategy accurately and efficiently estimated the number of underlying components in the analyzed dataset. Moreover, the performance of the proposed method was comparable with the well-known core consistency diagnostic and Monte Carlo simulation coupled with frequency location methods. The new technique coupled with second-order calibration was successfully used to resolve the problem of seriously overlapped fluorescence spectra in the accurate quantification of fluoroquinolone antibiotics in tap water samples. The second-order advantage was achieved. [ABSTRACT FROM AUTHOR]
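The core observation, that smooth (chemical) singular vectors concentrate their Fourier energy at low frequencies while noise vectors spread it evenly, can be sketched as follows. The low-frequency energy-ratio test used here is a simplified stand-in for the paper's iterative t-test and ANOVA steps, and the `low`/`ratio` parameters are assumptions.

```python
import numpy as np

def estimate_rank(X, low=5, ratio=0.5):
    # Count leading right singular vectors whose non-DC DFT energy is
    # concentrated in the lowest frequency bins; stop at the first
    # noise-like (spectrally flat) vector.
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    rank = 0
    for v in vt:
        power = np.abs(np.fft.rfft(v)) ** 2
        if power[1:low].sum() / power[1:].sum() > ratio:
            rank += 1
        else:
            break
    return rank

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 200)
s1 = np.exp(-((t - 0.3) / 0.05) ** 2)       # two smooth emission profiles
s2 = np.exp(-((t - 0.6) / 0.08) ** 2)
C = rng.uniform(0.2, 1.0, (30, 2))          # concentrations for 30 mixtures
X = C @ np.vstack([s1, s2]) + rng.normal(0, 0.01, (30, 200))
rank = estimate_rank(X)
```

On this two-component synthetic mixture the first two singular vectors inherit the smoothness of the spectral profiles, so their energy ratio is high, while the third is noise-dominated and terminates the count.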
- Published
- 2015
- Full Text
- View/download PDF
181. Random projection-based partial feature extraction for robust face recognition.
- Author
-
Ma, Chunfei, Jung, June-Young, Kim, Seung-Wook, and Ko, Sung-Jea
- Subjects
- *
RANDOM projection method , *ROBUST control , *FACE perception , *COMPRESSED sensing , *FEATURE extraction , *DATA analysis - Abstract
In this paper, a novel feature extraction method for robust face recognition (FR) is proposed. The proposed method combines a simple yet effective dimensionality increasing (DI) method with an information-preserving dimensionality reduction (DR) method. For the proposed DI method, we employ rectangle filters which sum the pixel values within a randomized rectangle window on the face image to extract the feature. By convolving the face image with all possible rectangle filters having various locations and scales, the face image in the image space is projected to a very high-dimensional feature space where more discriminative information can be incorporated. In order to significantly reduce the computational complexity while preserving the most informative features, we adopt a random projection method based on compressed sensing theory for DR. Unlike traditional holistic feature extraction methods requiring a time-consuming data-dependent training procedure, the proposed method has partial-based and data-independent properties. Extensive experimental results on representative FR databases show that, compared with conventional feature extraction methods, our proposed method not only achieves higher recognition accuracy but also shows better robustness to corruption, occlusion, and disguise. [ABSTRACT FROM AUTHOR]
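The two steps, a dimensionality increase via random rectangle-filter responses (each computable in O(1) from an integral image) followed by a data-independent Gaussian random projection, can be sketched as follows. The rectangle count and output dimension are arbitrary choices, not the paper's settings.

```python
import numpy as np

def integral_image(img):
    return img.cumsum(0).cumsum(1)

def rect_sum(ii, top, left, h, w):
    """Sum of pixels in a rectangle in O(1) using the integral image."""
    total = ii[top + h - 1, left + w - 1]
    if top > 0:
        total -= ii[top - 1, left + w - 1]
    if left > 0:
        total -= ii[top + h - 1, left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1, left - 1]
    return total

rng = np.random.default_rng(7)
img = rng.random((32, 32))
ii = integral_image(img)

# DI step: many random rectangle-filter responses
n_rects, n_out = 2000, 128
feats = np.empty(n_rects)
for i in range(n_rects):
    h, w = rng.integers(2, 16, 2)
    top = rng.integers(0, 32 - h)
    left = rng.integers(0, 32 - w)
    feats[i] = rect_sum(ii, top, left, h, w)

# DR step: Gaussian random projection (compressed-sensing style)
R = rng.normal(0, 1 / np.sqrt(n_out), (n_out, n_rects))
feature = R @ feats
```

Because both steps are random and fixed in advance, the whole pipeline needs no training data, which is the data-independent property the abstract emphasizes.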
- Published
- 2015
- Full Text
- View/download PDF
182. Evaluating the robustness of models developed from field spectral data in predicting African grass foliar nitrogen concentration using WorldView-2 image as an independent test dataset.
- Author
-
Mutanga, Onisimo, Adam, Elhadi, Adjorlolo, Clement, and Abdel-Rahman, Elfatih M.
- Subjects
- *
GRASSLANDS , *NITROGEN content of plants , *IMAGE analysis , *DATA analysis , *RANDOM forest algorithms , *ROBUST control - Abstract
In this paper, we evaluate the extent to which resampled field spectra compare with the actual image spectra of the new generation multispectral WorldView-2 (WV-2) satellite. This was achieved by developing models from resampled field spectra and testing them on an actual WV-2 image of the study area. We evaluated the performance of reflectance ratios (RI), normalized difference indices (NDI) and a random forest (RF) regression model in predicting foliar nitrogen concentration in a grassland environment. The field-measured spectra were used to calibrate the RF model using a randomly selected training (n = 70%) nitrogen data set. The model developed from the field spectra resampled to WV-2 wavebands was validated on an independent field spectral test dataset as well as on the actual WV-2 image of the same area (n = 30%, bootstrapped 100 times). The results show that the model developed using RI could predict nitrogen with a mean R² of 0.74 and 0.65 on an independent field spectral test data set and on the actual WV-2 image, respectively. The root mean square error of prediction (RMSE %) was 0.17 and 0.22 for the field test data set and the WV-2 image, respectively. The results provide insight into the magnitude of errors to be expected when up-scaling field spectral models to airborne or satellite image data. The prediction also indicates the continuing relevance of field spectroscopy studies for understanding the spectral models critical for vegetation quality assessment. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
183. Primates Decline Rapidly in Unprotected Forests: Evidence from a Monitoring Program with Data Constraints.
- Author
-
Rovero, Francesco, Mtui, Arafat, Kitegile, Amani, Jacob, Philipo, Araldi, Alessandro, and Tenan, Simone
- Subjects
- *
TROPICAL forests , *PRIMATES , *HABITATS , *DATA analysis , *ROBUST control , *CONSERVATION & restoration - Abstract
Growing threats to primates in tropical forests make robust and long-term population abundance assessments increasingly important for conservation. Concomitantly, monitoring becomes particularly relevant in countries with primate habitat. Yet monitoring schemes in these countries often suffer from logistic constraints and/or poor rigor in data collection, and a lack of consideration of sources of bias in analysis. To address the need for feasible monitoring schemes and flexible analytical tools for robust trend estimates, we analyzed data collected by local technicians on abundance of three species of arboreal monkey in the Udzungwa Mountains of Tanzania (two Colobus species and one Cercopithecus), an area of international importance for primate endemism and conservation. We counted primate social groups along eight line transects in two forest blocks in the area, one protected and one unprotected, over a span of 11 years. We applied a recently proposed open metapopulation model to estimate abundance trends while controlling for confounding effects of observer, site, and season. Primate populations were stable in the protected forest, while the colobines, including the endemic Udzungwa red colobus, declined severely in the unprotected forest. Targeted hunting pressure at this second site is the most plausible explanation for the trend observed. The unexplained variability in detection probability among transects was greater than the variability due to observers, indicating consistency in data collection among observers. There were no significant differences in both primate abundance and detectability between wet and dry seasons, supporting the choice of sampling during the dry season only based on minimizing practical constraints. Results show that simple monitoring routines implemented by trained local technicians can effectively detect changes in primate populations in tropical countries. 
The hierarchical Bayesian model formulation adopted provides a flexible tool to determine temporal trends with full account for any imbalance in the data set and for imperfect detection. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
184. CHOOSING AMONG ALTERNATIVE LONG-RUN EVENT-STUDY TECHNIQUES.
- Author
-
Dionysiou, Dionysia
- Subjects
DEBATE ,ACCOUNTING ,DATA analysis ,ROBUST control ,STATISTICS - Abstract
This paper reviews the long-run event-study debate by outlining the strengths and weaknesses of the most commonly used alternative techniques. The first part of the discussion highlights that prior literature has failed to provide a single unbiased risk-adjusted model of long-run abnormal returns. Subsequently, the paper provides guidance on how one can choose among the pertinent alternative techniques. In conclusion, researchers ought to choose among alternative techniques after considering issues such as (i) the nature of the dataset and market of interest, (ii) the event type (regulatory or corporate), (iii) the returns' time interval, (iv) the association of the event with accounting data, (v) sample characteristics and prior evidence regarding similar events, as well as (vi) risk changes following the event. Robustness tests are essential, while the road for further research regarding the appropriate technique(s) remains open. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
185. Robust Control for PWM-Based DC–DC Buck Power Converters With Uncertainty Via Sampled-Data Output Feedback.
- Author
-
Chuanlin Zhang, Junxiao Wang, Shihua Li, Bin Wu, and Chunjiang Qian
- Subjects
- *
ROBUST control , *PULSE width modulation , *DC-to-DC converters , *FEEDBACK control systems , *DATA analysis - Abstract
This paper investigates the sampled-data output feedback control problem for dc-dc buck power converters, taking into consideration component uncertainties. A reduced-order observer and a robust output feedback controller, both in sampled-data form, are explicitly constructed with strong robustness in the presence of uncertain parameters. A detailed stability analysis is presented to show that, by carefully selecting the design gains and the tunable sampling period, the output voltage of the hybrid closed-loop dc-dc buck converter system will globally asymptotically tend to the desired value even though the separation principle is out of reach and the controller is only switched at the sampling points. The proposed controller consists of a set of linear difference equations, which leads to direct and easier digital implementation. Numerical simulations and experimental results are shown to illustrate the performance of the proposed control scheme. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
186. SEMIDEFINITE PROGRAMMING BASED PRECONDITIONING FOR MORE ROBUST NEAR-SEPARABLE NONNEGATIVE MATRIX FACTORIZATION.
- Author
-
GILLIS, NICOLAS and VAVASIS, STEPHEN A.
- Subjects
- *
SEMIDEFINITE programming , *ROBUST control , *FACTORIZATION , *ALGORITHMS , *MACHINE learning , *DATA analysis , *POLYNOMIAL time algorithms - Abstract
Nonnegative matrix factorization (NMF) under the separability assumption can provably be solved efficiently, even in the presence of noise, and has been shown to be a powerful technique in document classification and hyperspectral unmixing. This problem is referred to as near-separable NMF and requires that there exist a cone, spanned by a small subset of the columns of the input nonnegative matrix, approximately containing all columns. In this paper, we propose a preconditioning based on the minimum volume ellipsoid and semidefinite programming that makes the input matrix well-conditioned. This in turn can significantly improve the performance of near-separable NMF algorithms, which is illustrated on the popular successive projection algorithm (SPA). The new preconditioned SPA is provably more robust to noise, and outperforms SPA on several synthetic data sets. We also show how an active-set method allows us to apply the preconditioning to large-scale real-world hyperspectral images. [ABSTRACT FROM AUTHOR]
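The successive projection algorithm (SPA) that the preconditioning accelerates can be sketched in a few lines: greedily select the column of largest norm, then project all columns onto its orthogonal complement and repeat. On a noiseless separable instance it recovers the pure (anchor) columns exactly; the synthetic data below are illustrative.

```python
import numpy as np

def spa(M, r):
    # Greedy anchor selection: pick the column of largest norm,
    # project it out, repeat r times.
    R = M.astype(float).copy()
    idx = []
    for _ in range(r):
        j = int(np.argmax((R * R).sum(axis=0)))
        u = R[:, j] / np.linalg.norm(R[:, j])
        R = R - np.outer(u, u @ R)        # project onto orthogonal complement
        idx.append(j)
    return idx

rng = np.random.default_rng(0)
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.2, 0.3]])
H = rng.dirichlet([1.0, 1.0], size=6).T   # convex mixing weights
M = np.hstack([W, W @ H])                 # columns 0 and 1 are the pure columns
anchors = spa(M, 2)
```

The selection works because a convex combination of columns can never have a larger norm than its extreme points; the paper's preconditioning sharpens exactly this margin when noise blurs it.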
- Published
- 2015
- Full Text
- View/download PDF
187. Trends in Near Infrared Spectroscopy and Multivariate Data Analysis From an Industrial Perspective.
- Author
-
Wiesner, Kerstin, Fuchs, Karen, Gigler, Alexander M., and Pastusiak, Remigiusz
- Subjects
NEAR infrared spectroscopy ,CHEMOMETRICS ,MULTIVARIATE analysis ,DATA analysis ,ROBUST control - Abstract
Mid- and Near-Infrared (MIR, NIR) spectra convey characteristic information on material type and sample composition. Intense developments towards more robust and reliable IR spectrometers have made this technique an important chemical analytical method for industrial quality control and in-line process monitoring. New trends in miniaturization of spectrometers facilitate a wide range of possible applications in fields such as food & beverage, healthcare, fuel/media quality control, and environmental analytics. Multivariate data analysis is mandatory for data evaluation of NIR spectra. The development of novel chemometric tools also plays an important role in promoting new applications of IR spectroscopy in the near future. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
188. Web shear failure of angle-cleat connections loaded at high rates.
- Author
-
Rahbari, Rahi, Tyas, Andrew, Buick Davison, J., and Stoddart, Euan P.
- Subjects
- *
SHEAR (Mechanics) , *ROBUST control , *DATA analysis , *NUMERICAL analysis , *GEOMETRIC analysis , *FINITE element method - Abstract
Robustness of structures to prevent progressive collapse has been considered by different building codes since the Ronan Point accident in the UK in 1968, and research on this type of collapse was brought to the fore after the building collapses at the World Trade Center on 11 September 2001. In the case of losing a column, maintaining the integrity of the building is likely to require large end rotation of the beams in order to transfer the vertical loads from the floor above through re-distribution of loads in the remaining structure. For this reason connections are important in enabling the structure to bridge over the lost column. Connections also need to be capable of resisting the high strain rates arising from the dynamic redistribution of moments and tension loads, which means they require high ductility to dissipate energy and undergo deformation without failure. This paper presents results from experimental tests conducted on connections under quasi-static and rapidly applied loading, and from a series of ANSYS LS-DYNA finite element analyses developed to model the connection response. A web-cleat connection model, with its complex geometry, has been produced and validated against experimental data. The effect of the rate of applied loading on connection response has been investigated both experimentally and by numerical modeling; understanding this effect is important for analysing dynamic frame response to column loss. The results indicate that web angle cleat connections are relatively insensitive to the rate of loading within the range considered here. This suggests that static characterisation of web angle cleat connections may yield suitable data for use in dynamic analysis of frame response in column loss scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
189. MACE-means clustering.
- Author
-
Shahbaba, Mahdi and Beheshti, Soosan
- Subjects
- *
PROBABILITY theory , *ESTIMATION theory , *K-means clustering , *DATA analysis , *MACHINE learning , *ROBUST control - Abstract
In this paper, we tackle the problem of estimating the correct number of clusters from a new perspective. The proposed method probabilistically estimates the Average Central Error (ACE), which is the difference between the true cluster centers and their estimates. The novelty of this work lies partly in estimating the unavailable ACE by using the available cluster compactness, that is, the difference between the estimated centers and their members. The application of this approach is explored with K-means clustering. The proposed method, denoted Minimum ACE K-means (MACE-means), is shown to have unique advantages on both synthetic and real data. MACE-means clustering is applied to benchmark real-world data sets from the UCI machine learning repository and to other synthesized clusters that represent a wide class of clustering scenarios. Our analysis confirms the superiority of MACE-means over state-of-the-art clustering methods in robustness to initialization error, accuracy in detecting the correct number of clusters, lower time complexity, and robustness to cluster overlap. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
190. Robust error estimates in weak norms for advection dominated transport problems with rough data.
- Author
-
Burman, Erik
- Subjects
- *
ROBUST control , *SAMPLING errors , *ADVECTION-diffusion equations , *TRANSPORTATION problems (Programming) , *DATA analysis , *TRANSPORT equation - Abstract
We consider transient convection-diffusion equations with a velocity vector field of multiscale character and rough data. We assume that the velocity field has two scales: a coarse scale with slow spatial variation, which is responsible for advective transport, and a fine scale with small amplitude that contributes to the mixing. For this problem we consider the estimation of filtered error quantities for solutions computed using a finite element method with symmetric stabilization. A posteriori and a priori error estimates are derived using the multiscale decomposition of the advective velocity to improve stability. All estimates are independent of both the Péclet number and the regularity of the exact solution. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
191. Validation of Optimized Descriptive Profile (ODP) technique: Accuracy, precision and robustness.
- Author
-
Silva, Rita de Cássia dos Santos Navarro da, Minim, Valéria Paula Rodrigues, Silva, Alexandre Navarro da, Gonçalves, Aline Cristina Arruda, Carneiro, João de Deus Souza, Gomide, Aline Iamim, Della Lucia, Suzana Maria, and Minim, Luis Antonio
- Subjects
- *
TASTE testing of food , *ROBUST control , *COMPARATIVE studies , *STATISTICAL models , *DATA analysis - Abstract
The Optimized Descriptive Profile (ODP) is a recently proposed fast sensory description method with high performance. The technique reduces the time necessary for the sensory test while providing a quantitative evaluation of the sensory attributes, which distinguishes it among fast descriptive techniques. This study sought to validate the methodology, evaluating the criteria of accuracy, precision (repeatability and reproducibility) and robustness. Accuracy of the data generated by the ODP was measured by comparison with the reference method, the Conventional Profile, considering different food matrices. Precision was measured at the panel (repeatability) and interlaboratory (reproducibility) levels. Robustness assessed the sensitivity of the method to a reduced number of judges on the panel. The ODP methodology met the established criteria, presenting a degree of proximity with the reference methodology exceeding 95%. In assessing repeatability over three repetitions of assessments by the same panel, the ODP was statistically equal among repetitions and presented a degree of proximity greater than 99%. The method showed data reproducibility when performed in different laboratories, presenting a perfect agreement among the sensory profiles. The ODP showed robustness when reducing the number of panelists, with smaller panels presenting even smaller random variation than the complete panel. The ODP was successfully validated and presented validation measures with a high degree of certainty for the specific dataset considered. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
192. A robust damage-detection technique with environmental variability combining time-series models with principal components.
- Author
-
Lakshmi, K. and Rama Mohan Rao, A.
- Subjects
- *
TIME series analysis , *PRINCIPAL components analysis , *SEASONAL variations of diseases , *ROBUST control , *AUTOREGRESSIVE models , *DATA analysis , *PROBABILITY density function - Abstract
In this paper, a novel output-only damage-detection technique based on time-series models for structural health monitoring in the presence of environmental variability and measurement noise is presented. The large amount of data obtained in the form of time-history response is transformed using principal component analysis in order to reduce the data size and thereby improve the computational efficiency of the proposed algorithm. The time instant of damage is obtained by fitting the acceleration time-history data from the structure using autoregressive (AR) and AR with exogenous inputs time-series prediction models. The probability density functions (PDFs) of damage features obtained from the variances of prediction errors corresponding to the reference data and the current (healthy) data are found to shift from each other due to the presence of various uncertainties such as environmental variability and measurement noise. Control limits based on a novelty index are obtained using the distances between the peaks of the PDF curves in the healthy condition and are used later for determining the current condition of the structure. Numerical simulation studies have been carried out using a simply supported beam and also validated using experimental benchmark data corresponding to a three-storey framed bookshelf structure proposed by Los Alamos National Laboratory. The studies carried out in this paper clearly indicate the efficiency of the proposed algorithm for damage detection in the presence of measurement noise and environmental variability. [ABSTRACT FROM AUTHOR]
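The basic damage feature, the variance of AR prediction errors under a model fitted to healthy data, can be sketched as follows. The AR order, the signals, and the frequency shift standing in for damage are all illustrative assumptions; the PCA compression, exogenous-input model and novelty-index control limits of the paper are omitted.

```python
import numpy as np

def fit_ar(x, p):
    # Least-squares AR(p): predict x[t] from x[t-1], ..., x[t-p].
    X = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
    return np.linalg.lstsq(X, x[p:], rcond=None)[0]

def residual_var(x, coef):
    # Damage feature: variance of one-step prediction errors under a
    # model fitted to the healthy reference signal.
    p = len(coef)
    X = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
    return np.var(x[p:] - X @ coef)

rng = np.random.default_rng(2)
t = np.arange(2000)
healthy = np.sin(0.20 * t) + 0.02 * rng.normal(size=2000)
damaged = np.sin(0.50 * t) + 0.02 * rng.normal(size=2000)  # shifted resonance

coef = fit_ar(healthy, p=2)          # reference model from healthy data only
v_ref = residual_var(healthy, coef)
v_dam = residual_var(damaged, coef)  # inflated error variance signals damage
```

A stiffness change shifts the structure's resonant frequencies, so the healthy-data AR model mispredicts the damaged response and its residual variance inflates, which is the quantity whose PDF shift the paper monitors.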
- Published
- 2014
- Full Text
- View/download PDF
193. A fragment-based iterative consensus clustering algorithm with a robust similarity.
- Author
-
Chung, Chih-Heng and Dai, Bi-Ru
- Subjects
ITERATIVE methods (Mathematics) , FUZZY clustering technique , ALGORITHMS , ROBUST control , DATA analysis - Abstract
The consensus clustering technique combines multiple clustering results without accessing the original data. Consensus clustering can be used to improve the robustness of clustering results or to combine the clustering results from multiple data sources. In this paper, we propose a novel definition of the similarity between points and clusters. With an iterative process, this definition of similarity can clearly represent whether a point should join or leave a cluster, determine the number of clusters automatically, and combine partially overlapping clustering results. We also incorporate the concept of a 'clustering fragment' into our method for increased speed. The experimental results show that our algorithm achieves good performance on both artificial and real data. [ABSTRACT FROM AUTHOR]
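The paper's point-to-cluster similarity and fragment speed-up are not reproduced here; as a generic illustration of combining clusterings without the original data, the following sketch uses a standard co-association matrix and merges points that most base clusterings agree on (the toy labelings and the 0.5 threshold are invented):

```python
import numpy as np

def co_association(labelings):
    """Fraction of base clusterings that place each pair of points together."""
    labelings = np.asarray(labelings)          # shape: (n_clusterings, n_points)
    m, n = labelings.shape
    C = np.zeros((n, n))
    for lab in labelings:
        C += (lab[:, None] == lab[None, :]).astype(float)
    return C / m

def consensus_clusters(labelings, threshold=0.5):
    """Merge points whose co-association exceeds the threshold (union-find)."""
    C = co_association(labelings)
    n = C.shape[0]
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]      # path compression
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if C[i, j] > threshold:
                parent[find(i)] = find(j)
    roots = [find(i) for i in range(n)]
    relabel = {r: k for k, r in enumerate(dict.fromkeys(roots))}
    return [relabel[r] for r in roots]

# Three base clusterings of six points; the first two agree, the third is noisy.
base = [
    [0, 0, 0, 1, 1, 1],
    [0, 0, 0, 1, 1, 1],
    [0, 1, 0, 1, 0, 1],
]
print(consensus_clusters(base))  # → [0, 0, 0, 1, 1, 1]
```

Note that, unlike the paper's method, this fixed-threshold merge cannot adapt cluster membership iteratively or handle partially overlapping clusterings.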
- Published
- 2014
- Full Text
- View/download PDF
194. Application of RAD-based phylogenetics to complex relationships among variously related taxa in a species flock.
- Author
-
Takahashi, Tetsumi, Nagata, Nobuaki, and Sota, Teiji
- Subjects
- *
SPECIES diversity , *MOLECULAR phylogeny , *NUCLEOTIDE sequence , *ROBUST control , *DATA analysis , *MAXIMUM likelihood statistics - Abstract
Restriction site-associated DNA (RAD) sequences from entire genomes can be used to resolve complex phylogenetic problems. However, the processed data matrix varies depending on the strategies used to determine orthologous loci and to filter loci according to the number of taxa with sequence data for those loci, and it often contains a large amount of missing data. To explore the utility of RAD sequences for elucidating the phylogenetics of variously related species, we conducted RAD sequencing for the Ohomopterus ground beetles and performed maximum-likelihood phylogenetic analyses using 42 data matrices ranging from 1.6 × 10⁴ to 8.1 × 10⁶ base pairs, with 11–72% missing data. We demonstrate that robust phylogenetic trees, in terms of bootstrap values, do not necessarily result from larger data matrices, as previously suggested. Robust trees for distantly related and for closely related taxa resulted from different data matrices, and topologically different robust trees for distantly related taxa resulted from various data matrices. For closely related taxa, moderately large data matrices strongly supported a topology that is incompatible with morphological evidence, possibly due to the effect of introgressive hybridization. Practically, exploring variously prepared data matrices is an effective way to propose important alternative phylogenetic hypotheses for this study group. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
195. TRIMMED ANALYSIS OF VARIANCE: A ROBUST MODIFICATION OF ANOVA.
- Author
-
MUDHOLKAR, GOVIND S., SRIVASTAVA, DEO KUMAR, MARCHETTI, CAROL E., and MUDHOLKAR, ANIL G.
- Subjects
- *
ANALYSIS of variance , *MATHEMATICAL statistics , *ROBUST control , *DATA analysis , *NUMERICAL analysis - Published
- 2013
196. Religion and the Hidden Gender Gap on Abortion.
- Author
-
Forster, A. Diana
- Subjects
- *
RELIGION , *ABORTION , *DATA analysis , *ELECTIONS , *ROBUST control , *POLITICAL participation - Abstract
This paper uses data from the American National Election Studies to examine the intricate relationships between gender, religion, and public opinion on abortion rights. Given that abortion is traditionally identified as a "women's issue," researchers are often surprised to note that the issue does not appear to manifest a gender gap among the American public. However, I show that once religious preferences are accounted for, women are significantly more supportive of the right to choose than men are. After demonstrating the robustness of this finding, I conclude by exploring its implications for our understanding of religion, political behavior, and the politics of gender more broadly. [ABSTRACT FROM AUTHOR]
- Published
- 2013
197. Robust model selection and the statistical classification of languages.
- Author
-
García, J. E., González-López, V. A., and Viola, M. L. L.
- Subjects
- *
ROBUST control , *RANKING (Statistics) , *CLASSIFICATION , *STOCHASTIC processes , *DATA analysis , *MARKOV processes , *ASYMPTOTIC expansions - Abstract
In this paper we address the problem of model selection for the set of finite-memory stochastic processes with finite alphabet when the data are contaminated. We consider m independent samples, more than half of which are realizations of the same stochastic process with law Q, which is the process we want to retrieve. We devise a model selection procedure such that, for a sufficiently large sample size, the selected process is the one with law Q. Our model selection strategy is based on estimating relative entropies to select a subset of samples that are realizations of the same law. Although the procedure is valid for any family of finite-order Markov models, we focus on the family of variable-length Markov chain models, which includes the family of fixed-order Markov chain models. We define the asymptotic breakdown point (ABDP) for a model selection procedure and derive the ABDP for our procedure. This means that if the proportion of contaminated samples is smaller than the ABDP, then, as the sample size grows, our procedure selects a model for the process with law Q. We also use our procedure in a setting where we have one sample formed by the concatenation of sub-samples of two or more stochastic processes, with most of the sub-samples having law Q. We conducted a simulation study. In the application section we address the question of the statistical classification of languages according to their rhythmic features using speech samples. This is an important open problem in phonology. A persistent difficulty in this problem is that the speech samples correspond to several sentences produced by diverse speakers, corresponding to a mixture of distributions. The usual procedure for dealing with this problem has been to choose a subset of the original sample that seems to best represent each language, with the selection made by listening to the samples. In our application we use the full dataset without any preselection of samples.
We apply our robust methodology to estimate a model representing the main law for each language. Our findings agree with the linguistic conjecture related to the rhythm of the languages included in our dataset. [ABSTRACT FROM AUTHOR]
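A minimal sketch of the sample-selection idea, under strong simplifying assumptions: order-0 (i.i.d.) empirical distributions stand in for the paper's finite-order and variable-length Markov estimates, and plain KL divergence stands in for its relative-entropy estimates; the tolerance and the toy Bernoulli data are invented:

```python
import numpy as np
from collections import Counter

def empirical_dist(sample, alphabet):
    """Empirical symbol distribution with add-one smoothing (order-0 sketch)."""
    counts = Counter(sample)
    total = len(sample) + len(alphabet)
    return np.array([(counts[a] + 1) / total for a in alphabet])

def kl(p, q):
    """KL divergence between two discrete distributions (stands in for the
    paper's relative-entropy estimates between fitted Markov models)."""
    return float(np.sum(p * np.log(p / q)))

def majority_samples(samples, alphabet, tol=0.05):
    """Keep the samples whose law agrees with at least half of the others."""
    dists = [empirical_dist(s, alphabet) for s in samples]
    m = len(samples)
    keep = []
    for i in range(m):
        agree = sum(1 for j in range(m) if j != i and kl(dists[i], dists[j]) < tol)
        if agree >= m // 2:
            keep.append(i)
    return keep

rng = np.random.default_rng(1)
alphabet = [0, 1]
# Four i.i.d. Bernoulli(0.3) samples (the law Q) and one contaminated Bernoulli(0.8).
samples = [rng.binomial(1, 0.3, 2000) for _ in range(4)]
samples.append(rng.binomial(1, 0.8, 2000))
print(majority_samples(samples, alphabet))  # → [0, 1, 2, 3]
```

The majority rule mirrors the breakdown-point idea: as long as fewer than half the samples are contaminated, the retained subset follows the law Q.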
- Published
- 2012
- Full Text
- View/download PDF
198. WORKING DRAFT for WPSA 2012.
- Author
-
Little, Deborah L. and Levy, Traci
- Subjects
- *
PATIENT education , *ROBUST control , *PEOPLE with disabilities , *DATA analysis , *OPPRESSION - Abstract
This paper puts care theory and disability theory into dialogue through an examination of the relationships between disabled persons and their children or their personal assistants/caregivers. Both theories attend to issues of oppression, invisibility, and power and voice. However, care theory tends to focus on creating a public ethic of care and examines the lived experience of those providing assistance, while disability theory focuses on creating rights for disabled persons and examines the lived experience of people with disabilities. Rarely do researchers extend their lenses to consider the perspective of the other participant in the assistant/recipient care relationship. In this paper, we explore the daily lived experience of individuals participating in care/disability relationships, including disabled adults and their non-disabled minor children and disabled working-age adults and their personal assistants. Our data come from the existing literature on these relationships, particularly the qualitative data from interviews and/or observational studies. Our aim is to develop care theory through integration of the experiences of the other participant in the care relationship. Ultimately, we argue that in order to integrate the experience and agency of physically disabled people, care theory must: acknowledge the potential for greater agency on the part of care-receivers, recognize shifting and multi-directional care relationships, and include a robust and fully-realized concept of care of the self. [ABSTRACT FROM AUTHOR]
- Published
- 2012
199. Enhanced fuzzy partitions vs data randomness in FCM.
- Author
-
Jiang, Yizhang, Chung, Fu-Lai, and Wang, Shitong
- Subjects
- *
FUZZY clustering technique , *FUZZY partitions , *DATA analysis , *ROBUST control , *RANDOM noise theory - Abstract
IFP-FCM and GIFP-FCM are two typical enhanced fuzzy clustering algorithms in which the rationale of fuzzy clustering and its robustness to noise and/or outliers are enhanced by making the maximal fuzzy membership of each data point in a cluster as large as possible and the memberships of that point in all other clusters as small as possible. In this study, a new finding is revealed: their enhanced fuzzy partitions can be equivalently achieved by artificially disturbing the given dataset with random noise and then applying the proposed noise-resistant fuzzy clustering algorithm NR-FCM to the dataset with the artificially added noise. NR-FCM is designed as an intermediate step that allows us to observe this finding. The value of this finding is that it shows, from an alternative perspective, that the fuzziness of fuzzy partitions in fuzzy clustering and data randomness can be collaborative, and even mutually transformable, rather than competitive. Several experimental results verify this claim. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
200. Enhancing sparsity via full rank decomposition for robust face recognition.
- Author
-
Lu, Yuwu, Cui, Jinrong, and Fang, Xiaozhao
- Subjects
- *
HUMAN facial recognition software , *DATA analysis , *MATHEMATICAL decomposition , *ROBUST control , *LINEAR equations , *PROBLEM solving - Abstract
In this paper, we propose a fast and robust face recognition method named enhancing sparsity via full-rank decomposition. The proposed method first represents the test sample as a linear combination of the training data, as in sparse representation, and then performs a full-rank decomposition of the training data matrix. We obtain the generalized inverse of the training data matrix and solve the linear equation directly for its general solution. To obtain the optimal representation of the test sample, we solve for it with the least-squares method. We then classify the test sample into the class with the minimal reconstruction error. Our method obtains the optimal solution of the linear equation and is more suitable for face recognition than the sparse representation classifier. Extensive experimental results on publicly available face databases demonstrate the effectiveness of the proposed method for face recognition. [ABSTRACT FROM AUTHOR]
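A minimal sketch of the representation-and-residual classification described above, using NumPy's pseudoinverse in place of an explicit full-rank decomposition (both yield the same minimum-norm least-squares solution); the toy 16-dimensional "faces" are invented:

```python
import numpy as np

def lsq_classify(train, labels, test):
    """Represent the test sample as a linear combination of all training
    samples via the pseudoinverse (least-squares solution), then assign the
    class whose training samples give the smallest reconstruction error."""
    A = np.asarray(train, dtype=float).T        # columns are training samples
    x = np.linalg.pinv(A) @ np.asarray(test)    # minimum-norm least-squares coefficients
    labels = np.asarray(labels)
    errors = {}
    for c in np.unique(labels):
        mask = labels == c
        recon = A[:, mask] @ x[mask]            # contribution of class c only
        errors[c] = np.linalg.norm(test - recon)
    return min(errors, key=errors.get)

rng = np.random.default_rng(2)
# Two toy "face" classes: noisy copies of two fixed 16-dimensional templates.
t0, t1 = rng.standard_normal(16), rng.standard_normal(16)
train = ([t0 + 0.1 * rng.standard_normal(16) for _ in range(5)]
         + [t1 + 0.1 * rng.standard_normal(16) for _ in range(5)])
labels = [0] * 5 + [1] * 5
test = t1 + 0.1 * rng.standard_normal(16)
print(lsq_classify(train, labels, test))  # → 1
```

Because the coefficients come from a single linear solve rather than an iterative ℓ₁ optimization, this style of classifier is much faster than sparse-representation classification, which is the speed advantage the abstract claims.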
- Published
- 2014
- Full Text
- View/download PDF