Search Results (29 total)
2. Multilevel dimensionality-reduction methods
- Abstract
When data sets are multilevel (group nesting or repeated measures), the different sources of variation must be identified. In the framework of unsupervised analyses, multilevel simultaneous component analysis (MSCA) has recently been proposed as the most satisfactory option for analyzing multilevel data. MSCA estimates submodels for the different levels in the data and thereby separates the "within"-subject and "between"-subject variation in the variables. Following the principles of MSCA and its strategy of decomposing the available data matrix into orthogonal blocks that reflect the between- and within-subject structures, we generalize, from a multilevel perspective, multivariate models in which a matrix of response variables guides the projections (responses predicted by the explanatory variables or by a limited number of their combinations/composites) toward meaningful directions. To this end, the current paper proposes multilevel versions of the multivariate regression model and of dimensionality-reduction methods used to predict the responses with fewer linear composites of the explanatory variables. The principal finding of the study is that, under some constraints, minimizing the loss functions of multivariate regression, principal-component regression, reduced-rank regression, and canonical-correlation regression is equivalent to separately minimizing two loss functions corresponding to the between and within structures. The paper closes with a case study of an application focusing on the relationships between mental health severity and the intensity of care in the Lombardy region mental health system. (A schematic of the between/within decomposition is sketched after this entry.)
- Published
- 2013
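A minimal sketch of the between/within split this abstract relies on, in our own notation (the symbols X_b, X_w, B are not the paper's): rows of the data matrix are grouped by subject, the between block repeats each subject's centered mean profile, and the within block holds the deviations from those means; subject-wise centering makes the two blocks, and the cross terms between levels, orthogonal.

```latex
\[
X = X_b + X_w, \qquad Y = Y_b + Y_w, \qquad
X_b^\top X_w = 0, \quad Y_b^\top Y_w = 0,
\]
\[
\lVert Y - XB \rVert_F^2
  \;=\; \lVert Y_b - X_b B \rVert_F^2 \;+\; \lVert Y_w - X_w B \rVert_F^2 .
\]
```

With level-specific coefficient matrices in place of a common B, each term can then be minimized on its own, which is the kind of equivalence the abstract states under constraints.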
3. The uniformly minimum variance unbiased estimator of odds ratio in case–control studies under inverse sampling
- Abstract
This paper proposes the uniformly minimum variance unbiased (UMVU) estimator of the odds ratio in case–control studies under an inverse sampling design. Estimating the odds ratio plays a central role in case–control studies, but traditional sampling schemes are inadequate when the expected frequencies of unexposed cases and exposed controls are very low. In such cases it is convenient to use the inverse sampling design, in which random drawings continue until a given number of relevant events has emerged. We prove that a UMVU estimator of the odds ratio does not exist under the usual binomial sampling, while the standard odds ratio estimator is UMVU under inverse sampling. We also compare the two sampling schemes by means of large-sample theory and small-sample simulation. (The standard estimator is recalled after this entry.)
- Published
- 2012
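For reference, and in our notation rather than the paper's: with exposure probabilities p_1 for cases and p_2 for controls, and 2×2 cell counts a (exposed cases), b (unexposed cases), c (exposed controls), d (unexposed controls), the odds ratio and its standard estimator are

```latex
\[
\psi \;=\; \frac{p_1/(1-p_1)}{p_2/(1-p_2)}, \qquad
\hat{\psi} \;=\; \frac{a\,d}{b\,c} .
\]
```

Under inverse sampling the count of the rare event is fixed by design while the sample size is random; this is the design under which the paper shows the standard estimator is UMVU.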
4. On the Undecidability of Fuzzy Description Logics with GCIs and Product t-norm
- Abstract
The combination of fuzzy logics and Description Logics (DLs) has been investigated for at least two decades, because such fuzzy DLs can be used to formalize imprecise concepts. In particular, tableau algorithms for crisp Description Logics have been extended to reason with their fuzzy counterparts. Recently, it has been shown that, in the presence of general concept inclusion axioms (GCIs), some of these fuzzy DLs do not have the finite model property, throwing doubt on the correctness of the tableau algorithms that were claimed to handle fuzzy DLs with GCIs. In a previous paper, we showed that these doubts are justified, by proving that a certain fuzzy DL with product t-norm and involutive negation is undecidable. In the present paper, we show that undecidability also holds for a t-norm-based fuzzy DL in which disjunction and involutive negation are replaced by the implication constructor, interpreted as the residuum. The only condition on the t-norm is that it is a continuous t-norm "starting" with the product t-norm, which covers an uncountable family of t-norms. (The connectives at issue are recalled after this entry.)
- Published
- 2011
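For reference, the connectives at issue, with their standard definitions on [0,1]: the product t-norm, its residuum, and the involutive negation used in the earlier result.

```latex
\[
x \otimes y \;=\; x \cdot y, \qquad
x \Rightarrow y \;=\;
\begin{cases}
1 & \text{if } x \le y,\\
y/x & \text{otherwise,}
\end{cases}
\qquad
\ominus x \;=\; 1 - x .
\]
```

A continuous t-norm "starting" with the product t-norm is, roughly, one whose ordinal-sum decomposition begins with an isomorphic copy of the product t-norm on an initial interval.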
6. Improvements to the tableau prover PITP
- Abstract
In this paper we discuss the new version of PITP, a procedure to decide propositional intuitionistic logic, which is currently the best-performing propositional prover on the ILTP benchmark. Changes to the strategy and the implementation make the new version of PITP faster and able to decide more formulas than the previous one. We give a short account both of the old optimizations and of the changes in strategy with respect to the previous version. We use the ILTP library and randomly generated formulas to compare the implementation described in this paper with other provers (including the old version of PITP). (Two formulas that separate intuitionistic from classical validity are given after this entry.)
- Published
- 2007
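To make concrete what such a decision procedure must do, here are two standard test formulas (ours, not taken from the paper): any sound and complete prover for propositional intuitionistic logic accepts the first and rejects the second, which is classically valid but intuitionistically unprovable.

```latex
\[
A \rightarrow (B \rightarrow A)
\qquad \text{(intuitionistically and classically valid)}
\]
\[
((A \rightarrow B) \rightarrow A) \rightarrow A
\qquad \text{(Peirce's law: classically valid only)}
\]
```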
7. Application of a SPH depth-integrated model to landslide run-out analysis
- Abstract
Hazard and risk assessment of landslides with potentially long run-out is becoming increasingly important. Numerical tools exploiting different constitutive models, initial data, and numerical solution techniques help make the expert's assessment more objective, even though they cannot substitute for the expert's understanding of the site-specific conditions and the processes involved. This paper presents a depth-integrated model accounting for pore water pressure dissipation, with applications both to real events and to problems for which analytical solutions exist. The main ingredients are: (i) the mathematical model, which includes pore pressure dissipation as an additional equation; this makes it possible to model flowslides that are highly mobile at the beginning, the landslide mass coming to rest once pore water pressures dissipate; (ii) the rheological models describing basal friction: Bingham, frictional, Voellmy, and cohesive-frictional viscous models; (iii) simple erosion laws, with a comparison between the approaches of Egashira, Hungr, and Blanc; and (iv) a Lagrangian SPH model to discretize the equations, including pore water pressure information associated with the moving SPH nodes. (A minimal sketch of two of these ingredients follows this entry.)
- Published
- 2014
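A minimal Python sketch of two of the listed ingredients, under our own simplifying assumptions (function names and parameter values are illustrative, not the paper's code or calibration): an effective-stress frictional basal resistance that recovers as excess pore pressure dissipates, a first-eigenmode consolidation decay often used as a depth-integrated simplification, and the Voellmy rheology.

```python
import numpy as np

def pore_pressure(p0, t, cv=1e-4, h=1.0):
    """Excess pore pressure decaying as the first eigenmode of vertical
    consolidation, a common depth-integrated simplification. cv and h
    are illustrative, uncalibrated values."""
    return p0 * np.exp(-(np.pi ** 2) * cv * t / (4.0 * h ** 2))

def frictional_resistance(sigma_n, p_w, phi_deg=30.0):
    """Effective-stress Coulomb friction at the base: high pore pressure
    means low resistance (high initial mobility); as p_w dissipates the
    resistance recovers and the sliding mass can come to rest."""
    return (sigma_n - p_w) * np.tan(np.radians(phi_deg))

def voellmy_resistance(sigma_n, v, mu=0.2, xi=500.0, rho=2000.0, g=9.81):
    """Voellmy rheology: Coulomb term plus a turbulent, velocity-squared
    drag term controlled by the coefficient xi."""
    return mu * sigma_n + rho * g * v ** 2 / xi
```

In the full model, closures of this kind feed the depth-integrated momentum balance carried by the moving SPH nodes.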
8. Chasing a complete understanding of the triggering mechanisms of a large rapidly evolving rockslide
- Abstract
Rockslides in alpine areas can reach large volumes and, owing to their position along slopes, can either evolve rapidly, generating large rock avalanches, or decelerate and stabilize. As a consequence, particularly when located within large deep-seated deformations, this type of instability requires accurate observation and monitoring. This paper discusses the case study of the La Saxe rockslide (ca. 8 × 10⁶ m³), located within a deep-seated deformation, which has undergone a major phase of acceleration in the last decade and exposes the valley bottom to high risk. To reach a more complete understanding of the process, an intense investigation program has been carried out over the last three years. Boreholes have been drilled, logged, and instrumented (open-pipe piezometers, borehole wire extensometers, inclinometric casings) to assess the landslide volume, the rate of displacement at depth, and the water pressure. Displacement monitoring has been undertaken with optical targets, a GPS network, a ground-based interferometer, and four differential multi-parametric borehole probes. A clear seasonal acceleration related to snow-melting periods is observed, and deep displacements are localized at specific depths. The analysis of the piezometric and snowmelt data and the calibration of a 1D block model allow the expected displacements to be forecast. For this purpose, a 1D pseudo-dynamic visco-plastic approach based on Perzyna's theory has been developed. The viscous nucleus is assumed to be bi-linear: in one case, irreversible deformations develop only for positive yield function values; in a more general case, visco-plastic deformations develop even for negative values. The model has been calibrated and subsequently validated on a long time series of monitoring data, and it appears reliable for simulating the in situ data. A simplified 3D approach is suggested by subdividing the landslide mass into distinct interacting blocks. (An illustrative Perzyna-type sketch follows this entry.)
- Published
- 2014
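The abstract does not give the functional form of the bi-linear viscous nucleus; the Python sketch below is our own illustrative reading of a Perzyna-type 1D block model, with entirely hypothetical parameters, not the paper's calibrated model.

```python
import numpy as np

def perzyna_rate(F, gamma=1e-7):
    """Classic Perzyna visco-plastic rate: irreversible slip accumulates
    only for positive values of the yield function F (overstress)."""
    return gamma * max(F, 0.0)

def perzyna_rate_bilinear(F, gamma_pos=1e-7, gamma_neg=1e-8, F_min=-10.0):
    """Bi-linear viscous nucleus (our illustrative reading): a second,
    smaller slope lets slip accumulate even for moderately negative F,
    vanishing below F_min. All parameters are hypothetical."""
    if F > 0.0:
        return gamma_pos * F
    return gamma_neg * max(F - F_min, 0.0)

# Forward-Euler update of block displacement driven by a made-up
# piezometric overstress series F(t), one step per day:
dt = 86400.0                              # s
F_series = np.linspace(-20.0, 30.0, 60)   # hypothetical overstress, kPa
x = 0.0
for F in F_series:
    x += perzyna_rate_bilinear(F) * dt
print(f"cumulative displacement: {x:.3f} m")
```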
9. A trust-based approach for a competitive cloud/grid computing scenario
- Abstract
Cloud/grid systems are composed of nodes that individually manage local resources; when a client request is submitted to the system, the most suitable nodes to satisfy that request must be found. In a competitive scenario, nodes compete with one another for the assignment of available tasks. In such a situation, a node may lie about its own capability in order to obtain the assignment of a task. Lying nodes will then need the collaboration of other nodes to complete the task, and the problem arises of finding the most promising collaborators. In this context, to make this selection effective, each node should have a trust model for accurately choosing its interlocutors. In this paper, a trust-based approach is proposed to make a node capable of finding the most reliable interlocutors. To avoid exploring the whole node space, the approach exploits a P2P resource-finding method for clouds/grids that determines the admissible region of nodes to consider when searching for interlocutors. (A sketch of such a selection follows this entry.)
- Published
- 2013
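A sketch of the kind of selection the abstract describes. The linear trust combination, the weight alpha, and the function names are our assumptions, not the paper's model: each node scores only the candidates inside the admissible region returned by the P2P resource-finding step and picks the top k.

```python
def trust(node, direct, reputation, alpha=0.7):
    """Illustrative trust score: convex mix of this node's own past
    experience with a candidate (direct) and third-party recommendations
    (reputation); alpha weights direct experience."""
    return alpha * direct[node] + (1.0 - alpha) * reputation[node]

def choose_interlocutors(admissible, direct, reputation, k=3):
    """Rank only nodes in the admissible region found by the P2P
    resource-finding step, avoiding a scan of the whole node space."""
    ranked = sorted(admissible,
                    key=lambda n: trust(n, direct, reputation),
                    reverse=True)
    return ranked[:k]

# Hypothetical usage with made-up scores in [0, 1]:
direct = {"n1": 0.9, "n2": 0.4, "n3": 0.7}
reputation = {"n1": 0.6, "n2": 0.8, "n3": 0.7}
print(choose_interlocutors(["n1", "n2", "n3"], direct, reputation, k=2))
```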
10. Fallacies as argumentative devices in political debates
- Abstract
The current paper contributes to the study of argumentation in political debates by examining the role of fallacies in political argumentation. In the first two sections we briefly review the literature on the concepts of argumentation and fallacies, showing that both converge in emphasizing the role of the discourse type when evaluating the efficacy of communicative strategies. This perspective is then applied in the analysis section to the role of fallacies in a political debate on nuclear energy held in Italy. We conduct a discourse analysis of the transcript, on the basis of which we identify a variety of relevant paths followed by speakers when constructing arguments. The findings show how several informal fallacies (argumentum ad baculum, argumentum ad hominem, argument from analogy, argumentum ad consequentiam) are strategically used by politicians to put forward coherent and strong positions.
- Published
- 2013