41 results for "statistical probability"
Search Results
2. Data Processing for the Life Assessment of Wheel Centers of Electric Locomotives.
- Author
-
Leonets, V. A. and Lukashevych, A. O.
- Subjects
- *
DISTRIBUTION (Probability theory) , *ELECTRIC locomotives , *LOCOMOTIVE maintenance & repair , *FATIGUE cracks , *FATIGUE life - Abstract
During the factory repairs of electric locomotives of all series and indices VL8, VL10, VL60, VL40, VL80, and VL82M (hereinafter referred to as VLv/i series electric locomotives) in the mid-20s, an unforeseen increase in the number of cast wheel centers with cracks was noted simultaneously at different domestic plants. The maximum number of axles of wheelsets withdrawn from service for various reasons was observed during 16–20 years of operation. At the same time, the duration of operation of a larger total number of wheel centers was more than 30 years. Cases of cracking in cast hub centers and spokes have been reported. There is a need for 100% control of hub cracks in the wheel centers of electric locomotives of the VLv/i series. The most common cause of failure of cast wheel centers in operation is their increased susceptibility to cracking due to casting defects. Increasing the service reliability of wheel centers is achieved by using rolled centers in locomotives, which have higher material quality and a homogeneous structure. Currently, no requirements are imposed on locomotive wheelsets in terms of the limiting value of the probability of cracking in wheel center hubs, which makes it impossible to predict their service life. So far, the peculiarities of fatigue cracks in the hubs of cast and rolled locomotive wheel centers operating under giga-cycle loading have not been sufficiently studied. An urgent task is to process statistical data on the establishment of wheelset mileages, particularly for the electric locomotives of the VLv/i series, with a given probability of the appearance of fatigue cracks. We assumed that the probability of safe mileage of wheel centers should be 0.001% with an acceptable probability of operation of the entire wheelset of 0.991. The given probability of safe operation of the wheelset centers of electric locomotives makes it possible to carry out a statistical analysis of the probability distribution of the mileage of the repaired wheel centers of each series of electric locomotives. The cumulative probability of failure-free operation of wheel centers is estimated using the central limit theorem, according to which the distribution function of the sum of a large number of independent random variables with a finite value of mathematical expectation and variance is reduced to the function of the normal probability distribution of the occurrence of unrepairable wheel centers of electric locomotives of this series. The analysis of the normal distribution of fatigue life under the giga-cycle loading of the wheel centers of each series of electric locomotives allows one to determine the identity of their reliability. The problem of avoiding the unstable operation of locomotive repair plants without detecting out-of-service wheelset centers has been solved. [ABSTRACT FROM AUTHOR]
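A minimal sketch of the mileage calculation this abstract describes, assuming (as the abstract does) that crack-appearance mileage is normally distributed; the sample mileages and the 1% probability limit below are hypothetical stand-ins, not the authors' data.

```python
# Minimal sketch: estimate the mileage at which the probability of a fatigue
# crack in a wheel-center hub reaches a chosen limit, assuming crack-appearance
# mileage follows a normal distribution (central limit theorem argument).
# The sample mileages (thousand km) and the 1% limit are hypothetical.
import numpy as np
from scipy import stats

crack_mileage = np.array([820, 910, 1005, 1100, 1180, 1240, 1330, 1420, 1510, 1600])

mu, sigma = crack_mileage.mean(), crack_mileage.std(ddof=1)   # normal-fit parameters

p_limit = 0.01                                                # allowed crack probability
safe_mileage = stats.norm.ppf(p_limit, loc=mu, scale=sigma)   # 1% quantile of the fit

print(f"mean = {mu:.0f}, std = {sigma:.0f} (thousand km)")
print(f"mileage with {p_limit:.0%} crack probability: {safe_mileage:.0f} thousand km")
```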
- Published
- 2024
- Full Text
- View/download PDF
3. Non-Life Insurance: Pricing
- Author
-
Maggioni, Massimiliano and Turchetti, Giuseppe
- Published
- 2024
- Full Text
- View/download PDF
4. Where’s @Waldo?: Finding Users on Twitter
- Author
-
Clarkson, Kyle, Srivastava, Gautam, Meawad, Fatma, Dwivedi, Ashutosh Dhar, Hutchison, David, Editorial Board Member, Kanade, Takeo, Editorial Board Member, Kittler, Josef, Editorial Board Member, Kleinberg, Jon M., Editorial Board Member, Mattern, Friedemann, Editorial Board Member, Mitchell, John C., Editorial Board Member, Naor, Moni, Editorial Board Member, Pandu Rangan, C., Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Terzopoulos, Demetri, Editorial Board Member, Tygar, Doug, Editorial Board Member, Goos, Gerhard, Founding Editor, Hartmanis, Juris, Founding Editor, Rutkowski, Leszek, editor, Scherer, Rafał, editor, Korytkowski, Marcin, editor, Pedrycz, Witold, editor, Tadeusiewicz, Ryszard, editor, and Zurada, Jacek M., editor
- Published
- 2019
- Full Text
- View/download PDF
5. Probability of paying a bribe: a multi-country approach for Latin America and the Caribbean.
- Author
-
Lombana, Jahir and Cabeza, Leonor
- Subjects
- *
MAXIMUM likelihood statistics , *WILLINGNESS to pay , *BRIBERY , *CORRUPTION , *DEPENDENT variables - Abstract
Bribery, as a research issue, focuses on causality and impact analysis and in many cases has been subordinated as one of the many types of corruption. Using microdata from Transparency International's Global Corruption Barometer, the objective is to determine, for a set of selected countries in Latin America and the Caribbean, the variables that predict the probability of paying bribes when in contact with individuals, companies or entities that provide services. Inferential statistical techniques are used, such as logistic regression applying the maximum likelihood method and an iterative process until the best model is found. From an initial survey with 117 variables, 11 variables were found to best explain the probability of paying a bribe. Variables such as level of schooling and the influence of officials and politicians affect people's willingness to pay a bribe. Significant topics of this study, such as retaliation to favor voting and sexual favors to receive benefits, although incipient in their study, are worth further research. This work can serve as a basis to motivate studies with microdata that help to better understand specific typologies of corruption through models that predict behaviors of ordinary citizens. [ABSTRACT FROM AUTHOR]
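A minimal sketch of the kind of model the abstract describes: a logistic regression fitted by iterative maximum likelihood to predict the probability of paying a bribe. The data and the variable names (schooling, contact_officials) are simulated stand-ins, not the Global Corruption Barometer microdata.

```python
# Minimal sketch: logistic regression fitted by maximum likelihood (iteratively)
# to predict the probability of paying a bribe. All data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
schooling = rng.integers(0, 18, n)           # hypothetical years of schooling
contact_officials = rng.integers(0, 2, n)    # hypothetical contact with officials (0/1)

# simulate the outcome with an assumed "true" relationship
logit = -1.5 - 0.08 * schooling + 1.2 * contact_officials
paid_bribe = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([schooling, contact_officials]))
model = sm.Logit(paid_bribe, X).fit(disp=False)   # iterative maximum likelihood

print(model.params)                               # fitted coefficients
print("P(bribe | 12 yrs schooling, contact):", model.predict([[1, 12, 1]])[0])
```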
- Published
- 2022
6. Use of probabilistic and statistical methods for separation of rocks into permeable and impermeable parts (on example of clastic deposits of Visean stage of Sofyinskoe field)
- Author
-
Aleksandr V. Shcherbenev
- Subjects
well ,separation of well section ,reservoir ,seal rock ,open porosity ,irreducible water saturation ,hydrogen content ,natural radioactivity ,well logging methods ,core studies ,Visean oil and gas complex ,linear models ,statistical probability ,mathematical statistics ,scatter chart ,Student's coefficient ,integrated probability ,Environmental sciences ,GE1-350 - Abstract
Separation of a well section into permeable and impermeable parts is one of the main problems for the further construction of a geological model, reserves estimation, and field development planning. The quality of the separation depends on the amount of knowledge about the geological section, the level of theoretical development of well logging methods, and the general geophysical characteristics of the area. The fullest differentiation is obtained by using a complex of geological and geophysical methods. The paper focuses on the Visean deposits penetrated by a well of the Sofyinskoe field drilled in 2014. A complex of activities was performed along with well logging. Porosity was calculated from acoustic and neutron logging, and core analysis was performed. Using well logging and the results of core analysis, a selection was made and used for the construction of statistical models. Based on these statistical models, all parameters were brought to a single measurement system. The degree of influence of the geological and geophysical parameters was analyzed. The geological analysis shows that the greatest influence belongs to porosity and residual water; the geophysical analysis shows that the greatest influence belongs to hydrogen content and the natural radioactivity of rocks. A complex probabilistic parameter that includes all measurements from core and geophysical data is calculated. The results of core analysis are considered in full in order to obtain the highest degree of difference. Almost all the geophysical parameters increase the degree of difference, except for lateral logging, the microgradient and micropotential tools, and the P-wave transit time for the short tool, which reduce it. Based on the values of the complex parameter that show maximum differences in the geological and geophysical parameters, relationships between the geological and geophysical parameters were built. Scatter charts show that the fields of measured points do not intersect, which confirms a correct separation of the section. Using a statistical method makes it possible to take full account of the available geological and geophysical data when separating a section into permeable and impermeable parts.
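A minimal sketch of an integrated ("complex") probability of the sort described, assuming each log or core parameter contributes a two-class Gaussian probability that is combined by a normalized product; the class means, values and combination rule are illustrative assumptions, not the paper's calibrated model.

```python
# Minimal sketch: combine per-parameter probabilities of "permeable" into an
# integrated probability for separating a well section. Values are hypothetical.
import numpy as np
from scipy import stats

def param_probability(values, perm_mean, imperm_mean, sigma):
    """P(permeable) for one parameter, from two Gaussian class models."""
    p_perm = stats.norm.pdf(values, perm_mean, sigma)
    p_imp = stats.norm.pdf(values, imperm_mean, sigma)
    return p_perm / (p_perm + p_imp)

depth = np.arange(2000.0, 2005.0, 1.0)                 # metres, hypothetical
porosity = np.array([0.16, 0.04, 0.14, 0.03, 0.12])    # open porosity
hydrogen = np.array([0.18, 0.06, 0.15, 0.05, 0.13])    # hydrogen content

p1 = param_probability(porosity, perm_mean=0.15, imperm_mean=0.04, sigma=0.03)
p2 = param_probability(hydrogen, perm_mean=0.16, imperm_mean=0.05, sigma=0.04)

# integrated probability: normalized product over the parameters
p_perm = p1 * p2
p_imp = (1 - p1) * (1 - p2)
integrated = p_perm / (p_perm + p_imp)

for z, p in zip(depth, integrated):
    print(f"{z:.0f} m: P(permeable) = {p:.2f} ->", "permeable" if p > 0.5 else "impermeable")
```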
- Published
- 2017
- Full Text
- View/download PDF
7. Fault Diagnosis for Distribution Networks Based on Fuzzy Information Fusion
- Author
-
Wu, Fangrong, Peng, Minfang, Qi, Mingjun, Zhu, Liang, Leng, Hua, Su, Yi, Zhong, Qiang, Tan, Hu, Junqueira Barbosa, Simone Diniz, Series editor, Chen, Phoebe, Series editor, Cuzzocrea, Alfredo, Series editor, Du, Xiaoyong, Series editor, Filipe, Joaquim, Series editor, Kara, Orhun, Series editor, Kotenko, Igor, Series editor, Sivalingam, Krishna M., Series editor, Ślęzak, Dominik, Series editor, Washio, Takashi, Series editor, Yang, Xiaokang, Series editor, Li, Shutao, editor, Liu, Chenglin, editor, and Wang, Yaonan, editor
- Published
- 2014
- Full Text
- View/download PDF
8. The P–T Probability Framework for Semantic Communication, Falsification, Confirmation, and Bayesian Reasoning
- Author
-
Chenguang Lu
- Subjects
statistical probability ,logical probability ,semantic information ,rate-distortion ,Boltzmann distribution ,falsification ,Logic ,BC1-199 ,Philosophy (General) ,B1-5802 - Abstract
Many researchers want to unify probability and logic by defining logical probability or probabilistic logic reasonably. This paper tries to unify statistics and logic so that we can use both statistical probability and logical probability at the same time. For this purpose, this paper proposes the P–T probability framework, which is assembled with Shannon’s statistical probability framework for communication, Kolmogorov’s probability axioms for logical probability, and Zadeh’s membership functions used as truth functions. Two kinds of probabilities are connected by an extended Bayes’ theorem, with which we can convert a likelihood function and a truth function from one to another. Hence, we can train truth functions (in logic) by sampling distributions (in statistics). This probability framework was developed in the author’s long-term studies on semantic information, statistical learning, and color vision. This paper first proposes the P–T probability framework and explains different probabilities in it by its applications to semantic information theory. Then, this framework and the semantic information methods are applied to statistical learning, statistical mechanics, hypothesis evaluation (including falsification), confirmation, and Bayesian reasoning. Theoretical applications illustrate the reasonability and practicability of this framework. This framework is helpful for interpretable AI. To interpret neural networks, we need further study.
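A minimal sketch of the likelihood-to-truth-function conversion mentioned above, assuming the truth function is the likelihood ratio P(x|θ)/P(x) rescaled to a maximum of 1; this particular normalization is an assumption for illustration, and the paper should be consulted for the exact form of the extended Bayes' theorem.

```python
# Minimal sketch: convert a likelihood function into a fuzzy truth (membership)
# function and back, illustrating the round trip between statistical and logical
# probability. The normalization rule used here is an illustrative assumption.
import numpy as np

p_x = np.array([0.1, 0.2, 0.4, 0.2, 0.1])               # source distribution P(x)
likelihood = np.array([0.05, 0.15, 0.30, 0.35, 0.15])   # P(x | theta), sums to 1

ratio = likelihood / p_x
truth = ratio / ratio.max()                  # truth/membership function in [0, 1]

# reconstruct the likelihood from the truth function ("semantic Bayes" direction)
logical_prob = np.sum(p_x * truth)           # T(theta), the logical probability
likelihood_back = p_x * truth / logical_prob

print("truth function:", np.round(truth, 3))
print("reconstructed likelihood:", np.round(likelihood_back, 3))
print("matches original:", np.allclose(likelihood_back, likelihood))
```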
- Published
- 2020
- Full Text
- View/download PDF
9. Structure of knowledge about probability in the problem-solving process and its lesson development
- Subjects
Equally Possible ,同様に確からしい ,数学的確率 ,根元事象 ,Elementary Event ,Statistical Probability ,統計的確率 ,Mathematical Probability - Abstract
The purpose of this study is to clarify how students think when solving problems with mathematical probability in the teaching of probability in the second year of lower secondary school, focusing on how the notion of "equally likely" is grasped and on the process by which the concept is formed. Lessons were developed to move students beyond merely knowing the term "equally likely" toward understanding that it works effectively when uncertain events are treated mathematically. For this purpose, a teaching plan was prepared in which problems with the same content were set in the introductory and concluding phases of the probability instruction. By looking back on the ways of thinking used to solve the same problem with statistical probability and with mathematical probability, and by examining the two kinds of probability employed, it was confirmed that students came to grasp the elementary events with attention to the "equally likely" condition.
- Published
- 2022
10. Statistical Inference Without Frequentist Justifications
- Author
-
Sprenger, Jan, Suárez, Mauricio, editor, Dorato, Mauro, editor, and Rédei, Miklós, editor
- Published
- 2010
- Full Text
- View/download PDF
11. Body image perception and sense of agency: notes from the alien hand experiment.
- Author
-
Pinhatti, Marcelle Matiazo, de Castro, Thiago Gomes, and Gomes, William Barbosa
- Subjects
- *
BODY image , *SELF-perception , *PSYCHOLOGY of college students , *ATTENTIONAL bias , *PROBABILITY theory - Abstract
Body image perception comprises stable and dynamic representational features. Here, we examined whether a stable body image (BI) measure correlates to a dynamic sense of agency (SoA) score. Moreover, we tested whether differential BI perception scores predict SoA performance under sensory conflict conditions. A total of 21 university students (age M=25.9, SD=9.29) completed the Brazilian version of the Silhouette Figures Scale (SFS) and participated in The Alien Hand Experiment (TAHE). TAHE is a task that manipulates participants' hand movement SoA. A Bayesian correlation matrix revealed substantial negative correlation probability between BI perception and SoA confidence in accuracy responses, specifically in experimental conditions where agency was manipulated. Based on the SFS scores, participants were divided into groups with and without evidence of BI distortion. Bayesian Independent Samples T-Tests evidenced substantial differential probability between groups. Analyses confirmed lower SoA confidence among subjects with BI distortion greater than ±3 kg/m2. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
12. Epistemic Paradox
- Author
-
Leplin, Jarrett, Sellars, W.S., editor, Lehrer, K., editor, Hetherington, S., editor, and Cohen, S., editor
- Published
- 2009
- Full Text
- View/download PDF
13. A Symbolic Approach to Syllogistic Reasoning
- Author
-
Khayata, Mohamed Yasser, Pacholczyk, Daniel, Kacprzyk, Janusz, editor, Bouchon-Meunier, Bernadette, editor, Gutiérrez-Ríos, Julio, editor, Magdalena, Luis, editor, and Yager, Ronald R., editor
- Published
- 2002
- Full Text
- View/download PDF
14. Development of elementary school mathematics lessons that foster statistical problem-solving skills (1): Focusing on learning materials that make use of play involving probability suited to lower-grade development
- Author
-
TATSUZAKI, Kei and MATSUURA, Taketo
- Subjects
PPDAC cycle ,Statistical probability ,遊び ,Play ,Statistical problem-solving ability from lower grades ,ComputingMilieux_COMPUTERSANDEDUCATION ,蓋然性 ,統計的確率 ,Probability ,低学年からの統計的問題解決力 ,PPDACサイクル - Abstract
The purpose of this treatise was to develop elementary school mathematics lessons that foster statistical problem-solving skills. In Japan, statistical problem-solving is performed using the PPDAC cycle, which is carried out from the 5th grade, whereas overseas, statistical education linked to probability is provided from the lower grades of elementary school. We therefore developed learning materials using rock-paper-scissors play, a phenomenon familiar to children, and put them into practice in the third grade in order to develop the ability to solve statistical problems from the lower grades. Protocol analysis and descriptive analysis of the children showed that the learning materials were generally effective. This suggests that statistical problem-solving skills can be developed from the lower grades.
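A minimal sketch of the statistical-versus-mathematical probability contrast used in these lessons: the probability of winning a round of rock-paper-scissors estimated by relative frequency and compared with the mathematical value of 1/3; the trial count and the uniform-play assumption are illustrative.

```python
# Minimal sketch: estimate the statistical probability of winning rock-paper-scissors
# by relative frequency and compare it with the mathematical probability of 1/3
# (both players choosing uniformly at random; draws count as non-wins).
import random

random.seed(1)
moves = ["rock", "paper", "scissors"]
beats = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

trials = 10_000
wins = sum(beats[random.choice(moves)] == random.choice(moves) for _ in range(trials))

print(f"statistical probability of a win:  {wins / trials:.3f}")
print(f"mathematical probability of a win: {1 / 3:.3f}")
```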
- Published
- 2021
15. How different types of participant payments alter task performance
- Author
-
Gary L. Brase
- Subjects
participant methodology ,monetary incentives ,judgments under uncertainty ,statistical probability ,performance ,Social Sciences ,Psychology ,BF1-990 - Abstract
Researchers typically use incentives (such as money or course credit) in order to obtain participants who engage in the specific behaviors of interest to the researcher. There is, however, little understanding or agreement on the effects of different types and levels of incentives used. Some results in the domain of statistical reasoning suggest that performance differences --- previously deemed theoretically important --- may actually be due to differences in incentive types across studies. 704 participants completed one of five variants of a statistical reasoning task, for which they received either course credit, flat fee payment, or performance-based payment incentives. Successful task completion was more frequent with performance-based incentives than with either of the other incentive types. Performance on moderately difficult tasks (compared to very easy and very hard tasks) was most sensitive to incentives. These results can help resolve existing debates about inconsistent findings, guide more accurate comparisons across studies, and be applied beyond research settings.
- Published
- 2009
16. The Foundations
- Author
-
Hausner, Melvin
- Published
- 1995
- Full Text
- View/download PDF
17. A Statistical Model of Fibre Distribution in a Steel Fibre Reinforced Concrete
- Author
-
Janusz Kobaka
- Subjects
Technology ,Distribution (number theory) ,Composite number ,Article ,Consistency (statistics) ,distribution ,General Materials Science ,statistical probability ,Mathematics ,Microscopy ,QC120-168.85 ,model ,business.industry ,Manufacturing process ,QH201-278.5 ,Steel fibre ,Statistical model ,Structural engineering ,SFRC ,fibre ,Reinforced concrete ,Engineering (General). Civil engineering (General) ,TK1-9971 ,Descriptive and experimental mechanics ,Probability distribution ,Electrical engineering. Electronics. Nuclear engineering ,TA1-2040 ,business - Abstract
The aim of the research was to create a model of steel fibre distribution in a Steel Fibre Reinforced Concrete space using statistical probability means. The model was created in order to better understand the behaviour of the composite under operating conditions. Four statistical distributions (Beta, Kumaraswamy, Three Parameter Beta and Generalised Transmuted Kumaraswamy) were examined to find the distribution that best described fibre settling phenomenon caused by manufacturing process conditions. In the next stage the chosen statistical distribution was adapted to create the model of steel fibre distribution in a Steel Fibre Reinforced Concrete space. The model took into account technological conditions such as vibrating time and properties such as consistency of the tested concrete. The model showed a good agreement with the real fibre distribution.
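A minimal sketch of the distribution-fitting step described: a Beta and a Kumaraswamy distribution fitted to simulated normalized fibre positions and compared by log-likelihood. SciPy has no built-in Kumaraswamy distribution, so it is fitted by direct likelihood maximization; the data are synthetic, not the paper's measurements.

```python
# Minimal sketch: fit Beta and Kumaraswamy distributions to simulated fibre
# positions in [0, 1] and compare their log-likelihoods.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
positions = rng.beta(2.0, 3.5, size=400)       # simulated settled-fibre positions

# Beta fit with location/scale fixed to the unit interval
a, b, _, _ = stats.beta.fit(positions, floc=0, fscale=1)
ll_beta = np.sum(stats.beta.logpdf(positions, a, b))

# Kumaraswamy log-likelihood: f(x) = p*q*x**(p-1)*(1-x**p)**(q-1)
def neg_ll_kuma(params):
    p, q = params
    if p <= 0 or q <= 0:
        return np.inf
    return -np.sum(np.log(p) + np.log(q) + (p - 1) * np.log(positions)
                   + (q - 1) * np.log1p(-positions**p))

res = optimize.minimize(neg_ll_kuma, x0=[1.0, 1.0], method="Nelder-Mead")
print(f"Beta(a={a:.2f}, b={b:.2f})            log-likelihood: {ll_beta:.1f}")
print(f"Kumaraswamy(a={res.x[0]:.2f}, b={res.x[1]:.2f})  log-likelihood: {-res.fun:.1f}")
```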
- Published
- 2021
18. Statistical Probability Based Transmission Congestion Cost Increasing Tendency Analysis.
- Author
-
Li, Jiangtao, Zhou, Lin, and Li, Furong
- Abstract
The integration of renewable generation into the power system brings challenges and threats for power system operation and planning. Transmission congestion is one of the most intricate problems. Congestion cost links short-term system operation and long-term network planning, but little of the literature studies congestion cost over the long term. The main innovation of this paper is to employ the statistical probability of wind and load from real data to calculate congestion cost over a long period. The high accuracy and fast speed of the proposed calculation method are verified through a case study. The main contribution of this paper is the introduction of a typical congestion cost increasing curve after exploring the impacts of wind generation capacity, transmission capacity and peak load on congestion cost. The simulation results show that, as the connected wind generation capacity increases, the congestion cost increasing curve is divided into three periods: constant, exponential increase and saturation. The dividing points between the periods are determined by the transmission capacity and the wind/load characteristics. Furthermore, the constant values in the constant and saturation periods are determined by the different electricity prices and the transmission capacity shortage. The research findings contribute to congestion cost forecasting and renewable generation deployment. [ABSTRACT FROM PUBLISHER]
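A minimal sketch of an expected congestion cost computed from statistical probabilities of wind and load states. The one-line toy network, the state probabilities and the congestion price are assumptions for illustration, not the paper's system or data.

```python
# Minimal sketch: expected congestion cost from the statistical probabilities of
# wind and load states on a single constrained line. All values are illustrative.
import numpy as np

wind_mw = np.array([0, 100, 200, 300])         # wind output states (MW)
p_wind = np.array([0.3, 0.3, 0.25, 0.15])      # their statistical probabilities

load_mw = np.array([200, 400, 600])            # load states (MW)
p_load = np.array([0.4, 0.4, 0.2])

line_capacity = 350.0                          # MW the line can carry
congestion_price = 20.0                        # cost per MW of overload relieved

expected_cost = 0.0
for w, pw in zip(wind_mw, p_wind):
    for l, pl in zip(load_mw, p_load):         # wind and load assumed independent
        flow = l - w                           # toy model: import needed by the load area
        overload = max(abs(flow) - line_capacity, 0.0)
        expected_cost += pw * pl * overload * congestion_price

print(f"expected congestion cost per hour: {expected_cost:.1f}")
```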
- Published
- 2013
- Full Text
- View/download PDF
19. Assessing the application of ratios between daily and sub-daily extreme rainfall as disaggregation coefficients.
- Author
-
Abreu, Marcel Carvalho, Pereira, Silvio Bueno, Cecílio, Roberto Avelino, Pruski, Fernando Falco, Almeida, Laura Thebit de, and Silva, Demetrius David da
- Subjects
- *
RAINFALL , *TIME series analysis , *REGRESSION analysis , *RAIN gauges , *RETURN migration - Abstract
The objective of this paper was to establish disaggregation coefficients for the state of Minas Gerais, Brazil, under the premises of i) stationarity of the disaggregation coefficients according to the return period and ii) spatial validation of the disaggregation coefficients. Regression analysis was used as the statistical procedure to verify the significance of the slope coefficient of the disaggregation coefficients as a function of return period in daily and sub-daily rainfall time series at rain gauges distributed throughout the state of Minas Gerais. Most of the disaggregation coefficients (94% of the situations) did not show a tendency to change with the return period. The detected trends modified the coefficients of disaggregation by up to 21% relative to the average obtained for different return periods. It was also verified that there is a significant difference between rainfall ratios of different durations for different locations, indicating that the coefficients of disaggregation are not regional, but rather local. The use of disaggregation coefficients to obtain sub-daily rainfall is possible, provided that coefficients representative of the locality are used. In this context, the spatialization of disaggregation coefficients is an important approach. • Relations between rainfall of different durations are disaggregation coefficients. • The use of disaggregation coefficients to obtain sub-daily rainfall is possible. • Disaggregation coefficients did not show a trend to change due to return period. • The disaggregation coefficients have local validity. • The spatialization technique can favor suitable disaggregation coefficients. [ABSTRACT FROM AUTHOR]
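A minimal sketch of the stationarity check described: disaggregation coefficients (1 h / 24 h rainfall ratios) computed for several return periods and tested by linear regression for a trend with return period; the rainfall quantiles are illustrative, not the Minas Gerais data.

```python
# Minimal sketch: disaggregation coefficients and a regression test of whether
# they change with return period. The rainfall quantiles are hypothetical.
import numpy as np
from scipy import stats

return_period = np.array([2, 5, 10, 25, 50, 100])                 # years
rain_24h = np.array([78.0, 96.0, 108.0, 123.0, 134.0, 145.0])     # mm, daily extremes
rain_1h = np.array([33.0, 40.5, 45.8, 52.0, 56.2, 61.5])          # mm, sub-daily extremes

coeff = rain_1h / rain_24h                     # disaggregation coefficients

slope, intercept, r, p_value, stderr = stats.linregress(return_period, coeff)
print("coefficients:", np.round(coeff, 3))
print(f"slope = {slope:.5f} per year, p-value = {p_value:.3f}")
print("trend with return period:", "significant" if p_value < 0.05 else "not significant")
```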
- Published
- 2022
- Full Text
- View/download PDF
20. The rise of probabilistic thinking.
- Abstract
Probability in nineteenth-century science Variation was considered, well into the second half of the nineteenth century, to be deviation from an ideal value. This is clear in the ‘social physics’ of Adolphe Quetelet, where the ideal was represented by the notion of ‘average man’. In astronomical observation, the model behind this line of thought, there is supposed to be a true value in an observation, from which the actual value deviates through the presence of small erratic causes. In mathematical error theory, one could show that numerous small and mutually independent errors produce the familiar bell-shaped normal curve around a true value. But if observations contain a systematic error, this can be identified and its effect eliminated. All sorts of data regarding society were collected into public state records (whence comes the term statistics), showing remarkable statistical stability from year to year. Such stability, as in criminal records, was explained as the very nearly deterministic result of the sum of a great number of free individual acts (see Krüger et al. 1987 for studies of these developments). Around 1860, the physicist James Clerk Maxwell theoretically determined a normal Gaussian distribution law for the velocities of gas molecules. This discovery later led to statistical mechanics in the work of Ludwig Boltzmann and Josiah Willard Gibbs. Here there was no true unknown value, but genuine variation not reducible to effects of external errors. The world view of classical physics held that all motions of matter follow the deterministic laws of Newtonian mechanics. It was therefore argued, throughout the second half of the nineteenth century, and well into the twentieth, that there is an inherent contradiction in the foundations of statistical physics. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
21. A personal history of Bayesian statistics.
- Author
-
Leonard, Thomas Hoskyns
- Subjects
- *
WEB development , *BAYESIAN analysis , *AXIOMS , *ARM'S length transactions , *BUSINESS ethics - Abstract
The history of Bayesian statistics is traced, from a personal perspective, through various strands and via its re-genesis during the 1960s to the current day. Emphasis is placed on broad-sense Bayesian methodology that can be used to meaningfully analyze observed datasets. Over 750 people in science, medicine, and socioeconomics, who have influenced the evolution of the Bayesian approach into the powerful paradigm that it is today, are highlighted. The frequentist/Bayesian controversy is addressed, together with the ways in which many Bayesians combine the two ideologies as a Bayes/non-Bayes compromise, e.g., when drawing inferences about unknown parameters or when investigating the choice of sampling model in relation to its real-life background. A number of fundamental issues are discussed and critically examined, and some elementary explanations for nontechnical readers and some personal reminiscences are included. Some of the Bayesian contributions of the 21st century are subjected to more detailed critique, so that readers may learn more about the quality and relevance of the ongoing research. A recent resolution of Lindley's paradox by Baskurt and Evans is reported. The axioms of subjective probability are reassessed, some state-of-the-art alternatives to Leonard Savage's axioms of utility are discussed, and Deborah Mayo and Michael Evans's refutation of Allan Birnbaum's 1962 justification of the likelihood principle in terms of the sufficiency and conditionality principles is addressed. WIREs Comput Stat 2014, 6:80-115. doi: 10.1002/wics.1293 [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
22. FEATURES AND TIME DYNAMICS OF THE BODY MASS INDEX AND THE PONDERAL INDEX OF PROFESSIONAL BODYBUILDERS DURING THE 1980-2012 PERIOD.
- Author
-
Panayotov, Valentine
- Subjects
BODY mass index ,BODYBUILDERS ,RANDOM variables ,PROBABILITY density function ,GAUSSIAN distribution ,ANTHROPOMETRY - Abstract
In this article the author analyzes the values of two random variables (RV), the body mass index (BMI) and the Ponderal index (PI), of the competitors who ranked in the first six places in the contests without weight categories (Mr Olympia, Arnold Classic, Night of the Champions and Ironman) during the 1980-2012 period. The goal of the study was to establish the features of the probability density functions (PDF) of these indices and their changes over time. After analyzing the data we reached the following conclusions: 1. the body composition of the bodybuilders who ranked among the top six in the professional tournaments without categories is characterized by a clustering around the mean values of the studied indices - the sample is very homogeneous; 2. the form of the estimated PDF points to a higher probability than the Normal distribution suggests for the emergence of competitors with extreme values of BMI and PI; 3. we consider that the reason for the increase of the values of both RVs over time could be sought in two directions: 1. because of the features of judging and ranking in bodybuilding, the preferences of the judges evolved towards tolerating more muscular competitors, or 2. the muscle mass of all professional bodybuilders increased over the studied time period, which led to an increase in the muscularity of the studied sample of bodybuilders. [ABSTRACT FROM AUTHOR]
- Published
- 2013
23. Nowcasting thunderstorms with graph spectral distance and entropy estimation.
- Author
-
Chaudhuri, Sutapa and Middey, Anirban
- Subjects
- *
THUNDERSTORMS , *WEATHER forecasting , *ENTROPY , *CONDENSATION (Meteorology) , *ESTIMATION theory - Abstract
The aim of the present study is to forecast thunderstorms over Kolkata (22°32′N, 88°20′E), India, during the pre-monsoon season (April-May) with graph spectral distance and entropy analysis. Graph vertices represent points connected by lines or edges, and lifting condensation level, convective condensation level, level of free convection, freezing level, level of neutral buoyancy and the surface level are taken as the input of the graph vertices. The variation in the most probable distance between the vertices is investigated. The result reveals a particular orientation of the vertex distances for thunderstorm days which is significantly different from the non-thunderstorm days. The reference graphs for thunderstorm and non-thunderstorm days are formed using the most probable vertex distances. The spectral distance between the reference graph and the graphs corresponding to thunderstorms are computed with the data collected during the period 1997-2009. The entropies, or the measure of disorderliness or uncertainty, are estimated for the graph distance matrices. The result shows that the thunderstorm days possess lower distance entropy than the non-thunderstorm days. This indicates that the reference graph that has been constructed for thunderstorms is more consistent. The result further depicts that the forecast accuracy through the present method is 98% with 1 h lead time, whereas the accuracy is 93% with 6 h lead time. The forecast is validated with the India Meteorological Department observations for the years 2007, 2008 and 2009. Copyright © 2011 Royal Meteorological Society [ABSTRACT FROM AUTHOR]
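A minimal sketch of the two quantities used above: the spectral distance between a reference graph and a day's graph (difference of sorted Laplacian eigenvalues) and the Shannon entropy of the day's distance matrix. The vertices follow the atmospheric levels named in the abstract, but the edge weights are invented for illustration.

```python
# Minimal sketch: graph spectral distance and distance-matrix entropy for a
# small weighted graph whose vertices are atmospheric levels. Weights are made up.
import numpy as np

levels = ["SFC", "LCL", "CCL", "LFC", "FL", "LNB"]

def laplacian_spectrum(weights):
    """Sorted eigenvalues of the graph Laplacian for a weighted adjacency matrix."""
    lap = np.diag(weights.sum(axis=1)) - weights
    return np.sort(np.linalg.eigvalsh(lap))

def distance_entropy(weights):
    """Shannon entropy of the normalized off-diagonal distances."""
    d = weights[np.triu_indices_from(weights, k=1)]
    p = d / d.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(3)
reference = rng.uniform(0.5, 3.0, (6, 6))
reference = (reference + reference.T) / 2
np.fill_diagonal(reference, 0)

today = np.abs(reference + rng.normal(0, 0.2, (6, 6)))
today = (today + today.T) / 2
np.fill_diagonal(today, 0)

spectral_dist = np.linalg.norm(laplacian_spectrum(reference) - laplacian_spectrum(today))
print(f"spectral distance to reference graph: {spectral_dist:.3f}")
print(f"distance entropy of today's graph:    {distance_entropy(today):.3f} bits")
```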
- Published
- 2011
- Full Text
- View/download PDF
24. Application of two-point probability distribution functions to predict properties of heterogeneous two-phase materials
- Author
-
Belvin, A., Burrell, R., Gokhale, A., Thadhani, N., and Garmestani, H.
- Subjects
- *
INHOMOGENEOUS materials , *COMPOSITE materials , *MATERIALS analysis , *TITANIUM diboride , *CERAMICS , *DISTRIBUTION (Probability theory) , *MICROSTRUCTURE , *MECHANICAL behavior of materials - Abstract
Abstract: An investigation was conducted in order to determine the viability of using two-point probability distribution functions to predict the material properties of two-phase heterogeneous materials. The material of interest in this investigation was Titanium Diboride/Alumina ceramic (TiB2 +Al2O3). In TiB2 +Al2O3 ceramics, it has been noted that the connectivity of the TiB2 second phase plays a major role in the mechanical properties of the TiB2 +Al2O3 ceramic, spall strength in particular. Two-point distribution functions were obtained for the four types of microstructures of TiB2 +Al2O3 and were compared to experimental results that have been previously reported. The two-point distribution functions showed very good correlation to connectivity and spall strength results. [Copyright Elsevier]
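A minimal sketch of a two-point probability function S2(r): the probability that two points a distance r apart both lie in the same phase of a binary microstructure, estimated here by Monte Carlo sampling on a synthetic image rather than the TiB2+Al2O3 micrographs used in the paper.

```python
# Minimal sketch: Monte Carlo estimate of the two-point probability function
# S2(r) for one phase of a synthetic two-phase image. S2(0) equals the volume
# fraction and decays toward its square at large r.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

# synthetic two-phase image: smooth a random field, then threshold it
field = rng.normal(size=(256, 256))
phase = (gaussian_filter(field, sigma=4) > 0).astype(int)   # 1 = "second phase"

def two_point_probability(img, r, n_pairs=20_000):
    """Estimate S2(r) along the x-direction with periodic boundaries."""
    h, w = img.shape
    ys = rng.integers(0, h, n_pairs)
    xs = rng.integers(0, w, n_pairs)
    both_in_phase = (img[ys, xs] == 1) & (img[ys, (xs + r) % w] == 1)
    return both_in_phase.mean()

print(f"volume fraction of second phase: {phase.mean():.3f}")
for r in [1, 4, 16, 64]:
    print(f"S2(r={r:3d}) = {two_point_probability(phase, r):.3f}")
```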
- Published
- 2009
- Full Text
- View/download PDF
25. Locally Bayesian Learning With Applications to Retrospective Revaluation and Highlighting.
- Author
-
Kruschke, John K.
- Subjects
- *
COGNITION , *LEARNING , *COGNITIVE ability , *STOCHASTIC learning models , *HUMAN behavior , *COMPREHENSION , *SEQUENTIAL analysis , *CONCEPTS , *LEARNING ability - Abstract
A scheme is described for locally Bayesian parameter updating in models structured as successions of component functions. The essential idea is to back-propagate the target data to interior modules, such that an interior component's target is the input to the next component that maximizes the probability of the next component's target. Each layer then does locally Bayesian learning. The approach assumes online trial-by-trial learning. The resulting parameter updating is not globally Bayesian but can better capture human behavior. The approach is implemented for an associative learning model that first maps inputs to attentionally filtered inputs and then maps attentionally filtered inputs to outputs. The Bayesian updating allows the associative model to exhibit retrospective revaluation effects such as backward blocking and unovershadowing, which have been challenging for associative learning models. The back-propagation of target values to attention allows the model to show trial-order effects, including highlighting and differences in magnitude of forward and backward blocking, which have been challenging for Bayesian learning models. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
26. Genetic polymorphism revealed by 13 tetrameric and 2 pentameric STR loci in four Mongoloid tribal population
- Author
-
Maity, B., Nunga, S.C., and Kashyap, V.K.
- Subjects
- *
IDENTIFICATION , *FORENSIC sciences - Abstract
The short tandem repeat allelic profiles at 15 autosomal polymorphic loci were analyzed in four tribal populations of Mizoram (India). The analysis was performed on 354 unrelated healthy individuals belonging to Mongoloid races. All the samples were subjected to a sex test (Amelogenin marker) besides the STR typing, and in all instances it showed no deviation from expectation. The allele frequencies for all the analyzed loci in the studied populations are within the expected range in comparison to populations from the same racial background. No significant deviation from Hardy–Weinberg Equilibrium was observed for any of the populations. In no case is the observed heterozygosity less than the expected value, and it varied from 0.978 (Penta E) to as low as 0.425 (THO1). The discriminatory power and exclusion probability values for all the analyzed markers are significantly high and thus reveal high forensic significance. There is no evidence for association of alleles among the 15 studied loci. This allele frequency data will be useful for human identity testing in the Mizo population. [Copyright Elsevier]
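A minimal sketch of the forensic summary statistics mentioned (expected heterozygosity and power of discrimination), computed from allele frequencies at one locus under Hardy–Weinberg proportions; the allele frequencies are hypothetical, not the Mizo data.

```python
# Minimal sketch: expected heterozygosity and power of discrimination for one
# STR locus under Hardy-Weinberg proportions. Allele frequencies are hypothetical.
import numpy as np
from itertools import combinations_with_replacement

allele_freq = np.array([0.05, 0.10, 0.20, 0.30, 0.25, 0.10])   # must sum to 1

# expected heterozygosity: 1 - sum(p_i^2)
heterozygosity = 1.0 - np.sum(allele_freq ** 2)

# genotype frequencies: p_i^2 for homozygotes, 2*p_i*p_j for heterozygotes
genotype_freqs = []
for i, j in combinations_with_replacement(range(len(allele_freq)), 2):
    f = allele_freq[i] ** 2 if i == j else 2 * allele_freq[i] * allele_freq[j]
    genotype_freqs.append(f)
genotype_freqs = np.array(genotype_freqs)

# power of discrimination: probability two random individuals differ in genotype
power_of_discrimination = 1.0 - np.sum(genotype_freqs ** 2)

print(f"expected heterozygosity:  {heterozygosity:.3f}")
print(f"power of discrimination:  {power_of_discrimination:.3f}")
print(f"genotype frequencies sum: {genotype_freqs.sum():.3f}")   # sanity check, ~1.0
```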
- Published
- 2003
- Full Text
- View/download PDF
27. A Statistical Model of Fibre Distribution in a Steel Fibre Reinforced Concrete.
- Author
-
Kobaka, Janusz
- Subjects
- *
STATISTICAL models , *FIBERS , *MANUFACTURING processes , *DISTRIBUTION (Probability theory) , *STEEL , *REINFORCED concrete - Abstract
The aim of the research was to create a model of steel fibre distribution in a Steel Fibre Reinforced Concrete space using statistical probability means. The model was created in order to better understand the behaviour of the composite under operating conditions. Four statistical distributions (Beta, Kumaraswamy, Three Parameter Beta and Generalised Transmuted Kumaraswamy) were examined to find the distribution that best described fibre settling phenomenon caused by manufacturing process conditions. In the next stage the chosen statistical distribution was adapted to create the model of steel fibre distribution in a Steel Fibre Reinforced Concrete space. The model took into account technological conditions such as vibrating time and properties such as consistency of the tested concrete. The model showed a good agreement with the real fibre distribution. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
28. Expert system for testing industrial processes and determining sensor status
- Author
-
Singer, Ralph [Naperville, IL]
- Published
- 1998
29. System for monitoring an industrial process and determining sensor status
- Author
-
Humenik, Keith [Columbia, MD]
- Published
- 1997
30. System for monitoring an industrial process and determining sensor status
- Author
-
Humenik, Keith [Columbia, MD]
- Published
- 1995
31. Processing data base information having nonwhite noise
- Author
-
Morreale, Patricia [Park Ridge, IL]
- Published
- 1995
32. Body image perception and sense of agency: notes from the alien hand experiment
- Author
-
Matiazo Pinhatti, Marcelle, Gomes de Castro, Thiago, and Gomes, William Barbosa
- Abstract
Body image perception comprises stable and dynamic representational features. Here, we examined whether a stable body image (BI) measure correlates to a dynamic sense of agency (SoA) score. Moreover, we tested whether differential BI perception scores predict SoA performance under sensory conflict conditions. A total of 21 university students (age M=25.9, SD=9.29) completed the Brazilian version of the Silhouette Figures Scale (SFS) and participated in The Alien Hand Experiment (TAHE). TAHE is a task that manipulates participants' hand movement SoA. A Bayesian correlation matrix revealed substantial negative correlation probability between BI perception and SoA confidence in accuracy responses, specifically in experimental conditions where agency was manipulated. Based on the SFS scores, participants were divided into groups with and without evidence of BI distortion. Bayesian Independent Samples T-Tests evidenced substantial differential probability between groups. Analyses confirmed lower SoA confidence among subjects with BI distortion greater than ±3 kg/m².
- Published
- 2018
33. The dangers of probability and statistics as tools for the legal assessment of evidence
- Author
-
Sánchez Rubio, Ana
- Abstract
In the common use of the expression, to prove means to test, to check, as a way of demonstrating. These days, criminal proceedings, in order to achieve the demonstration nearest to certainty or, in legal language, to the material truth, use tools from the mathematical field to give more strength to the evidentiary result. In this regard, probability derived from statistics is increasingly occupying a major role. This paper aims to reveal the hazards involved in trusting completely in numerical data in legal assessment, particularly when the parties to the proceedings misinterpret mathematical results, since that could seriously influence the foundation of the judicial decision.
- Published
- 2018
34. The dangers of probability and statistics as tools for the legal assessment of evidence
- Author
-
Ana Sánchez-Rubio
- Subjects
evidence ,credibility ,logic likelihood ,statistical probability ,Bayes’s theorem ,Sociology and Political Science ,Welfare economics ,media_common.quotation_subject ,Judicial opinion ,Foundation (evidence) ,Certainty ,Expression (computer science) ,Test (assessment) ,Psychiatry and Mental health ,Anthropology ,Political science ,Prova ,verossimilhança ,probabilidade lógica ,probabilidade estatística ,teorema de Bayes ,Prueba ,Verosimilitud ,Probabilidad lógica ,Probabilidad estadística ,Teorema de Bayes ,Law ,Safety Research ,media_common - Abstract
In the common use of the expression, to prove means to test, to check, as a way of demonstrating. These days, criminal proceedings, in order to achieve the demonstration nearest to certainty or, in legal language, to the material truth, use tools from the mathematical field to give more strength to the evidentiary result. In this regard, probability derived from statistics is increasingly occupying a major role. This paper aims to reveal the hazards involved in trusting completely in numerical data in legal assessment, particularly when the parties to the proceedings misinterpret mathematical results, since that could seriously influence the foundation of the judicial decision.
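A short numerical illustration of the misinterpretation risk the article warns about (often called the prosecutor's fallacy): a small random-match probability is not the probability of innocence once a realistic prior is introduced through Bayes' theorem. All numbers below are hypothetical.

```python
# Minimal illustration: a 1-in-10,000 random-match probability does not imply
# 99.99% probability of guilt once the prior over a plausible suspect pool is
# taken into account via Bayes' theorem. All numbers are hypothetical.
match_prob_if_innocent = 1 / 10_000   # P(match | innocent), the lab's random-match probability
match_prob_if_guilty = 1.0            # P(match | guilty), assumed certain for simplicity
population = 100_000                  # plausible pool of alternative perpetrators
prior_guilt = 1 / population          # prior P(guilty) before the forensic evidence

posterior_guilt = (match_prob_if_guilty * prior_guilt) / (
    match_prob_if_guilty * prior_guilt
    + match_prob_if_innocent * (1 - prior_guilt)
)

print(f"P(match | innocent) = {match_prob_if_innocent:.4%}")
print(f"P(guilty | match)   = {posterior_guilt:.1%}")   # far below 99.99%
```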
- Published
- 2018
- Full Text
- View/download PDF
35. Joint probabilistic fluid discrimination of tight sandstone reservoirs based on Bayes discriminant and deterministic rock physics modeling.
- Author
-
Wang, Pu, Li, Jingye, Chen, Xiaohong, and Wang, Benfeng
- Subjects
- *
PETROPHYSICS , *SYNOVIAL fluid , *STATISTICAL physics , *PHYSICS , *SANDSTONE , *RANDOM noise theory , *PROBABILISTIC number theory , *SOFT sets - Abstract
Petrophysical properties of tight sandstone reservoirs are complex which brings difficulties to fluid discrimination. Rock physics makes it possible to obtain petrophysical properties from elastic parameters. However, both deterministic rock physics and statistical rock physics have corresponding limitations. By combining deterministic rock physics and statistical rock physics, a joint posterior probability is proposed for fluid discrimination. To consider the effect of complex pore structure and permeability in tight sandstone reservoirs, a new deterministic rock physics model is built. In this model, soft porosity and connected porosity are quite important parameters to describe the above-mentioned reservoir characteristics. Assuming the noise follows a Gaussian distribution, we can obtain the posterior probability of gas saturation from the deterministic rock physics. Bayes discriminant is an effective method for statistical rock physics to estimate the prior, condition and posterior probabilities of petrophysical properties from well-logging data. Thus, the posterior probability of gas saturation belonging to the statistical rock physics is obtained. To guarantee the accuracy of fluid discrimination, the reflectivity method is used to achieve high-precision elastic parameters from seismic data. Application examples of well-logging data and seismic data confirm the validity of the proposed joint probabilistic fluid discrimination. • A new rock physics model for tight sandstones is built. • A joint probabilistic method is proposed for fluid discrimination. • The fluid prediction process of tight sandstone reservoirs is effective. • Reflectivity method is used to obtain high precision elastic parameters. [ABSTRACT FROM AUTHOR]
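A minimal sketch of a Bayes discriminant of the kind described: posterior probabilities of gas-bearing versus water-bearing classes from Gaussian class-conditional densities of one elastic attribute; the attribute (Vp/Vs) and the training values are simulated, not the paper's well-log data.

```python
# Minimal sketch: Bayes discriminant for fluid classes from Gaussian
# class-conditional densities of a single elastic attribute. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
vp_vs_gas = rng.normal(1.55, 0.05, 200)     # training samples, gas-bearing intervals
vp_vs_water = rng.normal(1.75, 0.06, 300)   # training samples, water-bearing intervals

prior_gas = len(vp_vs_gas) / (len(vp_vs_gas) + len(vp_vs_water))
prior_water = 1 - prior_gas

def posterior_gas(x):
    """Bayes discriminant: P(gas | x) from Gaussian class-conditional likelihoods."""
    like_gas = stats.norm.pdf(x, vp_vs_gas.mean(), vp_vs_gas.std(ddof=1))
    like_water = stats.norm.pdf(x, vp_vs_water.mean(), vp_vs_water.std(ddof=1))
    evidence = like_gas * prior_gas + like_water * prior_water
    return like_gas * prior_gas / evidence

for x in [1.52, 1.62, 1.72]:
    print(f"Vp/Vs = {x:.2f}: P(gas) = {posterior_gas(x):.2f}")
```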
- Published
- 2020
- Full Text
- View/download PDF
36. Inference and stochastic processes
- Author
-
Bartlett, M. S.
- Published
- 1975
- Full Text
- View/download PDF
37. Other Aspects of Nomic Inference
- Author
-
Cannavo, Salvator
- Published
- 1974
- Full Text
- View/download PDF
38. Fat tails in financial return distributions revisited: Evidence from the Korean stock market.
- Author
-
Eom, Cheoljun, Kaizoji, Taisei, and Scalas, Enrico
- Subjects
- *
STOCK exchanges , *TAILS , *GARCH model , *CURRENCY crises , *FOREIGN exchange - Abstract
This study empirically re-examines fat tails in stock return distributions by applying statistical methods to an extensive dataset taken from the Korean stock market. The tails of the return distributions are shown to be much fatter in recent periods than in past periods and much fatter for small-capitalization stocks than for large-capitalization stocks. After controlling for the 1997 Korean foreign currency crisis and using the GARCH filter models to control for volatility clustering in the returns, the fat tails in the distribution of residuals are found to persist. We show that market crashes and volatility clustering may not sufficiently account for the existence of fat tails in return distributions. These findings are robust regardless of period or type of stock group. • The existence of fat-tails in stock return distribution is investigated. • Influential factors of stylized facts (e.g., market crash, volatility clustering) are considered. • These factors have a significant influence on the fat-tails in return distribution. • After controlling for these factors, evidence supporting the existence of fat-tails is still verified. • The fat-tails in return distributions cannot be sufficiently explained by the stylized facts. [ABSTRACT FROM AUTHOR]
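A minimal sketch of the paper's central point: volatility clustering alone need not explain fat tails. Returns are simulated from a GARCH(1,1)-style process with Student-t shocks, and the excess kurtosis stays well above zero even after dividing out the conditional volatility; all parameters are illustrative.

```python
# Minimal sketch: simulate GARCH(1,1)-style returns with fat-tailed (Student-t)
# shocks and show that filtering out volatility clustering does not remove the
# fat tails. Parameters are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 20_000
omega, alpha, beta = 0.02, 0.10, 0.85          # GARCH(1,1) parameters
nu = 5                                         # Student-t degrees of freedom

shocks = stats.t.rvs(nu, size=n, random_state=rng) * np.sqrt((nu - 2) / nu)  # unit-variance t

returns = np.empty(n)
vol = np.empty(n)
sigma2 = omega / (1 - alpha - beta)            # start at the unconditional variance
for t in range(n):
    vol[t] = np.sqrt(sigma2)
    returns[t] = vol[t] * shocks[t]
    sigma2 = omega + alpha * returns[t] ** 2 + beta * sigma2

filtered = returns / vol                       # volatility clustering removed

print(f"excess kurtosis, raw returns:      {stats.kurtosis(returns):.2f}")
print(f"excess kurtosis, filtered returns: {stats.kurtosis(filtered):.2f}")  # still > 0
print("excess kurtosis of a normal:        0.00")
```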
- Published
- 2019
- Full Text
- View/download PDF
39. Introduction
- Author
-
Swinburne, Richard, editor
- Published
- 2005
- Full Text
- View/download PDF
40. Probability for human intake of an atom randomly released into ground, rivers, oceans and air
- Author
-
Cohen, Bernard L.
- Subjects
RADIATION exposure ,RISK assessment - Published
- 1984
- Full Text
- View/download PDF