4,039 results for "Probability"
Search Results
2. Chance and Necessity: Hegel's Epistemological Vision.
- Author
- Nescolarde-Selva, J., Usó-Doménech, J. L., and Gash, H.
- Subjects
- SOCIAL processes, CAUSATION (Philosophy), FREE will & determinism, DIALECTIC
- Abstract
In this paper the authors provide an epistemological view of the old controversy between randomness and necessity. It has been held that either one or the other forms part of the structure of reality. Chance and indeterminism are nothing but a disorderly efficiency of contingency in the production of events, phenomena, and processes, i.e., in its causality, in the broadest sense of the word. Such production may be observed in natural and artificial processes or in human social processes (in history, economics, society, politics, etc.). Here we touch the object par excellence of all scientific research, whether natural or human. This work presents a hypothesis whose practical result satisfies the Hegelian dialectic, with the consequent implication of the mutual reciprocal integration of chance and necessity. It proceeds by producing abstractions, without which there is no thought or knowledge of any kind, from the concrete, that is, from the real problem, which in this case is a given Ontological System or Reality. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Decoding intelligence via symmetry and asymmetry.
- Author
- Fu, Jianjing and Hsiao, Ching-an
- Abstract
Humans use pictures to model the world. The structure of a picture maps to mind space to form a concept. When an internal structure matches the corresponding external structure, an observation functions. Whether effective or not, the observation is self-consistent. In epistemology, people often differ from each other in terms of whether a concept is probabilistic or certain. Based on the effect of the presented IG and pull anti algorithm, we attempt to provide a comprehensive answer to this problem. Using the characters of hidden structures, we explain the difference between the macro and micro levels and the same difference between semantics and probability. In addition, the importance of attention is highlighted through the combination of symmetry and asymmetry, and the mechanism of chaos and collapse is revealed in the presented model. Because the subject is involved in the expression of the object, representationalism is not complete. However, people undoubtedly reach a consensus based on the objectivity of the representation. Finally, we suggest that emotions could be used to regulate cognition. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. A Bayesian approach using spatiotemporal features for suitable next hop selection in opportunistic networks.
- Author
- Dutta, Amit, Borah, Satya Jyoti, and Singh, Jagdeep
- Abstract
Summary: Opportunistic network (OppNet) belongs to the category of Mobile Ad-hoc Networks (MANETs), a kind of Delay Tolerant Network (DTN), where the wireless nodes are completely mobile and the data transmission routes are dynamic. The major challenge in developing a routing model for such a network is the unpredictable nature of the movement of the nodes. In this paper, a spatiotemporal prediction model based on human mobility patterns is proposed using Bayesian posterior probability (BPPR), where several clusters are identified within the network and the day and time duration of nodes visiting those clusters are recorded. The Bayesian posterior probability is then used to determine the probability of the neighbor node visiting the destination's cluster. If the calculated probability for that node is higher than a specified threshold, the packet will be forwarded. Simulation results are compared with the benchmark models (Epidemic, Prophet, HBPR, EDR, NexT, and EBC), where it is found that on average the proposed model outperforms these models in terms of delivery probability by around 23.89%, 24.8%, 24.4%, 37%, 11%, and 42%, respectively, with varying number of nodes, TTL, message generation interval, and buffer size. Similar improvements have been observed for hop count and number of messages dropped. In terms of overhead ratio, the proposed model outperforms Epidemic, Prophet, HBPR, NexT, and EBC. In terms of average latency, as the number of nodes and TTL are varied, BPPR performs better than NexT by around 9% and 12%, respectively. [ABSTRACT FROM AUTHOR]
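The forwarding rule described in this abstract can be illustrated in a few lines of Python. This is a minimal sketch of Bayes' rule applied to (cluster, day, hour) visit records, not the paper's BPPR implementation; the history format, function names, and threshold are assumptions.

```python
def visit_posterior(history, cluster, day, hour):
    """Posterior probability that a node visits `cluster` given (day, hour):
    P(cluster | day, hour) = P(day, hour | cluster) * P(cluster) / P(day, hour),
    with every term estimated from the node's recorded visit history."""
    total = len(history)
    if total == 0:
        return 0.0
    in_cluster = [(c, d, h) for c, d, h in history if c == cluster]
    if not in_cluster:
        return 0.0
    prior = len(in_cluster) / total
    likelihood = sum(1 for _, d, h in in_cluster if (d, h) == (day, hour)) / len(in_cluster)
    evidence = sum(1 for _, d, h in history if (d, h) == (day, hour)) / total
    return 0.0 if evidence == 0 else likelihood * prior / evidence

def should_forward(history, dest_cluster, day, hour, threshold=0.5):
    """Forward only if the neighbour's posterior probability of visiting
    the destination's cluster exceeds the threshold."""
    return visit_posterior(history, dest_cluster, day, hour) > threshold
```

In practice the recorded history would be maintained per neighbour, and the threshold tuned against delivery probability and overhead.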
- Published
- 2024
- Full Text
- View/download PDF
5. Markov Chains and Kinetic Theory: A Possible Application to Socio-Economic Problems.
- Author
- Carbonaro, Bruno and Menale, Marco
- Abstract
A very important class of models widely used nowadays to describe and predict, at least in stochastic terms, the behavior of many-particle systems (where the word "particle" is not meant in the purely mechanical sense: particles can be cells of a living tissue, or cars in a traffic flow, or even members of an animal or human population) is the Kinetic Theory for Active Particles, i.e., a scheme of possible generalizations and re-interpretations of the Boltzmann equation. Now, though in the literature on the subject this point is systematically disregarded, this scheme is based on Markov Chains, which are special stochastic processes with important properties they share with many natural processes. This circumstance is here carefully discussed not only to suggest the different ways in which Markov Chains can intervene in equations describing the stochastic behavior of any many-particle system, but also, as a preliminary methodological step, to point out the way in which the notion of a Markov Chain can be suitably generalized to this aim. As a final result of the discussion, we find how to develop new very plausible and likely ways to take into account possible effects of the external world on a non-isolated many-particle system, with particular attention paid to socio-economic problems. [ABSTRACT FROM AUTHOR]
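The role a Markov Chain plays in such a scheme can be shown with a toy example. This is a generic two-state sketch with hypothetical transition probabilities, not the generalized chains the paper develops:

```python
def step(dist, P):
    """One Markov-chain transition: new_dist[j] = sum_i dist[i] * P[i][j]."""
    return [sum(dist[i] * P[i][j] for i in range(len(dist)))
            for j in range(len(P[0]))]

# Hypothetical two-state chain (e.g. "active" vs "inactive" particles);
# rows are current states, columns are next states, each row sums to 1.
P = [[0.9, 0.1],
     [0.4, 0.6]]

dist = [1.0, 0.0]          # start with every particle in the first state
for _ in range(100):
    dist = step(dist, P)
# dist converges to the stationary distribution [0.8, 0.2]
```

The stationary distribution is the long-run fraction of time the system spends in each state, which is the kind of statistical description kinetic theory aims at.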
- Published
- 2024
- Full Text
- View/download PDF
6. The pupil dilation response as an indicator of visual cue uncertainty and auditory outcome surprise.
- Author
- Becker, Janika, Viertler, Marvin, Korn, Christoph W., and Blank, Helen
- Subjects
- PUPILLARY reflex, AUDITORY perception, PUPILLOMETRY, VOWELS
- Abstract
In everyday perception, we combine incoming sensory information with prior expectations. Expectations can be induced by cues that indicate the probability of following sensory events. The information provided by cues may differ and hence lead to different levels of uncertainty about which event will follow. In this experiment, we employed pupillometry to investigate whether the pupil dilation response to visual cues varies depending on the level of cue‐associated uncertainty about a following auditory outcome. Also, we tested whether the pupil dilation response reflects the amount of surprise about the subsequently presented auditory stimulus. In each trial, participants were presented with a visual cue (face image) which was followed by an auditory outcome (spoken vowel). After the face cue, participants had to indicate by keypress which of three auditory vowels they expected to hear next. We manipulated the cue‐associated uncertainty by varying the probabilistic cue‐outcome contingencies: One face was most likely followed by one specific vowel (low cue uncertainty), another face was equally likely followed by either of two vowels (intermediate cue uncertainty) and the third face was followed by all three vowels (high cue uncertainty). Our results suggest that pupil dilation in response to task‐relevant cues depends on the associated uncertainty, but only for large differences in the cue‐associated uncertainty. Additionally, in response to the auditory outcomes, the pupil dilation scaled negatively with the cue‐dependent probabilities, likely signalling the amount of surprise. [ABSTRACT FROM AUTHOR]
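The three levels of cue-associated uncertainty can be quantified, for illustration, as the Shannon entropy of the cue-outcome contingencies. The probabilities below are hypothetical stand-ins for the experiment's actual contingencies:

```python
from math import log2

def entropy(probs):
    """Shannon entropy (bits) of a discrete outcome distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical contingencies for the three face cues over three vowels:
low  = entropy([0.88, 0.06, 0.06])  # one vowel dominates -> low uncertainty
mid  = entropy([0.47, 0.47, 0.06])  # two vowels share the mass
high = entropy([1/3, 1/3, 1/3])     # all vowels equally likely -> maximal
assert low < mid < high
```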
- Published
- 2024
- Full Text
- View/download PDF
7. The Best Time to Play the Lottery.
- Author
- Rump, Christopher M.
- Abstract
The best time to play the lottery is when the jackpot has rolled over several times and grown large, but not so large that you must share the prize if you win. We examine maximizing the expected value of a winning ticket as well as that of a random ticket. The derived optimality criteria depend on the prize elasticity of ticket demand. A regression analysis on data obtained from the Mega Millions® and Powerball® multi-state lotteries suggests ticket sales grow quadratically in the size of the advertised lump-sum cash jackpot prize. With quadratic growth, the best time to play is when ticket sales are 1.25–2.5 times the jackpot odds, currently about 300 M to one for these two lotteries. Since ticket sales are not known to ticket buyers, we invert the regression function to prescribe the best time to play in terms of the cash prize. It turns out that these lotteries offer a (pretax) fair wager with positive expected value in a surprisingly wide interval of jackpot prizes. That is a good time to play; the best time is in the neighborhood of the nearly $1B record cash jackpots awarded in these lotteries in recent years. [ABSTRACT FROM AUTHOR]
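The prize-sharing tradeoff can be made concrete with the standard expected-share formula for a binomially distributed number of co-winners. This is a generic sketch under textbook assumptions, not the paper's regression-based criterion:

```python
def expected_jackpot_share(jackpot, odds, other_tickets):
    """Expected prize for a winning ticket when each of `other_tickets`
    independently hits the jackpot with probability p = 1/odds and the
    prize is split equally among all winners.  With K ~ Binomial(n, p),
    E[J / (1 + K)] = J * (1 - (1 - p)**(n + 1)) / ((n + 1) * p)."""
    p = 1.0 / odds
    n = other_tickets
    return jackpot * (1.0 - (1.0 - p) ** (n + 1)) / ((n + 1) * p)

def expected_ticket_value(jackpot, odds, other_tickets):
    """Pretax expected jackpot winnings of one random ticket."""
    return expected_jackpot_share(jackpot, odds, other_tickets) / odds
```

With no other tickets sold the expected share is the full jackpot; as sales grow toward the roughly 300-million-to-one odds, sharing increasingly erodes the value of further rollovers.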
- Published
- 2024
- Full Text
- View/download PDF
8. Development of the Korean construction job exposure matrix (KoConJEM) based on experts' judgment using the 60 consolidated occupations for construction workers.
- Author
- Choi, Sangjun, Lee, Kwang Min, Park, Hyunhee, Shim, Gyu-Beom, Lee, Sun Woo, Kim, Yoon-Ji, Lee, Eun-Soo, Kim, Youngki, Kang, Dongmug, Park, Ju-Hyun, and Kim, Se-Yeong
- Subjects
- RISK assessment, DASHBOARDS (Management information systems), COLD (Temperature), OCCUPATIONS, RESEARCH funding, NOISE, OCCUPATIONAL hazards, PROBABILITY theory, WORK environment, HEAT, OCCUPATIONAL exposure, STATISTICS, LIFTING & carrying (Human mechanics), COMPARATIVE studies, HAZARDOUS substances, POSTURE, CONSTRUCTION industry, INDUSTRIAL hygiene, INDUSTRIAL safety
- Abstract
Background This study was conducted as an effort to develop a Korean construction job exposure matrix (KoConJEM) based on 60 occupations recently consolidated by the construction workers mutual aid association for use by the construction industry. Methods The probability, intensity, and prevalence of exposure to 26 hazardous agents for 60 consolidated occupations were evaluated as binary (Yes/No) or in four categories (1 to 4) by 30 industrial hygiene experts. The risk score was calculated by multiplying the exposure intensity by the prevalence of exposure. Fleiss' kappa for each hazardous agent and occupation was used to determine agreement among the 30 experts. The JEM was expressed as a heatmap and a web-based dashboard to facilitate comparison of factors affecting exposure according to each occupation and hazardous agent. Results Awkward posture, heat/cold, heavy lifting, and noise were the hazardous agents for which exposure was regarded as probable by at least one expert in all occupations, while exposure to asphalt fumes was considered hazardous in the smallest number of occupations (n = 5). Agreement among experts was fair to good for more than half of the harmful factors and for most occupations. The highest risk score was 16, for awkward posture in most occupations other than safety officer. Conclusions The KoConJEM provides information on the probability, intensity, and prevalence of exposure to harmful factors, covering most occupations employing construction workers; therefore, it may be useful in the conduct of epidemiological studies on assessment of health risk for construction workers. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
9. The governance of possible futures and the regime of modern historicity: Critical theory and the modality of possibility.
- Author
- Guéguen, Haud and Jeanpierre, Laurent
- Subjects
- CRITICAL theory, MODAL logic, HISTORICITY, POSSIBILITY, GIFT giving
- Abstract
The inaugural project of German Critical theory was to break away from the cult of facts in order to investigate the real possibilities of the present. Part of sociology has also made the possible and the relationship to the possible its central object. Such a task has met with a considerable effort on the part of government agencies to pre-empt the legitimate definition of what is possible. The social sciences were mobilised to this end. We offer a schematic account of these efforts in order to situate the extent to which the definition of possible futures is an issue of struggle in which Critical theory and sociology have a role to play. The article examines the question of the future through its close link to the category of the possible, at two levels that are often treated separately: at the level of the problem of 'governmentality' and its close link to the question of forecasting and probability; and at the level of the 'regimes of historicity' from which to consider the possible with a view to collectively reappropriating the determination of the future. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. RT-QuIC detection of chronic wasting disease prion in platelet samples of white-tailed deer.
- Author
- Kobashigawa, Estela, Russell, Sherri, Zhang, Michael Z., Sinnott, Emily A., Connolly, Michael, and Zhang, Shuping
- Subjects
- CHRONIC wasting disease, WHITE-tailed deer, PRION diseases, BLOOD platelets, SCRAPIE, HIGH throughput screening (Drug development)
- Abstract
Background: Chronic wasting disease (CWD) is a prion disease of captive and free-ranging cervids. Currently, a definitive diagnosis of CWD relies on immunohistochemistry (IHC) detection of PrPSc in the obex and retropharyngeal lymph node (RPLN) of the affected cervids. For high-throughput screening of CWD in wild cervids, RPLN samples are tested by ELISA followed by IHC confirmation of positive results. Recently, real-time quaking-induced conversion (RT-QuIC) has been used to detect CWD positivity in various types of samples. To develop a blood RT-QuIC assay suitable for CWD diagnosis, this study evaluated the assay sensitivity and specificity with and without ASR1-based preanalytical enrichment and NaI as the main ionic component in assay buffer. Results: A total of 23 platelet samples derived from CWD-positive deer (ELISA+/IHC+) and 30 platelet samples from CWD-negative (ELISA−) deer were tested. The diagnostic sensitivity was 43.48% (NaCl), 65.22% (NaI), 60.87% (NaCl-ASR1), or 82.61% (NaI-ASR1). The diagnostic specificity was 96.67% (NaCl), 100% (NaI), 100% (NaCl-ASR1), or 96.67% (NaI-ASR1). The probability of detecting CWD prion in platelet samples derived from CWD-positive deer was 0.924 (95% CRI: 0.714, 0.989) under the NaI-ASR1 condition and 0.530 (95% CRI: 0.156, 0.890) under the NaCl-alone condition. The rate of amyloid formation was greatest under the NaI-ASR1 condition at the 10⁻² (0.01491, 95% CRI: 0.00675, 0.03384) and 10⁻³ (0.00629, 95% CRI: 0.00283, 0.01410) sample dilution levels. Conclusions: Incorporation of ASR1-based preanalytical enrichment and NaI as the main ionic component significantly improved the sensitivity of CWD RT-QuIC on deer platelet samples. Blood testing with the improved RT-QuIC assay may be used for antemortem and postmortem diagnosis of CWD. [ABSTRACT FROM AUTHOR]
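The reported sensitivities and specificities follow directly from confusion-matrix counts. The counts below are back-calculated from the stated percentages (23 CWD-positive and 30 CWD-negative samples) for the NaI-ASR1 condition and are shown only for illustration:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Counts consistent with the NaI-ASR1 condition: 19 of 23 CWD-positive
# platelet samples detected, 29 of 30 negatives correctly negative.
sens, spec = diagnostic_metrics(tp=19, fn=4, tn=29, fp=1)
print(f"sensitivity = {sens:.2%}, specificity = {spec:.2%}")  # 82.61%, 96.67%
```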
- Published
- 2024
- Full Text
- View/download PDF
11. A Stochastic Model of Mathematics and Science.
- Author
- Wolpert, David H. and Kinney, David B.
- Abstract
We introduce a framework that can be used to model both mathematics and human reasoning about mathematics. This framework involves stochastic mathematical systems (SMSs), which are stochastic processes that generate pairs of questions and associated answers (with no explicit referents). We use the SMS framework to define normative conditions for mathematical reasoning, by defining a “calibration” relation between a pair of SMSs. The first SMS is the human reasoner, and the second is an “oracle” SMS that can be interpreted as deciding whether the question–answer pairs of the reasoner SMS are valid. To ground thinking, we understand the answers to questions given by this oracle to be the answers that would be given by an SMS representing the entire mathematical community in the infinite long run of the process of asking and answering questions. We then introduce a slight extension of SMSs to allow us to model both the physical universe and human reasoning about the physical universe. We then define a slightly different calibration relation appropriate for the case of scientific reasoning. In this case the first SMS represents a human scientist predicting the outcome of future experiments, while the second SMS represents the physical universe in which the scientist is embedded, with the question–answer pairs of that SMS being specifications of the experiments that will occur and the outcome of those experiments, respectively. Next we derive conditions justifying two important patterns of inference in both mathematical and scientific reasoning: (i) the practice of increasing one’s degree of belief in a claim as one observes increasingly many lines of evidence for that claim, and (ii) abduction, the practice of inferring a claim’s probability of being correct from its explanatory power with respect to some other claim that is already taken to hold for independent reasons. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. Probabilistic Forecasting of Lightning Strikes over the Continental USA and Alaska: Model Development and Verification.
- Author
- Nikolov, Ned, Bothwell, Phillip, and Snook, John
- Subjects
- THUNDERSTORMS, LIGHTNING, ELECTRIC charge, HUMIDITY, PRINCIPAL components analysis, GEOPOTENTIAL height
- Abstract
Lightning is responsible for the most area annually burned by wildfires in the extratropical region of the Northern Hemisphere. Hence, predicting the occurrence of wildfires requires reliable forecasting of the chance of cloud-to-ground lightning strikes during storms. Here, we describe the development and verification of a probabilistic lightning-strike algorithm running on a uniform 20 km grid over the continental USA and Alaska. This is the first and only high-resolution lightning forecasting model for North America derived from 29-year-long data records. The algorithm consists of a large set of regional logistic equations parameterized on the long-term data records of observed lightning strikes and meteorological reanalysis fields from NOAA. Principal Component Analysis was employed to extract 13 principal components from a list of 611 potential predictors. Our analysis revealed that the occurrence of cloud-to-ground lightning strikes primarily depends on three factors: the temperature and geopotential heights across vertical pressure levels, the amount of low-level atmospheric moisture, and wind vectors. These physical variables isolate the conditions that are favorable for the development of thunderstorms and impact the vertical separation of electric charges in the lower troposphere during storms, which causes the voltage potential between the ground and the cloud deck to increase to a level that triggers electrical discharges. The results from a forecast verification using independent data showed excellent model performance, thus making this algorithm suitable for incorporation into models designed to forecast the chance of wildfire ignitions. [ABSTRACT FROM AUTHOR]
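A regional logistic equation of the kind described has the following general form. The coefficients and principal-component scores here are hypothetical; the actual model uses 13 principal components extracted from 611 candidate predictors:

```python
from math import exp

def strike_probability(pc_scores, intercept, coefs):
    """Logistic forecast of cloud-to-ground lightning occurrence:
    P = 1 / (1 + exp(-(b0 + sum_i b_i * PC_i))), where the PC_i are
    principal-component scores of the meteorological predictors."""
    z = intercept + sum(b * x for b, x in zip(coefs, pc_scores))
    return 1.0 / (1.0 + exp(-z))

# Hypothetical three-component regional equation for one 20 km grid cell:
p = strike_probability([1.2, -0.4, 0.7], intercept=-2.0, coefs=[1.5, 0.8, 0.3])
```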
- Published
- 2024
- Full Text
- View/download PDF
13. The Metaphysical Foundations of the Principle of Indifference.
- Author
- Eisner, Binyamin
- Subjects
- APATHY, QUANTUM mechanics
- Abstract
The arguments in favor of the Principle of Indifference fail to explain its fruitfulness in science. Using the recent metaphysical concept of Grounding, I devise an explanation that can justify a weak version of the principle and discuss an instance of its application in Quantum mechanics. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. The 2022 European Union report on pesticide residues in food.
- Author
- Carrasco Cabrera, Luis, Di Piazza, Giulio, Dujardin, Bruno, Marchese, Emanuela, and Medina Pastor, Paula
- Subjects
- PESTICIDE residues in food, FOOD laws, PESTICIDE pollution, RISK managers, CONSUMER protection, FOOD safety
- Abstract
Under European Union legislation (Article 32, Regulation (EC) No 396/2005), the European Food Safety Authority provides an annual report assessing the pesticide residue levels in foods on the European market. In 2022, 96.3% of the overall 110,829 samples analysed fell below the maximum residue level (MRL) and 3.7% exceeded this level, of which 2.2% were non-compliant, i.e. results in a given sample exceeded the MRL after taking into account the measurement uncertainty. For the EU-coordinated multiannual control programme subset, 11,727 samples were analysed, of which 0.9% were non-compliant. To assess acute and chronic risk to consumer health, dietary exposure to pesticide residues was estimated and compared with available health-based guidance values (HBGV). The probabilistic assessment methodology was extended to all pesticides listed in the 2022 EU regulation, providing the probability of a consumer being exposed to an exceedance of the HBGV. Overall, the assessed risk to EU consumers' health is low. Recommendations to risk managers are given to increase the effectiveness of European control systems and to ensure a high level of consumer protection throughout the EU. This publication is linked to the following EFSA Supporting Publications article: http://onlinelibrary.wiley.com/doi/10.2903/sp.efsa.2024.EN-8751/full [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. COMPETENZE IN DIDATTICA DELLA MATEMATICA Quarta parte: metodologie ed epistemologie del presente [Competencies in Mathematics Education, Part Four: Methodologies and Epistemologies of the Present].
- Author
- Papini, Alessandro, Tortoriello, Francesco Saverio, and Vespri, Vincenzo
- Abstract
In Annex A to the DPCM of 4 August 2023 it is stated that <
- Published
- 2024
16. On the commutativity probability in certain finite groups.
- Author
- Alajmi, Khaled
- Subjects
- FINITE groups, NONABELIAN groups, PROBABILITY theory, CONJUGACY classes, PERMUTATION groups, NILPOTENT groups, PERMUTATIONS
- Abstract
The purpose of this paper is to compute the probability Pr(G) that two elements of the group G, drawn at random with replacement, commute; that is, Pr(G) = |{(x, y) ∈ G × G : xy = yx}| / |G × G|, where |G × G| = |G|². In particular, we compute Pr(G) for some groups such as the extraspecial groups of order p³, p prime, for the permutation groups G = Sₙ and G = Aₙ, n ≥ 5, for 10 non-abelian groups of order p⁴, and for simple groups of a certain type. [ABSTRACT FROM AUTHOR]
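The definition can be verified by brute force on small groups. This illustrative computation is not the paper's method (which exploits conjugacy-class structure), but it reproduces the well-known identity Pr(G) = k(G)/|G|, where k(G) is the number of conjugacy classes:

```python
from itertools import permutations, product

def commutativity_probability(elements, op):
    """Pr(G): fraction of ordered pairs (x, y) in G x G with x*y == y*x."""
    pairs = list(product(elements, repeat=2))
    return sum(1 for x, y in pairs if op(x, y) == op(y, x)) / len(pairs)

def compose(p, q):
    """Composition of permutations written as tuples: (p o q)(i) = p[q[i]]."""
    return tuple(p[i] for i in q)

S3 = list(permutations(range(3)))              # symmetric group on 3 letters
print(commutativity_probability(S3, compose))  # 0.5, i.e. k(S3)/|S3| = 3/6
```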
- Published
- 2024
17. The two 'strongest pillars of the empiricist wing': the Vienna Circle, German academia and emigration in the light of correspondence between Philipp Frank and Richard von Mises (1916–1939).
- Author
- Siegmund-Schultze, Reinhard
- Subjects
- VIENNA circle, EMIGRATION & immigration, THEORY of knowledge, SCHOLARS
- Abstract
This paper is divided into a surveying and argumentative part and a slightly longer documentary part, which is meant to verify, or at least make more plausible, the claims made in the first part. The first part deals in broad outline with the relationship of Frank and von Mises to the Vienna Circle of Logical Empiricism on the one hand and to the physicists and mathematicians in the German-speaking world on the other. The varying special positions, and in part the non-conformity, of the two Austrian scientists are emphasized, in particular their adherence to Ernst Mach's epistemology and their shared interest in probability theory and applied mathematics. The impact of emigration and the after-effects in the U.S. are discussed. This leads to new insights into the fine structure of the Vienna Circle and the latter's relationship to German academia within 'Weimar Culture'. P. Forman's interpretation (1971) of von Mises' position is critically discussed. The second, documentary part uses recently discovered correspondence between Frank and von Mises and, to a lesser extent, von Mises' personal diary. It aims at further substantiating some of the introductory theses and will at the same time provide material for a thorough biographical appreciation of the two scholars and friends. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. An example showing that the sum of two normal random variables may not be normal.
- Author
- Fujita, Takahiko and Yoshida, Naohiro
- Subjects
- RANDOM variables, GENERATING functions, GAUSSIAN distribution, MATHEMATICS students, GAMMA functions
- Abstract
Two novel proofs that the sum of a specific pair of normal random variables is not normal are established in this note. This is one of the facts most often misunderstood by first-year students of probability theory and statistics. The first proof is concise, using the moment generating function. The second proof checks whether the moments of the sum have the properties of a normal distribution. [ABSTRACT FROM AUTHOR]
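The abstract does not reproduce the specific pair; a standard counterexample takes X ~ N(0, 1) and Y = X when |X| ≤ c, Y = -X otherwise, so that Y is also standard normal while X + Y has a point mass at zero. A short simulation makes the atom visible:

```python
import random

random.seed(0)
c, n = 1.0, 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [x if abs(x) <= c else -x for x in xs]   # Y ~ N(0, 1) by symmetry
sums = [x + y for x, y in zip(xs, ys)]

# X + Y equals 2X when |X| <= c and exactly 0 otherwise, so it carries an
# atom at zero of size P(|X| > 1), about 0.317 -- impossible for any normal.
atom = sum(1 for s in sums if s == 0.0) / n
print(f"P(X + Y = 0) is approximately {atom:.3f}")
```

Both X and Y are marginally normal, yet the pair is not jointly normal, which is exactly why the sum can fail to be normal.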
- Published
- 2024
- Full Text
- View/download PDF
19. Risk-informed design and safety assessment of structures in a changing climate: a review of U.S. practice and a path forward.
- Author
- Ghosn, Michel and Ellingwood, Bruce R.
- Subjects
- SAFETY standards, STRUCTURAL reliability, MAP design, BRIDGE design & construction, HAZARDS, SERVICE life, CLIMATE change
- Abstract
Standards for the design of bridges, buildings and other infrastructure specify design loads for climatic hazards such as temperature, snow, wind, and floods based on return periods presented in maps or tables that account for regional differences. These design loads were developed from statistical analyses of historical hazard data under the assumption that the past is representative of the future. Climate change may affect the frequencies and intensities of environmental hazards which, depending on regional variations, raises questions as to whether structures designed to current specifications will meet minimum safety standards over their future service lives. This paper critically appraises issues related to using historical hazard data for future designs. It reviews basic principles of uniform reliability, that modern design codes use as the basis for ensuring minimum levels of safety, describing the relationship between hazard return periods, structural reliability, risk and the maximum loads expected within a structure's service life. Simple examples involving wind effects on structures demonstrate how to calibrate structural design hazard maps for climate-related extreme events to meet the minimum standards of safety implied in current specifications. The paper also introduces a possible practical approach to account for climate change when designing new structures and assessing the safety of existing facilities. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. Comparative study on chromatin loop callers using Hi-C data reveals their effectiveness.
- Author
- Chowdhury, H. M. A. Mohit, Boult, Terrance, and Oluwadare, Oluwatosin
- Abstract
Background: Chromosomes are among the most fundamental structures of cell biology, where DNA holds hierarchical information. DNA compacts its size by forming loops, and these regions house various proteins, including CTCF, SMC3, and the H3 histone. Numerous sequencing methods, such as Hi-C, ChIP-seq, and Micro-C, have been developed to investigate these properties. Utilizing these data, scientists have developed a variety of loop prediction techniques that have greatly improved the characterization of loops and related aspects. Results: In this study, we categorized 22 loop calling methods and conducted a comprehensive study of 11 of them. Additionally, we have provided detailed insights into the methodologies underlying these algorithms for loop detection, categorizing them into five distinct groups based on their fundamental approaches. Furthermore, we have included critical information such as resolution, input and output formats, and parameters. For this analysis, we utilized the GM12878 Hi-C datasets at 5 KB, 10 KB, 100 KB and 250 KB resolutions. Our evaluation criteria encompassed various factors, including memory usage, running time, sequencing depth, and recovery of protein-specific sites such as CTCF, H3K27ac, and RNAPII. Conclusion: This analysis offers insights into the loop detection processes of each method, along with the strengths and weaknesses of each, enabling readers to effectively choose suitable methods for their datasets. We evaluate the capabilities of these tools and introduce a novel Biological, Consistency, and Computational robustness score (BCC score) to measure their overall robustness, ensuring a comprehensive evaluation of their performance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. Laypeople’s interpretations of ‘high confidence’.
- Author
- Pennekamp, Pia and Mansour, Jamal K.
- Abstract
High confidence has been associated with high accuracy under certain conditions. Yet, how researchers operationalize ‘high confidence’ varies across publications and depends on who is asked. In this study, we collected numeric interpretations to determine thresholds for high confidence. Layperson participants provided a minimum, best, and maximum estimate for ‘high confidence’ in an eyewitness lineup decision on a scale of 0-100. The distribution of best estimates peaked at 90.90%. The peak value for the minimum estimate was 83.80%. Critically, the distributions of responses were highly variable: 68.27% of participants (one standard deviation around the mean) provided best estimates between 79% and 97% and minimum estimates between 60% and 93%. This variability in laypeople’s perceptions implies there is likely to be considerable variability in how jurors and practitioners interpret confidence. Research and practice would benefit from a standardized definition of what constitutes ‘high confidence.’ [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
22. A comparison of human and GPT-4 use of probabilistic phrases in a coordination game.
- Author
- Maloney, Laurence T., Dal Martello, Maria F., Fei, Vivian, and Ma, Valerie
- Abstract
English speakers use probabilistic phrases such as likely to communicate information about the probability or likelihood of events. Communication is successful to the extent that the listener grasps what the speaker means to convey and, if communication is successful, individuals can potentially coordinate their actions based on shared knowledge about uncertainty. We first assessed human ability to estimate the probability and the ambiguity (imprecision) of twenty-three probabilistic phrases in a coordination game in two different contexts, investment advice and medical advice. We then had GPT-4 (OpenAI), a Large Language Model, complete the same tasks as the human participants. We found that GPT-4’s estimates of probability both in the Investment and Medical Contexts were as close or closer to that of the human participants as the human participants’ estimates were to one another. However, further analyses of residuals disclosed small but significant differences between human and GPT-4 performance. Human probability estimates were compressed relative to those of GPT-4. Estimates of probability for both the human participants and GPT-4 were little affected by context. We propose that evaluation methods based on coordination games provide a systematic way to assess what GPT-4 and similar programs can and cannot do. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. Statistical approaches to evaluate in vitro dissolution data against proposed dissolution specifications.
- Author
-
Li, Fasheng, Nickerson, Beverly, Van Alstine, Les, and Wang, Ke
- Abstract
In vitro dissolution testing is a regulatory-required critical quality measure for solid dose pharmaceutical drug products. Setting acceptance criteria that meet compendial requirements is required for a product to be filed and approved for marketing. Statistical approaches for analyzing dissolution data, setting specifications, and visualizing results can vary according to product requirements, a company's practices, and scientific judgement. This paper provides a general description of the steps taken in the evaluation and setting of in vitro dissolution specifications at release and on stability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Confidence distributions for treatment effects in clinical trials: Posteriors without priors.
- Author
-
Marschner, Ian C.
- Subjects
- *
CLINICAL trials , *TREATMENT effectiveness , *DISTRIBUTION (Probability theory) , *FREQUENTIST statistics , *BAYESIAN analysis - Abstract
An attractive feature of using a Bayesian analysis for a clinical trial is that knowledge and uncertainty about the treatment effect are summarized in a posterior probability distribution. Researchers often find probability statements about treatment effects highly intuitive, and the fact that this is not accommodated in frequentist inference is a disadvantage. At the same time, the requirement to specify a prior distribution in order to obtain a posterior distribution is sometimes an artificial process that may introduce subjectivity or complexity into the analysis. This paper considers a compromise involving confidence distributions, which are probability distributions that summarize uncertainty about the treatment effect without the need for a prior distribution and in a way that is fully compatible with frequentist inference. The concept of a confidence distribution provides a posterior-like probability distribution that is distinct from, but exists in tandem with, the relative frequency interpretation of probability used in frequentist inference. Although they have been discussed for decades, confidence distributions are not well known among clinical trial statisticians, and the goal of this paper is to discuss their use in analyzing treatment effects from randomized trials. As well as providing an introduction to confidence distributions, some illustrative examples relevant to clinical trials are presented, along with various case studies based on real clinical trials. It is recommended that trial statisticians consider presenting confidence distributions for treatment effects when reporting analyses of clinical trials. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
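The "posterior without a prior" idea in this abstract can be illustrated with a minimal sketch. The normal-approximation confidence CDF below is one standard construction, and the effect estimate and standard error are invented, not taken from any trial in the paper:

```python
# Sketch: confidence distribution for a treatment effect. For a normally
# distributed estimate theta_hat with standard error se, a standard
# confidence CDF is C(theta) = Phi((theta - theta_hat)/se).
# theta_hat and se below are illustrative values only.

from math import erf, sqrt

def confidence_cdf(theta, theta_hat, se):
    """C(theta): degree of confidence that the true effect lies below theta."""
    return 0.5 * (1.0 + erf((theta - theta_hat) / (se * sqrt(2.0))))

theta_hat, se = -2.0, 1.0   # e.g. estimated risk difference (negative = benefit)

# A posterior-like probability statement, with no prior needed:
p_benefit = confidence_cdf(0.0, theta_hat, se)   # confidence that effect < 0
print(f"confidence of a beneficial effect: {p_benefit:.3f}")
```

The 2.5% and 97.5% quantiles of C recover the usual 95% confidence interval, which is what makes this construction fully compatible with frequentist inference.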
25. Reliability Analysis of a Micro Hydro Power Plants System at Lombok with Expected Energy Not Supplied Method.
- Author
-
Widjonarko, Saleh, Azmi, Utomo, Wahyu Mulyo, Omar, Saodah, and Nafi, Muhammad Ilman
- Subjects
- *
POWER resources , *ENERGY development , *RENEWABLE energy sources , *POWER plants , *HYDROELECTRIC power plants , *ELECTRIC power consumption , *ENERGY consumption - Abstract
In the context of this research, understanding the reliability of a power generator is essential as a criterion for assessing its suitability for use or the need for further development. The method used in this study is reliability analysis, known as "Expected Energy Not Supplied (EENS)." The initial step of this method is to calculate the FOR (Forced Outage Rate) to determine the level of disturbances in the generator unit. The subsequent process involves calculating individual probabilities, analyzing the generator load curve, determining the EENS values of three generators, and comparing them with the EENS standards established by the National Electricity Market. These standards stipulate that EENS should not exceed 0.002% of the total energy consumption in the region. This research marks a significant milestone as the first endeavour conducted on Lombok Island within this specific context. The study was conducted by analyzing three operational Micro-Hydro Power (MHP) units on Lombok Island. The research findings indicate that the EENS metric for MHP on Lombok Island stands at 2.822%. This result suggests that MHP reliability on Lombok Island fails to meet the established criterion of an EENS below 0.002% annually. In practical terms, these findings imply that MHP plants located on Lombok Island may not be relied upon as the primary source to meet the electricity demands of the Lombok region in 2022. This research provides valuable insights into the challenges of energy reliability on Lombok Island and serves as a crucial foundation for further considerations in the development of renewable energy sources in the region. [ABSTRACT FROM AUTHOR]
- Published
- 2024
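The FOR and EENS steps listed in this abstract can be sketched roughly as follows. All capacities, outage hours, and load values are illustrative assumptions, not the Lombok data, and real EENS studies use full load-duration curves rather than the constant load assumed here:

```python
# Illustrative sketch of a FOR/EENS screening. All numbers are invented.

def forced_outage_rate(forced_outage_hours, service_hours):
    """FOR = forced outage time / (service time + forced outage time)."""
    return forced_outage_hours / (service_hours + forced_outage_hours)

def expected_energy_not_supplied(unit_capacities, unit_fors, load_mw, hours):
    """Crude EENS: for each single-unit-outage state of independent
    two-state units, weight the energy shortfall (load minus remaining
    capacity, if positive) by the state's probability."""
    total_capacity = sum(unit_capacities)
    eens = 0.0
    for i, (cap_i, for_i) in enumerate(zip(unit_capacities, unit_fors)):
        p_state = for_i
        for j, for_j in enumerate(unit_fors):
            if j != i:
                p_state *= (1.0 - for_j)   # other units stay available
        shortfall = max(0.0, load_mw - (total_capacity - cap_i))
        eens += p_state * shortfall * hours
    return eens

# Three hypothetical MHP units, constant 1 MW load, one year (8760 h).
caps = [0.5, 0.4, 0.3]   # MW
fors = [forced_outage_rate(200, 8560),
        forced_outage_rate(150, 8610),
        forced_outage_rate(300, 8460)]
eens = expected_energy_not_supplied(caps, fors, load_mw=1.0, hours=8760)
total_energy = 1.0 * 8760
print(f"EENS = {eens:.1f} MWh ({100 * eens / total_energy:.3f}% of demand)")
```

Even in this toy setting the EENS share far exceeds the 0.002% criterion quoted in the abstract, which illustrates how strict that standard is.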
26. NECESSARY AND SUFFICIENT CONDITIONS FOR DOMINATION RESULTS FOR PROPER SCORING RULES.
- Author
-
PRUSS, ALEXANDER R.
- Subjects
- *
FORECASTING , *CALCULUS , *PROBABILITY theory - Abstract
Scoring rules measure the deviation between a forecast, which assigns degrees of confidence to various events, and reality. Strictly proper scoring rules have the property that for any forecast p, the mathematical expectation of the score of p by the lights of p is strictly better than the mathematical expectation of the score of any other forecast q by the lights of p. Forecasts need not satisfy the axioms of the probability calculus, but Predd et al. [9] have shown that given a finite sample space and any strictly proper additive and continuous scoring rule, the score for any forecast that does not satisfy the axioms of probability is strictly dominated by the score for some probabilistically consistent forecast. Recently, this result has been extended to non-additive continuous scoring rules. In this paper, a condition weaker than continuity is given that suffices for the result, and the condition is proved to be optimal. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
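The domination result this abstract builds on can be checked concretely for the Brier score, a standard strictly proper, additive, continuous rule. The forecasts below are illustrative examples, not taken from the paper:

```python
# Check of Brier-score domination on a two-event partition {A, not-A}:
# an incoherent forecast (degrees of confidence not summing to 1) is
# strictly dominated by its Euclidean projection onto the simplex.

def brier(forecast, outcome):
    """Brier score: squared distance from the realized indicator vector
    (lower is better)."""
    return sum((f - o) ** 2 for f, o in zip(forecast, outcome))

incoherent = (0.8, 0.5)   # violates p(A) + p(not-A) = 1
coherent = (0.65, 0.35)   # Euclidean projection of (0.8, 0.5) onto the simplex

for outcome in [(1, 0), (0, 1)]:   # A happens / not-A happens
    assert brier(coherent, outcome) < brier(incoherent, outcome)
print("The coherent forecast strictly dominates in every state.")
```

Domination here means the coherent forecast scores strictly better no matter which event obtains, which is exactly the sense of the Predd et al. result the paper generalizes.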
27. IS CAUSAL REASONING HARDER THAN PROBABILISTIC REASONING?
- Author
-
MOSSÉ, MILAN, IBELING, DULIGUR, and ICARD, THOMAS
- Subjects
- *
CAUSAL inference , *CONDITIONAL probability , *FORMAL languages , *INFERENTIAL statistics , *CONDITIONALS (Logic) , *COMPLETENESS theorem - Abstract
Many tasks in statistical and causal inference can be construed as problems of entailment in a suitable formal language. We ask whether those problems are more difficult, from a computational perspective, for causal probabilistic languages than for pure probabilistic (or "associational") languages. Despite several senses in which causal reasoning is indeed more complex—both expressively and inferentially—we show that causal entailment (or satisfiability) problems can be systematically and robustly reduced to purely probabilistic problems. Thus there is no jump in computational complexity. Along the way we answer several open problems concerning the complexity of well-known probability logics, in particular demonstrating the $\exists\mathbb{R}$-completeness of a polynomial probability calculus, as well as a seemingly much simpler system, the logic of comparative conditional probability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. Probability Calculation for Utilization of Photovoltaic Energy in Electric Vehicle Charging Stations.
- Author
-
Belany, Pavol, Hrabovsky, Peter, and Florkova, Zuzana
- Subjects
- *
ELECTRIC vehicle charging stations , *ELECTRIC vehicles , *ENERGY consumption , *ELECTRIC charge , *RENEWABLE energy sources , *ARTIFICIAL neural networks - Abstract
In recent years, there has been a growing emphasis on the efficient utilization of natural resources across various facets of life. One such area of focus is transportation, particularly electric mobility in conjunction with the deployment of renewable energy sources. To fully realize this objective, it is crucial to quantify the probability of achieving the desired state—production exceeding consumption. This article deals with the computation of the probability that the energy required to charge an electric vehicle will originate from a renewable source at a specific time and for a predetermined charging duration. The model is based on artificial neural networks, which serve as an ancillary tool for the probability assessment itself. Neural networks are used to forecast the values of energy production and consumption. Following the processing of these data, the probability of energy availability for a given day and month is determined. A total of seven scenarios are calculated, representing individual days of the week. These findings can help users decide when and for how long to connect their electric vehicle to a charging station in order to receive assured clean energy from a local photovoltaic source. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
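The core probability the abstract computes, that production exceeds consumption in a given charging window, can be sketched from forecast samples. The distributions below are invented stand-ins for the paper's neural-network forecasts:

```python
# Sketch: estimate P(PV production > consumption) for a charging window
# from paired forecast samples. All distributions are illustrative.

import random

def p_clean_charge(production_samples, consumption_samples):
    """Fraction of paired forecast samples in which PV production covers
    consumption (including the EV charging load)."""
    hits = sum(p > c for p, c in zip(production_samples, consumption_samples))
    return hits / len(production_samples)

random.seed(0)
prod = [random.gauss(12.0, 2.0) for _ in range(10_000)]   # kWh in window
cons = [random.gauss(10.0, 2.0) for _ in range(10_000)]   # kWh incl. EV

p = p_clean_charge(prod, cons)
print(f"P(production > consumption) ≈ {p:.2f}")
```

Repeating this for each hour and charging duration of each weekday would yield the kind of seven-scenario probability table the abstract describes.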
29. Temperature thresholds to guide choice of freshwater species for monitoring onset of chronic thermal stress impacts in rivers.
- Author
-
Rivers-Moore, N. A.
- Subjects
- *
PSYCHOLOGICAL stress , *CORAL bleaching , *FRESH water , *SPECIES , *CLIMATE change , *TEMPERATURE - Abstract
Aquatic species show different sensitivities and responses to chronic thermal stress, resulting in varying degrees of resistance to the negative impacts of climate change, which are ultimately expressed as range expansions or contractions. The choice of species appropriate for assessing climate change impacts in aquatic ecosystems should be guided by the robustness of the relationship between a chosen chronic stress thermal threshold and associated habitat contraction. Twelve aquatic species were evaluated as potential climate change indicators, from which six were selected for testing a conceptual framework for predicting the degree of utility of a species as a climate change indicator. Results indicate that species with a chronic biological thermal threshold below 20°C are likely to experience in excess of 50% loss of thermally suitable environment. Cooler thermal thresholds could inform the choice of suitable sentinel species for use as early indicators of chronic thermal stress, while thresholds above this reflect increasingly thermally resistant species within aquatic communities. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Inherent and probabilistic naturalness.
- Author
-
Gasparri, Luca
- Subjects
- *
ORAL communication , *VOCABULARY , *NATURALNESS (Linguistics) , *SEMANTICS , *PERTURBATION theory - Abstract
Standard accounts hold that regularities of behavior must be arbitrary to constitute a convention. Yet, there is growing consensus that conventionality is a graded phenomenon, and that conventions can be more or less natural. I develop an account of natural conventions that distinguishes two basic dimensions of conventional naturalness: a probabilistic dimension and an inherent one. A convention is probabilistically natural if it is likely to emerge in a population of agents, and inherently natural if its content is a regularity that scores high on relevant measures for naturalness. I motivate the proposal on conceptual grounds and then showcase its descriptive benefits by discussing two case studies in language: the tendency towards word-length optimality and the prevalence of shape opacity in spoken language vocabularies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. Close counterfactuals and almost doing the impossible.
- Author
-
Doan, Tiffany, Denison, Stephanie, and Friedman, Ori
- Subjects
- *
COUNTERFACTUALS (Logic) , *POSSIBILITY , *COGNITION , *PROBABILITY theory - Abstract
Can we feel that an unrealized outcome nearly happened if it was never possible in the first place? People often consider counterfactual events that did not happen, and some counterfactuals seem so close to reality that people say they "almost" or "easily could have" happened. Across four preregistered experiments (total N = 1,228), we investigated how judgments of counterfactual closeness depend on possibility, and whether this varies across two kinds of close counterfactuals. In judging whether outcomes almost happened, participants were more strongly impacted by possibility than by incremental manipulations of probability. In contrast, when judging whether outcomes easily could have happened, participants treated the distinction between impossible and possible like any other variation in probability. Both kinds of judgments were also impacted by propensity, though these effects were comparatively small. Together, these findings reveal novel differences between the two kinds of close counterfactuals and suggest that while possibility is privileged when judging what almost happened, probability is the focus when judging what easily could have happened. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. Probabilistic assessment of spatiotemporal fine particulate matter concentrations in Taiwan using multivariate indicator kriging.
- Author
-
Jang, Cheng-Shin
- Subjects
- *
PARTICULATE matter , *KRIGING , *AIR quality , *TEMPORAL integration , *QUANTILE regression - Abstract
Assessments of spatiotemporal fine particulate matter (PM2.5) concentrations are crucial for establishing risk maps and maintaining human health. This study spatiotemporally assessed PM2.5 concentrations in Taiwan by using multivariate indicator kriging (MVIK) according to current Taiwanese and US regulatory standards for annual average PM2.5 concentrations (15 and 12 μg/m3, respectively). First, multivariate integration was implemented to analyze data on PM2.5 concentrations for 2019–2021 and 2020–2022, as there was no statistically significant difference between the 3-year PM2.5 datasets. MVIK was then used for modeling probabilities according to the two standards. Finally, quantile estimates based on the occurrence probabilities of PM2.5 concentrations were employed to determine the optimal classifications for establishing risk maps according to the two PM2.5 standards. The study results indicated that the multivariate integration of temporal PM2.5 data in MVIK can effectively streamline the analytic process. The multivariate integration of 3-year PM2.5 data was suitable for assessing the risk categories of the regulatory standards for annual average PM2.5. The greatest estimated difference between the 2019–2021 and 2020–2022 multivariate integrations was in the Northern and Chumiao air quality regions. Because many air quality regions fell into PM2.5 categories exceeding 12 μg/m3, the regulatory standard for annual average PM2.5 of 12 μg/m3 was inappropriate for Taiwan at this point in time, based on the assessed 3-year spatiotemporal variability of PM2.5 concentrations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. Process Algebraic Approach for Probabilistic Verification of Safety and Security Requirements of Smart IoT (Internet of Things) Systems in Digital Twin.
- Author
-
Song, Junsup, Lee, Sunghyun, Karagiannis, Dimitris, and Lee, Moonkun
- Subjects
- *
DIGITAL twins , *INTERNET of things , *INTERNET safety , *EMERGENCY medical services , *DETERMINISTIC algorithms - Abstract
Process algebra can be considered one of the most practical formal methods for modeling Smart IoT Systems in Digital Twin, since each IoT device in the systems can be considered as a process. Further, some of the algebras are applied to predict the behavior of the systems. For example, PALOMA (Process Algebra for Located Markovian Agents) and PACSR (Probabilistic Algebra of Communicating Shared Resources) process algebras are designed to predict the behavior of IoT Systems with probability on choice operations. However, there is a lack of analytical methods in the algebras to predict the nondeterministic behavior of the systems. Further, there is no control mechanism to handle undesirable nondeterministic behavior of the systems. In order to overcome these limitations, this paper proposes a new process algebra, called dTP-Calculus, which can be used (1) to specify the nondeterministic behavior of the systems with static probability, (2) verify the safety and security requirements of the nondeterministic behavior with probability requirements, and (3) control undesirable nondeterministic behavior with dynamic probability. To demonstrate the feasibility and practicality of the approach, the SAVE (Specification, Analysis, Verification, Evaluation) tool has been developed on the ADOxx Meta-Modeling Platform and applied to a SEMS (Smart Emergency Medical Service) example. In addition, a miniature digital twin system for the SEMS example was constructed and applied to the SAVE tool as a proof of concept for Digital Twin. It shows that the approach with dTP-Calculus on the tool can be very efficient and effective for Smart IoT Systems in Digital Twin. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
34. Reliability analysis of shear strength equations of RC beams.
- Author
-
ALACALI, Sema, ARSLAN, Güray, and İBİŞ, Aydoğan
- Subjects
- *
CONCRETE beams , *REINFORCED concrete , *RANDOM variables , *EQUATIONS , *FLEXURAL strength , *SHEAR strength - Abstract
The shear strength of a reinforced concrete (RC) member should be larger than its flexural strength in order to prevent shear failure, which is sudden and brittle. The reliability of an RC beam against shear failure is closely related to the reliability of the equation determining its shear strength. In this study, the reliabilities of shear strength equations of RC beams were investigated by constructing the performance function between prediction equations and experimental results using a second-moment approach. It is assumed that the random variables are statistically independent, and correlation effects are not taken into account. It is observed from the reliability rankings that the equation of EN 1992:2004 yields the lowest failure probability, while the equation of Zsutty yields the highest. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
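A minimal sketch of the second-moment approach named in the abstract, assuming independent normal resistance and load effect; the means and standard deviations are illustrative, not values from the study:

```python
# FOSM sketch: performance function g = R - S with independent normal
# resistance R (predicted shear strength) and load effect S.
# mu/sigma values below are invented for illustration.

from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def failure_probability(mu_r, sigma_r, mu_s, sigma_s):
    """Reliability index beta = (mu_R - mu_S)/sqrt(sigma_R^2 + sigma_S^2);
    failure probability Pf = Phi(-beta), assuming independence."""
    beta = (mu_r - mu_s) / sqrt(sigma_r ** 2 + sigma_s ** 2)
    return beta, phi(-beta)

beta, pf = failure_probability(mu_r=250.0, sigma_r=40.0, mu_s=150.0, sigma_s=30.0)
print(f"beta = {beta:.2f}, Pf = {pf:.2e}")
```

A larger beta means a lower failure probability, which is the quantity behind the reliability rankings the abstract reports.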
35. Probability-Based Design of Reinforced Rock Slopes Using Coupled FORM and Monte Carlo Methods.
- Author
-
Low, Bak Kong and Boon, Chia Weng
- Subjects
- *
ROCK slopes , *FAILURE mode & effects analysis - Abstract
The efficiency of the first-order reliability method (FORM) and the accuracy of Monte Carlo simulations (MCS) are coupled in probability-based designs of reinforced rock slopes, including a Hong Kong slope with exfoliation joints. Load–resistance duality is demonstrated and resolved automatically in a foundation on rock with a discontinuity plane. Other examples include the lengthy Hoek and Bray deterministic vectorial procedure for comprehensive pentahedral blocks with external load and bolt force, which is made efficient and more succinct before extending it to probability-based design via MCS-enhanced FORM. The FORM–MCS–FORM design procedure is proposed for cases with multiple failure modes. For cases with a dominant single failure mode, the time-saving importance sampling (IS) and the fast second-order reliability method (SORM) can be used in lieu of MCS. Two cases of 3D reinforced blocks (pentahedral and tetrahedral, respectively) with the possibility of multiple sliding modes are investigated. In the case of the reinforced pentahedral block, direct MCS shows that there is only one dominant failure mode, for which the efficient method of importance sampling at the FORM design point provides fast verification of the revised design. In the case of the reinforced tetrahedral block, there are multiple failure modes contributing to the total failure probability, for which the proposed MCS-enhanced FORM procedure is demonstrated to be essential. Comparisons are made between Excel MCS and MATLAB MCS. Highlights:
• Probability-based design of a Hong Kong slope via coupled FORM and Monte Carlo methods.
• Efficient analysis of a bolted pentahedral block based on Hoek-Bray procedure and Excel Solver.
• New extension of Low-and-Tang FORM algorithm to MCS involving correlated nonnormals.
• FORM-MCS-FORM method for design of 3D rock slopes with multiple failure domains.
• Importance sampling or SORM in lieu of MCS for cases with a dominant single failure mode. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
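The brute-force MCS side of the coupled procedure can be sketched as follows. The limit-state function here is a simple infinite-slope stand-in, not the paper's Hoek-Bray vectorial formulation, and all distributions are invented:

```python
# Crude Monte Carlo failure-probability estimate for a planar sliding
# block with sampled friction angle and cohesion. Model and numbers are
# illustrative stand-ins only.

import math
import random

def factor_of_safety(phi_deg, cohesion_kpa, slope_deg=35.0, weight_kpa=100.0):
    """Infinite-slope style FoS sketch: resisting over driving forces."""
    phi = math.radians(phi_deg)
    alpha = math.radians(slope_deg)
    resisting = cohesion_kpa + weight_kpa * math.cos(alpha) * math.tan(phi)
    driving = weight_kpa * math.sin(alpha)
    return resisting / driving

random.seed(42)
n = 50_000
failures = 0
for _ in range(n):
    phi_deg = random.gauss(30.0, 3.0)   # friction angle, degrees
    c_kpa = random.gauss(15.0, 5.0)     # cohesion, kPa
    if factor_of_safety(phi_deg, c_kpa) < 1.0:
        failures += 1

pf = failures / n
print(f"Monte Carlo Pf ≈ {pf:.4f}")
```

Direct MCS like this is accurate but expensive at small failure probabilities, which is exactly why the paper couples it with the efficient FORM and importance-sampling steps.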
36. Holistic processing is modulated by the probability that parts contain task-congruent information.
- Author
-
Curby, Kim M., Teichmann, Lina, Peterson, Mary A., and Shomstein, Sarah S.
- Subjects
- *
SELECTIVITY (Psychology) , *PROBABILITY theory - Abstract
Holistic processing of face and non-face stimuli has been framed as a perceptual strategy, with classic hallmarks of holistic processing, such as the composite effect, reflecting a failure of selective attention, which is a consequence of this strategy. Further, evidence that holistic processing is impacted by training different patterns of attentional prioritization suggests that it may be a result of learned attention to the whole, which renders it difficult to attend to only part of a stimulus. If so, holistic processing should be modulated by the same factors that shape attentional selection, such as the probability that distracting or task-relevant information will be present. In contrast, other accounts suggest that it is the match to an internal face template that triggers specialized holistic processing mechanisms. Here we probed these accounts by manipulating the probability, across different testing sessions, that the task-irrelevant face part in the composite face task will contain task-congruent or -incongruent information. Attentional accounts of holistic processing predict that when the probability that the task-irrelevant part contains congruent information is low (25%), holistic processing should be attenuated compared to when this probability is high (75%). In contrast, template-based accounts of holistic face processing predict that it will be unaffected by this manipulation, given that the integrity of the faces remains intact. Experiment 1 found evidence consistent with attentional accounts of holistic face processing, and Experiment 2 extended these findings to holistic processing of non-face stimuli. These findings are broadly consistent with learned attention accounts of holistic processing. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
37. Probability Turns Material: The Boltzmann Equation.
- Author
-
Rondoni, Lamberto and Di Florio, Vincenzo
- Subjects
- *
BOLTZMANN'S equation , *STATISTICAL mechanics , *CRITICAL analysis , *DYNAMICAL systems - Abstract
We review, under a modern light, the conditions that render the Boltzmann equation applicable. These are conditions that permit probability to behave like mass, thereby possessing clear and concrete content, whereas generally, this is not the case. Because science and technology are increasingly interested in small systems that violate the conditions of the Boltzmann equation, probability appears to be the only mathematical tool suitable for treating them. Therefore, Boltzmann's teachings remain relevant, and the present analysis provides a critical perspective useful for accurately interpreting the results of current applications of statistical mechanics. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. A theory of stochastic fluvial landscape evolution.
- Author
-
Roberts, G. G. and Wani, O.
- Subjects
- *
FOKKER-Planck equation , *STOCHASTIC differential equations , *SIMILARITY (Geometry) , *FLUVIAL geomorphology , *LANGEVIN equations , *PREDICTION theory - Abstract
Geometries of eroding landscapes contain important information about geologic, climatic, biotic and geomorphic processes. They are also characterized by variability, which makes disentangling their origins challenging. Observations and physical models of fluvial processes, which set the pace of erosion on most continents, emphasize complexity and variability. By contrast, the spectral content of longitudinal river profiles and similarity of geometries at scales greater than approximately 100 km highlight relatively simple emergent properties. A general challenge then, addressed in this manuscript, is development of a theory of landscape evolution that embraces such scale-dependent insights. We do so by incorporating randomness and probability into a theory of fluvial erosion. First, we explore the use of stochastic differential equations of the Langevin type, and the Fokker–Planck equation, for predicting migration of erosional fronts. Second, analytical approaches incorporating distributions of driving forces, critical thresholds and associated proxies are developed. Finally, a linear programming approach is introduced, that, at its core, treats evolution of longitudinal profiles as a Markovian stochastic problem. The theory is developed essentially from first principles and incorporates physics governing fluvial erosion. We explore predictions of this theory, including the natural growth of discontinuities and scale-dependent evolution, including local complexity and emergent simplicity. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
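The Langevin-type equations mentioned in the abstract can be illustrated with a toy Euler-Maruyama integration of a drifting, noisy erosional front. The drift, noise amplitude, and units are invented; this is not the paper's model:

```python
# Toy Euler-Maruyama integration of dx = v dt + sigma dW for the position
# x(t) of an erosional front (e.g. a migrating knickpoint). Parameters
# are illustrative only.

import math
import random

def simulate_front(v=1.0, sigma=0.5, dt=0.01, steps=10_000, seed=1):
    """Front position after `steps` Euler-Maruyama steps (T = steps * dt)."""
    random.seed(seed)
    x = 0.0
    for _ in range(steps):
        x += v * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
    return x

# Ensemble mean should approach v * T = 1.0 * 100 = 100 (arbitrary units),
# while individual paths scatter around it: simple drift, emergent noise.
positions = [simulate_front(seed=s) for s in range(50)]
mean_x = sum(positions) / len(positions)
print(f"ensemble mean front position ≈ {mean_x:.1f}")
```

The ensemble behaves simply (mean position v·T) even though each realization is irregular, a small-scale analogue of the emergent simplicity the abstract describes.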
39. On the use of receiver operating characteristic curve analysis to determine the most appropriate p value significance threshold.
- Author
-
Habibzadeh, Farrokh
- Subjects
- *
RECEIVER operating characteristic curves , *FALSE positive error , *STATISTICAL hypothesis testing , *FREQUENTIST statistics - Abstract
Background: The p value is the most common statistic reported in scientific research articles. Choosing the conventional threshold of 0.05 commonly used for the p value in research articles is unfounded. Many researchers have tried to provide a reasonable threshold for the p value; some proposed a lower threshold, eg, 0.005. However, none of the proposals has gained universal acceptance. Using the analogy between diagnostic tests with continuous results and statistical hypothesis tests, I wish to present a method to calculate the most appropriate p value significance threshold using receiver operating characteristic (ROC) curve analysis. Results: As with diagnostic tests, where the most appropriate cut-off values differ depending on the situation, there is no unique cut-off for the p significance threshold. Unlike the previous proposals, which mostly suggest lowering the threshold to a fixed value (eg, from 0.05 to 0.005), the most appropriate p significance threshold proposed here, in most instances, is much less than the conventional cut-off of 0.05 and varies from study to study and from statistical test to test, even within a single study. The proposed method provides the minimum weighted sum of type I and type II errors. Conclusions: Given the perplexity involved in using frequentist statistics in a correct way (dealing with different p significance thresholds, even in a single study), it seems that the p value is no longer a proper statistic to be used in our research; it should be replaced by alternative methods, eg, Bayesian methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
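The idea of minimizing the weighted sum of type I and type II errors over candidate thresholds can be sketched by simulation; the effect size, weights, and grid below are illustrative assumptions, not the paper's procedure:

```python
# Sketch: pick the p-value significance threshold that minimizes the
# weighted sum of type I and type II error rates, by analogy with
# choosing a diagnostic cut-off on an ROC curve. Settings are invented.

import math
import random

def sim_p_values(n, delta, seed):
    """One-sided z-test p-values for a study with true effect `delta`
    (delta = 0 gives the null, so the p-values are uniform)."""
    random.seed(seed)
    ps = []
    for _ in range(n):
        z = random.gauss(delta, 1.0)
        ps.append(1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))
    return ps

p_null = sim_p_values(10_000, delta=0.0, seed=0)   # H0 true
p_alt = sim_p_values(10_000, delta=5.0, seed=1)    # H1 true, strong effect

def weighted_error(t, w1=1.0, w2=1.0):
    alpha = sum(p < t for p in p_null) / len(p_null)   # type I rate
    beta = sum(p >= t for p in p_alt) / len(p_alt)     # type II rate
    return w1 * alpha + w2 * beta

grid = [i / 1000 for i in range(2, 150, 2)]            # 0.002 .. 0.148
best_t = min(grid, key=weighted_error)
print(f"threshold minimizing weighted errors ≈ {best_t:.3f}")
```

With these settings the optimum lands well below the conventional 0.05, echoing the abstract's conclusion; changing the effect size or the error weights moves the optimum, which is the paper's point that no single fixed threshold is appropriate.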
40. Assessment and comparison of probability scores to predict giant cell arteritis.
- Author
-
Sargi, Chadi, Ducharme-Benard, Stephanie, Benard, Valerie, Meunier, Rosalie-Selene, Ross, Carolyn, and Makhzoum, Jean-Paul
- Subjects
- *
GIANT cell arteritis , *DOPPLER ultrasonography , *TEMPORAL arteries , *PROBABILITY theory , *LONGITUDINAL method - Abstract
Introduction/objectives: To assess and compare the performance of the giant cell arteritis probability score (GCAPS), Ing score, Bhavsar-Khalidi score (BK score), color Doppler ultrasound (CDUS) halo count, and halo score, to predict a final diagnosis of giant cell arteritis (GCA). Method: A prospective cohort study was conducted from April to December 2021. Patients with suspected new-onset GCA referred to our quaternary CDUS clinic were included. Data required to calculate each clinical and CDUS probability score was systematically collected at the initial visit. Final diagnosis of GCA was confirmed clinically 6 months after the initial visit, by two blinded vasculitis specialists. Diagnostic accuracy and receiver operator characteristic (ROC) curves for each clinical and CDUS prediction scores were assessed. Results: Two hundred patients with suspected new-onset GCA were included: 58 with confirmed GCA and 142 without GCA. All patients with GCA satisfied the 2022 ACR/EULAR classification criteria. A total of 5/15 patients with GCA had a positive temporal artery biopsy. For clinical probability scores, the GCAPS showed the best sensitivity (Se, 0.983), whereas the BK score showed the best specificity (Sp, 0.711). As for CDUS, a halo count of 1 or more was found to have a Se of 0.966 and a Sp of 0.979. Combining concordant results of clinical and CDUS prediction scores showed excellent performance in predicting a final diagnosis of GCA. Conclusion: Using a combination of clinical score and CDUS halo count provided an accurate GCA prediction method which should be used in the setting of GCA Fast-Track clinics. Key Points • In this prospective cohort of participants with suspected GCA, 3 clinical prediction tools and 2 ultrasound scores were compared head-to-head to predict a final diagnosis of GCA. 
• For clinical prediction tools, the giant cell arteritis probability score (GCAPS) had the highest sensitivity, whereas the Bhavsar-Khalidi score (BK score) had the highest specificity. • Ultrasound halo count was both sensitive and specific in predicting GCA. • Combination of a clinical prediction tool such as the GCAPS, with ultrasound halo count, provides an accurate method to predict GCA. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
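The sensitivity and specificity figures reported for the halo count follow from a standard 2x2 table. The counts below are hypothetical, but chosen to be consistent with the abstract's 58 GCA and 142 non-GCA patients and its reported Se 0.966 and Sp 0.979:

```python
# Sensitivity and specificity from a 2x2 diagnostic table.
# Counts are illustrative, reverse-engineered from the abstract's totals.

def se_sp(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# 58 GCA patients (56 halo-positive) and 142 non-GCA (139 halo-negative).
sensitivity, specificity = se_sp(tp=56, fn=2, tn=139, fp=3)
print(f"Se = {sensitivity:.3f}, Sp = {specificity:.3f}")
```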
41. Hazards and Risks Investigation of the Fit Out Project of RMM Construction: Basis for Creation of an Occupational Construction and Health and Safety Program.
- Author
-
Masanga, Luisito P., Gutierrez, Ernell Bautista, and De Vera, Rommel
- Subjects
- *
CONSTRUCTION workers , *INDUSTRIAL hygiene , *HAZARDS , *QUALITATIVE research , *GRINDING & polishing - Abstract
The existence of hazards and risks in businesses is inevitable, particularly in the construction industry. The research is qualitative in nature and aimed to identify and investigate the actual hazards and risks in the workplace as experienced by selected construction workers, the main participants in the research. A total of eleven (11) construction workers were utilized in the research and selected by way of purposive sampling. During the focus group discussions, participants were divided into three groups. Among others, results revealed the following construction-related hazards: Scattered Broken Tiles, Unstable Scaffolding, Scattered Nails, Insufficient Lighting, Exposed Extension Wire, Chiselling and Grinding, No Insulation, and Physical Transport of Heavy Materials. The research further investigated the level of risk of each identified hazard using a probability and severity matrix. On average, the identified risks obtained a high level of probability, while their severity on average was only medium or moderate. An Occupational Construction and Health and Safety Program was developed based on the results of this research. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
42. Model of Assessing the Quality Indicators of the Service Process in Transport.
- Author
-
Zakirov, Vakhid
- Subjects
- *
QUALITY of service , *QUEUING theory , *STARTUP costs , *CUSTOMER services , *VALUE (Economics) - Abstract
In a competitive environment, satisfying customers' growing demand for quality of service while taking into account the economic interests of both transport companies and service users is a key task. Existing methods based only on the opinions of users do not fully take into account the interests of the companies providing these services. To this end, the article analyzes the qualitative indicators of customer service provided by transport companies when providing them with various services. It has been established that the cost of organizing the service, the (lost) income, and the time lost by customers all depend on the quality of service. This analysis is based on a queuing theory model. A method is proposed for determining the economically optimal values of customer service quality indicators. The method is based on minimizing a generalized indicator, the reduced costs. This indicator takes into account, on the one hand, the interests of the company and, on the other, the interests of service users. The results obtained allow companies to choose a customer service quality indicator and its optimal value for the provision of various services. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
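The trade-off the abstract above describes, company costs versus customers' lost time, can be sketched with the simplest queuing model. This is my own minimal illustration, not the article's model: an M/M/1 queue where "reduced costs" combine a hypothetical per-unit service cost with a hypothetical waiting cost.

```python
# A minimal queuing-theory sketch (assumptions mine, not the article's model):
# for an M/M/1 queue with arrival rate lam and service rate mu, the expected
# number in the system is L = rho / (1 - rho), rho = lam / mu. A "reduced
# cost" is taken as service cost c_s * mu plus waiting cost c_w * L.
def reduced_cost(mu: float, lam: float, c_s: float, c_w: float) -> float:
    rho = lam / mu
    if rho >= 1:
        return float("inf")  # unstable queue: unbounded waiting
    L = rho / (1 - rho)      # mean number of customers in the system
    return c_s * mu + c_w * L

# Grid-search the service rate that minimizes the combined cost.
lam, c_s, c_w = 4.0, 2.0, 10.0   # hypothetical arrival rate and unit costs
best_mu = min((mu / 10 for mu in range(41, 200)),
              key=lambda mu: reduced_cost(mu, lam, c_s, c_w))
print(round(best_mu, 1))
```

The minimizing service rate balances the two interests exactly as the abstract describes: faster service costs the company more but saves customers' time.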
43. A probabilistic-phase field model for the fracture of brittle materials.
- Author
-
Alabdullah, Mohammad and Ghoniem, Nasr M
- Subjects
- *
FRACTURE mechanics , *MECHANICAL loads , *BRITTLE materials , *PERCOLATION , *COMPUTER simulation , *BEND testing - Abstract
We develop a computational method to determine the failure probability of brittle materials under general mechanical loading conditions. The method is a combination of two parts: (1) numerical simulations of materials with multiple cracks using phase field theory, where the complete fracture process is viewed as 'damage percolation' along critical paths or clusters of cracks, rather than the traditional weak-link failure mechanism of Weibull, and (2) an extension of the Batdorf statistical theory of fracture to finite domains, where it is implemented within the finite element framework. The results of phase-field simulations at the 'percolation threshold' are used as failure data in the Batdorf theory to determine the overall probability of failure. The input to this approach is the size distribution of cracks in a pristine material. An example is shown, where alumina samples that were previously tested by Abe and coworkers (Abe et al 2003 J. Am. Ceram. Soc. 86 1019–21) in four-point loading are compared to the results of our numerical simulations. The approach developed here has the advantage of being extendable to more complex thermomechanical loading. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
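For context on the abstract above: the classical weakest-link model of Weibull, which the article's percolation-based approach moves beyond, gives a closed-form failure probability. The sketch below shows that baseline model only; the stress values and Weibull parameters are hypothetical.

```python
import math

# Background sketch of the classical Weibull weakest-link failure model that
# the article contrasts with its percolation view. Parameters hypothetical.
def weibull_failure_prob(sigma: float, sigma0: float, m: float,
                         volume: float = 1.0) -> float:
    """P_f = 1 - exp[-V * (sigma / sigma0)^m] for applied stress sigma,
    characteristic strength sigma0, and Weibull modulus m."""
    return 1.0 - math.exp(-volume * (sigma / sigma0) ** m)

# A higher Weibull modulus m means a sharper survival-to-failure transition.
for m in (5.0, 20.0):
    print(m, round(weibull_failure_prob(300.0, 350.0, m), 3))
```

Weakest-link statistics assume one critical flaw triggers failure; the article's point is that brittle fracture can instead proceed by damage percolation across clusters of cracks.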
44. A systematic account of probabilistic fallacies in legal fact-finding.
- Author
-
Dahlman, Christian
- Subjects
- *
LOGICAL fallacies , *LEGAL services , *SCHOLARS - Abstract
Evidence scholars have observed probabilistic fallacies in legal fact-finding and given them names since the 1980s (for example, the 'Prosecutor's Fallacy' and the 'Defense Attorney's Fallacy'). This has produced a rather unorganised list of over a dozen different probabilistic fallacies. In this article, the author proposes a systematic account in which the observed probabilistic fallacies are organised into categories. Hierarchical relations between probabilistic fallacies are highlighted, and some fallacies are re-named to reflect the category they belong to and their relation to other fallacies in that category. All fallacies are precisely defined and illustrated with examples from real cases in which they were committed by fact-finders. The result is a list of 12 probabilistic fallacies organised into 7 categories. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
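The best-known fallacy the article above catalogues, the Prosecutor's Fallacy, conflates P(evidence | innocence) with P(innocence | evidence). A short Bayes' theorem computation makes the gap concrete; the match probability and population size below are hypothetical numbers of my own, not taken from the article's cases.

```python
# Hedged illustration (numbers hypothetical) of the Prosecutor's Fallacy:
# a tiny P(match | innocent) does not imply a tiny P(innocent | match).
def posterior_guilt(p_match_if_innocent: float, population: int) -> float:
    """P(guilt | match) via Bayes' theorem, assuming exactly one true source
    and a uniform prior of 1/population over all members."""
    prior = 1 / population
    p_match_if_guilty = 1.0
    evidence = (p_match_if_guilty * prior
                + p_match_if_innocent * (1 - prior))
    return p_match_if_guilty * prior / evidence

# A 1-in-a-million match probability in a city of 10 million: roughly 10
# innocent people are expected to match, so guilt is far from certain.
print(round(posterior_guilt(1e-6, 10_000_000), 3))
```

The posterior here is about 0.09, not 0.999999: exactly the inversion error the fallacy names.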
45. A neutral theory of plant carbon allocation.
- Author
-
Thompson, R Alex
- Subjects
- *
BOTANICAL chemistry , *CARBON metabolism , *MOLECULAR evolution , *PLANT metabolism , *CARBON , *RESPIRATION , *RESPIRATION in plants - Abstract
How plants use the carbon they gain from photosynthesis remains a key area of study among plant ecologists. Although numerous theories have been presented throughout the years, the field lacks a clear null model. To fill this gap, I have developed the first null model, or neutral theory, of plant carbon allocation using probability theory, plant biochemistry and graph theory at the level of a leaf. Neutral theories have been used to establish a null hypothesis in molecular evolution and community assembly to describe how much of an ecological phenomenon can be described by chance alone. Here, the aim of a neutral theory of plant carbon allocation is to ask: how is carbon partitioned between sinks if one assumes plants do not prioritize certain sinks over others? Using the biochemical network of plant carbon metabolism, I show that, if allocation was strictly random, carbon is more likely to be allocated to storage, defense, respiration and finally growth. This 'neutral hierarchy' suggests that a sink's biochemical distance from photosynthesis plays an important role in carbon allocation patterns, highlighting the potentially adaptive role of this biochemical network for plant survival in variable environments. A brief simulation underscores that our ability to measure the carbon allocation from photosynthesis to a given sink is unreliable due to simple probabilistic rules. While neutral theory may not explain all patterns of carbon allocation, its utility is in the minimal assumptions and role as a null model against which future data should be tested. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
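The neutral-allocation idea in the abstract above can be illustrated with a toy random walk. The graph below is my own hypothetical construction, not the author's biochemical network: carbon takes unweighted random steps away from photosynthesis, so sinks fewer steps away capture more carbon by chance alone.

```python
import random

# Toy sketch (my construction, not the author's model): with no sink
# prioritisation, allocation shares are set purely by biochemical distance.
graph = {
    "photosynthate": ["storage", "node1"],
    "node1": ["defense", "node2"],
    "node2": ["respiration", "node3"],
    "node3": ["growth"],
}
sinks = {"storage", "defense", "respiration", "growth"}

def allocate_one(rng: random.Random) -> str:
    """Walk one unit of fixed carbon from photosynthesis to a sink."""
    node = "photosynthate"
    while node not in sinks:
        node = rng.choice(graph[node])
    return node

rng = random.Random(0)
counts = {s: 0 for s in sinks}
for _ in range(10_000):
    counts[allocate_one(rng)] += 1
# Expected shares under pure chance: storage 1/2, defense 1/4,
# respiration 1/8, growth 1/8 -- a hierarchy set by distance alone.
print(max(counts, key=counts.get))
```

This reproduces in miniature the paper's point that a "neutral hierarchy" can arise with no adaptive prioritisation at all, which is what makes it a useful null model.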
46. Improved flood quantile estimation for South Africa.
- Author
-
van der Spuy, D. and du Plessis, J. A.
- Subjects
- *
DISTRIBUTION (Probability theory) , *EXTREME value theory , *FLOODS , *MOMENTS method (Statistics) , *QUANTILE regression , *HYDROLOGISTS - Abstract
The performance of the flood frequency probability distributions most frequently used in South Africa (Log-Normal, Log-Pearson Type 3 and Generalised Extreme Value) was reviewed; all tend to perform poorly when low-exceedance-probability events are estimated, especially where outliers are present in the dataset. This can be attributed to the challenge of analysing very limited 'samples' of annual flood peak populations, which are unknown. At present, outliers are inadequately 'managed' by attempting to 'normalise' the flood peak dataset, which conceals the significance of the observed data. Thus, to adequately consider the outliers, this study was undertaken with the aim of improving the current statistical approach by developing a more stable and consistent methodology for estimating flood quantiles. The approach followed in developing the new methodology, called IPZA, might be considered unconventional, given that a multiple regression approach was used to accommodate the strongly skewed data often associated with annual flood peak series. The main advantages of IPZA are its consistency, its simplicity of application (only one set of frequency factors for every parameter, regardless of the skewness), its integrated handling of outliers and its use of the conventional method of moments, thereby eliminating the need to adjust any moments. The performance of IPZA exceeded initial expectations. The results are more consistent and, by taking outliers into account, appear more sensible than those of existing probability distributions. It is recommended that IPZA be used as a valuable addition to the existing set of decision-making tools for hydrologists and engineers performing flood frequency analyses. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
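The IPZA method in the abstract above is novel, so it cannot be reproduced here; the sketch below instead shows the conventional baseline it is measured against, a flood quantile computed from a fitted Generalised Extreme Value distribution. The GEV parameters are hypothetical.

```python
import math

# Not the article's IPZA method (which is new); a sketch of the conventional
# GEV quantile estimate it benchmarks against. Parameters are hypothetical.
def gev_quantile(T: float, mu: float, sigma: float, xi: float) -> float:
    """Flood magnitude with return period T years under a fitted GEV
    (location mu, scale sigma, shape xi != 0):
    x_T = mu + (sigma / xi) * (y^(-xi) - 1), y = -ln(1 - 1/T)."""
    y = -math.log(1 - 1 / T)          # reduced variate
    return mu + (sigma / xi) * (y ** (-xi) - 1)

# Longer return periods -> larger (and increasingly uncertain) quantiles;
# this tail region is exactly where the article reports poor performance.
for T in (10, 100, 1000):
    print(T, round(gev_quantile(T, mu=500.0, sigma=200.0, xi=0.1), 1))
```

It is these low-exceedance-probability (large-T) estimates, strongly leveraged by any outliers in the fitted record, that motivate the article's alternative.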
47. An intuitive, application-based, simulation-driven approach to teaching probability and random processes.
- Author
-
Sheikh, Waseem
- Abstract
Probability and random processes is considered by students to be conceptually one of the most difficult subjects in the undergraduate electrical and computer engineering curriculum. There are numerous reasons for this difficulty. First, humans are not innately good at probabilistic intuition. Traditionally, the subject has been introduced in a very abstract manner, without emphasis on real-world applications from the electrical and computer engineering discipline. In addition, extensive use of interactive simulation and visualization tools, which offer an alternative way of developing probabilistic intuition, is usually missing from traditional course offerings. This paper presents a unique pedagogical approach to teaching an introductory probability course offered to electrical and computer engineering juniors. The salient features of the proposed approach include: more emphasis on real-world electrical and computer engineering problems that show the applications of abstract probabilistic concepts; extensive hands-on, interactive MATLAB® simulations of real-world electrical and computer engineering problems, tightly integrated into the curriculum; a frequentist approach to building probabilistic intuition through simulation; concrete examples showing how naive probabilistic intuition can be erroneous and how correct probabilistic intuition can be developed by systematically modeling, simulating and analyzing a problem; and application-based simulations driving the abstract theory rather than the other way around. This pedagogical approach was implemented in a course offered to electrical and computer engineering undergraduates at Purdue University Northwest. The paper presents a concrete example illustrating how the salient features of the proposed approach were implemented as part of this course, along with student data from the course to validate the efficacy of the approach. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
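In the spirit of the simulation-first pedagogy in the abstract above (the course itself uses MATLAB; this Python sketch is mine, not from the paper), here is a frequentist simulation of a classic case where naive intuition fails: the probability that among 23 people, two share a birthday.

```python
import random

# Frequentist estimate of the birthday-match probability: simulate many
# rooms of n people and count the fraction with at least one shared birthday.
def birthday_match_prob(n_people: int, trials: int, seed: int = 0) -> float:
    rng = random.Random(seed)
    hits = sum(
        # A duplicate birthday shrinks the set below n_people.
        len(set(rng.randrange(365) for _ in range(n_people))) < n_people
        for _ in range(trials)
    )
    return hits / trials

est = birthday_match_prob(23, 20_000)
# Analytic value for 23 people is about 0.5073; most people guess far lower.
print(round(est, 2))
```

Running the simulation and comparing against the analytic answer is exactly the model-simulate-analyze loop the paper advocates for correcting faulty intuition.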
48. Can We have Justified Beliefs about Fundamental Properties?
- Author
-
Bradley, Darren
- Subjects
- *
METAPHYSICS , *THEORY of knowledge , *PROBABILITY theory , *SKEPTICISM , *BELIEF & doubt - Abstract
An attractive picture of the world is that some features are metaphysically fundamental and others are derivative, with the derivative features grounded in the fundamental features. But how do we have justified beliefs about which features are fundamental? What is the epistemology of fundamentality? I sketch a response in this paper. The guiding idea is that the same properties cause the same experiences. I argue that a probabilistic connection between epistemic fundamentality and metaphysical fundamentality is sufficient for justified beliefs about the metaphysically fundamental. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
49. Randomness and probability: exploring student teachers' conceptions.
- Author
-
Ingram, Jenni
- Subjects
- *
STUDENT teachers , *MATHEMATICS teachers , *SCIENCE teachers , *TEACHER educators , *TEACHER education - Abstract
Understanding randomness is essential for modern life, as it underpins decisions under uncertainty. It is also an essential part of both the mathematics and science curricula in schools. Yet, research has shown that many people consider randomness difficult to perceive and argue about, with a number of different and contradictory views on the nature of randomness prevailing. This study explores beginning mathematics and science teachers' understanding of randomness. A questionnaire was used with student teachers in an initial teacher-education course to explore their understanding of and reasoning about randomness and random events. Results suggest that mathematics and science student teachers conceptualize and argue about randomness in a variety of ways. Furthermore, these different conceptualizations affect how they respond to both common classroom tasks and everyday contexts involving randomness. This raises important implications for the education of teachers who will themselves be teaching probability and statistical inference. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
50. Probabilidade para o ensino médio nos livros de conhecimento do PNLD 2021 [Probability for high school in the PNLD 2021 knowledge books].
- Author
-
Oliveira da Silva, Anderson Rodrigo and Lisbôa Guimarães, Gilda
- Subjects
- *
MATHEMATICS textbooks , *HIGH schools , *DECISION making , *PROBABILITY theory , *TEACHING , *LEARNING - Abstract
Probability is the area of mathematics concerned with predicting chances, making decisions and analysing risks, making it one of the main areas of knowledge developed at school. Knowing that the textbook is an essential tool in the teacher's work, this article aims to analyse the approach to teaching probability in the Probability and Statistics textbooks approved by the PNLD 2021 for high school. Based on documentary research, we analysed the activities considering the meanings of probability, sample spaces and the structuring of events. We found an asymmetry in relation to the meanings, with a strong predominance of the classical meaning, extensive use of discrete sample spaces and limitations in the conceptual approach to important theorems, such as conditional probability for the composition of events. As a result, teachers' skills are fundamental for complementing and correcting the teaching and learning process of probability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
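The conditional-probability content that the review above finds under-represented is the composition rule P(A and B) = P(A) · P(B | A). A classroom-style worked example (my own illustration, not from the reviewed textbooks) with two draws without replacement:

```python
from fractions import Fraction

# Classroom-style sketch: two draws without replacement from an urn with
# 3 red and 2 blue balls, composed via P(A and B) = P(A) * P(B | A).
p_first_red = Fraction(3, 5)
p_second_red_given_first_red = Fraction(2, 4)   # one red already removed
p_both_red = p_first_red * p_second_red_given_first_red
print(p_both_red)   # 3/10
```

Exact fractions keep the conditioning step visible, which is precisely what a purely classical, single-draw treatment of probability leaves out.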