96 results for "Probability"
Search Results
2. Diagnosis of acute aortic syndromes with ultrasound and d-dimer: the PROFUNDUS study.
- Author
-
Morello, Fulvio, Bima, Paolo, Castelli, Matteo, Capretti, Elisa, de Matos Soeiro, Alexandre, Cipriano, Alessandro, Costantino, Giorgio, Vanni, Simone, Leidel, Bernd A., Kaufmann, Beat A., Osman, Adi, Candelli, Marcello, Capsoni, Nicolò, Behringer, Wilhelm, Capuano, Marialessia, Ascione, Giovanni, Leal, Tatiana de Carvalho Andreucci Torres, Ghiadoni, Lorenzo, Pivetta, Emanuele, and Grifoni, Stefano
- Subjects
- *
COMPUTED tomography , *PATIENT selection , *MEDICAL triage , *BACKACHE , *ANGIOGRAPHY - Abstract
• Ultrasound and d-dimer were integrated for diagnosis of acute aortic syndromes. • The protocol allowed rapid triage for urgent computed tomography angiography. • Protocol-based rule-out was safe since no major events were missed within 30 days. • The protocol averted 41% of computed tomography angiography exams. • Age-adjusted interpretation of d-dimer maximized protocol efficiency. In patients complaining of common symptoms such as chest/abdominal/back pain or syncope, acute aortic syndromes (AAS) are rare underlying causes. AAS diagnosis requires urgent advanced aortic imaging (AAI), mostly computed tomography angiography. However, patient selection for AAI poses conflicting risks of misdiagnosis and overtesting. We assessed the safety and efficiency of a diagnostic protocol integrating clinical data with point-of-care ultrasound (POCUS) and d-dimer (single/age-adjusted cutoff), to select patients for AAI. This prospective study involved 12 Emergency Departments from 5 countries. POCUS findings were integrated with a guideline-compliant clinical score, to define the integrated pre-test probability (iPTP) of AAS. If iPTP was high, urgent AAI was requested. If iPTP was low and d-dimer was negative, AAS was ruled out. Patients were followed for 30 days, to adjudicate outcomes. Within 1979 enrolled patients, 176 (9%) had an AAS. POCUS led to a net reclassification improvement of 20% (24%/−4% for events/non-events, P < 0.001) over the clinical score alone. Median time to AAS diagnosis was 60 min if POCUS was positive vs 118 min if negative (P = 0.042). Within 941 patients satisfying rule-out criteria, the 30-day incidence of AAS was 0% (95% CI, 0–0.41%); without POCUS, 2 AAS would potentially have been missed. Protocol rule-out efficiency was 48% (95% CI, 46–50%) and AAI was averted in 41% of patients. Using the age-adjusted d-dimer cutoff, rule-out efficiency was 54% (difference 6%, 95% CI, 4–9%, vs the standard cutoff).
The integrated algorithm allowed rapid triage of high-probability patients, while providing safe and efficient rule-out of AAS. Age-adjusted d-dimer maximized efficiency. CLINICAL TRIAL REGISTRATION: Clinicaltrials.gov, NCT04430400 [ABSTRACT FROM AUTHOR]
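The net reclassification improvement quoted above (24%/−4% for events/non-events, 20% overall) decomposes as in this minimal sketch. The counts in the test case are hypothetical, chosen only to reproduce figures of the same form; they are not the study's data.

```python
def nri(events_up, events_down, n_events, nonev_up, nonev_down, n_nonevents):
    """Net reclassification improvement: how often the new model moves risk
    estimates in the right direction, minus the wrong direction.

    Events should be reclassified upward, non-events downward."""
    nri_events = (events_up - events_down) / n_events
    nri_nonevents = (nonev_down - nonev_up) / n_nonevents
    return nri_events, nri_nonevents, nri_events + nri_nonevents
```

With hypothetical counts of 30 events reclassified up and 6 down out of 100, and 10 non-events up and 6 down out of 100, this yields 0.24, −0.04, and 0.20, mirroring the shape of the reported result.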
- Published
- 2024
- Full Text
- View/download PDF
3. A note on the exponentiation approximation of the birthday paradox.
- Author
-
Motegi, Kaiji and Woo, Sejun
- Subjects
- *
EXPONENTIATION , *BIRTHDAYS , *CONDITIONAL probability , *PARADOX - Abstract
This note sheds new light on the exponentiation approximation of the probability that all K individuals have distinct birthdays across N calendar days. The exponentiation approximation imposes a pairwise independence assumption, which does not hold in general. We sidestep this assumption by deriving the conditional probability for each pair of individuals to have distinct birthdays given that previous pairs do. An interesting implication is that the conditional probability decreases in a step-function form—not in a strictly monotonic form—as more pairs are restricted to have distinct birthdays. The source of the step-function structure is identified and illustrated. We also establish the equivalence between the pairwise approach and another common approach based on permutations of all individuals. [ABSTRACT FROM AUTHOR]
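The two quantities compared in the note can be sketched in a few lines: the exact product formula versus the exponentiation approximation that treats all C(K, 2) pairs as independent (illustrative code, not taken from the note):

```python
from math import comb

def p_distinct_exact(k, n=365):
    # Exact: P(all k birthdays distinct) = prod_{i=0}^{k-1} (n - i) / n
    p = 1.0
    for i in range(k):
        p *= (n - i) / n
    return p

def p_distinct_approx(k, n=365):
    # Exponentiation approximation: each of the C(k, 2) pairs is treated as
    # independently distinct with probability 1 - 1/n — the pairwise
    # independence assumption the note scrutinizes
    return (1 - 1 / n) ** comb(k, 2)
```

For k = 23 the exact value is about 0.4927 while the approximation gives about 0.4995; the gap is precisely what the note's analysis of pairwise dependence explains.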
- Published
- 2024
- Full Text
- View/download PDF
4. Kahneman, Tversky, and Kahneman-Tversky: three ways of thinking.
- Author
-
Johnson-Laird, P. N.
- Subjects
- *
JUDGMENT (Psychology) , *DECISION making , *PSYCHOLOGY , *PROBABILITY theory , *EXPLANATION - Abstract
This homage to Danny Kahneman and Amos Tversky describes how each of them thought about psychology. It outlines the principal results of their collaborative research, which was their most original and most influential. Why? In search of an explanation it examines their joint thinking during their collaboration. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. The case of the disappearing energy: potential energies in concentration gradients.
- Author
-
Hansen, Lee D., Woodfield, Brian F., and Tolley, H. Dennis
- Subjects
- *
CONCENTRATION gradient , *ENERGY levels (Quantum mechanics) , *CONSERVATION of energy , *POTENTIAL energy , *CONSERVATION laws (Physics) - Abstract
This paper reviews observations on processes involving concentration gradients to show that (1) Concentration gradients can do external work during discharge if the system is arranged in a manner that requires it. (2) Work has to be done on the system (i.e. energy has to be added) to create a concentration gradient. (3) Concentration gradients can spontaneously discharge with no change in energy except interaction energy. These three observations are significant since, together, they demonstrate an apparent violation of the law of conservation of energy, which is resolved by proposing that a probability field is a common element of all concentration gradients. This paper thus introduces two new concepts into thermodynamics: (1) Many spontaneous processes occur because of an increase in probability, not because of a decrease in the energy state of the system. (2) Concentration gradients coincide with a probability field and a constraint-dependent and temperature-dependent potential energy. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. External validation of and improvement upon a model for the prediction of placenta accreta spectrum severity using prospectively collected multicenter ultrasound data.
- Author
-
Kolak, Magdalena, Gerry, Stephen, Huras, Hubert, Al Naimi, Ammar, Fox, Karin A., Braun, Thorsten, Stefanovic, Vedran, Beekhuizen, Heleen, Morel, Olivier, Paping, Alexander, Bertholdt, Charline, Calda, Pavel, Lastuvka, Zdenek, Jaworowski, Andrzej, Savukyne, Egle, and Collins, Sally
- Subjects
- *
PLACENTA accreta , *DATABASES , *PREDICTION models , *PLACENTA , *ULTRASONIC imaging - Abstract
This study aimed to validate the Sargent risk stratification algorithm for the prediction of placenta accreta spectrum (PAS) severity using data collected from multiple centers, and to use the multicenter data to improve the model. We conducted a multicenter analysis using data collected for the IS-PAS database. The Sargent model's effectiveness in distinguishing between abnormally adherent placenta (FIGO grade 1) and abnormally invasive placenta (FIGO grades 2 and 3) was evaluated. A new model was developed using multicenter data from the IS-PAS database. The database included 315 cases of suspected PAS, of which 226 had fully documented standardized ultrasound signs. The final diagnosis was normal placentation in 5, abnormally adherent placenta/FIGO grade 1 in 43, and abnormally invasive placenta/FIGO grades 2 and 3 in 178. The external validation of the Sargent model revealed moderate predictive accuracy in a multicenter setting (C-index 0.68), compared to its higher accuracy in a single-center context (C-index 0.90). The newly developed model achieved a C-index of 0.74. The study underscores the difficulty of developing universally applicable PAS prediction models. While models like that of Sargent et al. show promise, their reproducibility varies across settings, likely due to the interpretation of the ultrasound signs. The findings support the need for updating the current ultrasound descriptors and for the development of any new predictive models to use data collected by different operators in multiple clinical settings. [ABSTRACT FROM AUTHOR]
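For a binary outcome, the C-index quoted above reduces to the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen control, with ties counted as half. A minimal sketch, not tied to the Sargent model:

```python
from itertools import product

def c_index(scores, labels):
    """Concordance index for a binary outcome: the fraction of
    (case, control) pairs in which the case received the higher predicted
    risk; tied scores count 0.5."""
    cases = [s for s, y in zip(scores, labels) if y == 1]
    controls = [s for s, y in zip(scores, labels) if y == 0]
    if not cases or not controls:
        raise ValueError("need at least one case and one control")
    concordant = 0.0
    for c, k in product(cases, controls):
        if c > k:
            concordant += 1.0
        elif c == k:
            concordant += 0.5
    return concordant / (len(cases) * len(controls))
```

A value of 0.5 means the model ranks no better than chance, which is why a drop from 0.90 (single center) to 0.68 (multicenter) is a substantial loss of discrimination.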
- Published
- 2024
- Full Text
- View/download PDF
7. How to evaluate the rationality of heuristics?
- Author
-
Nadurak, Vitaliy
- Subjects
- *
LEGAL judgments , *PROBABILITY theory , *HEURISTIC , *DECISION making , *LOGIC - Abstract
One of the most debated topics among those who study heuristics is the question of their rationality. The present paper proposes an answer to this question based on the ideas of instrumental rationality and the probabilistic nature of heuristic judgments and decisions. Accordingly, it is argued that the rationality of heuristics is determined by their effectiveness, i.e., their ability to achieve a desired result. At the same time, heuristics do not always produce such a result, but only in a certain number of cases. Therefore, their effectiveness should not be evaluated by the binary logic criterion (effective/ineffective) but by the probabilistic criterion, i.e., by how often they lead to the desired result. Each heuristic has a certain objective probability of achieving such a result, which determines its rationality. However, this probability is mostly unknown to us, so when relying on heuristics, we are likely guided by a subjective perception of it, which serves as a metacognitive cue about the probability of the conclusion obtained with the help of this heuristic. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
8. The probability that the product of three elements in a finite ring is zero.
- Author
-
Sarma, Dibyasman and Subedi, Tikaram
- Subjects
- *
FINITE rings , *PROBABILITY theory - Abstract
Let R be a finite commutative ring. In this paper, we consider the probability that the product of three randomly chosen elements of R is zero, which we denote by zp3(R). First we obtain bounds for zp3(R) for various classes of rings and then show that for any ring R with identity, zp3(R) ≤ 7/8. Finally, we characterize all commutative rings R with identity within a certain range of zp3(R). [ABSTRACT FROM AUTHOR]
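The bound can be checked by brute force on the small rings Z/nZ (illustrative code, not from the paper). Note that Z/2Z attains 7/8 exactly, since (1, 1, 1) is the only triple with a nonzero product.

```python
from fractions import Fraction
from itertools import product

def zp3_mod(n):
    """zp3(Z/nZ): probability that the product of three independently and
    uniformly chosen elements of Z/nZ is zero, by direct enumeration."""
    hits = sum(1 for a, b, c in product(range(n), repeat=3)
               if (a * b * c) % n == 0)
    return Fraction(hits, n ** 3)
```

For example, `zp3_mod(2)` gives 7/8 and `zp3_mod(3)` gives 19/27, both within the paper's bound for rings with identity.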
- Published
- 2024
- Full Text
- View/download PDF
9. Janina Hosiasson and the value of evidence.
- Author
-
Torsell, Christian
- Subjects
- *
THEORY of knowledge , *ARGUMENT , *PROBABILITY theory - Abstract
I.J. Good's "On the Principle of Total Evidence" (1967) looms large in decision theory and Bayesian epistemology. Good proves that in Savage's (1954) decision theory, a coherent agent always prefers to collect, rather than ignore, free evidence. It is now well known that Good's result was prefigured in an unpublished note by Frank Ramsey (Skyrms 2006). The present paper highlights another early forerunner to Good's argument, appearing in Janina Hosiasson's "Why do We Prefer Probabilities Relative to Many Data?" (1931), that has been neglected in the literature. Section 1 reviews Good's argument and the problem it was meant to resolve; call this the value of evidence problem. Section 2 offers a brief history of the value of evidence problem and provides biographical background to contextualize Hosiasson's contribution. Section 3 explicates the central argument of Hosiasson's paper and considers its relationship to Good's (1967). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Umbilical Cord Blood Gas Pairs with Near-Identical Results: Probability of Arterial or Venous Source.
- Author
-
Monneret, Denis and Stavis, Robert L.
- Subjects
- *
BLOOD gases analysis , *HYDROGEN-ion concentration , *OXYGEN , *PROBABILITY theory , *DESCRIPTIVE statistics , *CORD blood , *CONFIDENCE intervals , *CARBON dioxide , *HYPOXEMIA - Abstract
Objective In studies of concomitant arterial–venous umbilical cord blood gases (CAV-UBGs), approximately 10% of technically valid samples have very similar pH and/or pCO2 values and were probably drawn from the same type of blood vessel. Without a way to objectively determine the source in these cases, it has been argued that most of these same-source CAV-UBGs are venous because the vein is larger and more easily sampled than the artery. This study aimed to calculate the probability of an arterial (ProbAS) or venous source (ProbVS) of same-source CAV-UBGs in the clinically and medicolegally important pH range of 6.70 to 7.25 using a statistical predictive model based on the cord blood gas values. Study Design Starting with a dataset of 56,703 CAV-UBGs, the ProbAS, ProbVS, and respective 95% confidence intervals (CIs) were calculated for the 241 sample pairs with near-identical pH, pCO2, and pO2 values and a pH of 6.70 to 7.25. Using a previously validated generalized additive model, the source was categorized as: Probable Arterial or Highly Probable Arterial if the ProbAS and CIs were >0.5 or >0.8, respectively; Probable Venous or Highly Probable Venous if the ProbVS and CIs were >0.5 or >0.8, respectively; or Indeterminant if the CIs encompassed ProbAS/VS = 0.5. Results A total of 39% of the same-source CAV-UBGs were Probable Arterial, 56% were Probable Venous, and 5% were Indeterminant. However, considering samples with a pH ≤ 7.19, 80% were Probable Arterial and 16% were Probable Venous. Considering the Highly Probable categories, the more acidemic specimens were 9 times more likely to be arterial than venous. Similarly, CAV-UBGs with pCO2 > 8.2 kPa (62 mm Hg) or pO2 ≤ 1.9 kPa (14 mm Hg) were more likely to be in the arterial rather than the venous categories. Conclusion Same-source CAV-UBGs in the more acidemic, hypercarbic, or hypoxemic ranges are more likely to be arterial than venous.
Key Points Umbilical cord arterial/venous gases (CAV-UBGs) with similar values are thought to be mainly venous. A validated statistical model was used to predict the probability of an arterial or venous source. CAV-UBGs with very similar values and pH > 7.19 are likely venous; however, those with pH ≤ 7.19 and/or pCO2 > 8.2 kPa and/or pO2 ≤ 1.9 kPa are more likely arterial. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. Probabilistic optimal power flow computation for power grid including correlated wind sources.
- Author
-
Xiao, Qing, Tan, Zhuangxi, and Du, Min
- Subjects
- *
ELECTRICAL load , *ELECTRIC power distribution grids , *CUMULATIVE distribution function , *MARGINAL distributions , *LATIN hypercube sampling - Abstract
This paper sets out to develop an efficient probabilistic optimal power flow (POPF) algorithm to assess the influence of wind power on the power grid. Given a set of wind data at multiple sites, the marginal distributions are fitted by a newly developed generalized Johnson system, whose parameters are specified by a percentile matching method. The correlation of wind speeds is characterized by a flexible Liouville copula, which allows modelling of asymmetric dependence structures. To improve the efficiency of solving the POPF problem, a lattice sampling method is developed to generate wind samples at multiple sites, and a logistic mixture model is proposed to fit the distributions of POPF outputs. Finally, case studies are performed: the generalized Johnson system is compared with the Weibull distribution and the original Johnson system for fitting wind samples; the Liouville copula is compared against Archimedean copulas for modelling correlated wind samples; and the lattice sampling method is compared with the Sobol sequence and Latin hypercube sampling for solving the POPF problem on the IEEE 118-bus system. The results indicate the higher accuracy of the proposed methods in recovering the joint cumulative distribution function of correlated wind samples, as well as their higher efficiency in calculating statistical information about POPF outputs. [ABSTRACT FROM AUTHOR]
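The paper's lattice sampling method is not reproduced here, but Latin hypercube sampling, one of the baselines it is compared against, can be sketched in pure Python: each dimension's range is cut into n equal strata, each stratum is sampled exactly once, and the strata are shuffled independently per dimension.

```python
import random

def latin_hypercube(n_samples, n_dims, rng=random):
    """One Latin hypercube design on the unit cube [0, 1)^d."""
    columns = []
    for _ in range(n_dims):
        # one uniform draw inside each of the n_samples strata ...
        col = [(k + rng.random()) / n_samples for k in range(n_samples)]
        # ... then a random permutation of the strata for this dimension
        rng.shuffle(col)
        columns.append(col)
    # transpose: a list of n_samples points, each with n_dims coordinates
    return list(zip(*columns))
```

By construction, the projection of the design onto any single dimension occupies every stratum exactly once, which is what gives LHS its variance reduction over plain Monte Carlo sampling.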
- Published
- 2024
- Full Text
- View/download PDF
12. Participants' Utilitarian Choice Is Influenced by Gamble Presentation and Age.
- Author
-
Teal, Joseph, Kusev, Petko, Vukadinova, Siana, Martin, Rose, and Heilman, Renata M.
- Subjects
- *
GAMBLING , *BEHAVIORAL research , *BEHAVIORAL sciences , *DECISION making , *PROBABILITY theory - Abstract
No prior behavioral science research has delved into the impact of gamble presentation (horizontal or vertical) on individuals' utilitarian behavior, despite evidence suggesting that such choices can be influenced by comparing attributes like probability and money in gambles. This article addresses this gap by exploring the influence of gamble presentation on utilitarian behavior. A two-factor independent measures design was employed to explore the influence of the type of gamble presentation and age on participants' utilitarian decision-making preferences. The findings showed a reduced likelihood of participants choosing the non-utilitarian gamble with vertically presented gambles compared to horizontal ones. Consequently, participants' utilitarian behavior was influenced by between-gamble comparisons of available attributes, with utilitarian choices (e.g., choosing Gamble A) being more prevalent in vertical presentations due to a straightforward comparison on the probability attribute. Furthermore, the results also revealed that older participants take more time than their younger counterparts when making utilitarian errors. We attribute this to their abundant knowledge and experience. Future research should explore the comparative psychological processing used by participants in risky decision-making tasks. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. The Road Less Travelled: Keynes and Knight on Probability and Uncertainty.
- Author
-
Gerrard, Bill
- Subjects
- *
PROBABILITY theory , *LOGIC , *BROTHERS , *ARGUMENT , *ENTREPRENEURSHIP - Abstract
Knight's risk/uncertainty distinction is reviewed in its original context as a contribution to the theory of profit. Knight's approach to probability is paralleled by Ludwig von Mises, as emphasised by recent developments in strategic entrepreneurship theory. Von Mises distinguishes between class probability (i.e., risk) and case probability (i.e., uncertainty) in contrast to the frequentist approach of his brother, Richard von Mises. Keynes's contribution to probability and uncertainty is reviewed, focusing on his logical theory of probability in A Treatise on Probability which he more fully contextualised subsequently in the General Theory. Keynes's fragmentary later philosophical writings are reviewed to provide some insight into the contextual issues encountered. The key contributions of Knight and Keynes are summarised as signposts for 'The Road Less Travelled'. The possibilities of a Keynesian-Knightian synthesis as a way forward are considered by comparing these signposts. However, it is concluded that, although there is some common ground between Knight and Keynes, there are fundamental differences particularly associated with the definition of confidence that preclude any meaningful synthesis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. An example showing that the sum of two normal random variables may not be normal.
- Author
-
Fujita, Takahiko and Yoshida, Naohiro
- Subjects
- *
RANDOM variables , *GENERATING functions , *GAUSSIAN distribution , *MATHEMATICS students , *GAMMA functions - Abstract
Two novel proofs that the sum of a specific pair of normal random variables is not normal are established in this note. This is one of the facts most often misunderstood by first-year students in probability theory and statistics. The first proof is concise, using the moment generating function. The second proof checks whether the moments of the sum have the property of a normal distribution. [ABSTRACT FROM AUTHOR]
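The abstract does not reproduce the note's specific pair; a classic construction exhibiting the same phenomenon (marginally normal variables whose sum is not normal) runs as follows, for a fixed threshold c > 0:

```latex
\[
X \sim N(0,1), \qquad
Y = \begin{cases} X, & |X| \le c, \\ -X, & |X| > c. \end{cases}
\]
% By the symmetry of the standard normal density, Y ~ N(0,1) as well. However,
\[
X + Y = 2X \,\mathbf{1}_{\{|X| \le c\}}, \qquad
P(X + Y = 0) = P(|X| > c) > 0,
\]
% so the sum has an atom at zero and cannot be normal. The point is that
% X and Y are each normal but not jointly normal.
```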
- Published
- 2024
- Full Text
- View/download PDF
15. Risk-informed design and safety assessment of structures in a changing climate: a review of U.S. practice and a path forward.
- Author
-
Ghosn, Michel and Ellingwood, Bruce R.
- Subjects
- *
SAFETY standards , *STRUCTURAL reliability , *MAP design , *BRIDGE design & construction , *HAZARDS , *SERVICE life , *CLIMATE change - Abstract
Standards for the design of bridges, buildings and other infrastructure specify design loads for climatic hazards such as temperature, snow, wind, and floods based on return periods presented in maps or tables that account for regional differences. These design loads were developed from statistical analyses of historical hazard data under the assumption that the past is representative of the future. Climate change may affect the frequencies and intensities of environmental hazards which, depending on regional variations, raises questions as to whether structures designed to current specifications will meet minimum safety standards over their future service lives. This paper critically appraises issues related to using historical hazard data for future designs. It reviews basic principles of uniform reliability, that modern design codes use as the basis for ensuring minimum levels of safety, describing the relationship between hazard return periods, structural reliability, risk and the maximum loads expected within a structure's service life. Simple examples involving wind effects on structures demonstrate how to calibrate structural design hazard maps for climate-related extreme events to meet the minimum standards of safety implied in current specifications. The paper also introduces a possible practical approach to account for climate change when designing new structures and assessing the safety of existing facilities. [ABSTRACT FROM AUTHOR]
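The relationship between a hazard's return period and the chance of exceedance over a structure's service life can be made concrete with the standard stationary-climate formula, assuming independence between years (the numbers in the usage note are illustrative, not from the paper):

```python
def exceedance_probability(return_period_years, service_life_years):
    """P(at least one exceedance of the T-year design load during an n-year
    service life), assuming independent years: 1 - (1 - 1/T)^n.

    Under a changing climate the annual probability 1/T is no longer
    constant, which is exactly the assumption the paper questions."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** service_life_years
```

For example, a 50-year load over a 50-year life is exceeded with probability about 0.64, while a 700-year load over the same life is exceeded with probability about 0.07.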
- Published
- 2024
- Full Text
- View/download PDF
16. An Improved Dempster–Shafer Evidence Theory with Symmetric Compression and Application in Ship Probability.
- Author
-
Fang, Ning and Cui, Junmeng
- Subjects
- *
MULTISENSOR data fusion , *INFORMATION resources , *ENTROPY , *PROBABILITY theory , *PEARSON correlation (Statistics) - Abstract
Auxiliary information sources, a subset of target recognition data sources, play a significant role in target recognition. The reliability and importance of these sources can vary, thereby affecting the effectiveness of the data provided. Consequently, it is essential to integrate these auxiliary information sources prior to their utilization for identification. The Dempster–Shafer (DS) evidence theory, a well-established data-fusion method, offers distinct advantages in handling and combining uncertain information. However, where evidence sources conflict and disparities in the basic probability allocation are minimal, DS evidence theory may prove deficient. To address these concerns, this study refined DS evidence theory by introducing the notion of invalid evidence sources and determining the similarity weight of evidence sources through the Pearson correlation coefficient, reflecting the credibility of the evidence. The significance of evidence is characterized by entropy weights, taking into account the uncertainty of the evidence source. The proposed asymptotic adjustment compression function adjusts the basic probability allocation of evidence sources using comprehensive weights, leading to symmetric compression and control of the influence of evidence sources in data fusion. The simulation results and their application in ship target recognition demonstrate that the proposed method successfully incorporates basic probability allocation calculations for ship targets in various environments. In addition, the method effectively integrates data from multiple auxiliary information sources to produce accurate fusion results within an acceptable margin of error, thus validating its efficacy. Its superiority is demonstrated by comparison with other methods that use the calculated weights to weight the basic probability allocation of the evidence sources. [ABSTRACT FROM AUTHOR]
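For reference, the classical Dempster rule of combination that the paper builds on can be sketched as below; this is the unweighted textbook rule, not the paper's weighted variant, and the ship/decoy masses in the usage note are made up for illustration.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments.

    m1, m2: dicts mapping frozenset focal elements to masses summing to 1.
    Mass landing on the empty intersection (conflict) is renormalized away."""
    combined = {}
    conflict = 0.0
    for (a, p), (b, q) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + p * q
        else:
            conflict += p * q
    if conflict >= 1.0:
        raise ValueError("total conflict: the sources cannot be combined")
    scale = 1.0 / (1.0 - conflict)
    return {focal: mass * scale for focal, mass in combined.items()}
```

Combining `{ship: 0.9, decoy: 0.1}` with `{ship: 0.8, decoy: 0.2}` renormalizes away a conflict mass of 0.26. The renormalization step is exactly where the well-known sensitivity to highly conflicting evidence arises, which is what the paper's similarity and entropy weights are designed to temper.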
- Published
- 2024
- Full Text
- View/download PDF
17. The two 'strongest pillars of the empiricist wing': the Vienna Circle, German academia and emigration in the light of correspondence between Philipp Frank and Richard von Mises (1916–1939).
- Author
-
Siegmund-Schultze, Reinhard
- Subjects
- *
VIENNA circle , *EMIGRATION & immigration , *THEORY of knowledge , *SCHOLARS - Abstract
This paper is divided into a surveying and argumentative part and a slightly longer documentary part, which is meant to verify, or at least make more plausible, the claims made in the first part. The first part deals in broad outline with the relationship of Frank and von Mises to the Vienna Circle of Logical Empiricism on the one hand and to the physicists and mathematicians in the German-speaking world on the other. The varying special positions, and in part the non-conformity, of the two Austrian scientists are emphasized: in particular, their adherence to Ernst Mach's epistemology and their shared interest in probability theory and applied mathematics. The impact of emigration and its after-effects in the U.S. are discussed. This leads to new insights into the fine structure of the Vienna Circle and the latter's relationship to German academia within 'Weimar Culture'. P. Forman's interpretation (1971) of von Mises' position is critically discussed. The second, documentary part uses recently discovered correspondence between Frank and von Mises and, to a lesser extent, von Mises' personal diary. It aims at further substantiating some of the introductory theses and will at the same time provide material for a thorough biographical appreciation of the two scholars and friends. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. On the commutativity probability in certain finite groups.
- Author
-
Alajmi, Khaled
- Subjects
- *
FINITE groups , *NONABELIAN groups , *PROBABILITY theory , *CONJUGACY classes , *PERMUTATION groups , *NILPOTENT groups , *PERMUTATIONS - Abstract
The purpose of this paper is to compute the probability Pr(G) that two elements of the group G, drawn at random with replacement, commute; that is, Pr(G) = |{(x, y) ∈ G × G : xy = yx}| / |G × G|, where |G × G| = |G|². In particular, we compute Pr(G) for some groups such as the extraspecial groups of order p³, p prime, for the permutation groups G = S_n and G = A_n, n ≥ 5, for 10 non-abelian groups of order p⁴, and for simple groups of a certain type. [ABSTRACT FROM AUTHOR]
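The definition can be checked by direct enumeration on a small group (illustrative code, not from the paper; S_3 is used here rather than the paper's groups):

```python
from fractions import Fraction
from itertools import permutations, product

def commuting_probability(elements, op):
    """Pr(G) = |{(x, y) in G x G : xy = yx}| / |G|^2 by direct enumeration."""
    commuting = sum(1 for x, y in product(elements, repeat=2)
                    if op(x, y) == op(y, x))
    return Fraction(commuting, len(elements) ** 2)

# S_3 as tuples acting on {0, 1, 2}; composition (p o q)(i) = p[q[i]]
s3 = list(permutations(range(3)))
compose = lambda p, q: tuple(p[q[i]] for i in range(3))
```

Here `commuting_probability(s3, compose)` returns 1/2, consistent with the classical identity Pr(G) = k(G)/|G|, where k(G) is the number of conjugacy classes (k(S_3) = 3, |S_3| = 6).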
- Published
- 2024
19. Real-World Implications of Updated Surviving Sepsis Campaign Antibiotic Timing Recommendations.
- Author
-
Taylor, Stephanie P., Kowalkowski, Marc A., Skewes, Sable, and Shih-Hsiung Chou
- Subjects
- *
SEPSIS , *ANTIBIOTICS , *HOSPITAL patients , *HOSPITAL emergency services , *MEDICAL personnel - Abstract
OBJECTIVE: To evaluate real-world implications of updated Surviving Sepsis Campaign (SSC) recommendations for antibiotic timing. DESIGN: Retrospective cohort study. SETTING: Twelve hospitals in the Southeastern United States between 2017 and 2021. PATIENTS: One hundred sixty-six thousand five hundred fifty-nine adult hospitalized patients treated in the emergency department for suspected serious infection. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: We determined the number and characteristics of patients affected by updated SSC recommendations for initiation of antibiotics that incorporate a risk- and probability-stratified approach. Using an infection prediction model with a cutoff of 0.5 to classify possible vs. probable infection, we found that 30% of the suspected infection cohort would be classified as shock absent, possible infection and thus eligible for the new 3-hour antibiotic recommendation. In real-world practice, this group had a conservative time to antibiotics (median, 5.5 hr; interquartile range [IQR], 3.2-9.8 hr) and low mortality (2%). Patients categorized as shock absent, probable infection had a median time to antibiotics of 3.2 hours (IQR, 2.1-5.1 hr) and mortality of 3%. Patients categorized as shock present, probable infection had a median time to antibiotics of 2.7 hours (IQR, 1.7-4.6 hr) and mortality of 17%, and patients categorized as shock present, possible infection had a median time to antibiotics of 6.9 hours (IQR, 3.5-16.3 hr) and mortality of 12%. CONCLUSIONS: These data support recently updated SSC recommendations to align antibiotic timing targets with risk and probability stratifications. Our results provide empirical support that clinicians and hospitals should not be held to 1-hour targets for patients without shock and with only possible sepsis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. Searching for the General Science of Evidence: Venn on Probability and Induction.
- Author
-
Stergiou, Chrysovalantis, Apostolidis, Alexandros, and Psillos, Stathis
- Subjects
- *
INFERENCE (Logic) , *PROBABILITY theory , *TWENTIETH century - Abstract
In this paper Venn's account of probability inference and induction is examined, tracing their differences as well as how they ‘co-operate’ in inferences from particulars to particulars. We discuss the role of mathematical idealizations in making probability inferences, the celebrated rule of succession and we delve into the nature of the reference class problem arguing that for Venn it is a common problem for both induction and inference in probability. Our approach is both historical and philosophical attempting to sketch Venn's position both in the philosophy of probability and induction of his time and in relation to the twentieth-century frequentism. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. The Effect of Surge on Extreme Wave Impacts and an Insight into Clustering.
- Author
-
Boon, Anna D. and Wellens, Peter R.
- Subjects
- *
ROGUE waves - Abstract
The original goal of the present research is to investigate the influence of surge on green water and slamming. Long-running experiments with forward velocity and irregular waves were repeated with and without surge. Surge is found to increase the probability of green water events, but the impact pressures on deck and the probability of a green water event reaching the deck box decreases when the ship is free to surge. Green water and slamming events turned out to not occur independently as both event types cluster for large probabilities of occurrence. Clusters are caused by large pitch motions. Larger pressures on deck are found for clustered events. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
22. AXIOMS FOR TYPE-FREE SUBJECTIVE PROBABILITY.
- Author
-
CIEŚLIŃSKI, CEZARY, HORSTEN, LEON, and LEITGEB, HANNES
- Subjects
- *
MATHEMATICAL logic , *AXIOMS , *PROBABILITY theory - Abstract
We formulate and explore two basic axiomatic systems of type-free subjective probability. One of them explicates a notion of finitely additive probability. The other explicates a concept of infinitely additive probability. It is argued that the first of these systems is a suitable background theory for formally investigating controversial principles about type-free subjective probability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. Chance and Necessity: Hegel's Epistemological Vision.
- Author
-
Nescolarde-Selva, J., Usó-Doménech, J. L., and Gash, H.
- Subjects
- *
SOCIAL processes , *CAUSATION (Philosophy) , *FREE will & determinism , *DIALECTIC - Abstract
In this paper the authors provide an epistemological view on the old controversy of chance versus necessity. It has been held that either one or the other forms part of the structure of reality. Chance and indeterminism are nothing but a disorderly efficiency of contingency in the production of events, phenomena, and processes, i.e., in their causality, in the broadest sense of the word. Such production may be observed in natural and artificial processes or in human social processes (in history, economics, society, politics, etc.). Here we touch the object par excellence of all scientific research, whether natural or human. In this work, a hypothesis is presented whose practical result satisfies the Hegelian dialectic, with the consequent implication of their mutual reciprocal integration. This means producing abstractions, without which there is no thought or knowledge of any kind, from the concrete, that is, the real problem, which in this case is a given Ontological System or Reality. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Decoding intelligence via symmetry and asymmetry.
- Author
-
Fu, Jianjing and Hsiao, Ching-an
- Subjects
- *
SYMMETRY , *MIND maps , *THEORY of knowledge , *EMOTIONS , *SEMANTICS - Abstract
Humans use pictures to model the world. The structure of a picture maps to mind space to form a concept. When an internal structure matches the corresponding external structure, an observation functions. Whether effective or not, the observation is self-consistent. In epistemology, people often differ from each other on whether a concept is probabilistic or certain. Based on the effect of the presented IG and the pull anti algorithm, we attempt to provide a comprehensive answer to this problem. Using the characters of hidden structures, we explain the difference between the macro and micro levels, and likewise the difference between semantics and probability. In addition, the importance of attention is highlighted through the combination of symmetry and asymmetry included in the presented model, and through the mechanism of chaos and collapse it reveals. Because the subject is involved in the expression of the object, representationalism is not complete. However, people undoubtedly reach a consensus based on the objectivity of the representation. Finally, we suggest that emotions could be used to regulate cognition. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
25. A Bayesian approach using spatiotemporal features for suitable next hop selection in opportunistic networks.
- Author
-
Dutta, Amit, Borah, Satya Jyoti, and Singh, Jagdeep
- Abstract
Summary: Opportunistic network (OppNet) belongs to the category of Mobile Ad-hoc Networks (MANETs), a kind of Delay Tolerant Network (DTN), where the wireless nodes are completely mobile and the data transmission routes are dynamic. The major challenge in developing a routing model for such a network is the unpredictable nature of the movement of the nodes. In this paper, a spatiotemporal prediction model based on human mobility patterns is proposed using Bayesian posterior probability (BPPR), where several clusters are identified within the network and the day and time duration of nodes visiting those clusters are recorded. The Bayesian posterior probability is then used to determine the probability of a neighbor node visiting the destination's cluster. If the calculated probability for that node is higher than a specified threshold, the packet is forwarded. A comparison of the simulation results is made with benchmark models (Epidemic, Prophet, HBPR, EDR, NexT, and EBC), where it is found that, in terms of delivery probability, the proposed model on average outperforms these models by around 23.89%, 24.8%, 24.4%, 37%, 11%, and 42%, respectively, with varying number of nodes, TTL, message generation interval, and buffer size. Similar improvements are observed for the other two metrics, hop count and number of messages dropped. In terms of overhead ratio, the proposed model outperforms Epidemic, Prophet, HBPR, NexT, and EBC. However, as the number of nodes and TTL are varied, BPPR performs better than NexT by around 9% and 12%, respectively, in terms of average latency. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
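The forwarding rule in the BPPR abstract above reduces to a Bayes computation over recorded visit counts. A minimal sketch, with all class and method names hypothetical and the threshold chosen arbitrarily:

```python
from collections import defaultdict

class BPPRForwarder:
    """Sketch of the Bayesian next-hop rule described above (names hypothetical).

    visits[node] is a list of (cluster, day, slot) observations recorded
    whenever the node was seen inside a cluster.
    """
    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self.visits = defaultdict(list)

    def record_visit(self, node, cluster, day, slot):
        self.visits[node].append((cluster, day, slot))

    def posterior(self, node, cluster, day, slot):
        # Bayes' rule on empirical counts:
        # P(c | d, s) = P(d, s | c) * P(c) / P(d, s)
        obs = self.visits[node]
        if not obs:
            return 0.0
        n = len(obs)
        in_c = [o for o in obs if o[0] == cluster]
        at_ds = [o for o in obs if o[1] == day and o[2] == slot]
        if not at_ds or not in_c:
            return 0.0
        p_c = len(in_c) / n
        p_ds_given_c = sum(1 for o in in_c if o[1] == day and o[2] == slot) / len(in_c)
        p_ds = len(at_ds) / n
        return p_ds_given_c * p_c / p_ds

    def should_forward(self, node, dest_cluster, day, slot):
        # Forward only if the neighbor is likely enough to visit the
        # destination's cluster at this day/time slot.
        return self.posterior(node, dest_cluster, day, slot) > self.threshold
```

Because the counts are empirical frequencies, the posterior here simplifies to the fraction of a node's observations at the given day and time slot that fall in the destination's cluster.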
26. The pupil dilation response as an indicator of visual cue uncertainty and auditory outcome surprise.
- Author
-
Becker, Janika, Viertler, Marvin, Korn, Christoph W., and Blank, Helen
- Subjects
- *
PUPILLARY reflex , *AUDITORY perception , *PUPILLOMETRY , *VOWELS - Abstract
In everyday perception, we combine incoming sensory information with prior expectations. Expectations can be induced by cues that indicate the probability of upcoming sensory events. The information provided by cues may differ and hence lead to different levels of uncertainty about which event will follow. In this experiment, we employed pupillometry to investigate whether the pupil dilation response to visual cues varies depending on the level of cue-associated uncertainty about a following auditory outcome. We also tested whether the pupil dilation response reflects the amount of surprise about the subsequently presented auditory stimulus. In each trial, participants were presented with a visual cue (a face image) which was followed by an auditory outcome (a spoken vowel). After the face cue, participants had to indicate by keypress which of three auditory vowels they expected to hear next. We manipulated the cue-associated uncertainty by varying the probabilistic cue-outcome contingencies: one face was most likely followed by one specific vowel (low cue uncertainty), another face was equally likely followed by either of two vowels (intermediate cue uncertainty), and the third face was followed by all three vowels (high cue uncertainty). Our results suggest that pupil dilation in response to task-relevant cues depends on the associated uncertainty, but only for large differences in the cue-associated uncertainty. Additionally, in response to the auditory outcomes, the pupil dilation scaled negatively with the cue-dependent probabilities, likely signalling the amount of surprise. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
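The three cue conditions above differ in the Shannon entropy of their cue-outcome contingencies, which is one standard way to quantify cue-associated uncertainty. The specific probabilities below are hypothetical illustrations, not the study's values:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete outcome distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical vowel contingencies mirroring the three cue conditions:
low = entropy([0.88, 0.06, 0.06])     # one vowel dominates -> low uncertainty
mid = entropy([0.47, 0.47, 0.06])     # two vowels roughly equally likely
high = entropy([1/3, 1/3, 1/3])       # all three equally likely -> maximal
```

The uniform three-way contingency attains the maximum possible entropy, log2(3) ≈ 1.585 bits, so the three conditions order cleanly from low to high uncertainty.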
27. Markov Chains and Kinetic Theory: A Possible Application to Socio-Economic Problems.
- Author
-
Carbonaro, Bruno and Menale, Marco
- Subjects
- *
MARKOV processes , *STOCHASTIC processes , *BOLTZMANN'S equation , *ANIMAL populations , *TRAFFIC flow - Abstract
A very important class of models widely used nowadays to describe and predict, at least in stochastic terms, the behavior of many-particle systems (where the word "particle" is not meant in the purely mechanical sense: particles can be cells of a living tissue, or cars in a traffic flow, or even members of an animal or human population) is the Kinetic Theory for Active Particles, i.e., a scheme of possible generalizations and re-interpretations of the Boltzmann equation. Now, though in the literature on the subject this point is systematically disregarded, this scheme is based on Markov Chains, which are special stochastic processes with important properties they share with many natural processes. This circumstance is here carefully discussed not only to suggest the different ways in which Markov Chains can intervene in equations describing the stochastic behavior of any many-particle system, but also, as a preliminary methodological step, to point out the way in which the notion of a Markov Chain can be suitably generalized to this aim. As a final result of the discussion, we show how to develop new, plausible ways to take into account possible effects of the external world on a non-isolated many-particle system, with particular attention paid to socio-economic problems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. The Best Time to Play the Lottery.
- Author
-
Rump, Christopher M.
- Abstract
The best time to play the lottery is when the jackpot has rolled over several times and grown large, but not so large that you must share the prize if you win. We examine maximizing the expected value of a winning ticket as well as that of a random ticket. The derived optimality criteria depend on the prize elasticity of ticket demand. A regression analysis on data obtained from the Mega Millions® and Powerball® multi-state lotteries suggests ticket sales grow quadratically in the size of the advertised lump-sum cash jackpot prize. With quadratic growth, the best time to play is when ticket sales are 1.25–2.5 times the jackpot odds, currently about 300 M to one for these two lotteries. Since ticket sales are not known to ticket buyers, we invert the regression function to prescribe the best time to play in terms of the cash prize. It turns out that these lotteries offer a (pretax) fair wager with positive expected value in a surprisingly wide interval of jackpot prizes. That is a good time to play; the best time is in the neighborhood of the nearly $1 B record cash jackpot awarded in these lotteries in recent years. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
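The sharing effect described above can be made concrete: if the number of co-winners is modelled as Poisson with mean lam = tickets_sold / odds, the standard identity E[1/(1+K)] = (1 − e^(−lam))/lam gives the expected prize kept by a winning ticket. A sketch under that assumption (function names are illustrative, and minor prizes are ignored):

```python
import math

def expected_winning_share(jackpot, tickets_sold, odds):
    """Expected prize for a winning ticket when co-winners split the jackpot.

    Other winners are modelled as Poisson with mean lam = tickets_sold / odds,
    so E[jackpot / (1 + K)] = jackpot * (1 - exp(-lam)) / lam.
    """
    lam = tickets_sold / odds
    if lam == 0:
        return jackpot
    return jackpot * (1 - math.exp(-lam)) / lam

def ticket_expected_value(jackpot, tickets_sold, odds):
    """Pretax expected jackpot value of one random ticket (minor prizes ignored)."""
    return expected_winning_share(jackpot, tickets_sold, odds) / odds
```

At sales equal to the jackpot odds (lam = 1), a winner keeps only about 63% of the jackpot in expectation, which is why a ticket's expected value grows more slowly than the jackpot once sales outpace the odds.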
29. The governance of possible futures and the regime of modern historicity: Critical theory and the modality of possibility.
- Author
-
Guéguen, Haud and Jeanpierre, Laurent
- Subjects
- *
CRITICAL theory , *MODAL logic , *HISTORICITY , *POSSIBILITY , *GIFT giving - Abstract
The inaugural project of German Critical theory was to break away from the cult of facts in order to investigate the real possibilities of the present. Part of sociology has also made the possible and the relationship to the possible its central object. Such a task has met with a considerable effort on the part of government agencies to pre-empt the legitimate definition of what is possible. The social sciences were mobilised to this end. We offer a schematic account of these efforts in order to situate the extent to which the definition of possible futures is an issue of struggle in which Critical theory and sociology have a role to play. The article examines the question of the future through its close link to the category of the possible, at two levels that are often treated separately: at the level of the problem of 'governmentality' and its close link to the question of forecasting and probability; and at the level of the 'regimes of historicity' from which to consider the possible with a view to collectively reappropriating the determination of the future. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Development of the Korean construction job exposure matrix (KoConJEM) based on experts' judgment using the 60 consolidated occupations for construction workers.
- Author
-
Choi, Sangjun, Lee, Kwang Min, Park, Hyunhee, Shim, Gyu-Beom, Lee, Sun Woo, Kim, Yoon-Ji, Lee, Eun-Soo, Kim, Youngki, Kang, Dongmug, Park, Ju-Hyun, and Kim, Se-Yeong
- Subjects
- *
RISK assessment , *DASHBOARDS (Management information systems) , *COLD (Temperature) , *OCCUPATIONS , *RESEARCH funding , *NOISE , *OCCUPATIONAL hazards , *PROBABILITY theory , *WORK environment , *HEAT , *OCCUPATIONAL exposure , *STATISTICS , *LIFTING & carrying (Human mechanics) , *COMPARATIVE studies , *HAZARDOUS substances , *POSTURE , *CONSTRUCTION industry , *INDUSTRIAL hygiene , *INDUSTRIAL safety - Abstract
Background: This study was conducted to develop a Korean construction job exposure matrix (KoConJEM) based on 60 occupations recently consolidated by the construction workers' mutual aid association for use by the construction industry. Methods: The probability, intensity, and prevalence of exposure to 26 hazardous agents for 60 consolidated occupations were evaluated as binary (Yes/No) or in four categories (1 to 4) by 30 industrial hygiene experts. The risk score was calculated by multiplying the exposure intensity by the prevalence of exposure. Fleiss' kappa for each hazardous agent and occupation was used to determine agreement among the 30 experts. The JEM was expressed as a heatmap and a web-based dashboard to facilitate comparison of factors affecting exposure according to each occupation and hazardous agent. Results: Awkward posture, heat/cold, heavy lifting, and noise were hazardous agents for which at least one expert regarded exposure as probable in all occupations, while exposure to asphalt fumes was considered hazardous in the smallest number of occupations (n = 5). Based on the degree of agreement among experts, more than half of the harmful factors and most occupations showed fair to good agreement. The highest risk value was 16, for awkward posture in most occupations other than safety officer. Conclusions: The KoConJEM provides information on the probability, intensity, and prevalence of exposure to harmful factors for most occupations employing construction workers; therefore, it may be useful in the conduct of epidemiological studies on assessment of health risk for construction workers. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
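The risk score used above is a simple product of two 1–4 ratings, which is why its maximum value is 16. A one-function sketch:

```python
def risk_score(intensity, prevalence):
    """Risk = exposure intensity x exposure prevalence, each rated 1-4,
    giving scores from 1 to 16 as in the abstract above."""
    assert 1 <= intensity <= 4 and 1 <= prevalence <= 4
    return intensity * prevalence
```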
31. Establish an agricultural drought index that is independent of historical element probabilities.
- Author
-
Pan, Yongdi, Xiao, Jingjing, and Pan, Yanhua
- Subjects
- *
DROUGHT management , *AGRICULTURE , *METEOROLOGICAL stations , *METEOROLOGICAL observations , *DROUGHTS , *WATER rights , *PROBABILITY theory - Abstract
Currently, there are three main shortcomings in meteorological drought indices: first, they rely on historical climate probability functions; second, the timescale used in calculations has a certain degree of subjectivity; third, the same index value may correspond to vastly different levels of actual drought in different climate types of regions. The purpose of this article is to establish a meteorological drought index that does not rely on historical meteorological element probability functions. Through theoretical derivation, four drought‐level maintenance lines are established on the cumulative precipitation‐cumulative water surface evaporation coordinate plane, and the coordinate quadrant is divided into five drought‐level areas. Through forward daily rolling accumulation, the maximum distance point is selected from the dynamically changing coordinate points to determine the corresponding cumulative precipitation and cumulative evaporation. The meteorological drought index is established by the distance from the selected coordinate point to each drought‐level maintenance line. Using daily precipitation and evaporation data from meteorological observation stations, the index is calculated based on the established meteorological drought index model, and compared with actual drought evolution and drought disaster records. The results show that the index can capture the development of drought well, and its changes are very consistent with drought disaster records. The index is of great significance for drought monitoring or assessment, and can provide guidance for water resource allocation, crop layout, and urban planning. Furthermore, it can also provide a way of thinking that does not rely on historical element probabilities for future drought research. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. RT-QuIC detection of chronic wasting disease prion in platelet samples of white-tailed deer.
- Author
-
Kobashigawa, Estela, Russell, Sherri, Zhang, Michael Z., Sinnott, Emily A., Connolly, Michael, and Zhang, Shuping
- Subjects
- *
CHRONIC wasting disease , *WHITE-tailed deer , *PRION diseases , *BLOOD platelets , *SCRAPIE , *HIGH throughput screening (Drug development) - Abstract
Background: Chronic wasting disease (CWD) is a prion disease of captive and free-ranging cervids. Currently, a definitive diagnosis of CWD relies on immunohistochemistry (IHC) detection of PrPSc in the obex and retropharyngeal lymph node (RPLN) of the affected cervids. For high-throughput screening of CWD in wild cervids, RPLN samples are tested by ELISA followed by IHC confirmation of positive results. Recently, real-time quaking-induced conversion (RT-QuIC) has been used to detect CWD positivity in various types of samples. To develop a blood RT-QuIC assay suitable for CWD diagnosis, this study evaluated the assay sensitivity and specificity with and without ASR1-based preanalytical enrichment and NaI as the main ionic component in assay buffer. Results: A total of 23 platelet samples derived from CWD-positive deer (ELISA+/IHC+) and 30 platelet samples from CWD-negative (ELISA−) deer were tested. The diagnostic sensitivity was 43.48% (NaCl), 65.22% (NaI), 60.87% (NaCl-ASR1) or 82.61% (NaI-ASR1). The diagnostic specificity was 96.67% (NaCl), 100% (NaI), 100% (NaCl-ASR1), or 96.67% (NaI-ASR1). The probability of detecting CWD prion in platelet samples derived from CWD-positive deer was 0.924 (95% CRI: 0.714, 0.989) under the NaI-ASR1 experimental condition and 0.530 (95% CRI: 0.156, 0.890) under the NaCl-alone condition. The rate of amyloid formation (RFA) was greatest under the NaI-ASR1 condition at the 10⁻² (0.01491, 95% CRI: 0.00675, 0.03384) and 10⁻³ (0.00629, 95% CRI: 0.00283, 0.01410) sample dilution levels. Conclusions: Incorporation of ASR1-based preanalytical enrichment and NaI as the main ionic component significantly improved the sensitivity of CWD RT-QuIC on deer platelet samples. Blood testing by the improved RT-QuIC assay may be used for antemortem and postmortem diagnosis of CWD. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
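The reported sensitivity and specificity figures follow from simple proportions over the 23 CWD-positive and 30 CWD-negative samples; the counts below are back-calculated from the percentages in the abstract:

```python
def sensitivity(true_pos, n_pos):
    """Fraction of truly positive samples the assay detects."""
    return true_pos / n_pos

def specificity(true_neg, n_neg):
    """Fraction of truly negative samples the assay correctly rejects."""
    return true_neg / n_neg

# Counts back-calculated from the abstract (23 positives, 30 negatives):
sens_nai_asr1 = sensitivity(19, 23)  # 82.61% under the NaI-ASR1 condition
spec_nacl = specificity(29, 30)      # 96.67% under NaCl alone
```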
33. The Metaphysical Foundations of the Principle of Indifference.
- Author
-
Eisner, Binyamin
- Subjects
- *
APATHY , *QUANTUM mechanics - Abstract
The arguments in favor of the Principle of Indifference fail to explain its fruitfulness in science. Using the recent metaphysical concept of Grounding, I devise an explanation that can justify a weak version of the principle and discuss an instance of its application in Quantum mechanics. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
34. Probabilistic Forecasting of Lightning Strikes over the Continental USA and Alaska: Model Development and Verification.
- Author
-
Nikolov, Ned, Bothwell, Phillip, and Snook, John
- Subjects
- *
THUNDERSTORMS , *LIGHTNING , *ELECTRIC charge , *HUMIDITY , *PRINCIPAL components analysis , *GEOPOTENTIAL height - Abstract
Lightning is responsible for most of the area burned annually by wildfires in the extratropical region of the Northern Hemisphere. Hence, predicting the occurrence of wildfires requires reliable forecasting of the chance of cloud-to-ground lightning strikes during storms. Here, we describe the development and verification of a probabilistic lightning-strike algorithm running on a uniform 20 km grid over the continental USA and Alaska. This is the first and only high-resolution lightning forecasting model for North America derived from 29-year-long data records. The algorithm consists of a large set of regional logistic equations parameterized on the long-term data records of observed lightning strikes and meteorological reanalysis fields from NOAA. Principal Component Analysis was employed to extract 13 principal components from a list of 611 potential predictors. Our analysis revealed that the occurrence of cloud-to-ground lightning strikes primarily depends on three factors: the temperature and geopotential heights across vertical pressure levels, the amount of low-level atmospheric moisture, and wind vectors. These physical variables isolate the conditions that are favorable for the development of thunderstorms and impact the vertical separation of electric charges in the lower troposphere during storms, which causes the voltage potential between the ground and the cloud deck to increase to a level that triggers electrical discharges. The results from a forecast verification using independent data showed excellent model performance, thus making this algorithm suitable for incorporation into models designed to forecast the chance of wildfire ignitions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
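Each regional equation in the abstract above is a logistic regression over principal-component predictors. A minimal sketch (the coefficients are hypothetical placeholders, not the model's fitted values):

```python
import math

def strike_probability(components, coefficients, intercept):
    """Logistic regression: P = 1 / (1 + exp(-(b0 + sum(bi * xi)))).

    components: principal-component scores for one grid cell;
    coefficients / intercept: hypothetical fitted values for illustration.
    """
    z = intercept + sum(b * x for b, x in zip(coefficients, components))
    return 1.0 / (1.0 + math.exp(-z))
```

With zero-valued component scores the linear predictor equals the intercept, so an intercept of 0 yields a 50% strike probability; positive scores on positively weighted components push the probability above that.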
35. The 2022 European Union report on pesticide residues in food.
- Author
-
Carrasco Cabrera, Luis, Di Piazza, Giulio, Dujardin, Bruno, Marchese, Emanuela, and Medina Pastor, Paula
- Subjects
- *
PESTICIDE residues in food , *FOOD laws , *PESTICIDE pollution , *RISK managers , *CONSUMER protection , *FOOD safety - Abstract
Under European Union legislation (Article 32, Regulation (EC) No 396/2005), the European Food Safety Authority provides an annual report assessing the pesticide residue levels in foods on the European market. In 2022, 96.3% of the overall 110,829 samples analysed fell below the maximum residue level (MRL), 3.7% exceeded this level, of which 2.2% were non-compliant, i.e. results in a given sample exceeded the MRL after taking into account the measurement uncertainty. For the EU-coordinated multiannual control programme subset, 11,727 samples were analysed, of which 0.9% were non-compliant. To assess acute and chronic risk to consumer health, dietary exposure to pesticide residues was estimated and compared with available health-based guidance values (HBGV). The probabilistic assessment methodology was consolidated and extended to all pesticides listed in the 2022 EU Regulation, providing the probability of a consumer being exposed to an exceedance of the HBGV. Overall, the assessed risk to EU consumers' health is low. Recommendations to risk managers are given to increase the effectiveness of European control systems and to ensure a high level of consumer protection throughout the EU. This publication is linked to the following EFSA Supporting Publications article: http://onlinelibrary.wiley.com/doi/10.2903/sp.efsa.2024.EN-8751/full [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
36. A Stochastic Model of Mathematics and Science.
- Author
-
Wolpert, David H. and Kinney, David B.
- Abstract
We introduce a framework that can be used to model both mathematics and human reasoning about mathematics. This framework involves stochastic mathematical systems (SMSs), which are stochastic processes that generate pairs of questions and associated answers (with no explicit referents). We use the SMS framework to define normative conditions for mathematical reasoning, by defining a “calibration” relation between a pair of SMSs. The first SMS is the human reasoner, and the second is an “oracle” SMS that can be interpreted as deciding whether the question–answer pairs of the reasoner SMS are valid. To ground thinking, we understand the answers to questions given by this oracle to be the answers that would be given by an SMS representing the entire mathematical community in the infinite long run of the process of asking and answering questions. We then introduce a slight extension of SMSs to allow us to model both the physical universe and human reasoning about the physical universe. We then define a slightly different calibration relation appropriate for the case of scientific reasoning. In this case the first SMS represents a human scientist predicting the outcome of future experiments, while the second SMS represents the physical universe in which the scientist is embedded, with the question–answer pairs of that SMS being specifications of the experiments that will occur and the outcome of those experiments, respectively. Next we derive conditions justifying two important patterns of inference in both mathematical and scientific reasoning: (i) the practice of increasing one’s degree of belief in a claim as one observes increasingly many lines of evidence for that claim, and (ii) abduction, the practice of inferring a claim’s probability of being correct from its explanatory power with respect to some other claim that is already taken to hold for independent reasons. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
37. Comparative study on chromatin loop callers using Hi-C data reveals their effectiveness.
- Author
-
Chowdhury, H. M. A. Mohit, Boult, Terrance, and Oluwadare, Oluwatosin
- Abstract
Background: The chromosome is one of the most fundamental structures in cell biology, where DNA holds hierarchical information. DNA compacts its size by forming loops, and these regions house various protein particles, including CTCF, SMC3, and the H3 histone. Numerous sequencing methods, such as Hi-C, ChIP-seq, and Micro-C, have been developed to investigate these properties. Utilizing these data, scientists have developed a variety of loop prediction techniques that have greatly improved methods for characterizing loop prediction and related aspects. Results: In this study, we categorized 22 loop calling methods and conducted a comprehensive study of 11 of them. Additionally, we have provided detailed insights into the methodologies underlying these algorithms for loop detection, categorizing them into five distinct groups based on their fundamental approaches. Furthermore, we have included critical information such as resolution, input and output formats, and parameters. For this analysis, we utilized the GM12878 Hi-C datasets at 5 KB, 10 KB, 100 KB and 250 KB resolutions. Our evaluation criteria encompassed various factors, including memory usage, running time, sequencing depth, and recovery of protein-specific sites such as CTCF, H3K27ac, and RNAPII. Conclusion: This analysis offers insights into the loop detection processes of each method, along with the strengths and weaknesses of each, enabling readers to choose suitable methods for their datasets. We evaluate the capabilities of these tools and introduce a novel Biological, Consistency, and Computational robustness score (BCC score) to measure their overall robustness, ensuring a comprehensive evaluation of their performance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. A comparison of human and GPT-4 use of probabilistic phrases in a coordination game.
- Author
-
Maloney, Laurence T., Dal Martello, Maria F., Fei, Vivian, and Ma, Valerie
- Abstract
English speakers use probabilistic phrases such as likely to communicate information about the probability or likelihood of events. Communication is successful to the extent that the listener grasps what the speaker means to convey and, if communication is successful, individuals can potentially coordinate their actions based on shared knowledge about uncertainty. We first assessed human ability to estimate the probability and the ambiguity (imprecision) of twenty-three probabilistic phrases in a coordination game in two different contexts, investment advice and medical advice. We then had GPT-4 (OpenAI), a Large Language Model, complete the same tasks as the human participants. We found that GPT-4’s estimates of probability both in the Investment and Medical Contexts were as close or closer to that of the human participants as the human participants’ estimates were to one another. However, further analyses of residuals disclosed small but significant differences between human and GPT-4 performance. Human probability estimates were compressed relative to those of GPT-4. Estimates of probability for both the human participants and GPT-4 were little affected by context. We propose that evaluation methods based on coordination games provide a systematic way to assess what GPT-4 and similar programs can and cannot do. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
39. Laypeople’s interpretations of ‘high confidence’.
- Author
-
Pennekamp, Pia and Mansour, Jamal K.
- Abstract
High confidence has been associated with high accuracy under certain conditions. Yet, how researchers operationalize ‘high confidence’ varies across publications and depends on who is asked. In this study, we collected numeric interpretations to determine thresholds for high confidence. Layperson participants provided a minimum, best, and maximum estimate for ‘high confidence’ in an eyewitness lineup decision on a scale of 0-100. The distribution of best estimates peaked at 90.90%. The peak value for the minimum estimate was 83.80%. Critically, the distributions of responses were highly variable: 68.27% of participants (one standard deviation around the mean) provided best estimates between 79% and 97% and minimum estimates between 60% and 93%. This variability in laypeople’s perceptions implies there is likely to be considerable variability in how jurors and practitioners interpret confidence. Research and practice would benefit from a standardized definition of what constitutes ‘high confidence.’ [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
40. Statistical approaches to evaluate in vitro dissolution data against proposed dissolution specifications.
- Author
-
Li, Fasheng, Nickerson, Beverly, Van Alstine, Les, and Wang, Ke
- Abstract
In vitro dissolution testing is a regulatory required critical quality measure for solid dose pharmaceutical drug products. Setting the acceptance criteria to meet compendial criteria is required for a product to be filed and approved for marketing. Statistical approaches for analyzing dissolution data, setting specifications and visualizing results could vary according to product requirements, company's practices, and scientific judgements. This paper provides a general description of the steps taken in the evaluation and setting of in vitro dissolution specifications at release and on stability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
41. Confidence distributions for treatment effects in clinical trials: Posteriors without priors.
- Author
-
Marschner, Ian C.
- Subjects
- *
CLINICAL trials , *TREATMENT effectiveness , *DISTRIBUTION (Probability theory) , *FREQUENTIST statistics , *BAYESIAN analysis - Abstract
An attractive feature of using a Bayesian analysis for a clinical trial is that knowledge and uncertainty about the treatment effect is summarized in a posterior probability distribution. Researchers often find probability statements about treatment effects highly intuitive and the fact that this is not accommodated in frequentist inference is a disadvantage. At the same time, the requirement to specify a prior distribution in order to obtain a posterior distribution is sometimes an artificial process that may introduce subjectivity or complexity into the analysis. This paper considers a compromise involving confidence distributions, which are probability distributions that summarize uncertainty about the treatment effect without the need for a prior distribution and in a way that is fully compatible with frequentist inference. The concept of a confidence distribution provides a posterior–like probability distribution that is distinct from, but exists in tandem with, the relative frequency interpretation of probability used in frequentist inference. Although they have been discussed for decades, confidence distributions are not well known among clinical trial statisticians and the goal of this paper is to discuss their use in analyzing treatment effects from randomized trials. As well as providing an introduction to confidence distributions, some illustrative examples relevant to clinical trials are presented, along with various case studies based on real clinical trials. It is recommended that trial statisticians consider presenting confidence distributions for treatment effects when reporting analyses of clinical trials. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
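Under a normal approximation, the confidence distribution for a treatment effect is simply N(estimate, se²), and posterior-like probability statements can be read off its CDF without specifying any prior. A sketch with hypothetical trial numbers:

```python
import math

def confidence_distribution_cdf(theta, estimate, se):
    """Normal-approximation confidence distribution for a treatment effect:
    H(theta) = Phi((theta - estimate) / se). H(t) is the confidence assigned
    to effect values below t, obtained without a prior distribution."""
    return 0.5 * (1.0 + math.erf((theta - estimate) / (se * math.sqrt(2.0))))

# Hypothetical trial: estimated effect 2.0 with standard error 1.0.
# "Confidence" that the true effect is positive:
p_effect_positive = 1.0 - confidence_distribution_cdf(0.0, 2.0, 1.0)
```

With estimate 2.0 and standard error 1.0, the statement "the effect is positive" receives confidence Phi(2) ≈ 0.977, mirroring how a Bayesian posterior tail probability would be reported.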
42. Probability Calculation for Utilization of Photovoltaic Energy in Electric Vehicle Charging Stations.
- Author
-
Belany, Pavol, Hrabovsky, Peter, and Florkova, Zuzana
- Subjects
- *
ELECTRIC vehicle charging stations , *ELECTRIC vehicles , *ENERGY consumption , *ELECTRIC charge , *RENEWABLE energy sources , *ARTIFICIAL neural networks - Abstract
In recent years, there has been a growing emphasis on the efficient utilization of natural resources across various facets of life. One such area of focus is transportation, particularly electric mobility in conjunction with the deployment of renewable energy sources. To fully realize this objective, it is crucial to quantify the probability of achieving the desired state—production exceeding consumption. This article deals with the computation of the probability that the energy required to charge an electric vehicle will originate from a renewable source at a specific time and for a predetermined charging duration. The model is based on artificial neural networks, which serve as an ancillary tool for the actual probability assessment. The neural networks are used to forecast energy production and consumption. After these data are processed, the probability of energy availability for a given day and month is determined. A total of seven scenarios are calculated, representing the individual days of the week. These findings can help users decide when, and for how long, to connect their electric vehicle to a charging station to receive assured clean energy from a local photovoltaic source. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
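The probability this abstract describes (production covering consumption during a charging session) can be sketched with a simple Monte Carlo estimate. In the sketch below the neural-network forecasts are replaced by assumed normal forecast-error distributions, and all figures (forecast means, spreads, charger power) are hypothetical:

```python
# Monte Carlo sketch of P(PV production covers base load + EV charging),
# standing in for the paper's neural-network-based assessment.
# All numbers are hypothetical.
import random

random.seed(42)

def p_production_covers_charging(prod_mean, prod_sd, cons_mean, cons_sd,
                                 ev_load_kw, n_samples=100_000):
    """Estimate P(production >= consumption + EV load) by sampling."""
    hits = 0
    for _ in range(n_samples):
        production = max(0.0, random.gauss(prod_mean, prod_sd))
        consumption = max(0.0, random.gauss(cons_mean, cons_sd))
        if production >= consumption + ev_load_kw:
            hits += 1
    return hits / n_samples

# Hypothetical midday hour: 8 kW PV forecast, 3 kW base load, 3.7 kW charger.
p = p_production_covers_charging(8.0, 1.5, 3.0, 0.8, 3.7)
print(f"P(charging from PV) ~ {p:.2f}")
```

Repeating the calculation per weekday and per hour of the day would yield the seven scenario tables the article describes, from which a user can pick a favorable charging window.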
43. NECESSARY AND SUFFICIENT CONDITIONS FOR DOMINATION RESULTS FOR PROPER SCORING RULES.
- Author
-
PRUSS, ALEXANDER R.
- Subjects
- *
FORECASTING , *CALCULUS , *PROBABILITY theory - Abstract
Scoring rules measure the deviation between a forecast, which assigns degrees of confidence to various events, and reality. Strictly proper scoring rules have the property that the mathematical expectation of the score of a forecast p, by the lights of p, is strictly better than the mathematical expectation of the score of any other forecast q, by the lights of p. Forecasts need not satisfy the axioms of the probability calculus, but Predd et al. [9] have shown that, given a finite sample space and any strictly proper additive and continuous scoring rule, the score for any forecast that does not satisfy the axioms of probability is strictly dominated by the score for some probabilistically consistent forecast. Recently, this result has been extended to non-additive continuous scoring rules. In this paper, a condition weaker than continuity is given that suffices for the result, and the condition is proved to be optimal. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
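The domination result above can be illustrated with the Brier score, a familiar strictly proper, additive, continuous scoring rule. The toy example below (not from the paper) shows an incoherent forecast on a two-event partition being strictly dominated, on every possible outcome, by its Euclidean projection onto the probability simplex:

```python
# Toy illustration of the domination result for the Brier score.

def brier_score(forecast, outcome):
    """Lower is better: sum of squared deviations from the realized indicators."""
    return sum((p - (1.0 if i == outcome else 0.0)) ** 2
               for i, p in enumerate(forecast))

# Incoherent credences on the partition {A, not-A}: they sum to 1.3.
p = [0.7, 0.6]

# Euclidean projection onto the simplex; subtracting the excess equally
# suffices here because both coordinates stay positive.
excess = (sum(p) - 1.0) / len(p)
q = [x - excess for x in p]          # [0.55, 0.45], a coherent forecast

for outcome in (0, 1):               # whichever event actually occurs...
    assert brier_score(q, outcome) < brier_score(p, outcome)
print("coherent projection", q, "strictly dominates", p)
```

The paper's contribution is to weaken the continuity assumption on the scoring rule while keeping this domination guarantee, and to show the weakened condition is optimal.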
44. IS CAUSAL REASONING HARDER THAN PROBABILISTIC REASONING?
- Author
-
MOSSÉ, MILAN, IBELING, DULIGUR, and ICARD, THOMAS
- Subjects
- *
CAUSAL inference , *CONDITIONAL probability , *FORMAL languages , *INFERENTIAL statistics , *CONDITIONALS (Logic) , *COMPLETENESS theorem - Abstract
Many tasks in statistical and causal inference can be construed as problems of entailment in a suitable formal language. We ask whether those problems are more difficult, from a computational perspective, for causal probabilistic languages than for pure probabilistic (or "associational") languages. Despite several senses in which causal reasoning is indeed more complex, both expressively and inferentially, we show that causal entailment (or satisfiability) problems can be systematically and robustly reduced to purely probabilistic problems. Thus there is no jump in computational complexity. Along the way we answer several open problems concerning the complexity of well-known probability logics, in particular demonstrating the $\exists \mathbb{R}$-completeness of a polynomial probability calculus, as well as a seemingly much simpler system, the logic of comparative conditional probability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
45. Inherent and probabilistic naturalness.
- Author
-
Gasparri, Luca
- Subjects
- *
ORAL communication , *VOCABULARY , *NATURALNESS (Linguistics) , *SEMANTICS , *PERTURBATION theory - Abstract
Standard accounts hold that regularities of behavior must be arbitrary to constitute a convention. Yet, there is growing consensus that conventionality is a graded phenomenon, and that conventions can be more or less natural. I develop an account of natural conventions that distinguishes two basic dimensions of conventional naturalness: a probabilistic dimension and an inherent one. A convention is probabilistically natural if it is likely to emerge in a population of agents, and inherently natural if its content is a regularity that scores high on relevant measures for naturalness. I motivate the proposal on conceptual grounds and then showcase its descriptive benefits by discussing two case studies in language: the tendency towards word-length optimality and the prevalence of shape opacity in spoken language vocabularies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
46. Reliability Analysis of a Micro Hydro Power Plants System at Lombok with Expected Energy Not Supplied Method.
- Author
-
Widjonarko, Saleh, Azmi, Utomo, Wahyu Mulyo, Omar, Saodah, and Nafi, Muhammad Ilman
- Subjects
- *
POWER resources , *ENERGY development , *RENEWABLE energy sources , *POWER plants , *HYDROELECTRIC power plants , *ELECTRIC power consumption , *ENERGY consumption - Abstract
Understanding the reliability of a power generator is essential for assessing its suitability for use or the need for further development. The method used in this study is a reliability analysis known as Expected Energy Not Supplied (EENS). The initial step of this method is to calculate the Forced Outage Rate (FOR) to determine the level of disturbances in the generator unit. The subsequent steps involve calculating individual probabilities, analyzing the generator load curve, determining the EENS values of the three generators, and comparing them with the EENS standard established by the National Electricity Market, which stipulates that EENS should not exceed 0.002% of the total energy consumption in the region. This research is the first of its kind conducted on Lombok Island and was carried out by analyzing three operational Micro-Hydro Power (MHP) units there. The findings indicate that the EENS metric for MHP on Lombok Island stands at 2.822%, far above the established criterion of less than 0.002% annually. In practical terms, these findings imply that the MHP plants on Lombok Island could not be relied upon as the primary source to meet the electricity demands of the Lombok region in 2022. This research provides valuable insight into the challenges of energy reliability on Lombok Island and serves as a foundation for further considerations in the development of renewable energy sources in the region. [ABSTRACT FROM AUTHOR]
- Published
- 2024
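The EENS workflow outlined in the abstract (FOR first, then expected unserved energy against a load curve) can be sketched for a single generating unit. The figures below are illustrative placeholders, not the study's data, and the two-state (up/down) unit model is a simplifying assumption:

```python
# Minimal EENS sketch for one hypothetical generating unit with a
# two-state availability model. Illustrative numbers only.

def forced_outage_rate(forced_outage_hours, service_hours):
    """FOR = forced outage time / (service time + forced outage time)."""
    return forced_outage_hours / (service_hours + forced_outage_hours)

def eens_single_unit(capacity_kw, f_or, hourly_load_kw):
    """Expected Energy Not Supplied over the given load curve (kWh).

    With probability FOR the unit is out and the whole hour's load is
    unserved; when available, only load above capacity is unserved.
    """
    eens = 0.0
    for load in hourly_load_kw:
        eens += f_or * load                                # unit down
        eens += (1 - f_or) * max(0.0, load - capacity_kw)  # unit up, deficit
    return eens

f_or = forced_outage_rate(forced_outage_hours=175.2, service_hours=8584.8)
load = [30, 45, 60, 55, 40] * 24           # hypothetical hourly loads (kW)
total_energy = sum(load)                    # kWh over these hours
eens = eens_single_unit(capacity_kw=50, f_or=f_or, hourly_load_kw=load)
print(f"FOR = {f_or:.3f}, EENS = {eens:.1f} kWh "
      f"({100 * eens / total_energy:.2f}% of demand)")
```

Dividing EENS by total demand gives the percentage figure the study compares against the 0.002% criterion; combining several units would require a capacity-outage probability table rather than this single-unit shortcut.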
47. Temperature thresholds to guide choice of freshwater species for monitoring onset of chronic thermal stress impacts in rivers.
- Author
-
Rivers-Moore, N. A.
- Subjects
- *
PSYCHOLOGICAL stress , *CORAL bleaching , *FRESH water , *SPECIES , *CLIMATE change , *TEMPERATURE - Abstract
Aquatic species show different sensitivities and responses to chronic thermal stress, resulting in varying degrees of resistance to the negative impacts of climate change, which are ultimately expressed as range expansions or contractions. The choice of species appropriate for assessing climate change impacts in aquatic ecosystems should be guided by the robustness of the relationship between a chosen chronic stress thermal threshold and associated habitat contraction. Twelve aquatic species were evaluated as potential climate change indicators, from which six were selected for testing a conceptual framework for predicting the degree of utility of a species as a climate change indicator. Results indicate that species with a chronic biological thermal threshold below 20°C are likely to experience in excess of 50% loss of thermally suitable environment. Cooler thermal thresholds could inform the choice of suitable sentinel species for use as early indicators of chronic thermal stress, while thresholds above this reflect increasingly thermally resistant species within aquatic communities. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
48. Process Algebraic Approach for Probabilistic Verification of Safety and Security Requirements of Smart IoT (Internet of Things) Systems in Digital Twin.
- Author
-
Song, Junsup, Lee, Sunghyun, Karagiannis, Dimitris, and Lee, Moonkun
- Subjects
- *
DIGITAL twins , *INTERNET of things , *INTERNET safety , *EMERGENCY medical services , *DETERMINISTIC algorithms - Abstract
Process algebra can be considered one of the most practical formal methods for modeling Smart IoT Systems in Digital Twin, since each IoT device in the systems can be considered as a process. Further, some of the algebras are applied to predict the behavior of the systems. For example, the PALOMA (Process Algebra for Located Markovian Agents) and PACSR (Probabilistic Algebra of Communicating Shared Resources) process algebras are designed to predict the behavior of IoT systems with probability on choice operations. However, the algebras lack analytical methods to predict the nondeterministic behavior of the systems, and they provide no control mechanism to handle undesirable nondeterministic behavior. In order to overcome these limitations, this paper proposes a new process algebra, called dTP-Calculus, which can be used (1) to specify the nondeterministic behavior of the systems with static probability, (2) to verify the safety and security requirements of the nondeterministic behavior against probability requirements, and (3) to control undesirable nondeterministic behavior with dynamic probability. To demonstrate the feasibility and practicality of the approach, the SAVE (Specification, Analysis, Verification, Evaluation) tool has been developed on the ADOxx Meta-Modeling Platform and applied to a SEMS (Smart Emergency Medical Service) example. In addition, a miniature digital twin system for the SEMS example was constructed and applied to the SAVE tool as a proof of concept for Digital Twin. It shows that the approach with dTP-Calculus on the tool can be very efficient and effective for Smart IoT Systems in Digital Twin. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
49. Holistic processing is modulated by the probability that parts contain task-congruent information.
- Author
-
Curby, Kim M., Teichmann, Lina, Peterson, Mary A., and Shomstein, Sarah S.
- Subjects
- *
SELECTIVITY (Psychology) , *PROBABILITY theory - Abstract
Holistic processing of face and non-face stimuli has been framed as a perceptual strategy, with classic hallmarks of holistic processing, such as the composite effect, reflecting a failure of selective attention that is a consequence of this strategy. Further, evidence that holistic processing is impacted by training different patterns of attentional prioritization suggests that it may be a result of learned attention to the whole, which renders it difficult to attend to only part of a stimulus. If so, holistic processing should be modulated by the same factors that shape attentional selection, such as the probability that distracting or task-relevant information will be present. In contrast, other accounts suggest that it is the match to an internal face template that triggers specialized holistic processing mechanisms. Here we probed these accounts by manipulating the probability, across different testing sessions, that the task-irrelevant face part in the composite face task would contain task-congruent or -incongruent information. Attentional accounts of holistic processing predict that when the probability that the task-irrelevant part contains congruent information is low (25%), holistic processing should be attenuated compared to when this probability is high (75%). In contrast, template-based accounts of holistic face processing predict that it will be unaffected by this manipulation, given that the integrity of the faces remains intact. Experiment 1 found evidence consistent with attentional accounts of holistic face processing, and Experiment 2 extends these findings to holistic processing of non-face stimuli. These findings are broadly consistent with learned-attention accounts of holistic processing. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
50. Probabilistic assessment of spatiotemporal fine particulate matter concentrations in Taiwan using multivariate indicator kriging.
- Author
-
Jang, Cheng-Shin
- Subjects
- *
PARTICULATE matter , *KRIGING , *AIR quality , *TEMPORAL integration , *QUANTILE regression - Abstract
Assessments of spatiotemporal fine particulate matter (PM2.5) concentrations are crucial for establishing risk maps and maintaining human health. This study spatiotemporally assessed PM2.5 concentrations in Taiwan using multivariate indicator kriging (MVIK) according to the current Taiwanese and US regulatory standards for annual average PM2.5 concentrations (15 and 12 μg/m3, respectively). First, multivariate integration was applied to PM2.5 concentration data for 2019–2021 and 2020–2022, since the two 3-year PM2.5 datasets showed no statistical difference. MVIK was then used to model compliance probabilities for the two standards. Finally, quantile estimates based on the occurrence probabilities of PM2.5 concentrations were employed to determine the optimal classifications for establishing risk maps for the two PM2.5 standards. The results indicated that the multivariate integration of temporal PM2.5 data in MVIK can effectively streamline the analytic process, and that integrating 3-year PM2.5 data was suitable for assessing the risk categories of the regulatory standards for annual average PM2.5. The greatest estimated difference between the 2019–2021 and 2020–2022 multivariate integrations was in the Northern and Chumiao air quality regions. Because many air quality regions fell into PM2.5 categories exceeding 12 μg/m3, the assessment of 3-year spatiotemporal variability indicates that the 12 μg/m3 regulatory standard for annual average PM2.5 was not yet appropriate for Taiwan at this point in time. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
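The core of the indicator approach underlying MVIK is a 0/1 transform of observations against a regulatory threshold, followed by a weighted average of indicators at a target site to estimate the probability of compliance. Kriging derives the weights from a variogram model; as a simpler stand-in, the sketch below uses inverse-distance weights. Station coordinates and concentrations are hypothetical:

```python
# Indicator-transform sketch for threshold-compliance probability.
# Inverse-distance weights stand in for kriging weights derived from a
# variogram; all station data are hypothetical.
from math import dist

def indicator(value_ug_m3, threshold_ug_m3):
    """1 if the observation complies with the threshold, else 0."""
    return 1.0 if value_ug_m3 <= threshold_ug_m3 else 0.0

def p_compliance(target_xy, stations, threshold_ug_m3):
    """Weighted average of indicators ~ P(PM2.5 <= threshold) at target_xy."""
    weights, indicators = [], []
    for xy, pm25 in stations:
        w = 1.0 / max(dist(target_xy, xy), 1e-9) ** 2
        weights.append(w)
        indicators.append(indicator(pm25, threshold_ug_m3))
    return sum(w * i for w, i in zip(weights, indicators)) / sum(weights)

stations = [((0, 0), 10.5), ((1, 0), 14.0), ((0, 1), 16.2), ((1, 1), 11.8)]
# Probability of meeting the 15 ug/m3 (Taiwan) vs 12 ug/m3 (US) standards
# at a site equidistant from all four stations (so the weighted average
# reduces to a simple mean of indicators):
print(p_compliance((0.5, 0.5), stations, 15.0))
print(p_compliance((0.5, 0.5), stations, 12.0))
```

Mapping these probabilities over a grid, then classifying them by quantile, mirrors the risk-map construction the study describes; the multivariate step additionally integrates the indicators across years before weighting.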