19 results for "Probabilities"
Search Results
2. The return period and probabilities of earthquakes occurrence in North-East, India (Eastern-Himalayas) and its vicinity inferred from Gutenberg–Richter relation.
- Author
-
Chetia, Timangshu, Choudhury, Bijit Kumar, Gogoi, Ashim, and Saikia, Namrata
- Abstract
North-Eastern (NE) India and its adjoining region is the sixth most seismically active region in the world. In the present investigation, the return period of earthquakes and the probability of occurrence inferred from the Gutenberg–Richter (GR) relation were estimated for the NE India region and its vicinity. Considering the entire region, the results suggest that the return period of earthquakes of 7 ≤ Mw ≤ 8.6 is short, ranging from 32.73 to 162.59 years. Earthquakes of Mw ~3.6–4 reach 100% probability of occurrence from an infinitesimally short interval (t ~ 0). Earthquakes of Mw ~4.1–5.3 reach 100% within 10 years; Mw ~5.4–5.7 within 20 years; and Mw ~5.8–5.9, 6.0–6.1 and 6.2 reach ~100% within 30, 40 and 50 years, respectively. For large earthquakes of Mw ~7.0–8.0, the probability of occurrence exceeds 80% within 100 years. These observations indicate that the likelihood of earthquakes occurring in the north-eastern region of India and its surrounding areas increases over time. Further, the region was divided into four zones based on seismicity and the major tectonic domains: Block I (26.5–28.5ºN; 89–95ºE), Block II (26.5–28.5ºN; 95–97.5ºE), Block III (23–26.5ºN; 93–97.5ºE) and Block IV (23–26.5ºN; 89–93ºE). In terms of return periods based on the GR relation and stochastic observations, the risk associated with earthquake occurrence is highest in Block IV, followed by Block III, Block I and Block II, respectively. A comparison of earthquake return-period probabilities considering seismogenic depths along with hypocentral depth data for the different blocks was also carried out for a more comprehensive understanding of seismic occurrences over time. Overall, the patterns and trends observed remain consistent, underscoring the seismic activity within each block and its associated return periods. The stochastic observations and findings are elaborated in the article. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
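The return-period arithmetic described in the abstract above can be illustrated with a short, self-contained sketch. The Gutenberg–Richter a and b values below are placeholders, not the paper's fitted estimates for NE India, and the Poisson occurrence model is a common simplifying assumption rather than the authors' exact method:

```python
import math

# Illustrative Gutenberg-Richter parameters (log10 N = a - b*M);
# these are NOT the values fitted in the paper.
A, B = 4.5, 0.9

def annual_rate(m):
    """Annual rate of earthquakes with magnitude >= m."""
    return 10 ** (A - B * m)

def return_period(m):
    """Mean recurrence interval in years."""
    return 1.0 / annual_rate(m)

def occurrence_probability(m, t_years):
    """P(at least one event of magnitude >= m within t_years),
    assuming earthquake occurrence follows a Poisson process."""
    return 1.0 - math.exp(-t_years / return_period(m))
```

With these placeholder parameters, `occurrence_probability(7.0, 100)` comes out near 0.8, the same order of magnitude the abstract reports for Mw ~7 events over 100 years.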
3. Pareto and probability distributions.
- Author
-
Tusset, Gianfranco
- Abstract
Although familiar with the developments in probability theory of his time, Vilfredo Pareto made little use of this tool in his writings, preferring theoretical constructions based on experimentation and observation. This article attempts to reconstruct Pareto's overall approach to probability by examining his references to the distribution of income, an economic fact that lends itself to probabilistic investigation. The result of this research shows how Pareto alludes to the application of probability to income and social groups, but leaves the task to his followers. JEL classification B31, B4, C1. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. On fatal competition and the nature of distributive inferences.
- Author
-
Bar-Lev, Moshe E. and Fox, Danny
- Subjects
INFERENCE (Logic), PROBABILITY theory - Abstract
Denić (2018, 2019, To appear) observes that the availability of distributive inferences—for sentences with disjunction embedded in the scope of a universal quantifier—depends on the size of the domain quantified over as it relates to the number of disjuncts. Based on her observations, she argues that probabilistic considerations play a role in the computation of implicatures. In this paper we explore a different possibility. We argue for a modification of Denić's generalization, and provide an explanation that is based on intricate logical computations but is blind to probabilities. The explanation is based on the observation that when the domain size is no larger than the number of disjuncts, universal and existential alternatives are equivalent if distributive inferences are obtained. We argue that under such conditions a general ban on 'fatal competition' (Magri 2009a,b, Spector 2014) is activated, thereby predicting distributive inferences to be unavailable. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
5. Tumor growth prediction and classification based on the KNN algorithm and discrete-time Markov chains (DTMC).
- Author
-
El Fatimi, Lahcen and Boucheneb, Hanifa
- Subjects
- *K-nearest neighbor classification, *TUMOR growth, *MARKOV processes, *BRAIN tumors, *CANCER invasiveness - Abstract
In recent years, brain tumors have become one of the most common fatal diseases. Despite a substantial number of research studies on tumors, research on predicting tumor growth remains insufficient owing to the intricate nature of this domain. An application able to predict tumor growth could therefore help eliminate the tumor by identifying an appropriate treatment before it grows. This paper investigates tumor growth and presents a technique for tumor growth prediction based on the Discrete-Time Markov Chain (DTMC) and K-Nearest Neighbor (KNN) algorithms. The design and development of this technique begins with a stochastic model of tumor progression, followed by an extension of the model to several cases that allow the derivation of new cases based on the study of predictive probabilities. The aim of this paper is to develop a model based on the KNN and DTMC algorithms that can classify tumors and predict the future state from the current state of the tumor without knowledge of the past state; in other words, all relevant information about the past and the present that would be useful in making predictions is available in the current state. In terms of performance evaluation metrics, the results show that the proposed method exceeds the existing methods with 97.65% accuracy, 71.65% specificity and 99.087% sensitivity. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
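The Markov-property claim in the abstract above (prediction from the current state alone) can be sketched with a toy discrete-time Markov chain. The three states and the transition matrix below are invented for illustration and do not correspond to the paper's actual model:

```python
import numpy as np

# Hypothetical tumor states, for illustration only.
STATES = ["benign", "low_grade", "high_grade"]

# Illustrative transition matrix: P[i, j] = P(next state j | current state i).
# Each row sums to 1.
P = np.array([
    [0.90, 0.08, 0.02],
    [0.05, 0.80, 0.15],
    [0.00, 0.10, 0.90],
])

def predict(current, steps=1):
    """Distribution over states after `steps` transitions, given only the
    current state: the past adds no further information (Markov property)."""
    dist = np.zeros(len(STATES))
    dist[STATES.index(current)] = 1.0
    return dist @ np.linalg.matrix_power(P, steps)
```

For example, `predict("low_grade", 2)` gives the two-step-ahead distribution; a real model would estimate the transition probabilities from data, which is where the paper's KNN classification step comes in.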
6. Bivariate Extreme Value Analysis of Rainfall and Temperature in Nigeria.
- Author
-
Chukwudum, Queensley C. and Nadarajah, Saralees
- Subjects
EXTREME value theory ,TEMPERATURE - Abstract
The rising cases of floods and the onset of drought in different parts of Nigeria require urgent attention, particularly because Nigeria has the largest population in Africa; any negative climate impact on it can therefore easily ripple into other African regions. To understand the risk factors that drive these extreme events, we study the bivariate extreme cases of monthly precipitation and temperature observations over a period of 116 years (1901–2016). This is the first paper providing a bivariate extreme value analysis of data in Nigeria. The mean rainfall and temperature variables exhibit interrelationships such as dry-cold and wet-cold associations. We further investigate whether these relationships are present at the tails by making use of the annual minimum rainfall–annual minimum temperature and annual maximum rainfall–annual minimum temperature pairs. Their extreme dependence structures are also quantified by applying parametric bivariate extreme value models. Our results show that the compound extremes of dry-cold and wet-cold conditions exhibit zero to weak extreme dependence at varying quantile levels. A much stronger dependence structure is present between the annual maximum rain and the total volume of rainfall. By considering both independent and dependent probability assumptions, we show that the former may lead to an underestimation of the risks associated with existing climatic hazards. The implications of these results are highlighted throughout the paper. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
7. The probability of conditionals: A review.
- Author
-
López-Astorga, Miguel, Ragni, Marco, and Johnson-Laird, P. N.
- Subjects
- *CONDITIONAL probability, *MODEL theory, *DISTRIBUTION (Probability theory), *CALCULUS - Abstract
A major hypothesis about conditionals is the Equation in which the probability of a conditional equals the corresponding conditional probability: p(if A then C) = p(C|A). Probabilistic theories often treat it as axiomatic, whereas it follows from the meanings of conditionals in the theory of mental models. In this theory, intuitive models (system 1) do not represent what is false, and so produce errors in estimates of p(if A then C), yielding instead p(A & C). Deliberative models (system 2) are normative, and yield the proportion of cases of A in which C holds, i.e., the Equation. Intuitive estimates of the probability of a conditional about unique events: If covid-19 disappears in the USA, then Biden will run for a second term, together with those of each of its clauses, are liable to yield joint probability distributions that sum to over 100%. The error, which is inconsistent with the probability calculus, is massive when participants estimate the joint probabilities of conditionals with each of the different possibilities to which they refer. This result and others under review corroborate the model theory. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
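The gap between the Equation's value p(C|A) and the system-1 estimate p(A & C) discussed in the abstract above can be made concrete with a small worked example. The joint probabilities below are made up purely for illustration:

```python
# Illustrative joint distribution over the truth values of A and C.
joint = {
    (True, True): 0.30,   # A and C
    (True, False): 0.10,  # A and not-C
    (False, True): 0.20,  # not-A and C
    (False, False): 0.40, # neither
}

p_A = joint[(True, True)] + joint[(True, False)]  # 0.40
p_A_and_C = joint[(True, True)]                   # 0.30: the intuitive (system-1) estimate
p_C_given_A = p_A_and_C / p_A                     # 0.75: the Equation's value

# The two coincide only when p(A) = 1; otherwise p(A & C) < p(C|A),
# which is the direction of error the model theory predicts for system 1.
```

Here the intuitive estimate (0.30) falls well below the normative conditional probability (0.75), mirroring the systematic underestimation the abstract describes.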
8. Possibilities and the parallel meanings of factual and counterfactual conditionals.
- Author
-
Espino, Orlando, Byrne, Ruth M. J., and Johnson-Laird, P. N.
- Subjects
- *COGNITION, *LOGIC, *MATHEMATICAL models, *PROBABILITY theory, *PSYCHOLOGY, *SEMANTICS - Abstract
The mental model theory postulates that the meanings of conditionals are based on possibilities. Indicative conditionals—such as "If he is injured tomorrow, then he will take some leave"—have a factual interpretation that can be paraphrased as It is possible, and remains so, that he is injured tomorrow, and in that case certain that he takes some leave. Subjunctive conditionals, such as, "If he were injured tomorrow, then he would take some leave," have a prefactual interpretation that has the same paraphrase. But when context makes clear that his injury will not occur, the subjunctive has a counterfactual paraphrase, with the first clause: It was once possible, but does not remain so, that he will be injured tomorrow. Three experiments corroborated these predictions for participants' selections of paraphrases in their native Spanish, for epistemic and deontic conditionals, for those referring to past and to future events, and for those with then clauses referring to what may or must happen. These results are contrary to normal modal logics. They are also contrary to theories based on probabilities, which are inapplicable to deontic conditionals, such as, "If you have a ticket, then you must enter the show." [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
9. Determinants of certified organic cocoa production: evidence from the province of Guayas, Ecuador.
- Author
-
Saravia-Matus, Silvia L., Rodríguez, Adrian G., and Saravia, Jimmy A.
- Abstract
Due to its biodiversity, Ecuador is apt for the cultivation of Fine Aroma cocoa, one of the cocoa varieties most desired by top chocolatiers and chocolate manufacturers worldwide. Along with Fine Aroma cocoa, a cloned variety known as CCN51 is also produced. Since 2011, the government of Ecuador has pursued a national rehabilitation plan aimed at improving Fine Aroma cocoa yields, quality, certification, and commercialization. Certified cocoa production obtains higher margins via the distinction of cocoa varieties and the implementation of eco-friendly and/or organic production techniques. A review of technical data from a sample of over 3000 cocoa producers in the Province of Guayas (one of the most important cocoa-producing provinces in Ecuador) is used to explore how certified cocoa production, and certified organic cocoa production in particular, is influenced by agronomic practices, market access, and credit access. The difference between organic certification and other forms of certification is relevant to understanding the potential obstacles faced by Fine Aroma cocoa producers aiming for organic certification in Ecuador. The findings illustrate that governmental support of specific agronomic practices such as pruning is of key relevance for certified cocoa production, while also indicating that certified organic production follows a more regulated protocol regarding selected input usage. The interlinkages between certified cocoa production (in general and organic in particular) and pruning are examined using seemingly unrelated bivariate probit regression analyses, which also allow the identification of other relevant socio-economic and institutional traits. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
10. Monte Carlo Simulation of Chemical Reactions in Plasma Enhanced Chemical Vapor Deposition: from Microscopic View to Macroscopic Results.
- Author
-
Babahani, O., Hadjadj, S., Khelfaoui, F., Kebaili, H. O., and Lemkeddem, S.
- Abstract
We propose in the present work a Monte Carlo Simulation (MCS) of chemical reactions occurring in a Plasma Enhanced Chemical Vapor Deposition (PECVD) reactor during a-Si:H growth. From a microscopic view of chemical reactions, this MCS allowed macroscopic results to be obtained. In the gas phase, important reactions contributing to the production of H, SiH2 and SiH3 were identified. We found that SiH4 → SiH2 + 2H is the dominant silane electron-impact dissociation, and that the reaction SiH4 + H → SiH3 + H2 plays a central role in the production of SiH3 radicals. At the surface, the microscopic view allowed us to calculate site and surface reaction probabilities of SiH3 radicals. Results at the macroscopic level were consistent with other works. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
11. Natural Selection and Drift as Individual-Level Causes of Evolution.
- Author
-
Bourrat, Pierrick
- Abstract
In this paper I critically evaluate Reisman and Forber's (Philos Sci 72(5):1113–1123, 2005) arguments that drift and natural selection are population-level causes of evolution based on what they call the manipulation condition. Although I agree that this condition is an important step for identifying causes of evolutionary change, it is insufficient. Following Woodward, I argue that the invariance of a relationship is another crucial parameter to take into consideration for causal explanations. Starting from Reisman and Forber's example on drift, and after having briefly presented the criterion of invariance, I show that once both the manipulation condition and the criterion of invariance are taken into account, drift in this example is better understood as an individual-level rather than a population-level cause. Finally, I concede that it is legitimate to interpret natural selection and drift as population-level causes when they rely on genuinely indeterministic events, and in some cases of frequency-dependent selection. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
12. The effects of natural structure on estimated tropical cyclone surge extremes.
- Author
-
Resio, Donald, Asher, Taylor, and Irish, Jennifer
- Subjects
TROPICAL cyclones, STORM surges, COASTAL zone management, PROBABILITY theory, BAYESIAN analysis - Abstract
The past 12 years have seen significant steps forward in the science and practice of coastal flood analysis. This paper aims to recount and critically assess these advances, while helping to identify next steps for the field. It then focuses on a key problem: connecting the probabilistic characterization of flood hazards to their physical mechanisms. Our investigation into the effects of natural structure on the probabilities of storm surges shows that several different types of spatial-, temporal-, and process-related organization affect key assumptions made in many of the methods used to estimate these probabilities. Following a brief introduction to general historical methods, we analyze the two joint probability methods used in most tropical cyclone hazard and risk studies today: the surface response function and Bayesian quadrature. A major difference between the two is that the response function creates continuous surfaces, which can be interpolated or extrapolated on a fine scale if necessary, whereas Bayesian quadrature optimizes a set of probability masses, which cannot be directly interpolated or extrapolated. Several examples are given showing significant impacts of natural structure that should not be neglected in hazard and risk assessment for tropical cyclones, including: (1) differences between omnidirectional and direction-dependent sampling of storms in near-coastal areas; (2) the impact of surge probability discontinuities on the treatment of epistemic uncertainty; (3) the ability to reduce aleatory uncertainty when sampling over larger spatial domains; and (4) the need to quantify trade-offs between aleatory and epistemic uncertainties in long-term stochastic sampling. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
13. Flood Safety versus Remaining Risks - Options and Limitations of Probabilistic Concepts in Flood Management.
- Author
-
Schumann, Andreas
- Subjects
FLOODS, FLOOD risk, COPULA functions, ECONOMIC development, NATURAL disasters, MANAGEMENT - Abstract
Over decades, the planning of flood management was based on a safety-oriented approach. A design flood was estimated by probabilistic means to specify the limit up to which a flood should be controlled completely by technical measures. A case of failure was expected only where the design flood was overtopped. As design floods were specified by very small probabilities, the risk of a flood beyond the design flood was seen as negligible. Devastating flood events all over Europe raised public awareness of remaining flood risks over the last two decades, and risk management became a political task in the EU. According to the European Flood Directive, geographical areas which could be flooded 'with a low probability or under extreme event scenarios' have to be specified. The combination of 'low probability' and 'extreme event scenarios' demonstrates the problem of modern flood management: the existing probabilistic tools are not sufficient to specify the risks of failures that result from critical combinations of multiple characteristics of hydrological loads. Scenarios are one option to specify them, but their probabilities stay unknown. Multivariate statistics could offer a way to fill this gap, but some problems of their practical application are still unresolved. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
14. Explaining Drift from a Deterministic Setting.
- Author
-
Bourrat, Pierrick
- Abstract
Drift is often characterized in statistical terms. Yet such a purely statistical characterization is ambiguous for it can accept multiple physical interpretations. Because of this ambiguity it is important to distinguish what sorts of processes can lead to this statistical phenomenon. After presenting a physical interpretation of drift originating from the most popular interpretation of fitness, namely the propensity interpretation, I propose a different one starting from an analysis of the concept of drift made by Godfrey-Smith. Further on, I show how my interpretation relates to previous attempts to make sense of the notion of expected value in deterministic setups. The upshot of my analysis is a physical conception of drift that is compatible with both a deterministic and indeterministic world. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
15. Processing of probabilistic information in weight perception and motor prediction.
- Author
-
Trampenau, Leif, Eimeren, Thilo, and Kuhtz-Buschbeck, Johann
- Subjects
- *SENSORY perception, *PROBABILITY theory, *EXPECTATION (Philosophy), *AFFERENT pathways, *WEIGHT measurement - Abstract
We studied the effects of probabilistic cues, i.e., of information of limited certainty, in the context of an action task (GL: grip-lift) and of a perceptual task (WP: weight perception). Normal subjects (n = 22) saw four different probabilistic visual cues, each of which announced the likely weight of an object. In the GL task, the object was grasped and lifted with a pinch grip, and the peak force rates indicated that the grip and load forces were scaled predictively according to the probabilistic information. The WP task provided the expected heaviness related to each probabilistic cue; the participants gradually adjusted the object's weight until its heaviness matched the expected weight for a given cue. Subjects were randomly assigned to two groups: one started with the GL task and the other with the WP task. The four probabilistic cues influenced weight adjustments in the WP task and peak force rates in the GL task in a similar manner. The interpretation and utilization of the probabilistic information were critically influenced by the initial task. Participants who started with the WP task classified the four probabilistic cues into four distinct categories and applied these categories to the subsequent GL task. By contrast, participants who started with the GL task applied three distinct categories to the four cues and retained this classification in the following WP task. The initial strategy, once established, determined how the probabilistic information was interpreted and implemented. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
16. Statistical Characterization of Cable Electrical Failure Temperatures Due to Fire for Nuclear Power Plant Risk Applications.
- Author
-
Gallucci, Raymond
- Subjects
- *NUCLEAR power plant risk assessment, *ELECTRIC power failures, *THERMOPLASTIC composites, *PHENOMENOLOGICAL theory (Physics), *FIRE prevention - Abstract
Single-value failure temperatures for fire loss of electrical cable functionality have been the norm for Fire Probabilistic Risk Assessments since the publication in 2005 of NUREG/CR-6850. If the calculated exposure temperature matches or exceeds the cable failure temperature, electrical failure is always assumed; if not, no failure is assumed. While this can be relaxed somewhat if a distribution for the exposure temperature is estimated, use of a distribution on the cable failure temperature itself more readily enables such relaxation and, therefore, a more realistic assessment. This paper develops probability distributions for different generic cable types (based on insulation) using data from the US Nuclear Regulatory Commission tests. Results indicate mean failure temperatures considerably higher than those used deterministically, 252°C, 421°C and 383°C, respectively for thermoplastic, thermoset and Kerite-FR. This suggests considerable relaxation from the conservatism inherent using the deterministic failure temperatures could be achieved. The paper then postulates two hypothetical distributions on the exposure temperature from applying a fire phenomenological model in a statistical way to estimate the possible relaxation using the distributed cable failure temperatures to enhance the realism of the assessment. Examples show that use of probabilistically-distributed cable failure temperatures (in conjunction with similar for exposure temperatures) can reduce the probability of electrical failure for a normally-distributed exposure temperature with a mean of 350°C and standard deviation of 58.3°C by factors of approximately three and eight for Kerite-FR and thermoset cables, respectively. The reduction would be less pronounced for thermoplastic cables, although larger reductions would be possible here as well for lower exposure temperatures (e.g., a factor of two). [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
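The comparison in the abstract above, an exposure-temperature distribution against a cable failure-temperature distribution, reduces to a stress-strength calculation. The sketch below assumes both are independent normal variables; the failure-temperature standard deviations used in the example are placeholders, and the paper fits its distributions to test data rather than assuming this form:

```python
import math

def failure_probability(mu_exp, sd_exp, mu_fail, sd_fail):
    """P(exposure temperature exceeds cable failure temperature), assuming
    both are independent normals. The difference D = T_exp - T_fail is then
    normal with mean mu_exp - mu_fail and variance sd_exp**2 + sd_fail**2,
    so the answer is P(D > 0), evaluated via the standard normal CDF."""
    mu_d = mu_exp - mu_fail
    sd_d = math.sqrt(sd_exp ** 2 + sd_fail ** 2)
    return 0.5 * (1.0 + math.erf(mu_d / (sd_d * math.sqrt(2.0))))
```

Using the abstract's exposure distribution (mean 350°C, standard deviation 58.3°C) with the reported mean failure temperatures (and a placeholder failure-temperature spread), a thermoset-like cable (mean 421°C) gives a failure probability well below one half, while a thermoplastic-like cable (mean 252°C) gives one well above it, in line with the direction of the paper's results.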
17. Avoiding both the Garbage-In/Garbage-Out and the Borel Paradox in updating probabilities given experimental information.
- Author
-
Bordley, Robert
- Subjects
PROBABILITY theory, PARAMETER estimation, COMPUTER simulation, BAYESIAN analysis, APPROXIMATION theory, INFORMATION processing - Abstract
Bayes Rule specifies how probabilities over parameters should be updated given any kind of information. But in some cases, the kind of information provided by both simulation and physical experiments is information on how certain output parameters may change when other input parameters are changed. There are three different approaches to this problem, one of which leads to the Garbage-In/Garbage-Out Paradox, the second of which (Bayesian synthesis) violates the Borel Paradox, and the third of which (Bayesian melding) is a supra-Bayesian heuristic. This paper shows how to derive a fully Bayesian formula which avoids the Garbage-In/Garbage-Out and Borel Paradoxes. We also compare a Laplacian approximation of this formula with Bayesian synthesis and Bayesian melding and find that the Bayesian formula sometimes coincides with the Bayesian melding solution. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
18. Planar Maps, Random Walks and Circle Packing
- Author
-
Nachmias, Asaf
- Subjects
Mathematics, Probabilities, Discrete mathematics, Geometry, Mathematical physics, thema EDItEUR::P Mathematics and Science::PB Mathematics::PBD Discrete mathematics, thema EDItEUR::P Mathematics and Science::PB Mathematics::PBM Geometry, thema EDItEUR::P Mathematics and Science::PB Mathematics::PBT Probability and statistics, thema EDItEUR::P Mathematics and Science::PH Physics::PHU Mathematical physics - Abstract
This open access book focuses on the interplay between random walks on planar maps and Koebe’s circle packing theorem. Further topics covered include electric networks, the He–Schramm theorem on infinite circle packings, uniform spanning trees of planar maps, local limits of finite planar maps and the almost sure recurrence of simple random walks on these limits. One of its main goals is to present a self-contained proof that the uniform infinite planar triangulation (UIPT) is almost surely recurrent. Full proofs of all statements are provided. A planar map is a graph that can be drawn in the plane without crossing edges, together with a specification of the cyclic ordering of the edges incident to each vertex. One widely applicable method of drawing planar graphs is given by Koebe’s circle packing theorem (1936). Various geometric properties of these drawings, such as existence of accumulation points and bounds on the radii, encode important probabilistic information, such as the recurrence/transience of simple random walks and connectivity of the uniform spanning forest. This deep connection is especially fruitful to the study of random planar maps. The book is aimed at researchers and graduate students in mathematics and is suitable for a single-semester course; only a basic knowledge of graduate level probability theory is assumed.
- Published
- 2020
- Full Text
- View/download PDF
19. Decision Making under Deep Uncertainty
- Author
-
Marchau, Vincent A. W. J., Walker, Warren E., Bloemen, Pieter J. T. M., and Popper, Steven W.
- Subjects
Business, Management science, Operations research, Decision making, Dynamics, Ergodic theory, Probabilities, thema EDItEUR::K Economics, Finance, Business and Management::KJ Business and Management::KJT Operational research, thema EDItEUR::P Mathematics and Science::PB Mathematics::PBT Probability and statistics, thema EDItEUR::P Mathematics and Science::PB Mathematics::PBW Applied mathematics::PBWR Nonlinear science - Abstract
This open access book focuses on both the theory and practice associated with the tools and approaches for decisionmaking in the face of deep uncertainty. It explores approaches and tools supporting the design of strategic plans under deep uncertainty, and their testing in the real world, including barriers and enablers for their use in practice. The book broadens traditional approaches and tools to include the analysis of actors and networks related to the problem at hand. It also shows how lessons learned in the application process can be used to improve the approaches and tools used in the design process. The book offers guidance in identifying and applying appropriate approaches and tools to design plans, as well as advice on implementing these plans in the real world. For decisionmakers and practitioners, the book includes realistic examples and practical guidelines that should help them understand what decisionmaking under deep uncertainty is and how it may be of assistance to them. Decision Making under Deep Uncertainty: From Theory to Practice is divided into four parts. Part I presents five approaches for designing strategic plans under deep uncertainty: Robust Decision Making, Dynamic Adaptive Planning, Dynamic Adaptive Policy Pathways, Info-Gap Decision Theory, and Engineering Options Analysis. Each approach is worked out in terms of its theoretical foundations, methodological steps to follow when using the approach, latest methodological insights, and challenges for improvement. In Part II, applications of each of these approaches are presented. Based on recent case studies, the practical implications of applying each approach are discussed in depth. Part III focuses on using the approaches and tools in real-world contexts, based on insights from real-world cases. 
Part IV contains conclusions and a synthesis of the lessons that can be drawn for designing, applying, and implementing strategic plans under deep uncertainty, as well as recommendations for future work. The publication of this book has been funded by Radboud University, the RAND Corporation, Delft University of Technology, and Deltares. The book offers a comprehensive examination of the approaches and tools for designing plans under deep uncertainty and their application, identifies barriers and enablers for their use in practice, and includes realistic examples and practical guidelines to help readers better understand the concepts.
- Published
- 2019
- Full Text
- View/download PDF
Discovery Service for Jio Institute Digital Library