45 results on "Keller, Klaus"
Search Results
2. Effects of global climate mitigation on regional air quality and health.
- Author
-
Huang, Xinyuan, Srikrishnan, Vivek, Lamontagne, Jonathan, Keller, Klaus, and Peng, Wei
- Published
- 2023
- Full Text
- View/download PDF
3. Probabilistic projections of baseline twenty-first century CO2 emissions using a simple calibrated integrated assessment model.
- Author
-
Srikrishnan, Vivek, Guan, Yawen, Tol, Richard S. J., and Keller, Klaus
- Abstract
Probabilistic projections of baseline (with no additional mitigation policies) future carbon emissions are important for sound climate risk assessments. Deep uncertainty surrounds many drivers of projected emissions. Here, we use a simple integrated assessment model, calibrated to century-scale data and expert assessments of baseline emissions, global economic growth, and population growth, to make probabilistic projections of carbon emissions through 2100. Under a variety of assumptions about fossil fuel resource levels and decarbonization rates, our projections largely agree with several emissions projections under current policy conditions. Our global sensitivity analysis identifies several key economic drivers of uncertainty in future emissions and shows important higher-level interactions between economic and technological parameters, while population uncertainties are less important. Our analysis also projects relatively low global economic growth rates over the remainder of the century. This illustrates the importance of additional research into economic growth dynamics for climate risk assessment, especially if pledged and future climate mitigation policies are weakened or delayed. These results showcase the power of using a simple, transparent, and calibrated model. While the simple model structure has several advantages, it also creates caveats for our results that point to important areas for further research. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
4. Attention to values helps shape convergence research.
- Author
-
Helgeson, Casey, Nicholas, Robert E., Keller, Klaus, Forest, Chris E., and Tuana, Nancy
- Abstract
Convergence research is driven by specific and compelling problems and requires deep integration across disciplines. The potential of convergence research is widely recognized, but questions remain about how to design, facilitate, and assess such research. Here we analyze a seven-year, twelve-million-dollar convergence project on sustainable climate risk management to answer two questions. First, what is the impact of a project-level emphasis on the values that motivate and tie convergence research to the compelling problems? Second, how does participation in convergence projects shape the research of postdoctoral scholars who are still in the process of establishing themselves professionally? We use an interview-based approach to characterize what the project specifically enabled in each participant’s research. We find that (a) the project pushed participants’ research into better alignment with the motivating concept of convergence research and that this effect was stronger for postdoctoral scholars than for more senior faculty. (b) Postdocs’ self-assessed understanding of key project themes, however, appears unconnected to metrics of project participation, raising questions about training and integration. Regarding values, (c) the project enabled heightened attention to values in the research of a large minority of participants. (d) Participants strongly believe in the importance of explicitly reflecting on values that motivate and pervade scientific research, but they question their own understanding of how to put value-focused science into practice. This mismatch of perceived importance with poor understanding highlights an unmet need in the practice of convergence science. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
5. Adaptive mitigation strategies hedge against extreme climate futures.
- Author
-
Marangoni, Giacomo, Lamontagne, Jonathan R., Quinn, Julianne D., Reed, Patrick M., and Keller, Klaus
- Abstract
The United Nations Framework Convention on Climate Change agreed to “strengthen the global response to the threat of climate change, in the context of sustainable development and efforts to eradicate poverty” (UNFCCC 2015). Designing a global mitigation strategy to support this goal poses formidable challenges. For one, there are trade-offs between the economic costs and the environmental benefits of averting climate impacts. Furthermore, the coupled human-Earth systems are subject to deep and dynamic uncertainties. Previous economic analyses typically addressed either the former, introducing multiple objectives, or the latter, making mitigation actions responsive to new information. This paper aims at bridging these two separate strands of literature. We demonstrate how information feedback from observed global temperature changes can jointly improve the economic and environmental performance of mitigation strategies. We focus on strategies that maximize discounted expected utility while also minimizing warming above 2 °C, damage costs, and mitigation costs. Expanding on the Dynamic Integrated Climate-Economy (DICE) model and previous multi-objective efforts, we implement closed-loop control strategies, map the emerging trade-offs and quantify the value of the temperature information feedback under both well-characterized and deep climate uncertainties. Adaptive strategies strongly reduce high regrets, guarding against mitigation overspending for less sensitive climate futures, and excessive warming for more sensitive ones. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
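The closed-loop idea sketched in this abstract — mitigation effort that responds to observed temperature — can be illustrated with a toy simulation. This is not the DICE-based model the paper uses; the warming response, target trajectory, and step sizes below are invented purely for illustration:

```python
def simulate(sensitivity, adaptive, steps=10, target=2.0):
    """Toy closed-loop mitigation: abatement ramps up whenever observed
    warming overshoots a linear trajectory toward the temperature target."""
    abatement, temperature, total_abatement = 0.1, 0.0, 0.0
    for t in range(steps):
        emissions = 1.0 - abatement
        temperature += sensitivity * emissions * 0.1  # toy warming response
        if adaptive and temperature > target * (t + 1) / steps:
            abatement = min(1.0, abatement + 0.2)     # react to the signal
        total_abatement += abatement
    return temperature, total_abatement
```

Under a high-sensitivity world the adaptive rule spends more on abatement and ends up cooler; under a low-sensitivity world it avoids overspending — the hedging behavior the abstract describes.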
6. A tighter constraint on Earth-system sensitivity from long-term temperature and carbon-cycle observations.
- Author
-
Wong, Tony E., Cui, Ying, Royer, Dana L., and Keller, Klaus
- Subjects
RADIATIVE forcing, TEMPERATURE, CLIMATE change, CHEMICAL weathering, CARBON cycle, OCEANOGRAPHIC submersibles
- Abstract
The long-term temperature response to a given change in CO2 forcing, or Earth-system sensitivity (ESS), is a key parameter quantifying our understanding of the relationship between changes in Earth's radiative forcing and the resulting long-term Earth-system response. Current ESS estimates are subject to sizable uncertainties. Long-term carbon cycle models can provide a useful avenue to constrain ESS, but previous efforts either use rather informal statistical approaches or focus on discrete paleoevents. Here, we improve on previous ESS estimates by using a Bayesian approach to fuse deep-time CO2 and temperature data over the last 420 Myrs with a long-term carbon cycle model. Our median ESS estimate of 3.4 °C (2.6-4.7 °C; 5-95% range) shows a narrower range than previous assessments. We show that weaker chemical weathering relative to the a priori model configuration, via reduced weatherable land area, yields better agreement with temperature records during the Cretaceous. Improving the understanding of these weathering mechanisms hence provides potentially powerful avenues to further constrain this fundamental Earth-system property. Earth-system sensitivity (ESS) describes the long-term temperature response for a given change in atmospheric CO2 and, as such, is a crucial parameter to assess future climate change. Here, the authors use a Bayesian model with data from the last 420 Myrs to reduce uncertainties and estimate ESS to be around 3.4 °C. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
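The Bayesian fusion of deep-time data with a model, as described above, can be caricatured in a few lines: a one-parameter log-linear forcing relation, synthetic CO2/temperature pairs, and a random-walk Metropolis sampler. The paper's actual model, data, and priors are far richer; TRUE_ESS, the noise level, and the flat prior here are assumptions for the sketch only:

```python
import math
import random

random.seed(42)

TRUE_ESS = 3.4  # degrees C per CO2 doubling, assumed for the illustration
data = []
for _ in range(200):
    co2_ratio = random.uniform(0.5, 8.0)  # CO2 relative to a reference level
    anomaly = TRUE_ESS * math.log2(co2_ratio) + random.gauss(0.0, 0.5)
    data.append((co2_ratio, anomaly))

def log_posterior(ess, sigma=0.5):
    """Gaussian likelihood with a flat prior on ess in (0, 10)."""
    if not 0.0 < ess < 10.0:
        return -math.inf
    return sum(-0.5 * ((t - ess * math.log2(c)) / sigma) ** 2 for c, t in data)

# Random-walk Metropolis sampler
ess, samples = 2.0, []
for step in range(5000):
    proposal = ess + random.gauss(0.0, 0.1)
    if math.log(random.random()) < log_posterior(proposal) - log_posterior(ess):
        ess = proposal
    if step >= 1000:  # discard burn-in
        samples.append(ess)

posterior_median = sorted(samples)[len(samples) // 2]
```

With enough synthetic data the posterior median lands close to the assumed TRUE_ESS, mirroring how the paper's calibration narrows the ESS range.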
7. Identifying decision-relevant uncertainties for dynamic adaptive forest management under climate change.
- Author
-
Radke, Naomi, Keller, Klaus, Yousefpour, Rasoul, and Hanewinkel, Marc
- Subjects
CLIMATE change, FOREST surveys, FOREST microclimatology, UNCERTAINTY, EUROPEAN beech, FOREST management, INVENTORY control
- Abstract
The decision on how to manage a forest under climate change is subject to deep and dynamic uncertainties. The classic approach to analyzing this decision adopts a predefined strategy and tests its robustness to uncertainties, but neglects their dynamic nature (i.e., that decision-makers can learn and adjust the strategy). Accounting for learning through dynamic adaptive strategies (DAS) can drastically improve expected performance and robustness to deep uncertainties. The benefits of considering DAS hinge on identifying critical uncertainties and translating them into detectable signposts that signal when to change course. This study advances the DAS approach to forest management as a novel application domain by showcasing methods to identify potential signposts for adaptation in a case study of a classic European beech management strategy in South-West Germany. We analyze the strategy's robustness to uncertainties about model forcings and parameters. We then identify uncertainties that critically impact its economic and ecological performance by confronting a forest growth model with a large sample of time-varying scenarios. The case study results illustrate the potential of designing DAS for forest management and provide insights on key uncertainties and potential signposts. Specifically, economic uncertainties are the main driver of the strategy's robustness and impact the strategy's performance more critically than climate uncertainty. Besides economic metrics, the forest stand's past volume growth is a promising signpost metric. It mirrors the effect of both climatic and model parameter uncertainty. The regular forest inventory and planning cycle provides an ideal basis for adapting a strategy in response to these signposts. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
8. Not all carbon dioxide emission scenarios are equally likely: a subjective expert assessment.
- Author
-
Ho, Emily, Budescu, David V., Bosetti, Valentina, van Vuuren, Detlef P., and Keller, Klaus
- Subjects
CARBON dioxide ,PROBABILITY theory - Abstract
Climate researchers use carbon dioxide emission scenarios to explore alternative climate futures and potential impacts, as well as implications of mitigation and adaptation policies. Often, these scenarios are published without formal probabilistic interpretations, given the deep uncertainty related to future development. However, users often seek such information, such as a likely range or relative probabilities. Without further specification, users sometimes pick a small subset of emission scenarios and/or assume that all scenarios are equally likely. Here, we present probabilistic judgments of experts assessing the distribution of 2100 emissions under a business-as-usual and a policy scenario. We obtain the judgments through a method that relies only on pairwise comparisons of various ranges of emissions. There is wide variability between individual experts, but they clearly do not assign equal probabilities for the total range of future emissions. We contrast these judgments with the emission projection ranges derived from the shared socio-economic pathways (SSPs) and a recent multi-model comparison producing probabilistic emission scenarios. Differences in long-term emission probabilities between expert estimates and model-based calculations may result from various factors, including model restrictions and experts' coverage of a wider set of factors, but also groupthink and an inability to appreciate long-term processes. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
9. Epistemic and ethical trade-offs in decision analytical modelling.
- Author
-
Vezér, Martin, Bakker, Alexander, Keller, Klaus, and Tuana, Nancy
- Subjects
FLOOD risk, HURRICANE Katrina, 2005, CLIMATE change, FLOOD control, SEA-walls, SEA level & the environment
- Abstract
Designing decision analytical models requires making choices that can involve a range of trade-offs and interactions between epistemic and ethical considerations. Such choices include determining the complexity of a model and deciding what types of risk will be assessed. Here, we demonstrate how model design choices can involve trade-offs between the epistemic benefits of representational completeness and simplicity, which interact with ethical considerations about fairness and human life. We illustrate this point by focusing on modeling studies that assess flood risks in New Orleans, Louisiana. Addressing the ethical and epistemic implications of model design choices can help clarify the scope of factors necessary to inform ethically sound and economically efficient decision-making. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
10. Understanding the detectability of potential changes to the 100-year peak storm surge.
- Author
-
Ceres, Robert, Forest, Chris, and Keller, Klaus
- Subjects
STORM surges, FLOOD risk, INVESTMENTS, INFRASTRUCTURE (Economics), RISK, MANAGEMENT
- Abstract
In many coastal communities, the risks driven by storm surges are motivating substantial investments in flood risk management. The design of adaptive risk management strategies, however, hinges on the ability to detect future changes in storm surge statistics. Previous studies have used observations to identify changes in past storm surge statistics. Here, we focus on the simple and decision-relevant question: How fast can we learn from past and potential future storm surge observations about changes in future statistics? Using Observing System Simulation Experiments, we quantify the time required to detect changes in the probability of extreme storm surge events. We estimate low probabilities of detection when substantial but gradual changes to the 100-year storm surge occur. As a result, policy makers may underestimate considerable increases in storm surge risk over the typically long lifespans of major infrastructure projects. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
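The "100-year peak storm surge" in this abstract is a return level of the annual-maximum surge distribution. One standard estimator (not necessarily the one used in the paper's Observing System Simulation Experiments) is a Gumbel fit; the sketch below uses a method-of-moments fit to synthetic annual maxima:

```python
import math
import random

random.seed(7)
EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

# Synthetic record of annual-maximum storm surges (meters), Gumbel-distributed
true_mu, true_beta = 1.0, 0.3
annual_maxima = [true_mu - true_beta * math.log(-math.log(random.random()))
                 for _ in range(500)]

def gumbel_return_level(maxima, return_period):
    """Method-of-moments Gumbel fit, then the 1-in-T-year quantile."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - EULER_GAMMA * beta
    p = 1.0 - 1.0 / return_period  # non-exceedance probability
    return mu - beta * math.log(-math.log(p))

surge_100yr = gumbel_return_level(annual_maxima, 100)
```

Detecting a *change* in this return level, as the paper does, amounts to asking how many years of such data are needed before estimates from two periods differ significantly.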
11. Impacts of Antarctic fast dynamics on sea-level projections and coastal flood defense.
- Author
-
Wong, Tony, Bakker, Alexander, and Keller, Klaus
- Subjects
SEA level ,FLOOD control ,COASTS ,ICE sheets ,CLIMATE change ,FLOOD risk ,PALEOCLIMATOLOGY - Abstract
Strategies to manage the risks posed by future sea-level rise hinge on a sound characterization of the inherent uncertainties. One of the major uncertainties is the possible rapid disintegration of large fractions of the Antarctic ice sheet in response to rising global temperatures. This could potentially lead to several meters of sea-level rise during the next few centuries. Previous studies have typically been silent on two coupled questions: (i) What are probabilistic estimates of this 'fast dynamic' contribution to sea-level rise? (ii) What are the implications for strategies to manage coastal flooding risks? Here, we present probabilistic hindcasts and projections of sea-level rise to 2100. The fast dynamic mechanism is approximated by a simple parameterization, designed to allow for a careful quantification of the uncertainty in its contribution to sea-level rise. We estimate that global temperature increases ranging from 1.9 to 3.1 °C coincide with fast Antarctic disintegration, and these contributions account for sea-level rise of 21-74 cm this century (5-95% range, Representative Concentration Pathway 8.5). We use a simple cost-benefit analysis of coastal defense to demonstrate in a didactic exercise how neglecting this mechanism and associated uncertainty can (i) lead to strategies which fall sizably short of protection targets and (ii) increase the expected net costs. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
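The cost-benefit flavor of the coastal-defense exercise can be sketched in a van Dantzig-style toy: pick the dike height minimizing investment plus expected flood damage, with and without a fat "fast dynamics" tail in the sea-level distribution. All numbers and the exponential flood-probability form are invented, not the paper's:

```python
import math

def expected_cost(height, slr_scenarios):
    """Investment grows linearly with height; flood probability decays
    exponentially with the freeboard above the (uncertain) sea level."""
    investment = 2.0 * height
    damage = 100.0
    expected_damage = sum(p * damage * math.exp(-(height - slr))
                          for slr, p in slr_scenarios)
    return investment + expected_damage

def best_height(slr_scenarios):
    heights = [h / 10 for h in range(0, 101)]  # candidate heights, 0-10 m
    return min(heights, key=lambda h: expected_cost(h, slr_scenarios))

# (sea-level rise in m, probability); the second set adds a fast-dynamics tail
without_fast = [(0.5, 0.90), (1.0, 0.10)]
with_fast = [(0.5, 0.85), (1.0, 0.10), (2.5, 0.05)]
```

Including the low-probability, high-amplitude tail raises the optimal dike height — the "falling sizably short of protection targets" effect the abstract warns about.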
12. The effects of time-varying observation errors on semi-empirical sea-level projections.
- Author
-
Ruckert, Kelsey, Guan, Yawen, Bakker, Alexander, Forest, Chris, and Keller, Klaus
- Subjects
SEA level ,FLOOD risk ,RELATIVE sea level change ,TIME-varying systems ,AUTOCORRELATION (Statistics) - Abstract
Sea-level rise is a key driver of projected flooding risks. The design of strategies to manage these risks often hinges on projections that inform decision-makers about the surrounding uncertainties. Producing semi-empirical sea-level projections is difficult, for example, due to the complexity of the error structure of the observations, such as time-varying (heteroskedastic) observation errors and autocorrelation of the data-model residuals. This raises the question of how neglecting the error structure impacts hindcasts and projections. Here, we quantify this effect on sea-level projections and parameter distributions by using a simple semi-empirical sea-level model. Specifically, we compare three model-fitting methods: a frequentist bootstrap as well as a Bayesian inversion with and without considering heteroskedastic residuals. All methods produce comparable hindcasts, but the parameter distributions and projections differ considerably based on methodological choices. Our results show that the differences based on the methodological choices are enhanced in the upper tail projections. For example, compared to a frequentist bootstrap, the Bayesian inversion accounting for heteroskedasticity increases the sea-level anomaly with a 1% probability of being equaled or exceeded by about 34% in the year 2050 and by about 40% in the year 2100. These results indicate that neglecting known properties of the observation errors and the data-model residuals can lead to low-biased sea-level projections. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
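The core methodological point — that modeling time-varying observation errors changes the fit — can be shown with a toy likelihood comparison. The residuals and error schedule below are synthetic stand-ins, not the paper's tide-gauge data, and the sketch omits the AR(1) residual term:

```python
import math
import random

random.seed(3)

# Synthetic residuals whose observation error shrinks over time
n = 300
sigmas = [0.5 - 0.4 * t / (n - 1) for t in range(n)]  # 0.5 m down to 0.1 m
residuals = [random.gauss(0.0, s) for s in sigmas]

def gauss_loglik(res, sig):
    """Gaussian log-likelihood with per-observation standard deviations."""
    return sum(-0.5 * math.log(2 * math.pi * s * s) - 0.5 * (r / s) ** 2
               for r, s in zip(res, sig))

hetero = gauss_loglik(residuals, sigmas)               # time-varying errors
pooled = math.sqrt(sum(r * r for r in residuals) / n)  # single fitted sigma
homo = gauss_loglik(residuals, [pooled] * n)
```

The heteroskedastic likelihood dominates on heteroskedastic data; in a Bayesian inversion that difference propagates into the parameter distributions and, as the abstract notes, especially into the upper-tail projections.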
13. Sources and implications of deep uncertainties surrounding sea-level projections.
- Author
-
Bakker, Alexander, Louchard, Domitille, and Keller, Klaus
- Subjects
SEA level ,FLOOD risk ,RELATIVE sea level change ,COASTAL zone management ,CLIMATE change forecasts - Abstract
Long-term flood risk management often relies on future sea-level projections. Projected uncertainty ranges are, however, widely divergent as a result of different methodological choices. The IPCC has condensed this deep uncertainty into a single uncertainty range covering 66% probability or more. Alternatively, structured expert judgment summarizes divergent expert opinions in a single distribution. Recently published uncertainty ranges that are derived from these 'consensus' assessments appear to differ by up to a factor of four. This might result in overconfidence or overinvestment in strategies to cope with sea-level change. Here we explore possible reasons for these different interpretations. This is important for (i) the design of robust strategies and (ii) the exploration of pathways that may eventually lead to some kind of consensus distributions that are relatively straightforward to interpret. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
14. Climate risk management requires explicit representation of societal trade-offs.
- Author
-
Garner, Gregory, Reed, Patrick, and Keller, Klaus
- Subjects
CLIMATE change risk management ,STAKEHOLDERS ,CLIMATE change conferences ,GREENHOUSE gas mitigation ,ATMOSPHERIC models ,CLIMATE sensitivity ,MATHEMATICAL models - Abstract
Strategies for managing climate-change risks impact diverse stakeholder groups that possess potentially conflicting preferences. Basic physics and economics suggest that reconciling all of these preference conflicts may not be possible. Moreover, different climate risk management strategies can yield diverse and potentially severe impacts across different global stakeholders. These preference conflicts and their uncertain impacts require an explicit understanding of the trade-offs that emerge across different risk management strategies. Integrated assessment models (IAMs) traditionally aggregate the stakeholders' preferences across the entire globe into a single, a priori defined utility function. This framing hides climate risk management trade-offs as well as the inherent stakeholder compromises implicit to the resulting single 'optimal' expected utility solution. Here, we analyze a simple IAM to quantify and visualize the multidimensional trade-offs among four objectives representing global concerns: (i) global economic productivity, (ii) reliable temperature stabilization, (iii) climate damages, and (iv) abatement costs. We quantify and visualize the trade-offs across these objectives and demonstrate how a traditional optimal expected utility policy implicitly eliminates many relevant policy pathways. Explicit trade-off analysis provides a richer context for exploring conflicting global policy preferences and clarifies the implications of alternative climate risk mitigation policies to better inform negotiated compromises. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
15. Increasing temperature forcing reduces the Greenland Ice Sheet's response time scale.
- Author
-
Applegate, Patrick, Parizek, Byron, Nicholas, Robert, Alley, Richard, and Keller, Klaus
- Subjects
ATMOSPHERIC temperature ,ICE sheets ,CLIMATE change ,SEA level - Abstract
Damages from sea level rise, as well as strategies to manage the associated risk, hinge critically on the time scale and eventual magnitude of sea level rise. Satellite observations and paleo-data suggest that the Greenland Ice Sheet (GIS) loses mass in response to increased temperatures, and may thus contribute substantially to sea level rise as anthropogenic climate change progresses. The time scale of GIS mass loss and sea level rise are deeply uncertain, and are often assumed to be constant. However, previous ice sheet modeling studies have shown that the time scale of GIS response likely decreases strongly with increasing temperature anomaly. Here, we map the relationship between temperature anomaly and the time scale of GIS response, by perturbing a calibrated, three-dimensional model of GIS behavior. Additional simulations with a profile, higher-order, ice sheet model yield time scales that are broadly consistent with those obtained using the three-dimensional model, and shed light on the feedbacks in the ice sheet system that cause the time scale shortening. Semi-empirical modeling studies that assume a constant time scale of sea level adjustment, and are calibrated to small preanthropogenic temperature and sea level changes, may underestimate future sea level rise. Our analysis suggests that the benefits of reducing greenhouse gas emissions, in terms of avoided sea level rise from the GIS, may be greatest if emissions reductions begin before large temperature increases have been realized. Reducing anthropogenic climate change may also allow more time for design and deployment of risk management strategies by slowing sea level contributions from the GIS. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
16. Funktionelle Störungen des Darms [Functional disorders of the bowel].
- Author
-
Keller, Klaus-Michael, Koletzko, Sybille, and Buderus, Stephan
- Published
- 2014
- Full Text
- View/download PDF
17. Morbus Crohn und Colitis ulcerosa [Crohn's disease and ulcerative colitis].
- Author
-
Keller, Klaus-Michael
- Published
- 2014
- Full Text
- View/download PDF
18. Bakterielle Infektionen: Gram-negative Stäbchen [Bacterial infections: Gram-negative rods].
- Author
-
Berner, Reinhard, Scholz, Horst, Heininger, Ulrich, Keller, Klaus-Michael, Huppertz, Hans-Iko, and Schmitt, Heinz-Josef
- Published
- 2014
- Full Text
- View/download PDF
19. Inaction and climate stabilization uncertainties lead to severe economic risks.
- Author
-
Butler, Martha, Reed, Patrick, Fisher-Vanden, Karen, Keller, Klaus, and Wagener, Thorsten
- Subjects
CLIMATOLOGY ,SOCIOECONOMIC factors ,CARBON dioxide ,SENSITIVITY analysis ,EMISSIONS (Air pollution) - Abstract
Climate stabilization efforts must integrate the actions of many socio-economic sectors to be successful in meeting climate stabilization goals, such as limiting atmospheric carbon dioxide (CO2) concentration to less than double the pre-industrial level. Estimates of the costs and benefits of stabilization policies are often informed by Integrated Assessment Models (IAMs) of the climate and the economy. These IAMs are highly non-linear, with many parameters that abstract globally integrated characteristics of environmental and socio-economic systems. Diagnostic analyses of IAMs can aid in identifying the interdependencies and parametric controls of modeled stabilization policies. Here we report a comprehensive variance-based sensitivity analysis of a doubled-CO2 stabilization policy scenario generated by the globally aggregated Dynamic Integrated model of Climate and the Economy (DICE). We find that neglecting uncertainties considerably underestimates the damage and mitigation costs associated with a doubled-CO2 stabilization goal. More than ninety percent of the states-of-the-world (SOWs) sampled in our analysis exceed the damages and abatement costs calculated for the reference case neglecting uncertainties ($1.2 trillion in 2005 USD, with worst-case costs exceeding $60 trillion). We attribute the variance in these costs to uncertainties in the model parameters relating to climate sensitivity, global participation in abatement, and the cost of lower-emission energy sources. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
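A variance-based (Sobol-type first-order) sensitivity analysis like the one reported here decomposes output variance by input. A brute-force double-loop estimator on a toy two-parameter "cost" function (invented, far simpler than DICE) looks like this:

```python
import random

random.seed(1)

def damages(climate_sensitivity, participation):
    """Toy cost function in which sensitivity dominates participation."""
    return 3.0 * climate_sensitivity + 1.0 * participation

def first_order_index(f, which, n_outer=200, n_inner=200):
    """Double-loop Monte Carlo estimate of S_i = Var(E[f|x_i]) / Var(f)
    for independent Uniform(0, 1) inputs."""
    conditional_means, all_outputs = [], []
    for _ in range(n_outer):
        xi = random.random()  # fix input i
        inner = []
        for _ in range(n_inner):
            xj = random.random()  # sample the other input
            args = (xi, xj) if which == 0 else (xj, xi)
            inner.append(f(*args))
        conditional_means.append(sum(inner) / n_inner)
        all_outputs.extend(inner)

    def var(v):
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / len(v)

    return var(conditional_means) / var(all_outputs)

s_sensitivity = first_order_index(damages, 0)
s_participation = first_order_index(damages, 1)
```

For this linear toy the analytic indices are 0.9 and 0.1; production analyses use far more efficient estimators (e.g., Saltelli sampling), but the variance decomposition idea is the same.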
20. Erkrankungen und Therapieformen des unteren Gastrointestinaltrakts [Diseases and therapies of the lower gastrointestinal tract].
- Author
-
Fuchs, Jörg, Ellerkamp, Verena, Zimmer, Klaus-Peter, Steiß, Jens-Oliver, Hauer, Almuthe C., Buderus, Stephan, Keller, Klaus-Michael, von Schweinitz, Dietrich, Lacaille, Florence, Rümmele, Frank M., Goulet, Olivier, Müller, Herbert, Waag, Karl-Ludwig, and Petersen, Claus
- Published
- 2013
- Full Text
- View/download PDF
21. Leitsymptome und Differenzialdiagnostik [Leading symptoms and differential diagnosis].
- Author
-
Nützenadel, Walter, Wenzl, Tobias G., Zimmer, Klaus-Peter, Ballauff, Antje, Hauer, Almuthe C., Keller, Klaus-Michael, and Waag, Karl-Ludwig
- Published
- 2013
- Full Text
- View/download PDF
22. Abdominale Migräne mit periodischen Bauchschmerzen: Migräne bei Kindern: nicht nur Kopfschmerzen [Abdominal migraine with periodic abdominal pain: migraine in children is not just headache].
- Author
-
Keller, Klaus-Michael
- Published
- 2021
- Full Text
- View/download PDF
23. Toward a physically plausible upper bound of sea-level rise projections.
- Author
-
Sriver, Ryan, Urban, Nathan, Olson, Roman, and Keller, Klaus
- Subjects
ABSOLUTE sea level change ,GLACIAL melting ,THERMAL expansion ,FLOODS ,EMERGENCY management - Abstract
Anthropogenic sea-level rise (SLR) poses considerable risks. Designing a sound SLR risk-management strategy requires careful consideration of decision-relevant uncertainties such as the reasonable upper bound of future SLR. The Intergovernmental Panel on Climate Change's (IPCC) Fourth Assessment Report estimated a likely upper SLR bound in the year 2100 near 0.6 m. More recent studies considering semi-empirical modeling approaches and kinematic constraints on glacial melting suggest a reasonable 2100 SLR upper bound of approximately 2 m. These recent studies have broken important new ground, but they largely neglect uncertainties surrounding thermal expansion (thermosteric SLR) and/or observational constraints on ocean heat uptake. Here we quantify the effects of key parametric uncertainties and observational constraints on thermosteric SLR projections using an Earth system model with a dynamic three-dimensional ocean, which provides a mechanistic representation of deep ocean processes and heat uptake. Considering these effects nearly doubles the contribution of thermosteric SLR compared to previous estimates and increases the reasonable upper bound of 2100 SLR projections by 0.25 m. As an illustrative example of the effect of overconfidence, we show how neglecting thermosteric uncertainty in projections of the SLR upper bound can considerably bias risk analysis and hence the design of adaptation strategies. For conditions close to the Port of Los Angeles, the 0.25 m increase in the reasonable upper bound can result in a flooding-risk increase by roughly three orders of magnitude. Results provide evidence that relatively minor underestimation of the upper bound of projected SLR can lead to major downward biases of future flooding risks. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
24. What are robust strategies in the face of uncertain climate threshold responses?
- Author
-
McInerney, David, Lempert, Robert, and Keller, Klaus
- Subjects
CLIMATE change ,DECISION making ,GREENHOUSE gas mitigation ,LEARNING ,ECONOMIC research ,HUMANITARIANISM ,CLIMATE change mitigation - Abstract
We use an integrated assessment model of climate change to analyze how alternative decision-making criteria affect preferred investments into greenhouse gas mitigation, the distribution of outcomes, the robustness of the strategies, and the economic value of information. We define robustness as trading a small decrease in a strategy's expected performance for a significant increase in a strategy's performance in the worst cases. Specifically, we modify the Dynamic Integrated model of Climate and the Economy (DICE-07) to include a simple representation of a climate threshold response, parametric uncertainty, structural uncertainty, learning, and different decision-making criteria. Economic analyses of climate change strategies typically adopt the expected utility maximization (EUM) framework. We compare EUM with two decision criteria adopted from the finance literature, namely Limited Degree of Confidence (LDC) and Safety First (SF). Both criteria increase the relative weight of the performance under the worst-case scenarios compared to EUM. We show that the LDC and SF criteria provide a computationally feasible foundation for identifying greenhouse gas mitigation strategies that may prove more robust than those identified by the EUM criterion. More robust strategies show higher near-term investments in emissions abatement. Reducing uncertainty has a higher economic value of information for the LDC and SF decision criteria than for EUM. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
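The contrast between expected utility maximization and worst-case-weighted criteria such as LDC can be made concrete with a tiny payoff table (numbers invented; the paper's strategies come from a modified DICE-07):

```python
# Utility of three stylized mitigation strategies across four sampled
# states of the world (illustrative numbers only)
payoffs = {
    "low_abatement":    [10, 9, 8, -20],  # great usually, terrible worst case
    "medium_abatement": [8, 8, 7, 2],
    "high_abatement":   [6, 6, 6, 5],     # robust but costly
}

def expected_utility(u):
    return sum(u) / len(u)

def limited_degree_of_confidence(u, alpha=0.5):
    """LDC-style score: blend expected performance with the worst case."""
    return alpha * expected_utility(u) + (1 - alpha) * min(u)

eum_choice = max(payoffs, key=lambda s: expected_utility(payoffs[s]))
ldc_choice = max(payoffs, key=lambda s: limited_degree_of_confidence(payoffs[s]))
```

Here EUM selects medium abatement while LDC selects the costlier but more robust high-abatement strategy, trading a small loss in expected performance for a much better worst case — exactly the robustness definition given in the abstract.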
25. The economics (or lack thereof) of aerosol geoengineering.
- Author
-
Goes, Marlos, Tuana, Nancy, and Keller, Klaus
- Subjects
AEROSOLS & the environment ,GREENHOUSE gases ,ENVIRONMENTAL engineering ,ANTHROPOGENIC effects on nature ,CARBON dioxide mitigation ,STRATOSPHERE - Abstract
Anthropogenic greenhouse gas emissions are changing the Earth's climate and impose substantial risks for current and future generations. What are scientifically sound, economically viable, and ethically defendable strategies to manage these climate risks? Ratified international agreements call for a reduction of greenhouse gas emissions to avoid dangerous anthropogenic interference with the climate system. Recent proposals, however, call for a different approach: to geoengineer climate by injecting aerosol precursors into the stratosphere. Published economic studies typically neglect the risks of aerosol geoengineering due to (i) the potential for a failure to sustain the aerosol forcing and (ii) the negative impacts associated with the aerosol forcing. Here we use a simple integrated assessment model of climate change to analyze potential economic impacts of aerosol geoengineering strategies over a wide range of uncertain parameters such as climate sensitivity, the economic damages due to climate change, and the economic damages due to aerosol geoengineering forcing. The simplicity of the model provides the advantages of parsimony and transparency, but it also imposes severe caveats on the interpretation of the results. For example, the analysis is based on a globally aggregated model and is hence silent on the intragenerational distribution of costs and benefits. In addition, the analysis neglects the effects of learning and has a very simplistic representation of climate change impacts. Our analysis suggests three main conclusions. First, substituting aerosol geoengineering for CO2 abatement can be an economically ineffective strategy. One key to this finding is that a failure to sustain the aerosol forcing can lead to sizeable and abrupt climatic changes. The monetary damages due to such discontinuous aerosol geoengineering can dominate the cost-benefit analysis because the monetary damages of climate change are expected to increase with the rate of change.
Second, the relative contribution of aerosol geoengineering to an economically optimal portfolio hinges critically on, thus far, deeply uncertain estimates of the damages due to aerosol forcing. Even if we assume that aerosol forcing could be deployed continuously, aerosol geoengineering does not considerably displace CO2 abatement in the simple economic optimal growth model unless the damages due to the aerosol forcing are rather low. Third, substituting aerosol geoengineering for greenhouse gas emission abatement can fail an ethical test regarding intergenerational justice. Such a substitution constitutes a conscious risk transfer to future generations, in violation of principles of intergenerational justice which demand that present generations should not create benefits for themselves in exchange for burdens on future generations. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
26. Climate Projections Using Bayesian Model Averaging and Space-Time Dependence.
- Author
-
Bhat, K., Haran, Murali, Terando, Adam, and Keller, Klaus
- Subjects
CLIMATE change, BAYESIAN analysis, TEMPERATURE, GAUSSIAN processes, SCIENTISTS, DIFFERENCES - Abstract
Projections of future climatic changes are a key input to the design of climate change mitigation and adaptation strategies. Current climate change projections are deeply uncertain. This uncertainty stems from several factors, including parametric and structural uncertainties. One common approach to characterize and, if possible, reduce these uncertainties is to confront (calibrate in a broad sense) the models with historical observations. Here, we analyze the problem of combining multiple climate models using Bayesian Model Averaging (BMA) to derive future projections and quantify uncertainty estimates of spatiotemporally resolved temperature hindcasts and projections. One advantage of the BMA approach is that it allows the assessment of the predictive skill of a model using the training data, which can help identify the better models and discard poor models. Previous BMA approaches have broken important new ground, but often neglected space-time dependencies and/or imposed prohibitive computational demands. Here we improve on the current state-of-the-art by incorporating space-time dependence while using historical data to estimate model weights. We achieve computational efficiency using a kernel mixing approach for representing a space-time process. One key advantage of our new approach is that it enables us to incorporate multiple sources of uncertainty and biases, while remaining computationally tractable for large data sets. We introduce and apply our BMA approach to an ensemble of General Circulation Model surface-temperature output from the Intergovernmental Panel on Climate Change Fourth Assessment Report on a grid of space-time locations. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
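The model-weighting idea in the abstract above can be illustrated with a deliberately minimal sketch. This is not the paper's space-time BMA (which incorporates space-time dependence via kernel mixing); it only shows how hindcast skill against observations translates into model weights and a weighted projection. The model names and all numbers are invented for illustration.

```python
# Minimal BMA-style weighting sketch: weights come from Gaussian
# likelihoods of each model's hindcast against observations.
import math

observations = [14.2, 14.3, 14.5, 14.6]          # "observed" temperatures
hindcasts = {
    "model_A": [14.1, 14.3, 14.4, 14.6],         # skillful model
    "model_B": [13.8, 13.9, 14.0, 14.1],         # biased model
}
sigma = 0.2                                       # assumed observation error (std. dev.)

def log_likelihood(pred, obs, sigma):
    """Gaussian log-likelihood of a hindcast given observations."""
    return sum(-0.5 * ((p - o) / sigma) ** 2
               - math.log(sigma * math.sqrt(2 * math.pi))
               for p, o in zip(pred, obs))

logL = {m: log_likelihood(h, observations, sigma) for m, h in hindcasts.items()}
m0 = max(logL.values())                           # subtract max for numerical stability
raw = {m: math.exp(v - m0) for m, v in logL.items()}
total = sum(raw.values())
weights = {m: w / total for m, w in raw.items()}

# BMA projection: weight each model's future projection by its skill weight
projections = {"model_A": 16.0, "model_B": 15.2}
bma_projection = sum(weights[m] * projections[m] for m in projections)
```

The skillful model dominates the average; a poor hindcast fit drives a model's weight toward zero, which is the "discard poor models" behavior the abstract describes.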
27. Intrinsic Ethics Regarding Integrated Assessment Models for Climate Management.
- Author
-
Schienke, Erich, Baum, Seth, Tuana, Nancy, Davis, Kenneth, and Keller, Klaus
- Subjects
RESEARCH ethics, CLIMATOLOGY, CLIMATE change, ETHICS, PROFESSIONAL ethics - Abstract
In this essay we develop and argue for the adoption of a more comprehensive model of research ethics than is included within current conceptions of responsible conduct of research (RCR). We argue that our model, which we label the ethical dimensions of scientific research (EDSR), is a more comprehensive approach to encouraging ethically responsible scientific research compared to the currently typically adopted approach in RCR training. This essay focuses on developing a pedagogical approach that enables scientists to better understand and appreciate one important component of this model, what we call intrinsic ethics. Intrinsic ethical issues arise when values and ethical assumptions are embedded within scientific findings and analytical methods. Through a close examination of a case study and its application in teaching, namely, evaluation of climate change integrated assessment models, this paper develops a method and case for including intrinsic ethics within research ethics training to provide scientists with a comprehensive understanding and appreciation of the critical role of values and ethical choices in the production of research outcomes. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
28. Economically optimal risk reduction strategies in the face of uncertain climate thresholds.
- Author
-
McInerney, David and Keller, Klaus
- Subjects
EFFECT of human beings on climate change, GLOBAL temperature changes, CLIMATOLOGY, EMISSIONS (Air pollution), GREENHOUSE gas mitigation, CARBON dioxide mitigation, UNITED Nations Framework Convention on Climate Change (1992) - Abstract
Anthropogenic greenhouse gas emissions may trigger climate threshold responses, such as a collapse of the North Atlantic meridional overturning circulation (MOC). Climate threshold responses have been interpreted as an example of “dangerous anthropogenic interference with the climate system” in the sense of the United Nations Framework Convention on Climate Change (UNFCCC). One UNFCCC objective is to “prevent” such dangerous anthropogenic interference. The current uncertainty about important parameters of the coupled natural – human system implies, however, that this UNFCCC objective can only be achieved in a probabilistic sense. In other words, climate management can only reduce – but not entirely eliminate – the risk of crossing climate thresholds. Here we use an integrated assessment model of climate change to derive economically optimal risk-reduction strategies. We implement a stochastic version of the DICE model and account for uncertainty about four parameters that have been previously identified as dominant drivers of the uncertain system response. The resulting model is, of course, just a crude approximation as it neglects, for example, some structural uncertainty and focuses on a single threshold, out of many potential climate responses. Subject to this caveat, our analysis suggests five main conclusions. First, reducing the numerical artifacts due to sub-sampling the parameter probability density functions to reasonable levels requires sample sizes exceeding 10^3. Conclusions of previous studies that are based on much smaller sample sizes may hence need to be revisited. Second, following a business-as-usual (BAU) scenario results in odds for an MOC collapse in the next 150 years exceeding 1 in 3 in this model. Third, an economically “optimal” strategy (that maximizes the expected utility of the decision-maker) reduces carbon dioxide (CO2) emissions by approximately 25% at the end of this century, compared with BAU emissions. Perhaps surprisingly, this strategy leaves the odds of an MOC collapse virtually unchanged compared to a BAU strategy. Fourth, reducing the odds for an MOC collapse to 1 in 10 would require an almost complete decarbonization of the economy within a few decades. Finally, further risk reductions (e.g., to 1 in 100) are possible in the framework of the simple model, but would require faster and more expensive reductions in CO2 emissions. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
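The first conclusion above (sample sizes exceeding 10^3) reflects basic Monte Carlo sampling error: the standard error of an estimated probability p from n samples scales as sqrt(p(1-p)/n). A back-of-the-envelope sketch with illustrative numbers only:

```python
# Sampling error of a Monte Carlo probability estimate.
# The probability value below is illustrative, not a result of this code.
import math

def prob_standard_error(p, n):
    """Standard error of an estimated probability p from n i.i.d. samples."""
    return math.sqrt(p * (1 - p) / n)

# e.g., estimating odds of roughly 1 in 3 with different sample sizes:
errors = {n: prob_standard_error(1 / 3, n) for n in (100, 1000, 10000)}
```

With only 100 samples the estimate carries a standard error of almost 5 percentage points, which is why conclusions based on small samples can be artifacts of sub-sampling.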
29. Managing the risks of climate thresholds: uncertainties and information needs.
- Author
-
Keller, Klaus, Yohe, Gary, and Schlesinger, Michael
- Subjects
EDITORIALS, EFFECT of human beings on climate change, HUMAN ecology, GLOBAL temperature changes, CLIMATOLOGY - Abstract
The authors reflect on the risks of potential climate thresholds. They discuss the impact of human activities on the climate system and cite the uncertainties in the current understanding of potential climate threshold responses. They call for enhanced research strategies on possible climate thresholds and for an improved understanding of the dangers posed by anthropogenic interference with the climate system.
- Published
- 2008
- Full Text
- View/download PDF
30. Detecting potential changes in the meridional overturning circulation at 26˚N in the Atlantic.
- Author
-
Baehr, Johanna, Keller, Klaus, and Marotzke, Jochem
- Subjects
MERIDIONAL overturning circulation, OCEAN circulation, OCEANOGRAPHY, CLIMATE change, SIMULATION methods & models - Abstract
We analyze the ability of an oceanic monitoring array to detect potential changes in the North Atlantic meridional overturning circulation (MOC). The observing array is ‘deployed’ into a numerical model (ECHAM5/MPI-OM), and simulates the measurements of density and wind stress at 26° N in the Atlantic. The simulated array mimics the continuous monitoring system deployed in the framework of the UK Rapid Climate Change program. We analyze a set of three realizations of a climate change scenario (IPCC A1B), in which – within the considered time-horizon of 200 years – the MOC weakens, but does not collapse. For the detection analysis, we assume that the natural variability of the MOC is known from an independent source, the control run. Our detection approach accounts for the effects of observation errors, infrequent observations, autocorrelated internal variability, and uncertainty in the initial conditions. Continuous observation with the simulated array for approximately 60 years yields a statistically significant (p < 0.05) detection with 95 percent reliability assuming a random observation error of 1 Sv (1 Sv = 10^6 m^3 s^-1). Observing continuously with an observation error of 3 Sv yields a detection time of about 90 years (with 95 percent reliability). Repeated hydrographic transects every 5 years / 20 years result in a detection time of about 90 years / 120 years, with 95 percent reliability and an assumed observation error of 3 Sv. An observation error of 3 Sv (one standard deviation) is a plausible estimate of the observation error associated with the RAPID UK 26° N array. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
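The dependence of detection time on observation error described above can be caricatured with a simple trend-significance test. The paper's analysis additionally handles autocorrelated internal variability, infrequent sampling, and initial-condition uncertainty; the sketch below assumes i.i.d. Gaussian errors only, and the trend slope and error levels are illustrative assumptions, not the paper's values.

```python
# Stylized detection time: first year at which an OLS trend in annual
# observations becomes significant at p < 0.05, given white-noise errors.
import math

def detection_time(trend_sv_per_yr, sigma_sv, z=1.96, max_years=300):
    """Years of annual observations needed before |trend| / SE(trend) >= z,
    using the standard error of an OLS slope for unit-spaced samples."""
    for n in range(3, max_years):
        se = sigma_sv * math.sqrt(12.0 / (n * (n ** 2 - 1)))
        if abs(trend_sv_per_yr) / se >= z:
            return n
    return None

# An assumed weakening of 0.05 Sv/yr, observed with 1 Sv vs. 3 Sv error:
t1 = detection_time(0.05, 1.0)
t3 = detection_time(0.05, 3.0)
```

Even in this toy setting, tripling the observation error roughly doubles the detection time, the qualitative pattern reported in the abstract.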
31. Abrupt climate change near the poles.
- Author
-
Keller, Klaus, Tol, Richard S. J., Toth, Ferenc L., and Yohe, Gary W.
- Subjects
CLIMATE change, ASSOCIATIONS, institutions, etc. - Abstract
The article discusses various reports published within the issue, including one on the workshop hosted by the Aspen Global Change Institute on abrupt climate change and another on the implications for climate policy.
- Published
- 2008
- Full Text
- View/download PDF
32. Carbon dioxide sequestration: how much and when?
- Author
-
Keller, Klaus, McInerney, David, and Bradford, David F.
- Subjects
CARBON compounds, COST effectiveness, ECONOMIC forecasting, CLIMATE change, ACCLIMATIZATION, CARBON dioxide, ENERGY minerals, FOSSIL fuels, POWER resources, ENVIRONMENTAL impact charges - Abstract
Carbon dioxide (CO2) sequestration has been proposed as a key component in technological portfolios for managing anthropogenic climate change, since it may provide a faster and cheaper route to significant reductions in atmospheric CO2 concentrations than abating CO2 production. However, CO2 sequestration is not a perfect substitute for CO2 abatement because CO2 may leak back into the atmosphere (thus imposing future climate change impacts) and because CO2 sequestration requires energy (thus producing more CO2 and depleting fossil fuel resources earlier). Here we use analytical and numerical models to assess the economic efficiency of CO2 sequestration and analyze the optimal timing and extent of CO2 sequestration. The economic efficiency factor of CO2 sequestration can be expressed as the ratio of the marginal net benefits of sequestering CO2 and avoiding CO2 emissions. We derive an analytical solution for this efficiency factor for a simplified case in which we account for CO2 leakage, discounting, the additional fossil fuel requirement of CO2 sequestration, and the growth rate of carbon taxes. In this analytical model, the economic efficiency of CO2 sequestration decreases as the CO2 tax growth rate, leakage rates and energy requirements for CO2 sequestration increase. Increasing discount rates increases the economic efficiency factor. In this simple model, short-term sequestration methods, such as afforestation, can even have negative economic efficiencies. We use a more realistic integrated-assessment model to additionally account for potentially important effects such as learning-by-doing and socio-economic inertia on optimal strategies. We measure the economic efficiency of CO2 sequestration by the ratio of the marginal costs of CO2 sequestration and CO2 abatement along optimal trajectories.
We show that the positive impacts of investments in CO2 sequestration through the reduction of future marginal CO2 sequestration costs and the alleviation of future inertia constraints can initially exceed the marginal sequestration costs. As a result, the economic efficiencies of CO2 sequestration can exceed 100% and an optimal strategy will subsidize CO2 sequestration that is initially more expensive than CO2 abatement. The potential economic value of a feasible and acceptable CO2 sequestration technology is equivalent - in the adopted utilitarian model - to a one-time investment of several percent of present gross world product. It is optimal in the chosen economic framework to sequester substantial CO2 quantities into reservoirs with small or zero leakage, given published estimates of marginal costs and climate change impacts. The optimal CO2 trajectories in the case of sequestration from air can approach the pre-industrial level, constituting geoengineering. Our analysis is silent on important questions (e.g., the effects of model and parametric uncertainty, the potential learning about these uncertainties, or the ethical dimensions of such geoengineering strategies), which need to be addressed before our findings can be translated into policy-relevant recommendations. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
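The qualitative behavior of the analytical efficiency factor described above (decreasing in the leak rate, tax growth rate, and energy penalty; increasing in the discount rate) can be reproduced with a stylized closed form. The specific expression below, with constant leak, discount, and tax-growth rates, is our illustration of that behavior, not necessarily the paper's formula, and all parameter values are invented.

```python
def sequestration_efficiency(leak_rate, discount_rate, tax_growth, energy_penalty):
    """Stylized efficiency of sequestering 1 tCO2 relative to abating it.
    Leaked CO2 is re-taxed at a tax growing at rate tax_growth; its present
    value per ton sequestered is leak_rate / (leak_rate + discount_rate - tax_growth)."""
    assert leak_rate + discount_rate > tax_growth, "present value must converge"
    pv_leakage = leak_rate / (leak_rate + discount_rate - tax_growth)
    return (1.0 - energy_penalty) * (1.0 - pv_leakage)

# Deep geologic storage (slow leakage) vs. a short-term reservoir,
# both with an assumed 15% energy penalty:
geologic = sequestration_efficiency(0.001, 0.05, 0.02, 0.15)
short_term = sequestration_efficiency(0.10, 0.05, 0.02, 0.15)
```

In this toy form the efficiency can even turn negative when the tax growth rate exceeds the discount rate, echoing the abstract's remark that short-term methods such as afforestation can have negative efficiencies.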
33. The dynamics of learning about a climate threshold.
- Author
-
Keller, Klaus and McInerney, David
- Subjects
CLIMATOLOGY, GREENHOUSE gases, MERIDIONAL overturning circulation, WEATHER, EMISSIONS (Air pollution) - Abstract
Anthropogenic greenhouse gas emissions may trigger threshold responses of the climate system. One relevant example of such a potential threshold response is a shutdown of the North Atlantic meridional overturning circulation (MOC). Numerous studies have analyzed the problem of early MOC change detection (i.e., detection before the forcing has committed the system to a threshold response). Here we analyze the early MOC prediction problem. To this end, we virtually deploy an MOC observation system into a simple model that mimics potential future MOC responses and analyze the timing of confident detection and prediction. Our analysis suggests that a confident prediction of a potential threshold response can require century time scales, considerably longer than the time required for confident detection. The signal enabling early prediction of an approaching MOC threshold in our model study is associated with the rate at which the MOC intensity decreases for a given forcing. A faster MOC weakening implies a higher MOC sensitivity to forcing. An MOC sensitivity exceeding a critical level results in a threshold response. Determining whether an observed MOC trend in our model differs in a statistically significant way from an unforced scenario (the detection problem) imposes lower requirements on an observation system than the determination whether the MOC will shut down in the future (the prediction problem). As a result, the virtual observation systems designed in our model for early detection of MOC changes might well fail at the task of early and confident prediction. Transferring this conclusion to the real world requires a considerably refined MOC model, as well as a more complete consideration of relevant observational constraints. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
34. Immundefizienz und Darm [Immunodeficiency and the gut].
- Author
-
Keller, Klaus-Michael
- Published
- 2014
- Full Text
- View/download PDF
35. Kuhmilchallergie [Cow's milk allergy].
- Author
-
Keller, Klaus-Michael
- Published
- 2014
- Full Text
- View/download PDF
36. Medical care of obese children and adolescents. APV: a standardised multicentre documentation derived to study initial presentation and cardiovascular risk factors in patients transferred to specialised treatment institutions.
- Author
-
Reinehr, Thomas, Wabitsch, Martin, Andler, Werner, Beyer, Peter, Böttner, Antje, Chen-Stute, Annette, Fromme, Carmen, Hampel, Olaf, Keller, Klaus M., Kilian, Uwe, Kolbe, Hilde, Lob-Corzilius, Thomas, Marg, Wolfgang, Mayer, Hermann, Mohnike, Klaus, Oepen, Johannes, Povel, Clemens, Richter, Brunhild, Riedinger, Nikola, and Schauerte, Gerd
- Subjects
CHILDHOOD obesity, HYPERTENSION, EVALUATION of medical care, BLOOD pressure, CARDIOVASCULAR diseases, DISEASE risk factors - Abstract
So far in Europe, no studies have been published on the structuring of medical care for obese children and adolescents. Besides anthropometric parameters, evaluations of the cardiovascular risk factors hypertension, dyslipidaemia, impaired glucose metabolism and treatment modalities were documented in a standardised multicentre evaluation survey (APV) of 18 primarily outpatient and nine rehabilitation institutions. In total, 3837 children (aged 2-20 years) took part from 2000 to March 2003, of whom 1985 were treated in outpatient institutions and 1852 in rehabilitation institutions. Of these children, 10% were overweight, 37% obese, 49% extremely obese and 4% of normal weight at initial presentation. The frequencies of diagnostic procedures performed and documented were low (measurement of blood pressure 43%, lipids 40%, glucose metabolism 21%). In the subgroup of obese children who were screened for cardiovascular risk factors, 23% suffered from hypertension, 11% displayed increased cholesterol, 9% increased low-density lipoprotein-cholesterol, 29% increased triglycerides, 11% decreased high-density lipoprotein-cholesterol and 6% had impaired glucose metabolism. Conclusion: Despite the high prevalence of cardiovascular risk factors in obese children and adolescents confirmed in this report, diagnostic procedures failed in a considerable percentage even in specialised treatment centres for obese children and adolescents. In future, the feedback based on standardised evaluation of diagnostic and treatment procedures should aim to improve the quality of medical care. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
37. PRESERVING THE OCEAN CIRCULATION: IMPLICATIONS FOR CLIMATE POLICY.
- Author
-
Keller, Klaus and Tan, Kelvin
- Subjects
OCEAN currents, OCEAN circulation, GREENHOUSE gases - Abstract
Presents a study which investigated the optimal policy response to the risk of a reorganization of the North Atlantic's current pattern, known as a thermohaline circulation collapse. Analysis of the link between atmospheric greenhouse gas concentrations and the ocean circulation; Integrated assessment model used in the study; Results and discussion.
- Published
- 2000
- Full Text
- View/download PDF
38. MR cholangiography in children with autosomal recessive polycystic kidney disease.
- Author
-
Jung, Gregor, Benz-Bohm, Gabriele, Kugel, Harald, Keller, Klaus-Michael, and Querfeld, Uwe
- Abstract
Background. Magnetic resonance cholangiography (MRC) is a relatively new, non-invasive imaging technique of the biliary tree that has shown good correlation with endoscopic retrograde cholangiopancreatography. The liver manifestation of autosomal recessive polycystic kidney disease (ARPKD) is congenital hepatic fibrosis (CHF). CHF may be accompanied by Caroli's disease, which is characterised by a non-obstructive dilation of the intrahepatic bile ducts. Objective. A prospective study was conducted to determine the presence and extent of Caroli's disease in children with ARPKD. Materials and methods. Seven children with ARPKD aged from 3.0 to 10.1 years were examined. CHF was confirmed in all biopsied cases (5 of 7). All children had been followed by repeated abdominal US examinations for many years. The MR examination included a morphological imaging study using a T2-weighted turbo spin-echo sequence and a heavily T2-weighted inversion-recovery turbo spin-echo sequence with three-dimensional maximum intensity projection (MIP) reconstructions for MRC. Results. The diagnosis of Caroli's disease could be made in one case by US; in two other children Caroli's disease was suspected, but the differentiation from hepatic cysts was not possible. By MRC, Caroli's disease could be diagnosed in three of seven children. Furthermore, MRC with MIP reconstructions demonstrated the extent of the disease by showing the entire biliary tree from different angles. Conclusions. MRC is a valuable method to establish the diagnosis and demonstrate the extent of Caroli's disease. [ABSTRACT FROM AUTHOR]
- Published
- 1999
- Full Text
- View/download PDF
39. Benign recurrent intrahepatic cholestasis (BRIC): evidence of genetic heterogeneity and delimitation of the BRIC locus to a 7-cM interval between D18S69 and D18S64.
- Author
-
Sinke, Richard J., Carlton, Victoria E. H., Juijn, Jenneke A., Delhaas, Tammo, Bull, Laura, Henegouwen, Gerard P. Berge, van Hattum, Jan, Keller, Klaus M., Sinaasappel, Maarten, Bijleveld, Charles M. A., Knol, I. E., van Amstel, Hans-Kristian Ploos, Pearson, Peter L., Berger, Ruud, Freimer, Nelson B., and Houwen, Roderick H. J.
- Abstract
Benign recurrent intrahepatic cholestasis (BRIC) is an autosomal recessive liver disease characterized by multiple episodes of cholestasis without progression to chronic liver disease. The gene was previously assigned to chromosome 18q21, using a shared segment analysis in three families from the Netherlands. In the present study we report the linkage analysis of an expanded sample of 14 BRIC families, using 15 microsatellite markers from the 18q21 region. Obligate recombinants in two families place the gene in a 7-cM interval, between markers D18S69 and D18S64. All intervening markers had significant LOD scores in two-point linkage analysis. Moreover, we identified one family in which the BRIC gene seems to be unlinked to the 18q21 region, or that represents incomplete penetrance of the BRIC genotype. [ABSTRACT FROM AUTHOR]
- Published
- 1997
- Full Text
- View/download PDF
40. [131I]-Metaiodobenzylguanidine in the treatment of metastatic neuroblastoma.
- Author
-
Klingebiel, Thomas, Treuner, Jörn, Ehninger, Gerhard, Keller, Klaus, Dopfer, Roland, Feine, Ullrich, and Niethammer, Dietrich
- Abstract
Ten children with stage III or IV neuroblastoma that had either relapsed or was refractory were treated with [131I]-metaiodobenzylguanidine (MIBG) from 1984 to 1986. The total dose ranged from 4,365 to 21,900 MBq and was given in one to five courses. Two patients achieved a complete remission (CR), two, a partial remission (PR), and three, an arrest of the disease. Pharmacological studies showed that 93% of detectable radioactivity was attributable to MIBG at the beginning of the infusion. However, by the end of the infusion this had decreased to 88%. The terminal half-life of MIBG was 37.0 h, whereas that of non-MIBG-bound iodine was 71.6 h. Therefore, the radioactivity-time product of non-MIBG-bound iodine was much higher than that of MIBG. Dosimetric studies showed a mean level of absorbed radiation for the total body of 160 μGy/MBq, a liver irradiation of 540 μGy/MBq and a mean tumour radiation of 10,500 μGy/MBq. [ABSTRACT FROM AUTHOR]
- Published
- 1989
- Full Text
- View/download PDF
41. Analytic regularizations, finite part prescriptions and products of distributions.
- Author
-
Keller, Klaus
- Published
- 1978
- Full Text
- View/download PDF
42. The diagnostic significance of IgG cow's milk protein antibodies re-evaluated.
- Author
-
Keller, Klaus, Bürgin-Wolff, Annemarie, Lippold, Rainer, Wirth, Stefan, Lentze, Michael, Keller, K M, Bürgin-Wolff, A, Lippold, R, Wirth, S, and Lentze, M J
- Abstract
The effect of different feeding regimens, notably the use of hydrolysed cow's milk formulas, on the development of allergic reactions and the development of cow's milk protein-IgG antibodies is still disputed. We prospectively compared the development of allergic manifestations and cow's milk protein-IgG antibodies in a total of 702 infants who were divided into six groups: 1. exclusively breast milk for at least 4 weeks (n = 206). 2. Breast milk plus initial partially hydrolysed formula (n = 104). 3. Breast milk plus extensively hydrolysed formula (n = 50). 4. Breast milk plus initial conventional cow's milk formula (n = 73). 5. Conventional cow's milk with or without breast milk throughout (n = 187). 6. Extensively hydrolysed cow's milk formula for 2 months, followed by conventional cow's milk (n = 82). Cow's milk protein antibodies were determined by an indirect immunofluorescent test. Antibody titres rose slowly in groups 1, 3 and 6. Children in group 5 showed two high peaks. There were no significant differences in the frequency and type of allergic manifestations between the groups. Introduction of cow's milk formula during the first three months of life resulted in elevated antibody titres in all breast fed infants compared with introduction at a later date. Conclusion: In contrast to a previous study from the same laboratory, there is no diagnostic significance of cow's milk protein-IgG antibodies for allergic manifestations. The occurrence of these antibodies is a physiological phenomenon: the shorter the breast feeding period and the earlier cow's milk formula is introduced, the higher the antibody levels. [ABSTRACT FROM AUTHOR]
- Published
- 1996
- Full Text
- View/download PDF
43. Neglecting uncertainties biases house-elevation decisions to manage riverine flood risks.
- Author
-
Zarekarizi, Mahkameh, Srikrishnan, Vivek, and Keller, Klaus
- Subjects
STATISTICAL decision making, UNCERTAINTY, DISCOUNT prices, EARTH sciences, EMERGENCY management, FLOOD warning systems, FLOOD risk - Abstract
Homeowners around the world elevate houses to manage flood risks. Deciding how high to elevate a house poses a nontrivial decision problem. The U.S. Federal Emergency Management Agency (FEMA) recommends elevating existing houses to the Base Flood Elevation (the elevation of the 100-year flood) plus a freeboard. This recommendation neglects many uncertainties. Here we analyze a case study of riverine flood risk management using a multi-objective robust decision-making framework in the face of deep uncertainties. While the quantitative results are location-specific, the approach and overall insights are generalizable. We find strong interactions between the economic, engineering, and Earth science uncertainties, illustrating the need for expanding on previous integrated analyses to further understand the nature and strength of these connections. Considering deep uncertainties surrounding flood hazards, the discount rate, the house lifetime, and the fragility can increase the economically optimal house elevation to values well above FEMA's recommendation. This study investigates the effects of uncertainties on the decision of how high to elevate a house in flood-prone areas. Accounting for several uncertainties suggests avenues for improving FEMA's guidelines. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
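The elevation decision described above trades a construction cost that rises with elevation against discounted expected flood damages that fall with it. A toy single-objective sketch of that trade-off follows; the paper uses a multi-objective robust framework under deep uncertainty, and every parameter below (costs, flood distribution, lifetime, discount rate) is invented for illustration.

```python
# Toy cost-benefit of house elevation: minimize construction cost plus
# the present value of expected flood damages over the house lifetime.
import math

def total_cost(elevation_ft, cost_per_ft=10_000, house_value=200_000,
               depth_scale_ft=2.0, base_annual_prob=0.01,
               lifetime_yr=30, discount_rate=0.04):
    # Exponential tail: annual probability that flood depth exceeds the elevation
    p_flood = base_annual_prob * math.exp(-elevation_ft / depth_scale_ft)
    # Annuity factor: present value of $1/yr of expected damage over the lifetime
    annuity = (1 - (1 + discount_rate) ** -lifetime_yr) / discount_rate
    expected_damage = p_flood * house_value * annuity
    return elevation_ft * cost_per_ft + expected_damage

# Search integer elevations for the cost-minimizing choice
best = min(range(0, 15), key=total_cost)
```

Changing the discount rate, lifetime, or hazard tail shifts the optimum, which is the sensitivity to uncertain inputs that the abstract emphasizes.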
44. Characterizing the deep uncertainties surrounding coastal flood hazard projections: A case study for Norfolk, VA.
- Author
-
Ruckert, Kelsey L., Srikrishnan, Vivek, and Keller, Klaus
- Subjects
FLOODS, SEA level, FLOOD damage prevention, ICE sheets, DECISION making, STORMS - Abstract
Coastal planners and decision makers design risk management strategies based on hazard projections. However, projections can differ drastically. What causes this divergence and which projection(s) should a decision maker adopt to create plans and adaptation efforts for improving coastal resiliency? Using Norfolk, Virginia, as a case study, we start to address these questions by characterizing and quantifying the drivers of differences between published sea-level rise and storm surge projections, and how these differences can impact efforts to improve coastal resilience. We find that assumptions about the complex behavior of ice sheets are the primary drivers of flood hazard diversity. Adopting a single hazard projection neglects key uncertainties and can lead to overconfident projections and downwards biased hazard estimates. These results highlight key avenues to improve the usefulness of hazard projections to inform decision-making such as (i) representing complex ice sheet behavior, (ii) covering decision-relevant timescales beyond this century, (iii) resolving storm surges with a low chance of occurring (e.g., a 0.2% chance per year), (iv) considering that storm surge projections may deviate from the historical record, and (v) communicating the considerable deep uncertainty. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
44. Bakterielle Infektionen: Anaerobier [Bacterial infections: anaerobes].
- Author
-
Schmitt, Heinz-Josef and Keller, Klaus-Michael
- Published
- 2014
- Full Text
- View/download PDF