22 results for "Klaus Keller"
Search Results
2. Equity is more important for the social cost of methane than climate uncertainty
- Author
-
William D. Collins, Frank Errickson, Vivek Srikrishnan, David Anthoff, and Klaus Keller
- Subjects
Multidisciplinary, Meteorology & atmospheric sciences, Social cost, Equity (finance), Environmental sciences, Radiative forcing, Econometrics, Climate sensitivity, Climate model - Abstract
The social cost of methane (SC-CH4) measures the economic loss of welfare caused by emitting one tonne of methane into the atmosphere. This valuation may in turn be used in cost–benefit analyses or to inform climate policies [1–3]. However, current SC-CH4 estimates have not included key scientific findings and observational constraints. Here we estimate the SC-CH4 by incorporating the recent upward revision of 25 per cent to calculations of the radiative forcing of methane [4], combined with calibrated reduced-form global climate models and an ensemble of integrated assessment models (IAMs). Our multi-model mean estimate for the SC-CH4 is US$933 per tonne of CH4 (5–95 per cent range, US$471–1,570 per tonne of CH4) under a high-emissions scenario (Representative Concentration Pathway (RCP) 8.5), a 22 per cent decrease compared to estimates based on the climate uncertainty framework used by the US federal government [5]. Our ninety-fifth percentile estimate is 51 per cent lower than the corresponding figure from the US framework. Under a low-emissions scenario (RCP 2.6), our multi-model mean decreases to US$710 per tonne of CH4. Tightened equilibrium climate sensitivity estimates paired with the effect of previously neglected relationships between uncertain parameters of the climate model lower these estimates. We also show that our SC-CH4 estimates are sensitive to model combinations; for example, within one IAM, different methane cycle sub-models can induce variations of approximately 20 per cent in the estimated SC-CH4. But switching IAMs can more than double the estimated SC-CH4. Extending our results to account for societal concerns about equity produces SC-CH4 estimates that differ by more than an order of magnitude between low- and high-income regions. Our central equity-weighted estimate for the USA increases to US$8,290 per tonne of CH4, whereas our estimate for sub-Saharan Africa decreases to US$134 per tonne of CH4.
Accounting for equity influences the social cost of methane more than climate model uncertainty does and produces results that differ by over an order of magnitude between low- and high-income regions.
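The equity-weighting idea described in this abstract can be sketched in a few lines. The scheme below, in which each region's damages are scaled by the ratio of the deciding region's per-capita consumption to the damaged region's consumption raised to an elasticity eta, is a simplified, illustrative assumption with made-up numbers; it is not the paper's actual model chain, but it reproduces the qualitative result that rich- and poor-region perspectives differ by an order of magnitude or more.

```python
# Illustrative equity weighting of regional climate damages.
# All damages and consumption figures below are hypothetical.

def equity_weighted_sc(damages, consumption, region, eta=1.0):
    """Social cost from `region`'s perspective: each region j's damages
    are scaled by (c_region / c_j) ** eta, so a dollar of damage in a
    poorer region counts for more."""
    c_r = consumption[region]
    return sum((c_r / consumption[j]) ** eta * d for j, d in damages.items())

# Hypothetical per-tonne damages (US$) and per-capita consumption (US$/yr)
damages = {"USA": 150.0, "Sub-Saharan Africa": 300.0, "Rest of world": 450.0}
consumption = {"USA": 45_000, "Sub-Saharan Africa": 1_500, "Rest of world": 9_000}

sc_usa = equity_weighted_sc(damages, consumption, "USA")
sc_ssa = equity_weighted_sc(damages, consumption, "Sub-Saharan Africa")
print(f"USA perspective: {sc_usa:.0f}, Sub-Saharan Africa perspective: {sc_ssa:.0f}")
```

With eta = 0 the weights collapse to 1 and the two perspectives coincide; the gap between them is driven entirely by the consumption ratios.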
- Published
- 2021
3. Attention to values helps shape convergence research
- Author
-
Casey Helgeson, Robert E. Nicholas, Klaus Keller, Chris E. Forest, and Nancy Tuana
- Subjects
Atmospheric Science ,Global and Planetary Change - Abstract
Convergence research is driven by specific and compelling problems and requires deep integration across disciplines. The potential of convergence research is widely recognized, but questions remain about how to design, facilitate, and assess such research. Here we analyze a seven-year, twelve-million-dollar convergence project on sustainable climate risk management to answer two questions. First, what is the impact of a project-level emphasis on the values that motivate and tie convergence research to the compelling problems? Second, how does participation in convergence projects shape the research of postdoctoral scholars who are still in the process of establishing themselves professionally? We use an interview-based approach to characterize what the project specifically enabled in each participant’s research. We find that (a) the project pushed participants’ research into better alignment with the motivating concept of convergence research and that this effect was stronger for postdoctoral scholars than for more senior faculty. (b) Postdocs’ self-assessed understanding of key project themes, however, appears unconnected to metrics of project participation, raising questions about training and integration. Regarding values, (c) the project enabled heightened attention to values in the research of a large minority of participants. (d) Participants strongly believe in the importance of explicitly reflecting on values that motivate and pervade scientific research, but they question their own understanding of how to put value-focused science into practice. This mismatch of perceived importance with poor understanding highlights an unmet need in the practice of convergence science.
- Published
- 2022
4. A tighter constraint on Earth-system sensitivity from long-term temperature and carbon-cycle observations
- Author
-
Klaus Keller, Tony E. Wong, Dana L. Royer, and Ying Cui
- Subjects
Computer and information sciences, Meteorology & atmospheric sciences, Palaeoclimate, Geochemistry & geophysics, Bayesian inference, Statistics (applications), Carbon cycle, Radiative forcing, Earth system science, Climate sciences - Abstract
The long-term temperature response to a given change in CO2 forcing, or Earth-system sensitivity (ESS), is a key parameter quantifying our understanding about the relationship between changes in Earth’s radiative forcing and the resulting long-term Earth-system response. Current ESS estimates are subject to sizable uncertainties. Long-term carbon cycle models can provide a useful avenue to constrain ESS, but previous efforts either use rather informal statistical approaches or focus on discrete paleoevents. Here, we improve on previous ESS estimates by using a Bayesian approach to fuse deep-time CO2 and temperature data over the last 420 Myrs with a long-term carbon cycle model. Our median ESS estimate of 3.4 °C (2.6–4.7 °C; 5–95% range) shows a narrower range than previous assessments. We show that weaker chemical weathering relative to the a priori model configuration via reduced weatherable land area yields better agreement with temperature records during the Cretaceous. Research into improving the understanding about these weathering mechanisms hence provides potentially powerful avenues to further constrain this fundamental Earth-system property.
Earth-system sensitivity (ESS) describes the long-term temperature response for a given change in atmospheric CO2 and, as such, is a crucial parameter to assess future climate change. Here, the authors use a Bayesian model with data from the last 420 Myrs to reduce uncertainties and estimate ESS to be around 3.4 °C.
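The Bayesian fusion step can be illustrated schematically with a discrete grid over ESS values: a flat prior multiplied by a data likelihood and renormalized. The Gaussian likelihood centred at 3.4 °C below is a stand-in for the paper's actual carbon-cycle-model comparison against 420 Myr of proxy data; all numbers are illustrative.

```python
import math

# Toy Bayesian update on a grid of ESS values (°C per CO2 doubling).
ess_grid = [2.0 + 0.1 * i for i in range(41)]      # 2.0 ... 6.0 °C
prior = [1.0 / len(ess_grid)] * len(ess_grid)       # flat prior

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical data constraint: observations favour ESS near 3.4 °C
likelihood = [gaussian(e, 3.4, 0.6) for e in ess_grid]

post = [p * l for p, l in zip(prior, likelihood)]
z = sum(post)
post = [p / z for p in post]                        # normalise posterior

# Posterior median via cumulative probability mass
cum, median = 0.0, None
for e, p in zip(ess_grid, post):
    cum += p
    if cum >= 0.5:
        median = e
        break
print(f"posterior median ESS ≈ {median:.1f} °C")
```

In the paper the likelihood comes from running a long-term carbon cycle model, so each grid evaluation is far more expensive, but the update logic is the same.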
- Published
- 2021
5. Epistemic and ethical trade-offs in decision analytical modelling
- Author
-
Alexander M. R. Bakker, Nancy Tuana, Martin A. Vezér, and Klaus Keller
- Subjects
Economic efficiency, Atmospheric Science, Global and Planetary Change, Ethical trade-offs, Epistemology, Risk management - Abstract
Designing decision analytical models requires making choices that can involve a range of trade-offs and interactions between epistemic and ethical considerations. Such choices include determining the complexity of a model and deciding what types of risk will be assessed. Here, we demonstrate how model design choices can involve trade-offs between the epistemic benefits of representational completeness and simplicity, which interact with ethical considerations about fairness and human life. We illustrate this point by focusing on modeling studies that assess flood risks in New Orleans, Louisiana. Addressing the ethical and epistemic implications of model design choices can help clarify the scope of factors necessary to inform ethically sound and economically efficient decision-making.
- Published
- 2017
6. Understanding the detectability of potential changes to the 100-year peak storm surge
- Author
-
Klaus Keller, Chris E. Forest, and Robert L. Ceres
- Subjects
Atmospheric Science, Global and Planetary Change, Storm surge, Climate change, Storm, Flood risk management, Climatology, Environmental science, Risk management - Abstract
In many coastal communities, the risks driven by storm surges are motivating substantial investments in flood risk management. The design of adaptive risk management strategies, however, hinges on the ability to detect future changes in storm surge statistics. Previous studies have used observations to identify changes in past storm surge statistics. Here, we focus on the simple and decision-relevant question: How fast can we learn from past and potential future storm surge observations about changes in future statistics? Using Observing System Simulation Experiments, we quantify the time required to detect changes in the probability of extreme storm surge events. We estimate low probabilities of detection when substantial but gradual changes to the 100-year storm surge occur. As a result, policy makers may underestimate considerable increases in storm surge risk over the typically long lifespans of major infrastructure projects.
- Published
- 2017
7. Increasing temperature forcing reduces the Greenland Ice Sheet’s response time scale
- Author
-
Richard B. Alley, Robert E. Nicholas, Patrick J. Applegate, Byron R. Parizek, and Klaus Keller
- Subjects
Ice-sheet model, Atmospheric Science, Climatology, Ice–albedo feedback, Greenland ice sheet, Environmental science, Future sea level, Ice sheet - Abstract
Damages from sea level rise, as well as strategies to manage the associated risk, hinge critically on the time scale and eventual magnitude of sea level rise. Satellite observations and paleo-data suggest that the Greenland Ice Sheet (GIS) loses mass in response to increased temperatures, and may thus contribute substantially to sea level rise as anthropogenic climate change progresses. The time scale of GIS mass loss and sea level rise are deeply uncertain, and are often assumed to be constant. However, previous ice sheet modeling studies have shown that the time scale of GIS response likely decreases strongly with increasing temperature anomaly. Here, we map the relationship between temperature anomaly and the time scale of GIS response, by perturbing a calibrated, three-dimensional model of GIS behavior. Additional simulations with a profile, higher-order, ice sheet model yield time scales that are broadly consistent with those obtained using the three-dimensional model, and shed light on the feedbacks in the ice sheet system that cause the time scale shortening. Semi-empirical modeling studies that assume a constant time scale of sea level adjustment, and are calibrated to small preanthropogenic temperature and sea level changes, may underestimate future sea level rise. Our analysis suggests that the benefits of reducing greenhouse gas emissions, in terms of avoided sea level rise from the GIS, may be greatest if emissions reductions begin before large temperature increases have been realized. Reducing anthropogenic climate change may also allow more time for design and deployment of risk management strategies by slowing sea level contributions from the GIS.
- Published
- 2014
8. Inaction and climate stabilization uncertainties lead to severe economic risks
- Author
-
Klaus Keller, Patrick M. Reed, Karen Fisher-Vanden, M. P. Butler, and Thorsten Wagener
- Subjects
Stabilization policy, Atmospheric Science, Global and Planetary Change, Cost–benefit analysis, Natural resource economics, Environmental resource management, Interdependence, Damages, Climate sensitivity, Environmental science - Abstract
Climate stabilization efforts must integrate the actions of many socio-economic sectors to be successful in meeting climate stabilization goals, such as limiting atmospheric carbon dioxide (CO2) concentration to be less than double the pre-industrial levels. Estimates of the costs and benefits of stabilization policies are often informed by Integrated Assessment Models (IAMs) of the climate and the economy. These IAMs are highly non-linear with many parameters that abstract globally integrated characteristics of environmental and socio-economic systems. Diagnostic analyses of IAMs can aid in identifying the interdependencies and parametric controls of modeled stabilization policies. Here we report a comprehensive variance-based sensitivity analysis of a doubled-CO2 stabilization policy scenario generated by the globally-aggregated Dynamic Integrated model of Climate and the Economy (DICE). We find that neglecting uncertainties considerably underestimates damage and mitigation costs associated with a doubled-CO2 stabilization goal. More than ninety percent of the states-of-the-world (SOWs) sampled in our analysis exceed the damages and abatement costs calculated for the reference case neglecting uncertainties (US$1.2 trillion in 2005 dollars, with worst-case costs exceeding US$60 trillion). We attribute the variance in these costs to uncertainties in the model parameters relating to climate sensitivity, global participation in abatement, and the cost of lower emission energy sources.
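A variance-based sensitivity analysis attributes the variance of a model output to its uncertain inputs. The first-order index S_i = Var(E[Y|X_i]) / Var(Y) can be estimated crudely by binning, as in this self-contained sketch on a toy cost function; the function and parameter names are illustrative stand-ins, not the DICE model used in the paper.

```python
import random
import statistics

random.seed(42)
N = 20_000

def cost(climate_sensitivity, participation, energy_cost):
    # Toy model: strongly nonlinear in climate sensitivity, weaker elsewhere.
    return climate_sensitivity ** 2 + 0.5 * participation + 0.1 * energy_cost

# Sample the three uncertain inputs uniformly on [0, 1)
samples = [(random.random(), random.random(), random.random()) for _ in range(N)]
y = [cost(*s) for s in samples]
var_y = statistics.pvariance(y)

def first_order_index(dim, bins=20):
    """Estimate S_i = Var(E[Y | X_i]) / Var(Y) by binning X_i."""
    groups = [[] for _ in range(bins)]
    for s, yi in zip(samples, y):
        groups[min(int(s[dim] * bins), bins - 1)].append(yi)
    cond_means = [statistics.fmean(g) for g in groups if g]
    return statistics.pvariance(cond_means) / var_y

for name, i in [("climate sensitivity", 0), ("participation", 1), ("energy cost", 2)]:
    print(f"S[{name}] ≈ {first_order_index(i):.2f}")
```

Serious analyses use Sobol-sequence sampling and estimators that also capture interaction effects, but the ranking of dominant drivers emerges the same way.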
- Published
- 2014
9. Publisher Correction: Robust abatement pathways to tolerable climate futures require immediate global action
- Author
-
Giacomo Marangoni, Patrick M. Reed, Klaus Keller, Jonathan R. Lamontagne, and Gregory G. Garner
- Subjects
Economics, Climate change, Environmental Science (miscellaneous), Social Sciences (miscellaneous), Law and economics - Abstract
The previous ‘Journal peer review information’ for this Letter was incorrect. The correct statement is “Nature Climate Change thanks Jan Kwakkel, Francesca Pianosi and Matthias Weitzel for their contribution to the peer review of this work.” This statement has now been amended.
- Published
- 2019
10. Toward a physically plausible upper bound of sea-level rise projections
- Author
-
Roman Olson, Ryan L. Sriver, Klaus Keller, and Nathan M. Urban
- Subjects
Atmospheric Science, Global and Planetary Change, Meteorology, Climatology, Climate change, Climate sensitivity, Outgoing longwave radiation, Earth system model, Ocean heat content, Upper and lower bounds - Abstract
Anthropogenic sea-level rise (SLR) causes considerable risks. Designing a sound SLR risk-management strategy requires careful consideration of decision-relevant uncertainties such as the reasonable upper bound of future SLR. The Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report gave a likely upper SLR bound in the year 2100 near 0.6 m. More recent studies considering semi-empirical modeling approaches and kinematic constraints on glacial melting suggest a reasonable 2100 SLR upper bound of approximately 2 m. These recent studies have broken important new ground, but they largely neglect uncertainties surrounding thermal expansion (thermosteric SLR) and/or observational constraints on ocean heat uptake. Here we quantify the effects of key parametric uncertainties and observational constraints on thermosteric SLR projections using an Earth system model with a dynamic three-dimensional ocean, which provides a mechanistic representation of deep ocean processes and heat uptake. Considering these effects nearly doubles the contribution of thermosteric SLR compared to previous estimates and increases the reasonable upper bound of 2100 SLR projections by 0.25 m. As an illustrative example of the effect of overconfidence, we show how neglecting thermosteric uncertainty in projections of the SLR upper bound can considerably bias risk analysis and hence the design of adaptation strategies. For conditions close to the Port of Los Angeles, the 0.25 m increase in the reasonable upper bound can result in a flooding-risk increase by roughly three orders of magnitude. Results provide evidence that relatively minor underestimation of the upper bound of projected SLR can lead to major downward biases of future flooding risks.
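A back-of-envelope calculation shows why a modest 0.25 m shift can change flood probabilities by three orders of magnitude. For a Gumbel-distributed annual maximum water level, the far tail is approximately exponential, P(X > z) ≈ exp(-(z - mu)/beta), so a shift of delta in the level relevant to a fixed defence height multiplies the exceedance probability by exp(delta/beta). The scale value beta = 0.036 m below is a hypothetical choice for illustration, not a fitted value from the paper.

```python
import math

beta = 0.036  # Gumbel scale parameter (m); illustrative assumption

def exceedance_ratio(delta_z: float, beta: float) -> float:
    """Factor by which exceedance probability grows when the effective
    water level against a fixed defence rises by delta_z (exponential tail)."""
    return math.exp(delta_z / beta)

ratio = exceedance_ratio(0.25, beta)
print(f"≈ {ratio:.0f}× higher exceedance probability")
```

The steep exponential dependence is the point: small errors in the upper bound translate into very large errors in estimated flooding risk.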
- Published
- 2012
11. Tension between reducing sea-level rise and global warming through solar-radiation management
- Author
-
Klaus Keller, Peter J. Irvine, and Ryan L. Sriver
- Subjects
Sea level rise, Solar radiation management, Climatology, Global warming, Climate change, Geoengineering, Sea level - Abstract
A study finds tension between mitigating sea-level rise and reducing the rate of temperature change through solar-radiation management. The rapid warming that would occur if solar-radiation management were to be phased out is shown to depend critically on timescales, potentially committing future generations to its long-term use once started.
Geoengineering using solar-radiation management (SRM) is gaining interest as a potential strategy to reduce future climate change impacts [1–3]. Basic physics and past observations suggest that reducing insolation will, on average, cool the Earth. It is uncertain, however, whether SRM can reduce climate change stressors such as sea-level rise or rates of surface air temperature change [1,4–6]. Here we use an Earth system model of intermediate complexity to quantify the possible response of sea levels and surface air temperatures to projected climate forcings [7] and SRM strategies. We find that SRM strategies introduce a potentially strong tension between the objectives to reduce (1) the rate of temperature change and (2) sea-level rise. This tension arises primarily because surface air temperatures respond faster to radiative forcings than sea levels. Our results show that the forcing required to stop sea-level rise could cause a rapid cooling with a rate similar to the peak business-as-usual warming rate. Furthermore, termination of SRM was found to produce warming rates up to five times greater than the maximum rates under the business-as-usual CO2 scenario, whereas sea-level rise rates were only 30% higher. Reducing these risks requires a slow phase-out over many decades and thus commits future generations.
- Published
- 2012
12. What are robust strategies in the face of uncertain climate threshold responses?
- Author
-
Klaus Keller, David McInerney, and Robert J. Lempert
- Subjects
Atmospheric Science, Global and Planetary Change, Environmental resource management, Climate change, Multiple-criteria decision analysis, Value of information, Greenhouse gas, Economics, Econometrics, Robustness (economics) - Abstract
We use an integrated assessment model of climate change to analyze how alternative decision-making criteria affect preferred investments into greenhouse gas mitigation, the distribution of outcomes, the robustness of the strategies, and the economic value of information. We define robustness as trading a small decrease in a strategy’s expected performance for a significant increase in a strategy’s performance in the worst cases. Specifically, we modify the Dynamic Integrated model of Climate and the Economy (DICE-07) to include a simple representation of a climate threshold response, parametric uncertainty, structural uncertainty, learning, and different decision-making criteria. Economic analyses of climate change strategies typically adopt the expected utility maximization (EUM) framework. We compare EUM with two decision criteria adopted from the finance literature, namely Limited Degree of Confidence (LDC) and Safety First (SF). Both criteria increase the relative weight of the performance under the worst-case scenarios compared to EUM. We show that the LDC and SF criteria provide a computationally feasible foundation for identifying greenhouse gas mitigation strategies that may prove more robust than those identified by the EUM criterion. More robust strategies show higher near-term investments in emissions abatement. Reducing uncertainty has a higher economic value of information for the LDC and SF decision criteria than for EUM.
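The contrast between expected utility maximization and worst-case-weighted criteria can be made concrete with two hypothetical strategies evaluated over sampled states of the world. The payoffs and the simple blended form of the Limited Degree of Confidence (LDC) criterion below are illustrative assumptions, not the paper's calibrated DICE-07 setup.

```python
# Hypothetical payoffs (utility units) across four states of the world
low_abatement  = [10, 9, 8, -2]   # higher mean, poor worst case
high_abatement = [6, 6, 5, 3]     # lower mean, robust worst case

def expected_utility(payoffs):
    """EUM criterion: rank strategies by mean payoff."""
    return sum(payoffs) / len(payoffs)

def ldc(payoffs, alpha=0.5):
    """Limited Degree of Confidence (sketch): blend the expected payoff
    with the worst-case payoff, shifting weight toward bad outcomes."""
    return alpha * expected_utility(payoffs) + (1 - alpha) * min(payoffs)

print("EUM prefers:", "low" if expected_utility(low_abatement) > expected_utility(high_abatement) else "high")
print("LDC prefers:", "low" if ldc(low_abatement) > ldc(high_abatement) else "high")
```

Upweighting the worst cases flips the preferred strategy toward the more robust, higher-near-term-abatement option, which is the qualitative finding of the abstract.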
- Published
- 2012
13. Climate Projections Using Bayesian Model Averaging and Space–Time Dependence
- Author
-
Adam J. Terando, K. Sham Bhat, Klaus Keller, and Murali Haran
- Subjects
Statistics and Probability, Applied Mathematics, Climate change, Bayesian inference, Kernel (statistics), Econometrics, Bayesian hierarchical modeling, Climate model, Gaussian process - Abstract
Projections of future climatic changes are a key input to the design of climate change mitigation and adaptation strategies. Current climate change projections are deeply uncertain. This uncertainty stems from several factors, including parametric and structural uncertainties. One common approach to characterize and, if possible, reduce these uncertainties is to confront (calibrate in a broad sense) the models with historical observations. Here, we analyze the problem of combining multiple climate models using Bayesian Model Averaging (BMA) to derive future projections and quantify uncertainty estimates of spatiotemporally resolved temperature hindcasts and projections. One advantage of the BMA approach is that it allows the assessment of the predictive skill of a model using the training data, which can help identify the better models and discard poor models. Previous BMA approaches have broken important new ground, but often neglected space–time dependencies and/or imposed prohibitive computational demands. Here we improve on the current state-of-the-art by incorporating space–time dependence while using historical data to estimate model weights. We achieve computational efficiency using a kernel mixing approach for representing a space–time process. One key advantage of our new approach is that it enables us to incorporate multiple sources of uncertainty and biases, while remaining computationally tractable for large data sets. We introduce and apply our BMA approach to an ensemble of General Circulation Model surface-temperature output from the Intergovernmental Panel on Climate Change Fourth Assessment Report, on a grid of space–time locations.
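At its core, BMA weights each model by how well its hindcast explains held-out observations. The minimal sketch below uses independent Gaussian errors and made-up numbers; the paper's contribution is precisely what this sketch omits, namely modelling space–time dependence with a kernel mixing approach.

```python
import math

obs = [14.2, 14.4, 14.5, 14.8]               # "observed" temperatures (°C)
hindcasts = {
    "model_A": [14.1, 14.4, 14.6, 14.7],      # close to the observations
    "model_B": [13.5, 13.6, 13.9, 14.0],      # cold-biased
}
sigma = 0.2  # assumed observation-error standard deviation (°C)

def log_likelihood(pred, obs, sigma):
    """Independent Gaussian log-likelihood of obs given a model hindcast."""
    return sum(-0.5 * ((o - p) / sigma) ** 2
               - math.log(sigma * math.sqrt(2 * math.pi))
               for o, p in zip(obs, pred))

ll = {m: log_likelihood(h, obs, sigma) for m, h in hindcasts.items()}
zmax = max(ll.values())
raw = {m: math.exp(v - zmax) for m, v in ll.items()}   # stabilised exponentiation
total = sum(raw.values())
weights = {m: v / total for m, v in raw.items()}        # BMA weights sum to 1
print(weights)
```

The BMA projection is then the weights-weighted mixture of the models' predictive distributions, so a poorly performing model is effectively discarded automatically.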
- Published
- 2011
14. The economics (or lack thereof) of aerosol geoengineering
- Author
-
Klaus Keller, Nancy Tuana, and Marlos Goes
- Subjects
Atmospheric Science, Global and Planetary Change, Natural resource economics, Greenhouse gas, Climatology, Climate commitment, Damages, Climate change, Climate sensitivity, Economic impact analysis, Aerosol - Abstract
Anthropogenic greenhouse gas emissions are changing the Earth's climate and impose substantial risks for current and future generations. What are scientifically sound, economically viable, and ethically defendable strategies to manage these climate risks? Ratified international agreements call for a reduction of greenhouse gas emissions to avoid dangerous anthropogenic interference with the climate system. Recent proposals, however, call for a different approach: to geoengineer climate by injecting aerosol precursors into the stratosphere. Published economic studies typically neglect the risks of aerosol geoengineering due to (i) the potential for a failure to sustain the aerosol forcing and (ii) the negative impacts associated with the aerosol forcing. Here we use a simple integrated assessment model of climate change to analyze potential economic impacts of aerosol geoengineering strategies over a wide range of uncertain parameters such as climate sensitivity, the economic damages due to climate change, and the economic damages due to aerosol geoengineering forcing. The simplicity of the model provides the […]
- Published
- 2011
15. Carbon dioxide sequestration: how much and when?
- Author
-
David McInerney, David F. Bradford, and Klaus Keller
- Subjects
Marginal cost, Economic efficiency, Atmospheric Science, Global and Planetary Change, Discounting, Natural resource economics, Global warming, Fossil fuel, Carbon sequestration, Efficiency factor, Leakage (economics) - Abstract
Carbon dioxide (CO2) sequestration has been proposed as a key component in technological portfolios for managing anthropogenic climate change, since it may provide a faster and cheaper route to significant reductions in atmospheric CO2 concentrations than abating CO2 production. However, CO2 sequestration is not a perfect substitute for CO2 abatement because CO2 may leak back into the atmosphere (thus imposing future climate change impacts) and because CO2 sequestration requires energy (thus producing more CO2 and depleting fossil fuel resources earlier). Here we use analytical and numerical models to assess the economic efficiency of CO2 sequestration and analyze the optimal timing and extent of CO2 sequestration. The economic efficiency factor of CO2 sequestration can be expressed as the ratio of the marginal net benefits of sequestering CO2 and avoiding CO2 emissions. We derive an analytical solution for this efficiency factor for a simplified case in which we account for CO2 leakage, discounting, the additional fossil fuel requirement of CO2 sequestration, and the growth rate of carbon taxes. In this analytical model, the economic efficiency of CO2 sequestration decreases as the CO2 tax growth rate, leakage rates and energy requirements for CO2 sequestration increase. Increasing discount rates increases the economic efficiency factor. In this simple model, short-term sequestration methods, such as afforestation, can even have negative economic efficiencies. We use a more realistic integrated-assessment model to additionally account for potentially important effects such as learning-by-doing and socio-economic inertia on optimal strategies. We measure the economic efficiency of CO2 sequestration by the ratio of the marginal costs of CO2 sequestration and CO2 abatement along optimal trajectories. 
We show that the positive impacts of investments in CO2 sequestration through the reduction of future marginal CO2 sequestration costs and the alleviation of future inertia constraints can initially exceed the marginal sequestration costs. As a result, the economic efficiencies of CO2 sequestration can exceed 100% and an optimal strategy will subsidize CO2 sequestration that is initially more expensive than CO2 abatement. The potential economic value of a feasible and acceptable CO2 sequestration technology is equivalent – in the adopted utilitarian model – to a one-time investment of several percent of present gross world product. It is optimal in the chosen economic framework to sequester substantial CO2 quantities into reservoirs with small or zero leakage, given published estimates of marginal costs and climate change impacts. The optimal CO2 trajectories in the case of sequestration from air can approach the pre-industrial level, constituting geoengineering. Our analysis is silent on important questions (e.g., the effects of model and parametric uncertainty, the potential learning about these uncertainties, or ethical dimension of such geoengineering strategies), which need to be addressed before our findings can be translated into policy-relevant recommendations.
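The qualitative dependencies of the sequestration efficiency factor can be captured in a stylized closed form. The expression below is an illustrative assumption of this rewrite, not the paper's derivation: with an exponential leakage rate lam, discount rate r, carbon-tax growth rate g (with r > g), and an energy penalty eps, the present value of a sequestered tonne that leaks and is re-taxed later is lam / (lam + r - g), giving an efficiency of (1 - eps) * (1 - lam / (lam + r - g)). Efficiency then falls with leakage, tax growth, and energy requirements, and rises with the discount rate, matching the abstract's analytical results.

```python
def efficiency(lam: float, r: float, g: float, eps: float) -> float:
    """Stylised CO2-sequestration efficiency factor (illustrative form).

    lam: leakage rate (1/yr), r: discount rate, g: carbon-tax growth rate,
    eps: fractional energy penalty of capture and storage. Requires r > g.
    """
    if r <= g:
        raise ValueError("requires discount rate above tax growth rate")
    return (1.0 - eps) * (1.0 - lam / (lam + r - g))

base = efficiency(lam=0.001, r=0.04, g=0.02, eps=0.1)   # deep, tight reservoir
leaky = efficiency(lam=0.05, r=0.04, g=0.02, eps=0.1)    # leaky reservoir
print(f"deep reservoir: {base:.2f}, leaky reservoir: {leaky:.2f}")
```

The parameter values are made up; the point is only the direction of each dependence.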
- Published
- 2008
16. The dynamics of learning about a climate threshold
- Author
-
David McInerney and Klaus Keller
- Subjects
Atmospheric Science, Observation system, Climatology, Early prediction, Early detection, Climate change, Change detection - Abstract
Anthropogenic greenhouse gas emissions may trigger threshold responses of the climate system. One relevant example of such a potential threshold response is a shutdown of the North Atlantic meridional overturning circulation (MOC). Numerous studies have analyzed the problem of early MOC change detection (i.e., detection before the forcing has committed the system to a threshold response). Here we analyze the early MOC prediction problem. To this end, we virtually deploy an MOC observation system into a simple model that mimics potential future MOC responses and analyze the timing of confident detection and prediction. Our analysis suggests that a confident prediction of a potential threshold response can require century time scales, considerably longer than the time required for confident detection. The signal enabling early prediction of an approaching MOC threshold in our model study is associated with the rate at which the MOC intensity decreases for a given forcing. A faster MOC weakening implies a higher MOC sensitivity to forcing. An MOC sensitivity exceeding a critical level results in a threshold response. Determining whether an observed MOC trend in our model differs in a statistically significant way from an unforced scenario (the detection problem) imposes lower requirements on an observation system than the determination whether the MOC will shut down in the future (the prediction problem). As a result, the virtual observation systems designed in our model for early detection of MOC changes might well fail at the task of early and confident prediction. Transferring this conclusion to the real world requires a considerably refined MOC model, as well as a more complete consideration of relevant observational constraints.
- Published
- 2007
17. Managing the risks of climate thresholds: uncertainties and information needs
- Author
-
Klaus Keller, Michael E. Schlesinger, and Gary W. Yohe
- Subjects
Atmospheric Science, Global and Planetary Change, Meteorology, Environmental resource management, Climate change, Probability density function, Information needs - Abstract
Human activities are driving atmospheric greenhouse-gas concentrations beyond levels experienced by previous civilizations. The uncertainty surrounding our understanding of the resulting climate change poses nontrivial challenges for the design and implementation of strategies to manage the associated risks. One challenge stems from the fact that the climate system can react abruptly and with only subtle warning signs before climate thresholds have been crossed (Stocker 1999; Alley et al. 2003). Model predictions suggest that anthropogenic greenhouse-gas emissions increase the likelihood of crossing these thresholds (Cubasch and Meehl 2001; Yohe et al. 2006). Coping with deep uncertainty in our understanding of the mechanisms, locations, and impacts of climate thresholds presents another challenge. Deep uncertainty presents itself when the relevant range of systems models and the associated probability density functions for their parameterizations are unknown and/or when decision-makers strongly disagree on their formulations (Lempert 2002). Furthermore, the requirements for creating feasible observation and modeling systems that could deliver confident and timely prediction of impending threshold crossings are mostly unknown. These challenges put a new emphasis on the analysis, design, and implementation of Earth observation systems and strategies to manage the risks of potential climate threshold responses.
- Published
- 2007
18. Economically optimal risk reduction strategies in the face of uncertain climate thresholds
- Author
-
David McInerney and Klaus Keller
- Subjects
Atmospheric Science ,Global and Planetary Change ,DICE model ,United Nations Framework Convention on Climate Change ,Greenhouse gas ,Climatology ,Global warming ,Econometrics ,Climate change ,Environmental science ,Climate sensitivity ,Climate model ,Global change - Abstract
Anthropogenic greenhouse gas emissions may trigger climate threshold responses, such as a collapse of the North Atlantic meridional overturning circulation (MOC). Climate threshold responses have been interpreted as an example of “dangerous anthropogenic interference with the climate system” in the sense of the United Nations Framework Convention on Climate Change (UNFCCC). One UNFCCC objective is to “prevent” such dangerous anthropogenic interference. The current uncertainty about important parameters of the coupled natural–human system implies, however, that this UNFCCC objective can only be achieved in a probabilistic sense. In other words, climate management can only reduce – but not entirely eliminate – the risk of crossing climate thresholds. Here we use an integrated assessment model of climate change to derive economically optimal risk-reduction strategies. We implement a stochastic version of the DICE model and account for uncertainty about four parameters that have been previously identified as dominant drivers of the uncertain system response. The resulting model is, of course, just a crude approximation as it neglects, for example, some structural uncertainty and focuses on a single threshold, out of many potential climate responses. Subject to this caveat, our analysis suggests five main conclusions. First, reducing the numerical artifacts due to sub-sampling the parameter probability density functions to reasonable levels requires sample sizes exceeding 10³. Conclusions of previous studies that are based on much smaller sample sizes may hence need to be revisited. Second, following a business-as-usual (BAU) scenario results in odds for an MOC collapse in the next 150 years exceeding 1 in 3 in this model. Third, an economically “optimal” strategy (that maximizes the expected utility of the decision-maker) reduces carbon dioxide (CO2) emissions by approximately 25% at the end of this century, compared with BAU emissions.
Perhaps surprisingly, this strategy leaves the odds of an MOC collapse virtually unchanged compared to a BAU strategy. Fourth, reducing the odds for an MOC collapse to 1 in 10 would require an almost complete decarbonization of the economy within a few decades. Finally, further risk reductions (e.g., to 1 in 100) are possible in the framework of the simple model, but would require faster and more expensive reductions in CO2 emissions.
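The abstract's first conclusion — that sub-sampling the parameter probability density functions requires sample sizes exceeding 10³ — can be illustrated with a minimal Monte Carlo sketch. This is a toy stand-in, not the paper's stochastic DICE model: the threshold value and the log-normal draw below are illustrative assumptions, chosen only to show how threshold-crossing probability estimates scatter at small sample sizes.

```python
import random

random.seed(0)

def collapse_probability(n_samples):
    """Estimate the probability that an uncertain parameter exceeds a
    fixed threshold via Monte Carlo sampling. The distribution and
    threshold are illustrative, NOT the paper's calibrated PDFs."""
    threshold = 4.5
    hits = 0
    for _ in range(n_samples):
        draw = random.lognormvariate(1.0, 0.4)  # hypothetical parameter draw
        if draw > threshold:
            hits += 1
    return hits / n_samples

# Repeated small-sample estimates scatter widely; the spread shrinks
# roughly like 1/sqrt(n), so stable estimates need n well beyond 10^3.
for n in (100, 1000, 10000):
    estimates = [collapse_probability(n) for _ in range(5)]
    print(n, round(max(estimates) - min(estimates), 3))
```

The same sampling-noise argument is why the abstract cautions that earlier studies built on much smaller ensembles may need to be revisited.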
- Published
- 2007
19. Detecting potential changes in the meridional overturning circulation at 26°N in the Atlantic
- Author
-
Klaus Keller, Jochem Marotzke, and Johanna Baehr
- Subjects
Atmospheric Science ,Global and Planetary Change ,010504 meteorology & atmospheric sciences ,010505 oceanography ,Continuous monitoring ,Autocorrelation ,Wind stress ,Climate change ,01 natural sciences ,Standard deviation ,13. Climate action ,Climatology ,Climate change scenario ,Environmental science ,Thermohaline circulation ,Hydrography ,0105 earth and related environmental sciences - Abstract
We analyze the ability of an oceanic monitoring array to detect potential changes in the North Atlantic meridional overturning circulation (MOC). The observing array is 'deployed' into a numerical model (ECHAM5/MPI-OM), and simulates the measurements of density and wind stress at 26°N in the Atlantic. The simulated array mimics the continuous monitoring system deployed in the framework of the UK Rapid Climate Change program. We analyze a set of three realizations of a climate change scenario (IPCC A1B), in which - within the considered time-horizon of 200 years - the MOC weakens, but does not collapse. For the detection analysis, we assume that the natural variability of the MOC is known from an independent source, the control run. Our detection approach accounts for the effects of observation errors, infrequent observations, autocorrelated internal variability, and uncertainty in the initial conditions. Continuous observation with the simulated array for approximately 60 years yields a statistically significant (p < 0.05) detection with 95 percent reliability assuming a random observation error of 1 Sv (1 Sv = 10⁶ m³ s⁻¹). Observing continuously with an observation error of 3 Sv yields a detection time of about 90 years (with 95 percent reliability). Repeated hydrographic transects every 5 years/20 years result in a detection time of about 90 years/120 years, with 95 percent reliability and an assumed observation error of 3 Sv. An observation error of 3 Sv (one standard deviation) is a plausible estimate of the observation error associated with the RAPID UK 26°N array.
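The qualitative effect described above — larger observation errors lengthen detection times — can be sketched with a toy trend-detection exercise. This is not the paper's method: the baseline MOC strength (18 Sv), the weakening rate (−0.05 Sv per year), and the white-noise error model are illustrative assumptions, and the sketch ignores the autocorrelated internal variability and infrequent-sampling effects that the paper's analysis accounts for.

```python
import math
import random

random.seed(1)

def detection_year(obs_error_sv, trend_sv_per_yr=-0.05, max_years=200):
    """Return the first year at which the estimated linear trend in noisy
    annual 'MOC observations' is significant (|t| > 1.96, large-sample
    approximation), or None if never detected within max_years.
    All numbers are illustrative, not the paper's calibrated values."""
    obs = []
    for year in range(max_years):
        true_moc = 18.0 + trend_sv_per_yr * year      # hypothetical baseline, Sv
        obs.append(true_moc + random.gauss(0.0, obs_error_sv))
        n = len(obs)
        if n < 10:                                     # need a minimal record
            continue
        xbar = (n - 1) / 2.0
        ybar = sum(obs) / n
        sxx = sum((x - xbar) ** 2 for x in range(n))
        sxy = sum((x - xbar) * (y - ybar) for x, y in enumerate(obs))
        slope = sxy / sxx
        resid = [y - ybar - slope * (x - xbar) for x, y in enumerate(obs)]
        se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
        if se > 0 and abs(slope / se) > 1.96:
            return year + 1
    return None

# A 3 Sv error typically needs a substantially longer record than 1 Sv,
# mirroring the ~60-year vs ~90-year detection times in the abstract.
print(detection_year(1.0), detection_year(3.0))
```

Because the slope's standard error falls roughly as n^(-3/2) while the trend signal grows linearly, tripling the observation error pushes the detection time out by roughly a factor of 3^(2/3) in this idealized white-noise setting.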
- Published
- 2007
20. Avoiding Dangerous Anthropogenic Interference with the Climate System
- Author
-
David F. Bradford, Seung Rae Kim, Michael Oppenheimer, Matthew G. Hall, and Klaus Keller
- Subjects
Atmospheric Science ,Global and Planetary Change ,geography ,geography.geographical_feature_category ,Natural resource economics ,Global warming ,Antarctic ice sheet ,Climate change ,Global change ,Demise ,Effects of global warming ,Climatology ,Greenhouse gas ,Environmental science ,Ice sheet - Abstract
The UN Framework Convention on Climate Change calls for the avoidance of “dangerous anthropogenic interference with the climate system”. Among the many plausible choices, dangerous interference with the climate system may be interpreted as anthropogenic radiative forcing causing distinct and widespread climate change impacts such as a widespread demise of coral reefs or a disintegration of the West Antarctic ice sheet. The geological record and numerical models suggest that limiting global warming below critical temperature thresholds significantly reduces the likelihood of these eventualities. Here we analyze economically optimal policies that may ensure this risk-reduction. Reducing the risk of a widespread coral reef demise implies drastic reductions in greenhouse gas emissions within decades. Virtually unchecked greenhouse gas emissions to date (combined with the inertia of the coupled natural and human systems) may have already committed future societies to a widespread demise of coral reefs. Policies to reduce the risk of a West Antarctic ice sheet disintegration allow for a smoother decarbonization of the economy within a century and may well increase consumption in the long run.
- Published
- 2005
21. [Untitled]
- Author
-
David F. Bradford, Klaus Keller, Kelvin Jui Keng Tan, and François M. M. Morel
- Subjects
Atmospheric Science ,Global and Planetary Change ,Ocean current ,Climate change ,chemistry.chemical_compound ,chemistry ,Climatology ,Greenhouse gas ,Carbon dioxide ,Damages ,medicine ,Environmental science ,Circulation (currency) ,Thermohaline circulation ,medicine.symptom ,Collapse (medical) - Abstract
Climate modelers have recognized the possibility of abrupt climate changes caused by a reorganization of the North Atlantic's current pattern (technically known as a thermohaline circulation collapse). This circulation system now warms north-western Europe and transports carbon dioxide to the deep oceans. The posited collapse of this system could produce severe cooling in northwestern Europe, even when general global warming is in progress. In this paper we use a simple integrated assessment model to investigate the optimal policy response to this risk. Adding the constraint of avoiding a thermohaline circulation collapse would significantly reduce the allowable greenhouse gas emissions in the long run along an optimal path. Our analysis implies that relatively small damages associated with a collapse (less than 1% of gross world product) would justify a considerable reduction of future carbon dioxide emissions.
- Published
- 2000
22. Analytic regularizations, finite part prescriptions and products of distributions
- Author
-
Klaus Keller
- Subjects
General Mathematics ,Calculus ,Medical prescription ,Mathematics - Published
- 1978