10,549 results
Search Results
2. Numerical modelling of the lobes of radio galaxies – Paper V: universal pressure profile cluster atmospheres.
- Author
- Stimpson, M, Hardcastle, M J, and Krause, M G H
- Subjects
- KELVIN-Helmholtz instability, SYNCHROTRON radiation, CORE materials, ATMOSPHERE, RADIO galaxies, ATMOSPHERIC models
- Abstract
We present relativistic magnetohydrodynamic modelling of jets running into hydrostatic, spherically symmetric cluster atmospheres. For the first time in a numerical simulation, we present model cluster atmospheres based upon the universal pressure profile (UPP), incorporating a temperature profile for a 'typical' self-similar atmosphere described by only one parameter – M 500. We explore a comprehensive range of realistic atmospheres and jet powers and derive dynamic, energetic, and polarimetric data which provide insight into what we should expect of future high-resolution studies of AGN outflows. From the simulated synchrotron emission maps which include Doppler beaming we find sidedness distributions that agree well with observations. We replicated a number of findings from our previous work, such as higher power jets inflating larger aspect-ratio lobes, and the cluster environment impacting the distribution of energy between the lobe and shocked regions. Comparing UPP and β-profiles we find that the cluster model chosen results in a different morphology for the resultant lobes with the UPP more able to clear lobe material from the core; and that these different atmospheres influence the ratio between the various forms of energy in the fully developed lobes. This work also highlights the key role played by Kelvin–Helmholtz instabilities in the formation of realistic lobe aspect ratios. Our simulations point to the need for additional lobe-widening mechanisms at high jet powers, for example jet precession. Given that the UPP is our most representative general cluster atmosphere, these numerical simulations represent the most realistic models yet for spherically symmetric atmospheres. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
3. Policies, projections, and the social cost of carbon: Results from the DICE-2023 model.
- Author
- Barrage, Lint and Nordhaus, William
- Subjects
- EXTERNALITIES, CARBON paper, CARBON, DISCOUNT prices, GREENHOUSE gases, ATMOSPHERIC models
- Abstract
The present study examines the assumptions, modeling structure, and results of DICE-2023, the revised Dynamic Integrated Model of Climate and the Economy (DICE), updated to 2023. The revision contains major changes in the treatment of risk, the carbon and climate modules, the treatment of nonindustrial greenhouse gases, discount rates, as well as updates on all the major components. Noteworthy changes are a significant reduction in the target for the optimal (cost-beneficial) temperature path, a lower cost of reaching the 2 °C target, an analysis of the impact of the Paris Accord, and a major increase in the estimated social cost of carbon. [ABSTRACT FROM AUTHOR]
- Published
- 2024
4. PAPERS OF NOTE
- Published
- 2010
5. Accuracy of Dispersion Models: A Position Paper of the AMS 1977 Committee on Atmospheric Turbulence and Diffusion
- Published
- 1978
6. Can we predict citation counts of environmental modelling papers? Fourteen bibliographic and categorical variables predict less than 30% of the variability in citation counts.
- Author
- Robson, Barbara J. and Mousquès, Aurélie
- Subjects
- ATMOSPHERIC models, DIFFERENTIAL equations, BIBLIOGRAPHICAL citations, BIBLIOMETRICS, SCIENTOMETRICS
- Abstract
We assessed 6122 environmental modelling papers published since 2005 to determine whether the number of citations each paper had received by September 2014 could be predicted with no knowledge of the paper's quality. A random forest was applied, using a range of easily quantified or classified variables as predictors. The 511 papers published in two key journals in 2008 were further analysed to consider additional variables. Papers with no differential equations received more citations. The topic of the paper, number of authors and publication venue were also significant. Ten other factors, some of which have been found significant in other studies, were also considered, but most added little to the predictive power of the models. Collectively, all factors predicted 16–29% of the variation in citation counts, with the remaining variance (the majority) presumably attributable to important subjective factors such as paper quality, clarity and timeliness. [ABSTRACT FROM AUTHOR]
- Published
- 2016
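The approach this abstract describes, predicting citation counts from easily quantified variables with a random forest, can be sketched as follows. This is an illustrative toy on synthetic data: the feature set, coefficients, and noise level below are invented stand-ins, not taken from the paper, and scikit-learn's `RandomForestRegressor` is assumed as the model.

```python
# Illustrative sketch (not the authors' code): predicting citation counts
# from simple bibliographic features with a random forest.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Hypothetical predictors (all synthetic): number of authors, title length,
# whether the paper uses differential equations, a journal-impact proxy.
X = np.column_stack([
    rng.integers(1, 10, n),        # number of authors
    rng.integers(5, 25, n),        # title length in words
    rng.integers(0, 2, n),         # uses differential equations? (0/1)
    rng.normal(2.0, 0.5, n),       # journal impact proxy
])
# Synthetic citation counts: weakly driven by the features plus large noise,
# mimicking the paper's finding that such variables explain <30% of variance.
y = 5 + 2 * X[:, 0] - 3 * X[:, 2] + 4 * X[:, 3] + rng.normal(0, 10, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"held-out R^2: {r2_score(y_te, model.predict(X_te)):.2f}")
```

Because most of the variance here is noise by construction, the held-out R² stays far below 1, echoing the paper's conclusion.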
7. MUSICA MetOp/IASI {H2O, D} pair retrieval simulations for validating tropospheric moisture pathways in atmospheric models [Discussion paper]
- Author
- Schneider, Matthias, Borger, Christian, Wiegele, Andreas, Hase, Frank, García Rodríguez, Omaira Elena, Sepúlveda Hernández, Eliezer, and Werner, Martin
- Subjects
- Project MUSICA, Atmospheric models, Tropospheric moisture
- Abstract
The characteristics of {H2O,δD} pair space-based remote sensing data depend on the atmospheric and surface conditions, which compromises their usage for model evaluation studies. This paper shows how the problem can be overcome by simulating MUSICA MetOp/IASI {H2O,δD} remote sensing products for any given model atmosphere. The remote sensing retrieval simulator is freely provided as a MATLAB and Python routine. This study has been conducted in the framework of the project MUSICA which is funded by the European Research Council under the European Community’s Seventh Framework Programme (FP7/2007-2013) / ERC Grant agreement number 256961.
- Published
- 2016
8. Using open building data in the development of exposure datasets for catastrophe risk modelling.
- Author
- Figueiredo, R. and Martina, M.
- Subjects
- BUILDING protection, FLOOD damage prevention, EMERGENCY management, OPEN plan (Building), ENVIRONMENTAL risk assessment, ATMOSPHERIC models
- Abstract
One of the necessary components to perform catastrophe risk modelling is information on the buildings at risk, such as their spatial location, geometry, height, occupancy type and other characteristics. This is commonly referred to as the exposure model or dataset. When modelling large areas, developing exposure datasets with the relevant information about every individual building is not practicable. Thus, census data at coarse spatial resolutions are often used as the starting point for the creation of such datasets, after which disaggregation to finer resolutions is carried out using different methods, based on proxies such as the population distribution. While these methods can produce acceptable results, they cannot be considered ideal. Nowadays, the availability of open data is increasing and it is possible to obtain information about buildings for some regions. Although this type of information is usually limited and, therefore, insufficient to generate an exposure dataset, it can still be very useful in its elaboration. In this paper, we focus on how open building data can be used to develop a gridded exposure model by disaggregating existing census data at coarser resolutions. Furthermore, we analyse how the selection of the level of spatial resolution can impact the accuracy and precision of the model, and compare the results in terms of affected residential building areas, due to a flood event, between different models. [ABSTRACT FROM AUTHOR]
- Published
- 2015
9. Forecasts covering one month using a cut cell model.
- Author
- Steppeler, J., Park, S.-H., and Dobler, A.
- Subjects
- WEATHER forecasting, CLIMATE change, METEOROLOGICAL precipitation, ATMOSPHERIC models, GAIN measurement
- Abstract
This paper investigates the impact and potential use of the cut cell vertical discretisation for five-day forecasts and climate simulations. A first indication of the usefulness of this new method is obtained from a set of five-day forecasts covering January 1989. The model area was chosen to include much of Asia, the Himalayas and Australia. The cut cell model LMZ provides a much more accurate representation of mountains than the terrain-following coordinate used for comparison. We are therefore particularly interested in potential forecast improvements in the target area downwind of the Himalayas, over south-east China, Korea and Japan. The LMZ has so far been tested extensively for one-day forecasts over a European area. Following indications of a reduced temperature error in these short forecasts, this paper investigates the model error over five days in an area influenced by strong orography. The forecasts indicated a strong impact of the cut cell discretisation on forecast quality. The cut cell model is available only for an older (2003) version of the model LM; it was compared against a control model differing only in its use of the terrain-following coordinate. The cut cell model improved the precipitation forecasts of this old control model everywhere by a large margin. An improved version of the terrain-following model LM has since been developed under the name CLM. The CLM has been used and tested in all climates, while the LM was used for small areas at higher latitudes. The precipitation forecasts of the cut cell model were also compared to the CLM. As the cut cell model LMZ did not incorporate the developments made for the CLM since 2003, it did not improve on the CLM's precipitation forecasts in all aspects. However, for the target area downstream of the Himalayas, the cut cell model considerably improved the monthly precipitation forecast even in comparison with the modern model version CLM. The cut cell discretisation appears to improve in particular the localisation of precipitation, while the improvements leading from LM to CLM had a positive effect mainly on amplitude. [ABSTRACT FROM AUTHOR]
- Published
- 2013
10. Solar Wind Driven from GONG Magnetograms in the Last Solar Cycle.
- Author
- Huang, Zhenguang, Tóth, Gábor, Sachdeva, Nishtha, and van der Holst, Bart
- Subjects
- SOLAR wind, SOLAR cycle, SOLAR atmosphere, PLASMA Alfven waves, SPACE environment, WIND speed, ATMOSPHERIC models
- Abstract
In a previous study, Huang et al. used the Alfvén Wave Solar atmosphere Model, one of the widely used solar wind models in the community, driven by ADAPT-GONG magnetograms to simulate the solar wind in the last solar cycle and found that the optimal Poynting flux parameter can be estimated from either the open field area or the average unsigned radial component of the magnetic field in the open field regions. It was also found that the average energy deposition rate (Poynting flux) in the open field regions is approximately constant. In the current study, we expand the previous work by using GONG magnetograms to simulate the solar wind for the same Carrington rotations and determine if the results are similar to the ones obtained with ADAPT-GONG magnetograms. Our results indicate that similar correlations can be obtained from the GONG maps. Moreover, we report that ADAPT-GONG magnetograms can consistently provide better comparisons with 1 au solar wind observations than GONG magnetograms, based on the best simulations selected by the minimum of the average curve distance for the solar wind speed and density. [ABSTRACT FROM AUTHOR]
- Published
- 2024
11. A moist aquaplanet variant of the Held-Suarez test for atmospheric model dynamical cores.
- Author
- Thatcher, D. R. and Jablonowski, C.
- Subjects
- ATMOSPHERIC models, MOISTURE, PARAMETERIZATION
- Abstract
A moist idealized test case (MITC) for atmospheric model dynamical cores is presented. The MITC is based on the Held-Suarez (HS) test that was developed for dry simulations on a flat Earth and replaces the full physical parameterization package with a Newtonian temperature relaxation and Rayleigh damping of the low-level winds. This new variant of the HS test includes moisture and thereby sheds light on the non-linear dynamics-physics moisture feedbacks without the complexity of full physics parameterization packages. In particular, it adds simplified moist processes to the HS forcing to model large-scale condensation, boundary layer mixing, and the exchange of latent and sensible heat between the atmospheric surface and an ocean-covered planet. Using a variety of dynamical cores of NCAR's Community Atmosphere Model (CAM), this paper demonstrates that the inclusion of the moist idealized physics package leads to climatic states that closely resemble aquaplanet simulations with complex physical parameterizations. This establishes that the MITC approach generates reasonable atmospheric circulations and can be used for a broad range of scientific investigations. This paper provides examples of two application areas. First, the test case reveals the characteristics of the physics-dynamics coupling technique and reproduces coupling issues seen in full-physics simulations. In particular, it is shown that sudden adjustments of the prognostic fields due to moist physics tendencies can trigger undesirable large-scale gravity waves, which can be remedied by a more gradual application of the physical forcing. Second, the moist idealized test case can be used to intercompare dynamical cores. These examples demonstrate the versatility of the MITC approach and suggestions are made for further application areas. The new moist variant of the HS test can be considered a test case of intermediate complexity. [ABSTRACT FROM AUTHOR]
- Published
- 2015
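The Newtonian temperature relaxation that Held-Suarez-style forcing substitutes for full physics can be illustrated in a few lines. The timescale, equilibrium temperature, and time step below are placeholder values chosen for illustration, not the HS reference profile.

```python
# Toy sketch of Newtonian temperature relaxation, dT/dt = -(T - T_eq)/tau,
# integrated with forward Euler. All values are illustrative placeholders.
tau = 40.0 * 86400.0      # relaxation timescale: 40 days, in seconds
T_eq = 250.0              # equilibrium temperature (K), illustrative
dt = 1800.0               # 30-minute time step
T = 300.0                 # initial temperature (K)

for _ in range(int(100 * 86400 / dt)):   # integrate for 100 days
    T += -dt * (T - T_eq) / tau

# After 100 days (2.5 relaxation times) the analytic solution is
# T_eq + 50 * exp(-2.5), i.e. about 254.1 K.
print(f"T after 100 days: {T:.1f} K")
```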
12. Bayesian hierarchical model for bias-correcting climate models.
- Author
- Carter, Jeremy, Chacón-Montalván, Erick A., and Leeson, Amber
- Subjects
- ATMOSPHERIC models, GAUSSIAN processes, ENVIRONMENTAL sciences, PHYSICAL laws, TIME series analysis
- Abstract
Climate models, derived from process understanding, are essential tools in the study of climate change and its wide-ranging impacts. Hindcast and future simulations provide comprehensive spatiotemporal estimates of climatology that are frequently employed within the environmental sciences community, although the output can be afflicted with bias that impedes direct interpretation. Post-processing bias correction approaches utilise observational data to address this challenge, although they are typically criticised for not being physically justified and not considering uncertainty in the correction. This paper proposes a novel Bayesian bias correction framework that robustly propagates uncertainty and models underlying spatial covariance patterns. Shared latent Gaussian processes are assumed between the in situ observations and climate model output, with the aim of partially preserving the covariance structure from the climate model after bias correction, which is based on well-established physical laws. Results demonstrate added value in modelling shared generating processes under several simulated scenarios, with the most value added for the case of sparse in situ observations and smooth underlying bias. Additionally, the propagation of uncertainty to a simulated final bias-corrected time series is illustrated, which is of key importance to a range of stakeholders, such as climate scientists engaged in impact studies, decision-makers trying to understand the likelihood of particular scenarios, and individuals involved in climate change adaptation strategies where accurate risk assessment is required for optimal resource allocation. This paper focuses on one-dimensional simulated examples for clarity, although the code implementation is developed to also work on multi-dimensional input data, encouraging follow-on real-world application studies that will further validate performance and remaining limitations. 
The Bayesian framework supports uncertainty propagation under model adaptations required for specific applications, providing a flexible approach that increases the scope of data assimilation tasks more generally. [ABSTRACT FROM AUTHOR]
- Published
- 2024
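The idea of a shared latent Gaussian process between observations and climate-model output can be sketched in one dimension. Everything below (the kernels, length scales, noise level, and the simple posterior update) is an invented toy under Gaussian assumptions, not the paper's implementation.

```python
# Minimal 1-D sketch of GP-based bias correction: obs = z + noise and
# model = z + b, where z is a shared latent process and b is a smooth bias.
import numpy as np

def rbf(x1, x2, ell, var):
    """Squared-exponential covariance between two sets of 1-D points."""
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 50)          # spatial grid
K_z = rbf(x, x, ell=1.0, var=1.0)       # covariance of shared latent process z
K_b = rbf(x, x, ell=3.0, var=0.5)       # covariance of smooth model bias b
jitter = 1e-8 * np.eye(len(x))

z = rng.multivariate_normal(np.zeros(len(x)), K_z + jitter)
b = rng.multivariate_normal(np.zeros(len(x)), K_b + jitter)
sigma = 0.1                                   # observation noise std
obs = z + sigma * rng.normal(size=len(x))     # in-situ observations
model_out = z + b                             # biased climate-model output

# model_out - obs = b - noise is Gaussian with covariance K_b + sigma^2 I,
# so the posterior mean of the bias is a standard GP regression update.
diff = model_out - obs
b_hat = K_b @ np.linalg.inv(K_b + sigma**2 * np.eye(len(x))) @ diff
corrected = model_out - b_hat                 # bias-corrected field

rmse_before = np.sqrt(np.mean((model_out - z) ** 2))
rmse_after = np.sqrt(np.mean((corrected - z) ** 2))
print(f"RMSE vs latent truth: before={rmse_before:.3f}, after={rmse_after:.3f}")
```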
13. Fast and Reliable Network RTK Positioning Based on Multi-Frequency Sequential Ambiguity Resolution under Significant Atmospheric Biases.
- Author
- Liu, Hao, Zhang, Ziteng, Sheng, Chuanzhen, Yu, Baoguo, Gao, Wang, and Meng, Xiaolin
- Subjects
- AMBIGUITY, GLOBAL Positioning System, WEATHER, ATMOSPHERIC models
- Abstract
The positioning performance of the Global Navigation Satellite System (GNSS) network real-time kinematic (NRTK) depends on regional atmospheric error modeling. Under normal atmospheric conditions, NRTK positioning provides high accuracy and rapid initialization. However, fluctuations in atmospheric conditions can lead to poor atmospheric error modeling, resulting in significant atmospheric biases that affect the positioning accuracy, initialization speed, and reliability of NRTK positioning. Consequently, this decreases the efficiency of NRTK operations. In response to these challenges, this paper proposes a fast and reliable NRTK positioning method based on sequential ambiguity resolution (SAR) of multi-frequency combined observations. This method processes observations from extra-wide-lane (EWL), wide-lane (WL), and narrow-lane (NL) measurements; performs sequential AR using the LAMBDA algorithm; and subsequently constrains other parameters using fixed ambiguities. Ultimately, this method achieves high precision, rapid initialization, and reliable positioning. Experimental analysis was conducted using Continuous Operating Reference Station (CORS) data, with baseline lengths ranging from 88 km to 110 km. The results showed that the proposed algorithm offers positioning accuracy comparable to that of conventional algorithms in conventional NRTK positioning and achieves a higher fixing rate and positioning accuracy in single-epoch positioning. On two datasets, the proposed algorithm demonstrated an improvement of over 30% in time to first fix (TTFF) compared to conventional algorithms. It provides higher precision in suboptimal positioning solutions when conventional NRTK algorithms fail to achieve fixed solutions during the initialization phase. These experiments highlight the advantages of the proposed algorithm in terms of initialization speed and positioning reliability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
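For background on the combinations this abstract refers to: the extra-wide-lane, wide-lane, and narrow-lane wavelengths follow from standard formulas applied to the carrier frequencies. The snippet below computes them for GPS L1/L2/L5 (the abstract does not specify constellations; GPS is assumed here for illustration).

```python
# Wavelengths of the standard GNSS linear combinations, computed for GPS.
c = 299_792_458.0                              # speed of light, m/s
f1, f2, f5 = 1575.42e6, 1227.60e6, 1176.45e6   # GPS L1/L2/L5 frequencies, Hz

wl = c / (f1 - f2)     # wide-lane: ~0.86 m wavelength, easier to fix
ewl = c / (f2 - f5)    # extra-wide-lane: ~5.86 m, easiest to fix first
nl = c / (f1 + f2)     # narrow-lane: ~0.11 m, fixed last, most precise

print(f"EWL {ewl:.2f} m, WL {wl:.2f} m, NL {nl:.3f} m")
```

The long EWL wavelength is why sequential schemes fix EWL ambiguities first and then use them to constrain WL and finally NL.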
14. To Exascale and Beyond—The Simple Cloud‐Resolving E3SM Atmosphere Model (SCREAM), a Performance Portable Global Atmosphere Model for Cloud‐Resolving Scales.
- Author
- Donahue, A. S., Caldwell, P. M., Bertagna, L., Beydoun, H., Bogenschutz, P. A., Bradley, A. M., Clevenger, T. C., Foucar, J., Golaz, C., Guba, O., Hannah, W., Hillman, B. R., Johnson, J. N., Keen, N., Lin, W., Singh, B., Sreepathi, S., Taylor, M. A., Tian, J., and Terai, C. R.
- Subjects
- ATMOSPHERIC models, ATMOSPHERIC radiation measurement, HIGH performance computing, CLIMATE change models, GRAPHICS processing units, COMPUTER systems, HETEROGENEOUS computing
- Abstract
The new generation of heterogeneous CPU/GPU computer systems offers much greater computational performance but is not yet widely used for climate modeling. One reason for this is that traditional climate models were written before GPUs were available and would require an extensive overhaul to run on these new machines. In addition, even conventional "high-resolution" simulations don't currently provide enough parallel work to keep GPUs busy, so the benefits of such an overhaul would be limited for the types of simulations climate scientists are accustomed to. The vision of the Simple Cloud-Resolving Energy Exascale Earth System (E3SM) Atmosphere Model (SCREAM) project is to create a global atmospheric model with the architecture to efficiently use GPUs and horizontal resolution sufficient to fully take advantage of GPU parallelism. After 5 years of model development, SCREAM is finally ready for use. In this paper, we describe the design of this new code, its performance on both CPU and heterogeneous machines, and its ability to simulate real-world climate via a set of four 40-day simulations covering all four seasons of the year. 
Plain Language Summary: This paper describes the design and development of a 3 km version of the Energy Exascale Earth System Model (E3SM) atmosphere model, which has been fully rewritten in C++ using the Kokkos library for performance portability. This newly rewritten model is able to take advantage of state-of-the-science high-performance computing systems which use graphical processor units (GPUs) to mitigate much of the computational expense that typically plagues high-resolution global modeling. Taking advantage of this high performance, we are able to run four seasons of simulations at 3 km global resolution. We discuss the biases, including the diurnal cycle, by comparing model results with satellite and Atmospheric Radiation Measurement ground-based site data. 
Key Points: 
- Describes the C++/Kokkos implementation of the Simple Cloud-Resolving E3SM Atmosphere Model (SCREAMv1) 
- SCREAMv1 leverages GPUs to surpass one simulated year per compute day at global 3 km resolution 
- High resolution improves some meso-scale features and the diurnal cycle, but large-scale biases require improvement across all four seasons 

[ABSTRACT FROM AUTHOR]
- Published
- 2024
15. Atmospheric Limitations for High-frequency Ground-based Very Long Baseline Interferometry.
- Author
- Pesce, Dominic W., Blackburn, Lindy, Chaves, Ryan, Doeleman, Sheperd S., Freeman, Mark, Issaoun, Sara, Johnson, Michael D., Lindahl, Greg, Natarajan, Iniyan, Paine, Scott N., Palumbo, Daniel C. M., Roelofs, Freek, and Tiede, Paul
- Subjects
- VERY long baseline interferometry, INTERFEROMETRY, VISIBILITY, ATMOSPHERIC models
- Abstract
Very long baseline interferometry (VLBI) provides the highest-resolution images in astronomy. The sharpest resolution is nominally achieved at the highest frequencies, but as the observing frequency increases, so too does the atmospheric contribution to the system noise, degrading the sensitivity of the array and hampering detection. In this paper, we explore the limits of high-frequency VLBI observations using ngehtsim, a new tool for generating realistic synthetic data. ngehtsim uses detailed historical atmospheric models to simulate observing conditions, and it employs heuristic visibility detection criteria that emulate single- and multifrequency VLBI calibration strategies. We demonstrate the fidelity of ngehtsim's predictions using a comparison with existing 230 GHz data taken by the Event Horizon Telescope (EHT), and we simulate the expected performance of EHT observations at 345 GHz. Though the EHT achieves a nearly 100% detection rate at 230 GHz, our simulations indicate that it should expect substantially poorer performance at 345 GHz; in particular, observations of M87* at 345 GHz are predicted to achieve detection rates of ≲20% that may preclude imaging. Increasing the array sensitivity through wider bandwidths and/or longer integration times—as enabled through, e.g., the simultaneous multifrequency upgrades envisioned for the next-generation EHT—can improve the 345 GHz prospects and yield detection levels that are comparable to those at 230 GHz. M87* and Sgr A* observations carried out in the atmospheric window around 460 GHz could expect to regularly achieve multiple detections on long baselines, but analogous observations at 690 and 875 GHz consistently obtain almost no detections at all. [ABSTRACT FROM AUTHOR]
- Published
- 2024
16. Applying Machine Learning in Numerical Weather and Climate Modeling Systems.
- Author
- Krasnopolsky, Vladimir
- Subjects
- ATMOSPHERIC models, MACHINE learning, WEATHER, DEEP learning
- Abstract
In this paper, major machine learning (ML) tools and the most important applications developed elsewhere for numerical weather and climate modeling systems (NWCMS) are reviewed. NWCMSs are briefly introduced, the most important papers published in this field in recent years are reviewed, and the advantages and limitations of the ML approach in applications to NWCMS are briefly discussed. This field is currently experiencing explosive growth, with several important papers published every week; this paper should therefore be considered a simple introduction to the problem. [ABSTRACT FROM AUTHOR]
- Published
- 2024
17. Decision-making strategies implemented in SolFinder 1.0 to identify eco-efficient aircraft trajectories: application study in AirTraf 3.0.
- Author
- Castino, Federica, Yin, Feijia, Grewe, Volker, Yamashita, Hiroshi, Matthes, Sigrun, Dietmüller, Simone, Baumann, Sabine, Soler, Manuel, Simorgh, Abolfazl, Mendiguchia Meuser, Maximilian, Linke, Florian, and Lührs, Benjamin
- Subjects
- ATMOSPHERIC chemistry, DECISION making, OPERATING costs, ATMOSPHERIC models, WEATHER
- Abstract
The optimization of aircraft trajectories involves balancing operating costs and climate impact, which are often conflicting objectives. To achieve compromised optimal solutions, higher-level information such as preferences of decision-makers must be taken into account. This paper introduces the SolFinder 1.0 module, a decision-making tool designed to identify eco-efficient aircraft trajectories, which allow for the reduction of the flight's climate impact with limited cost penalties compared to cost-optimal solutions. SolFinder 1.0 offers flexible decision-making options that allow users to select trade-offs between different objective functions, including fuel use, flight time, NOx emissions, contrail distance, and climate impact. The module is included in the AirTraf 3.0 submodel, which optimizes trajectories under atmospheric conditions simulated by the ECHAM/MESSy Atmospheric Chemistry model. This paper focuses on the ability of the module to identify eco-efficient trajectories while solving a bi-objective optimization problem that minimizes climate impact and operating costs. SolFinder 1.0 enables users to explore trajectory properties at varying locations of the Pareto fronts without prior knowledge of the problem results and to identify solutions that limit the cost of reducing the climate impact of a single flight. [ABSTRACT FROM AUTHOR]
- Published
- 2024
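A decision rule of the general kind this abstract describes, trading a bounded cost penalty for climate-impact reduction along a Pareto front, can be sketched as follows. The selection rule, threshold, and numbers here are invented for illustration; they are not SolFinder's actual logic.

```python
# Hypothetical sketch: pick an "eco-efficient" trajectory from a Pareto front
# of (operating cost, climate impact) points -- the one with the largest
# impact reduction per unit of extra cost, within a maximum cost penalty.
pareto = [  # (cost, climate impact), both in arbitrary units
    (100.0, 50.0),   # cost-optimal trajectory
    (101.0, 40.0),
    (103.0, 34.0),
    (108.0, 30.0),
    (120.0, 29.0),
]

def eco_efficient(front, max_cost_penalty_pct=5.0):
    c0, i0 = min(front)                  # cost-optimal point
    best, best_ratio = (c0, i0), 0.0
    for c, i in front:
        penalty_pct = 100.0 * (c - c0) / c0
        if 0 < penalty_pct <= max_cost_penalty_pct:
            ratio = (i0 - i) / (c - c0)  # impact saved per unit extra cost
            if ratio > best_ratio:
                best, best_ratio = (c, i), ratio
    return best

print(eco_efficient(pareto))   # -> (101.0, 40.0): 10 impact units per 1 cost
```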
18. LB-SCAM: a learning-based method for efficient large-scale sensitivity analysis and tuning of the Single Column Atmosphere Model (SCAM).
- Author
- Guo, Jiaxu, Zheng, Juepeng, Xu, Yidan, Fu, Haohuan, Xue, Wei, Wang, Lanning, Gan, Lin, Gao, Ping, Wan, Wubing, Wu, Xianwei, Zhang, Zhitao, Hu, Liang, Xu, Gaochao, and Che, Xilong
- Subjects
- ATMOSPHERIC models, SWINDLERS & swindling, SENSITIVITY analysis, CLOUDINESS, MACHINE learning
- Abstract
The single-column model, with its advantages of low computational cost and fast execution speed, can assist users in gaining a more intuitive understanding of the impact of parameters on the simulated results of climate models. It plays a crucial role in the study of parameterization schemes, allowing for a more direct exploration of the influence of parameters on climate model simulations. In this paper, we employed various methods to conduct sensitivity analyses on the 11 parameters of the Single Column Atmospheric Model (SCAM). We explored their impact on output variables such as precipitation, temperature, humidity, and cloud cover, among others, across five test cases. To further expedite experimentation, we utilized machine learning methods to train surrogate models for the aforementioned cases. Additionally, three-parameter joint perturbation experiments were conducted based on these surrogate models to validate the combined parameter effects on the results. Subsequently, targeting the sensitive parameter combinations identified from the aforementioned experiments, we further conducted parameter tuning for the corresponding test cases to minimize the discrepancy between the results of SCAM and observational data. Our proposed method not only enhances model performance but also expedites parameter tuning speed, demonstrating good generality at the same time. [ABSTRACT FROM AUTHOR]
- Published
- 2024
19. Saliency Detection Algorithm for Foggy Images Based on Deep Learning.
- Author
- Leihong Zhang, Zhaoyuan Ji, Runchu Xu, and Dawei Zhang
- Subjects
- DEEP learning, IMAGE enhancement (Imaging systems), ALGORITHMS, IMAGE recognition (Computer vision), ATMOSPHERIC models, WEATHER
- Abstract
The detection of salient objects in foggy scenes is an important research component in many practical applications such as action recognition, target tracking and pedestrian re-identification. To facilitate saliency detection in foggy scenes, this paper explores two issues: the construction of a dataset for foggy weather conditions, and an implementation scheme for foggy-weather saliency detection. Firstly, a foggy sky image synthesis method is designed based on the atmospheric scattering model, and a saliency detection dataset applicable to foggy sky is constructed. Secondly, we compare current classification networks and adopt resnet50, which has the highest classification accuracy, as the backbone network of the classification module, and classify the foggy sky images into three levels, namely fogless, light fog and dense fog, according to fog concentration. Then, the Residual Refinement Network (R2Net) is selected to train and test the classified images. Horizontal and vertical flipping and image cropping are used to augment the training set and relieve over-fitting, and the accuracy of the network model is improved by using Adam as the optimizer. Experimental results show that for the detection of fogless images our method is almost on par with the state of the art, and it performs well for both light and dense fog images. Our method has good adaptability, accuracy and robustness. [ABSTRACT FROM AUTHOR]
- Published
- 2023
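The atmospheric scattering model used for foggy-image synthesis is the standard formulation I = J·t + A·(1 − t) with transmission t = exp(−β·d): a clear image J is blended toward the airlight A according to scene depth d. The sketch below applies it to a synthetic image; the depth map, airlight, and scattering coefficient are arbitrary illustrative values, not the paper's settings.

```python
# Sketch of the atmospheric scattering model for fog synthesis:
# I = J * t + A * (1 - t), with transmission t = exp(-beta * d).
import numpy as np

rng = np.random.default_rng(3)
J = rng.uniform(0.0, 1.0, (4, 4, 3))            # clear image, values in [0, 1]
d = np.linspace(10.0, 100.0, 16).reshape(4, 4)  # per-pixel depth in metres
A = 0.9                                          # atmospheric light (airlight)
beta = 0.02                                      # scattering coeff: fog density

t = np.exp(-beta * d)[..., None]  # transmission map, broadcast over RGB
I = J * t + A * (1.0 - t)         # synthesised hazy image

# Deeper pixels have lower transmission and are pulled toward the airlight.
print(f"transmission range: {t.min():.3f} to {t.max():.3f}")
```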
20. High resolution global climate modelling; the UPSCALE project, a large simulation campaign.
- Author
- Mizielinski, M. S., Roberts, M. J., Vidale, P. L., Schiemann, R., Demory, M.-E., Strachan, J., Edwards, T., Stephens, A., Lawrence, B. N., Pritchard, M., Chiu, P., Iwi, A., Churchill, J., Novales, C. del Cano, Kettleborough, J., Roseblade, W., Selwood, P., Foster, M., Glover, M., and Malcolm, A.
- Subjects
- ATMOSPHERIC models, WEATHER forecasting, HIGH performance computing, DATA, MATHEMATICAL optimization
- Abstract
The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environmental Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the high performance computing center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE dataset. This dataset is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
21. On the relationship between metrics to compare greenhouse gases - the case of IGTP, GWP and SGTP.
- Author
-
Azar, C. and Johansson, D. J. A.
- Subjects
ATMOSPHERIC models ,GREENHOUSE gases ,GLOBAL temperature changes ,CLIMATOLOGY ,ATMOSPHERIC temperature - Abstract
The article focuses on the metrics used for comparing greenhouse gases, with a focus on the Integrated Temperature Change Potential (IGTP). It states that the IGTP and global warming potentials (GWP) are asymptotically equal as the time horizon approaches infinity. It says that the IGTP is equal to the Sustained Global Temperature change Potential (SGTP) under the standard assumptions used when calculating GWPs.
- Published
- 2012
- Full Text
- View/download PDF
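The metrics compared in this entry are all built from time-integrated pulse responses. As a purely illustrative sketch, the absolute-GWP building block (the time-integrated radiative forcing of a pulse emission with exponential decay) can be written as a simple numerical integral; the efficiency A and lifetime tau below are placeholder values, not assessed numbers for any real gas:

```python
import numpy as np

# Illustrative sketch of an Absolute GWP (AGWP) integral: the time-integrated
# radiative forcing of a pulse emission decaying as exp(-t/tau). A and tau
# are placeholder values for illustration only.

def agwp(A, tau, horizon, n=100001):
    """Trapezoidal integration of A * exp(-t / tau) over [0, horizon]."""
    t = np.linspace(0.0, horizon, n)
    forcing = A * np.exp(-t / tau)
    return np.sum(0.5 * (forcing[1:] + forcing[:-1]) * np.diff(t))

# Check against the closed form: A * tau * (1 - exp(-H / tau)).
A, tau, H = 2.0, 12.0, 100.0
numeric = agwp(A, tau, H)
analytic = A * tau * (1.0 - np.exp(-H / tau))
```

A GWP is then the ratio of two such integrals (gas over reference gas); the IGTP discussed in the abstract instead places the temperature response inside the integral.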
22. A flexible importance sampling method for integrating subgrid processes.
- Author
-
Raut, E. K. and Larson, V. E.
- Subjects
MICROPHYSICS ,ATMOSPHERIC models ,SAMPLING methods - Abstract
Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box; for this reason, integration is an important aspect of atmospheric modeling. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
23. NCAR global model topography generation software for unstructured grids.
- Author
-
Lauritzen, P. H., Bacmeister, J. T., Callaghan, P. F., and Taylor, M. A.
- Subjects
TOPOGRAPHY ,ATMOSPHERIC models ,TURBULENT flow ,PARAMETERIZATION ,COMPUTER software ,NUMERICAL grid generation (Numerical analysis) - Abstract
It is the purpose of this paper to document the NCAR global model topography generation software for unstructured grids. Given a model grid, the software computes the fraction of the grid box covered by land, the grid-box mean elevation, and associated sub-grid-scale variances commonly used for gravity wave and turbulent mountain stress parameterizations. The software supports regular latitude-longitude grids as well as unstructured grids, e.g. icosahedral, Voronoi, cubed-sphere and variable-resolution grids. As an example application, and in the spirit of documenting model development, exploratory simulations illustrating the impacts of topographic smoothing with the NCAR-DOE CESM (Community Earth System Model) CAM5.2-SE (Community Atmosphere Model version 5.2 - Spectral Elements dynamical core) are shown. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
24. A 3-D RBF-FD elliptic solver for irregular boundaries: modeling the atmospheric global electric circuit with topography.
- Author
-
Bayona, V., Flyer, N., Lucas, G. M., and Baumgaertner, A. J. G.
- Subjects
TOPOGRAPHY ,ELECTRIC circuits ,ATMOSPHERIC models ,ELECTRIC conductivity ,MATHEMATICAL models - Abstract
A numerical model based on Radial Basis Function-generated Finite Differences (RBF-FD) is developed for simulating the Global Electric Circuit (GEC) within the Earth's atmosphere, represented by a 3-D variable-coefficient linear elliptic PDE in a spherically shaped volume with the lower boundary being the Earth's topography and the upper boundary a sphere at 60 km. To our knowledge, this is (1) the first numerical model of the GEC to combine the Earth's topography with directly approximating the differential operators in 3-D space, and, related to this, (2) the first RBF-FD method to use irregular 3-D stencils for discretization to handle the topography. It benefits from the mesh-free nature of RBF-FD, which is especially suitable for modeling high-dimensional problems with irregular boundaries. The RBF-FD elliptic solver proposed here makes no limiting assumptions on the spatial variability of the coefficients in the PDE (i.e. the conductivity profile), the right-hand-side forcing term of the PDE (i.e. the distribution of current sources) or the geometry of the lower boundary. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
25. PLASIM-GENIE: a new intermediate complexity AOGCM.
- Author
-
Holden, P. B., Edwards, N. R., Fraedrich, K., Kirk, E., Lunkeit, F., and Zhu, X.
- Subjects
ATMOSPHERIC circulation ,ATMOSPHERIC models - Abstract
We describe the development, tuning and climate of PLASIM-GENIE, a new intermediate-complexity Atmosphere-Ocean Global Climate Model (AOGCM), built by coupling the Planet Simulator to the GENIE Earth system model. PLASIM-GENIE supersedes "GENIE-2", a coupling of GENIE to the Reading IGCM. It has been developed to join the limited number of models that bridge the gap between EMICs with simplified atmospheric dynamics and state-of-the-art AOGCMs. A 1000 year simulation with PLASIM-GENIE requires approximately two weeks on a single node of a 2.1 GHz AMD 6172 CPU. An important motivation for intermediate-complexity models is the evaluation of uncertainty. We here demonstrate the tractability of PLASIM-GENIE ensembles by deriving a "subjective" tuning of the model with a 50-member ensemble of 1000 year simulations. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
26. Overview of the Coupled Model Intercomparison Project Phase 6 (CMIP6) experimental design and organisation.
- Author
-
Eyring, V., Bony, S., Meehl, G. A., Senior, C., Stevens, B., Stouffer, R. J., and Taylor, K. E.
- Subjects
EXPERIMENTAL design ,ATMOSPHERIC models ,CLIMATE change - Abstract
By coordinating the design and distribution of global climate model simulations of the past, current and future climate, the Coupled Model Intercomparison Project (CMIP) has become one of the foundational elements of climate science. However, the need to address an ever-expanding range of scientific questions arising from more and more research communities has made it necessary to revise the organization of CMIP. After a long and wide community consultation, a new and more federated structure has been put in place. It consists of three major elements: (1) a handful of common experiments, the DECK (Diagnostic, Evaluation and Characterization of Klima experiments) and the CMIP Historical Simulation (1850-near-present) that will maintain continuity and help document basic characteristics of models across different phases of CMIP, (2) common standards, coordination, infrastructure and documentation that will facilitate the distribution of model outputs and the characterization of the model ensemble, and (3) an ensemble of CMIP-Endorsed Model Intercomparison Projects (MIPs) that will be specific to a particular phase of CMIP (now CMIP6) and that will build on the DECK and the CMIP Historical Simulation to address a large range of specific questions and fill the scientific gaps of the previous CMIP phases. The DECK and CMIP Historical Simulation, together with the use of CMIP data standards, will be the entry cards for models participating in CMIP. The participation in the CMIP6-Endorsed MIPs will be at the discretion of the modelling groups, and will depend on scientific interests and priorities. 
With the Grand Science Challenges of the World Climate Research Programme (WCRP) as its scientific backdrop, CMIP6 will address three broad questions: (i) how does the Earth system respond to forcing?, (ii) what are the origins and consequences of systematic model biases?, and (iii) how can we assess future climate changes given climate variability, predictability and uncertainties in scenarios? This CMIP6 overview paper presents the background and rationale for the new structure of CMIP, provides a detailed description of the DECK and the CMIP6 Historical Simulation, and includes a brief introduction to the 21 CMIP6-Endorsed MIPs. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
27. Upscaling with the dynamic two-layer classification concept (D2C): TreeMig-2L, an efficient implementation of the forest-landscape model TreeMig.
- Author
-
Nabel, J. E. M. S.
- Subjects
CLIMATE change models ,ATMOSPHERIC models ,VEGETATION dynamics ,MODELS & modelmaking ,SIMULATION methods & models - Abstract
Models used to investigate impacts of climatic changes on spatio-temporal vegetation dynamics need to balance required accuracy with computational feasibility. To enhance the computational efficiency of these models, upscaling methods are required that maintain key fine-scale processes influencing vegetation dynamics. In this paper, an adjustable method - the dynamic two-layer classification concept (D2C) - for the upscaling of time- and space-discrete models is presented. D2C aims to separate potentially repetitive calculations from those specific to single grid cells. The underlying idea is to extract processes that do not require information about a grid cell's neighbourhood to a reduced-size non-spatial layer, which is dynamically coupled to the original two-dimensional layer. The size of the non-spatial layer is thereby adaptive and depends on dynamic classifications according to pre-specified similarity criteria. I present how D2C can be used in a model implementation using the example of TreeMig-2L, a new, efficient version of the intermediate-complexity forest-landscape model TreeMig. To discuss the trade-off between computational expenses and accuracy, as well as the applicability of D2C, I compare different model stages of TreeMig-2L via simulations of two different application scenarios. This comparison of different model stages demonstrates that applying D2C can strongly reduce the computational expenses of processes calculated on the new non-spatial layer. D2C is thus a valuable upscaling method for models and applications in which processes requiring information about the neighbourhood constitute the minor share of the overall computational expenses. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
28. Estimates of common ragweed pollen emission and dispersion over Europe using RegCM-pollen model.
- Author
-
Liu, L., Solmon, F., Vautard, R., Hamaoui-Laguel, L., Torma, Cs. Zs., and Giorgi, F.
- Subjects
RAGWEEDS ,ATMOSPHERIC models ,INVASIVE plants ,CLIMATE change ,ANGIOSPERMS - Abstract
Common ragweed (Ambrosia artemisiifolia L.) is a highly allergenic and invasive plant in Europe. Its pollen can be transported over large distances and has been recognized as a significant cause of hay fever and asthma (D'Amato et al., 2007; Burbach et al., 2009). To simulate the production and dispersion of common ragweed pollen, we implement a pollen emission and transport module in the Regional Climate Model (RegCM) version 4, using the framework of the Community Land Model (CLM) version 4.5. In the online model environment, where climate is integrated with dispersion and vegetation production, pollen emissions are calculated based on the modelling of plant distribution, pollen production, species-specific phenology, flowering probability, and flux response to meteorological conditions. A pollen tracer model is used to describe pollen advective transport, turbulent mixing, and dry and wet deposition. The model is then applied and evaluated on a European domain for the period 2000-2010. To reduce the large uncertainties in pollen emission, notably due to the ragweed density distribution, a calibration based on airborne pollen observations is used. The resulting simulations show that the model captures the gross features of the pollen concentrations found in Europe, and reproduces reasonably well both the spatial and temporal patterns of the flowering season and the associated pollen concentrations measured over Europe. The model can explain 68.6, 39.2, and 34.3 % of the observed variance in the starting, central, and ending dates of the pollen season, with associated root mean square errors (RMSE) equal to 4.7, 3.9, and 7.0 days, respectively. The correlation between simulated and observed daily concentration time series reaches 0.69. Statistical scores show that the model performs better over the central European source region, where pollen loads are larger. 
From these simulations, health risks associated with common ragweed pollen spread are then evaluated through calculation of exposure time above health-relevant threshold levels. The total risk area with concentration above 5 grains m⁻³ takes up 29.5 % of the domain. The longest exposure time occurs on the Pannonian Plain, where the number of days per year with daily concentration above 20 grains m⁻³ exceeds 30. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
29. AnaWEGE: a weather generator based on analogues of atmospheric circulation.
- Author
-
Yiou, P.
- Subjects
ATMOSPHERIC circulation ,SEA level ,METEOROLOGICAL observations ,AUTOCORRELATION (Statistics) ,ATMOSPHERIC models - Abstract
This paper presents a stochastic weather generator based on analogues of circulation (AnaWEGE). Analogues of circulation have been a promising paradigm to analyse climate variability and its extremes. The weather generator uses precomputed analogues of sea-level pressure over the North Atlantic. The stochastic rules of the generator constrain the continuity in time of the simulations. The generator then simulates spatially coherent time series of a climate variable, drawn from meteorological observations. The weather generator is tested for European temperatures, and for winter and summer seasons. The biases in temperature quantiles and autocorrelation are rather small compared to observed variability. The ability of simulating extremely hot summers and cold winters is also assessed. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
30. Application and evaluation of McICA scheme with new radiation code in BCC_AGCM2.0.1.
- Author
-
Zhang, H., Jing, X., and Li, J.
- Subjects
MONTE Carlo method ,K-distribution (Probability theory) ,ATMOSPHERIC models ,ATMOSPHERIC temperature ,HUMIDITY research ,INHOMOGENEOUS materials ,CONDENSATION (Meteorology) - Abstract
This research incorporates the Monte Carlo Independent Column Approximation (McICA) scheme, with the correlated k-distribution BCC-RAD radiation model, into the climate model BCC_AGCM2.0.1 and examines the impacts on modeled climate through several simulations with variations in cloud structures. Results from experiments with consistent sub-grid cloud structures show that both clear-sky radiation fluxes and cloud radiative forcings (CRFs) calculated by the new scheme are mostly improved relative to those calculated from the original one. The modeled atmospheric temperature and specific humidity are also improved due to changes in the radiative heating rates. The vertical overlap of fractional clouds and the horizontal distribution of cloud condensation are important for computing CRFs. The maximum changes in seasonal CRF using the general overlap assumption (GenO) with different decorrelation depths (Lcf) are larger than 10 and 20 W m⁻² for longwave (LW) CRF and shortwave (SW) CRF, respectively, mostly located in the Tropics and mid-latitude storm tracks. Larger (smaller) Lcf in the Tropics (mid-latitude storm tracks) yields better cloud fraction and CRF compared with observations. The inclusion of an observation-based horizontal inhomogeneity of cloud condensation has a distinct impact on LW CRF and SW CRF, with global means of ∼1.2 W m⁻² and ∼3.7 W m⁻² at the top of the atmosphere, respectively, making these much closer to observations. These results prove the reliability of the new model configuration for use in BCC_AGCM2.0.1 for climate simulations, and also indicate that more detailed real-world information on cloud structures should be obtained to constrain cloud settings in McICA in the future. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
31. Validation of two independent retrievals of SCIAMACHY water vapour columns using radiosonde data.
- Author
-
du Piesanie, A., Piters, A. J. M., Aben, I., Schrijver, H., Wang, P., and Noël, S.
- Subjects
ATMOSPHERIC water vapor ,RADIOSONDES ,AIR masses ,HYGROMETRY ,ATMOSPHERIC models - Abstract
Two independently derived SCIAMACHY total water vapour column (WVC) products are compared with integrated water vapour data calculated from radiosonde measurements, and with each other. The two SCIAMACHY WVC products are retrieved with two different retrieval algorithms, applied in the visible and short-wave infrared wavelength regions respectively. The first SCIAMACHY WVC product used in the comparison is ESA's level 2 version 5.01 WVC product, derived with the Air Mass Corrected Differential Absorption Spectroscopy (AMC-DOAS) retrieval algorithm (SCIAMACHY-ESA). The second SCIAMACHY WVC product is derived using the Iterative Maximum Likelihood Method (IMLM) developed by the Netherlands Institute for Space Research (SCIAMACHY-IMLM). Both SCIAMACHY WVC products are compared with collocated water vapour amounts determined from daily relative humidity radiosonde measurements obtained from the European Centre for Medium-Range Weather Forecasts (ECMWF) radiosonde network, over an 18 month and a 2 yr period respectively. Results indicate a good agreement between the WVC amounts of SCIAMACHY-ESA and the radiosonde, and a mean difference of 0.03 g cm⁻² is found for cloud-free conditions. Overall the SCIAMACHY-ESA WVC amounts are smaller than the radiosonde WVC amounts, especially over oceans. For cloudy conditions the WVC bias has a clear dependence on the cloud top height and increases with increasing cloud top heights larger than approximately 2 km. A likely cause for this could be the different vertical profile shapes of water vapour and O2, leading to different relative changes in their optical thickness, which makes the AMF correction method used in the algorithm less suitable for high clouds. The SCIAMACHY-IMLM WVC amounts compare well to the radiosonde WVC amounts during cloud-free conditions over land. A mean difference of 0.08 g cm⁻² is found, which is consistent with previous results when comparing daily averaged SCIAMACHY-IMLM WVC amounts with ECMWF model data globally. Furthermore, we show that the measurements for cloudy conditions (cloud fraction ≥ 0.5) with low clouds (cloud pressure ≥ 930 hPa) above the ocean and land compare quite well with radiosonde data. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
32. DOAS measurements of NO2 from an ultralight aircraft during the Earth Challenge expedition.
- Author
-
Merlaud, A., Van Roozendael, M., van Gent, J., Fayt, C., Maes, J., Toledo, X., Ronveaux, O., and De Mazière, M.
- Subjects
ATMOSPHERIC models ,SIMULATION methods & models ,CALIBRATION ,NITROGEN dioxide ,LIGHT absorption ,ABSORPTION spectra ,RADIATIVE transfer - Abstract
The article presents a study which calibrates nitrogen dioxide (NO2) tropospheric columns using airborne Differential Optical Absorption Spectroscopy (DOAS). The study looks into technical aspects of the ULM-DOAS instrument and describes data analysis techniques including DOAS settings, radiative transfer modeling, and the inversion scheme. It also investigates the aforementioned measurements' sensitivity to geometrical and geophysical parameters.
- Published
- 2012
- Full Text
- View/download PDF
33. A standard test case suite for two-dimensional linear transport on the sphere.
- Author
-
Lauritzen, P. H., Skamarock, W. C., Prather, M. J., and Taylor, M. A.
- Subjects
ATMOSPHERIC models ,TROPOSPHERIC chemistry ,TWO-dimensional models ,SCALAR field theory ,GEOPHYSICS - Abstract
The article proposes a standard test case suite for two-dimensional transport schemes on the sphere, which is meant to be used for model development and for facilitating scheme intercomparison. It notes that the test case suites are intended to assess important aspects of accuracy in geophysical fluid dynamics. The different challenges posed by the experiments for the range of transport approaches, from Lagrangian to Eulerian, are discussed.
- Published
- 2012
- Full Text
- View/download PDF
34. A suite of Early Eocene (~ 55 Ma) climate model boundary conditions.
- Author
-
Herold, N., Buzan, J., Seton, M., Goldner, A., Green, J. A. M., Müller, R. D., Markwick, P., and Huber, M.
- Subjects
ATMOSPHERIC models ,EOCENE paleobotany ,ATMOSPHERIC aerosols ,BATHYMETRY ,TIDES - Abstract
We describe a set of Early Eocene (~ 55 Ma) climate model boundary conditions constructed in a self-consistent reference frame and incorporating recent data and methodologies. Given the growing need for uniform experimental design within the Eocene climate modelling community, we make publicly available our datasets of Eocene topography, bathymetry, tidal dissipation, vegetation, aerosol distributions and river runoff. In particular, our Eocene topography and bathymetry have been significantly improved compared to previously utilized boundary conditions. Major improvements include the paleogeography of Antarctica, Australia, Europe, the Drake Passage and the Isthmus of Panama, and our boundary conditions include modelled estimates of Eocene aerosol distributions and tidal dissipation for the first time, both consistent with our paleotopography and paleobathymetry. The resolution of our datasets (1° × 1°) is also unprecedented and will facilitate high-resolution climate simulations. In light of the inherent uncertainties involved in reconstructing global boundary conditions for past time periods, these datasets should be considered as one interpretation of the available data. This paper marks the beginning of a process for reconstructing a set of accurate, open-access Eocene boundary conditions for use in climate models. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
35. The large-scale spatio-temporal variability of precipitation over Sweden observed from the weather radar network.
- Author
-
Devasthale, A. and Norin, L.
- Subjects
SPATIO-temporal variation ,METEOROLOGICAL precipitation ,WEATHER radar networks ,ATMOSPHERIC models - Abstract
Using measurements from the national network of 12 weather radar stations for the last decade (2000-2010), we investigate the large-scale spatio-temporal variability of precipitation over Sweden. These statistics provide useful information to evaluate regional climate models as well as for hydrology and energy applications. A strict quality control is applied to filter out noise and artifacts from the radar data. We focus on investigating four distinct aspects, namely the diurnal cycle of precipitation and its seasonality, the dominant time scale (diurnal vs. seasonal) of variability, the precipitation response to different wind directions, and the correlation of precipitation events with the North Atlantic Oscillation (NAO) and the Arctic Oscillation (AO). When classified based on their intensity, moderate- to high-intensity events (precipitation > 0.34 mm (3 h)⁻¹) peak distinctly during late afternoon over the majority of radar stations in summer, and during late night or early morning in winter. Precipitation variability is highest over the southwestern parts of Sweden. It is shown that the high-intensity events (precipitation > 1.7 mm (3 h)⁻¹) are positively correlated with NAO and AO (esp. over northern Sweden), while the low-intensity events are negatively correlated (esp. over southeastern parts). It is further observed that southeasterly winds often lead to intense precipitation events over central and northern Sweden, while southwesterly winds contribute most to the total accumulated precipitation for all radar stations. Apart from its operational applications, the present study demonstrates the potential of the weather radar data set for studying climatic features of precipitation over Sweden. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
36. Editorial for the Special Issue "Atmospheric Dispersion and Chemistry Models: Advances and Applications".
- Author
-
Viúdez-Moreiras, Daniel
- Subjects
ATMOSPHERIC chemistry ,CHEMICAL models ,DISPERSION (Atmospheric chemistry) ,PRECIPITATION scavenging ,ATMOSPHERIC models ,RESEARCH reactors ,CONTINUOUS emission monitoring ,ENVIRONMENTAL sciences - Abstract
The first paper published in this SI, by Mazzeo et al. [[1]], used an air quality model, coupled online with a meteorological model, to simulate the impact of emission reductions on PM2.5 in the West Midlands region of the United Kingdom. Atmospheric dispersion and chemical transport models (CTMs) are a key tool in both atmospheric chemistry and environmental sciences. Liu et al. [[4]] analyzed the relationship between sandstorm periods and the transport and dispersion of particulate emissions from coal bases in northwest China, assisted by a backward trajectory analysis performed with an atmospheric dispersion model. [Extracted from the article]
- Published
- 2023
- Full Text
- View/download PDF
37. Comment on "Advanced Testing of Low, Medium, and High ECS CMIP6 GCM Simulations Versus ERA5‐T2m" by N. Scafetta (2022).
- Author
-
Schmidt, Gavin A., Jones, Gareth S., and Kennedy, John J.
- Subjects
STATISTICAL errors ,ATMOSPHERIC models ,GLOBAL warming - Abstract
Scafetta (2022, https://doi.org/10.1029/2022gl097716) purports to test Coupled Model Intercomparison Project Phase 6 (CMIP6) climate models through a comparison of temperature changes over three decades. Unfortunately, the paper contains numerous conceptual and statistical errors that undermine all of the conclusions. First, no uncertainty is given for the observational temperature difference, making it impossible to assess compatibility with any model result. Second, the CMIP6 data are the ensemble means for each model, but the metric being tested is sensitive to the internal variability and so the full ensemble for each model must be used. When this is corrected, the conclusion that "all models with ECS > 3.0°C overestimate the observed global surface warming" is not sustained. Third, the statistical test in Section 2 would reject all models even in a perfect model setup given sufficient ensemble members, thus the second conclusion "that spatial t‐statistics rejects the data‐model agreement" is also not sustainable. Plain Language Summary: Comparisons of models and observations need to account for multiple sources of uncertainty in both the observations and due to the chaotic dynamics of the weather. The analyses in Scafetta (2022, https://doi.org/10.1029/2022gl097716) do not take either of these issues into account and thus the conclusions in that paper are not supportable. Key Points: Scafetta (2022) contains errors in both of the statistical tests used that make the conclusions unsupportable [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
38. The effect of using the plant functional type paradigm on a data-constrained global phenology model.
- Author
-
Caldararu, S., Purves, D. W., and Smith, M. J.
- Subjects
PLANT phenology ,ECOSYSTEMS ,ATMOSPHERIC models ,EARTH system science ,PARAMETERIZATION - Abstract
Leaf seasonality impacts a variety of important biological, chemical and physical Earth system processes, which makes it essential to represent leaf phenology in ecosystem and climate models. However, we are still lacking a general, robust parametrisation of phenology at global scales. In this study, we use a simple process-based model, which describes phenology as a strategy for carbon optimality, to test the effects of the common assumption in global modelling studies that plant species within the same plant functional type have the same parameter values, implying they are assumed to have the same species traits. In a previous study this model was shown to predict spatial and temporal dynamics of leaf area index (LAI) well across the entire global land surface provided local grid cell parameters were used, and is able to explain 96% of the spatial variation in average LAI and 87% of the variation in amplitude. In contrast, we find here that a PFT level parametrisation is unable to capture the spatial variability in seasonal cycles, explaining on average only 28% of the spatial variation in mean leaf area index and 12% of the variation in seasonal amplitude. However we also show that allowing only two parameters, light compensation point and leaf age, to be spatially variable dramatically improves the model predictions, increasing the model's capability of explaining spatial variations in leaf seasonality to 70 and 57% of the variation in LAI average and amplitude respectively. This highlights the importance of identifying the spatial scale of variation of plant traits and the necessity to critically analyse the use of the plant functional type assumption in Earth system models. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
39. Conservative interpolation between general spherical meshes.
- Author
-
Kritsikis, E., Aechtner, M., Meurdesoif, Y., and Dubos, T.
- Subjects
NUMERICAL grid generation (Numerical analysis) ,ALGORITHMS ,FINITE volume method ,ATMOSPHERIC models ,QUADRILATERALS - Abstract
An efficient, local, explicit, second-order, conservative interpolation algorithm between spherical meshes is presented. The cells composing the source and target meshes may be either spherical polygons or longitude-latitude quadrilaterals. Second-order accuracy is obtained by piecewise-linear finite volume reconstruction over the source mesh. Global conservation is achieved through the introduction of a supermesh, whose cells are all possible intersections of source and target cells. Areas and intersections are computed exactly to yield a geometrically exact method. The main efficiency bottleneck caused by the construction of the supermesh is overcome by adopting tree-based data structures and algorithms, from which the mesh connectivity can also be deduced efficiently. The theoretical second-order accuracy is verified using a smooth test function and pairs of meshes commonly used for atmospheric modelling. Experiments confirm that the most expensive operations, especially the supermesh construction, have O(NlogN) computational cost. The method presented is meant to be incorporated in pre- or post-processing atmospheric modelling pipelines, or directly into models for flexible input/output. It could also serve as a basis for conservative coupling between model components, e.g. atmosphere and ocean. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
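The supermesh construction described in this abstract is easiest to see in one dimension. The following sketch (a 1-D, piecewise-constant analogue, not the authors' spherical implementation) computes every source-target intersection exactly, so the global integral is conserved by construction:

```python
import numpy as np

# 1-D analogue of the supermesh idea, for illustration only: each
# "supermesh" cell is the intersection of one source cell with one target
# cell, and its exact length weights the source value. Summing exact
# overlaps makes the remap globally conservative.

def conservative_remap(src_edges, src_vals, tgt_edges):
    tgt_vals = np.zeros(len(tgt_edges) - 1)
    for j in range(len(tgt_edges) - 1):
        acc = 0.0
        for i in range(len(src_edges) - 1):
            # Length of the intersection of source cell i and target cell j.
            overlap = (min(src_edges[i + 1], tgt_edges[j + 1])
                       - max(src_edges[i], tgt_edges[j]))
            if overlap > 0.0:
                acc += src_vals[i] * overlap
        tgt_vals[j] = acc / (tgt_edges[j + 1] - tgt_edges[j])
    return tgt_vals

src_edges = np.array([0.0, 0.3, 0.7, 1.0])
src_vals = np.array([2.0, 1.0, 4.0])
tgt_edges = np.linspace(0.0, 1.0, 6)      # 5 equal target cells
tgt_vals = conservative_remap(src_edges, src_vals, tgt_edges)

# Global conservation: the integrals on both grids agree.
src_integral = np.sum(src_vals * np.diff(src_edges))
tgt_integral = np.sum(tgt_vals * np.diff(tgt_edges))
```

The paper's contribution is making the 2-D spherical version of this exact-intersection computation efficient (O(N log N)) via tree-based data structures, and second-order accurate via piecewise-linear reconstruction.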
40. The Scaling LInear Macroweather model (SLIM): using scaling to forecast global scale macroweather from months to decades.
- Author
-
Lovejoy, S., Amador, L. del Rio, and Hébert, R.
- Subjects
WEATHER forecasting ,ATMOSPHERIC models ,CLIMATE change models ,EFFECT of human beings on climate change ,STOCHASTIC models - Abstract
At scales of ≈10 days (the lifetime of planetary scale structures), there is a drastic transition from high frequency weather to low frequency macroweather. This scale is close to the predictability limits of deterministic atmospheric models; so that in GCM macroweather forecasts, the weather is a high frequency noise. But neither the GCM noise nor the GCM climate is fully realistic. In this paper we show how simple stochastic models can be developped that use empirical data to force the statistics and climate to be realistic so that even a two parameter model can outperform GCM's for annual global temperature forecasts. The key is to exploit the scaling of the dynamics and the enormous stochastic memories that it implies. Since macroweather intermittency is low, we propose using the simplest model based on fractional Gaussian noise (fGn): the Scaling LInear Macroweather model (SLIM). SLIM is based on a stochastic ordinary differential equations, differing from usual linear stochastic models (such as the Linear Inverse Mod15 elling, LIM) in that it is of fractional rather than integer order. Whereas LIM implicitly assumes there is no low frequency memory, SLIM has a huge memory that can be exploited. Although the basic mathematical forecast problem for fGn has been solved, we approach the problem in an original manner notably using the method of innovations to obtain simpler results on forecast skill and on the size of the effective system memory. A key to successful forecasts of natural macroweather variability is to first remove the low frequency anthropogenic component. A previous attempt to use fGn for forecasts had poor results because this was not done. We validate our theory using hindcasts of global and Northern Hemisphere temperatures at monthly and annual resolutions. Several nondimensional measures of forecast skill - with no adjustable parameters - show excellent agreement with hindcasts and these show some skill even at decadal scales. 
We also compare our forecast errors with those of several GCM experiments (with and without initialization), and with other stochastic forecasts, showing that even this simplest two-parameter SLIM model is somewhat superior. In future work, using a space-time (regionalized) generalization of SLIM, we expect to be able to exploit the system memory more extensively and obtain even more realistic forecasts. [ABSTRACT FROM AUTHOR]
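The machinery the abstract describes, a minimum-mean-square-error linear forecast exploiting the long memory of fractional Gaussian noise, can be sketched compactly. The following is a minimal illustration using the standard unit-variance fGn autocovariance, not the authors' SLIM code (function names are mine, and it solves the covariance system directly rather than using the paper's method of innovations):

```python
import numpy as np

def fgn_acov(k, H):
    """Autocovariance of unit-variance fractional Gaussian noise at lag k."""
    k = np.abs(k)
    return 0.5 * ((k + 1.0) ** (2 * H) - 2.0 * k ** (2 * H) + np.abs(k - 1.0) ** (2 * H))

def fgn_predict(past, H, lead=1):
    """MMSE linear forecast of fGn `lead` steps ahead from `past` (oldest first).
    Solves the Yule-Walker-type system Gamma a = c for the predictor weights."""
    past = np.asarray(past, dtype=float)
    m = len(past)
    lags = np.arange(m)
    Gamma = fgn_acov(lags[None, :] - lags[:, None], H)   # m x m Toeplitz covariance
    # covariance between each past sample and the target, `lead` steps after the last
    c = fgn_acov(np.arange(m - 1, -1, -1) + lead, H)
    a = np.linalg.solve(Gamma, c)
    return float(a @ past)
```

For H > 0.5 the lagged autocovariances are all positive, so long stretches of past data genuinely add skill; for H = 0.5 they vanish and the optimal forecast collapses to zero, the no-memory case that LIM-style models implicitly assume.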
- Published
- 2015
41. The computational and energy cost of simulation and storage for climate science: lessons from CMIP6.
- Author
-
Acosta, Mario C., Palomas, Sergi, Paronuzzi Ticco, Stella V., Utrera, Gladys, Biercamp, Joachim, Bretonniere, Pierre-Antoine, Budich, Reinhard, Castrillo, Miguel, Caubel, Arnaud, Doblas-Reyes, Francisco, Epicoco, Italo, Fladrich, Uwe, Joussaume, Sylvie, Kumar Gupta, Alok, Lawrence, Bryan, Le Sager, Philippe, Lister, Grenville, Moine, Marie-Pierre, Rioual, Jean-Christophe, and Valcke, Sophie
- Subjects
CLIMATOLOGY, ENERGY industries, ATMOSPHERIC models, INTERNATIONAL relations, ECOLOGICAL impact, CLIMATE change - Abstract
The Coupled Model Intercomparison Project (CMIP) is one of the biggest international efforts aimed at better understanding the past, present, and future of climate changes in a multi-model context. A total of 21 model intercomparison projects (MIPs) were endorsed in its sixth phase (CMIP6), which included 190 different experiments that were used to simulate 40 000 years and produced around 40 PB of data in total. This paper presents the main findings obtained from the CPMIP (the Computational Performance Model Intercomparison Project), a collection of a common set of metrics specifically designed for assessing climate model performance. These metrics were exclusively collected from the production runs of experiments used in CMIP6 and primarily from institutions within the IS-ENES3 consortium. The document presents the full set of CPMIP metrics per institution and experiment, including a detailed analysis and discussion of each of the measurements. During the analysis, we found a positive correlation between the core hours needed, the complexity of the models, and the resolution used. Likewise, we show that between 5 % and 15 % of the execution cost is spent in the coupling between independent components, and that this fraction only grows as the number of resources increases. From the data, it is clear that queue times have a great impact on the actual speed achieved and have a huge variability across different institutions, ranging from none to up to 78 % execution overhead. Furthermore, our evaluation shows that the estimated carbon footprint of running such big simulations within the IS-ENES3 consortium is 1692 t of CO2 equivalent. As a result of the collection, we contribute to the creation of a comprehensive database for future community reference, establishing a benchmark for evaluation and facilitating the multi-model, multi-platform comparisons crucial for understanding climate modelling performance.
Given the diverse range of applications, configurations, and hardware utilised, further work is required for the standardisation and formulation of general rules. The paper concludes with recommendations for future exercises aimed at addressing the encountered challenges, which will facilitate more collections of a similar nature. [ABSTRACT FROM AUTHOR]
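Two of the headline CPMIP quantities behind this analysis, simulated years per day (SYPD) and core hours per simulated year (CHSY), are simple ratios. A minimal sketch (function names are mine, not from the CPMIP protocol):

```python
def sypd(simulated_years, wall_hours):
    """Simulated years per wall-clock day: throughput of the model run."""
    return simulated_years / (wall_hours / 24.0)

def chsy(cores, wall_hours, simulated_years):
    """Core hours per simulated year: total core hours divided by years simulated."""
    return cores * wall_hours / simulated_years

def coupling_overhead(total_chsy, coupling_chsy):
    """Fraction of execution cost spent in coupling (the paper reports 5-15 %)."""
    return coupling_chsy / total_chsy
```

For example, a run that simulates 10 years in 48 wall-clock hours on 1000 cores achieves an SYPD of 5 and a CHSY of 4800 core hours per year.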
- Published
- 2024
42. A Model for Air Entrainment Rates in Oceanic Whitecaps.
- Author
-
Callaghan, Adrian H.
- Subjects
WATER waves, BUBBLES, ATMOSPHERIC models, SURFACE area, OCEAN - Abstract
Air‐entraining whitecaps provide an important source of bubbles over the global oceans, yet the rate at which the associated air is entrained is not well known. This lack of understanding limits the ability to accurately parameterize bubble‐mediated gas exchange and sea spray aerosol flux. In this paper I present a model to predict the total volume of air entrained by individual whitecaps and extend it to estimate the rate at which air is entrained per unit sea surface area. The model agrees well with existing models and measurements and can be forced by the rate at which energy is dissipated by the wavefield, which can be routinely provided by spectral wave models. I then use the model to present the first distributions of the estimated total volume of air entrained by individual whitecaps, as well as their rate of air entrainment and air degassing. Plain Language Summary: The amount of air in the oceans in the form of bubbles at any given time is not well known because of the difficulty associated with making in‐situ measurements. This lack of knowledge inhibits how well ocean‐atmosphere exchange processes that are driven by air and bubbles can be represented in climate models. In this paper, I present a new model to estimate the volume of air entrained by individual breaking waves called whitecaps, as well as how quickly the air is entrained into the oceans and how quickly it leaves the oceans when bubbles rise to the surface and burst. Key Points: A model for the entrainment of air by oceanic whitecaps is presented which agrees well with existing models and measurements. Distributions of the estimated volume of air entrained by individual oceanic whitecaps are presented for the first time. Key uncertainties in the air fraction and entrainment velocities of individual whitecaps remain due to a lack of measurements. [ABSTRACT FROM AUTHOR]
- Published
- 2024
43. Prediction of Atmospheric Profiles With Machine Learning Using the Signature Method.
- Author
-
Fujita, M., Sugiura, N., and Kouketsu, S.
- Subjects
MACHINE learning, WATER vapor, WATER temperature, RAINFALL, ATMOSPHERIC models, ATMOSPHERIC water vapor measurement - Abstract
An array of atmospheric profile observations consists of three‐dimensional vectors representing pressure, temperature, and humidity, with each profile forming a continuous curve in this three‐dimensional space. In this paper, the Signature method, which can quantify a profile's curve, was adopted for the atmospheric profiles, and the accuracy of profile representations was investigated. The description of profiles by the signature was confirmed with adequate accuracy. The machine‐learning‐based model, developed using the signature, exhibited a high level of annual accuracy with minimal absolute mean differences in temperature and water vapor mixing ratio (<2.0 K or g kg−1). Notably, the model successfully captured the vertical structure and atmospheric instability, encompassing drastic variations in water vapor and temperature, even during intense rainfall. These results indicate the Signature method can comprehensively describe the vertical profile with information on how ordered values are correlated. This concept would potentially improve the representation of the atmospheric vertical structure. Plain Language Summary: The atmospheric profile can be visualized as a three‐dimensional curve representing pressure, temperature, and humidity. By utilizing the Signature method, we can measure and quantify the profile's curve, allowing for comprehensive modeling of the atmosphere. In this paper, we confirmed the accuracy of atmospheric profile representation using signatures and introduced the characteristics of signatures revealed through machine‐learning models. The description of profiles by the signature was confirmed with adequate accuracy. Moreover, the model demonstrated robust annual accuracy, with minimal temperature and water vapor discrepancies. It effectively captured the vertical structure and instability of the atmosphere, even during heavy rainfall, characterized by significant temperature and water vapor content fluctuations. 
Key Points: The utilization possibility of the Signature method for atmospheric profiles was confirmed. By utilizing the Signature method, we can measure and quantify the profile's curve, allowing for comprehensive modeling of the atmosphere. The machine‐learning model developed with the signature can predict the profiles with high annual accuracy. [ABSTRACT FROM AUTHOR]
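As a rough illustration of what the Signature method computes, the first two signature levels of a piecewise-linear profile curve are iterated integrals of its increments. A minimal numpy sketch (not the authors' code; practical applications typically use a dedicated signature library and higher truncation levels):

```python
import numpy as np

def signature_level2(path):
    """Levels 1 and 2 of the path signature of a piecewise-linear path.
    path: (n_points, d) array, e.g. columns (pressure, temperature, humidity).
    Returns (S1, S2): S1 in R^d, S2 in R^{d x d} (second iterated integrals)."""
    path = np.asarray(path, dtype=float)
    d = path.shape[1]
    S1 = np.zeros(d)
    S2 = np.zeros((d, d))
    for delta in np.diff(path, axis=0):
        # Chen's relation for appending one linear segment with increment delta
        S2 += np.outer(S1, delta) + 0.5 * np.outer(delta, delta)
        S1 += delta
    return S1, S2
```

Level 1 is just the total displacement; level 2 records the order in which the coordinates vary (its antisymmetric part is the Lévy area), which is exactly the "how ordered values are correlated" information the abstract refers to.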
- Published
- 2024
44. Hybrid modeling design patterns.
- Author
-
Rudolph, Maja, Kurz, Stefan, and Rakitsch, Barbara
- Subjects
ATMOSPHERIC models, DESIGN - Abstract
Design patterns provide a systematic way to convey solutions to recurring modeling challenges. This paper introduces design patterns for hybrid modeling, an approach that combines modeling based on first principles with data-driven modeling techniques. While both approaches have complementary advantages, there are often multiple ways to combine them into a hybrid model, and the appropriate solution will depend on the problem at hand. In this paper, we provide four base patterns that can serve as blueprints for combining data-driven components with domain knowledge into a hybrid approach. In addition, we also present two composition patterns that govern the combination of the base patterns into more complex hybrid models. Each design pattern is illustrated by typical use cases from application areas such as climate modeling, engineering, and physics. [ABSTRACT FROM AUTHOR]
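One widely used hybrid arrangement, a first-principles model corrected by a learned residual, can be sketched as follows. Whether this corresponds to one of the paper's four named base patterns is my assumption; the linear physics component and polynomial residual are purely illustrative:

```python
import numpy as np

def physics_model(x):
    """First-principles component: here an assumed known linear response."""
    return 2.0 * x

def fit_residual(x_train, y_train, degree=2):
    """Data-driven component: fit a polynomial to what the physics misses."""
    resid = y_train - physics_model(x_train)
    return np.polyfit(x_train, resid, degree)

def hybrid_predict(x, coeffs):
    """Hybrid prediction = physics term + learned residual correction."""
    return physics_model(x) + np.polyval(coeffs, x)
```

The design choice worth noting: because the data-driven part only learns the residual, the hybrid model falls back to the physics model wherever training data are sparse, rather than extrapolating a fully black-box fit.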
- Published
- 2024
45. A new Wind Atlas to support the expansion of the Italian wind power fleet.
- Author
-
Sperati, Simone, Alessandrini, Stefano, D'Amico, Filippo, Cheng, Will, Rozoff, Christopher M., Bonanno, Riccardo, Lacavalla, Matteo, Aiello, Martina, Airoldi, Davide, Amaranto, Alessandro, Decimi, Goffredo, and Vergata, Milena Angelina
- Subjects
GREENHOUSE gases, WIND power, ATMOSPHERIC models, WIND forecasting - Abstract
As a contribution to national strategic energy planning, recent developments in meteorological modeling and wind generation technologies have improved the representation of the spatio‐temporal features of wind. This paper describes an update of the Italian Wind Atlas (Atlante EOLico ItaliANo [AEOLIAN]), originally released in the early 2000s. The objective of AEOLIAN is to guide future wind generation to accord with ambitious European greenhouse gas emission targets set for 2030 and 2050. AEOLIAN is the result of a collaborative effort between Ricerca sul Sistema Energetico (RSE) SpA and the National Center for Atmospheric Research (NCAR), which jointly developed a novel approach combining high‐resolution numerical weather modeling with the Analog Ensemble (AnEn) statistical technique. This paper uses dynamical model runs with hourly output for 1990–2019 with 4 km horizontal grid spacing. For 2015–2019, an inner grid nest with 1.33 km horizontal grid spacing is used. The AnEn is then employed to temporally extend the 5 years of high‐resolution runs back through 1990–2014 to create a 30‐year dataset for Italy and surrounding marine areas. A thorough verification is carried out using 104 observational stations homogeneously distributed throughout the territory. Compared with other state‐of‐the‐art products, AEOLIAN provides enhanced accuracy over complex terrain thanks to higher horizontal resolution and the assimilation of observational wind data over the domain, which result in a reduction of model bias over complex terrain and a better reconstruction of the wind distributions. Finally, a new WebGIS interface (https://atlanteeolico.rse-web.it/) to explore AEOLIAN data is described. [ABSTRACT FROM AUTHOR]
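The Analog Ensemble idea used to extend the high-resolution record can be illustrated in a deliberately simplified univariate form: for each coarse prediction, find the most similar historical predictions and use their matching high-resolution values as ensemble members. The operational AnEn uses a weighted multi-variable distance over a time window; this sketch, with invented data, only shows the mechanics:

```python
import numpy as np

def analog_ensemble(target_pred, hist_preds, hist_obs, k=3):
    """Find the k historical predictions closest to target_pred and return the
    paired high-quality values (e.g. 1.33 km run output) as an ensemble."""
    dist = np.abs(np.asarray(hist_preds, dtype=float) - target_pred)
    idx = np.argsort(dist)[:k]                 # indices of the k best analogs
    members = np.asarray(hist_obs, dtype=float)[idx]
    return members, float(members.mean())
```

The ensemble spread around the mean then provides a natural uncertainty estimate at each site and hour, which is one reason the technique pairs well with coarse dynamical runs.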
- Published
- 2024
46. Improved Diagnosis of Precipitation Type with LightGBM Machine Learning.
- Author
-
Zhuang, Haoyu, Lehner, Flavio, and DeGaetano, Arthur T.
- Subjects
MACHINE learning, PRECIPITATION forecasting, ATMOSPHERIC models, RAINFALL, PRECIPITATION (Chemistry) - Abstract
Existing precipitation-type algorithms have difficulty discerning the occurrence of freezing rain and ice pellets. These inherent biases are not only problematic in operational forecasting but also complicate the development of model-based precipitation-type climatologies. To address these issues, this paper introduces a novel light gradient-boosting machine (LightGBM)-based machine learning precipitation-type algorithm that utilizes reanalysis and surface observations. By comparing it with the Bourgouin precipitation-type algorithm as a baseline, we demonstrate that our algorithm improves the critical success index (CSI) for all examined precipitation types. Moreover, when compared with the precipitation-type diagnosis in reanalysis, our algorithm exhibits increased F1 scores for snow, freezing rain, and ice pellets. Subsequently, we utilize the algorithm to compute a freezing-rain climatology over the eastern United States. The resulting climatology pattern aligns well with observations; however, a significant mean bias is observed. We interpret this bias to be influenced by both the algorithm itself and assumptions regarding precipitation processes, which include biases associated with freezing drizzle, precipitation occurrence, and regional synoptic weather patterns. To mitigate the overall bias, we propose increasing the precipitation cutoff from 0.04 to 0.25 mm h−1, as it better reflects the precision of precipitation observations. This adjustment yields a substantial reduction in the overall bias. Finally, given the strong performance of LightGBM in predicting mixed precipitation episodes, we anticipate that the algorithm can be effectively utilized in operational settings and for diagnosing precipitation types in climate model outputs. Significance Statement: Freezing rain can have significant impacts on transportation and infrastructure, making accurate prediction of precipitation types crucial. 
In this study, we use a machine learning method known as LightGBM to predict precipitation types. We show that the new algorithm performs better than the existing methods for all precipitation types examined. Additionally, we compute a freezing-rain climatology over the eastern United States. Although the resulting climatology pattern corresponds well to observations, the algorithm overpredicts freezing-rain occurrence. We argue that this bias can be substantially reduced by increasing the precipitation cutoff from 0.04 to 0.25 mm h−1. Overall, this work highlights the potential of the LightGBM algorithm for both weather forecasting and diagnosing precipitation types in climate models. [ABSTRACT FROM AUTHOR]
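The verification scores quoted above, the critical success index (CSI) and the F1 score, are both computed from a hits/misses/false-alarms contingency table for each precipitation type. A minimal sketch (function names mine):

```python
def csi(hits, misses, false_alarms):
    """Critical success index (threat score): hits over all non-trivial outcomes."""
    denom = hits + misses + false_alarms
    return hits / denom if denom else 0.0

def f1(hits, misses, false_alarms):
    """F1 score: harmonic mean of precision and recall."""
    precision = hits / (hits + false_alarms) if hits + false_alarms else 0.0
    recall = hits / (hits + misses) if hits + misses else 0.0
    total = precision + recall
    return 2.0 * precision * recall / total if total else 0.0
```

Both scores ignore correct negatives, which matters for rare types such as ice pellets: an algorithm cannot inflate its score simply by predicting "no ice pellets" almost everywhere.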
- Published
- 2024
47. UQAM‐TCW: A Global Hybrid Tropical Cyclone Wind Model Based Upon Statistical and Coupled Climate Models.
- Author
-
Carozza, David A., Boudreault, Mathieu, Grenier, Manuel, and Caron, Louis‐Philippe
- Subjects
TROPICAL cyclones, ATMOSPHERIC models, CLIMATE change models, EL Nino, FINANCIAL risk management, STORMS - Abstract
Tropical cyclones (TCs) are among the most destructive natural hazards and yet, quantifying their financial impacts remains a significant methodological challenge. It is therefore of high societal value to synthetically simulate TC tracks and winds to assess potential impacts along with their probability distributions, for example for land use planning and financial risk management. A common approach to generate TC tracks is to apply storm detection methodologies to climate model output, but such an approach is sensitive to the method and parameterization used and tends to underestimate intense TCs. We present a global TC model (hereafter the UQAM‐TCW model) that melds statistical modeling, to capture historical risk features, with a climate model large ensemble, to generate large samples of physically coherent TC seasons. Integrating statistical and physical methods, the model is probabilistic and consistent with the physics of how TCs develop. The model includes frequency and location of cyclogenesis, full trajectories with maximum sustained winds, and the entire wind structure along each track for the six typical cyclogenesis basins from IBTrACS. Since ENSO is an important driver of TCs globally, we also integrate its effects in key components of the model. The global TC model thus belongs to a recent strand of literature that combines probabilistic and physical approaches to TC track generation. As an application of the model, we show global hazard maps for direct and indirect hits expressed in terms of return periods. The global TC model can be of interest to climate and environmental scientists, economists and financial risk managers. Plain Language Summary: Tropical cyclones (TCs) are among the most destructive natural hazards and yet, quantifying their financial impacts remains a difficult task.
Being able to randomly simulate TCs and their features (such as wind speed) with mathematical models is therefore critical to build scenarios (and their corresponding probability) for land use planning and financial risk management. A common approach is to simulate TCs by tracking them directly in climate model outputs, but this often underestimates the frequency of intense TCs and is computationally costly when a large number of events must be generated. For these reasons, many authors have looked into alternative approaches that replicate key physical features of TCs using statistical models that are much less computationally demanding. This paper therefore presents a global TC model that leverages the strengths of both statistical and climate models to simulate a large number of TCs whose features are consistent with the physics and observations. Because El Niño is an important phenomenon that affects TCs globally, we also integrate its effects in our model. The paper focuses on the methodology and validation of each model component and concludes with global hazard maps for direct and indirect hits. Key Points: We present a global tropical cyclone (TC) wind model built upon a climate model large ensemble that can be used for risk analysis. We integrate ENSO into our model since it is a strong driver of storm annual frequency, cyclogenesis, trajectories, and intensity. We present global hazard maps consistent with statistical features of TC components and coherent with a global climate model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
48. Analysis of a Predictive Mathematical Model of Weather Changes Based on Neural Networks.
- Author
-
Malozyomov, Boris V., Martyushev, Nikita V., Sorokova, Svetlana N., Efremenkov, Egor A., Valuev, Denis V., and Qi, Mengxu
- Subjects
PREDICTION models, MATHEMATICAL models, MATHEMATICAL analysis, ATMOSPHERIC models, METEOROLOGICAL stations, WEATHER forecasting - Abstract
In this paper, we investigate mathematical models of meteorological forecasting based on neural networks, which allow us to calculate probable meteorological parameters for a desired location on the basis of previous meteorological data. A new method of grouping neural networks to obtain a more accurate output result is proposed. An algorithm is presented, based on which the most accurate meteorological forecast was obtained in the course of the study. This algorithm can be used in a wide range of situations, such as obtaining data for the operation of equipment in a given location and studying meteorological parameters of the location. To build this model, we used data obtained from personal weather stations of the Weather Underground company and the US National Digital Forecast Database (NDFD). A Google remote machine-learning service was also used to compare the results with existing products on the market. The algorithm for building the forecast model covered several locations across the US in order to compare its performance in different weather zones. Different methods of training the machine to produce the most effective weather forecast result were also considered. [ABSTRACT FROM AUTHOR]
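The "grouping" of neural networks described here amounts to combining several member forecasts into one. The abstract does not specify the combination rule, so the following sketch uses one plausible and common choice, weighting each member by the inverse of its validation error; the function name and weighting scheme are my assumptions:

```python
import numpy as np

def grouped_prediction(member_preds, val_errors):
    """Combine forecasts from a group of networks, weighting each member by
    the inverse of its validation error (an illustrative grouping rule)."""
    w = 1.0 / np.asarray(val_errors, dtype=float)
    w /= w.sum()                                # normalise weights to sum to 1
    return float(w @ np.asarray(member_preds, dtype=float))
```

With equal validation errors this reduces to a plain ensemble mean; a member with three times the error of another contributes only a quarter of the weight.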
- Published
- 2024
49. KNMI'23 Climate Scenarios for the Netherlands: Storyline Scenarios of Regional Climate Change.
- Author
-
van der Wiel, Karin, Beersma, Jules, van den Brink, Henk, Krikken, Folmer, Selten, Frank, Severijns, Camiel, Sterl, Andreas, van Meijgaard, Erik, Reerink, Thomas, and van Dorland, Rob
- Subjects
GREENHOUSE gases, CLIMATE change, GLOBAL warming, ENERGY futures, ATMOSPHERIC models - Abstract
This paper presents the methodology for the construction of the KNMI'23 national climate scenarios for the Netherlands. We have developed six scenarios that cover a substantial part of the uncertainty in CMIP6 projections of future climate change in the region. Different sources of uncertainty are disentangled as much as possible, partly by means of a storyline approach. Uncertainty in future emissions is covered by making scenarios conditional on different SSP scenarios (SSP1‐2.6, SSP2‐4.5, and SSP5‐8.5). For each SSP scenario and time horizon (2050, 2100, 2150), we determine a global warming level based on the median of the constrained estimates of climate sensitivity from IPCC AR6. The remaining climate model uncertainty of the regional climate response at these warming levels is covered by two storylines, which are designed with a focus on the annual and seasonal mean precipitation response (a dry‐trending and wet‐trending variant for each SSP). This choice was motivated by the importance of future water management to society. For users with specific interests, we provide guidance on how to account for the impact of the uncertainty in climate sensitivity. Since CMIP6 GCM data do not provide the required spatial detail for impact modeling, we reconstruct the CMIP6 responses by resampling internal variability in a GCM‐RCM initial‐condition ensemble. The resulting climate scenarios form a detailed storyline of plausible future climates in the Netherlands. The data can be used for impact calculations and assessments by stakeholders, and will be used to inform policy making in different sectors of Dutch society. Plain Language Summary: To prepare society for the effects of future climate change, we need to know what the future climate will be like. In this paper we explain the method that is used to construct six different scenarios that describe possible future climates of the Netherlands.
The scenarios make assumptions about future greenhouse gas emissions, and are based on the outcomes of climate models that simulate the response of the climate to these emissions. The KNMI'23 climate scenarios show that strongly reducing global emissions strongly reduces the expected changes in the climate of the Netherlands. In the scenario in which global emissions continue to rise until 2080, Dutch society will have to adapt to much stronger increases in heat and precipitation extremes, increased risks of droughts with low river discharge in summer, and increased risk of flooding due to high river discharges in winter. In the coming years the climate scenario data will be used to evaluate what needs to be done to keep the country a safe place for people to live in and to thrive in, under changing climate conditions. Key Points: We present a methodology for the construction of regional climate scenarios using a storyline approach to partition uncertainty. Results from CMIP6 are reconstructed with a GCM‐RCM initial condition ensemble to produce high‐resolution scenario data for end‐users. Six scenario variants cover emission uncertainty (high, moderate, low) and uncertainty in the regional response (dry‐trending, wet‐trending). [ABSTRACT FROM AUTHOR]
- Published
- 2024
50. Numerical coupling of aerosol emissions, dry removal, and turbulent mixing in the E3SM Atmosphere Model version 1 (EAMv1) – Part 1: Dust budget analyses and the impacts of a revised coupling scheme.
- Author
-
Wan, Hui, Zhang, Kai, Vogl, Christopher J., Woodward, Carol S., Easter, Richard C., Rasch, Philip J., Feng, Yan, and Wang, Hailong
- Subjects
TURBULENT mixing, DUST, COUPLING schemes, ATMOSPHERIC models, AEROSOLS, LIFE cycles (Biology) - Abstract
An earlier study evaluating the dust life cycle in the Energy Exascale Earth System Model (E3SM) Atmosphere Model version 1 (EAMv1) has revealed that the simulated global mean dust lifetime is substantially shorter when higher vertical resolution is used, primarily due to significant strengthening of dust dry removal in source regions. This paper demonstrates that the sequential splitting of aerosol emissions, dry removal, and turbulent mixing in the model's time integration loop, especially the calculation of dry removal after surface emissions and before turbulent mixing, is the primary reason for the vertical resolution sensitivity reported in that earlier study. Based on this reasoning, we propose a revised numerical process coupling scheme that requires the least amount of code changes, in which the surface emissions are applied before turbulent mixing instead of before dry removal. The revised scheme allows newly emitted particles to be transported aloft by turbulence before being removed from the atmosphere, and hence better resembles the dust life cycle in the real world. Sensitivity experiments show that the revised process coupling substantially weakens dry removal and strengthens vertical mixing in dust source regions. It also strengthens the large-scale transport from source to non-source regions, strengthens dry removal outside the source regions, and strengthens wet removal and activation globally. In transient simulations of the years 2000–2009 conducted using 1° horizontal grid spacing, 72 vertical layers, and unchanged tuning parameters of emission strength, the revised process coupling leads to a 40 % increase in the global total dust burden and an increase of dust lifetime from 1.8 to 2.5 d in terms of 10-year averages.
Weakened dry removal and increased mixing ratios are also seen for other aerosol species that have substantial surface emissions, although the changes in mixing ratio are considerably smaller for the submicron species than for dust and sea salt. Numerical experiments confirm that the revised coupling scheme significantly reduces the strong and non-physical sensitivities of model results to vertical resolution in the original EAMv1. This provides a motivation for adopting the revised scheme in EAM as well as for further improvements on the simple revision presented in this paper. [ABSTRACT FROM AUTHOR]
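The coupling defect described above, dry removal acting on freshly emitted tracer before turbulent mixing can lift it off the surface, can be caricatured with a two-layer toy column. This is an illustrative sketch of sequential operator splitting, not the EAMv1 numerics:

```python
def step(q, emis, r_dep, order):
    """One toy time step for a 2-layer column tracer q = [surface, aloft].
    Emission adds mass to the surface layer; dry deposition removes a fraction
    r_dep of the surface layer only; 'mixing' relaxes both layers to their mean.
    `order` lists the processes in the sequence they are applied."""
    q = list(q)
    for proc in order:
        if proc == "emission":
            q[0] += emis
        elif proc == "deposition":
            q[0] *= (1.0 - r_dep)
        elif proc == "mixing":
            mean = 0.5 * (q[0] + q[1])
            q = [mean, mean]
    return q

# Original-style coupling: deposition sees the full fresh emission at the surface.
original = step([0.0, 0.0], emis=1.0, r_dep=0.5,
                order=["emission", "deposition", "mixing"])
# Revised-style coupling: mixing lofts half the emission before deposition acts.
revised = step([0.0, 0.0], emis=1.0, r_dep=0.5,
               order=["emission", "mixing", "deposition"])
```

With these toy numbers the emission-deposition-mixing ordering retains a total burden of 0.5, while the emission-mixing-deposition ordering retains 0.75: mixing first shelters part of the fresh emission aloft, qualitatively matching the weakened dry removal and increased burden reported in the abstract.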
- Published
- 2024