176 results
Search Results
2. Nowcasting Earthquakes With Stochastic Simulations: Information Entropy of Earthquake Catalogs.
- Author
- Rundle, John B., Baughman, Ian, and Zhang, Tianjian
- Subjects
EARTHQUAKES, EARTHQUAKE aftershocks, ENTROPY (Information theory), MACHINE learning, EARTHQUAKE hazard analysis, RECEIVER operating characteristic curves, CATALOGS, ENTROPY
- Abstract
Earthquake nowcasting has been proposed as a means of tracking the change in large earthquake potential in a seismically active area. The method was developed using observable seismic data, in which probabilities of future large earthquakes can be computed using Receiver Operating Characteristic methods. Furthermore, analysis of the Shannon information content of the earthquake catalogs has been used to show that there is information contained in the catalogs, and that it can vary in time. An important question therefore remains: where does the information originate? In this paper, we examine this question using stochastic simulations of earthquake catalogs. Our catalog simulations are computed using an Earthquake Rescaled Aftershock Seismicity ("ERAS") stochastic model. This model is similar in many ways to other stochastic seismicity simulations, but has the advantage that it has only two free parameters to be set, one for the aftershock (Omori‐Utsu) time decay, and one for the aftershock spatial migration away from the epicenter. Generating a simulation catalog and fitting the two parameters to an observed catalog, such as California's, takes only a few minutes of wall clock time. While clustering can arise from random, Poisson statistics, we show that significant information in the simulation catalogs arises from the "non‐Poisson" power‐law aftershock clustering, implying that the practice of de‐clustering observed catalogs may remove information that would otherwise be useful in forecasting and nowcasting. We also show that the nowcasting method provides similar results with the ERAS model as it does with observed seismicity. Plain Language Summary: Earthquake nowcasting was proposed as a means of tracking the change in the potential for large earthquakes in a seismically active area, using the record of small earthquakes. The method was developed using observed seismic data, in which probabilities of future large earthquakes can be computed using machine learning methods that were originally developed with the advent of radar in the 1940s. These methods are now being used in the development of machine learning and artificial intelligence models in a variety of applications. In recent times, methods to simulate earthquakes using the observed statistical laws of earthquake seismicity have been developed. One of the advantages of these stochastic models is that they can be used to analyze the various assumptions that are inherent in the analysis of seismic catalogs of earthquakes. In this paper, we analyze the importance of the space‐time clustering that is often observed in earthquake seismicity. We find that the clustering is the origin of information that makes the earthquake nowcasting methods possible. We also find that a common practice of "aftershock de‐clustering", often used in the analysis of these catalogs, removes information about future large earthquakes. Key Points: Earthquake nowcasting tracks the change in the potential for large earthquakes, using information contained in seismic catalogs; We analyze the information contained in the space‐time clustering that is observed in earthquake seismicity; We find that "aftershock de‐clustering" of catalogs removes information about future large earthquakes that the nowcasting method uses. [ABSTRACT FROM AUTHOR]
- Published
- 2024
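The two ERAS ingredients named in this abstract, Omori‐Utsu time decay and power-law spatial migration, can be sketched directly. The following is a minimal, hypothetical Python sketch; the parameter values, the one-year truncation, and the simple radial kernel are illustrative assumptions, not the authors' ERAS implementation.

```python
import numpy as np

# Hypothetical sketch, not the authors' ERAS code: draw aftershock delays from
# an Omori-Utsu decay rate ~ (t + c)**-p by inverse-transform sampling, and
# scatter epicenters with a simple power-law radial kernel.
rng = np.random.default_rng(42)

def omori_utsu_times(n_events, c=0.05, p=1.1, t_max=365.0):
    """Sample aftershock delays (days) with density proportional to (t + c)**-p."""
    u = rng.uniform(size=n_events)
    a = c ** (1.0 - p)                  # CDF endpoint at t = 0
    b = (t_max + c) ** (1.0 - p)        # CDF endpoint at t = t_max
    return (a + u * (b - a)) ** (1.0 / (1.0 - p)) - c

def spatial_offsets(n_events, d=1.2, r0=0.5):
    """Sample radial distances (km) with a power-law tail ~ (r + r0)**-(1 + d)."""
    u = rng.uniform(size=n_events)
    r = r0 * (u ** (-1.0 / d) - 1.0)    # inverse CDF of the Pareto-like kernel
    theta = rng.uniform(0.0, 2.0 * np.pi, size=n_events)
    return r * np.cos(theta), r * np.sin(theta)

delays = np.sort(omori_utsu_times(1000))
dx, dy = spatial_offsets(1000)
print(f"median aftershock delay: {np.median(delays):.2f} days")
```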
3. It's in the bag? The effect of plastic carryout bag bans on where and what people purchase to eat.
- Subjects
PLASTIC bag laws, CONSUMER behavior, GROCERY shopping, BORDERLANDS, CONSUMERS, ENVIRONMENTAL health
- Abstract
This paper examines how banning the use of plastic carryout bags at grocery stores affects where and what people purchase to eat. Using quasi‐random variation in local bag ban adoption across California and two data sources (retail scanner data and consumer survey data), I show that banning plastic carryout bags shifted some food sales away from regulated grocery stores toward unregulated grocery stores and restaurants. Specifically, I find that bag bans cause a 1.8% decline in food‐at‐home sales and a 1.9 percentage point increase in consumers' food‐away‐from‐home expenditure share. The decline in food‐at‐home sales is larger in jurisdictions more likely to experience cross‐border shopping, whereas the increase in food‐away‐from‐home expenditures is larger farther from jurisdiction borders. Together these results suggest that a small share of consumers find a way to bypass the bag bans—either by cross‐border shopping if near a border or by shifting to restaurants if not near a border. Heterogeneity analyses reveal the policy effects are strongest for those with higher incomes, those under 65 years, and those with young children, suggesting both income effects and time constraints as mechanisms behind the behavioral change. By quantifying consumer avoidance behaviors, these results enable policymakers to more accurately measure the impacts of their regulations and to understand the potential trade‐offs between their environmental and public health objectives. [ABSTRACT FROM AUTHOR]
- Published
- 2022
4. The good, the bad, and the future: Systematic review identifies best use of biomass to meet air quality and climate policies in California.
- Author
- Freer‐Smith, Peter, Bailey‐Bale, Jack H., Donnison, Caspar L., and Taylor, Gail
- Subjects
FOREST biomass, GOVERNMENT policy on climate change, BIOMASS, BIOMASS production, GREENHOUSE gases, AIR quality
- Abstract
California has large and diverse biomass resources and provides a pertinent example of how biomass use is changing and needs to change, in the face of climate mitigation policies. As in other areas of the world, California needs to optimize its use of biomass and waste to meet environmental and socioeconomic objectives. We used a systematic review to assess biomass use pathways in California and the associated impacts on climate and air quality. Biomass uses included the production of renewable fuels, electricity, biochar, compost, and other marketable products. For those biomass use pathways recently developed, information is available on the effects—usually beneficial—on greenhouse gas (GHG) emissions, and there is some, but less, published information on the effects on criteria pollutants. Our review identifies 34 biomass use pathways with beneficial impacts on either GHG or pollutant emissions, or both—the "good." These included combustion of forest biomass for power and conversion of livestock‐associated biomass to biogas by anaerobic digestion. The review identified 13 biomass use pathways with adverse impacts on GHG emissions, criteria pollutant emissions, or both—the "bad." Wildfires are an example of one out of eight pathways which were found to be bad for both climate and air quality, while only two biomass use pathways reduced GHG emissions relative to an identified counterfactual but had adverse air quality impacts. Issues of high interest for the "future" included land management to reduce fire risk, future policies for the dairy industries, and full life‐cycle analysis of biomass production and use. [ABSTRACT FROM AUTHOR]
- Published
- 2023
5. Drought influences habitat associations and abundances of birds in California's Central Valley.
- Author
- Goldstein, Benjamin R., Furnas, Brett J., Calhoun, Kendall L., Larsen, Ashley E., Karp, Daniel S., and de Valpine, Perry
- Subjects
DROUGHT management, DROUGHTS, HABITATS, WATER supply, AGRICULTURE, FARMS, ECOLOGICAL niche
- Abstract
Aim: As climate change increases the frequency and severity of droughts in many regions, conservation during drought is becoming a major challenge for ecologists. Droughts are multidimensional climate events whose impacts may be moderated by changes in temperature, water availability or food availability, or some combination of these. Simultaneously, other stressors such as extensive anthropogenic landscape modification may synergize with drought. Useful observational models for guiding conservation decision‐making during drought require multidimensional, dynamic representations to disentangle possible drought impacts, and consequently, they will require large, highly resolved data sets. In this paper, we develop a two‐stage predictive framework for assessing how drought impacts vary with species, habitats and climate pathways. Location: Central Valley, California, USA. Methods: We used a two‐stage counterfactual analysis combining predictive linear mixed models and N‐mixture models to characterize the multidimensional impacts of drought on 66 bird species. We analysed counts from the eBird participatory science data set between 2010 and 2019 and produced species‐ and habitat‐specific estimates of the impact of drought on relative abundance. Results: We found that while fewer than a quarter (16/66) of species experienced abundance declines during drought, nearly half of all species (27/66) changed their habitat associations during drought. Among species that shifted their habitat associations, the use of natural habitats declined during drought while use of developed habitat and perennial agricultural habitat increased. Main Conclusions: Our findings suggest that birds take advantage of agricultural and developed land with artificial irrigation and heat‐buffering microhabitat structure, such as in orchards or parks, to buffer drought impacts. A working lands approach that promotes biodiversity and mitigates stressors across a human‐induced water gradient will be critical for conserving birds during drought. [ABSTRACT FROM AUTHOR]
- Published
- 2024
6. If you build it, they will come: Coastal amenities facilitate human engagement in marine protected areas.
- Author
- Free, Christopher M., Smith, Joshua G., Lopazanski, Cori J., Brun, Julien, Francis, Tessa B., Eurich, Jacob G., Claudet, Joachim, Dugan, Jenifer E., Gill, David A., Hamilton, Scott L., Kaschner, Kristin, Mouillot, David, Ziegler, Shelby L., Caselle, Jennifer E., and Nickols, Kerry J.
- Subjects
MARINE parks & reserves, CHARISMA, FISH conservation, OUTREACH programs, TOURIST attractions
- Abstract
Calls for using marine protected areas (MPAs) to achieve goals for nature and people are increasing globally. While the conservation and fisheries impacts of MPAs have been comparatively well‐studied, impacts on other dimensions of human use have received less attention. Understanding how humans engage with MPAs and identifying traits of MPAs that promote engagement is critical to designing MPA networks that achieve multiple goals effectively, equitably and with minimal environmental impact. In this paper, we characterize human engagement in California's MPA network, the world's largest MPA network scientifically designed to function as a coherent network (124 MPAs spanning 16% of state waters and 1300 km of coastline), and identify traits associated with higher human engagement. We assemble and compare diverse indicators of human engagement that capture recreational, educational and scientific activities across California's MPAs. We find that human engagement is correlated with nearby population density and that site "charisma" can expand human engagement beyond what would be predicted based on population density alone. Charismatic MPAs tend to be located near tourist destinations, have long sandy beaches and be adjacent to state parks and associated amenities. In contrast, underutilized MPAs were often more remote and lacked both sandy beaches and parking lot access. Synthesis and applications: These results suggest that achieving MPA goals associated with human engagement can be promoted by developing land‐based amenities that increase access to coastal MPAs or by locating new MPAs near existing amenities during the design phase. Alternatively, human engagement can be limited by locating MPAs in areas far from population centres, coastal amenities or sandy beaches. Furthermore, managers may want to prioritize monitoring, enforcement, education and outreach programmes in MPAs with traits that predict high human engagement. Understanding the extent to which human engagement impacts the conservation performance of MPAs is a critical next step to designing MPAs that minimize tradeoffs among potentially competing objectives. Read the free Plain Language Summary for this article on the Journal blog. [ABSTRACT FROM AUTHOR]
- Published
- 2023
7. Optimizing Earthquake Nowcasting With Machine Learning: The Role of Strain Hardening in the Earthquake Cycle.
- Author
- Rundle, John B., Yazbeck, Joe, Donnellan, Andrea, Fox, Geoffrey, Ludwig, Lisa Grant, Heflin, Michael, and Crutchfield, James
- Subjects
STRAIN hardening, SEISMIC waves, MACHINE learning, SUPERVISED learning, EARTHQUAKES, RECEIVER operating characteristic curves, TIME series analysis
- Abstract
Nowcasting is a term originating from economics, finance, and meteorology. It refers to the process of determining the uncertain state of the economy, markets or the weather at the current time by indirect means. In this paper, we describe a simple two‐parameter data analysis that reveals hidden order in otherwise seemingly chaotic earthquake seismicity. One of these parameters relates to a mechanism of seismic quiescence arising from the physics of strain‐hardening of the crust prior to major events. We observe an earthquake cycle associated with major earthquakes in California, similar to what has long been postulated. An estimate of the earthquake hazard revealed by this state variable time series can be optimized by the use of machine learning in the form of the Receiver Operating Characteristic skill score. The ROC skill is used here as a loss function in a supervised learning mode. Our analysis is conducted in the region of 5° × 5° in latitude‐longitude centered on Los Angeles, a region which we used in previous papers to build similar time series using more involved methods (Rundle & Donnellan, 2020, https://doi.org/10.1029/2020EA001097; Rundle, Donnellan et al., 2021, https://doi.org/10.1029/2021EA001757; Rundle, Stein et al., 2021, https://doi.org/10.1088/1361-6633/abf893). Here we show that not only does the state variable time series have forecast skill, but the associated spatial probability densities have skill as well. In addition, use of the standard ROC and Precision (PPV) metrics allows probabilities of current earthquake hazard to be defined in a simple, straightforward, and rigorous way. Plain Language Summary: Earthquake nowcasting refers to the determination of hazard for major earthquakes at the present time, the recent past, and the near future. Nowcasting is an idea borrowed from economics, markets, and meteorology, where it has been frequently used. In this paper, we show that there is order hidden within chaotic earthquake seismicity using a very simple transformation of the data. Small earthquakes appear to transition from unstable stick‐slip events that produce seismic waves, to stable sliding where no seismic waves are produced. Our hypothesis is that this transition is due to a material phenomenon called strain‐hardening, which is frequently observed in laboratory rock mechanics experiments. The result is a state variable time series, computed over the last 51 years in California, that strongly resembles the long‐anticipated cycle of stress accumulation and release. Using supervised machine learning techniques, we can optimize the two‐parameter model. From that optimized model, we can rigorously calculate the probability of current hazard from major earthquakes. Extending these methods, we can also compute spatial hazard as well. The result is a new method for assessing earthquake hazard that may be useful for a variety of applications. Key Points: "Chaotic" seismicity contains hidden structure in the form of state variable time series; Standard data science methods can be used to convert the time series to probabilities; Both temporal and spatial probabilities can be computed. [ABSTRACT FROM AUTHOR]
- Published
- 2022
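A minimal sketch of the optimization idea this abstract describes: treat negative ROC skill (plain AUC in this sketch) as a loss function and tune a two-parameter state variable against it. The toy data, the exponential-moving-average state variable, and the parameter names are assumptions, not the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.metrics import roc_auc_score

# Toy inputs: `rates` is a monthly small-event count series, `labels` marks
# windows followed by elevated activity (stand-ins for large-event targets).
rng = np.random.default_rng(0)
rates = rng.poisson(5.0, size=600).astype(float)
labels = (np.convolve(rates, np.ones(12), "same") > 70).astype(int)

def state_variable(params, x):
    """Two-parameter exponential-moving-average state variable (toy example)."""
    tau, weight = params
    alpha = 1.0 / max(tau, 1e-3)
    ema = np.zeros_like(x)
    for i in range(1, len(x)):
        ema[i] = (1 - alpha) * ema[i - 1] + alpha * x[i]
    return weight * ema - x   # quiescence-style signal: smoothed minus current

def neg_roc_skill(params):
    """Loss = -(AUC); minimizing it maximizes the ROC skill of the forecast."""
    return -roc_auc_score(labels, state_variable(params, rates))

best = minimize(neg_roc_skill, x0=[12.0, 1.0], method="Nelder-Mead")
print("optimized (tau, weight):", best.x, " AUC:", -best.fun)
```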
8. Nowcasting Earthquakes: Imaging the Earthquake Cycle in California With Machine Learning.
- Author
- Rundle, John B., Donnellan, Andrea, Fox, Geoffrey, Crutchfield, James P., and Granat, Robert
- Subjects
SUPERVISED learning, MACHINE learning, RECEIVER operating characteristic curves, PRINCIPAL components analysis, SIGNAL detection, EARTHQUAKES
- Abstract
We propose a new machine learning‐based method for nowcasting earthquakes to image the time‐dependent earthquake cycle. The result is a timeseries that may correspond to the process of stress accumulation and release. The timeseries are constructed by using principal component analysis of regional seismicity. The patterns are found as eigenvectors of the cross‐correlation matrix of a collection of seismicity timeseries in a coarse grained regional spatial grid (pattern recognition via unsupervised machine learning). The eigenvalues of this matrix represent the relative importance of the various eigenpatterns. Using the eigenvectors and eigenvalues, we compute the weighted correlation timeseries of the regional seismicity. This timeseries has the property that the weighted correlation generally decreases prior to major earthquakes in the region, and increases suddenly just after a major earthquake occurs. As in a previous paper (Rundle & Donnellan, 2020, https://doi.org/10.1029/2020ea001097), we find that this method produces a nowcasting timeseries that resembles the hypothesized regional stress accumulation and release process characterizing the earthquake cycle. We then address the problem of whether the timeseries contain information regarding future large earthquakes. For this, we compute a receiver operating characteristic and determine the decision thresholds for several future time periods of interest (optimization via supervised machine learning). We find that signals can be detected that can be used to characterize the information content of the timeseries. These signals may be useful in assessing present and near‐future seismic hazards. Plain Language Summary: Major earthquakes on fault systems in a tectonically active region are thought to occur in approximately repetitive cycles as a result of the buildup and release of tectonic forces (stress). Nowcasting is a technique adopted from weather, finance, and other fields that uses readily observable proxy data to represent the unobservable stress accumulation process of interest. This paper presents a method that computes a timeseries representing the weighted correlation of small earthquake activity in the California region from 1950 to 2020. Prior to major magnitude M > 7 earthquakes, the timeseries trends toward lower values. Just after the earthquake occurs, the timeseries increases suddenly in association with the earthquake, before resuming its gradual trend toward lower values. Plotting the timeseries on an inverted scale, one sees a cyclic behavior that strongly resembles the hypothesized earthquake cycle. In principle, we can therefore use this timeseries for nowcasting, as a proxy for stress accumulation and release. Using methods of signal detection first developed for radar by the British in the 1940s, we find that the timeseries contain information about future large earthquakes that can be used for hazard assessment. Key Points: The current state of the earthquake cycle of tectonic stress accumulation and release is unobservable with existing methods; We show that readily observable small earthquake correlations can be used to nowcast the current state of the earthquake cycle; Machine learning techniques indicate that signals corresponding to future large earthquakes can be detected in a correlation time series. [ABSTRACT FROM AUTHOR]
- Published
- 2021
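The eigenpattern construction this abstract describes can be sketched in a few lines of NumPy. Grid size, the number of patterns kept, and the exact definition of the weighted series below are illustrative assumptions, not the paper's recipe.

```python
import numpy as np

# Toy gridded seismicity: n_cells coarse-grained spatial cells x n_months.
rng = np.random.default_rng(1)
n_cells, n_months = 40, 840
activity = rng.poisson(2.0, size=(n_cells, n_months)).astype(float)

# Standardize each cell so the cross-product matrix is a correlation matrix.
z = (activity - activity.mean(axis=1, keepdims=True)) / activity.std(axis=1, keepdims=True)
corr = (z @ z.T) / n_months        # n_cells x n_cells cross-correlation matrix

# Unsupervised pattern recognition: eigenvectors are the seismicity patterns,
# eigenvalues their relative importance.
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Weighted correlation time series (one sketch of the idea): eigenvalue-
# weighted strength of each monthly snapshot's projection onto the leading
# eigenpatterns.
k = 10
amps = eigvecs[:, :k].T @ z        # k x n_months pattern amplitudes
weights = eigvals[:k] / eigvals[:k].sum()
weighted_ts = weights @ amps**2    # one value per month
print(weighted_ts[:5])
```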
9. The Role of Anthropogenic Forcing in Western United States Hydroclimate Extremes.
- Author
- Zhang, Wei and Gillies, Robert
- Subjects
HYDROLOGIC cycle, ATMOSPHERIC models, CLIMATE change
- Abstract
Despite lower‐than‐average total precipitation in the western states of the U.S., the 2021 "precipitation roller coaster", defined as large precipitation swings, has pointed to a strong hydroclimatic intensity (HYINT). Here we examine the 2021 HYINT using an index—a product of the average precipitation intensity (INT) and dry spell length (DSL). HYINT exhibited an extremely high value in the western U.S. in 2021. INT and DSL contribute differently to the 2021 HYINT, with large spatial variability. Overall, the 2021 extreme HYINT in central California and Utah is tied more to large INT than to DSL. Meanwhile, the historical trends in INT and DSL may have contributed to the extreme 2021 HYINT event. The fraction of attributable risk framework reveals that the 2021 extreme HYINT is more likely to occur with anthropogenic forcing (e.g., 7.3 times more likely for HYINT exceeding 1.3) than natural forcing alone. Plain Language Summary: The western U.S. is a hotspot for studying climate change impacts on the hydrological cycle. Despite lower‐than‐average total precipitation in 2021, the contrasting dryness and wetness in the western U.S. has been widely reported as a "precipitation roller coaster." In this paper we quantified the "precipitation roller coaster" using an index (hydroclimatic intensity [HYINT])—a product of average precipitation intensity during wet days and dry spell length (DSL). The study found that the 2021 extreme HYINT event was largely attributable to the combined impacts of precipitation intensity and DSL in California and Utah, with precipitation intensity playing a more important role. In contrast, the 2021 precipitation event in other western states exhibited divergent contributions from precipitation intensity and DSL. The southwestern U.S. has been identified as a hotspot for increasing HYINT, which is tied more to the increasing DSL than the precipitation intensity. The trends in DSL and precipitation intensity may have played a key role in driving the 2021 extreme HYINT event. Using climate model experiments with and without anthropogenic forcing, we find that an extreme HYINT event in the western U.S. is more likely to occur with anthropogenic forcing. Key Points: Hydroclimatic intensity (HYINT) exhibited extremely high values in parts of the western U.S. in 2021, mainly caused by average precipitation intensity; HYINT shows a significant rising trend in most of the southwestern U.S., mainly tied to a rising dry spell length trend; The extreme HYINT event is more likely to occur under anthropogenic forcing than natural forcing alone. [ABSTRACT FROM AUTHOR]
- Published
- 2022
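Since the abstract defines HYINT as the product of average wet-day precipitation intensity (INT) and dry spell length (DSL), a minimal worked sketch is possible. The 1 mm wet-day threshold and the omission of any reference-period normalization are assumptions.

```python
import numpy as np

def hyint(daily_precip_mm, wet_threshold=1.0):
    """HYINT sketch: mean wet-day intensity (INT) times mean dry spell length (DSL)."""
    p = np.asarray(daily_precip_mm, dtype=float)
    wet = p >= wet_threshold
    intensity = p[wet].mean()                     # INT: mm per wet day

    # DSL: mean length of runs of consecutive dry days (pad so diff finds edges)
    padded = np.concatenate(([0], (~wet).astype(np.int8), [0]))
    edges = np.diff(padded)
    starts = np.where(edges == 1)[0]
    ends = np.where(edges == -1)[0]
    dsl = (ends - starts).mean()                  # mean dry spell length (days)

    return intensity * dsl

rng = np.random.default_rng(7)
year = rng.gamma(0.3, 8.0, size=365)              # toy, skewed daily precip (mm)
print(f"HYINT = {hyint(year):.1f}")
```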
10. Data‐driven spatio‐temporal analysis of wildfire risk to power systems operation.
- Author
- Umunnakwe, Amarachi, Parvania, Masood, Nguyen, Hieu, Horel, John D., and Davis, Katherine R.
- Subjects
WILDFIRE prevention, WILDFIRE risk, RISK assessment, ARTIFICIAL neural networks, ELECTRIC power distribution grids, NATURAL disasters
- Abstract
Wildfires are natural or man‐made disasters that continuously threaten portions of the transmission and distribution grid, and thus the stability of the electric grid. This paper presents a two‐stage framework for assessing power system‐wildfire risk using a data‐driven wildfire prediction model. The first stage of the framework estimates the spatio‐temporal probability of potential wildfire ignition and propagation using a deep neural network in combination with the wildfire physical spread model. Analysis reveals similar spatial and temporal patterns between the model‐predicted wildfire ignition potential and actual wildfire ignition. Motivated by these observations, the second stage assesses the wildfire risk in the power grid operation in terms of potential loss of load by de‐energisation, through combining geospatial information system data of the power grid topology and the stochastic spatio‐temporal wildfire model developed in the first stage. The electric power utility applications introduced by the proposed framework are twofold: 1) a spatio‐temporal risk model for proactive de‐energisation against potential power system failure‐induced wildfire, and 2) a spatio‐temporal spreading model for optimal grid operations against exogenous wildfire. The proposed model, based on a real‐world dataset, is demonstrated on the IEEE 24‐bus test system mapped to a study area in Northern California; the results illustrate that the proposed model achieves the best performance in potential wildfire ignition detection (AUC of 0.995) compared to other baselines, and demonstrate the risk‐aware operation of the power system enabled by the proposed framework. [ABSTRACT FROM AUTHOR]
- Published
- 2022
11. Making salient ethics arguments about vaccine mandates: A California case study.
- Author
- Navin, Mark C. and Attwell, Katie
- Subjects
VACCINATION policies, HEALTH policy, IMMUNIZATION, HUMAN rights, INFORMED consent (Medical law), HARM reduction
- Abstract
Vaccine mandates can take many forms, and different kinds of mandates can implicate an array of values in diverse ways. It follows that good ethics arguments about particular vaccine mandates will attend to the details of individual policies. Furthermore, attention to particular mandate policies—and to attributes of the communities they aim to govern—can also illuminate which ethics arguments may be more salient in particular contexts. If ethicists want their arguments to make a difference in policy, they should attend to these kinds of empirical considerations. This paper focuses on the most common and contentious vaccine mandate reform in the contemporary United States: the elimination of nonmedical exemptions (NMEs) to school and daycare vaccine mandates. It highlights, in particular, debates about California's Senate Bill 277 (SB277), which was the first successful recent effort to eliminate NMEs in that country. We use media, secondary sources, and original interviews with policymakers and activists to identify and evaluate three ethics arguments offered by critics of SB277: parental freedom, informed consent, and children's rights to care and education. We then turn to one ethics argument often offered by advocates of SB277: harm prevention. We note, however, that three arguments for mandates that are common in the immunization ethics literature—fairness/free‐riding, children's rights to vaccination, and utilitarianism—did not play a role in debates about SB277. [ABSTRACT FROM AUTHOR]
- Published
- 2023
12. Annual biomass spatial data for southern California (2001–2021): Above‐ and belowground, standing dead, and litter.
- Author
- Schrader‐Patton, Charlie C., Underwood, Emma C., and Sorenson, Quinn M.
- Subjects
MONTE Carlo method, BIOMASS, GEOGRAPHIC information systems, ECOSYSTEM management, TUNDRAS, DATA libraries, FOREST fire management, FOREST restoration
- Abstract
Biomass estimates for shrub‐dominated ecosystems in southern California have been generated at national and statewide extents. However, existing data tend to underestimate biomass in shrub vegetation types, are limited to one point in time, or estimate aboveground live biomass only. In this study, we extended our previously developed estimates of aboveground live biomass (AGLBM), based on the empirical relationship of plot‐based field biomass measurements to Landsat normalized difference vegetation index (NDVI) and multiple environmental factors, to include other vegetative pools of biomass. AGLBM estimates were made by extracting plot values from elevation, solar radiation, aspect, slope, soil type, landform, climatic water deficit, evapotranspiration, and precipitation rasters and then using a random forest model to estimate per‐pixel AGLBM across our southern California study area. By substituting year‐specific Landsat NDVI and precipitation data, we created a stack of annual AGLBM raster layers for each year from 2001 to 2021. Using these AGLBM data as a foundation, we developed decision rules to estimate belowground, standing dead, and litter biomass pools. These rules were based on relationships between AGLBM and the biomass of the other vegetative pools derived primarily from peer‐reviewed literature and an existing spatial data set. For shrub vegetation types (our primary focus), rules were based on literature estimates by the postfire regeneration strategy of each species (obligate seeder, facultative seeder, obligate resprouter). Similarly, for nonshrub vegetation types (grasslands, woodlands) we used literature and existing spatial data sets specific to each vegetation type to define rules to estimate the other pools from AGLBM. Using a Python language script that accessed Environmental Systems Research Institute raster geographic information system utilities, we applied decision rules to create raster layers for each of the non‐AGLBM pools for the years 2001–2021. The resulting spatial data archive contains a zipped file for each year; each of these files contains four 32‐bit tiff files, one for each of the four biomass pools (AGLBM, standing dead, litter, and belowground). The biomass units are grams per square meter (g/m²). We estimated the uncertainty of our biomass data by conducting a Monte Carlo analysis of the inputs used to generate the data. Our Monte Carlo technique used randomly generated values for each of the literature‐based and spatial inputs based on their expected distribution. We conducted 200 Monte Carlo iterations, which produced percentage uncertainty values for each of the biomass pools. Results showed, using 2010 as an example, mean biomass for the study area and percentage uncertainty for each of the pools as follows: AGLBM (905.4 g/m², 14.4%); standing dead (644.9 g/m², 1.3%); litter (731.2 g/m², 1.2%); and belowground (776.2 g/m², 17.2%). Because our methods are consistently applied across each year, the data produced can be used to inform changes in biomass pools due to disturbance and subsequent recovery. As such, these data provide an important contribution to supporting the management of shrub‐dominated ecosystems for monitoring trends in carbon storage and assessing the impacts of wildfire and management activities, such as fuel management and restoration. There are no copyright restrictions on the data set; please cite this paper and the data package when using these data. [ABSTRACT FROM AUTHOR]
- Published
- 2023
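A minimal sketch of the plot-to-pixel workflow this abstract describes: fit a random forest relating field AGLBM to NDVI plus environmental covariates, then predict per pixel with year-specific inputs. The covariate list and the toy data are assumptions, not the study's actual predictors.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n_plots = 500
X = np.column_stack([
    rng.uniform(0.1, 0.8, n_plots),      # Landsat NDVI
    rng.uniform(0, 2000, n_plots),       # elevation (m)
    rng.uniform(100, 800, n_plots),      # annual precipitation (mm)
    rng.uniform(0, 45, n_plots),         # slope (degrees)
])
# Toy target: biomass increasing with NDVI and precipitation, plus noise
y = 1200 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 60, n_plots)

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

# "Year-specific" prediction: substitute that year's NDVI/precipitation rasters
# (flattened to rows) to produce an annual AGLBM layer, as in the paper.
pixels = np.column_stack([
    rng.uniform(0.1, 0.8, 10), rng.uniform(0, 2000, 10),
    rng.uniform(100, 800, 10), rng.uniform(0, 45, 10),
])
print(model.predict(pixels))             # AGLBM in g/m^2, per pixel
```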
13. Uncertainty in consensus predictions of plant species' vulnerability to climate change.
- Author
- Rose, Miranda Brooke, Velazco, Santiago José Elías, Regan, Helen M., Flint, Alan L., Flint, Lorraine E., Thorne, James H., and Franklin, Janet
- Subjects
CLIMATE change models, PLANT species, SPECIES distribution, SPATIAL variation, SPECIES diversity
- Abstract
Aim: Variation in spatial predictions of species' ranges made by various models has been recognized as a significant source of uncertainty for modelling species distributions. Consensus approaches that combine the results of multiple models have been employed to reduce the uncertainty introduced by different algorithms. We evaluate how estimates of habitat suitability, projected using species distribution models (SDMs), varied among different consensus methods relative to the variation introduced by different global climate models (GCMs) and representative concentration pathways (RCPs) used for projection. Location: California Floristic Province (California, US portion). Methods: We modelled the current and future potential distributions of 82 terrestrial plant species, developing model predictions under different combinations of GCMs, RCPs, time periods, dispersal assumptions and SDM consensus methods commonly used to combine different species distribution modelling algorithms. We assessed how each of these factors contributed to the variability in future predictions of species habitat suitability change and aggregate measures of proportional change in species richness. We also related variability in species‐level habitat change to species' attributes. Results: Assuming full dispersal capacity, the variability between habitat predictions made by different consensus methods was higher than the variability introduced by different RCPs and GCMs. The relationships between species' attributes and variability in future habitat predictions depended on the source of uncertainty and dispersal assumptions. However, species with small ranges or low prevalence tended to be associated with high variability in range change forecasts. Main Conclusions: Our results support exploring multiple consensus approaches when considering changes in habitat suitability outside of species' current distributions, especially when projecting species with low prevalence and small range sizes, as these species tend to be of the greatest conservation concern yet produce highly variable model outputs. Differences in vulnerability between diverging greenhouse gas concentration scenarios are most readily observed for end‐of‐century time periods and within species' currently occupied habitats (no dispersal). [ABSTRACT FROM AUTHOR]
- Published
- 2024
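For readers comparing consensus methods, a minimal sketch of two common choices (mean and median of per-algorithm suitability maps) and the per-pixel variability the choice alone introduces. The four stand-in prediction maps are random placeholders, not real SDM output.

```python
import numpy as np

rng = np.random.default_rng(5)
preds = rng.uniform(0.0, 1.0, size=(4, 1000))   # 4 algorithms x 1000 pixels

consensus_mean = preds.mean(axis=0)             # committee-average consensus
consensus_median = np.median(preds, axis=0)     # median consensus

# Variability attributable to the consensus method alone, per pixel
method_spread = np.abs(consensus_mean - consensus_median)
print(f"max per-pixel disagreement between methods: {method_spread.max():.3f}")
```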
14. Quantifying Seasonal and Diurnal Cycles of Solar‐Induced Fluorescence With a Novel Hyperspectral Imager.
- Author
- Ruehr, Sophie, Gerlein‐Safdi, Cynthia, Falco, Nicola, Seibert, Paul O., Chou, Chunwei, Albert, Loren, and Keenan, Trevor F.
- Subjects
CARBON dioxide fixation, FLUORESCENCE, PLANT canopies, LEAF physiology, SEASONS, PLATEAUS
- Abstract
Solar‐induced fluorescence (SIF) is a proxy of ecosystem photosynthesis that often scales linearly with gross primary productivity (GPP) at the canopy scale. However, the mechanistic relationship between GPP and SIF is still uncertain, especially at smaller temporal and spatial scales. We deployed an ultra‐hyperspectral imager over two grassland sites in California throughout a soil moisture dry down. The imager has high spatial resolution that limits mixed pixels, enabling differentiation between plants and leaves within one scene. We find that imager SIF correlates well with diurnal changes in leaf‐level physiology and gross primary productivity under well‐watered conditions. These relationships deteriorate throughout the dry down event. Our results demonstrate an advancement in SIF imaging with new possibilities in remotely sensing plant canopies from the leaf to the ecosystem. These data can be used to resolve outstanding questions regarding SIF's meaning and usefulness in terrestrial ecosystem monitoring. Plain Language Summary: Estimating the rate of carbon uptake by vegetation across space and time remains a challenge. Solar‐induced fluorescence (SIF), the emission of light by vegetation during photosynthesis, has recently emerged as a potential estimate of carbon uptake in many ecosystems and is observable from both satellites and ground‐based sensors. Here we present results from a field campaign with a novel SIF instrument that creates images (akin to a photo) across a landscape, allowing for SIF measurements from individual leaves, plants, or areas of interest. We find that SIF retrievals from the imager correspond to seasonal variations in carbon dioxide fixation rates and leaf‐level physiology relating to photosynthesis. We use this novel technology to improve understanding of SIF and carbon uptake across spatial and temporal scales. Key Points: Novel imagery technology enables solar‐induced fluorescence (SIF) acquisition across space and time; SIF diurnal and seasonal variations correspond to carbon fluxes and environmental conditions; Imaging capacity predicts leaf‐level physiology across leaf, plant, and landscape scales. [ABSTRACT FROM AUTHOR]
- Published
- 2024
15. Quantifying Earth's Topography: Steeper and Larger Than Projected in Digital Terrain Models.
- Author
- Voigtländer, Anne, Rheinwalt, Aljoscha, and Tofelde, Stefanie
- Subjects
DIGITAL elevation models, EARTH topography, LANDSLIDES, SURFACE topography, GRID cells, ANALYTICAL solutions
- Abstract
Grid‐ or pixel‐based models, used across various scientific disciplines from microscopic to planetary scales, contain an unquantified error that biases our interpretation of the data. The error is produced by projecting 3D data onto a 2D grid. For Digital Terrain Models (DTMs) the projection error affects all slope‐dependent topographic metrics, like surface area or slope angle. Due to the proportionality of the error to the cosine of the slope, we can correct for it. We quantify the error and test the correction using synthetic landscapes for which we have analytical solutions of their metrics. Application to real‐world landscapes in California reveals the systematic underestimation of surface area by up to a third, and of mean slope angles by up to 10° in steep topography in current DTMs. Correcting projection errors allows for true estimates of surface areas and slope distributions, enabling physics‐based models of surface processes at any spatial scale. Plain Language Summary: Hiking up a steep mountain slope feels longer than the horizontal distance measured on a map. The slope angle is calculated by taking the height over the horizontal distance. The length of a (mountain) slope is always greater than its horizontal projection. This mistake is also very common in Digital Terrain Models (DTMs), in which landscape images captured from a bird's eye view are projected onto horizontal gridded surfaces. In the model, each grid cell has the same length and contains the height of the landscape. Due to this view, the slopes are represented only by the horizontal distance, which is shorter. We call this the projection error. Because the error depends on the slope, we can use that dependence to correct it. We test the correction on different landscapes with steep and gentle topography. We find that uncorrected models underestimate surface area and length by up to a third. Fixing the projection error shows that mountain slopes are much steeper and longer than usually reported. Knowing the true length of a mountain slope, we can use the topographic data to better understand and maybe predict processes and volumes, like landslides or fluid transport. Key Points: The projection error in gridded models biases our view of surface topography at all scales; Correcting for the projection error allows exploring physics‐based erosion and transport laws; The correction enables better sourcing of the topographic data we have access to. [ABSTRACT FROM AUTHOR]
- Published
- 2024
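The cosine correction described here admits a short worked example: a DTM cell's true surface area is its planar area divided by cos(slope), so the fractional underestimation of a planar grid is 1 − cos(slope). The 10 m cell size is an arbitrary assumption.

```python
import numpy as np

cell = 10.0                                    # grid spacing (m)
slope_deg = np.array([0.0, 15.0, 30.0, 45.0, 60.0])
slope = np.radians(slope_deg)

planar_area = cell ** 2                        # what a 2D grid implicitly reports
true_area = planar_area / np.cos(slope)        # corrected, slope-dependent area
underestimation = 1.0 - np.cos(slope)          # fraction of true area missed

for s, a, u in zip(slope_deg, true_area, underestimation):
    print(f"slope {s:4.0f} deg: true area {a:7.1f} m^2, "
          f"planar grid misses {100 * u:4.1f}%")
# At ~48 deg the underestimation reaches one third, matching the up-to-a-third
# figure the abstract reports for steep California terrain.
```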
16. Implications of Variability and Trends in Coastal Extreme Water Levels.
- Author
- Sweet, William V., Genz, Ayesha S., Menendez, Melisa, Marra, John J., and Obeysekera, Jayantha
- Subjects
STORM surges, TERRITORIAL waters, WATER levels, SEA level, DISTRIBUTION (Probability theory), CLIMATE change, TIDAL forces (Mechanics)
- Abstract
Probabilities of coastal extreme water levels (EWLs) are increasing as sea levels rise. Using a time‐dependent statistical model on tide gauge data along U.S. and Pacific Basin coastlines, we show that EWL probability distributions also shift on an annual basis from climate forcing and long‐period tidal cycles. In some regions, combined variability (>15 cm) can be as large or larger than the amount of sea level rise (SLR) experienced over the past 30 years and projected over the next 30 years. Considering SLR and variability by 2050 at a location like La Jolla, California suggests a moderate‐level (damaging) flood today with a 50‐year return level (2% annual chance) would occur about 3–4 times a year during an El Niño nearing the peak of the nodal tide cycle. If interannual variability is overlooked, SLR related impacts could be more severe than anticipated based solely upon decadal‐scale projections. Plain Language Summary: Coastal communities are flooding more often due to sea level rise (SLR), but some years are worse than others. We use a statistical model to show how the probabilities of coastal high waters, often referred to as extreme water levels—a combination of above average tides and storm surge—have shifted higher or lower every year with SLR and from changes in the tides and climatic (persistent weather and ocean) patterns. There are many U.S. and Pacific coastal regions where year‐to‐year variability is 15 cm or more, which is as large as the last 30 years of SLR and this pattern is projected to continue over the next 30 years. Considering additional SLR over the next 30 years could help compensate for year‐to‐year variability. Key Points: Probability distributions of coastal extreme water levels shift higher and lower with tide cycles, climatic patterns and sea level rise (SLR); Annual shifts of >15 cm from variability along Pacific coasts exceed SLR over the last 30 years and projected over the next 30 years; Annual‐scale variability envelopes are envisioned to assist in decadal‐scale SLR and flood frequency assessments. [ABSTRACT FROM AUTHOR]
- Published
- 2024
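A stationary sketch of the return-level arithmetic referenced here (the paper's statistical model is time-dependent, and all parameter values below are illustrative, not fitted to any real tide gauge): fit a GEV to annual maxima, read off the 50-year (2% annual chance) level, then see how a 15 cm shift changes its exceedance probability.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic 60-year record of annual-maximum water levels (m above datum)
annual_maxima = genextreme.rvs(-0.1, loc=1.5, scale=0.15, size=60, random_state=11)

c, loc, scale = genextreme.fit(annual_maxima)
level_50yr = genextreme.isf(0.02, c, loc=loc, scale=scale)   # 2% annual chance
print(f"50-year return level: {level_50yr:.2f} m above datum")

# Shifting the whole distribution by 0.15 m (SLR plus interannual variability)
# sharply raises the annual exceedance probability of that same level:
p_shifted = genextreme.sf(level_50yr - 0.15, c, loc=loc, scale=scale)
print(f"annual exceedance probability after a 0.15 m shift: {p_shifted:.2f}")
```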
17. Remnant salmon life history diversity rediscovered in a highly compressed habitat.
- Author
- Hugentobler, Sara A., Sturrock, Anna M., Willmes, Malte, Thompson, Tasha Q., Johnson, Rachel C., Cordoleani, Flora, Stauffer‐Olsen, Natalie J., Whitman, George, and Meek, Mariah H.
- Subjects
LIFE history theory, CHINOOK salmon, ENVIRONMENTAL history, HABITATS, GENETIC variation
- Abstract
Chinook salmon (Oncorhynchus tshawytscha) display remarkable life history diversity, underpinning their ability to adapt to environmental change. Maintaining life history diversity is vital to the resilience and stability of Chinook salmon metapopulations, particularly under changing climates. However, the conditions that promote life history diversity are rapidly disappearing, as anthropogenic forces promote homogenization of habitats and genetic lineages. In this study, we use the highly modified Yuba River in California to understand if distinct genetic lineages and life histories still exist, despite reductions in spawning habitat and hatchery practices that have promoted introgression. There is currently a concerted effort to protect federally listed Central Valley spring‐run Chinook salmon populations, given that few wild populations still exist. Despite this, we lack a comprehensive understanding of the genetic and life history diversity of Chinook salmon present in the Yuba River. To understand this diversity, we collected migration timing data and GREB1L genotypes from hook‐and‐line, acoustic tagging, and carcass surveys of Chinook salmon in the Yuba River between 2009 and 2011. Variation in the GREB1L region of the genome is tightly linked with run timing in Chinook salmon throughout their range, but the relationship between this variation and entry on spawning grounds is little explored in California's Central Valley. We found that the date Chinook salmon crossed the lowest barrier to Yuba River spawning habitat (Daguerre Point Dam) was tightly correlated with their GREB1L genotype. Importantly, our study confirms that ESA‐listed spring‐run Chinook salmon are spawning in the Yuba River, promoting a portfolio of life history and genetic diversity, despite the highly compressed habitat. This work highlights the need to identify and protect this life history diversity, especially in heavily impacted systems, to maintain healthy Chinook salmon metapopulations. Without protection, we run the risk of losing the last vestiges of important genetic variation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
18. Prediction model for short‐term traffic flow based on a K‐means‐gated recurrent unit combination.
- Author
- Sun, Zhaoyun, Hu, Yuanjiao, Li, Wei, Feng, Shaowei, and Pei, Lili
- Subjects
TRAFFIC flow, TRAFFIC patterns, K-means clustering, PREDICTION models, TRAFFIC estimation, CLASSIFICATION algorithms
- Abstract
Short‐term forecasting of traffic flow is an indispensable part of easing traffic pressure. Considering that different traffic flow patterns will affect the short‐term traffic flow prediction results, a combined method based on the K‐means clustering algorithm and the gated recurrent unit (GRU) is proposed to build a short‐term traffic flow prediction model that overcomes this problem. The K‐means algorithm is used to cluster historical traffic flow data to establish different traffic flow pattern libraries. The K‐nearest neighbour (KNN) classification algorithm is used to determine the historical traffic flow pattern most similar to the traffic flow change trend of the date to be predicted. All historical traffic flow data in this category are then used as training samples to make targeted predictions. Traffic flow data from the Performance Measurement System (PeMS) in California, USA are used to verify the performance of the proposed model. Compared with the GRU network, stacked auto encoders (SAEs), random forest (RF), and support vector machine regression (SVR), the results show that the proposed combination model K‐means‐GRU considers the diversity of traffic flow patterns and improves the prediction accuracy, and thus can better solve the short‐term traffic flow prediction problem. [ABSTRACT FROM AUTHOR]
- Published
- 2022
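The pattern-library stages of the proposed pipeline (K-means to build libraries, KNN to match the day to be predicted to its most similar pattern) can be sketched with scikit-learn. The per-cluster GRU regressor is only indicated by a comment, and the data shapes are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(2)
days = rng.normal(size=(200, 288))            # 200 days x 288 five-minute counts
days += np.sin(np.linspace(0, 2 * np.pi, 288)) * rng.uniform(1, 3, (200, 1))

# Stage 1: cluster historical daily profiles into pattern libraries
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(days)

# Stage 2: assign the day to be predicted to its most similar pattern library
knn = KNeighborsClassifier(n_neighbors=5).fit(days, kmeans.labels_)
new_day = days[0] + rng.normal(0, 0.1, 288)
pattern = knn.predict(new_day.reshape(1, -1))[0]

# All historical days in this library would then train a GRU regressor
# (e.g. torch.nn.GRU) to forecast the next interval for this traffic regime.
train_set = days[kmeans.labels_ == pattern]
print(f"pattern {pattern}: {len(train_set)} days available for GRU training")
```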
19. Climatically robust multiscale species distribution models to support pronghorn recovery in California.
- Author
- Bean, William T., Butterfield, H. Scott, Howard, Jeanette K., and Batter, Thomas J.
- Subjects
SPECIES distribution, RANDOM forest algorithms, REGRESSION trees, ATMOSPHERIC models, HABITAT selection, HOME range (Animal geography), HABITATS
- Abstract
We combined two climate‐based distribution models with three finer‐scale suitability models to identify habitat for pronghorn recovery in California now and into the future. We used a consensus approach to identify areas of suitable climate now and in the future for pronghorn in California. We compared the results of climate models from two separate hypotheses about their historical ecology in the state. Under the migration hypothesis, pronghorn were expected to be limited climatically by extreme cold in winter and extreme heat in summer; under the niche reduction hypothesis, the historical distribution of pronghorn would have better represented the climatic limitations of the species. We combined occurrences from GPS collars distributed across three populations of pronghorn in the state to create three distinct habitat suitability models: (1) an ensemble model using random forests, Maxent, classification and regression trees, and a generalized linear model; (2) a step selection function; and (3) an expert‐driven model. We evaluated consensus among both the climate models and the suitability models to prioritize areas for, and evaluate the prospects of, pronghorn recovery. Climate suitability for pronghorn in the future depends heavily on model assumptions. Under the migration hypothesis, our model predicted that there will be no suitable climate in California in the future. Under the niche reduction hypothesis, by contrast, suitable climate will expand. Habitat suitability also depended on the methods used, but areas of consensus among all three models exist in large patches throughout the state. Identifying habitat for a species which has undergone extreme range collapse, and which has very fine scale habitat needs, presents novel challenges for spatial ecologists. Our multimethod, multihypothesis approach can allow habitat modelers to identify areas of consensus and, perhaps more importantly, fill critical knowledge gaps that could resolve disagreements among the models. For pronghorn, a better understanding of their upper thermal tolerances and whether historical populations migrated will be crucial to their potential recovery in California and throughout the arid Southwest. [ABSTRACT FROM AUTHOR]
- Published
- 2024
20. Size at maturity, reproductive cycle, and fecundity of the southern California brown box crab Lopholithodes foraminatus and implications for developing a new targeted fishery.
- Author
- Stroud, Ashley, Culver, Carolynn S., and Page, Henry M.
- Subjects
SEXUAL cycle, FISHERS, FERTILITY, CRABS, FISHERIES, SPRING
- Abstract
Objective: The brown box crab Lopholithodes foraminatus is a member of the king and stone crab family (Lithodidae) that occurs in deepwater along the eastern Pacific coast. Historically, landings in California have been low for this species, but an increase in fishing pressure prompted the state to designate it as an emerging fishery and implement an experimental fishery program. With no known biological studies of California brown box crab, essential fisheries information is needed to evaluate the feasibility of a new targeted fishery. Methods: Using field sampling and observations, along with laboratory studies, we investigated elements of reproductive capacity of the brown box crab in southern California. Result: We found that females reach physiological maturity at a carapace width (CW) between 50.8 and 71.7 mm, and males do so at a CW between 43.3 and 66.3 mm. Morphometric maturity analysis showed a clear inflection point of abdomen width between immature and mature females. Females were 50% functionally mature at 75 mm CW. Morphometric and functional maturity were not detected for males, although samples of small male crabs were extremely limited, thus warranting further study. Females followed a biennial reproduction pattern: mating occurred in the fall, followed by an approximately 18‐month brooding period, with hatching in the second spring after mating. Fecundity was positively related to size and ranged from 8,352 eggs/brood for a 67.8‐mm‐CW female to 62,181 eggs/brood for a 130.5‐mm‐CW female. Conclusion: These findings can inform the evaluation of a fishery for the brown box crab, including potential management strategies and models for assessing stock condition. Impact statement: New information was generated about the reproduction of the deepwater brown box crab to help evaluate the potential for a new California commercial fishery. The results are informing discussions about ways to manage such a fishery, including limits on the size, number, and/or time of year crabs may be fished. [ABSTRACT FROM AUTHOR]
- Published
- 2024
21. Genomics and 20 years of sampling reveal phenotypic differences between subpopulations of outmigrating Central Valley Chinook salmon.
- Author
- Thompson, Tasha Q., O'Leary, Shannon, O'Rourke, Sean, Tarsa, Charlene, Baerwald, Melinda R., Goertler, Pascale, and Meek, Mariah H.
- Subjects
CHINOOK salmon, NUCLEOTIDE sequencing, GENOMICS, BODY size, PHENOTYPES, DEER
- Abstract
Intraspecific diversity plays a critical role in the resilience of Chinook salmon populations. California's Central Valley (CV) historically hosted one of the most diverse population complexes of Chinook salmon in the world. However, anthropogenic factors have dramatically decreased this diversity, with severe consequences for population resilience. Here we use next generation sequencing and an archive of thousands of tissue samples collected across two decades during the juvenile outmigration to evaluate phenotypic diversity between and within populations of CV Chinook salmon. To account for highly heterogeneous sample qualities in the archive dataset, we develop and test an approach for population and subpopulation assignments of CV Chinook salmon that allows inclusion of relatively low‐quality samples while controlling error rates. We find significantly distinct outmigration timing and body size distributions for each population and subpopulation. Within the archive dataset, spring run individuals that assigned to the Mill and Deer Creeks subpopulation exhibited an earlier and broader outmigration distribution as well as larger body sizes than individuals that assigned to the Butte Creek subpopulation. Within the fall run population, individuals that assigned to the late‐fall run subpopulation also exhibited an earlier and broader outmigration distribution and larger body sizes than other fall run fish in our dataset. These results highlight the importance of distinct subpopulations for maintaining remaining diversity in CV Chinook salmon, and demonstrate the power of genomics‐based population assignments to aid the study and management of intraspecific diversity. [ABSTRACT FROM AUTHOR]
- Published
- 2024
22. Equine neuroaxonal dystrophy/degenerative myeloencephalopathy in Gypsy Vanner horses.
- Author
- Powers, Alexis, Peek, Simon F., Reed, Steve, Donnelly, Callum G., Tinkler, Stacey, Gasper, David, Woolard, Kevin D., and Finno, Carrie J.
- Subjects
VITAMIN E, HORSES, DYSTROPHY, DIETARY supplements, NEUROLOGIC examination, POSTMORTEM changes, OSTEOCHONDROSIS
- Abstract
Background: Equine neuroaxonal dystrophy/degenerative myeloencephalopathy (eNAD/EDM) is a neurodegenerative disease that primarily affects young, genetically predisposed horses that are deficient in vitamin E. Equine NAD/EDM has not previously been documented in Gypsy Vanner horses (GVs). Objectives: To evaluate: (1) the clinical phenotype, blood vitamin E concentrations before and after supplementation, and pedigree in a cohort of GV horses with a high prevalence of neurologic disease suspicious for eNAD/EDM and (2) to confirm eNAD/EDM in GVs through postmortem evaluation. Animals: Twenty‐six GVs from 1 farm in California and 2 cases from the Midwestern U.S. Methods: Prospective observational study on Californian horses; all 26 GVs underwent neurologic examination. Pre‐supplementation blood vitamin E concentration was assessed in 17 GVs. Twenty‐three were supplemented orally with 10 IU/kg of liquid RRR‐alpha‐tocopherol once daily for 28 days. Vitamin E concentration was measured in 23 GVs after supplementation, of which 15 (65%) had pre‐supplementation measurements. Two clinically affected GVs from California and the 2 Midwestern cases had necropsy confirmation of eNAD/EDM. Results: Pre‐supplementation blood vitamin E concentration was ≤2.0 μg/mL in 16/17 (94%) of GVs from California. Post‐supplementation concentration varied, with a median of 3.39 μg/mL (range, 1.23‐13.87 μg/mL), but only 12/23 (52%) were normal (≥3.0 μg/mL). Normalization of vitamin E was significantly associated with increasing age (P =.02). Euthanized horses (n = 4) had eNAD/EDM confirmed at necropsy. Conclusions and Clinical Importance: GVs could have a genetic predisposition to eNAD/EDM. Vitamin E supplementation should be considered and monitored in young GVs. [ABSTRACT FROM AUTHOR]
- Published
- 2024
23. The Kimberlina synthetic multiphysics dataset for CO2 monitoring investigations.
- Author
- Alumbaugh, David, Gasperikova, Erika, Crandall, Dustin, Commer, Michael, Feng, Shihang, Harbert, William, Li, Yaoguo, Lin, Youzuo, and Samarasinghe, Savini
- Subjects
PETROPHYSICS, GEOPHYSICAL well logging, SPEED of sound, SEISMIC wave velocity, INJECTION wells, ELECTRICAL resistivity
- Abstract
We present a synthetic multi‐scale, multi‐physics dataset constructed from the Kimberlina 1.2 CO2 reservoir model based on a potential CO2 storage site in the Southern San Joaquin Basin of California. Among 300 models, one selected reservoir‐simulation scenario produces hydrologic‐state models at the onset and after 20 years of CO2 injection. Subsequently, these models were transformed into geophysical properties, including P‐ and S‐wave seismic velocities, saturated density where the saturating fluid can be a combination of brine and supercritical CO2, and electrical resistivity, using established empirical petrophysical relationships. From these 3D distributions of geophysical properties, we have generated synthetic time‐lapse seismic, gravity and electromagnetic responses with acquisition geometries that mimic realistic monitoring surveys and are achievable in actual field situations. We have also created a series of synthetic well logs of CO2 saturation, acoustic velocity, density and induction resistivity in the injection well and three monitoring wells. These were constructed by combining the low‐frequency trend of the geophysical models with the high‐frequency variations of actual well logs collected at the potential storage site. In addition, to better calibrate our datasets, measurements of permeability and pore connectivity have been made on cores of Vedder Sandstone, which forms the primary reservoir unit. These measurements anchor the range of scales in the otherwise synthetic dataset, keeping it as close to a real‐world situation as possible. This dataset, consisting of the reservoir models, geophysical models, simulated time‐lapse geophysical responses and well logs, forms a multi‐scale, multi‐physics testbed for designing and testing geophysical CO2 monitoring systems as well as for imaging and characterization algorithms. The suite of numerical models and data have been made publicly available for downloading on the National Energy Technology Laboratory's (NETL) Energy Data Exchange (EDX) website. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Prescribed fire placement matters more than increasing frequency and extent in a simulated Pacific Northwest landscape.
- Author
-
Deak, Alison L., Lucash, Melissa S., Coughlan, Michael R., Weiss, Shelby, and Silva, Lucas C. R.
- Subjects
WILDFIRES ,WILDFIRE prevention ,PRESCRIBED burning ,FUEL reduction (Wildfire prevention) ,CARBON sequestration in forests ,FOREST succession ,CLIMATE change mitigation ,FOREST dynamics - Abstract
Prescribed fire has been increasingly promoted to reduce wildfire risk and restore fire‐adapted ecosystems. Yet, the complexities of forest ecosystem dynamics in response to disturbances, climate change, and drought stress, combined with myriad social and policy barriers, have inhibited widespread implementation. Using the forest succession model LANDIS‐II, we investigated the likely impacts of increasing prescribed fire frequency and extent on wildfire severity and forest carbon storage at local and landscape scales. Specifically, we ask: how much prescribed fire is required to maintain carbon storage and reduce the severity and extent of wildfires under divergent climate change scenarios? We simulated four prescribed fire scenarios (no prescribed fire, business‐as‐usual, moderate increase, and large increase) in the Siskiyou Mountains of northwest California and southwest Oregon. At the local site scale, prescribed fires lowered the severity of projected wildfires and maintained approximately the same level of ecosystem carbon storage when reapplied at a ~15‐year return interval for 50‐year simulations. Increased frequency and extent of prescribed fire decreased the likelihood of aboveground carbon combustion during wildfire events. However, at the landscape scale, prescribed fire did not decrease the projected severity and extent of wildfire, even when large increases (up to 10× the current levels) of prescribed fire were simulated. Prescribed fire was most effective at reducing wildfire severity under a climate change scenario with increased temperature and precipitation and on sites with north‐facing aspects and slopes greater than 30°. Our findings suggest that placement matters more than frequency and extent in determining the effects of prescribed fire, and that prescribed fire alone would not be sufficient to reduce the risk of wildfire and promote carbon sequestration at regional scales in the Siskiyou Mountains. To improve feasibility, we propose targeting areas of high concern or value to decrease the risk of high‐severity fire and contribute to meeting climate mitigation and adaptation goals. Our results support strategic and targeted landscape prioritization of fire treatments to reduce wildfire severity and increase the pace and scale of forest restoration in areas of social and ecological importance, highlighting the challenges of using prescribed fire to lower wildfire risk. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
25. Evaluating the Role of Titanomagnetite in Bubble Nucleation: Novel Applications of Low Temperature Magnetic Analysis and Textural Characterization of Rhyolite Pumice and Obsidian From Glass Mountain, California.
- Author
-
McCartney, Kelly N., Hammer, Julia E., Shea, Thomas, Brachfeld, Stefanie, and Giachetti, Thomas
- Subjects
PUMICE ,RHYOLITE ,OBSIDIAN ,LOW temperatures ,HOMOGENEOUS nucleation ,BUBBLES - Abstract
Nucleation of H2O vapor bubbles in magma requires surpassing a chemical supersaturation threshold via decompression. The threshold is minimized in the presence of a nucleation substrate (heterogeneous nucleation, <50 MPa), and maximized when no nucleation substrate is present (homogeneous nucleation, >100 MPa). The existence of explosively erupted aphyric rhyolite magma staged from shallow (<100 MPa) depths represents an apparent paradox that hints at the presence of a cryptic nucleation substrate. In a pair of studies focusing on Glass Mountain eruptive units from Medicine Lake, California, we characterize titanomagnetite nanolites and ultrananolites in pumice, obsidian, and vesicular obsidian (Brachfeld et al., 2024, https://doi.org/10.1029/2023GC011336), calculate titanomagnetite crystal number densities, and compare titanomagnetite abundance with the physical properties of pumice to evaluate hypotheses on the timing of titanomagnetite crystallization. Titanomagnetite crystals with grain sizes of approximately 3–33 nm are identified in pumice samples from the thermal unblocking of low‐temperature thermoremanent magnetization. The titanomagnetite number densities for pumice are 10¹⁸ to 10²⁰ m⁻³, comparable to number densities in pumice and obsidian obtained from room temperature methods (Brachfeld et al., 2024, https://doi.org/10.1029/2023GC011336). This range exceeds reported bubble number densities (BND) within the pumice from the same eruptive units (average BND ∼4 × 10¹⁴ m⁻³). The similar abundances of nm‐scale titanomagnetite crystals in the effusive and explosive products of the same eruption, together with the lack of correlation between pumice permeability and titanomagnetite content, are consistent with titanomagnetite formation having preceded the bubble formation. Results suggest sub‐micron titanomagnetite crystals are responsible for heterogeneous bubble nucleation in this nominally aphyric rhyolite magma. Key Points: Aphyric rhyolite eruptions staged from shallow magma reservoirs lack the overpressure needed for homogeneous bubble nucleation. Heterogeneous bubble nucleation may occur on sub‐µm titanomagnetite crystals, which are undetectable using standard analytical techniques. Sub‐µm titanomagnetite crystals can be detected and quantified with low temperature magnetic analyses [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
26. Evaluating the Role of Titanomagnetite in Bubble Nucleation: Rock Magnetic Detection and Characterization of Nanolites and Ultra‐Nanolites in Rhyolite Pumice and Obsidian From Glass Mountain, California.
- Author
-
Brachfeld, Stefanie, McCartney, Kelly N., Hammer, Julia E., Shea, Thomas, and Giachetti, Thomas
- Subjects
PUMICE ,RHYOLITE ,OBSIDIAN ,MAGNETIC anisotropy ,SURFACE of the earth ,COSMIC abundances ,SUPERPARAMAGNETIC materials ,SUPERRADIANCE - Abstract
We document the presence, composition, and number density (TND) of titanomagnetite nanolites and ultra‐nanolites in aphyric rhyolitic pumice, obsidian, and vesicular obsidian from the 1060 CE Glass Mountain volcanic eruption of Medicine Lake Volcano, California, using magnetic methods. Curie temperatures indicate compositions of Fe2.40Ti0.60O4 to Fe3O4. Rock‐magnetic parameters sensitive to domain state, which is dependent on grain volume, indicate a range of particle sizes spanning superparamagnetic (<50–80 nm) to multidomain (>10 μm) particles. Cylindrical cores drilled from the centers of individual pumice clasts display anisotropy of magnetic susceptibility with prolate fabrics, with the highest degree of anisotropy coinciding with the highest vesicularity. Fabrics within a pumice clast require particle alignment within a fluid, and are interpreted to result from the upward transport of magma driven by vesiculation, ensuing bubble growth, and shearing in the conduit. Titanomagnetite number density (TND) is calculated from titanomagnetite volume fraction, which is determined from ferromagnetic susceptibility. TND estimates for monospecific assemblages of 1,000 nm–10 nm cubes predict 10¹² to 10²⁰ m⁻³ of solid material, respectively. TND estimates derived using a power law distribution of grain sizes predict 10¹⁸ to 10¹⁹ m⁻³. These ranges agree well with TND determinations of 10¹⁸ to 10²⁰ m⁻³ made by McCartney et al. (2024), and are several orders of magnitude larger than the number density of bubbles in these materials. These observations are consistent with the hypothesis that titanomagnetite crystals already existed in extremely high number‐abundance at the time of magma ascent and bubble nucleation. Plain Language Summary: We use magnetism experiments to prove that nanometer‐sized magnetic particles are present in volcanic rocks with low iron content and few visible crystals. Nanolites (particles between 30 and 1,000 nm) and ultra‐nanolites (particles smaller than 30 nm) are extremely difficult to detect in volcanic rocks composed mainly of glass using conventional methods such as optical and electron microscopy. Titanomagnetite nano‐particles may play a role in controlling the explosiveness of volcanic eruptions. The magnetic signatures of minerals can be used to determine their chemical composition, particle size range, and particle abundance. Pumice and obsidian contain the mineral titanomagnetite, with no evidence of prolonged crystallization at high oxygen levels at the Earth's surface. Observed magnetic behaviors are very similar to those of previously published studies of titanomagnetite in the 10–1,000 nm size range, and similar to mathematical models that simulate this size range. We find that pumice clasts have a magnetic fabric, suggesting that the nanolites and ultra‐nanolites were aligned in spatial patterns before the magma solidified, with stronger alignment coinciding with high degrees of vesicularity. Our results indicate that titanomagnetite crystals are highly abundant, and had crystallized in the magma chamber before the eruption. Key Points: Magnetic methods document titanomagnetite nanolites in rhyolitic materials from Glass Mountain, Medicine Lake Volcano, California. Titanomagnetite number densities for pumice, obsidian, and vesicular obsidian span 10¹² to 10²⁰ m⁻³ of solid material. Titanomagnetite crystals already existed in extremely high number‐abundance at the time of magma ascent and bubble nucleation [ABSTRACT FROM AUTHOR]
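The TND arithmetic described above reduces to dividing a magnetically determined volume fraction by the volume of an assumed grain size. A minimal sketch of that scaling follows; the volume fractions are placeholders chosen only to reproduce the quoted endpoints, not the measured values.

```python
# TND for a monospecific assemblage of cubes: N = f / a**3, where f is the
# titanomagnetite volume fraction (from ferromagnetic susceptibility) and
# a is the cube edge length. Placeholder fractions, for illustration only.
for f, edge_nm in [(1e-6, 1000.0), (1e-4, 10.0)]:
    a = edge_nm * 1e-9                       # edge length in meters
    print(f"a = {edge_nm:6.0f} nm -> N = {f / a**3:.0e} m^-3")
```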
- Published
- 2024
- Full Text
- View/download PDF
27. Plant community data collected by Robert H. Whittaker in the Siskiyou Mountains, Oregon and California, USA.
- Author
-
Whittaker, Robert H., Damschen, Ellen I., and Harrison, Susan
- Subjects
COMMUNITIES ,SPECIES diversity ,PLANT variation ,ACQUISITION of data ,CHEMICAL composition of plants ,SHRUBS ,PLANT communities ,PLANT diversity - Abstract
In 1949–1951, ecologist Robert H. Whittaker sampled plant community composition at 470 sites in the Siskiyou Mountains (Oregon and California; also known as Klamath or Klamath‐Siskiyou Mountains). His primary goal was to develop methods to quantify plant community variation across environmental gradients, following on his seminal work challenging communities as discrete entities. He selected the Siskiyous because of their diverse and endemic‐rich flora, which he attributed to geological complexity and an ancient stable climate. He chose sites to span gradients of topography, elevation, geologic substrate, and distance from the coast. He used the frequencies of indicator species in his data to assign sampling locations to positions on the topographic gradient, nested within the elevational and substrate gradients. He originated in this study the concept of diversity partitioning, in which gamma diversity (species richness of a community) equals alpha diversity (species richness in homogeneous sites) times beta diversity (species turnover among sites along gradients). Diversity partitioning subsequently became highly influential and new developments on it continue. Whittaker published his Siskiyou work covering paleohistory, biogeography, floristics, vegetation, gradient analysis, and diversity partitioning in Ecological Monographs in 1960. Discussed in 2 pages of his 60‐page monograph, diversity partitioning accounts for >95% of its current >4300 citations. In 2006, we retrieved Whittaker's Siskiyou data in hard copy from the Cornell University archives and entered them in a database. We used these data for multiple published analyses, including some based on (re)sampling the approximate locations of a subset of his sites. Because of the continued interest in diversity partitioning and in historic data sets, here we present his data, including 359 sampling locations and their descriptors and, for each sample, a list of species with their estimated percent cover (herbs and shrubs) and numbers by diameter at breast height (DBH) category (trees). Site descriptors include the approximate location (road, trail, or stream), elevation, topographic aspect, geologic substrate (serpentine, gabbro, or diorite), and dominant woody vegetation of each location. For 111 sites, including the small number chosen to represent the distance‐to‐coast gradient, we could not locate his data. There are no copyright restrictions and users of these data should cite this data paper in any publications that result from its use. The authors are available for consultations about and collaborations involving the data. [ABSTRACT FROM AUTHOR]
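Whittaker's multiplicative partition is compact enough to state in a few lines. A minimal sketch with a toy site-by-species presence matrix (illustrative data, not Whittaker's):

```python
import numpy as np

# Rows are sites, columns are species; 1 = present, 0 = absent. Toy data.
sites = np.array([
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
])

gamma = np.count_nonzero(sites.any(axis=0))  # pooled community richness
alpha = sites.sum(axis=1).mean()             # mean within-site richness
beta = gamma / alpha                         # turnover among sites
print(gamma, alpha, beta)                    # 4 2.0 2.0
```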
- Published
- 2022
- Full Text
- View/download PDF
28. Large‐scale, multidecade monitoring data from kelp forest ecosystems in California and Oregon (USA).
- Author
-
Malone, Daniel P., Davis, Kathryn, Lonhart, Steve I., Parsons‐Field, Avrey, Caselle, Jennifer E., and Carr, Mark H.
- Subjects
ECOSYSTEMS ,MACROCYSTIS ,ECOSYSTEM services ,MARINE parks & reserves ,KELPS ,OCEAN temperature ,BIOTIC communities ,FISHERIES - Abstract
Kelp forests are among the most productive ecosystems on Earth. In combination with their close proximity to the shore, the productivity and biodiversity of these ecosystems generate a wide range of ecosystem services including supporting (e.g., primary production, habitat), regulating (e.g., water flow, coastal erosion), provisioning (e.g., commercial and recreational fisheries), and cultural (e.g., recreational, artisanal) services. For these reasons, kelp forests have long been the target of ecological studies. However, with few exceptions, these studies have been localized and short term (<5 years). In 1999, recognizing the importance of large‐scale, long‐term studies for understanding the structure, functioning, and dynamics of coastal marine ecosystems, and for informing policy, the Partnership for Interdisciplinary Studies of Coastal Oceans (PISCO) designed and initiated a large‐scale, long‐term monitoring study of kelp forest ecosystems along 1400 km of coast stretching from southern California to southern Oregon, USA. The purpose of the study has been to characterize the spatial and temporal patterns of kelp forest ecosystem structure and evaluate the relative contributions of biological and environmental variables derived from external sources (e.g., sea otter density, Chl‐a concentration, sea surface temperature, wave energy) in explaining observed spatial and temporal patterns. For this purpose, the ecological community (i.e., density, percent cover, or biomass of conspicuous fishes, invertebrates, and macroalgae) and geomorphological attributes (bottom depth, substratum type, and vertical relief) of kelp forest ecosystems have been surveyed annually using SCUBA divers trained in both scientific diving and data collection techniques and the identification of kelp forest species. The study region spans distinct ecological and biogeographic provinces, which enables investigations of how variation in environmental drivers and distinctive species compositions influence community structure, and its response to climate‐related environmental change across a portion of the California Current Large Marine Ecosystem. These data have been used to inform fisheries management, design and evaluate California's state‐wide network of marine protected areas (MPAs), and assess the ecological consequences of climate change (e.g., marine heatwaves). Over time, the spatial and temporal design of the monitoring program was adapted to fill its role in evaluating the ecological responses to the establishment of MPAs. There are no copyright restrictions; please cite this paper when data are used. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
29. Exploring the impacts of traffic flow states on freeway normal crashes, primary crashes, and secondary crashes.
- Author
-
Yang, Bo, Guo, Yanyong, Zhang, Weihua, Yao, Ying, and Wu, Yiping
- Subjects
ASSOCIATION rule mining ,PROPERTY damage ,EXPRESS highways ,TRAFFIC flow ,LANE changing - Abstract
This study aims to explore the relationship between traffic flow states and crash type/severity in the scenarios of normal crashes, primary crashes, and secondary crashes using the association rules mining approach. Five years of crash data and real‐time traffic data were collected from the I‐880 freeway in California, USA. The secondary crashes were identified using a speed contour plot approach. Traffic flow states were identified by the three‐phase flow theory. The results showed that the free flow is associated with the proportion of the sideswipe normal crash, the hit object primary crash, and the injury primary crash. The synchronized flow, the wide moving jams, and the transitional state from synchronized flow to wide moving jams are associated with the proportion of the rear‐end secondary crash. The transitional state from synchronized flow to free flow is associated with the proportion of the rear‐end primary crash and the property damage only primary crash. In addition, the unsafe speed behaviour can increase the proportion of the rear‐end normal, primary, and secondary crashes. The unsafe lane change behaviour can increase the proportion of the sideswipe normal, primary, and secondary crashes. These results could inform strategies to reduce the probability of secondary crashes. [ABSTRACT FROM AUTHOR]
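For readers unfamiliar with the method, association rule mining scores candidate rules by support, confidence, and lift over one-hot encoded records. A minimal sketch of those three quantities on toy rows (the I-880 data are not reproduced here, and the column names are hypothetical):

```python
import pandas as pd

# One-hot crash records: each row is a crash, each column an attribute.
crashes = pd.DataFrame({
    "synchronized_flow": [0, 0, 1, 1, 0, 1],
    "unsafe_speed":      [1, 0, 1, 1, 0, 1],
    "rear_end":          [1, 0, 1, 1, 0, 0],
}, dtype=bool)

def rule_stats(df, antecedent, consequent):
    """Support, confidence, and lift for the rule antecedent -> consequent."""
    a, c = df[antecedent], df[consequent]
    support = (a & c).mean()          # P(antecedent and consequent)
    confidence = support / a.mean()   # P(consequent | antecedent)
    lift = confidence / c.mean()      # enrichment over the base rate
    return support, confidence, lift

print(rule_stats(crashes, "unsafe_speed", "rear_end"))  # (0.5, 0.75, 1.5)
```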
- Published
- 2024
- Full Text
- View/download PDF
30. A Comprehensive Assessment of Submarine Landslides and Mass Wasting Processes Offshore Southern California.
- Author
-
Walton, Maureen A. L., Conrad, James E., Papesh, Antoinette G., Brothers, Daniel S., Kluesner, Jared W., McGann, Mary, and Dartnell, Peter
- Subjects
LANDSLIDES ,MARINE sediments ,GEOPHYSICAL surveys ,COASTAL sediments ,EARTHQUAKES ,SEDIMENTATION & deposition ,MARINE debris - Abstract
It is critical to characterize submarine landslide hazards near dense coastal populations, especially in areas with active faults, where earthquake shaking can trigger slope failure, generate tsunamis, and damage seabed infrastructure. Offshore southern California, numerous marine geophysical surveys have been conducted over the past decade, and high‐resolution bathymetric and subsurface data now cover about 60 percent of the total region between Point Conception and the United States‐Mexico border from the California coast out to the base of Patton Escarpment ∼200 km offshore. In a comprehensive compilation and interpretive mapping effort, we find evidence of seafloor failure throughout offshore southern California with nearly 1,500 submarine landslide‐related features, including 63 discrete slide deposits with debris and >1,400 slide‐related scarps. In our analysis, we highlight new mapping of submarine landslides in Catalina Basin, the Del Mar slide, the San Gabriel slide complex, and the 232 km² San Nicolas slide, the largest area of any known submarine landslide mass offshore southern California. Analysis of the spatial distribution of submarine landslide features suggests that most mapped slide features are located relatively near coastal sediment sources, particularly during sea‐level lowstand conditions, which underscores the importance of sediment supply and sediment accumulation on low‐gradient slopes as failure preconditioning processes. Tectonically driven uplift at shelf edges and along basin flanks is another key preconditioning factor, and our results also suggest that earthquakes along active faults trigger mass wasting, especially for repeated, small‐scale failures on tectonically steepened slopes. Plain Language Summary: Submarine landslides can damage seabed infrastructure such as cables and moorings, cause tsunamis, and be triggered by shaking from earthquakes. It is important to understand the risk of submarine landslides near dense coastal populations, particularly where earthquakes also pose hazards. Offshore southern California, we have new high‐resolution seafloor and subsurface imaging data that help us to identify submarine landslide deposits in the marine environment. In our study, we map and compile evidence for submarine landslides and find nearly 1,500 slide‐related features, 63 of which feature significant debris deposits. We describe some of the larger slides in this study for the first time, including submarine landslides in Catalina Basin, the Del Mar slide, the San Gabriel slide complex, and the 232 square kilometer San Nicolas slide, which is one of the largest known submarine landslide masses offshore southern California. Our work suggests that submarine landslide failure processes offshore southern California require a combination of (a) significant sediment supply, which is enhanced during low sea‐level conditions, (b) uplift and steepening along faults, and (c) earthquake shaking to trigger slide events. Key Points: Comprehensive analysis of submarine landslides in southern California provides new metrics on their size, distribution, timing, and geology. Submarine landslide failure processes are controlled by a combination of sediment deposition, tectonic uplift, and earthquake triggering. Small‐scale failures dominate steep areas near Quaternary faults; large slides tend to occur on lower slopes farther from faults [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. Health Impacts of Future Prescribed Fire Smoke: Considerations From an Exposure Scenario in California.
- Author
-
Rosenberg, Andrew, Hoshiko, Sumi, Buckman, Joseph R., Yeomans, Kirstin R., Hayashi, Thomas, Kramer, Samantha J., Huang, ShihMing, French, Nancy H. F., and Rappold, Ana G.
- Subjects
PRESCRIBED burning ,FIRE management ,WILDFIRE prevention ,EMERGENCY room visits ,FOREST fire management ,PARTICULATE matter ,FOREST management - Abstract
In response to increasing wildfire risks, California plans to expand the use of prescribed fire. We characterized the anticipated change in health impacts from exposure to smoke under a future fire‐management scenario relative to a historical period (2008–2016). Using dispersion models, we estimated daily fine particulate matter (PM2.5) emissions from hypothetical future prescribed fires on 500,000 acres classified as high priority. To evaluate health impacts, we calculated excess daily cardiorespiratory emergency department visit rates attributed to all‐source PM2.5, distinguishing the portion of the burden attributed to prescribed fire. The total burden was differentiated by fire type and by smoke strata‐specific days to calculate strata‐specific burden rates, which were then applied to estimate the burden in the future scenario. This analysis suggests that the exposure to prescribed fire smoke, measured as the number of persons exposed per year, would be 15 times greater in the future. However, these exposures were associated with lower concentrations compared to the historical period. The increased number of exposure days led to an overall increase in the future health burden. Specifically, the northern, central, and southern regions experienced the largest burden increase. This study introduces an approach that integrates spatiotemporal exposure differences, baseline morbidity, and population size to assess the impacts of prescribed fire under a future scenario. The findings highlight the need to consider both the level and frequency of exposure to guide strategies to safeguard public health as well as aid forest management agencies in making informed decisions to protect communities while mitigating wildfire risks. Plain Language Summary: Prescribed fire is a forest management strategy for reducing the risks of wildfires. While some fires are ecologically beneficial, smoke from fires is a major source of airborne particle pollution, which is harmful to human health. This study examined the change in health impacts resulting from an expected increase in the use of prescribed fire within California's high‐priority wildfire risk areas. We used daily counts of cardiorespiratory emergency department visits attributed to air quality combined with model‐generated measures of smoke pollution to estimate health impacts. We compared exposures and the associated health burden on days impacted by wildfire or prescribed fire smoke in the past to the impacts in the hypothetical future scenario with increased prescribed fire. Projections of future prescribed burning in high priority areas suggest that more people would experience smoke more often, although exposures would occur at lower concentrations. With more frequent lower‐level exposure days near populated areas, the health burden would increase relative to past prescribed fire. Understanding the potential impact of prescribed fire may simultaneously help protect public health and increase safety from wildfires. Key Points: A California‐based model of future prescribed burning in high‐priority wildfire risk areas suggested more people will experience smoke. An increased number of exposure days in the future scenario led to an overall increase in the future health burden. The excess future health burden was due to the cumulative impact of lower exposure days and high population density in high‐priority areas [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. Evaluating immaturity risk in young stands of the serotinous knobcone pine (Pinus attenuata).
- Author
-
Marlin, Katherine F., Greene, David F., Kane, Jeffrey M., Reilly, Matthew, and Madurapperuma, Buddhika D.
- Subjects
PINUS radiata ,PINE ,SEED viability ,CONIFERS ,CALIFORNIA wildfires ,PLANT populations - Abstract
As wildfire becomes increasingly frequent, many serotinous plant populations risk local extirpation if fire recurs prior to sufficient seed accumulation in the canopy (i.e., "immaturity risk"). Following two 2018 wildfires in northwestern California, we studied seed viability, cone production, and postfire regeneration of a serotinous conifer, knobcone pine (Pinus attenuata), with stand ages (time since fire) ranging from 6 to 79 years. Cone density per tree was more strongly associated with tree diameter than age, and cone density was positively related to postfire seedling regeneration. Most postfire knobcone pine regeneration established during the first postfire year, with high survivorship in the following year. Adjusting for survivorship, the estimated minimum age for knobcone pine to promote self‐replacement (one recruit per tree) was 9.5 years (or 4.6 cm dbh), and the probability of reburning at the modern fire rotation of 43 years was 19.8%. Based on our results, immaturity risk is currently low for knobcone pine. Our approach provides a quantitative method to assess immaturity risk in knobcone pine and other serotinous conifer species that can be used to evaluate future risk under rapidly changing climate and fire conditions. [ABSTRACT FROM AUTHOR]
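The quoted 19.8% reburn probability is consistent with an exponential (Poisson) fire-recurrence model, a standard way to convert a fire rotation into a reburn probability; whether the authors used exactly this form is an assumption here.

```python
import math

# P(reburn before stand age t) = 1 - exp(-t / rotation), assuming
# Poisson-distributed fire arrivals with a 43-year fire rotation.
t, rotation = 9.5, 43.0
print(1 - math.exp(-t / rotation))   # ~0.198, i.e., the 19.8% quoted above
```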
- Published
- 2024
- Full Text
- View/download PDF
33. Satellite Remote Sensing: A Tool to Support Harmful Algal Bloom Monitoring and Recreational Health Advisories in a California Reservoir.
- Author
-
Lopez Barreto, Brittany N., Hestir, Erin L., Lee, Christine M., and Beutel, Marc W.
- Subjects
ALGAL blooms ,REMOTE sensing ,BODIES of water ,CYANOBACTERIAL toxins ,EVAPOTRANSPIRATION ,MICROCYSTIS ,TOXIC algae ,MICROCYSTINS - Abstract
Cyanobacterial harmful algal blooms (cyanoHABs) can harm people and animals and affect consumptive and recreational use of inland waters. Monitoring cyanoHABs is often limited. However, chlorophyll‐a (chl‐a) is a common water quality metric and has been shown to have a relationship with cyanobacteria. The World Health Organization (WHO) recently updated its previous 1999 cyanoHAB guidance values (GVs) to be more practical by basing the GVs on chl‐a concentration rather than cyanobacterial counts. This creates an opportunity for widespread cyanoHAB monitoring based on chl‐a proxies, with satellite remote sensing (SRS) being a potentially powerful tool. We used Sentinel‐2 (S2) and Sentinel‐3 (S3) to map chl‐a and cyanobacteria, respectively, classified chl‐a values according to WHO GVs, and then compared them to cyanotoxin advisories issued by the California Department of Water Resources (DWR) at San Luis Reservoir, key infrastructure in California's water system. We found reasonably high rates of total agreement between advisories by DWR and SRS; however, rates of agreement for S2 varied by algorithm. Total agreement was 83% for S3, and 52%–79% for S2. False positive and false negative rates for S3 were 12% and 23%, respectively. S2 had a 12%–80% false positive rate and a 0%–38% false negative rate, depending on the algorithm. Using SRS‐based chl‐a GVs as an early indicator for possible exposure advisories and as a trigger for in situ sampling may be effective to improve public health warnings. Implementing SRS for cyanoHAB monitoring could fill temporal data gaps and provide greater spatial information not available from in situ measurements alone. Plain Language Summary: Lakes often have algal blooms that create a water quality concern, especially when they contain cyanobacteria, which can be toxic to both humans and animals. These harmful algal blooms are of great concern in areas with limited water supply in states such as California. While it is often difficult and costly to collect and monitor toxin concentrations, monitoring concentrations of chlorophyll‐a (chl‐a), a measure of how much algae are present, is relatively common and can even be accomplished using satellite remote sensing. There have been multiple studies that have found a relationship between toxins produced by cyanobacteria and chl‐a. The World Health Organization (WHO) recently (2021) updated its previous 1999 guidance values for toxin monitoring based on chl‐a concentration. With satellite data, we were able to measure chl‐a concentration in a major reservoir in California, and then classify the chl‐a measurements into the WHO's guidance values for toxins. We compared the satellite‐based guidance values to the public advisory levels currently set by the California Department of Water Resources. Our results indicate that SRS of chl‐a is a reasonable substitute for cyanobacteria toxin advisories, and our framework can be applied to similar cyanobacteria dominated lakes. Key Points: The World Health Organization (WHO) updated cyanobacteria harmful algal blooms (cyanoHABs) guidelines for chlorophyll‐a (chl‐a) as a proxy. With satellite remote sensing (SRS), we estimated and classified chl‐a to compare cyanotoxins advisories used by California. This study provides a framework for evaluating public health utility of SRS for enhancing cyanotoxin monitoring globally [ABSTRACT FROM AUTHOR]
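The agreement statistics above come from day-by-day comparison of paired advisory flags, which is a confusion-matrix calculation. A minimal sketch on toy arrays (not the San Luis Reservoir record; the paper's exact rate definitions may differ):

```python
import numpy as np

dwr = np.array([1, 1, 0, 0, 1, 0, 0, 1], dtype=bool)  # in situ advisories (toy)
srs = np.array([1, 0, 0, 1, 1, 0, 0, 1], dtype=bool)  # satellite exceedances (toy)

tp = np.sum(srs & dwr);  fp = np.sum(srs & ~dwr)
fn = np.sum(~srs & dwr); tn = np.sum(~srs & ~dwr)

total_agreement = (tp + tn) / dwr.size  # fraction of days the two agree
false_positive_rate = fp / (fp + tn)    # satellite flags, but no advisory
false_negative_rate = fn / (fn + tp)    # advisory days the satellite missed
print(total_agreement, false_positive_rate, false_negative_rate)
```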
- Published
- 2024
- Full Text
- View/download PDF
34. Researcher effects on the biological structure and edaphic conditions of field sites and implications for management.
- Author
-
Rinehart, Shelby A., Dybiec, Jacob M., Richardson, Parker, Walker, Janet B., Peabody, James D., and Cherry, Julia A.
- Subjects
MORPHOLOGY ,RESEARCH personnel ,ENVIRONMENTAL impact analysis ,SALT marshes ,FIELD research - Abstract
Field studies are necessary for understanding natural processes in spite of the human‐induced disturbances they cause. While researchers acknowledge these effects, no studies have empirically tested the direct (e.g., harvesting plants) and indirect (i.e., trampling) effects of researcher activities on biological structure and edaphic conditions. We leveraged field studies in Alabama and California to monitor the recovery of tidal marshes following research activities. Researcher effects on animals, plants, and sediment conditions remained prevalent almost one year after the disturbance ended. For instance, trampled plots had 14%–97% lower plant cover than undisturbed plots after >10 months of recovery. Researcher effects also impacted plant composition, leading to increased subordinate species abundance. We encourage field researchers to adopt strategies that reduce their scientific footprints, including reducing field visits, limiting field team size, and considering ways to limit potential environmental impacts during study design. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. What Are the Main Causes of Positive Subtropical Low Cloud Feedbacks in Climate Models?
- Author
-
Webb, Mark J., Lock, Adrian P., and Ogura, Tomoo
- Subjects
ATMOSPHERIC models ,CLIMATE feedbacks ,GLOBAL warming ,STRATOCUMULUS clouds ,HUMIDITY ,LATENT heat ,HEAT flux - Abstract
We investigate positive subtropical low cloud feedback mechanisms in climate models which have performed the CMIP6/CFMIP‐3 AMIP and AMIP uniform +4K experiments while saving CFMIP‐3 process diagnostics on model levels. Our analysis focuses on the trade cumulus/stratocumulus transition region between California and Hawaii, where positive low cloud feedbacks are present in the JJA season. We introduce a methodology to test various positive cloud feedback mechanisms proposed in the literature as the main causes of the low cloud responses in the models. Causal hypotheses are tested by comparing their predictions with the models' responses of clouds, cloud controlling factors, boundary layer depth and temperature/humidity tendencies to climate warming. Changes in boundary layer depth, relative humidity in the cloud layer, convective moistening rate and large‐scale humidity advection at the top of the boundary layer are shown to be crucial for identifying the main causes of the low cloud reductions in the models. For the cases examined, our approach narrows down the seven mechanisms considered to between one and three remaining candidates for each model. No single mechanism considered can explain the feedback in all of the models at the locations examined, but the surface latent heat flux/convective entrainment mechanism remains a candidate for BCC‐CSM2‐MR, IPSL‐CM6A‐LR, and MRI‐ESM2.0, while the surface upwelling longwave mechanism remains for CESM2, HadGEM3‐GC3.1‐LL, and MIROC6. Plain Language Summary: Climate models show reductions in low‐level clouds with the warming climate which are poorly understood. We examine cloud changes between California and Hawaii in six climate models. We consider seven potential explanations for the changes. We find that examining changes in the height of low level clouds, the moistening of the atmosphere by rising plumes of moist air, the humidity of the air and the rate at which dry air is mixed into the clouds from above allows us to narrow down the number of explanations compatible with each model. Key Points: Hypotheses for positive cumulus/stratocumulus feedback mechanisms are tested in six climate models. We narrow down the seven mechanisms considered to between one and three candidates for each model. Boundary layer depth, relative humidity, convective moistening, and humidity advection are key [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
36. Mafic Alkaline Magmatism and Rare Earth Element Mineralization in the Mojave Desert, California: The Bobcat Hills Connection to Mountain Pass.
- Author
-
Watts, K. E., Miller, D. M., and Ponce, D. A.
- Subjects
RARE earth metals ,PASSING (Football) ,BOBCAT ,MAGMATISM ,DIKES (Geology) ,OXYGEN isotopes ,DESERTS - Abstract
Occurrences of alkaline and carbonatite rocks with high concentrations of rare earth elements (REE) are a defining feature of Precambrian geology in the Mojave Desert of southeastern California. The most economically important occurrence is the carbonatite stock at Mountain Pass, which constitutes the largest REE deposit in the United States. A central scientific goal is to understand the genesis of the carbonatite ore body in the context of widespread REE‐rich igneous activity. A swarm of mafic alkaline (shonkinite) dikes has been mapped and sampled at Bobcat Hills, 65 km southeast of the Mountain Pass mine. Whole‐rock geochemistry and zircon geochronology demonstrate a clear affinity to the ca. 1.4 Ga Mountain Pass intrusive system. Bobcat Hills dikes have comparably high REE concentrations (La ∼1,000× chondritic) and an error‐weighted mean ²⁰⁷Pb/²⁰⁶Pb zircon crystallization age of 1,426 ± 2 Ma (2σ). Unlike the alkaline intrusions at Mountain Pass, which have abundant inherited zircon from Paleoproterozoic basement rocks and crustally influenced oxygen isotope compositions (δ¹⁸Ozircon = 6.5–7.5‰), the Bobcat Hills dikes lack any evidence of crustal assimilation and have oxygen isotope values that overlap a mantle range (Bobcat Hills average δ¹⁸Ozircon = 5.6 ± 0.3‰). The dikes were a high‐temperature, early center of mafic alkaline magmatism in the Mojave Desert that serve as a snapshot of melt generation from a spatially extensive, metasomatized mantle source. We propose that modification of the crust over many tens of Myr at Mountain Pass created an environment that favored crustal assimilation and enabled ascent of late‐stage, REE‐rich carbonatite magmas. Key Points: The Bobcat Hills site expands the distribution of REE‐rich mafic alkaline magmatism 65 km southeast of the Mountain Pass mine. Shonkinite dikes at Bobcat Hills are physically and chemically similar to mafic intrusions at Mountain Pass and have comparably high REE. Bobcat Hills was an early center of REE‐rich magmatism in the Mojave Desert, with a zircon U‐Pb crystallization age of 1,426 ± 2 Ma [ABSTRACT FROM AUTHOR]
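The error-weighted mean age quoted above is, in standard geochronology practice, an inverse-variance weighted mean of single-grain analyses. A minimal sketch with toy ages (not the Bobcat Hills measurements):

```python
import numpy as np

def error_weighted_mean(ages, sigmas):
    """Inverse-variance weighted mean and its 1-sigma uncertainty."""
    ages = np.asarray(ages, float)
    w = 1.0 / np.asarray(sigmas, float) ** 2
    return np.sum(w * ages) / np.sum(w), np.sqrt(1.0 / np.sum(w))

# Toy single-grain 207Pb/206Pb ages (Ma) with 1-sigma uncertainties:
print(error_weighted_mean([1425.0, 1427.0, 1426.5], [3.0, 4.0, 2.5]))
```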
- Published
- 2024
- Full Text
- View/download PDF
37. Evaluation of Different Bias Correction Methods for Dynamical Downscaled Future Projections of the California Current Upwelling System.
- Author
-
Pozo Buil, Mercedes, Fiechter, Jerome, Jacox, Michael G., Bograd, Steven J., and Alexander, Michael A.
- Subjects
DOWNSCALING (Climatology) ,GLOBAL modeling systems ,EFFECT of human beings on climate change ,ECOSYSTEMS ,CLIMATE change - Abstract
Biases in global Earth System Models (ESMs) are an important source of errors when used to obtain boundary conditions for regional models. Here we examine historical and future conditions in the California Current System (CCS) using three different methods to force the regional model: (a) interpolation of ESM output to the regional grid with no bias correction; (b) a "seasonally‐varying" delta method that obtains a season‐dependent mean climate change signal from the ESM for a 30‐year future period; and (c) a "time‐varying" delta method that includes the interannual variability of the ESM over the 1980–2100 period. To compare these methods, we use a high‐resolution (0.1°) physical‐biogeochemical regional model to dynamically downscale an ESM projection under the RCP8.5 emission scenario. Using different downscaling methods, the sign of future changes agrees for most of the physical and ecosystem variables, but the spatial patterns and magnitudes of these changes differ, with the seasonal‐ and time‐varying delta simulations showing more similar changes. Not correcting the ESM forcing leads to amplification of biases in some ecosystem variables as well as misrepresentation of the California Undercurrent and CCS source waters. In the non‐bias corrected and time‐varying delta simulations, most of the ecosystem variables inherit trends and decadal variability from the ESM, while in the seasonally‐varying delta simulation the future variability reflects the observed historical variability (1980–2010). Our results demonstrate that bias correcting the forcing prior to downscaling improves historical simulations, and that the bias correction method may impact the spatial and temporal variability of the future projections. Plain Language Summary: Global Earth System Models (ESMs) are important tools to understand Earth's processes and project how they will change over time in response to anthropogenic activity and changing climate conditions. However, ESMs have limited capacity to resolve coastal processes at sufficiently high resolution (e.g., <50 km) and often show regional biases when they are compared to observed data. To address the resolution issue, ESMs can be used as input to force high‐resolution models, and their biases with respect to observed data can be reduced by applying bias‐correction methods. This process, called dynamical downscaling, is widely used, but the implications of different methodological choices during downscaling have not been adequately explored. In this article, we compare three different pre‐processing methods (two with bias correction and one without) prior to dynamical downscaling to simulate present and future conditions in the California Current System. We evaluate the performance of each method by comparing with observed data and assessing how they reproduce future changes and variability. We find that bias correcting the ESM data before forcing the ocean model is key to reducing ESM biases and resolving coastal processes. Results here will help to guide methodological choices when projecting climate change using high‐resolution ocean models to resolve coastal processes. Key Points: Bias correcting forcing prior to downscaling is key to reducing historical biases and resolving critical ecosystem‐relevant coastal processes. Different bias correction methods produced similar mean and seasonal changes of ecosystem variables in downscaled projections. Using forcing without bias correction amplified the historical bias of some variables and produced a misrepresentation of the California Undercurrent [ABSTRACT FROM AUTHOR]
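For readers unfamiliar with delta-method bias correction, the core operation adds the ESM's mean climate-change signal to observed forcing, computed per month or season in the seasonally varying variant. A minimal monthly sketch follows; shapes, values, and variable names are illustrative, and the study applies the idea to full daily boundary-condition fields.

```python
import numpy as np

def seasonal_delta(obs_hist, esm_hist, esm_future):
    """Seasonally varying delta method: add the ESM's per-month mean change
    to observed forcing, preserving observed variability.
    Inputs are (years, 12) arrays of monthly means."""
    delta = esm_future.mean(axis=0) - esm_hist.mean(axis=0)
    return obs_hist + delta

rng = np.random.default_rng(1)
obs = rng.normal(15.0, 2.0, size=(30, 12))    # observed monthly SST (toy)
hist = rng.normal(14.0, 2.0, size=(30, 12))   # ESM historical run (biased cold)
future = hist + 3.0                           # ESM projects 3 degrees of warming
print(seasonal_delta(obs, hist, future).mean() - obs.mean())  # ~3.0
```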
- Published
- 2023
- Full Text
- View/download PDF
38. Atmospheric River Sequences as Indicators of Hydrologic Hazard in Historical Reanalysis and GFDL SPEAR Future Climate Projections.
- Author
-
Bowers, C., Serafin, K. A., Tseng, K.‐C., and Baker, J. W.
- Subjects
ATMOSPHERIC rivers ,GEOPHYSICAL fluid dynamics ,HAZARD mitigation ,FLOOD damage ,FLOODS ,STORMS ,HAZARDS ,FLOOD risk - Abstract
When multiple atmospheric rivers (ARs) occur in rapid succession, the combined effect on the hydrologic system can lead to more flooding and damage than would be expected from the individual events. This temporally compounding risk is a source of growing concern for water managers in California. We present a novel moving average‐based definition of AR "sequences" that identifies the time periods of elevated hydrologic hazard that occur during and after consecutive AR events. This marks the first quantitative evaluation of when temporal compounding is contributing to AR flood risk. We also assess projected changes in sequence frequency, intensity, and duration in California using the Geophysical Fluid Dynamics Laboratory Seamless System for Prediction and EArth system Research (GFDL SPEAR) global coupled model. Sequence frequency increases over time and is fairly uniform across the state under both intermediate (SSP2–4.5) and very high (SSP5–8.5) emissions scenarios, with the largest changes occurring by the end of the century (+0.72 sequences/year in SSP2–4.5, +1.13 sequences/year in SSP5–8.5). Sequence intensity and duration both see increases in the medians and extreme values of their respective distributions relative to the historical baselines. In particular, "super‐sequence" events longer than 60 days are projected to occur 2–3× more frequently and to emerge in places that have never seen them in the historical record. In a world where California precipitation is becoming more variable, our definition of sequences will help identify when and where hydrologic impacts will be most extreme, which can in turn support better management of the state's highly variable water resources and inform future flood mitigation strategies. Plain Language Summary: Atmospheric rivers (ARs) are a type of storm that are vital to water resources in the western United States, but can also cause significant flooding and damage. Back‐to‐back AR events have historically been a source of concern for water managers because the compound effect of multiple events together can increase the probability of damaging floods. We present a definition of AR "sequences" that identifies periods of time where the likelihood of compound effects is increased. We look at the relationship between sequences, runoff, and soil moisture in California and show that sequences are in fact aligning with time windows of elevated hydrologic hazard in the historical record. We then look at sequences in two future climate projections and find that sequence frequency, intensity, and duration are all projected to increase with increasing emissions levels. In particular, "super‐sequences" more than 60 days long are projected to become two to three times more frequent across all of California. Our definition of sequences captures and communicates new information about the risk associated with temporally compounding hydrologic events in present and future climates. Key Points: We introduce atmospheric river (AR) sequences as a way to measure the hydrologic hazard from temporally compounding (back‐to‐back) ARs. AR sequences in Geophysical Fluid Dynamics Laboratory Seamless System for Prediction and EArth system Research (GFDL SPEAR) model projections increase in frequency, intensity, and duration in California by the end of the century. "Super‐sequences" over 60 days long drive the projected increase in frequency and present a growing water management threat in California [ABSTRACT FROM AUTHOR]
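The sequence definition above is moving-average based: days belong to a sequence while smoothed AR activity stays elevated. A minimal sketch of that idea (the window length and threshold are illustrative placeholders, not the paper's calibrated definition):

```python
import numpy as np

def ar_sequences(ar_days, window=30, threshold=0.25):
    """Flag days whose trailing moving average of AR occurrence exceeds a
    threshold, marking periods of temporally compounding hazard."""
    kernel = np.ones(window) / window
    activity = np.convolve(ar_days.astype(float), kernel)[: len(ar_days)]
    return activity > threshold

rng = np.random.default_rng(0)
days = rng.random(365) < 0.1   # synthetic daily AR occurrence record
print(int(ar_sequences(days).sum()), "days fall inside AR sequences")
```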
- Published
- 2023
- Full Text
- View/download PDF
39. Origins of Uncertainty in the Response of the Summer North Pacific Subtropical High to CO2 Forcing.
- Author
-
Lu, Kezhou, He, Jie, and Simpson, Isla R.
- Subjects
LAND surface temperature ,TYPHOONS ,OCEAN temperature ,ATMOSPHERIC models ,SUMMER ,THEORY of wave motion - Abstract
The variability of the summer North Pacific Subtropical High (NPSH) has substantial socioeconomic impacts. However, state‐of‐the‐art climate models significantly disagree on the response of the NPSH to anthropogenic warming. Inter‐model spread in NPSH projections originates from models' inconsistency in simulating tropical precipitation changes. This inconsistency in precipitation changes is partly due to inter‐model spread in tropical sea surface temperature (SST) changes, but it can also occur independently of uncertainty in SST changes. Here, we show that both types of precipitation uncertainty influence the NPSH via the Matsuno‐Gill wave response, but their relative impact varies by region. Through the modulation of low cloud fraction, inter‐model spread of the NPSH can have a further impact on extra‐tropical land surface temperature. The teleconnection between tropical precipitation and the NPSH is examined through a series of numerical experiments. Plain Language Summary: The North Pacific Subtropical High (NPSH) is a semi‐permanent high‐pressure system located in the subtropical North Pacific. The variability in the summer NPSH has a significant impact on the monsoon and typhoons over East Asia and the hydroclimate of California. However, future projections of the NPSH using state‐of‐the‐art climate models remain highly uncertain. By evaluating how much individual models deviate from the multi‐model mean at different locations, we find four hot spots of high uncertainty in NPSH projections. Our analysis further reveals that the primary source of model variance in changes in the NPSH is tropical precipitation, which can be attributed to both inter‐model SST‐driven and non‐inter‐model SST‐driven factors. Through numerical experiments, we demonstrate that the teleconnection between tropical precipitation and the NPSH is achieved through wave propagation. Key Points: Model spread in the response of the summer North Pacific Subtropical High (NPSH) to CO2 stems from model spread in simulating tropical processes. Model spread in tropical sea surface temperature (SST) changes modulates the NPSH by influencing tropical precipitation. Model spread in tropical precipitation changes independent of model spread in SST changes also adds to the uncertainty of the NPSH response [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
40. Understanding the Contributions of Paleo‐Informed Natural Variability and Climate Changes to Hydroclimate Extremes in the San Joaquin Valley of California.
- Author
-
Gupta, Rohini S., Steinschneider, Scott, and Reed, Patrick M.
- Subjects
CLIMATE change ,EFFECT of human beings on climate change ,CLIMATE extremes ,FLOOD risk ,DROUGHTS ,HYDROLOGIC models - Abstract
To help California's water sector better understand and manage future climate extremes, we present a method for creating a regionally consistent ensemble of plausible daily future climate and streamflow scenarios that represent natural climate variability captured in a network of tree‐ring chronologies, and then embed anthropogenic climate change trends within those scenarios. We use 600 years of paleo‐reconstructed weather regimes to force a stochastic weather generator, which we develop for five subbasins in the San Joaquin Valley of California. To assess the compound effects of climate change, we create temperature series that reflect projected scenarios of warming and precipitation series that have been scaled to reflect thermodynamically driven shifts in the distribution of daily precipitation. We then use these weather scenarios to force hydrologic models for each of the five subbasins. The paleo‐forced streamflow scenarios highlight periods in the region's past that produce flood and drought extremes that surpass those in the modern record and exhibit large non‐stationarity through the reconstruction. Variance decomposition is employed to characterize the contribution of natural variability and climate change to variability in decision‐relevant metrics related to floods and drought. Our results show that a large portion of variability in individual subbasin and spatially compounding extreme events can be attributed to natural variability, but that anthropogenic climate changes become more influential at longer planning horizons. The joint importance of climate change and natural variability in shaping extreme floods and droughts is critical to resilient water systems planning and management in the San Joaquin. Plain Language Summary: California experiences cycles of floods and droughts that can be driven by both natural variability and climate change. The specific role these drivers play in impacting extremes is uncertain, but can influence how to best plan and manage regional water systems for future extremes. To better quantify the role of these drivers, we introduce a framework that utilizes a 600‐year tree‐ring reconstruction to create long sequences of plausible future weather and streamflow for key basins in the San Joaquin Valley. We find that a large portion of variability in extremes can be attributed to natural variability at shorter planning horizons, but that human‐driven climate changes are influential at longer planning horizons (>30 years). Furthermore, decision‐makers' perceptions of important drivers can be skewed depending on the specific definitions used to analyze floods and droughts, which can present significant challenges for adaptation planning and infrastructure development tied to tracking hydroclimate variables. This study also illustrates the vast variability in extremes that the region has experienced over the past 600 years and highlights the pitfalls of defining risk based on a limited historical record. Key Points: We introduce a framework to create 600‐year ensembles of future weather and streamflow for basins in the San Joaquin Valley. We discover vast variability and non‐stationarity in flood and drought extremes in the region over the past 600 years. The joint importance of climate change and natural variability in shaping floods and droughts is critical to water systems planning [ABSTRACT FROM AUTHOR]
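The variance decomposition step can be pictured as a law-of-total-variance split across an ensemble: spread between climate-change scenarios versus spread among natural-variability realizations within a scenario. A minimal sketch on toy numbers (not the San Joaquin results):

```python
import numpy as np

rng = np.random.default_rng(2)
# metric[i, j]: a flood or drought metric for natural-variability
# realization i under climate-change scenario j (toy values).
metric = rng.normal(size=(100, 5)) + np.linspace(0.0, 1.0, 5)

var_climate = metric.mean(axis=0).var()  # between-scenario component
var_natural = metric.var(axis=0).mean()  # within-scenario component
print(var_climate, var_natural, metric.var())  # components sum to the total
```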
- Published
- 2023
- Full Text
- View/download PDF
41. How environmental drivers of spatial synchrony interact.
- Author
-
Reuman, Daniel C., Castorani, Max C. N., Cavanaugh, Kyle C., Sheppard, Lawrence W., Walter, Jonathan A., and Bell, Tom W.
- Subjects
SYNCHRONIC order ,GIANT kelp ,POPULATION dynamics ,POPULATION ecology ,ENDANGERED species ,ECOSYSTEMS - Abstract
Spatial synchrony, the tendency for populations across space to show correlated fluctuations, is a fundamental feature of population dynamics, linked to central topics of ecology such as population cycling, extinction risk, and ecosystem stability. A common mechanism of spatial synchrony is the Moran effect, whereby spatially synchronized environmental signals drive population dynamics and hence induce population synchrony. After reviewing recent progress in understanding Moran effects, we here elaborate a general theory of how Moran effects of different environmental drivers acting on the same populations can interact, either synergistically or destructively, to produce either substantially more or markedly less population synchrony than would otherwise occur. We provide intuition for how this newly recognized mechanism works through theoretical case studies and application of our theory to California populations of giant kelp. We argue that Moran interactions should be common. Our theory and analysis explain an important new aspect of a fundamental feature of spatiotemporal population dynamics. [ABSTRACT FROM AUTHOR]
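The Moran effect itself is easy to demonstrate numerically: two populations with identical density dependence, forced by spatially correlated noise, inherit the correlation of the driver. A minimal sketch, using AR(1) dynamics as an illustrative stand-in for the population models discussed above:

```python
import numpy as np

rng = np.random.default_rng(3)
b, rho, T = 0.5, 0.7, 20000   # density dependence, noise correlation, steps
noise = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=T)

x = np.zeros((T, 2))          # two populations with AR(1) dynamics
for t in range(1, T):
    x[t] = b * x[t - 1] + noise[t]

# Population synchrony mirrors the environmental correlation (~0.7):
print(np.corrcoef(x[:, 0], x[:, 1])[0, 1])
```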
- Published
- 2023
- Full Text
- View/download PDF
42. Recreating the California New Year's Flood Event of 1997 in a Regionally Refined Earth System Model.
- Author
-
Rhoades, Alan M., Zarzycki, Colin M., Inda‐Diaz, Héctor A., Ombadi, Mohammed, Pasquier, Ulysse, Srivastava, Abhishekh, Hatchett, Benjamin J., Dennis, Eli, Heggli, Anne, McCrary, Rachel, McGinnis, Seth, Rahimi‐Esfarjani, Stefan, Slinskey, Emily, Ullrich, Paul A., Wehner, Michael, and Jones, Andrew D.
- Subjects
NEW Year ,EXTREME weather ,WATERLOGGING (Soils) ,STORMS ,FLOOD risk ,ATMOSPHERIC circulation - Abstract
The 1997 New Year's flood event was the most costly in California's history. This compound extreme event was driven by a category 5 atmospheric river that led to widespread snowmelt. Extreme precipitation, snowmelt, and saturated soils produced heavy runoff causing widespread inundation in the Sacramento Valley. This study recreates the 1997 flood using the Regionally Refined Mesh capabilities of the Energy Exascale Earth System Model (RRM‐E3SM) under prescribed ocean conditions. Understanding the processes causing extreme events informs practical efforts to anticipate and prepare for such events in the future, and also provides a rich context to evaluate model skill in representing extremes. Three California‐focused RRM grids, with horizontal resolution refinement of 14 km down to 3.5 km, and six forecast lead times, 28 December 1996 at 00Z through 30 December 1996 at 12Z, are assessed for their ability to recreate the 1997 flood. Planetary to synoptic scale atmospheric circulations and integrated vapor transport are weakly influenced by horizontal resolution refinement over California. Topography and mesoscale circulations, such as the Sierra barrier jet, are better represented at finer horizontal resolutions resulting in better estimates of storm total precipitation and storm duration snowpack changes. Traditional time‐series and causal analysis frameworks are used to examine runoff sensitivities state‐wide and above major reservoirs. These frameworks show that horizontal resolution plays a more prominent role in shaping reservoir inflows, namely the magnitude and time‐series shape, than forecast lead time, 2‐to‐4 days prior to the 1997 flood onset. Plain Language Summary: The 1997 California New Year's flood event caused over a billion dollars in damages. This storm became a central part in guiding efforts to reduce flood risks. Earth system models are increasingly asked to recreate extreme weather events. However, the ability of Earth system models to recreate such events requires rigorous testing. Testing ensures that models provide value in anticipating and planning for future flood events. This is particularly important given the changing climate. We evaluated the Department of Energy's flagship Earth system model, the Energy Exascale Earth System Model, in its ability to recreate the weather and flood characteristics of the 1997 flood. The model resolution, important for resolving mountain terrain and storm interactions, and forecast lead time, important for storm progression accuracy, are assessed. The multi‐forecast average from the highest‐resolution model best recreates the observed precipitation, snowpack changes, and flood characteristics. Our findings provide confidence that the highest resolution model could be used to study how a 1997‐like flood event would be altered in a warmer world. Key Points: Energy Exascale Earth System Model forecasts at 3.5 km grid spacing skillfully recreate the hydrometeorology of California's 1997 flood. Horizontal resolution alters the representation of key flood drivers such as the Sierra barrier jet, precipitation extremes, and snowmelt. Forecast lead time 2‐to‐4 days prior to the onset of the 1997 flood minimally influences forecast precipitation and snowmelt skill [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
43. Exome sequencing findings in children with annular pancreas.
- Author
-
Pitsava, Georgia, Pankratz, Nathan, Lane, John, Yang, Wei, Rigler, Shannon, Shaw, Gary M., and Mills, James L.
- Subjects
GENETIC variation ,MISSENSE mutation ,ECTOPIC tissue ,CELL migration ,HUMAN abnormalities ,PANCREAS ,MESENTERIC artery - Abstract
Background: Annular pancreas (AP) is a congenital defect of unknown cause in which the pancreas encircles the duodenum. Theories include abnormal migration and rotation of the ventral bud, persistence of ectopic pancreatic tissue, and inappropriate fusion of the ventral and dorsal buds before rotation. The few reported familial cases suggest a genetic contribution. Methods: We conducted exome sequencing in 115 affected infants from the California birth defects registry. Results: Seven cases had a single heterozygous missense variant in IQGAP1, five of them with CADD scores >20; seven other infants had a single heterozygous missense variant in NRCAM, five of them with CADD scores >20. We also examined genes previously associated with AP and found two rare heterozygous missense variants, one each in PDX1 and FOXF1. Conclusion: IQGAP1 and NRCAM are crucial in cell polarization and migration. Mutations in these genes decrease cell motility, which could prevent the ventral bud from migrating normally. To our knowledge, this is the first study reporting a possible association of IQGAP1 and NRCAM with AP. Our finding of rare genetic variants involved in cell migration in 15% of our population raises the possibility that AP is related to abnormal cell migration. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
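To make the variant‐prioritization logic in entry 43 concrete, here is a minimal Python sketch of that kind of filter: heterozygous missense variants in candidate genes with CADD scores above 20. The record layout, field names, and example values are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch of the filtering step in entry 43: keep heterozygous
# missense variants in candidate genes with CADD > 20. Field names and the
# input records are assumptions for illustration only.
CANDIDATE_GENES = {"IQGAP1", "NRCAM", "PDX1", "FOXF1"}
CADD_THRESHOLD = 20.0

def is_candidate(variant: dict) -> bool:
    """Return True if a variant passes the illustrative filtering criteria."""
    return (
        variant["gene"] in CANDIDATE_GENES
        and variant["consequence"] == "missense"
        and variant["genotype"] == "het"            # heterozygous carrier
        and variant["cadd_phred"] > CADD_THRESHOLD  # predicted deleteriousness
    )

variants = [
    {"gene": "IQGAP1", "consequence": "missense", "genotype": "het", "cadd_phred": 24.1},
    {"gene": "NRCAM", "consequence": "synonymous", "genotype": "het", "cadd_phred": 3.2},
]
hits = [v for v in variants if is_candidate(v)]
print(f"{len(hits)} candidate variant(s) retained")
```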
44. Interacting ecological filters influence success and functional composition in restored plant communities over time.
- Author
-
Funk, Jennifer L., Kimball, Sarah, Nguyen, Monica A., Lulow, Megan, and Vose, Gregory E.
- Subjects
CHEMICAL composition of plants ,PLANT communities ,COMPOSITION of seeds ,RESTORATION ecology ,LEAF area - Abstract
A trait‐based community assembly framework has great potential to direct ecological restoration, but uncertainty over how traits and environmental factors interact to influence community composition over time limits the widespread application of this approach. In this study, we examined how the composition of seed mixes and environment (north‐ vs. south‐facing slope aspect) influence functional composition and native plant cover over time in restored grassland and shrubland communities. Variation in native cover over 4 years was primarily driven by species mix, slope aspect, and a species mix by year interaction, rather than by an interaction between species mix and slope aspect as predicted. Although native cover was higher on wetter, north‐facing slopes for most of the study, south‐facing slopes achieved a similar cover (65%–70%) by year 4. While community‐weighted mean (CWM) values generally became more resource‐conservative over time, we found shifts in particular traits across community types and habitats. For example, CWM for specific leaf area increased over time in grassland mixes. Belowground, CWM for root mass fraction increased while CWM for specific root length decreased across all seed mixes. Multivariate functional dispersion remained high in shrub‐containing mixes throughout the study, which could enhance invasion resistance and recovery following disturbance. Functional diversity and species richness were initially higher on drier, south‐facing slopes than on north‐facing slopes, but these metrics were similar across north‐ and south‐facing slopes by the end of the 4‐year study. Our finding that different combinations of traits were favored on south‐ and north‐facing slopes and over time demonstrates that trait‐based approaches can be used to identify good restoration candidate species and, ultimately, enhance native plant cover across community types and microhabitats. Changing the composition of planting mixes based on traits could be a useful strategy for restoration practitioners to match species to specific environmental conditions, and may be more informative than using seed mixes based on growth form, as species within functional groups can vary tremendously in leaf and root traits. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
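As background for the community‐weighted mean (CWM) metrics in entry 44: the CWM of a trait is conventionally the abundance‐weighted average of species trait values, CWM = Σᵢ pᵢ tᵢ, where pᵢ is species i's relative abundance and tᵢ its trait value. A minimal Python sketch follows; the species names and values are illustrative, not data from the study.

```python
# Community-weighted mean (CWM): abundance-weighted average of a trait.
# CWM = sum_i(p_i * t_i), where p_i is species i's relative abundance and
# t_i its trait value (here, specific leaf area). All values are illustrative.

def cwm(abundances: dict, traits: dict) -> float:
    """Compute the community-weighted mean of a trait."""
    total = sum(abundances.values())
    return sum((n / total) * traits[sp] for sp, n in abundances.items())

abundance = {"Stipa pulchra": 40, "Artemisia californica": 25, "Encelia californica": 35}
sla = {"Stipa pulchra": 12.3, "Artemisia californica": 8.1, "Encelia californica": 10.6}  # mm^2/mg

print(f"CWM specific leaf area: {cwm(abundance, sla):.2f} mm^2/mg")
```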
45. Precipitation timing and soil substrate drive phenology and fitness of Arabidopsis thaliana in a Mediterranean environment.
- Author
-
Martínez‐Berdeja, Alejandra, Okada, Miki, Cooper, Martha D., Runcie, Daniel E., Burghardt, Liana T., and Schmitt, Johanna
- Subjects
FLOWERING time ,AUTUMN ,ARABIDOPSIS thaliana ,PLANT phenology ,PHENOLOGY ,SPRING - Abstract
Copyright of Functional Ecology is the property of Wiley-Blackwell and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2023
- Full Text
- View/download PDF
46. Latitudinal and elevational variation in the reproductive biology of house wrens, Troglodytes aedon.
- Author
-
Levin, Rachel N., Correa, Stephanie M., Freund, Kate A., and Fuxjager, Matthew J.
- Subjects
LIFE history theory ,WRENS ,CHICKS ,ANIMAL clutches ,BIOLOGY - Abstract
While cross‐species comparisons of birds suggest that as latitude decreases or elevation increases, clutch size decreases and the duration of developmental stages and parental attentiveness increase, studies comparing populations of the same species are rare. We studied populations of house wrens, Troglodytes aedon, at high and low elevations in California and Costa Rica, collecting data on clutch size, the duration of incubation and nestling periods, parental attentiveness, nestling growth rate, and nesting success. Our data support results from cross‐species comparisons but also reveal unanticipated patterns in low‐elevation temperate‐zone house wrens in the southwestern US. This population had prolonged incubation and nestling periods similar to those found in the tropics. We also found that temperate‐zone females, especially those at our higher‐elevation site, spent more of their day incubating than did tropical females. Nest temperature at our high‐elevation temperate‐zone site was higher than that at all of the tropical sites. Age at fledging did not differ between sites. Total feeding rates per chick and male feedings per chick did not vary between sites. Nest success rates showed the predicted effect of latitude, but not the predicted effects of elevation. Our results extend low‐elevation house wren research into the southwestern US and contribute the first intraspecific elevational comparison in the Neotropics. Data from our low‐elevation southwestern site reveal a unique suite of life history traits that aligns more closely with tropical house wrens, although with a larger clutch size, and point to food limitation and/or high predation pressure as possible drivers of some of these differences. These results highlight the need for additional studies of house wrens and other broadly distributed species at a more diverse array of sites to better understand which forces drive the evolution of different life history strategies across major biogeographical gradients. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
47. Hillslope Morphology Drives Variability of Detrital 10Be Erosion Rates in Steep Landscapes.
- Author
-
DiBiase, Roman A., Neely, Alexander B., Whipple, Kelin X., Heimsath, Arjun M., and Niemi, Nathan A.
- Subjects
EROSION ,COSMOGENIC nuclides ,WATERSHEDS ,LANDSCAPES ,ROCK slopes ,MASS-wasting (Geology) ,GEOMORPHOLOGY - Abstract
The connection between topography and erosion rate is central to understanding landscape evolution and sediment hazards. However, investigation of this relationship in steep landscapes has been limited due to expectations of: (a) decoupling between erosion rate and "threshold" hillslope morphology; and (b) bias in detrital cosmogenic nuclide erosion rates due to deep‐seated landslides. Here we compile 120 new and published 10Be erosion rates from catchments in the San Gabriel Mountains, California, and show that hillslope morphology and erosion rate are coupled for slopes approaching 50° due to progressive exposure of bare bedrock with increasing erosion rate. We find no evidence for drainage area dependence in 10Be erosion rates in catchments as small as 0.09 km2, and we show that landslide deposits influence erosion rate estimates mainly by adding scatter. Our results highlight the potential and importance of sampling small catchments to better understand steep hillslope processes. Plain Language Summary: In general, erosion rates increase as landscapes steepen. But where landslides are common, this relationship is thought to break down as hillslopes approach their angle of repose. The main tracer for measuring erosion rates, 10Be in sediment, can also be affected by landslides, and models predict it is unreliable for small watersheds in steep landscapes. Here, we compile an extensive data set of 10Be erosion rates from the San Gabriel Mountains of California. We show that slope and erosion rate are coupled well above the soil angle of repose due to systematic exposure of bedrock cliffs, supporting a new conceptual model for steep landscapes. The presence of landslides adds scatter but does not bias 10Be erosion rates, which yield robust results even in small, steep watersheds that have previously been avoided. Key Points: Progressive exposure of bare rock on steeper slopes leads to correlation of 10Be erosion rate and mean hillslope angle up to 47°; deep‐seated landslide deposits add scatter, but do not systematically bias 10Be erosion rate estimates in the San Gabriel Mountains; no evidence for drainage area dependence of 10Be erosion rates in upland catchments. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
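For context on how the detrital 10Be concentrations in entry 47 translate into erosion rates: under the standard steady‐state approximation from the cosmogenic nuclide literature (the conventional relation, not necessarily the exact formulation used by these authors), the catchment‐averaged erosion rate is

$$ \varepsilon \;=\; \frac{\Lambda}{\rho}\left(\frac{P_0}{N} - \lambda\right) \;\approx\; \frac{\Lambda\, P_0}{\rho\, N}, $$

where \(N\) is the measured 10Be concentration (atoms g⁻¹), \(P_0\) the surface production rate, \(\lambda\) the 10Be decay constant, \(\rho\) the rock density, and \(\Lambda\) the attenuation length for production (roughly 160 g cm⁻²). The approximation on the right holds when erosion outpaces radioactive decay, as in steep, rapidly eroding catchments like the San Gabriel Mountains, so faster‐eroding basins carry sediment with lower 10Be concentrations.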
48. Implementing youth participatory action research at a continuation high school.
- Author
-
Lárez, Natalie A., Sharkey, Jill D., Frattaroli, Shannon, Avila, Elena, and Medina, Andres
- Subjects
COMMUNITY-based participatory research ,HIGH schools ,EDUCATIONAL evaluation ,SCHOOL credits ,EDUCATIONAL change - Abstract
Objective: To describe the process of implementing a Youth Participatory Action Research (YPAR) project at a continuation high school (CHS) and to share the results of a youth‐designed research project exploring barriers to high school completion. Data Sources and Study Setting: YPAR was implemented across three cohorts at a CHS on the central coast of California from 2019 to 2022. Student survey respondents were enrolled CHS students between March and April 2021. Study Design: A modified YPAR curriculum integrating research methodology and social justice topics was used to guide student‐led research that resulted in a cross‐sectional survey. Data Collection: Field notes maintained by the first author documented the process of implementing YPAR, including the curriculum, conversations, and research decisions and procedures. A student‐designed survey disseminated to all enrolled students yielded 76 participant responses (a 66% response rate). The survey included 18 close‐ended questions and three narrative responses. Principal Findings: This study details how YPAR methodologies can be translated to a high school credit recovery program; for example, student cohorts were needed to maintain continuity over time. A student‐designed survey revealed that 72% of student respondents reported taking care of family members, and illuminated high rates of depression symptoms. Conclusions: This study offers a detailed description of how we implemented YPAR at a credit recovery program and provides student‐driven perspectives on educational reform and evaluation. It also addresses the implementation challenges of using YPAR to engage youth in transformational resistance and to rapidly study and improve CHS policy and practice. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
49. Easy to use and validated predictive models to identify beneficiaries experiencing homelessness in Medicaid administrative data.
- Author
-
Pourat, Nadereh, Yue, Dahai, Chen, Xiao, Zhou, Weihao, and O'Masta, Brenna
- Subjects
MEDICAID beneficiaries ,HOMELESSNESS ,RECEIVER operating characteristic curves ,MACHINE learning ,RANDOM forest algorithms ,PREDICTION models - Abstract
Objective: To develop easy‐to‐use, validated predictive models that identify beneficiaries experiencing homelessness from administrative data. Data Sources: We pooled enrollment and claims data from enrollees of the California Whole Person Care (WPC) Medicaid demonstration program, which coordinated the care of a subset of Medicaid beneficiaries identified as high utilizers in 26 California counties (25 WPC Pilots). We also used public directories of social service and health care facilities. Study Design: Using WPC Pilot‐reported homelessness status, we trained seven supervised learning algorithms with different specifications to identify beneficiaries experiencing homelessness. The list of predictors included address‐ and claims‐based indicators, demographics, health status, health care utilization, and county‐level homelessness rate. We then assessed model performance using measures of balanced accuracy (BA), sensitivity, specificity, positive predictive value, negative predictive value, and area under the receiver operating characteristic curve (AUC). Data Collection/Extraction Methods: We included 93,656 WPC enrollees from 2017 to 2018, 37,441 of whom had a WPC Pilot‐reported homelessness indicator. Principal Findings: The random forest algorithm with all available indicators had the best performance (87% BA and 0.95 AUC), but a simpler generalized linear model (GLM) also performed well (74% BA and 0.83 AUC). Reducing predictors to the top 20 and the top five most important indicators in a GLM yields only slightly lower performance (86% BA and 0.94 AUC for the top 20; 86% BA and 0.91 AUC for the top five). Conclusions: Large samples can be used to accurately predict homelessness in Medicaid administrative data if a validated homelessness indicator can be obtained for a small subset. In the absence of a validated indicator, the likelihood of homelessness can be calculated from the county rate of homelessness, address‐ and claims‐based indicators, and beneficiary age using a prediction model presented here. These approaches are needed given the rising prevalence of homelessness and the focus of Medicaid and other payers on addressing homelessness and its outcomes. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
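Entry 49's model comparison, a random forest versus a simpler GLM scored by balanced accuracy and AUC, can be sketched in a few lines of Python. This is a generic illustration on synthetic, class‐imbalanced data; the study's actual predictors, sample, and tuning are not reproduced here.

```python
# Minimal sketch of the comparison in entry 49: random forest vs. a
# logistic-regression GLM, scored with balanced accuracy (BA) and AUC.
# The data are synthetic placeholders, not the study's Medicaid records.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import balanced_accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Imbalanced outcome mimics a minority class such as homelessness status.
X, y = make_classification(n_samples=5000, n_features=25, weights=[0.8], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, model in [
    ("random forest", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("GLM (logistic)", LogisticRegression(max_iter=1000)),
]:
    model.fit(X_tr, y_tr)
    proba = model.predict_proba(X_te)[:, 1]       # predicted probabilities
    ba = balanced_accuracy_score(y_te, model.predict(X_te))
    auc = roc_auc_score(y_te, proba)
    print(f"{name}: BA={ba:.2f}, AUC={auc:.2f}")
```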
50. Identifying drivers of population dynamics for a stream breeding amphibian using time series of egg mass counts.
- Author
-
Rose, Jonathan P., Kupferberg, Sarah J., Peek, Ryan A., Ashton, Don, Bettaso, James B., Bobzien, Steve, Bourque, Ryan M., Breedveld, Koen G. H., Catenazzi, Alessandro, Drennan, Joseph E., Gonsolin, Earl, Grefsrud, Marcia, Herman, Andrea E., House, Matthew R., Kluber, Matt R., Lind, Amy J., Marlow, Karla R., Striegle, Alan, van Hattem, Michael, and Wheeler, Clara A.
- Subjects
AMPHIBIAN declines ,POPULATION dynamics ,POPULATION viability analysis ,TIME series analysis ,AMPHIBIANS ,BIRD populations ,NEMATOCIDES - Abstract
The decline in amphibian populations is one of the starkest examples of the biodiversity crisis. For stream breeding amphibians, alterations to natural flow regimes by dams, water diversions, and climate change have been implicated in declines and extirpations. Identifying drivers of amphibian declines requires long time series of abundance data because amphibian populations can exhibit high natural variability. Multiple population viability analysis (MPVA) models integrate abundance data and share information from different populations to estimate how environmental factors influence population growth. Flow alteration has been linked to declines and extirpations in the Foothill Yellow‐legged Frog (Rana boylii), a stream breeding amphibian native to California and Oregon. To date, no study has jointly analyzed abundance data from populations throughout the range of R. boylii in an MPVA model. We compiled time series of egg mass counts (an index of adult female abundance) from R. boylii populations in 36 focal streams and fit an MPVA model to quantify how streamflow metrics, stream temperature, and surrounding land cover affect population growth. We found population growth was positively related to stream temperature and was higher in the years following a wet year with high total annual streamflow. Density dependence was weakest (i.e., carrying capacity was highest) for streams with high seasonality of streamflow and intermediate rates of change in streamflow during spring. Our results highlight how altered streamflow can further increase the risk of decline for R. boylii populations. Managing stream conditions to better match natural flow and thermal regimes would benefit the conservation of R. boylii populations. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
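The multiple population viability analysis in entry 50 builds on density‐dependent population growth driven by environmental covariates. Below is a minimal sketch of one common building block, a Ricker‐type model with a streamflow covariate; the parameter values and covariate are illustrative assumptions, and the published MPVA is considerably richer, sharing information across the 36 focal streams.

```python
# Sketch of a density-dependent growth model of the kind underlying an MPVA:
# a Ricker model with an environmental covariate and process noise.
# Parameter values are illustrative, not estimates from the study.
import numpy as np

rng = np.random.default_rng(0)
r, K, beta, sigma = 0.4, 200.0, 0.3, 0.2  # growth rate, capacity, flow effect, noise SD

n = np.empty(30)
n[0] = 50.0                       # initial egg-mass count
flow = rng.standard_normal(30)    # standardized prior-year streamflow anomaly

for t in range(29):
    # Log-scale growth: density dependence + covariate effect + process noise.
    log_growth = r * (1 - n[t] / K) + beta * flow[t] + sigma * rng.standard_normal()
    n[t + 1] = n[t] * np.exp(log_growth)

print(np.round(n[:10], 1))  # first decade of simulated counts
```

Fitting such a model to egg‐mass time series lets the covariate coefficients (here, beta) quantify how streamflow conditions influence population growth, which is the inference the abstract describes.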