eScholarship
Open Access Publications from the University of California

This series is automatically populated with publications deposited by UCLA Henry Samueli School of Engineering and Applied Science Department of Civil and Environmental Engineering researchers in accordance with the University of California’s open access policies. For more information see Open Access Policy Deposits and the UC Publication Management System.


Factors and Processes Affecting Delta Levee System Vulnerability

(2016)

We appraised factors and processes related to human activities and high water, subsidence, and seismicity. Farming and drainage of peat soils caused subsidence, which contributed to internal levee failures. Subsidence rates decreased with time but still contributed to levee instability. Modeling of changes in seepage and static slope instability suggests an increasing probability of failure with decreasing peat thickness. Additional data are needed to assess the spatial and temporal effects of subsidence from peat thinning and deformation. Large-scale state investment in levee upgrades (> $700 million since the mid-1970s) has increased conformance with applicable standards; however, accounts conflict about corresponding reductions in the number of failures.

Modeling and history suggest that projected increases in high-flow frequency associated with climate change will increase the rate of levee failures. Quantifying this increased threat requires further research. A reappraisal of seismic threats resulted in updated ground motion estimates for multiple faults and earthquake-occurrence frequencies. Estimated ground motions are large enough to induce failure. The immediate seismic threat, liquefaction, is the sudden loss of strength from an increase in the pressure of the pore fluid and the corresponding loss of inter-particle contact forces. However, levees damaged during an earthquake that do not immediately fail may eventually breach. Key sources of uncertainty include occurrence frequencies and magnitudes, localized ground motions, and data for liquefaction potential.

Estimates of the consequences of future levee failure range up to multiple billions of dollars. Analysis of future risks will benefit from improved description of levee upgrades and strength as well as consideration of subsidence, the effects of climate change, and earthquake threats. Levee habitat ecosystem benefits in this highly altered system are few. Better recognition and coordination is needed among the creation of high-value habitat, levee needs, and costs and benefits of levee improvements and breaches.


Reliability of low frequency mHVSR ordinates

(2024)

Microtremor horizontal-to-vertical spectral ratios (mHVSR) are frequency-dependent ratios of Fourier amplitude spectra of the horizontal to vertical components of a 3-component recording of ambient ground motions from microtremors. Results from mHVSR tests can identify the frequencies associated with site resonances at sites with large impedance contrasts, and hence have potential to provide useful parameters for predicting seismic site response. Site measurements are made by recording ground vibrations either from a temporarily deployed seismometer, typically recording for a relatively short period of time (~1-2 hrs.), or from a permanently-installed broadband seismometer. In this paper, we discuss ongoing work investigating the reliability of low frequency (< ∼0.1 Hz) mHVSR ordinates. Such low frequency ordinates are potentially useful for sites that are known to have deep basins (e.g., LA Basin, Imperial Valley, Great Salt Lake basin), where fundamental frequencies may fall in this range and direct measurements of depth to bedrock are difficult to make. We have found that low-frequency mHVSR ordinates (< ∼0.1-0.2 Hz) are for practical purposes not reliable in most cases, even when measured by high-quality temporary or permanent broadband sensors. In this paper, we discuss sensor drift and its limited impact on the reliability of mHVSR ordinates. We document the low frequency problem for multiple sites, although we do not have a solution as of this writing for how to improve the reliability of low-frequency results.
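The basic mHVSR computation, a ratio of horizontal to vertical Fourier amplitude spectra, can be sketched in a few lines. This is a minimal illustration only: the function name, the geometric-mean combination of the two horizontal components, and the omission of windowing and spectral smoothing are simplifying assumptions, not the authors' processing chain.

```python
import numpy as np

def mhvsr(ns, ew, ud, dt):
    """Illustrative mHVSR: ratio of combined horizontal to vertical
    Fourier amplitude spectra of a 3-component ambient-noise record.

    ns, ew, ud : north-south, east-west, and vertical components
    dt         : sample interval in seconds
    """
    freqs = np.fft.rfftfreq(len(ud), d=dt)
    # Fourier amplitude spectrum of one component
    amp = lambda x: np.abs(np.fft.rfft(x))
    # geometric mean of the two horizontal amplitude spectra
    h = np.sqrt(amp(ns) * amp(ew))
    v = amp(ud)
    # drop the zero-frequency bin before forming the ratio
    return freqs[1:], h[1:] / v[1:]
```

In practice the record would be split into windows, each spectrum smoothed (e.g., Konno-Ohmachi), and window-to-window variability tracked; the sketch above shows only the core ratio.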


A general framework for modeling subregional path effects

(2024)

Next Generation Attenuation (NGA) West2 ground motion models (GMMs) include regional path adjustments for broad jurisdictional regions, which necessarily average spatially variable path effects within those regions. We extend that framework to account for systematic variations in attenuation within subregions defined in consideration of geologic differences. In recent years, cell-based methods, which systematically account for spatial variations by summing attenuation effects over a fine discretization of uniform rectangular cells (e.g., Dawood and Rodriguez-Marek 2013; Kuehn et al. 2019), have been shown to be an effective alternative to regionalization and a step toward modeling non-ergodic path effects. The main drawbacks of these models, however, are their increased computational complexity, poorly informed coefficients for cells through which few paths travel, and unoptimized boundaries that may span the limits of geologic domains. The framework presented here considers physio-geological differences to form subregional boundaries. Broad jurisdictional regions are divided into subregions that are orders of magnitude larger than the uniform cells of cell-based methods, but smaller than the regions corresponding to the NGA-West2 adjustments. Subregional boundaries are informed by geological differences and empirical observations to create domains with internally similar properties. The total attenuation effect for a path that traverses multiple subregions is obtained by weighting the individual subregional effects by the proportion of the path length within each subregion. This approach has been successfully applied in California, where it achieved a reduction in bias and in within-event and single-station variability relative to an NGA-West2 GMM for ground motions at large distance (RJB > 100 km). The framework presented here can readily be adapted to other GMMs and regions.
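The path-length weighting described above reduces to a weighted average of subregional attenuation coefficients. A minimal sketch, with hypothetical subregion names and coefficient values:

```python
def path_adjustment(segment_lengths, coeffs):
    """Path-averaged attenuation coefficient for a path that crosses
    several subregions: each subregional coefficient is weighted by
    the fraction of the total path length inside that subregion.

    segment_lengths : {subregion: path length inside it, in km}
    coeffs          : {subregion: attenuation coefficient}
    """
    total = sum(segment_lengths.values())
    return sum(coeffs[region] * (length / total)
               for region, length in segment_lengths.items())

# Hypothetical example: 40 km of the path in subregion "A", 60 km in "B".
q = path_adjustment({"A": 40.0, "B": 60.0}, {"A": -0.010, "B": -0.020})
```

The weights sum to one by construction, so the result always lies between the smallest and largest subregional coefficients along the path.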


User-interaction with a web-served global ground motion relational database

(2024)

We present an application programming interface (API) which facilitates public access to a global relational database of earthquake ground motion intensity measures, associated metadata, and time-series data. Next Generation Attenuation (NGA)-East and NGA-West2 project spreadsheets have been adapted into a relational database format composed of multiple tables linked through a series of primary and foreign keys. The combined dataset has been expanded to include contributions from earthquakes, generally with magnitudes greater than M3.9, that have occurred since the conclusion of the data synthesis component of both projects in 2011. Currently the database includes 62,449 ground motions recorded at 9,092 stations for 899 events. The database is accessible through an API, which allows users to interact with and query the database directly without detailed knowledge of structured query language (SQL). Simple queries are constructed by appending relatively straightforward query string parameters to the end of a uniform resource locator (URL) that serves as an endpoint, which returns only data that satisfy the query constraints. The web-served nature of the database means that users have immediate access to ground motion data as soon as they are collected, reviewed, and uploaded. Furthermore, integrated end-to-end workflows, which do not require files to be downloaded and saved in local memory, are possible through the API. The structure of the database has been designed to accommodate growth, with ongoing efforts to integrate global ground motion data in anticipation of the NGA-West3 project and to improve ease of access through the API.
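Constructing a query by appending string parameters to an endpoint URL can be illustrated with the Python standard library. The endpoint address and parameter names below are hypothetical, since the abstract does not specify the real API's URL scheme or query keys:

```python
from urllib.parse import urlencode

# Hypothetical endpoint and query keys -- placeholders, not the real API.
BASE = "https://example.org/api/records"
params = {"magnitude_min": 3.9, "vs30_max": 360, "format": "json"}

# urlencode turns the dict into "key=value" pairs joined by "&",
# percent-escaping any characters that are not URL-safe.
url = f"{BASE}?{urlencode(params)}"
```

The resulting URL could then be fetched with any HTTP client, and the server would return only records satisfying all the constraints.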


A machine learning-based analysis of liquefaction input factors using the Next Generation Liquefaction database

(2024)

Liquefaction triggering is typically predicted using fully empirical and/or semi-empirical models. Hence, such models are heavily reliant upon available case history data for liquefaction (and its absence). These predictive models are based on a variety of factors describing the demand (i.e., the cyclic stress ratio, CSR, in existing legacy models) and the capacity (i.e., the cyclic resistance ratio, CRR). However, the degree to which these factors truly affect model performance is unknown. To explore this aspect and quantitatively rank the importance of liquefaction model input parameters, we leverage a Random Forest machine learning (ML) approach using two methods: (1) a feature importance metric based on the Gini impurity index, and (2) a SHapley Additive exPlanations (SHAP)-based approach. Both approaches were applied to typical input factors used in legacy liquefaction triggering models based on cone penetration test (CPT) data. These analyses used all reviewed (i.e., fully vetted) data in the Next Generation Liquefaction (NGL) database. Our analysis then separately explores the impact of seven input parameters on the resulting models. We show that the most important input parameters are: (1) the peak ground acceleration, (2) the soil behavior type index, and (3) the earthquake magnitude (which serves as a proxy for duration in such models). The input parameters with the lowest importance are the total and effective vertical stresses. A limitation of this analysis is that the ML model used does not allow for extrapolation beyond the range of the data. As a result, for input parameters with narrow distributions (i.e., a somewhat limited parameter space), a low ranking could reflect the limited range of available values rather than actual low importance. This limitation likely accounts for the low importance attached to stress-related input parameters, since legacy case histories generally correspond to shallow (<10 m) depths.
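The Gini-based importance metric mentioned above accumulates, for each feature, the weighted impurity decrease of the tree splits that use it. The two quantities involved can be computed directly; this is a generic illustration of the metric, not the authors' NGL analysis:

```python
def gini(labels):
    """Gini impurity 1 - sum(p_k^2) of a set of class labels
    (0 for a pure node, approaching 1 for many evenly mixed classes)."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def impurity_decrease(parent, left, right):
    """Weighted Gini decrease of one split: the quantity a random
    forest sums per feature to produce its importance ranking."""
    n = len(parent)
    return (gini(parent)
            - (len(left) / n) * gini(left)
            - (len(right) / n) * gini(right))
```

A split that perfectly separates the classes (e.g., [0, 0, 1, 1] into [0, 0] and [1, 1]) yields the maximum decrease; splits on an uninformative feature yield decreases near zero, which is why such features rank low.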


Recommendations on best available science for the United States National Seismic Hazard Model

(2024)

The 50-state update to the 2023 United States National Seismic Hazard Model (NSHM) is the latest in a sequence published by the U.S. Geological Survey (USGS). The 2023 NSHM is intended for use in building codes and similar applications at return periods of 475 years (corresponding to an exceedance probability of 10% in 50 years) or longer. In reviewing the model, the NSHM Program Steering Committee, consisting of the authors of this paper, considered the characteristics of "best available science" that are applicable to the NSHM. Best available science must perform better than the previous NSHM, and there should be no available alternatives that could improve the models. The following are suggested characteristics of "best available science":

A) Clear objectives
B) Rigorous conceptual model
C) Timely, relevant, and inclusive
D) Verified and reproducible
E) Validated intermediate and final models
F) Replicable within uncertainties
G) Peer reviewed
H) Permanent documentation

This article focuses on the justification for, and intent of, the above criteria for best available science.


Relative contributions of different sources of epistemic uncertainty on seismic hazard in California

(2024)

We evaluate the relative impact of three sources of epistemic uncertainty on probabilistic seismic hazard analyses in California: source model uncertainty, ground motion model (GMM) uncertainty, and site parameter uncertainty. Seismic source model uncertainty is inherently contained in the source model framework applied by the USGS in the 2023 National Seismic Hazard Model (NSHM23); we have added tools to extract this uncertainty for California sites in the open-source seismic hazard software OpenSHA. GMM uncertainty is generally accounted for using alternative models in PSHA or a single backbone model with a defined uncertainty. Site parameter uncertainty refers to uncertainty in the shear wave velocity of the upper 30 meters of the site profile (VS30) and potentially other independent site parameters. We demonstrate the impacts of these major sources of epistemic uncertainty at the sites of two UC campuses: Berkeley, which is located near the active Hayward fault, and Davis, which is located in the relatively quiescent Central Valley. We investigate potential correlations between the different sources of uncertainty and find that source uncertainty is practically independent of GMM and VS30 uncertainty at Berkeley but dependent on GMM and VS30 uncertainty at Davis. At both locations, GMM and site parameter uncertainty are correlated (i.e., inter-dependent). We represent epistemic uncertainty in ground motion with a period-dependent log-normal standard deviation term that is specific to a given site location, site condition, and exceedance frequency. We show that at Berkeley, the total epistemic uncertainty can be well approximated by the square root sum of squares (SRSS) of the source uncertainty (i.e., uncertainty in ground motions related solely to the source model) and the combined GMM and site parameter uncertainty. We find that the combined GMM and VS30 uncertainty is comparable to or greater than the source uncertainty at many oscillator periods at both sites. Combined uncertainties range from natural-log standard deviations of about 0.2 at short periods to 0.6 (Berkeley) and 0.3-0.7 (Davis) at long periods.
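The SRSS combination used above for independent components is simple to state. A minimal sketch, with illustrative values that are not taken from the paper:

```python
import math

def srss(*sigmas):
    """Square-root-sum-of-squares combination of independent
    uncertainty components (natural-log standard deviations)."""
    return math.sqrt(sum(s * s for s in sigmas))

# Illustrative only: a source-model component of 0.3 combined with a
# GMM-plus-site component of 0.4 gives a total of 0.5.
total = srss(0.3, 0.4)
```

SRSS is only appropriate when the components are (approximately) independent, which is why the correlation findings above matter: at Davis, where source uncertainty depends on GMM and VS30 uncertainty, a plain SRSS would understate or misstate the total.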


Findings from a decade of ground motion simulation validation research and a path forward

(2024)

Simulated ground motions can advance seismic hazard assessments and structural response analyses, particularly for conditions with limited recorded ground motions such as large-magnitude earthquakes at short source-to-site distances. Rigorous validation of simulated ground motions is required before hazard analysts, practicing engineers, or regulatory bodies can be confident in their use. A decade ago, validation exercises were mainly limited to comparisons of simulated-to-observed waveforms and median values of spectral accelerations for selected earthquakes. The Southern California Earthquake Center (SCEC) Ground Motion Simulation Validation (GMSV) group was formed to increase coordination between simulation modelers and research engineers with the aim of devising and applying more effective methods for simulation validation. In this presentation, we summarize what has been learned in over a decade of GMSV activities. We categorize different validation methods according to their approach and the metrics considered. The two general validation approaches are to compare validation metrics from simulations to those from historical records or to those from semi-empirical models. Validation metrics consist of ground motion characteristics and structural responses. We discuss example validation studies that have been impactful in the past decade and suggest future research directions. Key lessons learned are that validation is application-specific, our outreach and dissemination need improvement, and much validation-related research remains unexplored. This presentation summarizes our recent paper, Rezaeian et al. (2024).


Empirical site response of Mexico City through regionalization of global subduction GMMs

(2024)

We assemble a dataset which enhances the existing Central America and Mexico data from the NGA-Subduction (NGA-Sub) project, now including additional earthquake ground motions and site parameters from Mexico. These data have been used to provide regional customization of NGA-Sub global Ground Motion Models (GMMs) for application in Mexico, paying particular attention to the site response of Mexico City (CDMX). The expanded database for Mexico incorporates smaller-magnitude earthquakes (M < 6) and three significant events (M 7.2-8.3) that occurred in 2017 and 2018. The latter events are particularly important because they are well recorded over a broad distance range and represent hazard-critical conditions. Our focus here is on presenting the observed site response in CDMX, which we model based on the time-averaged shear wave velocity in the upper 30 meters (VS30) and the peak ground acceleration at a reference rock site (PGAr). The empirical model we propose differs from previous work in several ways. First, it is properly centered with respect to GMMs. Second, it is referenced to VS30 = 760 m/s, which is significantly firmer than the reference site previously used in practice, taken from a location on the UNAM campus with VS30 ~ 300 m/s. Third, we identify site amplification as nonlinear, whereas linear response has been assumed in prior models. The extent of this nonlinearity is characterized using an established seismic zonation for CDMX, being more pronounced at softer sites (Zone III) than at stiffer ones (Zone I). This nonlinear effect is prominent at short periods and disappears at long periods. The advantages of the proposed model include a unified framework for sites both within and outside CDMX (in contrast to current practice) and the integration of applicable features from global models with necessary local customization.


Preliminary observations in support of the development of an ergodic site response model in California conditioned on Vs30 and HVSR Parameters

(2024)

Traditional ergodic site response models are derived based on the time-averaged shear-wave velocity in the upper 30 m of the site (VS30). These models are not able to account for site resonances, the presence and frequency of which can be established from microtremor HVSR surveys. Not all California sites exhibit such resonances, but knowledge of whether or not peaks are present affects site response over a wide range of frequencies, with a present peak producing a response spectral peak near the HVSR peak frequency. Research is underway to develop a model using microtremor HVSR data, which will be novel relative to previous models based on earthquake HVSR data. Our model is being formulated as a modification to a global VS30 and z1.0 relationship. This paper explains the model development approach and the findings of a systematic assessment of how HVSR curves relate to features of site-specific (or non-ergodic) response, which is informing model development.