The overarching goal of this dissertation is to improve the predictive capabilities of geotechnical seismic site response analyses by incorporating additional salient physical phenomena that influence site effects. Specifically, multidimensional wave-propagation effects that are neglected in conventional 1D site response analyses are incorporated by: (1) combining results of 3D regional-scale simulations with 1D nonlinear wave-propagation site response analysis, and (2) modelling soil heterogeneity in 2D site response analyses using spatially-correlated random fields to perturb soil properties.

A method to combine results from 3D hybrid physics-based ground motion simulations with site-specific nonlinear site response analyses was developed. The 3D simulations capture 3D ground motion phenomena on a regional scale, while the 1D nonlinear site response analysis, which is informed by detailed site-specific soil characterization data, can capture site effects more rigorously. Simulations of 11 moderate-to-large earthquakes from the 2010-2011 Canterbury Earthquake Sequence (CES) at 20 strong motion stations (SMS) were used to validate the simulations against observed ground motions. The predictions were compared to those from an empirically-based ground motion model (GMM) and from 3D simulations with simplified VS30-based site-effects modelling. By comparing all predictions to observations at seismic recording stations, it was found that the 3D physics-based simulations can predict ground motions with bias and uncertainty comparable to those of the GMM, albeit with significantly lower bias at long periods. Additionally, the explicit modelling of nonlinear site response significantly improves predictions compared to the simplified VS30-based approach for soft-soil or atypical sites that exhibit exceptionally strong site effects.

A method to account for the spatial variability of soils and wave scattering in 2D site response analyses was developed and validated against a database of vertical array sites in California. The inputs required to run the 2D analyses are nominally the same as those required for 1D analyses (except for spatial correlation parameters), enabling easier adoption in practice. The first step was to create the platform and workflow, and to perform a sensitivity study involving 5,400 2D model realizations to investigate the influence of random field input parameters on wave scattering and site response. Boundary conditions were carefully assessed to understand their effect on the modelled response and to select appropriate assumptions for a 2D model with lateral heterogeneities. Multiple ground-motion intensity measures (IMs) were analyzed to quantify the influence of random field input parameters and boundary conditions. It was found that this method is capable of scattering seismic waves and creating spatially-varying ground motions at the ground surface. The redistribution of ground-motion energy across wider frequency bands, and the scattering attenuation of high-frequency waves in 2D analyses, resemble features observed in empirical transfer functions (ETFs) computed in other studies. The developed 2D method was subsequently extended to more complicated multi-layer soil profiles and applied to a database of 21 vertical array sites in California to test its appropriateness for future predictions. Again, different boundary condition and input motion assumptions were explored to extend the method to the in-situ conditions of a vertical array (with a sensor embedded in the soil).
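As an illustration of the random-field perturbation step described above, the following is a minimal sketch (not the dissertation's actual implementation) of generating one spatially-correlated realization of a shear-wave velocity grid for a 2D model. It assumes lognormal perturbations of a baseline Vs field with a separable, anisotropic exponential autocorrelation; the function name, parameter values, and correlation model are illustrative assumptions.

```python
import numpy as np

def correlated_vs_field(vs_baseline, dx, dz, theta_h, theta_v, sigma_ln, seed=None):
    """Perturb a baseline Vs grid with a spatially correlated lognormal random field.

    vs_baseline : 2D array (nz, nx) of baseline shear-wave velocities [m/s]
    dx, dz      : horizontal / vertical grid spacing [m]
    theta_h/v   : horizontal / vertical correlation lengths [m]
    sigma_ln    : standard deviation of ln(Vs) perturbations
    """
    rng = np.random.default_rng(seed)
    nz, nx = vs_baseline.shape
    # coordinates of every grid cell
    zz, xx = np.meshgrid(np.arange(nz) * dz, np.arange(nx) * dx, indexing="ij")
    pts_x, pts_z = xx.ravel(), zz.ravel()
    # separable (anisotropic) exponential autocorrelation between all cell pairs
    lag_x = np.abs(pts_x[:, None] - pts_x[None, :])
    lag_z = np.abs(pts_z[:, None] - pts_z[None, :])
    corr = np.exp(-2.0 * lag_x / theta_h) * np.exp(-2.0 * lag_z / theta_v)
    # correlated standard-normal field via Cholesky factorization (small grids only)
    chol = np.linalg.cholesky(corr + 1e-10 * np.eye(corr.shape[0]))
    g = (chol @ rng.standard_normal(corr.shape[0])).reshape(nz, nx)
    # apply the lognormal perturbation to the baseline velocities
    return vs_baseline * np.exp(sigma_ln * g)

# Example: one realization for a 20 m deep, 50 m wide profile with Vs increasing with depth
vs0 = np.tile(np.linspace(150.0, 400.0, 21)[:, None], (1, 26))
vs_real = correlated_vs_field(vs0, dx=2.0, dz=1.0, theta_h=10.0, theta_v=2.0,
                              sigma_ln=0.15, seed=1)
```

In a sensitivity study of this kind, the correlation lengths and the standard deviation of ln(Vs) are the inputs whose variation across many such realizations controls the degree of wave scattering.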
ETFs were compared to theoretical transfer functions (TTFs) from conventional 1D analyses and from 2D analyses with heterogeneity. Residuals of transfer-function-based IMs, and of IMs of surface ground motions, were also used as validation metrics. The spatial variability of transfer-function-based IMs was estimated from the 2D models and compared to the event-to-event variability from ETFs. The method was found to significantly improve predictions of median ETF amplification factors, especially for sites that display higher event-to-event variability. For sites that are well represented by 1D methods, the 2D approach can underpredict amplification factors at higher modes, suggesting that the level of heterogeneity may be over-represented by the 2D random field models used in this study.
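For the transfer-function comparisons described above, a minimal sketch of the two quantities being contrasted is given below. It assumes the ETF is approximated as a smoothed ratio of surface to downhole Fourier amplitude spectra, and the TTF by the classical expression for a uniform damped soil layer over rigid rock; the actual study uses more complete 1D and 2D analyses, and the function names and smoothing choice here are illustrative assumptions.

```python
import numpy as np

def empirical_tf(acc_surface, acc_downhole, dt, smooth_n=20):
    """Empirical transfer function: smoothed ratio of surface to downhole
    Fourier amplitude spectra (a simplified stand-in for recorded ETFs)."""
    n = len(acc_surface)
    freq = np.fft.rfftfreq(n, dt)
    fas_surf = np.abs(np.fft.rfft(acc_surface))
    fas_down = np.abs(np.fft.rfft(acc_downhole))
    kernel = np.ones(smooth_n) / smooth_n  # simple moving-average smoothing
    etf = np.convolve(fas_surf, kernel, "same") / np.convolve(fas_down, kernel, "same")
    return freq, etf

def theoretical_tf(freq, thickness, vs, damping):
    """|TTF| for a uniform damped soil layer over rigid rock:
    |H(f)| = 1 / sqrt(cos^2(2*pi*f*H/Vs) + (damping * 2*pi*f*H/Vs)^2)."""
    khs = 2.0 * np.pi * freq * thickness / vs
    return 1.0 / np.sqrt(np.cos(khs) ** 2 + (damping * khs) ** 2)
```

Comparing the two curves over the same frequency band, and extracting IMs from each, mirrors the residual-based validation described above.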
The Manchester Courts building was a heritage building in central Christchurch (New Zealand) that was damaged in the Mw 7.1 Darfield earthquake on 4 September 2010 and subsequently demolished as a risk-reduction exercise. Because the building was heritage listed, the decision to demolish it drew strong objections from heritage supporters, who believed that the building had sufficient residual strength to survive possible aftershock earthquakes. On 22 February 2011, Christchurch was struck by a severe aftershock, raising the question of whether demolition had proven to be the correct risk-reduction strategy. Finite element analysis was used to undertake a performance-based assessment, with the accuracy of the model validated against the damage observed in the building before its demolition. In addition, soil-structure interaction was incorporated into the analysis because of the comparatively low shear-wave velocity of the soil. The demolition of a landmark heritage building was a tragedy from which Christchurch will never recover, but the decision was made considering safety, societal, economic and psychological aspects in order to protect the city and its citizens. The analytical results suggest that the Manchester Courts building would have collapsed during the 2011 Christchurch earthquake, and that the collapse would have resulted in significant fatalities.
Advanced seismic effective-stress analysis is used to scrutinize the liquefaction performance of 55 well-documented case-history sites from Christchurch. The performance of these sites during the 2010-2011 Canterbury earthquake sequence varied significantly, from no liquefaction manifestation at the ground surface (in any of the major events) to severe liquefaction manifestation in multiple events. For the majority of the 55 sites, the simplified liquefaction evaluation procedures conventionally used in engineering practice could not explain these dramatic differences in manifestation. Detailed geotechnical characterization and subsequent examination of the soil profile characteristics of the 55 sites identified some similarities but also important differences between sites that manifested liquefaction in the two major events of the sequence (YY-sites) and sites that did not manifest liquefaction in either event (NN-sites). In particular, while the YY-sites and NN-sites are shown to have practically identical critical layer characteristics, they differ significantly in their deposit characteristics, including the thickness and vertical continuity of their critical zones and liquefiable materials. A CPT-based effective-stress analysis procedure is developed and implemented for the analyses of the 55 case-history sites. Key features of this procedure are that it can be fully automated in a programming environment and that it is directly equivalent, in the definition of cyclic resistance and required input data, to the CPT-based simplified liquefaction evaluation procedures. These features significantly facilitate the application of effective-stress analysis for simple 1D free-field soil-column problems and also provide a basis for rigorous comparisons of the outcomes of effective-stress analyses and simplified procedures. Input motions for the analyses are derived using selected (reference) recordings from the two major events of the 2010-2011 Canterbury earthquake sequence. A step-by-step procedure for the selection of representative reference motions for each site and their subsequent treatment (i.e. deconvolution and scaling) is presented. The proposed procedure focuses on addressing key aspects of the spatial variability of ground motion in the near-source region of an earthquake, including extended-source effects, path effects, and variation in the deeper regional geology.
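To make the equivalence with the simplified procedures concrete, the sketch below shows the factor-of-safety calculation on the simplified-procedure side of the comparison, i.e. CSR = 0.65·(a_max/g)·(σv/σ'v)·rd and FS = CRR·MSF·Kσ/CSR. It is not the effective-stress procedure itself; the stress-reduction coefficient rd, magnitude scaling factor MSF, overburden correction Kσ, and the CPT-based CRR all come from published correlations that are not reproduced here, so the numerical values are purely illustrative.

```python
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, rd):
    """Simplified-procedure cyclic stress ratio:
    CSR = 0.65 * (a_max/g) * (sigma_v / sigma_v') * rd."""
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

def factor_of_safety(crr_m75, csr, msf=1.0, k_sigma=1.0):
    """FS = CRR_M7.5 * MSF * K_sigma / CSR; MSF and K_sigma default to 1.0
    here and would normally come from published correlations."""
    return crr_m75 * msf * k_sigma / csr

# Illustrative layer: a_max = 0.35 g, total/effective stresses of 100/60 kPa,
# rd = 0.95, and a CPT-correlated CRR of 0.18
csr = cyclic_stress_ratio(0.35, 100.0, 60.0, 0.95)
fs = factor_of_safety(0.18, csr)  # FS < 1 indicates predicted triggering
```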
This thesis presents the application of data science techniques, especially machine learning, for the development of seismic damage and loss prediction models for residential buildings. Current post-earthquake building damage evaluation forms are developed with a particular country in mind, and this lack of consistency hinders the comparison of building damage between different regions. A new paper form has been developed to address the need for a universal methodology for post-earthquake building damage assessment. The form was successfully trialled on the street ‘La Morena’ in Mexico City following the 2017 Puebla earthquake. Aside from developing a framework for better input data for performance-based earthquake engineering, this project also extended current techniques to derive insights from post-earthquake observations. Machine learning (ML) was applied to seismic damage data of residential buildings in Mexico City following the 2017 Puebla earthquake and in Christchurch following the 2010-2011 Canterbury earthquake sequence (CES). The experience showed that it is readily possible to develop purely data-driven empirical models that can successfully identify key damage drivers and hidden underlying correlations without prior engineering knowledge. With adequate maintenance, such models can be rapidly and easily updated, allowing improved damage and loss prediction accuracy and greater generalisability. Of the ML models developed for the key events of the CES, the model trained using data from the 22 February 2011 event generalised best for loss prediction. This is thought to be because of the large number of instances available for this event and the relatively limited class imbalance between the categories of the target attribute. For the CES, ML highlighted the importance of peak ground acceleration (PGA), building age, building size, liquefaction occurrence, and soil conditions as the main factors affecting losses in residential buildings in Christchurch. ML also highlighted the influence of liquefaction on building losses in the 22 February 2011 event. Beyond the ML model development, the application of post-hoc methodologies was shown to be an effective way to derive insights from ML algorithms that are not intrinsically interpretable. Overall, these provide a basis for the development of ‘greybox’ ML models.
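As an illustration of the kind of workflow described, the sketch below trains a tree-ensemble classifier on a synthetic feature table mirroring the attributes named above (PGA, building age, building size, liquefaction occurrence, soil conditions) and then applies permutation importance as a simple post-hoc interpretation step. The data, feature names, and model choice are illustrative assumptions, not the thesis's actual dataset or algorithms.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic feature table (for demonstration only) with attribute types
# similar to those highlighted in the study.
rng = np.random.default_rng(0)
n = 2000
X = pd.DataFrame({
    "pga_g": rng.uniform(0.1, 0.8, n),
    "building_age_yr": rng.integers(1, 120, n),
    "floor_area_m2": rng.uniform(60, 400, n),
    "liquefaction": rng.integers(0, 2, n),
    "soil_class": rng.integers(0, 4, n),
})
# Synthetic damage classes (0 = none, 1 = moderate, 2 = severe)
y = pd.cut(X["pga_g"] + 0.3 * X["liquefaction"] + rng.normal(0, 0.1, n),
           bins=[-np.inf, 0.4, 0.7, np.inf], labels=[0, 1, 2]).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0, stratify=y)
model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)

# Post-hoc interpretation: permutation importance as a simple 'greybox' diagnostic
imp = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
for name, score in sorted(zip(X.columns, imp.importances_mean), key=lambda t: -t[1]):
    print(f"{name:>16s}: {score:.3f}")
```

Ranking features in this way is one simple example of the post-hoc methodologies mentioned above for interpreting models that are not intrinsically interpretable.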