Research papers, University of Canterbury Library

In the last two decades, New Zealand (NZ) has experienced significant earthquakes, including the 2010 M 7.2 Darfield, 2011 M 6.2 Christchurch, and 2016 M 7.8 Kaikōura events. Amongst these large events, tens of thousands of smaller earthquakes have occurred. While previous event and ground-motion databases have analyzed these events, many events below M 4 have gone undetected. The goal of this study is to expand on previous databases, particularly for small magnitude (M<4) and low-amplitude ground motions. This new database enables a greater understanding of regional variations within NZ and contributes to assessing the validity of internationally developed ground-motion models. The database includes event locations and magnitude estimates with uncertainty considerations, and tectonic type assessed in a hierarchical manner. Ground motions are extracted from the GeoNet FDSN server and assessed for quality using a neural network classification approach. A deep neural network approach is also utilized for picking P and S phases for determination of event hypocentres. Relative hypocentres are further improved by double-difference relocation and will contribute toward developing shallow (< 50 km) seismic tomography models. Analysis of the resulting database is compared with previous studies for discussion of implications toward national hazard prediction models.
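
The abstract notes that ground motions are extracted from the GeoNet FDSN server. A minimal sketch of such a request using ObsPy's FDSN client is shown below; the station, location, and channel codes and the time window are illustrative placeholders, not values from the study.

    # Sketch: fetching strong-motion waveforms from the GeoNet FDSN service with ObsPy.
    # Station, location, channel codes and time window are illustrative only.
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("GEONET")                    # GeoNet FDSN web service
    origin = UTCDateTime("2016-11-13T11:02:56")  # example origin time (Kaikoura mainshock, UTC)
    st = client.get_waveforms(network="NZ", station="CCCC", location="*",
                              channel="HN?", starttime=origin,
                              endtime=origin + 300)
    st.detrend("demean")                         # basic pre-processing before any quality checks
    print(st)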

Research papers, University of Canterbury Library

The 2015 New Zealand strong-motion database provides a wealth of new strong motion data for engineering applications. An important component of this database is the compilation of new site metadata, describing the soil conditions and site response at GeoNet strong motion stations. We have assessed and compiled four key site parameters for the ~460 GeoNet stations that recorded significant historical ground motions. Parameters include: site classification (NZS1170.5), Vs30, fundamental site period (Tsite) and depth to bedrock (Z1.0, i.e. depth to material with Vs > 1000 m/s). In addition, we have assigned a quality estimate (Quality 1–3) to these parameters to provide a qualitative estimate of the uncertainty. New high-quality Tsite estimates have largely been obtained from newly available HVSR amplification curves and spectral ratios from inversion of regional strong motion data that has been reconciled with available geological information. Good quality Vs30 estimates, typically in urban centres, have also been incorporated following recent studies. Where site-specific measurements of Vs30 are not available, Vs30 is estimated based on surface geology following national Vs30 maps. New Z1.0 values have been provided from 3D subsurface models for Canterbury and Wellington. This database will be used in efforts to guide development and testing of new and existing ground motion prediction models in New Zealand. In particular, it will allow re-examination of the most important site parameters that control and predict site response in a New Zealand setting. Furthermore, it can be used to provide information about suitable rock reference sites for seismological research, and as a guide to site-specific references in the literature. We discuss compilation of the database, preliminary insights so far, and future directions.
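
As a concrete illustration of the four site parameters and their quality flags described above, a per-station record might be structured as in the sketch below; the field names and example values are assumptions for illustration only, not the actual GeoNet metadata schema.

    # Sketch of a per-station site-metadata record (illustrative fields and values only).
    from dataclasses import dataclass

    @dataclass
    class SiteMetadata:
        station: str          # GeoNet strong motion station code
        site_class: str       # NZS1170.5 site class (A-E)
        vs30: float           # time-averaged shear wave velocity over upper 30 m (m/s)
        vs30_quality: int     # quality flag, 1 (best) to 3
        t_site: float         # fundamental site period (s)
        t_site_quality: int
        z1_0: float           # depth to material with Vs > 1000 m/s (m)
        z1_0_quality: int

    example = SiteMetadata(station="XXXX", site_class="D", vs30=250.0, vs30_quality=2,
                           t_site=0.6, t_site_quality=1, z1_0=350.0, z1_0_quality=3)
    print(example)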

Research papers, University of Canterbury Library

The focus of the study presented herein is an assessment of the relative efficacy of recent Cone Penetration Test (CPT) and small strain shear wave velocity (Vs) based variants of the simplified procedure. Towards this end, Receiver Operating Characteristic (ROC) analyses were performed on the CPT- and Vs-based procedures using the field case history databases from which the respective procedures were developed. The ROC analyses show that Factors of Safety (FS) against liquefaction computed using the most recent Vs-based simplified procedure are better able to separate the “liquefaction” from the “no liquefaction” case histories in the Vs liquefaction database than the CPT-based procedure is able to separate the “liquefaction” from the “no liquefaction” case histories in the CPT liquefaction database. However, this finding somewhat contradicts the assessed predictive capabilities of the CPT- and Vs-based procedures as quantified using select, high quality liquefaction case histories from the 2010–2011 Canterbury, New Zealand, Earthquake Sequence (CES), wherein the CPT-based procedure was found to yield more accurate predictions. The dichotomy of these findings may result from the fact that different liquefaction field case history databases were used in the respective ROC analyses for Vs and CPT, while the same case histories were used to evaluate both the CPT- and Vs-based procedures.
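
The ROC comparison described above can be reproduced in outline as follows: treat observed liquefaction as the positive class and use the negative of the computed factor of safety as the classification score, since a lower FS implies liquefaction is more likely. This is a generic sketch with placeholder arrays, not the authors' code.

    # Sketch of an ROC analysis on factors of safety from a liquefaction case-history database.
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    liquefied = np.array([1, 1, 0, 1, 0, 0, 1, 0])            # 1 = "liquefaction" case (placeholder)
    fs = np.array([0.6, 0.8, 1.4, 0.9, 1.2, 1.6, 0.7, 1.1])   # computed FS per case (placeholder)

    score = -fs                                   # lower FS -> higher likelihood of liquefaction
    fpr, tpr, thresholds = roc_curve(liquefied, score)
    print("AUC =", roc_auc_score(liquefied, score))   # larger AUC -> better separation of cases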

Research papers, University of Canterbury Library

This study uses 44 high quality liquefaction case histories taken from 22 locations affected by the 2010-2011 Canterbury earthquake sequence to evaluate four commonly used CPT-VS correlations (i.e., Robertson, 2009; Hegazy and Mayne, 2006; Andrus et al., 2007; McGann et al., 2015b). Co-located CPT soundings and VS profiles, developed from surface wave testing, were obtained at 22 locations and case histories were developed for the Mw 7.1, 4 September 2010 Darfield and Mw 6.2, 22 February 2011 Christchurch earthquakes. The CPT soundings are used to generate VS profiles using each of the four CPT-VS correlations. These correlated VS profiles are used to estimate the factor of safety against liquefaction using the Kayen et al. (2013) VS-based simplified liquefaction evaluation procedure. An error index is used to quantify the predictive capabilities of these correlations in relation to the observations of liquefaction (or the lack thereof). Additionally, the error indices from the CPT-correlated VS profiles are compared to those obtained using: (1) the Kayen et al. (2013) procedure with surface wave-derived VS profiles, and (2) the Idriss and Boulanger (2008) CPT-based liquefaction evaluation procedure. Based on the error indices, the evaluation procedures based on direct measurements of either CPT or VS provided more accurate liquefaction triggering estimates than those obtained from any of the CPT-VS correlations. However, the performance of the CPT-VS correlations varied, with the Robertson (2009) and Hegazy and Mayne (2006) correlations performing relatively poorly for the Christchurch soils and the Andrus et al. (2007) and McGann et al. (2015b) correlations performing better. The McGann et al. (2015b) correlation had the lowest error indices of the CPT-VS correlations tested; however, none of the CPT-VS correlations provided accurate enough VS predictions to be used for the evaluation of liquefaction triggering using the VS-based liquefaction evaluation procedures.

Research papers, University of Canterbury Library

SeisFinder is an open-source web service developed by QuakeCoRE and the University of Canterbury, focused on enabling the extraction of output data from computationally intensive earthquake resilience calculations. Currently, SeisFinder allows users to select historical or future events and retrieve ground motion simulation outputs for requested geographical locations. This data can be used as input for other resilience calculations, such as dynamic response history analysis. SeisFinder was developed using Django, a high-level Python web framework, and uses a PostgreSQL database. Because our large-scale, computationally intensive numerical ground motion simulations produce big data, the actual data is stored in file systems, while the metadata is stored in the database.
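
To illustrate the metadata-in-database, outputs-on-disk split described above, a minimal Django model might look like the sketch below. The model and field names are assumptions for illustration and do not reflect the actual SeisFinder schema; the class is intended to sit inside a Django app's models.py.

    # Sketch: keep simulation metadata in PostgreSQL while large outputs stay on the file system.
    from django.db import models

    class SimulationOutput(models.Model):
        event_id = models.CharField(max_length=64)        # historical or future scenario event
        station_code = models.CharField(max_length=16)    # requested geographical location / station
        magnitude = models.FloatField()
        output_path = models.CharField(max_length=512)    # path to waveform files on the file system

        def __str__(self):
            return f"{self.event_id} @ {self.station_code}"

A query against such a table returns only the metadata and file path; the service then reads the large simulation output directly from disk rather than from the database.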

Articles, UC QuakeStudies

This study updated and superseded Earthquake hazard and risk assessment study Stage 1 Part A: Earthquake source identification and characterisation (Pettinga et al., 1998). It compiled and tabulated all relevant available information on earthquake sources in Canterbury and updated the active faults database with new fault locations and information. See Object Overview for background and usage information.

Research Papers, Lincoln University

The 2013 Seddon earthquake (Mw 6.5), the 2013 Lake Grassmere earthquake (Mw 6.6), and the 2016 Kaikōura earthquake (Mw 7.8) provided an opportunity to assemble the most extensive damage database to wine storage tanks ever compiled worldwide. An overview of this damage database is presented herein based on the in-field post-earthquake damage data collected for 2058 wine storage tanks (1512 legged tanks and 546 flat-based tanks) following the 2013 earthquakes and 1401 wine storage tanks (599 legged tanks and 802 flat-based tanks) following the 2016 earthquake. Critique of the earthquake damage database revealed that in 2013, 39% and 47% of the flat-based wine tanks sustained damage to their base shells and anchors respectively. Due to resilience measures implemented following the 2013 earthquakes, damage to tank base shells and tank anchors of flat-based wine tanks was reduced to 32% and 23% respectively in the 2016 earthquake, and damage to tank barrels (54%) and tank cones (43%) instead emerged as the two most frequently occurring damage modes for this type of tank. Analysis of damage data for legged wine tanks revealed that the frame-legs sustained the greatest damage percentage among the different parts of legged tanks in both the 2013 earthquakes (40%) and the 2016 earthquake (44%). Analysis of damage data and socio-economic findings highlights the need for industry-wide standards, which may have socio-economic implications for wineries.

Articles, UC QuakeStudies

This study compiled and tabulated all relevant available information on earthquake sources (active faults) in Canterbury and mapped the fault locations onto 1:50,000 or 1:250,000 overlays on topographic maps (later digitised into the Environment Canterbury active faults database). The study also reviewed information on historic earthquakes, instrumental seismicity and paleoseismic studies and identified information gaps. It recommended an approach for a probabilistic seismic hazard analysis and development of earthquake scenarios. See Object Overview for background and usage information.

Research papers, University of Canterbury Library

There is now a rich literature on the connections between digital media, networked computing, and the shaping of urban material cultures. Much less has addressed the post-disaster context, such as we face in Christchurch, where it is more a case of re-build rather than re-new. In what follows I suggest that Lev Manovich’s well-known distinction between narrative and database as distinct but related cultural forms is a useful framework for thinking about the Christchurch rebuild, and perhaps urbanism more generally.

Research papers, University of Canterbury Library

This paper summarizes the development of a high-resolution surficial shear wave velocity model based on the combination of the large high-spatial-density database of cone penetration test (CPT) logs in and around Christchurch, New Zealand and a recently-developed Christchurch-specific empirical correlation between soil shear wave velocity and CPT. This near-surface shear wave velocity model has applications for site characterization efforts via the development of maps of time-averaged shear wave velocities over specific depths, as well as use in site response analysis and ground motion simulation.
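
CPT-Vs correlations of the kind referred to here are commonly expressed as a power law in cone tip resistance, sleeve friction, and depth. The sketch below shows only that generic functional form; the coefficients are arbitrary placeholders, not the Christchurch-specific regression values developed in the cited correlation.

    # Sketch of a generic power-law CPT-Vs correlation, Vs = C * qt^a * fs^b * z^c.
    # Coefficients are placeholders, NOT the Christchurch-specific values.
    import numpy as np

    def vs_from_cpt(qt_kpa, fs_kpa, depth_m, C=20.0, a=0.2, b=0.1, c=0.25):
        """Estimate shear wave velocity (m/s) from CPT tip resistance, sleeve friction and depth."""
        return C * qt_kpa**a * fs_kpa**b * depth_m**c

    qt = np.array([2000.0, 5000.0, 12000.0])   # tip resistance (kPa), placeholder profile
    fs = np.array([20.0, 50.0, 90.0])          # sleeve friction (kPa)
    z = np.array([2.0, 6.0, 12.0])             # depth (m)
    print(vs_from_cpt(qt, fs, z))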

Research papers, University of Canterbury Library

This paper summarizes the development of a region-wide surficial shear wave velocity model based on the combination of the large high-spatial-density database of cone penetration test (CPT) logs in and around Christchurch, New Zealand and a recently-developed Christchurch-specific empirical correlation between soil shear wave velocity and CPT. The ongoing development of this near-surface shear wave velocity model has applications for site characterization efforts via the development of maps of time-averaged shear wave velocities over specific depths, and the identification of regional similarities and differences in soil shear stiffness.

Audio, Radio New Zealand

A research project on news coverage about Maori has found that tangata whenua are still regarded as lower class citizens; Ngai Tahu iwi says it's learnt from the Canterbury earthquakes just how important it is to safeguard important documents such as its whakapapa database in a digital form, in case there's another natural disaster; New Zealand's largest Maori-owned fishing company wants to see the unique Maori story pushed by companies doing business in Asian countries; Meanwhile, Ngati Kahungunu Chairman Ngahiwi Tomoana, who was the business group convenor, says Maori business leaders are keen to set up an office in China.

Research papers, University of Canterbury Library

The city of Christchurch and its surrounds experienced widespread damage due to soil liquefaction induced by seismic shaking during the Canterbury earthquake sequence that began in September 2010 with the Mw7.1 Darfield earthquake. Prior to the start of this sequence, the city had a large network of strong motion stations (SMSs) installed, which were able to record a vast database of strong ground motions. This paper uses this database of strong ground motion recordings, observations of liquefaction manifestation at the ground surface, and data from a recently completed extensive geotechnical site investigation program at each SMS to assess a range of liquefaction evaluation procedures at the four SMSs in the Christchurch Central Business District (CBD). In general, the characteristics of the accelerograms recorded at each SMS correlated well with the liquefaction evaluation procedures, with low liquefaction factors of safety predicted at sites with clear liquefaction identifiers in the ground motions. However, at sites that likely liquefied at depth (as indicated by evaluation procedures and/or inferred from the characteristics of the recorded surface accelerograms), the presence of a non-liquefiable crust layer at many of the SMS locations prevented the manifestation of any surface effects. Because of this, there was not a good correlation between surface manifestation and two surface manifestation indices, the Liquefaction Potential Index (LPI) and the Liquefaction Severity Number (LSN).
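
One of the surface-manifestation indices mentioned, the Liquefaction Potential Index, integrates the factor-of-safety deficit over the upper 20 m with a linear depth weighting (the Iwasaki et al. form). A minimal sketch of that calculation from a discretised FS profile is given below; the example profile is a placeholder, not data from the study.

    # Sketch: Liquefaction Potential Index (LPI) from a factor-of-safety profile,
    # LPI = integral over 0-20 m of F(z) * w(z) dz, with w(z) = 10 - 0.5 z and
    # F(z) = 1 - FS where FS < 1, else 0 (Iwasaki-type formulation).
    import numpy as np

    def lpi(depth_m, fs):
        depth_m = np.asarray(depth_m, dtype=float)
        fs = np.asarray(fs, dtype=float)
        mask = depth_m <= 20.0
        F = np.where(fs < 1.0, 1.0 - fs, 0.0)
        w = 10.0 - 0.5 * depth_m
        return np.trapz((F * w)[mask], depth_m[mask])

    z = np.linspace(0.5, 20.0, 40)        # placeholder depth grid (m)
    fs_profile = 0.7 + 0.04 * z           # placeholder FS profile increasing with depth
    print("LPI =", round(lpi(z, fs_profile), 2))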

Research papers, University of Canterbury Library

The Canterbury Earthquake Sequence (CES) induced extensive damage in residential buildings and led to over NZ$40 billion in total economic losses. Due to the unique insurance setting in New Zealand, up to 80% of the financial losses were insured. Over the CES, the Earthquake Commission (EQC) received more than 412,000 insurance claims for residential buildings. The 4 September 2010 earthquake is the event for which most of the claims were lodged, with more than 138,000 residential claims for this event alone. This research project uses the EQC claims database to develop a seismic loss prediction model for residential buildings in Christchurch. It uses machine learning to create a procedure capable of highlighting the critical features that most affected building loss. Further study of those features enables the generation of insights that can be used by various stakeholders, for example, to better understand the influence of a structural system on the building loss or to select appropriate risk mitigation measures. Prior to training the machine learning model, the claim dataset was supplemented with additional data sourced from private and open access databases giving complementary information related to the building characteristics, seismic demand, liquefaction occurrence and soil conditions. This poster presents results of a machine learning model trained on a merged dataset using residential claims from the 4 September 2010 earthquake.
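
As a schematic of the workflow described (training on the merged claims dataset, then inspecting which features most influence building loss), the sketch below fits a random forest and ranks feature importances. The file name, feature names, and target column are placeholders, not the EQC claim fields.

    # Sketch: train a model on a merged claims dataset and rank the features that
    # most influence predicted building loss (placeholder column names and file).
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("merged_claims_2010-09-04.csv")     # hypothetical merged dataset
    features = ["pga", "liquefaction_flag", "soil_class", "floor_area",
                "construction_year", "storeys"]          # placeholder feature names
    X = pd.get_dummies(df[features])                     # encode categorical features
    y = df["loss_ratio"]                                 # placeholder target: claim loss ratio

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
    print("test R^2:", model.score(X_te, y_te))
    importances = pd.Series(model.feature_importances_, index=X.columns)
    print(importances.sort_values(ascending=False).head(10))   # critical features for building loss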

Research papers, University of Canterbury Library

This study analyses the Earthquake Commission’s (EQC) insurance claims database to investigate the influence of seismic intensity and property damage resulting from the Canterbury Earthquake Sequence (CES) on the repair costs and claim settlement duration for residential buildings. Firstly, the ratio of building repair cost to its replacement cost was expressed as a Building Loss Ratio (BLR), which was further extended to a Regional Loss Ratio (RLR) for greater Christchurch by multiplying the average of all building loss ratios by the proportion of building stock that lodged an insurance claim. Secondly, the total time required to settle the claim and the time taken to complete each phase of the claim settlement process were obtained. Based on the database, the regional loss ratios for greater Christchurch for three events producing shaking of intensity 6, 7, and 8 on the modified Mercalli intensity scale were 0.013, 0.066, and 0.171, respectively. Furthermore, small (less than NZD15,000), medium (between NZD15,000 and NZD100,000), and large (more than NZD100,000) claims took 0.35-0.55, 1.95-2.45, and 3.35-3.85 years to settle regardless of the building’s construction period and earthquake intensities. The number of claims was also disaggregated by various building characteristics to evaluate their relative contribution to the damage and repair costs.
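
The regional loss ratio defined above combines the mean building loss ratio with the fraction of the building stock that lodged a claim. A minimal sketch of that arithmetic, with placeholder numbers rather than values from the EQC database, is:

    # Sketch: Regional Loss Ratio (RLR) = mean(BLR) x (claims lodged / total building stock).
    # Numbers below are placeholders, not values from the EQC database.
    import numpy as np

    blr = np.array([0.05, 0.20, 0.02, 0.35, 0.10])   # building loss ratios (repair / replacement cost)
    n_claims = 120_000                               # buildings that lodged a claim (placeholder)
    n_stock = 180_000                                # total residential building stock (placeholder)

    rlr = blr.mean() * (n_claims / n_stock)
    print(f"RLR = {rlr:.3f}")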

Research papers, University of Canterbury Library

Social and natural capital are fundamental to people’s wellbeing, often within the context of local community. Developing communities and linking people together provide benefits in terms of mental well-being, physical activity and other associated health outcomes. The research presented here was carried out in Christchurch - Ōtautahi, New Zealand, a city currently re-building after a series of devastating earthquakes in 2010 and 2011. Poor mental health has been shown to be a significant post-earthquake problem, and social connection has been postulated as part of a solution. By curating a disparate set of community services, activities and facilities, organised into a Geographic Information Systems (GIS) database, we created i) an accessibility analysis of 11 health and well-being services, ii) a mobility scenario analysis focusing on 4 general well-being services, and iii) a location-allocation model optimising the locations of 3 primary health care and welfare services. Our results demonstrate that overall, the majority of neighbourhoods in Christchurch benefit from a high level of accessibility to almost all the services, but with an urban-rural gradient (the further away from the centre, the fewer services are available, as is expected). The noticeable exception to this trend is that the more deprived eastern suburbs have poorer accessibility, suggesting social inequity in accessibility. The findings presented here show the potential of optimisation modelling and database curation for urban and community facility planning purposes.
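
In its simplest form, the accessibility analysis described above reduces to computing each neighbourhood's distance to the nearest facility of a given service type from the curated GIS database. A hedged sketch of that nearest-facility step follows; coordinates are placeholders and straight-line distance stands in for the network distances a GIS would use.

    # Sketch: nearest-facility accessibility using straight-line distances
    # (coordinates are placeholders; a GIS network analysis would refine this).
    import numpy as np
    from scipy.spatial import cKDTree

    neighbourhood_xy = np.array([[1570.0, 5180.0], [1572.5, 5181.0], [1575.0, 5178.5]])  # km, placeholder
    facility_xy = np.array([[1571.0, 5180.2], [1574.0, 5179.0]])                         # km, placeholder

    tree = cKDTree(facility_xy)
    dist_km, nearest_idx = tree.query(neighbourhood_xy)   # distance and index of closest facility
    for d, i in zip(dist_km, nearest_idx):
        print(f"nearest facility {i} at {d:.2f} km")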

Research papers, University of Canterbury Library

The 2010–2011 Canterbury earthquake sequence began with the 4 September 2010, Mw7.1 Darfield earthquake and includes up to ten events that induced liquefaction. Most notably, widespread liquefaction was induced by the Darfield and Mw6.2 Christchurch earthquakes. The combination of well-documented liquefaction response during multiple events, densely recorded ground motions for the events, and detailed subsurface characterization provides an unprecedented opportunity to add well-documented case histories to the liquefaction database. This paper presents and applies 50 high-quality cone penetration test (CPT) liquefaction case histories to evaluate three commonly used, deterministic, CPT-based simplified liquefaction evaluation procedures. While all the procedures predicted the majority of the cases correctly, the procedure proposed by Idriss and Boulanger (2008) results in the lowest error index for the case histories analyzed, thus indicating better predictions of the observed liquefaction response.
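
The CPT-based simplified procedures evaluated here all compare a cyclic stress ratio imposed by the earthquake with the soil's cyclic resistance. As a reminder of the common Seed-Idriss style form of that demand term, a hedged sketch is given below; the stress-reduction coefficient uses a simple linear depth approximation, and CRR is taken as a given input rather than computed from CPT data as the cited procedures do.

    # Sketch of a Seed-Idriss style cyclic stress ratio and factor of safety.
    # CRR is an input here; the cited procedures compute it from CPT data.
    def stress_reduction(z_m):
        # Simple depth-dependent reduction coefficient (approximate, for illustration).
        return 1.0 - 0.00765 * z_m if z_m <= 9.15 else 1.174 - 0.0267 * z_m

    def csr(a_max_g, sigma_v, sigma_v_eff, z_m):
        """Cyclic stress ratio: 0.65 * (a_max/g) * (sigma_v / sigma'_v) * r_d."""
        return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * stress_reduction(z_m)

    def factor_of_safety(crr, a_max_g, sigma_v, sigma_v_eff, z_m):
        return crr / csr(a_max_g, sigma_v, sigma_v_eff, z_m)

    # Placeholder example: 0.3 g shaking, 5 m depth, total/effective stresses in kPa.
    print(factor_of_safety(crr=0.18, a_max_g=0.3, sigma_v=90.0, sigma_v_eff=60.0, z_m=5.0))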

Articles, UC QuakeStudies

The Christchurch liquefaction study was initiated to better determine liquefaction susceptibility in Christchurch city. It aimed to improve on earlier liquefaction susceptibility maps, which were based on soil type and distribution, by incorporating soil strength data into liquefaction analysis. This stage of the study included collating available geological and geotechnical data from Environment Canterbury and Christchurch City Council into a database, modelling liquefaction hazard and ground damage and presenting these as maps. The report contains many recommendations, which were taken up in subsequent stages of the study. (Note that the results of Stage 1 of the Christchurch liquefaction study were provided to Environment Canterbury as a letter rather than a report. This was a summary of work completed to 30 June 2001, including a review of geological and geotechnical data available within Environment Canterbury and Christchurch City Council records.) See Object Overview for background and usage information.

Research papers, University of Canterbury Library

This presentation discusses recent empirical ground motion modelling efforts in New Zealand. Firstly, the active shallow crustal and subduction interface and slab ground motion prediction equations (GMPEs) which are employed in the 2010 update of the national seismic hazard model (NSHM) are discussed. Other NZ-specific GMPEs developed, but not incorporated in the 2010 update, are then discussed, in particular the active shallow crustal model of Bradley (2010). A brief comparison of the NZ-specific GMPEs with the near-source ground motions recorded in the Canterbury earthquakes is then presented, given that these recordings collectively provide a significant increase in observed strong motions in the NZ catalogue. The ground motion prediction expert elicitation process that was undertaken following the Canterbury earthquakes for active shallow crustal earthquakes is then discussed. Finally, ongoing GMPE-related activities are discussed, including: ground motion and metadata database refinement, improved site characterization of strong motion stations, and predictions for subduction zone earthquakes.

Research papers, University of Canterbury Library

Unreinforced masonry (URM) structures comprise a majority of the global built heritage. The masonry heritage of New Zealand is comparatively younger than its European counterparts. In a country facing frequent earthquakes, URM buildings are prone to extensive damage and collapse. The Canterbury earthquake sequence demonstrated this vulnerability, causing damage to over _% of buildings. The ability to assess the severity of building damage is essential for emergency response and recovery. Following the Canterbury earthquakes, the damaged buildings were categorized into various damage states using the EMS-98 scale. This article investigates machine learning techniques such as k-nearest neighbors, decision trees, and random forests to rapidly assess earthquake-induced building damage. The damage data from the Canterbury earthquake sequence is used to obtain the forecast model, and the performance of each machine learning technique is evaluated using the remaining (test) data. Once high accuracy is achieved, the model is run on a building database compiled for Dunedin to predict the expected damage during a rupture of the Akatore fault.
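
A minimal sketch of the model comparison described above (k-nearest neighbours, decision tree, and random forest classifiers trained on damage data and scored on held-out data) is shown below; the file name and feature columns are placeholders, not the Canterbury damage dataset.

    # Sketch: compare classifiers for predicting EMS-98 damage states (placeholder data schema).
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.tree import DecisionTreeClassifier

    df = pd.read_csv("canterbury_urm_damage.csv")            # hypothetical file name
    X = pd.get_dummies(df[["storeys", "wall_thickness", "retrofit", "pga"]])  # placeholder features
    y = df["damage_state"]                                   # EMS-98 damage grade label

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1, stratify=y)
    models = {
        "kNN": KNeighborsClassifier(n_neighbors=5),
        "decision tree": DecisionTreeClassifier(max_depth=6, random_state=1),
        "random forest": RandomForestClassifier(n_estimators=200, random_state=1),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        print(name, "test accuracy:", round(model.score(X_te, y_te), 3))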

Research papers, University of Canterbury Library

This presentation summarizes the development of high-resolution surficial soil velocity models in the Canterbury, New Zealand basin. Shallow (<30m) shear wave velocities were primarily computed based on a combination of a large database of over 15,000 cone penetration test (CPT) logs in and around Christchurch, and a recently-developed Christchurch-specific empirical correlation between soil shear wave velocity and CPT. Large active-source testing at 22 locations and ambient-wavefield surface wave and H/V testing at over 80 locations were utilized in combination with 1700 water well logs to constrain the inter-bedded stratigraphy and velocity of Quaternary sediments up to depths of several hundred meters. Finally, seismic reflection profiles and the ambient-wavefield surface wave data provide constraint on velocities from several hundred meters to several kilometres. At all depths, the high resolution data illustrates the complexity of the soil conditions in the region, and the developed 3D models are presently being used in broadband ground motion simulations to further interpret the observed strong ground motions in the 2010-2011 Canterbury earthquake sequence.

Research papers, University of Canterbury Library

This poster provides a summary of the development of a 3D shallow (z<40m) shear wave velocity (Vs) model for the urban Christchurch, New Zealand region. The model is based on a recently developed Christchurch-specific empirical correlation between Vs and cone penetration test (CPT) data (McGann et al. 2014a,b) and the large high-density database of CPT logs in the greater Christchurch urban area (> 15,000 logs as of 01/01/2014). In particular, the 3D model provides shear wave velocities for the surficial Springston Formation, Christchurch Formation, and Riccarton gravel layers which generally comprise the upper 40m in the Christchurch urban area. Point-estimates are provided on a 200 m by 200 m grid from which interpolation to other locations can be performed. This model has applications for future site characterization and numerical modeling efforts via maps of time-averaged Vs over specific depths (e.g. Vs30, Vs10) and via the identification of typical Vs profiles for different regions and soil behaviour types within Christchurch. In addition, the Vs model can be used to constrain the near-surface velocities for the 3D seismic velocity model of the Canterbury basin (Lee et al. 2014) currently being developed for the purpose of broadband ground motion simulation.
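
The time-averaged velocities mentioned (e.g. Vs30, Vs10) follow the usual travel-time averaging over the profile, VsZ = Z / sum(h_i / Vs_i) for the layers within the upper Z metres. A short sketch with a placeholder layered profile:

    # Sketch: time-averaged shear wave velocity over the upper Z metres of a layered profile.
    import numpy as np

    def vs_z(thickness_m, vs_mps, z=30.0):
        """VsZ = Z / sum(h_i / Vs_i), truncating the profile at depth Z."""
        thickness_m = np.asarray(thickness_m, dtype=float)
        vs_mps = np.asarray(vs_mps, dtype=float)
        tops = np.concatenate(([0.0], np.cumsum(thickness_m)[:-1]))
        h_in = np.clip(np.minimum(tops + thickness_m, z) - tops, 0.0, None)  # thickness within 0-Z
        return z / np.sum(h_in / vs_mps)

    # Placeholder profile loosely mimicking Springston Fm / Christchurch Fm / Riccarton gravel layering.
    h = [8.0, 14.0, 20.0]      # layer thicknesses (m)
    vs = [180.0, 250.0, 400.0] # layer velocities (m/s)
    print("Vs30 =", round(vs_z(h, vs, z=30.0), 1), "m/s")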

Research papers, University of Canterbury Library

Over 900 buildings in the Christchurch central business district and 10,000 residential homes were demolished following the 22nd of February 2011 Canterbury earthquake, significantly disrupting the rebuild progress. This study looks to quantify the time required for demolitions during this event, which will be useful for future earthquake recovery planning. This was done using the Canterbury Earthquake Recovery Authority (CERA) demolition database, which allowed an in-depth look into the duration of each phase of the demolition process. The effect of building location, building height, and the stakeholder which initiated the demolition process (i.e. building owner or CERA) was investigated. The demolition process comprises five phases: (i) decision making, (ii) procurement and planning, (iii) demolition, (iv) site clean-up, and (v) completion certification. It was found that the time required to decide to demolish the building made up the majority of the total demolition duration. Demolition projects initiated by CERA had longer procurement and planning durations, but were quicker in other phases. Demolished buildings in the suburbs had longer decision-making durations, but location had little effect on the other phases of the demolition process. The decision making and procurement and planning phases of the demolition process were shorter for taller buildings, though the other phases took longer. Fragility functions for the duration of each phase in the demolition process are provided for the various categories of buildings for use in future studies.
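
Duration fragility functions of the kind referred to above are typically lognormal cumulative distribution functions fitted to the observed phase durations. A hedged sketch of that fit follows, using placeholder durations and method-of-moments estimates of the lognormal median and dispersion rather than CERA database values.

    # Sketch: fit a lognormal "fragility" (cumulative distribution) to phase durations.
    # Durations below are placeholders, not CERA database values.
    import numpy as np
    from scipy.stats import norm

    durations_days = np.array([30, 55, 90, 120, 180, 240, 365, 500])  # e.g. decision-making phase
    ln_d = np.log(durations_days)
    theta = np.exp(ln_d.mean())       # median duration (days)
    beta = ln_d.std(ddof=1)           # lognormal dispersion

    def prob_completed_within(d_days):
        """P(phase duration <= d) under the fitted lognormal model."""
        return norm.cdf(np.log(d_days / theta) / beta)

    print(f"median = {theta:.0f} days, beta = {beta:.2f}")
    print("P(done within 1 year) =", round(float(prob_completed_within(365)), 2))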

Research papers, University of Canterbury Library

After a high-intensity seismic event, inspections of structural damage need to be carried out as soon as possible in order to optimize the emergency management, as well as improving the recovery time. In current practice, damage inspections are performed by an experienced engineer who physically inspects the structures. This approach not only requires a significant amount of time and highly skilled human resources, but also raises concerns about the inspector’s safety. A promising alternative is the use of new technologies, such as drones and artificial intelligence, which can perform part of the damage classification task. In fact, drones can safely access high-hazard components of the structures, for instance bridge piers or abutments, and perform the reconnaissance using high-resolution cameras. Furthermore, images can be automatically processed by machine learning algorithms, and damage detected. In this paper, the possibility of applying such technologies for inspecting New Zealand bridges is explored. Firstly, a machine-learning model for damage detection by image analysis is presented. Specifically, the algorithm was trained to recognize cracks in concrete members. A sensitivity analysis was carried out to evaluate the algorithm's accuracy using database images. Depending on the confidence level desired, i.e. by allowing manual classification where the algorithm's confidence is below a specific tolerance, the accuracy was found to reach up to 84.7%. In the second part, the model is applied to detect the damage observed on the Anzac Bridge (GPS coordinates -43.500865, 172.701138) in Christchurch by performing a drone reconnaissance. Results show that the accuracy of the damage detection was equal to 88% and 63% for cracking and spalling, respectively.
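
A compact convolutional classifier of the kind described (crack versus no-crack patches from inspection imagery) could be sketched as below. This is a generic PyTorch illustration, not the model used in the paper; the input patch size and the two-class labelling are assumptions.

    # Sketch: minimal CNN for binary crack / no-crack classification of image patches
    # (generic illustration, not the trained model from the study).
    import torch
    import torch.nn as nn

    class CrackClassifier(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
                nn.Linear(64, 2),                      # logits: [no crack, crack]
            )

        def forward(self, x):                          # x: (batch, 3, 64, 64) image patches
            return self.classifier(self.features(x))

    model = CrackClassifier()
    dummy_patch = torch.randn(1, 3, 64, 64)            # placeholder drone image patch
    probs = torch.softmax(model(dummy_patch), dim=1)
    print("P(crack) =", float(probs[0, 1]))            # route to manual review if confidence is low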

Research papers, University of Canterbury Library

High-quality ground motion records are required for engineering applications including response history analysis, seismic hazard development, and validation of physics-based ground motion simulations. However, the determination of whether a ground motion record is high-quality is poorly handled by automation with mathematical functions and can become prohibitive if done manually. Machine learning applications are well-suited to this problem, and a previous feed-forward neural network was developed (Bellagamba et al. 2019) to determine high-quality records from small crustal events in the Canterbury and Wellington regions for simulation validation. This prior work was however limited by the omission of moderate-to-large magnitude events and those from other tectonic environments, as well as a lack of explicit determination of the minimum usable frequency of the ground motion. To address these shortcomings, an updated neural network was developed to predict the quality of ground motion records for all magnitudes and all tectonic sources—active shallow crustal, subduction intraslab, and subduction interface—in New Zealand. The predictive performance of the previous feed-forward neural network was matched by the neural network in the domain of small crustal records, and this level of predictive performance is now extended to all source magnitudes and types in New Zealand making the neural network applicable to global ground motion databases. Furthermore, the neural network provides quality and minimum usable frequency predictions for each of the three orthogonal components of a record which may then be mapped into a binary quality decision or otherwise applied as desired. This framework provides flexibility for the end user to predict high-quality records with various acceptability thresholds allowing for this neural network to be used in a range of applications.
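
The per-component outputs described (a quality score and a minimum usable frequency for each of the three orthogonal components) could be represented with a small multi-output network like the sketch below; the architecture, feature count, and output scaling are assumptions for illustration and not the network developed in the study.

    # Sketch: feed-forward network predicting, for each of 3 components, a quality
    # score in [0, 1] and a minimum usable frequency (Hz). Architecture is illustrative only.
    import torch
    import torch.nn as nn

    class RecordQualityNet(nn.Module):
        def __init__(self, n_features=20):
            super().__init__()
            self.trunk = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(),
                                       nn.Linear(64, 32), nn.ReLU())
            self.quality_head = nn.Linear(32, 3)    # one quality score per component
            self.fmin_head = nn.Linear(32, 3)       # one log-frequency per component

        def forward(self, x):
            h = self.trunk(x)
            quality = torch.sigmoid(self.quality_head(h))   # component quality scores in [0, 1]
            fmin_hz = torch.exp(self.fmin_head(h))          # positive minimum usable frequencies
            return quality, fmin_hz

    net = RecordQualityNet()
    features = torch.randn(1, 20)                    # placeholder waveform-derived metrics
    q, fmin = net(features)
    usable = (q > 0.5).all()                         # example mapping to a binary quality decision
    print(q, fmin, bool(usable))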

Research papers, University of Canterbury Library

The purpose of this thesis is to conduct a detailed examination of the forward-directivity characteristics of near-fault ground motions produced in the 2010-11 Canterbury earthquakes, including evaluating the efficacy of several existing empirical models which form the basis of frameworks for considering directivity in seismic hazard assessment. A wavelet-based pulse classification algorithm developed by Baker (2007) is firstly used to identify and characterise ground motions which demonstrate evidence of forward-directivity effects from significant events in the Canterbury earthquake sequence. The algorithm fails to classify a large number of ground motions which clearly exhibit an early-arriving directivity pulse due to: (i) incorrect pulse extraction resulting from the presence of pulse-like features caused by other physical phenomena; and (ii) inadequacy of the pulse indicator score used to carry out binary pulse-like/non-pulse-like classification. An alternative ‘manual’ approach is proposed to ensure ‘correct’ pulse extraction, and the classification process is also guided by examination of the horizontal velocity trajectory plots and source-to-site geometry. Based on the above analysis, 59 pulse-like ground motions are identified from the Canterbury earthquakes which, in the author's opinion, are caused by forward-directivity effects. The pulses are also characterised in terms of their period and amplitude. A revised version of the B07 algorithm developed by Shahi (2013) is also subsequently utilised but without observing any notable improvement in the pulse classification results. A series of three chapters is dedicated to assessing the predictive capabilities of empirical models to predict: (i) the probability of pulse occurrence; (ii) the response spectrum amplification caused by the directivity pulse; and (iii) the period and amplitude (peak ground velocity, PGV) of the directivity pulse, using observations from four significant events in the Canterbury earthquakes. Based on the results of logistic regression analysis, it is found that the pulse probability model of Shahi (2013) provides the most improved predictions in comparison to its predecessors. Pulse probability contour maps are developed to scrutinise observations of pulses/non-pulses with predicted probabilities. A direct comparison of the observed and predicted directivity amplification of acceleration response spectra reveals the inadequacy of broadband directivity models, which form the basis of the near-fault factor in the New Zealand loadings standard, NZS1170.5:2004. In contrast, a recently developed narrowband model by Shahi & Baker (2011) provides significantly improved predictions by amplifying the response spectra within a small range of periods. The significant positive bias demonstrated by the residuals associated with all models at longer vibration periods (in the Mw7.1 Darfield and Mw6.2 Christchurch earthquakes) is likely due to the influence of basin-induced surface waves and non-linear soil response. Empirical models for the pulse period notably under-predict observations from the Darfield and Christchurch earthquakes, inferred as being a result of both the effect of nonlinear site response and the influence of the Canterbury basin. In contrast, observed pulse periods from the smaller magnitude June (Mw6.0) and December (Mw5.9) 2011 earthquakes are in good agreement with predictions. Models for the pulse amplitude generally provide accurate estimates of the observations at source-to-site distances between 1 km and 10 km.
At longer distances, observed PGVs are significantly under-predicted due to their slower apparent attenuation. Mixed-effects regression is employed to develop revised models for both parameters using the latest NGA-West2 pulse-like ground motion database. A pulse period relationship which accounts for the effect of faulting mechanism using rake angle as a continuous predictor variable is developed. The use of a larger database in model development, however, does not result in improved predictions of pulse period for the Darfield and Christchurch earthquakes. In contrast, the revised model for PGV provides a more appropriate attenuation of the pulse amplitude with distance, and does not exhibit the bias associated with previous models. Finally, the effects of near-fault directivity are explicitly included in NZ-specific probabilistic seismic hazard analysis (PSHA) using the narrowband directivity model of Shahi & Baker (2011). Seismic hazard analyses are conducted with and without considering directivity for typical sites in Christchurch and Otira. The inadequacy of the near-fault factor in NZS1170.5:2004 is apparent based on a comparison with the directivity amplification obtained from PSHA.
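
The logistic-regression check on the pulse probability models mentioned above can be sketched as follows: regress the observed pulse / no-pulse classifications on a model's predicted probabilities and inspect the fitted relationship. The data below are placeholders, not the Canterbury observations.

    # Sketch: evaluate a pulse-probability model by logistic regression of observed
    # pulse occurrence on the model's predicted probability (placeholder data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    predicted_prob = np.array([0.05, 0.15, 0.30, 0.45, 0.60, 0.75, 0.85, 0.95]).reshape(-1, 1)
    observed_pulse = np.array([0, 0, 0, 1, 0, 1, 1, 1])   # 1 = forward-directivity pulse identified

    reg = LogisticRegression().fit(predicted_prob, observed_pulse)
    print("coefficient:", reg.coef_[0][0], "intercept:", reg.intercept_[0])
    print("fitted P(pulse) at predicted 0.5:", reg.predict_proba([[0.5]])[0, 1])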

Research papers, University of Canterbury Library

The overarching goal of this dissertation is to improve predictive capabilities of geotechnical seismic site response analyses by incorporating additional salient physical phenomena that influence site effects. Specifically, multidimensional wave-propagation effects that are neglected in conventional 1D site response analyses are incorporated by: (1) combining results of 3D regional-scale simulations with 1D nonlinear wave-propagation site response analysis, and (2) modelling soil heterogeneity in 2D site response analyses using spatially-correlated random fields to perturb soil properties. A method to combine results from 3D hybrid physics-based ground motion simulations with site-specific nonlinear site response analyses was developed. The 3D simulations capture 3D ground motion phenomena on a regional scale, while the 1D nonlinear site response, which is informed by detailed site-specific soil characterization data, can capture site effects more rigorously. Simulations of 11 moderate-to-large earthquakes from the 2010-2011 Canterbury Earthquake Sequence (CES) at 20 strong motion stations (SMS) were used to validate simulations with observed ground motions. The predictions were compared to those from an empirically-based ground motion model (GMM), and from 3D simulations with simplified VS30-based site effects modelling. By comparing all predictions to observations at seismic recording stations, it was found that the 3D physics-based simulations can predict ground motions with comparable bias and uncertainty as the GMM, albeit with significantly lower bias at long periods. Additionally, the explicit modelling of nonlinear site-response improves predictions significantly compared to the simplified VS30-based approach for soft-soil or atypical sites that exhibit exceptionally strong site effects. A method to account for the spatial variability of soils and wave scattering in 2D site response analyses was developed and validated against a database of vertical array sites in California. The inputs required to run the 2D analyses are nominally the same as those required for 1D analyses (except for spatial correlation parameters), enabling easier adoption in practice. The first step was to create the platform and workflow, and to perform a sensitivity study involving 5,400 2D model realizations to investigate the influence of random field input parameters on wave scattering and site response. Boundary conditions were carefully assessed to understand their effect on the modelled response and select appropriate assumptions for use on a 2D model with lateral heterogeneities. Multiple ground-motion intensity measures (IMs) were analyzed to quantify the influence from random field input parameters and boundary conditions. It was found that this method is capable of scattering seismic waves and creating spatially-varying ground motions at the ground surface. The redistribution of ground-motion energy across wider frequency bands, and the scattering attenuation of high-frequency waves in 2D analyses, resemble features observed in empirical transfer functions (ETFs) computed in other studies. The developed 2D method was subsequently extended to more complicated multi-layer soil profiles and applied to a database of 21 vertical array sites in California to test its appropriateness for future predictions. Again, different boundary condition and input motion assumptions were explored to extend the method to the in-situ conditions of a vertical array (with a sensor embedded in the soil).
ETFs were compared to theoretical transfer functions (TTFs) from conventional 1D analyses and 2D analyses with heterogeneity. Residuals of transfer-function-based IMs, and IMs of surface ground motions, were also used as validation metrics. The spatial variability of transfer-function-based IMs was estimated from 2D models and compared to the event-to-event variability from ETFs. This method was found capable of significantly improving predictions of median ETF amplification factors, especially for sites that display higher event-to-event variability. For sites that are well represented by 1D methods, the 2D approach can underpredict amplification factors at higher modes, suggesting that the level of heterogeneity may be over-represented by the 2D random field models used in this study.
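
A minimal sketch of the spatially-correlated random-field perturbation used in the 2D analyses is given below: a Gaussian field with an exponential, anisotropic correlation structure is sampled on a small grid and applied as a lognormal perturbation to a baseline Vs profile. The grid size, correlation lengths, and standard deviation are illustrative values, not those of the study.

    # Sketch: 2D spatially-correlated random field used to perturb a baseline Vs model
    # (exponential correlation, lognormal perturbation; all parameter values illustrative).
    import numpy as np

    nx, nz = 30, 15                       # small grid so the dense covariance stays tractable
    dx, dz = 5.0, 2.0                     # grid spacing (m)
    ax, az = 40.0, 8.0                    # horizontal / vertical correlation lengths (m)
    sigma_ln_vs = 0.15                    # standard deviation of ln(Vs) perturbation

    x, z = np.meshgrid(np.arange(nx) * dx, np.arange(nz) * dz, indexing="ij")
    pts = np.column_stack([x.ravel(), z.ravel()])

    # Exponential correlation with anisotropic normalised lag distance.
    lag = np.sqrt(((pts[:, None, 0] - pts[None, :, 0]) / ax) ** 2 +
                  ((pts[:, None, 1] - pts[None, :, 1]) / az) ** 2)
    cov = np.exp(-lag) + 1e-8 * np.eye(len(pts))      # small jitter for numerical stability

    rng = np.random.default_rng(0)
    field = (np.linalg.cholesky(cov) @ rng.standard_normal(len(pts))).reshape(nx, nz)

    vs_baseline = 200.0 + 10.0 * z        # placeholder Vs profile increasing with depth (m/s)
    vs_perturbed = vs_baseline * np.exp(sigma_ln_vs * field)
    print(vs_perturbed.mean(), vs_perturbed.std())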