Research papers, University of Canterbury Library

The 2010-2011 Canterbury earthquakes were recorded by a dense strong motion network in the near-source region, yielding significant observational evidence of seismic complexities, and a basis for interpretation of multi-disciplinary datasets and induced damage to the natural and built environment. This paper provides an overview of observed strong motions from these events and retrospective comparisons with both empirical and physics-based ground motion models. Both empirical and physics-based methods provide good predictions of observations at short vibration periods in an average sense. However, observed ground motion amplitudes at specific locations, such as Heathcote Valley, are seen to systematically depart from ‘average’ empirical predictions as a result of near-surface stratigraphic and topographic features, which are well modelled via site-specific response analyses. Significant insight into the long-period bias in empirical predictions is obtained from the use of hybrid broadband ground motion simulation. The comparison of both empirical and physics-based simulations against a set of 10 events in the sequence clearly illustrates the potential for simulations to improve ground motion and site response prediction, both at present and further in the future.

Research papers, University of Canterbury Library

The purpose of this thesis is to conduct a detailed examination of the forward-directivity characteristics of near-fault ground motions produced in the 2010-11 Canterbury earthquakes, including evaluating the efficacy of several existing empirical models which form the basis of frameworks for considering directivity in seismic hazard assessment. A wavelet-based pulse classification algorithm developed by Baker (2007) is first used to identify and characterise ground motions which demonstrate evidence of forward-directivity effects from significant events in the Canterbury earthquake sequence. The algorithm fails to classify a large number of ground motions which clearly exhibit an early-arriving directivity pulse due to: (i) incorrect pulse extraction resulting from the presence of pulse-like features caused by other physical phenomena; and (ii) inadequacy of the pulse indicator score used to carry out binary pulse-like/non-pulse-like classification. An alternative ‘manual’ approach is proposed to ensure 'correct' pulse extraction, and the classification process is also guided by examination of the horizontal velocity trajectory plots and source-to-site geometry. Based on the above analysis, 59 pulse-like ground motions are identified from the Canterbury earthquakes which, in the author's opinion, are caused by forward-directivity effects. The pulses are also characterised in terms of their period and amplitude. A revised version of the B07 algorithm developed by Shahi (2013) is subsequently utilised, but without any notable improvement in the pulse classification results. A series of three chapters is dedicated to assessing the capabilities of empirical models to predict: (i) the probability of pulse occurrence; (ii) the response spectrum amplification caused by the directivity pulse; and (iii) the period and amplitude (peak ground velocity, PGV) of the directivity pulse, using observations from four significant events in the Canterbury earthquakes.
Based on the results of logistic regression analysis, it is found that the pulse probability model of Shahi (2013) provides the most improved predictions in comparison to its predecessors. Pulse probability contour maps are developed to scrutinise observations of pulses/non-pulses with predicted probabilities. A direct comparison of the observed and predicted directivity amplification of acceleration response spectra reveals the inadequacy of broadband directivity models, which form the basis of the near-fault factor in the New Zealand loadings standard, NZS1170.5:2004. In contrast, a recently developed narrowband model by Shahi & Baker (2011) provides significantly improved predictions by amplifying the response spectra within a small range of periods. The significant positive bias demonstrated by the residuals associated with all models at longer vibration periods (in the Mw7.1 Darfield and Mw6.2 Christchurch earthquakes) is likely due to the influence of basin-induced surface waves and non-linear soil response. Empirical models for the pulse period notably under-predict observations from the Darfield and Christchurch earthquakes, inferred as being a result of both the effect of nonlinear site response and influence of the Canterbury basin. In contrast, observed pulse periods from the smaller magnitude June (Mw6.0) and December (Mw5.9) 2011 earthquakes are in good agreement with predictions. Models for the pulse amplitude generally provide accurate estimates of the observations at source-to-site distances between 1 km and 10 km. At longer distances, observed PGVs are significantly under-predicted due to their slower apparent attenuation. Mixed-effects regression is employed to develop revised models for both parameters using the latest NGA-West2 pulse-like ground motion database. A pulse period relationship which accounts for the effect of faulting mechanism using rake angle as a continuous predictor variable is developed. 
The use of a larger database in model development, however, does not result in improved predictions of pulse period for the Darfield and Christchurch earthquakes. In contrast, the revised model for PGV provides a more appropriate attenuation of the pulse amplitude with distance, and does not exhibit the bias associated with previous models. Finally, the effects of near-fault directivity are explicitly included in NZ-specific probabilistic seismic hazard analysis (PSHA) using the narrowband directivity model of Shahi & Baker (2011). Seismic hazard analyses are conducted with and without considering directivity for typical sites in Christchurch and Otira. The inadequacy of the near-fault factor in NZS1170.5:2004 is apparent based on a comparison with the directivity amplification obtained from PSHA.
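The pulse-occurrence modelling step above can be sketched as a logistic regression. The sketch below is illustrative only: the single distance predictor, the coefficients, and the synthetic fitting data are assumptions for this example, not the Shahi (2013) model or the thesis's regression.

```python
import numpy as np

# Hypothetical logistic model: probability of observing a directivity
# pulse as a function of source-to-site distance r (km). The default
# coefficients b0, b1 are illustrative, not fitted values from the thesis.
def pulse_probability(r, b0=1.0, b1=-0.15):
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * np.asarray(r, dtype=float))))

# Fit b0, b1 to binary pulse/non-pulse observations by maximum likelihood,
# via plain gradient ascent on the log-likelihood (for illustration only;
# any statistics package performs this more robustly).
def fit_logistic(r, y, lr=0.01, n_iter=20000):
    r, y = np.asarray(r, dtype=float), np.asarray(y, dtype=float)
    b0 = b1 = 0.0
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(b0 + b1 * r)))
        b0 += lr * np.mean(y - p)          # gradient w.r.t. intercept
        b1 += lr * np.mean((y - p) * r)    # gradient w.r.t. slope
    return b0, b1
```

With a negative fitted slope, the model reproduces the expected behaviour that pulse likelihood decays with distance from the rupture.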

Research papers, University of Canterbury Library

This paper examines the consistency of the seismicity and ground motion models used for seismic hazard analysis in New Zealand with observations from the Canterbury earthquakes. An overview is first given of seismicity and ground motion modelling as inputs to probabilistic seismic hazard analysis, whose results form the basis for elastic response spectra in NZS1170.5:2004. The magnitudes of earthquakes in the Canterbury earthquake sequence are adequately allowed for in the current NZ seismicity model; however, the consideration of ‘background’ earthquakes as point sources at a minimum depth of 10 km results in up to a 60% underestimation of the ground motions that such events produce. The ground motion model used in conventional NZ seismic hazard analysis is shown to provide biased predictions of response spectra (over-prediction near T=0.2 s, and under-prediction at moderate-to-large vibration periods). Improved ground motion prediction can be achieved using more recent NZ-specific models.
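The effect of the fixed 10 km point-source depth can be illustrated with a back-of-envelope calculation. This sketch assumes simple 1/R geometric spreading and a hypothetical 5 km true source depth; it illustrates the mechanism only and is not the NZ ground motion model.

```python
import math

# Back-of-envelope illustration: with amplitude ~ 1/R, modelling a shallow
# event (true_depth_km, hypothetical) as a point source fixed at 10 km
# depth inflates the source-to-site distance and deflates the predicted
# amplitude, most strongly at short epicentral distances.
def amplitude_ratio(epi_km, true_depth_km=5.0, model_depth_km=10.0):
    r_true = math.hypot(epi_km, true_depth_km)
    r_model = math.hypot(epi_km, model_depth_km)
    return r_model / r_true  # > 1 means the model under-predicts by this factor
```

Directly above a shallow event this simple spreading argument gives an under-prediction factor of about two, while the effect fades at epicentral distances much larger than the depth difference, consistent with the near-source underestimation noted above.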

Research papers, University of Canterbury Library

Natural catastrophes are increasing worldwide. They are becoming more frequent, more severe, and more impactful on our built environment, leading to extensive damage and losses. Earthquakes account for the smallest share of natural catastrophe events; nevertheless, seismic damage led to the most fatalities and significant losses over the period 1981-2016 (Munich Re). Damage prediction is helpful for emergency management and the development of earthquake risk mitigation projects. Recent design efforts have focused on performance-based earthquake engineering, where damage estimation methodologies use fragility and vulnerability functions. However, this approach does not explicitly specify the essential criteria leading to economic losses. There is thus a need for an improved methodology that identifies the critical building elements related to significant losses. The methodology presented here uses data science techniques to identify key building features that contribute to the bulk of losses. It uses empirical data collected on site during earthquake reconnaissance missions to train a machine learning model that can further be used for the estimation of building damage post-earthquake. The first model is developed for Christchurch. Empirical building damage data from the 2010-2011 earthquake events is analysed to find the building features that contributed the most to damage. Once processed, the data is used to train a machine-learning model that can be applied to estimate losses in future earthquake events.

Research papers, University of Canterbury Library

Background: Liquefaction-induced land damage has been identified in more than 13 notable New Zealand earthquakes within the past 150 years, as presented on the timeline below. Following the 2010-2011 Canterbury Earthquake Sequence (CES), the consequences of liquefaction were witnessed first-hand in the city of Christchurch, and as a result the demand for understanding this phenomenon was heightened. Government, local councils, insurers and many other stakeholders are now looking to research and understand their exposure to this natural hazard.

Research papers, University of Canterbury Library

This presentation discusses recent empirical ground motion modelling efforts in New Zealand. Firstly, the active shallow crustal and subduction interface and slab ground motion prediction equations (GMPEs) employed in the 2010 update of the national seismic hazard model (NSHM) are discussed. Other NZ-specific GMPEs developed but not incorporated in the 2010 update are then discussed, in particular the active shallow crustal model of Bradley (2010). A brief comparison of the NZ-specific GMPEs with the near-source ground motions recorded in the Canterbury earthquakes is then presented, given that these recordings collectively provide a significant increase in observed strong motions in the NZ catalogue. The ground motion prediction expert elicitation process that was undertaken following the Canterbury earthquakes for active shallow crustal earthquakes is then discussed. Finally, ongoing GMPE-related activities are discussed, including: ground motion and metadata database refinement, improved site characterization of strong motion stations, and predictions for subduction zone earthquakes.

Research papers, University of Canterbury Library

The Canterbury Earthquake Sequence (CES) induced extensive damage in residential buildings and led to over NZ$40 billion in total economic losses. Due to the unique insurance setting in New Zealand, up to 80% of the financial losses were insured. Over the CES, the Earthquake Commission (EQC) received more than 412,000 insurance claims for residential buildings. The 4 September 2010 earthquake is the event for which most of the claims were lodged, with more than 138,000 residential claims for this event alone. This research project uses the EQC claim database to develop a seismic loss prediction model for residential buildings in Christchurch. It uses machine learning to create a procedure capable of highlighting the critical features that most affected building losses. Further study of those features enables the generation of insights that can be used by various stakeholders, for example, to better understand the influence of a structural system on building loss or to select appropriate risk mitigation measures. Prior to training the machine learning model, the claim dataset was supplemented with additional data sourced from private and open-access databases giving complementary information related to building characteristics, seismic demand, liquefaction occurrence and soil conditions. This poster presents results of a machine learning model trained on the merged dataset using residential claims from the 4 September 2010 event.
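As a simplified stand-in for the workflow above, feature screening on a merged claim dataset can be sketched as follows. The feature names are hypothetical and the correlation-based ranking is an assumption for illustration; the poster's procedure trains a machine learning model rather than ranking by correlation.

```python
import numpy as np

# First-pass screen: rank candidate building/site features by the absolute
# Pearson correlation of each feature column with the observed loss.
# This is only a crude proxy for the importance measures of a trained model.
def rank_features(X, names, loss):
    X, loss = np.asarray(X, dtype=float), np.asarray(loss, dtype=float)
    scores = [abs(np.corrcoef(X[:, j], loss)[0, 1]) for j in range(X.shape[1])]
    order = np.argsort(scores)[::-1]
    return [(names[j], round(scores[j], 3)) for j in order]
```

On a merged table of claims and site data, such a screen would flag the columns most strongly associated with loss before any model is fitted.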

Research papers, University of Canterbury Library

Since the early 1980s, seismic hazard assessment in New Zealand has been based on Probabilistic Seismic Hazard Analysis (PSHA). The most recent version of the New Zealand National Seismic Hazard Model, a PSHA model, was published by Stirling et al. (2012). This model follows standard PSHA principles and combines a nation-wide model of active faults with a gridded point-source model based on the earthquake catalogue since 1840. These models are coupled with the ground-motion prediction equation of McVerry et al. (2006). Additionally, we have developed a time-dependent, clustering-based PSHA model for the Canterbury region (Gerstenberger et al., 2014) in response to the Canterbury earthquake sequence. We are now in the process of revising the national model. In this process we are investigating several of the fundamental assumptions of traditional PSHA and of how we modelled hazard in the past. This project has three main focuses: 1) how do we design an optimal combination of multiple sources of information to produce the best forecast of earthquake rates in the next 50 years: can we improve upon a simple hybrid of fault sources and background sources, and can we better handle the uncertainties in the data and models (e.g., fault segmentation, frequency-magnitude distributions, time-dependence and clustering, low strain-rate areas, and subduction zone modelling)? 2) developing revised and new ground-motion prediction models, including better capture of epistemic uncertainty; a key focus in this work is developing a new strong ground motion catalogue for model development; and 3) how can we best quantify whether changes we have made in our modelling are truly improvements? Throughout this process we are working toward incorporating numerical modelling results from physics-based synthetic seismicity and ground-motion models.

Research papers, University of Canterbury Library

Research Report No. 2010-03. Ground motion prediction equations (GMPEs) for geometric-mean pseudo-spectral acceleration amplitudes from New Zealand (NZ) earthquakes are developed. A database of 2437 three-component ground motion records is developed by applying stringent quality criteria to the historically recorded events in NZ. Despite the large number of records, the database is deficient in empirical records from large magnitude events recorded at close distances to the fault rupture plane. As a result, the basis for the NZ-specific GMPE development is to examine the applicability of foreign GMPEs for similar tectonic regions and then modify the most applicable GMPEs based on both theoretical and statistically significant empirically-driven arguments. For active shallow crustal events, five different GMPEs are considered. It was found that the McVerry et al. (2006) model, which is the current model upon which seismic design guidelines and site-specific seismic hazard analyses in NZ are based, provided the worst fit to the NZ database, and that the Chiou et al. (2010) (C10) modification of the Chiou and Youngs (2008) model was the most applicable. Discrepancies between the C10 model and the NZ database that were empirically identified and theoretically justified were used to modify the C10 model for: (i) small magnitude scaling; (ii) scaling of short period ground motion from normal faulting events in volcanic crust; (iii) scaling of ground motions on very hard rock sites; (iv) anelastic attenuation in the NZ crust; and (v) consideration of the increased anelastic attenuation in the Taupo Volcanic Zone (TVZ). For subduction slab events, three models were initially considered. It was found that all of the models had some significant biases with respect to applicability for NZ. The Zhao et al.
(2006) (Z06) model was selected because of the rigorous database upon which it was developed, and was modified by: (i) NZ-specific scaling at small magnitudes; (ii) path scaling at large distances; (iii) consideration of the increased TVZ attenuation; and (iv) revision of the standard deviation model. Based on these modifications the developed model showed no bias in the inter- and intra-event residuals as a function of various predictor variables. The standard deviation of the residuals using the revised standard deviation model also indicated that the model has adequate precision. Three GMPEs were considered for subduction interface events. The Zhao et al. (2006) (Z06) model was the best performing model, with bias exhibited only in the site response model, and possible over-prediction of large magnitude events. The Z06 interface model was modified to account for site response and magnitude scaling using the same functional forms as those of the developed active shallow crustal and subduction slab models. The developed model showed no bias in the inter- and intra-event residuals as a function of various predictor variables. The developed GMPEs include specific features as evident in the NZ database; consistent scaling for parameters not well constrained by the NZ database; and pseudo-spectral amplitudes for vibration periods from 0.01 to 10 seconds. Hence, these models represent a significant advance in the state of the art for empirical ground motion prediction in NZ.
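The residual-bias checks described above can be sketched as a binned-mean diagnostic: an unbiased model has residual means near zero across all bins of a predictor variable (magnitude, distance, etc.). This is a minimal stand-in for the formal inter-/intra-event residual analysis, not the report's method.

```python
import numpy as np

# Bin residuals by a predictor variable and return (bin centre, mean
# residual) pairs. Systematic departure of the bin means from zero
# indicates a bias of the model with respect to that predictor.
def binned_residual_means(predictor, residuals, n_bins=5):
    predictor = np.asarray(predictor, dtype=float)
    residuals = np.asarray(residuals, dtype=float)
    edges = np.linspace(predictor.min(), predictor.max(), n_bins + 1)
    idx = np.clip(np.digitize(predictor, edges) - 1, 0, n_bins - 1)
    return [(0.5 * (edges[i] + edges[i + 1]), float(residuals[idx == i].mean()))
            for i in range(n_bins) if np.any(idx == i)]
```

Applied to, say, magnitude as the predictor, a trend in the bin means from negative to positive would motivate a magnitude-scaling modification of the kind described above.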

Research papers, University of Canterbury Library

This paper provides a summary of initial research results investigating systematic site effects from the prediction residuals of empirical and physics-based ground-motion models (GMMs) for small magnitude (i.e., 3.5 ≤ MW ≤ 5) active shallow crustal earthquakes in New Zealand (NZ). Advancing ground-motion predictability through physics-based GMMs is an iterative process and requires addressing fundamental questions such as: Is there salient physics which has been overlooked? Which geographic regions have predictions that significantly deviate from observations, and why? Which sites exhibit systematic prediction residuals, and how can the attributes influencing them be identified? This preliminary study examines these questions by classifying 171 sites from the Canterbury and Wellington regions into four geomorphic categories: basin, basin-edge, hill, and valley, following the categorisation of Nweke et al. (2022). Trends in the site-to-site residuals for each geomorphic category indicate apparent differences between the four categories, with residuals for valley sites showing a clear dependence on the inferred fundamental site period. Computed residuals from both empirical and physics-based GMMs also provide insight into the role of site-specific attributes versus the different prediction methods, helping to understand the salient causes of these residuals.

Research papers, The University of Auckland Library

The recent instances of seismic activity in Canterbury (2010/11) and Kaikōura (2016) in New Zealand exposed an unexpected level of damage to non-structural components, such as buried pipelines and building envelope systems. The cost of broken buried infrastructure, such as pipeline systems, to the Christchurch Council was excessive, as was the cost of repairing building envelopes to building owners in both Christchurch and Wellington (due to the Kaikōura earthquake), which indicates there are problems with the compliance pathways for both of these systems. Councils rely on product testing and robust engineering design practices to provide compliance certification on the suitability of product systems, while asset and building owners rely on that compliance as proof of an acceptable design. In addition, forensic engineers and lifeline analysts rely on the same product testing and design techniques to analyse earthquake-related failures or predict future outcomes pre-earthquake, respectively. The aim of this research was to record the actual field-observed seismic damage to buried pipeline and building envelope systems from the Canterbury and Kaikōura earthquakes, develop suitable testing protocols to test the systems' seismic resilience, and produce prediction design tools that deliver results reflecting the collected field observations with better accuracy than the present tools used by forensic engineers and lifeline analysts. The main research chapters of this thesis comprise four publications that describe the gathering of seismic damage to pipes (Publication 1 of 4) and building envelopes (Publication 2 of 4). Experimental testing and the development of prediction design tools for both systems are described in Publications 3 and 4.
The field observations (discussed in Publication 1 of 4) revealed that segmented pipe joints, such as those used in thick-walled PVC pipes, were particularly unsatisfactory with respect to the joint's seismic resilience capabilities. Once the joint was damaged, silt and other deleterious material were able to penetrate the pipeline, causing blockages and the shutdown of key infrastructure services. At present, the governing Standards for PVC pipes are AS/NZS 1477 (pressure systems) and AS/NZS 1260 (gravity systems), which do not include a protocol for evaluating PVC pipe joints for seismic resilience. Testing methodologies were designed to test a PVC pipe joint under various simultaneously applied axial and transverse loads (discussed in Publication 3 of 4). The goal of the laboratory experiment was to establish an easy-to-apply testing protocol that could fill the void in the mentioned Standards and produce boundary data for developing a design tool that could predict the observed failures given the site-specific conditions surrounding the pipe. Extensive building envelope glazing system damage was recorded in the CBDs of both Christchurch and Wellington, including gasket dislodgement, cracked glazing, and dislodged glazing. The observational research (Publication 2 of 4) concluded that the glazing systems were a good indicator of building envelope damage, as the glazing had consistent breaking characteristics, like a ballistic fuse used in forensic blast analysis. The compliance testing protocol recognised in the New Zealand Building Code, Verification Method E2/VM1, relies on the testing method from Standard AS/NZS 4284 and stipulates that typical penetrations, such as glazing systems, be included in the test specimen.
Some of the building envelope systems that failed in the recent New Zealand earthquakes had been assessed with glazing systems using either the AS/NZS 4284 or E2/VM1 methods and still failed unexpectedly, which suggests that improvements to the testing protocols are required. An experiment was designed to mimic the observed earthquake damage using bi-directional loading (discussed in Publication 4 of 4) and to identify improvements to the current testing protocol. As with pipes, the observational and test data were then used to develop a design prediction tool. For both pipes (Publication 3 of 4) and glazing systems (Publication 4 of 4), experimentation suggests that modifying the existing testing Standards would yield more realistic earthquake damage results. The research indicates that including a specific joint testing regime for pipes and positioning the glazing system in a specific location in the specimen would improve the relevant Standards with respect to the seismic resilience of these systems. Improving seismic resilience in pipe joints and glazing systems would improve existing Council compliance pathways, which would potentially reduce the liability of damage claims against the government after an earthquake event. The developed design prediction tools, for both pipe and glazing systems, use local data specific to the system being scrutinised, such as local geology, dimensional characteristics of the system, actual or predicted peak ground accelerations (both vertical and horizontal) and results of product-specific bi-directional testing. The design prediction tools would improve the accuracy of existing techniques used by forensic engineers examining the cause of failure after an earthquake and by lifeline analysts examining predictive earthquake damage scenarios.

Research papers, University of Canterbury Library

As a consequence of the 2010-2011 Canterbury earthquake sequence, Christchurch experienced widespread liquefaction, vertical settlement and lateral spreading. These geological processes caused extensive damage to both housing and infrastructure, and substantially increased the need for geotechnical investigation. Cone Penetration Testing (CPT) has become the most common method for liquefaction assessment in Christchurch, and issues have been identified with the soil behaviour type, liquefaction potential and vertical settlement estimates, particularly in the north-western suburbs of Christchurch where soils consist mostly of silts, clayey silts and silty clays. The CPT soil behaviour type often appears to over-estimate the fines content within a soil, while the liquefaction potential and vertical settlement are often calculated to be higher than those measured after the Canterbury earthquake sequence. To investigate these issues, laboratory work was carried out on three adjacent CPT/borehole pairs from the Groynes Park subdivision in northern Christchurch. Boreholes were logged according to NZGS standards, separated into stratigraphic layers, and laboratory tests were conducted on representative samples. Comparison of these results with the CPT soil behaviour types provided valuable information: on average, 62% of soils at the Groynes Park subdivision were specified by the CPT as finer than what was actually present, 20% were specified as coarser than what was actually present, and only 18% were correctly classified. Hence the CPT soil behaviour type is not accurately describing the stratigraphic profile at the Groynes Park subdivision, and it is understood that this is also the case in much of north-west Christchurch where similar soils are found.
The computer software CLiq, by GeoLogismiki, uses assessment parameter constants which can be adjusted for each CPT file, in an attempt to make each analysis more accurate. These parameter changes can in some cases substantially alter the results of a liquefaction analysis. The sensitivity of the analysis to raising and lowering the water table, lowering the soil behaviour type index (Ic) liquefaction cutoff value, the layer detection option, and the weighting factor option was analysed by comparison with a set of ‘base settings’. The investigation confirmed that liquefaction analysis results can be very sensitive to the parameters selected, and demonstrated the dependency of the soil behaviour type on the soil behaviour type index, as the tested assessment parameters made very little to no change to the soil behaviour type plots. The soil behaviour type index, Ic, developed by Robertson and Wride (1998), has been used to define a soil's behaviour type according to a set of numerical boundaries. In addition, the liquefaction cutoff point is defined as Ic > 2.6, whereby it is assumed that any soils with an Ic value above this will not liquefy due to clay-like tendencies (Robertson and Wride, 1998). This method has been identified in this thesis as being potentially unsuitable for some areas of Christchurch, as it was developed for mostly sandy soils. An alternative methodology involving adjustment of the Robertson and Wride (1998) soil behaviour type boundaries is proposed as follows:
- Ic < 1.31: Gravelly sand to dense sand
- 1.31 < Ic < 1.90: Sands (clean sand to silty sand)
- 1.90 < Ic < 2.50: Sand mixtures (silty sand to sandy silt)
- 2.50 < Ic < 3.20: Silt mixtures (clayey silt to silty clay)
- 3.20 < Ic < 3.60: Clays (silty clay to clay)
- Ic > 3.60: Organic soils (peats)
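The proposed boundaries can be written as a minimal lookup; this is a direct transcription of the cut-offs listed above, not part of the thesis's software.

```python
# Map the soil behaviour type index Ic to the adjusted soil behaviour
# type classes proposed above (boundaries in ascending order of Ic).
def soil_behaviour_type(ic):
    if ic < 1.31:
        return "Gravelly sand to dense sand"
    if ic < 1.90:
        return "Sands: clean sand to silty sand"
    if ic < 2.50:
        return "Sand mixtures: silty sand to sandy silt"
    if ic < 3.20:
        return "Silt mixtures: clayey silt to silty clay"
    if ic < 3.60:
        return "Clays: silty clay to clay"
    return "Organic soils: peats"
```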
When the soil behaviour type boundary changes were applied to 15 test sites throughout Christchurch, 67% showed an improved soil behaviour type classification, while the remaining 33% remained unchanged because they consisted almost entirely of sand. Within these boundary changes, the liquefaction cutoff point was moved from Ic > 2.6 to Ic > 2.5, which altered the liquefaction potential and vertical settlement to more realistic values. This confirmed that the soil behaviour type boundary changes appear to both resolve the soil behaviour type issues and reduce the over-estimation of liquefaction potential and vertical settlement. This thesis acts as a starting point for researching the issues discussed. Future work which would be useful includes investigation of the CLiq assessment parameter adjustments, particularly those most suitable for use in clay-rich soils such as those in Christchurch, and consideration of how the water table can be better assessed when perched layers of water exist, given the limitation that only one elevation can be entered into CLiq. Additionally, a useful investigation would be a comparison of the known liquefaction and settlements from the Canterbury earthquake sequence with the liquefaction and settlement potentials calculated in CLiq for equivalent shaking conditions. This would enable the difference between the two to be accurately defined, and a suitable adjustment applied. Finally, inconsistencies between the Laser-Sizer and Hydrometer should be investigated, as the Laser-Sizer under-estimated the fines content by up to one third of the Hydrometer values.

Research papers, University of Canterbury Library

Despite over a century of study, the relationship between lunar cycles and earthquakes remains controversial and difficult to quantitatively investigate. Perhaps as a consequence, major earthquakes around the globe are frequently followed by 'prediction' claims, using lunar cycles, that generate media furore and pressure scientists to provide resolute answers. The 2010-2011 Canterbury earthquakes in New Zealand were no exception; significant media attention was given to lunar-derived earthquake predictions by non-scientists, even though the predictions were merely 'opinions' and were not based on any statistically robust temporal or causal relationships. This thesis provides a framework for studying lunisolar earthquake temporal relationships by developing replicable statistical methodology based on peer-reviewed literature. Notable in the methodology is a high-accuracy ephemeris, called ECLPSE, designed by the author specifically for use on earthquake catalogs, and a model for performing phase angle analysis. The statistical tests were carried out on two declustered seismic catalogs, one containing the aftershocks of the Mw7.1 earthquake in Canterbury, and the other containing Australian seismicity from the past two decades. Australia is an intraplate setting far removed from active plate boundaries, while Canterbury is proximal to a plate boundary, thus allowing for comparison based on tectonic regime and corresponding tectonic loading rate. No strong, conclusive statistical correlations were found at any level of the earthquake catalogs, considering large events, onshore events, offshore events, and the fault type of some events. This was concluded using Schuster's test of significance with α=5% and analysis of standard deviations. A few weak correlations, with p-values of 5-10%, and anomalous standard deviations were found, but these are difficult to interpret.
The results invalidate the statistical robustness of 'earthquake predictions' using lunisolar parameters in this instance. An ambitious researcher could improve on the quality of the results and on the range of parameters analysed. The conclusions of the thesis raise more questions than answers, but the thesis provides an adaptable methodology that can be used to further investigate the problem.
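Schuster's test, as applied in the thesis, reduces to a short calculation: each earthquake contributes a unit vector at its lunisolar phase angle, and for N events with resultant length R, the probability of obtaining a resultant at least that large under the uniform-phase null hypothesis is approximately exp(-R²/N). A minimal sketch:

```python
import math

# Schuster's test: p-value for the null hypothesis that the given phase
# angles (radians) are uniformly distributed. A p-value below the chosen
# alpha (5% in the thesis) would suggest phase-dependent triggering.
def schuster_p(phases_rad):
    n = len(phases_rad)
    c = sum(math.cos(t) for t in phases_rad)  # resultant x-component
    s = sum(math.sin(t) for t in phases_rad)  # resultant y-component
    return math.exp(-(c * c + s * s) / n)
```

Uniformly scattered phases give a resultant near zero and a p-value near 1, while strongly clustered phases give a long resultant and a vanishingly small p-value.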

Research papers, University of Canterbury Library

Ground motion observations from the most significant 10 events in the 2010-2011 Canterbury earthquake sequence at near-source sites are utilized to scrutinize New Zealand (NZ)-specific pseudo-spectral acceleration (SA) empirical ground motion prediction equations (GMPE) (Bradley 2010, Bradley 2013, McVerry et al. 2006). Region-specific modification factors based on relaxing the conventional ergodic assumption in GMPE development were developed for the Bradley (2010) model. Because of the observed biases with magnitude and source-to-site distance for the McVerry et al. (2006) model, it is not possible to develop region-specific modification factors in a reliable manner. The theory of non-ergodic empirical ground motion prediction is then outlined, and applied to this 10 event dataset to determine systematic effects in the between- and within-event residuals which lead to modifications in the predicted median and standard deviation of the GMPE. By examining these systematic effects over sub-regions containing a total of 20 strong motion stations within the Canterbury area, modification factors for use in region-specific ground motion prediction are proposed. These modification factors, in particular, are suggested for use with the Bradley (2010) model in Canterbury-specific probabilistic seismic hazard analysis (PSHA) to develop revised design response spectra, particularly for long vibration periods.
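The systematic station effects described here amount to averaging within-event residuals per station. A hedged sketch (the station codes are real Canterbury strong motion stations, but the residual values are invented for illustration):

```python
import statistics

# Hypothetical within-event residuals (ln units), one value per event
residuals = {
    "HVSC": [0.9, 1.1, 0.8, 1.0],    # systematically under-predicted site
    "CBGS": [-0.1, 0.2, 0.0, -0.2],
    "REHS": [0.4, 0.5, 0.3, 0.6],
}

site_terms = {}
for sta, res in residuals.items():
    d_s2s = statistics.mean(res)                          # systematic site-to-site term
    phi_ss = statistics.stdev([r - d_s2s for r in res])   # scatter about the site term
    site_terms[sta] = (d_s2s, phi_ss)
```

The site term shifts the GMPE median at that station; in a non-ergodic framework, removing the site-to-site component from the total variability yields a reduced single-station standard deviation.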

Research papers, University of Canterbury Library

In this paper, the characteristics of near-fault ground motions recorded during the Mw7.1 Darfield and Mw 6.2 Christchurch earthquakes are examined and compared with existing empirical models. The characteristics of forward-directivity effects are first examined using a wavelet-based pulse-classification algorithm. This is followed by an assessment of the adequacy of empirical models which aim to capture the effect of directivity effects on amplifying the acceleration response spectra; and the period and peak velocity of the forward-directivity pulse. It is illustrated that broadband directivity models developed by Somerville et al. (1997) and Abrahamson (2000) generally under-predict the observed amplification of response spectral ordinates at longer vibration periods. In contrast, a recently developed narrowband model by Shahi and Baker (2011) provides significantly improved predictions by amplifying the response spectra within a small range of periods surrounding the directivity pulse period. Although the empirical predictions of the pulse period are generally favourable for the Christchurch earthquake, the observations from the Darfield earthquake are significantly under-predicted. The elongation in observed pulse periods is inferred as being a result of the soft sedimentary soils of the Canterbury basin. However, empirical predictions of the observed peak velocity associated with the directivity pulse are generally adequate for both events.
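Pulse classification of this kind reduces to finding the dominant velocity pulse and its period. The sketch below is a deliberately simplified stand-in for the Daubechies-wavelet extraction of Baker (2007): it slides a single-cycle sine pulse of each trial period along the velocity trace and keeps the period with the highest normalized correlation.

```python
import math

def extract_pulse_period(vel, dt, trial_periods):
    """Crude pulse-period estimate via normalized correlation with
    single-cycle sine pulses (NOT Baker's actual wavelet algorithm)."""
    best_T, best_score = None, -1.0
    for T in trial_periods:
        n = int(round(T / dt))
        pulse = [math.sin(2.0 * math.pi * i / n) for i in range(n)]
        p_norm = math.sqrt(sum(p * p for p in pulse))
        for s in range(len(vel) - n + 1):
            seg = vel[s:s + n]
            v_norm = math.sqrt(sum(v * v for v in seg))
            if v_norm == 0.0:
                continue  # skip quiet windows (avoid divide-by-zero)
            score = abs(sum(v * p for v, p in zip(seg, pulse))) / (p_norm * v_norm)
            if score > best_score:
                best_score, best_T = score, T
    return best_T

# Synthetic record: a single 2.0 s velocity pulse between quiet segments
dt = 0.01
vel = [0.0] * 100 + [math.sin(math.pi * k * dt) for k in range(200)] + [0.0] * 100
T_est = extract_pulse_period(vel, dt, [0.5, 1.0, 2.0, 3.0])
```

On this synthetic trace the embedded 2.0 s pulse is recovered exactly; real records require the wavelet decomposition and the pulse indicator score discussed above.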

Research papers, University of Canterbury Library

This paper presents on-going challenges in the present paradigm shift of earthquake-induced ground motion prediction from empirical to physics-based simulation methods. The 2010-2011 Canterbury and 2016 Kaikoura earthquakes are used to illustrate the predictive potential of the different methods. On-going efforts on simulation validation and theoretical developments are then presented, as well as the demands associated with the need for explicit consideration of modelling uncertainties. Finally, discussion is also given to the tools and databases needed for the efficient utilization of simulated ground motions both in specific engineering projects as well as for near-real-time impact assessment.

Research papers, University of Canterbury Library

This paper presents a critical evaluation of vertical ground motions observed in the Canterbury earthquake sequence. The abundance of strong near-source ground-motion recordings provides an opportunity to comprehensively review the estimation of vertical ground motions via the New Zealand Standard for earthquake loading, NZS1170.5:2004, and empirical ground motion prediction equations (GMPEs). An in-depth review of current GMPEs is carried out to determine the existing trends and characteristics present in the empirical models. Results illustrate that vertical ground motion amplitudes estimated based on NZS1170.5:2004 are significantly unconservative at short periods and near-source distances. While conventional GMPEs provide an improved prediction, in many instances they too underpredict vertical ground motion accelerations at short periods and near-source distances.

Research papers, University of Canterbury Library

© 2017 The Royal Society of New Zealand. This paper discusses simulated ground motion intensity, and its underlying modelling assumptions, for great earthquakes on the Alpine Fault. The simulations utilise the latest understanding of wave propagation physics, kinematic earthquake rupture descriptions and the three-dimensional nature of the Earth's crust in the South Island of New Zealand. The effect of hypocentre location is explicitly examined, which is found to lead to significant differences in ground motion intensities (quantified in the form of peak ground velocity, PGV) over the northern half and southwest of the South Island. Comparison with previously adopted empirical ground motion models also illustrates that the simulations, which explicitly model rupture directivity and basin-generated surface waves, lead to notably larger PGV amplitudes than the empirical predictions in the northern half of the South Island and Canterbury. The simulations performed in this paper have been adopted, as one possible ground motion prediction, in the ‘Project AF8’ Civil Defence Emergency Management exercise scenario. The similarity of the modelled ground motion features with those observed in recent worldwide earthquakes as well as similar simulations in other regions, and the notably higher simulated amplitudes than those from empirical predictions, may warrant a re-examination of regional impact assessments for major Alpine Fault earthquakes.

Research papers, University of Canterbury Library

The Canterbury Earthquake Sequence 2010-2011 (CES) induced widespread liquefaction in many parts of Christchurch city. Liquefaction was more commonly observed in the eastern suburbs and along the Avon River where the soils were characterised by thick sandy deposits with a shallow water table. On the other hand, suburbs to the north, west and south of the CBD (e.g. Riccarton, Papanui) exhibited less severe to no liquefaction. These soils were more commonly characterised by inter-layered liquefiable and non-liquefiable deposits. As part of a related large-scale study of the performance of Christchurch soils during the CES, detailed borehole data including CPT, Vs and Vp have been collected for 55 sites in Christchurch. For this subset of Christchurch sites, predictions of liquefaction triggering using the simplified method (Boulanger & Idriss, 2014) indicated that liquefaction was over-predicted for 94% of sites that did not manifest liquefaction during the CES, and under-predicted for 50% of sites that did manifest liquefaction. The focus of this study was to investigate these discrepancies between prediction and observation: to assess whether they were due to soil-layer interaction, and to determine the effect that soil stratification has on the development of liquefaction and the system response of soil deposits.
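The demand side of simplified triggering procedures such as Boulanger & Idriss (2014) is the cyclic stress ratio of Seed & Idriss (1971). A minimal sketch, using the Liao & Whitman (1986) approximation for the stress reduction factor and an assumed soil profile (unit weight, water table depth, and PGA are illustrative values, not CES data):

```python
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    """Simplified-method CSR = 0.65 * (a_max/g) * (sigma_v/sigma_v') * rd,
    with rd from the Liao & Whitman (1986) depth approximation."""
    if depth_m <= 9.15:
        rd = 1.0 - 0.00765 * depth_m
    else:
        rd = 1.174 - 0.0267 * depth_m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

# Hypothetical profile: 5 m depth, water table at 1 m, a_max = 0.3 g
depth = 5.0
sigma_v = depth * 18.0                 # kPa, total stress (unit weight 18 kN/m3)
sigma_v_eff = sigma_v - 4.0 * 9.81     # kPa, minus pore pressure from 4 m of water
csr = cyclic_stress_ratio(0.3, sigma_v, sigma_v_eff, depth)
```

Triggering is then assessed by comparing CSR against the CPT-based cyclic resistance ratio of the layer; the over- and under-prediction rates above are counts of where that comparison disagrees with field observations.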

Research papers, The University of Auckland Library

This thesis presents the application of data science techniques, especially machine learning, for the development of seismic damage and loss prediction models for residential buildings. Current post-earthquake building damage evaluation forms are developed with a particular country in mind. The lack of consistency hinders the comparison of building damage between different regions. A new paper form has been developed to address the need for a global universal methodology for post-earthquake building damage assessment. The form was successfully trialled in the street ‘La Morena’ in Mexico City following the 2017 Puebla earthquake. Aside from developing a framework for better input data for performance-based earthquake engineering, this project also extended current techniques to derive insights from post-earthquake observations. Machine learning (ML) was applied to seismic damage data of residential buildings in Mexico City following the 2017 Puebla earthquake and in Christchurch following the 2010-2011 Canterbury earthquake sequence (CES). The experience showcased that it is readily possible to develop purely data-driven empirical models that can successfully identify key damage drivers and hidden underlying correlations without prior engineering knowledge. With adequate maintenance, such models have the potential to be rapidly and easily updated to allow improved damage and loss prediction accuracy and greater ability for models to be generalised. For ML models developed for the key events of the CES, the model trained using data from the 22 February 2011 event generalised the best for loss prediction. This is thought to be because of the large number of instances available for this event and the relatively limited class imbalance between the categories of the target attribute.
For the CES, ML highlighted the importance of peak ground acceleration (PGA), building age, building size, liquefaction occurrence, and soil conditions as main factors which affected the losses in residential buildings in Christchurch. ML also highlighted the influence of liquefaction on the building losses related to the 22 February 2011 event. Further to the ML model development, the application of post-hoc methodologies was shown to be an effective way to derive insights from ML algorithms that are not intrinsically interpretable. Overall, these provide a basis for the development of ‘greybox’ ML models.
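Permutation importance is one such post-hoc method for ranking damage drivers. A toy sketch with a hand-written stand-in model and invented data (feature names echo the abstract; no real CES data is used): shuffle one feature at a time and measure how much prediction error grows.

```python
import random
import statistics

random.seed(0)
# Hypothetical records: (PGA in g, building age in years) -> loss ratio
X = [(random.uniform(0.1, 1.0), random.uniform(0.0, 100.0)) for _ in range(300)]
observed_loss = [0.8 * pga + 0.001 * age for pga, age in X]

def model(pga, age):
    """Stand-in for a trained ML loss model (here it fits the data exactly)."""
    return 0.8 * pga + 0.001 * age

def mse(pred, obs):
    return statistics.fmean((p - o) ** 2 for p, o in zip(pred, obs))

baseline = mse([model(*x) for x in X], observed_loss)  # zero by construction
importances = {}
for j, name in enumerate(["PGA", "age"]):
    col = [x[j] for x in X]
    random.shuffle(col)                 # break the feature-target link
    rows = [list(x) for x in X]
    for row, v in zip(rows, col):
        row[j] = v
    importances[name] = mse([model(*r) for r in rows], observed_loss) - baseline
```

Because the toy model weights PGA far more heavily than age, shuffling PGA inflates the error much more, so it ranks as the dominant driver, mirroring the CES finding.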

Research papers, The University of Auckland Library

Predictive modelling provides an efficient means to analyse the coastal environment and generate knowledge for long term urban planning. In this study, the numerical models SWAN and XBeach were incorporated into the ESRI ArcGIS interface by means of the BeachMMtool. This was applied to the Greater Christchurch coastal environment to simulate geomorphological evolution through hydrodynamic forcing. Simulations were performed using the recent sea level rise predictions by the Intergovernmental Panel on Climate Change (2013) to determine whether the statutory requirements outlined in the New Zealand Coastal Policy Statement 2010 are consistent with central, regional and district designations. Our results indicate that current land use zoning in Greater Christchurch is not consistent with these predictions. This is because coastal hazard risk has not been thoroughly quantified during the process of installing the Canterbury Earthquake Recovery Authority residential red zone. However, the Christchurch City Council’s flood management area does provide an extent to which managed coastal retreat is a real option. The results of this research suggest that progradation will continue to occur along the Christchurch foreshore due to the net sediment flux retaining an onshore direction and the current hydrodynamic activity not being strong enough to move sediment offshore. However, inundation during periods of storm surge poses a risk to human habitation on low lying areas around the Avon-Heathcote Estuary and the Brooklands lagoon.

Research papers, University of Canterbury Library

The overarching goal of this dissertation is to improve predictive capabilities of geotechnical seismic site response analyses by incorporating additional salient physical phenomena that influence site effects. Specifically, multidimensional wave-propagation effects that are neglected in conventional 1D site response analyses are incorporated by: (1) combining results of 3D regional-scale simulations with 1D nonlinear wave-propagation site response analysis, and (2) modelling soil heterogeneity in 2D site response analyses using spatially-correlated random fields to perturb soil properties. A method to combine results from 3D hybrid physics-based ground motion simulations with site-specific nonlinear site response analyses was developed. The 3D simulations capture 3D ground motion phenomena on a regional scale, while the 1D nonlinear site response, which is informed by detailed site-specific soil characterization data, can capture site effects more rigorously. Simulations of 11 moderate-to-large earthquakes from the 2010-2011 Canterbury Earthquake Sequence (CES) at 20 strong motion stations (SMS) were used to validate simulations with observed ground motions. The predictions were compared to those from an empirically-based ground motion model (GMM), and from 3D simulations with simplified VS30-based site effects modelling. By comparing all predictions to observations at seismic recording stations, it was found that the 3D physics-based simulations can predict ground motions with bias and uncertainty comparable to those of the GMM, albeit with significantly lower bias at long periods. Additionally, the explicit modelling of nonlinear site-response improves predictions significantly compared to the simplified VS30-based approach for soft-soil or atypical sites that exhibit exceptionally strong site effects.
A method to account for the spatial variability of soils and wave scattering in 2D site response analyses was developed and validated against a database of vertical array sites in California. The inputs required to run the 2D analyses are nominally the same as those required for 1D analyses (except for spatial correlation parameters), enabling easier adoption in practice. The first step was to create the platform and workflow, and to perform a sensitivity study involving 5,400 2D model realizations to investigate the influence of random field input parameters on wave scattering and site response. Boundary conditions were carefully assessed to understand their effect on the modelled response and select appropriate assumptions for use on a 2D model with lateral heterogeneities. Multiple ground-motion intensity measures (IMs) were analyzed to quantify the influence from random field input parameters and boundary conditions. It was found that this method is capable of scattering seismic waves and creating spatially-varying ground motions at the ground surface. The redistribution of ground-motion energy across wider frequency bands, and the scattering attenuation of high-frequency waves in 2D analyses, resemble features observed in empirical transfer functions (ETFs) computed in other studies. The developed 2D method was subsequently extended to more complicated multi-layer soil profiles and applied to a database of 21 vertical array sites in California to test its appropriateness for future predictions. Again, different boundary condition and input motion assumptions were explored to extend the method to the in-situ conditions of a vertical array (with a sensor embedded in the soil). ETFs were compared to theoretical transfer functions (TTFs) from conventional 1D analyses and 2D analyses with heterogeneity. Residuals of transfer-function-based IMs, and IMs of surface ground motions, were also used as validation metrics.
The spatial variability of transfer-function-based IMs was estimated from 2D models and compared to the event-to-event variability from ETFs. This method was found capable of significantly improving predictions of median ETF amplification factors, especially for sites that display higher event-to-event variability. For sites that are well represented by 1D methods, the 2D approach can underpredict amplification factors at higher modes, suggesting that the level of heterogeneity may be over-represented by the 2D random field models used in this study.
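The spatially-correlated random fields used to perturb soil properties can be illustrated with a crude stand-in: smooth white noise with a moving-average kernel to impose spatial correlation, then apply the result as a lognormal perturbation to a baseline Vs. Grid size, smoothing width, baseline Vs, and sigma_ln below are all assumed values, not the dissertation's parameters.

```python
import math
import random
import statistics

random.seed(42)
nz, nx, half = 30, 60, 3   # grid cells (depth x length); smoothing half-width

# White noise, then moving-average smoothing: neighbouring cells share
# kernel samples, which induces spatial correlation in the smoothed field
noise = [[random.gauss(0.0, 1.0) for _ in range(nx)] for _ in range(nz)]

def smoothed(i, j):
    vals = [noise[a][b]
            for a in range(max(0, i - half), min(nz, i + half + 1))
            for b in range(max(0, j - half), min(nx, j + half + 1))]
    return statistics.fmean(vals)

field = [[smoothed(i, j) for j in range(nx)] for i in range(nz)]

# Lognormal perturbation of a baseline shear-wave velocity (m/s)
vs_base, sigma_ln = 200.0, 0.15
vs = [[vs_base * math.exp(sigma_ln * field[i][j]) for j in range(nx)]
      for i in range(nz)]
```

Production analyses would instead use a proper geostatistical simulation with an explicit correlation model (e.g. exponential, with separate horizontal and vertical correlation lengths), but the smoothing trick conveys the idea of laterally heterogeneous, correlated soil properties.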

Research papers, University of Canterbury Library

The 2010 Darfield and 2011 Christchurch Earthquakes triggered extensive liquefaction-induced lateral spreading proximate to streams and rivers in the Christchurch area, causing significant damage to structures and lifelines. A case study in central Christchurch is presented and compares field observations with predicted displacements from the widely adopted empirical model of Youd et al. (2002). Cone penetration testing (CPT), with measured soil gradation indices (fines content and median grain size) on typical fluvial deposits along the Avon River were used to determine the required geotechnical parameters for the model input. The method presented attempts to enable the adoption of the extensive post-quake CPT test records in place of the lower quality and less available Standard Penetration Test (SPT) data required by the original Youd model. The results indicate some agreement between the Youd model predictions and the field observations, while the majority of computed displacements err on the side of over-prediction by more than a factor of two. A sensitivity analysis was performed with respect to the uncertainties in the model inputs, illustrating the model’s high sensitivity to the input parameters, with median grain size and fines content among the most influential, and suggesting that the use of CPT data to quantify these parameters may lead to variable results.
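A one-at-a-time sensitivity analysis of the kind described can be sketched generically. The displacement model below is a toy exponential with hypothetical coefficients, NOT the actual Youd et al. (2002) regression; it is weighted so that the grain-size term dominates, mirroring the finding above.

```python
import math

def sensitivity(model, base, deltas):
    """One-at-a-time sensitivity: change in output when each input is
    perturbed by +/- its delta, all other inputs held at base values."""
    out = {}
    for k, d in deltas.items():
        hi = dict(base, **{k: base[k] + d})
        lo = dict(base, **{k: base[k] - d})
        out[k] = model(**hi) - model(**lo)
    return out

# Toy lateral-displacement model (hypothetical coefficients) in which
# log median grain size (log_d50) and fines content (fines, %) matter most
def toy_dh(m, log_d50, fines):
    return math.exp(0.5 * m - 2.0 * log_d50 - 0.05 * fines - 3.0)

base = {"m": 7.1, "log_d50": -0.5, "fines": 10.0}
s = sensitivity(toy_dh, base, {"m": 0.2, "log_d50": 0.2, "fines": 5.0})
```

Because the model is exponential in its inputs, modest perturbations of the heavily weighted gradation parameters swing the predicted displacement far more than the magnitude term, which is the behaviour the case study reports.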

Research papers, University of Canterbury Library

Background: This study examines the performance of site response analysis via nonlinear total-stress 1D wave-propagation for modelling site effects in physics-based ground motion simulations of the 2010-2011 Canterbury, New Zealand earthquake sequence. This approach allows for explicit modeling of 3D ground motion phenomena at the regional scale, as well as detailed nonlinear site effects at the local scale. The approach is compared to a more commonly used empirical VS30 (30 m time-averaged shear wave velocity)-based method for computing site amplification as proposed by Graves and Pitarka (2010, 2015), and to empirical ground motion prediction via a ground motion model (GMM).

Research papers, University of Canterbury Library

Semi-empirical models based on in-situ geotechnical tests have become the standard of practice for predicting soil liquefaction. Since the inception of the “simplified” cyclic-stress model in 1971, variants based on various in-situ tests have been developed, including the Cone Penetration Test (CPT). More recently, prediction models based solely on remotely-sensed data were developed. Similar to systems that provide automated content on earthquake impacts, these “geospatial” models aim to predict liquefaction for rapid response and loss estimation using readily-available data. This data includes (i) common ground-motion intensity measures (e.g., PGA), which can either be provided in near-real-time following an earthquake, or predicted for a future event; and (ii) geospatial parameters derived from digital elevation models, which are used to infer characteristics of the subsurface relevant to liquefaction. However, the predictive capabilities of geospatial and geotechnical models have not been directly compared, which could elucidate techniques for improving the geospatial models, and which would provide a baseline for measuring improvements. Accordingly, this study assesses the relative efficacy of liquefaction models based on geospatial vs. CPT data using 9,908 case-studies from the 2010-2016 Canterbury earthquakes. While the top-performing models are CPT-based, the geospatial models perform relatively well given their simplicity and low cost. Although further research is needed (e.g., to improve upon the performance of current models), the findings of this study suggest that geospatial models have the potential to provide valuable first-order predictions of liquefaction occurrence and consequence. Towards this end, performance assessments of geospatial vs. geotechnical models are ongoing for more than 20 additional global earthquakes.
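Comparing the efficacy of two binary liquefaction predictors typically comes down to a ranking metric such as ROC AUC. A small sketch using the Mann-Whitney formulation (all scores below are invented, not the study's 9,908 case histories):

```python
def auc(scores_pos, scores_neg):
    """ROC AUC via the Mann-Whitney statistic: the probability that a
    site that liquefied receives a higher model score than one that did not."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# Hypothetical model scores at liquefied (pos) / non-liquefied (neg) sites
cpt_pos, cpt_neg = [0.9, 0.8, 0.7, 0.6], [0.3, 0.4, 0.2, 0.65]
geo_pos, geo_neg = [0.7, 0.6, 0.5, 0.8], [0.4, 0.55, 0.65, 0.3]
auc_cpt, auc_geo = auc(cpt_pos, cpt_neg), auc(geo_pos, geo_geo := geo_neg)
```

Here the CPT-based scores separate the two classes better (higher AUC), which is the qualitative pattern the study reports; with thousands of case histories a rank-based implementation would be preferred over this O(n²) loop.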

Research papers, University of Canterbury Library

This paper provides a brief discussion of observed strong ground motions from the 14 November 2016 Mw7.8 Kaikoura earthquake. Specific attention is given to examining observations in the near-source region where several ground motions exceeding 1.0g horizontal are recorded, as well as up to 2.7g in the vertical direction at one location. Ground motion response spectra in the near-source, North Canterbury, Marlborough and Wellington regions are also examined and compared with design levels. Observed spectral amplitudes are also compared with predictions from empirical and physics-based ground motion modelling.

Research papers, University of Canterbury Library

Observations of out-of-plane (OOP) instability in the 2010 Chile earthquake and in the 2011 Christchurch earthquake resulted in concerns about the current design provisions of structural walls. This mode of failure was previously observed in the experimental response of some wall specimens subjected to in-plane loading. Therefore, the postulations proposed for prediction of the limit states corresponding to OOP instability of rectangular walls are generally based on stability analysis under in-plane loading only. These approaches address stability of a cracked wall section when subjected to compression, thereby considering the level of residual strain developed in the reinforcement as the parameter that prevents timely crack closure of the wall section and induces stability failure. The New Zealand code requirements addressing the OOP instability of structural walls are based on the assumptions used in the literature and the analytical methods proposed for mathematical determination of the critical strain values. In this study, a parametric study is conducted using a numerical model capable of simulating OOP instability of rectangular walls to evaluate sensitivity of the OOP response of rectangular walls to variation of different parameters identified to be governing this failure mechanism. The effects of wall slenderness (unsupported height-to-thickness) ratio, longitudinal reinforcement ratio of the boundary regions and length on the OOP response of walls are evaluated. A clear trend was observed regarding the influence of these parameters on the initiation of OOP displacement, based on which simple equations are proposed for prediction of OOP instability in rectangular walls.

Research papers, University of Canterbury Library

Probabilistic Structural Fire Engineering (PSFE) has been introduced to overcome the limitations of current conventional approaches used for the design of fire-exposed structures. Current structural fire design investigates worst-case fire scenarios and includes multiple thermal and structural analyses. PSFE permits buildings to be designed to a level of life safety or economic loss that may occur in future fire events with the help of a probabilistic approach. This thesis presents modifications to the adoption of a Performance-Based Earthquake Engineering (PBEE) framework in Probabilistic Structural Fire Engineering (PSFE). The probabilistic approach runs through a series of interrelationships between different variables, and successive convolution integrals of these interrelationships result in probabilities of different measures. The process starts with the definition of a fire severity measure (FSM), which best relates fire hazard intensity with structural response. It is identified by satisfying efficiency and sufficiency criteria as described by the PBEE framework. The relationship between a fire hazard and corresponding structural response is established by analysis methods. One method that has been used to quantify this relationship in PSFE is Incremental Fire Analysis (IFA). The existing IFA approach produces unrealistic fire scenarios, as fire profiles may be scaled to wide ranges of fire severity levels, which may not physically represent any real fires. Two new techniques are introduced in this thesis to limit extensive scaling. In order to obtain an annual rate of exceedance of fire hazard and structural response for an office building, an occurrence model and an attenuation model for office fires are generated for both Christchurch city and New Zealand. The results show that Christchurch city is 15% less likely to experience fires that have the potential to cause structural failures in comparison to all of New Zealand.
In establishing better predictive relationships between fires and structural response, cumulative incident radiation (a fire hazard property) is found to be the most appropriate fire severity measure. This research brings together existing research on various sources of uncertainty in probabilistic structural fire engineering, such as elements affecting post-flashover fire development factors (fuel load, ventilation, surface lining and compartment geometry), fire models, analysis methods and structural reliability. Epistemic uncertainty and aleatory uncertainty are investigated in the thesis by examining the uncertainty associated with modelling and the factors that influence post-flashover development of fires. A survey of 12 buildings in Christchurch in combination with recent surveys in New Zealand produced new statistical data on post-flashover development factors in office buildings in New Zealand. The effects of these parameters on temperature-time profiles are evaluated. The effects of epistemic uncertainty due to fire models in the estimation of structural response are also calculated. Parametric fires are found to have large uncertainty in the prediction of post-flashover fires, while the BFD curves have large uncertainties in prediction of structural response. These uncertainties need to be incorporated into failure probability calculations. Uncertainty in structural modelling shows that the choices that are made during modelling have a large influence on realistic predictions of structural response.
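The "successive convolution integrals" of the PBEE-style framework can be sketched numerically: an annual failure rate is obtained by convolving a fragility curve with the increments of a hazard curve for the fire severity measure. Every number below (hazard rates, fragility median and dispersion) is hypothetical.

```python
import math

def fragility(im, median, beta):
    """Lognormal fragility: P(structural failure | fire severity = im),
    with hypothetical median and dispersion beta."""
    return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2.0))))

# Hypothetical hazard curve: annual rate of exceeding each severity level
# (severity here standing in for cumulative incident radiation)
im_levels = [100.0, 200.0, 300.0, 400.0, 500.0]
rate_exceed = [1e-2, 3e-3, 1e-3, 3e-4, 1e-4]

# Discrete PBEE convolution: sum fragility times hazard-curve increment
lam_fail = 0.0
for i, im in enumerate(im_levels):
    next_rate = rate_exceed[i + 1] if i + 1 < len(rate_exceed) else 0.0
    lam_fail += fragility(im, 350.0, 0.4) * (rate_exceed[i] - next_rate)
```

The result, lam_fail, is an annual rate of failure; designing "to a level of life safety or economic loss" means iterating the design until this rate meets a target.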

Research papers, University of Canterbury Library

Heathcote Valley school strong motion station (HVSC) consistently recorded ground motions with higher intensities than nearby stations during the 2010-2011 Canterbury earthquakes. For example, as shown in Figure 1, for the 22 February 2011 Christchurch earthquake, peak ground acceleration at HVSC reached 1.4 g (horizontal) and 2 g (vertical), the largest ever recorded in New Zealand. Strong amplification of ground motions is expected at Heathcote Valley due to: 1) the high impedance contrast at the soil-rock interface, and 2) the interference of incident and surface waves within the valley. However, both conventional empirical ground motion prediction equations (GMPE) and large-scale physics-based ground motion simulations (with empirical site response) are ineffective in predicting such amplification due to their respective inherent limitations.
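The role of impedance contrast in site amplification can be illustrated with the classic single-layer-over-bedrock result: the fundamental frequency follows the quarter-wavelength rule, and the undamped resonant amplification equals the impedance ratio. The layer properties below are illustrative values only, not measured Heathcote Valley properties.

```python
# Single soil layer over elastic bedrock (illustrative values)
h = 20.0                              # layer thickness, m
vs_soil, rho_soil = 300.0, 1800.0     # shear-wave velocity (m/s), density (kg/m3)
vs_rock, rho_rock = 1500.0, 2400.0

# Quarter-wavelength fundamental frequency of the layer: f0 = Vs / (4H)
f0 = vs_soil / (4.0 * h)

# Undamped amplification at resonance equals the impedance ratio
amp = (rho_rock * vs_rock) / (rho_soil * vs_soil)
```

With these numbers, f0 = 3.75 Hz and the peak amplification is about 6.7; a stronger contrast (or soil damping) changes both, which is why site-specific response analyses capture HVSC's behaviour where ergodic empirical models do not.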