
Research papers, University of Canterbury Library

In the last two decades, New Zealand (NZ) has experienced significant earthquakes, including the 2010 M 7.1 Darfield, 2011 M 6.2 Christchurch, and 2016 M 7.8 Kaikōura events. Amongst these large events, tens of thousands of smaller earthquakes have occurred. While previous event and ground-motion databases have analyzed these events, many events below M 4 have gone undetected. The goal of this study is to expand on previous databases, particularly for small-magnitude (M < 4) and low-amplitude ground motions. This new database enables a greater understanding of regional variations within NZ and contributes to assessing the validity of internationally developed ground-motion models. The database includes event locations and magnitude estimates with uncertainty considerations, and tectonic type assessed in a hierarchical manner. Ground motions are extracted from the GeoNet FDSN server and assessed for quality using a neural network classification approach. A deep neural network approach is also utilized for picking P and S phases for determination of event hypocentres. Relative hypocentres are further improved by double-difference relocation and will contribute toward developing shallow (< 50 km) seismic tomography models. Analysis of the resulting database is compared with previous studies to discuss implications for national hazard prediction models.
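As context for the waveform extraction step described above, a minimal sketch of a request to the GeoNet FDSN service using ObsPy is shown below; the station code, channels, and time window are illustrative assumptions, not values from the study.

```python
# Minimal sketch: retrieving waveforms from the GeoNet FDSN service with ObsPy.
# The station code, channels, and time window below are illustrative only.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("GEONET")                          # GeoNet's public FDSN endpoint
origin_time = UTCDateTime("2016-11-13T11:02:56")   # hypothetical event origin time

# Request 5 minutes of strong-motion (HN*) data from one station.
stream = client.get_waveforms(network="NZ", station="WEL", location="*",
                              channel="HN?", starttime=origin_time,
                              endtime=origin_time + 300)
stream.detrend("demean")                           # basic pre-processing before quality checks
print(stream)
```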

Research papers, University of Canterbury Library

The 2015 New Zealand strong-motion database provides a wealth of new strong-motion data for engineering applications. An important component of this database is the compilation of new site metadata describing the soil conditions and site response at GeoNet strong-motion stations. We have assessed and compiled four key site parameters for the ~460 GeoNet stations that recorded significant historical ground motions. Parameters include: site classification (NZS1170.5), Vs30, fundamental site period (Tsite), and depth to bedrock (Z1.0, i.e. depth to material with Vs > 1000 m/s). In addition, we have assigned a quality estimate (Quality 1–3) to these parameters to provide a qualitative estimate of the uncertainty. New high-quality Tsite estimates have largely been obtained from newly available HVSR amplification curves and spectral ratios from inversion of regional strong-motion data, reconciled with available geological information. Good-quality Vs30 estimates, typically in urban centres, have also been incorporated following recent studies. Where site-specific measurements of Vs30 are not available, Vs30 is estimated based on surface geology following national Vs30 maps. New Z1.0 values have been provided from 3D subsurface models for Canterbury and Wellington. This database will be used in efforts to guide the development and testing of new and existing ground motion prediction models in New Zealand. In particular, it will allow re-examination of the most important site parameters that control and predict site response in a New Zealand setting. Furthermore, it can be used to provide information about suitable rock reference sites for seismological research, and as a guide to site-specific references in the literature. We discuss compilation of the database, preliminary insights so far, and future directions.
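The four parameters and their quality flags described above map naturally onto a simple per-station record; the sketch below is an illustrative data structure only (field names and example values are assumptions, not the database schema).

```python
# Illustrative container for the per-station site metadata described above.
# Field names and example values are assumptions, not the actual database schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class StationSiteMetadata:
    station: str                  # GeoNet station code
    site_class: str               # NZS1170.5 site class (A-E)
    vs30: Optional[float]         # time-averaged shear wave velocity, upper 30 m (m/s)
    t_site: Optional[float]       # fundamental site period (s)
    z1p0: Optional[float]         # depth to material with Vs > 1000 m/s (m)
    vs30_quality: int = 3         # quality flag, 1 (best) to 3 (poorest)
    t_site_quality: int = 3
    z1p0_quality: int = 3

# Example record with invented values.
example = StationSiteMetadata(station="CBGS", site_class="D",
                              vs30=190.0, t_site=1.5, z1p0=420.0,
                              vs30_quality=1, t_site_quality=2, z1p0_quality=2)
```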

Research papers, University of Canterbury Library

The focus of the study presented herein is an assessment of the relative efficacy of recent Cone Penetration Test (CPT) and small-strain shear wave velocity (Vs) based variants of the simplified procedure. Towards this end, Receiver Operating Characteristic (ROC) analyses were performed on the CPT- and Vs-based procedures using the field case history databases from which the respective procedures were developed. The ROC analyses show that Factors of Safety (FS) against liquefaction computed using the most recent Vs-based simplified procedure are better able to separate the “liquefaction” from the “no liquefaction” case histories in the Vs liquefaction database than the CPT-based procedure is able to separate the “liquefaction” from the “no liquefaction” case histories in the CPT liquefaction database. However, this finding somewhat contradicts the assessed predictive capabilities of the CPT- and Vs-based procedures as quantified using select, high-quality liquefaction case histories from the 2010–2011 Canterbury, New Zealand, Earthquake Sequence (CES), wherein the CPT-based procedure was found to yield more accurate predictions. The dichotomy of these findings may result from the fact that different liquefaction field case history databases were used in the respective ROC analyses for Vs and CPT, while the same case histories were used to evaluate both the CPT- and Vs-based procedures.
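As a rough illustration of the ROC procedure applied here, the sketch below treats the factor of safety as the classification score (lower FS implies liquefaction) and computes the ROC curve with scikit-learn; the FS values and labels are invented for the example.

```python
# Sketch of an ROC analysis on liquefaction case histories.
# Labels: 1 = "liquefaction" observed, 0 = "no liquefaction".
# The FS values and labels below are invented for illustration only.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

observed = np.array([1, 1, 1, 0, 0, 1, 0, 0, 1, 0])
fs = np.array([0.6, 0.8, 0.95, 1.3, 1.1, 0.7, 1.6, 1.05, 0.9, 1.4])

# Lower FS should indicate liquefaction, so use -FS as the classifier score.
score = -fs
fpr, tpr, thresholds = roc_curve(observed, score)
auc = roc_auc_score(observed, score)
print(f"Area under the ROC curve: {auc:.2f}")  # higher AUC = better separation
```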

Research papers, University of Canterbury Library

This study uses 44 high-quality liquefaction case histories taken from 22 locations affected by the 2010-2011 Canterbury earthquake sequence to evaluate four commonly used CPT-VS correlations (i.e., Robertson, 2009; Hegazy and Mayne, 2006; Andrus et al., 2007; McGann et al., 2015b). Co-located CPT soundings and VS profiles, developed from surface wave testing, were obtained at 22 locations, and case histories were developed for the Mw 7.1, 4 September 2010 Darfield and Mw 6.2, 22 February 2011 Christchurch earthquakes. The CPT soundings are used to generate VS profiles using each of the four CPT-VS correlations. These correlated VS profiles are used to estimate the factor of safety against liquefaction using the Kayen et al. (2013) VS-based simplified liquefaction evaluation procedure. An error index is used to quantify the predictive capabilities of these correlations in relation to the observations of liquefaction (or the lack thereof). Additionally, the error indices from the CPT-correlated VS profiles are compared to those obtained using: (1) the Kayen et al. (2013) procedure with surface wave-derived VS profiles, and (2) the Idriss and Boulanger (2008) CPT-based liquefaction evaluation procedure. Based on the error indices, the evaluation procedures based on direct measurements of either CPT or VS provided more accurate liquefaction triggering estimates than those obtained from any of the CPT-VS correlations. However, the performance of the CPT-VS correlations varied, with the Robertson (2009) and Hegazy and Mayne (2006) correlations performing relatively poorly for the Christchurch soils and the Andrus et al. (2007) and McGann et al. (2015b) correlations performing better. The McGann et al. (2015b) correlation had the lowest error indices of the CPT-VS correlations tested; however, none of the CPT-VS correlations provided accurate enough VS predictions to be used for the evaluation of liquefaction triggering using the VS-based liquefaction evaluation procedures.

Research papers, University of Canterbury Library

SeisFinder is an open-source web service developed by QuakeCoRE and the University of Canterbury, focused on enabling the extraction of output data from computationally intensive earthquake resilience calculations. Currently, SeisFinder allows users to select historical or future events and retrieve ground motion simulation outputs for requested geographical locations. This data can be used as input for other resilience calculations, such as dynamic response history analysis. SeisFinder was developed using Django, a high-level Python web framework, and uses a PostgreSQL database. Because our large-scale, computationally intensive numerical ground motion simulations produce very large outputs, the actual data is stored on file systems, while the metadata is stored in the database.
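A minimal sketch of the metadata-in-database, data-on-filesystem split described above, using Django's ORM; the model and field names are assumptions for illustration, not SeisFinder's actual schema.

```python
# Illustrative Django models: simulation metadata lives in PostgreSQL, while the
# large simulation outputs stay on the file system and are referenced by path.
# Model and field names are assumptions, not SeisFinder's actual schema.
from django.db import models

class SimulatedEvent(models.Model):
    name = models.CharField(max_length=100)          # e.g. fault or event identifier
    magnitude = models.FloatField()
    is_historical = models.BooleanField(default=True)

class GroundMotionOutput(models.Model):
    event = models.ForeignKey(SimulatedEvent, on_delete=models.CASCADE)
    latitude = models.FloatField()
    longitude = models.FloatField()
    # Path to the waveform files on the HPC file system (the data itself is not in the DB).
    data_path = models.CharField(max_length=500)
```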

Research papers, University of Canterbury Library

There is now a rich literature on the connections between digital media, networked computing, and the shaping of urban material cultures. Much less has addressed the post-disaster context, such as that we face in Christchurch, where it is more a case of re-build than re-new. In what follows I suggest that Lev Manovich’s well-known distinction between narrative and database as distinct but related cultural forms is a useful framework for thinking about the Christchurch rebuild, and perhaps urbanism more generally.

Research papers, University of Canterbury Library

This paper summarizes the development of a high-resolution surficial shear wave velocity model based on the combination of the large high-spatial-density database of cone penetration test (CPT) logs in and around Christchurch, New Zealand and a recently-developed Christchurch-specific empirical correlation between soil shear wave velocity and CPT. This near-surface shear wave velocity model has applications for site characterization efforts via the development of maps of time-averaged shear wave velocities over specific depths, as well as use in site response analysis and ground motion simulation.
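For context, the time-averaged shear wave velocity over a given depth (e.g. Vs30) used in such maps follows directly from a layered Vs profile; a minimal sketch with an invented profile is given below.

```python
# Sketch: time-averaged shear wave velocity over the top `depth` metres
# (e.g. Vs30 for depth=30), computed as depth divided by total shear-wave travel time.
# The layered profile below is invented for illustration.
def time_averaged_vs(thicknesses, velocities, depth=30.0):
    travel_time, covered = 0.0, 0.0
    for h, vs in zip(thicknesses, velocities):
        h = min(h, depth - covered)      # clip the last layer at the target depth
        if h <= 0:
            break
        travel_time += h / vs
        covered += h
    return covered / travel_time

# Example: three-layer profile (thicknesses in m, Vs in m/s)
print(time_averaged_vs([5.0, 10.0, 25.0], [150.0, 220.0, 400.0]))  # approximate Vs30
```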

Research papers, University of Canterbury Library

This paper summarizes the development of a region-wide surficial shear wave velocity model based on the combination of the large high-spatial-density database of cone penetration test (CPT) logs in and around Christchurch, New Zealand and a recently-developed Christchurch-specific empirical correlation between soil shear wave velocity and CPT. The ongoing development of this near-surface shear wave velocity model has applications for site characterization efforts via the development of maps of time-averaged shear wave velocities over specific depths, and the identification of regional similarities and differences in soil shear stiffness.

Research papers, University of Canterbury Library

This paper presents ongoing challenges in the current paradigm shift of earthquake-induced ground motion prediction from empirical to physics-based simulation methods. The 2010–2011 Canterbury and 2016 Kaikōura earthquakes are used to illustrate the predictive potential of the different methods. Ongoing efforts in simulation validation and theoretical developments are then presented, as well as the demands associated with the need for explicit consideration of modelling uncertainties. Finally, discussion is also given to the tools and databases needed for the efficient utilization of simulated ground motions both in specific engineering projects as well as for near-real-time impact assessment.

Research papers, University of Canterbury Library

The city of Christchurch and its surrounds experienced widespread damage due to soil liquefaction induced by seismic shaking during the Canterbury earthquake sequence that began in September 2010 with the Mw7.1 Darfield earthquake. Prior to the start of this sequence, the city had a large network of strong motion stations (SMSs) installed, which were able to record a vast database of strong ground motions. This paper uses this database of strong ground motion recordings, observations of liquefaction manifestation at the ground surface, and data from a recently completed extensive geotechnical site investigation program at each SMS to assess a range of liquefaction evaluation procedures at the four SMSs in the Christchurch Central Business District (CBD). In general, the characteristics of the accelerograms recorded at each SMS correlated well with the liquefaction evaluation procedures, with low liquefaction factors of safety predicted at sites with clear liquefaction identifiers in the ground motions. However, at sites that likely liquefied at depth (as indicated by evaluation procedures and/or inferred from the characteristics of the recorded surface accelerograms), the presence of a non-liquefiable crust layer at many of the SMS locations prevented the manifestation of any surface effects. Because of this, there was not a good correlation between surface manifestation and two surface manifestation indices, the Liquefaction Potential Index (LPI) and the Liquefaction Severity Number (LSN).
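For reference, the Liquefaction Potential Index mentioned above integrates a severity function over the top 20 m with a linear depth weighting (the standard Iwasaki-type definition); a minimal sketch under that definition is given below, with an invented factor-of-safety profile.

```python
# Sketch of the Liquefaction Potential Index (LPI), following the standard
# Iwasaki-type definition: LPI = integral over 0-20 m of F(z) * w(z) dz,
# with F = 1 - FS where FS < 1 (else 0) and w(z) = 10 - 0.5 z.
# The FS-versus-depth profile below is invented for illustration.
import numpy as np

def lpi(depths_m, fs):
    z = np.asarray(depths_m, dtype=float)
    fs = np.asarray(fs, dtype=float)
    keep = z <= 20.0
    z, fs = z[keep], fs[keep]
    severity = np.where(fs < 1.0, 1.0 - fs, 0.0)   # F(z)
    weight = 10.0 - 0.5 * z                        # w(z)
    integrand = severity * weight
    # trapezoidal integration over depth
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(z)))

depths = np.arange(0.5, 20.01, 0.5)                # 0.5 m spacing, CPT-like resolution
fs_profile = 0.6 + 0.05 * depths                   # invented FS profile (liquefiable near surface)
print(f"LPI = {lpi(depths, fs_profile):.1f}")
```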

Research papers, University of Canterbury Library

The Canterbury Earthquake Sequence (CES) induced extensive damage in residential buildings and led to over NZ$40 billion in total economic losses. Due to the unique insurance setting in New Zealand, up to 80% of the financial losses were insured. Over the CES, the Earthquake Commission (EQC) received more than 412,000 insurance claims for residential buildings. The 4 September 2010 earthquake is the event for which most of the claims were lodged, with more than 138,000 residential claims for this event alone. This research project uses the EQC claims database to develop a seismic loss prediction model for residential buildings in Christchurch. It uses machine learning to create a procedure capable of highlighting the critical features that most affected building losses. Further study of those features enables the generation of insights that can be used by various stakeholders, for example, to better understand the influence of a structural system on the building loss or to select appropriate risk mitigation measures. Prior to training the machine learning model, the claim dataset was supplemented with additional data sourced from private and open-access databases, giving complementary information related to building characteristics, seismic demand, liquefaction occurrence, and soil conditions. This poster presents results of a machine learning model trained on a merged dataset using residential claims from the 4 September 2010 earthquake.
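A hedged sketch of the kind of supervised model described above is given below; the column names, values, and the choice of a random forest regressor are assumptions for illustration and do not reproduce the study's actual features or algorithm.

```python
# Illustrative sketch: training a tree-based regressor on a merged claims dataset
# and inspecting feature importances. Column names, values, and the model choice
# are assumptions for the sketch, not the study's actual features or algorithm.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical merged dataset: one row per residential claim.
df = pd.DataFrame({
    "pga_g":         [0.18, 0.25, 0.31, 0.22, 0.40, 0.15],
    "liquefaction":  [0, 1, 1, 0, 1, 0],
    "floor_area_m2": [120, 95, 180, 140, 110, 200],
    "decade_built":  [1960, 1990, 1920, 1975, 2005, 1950],
    "loss_ratio":    [0.02, 0.15, 0.35, 0.05, 0.40, 0.01],   # repair cost / replacement cost
})

X, y = df.drop(columns="loss_ratio"), df["loss_ratio"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print(dict(zip(X.columns, model.feature_importances_.round(3))))  # feature influence on loss
```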

Research papers, University of Canterbury Library

This study analyses the Earthquake Commission’s (EQC) insurance claims database to investigate the influence of seismic intensity and property damage resulting from the Canterbury Earthquake Sequence (CES) on the repair costs and claim settlement duration for residential buildings. Firstly, the ratio of building repair cost to its replacement cost was expressed as a Building Loss Ratio (BLR), which was further extended to Regional Loss Ratio (RLR) for greater Christchurch by multiplying the average of all building loss ratios with the proportion of building stock that lodged an insurance claim. Secondly, the total time required to settle the claim and the time taken to complete each phase of the claim settlement process were obtained. Based on the database, the regional loss ratio for greater Christchurch for three events producing shakings of intensities 6, 7, and 8 on the modified Mercalli intensity scale were 0.013, 0.066, and 0.171, respectively. Furthermore, small (less than NZD15,000), medium (between NZD15,000 and NZD100,000), and large (more than NZD100,000) claims took 0.35-0.55, 1.95-2.45, and 3.35-3.85 years to settle regardless of the building’s construction period and earthquake intensities. The number of claims was also disaggregated by various building characteristics to evaluate their relative contribution to the damage and repair costs.
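The two ratios defined above reduce to simple arithmetic; a minimal sketch with invented numbers follows.

```python
# Sketch of the two loss ratios defined above (all numbers are invented).
# Building Loss Ratio (BLR): repair cost / replacement cost, per building.
# Regional Loss Ratio (RLR): mean BLR of claimed buildings multiplied by the
# proportion of the regional building stock that lodged a claim.
repair_costs      = [12_000, 85_000, 240_000, 30_000]   # NZD, per claimed building
replacement_costs = [450_000, 400_000, 600_000, 500_000]
regional_building_stock = 200_000                        # total dwellings in the region

blr = [r / c for r, c in zip(repair_costs, replacement_costs)]
claim_proportion = len(repair_costs) / regional_building_stock
rlr = (sum(blr) / len(blr)) * claim_proportion
print(f"mean BLR = {sum(blr)/len(blr):.3f}, RLR = {rlr:.6f}")
```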

Research papers, University of Canterbury Library

Social and natural capital are fundamental to people’s wellbeing, often within the context of local community. Developing communities and linking people together provide benefits in terms of mental well-being, physical activity and other associated health outcomes. The research presented here was carried out in Christchurch - Ōtautahi, New Zealand, a city currently re-building after a series of devastating earthquakes in 2010 and 2011. Poor mental health has been shown to be a significant post-earthquake problem, and social connection has been postulated as part of a solution. By curating a disparate set of community services, activities and facilities, organised into a Geographic Information Systems (GIS) database, we created i) an accessibility analysis of 11 health and well-being services, ii) a mobility scenario analysis focusing on 4 general well-being services, and iii) a location-allocation model optimising the locations of 3 primary health care and welfare services. Our results demonstrate that, overall, the majority of neighbourhoods in Christchurch benefit from a high level of accessibility to almost all the services, but with an urban-rural gradient (the further away from the centre, the fewer services are available, as expected). The noticeable exception to this trend is that the more deprived eastern suburbs have poorer accessibility, suggesting social inequity in accessibility. The findings presented here show the potential of optimisation modelling and database curation for urban and community facility planning purposes.
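As a simple illustration of the accessibility component of such a GIS analysis, the sketch below computes the straight-line (haversine) distance from each neighbourhood centroid to its nearest service; the coordinates are invented, and the study's actual analysis used GIS-based methods rather than this simplified straight-line measure.

```python
# Illustrative accessibility measure: great-circle distance from each neighbourhood
# centroid to the nearest service. Coordinates are invented; the actual study used
# GIS analysis rather than this simplified straight-line distance.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi, dlmb = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

neighbourhoods = {"A": (-43.52, 172.58), "B": (-43.55, 172.70)}   # invented centroids
services = [(-43.53, 172.63), (-43.50, 172.55)]                   # invented service locations

for name, (lat, lon) in neighbourhoods.items():
    nearest = min(haversine_km(lat, lon, s_lat, s_lon) for s_lat, s_lon in services)
    print(f"neighbourhood {name}: nearest service {nearest:.1f} km away")
```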

Research papers, University of Canterbury Library

The two most recent major earthquake sequences, the 2010–2011 Canterbury and 2016 Kaikōura earthquakes, necessitate a better understanding of the New Zealand seismic hazard conditions for new building design and detailed assessment of existing buildings. It is important to note, however, that the New Zealand seismic hazard map in NZS 1170.5:2004 is generalised in an effort to cover all of New Zealand and is limited to an earthquake database prior to 2001. It is “common” that site-specific studies provide spectral accelerations different from those shown on the national map (Z values in NZS 1170.5:2004), and sometimes even lower. Moreover, Section 5.2 of Module 1 of the Earthquake Geotechnical Engineering Practice series provides guidelines for performing site-specific studies.

Research papers, University of Canterbury Library

The 2010–2011 Canterbury earthquake sequence began with the 4 September 2010, Mw7.1 Darfield earthquake and includes up to ten events that induced liquefaction. Most notably, widespread liquefaction was induced by the Darfield and Mw6.2 Christchurch earthquakes. The combination of well-documented liquefaction response during multiple events, densely recorded ground motions for the events, and detailed subsurface characterization provides an unprecedented opportunity to add well-documented case histories to the liquefaction database. This paper presents and applies 50 high-quality cone penetration test (CPT) liquefaction case histories to evaluate three commonly used, deterministic, CPT-based simplified liquefaction evaluation procedures. While all the procedures predicted the majority of the cases correctly, the procedure proposed by Idriss and Boulanger (2008) results in the lowest error index for the case histories analyzed, thus indicating better predictions of the observed liquefaction response.
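For context, the deterministic simplified procedures evaluated here all compare a cyclic stress ratio (CSR) demand with a cyclic resistance ratio (CRR) capacity; the sketch below shows the classic Seed-Idriss form of the CSR and the resulting factor of safety, with invented inputs, and omits the magnitude scaling and CPT-based CRR correlations of the specific procedures.

```python
# Sketch of the factor of safety in a simplified liquefaction procedure:
#   CSR = 0.65 * (a_max / g) * (sigma_v / sigma'_v) * r_d   (Seed-Idriss form)
#   FS  = CRR / CSR
# Inputs below are invented; magnitude scaling factors and the CPT-based CRR
# correlations of specific procedures (e.g. Idriss & Boulanger 2008) are omitted.
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, r_d):
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * r_d

csr = cyclic_stress_ratio(a_max_g=0.35,       # peak ground acceleration (g)
                          sigma_v=95.0,       # total vertical stress (kPa)
                          sigma_v_eff=60.0,   # effective vertical stress (kPa)
                          r_d=0.95)           # stress reduction coefficient
crr = 0.22                                    # cyclic resistance ratio from a CPT correlation
print(f"CSR = {csr:.3f}, FS = {crr / csr:.2f}")
```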

Research papers, University of Canterbury Library

This presentation discusses recent empirical ground motion modelling efforts in New Zealand. Firstly, the active shallow crustal and subduction interface and slab ground motion prediction equations (GMPEs) employed in the 2010 update of the national seismic hazard model (NSHM) are discussed. Other NZ-specific GMPEs developed, but not incorporated in the 2010 update, are then discussed; in particular, the active shallow crustal model of Bradley (2010). A brief comparison of the NZ-specific GMPEs with the near-source ground motions recorded in the Canterbury earthquakes is then presented, given that these recordings collectively provide a significant increase in observed strong motions in the NZ catalogue. The ground motion prediction expert elicitation process that was undertaken following the Canterbury earthquakes for active shallow crustal earthquakes is then discussed. Finally, ongoing GMPE-related activities are discussed, including: ground motion and metadata database refinement, improved site characterization of strong motion stations, and predictions for subduction zone earthquakes.

Research papers, University of Canterbury Library

Unreinforced masonry (URM) structures comprise a majority of the global built heritage. The masonry heritage of New Zealand is comparatively young relative to its European counterparts. In a country facing frequent earthquakes, URM buildings are prone to extensive damage and collapse. The Canterbury earthquake sequence demonstrated this, causing damage to over _% of buildings. The ability to assess the severity of building damage is essential for emergency response and recovery. Following the Canterbury earthquakes, the damaged buildings were categorized into various damage states using the EMS-98 scale. This article investigates machine learning techniques such as k-nearest neighbors, decision trees, and random forests to rapidly assess earthquake-induced building damage. The damage data from the Canterbury earthquake sequence are used to obtain the forecast model, and the performance of each machine learning technique is evaluated using the remaining (test) data. Once a high accuracy is obtained, the model is run on a building database collected for Dunedin to predict the expected damage during a rupture of the Akatore Fault.
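A hedged sketch of the classification workflow described above, using scikit-learn; the features and labels are randomly generated stand-ins (labels mimic EMS-98 damage grades), not the study's data.

```python
# Illustrative damage-state classification with the three algorithms named above.
# Features and labels are randomly generated stand-ins; labels mimic EMS-98 grades (1-5).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((200, 3))          # e.g. shaking intensity, storeys, retrofit flag (invented)
y = rng.integers(1, 6, 200)       # EMS-98-style damage grades 1-5 (invented)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for model in (KNeighborsClassifier(n_neighbors=5),
              DecisionTreeClassifier(random_state=0),
              RandomForestClassifier(n_estimators=100, random_state=0)):
    score = model.fit(X_tr, y_tr).score(X_te, y_te)   # accuracy on held-out (test) data
    print(f"{type(model).__name__}: accuracy = {score:.2f}")
```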

Research papers, University of Canterbury Library

This presentation summarizes the development of high-resolution surficial soil velocity models in the Canterbury, New Zealand basin. Shallow (<30m) shear wave velocities were primarily computed based on a combination of a large database of over 15,000 cone penetration test (CPT) logs in and around Christchurch, and a recently-developed Christchurch-specific empirical correlation between soil shear wave velocity and CPT. Large active-source testing at 22 locations and ambient-wavefield surface wave and H/V testing at over 80 locations were utilized in combination with 1700 water well logs to constrain the inter-bedded stratigraphy and velocity of Quaternary sediments up to depths of several hundred meters. Finally, seismic reflection profiles and the ambient-wavefield surface wave data provide constraint on velocities from several hundred meters to several kilometres. At all depths, the high resolution data illustrates the complexity of the soil conditions in the region, and the developed 3D models are presently being used in broadband ground motion simulations to further interpret the observed strong ground motions in the 2010-2011 Canterbury earthquake sequence.

Research papers, University of Canterbury Library

This poster provides a summary of the development of a 3D shallow (z<40m) shear wave velocity (Vs) model for the urban Christchurch, New Zealand region. The model is based on a recently developed Christchurch-specific empirical correlation between Vs and cone penetration test (CPT) data (McGann et al. 2014a,b) and the large high-density database of CPT logs in the greater Christchurch urban area (> 15,000 logs as of 01/01/2014). In particular, the 3D model provides shear wave velocities for the surficial Springston Formation, Christchurch Formation, and Riccarton gravel layers which generally comprise the upper 40m in the Christchurch urban area. Point-estimates are provided on a 200 m-by-200 m grid from which interpolation to other locations can be performed. This model has applications for future site characterization and numerical modeling efforts via maps of time-averaged Vs over specific depths (e.g. Vs30, Vs10) and via the identification of typical Vs profiles for different regions and soil behaviour types within Christchurch. In addition, the Vs model can be used to constrain the near-surface velocities for the 3D seismic velocity model of the Canterbury basin (Lee et al. 2014) currently being developed for the purpose of broadband ground motion simulation.
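Since the model provides point estimates on a 200 m grid from which interpolation to other locations can be performed, a minimal sketch of such an interpolation with SciPy is shown below; the grid values and query point are invented.

```python
# Sketch: interpolating gridded Vs-style point estimates to an arbitrary location.
# The grid coordinates, values, and query point below are invented for illustration.
import numpy as np
from scipy.interpolate import griddata

# Point estimates on a coarse 200 m grid (easting, northing in m; Vs in m/s)
xx, yy = np.meshgrid(np.arange(0, 1001, 200), np.arange(0, 1001, 200))
points = np.column_stack([xx.ravel(), yy.ravel()])
vs30 = 180.0 + 0.05 * points[:, 0] + 0.02 * points[:, 1]    # invented smooth field

query = np.array([[370.0, 540.0]])                          # location of interest
estimate = griddata(points, vs30, query, method="linear")   # piecewise-linear interpolation
print(f"Interpolated Vs at {query[0]}: {estimate[0]:.0f} m/s")
```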

Research papers, University of Canterbury Library

This research investigates the validation of simulated ground motions on complex structural systems. In this study, the seismic responses of two buildings are compared when they are subjected to as-recorded ground motions and simulated ones. The buildings have been designed based on New Zealand codes and physically constructed in Christchurch, New Zealand. The recorded ground motions are selected from a database of 40 stations that recorded the 22 February 2011 Christchurch earthquake. The Graves and Pitarka (2015) methodology is used to generate the simulated ground motions. The geometric mean of maximum inter-story drift and peak floor acceleration are selected as the main seismic responses. Also, the variation of these parameters due to record-to-record variability is investigated. Moreover, statistical hypothesis testing is used to investigate the similarity of results between observed and simulated ground motions. The results indicate a general agreement between the peak floor accelerations calculated from simulated and recorded ground motions for the two buildings, while according to the hypothesis test results, the difference in drift can be significant for the building with the shorter period. The results will help engineers and researchers to use, or revise, the procedure of obtaining seismic responses using simulated ground motions.
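The statistical comparison described above can be illustrated with a paired test on response quantities from the two ground-motion sets; the sketch below uses invented drift values, and the choice of a Wilcoxon signed-rank test is an assumption, since the abstract does not name the specific test used.

```python
# Illustrative paired hypothesis test comparing peak inter-storey drifts obtained
# from recorded vs simulated ground motions at the same stations. Values are
# invented, and the test choice (Wilcoxon signed-rank) is an assumption.
import numpy as np
from scipy.stats import wilcoxon

drift_recorded  = np.array([0.42, 0.55, 0.61, 0.38, 0.70, 0.49, 0.58, 0.44])  # % drift
drift_simulated = np.array([0.45, 0.50, 0.66, 0.41, 0.62, 0.52, 0.63, 0.47])

stat, p_value = wilcoxon(drift_recorded, drift_simulated)
print(f"p-value = {p_value:.3f}")
if p_value < 0.05:
    print("Difference between recorded and simulated responses is significant at 5%.")
else:
    print("No significant difference detected at the 5% level.")
```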

Research papers, University of Canterbury Library

Liquefaction-induced lateral spreading during the 2011 Christchurch earthquake in New Zealand was severe and extensive, and data regarding the displacements associated with the lateral spreading provides an excellent opportunity to better understand the factors that influence these movements. Horizontal displacements measured from optical satellite imagery and subsurface data from the New Zealand Geotechnical Database (NZGD) were used to investigate four distinct lateral spread areas along the Avon River in Christchurch. These areas experienced displacements between 0.5 and 2 m, with the inland extent of displacement ranging from 100 m to over 600 m. Existing empirical and semi-empirical displacement models tend to underestimate displacements at some sites and overestimate them at others. The integrated datasets indicate that the areas with more severe and spatially extensive displacements are associated with thicker and more laterally continuous deposits of liquefiable soil. In some areas, the inland extent of displacements is constrained by geologic boundaries and geomorphic features, as expressed by distinct topographic breaks. In other areas the extent of displacement is influenced by the continuity of liquefiable strata or by the presence of layers that may act as vertical seepage barriers. These observations demonstrate the need to integrate geologic/geomorphic analyses with geotechnical analyses when assessing the potential for lateral spreading movements.

Research papers, University of Canterbury Library

Over 900 buildings in the Christchurch central business district and 10,000 residential homes were demolished following the 22 February 2011 Canterbury earthquake, significantly disrupting the rebuild progress. This study looks to quantify the time required for demolitions following this event, which will be useful for future earthquake recovery planning. This was done using the Canterbury Earthquake Recovery Authority (CERA) demolition database, which allowed an in-depth look into the duration of each phase of the demolition process. The effects of building location, building height, and the stakeholder which initiated the demolition process (i.e. building owner or CERA) were investigated. The demolition process comprises five phases: (i) decision making, (ii) procurement and planning, (iii) demolition, (iv) site clean-up, and (v) completion certification. It was found that the time required to decide to demolish the building made up the majority of the total demolition duration. Demolition projects initiated by CERA had longer procurement and planning durations, but were quicker in the other phases. Demolished buildings in the suburbs had a longer decision-making duration, but location had little effect on the other phases of the demolition process. The decision-making and procurement-and-planning phases of the demolition process were shorter for taller buildings, though the other phases took longer. Fragility functions for the duration of each phase in the demolition process are provided for the various categories of buildings for use in future studies.
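Since the study provides fragility functions for the duration of each phase, the sketch below shows one common way such a curve can be constructed: fitting a lognormal CDF to observed phase durations. The durations are invented, and the study's actual fitting approach is not reproduced here.

```python
# Illustrative duration "fragility" curve: probability that a demolition phase is
# complete within t years, modelled as a lognormal CDF fitted to observed durations.
# Durations are invented; the study's actual fitting method is not reproduced here.
import numpy as np
from scipy import stats

durations_years = np.array([0.4, 0.7, 1.1, 1.6, 2.0, 2.6, 3.1, 3.9])  # invented phase durations

# Fit a lognormal with location fixed at zero (common for duration/fragility fitting).
shape, loc, scale = stats.lognorm.fit(durations_years, floc=0)
median, beta = scale, shape          # lognormal median and logarithmic standard deviation

for t in (1.0, 2.0, 4.0):
    p = stats.lognorm.cdf(t, shape, loc=0, scale=scale)
    print(f"P(phase complete within {t:.0f} yr) = {p:.2f}")
print(f"median = {median:.2f} yr, dispersion beta = {beta:.2f}")
```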

Research papers, University of Canterbury Library

Asset management in power systems is exercised to improve network reliability to provide confidence and security for customers and asset owners. While there are well-established reliability metrics that are used to measure and manage business-as-usual disruptions, an increasing appreciation of the consequences of low-probability high-impact events means that resilience is increasingly being factored into asset management in order to provide robustness and redundancy to components and wider networks. This is particularly important for electricity systems, given that a range of other infrastructure lifelines depend upon their operation. The 2010-2011 Canterbury Earthquake Sequence provides valuable insights into electricity system criticality and resilience in the face of severe earthquake impacts. While above-ground assets are relatively easy to monitor and repair, underground assets such as cables emplaced across wide areas in the distribution network are difficult to monitor, identify faults on, and repair. This study has characterised in detail the impacts to buried electricity cables in Christchurch resulting from seismically-induced ground deformation caused primarily by liquefaction and lateral spread. Primary modes of failure include cable bending, stretching, insulation damage, joint breaking, and being pulled off other equipment such as substation connections. Performance and repair data have been compiled into a detailed geospatial database, which in combination with spatial models of peak ground acceleration, peak ground velocity and ground deformation, will be used to establish rigorous relationships between seismicity and performance. These metrics will be used to inform asset owners of network performance in future earthquakes, further assess component criticality, and provide resilience metrics.
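One simple form of the seismicity-performance relationship described above is a repair rate (repairs per km of cable) computed per ground-motion intensity bin; the sketch below illustrates this with invented data, not values from the geospatial database.

```python
# Illustrative repair-rate relationship: repairs per km of buried cable, binned by
# peak ground velocity (PGV). All values below are invented for the sketch.
import pandas as pd

cables = pd.DataFrame({
    "pgv_cm_s":  [12, 18, 25, 33, 41, 55, 62, 75],   # PGV at the cable segment
    "length_km": [1.2, 0.8, 2.1, 1.5, 0.9, 1.1, 0.7, 1.3],
    "repairs":   [0, 0, 1, 2, 2, 3, 3, 5],
})

bins = [0, 20, 40, 60, 80]
cables["pgv_bin"] = pd.cut(cables["pgv_cm_s"], bins)
summary = cables.groupby("pgv_bin", observed=True).agg(
    total_km=("length_km", "sum"), total_repairs=("repairs", "sum"))
summary["repair_rate_per_km"] = summary["total_repairs"] / summary["total_km"]
print(summary)
```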

Research papers, University of Canterbury Library

The 2010-2011 Christchurch earthquakes generated damage in several Reinforced Concrete (RC) buildings which had RC walls as the principal resisting element against earthquake demands. Despite agreement between structural engineers and researchers on an overall successful performance, there was a lack of knowledge about the behaviour of the damaged structures, and even more so about repaired structures, which triggered arguments between different parties that persist to this day. It is therefore necessary to understand the capacity of buildings after an earthquake and to see how simple repair techniques improve building performance. This study will assess the residual capacity of ductile slender RC walls according to the current standard in New Zealand, NZS 3101.1:2006 (Amendment 3). First, a database of repaired RC walls is compiled to gather previous studies and to evaluate them against existing international guidelines. Then, an archetype building is designed, and the wall is extracted and scaled. Four half-scale walls were designed and will be constructed and tested at the Structures Testing Laboratory at The University of Auckland. The overall dimensions are 3 m high, 2 m long, and 0.175 m thick. All four walls will be identical, with differences in the loading protocol and the presence or absence of a repair technique. Results will be useful for assessing the residual capacity of a damaged wall compared with its original behaviour, as well as the capacity of walls repaired with simple techniques. The expected behaviour is focused on large changes in stiffness, more evident than in previously tested RC beams reported in the literature.

Research papers, University of Canterbury Library

After a high-intensity seismic event, inspections of structural damage need to be carried out as soon as possible in order to optimize the emergency management, as well as to improve the recovery time. In current practice, damage inspections are performed by an experienced engineer who physically inspects the structures. This approach not only requires a significant amount of time and highly skilled human resources, but also raises concerns about the inspector’s safety. A promising alternative is the use of new technologies, such as drones and artificial intelligence, which can perform part of the damage classification task. In fact, drones can safely access high-hazard components of structures, for instance bridge piers or abutments, and perform the reconnaissance using high-resolution cameras. Furthermore, images can be automatically processed by machine learning algorithms, and damage detected. In this paper, the possibility of applying such technologies for inspecting New Zealand bridges is explored. Firstly, a machine learning model for damage detection through image analysis is presented. Specifically, the algorithm was trained to recognize cracks in concrete members. A sensitivity analysis was carried out to evaluate the algorithm accuracy using database images. Depending on the confidence level desired, i.e. by allowing manual classification where the algorithm confidence is below a specific tolerance, the accuracy was found to reach up to 84.7%. In the second part, the model is applied to detect the damage observed on the Anzac Bridge (GPS coordinates -43.500865, 172.701138) in Christchurch by performing a drone reconnaissance. Results show that the accuracy of the damage detection was equal to 88% and 63% for cracking and spalling, respectively.
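The manual classification of detections whose algorithm confidence falls below a specific tolerance, as described above, amounts to a simple triage rule over the model's output scores; a minimal sketch follows, with invented detection scores.

```python
# Sketch of the confidence-threshold triage described above: detections whose model
# confidence falls below a chosen tolerance are routed to manual review instead of
# being accepted automatically. The detection scores below are invented.
def triage(detections, tolerance=0.80):
    accepted, manual_review = [], []
    for image_id, label, confidence in detections:
        if confidence >= tolerance:
            accepted.append((image_id, label, confidence))
        else:
            manual_review.append((image_id, label, confidence))
    return accepted, manual_review

detections = [("img_001", "crack", 0.93), ("img_002", "crack", 0.62),
              ("img_003", "spalling", 0.71), ("img_004", "crack", 0.88)]
auto, manual = triage(detections, tolerance=0.80)
print(f"auto-accepted: {len(auto)}, sent to manual review: {len(manual)}")
```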

Research papers, University of Canterbury Library

High-quality ground motion records are required for engineering applications including response history analysis, seismic hazard development, and validation of physics-based ground motion simulations. However, the determination of whether a ground motion record is high-quality is poorly handled by automation with mathematical functions and can become prohibitive if done manually. Machine learning applications are well-suited to this problem, and a previous feed-forward neural network was developed (Bellagamba et al. 2019) to determine high-quality records from small crustal events in the Canterbury and Wellington regions for simulation validation. This prior work was however limited by the omission of moderate-to-large magnitude events and those from other tectonic environments, as well as a lack of explicit determination of the minimum usable frequency of the ground motion. To address these shortcomings, an updated neural network was developed to predict the quality of ground motion records for all magnitudes and all tectonic sources—active shallow crustal, subduction intraslab, and subduction interface—in New Zealand. The predictive performance of the previous feed-forward neural network was matched by the neural network in the domain of small crustal records, and this level of predictive performance is now extended to all source magnitudes and types in New Zealand making the neural network applicable to global ground motion databases. Furthermore, the neural network provides quality and minimum usable frequency predictions for each of the three orthogonal components of a record which may then be mapped into a binary quality decision or otherwise applied as desired. This framework provides flexibility for the end user to predict high-quality records with various acceptability thresholds allowing for this neural network to be used in a range of applications.
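The mapping of per-component quality predictions into a binary quality decision described above can be expressed as a simple thresholding rule; the sketch below is illustrative only, and the score values, threshold, and the all-components-must-pass criterion are assumptions rather than outputs of the published network.

```python
# Illustrative mapping from per-component neural-network outputs to a binary
# record-level quality decision. The scores, threshold, and the "all components
# must pass" criterion are assumptions for this sketch, not the published model.
component_predictions = {
    "H1": {"quality_score": 0.91, "min_usable_freq_hz": 0.12},
    "H2": {"quality_score": 0.87, "min_usable_freq_hz": 0.15},
    "V":  {"quality_score": 0.64, "min_usable_freq_hz": 0.40},
}

QUALITY_THRESHOLD = 0.75   # acceptability threshold chosen by the end user

record_is_high_quality = all(
    comp["quality_score"] >= QUALITY_THRESHOLD for comp in component_predictions.values()
)
usable_from_hz = max(comp["min_usable_freq_hz"] for comp in component_predictions.values())
print(f"high quality: {record_is_high_quality}, usable above {usable_from_hz:.2f} Hz")
```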

Research papers, University of Canterbury Library

The purpose of this thesis is to conduct a detailed examination of the forward-directivity characteristics of near-fault ground motions produced in the 2010-11 Canterbury earthquakes, including evaluating the efficacy of several existing empirical models which form the basis of frameworks for considering directivity in seismic hazard assessment. A wavelet-based pulse classification algorithm developed by Baker (2007) is firstly used to identify and characterise ground motions which demonstrate evidence of forward-directivity effects from significant events in the Canterbury earthquake sequence. The algorithm fails to classify a large number of ground motions which clearly exhibit an early-arriving directivity pulse due to: (i) incorrect pulse extraction resulting from the presence of pulse-like features caused by other physical phenomena; and (ii) inadequacy of the pulse indicator score used to carry out binary pulse-like/non-pulse-like classification. An alternative ‘manual’ approach is proposed to ensure ‘correct’ pulse extraction, and the classification process is also guided by examination of the horizontal velocity trajectory plots and source-to-site geometry. Based on the above analysis, 59 pulse-like ground motions are identified from the Canterbury earthquakes which, in the author's opinion, are caused by forward-directivity effects. The pulses are also characterised in terms of their period and amplitude. A revised version of the B07 algorithm developed by Shahi (2013) is also subsequently utilised, but without any notable improvement in the pulse classification results. A series of three chapters is dedicated to assessing the predictive capabilities of empirical models for: (i) the probability of pulse occurrence; (ii) the response spectrum amplification caused by the directivity pulse; and (iii) the period and amplitude (peak ground velocity, PGV) of the directivity pulse, using observations from four significant events in the Canterbury earthquakes. Based on the results of logistic regression analysis, it is found that the pulse probability model of Shahi (2013) provides the most improved predictions in comparison to its predecessors. Pulse probability contour maps are developed to scrutinise observations of pulses/non-pulses with predicted probabilities. A direct comparison of the observed and predicted directivity amplification of acceleration response spectra reveals the inadequacy of broadband directivity models, which form the basis of the near-fault factor in the New Zealand loadings standard, NZS1170.5:2004. In contrast, a recently developed narrowband model by Shahi & Baker (2011) provides significantly improved predictions by amplifying the response spectra within a small range of periods. The significant positive bias demonstrated by the residuals associated with all models at longer vibration periods (in the Mw7.1 Darfield and Mw6.2 Christchurch earthquakes) is likely due to the influence of basin-induced surface waves and non-linear soil response. Empirical models for the pulse period notably under-predict observations from the Darfield and Christchurch earthquakes, inferred as being a result of both the effect of nonlinear site response and the influence of the Canterbury basin. In contrast, observed pulse periods from the smaller magnitude June (Mw6.0) and December (Mw5.9) 2011 earthquakes are in good agreement with predictions. Models for the pulse amplitude generally provide accurate estimates of the observations at source-to-site distances between 1 km and 10 km. At longer distances, observed PGVs are significantly under-predicted due to their slower apparent attenuation. Mixed-effects regression is employed to develop revised models for both parameters using the latest NGA-West2 pulse-like ground motion database. A pulse period relationship which accounts for the effect of faulting mechanism, using rake angle as a continuous predictor variable, is developed. The use of a larger database in model development, however, does not result in improved predictions of pulse period for the Darfield and Christchurch earthquakes. In contrast, the revised model for PGV provides a more appropriate attenuation of the pulse amplitude with distance, and does not exhibit the bias associated with previous models. Finally, the effects of near-fault directivity are explicitly included in NZ-specific probabilistic seismic hazard analysis (PSHA) using the narrowband directivity model of Shahi & Baker (2011). Seismic hazard analyses are conducted with and without considering directivity for typical sites in Christchurch and Otira. The inadequacy of the near-fault factor in NZS1170.5:2004 is apparent based on a comparison with the directivity amplification obtained from PSHA.
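The logistic regression analysis mentioned in this abstract can be illustrated with a minimal fit of pulse/no-pulse observations against a source-to-site predictor; the data and the single distance predictor below are invented, and the thesis's analysis used the predictors of the published pulse-probability models rather than this simple fit.

```python
# Illustrative logistic regression of directivity-pulse occurrence against a
# source-to-site predictor (closest distance, km). Data are invented; the thesis
# used the predictors of the published pulse-probability models, not this fit.
import numpy as np
from sklearn.linear_model import LogisticRegression

distance_km = np.array([1, 2, 3, 5, 7, 10, 15, 20, 30, 40]).reshape(-1, 1)
pulse_observed = np.array([1, 1, 1, 1, 0, 1, 0, 0, 0, 0])   # 1 = directivity pulse identified

model = LogisticRegression().fit(distance_km, pulse_observed)
for r in (2.0, 10.0, 25.0):
    p = model.predict_proba([[r]])[0, 1]
    print(f"P(pulse | distance = {r:.0f} km) = {p:.2f}")
```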

Research papers, University of Canterbury Library

The overarching goal of this dissertation is to improve predictive capabilities of geotechnical seismic site response analyses by incorporating additional salient physical phenomena that influence site effects. Specifically, multidimensional wave-propagation effects that are neglected in conventional 1D site response analyses are incorporated by: (1) combining results of 3D regional-scale simulations with 1D nonlinear wave-propagation site response analysis, and (2) modelling soil heterogeneity in 2D site response analyses using spatially-correlated random fields to perturb soil properties. A method to combine results from 3D hybrid physics-based ground motion simulations with site-specific nonlinear site response analyses was developed. The 3D simulations capture 3D ground motion phenomena on a regional scale, while the 1D nonlinear site response, which is informed by detailed site-specific soil characterization data, can capture site effects more rigorously. Simulations of 11 moderate-to-large earthquakes from the 2010-2011 Canterbury Earthquake Sequence (CES) at 20 strong motion stations (SMS) were used to validate simulations with observed ground motions. The predictions were compared to those from an empirically-based ground motion model (GMM), and from 3D simulations with simplified VS30-based site effects modelling. By comparing all predictions to observations at seismic recording stations, it was found that the 3D physics-based simulations can predict ground motions with comparable bias and uncertainty as the GMM, albeit, with significantly lower bias at long periods. Additionally, the explicit modelling of nonlinear site-response improves predictions significantly compared to the simplified VS30-based approach for soft-soil or atypical sites that exhibit exceptionally strong site effects. A method to account for the spatial variability of soils and wave scattering in 2D site response analyses was developed and validated against a database of vertical array sites in California. The inputs required to run the 2D analyses are nominally the same as those required for 1D analyses (except for spatial correlation parameters), enabling easier adoption in practice. The first step was to create the platform and workflow, and to perform a sensitivity study involving 5,400 2D model realizations to investigate the influence of random field input parameters on wave scattering and site response. Boundary conditions were carefully assessed to understand their effect on the modelled response and select appropriate assumptions for use on a 2D model with lateral heterogeneities. Multiple ground-motion intensity measures (IMs) were analyzed to quantify the influence from random field input parameters and boundary conditions. It was found that this method is capable of scattering seismic waves and creating spatially-varying ground motions at the ground surface. The redistribution of ground-motion energy across wider frequency bands, and the scattering attenuation of high-frequency waves in 2D analyses, resemble features observed in empirical transfer functions (ETFs) computed in other studies. The developed 2D method was subsequently extended to more complicated multi-layer soil profiles and applied to a database of 21 vertical array sites in California to test its appropriateness for future predictions. Again, different boundary condition and input motion assumptions were explored to extend the method to the in-situ conditions of a vertical array (with a sensor embedded in the soil). ETFs were compared to theoretical transfer functions (TTFs) from conventional 1D analyses and 2D analyses with heterogeneity. Residuals of transfer-function-based IMs, and IMs of surface ground motions, were also used as validation metrics. The spatial variability of transfer-function-based IMs was estimated from 2D models and compared to the event-to-event variability from ETFs. This method was found capable of significantly improving predictions of median ETF amplification factors, especially for sites that display higher event-to-event variability. For sites that are well represented by 1D methods, the 2D approach can underpredict amplification factors at higher modes, suggesting that the level of heterogeneity may be over-represented by the 2D random field models used in this study.
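A hedged sketch of the spatially-correlated random field perturbation described above: a Gaussian field with an exponential autocorrelation is sampled on a small 2D grid via Cholesky factorisation and used to perturb a baseline Vs value. The correlation lengths, perturbation magnitude, and grid size are illustrative assumptions, not the dissertation's parameters.

```python
# Illustrative 2D spatially-correlated random field used to perturb a baseline Vs.
# An exponential autocorrelation model is sampled via Cholesky factorisation of the
# covariance matrix. Correlation lengths, sigma_ln, and grid size are assumptions.
import numpy as np

nx, nz = 20, 10                    # small grid: 20 columns x 10 rows
dx, dz = 5.0, 2.0                  # grid spacing (m)
theta_x, theta_z = 40.0, 8.0       # horizontal / vertical correlation lengths (m)
sigma_ln = 0.2                     # std. dev. of ln(Vs) perturbations

x = np.arange(nx) * dx
z = np.arange(nz) * dz
X, Z = np.meshgrid(x, z)           # coordinates of every grid node
coords = np.column_stack([X.ravel(), Z.ravel()])

# Separable exponential autocorrelation: rho = exp(-2|dx|/theta_x - 2|dz|/theta_z)
dxm = np.abs(coords[:, None, 0] - coords[None, :, 0])
dzm = np.abs(coords[:, None, 1] - coords[None, :, 1])
cov = sigma_ln**2 * np.exp(-2.0 * dxm / theta_x - 2.0 * dzm / theta_z)

rng = np.random.default_rng(1)
L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(cov)))   # jitter for numerical stability
field = (L @ rng.standard_normal(len(cov))).reshape(nz, nx)

vs_baseline = 250.0                                       # m/s, uniform baseline layer
vs_perturbed = vs_baseline * np.exp(field)                # lognormal perturbation of Vs
print(vs_perturbed.round(0))
```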

Research papers, University of Canterbury Library

The recent earthquakes in Christchurch have made it clear that issues exist with current RC frame design in New Zealand. In particular, beam elongation in RC frame buildings was widespread and resulted in numerous buildings being rendered irreparable. Design solutions to overcome this problem are clearly needed, and the slotted beam is one such solution. This system has a distinct advantage over other damage avoidance design systems in that it can be constructed using current industry techniques and conventional reinforcing steel. As the name suggests, the slotted beam incorporates a vertical slot along part of the beam depth at the beam-column interface. Geometric beam elongation is accommodated via opening and closing of these slots during seismically induced rotations, while the top concrete hinge is heavily reinforced to prevent material inelastic elongation. Past research on slotted beams has shown that the bond demand on the bottom longitudinal reinforcement is increased compared with equivalent monolithic systems. Satisfying this increased bond demand through conventional means may yield impractical and economically less viable column dimensions. The same research also indicated that the joint shear mechanism was different to that observed within monolithic joints and that additional horizontal reinforcement was required as a result. Through a combination of theoretical investigation, forensic analysis, and database study, this research addresses the above issues and develops design guidelines. The use of supplementary vertical joint stirrups was investigated as a means of improving bond performance without the need for non-standard reinforcing steel or other hardware. These design guidelines were then validated experimentally with the testing of two 80% scale beam-column sub-assemblies. The revised provisions for bond within the bottom longitudinal reinforcement were found to be adequate, while the top longitudinal reinforcement remained nominally elastic throughout both tests. An alternate mechanism was found to govern joint shear behaviour, removing the need for additional horizontal joint reinforcement. Current NZS3101:2006 joint shear reinforcement provisions were found to be more than adequate, given that the typically larger column depths required render the strut mechanism more effective. The test results were then used to further refine design recommendations for practicing engineers. Finally, conclusions and future research requirements are outlined.