Despite the region's relatively low seismicity, a large earthquake in Waikato is expected to have a high impact, given the region's fourth-largest regional population and economy and its high density of critical infrastructure systems. Furthermore, Waikato has a deep, soft sedimentary basin, which increases the regional seismic hazard through trapping and amplification of seismic waves and the generation of localised surface waves within the basin. This phenomenon, known as the “Basin Effect”, has been linked to increased damage in several historical earthquakes, including the 2010-2011 Canterbury earthquakes. To quantitatively model the basin response and improve the understanding of regional seismic hazard, geophysical methods will be used to develop shear wave velocity profiles across the Waikato basin. Active surface wave methods involve the deployment of linear arrays of geophones to record surface waves generated by a sledgehammer, while passive surface wave methods involve the deployment of two-dimensional seismometer arrays to record ambient vibrations. At each site, the planned testing includes one active test and two to four passive arrays. The acquired data are processed to develop dispersion curves, which describe surface wave propagation velocity as a function of frequency (or wavelength). The dispersion curves are then inverted using the Geopsy software package to develop a suite of shear wave velocity profiles. Currently, more than ten sites in Waikato are under consideration for this project. This poster presents preliminary results from the two sites that have been tested to date. The shear wave velocity profiles from all sites will be used to produce a 3D velocity model of the Waikato basin, as part of QuakeCoRE Flagship Programme 1.
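For illustration only, the sketch below shows one simple way a surface wave phase velocity (one point of a dispersion curve per frequency) can be estimated from a pair of geophone records via the cross-spectrum phase; the geometry, sample rate, and synthetic signals are hypothetical, and the project itself uses multi-channel arrays processed with Geopsy rather than this two-station approach.

```python
# Minimal two-station phase-velocity sketch (not the project's Geopsy workflow).
import numpy as np
from scipy.signal import csd

def phase_velocity(rec_a, rec_b, dx, fs, nperseg=1024):
    """Estimate surface-wave phase velocity vs frequency from two geophone
    records separated by dx metres, using the cross-power-spectrum phase."""
    f, Pab = csd(rec_a, rec_b, fs=fs, nperseg=nperseg)
    dphi = -np.unwrap(np.angle(Pab))          # phase lag of the far record (rad)
    with np.errstate(divide="ignore", invalid="ignore"):
        c = 2.0 * np.pi * f * dx / dphi       # c(f) = omega * dx / delta-phi
    return f, c

# Synthetic check: a 10 Hz wavelet arriving 0.05 s later at a geophone 10 m
# away implies a phase velocity of roughly 200 m/s near 10 Hz.
fs, dx = 1000.0, 10.0
t = np.arange(0, 2.0, 1.0 / fs)
wavelet = lambda t0: np.exp(-((t - t0) ** 2) / 0.01) * np.sin(2 * np.pi * 10 * (t - t0))
f, c = phase_velocity(wavelet(0.5), wavelet(0.55), dx, fs)
band = (f > 5) & (f < 15)
print(np.nanmedian(c[band]))                  # roughly 200 m/s
```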
This study explicitly investigates uncertainties in physics-based ground motion simulation validation for earthquakes in the Canterbury region. The simulations utilise the Graves and Pitarka (2015) hybrid methodology, with parametric uncertainties quantified separately for the comprehensive-physics and simplified-physics components of the model. The study is limited to the simulation of 148 small-magnitude (Mw 3.5 – 5) earthquakes, for which a point-source approximation is used to represent the source rupture; this also enables a focus on a small number of relevant uncertainties. The parametric uncertainties under consideration were selected through sensitivity analysis and specifically include magnitude, the Brune stress parameter, and the high-frequency rupture velocity. Twenty Monte Carlo realisations were used to sample the parameter uncertainties for each of the 148 events. Residuals were computed for the following intensity measures: spectral acceleration, peak ground velocity, Arias intensity, and significant duration. Using these residuals, validation was performed by assessing systematic biases in the site and source terms obtained from mixed-effects regression. Based on the results to date, initial standard deviation recommendations for the parameter uncertainties have been obtained from the Canterbury simulations. This work ultimately provides an initial step toward the explicit incorporation of modelling uncertainty in simulated ground motion predictions for future events, which will improve the use of simulation models in seismic hazard analysis. We plan to subsequently assess uncertainties for larger-magnitude events with more complex ruptures and for events across a larger geographic region, as well as uncertainties due to path attenuation, site effects, and more general model epistemic uncertainties.
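As a schematic illustration of the Monte Carlo parameter sampling described above, the sketch below draws 20 realisations of the three perturbed parameters for each of the 148 events; the nominal values, distribution shapes, and standard deviations shown are placeholders, not the values adopted in the study.

```python
# Illustrative Monte Carlo sampling of source-parameter uncertainties.
import numpy as np

rng = np.random.default_rng(1)
n_events, n_realisations = 148, 20

# Nominal (catalogue) values for each event -- hypothetical example arrays.
mw_nominal = rng.uniform(3.5, 5.0, n_events)
stress_nominal = np.full(n_events, 50.0)     # Brune stress parameter (bars)
rvfac_nominal = np.full(n_events, 0.8)       # high-frequency rupture velocity factor

# Assumed parameter uncertainties (illustrative only).
sigma_mw = 0.1                               # additive, magnitude units
sigma_ln_stress = 0.5                        # lognormal, ln units
sigma_rvfac = 0.05                           # additive

samples = {
    "mw": mw_nominal + rng.normal(0.0, sigma_mw, (n_realisations, n_events)),
    "stress": stress_nominal * np.exp(rng.normal(0.0, sigma_ln_stress, (n_realisations, n_events))),
    "rvfac": rvfac_nominal + rng.normal(0.0, sigma_rvfac, (n_realisations, n_events)),
}
# Each of the 20 x 148 parameter sets would define one simulation run, whose
# intensity-measure residuals then feed the mixed-effects regression.
print({k: v.shape for k, v in samples.items()})
```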
This paper investigates the effects of variability in source rupture parameters on site-specific physics-based simulated ground motions, ascertained through the systematic analysis of ground motion intensity measures. As a preliminary study, we consider simulations of the 22 February 2011 Christchurch earthquake using the Graves and Pitarka (2015) methodology. The effects of source variability are considered via a sensitivity study in which individual parameters (hypocentre location, earthquake magnitude, average rupture velocity, fault geometry, and the Brune stress parameter) are varied by one standard deviation. The sensitivities of the simulated ground motion intensity measures are subsequently compared against observational data. The preliminary results indicate that uncertainties in the stress parameter and the rupture velocity have the most significant effect on high-frequency amplitudes, whereas magnitude uncertainty is most influential on spectral acceleration amplitudes at low frequencies. Further work is required to extend this preliminary study to consider more events exhaustively and to include parameter covariance. The ultimate results of this research will assist in validating the accuracy with which the overall simulation method captures the effects of rupture parameters, which is essential for the use of simulated ground motion models in probabilistic seismic hazard analysis.
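The sketch below outlines the one-at-a-time perturbation bookkeeping implied by the sensitivity study above; run_simulation, the baseline values, and the one-standard-deviation perturbations are hypothetical stand-ins for the actual Graves and Pitarka (2015) workflow and source description.

```python
# One-at-a-time sensitivity bookkeeping (illustrative; no actual simulator).
from typing import Callable, Dict
import numpy as np

baseline = {"mw": 6.2, "stress": 60.0, "rupture_vel": 0.8,
            "hypocentre_depth_km": 6.0}                 # illustrative values
sigmas   = {"mw": 0.1, "stress": 0.3, "rupture_vel": 0.05,
            "hypocentre_depth_km": 1.0}                 # assumed 1-sigma perturbations

def sensitivity(run_simulation: Callable[[Dict[str, float]], np.ndarray],
                observed_im: np.ndarray) -> Dict[str, float]:
    """Perturb each source parameter by +/- one standard deviation and report
    the largest change in the mean ln(IM) residual relative to observations."""
    base_resid = np.mean(np.log(run_simulation(baseline) / observed_im))
    out = {}
    for name, sigma in sigmas.items():
        deltas = []
        for sign in (+1.0, -1.0):
            params = dict(baseline, **{name: baseline[name] + sign * sigma})
            resid = np.mean(np.log(run_simulation(params) / observed_im))
            deltas.append(resid - base_resid)
        out[name] = max(abs(d) for d in deltas)
    return out
# Calling sensitivity() requires a real simulator returning intensity measures
# at the stations of interest; only the perturbation loop is shown here.
```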
The development of Digital City technologies to manage and visualise spatial information has increasingly become a focus of the research community and of city authorities. Traditionally, the Geographic Information Systems (GIS) and Building Information Models (BIM) underlying Digital Cities have been used independently. However, integrating GIS and BIM into a single platform provides benefits for project and asset management and is applicable to a range of issues. One of these benefits is the means to access and analyse large datasets describing the built environment in order to characterise urban risk from, and resilience to, natural hazards. The aim of this thesis is to further explore methodologies of integration in two distinct areas. The first is integration through connectivity of heterogeneous datasets, in which GIS spatial infrastructure data are merged with 3D BIM building data to create a digital twin. The second is integration through analysis, whereby data from the digital twin are extracted and integrated with computational models. To achieve this, a workflow was developed to identify the datasets required for a digital twin and to integrate those datasets through a combination of semi-autonomous conversion, translation and extension of data, and semantic web and services-based processes. Through the use of a designed schema, the data were streamed in a homogeneous format to a web-based platform.

To demonstrate the value of this workflow with respect to urban risk and resilience, the process was applied to the Taiora: Queen Elizabeth II recreation and sports centre in eastern Christchurch, New Zealand. After integration of as-built GIS and BIM datasets, targeted data extraction was implemented, with outputs tailored for analysis in an infrastructure serviceability loss model, which assessed potable water network performance in the 22 February 2011 Christchurch earthquake. Using the same earthquake conditions as the serviceability loss model, the performance of infrastructure assets in service at the time of the earthquake was compared with that of the new assets rebuilt at the site post-earthquake. Owing to the improved resilience of the potable water infrastructure resulting from the installation of ductile pipes, the serviceability loss model estimated a 35.5% decrease in the probability of service loss. To complete the workflow, the results from the external analysis were uploaded to the web-based platform.

One of the more significant outcomes of the workflow was the identification of a lack of mandated metadata standards for the fittings/valves connecting a building to private laterals. While the GIS and BIM data visually show the building and pipes as connected, the semantic data do not include this connectivity relationship. This has no material impact on the current serviceability loss model, as connectivity is not one of its defined parameters. However, a proposed modification to the model would use this metadata to further assess the robustness of the physical connection, increasing the number of variables available for estimating the probability of service loss. This thesis has made a methodological contribution to urban resilience analysis by demonstrating how readily available, up-to-date building and infrastructure data can be integrated and, with tailored extraction from a Digital City platform, used for disaster impact analysis in an external computational engine, with the results in turn imported into and visualised in the Digital City platform.
The workflow demonstrated that translation and integration of data would be more successful if a regional or national mandate were implemented requiring the submission of consent documentation in a specified standard BIM format. The results of this thesis indicate that the key to the success of an integrated tool lies in the initial workflow required to ensure that all data can be either captured in, or translated into, an interoperable format.
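As a purely schematic sketch of how pipe attributes extracted from a digital twin could feed a pipe-break serviceability check of the kind summarised above, the example below applies a Poisson repair-rate model to a brittle versus a ductile lateral; the repair-rate coefficient, material factors, lengths, and shaking level are placeholders and are not the relations or values used in the thesis' serviceability loss model.

```python
# Schematic pipe-break serviceability sketch (placeholder coefficients only).
import math

def p_service_loss(length_km: float, pgv_cm_s: float, material_factor: float,
                   base_rate_per_km_per_cm_s: float = 0.002) -> float:
    """Probability of at least one break on a pipe segment, assuming breaks
    follow a Poisson process with a PGV-proportional repair rate."""
    repair_rate = base_rate_per_km_per_cm_s * material_factor * pgv_cm_s  # repairs/km
    return 1.0 - math.exp(-repair_rate * length_km)

# Hypothetical comparison of the pre-earthquake lateral vs a rebuilt ductile one.
pgv = 40.0                                              # cm/s at the site (illustrative)
old = p_service_loss(0.3, pgv, material_factor=1.0)     # brittle material
new = p_service_loss(0.3, pgv, material_factor=0.3)     # ductile material
print(f"old {old:.3f}, new {new:.3f}, reduction {100 * (old - new) / old:.1f}%")
```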