Seismic isolation is an effective technology for significantly reducing damage to buildings and building contents. However, its application to light-frame wood buildings has so far been unable to overcome cost and technical barriers such as susceptibility to movement during high-wind loading. The precursors to research on the isolation of residential buildings were the 1994 Northridge Earthquake (Mw 6.7) in the United States and the 1995 Kobe Earthquake (Mw 6.9) in Japan. While only a small number of lives were lost in residential buildings in these events, the economic impact was significant, with over half of earthquake recovery costs attributed to the repair and reconstruction of residential buildings. A value case has been explored to highlight the benefits of seismically isolated residential buildings compared to standard fixed-base dwellings in the Wellington region. Loss data derived from insurance claim information from the 2011 Christchurch Earthquake have been used by researchers to determine vulnerability functions for the current light-frame wood building stock. By further considering the loss attributed to drift- and acceleration-sensitive components, together with a simplified single degree of freedom (SDOF) building model, a method for determining vulnerability functions for seismically isolated buildings was developed. These vulnerability functions were then applied directly in a loss assessment using the GNS-developed software RiskScape. Vulnerability was shown to reduce dramatically for isolated buildings compared to an equivalent fixed-base building, and as a result the monetary savings in a given earthquake scenario were significant. This work is expected to drive further interest in the development of solutions for the seismic isolation of residential dwellings, one of which is further considered and presented herein.
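To illustrate the component-based idea behind the vulnerability functions described above, the following minimal Python sketch combines drift-sensitive and acceleration-sensitive loss contributions evaluated against SDOF response demands. The loss-curve medians, dispersions, weights, and demand values are hypothetical placeholders, not the values used in the study.

```python
# Illustrative sketch only (not the authors' method): splitting a dwelling's
# loss ratio into drift-sensitive and acceleration-sensitive components
# evaluated against SDOF response demands.  All medians, dispersions and
# weights below are hypothetical placeholders.
from scipy.stats import lognorm


def component_loss(demand, median, beta):
    """Lognormal loss-vs-demand curve for one component group."""
    return lognorm.cdf(demand, s=beta, scale=median)


def vulnerability(drift, pfa_g, w_drift=0.6, w_accel=0.4):
    """Total loss ratio as a weighted sum of the two component groups.

    drift : peak interstorey drift ratio from the SDOF model
    pfa_g : peak floor acceleration (g) from the SDOF model
    """
    drift_loss = component_loss(drift, median=0.01, beta=0.5)  # placeholder
    accel_loss = component_loss(pfa_g, median=0.8, beta=0.6)   # placeholder
    return w_drift * drift_loss + w_accel * accel_loss


# Isolation reduces both superstructure drift and floor acceleration,
# so both loss components (and hence the combined vulnerability) drop:
print(vulnerability(drift=0.012, pfa_g=0.9))   # fixed-base demands (illustrative)
print(vulnerability(drift=0.003, pfa_g=0.25))  # isolated demands (illustrative)
```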
Hybrid broadband simulation methods typically compute the high-frequency portion of ground motions using a simplified-physics approach (commonly known as the “stochastic method”) with the same 1D velocity profile, anelastic attenuation profile and site-attenuation (κ0) value for all sites. However, these parameters relating to Earth structure are known to vary spatially. In this study we modify this conventional approach to high-frequency ground shaking by using site-specific input parameters (referred to as “site-specific”) and analyze the improvements over using the same parameters for all sites (referred to as “generic”). First, we examine theoretically how different 1D velocity profiles, anelastic attenuation profiles and site-attenuation (κ0) values affect the Fourier Acceleration Spectrum (FAS). Then, we apply the site-specific method to simulate 10 events from the 2010-2011 Canterbury earthquake sequence to assess its performance against the generic approach in predicting recorded ground motions. Our initial results suggest that the site-specific method yields a lower simulation standard deviation than the generic case.
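As a minimal illustration of why these Earth-structure parameters matter at high frequencies, the sketch below evaluates the standard path-attenuation term exp(-πfR/(Q(f)β)) and near-surface site-attenuation term exp(-πκ0f) that shape the FAS in stochastic-method simulations. The distance, Q(f) model, and κ0 values are illustrative placeholders, not those used in the study.

```python
# Minimal sketch of the high-frequency FAS shaping terms in the
# simplified-physics (stochastic) method.  All numbers are illustrative.
import numpy as np

f = np.linspace(0.1, 10.0, 200)   # frequency band of the HF simulation (Hz)
R = 30.0                          # path length (km), illustrative
beta = 3.5                        # shear-wave velocity (km/s), illustrative
Q = 100.0 * f ** 0.8              # placeholder anelastic attenuation model Q(f)

path_term = np.exp(-np.pi * f * R / (Q * beta))   # whole-path anelastic attenuation

for kappa0 in (0.02, 0.05):       # two candidate site-attenuation values (s)
    site_term = np.exp(-np.pi * kappa0 * f)       # near-surface attenuation
    fas_shape = path_term * site_term
    print(f"kappa0={kappa0:.2f}: FAS shaping factor at 10 Hz = {fas_shape[-1]:.3f}")
```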
Welcome to the Recover newsletter Issue 6 from the Marine Ecology Research Group (MERG) of the University of Canterbury. Recover is designed to keep you updated on our MBIE-funded earthquake recovery project called RECOVER (Reef Ecology, Coastal Values & Earthquake Recovery). This 6th instalment features the ‘new land’ created by the earthquake uplift of the coastline, recreational uses of beaches in Marlborough, and pāua survey work and hatchery projects with our partners in Kaikōura.
Many buildings with relatively low damage from the 2010-2011 Canterbury earthquakes were deemed uneconomic to repair and were replaced [1,2]. Factors that affected commercial building owners’ decisions to replace rather than repair included capital availability, uncertainty with regards to regional recovery, local market conditions and the ability to generate cash flow, and repair delays due to limited property access (cordon). This poster provides a framework for modeling decision-making in a case where repair is feasible but replacement might offer greater economic value, a situation not currently modeled in engineering risk analysis.
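A hypothetical sketch of the kind of comparison such a framework formalises is given below: repair and replacement are compared on a discounted net-value basis, with repair delay (e.g., cordon-related loss of cash flow) entering explicitly. All figures and function names are invented placeholders and do not reflect the poster's model.

```python
# Hypothetical repair-vs-replace comparison: repair is feasible, but
# replacement may offer greater economic value once delay and cost are
# considered.  All inputs below are illustrative placeholders.

def net_value(income_per_year, cost, delay_years, horizon_years, discount=0.06):
    """Present value of post-decision income minus up-front cost,
    with no income earned during the delay period."""
    pv_income = sum(
        income_per_year / (1 + discount) ** t
        for t in range(int(delay_years) + 1, int(horizon_years) + 1)
    )
    return pv_income - cost

repair  = net_value(income_per_year=200_000, cost=1_500_000, delay_years=3, horizon_years=30)
replace = net_value(income_per_year=220_000, cost=2_500_000, delay_years=2, horizon_years=30)
print("repair net value :", round(repair))
print("replace net value:", round(replace))
```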
The two most recent major earthquake sequences, the 2010-2011 Canterbury Earthquake Sequence and the 2016 Kaikoura Earthquake, necessitate a better understanding of New Zealand's seismic hazard for new building design and the detailed assessment of existing buildings. It is important to note, however, that the New Zealand seismic hazard map in NZS 1170.5:2004 is generalised in an effort to cover all of New Zealand and is limited to an earthquake database compiled prior to 2001. It is therefore common for site-specific studies to provide spectral accelerations different from those shown on the national map (Z values in NZS 1170.5:2004), and sometimes even lower. Moreover, Section 5.2 of Module 1 of the Earthquake Geotechnical Engineering Practice series provides guidelines for performing site-specific studies.
Detailed studies on the sediment budget may reveal valuable insights into the successive build-up of the Canterbury Plains and their modification by Holocene fluvial action connected to major braided rivers. Additionally, they bear implications beyond these fluvial aspects. Palaeoseismological studies claim to have detected signals of major Alpine Fault earthquakes in coastal environments along the eastern seaboard of the South Island (McFadgen and Goff, 2005). This requires high connectivity between the lower reaches of major braided rivers and their mountain catchments to generate immediate, significant sediment pulses; such connectivity would, however, contradict the hypothesis outlined above. Obtaining better control on the sediment budgets of braided rivers like the Waimakariri River will add significant value to multiple scientific and applied topics such as regional resource management. An essential first step of sediment budget studies is to systematically map the geomorphology, conventionally in the field and/or using remote-sensing applications, in order to localise, genetically identify, and classify landforms or entire toposequences of the area being investigated. In formerly glaciated mountain environments it is also indispensable to obtain all available chronological information to support subsequent investigations.
Our poster will present ongoing QuakeCoRE-funded work on strong motion seismology for the Dunedin-Mosgiel area, focusing on ground motion simulations for the Dunedin Central Business District (CBD). Source modelling and ground motion simulations are being carried out using the SCEC (Southern California Earthquake Center) Broadband Platform (BBP). The platform computes broadband (0-10 Hz) seismograms for earthquakes and was first implemented at the University of Otago in 2016. As large earthquakes have not been experienced in Dunedin during the period of instrumental recording, user-specified scenario simulations are of great value. The Akatore Fault, the most active fault in Otago and the closest major fault to Dunedin, is the source focused on in the present study. Simulations for various Akatore Fault source scenarios are run and presented. Path and site effects are key components considered in the simulation process. A 1D shear-wave velocity profile is required by the SCEC BBP, and this is being generated to represent the Akatore-to-CBD path and site within the BBP. A 3D shear-wave velocity model, with high resolution within the Dunedin CBD, is being developed in parallel with this study (see Sangster et al. poster). This model will be the basis for developing a 3D shear-wave velocity model for the greater Dunedin-Mosgiel area for future ground motion simulations, using Canterbury software (currently under development).
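For illustration only, the sketch below shows one simple way a layered 1D shear-wave velocity profile of the kind mentioned above can be represented and summarised (here via a time-averaged Vs over the top 30 m). The layer values are invented placeholders, and this is not the SCEC BBP input format.

```python
# Illustrative layered 1D shear-wave velocity profile (placeholder values,
# not the Akatore-to-CBD profile and not the BBP input format).
layers = [  # (thickness_m, vs_m_per_s)
    (20.0, 300.0),    # soft near-surface sediments
    (180.0, 800.0),   # weathered rock
    (800.0, 1800.0),  # basement
]

def time_averaged_vs(layers, depth_m):
    """Time-averaged Vs over the top depth_m metres (depth_m=30 gives Vs30)."""
    remaining, travel_time = depth_m, 0.0
    for thickness, vs in layers:
        d = min(thickness, remaining)
        travel_time += d / vs
        remaining -= d
        if remaining <= 0:
            break
    return depth_m / travel_time

print("Vs30 of this profile:", round(time_averaged_vs(layers, 30.0)), "m/s")
```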
Geospatial liquefaction models aim to predict liquefaction using data that are free and readily available. These data include (i) common ground-motion intensity measures; and (ii) geospatial parameters (e.g., among many, distance to rivers, distance to coast, and Vs30 estimated from topography) which are used to infer characteristics of the subsurface without in-situ testing. Since their recent inception, such models have been used to predict geohazard impacts throughout New Zealand (e.g., in conjunction with regional ground-motion simulations). While past studies have demonstrated that geospatial liquefaction models show great promise, the resolution and accuracy of the geospatial data underlying these models are notably poor. As an example, mapped rivers and coastlines often plot hundreds of meters from their actual locations. This stems from the fact that geospatial models aim to rapidly predict liquefaction anywhere in the world and thus utilize the lowest common denominator of available geospatial data, even though higher-quality data are often available (e.g., in New Zealand). Accordingly, this study investigates whether the performance of geospatial models can be improved using higher-quality input data. This analysis is performed using (i) 15,101 liquefaction case studies compiled from the 2010-2016 Canterbury Earthquakes; and (ii) geospatial data readily available in New Zealand. In particular, we utilize alternative, higher-quality data to estimate: locations of rivers and streams; location of the coastline; depth to ground water; Vs30; and PGV. Most notably, a region-specific Vs30 model improves performance (Figs. 3-4), while other data variants generally have little-to-no effect, even when the “standard” and “high-quality” values differ significantly (Fig. 2). This finding is consistent with the greater sensitivity of geospatial models to Vs30, relative to any other input (Fig. 5), and has implications for modeling in locales worldwide where high-quality geospatial data are available.
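To make the model structure concrete, the sketch below shows the generic logistic form that geospatial liquefaction models take, with a ground-motion intensity measure and geospatial proxies as inputs. The coefficients are placeholders rather than those of any published model; the point is only that swapping a coarse global Vs30 layer for a region-specific one changes the inputs, not the model form.

```python
# Schematic form of a geospatial liquefaction model: liquefaction
# probability as a logistic function of shaking intensity and geospatial
# proxies for subsurface conditions.  Coefficients are placeholders.
import math

def liquefaction_probability(pgv_cm_s, vs30_m_s, dist_to_water_km, gw_depth_m):
    x = (
        0.5                                  # intercept (placeholder)
        + 0.8 * math.log(pgv_cm_s)           # ground-motion intensity
        - 1.5 * math.log(vs30_m_s / 400.0)   # stiffness proxy
        - 0.05 * dist_to_water_km            # proximity to rivers/coast
        - 0.10 * gw_depth_m                  # depth to ground water
    )
    return 1.0 / (1.0 + math.exp(-x))

# Same site, two Vs30 estimates (global topographic proxy vs regional model):
print(liquefaction_probability(30.0, vs30_m_s=250.0, dist_to_water_km=0.4, gw_depth_m=1.5))
print(liquefaction_probability(30.0, vs30_m_s=400.0, dist_to_water_km=0.4, gw_depth_m=1.5))
```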