Research papers, University of Canterbury Library

Semi-empirical models based on in-situ geotechnical tests have become the standard of practice for predicting soil liquefaction. Since the inception of the "simplified" cyclic-stress model in 1971, variants based on various in-situ tests have been developed, including the Cone Penetration Test (CPT). More recently, prediction models based solely on remotely-sensed data have been developed. Similar to systems that provide automated content on earthquake impacts, these "geospatial" models aim to predict liquefaction for rapid response and loss estimation using readily available data. These data include (i) common ground-motion intensity measures (e.g., PGA), which can either be provided in near-real-time following an earthquake or predicted for a future event; and (ii) geospatial parameters derived from digital elevation models, which are used to infer characteristics of the subsurface relevant to liquefaction. However, the predictive capabilities of geospatial and geotechnical models have not been directly compared; such a comparison could elucidate techniques for improving the geospatial models and would provide a baseline for measuring improvements. Accordingly, this study assesses the relative efficacy of liquefaction models based on geospatial vs. CPT data using 9,908 case studies from the 2010-2016 Canterbury earthquakes. While the top-performing models are CPT-based, the geospatial models perform relatively well given their simplicity and low cost. Although further research is needed (e.g., to improve upon the performance of current models), the findings of this study suggest that geospatial models have the potential to provide valuable first-order predictions of liquefaction occurrence and consequence. Towards this end, performance assessments of geospatial vs. geotechnical models are ongoing for more than 20 additional global earthquakes.
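Geospatial liquefaction models of the kind described above are typically logistic regressions on ground-motion intensity and terrain-derived predictors. The sketch below illustrates the general form only; the predictor set is representative of the literature, but the function name and coefficient values are placeholders I have chosen for illustration, not fitted parameters from any published model or from this study.

```python
import math

def liquefaction_probability(pga_g, vs30_ms, cti, dist_water_km,
                             coeffs=(4.0, 2.0, -2.5, 0.1, -0.05)):
    """Illustrative logistic model of liquefaction probability.

    Predictors (typical of geospatial models, per the abstract's
    description of intensity measures plus DEM-derived parameters):
      pga_g         -- peak ground acceleration (g)
      vs30_ms       -- shear-wave velocity in the top 30 m (m/s)
      cti           -- compound topographic (wetness) index
      dist_water_km -- distance to nearest water body (km)

    The coefficients are placeholders, not calibrated values.
    """
    b0, b_pga, b_vs30, b_cti, b_dw = coeffs
    x = (b0
         + b_pga * math.log(pga_g)      # stronger shaking -> higher probability
         + b_vs30 * math.log(vs30_ms)   # stiffer ground -> lower probability
         + b_cti * cti                  # wetter terrain -> higher probability
         + b_dw * dist_water_km)        # farther from water -> lower probability
    return 1.0 / (1.0 + math.exp(-x))   # logistic link
```

With the placeholder coefficients, the probability increases with PGA and decreases with Vs30, matching the qualitative behavior such models are designed to capture.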

Research papers, University of Canterbury Library

Asset management in power systems is exercised to improve network reliability, providing confidence and security for customers and asset owners. While well-established reliability metrics are used to measure and manage business-as-usual disruptions, an increasing appreciation of the consequences of low-probability, high-impact events means that resilience is increasingly being factored into asset management in order to provide robustness and redundancy to components and wider networks. This is particularly important for electricity systems, given that a range of other infrastructure lifelines depend upon their operation. The 2010-2011 Canterbury Earthquake Sequence provides valuable insights into electricity system criticality and resilience in the face of severe earthquake impacts. While above-ground assets are relatively easy to monitor and repair, underground assets such as cables emplaced across wide areas in the distribution network are difficult to monitor, identify faults on, and repair. This study has characterised in detail the impacts to buried electricity cables in Christchurch resulting from seismically-induced ground deformation caused primarily by liquefaction and lateral spread. Primary modes of failure include cable bending, stretching, insulation damage, joint breaking, and pull-out from other equipment such as substation connections. Performance and repair data have been compiled into a detailed geospatial database, which, in combination with spatial models of peak ground acceleration, peak ground velocity and ground deformation, will be used to establish rigorous relationships between seismicity and performance. These metrics will be used to inform asset owners of network performance in future earthquakes, further assess component criticality, and provide resilience metrics.
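The "well-established reliability metrics" referenced above are typically the customer-based indices standardised in IEEE 1366, such as SAIDI and SAIFI. A minimal sketch of how they are computed (the function names and the data layout are my own, chosen for illustration):

```python
def saidi(interruptions, total_customers):
    """System Average Interruption Duration Index:
    total customer-minutes of interruption / total customers served.

    interruptions: list of (customers_affected, duration_minutes) tuples,
    one per sustained interruption event in the reporting period.
    """
    return sum(n * minutes for n, minutes in interruptions) / total_customers

def saifi(interruptions, total_customers):
    """System Average Interruption Frequency Index:
    total customer interruptions / total customers served."""
    return sum(n for n, _ in interruptions) / total_customers
```

For example, two outages affecting 100 customers for 60 minutes and 50 customers for 30 minutes, on a network of 1,000 customers, yield a SAIDI of 7.5 customer-minutes and a SAIFI of 0.15 interruptions per customer. Indices like these capture business-as-usual performance well, which is precisely why the abstract argues they need supplementing with resilience metrics for rare, high-impact events.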