2. Early warning and limits of predictability

2.1 Earthquake early warning:

Early warning procedures for earthquakes, aimed at mitigating risk, require the development and use of real-time seismology methodologies. Our research will focus on computing reliable real-time single- and multi-type risk scenarios that can be provided to end users to support their decision-making processes. Key research topics will be the development of real-time source inversion and rapid impact simulation tools that exploit data arriving in real time from regional and global networks (e.g. GEOFON), the development of concepts for ad-hoc low-cost instrumentation for earthquake early warning, and the development of innovative methods for assessing the vulnerability of exposed assets.
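A common building block of such real-time pipelines is a waveform trigger that flags the onset of strong shaking. As a minimal illustration only (not the methodology envisaged here), the sketch below implements a classic short-term/long-term average (STA/LTA) detector; the window lengths and threshold are hypothetical example values.

```python
def sta_lta(samples, n_sta, n_lta):
    """Return the STA/LTA ratio at each index where both windows fit."""
    ratios = []
    for i in range(n_lta, len(samples) + 1):
        # Short- and long-term averages of absolute amplitude, ending at sample i-1.
        sta = sum(abs(s) for s in samples[i - n_sta:i]) / n_sta
        lta = sum(abs(s) for s in samples[i - n_lta:i]) / n_lta
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

def first_trigger(samples, n_sta=5, n_lta=50, threshold=3.0):
    """Index of the first sample whose STA/LTA ratio exceeds the threshold.

    Window lengths and threshold are illustrative, not tuned values.
    """
    for k, ratio in enumerate(sta_lta(samples, n_sta, n_lta)):
        if ratio > threshold:
            return n_lta + k - 1  # map ratio index back to sample index
    return None
```

In an operational system this decision would of course feed rapid source characterization rather than stand alone; the sketch only shows the triggering concept.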

2.2 Tool development for monitoring and advanced methods for data integration:

In terms of tool development for geohazard monitoring, we will focus on monitoring volcanic and seismo-tectonic unrest and on improving volcanic hazard assessment by integrating different tools, including the (near-)real-time analysis of camera images and of geodetic and seismic data from small-scale arrays. Methods based on the analysis of satellite remote sensing data for monitoring geohazards, in particular subaerial avalanches and flank collapse, will be further developed. Also of relevance, especially as part of the work on improving ad-hoc early warning systems, will be the development of mass-movement monitoring systems (e.g. for landslides and mines) based on seismic noise analysis in combination with geodetic tools.
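Noise-based monitoring typically measures small travel-time shifts between repeated cross-correlations of ambient seismic noise. The toy sketch below shows only the core lag-estimation step; the brute-force `best_lag` search is a hypothetical stand-in for illustration, not part of the actual toolchain.

```python
def best_lag(a, b, max_lag):
    """Lag of trace b relative to trace a that maximizes the cross-correlation.

    A shift in this lag between repeated measurements would indicate a
    travel-time change in the monitored medium.
    """
    best_c, best_l = float("-inf"), 0
    for lag in range(-max_lag, max_lag + 1):
        c = sum(a[i] * b[i + lag]
                for i in range(len(a)) if 0 <= i + lag < len(b))
        if c > best_c:
            best_c, best_l = c, lag
    return best_l
```

Real implementations work with long, filtered correlation functions and sub-sample interpolation; the sketch only conveys the principle.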

2.3 Testing and evaluation:

Rigorous, prospective testing of scientific hypotheses and models is an emerging field in seismology. Over the past decade, we have established community-accepted testing procedures and protocols for earthquake forecast models. However, most aspects of seismic hazard assessment remain untested and, at present, untestable. Seismic risk models have never been subjected to rigorous testing and remain predominantly expert-opinion models. We will extend the testing-center system developed by the Collaboratory for the Study of Earthquake Predictability (CSEP) at the Southern California Earthquake Center. Our scientific targets are the development of testable hypotheses to investigate frequently applied physical models, e.g. characteristic earthquakes, Coulomb stress change, and estimates of maximum magnitudes per fault zone. Data quality assessment, in particular the characterization of earthquake catalogs, will be a prerequisite for any subsequent test. We aim to create a testing environment for (automatic) earthquake source inversions. A major effort will be a testing center for intensity and ground-motion prediction equations, to evaluate their performance and track their improvement, and the extension to (aggregated) hazard and risk testing.
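As a concrete example of the kind of consistency test applied to earthquake forecasts, the sketch below implements a Poisson number test (N-test): the observed event count in a testing region is compared against the count distribution implied by the forecast rate. This is a minimal illustration of the concept; the significance level `alpha` is an assumed example value.

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam)."""
    return sum(math.exp(-lam) * lam**i / math.factorial(i)
               for i in range(k + 1))

def n_test(n_obs, n_forecast, alpha=0.05):
    """Two-sided Poisson number test of a forecast rate.

    Returns the lower- and upper-tail probabilities and whether the observed
    count is consistent with the forecast at the given significance level.
    """
    p_low = poisson_cdf(n_obs, n_forecast)                    # P(X <= n_obs)
    p_high = (1.0 - poisson_cdf(n_obs - 1, n_forecast)
              if n_obs > 0 else 1.0)                          # P(X >= n_obs)
    consistent = p_low > alpha / 2 and p_high > alpha / 2
    return p_low, p_high, consistent
```

For example, observing 10 events under a forecast of 10 passes the test, while observing none under a forecast of 20 rejects the forecast.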

2.4 Rapid event monitoring and assessment:

A promising approach to natural hazard research treats extreme events as 'experiments' that can prove or disprove our concepts and prognostic models. Natural hazard events are assessed through targeted rapid analysis and field monitoring campaigns during and shortly after an event. We will enhance our capacity to deploy measurement campaigns quickly and will prepare procedures for rapid data processing and analysis. Our own data will be merged with generally available data from other platforms, such as global databases and satellites, and with information extracted from the affected public via social networks and crowdsourcing. We will establish operational procedures that integrate different sections of GFZ (HART network) and CEDIM (Center for Disaster Management and Risk Reduction Technologies) to assess extreme events and their consequences systematically and rapidly.
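One routine step in such rapid assessments is merging heterogeneous, timestamped reports (field teams, satellite products, crowdsourced observations) into a single chronological event timeline. A deliberately minimal sketch, in which the record structure and source names are hypothetical:

```python
def merge_observations(*sources):
    """Merge timestamped report streams into one chronological timeline.

    Each source is a list of dicts carrying at least a numeric "time" key;
    the field names here are illustrative, not a fixed schema.
    """
    merged = [record for source in sources for record in source]
    return sorted(merged, key=lambda record: record["time"])
```

In practice such a merge would also handle deduplication, quality flags, and differing time conventions; the sketch only shows the basic integration step.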