Pipeline Processing with an Iterative, Context-based Detection Model
Technical report, 19 Mar 2013 - 18 Mar 2014
NORSAR KJELLER (NORWAY)
Under existing detection pipelines, seismic event hypotheses are formed from a parametric description of the waveform data obtained from a single pass over the incoming data stream. The full potential of signal processing algorithms is not being exploited because of simplistic assumptions about the background against which signals are detected. The vast improvement in available computational resources makes possible more sensitive and more robust context-based detection pipelines that glean progressively more information from multiple passes over the data. In the first year of this two-year contract we designed and implemented several extensions to an existing prototype detection framework to demonstrate the feasibility of improving performance through systematic reprocessing of the raw data. The new components are: (1) signal cancellation, which strips the incoming data stream of repeating and irrelevant signals before the primary detectors run; (2) adaptive beamforming and matched field processing, which suppress background signals and aftershock sequences; and (3) testing of event hypotheses by evaluating detection probabilities for both detecting and non-detecting stations, followed by optimized beamforming.
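The signal-cancellation step can be illustrated with a minimal sketch. This is an assumed implementation, not the report's actual code: a known repeating waveform (a template) is located in the trace by normalized cross-correlation, a least-squares amplitude is fitted, and the scaled template is subtracted so downstream detectors see a cleaner background.

```python
def cross_correlate(trace, template):
    """Sliding dot product of template against trace (valid positions only)."""
    n, m = len(trace), len(template)
    return [sum(trace[i + j] * template[j] for j in range(m))
            for i in range(n - m + 1)]

def cancel_repeating_signal(trace, template, threshold):
    """Subtract best-fit scaled copies of `template` wherever the normalized
    correlation exceeds `threshold`; returns the stripped trace.

    Simplified sketch: correlations are computed once against the original
    trace; a production version would recompute them after each subtraction.
    """
    out = list(trace)
    m = len(template)
    t_energy = sum(s * s for s in template)
    for i, c in enumerate(cross_correlate(trace, template)):
        seg_energy = sum(out[i + j] ** 2 for j in range(m))
        norm = (t_energy * seg_energy) ** 0.5
        if norm > 0 and c / norm > threshold:
            scale = c / t_energy  # least-squares amplitude fit
            for j in range(m):
                out[i + j] -= scale * template[j]
    return out
```

Applied to a trace containing one copy of the template, the cancellation leaves a near-zero residual at the detection position; in the pipeline, the primary detectors would then run on the stripped trace.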
- Seismic Detection and Detectors