ALBERT

All Library Books, journals and Electronic Records Telegrafenberg


Filter
  • Seismological Society of America (SSA)
  • 2010–2014 (247)
  • 1
    Publication Date: 2014-12-05
    Description: In stable continental regions (SCRs), the process of probabilistic seismic-hazard assessment (PSHA) remains a scientific and technical challenge. In producing a new national hazard model for Australia, we developed several innovative techniques to address these challenges. The Australian seismic catalog is heterogeneous due to the variability between magnitude types and the sparse networks. To reduce the resulting high epistemic uncertainty in the recurrence parameters, a and b, the magnitudes of pre-1990 earthquakes have been empirically corrected to account for changes in magnitude formulas around 1990. In addition, existing methods for estimating recurrence parameters (e.g., maximum likelihood estimation) were found to be unstable. To overcome this problem, a new method was developed that removes outlier earthquakes before applying a regression. The incorporation of a model of episodic seismicity into the new hazard model required deviation from the more conventional method of PSHA. The selection of the maximum earthquake magnitude, Mmax, is based on the analysis of surface ruptures from paleoearthquakes, with Mmax thought to vary between geological domains (e.g., 7.2–7.6 in nonextended SCR and 7.4–7.8 in extended SCR). The sensitivity of PSHA to Mmax, source zone boundary location, recurrence parameters, and ground-motion prediction equations (GMPEs) was examined in this study. The hazard was found to be generally insensitive to Mmax in the estimated preferred magnitude range. The uncertainty in recurrence parameters was found to contribute a variation in hazard comparable to the epistemic uncertainty associated with the different GMPEs used in this study. For sites near source zone boundaries, a similar variation in hazard was produced by reasonable changes in the position of the boundaries. Aleatory variability and epistemic uncertainty in GMPEs are routinely incorporated in PSHAs, as is variation in Mmax. However, the uncertainties in recurrence parameters and source zone boundaries are generally given less attention.
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences , Physics
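A minimal sketch of the outlier-screened recurrence regression this abstract describes, assuming a least-squares fit to the cumulative frequency-magnitude distribution with an iterative sigma-based trimming rule (the rule and its threshold are illustrative assumptions, not the authors' published algorithm):

```python
import numpy as np

def gr_recurrence(mags, mmin=3.0, dm=0.1, outlier_sigma=2.0):
    """Gutenberg-Richter a- and b-values by regression on the cumulative
    frequency-magnitude distribution, iteratively discarding bins whose
    residuals exceed `outlier_sigma` standard deviations (an illustrative
    stand-in for the outlier-removal step described in the abstract)."""
    bins = np.arange(mmin, mags.max(), dm)
    n_cum = np.array([(mags >= m).sum() for m in bins], dtype=float)
    m, logn = bins[n_cum > 0], np.log10(n_cum[n_cum > 0])

    for _ in range(3):                              # a few trimming passes
        slope, intercept = np.polyfit(m, logn, 1)   # log10 N = a - b*M
        resid = logn - (intercept + slope * m)
        keep = np.abs(resid) <= outlier_sigma * resid.std()
        if keep.all():
            break
        m, logn = m[keep], logn[keep]
    return intercept, -slope                        # (a, b)
```

A call like `a, b = gr_recurrence(catalog_magnitudes)` would then feed the recurrence terms of a hazard model.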
  • 2
    Publication Date: 2014-12-05
    Description: The Regional Earthquake Likelihood Models experiment in California tested the performance of earthquake likelihood models over a five-year period. First-order analysis showed a smoothed-seismicity model by Helmstetter et al. (2007) to be the best model. We construct optimal multiplicative hybrids involving the best individual model as a baseline and one or more conjugate models. Conjugate models are transformed using an order-preserving function. Two parameters for each conjugate model and an overall normalizing constant are fitted to optimize the hybrid model. Many two-model hybrids have an appreciable information gain (log probability gain) per earthquake relative to the best individual model. For the whole of California, the Bird and Liu (2007) NeoKinema and Holliday et al. (2007) pattern informatics (PI) models both give gains close to 0.25. For southern California, the Shen et al. (2007) geodetic model gives a gain of more than 0.5, and several others give gains of about 0.2. The best three-model hybrid for the whole region has the NeoKinema and PI models as conjugates. The best three-model hybrid for southern California has the Shen et al. (2007) and PI models as conjugates. The information gains of the best multiplicative hybrids are greater than those of additive hybrids constructed from the same set of models. The gains tend to be larger when the contributing models involve markedly different concepts or data. These results need to be confirmed by further prospective tests. Multiplicative hybrids will be useful for assimilating other earthquake-related observations into forecasting models and for combining forecasting models at all timescales.
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences , Physics
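The multiplicative-hybrid construction summarized above (baseline rate times transformed conjugate rates, rescaled by a normalizing constant) can be sketched as follows. The power-law transform `a * r**b` is an assumed stand-in for the paper's order-preserving transform, and the cell-rate arrays are hypothetical inputs:

```python
import numpy as np

def hybrid_rates(baseline, conjugates, params, total):
    """Multiplicative hybrid of cell-wise forecast rates.

    baseline   : 1-D array of rates from the best individual model
    conjugates : list of rate arrays for the conjugate models
    params     : list of (a_j, b_j) pairs, the two fitted parameters of
                 an assumed order-preserving power-law transform a*r**b
    total      : expected total number of target earthquakes
    """
    rates = baseline.copy()
    for conj, (a, b) in zip(conjugates, params):
        rates *= a * np.maximum(conj, 1e-12) ** b   # order-preserving for a, b > 0
    return rates * (total / rates.sum())            # overall normalizing constant

def info_gain_per_eq(hybrid, baseline, eq_cells):
    """Mean log probability gain per earthquake over the baseline, valid
    when both forecasts are normalized to the same total rate."""
    return float(np.mean(np.log(hybrid[eq_cells] / baseline[eq_cells])))
```

Fitting the (a_j, b_j) pairs and the normalizing constant against past earthquakes would then yield the optimized hybrid the abstract describes.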
  • 3
    Publication Date: 2012-04-01
    Description: We determine frequency-dependent attenuation 1/Q(f) for the Hispaniola region using direct S and Lg waves over five distinct passbands from 0.5 to 16 Hz. Data consist of 832 high-quality vertical- and horizontal-component waveforms recorded on short-period and broadband seismometers from the devastating 12 January 2010 M 7.0 Haiti earthquake and the rich sequence of aftershocks. For the distance range 250–700 km, we estimate an average frequency-dependent Q(f) = 224(±27)f^0.64(±0.073) using horizontal components of motion and note that Q(f) estimated with Lg at regional distances is very consistent across vertical and horizontal components. We also determine Q(f) = 142(±21)f^0.71(±0.11) for direct S waves at local distances (≤100 km). The strong attenuation observed on both vertical and horizontal components of motion is consistent with expectations for a tectonically active region. Online Material: Figures of filtered and broadband data, Lg- and S-wave amplitudes, and apparent frequency-dependent Q, and tables of earthquake and station parameters.
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences , Physics
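Since the fitted form Q(f) = Q0·f^η is linear in log-log coordinates, band-wise Q estimates reduce to (Q0, η) with a one-line regression. The sketch below uses synthetic band values generated from the quoted horizontal-component result, simply to show the fit recovering Q0 = 224 and η = 0.64 (the band center frequencies are illustrative):

```python
import numpy as np

# Q estimates at passband center frequencies (hypothetical values for
# illustration; the paper uses five bands between 0.5 and 16 Hz)
f = np.array([0.75, 1.5, 3.0, 6.0, 12.0])      # Hz
q = 224.0 * f ** 0.64                           # synthetic data on the quoted model

# fit log10 Q = log10 Q0 + eta * log10 f
eta, log_q0 = np.polyfit(np.log10(f), np.log10(q), 1)
print(f"Q(f) = {10**log_q0:.0f} * f^{eta:.2f}")  # -> Q(f) = 224 * f^0.64
```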
  • 4
    Publication Date: 2011-09-01
    Description: INTRODUCTION Large earthquakes strike infrequently, and close-in recordings are uncommon. This situation makes it difficult to predict the ground motion very close to earthquake-generating faults, if the prediction is to be based on readily available observations. A solution might be to cover the Earth with seismic instruments so that one could rely on the data from previous events to predict future shaking. However, even in the case of complete seismic data coverage for hundreds of years, there would still be one type of earthquake that would be difficult to predict: those very rare earthquakes that produce very large ground motion. These extreme-ground-motion events are so unlikely that most engineers would not even consider designing facilities to withstand the possibility of their occurrence. An exception would be a structure that needs to remain functional for an unusually long period of time. One example of a planned long-life structure has been the high-level nuclear waste repository at Yucca Mountain, Nevada. This structure has been envisioned as one that would perform reliably over tens of thousands of years (CRWMS M&O, 1998). The problem of predicting the maximum possible ground motion in the Yucca Mountain region has been studied using two approaches: a geological approach that examines evidence from the past, and a seismological approach that predicts possibilities for the future via computer simulations. Both strategies are described in detail in Hanks et al. (forthcoming). The seismological approach involved computer simulations that invoked a "physical limits" perspective. Calculations were performed to numerically simulate the largest possible earthquake-generated ground motions that could occur, while remaining faithful to the current state of knowledge about rock physics and wave propagation. These "physical limit" simulations were specifically applied to scenario earthquakes on the faults on and near Yucca Mountain (Andrews et al., 2007). In...
    Print ISSN: 0895-0695
    Electronic ISSN: 1938-2057
    Topics: Geosciences
  • 5
    Publication Date: 2013-08-01
    Description: We examine the conditions necessary to trigger tremor along the San Jacinto fault (SJF) near Anza, California, where previous studies suggest triggered tremor occurs, but observations are sparse. We investigate the stress required to trigger tremor using continuous broadband seismograms from 11 stations located near Anza, California. We examine 44 Mw ≥ 7.4 teleseismic events between 2001 and 2011; these events occur at a wide range of back azimuths and hypocentral distances. In addition, we included one smaller-magnitude, regional event, the 2009 Mw 6.5 Gulf of California earthquake, because it induced extremely high strains at Anza. We find the only episode of triggered tremor occurred during the 3 November 2002 Mw 7.8 Denali earthquake. The tremor episode lasted 300 s, was composed of 12 tremor bursts, and was located along the SJF at the northwestern edge of the Anza gap at approximately 13 km depth. The tremor episode started at the Love-wave arrival, when surface-wave particle motions are primarily in the transverse direction. We find that the Denali earthquake induced the second highest stress (~35 kPa) among the 44 teleseismic events and 1 regional event. The dominant period of the Denali surface wave was 22.8 s, at the lower end of the range observed for all events (20–40 s), similar to periods shown to trigger tremor in other locations. The surface waves from the 2009 Mw 6.5 Gulf of California earthquake had the highest observed strain, yet a much shorter dominant period of 10 s, and did not trigger tremor. This result suggests that not only the amplitude of the induced strain but also the period of the incoming surface wave may control the triggering of tremor near Anza. In addition, we find that the transient shear stress (17–35 kPa) required to trigger tremor along the SJF at Anza is distinctly higher than what has been reported for the well-studied San Andreas fault.
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences , Physics
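Transient shear stresses of the order quoted above (17–35 kPa) can be ballparked from peak particle velocity with the standard plane-wave relation σ ≈ μ·v/c. The rigidity, phase velocity, and particle velocity below are illustrative assumptions, not values from the paper:

```python
# Plane-wave estimate of the dynamic shear stress carried by a surface
# wave: sigma ~ mu * v / c (rigidity times particle velocity over phase
# velocity). All values below are illustrative.
mu = 30e9        # crustal rigidity, Pa
c = 4000.0       # Love-wave phase velocity, m/s
v = 0.005        # peak transverse particle velocity, m/s (5 mm/s)

sigma = mu * v / c
print(f"transient shear stress ~ {sigma/1e3:.0f} kPa")   # ~ 38 kPa
```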
  • 6
    Publication Date: 2013-03-22
    Description: We present a testable stochastic earthquake source model for intermediate- to long-term forecasts. The model is based on fundamental observations: the frequency-magnitude distribution, slip rates on major faults, long-term strain rates, and source parameter values of instrumentally recorded and historic earthquakes. The basic building blocks of the model are two pairs of probability density maps. The first pair consists of smoothed seismicity and weighted focal mechanisms based on observed earthquakes. The second pair corresponds to mapped faults and their slip rates and consists of smoothed moment-rate and weighted focal mechanisms based on fault geometry. We construct from the model a "stochastic event set," that is to say, a large set of simulated earthquakes that are relevant for seismic hazard calculations and earthquake forecast development. Their complete descriptions are determined in the following order: magnitude, epicenter, moment tensor, length, displacement, and down-dip width. Our approach assures by construction that the simulated magnitudes are consistent with the observed frequency-magnitude distribution. We employ a magnitude-dependent weighting procedure that tends to place the largest simulated earthquakes near major faults with consistent focal mechanisms. Nevertheless, our stochastic model allows for surprises, such as large off-fault earthquakes, consistent with the observation that several recent destructive earthquakes occurred on previously unknown fault structures. We apply our model to California to illustrate its features.
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences , Physics
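A minimal sketch of the event-set sampling described above: magnitudes drawn by inverse-transform sampling of a truncated Gutenberg-Richter law, epicenter cells from a magnitude-dependent mixture of a smoothed-seismicity map and a fault-based map. The linear magnitude ramp and the two density arrays are assumptions, not the paper's fitted weighting procedure:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_event_set(n, seis_density, fault_density,
                     b=1.0, mmin=5.0, mmax=8.0):
    """Draw n simulated earthquakes: magnitudes by inverse-transform
    sampling of a truncated Gutenberg-Richter law; epicenter cells from a
    magnitude-dependent mixture of two flattened probability maps
    (an assumed linear ramp standing in for the paper's weighting)."""
    u = rng.random(n)
    norm = 1.0 - 10.0 ** (-b * (mmax - mmin))       # truncation constant
    mags = mmin - np.log10(1.0 - u * norm) / b      # inverse CDF

    w_fault = (mags - mmin) / (mmax - mmin)         # weight fault map at high M
    cells = np.empty(n, dtype=int)
    for i, w in enumerate(w_fault):
        pdf = (1.0 - w) * seis_density + w * fault_density
        cells[i] = rng.choice(len(pdf), p=pdf / pdf.sum())
    return mags, cells
```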
  • 7
    Publication Date: 2013-03-22
    Description: The Regional Earthquake Likelihood Models (RELM) working group designed a 5-year experiment to forecast the number, spatial distribution, and magnitude distribution of subsequent target earthquakes, defined to be those with magnitude ≥ 4.95 (M 4.95+) in a well-defined California testing region. Included in the experiment specification were the description of the data source, the methods for data processing, and the proposed evaluation metrics. The RELM experiment began on 1 January 2006 and involved 17 time-invariant forecasts constructed by seismicity modelers; by the end of the experiment on 1 January 2011, 31 target earthquakes had occurred. We analyze the experiment outcome by applying the proposed consistency tests based on likelihood measures and additional comparison tests based on a measure of information gain. We find that the smoothed seismicity forecast by Helmstetter et al. (2007), based on M 2+ earthquakes since 1981, is the best forecast, regardless of whether aftershocks are included in the analysis. The RELM experiment has helped to clarify ideas about testing that can be applied to more wide-ranging earthquake forecasting experiments conducted by the Collaboratory for the Study of Earthquake Predictability (CSEP). Online Material: Figures and tables showing the RELM testing region and collection region definitions, numerical results associated with the RELM experiment, and the uncorrected forecast by Ebel et al. (2007).
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences , Physics
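The RELM consistency tests are built on gridded forecasts with independent Poisson cells; a minimal version of the joint log-likelihood, the number test (N-test), and a per-earthquake information gain is sketched below. This is a simplified reading of the test definitions, not the CSEP reference implementation:

```python
import numpy as np
from scipy.special import gammaln
from scipy.stats import poisson

def joint_log_likelihood(forecast, observed):
    """Log-likelihood of observed counts under independent Poisson cells:
    sum over cells of -lambda + n*log(lambda) - log(n!)."""
    lam = np.maximum(forecast, 1e-12)
    return np.sum(-lam + observed * np.log(lam) - gammaln(observed + 1))

def n_test(forecast, n_observed):
    """Number test: tail probabilities of the observed target-earthquake
    count given the forecast's total expected rate."""
    n_fore = forecast.sum()
    return (1.0 - poisson.cdf(n_observed - 1, n_fore),  # P(N >= n_obs)
            poisson.cdf(n_observed, n_fore))            # P(N <= n_obs)

def information_gain(ll_a, ll_b, n_obs):
    """Information gain per earthquake of model A over model B."""
    return (ll_a - ll_b) / n_obs
```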
  • 8
    Publication Date: 2013-06-08
    Description: Using 3D dynamic models, we investigate the effect of fault stepovers on near-source ground motion. We use the finite-element method to model the rupture, slip, and ground motion of two parallel strike-slip faults with an unlinked overlapping stepover of variable width. We model this system as both an extensional and a compressional stepover and compare the results to those of single planar faults. We find that, overall, the presence of a stepover along the fault trace reduces the maximum ground motion when compared to the long planar fault. Whether the compressional or extensional stepover exhibits higher ground motion overall depends on the width of the separation between the faults. There is a region of reduced ground motion at the end of the first fault segment when the faults are embedded in a homogeneous material. We also experiment with stress fields leading to supershear and subshear rupture velocities, and with different stress drops within those conditions. We find that subshear rupture produces stronger motions than supershear rupture, but supershear ruptures produce that maximum over a larger area than subshear ruptures do, even though the overall area that experiences any shaking at all is not drastically different between the two cases. Lastly, we experiment with placing realistic materials along and around the faults, such as a sedimentary basin in an extensional stepover, a damage zone around the fault, and a soft rock layer on top of bedrock through the entire model area. These configurations alter the pattern of ground motion from the homogeneous case; the peaks in ground motion for the bimaterial cases depend on the materials in question. The results may have implications for ground-motion prediction in future earthquakes on geometrically complex faults. Online Material: MPEG-4 movies of models of dynamic rupture of fault stepovers embedded in heterogeneous material settings.
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences , Physics
  • 9
    Publication Date: 2013-03-03
    Description: A trench was excavated across the southeastern Reelfoot rift margin for paleoseismic research purposes and for the 2011 Seismological Society of America national meeting field trip. The trench was parallel to and 6 m southwest of the Oldham trench described in 2006. In this 2011 trench, faulted alluvial fan stratigraphy and liquefaction deposits less than 4000 yr old were exposed. The trench revealed three tectonic deformation events. The first event (graben formation) and the second event (sand blow, minor faulting, and injection of sand dikes) both post-date a paleosol circa 4000 yr B.P. and pre-date a surficial colluvial soil deposit circa 2000 yr B.P. The third event (minor shallow liquefaction and surface deformation) post-dates a 2000-year-old surface colluvial soil. These age constraints allow this third event to be attributed to the 1811–1812 New Madrid earthquakes or a remotely triggered earthquake related to the 1811–1812 earthquake sequence. Although we cannot rule out that the deformation revealed in this trench was caused by earthquakes on other faults in the New Madrid seismic zone, our favored interpretation is that the deformation was caused by rupture on an underlying fault at the base of the Mississippi River bluff that was previously imaged in a shallow reflection profile near this site.
    Print ISSN: 0895-0695
    Electronic ISSN: 1938-2057
    Topics: Geosciences
  • 10
    Publication Date: 2011-04-01
    Description: We describe a prototype detection framework that automatically clusters events in real time from a rapidly unfolding aftershock sequence. We use the fact that many aftershocks are repetitive, producing similar waveforms. By clustering events based on correlation measures of waveform similarity, the number of independent event instances that must be examined in detail by analysts may be reduced. Our system processes array data and acquires waveform templates with a short-term average (STA)/long-term average (LTA) detector operating on a beam directed at the P phases of the aftershock sequence. The templates are used to create correlation-type (subspace) detectors that sweep the subsequent data stream for occurrences of the same waveform pattern. Events are clustered by association with a particular detector. Hundreds of subspace detectors can run in this framework a hundred times faster than real time. Nonetheless, to check the growth in the number of detectors, the framework pauses periodically and reclusters detections to reduce the number of event groups. These groups define new subspace detectors that replace the older generation of detectors. Because low-magnitude occurrences of a particular signal template may be missed by the STA/LTA detector, we advocate restarting the framework from the beginning of the sequence periodically to reprocess the entire data stream with the existing detectors. We tested the framework on 10 days of data from the Nevada Seismic Array (NVAR) covering the 2003 San Simeon earthquake. One hundred eighty-four automatically generated detectors produced 676 detections, resulting in a potential reduction in analyst workload of up to 73%.
    Print ISSN: 0037-1106
    Electronic ISSN: 1943-3573
    Topics: Geosciences , Physics
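A windowed version of the STA/LTA trigger that seeds the framework's template acquisition is sketched below; window lengths and the threshold are illustrative choices, and the system described above runs the detector on an array beam rather than a single trace. Detections windowed from the stream then become the waveform templates behind the correlation (subspace) detectors:

```python
import numpy as np

def sta_lta_triggers(x, fs, sta_win=1.0, lta_win=30.0, threshold=4.0):
    """Windowed STA/LTA detector on a single trace: returns the sample
    index of each rising crossing of the short-term/long-term average
    power ratio. Window lengths and threshold are illustrative."""
    nsta, nlta = int(sta_win * fs), int(lta_win * fs)
    power = np.asarray(x, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(power)))
    i = np.arange(nlta, len(power) + 1)             # evaluation points
    sta = (csum[i] - csum[i - nsta]) / nsta         # trailing short window
    lta = (csum[i] - csum[i - nlta]) / nlta         # trailing long window
    ratio = sta / np.maximum(lta, 1e-20)
    above = ratio > threshold
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return i[onsets] - 1                            # trigger onset samples
```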