ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 1
    Publication Date: 2012-09-20
    Description: The standard measures of the intensity of a tornado in the USA and many other countries are the Fujita and Enhanced Fujita scales. These scales are based on the damage that a tornado causes. Another measure of the strength of a tornado is its path length of touchdown, L. In this study we consider severe tornadoes, which we define as L≥10 km, in the continental USA (USA Storm Prediction Center Severe Weather Database). We find that for the period 1982–2011, for individual severe tornadoes (L≥10 km): (i) There is a strong linear scaling between the number of severe tornadoes in a year and their total path length in that year. (ii) The cumulative frequency path length data suggests that, not taking into account any changing trends over time, we would expect in a given year (on average) one severe tornado with a path length L≥115 km and in a decade (on average) one severe tornado with a path length L≥215 km. (iii) The noncumulative frequency-length statistics of severe tornado touchdown path lengths, 20
    Print ISSN: 1680-7316
    Electronic ISSN: 1680-7324
    Topics: Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
  • 2
    Publication Date: 2011-12-14
    Description: We generate synthetic catalogs of seismicity in northern California using a composite simulation. The basis of the simulation is the fault-based "Virtual California" (VC) earthquake simulator. Back-slip velocities and mean recurrence intervals are specified on model strike-slip faults. A catalog of characteristic earthquakes is generated for a period of 100 000 yr. These earthquakes are predominantly in the range M = 6 to M = 8, but do not follow Gutenberg-Richter (GR) scaling at lower magnitudes. In order to model seismicity on unmapped faults we introduce background seismicity which occurs randomly in time with GR scaling and is spatially associated with the VC model faults. These earthquakes fill in the GR scaling down to M = 4 (the smallest earthquakes modeled). The rate of background seismicity is constrained by the observed rate of occurrence of M > 4 earthquakes in northern California. These earthquakes are then used to drive the BASS (branching aftershock sequence) model of aftershock occurrence. The BASS model is the self-similar limit of the ETAS (epidemic type aftershock sequence) model. Families of aftershocks are generated following each Virtual California and background main shock. In the simulations the rate of occurrence of aftershocks is essentially equal to the rate of occurrence of main shocks in the magnitude range 4 < M < 7. We generate frequency-magnitude and recurrence interval statistics both regionally and for specific faults. We compare our modeled rates of seismicity and spatial variability with observations.
    Print ISSN: 1023-5809
    Electronic ISSN: 1607-7946
    Topics: Geosciences , Physics
    Published by Copernicus on behalf of European Geosciences Union.
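    The branching construction of the BASS model described in this abstract can be sketched in a few lines. This is a minimal, magnitudes-only sketch: the parameter values (b, Δm*, m_min) are illustrative defaults, the modified-Omori times and spatial kernel are omitted, and daughter magnitudes are truncated at the parent magnitude to keep the cascade finite, a simplification rather than the paper's calibration.

    ```python
    import math
    import random

    def bass_cascade(m_parent, b=1.0, dm_star=1.2, m_min=4.0, rng=random):
        """Generate the aftershock magnitudes of one BASS-style cascade.

        Each parent of magnitude m spawns N = 10**(b*(m - dm_star - m_min))
        daughters whose magnitudes follow Gutenberg-Richter scaling above
        m_min, and every daughter spawns its own aftershocks in turn.
        Daughter magnitudes are truncated at the parent magnitude here
        (a sketch-level simplification that keeps the cascade subcritical).
        """
        aftershocks = []
        stack = [m_parent]
        while stack:
            m = stack.pop()
            n_daughters = int(10 ** (b * (m - dm_star - m_min)))
            for _ in range(n_daughters):
                # Inverse CDF of GR truncated to [m_min, m]:
                # P(>=x) proportional to 10**(-b*(x - m_min))
                u = rng.random()
                x = m_min - (1.0 / b) * math.log10(
                    1.0 - u * (1.0 - 10 ** (-b * (m - m_min))))
                aftershocks.append(x)
                stack.append(x)  # daughters generate their own families
        return aftershocks
    ```

    With these defaults an M = 7 main shock directly spawns int(10**1.8) = 63 first-generation aftershocks, and the cascade dies out after a few generations because Δm* > 0 makes the branching subcritical.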
  • 3
    Publication Date: 2001-10-31
    Description: Three aspects of complexity are fractals, chaos, and self-organized criticality. There are many examples of the applicability of fractals in solid-earth geophysics, such as earthquakes and landforms. Chaos is widely accepted as being applicable to a variety of geophysical phenomena, for instance, tectonics and mantle convection. Several simple cellular-automata models have been said to exhibit self-organized criticality. Examples include the sandpile, forest fire and slider-blocks models. It is believed that these are directly applicable to landslides, actual forest fires, and earthquakes, respectively. The slider-block model has been shown to clearly exhibit deterministic chaos and fractal behaviour. The concept of self-similar cascades can explain self-organized critical behaviour. This approach also illustrates the similarities and differences with critical phenomena through association with the site-percolation and diffusion-limited aggregation models.
    Print ISSN: 1023-5809
    Electronic ISSN: 1607-7946
    Topics: Geosciences , Physics
    Published by Copernicus on behalf of European Geosciences Union.
  • 4
    Publication Date: 2002-12-31
    Description: We have studied a hybrid model combining the forest-fire model with the site-percolation model in order to better understand the earthquake cycle. We consider a square array of sites. At each time step, a "tree" is dropped on a randomly chosen site and is planted if the site is unoccupied. When a cluster of "trees" spans the site (a percolating cluster), all the trees in the cluster are removed ("burned") in a "fire." The removal of the cluster is analogous to a characteristic earthquake and planting "trees" is analogous to increasing the regional stress. The clusters are analogous to the metastable regions of a fault over which an earthquake rupture can propagate once triggered. We find that the frequency-area statistics of the metastable regions are power-law with a negative exponent of two (as in the forest-fire model). This is analogous to the Gutenberg-Richter distribution of seismicity. This "self-organized critical behavior" can be explained in terms of an inverse cascade of clusters. Small clusters of "trees" coalesce to form larger clusters. Individual trees move from small to larger clusters until they are destroyed. This inverse cascade of clusters is self-similar and the power-law distribution of cluster sizes has been shown to have an exponent of two. We have quantified the forecasting of the spanning fires using error diagrams. The assumption that "fires" (earthquakes) are quasi-periodic has moderate predictability. The density of trees gives an improved degree of predictability, while the size of the largest cluster of trees provides a substantial improvement in forecasting a "fire."
    Print ISSN: 1023-5809
    Electronic ISSN: 1607-7946
    Topics: Geosciences , Physics
    Published by Copernicus on behalf of European Geosciences Union.
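    The hybrid model described in this abstract (drop trees, burn a percolating cluster) is easy to sketch. The following assumes "spanning" means the cluster touches every row of the grid, one common convention; the grid size and connectivity (4-neighbor) are illustrative choices, not the paper's.

    ```python
    import random

    def flood_fill(grid, i, j, n):
        """Return the 4-connected cluster of occupied sites containing (i, j)."""
        seen = {(i, j)}
        stack = [(i, j)]
        while stack:
            a, b = stack.pop()
            for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                na, nb = a + da, b + db
                if 0 <= na < n and 0 <= nb < n and grid[na][nb] and (na, nb) not in seen:
                    seen.add((na, nb))
                    stack.append((na, nb))
        return seen

    def plant_until_fire(n=20, rng=random):
        """Drop trees on random sites of an n x n grid; a tree is planted
        only if the site is empty.  When the new tree's cluster spans the
        grid top-to-bottom, the whole cluster burns and its size -- the
        analog of a characteristic earthquake -- is returned."""
        grid = [[False] * n for _ in range(n)]
        while True:
            i, j = rng.randrange(n), rng.randrange(n)
            if grid[i][j]:
                continue  # occupied site: the dropped tree is discarded
            grid[i][j] = True
            cluster = flood_fill(grid, i, j, n)
            if len({a for a, _ in cluster}) == n:  # touches every row: spans
                for a, b in cluster:
                    grid[a][b] = False  # the "fire" removes every tree in it
                return len(cluster)
    ```

    Recording the cluster-size distribution over many runs is what yields the power-law frequency-area statistics the abstract reports.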
  • 5
    Publication Date: 2010-03-31
    Description: For the purposes of this study, an interval is the elapsed time between two earthquakes in a designated region; the minimum magnitude for the earthquakes is prescribed. A record-breaking interval is one that is longer (or shorter) than preceding intervals; a starting time must be specified. We consider global earthquakes with magnitudes greater than 5.5 and show that the record-breaking intervals are well estimated by a Poissonian (random) theory. We also consider the aftershocks of the 2004 Parkfield earthquake and show that the record-breaking intervals are approximated by very different statistics. In both cases, we calculate the number of record-breaking intervals (nrb) and the record-breaking interval durations Δtrb as a function of "natural time", the number of elapsed events. We also calculate the ratio of record-breaking long intervals to record-breaking short intervals as a function of time, r(t), which is suggested to be sensitive to trends in noisy time series data. Our data indicate a possible precursory signal to large earthquakes that is consistent with accelerated moment release (AMR) theory.
    Print ISSN: 1023-5809
    Electronic ISSN: 1607-7946
    Topics: Geosciences , Physics
    Published by Copernicus on behalf of European Geosciences Union.
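    The record-breaking bookkeeping described in this abstract reduces to a single pass over the interval sequence in natural time. A minimal sketch (the function name is mine):

    ```python
    def record_breaking_counts(intervals):
        """Count record-breaking long and short intervals in 'natural time'.

        An interval is record-breaking long (short) if it exceeds (is less
        than) every preceding interval; the first interval counts as both.
        Returns (n_long, n_short).  For i.i.d. Poissonian intervals the
        expected count after n events is the harmonic number H_n ~ ln n.
        """
        n_long = n_short = 0
        longest = float("-inf")
        shortest = float("inf")
        for dt in intervals:
            if dt > longest:
                n_long += 1
                longest = dt
            if dt < shortest:
                n_short += 1
                shortest = dt
        return n_long, n_short
    ```

    The ratio r(t) of long to short records the abstract uses as a trend indicator is just n_long / n_short evaluated after each successive event.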
  • 6
    Publication Date: 2007-08-02
    Description: The sandpile, forest-fire and slider-block models are said to exhibit self-organized criticality. Associated natural phenomena include landslides, wildfires, and earthquakes. In all cases the frequency-size distributions are well approximated by power laws (fractals). Another important aspect of both the models and natural phenomena is the statistics of interval times. These statistics are particularly important for earthquakes. For earthquakes it is important to make a distinction between interoccurrence and recurrence times. Interoccurrence times are the interval times between earthquakes on all faults in a region whereas recurrence times are interval times between earthquakes on a single fault or fault segment. In many, but not all cases, interoccurrence time statistics are exponential (Poissonian) and the events occur randomly. However, the distribution of recurrence times are often Weibull to a good approximation. In this paper we study the interval statistics of slip events using a slider-block model. The behavior of this model is sensitive to the stiffness α of the system, α=kC/kL where kC is the spring constant of the connector springs and kL is the spring constant of the loader plate springs. For a soft system (small α) there are no system-wide events and interoccurrence time statistics of the larger events are Poissonian. For a stiff system (large α), system-wide events dominate the energy dissipation and the statistics of the recurrence times between these system-wide events satisfy the Weibull distribution to a good approximation. We argue that this applicability of the Weibull distribution is due to the power-law (scale invariant) behavior of the hazard function, i.e. the probability that the next event will occur at a time t0 after the last event has a power-law dependence on t0. The Weibull distribution is the only distribution that has a scale invariant hazard function. 
We further show that the onset of system-wide events is a well-defined critical point. We find that the number of system-wide events NSWE satisfies the scaling relation NSWE ∝ (α − αC)^δ, where αC is the critical value of the stiffness. The system-wide events represent a new phase for the slider-block system.
    Print ISSN: 1023-5809
    Electronic ISSN: 1607-7946
    Topics: Geosciences , Physics
    Published by Copernicus on behalf of European Geosciences Union.
  • 7
    Publication Date: 2006-10-31
    Description: It is well known that earthquakes do not occur randomly in space and time. Foreshocks, aftershocks, precursory activation, and quiescence are just some of the patterns recognized by seismologists. Using the Pattern Informatics technique along with relative intensity analysis, we create a scoring method based on time dependent relative operating characteristic diagrams and show that the occurrences of large earthquakes in California correlate with time intervals where fluctuations in small earthquakes are suppressed relative to the long term average. We estimate a probability of less than 1% that this coincidence is due to random clustering. Furthermore, we show that the methods used to obtain these results may be applicable to other parts of the world.
    Print ISSN: 1023-5809
    Electronic ISSN: 1607-7946
    Topics: Geosciences , Physics
    Published by Copernicus on behalf of European Geosciences Union.
  • 8
    Publication Date: 2005-11-09
    Description: No proven method is currently available for the reliable short time prediction of earthquakes (minutes to months). However, it is possible to make probabilistic hazard assessments for earthquake risk. In this paper we discuss a new approach to earthquake forecasting based on a pattern informatics (PI) method which quantifies temporal variations in seismicity. The output, which is based on an association of small earthquakes with future large earthquakes, is a map of areas in a seismogenic region ("hotspots") where earthquakes are forecast to occur in a future 10-year time span. This approach has been successfully applied to California, to Japan, and on a worldwide basis. Because a sharp decision threshold is used, these forecasts are binary: an earthquake is forecast either to occur or not to occur. The standard approach to the evaluation of a binary forecast is the use of the relative (or receiver) operating characteristic (ROC) diagram, which is a more restrictive test and less subject to bias than maximum likelihood tests. To test our PI method, we made two types of retrospective forecasts for California. The first is the PI method and the second is a relative intensity (RI) forecast based on the hypothesis that future large earthquakes will occur where most smaller earthquakes have occurred in the recent past. While both retrospective forecasts are for the ten year period 1 January 2000 to 31 December 2009, we performed an interim analysis 5 years into the forecast. The PI method outperforms the RI method under most circumstances.
    Print ISSN: 1023-5809
    Electronic ISSN: 1607-7946
    Topics: Geosciences , Physics
    Published by Copernicus on behalf of European Geosciences Union.
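    A single point on the ROC diagram used to evaluate such a binary forecast is computed from the four outcomes of comparing the alarm map with the observed events. A minimal sketch (cell-by-cell comparison over flattened maps; the function name is mine):

    ```python
    def roc_point(forecast, observed):
        """Hit rate and false-alarm rate of a binary alarm map.

        forecast, observed: equal-length sequences of 0/1 per grid cell.
        Hit rate H = hits / (hits + misses); false-alarm rate
        F = false alarms / (false alarms + correct negatives).
        Sweeping the decision threshold traces out the full ROC curve."""
        hits = misses = false_alarms = correct_neg = 0
        for f, o in zip(forecast, observed):
            if o and f:
                hits += 1
            elif o:
                misses += 1
            elif f:
                false_alarms += 1
            else:
                correct_neg += 1
        h = hits / (hits + misses) if hits + misses else 0.0
        f = false_alarms / (false_alarms + correct_neg) if false_alarms + correct_neg else 0.0
        return h, f
    ```

    A forecast is better than random exactly when its point lies above the diagonal H = F of the diagram.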
  • 9
    Publication Date: 2005-06-09
    Description: This paper explores log-periodicity in a forest-fire cellular-automata model. At each time step of this model a tree is dropped on a randomly chosen site; if the site is unoccupied, the tree is planted. Then, for a given sparking frequency, matches are dropped on a randomly chosen site; if the site is occupied by a tree, the tree ignites and an "instantaneous" model fire consumes that tree and all adjacent trees. The resultant frequency-area distribution for the small and medium model fires is a power-law. However, if we consider very small sparking frequencies, the large model fires that span the square grid are dominant, and we find that the peaks in the frequency-area distribution of these large fires satisfy log-periodic scaling to a good approximation. This behavior can be examined using a simple mean-field model, where in time, the density of trees on the grid exponentially approaches unity. This exponential behavior coupled with a periodic or near-periodic sparking frequency also generates a sequence of peaks in the frequency-area distribution of large fires that satisfy log-periodic scaling. We conclude that the forest-fire model might provide a relatively simple explanation for the log-periodic behavior often seen in nature.
    Print ISSN: 1023-5809
    Electronic ISSN: 1607-7946
    Topics: Geosciences , Physics
    Published by Copernicus on behalf of European Geosciences Union.
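    The mean-field argument in this abstract can be written down directly: dropping one tree per step on a grid of n sites fills a random site, so the empty fraction shrinks by the factor (1 − 1/n) each step and the density approaches unity exponentially. With sparks arriving periodically, the empty fractions at spark times form a geometric sequence, which is equal spacing in log(1 − ρ), the log-periodic signature. A minimal sketch:

    ```python
    def mean_field_density(t, n_sites):
        """Mean-field tree density after t random drops on n_sites sites:
        each drop fills a uniformly random site, so the empty fraction
        decays geometrically, rho(t) = 1 - (1 - 1/n)**t ~ 1 - exp(-t/n)."""
        return 1.0 - (1.0 - 1.0 / n_sites) ** t

    # With a spark every T steps, the empty fractions 1 - rho(k*T) form a
    # geometric sequence: successive large fires are equally spaced in
    # log(1 - rho), i.e. the peaks satisfy log-periodic scaling.
    ```

    The exact times and the grid size here are illustrative; only the geometric decay of the empty fraction matters for the log-periodicity.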
  • 10
    Publication Date: 2009-04-28
    Description: Statistical frequency-size (frequency-magnitude) properties of earthquake occurrence play an important role in seismic hazard assessments. The behavior of earthquakes is represented by two different statistics: interoccurrent behavior in a region and recurrent behavior at a given point on a fault (or at a given fault). The interoccurrent frequency-size behavior has been investigated by many authors and generally obeys the power-law Gutenberg-Richter distribution to a good approximation. It is expected that the recurrent frequency-size behavior should obey different statistics. However, this problem has received little attention because historic earthquake sequences do not contain enough events to reconstruct the necessary statistics. To overcome this lack of data, this paper investigates the recurrent frequency-size behavior for several problems. First, the sequences of creep events on a creeping section of the San Andreas fault are investigated. The applicability of the Brownian passage-time, lognormal, and Weibull distributions to the recurrent frequency-size statistics of slip events is tested and the Weibull distribution is found to be the best-fit distribution. To verify this result the behaviors of numerical slider-block and sand-pile models are investigated and the Weibull distribution is confirmed as the applicable distribution for these models as well. Exponents β of the best-fit Weibull distributions for the observed creep event sequences and for the slider-block model are found to have similar values ranging from 1.6 to 2.2 with the corresponding aperiodicities CV of the applied distribution ranging from 0.47 to 0.64. We also note similarities between recurrent time-interval statistics and recurrent frequency-size statistics.
    Print ISSN: 1023-5809
    Electronic ISSN: 1607-7946
    Topics: Geosciences , Physics
    Published by Copernicus on behalf of European Geosciences Union.
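    The aperiodicity values quoted in this abstract follow directly from the Weibull shape parameter: the coefficient of variation depends only on β, via the gamma function, and is independent of the scale parameter. A minimal sketch:

    ```python
    import math

    def weibull_cv(beta):
        """Aperiodicity (coefficient of variation) of a Weibull
        distribution with shape beta: CV = sqrt(G2 / G1**2 - 1) where
        Gk = Gamma(1 + k/beta).  The scale parameter cancels out."""
        g1 = math.gamma(1.0 + 1.0 / beta)
        g2 = math.gamma(1.0 + 2.0 / beta)
        return math.sqrt(g2 / g1 ** 2 - 1.0)
    ```

    For the β range 1.6 to 2.2 reported in the abstract, this formula gives CV from about 0.64 down to about 0.48, consistent with the quoted aperiodicities of 0.47 to 0.64; β = 1 recovers the exponential distribution with CV = 1.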