ALBERT

All Library Books, journals and Electronic Records Telegrafenberg


Filter
  • Collection: Articles (9)
  • Keywords: sedimentology (6); structure (3)
  • Publisher: Springer (9)
  • Years: 1970-1974 (9); 1971 (9)
  • Topics: Geosciences (9); Mathematics (9); Architecture, Civil Engineering, Surveying
  • 1
    Electronic Resource
    Springer
    Mathematical Geology 3 (1971), pp. 227-238
    ISSN: 1573-8868
    Keywords: classification ; data processing ; graphics ; mapping ; mathematics ; plotting ; sampling ; statistics ; sedimentology ; stratigraphy ; grain-size analysis ; textural analysis ; glacial geology ; Pleistocene stratigraphy ; till
    Source: Springer Online Journal Archives 1860-2000
    Topics: Geosciences , Mathematics
    Notes: Abstract Relative percentages of sand, silt, and clay from samples of the same till unit are not identical because of different lithologies in the source areas, sorting in transport, random variation, and experimental error. Random variation and experimental error can be isolated from the other two as follows. For each particle-size class of each till unit, a standard population is determined by using a normally distributed, representative group of data. New measurements are compared with the standard population and, if they compare satisfactorily, the experimental error is not significant and random variation is within the expected range for the population. The outcome of the comparison depends on numerical criteria derived from a graphical method rather than on a more commonly used one-way analysis of variance with two treatments. If the number of samples and the standard deviation of the standard population are substituted in a t-test equation, a family of hyperbolas is generated, each of which corresponds to a specific number of subsamples taken from each new sample. The axes of the graphs of the hyperbolas are the standard deviation of new measurements (horizontal axis) and the difference between the means of the new measurements and the standard population (vertical axis). The area between the two branches of each hyperbola corresponds to a satisfactory comparison between the new measurements and the standard population. Measurements from a new sample can be tested by plotting their standard deviation vs. difference in means on axes containing a hyperbola corresponding to the specific number of subsamples used. If the point lies between the branches of the hyperbola, the measurements are considered reliable. But if the point lies outside this region, the measurements are repeated. Because the critical segment of the hyperbola is approximately a straight line parallel to the horizontal axis, the test is simplified to a comparison between the means of the standard population and the means of the subsample. The minimum number of subsamples required to prove significant variation between samples caused by different lithologies in the source areas and sorting in transport can be determined directly from the graphical method. The minimum number of subsamples required is the maximum number to be run for economy of effort.
    Type of Medium: Electronic Resource
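The simplified acceptance test described in the abstract above (compare the subsample mean with the standard-population mean against a t-based bound) can be sketched as follows; the critical value and the data in the usage below are invented for illustration, not taken from the paper.

```python
import math

def subsample_check(new_values, pop_mean, pop_std, t_crit=2.571):
    """Accept new subsample measurements if the difference between
    their mean and the standard-population mean lies within the
    t-based bound for the given number of subsamples -- the straight
    -line approximation to the hyperbola's critical segment.
    pop_mean, pop_std, and t_crit are illustrative inputs."""
    n = len(new_values)
    mean = sum(new_values) / n
    # Half-width of the acceptance region between the hyperbola
    # branches, approximated by a line parallel to the horizontal
    # (standard-deviation) axis.
    bound = t_crit * pop_std / math.sqrt(n)
    return abs(mean - pop_mean) <= bound
```

For six subsamples whose mean matches the standard population, the point falls between the branches and the measurements are kept; a mean two units away fails the test and would be re-run.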
  • 2
    Electronic Resource
    Springer
    Mathematical Geology 3 (1971), pp. 335-355
    ISSN: 1573-8868
    Keywords: nearest neighbor analysis ; regression analysis ; statistics ; trend analysis ; structure
    Source: Springer Online Journal Archives 1860-2000
    Topics: Geosciences , Mathematics
    Notes: Abstract A quantitative analysis was made of the spatial arrangement of 149 explosion craters in the western rift of Uganda. A variety of methods demonstrate that the spatial pattern of the craters reveals significant structural patterns that have guided volcanism to the surface. It is shown that the east-west elements in the field affected location, and the main rift fault is resolved into two main components. Tentatively, a possible dextral transform fault is identified that affected the relative location of the two main zones of activity. Grouping techniques demonstrate that crater groups obey an exponential rank-size rule and allow a mapping of the craters into energy classes that reveals a concentric pattern of energy in the field. The effect of the topography on energy levels and crater size shows that only topography greater than 11,000 ft could have prevented all eruptive activity, but the smaller energies and craters are sensitive to height differences on the order of the height of the rift wall, about 1000 ft. Total energy in each crater class size is roughly constant, and the field energy could create one or two single craters comparable in size to small central volcanoes.
    Type of Medium: Electronic Resource
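An exponential rank-size rule like the one the abstract above invokes for crater groups can be checked with a log-linear fit: if size decays exponentially with rank, log(size) against rank is a straight line. A minimal sketch (not the authors' code; the synthetic data below are invented):

```python
import math

def rank_size_fit(sizes):
    """Least-squares fit of log(size) = a - b*rank for sizes taken in
    descending order, as a check of an exponential rank-size rule.
    Returns the intercept a and decay rate b."""
    sizes = sorted(sizes, reverse=True)
    ranks = list(range(1, len(sizes) + 1))
    logs = [math.log(s) for s in sizes]
    n = len(sizes)
    mr = sum(ranks) / n
    ml = sum(logs) / n
    slope = sum((r - mr) * (l - ml) for r, l in zip(ranks, logs)) / \
            sum((r - mr) ** 2 for r in ranks)
    return ml - slope * mr, -slope
```

Data generated exactly as size = exp(a - b*rank) recover a and b; real crater data would scatter about the fitted line.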
  • 3
    Electronic Resource
    Springer
    Mathematical Geology 3 (1971), pp. 281-295
    ISSN: 1573-8868
    Keywords: analysis of variance ; autocorrelation ; simulation ; trend analysis ; sedimentology ; stratigraphy
    Source: Springer Online Journal Archives 1860-2000
    Topics: Geosciences , Mathematics
    Notes: Abstract It is proposed that the variance in mapped geologic data should be formally considered to be composed of three components which arise on different geographic scales. The three components (regional, local, and residual) should be defined solely in terms of the parameters of the sample data set. A two-step analysis is required to separate the three components. Applying autocorrelation criteria, trend-surface analysis has been used, in the first step, to remove the residual component and, in the second step, to separate regional and local components from the resulting noise-free data. This procedure has made it possible to quantify local components in stratigraphic thickness data from the East Midlands coalfield (central England) which can be identified in terms of the known geology.
    Type of Medium: Electronic Resource
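The core operation in the abstract above, fitting a trend surface and splitting mapped variance into a broad-scale part and a residual, can be sketched with an ordinary polynomial least-squares fit. This is only the generic mechanism; the paper's autocorrelation criteria and two-step separation of regional from local components are not reproduced.

```python
import numpy as np

def trend_surface_split(x, y, z, order=1):
    """Fit a polynomial trend surface z ~ sum c_ij * x^i * y^j with
    i + j <= order, and split the total variance of z into a trend
    part and a residual part (they sum to the total because OLS
    residuals are orthogonal to the fitted surface)."""
    terms = [(i, j) for i in range(order + 1)
                    for j in range(order + 1 - i)]
    A = np.column_stack([x**i * y**j for i, j in terms])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    fitted = A @ coef
    total = np.var(z)
    resid = np.var(z - fitted)
    return total - resid, resid  # (trend variance, residual variance)
```

For data lying exactly on a plane, the residual component vanishes and the trend component equals the total variance.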
  • 4
    Electronic Resource
    Springer
    Mathematical Geology 3 (1971), pp. 15-41
    ISSN: 1573-8868
    Keywords: modal analysis ; sampling ; statistics ; mineralogy ; petrology ; sedimentology
    Source: Springer Online Journal Archives 1860-2000
    Topics: Geosciences , Mathematics
    Notes: Abstract The binomial model, commonly used to estimate counting error in point-count analysis, misestimates this error when the observation points on a grid are positively or negatively correlated. A model, called the “cell model,” is proposed as an alternative to the binomial model for use in studies, especially with coarse-grained rocks, in which such correlation is known or thought to exist. In the new model the thin section is conceptually partitioned into a number of cells (six is recommended), and the assumption is made that the proportions in the individual cells are statistically independent and that their variance does not differ from cell to cell. Empirical relations obtained from a suite of 200 thin sections of limestones are in reasonable support of the prediction that large particle size adversely affects counting error estimates based on the binomial model.
    Type of Medium: Electronic Resource
  • 5
    Electronic Resource
    Springer
    Mathematical Geology 3 (1971), pp. 239-263
    ISSN: 1573-8868
    Keywords: simulation ; hydrology ; petroleum ; sedimentology ; high fluid pressures ; compaction ; primary migration
    Source: Springer Online Journal Archives 1860-2000
    Topics: Geosciences , Mathematics
    Notes: Abstract A mathematical model of sedimentation and compaction of fine-grained rocks such as shale has been constructed. Water is considered to flow upward or downward out of a compacting rock according to Darcy's law until the pore-water pressure within the rock is normal for the depth in question. The porosity decreases during compaction until a minimum porosity, determined by the difference between total vertical stress (overburden pressure) and pore-water pressure, is obtained. The model takes into account the dependence of permeability on porosity for a given rock type, and the dependence of water viscosity on salinity, temperature, and pressure. The derived equations have been computer programmed to obtain the time dependence of porosity, pressure, water velocity, permeability, and other factors within a compacting shale during (a) shale sedimentation, (b) a time lapse following shale deposition, (c) the deposition of normally pressured sediments over the shale, and (d) a second time lapse following deposition of the normally pressured unit. Solutions to these problems are given for the situation when the unit underlying the shale is normally pressured, and for the situation when the underlying unit is impermeable. The calculations show that a portion of a thick shale adjacent to a normally pressured unit may have a considerably reduced porosity and permeability, and act as a seal for the remainder of the shale. High fluid pressures may persist for many millions of years in thick shales with low permeability. The computations can be extended to cover more complicated cases of interbedded shales, sands, and other lithologies.
    Type of Medium: Electronic Resource
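The dissipation of excess pore pressure by Darcy flow, the engine of the compaction model in the abstract above, can be sketched as a one-dimensional explicit finite-difference relaxation. This omits the paper's coupled porosity, permeability, and viscosity dependencies and its sedimentation stages; it only shows the pressure-diffusion step with normally pressured (zero-excess) boundaries.

```python
def relax_excess_pressure(p, diffusivity, dz, dt, steps):
    """Explicit finite-difference relaxation of excess pore pressure
    in a 1-D column with fixed p = 0 boundaries (normally pressured
    units above and below). Stable when diffusivity*dt/dz**2 <= 0.5."""
    r = diffusivity * dt / dz ** 2
    p = list(p)
    for _ in range(steps):
        new = p[:]
        for i in range(1, len(p) - 1):
            new[i] = p[i] + r * (p[i + 1] - 2 * p[i] + p[i - 1])
        p = new
    return p
```

With an impermeable base one boundary would instead mirror its neighbor (zero-flux), which is how high pressures persist far longer in the paper's second scenario.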
  • 6
    Electronic Resource
    Springer
    Mathematical Geology 3 (1971), pp. 43-50
    ISSN: 1573-8868
    Keywords: dynamic programming ; sampling ; oceanography ; sedimentology
    Source: Springer Online Journal Archives 1860-2000
    Topics: Geosciences , Mathematics
    Notes: Abstract Increasing attention in recent years has been devoted to the application of statistical techniques in the analysis and interpretation of geologic and oceanographic data. Equally important, but less well explored, are methods for efficient experimental design. The theory of linear programming provides plans for optimal sampling of geologic and oceanographic phenomena. Of particular significance are solutions to problems of multivariate sampling. Often, a single field sample may be analyzed for a number of oxides, or a number of minerals, or a number of textural parameters. In general, these variables differ in the degree to which they are diagnostic of changes in the phenomenon of interest, and thus they must be known with different levels of precision if they are to be useful. Similarly, the variables differ in the ease with which they may be measured. If a sampling plan is to be most efficient, it must provide the requisite levels of precision for the minimum expenditure of time and effort. Sampling for a single variable may be optimized directly. Sampling for several variables simultaneously usually introduces special difficulties, but if the objective function can be generalized to hold for all variables, solutions can be determined even in this situation.
    Type of Medium: Electronic Resource
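The kind of sampling plan the abstract above describes, meeting required precision for several variables at minimum effort, can be illustrated with a toy exhaustive search over integer sample counts for two procedures. This is a stand-in for the programming formulation the abstract alludes to; all costs, precision coefficients, and requirements below are invented.

```python
def optimal_plan(costs, precision, required, max_n=50):
    """Exhaustive search over integer sample counts (n1, n2) for two
    procedures: minimize total effort costs[0]*n1 + costs[1]*n2
    subject to precision[v][0]*n1 + precision[v][1]*n2 >= required[v]
    for every variable v. Returns (min_cost, n1, n2)."""
    best = None
    for n1 in range(max_n + 1):
        for n2 in range(max_n + 1):
            if all(precision[v][0] * n1 + precision[v][1] * n2
                   >= required[v] for v in range(len(required))):
                cost = costs[0] * n1 + costs[1] * n2
                if best is None or cost < best[0]:
                    best = (cost, n1, n2)
    return best
```

With two variables needing 10 and 8 precision units, the cheapest plan mixes the two procedures rather than relying on either alone.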
  • 7
    Electronic Resource
    Springer
    Mathematical Geology 3 (1971), pp. 123-133
    ISSN: 1573-8868
    Keywords: cluster analysis ; distance functions ; mineralogy ; petrology ; structure
    Source: Springer Online Journal Archives 1860-2000
    Topics: Geosciences , Mathematics
    Notes: Abstract The weight-percent values of four mineralogic variables (quartz, K feldspar, color index, and muscovite) for 10 sets of granitic rocks (20–50 samples in each set) from magmatic units of the Singhbhum granite were used for (1) computation of the Mahalanobis' generalized distance functions (D²) between all pairs of the 10 sets, (2) testing significance of the difference between the multivariate means, and (3) computation of the linear discriminant functions between all possible pairs of the sets. The 10 data sets are for six magmatic units which belong to three successive but closely related phases of emplacement. The multivariate means for all sets are significantly different except for those between two of the sets of phase I. Cluster analysis on the basis of the D² values enables the 10 sets to be placed into four distinct groups. Group A includes two subgroups, one of which consists of two sets representing typical members of phase I; the other subgroup includes two sets which are typical of phase II. Group B includes two sets which are typical of phase III. The other four sets do not group with the typical representatives of the three phases, probably because of certain special conditions of their emplacement. A separate series of D² computations from the same data, but excluding the color index, was unsuccessful in making the four aberrant sets group with the typical members of the respective phases. Efficient LDFs could be determined for discrimination between most pairs of the 10 sets of granite rocks.
    Type of Medium: Electronic Resource
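The Mahalanobis generalized distance used in the abstract above measures group separation in units of pooled within-group covariance. A generic sketch of the statistic (not the authors' code; the example groups below are invented):

```python
import numpy as np

def mahalanobis_d2(X1, X2):
    """Mahalanobis generalized distance D^2 between two sample groups
    (rows = samples, columns = variables), using the pooled
    within-group covariance matrix."""
    X1, X2 = np.asarray(X1, float), np.asarray(X2, float)
    d = X1.mean(axis=0) - X2.mean(axis=0)
    n1, n2 = len(X1), len(X2)
    S = ((n1 - 1) * np.cov(X1, rowvar=False) +
         (n2 - 1) * np.cov(X2, rowvar=False)) / (n1 + n2 - 2)
    return float(d @ np.linalg.solve(S, d))
```

Pairwise D² values between the 10 sets would then feed a clustering routine, as in the paper's grouping step.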
  • 8
    Electronic Resource
    Springer
    Mathematical Geology 3 (1971), pp. 171-181
    ISSN: 1573-8868
    Keywords: Fourier analysis ; graphics ; mapping ; spatial filtering ; trend analysis ; stratigraphy ; structure
    Source: Springer Online Journal Archives 1860-2000
    Topics: Geosciences , Mathematics
    Notes: Abstract Z-trend maps are a simplified lineprinter version of spatially filtered maps designed to give a quick visual appraisal of trends. The printout shows a “yes-no” configuration by a printed character or a blank so that the map has a conspicuous pattern. This pattern reflects the presence, position, and trend of the desired features. If a reasonable symbol density ratio is used the results can be visually pleasing thus enhancing trend recognition. Z-trending can be adapted to any map with stationary properties but is most easily applied to data that have been filtered with a bandpass operator.
    Type of Medium: Electronic Resource
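The "yes-no" lineprinter rendering the abstract above describes reduces to thresholding a (pre-filtered) grid into printed characters and blanks. A minimal sketch; the symbol, threshold, and grid are the user's choice, and the bandpass filtering step is assumed to have happened upstream.

```python
def z_trend_map(grid, threshold, char="*"):
    """Render a filtered grid as a lineprinter-style Z-trend map:
    a character where the value exceeds the threshold, a blank
    otherwise, so trends stand out as conspicuous patterns."""
    return "\n".join(
        "".join(char if v > threshold else " " for v in row)
        for row in grid)
```

Tuning the threshold controls the symbol density ratio the abstract mentions as the key to visually pleasing, recognizable trends.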
  • 9
    Electronic Resource
    Springer
    Mathematical Geology 3 (1971), pp. 265-279
    ISSN: 1573-8868
    Keywords: simulation ; mathematical models ; sedimentology ; stratigraphy
    Source: Springer Online Journal Archives 1860-2000
    Topics: Geosciences , Mathematics
    Notes: Abstract To study sedimentary phenomena, we introduce random-genetic models in which genetic hypotheses and structural random elements play the principal part. Starting from geologic hypotheses we choose principal factors which may be random functions or random variables. These factors are: depth, nature of the facies, sedimentation rate, and subsidence. Equations of evolution link the factors. Depth is a Markov process, but generally the resultant sequence does not make a Markov chain or Markov process. Three examples of such models are given with the results of simulations.
    Type of Medium: Electronic Resource
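The Markov-process treatment of depth in the abstract above (each new depth depending only on the current one) can be sketched as a bounded random walk. This is a minimal stand-in for the random-genetic models: drift and noise parameters are invented, and the coupled facies, sedimentation-rate, and subsidence factors are omitted.

```python
import random

def simulate_depth(steps, start=10.0, drift=0.0, noise=1.0, seed=0):
    """Simulate depth as a Markov process: a Gaussian random walk
    with optional drift, clipped at zero (sea level). Returns the
    full depth path including the starting value."""
    rng = random.Random(seed)
    depth = start
    path = [depth]
    for _ in range(steps):
        depth = max(0.0, depth + drift + rng.gauss(0.0, noise))
        path.append(depth)
    return path
```

Deriving facies from the simulated depth would then yield a sequence that, as the abstract notes, is generally no longer Markovian itself.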