ALBERT

All Library Books, Journals and Electronic Records Telegrafenberg

Publication years: 2010-2014 (7 results)
  • 1
    Publication Date: 2014-02-28
    Description: Submarine landslides can be far larger than those on land, and are one of the most important processes for moving sediment across our planet. Landslides that are fast enough to disintegrate can generate potentially very hazardous tsunamis and produce long run-out turbidity currents that break strategically important cable networks. It is important to understand their frequency and triggers. We document the distribution of recurrence intervals for turbidity currents triggered by large landslides (>0.1 km³) in three basin plains. A common distribution of recurrence intervals is observed, despite variable ages and disparate locations, suggesting similar underlying controls on slide triggers and frequency. This common distribution closely approximates a temporally random Poisson distribution, such that the probability of a large disintegrating slide occurring along the basin margin is independent of the time since the last slide. This distribution suggests that non-random processes such as sea-level change are not a dominant control on the frequency of these slides. Recurrence intervals of major (>M 7.3) earthquakes have an approximately Poissonian distribution, suggesting they could be implicated as triggers. However, not all major earthquakes appear to generate widespread turbidites, and other as yet unknown triggers or sequential combinations of processes could produce the same distribution. This is the first study to show that large slide-triggered turbidites have a common frequency distribution in distal basin plains, and that this distribution is temporally random. This result has important implications for assessing hazards from landslide-tsunamis and seafloor cable breaks, and for the long-term tempo of global sediment fluxes.
    Print ISSN: 0091-7613
    Electronic ISSN: 1943-2682
    Topics: Geosciences
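    The abstract's central claim is statistical: recurrence intervals from a temporally random Poisson process are exponentially distributed and memoryless. A minimal sketch of how such a claim can be checked follows; the turbidite ages are hypothetical placeholders, not data from the paper, and the Kolmogorov-Smirnov test is one standard way (assumed here, not necessarily the authors') to assess consistency with a Poisson model.
    ```python
    # Minimal sketch: are recurrence intervals consistent with a temporally
    # random (Poisson) process, i.e. exponentially distributed?
    import numpy as np
    from scipy import stats

    # Hypothetical turbidite emplacement ages in kyr BP (youngest first).
    ages_kyr = np.array([2.1, 7.4, 9.8, 16.0, 21.5, 24.9, 33.2, 36.8])

    intervals = np.diff(ages_kyr)      # recurrence intervals between events
    rate = 1.0 / intervals.mean()      # maximum-likelihood event rate (per kyr)

    # KS test against an exponential distribution with the fitted scale; a
    # large p-value means the intervals are consistent with a memoryless
    # Poisson recurrence model, as the study reports for its basin plains.
    ks_stat, p_value = stats.kstest(intervals, "expon", args=(0, intervals.mean()))
    print(f"rate = {rate:.3f} events/kyr, KS p-value = {p_value:.2f}")
    ```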
  • 2
    Publication Date: 2013-10-21
    Description: We measure the potential of an observational data set to constrain a set of inputs to a complex and computationally expensive computer model. We use each member in turn of an ensemble of output from a computationally expensive model, corresponding to an observable part of a modelled system, as a proxy for an observational data set. We argue that, given some assumptions, our ability to constrain uncertain parameter inputs to a model using its own output as data provides a maximum bound for our ability to constrain the model inputs using observations of the real system. The ensemble provides a set of known parameter input and model output pairs, which we use to build a computationally efficient statistical proxy for the full computer model, termed an emulator. We use the emulator to find and rule out "implausible" values for the inputs of held-out ensemble members, given the computer model output. As we know the true values of the inputs for the ensemble, we can compare our constraint of the model inputs with the true value of the input for any ensemble member. Measures of the quality of constraint have the potential to inform strategy for data collection campaigns, before any real-world data is collected, as well as acting as an effective sensitivity analysis. We use an ensemble of the ice sheet model Glimmer to demonstrate our measures of quality of constraint. The ensemble has 250 model runs with 5 uncertain input parameters, and an output variable representing the pattern of the thickness of ice over Greenland. We have an observation of historical ice sheet thickness that directly matches the output variable, and offers an opportunity to constrain the model. We show that different ways of summarising our output variable (ice volume, ice surface area and maximum ice thickness) offer different potential constraints on individual input parameters. We show that combining the observational data gives increased power to constrain the model. We investigate the impact of uncertainty in observations or in model biases on our measures, showing that even a modest uncertainty can seriously degrade the potential of the observational data to constrain the model.
    Print ISSN: 1991-959X
    Electronic ISSN: 1991-9603
    Topics: Geosciences
    Published by Copernicus on behalf of the European Geosciences Union.
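    The procedure this abstract describes is, in essence, history matching: emulate the expensive model, then rule out parameter values whose implausibility is too high. The sketch below is a toy illustration under assumptions of its own, not the paper's code: a one-parameter function stands in for Glimmer, scikit-learn's Gaussian process serves as the emulator, and the conventional cutoff of implausibility < 3 is assumed.
    ```python
    # Toy history-matching sketch: build an emulator from ensemble runs, then
    # rule out "implausible" inputs given one member's output as pseudo-data.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)

    def expensive_model(x):
        """Toy stand-in for a costly simulator output (e.g. total ice volume)."""
        return np.sin(3 * x) + 0.5 * x

    # Ensemble of known (input, output) pairs from the "expensive" model.
    X_train = rng.uniform(0, 2, size=(40, 1))
    y_train = expensive_model(X_train).ravel()

    # The emulator: a cheap statistical proxy with predictive uncertainty.
    emulator = GaussianProcessRegressor(kernel=RBF(length_scale=0.3)).fit(X_train, y_train)

    # Hold out one run: treat its output as the observation z, then compute
    # the implausibility I(x) = |z - E[f(x)]| / sqrt(Var_emulator + Var_obs).
    x_true, obs_var = 1.3, 0.01
    z = expensive_model(x_true)

    X_cand = np.linspace(0, 2, 200).reshape(-1, 1)
    mean, sd = emulator.predict(X_cand, return_std=True)
    implausibility = np.abs(z - mean) / np.sqrt(sd**2 + obs_var)

    # Inputs not ruled out yet (NROY) under the usual I(x) < 3 cutoff; since
    # the true input is known, the constraint's quality is directly measurable.
    nroy = X_cand[implausibility < 3.0].ravel()
    print(f"NROY interval ~ [{nroy.min():.2f}, {nroy.max():.2f}]; truth = {x_true}")
    ```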
  • 3
    Publication Date: 2012-04-20
    Description: A wide variety of different plankton system models have been coupled with ocean circulation models, with the aim of understanding and predicting aspects of environmental change. However, an ability to make reliable inferences about real-world processes from the model behaviour demands a quantitative understanding of model error that remains elusive. Assessment of coupled model output is inhibited by relatively limited observing system coverage of biogeochemical components. Any direct assessment of the plankton model is further inhibited by uncertainty in the physical state. Furthermore, comparative evaluation of plankton models on the basis of their design is inhibited by the sensitivity of their dynamics to many adjustable parameters. Parameter uncertainty has been widely addressed by calibrating models at data-rich ocean sites. However, relatively little attention has been given to quantifying uncertainty in the physical fields required by the plankton models at these sites, and tendencies in the biogeochemical properties due to the effects of horizontal processes are often neglected. Here we use model twin experiments, in which synthetic data are assimilated to estimate a system's known "true" parameters, to investigate the impact of error in a plankton model's environmental input data. The experiments are supported by a new software tool, the Marine Model Optimization Testbed, designed for rigorous analysis of plankton models in a multi-site 1-D framework. Simulated errors are derived from statistical characterizations of the mixed layer depth, the horizontal flux divergence tendencies of the biogeochemical tracers and the initial state. Plausible patterns of uncertainty in these data are shown to produce strong temporal and spatial variability in the expected simulation error variance over an annual cycle, indicating variation in the significance attributable to individual model-data differences. An inverse scheme using ensemble-based estimates of the simulation error variance to allow for this environment error performs well compared with weighting schemes used in previous calibration studies, giving improved estimates of the known parameters. The efficacy of the new scheme in real-world applications will depend on the quality of statistical characterizations of the input data. Practical approaches towards developing reliable characterizations are discussed.
    Print ISSN: 1991-959X
    Electronic ISSN: 1991-9603
    Topics: Geosciences
    Published by Copernicus on behalf of the European Geosciences Union.
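    The inverse scheme sketched in this abstract weights each model-data misfit by an ensemble-based estimate of the simulation error variance, so that data points where environment error makes the simulation itself unreliable count for less. A minimal illustration follows; the ensemble is synthetic, the cost function is generic weighted least squares, and none of the names correspond to MarMOT's actual interface.
    ```python
    # Sketch: per-point misfit weights from an ensemble of simulations run
    # under perturbed environmental inputs (mixed layer depth, lateral flux
    # divergences, initial state). All data here are synthetic.
    import numpy as np

    rng = np.random.default_rng(1)
    n_members, n_obs = 50, 120

    # Each row is one perturbed-environment member's output at the data points.
    seasonal = np.sin(np.linspace(0, 2 * np.pi, n_obs))
    ensemble = seasonal * rng.normal(loc=1.0, scale=0.2, size=(n_members, n_obs))

    # Expected simulation error variance at each point, estimated from the
    # ensemble spread; it varies strongly over the annual cycle.
    sim_err_var = ensemble.var(axis=0, ddof=1)
    obs_err_var = 0.01                     # assumed observation error variance

    def weighted_cost(model_out, observations):
        """Misfit cost with weights 1 / (simulation + observation variance)."""
        misfit = model_out - observations
        return np.sum(misfit**2 / (sim_err_var + obs_err_var))

    # A calibration would minimise weighted_cost over the plankton model's
    # parameters; misfits at high-variance points are discounted accordingly.
    ```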
  • 4
    Publication Date: 2013-04-09
    Description: We measure the potential of an observational data set to constrain a set of inputs to a complex and computationally expensive computer model. We use each member in turn of an ensemble of output from a computationally expensive model, corresponding to some observable part of a modelled system, as a proxy for an observational data set. We argue that our ability to constrain uncertain parameter inputs to a model using its own output as data provides a maximum bound for our ability to constrain the model inputs using observations of the real system. The ensemble provides a set of known parameter input and model output pairs, which we use to build a computationally efficient statistical proxy for the full computer model, termed an emulator. We use the emulator to find and rule out "implausible" values for the inputs of held-out ensemble members, given the computer model output. As we know the true values of the inputs for the ensemble, we can compare our constraint of the model inputs with the true value of the input for any ensemble member. Measures of the quality of constraint have the potential to inform strategy for data collection campaigns, before any real-world data is collected, as well as acting as an effective sensitivity analysis. We use an ensemble of the ice sheet model Glimmer to demonstrate our measures of quality of constraint. The ensemble has 250 model runs with 5 uncertain input parameters, and an output variable representing the pattern of the thickness of ice over Greenland. We have an observation of historical ice sheet thickness that directly matches the output variable, and offers an opportunity to constrain the model. We show that different ways of summarising our output variable (ice volume, ice surface area and maximum ice thickness) offer different potential constraints on individual input parameters. We show that combining the observational data gives increased power to constrain the model. We investigate the impact of uncertainty in observations or in model biases on our measures, showing that even a modest uncertainty can seriously degrade the potential of the observational data to constrain the model.
    Print ISSN: 1991-9611
    Electronic ISSN: 1991-962X
    Topics: Geosciences
    Published by Copernicus on behalf of the European Geosciences Union.
  • 5
    Publication Date: 2011-08-19
    Description: A wide variety of different marine plankton system models have been coupled with ocean circulation models, with the aim of understanding and predicting aspects of environmental change. However, an ability to make reliable inferences about real-world processes from the model behaviour demands a quantitative understanding of model error that remains elusive. Assessment of coupled model output is inhibited by relatively limited observing system coverage of biogeochemical components. Any direct assessment of the plankton model is further inhibited by uncertainty in the physical state. Furthermore, comparative evaluation of plankton models on the basis of their design is inhibited by the sensitivity of their dynamics to many adjustable parameters. The Marine Model Optimization Testbed (MarMOT) is a new software tool designed for rigorous analysis of plankton models in a multi-site 1-D framework, in particular to address uncertainty issues in model assessment. A flexible user interface ensures its suitability to more general inter-comparison, sensitivity and uncertainty analyses, including model comparison at the level of individual processes, and to state estimation for specific locations. The principal features of MarMOT are described and its application to model calibration is demonstrated by way of a set of twin experiments, in which synthetic observations are assimilated in an attempt to recover the true parameter values of a known system. The experimental aim is to investigate the effect of different misfit weighting schemes on parameter recovery in the presence of error in the plankton model's environmental input data. Simulated errors are derived from statistical characterizations of the mixed layer depth, the horizontal flux divergences of the biogeochemical tracers and the initial state. Plausible patterns of uncertainty in these data are shown to produce strong temporal and spatial variability in the expected simulation error over an annual cycle, indicating differences in the significance attributable to model-data misfits at different data points. An inverse scheme using ensemble-based estimates of the simulation error variance to allow for this environment error performs well compared with weighting schemes used in previous plankton model calibration studies. The efficacy of the new scheme in real-world applications will depend on the quality of statistical characterizations of the input data. Practical approaches towards developing reliable characterizations are discussed.
    Print ISSN: 1991-9611
    Electronic ISSN: 1991-962X
    Topics: Geosciences
    Published by Copernicus on behalf of the European Geosciences Union.
  • 6
    Publication Date: 2014-09-25
    Description: Biogeochemical ocean circulation models used to investigate the role of plankton ecosystems in global change rely on adjustable parameters to compensate for missing biological complexity. In principle, optimal parameter values can be estimated by fitting models to observational data, including satellite ocean colour products such as chlorophyll that achieve good spatial and temporal coverage of the surface ocean. However, comprehensive parametric analyses require large ensemble experiments that are computationally infeasible with global 3-D simulations. Site-based simulations provide an efficient alternative but can only be used to make reliable inferences about global model performance if robust quantitative descriptions of their relationships with the corresponding 3-D simulations can be established. The feasibility of establishing such a relationship is investigated for an intermediate-complexity biogeochemistry model (MEDUSA) coupled with a widely used global ocean model (NEMO). A site-based mechanistic emulator is constructed for surface chlorophyll output from this target model as a function of model parameters. The emulator comprises an array of 1-D simulators and a statistical quantification of the uncertainty in their predictions. The unknown parameter-dependent biogeochemical environment, in terms of initial tracer concentrations and lateral flux information required by the simulators, is a significant source of uncertainty. It is approximated by a mean environment derived from a small ensemble of 3-D simulations representing variability of the target model behaviour over the parameter space of interest. The performance of two alternative uncertainty quantification schemes is examined: a direct method based on comparisons between simulator output and a sample of known target model "truths" and an indirect method that is only partially reliant on knowledge of target model output. In general, chlorophyll records at a representative array of oceanic sites are well reproduced. The use of lateral flux information reduces the 1-D simulator error considerably, consistent with a major influence of advection at some sites. Emulator robustness is assessed by comparing actual error distributions with those predicted. With the direct uncertainty quantification scheme, the emulator is reasonably robust over all sites. The indirect uncertainty quantification scheme is less reliable at some sites but scope for improving its performance is identified. The results demonstrate the strong potential of the emulation approach to improve the effectiveness of site-based methods. This represents important progress towards establishing a robust site-based capability that will allow comprehensive parametric analyses to be achieved for improving global models and quantifying uncertainty in their predictions.
    Print ISSN: 1991-9611
    Electronic ISSN: 1991-962X
    Topics: Geosciences
    Published by Copernicus on behalf of the European Geosciences Union.
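    The "direct" uncertainty quantification scheme mentioned above estimates the 1-D simulator's error at a site by comparing its output against a sample of target-model runs whose outputs are known. A minimal sketch of that idea follows; the chlorophyll values are synthetic placeholders, not MEDUSA/NEMO output, and the per-month variance estimate is an assumed simplification of the paper's scheme.
    ```python
    # Direct-scheme sketch: predictive error spread of a 1-D simulator from
    # its discrepancy against known 3-D target-model "truths" at one site.
    import numpy as np

    rng = np.random.default_rng(2)
    n_truths, n_months = 20, 12

    # Known target-model outputs (monthly surface chlorophyll, synthetic) and
    # the 1-D simulator's predictions for the same parameter choices.
    target_chl = rng.lognormal(mean=-1.0, sigma=0.3, size=(n_truths, n_months))
    simulated_chl = target_chl + rng.normal(scale=0.05, size=(n_truths, n_months))

    # The emulator couples the cheap simulator with this empirical spread of
    # the simulator-minus-target discrepancy, month by month; robustness is
    # then assessed by comparing actual errors with the predicted spread.
    discrepancy = simulated_chl - target_chl
    pred_err_sd = discrepancy.std(axis=0, ddof=1)
    print(pred_err_sd.round(3))
    ```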
  • 7
    Publication Date: 2010-08-01
    Print ISSN: 0278-4343
    Electronic ISSN: 1873-6955
    Topics: Geosciences
    Published by Elsevier