ALBERT

All Library Books, journals and Electronic Records Telegrafenberg


Export
  • 1
    Electronic Resource
    Springer
    Journal of Automated Reasoning 18 (1997), pp. 237-246
    ISSN: 1573-0670
    Keywords: automated theorem proving ; competition ; SETHEO ; E-SETHEO ; first-order logic ; model elimination ; equality
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Notes: Abstract: The model elimination theorem prover SETHEO (version V3.3) and its equational extension E-SETHEO are presented. SETHEO employs sophisticated mechanisms of subgoal selection, elaborate iterative deepening techniques, and local failure caching methods. Its equational counterpart E-SETHEO transforms formulae containing equality (using a variant of Brand's modification method) and processes the output with the standard SETHEO system. This article gives an overview of the theoretical background, the system architecture, and the performance of both systems.
    Type of Medium: Electronic Resource
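    The following is a rough, hand-written sketch of the model elimination idea with iterative deepening over a proof depth bound, restricted to propositional clauses; SETHEO itself works on first-order clauses with unification and adds subgoal selection and failure caching, none of which is reproduced here, and all names below are illustrative.

        # Propositional model elimination (connection tableau) with iterative
        # deepening over the proof depth bound. Illustrative sketch only; not
        # SETHEO, which handles first-order logic via unification.
        def negate(lit):
            return lit[1:] if lit.startswith("~") else "~" + lit

        def solve(goal, path, clauses, depth):
            if depth == 0:
                return False
            if negate(goal) in path:                      # reduction step
                return True
            for clause in clauses:                        # extension step
                if negate(goal) in clause:
                    rest = [l for l in clause if l != negate(goal)]
                    if all(solve(s, path | {goal}, clauses, depth - 1) for s in rest):
                        return True
            return False

        def prove(clauses, max_depth=10):
            for depth in range(1, max_depth + 1):         # iterative deepening
                for start in clauses:
                    if all(solve(l, frozenset(), clauses, depth) for l in start):
                        return True
            return False

        # {p or q, ~p or q, p or ~q, ~p or ~q} is unsatisfiable, hence provable:
        print(prove([["p", "q"], ["~p", "q"], ["p", "~q"], ["~p", "~q"]]))  # True
        print(prove([["p"], ["p", "q"]]))                                   # False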
  • 2
    Publication Date: 2013-08-29
    Description: The Unified Modeling Language (UML) is gaining wide popularity for the design of object-oriented systems. UML combines various object-oriented graphical design notations under one common framework. A major factor for the broad acceptance of UML is that it can be conveniently used in a highly iterative, Use Case (or scenario-based) process (although the process is not a part of UML). Here, the (pre-) requirements for the software are specified rather informally as Use Cases and a set of scenarios. A scenario can be seen as an individual trace of a software artifact. Besides first sketches of a class diagram to illustrate the static system breakdown, scenarios are a favorite way of communicating with the customer, because they describe concrete interactions between entities and are thus easy to understand. Scenarios with a high level of detail are often expressed as sequence diagrams. Later in the design and implementation stage (elaboration and implementation phases), a design of the system's behavior is often developed as a set of statecharts. From there (and the full-fledged class diagram), actual code development is started. Current commercial UML tools support this phase by providing code generators for class diagrams and statecharts. In practice, it can be observed that the transition from requirements to design to code is a highly iterative process. In this talk, a set of algorithms is presented which perform reasonable synthesis and transformations between different UML notations (sequence diagrams, Object Constraint Language (OCL) constraints, statecharts). More specifically, we will discuss the following transformations: statechart synthesis, introduction of hierarchy, consistency of modifications, and "design-debugging".
    Keywords: Computer Programming and Software
    Type: IJCAR 2001; Unknown|PMD 2001; Unknown
    Format: application/pdf
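    As a toy illustration of the kind of synthesis step described above, the sketch below merges scenario event traces (e.g., message sequences taken from sequence diagrams) into a single prefix-tree state machine; the scenario names are invented, and the algorithms presented in the talk (hierarchy introduction, consistency of modifications, OCL handling) are considerably more sophisticated.

        # Merge scenario traces into a prefix-tree state machine (toy sketch,
        # not the talk's algorithms). State 0 is the initial state.
        from collections import defaultdict

        def synthesize_state_machine(scenarios):
            transitions = defaultdict(dict)   # state -> {event: next state}
            fresh = 1
            for trace in scenarios:
                state = 0
                for event in trace:
                    if event not in transitions[state]:
                        transitions[state][event] = fresh
                        fresh += 1
                    state = transitions[state][event]
            return dict(transitions)

        # Two hypothetical scenarios for a ticket machine.
        scenarios = [["insertCard", "enterPIN", "printTicket"],
                     ["insertCard", "cancel"]]
        print(synthesize_state_machine(scenarios))
        # {0: {'insertCard': 1}, 1: {'enterPIN': 2, 'cancel': 4}, 2: {'printTicket': 3}}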
  • 3
    Publication Date: 2013-08-29
    Description: AUTOBAYES is a fully automatic program synthesis system for the statistical data analysis domain. Its input is a concise description of a data analysis problem in the form of a statistical model; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. AUTOBAYES synthesizes code by a schema-guided deductive process. Schemas (i.e., code templates with associated semantic constraints) are applied to the original problem and recursively to emerging subproblems. AUTOBAYES complements this approach with symbolic computation to derive closed-form solutions whenever possible. In this paper, we concentrate on the interaction between the symbolic computations and the deductive synthesis process. A statistical model specifies for each problem variable (i.e., data or parameter) its properties and dependencies in the form of a probability distribution. A typical data analysis task is to estimate the best possible parameter values from the given observations or measurements. The following example models normal-distributed data but takes prior information (e.g., from previous experiments) on the data's mean value and variance into account.
    Keywords: Computer Programming and Software
    Format: application/pdf
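    For the normal-data example mentioned at the end of this abstract, the kind of closed-form solution involved can be illustrated by the hand-written sketch below, assuming a conjugate normal prior on the mean with known data variance; this is only an illustration and is neither AUTOBAYES's input syntax nor its generated C/C++ code.

        # Closed-form Bayesian estimate for y_i ~ N(mu, data_var) with a
        # conjugate prior mu ~ N(prior_mu, prior_var). Illustrative only.
        import numpy as np

        def posterior_of_mean(data, prior_mu, prior_var, data_var):
            n = len(data)
            post_var = 1.0 / (1.0 / prior_var + n / data_var)
            post_mu = post_var * (prior_mu / prior_var + data.sum() / data_var)
            return post_mu, post_var

        data = np.array([9.8, 10.4, 10.1, 9.7, 10.2])
        print(posterior_of_mean(data, prior_mu=10.0, prior_var=4.0, data_var=0.25))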
  • 4
    Publication Date: 2013-08-29
    Description: Code certification is a lightweight approach to formally demonstrate software quality. It concentrates on aspects of software quality that can be defined and formalized via properties, e.g., operator safety or memory safety. Its basic idea is to require code producers to provide formal proofs that their code satisfies these quality properties. The proofs serve as certificates which can be checked independently, by the code consumer or by certification authorities, e.g., the FAA. It is the idea underlying such approaches as proof-carrying code [6]. Code certification can be viewed as a more practical version of traditional Hoare-style program verification. The properties to be verified are fairly simple and regular so that it is often possible to use an automated theorem prover to automatically discharge all emerging proof obligations. Usually, however, the programmer must still splice auxiliary annotations (e.g., loop invariants) into the program to facilitate the proofs. For complex properties or larger programs this quickly becomes the limiting factor for the applicability of current certification approaches.
    Keywords: Computer Programming and Software
    Format: application/pdf
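    The sketch below is a toy illustration, tied to no particular certification framework, of the kind of auxiliary annotation the abstract mentions: a loop invariant strong enough that an array-bounds (memory safety) obligation could be discharged. In an actual certification setting the invariant would be a logical annotation checked by a theorem prover, not a runtime assertion.

        # Toy annotated loop: the invariant 0 <= i < len(a) justifies the
        # safety of the access a[i]; here it is merely asserted at run time.
        def sum_array(a):
            total = 0
            i = 0
            while i < len(a):
                assert 0 <= i < len(a)   # loop invariant / bounds obligation
                total += a[i]
                i += 1
            assert i == len(a)           # postcondition sketch
            return total

        print(sum_array([3, 1, 4, 1, 5]))  # 14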
  • 5
    Publication Date: 2018-06-06
    Description: This paper presents a new methodology for automatic knowledge driven data mining based on the theory of Mercer Kernels, which are highly nonlinear symmetric positive definite mappings from the original image space to a very high, possibly infinite dimensional feature space. We describe a new method called Mixture Density Mercer Kernels to learn kernel functions directly from data, rather than using predefined kernels. These data adaptive kernels can encode prior knowledge in the kernel using a Bayesian formulation, thus allowing for physical information to be encoded in the model. We compare the results with existing algorithms on data from the Sloan Digital Sky Survey (SDSS). The code for these experiments has been generated with the AUTOBAYES tool, which automatically generates efficient and documented C/C++ code from abstract statistical model specifications. The core of the system is a schema library which contains templates for learning and knowledge discovery algorithms like different versions of EM, or numeric optimization methods like conjugate gradient methods. The template instantiation is supported by symbolic-algebraic computations, which allows AUTOBAYES to find closed-form solutions and, where possible, to integrate them into the code. The results show that the Mixture Density Mercer Kernel described here outperforms tree-based classification in distinguishing high-redshift galaxies from low-redshift galaxies by approximately 16% on test data, bagged trees by approximately 7%, and bagged trees built on a much larger sample of data by approximately 2%.
    Keywords: Computer Programming and Software
    Format: application/pdf
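    One common way to turn a fitted mixture density into a Mercer kernel is to take the inner product of the component-membership posteriors, K(x, y) = sum_k P(k|x) P(k|y). The sketch below assumes that formulation and uses scikit-learn's GaussianMixture on synthetic data purely for illustration; the experiments in this record used AUTOBAYES-generated code and SDSS data, neither of which is reproduced here.

        # Data-adaptive mixture-density kernel sketch (an assumption, not the
        # paper's exact algorithm): K = R @ R.T with R the matrix of posterior
        # component probabilities, which makes K symmetric positive semi-definite.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])

        gm = GaussianMixture(n_components=2, random_state=0).fit(X)
        R = gm.predict_proba(X)    # (n_samples, n_components) responsibilities
        K = R @ R.T                # kernel (Gram) matrix over the sample

        print(K.shape, bool(np.all(np.linalg.eigvalsh(K) > -1e-9)))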
  • 6
    Publication Date: 2018-06-06
    Description: From the analysis of these pathways we conclude that postsynaptic processes that regulate synaptic transmission undergo significant cross-talk with respect to glutamatergic and neuromodulatory (dopamine) signals. The main hypothesis is that of a compensatory regulation, a competitive switch between the induction of increased AMPA conductance by CaMKII-dependent phosphorylation and reduced expression of PP2A, and increased D1 receptor sensitivity and expression by increased PKA, PP2A and decreased PP-1/calcineurin expression. Both types of plasticity are induced by NMDA receptor activation and increased internal calcium, but they require different internal conditions to become expressed. Specifically, we propose that AMPA regulation and D1 regulation are inversely coupled; the net result may be a bifurcation of synaptic state into predominantly AMPA or NMDA-D1 synapses. This could have functional consequences: stable connections for AMPA and conditional gating for NMDA-D1 synapses.
    Keywords: General
    Format: text
  • 7
    Publication Date: 2019-07-06
    Description: For unmanned aerial systems (UAS) to be successfully deployed and integrated within the national airspace, it is imperative that they possess the capability to effectively complete their missions without compromising the safety of other aircraft, as well as persons and property on the ground. This necessity creates a natural requirement for UAS that can respond to uncertain environmental conditions and emergent failures in real-time, with robustness and resilience close enough to those of manned systems. We introduce a system that meets this requirement with the design of a real-time onboard system health management (SHM) capability to continuously monitor sensors, software, and hardware components. This system can detect and diagnose failures and violations of safety or performance rules during the flight of a UAS. Our approach to SHM is three-pronged, providing: (1) real-time monitoring of sensor and software signals; (2) signal analysis, preprocessing, and advanced on-the-fly temporal and Bayesian probabilistic fault diagnosis; and (3) an unobtrusive, lightweight, read-only, low-power realization using Field Programmable Gate Arrays (FPGAs) that avoids overburdening limited computing resources or costly re-certification of flight software. We call this approach rt-R2U2, a name derived from its requirements. Our implementation provides a novel approach of combining modular building blocks, integrating responsive runtime monitoring of temporal logic system safety requirements with model-based diagnosis and Bayesian network-based probabilistic analysis. We demonstrate this approach using actual flight data from the NASA Swift UAS.
    Keywords: Air Transportation and Safety
    Type: ARC-E-DAA-TN24388, International Journal of Prognostics & Health Management (ISSN 2153-2648); 6; 021
    Format: application/pdf
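    The sketch below illustrates only the runtime-monitoring ingredient of such a system, as a small software monitor for a hypothetical bounded-response rule over streaming sensor values. rt-R2U2 itself realizes its temporal-logic monitors in FPGA hardware and couples them with model-based and Bayesian diagnosis; none of that is reproduced here, and the rule and signal names are invented.

        # Hypothetical bounded-response rule: whenever barometric and GPS
        # altitude disagree by more than 10 m, they must agree again within
        # 5 samples. Software sketch only; not rt-R2U2's implementation.
        from collections import deque

        class BoundedResponseMonitor:
            def __init__(self, bound, window):
                self.bound = bound        # allowed disagreement
                self.window = window      # samples allowed for recovery
                self.pending = deque()    # indices of unresolved disagreements

            def step(self, i, baro_alt, gps_alt):
                """Feed one sample; return False if the rule is violated."""
                if abs(baro_alt - gps_alt) > self.bound:
                    self.pending.append(i)
                else:
                    self.pending.clear()  # recovered in time
                return not (self.pending and i - self.pending[0] >= self.window)

        monitor = BoundedResponseMonitor(bound=10.0, window=5)
        stream = [(100.0, 100.5), (120.0, 100.0), (121.0, 100.0), (122.0, 100.0),
                  (123.0, 100.0), (124.0, 100.0), (125.0, 100.0)]
        for i, (baro, gps) in enumerate(stream):
            print(i, monitor.step(i, baro, gps))   # sample 6 reports a violation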
  • 8
    Publication Date: 2018-06-06
    Description: This paper presents a new methodology for automatic knowledge driven data mining based on the theory of Mercer Kernels, which are highly nonlinear symmetric positive definite mappings from the original image space to a very high, possibly infinite dimensional feature space. We describe a new method called Mixture Density Mercer Kernels to learn kernel functions directly from data, rather than using pre-defined kernels. These data adaptive kernels can encode prior knowledge in the kernel using a Bayesian formulation, thus allowing for physical information to be encoded in the model. Specifically, we demonstrate the use of the algorithm in situations with extremely small samples of data. We compare the results with existing algorithms on data from the Sloan Digital Sky Survey (SDSS) and demonstrate the method's superior performance against standard methods. The code for these experiments has been generated with the AUTOBAYES tool, which automatically generates efficient and documented C/C++ code from abstract statistical model specifications. The core of the system is a schema library which contains templates for learning and knowledge discovery algorithms like different versions of EM, or numeric optimization methods like conjugate gradient methods. The template instantiation is supported by symbolic-algebraic computations, which allows AUTOBAYES to find closed-form solutions and, where possible, to integrate them into the code.
    Keywords: Numerical Analysis
    Format: application/pdf
  • 9
    Publication Date: 2019-07-18
    Description: Recent research has shown that adaptive neural-based control systems are very effective in restoring stability and control of an aircraft in the presence of damage or failures. The application of an adaptive neural network within a flight-critical control system requires a thorough and proven process to ensure safe and proper flight operation. Unique testing tools have been developed as part of a process to perform verification and validation (V&V) of real-time adaptive neural networks used in recent adaptive flight control systems and to evaluate the performance of the online-trained neural networks. The tools will help in certification by the FAA and in the successful deployment of neural network based adaptive controllers in safety-critical applications. The process to perform verification and validation is evaluated against a typical neural adaptive controller and the results are discussed.
    Keywords: Aircraft Stability and Control
    Type: International Conference on Computational Intelligence on Modeling, Control and Automation (CIMCA); Jul 12, 2004 - Jul 14, 2004; Gold Coast; Australia
    Format: text
  • 10
    Publication Date: 2019-07-18
    Description: Traditional control has proven to be ineffective in dealing with catastrophic changes or slow degradation of complex, highly nonlinear systems like aircraft or spacecraft, robotics, or flexible manufacturing systems. Control systems which can adapt to changes in the plant have been proposed, as they offer many advantages (e.g., better performance, controllability of aircraft despite a damaged wing). In the last few years, the use of neural networks in adaptive controllers (neuro-adaptive control) has been studied actively. Neural networks of various architectures have been used successfully for online learning adaptive controllers. In such a typical control architecture, the neural network receives as input the current deviation between desired and actual plant behavior and, by on-line training, tries to minimize this discrepancy (e.g., by producing a control augmentation signal). Even though neuro-adaptive controllers offer many advantages, they have not been used in mission- or safety-critical applications, because performance and safety guarantees cannot be provided at development time, a major prerequisite for safety certification (e.g., by the FAA or NASA). Verification and Validation (V&V) of an adaptive controller requires the development of new analysis techniques which can demonstrate that the control system behaves safely under all operating conditions. Because of the requirement to adapt to unforeseen changes during operation, i.e., in real time, design-time V&V is not sufficient.
    Keywords: Aircraft Stability and Control
    Type: 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MTNS2004); Jul 25, 2004 - Jul 30, 2004; Garching; Germany
    Format: text
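    A minimal sketch of the online-learning loop described above follows: a linear-in-parameters stand-in for the neural network produces a control augmentation signal, and its weights are updated each step from the deviation between desired and actual plant behavior. The plant, feature map, and gains are all invented for illustration; this is in no way a certified or certifiable flight controller.

        # Toy online neuro-adaptive augmentation loop (hypothetical plant,
        # features, and gains). u_ad = W . phi(x) is trained online to shrink
        # the tracking error e that a fixed baseline controller leaves behind.
        import numpy as np

        def phi(x):
            return np.array([1.0, x, x * x])     # assumed basis functions

        W = np.zeros(3)        # adaptive weights
        gamma = 0.05           # adaptation rate
        x, x_des = 0.0, 1.0    # actual and desired plant state

        for step in range(200):
            e = x_des - x                  # tracking error
            u_ad = W @ phi(x)              # control augmentation signal
            u = 2.0 * e + u_ad             # baseline P-control plus augmentation
            x += 0.1 * (0.5 * u - 0.3)     # "damaged" plant: weak gain plus bias
            W += gamma * e * phi(x)        # online weight update from the error

        print(round(x, 3), np.round(W, 3)) # x approaches the desired value 1.0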