ALBERT

All Library Books, Journals and Electronic Records Telegrafenberg

  • 1
    Publication Date: 2020-07-17
    Description: Games and gamification appear to be a promising area for educational and health research. These strategies are increasingly used to improve health indicators, including in educational settings; however, there is little information on how they are applied within schools to promote physical activity (PA). Objective: The aim of this study is to describe a systematic review protocol of school-based interventions that use games and gamification to promote PA in pre-schoolers, children, and adolescent students. Methods: This review protocol is registered in the International Prospective Register of Systematic Reviews (PROSPERO) (CRD42019123521). Scientific databases include PubMed, Web of Science, SportDiscus, Cochrane Library, ERIC, and PsycINFO. A standardized procedure will be executed following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses protocol (PRISMA-P) checklist for conducting systematic review protocols, and the PICOS (Population, Interventions, Comparators, Outcomes, and Study design) tool will be used to define the search strategy. Detailed information will be extracted, including a quantitative assessment using effect sizes to compare the interventions and a qualitative assessment using the Effective Public Health Practice Project (EPHPP) tool. Conclusion: This systematic review protocol contributes to establishing future systematic reviews of games and gamification strategies in school settings and their effect on PA outcomes among youth. Additionally, an update and clarification of the different terms as used in the school context has been included.
    Print ISSN: 1661-7827
    Electronic ISSN: 1660-4601
    Topics: Energy, Environment Protection, Nuclear Power Engineering, Medicine
  • 2
    Publication Date: 2020-09-21
    Description: Generating Boolean Functions (BFs) with high nonlinearity is a complex task that is usually addressed through algebraic constructions. Metaheuristics have also been applied extensively to this task, but they have not been able to attain results as good as those of the algebraic techniques. This paper proposes a novel diversity-aware metaheuristic that is able to close this gap. The proposal includes the design of a novel cost function that combines several pieces of information from the Walsh-Hadamard Transform (WHT), and a replacement strategy that promotes a gradual change from exploration to exploitation as well as the formation of clusters of solutions, with the aim of allowing intensification steps at each iteration. The combination of high entropy in the population and lower entropy inside clusters allows a proper balance between exploration and exploitation. This is the first memetic algorithm able to generate 10-variable BFs of quality similar to that of algebraic methods. Experimental results and comparisons provide evidence of the high performance of the proposed optimization mechanism for generating high-quality BFs. (A minimal sketch of the nonlinearity criterion computed from the WHT follows this record.)
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology, Physics
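    The cost function described in the record above is built from the Walsh-Hadamard spectrum of a candidate Boolean function. As a point of reference only (this is not the paper's cost function), the standard nonlinearity criterion that such methods ultimately target can be computed from the WHT as sketched below, assuming the function is supplied as a truth table of length 2^n.

      #include <algorithm>
      #include <cstdint>
      #include <cstdlib>
      #include <iostream>
      #include <vector>

      // Nonlinearity of an n-variable Boolean function f given as a truth table of
      // length 2^n, computed with the fast Walsh-Hadamard transform:
      //   W_f(w) = sum_x (-1)^(f(x) XOR w.x),   NL(f) = 2^(n-1) - max_w |W_f(w)| / 2.
      int nonlinearity(const std::vector<std::uint8_t>& truth_table, int n) {
          const std::size_t size = std::size_t(1) << n;
          std::vector<int> w(size);
          for (std::size_t x = 0; x < size; ++x)
              w[x] = 1 - 2 * truth_table[x];                // map {0,1} to {+1,-1}
          for (std::size_t len = 1; len < size; len <<= 1)  // in-place butterflies
              for (std::size_t i = 0; i < size; i += len << 1)
                  for (std::size_t j = i; j < i + len; ++j) {
                      const int a = w[j], b = w[j + len];
                      w[j] = a + b;
                      w[j + len] = a - b;
                  }
          int max_abs = 0;
          for (int v : w) max_abs = std::max(max_abs, std::abs(v));
          return (1 << (n - 1)) - max_abs / 2;
      }

      int main() {
          // 3-variable example f(x1,x2,x3) = (x1 AND x2) XOR x3, indexed as
          // x = x1 + 2*x2 + 4*x3; its nonlinearity is 2, which is printed here.
          std::vector<std::uint8_t> f = {0, 0, 0, 1, 1, 1, 1, 0};
          std::cout << nonlinearity(f, 3) << "\n";
      }

    In a metaheuristic of the kind described, this value, or a smoother spectrum-based variant of it, would be evaluated for every candidate truth table.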
  • 3
    Publication Date: 2012-11-16
    Description: Abstract 4275 Introduction: Annually, $82 billion to $272 billion is reportedly lost to federal health care fraud. Between 1996 and 2005, 379 federal health care fraud cases initiated by qui tam relators ("whistle blowers") concluded, resulting in $9.3 billion in recoveries. Of these, pharmaceutical companies accounted for 13 cases under the False Claims Act (FCA), the primary statute invoked in health care fraud and abuse, but $3.9 billion of the recoveries (4% of the cases and 39% of the financial recoveries). We report concluded FCA cases involving pharmaceutical manufacturers between 2006 and 2011. Oncology accounts for the largest percentage of total pharmaceutical expenditures. Over 90% of all new cancer pharmaceuticals cost more than $20,000 for 12 weeks of treatment. Methods: Websites for the Department of Justice (DOJ), Taxpayers Against Fraud, the Health and Human Services Inspector General's Office, the Health Care Fraud and Abuse Control Project, and Lexis/Nexis were queried for pharmaceutical FCA cases (2006 to 2011). Results: Between 2006 and 2011, the DOJ closed 54 cases with pharmaceutical FCA violations, 38 with and 16 without qui tam relators, accounting for recoveries of $11.3 billion (mean $296 million) and $2.6 billion (mean $165 million), respectively. Illegal marketing was the most common fraud allegation invoked against pharmaceutical manufacturers (19 cases). Pharmaceutical manufacturers accounted for 31% of total FCA cases and 71.5% of total FCA recoveries (Table 1). Conclusion: Since the DOJ's shift of focus to pharmaceutical corporations in 2001, the trend has intensified, with virtually every large pharmaceutical corporation settling at least one FCA case. Pharmaceutical cases now account for 31% of federal fraud cases and 71% of financial recoveries. Fraud and abuse may be an important component of the high costs of cancer care in the United States. Moreover, unless fundamental changes occur, the pharmaceutical industry will continue to be the main FCA investigative target, as this sector has the deepest pockets and is the health care sector most resistant to deterrence. Disclosures: No relevant conflicts of interest to declare.
    Print ISSN: 0006-4971
    Electronic ISSN: 1528-0020
    Topics: Biology, Medicine
  • 4
    Publication Date: 2001-11-25
    Print ISSN: 0038-0644
    Electronic ISSN: 1097-024X
    Topics: Computer Science
    Published by Wiley
  • 5
    Publication Date: 2014-01-01
    Print ISSN: 1570-0232
    Electronic ISSN: 1873-376X
    Topics: Chemistry and Pharmacology
    Published by Elsevier
  • 6
  • 7
    Publication Date: 2017-10-04
    Description: Within NASA's High Performance Computing and Communication (HPCC) program, the NASA Glenn Research Center is developing an environment for the analysis/design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. To this end, NPSS integrates multiple disciplines such as aerodynamics, structures, and heat transfer and supports "numerical zooming" between 0-dimensional and 1-, 2-, and 3-dimensional component engine codes. To facilitate the timely and cost-effective capture of complex physical processes, NPSS uses object-oriented technologies such as C++ objects to encapsulate individual engine components and CORBA ORBs for object communication and deployment across heterogeneous computing platforms. Recently, the HPCC program has initiated a concept called the Information Power Grid (IPG), a virtual computing environment that integrates computers and other resources at different sites. IPG implements a range of Grid services such as resource discovery, scheduling, security, instrumentation, and data access, many of which are provided by the Globus toolkit. IPG facilities have the potential to benefit NPSS considerably. For example, NPSS should in principle be able to use Grid services to dynamically discover and then co-schedule the resources required for a particular engine simulation, rather than relying on manual placement of ORBs as at present. Grid services can also be used to initiate simulation components on parallel computers (MPPs) and to address inter-site security issues that currently hinder the coupling of components across multiple sites. These considerations led NASA Glenn and Globus project personnel to formulate a collaborative project designed to evaluate whether and how such benefits can be achieved in practice. This project involves, first, developing the basic techniques required to achieve co-existence of commodity object technologies and Grid technologies and, second, evaluating these techniques in the context of NPSS-oriented challenge problems. The work on basic techniques seeks to understand how "commodity" technologies (CORBA, DCOM, Excel, etc.) can be used in concert with specialized "Grid" technologies (for security, MPP scheduling, etc.). In principle, this coordinated use should be straightforward because of the Globus and IPG philosophy of providing low-level Grid mechanisms that can be used to implement a wide variety of application-level programming models. (Globus technologies have previously been used to implement Grid-enabled message-passing libraries, collaborative environments, and parameter study tools, among others.) Results obtained to date are encouraging: we have successfully demonstrated a CORBA-to-Globus resource manager gateway that allows the use of CORBA RPCs to control submission and execution of programs on workstations and MPPs; a gateway from the CORBA Trader service to the Grid information service; and a preliminary integration of CORBA and Grid security mechanisms. The two challenge problems that we consider are the following: 1) Desktop-controlled parameter study. Here, an Excel spreadsheet is used to define and control a CFD parameter study, via a CORBA interface to a high-throughput broker that runs individual cases on different IPG resources. (A minimal illustrative sketch of this fan-out pattern follows this record.) 2) Aviation safety. Here, about 100 NPSS jobs need to be submitted, run, and their data returned in near real time. Evaluation will address such issues as time to port, execution time, potential scalability of the simulation, and reliability of resources. The full paper will present the following information: 1. A detailed analysis of the requirements that NPSS applications place on IPG. 2. A description of the techniques used to meet these requirements via the coordinated use of CORBA and Globus. 3. A description of results obtained to date on the two challenge problems.
    Keywords: Computer Programming and Software
    Format: text
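    Challenge problem 1 in the record above is, at its core, a fan-out of independent CFD cases from a desktop front end to a broker that places them on available resources. The sketch below illustrates only that fan-out pattern in plain C++, using std::async as a stand-in for the CORBA broker and IPG scheduling described in the record; the case parameters and the run_case function are hypothetical.

      #include <future>
      #include <iostream>
      #include <string>
      #include <utility>
      #include <vector>

      // Hypothetical stand-in for one CFD case. In the record's setup this work
      // would be dispatched through a CORBA interface to a high-throughput broker
      // and executed on an IPG resource rather than in a local thread.
      std::string run_case(double mach, double angle_of_attack) {
          return "mach=" + std::to_string(mach) +
                 " aoa=" + std::to_string(angle_of_attack) + " -> converged";
      }

      int main() {
          // Parameter study defined on the "desktop" side (an Excel sheet in the record).
          std::vector<std::pair<double, double>> cases = {
              {0.7, 2.0}, {0.7, 4.0}, {0.8, 2.0}, {0.8, 4.0}};

          // Fan the cases out asynchronously and gather results as they complete.
          std::vector<std::future<std::string>> jobs;
          for (const auto& c : cases)
              jobs.push_back(std::async(std::launch::async, run_case, c.first, c.second));
          for (auto& job : jobs)
              std::cout << job.get() << "\n";
      }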
  • 8
    Publication Date: 2018-06-05
    Description: NASA Research and Education Network (NREN) ground truthing is a method of verifying the scientific validity of satellite images and clarifying irregularities in the imagery. Ground-truthed imagery can be used to locate geological compositions of interest for a given area. On Mars, astronaut scientists could ground truth satellite imagery from the planet's surface and then pinpoint optimum areas to explore. These astronauts would be able to ground truth imagery, get results back, and use the results during extravehicular activity without returning to Earth to process the data from the mission. NASA's first ground-truthing experiment, performed on June 25 in the Utah desert, demonstrated the ability to extend powerful computing resources to remote locations. Designed by Dr. Richard Beck of the Department of Geography at the University of Cincinnati, who is serving as the lead field scientist, and assisted by Dr. Robert Vincent of Bowling Green State University, the demonstration also involved researchers from the NASA Glenn Research Center and the NASA Ames Research Center, who worked with the university field scientists to design, perform, and analyze the results of the experiment. In this workflow, real-time Hyperion satellite imagery (data) is sent to a mass storage facility, while scientists at a remote site in Utah upload ground spectra (data) to a second mass storage facility. The grid pulls data from both mass storage facilities and performs up to 64 simultaneous band ratio conversions on the data. Moments later, the results from the grid are accessed by local scientists and sent directly to the remote science team, which uses them to locate and explore new critical compositions of interest. The process can be repeated as required to continue to validate the data set or to converge on alternate geophysical areas of interest. (A minimal sketch of a band ratio conversion follows this record.)
    Keywords: Space Sciences (General)
    Type: Research and Technology 2003; NASA/TM-2004-212729
    Format: application/pdf
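    The band ratio conversions mentioned in the record above divide, pixel by pixel, the value recorded in one spectral band by that in another, highlighting materials whose spectra differ between the two bands. The following is a minimal illustrative sketch of a single band ratio on hypothetical data; the grid workflow described would run many such ratios in parallel over full Hyperion scenes.

      #include <iostream>
      #include <vector>

      // Element-wise band ratio: ratio[i] = band_a[i] / band_b[i] for each pixel,
      // guarding against division by zero (e.g. masked or dropped pixels).
      std::vector<double> band_ratio(const std::vector<double>& band_a,
                                     const std::vector<double>& band_b) {
          std::vector<double> ratio(band_a.size(), 0.0);
          for (std::size_t i = 0; i < band_a.size(); ++i)
              if (band_b[i] != 0.0)
                  ratio[i] = band_a[i] / band_b[i];
          return ratio;
      }

      int main() {
          // Hypothetical reflectance values for two bands over four pixels.
          std::vector<double> band_a = {0.42, 0.35, 0.50, 0.28};
          std::vector<double> band_b = {0.21, 0.35, 0.10, 0.00};
          for (double r : band_ratio(band_a, band_b)) std::cout << r << " ";
          std::cout << "\n";
      }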
  • 9
    Publication Date: 2017-10-04
    Description: Recent progress in distributed object technology has enabled software applications to be developed and deployed easily, such that objects or components can work together across the boundaries of the network, different operating systems, and different languages. A distributed object is not necessarily a complete application but rather a reusable, self-contained piece of software that cooperates with other objects in a plug-and-play fashion via a well-defined interface. The Common Object Request Broker Architecture (CORBA), a middleware standard defined by the Object Management Group (OMG), uses the Interface Definition Language (IDL) to specify such an interface for transparent communication between distributed objects. Since IDL can be mapped to any programming language, such as C++, Java, or Smalltalk, existing applications can be integrated into a new application, and hence the tasks of code rewriting and software maintenance can be reduced. Many scientific applications in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with CORBA objects can increase the codes' reusability. For example, scientists could link their scientific applications to vintage Fortran programs such as Partial Differential Equation (PDE) solvers in a plug-and-play fashion. Unfortunately, a CORBA IDL-to-Fortran mapping has not been proposed, and there seems to be no direct method of generating CORBA objects from Fortran without having to resort to manually writing C/C++ wrappers. In this paper, we present an efficient methodology to integrate Fortran legacy programs into a distributed object framework. Issues and strategies regarding the conversion and decomposition of Fortran codes into CORBA objects are discussed. The following diagram shows the conversion and decomposition mechanism we propose. Our goal is to keep the Fortran codes unmodified. The conversion-aid tool takes the Fortran application program as input and helps programmers generate the C/C++ header file and the IDL file for wrapping the Fortran code. Programmers must still determine how to decompose the legacy application into several reusable components based on the cohesion and coupling factors among the functions and subroutines. However, programming effort can still be greatly reduced because function headers and types have been converted to C++ and IDL styles. Most Fortran applications use the COMMON block to facilitate the transfer of large numbers of variables among several functions. The COMMON block plays a role similar to that of global variables in C. In a CORBA-compliant programming environment, global variables cannot be used to pass values between objects. One approach to dealing with this problem is to put the COMMON variables into the parameter list. We do not adopt this approach because it requires modification of the Fortran source code, which violates our design goal. Our approach is to extract the COMMON blocks and convert them into a structure-typed attribute in C++. Through attributes, each component can initialize the variables and return the computation results to the client. We have successfully tested the proposed conversion methodology based on the f2c converter. Since f2c only translates Fortran to C, we still needed to edit the converted code to meet C++ and IDL syntax. For example, C++/IDL requires a tag in the structure type, while C does not. In this paper, we identify the necessary changes to the f2c converter in order to directly generate the C++ header and the IDL file. Our future work is to add a GUI to ease the decomposition task by simply dragging and dropping icons. (A minimal sketch of the COMMON-block mapping follows this record.)
    Keywords: Computer Programming and Software
    Type: Welcome to the NASA High Performance Computing and Communications Computational Aerosciences (CAS) Workshop 2000; D-000001
    Format: text
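    To make the COMMON-block conversion in the record above concrete, here is a minimal sketch under assumed names (SOLVER, TOL, MAXIT, and SOLVE are hypothetical, not taken from the record). It follows the f2c convention that COMMON /SOLVER/ becomes a global struct named solver_ and SUBROUTINE SOLVE becomes solve_ with pointer arguments; stub bodies stand in for the f2c output so the sketch is self-contained, and the corresponding IDL fragment appears only as a comment.

      #include <iostream>
      #include <vector>

      // Hypothetical legacy Fortran (for illustration only):
      //     COMMON /SOLVER/ TOL, MAXIT
      //     SUBROUTINE SOLVE(X, N)
      //
      // f2c exposes the COMMON block as a global struct with a trailing underscore
      // and the subroutine as a C function taking pointer arguments. Stubs stand in
      // for the generated code here; a real build would link the f2c output instead.
      struct SolverCommon { double tol; int maxit; };
      SolverCommon solver_{1e-6, 100};                  // COMMON /SOLVER/ TOL, MAXIT
      void solve_(double* x, int* n) {                  // SUBROUTINE SOLVE(X, N)
          for (int i = 0; i < *n; ++i) x[i] *= 0.5;     // placeholder computation
      }

      // Corresponding IDL fragment (sketch), exposing the block as a struct-typed
      // attribute so clients can set inputs and read results without touching the
      // Fortran source:
      //     struct SolverBlock { double tol; long maxit; };
      //     interface Solver {
      //         attribute SolverBlock block;
      //         void solve(inout DoubleSeq x);
      //     };

      // Wrapper in the spirit of the record's approach: copy the attribute into the
      // COMMON struct, then delegate to the unmodified Fortran routine.
      class SolverWrapper {
      public:
          void set_block(double tol, int maxit) { solver_ = {tol, maxit}; }
          void solve(std::vector<double>& x) {
              int n = static_cast<int>(x.size());
              solve_(x.data(), &n);
          }
      };

      int main() {
          SolverWrapper s;
          s.set_block(1e-8, 200);
          std::vector<double> x = {2.0, 4.0};
          s.solve(x);
          std::cout << x[0] << " " << x[1] << "\n";     // prints 1 2
      }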
  • 10
    Publication Date: 2018-06-06
    Description: The objective is to develop, in cooperation with other NASA centers, a distributed computational environment capable of executing full 3-D aerospace propulsion applications.
    Keywords: Computer Programming and Software
    Type: 2002 Computing and Interdisciplinary Systems Office Review and Planning Meeting; 141-148; NASA/TM-2003-211896
    Format: application/pdf