ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 1
    Publication Date: 2012-03-15
    Description: In this paper, we integrate a grid system and a wireless network to present a convenient computational service system, called the Semi-Preemptive Computational Service system (SePCS for short), which provides users with a wireless access environment and through which a user can share his/her resources with others. In the SePCS, each node is dynamically given a score based on its CPU level, available memory size, current length of its waiting queue, CPU utilization and bandwidth. With the scores, resource nodes are classified into three levels. User requests, based on their time constraints, are also classified into three types. Resources of higher levels are allocated to more tightly constrained requests so as to increase the total performance of the system. To achieve this, a resource broker with the Semi-Preemptive Algorithm (SPA) is also proposed. When the resource broker cannot find a suitable resource for a higher-type request, it preempts a resource that is currently executing a lower-type request so that the higher-type request can be executed immediately. The SePCS can be applied to a Vehicular Ad Hoc Network (VANET), whose users can then exploit convenient mobile network services and wireless distributed computing. As a result, the performance of the system is higher than that of the tested schemes. (An illustrative sketch of the scoring and semi-preemptive allocation follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
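
Since this record describes a concrete scoring-and-preemption scheme, a small illustration may help. The following Python sketch is hypothetical: the attribute weights, the level cutoffs, and the Node/allocate helpers are illustrative assumptions, not the paper's actual SPA.

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    cpu_level: float   # normalized 0..1, higher is faster
    free_mem: float    # normalized 0..1
    queue_len: int     # current length of the waiting queue
    cpu_util: float    # 0..1, lower is better
    bandwidth: float   # normalized 0..1
    running: int = 0   # 0 = idle, else the type (1 = tightest) of the job it runs

def score(n: Node) -> float:
    # Weighted sum over the five attributes named in the abstract;
    # the weights and normalization are illustrative, not the paper's.
    return (0.3 * n.cpu_level + 0.2 * n.free_mem + 0.2 * n.bandwidth
            - 0.2 * n.cpu_util - 0.1 * min(n.queue_len / 10, 1.0))

def level(n: Node) -> int:
    s = score(n)
    return 1 if s >= 0.35 else 2 if s >= 0.15 else 3   # level 1 = best resources

def allocate(nodes, req_type):
    """Semi-preemptive broker sketch: prefer an idle node whose level is at
    least as good as the request type; otherwise preempt a node running a
    less constrained (higher-numbered) job."""
    idle = [n for n in nodes if n.running == 0 and level(n) <= req_type]
    if idle:
        chosen = max(idle, key=score)
    else:
        victims = [n for n in nodes if n.running > req_type]
        if not victims:
            return None                    # nothing suitable: queue the request
        chosen = max(victims, key=score)   # the preempted job is resumed later
    chosen.running = req_type
    return chosen
```
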
  • 2
    Publication Date: 2012-10-16
    Description: Forecasting the unit cost of every product type in a factory is an important task. However, it is not easy to deal with the uncertainty of the unit cost. Fuzzy collaborative forecasting is a very effective treatment of the uncertainty in the distributed environment. This paper presents some linear fuzzy collaborative forecasting models to predict the unit cost of a product. In these models, the experts’ forecasts differ and therefore need to be aggregated through collaboration. According to the experimental results, the effectiveness of forecasting the unit cost was considerably improved through collaboration.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 3
    Publication Date: 2012-10-16
    Description: Imperialist Competitive Algorithm (ICA) is a new population-based evolutionary algorithm. It divides its population of solutions into several sub-populations, and then searches for the optimal solution through two operations: assimilation and competition. The assimilation operation moves each non-best solution (called colony) in a sub-population toward the best solution (called imperialist) in the same sub-population. The competition operation removes a colony from the weakest sub-population and adds it to another sub-population. Previous work on ICA focuses mostly on improving the assimilation operation or replacing the assimilation operation with more powerful meta-heuristics, but none focuses on the improvement of the competition operation. Since the competition operation simply moves a colony (i.e., an inferior solution) from one sub-population to another sub-population, it incurs weak interaction among these sub-populations. This work proposes Interaction Enhanced ICA that strengthens the interaction among the imperialists of all sub-populations. The performance of Interaction Enhanced ICA is validated on a set of benchmark functions for global optimization. The results indicate that the performance of Interaction Enhanced ICA is superior to that of ICA and its existing variants. (A toy implementation of the assimilation and competition operations follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
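
The two operations named in this abstract are simple enough to sketch. Below is a minimal toy ICA for continuous minimization; the population sizes, the assimilation coefficient beta, and the uniform redistribution of the lost colony are illustrative assumptions (the paper's Interaction Enhanced ICA adds stronger interaction among imperialists, which is not reproduced here).

```python
import numpy as np

def ica_minimize(f, dim, n_countries=60, n_imp=6, iters=300,
                 beta=1.5, lo=-5.0, hi=5.0, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lo, hi, (n_countries, dim))
    order = np.argsort(np.apply_along_axis(f, 1, pop))
    imps = [pop[i].copy() for i in order[:n_imp]]          # best -> imperialists
    cols = [[] for _ in range(n_imp)]
    for k, i in enumerate(order[n_imp:]):                  # round-robin colonies
        cols[k % n_imp].append(pop[i].copy())
    for _ in range(iters):
        for e in range(n_imp):
            for c in range(len(cols[e])):
                # assimilation: move the colony a random step toward its imperialist
                x = cols[e][c]
                x += beta * rng.random(dim) * (imps[e] - x)
                np.clip(x, lo, hi, out=x)
                if f(x) < f(imps[e]):                      # colony overthrows imperialist
                    imps[e], cols[e][c] = x.copy(), imps[e].copy()
        # competition: the weakest empire loses one colony to a random empire
        weakest = max(range(n_imp), key=lambda e: f(imps[e]))
        if cols[weakest]:
            lost = cols[weakest].pop(int(rng.integers(len(cols[weakest]))))
            cols[int(rng.integers(n_imp))].append(lost)
    best = min(imps, key=f)
    return best, f(best)

# best, val = ica_minimize(lambda z: float(np.sum(z * z)), dim=5)  # sphere test
```
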
  • 4
    Publication Date: 2012-09-27
    Description: In this paper we demonstrate a family of metrics for estimating the quality of a text summary relative to one or more human-generated summaries. The improved metrics are based on features automatically computed from the summaries to measure content and linguistic quality. The features are combined using one of three methods—robust regression, non-negative least squares, or canonical correlation, an eigenvalue method. The new metrics significantly outperform the previous standard for automatic text summarization evaluation, ROUGE.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 5
    Publication Date: 2012-10-20
    Description: We propose using side information to further inform anomaly detection algorithms of the semantic context of the text data they are analyzing, thereby considering both divergence from the statistical pattern seen in particular datasets and divergence seen from more general semantic expectations. Computational experiments show that our algorithm performs as expected on data that reflect real-world events with contextual ambiguity, while replicating conventional clustering on data that are either too specialized or generic to result in contextual information being actionable. These results suggest that our algorithm could potentially reduce false positive rates in existing anomaly detection systems.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 6
    Publication Date: 2012-10-23
    Description: Data can be represented in many different ways within a particular document or set of documents. Hence, attempts to automatically process the relationships between documents or determine the relevance of certain document objects can be problematic. In this study, we have developed software to automatically catalog objects contained in HTML files for patents granted by the United States Patent and Trademark Office (USPTO). Once these objects are recognized, the software creates metadata that assigns a data type to each document object. Such metadata can be easily processed and analyzed for subsequent text mining tasks. Specifically, document similarity and clustering techniques were applied to a subset of the USPTO document collection. Although our preliminary results suggest that tables and numerical data do not add quantifiable value to a document's content, they set the stage for future work on measuring the importance of document objects within a large corpus.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 7
    Publication Date: 2012-04-06
    Description: Link puzzles involve finding paths or a cycle in a grid that satisfy given local and global properties. This paper proposes algorithms that enumerate solutions and instances of two link puzzles, Slitherlink and Numberlink, by zero-suppressed binary decision diagrams (ZDDs). A ZDD is a compact data structure for a family of sets provided with a rich family of set operations, by which, for example, one can easily extract a subfamily satisfying a desired property. Thanks to the nature of ZDDs, our algorithms offer a tool to assist users to design instances of those link puzzles.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 8
    Publication Date: 2012-04-11
    Description: Deduplication in storage systems has gained momentum recently for its capability in reducing data footprint. However, deduplication introduces challenges to storage management as storage objects (e.g., files) are no longer independent from each other due to content sharing between these storage objects. In this paper, we present a graph-based framework to address the challenges of storage management due to deduplication. Specifically, we model content sharing among storage objects by content sharing graphs (CSG), and apply graph-based algorithms to two real-world storage management use cases for deduplication-enabled storage systems. First, a quasi-linear algorithm was developed to partition deduplication domains with a minimal amount of deduplication loss (i.e., data replicated across partitioned domains) in commercial deduplication-enabled storage systems, whereas in general the partitioning problem is NP-complete. For a real-world trace of 3 TB data with 978 GB of removable duplicates, the proposed algorithm can partition the data into 15 balanced partitions with only 54 GB of deduplication loss, that is, a 5% deduplication loss. Second, a quick and accurate method to query the deduplicated size for a subset of objects in deduplicated storage systems was developed. For the same trace of 3 TB data, the optimized graph-based algorithm can complete the query in 2.6 s, which is less than 1% of that of the traditional algorithm based on the deduplication metadata. (A toy content-sharing-graph partitioner follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
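
As a rough illustration of the content-sharing-graph idea, here is a toy greedy partitioner. It is not the paper's quasi-linear algorithm; the balance penalty lam, the big-objects-first order, and the dictionary-based graph encoding are illustrative assumptions.

```python
def partition_csg(sizes, shared, k, lam=0.1):
    """Toy greedy partitioner for a content sharing graph (CSG).
    sizes: {object: bytes}; shared: {(u, v): shared bytes}; k: partitions.
    Dedup loss = shared bytes whose endpoints land in different partitions."""
    part_of, load = {}, [0.0] * k
    for obj in sorted(sizes, key=sizes.get, reverse=True):   # big objects first
        gain = [0.0] * k    # shared bytes co-located if obj joins partition p
        for (u, v), w in shared.items():
            other = v if u == obj else u if v == obj else None
            if other is not None and other in part_of:
                gain[part_of[other]] += w
        # trade co-location gain against load balance (lam is illustrative)
        p = max(range(k), key=lambda i: gain[i] - lam * load[i])
        part_of[obj] = p
        load[p] += sizes[obj]
    loss = sum(w for (u, v), w in shared.items() if part_of[u] != part_of[v])
    return part_of, loss

# part, loss = partition_csg({"a": 100, "b": 80, "c": 60},
#                            {("a", "b"): 40, ("b", "c"): 10}, k=2)
```
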
  • 9
    Publication Date: 2012-04-11
    Description: Grammar-based compression is a well-studied technique to construct a context-free grammar (CFG) that derives a given text uniquely. In this work, we propose an online algorithm for grammar-based compression. Our algorithm guarantees an O(log² n)-approximation ratio for the minimum grammar size, where n is the input size, and it runs in time linear in the input and space linear in the output. In addition, we propose a practical encoding, which transforms a restricted CFG into a more compact representation. Experimental results by comparison with standard compressors demonstrate that our algorithm is especially effective for highly repetitive text.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 10
    Publication Date: 2012-04-14
    Description: A disentanglement puzzle consists of mechanically interlinked pieces, and the puzzle is solved by disentangling one piece from another set of pieces. A string puzzle consists of strings entangled with one or more wooden pieces. We consider the generalized string puzzle problem whose input is the layout of strings and a wooden board with holes embedded in the 3-dimensional Euclidean space. We present a polynomial-time transformation from an arbitrary instance ƒ of the 3SAT problem to a string puzzle s such that ƒ is satisfiable if and only if s is solvable. Therefore, the generalized string puzzle problem is NP-hard.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 11
    Publication Date: 2012-08-25
    Description: When an event occurs in the real world, numerous news reports describing this event start to appear on different news sites within a few minutes of the event occurrence. This may result in a huge amount of information for users, and automated processes may be required to help manage this information. In this paper, we describe a clustering system that can cluster news reports from disparate sources into event-centric clusters, i.e., clusters of news reports describing the same event. A user can identify any RSS feed as a source of news he/she would like to receive, and our clustering system can cluster reports received from the separate RSS feeds as they arrive, without knowing the number of clusters in advance. Our clustering system was designed to function well in an online incremental environment. In evaluating our system, we found that it is very good at fine-grained clustering but performs rather poorly at coarser-grained clustering. (A minimal single-pass clusterer follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
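
A single-pass, threshold-based clusterer captures the core of such an online incremental design: the number of clusters is never fixed in advance. The Counter-based term vectors and the 0.35 threshold below are illustrative assumptions (a production system would use TF-IDF weighting and time-decayed centroids).

```python
import math
from collections import Counter

def cosine(a, b):
    dot = sum(v * b[t] for t, v in a.items() if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class OnlineClusterer:
    """Single-pass incremental clustering: a report joins the most similar
    cluster if similarity clears a threshold, otherwise it founds a new one."""
    def __init__(self, threshold=0.35):
        self.threshold = threshold
        self.centroids = []   # running term-frequency sums, one per cluster
        self.members = []

    def add(self, doc_id, tokens):
        vec = Counter(tokens)
        sims = [cosine(vec, c) for c in self.centroids]
        best = max(range(len(sims)), key=sims.__getitem__, default=None)
        if best is not None and sims[best] >= self.threshold:
            self.centroids[best].update(vec)
            self.members[best].append(doc_id)
            return best
        self.centroids.append(vec)
        self.members.append([doc_id])
        return len(self.centroids) - 1

# clu = OnlineClusterer()
# clu.add("r1", "earthquake strikes coast of chile".split())
# clu.add("r2", "strong earthquake hits chile coast".split())
# clu.add("r3", "election results announced in france".split())
# print(clu.members)   # the first two reports share a cluster
```
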
  • 12
    Publication Date: 2012-08-23
    Description: An algorithm that forecasts volcanic activity using an event tree decision making framework and logistic regression has been developed, characterized, and validated. The suite of empirical models that drive the system was derived from a sparse and geographically diverse dataset composed of source modeling results, volcano monitoring data, and historic information from analog volcanoes. Bootstrapping techniques were applied to the training dataset to allow for the estimation of robust logistic model coefficients. Probabilities generated from the logistic models increase with positive modeling results, escalating seismicity, and rising eruption frequency. Cross validation yielded a series of receiver operating characteristic curves with areas ranging between 0.78 and 0.81, indicating that the algorithm has good forecasting capabilities. Our results suggest that the logistic models are highly transportable and can compete with, and in some cases outperform, non-transportable empirical models trained with site-specific information. (A sketch of the bootstrapped logistic-regression step follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
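
The bootstrapping-plus-logistic-regression core of such a system can be sketched in a few lines. This is a hedged stand-in, not the authors' pipeline: the use of scikit-learn, the coefficient averaging, and the stratified resampling are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.utils import resample

def bootstrap_logistic(X, y, n_boot=200, seed=0):
    """Average logistic-regression coefficients over bootstrap resamples,
    a stand-in for the robust coefficient estimation the abstract mentions."""
    coefs = []
    for b in range(n_boot):
        Xb, yb = resample(X, y, stratify=y, random_state=seed + b)
        m = LogisticRegression(max_iter=1000).fit(Xb, yb)
        coefs.append(np.r_[m.intercept_, m.coef_.ravel()])
    return np.mean(coefs, axis=0)

def eruption_probability(theta, X):
    z = theta[0] + np.asarray(X) @ theta[1:]
    return 1.0 / (1.0 + np.exp(-z))          # logistic link

# rng = np.random.default_rng(0)             # synthetic monitoring features
# X = rng.normal(size=(120, 3))
# y = (X[:, 0] + rng.normal(size=120) > 0.8).astype(int)
# theta = bootstrap_logistic(X, y)
# print(roc_auc_score(y, eruption_probability(theta, X)))   # in-sample AUC
```
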
  • 13
    Publication Date: 2012-05-03
    Description: Deduplication in storage systems has gained momentum recently for its capability in reducing data footprint. However, deduplication introduces challenges to storage management as storage objects (e.g., files) are no longer independent from each other due to content sharing between these storage objects. In this paper, we present a graph-based framework to address the challenges of storage management due to deduplication. Specifically, we model content sharing among storage objects by content sharing graphs (CSG), and apply graph-based algorithms to two real-world storage management use cases for deduplication-enabled storage systems. First, a quasi-linear algorithm was developed to partition deduplication domains with a minimal amount of deduplication loss (i.e., data replicated across partitioned domains) in commercial deduplication-enabled storage systems, whereas in general the partitioning problem is NP-complete. For a real-world trace of 3 TB data with 978 GB of removable duplicates, the proposed algorithm can partition the data into 15 balanced partitions with only 54 GB of deduplication loss, that is, a 5% deduplication loss. Second, a quick and accurate method to query the deduplicated size for a subset of objects in deduplicated storage systems was developed. For the same trace of 3 TB data, the optimized graph-based algorithm can complete the query in 2.6 s, which is less than 1% of that of the traditional algorithm based on the deduplication metadata.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 14
    Publication Date: 2012-05-03
    Description: This paper is a review which presents and explains the decomposition of graphs by clique minimal separators. The pace is leisurely, and we give many examples and figures. Easy algorithms are provided to implement this decomposition. The historical and theoretical background is given, as well as sketches of proofs of the structural results involved.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 15
    Publication Date: 2012-05-03
    Description: In recent years, many tools have been proposed to reduce the programming learning difficulties felt by many students. Our group has contributed to this effort through the development of several tools, such as VIP, SICAS, OOP-Anim, SICAS-COL and H-SICAS. Even though we had some positive results, the use of these tools does not seem to significantly reduce weaker students' difficulties. These students need stronger support to motivate them to get engaged in learning activities, inside and outside the classroom. Nowadays, many technologies are available to create contexts that may help to accomplish this goal. We consider that a promising path goes through the integration of solutions. In this paper we analyze the features, strengths and weaknesses of the tools developed by our group. Based on these considerations we present a new environment, integrating different types of pedagogical approaches, resources, tools and technologies for programming learning support. With this environment, currently under development, it will be possible to review contents and lessons, based on video and screen captures. The support for collaborative tasks is another key point to improve and stimulate different models of teamwork. The platform will also allow the creation of various alternative models (learning objects) for the same subject, enabling personalized learning paths adapted to each student's knowledge level, needs and preferred learning styles. The learning sequences will work as a study organizer, following a suitable taxonomy, according to students' cognitive skills. Although the main goal of this environment is to support students with more difficulties, it will provide a set of resources supporting the learning of more advanced topics. Software engineering techniques and representations, object orientation and event programming are features that will be available in order to promote the learning progress of students.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 16
    Publication Date: 2012-05-03
    Description: Graph search algorithms have exploited graph extremities, such as the leaves of a tree and the simplicial vertices of a chordal graph. Recently, several well-known graph search algorithms have been collectively expressed as two generic algorithms called MLS and MLSM. In this paper, we investigate the properties of the vertex that is numbered 1 by MLS on a chordal graph and by MLSM on an arbitrary graph. We explain how this vertex is an extremity of the graph. Moreover, we show the remarkable property that the minimal separators included in the neighborhood of this vertex are totally ordered by inclusion.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 17
    Publication Date: 2012-05-03
    Description: Non-preemptive schedulers, despite their many discussed drawbacks, remain a very popular choice for practitioners of real-time and embedded systems. The non-preemptive ‘thrift’ cyclic scheduler, variations of which can be found in other application areas, has recently received considerable attention for the implementation of such embedded systems. A thrift scheduler provides a flexible and compact implementation model for periodic task sets with comparatively small overheads; additionally, it can overcome several of the problems associated with traditional ‘cyclic executives’. However, severe computational difficulties can still arise when designing schedules for non-trivial task sets. This paper is concerned with an optimization version of the offset-assignment problem, in which the objective is to assign task offsets such that the required CPU clock speed is minimized whilst ensuring that task overruns do not occur; it is known that the decision version of this problem is complete for Σ₂ᵖ. The paper first considers the problem of candidate solution verification, itself strongly coNP-complete, and proposes a fast, exact algorithm for it; it is shown that for any fixed number of tasks, its execution time is polynomial. The paper then proposes two heuristic algorithms of pseudopolynomial complexity for solving the offset-assignment problem, and considers how redundant choices of offset combinations can be eliminated to help speed up the search. The performance of these algorithms is then experimentally evaluated, before conclusions are drawn. (A naive offset-search sketch follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
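
To make the offset-assignment objective concrete, here is a deliberately naive sketch: it searches all offset combinations exhaustively (the paper proposes pseudopolynomial heuristics instead) and scores each assignment by peak work backlog at unit speed, a simplified proxy for the required CPU clock speed that ignores non-preemptive blocking.

```python
import itertools
from functools import reduce
from math import gcd

def hyperperiod(periods):
    return reduce(lambda a, b: a * b // gcd(a, b), periods)

def peak_backlog(tasks, offsets):
    """Simulate one hyperperiod at unit speed: each tick processes one unit
    of queued work, each release adds its WCET. The peak backlog is a crude
    proxy for the clock speed an overrun-free schedule would need."""
    H = hyperperiod([p for p, _ in tasks])
    backlog = worst = 0.0
    for t in range(H):
        backlog = max(backlog - 1.0, 0.0)
        for (p, c), o in zip(tasks, offsets):
            if t >= o and (t - o) % p == 0:
                backlog += c
        worst = max(worst, backlog)
    return worst

def best_offsets(tasks):
    """Exhaustive offset search; the paper replaces this with heuristics."""
    return min(itertools.product(*(range(p) for p, _ in tasks)),
               key=lambda offs: peak_backlog(tasks, offs))

# tasks = [(4, 1.0), (5, 2.0), (10, 1.5)]     # (period, WCET) pairs
# offs = best_offsets(tasks)
# print(offs, peak_backlog(tasks, offs))
```
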
  • 18
    Publication Date: 2012-05-03
    Description: Algorithm animations typically assist in educational tasks aimed simply at achieving understanding. Potentially, animations could assist in higher levels of cognition, such as the analysis level, but they usually fail in providing this support because they are not flexible or comprehensive enough. In particular, animations of recursion provided by educational systems hardly support the analysis of recursive algorithms. Here we show how to provide full support to the analysis of recursive algorithms. From a technical point of view, animations are enriched with interaction techniques inspired by the information visualization (InfoVis) field. Interaction tasks are presented in seven categories, and deal with both static visualizations and dynamic animations. All of these features are implemented in the SRec system, and visualizations generated by SRec are used to illustrate the article.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 19
    Publication Date: 2012-05-03
    Description: Techniques in image similarity can be used to improve the classification of breast cancer images. Breast cancer images in the mammogram modality have an abundance of non-cancerous structures that are similar to cancer, which makes classifying images as containing cancer especially difficult. Only the cancerous part of the image is relevant, so the techniques must learn to recognize cancer in noisy mammograms and extract features from that cancer to appropriately classify images. There are also many types or classes of cancer, with different characteristics, over which the system must work. Mammograms come in sets of four, two images of each breast, which enables comparison of the left and right breast images to help determine relevant features and remove irrelevant ones. In this work, image feature clustering is done to reduce the noise and the feature space, and the results are used in a distance function that uses a learned threshold in order to produce a classification. The threshold parameter of the distance function is learned simultaneously with the underlying clustering and then integrated to produce an agglomeration that is relevant to the images. This technique can diagnose breast cancer more accurately than commercial systems and other published results.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 20
    Publication Date: 2012-05-03
    Description: This paper uses an ensemble of classifiers and active learning strategies to predict radiologists’ assessment of the nodules of the Lung Image Database Consortium (LIDC). In particular, the paper presents machine learning classifiers that model agreement among ratings in seven semantic characteristics: spiculation, lobulation, texture, sphericity, margin, subtlety, and malignancy. The ensemble of classifiers (which can be considered as a computer panel of experts) uses 64 image features of the nodules across four categories (shape, intensity, texture, and size) to predict semantic characteristics. The active learning begins the training phase with nodules on which radiologists’ semantic ratings agree, and incrementally learns how to classify nodules on which the radiologists do not agree. Using our proposed approach, the classification accuracy of the ensemble of classifiers is higher than the accuracy of a single classifier. In the long run, our proposed approach can be used to increase consistency among radiological interpretations by providing physicians a “second read”.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 21
    Publication Date: 2012-05-03
    Description: We consider grammar-based text compression with longest first substitution (LFS), where non-overlapping occurrences of a longest repeating factor of the input text are replaced by a new non-terminal symbol. We present the first linear-time algorithm for LFS. Our algorithm employs a new data structure called sparse lazy suffix trees. We also deal with a more sophisticated version of LFS, called LFS2, that allows better compression. The first linear-time algorithm for LFS2 is also presented. (A toy quadratic-time version of LFS follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
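
A quadratic-time toy version of longest first substitution shows the mechanics (the paper's contribution is doing this in linear time with sparse lazy suffix trees). The R0, R1, ... non-terminal names are an illustrative convention.

```python
def longest_repeating_factor(seq):
    """Naive search for the longest factor with two non-overlapping
    occurrences; scans candidate lengths from longest to shortest."""
    n = len(seq)
    for length in range(n // 2, 1, -1):
        first = {}
        for i in range(n - length + 1):
            f = tuple(seq[i:i + length])
            j = first.setdefault(f, i)     # earliest occurrence of f
            if j + length <= i:            # non-overlapping repeat found
                return f
    return None

def lfs_compress(text):
    """Greedy longest-first substitution: repeatedly replace non-overlapping
    occurrences of a longest repeating factor by a fresh non-terminal."""
    seq, rules, k = list(text), {}, 0
    while True:
        f = longest_repeating_factor(seq)
        if f is None:
            break
        sym = f"R{k}"; k += 1
        rules[sym] = f
        out, i, L = [], 0, len(f)
        while i < len(seq):                # left-to-right replacement
            if tuple(seq[i:i + L]) == f:
                out.append(sym); i += L
            else:
                out.append(seq[i]); i += 1
        seq = out
    return seq, rules

# seq, rules = lfs_compress("abracadabra abracadabra")
# print(seq)    # ['R0', ' ', 'R0'] with rules['R0'] spelling "abracadabra"
```
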
  • 22
    Publication Date: 2012-05-03
    Description: Using Hidden Markov Models (HMMs) as a recognition framework for automatic classification of animal vocalizations has a number of benefits, including the ability to handle duration variability through nonlinear time alignment, the ability to incorporate complex language or recognition constraints, and easy extendibility to continuous recognition and detection domains. In this work, we apply HMMs to several different species and bioacoustic tasks using generalized spectral features that can be easily adjusted across species and HMM network topologies suited to each task. This experimental work includes a simple call type classification task using one HMM per vocalization for repertoire analysis of Asian elephants, a language-constrained song recognition task using syllable models as base units for ortolan bunting vocalizations, and a stress stimulus differentiation task in poultry vocalizations using a non-sequential model via a one-state HMM with Gaussian mixtures. Results show strong performance across all tasks and illustrate the flexibility of the HMM framework for a variety of species, vocalization types, and analysis tasks.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 23
    Publication Date: 2012-05-03
    Description: An adaptive mesh refinement strategy is proposed for local damage models that often arise from internal state variable based continuum damage models. The proposed algorithm employs both the finite element method and the finite difference method to integrate the equations of motion of a linear elastic material with simple isotropic microcracking. The challenges of this problem include the time integration of coupled partial differential equations with time-dependent coefficients, and the proper choice of solution spaces to yield a stable finite element formulation. Discontinuous elements are used for the representation of the damage field, as it is believed that this reduction in regularity is more consistent with the physical nature of evolving microcracking. The adaptive mesh refinement algorithm relies on custom refinement indicators, two of which are presented and compared. The two refinement indicators we explore are based on the time rate of change of the damage field and on the energy release rate, respectively, where the energy release rate measures the energy per unit volume available for damage to evolve. We observe the performance of the proposed algorithm and refinement indicators by comparing the predicted damage morphology on different meshes, hence judging the capability of the proposed technique to address, but not eliminate, the mesh dependency present in the solutions of the damage field.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 24
    Publication Date: 2012-05-03
    Description: The Encyclopedia of Algorithms provides a comprehensive set of solutions to important algorithmic problems for students and researchers, including high-impact solutions from the most recent decade. [...]
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 25
    Publication Date: 2012-05-03
    Description: Recently a Delaunay refinement algorithm has been proposed that can mesh piecewise smooth complexes which include polyhedra, smooth and piecewise smooth surfaces, and non-manifolds. However, this algorithm employs domain dependent numerical predicates, some of which could be computationally expensive and hard to implement. In this paper we develop a refinement strategy that eliminates these complicated domain dependent predicates. As a result we obtain a meshing algorithm that is practical and implementation-friendly.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 26
    Publication Date: 2012-05-03
    Description: We consider the problem of all-to-one selfish routing in the absence of a payment scheme in wireless sensor networks, where a natural model for cost is the power required to forward; we refer to the resulting game as the Locally Minimum Cost Forwarding (LMCF) game. Our objective is to characterize equilibria and their global costs in terms of stretch and diameter, in particular finding incentive compatible algorithms that are also close to globally optimal. We find that although equilibria of LMCF can have arbitrarily bad worst-case social costs, and reaching optimal equilibria is computationally infeasible, there exist greedy and local incentive compatible heuristics that achieve near-optimal global costs.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 27
    Publication Date: 2012-05-03
    Description: The emergence of novel sensing elements, computing nodes, wireless communication and integration technology provides unprecedented possibilities for the design and application of intelligent systems. Each new application system must be designed from scratch, employing sophisticated methods ranging from conventional signal processing to computational intelligence. Currently, a significant part of this overall algorithmic chain of the computational system model still has to be assembled manually by experienced designers in a time- and labor-consuming process. In this research work, this challenge is taken up, and a methodology and algorithms for the automated design of intelligent, integrated and resource-aware multi-sensor systems employing multi-objective evolutionary computation are introduced. The proposed methodology tackles the challenge of rapid prototyping of such systems under realization constraints and, additionally, includes features of system-instance-specific self-correction for sustained operation in large volumes and in dynamically changing environments. The extension of these concepts to reconfigurable hardware platforms yields so-called self-x sensor systems, which stands for, e.g., self-monitoring, -calibrating, -trimming, and -repairing/-healing systems. Selected experimental results prove the applicability and effectiveness of our proposed methodology and emerging tool. With our approach, competitive results were achieved with regard to classification accuracy, flexibility, and design speed under additional design constraints.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 28
    Publication Date: 2012-05-03
    Description: The fluorescence properties of tryptophan residues are sensitive to the microenvironment of fluorophores in proteins. Therefore, fluorescence characteristics are widely used to study structural transitions in proteins. However, the decoding of the structural information from spectroscopic data is challenging. Here we present a review of approaches developed for the decomposition of multi-component protein tryptophan fluorescence spectra and correlation of these spectral parameters with protein structural properties.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 29
    Publication Date: 2012-05-03
    Description: The automated approximation of solutions to differential equations which involve discontinuities across evolving surfaces is addressed. Finite element technology has developed to the point where it is now possible to model evolving discontinuities independently of the underlying mesh, which is particularly useful in simulating failure of solids. However, the approach remains tedious to program, particularly in the case of coupled problems where a variety of finite element bases are employed and where a mixture of continuous and discontinuous fields may be used. We tackle this point by exploring the scope for employing automated code generation techniques for modelling discontinuities. Function spaces and variational forms are defined in a language that resembles mathematical notation, and computer code for modelling discontinuities is automatically generated. Principles underlying the approach are elucidated and a number of two- and three-dimensional examples for different equations are presented.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 30
    Publication Date: 2012-05-03
    Description: Online OVSF code assignment has an important application to wireless communications. Recently, this problem was formally modeled as an online problem, and the performance of online algorithms has been analyzed by competitive analysis. The previous best upper and lower bounds on the competitive ratio were 10 and 5/3, respectively. In this paper, we improve them to 7 and 2, respectively. We also show that our analysis for the upper bound is tight by giving an input sequence for which the competitive ratio of our algorithm is 7 − ε for an arbitrary constant ε > 0.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 31
    Publication Date: 2012-05-03
    Description: We consider a pursuit-evasion problem where some lions have the task to clear a grid graph whose nodes are initially contaminated. The contamination spreads one step per time unit in each direction not blocked by a lion. A vertex is cleared of its contamination whenever a lion moves to it. Brass et al. [5] showed that n/2 lions are not enough to clear the n × n grid. In this paper, we consider the same problem in dimension d > 2 and prove that Θ(n^(d−1)/√d) lions are necessary and sufficient to clear the n^d grid. Furthermore, we analyze a problem variant where the lions are also allowed to jump from grid vertices to non-adjacent grid vertices. (A small simulator for the contamination dynamics follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
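
A small simulator makes the contamination dynamics concrete. The exact order of lion moves, clearing, and spreading is a simplifying assumption here, as is the sweeping-line strategy in the usage comment; the paper's d-dimensional bounds are not reproduced.

```python
def neighbors(v, n):
    x, y = v
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= x + dx < n and 0 <= y + dy < n:
            yield (x + dx, y + dy)

def simulate(n, strategy, steps):
    """strategy(t) -> list of lion positions at time t. Each round: lions
    move (clearing the vertices they occupy), then contamination spreads to
    every neighbor not blocked by a lion. Returns the contaminated set."""
    lions = set(strategy(0))
    contaminated = {(x, y) for x in range(n) for y in range(n)} - lions
    for t in range(1, steps + 1):
        lions = set(strategy(t))
        contaminated -= lions
        contaminated |= {w for v in contaminated for w in neighbors(v, n)
                         if w not in lions}
    return contaminated

# n = 4                                           # a column of n lions sweeping
# sweep = lambda t: [(min(t, n - 1), y) for y in range(n)]
# print(simulate(n, sweep, steps=n - 1))          # set() -> grid cleared
```
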
  • 32
    Publication Date: 2012-05-03
    Description: Considerable importance in molecular biophysics is attached to influencing by mutagenesis the specific properties of a protein family. The working hypothesis is that mutating residues at few selected positions can affect specificity. Statistical analysis of homologue sequences can identify putative specificity determining positions (SDPs) and help to shed some light on the peculiarities underlying their functional role. In this work, we present an approach to identify such positions inspired by state-of-the-art mutual information-based SDP prediction methods. The algorithm based on this approach provides a systematic procedure to point at the relevant physical characteristics of putative SDPs and can investigate the effects of correlated mutations. The method is tested on two standard benchmarks in the field and further validated in the context of a biologically interesting problem: the multimerization of the Intrinsically Fluorescent Proteins (IFP).
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 33
    Publication Date: 2012-05-03
    Description: Calls from 14 species of bat were classified to genus and species using discriminant function analysis (DFA), support vector machines (SVM) and ensembles of neural networks (ENN). Both SVMs and ENNs outperformed DFA for every species, while ENNs (mean identification rate 97%) consistently outperformed SVMs (mean identification rate 87%). Correct classification rates produced by the ENNs varied from 91% to 100%; calls from six species were correctly identified with 100% accuracy. Calls from the five species of Myotis, a genus whose species are considered difficult to distinguish acoustically, had correct identification rates that varied from 91% to 100%. Five parameters were most important for classifying calls correctly, while seven others contributed little to classification performance.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 34
    Publication Date: 2012-05-03
    Description: An image tracking algorithm, originally used with particle image velocimetry (PIV) to determine the velocities of buoyant solid particles in water, is modified and applied in the present work to detect the motion of fire ants on a planar surface. A group of fire ant workers is placed at the bottom of a tub and excited with vibration of selected frequency and intensity. The moving fire ants are captured with an imaging system that successively acquires image frames of high digital resolution. The background noise in the image recordings is extracted by averaging hundreds of frames and removed from each frame. The individual fire ant images are identified with a recursive digital filter, and they are then tracked between frames according to the size, brightness, shape, and orientation angle of the ant image. The speed of an individual ant is determined from the displacement of its images and the time interval between frames. The trail of the individual fire ant is determined from the image tracking results, and a statistical analysis is conducted for all the fire ants in the group. The purpose of the experiment is to investigate the response of fire ants to substrate vibration. Test results indicate that the fire ants move faster after being excited, but the number of active ones is not increased even after a strong excitation.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 35
    Publication Date: 2012-05-03
    Description: Greenhouse-grown butter lettuce (Lactuca sativa L.) can potentially be stored for 21 days at a constant 0°C. When the storage temperature was increased to 5°C or 10°C, shelf life was shortened to 14 or 10 days, respectively, in our previous observations. Also, a commercial shelf life of 7 to 10 days is common, due to postharvest temperature fluctuations. The objective of this study was to establish neural network (NN) models to predict the remaining shelf life (RSL) under fluctuating postharvest temperatures. A box of 12–24 lettuce heads constituted a sample unit. The end of the shelf life of each head was determined when it showed initial signs of decay or yellowing. Air temperatures inside a shipping box were recorded. Daily average temperatures in storage and the averaged shelf life of each box were used as inputs, and the RSL was modeled as an output. An R² of 0.57 was observed when a simple NN structure was employed. Since the "future" (or remaining) storage temperatures were unavailable at the time of making a prediction, a second NN model was introduced to accommodate a range of future temperatures and associated shelf lives. Using such two-stage NN models, an R² of 0.61 could be achieved for predicting RSL. This study indicated that NN modeling has potential for cold chain quality control and shelf life prediction.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 36
    Publication Date: 2012-05-03
    Description: The discovery of gene regulatory elements requires the synergism between computational and experimental techniques in order to reveal the underlying regulatory mechanisms that drive gene expression in response to external cues and signals. Utilizing the large amount of high-throughput experimental data, constantly growing in recent years, researchers have attempted to decipher the patterns which are hidden in the genomic sequences. These patterns, called motifs, are potential binding sites to transcription factors which are hypothesized to be the main regulators of the transcription process. Consequently, precise detection of these elements is required and thus a large number of computational approaches have been developed to support the de novo identification of TFBSs. Even though novel approaches are continuously proposed and almost all have reported some success in yeast and other lower organisms, in higher organisms the problem still remains a challenge. In this paper, we therefore review the recent developments in computational methods for transcription factor binding site prediction. We start with a brief review of the basic approaches for binding site representation and promoter identification, then discuss the techniques to locate physical TFBSs, identify functional binding sites using orthologous information, and infer functional TFBSs within some context defined by additional prior knowledge. Finally, we briefly explore the opportunities for expanding these approaches towards the computational identification of transcriptional regulatory networks.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 37
    Publication Date: 2012-05-03
    Description: Artificial neural networks as a major soft-computing technology have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred development of other neural network training algorithms for other networks such as radial basis function, recurrent network, feedback network, and unsupervised Kohonen self-organizing network. These networks, especially the multilayer perceptron network with a backpropagation training algorithm, have gained recognition in research and applications in various scientific and engineering areas. In order to accelerate the training process and overcome data over-fitting, research has been conducted to improve the backpropagation algorithm. Further, artificial neural networks have been integrated with other advanced methods such as fuzzy logic and wavelet analysis, to enhance the ability of data interpretation and modeling and to avoid subjectivity in the operation of the training algorithm. In recent years, support vector machines have emerged as a set of high-performance supervised generalized linear classifiers in parallel with artificial neural networks. A review of the development history of artificial neural networks is presented, and the standard architectures and algorithms of artificial neural networks are described. Furthermore, advanced artificial neural networks will be introduced with support vector machines, and limitations of ANNs will be identified. The future of artificial neural network development in tandem with support vector machines will be discussed in conjunction with further applications to food science and engineering, soil and water relationship for crop management, and decision support for precision agriculture. Along with the network structures and training algorithms, the applications of artificial neural networks will be reviewed as well, especially in the fields of agricultural and biological engineering.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 38
    Publication Date: 2012-05-03
    Description: This paper presents an adaptive PSO algorithm whose numerical parameters can be updated following a scheduled protocol respecting some known criteria of convergence, in order to enhance the chances of reaching the global optimum of a hard combinatorial optimization problem, such as those encountered in global optimization problems of composite laminates. Some examples concerning hard design problems are provided, showing the effectiveness of the approach. (A minimal scheduled-parameter PSO sketch follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
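
A minimal PSO with a scheduled inertia weight illustrates the kind of parameter protocol the abstract describes. The linear inertia schedule, clamped above the classical stability bound w > (c1 + c2)/2 − 1, is an illustrative choice, not the paper's protocol.

```python
import numpy as np

def pso(f, dim, n=30, iters=200, c1=1.5, c2=1.5, lo=-5.0, hi=5.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n, dim))
    v = np.zeros((n, dim))
    pbest = x.copy()
    pval = np.apply_along_axis(f, 1, x)
    g = pbest[pval.argmin()].copy()
    for t in range(iters):
        # Scheduled inertia: linear decrease, clamped above the classical
        # stability bound w > (c1 + c2)/2 - 1.
        w = max(0.9 - 0.5 * t / iters, (c1 + c2) / 2.0 - 1.0 + 1e-3)
        r1, r2 = rng.random((2, n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 1, x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()].copy()
    return g, float(pval.min())

# g, best = pso(lambda z: float(np.sum(z * z)), dim=10)   # sphere benchmark
```
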
  • 39
    Publication Date: 2012-05-03
    Description: This work presents a generalized approach for the fast structural alignment of thousands of macromolecular structures. The method uses string representations of a macromolecular structure and a hash table that stores n-grams of a certain size for searching. To this end, macromolecular structure-to-string translators were implemented for protein and RNA structures. A query against the index is performed in two hierarchical steps to unite speed and precision. In the first step the query structure is translated into n-grams, and all target structures containing these n-grams are retrieved from the hash table. In the second step all corresponding n-grams of the query and each target structure are subsequently aligned, and after each alignment a score is calculated based on the matching n-grams of query and target. The extendable framework enables the user to query and structurally align thousands of protein and RNA structures on a commodity machine and is available as open source from http://lajolla.sf.net.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 40
    Publication Date: 2012-05-03
    Description: This paper presents a survey describing recent developments in the area of mathematical programming techniques for various types of sensor network applications. We discuss mathematical programming formulations associated with these applications, as well as methods for solving the corresponding problems. We also address some of the challenges arising in this area, including both conceptual and computational aspects.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 41
    Publication Date: 2012-05-03
    Description: Within the framework of multifield continua, we move from the model of elastic microcracked body introduced in (Mariano, P.M. and Stazi, F.L., Strain localization in elastic microcracked bodies, Comp. Methods Appl. Mech. Engrg. 2001, 190, 5657–5677) and propose a few novel variational formulations of mixed type along with relevant mixed FEM discretizations. To this goal, suitably extended Hellinger-Reissner principles of primal and dual type are derived. A few numerical studies are presented that include an investigation on the interaction between a single cohesive macrocrack and diffuse microcracks (Mariano, P.M. and Stazi, F.L., Strain localization due to crack–microcrack interactions: X–FEM for a multifield approach, Comp. Methods Appl. Mech. Engrg. 2004, 193, 5035–5062).
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 42
    Publication Date: 2012-05-03
    Description: Although the electrolarynx (EL) provides an important means of voice reconstruction for patients who lose their vocal cords by laryngectomies, the radiated noise and additive environment noise reduce the intelligibility of the resulting EL speech. This paper proposes an improved spectral subtraction algorithm that takes into account the non-uniform effect of colored noise on the spectrum of EL speech. Since the over-subtraction factor of each frequency band can be adjusted in the enhancement process, a better noise reduction effect was obtained and the perceptually annoying musical noise was efficiently reduced, compared to other standard speech enhancement algorithms. (A band-dependent spectral-subtraction sketch follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
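
The band-dependent over-subtraction idea can be sketched with a short STFT pipeline. Everything here is illustrative: the Hann/50%-overlap analysis, the linear ramp of over-subtraction factors across bins, and the spectral floor beta are assumptions, and a noise-only reference segment is assumed to be available.

```python
import numpy as np

def spectral_subtract(x, noise_ref, frame=256, hop=128,
                      alpha_lo=2.0, alpha_hi=5.0, beta=0.01):
    """Magnitude spectral subtraction with a band-dependent over-subtraction
    factor alpha (a linear ramp across frequency bins, purely to illustrate
    non-uniform treatment of colored noise)."""
    win = np.hanning(frame)                 # ~COLA at 50% overlap

    def stft(sig):
        idx = range(0, len(sig) - frame + 1, hop)
        return np.array([np.fft.rfft(win * sig[i:i + frame]) for i in idx])

    X = stft(x)
    N = np.abs(stft(noise_ref)).mean(axis=0)        # mean noise magnitude per bin
    alpha = np.linspace(alpha_lo, alpha_hi, X.shape[1])
    mag = np.abs(X) - alpha * N                     # over-subtract per band
    mag = np.maximum(mag, beta * np.abs(X))         # spectral floor vs musical noise
    Y = mag * np.exp(1j * np.angle(X))              # keep the noisy phase
    out = np.zeros(len(x))                          # overlap-add resynthesis
    for k, i in enumerate(range(0, len(x) - frame + 1, hop)):
        out[i:i + frame] += np.fft.irfft(Y[k], frame)
    return out

# fs = 8000; t = np.arange(2 * fs) / fs
# tone_bursts = np.sin(2 * np.pi * 180 * t) * (t % 0.5 < 0.3)
# colored = np.cumsum(np.random.randn(t.size)) * 0.01   # low-frequency noise
# enhanced = spectral_subtract(tone_bursts + colored, colored)
```
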
  • 43
    Publication Date: 2012-05-03
    Description: The goal of this study was to develop a computer-aided therapeutic response (CADrx) system for early prediction of drug treatment response for glioblastoma multiforme (GBM) brain tumors with diffusion weighted (DW) MR images. In conventional Macdonald assessment, tumor response is assessed nine weeks or more post-treatment. However, we will investigate the ability of DW-MRI to assess response earlier, at five weeks post treatment. The apparent diffusion coefficient (ADC) map, calculated from DW images, has been shown to reveal changes in the tumor’s microenvironment preceding morphologic tumor changes. ADC values in treated brain tumors could theoretically both increase due to the cell kill (and thus reduced cell density) and decrease due to inhibition of edema. In this study, we investigated the effectiveness of features that quantify changes from pre- and post-treatment tumor ADC histograms to detect treatment response. There are three parts to this study: first, tumor regions were segmented on T1w contrast enhanced images by Otsu’s thresholding method, and mapped from T1w images onto ADC images by a 3D region of interest (ROI) mapping tool using DICOM header information; second, ADC histograms of the tumor region were extracted from both pre- and five weeks post-treatment scans, and fitted by a two-component Gaussian mixture model (GMM). The GMM features as well as standard histogram-based features were extracted. Finally, supervised machine learning techniques were applied for classification of responders or non-responders. The approach was evaluated with a dataset of 85 patients with GBM under chemotherapy, in which 39 responded and 46 did not, based on tumor volume reduction. We compared AdaBoost, random forest and support vector machine classification algorithms, using ten-fold cross validation, resulting in the best accuracy of 69.41% and the corresponding area under the curve (Az) of 0.70. (A short sketch of the GMM histogram-feature step follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
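
The histogram-feature step lends itself to a short sketch. The scikit-learn GaussianMixture call and the particular feature names are illustrative assumptions; the paper's full pipeline (segmentation, ROI mapping, classifier comparison) is not reproduced.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def gmm_histogram_features(adc_values):
    """Fit a two-component GMM to a tumor ROI's ADC values and return
    summary features of the kind the abstract describes (component means,
    weights, spreads, plus plain histogram statistics)."""
    X = np.asarray(adc_values, dtype=float).reshape(-1, 1)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
    order = np.argsort(gmm.means_.ravel())          # sort components by mean
    mu = gmm.means_.ravel()[order]
    w = gmm.weights_[order]
    sd = np.sqrt(gmm.covariances_.ravel()[order])
    return {"mu_low": mu[0], "mu_high": mu[1], "w_low": w[0],
            "sd_low": sd[0], "sd_high": sd[1],
            "mean": float(X.mean()), "std": float(X.std())}

# Response features are changes between scans, e.g.:
# pre, post = gmm_histogram_features(adc_pre), gmm_histogram_features(adc_post)
# delta = {k: post[k] - pre[k] for k in pre}   # input to the classifier stage
```
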
  • 44
    Publication Date: 2012-05-03
    Description: The performance of microwave radiometers can be seriously degraded by the presence of radio-frequency interference (RFI). Spurious signals and harmonics from lower frequency bands, spread-spectrum signals overlapping the “protected” band of operation, or out-of-band emissions not properly rejected by the pre-detection filters due to their finite rejection modify the detected power and the estimated antenna temperature from which the geophysical parameters will be retrieved. In recent years, techniques to detect the presence of RFI have been developed. They include time- and/or frequency-domain analyses, or statistical analysis of the received signal which, in the absence of RFI, must be a zero-mean Gaussian process. Current mitigation techniques are mostly based on blanking in the time and/or frequency domains where RFI has been detected. However, in some geographical areas, RFI is so persistent in time that it is not possible to acquire RFI-free radiometric data. In other applications such as sea surface salinity retrieval, where the sensitivity of the brightness temperature to salinity is weak, small amounts of RFI are also very difficult to detect and mitigate. In this work a wavelet-based technique is proposed to mitigate RFI (cancel RFI as much as possible). The interfering signal is estimated by using the powerful denoising capabilities of the wavelet transform. The estimated RFI signal is then subtracted from the received signal and a “cleaned” noise signal is obtained, from which the power is estimated later. The algorithm’s performance as a function of the threshold type, the threshold selection method, the decomposition level, the wavelet type and the interference-to-noise ratio is presented. Computational requirements are evaluated in terms of quantization levels, number of operations, and memory requirements (sequence length). Even though they are high for today’s technology, the algorithms presented can be applied to recorded data. The results show that even RFI much larger than the noise signal can be very effectively mitigated, well below the noise level. (A minimal wavelet-denoising sketch follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
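
A minimal wavelet-denoising RFI canceller, assuming the PyWavelets package: large wavelet coefficients are attributed to the (sparse) RFI, the thresholded reconstruction is taken as the RFI estimate, and it is subtracted from the received signal. The db4 wavelet, the decomposition level, and the universal threshold are illustrative choices, not the paper's tuned settings.

```python
import numpy as np
import pywt   # PyWavelets

def mitigate_rfi(x, wavelet="db4", level=5):
    """Estimate RFI by hard-thresholding wavelet coefficients (the RFI is
    assumed sparse and large in the wavelet domain), then subtract the
    estimate so that a 'cleaned' noise signal remains for power estimation."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745    # noise std, finest scale
    thr = sigma * np.sqrt(2.0 * np.log(len(x)))       # universal threshold
    kept = [coeffs[0]] + [pywt.threshold(c, thr, mode="hard")
                          for c in coeffs[1:]]
    rfi_estimate = pywt.waverec(kept, wavelet)[:len(x)]
    return x - rfi_estimate

# t = np.linspace(0.0, 1.0, 4096)
# rfi = 3.0 * np.sin(2 * np.pi * 60 * t) * (t > 0.4) * (t < 0.6)  # bursty interferer
# received = np.random.randn(t.size) + rfi
# cleaned = mitigate_rfi(received)   # should be close to the noise alone
```
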
  • 45
    Publication Date: 2012-05-03
    Description: In geometric models with high-valence vertices, current subdivision wavelets may not handle the special cases well enough for a good visual effect on multiresolution surfaces. In this paper, we present novel biorthogonal polar subdivision wavelets, which can efficiently perform wavelet analysis on control nets with polar structures. Polar subdivision generates more natural subdivision surfaces around high-valence vertices and avoids the ripples and saddle points that Catmull-Clark subdivision may produce. Based on polar subdivision, our wavelet scheme supports special operations on the polar structures, making it especially suitable for models where many facets join. For seamless fusion with Catmull-Clark subdivision wavelets, we construct the wavelets in circular and radial layers of polar structures, so we can combine the subdivision wavelets smoothly for composite models formed by quadrilaterals and polar structures. The computations of wavelet analysis and synthesis are highly efficient and fully in-place. The experimental results have confirmed the stability of our proposed approach.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 46
    Publication Date: 2012-05-03
    Description: The symmetric-convolution multiplication (SCM) property of discrete trigonometric transforms (DTTs) based on unitary transform matrices is developed. Then, as the reciprocal of this property, the novel multiplication symmetric-convolution (MSC) property of discrete trigonometric transforms is developed.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 47
    Publication Date: 2012-05-03
    Description: With the aim of classifying sperm whales, this report compares two methods that can use Gaussian functions: a radial basis function network, and support vector machines trained with two different approaches known as C-SVM and ν-SVM. The methods were tested on data recordings from seven different male sperm whales, six containing single click trains and the seventh containing a complete dive. Both types of classifiers could distinguish between the clicks of the seven different whales, but the SVM seemed to generalise better to unknown data, at the cost of needing more information and slower performance.
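    For readers who want to experiment, the following scikit-learn sketch contrasts the two SVM formulations with a Gaussian (RBF) kernel; the synthetic stand-in features, the seven balanced classes (one per whale) and all parameter values are assumptions for illustration, not the paper's data or settings.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC, NuSVC

      # Synthetic stand-in for per-click feature vectors; one class per whale.
      rng = np.random.default_rng(1)
      X = np.vstack([rng.normal(loc=i, scale=1.0, size=(50, 8)) for i in range(7)])
      y = np.repeat(np.arange(7), 50)
      Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

      # C-SVM and nu-SVM differ only in how the regularization trade-off is parameterized.
      for clf in (SVC(kernel='rbf', C=10.0), NuSVC(kernel='rbf', nu=0.1)):
          clf.fit(Xtr, ytr)
          print(type(clf).__name__, 'test accuracy: %.2f' % clf.score(Xte, yte))
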
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 48
    Publication Date: 2012-05-03
    Description: A compressed full-text self-index for a text T is a data structure requiring reduced space and able to search for patterns P in T. It can also reproduce any substring of T, thus actually replacing T. Despite the recent explosion of interest in compressed indexes, there has not been much progress on functionalities beyond basic exact search. In this paper we focus on indexed approximate string matching (ASM), which is of great interest, say, in bioinformatics. We study ASM algorithms for Lempel-Ziv compressed indexes and for compressed suffix trees/arrays. Most compressed self-indexes belong to one of these classes. We start by adapting the classical method of partitioning into exact search to self-indexes, and optimize it over a representative of either class of self-index. Then, we show that a Lempel-Ziv index can be seen as an extension of the classical q-samples index. We give new insights on this type of index, which can be of independent interest, and then apply them to a Lempel-Ziv index. Finally, we improve hierarchical verification, a successful technique for sequential searching, so as to extend the matches of pattern pieces to the left or right. Most compressed suffix trees/arrays support the required bidirectionality, thus enabling the implementation of the improved technique. In turn, the improved verification largely reduces the accesses to the text, which are expensive in self-indexes. We show experimentally that our algorithms are competitive and provide useful space-time tradeoffs compared to classical indexes.
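    The classical filter mentioned above is easy to sketch. In this Python toy version, a plain string search stands in for the exact search that would run on the self-index, and a dynamic-programming scan verifies each candidate window; it is a didactic sketch of partitioning into exact search, not the optimized indexed variant of the paper.

      def best_match_in(pattern, window):
          # Minimum edit distance between pattern and any substring of window
          # (semi-global dynamic programming).
          prev = [0] * (len(window) + 1)
          for i in range(1, len(pattern) + 1):
              cur = [i] + [0] * len(window)
              for j in range(1, len(window) + 1):
                  cur[j] = min(prev[j] + 1, cur[j - 1] + 1,
                               prev[j - 1] + (pattern[i - 1] != window[j - 1]))
              prev = cur
          return min(prev)

      def approx_search(text, pattern, k):
          # If pattern occurs with at most k errors, at least one of its k+1
          # pieces occurs exactly; locate pieces, then verify a window around them.
          m = len(pattern)
          step = m // (k + 1)
          found = set()
          for p in range(k + 1):
              lo = p * step
              hi = m if p == k else lo + step
              pos = text.find(pattern[lo:hi])
              while pos != -1:
                  ws, we = max(0, pos - lo - k), min(len(text), pos - lo + m + k)
                  if best_match_in(pattern, text[ws:we]) <= k:
                      found.add(ws)
                  pos = text.find(pattern[lo:hi], pos + 1)
          return sorted(found)

      print(approx_search('the indexed approximate string matching', 'aproximate', 1))
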
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 49
    Publication Date: 2012-05-03
    Description: Spatial registration of multidate or multisensorial images is required for many applications in remote sensing. Automatic image registration, which has been extensively studied in other areas of image processing, is still a complex problem in the framework of remote sensing. In this work, we explore an alternative strategy: a fully automatic and operational registration system capable of registering multitemporal and multisensorial remote sensing satellite images with high accuracy while avoiding the use of ground control points, by exploiting the most reliable information in both images (coastlines not occluded by clouds); the input images are assumed to have been coarsely geometrically corrected using only an orbital prediction model. The automatic feature-based approach is summarized as follows: (i) reference image coastline extraction; (ii) sensed image gradient energy map estimation; and (iii) contour matching, mapping function estimation and transformation of the sensed images. Several experimental results for single-sensor imagery (AVHRR/3) and multisensorial imagery (AVHRR/3–SeaWiFS–MODIS–ATSR) from different viewpoints and dates have verified the robustness and accuracy of the proposed automatic registration algorithm, demonstrating its capability of registering satellite images of coastal areas to within one pixel.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 50
    Publication Date: 2012-05-03
    Description: A cascade correlation learning architecture has been devised for the first time for radial basis function processing units. The proposed algorithm was evaluated with two synthetic data sets and two chemical data sets by comparison with six other standard classifiers. The abilities to detect a novel class and to handle an imbalanced class were demonstrated with synthetic data. In the chemical data sets, the growth regions of Italian olive oils were identified by their fatty acid profiles, and mass spectra of polychlorobiphenyl compounds were classified by chlorine number. The prediction results by bootstrap Latin partition indicate that the proposed neural network is useful for pattern recognition.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 51
    Publication Date: 2012-05-03
    Description: Wireless sensor networks are a relatively new, fast-developing technology, used to solve a great diversity of problems that range from museum security to wildlife protection. The geometric optimisation problem solved in this paper is aimed at minimising the sensors’ range so that every point of a polygonal region R is within the range of at least two sensors. Moreover, it is also shown how to minimise the sensors’ range to assure the existence of a path within R that stays as close to two sensors as possible.
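    As a rough numerical companion (not the paper's exact geometric algorithm, which works directly on the polygon), the following Python sketch approximates the smallest range guaranteeing double coverage by sampling a rectangular region on a grid; the sensor positions are hypothetical.

      import numpy as np

      def min_double_coverage_range(sensors, xmax, ymax, step=0.02):
          # Smallest r such that every sampled point of [0,xmax] x [0,ymax] is
          # within distance r of at least two sensors: the maximum, over the
          # samples, of the distance to the second-nearest sensor.
          xs = np.arange(0.0, xmax + step, step)
          ys = np.arange(0.0, ymax + step, step)
          pts = np.stack(np.meshgrid(xs, ys), axis=-1).reshape(-1, 2)
          d = np.linalg.norm(pts[:, None, :] - np.asarray(sensors)[None, :, :], axis=2)
          return np.sort(d, axis=1)[:, 1].max()

      sensors = [(0.2, 0.3), (0.8, 0.4), (0.5, 0.9)]   # hypothetical positions
      print('range needed: %.3f' % min_double_coverage_range(sensors, 1.0, 1.0))
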
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 52
    Publication Date: 2012-05-03
    Description: A general review of the extended finite element method and its application to the simulation of first-order phase transitions is provided. Detailed numerical investigations are then performed by focusing on the one-dimensional case and studying: (i) spatial and temporal discretisations, (ii) different numerical techniques for the interface-condition enforcement, and (iii) different treatments for the blending elements. An embedded-discontinuity finite element approach is also developed and compared with the extended finite element method, so that a clearer insight into the latter can be given. Numerical examples for melting/solidification in planar, cylindrical, and spherical symmetry are presented and the results are compared with analytical solutions.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 53
    Publication Date: 2012-05-03
    Description: This paper reviews the basics and recent research of computer-aided diagnosis (CAD) systems for assisting neuroradiologists in the detection of brain diseases, e.g., asymptomatic unruptured aneurysms, Alzheimer's disease, vascular dementia, and multiple sclerosis (MS), in magnetic resonance (MR) images. The CAD systems consist of image feature extraction based on image processing techniques and machine learning classifiers such as linear discriminant analysis, artificial neural networks, and support vector machines. We introduce useful examples of CAD systems in neuroradiology, and conclude with the future possibilities of CAD systems for brain diseases in MR images.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 54
    Publication Date: 2012-05-03
    Description: The Web Graph is a large-scale graph that does not fit in main memory, so lossless compression methods have been proposed for it. This paper introduces a compression scheme that combines efficient storage with fast retrieval of the information in a node. The scheme exploits the properties of the Web Graph without assuming an ordering of the URLs, so it may be applied to more general graphs. Tests on some datasets in use achieve space savings of about 10% over existing methods.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 55
    Publication Date: 2012-05-03
    Description: Basics of Bayesian statistics in inverse problems using the maximum entropy principle are summarized in connection with the restoration of positive, additive images from various types of data like X-ray digital mammograms. An efficient iterative algorithm for image restoration from large data sets, based on the conjugate gradient method and Lagrange multipliers in nonlinear optimization of a specific potential function, was developed. The point spread function of the imaging system was determined by numerical simulations of inhomogeneous breast-like tissue with microcalcification inclusions of various opacities. The processed digital and digitized mammograms proved superior to their raw counterparts in terms of contrast, resolution, noise, and visibility of details.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 56
    Publication Date: 2012-05-03
    Description: With the rapid advance of digital imaging technologies, content-based image retrieval (CBIR) has become one of the most active research areas in computer vision. In the last several years, developing computer-aided detection and/or diagnosis (CAD) schemes that use CBIR to search for clinically relevant and visually similar medical images (or regions) depicting suspicious lesions has also been attracting research interest. CBIR-based CAD schemes have the potential to provide radiologists with “visual aid” and increase their confidence in accepting CAD-cued results in decision making. The CAD performance and reliability depend on a number of factors including the optimization of lesion segmentation, feature selection, reference database size, computational efficiency, and the relationship between the clinical relevance and visual similarity of the CAD results. By presenting and comparing a number of approaches commonly used in previous studies, this article identifies and discusses the optimal approaches in developing CBIR-based CAD schemes and assessing their performance. Although preliminary studies have suggested that using CBIR-based CAD schemes might improve radiologists’ performance and/or increase their confidence in decision making, this technology is still in the early development stage. Much research work is needed before CBIR-based CAD schemes can be accepted in clinical practice.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 57
    Publication Date: 2012-05-03
    Description: We present a coupled finite element, Kalman filter approach to foresee impact-induced delamination of layered composites when mechanical properties are partially unknown. Since direct numerical simulations, which require all the constitutive parameters to be assigned, cannot be run in such cases, an inverse problem is formulated to allow for modeling as well as constitutive uncertainties. Upon space discretization through finite elements and time integration through an explicit method, the resulting nonlinear stochastic state model, wherein nonlinearities are due to delamination growth, is attacked with sigma-point Kalman filtering. Comparison with experimental data available in the literature, concerning inter-laminar failure of layered composites subject to low-velocity impacts, shows that the proposed procedure leads to an accurate description of the failure mode and to converged estimates of inter-laminar strength and toughness in good agreement with experimental data.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 58
    Publication Date: 2012-05-03
    Description: A robust and complete workflow for metabolic profiling and data mining is described in detail. Three independent and complementary analytical techniques for metabolic profiling were applied: hydrophilic interaction chromatography (HILIC–LC–ESI–MS), reversed-phase liquid chromatography (RP–LC–ESI–MS), and gas chromatography (GC–TOF–MS), all coupled to mass spectrometry (MS). Unsupervised methods, such as principal component analysis (PCA) and clustering, and supervised methods, such as classification and PCA-DA (discriminant analysis), were used for data mining. Genetic Algorithms (GA), a multivariate approach, were probed for the selection of the smallest subsets of potentially discriminative predictors. From the thousands of peaks found in total, the small subsets selected by GA were considered highly potential predictors allowing discrimination among groups. It was found that the small groups of potential top predictors selected with PCA-DA and GA are different and unique. Annotated GC–TOF–MS data identified feature metabolites. Metabolites putatively detected with LC–ESI–MS profiling require further elemental composition assignment with accurate mass measurement by Fourier-transform ion cyclotron resonance mass spectrometry (FT-ICR-MS) and structure elucidation by nuclear magnetic resonance spectroscopy (NMR). GA was also used to generate correlated networks for pathway analysis. Several case studies, comprising groups of plant samples bearing different genotypes and groups of samples of human origin, namely patients’ and healthy volunteers’ urine samples, demonstrated that such a workflow, combining comprehensive metabolic profiling and advanced data mining techniques, provides a powerful approach for pattern recognition and biomarker discovery.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 59
    Publication Date: 2012-05-03
    Description: The theoretical Quantum Key-Distribution scheme of Bennett and Brassard (BB84) has been proven secure against very strong attacks, including collective attacks and joint attacks. Though the latter are the most general attacks, collective attacks are much easier to analyze, yet they are conjectured to be just as informative to the eavesdropper. Thus, collective attacks are likely to be useful in the analysis of many theoretical and practical schemes that still lack a proof of security, including practical BB84 schemes. We show how the powerful tools developed in previous works for proving security against the joint attack are simplified when applied to the security of BB84 against collective attacks, whilst providing the same bounds on leaked information and the same error threshold.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 60
    Publication Date: 2012-05-03
    Description: An image pattern tracking algorithm is described in this paper for time-resolved measurements of mini- and micro-scale movements of complex objects. This algorithm works with a high-speed digital imaging system, which records thousands of successive image frames in a short time period. The image pattern of the observed object is tracked among successively recorded image frames with a correlation-based algorithm, so that the time histories of the position and displacement of the investigated object in the camera focus plane are determined with high accuracy. The speed, acceleration and harmonic content of the investigated motion are obtained by post-processing the position and displacement time histories. The described image pattern tracking algorithm is tested with synthetic image patterns and verified with tests on live insects.
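    A bare-bones version of such correlation-based tracking can be written in a few lines of Python with NumPy; the template size, search radius and synthetic frames below are illustrative stand-ins for the high-speed camera data.

      import numpy as np

      def track(template, frame, center, search=10):
          # Slide the template over a small search window around the previous
          # position; return the position maximizing normalized cross-correlation.
          th, tw = template.shape
          t = template - template.mean()
          cy, cx = center
          best, best_pos = -np.inf, center
          for y in range(max(0, cy - search), min(frame.shape[0] - th, cy + search) + 1):
              for x in range(max(0, cx - search), min(frame.shape[1] - tw, cx + search) + 1):
                  p = frame[y:y + th, x:x + tw]
                  p = p - p.mean()
                  denom = np.sqrt((t * t).sum() * (p * p).sum())
                  score = (t * p).sum() / denom if denom > 0 else -1.0
                  if score > best:
                      best, best_pos = score, (y, x)
          return best_pos, best

      # Synthetic test: a bright 8x8 blob moves by (+3, -2) pixels between frames.
      rng = np.random.default_rng(0)
      frame0 = rng.random((64, 64)); frame0[30:38, 30:38] += 2.0
      frame1 = rng.random((64, 64)); frame1[33:41, 28:36] += 2.0
      print(track(frame0[30:38, 30:38], frame1, (30, 30)))   # expect roughly ((33, 28), ~1.0)
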
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 61
    Publication Date: 2012-05-03
    Description: With the discovery of insulin came a deeper understanding of therapeutic options for one of the most devastating chronic diseases of the modern era, diabetes mellitus. The use of insulin in the treatment of diabetes, especially in those with severe insulin deficiency (type 1 diabetes), with multiple injections or continuous subcutaneous infusion, has been largely successful, but the risk for short term and long term complications remains substantial. Insulin treatment decisions are based on the patient’s knowledge of meal size, exercise plans and the intermittent knowledge of blood glucose values. As such, these are open loop methods that require human input. The idea of closed loop control of diabetes treatment is quite different: automated control of a device that delivers insulin (and possibly glucagon or other medications), based on continuous or very frequent glucose measurements. Closed loop insulin control for type 1 diabetes is not new but is far from optimized. The goal of such a system is to avoid short-term complications (hypoglycemia) and long-term complications (diseases of the eyes, kidneys, nerves and cardiovascular system) by mimicking the normal insulin secretion pattern of the pancreatic beta cell. A control system for automated diabetes treatment consists of three major components: (1) a glucose sensing device that serves as the afferent limb of the system; (2) an automated control unit that uses algorithms to acquire sensor input and generate treatment outputs; and (3) a drug delivery device (primarily for delivery of insulin), which serves as the system’s efferent limb. There are several major issues that highlight the difficulty of interacting with the complex unknowns of the biological world. For example, development of accurate continuous glucose monitors is crucial; the state of the art in 2009 is that such devices sometimes experience drift and are intended only to supplement information received from standard intermittent blood glucose data. In addition, it is important to acknowledge that an “automated” closed loop pancreas cannot approach the complexity of the normal human endocrine pancreas, which takes continuous data from substrates, hormones, paracrine compounds and autonomic neural inputs, and in response secretes four hormones. Another major issue is the substantial absorption/action delay of insulin given by the subcutaneous route. Because of this delay, some researchers have recently given a portion of the meal-related insulin in an open loop manner before the meal and found this hybrid approach to be superior to closed loop control. Proportional-Integral-Derivative (PID) systems adapted from the industrial sector utilize control algorithms that alter output based on proportional (difference between actual and target levels), derivative (rate of change) and integral (time-related summative) errors in glucose. These algorithms have proven very promising in limited clinical trials. Related algorithms include a “fading memory” system that combines the proportional-derivative components of a classic PID system with a time-related decay of input signals, allowing greater emphasis on more recent glucose values, a characteristic noted in mammalian beta-cells. Model Predictive Control (MPC) systems are highly adaptive methods that utilize mathematical models based on observations of biological behavior patterns using system identification, and are now undergoing testing in humans. The application of further mathematical models, such as fuzzy control and artificial neural networks, is also promising, but largely clinically untested. In summary, the prospects for closed loop control of glycemia in persons with diabetes have improved considerably. Major limitations include the delayed absorption/action of subcutaneous insulin and the imperfect stability of currently-available continuous glucose sensors. The potential for improved glycemic control in persons with diabetes brings with it the potential for a reduction in the frequency of acute and chronic complications of diabetes.
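    As a generic illustration of the PID idea in this setting (not any validated clinical algorithm), consider the following Python sketch; the gains, units and sampling interval are arbitrary placeholders.

      class PID:
          # Discrete PID controller: the output combines proportional, integral
          # and derivative errors, as described above.
          def __init__(self, kp, ki, kd, dt):
              self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
              self.integral = 0.0
              self.prev_error = None

          def step(self, target, measured):
              error = measured - target        # positive when glucose is too high
              self.integral += error * self.dt
              deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
              self.prev_error = error
              # Insulin delivery rate, clipped at zero (insulin cannot be withdrawn).
              return max(0.0, self.kp * error + self.ki * self.integral + self.kd * deriv)

      pid = PID(kp=0.05, ki=0.001, kd=0.2, dt=5.0)     # placeholder gains
      for glucose in (180, 175, 160, 150, 140):        # mg/dL readings every 5 minutes
          print('suggested insulin rate: %.3f' % pid.step(100.0, glucose))
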
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 62
    Publication Date: 2012-05-03
    Description: Functional mapping of dynamic traits measured in a longitudinal study was originally derived within the maximum likelihood (ML) context and implemented with the EM algorithm. Although ML-based functional mapping possesses many favorable statistical properties in parameter estimation, it may be computationally intractable for analyzing longitudinal data with high dimensions and high measurement errors. In this article, we derive a general functional mapping framework for quantitative trait locus mapping of dynamic traits within the Bayesian paradigm. Markov chain Monte Carlo techniques were implemented for functional mapping to estimate biologically and statistically sensible parameters that model the structures of time-dependent genetic effects and the covariance matrix. The Bayesian approach is useful for handling difficulties in constructing confidence intervals as well as the identifiability problem, enhancing the statistical inference of functional mapping. We have undertaken simulation studies to investigate the statistical behavior of Bayesian-based functional mapping and used a real example with F2 mice to validate the utility of the model.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 63
    Publication Date: 2012-05-03
    Description: Specialized intelligent systems can be found everywhere: fingerprint, handwriting, speech, and face recognition, spam filtering, chess and other game programs, robots, and more. This decade the first presumably complete mathematical theory of artificial intelligence based on universal induction-prediction-decision-action has been proposed. This information-theoretic approach solidifies the foundations of inductive inference and artificial intelligence. Getting the foundations right usually marks a significant progress and maturing of a field. The theory provides a gold standard and guidance for researchers working on intelligent algorithms. The roots of universal induction were laid exactly half a century ago and the roots of universal intelligence exactly one decade ago. So it is timely to take stock of what has been achieved and what remains to be done. Since there are already good recent surveys, I describe the state-of-the-art only in passing and refer the reader to the literature. This article concentrates on the open problems in universal induction and its extension to universal intelligence.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 64
    Publication Date: 2012-05-03
    Description: Link puzzles involve finding paths or a cycle in a grid that satisfy given local and global properties. This paper proposes algorithms that enumerate solutions and instances of two link puzzles, Slitherlink and Numberlink, by zero-suppressed binary decision diagrams (ZDDs). A ZDD is a compact data structure for a family of sets provided with a rich family of set operations, by which, for example, one can easily extract a subfamily satisfying a desired property. Thanks to the nature of ZDDs, our algorithms offer a tool to assist users in designing instances of those link puzzles.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 66
    Publication Date: 2012-05-03
    Description: Suppose there is a collection of n simple polygons in the plane, none of which overlap each other. The polygons are interlocked if no subset can be separated arbitrarily far from the rest. It is natural to ask for a characterization of the subsets that make the set of interlocked polygons free (not interlocked). This abstracts the essence of a kind of sliding block puzzle. We show that any monotone Boolean function ƒ on n variables can be described by m = O(n) interlocked polygons. We also show that the decision problem that asks if given polygons are interlocked is PSPACE-complete.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 67
    Publication Date: 2012-05-03
    Description: Tantrix (Tantrix® is a registered trademark of Colour of Strategy Ltd. in New Zealand, and of TANTRIX JAPAN in Japan, respectively, under the license of M. McManaway, the inventor.) is a puzzle whose goal is to make a loop by connecting lines drawn on hexagonal tiles, and the objective of this research is to solve it by computer. For this purpose, we first pose the problem of solving Tantrix as making a loop on a given fixed board. We then formulate it as an integer program by describing the rules of Tantrix as its constraints, and solve it with a mathematical programming solver to obtain a solution. As a result, we establish a formulation that can solve Tantrix instances of moderate size; even when the solutions obtained under the elementary constraints alone are invalid, we succeed by introducing additional constraints and re-solving. With this approach we solved Tantrix instances of size up to 60.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 68
    Publication Date: 2012-05-03
    Description: A disentanglement puzzle consists of mechanically interlinked pieces, and the puzzle is solved by disentangling one piece from another set of pieces. A string puzzle consists of strings entangled with one or more wooden pieces. We consider the generalized string puzzle problem whose input is the layout of strings and a wooden board with holes embedded in the 3-dimensional Euclidean space. We present a polynomial-time transformation from an arbitrary instance ƒ of the 3SAT problem to a string puzzle s such that ƒ is satisfiable if and only if s is solvable. Therefore, the generalized string puzzle problem is NP-hard.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 69
    Publication Date: 2012-05-03
    Description: Grammar-based compression is a well-studied technique that constructs a context-free grammar (CFG) deriving a given text uniquely. In this work, we propose an online algorithm for grammar-based compression. Our algorithm guarantees an O(log² n)-approximation ratio for the minimum grammar size, where n is the input size, and it runs in time linear in the input and space linear in the output. In addition, we propose a practical encoding, which transforms a restricted CFG into a more compact representation. Experimental results, by comparison with standard compressors, demonstrate that our algorithm is especially effective for highly repetitive text.
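    The paper's online algorithm is more intricate; to convey the flavor of grammar-based compression, here is a minimal offline Re-Pair-style compressor in Python (a different, classical scheme): it repeatedly replaces the most frequent adjacent pair of symbols by a fresh nonterminal.

      from collections import Counter

      def repair(text):
          # Offline Re-Pair-style grammar construction (illustrative only).
          seq, rules, next_id = list(text), {}, 0
          while True:
              pairs = Counter(zip(seq, seq[1:]))
              if not pairs:
                  break
              pair, freq = pairs.most_common(1)[0]
              if freq < 2:
                  break
              nt = 'N%d' % next_id
              next_id += 1
              rules[nt] = pair                     # new rule: nt -> pair
              out, i = [], 0
              while i < len(seq):                  # greedy left-to-right replacement
                  if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                      out.append(nt)
                      i += 2
                  else:
                      out.append(seq[i])
                      i += 1
              seq = out
          return seq, rules

      start, rules = repair('abracadabra abracadabra')
      print(len(start), 'symbols in the start rule,', len(rules), 'rules')
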
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 70
    Publication Date: 2012-05-03
    Description: In this note we illustrate and develop further, with mathematics and examples, the work on successive standardization (or normalization) studied earlier by the same authors in [1] and [2]. Thus, we deal with successive iterations applied to rectangular arrays of numbers, where, to avoid technical difficulties, an array has at least three rows and at least three columns. Without loss of generality, an iteration begins with operations on columns: first subtract the mean of each column; then divide by its standard deviation. The iteration continues with the same two operations done successively for rows. These four operations applied in sequence complete one iteration. One then iterates again, and again, and again, ... In [1] it was argued that if arrays are made up of real numbers, then the set for which convergence of these successive iterations fails has Lebesgue measure 0. The limiting array has row and column means 0 and row and column standard deviations 1. A basic result on convergence given in [1] is true, though the argument in [1] is faulty. The result is stated in the form of a theorem here, and the argument for the theorem is correct. Moreover, many graphics given in [1] suggest that, except for a set of entries of any array with Lebesgue measure 0, convergence is very rapid, eventually exponentially fast in the number of iterations. Because we learned this set of rules from Bradley Efron, we call it “Efron’s algorithm”. More importantly, the rapidity of convergence is illustrated by numerical examples.
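    The iteration is easy to reproduce; the following NumPy snippet applies it to a random 5 × 4 array and prints the maximum entry change per sweep, which shrinks rapidly (the array size and random seed are arbitrary choices).

      import numpy as np

      def sweep(a):
          # One full iteration: standardize columns (mean 0, sd 1), then rows.
          a = (a - a.mean(axis=0)) / a.std(axis=0)
          a = (a - a.mean(axis=1, keepdims=True)) / a.std(axis=1, keepdims=True)
          return a

      rng = np.random.default_rng(0)
      a = rng.random((5, 4))
      for it in range(1, 21):
          b = sweep(a)
          if it in (1, 2, 5, 10, 20):
              print('iteration %2d: max change %.2e' % (it, np.max(np.abs(b - a))))
          a = b
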
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 71
    Publication Date: 2012-05-03
    Description: Building on results from data compression, we prove nearly tight bounds on how well sequences of length n can be predicted in terms of the size σ of the alphabet and the length k of the context considered when making predictions. We compare the performance achievable by an adaptive predictor with no advance knowledge of the sequence to the performance achievable by the optimal static predictor using a table listing the frequency of each (k + 1)-tuple in the sequence. We show that, if the elements of the sequence are chosen uniformly at random, then an adaptive predictor can compete in the expected case if k ≤ log_σ(n) − 3 − ε, for a constant ε > 0, but not if k ≥ log_σ(n).
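    A small Python experiment conveys the flavor of the comparison (it is not the proof): an adaptive predictor that learns context statistics on the fly versus a static predictor built from the full table of (k + 1)-tuple frequencies, run on a uniformly random binary sequence.

      import random
      from collections import Counter, defaultdict

      def adaptive_hits(seq, k):
          # Predict the symbol most often seen so far after the current context.
          counts, hits = defaultdict(Counter), 0
          for i in range(k, len(seq)):
              ctx, nxt = seq[i - k:i], seq[i]
              c = counts[ctx]
              if c and c.most_common(1)[0][0] == nxt:
                  hits += 1
              c[nxt] += 1
          return hits

      def static_hits(seq, k):
          # Optimal static predictor: full knowledge of (k+1)-tuple frequencies.
          table = Counter(seq[i - k:i + 1] for i in range(k, len(seq)))
          best = {}
          for tup, cnt in table.items():
              if cnt > best.get(tup[:-1], (0, ''))[0]:
                  best[tup[:-1]] = (cnt, tup[-1])
          return sum(best[seq[i - k:i]][1] == seq[i] for i in range(k, len(seq)))

      random.seed(0)
      seq = ''.join(random.choice('ab') for _ in range(20000))
      for k in (1, 4, 12):
          print('k=%2d: adaptive %5d vs static %5d' % (k, adaptive_hits(seq, k), static_hits(seq, k)))
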
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 72
    Publication Date: 2012-05-03
    Description: Hyperspectral images acquired from airborne and spaceborne platforms are used to recognize objects and to classify materials on the surface of the earth. The state-of-the-art compressor for lossless compression of hyperspectral images is the Spectral oriented Least SQuares (SLSQ) compressor (see [1–7]). In this paper we discuss hyperspectral image compression: we show how to visualize each band of a hyperspectral image and how this visualization suggests that an appropriate band ordering can lead to improvements in the compression process. In particular, we consider two important distance measures for band ordering: Pearson’s correlation and the Bhattacharyya distance, and report on experimental results achieved by a Java-based implementation of SLSQ.
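    The correlation-driven reordering can be prototyped in a few lines of NumPy (the paper's implementation is in Java); the greedy nearest-neighbour strategy and the synthetic cube below are illustrative simplifications, not the paper's exact procedure.

      import numpy as np

      def order_bands(cube):
          # cube: (bands, rows, cols). Greedily chain bands so that consecutive
          # bands have maximal Pearson correlation, helping inter-band prediction.
          corr = np.corrcoef(cube.reshape(cube.shape[0], -1))
          order, remaining = [0], set(range(1, cube.shape[0]))
          while remaining:
              nxt = max(remaining, key=lambda b: corr[order[-1], b])
              order.append(nxt)
              remaining.remove(nxt)
          return order

      rng = np.random.default_rng(0)
      base = rng.random((32, 32))
      cube = np.stack([(1 + 0.05 * i) * base + 0.1 * rng.random((32, 32)) for i in range(8)])
      print(order_bands(cube[rng.permutation(8)]))
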
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 73
    Publication Date: 2012-05-03
    Description: We propose a framework for the exact probabilistic analysis of window-based pattern matching algorithms, such as Boyer–Moore, Horspool, Backward DAWG Matching, Backward Oracle Matching, and more. In particular, we develop an algorithm that efficiently computes the distribution of a pattern matching algorithm’s running time cost (such as the number of text character accesses) for any given pattern in a random text model. Text models range from simple uniform models to higher-order Markov models or hidden Markov models (HMMs). Furthermore, we provide an algorithm to compute the exact distribution of differences in running time cost of two pattern matching algorithms. Methodologically, we use extensions of finite automata which we call deterministic arithmetic automata (DAAs) and probabilistic arithmetic automata (PAAs) [1]. Given an algorithm, a pattern, and a text model, a PAA is constructed from which the sought distributions can be derived using dynamic programming. To our knowledge, this is the first time that substring- or suffix-based pattern matching algorithms are analyzed exactly by computing the whole distribution of running time cost. Experimentally, we compare Horspool’s algorithm, Backward DAWG Matching, and Backward Oracle Matching on prototypical patterns of short length and provide statistics on the size of minimal DAAs for these computations.
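    The cost measure in question is easy to instrument. As a toy companion to the exact analysis, this Python version of Horspool's algorithm simply counts text character accesses on a single run (the paper, in contrast, computes the full distribution of this count under a random-text model).

      def horspool_accesses(text, pattern):
          # Horspool search instrumented to count text character accesses.
          m, n = len(pattern), len(text)
          shift = {c: m - 1 - i for i, c in enumerate(pattern[:-1])}
          accesses = matches = 0
          pos = 0
          while pos + m <= n:
              j = m - 1
              while j >= 0:
                  accesses += 1
                  if text[pos + j] != pattern[j]:
                      break
                  j -= 1
              else:
                  matches += 1
              # text[pos + m - 1] was already read above, so it is not counted again.
              pos += shift.get(text[pos + m - 1], m)
          return matches, accesses

      print(horspool_accesses('abracadabra' * 100, 'cadab'))
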
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 74
    Publication Date: 2012-05-03
    Description: We present tools that can be used within a larger system referred to as a passive assistant. The system receives information from a mobile device, as well as information from an image database such as Google Street View, and employs image processing to provide useful information about a local urban environment to a user who is visually impaired. The first stage acquires and computes accurate location information, the second stage performs texture and color analysis of a scene, and the third stage provides specific object recognition and navigation information. The second and third stages rely on compression-based tools (dimensionality reduction, vector quantization, and coding) that are enhanced by knowledge of the (approximate) location of objects.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 75
    Publication Date: 2012-05-03
    Description: In this paper, we consider the following sliding puzzle, called the torus puzzle. In an m by n board, there are mn pieces numbered from 1 to mn. Initially, the pieces are placed in ascending order. Then they are scrambled by rotating the rows and columns without the player’s knowledge. The objective of the torus puzzle is to rearrange the pieces in ascending order by rotating the rows and columns. We provide a solution to this puzzle. In addition, we provide lower and upper bounds on the number of steps needed to solve the puzzle. Moreover, we consider a variant of the torus puzzle in which each piece is colored either black or white, and we present a hardness result for solving it.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 76
    Publication Date: 2012-05-03
    Description: Radio Frequency Interference (RFI) detection and mitigation algorithms based on a signal’s spectrogram (frequency and time domain representation) are presented. The radiometric signal’s spectrogram is treated as an image, and therefore image processing techniques are applied to detect and mitigate RFI by two-dimensional filtering. A series of Monte-Carlo simulations have been performed to evaluate the performance of a simple thresholding algorithm and a modified two-dimensional Wiener filter.
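    A minimal Python rendition of the thresholding variant, assuming SciPy; the synthetic intermittent carrier, the window length and the factor-of-five threshold are placeholder choices (the paper evaluates this and a modified two-dimensional Wiener filter by Monte-Carlo simulation).

      import numpy as np
      from scipy.signal import spectrogram

      rng = np.random.default_rng(0)
      fs = 1000.0
      t = np.arange(0, 10, 1 / fs)
      noise = rng.standard_normal(t.size)                             # thermal (radiometric) noise
      rfi = 2.0 * np.sin(2 * np.pi * 150 * t) * ((t > 4) & (t < 6))   # intermittent carrier
      f, tt, sxx = spectrogram(noise + rfi, fs=fs, nperseg=256)

      # Treat the spectrogram as an image: flag pixels far above the noise floor
      # and blank them before re-estimating the power.
      floor = np.median(sxx)
      mask = sxx > 5 * floor
      cleaned = np.where(mask, floor, sxx)
      print('flagged %.2f%% of pixels; mean PSD %.4f -> %.4f'
            % (100 * mask.mean(), sxx.mean(), cleaned.mean()))
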
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 77
    Publication Date: 2012-05-03
    Description: We review the state of the art in DNA microarray image compression and provide original comparisons between standard and microarray-specific compression techniques that validate and expand previous work. First, we describe the most relevant approaches published in the literature and classify them according to the stage of the typical image compression process where each approach makes its contribution, and then we summarize the compression results reported for these microarray-specific image compression schemes. In a set of experiments conducted for this paper, we obtain new results for several popular image coding techniques that include the most recent coding standards. Prediction-based schemes CALIC and JPEG-LS are the best-performing standard compressors, but are improved upon by the best microarray-specific technique, Battiato’s CNN-based scheme.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 78
    Publication Date: 2012-05-03
    Description: In an asynchronous data stream, the data items may be out of order with respect to their original timestamps. This paper studies the space complexity required by a data structure to maintain such a data stream so that it can approximate the set of frequent items over a sliding time window with sufficient accuracy. Prior to our work, the best solution was given by Cormode et al. [1], who gave an O((1/ε) log W log(εB/log W) min{log W, 1/ε} log |U|)-space data structure that can approximate the frequent items within an ε error bound, where W and B are parameters of the sliding window, and U is the set of all possible item names. We give a more space-efficient data structure that only requires O((1/ε) log W log(εB/log W) log log W) space.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 79
    Publication Date: 2012-05-03
    Description: For fixed k ≥ 2 and fixed data alphabet of cardinality m, the hierarchical type class of a data string of length n = k^j for some j ≥ 1 is formed by permuting the string in all possible ways under permutations arising from the isomorphisms of the unique finite rooted tree of depth j which has n leaves and k children for each non-leaf vertex. Suppose the data strings in a hierarchical type class are losslessly encoded via binary codewords of minimal length. A hierarchical entropy function is a function on the set of m-dimensional probability distributions which describes the asymptotic compression rate performance of this lossless encoding scheme as the data length n is allowed to grow without bound. We determine infinitely many hierarchical entropy functions which are each self-affine. For each such function, an explicit iterated function system is found such that the graph of the function is the attractor of the system.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 80
    Publication Date: 2012-05-03
    Description: The European Space Agency (ESA) successfully launched the Soil Moisture and Ocean Salinity (SMOS) mission on November 2, 2009. SMOS uses a new type of instrument, a synthetic aperture radiometer named MIRAS, that provides full-polarimetric multi-angular L-band brightness temperatures, from which regular and global maps of Sea Surface Salinity (SSS) and Soil Moisture (SM) are generated. Although SMOS operates in a restricted band (1400–1427 MHz), radio-frequency interference (RFI) appears in SMOS imagery in many areas of the world, and it is an important issue to be addressed for quality SSS and SM retrievals. The impact on SMOS imagery of a sinusoidal RFI source is reviewed, and the problem is illustrated with actual RFI encountered by SMOS. Two RFI detection and mitigation algorithms are developed (for the dual-polarization and full-polarimetric modes); the performance of the second one has been quantitatively evaluated in terms of probability of detection and false alarm (using a synthetic test scene), and results are presented using real dual-polarization and full-polarimetric SMOS imagery. Finally, a statistical analysis of more than 13,000 L1b snapshots is presented and discussed.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 81
    Publication Date: 2012-05-03
    Description: This paper presents a genetic-based control scheme that not only utilizes evolutionary characteristics to find the signal acquisition parameters, but also employs an adaptive scheme to control the search space and keep the genetic search from converging to a local optimum, so as to acquire the desired signal precisely and rapidly. Simulations and experimental results show that the proposed method can improve the precision of the signal parameters and takes less signal acquisition time than traditional serial search methods for global navigation satellite system (GNSS) signals.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 82
    Publication Date: 2012-05-03
    Description: In order to be able to capture effects from co-transcriptional folding, we extend stochastic context-free grammars such that the probability of applying a rule can depend on the length of the subword that is eventually generated from the symbols introduced by the rule, and we show that existing algorithms for training and for determining the most probable parse tree can easily be adapted to the extended model without losses in performance. Furthermore, we show that the extended model is suited to improve the quality of predictions of RNA secondary structures. The extended model may also be applied to other fields where stochastic context-free grammars are used, such as natural language processing. Additionally, some interesting questions in the field of formal languages arise from it.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 83
    Publication Date: 2012-05-03
    Description: We present a survey of results concerning Lempel–Ziv data compression on parallel and distributed systems, starting from the theoretical approach to parallel time complexity to conclude with the practical goal of designing distributed algorithms with low communication cost. Storer’s extension for image compression is also discussed.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 84
    Publication Date: 2012-05-03
    Description: The smallest grammar problem—namely, finding a smallest context-free grammar that generates exactly one sequence—is of practical and theoretical importance in fields such as Kolmogorov complexity, data compression and pattern discovery. We propose a new perspective on this problem by splitting it into two tasks: (1) choosing which words will be the constituents of the grammar and (2) searching for the smallest grammar given this set of constituents. We show how to solve the second task in polynomial time, parsing longer constituents with smaller ones. We propose new algorithms, based on classical practical algorithms, that use this optimization to find small grammars. Our algorithms consistently find smaller grammars on a classical benchmark, reducing the size by 10% in some cases. Moreover, our formulation allows us to define interesting bounds on the number of small grammars and to empirically compare different grammars of small size.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 85
    Publication Date: 2012-05-03
    Description: Peelle’s Pertinent Puzzle (PPP) was described in 1987 in the context of estimating fundamental parameters that arise in nuclear interaction experiments. In PPP, generalized least squares (GLS) parameter estimates fell outside the range of the data, which has raised concerns that GLS is somehow flawed and has led to suggested alternatives to GLS estimators. However, there have been no corresponding performance comparisons among methods, and one suggested approach involving simulated data realizations is statistically incomplete. Here we provide performance comparisons among estimators, introduce approximate Bayesian computation (ABC) using density estimation applied to simulated data realizations to produce an alternative to the incomplete approach, complete the incompletely specified approach, and show that estimation error in the assumed covariance matrix cannot always be ignored.
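    The numerical instance usually quoted for PPP makes the puzzle concrete and is easy to verify with NumPy: two measurements of the same quantity, 1.5 and 1.0, each with a 10% independent uncertainty plus a fully correlated 20% normalization uncertainty, yield a GLS estimate below both data points.

      import numpy as np

      y = np.array([1.5, 1.0])
      c = np.diag((0.10 * y) ** 2) + np.outer(0.20 * y, 0.20 * y)   # independent + correlated parts
      w = np.linalg.solve(c, np.ones(2))                            # C^{-1} 1
      mu = w @ y / w.sum()                                          # GLS estimate
      sd = 1.0 / np.sqrt(w.sum())
      print('GLS estimate: %.2f +/- %.2f' % (mu, sd))               # about 0.88 +/- 0.22
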
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 86
    Publication Date: 2012-05-03
    Description: Given a directed graph G with non-negative costs on the arcs, a directed tour cover T of G is a cycle (not necessarily simple) in G such that either the head or the tail (or both) of every arc in G is touched by T. The minimum directed tour cover problem (DToCP), which is to find a directed tour cover of minimum cost, is NP-hard. It is thus interesting to design approximation algorithms with performance guarantees for this problem. Although its undirected counterpart (ToCP) has been studied in recent years, to our knowledge the DToCP remains widely open. In this paper, we give a 2 log₂(n)-approximation algorithm for the DToCP.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 87
    Publication Date: 2012-05-03
    Description: This paper analyzes how recommender systems can be applied to current e-learning systems to guide learners in personalized inclusive e-learning scenarios. Recommendations can be used to overcome current limitations of learning management systems in providing personalization and accessibility features. Recommenders can take advantage of standards-based solutions to provide inclusive support. To this end we have identified the need for developing semantic educational recommender systems, which are able to extend existing learning management systems with adaptive navigation support. In this paper we present three requirements to be considered in developing these semantic educational recommender systems, which are in line with the service-oriented approach of the third generation of learning management systems, namely: (i) a recommendation model; (ii) an open standards-based service-oriented architecture; and (iii) a usable and accessible graphical user interface to deliver the recommendations.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 88
    Publication Date: 2012-05-03
    Description: Two goodness-of-fit tests for copulas are investigated. The first one deals with the case of elliptical copulas and the second one deals with independent copulas. These tests result from the expansion of the projection pursuit methodology that we introduce in the present article. This method enables us to determine on which axis system these copulas lie, as well as the exact value of these copulas in the basis formed by the axes previously determined, irrespective of their value in their canonical basis. Simulations are also presented, as well as an application to real datasets.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 89
    Publication Date: 2012-05-03
    Description: The problem of compressed pattern matching, which has recently been treated in many papers dealing with free text, is extended to structured files, specifically to dictionaries, which appear in any full-text retrieval system. The prefix-omission method is combined with Huffman coding and a new variant based on Fibonacci codes is presented. Experimental results suggest that the new methods are often preferable to earlier ones, in particular for small files which are typical for dictionaries, since these are usually kept in small chunks.
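    Both ingredients are simple to sketch in Python: the prefix-omission method stores, for each sorted dictionary entry, only the suffix that differs from its predecessor, and the Fibonacci code gives a self-delimiting encoding for the resulting small integers. These are toy versions of the two building blocks, not the combined scheme of the paper.

      def prefix_omission(words):
          # Store (shared-prefix length, remaining suffix) for each sorted entry.
          prev, out = '', []
          for w in words:
              k = 0
              while k < min(len(prev), len(w)) and prev[k] == w[k]:
                  k += 1
              out.append((k, w[k:]))
              prev = w
          return out

      def fib_encode(n):
          # Fibonacci code of n >= 1: Zeckendorf digits from the smallest
          # Fibonacci number up, terminated by an extra '1' (codewords end "11").
          fibs = [1, 2]
          while fibs[-1] <= n:
              fibs.append(fibs[-1] + fibs[-2])
          bits = []
          for f in reversed(fibs[:-1]):
              bits.append('1' if f <= n else '0')
              if f <= n:
                  n -= f
          return ''.join(reversed(bits)) + '1'

      print(prefix_omission(['compress', 'compressed', 'compression', 'computer']))
      print([fib_encode(i) for i in (1, 2, 3, 4, 11)])   # ['11', '011', '0011', '1011', '001011']
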
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 90
    Publication Date: 2012-05-03
    Description: Several variants of the edit distance problem with block deletions are considered. Polynomial-time optimal algorithms are presented for the edit distance with block deletions allowing character insertions and character moves, but without block moves. We show that the edit distance with block moves and block deletions is NP-complete (that is, any proposed solution can be verified in polynomial time, and any problem in NP can be converted to it in polynomial time), and that it can be reduced, within a constant factor, to the problem with non-recursive block moves and block deletions.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 91
    Publication Date: 2012-05-03
    Description: Chris Langton proposed a model of artificial life that he named “ant”: an agent, called an ant, that sits on a square of a grid and moves by turning to the left (or right) according to the black (or white) color of the square it is heading into, whereupon the square reverses its color. Bunimovich and Troubetzkoy proved that an ant’s trajectory is always unbounded, or equivalently, that there exists no repeatable configuration of the ant’s system. On the other hand, by introducing a new type of color where the ant goes straight ahead and the color never changes, repeatable configurations are known to exist. In this paper, we prove that determining whether a given finite configuration of generalized Langton’s ant is repeatable or not is PSPACE-hard. We also prove the PSPACE-hardness of the ant’s problem on a hexagonal grid.
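    The classic two-color system takes only a few lines of Python to simulate (the generalized, PSPACE-hard version of the paper adds a straight-ahead color); the step count below is an arbitrary choice.

      def langton(steps=11000):
          # Two-color Langton's ant on an implicitly infinite grid, with y
          # increasing downward: turn right on white, left on black, flip the
          # square's color, then step forward.
          black = set()
          x = y = 0
          dx, dy = 0, -1                  # facing "up"
          for _ in range(steps):
              if (x, y) in black:
                  dx, dy = dy, -dx        # turn left
                  black.remove((x, y))
              else:
                  dx, dy = -dy, dx        # turn right
                  black.add((x, y))
              x, y = x + dx, y + dy
          return len(black)

      print('black squares after 11000 steps:', langton())
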
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 92
    Publication Date: 2012-05-03
    Description: We compare univariate L1 interpolating splines calculated on 5-point windows, on 7-point windows and on global data sets using four different spline functionals, namely, ones based on the second derivative, the first derivative, the function value and the antiderivative. Computational results indicate that second-derivative-based 5-point-window L1 splines preserve shape as well as or better than the other types of L1 splines. To calculate second-derivative-based 5-point-window L1 splines, we introduce an analysis-based, parallelizable algorithm. This algorithm is orders of magnitude faster than the previously widely used primal affine algorithm.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 93
    Publication Date: 2012-05-03
    Description: Several measurements are used to describe the behavior of a diabetic patient’s blood glucose. We describe a new wavelet-based algorithm and show that a new measurement, called the PLA index, could be used to quantify the variability or predictability of blood glucose. This wavelet-based approach emphasizes the shape of a blood glucose graph. Using continuous glucose monitors (CGMs), this measurement could become a new tool to classify patients based on their blood glucose behavior and may become a new method in the management of diabetes.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 94
    Publication Date: 2012-05-03
    Description: The metric average is a binary operation between sets in Rn which is used in the approximation of set-valued functions. We introduce an algorithm that applies tools of computational geometry to the computation of the metric average of 2D sets with piecewise linear boundaries.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 95
    Publication Date: 2012-05-03
    Description: Ray J. Solomonoff died on December 7, 2009, in Cambridge, Massachusetts, of complications of a stroke caused by an aneurism in his head. Ray was the first inventor of Algorithmic Information Theory which deals with the shortest effective description length of objects and is commonly designated by the term “Kolmogorov complexity.” In the 1950s Solomonoff was one of the first researchers to treat probabilistic grammars and the associated languages. He treated probabilistic Artificial Intelligence (AI) when “probabilistic” was unfashionable, and treated questions of machine learning early on. But his greatest contribution is the creation of Algorithmic Information Theory. [...]
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 96
    Publication Date: 2012-05-03
    Description: We analytically investigate univariate C1 continuous cubic L1 interpolating splines calculated by minimizing an L1 spline functional based on the second derivative on 5-point windows. Specifically, we link geometric properties of the data points in the windows with linearity, convexity and oscillation properties of the resulting L1 spline. These analytical results provide the basis for a computationally efficient algorithm for calculation of L1 splines on 5-point windows.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
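    As a point of reference, second-derivative-based L1 spline functionals in the literature typically take the following form; the notation here is assumed for illustration and is not taken from the paper:

        \min_{z \in \mathcal{S}} \int_{x_{i-2}}^{x_{i+2}} \left| z''(x) \right| \, dx
        \quad \text{subject to} \quad z(x_j) = y_j, \quad j = i-2, \dots, i+2,

    where \mathcal{S} denotes the C^1 piecewise-cubic (Hermite) functions on the 5-point window [x_{i-2}, x_{i+2}]. Minimizing the L1 norm of z'' (rather than the L2 norm, which leads to classical cubic smoothing splines) is what gives L1 splines their shape-preserving character.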
  • 97
    Publication Date: 2012-05-03
    Description: We present a car traffic simulation prototype for complex networks formed by collections of roads and junctions. Traffic load evolution is described by a model based on fluid-dynamic conservation laws, deduced from conservation of the number of cars. The model contains additional hypotheses in order to reproduce specific car traffic features, such as route-based car distribution at nodes and right-of-way rules at crossroads. A complete implementation of this model is then presented, together with computational results on case studies. (A single-road conservation-law sketch follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
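    The abstract does not name the model, but the standard fluid-dynamic traffic model of this kind is the LWR conservation law rho_t + f(rho)_x = 0 with concave flux f(rho) = v_max * rho * (1 - rho/rho_max). The sketch below advances it on a single road with a Godunov finite-volume scheme written in demand/supply form; the junction coupling and right-of-way logic that are the paper's focus are omitted, and all parameter values are invented.

        import numpy as np

        v_max, rho_max = 1.0, 1.0
        sigma = rho_max / 2.0                     # density of maximal flow

        def f(rho):                               # fundamental diagram
            return v_max * rho * (1.0 - rho / rho_max)

        def demand(rho):                          # flow a cell can send downstream
            return f(np.minimum(rho, sigma))

        def supply(rho):                          # flow a cell can absorb
            return f(np.maximum(rho, sigma))

        def godunov_step(rho, dx, dt):
            flux = np.minimum(demand(rho[:-1]), supply(rho[1:]))
            out = rho.copy()
            out[1:-1] -= dt / dx * (flux[1:] - flux[:-1])
            return out                            # boundary cells held fixed

        # Riemann problem: free flow running into a dense queue.
        nx = 200
        dx = 1.0 / nx
        rho = np.where(np.linspace(0.0, 1.0, nx) < 0.5, 0.2, 0.9)
        dt = 0.4 * dx / v_max                     # satisfies the CFL condition
        for _ in range(100):
            rho = godunov_step(rho, dx, dt)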
  • 98
    Publication Date: 2012-05-03
    Description: The algorithm of Lenstra, Lenstra and Lovász (LLL) transforms a given integer lattice basis into a reduced basis. Storjohann improved the worst-case complexity of the LLL algorithm by a factor of O(n) using modular arithmetic. Koy and Schnorr developed a segment-LLL basis reduction algorithm that generates a lattice basis satisfying a condition weaker than LLL reducedness, with an O(n) improvement over the LLL algorithm. In this paper we combine Storjohann’s modular arithmetic approach with the segment-LLL approach to further improve the worst-case complexity of segment-LLL algorithms by a factor of n^0.5. (A textbook LLL sketch, for orientation, follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
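    For orientation only, here is the textbook LLL algorithm in exact rational arithmetic; it is the plain version, not Storjohann's modular variant or Koy-Schnorr segment reduction, and it recomputes the Gram-Schmidt data from scratch after every update, which is simple but slow.

        from fractions import Fraction

        def dot(u, v):
            return sum(a * b for a, b in zip(u, v))

        def lll(basis, delta=Fraction(3, 4)):
            """Textbook LLL reduction of a linearly independent integer basis."""
            b = [[Fraction(x) for x in row] for row in basis]
            n = len(b)

            def gso():                             # Gram-Schmidt, no normalization
                ortho, mu = [], [[Fraction(0)] * n for _ in range(n)]
                for i in range(n):
                    v = b[i][:]
                    for j in range(i):
                        mu[i][j] = dot(b[i], ortho[j]) / dot(ortho[j], ortho[j])
                        v = [vi - mu[i][j] * oj for vi, oj in zip(v, ortho[j])]
                    ortho.append(v)
                return ortho, mu

            ortho, mu = gso()
            k = 1
            while k < n:
                for j in range(k - 1, -1, -1):     # size reduction
                    if abs(mu[k][j]) > Fraction(1, 2):
                        q = round(mu[k][j])
                        b[k] = [bk - q * bj for bk, bj in zip(b[k], b[j])]
                        ortho, mu = gso()
                # Lovasz condition: advance if it holds, otherwise swap and retry.
                if dot(ortho[k], ortho[k]) >= (delta - mu[k][k - 1] ** 2) * dot(ortho[k - 1], ortho[k - 1]):
                    k += 1
                else:
                    b[k - 1], b[k] = b[k], b[k - 1]
                    ortho, mu = gso()
                    k = max(k - 1, 1)
            return b

        print(lll([[1, 1, 1], [-1, 0, 2], [3, 5, 6]]))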
  • 99
    Publication Date: 2012-05-03
    Description: Generalized least squares (GLS) for model parameter estimation has a long and successful history dating to its development by Gauss in 1795. Alternatives can outperform GLS in some settings, and alternatives to GLS are sometimes sought when GLS exhibits curious behavior, such as in Peelle’s Pertinent Puzzle (PPP). PPP was described in 1987 in the context of estimating fundamental parameters that arise in nuclear interaction experiments. In PPP, GLS estimates fell outside the range of the data, eliciting concerns that GLS was somehow flawed; these concerns have led to suggested alternatives to GLS estimators. This paper defends GLS in the PPP context, investigates when PPP can occur, illustrates when PPP can be beneficial for parameter estimation, reviews optimality properties of GLS estimators, and gives an example in which PPP does occur. (A numerical illustration of PPP follows this record.)
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
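    The commonly quoted PPP setup is easy to reproduce: two measurements of the same quantity, 1.5 and 1.0, each with a 10% independent error plus a fully correlated 20% normalization error. The GLS estimate of the common mean then falls below both data points, which is the "puzzle". This sketch assumes the standard textbook numbers, not necessarily those of the paper.

        import numpy as np

        y = np.array([1.5, 1.0])
        indep = 0.10 * y                          # independent std. deviations
        norm = 0.20 * y                           # fully correlated component
        V = np.diag(indep ** 2) + np.outer(norm, norm)

        one = np.ones(2)                          # design matrix for a common mean
        Vinv = np.linalg.inv(V)
        mu = (one @ Vinv @ y) / (one @ Vinv @ one)
        se = np.sqrt(1.0 / (one @ Vinv @ one))
        print(f"GLS estimate: {mu:.3f} +/- {se:.3f}")  # ~0.882, below both data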
  • 100
    Publication Date: 2012-05-03
    Description: Increasingly encompassing models have been suggested for our world. Theories range from generally accepted to increasingly speculative to apparently bogus. The progression of theories from ego- to geo- to heliocentric models to universe and multiverse theories and beyond was accompanied by a dramatic increase in the sizes of the postulated worlds, with humans being expelled from their center to ever more remote and random locations. Rather than leading to a true theory of everything, this trend faces a turning point after which the predictive power of such theories decreases (actually to zero). Incorporating the location and other capacities of the observer into such theories avoids this problem and makes it possible to distinguish meaningful from predictively meaningless theories. This also leads to a truly complete theory of everything consisting of a (conventional objective) theory of everything plus a (novel subjective) observer process. The observer localization is neither based on the controversial anthropic principle, nor has it anything to do with the quantum-mechanical observation process. The suggested principle is extended to more practical (partial, approximate, probabilistic, parametric) world models (rather than theories of everything). Finally, I provide a justification of Ockham’s razor, and criticize the anthropic principle, the doomsday argument, the no-free-lunch theorem, and the falsifiability dogma.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
    Location Call Number Expected Availability
    BibTip Others were also interested in ...