ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

Filter
  • Collection: Articles (1,638)
  • Publisher: MDPI (1,638); American Geophysical Union (AGU); Copernicus; MDPI Publishing; Public Library of Science
  • Years: 2015-2019 (1,638); 1980-1984; 1945-1949
  • Year: 2019 (1,638)
  • Journal: Entropy (1,123); ISPRS International Journal of Geo-Information (515)
  • 1
    Publication Date: 2019
    Description: In view of the randomness in the selection of kernel parameters in the traditional kernel independent component analysis (KICA) algorithm, this paper proposes a CPSO-KICA algorithm based on Chaotic Particle Swarm Optimization (CPSO) and KICA. In CPSO-KICA, the maximum entropy of the extracted independent component is first adopted as the fitness function of the PSO algorithm to determine the optimal kernel parameters; then a chaotic algorithm is used to avoid the local optima that trap the traditional PSO algorithm. Finally, the proposed algorithm is compared with Weighted KICA (WKICA) and PSO-KICA, with the Tennessee Eastman Process (TEP) as the benchmark. Simulation results show that the proposed algorithm can determine the optimal kernel parameters and performs better in terms of the false alarm rate (FAR), detection latency (DL) and fault detection rate (FDR).
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
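    The kernel-parameter search described above follows a generic chaotic-PSO pattern: a standard particle swarm plus a logistic-map perturbation that re-seeds poor particles. The sketch below is an illustration only; the fitness function, bounds and swarm constants are hypothetical stand-ins, not the authors' CPSO-KICA setup, whose real fitness is the maximum entropy of the extracted independent component.

    ```python
    import numpy as np

    def chaotic_pso(fitness, lo, hi, n_particles=20, iters=50, seed=0):
        """Maximise `fitness` over a single scalar parameter in [lo, hi]."""
        rng = np.random.default_rng(seed)
        x = rng.uniform(lo, hi, n_particles)        # candidate kernel parameters
        v = np.zeros(n_particles)
        pbest = x.copy()
        pbest_f = np.array([fitness(xi) for xi in x])
        gbest = pbest[np.argmax(pbest_f)]
        z = 0.7                                     # chaotic variable (logistic map)
        for _ in range(iters):
            r1, r2 = rng.random(n_particles), rng.random(n_particles)
            v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            f = np.array([fitness(xi) for xi in x])
            better = f > pbest_f
            pbest[better], pbest_f[better] = x[better], f[better]
            gbest = pbest[np.argmax(pbest_f)]
            # Chaotic perturbation: re-seed the worst particle from a logistic map
            # so the swarm can escape local optima.
            z = 4.0 * z * (1.0 - z)
            x[np.argmin(f)] = lo + z * (hi - lo)
        return gbest

    # Toy usage: a fitness that peaks at a parameter value of 2.0.
    print(chaotic_pso(lambda s: -(s - 2.0) ** 2, lo=0.1, hi=10.0))
    ```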
  • 2
    Publication Date: 2019
    Description: We explore the dissipative dynamics of two coupled qubits placed inside a coherent cavity-field under dipole-dipole interplay and 2-photon transitions. The generated non-classical correlations (NCCs) beyond entanglement are investigated via two measures based on the Hilbert-Schmidt norm. It is found that the robustness of the generated NCCs can be greatly enhanced by adjusting the intrinsic dissipation rate, the dipole-dipole interplay rate, the initial coherence intensity and the degree of the coherent-state superpositions. The results show that the intrinsic decoherence stabilizes the stationarity of the non-classical correlations, while the dipole interplay rate boosts them. The non-classical correlations can be frozen at their stationary values by increasing the intrinsic dissipation rate. The NCCs can also be enhanced by increasing the initial coherence intensity.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 3
    Publication Date: 2019
    Description: Thermodynamic aspects of the theory of nucleation are commonly considered employing Gibbs’ theory of interfacial phenomena and its generalizations. Utilizing Gibbs’ theory, the bulk parameters of the critical clusters governing nucleation can be uniquely determined for any metastable state of the ambient phase. As a rule, they turn out in such treatment to be widely similar to the properties of the newly-evolving macroscopic phases. Consequently, the major tool to resolve problems concerning the accuracy of theoretical predictions of nucleation rates and related characteristics of the nucleation process consists of an approach with the introduction of the size or curvature dependence of the surface tension. In the description of crystallization, this quantity has been expressed frequently via changes of entropy (or enthalpy) in crystallization, i.e., via the latent heat of melting or crystallization. Such a correlation between the capillarity phenomena and entropy changes was originally advanced by Stefan considering condensation and evaporation. It is known in the application to crystal nucleation as the Skapski–Turnbull relation. This relation, by mentioned reasons more correctly denoted as the Stefan–Skapski–Turnbull rule, was expanded by some of us quite recently to the description of the surface tension not only for phase equilibrium at planar interfaces, but to the description of the surface tension of critical clusters and its size or curvature dependence. This dependence is frequently expressed by a relation derived by Tolman. As shown by us, the Tolman equation can be employed for the description of the surface tension not only for condensation and boiling in one-component systems caused by variations of pressure (analyzed by Gibbs and Tolman), but generally also for phase formation caused by variations of temperature. Beyond this particular application, it can be utilized for multi-component systems provided the composition of the ambient phase is kept constant and variations of either pressure or temperature do not result in variations of the composition of the critical clusters. The latter requirement is one of the basic assumptions of classical nucleation theory. For this reason, it is only natural to use it also for the specification of the size dependence of the surface tension. Our method, relying on the Stefan–Skapski–Turnbull rule, allows one to determine the dependence of the surface tension on pressure and temperature or, alternatively, the Tolman parameter in his equation. In the present paper, we expand this approach and compare it with alternative methods of the description of the size-dependence of the surface tension and, as far as it is possible to use the Tolman equation, of the specification of the Tolman parameter. Applying these ideas to condensation and boiling, we derive a relation for the curvature dependence of the surface tension covering the whole range of metastable initial states from the binodal curve to the spinodal curve.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 4
    Publication Date: 2019
    Description: Direct dependencies and conditional dependencies in restricted Bayesian network classifiers (BNCs) are two basic kinds of dependencies. Traditional approaches, such as filter and wrapper, have proved beneficial for identifying non-significant dependencies one by one, whereas their high computational overheads make them inefficient, especially for BNCs with high structural complexity. Studying the distributions of information-theoretic measures provides a feasible approach to identifying non-significant dependencies in batches, which may help increase structural reliability and avoid overfitting. In this paper, we investigate two extensions to the k-dependence Bayesian classifier: MI-based feature selection and CMI-based dependence selection. These two techniques apply a novel adaptive thresholding method to filter out redundancy and can work jointly. Experimental results on 30 datasets from the UCI machine learning repository demonstrate that adaptive thresholds can help distinguish between dependencies and independencies, and that the proposed algorithm achieves competitive classification performance compared to several state-of-the-art BNCs in terms of 0–1 loss, root mean squared error, bias, and variance.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 5
    Publication Date: 2019
    Description: Identification of denatured biological tissue is crucial to high intensity focused ultrasound (HIFU) treatment, but it is not easy to intercept ultrasonic scattered echo signals from the HIFU treatment region. Therefore, this paper employs a time-frequency entropy based on the generalized S-transform (GST) to intercept ultrasonic echo signals. First, the time-frequency spectrum of the ultrasonic echo signal is obtained by the GST; it is concentrated around the real instantaneous frequency of the signal. Then the time-frequency entropy is calculated from this spectrum. The experimental results indicate that the time-frequency entropy of the ultrasonic echo signal becomes abnormally high when the ultrasonic signal travels across the boundary between the normal and treatment regions in tissue. Ultrasonic scattered echo signals from the treatment region can thus be intercepted by the time-frequency entropy. In addition, the refined composite multi-scale weighted permutation entropy (RCMWPE) is proposed to evaluate the complexity of nonlinear time series. Compared with multi-scale permutation entropy (MPE) and multi-scale weighted permutation entropy (MWPE), RCMWPE not only measures the complexity of a signal including its amplitude information, but also improves the stability and reliability of multi-scale entropy. The RCMWPE and MPE are applied to 300 cases of actual ultrasonic scattered echo signals (150 in normal status and 150 in denatured status). It is found that the RCMWPE and MPE values of denatured tissues are higher than those of normal tissues, and both can be used to distinguish normal from denatured tissues. However, the overlap region between the RCMWPE values of denatured and normal tissues contains fewer feature points than with MPE; the intra-class distance of RCMWPE is smaller, and its inter-class distance greater, than those of MPE. The difference between denatured and normal tissues is more obvious when RCMWPE is used as the characteristic parameter. The results of this study will help doctors obtain a more accurate assessment of the treatment effect during HIFU treatment.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
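    For readers unfamiliar with the entropy measures named above, the sketch below implements plain multi-scale permutation entropy (coarse-graining followed by ordinal-pattern entropy). It is a simplified stand-in for illustration, not the authors' refined composite weighted variant (RCMWPE), and the input signal is synthetic.

    ```python
    import numpy as np
    from math import factorial

    def permutation_entropy(x, m=4, tau=1):
        """Normalised permutation entropy of a 1-D signal (0 = regular, 1 = random)."""
        counts = {}
        for i in range(len(x) - (m - 1) * tau):
            pattern = tuple(np.argsort(x[i:i + m * tau:tau]))
            counts[pattern] = counts.get(pattern, 0) + 1
        p = np.array(list(counts.values()), dtype=float)
        p /= p.sum()
        return float(-(p * np.log(p)).sum() / np.log(factorial(m)))

    def coarse_grain(x, scale):
        """Average consecutive, non-overlapping windows of length `scale`."""
        n = len(x) // scale
        return x[:n * scale].reshape(n, scale).mean(axis=1)

    def multiscale_pe(x, m=4, max_scale=10):
        x = np.asarray(x, dtype=float)
        return [permutation_entropy(coarse_grain(x, s), m) for s in range(1, max_scale + 1)]

    # Synthetic "echo signal": white noise gives values close to 1 at every scale.
    print(np.round(multiscale_pe(np.random.randn(3000)), 3))
    ```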
  • 6
    Publication Date: 2019
    Description: Gas-liquid two-phase flow in a horizontal channel under heaving motion shows unique dynamic characteristics due to complex nonlinear interactions. To further establish a descriptive model and investigate the effects of heaving motion on horizontal gas-liquid flow, experiments over a wide range of vibration parameters and working conditions were carried out by combining a vibration platform with a two-phase flow loop. It was found that the flow regimes under heaving motion showed significant differences compared to those expected in steady-state flow under the same working conditions. High-speed photographs showed that increasing the vibration parameters had an obvious impact on the degree of fluctuation of the gas-liquid interface. A method based on multi-scale entropy was applied to identify flow regimes and reveal the underlying dynamic characteristics from the collected pressure-difference signals. The results indicated that the proposed method, which incorporates information on the flow condition and the rate of change of the multi-scale entropy, was effective for analyzing gas-liquid two-phase flow transitions in a horizontal channel under heaving motion, providing a reliable guide for flow-pattern control design and the safe operation of equipment. However, for slug-wave and boiling-wave flow, a method based on multi-scale marginal spectrum entropy proved more feasible for identifying the transition boundary.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 7
    Publication Date: 2019
    Description: The logistic chaotic system, as a classical complex phenomenon of nonlinear dynamic systems, has received extensive attention in the field of secure communication. It is generally believed that the characteristics of chaos are suitable for the needs of encryption systems. In this paper, a multi-scale entropy analysis and a statistical analysis are carried out on the chaotic sequences produced by logistic systems with different parameters and different initial values. According to the simulation results, the complexity of the chaotic system represented by the logistic system is mainly determined by the parameter μ. Not all characteristic parameters of the chaotic system depend on the initial values. It is possible to make a reasonable estimation and prediction of the chaotic system at a macroscopic level. A variance-based estimation method for the parameter μ is proposed and applied to a logistic system and to another chaotic system, where it is equally effective.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
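    As a point of reference for the analysis described above, the following minimal sketch (assumed, not the paper's code) generates logistic-map sequences x_{n+1} = μ·x_n·(1 − x_n) for several parameters μ and initial values x_0 and prints their sample variances, the statistic on which the abstract's variance-based estimation of μ rests.

    ```python
    import numpy as np

    def logistic_series(mu, x0, n=5000, discard=500):
        """Iterate x <- mu * x * (1 - x), dropping an initial transient."""
        x, out = x0, []
        for i in range(n + discard):
            x = mu * x * (1.0 - x)
            if i >= discard:
                out.append(x)
        return np.array(out)

    for mu in (3.6, 3.8, 3.99):
        for x0 in (0.2, 0.7):
            s = logistic_series(mu, x0)
            print(f"mu={mu}  x0={x0}  mean={s.mean():.4f}  var={s.var():.4f}")
    ```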
  • 8
    Publication Date: 2019
    Description: In this paper, we study a dual-channel closed-loop supply chain in which a manufacturer considers the recovery and remanufacturing of market waste products, and a retailer considers providing services to customers. We build a Stackelberg game model and a centralized game model in static and dynamic settings, respectively, analyze the two dynamic models mathematically, and explore the stability and entropy of the two models using bifurcation diagrams, basins of attraction, chaotic attractors, and so on. The influences of the service level and the profit distribution rate on the system’s profit are discussed. The theoretical results show that a higher price adjustment speed will cause the system to lose stability, with a larger entropy value. In the Stackelberg game model, the stability of the system increases as the service value and the recovery rate increase; in the centralized model, the stability of the system decreases as the service value increases and increases as the recovery rate increases. When the Stackelberg game model is in a stable state, the manufacturer’s profit first increases and then decreases, and the retailer’s profit first decreases and then increases, as the retailer’s service value increases. The research will serve as good guidance for both the manufacturer and the retailer in dual-channel closed-loop supply chains to improve decision making.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 9
    Publication Date: 2019
    Description: When chaotic systems are used in different practical applications, such as chaotic secure communication and chaotic pseudorandom sequence generators, a large number of chaotic systems is strongly required. However, for lack of a systematic construction theory, the construction of chaotic systems mainly depends on an exhaustive search over system parameters or initial values, especially for a class of dynamical systems with hidden chaotic attractors. In this paper, a class of quadratic polynomial chaotic maps is studied, and a general method for constructing quadratic polynomial chaotic maps is proposed. The proposed polynomial chaotic maps satisfy the Li–Yorke definition of chaos. This method can accurately control the amplitude of chaotic time series. Through the existence and stability analysis of fixed points, we prove that this class of quadratic polynomial maps cannot have hidden chaotic attractors.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
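    To make the object of study concrete: a quadratic polynomial map is x_{n+1} = a·x_n² + b·x_n + c, and chaos can be checked numerically by estimating the largest Lyapunov exponent from the derivative |2a·x + b|. The sketch below is a generic illustration using the logistic map written in quadratic form; it is not the authors' construction or their amplitude-control method.

    ```python
    import numpy as np

    def lyapunov_quadratic(a, b, c, x0=0.1, n=20000, discard=1000):
        """Numerical Lyapunov exponent of x -> a*x^2 + b*x + c (positive => chaotic)."""
        x, acc, count = x0, 0.0, 0
        for i in range(n):
            x = a * x * x + b * x + c
            if not np.isfinite(x):
                return -np.inf              # orbit escaped; no bounded attractor
            if i >= discard:
                acc += np.log(abs(2 * a * x + b) + 1e-300)
                count += 1
        return acc / count

    # The logistic map x -> 4x(1 - x) in quadratic form (a = -4, b = 4, c = 0):
    print(lyapunov_quadratic(-4.0, 4.0, 0.0))   # about ln 2 ≈ 0.693
    ```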
  • 10
    Publication Date: 2019
    Description: The entropic uncertainty relation (EUR) is of significant importance in the security proof of continuous-variable quantum key distribution under coherent attacks. The parameter estimation in the EUR method contains the estimation of the covariance matrix (CM), as well as the max-entropy. The discussions in previous works have not involved the effect of finite-size on estimating the CM, which will further affect the estimation of leakage information. In this work, we address this issue by adapting the parameter estimation technique to the EUR analysis method under composable security frameworks. We also use the double-data modulation method to improve the parameter estimation step, where all the states can be exploited for both parameter estimation and key generation; thus, the statistical fluctuation of estimating the max-entropy disappears. The result shows that the adapted method can effectively estimate parameters in EUR analysis. Moreover, the double-data modulation method can, to a large extent, save the key consumption, which further improves the performance in practical implementations of the EUR.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 11
    Publication Date: 2019
    Description: Recently, active learning has been considered a promising approach for data acquisition due to the significant cost of the data labeling process in many real-world applications, such as natural language processing and image processing. Most active learning methods are merely designed to enhance the learning model's accuracy. However, the model accuracy may not be the primary goal, and there could be other domain-specific objectives to be optimized. In this work, we develop a novel active learning framework that aims to solve a general class of optimization problems. The proposed framework mainly targets optimization problems subject to the exploration-exploitation trade-off. The framework is comprehensive: it includes exploration-based, exploitation-based and balancing strategies that seek to achieve a balance between exploration and exploitation. The paper mainly considers regression tasks, as they are under-researched in the active learning field compared to classification tasks. Furthermore, we investigate and compare the different active querying approaches: pool-based and query synthesis. We apply the proposed framework to the problem of learning the price-demand function, an application that is important in optimal product pricing and dynamic (or time-varying) pricing. In our experiments, we provide a comparative study including the proposed framework strategies and several baselines. The results demonstrate the strong performance of the proposed methods.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 12
    Publication Date: 2019
    Description: A complete quantum cooling cycle may be a useful platform for studying quantum thermodynamics, just as the quantum heat engine is. Entropy change is an important feature which can help us to investigate the thermodynamic properties of the single ion cooling process. Here, we analyze the entropy change of the ion and the laser field in the single ion cooling cycle by generalizing the idea in Reference (Phys. Rev. Lett. 2015, 114, 043002) to a single ion system. Thermodynamic properties of the single ion cooling process are discussed, and it is shown that the Second and Third Laws of Thermodynamics still strictly hold in the quantum cooling process. Our results suggest that quantum cooling cycles are also candidates, besides quantum heat engines, for the investigation of quantum thermodynamics.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 13
    Publication Date: 2019
    Description: In this paper, a new three-dimensional chaotic system is proposed for image encryption. The core of the encryption algorithm is the combination of the chaotic system and compressed sensing, which can complete image encryption and compression at the same time. The Lyapunov exponents, bifurcation diagram and complexity of the new three-dimensional chaotic system are analyzed. The performance analysis shows that the chaotic system has two positive Lyapunov exponents and high complexity. In the encryption scheme, the new chaotic system is used to generate the measurement matrix for compressed sensing, and Arnold scrambling is used to further scramble the image. The proposed method has better reconstruction ability within the compressible range of the algorithm than other methods. The experimental results show that the proposed encryption scheme has a good encryption effect and image compression capability.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
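    The "Arnold" scrambling step mentioned above usually refers to the Arnold cat map, which permutes the pixel coordinates of an N×N image as (x, y) → (x + y, x + 2y) mod N. A minimal, generic sketch of that permutation (not the paper's full chaos-plus-compressed-sensing scheme):

    ```python
    import numpy as np

    def arnold_scramble(img, rounds=5):
        """Apply the Arnold cat map to a square image `rounds` times."""
        n = img.shape[0]
        assert img.shape[0] == img.shape[1], "the Arnold map needs a square image"
        out = img.copy()
        for _ in range(rounds):
            nxt = np.empty_like(out)
            for x in range(n):
                for y in range(n):
                    nxt[(x + y) % n, (x + 2 * y) % n] = out[x, y]
            out = nxt
        return out

    # The map is periodic, so scrambling can be undone by iterating up to the period.
    img = np.arange(16, dtype=np.uint8).reshape(4, 4)
    print(arnold_scramble(img, rounds=2))
    ```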
  • 14
    Publication Date: 2019
    Description: In recent years data acquisition from remote sensing has become readily available to the quarry sector. This study demonstrates how such data may be used to evaluate and back analyse rockfall potential of a legacy slope in a blocky rock mass. Use of data obtained from several aerial LiDAR (Light Detection and Ranging) and photogrammetric campaigns taken over a number of years (2011 to date) provides evidence for potential rockfall evolution from a slope within an active quarry operation in Cornwall, UK. Further investigation, through analysis of point cloud data obtained from terrestrial laser scanning, was undertaken to characterise the orientation of discontinuities present within the rock slope. Aerial and terrestrial LiDAR data were subsequently used for kinematic analysis, production of surface topography models and rockfall trajectory analyses using both 2D and 3D numerical simulations. The results of an Unmanned Aerial Vehicle (UAV)-based 3D photogrammetric analysis enabled the reconstruction of high resolution topography, allowing one to not only determine geometrical properties of the slope surface and geo-mechanical characterisation but provide data for validation of numerical simulations. The analysis undertaken shows the effectiveness of the existing rockfall barrier, while demonstrating how photogrammetric data can be used to inform back analyses of the underlying failure mechanism and investigate potential runout.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
  • 15
    Publication Date: 2019
    Description: The accurate prediction of bus passenger flow is key to public transport management and the smart city. A long short-term memory (LSTM) network, a deep learning method for modeling sequences, is an efficient way to capture the time dependency of passenger flow. In recent years, an increasing number of researchers have sought to apply the LSTM model to passenger flow prediction. However, few of them pay attention to the optimization procedure during model training. In this article, we propose a hybrid, optimized LSTM network based on Nesterov accelerated adaptive moment estimation (Nadam) and the stochastic gradient descent algorithm (SGD). This method trains the model with high efficiency and accuracy, solving the problems of inefficient training and misconvergence that exist in complex models. We employ the hybrid optimized LSTM network to predict actual passenger flow in Qingdao, China, and compare the prediction results with those obtained by non-hybrid LSTM models and conventional methods. In particular, the proposed model brings about a 4%–20% extra performance improvement compared with non-hybrid LSTM models. We have also tried combinations of other optimization algorithms and applications in different models, finding that optimizing the LSTM by switching from Nadam to SGD is the best choice. The sensitivity of the model to its parameters is also explored, which provides guidance for applying this model to bus passenger flow data modelling. The good performance of the proposed model at different temporal and spatial scales shows that it is robust and effective, and it can provide insightful support and guidance for dynamic bus scheduling and regional coordination scheduling.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
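    The optimizer-switching idea above (start with Nadam, finish with SGD) can be illustrated with a generic Keras snippet. Everything here, including the window length, layer sizes, learning rates and the random stand-in series, is a hypothetical sketch rather than the authors' Qingdao passenger-flow model.

    ```python
    import numpy as np
    from tensorflow import keras

    def make_windows(series, lookback=12):
        """Turn a 1-D series into (samples, lookback, 1) windows and next-step targets."""
        X = np.array([series[i:i + lookback] for i in range(len(series) - lookback)])
        return X[..., None], series[lookback:]

    flow = np.random.rand(500).astype("float32")     # stand-in for real passenger counts
    X, y = make_windows(flow)

    model = keras.Sequential([
        keras.Input(shape=(X.shape[1], 1)),
        keras.layers.LSTM(64),
        keras.layers.Dense(1),
    ])

    # Phase 1: Nadam for fast initial convergence.
    model.compile(optimizer=keras.optimizers.Nadam(1e-3), loss="mse")
    model.fit(X, y, epochs=20, batch_size=32, verbose=0)

    # Phase 2: switch to SGD for the final epochs; re-compiling keeps the learned weights.
    model.compile(optimizer=keras.optimizers.SGD(1e-3, momentum=0.9), loss="mse")
    model.fit(X, y, epochs=20, batch_size=32, verbose=0)
    ```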
  • 16
    Publication Date: 2019
    Description: In this paper, the Gaussian wiretap feedback channel is revisited, and some new results on its secrecy capacity are obtained. To be specific, first, we show that the Schalkwijk–Kailath (SK) feedback scheme, which achieves the secrecy capacity of the degraded Gaussian wiretap feedback channel, also achieves the secrecy capacity of the non-degraded Gaussian wiretap feedback channel. Second, applying the existing secret key-based feedback schemes to Gaussian wiretap feedback channels, we derive some new lower bounds on the secrecy capacities of these models. Finally, we compare the performances of the above feedback schemes in the degraded and non-degraded Gaussian wiretap feedback channels and show which feedback scheme performs better for these channel models.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 17
    Publication Date: 2019
    Description: A multiobjective optimization of an organic Rankine cycle (ORC) evaporator, operating with toluene as the working fluid, is presented in this paper for waste heat recovery (WHR) from the exhaust gases of a 2 MW Jenbacher JMS 612 GS-N.L. gas internal combustion engine. Indirect evaporation between the exhaust gas and the organic fluid in the parallel plate heat exchanger (ITC2) implied irreversible heat transfer and high investment costs, which were considered as objective functions to be minimized. Energy and exergy balances were applied to the system components, in addition to the phenomenological equations in the ITC2, to calculate global energy indicators, such as the thermal efficiency of the configuration, the heat recovery efficiency, the overall energy conversion efficiency, the absolute increase of engine thermal efficiency, and the reduction of the brake-specific fuel consumption of the system integrated with the gas engine. The results allowed calculation of the plate spacing, plate height, plate width, and chevron angle that minimized the investment cost and entropy generation of the equipment, reaching 22.04 m² in the heat transfer area, 693.87 kW in the energy transfer by heat recovery from the exhaust gas, and 41.6% in the overall thermal efficiency of the ORC as a bottoming cycle for the engine. This type of result contributes to the inclusion of this technology in the industrial sector as a consequence of the improvement in thermal efficiency and economic viability.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 18
    Publication Date: 2019
    Description: Rough set theory is an important approach for data mining, and it draws on Shannon’s information measures for uncertainty measurement. The existing local conditional-entropies have a second-order feature but also limitations in application. By improving hierarchical granulation, this paper establishes double-granule conditional-entropies based on three-level granular structures (i.e., micro-bottom, meso-middle, macro-top), and then investigates their relevant properties. In terms of the decision table and its decision classification, double-granule conditional-entropies are proposed at the micro-bottom level via the dual condition-granule system. Through successive granular summation and integration, they evolve hierarchically to the meso-middle and macro-top levels, with partial and complete condition-granulations, respectively. The new measures then acquire their number distribution, calculation algorithm, three bounds, and granulation non-monotonicity at the three corresponding levels. Finally, the hierarchical constructions and achieved properties are effectively verified by decision table examples and data set experiments. Double-granule conditional-entropies carry the second-order characteristic and hierarchical granulation to deepen both the classical entropy system and the local conditional-entropies, and thus they become novel uncertainty measures for information processing and knowledge reasoning.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 19
    Publication Date: 2019
    Description: Nowadays, images are transferred through open channels that are subject to potential attacks, so the exchange of image data requires additional security in many fields, such as medicine, the military, and banking. Security factors are essential for protecting the system against brute-force and differential attacks. We propose an Enhanced Logistic Map (ELM), used together with simple encryption techniques such as block scrambling and a modified zigzag transformation, in the encryption phases of permutation, diffusion, and key-stream generation, to withstand these attacks. The encryption results are evaluated using histogram analysis, correlation analysis, the Number of Pixel Change Rate (NPCR), the Unified Average Change Intensity (UACI), the Peak Signal-to-Noise Ratio (PSNR), and entropy. Our results demonstrate the security, reliability, efficiency, and flexibility of the proposed method.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
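    NPCR and UACI, two of the differential-attack metrics named above, have simple standard definitions: NPCR is the percentage of pixel positions that differ between two cipher images, and UACI is the mean absolute intensity difference relative to 255. A small helper illustrating the standard formulas (not the paper's code):

    ```python
    import numpy as np

    def npcr_uaci(c1, c2):
        """NPCR and UACI (in %) between two 8-bit cipher images of equal shape."""
        c1 = c1.astype(np.int32)
        c2 = c2.astype(np.int32)
        npcr = (c1 != c2).mean() * 100.0                  # % of changed pixel positions
        uaci = (np.abs(c1 - c2) / 255.0).mean() * 100.0   # mean intensity change in %
        return npcr, uaci

    # For two independent random cipher images the expected values are
    # roughly 99.61% (NPCR) and 33.46% (UACI).
    rng = np.random.default_rng(0)
    a = rng.integers(0, 256, (256, 256), dtype=np.uint8)
    b = rng.integers(0, 256, (256, 256), dtype=np.uint8)
    print(npcr_uaci(a, b))
    ```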
  • 20
    Publication Date: 2019
    Description: Intrinsically disordered proteins (IDPs) represent a distinct class of proteins and are distinguished from globular proteins by conformational plasticity, high evolvability and a broad functional repertoire. Some of their properties are reminiscent of early proteins, but their abundance in eukaryotes, functional properties and compositional bias suggest that IDPs appeared at later evolutionary stages. The spectrum of IDP properties and their determinants are still not well defined. This study compares rudimentary physicochemical properties of IDPs and globular proteins using bioinformatic analysis on the level of their native sequences and random sequence permutations, addressing the contributions of composition versus sequence as determinants of the properties. IDPs have, on average, lower predicted secondary structure contents and aggregation propensities and biased amino acid compositions. However, our study shows that IDPs exhibit a broad range of these properties. Induced fold IDPs exhibit very similar compositions and secondary structure/aggregation propensities to globular proteins, and can be distinguished from unfoldable IDPs based on analysis of these sequence properties. While amino acid composition seems to be a major determinant of aggregation and secondary structure propensities, sequence randomization does not result in dramatic changes to these properties, but for both IDPs and globular proteins seems to fine-tune the tradeoff between folding and aggregation.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 21
    Publication Date: 2019
    Description: The Boltzmann–Gibbs (BG) entropy has been used in a wide variety of problems for more than a century. It is well known that BG entropy is additive and extensive, but for certain systems such as those dictated by long-range interactions, it is speculated that the entropy must be non-additive and non-extensive. Tsallis entropy possesses these characteristics, and is parameterized by a variable q (q = 1 being the classic BG limit), but unless q is determined from microscopic dynamics, the model remains a phenomenological tool. To this day, very few examples have emerged in which q can be computed from first principles. This paper shows that the space plasma environment, which is governed by long-range collective electromagnetic interaction, represents a perfect example for which the q parameter can be computed from microphysics. By taking the electron velocity distribution function measured in the heliospheric environment into account, and considering them to be in a quasi-equilibrium state with electrostatic turbulence known as quasi-thermal noise, it is shown that the value corresponding to q = 9/13 ≈ 0.6923, or alternatively q = 5/9 ≈ 0.5556, may be deduced. This prediction is verified against observations made by spacecraft, and it is shown to be in excellent agreement. This paper constitutes an overview of recent developments regarding the non-equilibrium statistical mechanical approach to understanding the non-extensive nature of space plasma, although some recent new developments are also discussed.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
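    For orientation, the Tsallis entropy referred to above is the standard one-parameter generalization of the Boltzmann–Gibbs (BG) entropy; the textbook definition is quoted here for context, not as a result of the paper, and it recovers the BG form in the limit q → 1:

    $$ S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1}, \qquad \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i $$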
  • 22
    Publication Date: 2019
    Description: We present a quantum scheme for signing contracts between two clients (Alice and Bob) using entangled states and the services of a third trusted party (Trent). The trusted party is only contacted for the initialization of the protocol, and possibly at the end, to verify clients’ honesty and deliver signed certificates. The protocol is fair, i.e., the probability that a client, say Bob, can obtain a signed copy of the contract, while Alice cannot, can be made arbitrarily small, and scales as N^(-1/2), where 4N is the total number of rounds (communications between the two clients) of the protocol. Thus, the protocol is optimistic, as cheating is not successful, and the clients rarely have to contact Trent to confirm their honesty by delivering the actual signed certificates of the contract. Unlike the previous protocol (Paunković et al., Phys. Rev. A 84, 062331 (2011)), in the present proposal, a single client can obtain the signed contract alone, without the need for the other client’s presence. When first contacting Trent, the clients do not have to agree upon a definitive contract. Moreover, even upon terminating the protocol, the clients do not reveal the actual contract to Trent. Finally, the protocol is based on the laws of physics, rather than on mathematical conjectures and the exchange of a large number of signed authenticated messages during the actual contract signing process. Therefore, it is abuse-free, as Alice and Bob cannot prove they are involved in the contract signing process.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 23
    Publication Date: 2019
    Description: This study aims to improve the implementation of models of geospatial information in Web Ontology Language (OWL). Large amounts of geospatial information are maintained in Geographic Information Systems (GIS) based on models according to the Unified Modeling Language (UML) and standards from ISO/TC 211 and the Open Geospatial Consortium (OGC). Sharing models and geospatial information in the Semantic Web will increase the usability and value of models and information, as well as enable linking with spatial and non-spatial information from other domains. Methods for conversion from UML to OWL for basic concepts used in models of geospatial information have been studied and evaluated. Primary conversion challenges have been identified with specific attention to whether adapted rules for UML modelling could contribute to improved conversions. Results indicated that restrictions related to abstract classes, unions, compositions and code lists in UML are challenging in the Open World Assumption (OWA) on which OWL is based. Two conversion challenges are addressed by adding more semantics to UML models: global properties and reuse of external concepts. The proposed solution is formalized in a UML profile supported by rules and recommendations and demonstrated with a UML model based on the Intelligent Transport Systems (ITS) standard ISO 14825 Geographic Data Files (GDF). The scope of the resulting ontology will determine to what degree the restrictions shall be maintained in OWL, and different conversion methods are needed for different scopes.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
  • 24
    Publication Date: 2019
    Description: The 3D road network scene helps to simulate the distribution of road infrastructure and the corresponding traffic conditions. However, the existing road modeling methods have limitations such as inflexibility in different types of road construction, inferior quality in visual effects and poor efficiency for large-scale model rendering. To tackle these challenges, a template-based 3D road modeling method is proposed in this paper. In this method, the road GIS data is first pre-processed before modeling. The road centerlines are analyzed to extract topology information and resampled to improve path accuracy and match the terrain. Meanwhile, the road network is segmented and organized using a hierarchical block data structure. Road elements, including roadbeds, road facilities and moving vehicles are then designed based on templates. These templates define the geometric and semantic information of elements along both the cross-section and road centerline. Finally, the road network scene is built by the construction algorithms, where roads, at-grade intersections, grade separated areas and moving vehicles are modeled and simulated separately. The proposed method is tested by generating large-scale virtual road network scenes in the World Wind, an open source software package. The experimental results demonstrate that the method is flexible and can be used to develop different types of road models and efficiently simulate large-scale road network environments.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
  • 25
    Publication Date: 2019
    Description: It is becoming increasingly clear that causal analysis of dynamical systems requires different approaches than, for example, causal analysis of interconnected autoregressive processes. In this study, a correlation dimension estimated in reconstructed state spaces is used to detect causality. If deterministic dynamics plays a dominant role in data then the method based on the correlation dimension can serve as a fast and reliable way to reveal causal relationships between and within the systems. This study demonstrates that the method, unlike most other causal approaches, detects causality well, even for very weak links. It can also identify cases of uncoupled systems that are causally affected by a hidden common driver.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 26
    Publication Date: 2019
    Description: In Africa, there is growing knowledge regarding the use of data obtained by remote sensing and analysed while using Geographic Information Systems for solving myriad problems. The awareness has largely arisen through the efforts of the Programme on Space Applications (PSA) of the United Nations Office for Outer Space Affairs (UNOOSA), and the subsequent UN resolutions for the establishment of Regional Centres for Space Science and Technology Education, to train scientists and researchers in different thematic areas of space, including Remote Sensing/Geographic Information Systems (RS/GIS). The African Regional Centre for Space Science and Technology Education in English (ARCSSTE-E) is one of these regional centres. The Centre has successfully trained 474 professionals from 18 countries since its inception in 1998; about 14% of these trainees have been female. This paper highlights the training programmes of ARCSSTE-E from its inception, and discusses the potential areas of improvement with a focus on the RS/GIS area. In 2019, a survey was conducted on alumni of the Postgraduate Diploma (PGD) programme of ARCSSTE-E. Based on the analysis of their responses and the progression of the PGD programme to a new Masters programme in RS/GIS at the university, there is clear evidence regarding the impact of the UNOOSA-assisted capacity building programme on the work and career of alumni, which has already produced an appreciable number of trained personnel in developing countries in Africa.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
  • 27
    Publication Date: 2019
    Description: One of the main challenges in the design and implementation of fluidized desiccant cooling (FDC) systems is increasing their low COP (coefficient of performance). Exergy analysis is one of the tools especially suitable for the improvement and optimization of FDC systems. The improvement of performance is impossible as long as the main sources of exergy destruction are not identified and evaluated. In this paper, exergy analysis was applied in order to identify the components and processes of the FDC system that are mainly responsible for exergy destruction. Moreover, the exergy efficiency of a simple fluidized desiccant cooler was determined. The results showed that the fluidized beds and the regenerative heat exchanger were the main sources of exergy destruction, with 32% and 18% shares of the total exergy destruction, respectively. On the other hand, the direct evaporative cooler and the air cooler placed after the desorbing fluidized bed were characterized by the lowest exergy efficiencies. This work contributes to a better understanding of FDC operation principles and to the improvement of the performance of FDC technology.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 28
    Publication Date: 2019
    Description: The role of local electron–vibration and electron–electron interactions on the thermoelectric properties of molecular junctions is theoretically analyzed focusing on devices based on fullerene molecules. A self-consistent adiabatic approach is used in order to obtain a non-perturbative treatment of the electron coupling to low frequency vibrational modes, such as those of the molecule center of mass between metallic leads. The approach also incorporates the effects of strong electron–electron interactions between molecular degrees of freedom within the Coulomb blockade regime. The analysis is based on a one-level model which takes into account the relevant transport level of fullerene and its alignment to the chemical potential of the leads. We demonstrate that only the combined effect of local electron–vibration and electron–electron interactions is able to predict the correct behavior of both the charge conductance and the Seebeck coefficient in very good agreement with available experimental data.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 29
    Publication Date: 2019
    Description: Conventional Geographical Information Systems (GIS) software struggles to represent uncertain and contested historical knowledge. An ontology, meaning a semantic structure defining named entities, and explicit and typed relationships, can be constructed in the absence of locational data, and spatial objects can be attached to this structure if and when they become available. We describe the overall architecture of the Great Britain Historical GIS, and the PastPlace Administrative Unit Ontology that forms its core. Then, we show how particular historical geographies can be represented within this architecture through two case studies, both emphasizing entity definition and especially the application of a multi-level typology, in which each “unit” has an unchanging “type” but also a time-variant “status”. The first includes the linked systems of Poor Law unions and registration districts in 19th century England and Wales, in which most but not all unions and districts were coterminous. The second case study includes the international system of nation-states, in which most units do not appear from nothing, but rather gain or lose independence. We show that a relatively simple data model is able to represent much historical complexity.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
  • 30
    Publication Date: 2019
    Description: Coherence is associated with transient quantum states; in contrast, equilibrium thermal quantum systems have no coherence. We investigate the quantum control task of generating maximum coherence from an initial thermal state employing an external field. A completely controllable Hamiltonian is assumed allowing the generation of all possible unitary transformations. Optimizing the unitary control to achieve maximum coherence leads to a micro-canonical energy distribution on the diagonal energy representation. We demonstrate such a control scenario starting from a given Hamiltonian applying an external field, reaching the control target. Such an optimization task is found to be trap-less. By constraining the amount of energy invested by the control, maximum coherence leads to a canonical energy population distribution. When the optimization procedure constrains the final energy too tightly, local suboptimal traps are found. The global optimum is obtained when a small Lagrange multiplier is employed to constrain the final energy. Finally, we explore the task of generating coherences restricted to be close to the diagonal of the density matrix in the energy representation.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 31
    Publication Date: 2019
    Description: Solar magnetism is believed to originate through dynamo action in the tachocline. Statistical mechanics, in turn, tells us that dynamo action is an inherent property of magnetohydrodynamic (MHD) turbulence, depending essentially on magnetic helicity. Here, we model the tachocline as a rotating, thin spherical shell containing MHD turbulence. Using this model, we find an expression for the entropy and from this develop the thermodynamics of MHD turbulence. This allows us to introduce the macroscopic parameters that affect magnetic self-organization and dynamo action, parameters that include magnetic helicity, as well as tachocline thickness and turbulent energy.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 32
    Publication Date: 2019
    Description: This paper aims to dissociate nonlocality from quantum theory. We demonstrate that tests of violation of Bell-type inequalities are simply statistical tests of the local incompatibility of observables. In fact, these are tests of violation of the Bohr complementarity principle. Thus, attempts to couple experimental violations of Bell-type inequalities with “quantum nonlocality” are really misleading. These violations are explained in quantum theory as manifestations of the incompatibility of observables for a single quantum system, e.g., the spin projections for a single electron or the polarization projections for a single photon. Of course, one can go beyond quantum theory with hidden-variables models (as was suggested by Bell) and then discuss their possible nonlocal features. However, conventional quantum theory is local.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 33
    Publication Date: 2019
    Description: Mathematical modeling of the heat and mass transfer processes in the evaporating droplet–high-temperature gas medium system is difficult due to the need to describe the dynamics of the formation of the quasi-steady temperature field of evaporating droplets, as well as of the gas-vapor buffer layer around them and in their wake, during evaporation in high-temperature gas flows. We used planar laser-induced fluorescence (PLIF) and laser-induced phosphorescence (LIP). The experiments were conducted with water droplets (initial radius 1–2 mm) heated in a hot air flow (temperature 20–500 °C, velocity 0.5–6 m/s). Unsteady temperature fields of the water droplets and of the gas-vapor mixture around them were recorded. The high inhomogeneity of the temperature fields under study was confirmed. To determine the temperature in the so-called dead zones, we solved a heat transfer problem in which the temperatures in the boundary conditions were set on the basis of the experimental values.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 34
    Publication Date: 2019
    Description: In this research work, a 3D rotating flow of carbon nanotubes (CNTs) over a porous stretchable sheet for heat and mass transfer is investigated. Kerosene oil is considered as the base liquid, and two types of CNTs, single-walled (SWCNTs) and multi-walled (MWCNTs), are added as additives to the base liquid. The present analysis further comprises the combined effects of the Hall and ion-slip currents and thermal radiation, along with heat generation/absorption. The governing equations are reduced to an appropriate system of ordinary differential equations by applying suitable transformations. The resulting nonlinear system of equations (conservation of mass, momentum and temperature) is solved by the Homotopy Analysis Method (HAM). Solutions for the velocity and thermal fields are obtained and discussed graphically. Expressions for the skin friction coefficient (C_f) and the Nusselt number (Nu) are derived for both types of nanoliquid. The influences of the prominent physical factors on the velocity and thermal profiles are plotted using Mathematica. These graphical results are qualitatively in excellent agreement with previously published results. Also, single-walled CNTs are found to have higher temperatures than multi-walled CNTs.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 35
    Publication Date: 2019
    Description: Chromatin immunoprecipitation combined with next-generation sequencing (ChIP-Seq) technology has enabled the identification of transcription factor binding sites (TFBSs) on a genome-wide scale. To effectively and efficiently discover TFBSs in the thousand or more DNA sequences generated by a ChIP-Seq data set, we propose a new algorithm named AP-ChIP. First, we set two thresholds based on probabilistic analysis to construct and further filter the cluster subsets. Then, we use Affinity Propagation (AP) clustering on the candidate cluster subsets to find the potential motifs. Experimental results on simulated data show that the AP-ChIP algorithm is able to make an almost accurate prediction of TFBSs in a reasonable time. Also, the validity of the AP-ChIP algorithm is tested on a real ChIP-Seq data set.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
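    Affinity Propagation itself is available off the shelf. The sketch below clusters short DNA windows by k-mer composition with scikit-learn's AffinityPropagation as a rough analogue of the clustering stage described above; the probabilistic thresholding steps and real ChIP-Seq input of AP-ChIP are not reproduced.

    ```python
    import numpy as np
    from itertools import product
    from sklearn.cluster import AffinityPropagation

    def kmer_vector(seq, k=3):
        """Count vector of all 4^k k-mers in a DNA string."""
        index = {"".join(p): i for i, p in enumerate(product("ACGT", repeat=k))}
        v = np.zeros(len(index))
        for i in range(len(seq) - k + 1):
            v[index[seq[i:i + k]]] += 1
        return v

    seqs = [
        "ACGTACGTGG", "ACGTACGTCC", "ACGTACGTAA", "ACGTACGTTT",   # share one core motif
        "TTTTGGGGAA", "TTTTGGGGTT", "TTTTGGGGCC", "TTTTGGGGGG",   # share another
    ]
    X = np.array([kmer_vector(s) for s in seqs])

    labels = AffinityPropagation(random_state=0).fit_predict(X)
    print(labels)   # windows with similar k-mer content end up in the same cluster
    ```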
  • 36
    Publication Date: 2019
    Description: We propose a new clustering method based on a deep neural network. Given an unlabeled dataset and the number of clusters, our method directly groups the dataset into the given number of clusters in the original space. We use a conditional discrete probability distribution defined by a deep neural network as a statistical model. Our strategy is first to estimate the cluster labels of unlabeled data points selected from a high-density region, and then to conduct semi-supervised learning to train the model by using the estimated cluster labels and the remaining unlabeled data points. Lastly, by using the trained model, we obtain the estimated cluster labels of all given unlabeled data points. The advantage of our method is that it does not require restrictive conditions: existing clustering methods with deep neural networks assume that the cluster balance of a given dataset is uniform, whereas our method does not. Moreover, it can be applied to various data domains as long as the data are expressed as feature vectors. In addition, it is observed that our method is robust against outliers. Therefore, the proposed method is expected to perform, on average, better than previous methods. We conducted numerical experiments on five commonly used datasets to confirm the effectiveness of the proposed method.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 37
    Publication Date: 2019
    Description: Query complexity is a common tool for comparing quantum and classical computation, and it has produced many examples of how quantum algorithms differ from classical ones. Here we investigate in detail the role that oracles play for the advantage of quantum algorithms. We do so by using a simulation framework, Quantum Simulation Logic (QSL), to construct oracles and algorithms that solve some problems with the same success probability and number of queries as the quantum algorithms. The framework can be simulated using only classical resources at a constant overhead as compared to the quantum resources used in quantum computation. Our results clarify the assumptions made and the conditions needed when using quantum oracles. Using the same assumptions on oracles within the simulation framework we show that for some specific algorithms, such as the Deutsch-Jozsa and Simon’s algorithms, there simply is no advantage in terms of query complexity. This does not detract from the fact that quantum query complexity provides examples of how a quantum computer can be expected to behave, which in turn has proved useful for finding new quantum algorithms outside of the oracle paradigm, where the most prominent example is Shor’s algorithm for integer factorization.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 38
    Publication Date: 2019
    Description: In conventional textbook thermodynamics, entropy is a quantity that may be calculated by different methods, for example experimentally from heat capacities (following Clausius) or statistically from numbers of microscopic quantum states (following Boltzmann and Planck). It had turned out that these methods do not necessarily provide mutually consistent results, and for equilibrium systems their difference was explained by introducing a residual zero-point entropy (following Pauling), apparently violating the Nernst theorem. At finite temperatures, associated statistical entropies which count microstates that do not contribute to a body’s heat capacity, differ systematically from Clausius entropy, and are of particular relevance as measures for metastable, frozen-in non-equilibrium structures and for symbolic information processing (following Shannon). In this paper, it is suggested to consider Clausius, Boltzmann, Pauling and Shannon entropies as distinct, though related, physical quantities with different key properties, in order to avoid confusion by loosely speaking about just “entropy” while actually referring to different kinds of it. For instance, zero-point entropy exclusively belongs to Boltzmann rather than Clausius entropy, while the Nernst theorem holds rigorously for Clausius rather than Boltzmann entropy. The discussion of those terms is underpinned by a brief historical review of the emergence of corresponding fundamental thermodynamic concepts.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 39
    Publication Date: 2019
    Description: Model construction is a fundamental and important issue in the field of complex dynamical networks. Since the state-coupling complex dynamical network model was proposed, many kinds of complex dynamical network models have been introduced to account for various practical situations. In this paper, aiming at the data loss that may take place in the communication between any pair of directly connected nodes in a complex dynamical network, we propose a new discrete-time complex dynamical network model by constructing an auxiliary observer and choosing the observer states to compensate for the lost states in the coupling term. By employing Lyapunov stability theory and stochastic analysis, a sufficient condition is derived to guarantee that the compensation values eventually equal the lost values; that is, the influence of data loss is ultimately eliminated in the proposed model. Moreover, we generalize the modeling method to output-coupling complex dynamical networks. Finally, two numerical examples are provided to demonstrate the effectiveness of the proposed model.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 40
    Publication Date: 2019
    Description: In ecology and evolution, entropic methods are now used widely and increasingly frequently. Their use can be traced back to Ramon Margalef’s first attempt 70 years ago to use log-series to quantify ecological diversity, including searching for ecologically meaningful groupings within a large assemblage, which we now call the gamma level. The same year, Shannon and Weaver published a generally accessible form of Shannon’s work on information theory, including the measure that we now call Shannon–Wiener entropy. Margalef seized on that measure and soon proposed that ecologists should use the Shannon–Wiener index to evaluate diversity, including assessing local (alpha) diversity and differentiation between localities (beta). He also discussed relating this measure to environmental variables and ecosystem processes such as succession. Over the subsequent decades, he enthusiastically expanded upon his initial suggestions. Fittingly, 2019 would also have been Margalef’s 100th birthday.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 41
    Publication Date: 2019
    Description: Point-of-interest (POI) recommendation is one of the fundamental tasks for location-based social networks (LBSNs). Most existing methods are based on collaborative filtering (CF), Markov chains (MC) or recurrent neural networks (RNN). However, CF-based methods have difficulty capturing users’ dynamic preferences, MC-based methods suffer from strong independence assumptions, and RNN-based methods are still at an early stage of incorporating spatiotemporal context information and do not emphasize the user’s main behavioral intention in the current sequence. To address these problems, we propose an attention-based spatiotemporal gated recurrent unit (ATST-GRU) network model for POI recommendation in this paper. We first design a novel variant of the GRU, which acquires the user’s sequential preference and spatiotemporal preference by feeding the continuous geographical distance and time interval information into the GRU network at each time step. Then, we integrate an attention model into our network, which is a personalized process that captures the user’s main behavioral intention in their check-in history. Moreover, we conducted an extensive performance evaluation on two real-world datasets: Foursquare and Gowalla. The experimental results demonstrate that the proposed ATST-GRU network significantly outperforms existing state-of-the-art POI recommendation methods on two commonly used evaluation metrics.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
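    A minimal PyTorch sketch of the general idea, not the authors' ATST-GRU: the POI embedding at each step is concatenated with the time interval and the geographical distance to the previous check-in before entering a GRU, and an attention layer over the hidden states summarizes the sequence. All layer sizes and names here are illustrative assumptions.
        # Sketch only: spatiotemporal inputs per GRU step plus attention over hidden states.
        import torch
        import torch.nn as nn

        class SpatioTemporalGRU(nn.Module):
            def __init__(self, n_pois, emb_dim=32, hid_dim=64):
                super().__init__()
                self.emb = nn.Embedding(n_pois, emb_dim)
                self.gru = nn.GRU(emb_dim + 2, hid_dim, batch_first=True)   # +2 for (time interval, distance)
                self.att = nn.Linear(hid_dim, 1)
                self.out = nn.Linear(hid_dim, n_pois)

            def forward(self, poi_seq, dt_seq, dist_seq):
                x = torch.cat([self.emb(poi_seq), dt_seq.unsqueeze(-1), dist_seq.unsqueeze(-1)], dim=-1)
                h, _ = self.gru(x)                        # (batch, T, hid_dim)
                w = torch.softmax(self.att(h), dim=1)     # attention weights over time steps
                ctx = (w * h).sum(dim=1)                  # attended summary of the check-in sequence
                return self.out(ctx)                      # scores over candidate POIs

        model = SpatioTemporalGRU(n_pois=1000)
        scores = model(torch.randint(0, 1000, (4, 10)),   # 4 users, 10 check-ins each
                       torch.rand(4, 10), torch.rand(4, 10))
        print(scores.shape)                               # torch.Size([4, 1000])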
  • 42
    Publication Date: 2019
    Description: Complex natural disasters often cause great hardship and can result in a large number of casualties. A population affected by a natural disaster is at high risk and desperately in need of help. Even with timely assessment and knowledge of the degree to which natural disasters affect populations, challenges arise during emergency response in the aftermath of a natural disaster. This paper proposes an approach to assessing the near-real-time intensity of the affected population using social media data. Because of its fatal impact on the Philippines, Typhoon Haiyan was selected as a case study. The results show that the normalized affected population index (NAPI) has a significant ability to indicate the affected population intensity. With the geographic information of disasters, more accurate and relevant disaster relief information can be extracted from social media data. The method proposed in this paper will benefit disaster relief operations and decision-making, which can then be executed in a timely manner.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 43
    Publication Date: 2019
    Description: Fingerprints have long been used in automated fingerprint identification or verification systems. Singular points (SPs), namely the core and delta points, are basic features widely used for fingerprint registration, orientation field estimation, and fingerprint classification. In this study, we propose an adaptive method to detect SPs in a fingerprint image. The algorithm consists of three stages. First, an innovative enhancement method based on singular value decomposition is applied to remove the background of the fingerprint image. Second, a blurring-detection and boundary-segmentation algorithm based on the enhanced image is proposed to detect the region of the impression. Finally, an adaptive method based on wavelet extrema and the Henry system is proposed for core point detection. Experiments conducted using the FVC2002 DB1 and DB2 databases prove that our method can detect SPs reliably.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
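    The enhancement stage rests on a familiar idea, namely that the leading singular component of an image captures its smooth background; a generic sketch of that idea (not the authors' exact enhancement method) looks like this:
        # Sketch only: subtracting the dominant singular component suppresses a smooth background.
        import numpy as np

        rng = np.random.default_rng(0)
        texture = rng.random((128, 128)) * 0.2                             # stands in for ridge detail
        background = np.ones((128, 1)) * np.linspace(0, 1, 128)[None, :]   # smooth illumination gradient
        img = texture + background

        U, s, Vt = np.linalg.svd(img, full_matrices=False)
        rank1 = s[0] * np.outer(U[:, 0], Vt[0, :])                         # dominant singular component
        enhanced = img - rank1                                             # background largely suppressed

        # the leading component is almost perfectly correlated with the true background
        print(round(float(np.corrcoef(rank1.ravel(), background.ravel())[0, 1]), 3))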
  • 44
    Publication Date: 2019
    Description: This paper proposes a geometric estimator of dependency between a pair of multivariate random variables. The proposed estimator of dependency is based on a randomly permuted geometric graph (the minimal spanning tree) over the two multivariate samples. This estimator converges to a quantity that we call the geometric mutual information (GMI), which is equivalent to the Henze–Penrose divergence between the joint distribution of the multivariate samples and the product of the marginals. The GMI has many of the same properties as standard MI but can be estimated from empirical data without density estimation, making it scalable to large datasets. The proposed empirical estimator of GMI is simple to implement, involving the construction of a minimal spanning tree (MST) spanning both the original data and a randomly permuted version of these data. We establish asymptotic convergence of the estimator and convergence rates of the bias and variance for smooth multivariate density functions belonging to a Hölder class. We demonstrate the advantages of our proposed geometric dependency estimator in a series of experiments.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
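    The construction can be sketched with SciPy: pool the joint samples with samples whose second block has been randomly permuted, build an MST over the pooled set, and count edges joining the two groups (the Friedman-Rafsky statistic). The normalization below is the standard Henze–Penrose form and may differ in detail from the paper's exact GMI definition.
        # Sketch only: Friedman-Rafsky cross-count over an MST on pooled joint/permuted samples.
        import numpy as np
        from scipy.spatial.distance import pdist, squareform
        from scipy.sparse.csgraph import minimum_spanning_tree

        rng = np.random.default_rng(1)
        n = 400
        x = rng.normal(size=(n, 2))
        y = x + 0.3 * rng.normal(size=(n, 2))                 # Y depends on X

        joint = np.hstack([x, y])                             # samples from P(X, Y)
        perm = np.hstack([x, y[rng.permutation(n)]])          # permutation breaks the dependence: ~ P(X)P(Y)
        pooled = np.vstack([joint, perm])
        labels = np.r_[np.zeros(n), np.ones(n)]

        mst = minimum_spanning_tree(squareform(pdist(pooled))).tocoo()
        cross = int(np.sum(labels[mst.row] != labels[mst.col]))   # MST edges joining the two groups

        m = n                                                 # equal group sizes here
        hp_divergence = 1.0 - cross * (m + n) / (2.0 * m * n)
        print(round(hp_divergence, 3))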
  • 45
    Publication Date: 2019
    Description: Mental workload assessment is crucial in many real-life applications that require constant attention and where an imbalance of mental workload resources may cause safety hazards. As such, mental workload and its relationship with heart rate variability (HRV) have been well studied in the literature. However, the majority of the developed models have assumed that individuals are not ambulant, thus bypassing the issue of movement-related electrocardiography (ECG) artifacts and changing heart beat dynamics due to physical activity. In this work, multi-scale features for mental workload assessment of ambulatory users are explored. ECG data were sampled from users performing the multi-attribute test battery (MATB-II) task at varying difficulty levels while engaging in different types and levels of physical activity. The proposed features are shown to outperform benchmark ones and further exhibit complementarity when used in combination. Indeed, results show that gains over the benchmark HRV measures of 24.41% in accuracy and 27.97% in F1 score can be achieved even at high activity levels.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
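    For orientation, the benchmark HRV measures referred to above include simple time-domain indices such as SDNN and RMSSD, computed directly from RR intervals; the paper's multi-scale features are not reproduced here.
        # Sketch only: two standard time-domain HRV benchmarks from a short RR series.
        import numpy as np

        rr = np.array([0.82, 0.84, 0.80, 0.86, 0.83, 0.85, 0.81, 0.84])   # RR intervals in seconds

        sdnn = np.std(rr, ddof=1)                        # overall variability of the RR series
        rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))       # beat-to-beat (short-term) variability

        print(f"SDNN = {sdnn * 1000:.1f} ms, RMSSD = {rmssd * 1000:.1f} ms")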
  • 46
    Publication Date: 2019
    Description: This study compares the performance of five popular equal-area projections supported by Free and Open Source Software for Geo-spatial (FOSS4G): Sinusoidal, Mollweide, Hammer, Eckert IV and Homolosine. A set of 21,872 discrete distortion indicatrices was positioned on the ellipsoid surface, centred on the cells of a Snyder icosahedral equal-area grid. These indicatrices were projected onto the plane and the resulting angular and distance distortions computed, all using FOSS4G. The Homolosine is the only projection that manages to minimise angular and distance distortions simultaneously. It yields the lowest distortions among this set of projections and clearly outclasses the others when only land masses are considered. These results also indicate that the Sinusoidal and Hammer projections are largely outdated, imposing distortions too large to be useful. In contrast, the Mollweide and Eckert IV projections present trade-offs between visual expression and accuracy that are worth considering. However, for the purposes of storing and analysing big spatial data with FOSS4G, the superior performance of the Homolosine projection makes its choice difficult to avoid.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
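    A much smaller check in the same spirit, not the full 21,872-indicatrix Snyder-grid protocol, can be done with pyproj: project a pair of points with several of the candidate projections and compare the planar distance with the ellipsoidal geodesic. The proj-strings below are assumed parameterizations, not those used in the paper.
        # Sketch only: planar vs geodesic distance for one point pair under three equal-area projections.
        from pyproj import Geod, Transformer
        import math

        lon1, lat1, lon2, lat2 = 10.0, 45.0, 11.0, 46.0
        geod_dist = Geod(ellps="WGS84").inv(lon1, lat1, lon2, lat2)[2]   # ellipsoidal geodesic in metres

        for name, proj in [("Mollweide", "+proj=moll +ellps=WGS84"),
                           ("Sinusoidal", "+proj=sinu +ellps=WGS84"),
                           ("Homolosine", "+proj=igh +ellps=WGS84")]:
            t = Transformer.from_crs("EPSG:4326", proj, always_xy=True)
            x1, y1 = t.transform(lon1, lat1)
            x2, y2 = t.transform(lon2, lat2)
            planar = math.hypot(x2 - x1, y2 - y1)
            print(f"{name}: planar/geodesic distance ratio = {planar / geod_dist:.4f}")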
  • 47
    Publication Date: 2019
    Description: The concept of a “flow network”—a set of nodes and links which carries one or more flows—unites many different disciplines, including pipe flow, fluid flow, electrical, chemical reaction, ecological, epidemiological, neurological, communications, transportation, financial, economic and human social networks. This Feature Paper presents a generalized maximum entropy framework to infer the state of a flow network, including its flow rates and other properties, in probabilistic form. In this method, the network uncertainty is represented by a joint probability function over its unknowns, subject to all that is known. This gives a relative entropy function which is maximized, subject to the constraints, to determine the most probable or most representative state of the network. The constraints can include “observable” constraints on various parameters, “physical” constraints such as conservation laws and frictional properties, and “graphical” constraints arising from uncertainty in the network structure itself. Since the method is probabilistic, it enables the prediction of network properties when there is insufficient information to obtain a deterministic solution. The derived framework can incorporate nonlinear constraints or nonlinear interdependencies between variables, at the cost of requiring numerical solution. The theoretical foundations of the method are first presented, followed by its application to a variety of flow networks.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 48
    Publication Date: 2019
    Description: In this paper, we investigate the finite-time synchronization problem for a class of Markovian jumping complex networks (MJCNs) with non-identical nodes and impulsive effects. Sufficient conditions guaranteeing finite-time synchronization of the MJCNs are presented based on an M-matrix technique, the Lyapunov function method, stochastic analysis techniques, and suitable comparison systems. Finally, numerical examples are given to illustrate the theoretical results and testify to their effectiveness for complex dynamic systems.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 49
    Publication Date: 2019
    Description: A number of simplified models, based on perturbation theory, have been proposed for the fiber-optical channel and have been extensively used in the literature. Although these models are mainly developed for the low-power regime, they are used at moderate or high powers as well. It remains unclear to what extent the capacity of these models is affected by the simplifying assumptions under which they are derived. In this paper, we consider single-channel data transmission based on three continuous-time optical models: (i) a regular perturbative channel, (ii) a logarithmic perturbative channel, and (iii) the stochastic nonlinear Schrödinger (NLS) channel. To obtain analytically tractable discrete-time models, we consider zero-dispersion fibers and a sampling receiver. We investigate the per-sample capacity of these models. Specifically, (i) we establish tight bounds on the capacity of the regular perturbative channel; (ii) we obtain the capacity of the logarithmic perturbative channel; and (iii) we present a novel upper bound on the capacity of the zero-dispersion NLS channel. Our results illustrate that the capacity of these models departs from each other at high powers because these models yield different capacity pre-logs. Since all three models are based on the same physical channel, our results highlight that care must be exercised in using simplified channel models in the high-power regime.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 50
    Publication Date: 2019
    Description: The essential step of surrogating algorithms is phase randomizing the Fourier transform while preserving the original spectrum amplitude before computing the inverse Fourier transform. In this paper, we propose a new method which considers the graph Fourier transform. In this manner, much more flexibility is gained to define properties of the original graph signal which are to be preserved in the surrogates. The complex case is considered to allow unconstrained phase randomization in the transformed domain; hence we define a Hermitian Laplacian matrix that models the graph topology, whose eigenvectors form the basis of a complex graph Fourier transform. We show that the Hermitian Laplacian matrix may have negative eigenvalues. We also show that preserving the graph spectrum amplitude implies several invariances that can be controlled by the selected Hermitian Laplacian matrix. The interest of surrogating graph signals is illustrated in the context of scarcity of instances in classifier training.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
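    The surrogate recipe can be sketched as follows; the magnetic-Laplacian-style construction of the Hermitian matrix below is only an assumed stand-in, since the paper defines its own Hermitian Laplacian.
        # Sketch only: complex graph Fourier transform from an assumed Hermitian Laplacian,
        # then phase randomization of the transform coefficients.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 8
        W = rng.random((n, n)) * (rng.random((n, n)) < 0.4)       # random weighted directed graph
        np.fill_diagonal(W, 0)

        q = 0.25                                                  # assumed "charge" parameter
        sym = (W + W.T) / 2
        phase = np.exp(2j * np.pi * q * (W - W.T))                # edge direction encoded as a phase
        A = sym * phase                                           # Hermitian adjacency
        L = np.diag(sym.sum(axis=1)) - A                          # Hermitian Laplacian

        evals, V = np.linalg.eigh(L)                              # columns of V: complex GFT basis

        x = rng.normal(size=n)                                    # a real graph signal
        c = V.conj().T @ x                                        # forward graph Fourier transform
        c_sur = np.abs(c) * np.exp(2j * np.pi * rng.random(n))    # keep magnitudes, randomize phases
        x_sur = V @ c_sur                                         # surrogate signal (complex in general)

        print(np.allclose(np.abs(V.conj().T @ x_sur), np.abs(c))) # spectrum amplitude is preserved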
  • 51
    Publication Date: 2019
    Description: Almost any interaction between two physical entities can be described through the transfer of either charge, spin, momentum, or energy. Therefore, any theory able to describe these transport phenomena can shed light on a variety of physical, chemical, and biological effects, enriching our understanding of complex, yet fundamental, natural processes, e.g., catalysis or photosynthesis. In this review, we will discuss the standard workhorses for transport in nanoscale devices, namely Boltzmann’s equation and Landauer’s approach. We will emphasize their strengths, but also analyze their limits, proposing theories and models useful to go beyond the state of the art in the investigation of transport in nanoscale devices.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 52
    Publication Date: 2019
    Description: 3D city models are being used extensively in applications such as evacuation scenarios and energy consumption estimation. The main standard for 3D city models is the CityGML data model, which can be encoded through the CityJSON data format. CityGML and CityJSON use polygonal modelling in order to represent geometries. True topological data structures have proven to be more computationally efficient for geometric analysis than polygonal modelling. In a previous study, we introduced a method to topologically reconstruct CityGML models while maintaining the semantic information of the dataset, based solely on the combinatorial map (C-Map) data structure. As a result of the limitations of C-Map’s semantic representation mechanism, the resulting datasets could suffer either from the loss of semantic information or from its redundant repetition. In this article, we propose a solution for a more efficient representation of geometry, topology and semantics by incorporating the C-Map data structure into the CityGML data model and implementing a CityJSON extension to encode the C-Map data. In addition, we provide an algorithm for the topological reconstruction of CityJSON datasets to append them according to this extension. Finally, we apply our methodology to three open datasets in order to validate our approach when applied to real-world data. Our results show that the proposed CityJSON extension can represent all geometric information of a city model in a lossless way, providing additional topological information for the objects of the model.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 53
    Publication Date: 2019
    Description: We are now generating exponentially more data from more sources than just a few years ago. Big data, an already familiar term, has been generally defined as a massive volume of structured, semi-structured, and/or unstructured data, which may not be effectively managed and processed using traditional databases and software techniques. Visualizing a large amount of data easily and quickly via an Internet platform can therefore be problematic. From this perspective, the main aim of the paper is to test the point data visualization capabilities of selected JavaScript Mapping Libraries in order to measure their performance and ability to cope with large amounts of data. Nine datasets containing 10,000 to 3,000,000 points were generated from the Nature Conservation Database. Five libraries for marker clustering and two libraries for heatmap visualization were analyzed. Loading time and the ability to visualize large datasets were compared for each dataset and each library. The best-evaluated library was Mapbox GL JS (Graphics Library JavaScript), with the highest overall performance. Some of the tested libraries were not able to handle the desired amount of data. In general, fewer than 100,000 points was indicated as the threshold for implementation without a noticeable slowdown in performance. The usage of libraries that cannot handle larger volumes can be a limiting factor for point data visualization in the dynamic environment we live in nowadays.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 54
    Publication Date: 2019
    Description: Obstructive sleep apnea (OSA) syndrome is a common sleep disorder. As an alternative to polysomnography (PSG) for OSA screening, current automatic OSA detection methods mainly concentrate on feature extraction and classifier selection based on physiological signals. It has been reported that OSA is accompanied by autonomic nervous system (ANS) dysfunction, and heart rate variability (HRV) is a useful tool for ANS assessment. Therefore, in this paper, eight novel indices of short-time HRV are extracted for OSA detection, based on the proposed multi-bands time-frequency spectrum entropy (MTFSE) method. In the MTFSE, firstly, the power spectrum of HRV is estimated by the Burg–AR model and the time-frequency spectrum image (TFSI) is obtained. Secondly, according to the physiological significance of HRV, the TFSI is divided into multiple sub-bands by frequency. Last but not least, by studying the Shannon entropy of the different sub-bands and the relationships among them, the eight indices are obtained. In order to validate the performance of the MTFSE-based indices, the PhysioNet Apnea–ECG database and the K-nearest neighbor (KNN), support vector machine (SVM), and decision tree (DT) classification methods are used. The SVM classifier achieves the highest classification accuracy: its average accuracy is 91.89%, its average sensitivity is 88.01%, and its average specificity is 93.98%. The MTFSE-based indices thus provide a novel idea for the screening of OSA.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
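    A simplified illustration of the band-wise spectral-entropy idea: here Welch's method replaces the Burg AR spectrum used in the paper, and the standard VLF/LF/HF bands stand in for the paper's sub-band scheme, so none of the eight MTFSE indices are reproduced exactly.
        # Sketch only: Shannon entropy of the HRV spectrum within standard frequency bands.
        import numpy as np
        from scipy.signal import welch

        rng = np.random.default_rng(0)
        fs = 4.0                                                    # Hz, resampled RR tachogram
        tgrid = np.arange(0, 300, 1 / fs)
        rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * tgrid) + 0.02 * rng.normal(size=tgrid.size)

        f, pxx = welch(rr - rr.mean(), fs=fs, nperseg=256)          # Welch spectrum (stand-in for Burg AR)

        def band_entropy(lo, hi):
            p = pxx[(f >= lo) & (f < hi)]
            p = p / p.sum()                                         # normalize within the band
            return -np.sum(p * np.log(p + 1e-12))                   # Shannon entropy of the band spectrum

        for name, (lo, hi) in {"VLF": (0.003, 0.04), "LF": (0.04, 0.15), "HF": (0.15, 0.4)}.items():
            print(name, round(float(band_entropy(lo, hi)), 3))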
  • 55
    Publication Date: 2019
    Description: Chaos-based cryptosystems are currently being proposed in the literature to provide confidentiality for digital images, since the diffusion effect in the Advanced Encryption Standard (AES) algorithm is weak. According to the National Institute of Standards and Technology (NIST), security is the most important challenge to assess in cryptosystems, followed by cost and performance, and finally algorithm and implementation. Recent chaos-based image encryption algorithms present only basic security analyses, which could make them insecure for some applications. In this paper, we suggest an integral analysis framework covering comprehensive security analysis, cost and performance, and the algorithm and implementation of chaos-based image cryptosystems. The proposed guideline, based on 20 analysis points, can assist new cryptographic designers in presenting an integral analysis of new algorithms. Future comparisons of new schemes can thus be more consistent in terms of security and efficiency. In addition, we present aspects regarding digital chaos implementation, chaos validation, and key definition to improve the security of the overall cryptosystem. The suggested guideline does not guarantee security, nor is it intended to limit the freedom to implement new analyses. However, it provides, for the first time in the literature, a solid basis for the integral analysis of chaos-based image cryptosystems as an effective approach to improving security.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 56
    Publication Date: 2019
    Description: Thermally induced non-equilibrium gas flows have been simulated in the present study by coupling kinetic and extended thermodynamic methods. Three different types of thermally induced gas flows, including temperature-discontinuity- and temperature-gradient-induced flows and radiometric flow, have been explored in the transition regime. The temperature-discontinuity-induced flow case has shown that, as the Knudsen number increases, the regularised 26 (R26) moment equation system gradually loses its accuracy and validity. A coupled macro- and microscopic approach is employed to overcome these problems. The R26 moment equations are used at the macroscopic level for the bulk flow region, while the kinetic equation associated with the discrete velocity method (DVM) is applied to describe the gas close to the wall at the microscopic level, which yields a hybrid DVM/R26 approach. The numerical results have shown that the hybrid DVM/R26 method can be faithfully used for thermally induced non-equilibrium flows. The proposed scheme not only improves the accuracy of the results in comparison with the R26 equations, but also extends their capability to a wider range of Knudsen numbers. In addition, the hybrid scheme is able to reduce the computational memory and time cost compared to the DVM.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 57
    Publication Date: 2019
    Description: Large-scale three-dimensional (3D) reconstruction from multi-view images is used to generate 3D mesh surfaces, which are usually built for urban areas and are widely applied in many research hotspots, such as smart cities. Their simplification is a significant step for 3D roaming, pattern recognition, and other research fields. The simplification quality has been assessed in several studies. On the one hand, almost all studies on surface simplification have measured simplification errors using the surface comparison tool Metro, which does not preserve sufficient detail. On the other hand, the reconstruction precision of urban surfaces varies as a result of homogeneity or heterogeneity. Therefore, it is difficult to assess simplification quality without surface classification. These difficulties are addressed in this study by first classifying urban surfaces into planar surfaces, detailed surfaces, and urban frameworks according to the simplification errors of different types of surfaces and then measuring these errors after sampling. A series of assessment indexes are also provided to contribute to the advancement of simplification algorithms.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 58
    Publication Date: 2019
    Description: This paper develops an interval maximum entropy model for interval European option valuation by estimating the underlying asset distribution. The refined solution of the model is obtained by the Lagrange multiplier method. The particle swarm optimization algorithm is applied to calculate the density function of the underlying asset, which can be utilized to price the Shanghai Stock Exchange (SSE) 50 Exchange Traded Fund (ETF) option of China and the Boeing stock option of the United States. Results show that the maximum entropy distribution provides precise estimations for the underlying asset in interval number situations. In this way, we can obtain the distribution of the underlying assets and apply it to interval European option pricing in the financial market.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 59
    Publication Date: 2019
    Description: In order to comprehensively improve the security and efficiency of image encryption systems, a novel chaotic S-box based image encryption scheme is proposed. Firstly, a new compound chaotic system, the Sine-Tent map, is proposed to widen the chaotic range and improve the chaotic performance of 1D discrete chaotic maps. As a result, the new compound chaotic system is more suitable for cryptosystems. Secondly, an efficient and simple method for generating S-boxes is proposed, which can greatly improve the efficiency of S-box production. Thirdly, a novel double S-box based image encryption algorithm is proposed. By introducing equivalent key sequences {r, t} related to the image ciphertext, the proposed cryptosystem can resist the four classical types of attacks, which is an advantage over other S-box based encryption schemes. Furthermore, it enhances the resistance of the system to differential analysis attacks through two rounds of forward and backward confusion-diffusion operations with double S-boxes. The simulation results and security analysis verify the effectiveness of the proposed scheme. The new scheme has obvious efficiency advantages, which means that it has better application potential in real-time image encryption.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
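    A toy sketch of turning a chaotic sequence into a bijective 8-bit S-box by ranking; the sine-tent update below is only a plausible stand-in, since the abstract does not give the exact form of the compound map, and no security claim is attached to this construction.
        # Sketch only: rank a chaotic orbit to obtain a permutation of 0..255 (an S-box).
        import numpy as np

        def sine_tent(x, r=0.9):
            # assumed compound update mixing a sine map and a tent map, folded into [0, 1)
            tent = 2 * x if x < 0.5 else 2 * (1 - x)
            return (r * np.sin(np.pi * x) + (1 - r) * tent) % 1.0

        x, seq = 0.37, []
        for _ in range(256):
            x = sine_tent(x)
            seq.append(x)

        sbox = np.argsort(seq).astype(np.uint8)          # rank the chaotic samples: a permutation of 0..255
        assert len(set(sbox.tolist())) == 256            # bijective 8-bit S-box
        print(sbox[:16])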
  • 60
    Publication Date: 2019
    Description: Interest in memristors has risen due to their possible application both as memory units and as computational devices in combination with CMOS. This is in part due to their nonlinear dynamics and their strong dependence on the circuit topology. We provide evidence that purely memristive circuits can also be employed for computational purposes. In the present paper we show that a polynomial Lyapunov function in the memory parameters exists for the case of DC-controlled memristors. Such a Lyapunov function can be asymptotically approximated with binary variables and mapped to quadratic combinatorial optimization problems. This also shows a direct parallel between memristive circuits and the Hopfield-Little model. In the case of Erdős-Rényi random circuits, we show numerically that the distribution of the matrix elements of the projectors can be roughly approximated with a Gaussian distribution, and that it scales with the inverse square root of the number of elements. This provides an approximate but direct connection with the physics of disordered systems and, in particular, of mean-field spin glasses. Using this, together with the fact that the interaction is controlled by a projection operator onto the loop space of the circuit, we estimate the number of stationary points of the approximate Lyapunov function and provide a scaling formula as an upper bound in terms of the circuit topology only.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 61
    Publication Date: 2019
    Description: The analytically solvable chaotic system (ASCS) is a promising chaotic system for the chaos communication and radar fields. In this paper, we propose a maximum likelihood estimator (MLE) to estimate the frequency of an ASCS; a difference-integral (DI) detector is then designed with the estimated frequency, and the symbols encoded in the signal are recovered. In the proposed method, the frequency parameter is estimated by an MLE based on the square power of the received signal. The Cramer-Rao lower bound for blind frequency estimation and the bit error performance of symbol detection are analyzed to assess the performance of the proposed method. Numerical results validate the analysis and demonstrate that the proposed symbol detector achieves an error performance within about 1 dB of that of the coherent detector. The robustness of the proposed method towards its parameters is also verified through simulations.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 62
    Publication Date: 2019
    Description: Satellite data are underutilized in many branches of operational oceanography. Users outside of the satellite community often encounter difficulty in discovering the types of satellite measurements that are available, and determining which satellite products are best for operational activities. In addition, the large choice of satellite data providers, each with their own data access protocols and formats, can make data access challenging. The mission of the NOAA CoastWatch Program is to make ocean satellite data easier to access and to apply to operational uses. As part of this mission, the West Coast Node of CoastWatch developed the NOAA Ocean Satellite Course, which introduces scientists and resource managers to ocean satellite products, and provides them tools to facilitate data access when using common analysis software. These tools leverage the data services provided by ERDDAP, a data distribution system designed to make data access easier via a graphical user interface and via machine-to-machine connections. The course has been offered annually since 2006 and has been attended by over 350 participants. Results of post-course surveys are analyzed to measure course effectiveness. The lessons learned from conducting these courses include using the preferred software of the course participants, providing easy access to datasets that are appropriate (fit for purpose) for operation applications, developing tools that address common tasks of the target audience, and minimizing the financial barriers to attend the course.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 63
    Publication Date: 2019
    Description: It is often the case when studying complex dynamical systems that a statistical formulation can provide the greatest insight into the underlying dynamics. When discussing the behavior of such a system as it evolves in time, it is useful to have the notion of a metric between two given states. A popular measure of information change in a system under perturbation has been the relative entropy of the states, as this notion allows us to quantify the difference between states of a system at different times. In this paper, we investigate the relaxation problem given by single and coupled Ornstein–Uhlenbeck (O-U) processes and compare the information length with entropy-based metrics (relative entropy, Jensen divergence) as well as others. By measuring the total information length in the long-time limit, we show that it is only the information length that preserves the linear geometry of the O-U process. In the coupled O-U process, the information length is shown to be capable of detecting changes in both components of the system even when other metrics would detect almost nothing in one of the components. We show in detail that the information length is sensitive to the evolution of subsystems.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
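    Numerically, the information length of a single O-U relaxation can be approximated directly from its Gaussian time-dependent density, using L as the time integral of sqrt(E(t)) with E(t) the integral over x of (dp/dt)^2 / p; the parameter values below are arbitrary choices, not those of the paper.
        # Sketch only: information length of a single O-U relaxation from its closed-form density.
        import numpy as np

        gamma, D = 1.0, 0.5
        v_inf = D / gamma                               # stationary variance of dX = -gamma X dt + sqrt(2D) dW
        m0, v0 = 2.0, 0.1                               # initial mean and variance

        x = np.linspace(-6, 6, 2001)
        t = np.linspace(0, 8, 4001)
        dx, dt = x[1] - x[0], t[1] - t[0]

        def pdf(ti):
            m = m0 * np.exp(-gamma * ti)
            v = v_inf + (v0 - v_inf) * np.exp(-2 * gamma * ti)
            return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

        L, p_prev = 0.0, pdf(t[0])
        for ti in t[1:]:
            p_next = pdf(ti)
            dp_dt = (p_next - p_prev) / dt
            E = np.sum(dp_dt ** 2 / (0.5 * (p_prev + p_next) + 1e-300)) * dx   # E(t): integral of (dp/dt)^2 / p
            L += np.sqrt(E) * dt                                               # L: time integral of sqrt(E)
            p_prev = p_next

        print("information length L =", round(L, 3))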
  • 64
    Publication Date: 2019
    Description: The analysis of loss data is of utmost interest in many branches of the financial and insurance industries, in structural engineering and in operations research, among others. In the financial industry, determining the distribution of losses is the first step in computing regulatory risk capital; in insurance, we need the distribution of losses to determine the risk premia. In reliability analysis, one needs to determine the distribution of accumulated damage or the first time of occurrence of a composite event, and so on. Not only that, but in some cases we have data on the aggregate risk, yet we happen to be interested in determining the statistical nature of the different types of events that contribute to the aggregate loss. Even though in many of these branches of activity one may have good theoretical descriptions of the underlying processes, the nature of the problems is such that we must resort to numerical methods to actually compute the loss distributions. Besides being able to determine the distribution of losses numerically, we also need to assess how the distribution of losses, and the quantities computed with it, depend on the empirical data. It is the purpose of this note to illustrate how the maximum entropy method and its extensions can be used to deal with the various issues that come up in the computation of the distribution of losses. These methods prove to be robust and allow for extensions to the case when the data have measurement errors and/or are given only up to an interval.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 65
    Publication Date: 2019
    Description: Quantum teleportation is one of the most striking consequences of quantum mechanics and is defined as the transmission and reconstruction of an unknown quantum state over arbitrary distances. The concept was introduced for the first time in 1993 by Charles Bennett and coworkers; it has since been experimentally demonstrated by several groups under different conditions of distance and number of particles, and even with feed-forward. Twenty years after its first realization, this contribution reviews the experimental implementations realized at the Quantum Optics Group of the University of Rome La Sapienza.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 66
    Publication Date: 2019
    Description: Entropy should directly reflect the extent of disorder in proteins. By clustering structurally related proteins and studying the multiple sequence alignments (MSAs) of the sequences of these clusters, we were able to link sequence, structure, and disorder information. We introduced several parameters as measures of fluctuations at a given MSA site and used these as representative of the sequence and structure entropy at that site. In general, we found a tendency for negative correlations between disorder and structure, and significant positive correlations between disorder and the fluctuations in the system. We also found evidence for residue-type conservation for those residues proximate to potentially disordered sites, whereas mutations at the disordered site itself appear to be allowed. In addition, we found a positive correlation between disorder and accessible surface area, validating that disordered residues occur in exposed regions of proteins. Finally, we also found that fluctuations in the dihedral angles at the original mutated residue are positively correlated with disorder, while dihedral angle fluctuations in spatially proximal residues are negatively correlated with disorder. Our results seem to indicate permissible variability at the disordered site, but greater rigidity in the parts of the protein with which the disordered site interacts. This is another indication that disordered residues are involved in protein function.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
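    The per-site sequence-entropy idea can be illustrated on a toy alignment: the Shannon entropy of the residue distribution in each MSA column. This is a generic illustration, not one of the paper's specific fluctuation parameters.
        # Sketch only: per-column Shannon entropy of a toy multiple sequence alignment.
        import math
        from collections import Counter

        msa = ["MKTAY",
               "MKSAY",
               "MRTAF",
               "MKTGY"]                      # 4 sequences, 5 aligned columns

        for col in range(len(msa[0])):
            counts = Counter(seq[col] for seq in msa)
            total = sum(counts.values())
            h = -sum((c / total) * math.log2(c / total) for c in counts.values())
            print(f"column {col}: {dict(counts)}, entropy = {h:.2f} bits")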
  • 67
    Publication Date: 2019
    Description: Solving a constraint satisfaction problem (CSP) means finding an assignment of values to variables that satisfies a set of constraints. Ant colony optimization (ACO) is an efficient algorithm for solving CSPs. However, existing ACO-based algorithms suffer from constructing assignments with high cost. To improve the solution quality of ACO for solving CSPs, an ant colony optimization based on information entropy (ACOE) is proposed in this paper. The proposed algorithm can automatically call a crossover-based local search according to real-time information entropy. We first describe ACOE for solving CSPs and show how it constructs assignments. Then, we use a ranking-based strategy to update the pheromone, which weights the pheromone according to the rank of the ants. Furthermore, we introduce the crossover-based local search that uses a crossover operation to optimize the current best assignment. Finally, we compare ACOE with seven algorithms on binary CSPs. The experimental results reveal that our method outperforms the compared algorithms in terms of cost, data distribution, convergence performance, and the hypothesis test.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 68
    Publication Date: 2019
    Description: Since the submarine has become the major threat to maritime security, there is an urgent need to find a more efficient method of anti-submarine warfare (ASW). The digital twin theory is one of the most outstanding information technologies, and has been quite popular in recent years. The most influential change produced by digital twin is the ability to enable real-time dynamic interactions between the simulation world and the real world. Digital twin can be regarded as a paradigm by means of which selected online measurements are dynamically assimilated into the simulation world, with the running simulation model guiding the real world adaptively in reverse. By combining digital twin theory and random finite sets (RFSs) closely, a new framework of sensor control in ASW is proposed. Two key algorithms are proposed for supporting the digital twin-based framework. First, the RFS-based data-assimilation algorithm is proposed for online assimilating the sequence of real-time measurements with detection uncertainty, data association uncertainty, noise, and clutters. Second, the computation of the reward function by using the results of the proposed data-assimilation algorithm is introduced to find the optimal control action. The results of three groups of experiments successfully verify the feasibility and effectiveness of the proposed approach.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 69
    Publication Date: 2019
    Description: Pattern classification represents a challenging problem in machine learning and data science research domains, especially when there is a limited availability of training samples. In recent years, artificial neural network (ANN) algorithms have demonstrated astonishing performance when compared to traditional generative and discriminative classification algorithms. However, due to the complexity of classical ANN architectures, ANNs are sometimes incapable of providing efficient solutions when addressing complex distribution problems. Motivated by the mathematical definition of a quantum bit (qubit), we propose a novel autonomous perceptron model (APM) that can solve the problem of the architecture complexity of traditional ANNs. APM is a nonlinear classification model that has a simple and fixed architecture inspired by the computational superposition power of the qubit. The proposed perceptron is able to construct the activation operators autonomously after a limited number of iterations. Several experiments using various datasets are conducted, where all the empirical results show the superiority of the proposed model as a classifier in terms of accuracy and computational time when it is compared with baseline classification models.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 70
    Publication Date: 2019
    Description: The human postsynaptic density is an elaborate network comprising thousands of proteins, playing a vital role in the molecular events of learning and the formation of memory. Despite our growing knowledge of specific proteins and their interactions, atomic-level details of their full three-dimensional structure and their rearrangements are mostly elusive. Advancements in structural bioinformatics enabled us to depict the characteristic features of proteins involved in different processes aiding neurotransmission. We show that postsynaptic protein-protein interactions are mediated through the delicate balance of intrinsically disordered regions and folded domains, and this duality is also imprinted in the amino acid sequence. We introduce Diversity of Potential Interactions (DPI), a structure and regulation based descriptor to assess the diversity of interactions. Our approach reveals that the postsynaptic proteome has its own characteristic features and these properties reliably discriminate them from other proteins of the human proteome. Our results suggest that postsynaptic proteins are especially susceptible to forming diverse interactions with each other, which might be key in the reorganization of the postsynaptic density (PSD) in molecular processes related to learning and memory.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 71
    Publication Date: 2019
    Description: We develop an entropic framework to model the dynamics of stocks and European options. Entropic inference is an inductive inference framework equipped with proper tools to handle situations where incomplete information is available. The objective of the paper is to lay down an alternative framework for modeling dynamics. An important piece of information about the dynamics of a stock’s price is scale invariance. By imposing the scale-invariance symmetry, we arrive at choosing the logarithm of the stock’s price as the proper variable to model. The dynamics of the stock log-price is derived using two pieces of information, the continuity of motion and the directionality constraint. The resulting model is the same as the geometric Brownian motion (GBM) of the stock price, which is manifestly scale invariant. Furthermore, we obtain the dynamics of the probability density function, which is a Fokker–Planck equation. Next, we extend the model to value European options on a stock. Derivative securities ought to be priced such that there is no arbitrage. To ensure no-arbitrage pricing, we derive the risk-neutral measure by incorporating the risk-neutral information. Consequently, the Black–Scholes model and the Black–Scholes–Merton differential equation are derived.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
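    As a consistency check on the valuation end point (not the entropic derivation itself), the risk-neutral GBM can be simulated and the Monte Carlo price of a European call compared with the Black–Scholes closed form; all parameter values below are illustrative.
        # Sketch only: Monte Carlo under risk-neutral GBM vs the Black-Scholes closed form.
        import numpy as np
        from scipy.stats import norm

        S0, K, r, sigma, T = 100.0, 105.0, 0.02, 0.2, 1.0

        d1 = (np.log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
        d2 = d1 - sigma * np.sqrt(T)
        bs_call = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

        rng = np.random.default_rng(0)
        Z = rng.standard_normal(1_000_000)
        ST = S0 * np.exp((r - 0.5 * sigma ** 2) * T + sigma * np.sqrt(T) * Z)   # GBM terminal price
        mc_call = np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

        print(f"Black-Scholes: {bs_call:.3f}, Monte Carlo: {mc_call:.3f}")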
  • 72
    Publication Date: 2019
    Description: Action potentials (spikes) can trigger the release of a neurotransmitter at chemical synapses between neurons. Such release is uncertain, as it occurs only with a certain probability. Moreover, synaptic release can occur independently of an action potential (asynchronous release) and depends on the history of synaptic activity. We focus here on short-term synaptic facilitation, in which a sequence of action potentials can temporarily increase the release probability of the synapse. In contrast to the phenomenon of short-term depression, information transmission in facilitating synapses has yet to be quantified. We find rigorous lower and upper bounds for the rate of information transmission in a model of synaptic facilitation. We treat the synapse as a two-state binary asymmetric channel, in which the arrival of an action potential shifts the synapse to a facilitated state, while in the absence of a spike, the synapse returns to its baseline state. The information bounds are functions of both the asynchronous and synchronous release parameters. If synchronous release facilitates more than asynchronous release, the mutual information rate increases. In contrast, short-term facilitation degrades information transmission when the synchronous release probability is intrinsically high. As synaptic release is energetically expensive, we exploit the information bounds to determine the energy–information trade-off in facilitating synapses. We show that, unlike the information rate, the energy-normalized information rate is robust with respect to variations in the strength of facilitation.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
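    The channel view above admits a compact helper: the mutual information of a binary asymmetric channel, with p_syn and p_asyn standing in for the release probabilities with and without a presynaptic spike. The parameter names and values are illustrative, not taken from the paper.
        # Sketch only: mutual information (bits per use) of a binary asymmetric channel.
        import numpy as np

        def h2(p):
            p = np.clip(p, 1e-12, 1 - 1e-12)
            return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

        def mutual_information(p_spike, p_syn, p_asyn):
            p_release = p_spike * p_syn + (1 - p_spike) * p_asyn                            # P(release)
            return h2(p_release) - (p_spike * h2(p_syn) + (1 - p_spike) * h2(p_asyn))       # H(Y) - H(Y|X)

        print(round(float(mutual_information(p_spike=0.5, p_syn=0.6, p_asyn=0.05)), 4))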
  • 73
    Publication Date: 2019
    Description: The rental housing market plays a critical role in the United States real estate market. In addition, rent changes are also indicators of urban transformation and social phenomena. However, traditional data sources for market rent prediction are often inaccurate or inadequate at covering large geographies. With the development of housing information exchange platforms such as Craigslist, user-generated rental listings now provide big data that cover wide geographies and are rich in textual information. Given the importance of rent prediction in urban studies, this study aims to develop and evaluate models of rental market dynamics using deep learning approaches on spatial and textual data from Craigslist rental listings. We tested a number of machine learning and deep learning models (e.g., convolutional neural network, recurrent neural network) for the prediction of rental prices based on data collected from Atlanta, GA, USA. With textual information alone, deep learning models achieved an average root mean square error (RMSE) of 288.4 and mean absolute error (MAE) of 196.8. When combining textual information with location and housing attributes, the integrated model achieved an average RMSE of 227.9 and MAE of 145.4. These approaches can be applied to assess the market value of rental properties, and the prediction results can be used as indicators of a variety of urban phenomena and provide practical references for home owners and renters.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 74
    Publication Date: 2019
    Description: In this work, we consider the pros and cons of using various layers of keyless coding to achieve secure and reliable communication over the Gaussian wiretap channel. We define a new approach to information theoretic security, called practical secrecy and the secrecy benefit, to be used over real-world channels and finite-blocklength instantiations of coding layers, and use this new approach to show the fundamental reliability and security implications of several coding mechanisms that have traditionally been used for physical-layer security. We perform a systematic, structured analysis of the effect of error-control coding, scrambling, interleaving, and coset coding as coding layers of a secrecy system. Using this new approach, scrambling and interleaving are shown to have no effect on information theoretic security, even when the effect is measured at the output of the eavesdropper’s decoder. Error-control coding is shown to present a trade-off between secrecy and reliability that is dictated by the chosen code and the signal-to-noise ratios at the legitimate and eavesdropping receivers. Finally, the benefits of secrecy coding are highlighted, and it is shown how one can shape the secrecy benefit according to system specifications, using combinations of different layers of coding to achieve both reliable and secure throughput.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 75
    Publication Date: 2019
    Description: Advances in mobile technologies enable mobile devices to cooperate with each other to perform complex tasks that satisfy users’ composite service requirements. However, data with different sensitivities and heterogeneous systems with diverse security policies pose a great challenge to information flow security during service composition across multiple mobile devices. Qualitative information flow control mechanisms based on non-interference provide a solid security assurance on the propagation of a customer’s private data across multiple service participants. However, such strict disciplines limit service availability and may cause a high failure rate in service composition. Therefore, we propose a distributed quantitative information flow evaluation approach for service composition across multiple devices in mobile environments. The quantitative approach provides a more precise way to evaluate the leakage and supports customized disciplines on information flow security for the diverse requirements of different customers. Considering the limited energy of mobile devices, we use a distributed evaluation approach to better balance the consumption across service participants. The experiments and evaluations indicate that our approach can effectively improve the availability of composite services while ensuring security.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 76
    Publication Date: 2019
    Description: We establish lower bounds on the volume and the surface area of a geometric body using the size of its slices along different directions. In the first part of the paper, we derive volume bounds for convex bodies using generalized subadditivity properties of entropy combined with entropy bounds for log-concave random variables. In the second part, we investigate a new notion of Fisher information, which we call the L1-Fisher information, and show that certain superadditivity properties of the L1-Fisher information lead to lower bounds for the surface areas of polyconvex sets in terms of their slices.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 77
    Publication Date: 2019
    Description: Some uncertainty about flipping a biased coin can be resolved from the sequence of coin sides shown already. We report the exact amounts of predictable and unpredictable information in flipping a biased coin. Fractional coin flipping does not reflect any physical process, being defined as a binomial power series of the transition matrix for “integer” flipping. Due to strong coupling between the tossing outcomes at different times, the side repeating probabilities assumed to be independent for “integer” flipping get entangled with one another for fractional flipping. The predictable and unpredictable information components vary smoothly with the fractional order parameter. The destructive interference between two incompatible hypotheses about the flipping outcome culminates in a fair coin, which stays fair also for fractional flipping.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
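    The basic construction can be reproduced numerically as a fractional power of the 2x2 transition matrix, here via scipy.linalg.fractional_matrix_power, together with the entropy rate of the resulting chain; the paper's split into predictable and unpredictable components is richer than this.
        # Sketch only: "fractional flipping" as a fractional power of a biased coin's transition
        # matrix, plus the entropy rate (unpredictable information per flip) of the chain.
        import numpy as np
        from scipy.linalg import fractional_matrix_power

        p_stay = 0.7                                    # probability of repeating the previous side
        P = np.array([[p_stay, 1 - p_stay],
                      [1 - p_stay, p_stay]])            # "integer" flipping transition matrix

        pi = np.array([0.5, 0.5])                       # stationary distribution of this symmetric chain
        for alpha in [1.0, 0.5, 0.25]:
            Pa = np.clip(fractional_matrix_power(P, alpha).real, 1e-12, 1)
            entropy_rate = -np.sum(pi[:, None] * Pa * np.log2(Pa))
            print(f"alpha = {alpha}: row 0 = {np.round(Pa[0], 3)}, entropy rate = {entropy_rate:.3f} bits")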
  • 78
    Publication Date: 2019
    Description: Direct left turns (DLTs) can cause traffic slowdowns, delays, stops, and even accidents at intersections, especially on roads without medians. Channelization and signalization can significantly diminish the negative impact of DLTs. In China, a total of 56 large and medium-sized cities, including 17 provincial capitals, have adopted vehicle restriction policies to address traffic congestion, vehicle energy conservation and emission reduction, which causes travel inconvenience for citizens. This paper mainly studies the selection between signalization and channelization at intersections based on an entropy method. In addition to the three commonly used evaluation indexes, the number of vehicles, CO emissions and fuel consumption are added. The entropy evaluation method (EEM) is used to objectively calculate the weights of the six indexes, which yields the optimal traffic volume combinations for an intersection under the present situation, channelization and signalization. A VISSIM simulation is also used to evaluate the operating status of the three conditions. The results show that the EEM can greatly help in choosing between the two treatments at a given intersection. With the EEM, the six indexes decrease by as much as 20–70%.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
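    The entropy weighting step used above is a standard technique; the following minimal sketch computes objective index weights from a decision matrix (the numbers are made up, not the paper's data).

```python
import numpy as np

# Rows = alternatives (e.g., present situation, channelization, signalization),
# columns = evaluation indexes (e.g., delay, stops, queue length, vehicles, CO, fuel).
X = np.array([[35.0, 12.0, 48.0, 820.0, 5.1, 3.9],
              [28.0,  9.0, 40.0, 860.0, 4.6, 3.5],
              [22.0,  7.0, 33.0, 900.0, 4.2, 3.2]])

P = X / X.sum(axis=0)                       # column-wise normalisation
k = 1.0 / np.log(X.shape[0])
E = -k * (P * np.log(P)).sum(axis=0)        # entropy of each index
w = (1.0 - E) / (1.0 - E).sum()             # entropy weights

print("index weights:", np.round(w, 3))
```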
  • 79
    Publication Date: 2019
    Description: The ventilation mode significantly affects the cooling efficiency of the air conditioners in marine data centers. Three ventilation modes, namely underfloor ventilation, overhead ventilation, and side ventilation, are numerically investigated for a typical marine data center. Four independent parameters, including temperature, velocity, air age, and uniformity index, are applied to evaluate the performance of the three ventilation modes. Further, an analytic hierarchy process (AHP) entropy weight model is established and further analysis is conducted to find the optimal ventilation mode for the marine data center. The results indicate that the underfloor ventilation mode performs best in the airflow pattern and temperature distribution evaluations, with the highest scores of 91.84 and 90.60. If low energy consumption is required, the overhead ventilation mode is recommended, with a maximum score of 93.50. The evaluation results agree well with the three-dimensional simulation results, which further proves that the AHP entropy weight method is reasonable and highly adaptable for evaluating air-conditioning ventilation modes.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
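    The AHP part of an AHP-entropy weighting can be sketched as the principal eigenvector of a pairwise comparison matrix; a common (assumed here, not verified against the paper) way to combine it with entropy weights is a normalised product. The comparison matrix and the placeholder entropy weights below are invented.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for four criteria
# (temperature, velocity, air age, uniformity index).
A = np.array([[1.0, 3.0, 5.0, 2.0],
              [1/3, 1.0, 3.0, 1/2],
              [1/5, 1/3, 1.0, 1/4],
              [1/2, 2.0, 4.0, 1.0]])

vals, vecs = np.linalg.eig(A)
principal = np.real(vecs[:, np.argmax(np.real(vals))])
w_ahp = principal / principal.sum()              # AHP (subjective) weights

w_entropy = np.array([0.30, 0.25, 0.20, 0.25])   # placeholder entropy (objective) weights
w_combined = w_ahp * w_entropy
w_combined /= w_combined.sum()

print("AHP weights     :", np.round(w_ahp, 3))
print("combined weights:", np.round(w_combined, 3))
```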
  • 80
    Publication Date: 2019
    Description: Developers have long used game engines to visualize virtual worlds for players to explore. However, using real-world data in a game engine is always a challenging task, since most game engines have very little support for geospatial data. This paper presents our findings from exploring the Unity3D game engine for visualizing large-scale topographic data from mixed sources of terrestrial laser scanner models and topographic map data. Level of detail 3 (LOD3) 3D models of two buildings of the Universitas Gadjah Mada campus were obtained with a terrestrial laser scanner and converted into the FBX format. Mapbox for Unity was used to provide georeferencing support for the 3D models. Unity3D also used road and place-name layers via Mapbox for Unity based on OpenStreetMap (OSM) data. LOD1 buildings were modeled from topographic map data using Mapbox, and the 3D models from the terrestrial laser scanner replaced two of these buildings. Building information and attributes, as well as visual appearances, were added to the 3D features. The Unity3D game engine provides a rich set of libraries and assets for user interaction, and custom C# scripts were used to provide a bird’s-eye-view mode with 3D zoom, pan, and orbital display. In addition to basic 3D navigation tools, a first-person view of the scene enables users to gain a walk-through experience while virtually inspecting objects on the ground. For a fly-through experience, a drone view helps users inspect objects from the air. The result is a multiplatform 3D visualization capable of displaying 3D models in LOD3 and providing user interfaces for exploring the scene using “on the ground” and “from the air” first-person interactions. Using the Unity3D game engine to visualize mixed sources of topographic data creates many opportunities to optimize the use of large-scale topographic data.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
  • 81
    Publication Date: 2019
    Description: The analysis of resting-state brain activity recorded in magnetoencephalograms (MEGs) with new algorithms of symbolic dynamics analysis could help obtain a deeper insight into the functioning of the brain and identify potential differences between males and females. Permutation Lempel-Ziv complexity (PLZC), a recently introduced non-linear signal processing algorithm based on symbolic dynamics, was used to evaluate the complexity of MEG signals in source space. PLZC was estimated in a broad band of frequencies (2–45 Hz), as well as in narrow bands (i.e., theta (4–8 Hz), alpha (8–12 Hz), low beta (12–20 Hz), high beta (20–30 Hz), and gamma (30–45 Hz)) in a sample of 98 healthy elderly subjects (49 males, 49 females) aged 65–80 (average age 72.71 ± 4.22 for males and 72.67 ± 4.21 for females). PLZC was significantly higher for females than males in the high beta band at posterior brain regions including the precuneus and the parietal and occipital cortices. Further statistical analyses showed that higher complexity values over regions largely overlapping with those mentioned above were associated with larger hippocampal volumes only in females. These results suggest that sex differences in healthy aging can be identified from the analysis of magnetoencephalograms with novel signal processing methods.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
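    A compact sketch of permutation Lempel-Ziv complexity: symbolize the signal with ordinal (permutation) patterns and count new phrases with a simple Lempel-Ziv-style parser. This is a generic reading of PLZC, not the authors' exact implementation or normalisation; the embedding dimension, lag, and test signal are arbitrary.

```python
import numpy as np
from itertools import permutations

def ordinal_symbols(x, m=3, tau=1):
    """Map each window of length m (lag tau) to the index of its ordinal pattern."""
    patterns = {p: i for i, p in enumerate(permutations(range(m)))}
    n = len(x) - (m - 1) * tau
    return [patterns[tuple(np.argsort(x[i:i + m * tau:tau]))] for i in range(n)]

def lz_complexity(seq):
    """Count the distinct phrases in a simple dictionary-based LZ-style parsing."""
    phrases, phrase = set(), ()
    for s in seq:
        phrase += (s,)
        if phrase not in phrases:
            phrases.add(phrase)
            phrase = ()
    return len(phrases) + (1 if phrase else 0)

rng = np.random.default_rng(0)
signal = rng.standard_normal(2000)           # stand-in for one MEG source signal
print("PLZC (unnormalised):", lz_complexity(ordinal_symbols(signal)))
```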
  • 82
    Publication Date: 2019
    Description: Information from social media microblogging has been applied to the management of emergency situations following disasters. In particular, such blogs contain much information about the public perception of disasters. However, the effective collection and use of disaster information from microblogs still present a significant challenge. In this paper, a spatial distribution detection method for emergency information is established based on the urgency-degree grading of microblogs and spatial autocorrelation analysis. Moreover, a character-level convolutional neural network classifier is applied for microblog classification in order to mine the spatio-temporal change process of emergency rescue information. The results from the Jiuzhaigou (Sichuan, China) earthquake case study demonstrate that different emergency information types exhibit different time-variation characteristics. Moreover, spatial autocorrelation analysis based on the degree of text urgency can exclude the influence of the uneven spatial distribution of microblog users and accurately determine the level of urgency of the situation. In addition, the classification and spatio-temporal analysis methods combined in this study can effectively mine the required emergency information, allowing us to understand the spatio-temporal changes of emergency information. Our study can serve as a reference for microblog information applications in the field of emergency rescue activity.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
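    Spatial autocorrelation analyses of this kind commonly use global Moran's I; a minimal sketch with a toy binary neighbour matrix follows (invented data, not the Jiuzhaigou dataset, and not necessarily the exact statistic used in the paper).

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I for values x and spatial weight matrix W."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    num = (W * np.outer(z, z)).sum()
    return (len(x) / W.sum()) * num / (z ** 2).sum()

# Toy urgency scores of four spatial units and a symmetric neighbour matrix.
urgency = [3.0, 2.5, 0.5, 0.2]
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print("Moran's I =", round(morans_i(urgency, W), 3))
```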
  • 83
    Publication Date: 2019
    Description: Quantitative assessments and dynamic monitoring of indicators based on fine-scale population data are necessary to support the implementation of the United Nations (UN) 2030 Agenda and to comprehensively achieve its 17 Sustainable Development Goals (SDGs). However, most population data are collected by administrative unit, and it is difficult for such data to reflect the true spatial distribution and uniformity of the population. To solve this problem, a geospatial disaggregation method of population data based on fine building information is presented in this paper to support SDG assessments. First, Deqing County in China, which was divided into residential and non-residential areas following the idea of dasymetric mapping, was selected as the study area. Then, the town administrative areas were taken as control units, building area and number of floors were used as weighting factors to establish the disaggregation model, and population data with a resolution of 30 m for Deqing County in 2016 were obtained. After comparing the statistical population of 160 villages with the disaggregation results, we found that the global average accuracy was 87.08%. Finally, using the disaggregated population data, indicators 3.8.1, 4.a.1, and 9.1.1 were selected for an accessibility analysis and a buffer analysis in a quantitative assessment of the SDGs. The results showed that the SDG measurement and assessment results based on the disaggregated population data were more accurate and effective than those obtained with the traditional method.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
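    The disaggregation model described above amounts to distributing each control unit's census population over its buildings in proportion to building area times number of floors. A toy sketch of that weighting step follows (made-up buildings, not the Deqing data; gridding to 30 m cells is omitted).

```python
import pandas as pd

buildings = pd.DataFrame({
    "town":   ["A", "A", "A", "B", "B"],
    "area":   [400.0, 250.0, 600.0, 300.0, 500.0],   # footprint area (m^2)
    "floors": [5, 2, 8, 3, 6],
})
town_pop = {"A": 4200, "B": 1800}                     # census population per control unit

buildings["weight"] = buildings["area"] * buildings["floors"]
buildings["pop"] = buildings.groupby("town")["weight"].transform(
    lambda w: w / w.sum()) * buildings["town"].map(town_pop)

print(buildings[["town", "weight", "pop"]])
```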
  • 84
    Publication Date: 2019
    Description: Recent federal documents devoted to the economic development of the Arctic zone highlight eight basic areas, the future innovation centers of regional development. In total, 150 investment projects are planned by 2030, of which 48% are designated for mineral resource extraction, 16% for transport development, 7% for geological surveys, 2% for environmental safety protection, etc. At the same time, these ambitious plans should meet the goals of a green economy. This means that territorial planning will have to consider at least three spatially differentiated issues: socio-economic, ecological, and environmental (natural hazards, climatic changes, etc.). Thus, the initial stage of territorial planning for economic development requires an evaluation of different spatial combinations of these issues. This research presents an algorithm for evaluating the joint impact of basic regional components characterizing the “nature-population-economy” interrelations, in order to reveal their spatial differences and demonstrate options and risks for the future sustainable development of the Russian Arctic. The basic research methods included system analysis with GIS tools. The accumulated data were arranged in three blocks comprising the principal regional factors that control sustainable development. In order to find the different patterns of sustainability provided by these factors, pairwise assessments of ecological/economic, environmental/economic and ecological/environmental data were performed. The environmental factors, treated as independent variables, offered different spatial natural patterns either promoting or hampering economic development. The data of all three blocks could not be assessed jointly, because the discussed framework of regional sustainability factors belongs to a spatial regional system of a panarchic character. The ranking results were visualized in a map in which the selected pair groups are shown for each basic territory of advanced development. The proportional correlation of social, economic and ecological factors was visualized using the RGB color triangle method.
    Electronic ISSN: 2220-9964
    Topics: Architecture, Civil Engineering, Surveying , Geosciences
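    The RGB colour-triangle visualisation mentioned at the end can be read, in its simplest form, as mapping three normalised factor scores to the red, green and blue channels; the sketch below is that generic reading with invented scores, not the paper's exact colour scheme.

```python
def rgb_triangle(social, economic, ecological):
    """Map three non-negative factor scores to an RGB colour whose channels
    reflect their relative proportions (R=social, G=economic, B=ecological)."""
    total = social + economic + ecological
    if total == 0:
        return (0, 0, 0)
    return tuple(int(round(255 * v / total)) for v in (social, economic, ecological))

# Example: a territory dominated by economic factors -> mostly green.
print(rgb_triangle(social=0.2, economic=0.6, ecological=0.2))
```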
  • 85
    Publication Date: 2019
    Description: In this paper, we propose a theoretical framework to analyze the secure communication problem of broadcasting two encrypted sources in the presence of an adversary that launches side-channel attacks. The adversary is not only allowed to eavesdrop on the ciphertexts in the public communication channel, but is also allowed to gather additional information on the secret keys via side-channels, i.e., physical phenomena leaked by the encryption devices during the encryption process, such as fluctuations of power consumption, heat, or electromagnetic radiation generated by the encryption devices. Based on our framework, we propose a countermeasure against such an adversary using the post-encryption-compression (PEC) paradigm in the case of one-time-pad encryption. We implement the PEC paradigm using affine encoders constructed from linear encoders and derive explicit sufficient conditions to attain an exponential decay of the information leakage as the block lengths of the encrypted sources become large. One interesting feature of the proposed countermeasure is that its performance is independent of the type of side information leaked by the encryption devices.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
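    A toy numerical illustration of the post-encryption-compression idea: one-time-pad encryption followed by compression with a random linear map over GF(2). The block sizes and the use of a plain random matrix are assumptions for illustration; the paper's affine encoders are constructed far more carefully, and the decoder is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 16, 10                                # source length and compressed length (toy sizes)

x = rng.integers(0, 2, n)                    # source block (bits)
k = rng.integers(0, 2, n)                    # one-time-pad key
c = (x + k) % 2                              # ciphertext: one-time-pad encryption

A = rng.integers(0, 2, (m, n))               # random linear encoder over GF(2)
c_compressed = A @ c % 2                     # post-encryption compression

# The legitimate receiver, knowing k and A, works from A@c = A@x + A@k (mod 2);
# recovering x exactly requires a decoder (e.g., syndrome-style decoding),
# which is not shown in this sketch.
print("bits sent:", len(c_compressed), "instead of", n)
```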
  • 86
    Publication Date: 2019
    Description: Based on a sample of geolocated elements, each of them labeled with a (not necessarily ordered) categorical feature, several indexes for assessing the relationship between the geolocation variables (latitude and longitude) and the categorical variable are evaluated. Among these indexes, a new one based on a Voronoi tessellation presents several advantages since it does not require a variable transformation or a previous discretization; in addition, simulations show that this index is considerably robust when compared with the previously known ones. Finally, the use of the presented indexes is also illustrated by analyzing the geolocation of communities in some communication networks derived from Call Detail Records.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
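    The record does not spell out the Voronoi-based index, so the sketch below shows only the machinery such an index would rely on: building the Voronoi adjacency of the sample with SciPy and computing a simple, hypothetical category-agreement score over adjacent pairs. This score is an illustration, not the index proposed in the paper.

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(3)
points = rng.uniform(size=(200, 2))                    # toy longitudes/latitudes
labels = rng.integers(0, 3, size=len(points))          # toy categorical feature

vor = Voronoi(points)
# ridge_points lists the pairs of input points whose Voronoi cells share a ridge.
pairs = np.asarray(vor.ridge_points)

agreement = (labels[pairs[:, 0]] == labels[pairs[:, 1]]).mean()
print("fraction of Voronoi-adjacent pairs with the same category:", round(agreement, 3))
```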
  • 87
    Publication Date: 2019
    Description: Understanding or estimating co-evolution processes is critical in ecology, but very challenging. Traditional methods have difficulty dealing with the complex processes of evolution and predicting their consequences in nature. In this paper, we use deep reinforcement learning algorithms to endow organisms with learning ability and simulate their evolution process with a Monte Carlo simulation algorithm in a large-scale ecosystem. The combination of the two algorithms allows organisms to use experience to determine their behavior through interaction with the environment, and to pass experience on to their offspring. Our research showed that the predators’ reinforcement learning ability contributed to the stability of the ecosystem and helped predators obtain a more reasonable behavior pattern of coexistence with their prey. The reinforcement learning effect of the prey on their own population was not as good as that of the predators and increased the risk of extinction of the predators. Inconsistent learning periods and speeds of prey and predators aggravated that risk. The co-evolution of the two species resulted in smaller populations of both due to their potentially antagonistic evolutionary networks. If learning-capable predators and prey invaded an ecosystem at the same time, the prey had an advantage. Thus, the proposed model illustrates the influence of a learning mechanism on a predator–prey ecosystem and demonstrates the feasibility of predicting behavior evolution in a predator–prey ecosystem using AI approaches.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
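    A minimal tabular Q-learning loop of the kind that could endow an agent with learning ability is sketched below. This is a generic illustration only; the paper combines deep reinforcement learning with a Monte Carlo ecosystem simulation, which is far richer than this placeholder environment.

```python
import numpy as np

n_states, n_actions = 25, 4                 # e.g., cells of a small grid world and moves
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.95, 0.1          # learning rate, discount, exploration
rng = np.random.default_rng(0)

def step(state, action):
    """Placeholder environment: returns a random next state and a toy reward
    (+1 if a designated 'prey' cell is reached). A real simulation would go here."""
    next_state = rng.integers(n_states)
    reward = 1.0 if next_state == 0 else 0.0
    return next_state, reward

state = rng.integers(n_states)
for _ in range(10_000):
    action = rng.integers(n_actions) if rng.random() < eps else int(Q[state].argmax())
    next_state, reward = step(state, action)
    # Q-learning update rule.
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state
```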
  • 88
    Publication Date: 2019
    Description: Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon’s mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
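    For reference, the Rényi divergence of order α that the first family builds on has the standard definition (quoted as background, not as a result of the paper):

```latex
D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1}
  \log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha},
  \qquad \alpha \in (0,1) \cup (1,\infty),
```

    and it tends to the Kullback–Leibler divergence as α → 1, consistent with both dependence measures reducing to Shannon's mutual information at α = 1.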
  • 89
    Publication Date: 2019
    Description: We set out to demonstrate that the Rényi entropies are better thought of as operating in a type of non-linear semiring called a positive semifield. We show how Rényi’s postulates lead to Pap’s g-calculus, where the functions carrying out the domain transformation are Rényi’s information function and its inverse. In turn, Pap’s g-calculus under Rényi’s information function transforms the set of positive reals into a family of semirings where the “standard” product has been transformed into a sum and the “standard” sum into a power-emphasized sum. Consequently, the transformed product has an inverse, whence the structure is actually that of a positive semifield. Instances of this construction lead to idempotent analysis and tropical algebra, as well as to less exotic structures. We conjecture that this is one of the reasons why tropical algebra procedures, such as the Viterbi algorithm of dynamic programming, morphological processing, or neural networks, are so successful in computational intelligence applications, and also why there seem to exist so many computational intelligence procedures to deal with “information” at large.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
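    As a tiny illustration of the tropical-algebra viewpoint mentioned above, the Viterbi recursion is just matrix-vector "multiplication" in the max-plus semiring applied to log-probabilities. The chain below is invented and unrelated to the paper's examples.

```python
import numpy as np

def maxplus_matvec(M, v):
    """Matrix-vector product where 'addition' is max and 'multiplication' is +,
    i.e., the max-plus (tropical) semiring."""
    return (M + v[None, :]).max(axis=1)

logA = np.log(np.array([[0.7, 0.3],       # hypothetical transition probabilities
                        [0.4, 0.6]]))
logB = np.log(np.array([[0.9, 0.1],       # hypothetical emission probabilities
                        [0.2, 0.8]]))
obs = [0, 1, 1, 0]

delta = np.log(np.array([0.5, 0.5])) + logB[:, obs[0]]     # initialisation
for o in obs[1:]:
    delta = maxplus_matvec(logA.T, delta) + logB[:, o]     # Viterbi step = max-plus product

print("log-probability of the best state path:", delta.max())
```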
  • 90
    Publication Date: 2019
    Description: The trade-off between large power output, high efficiency and small fluctuations in the operation of heat engines has recently received interest in the context of thermodynamic uncertainty relations (TURs). Here we provide a concrete illustration of this trade-off by theoretically investigating the operation of a quantum point contact (QPC) with an energy-dependent transmission function as a steady-state thermoelectric heat engine. As a starting point, we review and extend previous analysis of the power production and efficiency. Thereafter the power fluctuations and the bound jointly imposed on the power, efficiency, and fluctuations by the TURs are analyzed as additional performance quantifiers. We allow for arbitrary smoothness of the transmission probability of the QPC, which exhibits a close to step-like dependence in energy, and consider both the linear and the non-linear regime of operation. It is found that for a broad range of parameters, the power production reaches nearly its theoretical maximum value, with efficiencies more than half of the Carnot efficiency and at the same time with rather small fluctuations. Moreover, we show that by demanding a non-zero power production, in the linear regime a stronger TUR can be formulated in terms of the thermoelectric figure of merit. Interestingly, this bound holds also in a wide parameter regime beyond linear response for our QPC device.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
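    The thermodynamic uncertainty relation invoked above can be stated, in its standard steady-state form for a time-integrated current J with total entropy production Σ (quoted as background, not as the specific bound derived in the paper), as

```latex
\frac{\operatorname{Var}(J)}{\langle J \rangle^{2}} \;\ge\; \frac{2 k_{\mathrm{B}}}{\Sigma},
```

    so large output, high efficiency (small Σ), and small fluctuations cannot all be optimised independently.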
  • 91
    Publication Date: 2019
    Description: Recently, the concept of daemonic ergotropy has been introduced to quantify the maximum energy that can be obtained from a quantum system through an ancilla-assisted work extraction protocol based on information gain via projective measurements [G. Francica et al., npj Quant. Inf. 3, 12 (2018)]. We prove that quantum correlations are not advantageous over classical correlations if projective measurements are considered. We go beyond the limitations of the original definition to include generalised measurements and provide an example in which this allows for a higher daemonic ergotropy. Moreover, we propose a see-saw algorithm to find a measurement that attains the maximum work extraction. Finally, we provide a multipartite generalisation of daemonic ergotropy that pinpoints the influence of multipartite quantum correlations, and study it for multipartite entangled and classical states.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
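    For context, the ordinary (non-daemonic) ergotropy of a state ρ with Hamiltonian H is the standard quantity

```latex
\mathcal{W}(\rho, H) = \operatorname{Tr}(\rho H)
  - \min_{U} \operatorname{Tr}\!\big(U \rho U^{\dagger} H\big),
```

    with the minimum taken over unitaries U; the daemonic version discussed above additionally conditions the extraction unitary on the outcomes of measurements performed on the ancilla.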
  • 92
    Publication Date: 2019
    Description: The analysis of high-dimensional data is a challenge in machine learning and data mining. Feature selection plays an important role in dealing with high-dimensional data, both for improving predictive accuracy and for better interpretation of the data. Frequently used evaluation functions for feature selection include resampling methods such as cross-validation, which show an advantage in predictive accuracy. However, these conventional methods are not only computationally expensive, but also tend to be over-optimistic. We propose a novel cross-entropy based on the beta distribution for feature selection. In beta distribution-based cross-entropy (BetaDCE) for feature selection, the probability density is estimated by the beta distribution and the cross-entropy is computed as an expected value under the beta distribution, so that the generalization ability can be estimated more precisely than with conventional methods, where the probability density is learned from the data. Analysis of the generalization ability of BetaDCE revealed that it reflects a trade-off between bias and variance. The robustness of BetaDCE was demonstrated by experiments on three types of data. On the exclusive-or-like (XOR-like) dataset, the false discovery rate of BetaDCE was significantly smaller than that of other methods. For the leukemia dataset, the area under the curve (AUC) of BetaDCE on the test set was 0.93 with only four selected features, which indicates that BetaDCE not only detected the irrelevant and redundant features precisely, but also predicted the class labels more accurately with a smaller number of features than the original method, whose AUC was 0.83 with 50 features. On the metabonomic dataset, the overall AUC of prediction with features selected by BetaDCE was significantly larger than that of the originally reported method. Therefore, BetaDCE can be used as a general and efficient framework for feature selection.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 93
    Publication Date: 2019
    Description: Online Social Networks are used widely, raising new issues in terms of privacy, trust, and self-disclosure. For a better understanding of these issues for Facebook users, a model was built that includes privacy value, privacy risk, trust, privacy control, privacy concerns, and self-disclosure. A total of 602 respondents participated in an online survey, and structural equation modeling was used to evaluate the model. The findings indicate significant relationships between the constructs in this study. The model from our study contributes new knowledge to privacy issues, trust and self-disclosure on Online Social Networks for other researchers or developers of online social networks.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
  • 94
    Publication Date: 2019
    Description: A measure D[t1, t2] for the amount of dynamical evolution exhibited by a quantum system during a time interval [t1, t2] is defined in terms of how distinguishable from each other are, on average, the states of the system at different times. We investigate some properties of the measure D, showing that, for increasing values of the interval’s duration, the measure quickly reaches an asymptotic value given by the linear entropy of the energy distribution associated with the system’s (pure) quantum state. This leads to the formulation of an entropic variational problem characterizing the quantum states that exhibit the largest amount of dynamical evolution under energy constraints given by the expectation value of the energy.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
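    The asymptotic value mentioned above is the linear entropy of the energy distribution {p_k} of the state, i.e., the standard quantity

```latex
S_{L} = 1 - \sum_{k} p_{k}^{2},
```

    which is zero when a single energy level is occupied and increases as the population spreads over more energy eigenstates.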
  • 95
    Publication Date: 2019
    Description: Variational inference with a factorized Gaussian posterior estimate is a widely-used approach for learning parameters and hidden variables. Empirically, a regularizing effect can be observed that is poorly understood. In this work, we show how mean field inference improves generalization by limiting mutual information between learned parameters and the data through noise. We quantify a maximum capacity when the posterior variance is either fixed or learned and connect it to generalization error, even when the KL-divergence in the objective is scaled by a constant. Our experiments suggest that bounding information between parameters and data effectively regularizes neural networks on both supervised and unsupervised tasks.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
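    The information-limiting role of a factorized Gaussian posterior can be made concrete with the closed-form KL divergence against a standard normal prior (a standard identity; the paper's prior and exact capacity bound are not reproduced here):

```latex
D_{\mathrm{KL}}\!\left(\mathcal{N}\big(\mu, \operatorname{diag}(\sigma^{2})\big)
  \,\Big\|\, \mathcal{N}(0, I)\right)
  = \frac{1}{2} \sum_{i} \left(\sigma_i^{2} + \mu_i^{2} - 1 - \log \sigma_i^{2}\right).
```

    Terms of this form are the natural handle for bounding the information carried by the learned parameters when the posterior variance is fixed or learned.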
  • 96
    Publication Date: 2019
    Description: Deep Brain Stimulation (DBS) of the Subthalamic Nuclei (STN) is the most used surgical treatment to improve motor skills in patients with Parkinson’s Disease (PD) who do not adequately respond to pharmacological treatment or who have related side effects. During surgery for the implantation of a DBS system, signals are obtained through microelectrode recordings (MER) at different depths of the brain. These signals are analyzed by neurophysiologists to detect the entry and exit of the STN region, as well as the optimal depth for electrode implantation. In the present work, a supervised classification model based on the K-nearest neighbour (KNN) algorithm is developed, automatically trained on 18 temporal features of MER records of 14 patients with PD, in order to provide a clinical support tool during DBS surgery. We investigate the effect of different standardizations of the generated database, the optimal definition of the KNN configuration parameters, and the selection of features that maximize KNN performance. The results indicated that the KNN trained with data standardized per cerebral hemisphere and per patient presented the best performance, achieving an accuracy of 94.35% (p < 0.001). By using feature selection algorithms, an accuracy of 93.5% could be achieved with a subset of only six features, improving computation time for real-time processing.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
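    A compact scikit-learn sketch of the kind of pipeline described above: per-feature standardisation, a KNN classifier, and a grid search over the number of neighbours. The data below are random placeholders, and the per-hemisphere/per-patient standardisation scheme is not reproduced here.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 18))          # stand-in for 18 temporal MER features
y = rng.integers(0, 2, 500)                 # stand-in labels: inside/outside the STN

pipe = make_pipeline(StandardScaler(), KNeighborsClassifier())
search = GridSearchCV(pipe,
                      {"kneighborsclassifier__n_neighbors": [3, 5, 7, 9, 11]},
                      cv=5, scoring="accuracy")
search.fit(X, y)
print("best k:", search.best_params_, "cv accuracy:", round(search.best_score_, 3))
```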
  • 97
    Publication Date: 2019
    Description: Big data and streaming data are encountered in a variety of contemporary applications in business and industry. In such cases, it is common to use random projections to reduce the dimension of the data, yielding compressed data. These data, however, possess various anomalies such as heterogeneity, outliers, and round-off errors, which are hard to detect due to volume and processing challenges. This paper describes a new robust and efficient methodology, based on the Hellinger distance, for analyzing the compressed data. Using large-sample methods and numerical experiments, it is demonstrated that routine use of the robust estimation procedure is feasible. The role of double limits in understanding the efficiency and robustness is brought out, which is of independent interest.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
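    A small sketch of the two ingredients named above: compressing data with a Gaussian random projection and comparing distributions with the Hellinger distance (a binned, discrete form is used purely for illustration; the paper's estimator is not reproduced).

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete probability vectors p and q."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

rng = np.random.default_rng(0)
X = rng.standard_normal((10_000, 50))            # original high-dimensional data
R = rng.standard_normal((50, 5)) / np.sqrt(5)    # Gaussian random projection matrix
Z = X @ R                                        # compressed data

# Compare the binned distribution of one compressed coordinate against a
# contaminated version of it, as a crude robustness check.
z = Z[:, 0]
z_contaminated = np.concatenate([z, 25 + 5 * rng.standard_normal(500)])  # add outliers
bins = np.linspace(-30, 40, 60)
p, _ = np.histogram(z, bins=bins)
q, _ = np.histogram(z_contaminated, bins=bins)
print("Hellinger distance:", round(hellinger(p / p.sum(), q / q.sum()), 3))
```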
  • 98
    Publication Date: 2019
    Description: Compressed sensing based in-network compression methods which minimize data redundancy are critical to cognitive video sensor networks. However, most existing methods require a large number of sensors for each measurement, resulting in significant performance degradation in energy efficiency and quality-of-service satisfaction. In this paper, a cluster-based distributed compressed sensing scheme working together with a quality-of-service aware routing framework is proposed to deliver visual information in cognitive video sensor networks efficiently. First, the correlation among adjacent video sensors determines the member nodes that participate in a cluster. On this basis, a sequential compressed sensing approach is applied to determine whether enough measurements are obtained to limit the reconstruction error between decoded signals and original signals under a specified reconstruction threshold. The goal is to maximize the removal of unnecessary traffic without sacrificing video quality. Lastly, the compressed data is transmitted via a distributed spectrum-aware quality-of-service routing scheme, with an objective of minimizing energy consumption subject to delay and reliability constraints. Simulation results demonstrate that the proposed approach can achieve energy-efficient data delivery and reconstruction accuracy of visual information compared with existing quality-of-service routing schemes.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
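    The compressed sensing core of such schemes can be sketched as recovering a sparse signal from a small number of random measurements, for example with orthogonal matching pursuit; a sequential scheme would keep adding measurements until the reconstruction error falls below a threshold. Sizes below are arbitrary, and this is not the paper's reconstruction pipeline.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 256, 80, 8                        # signal length, measurements, sparsity

x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)   # sparse stand-in signal

Phi = rng.standard_normal((m, n)) / np.sqrt(m)                # measurement matrix
y = Phi @ x                                                   # compressed measurements

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(Phi, y)
x_hat = omp.coef_
print("relative reconstruction error:",
      round(np.linalg.norm(x_hat - x) / np.linalg.norm(x), 4))
```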
  • 99
    Publication Date: 2019
    Description: This paper proposes a lossless quantum image encryption scheme based on substitution-box (S-box) scrambling, a mutation operation, and a generalized Arnold transform with keys. First, the key generator builds on a SHA-256 hash of the plain image together with a random sequence; its output is used to derive the initial conditions and parameters of the proposed image encryption scheme. Second, the permutation and gray-level encryption architecture is built with a discrete Arnold map and a quantum chaotic map. Before the Arnold-transform permutation, the pixel values are modified by a quantum chaos sequence. In order to obtain a high degree of scrambling and randomness, an S-box and a mutation operation are exploited in the gray-level encryption stage. The combination of linear and nonlinear transformations ensures the complexity of the proposed scheme and avoids harmful periodicity. The simulations show that the cipher image has a fairly uniform histogram, correlation coefficients close to 0, and information entropy close to 8. The proposed cryptosystem provides a key space of 2^256 and achieves fast computation (11.920875 Mbit/s). Theoretical analyses and experimental results prove that the proposed scheme has strong resistance to various existing attacks and a high level of security.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI
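    The permutation stage built on the generalized Arnold transform can be sketched as the textbook cat map acting on the pixel coordinates of an N×N image; in a keyed scheme the parameters and number of rounds would come from the key stream, which is not modelled below.

```python
import numpy as np

def arnold_scramble(img, a=1, b=1, rounds=1):
    """Permute the pixels of a square image with the generalized Arnold cat map:
    (x, y) -> (x + a*y, b*x + (a*b + 1)*y) mod N, which is a bijection since the
    map matrix has determinant 1."""
    N = img.shape[0]
    out = img
    for _ in range(rounds):
        scrambled = np.empty_like(out)
        xs, ys = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
        new_x = (xs + a * ys) % N
        new_y = (b * xs + (a * b + 1) * ys) % N
        scrambled[new_x, new_y] = out[xs, ys]
        out = scrambled
    return out

rng = np.random.default_rng(0)
image = rng.integers(0, 256, (8, 8), dtype=np.uint8)
print(arnold_scramble(image, a=3, b=5, rounds=2))
```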
  • 100
    Publication Date: 2019
    Description: The blind signature is widely used in cryptographic applications because it prevents the signer from learning the original message. Owing to its unconditional security, the quantum blind signature is more advantageous than the classical one. In this paper, we propose a new provably secure quantum blind signature scheme based on nonorthogonal single-photon BB84 states and provide a new method for encoding classical messages into quantum signature states. The message owner injects a randomizing blind factor into the original message and then strips the blind factor from the quantum blind signature signed by the blind signer. The verifier can validate the quantum signature and announce it publicly. Finally, the analysis shows that the proposed scheme satisfies all of the security requirements of a blind signature: blindness, unforgeability, non-repudiation, unlinkability, and traceability. Because no entangled quantum states are used, the overall feasibility and practicality of the scheme are clearly better than those of previous schemes.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Published by MDPI