ALBERT

All Library Books, journals and Electronic Records Telegrafenberg


Filter
Collection
  • Articles  (10,412)
Publisher
  • Molecular Diversity Preservation International  (7,240)
  • MDPI Publishing  (3,172)
  • Periodicals Archive Online (PAO)
  • MDPI  (1,442)
Years
Journal
  • Entropy  (3,172)
Topic
  • Chemistry and Pharmacology  (10,412)
  • Technology
  • Physics  (10,412)
  • 1
    Publication Date: 2021-08-20
    Description: A major advantage of using passive sonar to track multiple underwater targets is that the sonar platform can be kept covert, which reduces the risk of being attacked. However, the nonlinearity of the passive Doppler and bearing measurements, the range unobservability problem, and the complexity of data association between measurements and targets make underwater passive multiple-target tracking challenging. To deal with these problems, the cardinalized probability hypothesis density (CPHD) recursion, which is based on Bayesian information theory, is developed to handle the data association uncertainty and to acquire the number and states (e.g., position and velocity) of the existing targets. The key idea of the CPHD recursion is to simultaneously estimate the targets’ intensity and the probability distribution of the number of targets. The CPHD recursion is the first-moment approximation of the Bayesian multiple-target filter, which avoids the data association procedure between the targets and measurements, including clutter. The Bayesian-filter-based extended Kalman filter (EKF) is applied to deal with the nonlinear bearing and Doppler measurements. The experimental results show that the EKF-based CPHD recursion works well in the underwater passive multiple-target tracking system in cluttered and noisy environments. (A minimal code sketch follows this entry.)
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
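The entry above applies an EKF inside the CPHD recursion to handle nonlinear bearing measurements. The following is a minimal sketch of a single EKF update step for one passive bearing observation only, not the full CPHD filter; the constant-velocity state layout [px, py, vx, vy], the measurement noise value, and the sensor position are illustrative assumptions.

```python
import numpy as np

def ekf_bearing_update(x, P, z, sensor_pos, R=np.radians(2.0) ** 2):
    """One EKF measurement update with a single passive bearing observation.

    x : state estimate [px, py, vx, vy]; P : 4x4 covariance;
    z : measured bearing (rad) from the sensor to the target;
    sensor_pos : (sx, sy) of the passive sonar platform (assumed known).
    """
    dx, dy = x[0] - sensor_pos[0], x[1] - sensor_pos[1]
    r2 = dx ** 2 + dy ** 2
    h = np.arctan2(dy, dx)                            # predicted bearing
    H = np.array([[-dy / r2, dx / r2, 0.0, 0.0]])     # Jacobian of the bearing model
    innov = np.arctan2(np.sin(z - h), np.cos(z - h))  # angle-wrapped innovation
    S = H @ P @ H.T + R                               # innovation covariance (1x1)
    K = P @ H.T / S                                   # Kalman gain (4x1)
    x_new = x + (K * innov).ravel()
    P_new = (np.eye(4) - K @ H) @ P
    return x_new, P_new
```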
  • 2
    Publication Date: 2021-08-19
    Description: A nested structure is a structural feature, conducive to system stability, formed by the coevolution of biological species in mutualistic ecosystems. The coopetition relationship and value flow between industrial sectors in the global value chain are similar to the mutualistic ecosystem in nature. That is, the global economic system is always changing to form one dynamic equilibrium after another. In this paper, a nestedness-based analytical framework is used to define the generalist and specialist sectors for the purpose of analyzing the changes in the global supply pattern. We study why the global economic system can reach a stable equilibrium, what role different sectors play in the steady state, and how to enhance the stability of the global economic system. In detail, the domestic trade network, export trade network, and import trade network of each country are extracted. Then, an econometric model is designed to analyze how the microstructure of the production system affects a country’s macroeconomic performance.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 3
    Publication Date: 2021-08-19
    Description: “What is heat?” was the title of a 1954 article by Freeman J. Dyson, published in Scientific American. Apparently, it was appropriate to ask this question at that time. The answer is given in the very first sentence of the article: heat is disordered energy. We will ask the same question again, but with a different expectation for its answer. Let us imagine that all the thermodynamic knowledge is already available: both the theory of phenomenological thermodynamics and that of statistical thermodynamics, including quantum statistics, but that the term “heat” has not yet been attributed to any of the variables of the theory. With the question “What is heat?” we now mean: which of the physical quantities deserves this name? There are several candidates: the quantities Q, H, E_therm and S. We can then formulate a desideratum, or a profile: What properties should such a measure of the quantity or amount of heat ideally have? Then, we evaluate all the candidates for their suitability. It turns out that the winner is the quantity S, which we know by the name of entropy. In the second part of the paper, we examine why entropy has not succeeded in establishing itself as a measure for the amount of heat, and we show that there is a real chance today to make up for what was missed.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 4
    Publication Date: 2021-08-20
    Description: This study applies relative entropy to a naturalistic large-scale corpus to calculate the differences among L2 (second language) learners at different levels. We chose lemmas, tokens, POS-trigrams, and conjunctions to represent lexicon and grammar and to detect the differences among L2 groups using relative entropy. The results show that information distribution discrimination regarding lexical and grammatical differences continues to increase from L2 learners at a lower level to those at a higher level. This result is consistent with the assumption that in the course of second language acquisition, L2 learners develop towards a more complex and diverse use of language. Meanwhile, this study uses time series statistics to process the data on L2 differences yielded by traditional frequency-based methods applied to the same L2 corpus, for comparison with the relative entropy results. However, the results from the traditional methods rarely show regularity. Compared with the algorithms in traditional approaches, relative entropy performs much better in detecting L2 proficiency development. In this sense, we have developed an effective and practical algorithm for stably detecting and predicting the developments in L2 learners’ language proficiency. (A minimal code sketch follows this entry.)
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
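Relative entropy, as used in the entry above, is the Kullback–Leibler divergence between feature-frequency distributions of two learner groups. The sketch below computes it for hypothetical POS-trigram counts; the smoothing constant and the example counts are purely illustrative.

```python
import numpy as np
from collections import Counter

def relative_entropy(counts_p, counts_q, alpha=1e-6):
    """D_KL(P || Q) in bits between two feature-frequency distributions.

    counts_p, counts_q : Counters of, e.g., POS trigrams for two learner groups.
    alpha smooths unseen events; its value is an illustrative choice.
    """
    vocab = set(counts_p) | set(counts_q)
    p = np.array([counts_p.get(w, 0) + alpha for w in vocab], dtype=float)
    q = np.array([counts_q.get(w, 0) + alpha for w in vocab], dtype=float)
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log2(p / q)))

# Hypothetical trigram counts for low- vs. high-level learner corpora.
low = Counter({("DT", "NN", "VB"): 120, ("PRP", "VB", "NN"): 80})
high = Counter({("DT", "JJ", "NN"): 150, ("DT", "NN", "VB"): 60, ("IN", "DT", "NN"): 90})
print(relative_entropy(high, low))
```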
  • 5
    Publication Date: 2021-08-19
    Description: The transition to the use of supercritical carbon dioxide as a working fluid for power generation units will significantly reduce the equipment’s overall dimensions while increasing fuel efficiency and environmental safety. Structural and parametric optimization of S–CO2 nuclear power plants was carried out to ensure the maximum efficiency of electricity production. Based on the results of mathematical modeling, it was found that the transition to a carbon dioxide working fluid for the nuclear power plant with the BREST–OD–300 reactor leads to an increase of efficiency from 39.8 to 43.1%. Nuclear power plant transition from the Rankine water cycle to the carbon dioxide Brayton cycle with recompression is reasonable at a working fluid temperature above 455 °C due to the carbon dioxide cycle’s more effective regeneration system.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 6
    Publication Date: 2021-08-18
    Description: Time series analysis has been an important branch of information processing, and the conversion of time series into complex networks provides a new means to understand and analyze time series. In this work, using a Variational Auto-Encoder (VAE), we explored the construction of latent networks for univariate time series. We first trained the VAE to obtain the space of latent probability distributions of the time series and then decomposed the multivariate Gaussian distribution into multiple univariate Gaussian distributions. By measuring the distance between univariate Gaussian distributions on a statistical manifold, the latent network construction was finally achieved. The experimental results show that the latent network can effectively retain the original information of the time series and provide a new data structure for the downstream tasks. (A minimal code sketch follows this entry.)
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
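The entry above builds a latent network by measuring distances between the univariate Gaussians extracted from the VAE's latent space. The sketch below uses the closed-form 2-Wasserstein distance between univariate Gaussians as one convenient metric and thresholds it into an adjacency matrix; the paper's exact manifold metric and the threshold value are not reproduced here, so treat both as illustrative choices.

```python
import numpy as np

def gaussian_w2(mu1, sigma1, mu2, sigma2):
    """2-Wasserstein distance between two univariate Gaussians (closed form)."""
    return np.sqrt((mu1 - mu2) ** 2 + (sigma1 - sigma2) ** 2)

def latent_network(mus, sigmas, threshold=0.5):
    """Adjacency matrix linking latent dimensions whose distributions are close.

    mus, sigmas : per-dimension parameters of the latent Gaussians.
    threshold is an illustrative cut-off for drawing an edge.
    """
    n = len(mus)
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if gaussian_w2(mus[i], sigmas[i], mus[j], sigmas[j]) < threshold:
                A[i, j] = A[j, i] = 1
    return A
```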
  • 7
    Publication Date: 2021-08-20
    Description: The inversion of seismic data from observations contaminated by spurious measurements (outliers) remains a significant challenge for the industrial and scientific communities. This difficulty is due to the slow processing required to mitigate the influence of the outliers. In this work, we introduce a robust formulation to mitigate the influence of spurious measurements in the seismic inversion process. In this regard, we put forth an outlier-resistant seismic inversion methodology for model estimation based on the deformed Jackson Gaussian distribution. To demonstrate the effectiveness of our proposal, we investigated a classic geophysical data-inverse problem in three different scenarios: (i) in the first one, we analyzed the sensitivity of the seismic inversion to incorrect seismic sources; (ii) in the second one, we considered a dataset polluted by Gaussian errors with different noise intensities; and (iii) in the last one, we considered a dataset contaminated by many outliers. The results reveal that the deformed Jackson Gaussian outperforms the classical approach, which is based on the standard Gaussian distribution.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 8
    Publication Date: 2021-08-16
    Description: This study presents a mathematical model of recombinant protein expression, including its development, selection, and fitting results based on seventy fed-batch cultivation experiments from two independent biopharmaceutical sites. To resolve the overfitting feature of the Akaike information criterion, we proposed an entropic extension, which behaves asymptotically like the classical criteria. Estimation of recombinant protein concentration was performed with pseudo-global optimization processes while processing offline recombinant protein concentration samples. We show that functional models including the average age of the cells and the specific growth at induction or the start of product biosynthesis are the best descriptors for the datasets. We also proposed introducing a tuning coefficient that would force the modified Akaike information criterion to avoid overfitting when the designer requires fewer model parameters. We expect that a lower number of coefficients would allow the efficient maximization of target microbial products in the upstream section of contract development and manufacturing organization services in the future. Experimental model fitting was accomplished simultaneously for 46 experiments at the first site and 24 fed-batch experiments at the second site. The two sites contributed 196 and 131 protein samples, respectively, giving a total of 327 target product concentration samples derived from the bioreactor medium.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 9
    Publication Date: 2021-08-19
    Description: Pattern analysis is a widely researched topic in team sports performance analysis, using information theory as a conceptual framework. Bayesian methods are also used in this research field, but the association between the two is still being developed. The aim of this paper is to present new mathematical concepts that are based on information and probability theory and can be applied to network analysis in team sports. These results are based on the transition matrices of the Markov chain, associated with the adjacency matrices of a network with n nodes, and allow for a more robust analysis of the variability of interactions in team sports. The proposed models refer to individual and collective rates and indexes of total variability between players and teams as well as the overall passing capacity of a network, all of which are demonstrated in the UEFA 2020/2021 Champions League Final. (A minimal code sketch follows this entry.)
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
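The entry above associates Markov-chain transition matrices with the adjacency matrices of a passing network. A minimal sketch, assuming a matrix of pass counts between players: row-normalise it into a transition matrix and compute each player's Shannon entropy as one simple index of the variability of their interactions (the paper's own rates and indexes are more elaborate).

```python
import numpy as np

def transition_matrix(A):
    """Row-normalise a matrix of pass counts into a Markov transition matrix."""
    A = np.asarray(A, dtype=float)
    row_sums = A.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0          # keep isolated players' rows at zero
    return A / row_sums

def player_entropy(P):
    """Shannon entropy (bits) of each player's outgoing pass distribution."""
    H = np.zeros(P.shape[0])
    for i, row in enumerate(P):
        nz = row[row > 0]
        if nz.size:
            H[i] = -np.sum(nz * np.log2(nz))
    return H

# Hypothetical pass counts among three players.
A = [[0, 12, 5],
     [8, 0, 10],
     [3, 7, 0]]
print(player_entropy(transition_matrix(A)))
```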
  • 10
    Publication Date: 2021-08-19
    Description: Fault diagnosis of mechanical equipment is mainly based on the contact measurement and analysis of vibration signals. In some special working conditions, non-contact fault diagnosis methods represented by the measurement of acoustic signals can make up for the limitations of contact testing. However, their engineering application value is greatly restricted due to the low signal-to-noise ratio (SNR) of the acoustic signal. To address this deficiency, a novel fault diagnosis method based on generalized matrix norm sparse filtering (GMNSF) is proposed in this paper. Specifically, the generalized matrix norm is introduced into sparse filtering to seek the optimal sparse feature distribution and overcome the defect of the low SNR of acoustic signals. Firstly, the collected acoustic signals are randomly overlapped to form the sample fragment data set. Then, three constraints are imposed on the multi-period data set by the GMNSF model to extract the sparse features in the sample. Finally, softmax is used as a classifier to categorize different fault types. The diagnostic performance of the proposed method is verified on bearing and planetary gear datasets. Results show that the GMNSF model has better feature extraction and anti-noise performance than other traditional methods.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 11
    Publication Date: 2021-08-16
    Description: Our intention is to provide easy methods for estimating entropy and chemical potentials for gas phase reactions. Clausius’ virial theorem set a basis for relating kinetic energy in a body of independent material particles to its potential energy, pointing to their complementary role with respect to the second law of maximum entropy. Based on this partitioning of thermal energy as sensible heat and also as a latent heat or field potential energy, in action mechanics we express the entropy of ideal gases as a capacity factor for enthalpy plus the configurational work to sustain the relative translational, rotational, and vibrational action. This yields algorithms for estimating chemical reaction rates and positions of equilibrium. All properties of state including entropy, work potential as Helmholtz and Gibbs energies, and activated transition state reaction rates can be estimated, using easily accessible molecular properties, such as atomic weights, bond lengths, moments of inertia, and vibrational frequencies. We conclude that the large molecular size of many enzymes may catalyze reaction rates because of their large radial inertia as colloidal particles, maximising action states by impulsive collisions. Understanding how Clausius’ virial theorem justifies the partitioning between thermal and statistical properties of entropy yields a more complete view of the second law’s evolutionary nature and the principle of maximum entropy. The ease of performing these operations is illustrated with three important chemical gas phase reactions: the reversible dissociation of hydrogen molecules, lysis of water to hydrogen and oxygen, and the reversible formation of ammonia from nitrogen and hydrogen. Employing the ergal, also introduced by Clausius, to define the reversible internal work overcoming molecular interactions plus the often-neglected configurational work of change in Gibbs energy may provide a practical guide for managing industrial processes and climate change risk at the global scale. The concepts developed should also have value as novel methods for the instruction of senior students.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 12
    Publication Date: 2021-08-08
    Description: We provide a stochastic extension of the Baez–Fritz–Leinster characterization of the Shannon information loss associated with a measure-preserving function. This recovers the conditional entropy and a closely related information-theoretic measure that we call conditional information loss. Although not functorial, these information measures are semi-functorial, a concept we introduce that is definable in any Markov category. We also introduce the notion of an entropic Bayes’ rule for information measures, and we provide a characterization of conditional entropy in terms of this rule. (A minimal code sketch follows this entry.)
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
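Conditional entropy, the quantity recovered in the entry above, can be computed directly from a joint distribution. A minimal numerical sketch, with an invented 2x2 joint probability table:

```python
import numpy as np

def conditional_entropy(joint):
    """H(X|Y) in bits from a joint probability table p(x, y).

    joint : 2-D array with rows indexed by x and columns by y; entries sum to 1.
    """
    joint = np.asarray(joint, dtype=float)
    p_y = joint.sum(axis=0)                      # marginal p(y)
    H = 0.0
    for i in range(joint.shape[0]):
        for j in range(joint.shape[1]):
            p_xy = joint[i, j]
            if p_xy > 0:
                H -= p_xy * np.log2(p_xy / p_y[j])
    return H

# Hypothetical joint distribution of a source X and a noisy observation Y.
joint = np.array([[0.30, 0.10],
                  [0.05, 0.55]])
print(conditional_entropy(joint))   # uncertainty about X remaining after seeing Y
```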
  • 13
    Publication Date: 2021-02-25
    Description: Anomaly detection research has traditionally been conducted using mathematical and statistical methods, and the topic has been widely applied in many fields. Recently, reinforcement learning has achieved exceptional successes in areas such as AlphaGo and video game playing. However, little research has applied reinforcement learning to the field of anomaly detection. This paper therefore proposes an adaptable asynchronous advantage actor-critic model of reinforcement learning for this field. Its performance was evaluated and compared against classical machine learning models and variants of the generative adversarial model. The basic principles of the related models are introduced first; then the problem definitions, modelling process, and testing are detailed. The proposed model differentiates sequence and image anomalies by applying appropriate neural networks, an attention mechanism and a convolutional network, to the two kinds of anomalies, respectively. Finally, performance was evaluated and compared with classical models using public benchmark datasets (NSL-KDD, AWID, CICIDS-2017, and DoHBrw-2020). Experiments confirmed the effectiveness of the proposed model, with the results indicating higher rewards and lower loss rates on the datasets during training and testing. The metrics of precision, recall rate, and F1 score were higher than or at least comparable to the state-of-the-art models. We conclude that the proposed model can outperform, or at least achieve results comparable to, existing anomaly detection models.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 14
    Publication Date: 2021-02-25
    Description: With the promotion of intelligent substations, more and more robots have been used in industrial sites. However, most meter reading methods are hampered by the complex background environment, which makes it difficult to extract the meter area and pointer centerline and to meet the actual needs of the substation. To solve the current problems of pointer meter reading for industrial use, this paper studies an automatic reading method for pointer instruments that combines Faster Region-based Convolutional Network (Faster-RCNN) object detection with traditional computer vision. Firstly, the Faster-RCNN is used to detect the target instrument panel region. At the same time, the Poisson fusion method is proposed to expand the data set. The K-fold verification algorithm is used to optimize the quality of the data set, which addresses its limited size and low quality and improves the accuracy of target detection. Then, the image is preprocessed using several image processing methods. Finally, the position of the centerline of the pointer is detected by the Hough transform, and the reading can be obtained. The evaluation of the algorithm performance shows that the method proposed in this paper is suitable for automatic reading of pointer meters in the substation environment and provides a feasible approach for the target detection and reading of pointer meters. (A minimal code sketch follows this entry.)
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
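After the Faster-RCNN stage in the entry above has located the dial, the pointer centerline is found with a Hough transform. Below is a rough OpenCV sketch of that classical step; the Canny and Hough thresholds are illustrative and would need tuning for a real meter.

```python
import cv2
import numpy as np

def pointer_angle(dial_bgr):
    """Estimate the pointer angle (degrees) in a cropped dial image.

    Sketch of the classical step after the detector has cropped the dial
    region: edge detection followed by a probabilistic Hough transform.
    """
    gray = cv2.cvtColor(dial_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=60, minLineLength=40, maxLineGap=5)
    if lines is None:
        return None
    # Take the longest detected segment as the pointer centerline.
    x1, y1, x2, y2 = max(lines[:, 0, :],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    return float(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
```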
  • 15
    Publication Date: 2021-02-25
    Description: Living systems are open systems, in which the laws of nonequilibrium thermodynamics play an important role. Therefore, studying living systems from a nonequilibrium thermodynamic perspective is interesting and useful. In this review, we briefly introduce the history and current development of nonequilibrium thermodynamics, especially in biochemical systems. We first introduce, historically, how people realized the importance of studying biological systems from a thermodynamic point of view. We then introduce the development of stochastic thermodynamics, especially three landmarks: the Jarzynski equality, Crooks’ fluctuation theorem, and the thermodynamic uncertainty relation. We also summarize the current theoretical framework for stochastic thermodynamics in biochemical reaction networks, especially the thermodynamic concepts and instruments at nonequilibrium steady state. Finally, we show two applications and research paradigms for thermodynamic study in biological systems.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 16
    Publication Date: 2021-02-25
    Description: The pooling layer is at the heart of every convolutional neural network (CNN), contributing to invariance under data variation. This paper proposes a pooling method based on Zeckendorf’s number series. The maximum pooling layers are replaced with a Z pooling layer, which captures texels from input images, convolution layers, etc. It is shown that Z pooling properties are better adapted to segmentation tasks than other pooling functions. The method was evaluated on a traditional image segmentation task and on a dense labeling task carried out with a series of deep learning architectures in which the usual maximum pooling layers were altered to use the proposed pooling mechanism. Not only does it arbitrarily increase the receptive field in a parameterless fashion, but it can also better tolerate rotations, since the pooling layers are independent of the geometric arrangement or sizes of the image regions. Different combinations of pooling operations produce images capable of emphasizing low/high frequencies, extracting ultrametric contours, etc. (A minimal code sketch follows this entry.)
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
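Z pooling in the entry above is built on Zeckendorf's theorem: every positive integer has a unique representation as a sum of non-consecutive Fibonacci numbers. The pooling operator itself is not reproduced here; the sketch below only shows the underlying greedy decomposition.

```python
def zeckendorf(n):
    """Zeckendorf representation of n as non-consecutive Fibonacci numbers."""
    if n <= 0:
        return []
    fibs = [1, 2]
    while fibs[-1] <= n:
        fibs.append(fibs[-1] + fibs[-2])
    rep = []
    for f in reversed(fibs):      # greedy choice yields the Zeckendorf form
        if f <= n:
            rep.append(f)
            n -= f
    return rep

print(zeckendorf(100))   # [89, 8, 3]
```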
  • 17
    Publication Date: 2021-02-25
    Description: In this paper, we propose a novel transform-domain steganography technique: hiding a message in the components of a linear combination of high-order eigenface vectors. By high order, we mean eigenvectors responsible for dimensions with a low amount of overall image variance, which are usually related to high-frequency image parameters (details). The study found that when the method was trained on large enough data sets, image quality was nearly unaffected by the modification of some linear combination coefficients used as PCA-based features. The proposed method is limited to facial images, but in the era of the overwhelming influence of social media, the hundreds of thousands of selfies uploaded to social networks every day do not arouse any suspicion as a potential steganographic communication channel. To the best of our knowledge, there is no description of any popular steganography method that utilizes the eigenface image domain. For this reason, we have performed an extensive evaluation of our method using at least 200,000 facial images for training and robustness evaluation of the proposed approach. The obtained results are very promising. What is more, our numerical comparison with other state-of-the-art algorithms proved that eigenface-based steganography is among the most robust methods against compression attacks. The proposed research can be reproduced because we use a publicly accessible data set and our implementation can be downloaded.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 18
    Publication Date: 2021-02-25
    Description: Non-extensive statistical mechanics (NESM), introduced by Tsallis based on the principle of non-additive entropy, is a generalisation of the Boltzmann–Gibbs statistics. NESM has been shown to provide the necessary theoretical and analytical implementation for studying complex systems such as the fracture mechanisms and crack evolution processes that occur in mechanically loaded specimens of brittle materials. In the current work, acoustic emission (AE) data recorded when marble and cement mortar specimens were subjected to three distinct loading protocols until fracture are discussed in the context of NESM. The NESM analysis showed that the cumulative distribution functions of the AE interevent times (i.e., the time intervals between successive AE hits) follow a q-exponential function. For each examined specimen, the corresponding Tsallis entropic q-indices and the parameters β_q and τ_q were calculated. The entropic index q shows a systematic behaviour strongly related to the various stages of the implemented loading protocols for all the examined specimens. Results seem to support the idea of using the entropic index q as a potential pre-failure indicator for the impending catastrophic fracture of mechanically loaded specimens. (A minimal code sketch follows this entry.)
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
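The cumulative distributions of AE interevent times in the entry above follow a q-exponential. A minimal sketch of that function, with purely illustrative parameter values (the fitted q, β_q and τ_q of the paper are not reproduced):

```python
import numpy as np

def q_exponential(x, q, beta_q):
    """Tsallis q-exponential e_q(-beta_q * x); reduces to exp(-beta_q * x) as q -> 1."""
    x = np.asarray(x, dtype=float)
    if np.isclose(q, 1.0):
        return np.exp(-beta_q * x)
    base = 1.0 - (1.0 - q) * beta_q * x
    return np.where(base > 0,
                    np.power(np.clip(base, 1e-12, None), 1.0 / (1.0 - q)),
                    0.0)

# Example: a slowly decaying (q > 1) interevent-time tail vs. the q -> 1 limit.
dt = np.linspace(0.0, 5.0, 6)
print(q_exponential(dt, q=1.3, beta_q=2.0))
print(q_exponential(dt, q=1.0, beta_q=2.0))
```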
  • 19
    Publication Date: 2021-02-25
    Description: The deployment of machine learning models is expected to bring several benefits. Nevertheless, as a result of the complexity of the ecosystem in which models are generally trained and deployed, this technology also raises concerns regarding its (1) interpretability, (2) fairness, (3) safety, and (4) privacy. These issues can have substantial economic implications because they may hinder the development and mass adoption of machine learning. In light of this, the purpose of this paper was to determine, from a positive economics point of view, whether the free use of machine learning models maximizes aggregate social welfare or, alternatively, regulations are required. In cases in which restrictions should be enacted, policies are proposed. The adaptation of current tort and anti-discrimination laws is found to guarantee an optimal level of interpretability and fairness. Additionally, existing market solutions appear to incentivize machine learning operators to equip models with a degree of security and privacy that maximizes aggregate social welfare. These findings are expected to be valuable to inform the design of efficient public policies.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 20
    Publication Date: 2021-03-31
    Description: Edge detection is a fundamental image analysis task, as it provides insight on the content of an image. There are weaknesses in some of the edge detectors developed until now, such as disconnected edges, the impossibility to detect branching edges, or the need for a ground truth that is not always accessible. Therefore, a specialized detector that is optimized for the image particularities can help improve edge detection performance. In this paper, we apply transfer learning to optimize cellular automata (CA) rules for edge detection using particle swarm optimization (PSO). Cellular automata provide fast computation, while rule optimization provides adaptability to the properties of the target images. We use transfer learning from synthetic to medical images because expert-annotated medical data is typically difficult to obtain. We show that our method is tunable for medical images with different properties, and we show that, for more difficult edge detection tasks, batch optimization can be used to boost the quality of the edges. Our method is suitable for the identification of structures, such as cardiac cavities on medical images, and could be used as a component of an automatic radiology decision support tool.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 21
    Publication Date: 2021-03-30
    Description: To verify the relationship between AETA (Acoustic and Electromagnetics to Artificial Intelligence (AI)) electromagnetic anomalies and local earthquakes, we have performed statistical studies on the electromagnetic data observed at AETA stations. To ensure the accuracy of the statistical results, 20 AETA stations with little missing data and abundant local earthquake events were selected as research objects. A modified PCA method was used to obtain the sequence representing the signal anomaly. Statistical results of superposed epoch analysis indicate that 80% of AETA stations have a significant relationship between electromagnetic anomalies and local earthquakes. These anomalies are more likely to appear before the earthquakes rather than after them. Further, we used Molchan’s error diagram to evaluate the electromagnetic signal anomalies at stations with significant relationships. All area skill scores are greater than 0. The above results indicate that AETA electromagnetic anomalies contain precursory information and have the potential to improve local earthquake forecasting.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 22
    Publication Date: 2021-03-31
    Description: The solar photosphere and the outer layer of the Sun’s interior are characterized by convective motions, which display a chaotic and turbulent character. In this work, we evaluated the pseudo-Lyapunov exponents of the overshooting convective motions observed on the Sun’s surface by using a method employed in the literature to estimate those exponents, as well as another technique deduced from their definition. We analyzed observations taken with state-of-the-art instruments at ground- and space-based telescopes, and we particularly benefited from the spectro-polarimetric data acquired with the Interferometric Bidimensional Spectrometer, the Crisp Imaging SpectroPolarimeter, and the Helioseismic and Magnetic Imager. Following previous studies in the literature, we computed maps of four quantities which were representative of the physical properties of solar plasma in each observation, and estimated the pseudo-Lyapunov exponents from the residuals between the values of the quantities computed at any point in the map and the mean of values over the whole map. In contrast to previous results reported in the literature, we found that the computed exponents hold negative values, which are typical of a dissipative regime, for all the quantities derived from our observations. The values of the estimated exponents increase with the spatial resolution of the data and are almost unaffected by small concentrations of magnetic field. Finally, we showed that similar results were also achieved by estimating the exponents from residuals between the values at each point in maps derived from observations taken at different times. The latter estimation technique better accounts for the definition of these exponents than the method employed in previous studies.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 23
    Publication Date: 2021-03-29
    Description: We investigate the possibility of phantom crossing in the dark energy sector and the solution for the Hubble tension between early and late universe observations. We use robust combinations of different cosmological observations, namely the Cosmic Microwave Background (CMB), local measurement of Hubble constant (H0), Baryon Acoustic Oscillation (BAO) and SnIa for this purpose. For a combination of CMB+BAO data that is related to early universe physics, phantom crossing in the dark energy sector was confirmed at a 95% confidence level and we obtained the constraint H0 = 71.0 (+2.9, −3.8) km/s/Mpc at a 68% confidence level, which is in perfect agreement with the local measurement by Riess et al. We show that constraints from different combinations of data are consistent with each other and all of them are consistent with phantom crossing in the dark energy sector. For the combination of all data considered, we obtained the constraint H0 = 70.25 ± 0.78 km/s/Mpc at a 68% confidence level and the phantom crossing happening at the scale factor am = 0.851 (+0.048, −0.031) at a 68% confidence level.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 24
    Publication Date: 2021-03-31
    Description: Gait stability has been measured using many entropy-based methods. However, the relation between the entropy values and gait stability is worth further investigation. A previous study reported that average entropy (AE), a measure of disorder, could measure static standing postural stability better than multiscale entropy and entropy of entropy (EoE), two measures of complexity. This study tested the validity of AE for gait stability measurement from the viewpoint of disorder. For comparison, another five disorder measures, the EoE, and two traditional metrics were used to measure the degrees of disorder and complexity of 10 step interval (SPI) and 79 stride interval (SI) time series, respectively. As a result, every one of the 10 participants exhibited a relatively high AE value of the SPI when walking with eyes closed and a relatively low AE value when walking with eyes open. Most of the AE values of the SI of the 53 diseased subjects were greater than those of the 26 healthy subjects. The maximal overall accuracy of AE in differentiating the healthy from the diseased was 91.1%. Similar features also exist in those five disorder measures but not in the EoE values. Nevertheless, the EoE versus AE plot of the SI also exhibits an inverted-U relation, consistent with the hypothesis for physiologic signals. (A minimal code sketch follows this entry.)
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
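The AE measure in the entry above quantifies disorder in an interval series. As a generic stand-in, assuming the usual window-and-bin construction (the paper's exact AE definition may differ in detail), the sketch below averages windowed Shannon entropies of a gait-interval series:

```python
import numpy as np

def average_entropy(x, window=5, bins=10):
    """Average of windowed Shannon entropies of an interval series.

    The series is split into windows, each window is binned over the global
    range, and the Shannon entropies of the windows are averaged.  Window and
    bin counts are illustrative choices.
    """
    x = np.asarray(x, dtype=float)
    edges = np.linspace(x.min(), x.max(), bins + 1)
    ents = []
    for start in range(0, len(x) - window + 1, window):
        w = x[start:start + window]
        p, _ = np.histogram(w, bins=edges)
        p = p / p.sum()
        p = p[p > 0]
        ents.append(-np.sum(p * np.log2(p)))
    return float(np.mean(ents))
```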
  • 25
    Publication Date: 2021-03-29
    Description: This paper gives formal foundations and evidence from gene science in the post Barbara McClintock era that the Gödel Sentence, far from being an esoteric construction in mathematical logic, is ubiquitous in genomic intelligence that evolved with multi-cellular life. Conditions uniquely found in the Adaptive Immune System (AIS) and Mirror Neuron System (MNS), termed the genomic immuno-cognitive system, coincide with three building blocks in computation theory of Gödel, Turing and Post (G-T-P). (i) Biotic elements have unique digital identifiers with gene codes executing 3D self-assembly for morphology and regulation of the organism using the recursive operation of Self-Ref (Self-Reference) with the other being a self-referential projection of self. (ii) A parallel offline simulation meta/mirror environment in 1–1 relation to online machine executions of self-codes gives G-T-P Self-Rep (Self-Representation). (iii) This permits a digital biotic entity to self-report that it is under attack by a biotic malware or non-self antigen in the format of the Gödel sentence, resulting in the “smarts” for contextual novelty production. The proposed unitary G-T-P recursive machinery in AIS and in MNS for social cognition yields a new explanation that the Interferon Gamma factor, known for friend-foe identification in AIS, is also integral to social behaviors. New G-T-P bio-informatics of AIS and novel anti-body production is given with interesting testable implications for COVID-19 pathology.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 26
    Publication Date: 2021-03-24
    Description: To explore the mechanism behind non-synchronous vibration (NSV), the formation and development of the flow field are investigated in this paper. Based on the fluid–structure interaction method, the vibration of the rotor blades is found to be in the first bending mode with a non-integral order (4.6) of the rotation speed. From the constant inter-blade phase angle (IBPA), frequency-locking and phase-locking can be identified for the NSV. A periodic instability flow emerges in the tip region from the mixture of the separation vortex and the tip leakage flow. Due to the nonlinearities of the fluid and the structure, the blade vibration exhibits a limit cycle oscillation (LCO) response. The separation vortex, presenting a spiral structure, propagates in the annulus, indicating a modal-oscillation pattern. A flow-induced vibration is initiated by the spiral vortex in the tip region. The large pressure oscillation caused by the movement of the spiral vortex is regarded as a main factor in the presented NSV. As the oscillation of blade loading occurs when the blades rotate past the disturbances, the intensity of the reverse leakage flow in adjacent channels also plays a crucial role in the blade vibration.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 27
    Publication Date: 2021-03-24
    Description: Hellinger distance has been widely used to derive objective functions that are alternatives to maximum likelihood methods. While the asymptotic distributions of these estimators have been well investigated, the probabilities of rare events induced by them are largely unknown. In this article, we analyze these rare event probabilities using large deviation theory under a potential model misspecification, in both one and higher dimensions. We show that these probabilities decay exponentially, characterizing their decay via a “rate function” which is expressed as a convex conjugate of a limiting cumulant generating function. In the analysis of the lower bound, in particular, certain geometric considerations arise that facilitate an explicit representation, also in the case when the limiting generating function is nondifferentiable. Our analysis involves the modulus of continuity properties of the affinity, which may be of independent interest.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 28
    Publication Date: 2021-03-24
    Description: We consider series systems built of components which have independent identically distributed (iid) lifetimes with an increasing failure rate (IFR). We determine sharp upper bounds for the expectations of the system lifetimes expressed in terms of the mean, and various scale units based on absolute central moments of component lifetimes. We further establish analogous bounds under a more stringent assumption that the component lifetimes have an increasing density (ID) function. We also indicate the relationship between the IFR property of the components and the generalized cumulative residual entropy of the series system lifetime.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 29
    Publication Date: 2021-02-02
    Description: In this paper, the parameter estimation problem of a truncated normal distribution is discussed based on the generalized progressive hybrid censored data. The desired maximum likelihood estimates of unknown quantities are firstly derived through the Newton–Raphson algorithm and the expectation maximization algorithm. Based on the asymptotic normality of the maximum likelihood estimators, we develop the asymptotic confidence intervals. The percentile bootstrap method is also employed in the case of the small sample size. Further, the Bayes estimates are evaluated under various loss functions like squared error, general entropy, and linex loss functions. Tierney and Kadane approximation, as well as the importance sampling approach, is applied to obtain the Bayesian estimates under proper prior distributions. The associated Bayesian credible intervals are constructed in the meantime. Extensive numerical simulations are implemented to compare the performance of different estimation methods. Finally, an authentic example is analyzed to illustrate the inference approaches.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 30
    Publication Date: 2021-03-19
    Description: In the integer-valued generalized autoregressive conditional heteroscedastic (INGARCH) models, parameter estimation is conventionally based on the conditional maximum likelihood estimator (CMLE). However, because the CMLE is sensitive to outliers, we consider a robust estimation method for bivariate Poisson INGARCH models while using the minimum density power divergence estimator. We demonstrate the proposed estimator is consistent and asymptotically normal under certain regularity conditions. Monte Carlo simulations are conducted to evaluate the performance of the estimator in the presence of outliers. Finally, a real data analysis using monthly count series of crimes in New South Wales and an artificial data example are provided as an illustration.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 31
    Publication Date: 2021-03-23
    Description: Uncertainty in the rainfall network can lead to mistakes in dam operation. Sudden increases in dam water levels due to rainfall uncertainty pose a high disaster risk. In order to prevent these losses, it is necessary to configure an appropriate rainfall network that can effectively reflect the characteristics of the watershed. In this study, conditional entropy was used to calculate the uncertainty of the watershed using rainfall and radar data observed from 2018 to 2019 in the Goesan Dam and Hwacheon Dam watersheds. The results identified radar data suitable for the characteristics of the watershed and proposed a site for an additional rainfall gauge. It is also necessary to select the location of the additional rainfall gauge by excluding points where access and installation, for example across national borders, are difficult. The proposed siting emphasized accessibility and usability by leveraging road information and selecting a radar grid near a road. As a practical result, the uncertainty of precipitation in the Goesan and Hwacheon Dam watersheds could be decreased by 70.0% and 67.9%, respectively, when four and three additional gauge sites were installed without any restriction. When the gauges were installed near a road, with five and four additional gauge sites, the uncertainty in the Goesan Dam and Hwacheon Dam watersheds was reduced by up to 71.1%. Therefore, where the degree of uncertainty is high, additional precipitation measurement is necessary, and operating rainfall gauges at accessible sites makes it possible to configure an appropriate monitoring network.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 32
    Publication Date: 2021-03-23
    Description: From a physical/dynamical system perspective, the potential well represents the proportional mass of points that escape the neighbourhood of a given point. In the last 20 years, several works have shown the importance of this quantity to obtain precise approximations for several recurrence time distributions in mixing stochastic processes and dynamical systems. Besides providing a review of the different scaling factors used in the literature in recurrence times, the present work contributes two new results: (1) For ϕ-mixing and ψ-mixing processes, we give a new exponential approximation for hitting and return times using the potential well as the scaling parameter. The error terms are explicit and sharp. (2) We analyse the uniform positivity of the potential well. Our results apply to processes on countable alphabets and do not assume a complete grammar.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 33
    Publication Date: 2021-03-23
    Description: The Multi-Armed Bandit (MAB) problem has been extensively studied in order to address real-world challenges related to sequential decision making. In this setting, an agent selects the best action to be performed at time step t, based on the past rewards received from the environment. This formulation implicitly assumes that the expected payoff for each action is kept stationary by the environment through time. Nevertheless, in many real-world applications this assumption does not hold and the agent has to face a non-stationary environment, that is, one with a changing reward distribution. Thus, we present a new MAB algorithm, named f-Discounted-Sliding-Window Thompson Sampling (f-dsw TS), for non-stationary environments, that is, when the data stream is affected by concept drift. The f-dsw TS algorithm is based on Thompson Sampling (TS) and exploits a discount factor on the reward history and an arm-related sliding window to contrast concept drift in non-stationary environments. We investigate how to combine these two sources of information, namely the discount factor and the sliding window, by means of an aggregation function f(.). In particular, we propose a pessimistic (f=min), an optimistic (f=max), and an averaged (f=mean) version of the f-dsw TS algorithm. A rich set of numerical experiments is performed to evaluate the f-dsw TS algorithm against both stationary and non-stationary state-of-the-art TS baselines. We exploited synthetic environments (both randomly generated and controlled) to test the MAB algorithms under different types of drift, that is, sudden/abrupt, incremental, gradual and increasing/decreasing drift. Furthermore, we adapt four real-world active learning tasks to our framework: a prediction task on crimes in the city of Baltimore, a classification task on insect species, a recommendation task on local web-news, and a time-series analysis on microbial organisms in the tropical air ecosystem. The f-dsw TS approach emerges as the best performing MAB algorithm, and at least one of its versions performs better than the baselines in synthetic environments, proving the robustness of f-dsw TS under different concept drift types. Moreover, the pessimistic version (f=min) proves the most effective in all real-world tasks. (A minimal code sketch follows this entry.)
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
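Following the description in the entry above, a minimal Beta-Bernoulli sketch of the f-dsw idea: one posterior sample from a discounted reward history and one from an arm-level sliding window, combined with an aggregation function f. Priors, the discount factor, the window size and the decay rule are illustrative guesses, not the authors' exact formulation.

```python
import random
from collections import deque

class FDSWThompsonSampling:
    """Sketch of discounted + sliding-window Thompson Sampling (Bernoulli arms)."""

    def __init__(self, n_arms, gamma=0.95, window=100, f=min):
        self.gamma, self.f = gamma, f
        self.disc = [[0.0, 0.0] for _ in range(n_arms)]           # discounted successes/failures
        self.win = [deque(maxlen=window) for _ in range(n_arms)]  # recent rewards per arm

    def select(self):
        scores = []
        for d, w in zip(self.disc, self.win):
            s_disc = random.betavariate(1.0 + d[0], 1.0 + d[1])              # discounted-history sample
            s_win = random.betavariate(1.0 + sum(w), 1.0 + len(w) - sum(w))  # sliding-window sample
            scores.append(self.f(s_disc, s_win))                             # aggregate the two views
        return max(range(len(scores)), key=scores.__getitem__)

    def update(self, arm, reward):
        for d in self.disc:              # decay every arm's history to track drift
            d[0] *= self.gamma
            d[1] *= self.gamma
        self.disc[arm][0] += reward
        self.disc[arm][1] += 1 - reward
        self.win[arm].append(reward)
```

Choosing f=min corresponds to the pessimistic variant described above: an arm scores well only if both the discounted history and the recent window support it.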
  • 34
    Publication Date: 2021-03-24
    Description: The spleen is one of the most frequently injured organs in blunt abdominal trauma. Computed tomography (CT) is the imaging modality of choice to assess patients with blunt spleen trauma, which may include lacerations, subcapsular or parenchymal hematomas, active hemorrhage, and vascular injuries. While computer-assisted diagnosis systems exist for other conditions assessed using CT scans, the current method to detect spleen injuries involves the manual review of scans by radiologists, which is a time-consuming and repetitive process. In this study, we propose an automated spleen injury detection method using machine learning. CT scans from patients experiencing traumatic injuries were collected from Michigan Medicine and the Crash Injury Research Engineering Network (CIREN) dataset. Ninety-nine scans of healthy and lacerated spleens were split into disjoint training and test sets, with random forest (RF), naive Bayes, SVM, k-nearest neighbors (k-NN) ensemble, and subspace discriminant ensemble models trained via 5-fold cross validation. Of these models, random forest performed the best, achieving an Area Under the receiver operating characteristic Curve (AUC) of 0.91 and an F1 score of 0.80 on the test set. These results suggest that an automated, quantitative assessment of traumatic spleen injury has the potential to enable faster triage and improve patient outcomes. (A minimal code sketch follows this entry.)
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
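The best model in the entry above is a random forest evaluated with 5-fold cross-validation, AUC and F1. The sketch below reproduces only that evaluation pattern on random stand-in features; the real inputs would be descriptors extracted from the spleen region of each CT scan, and the authors' split and validation protocol differs in detail.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score, f1_score

# Hypothetical stand-in features: 99 scans, 20 descriptors, binary injury label.
rng = np.random.default_rng(0)
X = rng.normal(size=(99, 20))
y = rng.integers(0, 2, size=99)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
print("AUC:", roc_auc_score(y, proba))
print("F1 :", f1_score(y, (proba > 0.5).astype(int)))
```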
  • 35
    Publication Date: 2021-03-24
    Description: The presence of unaccounted heterogeneity in simultaneous equation models (SEMs) is frequently problematic in many real-life applications. Under the usual assumption of homogeneity, the model can be seriously misspecified, and it can potentially induce an important bias in the parameter estimates. This paper focuses on SEMs in which data are heterogeneous and tend to form clustering structures in the endogenous-variable dataset. Because the identification of different clusters is not straightforward, a two-step strategy that first forms groups among the endogenous observations and then uses the standard simultaneous equation scheme is provided. Methodologically, the proposed approach is based on a variational Bayes learning algorithm and does not need to be executed for varying numbers of groups in order to identify the one that adequately fits the data. We describe the statistical theory, evaluate the performance of the suggested algorithm by using simulated data, and apply the two-step method to a macroeconomic problem.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 36
    Publication Date: 2021-03-24
    Description: A mechanistic kinetic model of cobalt–hydrogen electrochemical competition for the cobalt removal process in zinc hydrometallurgy was proposed. In addition, to overcome the parameter estimation difficulties arising from the model nonlinearities and the lack of information on the possible value ranges of the parameters to be estimated, a constrained guided parameter estimation scheme was derived based on the model equations and experimental data. The proposed model and the parameter estimation scheme have three advantages: (i) the model reflects for the first time the mechanism of the electrochemical competition between cobalt and hydrogen ions in the process of cobalt removal in zinc hydrometallurgy; (ii) the proposed constrained parameter estimation scheme does not depend on information about the possible value ranges of the parameters to be estimated; (iii) the constraint conditions provided in that scheme directly link the experimental phenomenon metrics to the model parameters, thereby providing deeper insight into the model parameters for model users. Numerical experiments showed that the proposed constrained parameter estimation algorithm significantly improved the estimation efficiency. Meanwhile, the proposed cobalt–hydrogen electrochemical competition model allowed for accurate simulation of the impact of hydrogen ions on the cobalt removal rate as well as simulation of the trend of the hydrogen ion concentration, which would be helpful for the actual cobalt removal process in zinc hydrometallurgy.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 37
    Publication Date: 2021-03-10
    Description: In the context of smart cities, there is a general benefit from monitoring close encounters among pedestrians, for instance, in access control for office buildings, subways, commercial malls, etc., where a large number of users may be present simultaneously and keeping a strict record of each individual may be challenging. GPS tracking may not be available in many indoor cases; video surveillance may require expensive deployment (mainly due to the high-quality cameras and face recognition algorithms) and can be restrictive in the case of low-budget applications; RFID systems can be cumbersome and limited in detection range. This information can later be used in many different scenarios. For instance, in the case of earthquakes, fires, and accidents in general, the administration of the buildings can have a clear record of the people inside for victim searching activities. However, in the pandemic derived from the COVID-19 outbreak, tracking that allows the detection of pedestrians at close range (a few meters) can be particularly useful for controlling virus propagation. Hence, we propose a mobile clustering scheme where only a selected number of pedestrians (Cluster Heads) collect the information of the people around them (Cluster Members) along their trajectory inside the area of interest. As a result, a small number of transmissions are made to a control post, effectively limiting the collision probability and increasing the successful registration of people in close contact. Our proposal shows an increased successful packet transmission probability and reduced collision and idle-slot probabilities, effectively improving the performance of the system compared to the case of direct transmissions from each node.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
  • 38
    Publication Date: 2021-03-10
    Description: Detection of the temporal reversibility of a given process is an interesting time series analysis scheme that enables the useful characterisation of processes and offers insight into the underlying processes generating the time series. Reversibility detection measures have been widely employed in the study of ecological, epidemiological and physiological time series. Further, the time reversal of given data provides a promising tool for the analysis of causality measures as well as for studying the causal properties of processes. In this work, the recently proposed Compression-Complexity Causality (CCC) measure (by the authors) is shown to be free of the assumption that the "cause precedes the effect", making it a promising tool for the causal analysis of reversible processes. CCC is a data-driven interventional measure of causality (second rung on the Ladder of Causation) that is based on Effort-to-Compress (ETC), a well-established robust method to characterize the complexity of time series for analysis and classification. For the detection of the temporal reversibility of processes, we propose a novel measure called the Compressive Potential based Asymmetry Measure. This asymmetry measure compares the probability of the occurrence of patterns at different scales between the forward-time and time-reversed process using ETC. We test the performance of the measure on a number of simulated processes and demonstrate its effectiveness in determining the asymmetry of real-world time series of sunspot numbers, digits of the transcendental number π and heart interbeat interval variability. (A minimal code sketch follows this entry.)
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
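    The following sketch only illustrates the general idea referenced above, comparing the compressibility of a coarse-grained series with that of its time reversal. It uses zlib's compressed length as a crude stand-in for the Effort-to-Compress (ETC) measure, so it is an assumption-laden illustration rather than the authors' Compressive Potential based Asymmetry Measure.

      import zlib
      import numpy as np

      def complexity(series, n_bins=8):
          # Coarse-grain the real-valued series into n_bins symbols, then use the
          # zlib-compressed length of the symbol string as a crude complexity proxy
          # (a stand-in for ETC, not the measure used in the paper).
          edges = np.quantile(series, np.linspace(0, 1, n_bins + 1)[1:-1])
          symbols = np.digitize(series, edges).astype(np.uint8)
          return len(zlib.compress(symbols.tobytes()))

      def compressive_asymmetry(series):
          # Relative difference between forward and time-reversed complexity;
          # values near zero are consistent with temporal reversibility.
          c_fwd, c_rev = complexity(series), complexity(series[::-1])
          return (c_fwd - c_rev) / (c_fwd + c_rev)

      rng = np.random.default_rng(0)
      x = np.sin(np.linspace(0, 40 * np.pi, 4000)) + 0.1 * rng.normal(size=4000)
      print(compressive_asymmetry(x))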
  • 39
    Publication Date: 2021-03-12
    Description: Machine learning models can automatically discover biomedical research trends and promote the dissemination of information and knowledge. Text feature representation is a critical and challenging task in natural language processing. Most methods of text feature representation are based on word representation, and a good representation can capture semantic and structural information. In this paper, two fusion algorithms are proposed, namely, the Tr-W2v and Ti-W2v algorithms. They are based on the classical text feature representation model and take the importance of words into account. The results show that the two fusion text representation models are more effective than the classical text representation model, and the results based on the Tr-W2v algorithm are the best. Furthermore, based on the Tr-W2v algorithm, trend analyses of cancer research are conducted, including correlation analysis, keyword trend analysis, and improved keyword trend analysis. The discovery of research trends and the evolution of hotspots for cancers can help doctors and biological researchers collect information and provide guidance for further research. A schematic sketch of an importance-weighted word-vector representation follows this record.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
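    As a rough illustration of fusing word importance with word embeddings, the sketch below weights word vectors by TF-IDF scores, on the assumption that this is in the spirit of the Ti-W2v fusion; the exact Tr-W2v and Ti-W2v formulations follow the paper. The word_vectors dictionary and the toy documents are hypothetical.

      import numpy as np
      from sklearn.feature_extraction.text import TfidfVectorizer

      def weighted_doc_vectors(docs, word_vectors, dim):
          # Document vector = TF-IDF-weighted average of the word vectors present
          # in the document (words without an embedding are skipped).
          tfidf = TfidfVectorizer()
          weights = tfidf.fit_transform(docs)              # sparse (n_docs, n_terms)
          index_to_word = {i: w for w, i in tfidf.vocabulary_.items()}
          doc_vecs = np.zeros((len(docs), dim))
          for d in range(len(docs)):
              row, total = weights.getrow(d), 0.0
              for idx, w in zip(row.indices, row.data):
                  vec = word_vectors.get(index_to_word[idx])
                  if vec is not None:
                      doc_vecs[d] += w * vec
                      total += w
              if total > 0:
                  doc_vecs[d] /= total
          return doc_vecs

      rng = np.random.default_rng(1)
      docs = ["cancer immunotherapy trial", "gene expression in cancer"]
      vocab_vecs = {w: rng.normal(size=50) for w in
                    "cancer immunotherapy trial gene expression in".split()}
      print(weighted_doc_vectors(docs, vocab_vecs, 50).shape)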
  • 40
    Publication Date: 2021-03-12
    Description: As a complex field-circuit coupling system comprising electric, magnetic and thermal machines, the permanent magnet synchronous motor of an electric vehicle operates under varied working conditions and in a complicated environment. Failures take various forms, and their signs cross or overlap; their randomness, secondary effects, concurrency and communication characteristics make faults difficult to diagnose. Meanwhile, common intelligent diagnosis methods have low accuracy, poor generalization ability and difficulty in processing high-dimensional data. This paper proposes a method of motor fault feature extraction based on the principle of the stacked denoising autoencoder (SDAE) combined with a support vector machine (SVM) classifier. First, the motor signals collected from the experiment were processed, and the input data were randomly corrupted by adding noise. Then, according to the experimental results, the network structure of the stacked denoising autoencoder was constructed, and the optimal learning rate, noise reduction coefficient and other network parameters were set. Finally, the trained network was used to evaluate the test samples. Compared with the traditional fault extraction method and the single autoencoder method, this method offers better accuracy, strong generalization ability and easier handling of high-dimensional data features.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 41
    Publication Date: 2021-03-14
    Description: We numerically investigate the transport of a Brownian colloidal particle in a square array of planar counter-rotating convection rolls at high Péclet numbers. We show that an external force produces huge excess peaks of the particle’s diffusion constant with a height that depends on the force orientation and intensity. In sharp contrast, the particle’s mobility is isotropic and force independent. We relate such a nonlinear response of the system to the advection properties of the laminar flow in the suspension fluid.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 42
    Publication Date: 2021-03-12
    Description: In this paper, we generalize the notion of Shannon’s entropy power to the Rényi-entropy setting. With this, we propose generalizations of the de Bruijn identity, the isoperimetric inequality, and the Stam inequality. This framework not only allows for finding new estimation inequalities, but it also provides a convenient technical framework for the derivation of a one-parameter family of Rényi-entropy-power-based quantum-mechanical uncertainty relations. To illustrate the usefulness of the resulting Rényi entropy power, we show how the information probability distribution associated with a quantum state can be reconstructed in a process that is akin to quantum-state tomography. We illustrate the inner workings of this with the so-called “cat states”, which are of fundamental interest and practical use in schemes such as quantum metrology. Salient issues, including the extension of the notion of entropy power to Tsallis entropy and the ensuing implications in estimation theory, are also briefly discussed. The standard definitions underlying this generalization are summarized after this record.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
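    For reference, the classical definitions being generalized are recorded below in LaTeX. The normalization constant c_p in the Rényi entropy power is left unspecified, since the abstract does not fix a convention; it is only assumed to reduce to the Shannon case as p → 1.

      % Shannon differential entropy and entropy power of an n-dimensional X:
      h(X) = -\int f(x)\,\ln f(x)\,dx, \qquad
      N(X) = \frac{1}{2\pi e}\, e^{2 h(X)/n}.

      % Rényi entropy of order p and a Rényi entropy power of the same form,
      % with a p-dependent constant c_p chosen so that N_p -> N as p -> 1:
      h_p(X) = \frac{1}{1-p}\,\ln \int f(x)^{p}\,dx, \qquad
      N_p(X) = c_p\, e^{2 h_p(X)/n}, \qquad \lim_{p\to 1} c_p = \frac{1}{2\pi e}.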
  • 43
    Publication Date: 2021-02-17
    Description: We prove that, within the class of pair potential Hamiltonians, the excess entropy is a universal, temperature-independent functional of the density and pair correlation function. This result extends Henderson’s theorem, which states that the free energy is a temperature-dependent functional of the density and pair correlation. The stationarity and concavity of the excess entropy functional are discussed and related to the Gibbs–Bogoliubov inequality and to the free energy. We apply the Kirkwood approximation, which is commonly used for fluids, to both fluids and solids. Approximate excess entropy functionals are developed and compared to results from thermodynamic integration. The pair functional approach gives the absolute entropy and free energy based on simulation output at a single temperature without thermodynamic integration. We argue that a functional of this type, while strictly applicable to pair potentials, is also suitable for first-principles calculation of free energies from Born–Oppenheimer molecular dynamics performed at a single temperature. This advancement has the potential to reduce the evaluation of the free energy to a simple modification of any procedure that evaluates the energy and the pair correlation function.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 44
    Publication Date: 2021-02-17
    Description: The history of information theory, as a mathematical principle for analyzing data transmission and information communication, was formalized in 1948 with the publication of Claude Shannon’s famous paper “A Mathematical Theory of Communication” [...]
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 45
    Publication Date: 2021-02-02
    Description: Quantitative metagenomics is an important field that has delivered successful microbiome biomarkers associated with host phenotypes. The current convention mainly depends on unsupervised assembly of metagenomic contigs with a possibility of leaving interesting genetic material unassembled. Additionally, biomarkers are commonly defined on the differential relative abundance of compositional or functional units. Accumulating evidence supports that microbial genetic variations are as important as the differential abundance content, implying the need for novel methods accounting for the genetic variations in metagenomics studies. We propose an information theoretic metagenome assembly algorithm, discovering genomic fragments with maximal self-information, defined by the empirical distributions of nucleotides across the phenotypes and quantified with the help of statistical tests. Our algorithm infers fragments populating the most informative genetic variants in a single contig, named supervariant fragments. Experiments on simulated metagenomes, as well as on a colorectal cancer and an atherosclerotic cardiovascular disease dataset consistently discovered sequences strongly associated with the disease phenotypes. Moreover, the discriminatory power of these putative biomarkers was mainly attributed to the genetic variations rather than relative abundance. Our results support that a focus on metagenomics methods considering microbiome population genetics might be useful in discovering disease biomarkers with a great potential of translating to molecular diagnostics and biotherapeutics applications.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 46
    Publication Date: 2021-03-10
    Description: Human fall identification can play a significant role in sensor-based alarm systems, assisting physical therapists not only to reduce after-fall effects but also to save human lives. Elderly people often suffer from various diseases, and falls are a very frequent occurrence for them. In this regard, this paper presents an architecture to distinguish fall events from other natural indoor human activities. A video frame generator is applied to extract frames from video clips. First, a two-dimensional convolutional neural network (2DCNN) model is proposed to extract features from the video frames. Afterward, a gated recurrent unit (GRU) network captures the temporal dependency of human movement. A binary cross-entropy loss is computed to update the network parameters, such as the weights, so as to minimize the losses. Finally, a sigmoid classifier is used for binary classification to detect human fall events. Experimental results show that the proposed model obtains an accuracy of 99%, outperforming other state-of-the-art models. A minimal sketch of this frame-CNN-plus-GRU pipeline follows this record.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
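    A minimal PyTorch sketch of the frame-wise CNN followed by a GRU and a sigmoid output, as described above; layer sizes, frame resolution and clip length are illustrative assumptions rather than the paper's exact architecture.

      import torch
      import torch.nn as nn

      class FallDetector(nn.Module):
          def __init__(self, feat_dim=64, hidden=32):
              super().__init__()
              self.cnn = nn.Sequential(                 # 2D CNN applied per frame
                  nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
                  nn.Flatten(), nn.Linear(32 * 4 * 4, feat_dim), nn.ReLU(),
              )
              self.gru = nn.GRU(feat_dim, hidden, batch_first=True)
              self.head = nn.Linear(hidden, 1)          # fall / no-fall logit

          def forward(self, clips):                     # clips: (B, T, 3, H, W)
              b, t = clips.shape[:2]
              feats = self.cnn(clips.flatten(0, 1)).view(b, t, -1)
              _, h_n = self.gru(feats)                  # temporal dependency of movement
              return torch.sigmoid(self.head(h_n[-1]))  # fall probability

      model = FallDetector()
      frames = torch.randn(2, 16, 3, 64, 64)            # 2 clips of 16 frames each
      prob = model(frames)
      loss = nn.BCELoss()(prob, torch.tensor([[1.0], [0.0]]))
      print(prob.shape, loss.item())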
  • 47
    Publication Date: 2021-03-25
    Description: Identification schemes are interactive cryptographic protocols typically involving two parties, a prover, who wants to provide evidence of their identity, and a verifier, who checks the provided evidence and decides whether or not it comes from the intended prover. Given the growing interest in quantum computation, it is indeed desirable to have explicit designs for achieving user identification through quantum resources. In this paper, we comment on a recent proposal for quantum identity authentication from Zawadzki. We discuss the applicability of the theoretical impossibility results from Lo, Colbeck and Buhrman et al. and formally prove that the protocol must necessarily be insecure. Moreover, to better illustrate our insecurity claim, we present an attack on Zawadzki’s protocol and show that by using a simple strategy an adversary may indeed obtain relevant information on the shared identification secret. Specifically, through the use of the principle of conclusive exclusion on quantum measurements, our attack geometrically reduces the key space, resulting in the claimed logarithmic security being reduced effectively by a factor of two after only three verification attempts.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 48
    Publication Date: 2021-03-25
    Description: An information-theoretic approach for detecting causality and information transfer is used to identify interactions of solar activity and interplanetary medium conditions with the Earth’s magnetosphere–ionosphere systems. A causal information transfer from the solar wind parameters to geomagnetic indices is detected. The vertical component of the interplanetary magnetic field (Bz) influences the auroral electrojet (AE) index with an information transfer delay of 10 min and the geomagnetic disturbances at mid-latitudes measured by the symmetric field in the H component (SYM-H) index with a delay of about 30 min. Using a properly conditioned causality measure, no causal link between AE and SYM-H, or between magnetospheric substorms and magnetic storms can be detected. The observed causal relations can be described as linear time-delayed information transfer.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 49
    Publication Date: 2021-03-25
    Description: We consider the dynamics of two-dimensional interacting ultracold bosons triggered by suddenly switching on an artificial gauge field. The system is initialized in the ground state of a harmonic trapping potential. As a function of the strength of the applied artificial gauge field, we analyze the emergent dynamics by monitoring the angular momentum, the fragmentation as well as the entropy and variance of the entropy of absorption or single-shot images. We solve the underlying time-dependent many-boson Schrödinger equation using the multiconfigurational time-dependent Hartree method for indistinguishable particles (MCTDH-X). We find that the artificial gauge field implants angular momentum in the system. Fragmentation—multiple macroscopic eigenvalues of the reduced one-body density matrix—emerges in sync with the dynamics of angular momentum: the bosons in the many-body state develop non-trivial correlations. Fragmentation and angular momentum are experimentally difficult to assess; here, we demonstrate that they can be probed by statistically analyzing the variance of the image entropy of single-shot images that are the standard projective measurement of the state of ultracold atomic systems.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 50
    Publication Date: 2021-03-25
    Description: Robustness of the collaborative knowledge network (CKN) is critical to the success of open source projects. To study this robustness more comprehensively and accurately, we constructed a weighted CKN based on the semantic analysis of collaborative behavior, where (a) open source designers were the network nodes, (b) collaborative behavior among designers was the edges, and (c) collaborative text content intensity and collaborative frequency intensity were the edge weights. To study the robustness from a dynamic viewpoint, we constructed three CKNs from different stages of the project life cycle: the start-up, growth and maturation stages. The connectivity and collaboration efficiency of the weighted network were then used as robustness evaluation indexes. Further, we designed four edge failure modes based on the behavioral characteristics of open source designers. Finally, we carried out dynamic robustness analysis experiments based on the empirical data of a Local Motors open source car design project. Our results showed that the CKN performed differently at different stages of the project life cycle, and our specific findings could help community managers of open source projects to formulate different network protection strategies at different stages of their projects.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 51
    Publication Date: 2021-02-11
    Description: When a parameter quench is performed in an isolated quantum system with a complete set of constants of motion, its out of equilibrium dynamics is considered to be well captured by the Generalized Gibbs Ensemble (GGE), characterized by a set {λα} of coefficients related to the constants of motion. We determine the most elementary GGE deviation from the equilibrium distribution that leads to detectable effects. By quenching a suitable local attractive potential in a one-dimensional electron system, the resulting GGE differs from equilibrium by only one single λα, corresponding to the emergence of an only partially occupied bound state lying below a fully occupied continuum of states. The effect is shown to induce optical gain, i.e., a negative peak in the absorption spectrum, indicating the stimulated emission of radiation, enabling one to identify GGE signatures in fermionic systems through optical measurements. We discuss the implementation in realistic setups.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 52
    Publication Date: 2021-02-11
    Description: The development of new computational approaches that are able to design the correct personalized drugs is the crucial therapeutic issue in cancer research. However, tumor heterogeneity is the main obstacle to developing patient-specific single drugs or combinations of drugs that already exist in clinics. In this study, we developed a computational approach that integrates copy number alteration, gene expression, and a protein interaction network of 73 basal breast cancer samples. 2509 prognostic genes harboring a copy number alteration were identified using survival analysis, and a protein–protein interaction network considering the direct interactions was created. Each patient was described by a specific combination of seven altered hub proteins that fully characterize the 73 basal breast cancer patients. We suggested the optimal combination therapy for each patient considering drug–protein interactions. Our approach is able to confirm well-known cancer related genes and suggest novel potential drug target genes. In conclusion, we presented a new computational approach in breast cancer to deal with the intra-tumor heterogeneity towards personalized cancer therapy.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 53
    Publication Date: 2021-02-09
    Description: Commonly used rating scales and tests have been found to lack reliability and validity, for example in studies of neurodegenerative diseases, owing to not making recourse to the inherent ordinality of human responses, nor acknowledging the separability of person ability and item difficulty parameters according to the well-known Rasch model. Here, we adopt an information theory approach, particularly extending deployment of the classic Brillouin entropy expression when explaining the difficulty of recalling non-verbal sequences in memory tests (i.e., Corsi Block Test and Digit Span Test): a more ordered task, of less entropy, will generally be easier to perform. Construct specification equations (CSEs), as a part of a methodological development with entropy-based variables dominating, are found experimentally to explain (R² = 0.98) and predict the construct of task difficulty for short-term memory tests using data from the NeuroMET (n = 88) and Gothenburg MCI (n = 257) studies. We propose entropy-based equivalence criteria, whereby different tasks (in the form of items) from different tests can be combined, enabling new memory tests to be formed by choosing a bespoke selection of items, leading to more efficient testing, improved reliability (reduced uncertainties) and validity. This provides opportunities for more practical and accurate measurement in clinical practice, research and trials. The Brillouin entropy expression invoked here is illustrated after this record.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
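    The Brillouin entropy invoked above has a standard closed form; the short routine below computes it for a finite symbol sequence (the example sequences are hypothetical). How this entropy enters the construct specification equations for task difficulty is specific to the cited studies.

      # Brillouin's entropy for a finite sequence of N symbols with counts n_i:
      #   H = (1/N) * ln( N! / (n_1! n_2! ... n_k!) )
      # A more ordered (repetitive or patterned) sequence has lower H, which the
      # entry links to an easier recall task.
      from collections import Counter
      from math import lgamma

      def brillouin_entropy(sequence):
          counts = Counter(sequence)
          n = len(sequence)
          log_multinomial = lgamma(n + 1) - sum(lgamma(c + 1) for c in counts.values())
          return log_multinomial / n          # nats per symbol

      print(brillouin_entropy([1, 2, 3, 4, 5, 6]))   # all distinct: higher entropy
      print(brillouin_entropy([1, 1, 2, 2, 1, 1]))   # repetitive: lower entropy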
  • 54
    Publication Date: 2021-02-11
    Description: About 160 years ago, the concept of entropy was introduced in thermodynamics by Rudolf Clausius. Since then, it has been continually extended, interpreted, and applied by researchers in many scientific fields, such as general physics, information theory, chaos theory, data mining, and mathematical linguistics. This paper presents The Entropy Universe, which aims to review the many variants of entropies applied to time-series. The purpose is to answer research questions such as: How did each entropy emerge? What is the mathematical definition of each variant of entropy? How are entropies related to each other? What are the most applied scientific fields for each entropy? We describe in depth the relationship between the most applied entropies in time-series for different scientific fields, establishing bases for researchers to properly choose the variant of entropy most suitable for their data. The number of citations over the past sixteen years of each paper proposing a new entropy was also accessed. The Shannon/differential, the Tsallis, the sample, the permutation, and the approximate entropies were the most cited ones. Based on the ten research areas with the most significant number of records obtained in the Web of Science and Scopus, the areas in which the entropies are more applied are computer science, physics, mathematics, and engineering. The universe of entropies is growing each day, either due to the introduction of new variants or due to novel applications. Knowing each entropy’s strengths and limitations is essential to ensure the proper improvement of this research field.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 55
    Publication Date: 2021-02-17
    Description: We study a two-state “jumping diffusivity” model for a Brownian process alternating between two different diffusion constants, D+ > D−, with random waiting times in both states whose distribution is rather general. In the limit of long measurement times, Gaussian behavior with an effective diffusion coefficient is recovered. We show that, for equilibrium initial conditions and when the limit D− → 0 is taken, the short-time behavior leads to a cusp, namely a non-analytical behavior, in the distribution of the displacements P(x,t) for x → 0. Visually this cusp, or tent-like shape, resembles similar behavior found in many experiments of diffusing particles in disordered environments, such as glassy systems and intracellular media. This general result depends only on the existence of finite mean values of the waiting times in the different states of the model. Gaussian statistics in the long-time limit is achieved due to ergodicity and convergence of the distribution of the temporal occupation fraction in state D+ to a δ-function. The short-time behavior of the same quantity converges to a uniform distribution, which leads to the non-analyticity in P(x,t). We demonstrate how the superstatistical framework is a zeroth-order short-time expansion of P(x,t), in the number of transitions, that does not yield the cusp-like shape. The latter, considered as the key feature of experiments in the field, is found with the first correction in perturbation theory.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 56
    Publication Date: 2021-02-17
    Description: An earthquake of Mw6.4 hit the coastal zone of Albania on 26 November 2019, at 02:54:11 UTC. It was intensely felt about 34 km away, in Tirana City, where damage and loss of life occurred. To emphasize a pre-seismic geomagnetic signature before the onset of this earthquake, the data collected over the interval 15 October–30 November 2019 at the Panagjurishte (PAG, Bulgaria) and Surlari (SUA, Romania) observatories were analyzed. For geomagnetic signal identification we used the polarization parameter (BPOL), which is time-invariant under non-seismic conditions and becomes unstable due to the strain effect related to the Mw6.4 earthquake. Consequently, BPOL time series and their standard deviations were computed for both sites using ultra-low-frequency (ULF) fast Fourier transform (FFT) band-pass filtering. A statistical analysis based on a standardized random variable equation was applied to the BPOL* (PAG) and ABS BPOL* (PAG) time series to emphasize the anomalous signal’s singularity and to differentiate the transient local anomalies due to the Mw6.4 earthquake from the internal and external parts of the geomagnetic field, taking the PAG observatory as reference. Finally, the ABS BPOL* (PAG-SUA) time series were obtained for the interval 1–30 November 2019; a geomagnetic signature greater than 2.0 was detected on 23 November, a lead time of 3 days before the onset of the Mw6.4 earthquake.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 57
    Publication Date: 2021-03-28
    Description: The problem of extracting meaningful data through graph analysis spans a range of different fields, such as social networks, knowledge graphs, citation networks, the World Wide Web, and so on. As increasingly structured data become available, the importance of being able to effectively mine and learn from such data continues to grow. In this paper, we propose the multi-scale aggregation graph neural network based on feature similarity (MAGN), a novel graph neural network defined in the vertex domain. Our model provides a simple and general semi-supervised learning method for graph-structured data, in which only a very small part of the data is labeled as the training set. We first construct a similarity matrix by calculating the similarity of the original features between all adjacent node pairs, and then generate a set of feature extractors utilizing the similarity matrix to perform multi-scale feature propagation on graphs. The output of multi-scale feature propagation is finally aggregated by using the mean-pooling operation. Our method aims to improve the model’s representation ability via multi-scale neighborhood aggregation based on feature similarity. Extensive experimental evaluation on various open benchmarks shows the competitive performance of our method compared to a variety of popular architectures. A minimal numerical sketch of this similarity-weighted propagation follows this record.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
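    A NumPy-only sketch of similarity-weighted multi-scale feature propagation with mean pooling, in the spirit of the description above; cosine similarity between adjacent nodes' raw features is an assumption, and the learned components and training of MAGN are not reproduced.

      import numpy as np

      def magn_like_propagation(adj, features, scales=(1, 2, 3)):
          # similarity matrix restricted to existing edges (cosine similarity assumed)
          norm = np.linalg.norm(features, axis=1, keepdims=True) + 1e-12
          cos = (features @ features.T) / (norm * norm.T)
          sim = adj * cos
          # row-normalize so each propagation step averages over neighbors
          prop = sim / (sim.sum(axis=1, keepdims=True) + 1e-12)
          # multi-scale propagation: collect the k-step aggregates for each scale
          outputs, h = [], features
          for k in range(1, max(scales) + 1):
              h = prop @ h
              if k in scales:
                  outputs.append(h)
          return np.mean(outputs, axis=0)      # mean-pooling across scales

      adj = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 0],
                      [1, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)
      features = np.random.default_rng(0).normal(size=(4, 8))
      print(magn_like_propagation(adj, features).shape)   # (4, 8)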
  • 58
    Publication Date: 2021-03-27
    Description: Formal Bayesian comparison of two competing models, based on the posterior odds ratio, amounts to estimation of the Bayes factor, which is equal to the ratio of the two respective marginal data density values. In models with a large number of parameters and/or latent variables, these densities are expressed by high-dimensional integrals, which are often computationally infeasible. Therefore, other methods of evaluation of the Bayes factor are needed. In this paper, a new method of estimation of the Bayes factor is proposed. Simulation examples confirm good performance of the proposed estimators. Finally, these new estimators are used to formally compare different hybrid Multivariate Stochastic Volatility–Multivariate Generalized Autoregressive Conditional Heteroskedasticity (MSV-MGARCH) models which have a large number of latent variables. The empirical results show, among other things, that the validity of reducing the hybrid MSV-MGARCH model to the MGARCH specification depends on the analyzed data set as well as on prior assumptions about model parameters.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 59
    Publication Date: 2021-03-25
    Description: The paper tries to demonstrate that the process of the increase of entropy does not explain the asymmetry of time itself, because it is unable to account for time’s fundamental asymmetries: the asymmetry of traces (we have traces of the past and no traces of the future), the asymmetry of causation (we have an impact on future events with no possibility of having an impact on the past), and the asymmetry between the fixed past and the open future. To this end, the approaches of Boltzmann, Reichenbach (and his followers), and Albert are analysed. It is argued that we should instead look for alternative approaches: either a temporally asymmetrical physical theory, or a source of the asymmetry of time in metaphysics. This second approach may even turn out to be complementary, if only we accept that metaphysics can complement scientific research programmes.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 60
    Publication Date: 2021-03-28
    Description: Dempster-Shafer (DS) evidence theory is widely used in various fields of uncertain information processing, but it may produce counterintuitive results when dealing with conflicting data. Therefore, this paper proposes a new data fusion method which combines the Deng entropy and the negation of the basic probability assignment (BPA). In this method, the uncertainty in the original BPA and in the negation of the BPA are considered simultaneously. The degree of uncertainty of the BPA and of its negation is measured by the Deng entropy, and the two uncertainty measurements are integrated into the final uncertainty degree of the evidence. This new method can not only handle the fusion of conflicting evidence, but also obtains additional uncertainty information through the negation of the BPA, which helps improve the accuracy of information processing and reduce the loss of information. We apply it to numerical examples and fault diagnosis experiments to verify the effectiveness and superiority of the method. In addition, some open issues in current work, such as the limitations of Dempster-Shafer theory (DST) under the open-world assumption and the necessary properties of uncertainty measurement methods, are also discussed in this paper. A schematic sketch of the Deng entropy and BPA negation computations follows this record.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
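    The sketch below computes the Deng entropy of a basic probability assignment (BPA) and of its negation, using the commonly cited negation in which each focal element's mass is redistributed uniformly to the other focal elements; how the two uncertainty values are combined into the final degree follows the paper and is not reproduced here.

      # Deng entropy of a BPA m over focal sets A:
      #   E_d(m) = -sum_A m(A) * log2( m(A) / (2^|A| - 1) )
      from math import log2

      def deng_entropy(bpa):
          return -sum(m * log2(m / (2 ** len(A) - 1)) for A, m in bpa.items() if m > 0)

      def negate_bpa(bpa):
          # Redistribute each focal element's mass to the other focal elements
          # (one common definition of BPA negation in the literature).
          n = len(bpa)
          return {A: (1 - m) / (n - 1) for A, m in bpa.items()}

      bpa = {frozenset("a"): 0.6, frozenset("b"): 0.1, frozenset("ab"): 0.3}
      print(deng_entropy(bpa), deng_entropy(negate_bpa(bpa)))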
  • 61
    Publication Date: 2021-03-27
    Description: The swarm intelligence algorithm has become an important method for solving optimization problems because of its excellent self-organization, self-adaptation, and self-learning characteristics. However, when a traditional swarm intelligence algorithm faces high-dimensional and complex multi-peak problems, population diversity is quickly lost, which leads to premature convergence of the algorithm. In order to solve this problem, dimension entropy is proposed as a measure of population diversity, and a diversity control mechanism is proposed to guide the updating of the swarm intelligence algorithm. It maintains the diversity of the population in the early stage and ensures the convergence of the algorithm in the later stage. Experimental results show that the performance of the improved algorithm is better than that of the original algorithm. One plausible form of such a dimension entropy is sketched after this record.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
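    The exact definition of the dimension entropy is not given in the abstract; the sketch below assumes a histogram-based Shannon entropy computed per decision variable over the search bounds and averaged, as one plausible realization of such a diversity measure.

      import numpy as np

      def dimension_entropy(population, bounds, n_bins=10):
          # population: (n_particles, n_dims); bounds: (low, high) of the search space.
          entropies = []
          for d in range(population.shape[1]):
              counts, _ = np.histogram(population[:, d], bins=n_bins, range=bounds)
              p = counts / max(counts.sum(), 1)
              p = p[p > 0]
              entropies.append(-(p * np.log(p)).sum())
          return float(np.mean(entropies))

      rng = np.random.default_rng(0)
      diverse = rng.uniform(-5, 5, size=(50, 10))      # well-spread swarm: high entropy
      converged = rng.normal(0, 0.05, size=(50, 10))   # collapsed swarm: low entropy
      print(dimension_entropy(diverse, (-5, 5)), dimension_entropy(converged, (-5, 5)))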
  • 62
    Publication Date: 2021-03-27
    Description: Transfer learning seeks to improve the generalization performance of a target task by exploiting the knowledge learned from a related source task. Central questions include deciding what information one should transfer and when transfer can be beneficial. The latter question is related to the so-called negative transfer phenomenon, where the transferred source information actually reduces the generalization performance of the target task. This happens when the two tasks are sufficiently dissimilar. In this paper, we present a theoretical analysis of transfer learning by studying a pair of related perceptron learning tasks. Despite the simplicity of our model, it reproduces several key phenomena observed in practice. Specifically, our asymptotic analysis reveals a phase transition from negative transfer to positive transfer as the similarity of the two tasks moves past a well-defined threshold.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 63
    Publication Date: 2021-03-25
    Description: Expressing currents and their fluctuations at the terminals of a multi-probe conductor in terms of the wave functions of carriers injected into the Fermi sea provides new insight into the physics of electric currents. This approach helps us to identify two physically different contributions to shot noise. In the quantum coherent regime, when current is carried by non-overlapping wave packets, the product of current fluctuations in different leads, the cross-correlation noise, is determined solely by the duration of the wave packet. In contrast, the square of the current fluctuations in one lead, the autocorrelation noise, is additionally determined by the coherence of the wave packet, which is associated with the spread of the wave packet in energy. The two contributions can be addressed separately in the weak back-scattering regime, when the autocorrelation noise depends only on the coherence. Analysis of shot noise in terms of these contributions allows us, in particular, to predict that no individual traveling particles with a real wave function, such as Majorana fermions, can be created in the Fermi sea in a clean manner, that is, without accompanying electron–hole pairs.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 64
    Publication Date: 2021-03-18
    Description: The Integrated Information Theory (IIT) of consciousness starts from essential phenomenological properties, which are then translated into postulates that any physical system must satisfy in order to specify the physical substrate of consciousness. We recently introduced an information measure (Barbosa et al., 2020) that captures three postulates of IIT—existence, intrinsicality and information—and is unique. Here we show that the new measure also satisfies the remaining postulates of IIT—integration and exclusion—and create the framework that identifies maximally irreducible mechanisms. These mechanisms can then form maximally irreducible systems, which in turn will specify the physical substrate of conscious experience.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 65
    Publication Date: 2021-03-18
    Description: In this work, we first consider the discrete version of Fisher information measure and then propose Jensen–Fisher information, to develop some associated results. Next, we consider Fisher information and Bayes–Fisher information measures for mixing parameter vector of a finite mixture probability mass function and establish some results. We provide some connections between these measures with some known informational measures such as chi-square divergence, Shannon entropy, Kullback–Leibler, Jeffreys and Jensen–Shannon divergences.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 66
    Publication Date: 2021-03-15
    Description: Recently, there has been a huge rise in malware growth, which creates a significant security threat to organizations and individuals. Despite the incessant efforts of cybersecurity research to defend against malware threats, malware developers discover new ways to evade these defense techniques. Traditional static and dynamic analysis methods are ineffective in identifying new malware and pose high overhead in terms of memory and time. Typical machine learning approaches that train a classifier based on handcrafted features are also not sufficiently potent against these evasive techniques and require more effort due to feature engineering. Recent malware detectors also show performance degradation due to class imbalance in malware datasets. To resolve these challenges, this work adopts a visualization-based method, where malware binaries are depicted as two-dimensional images and classified by a deep learning model. We propose an efficient malware detection system based on deep learning. The system uses a reweighted class-balanced loss function in the final classification layer of the DenseNet model to achieve significant performance improvements in classifying malware by handling imbalanced data issues. Comprehensive experiments performed on four benchmark malware datasets show that the proposed approach can detect new malware samples with higher accuracy (98.23% for the Malimg dataset, 98.46% for the BIG 2015 dataset, 98.21% for the MaleVis dataset, and 89.48% for the unseen Malicia dataset) and reduced false-positive rates when compared with conventional malware mitigation techniques, while maintaining low computational time. The proposed malware detection solution is also reliable and effective against obfuscation attacks. A sketch of the class-balanced reweighting assumed here follows this record.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
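    The reweighted class-balanced loss mentioned above is consistent with the widely used effective-number-of-samples weighting; the sketch applies that weighting to cross-entropy in PyTorch as an assumption about the exact form. The class counts are hypothetical, and the DenseNet backbone and malware-image preprocessing are not reproduced.

      # Class-balanced weights w_c = (1 - beta) / (1 - beta^{n_c}) applied to
      # cross-entropy, to counteract class imbalance in the training set.
      import torch
      import torch.nn as nn

      def class_balanced_weights(samples_per_class, beta=0.999):
          counts = torch.as_tensor(samples_per_class, dtype=torch.float)
          effective = 1.0 - torch.pow(beta, counts)
          weights = (1.0 - beta) / effective
          return weights / weights.sum() * len(samples_per_class)   # normalize

      samples_per_class = [2949, 408, 123, 80]           # hypothetical imbalanced counts
      criterion = nn.CrossEntropyLoss(weight=class_balanced_weights(samples_per_class))

      logits = torch.randn(16, 4)                        # e.g. backbone outputs
      labels = torch.randint(0, 4, (16,))
      print(criterion(logits, labels).item())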
  • 67
    Publication Date: 2021-03-15
    Description: The auditory mismatch negativity (MMN) has been considered a preattentive index of auditory processing and/or a signature of prediction error computation. This study tries to demonstrate the presence of an MMN to deviant trials included in complex auditory stimuli sequences, and its possible relationship to predictive coding. Additionally, the transfer of information between trials is expected to be represented by stimulus-preceding negativity (SPN), which would possibly fit the predictive coding framework. To accomplish these objectives, the EEG of 31 subjects was recorded during an auditory paradigm in which trials composed of stimulus sequences with increasing or decreasing frequencies were intermingled with deviant trials presenting an unexpected ending. Our results showed the presence of an MMN in response to deviant trials. An SPN appeared during the intertrial interval and its amplitude was reduced in response to deviant trials. The presence of an MMN in complex sequences of sounds and the generation of an SPN component, with different amplitudes in deviant and standard trials, would support the predictive coding framework.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 68
    Publication Date: 2021-03-15
    Description: Samples from a high-dimensional first-order auto-regressive process generated by an independently and identically distributed random innovation sequence are observed by a sender which can communicate only finitely many bits per unit time to a receiver. The receiver seeks to form an estimate of the process value at every time instant in real-time. We consider a time-slotted communication model in a slow-sampling regime where multiple communication slots occur between two sampling instants. We propose a successive update scheme which uses communication between sampling instants to refine estimates of the latest sample and study the following question: Is it better to collect communication of multiple slots to send better refined estimates, making the receiver wait more for every refinement, or to be fast but loose and send new information in every communication opportunity? We show that the fast but loose successive update scheme with ideal spherical codes is universally optimal asymptotically for a large dimension. However, most practical quantization codes for fixed dimensions do not meet the ideal performance required for this optimality, and they typically will have a bias in the form of a fixed additive error. Interestingly, our analysis shows that the fast but loose scheme is not an optimal choice in the presence of such errors, and a judiciously chosen frequency of updates outperforms it.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 69
    Publication Date: 2021-03-15
    Description: Rapid industrial development has caused a series of environmental problems, which are not conducive to the sustainable development of society as a whole. It is therefore necessary to build a sustainable development evaluation system. Most of the existing literature has evaluated corporate sustainable performance in terms of economy, environment and society on the basis of the triple bottom line. Considering this research gap and practical needs, an evaluation system is established along four dimensions, referred to as economy, society, environment and responsibility management, and 29 indicators are designed to measure these four dimensions. Twenty-seven listed Chinese mining corporations are selected as research samples, and the entropy-weight-based Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) method is applied to calculate the indicators’ weights. Results show that, ranked by weight from high to low, the four dimensions of sustainable performance are society, environment, economy, and management process. A compact numerical sketch of the entropy-weight TOPSIS procedure follows this record.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
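    A compact NumPy sketch of the entropy-weight TOPSIS procedure referenced above: entropy-based indicator weights followed by ranking through closeness to the ideal solution. The random benefit-type decision matrix stands in for the study's 27 corporations and 29 indicators; cost-type indicators would need the usual sign adjustments.

      import numpy as np

      def entropy_weights(X):
          P = X / X.sum(axis=0)                                   # column-wise shares
          k = 1.0 / np.log(X.shape[0])
          E = -k * np.nansum(np.where(P > 0, P * np.log(P), 0.0), axis=0)
          d = 1.0 - E                                             # divergence degree
          return d / d.sum()

      def topsis(X, w):
          R = X / np.linalg.norm(X, axis=0)                       # vector normalization
          V = R * w
          ideal, anti = V.max(axis=0), V.min(axis=0)              # benefit criteria assumed
          d_pos = np.linalg.norm(V - ideal, axis=1)
          d_neg = np.linalg.norm(V - anti, axis=1)
          return d_neg / (d_pos + d_neg)                          # closeness coefficient

      X = np.random.default_rng(0).uniform(1, 10, size=(27, 29))  # alternatives x indicators
      scores = topsis(X, entropy_weights(X))
      print(np.argsort(-scores)[:5])                              # top-5 ranked alternatives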
  • 70
    Publication Date: 2021-03-12
    Description: A formal analogy of fluctuating diffusivity to thermodynamics is discussed for messenger RNA molecules fluorescently fused to a protein in living cells. Regarding the average value of the fluctuating diffusivity of such RNA-protein particles as the analog of the internal energy, the analogs of the quantity of heat and work are identified. The Clausius-like inequality is shown to hold for the entropy associated with diffusivity fluctuations, which plays a role analogous to the thermodynamic entropy, and the analog of the quantity of heat. The change of the statistical fluctuation distribution is also examined from a geometric perspective. The present discussions may contribute to a deeper understanding of the fluctuating diffusivity in view of the laws of thermodynamics.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 71
    Publication Date: 2021-03-12
    Description: Seismo-electrical coupling is critical to understanding the mechanism of geoelectrical precursors to earthquakes. A novel seismo-electrical model, called the Chen–Ouillon–Sornette (COS) model, has been developed by combining the Burridge–Knopoff spring-block system with the mechanisms of stress-activated charge carriers (i.e., electrons and holes) and pressure-stimulated currents. Such a model can thus simulate fracture-induced electrical signals at the laboratory scale or earthquake-related geoelectrical signals at the geological scale. In this study, using information measures of time series analysis, we attempt to understand the influence of diverse electrical conditions on the characteristics of the electrical signals simulated with the COS model. We employ the Fisher–Shannon method to investigate the temporal dynamics of the COS model. The results showed that the electrical parameters of the COS model, particularly the capacitance and inductance, affect the level of order/disorder in the electrical time series. Comparing with field observations, we infer that the underground electrical condition tends toward larger capacitance or smaller inductance during seismogenic processes. Accordingly, this study may provide a better understanding of the mechanical–electrical coupling of the Earth’s crust.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 72
    Publication Date: 2021-03-12
    Description: Crowded trades by similarly trading peers influence the dynamics of asset prices, possibly creating systemic risk. We propose a market clustering measure using granular trading data. For each stock, the clustering measure captures the degree of trading overlap among any two investors in that stock, based on a comparison with the expected crowding in a null model where trades are maximally random while still respecting the empirical heterogeneity of both stocks and investors. We investigate the effect of crowded trades on stock price stability and present evidence that market clustering has a causal effect on the properties of the tails of the stock return distribution, particularly the positive tail, even after controlling for commonly considered risk drivers. Reduced investor pool diversity could thus negatively affect stock price stability.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 73
    Publication Date: 2021-03-24
    Description: In this paper, we ask whether the structure of investor networks, estimated using shareholder registration data, is abnormal during a financial crisis. We answer this question by analyzing the structure of investor networks through several of the most prominent global network features. The networks are estimated from data on marketplace transactions of all publicly traded securities executed on the Helsinki Stock Exchange by Finnish stock shareholders between 1995 and 2016. We observe that most of the feature distributions were abnormal during the 2008–2009 financial crisis, with statistical significance. This paper provides evidence that the financial crisis was associated with a structural change in investors’ trade time synchronization. This indicates that the way investors use their private information channels changes depending on market conditions.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 74
    Publication Date: 2021-03-19
    Description: We propose a novel framework to describe the time-evolution of dilute classical and quantum gases, initially out of equilibrium and with spatial inhomogeneities, towards equilibrium. Briefly, we divide the system into small cells and consider the local equilibrium hypothesis. We subsequently define a global functional that is the sum of cell H-functionals. Each cell functional recovers the corresponding Maxwell–Boltzmann, Fermi–Dirac, or Bose–Einstein distribution function, depending on the classical or quantum nature of the gas. The time-evolution of the system is described by the relationship dH/dt≤0, and the equality condition occurs if the system is in the equilibrium state. Via the variational method, proof of the previous relationship, which might be an extension of the H-theorem for inhomogeneous systems, is presented for both classical and quantum gases. Furthermore, the H-functionals are in agreement with the correspondence principle. We discuss how the H-functionals can be identified with the system’s entropy and analyze the relaxation processes of out-of-equilibrium systems.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 75
    Publication Date: 2021-03-16
    Description: Weak measurements have been under intensive investigation in both experiment and theory. Numerous experiments have indicated that the amplified meter shift is produced by the post-selection, yielding an improved precision compared to conventional methods. However, this amplification effect comes at the cost of a reduced rate of acquiring data, which leads to an increasing uncertainty to determine the level of meter shift. From this point of view, a number of theoretical works have suggested that weak measurements cannot improve the precision, or even damage the metrology information due to the post-selection. In this review, we give a comprehensive analysis of the weak measurements to justify their positive effect on prompting measurement precision. As a further step, we introduce two modified weak measurement protocols to boost the precision beyond the standard quantum limit. Compared to previous works beating the standard quantum limit, these protocols are free of using entangled or squeezed states. The achieved precision outperforms that of the conventional method by two orders of magnitude and attains a practical Heisenberg scaling up to n=106 photons.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 76
    Publication Date: 2021-03-13
    Description: Offline Arabic Handwriting Recognition (OAHR) has recently become instrumental in the areas of pattern recognition and image processing due to its application in several fields, such as office automation and document processing. However, OAHR continues to face several challenges, including the high variability of the Arabic script and its intrinsic characteristics such as cursiveness, ligatures, and diacritics, the unlimited variation in human handwriting, and the lack of large public databases. In this paper, we introduce a novel context-aware model based on deep neural networks to address the challenges of recognizing offline handwritten Arabic text, including isolated digits, characters, and words. Specifically, we propose a supervised Convolutional Neural Network (CNN) model that contextually extracts optimal features and employs batch normalization and dropout regularization parameters. This aims to prevent overfitting and further enhance generalization performance when compared to conventional deep learning models. We employ a number of deep stacked-convolutional layers to design the proposed Deep CNN (DCNN) architecture. The model is extensively evaluated and shown to demonstrate excellent classification accuracy when compared to conventional OAHR approaches on a diverse set of six benchmark databases, including MADBase (Digits), CMATERDB (Digits), HACDB (Characters), SUST-ALT (Digits), SUST-ALT (Characters), and SUST-ALT (Names). A further experimental study is conducted on the benchmark Arabic databases by exploiting transfer learning (TL)-based feature extraction, which demonstrates the superiority of our proposed model in relation to state-of-the-art VGGNet-19 and MobileNet pre-trained models. Finally, experiments are conducted to assess the comparative generalization capabilities of the models using another language database, specifically the benchmark MNIST English isolated Digits database, which further confirm the superiority of our proposed DCNN model.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 77
    Publication Date: 2021-03-16
    Description: Finite-time isothermal processes are ubiquitous in quantum-heat-engine cycles, yet complicated due to the coexistence of the changing Hamiltonian and the interaction with the thermal bath. Such complexity prevents classical thermodynamic measurements of a performed work. In this paper, the isothermal process is decomposed into piecewise adiabatic and isochoric processes to measure the performed work as the internal energy change in adiabatic processes. The piecewise control scheme allows the direct simulation of the whole process on a universal quantum computer, which provides a new experimental platform to study quantum thermodynamics. We implement the simulation on ibmqx2 to show the 1/τ scaling of the extra work in finite-time isothermal processes.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 78
    Publication Date: 2021-03-16
    Description: This paper is our attempt, on the basis of physical theory, to bring more clarity to the question “What is life?” formulated in the well-known book of Schrödinger in 1944. According to Schrödinger, the main distinguishing feature of a biosystem’s functioning is the ability to preserve its order structure or, in mathematical terms, to prevent an increase of entropy. However, Schrödinger’s analysis shows that classical theory is not able to adequately describe order-stability in a biosystem. Schrödinger also appealed to the ambiguous notion of negative entropy. We apply quantum theory. As is well known, the behaviour of the quantum von Neumann entropy crucially differs from that of classical entropy. We consider a complex biosystem S composed of many subsystems, say proteins, cells, or neural networks in the brain, that is, S=(Si). We study the following problem: whether the compound system S can maintain “global order” in a situation of increasing local disorder, that is, whether S can preserve low entropy while the subsystems Si increase their entropies (perhaps substantially). We show that the entropy of the system as a whole can remain constant while the entropies of its parts rise. For classical systems this is impossible, because the entropy of S cannot be less than the entropy of its subsystem Si, and if a subsystem’s entropy increases, then the system’s entropy should also increase, by at least the same amount. However, within quantum information theory, the answer is positive. A significant role is played by the entanglement of the subsystems’ states. In the absence of entanglement, increasing local disorder implies increasing disorder in the compound system S (as in the classical regime). In this note, we proceed within a quantum-like approach to the mathematical modeling of information processing by biosystems; respecting the quantum laws need not rely on genuine quantum physical processes in biosystems. Recently, such modeling has found numerous applications in molecular biology, genetics, evolution theory, cognition, psychology and decision making. The quantum-like model of order stability can be applied not only in biology, but also in social science and artificial intelligence. A worked numerical example of the quantum behaviour of subsystem entropies follows this record.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
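    A worked numerical example of the key quantum fact used above: for the entangled Bell state the von Neumann entropy of the whole system is zero while each subsystem's entropy equals ln 2, which has no classical counterpart. The biological interpretation is, of course, the paper's own.

      # For the Bell state |Φ+> = (|00> + |11>)/sqrt(2): S(rho) = 0 for the whole,
      # S(rho_A) = ln 2 for either subsystem after a partial trace.
      import numpy as np

      def von_neumann_entropy(rho):
          evals = np.linalg.eigvalsh(rho)
          evals = evals[evals > 1e-12]
          return float(-(evals * np.log(evals)).sum())

      psi = np.zeros(4); psi[0] = psi[3] = 1 / np.sqrt(2)       # |00> + |11>
      rho = np.outer(psi, psi)                                   # pure state of the whole
      rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)    # partial trace over B
      print(von_neumann_entropy(rho), von_neumann_entropy(rho_A), np.log(2))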
  • 79
    Publication Date: 2021-03-12
    Description: In this paper, we study the concomitants of dual generalized order statistics (and consequently generalized order statistics) from the Huang–Kotz Farlie–Gumbel–Morgenstern bivariate distribution when the parameters γ1,…,γn are assumed to be pairwise different. Some useful recurrence relations between single and product moments of concomitants are obtained. Moreover, Shannon’s entropy and the Fisher information number measures are derived. Finally, these measures are studied extensively for some well-known distributions such as the exponential, Pareto and power distributions. The main motivation for studying the concomitants of generalized order statistics (an important practical way of ordering bivariate data) under this general framework is to enable researchers in different fields of statistics to use some of the important models contained in these generalized order statistics under this general framework. These extended models, such as progressive type-II censored order statistics, are frequently used in reliability theory.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
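    For orientation, and as standard background rather than a result of the paper above, the classical Farlie–Gumbel–Morgenstern family, of which the Huang–Kotz family is a generalization, couples arbitrary marginals F_X and F_Y as follows:
```latex
% Classical FGM bivariate distribution, shape parameter -1 <= \lambda <= 1;
% the Huang--Kotz family generalizes this construction.
F_{X,Y}(x,y) \;=\; F_X(x)\,F_Y(y)\,\bigl[\,1+\lambda\,\bigl(1-F_X(x)\bigr)\bigl(1-F_Y(y)\bigr)\bigr].
```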
  • 80
    Publication Date: 2021-03-13
    Description: In recent decades, image encryption, as one of the significant fields of information security, has attracted many researchers and scientists. Numerous studies using different methods have been performed, and novel, useful algorithms have been suggested to improve secure image encryption schemes. Chaotic methods are now applied in diverse fields, such as the design of cryptosystems and image encryption. Chaos-based digital image encryption uses pseudo-random chaotic sequences to encrypt images and is a highly secure and fast approach, although limited accuracy is one of its disadvantages. This paper examines the chaotic sequence and the wavelet transform to identify such gaps, and proposes a novel digital image encryption technique that improves on previous algorithms. The technique is implemented in MATLAB, and a comparison is made in terms of performance metrics such as the Number of Pixels Change Rate (NPCR), Peak Signal to Noise Ratio (PSNR), correlation coefficient, and Unified Average Changing Intensity (UACI); standard definitions of NPCR and UACI are sketched after this record. The simulation and theoretical analysis indicate the proposed scheme’s effectiveness and show that the technique is a suitable choice for practical image encryption.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
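    A minimal sketch of the NPCR and UACI metrics named in the abstract above, using their standard textbook definitions rather than the paper’s MATLAB code; the two random arrays stand in for a pair of cipher images:
```python
# Standard image-encryption quality metrics between two 8-bit cipher images.
import numpy as np

def npcr(c1, c2):
    """Number of Pixels Change Rate: percentage of positions with differing pixels."""
    return 100.0 * np.mean(c1 != c2)

def uaci(c1, c2):
    """Unified Average Changing Intensity: mean absolute pixel difference, scaled to %."""
    return 100.0 * np.mean(np.abs(c1.astype(float) - c2.astype(float)) / 255.0)

# Hypothetical example with random "cipher images"
rng = np.random.default_rng(0)
c1 = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
c2 = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
print(f"NPCR = {npcr(c1, c2):.2f}%, UACI = {uaci(c1, c2):.2f}%")
```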
  • 81
    Publication Date: 2021-03-14
    Description: Decentralization is a defining characteristic of self-organizing systems such as swarm intelligence systems, which function as complex, collectively responsive systems without central control and operate through contextual local coordination among relatively simple individual systems. The decentralized nature of self-organizing systems lies in their capacity to respond spontaneously and cooperatively to environmental changes without external control. However, if members cannot observe the state of the whole team and environment, they have to share their knowledge and policies with each other through communication in order to adapt to the environment appropriately. In this paper, we propose an information sharing mechanism, implemented as an independent decision phase, to improve the members’ joint adaptation to the world and thereby achieve better self-organization overall. The information sharing decision is designed by analogy with human information sharing: information is shared among members by evaluating its semantic relationship, based on an ontology graph and their local knowledge. That is, when a member collects more relevant information, that information is used to update its local knowledge, and the sharing of relevant information improves because ontological relevance can be measured more accurately. This, in turn, enables more related information to be acquired, so that the members’ models are reinforced for more precise information sharing. Our simulations and experimental results show that this design can share information efficiently and yields optimally adaptive self-organizing systems.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 82
    Publication Date: 2021-03-09
    Description: In this paper, we study the entropy functions on extreme rays of the polymatroidal region that contain a matroid, i.e., matroidal entropy functions. We introduce variable-strength orthogonal arrays indexed by a connected matroid M and a positive integer v, which can be regarded as an expansion of the classical combinatorial structure of orthogonal arrays. Interestingly, they are equivalent to the partition representations of the matroid M with degree v and to the (M,v) almost affine codes. Thus, a synergy among four fields, i.e., information theory, matroid theory, combinatorial design, and coding theory, is developed, which may lead to applications in information problems such as network coding and secret sharing. Leveraging the construction of variable-strength orthogonal arrays, we characterize all matroidal entropy functions of order n≤5, with the exception of log 10·U2,5 and log v·U3,5 for some v.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 83
    Publication Date: 2021-03-08
    Description: The simplest model of the evolution of agents with different energy strategies is considered. The model is based on very general thermodynamic ideas and includes procedures for selection, inheritance, and variability. The problem of finding a universal strategy (principle) for selecting among possible competing strategies is addressed. It is shown that when there is non-equilibrium between the medium and the agents, a direction in the evolution of the agents arises; at the same time, depending on the conditions of the evolution, different strategies can be successful. Nevertheless, the simulation results reveal that, in the presence of significant competition among agents, the strategy that emerges from the evolution is the one with the maximum total energy dissipation of the agents. Thus, it is not a specific strategy that is universal, but the maximization of dissipation. This result reveals an interesting connection between the basic principles of Darwin–Wallace evolution and the maximum entropy production principle.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 84
    Publication Date: 2021-03-09
    Description: Regression models provide prediction frameworks for multivariate mutual information analysis that uses information concepts when choosing covariates (also called features) that are important for analysis and prediction. We consider a high-dimensional regression framework where the number of covariates (p) exceeds the sample size (n). Recent work in high-dimensional regression analysis has embraced an ensemble subspace approach that consists of selecting random subsets of covariates with fewer than p covariates, carrying out statistical analysis on each subset, and then merging the results from the subsets (a small sketch of this approach follows this record). We examine conditions under which penalty methods such as Lasso perform better when used in the ensemble approach by computing mean squared prediction errors for simulations and a real data example. Linear models with both random and fixed designs are considered. We examine two versions of penalty methods: one where the tuning parameter is selected by cross-validation, and one where the final predictor is a trimmed average of individual predictors corresponding to the members of a set of fixed tuning parameters. We find that the ensemble approach improves on penalty methods for several important real data and model scenarios. The improvement occurs when covariates are strongly associated with the response and when the complexity of the model is high. In such cases, the trimmed-average version of ensemble Lasso is often the best predictor.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
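    A minimal sketch of the ensemble-subspace idea described above, under assumptions not taken from the paper: scikit-learn, a synthetic regression problem with p = 200 and n = 80, 50 random subspaces of 30 covariates each, Lasso tuned by cross-validation, and plain (rather than trimmed) averaging of the subset predictions:
```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV
from sklearn.model_selection import train_test_split

# Synthetic high-dimensional problem: more covariates (200) than samples (80)
X, y = make_regression(n_samples=80, n_features=200, n_informative=10,
                       noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

rng = np.random.default_rng(0)
preds = []
for _ in range(50):                                        # 50 random subspaces
    cols = rng.choice(X.shape[1], size=30, replace=False)  # subset with fewer than p covariates
    model = LassoCV(cv=5).fit(X_tr[:, cols], y_tr)         # tuning parameter by cross-validation
    preds.append(model.predict(X_te[:, cols]))

ensemble_pred = np.mean(preds, axis=0)                     # merge the subset results
mse = np.mean((ensemble_pred - y_te) ** 2)
print("ensemble mean squared prediction error:", round(mse, 2))
```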
  • 85
    Publication Date: 2021-03-09
    Description: Unemployment has risen as the economy has shrunk. The coronavirus crisis has affected many sectors in Romania, with some companies reducing or even ceasing their activity. Forecasting the unemployment rate is of fundamental importance for future social policy strategies. The aim of the paper is to comparatively analyze the forecast performance of different univariate time series methods with the purpose of providing future predictions of the unemployment rate. To that end, several forecasting models (seasonal autoregressive integrated moving average (SARIMA), self-exciting threshold autoregressive (SETAR), Holt–Winters, ETS (error, trend, seasonal), and NNAR (neural network autoregression)) have been applied, and their forecast performance has been evaluated on both the in-sample data covering the period January 2000–December 2017, used for model identification and estimation, and the out-of-sample data covering the last three years, 2018–2020; an illustration of such an out-of-sample comparison is sketched after this record. The unemployment rate forecast covers the next two years, 2021–2022. Based on the in-sample forecast assessment of the different methods, the forecast measures root mean squared error (RMSE), mean absolute error (MAE), and mean absolute percentage error (MAPE) suggested that the multiplicative Holt–Winters model outperforms the other models. For the out-of-sample forecasting performance, RMSE and MAE values revealed that the NNAR model performs better, while according to MAPE the SARIMA model registers higher forecast accuracy. The empirical results of the Diebold–Mariano test at one forecast horizon for the out-of-sample methods revealed differences in forecasting performance between SARIMA and NNAR, with the NNAR model considered the best for modeling and forecasting the unemployment rate.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
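    A minimal sketch of the kind of out-of-sample comparison the abstract describes, restricted to SARIMA and multiplicative Holt–Winters; the monthly series is synthetic and statsmodels is assumed, so this illustrates the workflow rather than reproducing the authors’ code:
```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical monthly unemployment-rate series, Jan 2000 - Dec 2020
idx = pd.date_range("2000-01", periods=252, freq="MS")
y = pd.Series(6 + np.sin(np.arange(252) * 2 * np.pi / 12)
              + np.random.default_rng(1).normal(0, 0.3, 252), index=idx)
train, test = y[:-36], y[-36:]                 # last three years held out (2018-2020)

fits = {
    "SARIMA": SARIMAX(train, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12)).fit(disp=False),
    "Holt-Winters": ExponentialSmoothing(train, trend="add", seasonal="mul",
                                         seasonal_periods=12).fit(),
}
for name, fit in fits.items():
    fc = fit.forecast(len(test))
    rmse = np.sqrt(np.mean((fc - test) ** 2))
    mae = np.mean(np.abs(fc - test))
    mape = 100 * np.mean(np.abs((fc - test) / test))
    print(f"{name}: RMSE={rmse:.3f}  MAE={mae:.3f}  MAPE={mape:.2f}%")
```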
  • 86
    Publication Date: 2021-03-08
    Description: Background: We developed CEPS as an open-access MATLAB® GUI (graphical user interface) for the analysis of Complexity and Entropy in Physiological Signals (CEPS), and demonstrate its use with an example data set showing the effects of paced breathing (PB) on the variability of heart, pulse and respiration rates. CEPS is also sufficiently adaptable to be used for other physiological time series data such as EEG (electroencephalography), postural sway or temperature measurements. Methods: Data were collected from a convenience sample of nine healthy adults in a pilot for a larger study investigating the effects on vagal tone of breathing paced at various rates, part of a development programme for a home training stress reduction system. Results: The current version of CEPS focuses on those complexity and entropy measures that appear most frequently in the literature, together with some recently introduced entropy measures that may have advantages over the more established ones. Ten methods of estimating data complexity are currently included, along with some 28 entropy measures (an illustrative sample-entropy sketch follows this record). The GUI also includes a section for data pre-processing and standard ancillary methods to enable parameter estimation of the embedding dimension m and time delay τ (‘tau’) where required. The software is freely available under version 3 of the GNU Lesser General Public License (LGPLv3) for non-commercial users, and CEPS can be downloaded from Bitbucket. In our illustration on PB, most complexity and entropy measures decreased significantly in response to breathing at 7 breaths per minute, differentiating more clearly than conventional linear, time- and frequency-domain measures between breathing states. In contrast, the Higuchi fractal dimension increased during paced breathing. Conclusions: We have developed the CEPS software as a physiological data visualiser able to integrate state-of-the-art techniques. The interface is designed for clinical research, with a structure that allows new tools to be integrated. The aim is to strengthen collaboration between clinicians and the biomedical community, as demonstrated here by using CEPS to analyse various physiological responses to paced breathing.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
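    CEPS itself is a MATLAB toolbox; as a language-agnostic illustration of one entropy measure such a toolbox typically includes, here is a minimal sample-entropy sketch in Python with a hypothetical RR-interval series (not the CEPS implementation):
```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) = -ln(A/B): B counts length-m template matches within tolerance r
    (Chebyshev distance), A counts length-(m+1) matches, self-matches excluded."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    n_templates = len(x) - m              # same number of templates for both lengths

    def matches(length):
        t = np.array([x[i:i + length] for i in range(n_templates)])
        count = 0
        for i in range(n_templates):
            d = np.max(np.abs(t - t[i]), axis=1)
            count += np.sum(d <= r) - 1   # exclude the self-match
        return count

    B, A = matches(m), matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rr = np.random.default_rng(0).normal(800, 50, 300)   # hypothetical RR intervals (ms)
print("SampEn(2, 0.2*SD):", round(sample_entropy(rr), 3))
```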
  • 87
    Publication Date: 2021-03-08
    Description: The construction sector plays an important role in a country’s economic development, and the financial performance of a company is a good indicator of its financial health and status. In Malaysia, the government encourages the construction industry to develop advanced infrastructure related to health, transport, education and housing. The operations and financial performance of construction sector companies have recently been affected by the COVID-19 pandemic. Additionally, uncertainty plays a vital role in the multi-criteria decision-making (MCDM) process. A review of previous studies shows that no comprehensive study has evaluated the financial performance of construction companies by integrating entropy and fuzzy VIKOR models. Therefore, this paper proposes an MCDM model to evaluate and compare the financial performance of construction companies using an integrated entropy–fuzzy VIKOR model (the entropy-weighting step is sketched after this record). A case study is carried out by evaluating the listed construction companies in Malaysia with the proposed model. The findings of this paper indicate that the company ECONBHD achieves the best financial performance over the study period. The contribution of this paper is to determine the priority of the financial ratios and the ranking of the construction companies with the proposed entropy–fuzzy VIKOR model.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
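    A minimal sketch of the entropy-weighting step used in models of this kind; the fuzzy VIKOR ranking step is not shown, and the decision matrix is hypothetical (rows are companies, columns are financial ratios):
```python
import numpy as np

# Hypothetical decision matrix: 4 companies evaluated on 3 financial ratios
X = np.array([[0.12, 1.8, 0.35],
              [0.08, 2.1, 0.28],
              [0.15, 1.2, 0.40],
              [0.10, 1.6, 0.31]])

P = X / X.sum(axis=0)                              # column-wise normalisation
m = X.shape[0]
E = -np.sum(P * np.log(P), axis=0) / np.log(m)     # entropy of each criterion
weights = (1 - E) / np.sum(1 - E)                  # more discriminating criteria get higher weight
print("criterion weights:", np.round(weights, 3))
```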
  • 88
    Publication Date: 2021-03-08
    Description: One of the biggest challenges in characterizing 2-D image topographies is finding a low-dimensional parameter set that can succinctly describe, not so much the image patterns themselves, but the nature of these patterns. The 2-D cluster variation method (CVM), introduced by Kikuchi in 1951, can characterize very local image pattern distributions using configuration variables, identifying nearest-neighbor, next-nearest-neighbor, and triplet configurations. Using the 2-D CVM, we can characterize 2-D topographies using just two parameters: the activation enthalpy (ε0) and the interaction enthalpy (ε1). Two different initial topographies (“scale-free-like” and “extreme rich club-like”) were each computationally brought to a CVM free energy minimum, for the case where the activation enthalpy was zero and different values were used for the interaction enthalpy. The results are: (1) the computational configuration variable results differ significantly from the analytically predicted values well before ε1 approaches the known divergence as ε1→0.881, (2) the range of potentially useful parameter values, favoring clustering of like-with-like units, is limited to the region where ε0
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 89
    Publication Date: 2021-03-04
    Description: Visual word recognition is a relatively effortless process, but recent research suggests the system involved is malleable, with evidence of increases in behavioural efficiency after prolonged lexical decision task (LDT) performance. However, the extent of neural changes has yet to be characterized in this context. The neural changes that occur could be related to a shift from initially effortful performance that is supported by control-related processing, to efficient task performance that is supported by domain-specific processing. To investigate this, we replicated the British Lexicon Project, and had participants complete 16 h of LDT over several days. We recorded electroencephalography (EEG) at three intervals to track neural change during LDT performance and assessed event-related potentials and brain signal complexity. We found that response times decreased during LDT performance, and there was evidence of neural change through N170, P200, N400, and late positive component (LPC) amplitudes across the EEG sessions, which suggested a shift from control-related to domain-specific processing. We also found widespread complexity decreases alongside localized increases, suggesting that processing became more efficient with specific increases in processing flexibility. Together, these findings suggest that neural processing becomes more efficient and optimized to support prolonged LDT performance.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 90
    Publication Date: 2021-03-06
    Description: Datasets displaying temporal dependencies abound in science and engineering applications, with Markov models representing a simplified and popular view of the temporal dependence structure. In this paper, we consider Bayesian settings that place prior distributions over the parameters of the transition kernel of a Markov model, and seek to characterize the resulting, typically intractable, posterior distributions. We present a Probably Approximately Correct (PAC)-Bayesian analysis of variational Bayes (VB) approximations to tempered Bayesian posterior distributions (the tempered posterior is recalled after this record), bounding the model risk of the VB approximations. Tempered posteriors are known to be robust to model misspecification, and their variational approximations do not suffer the usual problems of overconfident approximations. Our results tie the risk bounds to the mixing and ergodic properties of the Markov data-generating model. We illustrate the PAC-Bayes bounds through a number of example Markov models, and also consider the situation where the Markov model is misspecified.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
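    For orientation, the tempered (fractional) posterior referred to above is standardly defined, for a likelihood p(D|θ), prior π(θ) and temperature 0 < α < 1, as
```latex
% Tempered posterior: the likelihood is raised to a fractional power \alpha,
% which is what gives robustness to model misspecification.
\pi_\alpha(\theta \mid D) \;\propto\; p(D \mid \theta)^{\alpha}\,\pi(\theta), \qquad 0 < \alpha < 1 .
```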
  • 91
    Publication Date: 2021-03-06
    Description: The present study investigates the similarity problem associated with the onset of the Mach reflection of Zel’dovich–von Neumann–Döring (ZND) detonations in the near field. The results reveal that the self-similarity in the frozen-limit regime is strictly valid only within a small scale, i.e., of the order of the induction length. The Mach reflection becomes non-self-similar during the transition of the Mach stem from “frozen” to “reactive” by coupling with the reaction zone. The triple-point trajectory first rises from the self-similar result due to compressive waves generated by the “hot spot”, and then decays after establishment of the reactive Mach stem. It is also found, by removing the restriction, that the frozen limit can be extended to a much larger distance than expected. The obtained results elucidate the physical origin of the onset of Mach reflection with chemical reactions, which has previously been observed in both experiments and numerical simulations.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 92
    Publication Date: 2021-03-06
    Description: This review looks at some of the central relationships between artificial intelligence, psychology, and economics through the lens of information theory, focusing specifically on formal models of decision theory. In doing so, we look at the particular approach each field has adopted and how information theory has informed the development of each field’s ideas. A key theme is expected utility theory, its connection to information theory, the Bayesian approach to decision-making, and forms of (bounded) rationality. What emerges from this review is a broadly unified formal perspective derived from three very different starting points that reflect the unique principles of each field. Each of the three approaches reviewed can, in principle at least, be implemented in a computational model in such a way that, with sufficient computational power, they could be compared with human abilities in complex tasks. However, a central critique that can be applied to all three approaches was first put forward by Savage in The Foundations of Statistics and recently brought to the fore by the economist Binmore: Bayesian approaches to decision-making work in what Savage called ‘small worlds’ but cannot work in ‘large worlds’. This point, in various guises, is central to some of the current debates about the power of artificial intelligence and its relationship to human-like learning and decision-making. Recent work on artificial intelligence has gone some way towards bridging this gap, but significant questions remain to be answered in all three fields in order to make progress in producing realistic models of human decision-making in the real world in which we live.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 93
    Publication Date: 2021-03-08
    Description: Machine Reading Comprehension (MRC) research concerns how to endow machines with the ability to understand given passages and answer questions, which is a challenging problem in the field of natural language processing. To solve the Chinese MRC task efficiently, this paper proposes an Improved Extraction-based Reading Comprehension method with Answer Re-ranking (IERC-AR), consisting of a candidate answer extraction module and a re-ranking module. The candidate answer extraction module uses an improved pre-training language model, RoBERTa-WWM, to generate precise word representations, which can solve the problem of polysemy and is good for capturing Chinese word-level features. The re-ranking module re-evaluates candidate answers based on a self-attention mechanism, which can improve the accuracy of predicting answers. Traditional machine-reading methods generally integrate different modules into a pipeline system, which leads to re-encoding problems and inconsistent data distribution between the training and testing phases; therefore, this paper proposes an end-to-end model architecture for IERC-AR to reasonably integrate the candidate answer extraction and re-ranking modules. The experimental results on the Les MMRC dataset show that IERC-AR outperforms state-of-the-art MRC approaches.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 94
    Publication Date: 2021-03-04
    Description: Linear separability, a core concept in supervised machine learning, refers to whether the labels of a data set can be captured by the simplest possible machine: a linear classifier. In order to quantify linear separability beyond this single bit of information, one needs models of data structure that are parameterized by interpretable quantities and analytically tractable. Here, I address one class of models with these properties, and show how a combinatorial method allows for the computation, in a mean-field approximation, of two useful descriptors of linear separability, one of which is closely related to the popular concept of storage capacity (recalled after this record). I motivate the need for multiple metrics by quantifying linear separability in a simple synthetic data set with controlled correlations between the points and their labels, as well as in the benchmark data set MNIST, where the capacity alone paints an incomplete picture. The analytical results indicate a high degree of “universality”, or robustness with respect to the microscopic parameters controlling data structure.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
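    As background on the storage-capacity notion mentioned above (a classical counting result, not the paper’s combinatorial method), Cover’s function-counting theorem gives the number of linearly separable dichotomies of p points in general position in n dimensions:
```latex
% Cover (1965): number of homogeneously linearly separable dichotomies of p points
% in general position in R^n. The fraction C(p,n)/2^p drops sharply near p = 2n,
% which is the origin of the classical capacity alpha_c = p/n = 2 of a linear classifier.
C(p,n) \;=\; 2\sum_{k=0}^{n-1}\binom{p-1}{k}.
```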
  • 95
    Publication Date: 2021-03-11
    Description: The quantum speed limit (QSL) is the theoretical lower limit on the time for a quantum system to evolve from a given state to another one (the standard closed-system bounds are recalled after this record). Interestingly, it has been shown that non-Markovianity can be used to speed up the dynamics and to lower the QSL time, although this behaviour is not universal. In this paper, we carry the investigation of the connection between the QSL and non-Markovianity further by looking at the effects of P- and CP-divisibility of the dynamical map on the quantum speed limit. We show that the speed-up can also be observed under P- and CP-divisible dynamics, and that the speed-up is not necessarily tied to the transition from P-divisible to non-P-divisible dynamics.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
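    For context, the textbook closed-system bounds behind the QSL (Mandelstam–Tamm and Margolus–Levitin, for unitary evolution to an orthogonal state; the paper itself concerns open-system dynamics) combine into
```latex
% Unified quantum speed limit for evolution between orthogonal states:
% \Delta E is the energy uncertainty and E the mean energy above the ground state.
\tau_{\mathrm{QSL}} \;=\; \max\!\left\{ \frac{\pi\hbar}{2\,\Delta E},\; \frac{\pi\hbar}{2\,E} \right\}.
```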
  • 96
    Publication Date: 2021-03-11
    Description: The authors would like to add the following information to the “Funding” section of their paper [...]
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 97
    Publication Date: 2021-03-11
    Description: Well-evidenced advances of data-driven complex machine learning approaches emerging within the so-called second wave of artificial intelligence (AI) fostered the exploration of possible AI applications in various domains and aspects of human life, practices, and society [...]
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 98
    Publication Date: 2021-03-28
    Description: Digital image correlation may be useful in many different fields of science, one of which is medicine. In this paper, the authors present the results of research aimed at detecting skin micro-shifts caused by the pulsation of veins. A novel technique is proposed that uses digital image correlation (DIC) and filters the resulting shift map to detect pulsating veins. Applying the proposed method, the veins in the forearm were visualized. The technique may be used in the diagnosis of venous stenosis and may also help reduce the number of adverse events during blood collection. A great advantage of the method is that no specialized equipment is required; a typical mobile phone camera is sufficient to perform the test.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 99
    Publication Date: 2021-03-27
    Description: The identification of emergent structures in complex dynamical systems is a formidable challenge. We propose a computationally efficient methodology to address this challenge, based on modeling the state of the system as a set of random variables. Specifically, we present a sieving algorithm to navigate the huge space of all subsets of variables and compare them in terms of a simple index that can be computed without resorting to simulations. We obtain this simple index by studying the asymptotic distribution, in the absence of any coordination, of an information-theoretic measure of coordination among variables, which allows us to fairly compare subsets of variables having different cardinalities. We show that increasing the number of observations allows the identification of larger and larger subsets. As an example of a relevant application, we make use of a paradigmatic case regarding the identification of groups in autocatalytic sets of reactions, a chemical situation related to the origin-of-life problem.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 100
    Publication Date: 2021-03-26
    Description: With more than half the world’s population online, social media plays a very important role in the lives of individuals and businesses alike. Social media enables businesses to advertise their products, build brand value, and reach out to their customers. To leverage these platforms, it is important for businesses to process customer feedback in the form of posts and tweets. Sentiment analysis is the process of identifying the emotion, either positive, negative or neutral, associated with such social media texts. The presence of sarcasm in texts is the main hindrance to the performance of sentiment analysis. Sarcasm is a linguistic expression often used to communicate the opposite of what is said, usually something very unpleasant, with the intention to insult or ridicule. The inherent ambiguity of sarcastic expressions makes sarcasm detection very difficult. In this work, we focus on detecting sarcasm in textual conversations from various social networking platforms and online media. To this end, we develop an interpretable deep learning model using multi-head self-attention and gated recurrent units (a minimal sketch of this kind of architecture follows this record). The multi-head self-attention module aids in identifying crucial sarcastic cue-words in the input, and the recurrent units learn long-range dependencies between these cue-words to better classify the input text. We show the effectiveness of our approach by achieving state-of-the-art results on multiple datasets from social networking platforms and online media. Models trained using our proposed approach are easily interpretable and enable the identification of sarcastic cues in the input text that contribute to the final classification score. We visualize the learned attention weights on a few sample input texts to showcase the effectiveness and interpretability of our model.
    Electronic ISSN: 1099-4300
    Topics: Chemistry and Pharmacology , Physics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
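    A minimal sketch of the kind of architecture described above, with PyTorch assumed and all dimensions and names hypothetical: multi-head self-attention over token embeddings, followed by a GRU and a binary sarcasm classifier; the returned attention weights are what would be visualized for interpretability.
```python
import torch
import torch.nn as nn

class SarcasmClassifier(nn.Module):
    def __init__(self, vocab_size=20000, emb_dim=128, heads=8, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.attn = nn.MultiheadAttention(emb_dim, num_heads=heads, batch_first=True)
        self.gru = nn.GRU(emb_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, token_ids):
        x = self.embed(token_ids)                  # (batch, seq, emb)
        attended, weights = self.attn(x, x, x)     # self-attention highlights cue-words
        _, h = self.gru(attended)                  # GRU captures long-range dependencies
        return torch.sigmoid(self.out(h[-1])).squeeze(-1), weights

model = SarcasmClassifier()
tokens = torch.randint(0, 20000, (2, 30))          # two hypothetical tokenised texts
probs, attn_weights = model(tokens)
print(probs.shape, attn_weights.shape)              # torch.Size([2]), torch.Size([2, 30, 30])
```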