ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

Filter
  • Articles  (2,272)
  • 2015-2019  (2,272)
  • 1940-1944
  • 1930-1934
  • Information  (1,223)
  • 151794
  • Computer Science  (2,272)
  • Medicine
  • 1
    Publication Date: 2019
    Description: Service recommendation is an important means of service selection. Traditional trust-based Web service recommendation methods ignore the influence of typical data sources, such as service information and interaction logs, on the similarity calculation of user preferences, and give insufficient consideration to dynamic trust relationships. To address these problems, a novel approach to Web service recommendation based on advanced trust relationships is presented. After considering the influence of indirect trust paths, an improved calculation of the indirect trust degree is proposed. By quantifying the popularity of a service, a method for calculating user-preference similarity is investigated. Furthermore, a dynamic trust-adjustment mechanism is designed by differentiating the effect of each service recommendation. Integrating these efforts, a service recommendation mechanism is introduced, in which a new service recommendation algorithm is described. Experimental results show that, compared with existing methods, the proposed approach not only achieves higher recommendation accuracy but also resists attacks from malicious users more effectively.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
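The abstract above mentions an improved calculation of indirect trust over trust paths. As a rough illustration of the general idea (a common multiplicative-path convention, not the paper's actual formulas), indirect trust can be estimated as the best product of direct-trust values along any short path:

```python
def indirect_trust(trust, source, target, max_hops=3):
    """Estimate indirect trust as the best product of direct-trust
    values along any path of at most `max_hops` edges.
    `trust` maps (u, v) -> direct trust in [0, 1]."""
    best = 0.0

    def dfs(node, acc, hops, visited):
        nonlocal best
        if hops > max_hops:
            return
        for (u, v), t in trust.items():
            if u == node and v not in visited:
                value = acc * t  # trust decays multiplicatively along the path
                if v == target:
                    best = max(best, value)
                else:
                    dfs(v, value, hops + 1, visited | {v})

    dfs(source, 1.0, 1, {source})
    return best

# Toy trust graph: A trusts B strongly, B trusts C, plus a weaker A->D->C path.
trust_graph = {("A", "B"): 0.9, ("B", "C"): 0.8,
               ("A", "D"): 0.5, ("D", "C"): 0.9}
print(indirect_trust(trust_graph, "A", "C"))  # best path A->B->C, approx. 0.72
```

The paper additionally weights paths, popularity, and recommendation feedback; none of that is modeled here.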
  • 2
    Publication Date: 2019
    Description: We explore the class of positive integers n that admit idempotent factorizations n = p̄q̄ such that λ(n) ∣ (p̄ − 1)(q̄ − 1), where λ is the Carmichael lambda function. Idempotent factorizations with p̄ and q̄ prime have received the most attention due to their cryptographic advantages, but there are an infinite number of n with idempotent factorizations containing composite p̄ and/or q̄. Idempotent factorizations are exactly those p̄ and q̄ that generate correctly functioning keys in the Rivest–Shamir–Adleman (RSA) 2-prime protocol with n as the modulus. While the resulting p̄ and q̄ have no cryptographic utility and therefore should never be employed in that capacity, idempotent factorizations warrant study in their own right as they live at the intersection of multiple hard problems in computer science and number theory. We present some analytical results here. We also demonstrate the existence of maximally idempotent integers, those n for which all bipartite factorizations are idempotent. We show how to construct them, and present preliminary results on their distribution.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
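The divisibility condition quoted above, λ(n) ∣ (p̄ − 1)(q̄ − 1), is easy to check computationally. A small stdlib-only sketch (trial-division factoring, fine for toy n but not cryptographic sizes):

```python
import math

def factorize(n):
    """Prime factorization by trial division: returns {prime: exponent}."""
    factors = {}
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors[d] = factors.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        factors[n] = factors.get(n, 0) + 1
    return factors

def carmichael_lambda(n):
    """Carmichael's function: lcm of lambda over prime-power factors,
    with the special halving rule for powers of 2 at 8 and above."""
    result = 1
    for p, k in factorize(n).items():
        if p == 2 and k >= 3:
            lam = 2 ** (k - 2)
        else:
            lam = (p - 1) * p ** (k - 1)
        result = math.lcm(result, lam)
    return result

def is_idempotent_factorization(n, p_bar, q_bar):
    """True iff n == p_bar * q_bar and lambda(n) divides (p_bar-1)(q_bar-1)."""
    return (p_bar * q_bar == n
            and (p_bar - 1) * (q_bar - 1) % carmichael_lambda(n) == 0)

print(carmichael_lambda(15))                    # lcm(2, 4) = 4
print(is_idempotent_factorization(105, 15, 7))  # True: composite factor 15 works
```

For n = 105 with p̄ = 15 (composite) and q̄ = 7: λ(105) = lcm(2, 4, 6) = 12 divides (14)(6) = 84, matching the abstract's point that composite factors can qualify.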
  • 3
    Publication Date: 2019
    Description: Recommender systems are nowadays an indispensable part of most personalized systems implementing information access and content delivery, supporting a great variety of user activities [...]
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 4
    Publication Date: 2019
    Description: Finite element data form an important basis for engineers to undertake analysis and research. In most cases, it is difficult to generate the internal sections of finite element data, and professional operations are required. To display the internal data of entities, a method for generating arbitrary sections of finite element data based on radial basis function (RBF) interpolation is proposed in this paper. The RBF interpolation function is used to realize arbitrary surface cutting of the entity, and the section can be generated by triangulation of the discrete tangent points. Experimental studies have shown that the method makes it very convenient for users to obtain visualization results for an arbitrary section through simple and intuitive interactions.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
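As an illustration of the RBF interpolation idea underlying the sectioning method (the paper's kernel and cutting pipeline are not specified here; a Gaussian kernel and a toy 2-D scalar field are assumed):

```python
import math

def rbf_fit(points, values, eps=1.0):
    """Fit weights w so that sum_j w_j * phi(|x - x_j|) interpolates the
    sampled values, with Gaussian kernel phi(r) = exp(-(eps*r)^2)."""
    n = len(points)
    phi = lambda r: math.exp(-(eps * r) ** 2)
    dist = math.dist
    # Augmented interpolation matrix: A[i][j] = phi(|x_i - x_j|) | values[i]
    a = [[phi(dist(points[i], points[j])) for j in range(n)] + [values[i]]
         for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        pivot = max(range(col, n), key=lambda r_: abs(a[r_][col]))
        a[col], a[pivot] = a[pivot], a[col]
        for row in range(col + 1, n):
            f = a[row][col] / a[col][col]
            for k in range(col, n + 1):
                a[row][k] -= f * a[col][k]
    w = [0.0] * n
    for row in range(n - 1, -1, -1):
        w[row] = (a[row][n]
                  - sum(a[row][k] * w[k] for k in range(row + 1, n))) / a[row][row]
    return lambda x: sum(wj * phi(dist(x, pj)) for wj, pj in zip(w, points))

# Interpolate a scalar field sampled at scattered 2-D nodes; the fitted
# function can then be evaluated anywhere on an arbitrary cut surface.
nodes = [(0, 0), (1, 0), (0, 1), (1, 1), (0.5, 0.5)]
samples = [x + y for x, y in nodes]     # field f(x, y) = x + y
f = rbf_fit(nodes, samples, eps=1.0)
print(round(f((0.5, 0.5)), 3))          # interpolant is exact at a node: 1.0
```

The paper's method would evaluate such an interpolant over the cut surface and triangulate the resulting points; that step is omitted here.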
  • 5
    Publication Date: 2019
    Description: The number of documents published on the Web in languages other than English grows every year. As a consequence, the need to extract useful information from different languages increases, highlighting the importance of research into Open Information Extraction (OIE) techniques. Different OIE methods have dealt with features from a single language; however, few approaches tackle multilingual aspects. In those approaches, multilingualism is restricted to processing text in different languages rather than exploring cross-linguistic resources, which results in low precision due to the use of general rules. Multilingual methods have been applied to numerous problems in Natural Language Processing, achieving satisfactory results and demonstrating that knowledge acquired for one language can be transferred to other languages to improve the quality of the facts extracted. We argue that a multilingual approach can enhance OIE methods, is well suited to evaluating and comparing OIE systems, and can therefore be applied to the collected facts. In this work, we discuss how transferring knowledge between languages can increase acquisition in multilingual approaches. We provide a roadmap of the Multilingual Open IE area with respect to state-of-the-art studies. Additionally, we evaluate the transfer of knowledge to improve the quality of the facts extracted in each language. Moreover, we discuss the importance of a parallel corpus for evaluating and comparing multilingual systems.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 6
    Publication Date: 2019
    Description: This paper aims to explore the current status, research trends and hotspots related to the field of infrared detection technology through bibliometric analysis and visualization techniques, based on the Science Citation Index Expanded (SCIE) and Social Sciences Citation Index (SSCI) articles published between 1990 and 2018, using the VOSviewer and CiteSpace software tools. Based on our analysis, we first present the spatiotemporal distribution of the literature related to infrared detection technology, including annual publications, origin country/region, main research organizations, and source publications. Then, we report the main subject categories involved in infrared detection technology. Furthermore, we adopt literature co-citation, author co-citation, keyword co-occurrence and timeline visualization analyses to visually explore the research fronts and trends, and present the evolution of infrared detection technology research. The results show that China, the USA and Italy are the three most active countries in infrared detection technology research and that the Centre National de la Recherche Scientifique has the largest number of publications among related organizations. The most prominent research hotspots in the past five years are vibration thermal imaging, pulse thermal imaging, photonic crystals, skin temperature, remote sensing technology, and detection of delamination defects in concrete. Future research on infrared detection technology is expected to move from qualitative to quantitative studies, toward engineering applications, and toward combining infrared detection with other detection techniques. The proposed approach based on scientific knowledge graph analysis can be used to establish reference information and a research basis for the application and development of methods in the domain of infrared detection technology studies.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 7
    Publication Date: 2019
    Description: The literature on big data analytics and firm performance is still fragmented and lacking in attempts to integrate the current studies’ results. This study aims to provide a systematic review of contributions related to big data analytics and firm performance. The authors assess papers listed in the Web of Science index. This study identifies the factors that may influence the adoption of big data analytics in various parts of an organization and categorizes the diverse types of performance that big data analytics can address. Directions for future research are developed from the results. This systematic review proposes to create avenues for both conceptual and empirical research streams by emphasizing the importance of big data analytics in improving firm performance. In addition, this review offers both scholars and practitioners an increased understanding of the link between big data analytics and firm performance.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 8
    Publication Date: 2019
    Description: Evaluating term translation quality in machine translation (MT), which is usually done by domain experts, is a time-consuming and expensive task. In fact, this is unimaginable in an industrial setting where customised MT systems often need to be updated for many reasons (e.g., availability of new training data, leading MT techniques). To the best of our knowledge, as of yet, there is no publicly available solution to evaluate terminology translation in MT automatically. Hence, there is a genuine need for a faster and less expensive solution to this problem, which could help end-users to identify term translation problems in MT instantly. This study presents a faster and less expensive strategy for evaluating terminology translation in MT. High correlations of our evaluation results with human judgements demonstrate the effectiveness of the proposed solution. The paper also introduces a classification framework, TermCat, that can automatically classify term translation-related errors and expose specific problems in relation to terminology translation in MT. We carried out our experiments with a low-resource language pair, English–Hindi, and found that our classifier, whose accuracy varies across translation directions, error classes, the morphological nature of the languages, and MT models, generally performs competently in the terminology translation classification task.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 9
    Publication Date: 2019
    Description: Radar signal processing mainly focuses on target detection, classification, estimation, and filtering. Compressed sensing radar (CSR) technology can potentially provide additional tools to simultaneously reduce computational complexity and effectively solve inference problems; CSR allows direct compressive signal processing without the need to reconstruct the signal. This study aimed to solve the problem of CSR detection without signal recovery by optimizing the transmit waveform. A waveform optimization method was therefore introduced to improve the output signal-to-interference-plus-noise ratio (SINR) in the case where the target signal is corrupted by colored interference and noise with known statistical characteristics. Two different target models are discussed: deterministic and random. In the case of a deterministic target, the optimum transmit waveform is derived by maximizing the SINR, and a suboptimum solution is also presented. In the case of a random target, an iterative waveform optimization method is proposed to maximize the output SINR. This approach ensures that SINR performance improves in each iteration step. The performance of these methods is illustrated by computer simulation.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 10
    Publication Date: 2019
    Description: Appropriate business process management (BPM) within an organization can help it attain its organizational goals. It is particularly important to manage the lifecycle of these processes effectively, so that the organization can keep improving its performance and build competitiveness across the company. This paper presents process discovery and shows how it can be used within a broader framework supporting self-organization in BPM. Process discovery is intrinsically associated with the process lifecycle. We have made a preliminary evaluation of the usefulness of our approach using a generated log file. We also compared visualizations of the outcomes of our approach across different cases and showed performance characteristics of the cash loan sales process.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 11
    Publication Date: 2019
    Description: Correlations between observed data are at the heart of all empirical research that strives for establishing lawful regularities. However, there are numerous ways to assess these correlations, and there are numerous ways to make sense of them. This essay presents a bird’s eye perspective on different interpretive schemes to understand correlations. It is designed as a comparative survey of the basic concepts. Many important details to back it up can be found in the relevant technical literature. Correlations can (1) extend over time (diachronic correlations) or they can (2) relate data in an atemporal way (synchronic correlations). Within class (1), the standard interpretive accounts are based on causal models or on predictive models that are not necessarily causal. Examples within class (2) are (mainly unsupervised) data mining approaches, relations between domains (multiscale systems), nonlocal quantum correlations, and eventually correlations between the mental and the physical.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
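The diachronic/synchronic distinction drawn above can be made concrete in a few lines: a plain Pearson coefficient measures a synchronic (same-time) correlation, while correlating x(t) with y(t + lag) probes a diachronic one. A minimal sketch with invented data:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def lagged_correlation(xs, ys, lag):
    """Diachronic correlation: correlate x(t) against y(t + lag)."""
    if lag > 0:
        return pearson(xs[:-lag], ys[lag:])
    return pearson(xs, ys)

# y simply repeats x one step later (y[t] = x[t-1], with y[0] = 0).
x = [1, 3, 2, 5, 4, 6]
y = [0, 1, 3, 2, 5, 4]
print(pearson(x, y))               # synchronic view: 0.6
print(lagged_correlation(x, y, 1)) # diachronic view at lag 1: 1.0
```

The lagged view exposes a perfect regularity that the same-time view only hints at, which is the essay's point that the interpretive scheme matters.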
  • 12
    Publication Date: 2019
    Description: The capacity of private information retrieval (PIR) from databases coded using maximum distance separable (MDS) codes was previously characterized by Banawan and Ulukus, under the assumption that the messages are encoded and stored separately in the databases. This assumption was also usually made in other related works in the literature, and this capacity is colloquially referred to as the MDS-PIR capacity. In this work, we consider the question of whether and when this capacity barrier can be broken through joint encoding and storage of the messages. Our main results are two classes of novel code constructions that allow joint encoding, together with corresponding PIR protocols that indeed outperform separate MDS-coded systems. Moreover, we show that a simple but novel expansion technique allows us to generalize these two classes of codes, resulting in a wider range of cases in which this capacity barrier can be broken.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 13
    Publication Date: 2019
    Description: The advent of utility computing has revolutionized almost every sector of traditional software development. Commercial cloud computing services in particular, pioneered by the likes of Amazon, Google and Microsoft, have provided an unprecedented opportunity for the fast and sustainable development of complex distributed systems. Nevertheless, existing models and tools aim primarily at systems where resource usage, by humans and bots alike, is logically and physically quite dispersed, resulting in a low likelihood of conflicting resource access. However, a number of resource-intensive applications, such as Massively Multiplayer Online Games (MMOGs) and large-scale simulations, introduce a requirement for a very large common state with many actors accessing it simultaneously, and thus a high likelihood of conflicting resource access. This paper presents a systematic mapping study of the state of the art in software technology aiming explicitly to support the development of MMOGs, a class of large-scale, resource-intensive software systems. By examining the main focus of a diverse set of related publications, we identify a list of criteria that are important for MMOG development. Then, we categorize the selected studies based on the inferred criteria in order to compare their approaches, unveil the challenges faced in each of them, and reveal research trends that might be present. Finally, we attempt to identify research directions which appear promising for enabling the use of standardized technology for this class of systems.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 14
    Publication Date: 2019
    Description: This article empirically demonstrates the impacts of truthfully sharing forecast information and using forecast combinations in a fast-moving-consumer-goods (FMCG) supply chain. Although it is known a priori that sharing information improves the overall efficiency of a supply chain, information such as pricing or promotional strategy is often kept proprietary for competitive reasons. In this regard, it is herein shown that simply sharing the retail-level forecasts—this does not reveal the exact business strategy, due to the effect of omni-channel sales—yields nearly all the benefits of sharing all pertinent information that influences FMCG demand. In addition, various forecast combination methods are used to further stabilize the forecasts, in situations where multiple forecasting models are used during operation. In other words, it is shown that combining forecasts is less risky than “betting” on any component model.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
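The claim above that combining forecasts is less risky than "betting" on any single model can be illustrated with a toy equal-weight combination (invented numbers, not the paper's FMCG data):

```python
def combine(forecasts, weights=None):
    """Point-wise weighted combination of component forecasts;
    equal weights by default."""
    if weights is None:
        weights = [1 / len(forecasts)] * len(forecasts)
    return [sum(w * f[t] for w, f in zip(weights, forecasts))
            for t in range(len(forecasts[0]))]

def mae(forecast, actual):
    """Mean absolute error of a forecast against actual demand."""
    return sum(abs(f - a) for f, a in zip(forecast, actual)) / len(actual)

actual  = [100, 120, 110, 130]
model_a = [ 90, 125, 105, 140]   # errors partly opposite in sign ...
model_b = [110, 110, 115, 125]   # ... to this model's errors
combo = combine([model_a, model_b])
print(mae(model_a, actual), mae(model_b, actual), mae(combo, actual))
# 7.5 7.5 1.25 -- the combination cancels opposing errors
```

The cancellation here is deliberately extreme; in practice the combination's error typically lands between the best and worst component, which is exactly the risk-reduction argument.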
  • 15
    Publication Date: 2019
    Description: An important problem in machine learning is that, when using more than two labels, it is very difficult to construct and optimize a group of learning functions that are still useful when the prior distribution of instances is changed. To resolve this problem, semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms are combined to form a systematic solution. A semantic channel in G theory consists of a group of truth functions or membership functions. In comparison with the likelihood functions, Bayesian posteriors, and Logistic functions that are typically used in popular methods, membership functions are more convenient to use, providing learning functions that do not suffer from the above problem. In Logical Bayesian Inference (LBI), every label is independently learned. For multilabel learning, we can directly obtain a group of optimized membership functions from a large enough sample with labels, without preparing different samples for different labels. Furthermore, a group of Channel Matching (CM) algorithms are developed for machine learning. For the Maximum Mutual Information (MMI) classification of three classes with Gaussian distributions in a two-dimensional feature space, only 2–3 iterations are required for the mutual information between three classes and three labels to surpass 99% of the MMI for most initial partitions. For mixture models, the Expectation-Maximization (EM) algorithm is improved to form the CM-EM algorithm, which can outperform the EM algorithm when the mixture ratios are imbalanced, or when local convergence exists. The CM iteration algorithm needs to be combined with neural networks for MMI classification in high-dimensional feature spaces. LBI needs further investigation for the unification of statistics and logic.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 16
    Publication Date: 2019
    Description: Human eye movement is one of the most important functions for understanding our surroundings. When a human eye processes a scene, it quickly focuses on dominant parts of the scene, a process commonly known as visual saliency detection or visual attention prediction. Recently, neural networks have been used to predict visual saliency. This paper proposes a deep learning encoder-decoder architecture, based on a transfer learning technique, to predict visual saliency. In the proposed model, visual features are extracted through convolutional layers from raw images to predict visual saliency. In addition, the proposed model uses the VGG-16 network for semantic segmentation, which uses a pixel classification layer to predict the categorical label for every pixel in an input image. The proposed model is applied to several datasets, including TORONTO, MIT300, MIT1003, and DUT-OMRON, to illustrate its efficiency. The results of the proposed model are quantitatively and qualitatively compared to classic and state-of-the-art deep learning models. Using the proposed deep learning model, a global accuracy of up to 96.22% is achieved for the prediction of visual saliency.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 17
    Publication Date: 2019
    Description: In this paper, I first review signal detection theory (SDT) approaches to perception, and then discuss why it is thought that SDT theory implies that increasing attention improves performance. Our experiments have shown, however, that this is not necessarily true. Subjects had either focused attention on two of four possible locations in the visual field, or diffused attention to all four locations. The stimuli (offset letters), locations, conditions, and tasks were all known in advance, responses were forced-choice, subjects were properly instructed and motivated, and instructions were always valid—conditions which should optimize signal detection. Relative to diffusing attention, focusing attention indeed benefitted discrimination of forward from backward pointing Es. However, focusing made it harder to identify a randomly chosen one of 20 letters. That focusing can either aid or disrupt performance, even when cues are valid and conditions are idealized, is surprising, but it can also be explained by SDT, as shown here. These results warn the experimental researcher not to confuse focusing attention with enhancing performance, and warn the modeler not to assume that SDT is unequivocal.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
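The SDT quantities underlying the discussion above, sensitivity d′ and criterion c, follow directly from hit and false-alarm rates under the standard equal-variance Gaussian model (these are textbook formulas, not specific to this paper's experiments):

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Sensitivity d' = z(H) - z(FA), with z the inverse normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

def criterion(hit_rate, fa_rate):
    """Response bias c = -(z(H) + z(FA)) / 2; c = 0 means unbiased."""
    z = NormalDist().inv_cdf
    return -(z(hit_rate) + z(fa_rate)) / 2

# An observer with ~84% hits and ~16% false alarms sits one standard
# deviation on each side of the criterion: d' near 2, no bias.
print(d_prime(0.8413, 0.1587))   # approx. 2.0
print(criterion(0.8413, 0.1587)) # approx. 0.0
```

Comparing d′ between focused- and diffused-attention conditions is the kind of analysis the abstract's discrimination and identification results rest on.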
  • 18
    Publication Date: 2019
    Description: This paper deals with taṣawwur, the Arabic translation in Averroes’ Great Commentary of the term τῶν ἀδιαιρέτων νόησις (“ton adiaireton noesis”, thinking of the indivisibles) in Aristotle’s De anima, and with its Latin translation from the Arabic as (in-)formatio, as quoted by Albertus Magnus [...]
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 19
    Publication Date: 2019
    Description: Opportunistic networks are considered promising network structures for replacing traditional infrastructure-based communication by enabling smart mobile devices to contact each other directly within a fixed communication area. Because of the intermittent and unstable connections between sources and destinations, message routing and forwarding in opportunistic networks have recently become challenging problems. In this paper, to improve the data dissemination environment, we propose an improved routing-forwarding strategy for opportunistic networks that utilizes node profiles and location prediction, and that comprises three consecutive phases: collecting and updating routing state information, community detection and optimization, and node location prediction. Each mobile node in the network is able to establish a network routing matrix after the entire process of information collecting and updating. Because population is concentrated in urban areas and relatively sparse in remote areas, the distribution of location predictions roughly presents a type of symmetry in opportunistic networks. The community optimization and location prediction mechanisms can then be regarded as a significant foundation for data dissemination in the networks. Ultimately, experimental results demonstrate that, compared with four other routing strategies, the proposed algorithm slightly enhances the delivery ratio and substantially reduces the network overhead and end-to-end delay.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 20
    Publication Date: 2019
    Description: With digital media, not only are media extensions of their human users, as McLuhan posited, but there is a flip or reversal in which the human users of digital media become an extension of those digital media, as these media scoop up their data and use them to the advantage of those that control these media. The implications of this loss of privacy as we become “an item in a data bank” are explored, and the field of captology is described. The feedback of the users of digital media becomes the feedforward for those media.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 21
    Publication Date: 2019
    Description: Anomaly detection in network traffic flows is a non-trivial problem in the field of network security due to the complexity of network traffic. However, most machine learning-based detection methods focus on network anomaly detection and ignore user behavior anomaly detection. In real scenarios, anomalous network behavior may harm user interests. In this paper, we propose an anomaly detection model based on time-decay closed frequent patterns to address this problem. The model mines closed frequent patterns from the network traffic of each user and uses a time-decay factor to distinguish the weight of current and historical network traffic. Because of the dynamic nature of user network behavior, a detection model update strategy is provided in the anomaly detection framework. Additionally, the closed frequent patterns can provide interpretable explanations for anomalies. Experimental results show that the proposed method can detect user behavior anomalies, and that its network anomaly detection performance is similar to the state-of-the-art methods and significantly better than the baseline methods.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
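The time-decay weighting described above can be sketched as follows. This toy miner weights each session's itemsets by a decay factor per time step of age, so recent traffic dominates the supports; it computes decayed supports for all itemsets, not the *closed* patterns the paper actually mines:

```python
from collections import Counter
from itertools import combinations

def decayed_supports(sessions, decay=0.9, min_support=1.0):
    """Time-decayed itemset supports: each session's contribution is
    down-weighted by `decay` per time step of age (newest session last).
    Returns only patterns whose decayed support reaches min_support."""
    supports = Counter()
    newest = len(sessions) - 1
    for age_index, items in enumerate(sessions):
        weight = decay ** (newest - age_index)   # older -> smaller weight
        for size in range(1, len(items) + 1):
            for pattern in combinations(sorted(items), size):
                supports[pattern] += weight
    return {p: s for p, s in supports.items() if s >= min_support}

# Three sessions of one user's traffic features, oldest first.
sessions = [{"dns", "http"}, {"http", "ssh"}, {"http"}]
sup = decayed_supports(sessions, decay=0.5)
print(sup[("http",)])   # 0.25 + 0.5 + 1.0 = 1.75
```

A pattern that was frequent only in old traffic decays below the threshold, which is how such a model adapts to changing user behavior.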
  • 22
    Publication Date: 2019
    Description: The current trend in the European Union (EU) is to support the development of online dispute resolution (ODR), which saves financial and human resources. Research articles therefore mainly deal with the design of new ODR solutions, without researching the social aspects of using different kinds of ODR solutions. For this reason, the main aim of this article is an empirical evaluation of two kinds of ODR solutions in business-to-business (B2B) relationships from the perspective of a selected social category. The article focuses on: (1) comparing unassisted and smart-assisted negotiation using an artificial intelligence approach; (2) the satisfaction and attitudes of Generation Y members from the Czech and Slovak Republics towards different ways of negotiating. The conclusions of this study can help researchers to design or improve existing ODR solutions, and companies to choose the most suitable managers from Generation Y for B2B negotiation. The results show that Generation Y members prefer computer-mediated communication to face-to-face negotiation, and that participants were more satisfied with the negotiation process when using smart-assisted negotiation. Through computer-mediated negotiation, even sellers with lower emotional stability can maintain an advantageous position. Similarly, buyers with lower agreeableness or higher extraversion can negotiate more favorable terms and offset their loss.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 23
    Publication Date: 2019
    Description: Unmanned and unwomaned aerial vehicles (UAV), or drones, are breaking and creating new boundaries of image-based communication. Using social network analysis and critical discourse analysis, we examine the 60 most popular question threads about drones on Zhihu, China’s largest social question answering platform. We trace how controversial issues around these supposedly novel tech products are mediated, domesticated, visualized, or marginalized via digital representational technology. Supported by Zhihu’s topic categorization algorithm, drone-related discussions form topic clusters. These topic clusters gain currency in the government-regulated cyberspace, where their meanings remain open to widely divergent interpretations and mediation by various agents. We find that the largest drone company DJI occupies a central and strongly interconnected position in the discussions. Drones are, moreover, represented as objects of consumption, technological advancement, national future, and uncertainty. At the same time, the sense-making process of drone-related discussions evokes emerging sets of narrative user identities with potential political effects. Users engage in digital representational technologies publicly and collectively to raise questions and represent their views on new technologies. Therefore, we argue that platforms like Zhihu are essential when studying views of the Chinese citizenry towards technological developments.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 24
    Publication Date: 2019
    Description: The ageing population has become an increasing phenomenon world-wide, leading to a growing need for specialised help. Improving the quality of life of older people can lower the risk of depression and social isolation, but it requires a multi-dimensional approach through continuous monitoring and training of the main health domains (e.g., cognitive, motor, nutritional and behavioural). To this end, the use of mobile and e-health services tailored to the user’s needs can help stabilise their health conditions, in terms of physical, mental, and social capabilities. In this context, the INTESA project proposes a set of personalised monitoring and rehabilitation services for older people, based on mobile and wearable technologies ready to be used either at home or in residential long-term care facilities. We evaluated the proposed solution by deploying a suite of services in a nursing home and defining customised protocols to involve both guests (primary users) and nursing care personnel (secondary users). In this paper, we present the extended results obtained after the one-year period of experimentation in terms of technical reliability of the system, Quality of Experience, and user acceptance for both the user categories.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 25
    Publication Date: 2019
    Description: Aiming at granting wide access to their contents, online information providers often choose not to have registered users, and therefore must give up personalization. In this paper, we focus on the case of non-personalized news recommender systems, and explore persuasive techniques that can, nonetheless, be used to enhance recommendation presentation, with the aim of capturing the user’s interest on suggested items leveraging the way news is perceived. We present the results of two evaluations “in the wild”, carried out in the context of a real online magazine and based on data from 16,134 and 20,933 user sessions, respectively, where we empirically assessed the effectiveness of persuasion strategies which exploit logical fallacies and other techniques. Logical fallacies are inferential schemes known since antiquity that, even if formally invalid, appear as plausible and are therefore psychologically persuasive. In particular, our evaluations allowed us to compare three persuasive scenarios based on the Argumentum Ad Populum fallacy, on a modified version of the Argumentum ad Populum fallacy (Group-Ad Populum), and on no fallacy (neutral condition), respectively. Moreover, we studied the effects of the Accent Fallacy (in its visual variant), and of positive vs. negative Framing.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 26
    Publication Date: 2019
    Description: Recently, with the development of big data and 5G networks, the number of intelligent mobile devices has increased dramatically, and the data that need to be transmitted and processed in the networks have grown exponentially. It is difficult for the end-to-end communication mechanism proposed by traditional routing algorithms to handle this massive data transmission between mobile devices. Consequently, opportunistic social networks propose that effective data transmission can be implemented by selecting appropriate relay nodes. At present, most existing routing algorithms find suitable next-hop nodes by comparing the degree of similarity between nodes. However, when evaluating the similarity between two mobile nodes, these routing algorithms consider either only the mobility similarity between nodes or only the social similarity between nodes. To improve the data dissemination environment, this paper proposes an effective data transmission strategy (MSSN) utilizing mobile and social similarities in opportunistic social networks. In the proposed strategy, we first calculate the mobile similarity between neighbor nodes and the destination, set a mobile similarity threshold, and then compute the social similarity between those nodes whose mobile similarity is greater than the threshold. The nodes with a high mobile similarity degree to the destination node serve as the reliable relay nodes. Simulation experiments comparing the proposed algorithm with other existing opportunistic social network algorithms show that its delivery ratio is 0.80 on average, its average end-to-end delay is 23.1% lower than that of the FCNS algorithm (a fuzzy routing-forwarding algorithm exploiting comprehensive node similarity in opportunistic social networks), and its overhead is on average 14.9% lower than that of the Effective Information Transmission Based on Socialization Nodes (EIMST) algorithm.
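    As an illustration of the two-stage relay selection described above, here is a minimal sketch: neighbours are first filtered by a mobile-similarity threshold, and the survivors are then ranked by social similarity. The velocity-cosine and contact-set Jaccard metrics, the node dictionaries, and the threshold value are simplified placeholders, not the MSSN formulas.

```python
# Toy two-stage relay selection: mobile-similarity filter, then
# social-similarity ranking. Metrics are illustrative stand-ins.
import math

def mobile_similarity(node_vel, dest_vel):
    """Cosine similarity of 2-D velocity vectors as a stand-in metric."""
    dot = node_vel[0] * dest_vel[0] + node_vel[1] * dest_vel[1]
    norms = math.hypot(*node_vel) * math.hypot(*dest_vel)
    return dot / norms if norms else 0.0

def social_similarity(node_contacts, dest_contacts):
    """Jaccard overlap of contact sets as a stand-in social metric."""
    union = node_contacts | dest_contacts
    return len(node_contacts & dest_contacts) / len(union) if union else 0.0

def select_relays(neighbours, dest, threshold=0.5):
    """Neighbours passing the mobile-similarity threshold, ordered by
    descending social similarity to the destination."""
    candidates = [n for n in neighbours
                  if mobile_similarity(n["vel"], dest["vel"]) > threshold]
    return sorted(candidates,
                  key=lambda n: social_similarity(n["contacts"], dest["contacts"]),
                  reverse=True)

dest = {"vel": (1.0, 0.0), "contacts": {"a", "b", "c"}}
neighbours = [
    {"id": 1, "vel": (0.9, 0.1), "contacts": {"a", "b"}},        # similar motion, strong ties
    {"id": 2, "vel": (-1.0, 0.0), "contacts": {"a", "b", "c"}},  # moving away from destination
    {"id": 3, "vel": (1.0, 0.2), "contacts": {"x"}},             # similar motion, weak ties
]
relays = select_relays(neighbours, dest)
print([n["id"] for n in relays])  # node 2 is filtered out despite its social ties
```

    Note how node 2 never reaches the social-similarity stage: its mobile similarity to the destination is negative, which is the point of thresholding before ranking.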
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 27
    Publication Date: 2019
    Description: Communication languages convey information through the use of a set of symbols or units. Typically, this unit is the word. When developing language technologies, because words in a language do not all have the same prior probability, there may not be sufficient training data to model each word. Furthermore, the training data may not cover all possible words in the language. Due to these data sparsity and word coverage issues, language technologies employ modeling of subword units or subunits, which are based on prior linguistic knowledge. For instance, the development of speech technologies such as automatic speech recognition systems presumes that there exists a phonetic dictionary, or at least a writing system, for the target language. Such knowledge is not available for all languages in the world. In that direction, this article develops a hidden Markov model-based abstract methodology to extract subword units given only pairwise comparisons between utterances (or realizations of words in the mode of communication), i.e., whether two utterances correspond to the same word or not. We validate the proposed methodology through investigations on spoken language and sign language. In the case of spoken language, we demonstrate that the proposed methodology can lead to the discovery of a phone set and the development of a phonetic dictionary. In the case of sign language, we demonstrate how hand movement information can be effectively modeled for sign language processing and synthesized back to gain insight into the derived subunits.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 28
    Publication Date: 2019
    Description: Though the self-portrait has been hailed as the defining artistic genre of modernity, there is not yet a good account of what the self-portrait actually is. This paper provides such an account through the lens of document theory and the philosophy of information. In this paper, the self-portrait is conceptualized as a kind of document, more specifically a kind of self-document, to gain insight into the phenomenon. A self-portrait is shown to be a construction, and not just a representation, of oneself. Creating a self-portrait then is a matter of bringing oneself forth over time—constructing oneself, rather than simply depicting oneself. This account provides grounds to consider whether or how the selfie truly is a form of self-portrait, as is often asserted. In the end, it seems that while both are technologies for self-construction, the self-portrait has the capacity for deep self-construction, whereas the selfie is limited to fewer aspects of the self. This prospect leads into an ethical discussion of the changing concept of identity in the digital age.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 29
    Publication Date: 2019
    Description: Dependability assessment is one of the most important activities in the analysis of complex systems. Classical analysis techniques for safety, risk, and dependability, like Fault Tree Analysis or Reliability Block Diagrams, are easy to implement, but they produce inaccurate dependability estimates because of their simplifying hypotheses, which assume that component malfunctions are independent of each other and of the system’s working conditions. Recent contributions under the umbrella of Dynamic Probabilistic Risk Assessment have shown the potential to improve the accuracy of classical dependability analysis methods. Among them, the Stochastic Hybrid Fault Tree Automaton (SHyFTA) is a promising methodology because it can combine a Dynamic Fault Tree model with the physics-based deterministic model of a system process, and it can generate dependability metrics along with performance indicators of the physical variables. This paper presents the Stochastic Hybrid Fault Tree Object Oriented (SHyFTOO) library, a Matlab® software library for the modelling and resolution of SHyFTA models. One of the novel features discussed in this contribution is the ease of coupling with a Matlab® Simulink model, which facilitates the design of complex system dynamics. To demonstrate the use of this software library and its augmented capability of generating further dependability indicators, three different case studies are discussed and solved, with a thorough description of the implementation of the corresponding SHyFTA models.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 30
    Publication Date: 2019
    Description: Twisted Edwards curves have been at the center of attention since their introduction by Bernstein et al. in 2007. The curve ED25519, used for the Edwards-curve Digital Signature Algorithm (EdDSA), provides faster digital signatures than existing schemes without sacrificing security. CURVE25519 is a Montgomery curve that is closely related to ED25519. It provides simple, constant-time, and fast point multiplication, which is used by the key exchange protocol X25519. Software implementations of EdDSA and X25519 are used in many web-based PC and mobile applications. In this paper, we introduce a low-power, low-area FPGA implementation of ED25519 and CURVE25519 scalar multiplication that is particularly relevant for Internet of Things (IoT) applications. The efficiency of the arithmetic modulo the prime number 2^255 − 19, in particular the modular reduction and modular multiplication, is key to the efficiency of both EdDSA and X25519. To reduce the complexity of the hardware implementation, we propose a high-radix interleaved modular multiplication algorithm. One benefit of this architecture is that it avoids the use of large-integer multipliers relying on FPGA DSP modules.
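    The arithmetic hinges on the identity 2^255 ≡ 19 (mod p), which lets a reduction fold high bits back in with a multiply-by-19 instead of a long division. A small software sketch of this folding follows; it illustrates the idea behind the high-radix multiplier, not the paper’s FPGA design.

```python
# Reduction modulo p = 2^255 - 19 by repeatedly folding the high part
# back in, using 2^255 * hi ≡ 19 * hi (mod p).
P = 2**255 - 19

def reduce_25519(x):
    """Reduce a non-negative integer (e.g., a 510-bit product) to [0, p)."""
    while x >= 2**255:
        lo = x & (2**255 - 1)   # low 255 bits
        hi = x >> 255           # high part
        x = lo + 19 * hi        # fold: 2^255 * hi ≡ 19 * hi (mod p)
    return x - P if x >= P else x

a = 0xDEADBEEF * 2**200
b = 0xCAFEBABE * 2**200
assert reduce_25519(a * b) == (a * b) % P
print(hex(reduce_25519(a * b))[:10])
```

    The loop terminates quickly: after one fold a 510-bit value shrinks to roughly 260 bits, and after two more folds it fits below 2^255, leaving at most one conditional subtraction.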
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 31
    Publication Date: 2019
    Description: Assembly is a very important manufacturing process in the age of Industry 4.0. Aiming at the problems of part identification and assembly inspection in industrial production, this paper proposes an assembly inspection method based on machine vision and a deep neural network. First, an image acquisition platform is built to collect part and assembly images. We use the Mask R-CNN model to identify and segment the shape in each part image and to obtain the part category and position coordinates in the image. Then, according to the image segmentation results, the area, perimeter, circularity, and Hu invariant moments of the contour are extracted to form a feature vector. Finally, an SVM classification model is constructed to identify assembly defects, with a classification accuracy rate of over 86.5%. The accuracy of the method is verified on a purpose-built experimental platform. The results show that the method effectively identifies missing and misaligned parts in the assembly and has good robustness.
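    The contour features mentioned above can be sketched in a few lines. This toy version computes area, perimeter, and circularity (4πA/P²) for a plain list of polygon vertices via the shoelace formula, rather than from an actual Mask R-CNN segmentation mask.

```python
# Shape features of a closed polygon: area (shoelace), perimeter,
# and circularity 4*pi*A / P^2 (1 for a circle, pi/4 for a square).
import math

def shape_features(points):
    """Return (area, perimeter, circularity) of a closed polygon."""
    n = len(points)
    area = 0.0
    perimeter = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1                 # shoelace term
        perimeter += math.hypot(x2 - x1, y2 - y1)
    area = abs(area) / 2.0
    circularity = 4.0 * math.pi * area / perimeter**2
    return area, perimeter, circularity

# A regular 64-gon approximates a circle, so circularity approaches 1;
# a square's circularity is pi/4 ≈ 0.785.
circle = [(math.cos(2 * math.pi * k / 64), math.sin(2 * math.pi * k / 64))
          for k in range(64)]
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(round(shape_features(circle)[2], 3), round(shape_features(square)[2], 3))
```

    In the paper’s pipeline, a vector of such shape descriptors per segmented contour is what the SVM consumes.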
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 32
    Publication Date: 2019
    Description: The authors wish to make the following corrections to this paper [...]
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 33
    Publication Date: 2019
    Description: In the education process, students face problems with understanding due to the complexity of concepts and the abstract thinking they require. More and more educational centres around the world have started to introduce powerful new technology-based tools that help meet the needs of a diverse student population. Over the last several years, virtual reality (VR) has moved from being the purview of gaming to professional development. It plays an important role in the teaching process, providing an interesting and engaging way of acquiring information. What follows is an overview of the big trends, opportunities, and concerns associated with VR in education. We present new opportunities in VR and bring together the most interesting recent virtual reality applications used in education, covering several education areas such as general, engineering, and health-related education. Additionally, this survey contributes by presenting methods for creating scenarios and different approaches for testing and validation. Lastly, we conclude and discuss future directions of VR and its potential to improve the learning experience.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 34
    Publication Date: 2019
    Description: The big data from the various sensors installed on board to monitor the status of ship devices are critical for improving the efficiency and safety of ship operations and for reducing operation and maintenance costs. However, how to utilize these data is a key issue. Temperature changes in the ship propulsion devices can often reveal whether the devices are faulty. Therefore, this paper aims to forecast the temperature of the ship propulsion devices by data-driven methods, so that potential faults can be further identified automatically. The proposed forecasting process is composed of preprocessing, feature selection, and prediction, the last comparing an autoregressive distributed lag (ARDL) time series model, a stepwise regression (SR) model, a neural network (NN) model, and a deep neural network (DNN) model. Finally, the proposed forecasting process is applied to a naval ship, and the results show that the ARDL model has higher accuracy than the three other models.
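    A minimal, noise-free illustration of an ARDL(1,1) fit, y_t = a + b·y_{t−1} + c·x_t + d·x_{t−1}, estimated by ordinary least squares. The series and coefficients are invented, and a tiny Gaussian-elimination solver is included so the sketch needs only the standard library; it is a stand-in for the paper’s temperature-forecasting model, not its implementation.

```python
# ARDL(1,1) by ordinary least squares on lagged regressors.
def solve(A, b):
    """Solve A m = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_ardl(y, x):
    """Least-squares estimates of (a, b, c, d) via the normal equations."""
    rows = [[1.0, y[t - 1], x[t], x[t - 1]] for t in range(1, len(y))]
    rhs = [y[t] for t in range(1, len(y))]
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(4)] for i in range(4)]
    Atb = [sum(r[i] * v for r, v in zip(rows, rhs)) for i in range(4)]
    return solve(AtA, Atb)

# Noise-free data generated from known coefficients is recovered exactly.
x = [0.5, 1.2, -0.3, 2.0, 0.7, 1.5, -1.0, 0.9, 1.1, 0.2]
y = [1.0]
for t in range(1, len(x)):
    y.append(0.3 + 0.6 * y[t - 1] + 0.8 * x[t] + 0.1 * x[t - 1])
a, b, c, d = fit_ardl(y, x)
print(round(a, 3), round(b, 3), round(c, 3), round(d, 3))  # → 0.3 0.6 0.8 0.1
```

    With real sensor data the residuals are nonzero, and the fitted coefficients feed one-step-ahead temperature forecasts.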
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 35
    Publication Date: 2019
    Description: The inclusion of agile methodologies in web-oriented projects is now being considered on a large scale by software developers. However, the benefits and limitations go beyond the conveniences that project managers weigh when choosing them. Selecting a methodology involves more than the associated processes or some documentation. The main concerns are therefore the approach with which the methodology is identified, the needs of the company, the size and qualities of the project, and especially the agile development characteristics that the methodologies possess. However, their shared features make selecting the most appropriate methodology difficult: Will it be suitable for my project? What challenges will be presented in the process? Will my team understand each stage? Will I be able to deliver software that satisfies the client? Project managers raise these questions, which seem manageable but have huge effects. This paper presents a systematic literature review based on an analysis of the approaches of six web development methodologies. The aim of the study is to analyze the approaches presented by relevant methodologies, identifying their common agile characteristics and contrasting their benefits and limitations during a project. As a result, we itemize five common features, which are presented within the processes: (1) flexibility, (2) constant communication within the workgroup, (3) use of UML, (4) inclusion of the end user, and (5) some documentation.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 36
    Publication Date: 2019
    Description: After more than a decade, the supply-driven approach to publishing public (open) data has resulted in an ever-growing number of data silos. Hundreds of thousands of datasets have been catalogued and can be accessed at data portals at different administrative levels. However, usually, users do not think in terms of datasets when they search for information. Instead, they are interested in information that is most likely scattered across several datasets. In the world of proprietary in-company data, organizations invest heavily in connecting data in knowledge graphs and/or store data in data lakes with the intention of having an integrated view of the data for analysis. With the rise of machine learning, it is a common belief that governments can improve their services, for example, by allowing citizens to get answers related to government information from virtual assistants like Alexa or Siri. To provide high-quality answers, these systems need to be fed with knowledge graphs. In this paper, we share our experience of constructing and using the first open government knowledge graph in the Netherlands. Based on the developed demonstrators, we elaborate on the value of having such a graph and demonstrate its use in the context of improved data browsing, multicriteria analysis for urban planning, and the development of location-aware chat bots.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 37
    Publication Date: 2019
    Description: Computer science is a predominantly male field of study. Women face barriers while trying to enter the study of computer science, and those barriers extend into the professional area of computer science. Despite decades of social struggle for gender equity in Science, Technology, Engineering, and Mathematics (STEM) education and in computer science in general, few women participate in computer science; the reasons include gender bias and a lack of support for women when choosing a computer science career. Open source software development has been increasingly used by companies seeking the competitive advantages gained by team diversity. This diversification of the characteristics of team members includes, for example, the age of the participants, their level of experience, education and knowledge in the area, and their gender. In open source software projects, women are underrepresented, and a series of biases affect their participation. This paper conducts a systematic literature review with the objective of finding factors that could assist in increasing women’s interest in contributing to open source communities and software development projects. The main contributions of this paper are: (i) identification of factors that cause women’s lack of interest (engagement), (ii) possible solutions to increase the engagement of this group, and (iii) a profile of the professional women who are participating in open source software projects and software development projects. The main findings of this research reveal that women are underrepresented in software development projects and in open source software projects, representing less than 10% of all developers, and that the main causes of this underrepresentation may be associated with their workplace conditions, which reflect male gender bias.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 38
    Publication Date: 2019
    Description: The ILAHS (inhomogeneous linear algebraic hybrid system) is a classic kind of hybrid system. For the purpose of optimizing the design of an ILAHS, one important strategy is to introduce equivalence to reduce the states. Recent advances in hybrid systems indicate that approximate trace equivalence can further simplify the design of an ILAHS. To address this issue, the paper first introduces the trajectory metric d_trj for measuring the deviation between two hybrid systems’ behaviors. Given a deviation ε ≥ 0, the original ILAHS H1 can be transformed into the approximate ILAHS H2; in trace equivalence semantics, H2 is then further reduced to H3 with the same functions, and hence H1 is ε-approximate trace equivalent to H3. In particular, ε = 0 corresponds to traditional trace equivalence. We implement an approach based on RealRootClassification to determine the approximation between ILAHSs. The paper also shows that existing approaches are only special cases of our method. Finally, we illustrate the effectiveness and practicality of our method on an example.
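    The deviation measure can be pictured as a sup-norm distance between sampled trajectories: two systems are then ε-approximate trace equivalent when the deviation stays within ε. The exponential trajectories below are toy stand-ins, not the paper’s ILAHS semantics, and the grid-based computation only approximates the metric.

```python
# Sup-norm deviation between two scalar trajectories over a time grid,
# in the spirit of a trajectory metric d_trj.
import math

def d_trj(traj1, traj2, ts):
    """Largest pointwise deviation between two trajectories over grid ts."""
    return max(abs(traj1(t) - traj2(t)) for t in ts)

def h1(t):               # original system (toy)
    return math.exp(-1.00 * t)

def h2(t):               # reduced approximation (toy)
    return math.exp(-1.05 * t)

ts = [k * 0.01 for k in range(101)]   # t in [0, 1]
eps = d_trj(h1, h2, ts)
print(round(eps, 4), d_trj(h1, h1, ts) == 0.0)
```

    Any ε at least as large as the printed deviation certifies the two toy trajectories as ε-approximate trace equivalent on this interval; ε = 0 holds only for identical trajectories.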
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 39
    Publication Date: 2019
    Description: Image classification is one of the most important tasks in the digital era. For cultural heritage applications, it is important to develop classification methods that obtain good accuracy but are also less computationally intensive, as image classification usually uses very large sets of data. This study aims to train and test four classification algorithms: (i) the multilayer perceptron, (ii) averaged one dependence estimators, (iii) forest by penalizing attributes, and (iv) the k-nearest neighbor rough sets and analogy based reasoning, and compares these with the results obtained from the Convolutional Neural Network (CNN). Three types of features were extracted from the images: (i) the edge histogram, (ii) the color layout, and (iii) the JPEG coefficients. The algorithms were tested before and after applying attribute selection, and the results indicated that the best classification performance was obtained for the multilayer perceptron in both cases.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 40
    Publication Date: 2019
    Description: In the future, automated cars may feature external human–machine interfaces (eHMIs) to communicate relevant information to other road users. However, it is currently unknown where on the car the eHMI should be placed. In this study, 61 participants each viewed 36 animations of cars with eHMIs on either the roof, windscreen, grill, above the wheels, or a projection on the road. The eHMI showed ‘Waiting’ combined with a walking symbol 1.2 s before the car started to slow down, or ‘Driving’ while the car continued driving. Participants had to press and hold the spacebar when they felt it safe to cross. Results showed that, averaged over the period when the car approached and slowed down, the roof, windscreen, and grill eHMIs yielded the best performance (i.e., the highest spacebar press time). The projection and wheels eHMIs scored relatively poorly, yet still better than no eHMI. The wheels eHMI received a relatively high percentage of spacebar presses when the car appeared from a corner, a situation in which the roof, windscreen, and grill eHMIs were out of view. Eye-tracking analyses showed that the projection yielded dispersed eye movements, as participants scanned back and forth between the projection and the car. It is concluded that eHMIs should be presented on multiple sides of the car. A projection on the road is visually effortful for pedestrians, as it causes them to divide their attention between the projection and the car itself.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 41
    Publication Date: 2019
    Description: Metagenomics studies, as well as genomics studies of polyploid species such as wheat, deal with the analysis of high variation data. Such data contain sequences from similar, but distinct genetic chains. This fact presents an obstacle to analysis and research. In particular, the detection of instrumentation errors during the digitalization of the sequences may be hindered, as they can be indistinguishable from the real biological variation inside the digital data. This can prevent the determination of the correct sequences, while at the same time make variant studies significantly more difficult. This paper details a collection of ML-based models used to distinguish a real variant from an erroneous one. The focus is on using this model directly, but experiments are also done in combination with other predictors that isolate a pool of error candidates.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 42
    Publication Date: 2019
    Description: The current paper addresses relevant network security vulnerabilities introduced by network devices within the emerging paradigm of the Internet of Things (IoT), as well as the urgent need to mitigate the negative effects of some types of Distributed Denial of Service (DDoS) attacks that try to exploit those security weaknesses. We design and implement a Software-Defined Intrusion Detection System (IDS) that reactively impairs the attacks at their origin, ensuring the “normal operation” of the network infrastructure. Our proposal includes an IDS that automatically detects several DDoS attacks and, as an attack is detected, notifies a Software-Defined Networking (SDN) controller. The proposal also deploys suitable traffic-forwarding decisions from the SDN controller down to the network devices. The evaluation results suggest that our proposal detects several types of DDoS-based cyber-attacks in a timely manner, mitigates their negative impact on network performance, and ensures the correct delivery of normal traffic. Our work sheds light on the relevance of programming over an abstracted view of the network infrastructure to detect a botnet exploitation in time, mitigate malicious traffic at its source, and protect benign traffic.
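    The detect-then-mitigate loop can be caricatured as rate-threshold detection that emits drop rules of the kind an SDN controller might install. The observation window, threshold, addresses, and rule format below are invented for illustration and are not the paper’s IDS.

```python
# Toy flood detection: flag sources whose per-window packet rate is
# excessive and emit a drop rule for each.
from collections import Counter

THRESHOLD = 100  # packets per observation window (illustrative)

def detect_ddos(packets):
    """Return drop rules for sources whose per-window rate is excessive."""
    rates = Counter(src for src, _dst in packets)
    return [{"match": {"ipv4_src": src}, "action": "drop"}
            for src, n in rates.items() if n > THRESHOLD]

window = [("10.0.0.9", "10.0.0.1")] * 250 + [("10.0.0.2", "10.0.0.1")] * 12
rules = detect_ddos(window)
print(rules)  # only 10.0.0.9 is blocked; normal traffic keeps flowing
```

    Dropping at the first-hop switch, rather than near the victim, is what “impairing the attack at its origin” amounts to in this simplified picture.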
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 43
    Publication Date: 2019
    Description: In the last decade, there has been a growing scientific interest in the analysis of DNA microarray datasets, which have been widely used in basic and translational cancer research. The application fields include both the identification of oncological subjects, separating them from the healthy ones, and the classification of different types of cancer. Since DNA microarray experiments typically generate a very large number of features for a limited number of patients, the classification task is very complex and typically requires the application of a feature-selection process to reduce the complexity of the feature space and to identify a subset of distinctive features. In this framework, there are no standard state-of-the-art results generally accepted by the scientific community and, therefore, it is difficult to decide which approach to use for obtaining satisfactory results in the general case. Based on these considerations, the aim of the present work is to provide a large experimental comparison for evaluating the effect of the feature-selection process applied to different classification schemes. For comparison purposes, we considered both ranking-based feature-selection techniques and state-of-the-art feature-selection methods. The experiments provide a broad overview of the results obtainable on standard microarray datasets with different characteristics in terms of both the number of features and the number of patients.
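    A ranking-based feature-selection step of the kind compared in the paper can be sketched with a Fisher-style score (between-class separation over within-class spread), keeping the top-k features. The tiny two-class dataset below is synthetic, and the score is one common choice among many ranking criteria.

```python
# Rank features of a binary-labelled dataset by a Fisher-style score
# and keep the k best ones.
from statistics import mean, pvariance

def fisher_scores(X, y):
    """One score per feature; rows of X are samples, y holds labels 0/1."""
    scores = []
    for j in range(len(X[0])):
        col0 = [row[j] for row, lbl in zip(X, y) if lbl == 0]
        col1 = [row[j] for row, lbl in zip(X, y) if lbl == 1]
        spread = pvariance(col0) + pvariance(col1)
        scores.append((mean(col0) - mean(col1)) ** 2 / spread if spread else 0.0)
    return scores

def select_top_k(X, y, k):
    """Indices of the k highest-scoring features."""
    scores = fisher_scores(X, y)
    return sorted(range(len(scores)), key=lambda j: scores[j], reverse=True)[:k]

# Feature 1 separates the classes; features 0 and 2 are noise.
X = [[0.1, 5.0, 1.0], [0.2, 5.1, 0.0], [0.1, 0.9, 1.1], [0.3, 1.1, 0.1]]
y = [0, 0, 1, 1]
print(select_top_k(X, y, 1))  # → [1]
```

    On microarray data the same idea is applied to thousands of gene-expression features, which is why ranking methods are attractive as a cheap first filter.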
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 44
    Publication Date: 2019
    Description: In recent years, owing to the unnecessary waste of electrical energy in residential buildings, the need for energy optimization and user comfort has gained vital importance. In the literature, various techniques have been proposed to address the energy optimization problem. The goal of each technique is to maintain a balance between user comfort and energy requirements, such that the user can achieve the desired comfort level with the minimum amount of energy consumption. Researchers have addressed the issue with the help of different optimization algorithms and variations in the parameters to reduce energy consumption. To the best of our knowledge, this problem is not yet solved, owing to its challenging nature. The gaps in the literature are due to advancements in technology, the drawbacks of optimization algorithms, and the introduction of new optimization algorithms. Further, many newly proposed optimization algorithms have produced better accuracy on the benchmark instances but have not yet been applied to the optimization of energy consumption in smart homes. In this paper, we have carried out a detailed literature review of the techniques used for the optimization of energy consumption and scheduling in smart homes. A detailed discussion is provided on the different factors contributing to thermal comfort, visual comfort, and air quality comfort. We have also reviewed the fog and edge computing techniques used in smart homes.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 45
    Publication Date: 2019
    Description: This article addresses the task of inferring elements in the attributes of data. Extracting data related to our interests is a challenging task. Although data on the web can be accessed through free text queries, it is difficult to obtain results that accurately correspond to user intentions because users might not express their objects of interest using exact terms (variables, outlines of data, etc.) found in the data. In other words, users do not always have sufficient knowledge of the data to formulate an effective query. Hence, we propose a method that enables the type, format, and variable elements to be inferred as attributes of data when a natural language summary of the data is provided as a free text query. To evaluate the proposed method, we used the Data Jacket’s datasets whose metadata is written in natural language. The experimental results indicate that our method outperforms those obtained from string matching and word embedding. Applications based on this study can support users who wish to retrieve or acquire new data.
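    The string-matching baseline that the proposed method is compared against can be approximated with difflib from the Python standard library: score each dataset summary against the free-text query and return the best match. The metadata summaries below are invented for illustration and are not from the Data Jacket datasets.

```python
# String-matching baseline: rank dataset summaries against a free-text
# query using difflib's similarity ratio.
import difflib

def match_datasets(query, summaries):
    """Rank dataset summaries by similarity to the query, best first."""
    scored = [(difflib.SequenceMatcher(None, query.lower(), s.lower()).ratio(), s)
              for s in summaries]
    return [s for _score, s in sorted(scored, reverse=True)]

summaries = [
    "daily temperature readings from weather stations",
    "customer purchase history with product categories",
    "hourly traffic volume on urban road sections",
]
query = "temperature data from weather sensors"
print(match_datasets(query, summaries)[0])
```

    Such surface matching fails exactly where the paper aims to improve: when the user says “sensors” but the metadata says “stations”, inference over attributes rather than characters is needed.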
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 46
    Publication Date: 2019
    Description: Existing short-term traffic flow prediction models fail to provide precise prediction results and to consider the impact of different traffic conditions on the prediction results in an actual traffic network. To solve these problems, a hybrid Long Short-Term Memory (LSTM) neural network is proposed, based on the LSTM model. The structure and parameters of the hybrid LSTM neural network are then optimized experimentally for different traffic conditions, and the final model is compared with other typical models. It is found that the prediction error of the hybrid LSTM model is markedly smaller than those of the other models, while the running time of the hybrid LSTM model is only slightly longer than that of the plain LSTM model. Based on the hybrid LSTM model, the vehicle flows of each road section and intersection in the actual traffic network are further predicted. The results show that the maximum relative error between the actual and predicted vehicle flows of each road section is 1.03%, and the maximum relative error between the actual and predicted vehicle flows of each road intersection is 1.18%. Hence, the hybrid LSTM model better meets the accuracy and real-time requirements of short-term traffic flow prediction and is suitable for different traffic conditions in the actual traffic network.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
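The building block of any LSTM-based predictor like the one above is the gated cell update. The scalar sketch below shows one forward step of a standard LSTM cell; it is an illustration of the cell equations only (real models use weight matrices and vectors, and the abstract's hybrid architecture is not reproduced here). `DEMO_W` is a hypothetical weight set for demonstration.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One step of a scalar LSTM cell. w maps gate name -> (w_x, w_h, bias);
    everything is scalar for readability."""
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])   # input gate
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])   # forget gate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])   # output gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2]) # candidate
    c = f * c_prev + i * g          # new cell state
    h = o * math.tanh(c)            # new hidden state
    return h, c

# Hypothetical tied weights for a quick demonstration.
DEMO_W = {k: (1.0, 0.0, 0.0) for k in ("i", "f", "o", "g")}
```

Feeding a sequence of traffic counts through repeated `lstm_step` calls carries the cell state `c` forward, which is what lets the model remember earlier flow conditions.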
  • 47
    Publication Date: 2019
    Description: Financial prediction is an important research field in financial time series data mining. Clustering massive financial time series data has long been a problem, since conventional clustering algorithms are essentially designed for static data and are impractical for time series; this impracticality results in poor clustering accuracy in several financial forecasting models. In this paper, a new hybrid algorithm is proposed based on Optimization of Initial Points and Variable-Parameter Density-Based Spatial Clustering of Applications with Noise (OVDBCSAN) and support vector regression (SVR). The optimization of initial points tunes ε and MinPts, the global parameters of DBSCAN, so that the algorithm can handle datasets of different densities: appropriate parameters are selected for clustering according to the local density. The algorithm can find a large number of similar classes and then establish regression prediction models. It was tested extensively using real-world time series datasets from Ping An Bank, the Shanghai Stock Exchange, and the Shenzhen Stock Exchange to evaluate accuracy. The evaluation showed that our approach has major potential in clustering massive financial time series data, thereby improving the accuracy of the prediction of stock prices and financial indexes.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
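To make the role of ε and MinPts concrete, here is a minimal plain DBSCAN on one-dimensional values. It is the textbook baseline the entry above improves on, not the OVDBCSAN variant itself; the example points and parameters are invented.

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN over scalar values; returns a label per point (-1 = noise).
    min_pts counts the point itself."""
    def neighbors(i):
        return [j for j, q in enumerate(points) if abs(points[i] - q) <= eps]

    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1              # provisionally noise
            continue
        cluster += 1                    # i is a core point: start a cluster
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:                    # expand the cluster
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster     # noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = neighbors(j)
            if len(j_nbrs) >= min_pts:  # only core points expand further
                seeds.extend(j_nbrs)
    return labels
```

With a single global ε, the two dense groups below are found and the isolated value is noise; the paper's contribution is choosing ε and MinPts per density regime instead of globally.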
  • 48
    Publication Date: 2019
    Description: To address the low correlation between existing image retargeting quality assessment methods and subjective perception, a content-aware retargeted image quality assessment algorithm based on the structural similarity index is proposed. In this paper, a local structural similarity algorithm that can compare different-sized versions of the same image is introduced. The Speeded Up Robust Features (SURF) algorithm is used to extract the local structural similarity and the degree of image content loss. The salient area ratio is calculated by extracting the saliency region, and the retargeted image quality assessment function is obtained by linear fusion. Experiments on the CUHK image database and the MIT RetargetMe database, comparing against four representative assessment algorithms and four recent retargeted image quality assessment algorithms, show that the proposed algorithm correlates more strongly with Mean Opinion Score (MOS) values and agrees with the results of human subjective assessment.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
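The structural similarity index that the entry above builds on compares two patches through their means, variances, and covariance. The sketch below is the standard single-window SSIM formula on flat lists of pixel values; it is background for the method, not the paper's local, SURF-matched variant.

```python
def ssim(x, y, c1=6.5025, c2=58.5225):
    """Structural similarity between two equal-size grayscale patches
    (flat lists of pixel values). c1, c2 are the usual stabilizers for
    8-bit images: (0.01*255)**2 and (0.03*255)**2."""
    n = len(x)
    mx = sum(x) / n                                   # mean of x
    my = sum(y) / n                                   # mean of y
    vx = sum((a - mx) ** 2 for a in x) / (n - 1)      # variance of x
    vy = sum((b - my) ** 2 for b in y) / (n - 1)      # variance of y
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx * mx + my * my + c1) * (vx + vy + c2))
```

Identical patches score 1; structural changes (here, a reversed gradient) pull the score down, which is the signal the retargeting assessment aggregates over matched local regions.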
  • 49
    Publication Date: 2019
    Description: Data fragmentation and allocation has long proven to be an efficient technique for improving the performance of distributed database systems (DDBSs). A crucial feature of any successful DDBS design is an intrinsic emphasis on minimizing transmission costs (TC). This work, therefore, focuses on improving distribution performance through transmission cost minimization. To do so, data fragmentation and allocation techniques are applied, and several data replication scenarios are investigated. Moreover, site clustering is leveraged with the aim of producing the minimum possible number of highly balanced clusters. DDBS performance is measured using a TC objective function. An inclusive evaluation was made in a simulated environment, and the compared results demonstrate the superiority and efficacy of the proposed approach in reducing TC.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
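A TC objective function of the kind the entry above optimizes can be sketched as follows. The exact cost model in the paper is not reproduced; this is a generic assumption (requests times per-unit inter-site shipping cost) with invented site and fragment names.

```python
def transmission_cost(allocation, requests, unit_cost):
    """Total transmission cost of a fragment allocation.
    allocation[f]   -> site holding fragment f
    requests[s][f]  -> number of requests from site s for fragment f
    unit_cost[s][t] -> cost of shipping one request's data from site t to s
    """
    total = 0
    for s, per_fragment in requests.items():
        for f, count in per_fragment.items():
            total += count * unit_cost[s][allocation[f]]
    return total
```

Fragmentation, allocation, and replication decisions all reduce to moving mass out of the expensive `unit_cost` cells; a site-clustering step shrinks the search space over `allocation` before this objective is minimized.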
  • 50
    Publication Date: 2019
    Description: We propose an extended scheme for selecting related stocks for themed mutual funds, designed to support fund managers who are building such funds. In our preliminary experiments, building a themed mutual fund was found to be quite difficult. Our scheme is a natural language processing method based on words extracted according to their similarity to a theme, using word2vec and our own similarity measure based on co-occurrence in company information. As company information data we used investor relations material and official websites. We also conducted several other experiments, including hyperparameter tuning, within our scheme. The scheme achieved a 172% higher F1 score and 21% higher accuracy than a standard method. Our research also showed that official websites may not be necessary for our scheme, contrary to what our preliminary experiments on data collaboration suggested.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
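The co-occurrence side of the similarity measure mentioned above can be illustrated with a simple document-conditional frequency. This is a plausible stand-in, not the paper's exact formula; documents are modeled as sets of tokens and the example corpus is invented.

```python
def cooccurrence_similarity(word, theme, documents):
    """Fraction of documents mentioning `theme` that also mention `word` --
    a simple stand-in for a co-occurrence-based similarity over company
    information documents (each document is a set of tokens)."""
    theme_docs = [d for d in documents if theme in d]
    if not theme_docs:
        return 0.0
    return sum(1 for d in theme_docs if word in d) / len(theme_docs)
```

Combined with word2vec similarity, a score like this lets a theme such as "ai" pull in companies whose filings co-mention related terms, even when the theme word itself never appears in the company name.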
  • 51
    Publication Date: 2019
    Description: The industrial internet of things (IIoT), associated with Industry 4.0, is the use of internet of things technologies, via Wireless Sensor Networks (WSNs), to enhance manufacturing and industrial processes. It incorporates machine learning and big data technologies to enable the machine-to-machine communication that has existed for years in the industrial world. It is therefore necessary to propose a robust and functional WSN-based communication architecture inside factories, reflecting the great interest in the connectivity of things in the industrial environment. In such an environment, propagation differs from other conventional indoor media in its large dimensions and in the nature of the objects and obstacles inside. Thus, the industrial medium is modeled as a fading channel affected by impulsive and Gaussian noise. The objective of this paper is to improve the robustness and performance of a multi-user WSN architecture based on the Discrete Wavelet Transform in an industrial environment, using conventional channel coding and an optimal thresholding receiver.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
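The Discrete Wavelet Transform underlying the architecture above splits a signal into coarse and detail coefficients. A one-level Haar DWT, the simplest member of the family, can be sketched as follows; it illustrates the transform only, not the paper's multi-user scheme or channel model.

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail) coefficient lists; input length must be even."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of one Haar level, reconstructing the original signal."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / s)
        out.append((a - d) / s)
    return out
```

Because the transform is perfectly invertible, subcarriers or users can be mapped onto disjoint coefficient sets and recovered at the receiver; impulsive noise tends to concentrate in the detail coefficients, which is what a thresholding receiver exploits.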
  • 52
    Publication Date: 2019
    Description: We sketch the debate on testimony in social epistemology by reference to the contemporary debate on reductionism/anti-reductionism, communitarian epistemology, and inferentialism. Testimony is a fundamental source of the knowledge we share, and it is worth considering within a dialogical perspective, which requires the description of a formal structure entailing deontic statuses and deontic attitudes. In particular, we argue for a social reformulation of the “space of reasons”, which establishes a fruitful relationship with the epistemological view of Wilfrid Sellars.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 53
    Publication Date: 2019
    Description: Statistical bivariate numerical modeling is a method to infer an empirical relationship between unpaired sets of data based on statistical distribution matching. In the present paper, a novel, efficient numerical algorithm is proposed to perform bivariate numerical modeling. The algorithm is then applied to correlate glomerular filtration rate with serum creatinine concentration. Glomerular filtration rate is adopted in clinical nephrology as an indicator of kidney function and is relevant for assessing the progression of renal disease. As direct measurement of glomerular filtration rate is highly impractical, there is considerable interest in developing numerical algorithms to estimate it from parameters which are easier to obtain, such as demographic and ‘bedside’ assay data.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
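Distribution matching between unpaired samples, as described above, can be illustrated with basic empirical quantile mapping: a value is located at its quantile in one sample and mapped to the same quantile of the other. This is a generic sketch of the idea, not the paper's algorithm, and the numbers below are invented.

```python
def quantile_map(x, source_sample, target_sample):
    """Map value x from the source distribution onto the target distribution
    by matching empirical quantiles (a basic form of statistical
    distribution matching between unpaired samples)."""
    src = sorted(source_sample)
    tgt = sorted(target_sample)
    # empirical CDF rank of x in the source sample
    rank = sum(1 for v in src if v <= x) / len(src)
    # corresponding quantile of the target sample
    idx = min(int(rank * len(tgt)), len(tgt) - 1)
    return tgt[idx]
```

In the paper's setting, the source sample would be creatinine measurements and the target sample independently measured filtration rates; the monotone quantile correspondence is what yields the empirical relationship without paired observations.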
  • 54
    Publication Date: 2019
    Description: Path planning, as the core of navigation control for mobile robots, has become a focus of research in the field of mobile robotics. Various path planning algorithms have recently been proposed. In this paper, in view of the advantages and disadvantages of different path planning algorithms, a heuristic elastic particle swarm algorithm is proposed. Using the path planned by the A* algorithm in a large-scale grid for global guidance, the elastic particle swarm optimization algorithm uses a shrinking operation to determine the globally optimal path formed by locally optimal nodes, so that the particles can converge to it rapidly. Furthermore, in the iterative process, the diversity of the particles is ensured by a rebound operation. Computer simulations and real experimental results show that the proposed algorithm not only overcomes the shortcoming of the A* algorithm, which cannot always yield the shortest path, but also avoids the failure to converge to the globally optimal path owing to a lack of heuristic information. Additionally, the proposed algorithm maintains the simplicity and high efficiency of both algorithms.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
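The global-guidance stage above relies on grid-based A*. The sketch below is a standard A* on a 4-connected grid with a Manhattan heuristic, returning the path length in steps; the elastic particle swarm refinement is not reproduced, and the grids in the test are invented.

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected grid; grid[r][c] == 1 means blocked.
    Returns the length of the shortest path in steps, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan distance: admissible for unit-cost 4-connected moves
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_set = [(h(start), 0, start)]      # (f = g + h, g, node)
    best = {start: 0}
    while open_set:
        _, g, cur = heapq.heappop(open_set)
        if cur == goal:
            return g
        if g > best.get(cur, float("inf")):
            continue                        # stale heap entry
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc)))
    return None
```

Running A* on a coarse grid gives the cheap global corridor; the swarm then optimizes the continuous path inside that corridor, which is where the "shrinking" and "rebound" operations come in.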
  • 55
    Publication Date: 2019
    Description: Assigning sentiment labels to documents is, at first sight, a standard multi-label classification task. Many approaches have been used for this task, and the current state-of-the-art solutions use deep neural networks (DNNs), so it seems natural to assume that such algorithms will provide the most effective approach. We describe an alternative approach that uses probabilities to construct a weighted lexicon of sentiment terms, then modifies the lexicon and calculates optimal thresholds for each class. We show that this approach outperforms the use of DNNs and other standard algorithms. We believe that DNNs are not a universal panacea and that paying attention to the nature of the data you are trying to learn from can be more important than trying ever more powerful general-purpose machine learning algorithms.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
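The lexicon-plus-threshold idea above can be sketched in a few lines. The weights and threshold here are invented placeholders; in the paper the weights come from probabilities estimated on training data and the thresholds are optimized per class.

```python
# Hypothetical weighted sentiment lexicon (term -> weight).
DEMO_LEXICON = {"good": 1.0, "great": 1.5, "bad": -1.0, "awful": -1.5}

def score(text_tokens, lexicon):
    """Sum of per-term sentiment weights for terms present in the text."""
    return sum(lexicon.get(t, 0.0) for t in text_tokens)

def classify(text_tokens, lexicon, threshold=0.0):
    """Label a document by comparing its lexicon score with a tuned
    threshold (the paper tunes one threshold per class)."""
    return "positive" if score(text_tokens, lexicon) > threshold else "negative"
```

The whole classifier is transparent: every decision traces back to which weighted terms appeared, which is part of why such a model can be tuned tightly to the data where a DNN cannot easily be inspected.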
  • 56
    Publication Date: 2019
    Description: The recent expansion of intelligent gadgets, such as smartphones and smart watches, has familiarized people with having their activities sensed. We have been developing a road accessibility evaluation system inspired by human sensing technologies. This paper introduces our methodology for estimating road accessibility from three-axis acceleration data obtained by a smartphone attached to a wheelchair seat, covering environmental factors, e.g., curbs and gaps, which directly influence wheelchair bodies, and human factors, e.g., wheelchair users’ feelings of tiredness and strain. Our goal is a system that provides road accessibility visualization services to users by online/offline pattern matching using impersonal models, while gradually learning to improve service accuracy using new data provided by users. As a first step, this paper evaluates features acquired by a DCNN (deep convolutional neural network) that learns the state of the road surface from the data using supervised machine learning. The evaluation shows that these features capture differences in road surface condition in more detail than the labels we attached, and that they are effective as a means of quantitatively expressing road surface condition. This paper also develops and evaluates a prototype system that estimates types of ground surfaces, focusing on knowledge extraction and visualization.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
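As a baseline for what the DCNN learns from three-axis acceleration, a hand-crafted feature such as windowed magnitude deviation already separates smooth from rough surfaces. This sketch is an illustration under that assumption, not the authors' pipeline; the window size and sample values are invented.

```python
import math

def window_features(accel, window):
    """Per-window roughness feature from 3-axis acceleration samples.
    accel is a list of (x, y, z) tuples; returns the RMS deviation of the
    acceleration magnitude in each non-overlapping window."""
    feats = []
    for start in range(0, len(accel) - window + 1, window):
        chunk = accel[start:start + window]
        mag = [math.sqrt(x * x + y * y + z * z) for x, y, z in chunk]
        mean = sum(mag) / window
        rms_dev = math.sqrt(sum((m - mean) ** 2 for m in mag) / window)
        feats.append(rms_dev)  # rougher surfaces -> larger deviation
    return feats
```

A steady 1 g signal yields zero deviation, while alternating spikes (as a curb or gap would produce) yield a large one; the learned DCNN features go further by capturing finer surface differences than such single-number summaries.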
  • 57
    Publication Date: 2019
    Description: Due to the high demands of new technologies such as social networks, e-commerce and cloud computing, more energy is being consumed in order to store all the data produced and provide the high availability required. Over the years, this increase in energy consumption has brought about a rise in both environmental impact and operational costs. Some companies have adopted the concept of a green data center, which relates electricity consumption and CO2 emissions to the utility power source adopted. In Brazil, almost 70% of electrical power is derived from clean electricity generation, whereas in China 65% of generated electricity comes from coal. In addition, the price per kWh in the US is much lower than in the other countries surveyed. In the present work, we conducted an integrated evaluation of the costs and CO2 emissions of the electrical infrastructure in data centers, considering the different energy sources adopted by each country. We used a multi-layered artificial neural network, which could forecast consumption over the following months based on the energy consumption history of the data center. All these features were supported by a tool whose applicability was demonstrated through a case study that computed the CO2 emissions and operational costs of a data center using the energy mix adopted in Brazil, China, Germany and the US. China presented the highest CO2 emissions, with 41,445 tons per year in 2014, followed by the US and Germany, with 37,177 and 35,883 tons, respectively; Brazil, with 8459 tons, proved to be the cleanest. Additionally, this study estimated the operational costs assuming that the same data center consumed energy as if it were in China, Germany and Brazil. China presented the highest energy cost per year. Therefore, the best choice according to operational costs, considering the price of energy per kWh, is the US and the worst is China. Considering both operational costs and CO2 emissions, Brazil would be the best option.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
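The country comparison above boils down to multiplying the same annual consumption by each grid's emission factor and electricity price. The sketch below shows that computation; the factors and prices in `GRIDS` are illustrative placeholders, not the paper's measured values.

```python
def annual_footprint(kwh_per_year, grid):
    """Yearly CO2 (tons) and energy cost for a data center on a given grid mix.
    grid = (kg CO2 per kWh, price per kWh)."""
    kg_co2_per_kwh, price_per_kwh = grid
    tons_co2 = kwh_per_year * kg_co2_per_kwh / 1000.0
    cost = kwh_per_year * price_per_kwh
    return tons_co2, cost

# Hypothetical emission factors and prices, for comparison only.
GRIDS = {"coal-heavy": (0.90, 0.10), "hydro-heavy": (0.10, 0.12)}
```

Because emissions and cost scale with different per-kWh factors, the cheapest grid need not be the cleanest, which is exactly the US-versus-Brazil trade-off reported above; the neural network's consumption forecast supplies the `kwh_per_year` input for future months.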
  • 58
    Publication Date: 2019
    Description: This paper presents an approach to detect and classify faults in complex systems with small amounts of available data history. The methodology is based on model fusion for fault detection and classification. Moreover, the database is enriched with additional samples if they are correctly classified. For fault detection, kernel principal component analysis (KPCA), kernel independent component analysis (KICA) and support vector domain description (SVDD) were used and combined with a fusion operator. For classification, an extreme learning machine (ELM) was used with different activation functions combined through an average fusion function. The performance of the methodology was evaluated on a set of experimental vibration data collected from a run-to-failure bearing test rig. The results show the effectiveness of the proposed approach compared to conventional methods: fault detection was achieved with a false alarm rate of 2.29% and no missed alarms, and the faults were successfully classified with a rate of 99.17%.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
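The fusion operator that combines the KPCA, KICA and SVDD detectors above can be pictured as a weighted average of threshold-normalized scores. This is one plausible form of such an operator, assumed for illustration; the paper's exact operator and thresholds are not reproduced.

```python
def fused_alarm(scores, thresholds, weights=None):
    """Fuse several detectors' anomaly scores into one alarm decision.
    Each score is normalized by its own detection threshold so the
    detectors are comparable; weights default to uniform."""
    if weights is None:
        weights = [1.0 / len(scores)] * len(scores)
    fused = sum(w * s / t for w, s, t in zip(weights, scores, thresholds))
    return fused > 1.0  # alarm when the fused normalized score exceeds 1
```

One strongly alarming detector can be outvoted or confirmed by the others, which is how fusion trades individual false alarms against missed detections.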
  • 59
    Publication Date: 2019
    Description: Background: Hadoop has become the base framework for big data systems via the simple concept that moving computation is cheaper than moving data. Hadoop increases data locality in the Hadoop Distributed File System (HDFS) to improve system performance, since network traffic among the nodes of a big data system is reduced by increasing the share of data-local tasks on each machine. Previous research increased data locality in one of the MapReduce stages to improve Hadoop performance; however, there has been no mathematical performance model for data locality in Hadoop. Methods: This study builds a Hadoop performance analysis model with data locality for analyzing the entire MapReduce process. The paper explains the data locality concept in the map stage and the shuffle stage, and shows how to apply the performance analysis model to increase the performance of a Hadoop system by creating deep data locality. Results: This research validated deep data locality for increasing Hadoop performance via three tests: a simulation-based test, a cloud test and a physical test. According to the tests, the authors improved the Hadoop system by over 34% by using deep data locality. Conclusions: Deep data locality improved Hadoop performance by reducing data movement in HDFS.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
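A locality-aware performance model of the kind described above can be sketched with a simple wave model: local map tasks run at disk speed, non-local ones pay a network penalty, and tasks run in parallel waves limited by the number of slots. This is a toy model under those assumptions, not the paper's derived formulas.

```python
def map_stage_time(tasks, local_fraction, t_local, t_remote, slots):
    """Rough map-stage runtime under a data-locality model.
    local tasks cost t_local each, non-local tasks cost t_remote;
    `slots` tasks run in parallel per wave."""
    n_local = round(tasks * local_fraction)
    total_work = n_local * t_local + (tasks - n_local) * t_remote
    waves = -(-tasks // slots)            # ceiling division
    return total_work / tasks * waves     # average task time x number of waves
```

Raising `local_fraction` from 0.5 to 1.0 halves the runtime in the example below, which is the lever that deep data locality pulls across both the map and shuffle stages.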
  • 60
    Publication Date: 2019
    Description: This work considers a circular Wireless Sensor Network (WSN) in a planar structure with a uniform distribution of sensors and a two-level hierarchical topology. At the lower level, a cluster configuration is adopted in which the sensed information is transferred from sensor nodes to a cluster head (CH) using a random access protocol (RAP). At the CH level, CHs transfer information, hop-by-hop, ring-by-ring, toward the sink located at the center of the sensed area, using TDMA as the MAC protocol. A Markovian model to evaluate the end-to-end (E2E) transfer delay is formulated. In addition to other results, such as the well-known energy hole problem, the model reveals that for a given radial distance between a CH and the sink, the transfer delay depends on the angular orientation between them. For instance, when two rings of CHs are deployed in the WSN area, the E2E delay of data packets generated at ring 2 on the “west” side of the sink is 20% higher than the corresponding E2E delay of data packets generated at ring 2 on the “east” side of the sink. This asymmetry can be alleviated by rotating, from time to time, the allocation of temporary slots to CHs in the TDMA communication. The energy consumption is also evaluated, and the numerical results show that for a WSN with a small coverage area, say a radius of 100 m, the energy saving is more significant when a small number of rings is deployed, perhaps none (a single cluster in which the sink acts as a CH). Conversely, topologies with a large number of rings, say 4 or 5, offer better energy performance when the WSN covers a large area, say radial distances greater than 400 m.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
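The slot-rotation remedy mentioned above can be sketched as a cyclic shift of the TDMA frame, so that no angular sector of CHs permanently occupies the late (high-delay) slots. This is a minimal illustration of the idea, with invented CH identifiers; the paper's scheduling details are not reproduced.

```python
def rotate_slots(schedule, shift):
    """Cyclically rotate a TDMA frame (list: slot index -> CH id) by
    `shift` positions, so CHs take turns in the early and late slots."""
    n = len(schedule)
    return [schedule[(i - shift) % n] for i in range(n)]

def slot_of(schedule, ch):
    """Slot index currently assigned to a cluster head."""
    return schedule.index(ch)
```

Applied once per frame (or per epoch), every CH cycles through every slot position, averaging out the east/west delay asymmetry over time.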
  • 61
    Publication Date: 2019
    Description: Urban populations have grown exponentially in recent years, leading to an increase in CO2 emissions and consequently contributing on a large scale to climate change. Urban trees are fundamental to mitigating CO2 emissions, as they incorporate carbon into their biomass, so it becomes necessary to understand and measure urban tree carbon storage. This paper studies the potential of open data to measure the quantity, density, and value of the carbon stored by the seven most represented urban tree species in the city of Lisbon. These species were selected from an open database acquired from an open data portal of the city of Lisbon. Through allometric equations, it was possible to compute the trees’ biomass and calculate carbon storage quantity, density, and value. The results showed that Celtis australis is the species that contributes most to carbon storage. Central parishes of the city of Lisbon present higher density values of carbon storage than the border parishes, despite presenting low-to-medium values of carbon storage quantity and value. Trees located in streets present higher values of carbon storage than trees located in schools and green areas. Finally, the potential use of this information to build a decision-support dashboard for planning green infrastructures was demonstrated.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
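The allometric computation described above typically follows the power-law form biomass = a x DBH^b, with roughly half of dry biomass counted as carbon. The coefficients below are generic placeholders, not the species-specific values used for Lisbon's trees.

```python
def tree_carbon_kg(dbh_cm, a=0.05, b=2.5, carbon_fraction=0.5):
    """Carbon stored by one tree via an allometric biomass equation
    biomass = a * DBH^b. a and b here are illustrative placeholder
    coefficients; real studies fit them per species. Roughly half of
    dry biomass is carbon."""
    biomass = a * dbh_cm ** b
    return biomass * carbon_fraction

def stock_and_density(trees_dbh_cm, area_ha):
    """Total carbon (kg) and density (kg/ha) for a set of trees in an area."""
    total = sum(tree_carbon_kg(d) for d in trees_dbh_cm)
    return total, total / area_ha
```

Summing per-tree carbon over a parish and dividing by its area gives exactly the quantity-versus-density distinction in the results above: a small central parish can have modest total storage yet high storage density.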
  • 62
    Publication Date: 2019
    Description: In this study, methods for predicting energy demand from hourly consumption data are established for realizing an energy management system for buildings. The methods consist of an energy prediction algorithm that automatically separates the datasets into partitions (gates) and creates a linear regression model (local expert) for each partition, based on heterogeneous mixture modeling, and an extended goal graph that extracts candidate variables both for data partitioning and for linear regression for the energy prediction algorithm. These methods were implemented as tools and applied to create an energy prediction model on two years of hourly consumption data for a building. We validated the methods by comparing their accuracy with that of different machine learning algorithms applied to the same datasets.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
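The gate/local-expert structure above amounts to piecewise linear regression: a gate function routes each input to a partition, and one linear model is fitted per partition. The sketch below illustrates that structure; the `hour_gate` partition rule and the demo data are invented, and the real algorithm learns the partitioning rather than fixing it.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = m*x + c (needs >= 2 distinct x values)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

def fit_gated_model(data, gate):
    """Heterogeneous-mixture-style model: `gate` maps an input to a partition
    key; one linear expert (m, c) is fitted per partition."""
    parts = {}
    for x, y in data:
        parts.setdefault(gate(x), []).append((x, y))
    return {k: fit_line(*zip(*v)) for k, v in parts.items()}

def predict(model, gate, x):
    m, c = model[gate(x)]
    return m * x + c

def hour_gate(x):
    """Example gate: partition hours into morning vs. afternoon."""
    return "am" if x < 12 else "pm"
```

Each expert stays interpretable (a slope and intercept per regime, e.g. occupied vs. unoccupied hours), which is a practical advantage of this modeling style for building energy management.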
  • 63
    Publication Date: 2019
    Description: A hashtag is a type of metadata tag used on social networks, such as Twitter and other microblogging services. Hashtags indicate the core idea of a microblog post and can help people to search for specific themes or content. However, not everyone tags their posts themselves, so the task of hashtag recommendation has received significant attention in recent years. A key problem in this task is how to represent the text of a microblog post so that the representation can be utilized for hashtag recommendation. We study two major kinds of text representation methods for hashtag recommendation: shallow textual features and deep textual features learned by deep neural models. Most existing work uses deep neural networks to learn microblog post representations based on the semantic combination of words. In this paper, we propose to adopt the Tree-LSTM to improve the representation by combining the syntactic structure and the semantic information of words. We conduct extensive experiments on two real-world datasets. The experimental results show that deep neural models generally perform better than traditional methods. Specifically, the Tree-LSTM achieves significantly better results on hashtag recommendation than the standard LSTM, with a 30% increase in F1-score, which indicates that it is promising to utilize syntactic structure in the task of hashtag recommendation.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 64
    Publication Date: 2019
    Description: The wide-ranging application of location-based services (LBSs) through mobile devices and wireless networks has brought about many critical privacy challenges. To preserve the location privacy of users, most existing location privacy-preserving mechanisms (LPPMs) modify the users’ real locations associated with different pseudonyms, which comes at a cost either in terms of resource consumption or quality of service, or both. However, the effect of resource consumption has not been discussed in existing studies. In this paper, we present user-centric LPPMs against location inference attacks under both service quality and energy constraints. Moreover, we model precision-based and dummy-based mechanisms in the context of an existing LPPM framework, and extend the linear program solutions applicable to them. This allows us to specify LPPMs that decrease the precision of the exposed locations or generate dummy locations for the users. Based on this, we evaluate the privacy protection of the optimal location obfuscation function against an adversary’s inference attack function using real mobility datasets. The results indicate that dummy-based mechanisms provide better achievable location privacy under a given combination of service quality and energy constraints, and that once a certain level of privacy is reached, both the precision-based and dummy-based mechanisms only perturb the exposed locations. The evaluation results also contribute to a better understanding of LPPM design strategies and evaluation mechanisms as far as system resource utilization and service quality requirements are concerned.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
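A dummy-based mechanism of the kind compared above can be sketched as reporting the real location mixed with k decoys. Drawing decoys uniformly from a disc is one simple assumed policy; the paper derives the obfuscation function from a linear program balancing quality and energy, which is not reproduced here.

```python
import math
import random

def dummy_locations(real, k, radius, rng=None):
    """Dummy-based LPPM sketch: return the real location shuffled together
    with k decoys drawn uniformly from a disc of the given radius around it."""
    rng = rng or random.Random(0)
    points = [real]
    for _ in range(k):
        r = radius * math.sqrt(rng.random())     # sqrt for uniform area density
        t = 2.0 * math.pi * rng.random()
        points.append((real[0] + r * math.cos(t), real[1] + r * math.sin(t)))
    rng.shuffle(points)                          # hide which report is real
    return points

def in_disc(p, center, radius):
    return math.hypot(p[0] - center[0], p[1] - center[1]) <= radius + 1e-9
```

Each extra dummy costs bandwidth and energy for the additional reports, which is precisely the resource-versus-privacy trade-off the evaluation above quantifies.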
  • 65
    Publication Date: 2019
    Description: This paper explores the usability of the Dice CAPTCHA via analysis of the time spent solving the CAPTCHA and the number of tries needed to solve it. The experiment was conducted on a set of 197 Internet users, differentiated by age, daily Internet usage in hours, Internet experience in years, and the type of device on which the CAPTCHA was solved. Each user was asked to solve the Dice CAPTCHA on a tablet or laptop, and the time to successfully solve it in a given number of attempts was recorded. The collected data were analyzed via association rule mining and an artificial neural network. The analysis revealed that the solution time in a given number of attempts depended on different combinations of the users’ features, and it identified the most meaningful features influencing the solution time. In addition, this dependence was explored by predicting the CAPTCHA solution time from the users’ features via an artificial neural network. The obtained results are very helpful for analyzing which combinations of features influence the CAPTCHA solution and, consequently, for finding the CAPTCHA that best complies with the postulate of an “ideal” test.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 66
    Publication Date: 2019
    Description: This article reports on a study to investigate how the driving behaviour of autonomous vehicles influences trust and acceptance. Two different designs were presented to two groups of participants (n = 22/21), using actual autonomously driving vehicles. The first was a vehicle programmed to drive similarly to a human, “peeking” when approaching road junctions as if it was looking before proceeding. The second design had a vehicle programmed to convey the impression that it was communicating with other vehicles and infrastructure and “knew” if the junction was clear so could proceed without ever stopping or slowing down. Results showed non-significant differences in trust between the two vehicle behaviours. However, there were significant increases in trust scores overall for both designs as the trials progressed. Post-interaction interviews indicated that there were pros and cons for both driving styles, and participants suggested which aspects of the driving styles could be improved. This paper presents user information recommendations for the design and programming of driving systems for autonomous vehicles, with the aim of improving their users’ trust and acceptance.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 67
    Publication Date: 2019
    Description: Lyapunov equations are key mathematical objects in systems theory, analysis and design of control systems, and in many applications, including balanced realization algorithms, procedures for reduced order models, Newton methods for algebraic Riccati equations, or stabilization algorithms. A new iterative accuracy-enhancing solver for both standard and generalized continuous- and discrete-time Lyapunov equations is proposed and investigated in this paper. The underlying algorithm and some technical details are summarized. At each iteration, the computed solution of a reduced Lyapunov equation serves as a correction term to refine the current solution of the initial equation. The best available algorithms for solving Lyapunov equations with dense matrices, employing the real Schur(-triangular) form of the coefficient matrices, are used. The reduction to Schur(-triangular) form has to be done only once, before starting the iterative process. The algorithm converges in very few iterations. The results obtained by solving series of numerically difficult examples derived from the SLICOT benchmark collections for Lyapunov equations are compared to the solutions returned by the MATLAB and SLICOT solvers. The new solver can be more accurate than these state-of-the-art solvers and requires little additional computational effort.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
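The defect-correction idea described above (solve, form the residual, solve again for a correction) can be illustrated on a tiny continuous-time Lyapunov equation A X + X Aᵀ + Q = 0. The sketch solves by naive vectorization and Gaussian elimination, which is fine for 2x2 matrices but is not the Schur-form method the paper and SLICOT use; the matrices in the test are invented.

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(r) for r in zip(*A)]

def lyap_residual(A, X, Q):
    """R = A X + X A^T + Q for the continuous Lyapunov equation."""
    AX, XAT = matmul(A, X), matmul(X, transpose(A))
    n = len(Q)
    return [[AX[i][j] + XAT[i][j] + Q[i][j] for j in range(n)] for i in range(n)]

def solve_lyap(A, Q):
    """Solve A X + X A^T = -Q by vectorization (column-stacked vec) and
    Gaussian elimination with partial pivoting. O(n^6): toy sizes only."""
    n = len(A)
    N = n * n
    M = [[0.0] * N for _ in range(N)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                M[j * n + i][j * n + k] += A[i][k]   # (I kron A) vec(X) term
                M[j * n + i][k * n + i] += A[j][k]   # (A kron I) vec(X) term
    b = [-Q[i][j] for j in range(n) for i in range(n)]  # -vec(Q)
    for col in range(N):                              # forward elimination
        piv = max(range(col, N), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, N):
            f = M[r][col] / M[col][col]
            for c in range(col, N):
                M[r][c] -= f * M[col][c]
            b[r] -= f * b[col]
    x = [0.0] * N                                     # back substitution
    for r in range(N - 1, -1, -1):
        x[r] = (b[r] - sum(M[r][c] * x[c] for c in range(r + 1, N))) / M[r][r]
    return [[x[j * n + i] for j in range(n)] for i in range(n)]

def refine(A, Q, X):
    """One defect-correction sweep: the correction D solves the same
    equation driven by the residual, and X + D is the refined solution."""
    R = lyap_residual(A, X, Q)
    D = solve_lyap(A, R)
    n = len(X)
    return [[X[i][j] + D[i][j] for j in range(n)] for i in range(n)]
```

In the test, a deliberately perturbed solution is restored to the exact one in a single `refine` sweep, which is the mechanism behind the "very few iterations" convergence claimed above.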
  • 68
    Publication Date: 2019
    Description: Due to the dynamics and uncertainty of the current network environment, access control is one of the most important factors in guaranteeing network information security, and how to construct a scientific and accurate access control model is a current research focus. In actual access control mechanisms, users with high trust values obtain greater benefits, but the losses are also greater once cheating access is adopted. A general access control game model that reflects both trust and risk is established in this paper. First, we construct an access control game model with user behavior trust between the user and the service provider, in which the benefits and losses are quantified by using adaptive regulatory factors and the user’s trust level, which enhances the rationality of the policy making. Meanwhile, we present two kinds of solutions to the prisoner’s dilemma in the traditional access control game model without user behavior trust. Then, due to the vulnerability of trust, the user’s trust value is updated according to the interaction in the previous stage, which ensures that the update satisfies the “slow rising, fast falling” principle. Theoretical analysis and simulation experiments both show that this model performs better than a traditional game model and can guarantee scientific decision-making in the access control mechanism.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
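A "slow rising, fast falling" trust update of the kind described above can be sketched with a small additive reward for honest interactions and a multiplicative drop for cheating. The reward and penalty values are illustrative placeholders, not the paper's adaptive regulatory factors.

```python
def update_trust(trust, honest, reward=0.05, penalty=0.5):
    """'Slow rising, fast falling' trust update: an honest interaction earns
    a small additive reward (capped at 1.0); a cheating access triggers a
    multiplicative drop. Values are illustrative."""
    if honest:
        return min(1.0, trust + reward)
    return trust * (1.0 - penalty)
```

With these placeholder values, one cheating access halves a trust score that took many honest interactions to build, so regaining high-trust benefits is costly, which is what deters cheating in the game model.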
  • 69
    Publication Date: 2019
    Description: Different types of rewards are applied in persuasive games to encourage play persistence of its users and facilitate the achievement of desired real-world goals, such as behavioral change. Persuasive games have successfully been applied in mental healthcare and may hold potential for different types of patients. However, we question to what extent game-based rewards are suitable in a persuasive game design for a substance dependence therapy context, as people with substance-related disorders show decreased sensitivity to natural rewards, which may result in different responses to commonly applied game rewards compared to people without substance use disorders. In a within-subject experiment with 20 substance dependent and 25 non-dependent participants, we examined whether play persistence and reward evaluation differed between the two groups. Results showed that in contrast to our expectations, substance dependent participants were more motivated by the types of rewards compared to non-substance dependent participants. Participants evaluated monetary rewards more positively than playing for virtual points or social rewards. We conclude this paper with design implications of game-based rewards in persuasive games for mental healthcare.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 70
    Publication Date: 2019
    Description: Knowledge of software security is highly complex since it is quite context-specific and can be applied in diverse ways. To secure software development, software developers require not only knowledge about general security concepts but also about the context for which the software is being developed. With traditional security-centric knowledge formats, it is difficult for developers or knowledge users to retrieve the security information they need based on the requirements of software products and development technologies. We argue that, for security knowledge to operate effectively and become an essential part of practical software development, it must first incorporate features specifying which contextual characteristics are to be handled, and it must be represented in a format that is understandable and acceptable to its users. This study introduces a novel ontology approach for modeling security knowledge with a context-based approach, by which security knowledge can be retrieved, taking the context of the software application at hand into consideration. In this paper, we present our security ontology with the design concepts and the corresponding evaluation process.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 71
    Publication Date: 2019
    Description: The paper addresses the problem of human virus spread reduction when the resources for the control actions are somehow limited. This kind of problem can be successfully solved in the framework of optimal control theory, where the best solution, which minimizes a cost function while satisfying input constraints, can be provided. The problem is formulated in this context for the case of the HIV/AIDS virus, making use of a model that considers two classes of susceptible subjects, the wise people and the people with incautious behaviours, and three classes of infected, the ones still not aware of their status, the pre-AIDS patients and the AIDS ones; the control actions are represented by an information campaign, to reduce the category of subjects with unwise behaviour, a test campaign, to reduce the number of subjects not aware of having the virus, and the medication of patients with a positive diagnosis. The cost function considered aims at reducing the number of patients with a positive diagnosis while using as few resources as possible. Four different types of resource bounds are considered, divided into two classes: limitations on the instantaneous control and fixed total budgets. The optimal solutions are numerically computed, and the results of the simulations performed are illustrated and compared to highlight the different behaviours of the control actions.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 72
    Publication Date: 2019
    Description: This article analyzes the available readiness indexes and maturity models applied to trends designated as “4.0”, with a focus on Industry 4.0, primarily within the countries of Europe. Based upon this analysis, the available indexes and maturity models are organized into the individual layers of a metamodel; the proposal for this metamodel is the article’s main output. Simultaneously, as-yet-uncovered places for the development of existing maturity models, as well as space for further detailed research into the application of Industry 4.0 in theory and in practice, are identified on the basis of this metamodel.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 73
    Publication Date: 2019
    Description: The accurate analysis of periodic surface acoustic wave (SAW) structures by the combined finite element method and boundary element method (FEM/BEM) is important for SAW design, especially in the extraction of coupling-of-modes (COM) parameters. However, the time cost is very large. To accelerate SAW FEM/BEM analysis, some optimization algorithms for the FEM and BEM calculations have been reported, while optimization of the solution of the final FEM/BEM equations, which also involves a large amount of computation, has hardly been reported. In this paper, it was observed that the coefficient matrix of the final FEM/BEM equations for periodic SAW structures is similar to a Toeplitz matrix. A fast algorithm based on the Trench recursive algorithm for Toeplitz matrix inversion was proposed to speed up the solution of the final FEM/BEM equations. The results showed that both the time and memory costs of FEM/BEM were further reduced.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
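The structural idea exploited above — a Toeplitz system is fully determined by its first row and first column, so Levinson/Trench-type recursions solve it in O(n²) instead of O(n³) — can be demonstrated with SciPy's Levinson-based solver. This is an illustration of the matrix structure only, not the paper's FEM/BEM code (whose matrices are merely similar to Toeplitz); the example matrix is made up.

```python
import numpy as np
from scipy.linalg import solve_toeplitz, toeplitz

# A Toeplitz matrix is constant along each diagonal, so it is fully
# described by its first column c and first row r.
c = np.array([4.0, 1.0, 0.5, 0.25])    # first column
r = np.array([4.0, 2.0, 1.0, 0.5])     # first row
b = np.array([1.0, 2.0, 3.0, 4.0])

x_fast = solve_toeplitz((c, r), b)          # O(n^2) Levinson-type solve
x_ref = np.linalg.solve(toeplitz(c, r), b)  # O(n^3) dense reference
```

Both solves return the same solution; for the large systems arising in FEM/BEM analysis the complexity gap is what yields the reported time and memory savings.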
  • 74
    Publication Date: 2019
    Description: Machine learning algorithms are used in many applications nowadays. Sometimes, we need to describe how the decision models created output, and this may not be an easy task. Information visualization (InfoVis) techniques (e.g., TreeMap, parallel coordinates, etc.) can be used for creating scenarios that visually describe the behavior of those models. Thus, InfoVis scenarios were used to analyze the evolutionary process of a tool named AutoClustering, which generates density-based clustering algorithms automatically for a given dataset using the EDA (estimation-of-distribution algorithm) evolutionary technique. Some scenarios were about fitness and population evolution (clustering algorithms) over time, algorithm parameters, the occurrence of the individual, and others. The analysis of those scenarios could lead to the development of better parameters for the AutoClustering tool and algorithms and thus have a direct impact on the processing time and quality of the generated algorithms.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 75
    Publication Date: 2019
    Description: Domain generation algorithms (DGAs) represent a class of malware used to generate large numbers of new domain names to achieve command-and-control (C2) communication between the malware program and its C2 server to avoid detection by cybersecurity measures. Deep learning has proven successful in serving as a mechanism to implement real-time DGA detection, specifically through the use of recurrent neural networks (RNNs) and convolutional neural networks (CNNs). This paper compares several state-of-the-art deep-learning implementations of DGA detection found in the literature with two novel models: a deeper CNN model and a one-dimensional (1D) Capsule Networks (CapsNet) model. The comparison shows that the 1D CapsNet model performs as well as the best-performing model from the literature.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
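To make the detection target concrete: a DGA derives pseudo-random domain names from a seed shared between the malware and its C2 server, so both sides can compute the same list independently. The following is a hypothetical toy generator for illustration — the hashing scheme and names are assumptions, not any real malware family's algorithm.

```python
import hashlib

def dga_domains(seed, date_str, count=5, tld=".com"):
    """Toy DGA: hash a shared seed plus the current date to derive a
    deterministic list of pseudo-random domain names. Classifiers such as
    the RNN/CNN/CapsNet models must distinguish strings like these from
    legitimate domains."""
    domains = []
    for i in range(count):
        digest = hashlib.sha256(f"{seed}|{date_str}|{i}".encode()).hexdigest()
        domains.append(digest[:12] + tld)
    return domains
```

Because both endpoints regenerate the list daily, blocking a single domain is ineffective, which is why real-time classification of candidate domain strings is the preferred defense.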
  • 76
    Publication Date: 2019
    Description: Recommender systems are one of the fields of information filtering systems that have attracted great research interest during the past several decades and have been utilized in a large variety of applications, from commercial e-shops to social networks and product review sites. Since the applicability of these applications is constantly increasing, the size of the graphs that represent their users and support their functionality increases too. Over the last several years, different approaches have been proposed to deal with the problem of scalability of recommender systems’ algorithms, especially of the group of Collaborative Filtering (CF) algorithms. This article studies the problem of CF algorithms’ parallelization under the prism of graph sparsity, and proposes solutions that may improve the prediction performance of parallel implementations without strongly affecting their time efficiency. We evaluated the proposed approach on a bipartite product-rating network using an implementation on Apache Spark.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
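A minimal sketch of the Collaborative Filtering computation whose parallelization the article studies: user-based CF predicts a rating as a similarity-weighted mean over other users, with similarity computed only over co-rated items — which is exactly where graph sparsity bites. The function names and the zero-means-unrated convention are assumptions for illustration.

```python
import numpy as np

def cosine_sim(u, v):
    """Cosine similarity over co-rated items only (zero means 'unrated')."""
    mask = (u != 0) & (v != 0)
    if not mask.any():
        return 0.0
    a, b = u[mask], v[mask]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def predict(ratings, user, item):
    """Predict a rating as the similarity-weighted mean of the other users'
    ratings for the item (the per-user loop is what gets parallelized)."""
    num = den = 0.0
    for other in range(ratings.shape[0]):
        if other == user or ratings[other, item] == 0:
            continue
        s = cosine_sim(ratings[user], ratings[other])
        num += s * ratings[other, item]
        den += abs(s)
    return num / den if den else 0.0
```

In a sparse bipartite rating network most user pairs share few or no co-rated items, so partitioning these loops naively can leave workers with almost no usable neighbors — the prediction-quality issue the article addresses.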
  • 77
    Publication Date: 2019
    Description: Literature shows an increasing interest in the development of augmented reality (AR) applications in several fields, including rehabilitation. Current studies show the need for new rehabilitation tools for the upper extremity, since traditional interventions are less effective than in other body regions. This review aims at: Studying to what extent AR applications are used in shoulder rehabilitation, examining the wearable/non-wearable technologies employed, and investigating the evidence supporting AR effectiveness. Nine AR systems were identified and analyzed in terms of: Tracking methods, visualization technologies, integrated feedback, rehabilitation setting, and clinical evaluation. Our findings show that all these systems utilize vision-based registration, mainly with wearable marker-based tracking, and spatial displays. No system uses head-mounted displays, and only one system (11%) integrates a wearable interface (for tactile feedback). Three systems (33%) provide only visual feedback; the remaining 66% present visual-audio feedback, of which 33% offer visual-audio feedback alone, 22% combine it with biofeedback, and 11% with haptic feedback. Moreover, several systems (44%) are designed primarily for home settings. Three systems (33%) have been successfully evaluated in clinical trials with more than 10 patients, showing advantages over traditional rehabilitation methods. Further clinical studies are needed to generalize the obtained findings and support the effectiveness of the AR applications.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 78
    Publication Date: 2019
    Description: Data and information quality have been recognized as essential components for improving business efficiency. One approach for the assessment of information quality (IQ) is the manufacturing of information (MI). So far, research using this approach has considered a whole document as one indivisible block, which allows document evaluation only at a general level. However, the data inside the documents can be represented as components, which can further be classified according to content and composition. In this paper, we propose a novel model to explore the effectiveness of representing data as a composite unit, rather than indivisible blocks. The input data sufficiency and the relevance of the information output are evaluated using the example of an administrative form. We found that the new streamlined form proposed resulted in a 15% improvement in IQ. Additionally, we found the relationship between the data quantity and IQ was not a “simple” correlation, as IQ may increase without a corresponding increase in data quantity. We conclude that the representation of data as a composite unit is a determining factor in IQ assessment.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 79
    Publication Date: 2019
    Description: Due to current development trends in the automotive industry towards more strongly connected and autonomous driving, the attack surface of vehicles is growing, which increases the risk of security attacks. This has been confirmed by several research projects in which vehicles were attacked in order to trigger various functions. In some cases these functions were critical to operational safety. To make automotive systems more secure, concepts must be developed that take existing attacks into account. Several taxonomies have been proposed to analyze and classify security attacks. However, in this paper we show that the existing taxonomies were not designed for application in the automotive development process and therefore do not provide a sufficient degree of detail for supporting development phases such as threat analysis or security testing. In order to be able to use the information that security attacks can provide for the development of security concepts and for testing automotive systems, we propose a comprehensive taxonomy with degrees of detail which addresses these tasks. In particular, our proposed taxonomy is designed in such a way that each step in the vehicle development process can leverage it.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 80
    Publication Date: 2019
    Description: In this study, we explore maximum distance separable (MDS) self-dual codes over Galois rings GR(p^m, r) with p ≡ −1 (mod 4) and odd r. Using the building-up construction, we construct MDS self-dual codes of length four and eight over GR(p^m, 3) with (p = 3 and m = 2, 3, 4, 5, 6), (p = 7 and m = 2, 3), (p = 11 and m = 2), (p = 19 and m = 2), (p = 23 and m = 2), and (p = 31 and m = 2). In the building-up construction, it is important to determine the existence of a square matrix U such that UU^T = −I, which is called an antiorthogonal matrix. We prove that there is no 2 × 2 antiorthogonal matrix over GR(2^m, r) with m ≥ 2 and odd r.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
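The antiorthogonal condition UU^T = −I from the abstract above can be checked by brute force in the simplest setting, the integer residue rings Z_n (the r = 1 case of GR(p^m, r)); this sketch is an illustration of the definition, not the paper's construction. For n = 4 = 2² no 2 × 2 antiorthogonal matrix exists (squares mod 4 are only 0 or 1, so a² + b² can never be 3), consistent in spirit with the paper's nonexistence result over GR(2^m, r), while for n = 9 a witness is found.

```python
from itertools import product

def antiorthogonal_2x2(n):
    """Search Z_n for a 2x2 matrix U = [[a, b], [c, d]] with
    U @ U^T == -I (mod n), i.e. a^2+b^2 = c^2+d^2 = n-1 and ac+bd = 0."""
    for a, b, c, d in product(range(n), repeat=4):
        if ((a * a + b * b) % n == n - 1 and
            (c * c + d * d) % n == n - 1 and
            (a * c + b * d) % n == 0):
            return (a, b, c, d)
    return None
```

For example, U = [[1, 4], [5, 1]] works over Z_9: 1 + 16 ≡ 8, 25 + 1 ≡ 8, and 5 + 4 ≡ 0 (mod 9).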
  • 81
    Publication Date: 2019
    Description: Finger-vein biometrics have been extensively investigated for person verification. One of the open issues in finger-vein verification is the lack of robustness against variations of vein patterns due to the changes in physiological and imaging conditions during the acquisition process, which results in large intra-class variations among the finger-vein images captured from the same finger and may degrade the system performance. Despite recent advances in biometric template generation and improvement, current solutions mainly focus on the extrinsic biometrics (e.g., fingerprints, face, signature) instead of intrinsic biometrics (e.g., vein). This paper proposes a weighted least square regression based model to generate and improve enrollment template for finger-vein verification. Driven by the primary target of biometric template generation and improvement, i.e., verification error minimization, we assume that a good template has the smallest intra-class distance with respect to the images from the same class in a verification system. Based on this assumption, the finger-vein template generation is converted into an optimization problem. To improve the performance, the weights associated with similarity are computed for template generation. Then, the enrollment template is generated by solving the optimization problem. Subsequently, a template improvement model is proposed to gradually update vein features in the template. To the best of our knowledge, this is the first proposed work of template generation and improvement for finger-vein biometrics. The experimental results on two public finger-vein databases show that the proposed schemes minimize the intra-class variations among samples and significantly improve finger-vein recognition accuracy.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 82
    Publication Date: 2019
    Description: Indoor localization is a dynamic and exciting research area. WiFi has exhibited a tremendous capability for indoor localization since it is extensively used and easily accessible. Facilitating the use of WiFi for this purpose requires fingerprint formation and the implementation of a learning algorithm with the aim of using the fingerprint to determine locations. The most difficult aspect of techniques based on fingerprints is the effect of dynamic environmental changes on fingerprint authentication. With the aim of dealing with this problem, many experts have adopted transfer-learning methods, even though in WiFi indoor localization the dynamic quality of the change in the fingerprint has some cyclic factors that necessitate the use of previous knowledge in various situations. Thus, this paper presents the maximum feature adaptive online sequential extreme learning machine (MFA-OSELM) technique, which uses previous knowledge to handle the cyclic dynamic factors that are brought about by the issue of mobility, which is present in internal environments. This research extends the earlier study of the feature adaptive online sequential extreme learning machine (FA-OSELM). The results of this research demonstrate that MFA-OSELM is superior to FA-OSELM given its capacity to preserve previous data when a person goes back to locations that he/she had visited earlier. Also, there is always a positive accuracy change when using MFA-OSELM, with the best change achieved being 27% (ranging from 8% to 27% and from 6% to 18% for the TampereU and UJIIndoorLoc datasets, respectively), which proves the efficiency of MFA-OSELM in restoring previous knowledge.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 83
    Publication Date: 2019
    Description: Artificial intelligence is changing the healthcare industry from many perspectives: diagnosis, treatment, and follow-up. A wide range of techniques has been proposed in the literature. In this special issue, 13 selected and peer-reviewed original research articles contribute to the application of artificial intelligence (AI) approaches in various real-world problems. Papers refer to the following main areas of interest: feature selection, high dimensionality, and statistical approaches; heart and cardiovascular diseases; expert systems and e-health platforms.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 84
    Publication Date: 2019
    Description: The total variation (TV) regularization-based methods are proven to be effective in removing random noise. However, these solutions usually have staircase effects. This paper proposes a new image reconstruction method based on TV regularization with Lp-quasinorm and group gradient sparsity. In this method, the regularization term of the group gradient sparsity can retrieve the neighborhood information of an image gradient, and the Lp-quasinorm constraint can characterize the sparsity of the image gradient. The method can effectively deblur images and remove impulse noise to well preserve image edge information and reduce the staircase effect. To improve the image recovery efficiency, a Fast Fourier Transform (FFT) is introduced to effectively avoid large matrix multiplication operations. Moreover, by introducing an accelerated alternating direction method of multipliers (ADMM) in the method to allow for a fast restart of the optimization process, this method can run faster. In numerical experiments on standard test images sourced from the Emory University and CVG-UGR (Computer Vision Group, University of Granada) image databases, the advantage of the new method is verified by comparing it with existing advanced TV-based methods in terms of peak signal-to-noise ratio (PSNR), structural similarity (SSIM), and operational time.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
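In ADMM schemes of the kind described above, the subproblem attached to the gradient-sparsity term has a closed-form proximal solution. For the special case p = 1 it is elementwise soft-thresholding; the general Lp-quasinorm case uses a generalized shrinkage, which this sketch does not cover.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1: shrink each entry toward zero by t.
    In TV-regularized ADMM this is applied to the auxiliary image-gradient
    variable at every iteration."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)
```

Small gradient entries (noise) are zeroed while large ones (edges) are merely shrunk, which is how the sparsity prior removes noise yet preserves edge information.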
  • 85
    Publication Date: 2019
    Description: User navigation in public installations displaying 3D content is mostly supported by mid-air interactions using motion sensors, such as Microsoft Kinect. On the other hand, smartphones have been used as external controllers of large-screen installations or game environments, and they may also be effective in supporting 3D navigation. This paper aims to examine whether smartphone-based control is a reliable alternative to mid-air interaction for four degrees of freedom (4-DOF) first-person navigation, and to discover suitable interaction techniques for a smartphone controller. For this purpose, we set up two studies: A comparative study between smartphone-based and Kinect-based navigation, and a gesture elicitation study to collect user preferences and intentions regarding 3D navigation methods using a smartphone. The results of the first study were encouraging, as users with smartphone input performed at least as well as with Kinect and most of them preferred it as a means of control, whilst the second study produced a number of noteworthy results regarding proposed user gestures and their stance towards using a mobile phone for 3D navigation.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 86
    Publication Date: 2019
    Description: In multi-modal emotion aware frameworks, it is essential to estimate the emotional features and then fuse them to different degrees. This basically follows either a feature-level or decision-level strategy. In all likelihood, while features from several modalities may enhance the classification performance, they might exhibit high dimensionality and make the learning process complex for the most used machine learning algorithms. To overcome issues of feature extraction and multi-modal fusion, hybrid fuzzy-evolutionary computation methodologies are employed to demonstrate ultra-strong capability of learning features and dimensionality reduction. This paper proposes a novel multi-modal emotion aware system by fusing speech with EEG modalities. Firstly, a mixed feature set of speaker-dependent and independent characteristics is estimated from the speech signal. Further, EEG is utilized as an inner channel complementing speech for more authoritative recognition, by extracting multiple features belonging to time, frequency, and time–frequency. For classifying unimodal data of either speech or EEG, a hybrid fuzzy c-means-genetic algorithm-neural network model is proposed, where its fitness function finds the optimal fuzzy cluster number reducing the classification error. To fuse speech with EEG information, a separate classifier is used for each modality, then the output is computed by integrating their posterior probabilities. Results show the superiority of the proposed model, where the overall performance in terms of average accuracy rates is 98.06%, 97.28%, and 98.53% for EEG, speech, and multi-modal recognition, respectively. The proposed model is also applied to two public databases for speech and EEG, namely: SAVEE and MAHNOB, which achieve accuracies of 98.21% and 98.26%, respectively.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
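Decision-level fusion as described above — one classifier per modality, outputs combined through their posterior probabilities — can be sketched in a few lines. A weighted sum is one common combination rule; the function name, the equal default weight, and the renormalization step are assumptions for illustration, not necessarily the paper's exact rule.

```python
import numpy as np

def fuse_posteriors(p_speech, p_eeg, w=0.5):
    """Decision-level fusion: combine per-class posterior probabilities from
    two unimodal classifiers by a weighted sum, then pick the arg-max class."""
    fused = w * np.asarray(p_speech) + (1 - w) * np.asarray(p_eeg)
    fused = fused / fused.sum()            # renormalize to a distribution
    return int(np.argmax(fused)), fused
```

Here a confident EEG classifier can override an uncertain speech classifier (and vice versa), which is why the fused system can outperform either modality alone.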
  • 87
    Publication Date: 2019
    Description: Mobile edge computing (MEC) effectively integrates wireless network and Internet technologies and adds computing, storage, and processing functions to the edge of cellular networks. This new network architecture model can deliver services directly from the cloud to the very edge of the network while providing the best efficiency in mobile networks. However, due to the dynamic, open, and collaborative nature of MEC network environments, network security issues have become increasingly complex. Devices cannot easily ensure obtaining satisfactory and safe services because of the numerous, dynamic, and collaborative character of MEC devices and the lack of trust between devices. The trusted cooperative mechanism can help solve this problem. In this paper, we analyze the MEC network structure and device-to-device (D2D) trusted cooperative mechanism and their challenging issues and then discuss and compare different ways to establish the D2D trusted cooperative relationship in MEC, such as social trust, reputation, authentication techniques, and intrusion detection. All these ways focus on enhancing the efficiency, stability, and security of MEC services in presenting trustworthy services.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 88
    Publication Date: 2019
    Description: With the construction of the urban rail transit (URT) network, the explosion of passenger volume is more rapid than the increased capacity of the newly built infrastructure, which results in serious passenger flow congestion (PLC). Understanding the propagation process of PLC is the key to formulate sustainable policies for reducing congestion and optimizing management. This study proposes a susceptible-infected-recovered (SIR) model based on the theories of epidemiological dynamics and complex network to analyze the PLC propagation. We simulate the PLC propagation under various situations, and analyze the sensitivity of PLC propagation to model parameters. Finally, the control strategies of restricting PLC propagation are introduced from two aspects, namely, supply control and demand control. The results indicate that both of the two control strategies contribute to relieving congestion pressure. The propagating scope of PLC is more sensitive when taking mild supply control, whereas, the demand control strategy shows some advantages in flexibly implementing and dealing with serious congestion. These results are of important guidance for URT agencies to understand the mechanism of PLC propagation and formulate appropriate congestion control strategies.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
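The SIR dynamics underlying the congestion-propagation model above can be sketched as one discrete update step; this is the classical compartmental form for illustration, with parameter values assumed, not the paper's calibrated network model.

```python
def sir_step(S, I, R, beta, gamma):
    """One discrete step of SIR dynamics as used to model congestion spread:
    'susceptible' stations become 'infected' (congested) at contact rate beta,
    and congested stations 'recover' (clear) at rate gamma."""
    N = S + I + R
    new_inf = beta * S * I / N    # newly congested stations
    new_rec = gamma * I           # stations clearing their congestion
    return S - new_inf, I + new_inf - new_rec, R + new_rec
```

Supply control maps naturally onto raising gamma (clearing congestion faster), and demand control onto lowering beta (fewer congestion-spreading contacts), mirroring the two strategy classes compared in the study.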
  • 89
    Publication Date: 2019
    Description: There has been an enormous technological boom that has impacted all areas of geoscience in the past few decades. Part of this change was the democratization of cartography and geographic information systems (GIS), together with new approaches that emerged, bringing a social dimension into cartography and GIS. These new approaches were variously labelled as critical cartography, collaborative mapping, digital citizenship, Bottom-up GIS and Participatory GIS. The paper describes the role of collaborative mapping and digital participation in the process of community building and community assets mapping. Secondly, we will use the examples of Kenya and Peru to support our findings of community development. Thirdly, we will discuss a possible further development within the use of OpenStreetMap (OSM) for remote communities. The analysis compares approaches and experiences in different countries on different continents.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 90
    Publication Date: 2019
    Description: Consumers’ purchase behavior increasingly relies on online reviews. Accordingly, there are more and more deceptive reviews which are harmful to customers. Existing methods to detect spam reviews mainly take the problem as a general text classification task, but they ignore the important features of spam reviews. In this paper, we propose a novel model, which splits a review into three parts: first sentence, middle context, and last sentence, based on the discovery that the first and last sentence express stronger emotion than the middle context. Then, the model uses four independent bidirectional long-short term memory (LSTM) models to encode the beginning, middle, end of a review and the whole review into four document representations. After that, the four representations are integrated into one document representation by a self-attention mechanism layer and an attention mechanism layer. Based on three domain datasets, the results of in-domain and mix-domain experiments show that our proposed method performs better than the compared methods.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 91
    Publication Date: 2019
    Description: The kinds of interactions taking place in an online personal finance forum and the sentiments expressed in its posts may influence the diffusion and usefulness of those forums. We explore a set of major threads on a personal finance forum to assess the degree of participation of posters and the prevailing sentiments. The participation appears to be dominated by a small number of posters, with the most frequent poster contributing even more than a third of all posts. Just a small fraction of all possible direct interactions actually take place. Dominance is also confirmed by the large presence of self-replies (i.e., a poster submitting several posts in succession) and rejoinders (i.e., a poster counter-replying to another poster). Though trust is the prevailing sentiment, anger and fear appear to be present as well, though at a lower level, revealing that posts exhibit both aggressive and defensive tones.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 92
    Publication Date: 2019
    Description: Cyber risk management is a very important problem for every company connected to the internet. Usually, risk management is done considering only Risk Analysis without connecting it with Vulnerability Assessment, using external and expensive tools. In this paper we present CYber Risk Vulnerability Management (CYRVM)—a custom-made software platform devised to simplify and improve automation and continuity in cyber security assessment. CYRVM’s main novelties are the combination, in a single and easy-to-use Web-based software platform, of an online Vulnerability Assessment tool within a Risk Analysis framework following the NIST 800-30 Risk Management guidelines and the integration of predictive solutions able to suggest to the user the risk rating and classification.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 93
    Publication Date: 2019
    Description: Many data-centric applications today produce and manipulate large volumes of data, the so-called Big Data. Traditional databases, in particular relational databases, are not suitable for Big Data management. As a consequence, several approaches have been proposed that allow the definition and manipulation of large relational data sets stored in NoSQL databases through an SQL interface, focusing on scalability and availability. This paper presents a comparative analysis of these approaches based on an architectural classification that organizes them according to their system architectures. Our motivation is that wrapping is a relevant strategy for relational-based applications that intend to move relational data to NoSQL databases (usually maintained in the cloud). We also claim that this research area has open issues, given that most approaches deal with only a subset of SQL operations or support only specific target NoSQL databases. Our intention with this survey is, therefore, to contribute to the state of the art in this research area and also to provide a basis for choosing or even designing a relational-to-NoSQL data wrapping solution.
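    The wrapping strategy discussed above can be illustrated with a toy SQL-to-NoSQL translator; the tiny grammar and the MongoDB-style output dictionary are assumptions for illustration, far short of the SQL subset real wrappers support:

    ```python
    import re

    def translate_select(sql):
        """Translate a very restricted SQL SELECT (optional single
        equality WHERE clause) into a MongoDB-style query description."""
        m = re.match(
            r"SELECT\s+(?P<cols>[\w,\s*]+)\s+FROM\s+(?P<table>\w+)"
            r"(?:\s+WHERE\s+(?P<field>\w+)\s*=\s*'(?P<value>[^']*)')?\s*$",
            sql, re.IGNORECASE)
        if not m:
            raise ValueError("unsupported statement")
        cols = [c.strip() for c in m.group("cols").split(",")]
        # SELECT * means "no projection" in MongoDB terms.
        projection = None if cols == ["*"] else {c: 1 for c in cols}
        filt = {}
        if m.group("field"):
            filt[m.group("field")] = m.group("value")
        return {"collection": m.group("table"), "filter": filt,
                "projection": projection}

    q = translate_select("SELECT name, city FROM users WHERE city = 'Rome'")
    ```

    A full wrapper must also handle joins, aggregation, and updates, which is exactly where the surveyed approaches diverge.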
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 94
    Publication Date: 2019
    Description: The fifth International Workshop on the Market of Data (MoDAT2017) was held on November 18th, 2017 in New Orleans, USA in conjunction with IEEE ICDM 2017 [...]
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 95
    Publication Date: 2019
    Description: Exploring how an individual researcher interacts with others in scientific research, determining the degree of association among individual researchers, and evaluating researchers’ contributions to the whole according to the mechanisms and laws of interaction are of great significance for grasping the overall trends of a field. Scholars have mostly used bibliometrics to address these problems, analyzing citation and cooperation among academic works along the dimension of “quantity”. However, there is still no mature method for exploring the evolution of knowledge and the relationships between authors; this paper tries to fill this gap. We narrow the scope of the research to the literature of biology and chemistry, collect all the papers in the PubMed system (a comprehensive, authoritative database of biomedical papers) from 2014–2018, and take the year as the unit of analysis so as to improve the accuracy of the analysis. We then construct author cooperation networks. Through these methods and steps, we identify the core authors of each year, analyze recent cooperative relationships among authors, predict changes in those relationships based on the networks’ analytical data, and evaluate the role that authors play in the overall field. We expect that cooperative authorship networks supported by complex network theory can better explain authors’ cooperative relationships.
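    The construction of a cooperation network and the identification of core authors can be sketched as follows; ranking by degree (number of distinct collaborators) and the sample author lists are illustrative choices, not the paper's exact method:

    ```python
    from collections import defaultdict
    from itertools import combinations

    def core_authors(papers, top_n=2):
        """Build a co-authorship network (authors as nodes, shared
        papers as edges) and rank authors by degree."""
        neighbors = defaultdict(set)
        for authors in papers:
            # Every pair of co-authors on a paper gets an edge.
            for a, b in combinations(set(authors), 2):
                neighbors[a].add(b)
                neighbors[b].add(a)
        ranked = sorted(neighbors, key=lambda a: len(neighbors[a]),
                        reverse=True)
        return ranked[:top_n]

    # Invented author lists standing in for one year of PubMed records.
    papers_2018 = [
        ["Li", "Kim", "Rossi"],
        ["Li", "Kim"],
        ["Li", "Sato"],
        ["Kim", "Novak"],
    ]
    ```

    Running the same construction per year, as the paper does, lets one track how the core-author set and its edges change over time.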
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 96
    Publication Date: 2019
    Description: Previous researchers have examined the motivations of developers to participate in hackathon events and the challenges of open data hackathons, but few studies have focused on the preparation and evaluation of these contests. The purpose of this paper is therefore to examine factors that lead to the effective implementation and success of open data hackathons and innovation contests. Six case studies of open data hackathons and innovation contests held between 2014 and 2018 in Thessaloniki were studied in order to identify the factors leading to the success of hackathon contests, using criteria from the existing literature. The results show that the most significant factors were a clear problem definition, mentors’ participation in the contest, the level of support mentors give participants in launching their applications to the market, the jury members’ knowledge and experience, the entry requirements of the competition, and the participation of companies, data providers, and academics. Furthermore, organizers should take team members’ competences and skills, as well as support for post-launch activities of the applications, into consideration. This paper can be of interest to organizers of hackathon events because it informs them of the factors they should take into consideration for the successful implementation of these events.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 97
    Publication Date: 2019
    Description: Volunteered geographic information (VGI) refers to geospatial data collected and/or shared voluntarily over the Internet. Its use, however, faces several limitations, such as uneven data quality and difficulty of use and retrieval. One way to improve its usability is semantic enrichment, a process that assigns semantic resources to metadata and data. This study proposes a VGI semantic enrichment method using linked data and a thesaurus. The method has two stages, one automatic and one manual. The automatic stage links VGI contributions to places that are of interest to users. In the manual stage, a thesaurus for the hydric domain was built from terms found in the VGI. Finally, a process is proposed that returns semantically similar VGI contributions in response to user queries. To verify the viability of the proposed method, contributions from the VGI system Gota D’Água, related to the prevention of water waste, were used.
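    Thesaurus-based retrieval of semantically similar contributions might look like the following sketch; the thesaurus entries, tags, and contributions are invented hydric-domain examples, not the Gota D’Água data or the paper's actual linked-data machinery:

    ```python
    # Toy thesaurus mapping a preferred term to related terms.
    THESAURUS = {
        "leak": {"leakage", "drip", "burst pipe"},
        "waste": {"wastage", "overuse"},
    }

    def expand(term):
        """Expand a query term with its thesaurus relatives."""
        return {term} | THESAURUS.get(term, set())

    def similar_contributions(query_terms, contributions):
        """Return contributions whose tags overlap the expanded query."""
        wanted = set().union(*(expand(t) for t in query_terms))
        return [c for c in contributions if wanted & set(c["tags"])]

    contributions = [
        {"id": 1, "tags": ["drip", "street"]},
        {"id": 2, "tags": ["overuse", "garden"]},
        {"id": 3, "tags": ["flood"]},
    ]
    hits = similar_contributions(["leak"], contributions)
    ```

    A query for "leak" thus also retrieves a contribution tagged only "drip", which keyword matching alone would miss.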
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 98
    Publication Date: 2019
    Description: Anthropological, archaeological, and forensic studies situate enforced disappearance as a strategy of the Brazilian military dictatorship (1964–1985), which left hundreds of persons whose identity or cause of death was never established. Their forensic reports are the only existing clue for identifying these people and detecting possible crimes associated with them. The exchange of information among institutions about the identities of disappeared people was not common practice. Analysis of the reports therefore requires unsupervised techniques, mainly because contextual annotation is extremely time-consuming, difficult to obtain, and highly dependent on the annotator. These techniques allow researchers to assist identification and analysis in four areas: common causes of death, relevant body locations, terminology of personal belongings, and correlations between actors such as doctors and police officers involved in the disappearances. This paper analyzes almost 3000 textual reports on persons who went missing in the city of São Paulo during the Brazilian dictatorship, using unsupervised information extraction algorithms for Portuguese to identify named entities and the relevant terminology associated with these four criteria. The analysis allowed us to observe terminological patterns relevant for identification (e.g., the presence of rings or similar personal belongings) and to automate the study of correlations between actors. The proposed system acts as a first classification and indexing middleware for the reports and represents a feasible system that can assist researchers searching for patterns among autopsy reports.
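    A minimal stand-in for the unsupervised terminology analysis described above is plain frequency counting over the reports; the real pipeline uses named-entity recognition and terminology extraction for Portuguese, and the stopword list and sample texts here are invented English examples:

    ```python
    import re
    from collections import Counter

    # A tiny stopword list; a real system would use a proper
    # Portuguese stopword list and NER instead.
    STOPWORDS = {"the", "of", "a", "with", "and", "on", "in", "was"}

    def relevant_terms(reports, top_n=3):
        """Surface frequent terminology across a corpus of reports."""
        counts = Counter()
        for text in reports:
            tokens = re.findall(r"[a-z]+", text.lower())
            counts.update(t for t in tokens if t not in STOPWORDS)
        return [term for term, _ in counts.most_common(top_n)]

    reports = [
        "body found with a gold ring on the left hand",
        "gunshot wound; a ring and documents found with the body",
        "gunshot wound to the thorax",
    ]
    ```

    Terms such as "ring" surfacing across reports correspond to the personal-belongings patterns the paper finds useful for identification.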
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 99
    Publication Date: 2019
    Description: The European Union (EU) Regulation 910/2014 on electronic IDentification, Authentication, and trust Services (eIDAS) for electronic transactions in the internal market went into effect on 29 September 2018, meaning that EU Member States are required to recognize the electronic identities issued in the countries that have notified their eID schemes. Technically speaking, a unified interoperability platform—named eIDAS infrastructure—has been set up to connect the EU countries’ national eID schemes to allow a person to authenticate in their home EU country when getting access to services provided by an eIDAS-enabled Service Provider (SP) in another EU country. The eIDAS infrastructure allows the transfer of authentication requests and responses back and forth between its nodes, transporting basic attributes about a person, e.g., name, surname, date of birth, and a so-called eIDAS identifier. However, to build new eIDAS-enabled services in specific domains, additional attributes are needed. We describe our approach to retrieve and transport new attributes through the eIDAS infrastructure, and we detail their exploitation in a selected set of academic services. First, we describe the definition and the support for the additional attributes in the eIDAS nodes. We then present a solution for their retrieval from our university. Finally, we detail the design, implementation, and installation of two eIDAS-enabled academic services at our university: the eRegistration in the Erasmus student exchange program and the Login facility with national eIDs on the university portal.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 100
    Publication Date: 2019
    Description: Spam emails, also known as non-self, are unsolicited commercial or malicious emails sent to affect a single individual, a corporation, or a group of people. Besides advertising, they may contain links to phishing or malware-hosting websites set up to steal confidential information. In this paper, a study of the effectiveness of using a Negative Selection Algorithm (NSA) for anomaly detection applied to spam filtering is presented. The NSA has high performance and a low false detection rate. The designed framework works through three detection phases to determine an email’s legitimacy based on the knowledge gathered in the training phase. The system operates by elimination through negative selection, similar to the functioning of T-cells in biological systems. It has been observed that including more datasets continues to improve performance, resulting in a 6% increase in the True Positive and True Negative detection rates while achieving an actual detection rate of spam and ham of 98.5%. The model has been further compared against similar studies, and the results show that the proposed system increases the correct detection rate of spam and ham by 2 to 15%.
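    The negative-selection principle described above can be sketched as follows: candidate detectors that match any legitimate ("self") message are discarded during training, so the survivors flag only non-self (spam). The vocabulary, pair-based matching rule, and messages are illustrative, not the paper's actual feature model:

    ```python
    from itertools import combinations

    def matches(detector, message):
        """A detector fires when all of its tokens occur in the message."""
        return all(token in message for token in detector)

    def train_detectors(self_msgs, vocab):
        # Candidate detectors are all token pairs over the vocabulary;
        # negative selection censors any candidate that matches a
        # legitimate message, mimicking T-cell maturation.
        return [cand for cand in combinations(vocab, 2)
                if not any(matches(cand, m) for m in self_msgs)]

    def is_spam(message, detectors):
        return any(matches(d, message) for d in detectors)

    # Toy "self" set (ham) and vocabulary.
    ham = [{"meeting", "tomorrow", "agenda"},
           {"invoice", "attached", "project"}]
    vocab = ["meeting", "tomorrow", "agenda", "invoice", "attached",
             "project", "free", "winner", "click", "prize"]
    detectors = train_detectors(ham, vocab)
    ```

    By construction, no surviving detector can fire on a training ham message, while messages full of unseen token combinations are flagged as non-self.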
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...