ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

Filter
  • Books
  • Articles  (1,217)
  • 2015-2019  (1,217)
  • 1990-1994
  • 1915-1919
  • Future Internet  (644)
  • 125090
  • Computer Science  (1,217)
  • Philosophy
  • 1
    Publication Date: 2019
    Description: Over the years, the cellular mobile network has evolved from a wireless plain telephone system to a very complex system providing telephone service, Internet connectivity and many interworking capabilities with other networks. Its air interface performance has increased drastically over time, leading to high throughput and low latency. Changes to the core network, however, have been slow and incremental, with increased complexity worsened by the necessity of backwards-compatibility with older-generation systems such as the Global System for Mobile communication (GSM). In this paper, a new virtualized Peer-to-Peer (P2P) core network architecture is presented. The key idea of our approach is that each user is assigned a private virtualized copy of the whole core network. This enables a higher degree of security and novel services that are not possible in today’s architecture. We describe the new architecture, focusing on its main elements, IP addressing, message flows, mobility management, and scalability. Furthermore, we will show some significant advantages this new architecture introduces. Finally, we investigate the performance of our architecture by analyzing voice-call traffic available in a database of a large U.S. cellular network provider.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 2
    Publication Date: 2019
    Description: The ongoing digital transformation has the potential to revolutionize nearly all industrial manufacturing processes. However, its concrete requirements and implications are still not sufficiently investigated. In order to establish a common understanding, a multitude of initiatives have published guidelines, reference frameworks and specifications, all intending to promote their particular interpretation of the Industrial Internet of Things (IIoT). As a result of the inconsistent use of terminology, heterogeneous structures and proposed processes, an opaque landscape has been created. The consequence is that both new users and experienced experts can hardly manage to get an overview of the amount of information and publications, and make decisions on what is best to use and to adopt. This work contributes to the state of the art by providing a structured analysis of existing reference frameworks, their classifications and the concerns they target. We supply alignments of shared concepts, identify gaps and give a structured mapping of regarded concerns at each part of the respective reference architectures. Furthermore, the linking of relevant industry standards and technologies to the architectures allows a more effective search for specifications and guidelines and supports the direct technology adoption.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 3
    Publication Date: 2019
    Description: Google’s Material Design, created in 2014, led to the extended application of floating action buttons (FAB) in user interfaces of web pages and mobile applications. A FAB’s role is to trigger an action on the current screen or to launch an action that opens another screen. A few specialists in user experience (UX) and user interface (UI) design are sceptical regarding the usability of FAB in the interfaces of both web pages and mobile applications. They claim that the use of FAB easily distracts users, that it interferes with using other important functions of the applications, and that it is unusable in applications designed for iOS systems. The aim of this paper is to investigate, by an experiment, the quality of experience (QoE) of a static and an animated FAB and compare it to the toolbar alternative. The experimental results of different testing methods rejected the hypothesis that the usage and animation of this UI element have a positive influence on application usability. However, its static and animated utilization enhanced the ratings of hedonic and aesthetic features of the user experience, justifying the usage of this type of button.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 4
    Publication Date: 2019
    Description: Service Level Agreements are employed to set availability commitments in cloud services. When a violation occurs as in an outage, cloud providers may be called to compensate customers for the losses incurred. Such compensation may be so large as to erode cloud providers’ profit margins. Insurance may be used to protect cloud providers against such a danger. In this paper, closed formulas are provided through the expected utility paradigm to set the insurance premium under different outage models and QoS metrics (no. of outages, no. of long outages, and unavailability). When the cloud service is paid through a fixed fee, we also provide the maximum unit compensation that a cloud provider can offer so as to meet constraints on its profit loss. The unit compensation is shown to vary approximately as the inverse square of the service fee.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 5
    Publication Date: 2019
    Description: In semi-autonomous robot conferencing, not only does the operator control the robot, but the robot itself also moves autonomously. Thus, it can modify the operator’s movement (e.g., adding social behaviors). However, the sense of agency, that is, the degree to which the operator feels that the movement of the robot is their own movement, would decrease if the operator became conscious of the discrepancy between the teleoperation and the autonomous behavior. In this study, we developed an interface to control the robot head by using an eye tracker. When the robot autonomously moves its eye-gaze position, the interface guides the operator’s eye movement towards this autonomous movement. The experiment showed that our interface can maintain the sense of agency, because it provides the illusion that the autonomous behavior of the robot is directed by the operator’s eye movement. This study reports the conditions under which this illusion can be provided in semi-autonomous robot conferencing.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 6
    Publication Date: 2019
    Description: Collaborative filtering based recommender systems have proven to be extremely successful in settings where user preference data on items is abundant. However, collaborative filtering algorithms are hindered by their weakness against the item cold-start problem and general lack of interpretability. Ontology-based recommender systems exploit hierarchical organizations of users and items to enhance browsing, recommendation, and profile construction. While ontology-based approaches address the shortcomings of their collaborative filtering counterparts, ontological organizations of items can be difficult to obtain for items that mostly belong to the same category (e.g., television series episodes). In this paper, we present an ontology-based recommender system that integrates the knowledge represented in a large ontology of literary themes to produce fiction content recommendations. The main novelty of this work is an ontology-based method for computing similarities between items and its integration with the classical Item-KNN (K-nearest neighbors) algorithm (a simplified sketch of this idea follows this record). As a case study, we evaluated the proposed method against other approaches by performing the classical rating prediction task on a collection of Star Trek television series episodes in an item cold-start scenario. This transverse evaluation provides insights into the utility of different information resources and methods for the initial stages of recommender system development. We found our proposed method to be a convenient alternative to collaborative filtering approaches for collections of mostly similar items, particularly when other content-based approaches are not applicable or otherwise unavailable. Aside from the new methods, this paper contributes a testbed for future research and an online framework to collaboratively extend the ontology of literary themes to cover other narrative content.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
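    The record above mentions an ontology-based item similarity plugged into the classical Item-KNN algorithm. The following is a hedged, minimal stand-in for that idea, not the paper’s implementation: episodes are represented as sets of literary themes, similarity is simple set overlap, and a rating is predicted from the K most similar items the user has already rated. The theme sets and ratings are invented toy data.

    ```python
    # Toy stand-in for ontology-based Item-KNN (all data below is invented).
    def theme_similarity(a, b):
        # Jaccard overlap between two sets of literary themes.
        return len(a & b) / len(a | b) if a | b else 0.0

    item_themes = {
        "ep1": {"first contact", "sacrifice"},
        "ep2": {"first contact", "time travel"},
        "ep3": {"mutiny", "sacrifice"},
    }
    user_ratings = {"ep1": 5.0, "ep3": 3.0}   # items the user has already rated

    def predict(target, k=2):
        # Rank rated items by theme similarity to the target and average
        # their ratings, weighted by similarity (classical Item-KNN step).
        neighbours = sorted(
            ((theme_similarity(item_themes[target], item_themes[i]), r)
             for i, r in user_ratings.items() if i != target),
            reverse=True)[:k]
        total = sum(s for s, _ in neighbours)
        return sum(s * r for s, r in neighbours) / total if total else None

    print(predict("ep2"))   # predicted rating for an unrated episode
    ```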
  • 7
    Publication Date: 2019
    Description: Military named entity recognition (MNER) is one of the key technologies in military information extraction. Traditional methods for the MNER task rely on cumbersome feature engineering and specialized domain knowledge. In order to solve this problem, we propose a method employing a bidirectional long short-term memory (BiLSTM) neural network with a self-attention mechanism to identify military entities automatically (a model sketch follows this record). We obtain distributed vector representations of the military corpus by unsupervised learning, and the BiLSTM model combined with the self-attention mechanism is adopted to fully capture the contextual information carried by the character vector sequence. The experimental results show that the self-attention mechanism can effectively improve the performance of the MNER task. The F-scores for the identification of military documents and online military texts were 90.15% and 89.34%, respectively, better than those of other models.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
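    As a rough illustration of the architecture named above (a BiLSTM with self-attention over character vectors), here is a minimal PyTorch sketch. The layer sizes, tag count and vocabulary size are assumptions for illustration, not the paper’s settings, and multi-head attention stands in for whichever self-attention variant the authors used.

    ```python
    import torch
    import torch.nn as nn

    class BiLSTMSelfAttentionTagger(nn.Module):
        def __init__(self, vocab_size=5000, emb_dim=100, hidden_dim=128, num_tags=9):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)          # character vectors
            self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                                  bidirectional=True)               # contextual encoder
            self.attn = nn.MultiheadAttention(embed_dim=2 * hidden_dim,
                                              num_heads=4, batch_first=True)
            self.classifier = nn.Linear(2 * hidden_dim, num_tags)   # per-character tags

        def forward(self, char_ids):             # char_ids: (batch, seq_len)
            x = self.embed(char_ids)
            h, _ = self.bilstm(x)
            a, _ = self.attn(h, h, h)            # self-attention over the sequence
            return self.classifier(a)            # (batch, seq_len, num_tags)

    logits = BiLSTMSelfAttentionTagger()(torch.randint(0, 5000, (2, 30)))
    print(logits.shape)  # torch.Size([2, 30, 9])
    ```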
  • 8
    Publication Date: 2019
    Description: In an era of accelerating digitization and advanced big data analytics, harnessing quality data and insights will enable innovative research methods and management approaches. Among others, Artificial Intelligence Imagery Analysis has recently emerged as a new method for analyzing the content of large amounts of pictorial data. In this paper, we provide background information and outline the application of Artificial Intelligence Imagery Analysis for analyzing the content of large amounts of pictorial data. We suggest that Artificial Intelligence Imagery Analysis constitutes a profound improvement over previous methods that have mostly relied on manual work by humans. In this paper, we discuss the applications of Artificial Intelligence Imagery Analysis for research and practice and provide an example of its use for research. In the case study, we employed Artificial Intelligence Imagery Analysis for decomposing and assessing thumbnail images in the context of marketing and media research and show how properly assessed and designed thumbnail images promote the consumption of online videos. We conclude the paper with a discussion on the potential of Artificial Intelligence Imagery Analysis for research and practice across disciplines.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 9
    Publication Date: 2019
    Description: The exorbitant increase in the computational complexity of modern video coding standards, such as High Efficiency Video Coding (HEVC), is a compelling challenge for resource-constrained consumer electronic devices. For instance, the brute force evaluation of all possible combinations of available coding modes and the quadtree-based coding structure in HEVC to determine the optimum set of coding parameters for a given content demands a substantial amount of computational and energy resources. Thus, the resource requirements for real-time operation of HEVC have become a contributing factor towards the Quality of Experience (QoE) of the end users of emerging multimedia and future internet applications. In this context, this paper proposes a content-adaptive Coding Unit (CU) size selection algorithm for HEVC intra-prediction. The proposed algorithm builds content-specific weighted Support Vector Machine (SVM) models in real time during the encoding process, to provide an early estimate of CU size for a given content, avoiding the brute force evaluation of all possible coding mode combinations in HEVC (a toy sketch of such a weighted classifier follows this record). The experimental results demonstrate an average encoding time reduction of 52.38%, with an average Bjøntegaard Delta Bit Rate (BDBR) increase of 1.19% compared to the HM16.1 reference encoder. Furthermore, the perceptual visual quality assessments conducted through the Video Quality Metric (VQM) show minimal visual quality impact on the reconstructed videos of the proposed algorithm compared to state-of-the-art approaches.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
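    To make the weighted-SVM idea above concrete, here is a toy sketch, not the paper’s feature set or training procedure: a binary SVM with per-sample weights decides whether a coding unit should be split, based on two hypothetical texture features. The data and labelling rule are synthetic.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.random((200, 2))                    # hypothetical [variance, gradient] per CU
    y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # 1 = split the CU (synthetic rule)
    w = np.where(y == 1, 2.0, 1.0)              # weight one error type more heavily

    clf = SVC(kernel="rbf", C=1.0)
    clf.fit(X, y, sample_weight=w)              # weighted, content-specific model
    print(clf.predict([[0.2, 0.3], [0.8, 0.7]]))  # early CU-size decision per block
    ```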
  • 10
    Publication Date: 2019
    Description: A crowdsourcing contest is one of the most popular modes of crowdsourcing and is also an important tool for an enterprise to implement open innovation. The solvers’ active participation is one of the major reasons for the success of crowdsourcing contests. Research on solvers’ participation behavior is helpful in understanding the sustainability and incentives of solvers’ participation in online crowdsourcing platforms, so how to attract more solvers to participate and invest more effort is a focus for researchers. Previous studies mainly used the submission quantity to measure solvers’ participation behavior and lacked an effective measure of the degree of effort expended by a solver. For the first time, we use solvers’ participation time as a dependent variable to measure their effort in a crowdsourcing contest, thereby incorporating participation time into research on solver participation. With the data from Taskcn.com, we analyze how participation time is affected by four key factors: task design, task description, task process, and environment. We found that, first, for task design, higher task rewards attract solvers to invest more time in the participation process, and the relationship between participation time and task duration is inverted U-shaped. Second, for task description, the length of the task description has a negative impact on participation time, while a task description attachment positively influences participation time. Third, for the task process, communication and supplementary explanations in a crowdsourcing process positively affect participation time. Fourth, for environmental factors, the task density of the crowdsourcing platform and the market price of all crowdsourcing contests have negative and positive effects on participation time, respectively.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 11
    Publication Date: 2019
    Description: The growing demand for video streaming services increasingly motivates the development of reliable and accurate models for the assessment of Quality of Experience (QoE). In this task, human-related factors, which have a significant influence on QoE, play a crucial role. However, the complexity caused by the multiple effects of those factors on human perception has introduced challenges for contemporary studies. In this paper, we inspect the impact of the human-related factors, namely perceptual factors, memory effect, and the degree of interest. Based on our investigation, a novel QoE model is proposed that effectively incorporates those factors to reflect the user’s cumulative perception. Evaluation results indicate that our proposed model performs excellently in predicting cumulative QoE at any moment within a streaming session.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 12
    Publication Date: 2019
    Description: The main scope of the presented research was the development of an innovative product for the management of city parking lots. Our application will ensure the implementation of the Smart City concept by using computer vision and communication platforms, which enable the development of new integrated digital services. The use of video cameras could simplify and lower the costs of parking lot control. For parking space detection, an aggregated decision is proposed, employing various metrics computed over a sliding-window interval of frames provided by the camera. A history built over 20 images provides an adaptive background model and accurate detection. The system has shown high robustness in two benchmarks, achieving a recognition rate higher than 93%.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 13
    Publication Date: 2019
    Description: The application of blockchain technology to the energy sector promises to derive new operating models focused on local generation and sustainable practices, which are driven by peer-to-peer collaboration and community engagement. However, real-world energy blockchains differ from typical blockchain networks insofar as they must interoperate with grid infrastructure, adhere to energy regulations, and embody engineering principles. Naturally, these additional dimensions make real-world energy blockchains highly dependent on the participation of grid operators, engineers, and energy providers. Although much theoretical and proof-of-concept research has been published on energy blockchains, this research aims to establish a lens on real-world projects and implementations that may inform the alignment of academic and industry research agendas. This research classifies 131 real-world energy blockchain initiatives to develop an understanding of how blockchains are being applied to the energy domain, what type of failure rates can be observed from recently reported initiatives, and what level of technical and theoretical details are reported for real-world deployments. The results presented from the systematic analysis highlight that real-world energy blockchains are (a) growing exponentially year-on-year, (b) producing relatively low failure/drop-off rates (~7% since 2015), and (c) demonstrating information sharing protocols that produce content with insufficient technical and theoretical depth.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 14
    Publication Date: 2019
    Description: The spectral efficiency of wireless networks can be significantly improved by exploiting spatial multiplexing techniques known as multi-user MIMO. These techniques enable the allocation of multiple users to the same time-frequency block, thus reducing the interference between users. There is ample evidence that user groupings can have a significant impact on the performance of spatial multiplexing. The situation is even more complex when the data packets have priority and deadlines for delivery. Hence, combining packet queue management and beamforming can considerably enhance the overall system performance. In this paper, we propose a combination of beamforming and scheduling to improve the overall performance of multi-user MIMO systems in realistic conditions where data packets have both priority and deadlines beyond which they become obsolete. This method, dubbed Reward Per Second (RPS), combines advanced matrix factorization at the physical layer with recently developed queue management techniques. We demonstrate the merits of this technique compared to other state-of-the-art scheduling methods through simulations.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 15
    Publication Date: 2019
    Description: An exemplary paradigm of how an AI can be a disruptive technological paragon via the utilization of blockchain comes straight from the world of deep learning. Data scientists have long struggled to maintain the quality of a dataset for machine learning by an AI entity. Datasets can be very expensive to purchase, as, depending on both the proper selection of the elements and the homogeneity of the data contained within, constructing and maintaining the integrity of a dataset is difficult. Blockchain as a highly secure storage medium presents a technological quantum leap in maintaining data integrity. Furthermore, blockchain’s immutability constructs a fruitful environment for creating high quality, permanent and growing datasets for deep learning. The combination of AI and blockchain could impact fields like Internet of things (IoT), identity, financial markets, civil governance, smart cities, small communities, supply chains, personalized medicine and other fields, and thereby deliver benefits to many people.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 16
    Publication Date: 2019
    Description: With the development of artificial intelligence, machine learning and deep learning algorithms are widely applied in attack detection models. Adversarial attacks against artificial intelligence models become an inevitable problem, yet there is a lack of research on defending cross-site scripting (XSS) attack detection models against such attacks. It is therefore extremely important to design a method that can effectively strengthen the detection model against attack. In this paper, we present a method based on reinforcement learning (called RLXSS), which aims to optimize the XSS detection model to defend against adversarial attacks. First, the adversarial samples of the detection model are mined by an adversarial attack model based on reinforcement learning. Secondly, the detection model and the adversarial model are alternately trained. After each round, the newly mined adversarial samples are marked as malicious samples and are used to retrain the detection model. Experimental results show that the proposed RLXSS model can successfully mine adversarial samples that escape black-box and white-box detection while retaining aggressive features. Moreover, by alternately training the detection model and the adversarial attack model, the escape rate against the detection model is continuously reduced, which indicates that the method can improve the detection model’s ability to defend against attacks.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 17
    Publication Date: 2019
    Description: Presently, we are observing an explosion of data that need to be stored and processed over the Internet, characterized by large volume, velocity and variety. For this reason, software developers have begun to look at NoSQL solutions for data storage. However, operations that are trivial in traditional Relational DataBase Management Systems (DBMSs) can become very complex in NoSQL DBMSs. This is the case for the join operation, which establishes a connection between two or more DB structures and whose construct is not explicitly available in many NoSQL databases. As a consequence, the data model has to be changed or a set of operations has to be performed to address particular queries on data. Thus, open questions are: how do NoSQL solutions work when they have to perform join operations on data that are not natively supported? What is the quality of NoSQL solutions in such cases? In this paper, we deal with such issues specifically considering one of the major NoSQL document-oriented DBs available on the market: MongoDB. In particular, we discuss an approach to perform join operations at the application layer in MongoDB that allows us to preserve data models (a minimal sketch of such an application-layer join follows this record). We analyse the performance of the proposed approach, discussing the introduced overhead in comparison with SQL-like DBs.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
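    As a minimal illustration of a join performed at the application layer with MongoDB (the record above does not publish its exact code), the sketch below uses pymongo with hypothetical "orders" and "customers" collections joined on a "customer_id" field; all names are assumptions.

    ```python
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    db = client["shop"]                          # hypothetical database

    # 1) Fetch the "left" side and collect its join keys.
    orders = list(db.orders.find({}, {"customer_id": 1, "total": 1}))
    keys = {o["customer_id"] for o in orders}

    # 2) Fetch the matching "right" side in one query and index it by key.
    customers = {c["_id"]: c for c in db.customers.find({"_id": {"$in": list(keys)}})}

    # 3) Merge the two result sets in application code; both data models stay intact.
    joined = [{**o, "customer": customers.get(o["customer_id"])} for o in orders]
    print(len(joined))
    ```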
  • 18
    Publication Date: 2019
    Description: The goal of the present research is to contribute to the detection of tax fraud concerning personal income tax returns (IRPF, in Spanish) filed in Spain, through the use of Machine Learning advanced predictive tools, by applying Multilayer Perceptron neural network (MLP) models (an illustrative sketch follows this record). The possibilities springing from these techniques have been applied to a broad range of personal income return data supplied by the Institute of Fiscal Studies (IEF). The use of the neural networks enabled taxpayer segmentation as well as calculation of the probability concerning an individual taxpayer’s propensity to attempt to evade taxes. The results showed that the selected model has an efficiency rate of 84.3%, implying an improvement in relation to other models utilized in tax fraud detection. The proposal can be generalized to quantify an individual’s propensity to commit fraud with regard to other kinds of taxes. These models will support tax offices in arriving at the best decisions regarding action plans to combat tax fraud.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
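    The sketch below is a generic illustration of the kind of model named above: an MLP classifier whose predicted probability is read as a taxpayer’s propensity to evade. It uses randomly generated features, not IEF return data, and the layer sizes are arbitrary.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(42)
    X = rng.normal(size=(1000, 8))              # stand-in return features (synthetic)
    y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=1000) > 1).astype(int)

    model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
    model.fit(X, y)
    propensity = model.predict_proba(X[:5])[:, 1]   # estimated evasion propensity
    print(propensity.round(3))
    ```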
  • 19
    Publication Date: 2019
    Description: Gender role norms have been widely studied in the offline partner violence context. Different studies have indicated that internalizing these norms was associated with dating violence. However, very few research works have analyzed this relation in forms of aggression against partners and former partners using information and communication technologies (ICT). The objective of the present study was to examine the co-occurrence of cyber dating abuse by analyzing the extent to which victimization and perpetration overlap, and by analyzing the differences according to conformity to the masculine gender norms between men who are perpetrators or victims of cyber dating abuse. The participants were 614 male university students, and 26.5% of the sample reported having been a victim and perpetrator of cyber dating abuse. Nonetheless, the regression analyses did not reveal any statistically significant association between conformity to masculine gender norms and practicing either perpetration or victimization by cyber dating abuse.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 20
    Publication Date: 2019
    Description: The rapid development of distributed technology has made it possible to store and query massive trajectory data. As a result, a variety of schemes for big trajectory data management have been proposed. However, the factor of data transmission is not considered in most of these, resulting in a certain impact on query efficiency. In view of that, we present THBase, a coprocessor-based scheme for big trajectory data management in HBase. THBase introduces a segment-based data model and a moving-object-based partition model to solve massive trajectory data storage, and exploits a hybrid local secondary index structure based on Observer coprocessor to accelerate spatiotemporal queries. Furthermore, it adopts certain maintenance strategies to ensure the colocation of relevant data. Based on these, THBase designs node-locality-based parallel query algorithms by Endpoint coprocessor to reduce the overhead caused by data transmission, thus ensuring efficient query performance. Experiments on datasets of ship trajectory show that our schemes can significantly outperform other schemes.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 21
    Publication Date: 2019
    Description: In this paper, we present an analysis of the mining process of two popular assets, Bitcoin and gold. The analysis highlights that Bitcoin, more specifically its underlying technology, is a “safe haven” that allows facing the modern environmental challenges better than gold. Our analysis emphasizes that crypto-currency systems have a social and economic impact much smaller than that of the traditional financial systems. We present an analysis of the several stages needed to produce an ounce of gold and an artificial agent-based market model simulating the Bitcoin mining process and allowing the quantification of Bitcoin mining costs. In this market model, miners validate the Bitcoin transactions using proof of work as the consensus mechanism, get a reward in Bitcoins, sell a fraction of them to cover their expenses, and stay competitive in the market by buying and divesting hardware units and adjusting their expenses by turning their machines off and on according to the signals provided by a technical analysis indicator, the so-called relative strength index (a small sketch of this indicator follows this record).
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
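    The relative strength index mentioned above is a standard technical indicator; the sketch below computes its simple-average variant over an invented price series. The 14-step window and any on/off thresholds miners might compare it against are assumptions, not the model’s calibrated values.

    ```python
    import numpy as np

    def rsi(prices, window=14):
        # Simple-average RSI: ratio of mean gains to mean losses over the window.
        deltas = np.diff(prices)
        gains = np.where(deltas > 0, deltas, 0.0)[-window:]
        losses = np.where(deltas < 0, -deltas, 0.0)[-window:]
        if losses.mean() == 0:
            return 100.0
        rs = gains.mean() / losses.mean()
        return 100.0 - 100.0 / (1.0 + rs)

    prices = np.cumsum(np.random.default_rng(1).normal(0, 50, 200)) + 10_000
    print(round(rsi(prices), 1))   # miners compare this value to on/off thresholds
    ```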
  • 22
    Publication Date: 2019
    Description: In recent years, almost all of the top-performing object detection networks use CNN (convolutional neural network) features; state-of-the-art object detection networks depend on CNN features. In this work, we add feature fusion to the object detection network to obtain a better CNN feature, one that combines deep but semantic and shallow but high-resolution CNN features, thus improving performance on small objects. In addition, an attention mechanism was applied to our object detection network, AF R-CNN (attention mechanism and convolution feature fusion based object detection), to enhance the impact of significant features and weaken background interference. Our AF R-CNN is a single end-to-end network. We choose the pre-trained network VGG-16 to extract CNN features. Our detection network is trained on the PASCAL VOC 2007 and 2012 datasets. Empirical evaluation on the PASCAL VOC 2007 dataset demonstrates the effectiveness and improvement of our approach. Our AF R-CNN achieves an object detection accuracy of 75.9% on PASCAL VOC 2007, six points higher than Faster R-CNN.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 23
    Publication Date: 2019
    Description: Research and development (R&D) are always oriented towards new discoveries, based on original terms or hypotheses, and their concluding outcomes are often uncertain. The present work focused on the degree of uncertainty of R&D activities. In fact, uncertainty makes it difficult to quantify the time and resources needed to achieve a final outcome, create a work plan and budget, and finalize the resulting “innovative” products or services that could be transferred or exchanged in a specific market. The present work attempts to indicate the degree of uncertainty of the research activities developed by a set of firms. The method used aimed to quantify the five criteria defined by the Frascati Manual. Through the creation of an uncertainty cloud, a cone of uncertainty was defined following an approach based on project management. The evaluation grid was characterized by the decomposition of the different variables into quartiles, which allowed for the detection of the evolution of the project and each of its components. The ancillary aim was also to observe the degree of development of these industries towards an Industry 4.0 framework.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 24
    Publication Date: 2019
    Description: Conversational agents are reshaping our communication environment and have the potential to inform and persuade in new and effective ways. In this paper, we present the underlying technologies and the theoretical background behind a health-care platform dedicated to supporting medical staff and individuals with movement disabilities and to providing advanced monitoring functionalities in hospital and home surroundings. The framework implements an intelligent combination of two research areas: (1) sensor- and camera-based monitoring to collect, analyse, and interpret people’s behaviour and (2) natural machine–human interaction through an apprehensive virtual assistant benefiting ailing patients. In addition, the framework serves as an important assistant to caregivers and clinical experts to obtain information about the patients in an intuitive manner. The proposed approach capitalises on the latest breakthroughs in computer vision, sensor management, speech recognition, natural language processing, knowledge representation, dialogue management, semantic reasoning, and speech synthesis, combining medical expertise and patient history.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 25
    Publication Date: 2019
    Description: Network Function Virtualization (NFV) has revolutionized the way network services are offered to end users. Individual network functions are decoupled from expensive and dedicated middleboxes and are now provided as software-based virtualized entities called Virtualized Network Functions (VNFs). NFV is often complemented with the Cloud Computing paradigm to provide networking functions to enterprise customers and end-users remote from their premises. NFV along with Cloud Computing has also started to be seen in Internet of Things (IoT) platforms as a means to provide networking functions to the IoT traffic. The intermix of IoT, NFV, and Cloud technologies, however, is still in its infancy creating a rich and open future research area. To this end, in this paper, we propose a novel approach to facilitate the placement and deployment of service chained VNFs in a network cloud infrastructure that can be extended using the Mobile Edge Computing (MEC) infrastructure for accommodating mission critical and delay sensitive traffic. Our aim is to minimize the end-to-end communication delay while keeping the overall deployment cost to minimum. Results reveal that the proposed approach can significantly reduce the delay experienced, while satisfying the Service Providers’ goal of low deployment costs.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 26
    Publication Date: 2019
    Description: Network Function Virtualization is a new technology allowing for elastic cloud and bandwidth resource allocation. The technology requires an orchestrator whose role is service and resource orchestration. It receives service requests, each one characterized by a Service Function Chain (SFC), which is a set of service functions to be executed according to a given order. It implements an algorithm for deciding both where to allocate the cloud and bandwidth resources and how to route the SFCs. In a traditional orchestration algorithm, the orchestrator has detailed knowledge of the cloud and network infrastructures, which can lead to high computational complexity of the SFC Routing and Cloud and Bandwidth resource Allocation (SRCBA) algorithm. In this paper, we propose and evaluate the effectiveness of a scalable orchestration architecture inherited from the one proposed within the European Telecommunications Standards Institute (ETSI) and based on the functional separation of an NFV orchestrator into a Resource Orchestrator (RO) and a Network Service Orchestrator (NSO). Each cloud domain is equipped with an RO whose task is to provide a simple and abstract representation of the cloud infrastructure. These representations are notified to the NSO, which can then apply a simplified and less complex SRCBA algorithm. In addition, we show how segment routing technology can help to simplify SFC routing by means of an effective addressing of the service functions. The scalable orchestration solution has been investigated and compared to a traditional orchestrator in several network scenarios with varying numbers of cloud domains. We have verified that the execution time of the SRCBA algorithm can be drastically reduced without degrading performance in terms of cloud and bandwidth resource costs.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 27
    Publication Date: 2019
    Description: Texture evaluation is manually performed in general, and such analytical tasks can get cumbersome. In this regard, a neural network model is employed in this study. This paper describes a system that can estimate the food texture of snacks. The system comprises a simple equipment unit and an artificial neural network model. The equipment simultaneously examines the load and sound when a snack is pressed. The neural network model analyzes the load change and sound signals and then outputs a numerical value within the range (0,1) to express the level of textures such as “crunchiness” and “crispness”. Experimental results validate the model’s capacity to output moderate texture values of the snacks. In addition, we applied the convolutional neural network (CNN) model to classify snacks and the capability of the CNN model for texture estimation is discussed.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 28
    Publication Date: 2019
    Description: Gamification, the use of game design elements in applications that are not games, has been developed to provide attractive environments and maintain user interest in several domains. In domains such as education, marketing and health, where gamification techniques are applied, user engagement in applications has increased. In these applications, the protection of users’ privacy is an important aspect to consider, because the applications keep a record of their users’ personal information. Thus, the purpose of this paper is to identify whether applications where gamification is applied respect users’ privacy. To accomplish this aim, two main steps were implemented. Since the main principle of gamification is the existence of game elements, the first step was to identify the set of game elements recorded in the literature that are commonly applied in various applications. Afterwards, an examination of the relationship between these elements and privacy requirements was carried out in order to identify which elements conflict with the privacy requirements, leading to potential privacy violations, and which elements do not. A conceptual model was designed according to the results of this examination, presenting how elements conflict with requirements. Based on the results, there are indeed game elements which can lead to privacy violations. The results of this work provide valuable guidance to software developers, especially during the design stages of gamified applications, since they help developers consider the protection of users’ privacy in parallel from the early stages of application development onwards.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 29
    Publication Date: 2019
    Description: Most industrial and SCADA-like (supervisory control and data acquisition) systems use proprietary communication protocols, and hence interoperability is not achieved. However, MODBUS TCP is an open de facto standard and is used in some automation and telecontrol systems. It is based on a polling mechanism and follows the synchronous request–response pattern, as opposed to the asynchronous publish–subscribe pattern. In this study, polling-based and event-based protocols are investigated to realize an open and interoperable Industrial Internet of Things (IIoT) environment. Many Internet of Things (IoT) protocols are introduced and compared, and the message queuing telemetry transport (MQTT) is chosen as the event-based, publish–subscribe protocol. The study shows that MODBUS defines an optimized message structure in the application layer, which is dedicated to industrial applications. In addition, it shows that an event-oriented IoT protocol complements the MODBUS TCP but cannot replace it. Therefore, two scenarios are proposed to build the IIoT environment. The first scenario is to consider the MODBUS TCP as an IoT protocol, and build the environment using the MODBUS TCP on a standalone basis. The second scenario is to use MQTT in conjunction with the MODBUS TCP. The first scenario is efficient and complies with most industrial applications where only the request–response pattern is needed. If the publish–subscribe pattern is needed, the MQTT in the second scenario complements the MODBUS TCP and eliminates the need for a gateway; however, MQTT lacks interoperability. To maintain a homogeneous message structure for the entire environment, industrial data are organized using the structure of MODBUS messages, formatted in UTF-8, and then transferred in the payload of an MQTT publish message (a minimal sketch of this second scenario follows this record). The open and interoperable environment can be used for Internet SCADA, Internet-based monitoring, and industrial control systems.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
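    A minimal sketch of the second scenario described above, under assumed names: a MODBUS-like register reading is kept in its own message structure, serialized as UTF-8, and carried in the payload of an MQTT publish. The broker address, topic and register values are illustrative, and the JSON layout is only one possible UTF-8 encoding of a MODBUS-style message.

    ```python
    import json
    import paho.mqtt.publish as publish   # one-shot publish helper from paho-mqtt

    modbus_like_message = {
        "unit_id": 1,
        "function_code": 3,          # read holding registers
        "starting_address": 100,
        "register_values": [220, 221, 219],
    }

    payload = json.dumps(modbus_like_message).encode("utf-8")   # UTF-8 payload
    publish.single("plant/line1/modbus", payload, qos=1,
                   hostname="broker.example.local")
    ```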
  • 30
    Publication Date: 2019
    Description: The promising advancements in the telecommunications and automotive sectors over the years have empowered drivers with highly innovative communication and sensing capabilities, in turn paving the way for the next-generation connected and autonomous vehicles. Today, vehicles communicate wirelessly with other vehicles and vulnerable pedestrians in their immediate vicinity to share timely safety-critical information primarily for collision mitigation. Furthermore, vehicles connect with the traffic management entities via their supporting network infrastructure to become more aware of any potential hazards on the roads and for guidance pertinent to their current and anticipated speeds and travelling course to ensure more efficient traffic flows. Therefore, a secure and low-latency communication is highly indispensable in order to meet the stringent performance requirements of such safety-critical vehicular applications. However, the heterogeneity of diverse radio access technologies and inflexibility in their deployment results in network fragmentation and inefficient resource utilization, and these, therefore, act as bottlenecks in realizing the aims for a highly efficient vehicular networking architecture. In order to overcome such sorts of bottlenecks, this article brings forth the current state-of-the-art in the context of intelligent transportation systems (ITS) and subsequently proposes a software-defined heterogeneous vehicular networking (SDHVNet) architecture for ensuring a highly agile networking infrastructure to ensure rapid network innovation on-demand. Finally, a number of potential architectural challenges and their probable solutions are discussed.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 31
    Publication Date: 2019
    Description: Information-centric networking (ICN) integrates by design a pull-based model, which brings advantages in terms of control as well as in-network caching strategies. Currently, ICN’s main areas of action concern content distribution and the IoT, both of which are environments that often require support for periodic and event-triggered data transmission. Such environments can benefit from push-based communication to achieve faster data forwarding. This paper provides an overview of the current push-based mechanisms that can be applied to information-centric paradigms, explaining the trade-offs associated with the different approaches. Moreover, the paper provides design guidelines for integrating push communications in information-centric networking, taking as an example the application of this networking architecture in IoT environments.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 32
    Publication Date: 2019
    Description: The software defined networking (SDN) paradigm separates the control plane from the data plane, where an SDN controller receives requests from its connected switches and manages the operation of the switches under its control. Reassignments between switches and their controllers are performed dynamically, in order to balance the load over SDN controllers. In order to perform load balancing, most dynamic assignment solutions use a central element to gather information and requests for the reassignment of switches. Increasing the number of controllers causes a scalability problem when one super controller is used for all controllers and gathers information from all switches. In a large network, the distances between the controllers are sometimes a constraint on assigning switches to them. In this paper, a new approach is presented to solve the well-known load balancing problem in the SDN control plane. This approach imposes less load on the central element while meeting the maximum distance constraint allowed between controllers. An architecture with two levels of load balancing is defined. At the top level, the main component, called the Super Controller, arranges the controllers in clusters, so that there is a balance between the loads of the clusters. At the bottom level, in each cluster there is a dedicated controller called the Master Controller, which performs a reassignment of the switches in order to balance the loads between the controllers. We provide a two-phase algorithm, called the Dynamic Controllers Clustering algorithm, for the top level of the load balancing operation. The load balancing operation takes place at regular intervals. The length of the cycle in which the operation is performed can be shorter, since the top-level operation can run independently of the bottom-level operation. Shortening the cycle time allows for more accurate load balancing results. Theoretical analysis demonstrates that our algorithm provides a near-optimal solution. Simulation results show that our dynamic clustering improves fixed clustering by a multiplicative factor of 5.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 33
    Publication Date: 2019
    Description: The idea and perception of good cyber security protection remains at the forefront of many organizations’ information and communication technology strategy and investment. However, delving deeper into the details of its implementation reveals that organizations’ human capital cyber security knowledge bases are very low. In particular, the lack of social engineering awareness is a concern in the context of human cyber security risks. This study highlights pitfalls and ongoing issues that organizations encounter in the process of developing the human knowledge to protect from social engineering attacks. A detailed literature review is provided to support these arguments with analysis of contemporary approaches. The findings show that despite state-of-the-art cyber security preparations and trained personnel, hackers are still successful in their malicious acts of stealing sensitive information that is crucial to organizations. The factors influencing users’ proficiency in threat detection and mitigation have been identified as business environmental, social, political, constitutional, organizational, economical, and personal. Challenges with respect to both traditional and modern tools have been analyzed to suggest the need for profiling at-risk employees (including new hires) and developing training programs at each level of the hierarchy to ensure that the hackers do not succeed.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 34
    Publication Date: 2019
    Description: This is a study of the way in which YouTubers’ media metrics influence the effect of their one-sided messages (1SMs) and two-sided messages (2SMs), providing theoretical explanations based on the elaboration likelihood model. Its main objective is the proposition and testing of: (i) the interaction effect between type of message and media metrics of the YouTuber on customers’ responses, and (ii) the moderation of individuals’ conformity intention for the interaction effect between type of message and media metrics on customers’ responses. The results of an experiment showed that high YouTubers’ media metrics have more effect for 1SMs and less effect for 2SMs. Additionally, conformity intention moderates the effect of the interaction type of message X media metrics. A high level of conformity intention neutralizes the interaction effect between YouTubers’ media metrics and message sidedness. This study makes a theoretical contribution to research into online content and information use, providing explanations of how media metrics of a vlog influence the effect of two types of messages.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 35
    Publication Date: 2019
    Description: Set-valued database publication has been growing in importance recently due to its benefits for various applications such as marketing analysis and advertising. However, publishing a raw set-valued database may cause individual privacy breaches, such as the leakage of sensitive information like personal tendencies, when data recipients perform data analysis. Even though applying data anonymization methods such as suppression-based methods and random data swapping to such a database can successfully hide personal tendencies, it induces item loss from records and causes significant distortion in record structure that degrades database utility. To avoid these problems, we propose a method based on a swapping technique in which an individual’s items in a record are swapped with items of another record. Our swapping technique is distinct from the existing one, called random data swapping, which yields considerable structural distortion. Even though the technique results in inaccuracy at the record level, it preserves every single item in the database from loss. Thus, data recipients may obtain all the item information in an anonymized database. In addition, by carefully selecting pairs of records for item swapping, we can avoid excessive record structure distortion that would alter database content immensely. More importantly, such a strategy allows one to successfully hide personal tendencies without sacrificing much database utility.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 36
    Publication Date: 2019
    Description: Software-defined networking (SDN) is an innovative architecture that designs a logical controller to manage and program the network based on a global view, providing more efficient management, better performance, and higher flexibility for the network. Therefore, applying the SDN concept in a multi-hop wireless network (MWN) has been proposed and extensively studied to overcome the challenges of MWN. In this paper, we propose an energy-efficient global routing algorithm for a software-defined multi-hop wireless network (SDMWN), which computes transmission paths for several users at the same time to minimize the global energy consumption while satisfying the QoS required by users. To this end, we first propose a combined Lagrange relaxation-based aggregated cost (LARAC) and K-Dijkstra algorithm to obtain the top K energy-minimum paths that satisfy the QoS in polynomial time (a simplified constrained K-shortest-paths sketch follows this record). Then, we combine the alternative paths of each user obtained by K-LARAC and propose an improved genetic algorithm to solve for the global routing strategy. The simulation results show that the proposed combined K-LARAC and genetic algorithm method is able to obtain an approximately optimal solution at a lower time cost.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
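    The sketch below is not the paper’s K-LARAC algorithm; it is a much simpler stand-in that conveys the same goal using networkx: rank candidate paths by energy cost and keep the first K whose total delay stays within a QoS bound. The graph, its energy and delay weights, and the bound are invented.

    ```python
    import networkx as nx

    G = nx.Graph()
    G.add_weighted_edges_from(
        [("a", "b", 2), ("b", "d", 2), ("a", "c", 1), ("c", "d", 4), ("b", "c", 1)],
        weight="energy")
    nx.set_edge_attributes(G, {e: 1.0 for e in G.edges}, name="delay")

    def k_energy_min_paths(g, src, dst, k, delay_bound):
        feasible = []
        # Enumerate simple paths in order of increasing energy cost.
        for path in nx.shortest_simple_paths(g, src, dst, weight="energy"):
            delay = sum(g[u][v]["delay"] for u, v in zip(path, path[1:]))
            if delay <= delay_bound:
                feasible.append(path)
            if len(feasible) == k:
                break
        return feasible

    print(k_energy_min_paths(G, "a", "d", k=2, delay_bound=3.0))
    ```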
  • 37
    Publication Date: 2019
    Description: In recent years, gesture recognition has been used in many fields, such as games, robotics and sign language recognition. Human computer interaction (HCI) has been significantly improved by the development of gesture recognition, and gesture recognition in video is now an important research direction. Because each kind of neural network structure has its own limitations, we propose a neural network with alternate fusion of a 3D CNN and ConvLSTM, which we call the Multiple extraction and Multiple prediction (MEMP) network. The main feature of the MEMP network is that it extracts and predicts the temporal and spatial feature information of gesture video multiple times, which enables us to obtain a high accuracy rate. In the experimental part, three data sets (LSA64, SKIG and Chalearn 2016) are used to verify the performance of the network. Our approach achieved high accuracy on those data sets. On LSA64, the network achieved an identification rate of 99.063%. On SKIG, the network obtained recognition rates of 97.01% and 99.02% on the RGB part and the RGB-depth part, respectively. On Chalearn 2016, the network achieved recognition rates of 74.57% and 78.85% on the RGB part and the RGB-depth part, respectively.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 38
    Publication Date: 2019
    Description: The purpose of this Special Issue is to collect current developments and future directions of Future Intelligent Systems and Networks [...]
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
  • 39
    Publication Date: 2019
    Description: The emerging connected and autonomous vehicles (CAVs) challenge ad hoc wireless multi-hop communications through mobility, large scale, and new data acquisition and computing patterns. Named Data Networking (NDN) is suitable for such vehicular ad hoc networks due to its information-centric networking approach. However, flooding interest packets in ad hoc NDN can lead to a broadcast storm problem. Existing solutions either increase the number of redundant interest packets or require global knowledge about data producers. In this paper, a Location-Based Deferred Broadcast (LBDB) scheme is introduced to improve the efficiency and performance of interest broadcast in ad hoc NDN. The scheme takes advantage of location information to set up timers when rebroadcasting an interest (a simplified sketch of such a timer follows this record). The LBDB is implemented in the V-NDN network architecture using the ndnSIM simulator. Comparisons with several existing protocols are conducted in simulation. The results show that LBDB reduces the overhead, the average number of hops, and the delay while maintaining the average satisfaction ratio when compared with several other broadcast schemes. The improvement can help offer timely data acquisition for quick responses in emergent CAV application situations.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
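    The core of a location-based deferred broadcast is that nodes farther from the previous sender wait less before rebroadcasting, so they win the race and suppress redundant copies. The sketch below illustrates that timer idea only; the exact timer formula, constants, and suppression rules used by LBDB in the paper may differ (the names and parameters here are assumptions).
    ```python
    import math
    import random

    def rebroadcast_delay(my_pos, sender_pos, max_range=300.0, max_delay_s=0.02):
        """Deferred-broadcast timer: the farther a node is from the previous sender,
        the shorter it waits, plus a small random jitter to break ties."""
        d = min(math.dist(my_pos, sender_pos), max_range)
        return max_delay_s * (1.0 - d / max_range) + random.uniform(0.0, 0.001)

    # a far node defers less than a near node, so it rebroadcasts first
    print(rebroadcast_delay((250.0, 0.0), (0.0, 0.0)))  # short delay
    print(rebroadcast_delay((20.0, 0.0), (0.0, 0.0)))   # long delay
    ```
    In a full implementation, a node would cancel its pending rebroadcast if it overhears the same interest again before its timer expires.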
  • 40
    Publication Date: 2019
    Description: This study develops a contingent mediation model to investigate whether user perception enhances customer stickiness through emotional connection, and to further assess whether this mediating effect varies with adaptivity. A moderated mediation approach is adopted to test the hypotheses. Findings reveal that emotional connection mediates the link between perceived usefulness and customer stickiness, but this mediation is not moderated by adaptivity. On the other hand, the results show that the relationship between perceived ease of use and customer stickiness is not mediated by emotional connection; however, after considering the moderating effect, our results show that moderated mediation does exist.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 41
    Publication Date: 2019
    Description: With the rise of the Internet of Things (IoT), applications have become smarter, and connected devices give rise to their exploitation in all aspects of a modern city. As the volume of the collected data increases, Machine Learning (ML) techniques are applied to further enhance the intelligence and the capabilities of an application. The field of smart transportation has attracted many researchers and has been approached with both ML and IoT techniques. In this review, smart transportation is considered to be an umbrella term that covers route optimization, parking, street lights, accident prevention/detection, road anomalies, and infrastructure applications. The purpose of this paper is to provide a self-contained review of ML techniques and IoT applications in Intelligent Transportation Systems (ITS), obtain a clear view of the trends in the aforementioned fields, and spot possible coverage needs. From the reviewed articles it becomes apparent that there is a possible lack of ML coverage for smart lighting systems and smart parking applications. Additionally, route optimization, parking, and accident prevention/detection tend to be the most popular ITS applications among researchers.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 42
    Publication Date: 2019
    Description: Human mobility is a key element in the understanding of epidemic spreading. Thus, correctly modeling and quantifying human mobility is critical for studying large-scale spatial transmission of infectious diseases and improving epidemic control. In this study, a large-scale agent-based transport simulation (MATSim) is linked with a generic epidemic spread model to simulate the spread of communicable diseases in an urban environment. The use of an agent-based model allows reproduction of the real-world behavior of individuals' daily paths in an urban setting and captures the interactions among them in the form of a spatial-temporal social network. This model is used to study seasonal influenza outbreaks in the metropolitan area of Zurich, Switzerland. The observations of the agent-based model are compared with results from classical SIR models (a minimal SIR sketch follows this record). The model presented is a prototype that can be used to analyze multiple scenarios of disease spread at an urban scale, considering variations of different model parameter settings. The results of this simulation can help to improve comprehension of disease spread dynamics and to take better steps towards the prevention and control of an epidemic.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
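    For reference, the classical SIR model used as a baseline above can be stated in a few lines: susceptible individuals become infectious at rate beta*S*I/N and infectious individuals recover at rate gamma*I. The discrete-time sketch below is a generic textbook SIR, not the calibrated epidemic model linked to MATSim in the paper; the parameter values are illustrative.
    ```python
    def sir_simulate(N, I0, beta, gamma, days):
        """Discrete-time SIR: returns daily (S, I, R) trajectories."""
        S, I, R = N - I0, float(I0), 0.0
        history = [(S, I, R)]
        for _ in range(days):
            new_infections = beta * S * I / N
            new_recoveries = gamma * I
            S -= new_infections
            I += new_infections - new_recoveries
            R += new_recoveries
            history.append((S, I, R))
        return history

    # toy outbreak with basic reproduction number R0 = beta / gamma = 1.5
    for day, (S, I, R) in enumerate(sir_simulate(N=100000, I0=10, beta=0.3, gamma=0.2, days=120)):
        if day % 30 == 0:
            print(day, round(S), round(I), round(R))
    ```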
  • 43
    Publication Date: 2019
    Description: An increasing number of IoT-for-development deployments use the LoRaWAN protocol, as many of them leverage the free network and application servers provided by The Things Network (TTN) to fulfill their needs. Unfortunately, in some countries in Sub-Saharan Africa and South Asia, Internet access cannot be taken for granted; therefore, TTN might not be available. Moreover, low-cost and low-power devices are the most sustainable option. In this paper, we propose a LoRaWAN network with autonomous base stations that can work without Internet connectivity for essential services, while being able to provide additional features whenever Internet access becomes available, even in an intermittent fashion. Security and privacy are preserved, with support for mobile nodes.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 44
    Publication Date: 2019
    Description: Electronic purchasing or e-procurement saves millions of dollars yearly in transaction costs. E-procurement helps to cut down the supplier base, promotes paperless transactions, and increases transparency and accountability in the procurement process. Nonetheless, studies report that around 80% of e-procurement initiatives have failed to achieve the desired results. Although studies to better understand the Critical Success Factors (CSFs) of e-procurement implementation involving various industries have been on the rise, little is known about architecture, engineering, and construction (AEC) practices, which has led to limited development of pragmatic frameworks to uncover these factors. Thus, this study aims to identify those CSFs (predicting variables) which significantly contribute to e-procurement implementation success in the construction sector and to put forward recommendations for better implementation. Results from multiple regression analysis revealed five factors to be statistically significant predictors of success. Three factors were determined to be predictors of user satisfaction. Finally, internet online procurement frameworks were developed for the success of e-procurement implementation in the construction sector.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 45
    Publication Date: 2019
    Description: Social network service (SNS) information has benefited many individuals. However, as such information has increased exponentially, the number of SNS users has increased dramatically and negative effects of SNSs on users have emerged. Many SNS users experience negative psychological conditions such as fatigue, burnout, and stress. Thus, in this study, we investigated the SNS and user characteristics that affect SNS fatigue, living disorder, and reduced SNS use intention. We developed a research model to examine the impact of two SNS characteristics (irrelevant information overload and open reachability) and two user characteristics (engagement and maintaining self-reputation) on SNS fatigue. We also examined the role of the experience of privacy violations in the relationship between living disorder and reduced SNS use intention. We collected data from 579 SNS users and created a partial least squares structural equation model to test the hypotheses. The results of the analysis showed that three factors, other than open reachability, positively affected SNS fatigue. Furthermore, we found that SNS fatigue significantly affected living disorder and reduced SNS use intention, and that experience of privacy violations significantly affected the relationship between living disorder and reduced SNS use intention. These results expand our understanding of SNS fatigue and users’ negative behaviors.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 46
    Publication Date: 2019
    Description: Disaster events and their economic impacts are trending, and climate projection studies suggest that the risks of disaster will continue to increase in the near future. Despite the broad and increasing social effects of these events, the empirical basis of disaster research is often weak, partially due to the natural paucity of observed data. At the same time, some of the early research regarding social responses to disasters has become outdated as social, cultural, and political norms have changed. The digital revolution, the open data trend, and the advancements in data science provide new opportunities for social science disaster research. We introduce the term computational social science of disasters (CSSD), which can be formally defined as the systematic study of the social behavioral dynamics of disasters utilizing computational methods. In this paper, we discuss and showcase the opportunities and the challenges in this new approach to disaster research. Following a brief review of the fields that relate to CSSD, namely traditional social sciences of disasters, computational social science, and crisis informatics, we examine how advances in Internet technologies offer a new lens through which to study disasters. By identifying gaps in the literature, we show how this new field could address ways to advance our understanding of the social and behavioral aspects of disasters in a digitally connected world. In doing so, our goal is to bridge the gap between data science and the social sciences of disasters in rapidly changing environments.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 47
    Publication Date: 2019
    Description: This article addresses the question of passengers' experience across different transport modes. It presents the main results of a pilot study in which the stress levels experienced by a traveler were assessed and predicted over two long journeys. Accelerometer measures and several physiological signals (electrodermal activity, blood volume pulse and skin temperature) were recorded using a smart wristband while travelling from Grenoble to Bilbao. Based on the user's feedback, three events of high stress and one period of moderate activity with low stress were identified offline. Over these periods, feature extraction and machine learning were performed on the collected sensor data to build a personalized regression model with the user's stress levels as output (a minimal feature-extraction and regression sketch follows this record). A smartphone application was developed on this basis to record and visualize the estimated stress level in a timely manner from the traveler's physiological signals. This setting was put to the test during another journey, from Grenoble to Brussels, where the same user's stress levels were predicted in real time by the smartphone application. The proportion of correctly classified stress-less time windows ranged from 92.6% to 100%, depending on the participant's level of activity. By design, this study represents a first step towards real-life, ambulatory monitoring of passengers' stress while travelling.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
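    The pipeline described above (windowed features from wearable signals feeding a personalized regression model) can be illustrated with plain NumPy. The window length, the choice of mean/standard-deviation features, and the synthetic data below are assumptions for illustration; the paper's actual features, signals, and model may differ.
    ```python
    import numpy as np

    def window_features(signal, fs_hz, window_s=10):
        """Mean and standard deviation per non-overlapping window of a 1-D signal
        (e.g., electrodermal activity sampled at fs_hz)."""
        win = int(fs_hz * window_s)
        n = len(signal) // win
        chunks = np.asarray(signal[: n * win]).reshape(n, win)
        return np.column_stack([chunks.mean(axis=1), chunks.std(axis=1)])

    rng = np.random.default_rng(0)
    eda = rng.normal(0.5, 0.1, size=4 * 600)          # 10 minutes of a 4 Hz toy signal
    X = window_features(eda, fs_hz=4)
    X = np.column_stack([np.ones(len(X)), X])          # intercept term
    y = rng.uniform(0.0, 1.0, size=len(X))             # placeholder per-window stress ratings
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)       # personalized linear regression fit
    print("fitted coefficients:", coef)
    ```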
  • 48
    Publication Date: 2019
    Description: In today's world, ruled by a great amount of data and mobile devices, cloud-based systems are spreading all over. This phenomenon increases the number of connected devices, the broadcast bandwidth, and the information exchanged. These fine-grained interconnected systems, which enable Internet connectivity for an extremely large number of facilities (far beyond the current number of devices), go by the name of the Internet of Things (IoT). In this scenario, the operating time of a mobile device depends on its battery capacity, the number of operations performed per cycle, and the amount of exchanged data. Since the transmission of data to a central cloud is a very energy-hungry operation, new computational paradigms have been introduced: the computation is no longer performed entirely in the cloud, distributing the power load among the nodes of the system, and data are compressed to reduce transmission power requirements. In the edge-computing paradigm, part of the computational power is moved toward the data collection sources, and only after a first elaboration are the collected data sent to the central cloud server. Indeed, the “edge” term refers to the extremities of systems represented by IoT devices. This survey paper presents the hardware architectures of typical IoT devices and sums up many of the low-power techniques which make them appealing for a wide range of applications. An overview of the newest research topics is given, together with a final example of a complete functioning system embedding all the introduced features.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 49
    Publication Date: 2019
    Description: Football clubs can be considered global brands, and exactly like any other brand, they need to face the challenge of adapting to digital communications. Nevertheless, communication sciences research in this field is scarce, so the main purpose of this work is to analyze the digital communication of the main football clubs in Europe in order to identify and describe the strategies they follow to make themselves known on the internet and to interact with their users. Specifically, the article studies the characteristics of the web pages—considered the main showcase of a brand/team in the digital environment—of the fifteen best teams in the UEFA ranking, to establish what type of structure and what online communication resources they use. Through a descriptive and comparative analysis, the study concludes, among other aspects, that the management of communication is effective, but it also warns that none of the analyzed teams takes full advantage of the possibilities of interaction with the user offered by the digital scenario.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 50
    Publication Date: 2019
    Description: The rapid rise and implementation of Smart Systems (i.e., multi-functional observation and platform systems that depict settings and/or identify situations or features of interest, often in real time) has inversely paralleled and readily exposed the reduced capacity of human and societal systems to effectively respond to environmental hazards. This overarching review and essay explores the complex set of interactions found among Smart, Societal, and Environmental Systems. It considers the rise in poorly performing responses to environmental hazards that has occurred despite best practices, detailed forecast information, and the use and application of real-time in situ observational platforms. The application of Smart Systems, relevant architectures, and the ever-increasing number of applications and tools developed by individuals as they interact with Smart Systems offer a means to ameliorate and resolve the confounding found among all of the interdependent systems. The interactions of human systems with environmental hazards further expose society's complex operational vulnerabilities and gaps in response to such threats. An examination of decision-making; the auto-reactive nature of responses before, during, and after environmental hazards; and the lack of scalability and comparability is presented with regard to the prospects of applying probabilistic methods, cross-scale time and space domains, anticipated impacts, and the need to account for multimodal actions and reactions, including psycho-social contributions. Assimilation of these concepts and principles in Smart System architectures, applications, and tools is essential to ensure future viability and functionality with regard to environmental hazards and to produce an effective set of societal engagement responses. Achieving the promise of Smart Systems relative to environmental hazards will require an extensive transdisciplinary approach that ties psycho-social behaviors directly to non-human components and systems in order to close actionable gaps in response. Pathways to achieve a more comprehensive understanding are given for consideration by the wide diversity of disciplines necessary to move forward in Smart Systems as tied with the societal response to environmental hazards.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 51
    Publication Date: 2019
    Description: Social network services for self-media, such as Weibo, Blog, and WeChat Public, constitute a powerful medium that allows users to publish posts every day. Due to insufficient information transparency, malicious marketing on the Internet through self-media posts imposes potential harm on society. Therefore, it is necessary to identify news with marketing intentions in daily life. We follow the idea of text classification to identify marketing intentions. Although existing methods address intention detection, the challenge is how to make the text feature extraction reflect semantic information and how to improve the time and space complexity of the recognition model. To this end, this paper proposes a machine learning method to identify marketing intentions from large-scale We-Media data. First, the proposed Latent Semantic Indexing (LSI)-Word2vec model can reflect the semantic features. Second, the decision tree model is simplified by decision tree pruning to save computing resources and reduce the time complexity. Finally, the paper examines the effects of classifier combinations and uses the optimal configuration to help people efficiently identify marketing intention. A detailed experimental evaluation on several metrics shows that our approach is effective and efficient: the F1 value can be increased by about 5%, and the running time is increased by 20%, which shows that the newly proposed method can effectively improve the accuracy of marketing news recognition. A minimal text-classification sketch in the same spirit follows this record.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
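    The pipeline sketched above (latent semantic features followed by a pruned decision tree) can be approximated with scikit-learn: TF-IDF plus a truncated SVD gives an LSI-style latent representation, and cost-complexity pruning (ccp_alpha) simplifies the tree. This is a generic stand-in under assumed toy data, not the paper's LSI-Word2vec model, and the Word2vec component is omitted.
    ```python
    from sklearn.pipeline import make_pipeline
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.tree import DecisionTreeClassifier

    posts = [
        "limited offer, click the link now to win a free phone",
        "flash sale today only, huge discount on new products",
        "city council publishes the new public transit schedule",
        "local museum opens a photography exhibition this weekend",
    ]
    labels = [1, 1, 0, 0]  # 1 = marketing intention (toy labels for illustration)

    model = make_pipeline(
        TfidfVectorizer(),                       # term weighting
        TruncatedSVD(n_components=2),            # LSI-style latent semantic projection
        DecisionTreeClassifier(ccp_alpha=0.01),  # cost-complexity pruning simplifies the tree
    )
    model.fit(posts, labels)
    print(model.predict(["win big, click here for a limited discount"]))
    ```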
  • 52
    Publication Date: 2019
    Description: Multi-access edge computing (MEC) brings high-bandwidth and low-latency access to applications distributed at the edge of the network. Data transmission and exchange become faster, and the overhead of task migration between mobile devices and the edge cloud becomes smaller. In this paper, we adopt a fine-grained task migration model. At the same time, in order to further reduce the delay and energy consumption of task execution, the concept of the task cache is proposed, which involves caching completed tasks and related data on the edge cloud. Then, taking into account the limited cache capacity of the edge cloud, we study the task caching strategy and the fine-grained task migration strategy on the edge cloud using a genetic algorithm (GA). Thus, we obtain the mobile-device task migration strategy that minimizes energy consumption together with the optimal cache on the edge cloud (a minimal GA sketch for the caching decision follows this record). The simulation results showed that the task caching strategy based on fine-grained migration can greatly reduce the energy consumption of mobile devices in the MEC environment.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
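    A genetic algorithm over binary caching decisions is one way to read the strategy described above: each chromosome marks which tasks are cached at the edge, fitness rewards the energy saved, and solutions that exceed the cache capacity are penalized. The sketch below is a generic GA under these assumptions (population size, penalty weight, and operators are illustrative, not the paper's configuration).
    ```python
    import random

    def fitness(chrom, sizes, savings, capacity):
        """Total energy saving of cached tasks, penalized when cache capacity is exceeded."""
        used = sum(s for s, bit in zip(sizes, chrom) if bit)
        gain = sum(g for g, bit in zip(savings, chrom) if bit)
        return gain - (1000.0 * (used - capacity) if used > capacity else 0.0)

    def ga_cache(sizes, savings, capacity, pop_size=30, generations=100, p_mut=0.05):
        n = len(sizes)
        pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda c: fitness(c, sizes, savings, capacity), reverse=True)
            parents = pop[: pop_size // 2]
            children = []
            while len(parents) + len(children) < pop_size:
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, n)
                child = a[:cut] + b[cut:]                                        # one-point crossover
                child = [1 - g if random.random() < p_mut else g for g in child]  # bit-flip mutation
                children.append(child)
            pop = parents + children
        return max(pop, key=lambda c: fitness(c, sizes, savings, capacity))

    # toy instance: per-task cache size (MB) and energy saving if the task is cached
    sizes = [5, 12, 7, 3, 9, 4]
    savings = [2.0, 6.5, 3.1, 1.2, 4.8, 2.2]
    print(ga_cache(sizes, savings, capacity=20))
    ```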
  • 53
    Publication Date: 2019
    Description: The presented work is the result of extended research and analysis of the precision of timing methods, their efficiency in different virtual environments, and the impact of timing precision on the performance of high-speed network applications. We investigated how timer hardware is shared among heavily CPU- and I/O-bound tasks on a virtualized OS as well as on a bare OS. By replacing the invoked timing methods within a well-known application for estimating available path bandwidth, we provide an analysis of their impact on estimation accuracy. We show that timer overhead and precision are crucial for high-performance network applications, and that the use of low-precision timing methods, e.g., due to the delays and overheads introduced by virtualization, results in degraded performance in the virtual environment. Furthermore, in this paper we confirm that, by using the methods we developed for both precise timing operations and available-bandwidth (AvB) estimation, it is possible to overcome the inefficiency of standard time-related operations and the overhead that comes with virtualization. The impact of negative virtualization factors was investigated in five different environments to determine the most suitable virtual environment for high-speed network applications (a minimal timer-overhead measurement sketch follows this record).
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
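    Timer overhead and resolution of the kind discussed above can be measured directly from user space. The sketch below compares back-to-back calls to several Python clocks and reports their declared resolution; it is only a coarse illustration of the measurement idea, not the instrumented methodology used in the paper.
    ```python
    import statistics
    import time

    def timer_overhead(clock, samples=100000):
        """Minimum and median difference between two consecutive calls of a clock."""
        deltas = []
        for _ in range(samples):
            t0 = clock()
            t1 = clock()
            deltas.append(t1 - t0)
        return min(deltas), statistics.median(deltas)

    for name in ("time", "perf_counter", "monotonic"):
        clock = getattr(time, name)
        lo, med = timer_overhead(clock)
        res = time.get_clock_info(name).resolution
        print(f"{name:>12}: min delta {lo:.3e} s, median delta {med:.3e} s, resolution {res:.3e} s")
    ```
    Running the same measurement on a bare OS and inside a virtual machine exposes the extra delay and jitter introduced by virtualization.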
  • 54
    Publication Date: 2019
    Description: This paper aims to present a possible alternative to direct file transfer in “challenged networks”, by using DTNbox, a recent application for peer-to-peer directory synchronization between DTN nodes. This application uses the Bundle Protocol (BP) to tackle long delays and link intermittency typical of challenged networks. The directory synchronization approach proposed in the paper consists of delegating the transmission of bulk data files to DTNbox, instead of modifying source applications to interface with the API of a specific BP implementation, or making use of custom scripts for file transfers. The validity of the proposed approach is investigated in the paper by considering a Mars to Earth interplanetary environment. Experiments are carried out by means of Virtual Machines running ION, the NASA-JPL implementation of DTN protocols. The results show that the directory synchronization approach is a valid alternative to direct transfer in interplanetary scenarios such as that considered in the paper.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 55
    Publication Date: 2019
    Description: Modern supply chains have evolved into highly complex value networks and turned into a vital source of competitive advantage. However, it has become increasingly challenging to verify the source of raw materials and maintain visibility of products and merchandise while they are moving through the value chain network. The application of the Internet of Things (IoT) can help companies to observe, track, and monitor products, activities, and processes within their respective value chain networks. Other applications of IoT include product monitoring to optimize operations in warehousing‚ manufacturing, and transportation. In combination with IoT, Blockchain technology can enable a broad range of different application scenarios to enhance value chain transparency and to increase B2B trust. When combined, IoT and Blockchain technology have the potential to increase the effectiveness and efficiency of modern supply chains. The contribution of this paper is twofold. First, we illustrate how the deployment of Blockchain technology in combination with IoT infrastructure can streamline and benefit modern supply chains and enhance value chain networks. Second, we derive six research propositions outlining how Blockchain technology can impact key features of the IoT (i.e., scalability, security, immutability and auditing, information flows, traceability and interoperability, quality) and thus lay the foundation for future research projects.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 56
    Publication Date: 2019
    Description: Aspect-level sentiment analysis (ASA) aims at determining the sentiment polarity of a specific aspect term within a given sentence. Recent advances in attention mechanisms suggest that attention models are useful in ASA tasks and can help identify focus words; combining attention mechanisms with neural networks is also a common approach. However, according to the latest research, such methods often fail to extract text representations efficiently and to achieve interaction between aspect terms and contexts. In order to solve the complete ASA task, this paper proposes a Multi-Attention Network (MAN) model which adopts several attention networks. This model not only preprocesses data with Bidirectional Encoder Representations from Transformers (BERT), but also takes a number of further measures. First, the MAN model utilizes a partial Transformer after transformation to obtain hidden sequence information. Second, because words at different locations have different effects on aspect terms, we introduce location encoding to analyze the impact of distance in ASA tasks; we then obtain the influence of different words on aspect terms through a bidirectional attention network. The experimental results on three datasets show that the proposed model achieves consistently superior results.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 57
    Publication Date: 2019
    Description: The increase in Internet of Things devices and the rise of more computationally intensive applications present challenges for future Internet of Things architectures. We envision a future in which edge, fog, and cloud devices work together to execute future applications. Because an entire application cannot run on smaller edge or fog devices, we will need to split the application into smaller application components. These application components send event messages to each other to form a single application from multiple application components. The execution location of the application components can be optimized to minimize resource consumption. In this paper, we describe the Distributed Uniform Stream (DUST) framework, which creates an abstraction between the application components and the middleware that is required to make the execution location transparent to the application component. We describe a real-world application that uses the DUST framework for platform transparency. Next to the DUST framework, we also describe the distributed DUST Coordinator, which optimizes resource consumption by moving application components to a different execution location. The coordinators use an adapted version of the Contract Net Protocol to find local minima in resource consumption.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 58
    Publication Date: 2019
    Description: Researchers in many disciplines are developing novel interactive smart learning objects such as exercises and visualizations. Meanwhile, Learning Management Systems (LMS) and eTextbook systems are also becoming more sophisticated in their ability to use standard protocols to make use of third-party smart learning objects. But at this time, educational tool developers do not always make the best use of the interoperability standards and need exemplars to guide and motivate their development efforts. In this paper we present a case study in which two large educational ecosystems use the Learning Tools Interoperability (LTI) standard to allow cross-sharing of their educational materials. At the end of our development process, Virginia Tech's OpenDSA eTextbook system became able to import materials from Aalto University's ACOS smart learning content server, such as Python programming exercises and Parsons problems. Meanwhile, University of Pittsburgh's Mastery Grids (which already uses the ACOS exercises) was made to support CodeWorkout programming exercises (a system already used within OpenDSA). Thus, four major projects in CS Education became interoperable.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 59
    Publication Date: 2019
    Description: Activity-Based Congestion management (ABC) is a novel domain-based QoS mechanism providing more fairness among customers on bottleneck links. It avoids per-flow or per-customer states in the core network and is suitable for application in future 5G networks. However, ABC cannot be configured on standard devices. P4 is a novel programmable data plane specification which allows defining new headers and forwarding behavior. In this work, we implement an ABC prototype using P4 and point out challenges experienced during implementation. Experimental validation of ABC using the P4-based prototype reveals the desired fairness results.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 60
    Publication Date: 2019
    Description: Several emerging mobile applications and services (e.g., autonomous cars) require higher wireless throughput than ever before. This demand stresses the need for investigating novel methods that have the potential to dramatically increase the spectral efficiency (SE) of wireless systems. An evolving approach is single-channel full-duplex (SCFD) communication, in which each node may simultaneously receive and transmit over the same frequency channel, potentially doubling the current SE figures. In earlier work, we derived a model of the signal to interference plus noise ratio (SINR) in an SCFD-based cellular system with imperfect self-interference cancellation and investigated interference management under feasible QoS requirements. In this paper, game-theoretic results are exploited to investigate intercell interference management in SCFD-based cellular networks under infeasible QoS requirements. The investigation starts with a game formulation that captures two different cases. Then, the existence and uniqueness of the Nash equilibrium point are established. After that, a computationally efficient distributed algorithm, which realizes best-effort and fair wireless services, is designed. The merit of this scheme is that, when the QoS requirements are feasible, they are achieved with minimum energy consumption. Results of extensive simulation experiments are presented to show the effectiveness of the proposed schemes (an illustrative distributed power-control sketch follows this record).
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
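    Distributed interference management of the kind discussed above is often built on iterative, SINR-driven power updates in which each link adjusts its own transmit power using only locally measurable quantities. The sketch below shows a classic SINR-tracking update (Foschini-Miljanic style) capped at a maximum power; it is a generic illustration, not the game-theoretic algorithm proposed in the paper, and the gains and targets are toy values.
    ```python
    import numpy as np

    def distributed_power_control(G, noise, sinr_target, p_max, iterations=50):
        """Each link i scales its power by target_i / measured_SINR_i, capped at p_max.
        G[i][j] is the channel gain from transmitter j to receiver i."""
        n = len(G)
        p = np.full(n, 0.01)
        for _ in range(iterations):
            for i in range(n):
                interference = noise + sum(G[i][j] * p[j] for j in range(n) if j != i)
                sinr = G[i][i] * p[i] / interference
                p[i] = min(p_max, (sinr_target[i] / sinr) * p[i])
        return p

    G = np.array([[1.0, 0.10, 0.05],
                  [0.08, 0.90, 0.07],
                  [0.06, 0.12, 1.10]])
    print(distributed_power_control(G, noise=0.01, sinr_target=[2.0, 2.0, 2.0], p_max=1.0))
    ```
    When the targets are infeasible, such iterations saturate at p_max, which is exactly the regime the paper's game-theoretic formulation is designed to handle.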
  • 61
    Publication Date: 2019
    Description: For disaster prevention and mitigation, self-help and mutual help by citizens are as important as public assistance from local governments. Town-watching and disaster-prevention map-making workshops are therefore held to review the town and promote self-help and mutual help among citizens. At the same time, the use of social media for information sharing during and after disasters has been gaining attention. To facilitate information sharing in disasters, we developed a web system, the Disaster Information Tweeting and Mapping System (DITS/DIMS). Against this background, we organized a town-watching workshop using DITS/DIMS in October 2018 in Minami Ward, Sapporo City, Hokkaido, Japan, an area affected by the Hokkaido Eastern Iburi Earthquake of September 2018. In this paper, we describe the workshop procedure, its outcome, the questionnaire survey results, and the post-meeting. The questionnaire survey results show that the workshop educated the participants about posting useful information on social media during a disaster. In addition, at the post-meeting, the participants recognized that they had reviewed the town only from the perspective of “daily life” convenience before the earthquake and had not evaluated it from an “emergency viewpoint.” Therefore, the workshop was a meaningful opportunity for the participants to review the town in terms of disaster prevention and mitigation.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 62
    Publication Date: 2019
    Description: Research on digital image processing has become very popular and has advanced rapidly in recent years, and scholars have proposed various image verification mechanisms. Similarly, blockchain technology has also become very popular in recent years. This paper proposes a new image verification mechanism based on the Merkle tree technique used in blockchains. The Merkle tree root in the blockchain mechanism provides a reliable environment for the storage of image features. In image verification, each image can be verified through the Merkle tree mechanism by obtaining the hash values of the Merkle tree nodes on the path (a minimal Merkle-root sketch over image blocks follows this record). In addition, the method is combined with the InterPlanetary File System (IPFS) to improve the availability of images. The main purpose of this paper is to achieve the goal of image integrity verification. The proposed method can not only verify the integrity of the image but also restore the tampered area in the case of image tampering. Since the proposed method employs the blockchain mechanism, the image verification mechanism does not need third-party resources; verification is performed by each node in the blockchain network. The experimental results demonstrate that the proposed method successfully achieves the goals of image authentication and tampered-area restoration.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
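    Building a Merkle root over fixed-size image blocks is the core primitive behind such a scheme: each block is hashed, and pairs of hashes are hashed together until a single root remains, so any tampered block changes the root. The sketch below uses SHA-256 and a placeholder byte string as the “image”; the block size and the handling of odd nodes are assumptions, and the IPFS integration and restoration steps from the paper are not shown.
    ```python
    import hashlib

    def sha256(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def merkle_root(leaf_hashes):
        """Pairwise-hash leaves upward until one root remains; odd nodes are carried up."""
        level = list(leaf_hashes)
        if not level:
            raise ValueError("no leaves")
        while len(level) > 1:
            nxt = [sha256(level[i] + level[i + 1]) for i in range(0, len(level) - 1, 2)]
            if len(level) % 2 == 1:
                nxt.append(level[-1])
            level = nxt
        return level[0]

    image = bytes(range(256)) * 64                                    # placeholder image bytes
    blocks = [image[i : i + 512] for i in range(0, len(image), 512)]  # fixed-size blocks
    root = merkle_root([sha256(b) for b in blocks])
    print("Merkle root:", root.hex())

    # tampering with any single block changes the root
    tampered = blocks.copy()
    tampered[3] = b"\x00" * 512
    print(root != merkle_root([sha256(b) for b in tampered]))          # True
    ```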
  • 63
    Publication Date: 2019
    Description: Next Generation Internet (NGI) is the European initiative launched to identify the future internet technologies designed to serve the needs of the digitalized society while ensuring privacy, trust, decentralization, openness, inclusion, and business cooperation. NGI provides efficient support to promote the diversity, decentralization, and growth of disruptive innovation envisioned by smart cities. After the earthquake of 6 April 2009, the city of L’Aquila has been facing a massive and innovative reconstruction process. As a consequence, the city of L’Aquila can nowadays be considered a living laboratory for applications within the context of smart cities. This paper describes and evaluates the realization of a Collaborative Road Mobility System (CRMS) for the city of L’Aquila using our CHOReVOLUTION approach for automated choreography production. The CRMS allows vehicles and transport infrastructure to interconnect, share information, and use it to coordinate their actions.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 64
    Publication Date: 2019
    Description: Industrial Automation and Control Systems (IACS) are broadly utilized in critical infrastructures for monitoring and controlling industrial processes remotely. The real-time transmissions in such systems open the door to security breaches, and many breaches have been reported that severely impacted society. Hence, it is essential to achieve secure communication between the devices in order to create a secure environment. For this to be effective, the keys used for secure communication must be protected against unauthorized disclosure, misuse, alteration, or loss, which is the role of a key management infrastructure. In this paper, considering a generic industrial automation network, a comprehensive key management infrastructure (CKMI) is designed for IACS. To design such an infrastructure, the proposed scheme employs ECDH, a matrix method, and polynomial crypto mechanisms (an illustrative ECDH key-establishment sketch follows this record). The proposed design handles all the standard key management operations, viz. key generation, device registration, key establishment, key storage, device addition, key revocation, key update, key recovery, key archival, and key de-registration and destruction. The design supports secure communication between devices at the same and at different levels of the IACS, and it can be applied to major industrial automation networks to handle key management operations. The performance analysis and implementation results highlight the benefits of the proposed design.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
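    The ECDH building block named above lets two devices derive a shared session key from their key pairs without ever transmitting the secret. The sketch below uses the third-party cryptography package (an assumption; the paper does not prescribe a specific library) with the P-256 curve and an HKDF step to turn the raw shared secret into a symmetric key; the matrix and polynomial mechanisms of the CKMI are not shown.
    ```python
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # each party generates its own EC key pair (e.g., during device registration)
    device_key = ec.generate_private_key(ec.SECP256R1())
    server_key = ec.generate_private_key(ec.SECP256R1())

    # both sides compute the same shared secret from their private key and the peer's public key
    secret_dev = device_key.exchange(ec.ECDH(), server_key.public_key())
    secret_srv = server_key.exchange(ec.ECDH(), device_key.public_key())
    assert secret_dev == secret_srv

    # derive a 256-bit session key from the raw shared secret
    session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                       info=b"iacs-session").derive(secret_dev)
    print("session key:", session_key.hex())
    ```
    In practice, the exchanged public keys would be authenticated (e.g., via certificates issued at device registration) to prevent man-in-the-middle attacks.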
  • 65
    Publication Date: 2019
    Description: Telco content delivery networks (CDNs) are envisioned to build highly distributed and cloudified sites to provide a high-quality CDN service in the 5G era. However, two open problems remain to be addressed. First, telco CDNs are operated on top of an underlay network that is evolving towards information-centric networking (ICN). Unlike CDNs, which operate at the application layer, ICN brings information-centric forwarding down to the network layer. It is therefore challenging to combine the benefits of both ICN and CDN to provide a high-quality content delivery service in the context of ICN-based telco CDNs. Second, bandwidth pricing and request mapping in ICN-based telco CDNs have not been thoroughly studied. In this paper, we first propose an ICN-based telco CDN framework that integrates the information-centric forwarding enabled by ICN and the powerful edge caching enabled by telco CDNs. Then, we propose a location-dependent pricing (LDP) strategy that takes into consideration the congestion level of different sites. Furthermore, on the basis of LDP, we formulate a price-aware request mapping (PARM) problem, which can be solved by existing linear programming solvers (a small linear-programming sketch follows this record). Finally, we conduct extensive simulations to evaluate the effectiveness of our design.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
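    A price-aware request mapping problem of the kind formulated above can be posed as a small linear program: map each region's request demand to edge sites so that site capacities are respected and the total location-dependent price is minimized. The toy instance below uses SciPy's linprog; the demands, capacities, and prices are invented for illustration and do not come from the paper.
    ```python
    import numpy as np
    from scipy.optimize import linprog

    demand = np.array([100.0, 80.0])            # requests/s per user region
    capacity = np.array([90.0, 70.0, 60.0])     # requests/s per edge site
    price = np.array([[1.0, 1.4, 2.0],          # price[r][s]: cost of serving region r from site s
                      [1.8, 1.1, 1.3]])

    c = price.flatten()                          # minimize total mapping cost
    A_eq = np.kron(np.eye(2), np.ones(3))        # each region's demand must be fully mapped
    A_ub = np.kron(np.ones(2), np.eye(3))        # per-site capacity constraints
    res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=demand,
                  bounds=(0, None), method="highs")
    print(res.x.reshape(2, 3))                   # mapping of region demand onto sites
    ```
    In the paper's setting, the prices themselves would be set by the LDP strategy according to site congestion, which then steers this mapping away from overloaded sites.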
  • 66
    Publication Date: 2019
    Description: Software-defined networking (SDN) is a modern network architecture which separates the network control plane from the data plane. Considering the gradual migration from traditional networks to SDNs, the hybrid SDN, which consists of SDN-enabled devices and legacy devices, is an intermediate state. For wide-area hybrid SDNs, multiple SDN controllers usually need to be deployed at different locations to guarantee control performance, such as low latency. How to assign controllers to switches and partition the network into several control domains is a critical problem. For this problem, the control latency and the packet loss rate of control messages are important metrics, which have been considered in many previous works. However, hybrid SDNs have unique characteristics that affect the assignment scheme and have been ignored by previous studies. For example, control messages that pass through Legacy Forwarding Devices (LFDs) in hybrid SDNs incur higher queuing latency and packet loss than those passing through SDN-enabled Forwarding Devices (SFDs). In this paper, we propose a dynamic controller assignment scheme for hybrid SDNs called Legacy Based Assignment (LBA). This scheme dynamically delegates to each controller a subset of the SFDs, with the objective of minimizing the average SFD-to-controller latency (a minimal latency-aware assignment sketch follows this record). Experiments comparing our scheme with other schemes show that it achieves better performance in terms of latency and packet loss rate.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
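    The assignment objective described above can be illustrated with a simple latency model in which the control path of each SFD accrues a fixed queuing penalty for every legacy device it traverses. The greedy per-switch assignment below is only an illustration of that cost model (the penalty value and data layout are assumptions); the LBA scheme in the paper is dynamic and may also consider other factors such as controller load.
    ```python
    def assign_controllers(sfds, controllers, base_latency_ms, legacy_hops, legacy_penalty_ms=0.5):
        """Assign each SFD to the controller minimizing propagation latency plus a
        per-LFD queuing penalty along the control path."""
        assignment = {}
        for s in sfds:
            assignment[s] = min(
                controllers,
                key=lambda c: base_latency_ms[s][c] + legacy_penalty_ms * legacy_hops[s][c],
            )
        return assignment

    sfds = ["s1", "s2", "s3"]
    controllers = ["c1", "c2"]
    base_latency_ms = {"s1": {"c1": 2.0, "c2": 5.0},
                       "s2": {"c1": 4.0, "c2": 3.0},
                       "s3": {"c1": 6.0, "c2": 2.5}}
    legacy_hops = {"s1": {"c1": 4, "c2": 0},
                   "s2": {"c1": 1, "c2": 2},
                   "s3": {"c1": 0, "c2": 3}}
    print(assign_controllers(sfds, controllers, base_latency_ms, legacy_hops))
    ```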
  • 67
    Publication Date: 2019
    Description: Vehicular ad hoc networks (VANETs) are a recent class of peer-to-peer wireless networks that are used to organize the communication and interaction between cars (V2V), between cars and infrastructure (V2I), and between cars and other types of nodes (V2X). These networks are based on the dedicated short-range communication (DSRC) IEEE 802.11 standards and are mainly intended to organize the exchange of various types of messages, mainly emergency ones, to prevent road accidents, raise an alert when a road accident occurs, or control the priority of the roadway. Initially, it was assumed that cars would only interact with each other, but later, with the advent of the Internet of Things (IoT) concept, interaction with surrounding devices became a demand. However, there are many challenges associated with the interaction of vehicles and their interaction with the road infrastructure, chief among them the high density and dramatic increase of vehicle traffic. To this end, this work provides a novel system based on mobile edge computing (MEC) that addresses the problem of high traffic density and provides an offloading path for vehicle traffic. The proposed system also reduces the total latency of data communicated between vehicles and stationary roadside units (RSUs). Moreover, a latency-aware offloading algorithm is developed for managing and controlling data offloading from vehicles to edge servers. The system was simulated over a reliable environment for performance evaluation, and a real experiment was conducted to validate the proposed system and the developed offloading method.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 68
    Publication Date: 2019
    Description: The Internet of Things (IoT) is rapidly changing our society to a world where every “thing” is connected to the Internet, making computing pervasive like never before. This tsunami of connectivity and data collection relies more and more on the Cloud, where data analytics and intelligence actually reside. Cloud computing has indeed revolutionized the way computational resources and services can be used and accessed, implementing the concept of utility computing whose advantages are undeniable for every business. However, despite the benefits in terms of flexibility, economic savings, and support of new services, its widespread adoption is hindered by the security issues arising with its usage. From a security perspective, the technological revolution introduced by IoT and Cloud computing can represent a disaster, as each object might become inherently remotely hackable and, as a consequence, controllable by malicious actors. While the literature mostly focuses on the security of IoT and Cloud computing as separate entities, in this article we provide an up-to-date and well-structured survey of the security issues of cloud computing in the IoT era. We give a clear picture of where security issues occur and what their potential impact is. As a result, we claim that it is not enough to secure IoT devices, as cyber-storms come from Clouds.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 69
    Publication Date: 2019
    Description: Latency is a critical issue that impacts the performance of decentralized systems. Recently we designed various protocols to regulate the injection rate of unverified transactions into the system to improve system performance. Each of the protocols is designed to address issues related to some particular network traffic syndrome. In this work, we first provide the review of our prior protocols. We then provide a hybrid scheme that combines our transaction injection protocols and provides an optimal linear combination of the protocols based on the syndromes in the network. The goal is to speed up the verification process of systems that rely on only one single basic protocol. The underlying basic protocols are Periodic Injection of Transaction via Evaluation Corridor (PITEC), Probabilistic Injection of Transactions (PIT), and Adaptive Semi-synchronous Transaction Injection (ASTI).
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 70
    Publication Date: 2019
    Description: Vehicle speed estimation is an important problem in traffic surveillance. Many existing approaches to this problem are based on camera calibration, which has two shortcomings. First, camera calibration methods are sensitive to the environment, which means the accuracy of the results is compromised in situations where the environmental conditions are not satisfied. Furthermore, camera calibration-based methods rely on vehicle trajectories acquired by a two-stage tracking and detection process. In an effort to overcome these shortcomings, we propose an alternative end-to-end method based on 3-dimensional convolutional networks (3D ConvNets). The proposed method bases average vehicle speed estimation on information from video footage. Our method is characterized by the following three features. First, we use non-local blocks in our model to better capture spatial-temporal long-range dependencies. Second, we use optical flow as an input to the model; optical flow encodes the speed and direction of pixel motion in an image. Third, we construct a multi-scale convolutional network that extracts information on various characteristics of vehicles in motion. The proposed method shows promising experimental results on a commonly used dataset, with a mean absolute error (MAE) of 2.71 km/h and a mean square error (MSE) of 14.62.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 71
    Publication Date: 2019
    Description: The advancements in digital communication technology have made communication between humans more accessible and instant. However, personal and sensitive information may be available online through social networks and online services that lack the security measures to protect this information. Communication systems are vulnerable and can easily be penetrated by malicious users through social engineering attacks. These attacks aim at tricking individuals or enterprises into accomplishing actions that benefit attackers or providing them with sensitive data such as social security number, health records, and passwords. Social engineering is one of the biggest challenges facing network security because it exploits the natural human tendency to trust. This paper provides an in-depth survey about the social engineering attacks, their classifications, detection strategies, and prevention procedures.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 72
    Publication Date: 2019
    Description: With the development of techniques such as the Internet of Things (IoT) and edge computing, home energy management systems (HEMS) have been widely implemented to improve the electric energy efficiency of customers. In order to automatically optimize the operation schedules of electric appliances, this paper considers how to quantitatively evaluate a customer's comfort satisfaction in energy-saving programs, and how to formulate an optimal energy-saving model based on this satisfaction evaluation. First, the paper categorizes the utility functions of current electric appliances into two types, time-sensitive utilities and temperature-sensitive utilities, which cover nearly all kinds of electric appliances in a HEMS. Furthermore, considering the bounded rationality of customers, a novel concept called the energy-saving cost is defined by incorporating prospect theory from behavioral economics into general utility functions (a minimal prospect-theory value-function sketch follows this record). The proposed energy-saving cost depicts the comfort-loss risk for customers when their HEMS schedules the operation status of appliances, and residents can set it as a coefficient in the automatic energy-saving program. An optimization model is formulated based on minimizing energy consumption. Because the energy-saving cost already captures customer satisfaction, the formulation of the optimization program is very simple and has high computational efficiency. The case study in this paper is first performed on a general simulation system. Then, a case study is set up based on real field tests from a pilot project in Guangdong province, China, in which air-conditioners, lighting, and some other popular electric appliances were included. The total energy-saving rate reached 65.5% after the proposed energy-saving program was deployed in our project. The benchmark test shows that our optimal strategy can save considerable electrical energy for residents while ensuring customers' comfort satisfaction is maintained.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
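    Prospect theory, referenced above, models bounded rationality with a value function that is concave for gains, convex for losses, and steeper for losses than for gains (loss aversion). The sketch below implements the classic Tversky-Kahneman form with their commonly cited parameter estimates; the actual energy-saving cost in the paper builds on this idea, but its exact functional form and calibrated parameters may differ.
    ```python
    def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
        """Tversky-Kahneman value function relative to a reference point:
        diminishing sensitivity for gains and losses, and loss aversion via lam > 1."""
        return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

    # a comfort loss weighs more than an equally sized comfort gain
    print(prospect_value(1.0))                      # ~ 1.00
    print(prospect_value(-1.0))                     # ~ -2.25
    print(prospect_value(0.5), prospect_value(-0.5))
    ```
    In a HEMS setting, x could be the deviation of an appliance's scheduled operation from the customer's preferred setting, so the asymmetric value captures how strongly a schedule change is perceived as a comfort loss.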
  • 73
    Publication Date: 2019
    Description: The revolution of cooperative connected and automated vehicles is about to begin, and a key milestone is the introduction of short-range wireless communications between cars. Given the tremendous expected market growth, two different technologies have been standardized by international companies and consortia: IEEE 802.11p, available for nearly a decade, and short-range cellular vehicle-to-everything (C-V2X), defined more recently. In both cases, evolutions are under discussion. The former is purely decentralized and based on sensing the channel before transmitting, while the latter is based on orthogonal resources that can also be managed by an infrastructure. Although studies have been conducted to highlight the advantages and drawbacks of both, doubts still remain. In this work, with reference to the literature and with the aid of large-scale simulations in realistic urban and highway scenarios, we provide insight into this comparison, also trying to isolate the contributions of the physical and medium access control layers.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 74
    Publication Date: 2019
    Description: This paper focuses on the load imbalance problem in System Wide Information Management (SWIM) task scheduling. In order to meet users' quality requirements for task completion, we studied task scheduling methods for large-scale network information systems. Building on the traditional ant colony optimization (ACO) algorithm and using the hardware performance index and the load standard deviation function of SWIM resource nodes to update the pheromone, this paper presents a SWIM ant colony task scheduling algorithm based on load balancing (ACTS-LB). The experimental simulation results show that the ACTS-LB algorithm outperforms the traditional min-min algorithm, the basic ACO algorithm, and the particle swarm optimization (PSO) algorithm. It not only reduces task execution time and improves the utilization of system resources, but also keeps SWIM in a more load-balanced state.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 75
    Publication Date: 2019
    Description: Volunteer computing (VC) is a distributed computing paradigm which provides unlimited computing resources, in the form of donated idle resources, for many large-scale scientific computing applications. Task scheduling is one of the most challenging problems in VC. Although the dynamic scheduling problem with deadline constraints has been extensively studied for heterogeneous systems such as cloud computing and clusters, those algorithms cannot be fully applied to VC, because volunteer nodes can go offline whenever they want without taking any responsibility, which is different from other distributed computing settings. For this situation, this paper proposes a dynamic task scheduling algorithm for heterogeneous VC with deadline constraints, called deadline preference dispatch scheduling (DPDS). The DPDS algorithm selects the task with the nearest deadline each time and assigns it to a volunteer node (VN), which solves the dynamic task scheduling problem with deadline constraints (a minimal earliest-deadline-first dispatch sketch follows this record). To make full use of resources and maximize the number of tasks completed before their deadlines, an improved dispatch constraint scheduling (IDCS) algorithm is further proposed on the basis of DPDS. To verify our algorithms, we conducted experiments, and the results show that the proposed algorithms can effectively solve the dynamic task assignment problem with deadline constraints in VC.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
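    Selecting the task with the nearest deadline each time, as DPDS does, is essentially an earliest-deadline-first dispatch, which priority queues express naturally. The sketch below is a toy version under strong assumptions (unit execution time, and a node can take a task only if it is available before the deadline); it is meant to illustrate the dispatch order, not to reproduce DPDS or IDCS.
    ```python
    import heapq

    def edf_dispatch(tasks, nodes):
        """tasks: list of (deadline, task_id); nodes: list of (available_at, node_id).
        Repeatedly give the earliest-deadline task to the earliest-available node."""
        heapq.heapify(tasks)
        heapq.heapify(nodes)
        schedule, dropped = [], []
        while tasks and nodes:
            deadline, task = heapq.heappop(tasks)
            available_at, node = heapq.heappop(nodes)
            if available_at <= deadline:
                schedule.append((task, node, available_at))
                heapq.heappush(nodes, (available_at + 1.0, node))  # assume unit execution time
            else:
                dropped.append(task)                               # deadline cannot be met
                heapq.heappush(nodes, (available_at, node))
        return schedule, dropped

    tasks = [(5.0, "t1"), (2.0, "t2"), (8.0, "t3"), (3.0, "t4")]
    nodes = [(0.0, "vn1"), (1.0, "vn2")]
    print(edf_dispatch(tasks, nodes))
    ```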
  • 76
    Publication Date: 2019
    Description: Next-generation 5G networks create a need for broadband, low-latency, and power-efficient backhauling and data-relay services. In this paper, optical satellite communication links are studied as an integrated component of 5G networks. More specifically, the Geostationary (GEO) satellite-to-ground optical communication link is investigated. Long-term irradiance statistics based on experimental measurements from the ARTEMIS program are presented, and a new time series generator for the received irradiance/power fluctuations due to atmospheric turbulence is reported. The proposed synthesizer takes into consideration the turbulence-induced scintillation effects that degrade laser beam propagation, under the assumption of the Kolmogorov spectrum. The modeling is based on Rytov theory for weak turbulence conditions and incorporates first-order stochastic differential equations (a minimal first-order-SDE sketch follows this record). The time series synthesizer is validated in terms of first- and second-order statistics against experimental results from the European Space Agency's ARTEMIS experimental optical downlink, and simulated received power statistics for various weather conditions are presented using the proposed validated methodology. Some important conclusions are drawn.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
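    A first-order stochastic differential equation of the kind mentioned above can be illustrated with an Ornstein-Uhlenbeck process driving the log-amplitude of the received field: under weak (Rytov) turbulence the log-amplitude is approximately Gaussian, and exponentiating it yields log-normal irradiance fluctuations with unit mean. The sketch below is a generic illustration with invented parameters, not the validated ARTEMIS-based synthesizer described in the paper.
    ```python
    import numpy as np

    def scintillation_series(n, dt, tau_s=0.01, sigma_chi=0.1, seed=0):
        """Log-amplitude chi follows an Ornstein-Uhlenbeck (first-order) SDE with
        correlation time tau_s and stationary std sigma_chi; normalized irradiance
        is I = exp(2*chi - 2*sigma_chi**2) so that its mean is 1."""
        rng = np.random.default_rng(seed)
        chi = np.zeros(n)
        drive = sigma_chi * np.sqrt(2.0 * dt / tau_s)
        for k in range(1, n):
            chi[k] = chi[k - 1] - (dt / tau_s) * chi[k - 1] + drive * rng.standard_normal()
        return np.exp(2.0 * chi - 2.0 * sigma_chi ** 2)

    irradiance = scintillation_series(n=20000, dt=1e-3)
    print("mean:", irradiance.mean(),
          "scintillation index:", irradiance.var() / irradiance.mean() ** 2)
    ```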
  • 77
    Publication Date: 2019
    Description: Aims: The aim of this study was to compare victims of one type of cyberstalking (OneType) with victims of more than one type of cyberstalking (MoreType) regarding (1) the impact of cyberstalking and (2) attitudes related to telling someone about the experience of cyberstalking and the coping strategies used by victims. Methods: A self-administered questionnaire was distributed to over 250 students at the University of Torino. Results: About half of the participants experienced at least one incident of cyberstalking. Among them, more than half experienced more than one type of cyberstalking. Victims suffered from depression more than those who had never experienced cyberstalking. No statistically significant difference emerged for anxiety. The coping strategies used by MoreType were more varied than those used by OneType victims of cyberstalking. Moreover, MoreType victims told someone about their victimization more than OneType victims. Conclusion: The work presented suggests implications for health care professionals, police officers, and government. For example, our suggestion is to pay attention to cyberstalking victims and provide flyers in schools, universities, and cafeterias that explain the risk of certain online behaviors and their consequences in physical and emotional spheres.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 78
    Publication Date: 2019
    Description: Contemporary software is inherently distributed. The principles guiding the design of such software have been mainly manifested by the service-oriented architecture (SOA) concept. In an SOA, applications are orchestrated by software services generally operated by distinct entities. Because of this, service security has been important in such systems ever since. A dominant protocol for implementing SOA-based systems is SOAP, which comes with a well-elaborated security framework. As an alternative to SOAP, the architectural style representational state transfer (REST) is gaining traction as a simple, lightweight and flexible guideline for designing distributed service systems that scale at large. This paper starts by introducing the basic constraints representing REST. Based on these foundations, the focus then turns to the security needs of REST-based service systems. The limitations of transport-oriented protection are emphasized and the demand for specific message-oriented safeguards is assessed. The paper then reviews current activities with respect to REST security and finds that the available schemes are mostly HTTP-centered and very heterogeneous. More importantly, all of the analyzed schemes contain vulnerabilities. The paper contributes a methodology for establishing REST security as a general security framework that protects REST-based service systems of any kind with consistent and comprehensive protection means. First adoptions of the introduced approach are presented in relation to REST message authentication, with instantiations for RESTful HTTP (web/cloud services) and the RESTful Constrained Application Protocol (CoAP) (Internet of Things (IoT) services). A generic message-signing sketch follows this record.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
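    As a generic illustration of message-oriented (rather than purely transport-oriented) protection for a REST request, the Python sketch below signs selected request parts with an HMAC; the header name and canonicalization rule are hypothetical, and this is not the scheme proposed in the paper.

        # Generic HMAC-based signing of a RESTful HTTP request (illustrative only).
        import hashlib
        import hmac

        def sign_request(method, path, body, key):
            # The canonical string covers the parts an intermediary must not tamper with.
            canonical = "\n".join([method.upper(), path, hashlib.sha256(body).hexdigest()])
            signature = hmac.new(key, canonical.encode("utf-8"), hashlib.sha256).hexdigest()
            return {"X-Message-Signature": signature}       # hypothetical header name

        def verify_request(method, path, body, key, headers):
            expected = sign_request(method, path, body, key)["X-Message-Signature"]
            return hmac.compare_digest(expected, headers.get("X-Message-Signature", ""))

        key = b"shared-secret"
        headers = sign_request("POST", "/orders", b'{"item": 42}', key)
        print(verify_request("POST", "/orders", b'{"item": 42}', key, headers))   # True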
  • 79
    Publication Date: 2019
    Description: Ship detection and recognition are important for smart monitoring of ships in order to manage port resources effectively. However, this is challenging due to complex ship profiles, ship background, object occlusion, variations of weather and light conditions, and other issues. It is also expensive to transmit the monitoring video as a whole, especially if the port is not in a rural area. In this paper, we propose an on-site processing approach called Embedded Ship Detection and Recognition using Deep Learning (ESDR-DL). In ESDR-DL, the video stream is processed using embedded devices, and we design a two-stage neural network named DCNet, composed of a DNet for ship detection and a CNet for ship recognition, running on embedded devices. We have extensively evaluated ESDR-DL in terms of accuracy and efficiency. ESDR-DL has been deployed at the Dongying port in China, where it has been running for over a year, demonstrating that it works reliably in practical use.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 80
    Publication Date: 2019
    Description: Network structures, consisting of nodes and edges, have applications in almost all subjects. A set of nodes is called a community if the nodes have strong interrelations. Industries (including cell phone carriers and online social media companies) need community structures to allocate network resources and provide proper and accurate services. However, most detection algorithms are derived independently, which is arduous and even unnecessary. Although recent research shows that a general detection method that serves all purposes does not exist, we believe that there is some general procedure for deriving detection algorithms. In this paper, we present such a general scheme. We mainly focus on two types of networks: transmission networks and similarity networks. We reduce them to a unified graph model, based on which we propose a method to define and detect community structures. Finally, we give a demonstration to show how our design scheme works. A baseline detection example on a toy graph follows this record.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
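    The paper's own derivation scheme is not reproduced here; as a baseline illustration of detecting communities once a network has been reduced to a plain weighted graph, an off-the-shelf modularity-based method can be applied, for example with NetworkX:

        # Baseline community detection on a toy "similarity network" (illustrative only).
        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        G = nx.Graph()
        G.add_weighted_edges_from([
            ("a", "b", 5.0), ("b", "c", 4.0), ("a", "c", 3.0),   # one tightly connected cluster
            ("d", "e", 5.0), ("e", "f", 4.0), ("d", "f", 3.0),   # another tightly connected cluster
            ("c", "d", 0.5),                                     # weak bridge between the clusters
        ])

        communities = greedy_modularity_communities(G, weight="weight")
        print([sorted(c) for c in communities])                  # e.g. [['a', 'b', 'c'], ['d', 'e', 'f']]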
  • 81
    Publication Date: 2019
    Description: With the emergence of the Internet of Things, environmental sensing has been gaining interest, promising to improve agricultural practices by facilitating decision-making based on gathered environmental data (e.g., for weather forecasting, crop monitoring, and soil moisture sensing). Environmental sensing, and by extension what is referred to as precision or smart agriculture, poses new challenges, especially regarding the collection of environmental data in the presence of connectivity disruptions, their gathering, and their exploitation by end-users or by systems that must perform actions according to the values of those collected data. In this paper, we present a middleware platform for the Internet of Things that implements disruption-tolerant opportunistic networking and computing techniques, and that makes it possible to expose and manage physical objects through Web-based protocols, standards and technologies, thus providing interoperability between objects and creating a Web of Things (WoT). This WoT-based opportunistic computing approach is backed up by a practical experiment whose outcomes are presented in this article.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 82
    Publication Date: 2019
    Description: The World Wide Web has become an essential tool in people's daily routine. Its convenience for communication and information search has made it extremely popular, which led companies to start using online advertising by creating corporate websites. With the rapid increase in the number of websites, search engines had to develop algorithms and programs to rank the results of a search and provide users with content relevant to their queries. On the other hand, developers, in pursuit of the highest rankings in the search engine results pages (SERPs), began to study and observe how search engines work and which factors contribute to higher rankings. The knowledge extracted formed the basis for the profession of Search Engine Optimization (SEO). This paper consists of two parts. The first part is a literature review of the factors that affect the ranking of websites in the SERPs, highlighting the top factors that contribute to better ranking. To achieve this goal, a collection and analysis of academic papers was conducted. According to our research, 24 website characteristics came up as factors affecting any website's ranking, with the most references mentioning quality and quantity of backlinks, social media support, keyword in title tag, website structure, website size, loading time, domain age, and keyword density. The second part presents our own research, conducted manually using the phrases “hotel Athens”, “email marketing”, and “casual shoes”. For each of these keywords, the first 15 Google results were examined with respect to the factors found in the literature review. To measure the significance of each factor, the Spearman correlation between each factor and the ranking of the results was calculated individually. The findings show that the top factors contributing to higher rankings are the existence of a website SSL certificate, the keyword in the URL, the quantity of backlinks pointing to a website, the text length, and the domain age, which is not perfectly aligned with the literature review. A small example of the correlation measurement follows this record.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
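    As a small example of the measurement used in the second part of the study, the Spearman correlation between a candidate ranking factor and the observed SERP positions can be computed as follows; the numbers are invented purely for illustration.

        # Spearman correlation between a candidate ranking factor and SERP position (synthetic data).
        from scipy.stats import spearmanr

        serp_position = list(range(1, 16))                      # positions 1..15 for one query
        backlink_count = [950, 870, 400, 820, 300, 250, 260,    # hypothetical backlink counts
                          220, 180, 150, 200, 90, 120, 60, 40]

        rho, p_value = spearmanr(serp_position, backlink_count)
        print(f"rho = {rho:.2f}, p = {p_value:.3f}")
        # A strongly negative rho suggests that more backlinks go with better (lower) positions.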
  • 83
    Publication Date: 2019
    Description: Recent interest in applications where content is the primary concern has triggered the exploration of a variety of protocols and algorithms. For such information-centric networks, architectures such as Content-Centric Networking have been shown to deliver good network performance. However, such architectures are still evolving to cater for application-specific requirements. This paper proposes T-Move, a lightweight solution for producer mobility and caching at the edge that is especially suitable for content-centric networks with mobile content producers. T-Move introduces a novel concept called trendiness of data for Content-Centric Networking (CCN)/Named Data Networking (NDN)-based networks. It enhances network performance and quality of service (QoS) using two strategies, cache replacement and proactive content pushing for handling producer mobility, both based on trendiness. It uses simple operations with small control-message overhead and is suitable for networks where responses need to be quick. Simulation results using ndnSIM show reduced traffic and content retrieval time and an increased cache hit ratio with T-Move, compared to MAP-Me and plain NDN, for networks of different sizes and mobility rates. A sketch of trendiness-driven cache replacement follows this record.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
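    T-Move's exact trendiness computation is not given in the abstract; the general idea of evicting the least "trendy" content when an edge cache is full can be sketched as follows, where the scoring function is a hypothetical placeholder.

        # Hypothetical trendiness-driven cache replacement (illustrative, not T-Move itself).
        import time

        class TrendyCache:
            def __init__(self, capacity):
                self.capacity = capacity
                self.store = {}        # content name -> (data, request_count, last_request_time)

            def _trendiness(self, name, now):
                _, count, last = self.store[name]
                return count / (1.0 + (now - last))        # placeholder: frequent and recent = trendy

            def insert(self, name, data):
                now = time.time()
                if len(self.store) >= self.capacity:
                    victim = min(self.store, key=lambda n: self._trendiness(n, now))
                    del self.store[victim]                  # evict the least trendy content
                self.store[name] = (data, 1, now)

            def get(self, name):
                data, count, _ = self.store[name]
                self.store[name] = (data, count + 1, time.time())
                return data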
  • 84
    Publication Date: 2019
    Description: Internet of Things applications are not only a new opportunity for digital businesses but also a major driving force for the modification and creation of software systems in all industries and businesses. Compared to other types of software-intensive products, the development of Internet of Things applications lacks a systematic approach and guidelines. This paper aims at understanding the common practices and challenges among start-up companies that are developing Internet of Things products. A qualitative study was conducted with data from twelve semi-structured interviews. A thematic analysis reveals common types of Minimum Viable Products, prototyping techniques and production concerns among early-stage hardware start-ups. We found that hardware start-ups go through an incremental prototyping process toward production, and that this progress is associated with a transition from a focus on speed to a focus on quality. Hardware start-ups rely heavily on third-party vendors in terms of development speed and final product quality. We identified 24 challenges related to management, requirements, design, implementation and testing. Internet of Things entrepreneurs should be aware of the relevant pitfalls and manage both internal and external risks.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 85
    Publication Date: 2019
    Description: Future vehicles are becoming more like driving partners instead of mere machines. With the application of advanced information and communication technologies (ICTs), vehicles perform driving tasks while drivers monitor the functioning states of vehicles. This change in interaction requires a deliberate consideration of how vehicles should present driving-related information. As a way of encouraging drivers to more readily accept instructions from vehicles, we suggest the use of social rules, such as politeness, in human-vehicle interaction. In a 2 × 2 between-subjects experiment, we test the effects of vehicle politeness (plain vs. polite) on drivers’ interaction experiences in two operation situations (normal vs. failure). The results indicate that vehicle politeness improves interaction experience in normal working situations but impedes the experience in failure situations. Specifically, in normal situations, vehicles with polite instructions are highly evaluated for social presence, politeness, satisfaction and intention to use. Theoretical and practical implications on politeness research and speech interaction design are discussed.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 86
    Publication Date: 2019
    Description: The Internet has become so widespread that the most popular websites are accessed by hundreds of millions of people on a daily basis. Monolithic architectures, which were frequently used in the past, were mostly built on traditional relational database management systems and quickly became incapable of sustaining the high data traffic that is common these days. Meanwhile, NoSQL databases have emerged to provide properties missing from relational databases, such as schema-less design, horizontal scaling, and eventual consistency. This paper analyzes and compares the consistency model implementations of five popular NoSQL databases: Redis, Cassandra, MongoDB, Neo4j, and OrientDB. All of them offer at least eventual consistency, and some have the option of supporting strong consistency; however, imposing strong consistency results in reduced availability when the system is subject to network partition events. A small worked example of the underlying quorum trade-off follows this record.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
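    To make the consistency/availability trade-off concrete without depending on any particular database driver, the toy quorum model below shows why requiring read and write quorums with R + W > N yields strong consistency, while smaller quorums only give eventual consistency.

        # Toy quorum model: N replicas, write quorum W, read quorum R.
        # If R + W > N, every read quorum overlaps the latest write quorum (strong consistency);
        # otherwise a read may miss the newest value (eventual consistency, stale reads possible).
        def read_sees_latest_write(n_replicas, write_quorum, read_quorum):
            written = set(range(write_quorum))                              # replicas holding the new value
            read_set = set(range(n_replicas - read_quorum, n_replicas))     # worst-case read quorum
            return bool(written & read_set)

        print(read_sees_latest_write(n_replicas=5, write_quorum=3, read_quorum=3))   # True  (R + W > N)
        print(read_sees_latest_write(n_replicas=5, write_quorum=1, read_quorum=1))   # False (stale read possible)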
  • 87
    Publication Date: 2019
    Description: In previous investigations, controllers for the track-keeping of ships were designed under the assumption of constant ship speed. However, when navigating in a fairway area, the ship's speed is usually decreased to prepare for berthing. Existing track-keeping systems, which are applied when the ship navigates in the open sea at constant speed, cannot be used to navigate the ship in the fairway. In this article, a support system is proposed for ship navigation in the fairway. This system performs three tasks. First, the ship is automatically controlled by regulating the rudder to follow planned tracks. Second, the ship's speed is reduced step by step to approach the berth area at low speed. Finally, at low speed, when the rudder is no longer effective enough to bring the ship's heading to the desired angle, the heading is adjusted by the bow thruster before the control mode is switched to the automatic berthing system. With the proposed system, these automatic functions can be combined into a fully automatic ship-control system. To validate the effectiveness of the proposed system for automatic ship navigation in the fairway, numerical simulations were conducted with a training ship model. A generic track-keeping sketch follows this record.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
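    The controllers themselves are not given in the abstract; the structure of the first two tasks, steering toward the next track waypoint and stepping the speed down, can be illustrated with a generic proportional-derivative sketch in which the guidance law, gains and limits are placeholders rather than the paper's design.

        # Generic waypoint-steering and speed-reduction sketch (illustrative, not the paper's controller).
        import math

        def rudder_command(x, y, heading, yaw_rate, waypoint, kp=1.0, kd=2.0, max_rudder=0.6):
            """PD rudder command [rad] steering the ship toward the next track waypoint."""
            desired_heading = math.atan2(waypoint[1] - y, waypoint[0] - x)
            error = math.atan2(math.sin(desired_heading - heading),
                               math.cos(desired_heading - heading))    # wrap error to [-pi, pi]
            command = kp * error - kd * yaw_rate
            return max(-max_rudder, min(max_rudder, command))

        def reduce_speed(current_speed, target_speed, step=0.5):
            """Step the speed setting down gradually while approaching the berth area."""
            return max(target_speed, current_speed - step)

        print(rudder_command(0.0, 0.0, heading=0.0, yaw_rate=0.0, waypoint=(100.0, 50.0)))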
  • 88
    Publication Date: 2019
    Description: The demand for Autonomic Network Management (ANM) and optimization is as intense as ever, even though significant research has been devoted to this direction. This paper addresses this need in Software Defined Radio (SDR)-based Cognitive Radio Networks (CRNs). We propose a new framework for ANM and network reconfiguration combining Software Defined Networking (SDN) with SDR via Network Function Virtualization (NFV)-enabled Virtual Utility Functions (VUFs). This is the first approach combining ANM with SDR and SDN via NFV, demonstrating how these state-of-the-art technologies can be effectively combined to achieve reconfiguration flexibility, improved performance and efficient use of available resources. To show the feasibility of the proposed framework, we implemented its main functionalities in a cross-layer resource allocation mechanism for CRNs over real SDR testbeds provided by the Orchestration and Reconfiguration Control Architecture (ORCA) EU project. We demonstrate the efficacy of our framework and, based on the obtained results, identify aspects that can be further investigated to improve the applicability and performance of the broader framework.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 89
    Publication Date: 2019
    Description: Industry 4.0 demands a dynamic optimization of production lines, which are formed by sets of heterogeneous devices that cooperate towards a shared goal. The Internet of Things can serve as a technology enabler for implementing such a vision. Nevertheless, the domain is struggling to find a shared understanding of the concepts for describing a device. This aspect plays a fundamental role in enabling an “intelligent interoperability” among the sensors and actuators that will constitute a dynamic Industry 4.0 production line. In this paper, we summarize the efforts of academics and practitioners toward describing devices in order to enable dynamic reconfiguration by machines or humans. We also propose a set of concepts for describing devices, and we analyze how present initiatives cover these aspects.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 90
    Publication Date: 2019
    Description: Over the past decade, radio-frequency identification (RFID) technology has attracted significant attention and become very popular in different applications, such as identification, management, and monitoring. In this study, a dual-band microstrip-fed monopole antenna is introduced for RFID applications. The antenna is designed to work at the frequency ranges of 2.2–2.6 GHz and 5.3–6.8 GHz, covering the 2.4/5.8 GHz RFID operation bands. The antenna structure resembles a modified F-shaped radiator. It is printed on an FR-4 dielectric with an overall size of 38 × 45 × 1.6 mm³. Fundamental characteristics of the antenna in terms of return loss, Smith Chart, phase, radiation pattern, and antenna gain are investigated, and good results are obtained. Simulations have been carried out using computer simulation technology (CST) software. A prototype of the antenna was fabricated and its characteristics were measured; the measured results show good agreement with the simulations. The antenna structure is planar, simple to design and fabricate, easy to integrate with RF circuits, and suitable for use in RFID systems. A rough radiator-length estimate follows this record.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
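    As a quick back-of-the-envelope check of the radiator dimensions, a quarter-wavelength monopole sized for each RFID band (ignoring the dielectric loading of the FR-4 substrate, which shortens the printed length) comes out at a few tens of millimetres, comparable to the reported board size:

        # Quarter-wavelength monopole length estimate for the two RFID bands (free-space estimate only).
        c = 3.0e8                                    # speed of light [m/s]

        for f in (2.4e9, 5.8e9):                     # RFID operation bands [Hz]
            quarter_wave = c / (4.0 * f)             # lambda / 4 [m]
            print(f"{f / 1e9:.1f} GHz -> ~{quarter_wave * 1000:.1f} mm")
        # 2.4 GHz -> ~31.2 mm, 5.8 GHz -> ~12.9 mm (the FR-4 substrate would shorten these further)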
  • 91
    Publication Date: 2019
    Description: Visualising complex data provides a more comprehensive means of conveying knowledge. Within the medical data domain, there is an increasing requirement for valuable and accurate information. Patients need to be confident that their data are being stored safely and securely. As such, it is becoming necessary to visualise data patterns and trends in real time to identify erratic and anomalous network access behaviours. In this paper, an investigation into modelling data flow within healthcare infrastructures is presented, using a dataset from a Liverpool-based (UK) hospital as the case study. Specifically, a visualisation of transmission control protocol (TCP) socket connections is put forward as an investigation into the data complexity and user interaction events within healthcare networks. In addition, a filtering algorithm is proposed for noise reduction in the TCP dataset. Positive results from using this algorithm are apparent on visual inspection, with noise reduced by up to 89.84%.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 92
    Publication Date: 2019
    Description: Opportunistic networks have recently seen increasing interest in the networking community. They can serve a range of application scenarios, most of them being destination-less, i.e., without a priori knowledge of who the final destination of a message is. In this paper, we explore the use of data popularity for improving the efficiency of data forwarding in opportunistic networks. Whether a message will become popular or not is not known before disseminating it to users. Thus, popularity needs to be estimated in a distributed manner considering a local context. We propose Keetchi, a data forwarding protocol based on Q-learning that gives more preference to popular data than to less popular data. Our extensive simulation comparison between Keetchi and the well-known Epidemic protocol shows that the network overhead of data forwarding can be significantly reduced while keeping the delivery rate the same. A generic popularity-learning sketch follows this record.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
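    Keetchi's exact learning rule is not given in the abstract; as a generic illustration of using Q-learning to prefer popular data, a node could update a per-item value from observed neighbour reactions and forward the highest-valued items first. The reward values and parameters below are arbitrary.

        # Generic Q-learning popularity estimate (illustrative, not Keetchi's exact rule).
        class PopularityLearner:
            def __init__(self, alpha=0.3, gamma=0.0):
                self.alpha = alpha      # learning rate
                self.gamma = gamma      # no look-ahead needed for this one-step reward
                self.q = {}             # data item -> estimated popularity value

            def observe(self, item, liked_by_neighbour):
                reward = 1.0 if liked_by_neighbour else -0.5
                old = self.q.get(item, 0.0)
                self.q[item] = old + self.alpha * (reward + self.gamma * old - old)

            def forwarding_order(self, items):
                # Popular (high-Q) items are forwarded before less popular ones.
                return sorted(items, key=lambda i: self.q.get(i, 0.0), reverse=True)

        learner = PopularityLearner()
        for _ in range(3):
            learner.observe("concert-video", liked_by_neighbour=True)
        learner.observe("parking-notice", liked_by_neighbour=False)
        print(learner.forwarding_order(["parking-notice", "concert-video"]))   # popular item first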
  • 93
    Publication Date: 2019
    Description: Self-adapting exploratory structures (SAESs) are the basic components of exploratory search. They are abstract structures which allow searching or querying of an information base and summarizing of results using a uniform representation. A definition and a characterization of SAESs are given, as well as a discussion of structures that are SAESs or can be modified in order to become SAESs. These include dynamic taxonomies (also known as faceted search), tag clouds, continuous sliders, geographic maps, and dynamic clustering methods, such as Scatter-Gather. Finally, the integration of these structures into a single interface is discussed.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 94
    Publication Date: 2019
    Description: ActoDatA (Actor Data Analysis) is an actor-based software library for the development of distributed data mining applications. It provides a multi-agent architecture with a set of predefined and configurable agents performing the typical tasks of data mining applications. In particular, its architecture can manage different users' applications and maintains a high level of execution quality by distributing the agents of the applications over a dynamic set of computational nodes. Moreover, it provides reports about the analysis results and the collected data, which can be accessed through either a web browser or a dedicated mobile app. After an introduction to the actor model and the software framework used for implementing the library, this article describes the main features of ActoDatA and presents its experimental evaluation in some well-known data analysis domains.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 95
    Publication Date: 2019
    Description: In order to achieve more efficient energy consumption, it is crucial that accurate, detailed information is available on how power is consumed. Detailed electricity information benefits both utilities and power consumers. Non-intrusive load monitoring (NILM), a novel and economical technology, obtains single-appliance power consumption through a single total power meter. This paper, focusing on load disaggregation with low hardware costs, proposes a load disaggregation method for low-sampling-rate data from smart meters based on a clustering algorithm and support vector regression optimization. The approach combines the k-median algorithm and dynamic time warping to identify the operating appliance, and retrieves single-appliance energy consumption from the aggregate smart meter signal via optimized support vector regression (OSVR). Experiments showed that the technique can recognize multiple devices switching on at the same time using low-frequency data and achieve high load disaggregation performance. The proposed method employs low-sampling-rate data acquired by smart meters without installing extra measurement equipment, which lowers hardware cost and makes it suitable for smart grid environments. Sketches of two of the building blocks follow this record.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
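    The full pipeline is not reproduced here, but two of its ingredients, a dynamic-time-warping distance for matching low-frequency load signatures and a support vector regressor for retrieving single-appliance consumption, can be sketched as follows; the data are synthetic and the SVR uses default hyper-parameters.

        # Sketch of two building blocks of such a pipeline (synthetic data, illustrative parameters).
        import numpy as np
        from sklearn.svm import SVR

        def dtw_distance(a, b):
            """Plain O(len(a)*len(b)) dynamic time warping distance between two sequences."""
            n, m = len(a), len(b)
            cost = np.full((n + 1, m + 1), np.inf)
            cost[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    d = abs(a[i - 1] - b[j - 1])
                    cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
            return cost[n, m]

        print(dtw_distance([0, 0, 2, 2, 0], [0, 2, 2, 2, 0]))    # small distance: similar load signatures

        # Regressing one appliance's power from the aggregate smart-meter reading.
        rng = np.random.default_rng(1)
        aggregate = rng.uniform(0, 3, size=(200, 1))             # total power [kW]
        appliance = 0.4 * aggregate[:, 0] + rng.normal(0, 0.05, 200)
        model = SVR().fit(aggregate, appliance)
        print(model.predict([[2.0]]))                            # estimated single-appliance share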
  • 96
    Publication Date: 2019
    Description: Human activity recognition is an active field of research in computer vision with numerous applications. Recently, deep convolutional networks and recurrent neural networks (RNN) have received increasing attention in multimedia studies and have yielded state-of-the-art results. In this work, we propose a new framework which intelligently combines 3D-CNN and LSTM networks. First, we integrate discriminative information from a video into a map called a ‘motion map’ by using a deep 3-dimensional convolutional network (C3D). A motion map and the next video frame can be integrated into a new motion map, and this technique can be trained by increasing the training video length iteratively; the final network can then be used to generate the motion map of the whole video. Next, a linear weighted fusion scheme is used to fuse the network feature maps into spatio-temporal features. Finally, we use a Long Short-Term Memory (LSTM) encoder-decoder for the final predictions. The method is simple to implement and retains discriminative and dynamic information. Improved results on public benchmark datasets demonstrate the effectiveness and practicability of the proposed method. A minimal architectural sketch follows this record.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
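    The exact C3D/motion-map architecture is not specified in the abstract; a minimal PyTorch sketch of the overall idea, a small 3D convolutional feature extractor whose per-clip features are fed to an LSTM for the final prediction, might look like this, with all layer sizes being placeholders.

        # Minimal 3D-CNN + LSTM sketch (layer sizes are placeholders, not the paper's C3D/LSTM).
        import torch
        import torch.nn as nn

        class Tiny3DCnnLstm(nn.Module):
            def __init__(self, n_classes=10, feat_dim=64, hidden=128):
                super().__init__()
                self.cnn = nn.Sequential(
                    nn.Conv3d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool3d(1),                    # one 16-channel descriptor per clip
                    nn.Flatten(), nn.Linear(16, feat_dim), nn.ReLU(),
                )
                self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
                self.head = nn.Linear(hidden, n_classes)

            def forward(self, clips):                           # clips: (B, T, C, D, H, W)
                b, t = clips.shape[:2]
                feats = self.cnn(clips.flatten(0, 1))           # (B*T, feat_dim)
                out, _ = self.lstm(feats.view(b, t, -1))        # encode the clip sequence
                return self.head(out[:, -1])                    # predict from the last time step

        model = Tiny3DCnnLstm()
        dummy = torch.randn(2, 4, 3, 8, 32, 32)                 # 2 videos, 4 clips of 8 frames each
        print(model(dummy).shape)                               # torch.Size([2, 10])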
  • 97
    Publication Date: 2019
    Description: Multipath transport protocols aim to increase the throughput of data flows as well as to maintain fairness between users, both of which are crucial factors for maximizing user satisfaction. In this paper, a mixed (non)linear programming (MINLP) solution is developed which optimally allocates link capacities in a network to a number of given traffic demands, considering both the maximization of link utilization and fairness between transport-layer data flows or subflows. The solutions of the MINLP formulation are evaluated with respect to their throughput and fairness using well-known metrics from the literature. It is shown that network-flow-fairness-based capacity allocation achieves better fairness results than bottleneck-based methods in most cases while yielding the same capacity allocation performance. A small fairness-metric example follows this record.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
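    The MINLP formulation itself is not reproduced here, but one widely used fairness metric for comparing capacity allocations, Jain's fairness index, can be evaluated as follows; the per-flow throughputs are invented for illustration, and the paper does not necessarily use this particular metric.

        # Jain's fairness index for a set of per-flow throughputs (illustrative values).
        def jain_index(throughputs):
            n = len(throughputs)
            return sum(throughputs) ** 2 / (n * sum(x * x for x in throughputs))

        equal_share  = [10.0, 10.0, 10.0, 10.0]       # perfectly fair allocation
        skewed_share = [28.0, 6.0, 3.0, 3.0]          # heavily skewed allocation of the same capacity
        print(jain_index(equal_share))                # 1.0 (maximal fairness)
        print(round(jain_index(skewed_share), 3))     # about 0.477, noticeably less fair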
  • 98
    Publication Date: 2019
    Description: The tooth-marked tongue is an important indicator in traditional Chinese medical diagnosis. However, the clinical competence of tongue diagnosis depends on the experience and knowledge of the practitioner. Because tongues vary widely in characteristics such as color and shape, tooth-marked tongue recognition is challenging. Most existing methods focus on partial concave features and use specific threshold values to classify the tooth-marked tongue; they lose the overall tongue information and lack generalizability and interpretability. In this paper, we address these problems by proposing a visual explanation method which takes the entire tongue image as input, uses a convolutional neural network to extract features (instead of setting a fixed threshold artificially), classifies the tongue, and produces a coarse localization map highlighting tooth-marked regions using Gradient-weighted Class Activation Mapping. Experimental results demonstrate the effectiveness of the proposed method. A compact localization-map sketch follows this record.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
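    Gradient-weighted Class Activation Mapping (Grad-CAM) is a published technique; a compact PyTorch sketch of computing such a localization map for one image is given below, where the backbone model, the chosen layer and the random input are placeholders rather than the paper's network.

        # Compact Grad-CAM sketch (model, target layer and input are placeholders).
        import torch
        import torch.nn.functional as F
        import torchvision.models as models

        model = models.resnet18(weights=None).eval()        # stand-in classifier
        target_layer = model.layer4                         # last convolutional block

        activations, gradients = {}, {}
        target_layer.register_forward_hook(lambda m, i, o: activations.update(value=o))
        target_layer.register_full_backward_hook(lambda m, gi, go: gradients.update(value=go[0]))

        image = torch.randn(1, 3, 224, 224)                 # placeholder tongue image tensor
        scores = model(image)
        scores[0, scores.argmax()].backward()               # gradient of the top-scoring class

        weights = gradients["value"].mean(dim=(2, 3), keepdim=True)    # average-pool the gradients
        cam = F.relu((weights * activations["value"]).sum(dim=1))      # weighted sum of feature maps
        cam = F.interpolate(cam.unsqueeze(1), size=image.shape[2:],
                            mode="bilinear", align_corners=False)[0, 0]
        cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)       # normalize to [0, 1]
        print(cam.shape)                                    # torch.Size([224, 224]) heat map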
  • 99
    Publication Date: 2019
    Description: In this paper, we propose a joint power allocation, time switching (TS) factor and relay selection scheme for an energy harvesting two-way relaying communication network (TWRN), where two transceivers exchange information with the help of a wireless-powered relay. Because the relay adopts the TS architecture, it needs additional time slots for energy transmission, which reduces the transmission rate. Thus, we propose a joint resource allocation algorithm to maximize the max-min bidirectional instantaneous information rate. To solve the original non-convex optimization problem, the objective function is decomposed into three sub-problems that are solved sequentially. Closed-form solutions for the transmit powers of the two sources and the optimal TS factor are obtained via an information-rate-balancing technique and the proposed time allocation scheme, respectively. Finally, the optimal relay node is selected. Simulation results show that the performance of the proposed algorithm is better than that of traditional schemes and the power-splitting (PS) scheme. A toy numerical illustration of the TS trade-off follows this record.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
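    The trade-off behind the optimal TS factor can be illustrated numerically with a standard time-switching relay model, which is a simplification and not the paper's exact setup: a fraction alpha of each block is spent harvesting energy, leaving roughly (1 - alpha)/2 of the block per transmission direction, so a larger alpha gives the relay more power but less time to use it. All parameter values below are arbitrary.

        # Toy time-switching (TS) relay model (arbitrary parameters, illustrative only).
        import math

        def ts_rate(alpha, p_source=1.0, gain_sr=0.8, gain_rd=0.6, eta=0.7, noise=1e-2):
            """Approximate rate for one direction of a TS two-way relay block."""
            harvested = eta * alpha * p_source * gain_sr      # energy gathered during fraction alpha
            p_relay = harvested / ((1.0 - alpha) / 2.0)       # spent over the relaying sub-slot
            snr = p_relay * gain_rd / noise
            return ((1.0 - alpha) / 2.0) * math.log2(1.0 + snr)

        for alpha in (0.1, 0.3, 0.5, 0.7, 0.9):
            print(f"alpha = {alpha:.1f}  rate ~ {ts_rate(alpha):.2f} bit/s/Hz")
        # The rate first rises with alpha (more harvested power) and then falls
        # (too little time left to transmit), hence the need for an optimal TS factor.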
  • 100
    Publication Date: 2019
    Description: Radio-frequency (RF) tomographic imaging is a promising technique for inferring multi-dimensional physical space by processing RF signals traversing a region of interest. Tensor-based approaches for tomographic imaging are superior at detecting objects within higher-dimensional spaces. The recently proposed tensor sensing approach based on the transform tensor model achieves a lower error rate and faster speed than the previous tensor-based compressed sensing approach. However, the running time of the tensor sensing approach increases exponentially with the dimension of the tensors, making it impractical for big tensors. In this paper, we address this problem by exploiting massively parallel GPUs. We design, implement, and optimize the tensor sensing approach on an NVIDIA Tesla GPU and evaluate the performance in terms of running time and recovery error rate. Experimental results show that our GPU tensor sensing is as accurate as the CPU counterpart, with an average of 44.79× and up to 84.70× speedups for varying-sized synthetic tensor data. For smaller IKEA 3D model data, our GPU algorithm achieved a 15.374× speedup over the CPU tensor sensing. We further encapsulate the GPU algorithm into an open-source library, called cuTensorSensing (CUDA Tensor Sensing), which can be used for efficient RF tomographic imaging.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...