ALBERT

All Library Books, journals and Electronic Records Telegrafenberg


Filter
Collection
  • Articles  (15,496)
Source
  • Latest Papers from Table of Contents or Articles in Press  (15,496)
Journal
  • Molecular Diversity Preservation International  (15,193)
  • MDPI  (303)
Years
  • 2020-2022  (15,496)
  • 1980-1984
  • 1925-1929
Topics
  • Process Engineering, Biotechnology, Nutrition Technology  (12,483)
  • Computer Science  (3,013)
  • 1
    Publication Date: 2020-08-27
    Description: Festivals are experiential products that depend heavily on the recommendations of previous visitors. With the power of social media growing, understanding the antecedents of positive electronic word-of-mouth (eWOM) intentions among festival attendees is immensely beneficial for festival organizers seeking to better promote their festivals and control negative publicity. However, there is still limited research on eWOM intentions in the festival context. Thus, this study aims to fill that gap by investigating the relationships among festival attendees’ enjoyment seeking motivation, perceived value, visitor satisfaction, and eWOM intention in a local festival setting. Additionally, the moderating role of gender was tested, as it is one of the most important demographic variables for explaining individual differences in behavioral intentions. The results of structural equation modeling showed a positive effect of enjoyment seeking motivation on perceived value, visitor satisfaction, and eWOM intention. Moreover, gender differences in eWOM intention were revealed, as was a full mediating effect of visitor satisfaction between perceived value and eWOM intention for female respondents. The findings extend the existing festival literature and provide insights for strategically organizing and promoting festivals to generate more positive eWOM, which can be utilized as an effective marketing tool and a feedback channel.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
  • 2
    Publication Date: 2020-08-26
    Description: Information and communication technologies transform modern education into a more accessible learning matrix. One of the unexplored aspects of open education is the constant communicative interaction within the student group by means of social media. The aim of the study was to determine the principal functions of student-led communication in the educational process, and a method for assessing its strong points and the disadvantages that disrupt traditional learning. For the primary study of the phenomenon, we used methods that made it possible to propose approaches to further analysis. Netnography is the main research method used to define the essence and characteristics of student-led peer communication. In our research, we applied data visualization, analytical and quantitative methods, and developed a set of quantitative indicators that can be used to assess various aspects of student communication in chats. The elaborated visual model can serve as a simple tool for diagnosing group communication processes. We revealed that online group chats perform a support function in learning. They provide a constant informational resource on educational and organizational issues and create emotional comfort. The identified features serve to define shortcomings (e.g., lack of students’ readiness to freely exchange answers to assignments) and significant factors (e.g., underutilized opportunities for self-organization) that exist in the modern system of higher education.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
  • 3
    Publication Date: 2020-08-28
    Description: Due to the growing success of neural machine translation (NMT), many have started to question its applicability within the field of literary translation. In order to grasp the possibilities of NMT, we studied the output of the neural machine translation systems of Google Translate (GNMT) and DeepL when applied to four classic novels translated from English into Dutch. The quality of the NMT systems is discussed by focusing on manual annotations, and we also employed various metrics in order to gain insight into lexical richness, local cohesion, and syntactic and stylistic differences. Firstly, we discovered that a large proportion of the translated sentences contained errors. We also observed a lower level of lexical richness and local cohesion in the NMTs compared to the human translations. In addition, NMTs are more likely to follow the syntactic structure of a source sentence, whereas human translations can differ. Lastly, the human translations deviate from the machine translations in style.
    Electronic ISSN: 2227-9709
    Topics: Computer Science
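Lexical richness, one of the metrics compared above, is commonly operationalized as a type-token ratio; the abstract does not give the authors' exact formula, so the following is a minimal illustrative sketch:

```python
def type_token_ratio(text):
    # Ratio of distinct words (types) to total words (tokens);
    # lower values indicate lower lexical richness.
    tokens = text.lower().split()
    return len(set(tokens)) / len(tokens) if tokens else 0.0
```

Applied to a human translation and an NMT output of the same passage, a lower ratio for the NMT output would mirror the reduced lexical richness reported above.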
  • 4
    Publication Date: 2020-08-29
    Description: The emergence and outbreak of the novel coronavirus (COVID-19) had a devastating effect on global health, the economy, and individuals’ daily lives. Timely diagnosis of COVID-19 is a crucial task, as it reduces the risk of pandemic spread, and early treatment will save patients’ lives. Due to the time-consuming, complex nature, and high false-negative rate of the gold-standard RT-PCR test used for the diagnosis of COVID-19, the need for an additional diagnosis method has increased. Studies have proved the significance of X-ray images for the diagnosis of COVID-19. Applying deep-learning techniques to X-ray images can automate the diagnosis process and serve as an assistive tool for radiologists. In this study, we used four deep-learning models—DenseNet121, ResNet50, VGG16, and VGG19—with the transfer-learning concept to classify X-ray images as COVID-19 or normal. In the proposed study, VGG16 and VGG19 outperformed the other two deep-learning models. The study achieved an overall classification accuracy of 99.3%.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
  • 5
    Publication Date: 2020-08-29
    Description: In this work, we demonstrate how the blockchain and the off-chain storage interact via Oracle-based mechanisms, which build an effective connection between a distributed database and real assets. For demonstration purposes, smart contracts were drawn up to deal with two different applications. Due to the characteristics of the blockchain, we may still encounter severe privacy issues, since the data stored on the blockchain are exposed to the public. The proposed scheme provides a general solution for resolving the above-mentioned privacy issue; that is, we try to protect the on-chain privacy of the sensitive data by using homomorphic encryption techniques. Specifically, we constructed a secure comparison protocol that can check the correctness of a logic function directly in the encrypted domain. By using the proposed access control contract and the secure comparison protocol, one can carry out sensitive data-dependent smart contract operations without revealing the data themselves.
    Electronic ISSN: 2073-431X
    Topics: Computer Science
  • 6
    Publication Date: 2020-08-29
    Description: Healthcare facilities are constantly deteriorating due to the tight budgets allocated to the upkeep of building assets. This creates a need for improved deterioration modeling of such buildings in order to enforce a predictive maintenance approach that decreases the unexpected occurrence of failures and the corresponding downtime elapsed to repair or replace the faulty asset components. Currently, hospitals use subjective deterioration prediction methodologies that mostly rely on age as the sole indicator of degradation to forecast the useful lives of building components. Thus, this paper aims at formulating a more efficient stochastic deterioration prediction model that integrates the latest observed condition into the forecasting procedure to overcome the subjectivity and uncertainties associated with the currently employed methods. This is achieved by developing a hybrid genetic algorithm-based fuzzy Markovian model that simulates the deterioration process given the scarcity of available condition assessment data for such critical facilities. A nonhomogeneous transition probability matrix (TPM) based on fuzzy membership functions representing the condition, age and relative deterioration rate of the hospital systems is utilized to address the inherited uncertainties. The TPM is further calibrated by means of a genetic algorithm to circumvent the drawbacks of expert-based models. A sensitivity analysis was carried out to analyze the possible changes in the output resulting from predefined modifications to the input parameters in order to ensure the robustness of the model. The performance of the developed deterioration prediction model is then validated through a comparison with a state-of-the-art stochastic model against real hospital datasets; the developed model significantly outperformed the long-established Weibull distribution-based deterioration prediction methodology, with mean absolute errors of 1.405 and 9.852, respectively. Therefore, the developed model is expected to assist decision-makers in creating more efficient maintenance programs as well as more data-driven capital renewal plans.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
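The Markovian core of such a model is repeated multiplication of the condition-state distribution by the TPM; a minimal sketch, with an illustrative three-state TPM rather than the paper's calibrated, fuzzy, nonhomogeneous one:

```python
def forecast_condition(state_probs, tpm, years):
    # Propagate a condition-state distribution through a transition
    # probability matrix (TPM), one multiplication per year.
    for _ in range(years):
        state_probs = [
            sum(state_probs[i] * tpm[i][j] for i in range(len(tpm)))
            for j in range(len(tpm))
        ]
    return state_probs
```

With a row-stochastic TPM the forecast remains a probability distribution, and the probability mass drifts toward the worse condition states over time.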
  • 7
    Publication Date: 2020-08-29
    Description: The harmonic closeness centrality measure associates, to each node of a graph, the average of the inverse of its distances from all the other nodes (by assuming that unreachable nodes are at infinite distance). This notion has been adapted to temporal graphs (that is, graphs in which edges can appear and disappear over time), and in this paper we address the question of finding the top-k nodes for this metric. Computing the temporal closeness for one node can be done in O(m) time, where m is the number of temporal edges. Therefore, computing the closeness of all nodes exactly, in order to find the ones with top closeness, would require O(nm) time, where n is the number of nodes. This time complexity is intractable for large temporal graphs. Instead, we show how this measure can be efficiently approximated by using a “backward” temporal breadth-first search algorithm and a classical sampling technique. Our experimental results show that the approximation is excellent for nodes with high closeness, allowing us to detect them in practice in a fraction of the time needed for computing the exact closeness of all nodes. We validate our approach with an extensive set of experiments.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
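The sampling idea can be sketched on a static graph (the paper's setting is temporal, with a "backward" temporal BFS; names and parameters here are illustrative):

```python
import random
from collections import deque

def harmonic_closeness_sampled(adj, node, samples, rng=random):
    # Estimate harmonic closeness of `node` by sampling target nodes
    # and averaging the inverse shortest-path distances to them.
    others = [v for v in adj if v != node]
    targets = rng.sample(others, min(samples, len(others)))
    # BFS distances from `node`; unreached targets contribute 0
    # (i.e., 1/infinity).
    dist = {node: 0}
    q = deque([node])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    total = sum(1.0 / dist[t] for t in targets if t in dist)
    return total / len(targets)
```

Sampling all nodes recovers the exact harmonic closeness; fewer samples trade accuracy for speed, which is the trade-off the paper exploits.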
  • 8
    Publication Date: 2020-07-20
    Description: Computer programmers require various instructive information during coding and development. Such information is dispersed across different sources like language documentation, wikis, and forums. As an information exchange platform, programmers broadly utilize Stack Overflow, a Web-based question answering site. In this paper, we propose a recommender system which uses a supervised machine learning approach to investigate Stack Overflow posts and present instructive information to programmers. This might help programmers solve the programming problems they confront in their daily work. We analyzed posts related to the two most popular programming languages—Python and PHP. We performed a few trials and found that the supervised approach could effectively extract valuable information from our corpus. We validated the performance of our system against human perception, which showed an accuracy of 71%. We also present an interactive interface that answers users’ queries with the matching sentences containing the most instructive information.
    Electronic ISSN: 2073-431X
    Topics: Computer Science
  • 9
    Publication Date: 2020-07-19
    Description: Background: Health benefits from physical activity (PA) can be achieved by following the WHO recommendation for PA. To increase PA in inactive individuals, digital interventions can provide cost-effective and low-threshold access. Moreover, gamification elements can raise the motivation for PA. This study analyzed which factors (personality traits, app features, gamification) are relevant to increasing PA within this target group. Methods: N = 808 inactive participants (f = 480; m = 321; age = 48 ± 6) were included in the analysis of the desire for PA, the appearance of personality traits and the resulting interest in app features and gamification. The statistical analysis included chi-squared tests, one-way ANOVA and regression analysis. Results: The main interests in PA were fitness (97%) and outdoor activities (75%). No significant interaction between personality traits, interest in PA goals, app features and gamification was found. The interest in gamification was determined by the PA goal. Participants’ requirements for features included feedback and suggestions for activities. Monetary incentives were reported as relevant gamification aspects. Conclusion: Inactive people can be reached through outdoor activities, fitness and health sports, and interventions that promote an active lifestyle. The study highlighted the interest in specific app features and gamification to increase PA in inactive people through an app.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
  • 10
    Publication Date: 2020-07-01
    Description: This paper presents a study related to human psychophysiological activity estimation based on a smartphone camera and sensors. In recent years, awareness of the human body, as well as human mental states, has become more and more popular. Yoga and meditation practices have moved from the east to Europe, the USA, Russia, and other countries, and there are a lot of people who are interested in them. However, recently, people have tried the practice but would prefer an objective assessment. We propose to apply the modern methods of computer vision, pattern recognition, competence management, and dynamic motivation to estimate the quality of the meditation process and provide the users with objective information about their practice. We propose an approach that covers the possibility of recognizing pictures of humans from a smartphone and utilizes wearable electronics to measure the user’s heart rate and motions. We propose a model that allows building meditation estimation scores based on these parameters. Moreover, we propose a meditation expert network through which users can find the coach that is most appropriate for him/her. Finally, we propose the dynamic motivation model, which encourages people to perform the practice every day.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
  • 11
    Publication Date: 2020-08-31
    Description: The recent literature on globalizing regional development has placed significant emphasis on the Global Production Network (GPN 2.0). GPN 2.0 in economic geography emphasizes that regional growth is caused by a shift in the strategic coupling mode from a low to a high level. In addition, GPN 2.0 regards firm-level value capture trajectories as the key analytical object, rather than the interactive relationships among scalar and divergent actors of GPN 1.0. To provide a better understanding of the causal linkages between GPNs and uneven regional development against the background of globalization, and to test the applicability of the GPN 2.0 analysis framework, the paper analyzed 62 Korean-invested automotive firms in Jiangsu Province, China. In order to explore the value capture trajectories of lead firms in the GPNs, the authors applied the K-means clustering method to quantitatively analyze the local supply networks of lead firms along organizational and spatial dimensions. Then, comparisons were made between the strategic coupling modes of GPNs and regional development in North and South Jiangsu. This study found obvious similarities within these two regions but obvious differences between them in terms of value capture trajectories. We observed that North Jiangsu is currently in the stage of “structural coupling”, whereas South Jiangsu is in the stage of “functional coupling”. Thus, this article argues that spatial settings such as regional assets and autonomy are key factors influencing uneven economic development. This research may provide a crucial reference for the regional development of Jiangsu, China.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
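The K-means step used to group firms can be sketched in one dimension (the paper clusters multi-dimensional supply-network indicators; the data and k below are illustrative, and k ≥ 2 is assumed):

```python
def kmeans_1d(values, k, iters=20):
    # Initialize centers at evenly spaced quantiles of the sorted data
    # (assumes k >= 2), then alternate assignment and center updates.
    svals = sorted(values)
    centers = [svals[i * (len(svals) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda c: abs(v - centers[c]))
            clusters[nearest].append(v)
        # Move each center to the mean of its cluster; keep an empty
        # cluster's center where it was.
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return centers, clusters
```

For well-separated firm indicators the assignment stabilizes after a few iterations, yielding the cluster memberships that feed the coupling-mode comparison.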
  • 12
    Publication Date: 2020-08-31
    Description: Software defined networking (SDN) is an emerging network paradigm that decouples the control plane from the data plane. The data plane is composed of forwarding elements called switches, and the control plane is composed of controllers. SDN is gaining popularity in industry and academia due to its advantages, such as centralized, flexible, and programmable network management. The increasing volume of traffic due to the proliferation of Internet of Things (IoT) devices may result in two problems: (1) an increased processing load on the controller, and (2) insufficient space in the switches’ flow tables to accommodate the flow entries. These problems may cause undesired network behavior and unstable network performance, especially in large-scale networks. Many solutions have been proposed to improve the management of the flow table, reduce the controller processing load, and mitigate security threats and vulnerabilities on the controllers and switches. This paper provides a comprehensive survey of existing schemes to ensure SDN meets the quality of service (QoS) demands of various applications and cloud services. Finally, potential future research directions are identified and discussed, such as management of the flow table using machine learning.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
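Flow-table overflow, problem (2) above, is often mitigated by evicting entries; a minimal sketch of a fixed-capacity flow table with least-recently-used eviction (one common policy, not one the survey necessarily prescribes):

```python
from collections import OrderedDict

class FlowTable:
    # Fixed-capacity switch flow table with LRU eviction.
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()

    def match(self, flow):
        if flow in self.entries:
            self.entries.move_to_end(flow)  # refresh recency on a hit
            return self.entries[flow]
        return None  # table miss: packet would go to the controller

    def install(self, flow, action):
        if len(self.entries) >= self.capacity:
            self.entries.popitem(last=False)  # evict least recently used
        self.entries[flow] = action
```

Every table miss costs a round-trip to the controller, which is exactly how flow-table pressure (2) feeds back into controller load (1).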
  • 13
    Publication Date: 2020-07-16
    Description: High-order convective Cahn-Hilliard-type equations describe the faceting of a growing surface, or the dynamics of phase transitions in ternary oil-water-surfactant systems. In this paper, we prove the well-posedness of the classical solutions for the Cauchy problem associated with this equation.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
  • 14
    Publication Date: 2020-07-15
    Description: As Web applications become more and more complex, development costs are increasing as well. A Model Driven Architecture (MDA) approach is proposed in this paper, since it simplifies the modeling, design, implementation, and integration of applications by defining software mainly at the model level. We adopt the Unified Modeling Language (UML) as the modeling language. UML provides a set of diagrams to model structural and behavioral aspects of Web applications. Automatic translation of UML diagrams into object-oriented code is highly desirable because it eliminates the chance of introducing human errors. Moreover, automatic code generation helps software designers deliver the software on time. In our approach, the automatic transformations across the MDA levels are based on meta-models for two of the most important constructs of UML, namely Use Cases and classes. A proprietary tool (called xGenerator) performs the transformations up to the Java source code. The architecture of the generated Web applications follows a variant of the well-known Model-View-Controller (MVC) pattern.
    Electronic ISSN: 2073-431X
    Topics: Computer Science
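The model-to-code idea can be sketched with a toy generator that emits a class from a minimal class-model description (xGenerator targets Java and full UML meta-models; this Python analogue is purely illustrative):

```python
def generate_class(name, attributes):
    # Emit source code for a simple entity class from a minimal
    # class-model description: a name plus a list of attributes.
    lines = [f"class {name}:"]
    args = ", ".join(attributes)
    lines.append(f"    def __init__(self, {args}):")
    for a in attributes:
        lines.append(f"        self.{a} = {a}")
    return "\n".join(lines)
```

The generated source can be compiled and used directly, which is the point of model-driven generation: the model, not hand-written code, is the authoritative artifact.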
  • 15
    Publication Date: 2020-07-15
    Description: It is critical for organizations to self-assess their Industry 4.0 readiness to survive and thrive in the age of the Fourth Industrial Revolution. To that end, the conceptualization or development of an Industry 4.0 readiness model with the fundamental model dimensions is needed. This paper used a systematic literature review (SLR) methodology, following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines and a content analysis strategy, to review 97 papers in peer-reviewed academic journals and industry reports published from 2000 to 2019. The review identifies 30 Industry 4.0 readiness models with 158 unique model dimensions. Based on this review, there are two theoretical contributions. First, this paper proposes six dimensions (Technology, People, Strategy, Leadership, Process and Innovation) that can be considered the most important dimensions for organizations. Second, the review reveals that 70 (44%) of the 158 unique dimensions pertain to the assessment of technology alone. This establishes that organizations need to largely improve their technology readiness in order to strengthen their overall Industry 4.0 readiness. In summary, these six most common dimensions, and in particular the dominance of the technology dimension, provide a research agenda for future research on Industry 4.0 readiness.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
  • 16
    Publication Date: 2020-07-16
    Description: This study introduces a software-based traffic congestion monitoring system. Transportation systems control the traffic between cities all over the world. Traffic congestion happens not only in cities, but also on highways and elsewhere. The current transportation system is not satisfactory in areas without monitoring. In order to overcome the limitations of the current traffic system in obtaining road data and to expand its visual range, the proposed system uses remote sensing data as the data source for judging congestion. Since some remote sensing data must be kept confidential, effectively protecting such data during the deep learning training process is a problem that needs to be solved. Compared with general deep learning training methods, this study provides a federated learning method to identify vehicle targets in remote sensing images, solving the problem of data privacy in the training process. The experiment takes remote sensing image data sets of Los Angeles roads and Washington roads as samples for training; the training achieves an accuracy of about 85%, and the estimated processing time of each image can be as low as 0.047 s. In the final experimental results, the system can automatically identify the vehicle targets in the remote sensing images to achieve the purpose of detecting congestion.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
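The federated learning scheme keeps raw remote sensing images local and shares only model weights; the aggregation step can be sketched as a weighted (FedAvg-style) average, with illustrative weights and client sizes:

```python
def federated_average(client_weights, client_sizes):
    # Aggregate model parameters from several clients by a weighted
    # average, each client weighted by its local data-set size.
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [
        sum(w[i] * s for w, s in zip(client_weights, client_sizes)) / total
        for i in range(n_params)
    ]
```

Only these parameter vectors cross the network, so the confidential imagery never leaves the client, which is the privacy property the study relies on.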
  • 17
    Publication Date: 2020-07-15
    Description: The spatially nonuniform phenomena and chaotic nature of fractals motivate the use of fractal functions in cryptographic applications. This paper proposes a new composite fractal function (CFF) that combines two different Mandelbrot set (MS) functions with one control parameter. The CFF simulation results demonstrate that the given map has high initial-value sensitivity, a complex structure, a wider chaotic region, and more complicated dynamical behavior. By exploiting the chaotic properties of a fractal, an image encryption algorithm using fractal-based pixel permutation and substitution is proposed. The process starts by scrambling the plain-image pixel positions using the Henon map, so that an intruder fails to obtain the original image even after deducing the standard confusion-diffusion process. The permutation phase uses a Z-scanned random fractal matrix to shuffle the scrambled image pixels. Further, two different fractal sequences of complex numbers are generated using the same function, i.e., the CFF. The complex sequences are then converted to a double-datatype matrix and used to diffuse the scrambled pixels row-wise and column-wise, separately. Security and performance analysis results confirm the reliability, high security level, and robustness of the proposed algorithm against various attacks, including brute-force attacks, known/chosen-plaintext attacks, differential attacks, and occlusion attacks.
    Electronic ISSN: 2313-433X
    Topics: Computer Science
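The Henon-map scrambling step can be sketched by iterating the map and argsorting its orbit to obtain a pixel permutation (classical map parameters and seed values, not necessarily the paper's):

```python
def henon_permutation(n, x=0.1, y=0.3, a=1.4, b=0.3):
    # Iterate the Henon map x' = 1 - a*x^2 + y, y' = b*x and argsort
    # the chaotic x-values to derive a pixel permutation.
    xs = []
    for _ in range(n):
        x, y = 1 - a * x * x + y, b * x
        xs.append(x)
    return sorted(range(n), key=lambda i: xs[i])

def scramble(pixels, perm):
    return [pixels[i] for i in perm]

def unscramble(pixels, perm):
    # Invert the permutation applied by scramble().
    out = [0] * len(perm)
    for j, i in enumerate(perm):
        out[i] = pixels[j]
    return out
```

Because the orbit is sensitive to the initial values, the seed pair (x, y) acts as part of the key: a receiver with the same seed regenerates the same permutation and inverts the scrambling.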
  • 18
    Publication Date: 2020-07-08
    Description: In the last decade, there has been a surge in interest in connected and automated vehicles (CAVs) and related enabling technologies in the fields of communication, automation, computing, sensing, and positioning [...]
    Electronic ISSN: 1999-5903
    Topics: Computer Science
  • 19
    Publication Date: 2020-07-08
    Description: We consider a rather general problem of nonparametric estimation of an uncountable set of probability density functions (p.d.f.’s) of the form f(x; r), where r is a non-random real variable ranging from R1 to R2. We put emphasis on the algorithmic aspects of this problem, since they are crucial for the exploratory analysis of the big data needed for the estimation. A specialized learning algorithm, based on the 2D FFT, is proposed and tested on observations that allow for estimating the p.d.f.’s of jet engine temperatures as a function of rotation speed. We also derive theoretical results concerning the convergence of the estimation procedure that provide hints on selecting the parameters of the estimation algorithm.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
  • 20
    Publication Date: 2020-07-08
    Description: The lockdown was crucial to stopping the COVID-19 pandemic in Italy, but it affected many aspects of social life, among them traditional live science cafés. Moreover, citizens and experts asked for direct contact, not relying on mass-media communication. In this paper, we describe how the Florence and Rome science cafés, contacted by citizens and experts either directly or through the Florence science shop, responded to these needs by organizing online versions of traditional face-to-face events, experiencing high levels of participation. The science café methodology was also requested by a high school that needed to conclude an engagement experience with students and their families. We also report the results of a survey about satisfaction with this new methodology with respect to the old one.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
  • 21
    Publication Date: 2020-07-09
    Description: This research presents a machine vision approach to detecting lesions in liver ultrasound and to resolving some issues of ultrasound imaging such as artifacts, speckle noise, and blurring. The anisotropic diffusion is modified using edge-preservation conditions, which were found to perform better than traditional ones in quantitative evaluation. To extract more latent information, a learnable super-resolution (SR) module is embedded into the deep CNN. Features are fused using the Gabor Wavelet Transform (GWT) and Local Binary Patterns (LBP) with a pre-trained deep CNN model. Moreover, we propose a Bayes rule-based informative patch selection approach to reduce the processing time with selective image patches, and design an algorithm to mark the lesion region from identified ultrasound image patches. The model is trained on standard data of adequate resolution; the testing phase considers generalized data with varying resolution to test the performance of the model. Exploring cross-validation, we find that a 5-fold strategy can successfully eradicate the overfitting problem. Experimental data were collected from 298 consecutive ultrasounds comprising 15,296 image patches. The proposed feature fusion technique confirms satisfactory performance compared to the current relevant works, with an accuracy of 98.40%.
    Electronic ISSN: 2504-4990
    Topics: Computer Science
  • 22
    Publication Date: 2020-07-10
    Description: QR (quick response) Codes are one of the most popular types of two-dimensional (2D) matrix codes currently used in a wide variety of fields. Two-dimensional matrix codes, compared to 1D bar codes, can encode significantly more data in the same area. We have compared algorithms capable of localizing multiple QR Codes in an image using typical finder patterns, which are present in three corners of a QR Code. Finally, we present a novel approach to identify perspective distortion by analyzing the direction of horizontal and vertical edges and by maximizing the standard deviation of horizontal and vertical projections of these edges. This algorithm is computationally efficient, works well for low-resolution images, and is also suited to real-time processing.
    Electronic ISSN: 2313-433X
    Topics: Computer Science
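Finder-pattern localization rests on the 1:1:3:1:1 dark/light module ratio along a scanline through a QR Code corner; a minimal ratio check (the tolerance value is illustrative):

```python
def is_finder_run(runs, tolerance=0.5):
    # Check whether five consecutive run-lengths of alternating
    # dark/light pixels match the 1:1:3:1:1 finder-pattern ratio.
    if len(runs) != 5 or min(runs) <= 0:
        return False
    module = sum(runs) / 7.0  # the pattern spans 7 modules in total
    expected = [1, 1, 3, 1, 1]
    return all(abs(r - e * module) <= tolerance * module
               for r, e in zip(runs, expected))
```

Because the ratio is scale-invariant, the same check works at any code size, which is why finder patterns survive low-resolution imaging reasonably well.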
  • 23
    Publication Date: 2020-07-08
    Description: Deep learning models have been applied to varied electrical applications in smart grids with a high degree of reliability and accuracy. The development of deep learning models requires historical data collected from several electric utilities for training. The lack of historical data for training and testing the developed models, given security and privacy policy restrictions, is considered one of the greatest challenges to machine learning-based techniques. This paper proposes the use of homomorphic encryption, which makes it possible to train deep learning and classical machine learning models whilst preserving the privacy and security of the data. The proposed methodology is tested for applications of fault identification and localization, and load forecasting, in smart grids. The results for fault localization show that the classification accuracy of the proposed privacy-preserving deep learning model using homomorphic encryption is 97–98%, which is close to the 98–99% classification accuracy of the model on plain data. Additionally, for the load forecasting application, the results show that the RMSE with the homomorphic encryption model is 0.0352 MWh, while the RMSE without encryption is around 0.0248 MWh.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
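The abstract does not specify which homomorphic scheme is used; one standard additively homomorphic choice is the Paillier cryptosystem, sketched below as a toy to show how sums can be computed on ciphertexts. The tiny primes are wildly insecure and purely illustrative.

```python
import random
from math import gcd, lcm

# Toy Paillier cryptosystem (hypothetical parameters, insecure key size).
p, q = 17, 19
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = lcm(p - 1, q - 1)
mu = pow(lam, -1, n)          # precomputed helper for decryption

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(c^lam mod n^2) * mu mod n, with L(u) = (u - 1) // n
    return ((pow(c, lam, n2) - 1) // n * mu) % n

a, b = 35, 48
# Multiplying ciphertexts adds the underlying plaintexts (mod n):
total = decrypt((encrypt(a) * encrypt(b)) % n2)
print("sum computed under encryption:", total)
```

This additive property is what lets aggregate quantities (e.g., training statistics) be combined without exposing individual utilities' raw data.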
  • 24
    Publication Date: 2020-07-07
    Description: Fifth generation (5G) mobile communication systems have been developed to meet the growing demand for mobile communication. Channel coding is an indispensable part of most modern digital communication systems, as it improves transmission reliability and interference resistance. To meet the requirements of 5G communication, a dual threshold self-corrected minimum sum (DT-SCMS) algorithm for low-density parity-check (LDPC) decoders is proposed in this paper, along with an architecture for LDPC decoders. By setting thresholds to judge the reliability of messages, the DT-SCMS algorithm erases unreliable messages, improving decoding performance and efficiency. Simulation results show that the performance of DT-SCMS is better than that of SCMS. When the code rate is 1/3, DT-SCMS improves performance by 0.2 dB at a bit error rate of 10^-4 compared with SCMS. In terms of convergence, when the code rate is 2/3, the number of iterations of DT-SCMS can be reduced by up to 20.46% compared with SCMS, with an average reduction of 18.68%.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
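The abstract does not give the DT-SCMS update rules, but the building blocks can be sketched: the min-sum check-node update, plus an SCMS-style erasure in which a message whose sign flipped between iterations is judged unreliable. The threshold value here is hypothetical, standing in for the paper's dual thresholds.

```python
import math

def check_node_update(msgs):
    """Min-sum check-node rule: the message sent back on edge i carries
    the product of the signs and the minimum magnitude of all *other*
    incoming variable-to-check LLR messages."""
    out = []
    for i in range(len(msgs)):
        others = msgs[:i] + msgs[i + 1:]
        sign = math.prod(1 if m >= 0 else -1 for m in others)
        out.append(sign * min(abs(m) for m in others))
    return out

def self_correct(new, old, threshold=0.0):
    """SCMS-style erasure with a reliability threshold (hypothetical
    value): a message whose sign flipped since the last iteration and
    whose magnitude is at most the threshold is erased to 0."""
    return [0.0 if n * o < 0 and abs(n) <= threshold else n
            for n, o in zip(new, old)]

prev = [-1.2, 0.4, 2.0]                      # previous-iteration messages
curr = check_node_update([0.9, -0.6, 1.5])   # this iteration's messages
print(self_correct(curr, prev, threshold=1.0))  # third message erased
```

Erasing weak sign-flipping messages is what suppresses unreliable information propagation and speeds convergence in self-corrected decoders.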
  • 25
    Publication Date: 2020-07-09
    Description: We report the design of a Spiking Neural Network (SNN) edge detector with biologically inspired neurons that has a conceptual similarity with both Hodgkin-Huxley (HH) model neurons and Leaky Integrate-and-Fire (LIF) neurons. The computation of the membrane potential, which is used to determine the occurrence or absence of spike events, at each time step, is carried out by using the analytical solution to a simplified version of the HH neuron model. We find that the SNN based edge detector detects more edge pixels in images than those obtained by a Sobel edge detector. We designed a pipeline for image classification with a low-exposure frame simulation layer, SNN edge detection layers as pre-processing layers and a Convolutional Neural Network (CNN) as a classification module. We tested this pipeline for the task of classification with the Digits dataset, which is available in MATLAB. We find that the SNN based edge detection layer increases the image classification accuracy at lower exposure times, that is, for 1 < t < T/4, where t is the number of milliseconds in a simulated exposure frame and T is the total exposure time, with reference to a Sobel edge or Canny edge detection layer in the pipeline. These results pave the way for developing novel cognitive neuromorphic computing architectures for millisecond timescale detection and object classification applications using event or spike cameras.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
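The core mechanism described above, membrane-potential integration with spiking on threshold crossing, can be sketched with a plain LIF neuron. Each step below uses the exact solution of tau·dV/dt = -V + I over one time step, echoing the paper's use of an analytical solution to a simplified HH model; the parameter values are illustrative, not the authors'.

```python
import math

def lif_spikes(current, dt=1.0, tau=10.0, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron driven by an input current.
    Returns the time indices at which the membrane crosses threshold."""
    v, spikes = v_reset, []
    decay = math.exp(-dt / tau)
    for t, i_in in enumerate(current):
        v = v * decay + i_in * (1.0 - decay)   # exact update over one step
        if v >= v_th:                          # threshold crossing -> spike
            spikes.append(t)
            v = v_reset                        # reset after the spike
    return spikes

# A strong sustained input (e.g., a large local intensity gradient at an
# edge pixel) drives the neuron to spike; a weak input saturates below
# threshold and stays silent, so no edge is reported there.
print(lif_spikes([2.0] * 30))
print(lif_spikes([0.5] * 30))   # []
```

In an edge-detecting SNN, the input current at each pixel is derived from local intensity differences, so spiking neurons mark edge locations.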
  • 26
    Publication Date: 2020-07-08
    Description: The collection and processing of personal data offers great opportunities for technological advances, but the accumulation of vast amounts of personal data also increases the risk of misuse for malicious intentions, especially in health care. Therefore, personal data are legally protected, e.g., by the European General Data Protection Regulation (GDPR), which states that individuals must be transparently informed and have the right to take control over the processing of their personal data. In real applications, privacy policies are used to fulfill these requirements, and they can be negotiated via user interfaces. The literature proposes privacy languages as an electronic format for privacy policies, while users' privacy preferences are represented by preference languages. However, this is only the beginning of the personal data life-cycle, which also includes the processing of personal data and its transfer to various stakeholders. In this work we define a personal privacy workflow, covering the negotiation of privacy policies, privacy-preserving processing, and the secondary use of personal data, in the context of health care data processing, and survey applicable Privacy Enhancing Technologies (PETs) that ensure the individuals' privacy. Based on a broad literature review, we identify open research questions for each step of the workflow.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
  • 27
    Publication Date: 2020-07-05
    Description: Microscopic crowd simulation can help to enhance the safety of pedestrians in situations that range from museum visits to music festivals. To obtain a useful prediction, the input parameters must be chosen carefully. In many cases, a lack of knowledge or limited measurement accuracy add uncertainty to the input. In addition, for meaningful parameter studies, we first need to identify the most influential parameters of our parametric computer models. The field of uncertainty quantification offers standardized and fully automatized methods that we believe to be beneficial for pedestrian dynamics. In addition, many methods come at a comparatively low cost, even for computationally expensive problems. This allows for their application to larger scenarios. We aim to identify and adapt fitting methods to microscopic crowd simulation in order to explore their potential in pedestrian dynamics. In this work, we first perform a variance-based sensitivity analysis using Sobol’ indices and then crosscheck the results by a derivative-based measure, the activity scores. We apply both methods to a typical scenario in crowd simulation, a bottleneck. Because constrictions can lead to high crowd densities and delays in evacuations, several experiments and simulation studies have been conducted for this setting. We show qualitative agreement between the results of both methods. Additionally, we identify a one-dimensional subspace in the input parameter space and discuss its impact on the simulation. Moreover, we analyze and interpret the sensitivity indices with respect to the bottleneck scenario.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
  • 28
    Publication Date: 2020-06-30
    Description: The use of chatbots in news media platforms, although relatively recent, offers many advantages to journalists and media professionals and, at the same time, facilitates users’ interaction with useful and timely information. This study shows the usability of a news chatbot during a crisis situation, employing the 2020 COVID-19 pandemic as a case study. The basic targets of the research are to design and implement a chatbot in a news media platform with a two-fold aim in regard to evaluation: first, the technical effort of creating a functional and robust news chatbot in a crisis situation both from the AI perspective and interoperability with other platforms, which constitutes the novelty of the approach; and second, users’ perception regarding the appropriation of this news chatbot as an alternative means of accessing existing information during a crisis situation. The chatbot designed was evaluated in terms of effectively fulfilling the social responsibility function of crisis reporting, to deliver timely and accurate information on the COVID-19 pandemic to a wide audience. In this light, this study shows the advantages of implementing chatbots in news platforms during a crisis situation, when the audience’s needs for timely and accurate information rapidly increase.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
  • 29
    Publication Date: 2020-06-30
    Description: Twitter is a microblogging platform that generates large volumes of data with high velocity. This daily generation of unbounded and continuous data leads to Big Data streams that often require real-time distributed and fully automated processing. Hashtags, hyperlinked words in tweets, are widely used for tweet topic classification, retrieval, and clustering. Hashtags are also widely used for analyzing tweet sentiment, where emotions can be classified without context. However, regardless of the wide usage of hashtags, general tweet topic classification using hashtags is challenging due to their evolving nature, lack of context, slang, abbreviations, and non-standardized expression by users. Most existing approaches that utilize hashtags for tweet topic classification focus on extracting hashtag concepts from external lexicon resources to derive semantics. However, due to the rapid evolution and non-standardized expression of hashtags, the majority of these lexicon resources either suffer from the lack of hashtag words in their knowledge bases or use multiple resources at once to derive semantics, which makes them unscalable. Along with scalable and automated techniques for tweet topic classification using hashtags, there is also a requirement for real-time analytics approaches to handle the huge and dynamic flows of textual streams generated by Twitter. To address these problems, this paper first presents a novel semi-automated technique that derives semantically relevant hashtags using a domain-specific knowledge base of topic concepts and combines them with the existing tweet-based hashtags to produce Hybrid Hashtags. Further, to deal with the speed and volume of Big Data streams of tweets, we present an online approach that updates the preprocessing and learning model incrementally in a real-time streaming environment using the distributed framework Apache Storm. Finally, to fully exploit the performance advantages of batch and stream environments, we propose a comprehensive framework (the Hybrid Hashtag-based Tweet topic classification (HHTC) framework) that combines batch and online mechanisms in the most effective way. Extensive experimental evaluations on a large volume of Twitter data show that the batch and online mechanisms, along with their combination in the proposed framework, are scalable, efficient, and provide effective tweet topic classification using hashtags.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
  • 30
    Publication Date: 2020-06-30
    Description: Standard (Lomb-Scargle, likelihood, etc.) procedures for power-spectrum analysis provide convenient estimates of the significance of any peak in a power spectrum, based—typically—on the assumption that the measurements being analyzed have a normal (i.e., Gaussian) distribution. However, the measurement sequence provided by a real experiment or a real observational program may not meet this requirement. The RONO (rank-order normalization) procedure generates a proxy distribution that retains the rank-order of the original measurements but has a strictly normal distribution. The proxy distribution may then be analyzed by standard power-spectrum analysis. We show by an example that the resulting power spectrum may prove to be quite close to the power spectrum obtained from the original data by a standard procedure, even if the distribution of the original measurements is far from normal. Such a comparison would tend to validate the original analysis.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
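A natural realization of the RONO proxy described above is to replace each measurement by the standard normal quantile of its mid-rank, which preserves the rank order exactly while forcing a strictly Gaussian marginal. This sketch assumes distinct values (ties are not handled) and uses mid-rank plotting positions (rank + 0.5)/n as one common convention; the paper's exact recipe may differ.

```python
from statistics import NormalDist

def rank_order_normalize(xs):
    """RONO-style proxy distribution: map the i-th smallest measurement
    to the standard normal quantile of its mid-rank plotting position."""
    n = len(xs)
    order = sorted(range(n), key=xs.__getitem__)   # indices in value order
    nd = NormalDist()
    proxy = [0.0] * n
    for rank, idx in enumerate(order):
        proxy[idx] = nd.inv_cdf((rank + 0.5) / n)
    return proxy

data = [3.0, 100.0, 1.0, 7.0]          # heavily non-Gaussian measurements
print(rank_order_normalize(data))      # same ordering, Gaussian quantiles
```

The proxy series can then be fed to a standard (e.g., Lomb-Scargle) power-spectrum analysis whose significance estimates assume normality.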
  • 31
    Publication Date: 2020-06-30
    Description: In response to the strong demand for very high-speed I/O in processors, the physical performance of hardware I/O has increased drastically over the past decade. However, recent Big Data applications still demand larger I/O bandwidth and lower latency. Because raw I/O performance is no longer improving so drastically, it is time to consider other ways to increase it. To address this challenge, we focus on lossless data compression technology to reduce the amount of data itself in the data communication path. Recent Big Data applications handle data streams that flow continuously and cannot tolerate stalled processing at these speeds. Therefore, an efficient hardware-based data compression technology is needed. This paper proposes a novel lossless data compression method, called ASE coding, which encodes streaming data using an entropy coding approach. ASE coding instantly assigns the fewest bits to the corresponding compressed data according to the number of occupied entries in a look-up table. This paper describes the detailed mechanism of ASE coding. Furthermore, performance evaluations demonstrate that ASE coding adaptively shrinks streaming data and works with a small amount of hardware resources without stalling or buffering any part of the data stream.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
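The key idea, spending only as many bits as the current look-up-table occupancy requires, can be mimicked with a toy adaptive table coder. The real ASE bitstream layout is not specified in the abstract; the escape/literal convention, move-to-front policy, and bit accounting below are hypothetical.

```python
from math import ceil, log2

def ase_like_encode(data):
    """Toy coder in the spirit of ASE coding: a table index is emitted
    with just enough bits for the number of currently occupied entries
    (plus one escape code); an unseen symbol costs the escape plus an
    8-bit literal and is inserted into the table."""
    table, bits = [], 0
    for sym in data:
        # enough bits to address every occupied entry plus one escape code
        index_bits = max(1, ceil(log2(len(table) + 1)))
        if sym in table:
            bits += index_bits                       # hit: short index code
            table.remove(sym)
            table.insert(0, sym)                     # move-to-front
        else:
            bits += index_bits + 8                   # miss: escape + raw byte
            table.insert(0, sym)
    return bits

stream = b"aaaaabaaaa"
print(ase_like_encode(stream), "bits vs", 8 * len(stream), "bits raw")
```

Because the code length tracks table occupancy instantly, each symbol can be emitted as soon as it arrives, which is the property that makes the approach suitable for unbuffered streaming hardware.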
  • 32
    Publication Date: 2020-06-30
    Description: When highly automated driving is realized, the role of the driver will change dramatically. Drivers will even be able to sleep during the drive. However, when awaking from sleep, drivers often experience sleep inertia, meaning they feel groggy and are impaired in their driving performance, which can be an issue with the concept of dual-mode vehicles that allow both manual and automated driving. Proactive methods to avoid sleep inertia, like the widely applied 'NASA nap', are not immediately practicable in automated driving. Therefore, a reactive countermeasure, the sleep inertia counter-procedure for drivers (SICD), has been developed with the aim to activate and motivate the driver as well as to measure the driver's alertness level. The SICD is evaluated in a study with N = 21 drivers in a highly automated driving simulator. The SICD was able to activate the driver after sleep and was perceived as "assisting" by the drivers. It was not capable of measuring the driver's alertness level. The interpretation of the findings is limited due to the lack of a comparative baseline condition. Future research is needed on direct comparisons of different countermeasures to sleep inertia that are effective and accepted by drivers.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
  • 33
    Publication Date: 2020-07-01
    Description: Text annotation is the process of identifying the sense of a textual segment within a given context to a corresponding entity on a concept ontology. As the bag of words paradigm’s limitations become increasingly discernible in modern applications, several information retrieval and artificial intelligence tasks are shifting to semantic representations for addressing the inherent natural language polysemy and homonymy challenges. With extensive application in a broad range of scientific fields, such as digital marketing, bioinformatics, chemical engineering, neuroscience, and social sciences, community detection has attracted great scientific interest. Focusing on linguistics, by aiming to identify groups of densely interconnected subgroups of semantic ontologies, community detection application has proven beneficial in terms of disambiguation improvement and ontology enhancement. In this paper we introduce a novel distributed supervised knowledge-based methodology employing community detection algorithms for text annotation with Wikipedia Entities, establishing the unprecedented concept of community Coherence as a metric for local contextual coherence compatibility. Our experimental evaluation revealed that deeper inference of relatedness and local entity community coherence in the Wikipedia graph bears substantial improvements overall via a focus on accuracy amelioration of less common annotations. The proposed methodology is propitious for wider adoption, attaining robust disambiguation performance.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
  • 34
    Publication Date: 2020-07-02
    Description: The problem posed by complex, articulated or deformable objects has been at the focus of much tracking research for a considerable length of time. However, it remains a major challenge, fraught with numerous difficulties. The increased ubiquity of technology in all realms of our society has made the need for effective solutions all the more urgent. In this article, we describe a novel method which systematically addresses the aforementioned difficulties and in practice outperforms the state of the art. Global spatial flexibility and robustness to deformations are achieved by adopting a pictorial structure based geometric model, and localized appearance changes by a subspace based model of part appearance underlain by a gradient based representation. In addition to one-off learning of both the geometric constraints and part appearances, we introduce a continuing learning framework which implements information discounting i.e., the discarding of historical appearances in favour of the more recent ones. Moreover, as a means of ensuring robustness to transient occlusions (including self-occlusions), we propose a solution for detecting unlikely appearance changes which allows for unreliable data to be rejected. A comprehensive evaluation of the proposed method, the analysis and discussing of findings, and a comparison with several state-of-the-art methods demonstrates the major superiority of our algorithm.
    Electronic ISSN: 2313-433X
    Topics: Computer Science
  • 35
    Publication Date: 2020-07-02
    Description: Image fusion is a process that integrates similar types of images collected from heterogeneous sources into one image in which the information is more definite and certain. Hence, the resultant image is anticipated to be more explanatory and enlightening for both human and machine perception. Different image combination methods have been presented to consolidate significant data from a collection of images into one image. As a result of its applications and advantages in a variety of fields, such as remote sensing, surveillance, and medical imaging, it is important to understand image fusion algorithms and to study them comparatively. This paper presents a review of the present state-of-the-art and well-known image fusion techniques. The performance of each algorithm is assessed qualitatively and quantitatively on two benchmark multi-focus image datasets. We also produce a multi-focus image fusion dataset by collecting the test images widely used in different studies. The quantitative evaluation of fusion results is performed using a set of image fusion quality assessment metrics. The performance is also evaluated using different statistical measures. Another contribution of this paper is the proposal of a multi-focus image fusion library; to the best of our knowledge, no such library exists so far. The library provides implementations of numerous state-of-the-art image fusion algorithms and is made publicly available at the project website.
    Electronic ISSN: 2313-433X
    Topics: Computer Science
  • 36
    Publication Date: 2020-07-02
    Description: Fitness and physical exercise are preferred in the pursuit of healthier and active lifestyles. The number of mobile applications aiming to replace or complement a personal trainer is increasing. However, this also raises questions about the reliability, integrity, and even safety of the information provided by such applications. In this study, we review mobile applications that serve as virtual personal trainers. We present a systematic review of 36 related mobile applications, updated between 2017 and 2020, classifying them according to their characteristics. The selection criteria consider the following combination of keywords: “workout”, “personal trainer”, “physical activity”, “fitness”, “gymnasium”, and “daily plan”. Based on the analysis of the identified mobile applications, we propose a new taxonomy and present detailed guidelines on creating mobile applications for personalised workouts. Finally, we investigate how mobile applications can promote the health and well-being of users and whether the identified applications are used in any scientific studies.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
  • 37
    Publication Date: 2020-08-31
    Description: Text similarity measurement is the basis of natural language processing tasks, which play an important role in information retrieval, automatic question answering, machine translation, dialogue systems, and document matching. This paper systematically surveys the state of research on similarity measurement, analyzes the advantages and disadvantages of current methods, develops a more comprehensive classification system for text similarity measurement algorithms, and outlines future development directions. With the aim of providing a reference for related research and application, text similarity measurement methods are described from two perspectives: text distance and text representation. Text distance can be divided into length distance, distribution distance, and semantic distance; text representation is divided into string-based, corpus-based, single-semantic text, multi-semantic text, and graph-structure-based representation. Finally, the development of text similarity is summarized in the discussion section.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
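Two representative measures from the taxonomy above, a string-based edit distance and a representation-based cosine similarity, can be sketched in a few lines (function names are ours):

```python
from collections import Counter
from math import sqrt

def levenshtein(a, b):
    """String-based (edit) distance: minimum number of single-character
    insertions, deletions, and substitutions turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,          # deletion
                            curr[-1] + 1,         # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def cosine_similarity(t1, t2):
    """Representation-based similarity: cosine of bag-of-words vectors."""
    v1, v2 = Counter(t1.split()), Counter(t2.split())
    dot = sum(v1[w] * v2[w] for w in v1)
    norm = (sqrt(sum(c * c for c in v1.values()))
            * sqrt(sum(c * c for c in v2.values())))
    return dot / norm if norm else 0.0

print(levenshtein("kitten", "sitting"))                            # 3
print(round(cosine_similarity("deep learning model", "deep model"), 3))
```

Semantic distances and graph-structure-based representations discussed in the paper build on richer representations (embeddings, knowledge graphs) but follow the same pattern: represent, then compare.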
  • 38
    Publication Date: 2020-06-30
    Description: Partially automated driving (PAD, Society of Automotive Engineers (SAE) level 2) features provide steering and brake/acceleration support, while the driver must constantly supervise the support feature and intervene if needed to maintain safety. PAD could potentially increase comfort, road safety, and traffic efficiency. As during manual driving, users might engage in non-driving related tasks (NDRTs). However, studies systematically examining NDRT execution during PAD are rare and most importantly, no established methodologies to systematically evaluate driver distraction during PAD currently exist. The current project’s goal was to take the initial steps towards developing a test protocol for systematically evaluating NDRT’s effects during PAD. The methodologies used for manual driving were extended to PAD. Two generic take-over situations addressing system limits of a given PAD regarding longitudinal and lateral control were implemented to evaluate drivers’ supervisory and take-over capabilities while engaging in different NDRTs (e.g., manual radio tuning task). The test protocol was evaluated and refined across the three studies (two simulator and one test track). The results indicate that the methodology could sensitively detect differences between the NDRTs’ influences on drivers’ take-over and especially supervisory capabilities. Recommendations were formulated regarding the test protocol’s use in future studies examining the effects of NDRTs during PAD.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
  • 39
    Publication Date: 2020-06-30
    Description: This research concerns the application of micro X-ray fluorescence (µXRF) mapping to the investigation of a group of selected metal objects from the archaeological site of Ferento, a Roman and then medieval town in Central Italy. Specifically, attention was focused on two test pits, named IV and V, in which metal objects were found, mainly pertaining to the medieval period and never investigated before the present work from a compositional point of view. The potentiality of µXRF mapping was tested through a Bruker Tornado M4 equipped with an Rh tube, operating at 50 kV, 500 μA, and spot 25 μm obtained with polycapillary optics. Principal component analysis (PCA) and multivariate curve resolution (MCR) were used for processing the X-ray fluorescence spectra. The results showed that the investigated items are characterized by different compositions in terms of chemical elements. Three little wheels are made of lead, while the fibulae are made of copper-based alloys with varying amounts of tin, zinc, and lead. Only one ring is iron-based, and the other objects, namely a spatula and an applique, are also made of copper-based alloys, but with different relative amounts of the main elements. In two objects, traces of gold were found, suggesting the precious character of these pieces. MCR analysis was demonstrated to be particularly useful to confirm the presence of trace elements, such as gold, as it could differentiate the signals related to minor elements from those due to major chemical elements.
    Electronic ISSN: 2313-433X
    Topics: Computer Science
  • 40
    Publication Date: 2020-06-30
    Description: Geomechanical modelling of the processes associated with the exploitation of subsurface resources, such as land subsidence or triggered/induced seismicity, is a common practice of major interest. The prediction reliability depends on different sources of uncertainty, such as the parameterization of the constitutive model characterizing the deep rock behaviour. In this study, we focus on a Sobol'-based sensitivity analysis and uncertainty reduction via assimilation of land deformations. A synthetic test case application on a deep hydrocarbon reservoir is considered, where land settlements are predicted with the aid of a 3-D Finite Element (FE) model. Data assimilation is performed via the Ensemble Smoother (ES) technique and its variation in the form of Multiple Data Assimilation (ES-MDA). However, ES convergence is guaranteed only with a large number of Monte Carlo (MC) simulations, which may be computationally infeasible in large-scale and complex systems. For this reason, a surrogate model based on the generalized Polynomial Chaos Expansion (gPCE) is proposed as an approximation of the forward problem. This approach allows efficient computation of the Sobol' indices for the sensitivity analysis and greatly reduces the computational cost of the original ES and ES-MDA formulations, also enhancing the accuracy of the overall prediction process.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
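The Ensemble Smoother analysis step mentioned above can be sketched for a single scalar parameter: each ensemble member is shifted by a Kalman-type gain times its mismatch with a perturbed observation. The forward map y = 3x, the prior statistics, and the observation are hypothetical stand-ins for the paper's FE-modelled land settlements.

```python
import random
from statistics import fmean

def ensemble_smoother_update(ensemble, observed, obs_std, forward, seed=0):
    """One ES analysis step for a scalar parameter (minimal sketch)."""
    rng = random.Random(seed)
    preds = [forward(m) for m in ensemble]
    m_bar, d_bar = fmean(ensemble), fmean(preds)
    c_md = fmean((m - m_bar) * (d - d_bar) for m, d in zip(ensemble, preds))
    c_dd = fmean((d - d_bar) ** 2 for d in preds)
    gain = c_md / (c_dd + obs_std ** 2)   # cross-cov / (data cov + obs noise)
    # shift each member toward a perturbed copy of the observation
    return [m + gain * (observed + rng.gauss(0, obs_std) - d)
            for m, d in zip(ensemble, preds)]

# Prior ensemble for an uncertain parameter (hypothetical numbers); the
# "true" value 2.0 would produce the observation 6.0 through y = 3x.
rng = random.Random(1)
prior = [rng.gauss(1.0, 0.5) for _ in range(200)]
post = ensemble_smoother_update(prior, observed=6.0, obs_std=0.1,
                                forward=lambda x: 3.0 * x)
print(round(fmean(prior), 2), "->", round(fmean(post), 2))  # pulled toward 2.0
```

ES-MDA repeats this update several times with inflated observation noise; the gPCE surrogate in the paper replaces the expensive `forward` call so that large ensembles become affordable.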
  • 41
    Publication Date: 2020-06-30
    Description: Prior research found that user personality significantly affects technology acceptance perceptions and decisions. Yet, evidence on the moderating influence of user gender on the relationship between personality and technology acceptance is barely existent despite theoretical consideration. Considering this research gap, the present study reports the results of a survey in which we examined the relationships between personality and technology acceptance from a gender perspective. This study draws upon a sample of N = 686 participants (n = 209 men, n = 477 women) and applied the HEXACO Personality Inventory—Revised along with established technology acceptance measures. The major result of this study is that we do not find significant influence of user gender on the relationship between personality and technology acceptance, except for one aspect of personality, namely altruism. We found a negative association between altruism and intention to use the smartphone in men, but a positive association in women. Consistent with this finding, we also found the same association pattern for altruism and predicted usage: a negative one in men and a positive one in women. Implications for research and practice are discussed, along with limitations of the present study and possible avenues for future research.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
  • 42
    Publication Date: 2020-06-30
    Description: Clustering is an unsupervised machine learning technique with many practical applications that has gathered extensive research interest. Aside from deterministic or probabilistic techniques, fuzzy C-means clustering (FCM) is also a common clustering technique. Since the advent of the FCM method, many improvements have been made to increase clustering efficiency. These improvements focus on adjusting the membership representation of elements in the clusters, or on fuzzifying and defuzzifying techniques, as well as the distance function between elements. This study proposes a novel fuzzy clustering algorithm using multiple different fuzzification coefficients depending on the characteristics of each data sample. The proposed fuzzy clustering method has similar calculation steps to FCM with some modifications. The formulas are derived to ensure convergence. The main contribution of this approach is the utilization of multiple fuzzification coefficients as opposed to only one coefficient in the original FCM algorithm. The new algorithm is then evaluated with experiments on several common datasets and the results show that the proposed algorithm is more efficient compared to the original FCM as well as other clustering methods.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
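For reference, the baseline that the paper generalises, FCM with a single fuzzification coefficient m, alternates a membership update and a centre update. This sketch works on scalar data with illustrative values; the paper's contribution (a per-sample fuzzifier) is not implemented here.

```python
import random

def fuzzy_c_means(data, c=2, m=2.0, iters=50, seed=0):
    """Plain FCM on scalar data.  Returns cluster centres and the
    membership matrix u (rows sum to 1)."""
    rng = random.Random(seed)
    centres = rng.sample(data, c)           # initialise from the data
    u = []
    for _ in range(iters):
        # membership update: u_ij = 1 / sum_k (d_ij / d_ik)^(2/(m-1))
        u = []
        for x in data:
            d = [abs(x - v) or 1e-12 for v in centres]   # avoid div by zero
            u.append([1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0))
                                for k in range(c)) for j in range(c)])
        # centre update: mean weighted by u_ij^m
        centres = [sum(u[i][j] ** m * x for i, x in enumerate(data))
                   / sum(u[i][j] ** m for i in range(len(data)))
                   for j in range(c)]
    return centres, u

data = [0.9, 1.1, 1.0, 4.9, 5.1, 5.0]       # two obvious groups
centres, u = fuzzy_c_means(data)
print([round(v, 2) for v in sorted(centres)])
```

In the proposed algorithm, the exponent m would vary per sample according to its characteristics, changing both update formulas while keeping this overall alternating structure.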
  • 43
    Publication Date: 2020-07-02
    Description: An accurate estimate of the number of passengers in each metro car contributes to the safe coordination and management of passenger crowds in each metro station. In this work we propose a multi-head Convolutional Neural Network (CNN) architecture trained to estimate passenger attendance in a metro car. The proposed network architecture consists of two main parts: a convolutional backbone, which extracts features over the whole input image, and multi-head layers able to estimate a density map, needed to predict the number of people within the crowd image. The network performance is first evaluated on publicly available crowd counting datasets, including ShanghaiTech part_A, ShanghaiTech part_B and UCF_CC_50, and then trained and tested on our dataset acquired in subway cars in Italy. In both cases a comparison is made against the most relevant and latest state-of-the-art crowd counting architectures, showing that our proposed MH-MetroNet architecture outperforms them in terms of Mean Absolute Error (MAE) and Mean Square Error (MSE) and in predicting the number of passengers in the crowd.
    Electronic ISSN: 2313-433X
    Topics: Computer Science
  • 44
    Publication Date: 2020-07-03
    Description: For imaging events of extremely short duration, like shock waves or explosions, it is necessary to image the object with a single-shot exposure. A suitable setup is given by a laser-induced X-ray source such as the one at GSI (Helmholtzzentrum für Schwerionenforschung GmbH, Society for Heavy Ion Research) in Darmstadt, Germany. There, it is possible to direct a pulse from the high-energy laser Petawatt High Energy Laser for Heavy Ion eXperiments (PHELIX) onto a tungsten wire to generate a picosecond polychromatic X-ray pulse, called a backlighter. For grating-based single-shot phase-contrast imaging of shock waves or exploding wires, it is important to know the weighted mean energy of the X-ray spectrum when choosing a suitable setup. In propagation-based phase-contrast imaging, knowledge of the weighted mean energy is necessary to reconstruct quantitative phase images of unknown objects. Hence, we developed a method to evaluate the weighted mean energy of the X-ray backlighter spectrum using propagation-based phase-contrast images. In a first step, wave-field simulations are performed to verify the results. Furthermore, our evaluation is cross-checked with monochromatic synchrotron measurements at a known energy at Diamond Light Source (DLS, Didcot, UK) as a proof of concept.
    Electronic ISSN: 2313-433X
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 45
    Publication Date: 2020-07-02
    Description: The number of Internet of Things (IoT) devices is growing at a fast pace in smart homes, producing large amounts of data, which are mostly transferred over wireless communication channels. However, various IoT devices are vulnerable to different threats, such as cyber-attacks, fluctuating network connections, and leakage of information. Statistical analysis and machine learning can play a vital role in detecting anomalies in the data, thereby enhancing the security level of the smart home IoT system, which is the goal of this paper. This paper investigates the trustworthiness of the IoT devices sending house appliances’ readings, with the help of various parameters such as feature importance, root mean square error, hyper-parameter tuning, etc. The algorithm awards a spamicity score to each of the IoT devices, based on the feature importance and the root mean square error score of the machine learning models, to determine the trustworthiness of the device in the home network. A publicly available smart home dataset, along with weather conditions, is used for the methodology validation. The proposed algorithm is used to detect the spamicity score of the connected IoT devices in the network. The obtained results illustrate the efficacy of the proposed algorithm in analyzing the time series data from the IoT devices for spam detection.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
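The spamicity idea, combining per-device feature importance with model error, can be sketched as follows; the combination formula here is a hypothetical illustration, not the paper's actual scoring rule:

```python
def spamicity_scores(feature_importance, rmse):
    """Combine per-device feature importance (higher = more informative) and
    model RMSE (higher = less predictable) into one spamicity score.
    Devices with low importance and high error look more 'spammy'.
    The ratio below is an illustrative choice, not the paper's formula."""
    scores = {}
    for device in feature_importance:
        imp = feature_importance[device]
        err = rmse[device]
        scores[device] = err / (imp + 1e-9)
    return scores

# Hypothetical devices: the fridge behaves predictably, the kettle does not.
devices_importance = {"fridge": 0.40, "kettle": 0.05}
devices_rmse = {"fridge": 0.10, "kettle": 0.90}
scores = spamicity_scores(devices_importance, devices_rmse)
```

A device with a high score would then be treated as less trustworthy within the home network.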
  • 46
    Publication Date: 2020-07-02
    Description: Humans are capable of learning new concepts from small numbers of examples. In contrast, supervised deep learning models usually lack the ability to extract reliable predictive rules from limited data scenarios when attempting to classify new examples. This challenging scenario is commonly known as few-shot learning. Few-shot learning has garnered increased attention in recent years due to its significance for many real-world problems. Recently, new methods relying on meta-learning paradigms combined with graph-based structures, which model the relationship between examples, have shown promising results on a variety of few-shot classification tasks. However, existing work on few-shot learning focuses only on the feature embeddings produced by the last layer of the neural network. The novel contribution of this paper is the utilization of lower-level information to improve meta-learner performance in few-shot learning. In particular, we propose the Looking-Back method, which uses lower-level information to construct additional graphs for label propagation in limited data settings. Our experiments on two popular few-shot learning datasets, miniImageNet and tieredImageNet, show that our method can utilize the lower-level information in the network to improve state-of-the-art classification performance.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
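The graph-based label propagation that such methods rely on can be sketched with the classic closed-form propagation scheme over an affinity graph (a generic formulation, not the Looking-Back implementation itself):

```python
import numpy as np

def label_propagation(W, Y, alpha=0.5):
    """Closed-form label propagation: F = (I - alpha * S)^{-1} Y,
    with S the symmetrically normalized affinity matrix of graph W.
    Rows of Y are one-hot for labeled nodes, zero for unlabeled ones."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    S = D_inv_sqrt @ W @ D_inv_sqrt
    n = W.shape[0]
    F = np.linalg.inv(np.eye(n) - alpha * S) @ Y
    return F.argmax(axis=1)

# Tiny graph: nodes 0-1 connected, nodes 2-3 connected; labels known for 0 and 2.
W = np.array([[0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
Y = np.zeros((4, 2))
Y[0, 0] = 1  # node 0 labeled class 0
Y[2, 1] = 1  # node 2 labeled class 1
pred = label_propagation(W, Y)
```

Labels flow along edges, so each unlabeled node inherits the class of its connected labeled neighbor; the Looking-Back idea is to build additional such graphs from lower-layer embeddings.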
  • 47
    Publication Date: 2020-07-06
    Description: Virtual worlds have become global platforms connecting millions of people and containing various technologies. For example, No Man’s Sky (nomanssky.com), a cross-platform virtual world, can dynamically and automatically generate content with the progress of user adventure. AltspaceVR (altvr.com) is a social virtual reality platform supporting motion capture through Microsoft’s Kinect, eye tracking, and mixed reality extension. The changes in industrial investment, market revenue, user population, and consumption drive the evolution of virtual-world-related technologies (e.g., computing infrastructure and interaction devices), which translates into new design requirements and thus results in the requirement satisfaction problem in virtual world system architecture design. In this paper, we first study the new or evolving features of virtual worlds and emerging requirements of system development through market/industry trend analysis, including infrastructure mobility, content diversity, function interconnectivity, immersive environment, and intelligent agents. Based on the trend analysis, we propose a new design requirement space. We then discuss the requirement satisfaction of existing system architectures and highlight their limitations through a literature review. The feature-based requirement satisfaction comparison of existing system architectures sheds some light on future virtual world system development to match the changing trends of the user market. At the end of this study, a new architecture from ongoing research, called Virtual Net, is discussed, which can provide higher resource sufficiency, computing reliability, content persistency, and service credibility.
    Electronic ISSN: 1999-5903
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 48
    Publication Date: 2020-07-06
    Description: With the rise of partially automated cars, drivers are more and more required to judge the degree of responsibility that can be delegated to vehicle assistant systems. This can be supported by utilizing interfaces that intuitively convey real-time reliabilities of system functions such as environment sensing. We designed a vibrotactile interface that communicates spatiotemporal information about surrounding vehicles and encodes a representation of spatial uncertainty in a novel way. We evaluated this interface in a driving simulator experiment with high and low levels of machine and human confidence, caused respectively by simulated degradation of vehicle sensor precision and by limited human visibility range. We were thereby interested in whether drivers (i) could perceive and understand the vibrotactile encoding of spatial uncertainty, (ii) would subjectively benefit from the encoded information, (iii) would be disturbed in cases of information redundancy, and (iv) would gain objective safety benefits from the encoded information. To measure subjective understanding and benefit, a custom questionnaire, Van der Laan acceptance ratings and NASA TLX scores were used. To measure the objective benefit, we computed the minimum time-to-contact as a measure of safety and gaze distributions as an indicator for attention guidance. Results indicate that participants were able to understand the encoded uncertainty and spatiotemporal information and purposefully utilized it when needed. The tactile interface provided meaningful support despite sensory restrictions. By encoding spatial uncertainties, it successfully extended the operating range of the assistance system.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 49
    Publication Date: 2020-07-03
    Description: The COVID-19 pandemic exploded at the beginning of 2020, with over four million cases in five months, overwhelming the healthcare sector. Several national governments decided to adopt containment measures, such as lockdowns, social distancing, and quarantine. Among these measures, contact tracing can contribute to bringing the outbreak under control, as quickly identifying and isolating the contacts of suspected cases can limit the number of infected people. In this paper we present BubbleBox, a system relying on a dedicated device to perform contact tracing. BubbleBox integrates Internet of Things and software technologies into different components to achieve its goal: providing a tool to quickly react to further outbreaks, by allowing health operators to rapidly reach and test possibly infected people. This paper describes the BubbleBox architecture, presents its prototype implementation, and discusses its pros and cons, also dealing with privacy concerns.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 50
    Publication Date: 2020-07-05
    Description: Variation, adaptation, heredity and fitness, constraints and affordances, speciation, and extinction form the building blocks of the (Neo-)Darwinian research program, and several of these have been called “Darwinian principles”. Here, we suggest that caution should be taken in calling these principles Darwinian because of the important role played by reticulate evolutionary mechanisms and processes in also bringing about these phenomena. Reticulate mechanisms and processes include symbiosis, symbiogenesis, lateral gene transfer, infective heredity mediated by genetic and organismal mobility, and hybridization. Because the “Darwinian principles” are brought about by both vertical and reticulate evolutionary mechanisms and processes, they should be understood as foundational for a more pluralistic theory of evolution, one that surpasses the classic scope of the Modern and the Neo-Darwinian Synthesis. Reticulate evolution moreover demonstrates that what conventional (Neo-)Darwinian theories treat as intra-species features of evolution frequently involve reticulate interactions between organisms from very different taxonomic categories. Variation, adaptation, heredity and fitness, constraints and affordances, speciation, and extinction therefore cannot be understood as “traits” or “properties” of genes, organisms, species, or ecosystems because the phenomena are irreducible to specific units and levels of an evolutionary hierarchy. Instead, these general principles of evolution need to be understood as common goods that come about through interactions between different units and levels of evolutionary hierarchies, and they are exherent rather than inherent properties of individuals.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 51
    Publication Date: 2020-07-04
    Description: This paper presents an experiment on newsreaders’ behavior and preferences on the interaction with online personalized news. Different recommendation approaches, based on consumption profiles and user location, and the impact of personalized news on several aspects of consumer decision-making are examined on a group of volunteers. Results show a significant preference for reading recommended news over other news presented on the screen, regardless of the chosen editorial layout. In addition, the study also provides support for the creation of profiles taking into consideration the evolution of user’s interests. The proposed solution is valid for users with different reading habits and can be successfully applied even to users with small consumption history. Our findings can be used by news providers to improve online services, thus increasing readers’ perceived satisfaction.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 52
    Publication Date: 2020-07-06
    Description: Many industries today are struggling with the early identification of quality issues, given the shortening of product design cycles and the desire to decrease production costs, coupled with the customer requirement for high uptime. The vehicle industry is no exception, as breakdowns often lead to on-road stops and delays in delivery missions. In this paper we consider quality issues to be an unexpected increase in the failure rate of a particular component; these are particularly problematic for original equipment manufacturers (OEMs), since they lead to unplanned costs and can significantly affect brand value. We propose a new approach towards the early detection of quality issues using machine learning (ML) to forecast the failures of a given component across a large population of units. In this study, we combine the usage information of vehicles with the records of their failures. The former is continuously collected, as the usage statistics are transmitted over telematics connections. The latter is based on invoice and warranty information collected in the workshops. We compare two different ML approaches: the first is an auto-regression model of the failure ratios for vehicles based on past information, while the second is the aggregation of individual vehicle failure predictions based on their individual usage. We present experimental evaluations on real data captured from heavy-duty trucks, demonstrating how these two formulations have complementary strengths and weaknesses; in particular, they can outperform each other given different volumes of data. The classification approach surpasses the regression model whenever enough data are available, i.e., once the vehicles have been in service for a longer time. On the other hand, the regression shows better predictive performance with a smaller amount of data, i.e., for vehicles that have been deployed recently.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
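The two formulations compared in this abstract can be sketched on toy data; both the AR(1) fit and the aggregation rule below are simplified stand-ins for the paper's models:

```python
import numpy as np

def ar1_forecast(series):
    """Population-level view: fit a simple AR(1) model r_t = a * r_{t-1} + b
    by least squares and forecast the next failure ratio."""
    x, y = np.asarray(series[:-1]), np.asarray(series[1:])
    a, b = np.polyfit(x, y, 1)
    return a * series[-1] + b

def aggregate_individual(predictions):
    """Vehicle-level view: aggregate per-vehicle failure probabilities
    (e.g., from a classifier on usage data) into an expected population
    failure ratio."""
    return float(np.mean(predictions))

ratios = [0.010, 0.012, 0.014, 0.016]        # monthly population failure ratios
next_ratio = ar1_forecast(ratios)             # follows the linear trend in this toy series
pop_ratio = aggregate_individual([0.9, 0.1, 0.0, 0.2, 0.3])
```

The regression view needs only the short ratio history, while the aggregation view needs per-vehicle predictions, which matches the data-volume trade-off the abstract describes.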
  • 53
    Publication Date: 2020-07-03
    Description: Business processes evolve over time to adapt to changing business environments. This requires continuous monitoring of business processes to gain insights into whether they conform to the intended design or deviate from it. The situation when a business process changes while being analysed is denoted as Concept Drift. Its analysis is concerned with studying how a business process changes, in terms of detecting and localising changes and studying the effects of the latter. Concept drift analysis is crucial to enable early detection and management of changes, that is, whether to promote a change to become part of an improved process, or to reject the change and make decisions to mitigate its effects. Despite its importance, there exists no comprehensive framework for analysing concept drift types, affected process perspectives, and granularity levels of a business process. This article proposes the CONcept Drift Analysis in Process Mining (CONDA-PM) framework describing phases and requirements of a concept drift analysis approach. CONDA-PM was derived from a Systematic Literature Review (SLR) of current approaches analysing concept drift. We apply the CONDA-PM framework on current approaches to concept drift analysis and evaluate their maturity. Applying CONDA-PM framework highlights areas where research is needed to complement existing efforts.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 54
    Publication Date: 2020-04-14
    Description: Let P be a set of n points in R^d, let k ≥ 1 be an integer, and let ε ∈ (0, 1) be a constant. An ε-coreset is a subset C ⊆ P with appropriate non-negative weights (scalars) that approximates any given set Q ⊆ R^d of k centers. That is, the sum of squared distances from every point in P to its closest point in Q is the same, up to a factor of 1 ± ε, as the weighted sum of squared distances from C to the same k centers. If the coreset is small, we can solve problems such as k-means clustering or its variants (e.g., discrete k-means, where the centers are restricted to be in P, or other restricted zones) on the small coreset to get faster provable approximations. Moreover, it is known that such coresets support streaming, dynamic and distributed data using the classic merge-reduce trees. The fact that the coreset is a subset implies that it preserves the sparsity of the data. However, existing such coresets are randomized and their size has at least linear dependency on the dimension d. We suggest the first such coreset of size independent of d. This is also the first deterministic coreset construction whose resulting size is not exponential in d. Extensive experimental results and benchmarks are provided on public datasets, including the first coreset of the English Wikipedia using Amazon’s cloud.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
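The ε-coreset property itself can be checked numerically. The uniform-sampling "coreset" below is only a naive illustration of the approximation guarantee, not the paper's deterministic construction:

```python
import numpy as np

def cost(points, weights, centers):
    """Weighted sum of squared distances from each point to its closest center."""
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return float((weights * d2.min(axis=1)).sum())

rng = np.random.default_rng(0)
P = rng.normal(size=(1000, 3))
w_P = np.ones(len(P))

# Naive uniform-sampling "coreset": 200 points, each reweighted by n/|C|.
# The paper's construction is deterministic and far more careful.
idx = rng.choice(len(P), size=200, replace=False)
C, w_C = P[idx], np.full(200, len(P) / 200)

Q = rng.normal(size=(5, 3))  # an arbitrary set of k = 5 centers
ratio = cost(C, w_C, Q) / cost(P, w_P, Q)
```

For a true ε-coreset, this ratio lies in [1 − ε, 1 + ε] for every choice of Q, which is what lets the clustering problem be solved on C instead of P.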
  • 55
    Publication Date: 2020-08-25
    Description: Today, convolutional and deconvolutional neural network models are exceptionally popular thanks to the impressive accuracies they have achieved in several computer-vision applications. To speed up the overall tasks of these neural networks, purpose-designed accelerators are highly desirable. Unfortunately, the high computational complexity and the huge memory demand make the design of efficient hardware architectures, as well as their deployment in resource- and power-constrained embedded systems, still quite challenging. This paper presents a novel purpose-designed hardware accelerator to perform 2D deconvolutions. The proposed structure applies a hardware-oriented computational approach that overcomes the issues of traditional deconvolution methods, and it is suitable for implementation within virtually any system-on-chip based on field-programmable gate array (FPGA) devices. In fact, the novel accelerator is easily scalable to comply with the resources available within both high- and low-end devices by adequately scaling the adopted parallelism. As an example, when exploited to accelerate the Deep Convolutional Generative Adversarial Network model, the novel accelerator, running as a standalone unit implemented within the Xilinx Zynq XC7Z020 System-on-Chip (SoC) device, performs up to 72 GOPs. Moreover, it dissipates less than 500 mW@200 MHz and occupies 5.6%, 4.1%, 17%, and 96%, respectively, of the look-up tables, flip-flops, random access memory, and digital signal processors available on-chip. When accommodated within the same device, the whole embedded system equipped with the novel accelerator performs up to 54 GOPs and dissipates less than 1.8 W@150 MHz. Thanks to the increased exploitable parallelism, more than 900 GOPs can be executed when the high-end Virtex-7 XC7VX690T device is used as the implementation platform. Moreover, in comparison with state-of-the-art competitors implemented within the Zynq XC7Z045 device, the system proposed here reaches a computational capability up to 20% higher, and saves more than 60% and 80% of power consumption and logic resource requirements, respectively, using 5.7× fewer on-chip memory resources.
    Electronic ISSN: 2313-433X
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 56
    Publication Date: 2020-04-23
    Description: This study presents an analysis of RePair, a grammar compression algorithm known for its simple scheme while also being practically effective. First, we show that the main process of RePair, that is, the step-by-step substitution of the most frequent symbol pairs, works within the corresponding most frequent maximal repeats. Then, we reveal the relation between maximal repeats and the grammars constructed by RePair. On the basis of this analysis, we further propose a novel variant of RePair, called MR-RePair, which considers the one-time substitution of the most frequent maximal repeats instead of the consecutive substitution of the most frequent pairs. The results of experiments comparing the size of constructed grammars and the execution time of RePair and MR-RePair on several text corpora demonstrate that MR-RePair constructs more compact grammars than RePair does, especially for highly repetitive texts.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
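A single RePair substitution step, replacing the most frequent adjacent pair with a fresh nonterminal and recording the grammar rule, can be sketched as:

```python
from collections import Counter

def repair_step(seq, next_symbol):
    """One RePair step: replace every (left-to-right, non-overlapping)
    occurrence of the most frequent adjacent pair with a fresh nonterminal.
    Returns the new sequence and the produced grammar rule, or None if no
    pair occurs at least twice."""
    pairs = Counter(zip(seq, seq[1:]))
    if not pairs:
        return seq, None
    (a, b), freq = pairs.most_common(1)[0]
    if freq < 2:
        return seq, None  # nothing worth substituting
    out, i = [], 0
    while i < len(seq):
        if i + 1 < len(seq) and seq[i] == a and seq[i + 1] == b:
            out.append(next_symbol)
            i += 2
        else:
            out.append(seq[i])
            i += 1
    return out, (next_symbol, (a, b))

seq = list("abababc")
seq, rule = repair_step(seq, "X")   # 'ab' is the most frequent pair
```

Full RePair repeats this step until no pair occurs twice; MR-RePair would instead substitute a whole maximal repeat (here "ab" happens to coincide with one) in a single step.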
  • 57
    Publication Date: 2020-04-24
    Description: The process of moving from experimental data to modeling and characterizing the dynamics and interactions in natural processes is a challenging task. This paper proposes an interactive platform for fitting data derived from experiments to mathematical expressions and carrying out spatial visualization. The platform is designed using a component-based software architectural approach, implemented in the R and Java programming languages. It uses experimental data as input for model fitting, then applies the obtained model at the landscape level via spatial temperature grid data to yield regional and continental maps. Different modules and functionalities of the tool are presented with a case study, in which the tool is used to establish a temperature-dependent virulence model and map the potential zone of efficacy of a fungal-based biopesticide. The decision support system (DSS) was developed in generic form; it can be used by anyone interested in fitting mathematical equations to experimental data collected following the described protocol, and, depending on the type of investigation, it offers the possibility of projecting the model at the landscape level.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 58
    Publication Date: 2020-04-23
    Description: In this paper, we propose a verification method for the message passing behavior of IoT systems that checks the accumulative event relation of process models. In an IoT system, it is hard to verify the behavior of message passing by only looking at the sequence of packet transmissions recorded in the system log. We propose a method to extract event relations from the log and check for any minor deviations that exist in the system. Using process mining, we extract the variation of a normal process model from the log. We check for deviations that are hard to detect unless the model is accumulated and stacked over time. Message passing behavior can be verified by comparing the similarity of the process tree models, which represent the execution relation between each message passing event. As a result, we can detect minor deviations, such as missing events and perturbed event order, with an occurrence probability as low as 3%.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 59
    Publication Date: 2020
    Description: During highly automated driving, the passenger is allowed to conduct non-driving related activities (NDRA) and no longer has to act as a fallback at the functional limits of the driving automation system. Previous research has shown that at lower levels of automation, passengers still wish to be informed about automated vehicle behavior to a certain extent. Given the aim of introducing urban automated driving, which is characterized by high complexity, we investigated the information needs and visual attention of the passenger during urban, highly automated driving. Additionally, we investigated the influence of experience of automated driving and of NDRAs on these results. Forty participants took part in a driving simulator study. As well as the information presented on the human–machine interface (system status, navigation information, speed and speed limit), participants requested information about maneuvers, reasons for maneuvers, environmental settings and additional navigation data. Visual attention was significantly affected by the NDRA, while experience of automated driving had no effect. Experience and NDRA showed no significant effect on the need for information. Differences in information needs seem to be due to the requirements of the individual passenger, rather than the investigated factors.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 60
    Publication Date: 2020
    Description: The Delphi method is one of the basic tools for forecasting values in various types of issues. It uses the knowledge of experts, which is properly aggregated (e.g., in the form of descriptive statistics measures) and returned to the group of experts again, thus starting the next round of forecasting. The multi-stage prediction under the Delphi method allows for better stabilization of the results, which is extremely important in the forecasting process. Experts in the forecasting process often have access to time series forecasting software but do not necessarily use it. Therefore, it seems advisable to add to the aggregate a value obtained using forecasting software. The advantage of this approach is in saving the time and costs of obtaining a forecast, which means a smaller burden on data analysts and better use of their work. In line with these key factors, the main contribution of the article is the use of a virtual expert in the form of a computer-enhanced mathematical tool, i.e., a programming library for forecasting time series. The chosen software tool is the Prophet library, a Facebook tool that can be used in the Python or R programming languages.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
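The virtual-expert idea can be sketched with a naive linear-trend extrapolation standing in for the Prophet library, aggregated with human forecasts via the median (an illustrative choice of aggregate):

```python
import statistics

def virtual_expert(history):
    """Stand-in for a forecasting library (the paper uses Facebook's Prophet):
    a naive least-squares linear-trend extrapolation of the series."""
    n = len(history)
    xbar = (n - 1) / 2
    ybar = sum(history) / n
    slope = sum((i - xbar) * (y - ybar) for i, y in enumerate(history)) / \
            sum((i - xbar) ** 2 for i in range(n))
    return ybar + slope * (n - xbar)

def delphi_round(expert_forecasts, history):
    """One Delphi round: aggregate the human forecasts together with the
    virtual expert's value, here via the median."""
    return statistics.median(expert_forecasts + [virtual_expert(history)])

history = [10, 12, 14, 16]              # linear toy series, next value 18
agg = delphi_round([17, 19, 20], history)
```

In the actual Delphi process, this aggregate would be fed back to the experts to start the next forecasting round.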
  • 61
    Publication Date: 2020
    Description: Nowadays, we are observing growing interest in Big Data applications in different healthcare sectors. One of these is cardiology. In fact, electrocardiography produces a huge amount of data about heart health status that needs to be stored and analysed in order to detect possible issues. In this paper, we focus on the arrhythmia detection problem. Specifically, our objective is to address the problem of distributed processing of the big data generated by electrocardiogram (ECG) signals in order to carry out pre-processing analysis. An algorithm for the identification of heartbeats and arrhythmias is proposed, designed to carry out distributed processing over the Cloud, since big data could represent the bottleneck for cardiology applications. In particular, we implemented the Menard algorithm in Apache Spark in order to process big data coming from ECG signals and identify arrhythmias. Experiments conducted using a dataset provided by the Physionet.org European ST-T Database show an improvement in terms of response times. As highlighted by our outcomes, our solution provides a scalable and reliable system, which may address the challenges raised by big data in healthcare.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
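A heartbeat-level arrhythmia check can be sketched with a simple RR-interval deviation heuristic; note that this is a generic illustration, not the Menard algorithm or the Spark implementation used in the paper:

```python
def flag_arrhythmic_beats(rr_intervals, tol=0.15):
    """Flag beats whose RR interval (seconds between successive R peaks)
    deviates from the running mean of preceding intervals by more than a
    fraction tol. A generic premature-beat heuristic, not the paper's
    actual detection criterion."""
    flags = []
    for i, rr in enumerate(rr_intervals):
        prev = rr_intervals[:i] or [rr]
        mean_rr = sum(prev) / len(prev)
        flags.append(abs(rr - mean_rr) > tol * mean_rr)
    return flags

rr = [0.80, 0.82, 0.79, 0.55, 0.81]  # the fourth beat arrives early
flags = flag_arrhythmic_beats(rr)
```

In a distributed setting such as Spark, the ECG stream would be partitioned and a per-beat rule of this kind applied within each partition, which is what makes the workload parallelizable.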
  • 62
    Publication Date: 2020
    Description: Arabic is one of the most semantically and syntactically complex languages in the world. A key challenging issue in text mining is text summarization, so we propose an unsupervised score-based method which combines the vector space model, continuous bag of words (CBOW), clustering, and a statistically-based method. The problems with multidocument text summarization are noisy data, redundancy, diminished readability, and sentence incoherency. In this study, we adopt a preprocessing strategy to solve the noise problem and use the word2vec model for two purposes: first, to map the words to fixed-length vectors and, second, to obtain the semantic relationship between each vector based on the dimensions. Similarly, we use the k-means algorithm for two purposes: (1) selecting the distinctive documents and tokenizing these documents into sentences, and (2) using another iteration of the k-means algorithm to select the key sentences based on the similarity metric to overcome the redundancy problem and generate the initial summary. Lastly, we use weighted principal component analysis (W-PCA) to map the sentences’ encoded weights based on a list of features. This selects the highest set of weights, which relates to important sentences, solving the incoherency and readability problems. We adopted Recall-Oriented Understudy for Gisting Evaluation (ROUGE) as an evaluation measure to examine our proposed technique and compare it with state-of-the-art methods. Finally, an experiment on the Essex Arabic Summaries Corpus (EASC) using the ROUGE-1 and ROUGE-2 metrics showed promising results in comparison with existing methods.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
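The cluster-then-select step for redundancy reduction can be sketched as follows, with a toy k-means over sentence vectors (not the paper's word2vec/W-PCA pipeline):

```python
import numpy as np

def select_key_sentences(vectors, k, iters=10, seed=0):
    """Toy k-means over sentence vectors; returns, for each cluster, the index
    of the sentence closest to the centroid. One representative per cluster
    reduces redundancy in the generated summary."""
    rng = np.random.default_rng(seed)
    centroids = vectors[rng.choice(len(vectors), size=k, replace=False)]
    for _ in range(iters):
        d = ((vectors[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = vectors[labels == j].mean(axis=0)
    d = ((vectors[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return sorted({int(d[:, j].argmin()) for j in range(k)})

# Four toy sentence embeddings forming two well-separated topics.
vecs = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
keys = select_key_sentences(vecs, k=2)
```

The selected indices then feed the next stage, where sentence weights (W-PCA in the paper) rank the representatives for the final summary.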
  • 63
    Publication Date: 2020
    Description: During automated driving, there is a need for interaction between the automated vehicle (AV) and the passengers inside the vehicle and between the AV and the surrounding road users outside of the car. For this purpose, different types of human machine interfaces (HMIs) are implemented. This paper introduces an HMI framework and describes the different HMI types and the factors influencing their selection and content. The relationship between these HMI types and their influencing factors is also presented in the framework. Moreover, the interrelations of the HMI types are analyzed. Furthermore, we describe how the framework can be used in academia and industry to coordinate research and development activities. With the help of the HMI framework, we identify research gaps in the field of HMI for automated driving to be explored in the future.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 64
    Publication Date: 2020
    Description: We propose a new supervised learning algorithm for classification and regression problems where two or more preliminary predictors are available. We introduce KernelCobra, a non-linear learning strategy for combining an arbitrary number of initial predictors. KernelCobra builds on the COBRA algorithm introduced by [], which combined estimators based on a notion of proximity of predictions on the training data. While the COBRA algorithm used a binary threshold to declare which training data were close and to be used, we generalise this idea by using a kernel to better encapsulate the proximity information. Such a smoothing kernel provides more representative weights to each of the training points which are used to build the aggregate and final predictor, and KernelCobra systematically outperforms the COBRA algorithm. While COBRA is intended for regression, KernelCobra deals with classification and regression. KernelCobra is included as part of the open source Python package Pycobra (0.2.4 and onward), introduced by []. Numerical experiments were undertaken to assess the performance (in terms of pure prediction and computational complexity) of KernelCobra on real-life and synthetic datasets.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
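The kernel-weighted aggregation idea behind KernelCobra can be sketched as follows; the Gaussian kernel and the toy data are illustrative (see the Pycobra package for the actual implementation):

```python
import numpy as np

def kernel_cobra(train_preds, train_y, query_preds, bandwidth=1.0):
    """COBRA-style aggregation with a smoothing kernel: each training point is
    weighted by a Gaussian kernel on the distance between the preliminary
    predictors' outputs at the query and at that training point, instead of
    COBRA's binary proximity threshold."""
    d2 = ((train_preds - query_preds) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    return float((w * train_y).sum() / w.sum())

# Two preliminary predictors evaluated on three training points and one query.
train_preds = np.array([[1.0, 1.2], [5.0, 4.8], [1.1, 0.9]])
train_y = np.array([1.0, 5.0, 1.0])
query_preds = np.array([1.05, 1.0])   # the query resembles the first and third points
y_hat = kernel_cobra(train_preds, train_y, query_preds)
```

Because proximity is measured in the space of the predictors' outputs rather than the raw features, the aggregate works for an arbitrary number and kind of initial predictors.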
  • 65
    Publication Date: 2020
    Description: Computational ontologies are machine-processable structures which represent particular domains of interest. They integrate knowledge which can be used by humans or machines for decision making and problem solving. The main aim of this systematic review is to investigate the role of formal ontologies in information systems development, i.e., how these graph-based structures can be beneficial during the analysis and design of information systems. Specific online databases were used to identify studies focused on the interconnections between ontologies and systems engineering. One hundred eighty-seven studies were found during the first phase of the investigation. Twenty-seven studies were examined after the elimination of duplicate and irrelevant documents. Mind mapping was substantially helpful in organising the basic ideas and in identifying five thematic groups that show the main roles of formal ontologies in information systems development. Formal ontologies are mainly used in the interoperability of information systems, human resource management, domain knowledge representation, the involvement of semantics in unified modelling language (UML)-based modelling, and the management of programming code and documentation. We explain the main ideas in the reviewed studies and suggest possible extensions to this research.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 66
    Publication Date: 2020
Description: The transfer learning method is used to extend an existing model to more difficult scenarios, thereby accelerating the training process and improving learning performance. The conditional adversarial domain adaptation method proposed in 2018 is a particular type of transfer learning. It uses a domain discriminator to identify which domain the extracted features belong to. The features are obtained from the feature extraction network. The stability of the domain discriminator directly affects the classification accuracy. Here, we propose a new algorithm to improve the predictive accuracy. First, we introduce the Lipschitz constraint condition into domain adaptation. If the constraint condition is satisfied, the method is stable. Second, we analyze how to make the gradient satisfy the condition, thereby deducing the modified gradient via the spectrum regularization method. The modified gradient is then used to update the parameter matrix. The proposed method is compared to the ResNet-50, deep adaptation network, domain adversarial neural network, joint adaptation network, and conditional domain adversarial network methods using the Office-31, ImageCLEF-DA, and Office-Home datasets. The simulations demonstrate that the proposed method performs better than the other methods with respect to accuracy.
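The spectrum regularization step can be illustrated with plain power iteration; this is a generic sketch of spectral normalization, not the authors' exact modified-gradient update:

```python
import numpy as np

def spectral_normalize(W, n_iter=50):
    # Estimate the largest singular value of W by power iteration.
    u = np.random.RandomState(0).randn(W.shape[0])
    for _ in range(n_iter):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v  # spectral-norm estimate
    # Dividing by sigma makes the induced linear map (at most) 1-Lipschitz.
    return W / sigma

W = np.array([[3.0, 0.0], [0.0, 1.0]])
W_sn = spectral_normalize(W)
```

After normalization the largest singular value of `W_sn` is approximately 1, which is the Lipschitz bound the constraint asks for.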
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 67
    Publication Date: 2020
    Description: The most challenging issue with low-resource languages is the difficulty of obtaining enough language resources. In this paper, we propose a language service framework for low-resource languages that enables the automatic creation and customization of new resources from existing ones. To achieve this goal, we first introduce a service-oriented language infrastructure, the Language Grid; it realizes new language services by supporting the sharing and combining of language resources. We then show the applicability of the Language Grid to low-resource languages. Furthermore, we describe how we can now realize the automation and customization of language services. Finally, we illustrate our design concept by detailing a case study of automating and customizing bilingual dictionary induction for low-resource Turkic languages and Indonesian ethnic languages.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 68
    Publication Date: 2020
    Description: Keywords: virtual reality system; multi-channel cognition; cognitive load; QFD; human–computer interaction; prediction optimization
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 69
    Publication Date: 2020
Description: Acoustic underwater communication is a challenging task. For a reliable transmission, not only good channel estimation and equalization, but also strong error correcting codes are needed. In this paper, we present the results of the coding competition “Wanted: Best channel codes for short underwater messages” as well as our own findings on the influence of the modulation alphabet size, using non-binary polar codes as an example. Furthermore, the proposals of the competition are compared to other commonly used channel codes.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 70
    Publication Date: 2020
Description: Information on automated driving functions when automation is available but not activated has not been investigated thus far. As the possibility of conducting non-driving related activities (NDRAs) is one of the most important aspects when it comes to the perceived usefulness of automated cars, and many NDRAs are time-dependent, users should know the period for which automation is available, even when it is not activated. This article presents a study (N = 33) investigating the effects of displaying the availability duration before, versus after, activation of the automation on users’ activation behavior and on how the system is rated. Furthermore, the way of addressing users regarding the availability on a more personal level to establish “sympathy” with the system was examined with regard to acceptance, usability, and workload. Results show that displaying the availability duration before activating the automation reduces the frequency of activations when no NDRA is executable within the automated drive. Moreover, acceptance and usability were higher and workload was reduced as a result of this information being provided. No effects were found with regard to how the user was addressed.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 71
    Publication Date: 2020
Description: The research on complex networks is a hot topic in many fields, among which community detection is a complex and meaningful process that plays an important role in studying the characteristics of complex networks. Community structure is a common feature in networks. Given a graph, the process of uncovering its community structure is called community detection. Many community detection algorithms have been proposed from different perspectives. Achieving stable and accurate community division is still a non-trivial task due to the difficulty of setting specific parameters, high randomness, and the lack of ground-truth information. In this paper, we explore a new decision-making method inspired by real-life communication and propose a preferential decision model based on dynamic relationships, applied to dynamic systems. We apply this model to the label propagation algorithm and present a community detection algorithm based on a preferential decision model, called CDPD. This model intuitively aims to reveal the topological structure and the hierarchical structure of networks. By analyzing the structural characteristics of complex networks and mining the tightness between nodes, the priority of neighbor nodes is used to perform the preferential decision, and finally the information in the system reaches a stable state. In the experiments, through comparison with eight other algorithms, we verified the performance of CDPD on real-world networks and synthetic networks. The results show that CDPD not only performs better than most recent algorithms on most datasets, but is also more suitable for community networks with ambiguous structure, especially sparse networks.
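For context, the plain label propagation algorithm that CDPD builds on can be sketched as follows; the deterministic sweep order and the tie-breaking rule are assumptions made here for reproducibility, and CDPD replaces the uniform neighbour vote with its preferential decision based on node tightness:

```python
from collections import Counter

def label_propagation(adj, n_rounds=10):
    # Every node starts in its own community.
    labels = {v: v for v in adj}
    for _ in range(n_rounds):
        for v in sorted(adj):
            counts = Counter(labels[u] for u in adj[v])
            best = max(counts.values())
            # Keep the current label if it is already among the most common;
            # otherwise adopt the largest winning label (deterministic tie-break).
            if counts.get(labels[v], 0) < best:
                labels[v] = max(l for l, c in counts.items() if c == best)
    return labels

# Two triangles joined by a single bridge edge (2-3):
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
       3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
labels = label_propagation(adj)
```

On this toy graph the propagation settles into the two triangle communities, with the bridge edge left between them.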
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 72
    Publication Date: 2020
Description: Named Entity Recognition (NER), which usually takes Part-Of-Speech (POS) tags as linguistic features, is a major task in Natural Language Processing (NLP). In this paper, we put forward a new comprehensive-embedding, considering three aspects, namely character-embedding, word-embedding, and pos-embedding stitched in the given order, and thus capture their dependencies; based on this, we propose a new Character–Word–Position Combined BiLSTM-Attention (CWPC_BiAtt) model for the Chinese NER task. Passing the comprehensive-embedding through a Bidirectional Long Short-Term Memory (BiLSTM) layer captures the connection between historical and future information, and the attention mechanism then captures the connection between the content of the sentence at the current position and that at any location. Finally, we utilize a Conditional Random Field (CRF) to decode the entire tagging sequence. Experiments show that the proposed CWPC_BiAtt model is well qualified for the NER task on the Microsoft Research Asia (MSRA) dataset and the Weibo NER corpus. High precision and recall were obtained, which verified the stability of the model. Position-embedding in comprehensive-embedding compensates for the attention mechanism by providing position information for the otherwise order-insensitive sequence, which shows that comprehensive-embedding is complete. Looking at the entire model, our proposed CWPC_BiAtt has three distinct characteristics: completeness, simplicity, and stability. It achieved the highest F-score, attaining state-of-the-art performance on the MSRA dataset and the Weibo NER corpus.
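The stitching of the three embeddings is a concatenation along the feature axis; a toy sketch with random character and word vectors and a sinusoidal position encoding (all dimensions here are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d_char, d_word, d_pos = 4, 8, 16, 4

char_emb = rng.normal(size=(seq_len, d_char))  # stand-in character vectors
word_emb = rng.normal(size=(seq_len, d_word))  # stand-in word vectors
# Sinusoidal position-embedding supplies the order information that the
# attention mechanism alone would not see:
pos = np.arange(seq_len)[:, None]
dims = np.arange(d_pos)[None, :]
pos_emb = np.sin(pos / (10000 ** (dims / d_pos)))

# "Stitch" the three embeddings along the feature axis:
comprehensive = np.concatenate([char_emb, word_emb, pos_emb], axis=1)
```

The combined matrix, one row per character, is what a BiLSTM-attention encoder would consume.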
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 73
    Publication Date: 2020
Description: In future traffic, automated vehicles may be equipped with external human-machine interfaces (eHMIs) that can communicate with pedestrians. Previous research suggests that, during first encounters, pedestrians regard text-based eHMIs as clearer than light-based eHMIs. However, in much of the previous research, pedestrians were asked to imagine crossing the road, and unable or not allowed to do so. We investigated the effects of eHMIs on participants’ crossing behavior. Twenty-four participants were immersed in a virtual urban environment using a head-mounted display coupled to a motion-tracking suit. We manipulated the approaching vehicles’ behavior (yielding, nonyielding) and eHMI type (None, Text, Front Brake Lights). Participants could cross the road whenever they felt safe enough to do so. The results showed that forward walking velocities, as recorded at the pelvis, were, on average, higher when an eHMI was present compared to no eHMI if the vehicle yielded. In nonyielding conditions, participants showed a slight forward motion and refrained from crossing. An analysis of participants’ thorax angle indicated rotation towards the approaching vehicles and subsequent rotation towards the crossing path. It is concluded that results obtained via a setup in which participants can cross the road are similar to results from survey studies, with eHMIs yielding a higher crossing intention compared to no eHMI. The motion suit allows investigating pedestrian behaviors related to bodily attention and hesitation.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 74
    Publication Date: 2020
    Description: In this paper, an efficient high-order multiple signal classification (MUSIC)-like method is proposed for mixed-field source localization. Firstly, a non-Hermitian matrix is designed based on a high-order cumulant. One of the steering matrices, that is related only with the directions of arrival (DOA), is proved to be orthogonal with the eigenvectors corresponding to the zero eigenvalues. The other steering matrix that contains the information of both the DOA and range is proved to span the same column subspace with the eigenvectors corresponding to the non-zero eigenvalues. By applying the Gram–Schmidt orthogonalization, the range estimation can be achieved one by one after substituting each estimated DOA. The analysis shows that the computational complexity of the proposed method is lower than other methods, and the effectiveness of the proposed method is shown with some simulation results.
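The Gram–Schmidt orthogonalization used in the range-estimation step can be sketched generically as follows (this is the standard procedure, not the paper's full MUSIC-like estimator):

```python
import numpy as np

def gram_schmidt(V):
    # Classical Gram-Schmidt: orthonormalise the columns of V.
    Q = np.zeros_like(V, dtype=float)
    for j in range(V.shape[1]):
        v = V[:, j].astype(float)
        for i in range(j):
            # Subtract the projection onto each earlier orthonormal column.
            v -= (Q[:, i] @ V[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)
    return Q

Q = gram_schmidt(np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]]))
```

The resulting columns are orthonormal, so `Q.T @ Q` is (up to rounding) the identity.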
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 75
    Publication Date: 2020
Description: The ability to stop malware as soon as it starts spreading will always play an important role in defending computer systems. It would be a huge benefit for organizations as well as society if intelligent defense systems could detect and prevent new types of malware from which only a tiny number of samples has been revealed. The approach introduced in this paper takes advantage of one-shot/few-shot learning algorithms to solve malware classification problems using a Memory Augmented Neural Network in combination with Natural Language Processing techniques such as word2vec and n-grams. We embed the malware’s API calls, which are a very valuable source of information for identifying malware behavior, in different feature spaces, and then feed them to the one-shot/few-shot learning models. Evaluating the model on two datasets (FFRI 2017 and APIMDS) shows that models with different parameters can yield high accuracy on malware classification with only a few samples. For example, on the APIMDS dataset, the model classified 78.85% of samples correctly after seeing only nine malware samples, and 89.59% after fine-tuning with a few other samples. The results confirm very good accuracy compared to traditional methods and point to a new area of malware research.
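The n-gram view of an API-call sequence can be sketched as follows; the call names are hypothetical, and in the paper such features are further embedded (e.g., with word2vec) before being fed to the memory-augmented network:

```python
from collections import Counter

def api_ngrams(api_calls, n=2):
    # Represent a sample by counts of consecutive API-call n-grams.
    return Counter(tuple(api_calls[i:i + n])
                   for i in range(len(api_calls) - n + 1))

# Hypothetical API-call trace of one sample:
sample = ["CreateFile", "WriteFile", "CreateFile", "WriteFile", "CloseHandle"]
features = api_ngrams(sample)
```

Each sample becomes a sparse bag of call bigrams; repeated behavioral patterns (here, `CreateFile` followed by `WriteFile`) show up as larger counts.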
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 76
    Publication Date: 2020
Description: A common use case for blockchain smart contracts (SC) is that of governing interaction amongst mutually untrusted parties, by automatically enforcing rules for interaction. However, while many contributions in the literature assess SC computational expressiveness, an evaluation of their power in terms of coordination (i.e., governing interaction) is still missing. This is why in this paper we test mainstream SC implementations by evaluating their expressive power in coordinating both inter-user and inter-SC activities. To do so, we exploit the archetypal Linda coordination model as a benchmark—a common practice in the field of coordination models and languages—by discussing to what extent mainstream blockchain technologies support its implementation. As they reveal some notable limitations (affecting, in particular, coordination between SC) we then show how Tenderfone, a custom blockchain implementation providing for a more expressive notion of SC, addresses the aforementioned limitations.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 77
    Publication Date: 2020
    Description: Virtualization has the advantages of strong scalability and high fidelity in host node emulation. It can effectively meet the requirements of network emulation, including large scale, high fidelity, and flexible construction. However, for router emulation, virtual routers built with virtualization and routing software use Linux Traffic Control to emulate bandwidth, delay, and packet loss rates, which results in serious distortions in congestion scenarios. Motivated by this deficiency, we propose a novel router emulation method that consists of virtualization plane, routing plane, and a traffic control method. We designed and implemented our traffic control module in multi-scale virtualization, including the kernel space of a KVM-based virtual router and the user space of a Docker-based virtual router. Experiments show not only that the proposed method achieves high-fidelity router emulation, but also that its performance is consistent with that of a physical router in congestion scenarios. These findings provide good support for network research into congestion scenarios on virtualization-based emulation platforms.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 78
    Publication Date: 2020
    Description: In this paper the Buechner–Tavani model of digital trust is revised—new conditions for self-trust are incorporated into the model. These new conditions raise several philosophical problems concerning the idea of a substantial self for social robotics, which are closely examined. I conclude that reductionism about the self is incompatible with, while the idea of a substantial self is compatible with, trust relations between human agents, between human agents and artificial agents, and between artificial agents.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 79
    Publication Date: 2020
Description: The linguistic Pythagorean fuzzy (LPF) set is an efficacious technique for comprehensively representing uncertain assessment information by combining Pythagorean fuzzy numbers and linguistic variables. In this paper, we define several novel essential operations of LPF numbers based upon Einstein operations and discuss several relations between these operations. For solving the LPF number fusion problem, several LPF aggregation operators, including the LPF Einstein weighted averaging (LPFEWA) operator, the LPF Einstein weighted geometric (LPFEWG) operator, and LPF Einstein hybrid operators, are propounded; the prominent characteristics of these operators are investigated as well. Furthermore, a multi-attribute group decision making (MAGDM) approach is presented on the basis of the developed operators under an LPF environment. Ultimately, two application cases are utilized to demonstrate the practicality and feasibility of the developed decision approach, and a comparative analysis is provided to demonstrate its merits.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 80
    Publication Date: 2020
Description: With the rise of electric vehicles, a key issue is how to charge them in residential areas and other closed environments. Addressing this problem is extremely important for avoiding adverse effects on the load and stability of neighboring grids where multi-user centralized charging takes place. Therefore, we propose a dynamic charging scheduling algorithm based on user bidding. First, we determine the user charging priority according to the bids. Then, we design a resource allocation policy based on game theory, which assigns charging slots to users. To accommodate users leaving and urgent user needs, we introduce an alternate principle that improves the flexibility of charging slot utilization. Simulation results show that the algorithm meets the priority needs of users with higher charging prices and responds to requests in a timely manner. Meanwhile, the algorithm can ensure orderly electric vehicle charging, improve power utilization efficiency, and ease pressure on grid loads.
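The bidding-based priority rule can be sketched with a max-heap (a minimal illustration; the full algorithm also covers the game-theoretic slot allocation and the alternate principle for departures and urgent requests):

```python
import heapq

def schedule(requests, n_slots):
    # requests: list of (user, bid); a higher bid means higher priority.
    # Python's heapq is a min-heap, so negate the bids to pop the
    # highest bidder first.
    heap = [(-bid, user) for user, bid in requests]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(min(n_slots, len(heap)))]

# Three users bid for two charging slots:
order = schedule([("A", 2.0), ("B", 5.0), ("C", 3.5)], n_slots=2)
```

With two available slots, the two highest bidders (`B`, then `C`) are served and the lowest bidder waits.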
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 81
    Publication Date: 2020
    Description: This paper discusses the nuances of a social robot, how and why social robots are becoming increasingly significant, and what they are currently being used for. This paper also reflects on the current design of social robots as a means of interaction with humans and also reports potential solutions about several important questions around the futuristic design of these robots. The specific questions explored in this paper are: “Do social robots need to look like living creatures that already exist in the world for humans to interact well with them?”; “Do social robots need to have animated faces for humans to interact well with them?”; “Do social robots need to have the ability to speak a coherent human language for humans to interact well with them?” and “Do social robots need to have the capability to make physical gestures for humans to interact well with them?”. This paper reviews both verbal as well as nonverbal social and conversational cues that could be incorporated into the design of social robots, and also briefly discusses the emotional bonds that may be built between humans and robots. Facets surrounding acceptance of social robots by humans and also ethical/moral concerns have also been discussed.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 82
    Publication Date: 2020
    Description: In this paper, the viability of neural network implementations of core technologies (the focus of this paper is on text technologies) for 10 resource-scarce South African languages is evaluated. Neural networks are increasingly being used in place of other machine learning methods for many natural language processing tasks with good results. However, in the South African context, where most languages are resource-scarce, very little research has been done on neural network implementations of core language technologies. In this paper, we address this gap by evaluating neural network implementations of four core technologies for ten South African languages. The technologies we address are part of speech tagging, named entity recognition, compound analysis and lemmatization. Neural architectures that performed well on similar tasks in other settings were implemented for each task and the performance was assessed in comparison with currently used machine learning implementations of each technology. The neural network models evaluated perform better than the baselines for compound analysis, are viable and comparable to the baseline on most languages for POS tagging and NER, and are viable, but not on par with the baseline, for Afrikaans lemmatization.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 83
    Publication Date: 2020
    Description: Online privacy has become immensely important with the growth of technology and the expansion of communication. Social Media Networks have risen to the forefront of current communication trends. With the current trends in social media, the question now becomes how can we actively protect ourselves on these platforms? Users of social media networks share billions of images a day. Whether intentional or unintentional, users tend to share private information within these images. In this study, we investigate (1) the users’ perspective of privacy, (2) pervasiveness of privacy leaks on Twitter, and (3) the threats and dangers on these platforms. In this study, we incorporate techniques such as text analysis, analysis of variance, and crowdsourcing to process the data received from these sources. Based on the results, the participants’ definitions of privacy showed overlap regardless of age or gender identity. After looking at the survey results, most female participants displayed a heightened fear of dangers on social media networks because of threats in the following areas: assets and identity. When the participants were asked to rank the threats on social media, they showed a high concern for burglary and kidnapping. We find that participants need more education about the threats of visual content and how these privacy leaks can lead to physical, mental, and emotional danger.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 84
    Publication Date: 2020
Description: The adoption of games in the classroom has been studied from different angles, such as the readiness of teachers to use games or the barriers encountered. However, actual classroom practices with regard to the use of games have not been examined on a larger scale. With this research, we gave teachers a voice to report on their actual practices. We examined the current practices of a large sample of Estonian teachers (N = 1258, which constitutes almost 9% of the total Estonian teacher population) in primary and secondary education in 2017. We found that most of the teachers use games on a regular basis. Mainly, they use the games for motivation and alternation, but they also use them to consolidate and teach new skills. While awareness and motivation are high and experimentation on using games is widespread, practices appear fragmentary and not widely sustained. As a result of this study, we suggest the creation of an evidence base and a better integration of social support structures into teacher education. This is the first large-scale study to look into Estonian teachers’ actual practices, and although Estonian teachers have relatively high autonomy and technical skills, we believe that these results and further investigations are applicable in other contexts as well.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 85
    Publication Date: 2020
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 86
    Publication Date: 2020
Description: In this paper, we derive recursive algorithms for calculating the determinant and inverse of the generalized Vandermonde matrix. The main advantage of the recursive algorithms is that their computational complexity is better than that of calculating the determinant and the inverse by means of classical methods developed for general matrices. The results of this article do not require any symbolic calculations and, therefore, can be performed by a numerical algorithm implemented in a specialized environment (like Matlab or Mathematica) or in a general-purpose programming language (C, C++, Java, Pascal, Fortran, etc.).
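For the classical (non-generalized) Vandermonde matrix the recursive idea is easy to state: appending a new node x_n multiplies the determinant by the product of (x_n - x_i) over the earlier nodes. A sketch of this special case (the paper itself treats the generalized matrix):

```python
def vandermonde_det(xs):
    # Determinant of the classical Vandermonde matrix V with
    # V[i][j] = xs[i] ** j, via the recursion
    # det(x_1..x_n) = det(x_1..x_{n-1}) * prod_{i<n} (x_n - x_i).
    if len(xs) <= 1:
        return 1
    last = xs[-1]
    factor = 1
    for x in xs[:-1]:
        factor *= last - x
    return vandermonde_det(xs[:-1]) * factor

d = vandermonde_det([1, 2, 4])
```

For nodes 1, 2, 4 the recursion yields (2-1)(4-1)(4-2) = 6, with no symbolic computation involved.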
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 87
    Publication Date: 2020
Description: With the growth of e-services in the past two decades, the concept of web accessibility has been given attention to ensure that every individual can benefit from these services without barriers. Web accessibility is considered one of the main factors that should be taken into consideration while developing webpages. The Web Content Accessibility Guidelines 2.0 (WCAG 2.0) were developed to guide web developers in ensuring that web content is accessible to all users, especially disabled users. Many automatic tools have been developed to check the compliance of websites with accessibility guidelines such as WCAG 2.0 and to help web developers and content creators design webpages without barriers for disabled people. Despite the popularity of accessibility evaluation tools in practice, there is no systematic way to compare the performance of web accessibility evaluators. This paper presents two novel frameworks. The first is proposed to compare the performance of web accessibility evaluation tools in detecting web accessibility issues based on WCAG 2.0. The second framework is utilized to evaluate webpages in meeting these guidelines. Six homepages of Saudi universities were chosen as case studies to substantiate the concept of the proposed frameworks. Furthermore, two popular web accessibility evaluators, WAVE and SiteImprove, were selected to compare their performance. The outcomes of studies conducted using the first proposed framework showed that SiteImprove outperformed WAVE. Based on these outcomes, web administrators would benefit from the first framework in selecting an appropriate tool, based on its performance, to evaluate their websites against accessibility criteria and guidelines.
Moreover, the findings of the studies conducted using the second proposed framework showed that the homepage of Taibah University is more accessible than the homepages of the other Saudi universities. Based on the findings of this study, the second framework can be used by web administrators and developers to measure the accessibility of their websites. This paper also discusses the most common accessibility issues reported by WAVE and SiteImprove.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 88
    Publication Date: 2020
Description: Disaster scenarios are particularly catastrophic in urban environments, which are very densely populated in many cases. Disasters not only endanger the life of people, but also affect the existing communication infrastructures. In fact, such an infrastructure could be completely destroyed or damaged; even when it continues working, it suffers from high access demand to its limited resources within a short period of time. This work evaluates the performance of smartphones and leverages the ubiquitous presence of mobile devices in urban scenarios to assist search and rescue activities following a disaster. Specifically, it proposes a collaborative protocol that opportunistically organizes mobile devices in multiple tiers by targeting a fair energy consumption in the whole network. Moreover, it introduces a data collection scheme that employs drones to scan the disaster area and to visit mobile devices and collect their data in a short time. Simulation results in realistic settings show that the proposed solution balances the energy consumption in the network by means of efficient drone routes and smart self-organization, thereby effectively assisting search and rescue operations.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 89
    Publication Date: 2020
Description: Since the actual factors of the instant distribution service scenario are insufficiently considered in existing distribution route optimization, a route optimization model for the instant distribution system based on customer time satisfaction is proposed. Actual factors of instant distribution, such as the soft time window, the pay-to-order mechanism, the time for the merchant to prepare goods before delivery, and the deliveryman’s order combining, are incorporated in the model. A multi-objective optimization framework based on the total cost function and customer time satisfaction was established. Dual-layer chromosome coding based on the deliveryman-to-node mapping and the access order was conducted, and the nondominated sorting genetic algorithm version II (NSGA-II) was used to solve the problem. According to the numerical results, when customer time satisfaction was considered in the instant distribution routing problem, customer satisfaction increased effectively, and a balance between customer satisfaction and delivery cost was obtained by means of Pareto optimization, with only a minor increase in the delivery cost, while the number of deliverymen slightly increased to meet customers’ on-time delivery needs.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 90
    Publication Date: 2020
Description: Relation extraction is an important task with many applications in natural language processing, such as structured knowledge extraction, knowledge graph construction, and automatic question answering system construction. However, relatively little past work has focused on the construction of the corpus and the extraction of Uyghur named-entity relations, resulting in very limited relation extraction research and a deficiency of annotated relation data. This issue is addressed in the present article by proposing a hybrid Uyghur named-entity relation extraction method that combines a conditional random field model, which makes annotation suggestions based on extracted relations, with a set of rules applied by human annotators to rapidly increase the size of the Uyghur corpus. We integrate our relation extraction method into an existing annotation tool, and, with the help of human correction, we implement Uyghur relation extraction and expand the existing corpus. The effectiveness of our proposed approach is demonstrated by experimental results using an existing Uyghur corpus; our method achieves a maximum weighted average between precision and recall of 61.34%. The proposed method achieves state-of-the-art results on entity and relation extraction tasks in Uyghur.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 91
    Publication Date: 2020
    Description: Traditional named-entity recognition methods ignore the correlation between named entities and lose the hierarchical structural information among the named entities in a given text. Although such methods are effective for conventional datasets with simple structures, they are less effective for sports texts. This paper proposes a Chinese sports-text named-entity recognition method based on a character graph convolutional neural network (Char GCN) with a self-attention mechanism. In this method, each Chinese character in the sports text is regarded as a node. Edges between nodes are constructed from similar character positions and the character features of the named entities in the sports text. The internal structural information of each entity is extracted by the character graph convolutional neural network. The hierarchical semantic information of the sports text is captured by the self-attention model, which strengthens the relationships between named entities and captures the relevance and dependency between characters. A conditional random field classification function then accurately identifies the named entities in the Chinese sports text. Results on four datasets demonstrate that the proposed method significantly improves the F-score to 92.51%, 91.91%, 93.98%, and 95.01%, respectively, compared with traditional named-entity recognition methods.
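    The character-graph construction described here can be sketched as follows: each character of a sentence becomes a node, edges link characters at adjacent positions, and one unweighted, parameter-free graph-convolution step averages each node's features with its neighbours'. This is an illustrative simplification of the paper's Char GCN, with toy feature values:

    ```python
    import numpy as np

    def adjacency_from_positions(n):
        """Adjacency with self-loops: character i connects to i-1, i, i+1."""
        A = np.eye(n)
        for i in range(n - 1):
            A[i, i + 1] = A[i + 1, i] = 1.0
        return A

    def gcn_step(A, X):
        """One propagation step: row-normalize A, then aggregate features,
        so each node's new vector is the mean over itself and its neighbours."""
        D_inv = np.diag(1.0 / A.sum(axis=1))
        return D_inv @ A @ X

    sentence = "体育新闻"                          # 4 characters -> 4 nodes
    X = np.arange(8, dtype=float).reshape(4, 2)   # toy 2-dim character features
    H = gcn_step(adjacency_from_positions(len(sentence)), X)
    ```

    After one step, interior characters blend three feature vectors while boundary characters blend two, which is how positional structure enters the representation before the self-attention and CRF layers.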
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 92
    Publication Date: 2020
    Description: With the development and popular application of Building Internet of Things (BIoT) systems, numerous types of equipment are connected and a large volume of equipment data is collected. For convenient equipment management, the equipment should be identified and labeled. Traditionally, this process is performed manually, which is not only time consuming but also prone to omissions. In this paper, we propose a k-means clustering-based electrical equipment identification method for smart building applications that can automatically identify unknown equipment connected to BIoT systems. First, load characteristics are analyzed and electrical features for equipment identification are extracted from the collected data. Second, k-means clustering is applied twice to construct the identification model. The preliminary clustering applies the traditional k-means algorithm to the total harmonic current distortion data and separates the equipment data into two to three clusters on the basis of their electrical characteristics. The later clustering uses an improved k-means algorithm, which weights the Euclidean distance and uses the elbow method to determine the number of clusters and to analyze the results of the preliminary clustering. Then, the equipment identification model is constructed by selecting the cluster centroid vector and a distance threshold. Finally, identification results are obtained online from the model outputs by using newly collected data. Successful application to a BIoT system verifies the validity of the proposed identification method.
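    The online identification step described here amounts to: match a new feature vector to the nearest cluster centroid, and flag it as unknown when it falls outside every centroid's distance threshold. A minimal sketch, where the centroids, thresholds, and (THD, active-power) features are illustrative assumptions rather than values from the paper:

    ```python
    import math

    def euclidean(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def identify(sample, centroids, thresholds):
        """Return the label of the nearest centroid if the sample lies within
        that centroid's distance threshold, otherwise 'unknown'."""
        best = min(centroids, key=lambda lbl: euclidean(sample, centroids[lbl]))
        if euclidean(sample, centroids[best]) <= thresholds[best]:
            return best
        return "unknown"

    # hypothetical centroids produced by the two k-means passes
    centroids = {"lighting": (0.05, 0.2), "HVAC": (0.15, 3.0)}
    thresholds = {"lighting": 0.3, "HVAC": 0.8}

    label_a = identify((0.06, 0.25), centroids, thresholds)  # near lighting
    label_b = identify((0.9, 9.0), centroids, thresholds)    # far from both
    ```

    The distance threshold is what lets the model say "unknown" for genuinely new equipment instead of forcing every sample into an existing cluster.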
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 93
    Publication Date: 2020
    Description: This article presents a defect detection model for sugarcane plantation images. The objective is to assess the defect areas occurring in a sugarcane plantation before the harvesting season; defect areas in sugarcane are usually caused by storms and weeds. The defect detection algorithm uses high-resolution sugarcane plantation images and image processing techniques, and consists of four processes: (1) data collection, (2) image preprocessing, (3) defect detection model creation, and (4) application program creation. For feature extraction, the researchers used image segmentation and convolution filtering with 13 masks together with the mean and standard deviation, generating 26 features. The k-nearest neighbors algorithm was selected to develop a model for classifying the sugarcane areas, and a color selection method was chosen to detect defect areas. The results show that the model can recognize and classify the characteristics of objects in sugarcane plantation images with an accuracy of 96.75%. Compared with the expert surveyor's assessment, the agreement obtained was 92.95%. Therefore, the proposed model can be used as a tool to calculate the percentage of defect areas and to reduce yield-estimation errors in the future.
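    The feature pipeline described here (convolution masks, then the mean and standard deviation of each response, then k-nearest neighbors) can be sketched compactly. Three toy 3x3 masks stand in for the paper's 13, k is fixed to 1, and the patches are synthetic placeholders rather than real plantation imagery:

    ```python
    import numpy as np

    MASKS = [np.ones((3, 3)) / 9.0,                        # local average
             np.array([[-1, 0, 1]] * 3, dtype=float),      # horizontal gradient
             np.array([[-1, -1, -1], [0, 0, 0], [1, 1, 1]], dtype=float)]

    def convolve_valid(img, mask):
        """Valid-mode 3x3 convolution (no padding)."""
        h, w = img.shape[0] - 2, img.shape[1] - 2
        out = np.empty((h, w))
        for i in range(h):
            for j in range(w):
                out[i, j] = np.sum(img[i:i + 3, j:j + 3] * mask)
        return out

    def features(img):
        """Mean and std of each mask response -> 2 * len(MASKS) features."""
        feats = []
        for m in MASKS:
            r = convolve_valid(img, m)
            feats += [r.mean(), r.std()]
        return np.array(feats)

    def knn1(sample, train):
        """1-nearest-neighbour over (feature_vector, label) pairs."""
        return min(train, key=lambda t: np.linalg.norm(sample - t[0]))[1]

    flat = np.ones((5, 5))                            # uniform patch -> "cane"
    noisy = np.random.default_rng(0).random((5, 5))   # textured patch -> "defect"
    train = [(features(flat), "cane"), (features(noisy), "defect")]
    pred = knn1(features(np.ones((5, 5)) * 0.9), train)
    ```

    A smooth test patch lands nearest the uniform training example, while storm- or weed-damaged texture would produce large gradient responses and fall toward the "defect" side.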
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 94
    Publication Date: 2020
    Description: Today, Android accounts for more than 80% of the global market share. Such a high rate makes Android applications an important topic that raises serious questions about their security, privacy, misbehavior, and correctness. Application code analysis is obviously the most appropriate and natural means to address these issues. However, no analysis can be conducted with confidence in the absence of a solid formal foundation. In this paper, we propose a full-fledged formal approach to build the operational semantics of a given Android application by reverse-engineering its assembler-type code, called Smali. We call the new formal language Smali+. Its semantics consist of two parts. The first models single-threaded programs, for which a set of main instructions is presented. The second presents the semantics of multi-threaded programs, an important Android feature that has been glossed over in state-of-the-art work. All multi-threading essentials, such as scheduling, thread communication, and synchronization, are covered by these semantics. The resulting semantics, forming Smali+, are intended to provide a formal basis for developing security enforcement, analysis, and misbehavior detection techniques for Android applications.
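    The single-threaded half of such an operational semantics is a small-step transition relation over configurations (program counter, register file). The toy instruction set below ("const", "add", "goto") is an illustrative stand-in for Smali-like bytecode, not the actual Smali+ formalization from the paper:

    ```python
    def step(program, config):
        """One small-step transition: config -> config'."""
        pc, regs = config
        op, *args = program[pc]
        if op == "const":                 # const vX, n : regs[vX] := n
            dst, n = args
            return pc + 1, {**regs, dst: n}
        if op == "add":                   # add vX, vY, vZ : vX := vY + vZ
            dst, a, b = args
            return pc + 1, {**regs, dst: regs[a] + regs[b]}
        if op == "goto":                  # goto L : pc := L
            return args[0], regs
        raise ValueError(f"unknown instruction {op}")

    def run(program, config, fuel=100):
        """Iterate the transition relation until the program counter falls
        off the end (fuel bounds execution, since goto permits loops)."""
        pc, regs = config
        while pc < len(program) and fuel > 0:
            pc, regs = step(program, (pc, regs))
            fuel -= 1
        return regs

    prog = [("const", "v0", 2), ("const", "v1", 3), ("add", "v2", "v0", "v1")]
    final = run(prog, (0, {}))
    ```

    The multi-threaded part of the semantics would extend the configuration with one (pc, registers) pair per thread plus shared state, with the scheduler choosing which thread takes the next step.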
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 95
    Publication Date: 2020
    Description: This paper gives an overview of the cutting-edge approaches that perform facial cue analysis in the healthcare area. The document is not limited to global face analysis but it also concentrates on methods related to local cues (e.g., the eyes). A research taxonomy is introduced by dividing the face in its main features: eyes, mouth, muscles, skin, and shape. For each facial feature, the computer vision-based tasks aiming at analyzing it and the related healthcare goals that could be pursued are detailed.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 96
    Publication Date: 2020
    Description: Public administrations handle large amounts of data relating to their internal processes as well as to the services they offer. Following public-sector information reuse regulations and worldwide open data publication trends, these administrations are increasingly publishing their data as open data. However, open data are often released without agreed data models and in non-reusable formats, reducing interoperability and efficiency in data reuse. These aspects hinder interoperability with other administrations and prevent the associated knowledge from being exploited efficiently. This paper presents the work performed by the Zaragoza city council over more than 15 years to generate its knowledge graph, which constitutes the key piece of its data management system and whose main strength is an open-data-by-default policy. The main functionalities developed for the internal and external exploitation of the city's open data are also presented. Finally, some of the city council's experiences and lessons learned during this process are discussed.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 97
    Publication Date: 2020
    Description: In the modern business environment, characterized by rapid technological advancement and globalization and abetted by the IoT and Industry 5.0 phenomena, innovation is indispensable for competitive advantage and economic growth. However, many organizations struggle to truly implement it in the absence of a practical innovation management framework, which has made implementation of the concept elusive rather than persuasive. The present study proposes a new innovation management framework, labeled "Absolute Innovation Management (AIM)", to make innovation more understandable, implementable, and part of the organization's everyday routine by synergizing the innovation ecosystem, design thinking, and corporate strategy to achieve competitive advantage and economic growth. The study used an integrative literature review methodology to develop the framework. AIM links the innovation ecosystem with the firm's corporate strategy by adopting innovation management as a strategy through design thinking. This makes innovation more user/human-centered, so that it is desirable to the customer, viable for the business, and technically feasible; it creates both entrepreneurial and customer value and boosts corporate venturing and corporate entrepreneurship, while addressing the needs of the IoT and Industry 5.0 era. In sum, AIM synergizes innovation, design thinking, and strategy to make businesses future-ready for the IoT and Industry 5.0 revolution. The present study is significant in that it not only makes a considerable contribution to the existing innovation management literature by developing a new framework but also makes the concept more practical, implementable, and part of an organization's everyday routine.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 98
    Publication Date: 2020
    Description: The Treemap is one of the most relevant information visualization (InfoVis) techniques for supporting the analysis of large hierarchical data structures or data clusters. Despite that, the Treemap still presents some challenges for data representation, such as few options for visual data mappings and the inability to represent zero and negative values. Additionally, visualizing high-dimensional data requires many hierarchies, which can impair data visualization. Thus, this paper proposes adding layered glyphs to a Treemap's items to mitigate these issues. Layered glyphs are composed of N partially visible layers, each of which maps one data dimension to a visual variable. Since the area of an upper layer is always smaller than that of the layers beneath it, the layers can be stacked to compose a multidimensional glyph. To validate this proposal, we conducted a user study comparing three scenarios of visual data mappings for Treemaps: only Glyphs (G), Glyphs and Hierarchy (GH), and only Hierarchy (H). Thirty-six volunteers with a background in InfoVis techniques, organized into three groups of twelve (one group per scenario), performed eight InfoVis tasks using one of the proposed scenarios. The results indicate that scenario GH presented the best accuracy while having a task-solving time similar to scenario H, which suggests that representing more data in Treemaps with layered glyphs enriches the Treemap's visualization capabilities without impairing data readability.
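    The stacking constraint described here (every upper layer strictly smaller than the one beneath it, so all N layers remain partially visible) can be sketched as a simple size-mapping function. The particular scaling scheme below, capping each layer at 80% of the layer under it with a visible minimum, is an illustrative choice, not the paper's exact visual mapping:

    ```python
    def layer_sizes(values, base=100.0):
        """Map N values in [0, 1] to N square side lengths, largest at the
        bottom; each upper layer is capped strictly below the layer beneath
        it so that every layer stays partially visible when stacked."""
        sizes = []
        ceiling = base
        for v in values:
            # at most 80% of the layer below, never shrinking to zero
            size = ceiling * 0.8 * (0.5 + 0.5 * v)
            sizes.append(size)
            ceiling = size
        return sizes

    # one glyph encoding three data dimensions of a Treemap item
    glyph = layer_sizes([1.0, 0.5, 0.9])
    ```

    Because each layer's size depends on both its own data value and the layer below, the mapping guarantees monotonically shrinking areas regardless of the input values, which is what makes the stacked layers readable.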
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 99
    Publication Date: 2020
    Description: Traffic lights have been used for decades to control and manage traffic flows crossing road intersections, increasing traffic efficiency and road safety. However, relying on fixed time cycles may not be ideal for dealing with the increasing congestion level in cities. Therefore, we propose a new Adaptive Traffic Light Control System (ATLCS) to assist traffic management authorities in efficiently dealing with traffic congestion in cities. The main idea of our ATLCS consists of synchronizing a number of traffic lights controlling consecutive junctions by creating a delay between the times at which each of them switches to green in a given direction. This delay is dynamically updated based on the number of vehicles waiting at each junction, thereby allowing vehicles leaving the city centre to travel a long distance without stopping (i.e., minimizing occurrences of the 'stop and go' phenomenon), which in turn reduces their travel time as well. The performance evaluation of our ATLCS has shown that the average travel time of vehicles traveling in the synchronized direction is significantly reduced (by up to 39%) compared to non-synchronized fixed-time Traffic Light Control Systems. Moreover, the overall improvement achieved across the simulated road network was 17%.
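    The synchronization idea described here can be sketched as computing green-time offsets along a corridor: each junction downstream of the first switches to green after the platoon's free-flow travel time over the link plus the time needed to discharge the queue already waiting there. The constants below (average speed, per-vehicle clearance time) are illustrative assumptions, not parameters from the paper:

    ```python
    TRAVEL_SPEED = 12.0           # m/s, assumed average urban speed
    CLEARANCE_PER_VEHICLE = 2.0   # s to discharge one queued vehicle

    def green_offsets(link_lengths, queues):
        """Cumulative times (s) after the first junction's green at which
        each following junction should switch to green, so a platoon
        leaving the city centre arrives at each junction as it opens."""
        offsets, t = [], 0.0
        for length, queue in zip(link_lengths, queues):
            t += length / TRAVEL_SPEED + queue * CLEARANCE_PER_VEHICLE
            offsets.append(t)
        return offsets

    # three links of 240 m / 360 m / 300 m with 4, 0 and 6 queued vehicles
    offsets = green_offsets([240.0, 360.0, 300.0], [4, 0, 6])
    ```

    Recomputing the offsets as the queue counts change is what makes the scheme adaptive rather than a fixed-cycle green wave.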
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI
  • 100
    Publication Date: 2020
    Description: This paper examines how social media affect Japanese civil society organizations, in relation to efficacy and political participation. Using data from the 2017 Japan Interest Group Study survey, we analyzed how the flow of information leads to the political participation of civil society organizations. The total number of responding organizations was 1285 (942 in Tokyo and 343 in Ibaraki). In the analysis of our survey, we focused on the portion of the data related to information behavior and efficacy, and investigated the meta-cognition of efficacy in lobbying among civil society organizations in Tokyo and Ibaraki. We found that relatively few organizations use social media. However, the organizations that do use social media show much higher meta-cognition of political efficacy than those that do not; for instance, social media users were more likely to perceive themselves as able to exert influence upon others. We also found that organizations that interact with citizens show a higher tendency to use social media. The correspondence analysis results point towards a hypothesis that efficacy and participation are mutually higher among the organizations that use social media in Japan.
    Electronic ISSN: 2078-2489
    Topics: Computer Science
    Published by MDPI