ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

Filter
Collection
  • Articles (4,779)
Publisher
  • Molecular Diversity Preservation International (3,287)
  • MDPI (1,278)
  • Ubiquity Press (214)
  • Institute of Electrical and Electronics Engineers (IEEE)
Publication period
  • 2020-2022 (2,190)
  • 2015-2019 (2,487)
  • 2010-2014 (102)
  • 1990-1994
  • 1945-1949
Year
  • 2020 (2,190)
  • 2019 (2,487)
  • 2010 (102)
Subject
  • Computer science (4,779)
  • 1
    Publication date: 2020-08-27
    Beschreibung: Festivals are experiential products heavily depending on the recommendations of previous visitors. With the power of social media growing, understanding the antecedents of positive electronic word-of-mouth (eWOM) intentions of festival attendees is immensely beneficial for festival organizers to better promote their festivals and control negative publicity. However, there is still limited research regarding eWOM intentions in the festival context. Thus, this study aims to fill such a gap by investigating the relationships among festival attendees’ enjoyment seeking motivation, perceived value, visitor satisfaction, and eWOM intention in a local festival setting. Additionally, the moderating role of gender was tested as it is one of the most important demographic variables to show individual differences in behavioral intentions. The results of structural equation modeling showed a positive effect of enjoyment seeking motivation on perceived value, visitor satisfaction, and eWOM intention. Moreover, gender differences in eWOM intention and a full mediating effect of visitor satisfaction between perceived value and eWOM intention for female respondents were revealed. The findings of this study extend the existing festival literature and provide insights for strategically organizing and promoting festivals to generate more positive eWOM which can be utilized as an effective marketing tool and a feedback channel.
    Digital ISSN: 2078-2489
    Subject: Computer science
  • 2
    Publication date: 2020-08-26
    Beschreibung: Information and communication technologies transform modern education into a more available learning matrix. One of the unexplored aspects of open education is the constant communicative interaction within the student group by using social media. The aim of the study was to determine principal functions of student-led communication in the educational process, the method for assessing its strong points and the disadvantages disrupting traditional learning. For the primary study of the phenomenon, we used methods that made it possible to propose approaches to further analysis. Netnography is the main research method defining the essence and characteristics of the student-led peer-communication. In our research, we applied data visualization, analytical and quantitative methods and developed a set of quantitative indicators that can be used to assess various aspects of student communication in chats. The elaborated visual model can serve as a simple tool for diagnosing group communication processes. We revealed that online group chats perform a support function in learning. They provide constant informational resource on educational and organizational issues and create emotional comfort. Identified features serve to define shortcomings (e.g., lack of students’ readiness to freely exchange answers to assignments) and significant factors (e.g., underutilized opportunities for self-organization) that exist in the modern system of higher education.
    Digital ISSN: 1999-5903
    Subject: Computer science
  • 3
    Publication date: 2020-08-28
    Beschreibung: Due to the growing success of neural machine translation (NMT), many have started to question its applicability within the field of literary translation. In order to grasp the possibilities of NMT, we studied the output of the neural machine system of Google Translate (GNMT) and DeepL when applied to four classic novels translated from English into Dutch. The quality of the NMT systems is discussed by focusing on manual annotations, and we also employed various metrics in order to get an insight into lexical richness, local cohesion, syntactic, and stylistic difference. Firstly, we discovered that a large proportion of the translated sentences contained errors. We also observed a lower level of lexical richness and local cohesion in the NMTs compared to the human translations. In addition, NMTs are more likely to follow the syntactic structure of a source sentence, whereas human translations can differ. Lastly, the human translations deviate from the machine translations in style.
    Digital ISSN: 2227-9709
    Subject: Computer science
  • 4
    Publication date: 2020-08-29
    Description: The emergence and outbreak of the novel coronavirus (COVID-19) had a devastating effect on global health, the economy, and individuals’ daily lives. Timely diagnosis of COVID-19 is a crucial task, as it reduces the risk of pandemic spread, and early treatment will save patients’ lives. Due to the time-consuming, complex nature, and high false-negative rate of the gold-standard RT-PCR test used for the diagnosis of COVID-19, the need for an additional diagnosis method has increased. Studies have proved the significance of X-ray images for the diagnosis of COVID-19. The dissemination of deep-learning techniques on X-ray images can automate the diagnosis process and serve as an assistive tool for radiologists. In this study, we used four deep-learning models—DenseNet121, ResNet50, VGG16, and VGG19—using the transfer-learning concept for the diagnosis of X-ray images as COVID-19 or normal. In the proposed study, VGG16 and VGG19 outperformed the other two deep-learning models. The study achieved an overall classification accuracy of 99.3%.
    Digital ISSN: 2078-2489
    Subject: Computer science
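The transfer-learning setup described in entry 4 can be illustrated with a short, generic sketch. This is not the authors' code: it assumes PyTorch and torchvision, uses a dummy batch in place of a real chest X-ray DataLoader, and only shows how a pretrained VGG16 head is swapped for a two-class (COVID-19 vs. normal) classifier.

```python
# Minimal transfer-learning sketch (assumes PyTorch + torchvision are installed).
import torch
import torch.nn as nn
from torchvision import models

# Load a VGG16 backbone pretrained on ImageNet and freeze its feature extractor.
model = models.vgg16(pretrained=True)
for param in model.features.parameters():
    param.requires_grad = False

# Replace the last fully connected layer: 2 outputs (COVID-19 vs. normal).
model.classifier[6] = nn.Linear(in_features=4096, out_features=2)

# Only the new head (and any unfrozen layers) are optimized.
optimizer = torch.optim.Adam(
    filter(lambda p: p.requires_grad, model.parameters()), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 224x224 RGB tensors.
images = torch.randn(8, 3, 224, 224)       # stand-in for a real DataLoader batch
labels = torch.randint(0, 2, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```

Freezing the convolutional features and retraining only the classifier head is the usual starting point when the target dataset is small.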
  • 5
    Publication date: 2020-08-29
    Beschreibung: In this work, we demonstrate how the blockchain and the off-chain storage interact via Oracle-based mechanisms, which build an effective connection between a distributed database and real assets. For demonstration purposes, smart contracts were drawn up to deal with two different applications. Due to the characteristics of the blockchain, we may still encounter severe privacy issues, since the data stored on the blockchain are exposed to the public. The proposed scheme provides a general solution for resolving the above-mentioned privacy issue; that is, we try to protect the on-chain privacy of the sensitive data by using homomorphic encryption techniques. Specifically, we constructed a secure comparison protocol that can check the correctness of a logic function directly in the encrypted domain. By using the proposed access control contract and the secure comparison protocol, one can carry out sensitive data-dependent smart contract operations without revealing the data themselves.
    Digital ISSN: 2073-431X
    Subject: Computer science
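Entry 5 relies on homomorphic encryption to compute on data without revealing it. As a minimal illustration of that idea (not the paper's secure comparison protocol or smart-contract code), the sketch below uses the third-party `phe` package, which implements the additively homomorphic Paillier scheme.

```python
# Toy additively homomorphic encryption demo using the `phe` (python-paillier) package.
# This only illustrates computing on ciphertexts; it is not the paper's protocol.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Encrypt two sensitive values, e.g. data kept off-chain.
enc_a = public_key.encrypt(42)
enc_b = public_key.encrypt(58)

# Anyone holding only the public key can add ciphertexts or scale them by plaintexts.
enc_sum = enc_a + enc_b          # E(42 + 58)
enc_scaled = enc_a * 3           # E(42 * 3)

# Only the private key holder can decrypt the results.
assert private_key.decrypt(enc_sum) == 100
assert private_key.decrypt(enc_scaled) == 126
```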
  • 6
    Publication date: 2020-08-29
    Beschreibung: Healthcare facilities are constantly deteriorating due to tight budgets allocated to the upkeep of building assets. This entails the need for improved deterioration modeling of such buildings in order to enforce a predictive maintenance approach that decreases the unexpected occurrence of failures and the corresponding downtime elapsed to repair or replace the faulty asset components. Currently, hospitals utilize subjective deterioration prediction methodologies that mostly rely on age as the sole indicator of degradation to forecast the useful lives of the building components. Thus, this paper aims at formulating a more efficient stochastic deterioration prediction model that integrates the latest observed condition into the forecasting procedure to overcome the subjectivity and uncertainties associated with the currently employed methods. This is achieved by means of developing a hybrid genetic algorithm-based fuzzy Markovian model that simulates the deterioration process given the scarcity of available data demonstrating the condition assessment and evaluation for such critical facilities. A nonhomogeneous transition probability matrix (TPM) based on fuzzy membership functions representing the condition, age and relative deterioration rate of the hospital systems is utilized to address the inherited uncertainties. The TPM is further calibrated by means of a genetic algorithm to circumvent the drawbacks of the expert-based models. A sensitivity analysis was carried out to analyze the possible changes in the output resulting from predefined modifications to the input parameters in order to ensure the robustness of the model. The performance of the deterioration prediction model developed is then validated through a comparison with a state-of-art stochastic model in contrast to real hospital datasets, and the results obtained from the developed model significantly outperformed the long-established Weibull distribution-based deterioration prediction methodology with mean absolute errors of 1.405 and 9.852, respectively. Therefore, the developed model is expected to assist decision-makers in creating more efficient maintenance programs as well as more data-driven capital renewal plans.
    Digital ISSN: 1999-4893
    Subject: Computer science
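The Markovian core of the model in entry 6 can be made concrete with a small sketch: a condition-state distribution is propagated through a transition probability matrix (TPM). The five states and all TPM values below are invented for illustration and are not taken from the paper.

```python
# Sketch: propagate a condition-state distribution through a transition probability matrix.
# The 5 condition states and the TPM entries below are illustrative only.
import numpy as np

# Row i gives the probabilities of moving from state i to each state in one year.
tpm = np.array([
    [0.90, 0.10, 0.00, 0.00, 0.00],
    [0.00, 0.85, 0.15, 0.00, 0.00],
    [0.00, 0.00, 0.80, 0.20, 0.00],
    [0.00, 0.00, 0.00, 0.75, 0.25],
    [0.00, 0.00, 0.00, 0.00, 1.00],   # worst state is absorbing
])

state = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # component starts in the best condition

for year in range(1, 11):
    state = state @ tpm                        # one-year Markov step
    expected_condition = state @ np.arange(1, 6)
    print(f"year {year:2d}: expected condition index = {expected_condition:.2f}")
```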
  • 7
    Publication date: 2020-08-29
    Beschreibung: The harmonic closeness centrality measure associates, to each node of a graph, the average of the inverse of its distances from all the other nodes (by assuming that unreachable nodes are at infinite distance). This notion has been adapted to temporal graphs (that is, graphs in which edges can appear and disappear during time) and in this paper we address the question of finding the top-k nodes for this metric. Computing the temporal closeness for one node can be done in O(m) time, where m is the number of temporal edges. Therefore computing exactly the closeness for all nodes, in order to find the ones with top closeness, would require O(nm) time, where n is the number of nodes. This time complexity is intractable for large temporal graphs. Instead, we show how this measure can be efficiently approximated by using a “backward” temporal breadth-first search algorithm and a classical sampling technique. Our experimental results show that the approximation is excellent for nodes with high closeness, allowing us to detect them in practice in a fraction of the time needed for computing the exact closeness of all nodes. We validate our approach with an extensive set of experiments.
    Digital ISSN: 1999-4893
    Subject: Computer science
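For a static graph, the harmonic closeness used in entry 7 and a sampling-based approximation of it can be sketched as follows. This assumes networkx and an ordinary undirected graph, whereas the paper treats temporal graphs with a backward temporal BFS.

```python
# Sketch: exact harmonic closeness on a static graph, plus a sampling-based estimate
# of the same quantity for all nodes.
import random
import networkx as nx

def harmonic_closeness(G, v):
    """Average of 1/d(v, u) over all other nodes; unreachable nodes contribute 0."""
    dist = nx.single_source_shortest_path_length(G, v)   # one BFS from v
    return sum(1.0 / d for u, d in dist.items() if u != v) / (G.number_of_nodes() - 1)

def approx_harmonic_closeness(G, samples=50, seed=0):
    """Estimate the closeness of every node from BFS trees rooted at a few sampled nodes."""
    rng = random.Random(seed)
    roots = rng.sample(list(G.nodes), samples)
    est = {v: 0.0 for v in G.nodes}
    for s in roots:
        for u, d in nx.single_source_shortest_path_length(G, s).items():
            if d > 0:
                est[u] += 1.0 / d          # d(u, s) == d(s, u) in an undirected graph
    return {v: val / len(roots) for v, val in est.items()}

G = nx.erdos_renyi_graph(500, 0.02, seed=1)
print(harmonic_closeness(G, 0), approx_harmonic_closeness(G)[0])
```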
  • 8
    Publication date: 2020-07-20
    Beschreibung: Computer programmers require various instructive information during coding and development. Such information is dispersed in different sources like language documentation, wikis, and forums. As an information exchange platform, programmers broadly utilize Stack Overflow, a Web-based Question Answering site. In this paper, we propose a recommender system which uses a supervised machine learning approach to investigate Stack Overflow posts to present instructive information for the programmers. This might be helpful for the programmers to solve programming problems that they confront with in their daily life. We analyzed posts related to two most popular programming languages—Python and PHP. We performed a few trials and found that the supervised approach could effectively manifold valuable information from our corpus. We validated the performance of our system from human perception which showed an accuracy of 71%. We also presented an interactive interface for the users that satisfied the users’ query with the matching sentences with most instructive information.
    Digital ISSN: 2073-431X
    Subject: Computer science
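A supervised sentence classifier of the kind entry 8 builds on can be sketched with a standard TF-IDF plus logistic-regression pipeline. The sentences and labels below are toy stand-ins, not the Stack Overflow corpus used in the paper; scikit-learn is assumed.

```python
# Sketch: a supervised text classifier of the kind such a recommender could build on.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

sentences = [
    "Use a context manager to close files automatically.",      # instructive
    "Thanks, that worked for me!",                               # not instructive
    "List comprehensions are faster than appending in a loop.",
    "Same problem here, any update?",
]
labels = [1, 0, 1, 0]   # 1 = instructive information, 0 = noise

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(sentences, labels)

print(clf.predict(["Prefer virtual environments to isolate dependencies."]))
```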
  • 9
    Publication date: 2020-07-19
    Beschreibung: Background: Health benefits from physical activity (PA) can be achieved by following the WHO recommendation for PA. To increase PA in inactive individuals, digital interventions can provide cost-effective and low-threshold access. Moreover, gamification elements can raise the motivation for PA. This study analyzed which factors (personality traits, app features, gamification) are relevant to increasing PA within this target group. Methods: N = 808 inactive participants (f = 480; m = 321; age = 48 ± 6) were integrated into the analysis of the desire for PA, the appearance of personality traits and resulting interest in app features and gamification. The statistical analysis included chi-squared tests, one-way ANOVA and regression analysis. Results: The main interests in PA were fitness (97%) and outdoor activities (75%). No significant interaction between personality traits, interest in PA goals, app features and gamification were found. The interest in gamification was determined by the PA goal. Participants’ requirements for features included feedback and suggestions for activities. Monetary incentives were reported as relevant gamification aspects. Conclusion: Inactive people can be reached by outdoor activities, interventions to increase an active lifestyle, fitness and health sports. The study highlighted the interest in specific app features and gamification to increase PA in inactive people through an app.
    Digital ISSN: 2078-2489
    Subject: Computer science
  • 10
    Publication date: 2020-01-01
    Digital ISSN: 1683-1470
    Subject: Computer science
    Published by Ubiquity Press
  • 11
    Publication date: 2020-07-01
    Beschreibung: This paper presents a study related to human psychophysiological activity estimation based on a smartphone camera and sensors. In recent years, awareness of the human body, as well as human mental states, has become more and more popular. Yoga and meditation practices have moved from the east to Europe, the USA, Russia, and other countries, and there are a lot of people who are interested in them. However, recently, people have tried the practice but would prefer an objective assessment. We propose to apply the modern methods of computer vision, pattern recognition, competence management, and dynamic motivation to estimate the quality of the meditation process and provide the users with objective information about their practice. We propose an approach that covers the possibility of recognizing pictures of humans from a smartphone and utilizes wearable electronics to measure the user’s heart rate and motions. We propose a model that allows building meditation estimation scores based on these parameters. Moreover, we propose a meditation expert network through which users can find the coach that is most appropriate for him/her. Finally, we propose the dynamic motivation model, which encourages people to perform the practice every day.
    Digital ISSN: 1999-5903
    Subject: Computer science
  • 12
    Publication date: 2020-08-31
    Beschreibung: The recent literature concerning globalizing regional development has placed significant emphasis on the Global Production Network (GPN 2.0). GPN 2.0 in economic geography emphasizes that regional growth is caused by a shift in the strategic coupling mode from a low to high level. In addition, GPN 2.0 regards firm-level value capture trajectories as key analytical object, rather than the interactive relationships among scalar and divergent actors in GPN 1.0. To provide a better understanding of causal linkages between the GPNs and uneven regional development in the background of globalization and to test the applicability of GPN 2.0 analysis framework, the paper analyzed 62 Korean-invested automotive firms in Jiangsu Province, China. In order to explore the value capture trajectories of lead firms in the GPNs, the authors applied K-means clustering method to quantitatively analyze the local supply networks of lead firms from organizational and spatial dimensions. Then, comparisons were made between strategic coupling modes of GPNs and regional development in North and South Jiangsu. This study found obvious similarities within these two regions but obvious differences between them in terms of value capture trajectories. We observed that North Jiangsu is currently in the stage of “structural coupling”, whereas South Jiangsu is in the stage of “functional coupling.” Thus, this article argues that spatial settings such as regional assets and autonomy are key factors influencing uneven economic development. This research may provide a crucial reference for the regional development of Jiangsu, China.
    Digital ISSN: 2078-2489
    Subject: Computer science
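The K-means step mentioned in entry 12 is standard; a minimal sketch with scikit-learn is shown below. The firm-level supply-network features and their values are invented purely for illustration.

```python
# Sketch: grouping firms by K-means on a few supply-network features (invented data).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Columns: share of local suppliers, mean supplier distance (km), number of supplier tiers.
firms = np.array([
    [0.80,  35, 3],
    [0.75,  50, 3],
    [0.30, 420, 1],
    [0.25, 510, 1],
    [0.55, 150, 2],
    [0.60, 120, 2],
])

X = StandardScaler().fit_transform(firms)        # put features on a common scale
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)                            # cluster id per firm
```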
  • 13
    Publication date: 2020-08-31
    Description: Software defined networking (SDN) is an emerging network paradigm that decouples the control plane from the data plane. The data plane is composed of forwarding elements called switches and the control plane is composed of controllers. SDN is gaining popularity from industry and academics due to its advantages such as centralized, flexible, and programmable network management. The increasing volume of traffic due to the proliferation of Internet of Things (IoT) devices may result in two problems: (1) increased processing load of the controller, and (2) insufficient space in the switches’ flow table to accommodate the flow entries. These problems may cause undesired network behavior and unstable network performance, especially in large-scale networks. Many solutions have been proposed to improve the management of the flow table, reducing controller processing load, and mitigating security threats and vulnerabilities on the controllers and switches. This paper provides comprehensive surveys of existing schemes to ensure SDN meets the quality of service (QoS) demands of various applications and cloud services. Finally, potential future research directions are identified and discussed such as management of flow table using machine learning.
    Digital ISSN: 1999-5903
    Subject: Computer science
  • 14
    Publication date: 2020-07-16
    Description: High-order convective Cahn-Hilliard type equations describe the faceting of a growing surface or the dynamics of phase transitions in ternary oil-water-surfactant systems. In this paper, we prove the well-posedness of classical solutions for the Cauchy problem associated with this equation.
    Digital ISSN: 1999-4893
    Subject: Computer science
  • 15
    Publication date: 2020-01-01
    Digital ISSN: 1683-1470
    Subject: Computer science
    Published by Ubiquity Press
  • 16
    Publication date: 2020-07-15
    Description: As Web applications become more and more complex, the development costs are increasing as well. A Model Driven Architecture (MDA) approach is proposed in this paper since it simplifies modeling, design, implementation, and integration of applications by defining software mainly at the model level. We adopt the Unified Modeling Language (UML) as the modeling language. UML provides a set of diagrams to model structural and behavioral aspects of the Web applications. Automatic translation of UML diagrams to the Object-Oriented code is highly desirable because it eliminates the chances of introducing human errors. Moreover, automatic code generation helps software designers deliver the software on time. In our approach, the automatic transformations across the MDA’s levels are based on meta-models for two of the most important constructs of UML, namely Use Cases and classes. A proprietary tool (called xGenerator) performs the transformations up to the Java source code. The architecture of the generated Web applications respects a variant of the well-known Model-View-Controller (MVC) pattern.
    Digital ISSN: 2073-431X
    Subject: Computer science
  • 17
    Publication date: 2020-07-15
    Description: It is critical for organizations to self-assess their Industry 4.0 readiness to survive and thrive in the age of the Fourth Industrial Revolution. To that end, conceptualization or development of an Industry 4.0 readiness model with the fundamental model dimensions is needed. This paper used a systematic literature review (SLR) methodology with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines and content analysis strategy to review 97 papers in peer-reviewed academic journals and industry reports published from 2000 to 2019. The review identifies 30 Industry 4.0 readiness models with 158 unique model dimensions. Based on this review, there are two theoretical contributions. First, this paper proposes six dimensions (Technology, People, Strategy, Leadership, Process and Innovation) that can be considered as the most important dimensions for organizations. Second, this review reveals that 70 (44%) of the 158 total unique dimensions on Industry 4.0 pertain to the assessment of technology alone. This establishes that organizations need to largely improve on their technology readiness, to strengthen their Industry 4.0 readiness. In summary, these six most common dimensions, and in particular the dominance of the technology dimension, provide a research agenda for future research on Industry 4.0 readiness.
    Digital ISSN: 2078-2489
    Subject: Computer science
  • 18
    Publication date: 2020-07-16
    Beschreibung: This study introduces a software-based traffic congestion monitoring system. The transportation system controls the traffic between cities all over the world. Traffic congestion happens not only in cities, but also on highways and other places. The current transportation system is not satisfactory in the area without monitoring. In order to improve the limitations of the current traffic system in obtaining road data and expand its visual range, the system uses remote sensing data as the data source for judging congestion. Since some remote sensing data needs to be kept confidential, this is a problem to be solved to effectively protect the safety of remote sensing data during the deep learning training process. Compared with the general deep learning training method, this study provides a federated learning method to identify vehicle targets in remote sensing images to solve the problem of data privacy in the training process of remote sensing data. The experiment takes the remote sensing image data sets of Los Angeles Road and Washington Road as samples for training, and the training results can achieve an accuracy of about 85%, and the estimated processing time of each image can be as low as 0.047 s. In the final experimental results, the system can automatically identify the vehicle targets in the remote sensing images to achieve the purpose of detecting congestion.
    Digital ISSN: 2078-2489
    Subject: Computer science
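Entry 18 trains a detector without pooling the raw remote-sensing images. The sketch below shows the aggregation idea behind one federated-averaging round in plain NumPy; the tiny model, gradients, and client sizes are invented, and the paper's actual federated setup is not reproduced.

```python
# Sketch: one round of federated averaging (FedAvg-style aggregation), illustrative only.
import numpy as np

def local_update(global_weights, local_gradient, lr=0.1):
    """Each client refines the global weights on its own private data."""
    return global_weights - lr * local_gradient

global_w = np.zeros(4)                       # toy model with 4 parameters

# Gradients computed locally by three clients on data that never leaves them.
client_grads = [np.array([0.2, -0.1, 0.0, 0.3]),
                np.array([0.1,  0.0, 0.1, 0.2]),
                np.array([0.3, -0.2, 0.1, 0.1])]
client_sizes = [120, 80, 200]                # number of local samples per client

client_ws = [local_update(global_w, g) for g in client_grads]

# Server aggregates: weighted average of the client models (no raw data exchanged).
total = sum(client_sizes)
global_w = sum(w * (n / total) for w, n in zip(client_ws, client_sizes))
print(global_w)
```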
  • 19
    Publication date: 2020-07-15
    Beschreibung: Fractal’s spatially nonuniform phenomena and chaotic nature highlight the function utilization in fractal cryptographic applications. This paper proposes a new composite fractal function (CFF) that combines two different Mandelbrot set (MS) functions with one control parameter. The CFF simulation results demonstrate that the given map has high initial value sensitivity, complex structure, wider chaotic region, and more complicated dynamical behavior. By considering the chaotic properties of a fractal, an image encryption algorithm using a fractal-based pixel permutation and substitution is proposed. The process starts by scrambling the plain image pixel positions using the Henon map so that an intruder fails to obtain the original image even after deducing the standard confusion-diffusion process. The permutation phase uses a Z-scanned random fractal matrix to shuffle the scrambled image pixel. Further, two different fractal sequences of complex numbers are generated using the same function i.e. CFF. The complex sequences are thus modified to a double datatype matrix and used to diffuse the scrambled pixels in a row-wise and column-wise manner, separately. Security and performance analysis results confirm the reliability, high-security level, and robustness of the proposed algorithm against various attacks, including brute-force attack, known/chosen-plaintext attack, differential attack, and occlusion attack.
    Digital ISSN: 2313-433X
    Subject: Computer science
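Entry 19 scrambles pixel positions with a chaotic map before substitution. The sketch below illustrates only that scrambling idea, using the Henon map mentioned in the abstract to derive an invertible permutation; the composite fractal function (CFF) itself is not reproduced.

```python
# Sketch: chaos-based pixel scrambling with the Henon map (illustration only).
import numpy as np

def henon_sequence(length, x0=0.1, y0=0.3, a=1.4, b=0.3):
    xs = np.empty(length)
    x, y = x0, y0
    for i in range(length):
        x, y = 1 - a * x * x + y, b * x     # simultaneous Henon update
        xs[i] = x
    return xs

img = np.arange(16, dtype=np.uint8).reshape(4, 4)     # toy "image"
flat = img.ravel()

perm = np.argsort(henon_sequence(flat.size))          # chaotic order -> permutation
scrambled = flat[perm].reshape(img.shape)

# The permutation is invertible, so the receiver can restore the image.
restored = np.empty_like(flat)
restored[perm] = scrambled.ravel()
assert np.array_equal(restored.reshape(img.shape), img)
```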
  • 20
    Publication date: 2020-07-08
    Beschreibung: In the last decade, there has been a surge in interest in connected and automated vehicles (CAVs) and related enabling technologies in the fields of communication, automation, computing, sensing, and positioning [...]
    Digital ISSN: 1999-5903
    Subject: Computer science
  • 21
    Publication date: 2020-07-08
    Description: We consider a rather general problem of nonparametric estimation of an uncountable set of probability density functions (p.d.f.’s) of the form f(x; r), where r is a non-random real variable ranging from R1 to R2. We put emphasis on the algorithmic aspects of this problem, since they are crucial for exploratory analysis of the big data needed for the estimation. A specialized learning algorithm, based on the 2D FFT, is proposed and tested on observations that allow for estimating p.d.f.’s of jet engine temperatures as a function of rotation speed. We also derive theoretical results concerning the convergence of the estimation procedure that contain hints on selecting parameters of the estimation algorithm.
    Digital ISSN: 1999-4893
    Subject: Computer science
  • 22
    Publication date: 2020-07-08
    Beschreibung: The lockdown was crucial to stop the COVID-19 pandemic in Italy, but it affected many aspects of social life, among which traditional live science cafés. Moreover, citizens and experts asked for a direct contact, not relying on mass-media communication. In this paper, we describe how the Florence and Rome science cafés, contacted by citizens and experts, either directly or through the Florence science shop, responded to these needs by organizing online versions of traditional face-to-face events, experiencing high levels of participation. The science café methodology was also requested by a high school that needed to conclude an engagement experience with students and their families. We also report the results of a survey about the satisfaction of this new methodology with respect to the old one.
    Digital ISSN: 1999-5903
    Subject: Computer science
  • 23
    Publication date: 2020-07-09
    Description: This research presents a machine vision approach to detect lesions in liver ultrasound and to resolve some issues in ultrasound imaging such as artifacts, speckle noise, and blurring effects. Anisotropic diffusion is modified using edge-preservation conditions, which were found to perform better than traditional ones in quantitative evaluation. To extract more potential information, a learnable super-resolution (SR) module is embedded into the deep CNN. Features are fused using the Gabor Wavelet Transform (GWT) and Local Binary Pattern (LBP) with a pre-trained deep CNN model. Moreover, we propose a Bayes rule-based informative patch selection approach to reduce the processing time with the selective image patches and design an algorithm to mark the lesion region from identified ultrasound image patches. To train this model, standard data ensuring promising resolution are used, while the testing phase considers generalized data with varying resolution and tests the performance of the model. Exploring cross-validation, we find that a 5-fold strategy can successfully eradicate the overfitting problem. Experiment data are collected using 298 consecutive ultrasounds comprising 15,296 image patches. The proposed feature fusion technique confirms satisfactory performance compared to the current relevant works with an accuracy of 98.40%.
    Digital ISSN: 2504-4990
    Subject: Computer science
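One of the handcrafted descriptors fused in entry 23, the local binary pattern, can be computed in a few lines with scikit-image. This sketch covers only that descriptor; the Gabor wavelet features, the CNN features, and the fusion step are not shown.

```python
# Sketch: local binary pattern (LBP) histogram for an image patch (assumes scikit-image).
import numpy as np
from skimage.feature import local_binary_pattern

patch = np.random.rand(64, 64)                  # stand-in for an ultrasound patch

P, R = 8, 1                                     # 8 neighbours on a circle of radius 1
lbp = local_binary_pattern(patch, P, R, method="uniform")

# Histogram of LBP codes is the texture feature vector for this patch.
hist, _ = np.histogram(lbp, bins=np.arange(P + 3), density=True)
print(hist)
```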
  • 24
    Publication date: 2020-07-10
    Beschreibung: QR (quick response) Codes are one of the most popular types of two-dimensional (2D) matrix codes currently used in a wide variety of fields. Two-dimensional matrix codes, compared to 1D bar codes, can encode significantly more data in the same area. We have compared algorithms capable of localizing multiple QR Codes in an image using typical finder patterns, which are present in three corners of a QR Code. Finally, we present a novel approach to identify perspective distortion by analyzing the direction of horizontal and vertical edges and by maximizing the standard deviation of horizontal and vertical projections of these edges. This algorithm is computationally efficient, works well for low-resolution images, and is also suited to real-time processing.
    Digital ISSN: 2313-433X
    Subject: Computer science
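Entry 24 identifies distortion by maximizing the standard deviation of edge projections. The sketch below shows a one-dimensional analogue, recovering a simple rotation (skew) angle of a synthetic grid; the full perspective-distortion case handled in the paper is more involved, and the grid image and angle range are invented.

```python
# Sketch: estimate a skew angle by maximizing the standard deviation of the
# projection of edge magnitudes onto the x-axis (1-D analogue of the idea above).
import numpy as np
from scipy import ndimage

def edge_projection_std(image, angle_deg):
    rotated = ndimage.rotate(image, angle_deg, reshape=False, order=1)
    gx = ndimage.sobel(rotated, axis=1)          # horizontal gradient -> vertical edges
    projection = np.abs(gx).sum(axis=0)          # project edge energy onto the x-axis
    return projection.std()

# Toy image: a grid of lines, rotated by 7 degrees.
img = np.zeros((120, 120))
img[::10, :] = 1.0
img[:, ::10] = 1.0
img = ndimage.rotate(img, 7, reshape=False, order=1)

angles = np.arange(-15, 15.5, 0.5)
best = max(angles, key=lambda a: edge_projection_std(img, a))
print("estimated skew:", -best, "degrees")        # counter-rotation that re-aligns the grid
```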
  • 25
    Publication date: 2020-07-08
    Beschreibung: Deep learning models have been applied for varied electrical applications in smart grids with a high degree of reliability and accuracy. The development of deep learning models requires the historical data collected from several electric utilities during the training of the models. The lack of historical data for training and testing of developed models, considering security and privacy policy restrictions, is considered one of the greatest challenges to machine learning-based techniques. The paper proposes the use of homomorphic encryption, which enables the possibility of training the deep learning and classical machine learning models whilst preserving the privacy and security of the data. The proposed methodology is tested for applications of fault identification and localization, and load forecasting in smart grids. The results for fault localization show that the classification accuracy of the proposed privacy-preserving deep learning model while using homomorphic encryption is 97–98%, which is close to 98–99% classification accuracy of the model on plain data. Additionally, for load forecasting application, the results show that RMSE using the homomorphic encryption model is 0.0352 MWh while RMSE without application of encryption in modeling is around 0.0248 MWh.
    Digital ISSN: 2078-2489
    Subject: Computer science
  • 26
    Publication date: 2020-07-07
    Beschreibung: Fifth generation (5G) is a new generation mobile communication system developed for the growing demand for mobile communication. Channel coding is an indispensable part of most modern digital communication systems, for it can improve the transmission reliability and anti-interference. In order to meet the requirements of 5G communication, a dual threshold self-corrected minimum sum (DT-SCMS) algorithm for low-density parity-check (LDPC) decoders is proposed in this paper. Besides, an architecture of LDPC decoders is designed. By setting thresholds to judge the reliability of messages, the DT-SCMS algorithm erases unreliable messages, improving the decoding performance and efficiency. Simulation results show that the performance of DT-SCMS is better than that of SCMS. When the code rate is 1/3, the performance of DT-SCMS has been improved by 0.2 dB at the bit error rate of 10 − 4 compared with SCMS. In terms of the convergence, when the code rate is 2/3, the number of iterations of DT-SCMS can be reduced by up to 20.46% compared with SCMS, and the average proportion of reduction is 18.68%.
    Digital ISSN: 2078-2489
    Subject: Computer science
  • 27
    Publication date: 2020-07-09
    Description: We report the design of a Spiking Neural Network (SNN) edge detector with biologically inspired neurons that has a conceptual similarity with both Hodgkin-Huxley (HH) model neurons and Leaky Integrate-and-Fire (LIF) neurons. The computation of the membrane potential, which is used to determine the occurrence or absence of spike events at each time step, is carried out by using the analytical solution to a simplified version of the HH neuron model. We find that the SNN-based edge detector detects more edge pixels in images than those obtained by a Sobel edge detector. We designed a pipeline for image classification with a low-exposure frame simulation layer, SNN edge detection layers as pre-processing layers and a Convolutional Neural Network (CNN) as a classification module. We tested this pipeline for the task of classification with the Digits dataset, which is available in MATLAB. We find that the SNN-based edge detection layer increases the image classification accuracy at lower exposure times, that is, for 1 < t < T/4, where t is the number of milliseconds in a simulated exposure frame and T is the total exposure time, with reference to a Sobel edge or Canny edge detection layer in the pipeline. These results pave the way for developing novel cognitive neuromorphic computing architectures for millisecond timescale detection and object classification applications using event or spike cameras.
    Digital ISSN: 1999-4893
    Subject: Computer science
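The membrane-potential logic in entry 27 can be illustrated with the generic leaky integrate-and-fire update below. The paper uses an analytical solution of a simplified Hodgkin-Huxley model instead; this sketch, with invented constants, only shows how a potential is integrated, compared against a threshold, and reset after a spike.

```python
# Sketch: a generic leaky integrate-and-fire (LIF) membrane-potential update.
import numpy as np

dt, tau, v_rest, v_reset, v_thresh, R = 1e-3, 20e-3, -70e-3, -70e-3, -50e-3, 1e7

def simulate(input_current, steps=200):
    v = v_rest
    spikes = []
    for t in range(steps):
        dv = (-(v - v_rest) + R * input_current) / tau
        v += dt * dv                     # integrate the membrane potential
        if v >= v_thresh:                # spike event: record and reset
            spikes.append(t * dt)
            v = v_reset
    return spikes

print(simulate(2.5e-9))                  # a constant 2.5 nA drive produces regular spikes
```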
  • 28
    Publication date: 2020-07-08
    Beschreibung: The collection and processing of personal data offers great opportunities for technological advances, but the accumulation of vast amounts of personal data also increases the risk of misuse for malicious intentions, especially in health care. Therefore, personal data are legally protected, e.g., by the European General Data Protection Regulation (GDPR), which states that individuals must be transparently informed and have the right to take control over the processing of their personal data. In real applications privacy policies are used to fulfill these requirements which can be negotiated via user interfaces. The literature proposes privacy languages as an electronic format for privacy policies while the users privacy preferences are represented by preference languages. However, this is only the beginning of the personal data life-cycle, which also includes the processing of personal data and its transfer to various stakeholders. In this work we define a personal privacy workflow, considering the negotiation of privacy policies, privacy-preserving processing and secondary use of personal data, in context of health care data processing to survey applicable Privacy Enhancing Technologies (PETs) to ensure the individuals’ privacy. Based on a broad literature review we identify open research questions for each step of the workflow.
    Digital ISSN: 2078-2489
    Subject: Computer science
  • 29
    Publication date: 2020-07-05
    Beschreibung: Microscopic crowd simulation can help to enhance the safety of pedestrians in situations that range from museum visits to music festivals. To obtain a useful prediction, the input parameters must be chosen carefully. In many cases, a lack of knowledge or limited measurement accuracy add uncertainty to the input. In addition, for meaningful parameter studies, we first need to identify the most influential parameters of our parametric computer models. The field of uncertainty quantification offers standardized and fully automatized methods that we believe to be beneficial for pedestrian dynamics. In addition, many methods come at a comparatively low cost, even for computationally expensive problems. This allows for their application to larger scenarios. We aim to identify and adapt fitting methods to microscopic crowd simulation in order to explore their potential in pedestrian dynamics. In this work, we first perform a variance-based sensitivity analysis using Sobol’ indices and then crosscheck the results by a derivative-based measure, the activity scores. We apply both methods to a typical scenario in crowd simulation, a bottleneck. Because constrictions can lead to high crowd densities and delays in evacuations, several experiments and simulation studies have been conducted for this setting. We show qualitative agreement between the results of both methods. Additionally, we identify a one-dimensional subspace in the input parameter space and discuss its impact on the simulation. Moreover, we analyze and interpret the sensitivity indices with respect to the bottleneck scenario.
    Digital ISSN: 1999-4893
    Subject: Computer science
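A variance-based sensitivity analysis like the one in entry 29 is commonly run with the SALib package. The sketch below assumes SALib and replaces the expensive crowd simulation with a cheap toy function; the parameter names and bounds are invented.

```python
# Sketch: Sobol' sensitivity indices with SALib on a toy model standing in for a
# computationally expensive crowd simulation.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["desired_speed", "body_radius", "reaction_time"],
    "bounds": [[1.0, 2.0], [0.15, 0.25], [0.2, 0.8]],
}

X = saltelli.sample(problem, 1024)                 # Saltelli sampling design

# Toy "evacuation time" model; the real study would run the crowd simulator here.
Y = 10.0 / X[:, 0] + 50.0 * X[:, 1] + 2.0 * X[:, 2] * X[:, 0]

Si = sobol.analyze(problem, Y)
print(Si["S1"])    # first-order indices
print(Si["ST"])    # total-order indices
```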
  • 30
    Publication date: 2020-06-30
    Beschreibung: The use of chatbots in news media platforms, although relatively recent, offers many advantages to journalists and media professionals and, at the same time, facilitates users’ interaction with useful and timely information. This study shows the usability of a news chatbot during a crisis situation, employing the 2020 COVID-19 pandemic as a case study. The basic targets of the research are to design and implement a chatbot in a news media platform with a two-fold aim in regard to evaluation: first, the technical effort of creating a functional and robust news chatbot in a crisis situation both from the AI perspective and interoperability with other platforms, which constitutes the novelty of the approach; and second, users’ perception regarding the appropriation of this news chatbot as an alternative means of accessing existing information during a crisis situation. The chatbot designed was evaluated in terms of effectively fulfilling the social responsibility function of crisis reporting, to deliver timely and accurate information on the COVID-19 pandemic to a wide audience. In this light, this study shows the advantages of implementing chatbots in news platforms during a crisis situation, when the audience’s needs for timely and accurate information rapidly increase.
    Digital ISSN: 1999-5903
    Subject: Computer science
  • 31
    Publication date: 2020-06-30
    Beschreibung: Twitter is a microblogging platform that generates large volumes of data with high velocity. This daily generation of unbounded and continuous data leads to Big Data streams that often require real-time distributed and fully automated processing. Hashtags, hyperlinked words in tweets, are widely used for tweet topic classification, retrieval, and clustering. Hashtags are used widely for analyzing tweet sentiments where emotions can be classified without contexts. However, regardless of the wide usage of hashtags, general tweet topic classification using hashtags is challenging due to its evolving nature, lack of context, slang, abbreviations, and non-standardized expression by users. Most existing approaches, which utilize hashtags for tweet topic classification, focus on extracting hashtag concepts from external lexicon resources to derive semantics. However, due to the rapid evolution and non-standardized expression of hashtags, the majority of these lexicon resources either suffer from the lack of hashtag words in their knowledge bases or use multiple resources at once to derive semantics, which make them unscalable. Along with scalable and automated techniques for tweet topic classification using hashtags, there is also a requirement for real-time analytics approaches to handle huge and dynamic flows of textual streams generated by Twitter. To address these problems, this paper first presents a novel semi-automated technique that derives semantically relevant hashtags using a domain-specific knowledge base of topic concepts and combines them with the existing tweet-based-hashtags to produce Hybrid Hashtags. Further, to deal with the speed and volume of Big Data streams of tweets, we present an online approach that updates the preprocessing and learning model incrementally in a real-time streaming environment using the distributed framework, Apache Storm. Finally, to fully exploit the batch and stream environment performance advantages, we propose a comprehensive framework (Hybrid Hashtag-based Tweet topic classification (HHTC) framework) that combines batch and online mechanisms in the most effective way. Extensive experimental evaluations on a large volume of Twitter data show that the batch and online mechanisms, along with their combination in the proposed framework, are scalable, efficient, and provide effective tweet topic classification using hashtags.
    Digital ISSN: 2078-2489
    Subject: Computer science
  • 32
    Publication date: 2020-06-30
    Beschreibung: Standard (Lomb-Scargle, likelihood, etc.) procedures for power-spectrum analysis provide convenient estimates of the significance of any peak in a power spectrum, based—typically—on the assumption that the measurements being analyzed have a normal (i.e., Gaussian) distribution. However, the measurement sequence provided by a real experiment or a real observational program may not meet this requirement. The RONO (rank-order normalization) procedure generates a proxy distribution that retains the rank-order of the original measurements but has a strictly normal distribution. The proxy distribution may then be analyzed by standard power-spectrum analysis. We show by an example that the resulting power spectrum may prove to be quite close to the power spectrum obtained from the original data by a standard procedure, even if the distribution of the original measurements is far from normal. Such a comparison would tend to validate the original analysis.
    Digital ISSN: 1999-4893
    Subject: Computer science
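The rank-order normalization step of entry 32 can be written in a few lines: keep the ranks of the measurements but map them onto normal quantiles. This is one reading of the described procedure, not the authors' code; SciPy is assumed.

```python
# Sketch: rank-order normalization (RONO) as described in the abstract above.
import numpy as np
from scipy import stats

def rank_order_normalize(x):
    ranks = stats.rankdata(x)                  # 1 .. n, ties get average ranks
    quantiles = (ranks - 0.5) / len(x)         # map ranks into the open interval (0, 1)
    return stats.norm.ppf(quantiles)           # strictly Gaussian proxy values

x = np.random.exponential(scale=2.0, size=1000)    # strongly non-Gaussian measurements
proxy = rank_order_normalize(x)

# Rank order is preserved (no ties in continuous data), but the proxy is normal.
assert np.array_equal(np.argsort(x), np.argsort(proxy))
print(round(proxy.mean(), 3), round(proxy.std(), 3))
```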
  • 33
    Publication date: 2020-06-30
    Beschreibung: Toward strong demand for very high-speed I/O for processors, physical performance growth of hardware I/O speed was drastically increased in this decade. However, the recent Big Data applications still demand the larger I/O bandwidth and the lower latency for the speed. Because the current I/O performance does not improve so drastically, it is the time to consider another way to increase it. To overcome this challenge, we focus on lossless data compression technology to decrease the amount of data itself in the data communication path. The recent Big Data applications treat data stream that flows continuously and never allow stalling processing due to the high speed. Therefore, an elegant hardware-based data compression technology is demanded. This paper proposes a novel lossless data compression, called ASE coding. It encodes streaming data by applying the entropy coding approach. ASE coding instantly assigns the fewest bits to the corresponding compressed data according to the number of occupied entries in a look-up table. This paper describes the detailed mechanism of ASE coding. Furthermore, the paper demonstrates performance evaluations to promise that ASE coding adaptively shrinks streaming data and also works on a small amount of hardware resources without stalling or buffering any part of data stream.
    Digital ISSN: 1999-4893
    Subject: Computer science
  • 34
    Publication date: 2020-06-30
    Description: When highly automated driving is realized, the role of the driver will change dramatically. Drivers will even be able to sleep during the drive. However, when awaking from sleep, drivers often experience sleep inertia, meaning they are feeling groggy and are impaired in their driving performance, which can be an issue with the concept of dual-mode vehicles that allow both manual and automated driving. Proactive methods to avoid sleep inertia like the widely applied ‘NASA nap’ are not immediately practicable in automated driving. Therefore, a reactive countermeasure, the sleep inertia counter-procedure for drivers (SICD), has been developed with the aim to activate and motivate the driver as well as to measure the driver’s alertness level. The SICD was evaluated in a study with N = 21 drivers in a highly automated driving simulator. The SICD was able to activate the driver after sleep and was perceived as “assisting” by the drivers. It was not capable of measuring the driver’s alertness level. The interpretation of the findings is limited due to a lack of a comparative baseline condition. Future research is needed on direct comparisons of different countermeasures to sleep inertia that are effective and accepted by drivers.
    Digital ISSN: 2078-2489
    Subject: Computer science
  • 35
    Publication date: 2020-07-01
    Beschreibung: Text annotation is the process of identifying the sense of a textual segment within a given context to a corresponding entity on a concept ontology. As the bag of words paradigm’s limitations become increasingly discernible in modern applications, several information retrieval and artificial intelligence tasks are shifting to semantic representations for addressing the inherent natural language polysemy and homonymy challenges. With extensive application in a broad range of scientific fields, such as digital marketing, bioinformatics, chemical engineering, neuroscience, and social sciences, community detection has attracted great scientific interest. Focusing on linguistics, by aiming to identify groups of densely interconnected subgroups of semantic ontologies, community detection application has proven beneficial in terms of disambiguation improvement and ontology enhancement. In this paper we introduce a novel distributed supervised knowledge-based methodology employing community detection algorithms for text annotation with Wikipedia Entities, establishing the unprecedented concept of community Coherence as a metric for local contextual coherence compatibility. Our experimental evaluation revealed that deeper inference of relatedness and local entity community coherence in the Wikipedia graph bears substantial improvements overall via a focus on accuracy amelioration of less common annotations. The proposed methodology is propitious for wider adoption, attaining robust disambiguation performance.
    Digital ISSN: 1999-4893
    Subject: Computer science
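The community-detection ingredient of entry 35 and a naive "coherence" score over candidate entities can be sketched as follows. The graph is a small built-in toy graph standing in for the Wikipedia entity graph, the modularity-based algorithm is just one possible choice, and the coherence function is a deliberately simplified stand-in for the metric proposed in the paper.

```python
# Sketch: community detection on a small graph plus a naive coherence score
# for a set of candidate annotations (toy graph, not the Wikipedia link graph).
import networkx as nx
from networkx.algorithms import community

G = nx.karate_club_graph()                       # stand-in for an entity graph
communities = community.greedy_modularity_communities(G)
node_to_comm = {n: i for i, c in enumerate(communities) for n in c}

def coherence(candidate_entities):
    """Fraction of candidate entities that fall into the most common community."""
    comms = [node_to_comm[e] for e in candidate_entities]
    return max(comms.count(c) for c in set(comms)) / len(comms)

print(coherence([0, 1, 2, 3]))      # entities from one tight subgroup -> high coherence
print(coherence([0, 33, 5, 24]))    # entities spread across communities -> lower score
```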
  • 36
    Publication date: 2020-07-02
    Beschreibung: The problem posed by complex, articulated or deformable objects has been at the focus of much tracking research for a considerable length of time. However, it remains a major challenge, fraught with numerous difficulties. The increased ubiquity of technology in all realms of our society has made the need for effective solutions all the more urgent. In this article, we describe a novel method which systematically addresses the aforementioned difficulties and in practice outperforms the state of the art. Global spatial flexibility and robustness to deformations are achieved by adopting a pictorial structure based geometric model, and localized appearance changes by a subspace based model of part appearance underlain by a gradient based representation. In addition to one-off learning of both the geometric constraints and part appearances, we introduce a continuing learning framework which implements information discounting i.e., the discarding of historical appearances in favour of the more recent ones. Moreover, as a means of ensuring robustness to transient occlusions (including self-occlusions), we propose a solution for detecting unlikely appearance changes which allows for unreliable data to be rejected. A comprehensive evaluation of the proposed method, the analysis and discussing of findings, and a comparison with several state-of-the-art methods demonstrates the major superiority of our algorithm.
    Digital ISSN: 2313-433X
    Subject: Computer science
  • 37
    Publication date: 2020-07-02
    Beschreibung: Image fusion is a process that integrates similar types of images collected from heterogeneous sources into one image in which the information is more definite and certain. Hence, the resultant image is anticipated as more explanatory and enlightening both for human and machine perception. Different image combination methods have been presented to consolidate significant data from a collection of images into one image. As a result of its applications and advantages in variety of fields such as remote sensing, surveillance, and medical imaging, it is significant to comprehend image fusion algorithms and have a comparative study on them. This paper presents a review of the present state-of-the-art and well-known image fusion techniques. The performance of each algorithm is assessed qualitatively and quantitatively on two benchmark multi-focus image datasets. We also produce a multi-focus image fusion dataset by collecting the widely used test images in different studies. The quantitative evaluation of fusion results is performed using a set of image fusion quality assessment metrics. The performance is also evaluated using different statistical measures. Another contribution of this paper is the proposal of a multi-focus image fusion library, to the best of our knowledge, no such library exists so far. The library provides implementation of numerous state-of-the-art image fusion algorithms and is made available publicly at project website.
    Digital ISSN: 2313-433X
    Subject: Computer science
  • 38
    Publication date: 2020-07-02
    Description: Fitness and physical exercise are preferred in the pursuit of healthier and active lifestyles. The number of mobile applications aiming to replace or complement a personal trainer is increasing. However, this also raises questions about the reliability, integrity, and even safety of the information provided by such applications. In this study, we review mobile applications that serve as virtual personal trainers. We present a systematic review of 36 related mobile applications, updated between 2017 and 2020, classifying them according to their characteristics. The selection criteria consider the following combination of keywords: “workout”, “personal trainer”, “physical activity”, “fitness”, “gymnasium”, and “daily plan”. Based on the analysis of the identified mobile applications, we propose a new taxonomy and present detailed guidelines on creating mobile applications for personalised workouts. Finally, we investigated how mobile applications can promote the health and well-being of users and whether the identified applications are used in any scientific studies.
    Digital ISSN: 2078-2489
    Subject: Computer science
  • 39
    Published by Molecular Diversity Preservation International
    Publication date: 2020-08-31
    Beschreibung: Text similarity measurement is the basis of natural language processing tasks, which play an important role in information retrieval, automatic question answering, machine translation, dialogue systems, and document matching. This paper systematically combs the research status of similarity measurement, analyzes the advantages and disadvantages of current methods, develops a more comprehensive classification description system of text similarity measurement algorithms, and summarizes the future development direction. With the aim of providing reference for related research and application, the text similarity measurement method is described by two aspects: text distance and text representation. The text distance can be divided into length distance, distribution distance, and semantic distance; text representation is divided into string-based, corpus-based, single-semantic text, multi-semantic text, and graph-structure-based representation. Finally, the development of text similarity is also summarized in the discussion section.
    Digital ISSN: 2078-2489
    Subject: Computer science
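Two of the families surveyed in entry 39, a string-based distance and a representation-based similarity, can be sketched briefly: Levenshtein edit distance and TF-IDF cosine similarity. scikit-learn is assumed, and the documents are toy examples.

```python
# Sketch: one string-based and one representation-based text similarity measure.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def levenshtein(a, b):
    """Classic edit distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

docs = ["machine translation of literary texts",
        "literary text translated by machines"]

print(levenshtein(docs[0], docs[1]))                # string/length distance
tfidf = TfidfVectorizer().fit_transform(docs)
print(cosine_similarity(tfidf[0], tfidf[1])[0, 0])  # corpus-based semantic proxy
```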
  • 40
    Publication date: 2020-06-30
    Beschreibung: Partially automated driving (PAD, Society of Automotive Engineers (SAE) level 2) features provide steering and brake/acceleration support, while the driver must constantly supervise the support feature and intervene if needed to maintain safety. PAD could potentially increase comfort, road safety, and traffic efficiency. As during manual driving, users might engage in non-driving related tasks (NDRTs). However, studies systematically examining NDRT execution during PAD are rare and most importantly, no established methodologies to systematically evaluate driver distraction during PAD currently exist. The current project’s goal was to take the initial steps towards developing a test protocol for systematically evaluating NDRT’s effects during PAD. The methodologies used for manual driving were extended to PAD. Two generic take-over situations addressing system limits of a given PAD regarding longitudinal and lateral control were implemented to evaluate drivers’ supervisory and take-over capabilities while engaging in different NDRTs (e.g., manual radio tuning task). The test protocol was evaluated and refined across the three studies (two simulator and one test track). The results indicate that the methodology could sensitively detect differences between the NDRTs’ influences on drivers’ take-over and especially supervisory capabilities. Recommendations were formulated regarding the test protocol’s use in future studies examining the effects of NDRTs during PAD.
    Digital ISSN: 2078-2489
    Subject: Computer science
  • 41
    Publication date: 2020-06-30
    Beschreibung: This research concerns the application of micro X-ray fluorescence (µXRF) mapping to the investigation of a group of selected metal objects from the archaeological site of Ferento, a Roman and then medieval town in Central Italy. Specifically, attention was focused on two test pits, named IV and V, in which metal objects were found, mainly pertaining to the medieval period and never investigated before the present work from a compositional point of view. The potentiality of µXRF mapping was tested through a Bruker Tornado M4 equipped with an Rh tube, operating at 50 kV, 500 μA, and spot 25 μm obtained with polycapillary optics. Principal component analysis (PCA) and multivariate curve resolution (MCR) were used for processing the X-ray fluorescence spectra. The results showed that the investigated items are characterized by different compositions in terms of chemical elements. Three little wheels are made of lead, while the fibulae are made of copper-based alloys with varying amounts of tin, zinc, and lead. Only one ring is iron-based, and the other objects, namely a spatula and an applique, are also made of copper-based alloys, but with different relative amounts of the main elements. In two objects, traces of gold were found, suggesting the precious character of these pieces. MCR analysis was demonstrated to be particularly useful to confirm the presence of trace elements, such as gold, as it could differentiate the signals related to minor elements from those due to major chemical elements.
    Digitale ISSN: 2313-433X
    Thema: Informatik
  • 42
    Publikationsdatum: 2020-06-30
    Beschreibung: Geomechanical modelling of the processes associated with the exploitation of subsurface resources, such as land subsidence or triggered/induced seismicity, is a common practice of major interest. The reliability of the predictions depends on different sources of uncertainty, such as the parameterization of the constitutive model characterizing the behaviour of the deep rock. In this study, we focus on a Sobol'-based sensitivity analysis and on uncertainty reduction via assimilation of land deformations. A synthetic test case on a deep hydrocarbon reservoir is considered, where land settlements are predicted with the aid of a 3-D Finite Element (FE) model. Data assimilation is performed via the Ensemble Smoother (ES) technique and its variant, the Ensemble Smoother with Multiple Data Assimilation (ES-MDA). However, ES convergence is only guaranteed with a large number of Monte Carlo (MC) simulations, which may be computationally infeasible for large-scale and complex systems. For this reason, a surrogate model based on the generalized Polynomial Chaos Expansion (gPCE) is proposed as an approximation of the forward problem. This approach allows the Sobol' indices for the sensitivity analysis to be computed efficiently and greatly reduces the computational cost of the original ES and ES-MDA formulations, while also enhancing the accuracy of the overall prediction process.
    Digitale ISSN: 1999-4893
    Thema: Informatik
  • 43
    Publikationsdatum: 2020-06-30
    Beschreibung: Prior research found that user personality significantly affects technology acceptance perceptions and decisions. Yet, evidence on the moderating influence of user gender on the relationship between personality and technology acceptance is barely existent despite theoretical consideration. Considering this research gap, the present study reports the results of a survey in which we examined the relationships between personality and technology acceptance from a gender perspective. This study draws upon a sample of N = 686 participants (n = 209 men, n = 477 women) and applied the HEXACO Personality Inventory—Revised along with established technology acceptance measures. The major result of this study is that we do not find significant influence of user gender on the relationship between personality and technology acceptance, except for one aspect of personality, namely altruism. We found a negative association between altruism and intention to use the smartphone in men, but a positive association in women. Consistent with this finding, we also found the same association pattern for altruism and predicted usage: a negative one in men and a positive one in women. Implications for research and practice are discussed, along with limitations of the present study and possible avenues for future research.
    Digitale ISSN: 1999-5903
    Thema: Informatik
  • 44
    Publikationsdatum: 2020-06-30
    Beschreibung: Clustering is an unsupervised machine learning technique with many practical applications that has gathered extensive research interest. Aside from deterministic and probabilistic techniques, fuzzy C-means clustering (FCM) is a common clustering technique. Since the advent of the FCM method, many improvements have been made to increase clustering efficiency. These improvements focus on adjusting the membership representation of elements in the clusters, on fuzzification and defuzzification techniques, and on the distance function between elements. This study proposes a novel fuzzy clustering algorithm that uses a different fuzzification coefficient for each data sample, depending on the sample's characteristics. The proposed method follows calculation steps similar to FCM with some modifications, and the update formulas are derived so as to ensure convergence. The main contribution of this approach is the use of multiple fuzzification coefficients, as opposed to the single coefficient of the original FCM algorithm; a hedged sketch of this idea follows this record. The new algorithm is evaluated on several common datasets, and the results show that it is more efficient than the original FCM as well as other clustering methods.
    Digitale ISSN: 1999-4893
    Thema: Informatik
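The following is a hedged Python sketch of the core idea, fuzzy C-means with a per-sample fuzzification coefficient m[k]. The update rules shown are a plausible generalization of standard FCM and are not necessarily the paper's exact derivation.

```python
# Minimal sketch (not the authors' exact derivation): fuzzy C-means where each
# sample k has its own fuzzification coefficient m[k] instead of a single global m.
import numpy as np

def fcm_multi_m(X, c, m, n_iter=100, eps=1e-6, seed=0):
    """X: (n, d) data, c: number of clusters, m: (n,) per-sample fuzzifiers (> 1)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((c, n))
    U /= U.sum(axis=0)                              # memberships sum to 1 per sample
    for _ in range(n_iter):
        Um = U ** m                                  # broadcast per-sample exponents
        V = (Um @ X) / Um.sum(axis=1, keepdims=True) # cluster centres
        D = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2) + 1e-12
        expo = 2.0 / (m - 1.0)                       # per-sample exponent
        U_new = 1.0 / np.sum((D[:, None, :] / D[None, :, :]) ** expo, axis=1)
        if np.abs(U_new - U).max() < eps:
            U = U_new
            break
        U = U_new
    return U, V

if __name__ == "__main__":
    X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5])
    m = np.full(len(X), 2.0)
    m[:50] = 1.8                                     # example: two different coefficients
    U, V = fcm_multi_m(X, c=2, m=m)
    print(V.round(2))
```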
  • 45
    Publikationsdatum: 2020-07-02
    Beschreibung: An accurate estimate of the number of passengers in each metro car helps to coordinate and distribute crowds of passengers safely within a metro station. In this work we propose a multi-head Convolutional Neural Network (CNN) architecture trained to estimate passenger attendance in a metro car. The proposed network consists of two main parts: a convolutional backbone, which extracts features over the whole input image, and multi-head layers that estimate a density map, which is needed to predict the number of people in the crowd image (an illustrative sketch of reading a count off a density map follows this record). The network's performance is first evaluated on publicly available crowd counting datasets, including ShanghaiTech part_A, ShanghaiTech part_B and UCF_CC_50, and the network is then trained and tested on our dataset acquired in subway cars in Italy. In both cases a comparison is made against the most relevant and recent state-of-the-art crowd counting architectures, showing that our proposed MH-MetroNet architecture performs best in terms of Mean Absolute Error (MAE), Mean Square Error (MSE) and passenger-count prediction.
    Digitale ISSN: 2313-433X
    Thema: Informatik
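As a small illustration of the density-map idea mentioned above (and not of MH-MetroNet itself), the sketch below builds a synthetic density map in which each person contributes unit mass and reads the crowd count off as its sum.

```python
# Illustrative only: how a crowd count is read off a predicted density map.
# The density map here is synthetic; MH-MetroNet itself is not reproduced.
import numpy as np

def count_from_density_map(density_map: np.ndarray) -> float:
    """Each person contributes a unit of mass to the map, so the
    estimated head count is simply the sum (integral) over all pixels."""
    return float(density_map.sum())

if __name__ == "__main__":
    h, w = 60, 80
    density = np.zeros((h, w))
    rng = np.random.default_rng(1)
    yy, xx = np.mgrid[0:h, 0:w]
    for y, x in rng.integers(0, (h, w), size=(12, 2)):   # 12 synthetic "heads"
        g = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * 3.0 ** 2))
        density += g / g.sum()                           # unit mass per head
    print(round(count_from_density_map(density), 2))     # ~= 12
```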
  • 46
    Publikationsdatum: 2020-07-03
    Beschreibung: For imaging events of extremely short duration, like shock waves or explosions, it is necessary to be able to image the object with a single-shot exposure. A suitable setup is given by a laser-induced X-ray source such as the one that can be found at GSI (Helmholtzzentrum für Schwerionenforschung GmbH) in Darmstadt (Society for Heavy Ion Research), Germany. There, it is possible to direct a pulse from the high-energy laser Petawatt High Energy Laser for Heavy Ion eXperiments (PHELIX) on a tungsten wire to generate a picosecond polychromatic X-ray pulse, called backlighter. For grating-based single-shot phase-contrast imaging of shock waves or exploding wires, it is important to know the weighted mean energy of the X-ray spectrum for choosing a suitable setup. In propagation-based phase-contrast imaging the knowledge of the weighted mean energy is necessary to be able to reconstruct quantitative phase images of unknown objects. Hence, we developed a method to evaluate the weighted mean energy of the X-ray backlighter spectrum using propagation-based phase-contrast images. In a first step wave-field simulations are performed to verify the results. Furthermore, our evaluation is cross-checked with monochromatic synchrotron measurements with known energy at Diamond Light Source (DLS, Didcot, UK) for proof of concepts.
    Digitale ISSN: 2313-433X
    Thema: Informatik
  • 47
    Publikationsdatum: 2020-07-02
    Beschreibung: The number of Internet of Things (IoT) devices in smart homes is growing at a fast pace, producing large amounts of data that are mostly transferred over wireless communication channels. However, various IoT devices are vulnerable to different threats, such as cyber-attacks, fluctuating network connections, and leakage of information. Statistical analysis and machine learning can play a vital role in detecting anomalies in these data and thereby enhance the security of the smart home IoT system, which is the goal of this paper. This paper investigates the trustworthiness of the IoT devices sending house appliances' readings, with the help of parameters such as feature importance, root mean square error, and hyper-parameter tuning. The algorithm assigns a spamicity score to each IoT device, based on the feature importance and the root mean square error scores of the machine learning models, to determine the trustworthiness of the device in the home network. A publicly available smart home dataset, which includes weather conditions, is used to validate the methodology. The proposed algorithm is used to compute the spamicity score of the connected IoT devices in the network, and the obtained results illustrate its efficacy in analysing time series data from IoT devices for spam detection.
    Digitale ISSN: 2078-2489
    Thema: Informatik
  • 48
    Publikationsdatum: 2020-07-02
    Beschreibung: Humans are capable of learning new concepts from small numbers of examples. In contrast, supervised deep learning models usually lack the ability to extract reliable predictive rules from limited data when attempting to classify new examples. This challenging scenario is commonly known as few-shot learning. Few-shot learning has garnered increased attention in recent years due to its significance for many real-world problems. Recently, new methods relying on meta-learning paradigms combined with graph-based structures, which model the relationship between examples, have shown promising results on a variety of few-shot classification tasks. However, existing work on few-shot learning focuses only on the feature embeddings produced by the last layer of the neural network. The novel contribution of this paper is the utilization of lower-level information to improve meta-learner performance in few-shot learning. In particular, we propose the Looking-Back method, which uses lower-level information to construct additional graphs for label propagation in limited data settings. Our experiments on two popular few-shot learning datasets, miniImageNet and tieredImageNet, show that our method can utilize lower-level information in the network to improve state-of-the-art classification performance.
    Digitale ISSN: 2078-2489
    Thema: Informatik
  • 49
    Publikationsdatum: 2020-07-06
    Beschreibung: Virtual worlds have become global platforms connecting millions of people and containing various technologies. For example, No Man’s Sky (nomanssky.com), a cross-platform virtual world, can dynamically and automatically generate content with the progress of user adventure. AltspaceVR (altvr.com) is a social virtual reality platform supporting motion capture through Microsoft’s Kinect, eye tracking, and mixed reality extension. The changes in industrial investment, market revenue, user population, and consumption drive the evolution of virtual-world-related technologies (e.g., computing infrastructure and interaction devices), which turns into new design requirements and thus results in the requirement satisfaction problem in virtual world system architecture design. In this paper, we first study the new or evolving features of virtual worlds and emerging requirements of system development through market/industry trend analysis, including infrastructure mobility, content diversity, function interconnectivity, immersive environment, and intelligent agents. Based on the trend analysis, we propose a new design requirement space. We, then, discuss the requirement satisfaction of existing system architectures and highlight their limitations through a literature review. The feature-based requirement satisfaction comparison of existing system architectures sheds some light on the future virtual world system development to match the changing trends of the user market. At the end of this study, a new architecture from an ongoing research, called Virtual Net, is discussed, which can provide higher resource sufficiency, computing reliability, content persistency, and service credibility.
    Digitale ISSN: 1999-5903
    Thema: Informatik
  • 50
    Publikationsdatum: 2020-07-06
    Beschreibung: With the rise of partially automated cars, drivers are more and more required to judge the degree of responsibility that can be delegated to vehicle assistant systems. This can be supported by utilizing interfaces that intuitively convey real-time reliabilities of system functions such as environment sensing. We designed a vibrotactile interface that communicates spatiotemporal information about surrounding vehicles and encodes a representation of spatial uncertainty in a novel way. We evaluated this interface in a driving simulator experiment with high and low levels of human and machine confidence respectively caused by simulated degraded vehicle sensor precision and limited human visibility range. Thereby we were interested in whether drivers (i) could perceive and understand the vibrotactile encoding of spatial uncertainty, (ii) would subjectively benefit from the encoded information, (iii) would be disturbed in cases of information redundancy, and (iv) would gain objective safety benefits from the encoded information. To measure subjective understanding and benefit, a custom questionnaire, Van der Laan acceptance ratings and NASA TLX scores were used. To measure the objective benefit, we computed the minimum time-to-contact as a measure of safety and gaze distributions as an indicator for attention guidance. Results indicate that participants were able to understand the encoded uncertainty and spatiotemporal information and purposefully utilized it when needed. The tactile interface provided meaningful support despite sensory restrictions. By encoding spatial uncertainties, it successfully extended the operating range of the assistance system.
    Digitale ISSN: 2078-2489
    Thema: Informatik
  • 51
    Publikationsdatum: 2020-07-03
    Beschreibung: The COVID-19 pandemic exploded at the beginning of 2020, with over four million cases in five months, overwhelming the healthcare sector. Several national governments decided to adopt containment measures, such as lockdowns, social distancing, and quarantine. Among these measures, contact tracing can contribute to bringing the outbreak under control, as quickly identifying contacts and isolating suspected cases can limit the number of infected people. In this paper we present BubbleBox, a system relying on a dedicated device to perform contact tracing. BubbleBox integrates Internet of Things and software technologies into different components to achieve its goal: providing a tool to react quickly to further outbreaks by allowing health operators to rapidly reach and test possibly infected people. This paper describes the BubbleBox architecture, presents its prototype implementation, and discusses its pros and cons, including privacy concerns.
    Digitale ISSN: 2078-2489
    Thema: Informatik
  • 52
    Publikationsdatum: 2020-07-05
    Beschreibung: Variation, adaptation, heredity and fitness, constraints and affordances, speciation, and extinction form the building blocks of the (Neo-)Darwinian research program, and several of these have been called “Darwinian principles”. Here, we suggest that caution should be taken in calling these principles Darwinian because of the important role played by reticulate evolutionary mechanisms and processes in also bringing about these phenomena. Reticulate mechanisms and processes include symbiosis, symbiogenesis, lateral gene transfer, infective heredity mediated by genetic and organismal mobility, and hybridization. Because the “Darwinian principles” are brought about by both vertical and reticulate evolutionary mechanisms and processes, they should be understood as foundational for a more pluralistic theory of evolution, one that surpasses the classic scope of the Modern and the Neo-Darwinian Synthesis. Reticulate evolution moreover demonstrates that what conventional (Neo-)Darwinian theories treat as intra-species features of evolution frequently involve reticulate interactions between organisms from very different taxonomic categories. Variation, adaptation, heredity and fitness, constraints and affordances, speciation, and extinction therefore cannot be understood as “traits” or “properties” of genes, organisms, species, or ecosystems because the phenomena are irreducible to specific units and levels of an evolutionary hierarchy. Instead, these general principles of evolution need to be understood as common goods that come about through interactions between different units and levels of evolutionary hierarchies, and they are exherent rather than inherent properties of individuals.
    Digitale ISSN: 2078-2489
    Thema: Informatik
  • 53
    Publikationsdatum: 2020-07-04
    Beschreibung: This paper presents an experiment on newsreaders' behavior and preferences regarding interaction with online personalized news. Different recommendation approaches, based on consumption profiles and user location, and the impact of personalized news on several aspects of consumer decision-making were examined with a group of volunteers. Results show a significant preference for reading recommended news over other news presented on the screen, regardless of the chosen editorial layout. In addition, the study also provides support for the creation of profiles that take into consideration the evolution of users' interests. The proposed solution is valid for users with different reading habits and can be successfully applied even to users with a small consumption history. Our findings can be used by news providers to improve online services, thus increasing readers' perceived satisfaction.
    Digitale ISSN: 2078-2489
    Thema: Informatik
  • 54
    Publikationsdatum: 2020-07-06
    Beschreibung: Many industries today struggle with the early identification of quality issues, given the shortening of product design cycles and the desire to decrease production costs, coupled with customers' requirements for high uptime. The vehicle industry is no exception, as breakdowns often lead to on-road stops and delays in delivery missions. In this paper we consider quality issues to be unexpected increases in the failure rate of a particular component; these are particularly problematic for original equipment manufacturers (OEMs) since they lead to unplanned costs and can significantly affect brand value. We propose a new approach towards the early detection of quality issues using machine learning (ML) to forecast the failures of a given component across a large population of units. In this study, we combine the usage information of vehicles with the records of their failures. The former is collected continuously, as the usage statistics are transmitted over telematics connections; the latter is based on invoice and warranty information collected in the workshops. We compare two different ML approaches: the first is an auto-regression model of the failure ratios for vehicles based on past information, while the second aggregates individual vehicle failure predictions based on each vehicle's usage (both formulations are sketched after this record). We present experimental evaluations on real data captured from heavy-duty trucks, demonstrating that these two formulations have complementary strengths and weaknesses; in particular, they can outperform each other given different volumes of data. The classification approach surpasses the regression model whenever enough data are available, i.e., once the vehicles have been in service for a longer time. On the other hand, the regression shows better predictive performance with a smaller amount of data, i.e., for vehicles that have been deployed recently.
    Digitale ISSN: 2078-2489
    Thema: Informatik
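Below is a hedged sketch of the two formulations compared in the paper, with placeholder data and deliberately simple models (lag-2 least squares for the auto-regression, and a plain average of per-vehicle failure probabilities for the aggregation); the authors' actual models are not reproduced.

```python
# Hedged sketch of the two formulations described above; the data and the model
# choices are illustrative placeholders, not the paper's exact setup.
import numpy as np

def autoregress_failure_ratio(ratios, lags=2):
    """Fit ratio_t ~ a0 + a1*ratio_{t-1} + ... by least squares and
    return the one-step-ahead forecast."""
    y = ratios[lags:]
    X = np.column_stack([np.ones(len(y))] +
                        [ratios[lags - k - 1:len(ratios) - k - 1] for k in range(lags)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    last = np.concatenate(([1.0], ratios[-1:-lags - 1:-1]))
    return float(last @ coef)

def aggregate_individual_predictions(per_vehicle_failure_probs):
    """Average per-vehicle failure probabilities to obtain the expected
    failure ratio across the fleet."""
    p = np.asarray(per_vehicle_failure_probs)
    return float(p.sum() / len(p))

if __name__ == "__main__":
    hist = np.array([0.010, 0.012, 0.011, 0.015, 0.016, 0.018])
    print(autoregress_failure_ratio(hist))
    print(aggregate_individual_predictions([0.02, 0.01, 0.05, 0.00, 0.03]))
```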
  • 55
    Publikationsdatum: 2020-07-03
    Beschreibung: Business processes evolve over time to adapt to changing business environments. This requires continuous monitoring of business processes to gain insights into whether they conform to the intended design or deviate from it. The situation when a business process changes while being analysed is denoted as Concept Drift. Its analysis is concerned with studying how a business process changes, in terms of detecting and localising changes and studying the effects of the latter. Concept drift analysis is crucial to enable early detection and management of changes, that is, whether to promote a change to become part of an improved process, or to reject the change and make decisions to mitigate its effects. Despite its importance, there exists no comprehensive framework for analysing concept drift types, affected process perspectives, and granularity levels of a business process. This article proposes the CONcept Drift Analysis in Process Mining (CONDA-PM) framework describing phases and requirements of a concept drift analysis approach. CONDA-PM was derived from a Systematic Literature Review (SLR) of current approaches analysing concept drift. We apply the CONDA-PM framework on current approaches to concept drift analysis and evaluate their maturity. Applying CONDA-PM framework highlights areas where research is needed to complement existing efforts.
    Digitale ISSN: 1999-4893
    Thema: Informatik
  • 56
    Publikationsdatum: 2020-04-14
    Beschreibung: Let P be a set of n points in R^d, let k ≥ 1 be an integer, and let ε ∈ (0, 1) be a constant. An ε-coreset is a subset C ⊆ P with appropriate non-negative weights (scalars) that approximates any given set Q ⊆ R^d of k centers: the sum of squared distances from every point in P to its closest point in Q is the same, up to a factor of 1 ± ε, as the corresponding weighted sum over C for the same k centers (this defining property is illustrated in the sketch after this record). If the coreset is small, we can solve problems such as k-means clustering or its variants (e.g., discrete k-means, where the centers are restricted to be in P or in other restricted zones) on the small coreset to obtain faster provable approximations. Moreover, it is known that such coresets support streaming, dynamic and distributed data using the classic merge-and-reduce trees. The fact that the coreset is a subset implies that it preserves the sparsity of the data. However, existing coresets of this kind are randomized and their size has at least a linear dependency on the dimension d. We suggest the first such coreset whose size is independent of d. This is also the first deterministic coreset construction whose resulting size is not exponential in d. Extensive experimental results and benchmarks are provided on public datasets, including the first coreset of the English Wikipedia, computed using Amazon's cloud.
    Digitale ISSN: 1999-4893
    Thema: Informatik
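The defining ε-coreset property can be written down directly as a check. In the sketch below the candidate coreset is a naive uniform sample with rescaled weights, used only to exercise the definition; it is not the deterministic construction proposed in the paper.

```python
# The epsilon-coreset property from the abstract, written as a check.
# The "coreset" below is a naive uniform sample with rescaled weights,
# used only to exercise the definition, NOT the paper's construction.
import numpy as np

def sq_dist_cost(P, Q, w=None):
    """Sum of (weighted) squared distances from each point in P to its nearest center in Q."""
    d2 = ((P[:, None, :] - Q[None, :, :]) ** 2).sum(axis=2).min(axis=1)
    return float(d2 @ (np.ones(len(P)) if w is None else w))

def is_eps_coreset(P, C, w, Q, eps):
    """True if the weighted cost of (C, w) is within a (1 +/- eps) factor
    of the cost of the full set P, for the given centers Q."""
    full, approx = sq_dist_cost(P, Q), sq_dist_cost(C, Q, w)
    return (1 - eps) * full <= approx <= (1 + eps) * full

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    P = rng.normal(size=(2000, 3))
    idx = rng.choice(len(P), size=400, replace=False)
    C, w = P[idx], np.full(400, len(P) / 400)        # uniform sample, rescaled weights
    Q = rng.normal(size=(5, 3))                      # an arbitrary set of k = 5 centers
    print(is_eps_coreset(P, C, w, Q, eps=0.1))
```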
  • 57
    Publikationsdatum: 2019
    Beschreibung: Over the years, the cellular mobile network has evolved from a wireless plain telephone system to a very complex system providing telephone service, Internet connectivity and many interworking capabilities with other networks. Its air interface performance has increased drastically over time, leading to high throughput and low latency. Changes to the core network, however, have been slow and incremental, with increased complexity worsened by the necessity of backwards-compatibility with older-generation systems such as the Global System for Mobile communication (GSM). In this paper, a new virtualized Peer-to-Peer (P2P) core network architecture is presented. The key idea of our approach is that each user is assigned a private virtualized copy of the whole core network. This enables a higher degree of security and novel services that are not possible in today’s architecture. We describe the new architecture, focusing on its main elements, IP addressing, message flows, mobility management, and scalability. Furthermore, we will show some significant advantages this new architecture introduces. Finally, we investigate the performance of our architecture by analyzing voice-call traffic available in a database of a large U.S. cellular network provider.
    Digitale ISSN: 1999-5903
    Thema: Informatik
    Publiziert von MDPI
  • 58
    Publikationsdatum: 2019
    Beschreibung: The ongoing digital transformation has the potential to revolutionize nearly all industrial manufacturing processes. However, its concrete requirements and implications are still not sufficiently investigated. In order to establish a common understanding, a multitude of initiatives have published guidelines, reference frameworks and specifications, all intending to promote their particular interpretation of the Industrial Internet of Things (IIoT). As a result of the inconsistent use of terminology, heterogeneous structures and proposed processes, an opaque landscape has been created. The consequence is that both new users and experienced experts can hardly manage to get an overview of the amount of information and publications, and make decisions on what is best to use and to adopt. This work contributes to the state of the art by providing a structured analysis of existing reference frameworks, their classifications and the concerns they target. We supply alignments of shared concepts, identify gaps and give a structured mapping of regarded concerns at each part of the respective reference architectures. Furthermore, the linking of relevant industry standards and technologies to the architectures allows a more effective search for specifications and guidelines and supports the direct technology adoption.
    Digitale ISSN: 1999-5903
    Thema: Informatik
    Publiziert von MDPI
  • 59
    Publikationsdatum: 2019
    Beschreibung: Service recommendation is one of the important means of service selection. Traditional trust-based Web service recommendation methods tend to ignore the influence of typical data sources, such as service information and interaction logs, on the similarity calculation of user preferences, and they give insufficient consideration to dynamic trust relationships. To address these problems, a novel approach for Web service recommendation based on advanced trust relationships is presented. After considering the influence of indirect trust paths, an improved calculation of the indirect trust degree is proposed. By quantifying the popularity of a service, a method for calculating user preference similarity is investigated. Furthermore, a dynamic trust adjustment mechanism is designed by differentiating the effect of each service recommendation. Integrating these efforts, a service recommendation mechanism is introduced, in which a new service recommendation algorithm is described. Experimental results show that, compared with existing methods, the proposed approach not only achieves higher recommendation accuracy, but also resists attacks from malicious users more effectively.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
  • 60
    Publikationsdatum: 2019
    Beschreibung: We explore the class of positive integers n that admit idempotent factorizations n = p̄q̄ such that λ(n) ∣ (p̄ − 1)(q̄ − 1), where λ is the Carmichael lambda function. Idempotent factorizations with p̄ and q̄ prime have received the most attention due to their cryptographic advantages, but there are infinitely many n with idempotent factorizations containing composite p̄ and/or q̄. Idempotent factorizations are exactly those p̄ and q̄ that generate correctly functioning keys in the Rivest–Shamir–Adleman (RSA) 2-prime protocol with n as the modulus. While the resulting p̄ and q̄ have no cryptographic utility and therefore should never be employed in that capacity, idempotent factorizations warrant study in their own right as they live at the intersection of multiple hard problems in computer science and number theory. We present some analytical results here. We also demonstrate the existence of maximally idempotent integers, those n for which all bipartite factorizations are idempotent. We show how to construct them and present preliminary results on their distribution. (A small check of the defining divisibility condition follows this record.)
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
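A small, self-contained check of the defining condition λ(n) | (p̄ − 1)(q̄ − 1), with Carmichael's λ computed by trial-division factorization, so it is only suitable for small illustrative n.

```python
# Check of the defining condition lambda(n) | (p-1)(q-1) for a factorization n = p*q.
# Carmichael's lambda is computed via trial-division factorisation (small n only).
from math import gcd

def factorize(n):
    """Prime factorization by trial division: {prime: exponent}."""
    f, d = {}, 2
    while d * d <= n:
        while n % d == 0:
            f[d] = f.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        f[n] = f.get(n, 0) + 1
    return f

def carmichael_lambda(n):
    """Carmichael's lambda as the lcm of lambda over the prime-power factors."""
    def lam_pp(p, k):
        if p == 2 and k >= 3:
            return 2 ** (k - 2)
        return (p - 1) * p ** (k - 1)
    result = 1
    for p, k in factorize(n).items():
        v = lam_pp(p, k)
        result = result * v // gcd(result, v)   # lcm
    return result

def is_idempotent_factorization(p_bar, q_bar):
    n = p_bar * q_bar
    return (p_bar - 1) * (q_bar - 1) % carmichael_lambda(n) == 0

if __name__ == "__main__":
    print(is_idempotent_factorization(7, 13))    # prime factors: RSA-style modulus
    print(is_idempotent_factorization(15, 7))    # a composite factor is allowed by the definition
```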
  • 61
    Publikationsdatum: 2019
    Beschreibung: Google's Material Design, created in 2014, led to the extended application of floating action buttons (FAB) in the user interfaces of web pages and mobile applications. A FAB's role is to trigger an activity either on the current screen or to perform an activity that opens another screen. Some specialists in user experience (UX) and user interface (UI) design are sceptical about the usability of the FAB in the interfaces of both web pages and mobile applications. They claim that the FAB easily distracts users, that it interferes with the use of other important functions of an application, and that it is unusable in applications designed for iOS. The aim of this paper is to investigate, through an experiment, the quality of experience (QoE) of static and animated FABs and to compare them with the toolbar alternative. The experimental results obtained with different testing methods rejected the hypothesis that the use and animation of this UI element have a positive influence on application usability. However, its static and animated use enhanced the ratings of hedonic and aesthetic features of the user experience, justifying the use of this type of button.
    Digitale ISSN: 1999-5903
    Thema: Informatik
    Publiziert von MDPI
  • 62
    Publikationsdatum: 2019
    Beschreibung: Recommender systems are nowadays an indispensable part of most personalized systems implementing information access and content delivery, supporting a great variety of user activities [...]
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
  • 63
    Publikationsdatum: 2019
    Beschreibung: Finite element data form an important basis for engineers to undertake analysis and research. In most cases, it is difficult to generate internal sections of finite element data, and professional operations are required. To display the internal data of entities, this paper proposes a method for generating arbitrary sections of finite element data based on radial basis function (RBF) interpolation (a minimal sketch of such interpolation follows this record). The RBF interpolation function is used to realize arbitrary surface cutting of the entity, and the section is generated by triangulation of the discrete tangent points. Experimental studies have shown that the method conveniently allows users to obtain visualization results for an arbitrary section through simple and intuitive interactions.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
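A minimal sketch of scattered-data RBF interpolation with a Gaussian kernel, the generic building block behind the method described above; the surface-cutting and triangulation steps are not reproduced, and the kernel width and ridge term are illustrative choices.

```python
# Minimal sketch of scattered-data Gaussian-RBF interpolation; the surface
# cutting and triangulation steps of the paper are omitted.
import numpy as np

def rbf_fit(points, values, epsilon=10.0):
    """Solve for the weights of a Gaussian-RBF interpolant through (points, values)."""
    r2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
    A = np.exp(-epsilon * r2) + 1e-9 * np.eye(len(points))   # tiny ridge for stability
    return np.linalg.solve(A, values)

def rbf_eval(points, weights, query, epsilon=10.0):
    """Evaluate the fitted interpolant at the query points."""
    r2 = ((query[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-epsilon * r2) @ weights

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.uniform(0, 1, size=(100, 2))
    vals = np.sin(4 * pts[:, 0]) * np.cos(3 * pts[:, 1])      # a smooth scalar field
    w = rbf_fit(pts, vals)
    q = np.array([[0.5, 0.5], [0.2, 0.8]])
    exact = np.sin(4 * q[:, 0]) * np.cos(3 * q[:, 1])
    print(rbf_eval(pts, w, q).round(3), exact.round(3))       # interpolated vs. exact
```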
  • 64
    Publikationsdatum: 2019
    Beschreibung: The number of documents published on the Web in languages other than English grows every year. As a consequence, the need to extract useful information from different languages increases, highlighting the importance of research into Open Information Extraction (OIE) techniques. Different OIE methods have dealt with features from a unique language; however, few approaches tackle multilingual aspects. In those approaches, multilingualism is restricted to processing text in different languages, rather than exploring cross-linguistic resources, which results in low precision due to the use of general rules. Multilingual methods have been applied to numerous problems in Natural Language Processing, achieving satisfactory results and demonstrating that knowledge acquisition for a language can be transferred to other languages to improve the quality of the facts extracted. We argue that a multilingual approach can enhance OIE methods as it is ideal to evaluate and compare OIE systems, and therefore can be applied to the collected facts. In this work, we discuss how the transfer knowledge between languages can increase acquisition from multilingual approaches. We provide a roadmap of the Multilingual Open IE area concerning state of the art studies. Additionally, we evaluate the transfer of knowledge to improve the quality of the facts extracted in each language. Moreover, we discuss the importance of a parallel corpus to evaluate and compare multilingual systems.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
  • 65
    Publikationsdatum: 2019
    Beschreibung: This paper aims to explore the current status, research trends and hotspots related to the field of infrared detection technology through bibliometric analysis and visualization techniques based on the Science Citation Index Expanded (SCIE) and Social Sciences Citation Index (SSCI) articles published between 1990 and 2018 using the VOSviewer and Citespace software tools. Based on our analysis, we first present the spatiotemporal distribution of the literature related to infrared detection technology, including annual publications, origin country/region, main research organization, and source publications. Then, we report the main subject categories involved in infrared detection technology. Furthermore, we adopt literature cocitation, author cocitation, keyword co-occurrence and timeline visualization analyses to visually explore the research fronts and trends, and present the evolution of infrared detection technology research. The results show that China, the USA and Italy are the three most active countries in infrared detection technology research and that the Centre National de la Recherche Scientifique has the largest number of publications among related organizations. The most prominent research hotspots in the past five years are vibration thermal imaging, pulse thermal imaging, photonic crystals, skin temperature, remote sensing technology, and detection of delamination defects in concrete. The trend of future research on infrared detection technology is from qualitative to quantitative research development, engineering application research and infrared detection technology combined with other detection techniques. The proposed approach based on the scientific knowledge graph analysis can be used to establish reference information and a research basis for application and development of methods in the domain of infrared detection technology studies.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
  • 66
    Publikationsdatum: 2019
    Beschreibung: The literature on big data analytics and firm performance is still fragmented and lacking in attempts to integrate the current studies’ results. This study aims to provide a systematic review of contributions related to big data analytics and firm performance. The authors assess papers listed in the Web of Science index. This study identifies the factors that may influence the adoption of big data analytics in various parts of an organization and categorizes the diverse types of performance that big data analytics can address. Directions for future research are developed from the results. This systematic review proposes to create avenues for both conceptual and empirical research streams by emphasizing the importance of big data analytics in improving firm performance. In addition, this review offers both scholars and practitioners an increased understanding of the link between big data analytics and firm performance.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
  • 67
    Publikationsdatum: 2019
    Beschreibung: Service Level Agreements are employed to set availability commitments in cloud services. When a violation occurs as in an outage, cloud providers may be called to compensate customers for the losses incurred. Such compensation may be so large as to erode cloud providers’ profit margins. Insurance may be used to protect cloud providers against such a danger. In this paper, closed formulas are provided through the expected utility paradigm to set the insurance premium under different outage models and QoS metrics (no. of outages, no. of long outages, and unavailability). When the cloud service is paid through a fixed fee, we also provide the maximum unit compensation that a cloud provider can offer so as to meet constraints on its profit loss. The unit compensation is shown to vary approximately as the inverse square of the service fee.
    Digitale ISSN: 1999-5903
    Thema: Informatik
    Publiziert von MDPI
  • 68
    Publikationsdatum: 2019
    Beschreibung: Evaluating term translation quality in machine translation (MT), which is usually done by domain experts, is a time-consuming and expensive task. In fact, this is unimaginable in an industrial setting where customised MT systems often need to be updated for many reasons (e.g., the availability of new training data or of new leading MT techniques). To the best of our knowledge, there is as yet no publicly available solution for evaluating terminology translation in MT automatically. Hence, there is a genuine need for a faster and less expensive solution to this problem, which could help end-users identify term translation problems in MT instantly. This study presents a faster and less expensive strategy for evaluating terminology translation in MT. High correlations of our evaluation results with human judgements demonstrate the effectiveness of the proposed solution. The paper also introduces a classification framework, TermCat, that can automatically classify term translation-related errors and expose specific problems related to terminology translation in MT. We carried out our experiments with a low-resource language pair, English–Hindi, and found that our classifier, whose accuracy varies across translation directions, error classes, the morphological nature of the languages, and MT models, generally performs competently in the terminology translation classification task.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
  • 69
    Publikationsdatum: 2019
    Beschreibung: Radar signal processing mainly focuses on target detection, classification, estimation, filtering, and so on. Compressed sensing radar (CSR) technology can potentially provide additional tools to simultaneously reduce computational complexity and effectively solve inference problems. CSR allows direct compressive signal processing without the need to reconstruct the signal. This study aimed to solve the problem of CSR detection without signal recovery by optimizing the transmit waveform. Therefore, a waveform optimization method was introduced to improve the output signal-to-interference-plus-noise ratio (SINR) in the case where the target signal is corrupted by colored interference and noise having known statistical characteristics. Two different target models are discussed: deterministic and random. In the case of a deterministic target, the optimum transmit waveform is derived by maximizing the SINR and a suboptimum solution is also presented. In the case of random target, an iterative waveform optimization method is proposed to maximize the output SINR. This approach ensures that SINR performance is improved in each iteration step. The performance of these methods is illustrated by computer simulation.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
  • 70
    Publikationsdatum: 2019
    Beschreibung: In semi-autonomous robot conferencing, not only the operator controls the robot, but the robot itself also moves autonomously. Thus, it can modify the operator’s movement (e.g., adding social behaviors). However, the sense of agency, that is, the degree of feeling that the movement of the robot is the operator’s own movement, would decrease if the operator is conscious of the discrepancy between the teleoperation and autonomous behavior. In this study, we developed an interface to control the robot head by using an eye tracker. When the robot autonomously moves its eye-gaze position, the interface guides the operator’s eye movement towards this autonomous movement. The experiment showed that our interface can maintain the sense of agency, because it provided the illusion that the autonomous behavior of a robot is directed by the operator’s eye movement. This study reports the conditions of how to provide this illusion in semi-autonomous robot conferencing.
    Digitale ISSN: 1999-5903
    Thema: Informatik
    Publiziert von MDPI
  • 71
    Publikationsdatum: 2019
    Beschreibung: Appropriate business process management (BPM) within an organization can help attain organizational goals. It is particularly important to effectively manage the lifecycle of these processes, as this improves performance and builds competitiveness across the company. This paper presents process discovery and shows how it can be used in a broader framework supporting self-organization in BPM. Process discovery is intrinsically associated with the process lifecycle. We carried out a preliminary evaluation of the usefulness of our approach using a generated log file. We also compared visualizations of the outcomes of our approach across different cases and presented performance characteristics of the cash loan sales process.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
  • 72
    Publikationsdatum: 2019
    Beschreibung: Correlations between observed data are at the heart of all empirical research that strives for establishing lawful regularities. However, there are numerous ways to assess these correlations, and there are numerous ways to make sense of them. This essay presents a bird’s eye perspective on different interpretive schemes to understand correlations. It is designed as a comparative survey of the basic concepts. Many important details to back it up can be found in the relevant technical literature. Correlations can (1) extend over time (diachronic correlations) or they can (2) relate data in an atemporal way (synchronic correlations). Within class (1), the standard interpretive accounts are based on causal models or on predictive models that are not necessarily causal. Examples within class (2) are (mainly unsupervised) data mining approaches, relations between domains (multiscale systems), nonlocal quantum correlations, and eventually correlations between the mental and the physical.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
  • 73
    Publikationsdatum: 2019
    Beschreibung: The capacity of private information retrieval (PIR) from databases coded using maximum distance separable (MDS) codes was previously characterized by Banawan and Ulukus, where it was assumed that the messages are encoded and stored separably in the databases. This assumption was also usually made in other related works in the literature, and this capacity is usually referred to as the MDS-PIR capacity colloquially. In this work, we considered the question of if and when this capacity barrier can be broken through joint encoding and storing of the messages. Our main results are two classes of novel code constructions, which allow joint encoding, as well as the corresponding PIR protocols, which indeed outperformed the separate MDS-coded systems. Moreover, we show that a simple, but novel expansion technique allows us to generalize these two classes of codes, resulting in a wider range of the cases where this capacity barrier can be broken.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
  • 74
    Publikationsdatum: 2019
    Beschreibung: Collaborative filtering based recommender systems have proven to be extremely successful in settings where user preference data on items is abundant. However, collaborative filtering algorithms are hindered by their weakness against the item cold-start problem and general lack of interpretability. Ontology-based recommender systems exploit hierarchical organizations of users and items to enhance browsing, recommendation, and profile construction. While ontology-based approaches address the shortcomings of their collaborative filtering counterparts, ontological organizations of items can be difficult to obtain for items that mostly belong to the same category (e.g., television series episodes). In this paper, we present an ontology-based recommender system that integrates the knowledge represented in a large ontology of literary themes to produce fiction content recommendations. The main novelty of this work is an ontology-based method for computing similarities between items and its integration with the classical Item-KNN (K-nearest neighbors) algorithm. As a study case, we evaluated the proposed method against other approaches by performing the classical rating prediction task on a collection of Star Trek television series episodes in an item cold-start scenario. This transverse evaluation provides insights into the utility of different information resources and methods for the initial stages of recommender system development. We found our proposed method to be a convenient alternative to collaborative filtering approaches for collections of mostly similar items, particularly when other content-based approaches are not applicable or otherwise unavailable. Aside from the new methods, this paper contributes a testbed for future research and an online framework to collaboratively extend the ontology of literary themes to cover other narrative content.
    Digitale ISSN: 1999-5903
    Thema: Informatik
    Publiziert von MDPI
  • 75
    Publikationsdatum: 2019
    Beschreibung: The advent of utility computing has revolutionized almost every sector of traditional software development. Especially commercial cloud computing services, pioneered by the likes of Amazon, Google and Microsoft, have provided an unprecedented opportunity for the fast and sustainable development of complex distributed systems. Nevertheless, existing models and tools aim primarily for systems where resource usage—by humans and bots alike—is logically and physically quite disperse resulting in a low likelihood of conflicting resource access. However, a number of resource-intensive applications, such as Massively Multiplayer Online Games (MMOGs) and large-scale simulations introduce a requirement for a very large common state with many actors accessing it simultaneously and thus a high likelihood of conflicting resource access. This paper presents a systematic mapping study of the state-of-the-art in software technology aiming explicitly to support the development of MMOGs, a class of large-scale, resource-intensive software systems. By examining the main focus of a diverse set of related publications, we identify a list of criteria that are important for MMOG development. Then, we categorize the selected studies based on the inferred criteria in order to compare their approach, unveil the challenges faced in each of them and reveal research trends that might be present. Finally we attempt to identify research directions which appear promising for enabling the use of standardized technology for this class of systems.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
  • 76
    Publikationsdatum: 2019
    Beschreibung: In this survey paper, we review various concepts of graph density, as well as associated theorems and algorithms. Our goal is motivated by the fact that, in many applications, it is a key algorithmic task to extract a densest subgraph from an input graph according to some appropriate definition of graph density (a sketch of one classical greedy heuristic follows this record). While this problem has been the subject of active research for over half a century, with many proposed variants and solutions, new results still continuously emerge in the literature. This shows both the importance and the richness of the subject. We also identify some interesting open problems in the field.
    Digitale ISSN: 1999-4893
    Thema: Informatik
    Publiziert von MDPI
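As one concrete instance of the densest-subgraph task mentioned above, here is a sketch of the classical greedy peeling heuristic for average-degree density |E(S)|/|S| (known to be a 2-approximation for that particular measure); it is only one of the many density notions such a survey covers.

```python
# Sketch of the classical greedy "peeling" heuristic for the densest-subgraph
# problem under average-degree density |E(S)| / |S|.
def densest_subgraph_peeling(edges):
    """edges: iterable of (u, v) pairs of an undirected graph."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    nodes = set(adj)
    m = sum(len(nb) for nb in adj.values()) // 2
    best_density, best_set = 0.0, set(nodes)
    while nodes:
        density = m / len(nodes)
        if density > best_density:
            best_density, best_set = density, set(nodes)
        v = min(nodes, key=lambda x: len(adj[x]))   # peel a minimum-degree vertex
        m -= len(adj[v])
        for u in adj[v]:
            adj[u].discard(v)
        del adj[v]
        nodes.remove(v)
    return best_set, best_density

if __name__ == "__main__":
    clique = [(a, b) for a in range(4) for b in range(a + 1, 4)]   # dense core
    tail = [(3, 10), (10, 11), (11, 12)]                           # sparse appendage
    print(densest_subgraph_peeling(clique + tail))                 # recovers the 4-clique
```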
  • 77
    Publikationsdatum: 2019
    Beschreibung: Military named entity recognition (MNER) is one of the key technologies in military information extraction. Traditional methods for the MNER task rely on cumbersome feature engineering and specialized domain knowledge. In order to solve this problem, we propose a method employing a bidirectional long short-term memory (BiLSTM) neural network with a self-attention mechanism to identify military entities automatically. We obtain distributed vector representations of the military corpus by unsupervised learning, and the BiLSTM model combined with the self-attention mechanism is adopted to fully capture the contextual information carried by the character vector sequence. The experimental results show that the self-attention mechanism can effectively improve the performance of the MNER task. The F-scores for the identification of military documents and online military texts were 90.15% and 89.34%, respectively, which is better than other models.
    Digitale ISSN: 1999-5903
    Thema: Informatik
    Publiziert von MDPI
  • 78
    Publikationsdatum: 2019
    Beschreibung: This article empirically demonstrates the impacts of truthfully sharing forecast information and of using forecast combinations in a fast-moving-consumer-goods (FMCG) supply chain. Although it is known a priori that sharing information improves the overall efficiency of a supply chain, information such as pricing or promotional strategy is often kept proprietary for competitive reasons. In this regard, it is shown that simply sharing the retail-level forecasts (which does not reveal the exact business strategy, due to the effect of omni-channel sales) yields nearly all the benefits of sharing all pertinent information that influences FMCG demand. In addition, various forecast combination methods are used to further stabilize the forecasts in situations where multiple forecasting models are used during operation (two standard combination rules are sketched after this record). In other words, it is shown that combining forecasts is less risky than "betting" on any single component model.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
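Two standard forecast-combination rules, a simple average and inverse-MSE weighting, are sketched below for illustration; the specific combination methods used in the article are not reproduced.

```python
# Sketch of two standard forecast-combination rules with placeholder numbers.
import numpy as np

def simple_average(forecasts):
    """forecasts: (n_models, horizon) array -> equally weighted combination."""
    return np.asarray(forecasts).mean(axis=0)

def inverse_mse_combination(forecasts, past_errors):
    """Weight each model by the inverse of its historical mean squared error."""
    forecasts = np.asarray(forecasts)
    mse = np.mean(np.asarray(past_errors) ** 2, axis=1)
    w = (1.0 / mse) / np.sum(1.0 / mse)
    return w @ forecasts

if __name__ == "__main__":
    f = [[100, 104, 108],      # model A's forecast for the next 3 periods
         [ 90, 101, 112]]      # model B's forecast
    e = [[ 2, -1,  1,  3],     # models' past forecast errors
         [ 8, -6,  7, -5]]
    print(simple_average(f), inverse_mse_combination(f, e))
```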
  • 79
    Publikationsdatum: 2019
    Beschreibung: An important problem in machine learning is that, when using more than two labels, it is very difficult to construct and optimize a group of learning functions that are still useful when the prior distribution of instances is changed. To resolve this problem, semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms are combined to form a systematic solution. A semantic channel in G theory consists of a group of truth functions or membership functions. In comparison with the likelihood functions, Bayesian posteriors, and Logistic functions that are typically used in popular methods, membership functions are more convenient to use, providing learning functions that do not suffer the above problem. In Logical Bayesian Inference (LBI), every label is independently learned. For multilabel learning, we can directly obtain a group of optimized membership functions from a large enough sample with labels, without preparing different samples for different labels. Furthermore, a group of Channel Matching (CM) algorithms are developed for machine learning. For the Maximum Mutual Information (MMI) classification of three classes with Gaussian distributions in a two-dimensional feature space, only 2–3 iterations are required for the mutual information between three classes and three labels to surpass 99% of the MMI for most initial partitions. For mixture models, the Expectation-Maximization (EM) algorithm is improved to form the CM-EM algorithm, which can outperform the EM algorithm when the mixture ratios are imbalanced, or when local convergence exists. The CM iteration algorithm needs to combine with neural networks for MMI classification in high-dimensional feature spaces. LBI needs further investigation for the unification of statistics and logic.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
  • 80
    Publikationsdatum: 2019
    Beschreibung: In an era of accelerating digitization and advanced big data analytics, harnessing quality data and insights will enable innovative research methods and management approaches. Among others, Artificial Intelligence Imagery Analysis has recently emerged as a new method for analyzing the content of large amounts of pictorial data. In this paper, we provide background information on this method, outline its applications for research and practice, and suggest that it constitutes a profound improvement over previous methods that have mostly relied on manual work by humans. In a case study, we employ Artificial Intelligence Imagery Analysis to decompose and assess thumbnail images in the context of marketing and media research and show how properly assessed and designed thumbnail images promote the consumption of online videos. We conclude the paper with a discussion of the potential of Artificial Intelligence Imagery Analysis for research and practice across disciplines.
    Digitale ISSN: 1999-5903
    Thema: Informatik
    Publiziert von MDPI
  • 81
    Publikationsdatum: 2019
    Beschreibung: Human eye movement is one of the most important functions for understanding our surroundings. When a human eye processes a scene, it quickly focuses on dominant parts of the scene, commonly known as a visual saliency detection or visual attention prediction. Recently, neural networks have been used to predict visual saliency. This paper proposes a deep learning encoder-decoder architecture, based on a transfer learning technique, to predict visual saliency. In the proposed model, visual features are extracted through convolutional layers from raw images to predict visual saliency. In addition, the proposed model uses the VGG-16 network for semantic segmentation, which uses a pixel classification layer to predict the categorical label for every pixel in an input image. The proposed model is applied to several datasets, including TORONTO, MIT300, MIT1003, and DUT-OMRON, to illustrate its efficiency. The results of the proposed model are quantitatively and qualitatively compared to classic and state-of-the-art deep learning models. Using the proposed deep learning model, a global accuracy of up to 96.22% is achieved for the prediction of visual saliency.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 82
    Publikationsdatum: 2019
    Beschreibung: The skyline query and its variants are useful functions in the early stages of a knowledge-discovery process. They select a set of important objects that are better than the other, ordinary objects in the dataset. In order to handle big data, such knowledge-discovery queries must be computed in parallel distributed environments. In this paper, we consider an efficient parallel algorithm for the "K-skyband query" and the "top-k dominating query", which are popular variants of the skyline query. We propose a method for computing both queries simultaneously in MapReduce, a parallel distributed framework widely used for processing "big data" problems. Our extensive evaluation validates the effectiveness and efficiency of the proposed algorithm on both real and synthetic datasets.
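    A toy sketch of the k-skyband idea in a map/reduce style is shown below (plain Python standing in for a MapReduce framework). It assumes smaller values are better in every dimension and omits the partitioning and pruning used by the actual algorithm.
    # k-skyband: points dominated by fewer than k other points (k = 1 gives the skyline).
    from collections import defaultdict

    def dominates(p, q):
        """p dominates q if p is no worse in all dimensions and better in at least one."""
        return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

    def map_phase(partition, all_points):
        # emit (point, number of dominators) for every point in this partition
        for q in partition:
            yield q, sum(1 for p in all_points if dominates(p, q))

    def reduce_phase(emitted, k):
        counts = defaultdict(int)
        for point, c in emitted:
            counts[point] += c
        return [p for p, c in counts.items() if c < k]

    points = [(1, 9), (2, 4), (3, 3), (5, 1), (4, 4), (6, 6)]
    partitions = [points[:3], points[3:]]
    emitted = [kv for part in partitions for kv in map_phase(part, points)]
    print(reduce_phase(emitted, k=2))   # 2-skyband of the toy dataset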
    Digitale ISSN: 1999-4893
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 83
    Publikationsdatum: 2019
    Beschreibung: A generalization of Ding's construction is proposed that employs as a defining set the collection of the s-th powers (s ≥ 2) of all nonzero elements in GF(p^m), where p ≥ 2 is prime. Some of the resulting codes are optimal or near-optimal and include projective codes over GF(4) that give rise to optimal or near-optimal quantum codes. In addition, the codes yield interesting combinatorial structures, such as strongly regular graphs and block designs.
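    For orientation only, the simplest case m = 1 of the defining set can be computed directly: the s-th powers of the nonzero elements of GF(p). The sketch below (Python) lists them for assumed values p = 11 and s = 2; arithmetic in the general field GF(p^m) is not shown.
    # s-th powers of the nonzero elements of GF(p) for the prime-field case m = 1.
    def sth_powers(p, s):
        return sorted({pow(x, s, p) for x in range(1, p)})

    # s = 2, p = 11: the quadratic residues modulo 11
    print(sth_powers(11, 2))   # [1, 3, 4, 5, 9]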
    Digitale ISSN: 1999-4893
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 84
    Publikationsdatum: 2019
    Beschreibung: The exorbitant increase in the computational complexity of modern video coding standards, such as High Efficiency Video Coding (HEVC), is a compelling challenge for resource-constrained consumer electronic devices. For instance, the brute-force evaluation of all possible combinations of available coding modes and the quadtree-based coding structure in HEVC to determine the optimum set of coding parameters for a given content demands a substantial amount of computational and energy resources. Thus, the resource requirements for real-time operation of HEVC have become a contributing factor in the Quality of Experience (QoE) of the end users of emerging multimedia and future internet applications. In this context, this paper proposes a content-adaptive Coding Unit (CU) size selection algorithm for HEVC intra-prediction. The proposed algorithm builds content-specific weighted Support Vector Machine (SVM) models in real time during the encoding process to provide an early estimate of the CU size for a given content, avoiding the brute-force evaluation of all possible coding mode combinations in HEVC. The experimental results demonstrate an average encoding time reduction of 52.38%, with an average Bjøntegaard Delta Bit Rate (BDBR) increase of 1.19% compared to the HM16.1 reference encoder. Furthermore, perceptual visual quality assessments conducted with the Video Quality Metric (VQM) show minimal impact on the visual quality of the reconstructed videos compared to state-of-the-art approaches.
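    The sketch below shows the general shape of a weighted SVM classifier deciding whether to split a CU early (Python/scikit-learn). The features, labels, and per-sample weights are synthetic assumptions; the paper's online, content-specific training and feature set are not reproduced.
    # Weighted SVM sketch for an early CU split / no-split decision.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.random((200, 3))                       # e.g. texture variance, gradient energy, QP (assumed features)
    y = (X[:, 0] + X[:, 1] > 1.0).astype(int)      # 1 = split the CU, 0 = keep it (synthetic labels)

    # Per-sample weights make misclassifying a CU that should be split more costly.
    sample_weight = np.where(y == 1, 2.0, 1.0)

    clf = SVC(kernel="rbf", C=1.0)
    clf.fit(X, y, sample_weight=sample_weight)

    features_of_new_cu = np.array([[0.7, 0.6, 0.3]])
    print("split CU early" if clf.predict(features_of_new_cu)[0] else "evaluate further")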
    Digitale ISSN: 1999-5903
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 85
    Publikationsdatum: 2019
    Beschreibung: A crowdsourcing contest is one of the most popular modes of crowdsourcing and an important tool for an enterprise to implement open innovation. The solvers' active participation is one of the major reasons for the success of crowdsourcing contests, so research on solvers' participation behavior helps in understanding the sustainability and incentives of participation on online crowdsourcing platforms, and how to attract more solvers to participate and invest more effort is a central question for researchers. Previous studies mainly used the submission quantity to measure solvers' participation behavior and lacked an effective measure of the degree of effort expended by a solver. For the first time, we use solvers' participation time as a dependent variable to measure their effort in a crowdsourcing contest, thereby incorporating participation time into research on solver participation. With data from Taskcn.com, we analyze how participation time is affected by four groups of factors: task design, task description, task process, and environment. We found that, first, for task design, higher task rewards attract solvers to invest more time in the participation process, and the relationship between participation time and task duration is inverted U-shaped. Second, for task description, the length of the task description has a negative impact on participation time, while a task description attachment positively influences it. Third, for the task process, communication and supplementary explanations during the crowdsourcing process positively affect participation time. Fourth, for environmental factors, the task density of the crowdsourcing platform and the market price of all crowdsourcing contests have negative and positive effects on participation time, respectively.
    Digitale ISSN: 1999-5903
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 86
    Publikationsdatum: 2019
    Beschreibung: Balanced partitioning is often a crucial first step in solving large-scale graph optimization problems. In some cases, a big graph can be chopped into pieces that fit on one machine and be processed independently before the results are stitched together, which introduces some suboptimality from the interaction among different pieces. In other cases, links between different parts show up in the running time and/or network communication cost, hence the desire for a small cut size. We study a distributed balanced-partitioning problem in which the goal is to partition the vertices of a given graph into k pieces so as to minimize the total cut size. Our algorithm is composed of a few steps that are easily implementable in distributed computation frameworks such as MapReduce. The algorithm first embeds the nodes of the graph onto a line and then processes them in a distributed manner guided by the linear embedding order. We examine various ways to obtain the initial embedding, for example via hierarchical clustering or Hilbert curves, and then apply four different refinement techniques: local swaps, minimum cuts on the boundaries of partitions, contraction, and dynamic programming. In our empirical study, we compare these techniques with each other and with previous work in distributed graph algorithms, for example a label-propagation method, FENNEL, and Spinner. We report results on a private map graph and several public social networks and show that our results beat previous distributed algorithms: for instance, compared to the label-propagation algorithm, we report an improvement of 15–25% in the cut value. We also observe that our algorithms admit scalable distributed implementations for any number of partitions. Finally, we describe three applications of this work at Google: (1) Balanced partitioning is used to route multi-term queries to different replicas in the Google Search backend in a way that reduces cache miss rates by ≈0.5%, which leads to a double-digit gain in the throughput of production clusters. (2) Applied to Google Maps Driving Directions, balanced partitioning minimizes the number of cross-shard queries with the goal of saving CPU usage; this system achieves load balancing by dividing the world graph into several "shards", and live experiments demonstrate an ≈40% drop in the number of cross-shard queries compared to a standard geography-based method. (3) In a job scheduling problem for our data centers, we use balanced partitioning to distribute the work evenly while minimizing the amount of communication across geographically distant servers. In fact, the hierarchical nature of our solution goes well with the layering of data center servers, where certain machines are closer to each other and have faster links to one another.
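    A compact sketch of the embed-then-split step is given below (Python): order the vertices on a line (a plain BFS order stands in for the hierarchical-clustering or Hilbert-curve embeddings), cut the order into k equal pieces, and measure the cut size. The swap, contraction, and dynamic-programming refinements, and the distributed execution, are omitted.
    # Embed-then-split sketch for balanced graph partitioning.
    from collections import deque

    def bfs_order(adj, start):
        order, seen, queue = [], {start}, deque([start])
        while queue:
            u = queue.popleft()
            order.append(u)
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        return order

    def split_balanced(order, k):
        size = -(-len(order) // k)                  # ceiling division: balanced piece size
        return [order[i:i + size] for i in range(0, len(order), size)]

    def cut_size(adj, parts):
        part_of = {v: i for i, p in enumerate(parts) for v in p}
        # each undirected edge is counted once via the part-index ordering
        return sum(1 for u in adj for v in adj[u] if part_of[u] < part_of[v])

    adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
    parts = split_balanced(bfs_order(adj, 0), k=2)
    print(parts, "cut =", cut_size(adj, parts))     # [[0, 1, 2], [3, 4, 5]] cut = 1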
    Digitale ISSN: 1999-4893
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 87
    Publikationsdatum: 2019
    Beschreibung: Analyzing the structure of a social network helps in gaining insights into interactions and relationships among users while revealing the patterns of their online behavior. Network centrality is a measure of the importance of a node in a network and allows the structural patterns and morphology of networks to be revealed. We propose a distributed computing approach for calculating the network centrality value of each user using the MapReduce approach on the Hadoop platform, which allows faster and more efficient computation than a conventional implementation. A distributed approach is scalable and supports efficient computation on large-scale datasets, such as social network data. The proposed approach improves the calculation performance of degree centrality by 39.8%, closeness centrality by 40.7%, and eigenvalue centrality by 41.1% on a Twitter dataset.
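    The following sketch shows the map/reduce pattern for degree centrality (plain Python standing in for Hadoop): the mapper emits (node, 1) for each endpoint of each edge, and the reducer sums and normalises by n − 1. The toy edge list is an assumption for illustration.
    # Map/Reduce-style degree centrality sketch.
    from collections import defaultdict

    edges = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d")]

    def mapper(edge):
        u, v = edge
        yield u, 1
        yield v, 1

    def reducer(pairs, n):
        counts = defaultdict(int)
        for node, one in pairs:
            counts[node] += one
        # degree centrality: degree divided by n - 1
        return {node: c / (n - 1) for node, c in counts.items()}

    n = len({v for e in edges for v in e})
    pairs = [kv for e in edges for kv in mapper(e)]
    print(reducer(pairs, n))   # e.g. {'a': 0.67, 'b': 0.67, 'c': 1.0, 'd': 0.33}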
    Digitale ISSN: 1999-4893
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 88
    Publikationsdatum: 2019
    Beschreibung: Deep neural networks are successful learning tools for building nonlinear models. However, a robust deep learning-based classification model needs a large dataset; indeed, these models are often unstable when they use small datasets. To solve this issue, which is particularly critical in light of the possible clinical applications of such predictive models, researchers have developed approaches such as virtual sample generation. Virtual sample generation significantly improves learning and classification performance when working with small samples. The main objective of this study is to evaluate the ability of the proposed virtual sample generation to overcome the small-sample-size problem, which is characteristic of the automated detection of a neurodevelopmental disorder, namely autism spectrum disorder. Results show that our method enhances diagnostic accuracy from 84% to 95% using virtual samples generated on the basis of five actual clinical samples. The present findings show the feasibility of using the proposed technique to improve classification performance even for clinical samples of limited size. Given the concerns associated with small sample sizes, our technique represents a meaningful step forward in pattern recognition methodology, particularly when applied to the diagnostic classification of neurodevelopmental disorders. In addition, the proposed technique was tested on other available benchmark datasets; the experimental outcomes showed that classification accuracy with virtual samples was superior to that obtained with the original training data alone.
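    As a generic illustration of virtual sample generation (not the specific procedure evaluated in the study), the sketch below (Python/NumPy) creates new samples by jittered interpolation between pairs of real samples; in practice this would be applied separately within each class.
    # Generic virtual-sample generation by jittered interpolation.
    import numpy as np

    def virtual_samples(X, n_new, noise=0.05, seed=0):
        rng = np.random.default_rng(seed)
        idx = rng.integers(0, len(X), size=(n_new, 2))          # pick random pairs of real samples
        lam = rng.random((n_new, 1))                            # random interpolation weights
        mix = lam * X[idx[:, 0]] + (1 - lam) * X[idx[:, 1]]     # convex combination of each pair
        return mix + rng.normal(0.0, noise * X.std(axis=0), mix.shape)   # small feature-wise jitter

    X_real = np.random.default_rng(1).random((5, 10))           # five actual samples, ten features
    X_virtual = virtual_samples(X_real, n_new=50)
    X_train = np.vstack([X_real, X_virtual])                    # enlarged training set
    print(X_train.shape)                                        # (55, 10)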
    Digitale ISSN: 1999-4893
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 89
    Publikationsdatum: 2019
    Beschreibung: In this paper, I first review signal detection theory (SDT) approaches to perception and then discuss why SDT is thought to imply that increasing attention improves performance. Our experiments have shown, however, that this is not necessarily true. Subjects either focused attention on two of four possible locations in the visual field or diffused attention to all four locations. The stimuli (offset letters), locations, conditions, and tasks were all known in advance, responses were forced-choice, subjects were properly instructed and motivated, and instructions were always valid; these are conditions that should optimize signal detection. Relative to diffusing attention, focusing attention indeed benefitted discrimination of forward- from backward-pointing Es. However, focusing made it harder to identify a randomly chosen one of 20 letters. That focusing can either aid or disrupt performance, even when cues are valid and conditions are idealized, is surprising, but it can also be explained by SDT, as shown here. These results warn the experimental researcher not to confuse focusing attention with enhancing performance, and warn the modeler not to assume that SDT is unequivocal.
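    For readers unfamiliar with SDT, the basic sensitivity measure d′ referred to above can be computed from hit and false-alarm rates as in the sketch below (Python/SciPy); the rates shown are made-up numbers, not data from these experiments.
    # Sensitivity d' = z(hit rate) - z(false-alarm rate), the core SDT quantity.
    from scipy.stats import norm

    def d_prime(hit_rate, false_alarm_rate):
        return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

    print(d_prime(0.85, 0.20))   # about 1.88: the observer separates signal from noise fairly well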
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 90
    Publikationsdatum: 2019
    Beschreibung: The growing demand for video streaming services increasingly motivates the development of reliable and accurate models for the assessment of Quality of Experience (QoE). Human-related factors, which have a significant influence on QoE, play a crucial role in this task; however, the complexity caused by the multiple effects of those factors on human perception has introduced challenges for contemporary studies. In this paper, we inspect the impact of human-related factors, namely perceptual factors, the memory effect, and the degree of interest. Based on our investigation, a novel QoE model is proposed that effectively incorporates those factors to reflect the user's cumulative perception. Evaluation results indicate that the proposed model performs excellently in predicting cumulative QoE at any moment within a streaming session.
    Digitale ISSN: 1999-5903
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 91
    Publikationsdatum: 2019
    Beschreibung: This paper deals with taṣawwur, the Arabic translation in Averroes' Great Commentary of the term τῶν ἀδιαιρέτων νόησις ("ton adiaireton noesis", thinking of the indivisibles) in Aristotle's De anima, and with its Latin translation from the Arabic as (in-)formatio, as quoted by Albertus Magnus [...]
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 92
    Publikationsdatum: 2019
    Beschreibung: Opportunistic networks are considered promising network structures for realizing communication traditionally provided by infrastructure-based networks, by enabling smart mobile devices to contact each other within a fixed communication area. Because of the intermittent and unstable connections between sources and destinations, message routing and forwarding in opportunistic networks have recently become challenging problems. In this paper, to improve the data dissemination environment, we propose an improved routing-forwarding strategy for opportunistic networks that utilizes node profiles and location prediction and mainly comprises three consecutive phases: collecting and updating routing state information, community detection and optimization, and node location prediction. Each mobile node in the network is able to establish a network routing matrix after the entire process of information collecting and updating. Because population is concentrated in urban areas and relatively sparse in remote areas, the distribution of predicted locations roughly exhibits a form of symmetry in opportunistic networks. The community optimization and location prediction mechanisms can then be regarded as a significant foundation for data dissemination in the network. Ultimately, experimental results demonstrate that the proposed algorithm slightly enhances the delivery ratio and substantially reduces the network overhead and end-to-end delay compared with the other four routing strategies.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 93
    Publikationsdatum: 2019
    Beschreibung: With digital media, not only are media extensions of their human users, as McLuhan posited, but there is also a flip or reversal in which the human users of digital media become an extension of those digital media, as these media scoop up their data and use them to the advantage of those that control the media. The implications of this loss of privacy as we become "an item in a data bank" are explored, and the field of captology is described. The feedback of the users of digital media becomes the feedforward for those media.
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 94
    Publikationsdatum: 2019
    Beschreibung: The main scope of the presented research was the development of an innovative product for the management of city parking lots. Our application supports the implementation of the Smart City concept by using computer vision and communication platforms, which enable the development of new integrated digital services. The use of video cameras can simplify and lower the cost of parking lot control. For parking space detection, an aggregated decision is proposed that combines various metrics computed over a sliding window of frames provided by the camera. A history built over 20 images yields an adaptive background model and accurate detection. The system has shown high robustness in two benchmarks, achieving a recognition rate higher than 93%.
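    A minimal sketch of the sliding-window background idea follows (Python/NumPy): keep the last 20 grayscale crops of a parking space, take their median as an adaptive background, and flag the space as occupied when the current crop deviates strongly. The threshold and the aggregation of several metrics used by the real system are assumptions.
    # Sliding-window background model for one parking space.
    import numpy as np
    from collections import deque

    history = deque(maxlen=20)             # rolling window of recent grayscale crops

    def occupied(current_crop, threshold=25.0):
        history.append(current_crop.astype(np.float32))
        background = np.median(np.stack(history), axis=0)       # adaptive background estimate
        return float(np.abs(current_crop - background).mean()) > threshold

    # fake 8-bit grayscale crops of one parking space (64 x 64 pixels)
    empty = np.full((64, 64), 90, dtype=np.uint8)
    car = np.full((64, 64), 170, dtype=np.uint8)
    for _ in range(20):
        occupied(empty)                    # warm up the background with empty frames
    print(occupied(car))                   # True: the crop deviates strongly from the background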
    Digitale ISSN: 1999-5903
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 95
    Publikationsdatum: 2019
    Beschreibung: Anomaly detection in network traffic flows is a non-trivial problem in the field of network security due to the complexity of network traffic. However, most machine learning-based detection methods focus on network anomaly detection and ignore user behavior anomaly detection. In real scenarios, anomalous network behavior may harm user interests. In this paper, we propose an anomaly detection model based on time-decay closed frequent patterns to address this problem. The model mines closed frequent patterns from the network traffic of each user and uses a time-decay factor to distinguish the weight of current and historical network traffic. Because of the dynamic nature of user network behavior, a detection model update strategy is provided in the anomaly detection framework. Additionally, the closed frequent patterns can provide interpretable explanations for anomalies. Experimental results show that the proposed method can detect user behavior anomalies, and that its network anomaly detection performance is similar to state-of-the-art methods and significantly better than baseline methods.
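    The time-decay weighting can be illustrated in a few lines (Python): a pattern observed at time t contributes decay^(now − t) to its support, so recent behaviour outweighs historical behaviour. The mining of closed frequent patterns itself is not shown, and the decay value and toy patterns are assumptions.
    # Time-decayed support of observed traffic patterns.
    from collections import defaultdict

    def decayed_support(observations, now, decay=0.9):
        """observations: iterable of (pattern, timestamp) pairs; patterns are tuples."""
        support = defaultdict(float)
        for pattern, t in observations:
            support[pattern] += decay ** (now - t)     # older observations contribute less
        return dict(support)

    obs = [(("dns", "port53"), 1), (("dns", "port53"), 9), (("ssh", "port22"), 2)]
    print(decayed_support(obs, now=10))
    # the recently repeated dns pattern keeps most of its weight; the old ssh pattern fades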
    Digitale ISSN: 2078-2489
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 96
    Publikationsdatum: 2019
    Beschreibung: Parameterized complexity theory has led to a wide range of algorithmic breakthroughs within the last few decades, but the practicability of these methods for real-world problems is still not well understood. We investigate the practicability of one of the fundamental approaches of this field: dynamic programming on tree decompositions. Indisputably, this is a key technique in parameterized algorithms and modern algorithm design. Despite the enormous impact of this approach in theory, it still has very little influence on practical implementations. The reasons for this phenomenon are manifold; one of them is the simple fact that such an implementation requires a long chain of non-trivial tasks (such as computing the decomposition, preparing it, …). We provide an easy way to implement such dynamic programs that only requires the definition of the update rules. With this interface, dynamic programs for various problems, such as 3-coloring, can be implemented easily in about 100 lines of structured Java code. The theoretical foundation of the success of dynamic programming on tree decompositions is well understood thanks to Courcelle's celebrated theorem, which states that every MSO-definable problem can be solved efficiently if a tree decomposition of small width is given. We seek to provide practical access to this theorem as well, by presenting a lightweight model checker for a small fragment of MSO₁ (that is, we do not consider "edge-set-based" problems). This fragment is powerful enough to describe many natural problems, and our model checker turns out to be very competitive against similar state-of-the-art tools.
    Digitale ISSN: 1999-4893
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 97
    Publikationsdatum: 2019
    Beschreibung: Let V be a finite set of positive integers with sum equal to a multiple of the integer b. When does V have a partition into b parts such that all parts have equal sums? We develop algorithmic constructions which yield positive, albeit incomplete, answers for the following classes of sets V, where n is a given positive integer: (1) an initial interval {a ∈ ℤ⁺ : a ≤ n}; (2) an initial interval of primes {p ∈ ℙ : p ≤ n}, where ℙ is the set of primes; (3) a divisor set {d ∈ ℤ⁺ : d | n}; (4) an aliquot set {d ∈ ℤ⁺ : d | n, d < n}. Open general questions and conjectures are included for each of these classes.
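    A brute-force check of the underlying question, useful only for small instances, is sketched below (Python): backtracking that tries to place each value into one of b parts without exceeding the common target sum. The paper's constructive, class-specific algorithms are not reproduced.
    # Backtracking check: can the values be split into b parts with equal sums?
    def equal_sum_partition(values, b):
        total = sum(values)
        if total % b:
            return None
        target = total // b
        parts = [[] for _ in range(b)]
        sums = [0] * b

        def place(i):
            if i == len(values):
                return True
            v = values[i]
            for j in range(b):
                if sums[j] + v <= target:
                    parts[j].append(v)
                    sums[j] += v
                    if place(i + 1):
                        return True
                    parts[j].pop()
                    sums[j] -= v
                if sums[j] == 0:       # symmetry break: do not try further empty parts
                    break
            return False

        return parts if place(0) else None

    # initial interval {1, ..., 8}, sum 36, split into b = 3 parts of sum 12 each
    print(equal_sum_partition(sorted(range(1, 9), reverse=True), 3))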
    Digitale ISSN: 1999-4893
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 98
    Publikationsdatum: 2019
    Beschreibung: The blockchain technique is becoming more and more popular due to advantages such as stability and its dispersed nature, and this work builds on blockchain activity paradigms. Another important field is machine learning, which is increasingly used in practice. Unfortunately, training or retraining artificial neural networks is very time-consuming and requires high computing power. In this paper, we propose using a blockchain technique to train neural networks. This type of activity is useful for searching for initial weights of a network that lead to faster training through a quicker decrease of the gradient. We performed tests with much heavier calculations to show that such an approach is possible. However, this type of solution can also be used for less demanding calculations, i.e., only a few iterations of training to find a better configuration of initial weights.
    Digitale ISSN: 1999-4893
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 99
    Publikationsdatum: 2019
    Beschreibung: The application of blockchain technology to the energy sector promises to derive new operating models focused on local generation and sustainable practices, which are driven by peer-to-peer collaboration and community engagement. However, real-world energy blockchains differ from typical blockchain networks insofar as they must interoperate with grid infrastructure, adhere to energy regulations, and embody engineering principles. Naturally, these additional dimensions make real-world energy blockchains highly dependent on the participation of grid operators, engineers, and energy providers. Although much theoretical and proof-of-concept research has been published on energy blockchains, this research aims to establish a lens on real-world projects and implementations that may inform the alignment of academic and industry research agendas. This research classifies 131 real-world energy blockchain initiatives to develop an understanding of how blockchains are being applied to the energy domain, what type of failure rates can be observed from recently reported initiatives, and what level of technical and theoretical details are reported for real-world deployments. The results presented from the systematic analysis highlight that real-world energy blockchains are (a) growing exponentially year-on-year, (b) producing relatively low failure/drop-off rates (~7% since 2015), and (c) demonstrating information sharing protocols that produce content with insufficient technical and theoretical depth.
    Digitale ISSN: 1999-5903
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...
  • 100
    Publikationsdatum: 2019
    Beschreibung: In this study, we address the problem of compaction of Church numerals. Church numerals are unary representations of natural numbers on the scheme of lambda terms. We propose a novel decomposition scheme from a given natural number into an arithmetic expression using tetration, which enables us to obtain a compact representation of lambda terms that leads to the Church numeral of the natural number. For a natural number n, we prove that the size of the lambda term obtained by the proposed method is O((slog₂ n)(log n / log log n)). Moreover, we experimentally confirmed that the proposed method outperforms the binary representation of Church numerals on average when n is less than approximately 10,000.
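    The effect being exploited can be seen with Church numerals written as Python lambdas: the classical exponentiation combinator EXP = λm.λn. n m produces large numerals from small terms, and the proposed tetration-based decomposition builds on the same idea (the decomposition itself is not reproduced here).
    # Church numerals and exponentiation as Python lambdas.
    ZERO = lambda f: lambda x: x
    SUCC = lambda n: lambda f: lambda x: f(n(f)(x))
    EXP  = lambda m: lambda n: n(m)             # the Church numeral for m ** n

    def church(k):                              # build the numeral for k by repeated SUCC
        n = ZERO
        for _ in range(k):
            n = SUCC(n)
        return n

    def to_int(n):                              # evaluate a numeral back to a Python int
        return n(lambda k: k + 1)(0)

    two, three = church(2), church(3)
    print(to_int(EXP(two)(three)))              # 8 = 2 ** 3, from a term far smaller than eight SUCC applications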
    Digitale ISSN: 1999-4893
    Thema: Informatik
    Publiziert von MDPI
    Standort Signatur Erwartet Verfügbarkeit
    BibTip Andere fanden auch interessant ...