ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

Filter
Collection
  • Articles  (6,732)
Years
  • 2015-2019  (6,732)
  • 1945-1949
Journal
  • Algorithms  (852)
  • Datenschutz und Datensicherheit  (531)
  • Pattern Recognition  (502)
  • BMC Medical Informatics and Decision Making  (392)
Topic
  • Computer Science  (6,732)
  • 101
    Publication Date: 2016-07-09
    Description: Mobile phone technology is utilized for better delivery of health services worldwide. In low- and middle-income countries mobile phones are now ubiquitous. Thus leveraging mHealth applications in the health sector ...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 102
    Publication Date: 2016-06-22
    Description: Sentiment analysis of online social media has attracted significant interest recently. Many studies have been performed, but most existing methods focus on either only textual content or only visual content. In this paper, we utilize deep learning models in a convolutional neural network (CNN) to analyze the sentiment in Chinese microblogs from both textual and visual content. We first train a CNN on top of pre-trained word vectors for textual sentiment analysis and employ a deep convolutional neural network (DNN) with generalized dropout for visual sentiment analysis. We then evaluate our sentiment prediction framework on a dataset collected from a famous Chinese social media network (Sina Weibo) that includes text and related images and demonstrate state-of-the-art results on this Chinese sentiment analysis benchmark.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 103
    Publication Date: 2016-06-23
    Description: We investigate the problem of minimizing the total power consumption under the constraint of the signal-to-noise ratio (SNR) requirement for the physical-layer multicasting system with large-scale antenna arrays. In contrast with existing work, we explicitly consider both the transmit power and the circuit power scaling with the number of antennas. A joint antenna selection and beamforming technique is proposed to minimize the total power consumption. The problem is a challenging one, which aims to minimize a linear combination of the ℓ0-norm and the ℓ2-norm. To the best of our knowledge, this minimization problem has not yet been well solved. A random decremental antenna selection algorithm is designed, which is further modified by an approximation of the minimal transmit power based on the asymptotic orthogonality of the channels. Then, a more efficient decremental antenna selection algorithm is proposed based on minimizing the ℓ0-norm. Performance results show that the ℓ0-norm minimization algorithm greatly outperforms the random selection algorithm in terms of both total power consumption and average run time.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
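The trade-off this abstract describes (transmit power falling with more antennas, circuit power rising with them) can be sketched with a toy model. The 1/n transmit-power scaling and all numeric constants below are illustrative assumptions, not the paper's system model; the function names are hypothetical.

```python
def total_power(n_antennas, p_tx_single=10.0, p_circuit=0.5):
    """Toy model: transmit power decays ~1/n by asymptotic channel
    orthogonality, while circuit power grows linearly with n."""
    return p_tx_single / n_antennas + p_circuit * n_antennas

def select_antenna_count(n_max):
    """Decremental selection sketch: start from all antennas and keep
    switching one off while doing so lowers the total power."""
    n = n_max
    while n > 1 and total_power(n - 1) < total_power(n):
        n -= 1
    return n
```

With the constants above, the decremental search stops at five active antennas, where the two power terms balance.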
  • 104
    Publication Date: 2016-06-28
    Description: The use of telemonitoring is a promising approach to optimizing outcomes in the treatment of heart failure (HF) for patients living in the community. HF telemonitoring interventions, however, have not been tes...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 105
    Publication Date: 2016-05-06
    Description: While lost to follow-up (LTFU) from antiretroviral therapy (ART) can be considered a catch-all category for patients who miss scheduled visits or medication pick-ups, operational definitions and methods for d...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 106
    Publication Date: 2016-05-27
    Description: Recently manifold learning has received extensive interest in the community of pattern recognition. Despite their appealing properties, most manifold learning algorithms are not robust in practical applications. In this paper, we address this problem in the context of the Hessian locally linear embedding (HLLE) algorithm and propose a more robust method, called RHLLE, which aims to be robust against both outliers and noise in the data. Specifically, we first propose a fast outlier detection method for high-dimensional datasets. Then, we employ a local smoothing method to reduce noise. Furthermore, we reformulate the original HLLE algorithm by using the truncation function from differentiable manifolds. In the reformulated framework, we explicitly introduce a weighted global functional to further reduce the undesirable effect of outliers and noise on the embedding result. Experiments on synthetic as well as real datasets demonstrate the effectiveness of our proposed algorithm.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 107
    Publication Date: 2016-02-07
    Description: A new orthogonal projection method for computing the minimum distance between a point and a spatial parametric curve is presented. It consists of a geometric iteration which converges faster than the existing Newton's method and is insensitive to the choice of initial values. We prove that projecting a point onto a spatial parametric curve under this method is globally second-order convergent.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
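The foot-point condition behind this abstract — the residual p − c(t) must be orthogonal to the tangent c′(t) — can be sketched as a generic fixed-point iteration. This is a minimal 2D illustration of orthogonal projection onto a parametric curve, not the paper's exact update rule; function names are made up for the example.

```python
import math

def project_point_to_curve(p, c, dc, t0=0.0, tol=1e-10, max_iter=200):
    """Iterate the curve parameter until (p - c(t)) is orthogonal to
    the tangent dc(t); each step projects the residual onto the tangent."""
    t = t0
    for _ in range(max_iter):
        cx, cy = c(t)
        dx, dy = dc(t)
        step = ((p[0] - cx) * dx + (p[1] - cy) * dy) / (dx * dx + dy * dy)
        t += step
        if abs(step) < tol:
            break
    return t

# Example: project the point (1, 1) onto the unit circle; the foot
# point lies at parameter t = pi/4.
t = project_point_to_curve((1.0, 1.0),
                           lambda t: (math.cos(t), math.sin(t)),
                           lambda t: (-math.sin(t), math.cos(t)))
```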
  • 108
    Publication Date: 2016-07-13
    Description: Improving retention in prevention of mother to child transmission (PMTCT) of HIV programs is critical to optimize maternal and infant health outcomes, especially now that lifelong treatment is immediate regard...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 109
    Publication Date: 2016-07-18
    Description: As health care becomes more complex, it becomes more important for clinicians and patients to share information. Electronic health information exchange can help address this need. To this end, all provinces an...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 110
    Publication Date: 2016-07-19
    Description: The survival of patients with breast cancer is highly sporadic, from a few months to more than 15 years. In recent studies, the gene expression profiling of tumors has been used as a promising means of predict...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 111
    Publication Date: 2016-07-19
    Description: Identifying subtypes of complex diseases such as cancer is the very first step toward developing highly customized therapeutics on such diseases, as their origins significantly vary even with similar physiolog...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 112
    Publication Date: 2016-07-19
    Description: The Variome corpus, a small collection of published articles about inherited colorectal cancer, includes annotations of 11 entity types and 13 relation types related to the curation of the relationship between...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 113
    Publication Date: 2016-07-22
    Description: Risk calculation is increasingly used in lipid management, congestive heart failure, and atrial fibrillation. The risk scores are then used for decisions about statin use, anticoagulation, and implantable defi...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 114
    Publication Date: 2016-07-22
    Description: The utilization of routine health information systems (HIS) for surveillance of assisted partner services (aPS) for HIV in sub-Saharan is sub-optimal, in part due to poor data quality and limited use of inform...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 115
    Publication Date: 2016-07-28
    Description: A specific Electronic Health Record (EHR) for ophthalmology was introduced in an academic center in Germany. As diagnoses coding corresponding to the International Classification of Diseases Version 10 (ICD-10...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 116
    Publication Date: 2016-07-30
    Description: We consider the problem of estimating the measure of subsets in very large networks. A prime tool for this purpose is the Markov Chain Monte Carlo (MCMC) algorithm. This algorithm, while extremely useful in many cases, still often suffers from the drawback of very slow convergence. We show that in a special, but important case, it is possible to obtain significantly better bounds on the convergence rate. This special case is when the huge state space can be aggregated into a smaller number of clusters, in which the states behave approximately the same way (but their behavior still may not be identical). A Markov chain with this structure is called quasi-lumpable. This property allows the aggregation of states (nodes) into clusters. Our main contribution is a rigorously proved bound on the rate at which the aggregated state distribution approaches its limit in quasi-lumpable Markov chains. We also demonstrate numerically that in certain cases this can indeed lead to a significantly accelerated way of estimating the measure of subsets. The result can be a useful tool in the analysis of complex networks, whenever they have a clustering that aggregates nodes with similar (but not necessarily identical) behavior.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
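The state aggregation this abstract relies on can be sketched in a few lines: lump states into clusters, average the inter-cluster transition mass, and iterate the smaller chain. This is a minimal illustration of lumping (exact only for truly lumpable chains, an approximation for quasi-lumpable ones); the 4-state matrix below is an invented toy example, not from the paper.

```python
def aggregate(P, clusters):
    """Aggregate a transition matrix P over clusters of states by
    averaging the transition probability mass between clusters."""
    k = len(clusters)
    Q = [[0.0] * k for _ in range(k)]
    for a, ca in enumerate(clusters):
        for b, cb in enumerate(clusters):
            Q[a][b] = sum(P[i][j] for i in ca for j in cb) / len(ca)
    return Q

def stationary(Q, iters=500):
    """Power iteration for the stationary distribution of Q."""
    k = len(Q)
    pi = [1.0 / k] * k
    for _ in range(iters):
        pi = [sum(pi[i] * Q[i][j] for i in range(k)) for j in range(k)]
    return pi

# Toy quasi-lumpable chain: states {0,1} and {2,3} behave almost alike.
P = [[0.4, 0.4, 0.1, 0.1],
     [0.35, 0.45, 0.1, 0.1],
     [0.1, 0.1, 0.4, 0.4],
     [0.1, 0.1, 0.45, 0.35]]
Q = aggregate(P, [[0, 1], [2, 3]])
pi = stationary(Q)
```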
  • 117
    Publication Date: 2015-05-01
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 118
    Publication Date: 2015-05-01
    Description: Research on trust and credibility plays an important role in communication science for describing, explaining, and predicting processes and patterns of media use and the effects resulting from them. With respect to news reception, the focus so far has been primarily on news in print media and television rather than electronic media. This article examines the relationship between the reception of news and political information from the Internet and the importance of trust in these sources and of their credibility.
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 119
    Publication Date: 2015-05-01
    Description: Public statements are an effective instrument, often the most effective one, for data protection authorities to enforce data protection law. Accordingly, supervisory authorities maintain differentiated and comprehensive public relations work and often cooperate closely with the press. In individual cases, this can interfere with the rights of third parties. The statutes have so far provided no adequate, differentiated answer to this. Case law, by contrast, has addressed the issue, with lower courts regularly taking a restrictive line without sufficiently considering the constitutional and European law framework for the public relations work of supervisory authorities.
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 120
    Publication Date: 2015-05-01
    Description: Against the background of steadily rising numbers of criminal acts on the Internet, users' self-protection is becoming ever more important. Adolescents in particular are considered at risk, as they often behave especially carelessly online. This article presents the results of a survey that examined the influence of five different factors on the security-related behavior of adolescents. Particular attention is paid to the role of generalized trust relative to the other factors.
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 121
    Publication Date: 2015-05-01
    Description: According to a W3C definition, the term "Web of Services" refers to a message-based design principle frequently used in the design of Internet applications and enterprise software. The two dominant approaches here are currently SOAP and REST. For REST, however, no security architecture comparable to SOAP security exists. With its growing use in distributed applications, such a "REST security" is needed ever more urgently. It must define abstract security methods whose concrete implementation goes beyond the security mechanisms commonly used in web applications. This article gives an overview of the current state of the art and formulates open research and development tasks in the form of requirements for REST security.
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 122
    Publication Date: 2015-05-01
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 123
    Publication Date: 2015-05-01
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 124
    Publication Date: 2015-05-01
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 125
    Publication Date: 2015-05-01
    Description: In this article, privacy is broken down along the data protection principles with the help of a reference model of trust. To this end, the reference model of trust is first adapted with regard to its applicability to technical solutions, so that the model's elements and relations can be applied to the data protection principles of transparency, consent and data minimization, confidentiality, and so on, for electronic communication. Such a modeling permits a new view of the IT risk of misuse of personal data. Finally, a concrete mapping of the model components is carried out for the data protection principle of confidentiality as an example.
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 126
    Publication Date: 2015-05-01
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 127
    Publication Date: 2015-05-01
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 128
    Publication Date: 2015-05-01
    Description: In online citizen participation (e-participation), citizens often directly or indirectly disclose their own opinions, convictions, origins, and so forth to the public or within a "semi-public" sphere. In doing so they expose themselves, consciously or unconsciously, to the risk that other actors misuse this personal, sensitive data. To foster trust in e-participation services, IT security and data protection measures must be taken that guarantee the confidentiality, transparency, availability, and integrity of communication between the public administration, political actors, and citizens. This article outlines potential threats and assets worth protecting, and identifies necessary protective measures as well as research needs.
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 129
    Publication Date: 2015-05-01
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 130
    Publication Date: 2015-05-01
    Description: The EU has lost trust in recent years. Failed policies, a lack of identification, negative communication, and missing political competition are the suspected causes. In the 2014 European election campaign, a TV debate between the candidates for the office of President of the European Commission was held for the first time. This article examines whether the debate succeeded in regaining lost trust, and what role communication about the debate on social media played.
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 131
    Publication Date: 2015-05-01
    Description: Trust is a research subject of many scientific disciplines. Since the research questions and application contexts vary widely, the question arises how trust can be conceptualized in a way that enables a shared understanding across the various disciplines while still leaving room for disciplinary particularities. Starting from the social science model of Mayer, Davis and Schoorman [1, 2], this article examines the commonalities and differences among concepts of trust in psychology, political science, communication science, and computer science. The focus is on the role of trust in the political communication process, that is, trust in political representatives and institutions as well as trust in the communication media through which citizens receive political content.
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 132
    Publication Date: 2015-05-08
    Description: The construction of a similarity matrix is one significant step for the spectral clustering algorithm, and the Gaussian kernel function is one of the most common measures for constructing it. However, with a fixed scaling parameter, the similarity between two data points is not adaptive and is inappropriate for multi-scale datasets. In this paper, by quantifying the importance of each vertex of the similarity graph, the Gaussian kernel function is scaled and an adaptive Gaussian kernel similarity measure is proposed. An adaptive spectral clustering algorithm is then obtained, based on the importance of shared nearest neighbors. The idea is that the greater the importance of the shared neighbors between two vertices, the more likely it is that these two vertices belong to the same cluster; the importance value of the shared neighbors is obtained with an iterative method that considers both the local structural information and the distance similarity information, so as to improve the algorithm's performance. Experimental results on different datasets show that our spectral clustering algorithm outperforms other spectral clustering algorithms, such as self-tuning spectral clustering and adaptive spectral clustering based on shared nearest neighbors, in clustering accuracy on most datasets.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
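The core idea of replacing a single global scaling parameter with locally adapted ones can be illustrated with the related self-tuning construction, where each point's scale is its distance to the k-th nearest neighbor. This is a sketch of that simpler, well-known variant, not the paper's shared-nearest-neighbor importance weighting; the point set is an invented example.

```python
import math

def adaptive_similarity(points, k=2):
    """Gaussian similarity with per-point scales sigma_i equal to the
    distance to the k-th nearest neighbor, so the kernel adapts to the
    local density instead of using one fixed scaling parameter."""
    n = len(points)
    d = [[math.dist(points[i], points[j]) for j in range(n)] for i in range(n)]
    # Each distance row includes the 0 self-distance, so index k picks
    # the k-th nearest neighbor.
    sigma = [sorted(row)[k] for row in d]
    return [[math.exp(-d[i][j] ** 2 / (sigma[i] * sigma[j]))
             for j in range(n)] for i in range(n)]

# Two tight pairs far apart: within-pair similarity should dominate.
points = [(0.0, 0.0), (0.0, 1.0), (5.0, 5.0), (5.0, 6.0)]
W = adaptive_similarity(points)
```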
  • 133
    Publication Date: 2015-05-09
    Description: In this paper, we propose a detection method for pulmonary nodules in X-ray computed tomography (CT) scans using three image filters and appearance-based k-means clustering. First, voxel values are suppressed in radial directions so as to eliminate extra regions in the volumes of interest (VOIs). Globular regions are enhanced by moment-of-inertia tensors, where the voxel values in the VOIs are regarded as mass. Excessively enhanced voxels are reduced based on the displacement between the VOI centers and the centers of gravity of the voxel values in the VOIs. Initial nodule candidates are determined by these filtering processes. False positives are reduced by first normalizing the directions of intensity distributions in the VOIs, rotating the VOIs based on the eigenvectors of the moment-of-inertia tensors, and then applying an appearance-based two-step k-means clustering technique to the rotated VOIs. The proposed method is applied to actual CT scans and experimental results are shown.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 134
    Publication Date: 2015-05-09
    Description: We propose a linear time algorithm, called G2DLP, for generating 2D lattice L(n1, n2) paths, equivalent to two-item multiset permutations, with a given number of turns. The notion of a turn has three meanings: in the context of multiset permutations, it means that two consecutive elements of a permutation belong to two different items; in lattice path enumeration, it means that the path changes its direction, either from eastward to northward or from northward to eastward; in open shop scheduling, it means that we transfer a job from one type of machine to another. The strategy of G2DLP is divide-and-combine; the division is based on the enumeration results of a previous study and is achieved with the aid of an integer partition algorithm and a multiset permutation algorithm; the combination is accomplished by a concatenation algorithm that constructs the required paths. The advantage of G2DLP is twofold. First, it is optimal in the sense that it directly generates all feasible paths without visiting an infeasible one. Second, it can generate all paths in any specified order of turns, for example, decreasing or increasing order. In practice, two applications, scheduling and cryptography, are discussed.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
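The objects this abstract generates — lattice paths with a prescribed number of direction changes — are easy to enumerate by brute force for small grids, which makes the definition concrete. This sketch is only an illustration of the output, not the linear-time G2DLP algorithm itself.

```python
from itertools import permutations

def paths_with_turns(n1, n2, turns):
    """Enumerate all 2D lattice paths from (0,0) to (n1,n2), written as
    strings of E (east) and N (north) steps, having exactly the given
    number of turns (adjacent steps in different directions)."""
    result = []
    for path in set(permutations('E' * n1 + 'N' * n2)):
        if sum(a != b for a, b in zip(path, path[1:])) == turns:
            result.append(''.join(path))
    return sorted(result)
```

For example, on the 2×2 grid only EENN and NNEE have a single turn, while ENEN and NENE have three.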
  • 135
    Publication Date: 2015-05-07
    Description: Background: In this study we implemented and developed state-of-the-art machine learning (ML) and natural language processing (NLP) technologies and built a computerized algorithm for medication reconciliation. Our specific aims are: (1) to develop a computerized algorithm for medication discrepancy detection between patients’ discharge prescriptions (structured data) and medications documented in free-text clinical notes (unstructured data); and (2) to assess the performance of the algorithm on real-world medication reconciliation data. Methods: We collected clinical notes and discharge prescription lists for all 271 patients enrolled in the Complex Care Medical Home Program at Cincinnati Children’s Hospital Medical Center between 1/1/2010 and 12/31/2013. A double-annotated, gold-standard set of medication reconciliation data was created for this collection. We then developed a hybrid algorithm consisting of three processes: (1) a ML algorithm to identify medication entities from clinical notes, (2) a rule-based method to link medication names with their attributes, and (3) a NLP-based, hybrid approach to match medications with structured prescriptions in order to detect medication discrepancies. The performance was validated on the gold-standard medication reconciliation data, where precision (P), recall (R), F-value (F) and workload were assessed. Results: The hybrid algorithm achieved 95.0%/91.6%/93.3% of P/R/F on medication entity detection and 98.7%/99.4%/99.1% of P/R/F on attribute linkage. The medication matching achieved 92.4%/90.7%/91.5% (P/R/F) on identifying matched medications in the gold-standard and 88.6%/82.5%/85.5% (P/R/F) on discrepant medications. By combining all processes, the algorithm achieved 92.4%/90.7%/91.5% (P/R/F) and 71.5%/65.2%/68.2% (P/R/F) on identifying the matched and the discrepant medications, respectively. 
The error analysis of the algorithm outputs identified challenges to be addressed in order to further improve medication discrepancy detection. Conclusion: By leveraging ML and NLP technologies, an end-to-end, computerized algorithm achieves promising results in reconciling medications between clinical notes and discharge prescriptions.
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
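The final matching step this abstract describes — comparing medications extracted from notes against the structured prescription list to find matched and discrepant entries — reduces, in its simplest form, to a set comparison. The sketch below uses toy lowercase exact-match normalization and invented drug names; the paper's hybrid NLP matcher is far more sophisticated.

```python
def reconcile(note_meds, prescription_meds):
    """Report matched and discrepant medication names between free-text
    note extractions and a structured discharge prescription list."""
    notes = {m.strip().lower() for m in note_meds}
    scripts = {m.strip().lower() for m in prescription_meds}
    return {
        'matched': sorted(notes & scripts),
        'note_only': sorted(notes - scripts),
        'prescription_only': sorted(scripts - notes),
    }

# Hypothetical example lists.
r = reconcile(['Aspirin', 'Lisinopril'], ['aspirin', 'Metformin'])
```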
  • 136
    Publication Date: 2015-05-09
    Description: In this work we generate numerical solutions of Burgers' equation by applying the Crank-Nicolson method and different schemes for solving nonlinear systems, instead of using the Hopf-Cole transformation to reduce Burgers' equation to the linear heat equation. The method is analyzed on two test problems in order to check its efficiency for different kinds of initial conditions. Numerical solutions as well as exact solutions for different values of viscosity are calculated, and we conclude that the numerical results are very close to the exact solutions.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 137
    Publication Date: 2015-04-25
    Description: Background: Computerized clinical decision support (CDS) can help hospitals to improve healthcare. However, CDS can be problematic. The purpose of this study was to discover how the views of clinical stakeholders, CDS content vendors, and EHR vendors are alike or different with respect to challenges in the development, management, and use of CDS. Methods: We conducted ethnographic fieldwork using a Rapid Assessment Process within ten clinical and five health information technology (HIT) vendor organizations. Using an inductive analytical approach, we generated themes from the clinical, content vendor, and electronic health record vendor perspectives and compared them. Results: The groups share views on the importance of appropriate manpower, careful knowledge management, CDS that fits user workflow, the need for communication among the groups, and for mutual strategizing about the future of CDS. However, views of usability, training, metrics, interoperability, product use, and legal issues differed. Recommendations for improvement include increased collaboration to address legal, manpower, and CDS sharing issues. Conclusions: The three groups share thinking about many aspects of CDS, but views differ in a number of important respects as well. Until these three groups can reach a mutual understanding of the views of the other stakeholders, and work together, CDS will not reach its potential.
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 138
    Publication Date: 2015-03-28
    Description: An image analysis procedure based on two-dimensional Gaussian fitting is presented and applied to satellite maps describing the surface urban heat island (SUHI). Applying this fitting technique allows us to parameterize the SUHI pattern in order to better understand its intensity trend and to perform quantitative comparisons among different images in time and space. The proposed procedure is computationally fast and stable, executing an initial-guess parameter estimation by multiple regression before the iterative nonlinear fitting. The Gaussian fit was applied to both low- and high-resolution images (1 km and 30 m pixel size) and the results of the SUHI parameterization are shown. As expected, a reduction of the correlation coefficient between the map values and the Gaussian surface was observed for the image with the higher spatial resolution, due to the greater variability of the SUHI values. Since the fitting procedure provides a smoothed Gaussian surface, it performs better on low-resolution images, even though the reliability of the SUHI pattern representation can also be preserved for high-resolution images.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
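The initial-guess step that stabilizes a nonlinear 2D Gaussian fit can be sketched with the standard moment estimates: centroid from the first moments of the map, spreads from the second moments, amplitude from the peak. This is a common starting point for such fits, not the paper's regression-based guess; the grid below is an invented toy map.

```python
def gaussian_initial_guess(grid):
    """Moment-based initial guess (amplitude, centroid, spreads) for
    fitting a 2D Gaussian surface to a non-negative map."""
    total = sum(sum(row) for row in grid)
    xc = sum(v * x for row in grid for x, v in enumerate(row)) / total
    yc = sum(v * y for y, row in enumerate(grid) for v in row) / total
    sx = (sum(v * (x - xc) ** 2
              for row in grid for x, v in enumerate(row)) / total) ** 0.5
    sy = (sum(v * (y - yc) ** 2
              for y, row in enumerate(grid) for v in row) / total) ** 0.5
    amp = max(max(row) for row in grid)
    return amp, xc, yc, sx, sy

# Symmetric toy peak centered on the middle cell.
grid = [[0.0, 1.0, 0.0],
        [1.0, 4.0, 1.0],
        [0.0, 1.0, 0.0]]
amp, xc, yc, sx, sy = gaussian_initial_guess(grid)
```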
  • 139
    Publication Date: 2015-04-23
    Description: The auxiliary problem principle is a powerful tool for solving the multi-area economic dispatch problem. One of the main drawbacks of the auxiliary problem principle method is that its convergence performance depends on the selection of the penalty parameter. In this paper, we propose a self-adaptive strategy that adjusts the penalty parameter based on the iterative information; the proposed approach is verified on two test systems. The corresponding simulation results demonstrate that the proposed self-adaptive auxiliary problem principle iterative scheme is robust with respect to the selection of the penalty parameter and has a better convergence rate than the traditional auxiliary problem principle method.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
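The abstract does not spell out the adaptive rule, so the sketch below uses a common residual-balancing heuristic (familiar from ADMM-type methods) to illustrate how a penalty parameter can be adjusted from iterative information; the function name and the thresholds `mu` and `tau` are hypothetical:

```python
def update_penalty(c, primal_res, dual_res, mu=10.0, tau=2.0):
    """Residual-balancing update: grow the penalty when the coupling
    (primal) residual dominates, shrink it when the dual residual does,
    and leave it unchanged when the two are comparable."""
    if primal_res > mu * dual_res:
        return c * tau
    if dual_res > mu * primal_res:
        return c / tau
    return c
```

Such a rule keeps the iteration from stalling because of a poorly chosen initial penalty, which is the robustness property the abstract claims.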
  • 140
    Publication Date: 2015-04-30
    Description: Background: The incidence of chronic diseases in low- and middle-income countries is rapidly increasing in both urban and rural regions. A major challenge for health systems globally is to develop innovative solutions for the prevention and control of these diseases. This paper discusses the development and pilot testing of SMARTHealth, a mobile-based, point-of-care Clinical Decision Support (CDS) tool to assess and manage cardiovascular disease (CVD) risk in resource-constrained settings. Through pilot testing, the preliminary acceptability, utility, and efficiency of the CDS tool were assessed. Methods: The CDS tool was part of an mHealth system comprising a mobile application that consisted of an evidence-based risk prediction and management algorithm, and a server-side electronic medical record system. Through an agile development process and a user-centred design approach, key features of the mobile application that fitted the requirements of the end users and environment were identified. A comprehensive analytics framework facilitated a data-driven approach to investigate four areas, namely, system efficiency, end-user variability, manual data entry errors, and usefulness of point-of-care management recommendations to the healthcare worker. A four-point Likert scale was used at the end of every risk assessment to gauge ease-of-use of the system. Results: The system was field-tested with eleven village healthcare workers and three Primary Health Centre doctors, who screened a total of 292 adults aged 40 years and above. 34% of participants screened by health workers were identified by the CDS tool to be at high CVD risk and were referred to a doctor. In-depth analysis of user interactions found the CDS tool feasible for use and easily integrable into the workflow of healthcare workers. Following completion of the pilot, further technical enhancements were implemented to improve uptake of the mHealth platform. It will then be evaluated for effectiveness and cost-effectiveness in a cluster randomized controlled trial involving 54 southern Indian villages and over 16,000 individuals at high CVD risk. Conclusions: An evidence-based CVD risk prediction and management tool was used to develop an mHealth platform in rural India for CVD screening and management with proper engagement of health care providers and local communities. With over a third of screened participants being at high risk, there is a need to demonstrate the clinical impact of the mHealth platform so that it can contribute to improved CVD detection in high-risk, low-resource settings.
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 141
    Publication Date: 2015-04-09
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 142
    Publication Date: 2015-04-09
    Description: Information security considers the security of programs, processes, and services. In the design of a secure element for future mobile devices, secure operation is provided by several parties and is ensured through a combination of certification and accreditation. The embedded UICC thereby enables late personalization: the loading of personalization data after delivery.
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 143
    Publication Date: 2015-04-09
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 144
    Publication Date: 2015-04-14
    Description: Aiming at improving the well-known fuzzy compactness and separation algorithm (FCS), this paper proposes a new clustering algorithm based on feature-weighted fuzzy compactness and separation (WFCS). In view of the contribution of individual features to clustering, the proposed algorithm introduces feature weighting into the objective function. We first formulate the membership and feature weighting, analyze the membership of data points falling on the crisp boundary, and then give the adjustment strategy. The proposed WFCS is validated on both simulated and real datasets. The experimental results demonstrate that the proposed WFCS combines the characteristics of hard clustering and fuzzy clustering, and outperforms many existing clustering algorithms with respect to three metrics: Rand Index, Xie-Beni Index and Within-Between (WB) Index.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
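The exact WFCS objective is not given in the abstract; the sketch below only shows the general shape of a feature-weighted fuzzy objective (membership-weighted, per-feature-weighted squared distances to cluster centres), with the fuzzifier `m` and weight exponent `q` as illustrative placeholders:

```python
import numpy as np

def weighted_fuzzy_objective(X, V, U, w, m=2.0, q=2.0):
    """Generic feature-weighted fuzzy objective:
    X: (n, d) data, V: (c, d) centres, U: (n, c) memberships, w: (d,) feature weights.
    Returns sum_i sum_k U[i,k]^m * sum_j w[j]^q * (X[i,j] - V[k,j])^2."""
    d2 = ((X[:, None, :] - V[None, :, :]) ** 2 * w ** q).sum(axis=2)  # (n, c) distances
    return float((U ** m * d2).sum())

# Two points, one centre at the origin, full membership, unit feature weights:
J = weighted_fuzzy_objective(
    np.array([[0.0, 0.0], [1.0, 0.0]]),
    np.array([[0.0, 0.0]]),
    np.array([[1.0], [1.0]]),
    np.array([1.0, 1.0]),
)
```

Feature weighting lets uninformative dimensions contribute less to the distance term, which is the intuition behind adding `w` to the objective.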
  • 145
    Publication Date: 2015-04-16
    Description: Background: Numerous calls have been made for greater assimilation of information technology in healthcare organizations in general, and in primary care settings in particular. Considering the levels of IT investment and adoption in primary care medical practices, a deeper understanding is needed of the factors leading to greater performance outcomes from EMR systems in primary care. To address this issue, we developed and tested a research model centered on the concept of Extended EMR Use. Methods: An online survey of 331 family physicians in Canadian private medical practices was conducted to empirically test seven research hypotheses using a component-based structural equation modeling approach. Results: Five hypotheses were partially or fully supported by our data. Family physicians in our sample used 67% of the clinical and 41% of the communicational functionalities available in their EMR systems, compared to 90% of the administrative features. As expected, extended use was associated with significant improvements in perceived performance benefits. Interestingly, the benefits derived from system use were mainly tied to the clinical support provided by an EMR system. The extent to which physicians were using their EMR systems was influenced by two system design characteristics: functional coverage and ease of use. The more functionalities that are available in an EMR system and the easier they are to use, the greater the potential for exploration, assimilation and appropriation by family physicians. Conclusions: Our study has contributed to the extant literature by proposing a new concept: Extended EMR Use. In terms of its practical implications, our study reveals that family physicians must use as many of the capabilities supported by their EMR system as possible, especially those which support clinical tasks, if they are to maximize its performance benefits.
To ensure extended use of their software, vendors must develop EMR systems that satisfy two important design characteristics: functional coverage and system ease of use.
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 146
    Publication Date: 2015-04-18
    Description: Background: Effective implementation of a Primary Care Medical Home (PCMH) model of care requires integration of patients’ contextual information (physical, mental, social and financial status) into an easily retrievable information source for the healthcare team and for clinical decision-making. This project explored clinicians’ perceptions of important attributes of contextual information for clinical decision-making, how contextual information is expressed in CPRS clinical documentation, and how clinicians in a highly computerized environment manage information flow related to these areas. Methods: A qualitative design using Cognitive Task Analyses and a modified Critical Incident Technique was used. The study was conducted in a large VA facility with a fully implemented EHR located in the western United States. Seventeen providers working in a PCMH model of care in Primary Care, Home Based Care and Geriatrics reported on a recent difficult transition requiring contextual information for decision-making. The transcribed interviews were qualitatively analyzed for thematic development related to contextual information using an iterative process and multiple reviewers with ATLAS.ti software. Results: Six overarching themes emerged as attributes of contextual information: informativeness, goal language, temporality, source attribution, retrieval effort, and information quality. Conclusions: These results indicate that specific attributes are needed for contextual information to fully support clinical decision-making in a Medical Home care delivery environment. Improved EHR designs are needed for ease of access to contextual information, for displaying linkages across time and settings, and for explicit linkages to both clinician and patient goals. Implications relevant to providers’ information needs, team functioning and EHR design are discussed.
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 147
    Publication Date: 2015-04-18
    Description: Background: In Australia, bowel cancer screening participation using faecal occult blood testing (FOBT) is low. Decision support tailored to psychological predictors of participation may increase screening. The study compared tailored computerised decision support to non-tailored computer or paper information. The primary outcome was FOBT return within 12 weeks. Additional analyses were conducted on movement in the decision to screen and change on psychological variables. Methods: A parallel, randomised controlled trial invited 25,511 people aged 50–74 years to complete an eligibility questionnaire. Eligible respondents (n = 3,408) were assigned to Tailored Personalised Decision Support (TPDS), Non-Tailored PDS (NTPDS), or Control (CG) (intention-to-treat, ITT sample). TPDS and NTPDS groups completed an on-line baseline survey (BS) and accessed generic information. The TPDS group additionally received a tailored intervention. CG participants completed a paper BS only. Those completing the BS (n = 2,270) were mailed an FOBT and asked to complete an endpoint survey (ES) that re-measured BS variables (per-protocol, PP sample). Results: FOBT return: In the ITT sample, there was no significant difference between any group (χ²(2) = 2.57, p = .26; TPDS, 32.5%; NTPDS, 33%; and CG, 34.5%). In the PP sample, FOBT return in the internet groups was significantly higher than in the paper group (χ²(2) = 17.01, p
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 148
    Publication Date: 2015-04-18
    Description: Background: Manual eligibility screening (ES) for a clinical trial typically requires a labor-intensive review of patient records that utilizes many resources. Leveraging state-of-the-art natural language processing (NLP) and information extraction (IE) technologies, we sought to improve the efficiency of physician decision-making in clinical trial enrollment. In order to markedly reduce the pool of potential candidates for staff screening, we developed an automated ES algorithm to identify patients who meet core eligibility characteristics of an oncology clinical trial. Methods: We collected narrative eligibility criteria from ClinicalTrials.gov for 55 clinical trials actively enrolling oncology patients in our institution between 12/01/2009 and 10/31/2011. In parallel, our ES algorithm extracted clinical and demographic information from the Electronic Health Record (EHR) data fields to represent profiles of all 215 oncology patients admitted for cancer treatment during the same period. The automated ES algorithm then matched the trial criteria with the patient profiles to identify potential trial-patient matches. Matching performance was validated on a reference set of 169 historical trial-patient enrollment decisions, and workload, precision, recall, negative predictive value (NPV) and specificity were calculated. Results: Without automation, an oncologist would need to review 163 patients per trial on average to replicate the historical patient enrollment for each trial. This workload is reduced by 85% to 24 patients when using automated ES (precision/recall/NPV/specificity: 12.6%/100.0%/100.0%/89.9%). Without automation, an oncologist would need to review 42 trials per patient on average to replicate the patient-trial matches that occur in the retrospective data set. With automated ES this workload is reduced by 90% to four trials (precision/recall/NPV/specificity: 35.7%/100.0%/100.0%/95.5%). Conclusion: By leveraging NLP and IE technologies, automated ES could dramatically increase the trial screening efficiency of oncologists and enable participation of small practices, which are often left out of trial enrollment. The algorithm has the potential to significantly reduce the effort to execute clinical research at a time when new initiatives of the cancer care community intend to greatly expand both access to trials and the number of available trials.
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 149
    Publication Date: 2015-04-21
    Description: Background: One economical way to inform patients about their illness and medical procedures is to provide written health information material. So far, a generic and psychometrically sound scale to evaluate cognitive, emotional, and behavioral aspects of the subjectively experienced usefulness of patient information material from the patient’s perspective has been lacking. The aim of our study was to develop and psychometrically test such a scale. Methods: The Usefulness Scale for Patient Information Material (USE) was developed using a multistep approach. Ultimately, three items for each subscale (cognitive, emotional, and behavioral) were selected under consideration of face validity, discrimination, difficulty, and item content. The final version of the USE was subjected to reliability analysis. Structural validity was tested using confirmatory factor analysis, and convergent and divergent validity were tested using correlation analysis. The criterion validity of the USE was tested in an experimental design. To this end, patients were randomly allocated to one of two groups. One group received a full version of an information brochure on depression or chronic low back pain, depending on the respective primary diagnosis. Patients in the second group received a reduced version with lower design quality, smaller font size and less information. Patients were recruited in six hospitals in Germany. After reading the brochure, they were asked to fill in a questionnaire. Results: Analyzable data were obtained from 120 questionnaires. The confirmatory factor analysis supported the structural validity of the scale. Reliability analysis of the total scale and its subscales showed Cronbach’s α values between .84 and .94. Convergent and divergent validity were supported. Criterion validity was confirmed in the experimental condition. Significant differences between the groups receiving full and reduced information were found for the total score (p
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 150
    Publication Date: 2015-04-21
    Description: Background: Provision of care to patients with chronic diseases remains a great challenge for modern health care systems. eHealth is indicated as one of the strategies that could improve care delivery to this group of patients. The main objective of this study was to assess determinants of the acceptance of Internet use for the provision of selected health care services within the scope of the current nationwide eHealth initiative in Poland. Methods: The survey was carried out among patients with diagnosed chronic conditions who were treated in three health care facilities in Krakow, Poland. Survey data were used to develop univariate and multivariate logistic regression models for six outcome variables originating from the items assessing the acceptance of specific types of eHealth applications. The variables used as predictors were related to the sociodemographic characteristics of respondents, the burden related to chronic disease, and the use of the Internet and its perceived usefulness in making personal health-related decisions. Results: Among 395 respondents, 60.3% were Internet users. Univariate logistic regression models developed for six types of eHealth solutions demonstrated higher acceptance among younger respondents living in urban areas who had attained a higher level of education, used the Internet on their own, and were more confident about its usefulness in making health-related decisions. Furthermore, the duration of chronic disease and hospitalization due to chronic disease predicted the acceptance of some eHealth applications. However, when combined in multivariate models, only the belief in the usefulness of the Internet (five of six models), level of education (four of six models), and previous hospitalization due to chronic disease (three of six models) remained significant predictors.
Conclusions: The perception of the usefulness of the Internet in making health-related decisions is a key determinant of the acceptance of health care services provided online among patients with chronic diseases. Among sociodemographic factors, only the level of education demonstrates a consistent impact on the level of acceptance. Interestingly, a greater burden of chronic disease, reflected in previous hospitalizations, leads to lower acceptance of eHealth solutions.
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 151
    Publication Date: 2016-04-02
    Description: The fireworks algorithm (FA) is a recent parallel, diffusion-based optimization algorithm that simulates the fireworks explosion phenomenon, balancing global exploration and local search by adjusting the explosion mode of the firework bombs. By introducing the grouping strategy of the shuffled frog leaping algorithm (SFLA), an improved FA-SFLA hybrid algorithm is put forward that effectively helps the FA escape local optima and accelerates its global search. The simulation results show that the hybrid algorithm greatly improves accuracy and convergence speed on function optimization problems.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
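The SFLA grouping strategy borrowed by the hybrid is well known: individuals are ranked by fitness and dealt round-robin into memeplexes, so that each group spans the whole quality range. A minimal sketch (minimization, so lower fitness is better):

```python
def partition_into_memeplexes(population, fitness, n_memeplexes):
    """SFLA grouping: rank individuals by fitness (best, i.e. lowest, first)
    and deal them round-robin so every memeplex spans the quality range."""
    order = sorted(range(len(population)), key=lambda i: fitness[i])
    memeplexes = [[] for _ in range(n_memeplexes)]
    for rank, idx in enumerate(order):
        memeplexes[rank % n_memeplexes].append(population[idx])
    return memeplexes

# Four individuals, two memeplexes; fitness 1.0 is the best individual (40).
groups = partition_into_memeplexes([10, 20, 30, 40], [4.0, 3.0, 2.0, 1.0], 2)
```

Each memeplex then evolves semi-independently, which is the mechanism credited with pulling the FA out of local optima.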
  • 152
    Publication Date: 2016-04-13
    Description: Health care institutions have patient question sets that can expand over time. For a multispecialty group, each specialty might have multiple question sets. As a result, question set governance can be challeng...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 153
    Publication Date: 2015-12-25
    Description: The gravitational search algorithm (GSA) is a swarm intelligence optimization algorithm based on the law of gravitation. For all swarm intelligence optimization algorithms, parameter initialization has an important influence on global optimization ability. From the basic principle of GSA, its convergence rate is determined by the gravitational constant and the acceleration of the particles. Optimization performance is verified by simulation experiments on six typical test functions. The simulation results show that the convergence speed of GSA is quite sensitive to the parameter settings, which can therefore be tuned flexibly to improve the algorithm’s convergence speed and the accuracy of its solutions.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
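In the standard GSA formulation, the gravitational "constant" referred to above follows an exponentially decaying schedule, G(t) = G0 * exp(-alpha * t / T). The defaults below (G0 = 100, alpha = 20) are the commonly cited values, not necessarily those used in this study:

```python
import math

def gravitational_constant(t, t_max, g0=100.0, alpha=20.0):
    """Standard GSA schedule: G decays exponentially with iteration t,
    shifting the search from early exploration to late exploitation."""
    return g0 * math.exp(-alpha * t / t_max)
```

A large `g0` or small `alpha` keeps attraction strong for longer (more exploration); the opposite speeds convergence at the risk of premature stagnation, which is the sensitivity the abstract reports.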
  • 154
    Publication Date: 2015-12-25
    Description: The Kung-Traub conjecture states that an optimal iterative method based on d function evaluations for finding a simple zero of a nonlinear function can achieve a maximum convergence order of 2^(d−1). In recent years, many attempts have been made to prove this conjecture or to develop optimal methods that satisfy it. The conjecture implies that the maximum order reachable by a method with three function evaluations is four, even for quadratic functions. In this paper, we show that the conjecture fails for quadratic functions: in fact, we can find a 2-point method with three function evaluations reaching fifth-order convergence. We also develop 2-point 3rd- to 8th-order methods with one function and two first-derivative evaluations using weight functions. Furthermore, we show that with the same number of function evaluations we can develop higher-order 2-point methods of order r + 2, where r ≥ 1 is a positive integer, and that a higher-order method can be developed with the same number of function evaluations if the asymptotic error constant of the previous method is known. We prove the local convergence of these methods, which we term Babajee’s Quadratic Iterative Methods, and extend them to systems involving quadratic equations. We test our methods in numerical experiments, including an application to Chandrasekhar’s integral equation arising in radiative heat transfer theory.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
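For context, the classical optimal 2-point scheme at d = 3 is Ostrowski's method, whose fourth order matches the 2^(d−1) bound the conjecture predicts; the paper's fifth-order variant for quadratics is not reproduced here. A runnable sketch of Ostrowski's step:

```python
def ostrowski_step(f, fprime, x):
    """One step of Ostrowski's 2-point method: three evaluations per step
    (f(x), f'(x), f(y)) giving fourth-order convergence, the optimum the
    Kung-Traub conjecture predicts for d = 3 evaluations."""
    fx = f(x)
    if fx == 0.0:                         # already at a root
        return x
    dfx = fprime(x)
    y = x - fx / dfx                      # Newton predictor
    fy = f(y)
    return y - fy * fx / (dfx * (fx - 2.0 * fy))   # corrector

# Root of the quadratic x^2 - 2, starting from x = 1:
f = lambda x: x * x - 2.0
df = lambda x: 2.0 * x
x = 1.0
for _ in range(4):
    x = ostrowski_step(f, df, x)
```

A few steps already reach machine precision on this quadratic, illustrating why optimal 2-point methods are attractive despite their modest per-step cost.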
  • 155
    Publication Date: 2015-12-25
    Description: Quantitative electroencephalogram (EEG) is one neuroimaging technique that has been shown to differentiate patients with major depressive disorder (MDD) and non-depressed healthy volunteers (HV) at the group-l...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 156
    Publication Date: 2015-12-25
    Description: The Patient Activation Measure (PAM13) is an instrument that assesses patient knowledge, skills, and confidence for disease self-management. This cross-sectional study was aimed to validate a culturally-adapte...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 157
    Publication Date: 2015-12-26
    Description: Epidemics of hand, foot and mouth disease (HFMD) among children in East Asia have been a serious annual public health problem. Previous studies in China and island-type territories in East Asia showed that the...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 158
    Publication Date: 2016-01-01
    Description: With the increasing digitalization of everyday life, cyber security is becoming a central building block of domestic security. The shared goal of government and industry is for Germany’s IT systems and digital infrastructures to be among the most secure in the world. The IT Security Act creates the framework for this; both sides must now fill it with life through trustful cooperation.
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 159
    Publication Date: 2016-01-01
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 160
    Publication Date: 2016-01-01
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 161
    Publication Date: 2016-01-01
    Description: The realization is a bitter one: the use of information technology is continuously threatened by attacks whose damage potential keeps rising. This trend cannot be halted even by elaborate technological developments. A way out of the dilemma appears to be a deliberate reorientation of society. This article develops suggestions to that end, recommending an IT security culture as part of the solution.
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 162
    Publication Date: 2016-01-01
    Description: The German Act to Increase the Security of Information Technology Systems (IT Security Act, ITSiG) imposes considerable and so far underestimated requirements on telemedia providers to implement IT security measures. This article presents the provision systematically and analyses its application.
    Electronic ISSN: 1862-2607
    Topics: Computer Science , Law
    Published by Springer
  • 163
    Publication Date: 2016-03-06
    Description: The accumulation of medical documents in China has rapidly increased in the past years. We focus on developing a method that automatically performs ICD-10 code assignment to Chinese diagnoses from the electron...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 164
    Publication Date: 2016-03-09
    Description: Diagnosis of neuromuscular diseases in primary care is often challenging. Rare diseases such as Pompe disease are easily overlooked by the general practitioner. We therefore aimed to develop a diagnostic suppo...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 165
    Publication Date: 2015-12-12
    Description: Big data are everywhere, as high volumes of varied, valuable, precise and uncertain data can easily be collected or generated at high velocity in many real-life applications. Embedded in these big data are rich sets of useful information and knowledge. To mine these big data and discover useful information and knowledge, we present a data analytic algorithm in this article. Our algorithm manages, queries, and processes uncertain big data in cloud environments. More specifically, it manages transactions of uncertain big data, allows users to query these big data by specifying constraints expressing their interests, and processes the user-specified constraints to discover useful information and knowledge from the uncertain big data. As each item in every transaction in these uncertain big data is associated with an existential probability value expressing the likelihood of that item being present in a particular transaction, computation can be intensive. Our algorithm uses the MapReduce model in a cloud environment for effective data analytics on these uncertain big data. Experimental results show the effectiveness of our data analytic algorithm for managing, querying, and processing uncertain big data in cloud environments.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
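A core computation in mining such uncertain transactions is the expected support of an itemset: the sum, over transactions, of the product of the items' existential probabilities (assuming item independence). In a MapReduce setting the per-transaction products are the map step and the sum is the reduce step. A single-machine sketch with made-up data:

```python
from functools import reduce

def expected_support(transactions, itemset):
    """Expected support of an itemset over uncertain transactions: for each
    transaction containing all items, multiply the items' existential
    probabilities, then sum over transactions."""
    total = 0.0
    for tx in transactions:  # each tx maps item -> existential probability
        if all(item in tx for item in itemset):
            total += reduce(lambda a, b: a * b, (tx[i] for i in itemset), 1.0)
    return total

# A tiny uncertain database (probabilities invented for illustration).
uncertain_db = [
    {"a": 0.9, "b": 0.5},
    {"a": 0.6},
    {"a": 0.8, "b": 1.0},
]
```

A user constraint such as a minimum expected support then acts as the filter the abstract describes for discovering interesting itemsets.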
  • 166
    Publication Date: 2015-12-12
    Description: Background: Interoperable phenotyping algorithms, needed to identify patient cohorts meeting eligibility criteria for observational studies or clinical trials, require medical data in a consistent structured, coded format. Data heterogeneity limits such algorithms’ applicability. Existing approaches are often not widely interoperable, or have low sensitivity due to reliance on the lowest common denominator (ICD-9 diagnoses). In the Scalable Collaborative Infrastructure for a Learning Healthcare System (SCILHS) we endeavor to use the widely available Current Procedural Terminology (CPT) procedure codes with ICD-9. Unfortunately, CPT changes drastically from year to year: codes are retired and replaced. Longitudinal analysis requires grouping retired and current codes. BioPortal provides a navigable CPT hierarchy, which we imported into the Informatics for Integrating Biology and the Bedside (i2b2) data warehouse and analytics platform. However, this hierarchy does not include retired codes. Methods: We compared BioPortal’s 2014AA CPT hierarchy with Partners Healthcare’s SCILHS datamart, comprising three million patients’ data over 15 years. 573 CPT codes were not present in 2014AA (6.5 million occurrences). No existing terminology provided hierarchical linkages for these missing codes, so we developed a method that automatically places missing codes in the most specific “grouper” category, using the numerical similarity of CPT codes. Two informaticians reviewed the results. We incorporated the final table into our i2b2 SCILHS/PCORnet ontology, deployed it at seven sites, and performed a gap analysis and an evaluation against several phenotyping algorithms. Results: The reviewers found the method placed the code correctly with 97 % precision when considering only miscategorizations (“correctness precision”) and 52 % precision using a gold standard of optimal placement (“optimality precision”). High correctness precision meant that codes were placed in a reasonable hierarchical position that a reviewer can quickly validate. Lower optimality precision meant that codes were often not placed in the optimal hierarchical subfolder. The seven sites encountered few occurrences of codes outside our ontology, 93 % of which comprised just four codes. Our hierarchical approach correctly grouped retired and non-retired codes in most cases and extended the temporal reach of several important phenotyping algorithms. Conclusions: We developed a simple, easily validated, automated method to place retired CPT codes into the BioPortal CPT hierarchy. This complements existing hierarchical terminologies, which do not include retired codes. The approach’s utility is confirmed by the high correctness precision and the successful grouping of retired with non-retired codes.
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 167
    Publication Date: 2015-12-31
    Description: Individuals with spina bifida (SB) are vulnerable to chronic skin complications such as wounds on the buttocks and lower extremities. Most of these complications can be prevented with adherence to self-care ro...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 168
    Publication Date: 2015-12-31
    Description: Follicular lymphoma (FL) is one of the most common lymphoid malignancies in the western world. FL cases are stratified into three histological grades based on the average centroblast count per high power field...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 169
    Publication Date: 2016-01-01
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 170
    Publication Date: 2016-01-06
    Description: Given a graph whose nodes and edges are associated with a profit, a visiting (or traversing) time and an admittance time window, the Mixed Team Orienteering Problem with Time Windows (MTOPTW) seeks a specific number of walks spanning a subset of the nodes and edges of the graph so as to maximize the overall collected profit. The visit of the included nodes and edges should take place within their respective time windows, and the overall duration of each walk should be below a certain threshold. In this paper we introduce the MTOPTW, which can be used for modeling a realistic variant of the Tourist Trip Design Problem where the objective is the derivation of near-optimal multiple-day itineraries for tourists visiting a destination that features several points of interest (POIs) and scenic routes. Since the MTOPTW is an NP-hard problem, we propose the first metaheuristic approaches to tackle it. The effectiveness of our algorithms is validated through a number of experiments on POI and scenic route sets compiled from the city of Athens (Greece).
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI Publishing
  • 171
    Publication Date: 2016-03-31
    Description: Recent advances in the adoption and use of health information technology (HIT) have had a dramatic impact on the practice of medicine. In many environments, this has led to the ability to achieve new efficienc...
    Electronic ISSN: 1472-6947
    Topics: Computer Science , Medicine
    Published by BioMed Central
  • 172
    Publication Date: 2019
    Description: In this survey paper, we review various concepts of graph density, as well as associated theorems and algorithms. Our goal is motivated by the fact that, in many applications, it is a key algorithmic task to extract a densest subgraph from an input graph, according to some appropriate definition of graph density. While this problem has been the subject of active research for over half a century, with many proposed variants and solutions, new results still continuously emerge in the literature. This shows both the importance and the richness of the subject. We also identify some interesting open problems in the field.
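One classic algorithm surveys of this area cover is Charikar's greedy peeling, a 1/2-approximation for the average-degree density |E|/|V|; a minimal sketch (the function name is ours):

```python
def densest_subgraph_density(adj):
    """Repeatedly delete a minimum-degree vertex and return the best
    average-degree density |E|/|V| seen among the intermediate
    subgraphs (a 1/2-approximation to the optimum)."""
    adj = {v: set(ns) for v, ns in adj.items()}
    m = sum(len(ns) for ns in adj.values()) // 2   # edge count
    best = 0.0
    while adj:
        best = max(best, m / len(adj))
        v = min(adj, key=lambda u: len(adj[u]))    # min-degree vertex
        for u in adj.pop(v):                       # remove v and its edges
            adj[u].discard(v)
            m -= 1
    return best

# A 4-clique with one pendant vertex: peeling the pendant exposes the
# clique, whose density 6/4 = 1.5 is the best seen.
k4_plus = {1: {2, 3, 4, 5}, 2: {1, 3, 4}, 3: {1, 2, 4}, 4: {1, 2, 3}, 5: {1}}
print(densest_subgraph_density(k4_plus))  # → 1.5
```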
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI
  • 173
    Publication Date: 2019
    Description: Publication date: December 2019. Source: Pattern Recognition, Volume 96. Author(s): Rameswar Panda, Amran Bhuiyan, Vittorio Murino, Amit K. Roy-Chowdhury. Abstract: Existing approaches for person re-identification have concentrated on either designing the best feature representation or learning optimal matching metrics in a static setting where the number of cameras in a network is fixed. Most approaches have neglected the dynamic and open-world nature of the re-identification problem, where one or multiple new cameras may be temporarily on-boarded into an existing system to get additional information or added to expand an existing network. To address this very practical problem, we propose a novel approach for adapting existing multi-camera re-identification frameworks with limited supervision. First, we formulate a domain-perceptive re-identification method based on the geodesic flow kernel that can effectively find the best source camera (already installed) to adapt to newly introduced target camera(s), without requiring a very expensive training phase. Second, we introduce a transitive inference algorithm for re-identification that can exploit information from the best source camera to improve the accuracy across other camera pairs in a network of multiple cameras. Third, we develop a target-aware sparse prototype selection strategy for finding an informative subset of source camera data for data-efficient learning in resource-constrained environments. Our approach can greatly increase the flexibility and reduce the deployment cost of new cameras in many real-world dynamic camera networks. Extensive experiments demonstrate that our approach significantly outperforms state-of-the-art unsupervised alternatives whilst being extremely efficient to compute.
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier
  • 174
    Publication Date: 2019
    Description: Publication date: December 2019. Source: Pattern Recognition, Volume 96. Author(s): Chengzu Bai, Ren Zhang, Zeshui Xu, Rui Cheng, Baogang Jin, Jian Chen. Abstract: Kernel entropy component analysis (KECA) is a recently proposed dimensionality reduction approach, which has shown superiority in many pattern analysis tasks previously based on principal component analysis (PCA). The optimized KECA (OKECA) is a state-of-the-art extension of KECA and can return projections retaining more expressive power than KECA. However, OKECA is not robust to outliers and has high computational complexity attributable to the inherent properties of the L2-norm. To tackle these two problems, we propose a new variant of KECA, namely L1-norm-based KECA (L1-KECA), for data transformation and feature extraction. L1-KECA attempts to find a new kernel decomposition matrix such that the extracted features store the maximum information potential, as measured by the L1-norm. Accordingly, we present a greedy iterative algorithm with much faster convergence than OKECA's. Additionally, L1-KECA retains OKECA's capability to obtain accurate density estimation with very few features (just one or two). Moreover, a new semi-supervised L1-KECA classifier is developed and applied to data classification. Extensive experiments on different real-world datasets validate that our model is superior to most existing KECA-based and PCA-based approaches. Code has also been made publicly available.
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier
  • 175
    Publication Date: 2019
    Description: Publication date: December 2019. Source: Pattern Recognition, Volume 96. Author(s): Samitha Herath, Basura Fernando, Mehrtash Harandi. Abstract: In this paper we raise two important questions: “1. Is temporal information beneficial in recognizing actions from still images? 2. Do we know how to take maximum advantage of it?” To answer these questions we propose a novel transfer learning problem, Temporal To Still Image Learning (i.e., T2SIL), in which we learn to derive temporal information from still images. Thereafter, we use a two-stream model where still image action predictions are fused with the derived temporal predictions. In T2SIL, the knowledge transfer occurs from temporal representations of videos (e.g., optical flow, dynamic image representations) to still action images. Along with T2SIL we propose a new still image action dataset and a video dataset sharing the same set of classes. We explore three well-established transfer learning frameworks (i.e., GANs, embedding learning and Teacher Student Networks (TSNs)) in place of the temporal knowledge transfer method. The use of temporal information derived from our TSN and embedding learning improves still image action recognition.
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier
  • 176
    Publication Date: 2019
    Description: Publication date: January 2020. Source: Pattern Recognition, Volume 97. Author(s): Pooya Ashtari, Fateme Nateghi Haredasht, Hamid Beigy. Abstract: Centroid-based methods including k-means and fuzzy c-means are known as effective and easy-to-implement approaches to clustering in many applications. However, these algorithms cannot be directly applied to supervised tasks. This paper thus presents a generative model extending the centroid-based clustering approach to classification and regression tasks. Given an arbitrary loss function, the proposed approach, termed Supervised Fuzzy Partitioning (SFP), incorporates label information into its objective function through a surrogate term penalizing the empirical risk. Entropy-based regularization is also employed to fuzzify the partition and to weight features, enabling the method to capture more complex patterns, identify significant features, and yield better performance on high-dimensional data. An iterative algorithm based on a block coordinate descent scheme is formulated to efficiently find a local optimum. Extensive classification experiments on synthetic, real-world, and high-dimensional datasets demonstrate that the predictive performance of SFP is competitive with state-of-the-art algorithms such as SVM and random forest. SFP has a major advantage over such methods in that it not only leads to a flexible, nonlinear model but also can exploit any convex loss function in the training phase without compromising computational efficiency.
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier
  • 177
    Publication Date: 2019
    Description: Publication date: January 2020. Source: Pattern Recognition, Volume 97. Author(s): Younghoon Kim, Hyungrok Do, Seoung Bum Kim. Abstract: Graph-based clustering is an efficient method for identifying clusters in local and nonlinear data patterns. Among the existing methods, spectral clustering is one of the most prominent algorithms. However, this method is vulnerable to noise and outliers. This study proposes a robust graph-based clustering method that removes data nodes of relatively low density. The proposed method calculates a pseudo-density from a similarity matrix and reconstructs it using a sparse regularization model; in this process, noise and outlier points are identified and removed. Unlike previous edge-cutting-based methods, the proposed method is robust to noise while detecting clusters because it cuts out irrelevant nodes. We use simulated and real-world data to demonstrate the usefulness of the proposed method by comparing it to existing methods in terms of clustering accuracy and robustness to noisy data. The comparison results confirm that the proposed method outperforms the alternatives.
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier
  • 178
    Publication Date: 2019
    Description: The skyline query and its variants are useful functions in the early stages of a knowledge-discovery process: they select a set of important objects that are better than the other, common objects in the dataset. In order to handle big data, such knowledge-discovery queries must be computed in parallel distributed environments. In this paper, we consider an efficient parallel algorithm for the “K-skyband query” and the “top-k dominating query”, which are popular variants of the skyline query. We propose a method for computing both queries simultaneously in MapReduce, a popular parallel distributed framework for processing “big data” problems. Our extensive evaluation results validate the effectiveness and efficiency of the proposed algorithm on both real and synthetic datasets.
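Both query variants have compact sequential definitions, which a MapReduce job then evaluates at scale; the helpers below are illustrative single-machine sketches of ours (smaller coordinates assumed better), not the paper's distributed algorithm:

```python
def dominates(a, b):
    """a dominates b: no worse in every dimension, better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def k_skyband(points, k):
    """Points dominated by fewer than k others (k = 1 is the skyline)."""
    return [p for p in points if sum(dominates(q, p) for q in points) < k]

def top_k_dominating(points, k):
    """The k points that dominate the most other points."""
    return sorted(points, key=lambda p: -sum(dominates(p, q) for q in points))[:k]

pts = [(1, 2), (2, 1), (1, 3), (3, 3)]
print(k_skyband(pts, 2))         # → [(1, 2), (2, 1), (1, 3)]
print(top_k_dominating(pts, 1))  # → [(1, 2)]
```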
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI
  • 179
    Publication Date: 2019
    Description: A generalization of Ding’s construction is proposed that employs as a defining set the collection of the s-th powers (s ≥ 2) of all nonzero elements in GF(p^m), where p ≥ 2 is prime. Some of the resulting codes are optimal or near-optimal and include projective codes over GF(4) that give rise to optimal or near-optimal quantum codes. In addition, the codes yield interesting combinatorial structures, such as strongly regular graphs and block designs.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI
  • 180
    Publication Date: 2019
    Description: Balanced partitioning is often a crucial first step in solving large-scale graph optimization problems, for example, in some cases, a big graph can be chopped into pieces that fit on one machine to be processed independently before stitching the results together, leading to certain suboptimality from the interaction among different pieces. In other cases, links between different parts may show up in the running time and/or network communications cost, hence the desire to have small cut size. We study a distributed balanced-partitioning problem where the goal is to partition the vertices of a given graph into k pieces so as to minimize the total cut size. Our algorithm is composed of a few steps that are easily implementable in distributed computation frameworks such as MapReduce. The algorithm first embeds nodes of the graph onto a line, and then processes nodes in a distributed manner guided by the linear embedding order. We examine various ways to find the first embedding, for example, via a hierarchical clustering or Hilbert curves. Then we apply four different techniques including local swaps, and minimum cuts on the boundaries of partitions, as well as contraction and dynamic programming. As our empirical study, we compare the above techniques with each other, and also to previous work in distributed graph algorithms, for example, a label-propagation method, FENNEL and Spinner. We report our results both on a private map graph and several public social networks, and show that our results beat previous distributed algorithms: For instance, compared to the label-propagation algorithm, we report an improvement of 15–25% in the cut value. We also observe that our algorithms admit scalable distributed implementation for any number of partitions. 
Finally, we explain three applications of this work at Google: (1) Balanced partitioning is used to route multi-term queries to different replicas in Google Search backend in a way that reduces the cache miss rates by ≈ 0.5 % , which leads to a double-digit gain in throughput of production clusters. (2) Applied to the Google Maps Driving Directions, balanced partitioning minimizes the number of cross-shard queries with the goal of saving in CPU usage. This system achieves load balancing by dividing the world graph into several “shards”. Live experiments demonstrate an ≈ 40 % drop in the number of cross-shard queries when compared to a standard geography-based method. (3) In a job scheduling problem for our data centers, we use balanced partitioning to evenly distribute the work while minimizing the amount of communication across geographically distant servers. In fact, the hierarchical nature of our solution goes well with the layering of data center servers, where certain machines are closer to each other and have faster links to one another.
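The embed-then-cut idea reduces, in its simplest form, to slicing the linear order into k contiguous chunks and measuring the resulting cut; local swaps, min-cuts and the other refinements then improve on this baseline (the helper names below are ours, not the paper's):

```python
def contiguous_split(order, k):
    """Cut a linear embedding of the nodes into k near-equal contiguous
    pieces -- the starting partition that later refinements improve."""
    n = len(order)
    return [order[i * n // k:(i + 1) * n // k] for i in range(k)]

def cut_size(parts, edges):
    """Count edges whose endpoints fall in different pieces."""
    part_of = {v: i for i, part in enumerate(parts) for v in part}
    return sum(part_of[u] != part_of[v] for u, v in edges)

# A path graph embedded in its natural order cuts one edge per boundary.
order = list(range(6))
edges = [(i, i + 1) for i in range(5)]
print(cut_size(contiguous_split(order, 2), edges))  # → 1
```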
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI
  • 181
    Publication Date: 2019
    Description: Analyzing the structure of a social network helps in gaining insights into interactions and relationships among users while revealing the patterns of their online behavior. Network centrality is a measure of the importance of a node in a network, which allows revealing the structural patterns and morphology of networks. We propose a distributed computing approach that calculates the network centrality value for each user using MapReduce on the Hadoop platform, which allows faster and more efficient computation than a conventional implementation. A distributed approach is scalable and enables efficient computation on large-scale datasets, such as social network data. The proposed approach improves the calculation performance of degree centrality by 39.8%, closeness centrality by 40.7% and eigenvalue centrality by 41.1% on a Twitter dataset.
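Degree centrality, for example, maps directly onto the MapReduce pattern: the mapper emits a count of 1 for each endpoint of each edge, and the reducer sums per node. A single-machine sketch of the same computation (illustrative only, not the paper's Hadoop code):

```python
from collections import Counter

def degree_centrality(edges, n):
    """Degree centrality normalized by the maximum possible degree n - 1."""
    deg = Counter()
    for u, v in edges:   # a mapper would emit (u, 1) and (v, 1)
        deg[u] += 1      # the reducer sums the emitted counts per node
        deg[v] += 1
    return {v: d / (n - 1) for v, d in deg.items()}

# Star graph on 5 nodes: the hub touches every other node.
star = [(0, i) for i in range(1, 5)]
print(degree_centrality(star, 5)[0])  # → 1.0
```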
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI
  • 182
    Publication Date: 2019
    Description: Deep neural networks are successful learning tools for building nonlinear models. However, a robust deep learning-based classification model needs a large dataset; indeed, these models are often unstable when trained on small datasets. To solve this issue, which is particularly critical in light of the possible clinical applications of these predictive models, researchers have developed approaches such as virtual sample generation. Virtual sample generation significantly improves learning and classification performance when working with small samples. The main objective of this study is to evaluate the ability of the proposed virtual sample generation to overcome the small-sample-size problem, which is a feature of the automated detection of a neurodevelopmental disorder, namely autism spectrum disorder. Results show that our method enhances diagnostic accuracy from 84% to 95% using virtual samples generated on the basis of five actual clinical samples. The present findings show the feasibility of using the proposed technique to improve classification performance even in cases of clinical samples of limited size. Accounting for concerns in relation to small sample sizes, our technique represents a meaningful step forward in terms of pattern recognition methodology, particularly when applied to diagnostic classification of neurodevelopmental disorders. The proposed technique has also been tested on other available benchmark datasets; the experimental outcomes show that the accuracy of classification using virtual samples was superior to that using the original training data without virtual samples.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI
  • 183
    Publication Date: 2019
    Description: Publication date: January 2020. Source: Pattern Recognition, Volume 97. Author(s): Qiong Wang, Lu Zhang, Wenbin Zou, Kidiyo Kpalma. Abstract: In this paper, we present a novel method for salient object detection in videos. Salient object detection methods based on a background prior may miss salient regions when the salient object touches the frame borders. To solve this problem, we propose to detect the whole salient object via the adjunction of virtual borders. A guided filter is then applied to the temporal output to integrate spatial edge information for better detection of salient object edges. Finally, a global spatio-temporal saliency map is obtained by combining the spatial and temporal saliency maps according to their entropy. The proposed method is assessed on three popular datasets (Fukuchi, FBMS and VOS) and compared to several state-of-the-art methods. The experimental results show that the proposed approach outperforms the tested methods.
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier
  • 184
    Publication Date: 2019
    Description: Publication date: December 2019. Source: Pattern Recognition, Volume 96. Author(s): Zhuoyao Zhong, Lei Sun, Qiang Huo. Abstract: Although Faster R-CNN based text detection approaches have achieved promising results, their localization accuracy is not satisfactory in certain cases due to their sub-optimal bounding box regression based localization modules. In this paper, we address this problem and propose replacing the bounding box regression module with a novel LocNet based localization module to improve the localization accuracy of a Faster R-CNN based text detector. Given a proposal generated by a region proposal network (RPN), instead of directly predicting the bounding box coordinates of the concerned text instance, the proposal is enlarged to create a search region, and an “In-Out” conditional probability is assigned to each row and column of this search region, which can then be used to accurately infer the bounding box. Furthermore, we present a simple yet effective two-stage approach to convert the difficult multi-oriented text detection problem to a relatively easier horizontal text detection problem, which makes our approach able to robustly detect multi-oriented text instances with accurate bounding box localization. Experiments demonstrate that the proposed approach significantly boosts the localization accuracy of Faster R-CNN based text detectors. Consequently, our new text detector achieves superior performance on both horizontal (ICDAR-2011, ICDAR-2013 and MULTILINGUAL) and multi-oriented (MSRA-TD500, ICDAR-2015) text detection benchmarks.
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier
  • 185
    Publication Date: 2019
    Description: Publication date: December 2019. Source: Pattern Recognition, Volume 96. Author(s): Chunfeng Song, Yongzhen Huang, Yan Huang, Ning Jia, Liang Wang. Abstract: Gait recognition is one of the most important techniques for human identification at a distance. Most current gait recognition frameworks consist of several separate steps: silhouette segmentation, feature extraction, feature learning, and similarity measurement. These modules are mutually independent, with each part fixed, resulting in suboptimal performance in challenging conditions. In this paper, we integrate those steps into one framework, i.e., an end-to-end network for gait recognition named GaitNet. It is composed of two convolutional neural networks: one for gait segmentation and the other for classification. The two networks are trained jointly in one learning procedure. This strategy greatly simplifies the traditional step-by-step pipeline and is thus much more efficient for practical applications. Moreover, joint learning can automatically adjust each part to fit the global objective, leading to clear performance improvement over separate learning. We evaluate our method on three large-scale gait datasets: CASIA-B, SZU RGB-D Gait and a newly built database with complex dynamic outdoor backgrounds. Extensive experimental results show that the proposed method is effective and achieves state-of-the-art results. The code and data will be released upon request.
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier
  • 186
    Publication Date: 2019
    Description: Publication date: December 2019. Source: Pattern Recognition, Volume 96. Author(s): Chuan-Xian Ren, Xiao-Lin Xu, Zhen Lei. Abstract: Person re-identification (re-ID) is the task of matching different images of the same pedestrian, and it has attracted increasing research interest in pattern recognition and machine learning. Traditionally, person re-ID is formulated as a metric learning problem with binary classification output. However, higher-order relationships, such as triplet closeness among instances, are ignored by such pair-wise metric learning methods, so the discriminative information hidden in the data is insufficiently explored. This paper proposes a new structured loss function to push the frontier of person re-ID performance in realistic scenarios. The new loss function introduces two margin parameters, which operate as bounds to remove positive pairs of very small distance and negative pairs of large distance. A trade-off coefficient is assigned to the loss term of negative pairs to alleviate the class-imbalance problem. By using a linear function with the margin-based objectives, the gradients w.r.t. the weight matrices are no longer dependent on the iterative loss values in a multiplicative manner, which makes the weight update process robust to large iterative loss values. The new loss function is compatible with many deep learning architectures and thus induces a new deep network with pair-pruning regularization for metric learning. To evaluate the performance of the proposed model, extensive experiments are conducted on benchmark datasets. The results indicate that the new loss, together with a ResNet-50 backbone, has excellent feature representation ability for person re-ID.
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier
  • 187
    Publication Date: 2019
    Description: Publication date: January 2020. Source: Pattern Recognition, Volume 97. Author(s): Shuzhao Li, Huimin Yu, Roland Hu. Abstract: Person attributes are often exploited as mid-level human semantic information to help promote the performance of the person re-identification task. In this paper, unlike most existing methods that simply treat attribute learning as a classification problem, we approach it differently, motivated by the observation that attributes are related to specific local regions, i.e., the perceptual ability of attributes. We utilize the process of attribute detection to generate corresponding attribute-part detectors, whose invariance to many influences such as pose and camera view can be guaranteed. With the detected local part regions, our model extracts local part features to handle the body part misalignment problem, another major challenge for person re-identification. The local descriptors are further refined by fused attribute information to eliminate interference caused by detection deviation. Finally, the refined local features work together with a holistic-level feature to constitute our final feature representation. Extensive experiments on two popular benchmarks with attribute annotations demonstrate the effectiveness of our model and competitive performance compared with state-of-the-art algorithms.
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier
  • 188
    Publication Date: 2019
    Description: Publication date: January 2020. Source: Pattern Recognition, Volume 97. Author(s): Xin Wei, Hui Wang, Bryan Scotney, Huan Wan. Abstract: Face recognition has achieved great success owing to the fast development of deep neural networks in the past few years. Different loss functions can be used in a deep neural network, resulting in different performance. Most recently, some loss functions have been proposed which have advanced the state of the art. However, they cannot solve the problem of margin bias, which is present in class-imbalanced datasets having so-called long-tailed distributions. In this paper, we propose to solve the margin bias problem by setting a minimum margin for all pairs of classes. We present a new loss function, Minimum Margin Loss (MML), which is aimed at enlarging the margin of those over-close class centre pairs so as to enhance the discriminative ability of the deep features. MML, together with Softmax Loss and Centre Loss, supervises the training process to balance the margins of all classes irrespective of their class distributions. We implemented MML in Inception-ResNet-v1 and conducted extensive experiments on seven face recognition benchmark datasets: MegaFace, FaceScrub, LFW, SLLFW, YTF, IJB-B and IJB-C. Experimental results show that the proposed MML loss function leads to a new state of the art in face recognition, reducing the negative effect of margin bias.
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier
  • 189
    Publication Date: 2019
    Description: Parameterized complexity theory has led to a wide range of algorithmic breakthroughs within the last few decades, but the practicability of these methods for real-world problems is still not well understood. We investigate the practicability of one of the fundamental approaches of this field: dynamic programming on tree decompositions. Indisputably, this is a key technique in parameterized algorithms and modern algorithm design. Despite the enormous impact of this approach in theory, it still has very little influence on practical implementations. The reasons for this phenomenon are manifold; one of them is the simple fact that such an implementation requires a long chain of non-trivial tasks (such as computing the decomposition, preparing it, …). We provide an easy way to implement such dynamic programs that only requires the definition of the update rules. With this interface, dynamic programs for various problems, such as 3-coloring, can be implemented easily in about 100 lines of structured Java code. The theoretical foundation of the success of dynamic programming on tree decompositions is well understood thanks to Courcelle’s celebrated theorem, which states that every MSO-definable problem can be efficiently solved if a tree decomposition of small width is given. We seek to provide practical access to this theorem as well, by presenting a lightweight model checker for a small fragment of MSO1 (that is, we do not consider “edge-set-based” problems). This fragment is powerful enough to describe many natural problems, and our model checker turns out to be very competitive against similar state-of-the-art tools.
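To give a flavor of the update-rule style such an interface supports, here is the treewidth-1 special case: counting proper 3-colorings of a tree bottom-up (a sketch of ours, not the authors' Java interface):

```python
def count_3_colorings(adj, root=0):
    """ways[c] = number of proper 3-colorings of v's subtree with v
    colored c; each child contributes the sum of its own counts over
    the two colors different from c."""
    def dfs(v, parent):
        ways = [1, 1, 1]
        for u in adj[v]:
            if u != parent:
                child = dfs(u, v)
                for c in range(3):
                    ways[c] *= sum(child[d] for d in range(3) if d != c)
        return ways
    return sum(dfs(root, None))

# A path on three nodes has 3 * 2 * 2 = 12 proper 3-colorings.
print(count_3_colorings({0: [1], 1: [0, 2], 2: [1]}))  # → 12
```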
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI
  • 190
    Publication Date: 2019
    Description: Let V be a finite set of positive integers with sum equal to a multiple of the integer b. When does V have a partition into b parts so that all parts have equal sums? We develop algorithmic constructions which yield positive, albeit incomplete, answers for the following classes of set V, where n is a given positive integer: (1) an initial interval {a ∈ ℤ⁺ : a ≤ n}; (2) an initial interval of primes {p ∈ ℙ : p ≤ n}, where ℙ is the set of primes; (3) a divisor set {d ∈ ℤ⁺ : d | n}; (4) an aliquot set {d ∈ ℤ⁺ : d | n, d < n}. Open general questions and conjectures are included for each of these classes.
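    For experimenting with such questions on small sets, plain backtracking suffices. The paper's constructions are specialised to each class of set V; the sketch below is only a generic brute-force stand-in that searches for any equal-sum partition into b parts.

```python
def equal_sum_partition(values, b):
    # Try to split `values` into b parts with equal sums.
    # Returns a list of b lists, or None if no partition exists.
    total = sum(values)
    if b <= 0 or total % b:
        return None
    target = total // b
    vals = sorted(values, reverse=True)  # large items first prunes faster
    parts = [[] for _ in range(b)]
    sums = [0] * b

    def place(i):
        if i == len(vals):
            return True
        tried = set()  # skip parts with identical current sums (symmetry)
        for j in range(b):
            if sums[j] + vals[i] <= target and sums[j] not in tried:
                tried.add(sums[j])
                parts[j].append(vals[i])
                sums[j] += vals[i]
                if place(i + 1):
                    return True
                sums[j] -= vals[i]
                parts[j].pop()
        return False

    return parts if place(0) else None
```

    For the initial interval {1, …, 8} and b = 2 this finds two parts of sum 18 each; divisibility of the total by b is checked up front, mirroring the necessary condition in the problem statement.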
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI
  • 191
    Publication Date: 2019
    Description: The blockchain technique is becoming more and more popular due to its advantages, such as stability and its dispersed nature. Our idea builds on blockchain activity paradigms. Another important field is machine learning, which is increasingly used in practice. Unfortunately, training or retraining artificial neural networks is very time-consuming and requires high computing power. In this paper, we propose using a blockchain technique to train neural networks. This type of activity is important because of the possible search for initial network weights that lead to faster training through quicker gradient descent. We performed tests with much heavier calculations to show that such an approach is possible. However, this type of solution can also be used for less demanding calculations, i.e., only a few iterations of training, to find a better configuration of initial weights.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI
  • 192
    Publication Date: 2019
    Description: In this study, we address the problem of compaction of Church numerals. Church numerals are unary representations of natural numbers as lambda terms. We propose a novel scheme that decomposes a given natural number into an arithmetic expression using tetration, which enables us to obtain a compact representation of the lambda term that evaluates to the Church numeral of that number. For a natural number n, we prove that the size of the lambda term obtained by the proposed method is O((slog₂ n)(log n / log log n)). Moreover, we experimentally confirmed that the proposed method outperforms the binary representation of Church numerals on average when n is less than approximately 10,000.
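    The unary encoding under discussion is easy to mimic in ordinary functional style: the Church numeral for n is the higher-order function λf.λx.fⁿ(x). A minimal sketch of the encoding itself (not the paper's tetration-based compaction):

```python
def church(n):
    # Church numeral n: a function that applies f to x exactly n times.
    def numeral(f):
        def apply(x):
            for _ in range(n):
                x = f(x)
            return x
        return apply
    return numeral

def unchurch(c):
    # Recover the ordinary integer by counting applications of successor.
    return c(lambda k: k + 1)(0)
```

    The size of the naive term grows linearly in n, which is exactly the blow-up the paper's tetration-based decomposition is designed to avoid.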
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI
  • 193
    Publication Date: 2019
    Description: Publication date: December 2019. Source: Pattern Recognition, Volume 96. Author(s): Mahsa Taheri, Zahra Moslehi, Abdolreza Mirzaei, Mehran Safayani. Abstract: Measuring the distance between pairs of data points is a necessary step in numerous algorithms in machine learning, pattern recognition and data mining. From the local perspective, all existing supervised metric learning algorithms emphasise shrinking similar data points together and separating dissimilar ones within local neighborhoods. This enables learning a more appropriate distance metric for within-class multimodal data. In this article, a new supervised local metric learning method named Self-Adaptive Local Metric Learning Method (SA-LM²) is proposed. The contribution of this method is fivefold. First, learning an appropriate metric and defining the radius of the local neighborhood are integrated in a joint formulation. Second, unlike traditional approaches, SA-LM² learns the local neighborhood parameter automatically through its formulation; as a result, it is a parameter-free method that does not require any parameter tuning. Third, SA-LM² is formulated as a SemiDefinite Program (SDP) with a global convergence guarantee. Fourth, the method does not need the similar set S; the focus is on local areas’ data points and their separation from dissimilar ones. Finally, the results of SA-LM² are less influenced by noisy input data points than those of the compared global and local algorithms. Results obtained from different experiments indicate that this algorithm outperforms its counterparts.
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier
  • 194
    Publication Date: 2019
    Description: In the vehicle routing problem with simultaneous pickup and delivery (VRPSPD), customers demanding both delivery and pickup operations have to be visited exactly once by a single vehicle. In this work, we propose a fast randomized algorithm using a nearest-neighbor strategy to tackle an extension of the VRPSPD in which the fleet of vehicles is heterogeneous. This variant is an NP-hard problem, which in practice makes it impossible to solve large instances to proven optimality. To evaluate the proposal, we use benchmark instances from the literature and compare our results to those obtained by a state-of-the-art algorithm. Our approach delivers very competitive results, not only improving several of the known solutions but also running in a shorter time.
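    A nearest-neighbor construction for simultaneous pickup and delivery can be sketched as follows. The customer format (coordinates, delivery, pickup) and the feasibility rule — the vehicle leaves the depot carrying all deliveries of its route and must never exceed capacity — are our assumptions for illustration; the paper's randomized, heterogeneous-fleet algorithm is considerably more involved.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def feasible(route, customers, capacity):
    # Vehicle departs loaded with every delivery on the route; at each
    # stop the load drops by the delivery and grows by the pickup.
    load = sum(customers[c][2] for c in route)
    if load > capacity:
        return False
    for c in route:
        load += customers[c][3] - customers[c][2]
        if load > capacity:
            return False
    return True

def nearest_neighbor_routes(depot, customers, capacity):
    # customers: name -> (x, y, delivery, pickup); one capacity for all
    # vehicles (the heterogeneous-fleet case would vary it per route).
    unrouted = set(customers)
    routes = []
    while unrouted:
        route, pos = [], depot
        while True:
            for c in sorted(unrouted, key=lambda c: dist(pos, customers[c][:2])):
                if feasible(route + [c], customers, capacity):
                    route.append(c)
                    unrouted.discard(c)
                    pos = customers[c][:2]
                    break
            else:
                break  # no feasible extension: close this route
        routes.append(route)
    return routes
```

    Randomizing the choice among the few nearest feasible candidates, as the abstract suggests, turns this deterministic sketch into a multi-start heuristic.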
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI
  • 195
    Publication Date: 2019
    Description: Publication date: January 2020. Source: Pattern Recognition, Volume 97. Author(s): Zheng Ma, Jun Cheng, Dapeng Tao. Abstract: Wearable/portable brain-computer interfaces (BCIs) for long-term end use are a focus of recent BCI research. One challenge is how to update the BCI to meet changes in electroencephalography (EEG) signals, since resources are so limited that retraining traditional well-performing models, such as a support vector machine, is nearly impossible. To cope with this challenge, less demanding adaptive online learning can be considered. We investigated the adaptive projected subgradient method (APSM), which originates from the set-theoretic estimation formulation and the theory of projections onto convex sets. APSM provides a unifying framework for both adaptive classification and regression tasks. Coefficients of APSM are adjusted online as data arrive sequentially, with a regularization constraint imposed by projections onto a fixed closed ball. We extended the general APSM to a shrinkage form, where shrinking closed balls are used instead of the original fixed one, expecting a more controllable fading effect and better adaptability. The convergence of shrinkage APSM is proved. It is also demonstrated that as the shrinkage factor approaches 1, the limit point of shrinkage APSM approaches the optimal solution with the least norm, which can be especially beneficial for the generalization of the classifier. The performance of the proposed method was evaluated and compared with those of the general APSM, the incremental support vector machine, and the passive-aggressive algorithm in an event-related potential-based BCI experiment. Results showed the advantage of the proposed method over the others in both online classification performance and ease of tuning. Our study revealed the effectiveness of the proposed method for adaptive EEG classification, making it a promising tool for on-device training and updating of wearable/portable BCIs, as well as for application in other related fields, such as EEG-based biometrics.
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier
  • 196
    Publication Date: 2019
    Description: Nowadays, the amount of digitally available information has grown tremendously, with real-world data graphs reaching millions or even billions of vertices. Hence, community detection, where groups of vertices are formed according to a well-defined similarity measure, has never been more essential, affecting a vast range of scientific fields such as bioinformatics, sociology, discrete mathematics, nonlinear dynamics, digital marketing, and computer science. Even though an impressive amount of research has been published to tackle this NP-hard class of problem, the existing methods and algorithms have largely proven inefficient and severely unscalable. In this regard, the purpose of this manuscript is to combine the network topology properties expressed by loose similarity and local edge betweenness, a newly proposed alternative to Girvan–Newman’s edge betweenness measure, with intrinsic user content information, in order to introduce a novel and highly distributed hybrid community detection methodology. The proposed approach has been thoroughly tested on various real social graphs, extensively compared to other classic divisive community detection algorithms that serve as baselines, and proven in practice to be exceptionally scalable, highly efficient, and adequately accurate in terms of revealing the underlying network hierarchy.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI
  • 197
    Publication Date: 2019
    Description: Publication date: January 2020. Source: Pattern Recognition, Volume 97. Author(s): Franco Manessi, Alessandro Rozza, Mario Manzo. Abstract: Many classification tasks require managing structured data, which are usually modeled as graphs. Moreover, these graphs can be dynamic, meaning that their vertices/edges may change over time. The goal is to exploit existing neural network architectures to model datasets that are best represented with graph structures that change over time. To the best of the authors’ knowledge, this task has not been addressed using these kinds of architectures. Two novel approaches are proposed, which combine Long Short-Term Memory networks and Graph Convolutional Networks to learn long short-term dependencies together with graph structure. The advantage of the proposed methods is confirmed by results on four real-world datasets: an increase of up to 12 percentage points in accuracy and F1 score for vertex-based semi-supervised classification, and up to 2 percentage points for graph-based supervised classification.
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier
  • 198
    Publication Date: 2019
    Description: Herein, robust pole-placement controller design for linear uncertain discrete-time dynamic systems is addressed. The adopted approach uses so-called “D-regions”, in which the closed-loop system poles are required to lie. The discrete-time pole regions corresponding to a prescribed damping of the resulting closed-loop system are studied. The key issue is to determine an appropriate convex approximation to the originally non-convex discrete-time pole region, so that numerically efficient robust controller design algorithms based on Linear Matrix Inequalities (LMI) can be used. Several alternatives for relatively simple inner approximations and their corresponding LMI descriptions are presented. The developed LMI region for prescribed damping can be arbitrarily combined with other LMI pole limitations (e.g., stability degree). Simple algorithms to calculate the matrices for the LMI representation of the proposed convex pole regions are provided in a concise way. The results and their use in robust controller design are illustrated on a case study of a laboratory magnetic levitation system.
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI
  • 199
    Publication Date: 2019
    Description: The objective of the cell suppression problem (CSP) is to protect sensitive cell values in tabular data in the presence of linear relations concerning marginal sums. Previous algorithms for solving CSPs ensure that every sensitive cell has enough uncertainty in its value, based on the interval width of all possible values. However, we find that every deterministic CSP algorithm is vulnerable to an adversary who possesses knowledge of that algorithm. We devise a matching attack scheme that narrows down the ranges of sensitive cell values by matching the suppression pattern of an original table with that of each candidate table. Our experiments show that the actual ranges of sensitive cell values are significantly narrower than those assumed by previous CSP algorithms.
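    The interval-width notion of uncertainty can be made concrete on a single row with a published marginal total: a suppressed cell can take exactly those values consistent with the total and the other suppressed cells’ bounds. A brute-force sketch (the function name and the integer-bounds format are ours, for illustration only):

```python
from itertools import product

def feasible_values(total, known, bounds, target):
    # total: published row total; known: list of visible cell values;
    # bounds: {index: (lo, hi)} integer bounds for each suppressed cell;
    # target: index of the sensitive suppressed cell of interest.
    rest = total - sum(known)          # mass left for suppressed cells
    others = [i for i in bounds if i != target]
    lo, hi = bounds[target]
    values = set()
    # Enumerate all assignments to the other suppressed cells and keep
    # target values that complete the row consistently.
    for combo in product(*(range(bounds[i][0], bounds[i][1] + 1)
                           for i in others)):
        v = rest - sum(combo)
        if lo <= v <= hi:
            values.add(v)
    return sorted(values)
```

    The width of the returned set is the uncertainty interval a CSP algorithm tries to keep large; the paper's matching attack shows the effective set can be much smaller once the suppression algorithm itself is known.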
    Electronic ISSN: 1999-4893
    Topics: Computer Science
    Published by MDPI
  • 200
    Publication Date: 2019
    Description: Publication date: January 2020. Source: Pattern Recognition, Volume 97. Author(s): Ying Liu, Konstantinos Tountas, Dimitris A. Pados, Stella N. Batalama, Michael J. Medley. Abstract: High-dimensional data usually exhibit intrinsic low-rank structure. With the tremendous amount of streaming data generated by ubiquitous sensors in the world of the Internet of Things, fast detection of such low-rank patterns is of utmost importance to a wide range of applications. In this work, we present an L1-subspace tracking method to capture the low-rank structure of streaming data. The method is based on L1-norm principal-component analysis (L1-PCA) theory, which offers outlier resistance in subspace calculation. The proposed method updates the L1-subspace as new data are acquired by sensors. In each time slot, the conformity of each datum is measured against the L1-subspace calculated in the previous time slot and used to weigh the datum. Iterative weighted L1-PCA is then executed through a refining function. The superiority of the proposed L1-subspace tracking method over existing approaches is demonstrated through experimental studies in various application fields.
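    For small sample counts, the rank-1 L1 principal component can be computed exactly via the known sign-vector characterisation: the maximiser of Σᵢ|xᵢᵀq| over unit vectors q is Xb/‖Xb‖₂ for the binary sign vector b that maximises ‖Xb‖₂. The brute-force sketch below (exponential in the number of samples) is for illustration only; the paper's method tracks the subspace online instead of recomputing it.

```python
import numpy as np
from itertools import product

def l1_pca_rank1(X):
    # X: D x N matrix whose columns are data points. Returns the unit
    # vector q maximising sum_i |x_i^T q| (rank-1 L1 principal
    # component), found by exhausting all 2^N sign vectors b.
    N = X.shape[1]
    best_norm, best_q = -1.0, None
    for b in product((-1.0, 1.0), repeat=N):
        v = X @ np.array(b)
        n = np.linalg.norm(v)
        if n > best_norm:
            best_norm, best_q = n, v / n
    return best_q
```

    Unlike the L2 principal component, this direction is not dominated by a single outlying sample, which is the outlier-resistance property the abstract exploits.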
    Print ISSN: 0031-3203
    Electronic ISSN: 1873-5142
    Topics: Computer Science
    Published by Elsevier