ALBERT

All Library Books, journals and Electronic Records Telegrafenberg


  • 1
    Electronic Resource
    Springer
    Computers and the Humanities 34 (2000), pp. 79-84
    ISSN: 1572-8412
    Keywords: evaluation ; ambiguity resolution ; WSD ; inter-annotator agreement
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Media Resources and Communication Sciences, Journalism
    Notes: Abstract: We present methods for evaluating human and automatic taggers that extend current practice in three ways. First, we show how to evaluate taggers that assign multiple tags to each test instance, even if they do not assign probabilities. Second, we show how to accommodate a common property of manually constructed "gold standards" that are typically used for objective evaluation, namely that there is often more than one correct answer. Third, we show how to measure performance when the set of possible tags is tree-structured in an IS-A hierarchy. To illustrate how our methods can be used to measure inter-annotator agreement, we show how to compute the kappa coefficient over hierarchical tag sets.
    Type of Medium: Electronic Resource
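The abstract refers to computing the kappa coefficient as a measure of inter-annotator agreement. As background, here is a minimal sketch of the standard (flat, non-hierarchical) Cohen's kappa for two annotators; the paper's extension to hierarchical tag sets is not reproduced here, and the function name and example labels are illustrative assumptions, not taken from the record.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators labeling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is the agreement expected by chance from each annotator's
    marginal label distribution. (Illustrative helper, not from the paper.)
    """
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items both annotators tagged identically.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement from the two annotators' label frequencies.
    counts_a = Counter(labels_a)
    counts_b = Counter(labels_b)
    p_e = sum(counts_a[t] * counts_b.get(t, 0) for t in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical word-sense tags from two annotators over six instances.
a = ["river", "finance", "river", "river", "finance", "river"]
b = ["river", "finance", "finance", "river", "finance", "river"]
print(round(cohens_kappa(a, b), 3))  # → 0.667
```

Here observed agreement is 5/6 while chance agreement is 0.5, giving kappa = 2/3; values near 0 indicate agreement no better than chance.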