Education Publications

Document Type

Article

Publication Date

2016

Journal

International Journal of Computational Linguistics and Applications

Volume

7

Issue

2

Abstract

Several methods and tools are available for terminology extraction, but the quality of the extracted terms is not always high, so assessing that quality is an important part of the task. In this paper, we propose and make available a tool for annotating the correctness of terms extracted by three term-extraction tools. The tool facilitates term annotation through a domain-specific dictionary, a set of filters, and an annotation memory, and supports post-hoc evaluation. We present a study in which two human judges used the tool to annotate extracted terms. Their annotations were then analyzed to evaluate the effectiveness of the term-extraction tools by means of precision, recall, and F-score, and to calculate the inter-annotator agreement rate.
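
For reference, the sketch below illustrates the evaluation measures named in the abstract. It is a minimal, illustrative implementation rather than the authors' code: the abstract does not specify which agreement coefficient was used, so Cohen's kappa, a common choice for two annotators, is assumed here, and the function names and toy data are hypothetical.

from collections import Counter

def precision_recall_f1(extracted, gold):
    # Compare an extractor's term list against a gold-standard term set.
    extracted, gold = set(extracted), set(gold)
    tp = len(extracted & gold)  # correctly extracted terms
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

def cohens_kappa(labels_a, labels_b):
    # Chance-corrected agreement between two annotators' labels
    # (e.g. "correct"/"incorrect" judgments on the same extracted terms).
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    expected = sum(counts_a[k] * counts_b[k] for k in counts_a) / n ** 2
    return (observed - expected) / (1 - expected) if expected < 1 else 1.0

# Toy data, purely illustrative.
gold = {"neural network", "language model", "word embedding"}
extracted = ["neural network", "word embedding", "the model"]
print(precision_recall_f1(extracted, gold))  # (0.666..., 0.666..., 0.666...)

judge_1 = ["correct", "correct", "incorrect", "correct"]
judge_2 = ["correct", "incorrect", "incorrect", "correct"]
print(cohens_kappa(judge_1, judge_2))        # 0.5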
