WorldCIST'15 - 3rd World Conference on Information Systems and Technologies

Question Answering Track Evaluation in TREC, CLEF and NTCIR

Question Answering (QA) systems are put forward as a real alternative to Information Retrieval systems, as they provide the user with a fast and comprehensible answer to his or her information need. The principal campaigns for the evaluation of Information Retrieval have included specific tracks devoted to the development and evaluation of this type of system. This study analyzes the evaluation measures used in the QA tracks of the three principal Information Retrieval evaluation campaigns: the Text REtrieval Conference (TREC), the Conference and Labs of the Evaluation Forum (CLEF), and the NTCIR Conference. We identify the tasks and specific labs created in each QA track, the types of evaluation questions used, and the evaluation measures applied in the competitions analyzed.
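As an illustration of the kind of measure the study surveys, below is a minimal Python sketch (not part of the original paper) of Mean Reciprocal Rank (MRR), the official measure of the early TREC QA tracks: each question contributes the reciprocal of the rank at which its first correct answer appears, and these scores are averaged over all questions. The sample questions and answers are hypothetical.

def reciprocal_rank(ranked_answers, accepted):
    """1/rank of the first accepted answer in the ranked list, 0.0 if none."""
    for rank, answer in enumerate(ranked_answers, start=1):
        if answer in accepted:
            return 1.0 / rank
    return 0.0

def mean_reciprocal_rank(runs):
    """Average reciprocal rank over all evaluated questions."""
    return sum(reciprocal_rank(answers, gold) for answers, gold in runs) / len(runs)

# Hypothetical evaluation data: (ranked system answers, accepted answers).
runs = [
    (["Paris", "Lyon"], {"Paris"}),   # first answer correct:  RR = 1.0
    (["1967", "1969"], {"1969"}),     # second answer correct: RR = 0.5
    (["Nile"], {"Amazon"}),           # no correct answer:     RR = 0.0
]
print(f"MRR = {mean_reciprocal_rank(runs):.3f}")  # prints: MRR = 0.500

A reciprocal-rank score rewards systems that place a correct answer near the top, which is why it suited the early TREC setting where systems returned a short ranked list of candidate answers per question.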

Author(s):

María-Dolores Olvera-Lobo    
University of Granada, Department of Information and Communication & CSIC, Unidad Asociada Grupo SCImago
Spain

Juncal Gutiérrez-Artacho    
University of Granada, Department of Translation and Interpreting
Spain

 
