Systems analytic approach to the evaluation of information retrieval (IR)

Date

2000

Authors

Kagolovsky, Yuri


Abstract

There is currently high demand for improving the information retrieval (IR) of documents from a variety of electronic resources, with the goal of increasing user satisfaction. While the improvement of IR has historically been concerned mainly with the performance of search engines, new directions in research lean toward more user-centered evaluation of IR. At the same time, the variety of available evaluation methodologies, together with terminological differences, presents difficulties for planning, analysis, and comparison of results in IR research. This thesis examines information retrieval and its evaluation methodologies using the systems approach. Problems are identified based on an overview of IR, the terminology of the field, and IR evaluation. Several approaches to IR evaluation, including the traditional Cranfield paradigm as well as newer user-oriented paradigms, are presented. Relevance, a central concept of information science, and evaluation measures based on this concept are discussed. The systems approach to IR and its evaluation permits the identification of the components of IR, their boundaries, structure, functions, and interactions. A new definition of information retrieval and several models of IR are proposed. The systems approach and the models of IR presented here create a basis for introducing a common terminology, as well as new approaches to IR evaluation. A critique of the relevance-based measures of recall and precision is presented, along with an alternative evaluation methodology that uses methods of cognitive psychology to capture and compare semantics. Differences between the proposed and currently used evaluation methods are discussed, and directions for future research are identified.
