DOI: 10.5555/1889174
CLEF'10: Proceedings of the 2010 international conference on Multilingual and multimodal information access evaluation: cross-language evaluation forum
2010 Proceedings
Publisher:
  • Springer-Verlag
  • Berlin, Heidelberg
Conference:
Padua, Italy, September 20–23, 2010
ISBN:
978-3-642-15997-8
Published:
20 September 2010
Sponsors:
XRCE, Information Retrieval Facility, COFRIDIP, University of Padua

Abstract

No abstract available.

SECTION: Keynote addresses
Article
IR between science and engineering, and the role of experimentation
pp 1

Evaluation has always played a major role in IR research, as a means of judging the quality of competing models. Lately, however, we have seen an over-emphasis on experimental results, thus favoring engineering approaches aiming at tuning ...

Article
Retrieval evaluation in practice
pp 2

Nowadays, most research on retrieval evaluation is about comparing different systems to determine which is the best one, using a standard document collection and a set of queries with relevance judgements, such as TREC. Retrieval quality baselines are ...

SECTION: Resources, tools, and methods
Article
A dictionary- and corpus-independent statistical lemmatizer for information retrieval in low resource languages
pp 3–14

We present a dictionary- and corpus-independent statistical lemmatizer StaLe that deals with the out-of-vocabulary (OOV) problem of dictionary-based lemmatization by generating candidate lemmas for any inflected word forms. StaLe can be applied with ...
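The core idea the abstract describes — generating ranked candidate lemmas for out-of-vocabulary word forms — can be sketched with suffix-rewrite rules. This is a minimal illustration, not StaLe's actual model: the rules and confidence scores below are invented for the example.

```python
# Toy candidate-lemma generator for OOV word forms, in the spirit of a
# statistical lemmatizer. Rules and confidences are illustrative only.

RULES = [
    # (inflected suffix, lemma suffix, confidence)
    ("ies", "y", 0.9),   # e.g. "queries" -> "query"
    ("es", "e", 0.5),    # e.g. "languages" -> "language"
    ("es", "", 0.4),     # e.g. "indexes" -> "index"
    ("s", "", 0.8),      # e.g. "words" -> "word"
]

def candidate_lemmas(word):
    """Return (lemma, confidence) candidates, best first."""
    candidates = [(word, 0.3)]  # the form itself may already be a lemma
    for suffix, replacement, conf in RULES:
        if word.endswith(suffix) and len(word) > len(suffix):
            candidates.append((word[: -len(suffix)] + replacement, conf))
    return sorted(candidates, key=lambda c: -c[1])

print(candidate_lemmas("queries")[0])  # -> ('query', 0.9)
```

A retrieval system can index all candidates above some confidence threshold, trading a little precision for OOV coverage.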

Article
A new approach for cross-language plagiarism analysis
pp 15–26

This paper presents a new method for cross-language plagiarism analysis. Our task is to detect the plagiarized passages in the suspicious documents and their corresponding fragments in the source documents. We propose a plagiarism detection method ...

Article
Creating a Persian-English comparable corpus
pp 27–39

Multilingual corpora are valuable resources for cross-language information retrieval and are available in many language pairs. However, the Persian language does not have rich multilingual resources, due to some of its special features and difficulties in ...

SECTION: Experimental collections and datasets (1)
Article
Validating query simulators: an experiment using commercial searches and purchases
pp 40–51

We design and validate simulators for generating queries and relevance judgments for retrieval system evaluation. We develop a simulation framework that incorporates existing and new simulation strategies. To validate a simulator, we assess whether ...

Article
Using parallel corpora for multilingual (multi-document) summarisation evaluation
pp 52–63

We present a method for the evaluation of multilingual multi-document summarisation that saves precious annotation time and makes the evaluation results directly comparable across languages. The approach is based on the manual ...

SECTION: Experimental collections and datasets (2)
Article
MapReduce for information retrieval evaluation: "let's quickly test this on 12 TB of data"
pp 64–69

We propose to use MapReduce to quickly test new retrieval approaches on a cluster of machines by sequentially scanning all documents. We present a small case study in which we use a cluster of 15 low cost machines to search a web crawl of 0.5 billion ...
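The approach the abstract describes — testing a retrieval idea by sequentially scanning every document — maps naturally onto map/reduce. A toy in-process sketch (the collection, query, and scoring are made up; in a real MapReduce job the map phase runs per shard across the cluster and the reduce phase merges partial score tables):

```python
from collections import Counter
from functools import reduce

# Hypothetical tiny collection and query for illustration.
docs = {
    "d1": "cross language retrieval evaluation",
    "d2": "retrieval evaluation with mapreduce",
    "d3": "multilingual information access",
}
query = {"retrieval", "evaluation"}

def map_doc(item):
    """Map phase: score one document against the query."""
    doc_id, text = item
    return Counter({doc_id: len(query & set(text.split()))})

def reduce_scores(acc, partial):
    """Reduce phase: merge partial score tables."""
    acc.update(partial)
    return acc

scores = reduce(reduce_scores, map(map_doc, docs.items()), Counter())
print(scores.most_common(2))  # top-scoring documents
```

The appeal for evaluation research is that the "index" is just the raw collection, so a new scoring function can be tried without rebuilding anything.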

Article
Which log for which information? Gathering multilingual data from different log file types
pp 70–81

In this paper, a comparative analysis of different log file types and their potential for gathering information about user behavior in a multilingual information system is presented. It starts with a discussion of potential questions to be answered in ...

SECTION: Evaluation methodologies and metrics (1)
Article
Examining the robustness of evaluation metrics for patent retrieval with incomplete relevance judgements
pp 82–93

Recent years have seen a growing interest in research into patent retrieval. One of the key issues in conducting information retrieval (IR) research is meaningful evaluation of the effectiveness of the retrieval techniques applied to the task under ...

Article
On the evaluation of entity profiles
pp 94–99

Entity profiling is the task of identifying and ranking descriptions of a given entity. The task may be viewed as one where the descriptions being sought are terms that need to be selected from a knowledge source (such as an ontology or thesaurus). In ...

SECTION: Evaluation methodologies and metrics (2)
Article
Evaluating information extraction
pp 100–111

The issue of how to experimentally evaluate information extraction (IE) systems has received hardly any satisfactory solution in the literature. In this paper we propose a novel evaluation model for IE and argue that, among others, it allows (i) a ...

Article
Tie-breaking bias: effect of an uncontrolled parameter on information retrieval evaluation
pp 112–123

We consider Information Retrieval evaluation, especially at TREC with the trec_eval program. It appears that systems obtain scores regarding not only the relevance of retrieved documents, but also according to document names in case of ties (i.e., when ...
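The effect the abstract describes is easy to reproduce: when two documents share a score, a secondary sort on the document name decides the rank, so merely renaming a document changes the measured effectiveness. A toy illustration (scores, names, and the tie-breaking rule are illustrative of trec_eval-style behavior, not taken from the paper):

```python
# Toy illustration of tie-breaking bias: identical scores, different
# document names, different precision at rank 1.

def rank(run):
    # Descending score, then document name -- the uncontrolled
    # parameter that decides ties in trec_eval-style evaluation.
    return sorted(run, key=lambda d: (-d[1], d[0]))

relevant = {"doc_b"}
run1 = [("doc_a", 0.7), ("doc_b", 0.7)]  # non-relevant doc named "doc_a"
run2 = [("doc_z", 0.7), ("doc_b", 0.7)]  # same scores, doc renamed "doc_z"

p_at_1 = lambda run: int(rank(run)[0][0] in relevant)
print(p_at_1(run1), p_at_1(run2))  # -> 0 1
```

Both runs assign exactly the same scores, yet P@1 flips from 0 to 1 because the name of the tied non-relevant document changed.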

Article
Automated component-level evaluation: present and future
pp 124–135

Automated component-level evaluation of information retrieval (IR) is the main focus of this paper. We present a review of the current state of web-based and component-level evaluation. Based on these systems, propositions are made for a comprehensive ...

SECTION: Panels
Article
A PROMISE for experimental evaluation
pp 140–144

Participative Research laboratory for Multimedia and Multilingual Information Systems Evaluation (PROMISE) is a Network of Excellence, starting in conjunction with this first independent CLEF 2010 conference, and designed to support and develop the ...

Contributors
  • University of Padua
  • University of Padua
  • Institute of Information Science and Technologies "Alessandro Faedo"
  • University of Amsterdam
  • Dublin City University
