Touché Task 3: Image Retrieval for Arguments

Synopsis

  • Task: Given a controversial topic and a collection of web documents with images, retrieve for each stance (pro/con) images that show support for that stance.
  • Input: [collection] [topics] [training judgements] (coming soon)
  • Submission: [tira] (coming soon)

Task

The goal of Task 3 is to support users in getting an overview of opinions on controversial topics. Given a controversial topic, the task is to retrieve images (from web pages) for each stance (pro/con) that show support for that stance.

Data

This task uses a focused crawl of at least 10,000 images (and their associated web pages) as the document collection. Systems are evaluated using Touché topics 1–50.

More information coming soon.

Evaluation

Systems are evaluated by precision: the ratio of relevant images among the 20 retrieved images per topic, that is, 10 images for each stance.
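
For illustration, here is a minimal Python sketch of how such a precision score could be computed for a single topic. The image IDs, judgments, and variable names below are made up; this is not the official evaluation script.

  # Illustrative only: not the official Touché evaluation script.
  # Image IDs, judgments, and variable names are placeholders.

  def precision(retrieved, relevant):
      """Ratio of judged-relevant items among the retrieved items."""
      if not retrieved:
          return 0.0
      return sum(1 for item in retrieved if item in relevant) / len(retrieved)

  # Hypothetical results and judgments for one topic (10 images per stance expected).
  retrieved = {"PRO": ["I0001", "I0002", "I0003"], "CON": ["I0101", "I0102"]}
  relevant = {"PRO": {"I0002", "I0003", "I0200"}, "CON": {"I0101"}}

  # Precision over all retrieved images of the topic, both stances combined.
  pooled_retrieved = [(s, i) for s in ("PRO", "CON") for i in retrieved[s]]
  pooled_relevant = {(s, i) for s in ("PRO", "CON") for i in relevant[s]}
  print(precision(pooled_retrieved, pooled_relevant))  # 3 relevant among 5 retrieved -> 0.6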

Submission

We encourage participants to use TIRA for their submissions to allow for better reproducibility (see the Quickstart section below). Email submission is allowed as a fallback. For each topic and stance, include 10 retrieved images. Each team can submit up to 5 different runs.

The submission format for the task follows the standard TREC run format, but uses the second column to indicate the stance:

qid stance doc rank score tag

With:

  • qid: The topic number.
  • stance: The stance, must be "PRO" or "CON".
  • doc: The ID of the image returned by your system for the topic qid and stance at the given rank.
  • rank: The rank of the image in the result list for the topic and stance, must be an integer from 1 to 10.
  • score: The score (integer or floating point) that generated the ranking. Not used in the evaluation.
  • tag: A tag that identifies your group and the method you used to produce the run.
The fields should be separated by whitespace.
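
For illustration, a minimal Python sketch that writes a run file in this format. The topic number, image IDs, scores, and the tag myGroup-method1 are placeholders to be replaced with your own results.

  # Illustrative sketch of writing a run file in the format described above.
  # Topic numbers, image IDs, scores, and the tag "myGroup-method1" are placeholders.
  results = {
      (1, "PRO"): [("I0001", 14.2), ("I0002", 12.7)],  # (image ID, score), best first
      (1, "CON"): [("I0101", 13.9), ("I0102", 11.4)],
  }

  with open("run.txt", "w") as run_file:
      for (qid, stance), ranked_images in sorted(results.items()):
          for rank, (doc, score) in enumerate(ranked_images, start=1):
              # qid stance doc rank score tag, separated by whitespace,
              # e.g. "1 PRO I0001 1 14.2 myGroup-method1"
              run_file.write(f"{qid} {stance} {doc} {rank} {score} myGroup-method1\n")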

TIRA Quickstart

Coming soon

Task Committee