Accession Number:
ADA581517
Title:
UT Austin in the TREC 2012 Crowdsourcing Track's Image Relevance Assessment Task
Descriptive Note:
Conference paper
Corporate Author:
TEXAS UNIV AT AUSTIN SCHOOL OF INFORMATION
Report Date:
2012-11-01
Pagination or Media Count:
13
Abstract:
We describe our submission to the Image Relevance Assessment Task (IRAT) at the 2012 Text REtrieval Conference (TREC) Crowdsourcing Track. Four aspects distinguish our approach: (1) an interface for cohesive, efficient topic-based relevance judging and reporting of judgment confidence; (2) a variant of Welinder and Perona's method for online crowdsourcing [17], inferring the quality of judgments and judges during data collection in order to dynamically optimize data collection; (3) a completely unsupervised approach, using no labeled data for either training or tuning; and (4) automatic generation of individualized error reports for each crowd worker, supporting transparent assessment and education of workers. Our system was built start-to-finish in two weeks, and we collected approximately 44,000 labels for about $40 US.
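The abstract's second aspect, inferring the quality of judgments and judges during data collection, can be illustrated with a minimal iterative aggregation sketch. This is not the authors' actual system or Welinder and Perona's full model; it is an assumed simplification in the same spirit, alternating between weighted voting over items and re-estimating each worker's accuracy. The function name `aggregate` and the label format are illustrative assumptions.

```python
# Minimal sketch of crowd-label aggregation with per-worker quality
# estimates, in the spirit of (but not identical to) Welinder & Perona's
# approach. All names and the update rule are illustrative assumptions.

from collections import defaultdict

def aggregate(labels, n_iters=10):
    """labels: list of (worker_id, item_id, vote) with vote in {0, 1}.
    Returns (item_estimates, worker_accuracy)."""
    # Start by trusting every worker equally.
    acc = defaultdict(lambda: 0.7)
    items = defaultdict(list)
    for w, i, v in labels:
        items[i].append((w, v))

    est = {}
    for _ in range(n_iters):
        # Step 1: accuracy-weighted vote per item.
        for i, votes in items.items():
            score = sum(acc[w] if v == 1 else 1 - acc[w] for w, v in votes)
            est[i] = 1 if score / len(votes) > 0.5 else 0
        # Step 2: re-estimate each worker's accuracy against current estimates.
        correct, total = defaultdict(float), defaultdict(float)
        for w, i, v in labels:
            total[w] += 1
            if v == est[i]:
                correct[w] += 1
        for w in total:
            # Smoothing keeps new workers from collapsing to 0 or 1.
            acc[w] = (correct[w] + 1) / (total[w] + 2)
    return est, dict(acc)
```

Because worker accuracies feed back into the vote weights, unreliable workers are progressively down-weighted without any labeled training data, matching the unsupervised setting described in the abstract.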
Distribution Statement:
APPROVED FOR PUBLIC RELEASE