Inferential Tasks as an Evaluation Technique for Visualization

Abstract
Designing suitable tasks for visualization evaluation remains challenging. Traditional evaluation techniques commonly rely on 'low-level' or 'open-ended' tasks to assess the efficacy of a proposed visualization; however, nontrivial trade-offs exist between the two. Low-level tasks allow for robust quantitative evaluations, but are not indicative of the complex usage of a visualization. Open-ended tasks, while excellent for insight-based evaluations, are typically unstructured and require time-consuming interviews. Bridging this gap, we propose inferential tasks: a complementary task category based on inferential learning in psychology. Inferential tasks prompt users to form and validate their own findings with a visualization, producing quantitative evaluation data. We demonstrate the use of inferential tasks through a validation experiment on two well-known visualization tools.
CCS Concepts: Human-centered computing → Information visualization; Visualization design and evaluation methods

        
@inproceedings{10.2312:evs.20221086,
  booktitle = {EuroVis 2022 - Short Papers},
  editor    = {Agus, Marco and Aigner, Wolfgang and Hoellt, Thomas},
  title     = {{Inferential Tasks as an Evaluation Technique for Visualization}},
  author    = {Suh, Ashley and Mosca, Ab and Robinson, Shannon and Pham, Quinn and Cashman, Dylan and Ottley, Alvitta and Chang, Remco},
  year      = {2022},
  publisher = {The Eurographics Association},
  isbn      = {978-3-03868-184-7},
  doi       = {10.2312/evs.20221086}
}