Toward a Structured Theoretical Framework for the Evaluation of Generative AI-based Visualizations

Date: 2024
Publisher: The Eurographics Association
Abstract
The automatic generation of visualizations is a long-standing task that, over the years, has attracted increasing interest from the research and practitioner communities. Recently, large language models (LLMs) have become an interesting option for supporting generative tasks related to visualization, demonstrating promising initial results. At the same time, several pitfalls make their usage more challenging than expected: the multiple ways of instructing an LLM to generate the desired result, the different perspectives guiding the generation (code-based, image-based, grammar-based), and the presence of hallucinations even in the visualization generation task. Following similar initiatives for benchmarking LLMs, this paper explores the problem of modeling the evaluation of a visualization generated by an LLM. We propose a theoretical evaluation stack, EvaLLM, that decomposes the evaluation effort into its atomic components, characterizes their nature, and provides an overview of how to implement them. A use case on the Llama2-70b model shows the benefits of EvaLLM and illustrates interesting results on current state-of-the-art LLM-generated visualizations. The materials are available at this GitHub repository: https://github.com/lucapodo/evallm_llama2_70b.git
CCS Concepts: Human-centered computing → Visualization design and evaluation methods

@inproceedings{10.2312:eurova.20241118,
  booktitle = {EuroVis Workshop on Visual Analytics (EuroVA)},
  editor    = {El-Assady, Mennatallah and Schulz, Hans-J{\"o}rg},
  title     = {{Toward a Structured Theoretical Framework for the Evaluation of Generative AI-based Visualizations}},
  author    = {Podo, Luca and Ishmal, Muhammad and Angelini, Marco},
  year      = {2024},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-253-0},
  DOI       = {10.2312/eurova.20241118}
}