Browsing by Author "Moritz, Dominik"
Now showing 1 - 3 of 3
Item
Design Patterns and Trade-Offs in Responsive Visualization for Communication (The Eurographics Association and John Wiley & Sons Ltd., 2021)
Kim, Hyeok; Moritz, Dominik; Hullman, Jessica; Borgo, Rita; Marai, G. Elisabeta; Landesberger, Tatiana von
Increased access to mobile devices motivates the need to design communicative visualizations that are responsive to varying screen sizes. However, relatively little design guidance or tooling is currently available to authors. We contribute a detailed characterization of responsive visualization strategies in communication-oriented visualizations, identifying 76 total strategies by analyzing 378 pairs of large screen (LS) and small screen (SS) visualizations from online articles and reports. Our analysis distinguishes between the Targets of responsive visualization, referring to what elements of a design are changed, and Actions, representing how targets are changed. We identify key trade-offs related to authors' need to maintain graphical density, referring to the amount of information per pixel, while also maintaining the "message", or intended takeaways, for users of a visualization. We discuss implications of our findings for future visualization tool design to support responsive transformation of visualization designs, including requirements for automated recommenders for communication-oriented responsive visualizations.

Item
How Accessible is my Visualization? Evaluating Visualization Accessibility with Chartability (The Eurographics Association and John Wiley & Sons Ltd., 2022)
Elavsky, Frank; Bennett, Cynthia; Moritz, Dominik; Borgo, Rita; Marai, G. Elisabeta; Schreck, Tobias
Novices and experts have struggled to evaluate the accessibility of data visualizations because there are no common shared guidelines across the environments, platforms, and contexts in which data visualizations are authored. Between non-specific standards bodies like WCAG, emerging research, and guidelines from specific communities of practice, it is hard to organize knowledge on how to evaluate accessible data visualizations. We present Chartability, a set of heuristics synthesized from these various sources that enables designers, developers, researchers, and auditors to evaluate data-driven visualizations and interfaces for visual, motor, vestibular, neurological, and cognitive accessibility. In this paper, we outline our process of making a set of heuristics and accessibility principles for Chartability and highlight key features of the auditing process. Working with participants on real projects, we found that data practitioners with a novice level of accessibility skills were more confident and found auditing easier after using Chartability. Expert accessibility practitioners were eager to integrate Chartability into their own work. Reflecting on Chartability's development and the preliminary user evaluation, we discuss the trade-offs of open projects and of high-risk evaluations like auditing projects in the wild, and we challenge future research at the intersection of visualization and accessibility to consider the broad intersections of disabilities.

Item
Leveraging Analysis History for Improved In Situ Visualization Recommendation (The Eurographics Association and John Wiley & Sons Ltd., 2022)
Epperson, Will; Lee, Doris Jung-Lin; Wang, Leijie; Agarwal, Kunal; Parameswaran, Aditya G.; Moritz, Dominik; Perer, Adam; Borgo, Rita; Marai, G. Elisabeta; Schreck, Tobias
Existing visualization recommendation systems commonly rely on a single snapshot of a dataset to suggest visualizations to users. However, exploratory data analysis involves a series of related interactions with a dataset over time rather than one-off analytical steps. We present Solas, a tool that tracks the history of a user's data analysis, models their interest in each column, and uses this information to provide visualization recommendations, all within the user's native analytical environment. Recommending with analysis history improves visualizations in three primary ways: task-specific visualizations use the provenance of data to provide sensible encodings for common analysis functions; aggregated history is used to rank visualizations by our model of a user's interest in each column; and column data types are inferred based on applied operations. We present a usage scenario and a user evaluation demonstrating how leveraging analysis history improves in situ visualization recommendations on real-world analysis tasks.
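
To make the history-based idea in the Solas entry above concrete, here is a minimal sketch that records which columns each analysis operation touches, scores per-column interest from that aggregated history, and ranks candidate charts by the interest of the columns they encode. All names (AnalysisHistory, rank_visualizations, the operation labels) are assumptions for illustration; this is not the Solas implementation or its actual interest model, which are described in the paper.

```python
# Illustrative sketch of history-based visualization ranking (hypothetical,
# not the Solas implementation): count column usage across operations,
# derive a per-column interest score, and order candidate charts by it.
from collections import Counter
from typing import Iterable

class AnalysisHistory:
    """Records the columns involved in each analysis operation."""
    def __init__(self) -> None:
        self._column_uses: Counter[str] = Counter()

    def record(self, operation: str, columns: Iterable[str]) -> None:
        # Every operation (filter, groupby, plot, ...) counts toward the
        # interest of the columns it touches.
        for col in columns:
            self._column_uses[col] += 1

    def interest(self, column: str) -> float:
        # Normalized usage frequency; 0.0 for columns never touched.
        total = sum(self._column_uses.values())
        return self._column_uses[column] / total if total else 0.0

def rank_visualizations(history: AnalysisHistory,
                        candidates: list[list[str]]) -> list[list[str]]:
    """Order candidate charts (given as the columns they encode) by the
    summed interest of those columns."""
    return sorted(candidates,
                  key=lambda cols: sum(history.interest(c) for c in cols),
                  reverse=True)

# Usage: after a filter on "price" and a groupby involving "region" and
# "price", charts encoding those columns rank ahead of untouched ones.
history = AnalysisHistory()
history.record("filter", ["price"])
history.record("groupby", ["region", "price"])
print(rank_visualizations(history, [["year"], ["price", "region"], ["price"]]))
# -> [['price', 'region'], ['price'], ['year']]
```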
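
The first entry's distinction between Targets (what element of a design is changed) and Actions (how it is changed) can likewise be sketched as a small data structure. The enum members below are illustrative placeholders, not the paper's actual taxonomy of 76 strategies.

```python
# Hypothetical encoding of a responsive-visualization strategy as a
# (Target, Action) pair; member names are examples, not the paper's categories.
from dataclasses import dataclass
from enum import Enum

class Target(Enum):
    """What element of the large-screen design is changed."""
    DATA = "data"              # e.g. number of marks or data points shown
    ENCODING = "encoding"      # e.g. axes, layout orientation
    ANNOTATION = "annotation"  # e.g. labels, text callouts
    INTERACTION = "interaction"

class Action(Enum):
    """How that target is changed for the small-screen version."""
    REMOVE = "remove"
    REPOSITION = "reposition"
    RESIZE = "resize"
    AGGREGATE = "aggregate"

@dataclass
class Strategy:
    target: Target
    action: Action

# Example: dropping text annotations on small screens preserves graphical
# density but risks losing part of the intended "message".
drop_annotations = Strategy(Target.ANNOTATION, Action.REMOVE)
```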