Browsing by Author "Chang, Remco"
Item: Defining an Analysis: A Study of Client-Facing Data Scientists (The Eurographics Association, 2019)
Authors: Mosca, Abigail; Robinson, Shannon; Clarke, Meredith; Redelmeier, Rebecca; Coates, Sebastian; Cashman, Dylan; Chang, Remco
Editors: Johansson, Jimmy; Sadlo, Filip; Marai, G. Elisabeta
Abstract: As the sophistication of data analyses increases, many subject matter experts looking to make data-driven decisions turn to data scientists to help with their data analysis needs. These subject matter experts may have little to no experience in data analysis, and may have little to no idea of what exactly they need to support their decision making. It is up to data scientists to determine the exact analysis needs of these clients before they can run an analysis. We call this step of the analysis process initialization and define it as translating clients' broad, high-level questions into analytic queries. Although this can be a very time-consuming task for data scientists, few visualization tools exist to support it. To provide guidance on how future tools may fill this gap, we conducted 14 semi-structured interviews with client-facing data scientists in an array of fields. In analyzing the interviews, we find that data scientists generally employ three methods for initialization: working backwards, probing, and recommending. We discuss existing techniques that share synergy with each of these methods and could be leveraged in the design of future visualization tools to support initialization.

Item: The Human User in Progressive Visual Analytics (The Eurographics Association, 2019)
Authors: Micallef, Luana; Schulz, Hans-Jörg; Angelini, Marco; Aupetit, Michaël; Chang, Remco; Kohlhammer, Jörn; Perer, Adam; Santucci, Giuseppe
Editors: Johansson, Jimmy; Sadlo, Filip; Marai, G. Elisabeta
Abstract: The amount of generated and analyzed data is ever increasing, and processing such large data sets can take too long in situations where time-to-decision or fluid data exploration are critical. Progressive visual analytics (PVA) has recently emerged as a potential solution that allows users to analyze intermediary results during the computation without waiting for the computation to complete. However, there has been limited consideration of how these techniques impact the user. Based on discussions from a Dagstuhl seminar held in October 2018, this paper characterizes PVA users by their common roles, their main tasks, and their distinct focus of analysis. It further discusses cognitive biases that play a particular role in PVA. This work will help PVA visualization designers in devising systems that are tailored for their specific target users and their characteristics.

Item: Inferential Tasks as an Evaluation Technique for Visualization (The Eurographics Association, 2022)
Authors: Suh, Ashley; Mosca, Ab; Robinson, Shannon; Pham, Quinn; Cashman, Dylan; Ottley, Alvitta; Chang, Remco
Editors: Agus, Marco; Aigner, Wolfgang; Hoellt, Thomas
Abstract: Designing suitable tasks for visualization evaluation remains challenging. Traditional evaluation techniques commonly rely on 'low-level' or 'open-ended' tasks to assess the efficacy of a proposed visualization; however, nontrivial trade-offs exist between the two. Low-level tasks allow for robust quantitative evaluations but are not indicative of the complex usage of a visualization. Open-ended tasks, while excellent for insight-based evaluations, are typically unstructured and require time-consuming interviews. Bridging this gap, we propose inferential tasks: a complementary task category based on inferential learning in psychology. Inferential tasks produce quantitative evaluation data by prompting users to form and validate their own findings with a visualization. We demonstrate the use of inferential tasks through a validation experiment on two well-known visualization tools.

Item: Survey on the Analysis of User Interactions and Visualization Provenance (The Eurographics Association and John Wiley & Sons Ltd., 2020)
Authors: Xu, Kai; Ottley, Alvitta; Walchshofer, Conny; Streit, Marc; Chang, Remco; Wenskovitch, John
Editors: Smit, Noeska; Oeltze-Jafra, Steffen; Wang, Bei
Abstract: There is a fast-growing literature on provenance-related research, covering aspects such as its theoretical framework, use cases, and techniques for capturing, visualizing, and analyzing provenance data. As a result, there is an increasing need to identify and taxonomize the existing scholarship. Such an organization of the research landscape will provide a complete picture of the current state of inquiry and identify knowledge gaps or possible avenues for further investigation. In this STAR, we aim to produce a comprehensive survey of work in the data visualization and visual analytics field that focuses on the analysis of user interaction and provenance data. We structure our survey around three primary questions: (1) WHY analyze provenance data, (2) WHAT provenance data to encode and how to encode it, and (3) HOW to analyze provenance data. A concluding discussion provides evidence-based guidelines and highlights concrete opportunities for future development in this emerging area.

Item: A User-based Visual Analytics Workflow for Exploratory Model Analysis (The Eurographics Association and John Wiley & Sons Ltd., 2019)
Authors: Cashman, Dylan; Humayoun, Shah Rukh; Heimerl, Florian; Park, Kendall; Das, Subhajit; Thompson, John; Saket, Bahador; Mosca, Abigail; Stasko, John; Endert, Alex; Gleicher, Michael; Chang, Remco
Editors: Gleicher, Michael; Viola, Ivan; Leitte, Heike
Abstract: Many visual analytics systems allow users to interact with machine learning models towards the goals of data exploration and insight generation on a given dataset. However, in some situations, insights may be less important than the production of an accurate predictive model for future use. In that case, users are more interested in generating diverse and robust predictive models, verifying their performance on holdout data, and selecting the most suitable model for their usage scenario. In this paper, we consider the concept of Exploratory Model Analysis (EMA), which is defined as the process of discovering and selecting relevant models that can be used to make predictions on a data source. We delineate the differences between EMA and the well-known term exploratory data analysis in terms of the desired outcome of the analytic process: insights into the data or a set of deployable models. The contributions of this work are a visual analytics system workflow for EMA, a user study, and two use cases validating the effectiveness of the workflow. We found that our system workflow enabled users to generate complex models, to assess them for various qualities, and to select the most relevant model for their task.
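Note: the generate-verify-select loop described in the last abstract (generating diverse candidate models, verifying them on holdout data, and selecting the most suitable one) can be sketched outside of any visual analytics system. The following is a minimal, illustrative sketch assuming scikit-learn, a generic classification dataset, and holdout accuracy as the selection criterion; the specific models, dataset, and metric are assumptions for illustration and are not taken from the paper or its system.

# Illustrative sketch of a generate-verify-select loop for candidate models.
# Assumptions: scikit-learn is available, and holdout accuracy is an adequate
# selection criterion; neither is prescribed by the EMA paper itself.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Generic tabular dataset standing in for the user's data source.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_hold, y_train, y_hold = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Generate a diverse set of candidate models.
candidates = {
    "logistic_regression": LogisticRegression(max_iter=5000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

# Verify each candidate on the holdout split, then select the most suitable one.
scores = {}
for name, model in candidates.items():
    model.fit(X_train, y_train)
    scores[name] = accuracy_score(y_hold, model.predict(X_hold))

best = max(scores, key=scores.get)
print(scores, "-> selected:", best)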