Putting Annotations to the Test
dc.contributor.author | Becker, Franziska | en_US |
dc.contributor.author | Ertl, Thomas | en_US |
dc.contributor.editor | Gillmann, Christina | en_US |
dc.contributor.editor | Krone, Michael | en_US |
dc.contributor.editor | Lenti, Simone | en_US |
dc.date.accessioned | 2023-06-10T06:31:35Z | |
dc.date.available | 2023-06-10T06:31:35Z | |
dc.date.issued | 2023 | |
dc.description.abstract | When users work with interactive visualization systems, they get to see more accessible representations of raw data and interact with these, e.g., by filtering the data or modifying visualization parameters such as color. Internal representations, such as hunches about trends, outliers, data points of interest, or relationships, are usually neither visualized nor integrated into such systems, i.e., they are not externalized. Moreover, how externalizations in visualization systems affect users in terms of memory, post-analysis recall, speed, or analysis quality is not yet fully understood. We present a visualization-agnostic externalization framework that lets users annotate visualizations, automatically connects these annotations to related data, and stores them for later retrieval. In addition, we conducted a pilot study to test the framework's usability and users' recall of exploratory analysis results. In two tasks, one without and one with annotation features available, we asked participants to answer a question with the help of visualizations and to report their findings with concrete examples afterwards. Qualitative analysis of the summaries showed only minor differences in detail and completeness, which we suspect is due to the short task time and the consequently shallower analyses performed by participants. We discuss how to improve our framework's usability and how to modify our study design in future research to gain more insight into the effects of externalization on post-analysis recall. | en_US |
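As a rough illustration of the kind of externalization record such a framework might keep, the following is a minimal sketch of an annotation linked to a visualization and its related data points, together with a store for later retrieval. The data model and all names here are hypothetical assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of an externalization record and store, assuming a generic
# annotation model (free-text note, the annotated visualization, and the data
# items the note refers to). All names are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List


@dataclass
class Annotation:
    note: str                    # the user's externalized hunch or finding
    visualization_id: str        # which view the note was attached to
    linked_data_ids: List[str]   # data items automatically connected to the note
    created_at: datetime = field(default_factory=datetime.utcnow)


class AnnotationStore:
    """Keeps annotations per visualization and retrieves them later, e.g. when writing a summary."""

    def __init__(self) -> None:
        self._by_visualization: Dict[str, List[Annotation]] = {}

    def save(self, annotation: Annotation) -> None:
        self._by_visualization.setdefault(annotation.visualization_id, []).append(annotation)

    def retrieve(self, visualization_id: str) -> List[Annotation]:
        return self._by_visualization.get(visualization_id, [])
```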
dc.description.seriesinformation | EuroVis 2023 - Posters | |
dc.identifier.doi | 10.2312/evp.20231068 | |
dc.identifier.isbn | 978-3-03868-220-2 | |
dc.identifier.pages | 61-63 | |
dc.identifier.pages | 3 pages | |
dc.identifier.uri | https://doi.org/10.2312/evp.20231068 | |
dc.identifier.uri | https://diglib.eg.org:443/handle/10.2312/evp20231068 | |
dc.publisher | The Eurographics Association | en_US |
dc.rights | Attribution 4.0 International License | |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | |
dc.subject | CCS Concepts: Human-centered computing -> Empirical studies in visualization; Visualization systems and tools; Visual analytics | |
dc.subject | Human-centered computing | |
dc.subject | Empirical studies in visualization | |
dc.subject | Visualization systems and tools | |
dc.subject | Visual analytics | |
dc.title | Putting Annotations to the Test | en_US |