
Browsing by Author "Sedlmair, Michael"

Now showing 1 - 6 of 6
  • Been There, Seen That: Visualization of Movement and 3D Eye Tracking Data from Real-World Environments
    (The Eurographics Association and John Wiley & Sons Ltd., 2023) Pathmanathan, Nelusa; Öney, Seyda; Becher, Michael; Sedlmair, Michael; Weiskopf, Daniel; Kurzhals, Kuno; Bujack, Roxana; Archambault, Daniel; Schreck, Tobias
    The distribution of visual attention can be evaluated using eye tracking, providing valuable insights into usability issues and interaction patterns. However, when used in real, augmented, and collaborative environments, new challenges arise that go beyond desktop scenarios and purely virtual environments. Toward addressing these challenges, we present a visualization technique that provides complementary views on the movement and eye tracking data recorded from multiple people in real-world environments. Our method is based on a space-time cube visualization and a linked 3D replay of recorded data. We showcase our approach with an experiment that examines how people investigate an artwork collection. The visualization provides insights into how people moved and inspected individual pictures in their spatial context over time. In contrast to existing methods, this analysis is possible for multiple participants without extensive annotation of areas of interest. Our technique was evaluated with a think-aloud experiment to investigate analysis strategies and an interview with domain experts to examine the applicability in other research fields.
  • ClustMe: A Visual Quality Measure for Ranking Monochrome Scatterplots based on Cluster Patterns
    (The Eurographics Association and John Wiley & Sons Ltd., 2019) Abbas, Mostafa M.; Aupetit, Michaël; Sedlmair, Michael; Bensmail, Halima; Gleicher, Michael and Viola, Ivan and Leitte, Heike
    We propose ClustMe, a new visual quality measure to rank monochrome scatterplots based on cluster patterns. ClustMe is based on data collected from a human-subjects study, in which 34 participants judged synthetically generated cluster patterns in 1000 scatterplots. We generated these patterns by carefully varying the free parameters of a simple Gaussian Mixture Model with two components, and asked the participants to count the number of clusters they could see (1 or more than 1). Based on the results, we form ClustMe by selecting the model that best predicts these human judgments among 7 different state-of-the-art merging techniques (DEMP). To quantitatively evaluate ClustMe, we conducted a second study, in which 31 human subjects ranked 435 pairs of scatterplots of real and synthetic data in terms of cluster pattern complexity. We use this data to compare ClustMe's performance to 4 other state-of-the-art clustering measures, including the well-known Clumpiness scagnostics. We found that, of all measures, ClustMe is in strongest agreement with the human rankings.
  • EuroVA 2017: Frontmatter
    (Eurographics Association, 2017) Sedlmair, Michael; Tominski, Christian
  • EuroVis 2019 CGF 38-3 STARs: Frontmatter
    (The Eurographics Association and John Wiley & Sons Ltd., 2019) Laramee, Robert S.; Oeltze, Steffen; Sedlmair, Michael
  • Visual Analysis of Degree-of-Interest Functions to Support Selection Strategies for Instance Labeling
    (The Eurographics Association, 2019) Bernard, Jürgen; Hutter, Marco; Ritter, Christian; Lehmann, Markus; Sedlmair, Michael; Zeppelzauer, Matthias; Landesberger, Tatiana von and Turkay, Cagatay
    Manually labeling data sets is a time-consuming and expensive task that can be accelerated by interactive machine learning and visual analytics approaches. At the core of these approaches are strategies for the selection of candidate instances to label. We introduce degree-of-interest (DOI) functions as atomic building blocks to formalize candidate selection strategies. We introduce a taxonomy of DOI functions and an approach for the visual analysis of DOI functions, which provide novel complementary views on labeling strategies and DOIs, support their in-depth analysis, and facilitate their interpretation. In the future, our method shall support the generation of novel labeling strategies and a better explanation of existing ones.
  • Visual Gaze Labeling for Augmented Reality Studies
    (The Eurographics Association and John Wiley & Sons Ltd., 2023) Öney, Seyda; Pathmanathan, Nelusa; Becher, Michael; Sedlmair, Michael; Weiskopf, Daniel; Kurzhals, Kuno; Bujack, Roxana; Archambault, Daniel; Schreck, Tobias
    Augmented Reality (AR) provides new ways for situated visualization and human-computer interaction in physical environments. Current evaluation procedures for AR applications rely primarily on questionnaires and interviews, providing qualitative means to assess usability and task solution strategies. Eye tracking extends these existing evaluation methodologies by providing indicators for visual attention to virtual and real elements in the environment. However, the analysis of viewing behavior, especially the comparison of multiple participants, is difficult to achieve in AR. Specifically, the definition of areas of interest (AOIs), which is often a prerequisite for such analysis, is cumbersome and tedious with existing approaches. To address this issue, we present a new visualization approach to define AOIs, label fixations, and investigate the resulting annotated scanpaths. Our approach utilizes automatic annotation of gaze on virtual objects and an image-based approach that also considers spatial context for the manual annotation of objects in the real world. Our results show that, with our approach, eye tracking data from AR scenes can be annotated and analyzed flexibly with respect to data aspects and annotation strategies.

Eurographics Association © 2013-2025  |  System hosted at Graz University of Technology      
DSpace software copyright © 2002-2025 LYRASIS
