Publications
Click a title to expand the abstract. Publications are listed in reverse chronological order.
-
Sound, Touch, or the Full Monty? A Comparative Study of Data Accessibility Techniques for Blind Users
Abstract. Blind and low-vision (BLV) individuals encounter significant challenges in accessing and interpreting data, a critical component in many data-intensive fields. Traditional screen readers, which convert text to speech, have been the primary solution, but recent developments such as the Monarch refreshable tactile display and hybrid approaches such as TactualPlot, which combines sound and touch, offer new possibilities for data accessibility. Our first study explores the efficacy of these modalities—sound, touch, and their combination—through an empirical investigation conducted with blind individuals. Participants engaged in tasks of varying complexity using the Monarch, Olli screen reader, and TactualPlot, with data presented in scatterplots, line charts, bar charts, and pie charts. Next, we conducted a co-design session to understand how blind individuals can utilize the Monarch in their data exploration workflow. Our findings reveal the strengths and limitations of each modality and provide qualitative insights into blind individuals’ preferences. TactualPlot generally led to better task accuracy, although confidence intervals overlapped across devices. Pie and bar charts exhibited higher accuracy across all devices compared to line charts and scatterplots, suggesting that certain visual structures translate more readily to non-visual modalities. The Monarch often resulted in the lowest task completion times, indicating efficient access for some tasks, but it faced challenges with complex visualizations like scatterplots due to occlusion. We find that prior experience with tactile media facilitated quicker adaptation to the Monarch, and that screen reader users found Olli more familiar. Overall, participants expressed a desire for hybrid systems combining the overview capabilities of touch and audio with the precision of text. This work contributes empirical findings on the effectiveness of different modalities, qualitative perspectives from blind participants, enhancements to the TactualPlot technique to support more chart types, and findings from a co-design session exploring visual analysis with refreshable tactile displays.
DOI (this paper is under review; will be updated after publication)
-
Understanding the Visualization and Analytics Needs of Blind and Low Vision Professionals
Abstract. Inclusivity for blind and low vision (BLV) professionals in data science and analytics is limited by a gap in understanding their unique data analysis needs. We contribute to the literature by reporting on a two-step online survey delving into the experiences and challenges faced by BLV individuals engaged in data-related roles. Our findings highlight that despite expertise in programming and GUI-based analysis tools, BLV professionals faced accessibility issues at various points in the data analysis pipeline—issues spanning data loading and transformation, the availability and compatibility of data tools with assistive technology, and visualization authoring. The prevalent use of tools such as Excel, Python, and SAS, alongside heavy reliance on assistive technologies, highlights persistent accessibility challenges. Furthermore, frequent collaboration with sighted colleagues indicates compromised independence. These results underscore the urgent need for “born accessible” tools that ensure the inclusivity and autonomy of BLV professionals in the field of data science.
-
TactualPlot: Spatializing Data as Sound using Sensory Substitution
Abstract. Tactile graphics are one of the best ways for a blind person to perceive a chart using touch, but their fabrication is often costly and time-consuming, and the result does not lend itself to dynamic exploration. Refreshable haptic displays tend to be expensive and thus unavailable to most blind individuals. We propose TactualPlot, an approach to sensory substitution where touch interaction yields auditory (sonified) feedback. The technique relies on embodied cognition for spatial awareness—i.e., individuals can perceive the 2D touch locations of their fingers with reference to other 2D locations, such as the relative locations of other fingers or chart features visualized on the touchscreen. Combining touch and sound in this way yields a scalable data exploration method for scatterplots, where the data density under the user’s fingertips is sampled. The sample regions can optionally be scaled based on how quickly the user moves their hand. Our development of TactualPlot was informed by formative design sessions with a blind collaborator, whose practices when using tactile scatterplots led us to extend the technique to multiple fingers. We present results from an evaluation comparing our TactualPlot interaction technique to tactile graphics printed on swell touch paper.
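To give a flavor of the sampling idea described in the abstract, here is a minimal Python sketch of velocity-scaled density sampling under a fingertip, mapped to a sonification pitch. The function names, parameters, and pitch mapping are illustrative assumptions, not the paper’s implementation.

```python
import math

def sample_density(points, finger_x, finger_y, speed, base_radius=20.0, speed_gain=0.5):
    """Count scatterplot points under a fingertip.

    The sampling radius grows with hand speed, so fast sweeps give a coarse
    overview and slow movement gives fine detail (illustrative only).
    """
    radius = base_radius * (1.0 + speed_gain * speed)
    return sum(
        1 for (px, py) in points
        if math.hypot(px - finger_x, py - finger_y) <= radius
    )

def density_to_pitch(count, max_count, low_hz=220.0, high_hz=880.0):
    """Map a point count to a pitch: denser regions sound higher."""
    t = min(count / max_count, 1.0) if max_count else 0.0
    return low_hz + t * (high_hz - low_hz)

# Example: one finger resting over a cluster, another over empty space.
points = [(100, 100), (102, 98), (105, 103), (300, 300)]
for fx, fy in [(101, 100), (200, 200)]:
    count = sample_density(points, fx, fy, speed=0.2)
    print(fx, fy, count, round(density_to_pitch(count, max_count=4)), "Hz")
```

Running the snippet prints a higher pitch for the finger over the cluster than for the finger over empty space; with multiple fingers, the same sampling would simply run once per touch point.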
-
Contextual In-Situ Help for Visual Data Interfaces
Abstract. As the complexity of data analysis increases, even well-designed data interfaces must guide experts in transforming their theoretical knowledge into the actual features supported by the tool. This challenge is even greater for casual users who are increasingly turning to data analysis to solve everyday problems. To address this challenge, we propose data-driven, contextual, in-situ help features that can be implemented in visual data interfaces. We introduce five modes of help-seeking: (1) contextual help on selected interface elements, (2) topic listing, (3) overview, (4) guided tour, and (5) notifications. What distinguishes our work from general user interface help systems is that data visualizations provide a unique environment for embedding context-dependent data inside on-screen messaging. We demonstrate the usefulness of such contextual help through case studies of two visual data interfaces: Keshif and POD-Vis. We implemented and evaluated the help modes with two sets of participants, and found that directly selecting user interface elements was the most useful mode.
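As a rough illustration of what embedding context-dependent data inside on-screen messaging could look like, here is a small hypothetical Python sketch. It is not the Keshif or POD-Vis implementation; the element names and message templates are invented for illustration.

```python
def contextual_help(element, records):
    """Return an in-situ help message that embeds statistics from the
    currently loaded data, rather than a generic tooltip (illustrative only)."""
    if element == "histogram_bar":
        lo, hi = min(records), max(records)
        return (f"This bar aggregates records between {lo} and {hi}. "
                f"Click to filter the other views to these {len(records)} records.")
    if element == "filter_chip":
        return "This chip shows an active filter. Click the x to remove it."
    return "Select an interface element to see help about it."

# Example: help text for a selected histogram bar containing four records.
print(contextual_help("histogram_bar", [3, 7, 12, 18]))
```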
-
Towards Understanding Sensory Substitution for Accessible Visualization: An Interview Study
Abstract. For all its potential in supporting data analysis, particularly in exploratory situations, visualization also creates accessibility barriers for blind and visually impaired individuals. Regardless of how effective a visualization is, providing equal access for blind users requires a paradigm shift for the visualization research community. To enact such a shift, it is not sufficient to treat visualization accessibility as merely another technical problem to overcome. Instead, supporting the millions of blind and visually impaired users around the world, whose needs for data analysis are just as valid as those of sighted individuals, requires a respectful, equitable, and holistic approach that includes all users from the outset. In this paper, we draw on accessibility research methodologies to make inroads towards such an approach. We first identify the people who have specific insight into how blind people perceive the world: orientation and mobility (O&M) experts, instructors who teach blind individuals how to navigate the physical world using non-visual senses. We interview 10 O&M experts—all of them blind—to understand how best to use non-visual senses, through sensory substitution, for conveying spatial layouts. We analyze our qualitative findings using thematic analysis. While blind people in general tend to use both sound and touch to understand their surroundings, we focused on auditory affordances and how they can be used to make data visualizations accessible—using sonification and auralization. However, our experts recommended supporting a combination of senses—sound and touch—to make charts accessible, as blind individuals may be more familiar with exploring tactile charts. We report results on both sound and touch affordances, and conclude by discussing implications for accessible visualization for blind individuals.
-
Once Upon a Time in Visualization: Understanding the Use of Textual Narratives for Causality
Abstract. Causality visualization can help people understand temporal chains of events, such as messages sent in a distributed system, cause and effect in a historical conflict, or the interplay between political actors over time. However, as the scale and complexity of these event sequences grow, even these visualizations can become overwhelming to use. In this paper, we propose the use of textual narratives as a data-driven storytelling method to augment causality visualization. We first propose a design space for how textual narratives can be used to describe causal data. We then present results from a crowdsourced user study where participants were asked to recover causality information from two causality visualizations—causal graphs and Hasse diagrams—with and without an associated textual narrative. Finally, we describe Causeworks, a causality visualization system for understanding how specific interventions influence a causal model. The system incorporates an automatic textual narrative mechanism based on our design space. We validate Causeworks through interviews with experts who used the system for understanding complex events.
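As a loose illustration of turning causal links into narrative text, the toy Python sketch below fills sentence templates from (cause, effect, strength) triples. The paper’s actual mechanism is driven by its design space; the graph format, threshold, and templates here are assumptions for illustration only.

```python
# Causal links as (cause, effect, strength) triples: a toy stand-in for a causal graph.
links = [
    ("supply disruption", "price increase", 0.8),
    ("price increase", "reduced demand", 0.6),
]

def describe(cause, effect, strength):
    """Render one causal link as a sentence, qualified by its strength."""
    qualifier = "strongly" if strength >= 0.7 else "moderately"
    return f"{cause.capitalize()} {qualifier} contributed to {effect}."

narrative = " ".join(describe(c, e, s) for c, e, s in links)
print(narrative)
```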
-
Motor Accessibility of Smartwatch Touch and Bezel Input
Abstract. Smartwatches present inherent input difficulties due to the small touchscreen. In a controlled experiment with 14 participants with upper body motor impairments, we compared smartwatch touchscreen input to input on the bezel of the watch, the latter of which should at least theoretically stabilize user input due to its hard edge. Results demonstrate a speed-accuracy tradeoff whereby the touchscreen is faster but the bezel is more accurate.
-
Exploring Accessible Smartwatch Interactions for People with Upper Body Motor Impairments
Abstract. Smartwatches are always-available, provide quick access to information in a mobile setting, and can collect continuous health and fitness data. However, the small interaction space of these wearables may pose challenges for people with upper body motor impairments. To investigate accessible smartwatch interactions for this user group, we conducted two studies. First, we assessed the accessibility of existing smartwatch gestures with 10 participants with motor impairments. We found that not all participants were able to complete button, swipe and tap interactions. In a second study, we adopted a participatory approach to explore smartwatch gesture preferences and to gain insight into alternative, more accessible smartwatch interaction techniques. Eleven participants with motor impairments created gestures for 16 common smartwatch actions on both touchscreen and non-touchscreen (bezel, wristband) areas of the watch and the user’s body. We present results from both studies and provide design recommendations.
-
“Spiders in the Sky”: User Perceptions of Drones, Privacy, and Security
Abstract. Drones are increasingly being used for various purposes from recording footage in inaccessible areas to delivering packages. A rise in drone usage introduces privacy and security concerns about flying boundaries, what data drones collect in public and private spaces, and how that data is stored and disseminated. However, commercial and personal drone regulations focusing on privacy and security have been fairly minimal in the United States. To inform privacy and security guidelines for drone design and regulation, we need to understand users’ perceptions about drones, privacy, and security. In this paper, we describe a laboratory study with 20 participants who interacted with a real or model drone to elicit user perceptions of privacy and security issues around drones. We present our results, discuss the implications of our work, and make recommendations to improve drone design and regulations that enhance individual privacy and security.