Projects

Featured case studies are shown first. Below that, you can browse my full project set using search and filters.

Accessible smartwatch interactions

Two studies (an accessibility assessment and a participatory gesture elicitation) that produced actionable interaction guidelines for wearables.

  • Inclusive design
  • Wearables
  • Guidelines

Browse all projects

Use the search box and filters to browse projects. Results update as you type. This section is fully keyboard accessible.

Filter by tag

Select one or more tags. Clear filters to show everything.
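
For the curious, here is a minimal sketch of how live filtering like this can behave. The Project shape, the projects array, and the matching rules are assumptions chosen for illustration, not this site's actual code.

```ts
// Minimal sketch of live tag + text filtering (illustrative only; not the
// site's actual implementation). `Project` is an assumed shape.
interface Project {
  title: string;
  summary: string;
  tags: string[];
}

function filterProjects(
  projects: Project[],
  query: string,
  activeTags: string[],
): Project[] {
  const q = query.trim().toLowerCase();
  return projects.filter((p) => {
    // A project must carry every selected tag (empty selection matches all).
    const tagsOk = activeTags.every((t) => p.tags.includes(t));
    // Text match against title or summary; an empty query matches all.
    const textOk =
      q === "" ||
      p.title.toLowerCase().includes(q) ||
      p.summary.toLowerCase().includes(q);
    return tagsOk && textOk;
  });
}
```

Re-running this on every keystroke over a small project list is what makes "results update as you type" cheap enough to feel instant.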

Textual narratives for causality & complex systems (CAUSEWORKS)

Once Upon A Time In Visualization: Understanding the Use of Textual Narratives for Causality
IEEE TVCG (VIS), 2021 · PDF · DOI

Problem

Causality visualizations can help analysts trace cause→effect chains over time, but they become difficult to use as event sequences grow in scale and complexity. The question is when text can complement visualization rather than distract from it.

Approach

  • Proposed a design space for how textual narratives can describe causal data (what to say, when to say it, and how it relates to the visual).
  • Ran a crowdsourced study comparing performance with and without narratives across two causality visualization types (causal graphs and Hasse diagrams).
  • Built CAUSEWORKS, a causality analytics system that integrates automatic narrative generation based on the design space (a simplified sketch of the idea follows this list).
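
As a rough illustration of template-based narrative generation over causal data, not the actual CAUSEWORKS generator: each edge carries metadata, and consecutive edges are chained into prose. CausalEdge, the relation vocabulary, and the templates are assumptions for this sketch.

```ts
// Illustrative sketch of narrative generation from causal edges. This is
// NOT the CAUSEWORKS implementation; `CausalEdge` and the relation
// vocabulary are assumptions chosen for the example.
type Relation = "causes" | "correlates with" | "is connected to";

interface CausalEdge {
  source: string;
  target: string;
  relation: Relation;
}

// Walk the edge list in order and emit one clause per edge, chaining
// consecutive edges ("..., which in turn ...") for readability.
function narrate(edges: CausalEdge[]): string {
  const clauses = edges.map((e, i) => {
    const subject =
      i > 0 && edges[i - 1].target === e.source ? "which in turn" : e.source;
    return `${subject} ${e.relation} ${e.target}`;
  });
  return clauses.join(", ") + ".";
}

// narrate([
//   { source: "Drought", target: "Crop failure", relation: "causes" },
//   { source: "Crop failure", target: "Migration", relation: "causes" },
// ]) => "Drought causes Crop failure, which in turn causes Migration."
```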

Evaluation

  • Crowdsourced experiment: participants recovered causality information from visualizations with vs. without narratives.
  • Expert interviews: domain experts used CAUSEWORKS to interpret complex events and interventions.

Key findings

  • Narratives can reduce the cognitive overhead of reading complex causal structures when aligned to task intent.
  • Effectiveness depends on visualization type and what causal metadata the narrative highlights (e.g., cause/effect, correlation, connectivity, lifecycle).
  • Experts valued narratives as “guided interpretation,” especially for intervention reasoning in CAUSEWORKS.

Impact

Provides an evidence-backed design and evaluation template for human-centered explanations in analytical systems, directly relevant to AI systems that need to communicate reasoning, uncertainty, and causal claims.

Multimodal chart accessibility (touch + sound + speech)

Representative paper: Towards Understanding Sensory Substitution for Accessible Visualization: An Interview Study
Preprint · PDF

Problem

Visualization is often inaccessible to blind and low-vision (BLV) users. Accessibility requires more than retrofitting alt-text: it requires designing interaction and representation that leverage non-visual senses for spatial reasoning.

Approach

  • Interviewed 10 Orientation & Mobility (O&M) experts—all blind—to understand how non-visual senses support spatial layout understanding.
  • Used thematic analysis to extract design implications for sonification/auralization and tactile interaction.

Key findings

  • Blind people commonly use both sound and touch to build mental maps; designs should not assume audio-only solutions.
  • Experts recommended supporting combined modalities (e.g., tactile scaffolds + auditory cues) because tactile charts may be familiar and fast for many users.
  • Auditory affordances are powerful for trends and structure, but should be paired with mechanisms for precise values and orientation.

Impact

Establishes a principled foundation for multimodal data access designs and motivates “born-accessible” tooling where accessibility representations are first-class outputs.


Related systems (brief)

TactualPlot / touch exploration + sonification

Interaction technique where touch exploration yields continuous audio for trend/density, with speech on demand for precise labels/values (a rough sonification sketch follows this card).

  • Prototype
  • Multimodal UX
  • Evaluation
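
To make the touch-to-audio mapping concrete, here is a minimal browser sketch of the general pattern (touch position → data value → pitch) using the Web Audio API. It is an illustration of the idea, not TactualPlot's actual code; the data series, element id, and frequency band are all invented for the example.

```ts
// Minimal sketch of touch-to-pitch sonification (illustrative only; not
// TactualPlot's implementation). `data`, the element id, and the 220-880 Hz
// band are assumptions for the example.
const data = [3, 5, 9, 6, 12, 8]; // hypothetical y-values along the x-axis
const audio = new AudioContext(); // note: browsers require a user gesture first
const osc = audio.createOscillator();
const gain = audio.createGain();
osc.connect(gain).connect(audio.destination);
gain.gain.value = 0; // silent until a touch begins
osc.start();

// Map a horizontal touch fraction (0..1) to a data value, then to a pitch.
function sonifyAt(xFraction: number): void {
  const i = Math.min(
    data.length - 1,
    Math.max(0, Math.floor(xFraction * data.length)),
  );
  const max = Math.max(...data);
  const freq = 220 + (data[i] / max) * 660; // linear value-to-pitch mapping
  osc.frequency.setTargetAtTime(freq, audio.currentTime, 0.01);
  gain.gain.setTargetAtTime(0.2, audio.currentTime, 0.01);
}

const chart = document.getElementById("chart"); // hypothetical chart element
if (chart) {
  chart.addEventListener("touchmove", (e) => {
    const rect = chart.getBoundingClientRect();
    sonifyAt((e.touches[0].clientX - rect.left) / rect.width);
  });
  // Fade out when the finger lifts; on-demand speech for exact values
  // would hook in alongside this.
  chart.addEventListener("touchend", () => {
    gain.gain.setTargetAtTime(0, audio.currentTime, 0.05);
  });
}
```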

Refreshable tactile displays (e.g., multi-line braille tablets)

Tactile representations preserve chart scaffolds (axes, ticks, legends) and support navigation via panning/zooming, paired with audio/speech for detail (a viewport sketch follows this card).

  • Tactile
  • Spatial reasoning
  • Accessibility
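
A minimal sketch of the pan/zoom idea under stated assumptions: a fixed grid of tactile cells shows a movable window over data coordinates, so panning and zooming only move the window. The grid size and Viewport shape are invented for illustration, not any particular device's API.

```ts
// Illustrative pan/zoom viewport for a fixed-resolution tactile grid
// (e.g., a multi-line braille display). All shapes and sizes are assumptions.
interface Viewport {
  x: number;      // left edge in data coordinates
  y: number;      // top edge in data coordinates
  width: number;  // visible width in data units
  height: number; // visible height in data units
}

const GRID_COLS = 40; // hypothetical tactile cell columns
const GRID_ROWS = 20; // hypothetical tactile cell rows

// Convert a data point to a cell index, or null if outside the viewport.
function toCell(
  vp: Viewport,
  px: number,
  py: number,
): { col: number; row: number } | null {
  const col = Math.floor(((px - vp.x) / vp.width) * GRID_COLS);
  const row = Math.floor(((py - vp.y) / vp.height) * GRID_ROWS);
  if (col < 0 || col >= GRID_COLS || row < 0 || row >= GRID_ROWS) return null;
  return { col, row };
}

// Pan by a fraction of the visible extent; zoom about the viewport center.
function pan(vp: Viewport, dxFrac: number, dyFrac: number): Viewport {
  return { ...vp, x: vp.x + dxFrac * vp.width, y: vp.y + dyFrac * vp.height };
}

function zoom(vp: Viewport, factor: number): Viewport {
  const cx = vp.x + vp.width / 2;
  const cy = vp.y + vp.height / 2;
  const w = vp.width / factor;
  const h = vp.height / factor;
  return { x: cx - w / 2, y: cy - h / 2, width: w, height: h };
}
```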

Born-accessible chart generation (prompt-driven era)

Principle: generate accessibility metadata (alt-text, sonification mappings, tactile-ready layers) alongside visuals so the representations stay aligned (sketched after this card).

  • LLMs
  • Tooling
  • Platform thinking
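
One way to read the principle as code: a single chart spec produces the visual and its accessibility metadata from the same source of truth, so they cannot drift apart. Everything here (AccessibleChart, the field names, the value ranges) is a hypothetical shape, not a real tool's API.

```ts
// Hypothetical "born-accessible" output bundle: the visual and its
// accessibility metadata are generated together from one source of truth.
interface SeriesPoint {
  label: string;
  value: number;
}

interface AccessibleChart {
  svg: string;                                            // rendered visual
  altText: string;                                        // description
  sonification: { label: string; frequencyHz: number }[]; // audio mapping
  tactileLayer: { label: string; heightMm: number }[];    // tactile layer
}

function generateAccessibleChart(
  title: string,
  data: SeriesPoint[],
): AccessibleChart {
  const max = Math.max(...data.map((d) => d.value));
  return {
    svg: `<svg><!-- bars for ${title} --></svg>`, // stand-in for rendering
    altText:
      `Bar chart: ${title}. ${data.length} bars; ` +
      `highest is ${data.find((d) => d.value === max)?.label} (${max}).`,
    // The same data drives the audio mapping (assumed 220-880 Hz band) ...
    sonification: data.map((d) => ({
      label: d.label,
      frequencyHz: 220 + (d.value / max) * 660,
    })),
    // ... and the tactile layer (assumed 0-20 mm relief heights).
    tactileLayer: data.map((d) => ({
      label: d.label,
      heightMm: (d.value / max) * 20,
    })),
  };
}
```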

Accessible interactions for wearables (smartwatch input)

Exploring Accessible Smartwatch Interactions for People with Upper Body Motor Impairments
CHI 2018 · DOI

Problem

Smartwatches are always available, but they have a very small interaction surface and often assume precise bimanual touch input. This creates major barriers for people with upper-body motor impairments.

Approach

  • Study 1 (accessibility assessment): evaluated how accessible off-the-shelf smartwatch inputs are (taps, swipes, button actions, text input, voice dictation) with 10 participants.
  • Study 2 (participatory elicitation): 11 participants created gestures for 16 common smartwatch actions across touchscreen and non-touchscreen areas (bezel, strap, and on-body locations).

Key findings

  • Not all participants could reliably perform common touch interactions (tap, swipe, button actions), and some had difficulty with speech input.
  • Participants often preferred interaction regions closer to the dominant hand on the watch (bezel/strap) over on-body locations.
  • Users created accessible alternatives to familiar touchscreen gestures (e.g., alternatives to two-finger zoom) when precision was hard.

Impact

Provides concrete design recommendations for accessible smartwatch interaction, extending beyond the touchscreen by using the watch’s physical form factor.

Privacy, perception, and emerging technology (drones)

“Spiders in the Sky”: User Perceptions of Drones, Privacy, and Security
CHI 2017 · DOI

Problem

Drones introduce privacy and security concerns, but regulations and design guidance have historically been minimal. Understanding user mental models is necessary to inform both policy and product design.

Approach & evaluation

  • Laboratory study with 20 participants.
  • Between-subjects comparison: participants interacted with a real drone or a life-sized model drone to isolate how real-world features (sound, wind, speed) shape perception.
  • Multi-step tasks exposed participants to the drone's recording, approach behavior, and control, complemented by interviews and sketching/annotation exercises to elicit mental models.

Key findings

  • Participants expressed concerns about surveillance and privacy, as well as fear of injury or damage and discomfort disclosing personal information under drone observation.
  • Perceptions were strongly shaped by design attributes (size, speed, noise) and additional cues (camera placement/quality, feedback lights, movements).

Impact

Produced actionable recommendations for drone design and regulation grounded in empirical evidence about human perception and trust.