Download Resume ↗
Senior HMI & UX Researcher

Gyanendra Sharma, PhD

Advancing human-centered automotive research at the intersection of HCI, psychophysics, and ADAS — translating complex user data into safer, more intuitive vehicle systems.

7+
Years Exp.
9
Publications
3
Patents
4
Companies
Gyanendra Sharma, PhD
Current Role
Audi of America
Research Focus
ADAS & HMI
7+
Years UX/HMI Research
About Me

Bridging Humans & Machine Systems

I am a Senior Human Machine Interface (HMI) Researcher on the Advanced Driver Assistance Systems (ADAS) team at Audi of America, based in San Jose, CA. My work encompasses the full lifecycle of user research — from study design and execution to statistical analysis and stakeholder-ready reporting for Audi AG and Porsche AG.

My academic background spans Computer Science and Mathematics, with doctoral research in Human-Computer Interaction at Rensselaer Polytechnic Institute. Across 7+ years of industry and academic research, I have conducted rigorous studies in automotive HMI, psychophysics, immersive environments, and group dynamics.

My previous work includes applying psychophysical methods to quantify human perception of haptic and audio feedback in driving contexts, from haptic lateral assistance systems to detection thresholds for audio-haptic asynchrony in force-feedback steering wheels. My research directly shapes ADAS product design at the OEM level.

🚗 Automotive HMI & ADAS Research
🧠 Psychophysics & Perception Studies
📊 Quantitative & Mixed-Methods Research
👁 Eye-Tracking & Behavioral Analysis
🔬 Real Drive & Simulation Studies
📋 Stakeholder Research Communication

Past & Present Affiliations

Work Experience

Professional Journey

A track record of rigorous user research across leading automotive and academic environments.

Senior HMI Researcher
Audi of America — ADAS, San Jose, CA
2023 — Present
  • Conducted mixed-methods research on the usability, acceptance, and situational awareness of hands-free driving systems, leveraging user interviews and inferential statistics to generate actionable insights for production ADAS.
  • Presented key research findings to stakeholders at Audi AG and Porsche AG through workshops, directly influencing ADAS product decisions and Level 2+ system HMI requirements.
  • Led eye-tracking instrumentation and analysis using Smart Eye Pro to assess driver attention and behavior in real-world studies at specialized facilities including the Nevada Automotive Test Center.
  • Consolidated HMI requirements for automated vehicle platforms through user research, regulatory exploration, and data analysis for the North American market.
HMI Researcher
Toyota Research Institute / Woven Planet, NA
2021 — 2023
  • Designed and executed a within-subject A/B testing study using a tabletop driving simulator to evaluate user perception and acceptance of continuous steering guidance systems with HUD visual feedback.
  • Conducted a psychophysical experiment to determine detection thresholds of audio-haptic asynchrony in force-feedback steering wheels, directly informing ADAS interaction design and timing standards.
  • Applied psychophysics-based modeling (MLDS) to evaluate vibration warning perception in force-feedback steering, revealing a linear perceptual relationship to optimize haptic feedback for drivers.
Postdoctoral Researcher
Network Science Institute, Northeastern University
2019 — 2020
  • Analyzed leadership dynamics in small group interactions by applying advanced statistical methods (regressions, multi-level modeling) to verbal and non-verbal behavioral data.
  • Synthesized research trends through a comprehensive meta-study on multimodal communication, identifying key gaps and future research directions in small group interaction research.
PhD Researcher
Rensselaer Polytechnic Institute
2014 — 2019
  • Developed a multi-person tracking system using camera arrays to capture real-time location and orientation data, enabling research on interactive spatial computing in large immersive environments.
  • Evaluated usability and workload of multimodal interaction (voice, gestures, mobile) through human-subject experiments in large immersive environments using validated psychometric instruments.
  • Prototyped and iteratively designed user-to-smart-room interactions through pilot studies with diverse input methods (mobile, LeapMotion, voice, gestures) to refine interactive spatial experiences.
Education

Academic Background

Combining rigorous Computer Science foundations with deep Human-Computer Interaction research expertise.

Doctor of Philosophy
Computer Science
Rensselaer Polytechnic Institute
2014 – 2019
Thesis: Spatially Aware Interactions in Large Scale Immersive Environments
Research Area: Human-Computer Interaction, Spatial Computing, Multimodal Interfaces
Bachelor of Arts
Mathematics & Computer Science
Connecticut College
2009 – 2013
9
Peer-Reviewed Publications
3
Patents (US & PCT)
Research Portfolio

Selected Research

Peer-reviewed studies spanning automotive HMI, psychophysics, and immersive human-computer interaction. Click Key Contributions on each project to view detailed research outcomes.

Real Drive User Studies
🚗
Automotive HMI

Real Drive User Studies — ADAS Prototype Evaluation

High-fidelity evaluation of custom HMI prototypes with real participants at specialized facilities including the Nevada Automotive Test Center. Studies assess usability, situational awareness, and acceptance of hands-free and Level 2+ ADAS at speed.

Real Drive · Eye Tracking · Smart Eye Pro · ADAS · Mixed-Methods · Audi / Porsche
  • Led end-to-end real-drive user research at the Nevada Automotive Test Center, managing the complete study lifecycle from IRB protocol design to final stakeholder reporting for Audi Level 2+ ADAS systems.
  • Instrumented research vehicles with Smart Eye Pro eye-tracking to capture high-fidelity gaze patterns, attention allocation, and behavioral indicators of situational awareness in naturalistic driving at speed.
  • Designed comprehensive mixed-methods protocols combining behavioral observation, semi-structured post-drive interviews, think-aloud techniques, and validated scales (SUS, UMUX-Lite, Van der Laan Acceptance).
  • Applied inferential statistics (ANOVA, t-tests, regression) to eye-tracking and self-report data, extracting insights on trust calibration and driver acceptance of hands-free driving systems.
  • Delivered stakeholder workshops and research briefings directly to Audi AG and Porsche AG engineering and product teams in Germany, influencing ADAS HMI requirements for production vehicles.
  • Contributed to HMI requirements documentation for next-generation automated vehicle platforms, incorporating analysis of the North American regulatory landscape (SAE Levels, NHTSA guidelines).
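The inferential comparisons described above can be sketched with a paired t-test, the appropriate test when the same participants experience both HMI conditions. This is an illustrative sketch only: the data, sample size, and variable names below are simulated, not study data.

```python
import numpy as np
from scipy import stats

# Hypothetical within-subject data: percent eyes-off-road time per
# participant under two HMI conditions (simulated values, not study data).
rng = np.random.default_rng(0)
baseline = rng.normal(22.0, 4.0, size=24)             # current HMI
prototype = baseline - rng.normal(3.0, 2.0, size=24)  # prototype HMI

# Paired t-test: each participant drove both conditions.
t_stat, p_value = stats.ttest_rel(baseline, prototype)

# Cohen's d for paired samples quantifies the effect size.
diff = baseline - prototype
cohens_d = diff.mean() / diff.std(ddof=1)

print(f"t({diff.size - 1}) = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}")
```

Pairing each driver with themselves removes between-subject variance, which is why within-subject designs need far fewer participants than between-subject ones for the same statistical power.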
Audio-Haptic Asynchrony Study
🎧
Psychophysics

Detection Threshold of Audio-Haptic Asynchrony in a Driving Context

Investigated the perceptual detection threshold (DT) for temporal asynchrony between audio and haptic feedback in a Sensodrive force-feedback steering wheel. Results directly inform ADAS multimodal feedback timing standards.

Psychophysics · Haptic Feedback · Driving Simulator · MLDS · arXiv 2023
  • Formulated and experimentally tested the open question of whether drivers can reliably detect millisecond-level temporal asynchrony between audio and haptic ADAS feedback cues.
  • Engineered a controlled psychophysical paradigm using a Sensodrive force-feedback steering wheel, delivering precisely timed audio-haptic stimulus pairs under realistic simulated driving conditions.
  • Applied Maximum Likelihood Difference Scaling (MLDS), a maximum-likelihood psychophysical scaling method, to derive detection threshold estimates with statistical confidence intervals.
  • Established empirical detection threshold values (in milliseconds) defining the minimum synchrony window for multimodal ADAS alerts, providing a quantitative design specification for real-time haptic-audio rendering.
  • Results revealed drivers are more sensitive to audio-haptic asynchrony than prior literature suggested, directly setting new timing requirements for production ADAS haptic-alert architectures.
  • Published as a peer-reviewed preprint (arXiv:2307.05451), contributing foundational psychophysical benchmarks to the automotive HCI and ADAS multimodal interface design community.
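The study itself used MLDS; as a simpler stand-in, the core idea of threshold estimation can be illustrated by fitting a logistic psychometric function to detection proportions and reading off the 50%-detection point. The offsets and proportions below are invented for illustration, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def psychometric(asynchrony_ms, threshold, slope):
    """Logistic psychometric function: P(detect) as a function of offset."""
    return 1.0 / (1.0 + np.exp(-slope * (asynchrony_ms - threshold)))

# Hypothetical detection data: proportion of "asynchrony detected"
# responses at each audio-haptic offset (illustrative values only).
offsets_ms = np.array([0, 20, 40, 60, 80, 100, 120])
p_detect = np.array([0.05, 0.10, 0.30, 0.55, 0.80, 0.92, 0.97])

params, _ = curve_fit(psychometric, offsets_ms, p_detect, p0=[60.0, 0.05])
threshold_ms, slope = params

# By convention, the 50%-detection point is the detection threshold (DT).
print(f"Estimated DT ≈ {threshold_ms:.1f} ms")
```

The fitted threshold is exactly the kind of quantitative design specification the bullets above describe: any multimodal alert whose audio-haptic offset stays below the DT will be perceived as synchronous.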
Haptic Lateral Assistance Study
🎮
Automotive HMI

Continuous Visual Feedback for Haptic Lateral Assistance

Investigated user acceptance of continuous visual risk feedback for lateral assistance systems, combining a simulated Head-Up Display (HUD) with haptic steering feedback. Within-subject A/B test on a tabletop driving simulator.

Lateral Assist · HUD Design · A/B Testing · Usability · NASA-TLX · arXiv 2023
  • Developed a novel HMI prototype combining a simulated HUD rendering continuous real-time risk visualization with haptic torque guidance in a lateral lane-keeping assistance system.
  • Designed and executed a rigorous within-subject A/B controlled experiment on a tabletop driving simulator, directly comparing haptic-only vs. haptic + continuous visual risk feedback conditions.
  • Administered the Van der Laan Acceptance Scale, System Usability Scale (SUS), and NASA Task Load Index (NASA-TLX) to quantify acceptance, usability, and cognitive workload.
  • Conducted and coded think-aloud protocol sessions; applied thematic analysis and affinity mapping to surface qualitative insights on user mental models of lateral assist behavior.
  • Demonstrated statistically significant improvements in user acceptance and usability for the HUD+haptic condition, while cognitive workload remained stable — validating the integrated HMI design approach.
  • Findings directly informed HMI design recommendations for continuous guidance displays in Toyota Research Institute / Woven Planet ADAS prototypes, contributing to product design decisions.
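The System Usability Scale administered above follows a fixed, public scoring rule; a minimal scorer, assuming standard 1–5 Likert responses given in item order:

```python
import numpy as np

def sus_score(responses):
    """Score one participant's 10-item SUS questionnaire (1-5 Likert).

    Odd items (1, 3, 5, 7, 9) are positively worded: contribution = item - 1.
    Even items (2, 4, 6, 8, 10) are negatively worded: contribution = 5 - item.
    The summed contributions are scaled by 2.5 onto a 0-100 range.
    """
    r = np.asarray(responses, dtype=float)
    odd = r[0::2] - 1       # items 1, 3, 5, 7, 9
    even = 5 - r[1::2]      # items 2, 4, 6, 8, 10
    return (odd.sum() + even.sum()) * 2.5

# Hypothetical participant: fully agrees with positive items,
# fully disagrees with negative items -> maximum score of 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))
```

Note that a SUS score is not a percentage; 68 is the commonly cited average, so scores are usually interpreted against that benchmark rather than against 50.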
Driving Simulation Lab to Road
🛣
Automotive HMI

Extending Driving Simulation from Lab to the Road

Collaborative project with Cornell University investigating the ecological validity of driving simulation studies — systematically comparing controlled lab environments to in-vehicle real-world road conditions. Published at ACM CHI 2024.

Ecological Validity · CHI 2024 · Mixed-Methods · External Validity · Cornell Collaboration
  • Co-designed parallel experimental protocols with Cornell University enabling systematic, head-to-head comparison of tabletop driving simulator vs. real-world on-road study environments.
  • Coordinated multi-site data collection across simulation lab and real-world road conditions, managing participant logistics, IRB compliance, and rigorous data quality assurance across both environments.
  • Synthesized behavioral, eye-tracking, and self-report data across environments using mixed-effects modeling and statistical equivalence testing to assess methodological transferability.
  • Identified specific task conditions and behavioral metrics where simulator findings reliably generalize to on-road driving — and where they critically diverge — providing actionable guidance for study design.
  • Accepted at ACM CHI 2024 — the world's premier Human-Computer Interaction conference — demonstrating research impact at the highest academic level in HCI and user research.
  • Findings provide evidence-based guidelines for the UX and HMI research community on when and how to design externally valid driving simulation studies.
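Equivalence testing of the kind described above can be sketched as a paired TOST (two one-sided tests): simulator and on-road measurements count as statistically equivalent when the mean difference lies inside a pre-specified bound. The data, bound, and function name below are hypothetical, for illustration only.

```python
import numpy as np
from scipy import stats

def tost_paired(x, y, bound):
    """Two one-sided tests (TOST) for equivalence of paired measures.

    Equivalence is claimed when the mean difference lies within
    [-bound, +bound]; the reported p-value is the larger of the
    two one-sided p-values.
    """
    diff = np.asarray(x) - np.asarray(y)
    n = diff.size
    se = diff.std(ddof=1) / np.sqrt(n)
    t_lower = (diff.mean() + bound) / se   # H0: mean diff <= -bound
    t_upper = (diff.mean() - bound) / se   # H0: mean diff >= +bound
    p_lower = 1 - stats.t.cdf(t_lower, n - 1)
    p_upper = stats.t.cdf(t_upper, n - 1)
    return max(p_lower, p_upper)

# Hypothetical metric measured in simulator and on-road for 25 drivers.
rng = np.random.default_rng(2)
sim = rng.normal(10.0, 1.0, 25)
road = sim + rng.normal(0.0, 0.3, 25)  # environments nearly identical

p = tost_paired(sim, road, bound=0.5)
print(f"TOST p = {p:.4f}")
```

A standard t-test can only fail to find a difference; TOST lets a study positively conclude that simulator and road results agree, which is the claim transferability arguments actually need.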
Multi-Person Spatial Interaction
🖥
Immersive HCI

Multi-Person Spatial Interaction in Large Immersive Displays

Investigated multi-person collaborative interaction with a large-scale immersive display using smartphones as touchpads, voice commands, and spatial location awareness. Published in Springer IntelliSys proceedings.

Spatial HCI · Multi-User · Springer 2021 · Usability · Camera Arrays
  • Designed and developed a real-time multi-person location and orientation tracking system using camera arrays in the CRAIVE-Lab, a human-scale immersive display environment at RPI.
  • Engineered multiple collaborative interaction prototypes for large-display control: smartphone-as-touchpad, voice command, LeapMotion gesture, and full-body spatial positioning modalities.
  • Conducted controlled human-subjects experiments with groups of 2–4 participants, evaluating interaction efficiency, group collaboration dynamics, and usability across all modalities.
  • Applied SUS, task completion time, error rate analysis, and qualitative coding to identify usability bottlenecks; iterated through multiple design cycles using human-centered methods.
  • Demonstrated that smartphone-as-touchpad achieved highest usability scores and supported simultaneous multi-user control — a key design finding for collaborative spatial and smart space interfaces.
  • Published in Springer IntelliSys 2021 proceedings; findings contributed to PCT patent (PCT/US2022/051474) on multi-sensor immersive virtual environment systems.
Multimodal Data Survey
📑
Survey & Review

Multimodal Data & Interactions in Group Communication Research

Comprehensive review paper exploring tools, technologies, and methodological approaches for multimodal data research in dynamic group interactions — spanning audio, video, physiological, and behavioral modalities.

Multimodal · Group Dynamics · Systematic Review · arXiv 2024
  • Conducted a systematic literature review covering 200+ publications at the intersection of multimodal sensing and group communication research, spanning audio, video, physiological (EEG, EDA), and behavioral modalities.
  • Synthesized diverse methodological frameworks, sensing technologies (microphone arrays, cameras, biosensors, eye-trackers, motion capture), and analytical approaches into a coherent multi-disciplinary landscape.
  • Developed a novel taxonomic classification system for multimodal data modalities, collection tools, and analytical methods — providing a structured reference framework for future researchers.
  • Identified critical under-researched areas including real-world naturalistic multimodal capture, multi-person temporal fusion, and cross-cultural group dynamics, outlining a research agenda for the field.
  • Co-authored with an interdisciplinary team from Northeastern University Network Science Institute, combining expertise from HCI, network science, communication research, and computational social science.
  • Published as open-access preprint (arXiv:2401.15194), providing a comprehensive community resource for researchers at the intersection of multimodal sensing, AI, and human group interaction.
Scholarly Output

Publications & Patents

9 peer-reviewed publications spanning automotive HCI, psychophysics, and immersive computing.

1
Extending Driving Simulation from Lab to the Road
F. Bu, S. Li, G. Sharma, W. Ju, et al.
ACM CHI Conference on Human Factors in Computing Systems, 2024
2024 ↗ PDF
2
Multimodality in Group Communication Research: A Survey
R. Lange, R. J. Radke, G. Sharma, et al.
arXiv preprint arXiv:2401.15194, 2024
2024 ↗ PDF
3
Continuous Visual Feedback of Risk for Haptic Lateral Assistance
G. Sharma, H. Yasuda, and M. Kuehner
arXiv preprint arXiv:2301.10933, 2023
2023 ↗ PDF
4
Detection Threshold of Audio-Haptic Asynchrony in a Driving Context
G. Sharma, H. Yasuda, and M. Kuehner
arXiv preprint arXiv:2307.05451, 2023
2023 ↗ PDF
5
Multi-Person Spatial Interaction in a Large Immersive Display Using Smartphones as Touchpads
G. Sharma and R. J. Radke
Proceedings of IntelliSys 2020, Vol. 3, Springer, 2021, pp. 285–302
6
Occupant Location and Gesture Estimation in Large-Scale Immersive Spaces
D. Jivani, G. Sharma, and R. J. Radke
Living Labs Workshop, ACM CHI, 2018
2018
7
Manipulating Screen Elements in an Immersive Environment with a Wrist-Mounted Device and Free Body Movement
G. Sharma, D. Jivani, and R. Radke
Living Labs Workshop, ACM CHI, 2018
2018
8
Interactions in a Human-Scale Immersive Environment: The CRAIVE-Lab
G. Sharma, J. Braasch, and R. J. Radke
Cross-Surface 2016, ACM ISS, 2017
2017
9
Bridging Printed Music and Audio Through Alignment Using a Mid-Level Score Representation
Ö. Izmirli and G. Sharma
International Society for Music Information Retrieval (ISMIR), 2012, pp. 61–66
2012
🎓 View All on Google Scholar ↗
Intellectual Property

Patents

Three patent applications (US and PCT) spanning automotive safety systems and immersive spatial computing.

2022 · US Patent Application
Systems and Methods for Enhancing Operator Vigilance
U.S. Pat. App. No. 18/095,286
H. Yasuda, M. Kuehner, G. Sharma, et al.
2022 · PCT International Application
Multi-Sensor Systems and Methods for Providing Immersive Virtual Environments
PCT/US2022/051474
G. Sharma, J. Mathews, J. Braasch, R. J. Radke, D. Jivani
2017 · US Patent Application
Hybrid Virtual and Physical Jewelry Shopping Experience
US2018/0357702A1
G. Sharma, M. Nawhal, A. Prakash, et al.
Capabilities

Skills & Expertise

A comprehensive research toolkit developed across academic, postdoctoral, and industry environments — targeted toward senior UX and HMI research roles.

🔭
Research Design & Methodology
Experimental Design · Within-Subjects · Between-Subjects · Psychophysical Methods · Threshold Detection · MLDS · Pilot Studies · Wizard-of-Oz · A/B Testing
📈
Quantitative Analysis
Regression Analysis · ANOVA · t-tests · Multi-level Modeling · Factor Analysis · Likert Scaling · Inferential Statistics · Python · R · MATLAB
🎯
User Research Methods
Real Drive Studies · Mixed-Methods · In-Depth Interviews · Usability Testing · Think-Aloud Protocol · Survey Design · Qualtrics · Thematic Coding · Affinity Mapping · Personas
👁
Instrumentation & Tools
Smart Eye Pro · Eye Tracking · Gaze Metrics · Sensodrive Wheel · Driving Simulator · Raspberry Pi · UserInterviews.com · Camera Arrays
🚗
Automotive & HMI Domain
ADAS Systems · Hands-Free Driving · Haptic Feedback · HUD Design · Lateral Assist · Situational Awareness · Human Factors · Safety Standards · SAE Levels
📢
Communication & Collaboration
Stakeholder Reporting · Research Workshops · Cross-functional Teams · Requirements Translation · Regulatory Compliance · Academic Writing · Technical Presentations
Get in Touch

Let's Talk Research

I'm open to discussing new research opportunities, senior UX or HMI researcher roles, academic collaborations, and consulting engagements in automotive human factors and user research.

📍
Location
Foster City, CA
🔬
Open to Opportunities

Targeting senior roles in UX Research, HMI Research, and Human Factors — particularly in automotive, mobility, and emerging technology domains. Open to industry and research leadership positions.

Available for new opportunities
Senior UX Researcher · HMI Researcher · Human Factors Specialist
✉ Send an Email 🔗 Connect on LinkedIn