Tuğçe Nur Pekçetin

Cognitive Science Ph.D. · Postdoctoral Fulbright Scholar

Research

Social Cognition & Mind Perception in HRI

My doctoral research explored the dynamics of mind perception, specifically how humans ascribe intentions and feelings to robots. To uncover the determinants of this process, I investigated the effects of agent type and action type, as well as generational and individual differences, using a novel mixed-methods approach that combines real-time implicit metrics with explicit measures.

Human Perception of Robot Actions

How do humans perceive and interpret robot actions? This research thread investigates the cognitive mechanisms underlying action understanding in HRI. We explore how people differentiate between communicative and noncommunicative robot behaviors, examining the role of multisensory cues and context in guiding action interpretation.

Naturalistic Laboratory Design

To bridge the gap between rigid laboratory experiments and uncontrolled real-world interactions, we developed a novel naturalistic laboratory setup. In this setup, we use Transparent OLED technology to present live human and robot actors performing actions as experimental stimuli. This framework allows us to achieve high ecological validity while maintaining rigorous experimental control.

Trust Dynamics in HRI

In this emerging line of research, I am investigating the dynamics of trust calibration. My focus is on developing methodologies that align human expectations with robot capabilities for safer and more resilient collaboration.

Eye-Tracking & Reading Research

Before transitioning to HRI, my research in cognitive science focused on how humans process visual information. Specifically, I utilized eye-tracking methodologies to study eye movement control patterns and contributed to the development of a comprehensive eye-movement dataset.