
Assessing User Experience via Biometric Sensor Affect Detection

Author(s): Irfan Kula (Arizona State University, USA), Russell J. Branaghan (Arizona State University, USA), Robert K. Atkinson (Arizona State University, USA), and Rod D. Roscoe (Arizona State University, USA)
Copyright: 2018
Pages: 17
Source title: End-User Considerations in Educational Technology Design
Source Author(s)/Editor(s): Rod D. Roscoe (Arizona State University, USA), Scotty D. Craig (Arizona State University, USA), and Ian Douglas (Arizona State University, USA)
DOI: 10.4018/978-1-5225-2639-1.ch006


Abstract

Traditional user experience assessments rely on self-report, human-system performance, and observational data that only incompletely capture users' psychological demands, processing, and affect. Self-report measures require users to identify and articulate subjective responses to product features, yet users may lack accurate awareness of those responses or may be unwilling or unable to express them. Similarly, human-system performance and observational measures require analysts to infer hidden psychological states from observable external patterns. This chapter discusses how biometric sensor-based affect detection technologies (e.g., eye tracking and EEG) can supplement traditional methods. By measuring biometric indicators of psychological states, researchers can gain richer and potentially more accurate insights into user experience. Because these technologies are already gaining traction in educational technology development and functionality, extending them to usability and user experience evaluation is highly feasible.
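
The chapter itself is descriptive rather than code-based, but as a rough illustration of the kind of signal summarization that sensor-based affect detection involves, the following Python sketch reduces hypothetical eye-tracking fixations to simple cognitive-load proxies. The data format, field names, baseline value, and interpretation are assumptions for demonstration only, not the authors' method.

```python
# Illustrative sketch only (not from the chapter): summarizing eye-tracking
# fixations into crude proxies for processing demand. Longer mean fixation
# durations and greater pupil dilation relative to a resting baseline are
# often treated as rough indicators of increased cognitive load; real
# studies use far more careful measurement and modeling.

from dataclasses import dataclass
from statistics import mean, pstdev

@dataclass
class Fixation:
    duration_ms: float        # how long the gaze rested on one point
    pupil_diameter_mm: float  # pupil size recorded during the fixation

def summarize_task(fixations: list[Fixation], baseline_pupil_mm: float) -> dict:
    """Summarize one task's fixations into simple load/arousal proxies."""
    durations = [f.duration_ms for f in fixations]
    dilation = [f.pupil_diameter_mm - baseline_pupil_mm for f in fixations]
    return {
        "mean_fixation_ms": mean(durations),
        "fixation_sd_ms": pstdev(durations),
        "mean_pupil_dilation_mm": mean(dilation),
    }

# Example usage with made-up numbers:
task = [Fixation(220, 3.4), Fixation(480, 3.9), Fixation(310, 3.7)]
print(summarize_task(task, baseline_pupil_mm=3.2))
```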
