Human-Computer Interaction Lab

School of Computing and Informatics

Research

The lab's research areas include 3D user interfaces, human-computer interaction (HCI), games, and virtual reality. We currently have several ongoing projects, described below.


Eye gaze visualization techniques:


Eye gaze visualization techniques can help a teacher detect distracted students in a Virtual Reality (VR) environment. Keeping students engaged in a classroom is always a challenging task for a teacher, and in a virtual environment it is easy for students to become distracted or confused. We believe that by tracking students' eye gaze, a teacher could identify distracted or confused students and guide them toward objects of interest in the VR scene.
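As a rough illustration of the underlying idea (a simplified sketch, not the lab's implementation), the example below casts a student's gaze ray against a sphere proxy for an object of interest and flags the student once their gaze has stayed off all objects of interest for longer than a threshold. The object position, sphere radius, and two-second threshold are illustrative assumptions.

```python
# Minimal sketch: flag a student as possibly distracted when their gaze has
# not rested on any object of interest for a while. Geometry and thresholds
# are illustrative assumptions, not the lab's implementation.
import numpy as np

def gaze_hits_object(origin, direction, center, radius):
    """Return True if the gaze ray intersects a sphere proxy for an object."""
    d = direction / np.linalg.norm(direction)
    oc = center - origin
    t = np.dot(oc, d)                        # closest approach along the ray
    closest = origin + max(t, 0.0) * d
    return np.linalg.norm(center - closest) <= radius

def update_distraction(time_off_target, dt, hit_any, threshold=2.0):
    """Accumulate time spent looking away; report distraction past a threshold."""
    time_off_target = 0.0 if hit_any else time_off_target + dt
    return time_off_target, time_off_target > threshold

# Example frame: one object of interest at (0, 1.5, 2) with a 0.3 m radius.
eye_pos = np.array([0.0, 1.6, 0.0])
gaze_dir = np.array([0.0, -0.02, 1.0])
hit = gaze_hits_object(eye_pos, gaze_dir, np.array([0.0, 1.5, 2.0]), 0.3)
off_time, distracted = update_distraction(0.0, 1 / 90, hit)  # 90 Hz headset
print(hit, distracted)
```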



Measuring the attention level of students:


The attention level of students can be measured using several sensors, such as EEG, heart rate monitors, and eye trackers. Attention cannot be measured by tracking gaze alone, since a student could be looking at an object of interest while thinking about something else. We therefore plan to combine eye tracking with other sensors.
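As a hedged sketch of the multi-sensor idea (not the published pipeline), the example below fuses a simple gaze-dispersion feature with mean EEG band-power features and trains an off-the-shelf classifier on synthetic labeled windows. The feature choices, window length, and data are assumptions for illustration only.

```python
# Minimal sketch: combine gaze and EEG features to classify attention state.
# Features, window length, and synthetic data are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def window_features(gaze_xy, eeg_alpha, eeg_theta):
    """Per-window features: gaze dispersion plus mean EEG band power."""
    dispersion = gaze_xy.std(axis=0).mean()          # spread of gaze samples
    return [dispersion, eeg_alpha.mean(), eeg_theta.mean()]

# Synthetic stand-in for labeled windows (attentive = 0, distracted = 1).
y_all = rng.integers(0, 2, 200)
X = np.array([window_features(rng.normal(0, 0.5 + y, (90, 2)),
                              rng.normal(10 + 5 * y, 1, 90),
                              rng.normal(6 + 2 * y, 1, 90))
              for y in y_all])

X_tr, X_te, y_tr, y_te = train_test_split(X, y_all, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```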


Publications
  • Asish, S. M., Kulshreshth, A., and Borst, C. W., "Internal Distraction Detection Utilizing EEG Data in an Educational VR Environment", Proceedings of the ACM Symposium on Applied Perception (ACM SAP 2023), Article 8, 1-10, 2023.
  • Asish, S. M., Kulshreshth, A., and Borst, C. W., "Detecting Distracted Students in an Educational VR Environment Using Machine Learning on EEG and Eye-Gaze Data", Proceedings of the IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (IEEE-VR 2023), March 2023.
  • Asish, S. M., Kulshreshth, A., and Borst, C. W., "Detecting Distracted Students in Educational VR Environments Using Machine Learning on Eye Gaze Data", Computers & Graphics, Volume 109, 75-87, 2022.
  • Asish, S. M., Kulshreshth, A., and Borst, C. W., "User Identification Utilizing Minimal Eye Gaze Features in Virtual Reality Applications", Virtual Worlds, Volume 1, 42-61, 2022.
  • Asish, S. M., Kulshreshth, A., and Borst, C. W., "Detecting Internal Distraction in an Educational VR Environment Using EEG Data", Proceedings of the ACM Symposium on Spatial User Interaction (SUI 2022), December 2022.
  • Asish, S. M., Kulshreshth, A., and Borst, C. W., "Supervised vs Unsupervised Learning on Gaze Data to Classify Student Distraction Level in an Educational VR Environment", Proceedings of the ACM Symposium on Spatial User Interaction (SUI 2021), November 2021.
  • Asish, S. M., Hossain, E., Kulshreshth, A., and Borst, C. W., "Deep Learning on Eye Gaze Data to Classify Student Distraction Level in an Educational VR Environment", Proceedings of the International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments (ICAT-EGVE 2021), September 2021.


Dynamic adjustment of game difficulty level:


Dynamically adjusting the difficulty level of a video game can help keep the player engaged and having fun. If a player is too skilled for the current level, they may lose interest and stop playing; likewise, if the game is too difficult, they may become stressed and quit. Dynamic difficulty adjustment (DDA) is a method of automatically modifying a game's features in real time. We plan to use EEG, heart rate, and other biomarkers to gauge the player's mental state, adjust game difficulty automatically, and compare this approach with traditional difficulty models.
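A minimal sketch of such a feedback loop is shown below; the heart-rate-based stress proxy, the target band, and the step size are illustrative assumptions rather than the model we plan to evaluate.

```python
# Minimal sketch: nudge difficulty toward a target "engagement" band using a
# crude heart-rate-based stress proxy. All thresholds are illustrative.
def estimate_stress(heart_rate_bpm, baseline_bpm):
    """Crude arousal proxy: relative elevation of heart rate over baseline."""
    return max(0.0, (heart_rate_bpm - baseline_bpm) / baseline_bpm)

def adjust_difficulty(difficulty, stress, low=0.05, high=0.20, step=0.1):
    """Raise difficulty when the player seems under-challenged, lower it when
    they seem overstressed, and clamp the result to [0, 1]."""
    if stress < low:        # barely elevated: likely bored or coasting
        difficulty += step
    elif stress > high:     # strongly elevated: likely frustrated
        difficulty -= step
    return min(1.0, max(0.0, difficulty))

# Example: resting rate 70 bpm, current rate 88 bpm, mid difficulty.
difficulty = 0.5
stress = estimate_stress(88, 70)               # about 0.26, above the band
difficulty = adjust_difficulty(difficulty, stress)
print(f"stress={stress:.2f}, new difficulty={difficulty:.1f}")   # 0.4
```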


Interface for Enhanced Teacher Awareness:


Educational virtual reality (VR) applications are a recent addition to the set of learning management tools. Due to health concerns, financial concerns, and convenience, people are looking for alternative ways to teach and learn, and an efficient VR-based teaching interface could improve student engagement, learning outcomes, and the overall educational experience. Typically, teachers in a VR classroom have no way to know what students are doing, since the students are not directly visible. An effective teaching interface should therefore give the teacher a mechanism to monitor students and should alert the teacher when a student is trying to get their attention. An ideal interface is one that helps a teacher effectively monitor students while teaching, without increasing the teacher's cognitive load.



Location Scouting in VR


The University of Louisiana at Lafayette, in collaboration with Louisiana Economic Development, is developing a virtual reality (VR) location scouting platform led by the university's HCI Lab. The project, still under development, is being created in close partnership with the Moving Images Arts program, ensuring that the platform aligns with the needs of students and the local industry. The VR app is designed to let users virtually explore, interact with, and annotate potential filming locations throughout Louisiana, offering a cost-effective and efficient alternative to traditional scouting methods and allowing filmmakers and producers to compare virtual environments with real-world locations. By combining VR technology with the expertise of the Moving Images Arts program and feedback from local industry, the initiative aims to enhance Louisiana's appeal as a destination for film production. An accompanying video gives a glimpse of the app's ongoing development and its potential to change how locations are scouted for film and television projects in Louisiana and beyond.


Effectiveness of Visual Acuity Tests in VR vs the Real-World

Virtual Reality (VR) devices offer an immersive way to bring technology and healthcare together, including for vision testing. Visual acuity is a person's capacity to perceive fine detail, and an optometrist or ophthalmologist determines a visual acuity score following a vision examination. In this work, we explored how recent VR devices could be used to conduct visual acuity tests. We used two Snellen charts to examine vision in VR, similar to testing in a doctor's office, and found that VR could be used to conduct preliminary vision tests.
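For context, the sketch below uses standard Snellen geometry (a 20/20 optotype subtends 5 arcminutes at the test distance) to size chart letters at a virtual viewing distance; the 6 m distance is an illustrative choice and not necessarily the exact setup used in the study.

```python
# Minimal sketch: size Snellen chart letters in a VR scene so they subtend
# the intended visual angle at the virtual viewing distance. The 6 m
# distance is an illustrative assumption.
import math

ARCMIN = math.pi / (180 * 60)   # one arcminute in radians

def optotype_height(distance_m, snellen_denominator, numerator=20):
    """Letter height in meters: a 20/20 optotype subtends 5 arcmin; larger
    Snellen lines scale linearly (e.g. 20/40 letters are twice as tall)."""
    angle = 5 * ARCMIN * snellen_denominator / numerator
    return 2 * distance_m * math.tan(angle / 2)

for line in (200, 100, 40, 20):
    h_mm = optotype_height(6.0, line) * 1000
    print(f"20/{line}: {h_mm:.1f} mm at 6 m")
# A 20/20 letter works out to roughly 8.7 mm tall at a 6 m virtual distance.
```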

Publications

  • Asish, S. M., Salazar, R. E., and Kulshreshth, A., "Effectiveness of Visual Acuity Tests in VR vs the Real-World", Proceedings of the 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE-VR 2024), March 2024.