
IMMERSIVE GAZE SHARING FOR ENHANCING COMMUNICATION IN DESIGN EDUCATION: AN INITIAL USER STUDY IN THE CONTEXT OF ARCHITECTURAL DESIGN CRITIQUES

Published online by Cambridge University Press: 19 June 2023

Yuval Kahlon*
Affiliation: Tokyo Institute of Technology
Santosh Maurya
Affiliation: Hitachi Ltd.
Haruyuki Fujii
Affiliation: Tokyo Institute of Technology
*Corresponding author: Yuval Kahlon, Tokyo Institute of Technology, Japan, kahlon.y.aa@m.titech.ac.jp

Abstract


Gaze sharing is an emerging technology that can enhance human communication and collaboration. As such, it is expected to play an important role in future educational practices. To date, however, its application has not been explored in the context of design education, and it has mainly been implemented in non-immersive environments, which are limited in their potential for engaging learners. Acknowledging the growing interest in immersive learning, as well as the promise of gaze sharing technology, we strive to develop an immersive gaze sharing environment to support design education. As a first step, we implemented an immersive gaze sharing system for supporting design learners and tested it in an architectural design scenario, with the aim of collecting user feedback regarding its usability and potential. Our initial user study informs developers of such systems about issues that may be encountered during deployment in a real-world setting and proposes concrete ways to address them. These insights can help pave the way for integrating gaze sharing systems into design education practices in the near future.

Type: Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is unaltered and is properly cited. The written permission of Cambridge University Press must be obtained for commercial re-use or in order to create a derivative work.
Copyright
The Author(s), 2023. Published by Cambridge University Press
