Emoji Essence: Detecting User Emotional Response on Visual Centre Field with Emoticons

Authors

  • Fatima Isiaka, Department of Computer Science, Nasarawa State University, Keffi, Nigeria
  • Zainab Adamu, Department of Computer Science, Ahmadu Bello University, Zaria, Nigeria

DOI:

https://doi.org/10.30564/jcsr.v3i4.3577

Abstract

User experience is typically assessed in several ways, such as one-on-one interviews (subjective views), online surveys, and questionnaires, all of which aim to capture the user's implicit response. This paper demonstrates the underlying user emotion on a particular interface, such as the visual content of a webpage, and uses emoji to convey that emotion on the interface based on the context of familiarisation. We integrated physiological readings with eye movement behaviour to convey user emotion on the visual centre field of a web interface. The physiological readings are synchronised with the eye tracker to obtain correlated user interaction, and emoticons are used as a form of emotion conveyance on the interface. Eye movement prediction is obtained through a control system loop and is represented by gaze points (GT) displayed in different colours, which detect a particular user's emotion on the webpage interface; these gaze points are interpreted by the emoticons. Results show synchronised readings that correlate with the areas of interest (AOI) of the webpage and with user emotion. These are prototypical instances of capturing authentic user responses to a computer interface, making it possible to identify user responses without subjective self-reports and thereby support better and easier design decisions.
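The pipeline the abstract describes — synchronised physiological and gaze readings mapped to an emoticon for the area of interest (AOI) a user is fixating — could be sketched as below. This is a minimal illustration only: the `Sample` fields, thresholds, grid size, and emoji labels are all assumptions for the sketch, not values or rules taken from the paper.

```python
# Hypothetical sketch: map synchronised eye-tracker + physiological
# samples to emoticon labels per AOI. All thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class Sample:
    gaze_x: float            # gaze point on screen (pixels)
    gaze_y: float
    fixation_ms: float       # fixation duration at this gaze point
    scl_microsiemens: float  # skin conductance level (arousal proxy)

def classify_emotion(sample: Sample) -> str:
    """Map one synchronised sample to an emoticon label.

    High arousal with a long fixation is read as engagement, high
    arousal with a short fixation as possible stress or confusion,
    and low arousal as calm attention or low engagement.
    All cut-off values below are assumptions.
    """
    aroused = sample.scl_microsiemens > 5.0  # assumed arousal threshold
    dwelling = sample.fixation_ms > 300.0    # assumed fixation threshold
    if aroused and dwelling:
        return "\U0001F60A"   # positive engagement
    if aroused:
        return "\U0001F61F"   # possible stress / confusion
    if dwelling:
        return "\U0001F610"   # calm attention
    return "\U0001F634"       # low engagement

def annotate_aoi(samples: list[Sample]) -> dict[tuple[int, int], str]:
    """Bucket gaze points into a coarse AOI grid (100 px cells) and
    tag each cell with the emotion of the last sample inside it."""
    grid: dict[tuple[int, int], str] = {}
    for s in samples:
        cell = (int(s.gaze_x) // 100, int(s.gaze_y) // 100)
        grid[cell] = classify_emotion(s)
    return grid
```

In a real system the classification step would be trained on the synchronised recordings rather than hard-coded thresholds, and the grid would be replaced by the designer-defined AOIs of the webpage.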

Keywords:

Emotion emblem, Emoticons, Visual expression, Area of interest, Ergonomics, User interaction, Web interface

References

[1] Huang, Y., Gursoy, D., Zhang, M., Nunkoo, R., and Shi, S. Interactivity in online chat: Conversational cues and visual cues in the service recovery process. International Journal of Information Management (2021), 60, 102360.

[2] Lazar, J., and Hochheiser, H. Measuring the Human. In Research Methods in Human-Computer Interaction, (2017), 34-56.

[3] Isiaka, Fatima. Modelling stress levels based on physiological responses to web contents, Sheffield Hallam University, (2017).

[4] Fang, Z.-G., Kong, X.-Z., and Xu, J. Design of Eye Movement Interactive Interface and Example Development. Information Technology Journal, Asian Network for Scientific Information, (2013), 1981-1987.

[5] Goldberg, J. H., and Kotval, X. P. Computer interface evaluation using eye movements: methods and constructs. International Journal of Industrial Ergonomics, (1999), 24(6), 631-645.

[6] Iáñez, E., Azorin, J. M., and Perez-Vidal, C. Using eye movement to control a computer: A design for a lightweight electro-oculogram electrode array and computer interface. PloS one, (2013), 8(7), e67099.

[7] Kim, K. N., and Ramakrishna, R. S. Vision-based eye-gaze tracking for human computer interface. In IEEE SMC'99 Conference Proceedings. IEEE International Conference on Systems, Man, and Cybernetics (Cat. No. 99CH37028), (1999), Vol. 2, pp. 324-329.

[8] Jacob, R. J. What you look at is what you get: eye movement-based interaction techniques. In Proceedings of the SIGCHI conference on Human factors in computing systems, (1990), 11-18.

[9] Ding, Q., Tong, K., and Li, G. Development of an EOG (electro-oculography) based human-computer interface. In 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference, (2005), pp. 6829-6831.

[10] Deng, L. Y., Hsu, C. L., Lin, T. C., Tuan, J. S., and Chang, S. M. EOG-based Human-Computer Interface system development. Expert Systems with Applications, (2010), 37(4), 3337-3343.

[11] Lv, Z., Wu, X. P., Li, M., and Zhang, D. A novel eye movement detection algorithm for EOG driven human computer interface. Pattern Recognition Letters, (2010), 31(9), 1041-1047.

[12] Gu, X., Cao, Z., Jolfaei, A., Xu, P., Wu, D., Jung, T. P., and Lin, C. T. EEG-based brain-computer interfaces (BCIs): A survey of recent studies on signal sensing technologies and computational intelligence approaches and their applications. IEEE/ACM Transactions on Computational Biology and Bioinformatics, (2021).

[13] Juhola, M., Zhang, Y., and Rasku, J. Biometric verification of a subject through eye movements. Computers in Biology and Medicine, (2013), 43(1), 42-50.

[14] Tecce, J. J., Gips, J., Olivieri, C. P., Pok, L. J., and Consiglio, M. R. Eye movement control of computer functions. International Journal of Psychophysiology, 29(3), 319-325.

[15] Triadi, T., Wijayanto, I., and Hadiyoso, S. Electrooculogram (EOG) based Mouse Cursor Controller Using the Continuous Wavelet Transform and Statistic Features. Lontar Komputer: Jurnal Ilmiah Teknologi Informasi, (2021), 12(1), 53-61.

[16] Zamora, M., Toth, R., Morgante, F., Ottaway, J., Gillbe, T., Martin, S. and Denison, T. DyNeuMo Mk1: Design and Pilot Validation of an Investigational Motion-Adaptive Neurostimulator with Integrated Chronotherapy. bioRxiv, (2020), 19-29.

[17] Prendinger, H., Mori, J., and Ishizuka, M. Using human physiology to evaluate subtle expressivity of a virtual quizmaster in a mathematical game. International Journal of Human-Computer Studies, (2005), 62(2), 231-245.

[18] Chocarro, R., Cortiñas, M., and Marcos-Matás, G. Teachers' attitudes towards chatbots in education: a technology acceptance model approach considering the effect of social language, bot proactiveness, and users' characteristics. Educational Studies, (2021), 1-19.

[19] Jin, Y., Deng, Y., Gong, J., Wan, X., Gao, G., and Wang, Q. OYaYa: A Desktop Robot Enabling Multimodal Interaction with Emotions. In 26th International Conference on Intelligent User Interfaces,(2021), pp. 55-57.

[20] Fraoua, K. E. How to Assess Empathy During Online Classes. In International Conference on Human-Computer Interaction, Springer, Cham, (2021), 427-436.

[21] Huang, A. H., Yen, D. C., and Zhang, X. Exploring the potential effects of emoticons. Information and Management, 45(7), 466-473.

[22] K., Suzuki, I., Iijima, R., Sarcar, S., and Ochiai, Y. EmojiCam: Emoji-Assisted Video Communication System Leveraging Facial Expressions. In International Conference on Human-Computer Interaction, (2021), pp. 611-625.

[23] Derks, D., Bos, A. E. R., and von Grumbkow, J. Emoticons and social interaction on the Internet: the importance of social context. Computers in Human Behavior, (2007), 23, 842-849.

How to Cite

Isiaka, F., & Adamu, Z. (2021). Emoji Essence: Detecting User Emotional Response on Visual Centre Field with Emoticons. Journal of Computer Science Research, 3(4), 1–8. https://doi.org/10.30564/jcsr.v3i4.3577

Article Type

Article