Sonification of Emotion in Social Media: Affect and Accessibility in Facebook Reactions

Cited by: 3
Authors
Cantrell S.J. [1]
Winters R.M. [1]
Kaini P. [1]
Walker B.N. [1]
Affiliations
[1] Georgia Institute of Technology, Atlanta, GA
Keywords
accessibility; affective computing; computer-mediated communication; design and evaluation; emotion; music; social media; sonification; universal design
DOI
10.1145/3512966
Abstract
Facebook Reactions are a collection of animated icons that enable users to share and express their emotions when interacting with Facebook content. The current design of Facebook Reactions relies on visual stimuli (animated graphics and text) to convey affective information, which presents usability and accessibility barriers for visually impaired Facebook users. In this paper, we investigate the use of sonification as a universally accessible modality to aid in the conveyance of affect for blind and sighted social media users. We discuss the design and evaluation of 48 sonifications, leveraging Facebook Reactions as a conceptual framework. We conducted an online sound-matching study with 75 participants (11 blind, 64 sighted) to evaluate the performance of these sonifications. We found that sonification is an effective tool for conveying emotion for blind and sighted participants, and we highlight sonification design strategies that contribute to improved efficacy. Finally, we contextualize these findings and discuss the implications of this research with respect to HCI and the accessibility of online communities and platforms. © 2022 ACM.