Paper: 2023, Vol. 41, Issue 4: 654-660
Cite this article:
YAN Liang, ZHOU Huiting, LI Zhengguang. Study on facial emotional expression induced by the annoying noise[J]. Journal of Northwestern Polytechnical University, 2023, 41(4): 654-660

Study on facial emotional expression induced by the annoying noise
YAN Liang1, ZHOU Huiting1, LI Zhengguang2
1. School of Marine Science and Technology, Northwestern Polytechnical University, Xi'an 710072, China;
2. School of Civil Engineering and Architecture, Zhejiang University of Science and Technology, Hangzhou 310023, China
Abstract:
Noise is annoying. Facial expressions reveal emotional states instantly and objectively. In this paper, facial emotional expressions are used to assess noise annoyance, with the aim of overcoming the drawbacks of traditional subjective evaluation methods. First, through listening experiments, facial videos, physiological signals and individual annoyance ratings were collected from 30 listeners of different nationalities. The listeners' key facial action units (AUs) were then extracted, identified and analyzed, and the AU combination expressing the annoyed emotion induced by noise was determined. Furthermore, a novel annoyance evaluation scale incorporating (gray) hue and lightness variation was devised and validated through an online survey. The study confirms that it is feasible to evaluate noise annoyance objectively from the characteristic changes in a listener's facial (annoyance) expression, and that an annoyance scale incorporating concrete visual elements can avoid comprehension bias and markedly improve evaluation efficiency.
Key words: noise annoyance; complex emotions; expressions; facial emotional expressions; face recognition
Received: 2022-09-21     Revised:
DOI: 10.1051/jnwpu/20234140654
Funding: Supported by the National Natural Science Foundation of China (General Program, 12074316)
Corresponding author: ZHOU Huiting (1999-), master's student at Northwestern Polytechnical University; research interest: psychoacoustics. Email: zhouhuiting@mail.nwpu.edu.cn
About the author: YAN Liang (1981-), associate professor at Northwestern Polytechnical University; research interest: noise impact assessment.
