纳金网

Title: Emotion-enabled EEG-based Interaction

Author: 彬彬    Posted: 2011-12-28 09:19
Title: Emotion-enabled EEG-based Interaction
1 Introduction

Emotions play an important role in human communication. A new "emotional dimension" could be added to human-computer interfaces to make communication more intuitive. Generally, emotions can be recognized from text, voice, facial expressions, gestures, and/or biosignals (pulse, EEG, temperature, etc.). EEG denotes the electroencephalogram signals of the human brain, which could reveal the "inner" real emotions of the user. EEG-based emotion recognition is a relatively new area, and most of the work in it focuses on off-line emotion recognition. Recently, wireless and portable devices have come to the market, making it possible to add EEG-based interaction to human-computer interfaces and to develop real-time applications. We present novel work on real-time EEG-based emotion recognition and on emotion-enabled applications for entertainment, such as emotional avatars, emotion-enabled animation, and an emotion-enabled personalized web-based player.

2 Our Approach

We proposed and implemented a real-time fractal-based algorithm for emotion recognition from EEG. Fig. 1a shows the Arousal-Valence emotion model used to define the six emotions that can be recognized by our algorithm. Fear, frustrated, sad, happy, pleasant, and satisfied emotions are recognized in real time using only 3 channels and with good accuracy [Liu et al. 2010]. Raw EEG data could be acquired from any EEG device; currently, our applications work with the Emotiv, Pet 2, and Mindset 24 devices.

The steps of the overall algorithm of the real-time applications are as follows. First, raw data are read from the EEG device, filtered with a 2-42 Hz band-pass filter, and fed into the real-time emotion recognition algorithm. Then, the recognition results are passed to the game, web site, or any other real-time software. In this work, we use the Emotiv headset [Emotiv].
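As an illustration only, the following Python sketch mirrors the pipeline described above (band-pass filtering to 2-42 Hz followed by a fractal-dimension feature). The sampling rate, the Higuchi estimator, the thresholds, and the acquisition/output calls are assumptions for the example, not the actual implementation.

# Minimal sketch of the pipeline above. Sampling rate, thresholds, and the
# acquisition/output calls are assumptions for illustration only.
import numpy as np
from scipy.signal import butter, lfilter

FS = 128  # assumed sampling rate in Hz

def bandpass_2_42(x, fs=FS):
    # Band-pass filter one channel to the 2-42 Hz range mentioned above.
    b, a = butter(4, [2 / (fs / 2), 42 / (fs / 2)], btype="band")
    return lfilter(b, a, x)

def higuchi_fd(x, kmax=8):
    # Higuchi fractal dimension, one common fractal feature for EEG.
    x = np.asarray(x, dtype=float)
    n = len(x)
    lk = np.empty(kmax)
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            curve = np.abs(np.diff(x[idx])).sum() * (n - 1) / ((len(idx) - 1) * k * k)
            lengths.append(curve)
        lk[k - 1] = np.mean(lengths)
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(lk), 1)
    return slope

def recognize_emotion(channels):
    # Map per-channel fractal features to an emotion label. The thresholds and
    # the feature-to-label rules are placeholders (4 of the 6 labels shown).
    fds = [higuchi_fd(bandpass_2_42(ch)) for ch in channels]
    arousal = "high" if np.mean(fds) > 1.5 else "low"             # placeholder
    valence = "positive" if fds[0] - fds[-1] > 0 else "negative"  # placeholder
    return {("high", "positive"): "happy", ("high", "negative"): "fear",
            ("low", "positive"): "satisfied", ("low", "negative"): "sad"}[(arousal, valence)]

# Real-time loop (acquisition and output calls are hypothetical):
# while True:
#     window = read_raw_window()       # e.g. last few seconds from 3 channels
#     emotion = recognize_emotion(window)
#     send_to_application(emotion)     # avatar, game, or web player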

An EEG-based emotional avatar implemented with the Haptek system [Haptek] is presented in Fig. 1b. The user's emotions are recognized and visualized in real time on his/her avatar; the avatar's emotions change according to the user's "inner" emotions recognized from EEG. Emotions could be induced in the user by sound stimuli delivered through earphones. An emotion-driven interactive 3D computer game was developed as well. In it, the user's emotions are visualized in real time by the body movements of a 3D Penguin avatar and by the changing light color of the Penguin's snow house. Six clips of 3D Penguin animations, corresponding to the satisfied, pleasant, happy, frustrated, sad, and fear emotions, were proposed and implemented using Autodesk Maya.
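A minimal sketch of how a recognized label could drive the Penguin scene is given below; the clip file names, light colors, and scene API are invented for this example, and only the six emotion labels come from the text.

# Hypothetical mapping from a recognized emotion to a Penguin animation clip
# and a snow-house light color.
PENGUIN_SCENE = {
    "satisfied":  ("penguin_satisfied.anim",  (0.4, 0.8, 1.0)),
    "pleasant":   ("penguin_pleasant.anim",   (0.6, 1.0, 0.6)),
    "happy":      ("penguin_happy.anim",      (1.0, 0.9, 0.2)),
    "frustrated": ("penguin_frustrated.anim", (1.0, 0.5, 0.1)),
    "sad":        ("penguin_sad.anim",        (0.3, 0.3, 0.7)),
    "fear":       ("penguin_fear.anim",       (0.8, 0.1, 0.1)),
}

def update_penguin(emotion, scene):
    # Switch the avatar's animation and the house light when the label changes.
    clip, light = PENGUIN_SCENE[emotion]
    scene.play_clip(clip)          # hypothetical scene API
    scene.set_light_color(light)   # hypothetical scene API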

Music could be used to induce emotions in the user. In Fig. 1c, the emotional Penguin avatar expresses the satisfied emotion of the user. The user "creates" the animation sequence with his/her emotions. This approach demonstrates that a movie could be created according to the user's emotions. With EEG-based emotion recognition, it is possible to personalize a movie according to the user's current emotions. For example, if a scene of a horror movie is supposed to scare the user but the current emotion recognized from EEG is still positive, the movie character could change its appearance to a scarier one in order to induce the emotion targeted by the movie.
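The adaptive-movie idea can be sketched as a simple rule that compares the emotion a scene is meant to induce with the emotion currently recognized from EEG; the function and variant names below are hypothetical.

# Hypothetical rule for emotion-adaptive playback: if a scene targets "fear"
# but the viewer's recognized emotion is still positive, switch to a more
# intense variant of the scene to induce the targeted emotion.
POSITIVE = {"happy", "pleasant", "satisfied"}

def choose_variant(target_emotion, recognized_emotion, variants):
    # variants is assumed to hold "default" and "intense" versions of the scene.
    if target_emotion == "fear" and recognized_emotion in POSITIVE:
        return variants["intense"]
    return variants["default"]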

In another emotion-enabled application, "Dancing Robot", the Robot's dance speed depends only on the arousal level of the user's emotion: the Robot dances faster when the user experiences a higher arousal level and more slowly when the arousal level is lower. A web-based emotion-enabled "Music Player", which could play music according to the user's current emotional state, was designed and implemented as well.
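A minimal sketch of the arousal-to-speed mapping for the "Dancing Robot" and an emotion-keyed playlist for the "Music Player" follows; the normalization range, rate bounds, and playlist names are assumptions for illustration.

# Hypothetical arousal-to-tempo mapping (higher arousal -> faster dance) and
# an emotion-keyed playlist table for the web player.
def dance_speed(arousal, slow=0.5, fast=2.0):
    # Map an arousal level in [0, 1] to a playback-rate multiplier.
    arousal = min(max(arousal, 0.0), 1.0)
    return slow + (fast - slow) * arousal

PLAYLISTS = {"happy": "upbeat", "pleasant": "acoustic", "satisfied": "jazz",
             "frustrated": "rock", "sad": "calm", "fear": "ambient"}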

The proposed real-time EEG-based emotion recognition would advance research on human-computer interaction, leading to the implementation of new applications such as EEG-based serious games, emotion-enabled adaptive movies, emotion-enabled personalized search on the Web, experimental art animation, and personalized avatars seamlessly communicating with virtual objects, other avatars, or even social robots.

References

EMOTIV. http://www.emotiv.com.

HAPTEK. http://www.haptek.com.

LIU, Y., SOURINA, O., AND NGUYEN, M. K. 2010. Real-time EEG-based human emotion recognition and visualization. In Proc. 2010 Int. Conf. on Cyberworlds, 262–269.
Author: 菜刀吻电线    Posted: 2012-4-13 23:25
Placing a marker here first, more to say later.

Author: 晃晃    Posted: 2012-4-28 23:20
Honestly, I hardly ever use the things the OP is talking about!

Author: 奇    Posted: 2012-5-11 23:21
Bump! Learned something! Noted!

Author: 菜刀吻电线    Posted: 2012-8-8 01:41
Impressive, that's a lot, haha.

Author: 铁锹    Posted: 2012-8-8 10:14

The 3D printing era arrives: is a revolution in manufacturing about to break out?

Heroes emerge after the contest: iOS 6 beta 4 adds more 3D city maps

DaVinci 3D debuts theater-grade 2D-to-3D conversion technology

Portable 3D printer: PopFab

CPG establishes a China subsidiary, partnering with Tianjin on 3D technology R&D

The structure of a nova explosion, visualized in virtual 3D

Author: 晃晃    Posted: 2012-8-15 01:09
You posted so much that I don't even know which part to reply to, heh.

Author: 奇    Posted: 2012-8-25 23:59
Rushing over to bump a friend's post.

Author: tc    Posted: 2013-2-18 00:03
Just looking, not commenting.

Author: 晃晃    Posted: 2013-2-22 23:20
A note to 猪猪: we absolutely can't let you see this.




