1 Introduction
Emotions play an important role in human communications. A new
“emotional dimension” could be added to human-computer interfaces
to make communication more intuitive. Generally, emotions
could be recognized from text, voice, facial expression, gesture
and/or from biosignals (pulse, EEG, temperature, etc.). An EEG records the
electroencephalogram signals of the human brain and can reveal the user's
"inner" emotions. EEG-based emotion recognition is a relatively new area,
and most work in it focuses on off-line emotion recognition. Recently,
wireless and portable EEG devices have come to the market, making it
possible to add EEG-based interaction to human-computer interfaces and to
develop real-time applications. We present novel work on real-time EEG-based
emotion recognition and emotion-enabled applications for entertainment
such as emotional avatars, emotion-enabled animation,
and emotion-enabled personalized web-based player.
2 Our Approach
We proposed and implemented a real-time fractal-based algorithm for emotion
recognition from EEG. Fig. 1a shows the Arousal-Valence emotion model used
to define the six emotions that can be recognized by our algorithm. The
fear, frustrated, sad, happy, pleasant, and satisfied emotions are
recognized in real time using only 3 channels and with good accuracy
[Liu et al. 2010]. Raw EEG data can be acquired from any EEG device;
currently, our applications work with the Emotiv, Pet 2, and Mindset 24
devices.
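The paper does not detail the fractal feature itself; one common choice for a fractal-based EEG feature in related work is Higuchi's fractal dimension, sketched below as an illustration. The `kmax` value and the use of NumPy are assumptions, not specifics from the paper.

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Estimate the fractal dimension of a 1-D signal with Higuchi's method.

    A smooth curve has FD near 1; white noise has FD near 2. An emotion
    classifier could threshold such values per channel (kmax is a tuning
    parameter, chosen here arbitrarily).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_inv_k, log_lk = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)          # subsampled curve x[m], x[m+k], ...
            if len(idx) < 2:
                continue
            length = np.abs(np.diff(x[idx])).sum()
            # Higuchi's normalization for the subsampled curve length
            length *= (n - 1) / ((len(idx) - 1) * k * k)
            lengths.append(length)
        log_lk.append(np.log(np.mean(lengths)))
        log_inv_k.append(np.log(1.0 / k))
    # FD is the slope of log L(k) versus log(1/k)
    slope, _ = np.polyfit(log_inv_k, log_lk, 1)
    return slope
```

As a sanity check, the estimate is close to 2 for white noise and close to 1 for a slow sinusoid.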
The overall algorithm of the real-time applications is as follows. First,
raw data are read from the EEG device, filtered with a 2–42 Hz band-pass
filter, and fed into the real-time emotion recognition algorithm. Then, the
results of the recognition are passed to the game, web site, or any other
real-time software. In this work, we use the Emotiv headset [Emotiv]. An
EEG-based emotional avatar implemented with the Haptek system [Haptek] is
presented in Fig. 1b.
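The filtering front end of the pipeline above can be sketched as follows; the 128 Hz sampling rate and the SciPy Butterworth design are assumptions, since the paper does not give implementation details.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 128  # assumed sampling rate (Hz); not specified in the paper

def bandpass_2_42(raw, fs=FS, low=2.0, high=42.0, order=4):
    """Band-pass one EEG channel to the 2-42 Hz range used in the pipeline."""
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, raw)  # zero-phase filtering, no time shift

# Demo: a 10 Hz "alpha-like" component survives, a 0.5 Hz drift is removed.
t = np.arange(0, 4, 1 / FS)
raw = np.sin(2 * np.pi * 10 * t) + 2.0 * np.sin(2 * np.pi * 0.5 * t)
clean = bandpass_2_42(raw)
```

The filtered signal would then be passed, per channel, to the recognition algorithm.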
The user's emotions are recognized and visualized in real time on his/her
avatar. The avatar's emotions change according to the user's "inner"
emotions recognized from EEG. Emotions can be induced in the user by sound
stimuli delivered through earphones. An emotion-driven interactive 3D
computer game was developed as well. The user's emotions are visualized in
real time by the body movements of a 3D Penguin avatar and by the changing
color of the light in the 3D Penguin's snow house. Six clips of 3D Penguin
animations corresponding to the satisfied, pleasant, happy, frustrated,
sad, and fear emotions were proposed and implemented using Autodesk Maya.
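The choice among the six animation clips could be driven by the recognized valence and arousal values. The mapping below is a hypothetical illustration: the thresholds, label ordering, and clip file names are assumptions, not from the paper.

```python
# Hypothetical quadrant mapping: valence sign picks the positive/negative
# group, and a coarse arousal level picks one of three labels in it.
def label_emotion(valence, arousal):
    """valence in [-1, 1], arousal in [0, 1] -> one of six emotion labels."""
    if valence < 0:
        negatives = ["sad", "frustrated", "fear"]      # arousal: low -> high
        return negatives[min(int(arousal * 3), 2)]
    positives = ["satisfied", "pleasant", "happy"]     # arousal: low -> high
    return positives[min(int(arousal * 3), 2)]

# Hypothetical clip lookup for the six Penguin animations.
CLIP_FOR = {e: f"penguin_{e}.anim" for e in
            ["fear", "frustrated", "sad", "happy", "pleasant", "satisfied"]}
```

A game loop would call `label_emotion` on each recognition result and play `CLIP_FOR[label]`.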
Music can be used to induce emotions in the user. In Fig. 1c, the emotional
Penguin avatar expresses the satisfied emotion of the user. The user
"creates" the animation sequence with his/her emotions.
This approach demonstrates that a movie could be created according to the
user's emotions. With EEG-based emotion recognition, it is possible to
personalize a movie according to the user's current emotions. For example,
if a scene of a horror movie is supposed to scare the user but the current
emotion recognized from EEG is still positive, the movie character could
change its appearance to a scarier one to induce the emotion targeted by
the movie. In another emotion-enabled application, "Dancing Robot", the
Robot's dance speed depends only on the arousal level of the user's emotion.
The Robot dances faster if the user experiences a higher arousal level and
more slowly if the user experiences a lower arousal level. A web-based
emotion-enabled "Music Player" that can play music according to the user's
current emotional state was designed and implemented as well.
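For the "Dancing Robot", only the arousal level matters. A linear mapping from a normalized arousal value to a dance-speed factor is one simple possibility; the value range, base speed, and gain below are assumptions for illustration.

```python
def dance_speed(arousal, base_speed=1.0, gain=1.0):
    """Map normalized arousal in [0, 1] to a playback-speed multiplier.

    arousal = 0.5 keeps the base speed; higher arousal speeds the dance up,
    lower arousal slows it down. Inputs outside [0, 1] are clamped.
    """
    arousal = max(0.0, min(1.0, arousal))
    return base_speed * (1.0 + gain * (arousal - 0.5))
```

The animation loop would rescale the dance clip's playback rate by this factor on every recognition update.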
The proposed real-time EEG-based emotion recognition would advance research
on human-computer interaction, leading to the implementation of new
applications such as EEG-based serious games, emotion-enabled adaptive
movies, emotion-enabled personalized search on the Web, experimental art
animation, and personalized avatars seamlessly communicating with virtual
objects, other avatars, or even social robots.
References
EMOTIV. http://www.emotiv.com.
HAPTEK. http://www.haptek.com.
LIU, Y., SOURINA, O., AND NGUYEN, M. K. 2010. Real-time EEG-based human
emotion recognition and visualization. In Proc. 2010 Int. Conf. on
Cyberworlds, 262–269.