纳金网

Title: Interactive Generation of Dancing Animation with Music Synchronization

Author: 彬彬    Time: 2011-12-30 18:53
1 Introduction

With the explosive growth of user-generated content (UGC) on sites such as YouTube®, computer animation synchronized with music is in great demand as a new type of UGC. In Japan, for example, it is popular to create such animation manually with free software called MikuMikuDance. However, creating even a short piece of animation usually takes much time and requires specialized knowledge, which is a serious obstacle for UGC. On the other hand, several automatic systems have been reported, such as [Xu et al. 2011], where dancing animation is synchronized with the input music using rhythm and intensity features. However, automatically generated animation cannot reflect a user's intention to see a specific dance motion (called a performance motion) at a specific part of the music; thus the user's unique aesthetic sense, which is essential for UGC, is not satisfied.

In this paper, we present a novel approach that generates computer animation by re-using motion data in an intuitive and simple way, with interactive, user-in-the-loop techniques. Unlike manual tools or fully automatic systems, our system allows users to create their own animation interactively, as choreographers do. At specific time instants (e.g., the chorus), the user composes the desired performance motions according to his/her own taste. Our system then handles the unspecified parts, searching the database for the best motions with respect to music synchronization, the user's composition, and motion smoothness.

2 Our Approach

Our system consists of an off-line procedure and an on-line procedure. In the off-line procedure, a graph structure called the meta motion graph is constructed. In the on-line procedure, given a set of performance motions, the user arranges them on the timeline as he/she likes, selecting a performance motion at the proper time instants while listening to the music and leaving the unspecified parts of the timeline to be handled by the system. We call this procedure performance composition. According to the performance composition and the beat information in the music, a motion is concatenated from the meta motion graph by dynamic programming, which we call motion synthesis.
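As a concrete illustration, the dynamic-programming step of motion synthesis can be sketched as follows. This is a minimal sketch under our own assumptions, not the authors' implementation: the graph encoding (`successors`, one node per beat frame) and the per-beat cost callback `step_cost` (which would encode the performance-composition and smoothness costs described below) are hypothetical names introduced here.

```python
def synthesize_motion(num_beats, num_nodes, successors, step_cost):
    """Dynamic programming over a meta motion graph: pick one graph node
    per music beat so that the accumulated transition cost is minimal.

    successors[u]      : list of nodes reachable from node u in one beat
    step_cost(t, u, v) : cost of moving from node u to node v at beat t
                         (performance composition + smoothness penalty)
    """
    INF = float("inf")
    cost = [[INF] * num_nodes for _ in range(num_beats)]
    back = [[-1] * num_nodes for _ in range(num_beats)]
    for u in range(num_nodes):
        cost[0][u] = 0.0  # any node may start the dance
    for t in range(1, num_beats):
        for u in range(num_nodes):
            if cost[t - 1][u] == INF:
                continue
            for v in successors[u]:
                c = cost[t - 1][u] + step_cost(t, u, v)
                if c < cost[t][v]:
                    cost[t][v] = c
                    back[t][v] = u
    # backtrack from the cheapest final node to recover the beat-aligned path
    v = min(range(num_nodes), key=lambda u: cost[num_beats - 1][u])
    path = [v]
    for t in range(num_beats - 1, 0, -1):
        v = back[t][v]
        path.append(v)
    return path[::-1]
```

Because each path step consumes exactly one beat interval, the returned node sequence is synchronized to the music beats by construction.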

Meta motion graph with performance motions: The original motion graph technique [Kovar et al. 2002] focuses only on the kinematics of motion. However, kinematics alone is not enough for our task, because it lacks necessary features such as rhythm information and performance motion information. The basic idea of the proposed meta motion graph is to concatenate motion clips in units of one beat interval. Namely, we segment the motions into short clips at beat frames and organize these clips into a graph structure by connecting beat frames with similar poses, as shown in Fig. 1. In such a graph, any path carries explicit beat information, obtained simply by counting its nodes, so the motion can easily be synchronized with the music. Another characteristic of our graph is that it embeds the information of the performance motions in the database.

(Authors' e-mail: {ji-xu, ko-takagi, sakazawa}@kddilabs.jp)

In detail, a meta motion graph consists of a set of nodes, edges, and edge weights, as shown in Fig. 1. The node set includes all of the beat frames in the motion database. Two successive beat frames are connected by a uni-directional edge, to which an edge label is attached to embed the performance motion information. Beat frames with similar poses are connected by a bi-directional edge, whose weight is assigned according to the pose similarity.
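The construction above can be sketched in code. This is our own illustrative sketch, not the paper's implementation: the pose representation (one feature vector per frame), the Euclidean pose distance, and the `sim_threshold` parameter are all assumptions introduced here.

```python
import numpy as np

def build_meta_motion_graph(motions, beat_frames, perf_labels, sim_threshold=0.5):
    """Build a meta motion graph: nodes are beat frames, uni-directional
    edges link successive beat frames within one motion (labeled with the
    performance motion), and bi-directional edges link beat frames whose
    poses are similar, weighted by pose distance.

    motions[i]     : (num_frames, pose_dim) array of pose vectors, motion i
    beat_frames[i] : sorted frame indices of the beats in motion i
    perf_labels[i] : performance-motion label of motion i (or None)
    """
    nodes = []      # (motion_id, frame_index) per node
    uni_edges = []  # (u, v, label): successive beats within one motion
    bi_edges = []   # (u, v, weight): similar-pose transitions

    for m, beats in enumerate(beat_frames):
        start = len(nodes)
        nodes.extend((m, f) for f in beats)
        for k in range(len(beats) - 1):
            uni_edges.append((start + k, start + k + 1, perf_labels[m]))

    # connect beat frames with similar poses by a bi-directional edge;
    # the edge weight is the pose distance (lower = more similar)
    for u, (mu, fu) in enumerate(nodes):
        for v, (mv, fv) in enumerate(nodes):
            if u < v:
                d = np.linalg.norm(motions[mu][fu] - motions[mv][fv])
                if d < sim_threshold:
                    bi_edges.append((u, v, d))
    return nodes, uni_edges, bi_edges
```

Counting the nodes along any path through this graph directly yields the beat count of the synthesized motion, which is what makes beat-level music synchronization trivial.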

Motion synthesis: Here it is essential to define a proper cost function. Three requirements should be met by the generated motion. First, the beat instants in the generated motion should be synchronized with those in the music; the cost of beat synchronization can be driven to zero by the method of [Xu et al. 2011]. Second, the performance composition should be satisfied as closely as possible. We define the distance between a performance motion desired by the user and the one generated by the system as 0 if they are the same performance motion and 1 otherwise. Third, the generated animation should be as smooth as possible, where the main source of motion artifacts is the motion inconsistency at bi-directional edges. Therefore, a penalty should be paid for selecting a bi-directional edge, resulting in the cost function as:
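The equation itself did not survive the forum repost. A plausible form consistent with the three requirements above (beat-synchronization cost already zero by construction, a 0/1 performance distance, and a penalty on bi-directional edges) is sketched below; the symbols and the weight $\lambda$ are our assumptions, not the paper's notation.

```latex
C(\text{path}) = \sum_{t=1}^{T} \big( d_{\mathrm{perf}}(t) + \lambda\, w_{\mathrm{bi}}(t) \big)
```

where $T$ is the number of beats in the music, $d_{\mathrm{perf}}(t) \in \{0,1\}$ is the performance distance at beat $t$, and $w_{\mathrm{bi}}(t)$ is the pose-similarity weight of the bi-directional edge taken at beat $t$ (0 when a uni-directional edge is taken). Minimizing such a per-beat additive cost over paths in the meta motion graph is exactly what the dynamic programming of the motion synthesis step solves.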