
Title: Image-Based Bidirectional Scene Reprojection

Author: 晃晃    Time: 2011-12-28 09:20
Title: Image-Based Bidirectional Scene Reprojection
Image-Based Bidirectional Scene Reprojection

Lei Yang¹  Yu-Chiu Tse¹  Pedro V. Sander¹  Jason Lawrence²  Diego Nehab³,⁴  Hugues Hoppe³  Clara L. Wilkins⁵

¹Hong Kong UST  ²University of Virginia  ³Microsoft Research  ⁴IMPA  ⁵Wesleyan University



Abstract

We introduce a method for increasing the framerate of real-time rendering applications. Whereas many existing temporal upsampling strategies only reuse information from previous frames, our bidirectional technique reconstructs intermediate frames from a pair of consecutive rendered frames. This significantly improves the accuracy and efficiency of data reuse since very few pixels are simultaneously occluded in both frames. We present two versions of this basic algorithm. The first is appropriate for fill-bound scenes as it limits the number of expensive shading calculations, but involves rasterization of scene geometry at each intermediate frame. The second version, our more significant contribution, reduces both shading and geometry computations by performing reprojection using only image-based buffers. It warps and combines the adjacent rendered frames using an efficient iterative search on their stored scene depth and flow. Bidirectional reprojection introduces a small amount of lag. We perform a user study to investigate this lag, and find that its effect is minor. We demonstrate substantial performance improvements (3–4×) for a variety of applications, including vertex-bound and fill-bound scenes, multi-pass effects, and motion blur.

Keywords: real-time rendering, temporal upsampling



1 Introduction

Reprojection is a general approach for improving real-time rendering by reusing expensive pixel shading from nearby frames [Scherzer et al. 2011]. It has proven beneficial in popular games. For instance, reverse reprojection [e.g. Nehab et al. 2007; Scherzer et al. 2007] is used in Gears of War II to accelerate low-frequency lighting effects, and in Crysis 2 to antialias distant geometry.

Such data reuse techniques can be broadly categorized based on three separate algorithmic choices:

• Temporal direction: Whether shading data is propagated forward, backward, or in both directions in animation time;
• Data access: Whether pixels are “pushed” (scattered) onto the current frame, or “pulled” (gathered) from other frames;
• Correspondence domain: Whether the motion data (e.g., velocity vectors) used to reproject the samples is defined over the source image domain or over the rendered target.

As reviewed in Section 2, different techniques follow different strategies for each of the choices above (see also Table 1).

In this paper, we present reprojection techniques for temporally upsampling rendered content by inserting interpolated frames between pairs of rendered frames. Borrowing terminology from video compression, we refer to rendered frames as intra- or simply I-frames, and to interpolated frames as bidirectionally predicted- or B-frames. Our approach offers two major contributions: (1) bidirectional reprojection, which combines samples both forward and backward in time, and (2) image-based reprojection, which establishes reprojection correspondences based on velocity fields stored in the I-frames.
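As a rough illustration of the “iterative search on stored velocity” idea, here is a minimal CPU sketch under our own simplifying assumptions (the names Vec2, sampleVelocity, and findSource, the three-iteration count, and the toy constant velocity field are all illustrative; this is not the paper's algorithm, and it omits the depth/occlusion tests the paper performs). Given a B-frame pixel p at interpolation parameter t, a fixed-point iteration locates the I-frame pixel x whose stored motion carries it onto p, i.e. x + t·v(x) ≈ p:

```cpp
// Minimal sketch: find the I-frame source pixel for a B-frame pixel by
// fixed-point iteration on the velocity field stored with that I-frame.
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

// Bilinearly sample a per-pixel velocity buffer at a continuous position.
Vec2 sampleVelocity(const std::vector<Vec2>& vel, int w, int h, float x, float y)
{
    auto clampi = [](int v, int lo, int hi) { return v < lo ? lo : (v > hi ? hi : v); };
    int x0 = clampi((int)x, 0, w - 1), y0 = clampi((int)y, 0, h - 1);
    int x1 = clampi(x0 + 1, 0, w - 1), y1 = clampi(y0 + 1, 0, h - 1);
    float fx = x - x0, fy = y - y0;
    Vec2 a = vel[y0 * w + x0], b = vel[y0 * w + x1];
    Vec2 c = vel[y1 * w + x0], d = vel[y1 * w + x1];
    return { (a.x * (1 - fx) + b.x * fx) * (1 - fy) + (c.x * (1 - fx) + d.x * fx) * fy,
             (a.y * (1 - fx) + b.y * fx) * (1 - fy) + (c.y * (1 - fx) + d.y * fx) * fy };
}

// Solve x + t * v(x) = p for x: start at the B-frame pixel p and repeatedly
// re-evaluate the velocity at the current estimate of the source position.
Vec2 findSource(const std::vector<Vec2>& vel, int w, int h, Vec2 p, float t, int iters = 3)
{
    Vec2 x = p;
    for (int i = 0; i < iters; ++i) {
        Vec2 v = sampleVelocity(vel, w, h, x.x, x.y);
        x = { p.x - t * v.x, p.y - t * v.y };
    }
    return x;
}

int main()
{
    // Toy velocity field: everything moves 4 pixels to the right per I-frame interval.
    const int w = 64, h = 64;
    std::vector<Vec2> vel(w * h, Vec2{4.0f, 0.0f});
    Vec2 src = findSource(vel, w, h, /*B-frame pixel*/ {32.0f, 32.0f}, /*t=*/0.5f);
    std::printf("source pixel in I-frame: (%.2f, %.2f)\n", src.x, src.y); // expect (30.00, 32.00)
    return 0;
}
```

In a GPU implementation this search would run per pixel in a full-screen pass over the stored I-frame buffers; the fixed-point form converges quickly because scene motion is usually smooth over a few pixels.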

Temporal direction   A fundamental limitation of existing reverse reprojection techniques [e.g. Nehab et al. 2007] is that they incur a drop in performance whenever there are disoccluded regions in the scene—elements visible in the current frame that were not visible in the preceding frame. This is because such regions must be reshaded from scratch. Since the number of disoccluded pixels varies over time, framerates may fluctuate undesirably. In addition, the entire scene geometry must be processed in order to reshade, incurring significant overhead in complex scenes.

Our bidirectional reprojection temporally upsamples rendered content by reusing data from both the backward and forward temporal directions. This provides two clear benefits:

• Smooth shading interpolation: The vast majority of pixels in a B-frame are also visible in both I-frames. This lets us fetch shading information from both directions and create an interpolated signal that greatly attenuates the popping artifacts associated with one-sided reconstruction (i.e., sample-and-hold extrapolation). This is particularly important for fast-changing shading signals (e.g., dynamic shadows and glossy lighting); a minimal blending sketch follows this list.
• Higher, more stable framerate: Disoccluded regions are extremely rare since they must be occluded in both I-frames. Thus, with bidirectional reprojection we can avoid reshading and achieve higher and steadier framerates.
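As referenced in the first bullet above, the benefit of having both I-frames is that a B-frame pixel can blend the two reprojected samples rather than extrapolate one of them. Below is a minimal sketch of such a blend under our own assumptions (the function name reconstructBFramePixel, the boolean visibility flags, and the plain linear weights are illustrative; the paper's actual reconstruction rule and occlusion handling are more involved):

```cpp
// Minimal sketch: blend the two reprojected I-frame samples for one B-frame pixel.
#include <cstdio>

struct Color { float r, g, b; };

Color lerp(const Color& a, const Color& b, float t)
{
    return { a.r + (b.r - a.r) * t, a.g + (b.g - a.g) * t, a.b + (b.b - a.b) * t };
}

// c0/c1: colors fetched from the previous/next I-frame at the reprojected positions;
// visible0/visible1 stand in for whatever occlusion test is used (e.g., depth comparison);
// t in (0,1) is the B-frame's position in time between the two I-frames.
Color reconstructBFramePixel(Color c0, bool visible0, Color c1, bool visible1, float t)
{
    if (visible0 && visible1) return lerp(c0, c1, t); // both sides available: interpolate smoothly
    if (visible0)             return c0;              // only the earlier I-frame sees this surface
    if (visible1)             return c1;              // only the later I-frame sees this surface
    return {0.0f, 0.0f, 0.0f};                        // rare: occluded in both, needs a fallback
}

int main()
{
    Color prev = {1.0f, 0.0f, 0.0f}, next = {0.0f, 0.0f, 1.0f};
    Color mid = reconstructBFramePixel(prev, true, next, true, 0.25f);
    std::printf("blended: (%.2f, %.2f, %.2f)\n", mid.r, mid.g, mid.b); // (0.75, 0.00, 0.25)
    return 0;
}
```

The point of the two-sided weight is exactly the smooth-interpolation benefit described above: as t sweeps from 0 to 1 the B-frame shading moves continuously from one I-frame to the next instead of holding and then popping.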

One downside of bidirectional reprojection is that it introduces a lag in the resulting image sequence. This lag is not present in forward-only reprojection schemes. We present a careful analysis of this lag, showing that it is small (less than one I-frame). Moreover, results of a user study we conducted allow us to conclude that it is beneficial to use bidirectional reprojection in a real-time gaming scenario.
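To give a feel for why the lag is bounded by one I-frame interval, here is our own back-of-envelope timeline (not the paper's latency analysis; the 20 Hz I-frame rate, 40 Hz output, and the assumption that a frame can be displayed only once the I-frame at or after its animation time exists are all illustrative):

```cpp
// Toy timeline: with I-frames every T ms and B-frames that must wait for the
// later I-frame, the gap between content time and earliest display time
// never exceeds one I-frame period T.
#include <cstdio>
#include <cmath>

int main()
{
    const float T = 50.0f;                                            // assumed I-frame period (20 Hz)
    const float frameTimes[] = {0.0f, 25.0f, 50.0f, 75.0f, 100.0f};   // 40 Hz output; B-frames at 25/75 ms

    for (float t : frameTimes) {
        float readyAt = std::ceil(t / T) * T; // render-completion time of the enclosing I-frame
        std::printf("content t=%6.1f ms  earliest display=%6.1f ms  lag=%5.1f ms\n",
                    t, readyAt, readyAt - t);
    }
    return 0;
}
```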









For the full text, please download the attachment:

Author: 菜刀吻电线    Time: 2012-1-22 23:23
So many year-end feelings, more than a few words can cover! What worries me most is still you: have you packed your luggage? Bought your train ticket? Don't be touched, I'm just asking; you still have to take care of your own business yourself! Haha.

Author: 菜刀吻电线    Time: 2012-3-21 23:29
Passing through and leaving my name!

Author: 奇    Time: 2012-6-30 23:24
A reminder to Piggy: I really mustn't let you see this.

Author: tc    Time: 2012-9-2 00:23
Since I'm here, I might as well take a look!

Author: 奇    Time: 2012-9-19 23:25
Interesting! Learned something!

Author: C.R.CAN    Time: 2012-9-20 23:25
"Passing by again..." I'll coin my own: especially passing by.

Author: C.R.CAN    Time: 2012-9-27 23:23
Actually, I rarely use the things the OP is talking about!

Author: 晃晃    Time: 2012-10-27 23:25
Water... the source of life... just bumping the thread...

Author: tc    Time: 2012-12-15 23:23
Good posts, keep them coming; thanks for sharing.




