1
Texture Optimization for Example-based Synthesis
Vivek Kwatra (1,2), Irfan Essa (1), Aaron Bobick (1), Nipun Kwatra (1)
(1) GVU Center/College of Computing, Georgia Tech; (2) Computer Science Dept., UNC Chapel Hill
2
Motivation: Input Texture + Input Flow Field = Output Sequence
I will begin with one of our results as motivation. We take an image of a texture, like the one shown on the left, as input, and synthesize a texture sequence like the one shown on the right. The texture flows in a coherent fashion in this synthesized sequence, even as it maintains the structural elements of the texture. Note that to synthesize such a sequence, we need another input besides the texture sample: a flow field that controls the motion of the texture in the synthesized sequence. In this talk I will explain the elements of our technique that allow us to generate such sequences.
3
Example-based Texture Synthesis
Texture synthesis has been used extensively in computer graphics to generate realistic appearance for CG primitives. We wish to examine a few desirable properties that can enhance the usefulness of texture synthesis. We are interested in dynamic texturing, where the synthesized texture may change its appearance over time. This would allow us to render CG primitives that change in shape or form, as in animation or video. An important problem in texture synthesis is adding controllability to the synthesis process: it is desirable to have user-defined or procedural control knobs such as flow, shape, and illumination, as in the example from the previous slide. A related issue is having an explicit measure of texture quality, which would allow us to evaluate the synthesized texture or texture sequence and refine it if necessary.
4
Example-based Texture Synthesis
Dynamic texturing (animation/video) Greater control: external knobs, e.g., flow, shape, illumination Explicit texture quality evaluation and refinement
5
Contributions I Global Optimization for texture synthesis
Global texture quality metric Iterative multi-scale optimization The specific contributions of our paper are as follows. First, we introduce a novel optimization-based technique for texture synthesis: we define a global metric to measure the quality of a given texture and present an iterative technique for optimizing over this metric. Our second contribution is a technique for motion-controlled texture synthesis, where we want the synthesized texture to follow a given flow field. This can be thought of as a special case of our optimization framework with flow as the control knob. Potential applications of this technique include fluid flow rendering and texture-based flow visualization.
6
Contributions II: Flow-guided texture animation
Flow incorporated into optimization. Applications: fluid rendering, flow visualization. (Figure: Input Texture + Input Flow Field = Output Sequence)
7
Related Work (a selected sample)
MRF Annealing [Paget’98] Texture Optimization [Kwatra’05] Histogram Matching [Heeger’95] Local vs. Global Graphcut Textures [Kwatra’03] Non-parametric sampling [Efros’99,Wei’00] Image Quilting [Efros’01] Pixel vs. Patch
8
Related Work (a selected sample)
Better Quality MRF Annealing [Paget’98] Texture Optimization [Kwatra’05] Histogram Matching [Heeger’95] Local vs. Global Graphcut Textures [Kwatra’03] Non-parametric sampling [Efros’99,Wei’00] Image Quilting [Efros’01] Pixel vs. Patch
9
Related Work (a selected sample)
Better Quality MRF Annealing [Paget’98] Texture Optimization [Kwatra’05] Greater Controllability Histogram Matching [Heeger’95] Local vs. Global Graphcut Textures [Kwatra’03] Consequently, global methods generally have a better handle on controlling the synthesis process. However, existing global methods have difficulty achieving the same synthesis quality as state-of-the-art local patch-based techniques. Our technique attempts to bridge the gap by striking a balance between quality and control. Non-parametric sampling [Efros’99,Wei’00] Image Quilting [Efros’01] Pixel vs. Patch
10
Related Work Flow-guided Synthesis Optimization
Advected Textures [Neyret’03] Flow-based Video Editing [Bhat’04] Optimization Image-based Priors [Fitzgibbon’03] Space-time Video Completion [Wexler’04]
11
Flow-guided Texture Animation
Flow consistency: perceived motion similar to flow. Texture similarity: shape, size, and orientation of texture elements similar to the source. I will now describe the flow-guided texture animation problem and our optimization technique for solving it. I will specifically work out how optimization is used for flow-guided synthesis; later I will show how the same technique can be applied to regular static texture synthesis. Let's examine the objectives that we want to satisfy for flow-guided synthesis. At the top is the source texture that we want to use for synthesis; the bottom right shows the desired flow field that we want to use for controlling motion, and the bottom left shows the synthesized texture sequence. We want to satisfy two criteria. First, flow consistency: the perceived motion in the synthesized texture should be similar to the desired flow. Second, texture similarity: the shape, size, and orientation of the texture elements that come together to form the texture should be similar to the source. (Figure labels: Source Texture, Texture Similarity, Flow Consistency, Flowing Target, Target Flow)
12
Naive Approach Ignore Texture Similarity
Ignore texture similarity; warp each frame via the flow field (target frames X0, X1, …, Xn). We could try to solve the problem in a naïve way, by ignoring texture similarity and considering only flow consistency. We can then do the synthesis frame by frame, warping each frame according to the flow field.
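The frame-by-frame warp described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes a grayscale frame and a per-pixel displacement field `flow` (both names are mine), and uses simple nearest-neighbor backward sampling.

```python
import numpy as np

def warp_frame(frame, flow):
    """Backward warp with nearest-neighbor sampling: each output pixel
    is fetched from where the flow field says it came from.
    flow[0] / flow[1] hold per-pixel (dy, dx) displacements."""
    h, w = frame.shape
    yy, xx = np.mgrid[0:h, 0:w]
    ys = np.clip(np.rint(yy - flow[0]).astype(int), 0, h - 1)
    xs = np.clip(np.rint(xx - flow[1]).astype(int), 0, w - 1)
    return frame[ys, xs]
```

With zero flow this is the identity; with a constant rightward flow, texture content is carried along, which is exactly why structure eventually degrades without a correction step.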
13
Naive Approach: texture structure not maintained. However, that doesn't work: the structure of the texture is not maintained, and it eventually disappears.
14
Instead of… warping frame to frame (target frames X0 → X1 → … → Xn).
Our approach is to still do the synthesis frame by frame, but to add a correction step after the warping that adjusts the warped frame to obtain the synthesized frame. Let X– be the previously synthesized frame and X the new frame; W is the warped frame.
16
Our Approach: Warp + Correct
X– → Warp → W (warped frame) → Correct → X (synthesized frame)
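The warp-and-correct pipeline can be sketched abstractly. Here `warp` and `correct` are hypothetical callables standing in for the two operations on this slide; the structure of the loop is the point, not the internals.

```python
def synthesize_sequence(x0, warp, correct, n_frames):
    """Frame-by-frame synthesis: each new frame is the previous frame
    warped by the flow field, then corrected toward the source texture."""
    frames = [x0]
    for _ in range(n_frames - 1):
        w = warp(frames[-1])       # flow consistency: predict via flow
        frames.append(correct(w))  # texture similarity: pull toward source
    return frames
```

The naïve approach is the special case where `correct` is the identity.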
17
“Correct” Operation: link to original goals (X– → Warp → W → Correct → X). We link this correction operation to our original goals.
18
“Correct” Operation: link to original goals. For flow consistency, the synthesized frame X should be close to the warped frame W.
19
“Correct” Operation: link to original goals. For texture similarity, the synthesized frame should also be similar to the source texture.
20
Optimization of Energy / Cost
Energy = Flow Energy + Texture Energy. We cast this as an optimization problem in which there is a cost, or energy, associated with each goal of the correction step: the flow energy and the texture energy. We want to minimize the total energy, which is the sum of the flow and texture energies.
21
Flow Energy
Flow energy is defined very simply: it is the squared difference between the warped frame W and the synthesized frame X.
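In code, the flow energy is one line. A numpy sketch of E_flow(X) = ||X − W||², with function and variable names of my choosing:

```python
import numpy as np

def flow_energy(x, w):
    """Flow energy: squared L2 difference between the synthesized
    frame x and the warped previous frame w."""
    return float(np.sum((x - w) ** 2))
```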
22
Texture Energy Tricky! Compare texture “elements”
Pixel-by-pixel comparison of source and target is not possible. Instead, compare texture “elements”: local pixel neighborhoods. We want each target neighborhood to be similar to some source neighborhood. The texture energy is a little tricky. The problem is that the target frame and the source texture cannot be compared pixel by pixel; instead, we want to compare the texture elements that make up the texture. These elements can be thought of as residing within local pixel neighborhoods.
23
Texture Energy Z (source) X (target frame)
Let's look at it in more detail: the source Z is on the left, the synthesized target frame X on the right.
24
Energy for Single Neighborhood
Z (source), p (pixel), X (target frame). First, define the energy for a single neighborhood: pick a pixel p and define the neighborhood X_p around it.
25
Energy for Single Neighborhood
Z (source) Xp (neighborhood) X (target frame)
26
Energy for Single Neighborhood
Z (source), X_p (neighborhood), X (target frame). We then want to find a source neighborhood that has the same appearance as X_p, so we look for its nearest neighbor in the source, appearance-wise.
27
Energy for Single Neighborhood
(nearest neighbor) Zp Z (source) Xp (neighborhood) X (target frame)
28
Energy for Single Neighborhood
Z_p (nearest neighbor), X_p (neighborhood). Texture energy for a single neighborhood: the energy for X_p is simply the squared difference between it and its nearest neighborhood Z_p in the source.
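A brute-force sketch of the single-neighborhood energy, including the nearest-neighbor search over all source patches. This is purely illustrative, not the paper's accelerated search; the helper names are mine.

```python
import numpy as np

def source_patches(z, k):
    """All k-by-k patches of a grayscale source image z, as flat rows."""
    h, w = z.shape
    return np.array([z[i:i+k, j:j+k].ravel()
                     for i in range(h - k + 1)
                     for j in range(w - k + 1)])

def neighborhood_energy(x_p, patches):
    """Energy of one target neighborhood x_p: squared distance to its
    nearest neighbor among the source patches (brute force)."""
    d2 = np.sum((patches - x_p.ravel()) ** 2, axis=1)
    return float(d2.min())
```

If the target neighborhood is an exact copy of some source patch, its energy is zero, which matches the definition on the following slides.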
29
Texture Energy for Entire Image
Now, to extend this definition to other neighborhoods, we find their nearest neighbors in the same way.
30
Texture Energy for Entire Image
We sprinkle such neighborhoods over the entire texture. The total texture energy is now defined as the sum of the individual neighborhood energies. Note that this energy will be zero if each synthesized neighborhood comes from some source neighborhood.
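Summing the per-neighborhood energies over neighborhoods placed on a grid gives the total texture energy. A self-contained brute-force sketch; the grid stride and all names are assumptions made for illustration:

```python
import numpy as np

def texture_energy(x, z, k, stride):
    """Total texture energy: sum over target neighborhoods of the squared
    distance to each neighborhood's nearest source neighborhood."""
    h, w = z.shape
    patches = np.array([z[i:i+k, j:j+k].ravel()
                        for i in range(h - k + 1)
                        for j in range(w - k + 1)])
    e = 0.0
    th, tw = x.shape
    for i in range(0, th - k + 1, stride):
        for j in range(0, tw - k + 1, stride):
            d2 = np.sum((patches - x[i:i+k, j:j+k].ravel()) ** 2, axis=1)
            e += float(d2.min())
    return e
```

As the slide notes, the energy is zero exactly when every target neighborhood matches some source neighborhood.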
31
Optimization Optimize Total Energy of target frame:
Initialize X ← W (target frame ← warped frame), then iteratively improve the target frame. The Correct operation requires minimizing the total energy, which is the sum of the flow and texture energies.
32
Optimization: Intuition
Z_p (texture force), X_p, W_p (flow force). I will describe intuitively what it means to minimize this energy, going back to the neighborhood X_p. Using an analogy from physics, minimization of the texture and flow energies leads to a texture force and a flow force: X_p is pulled in two directions, toward its nearest source neighborhood (the texture force) and toward its corresponding neighborhood in the warped frame (the flow force).
33
Optimization: Intuition
What the optimization ends up doing is changing X_p to be a blend of Z_p and W_p, i.e., the source neighborhood and the warped neighborhood.
34
Overlapping Neighborhoods
One thing to note here is that the different neighborhoods we consider in the target frame may overlap. For example, if X_q overlaps with our original neighborhood X_p, then its nearest neighbor Z_q also affects X_p through the pixels in the overlapping region. So, as shown here, the optimization results in a three-way blend.
35
We are not done yet! The nearest neighbor may change. You may ask: are we done yet? The answer is no, because the nearest neighbor of X_p may change after the blending operation, so the old texture energy is no longer valid; the energy has changed.
36
Iterative Algorithm
Step 1: Find nearest source neighborhoods
Step 2: Blend source and warped neighborhoods
Repeat until convergence
This suggests an iterative algorithm with the following steps. Step 1: find the nearest source neighborhoods for the target neighborhoods. Step 2: blend the source and warped neighborhoods. Then go back to Step 1 and repeat until convergence.
37
Iterative Algorithm Step 1 Minimize E( X ) w.r.t {Zp}
Find nearest source neighborhoods: minimize E( X ) w.r.t. {Zp}, using approximate nearest neighbors via Hierarchical Clustering [Johnson’67] and Tree-Structured Vector Quantization [Wei’00].
38
Iterative Algorithm Step 2 Minimize E( X ) w.r.t X
Blend source and warped neighborhoods: minimize E( X ) w.r.t. X. Set the gradient of E with respect to X to zero and solve the resulting linear system.
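With the nearest neighbors {Z_p} held fixed, the quadratic energy's linear system has a simple per-pixel solution: each pixel is the average of the warped frame and every overlapping source neighborhood. A sketch under that simplification (unit weights per term; data layout and names are assumptions):

```python
import numpy as np

def blend_step(w, matches, k):
    """Minimize the quadratic energy w.r.t. X given fixed nearest
    neighbors: average the warped frame with all source neighborhoods
    overlapping each pixel.  `matches` maps a patch position (i, j)
    to its nearest source neighborhood (a k-by-k array)."""
    num = w.astype(float).copy()   # warped frame contributes once per pixel
    den = np.ones_like(num)
    for (i, j), z_p in matches.items():
        num[i:i+k, j:j+k] += z_p   # each overlapping Z_p pulls the pixel
        den[i:i+k, j:j+k] += 1.0
    return num / den
```

This makes the "three-way blend" of overlapping neighborhoods from the earlier slide concrete: a pixel covered by two patches averages W plus two Z values.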
39
Robust Optimization. The quadratic texture energy E_t is sensitive to outlier neighborhoods; a non-quadratic E_t is more robust: E_t(X) = Σ_p ||X_p − Z_p||^r, with 0 < r < 2. A drawback of the texture energy as defined is that it is quadratic, which makes it sensitive to outliers: neighborhoods where source and target are far apart could pull the synthesized frame toward a bad local minimum. It is better to use robust functions such as the absolute value or square root. We approximately solve for such a function within our iterative framework by converting the robust exponent term into a weighted quadratic term, where the weight is based on the target frame X synthesized by the previous iteration. This again allows us to solve a linear system at each iteration.
40
Robust Optimization. Quadratic texture energy E_t is sensitive to outlier neighborhoods; non-quadratic E_t is more robust: E_t(X) = Σ_p ||X_p − Z_p||^r, 0 < r < 2. Solve a weighted quadratic at each iteration: E_t(X) = Σ_p ||X_p − Z_p||^(r−2) · ||X_p − Z_p||² = Σ_p ω_p ||X_p − Z_p||², with ω_p computed from the previous iteration.
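The robust exponent amounts to iteratively reweighted least squares: weights ω_p = ||X_p − Z_p||^(r−2) come from the previous iterate, so each iteration still solves a weighted quadratic. A minimal sketch of the weight computation; the value r = 0.8 and all names here are illustrative assumptions, not taken from the slide:

```python
import numpy as np

def irls_weights(x, matches, k, r=0.8, eps=1e-8):
    """Per-neighborhood IRLS weights for the robust texture energy:
    w_p = ||X_p - Z_p||^(r-2), computed from the previous iterate x.
    Since r - 2 < 0, far-off (outlier) neighborhoods get small weights."""
    weights = {}
    for (i, j), z_p in matches.items():
        d = np.linalg.norm(x[i:i+k, j:j+k] - z_p)
        weights[(i, j)] = max(d, eps) ** (r - 2.0)  # eps guards d = 0
    return weights
```

Down-weighting the outliers is precisely what keeps a few badly matched neighborhoods from dragging the whole frame toward a poor local minimum.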
41
Minimize Texture Energy Alone
Novel static texture synthesis algorithm. Now consider what would happen if we dropped the flow energy from our formulation and considered only texture similarity. As you might have guessed, this leads to a novel algorithm for texture synthesis.
42
Texture Energy Graph: random initialization; multiple resolution levels and neighborhood sizes; progressively refined output. Here is a graph that shows a texture being evolved. It was initialized randomly, and the minimization was run at multiple resolution levels with multiple neighborhood sizes; the red star indicates a change in level. You see a jump in the energy because the energies at different levels mean different things; here I have normalized them to cost per pixel so that you can see all of them on the same graph. Local minima in the energy surface at one resolution/scale level may not correspond to local minima at the next level. Note that the texture is progressively refined during the course of the algorithm, which may be exploited for level-of-detail applications such as video games.
44
Results: Texture Synthesis
46
Results: Texture Synthesis
Input Wei-Levoy [Wei’00] Image Quilting [Efros’01] Graph-cuts [Kwatra’03] Texture Optimization [Kwatra’05]
47
Results: Texture Synthesis
Image Quilting [Efros’01] Graph-cuts [Kwatra’03] Texture Optimization [Kwatra’05]
48
Results: Flow-guided Synthesis
Note that while the texture is following the flow field, it is also maintaining the structure of the source texture, as desired. This result shows that we can handle highly complex flow fields. Here are some more textures shown side by side: the orientation of the keys is maintained even as the texture rotates, and the nuts look like nuts even as they flow. We can also handle time-varying flow fields with changing direction, handled within the same framework. Here we wrote the letters FLOW using manually designed flow fields. Note how the elements of the texture stretch but then split after a certain point; of course, our technique doesn't really know anything about elements, since it purely uses appearance information.
49
Results: Timing
Static texture synthesis: 5-10 iterations per resolution level
Flowing texture synthesis: 20-60 seconds per frame

Resolution   One iteration   Total time
64x64        2.5 sec         20 sec
128x128      10 sec          2 min
256x256      25 sec          8.5 min

Timing results: for static synthesis, where we use multiple resolution levels, we need about 5-10 iterations per level. The table shows computation time per iteration at a given resolution, as well as the total computation time for that resolution. For flow-guided synthesis, we averaged between 20 and 60 seconds per frame, again depending on the output resolution.
50
Summary: global optimization for controllable texture synthesis
Explicit evaluation of texture quality; iterative progressive refinement of texture; flow-guided texture animation within the optimization framework. (Figure: Input Texture + Input Flow Field = Output Sequence)
51
Limitations and Future Work
Optimization finds local minima: blurry or misaligned texture elements; texture scale should be compatible with the flow field. Extensions: controllable video textures; other control criteria such as shape and illumination.
52
Acknowledgements Eugene Zhang for flow design code
Greg Turk, Gabriel Brostow, Delphine Nain, Ravi Ruddarraju, Gaurav Chanda, Stephanie Brubaker, Mitch Parry, Pei Yen; Computational Perception Lab and Geometry group, Georgia Tech; CMU Graphics Lab. Web page: