Crowd Light: Evaluating the Perceived Fidelity of Illuminated Dynamic Scenes Adrian Jarabo 1 Tom Van Eyck 2 Veronica Sundstedt 3 Kavita Bala 4 Diego Gutierrez 1 Carol O’Sullivan 2 1 Universidad de Zaragoza 2 Trinity College Dublin 3 Blekinge Institute of Technology 4 Cornell University

Why Crowd Lighting? LOTR: The Two Towers (2002) © 2002 New Line Productions, Inc

Why Crowd Lighting? Assassin's Creed (2007) © 2007 Ubisoft

Why Crowd Lighting? LOTR: The Return of the King (2003) © 2003 New Line Productions, Inc

Why Crowd Lighting? Metropolis – Supercrowds for Multisensory Urban Simulations

Why Crowd Lighting? Crowds + GI = realism, but at a huge cost. Crowds + approximate GI = realism… or at least perceived fidelity.

An example…

The second video was rendered 3.64 times faster.

Related Work Perceptual rendering: – Visible Difference Predictor [Bolin and Meyer 95/98, Myszkowski et al. 01, …] – Illumination components [Stokes et al. 04; Debattista et al. 05] – Approximated Visibility [Kozlowski and Kautz 07, Yu et al. 09, Ritschel et al. 08] – Visual attention [Yee et al. 01, Ferwerda and Pellacini 03, Sundstedt et al. 07, Hasic and Chalmers 09]

Related Work Perception in crowds: – Perception of general aggregates [Ramanarayanan et al. 08] – Perception of crowd variety [McDonnell et al. 08] Visual Equivalence: [Ramanarayanan et al. 07, Ramanarayanan et al. 08, Vangorp et al. 09, Krivanek et al. 10]

Our goal Evaluate and understand the perceived fidelity of illumination in scenes with complex dynamic crowds.

Our goal – Questions to answer Q1. Does the complexity of the crowd affect perceived quality of illumination? Q2. Are errors in direct or indirect lighting more salient? Q3. What effect does colour have on the perceived fidelity of illuminated crowd scenes?

Outline: Illumination – Experiments – Comparison with Video Quality Metric

Illumination – consider a point x in the scene.

Light transport in the scene at point x: direct + GI + SSS.

Illumination – SH: modeling the huge transport vector T(x, ωi, ωo) with SH coefficients [Sloan02] (illustration after [Sloan08]).
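As a minimal sketch (not the authors' implementation; names and SH order are assumptions for illustration only), a PRT-style shading step with SH coefficients reduces to a dot product between the per-point transport vector and the SH coefficients of the lighting:

    import numpy as np

    N_BANDS = 4                            # SH order: an assumption for this sketch
    N_COEFFS = N_BANDS * N_BANDS           # 16 coefficients per colour channel

    def shade_point(transport_coeffs, light_coeffs):
        # Outgoing radiance at a point: the inner product <T(x), L> in SH space,
        # in the spirit of precomputed radiance transfer [Sloan02].
        return float(np.dot(transport_coeffs, light_coeffs))

    # transport_coeffs is the precomputed per-point "huge vector" from the slide;
    # light_coeffs is the environment lighting projected onto the same SH basis.
    T_x = np.random.rand(N_COEFFS)
    L = np.random.rand(N_COEFFS)
    print(shade_point(T_x, L))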

Illumination – SH Used in Film Production [Pantaleoni11]:

Illumination – Interpolation: SH coefficients [Sloan02] are interpolated over time t.

Illumination – Interpolation: an intermediate frame i between keyframes n and n+5 is computed as Frame i = lerp(Frame n, Frame n+5).
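A minimal sketch of this keyframe scheme, assuming the full SH transport is computed only every `interval` frames (the helper name is hypothetical, not the authors' code):

    import numpy as np

    def interpolate_transport(T_keyframes, interval, frame):
        # T_keyframes[k] holds the SH transport coefficients computed at
        # frame k * interval; in-between frames are linearly interpolated.
        k = frame // interval                   # previous keyframe index
        t = (frame % interval) / interval       # blend weight in [0, 1)
        return (1.0 - t) * T_keyframes[k] + t * T_keyframes[k + 1]

    # Example: keyframes every 5 frames, frame 7 blends the keyframes at frames 5 and 10.
    keys = [np.random.rand(16) for _ in range(4)]
    T_7 = interpolate_transport(keys, 5, 7)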

Outline: Illumination – Experiments – Comparison with Video Quality Metric

Experiments Question: “Is the illumination in the scene being evaluated the same quality as in the gold standard?”

Experiments – Methods: two screens are used. One shows the test video; the other shows the reference from a different point of view and desynchronized in time, to avoid side-by-side comparison.

Experiment 1 Q1. Does the complexity of the crowd affect perceived quality? Q2. Are errors in direct or indirect lighting more salient? Q3. What effect does colour have on the perceived fidelity of illuminated crowd scenes?

Experiment 1 – Variables. Character Object (OBJ): Pawn & Human.

Variables – Character Object. Pawn: static, smooth. Human: animated, sharp gradients, self-occlusions.

Experiment 1 – Variables. Character Object (OBJ): Pawn & Human. Crowd Movement (MOV): Army & Random.

Variables – Crowd Movement: Army & Random.

Experiment 1 – Variables. Character Object (OBJ): Pawn & Human. Crowd Movement (MOV): Army & Random. Illumination Setup (ILL): Visibility only & Full GI.

Variables – Illumination Setup: Visibility only & Full GI.

Experiment 1 – Variables. Character Object (OBJ): Pawn & Human. Crowd Movement (MOV): Army & Random. Illumination Setup (ILL): Visibility only & Full GI. Interpolation Intervals (INT): [GS, 2, 3, 4].

Illumination – Interpolation: keyframes at Frame n and Frame n+5.

Experiment 1 – Variables. Character Object (OBJ): Pawn & Human. Crowd Movement (MOV): Army & Random. Illumination Setup (ILL): Visibility only & Full GI. Interpolation Intervals (INT): [GS, 2, 3, 4]. 32 combinations in total.

Experiment 1 – Results [chart; legend: Equivalent to GS vs. Visible Differences].

Experiment 1 – Discussion Local artifacts are not masked by global complexity. Interpolating direct lighting coefficients creates unacceptable artifacts in most cases.

Experiment 1 Q1. Does the complexity of the crowd affect perceived quality? Q2. Are errors in direct or indirect lighting more salient?

Experiment 2 Q1. Does the complexity of the crowd affect perceived quality? Q3. What effect does colour have on the perceived fidelity of illuminated crowd scenes?

Experiment 2 – Variables. Character Object (OBJ): Pawn & Human. Crowd Movement (MOV): Army & Random. Illumination Setup (ILL): Visibility only & Full GI.

Experiment 2 – Variables. Character Object (OBJ): Pawn & Human. Crowd Movement (MOV): Army & Random. Color (COL): Color & No Color. Interpolation Type (TYP): Motion-based & GI-based.

T(x dyn, ,  o) Variables – Interpolation Type Motion-based T(x sta, ,  o) Dynamic Static Frame n+5

Variables – Interpolation Type: Motion-based & GI-based. For GI-based, the transport T(x, ωi, ωo) is split into a direct part T_dir(x, ωi, ωo) and an indirect part T_ind(x, ωi, ωo) (illustrated at Frame n+5).
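A minimal sketch of the GI-based variant, under the assumption (consistent with the conclusions later in the talk) that only the indirect component is interpolated between keyframes while the direct component stays at full frame rate; helper names are hypothetical, not the authors' code:

    import numpy as np

    def gi_based_transport(T_dir_current, T_ind_key0, T_ind_key1, t):
        # Direct transport is evaluated every frame; indirect transport is
        # linearly interpolated between two keyframes with blend weight t.
        T_ind = (1.0 - t) * T_ind_key0 + t * T_ind_key1
        return T_dir_current + T_ind            # T = T_dir + T_ind

    # Example with 16 SH coefficients per component.
    T_dir = np.random.rand(16)
    T_ind_a, T_ind_b = np.random.rand(16), np.random.rand(16)
    T_frame = gi_based_transport(T_dir, T_ind_a, T_ind_b, 0.3)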

Experiment 2 – Variables. Character Object (OBJ): Pawn & Human. Crowd Movement (MOV): Army & Random. Color (COL): Color & No Color. Interpolation Type (TYP): Motion-based & GI-based. Interpolation Intervals (INT): [GS, 2, 5, 10, 30, 60]. 48 combinations in total.

Experiment 2 – Results [charts for GI-based, Colour, Army and GI-based, Colour, Random; legend: Equivalent to GS vs. Visible Differences].

Experiment 2 – Results Video GI-based, Army, Human, INT = 10

Experiment 2 – Results Video GI-based, Random, Human, INT = 30

Experiment 2 – Discussion: Complexity masks artifacts produced by approximating GI: – Human allows more approximation than Pawn. – Random allows more approximation than Army. We can interpolate up to 10 frames for human crowds with structured motion, and up to 30 frames for unstructured random motion.

Performance – Timings and speed-up for Human crowds:

Motion   Intp. Type     N    Time/frame   Speed-Up
Army     Motion-based   5    4'18''       1.15x
Crowd    Motion-based   5    4'18''       1.15x
Army     GI-based       10   1'36''       3.08x
Crowd    GI-based       30   1'21''       3.64x

Speed-ups are bounded by 1.19x and 4x for the Motion-based and GI-based interpolation types, respectively.
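As a sanity check of these figures (an assumed Amdahl-style cost model, not one stated in the talk), amortizing only the interpolated component's cost over the interval N reproduces the reported numbers closely:

    def estimated_speedup(f, n):
        # f: fraction of per-frame cost spent on the interpolated component,
        # n: interpolation interval. The bound as n -> infinity is 1 / (1 - f).
        return 1.0 / ((1.0 - f) + f / n)

    # With f = 0.75 (bound 4x, GI-based): n = 10 gives ~3.08x, n = 30 gives ~3.64x.
    # With f = 0.16 (bound ~1.19x, Motion-based): n = 5 gives ~1.15x.
    print(estimated_speedup(0.75, 10), estimated_speedup(0.75, 30), estimated_speedup(0.16, 5))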

Illumination Experiments Comparison with Video Quality Metric

State of the art VQM [Aydin et al. 10]

Comparison with Video Quality Metric State of the art VQM [Aydin et al. 10]

Comparison with Video Quality Metric State of the art VQM [Aydin et al. 10]

Comparison with Video Quality Metric. The state-of-the-art VQM [Aydin et al. 10] focuses on low-level vision (pixels) and is therefore too conservative. Introducing high-level vision knowledge allows more aggressive approximations.

Conclusion: We presented a framework to evaluate the perceived fidelity of approximate illumination solutions in dynamic crowds. Errors in illumination can be masked by the characteristics of the aggregate. Rendering is faster even with naïve approximations. We compared against a VQM and showed that using scene properties would improve such metrics.

Conclusion – Questions to answer. Q1. Does the complexity of the crowd affect the perceived quality of illumination? More complex crowds allow the illumination to be approximated more aggressively when approximating GI. Q2. Are errors in direct or indirect lighting more salient? Errors in direct lighting are more salient. Q3. What effect does colour have on the perceived fidelity of illuminated crowd scenes? The most acceptable approximation for human crowds is to interpolate indirect illumination in colour scenes.

Future Work: Explore other aggregate properties (e.g. numerosity, variety, LoD…). Explore new approximation algorithms for rendering. Account for scene properties in objective video quality metrics.

Thank you! Acknowledgments: Martin Prazak; Science Foundation Ireland (Metropolis); European Commission, 7th Framework Programme (GOLEM & VERVE); Spanish Ministry of Science; National Science Foundation; CAI Programa Europa; and the experiment participants.
