An evaluation of different setups for simulating lighting characteristics

The advance of technology continuously enables new luminaire designs and concepts. Traditionally, such designs have been evaluated using physical prototypes in a real environment. The iterations needed to build, verify, and improve luminaire designs incur substantial costs and slow down the design process. A more attractive alternative is to evaluate designs using simulations, which can be made cheaper and faster and applied to a wider variety of prototypes. However, the value of such simulations is determined by how closely they predict the outcome of actual perception experiments. In this paper, we discuss a perception experiment covering several lighting settings in a normal office environment. The same office environment has also been modeled using different software tools, and photo-realistic renderings have been created from these models. These renderings were subsequently processed with various tone-mapping operators in preparation for display. The complete imaging chain can be considered a simulation setup, and we have executed several perception experiments on different setups. Our real interest is in finding which imaging chain gives the best result, in other words, which of them yields the closest match between the virtual and the real experiment. Answering this question first requires a way to quantify how well a simulation setup matches the real world. As there is no unique, widely accepted measure of the performance of a given setup, we consider a number of options and discuss the reasoning behind them along with their advantages and disadvantages.
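As an illustration only (the paper itself leaves the choice of measure open), one candidate way to compare a simulation setup against the real experiment is to compare mean attribute ratings collected in both conditions, using RMSE for absolute agreement and Pearson correlation for agreement in trend. The sketch below assumes hypothetical ratings and setup names; it is not the authors' method.

```python
import numpy as np

# Hypothetical mean ratings (e.g., on a 7-point scale) for a set of lighting
# attributes such as brightness, uniformity, and warmth, collected once in the
# real office and once per simulation setup. All values are illustrative.
real = np.array([5.8, 4.2, 6.1, 3.5, 5.0])

setups = {
    "setup_A": np.array([5.5, 4.0, 5.9, 3.8, 4.7]),
    "setup_B": np.array([6.3, 3.1, 6.5, 2.9, 5.6]),
}

def rmse(a, b):
    """Root-mean-square difference between two rating vectors (absolute agreement)."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

def pearson(a, b):
    """Pearson correlation: does the setup preserve the relative ordering of
    attributes, even if the absolute ratings are shifted?"""
    return float(np.corrcoef(a, b)[0, 1])

for name, virtual in setups.items():
    print(f"{name}: RMSE = {rmse(real, virtual):.2f}, r = {pearson(real, virtual):.2f}")
```

A low RMSE rewards setups whose ratings match in magnitude, while a high correlation rewards setups that reproduce the same pattern across attributes; the two criteria can rank setups differently, which is exactly why the choice of measure matters.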
