This paper demonstrates an image-based lighting analysis procedure and tool called the Virtual Lighting Laboratory. The Virtual Lighting Laboratory is a computer environment in which the user is provided with matrices of illuminance and luminance values extracted from high dynamic range digital images. The discussion focuses on the flexibility of a virtual laboratory environment in handling a variety of lighting design and analysis problems. The conception and utilization of virtual lighting meters and per-pixel lighting analyses are demonstrated through architectural examples.

INTRODUCTION

Lighting design is a synthesis of decision-making processes involving lamps, luminaires, controls, daylight apertures, surface materials, and colors. These choices affect the illuminance and luminance levels in a space; thus they have a direct impact on visual comfort and performance. This paper demonstrates an innovative lighting analysis tool and procedure called the "Virtual Lighting Laboratory" (VLL), which is based on post-processing of high dynamic range (HDR) digital images. The study exemplifies an in-depth utilization of digital technology to attain numerical and visual data so as to facilitate advanced lighting analysis and accelerate design decision-making.

The lighting accuracy of digital images depends on the input and the algorithmic capabilities of the rendering software. Yet computational accuracy does not necessarily produce a visual match of the real-world scene, because the display medium limits the luminance range and color gamut of the image (Ferwerda et al., 1996). Consequently, it can be misleading to make lighting design decisions and performance evaluations from the appearance of displayed images.

The pixel information in digital images contains RGB values that are computed from the intensity of light reflected or transmitted from a surface towards the camera (Hall, 1999). Therefore, it is possible to process digital images to retrieve lighting information. Certain criteria have to be fulfilled to obtain meaningful lighting data from digital images. These criteria have been published elsewhere (Inanici, 2001; Inanici, 2003); they are briefly discussed here as image generation and analysis guidelines.

IMAGE GENERATION AND ANALYSIS

It is crucial to generate digital images with reasonably accurate photometric data in absolute values and physical units. Physical accuracy in a lighting context depends on plausible modeling of light sources, light transport, and light reflections. It is important to note that every simulation is a simplification; validation studies should be consulted to assess the accuracy of the rendering algorithms.

Photometric data are inherently high dynamic range quantities: the range of luminance values from starlight to sunlight extends over 14 logarithmic units (Ferwerda et al., 1996). Rendering software internally uses floating-point representations for these quantities. However, storing floating-point numbers for every pixel is not very feasible, so the data are usually clipped into 24 bit/pixel integer values, which allow a dynamic range of only about 2 logarithmic units. This solution is efficient in terms of disk space, but the lost information is irrecoverable: absolute photometric values cannot be extracted, and operations outside the dynamic range of the stored values become impossible. RGBE and SGI LogLuv are two image formats that enable the storage of HDR quantities (Ward, 1991; Ward, 1997).
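As an illustration of how a shared-exponent format such as RGBE preserves absolute floating-point quantities in only four bytes per pixel, the minimal Python sketch below decodes a single RGBE pixel into floating-point channel values following Ward's published scheme. It is not part of the VLL implementation itself; header parsing and run-length decoding are omitted, and the function name is illustrative.

```python
import math

def rgbe_to_rgb(r, g, b, e):
    """Decode one RGBE pixel (four 8-bit values) into floating-point
    RGB channel values using the shared exponent byte e."""
    if e == 0:                               # an exponent of zero encodes true black
        return 0.0, 0.0, 0.0
    scale = math.ldexp(1.0, e - (128 + 8))   # 2**(e - 136) rescales the 8-bit mantissas
    return r * scale, g * scale, b * scale

# Example: the same four bytes can represent values far outside
# the 0..255 range of a 24 bit/pixel integer image.
print(rgbe_to_rgb(200, 180, 90, 150))        # roughly (3.3e6, 2.9e6, 1.5e6)
```

Because the three 8-bit mantissas share one exponent, the format spans many orders of magnitude while remaining only one byte per pixel larger than a conventional 24 bit/pixel image.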
RGB is often used as the basis of color computation in computer graphics. In essence, RGB combinations act in the virtual environment as metamers for the spectral power distributions of light in the real world. The rendering software internally defines the CIE chromaticity coordinates of its RGB primaries. CIE XYZ data for each pixel can therefore be quantified from RGB through a series of conversions that involve sampling the spectral curves with the CIE 1931 (2° observer) color matching functions (Hall, 1989; Hall, 1999). Operations such as gamma correction and exposure adjustment that might alter the stored pixel values should be avoided.
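To make the per-pixel analysis concrete, the sketch below shows one way a virtual luminance meter could be built on top of such floating-point pixel matrices. It assumes Radiance-style RGBE data whose channels carry radiometric quantities and uses the commonly quoted weighting L = 179·(0.265R + 0.670G + 0.065B) cd/m²; the exact coefficients and luminous efficacy depend on the primaries and conventions of the rendering software, and the function and variable names are illustrative rather than part of the VLL.

```python
import numpy as np

WHITE_EFFICACY = 179.0                       # lm/W, Radiance convention (assumed)
RGB_TO_Y = np.array([0.265, 0.670, 0.065])   # luminance weights for the assumed primaries

def luminance_map(rgb):
    """Per-pixel luminance in cd/m^2 from an HxWx3 float array of radiances."""
    return WHITE_EFFICACY * (rgb @ RGB_TO_Y)

def virtual_luminance_meter(rgb, rows, cols):
    """Average luminance over a rectangular pixel region, emulating a
    spot luminance meter aimed at that part of the scene."""
    return float(luminance_map(rgb[rows, cols]).mean())

# Usage with a hypothetical loader that returns floating-point RGB values:
# img = load_hdr_image("office.hdr")                        # HxWx3 float array
# desk = virtual_luminance_meter(img, slice(300, 350), slice(400, 480))
# print(f"Mean desk luminance: {desk:.1f} cd/m^2")
```

Other per-pixel analyses follow the same pattern: any statistic computed over a selected region of the matrix becomes a virtual meter reading.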
REFERENCES

[1] Hall, R. Illumination and Color in Computer Generated Imagery. Monographs in Visual Communication, Springer-Verlag, 1989.

[2] Siminovitch, M., et al. Experimental development of efficacious task source relationships in interior lighting applications. Conference Record of the IEEE Industry Applications Society Annual Meeting, 1989.

[3] Navvab, M., et al. Contrast potential, an assessment technique using large solid angle illuminance measurements. Conference Record of the 1992 IEEE Industry Applications Society Annual Meeting, 1992.

[4] Ferwerda, J. A., Pattanaik, S. N., Shirley, P., and Greenberg, D. P. A model of visual adaptation for realistic image synthesis. SIGGRAPH, 1996.

[5] Hall, R. Comparing Spectral Color Computation Methods. IEEE Computer Graphics and Applications, 1999.

[6] Inanici, M. Application of the state-of-the-art computer simulation and visualization in architectural lighting research. 2001.