HDRI (High Dynamic Range Images) in XSI
-- Image Based Lighting --

by Oktay CİNLİ
(v1.0 - 29.09.2001)

Note: You can view this page in XSI NetView: Copy this link and paste it into NetView: HDRINXSI_ENG.HTML


What is HDRI?

Cameras record light information, and this information is reproduced on a different medium (on paper or on a computer screen), but the dynamic range of the reproduced image is poor compared to real lighting conditions: it cannot accurately display the ratio between the darkest and brightest regions. The full range can only be captured by taking a series of pictures with different exposure settings. Those images are combined into a single high dynamic range image called a radiance map. Radiance maps are used to represent true illumination in computer graphics.
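The merge described above can be sketched in a few lines of Python: each exposure is divided by its shutter time to recover radiance, and a weight favours well-exposed pixels over clipped ones. This is a simplified, single-pixel version of the idea (assuming a linear camera response); the pixel values and exposure times below are made up for illustration.

```python
# Simplified radiance-map merge for one pixel channel, assuming a linear
# camera response. Values and exposure times are made-up illustration data.

def merge_exposures(pixels, exposure_times):
    """Combine the same pixel from several exposures into one radiance value.

    pixels         -- pixel values in [0, 1], one per exposure
    exposure_times -- shutter time (in seconds) of each exposure
    """
    def weight(p):
        # "Hat" weight: trust mid-tones, distrust clipped darks and brights.
        return 1.0 - abs(2.0 * p - 1.0)

    num = sum(weight(p) * (p / t) for p, t in zip(pixels, exposure_times))
    den = sum(weight(p) for p in pixels)
    return num / den if den > 0 else 0.0

# Three bracketed shots of the same pixel: dark, mid, nearly clipped.
radiance = merge_exposures([0.05, 0.4, 0.95], [1 / 30, 1 / 4, 2.0])
```

A real radiance-map tool applies this per pixel and per channel across whole images, and also recovers the camera's response curve instead of assuming it is linear.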

There are several image formats that can save and reproduce HDR information:

.HDR red-green-blue-exponent format
.PIC red-green-blue-exponent format
.PFM portable float map
.TIF .TIFF floating point tiff
.RAW .FLOAT  raw binary floating point
.MAP memory mapped image
.CT mental ray floating point image
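The red-green-blue-exponent formats in the list above (.HDR and .PIC) store each pixel as four bytes: three mantissas sharing one 8-bit exponent. A minimal decoding sketch follows; note this is an assumption-level illustration of the common convention (some decoders also add 0.5 to each mantissa byte, so exact results vary between tools).

```python
import math

def rgbe_to_float(r, g, b, e):
    """Decode one red-green-blue-exponent pixel (four bytes, each 0-255)
    into floating-point RGB, following the common Radiance .HDR convention."""
    if e == 0:                        # an exponent of 0 encodes pure black
        return (0.0, 0.0, 0.0)
    scale = math.ldexp(1.0, e - 136)  # equals 2^(e - 128) / 256
    return (r * scale, g * scale, b * scale)

# A bright pixel: with e = 129, each mantissa byte maps to (byte / 256) * 2.
rgb = rgbe_to_float(128, 64, 32, 129)
```

The shared exponent is what lets an 8-bit-per-channel file cover the huge brightness range a radiance map needs.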


For more information about the subject visit http://www.debevec.org/~debevec/Research/HDR .

Requirements for this tutorial:

1. HDR images: www.debevec.org/~debevec/Probes (You can choose any of them, or download all for further usage)

2. To view HDR images use HDRview: http://www.xsiturkiye.com/temp/HDRView.exe

3. Unfortunately, XSI doesn't recognize the .HDR image format, so we will convert .HDR images into a format XSI can read. We will use HDRTOXSI.EXE: http://www.xsiturkiye.com/temp/hdrtoxsi.exe


Converting .HDR images into .MAP images:

1. Run HDRTOXSI.EXE in a "Command Prompt" (you can drag and drop HDRTOXSI.EXE into the Command Prompt window). Press Enter and you will see this screen:

2. Usage:

c:\>hdrtoxsi.exe HDRimage.hdr MAPimage.map

For example, in the Command Prompt, type:

c:\>hdrtoxsi.exe rnl_probe.hdr rnl_probe.map

This command converts rnl_probe.hdr into rnl_probe.map. Note that the images and the program are all in the same directory here; if they are in separate directories, you must specify the exact location of your source files. You can simply drag and drop the files instead of typing the paths.



1. Create a sphere big enough to cover the whole scene.


2. Invert the normals (the direction of the polygons) of the sphere. To see the direction of the normals:

 a. Click the eye icon    b. Choose "Normals"    c. The direction is displayed

2a. Apply "Model \ Poly. Mesh \ Invert Polygons"

 a. Click Model \ Poly. Mesh    b. Apply "Invert Polygons"    c. New direction of the normals.


3. Attach a "Constant" material to the sphere: Get\Material\Constant


4. Then attach a "Spherical" texture projection: Property\Texture Projection\Spherical
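Under the hood, a spherical projection maps each direction on the sphere to (u, v) texture coordinates. The sketch below shows a standard lat-long parameterisation as an illustration only; XSI's exact convention may differ, and Debevec's light probes actually use an angular-map layout rather than lat-long.

```python
import math

def spherical_uv(x, y, z):
    """Map a unit direction vector to lat-long (u, v) texture coordinates.
    u sweeps longitude (around the sphere); v sweeps latitude, pole to pole."""
    u = 0.5 + math.atan2(x, z) / (2.0 * math.pi)
    v = 0.5 - math.asin(y) / math.pi
    return (u, v)

# The "north pole" direction (0, 1, 0) lands on the top edge of the texture.
uv = spherical_uv(0.0, 1.0, 0.0)
```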


5. Now, attach the HDR image to the sphere. Make sure the sphere is selected and open the Render Tree (keyboard shortcut: "7").

a. Constant shader connected to the material.   b. Image node command
c. Image node   d. Connect the HDR image to the image node
e. Connect the image node to Constant \ Radiance   f. Image node connected to the Constant shader

 g. Final view of the Render Tree



6. Place the camera and set the view mode to "Textured". It may look strange in the OpenGL view, but mental ray will render it as expected.

a. Set the "textured" view.     b. Textured view


7. Your environment is ready. You can place your scene inside the sphere, but do not forget to delete the default light and create your own. You can also create a grid that cuts the sphere into two parts.

8. Render using the Final Gathering rendering method. Note that Lambert and reflective surfaces look good.

9. Some examples:

Oktay Cinli

©2001 All rights reserved.