RenderToolbox3: MATLAB tools that facilitate physically based stimulus rendering for vision research

Benjamin S Heasly, Nicolas P Cottaris, Daniel P Lichtman, Bei Xiao, David H Brainard

Abstract

RenderToolbox3 provides MATLAB utilities and prescribes a workflow that should be useful to researchers who want to employ graphics in the study of vision and perhaps in other endeavors as well. In particular, RenderToolbox3 facilitates rendering scene families in which various scene attributes and renderer behaviors are manipulated parametrically, enables spectral specification of object reflectance and illuminant spectra, enables the use of physically based material specifications, helps validate renderer output, and converts renderer output to physical units of radiance. This paper describes the design and functionality of the toolbox and discusses several examples that demonstrate its use. We have designed RenderToolbox3 to be portable across computer hardware and operating systems and to be free and open source (except for MATLAB itself). RenderToolbox3 is available at https://github.com/DavidBrainard/RenderToolbox3.

Keywords: color; graphics rendering; material perception; stimuli; vision science.

Figures

Figure 1
Overview of the DragonColorChecker workflow. Files that the user must supply are labeled with italics. These include the Parent Scene, which may be modeled using the Blender application and must be exported as a Collada file; the Mappings File, which maps reflectance spectra to the dragon model; the Conditions File, which lists 24 reflectance spectra measured from a ColorChecker color rendition chart; and the Executive Script, which binds together the Parent Scene file, Mappings File, and Conditions File and invokes RenderToolbox3 utilities. Subsequent steps are performed automatically by RenderToolbox3 utilities. Twenty-four Native Scene Files are produced (only three are shown) that may drive one of the supported renderers, PBRT or Mitsuba, in order to produce 24 Multispectral Radiance Data files (only three are shown). All data files use a consistent MATLAB mat-file format and physical radiance units. Multispectral data files are combined into a single sRGB montage that resembles the original ColorChecker chart.
Listing 1
Excerpt from a conditions file that has two column headers followed by 24 rows (only three rows are shown). The left-hand column has the header imageName and contains an arbitrary mnemonic for each of 24 conditions. The right-hand column has the header dragonColor and lists 24 reflectance spectrum data file names. Columns are delimited by tab characters, and rows are delimited by new lines. In general, there is no limit to the number of columns or rows in a conditions file.
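As a concrete illustration of this layout, a hypothetical excerpt of such a conditions file might look as follows. The spectrum data file names are invented for illustration; columns are separated by tab characters.

```
imageName	dragonColor
darkSkin	reflectance-1.spd
lightSkin	reflectance-2.spd
blueSky	reflectance-3.spd
```

Each of the 24 rows defines one condition; during batch processing, the value in the dragonColor column is substituted into the mappings file for that condition.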
Listing 2
Excerpt from a mappings file that maps user-defined values to scene elements. Here all mappings are contained in a Generic block, which is applied to any renderer and is delimited by curly braces { }. The Dragon-material scene element is declared to be a material of type matte. The user-defined spectrum (dragonColor) is mapped to the diffuseReflectance property of Dragon-material. Parentheses indicate that the value of dragonColor may change for each condition and should be taken from the “dragonColor” column of the conditions file. The Wall-material and Floor-material elements are also declared as materials of type matte. Constant strings are mapped to the diffuseReflectance property of Wall-material and Floor-material. These strings specify spectrally uniform reflectance spectra over the range 300 through 800 nm, using a syntax for specifying arbitrary sampled spectra that PBRT and Mitsuba can parse. The reflectance of the wall is specified as 0.75, and that of the floor is 0.5.
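The structure described above can be sketched as follows. The element names and wavelength:value spectrum strings follow the caption's description; the exact RenderToolbox3 mappings grammar may differ in detail, so treat this as an illustrative sketch rather than a verbatim excerpt.

```
Generic {
    % Dragon reflectance varies per condition, read from the
    % "dragonColor" column of the conditions file.
    Dragon-material:material:matte
    Dragon-material:diffuseReflectance.spectrum = (dragonColor)

    % Wall and floor use constant, spectrally uniform reflectances
    % specified as sampled spectra from 300 to 800 nm.
    Wall-material:material:matte
    Wall-material:diffuseReflectance.spectrum = 300:0.75 800:0.75
    Floor-material:material:matte
    Floor-material:diffuseReflectance.spectrum = 300:0.5 800:0.5
}
```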
Listing 3
A simplified DragonColorChecker MATLAB executive script that binds a parent scene Collada file with a conditions file and a mappings file and invokes RenderToolbox3 utilities to form a complete RenderToolbox3 recipe. The user chooses the PBRT renderer, using a structure of RenderToolbox3 hints. The MakeSceneFiles( ) utility uses the parent scene Collada file, conditions file, and mappings file to produce a family of PBRT native scene files that PBRT can render. The BatchRender( ) utility lets each PBRT native scene file drive PBRT in turn to produce a family of multispectral data files. The MakeMontage( ) utility condenses the family of multispectral data files into a single RGB image stored in a portable png file.
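A minimal sketch of such an executive script is shown below. The file names are illustrative, and the argument lists for the RenderToolbox3 utilities are indicative rather than exact; consult the toolbox documentation for the precise signatures.

```matlab
% Sketch of a DragonColorChecker-style executive script.
% File names are hypothetical placeholders.
parentSceneFile = 'Dragon.dae';            % Collada parent scene
conditionsFile  = 'DragonConditions.txt';  % 24 reflectance conditions
mappingsFile    = 'DragonMappings.txt';    % element-to-value mappings

% Choose a renderer via a structure of RenderToolbox3 hints.
hints.renderer = 'PBRT';

% Produce one PBRT native scene file per condition.
nativeSceneFiles = MakeSceneFiles( ...
    parentSceneFile, conditionsFile, mappingsFile, hints);

% Let each native scene file drive PBRT in turn, producing one
% multispectral radiance data file (.mat) per condition.
radianceDataFiles = BatchRender(nativeSceneFiles, hints);

% Condense the multispectral data files into a single sRGB montage.
SRGBMontage = MakeMontage(radianceDataFiles);
```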
Figure 2
Summary of RadianceTest results. Each row shows results from one of the eight RadianceTest conditions. From left to right, each row contains (a) the name of the condition; (b) a schematic drawing of the scene geometry, including the light source (yellow circle or bar), reflector (gray bar), camera (black polygon), and manipulation (red highlight); (c) sRGB representation of the scene as rendered by PBRT; (d) sRGB representation of the scene as rendered by Mitsuba; and (e) reflected radiance profile taken at a single arbitrary wavelength (520 nm) at a horizontal slice through the multispectral renderings at the location of the dashed lines for PBRT (orange) and Mitsuba (blue). Radiance profiles vary in height and width with respect to the Reference condition in a manner consistent with physical principles and illuminant spectra that are treated in units of power per unit wavelength. A single common scale factor was applied to all sRGB images to facilitate visual comparisons.
Listing 4
Excerpt from a mappings file that adjusts Mitsuba camera properties using values contained in the parent scene Collada file. All mappings are contained in a Mitsuba path block, which applies low-level path syntax to Mitsuba; the block is delimited by curly braces { }. The right-hand value in square brackets [ ] refers to the x-magnification property of the Camera scene element of the Collada parent scene. This value is applied to the x-scale factor of the Camera element of the Mitsuba scene that will be generated during batch processing.
Figure 3
Comparison of SimpleSphere renderings. PBRT and Mitsuba rendered the SimpleSphere scene. The SphereRendererToolbox produced a third Reference rendering. The top and middle rows contain sRGB representations of SimpleSphere multispectral renderings and element-wise differences between multispectral renderings. Top row, from left to right: PBRT, Mitsuba minus PBRT, and Mitsuba. Middle row, from left to right: PBRT minus Reference, Reference, and Mitsuba minus Reference. A single common scale factor was applied to all sRGB images to facilitate visual comparisons. The bottom row contains reflected radiance profiles taken at a single arbitrary wavelength (520 nm) at a vertical slice through the multispectral renderings at the location of the dashed lines for PBRT (orange), Mitsuba (blue), and the SphereRendererToolbox Reference (gray).
Figure 4
Summary of MaterialSphereBumps workflow. The top row shows key elements of the scene, from left to right: the parent scene 3-D model created in Blender; various sphere materials including red-looking matte, green-looking Ward material, and gold metal; and an image of the earth used as a bump map to alter the surface height of the sphere. The middle row shows a sRGB montage of renderings produced by PBRT. The bottom row shows a sRGB montage of renderings produced by Mitsuba. Both renderers support bump maps and produce renderings with surface height altered to resemble the earth image. However, the bumps appear taller in the PBRT rendering than they do in the Mitsuba rendering, indicating that the two renderers interpret bump maps differently.
Listing 5
A simplified MaterialSphereBumps MATLAB executive script that produces sensor images based on estimates of the spectral sensitivities of the human cones. The user chooses a Psychophysics Toolbox colorimetric data file that contains estimates of human cone sensitivities. The cone data act as the color-matching function passed to the RenderToolbox3 MakeSensorImages( ) utility, which transforms multispectral renderings into sensor images that estimate the responses of human cones to the renderings. The sensor images are saved in MATLAB mat-file data files with file names automatically chosen based on the multispectral data file names and the color-matching function name.
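The key step can be sketched as follows. The script assumes radianceDataFiles produced by an earlier BatchRender( ) call; T_cones_ss2 is a Psychophysics Toolbox colorimetric data file containing the Stockman–Sharpe 2-degree cone fundamentals. The argument list for MakeSensorImages( ) is indicative, not exact.

```matlab
% Sketch: transform multispectral renderings into cone sensor images.
% radianceDataFiles is assumed to exist from a previous BatchRender( ).

% Choose a Psychophysics Toolbox colorimetric data file with estimates
% of human cone spectral sensitivities.
matchingFunctions = {'T_cones_ss2'};

% Compute sensor images; output .mat file names are chosen automatically
% from the multispectral data file names and the matching function name.
sensorImageFiles = MakeSensorImages(radianceDataFiles, matchingFunctions);
```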
Figure 5
Summary of modifications to the “wild” Interior scene. Each row shows the scene at a particular step of modification and contains, from left to right, the name of the step, a schematic plan view of the Interior scene, and a Blender preview image or sRGB representation of the scene as rendered by Mitsuba. The original parent scene 3-D model was created by an unknown author for uses unrelated to RenderToolbox3. As viewed in Blender, many objects are visible, including two rear walls, five seats, and a hanging lamp. A perspective camera views the scene from a corner opposite the rear walls. When the original scene is rendered with the Direct Lighting strategy, the room is dark and few objects are visible. When rendered with the Path Tracing strategy, a few more objects become visible, and some rendering noise artifacts appear as green and gray spots. After the parent scene model is adjusted to contain large Area Lights, the scene appears lighter and many more objects are visible.
Figure 6
Summary and analysis of the SpectralIllusion. The SpectralIllusion was produced from two renderings. The top left image shows a sRGB representation of the Initial Mitsuba rendering. This initial rendering allowed estimation of the illumination arriving at the destination pip (inside the green circle). The top right image shows a sRGB representation of the Illusion Mitsuba rendering, which used a new reflectance for the destination pip that was calculated using the estimated illumination. In the Illusion image, the target pip (inside the blue square) and the destination pip have essentially equal RGB values even though the destination pip appears brighter/pinker (destination: [176 49 61] vs. target: [175 48 60]). The reference pip (inside the yellow circle) appears to have a color more similar to the target pip even though its RGB values differ substantially from the target's (reference: [115 28 40] vs. target: [175 48 60]). The second row plots the multispectral reflectance specified for the target pip (blue squares), the reflectance of the destination pip in the Initial rendering (green circles), and the calculated reflectance used for the destination pip in the Illusion rendering (red stars). The third row plots the estimated spectrum of illumination arriving at each pip. The illumination arriving at the destination pip has essentially the same spectrum in the Initial and Illusion renderings and is generally different from the illumination arriving at the target pip. The bottom row plots the final reflected radiance for each pip as read from multispectral renderings. The final reflected radiance of the target pip has essentially the same spectrum as the final reflected radiance of the destination pip in the Illusion rendering.
Listing 6
Excerpt from the SpectralIllusion executive script that calculates the reflectance for the destination pip. Previously, the executive script would have performed batch processing and an initial rendering, resulting in a multispectral data file. The spectral sampling used internally by the renderer S_renderer; the spectral sampling used to specify reflectances S_reflectance; and the initial, arbitrary reflectance of the destination pip R_destination would have been specified previously. The executive script loads rendering data from the initial rendering, locates pixels in the rendering that fall within the target and destination pips, and reads final reflected spectra F_target and F_destination from those pixels. The script can then estimate the spectrum of illumination I_destination at the destination pip and calculate a new reflectance for the destination pip R_destination. Note that the spectral samplings S_renderer and S_reflectance are not generally equal. Before performing element-wise arithmetic on spectra, the executive script uses the SplineRaw( ) utility from the Psychophysics Toolbox to resample final reflected spectra to match the sampling of the reflectance spectra.
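The calculation described above can be sketched in a few lines of MATLAB. For a matte surface, final reflected radiance is the element-wise product of illumination and reflectance, so illumination can be estimated by division, and a new reflectance can be chosen to make the destination pip reflect the same spectrum as the target pip. Variable names follow the caption; SplineRaw( ) is the Psychophysics Toolbox resampling utility.

```matlab
% Sketch of the destination-pip reflectance calculation.
% F_target, F_destination: final reflected spectra read from the initial
%   rendering, sampled at S_renderer.
% R_destination: initial, arbitrary destination reflectance, sampled at
%   S_reflectance.

% The renderer and reflectance spectral samplings generally differ, so
% resample the final reflected spectra to the reflectance sampling first.
F_target      = SplineRaw(S_renderer, F_target, S_reflectance);
F_destination = SplineRaw(S_renderer, F_destination, S_reflectance);

% Reflected radiance = illumination .* reflectance, so estimate the
% illumination arriving at the destination pip by element-wise division.
I_destination = F_destination ./ R_destination;

% Choose a new destination reflectance so the destination pip reflects
% the same spectrum as the target pip.
R_destination = F_target ./ I_destination;
```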

Source: PubMed
