Writeup for COS 526 Assignment 1: Photon Mapping

Crystal Qian, 10.14.16


Hi! This assignment implements a 2-part global illumination rendering algorithm.

  1. A photon tracing stage that shoots photons from every light source and traces their paths through the scene.
  2. A rendering stage that takes the photons stored in the first stage and uses them to approximate lighting in a rendered image of the scene.

All of my code is in photonmapp.cpp, including the RenderImage() functionality found in rendering.cpp. Other than adding a function to R3Viewer.h that takes doubles when computing a WorldRay from image coordinates (needed for pixel integration), no library code was changed.

I apologize in advance for the CSS on this website. CSS is really hard.


Phase 1: Photon tracing

Photon structure

Each item in our Photon class has the following components:
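A typical photon-map photon (following Jensen) carries roughly the fields sketched below; the names here are illustrative assumptions, not the actual class definition.

    // Hypothetical sketch of a photon record (names assumed, not the actual code).
    // Each photon remembers where it landed, the direction it arrived from,
    // and the RGB power it carries.
    struct Photon {
      double position[3];    // surface point where the photon was stored
      double direction[3];   // incident direction of the photon
      double power[3];       // RGB power carried by this photon
    };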

Photon emission

There's a hard-coded number of photons (_nTotalPhotons = 100000) that are shot into the scene in total. What, so many photons?? If an emitted photon doesn't intersect with our scene, we don't care. It's the scene's loss! We don't replace a non-intersecting photon ray with another, which is why we emit so many to begin with. Space is not a limiting factor.

We give a proportion of these photons to each light in the scene, dependent on the light's intensity. Also, note that each photon's power is inversely proportional to how many photons are shot out of that particular light! In the emission visualizations below, each photon's intensity is drawn equal to the intensity of its light, so you can clearly see where photons are coming from. In the actual implementation, however, because we're shooting many photons, each photon carries very little power.
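As a rough sketch of how that split might look (helper and field names are assumptions, not the actual code), each light gets a share of the photon budget proportional to its intensity, and each of its photons later carries the light's power divided by that share:

    // Sketch: divide nTotalPhotons among the lights in proportion to intensity.
    // A brighter light emits more photons, but each photon from it carries
    // power = (light power) / (photons from that light), so totals still add up.
    #include <vector>

    struct Light { double intensity; };

    std::vector<int> PhotonsPerLight(const std::vector<Light>& lights, int nTotalPhotons) {
      double total_intensity = 0;
      for (const Light& l : lights) total_intensity += l.intensity;

      std::vector<int> counts;
      for (const Light& l : lights)
        counts.push_back((int)(nTotalPhotons * l.intensity / total_intensity));
      return counts;
    }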

Point lights

Point lights are pretty straightforward; we position the photon at the source of the light and emit it in a random direction.

pointlight1.scn, showing paths.
pointlight1.scn, showing photons.
pointlight2.scn, showing paths.
pointlight2.scn, showing photons.

(I tried to tilt the images for the point lights so you could see the sphere on the surface of the ground.)
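A minimal sketch of how a photon direction for a point light can be drawn, assuming the direction is uniform over the sphere (the helper name is hypothetical):

    // Sketch: uniformly random unit direction, via rejection sampling in the unit ball.
    #include <cmath>
    #include <cstdlib>

    void RandomSphereDirection(double dir[3]) {
      double len2;
      do {
        for (int i = 0; i < 3; i++)
          dir[i] = 2.0 * ((double)rand() / RAND_MAX) - 1.0;   // in [-1, 1]
        len2 = dir[0]*dir[0] + dir[1]*dir[1] + dir[2]*dir[2];
      } while (len2 > 1.0 || len2 < 1e-8);                     // stay inside the ball
      double len = std::sqrt(len2);
      for (int i = 0; i < 3; i++) dir[i] /= len;               // project onto the sphere
    }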

Area lights

Area lights are stored as a circle and a normal. We emit photons from randomly chosen points on the circle, in a cosine-weighted random direction (more likely to shoot in the direction of the normal).

arealight1.scn, showing paths.
arealight1.scn, showing photons.
arealight2.scn, showing paths.
arealight2.scn, showing photons.
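One common way to produce such a cosine-weighted direction is Malley's method: sample a point uniformly on the unit disk and project it up onto the hemisphere. The sketch below assumes the normal is the local z-axis; the actual code would rotate the result into the light's frame.

    // Sketch: cosine-weighted direction about the z-axis (Malley's method).
    #include <cmath>
    #include <cstdlib>

    const double kPi = 3.14159265358979323846;

    void CosineWeightedDirection(double dir[3]) {
      double u1 = (double)rand() / RAND_MAX;
      double u2 = (double)rand() / RAND_MAX;
      double r = std::sqrt(u1);            // disk radius
      double phi = 2.0 * kPi * u2;         // disk angle
      dir[0] = r * std::cos(phi);
      dir[1] = r * std::sin(phi);
      dir[2] = std::sqrt(1.0 - u1);        // cos(theta): directions near the normal are favored
    }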

Spot lights

Spot lights are stored as a point, a direction, a cutoff angle, and a drop-off rate. We shoot photons from the spot light's origin into the hemisphere indicated by the direction, reject directions that exceed the cutoff angle, and accept the rest with a probability that combines the cosine of the angle and the drop-off rate, so the beam is densest along the axis.

spotlight1.scn, showing paths.
spotlight2.scn, showing paths.
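One plausible reading of that acceptance step (an assumption, not necessarily the exact rule in the code) is to keep a candidate direction with probability equal to the cosine of its angle from the axis raised to the drop-off rate:

    // Sketch: accept a spotlight direction with probability cos(angle)^dropoff,
    // after rejecting anything outside the cutoff cone. Hypothetical helper.
    #include <cmath>
    #include <cstdlib>

    bool AcceptSpotDirection(double cos_angle, double cos_cutoff, double dropoff) {
      if (cos_angle < cos_cutoff) return false;        // outside the cone entirely
      double p = std::pow(cos_angle, dropoff);         // falls off toward the cone edge
      return ((double)rand() / RAND_MAX) < p;          // keep with probability p
    }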

Directional lights

Directional lights are a bit different, since they don't have a point of origin, only a direction. We make a moderately large circle outside of the bounds of the scene, and pick random points on that circle from which to shoot photons in the specified direction. It's not specified how large an area this directional light covers, so we arbitrarily (lazily, probably) assume it covers exactly the area we chose, so that our photon intensities work out.

dirlight1.scn, showing paths.
dirlight2.scn, showing paths.
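A sketch of picking the photon origin for a directional light, assuming the emitting circle is a disk of radius R spanned by two unit vectors u and v perpendicular to the light direction (parameter names are hypothetical):

    // Sketch: uniform point on a disk of radius R centered at `center`,
    // lying in the plane spanned by u and v. The photon is then emitted
    // from this origin along the light's direction.
    #include <cmath>
    #include <cstdlib>

    const double kPi = 3.14159265358979323846;

    void DirectionalPhotonOrigin(const double center[3], const double u[3],
                                 const double v[3], double R, double origin[3]) {
      double r = R * std::sqrt((double)rand() / RAND_MAX);     // sqrt => uniform over the area
      double phi = 2.0 * kPi * ((double)rand() / RAND_MAX);
      for (int i = 0; i < 3; i++)
        origin[i] = center[i] + r * std::cos(phi) * u[i] + r * std::sin(phi) * v[i];
    }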

Photon scattering / Russian Roulette / BRDF importance sampling

Now that we have all of these emitted photons, it's time to scatter them across the scene! (If you're following along in the code, this is my TracePhoton() function.) First, we find an intersection of the emitted photon with the scene. Depending on the BRDF properties of the intersected material, there are probabilities of the path continuing in a diffuse, specular, or transmission direction. Or, the path could terminate. This is Russian Roulette! My implementation gives each photon a 95% survival rate; when a photon survives, its power is scaled according to the survival probability. Here are some examples of what our renderings would look like if the survival rate were lower.

Probability of survival is .3.
Probability of survival is .6.

If we decide that the path continues in a diffuse direction, we return a random direction in the hemisphere defined by the normal at the intersection. If the path continues in a specular direction, we choose a direction based on the shininess and the angle of incidence. (Jason Lawrence's notes are really helpful here.) If the path continues in a transmission direction, we use Snell's law with the index of refraction and the normal to get the new direction.
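A sketch of the Russian Roulette decision at each bounce (not the actual TracePhoton() code; coefficient names are assumptions): pick diffuse, specular, or transmission in proportion to the material's diffuse, specular, and transmission coefficients, with a fixed survival probability.

    #include <cstdlib>

    enum Bounce { DIFFUSE, SPECULAR, TRANSMISSION, ABSORBED };

    Bounce ChooseBounce(double kd, double ks, double kt, double survival = 0.95) {
      double u = (double)rand() / RAND_MAX;
      if (u > survival) return ABSORBED;            // the 5% chance of dying here
      double sum = kd + ks + kt;
      if (sum <= 0) return ABSORBED;
      double v = (u / survival) * sum;              // reuse the remaining randomness
      if (v < kd) return DIFFUSE;
      if (v < kd + ks) return SPECULAR;
      return TRANSMISSION;
    }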

Multiple photon maps

As we trace the photons, we store each diffuse bounce in either a global or a caustic kdtree map, based on the path the photon has traversed. If the path from the light passed only through specular/transmission bounces before reaching the current diffuse hit, we store the photon in our caustic map. If the diffuse photon took some other path, we store it in our global map. So, there are no caustic photons in the global map.

Paths from the global map (-ntotalphotons = 20).
Paths from the global map (-ntotalphotons = 300).
Paths from the caustic map (-ntotalphotons = 500).
Paths from the caustic map (-ntotalphotons = 500).

Notice that as rays bounce off the colored sides, the power changes slightly to reflect this bounce. That's why some subsequent rays are red or blue.
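Concretely, the map-selection rule above can be expressed with a single flag that stays true only while the path has seen nothing but specular or transmission bounces since leaving the light (helper and container names below are hypothetical):

    #include <vector>

    struct Photon { double position[3], direction[3], power[3]; };

    // Store a diffuse hit in the caustic map only for L -> (S|T)+ -> D paths;
    // every other diffuse hit goes to the global map.
    void StoreDiffuseHit(const Photon& p, bool specular_only_path, int depth,
                         std::vector<Photon>& global_map,
                         std::vector<Photon>& caustic_map) {
      if (specular_only_path && depth > 0) caustic_map.push_back(p);
      else                                 global_map.push_back(p);
    }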


Phase 2: Rendering

Camera ray tracing

We generate ray(s) through each pixel and trace them through the scene, estimating radiance at each point (this is our GetIllumination() call). The total radiance is the sum of direct illumination, specular/transmission path illumination, indirect illumination, and caustic illumination.

To calculate direct illumination at a point, we iterate through each light in the scene. For each light the point is not in shadow from, we add that light's reflection at the point to our total illumination calculation.

If the BRDF of the intersected material has some specular or transmission component, we compute the new specular or transmission direction and recursively call GetIllumination() on the resulting ray, adding the returned illumination in proportion to the value of the specular/transmission property. We cut off this recursion at depth 10.

Direct illumination only with default parameters.
Sum of direct illumination and specular/transmission illumination with default parameters.
Sum of direct illumination, specular/transmission illumination, and caustic illumination with default parameters.
Sum of all four parts of our radiance estimation: direct, specular/transmission, caustic, and indirect.
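The specular direction above is presumably based on the standard mirror-reflection formula r = d - 2(d.n)n, where d is the incoming direction and n the surface normal; a small sketch, not tied to the actual R3 types:

    // Sketch: mirror-reflect direction d about normal n, writing the result into r.
    void ReflectDirection(const double d[3], const double n[3], double r[3]) {
      double dn = d[0]*n[0] + d[1]*n[1] + d[2]*n[2];   // d . n
      for (int i = 0; i < 3; i++) r[i] = d[i] - 2.0 * dn * n[i];
    }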

Indirect, caustic illumination (radiance estimation)

For indirect illumination calculations, we find the -nneighbors photons closest to our point in the global kdtree. We sum a weighted contribution from each photon's power, based on its distance from the point and how its direction relates to the surface normal, and divide by the area covered by those photons. Approximately the same approach is applied to calculate caustic illumination, except we search in the caustic kdtree.
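For reference, this is the standard photon-map density estimate from Jensen that the paragraph above paraphrases (f_r is the BRDF, \Delta\Phi_p the power of photon p, and r the radius of the sphere containing the N gathered photons); the filtering section below adds per-photon weights to this sum:

    L_r(x, \omega) \approx \frac{1}{\pi r^2} \sum_{p=1}^{N} f_r(x, \omega_p, \omega) \, \Delta\Phi_p(x, \omega_p)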

Below are some results with varying -nneighbors. Notice that the image becomes less "spotted" when we add calculations from more neighbors.

-nneighbors = 2
-nneighbors = 4
-nneighbors = 32
-nneighbors = 64
-nneighbors = 128
-nneighbors = 256

Pixel integration

For antialiasing and overall better visual effect, we shoot multiple rays through each pixel. For each pixel in the image, we randomly generate -raysperpixel points in the neighborhood of that pixel, and perform lighting calculations from each point. Then, we average the result.
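A sketch of that loop (the shape is an assumption; in the real code the per-sample evaluation goes through the double-valued WorldRay added to R3Viewer.h and then GetIllumination()):

    #include <cstdlib>
    #include <functional>

    // Sketch: jitter rays_per_pixel sample positions inside pixel (ix, iy),
    // evaluate the radiance at each, and average the results.
    double PixelRadiance(int ix, int iy, int rays_per_pixel,
                         const std::function<double(double, double)>& radiance_at) {
      double sum = 0.0;
      for (int s = 0; s < rays_per_pixel; s++) {
        double jx = ix + (double)rand() / RAND_MAX;   // random offset within the pixel
        double jy = iy + (double)rand() / RAND_MAX;
        sum += radiance_at(jx, jy);                   // trace a ray through (jx, jy)
      }
      return sum / rays_per_pixel;
    }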

Below are some results with varying -raysperpixel. Edges look smoother as this value increases (easiest to see zoomed in).

-raysperpixel = 2
-raysperpixel = 4
-raysperpixel = 8

Rendering parameters

We added the following parameters for rendering: -ntotalphotons, -nneighbors, and -raysperpixel, each discussed above.

If you would like to visualize the global and caustic maps, it's advised that you do not divide the emitted photons' powers by the number emitted, so the paths won't look black. After removing this part from the code, pressing 'O' will toggle vector and intersection representations of the global map, and 'P' will do the same for the caustic map.


Other things

Gamma filtering

Gamma filtering changed my life. Before applying this (learned in COS 426), I was wondering why my code didn't work. After applying this, my code worked. It's a really great story, actually.

No gamma filter.
Gamma filter with gamma = .9.

Filtering

As in Jensen 2.3.2, I implemented cone filtering for indirect illumination calculations (so that each photon's contribution falls off with its distance from the point being shaded), and Gaussian filtering for caustic illumination calculations. As a result, you'll notice that my caustics look pretty faint, but I think the overall aesthetic is much nicer.

Indirect illumination without cone filter.
Indirect illumination with cone filter.
Caustic illumination without Gaussian filter.
Caustic illumination with Gaussian filter.
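For reference, the cone and Gaussian weights from Jensen 2.3.2 look like this (d_p is the photon's distance to the shaded point, r the gather radius, k >= 1 the cone constant; Jensen suggests \alpha = 0.918 and \beta = 1.953 for the Gaussian):

    w_{pc} = 1 - \frac{d_p}{k r}
    \qquad
    w_{pg} = \alpha \left[ 1 - \frac{1 - e^{-\beta d_p^2 / (2 r^2)}}{1 - e^{-\beta}} \right]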

Shadow enhancement

Notice that we added a shadow calculation to direct lighting. We fire a ray from a light to the point we intend to shade; if the first intersection is not where we expect, there must be an occluder. In that case, we don't add that light's contribution.

No shadow.
Shadow.
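A sketch of the occlusion test, assuming we compare the ray's first hit against the point we meant to reach (names are hypothetical):

    // Sketch: the ray from the light was supposed to land at expected_hit;
    // if it actually landed somewhere else first, the point is in shadow.
    bool InShadow(const double expected_hit[3], const double actual_hit[3]) {
      double d2 = 0.0;
      for (int i = 0; i < 3; i++) {
        double diff = actual_hit[i] - expected_hit[i];
        d2 += diff * diff;
      }
      return d2 > 1e-6;   // tolerance for floating-point error
    }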

Happy accidents

Accident 1
Accident 2
Accident 3
Accident 4
Accident 5
Accident 6
Accident 7
Accident 8

Art / other scenes

fourspheres.scn
cos526.scn.
transform.scn
specular.scn.

Citations

Henrik Wann Jensen's photon mapping notes (the filtering discussion referenced above as Section 2.3.2).
Jason Lawrence's notes on importance sampling the BRDF.

This assignment took a long time, but it was so rewarding to see all the pieces coming together!