Hi! This assignment implements a two-pass global illumination rendering algorithm. All of my code is in photonmapp.cpp, including the RenderImage() routine originally found in rendering.cpp. Other than adding a function in R3Viewer.h that takes doubles when calculating a WorldRay from an image (needed for pixel integration), no libraries were changed.
I apologize in advance for the css on this website. css is really hard.
Each photon is stored as a Photon struct with the following fields:

- `int Type`, which is either Diffuse, Specular, Transmission, or Emitted.
- `Photon* Prev` and `Photon* Next`, which are super useful for visualizing paths and also for checking paths for caustic properties. `Prev` is NULL for emitted photons.
- `RNRgb Power`, the power of the photon.
- `R3Ray Ray`, storing the starting point of the photon and the direction its path leads in scattering. This direction is NULL for terminal photons.

There's a hard-coded number of photons (`_nTotalPhotons = 100000`) that are shot out into the scene in total. What, so many photons?? If an emitted photon doesn't intersect with our scene, we don't care. It's the scene's loss! We don't replace the non-intersecting photon ray with another, which is why we emit so many to begin with. Space is not a limiting factor.
We divide these photons among the lights in the scene in proportion to each light's intensity. Also, note that each photon's power is inversely proportional to how many photons are shot out of that particular light! The emitted-light visualizations below draw each photon at the full intensity of its light, so you can clearly see where photons are coming from. In the actual implementation, however, because we're shooting many photons, each photon carries very little power.
Point lights are pretty straightforward; we position the photon at the source of the light, and emit in an arbitrary direction.
![]() pointlight1.scn , showing paths. |
![]() pointlight1.scn , showing photons. |
![]() pointlight2.scn , showing paths. |
![]() pointlight2.scn , showing photons. |
(I tried to tilt the view for the point-light images so you could see the sphere on the surface of the ground.)
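One standard way to draw that arbitrary direction uniformly over the sphere is rejection sampling; a sketch (`Vec3` and `RandomSphereDirection` are illustrative names, not the ones in photonmapp.cpp, which uses the R3 vector types):

```cpp
#include <cmath>
#include <cstdlib>

struct Vec3 { double x, y, z; };

// Random value in [-1, 1].
static double UnitRand() { return 2.0 * rand() / RAND_MAX - 1.0; }

// Uniform random direction on the unit sphere: pick a point in the cube
// [-1,1]^3, keep it only if it falls inside the unit ball, normalize.
Vec3 RandomSphereDirection() {
    for (;;) {
        Vec3 v = { UnitRand(), UnitRand(), UnitRand() };
        double len2 = v.x*v.x + v.y*v.y + v.z*v.z;
        if (len2 > 1e-12 && len2 <= 1.0) {
            double inv = 1.0 / std::sqrt(len2);
            return { v.x*inv, v.y*inv, v.z*inv };
        }
    }
}
```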
Area lights are stored as a circle and a normal. We emit photons from randomly chosen points on the circle, in a cosine-weighted random direction (more likely to shoot in the direction of the normal).
![]() arealight1.scn , showing paths. |
![]() arealight1.scn , showing photons. |
![]() arealight2.scn , showing paths. |
![]() arealight2.scn , showing photons. |
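The cosine-weighted draw can be sketched with Malley's method: sample a uniform point on the unit disk and lift it onto the hemisphere. Here the hemisphere axis (the light's normal) is +z, and `Dir3` stands in for the R3 vector type:

```cpp
#include <cmath>
#include <algorithm>

struct Dir3 { double x, y, z; };

// Cosine-weighted direction about +z. u1, u2 are uniform randoms in [0,1).
// Directions near the normal are more likely, matching the emission
// profile described above.
Dir3 CosineWeightedHemisphere(double u1, double u2) {
    const double kTwoPi = 6.28318530717958647692;
    double r = std::sqrt(u1);                       // uniform-area disk radius
    double phi = kTwoPi * u2;                       // uniform disk angle
    double z = std::sqrt(std::max(0.0, 1.0 - u1));  // lift onto hemisphere
    return { r * std::cos(phi), r * std::sin(phi), z };
}
```

The returned vector is unit length by construction, since x² + y² = u1 and z² = 1 − u1.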
Spot lights are stored as a point, direction, cutoff angle, and drop-off rate. We shoot photons from the spot light's origin into the hemisphere indicated by the direction, reject those that exceed the cutoff angle, and accept the rest randomly in proportion to a cosine/drop-off-rate falloff.
![]() spotlight1.scn , showing paths. |
![]() spotlight2.scn , showing paths. |
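The accept/reject loop might look like this (`SampleSpotCosine` is a hypothetical helper that returns only the sampled cos(theta) against the spot axis; building the full 3D direction around that axis is omitted):

```cpp
#include <cmath>
#include <cstdlib>

// Rejection sampler for a spot light: draw a candidate angle against the
// spot axis, discard it past the cutoff, then accept it with probability
// cos(theta)^dropoff so photon density follows the falloff.
double SampleSpotCosine(double cutoffCos, double dropoff) {
    for (;;) {
        double c = (double)rand() / RAND_MAX;   // candidate cos(theta), [0,1]
        if (c < cutoffCos) continue;            // beyond the cutoff angle
        double accept = std::pow(c, dropoff);   // cosine drop-off
        if ((double)rand() / RAND_MAX <= accept) return c;
    }
}
```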
Directional lights are a bit different, since they don't have a point of origin, only a direction. We make a moderately large circle outside the bounds of the scene, and pick random points on it from which to shoot photons in the specified direction. It's not specified how large an area this directional light covers, so we arbitrarily (lazily, probably) assume the area is exactly as large as we make it, so that our photon intensities are perfect.
![]() dirlight1.scn , showing paths. |
![]() dirlight2.scn , showing paths. |
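A sketch of that emitter: pick a uniform random point on a disk of radius R outside the scene bounds, then shoot along the light direction. Here the disk lies in the plane z = z0 and the light points down −z; the real code orients the disk by the light's actual direction (`P3` and `RandomPointOnDisk` are illustrative names):

```cpp
#include <cmath>

struct P3 { double x, y, z; };

// Uniform random point on a disk of radius R in the plane z = z0.
// u1, u2 are uniform randoms in [0,1); sqrt(u1) gives uniform area density.
P3 RandomPointOnDisk(double R, double z0, double u1, double u2) {
    const double kTwoPi = 6.28318530717958647692;
    double r = R * std::sqrt(u1);
    double phi = kTwoPi * u2;
    return { r * std::cos(phi), r * std::sin(phi), z0 };
}
```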
Now that we have all of these emitted photons, it's time to scatter them across the scene! (If you're following along in the code, this is my TracePhoton() function.) First, we find an intersection of the emitted photon with the scene. Depending on the brdf properties of the material at the intersection, there are probabilities of the path continuing in a diffuse, specular, or transmission direction. Or, the path could terminate. This is Russian roulette! My implementation gives each photon a 95% survival rate. When a photon continues, its power is adjusted to account for the survival probability. Here are some examples of what our renderings would look like if the survival rate were lower.
![]() |
![]() |
If we decide that the path continues in a diffuse direction, we return a random direction in the hemisphere defined by the normal at the intersection. If the path continues in a specular direction, we choose a direction based on the shininess and the angle of incidence. (Jason Lawrence's notes are really helpful here.) If the path continues in a transmission direction, we use Snell's law with the index of refraction and the normal to get the new direction.
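The transmission case can be sketched as follows. Given an incident unit direction d, a unit normal n facing against d, and eta = n_incident / n_transmitted, Snell's law gives the refracted direction, or fails on total internal reflection (`V3` and `Refract` are illustrative names):

```cpp
#include <cmath>

struct V3 { double x, y, z; };

static V3 Scale(V3 v, double s) { return { v.x*s, v.y*s, v.z*s }; }
static V3 Add(V3 a, V3 b) { return { a.x+b.x, a.y+b.y, a.z+b.z }; }
static double Dot(V3 a, V3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Snell's-law transmission direction. Returns false on total internal
// reflection; otherwise writes the refracted unit direction to t.
bool Refract(V3 d, V3 n, double eta, V3& t) {
    double cosI = -Dot(d, n);                        // cosine with the normal
    double sin2T = eta * eta * (1.0 - cosI * cosI);  // Snell: sinT = eta*sinI
    if (sin2T > 1.0) return false;                   // total internal reflection
    double cosT = std::sqrt(1.0 - sin2T);
    t = Add(Scale(d, eta), Scale(n, eta * cosI - cosT));
    return true;
}
```

With eta = 1 the ray passes straight through; entering a denser medium bends the ray toward the normal.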
As we trace the photons, we store the diffuse photons in either a global or caustic kdtree map, based on the path the photon has traversed. If the path is from an emitted photon to a specular/transmission photon to the current diffuse photon, then we store the photon in our caustic map. If the diffuse photon took some other path, we store it in our global map. So, there are no caustic photons in the global map.
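That decision reduces to a walk up the Prev chain. A sketch mirroring the fields described earlier, under the assumption that "caustic" means every bounce before the diffuse hit was specular or transmissive (the struct here is a stripped-down stand-in for the one in photonmapp.cpp):

```cpp
enum Type { Emitted, Diffuse, Specular, Transmission };

struct Photon {
    Type type;
    Photon* Prev;  // NULL for emitted photons
};

// A diffuse photon belongs in the caustic map when its chain reads
// Emitted -> (Specular|Transmission)+ -> Diffuse; anything else is global.
bool IsCausticPhoton(const Photon* p) {
    if (p->type != Diffuse) return false;
    const Photon* q = p->Prev;
    if (q == 0 || q->type == Emitted) return false;  // needs an S/T bounce first
    while (q && q->type != Emitted) {
        if (q->type == Diffuse) return false;        // diffuse bounce => global map
        q = q->Prev;
    }
    return true;
}
```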
![]() -ntotalphotons = 20 ). |
![]() -ntotalphotons = 300 ). |
![]() -ntotalphotons = 500 ). |
![]() -ntotalphotons = 500 ). |
Notice that as rays bounce off the colored sides, the power changes slightly to reflect this bounce. That's why some subsequent rays are red or blue.
We generate ray(s) through each pixel and trace them through the scene, estimating radiance at each point (this is our GetIllumination()
call). The total radiance is the sum of direct illumination, specular/transmission path illumination, indirect illumination, and caustic illumination.
To calculate direct illumination at a point, we iterate through each light in the scene. If the point is not in shadow with respect to a light, we add that light's reflection at the point to our total illumination calculation.
If the brdf of the intersected material has a specular or transmission component, we compute the new specular or transmission direction and recursively call GetIllumination along the resulting ray, weighting the returned illumination by the strength of the specular/transmission property. We cut off this recursion at depth 10.
![]() |
![]() |
![]() |
![]() |
For indirect illumination calculations, we find the -nneighbors photons closest to our point in the global kdtree. We sum each photon's power, weighted by its distance from the point and its alignment with the surface normal, and divide by the area covered by the gathered photons. Approximately the same approach is applied to calculate caustic illumination, except we search the caustic kdtree.
Below are some results with varying -nneighbors. Notice that the image becomes less "spotted" as we include contributions from more neighbors.
![]() -nneighbors = 2 |
![]() -nneighbors = 4 |
![]() -nneighbors = 32 |
![]() -nneighbors = 64 |
![]() -nneighbors = 128 |
![]() -nneighbors = 256 |
For antialiasing and overall better visual effect, we shoot multiple rays through each pixel. For each pixel in the image, we randomly generate -raysperpixel
points in the neighborhood of that pixel, and perform lighting calculations from each point. Then, we average the result.
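The per-pixel loop can be sketched like this; `Radiance(u, v)` stands in for generating a WorldRay through the sample position and tracing it (GetIllumination in the real code):

```cpp
#include <cstdlib>

// Jitter raysPerPixel sample positions inside the pixel's footprint,
// evaluate the radiance callback at each, and average the results.
template <typename RadianceFn>
double SupersamplePixel(int px, int py, int raysPerPixel, RadianceFn Radiance) {
    double total = 0.0;
    for (int s = 0; s < raysPerPixel; s++) {
        double u = px + (double)rand() / RAND_MAX;  // random offset in [0,1)
        double v = py + (double)rand() / RAND_MAX;  // within the pixel
        total += Radiance(u, v);
    }
    return total / raysPerPixel;                    // average the samples
}
```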
Below are some results with varying -raysperpixel. Edges look cleaner as this value increases (clearest when zoomed in).
![]() -raysperpixel = 2 |
![]() -raysperpixel = 4 |
![]() -raysperpixel = 8 |
We added the following parameters for rendering.

- `-raysperpixel` takes the number of rays shot through each pixel. The default is 4. (This is used for pixel integration.)
- `-nneighbors` takes the number of closest photons to each point used in indirect lighting calculations. The default is 128. (This is used for radiance estimation.)
- `-totalphotons` takes the total number of photons shot through the scene. The default is 100000. (This is used for photon emission.)

If you would like to visualize the global and caustic maps, it's advised that you do not divide the emitted photons' powers by the number emitted, so the paths won't look black. After removing this division from the code, pressing 'O' will toggle vector and intersection representations of the global map, and 'P' will do the same for the caustic map.
Gamma filtering changed my life. Before applying this (learned in COS 426), I was wondering why my code didn't work. After applying it, my code worked. It's a really great story, actually.
![]() |
![]() |
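The transform itself is one line: map linear radiance to display values with c' = c^(1/gamma), gamma around 2.2, after clamping. Without it, physically linear output looks far too dark, as in the "before" image above:

```cpp
#include <cmath>

// Gamma-encode a linear color channel for display. Clamps to [0,1]
// first; a gamma of ~2.2 approximates typical monitor response.
double GammaEncode(double linear, double gamma = 2.2) {
    if (linear < 0.0) linear = 0.0;
    if (linear > 1.0) linear = 1.0;
    return std::pow(linear, 1.0 / gamma);
}
```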
As in Jensen 2.3.2, I implemented cone filtering for indirect illumination calculations (so that a photon's contribution falls off with its distance from the target point), and Gaussian filtering for caustic illumination calculations. As a result, you'll notice that my caustics look pretty faint, but I think the overall aesthetic is much nicer.
![]() |
![]() |
![]() |
![]() |
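The per-photon weights from Jensen's book can be sketched as below: the cone filter down-weights photons linearly with distance d from the query point (r is the gather radius, k >= 1 the cone constant), and the Gaussian filter uses Jensen's constants alpha = 0.918, beta = 1.953. Normalization of the final estimate is omitted here:

```cpp
#include <cmath>

// Cone filter weight: 1 at the query point, falling linearly to 0 at d = k*r.
double ConeWeight(double d, double r, double k) {
    return 1.0 - d / (k * r);
}

// Gaussian filter weight with Jensen's constants; falls off smoothly
// with squared distance relative to the gather radius.
double GaussianWeight(double d, double r,
                      double alpha = 0.918, double beta = 1.953) {
    double num = 1.0 - std::exp(-beta * d * d / (2.0 * r * r));
    double den = 1.0 - std::exp(-beta);
    return alpha * (1.0 - num / den);
}
```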
Notice that we added a shadow calculation to direct lighting. We fire a ray from the light toward the point being shaded; if the first intersection is not the point we expect, there must be an occluder in between. In that case, we skip that light's contribution.
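A self-contained sketch of that test against a single spherical occluder (the real code queries the whole scene for the first intersection; `Vec` and `InShadow` are illustrative names):

```cpp
#include <cmath>

struct Vec { double x, y, z; };

static Vec Sub(Vec a, Vec b) { return { a.x-b.x, a.y-b.y, a.z-b.z }; }
static double Dot(Vec a, Vec b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Cast a ray from the light toward the shaded point and intersect it
// with one sphere. If the ray hits the sphere strictly before reaching
// the point, the point is in shadow with respect to this light.
bool InShadow(Vec light, Vec point, Vec sphereCenter, double radius) {
    Vec d = Sub(point, light);
    double distToPoint = std::sqrt(Dot(d, d));
    Vec dir = { d.x/distToPoint, d.y/distToPoint, d.z/distToPoint };
    Vec oc = Sub(light, sphereCenter);
    double b = Dot(oc, dir);
    double c = Dot(oc, oc) - radius * radius;
    double disc = b * b - c;                    // ray-sphere discriminant
    if (disc < 0.0) return false;               // ray misses the sphere
    double t = -b - std::sqrt(disc);            // nearest intersection distance
    return t > 1e-6 && t < distToPoint - 1e-6;  // occluder between light and point
}
```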
![]() |
![]() |
![]() |
![]() |
![]() |
![]() |
![]() |
![]() |
![]() |
![]() |
![]() fourspheres.scn |
![]() cos526.scn . |
![]() transform.scn |
![]() specular.scn . |
This assignment took a long time, but it was so rewarding to see all the pieces coming together!