Path Tracer

9 devlogs
20h 52m
Created by Draedon

My second Path Tracer, built with WebGPU.

Why am I making a second one? I wasn't particularly happy with my last one (which was called Ray Tracer...) so I've decided to remake it, but better, both quality and code-wise.

The last one got to the end of the Ray Tracing in One Weekend book, by Peter Shirley, Trevor David Black, and Steve Hollasch, but I think I was more copying the code than learning the maths behind it, which is what I want to do this time around.

So here we are. My second path tracer.

Timeline

Implemented the Next Event Estimation (NEE) algorithm. If you look at the previous devlog, you'll see how dark and relatively noisy the render is, since it's pretty rare for a random ray to hit the light. With NEE, in addition to shooting out a ray on every intersection (simulating bouncing), the tracer also shoots another ray at a random light source. This brightens the scene a little and reduces noise, as it's proactively seeking out the lights instead of waiting for a ray to randomly hit one. Or at least, it's meant to reduce noise, but it doesn't really do that here. I'm almost certain that my implementation is a little broken.
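
The direct-lighting term NEE adds at each bounce can be sketched as a pure function. This is a textbook-style sketch, not my shader code: the `lightPoint` argument stands in for a uniformly sampled point on an area light's surface, and the shadow-ray visibility test is left out.

```typescript
type Vec3 = [number, number, number];

const sub = (a: Vec3, b: Vec3): Vec3 => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
const dot = (a: Vec3, b: Vec3): number => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const scale = (a: Vec3, s: number): Vec3 => [a[0] * s, a[1] * s, a[2] * s];

// Direct light contribution at a hit point on a Lambertian surface,
// given one sampled point on an area light. On the GPU, lightPoint
// would be drawn uniformly over the light's surface each bounce.
function sampleLight(
  hitPos: Vec3, normal: Vec3, albedo: Vec3,
  lightPoint: Vec3, lightNormal: Vec3, lightArea: number, emission: Vec3,
): Vec3 {
  const toLight = sub(lightPoint, hitPos);
  const dist = Math.sqrt(dot(toLight, toLight));
  const wi = scale(toLight, 1 / dist);      // direction towards the light
  const cosSurf = dot(normal, wi);          // cosine at the shaded point
  const cosLight = -dot(lightNormal, wi);   // cosine at the light's surface
  if (cosSurf <= 0 || cosLight <= 0) return [0, 0, 0];
  // pdf w.r.t. solid angle for uniform area sampling: dist^2 / (A * cosLight)
  const pdf = (dist * dist) / (lightArea * cosLight);
  const weight = cosSurf / pdf;
  // Lambertian BRDF is albedo / pi
  return [
    (albedo[0] / Math.PI) * emission[0] * weight,
    (albedo[1] / Math.PI) * emission[1] * weight,
    (albedo[2] / Math.PI) * emission[2] * weight,
  ];
}
```

A common bug here (and a likely culprit for my broken version) is forgetting that once you sample lights directly, emission hit by the random bounce ray has to be discounted, or lights get counted twice.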

Lighting, finally. Lighting should've been a quick implementation, but it required me to completely change the material system. Honestly, I should've done that ages ago, since previously, every object simply had a colour attached to it. Now, each object has a material index, which points into a materials array passed to the GPU. I also fixed many bugs along the way, such as plane normals sometimes facing the wrong way. Finally, I changed the scene to the famous Cornell Box.
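
Roughly what the new system looks like on the CPU side, as a sketch. The field names and the vec4-style padding are my assumptions here, not necessarily my actual layout, but padding vec3 fields out to 16 bytes is the kind of thing WGSL's alignment rules force on you:

```typescript
interface Material {
  albedo: [number, number, number];
  emission: [number, number, number];
}

interface Sphere {
  center: [number, number, number];
  radius: number;
  materialIndex: number; // index into the materials array below
}

const materials: Material[] = [
  { albedo: [0.73, 0.73, 0.73], emission: [0, 0, 0] },    // white diffuse
  { albedo: [0, 0, 0],          emission: [15, 15, 15] }, // the light
];

// Flatten for a GPU storage buffer: each vec3 padded to 4 floats,
// so one material occupies 8 floats (32 bytes).
function packMaterials(mats: Material[]): Float32Array {
  const out = new Float32Array(mats.length * 8);
  mats.forEach((m, i) => {
    out.set(m.albedo, i * 8);       // albedo.xyz, one float of padding
    out.set(m.emission, i * 8 + 4); // emission.xyz, one float of padding
  });
  return out;
}
```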

Made the rendering loop progressive. I already gave a brief explanation of what that is, but basically, before, 30 rays were being shot out per pixel, per frame. Since that's really expensive, it now shoots only 1 ray per pixel and accumulates the results across frames, trading convergence speed for frame rate. Now, instead of under 4fps, I'm getting 80-100fps, although it does take longer for the image to converge. There's also the issue of floating-point precision, but I'm mostly avoiding that by limiting the number of rerenders.
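
The accumulation itself is just a running average per pixel. A minimal sketch of the idea (one channel; the real thing runs per pixel, per colour channel, in the shader):

```typescript
// Running average: after N frames the accumulated value is the mean of
// the N one-sample estimates, i.e. equivalent to N samples per pixel.
// frameIndex is 0 for the first frame.
function accumulate(prevAvg: number, newSample: number, frameIndex: number): number {
  return (prevAvg * frameIndex + newSample) / (frameIndex + 1);
}
```

As frameIndex grows, each new sample nudges the average less and less, which is also where the floating-point precision concern comes from: eventually new samples round away to nothing.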

Implemented rendering of quads. Even though the entire point of this project was for me to learn the maths behind path tracing, I don't think I want to learn the entire derivation behind ray-quad intersections. Fortunately, a quad is just a general 2D primitive, and it's really easy to adapt the intersection test to other 2D shapes, most notably the triangle. On the flip side, with 30 samples per pixel at a screen resolution of 1920x1080, I'm getting below 4fps. Next up is probably progressive path tracing (rendering one sample per frame and averaging the result with the previous frames). If that isn't enough, then I'll have to optimise properly.
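
For reference, here's one common formulation of the ray-quad test (the one used in the Ray Tracing in One Weekend series), sketched in TypeScript. I'm not claiming my shader does it exactly this way:

```typescript
type Vec3 = [number, number, number];
const sub = (a: Vec3, b: Vec3): Vec3 => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
const add = (a: Vec3, b: Vec3): Vec3 => [a[0] + b[0], a[1] + b[1], a[2] + b[2]];
const scale = (a: Vec3, s: number): Vec3 => [a[0] * s, a[1] * s, a[2] * s];
const dot = (a: Vec3, b: Vec3): number => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const cross = (a: Vec3, b: Vec3): Vec3 =>
  [a[1] * b[2] - a[2] * b[1], a[2] * b[0] - a[0] * b[2], a[0] * b[1] - a[1] * b[0]];

// Quad defined by a corner Q and two edge vectors u, v.
// Returns the ray parameter t of the hit, or null on a miss.
function hitQuad(orig: Vec3, dir: Vec3, Q: Vec3, u: Vec3, v: Vec3): number | null {
  const n = cross(u, v);                       // (unnormalised) plane normal
  const denom = dot(n, dir);
  if (Math.abs(denom) < 1e-8) return null;     // ray parallel to the plane
  const t = dot(n, sub(Q, orig)) / denom;      // intersect the containing plane
  if (t < 1e-4) return null;                   // behind the ray origin
  const p = sub(add(orig, scale(dir, t)), Q);  // hit point, relative to Q
  // Express p in (u, v) coordinates; inside the quad iff both are in [0, 1].
  // Swapping this check for "alpha >= 0 && beta >= 0 && alpha + beta <= 1"
  // is exactly the "easy to change to other 2D shapes" part: it's a triangle.
  const w = scale(n, 1 / dot(n, n));
  const alpha = dot(w, cross(p, v));
  const beta = dot(w, cross(u, p));
  if (alpha < 0 || alpha > 1 || beta < 0 || beta > 1) return null;
  return t;
}
```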

Added some basic performance metrics. These include the time per frame spent on the GPU ONLY, and the fps as calculated by 1/frameTime. These aren't super useful now but will come in handy later, when I start optimising. As you can see from the screenshot, with 30 samples per pixel, my crappy laptop is struggling to reach even 40fps on 1920x1080. But to be fair, over 62 million rays are being shot out per frame.

Shading is now ray traced! Currently, it just uses a basic Lambertian BRDF to determine how the rays bounce. Also took the time to add anti-aliasing (currently at 30 samples per pixel) and gamma correction (gamma = 1.5). Gamma is hardcoded right now because I can't be bothered making and binding another buffer.
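
Both additions are tiny on their own. A sketch of the two pieces, with the gamma value defaulting to the 1.5 mentioned above (the exact divide-by-size convention in the jitter is an assumption):

```typescript
// Gamma correction: linear radiance mapped to display space
// with out = in^(1/gamma).
function gammaCorrect(linear: number, gamma: number = 1.5): number {
  return Math.pow(Math.max(linear, 0), 1 / gamma);
}

// Anti-aliasing: jitter the sample position within the pixel so that,
// averaged over many samples, edges blend instead of staircasing.
// rx, ry are random offsets in [0, 1).
function jitteredUV(
  x: number, y: number, width: number, height: number, rx: number, ry: number,
): [number, number] {
  return [(x + rx) / width, (y + ry) / height];
}
```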

Implemented some ray traced spheres. Don't get me wrong - rays are being shot into the scene and there is maths calculating the nearest collision, but the lighting isn't ray traced at all; rather, it's just computed with a basic diffuse model, where the sphere's colour is multiplied by the dot product of the surface normal and the light direction. Even though the shading technically isn't ray tracing, it was still useful to implement as proof that my surface normals are correct.
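
The two pieces described above, sketched as pure functions: the standard quadratic-formula ray-sphere test, and the dot-product diffuse term:

```typescript
type Vec3 = [number, number, number];
const sub = (a: Vec3, b: Vec3): Vec3 => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
const dot = (a: Vec3, b: Vec3): number => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const scale = (a: Vec3, s: number): Vec3 => [a[0] * s, a[1] * s, a[2] * s];

// Nearest ray-sphere intersection: solve |orig + t*dir - center|^2 = r^2,
// a quadratic in t. Returns the nearest positive t, or null on a miss.
function hitSphere(orig: Vec3, dir: Vec3, center: Vec3, radius: number): number | null {
  const oc = sub(orig, center);
  const a = dot(dir, dir);
  const halfB = dot(oc, dir);
  const c = dot(oc, oc) - radius * radius;
  const disc = halfB * halfB - a * c;
  if (disc < 0) return null;               // ray misses the sphere entirely
  const sqrtD = Math.sqrt(disc);
  let t = (-halfB - sqrtD) / a;            // try the nearer root first
  if (t < 1e-4) t = (-halfB + sqrtD) / a;  // we might be inside the sphere
  return t < 1e-4 ? null : t;
}

// The basic diffuse model: colour * max(dot(normal, lightDir), 0).
function lambert(colour: Vec3, normal: Vec3, lightDir: Vec3): Vec3 {
  return scale(colour, Math.max(dot(normal, lightDir), 0));
}
```

Since a wrong normal immediately shows up as a wrongly-lit sphere, this really is a decent sanity check for the intersection code.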

Technically have some actual ray tracing now! Rays are being shot out of the camera into the (empty) scene and returning the sky colour, which is a gradient depending on the vertical direction of the ray. Already ran into so many bugs though, including my worst nightmare: memory alignment issues in the shader. Next up is adding some spheres into the scene.
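
The gradient itself is one lerp on the ray's y direction. A sketch (the white-to-blue colours are the classic Ray Tracing in One Weekend background; my exact colours may differ):

```typescript
type Vec3 = [number, number, number];
const normalize = (a: Vec3): Vec3 => {
  const l = Math.sqrt(a[0] * a[0] + a[1] * a[1] + a[2] * a[2]);
  return [a[0] / l, a[1] / l, a[2] / l];
};
const lerp = (a: Vec3, b: Vec3, t: number): Vec3 =>
  [a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t, a[2] + (b[2] - a[2]) * t];

// Sky gradient: map the unit ray direction's y from [-1, 1] to t in [0, 1],
// then blend from white (down) to blue (up).
function skyColour(dir: Vec3): Vec3 {
  const t = 0.5 * (normalize(dir)[1] + 1);
  return lerp([1, 1, 1], [0.5, 0.7, 1.0], t);
}
```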

Here we go. The start of my second path tracer. Not much to say about this first devlog. All that's here is boilerplate, really: a framebuffer rendered to by a compute shader, which is then displayed to the screen. People familiar with computer graphics will recognise the classic UV gradient being rendered.
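
The UV gradient is the "hello world" of compute shaders: each texel's red channel grows with x and green with y. Sketched on the CPU (whether you divide by size or size minus one is a detail I'm glossing over):

```typescript
// Per-texel colour for the classic UV debug gradient:
// red = normalised x, green = normalised y, blue = 0.
function uvGradient(
  x: number, y: number, width: number, height: number,
): [number, number, number] {
  return [x / (width - 1), y / (height - 1), 0];
}
```

Black in one corner, yellow in the opposite one; if that renders, the compute pipeline, framebuffer, and blit to the screen are all wired up correctly.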
