Path Tracer

14 devlogs
35h 43m
Ship certified
Created by Draedon

A simple path tracer built with no game engine and no libraries, using only JS and WebGPU. A demo is available on the web. Currently only supports diffuse materials, metals, dielectrics, and lights.

Timeline

Ship 1

1 payout of 766.0 shells

Draedon • about 1 month ago

Covers 14 devlogs and 35h 43m

Added the configuration panel, finally. This just allows basic tweaking of settings, such as the maximum number of bounces and rerenders, and the gamma of the render. This was really easy to implement as it was just a bit of copy-paste on the HTML side and a few functions in JS. Also added a tiny bit of error handling in case the GPUDevice is lost, which will pretty much only happen if the GPU is overtaxed.
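
For the curious, WebGPU surfaces device loss through the `device.lost` promise. A minimal sketch of that kind of handler, assuming a `device` is in scope and a hypothetical `initRenderer` rebuild function:

```js
// GPUDevice.lost resolves with a GPUDeviceLostInfo once the device is lost,
// e.g. when the GPU is overtaxed or the browser resets the adapter.
device.lost.then((info) => {
  console.warn(`GPU device lost: ${info.message} (reason: ${info.reason})`);
  if (info.reason !== 'destroyed') {
    // Hypothetical recovery path: re-request a device and rebuild pipelines.
    initRenderer();
  }
});
```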

Implemented dielectric materials! Basically, just transparent materials. Overall, it wasn't that hard to implement. It was just Snell's Law, Fresnel reflectance (with Schlick's approximation), and total internal reflection. It also required a slight modification of the material system. Previously, since there were only diffuse and metal materials, I simply had a boolean flag in the Material struct to determine if it was a metal or not. But now, that has changed to a u32 to accommodate more materials. This doesn't actually use more memory though, since the bool was already stored as a u32 due to WGSL limitations. Also, this project should be close to being shipped. Maybe a configuration panel is next.
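
For the curious, that math boils down to something like this (a JS sketch for illustration; the project's real code is WGSL, and vectors here are plain `[x, y, z]` arrays):

```js
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const add = (a, b) => a.map((c, i) => c + b[i]);
const scale = (v, s) => v.map((c) => c * s);

// Schlick's approximation to the Fresnel reflectance at a dielectric boundary.
function schlick(cosTheta, refractionRatio) {
  let r0 = (1 - refractionRatio) / (1 + refractionRatio);
  r0 *= r0;
  return r0 + (1 - r0) * Math.pow(1 - cosTheta, 5);
}

// Scatter off a dielectric. `dir` and `normal` are unit vectors;
// `refractionRatio` is n1/n2 across the boundary being crossed.
function scatterDielectric(dir, normal, refractionRatio) {
  const cosTheta = Math.min(-dot(dir, normal), 1);
  const sinTheta = Math.sqrt(1 - cosTheta * cosTheta);
  const totalInternalReflection = refractionRatio * sinTheta > 1;
  if (totalInternalReflection || schlick(cosTheta, refractionRatio) > Math.random()) {
    return add(dir, scale(normal, 2 * cosTheta)); // mirror reflection
  }
  // Snell's law: build the refracted ray from its components perpendicular
  // and parallel to the surface normal.
  const perp = scale(add(dir, scale(normal, cosTheta)), refractionRatio);
  const parallel = scale(normal, -Math.sqrt(Math.abs(1 - dot(perp, perp))));
  return add(perp, parallel);
}
```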

Implemented some additional controls and a rotatable camera! Sadly, the camera isn't interactive yet, but that'll come eventually. The new controls include the Tab key, which changes scenes (currently there are only two: the mirror room and the Cornell Box), and Ctrl + S, which downloads the current render without the info box in the top left. Next up is probably transparent and translucent materials, which I think fall under the category of dielectrics.
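
The download shortcut presumably works along these lines (a sketch; `canvas` is the render target and the filename is an assumption):

```js
window.addEventListener('keydown', (e) => {
  if (e.ctrlKey && e.key === 's') {
    e.preventDefault(); // stop the browser's own save dialog
    const link = document.createElement('a');
    link.download = 'render.png'; // hypothetical filename
    link.href = canvas.toDataURL('image/png');
    link.click();
  }
});
```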

Metals implemented. How are metals different from a typical diffuse material, I hear you ask? Well, instead of scattering light in a nearly random fashion (I implemented a cosine-weighted BSDF for diffuse materials), metals are very smooth and reflect light almost perfectly. This makes them look mirror-like, since incoming light rays with the same direction bounce off in the same direction too. There is also a roughness factor, which offsets each bounce a little, making the metal's surface look slightly fuzzy; that can be used as a stylistic choice.
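
The bounce logic is small; a JS sketch of it (illustrative only, the real version lives in the WGSL shader):

```js
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const add = (a, b) => a.map((c, i) => c + b[i]);
const scale = (v, s) => v.map((c) => c * s);

// Rejection-sample a random unit vector, used to fuzz the reflection.
function randomUnitVector() {
  while (true) {
    const p = [0, 0, 0].map(() => Math.random() * 2 - 1);
    const len2 = dot(p, p);
    if (len2 > 1e-8 && len2 <= 1) return scale(p, 1 / Math.sqrt(len2));
  }
}

// Metal scatter: mirror-reflect about the normal, then offset the result
// by the roughness factor (0 = perfect mirror, larger = fuzzier).
function scatterMetal(dir, normal, roughness) {
  const reflected = add(dir, scale(normal, -2 * dot(dir, normal)));
  return add(reflected, scale(randomUnitVector(), roughness));
}
```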

Honestly, metals were very easy to implement, as it only involved checking whether the material was metallic and changing the BSDF accordingly. I also found a bug in my Sphere object struct, which wasn't taking memory alignment into account. I hadn't found it earlier because I only ever had one sphere.
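
For context on the alignment bug: in WGSL, a `vec3<f32>` is aligned to 16 bytes, so the JS side has to pad each sphere to match the shader's layout. A sketch with an assumed field layout (the actual struct may differ):

```js
// Assumed WGSL layout: struct Sphere { center: vec3f, radius: f32, materialIndex: u32 }
// vec3f is align(16) size(12), so `radius` packs into the vec3's tail slot and
// the struct rounds up to 32 bytes in an array, i.e. a stride of 8 floats.
const FLOATS_PER_SPHERE = 8;

function packSpheres(spheres) {
  const data = new ArrayBuffer(spheres.length * FLOATS_PER_SPHERE * 4);
  const f32 = new Float32Array(data);
  const u32 = new Uint32Array(data);
  spheres.forEach((s, i) => {
    const base = i * FLOATS_PER_SPHERE;
    f32.set(s.center, base);         // floats 0..2
    f32[base + 3] = s.radius;        // float 3: shares the vec3's 16-byte slot
    u32[base + 4] = s.materialIndex; // float 4, reinterpreted as a u32
    // floats 5..7 stay zero: padding so every sphere starts 32 bytes apart
  });
  return data;
}
```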

Spent ages trying to fix NEE and got nowhere. So this is a filler devlog, without much progress. Some code cleanups were made, and I experimented with dynamic scenes instead of static ones. However, since this is a progressive ray tracer (scenes rendered over the course of many frames), there is much more noise when the scene changes every frame. Since we can't reuse the data from previous frames, we are effectively rendering with one spp (sample per pixel), which is incredibly noisy. I might just give up on fixing NEE for the time being and implement something interesting instead, like metals and/or dielectrics.

Implemented the Next Event Estimation (NEE) algorithm. If you look at the previous devlog, you'll see how it's super dark and relatively noisy, as it's pretty rare for a random ray to hit the light. What NEE does is, in addition to shooting out a ray at every intersection (simulating bouncing), shoot out another ray towards a random light source. This brightens the scene a little and reduces noise, as it proactively seeks out the lights instead of waiting for a ray to randomly hit one. Or at least, it's meant to reduce noise, but it doesn't really do that here. I'm almost certain my implementation is a little broken.
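
In sketch form, the extra light sample looks something like this (JS for illustration; `samplePointOnLight` and `occluded` are hypothetical helpers, and the surface's BSDF factor is left out for brevity):

```js
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const sub = (a, b) => a.map((c, i) => c - b[i]);
const scale = (v, s) => v.map((c) => c * s);

// NEE: at each path vertex, explicitly sample a point on a random light.
function sampleDirectLight(hitPoint, normal, lights) {
  const light = lights[Math.floor(Math.random() * lights.length)];
  const { point, emission, pdfArea } = samplePointOnLight(light); // hypothetical
  const toLight = sub(point, hitPoint);
  const dist2 = dot(toLight, toLight);
  const wi = scale(toLight, 1 / Math.sqrt(dist2));
  if (occluded(hitPoint, point)) return [0, 0, 0]; // hypothetical shadow-ray test
  const cosSurface = Math.max(dot(normal, wi), 0);
  const cosLight = Math.max(-dot(light.normal, wi), 0);
  // The geometry term converts the area-measure pdf to solid angle, and the
  // lights.length factor accounts for picking one light uniformly at random.
  const weight = (cosSurface * cosLight * lights.length) / (dist2 * pdfArea);
  return scale(emission, weight);
}
```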

Lighting, finally. Lighting should've been a quick implementation, but it required me to completely change the material system. Honestly, I should've done that ages ago, since previously, every object simply had a colour attached to it. Now, each object has a material index, which points into a materials array passed to the GPU. Also fixed many bugs along the way, such as plane normals sometimes facing the wrong way. Finally, I changed the scene to the famous Cornell Box.
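
A rough sketch of the idea (the field layout and type codes here are assumptions, not necessarily the project's):

```js
// Assumed WGSL layout: struct Material { colour: vec3f, materialType: u32 }
// 16 bytes each, since the u32 fits into the vec3f's padding.
const materials = [
  { colour: [0.73, 0.73, 0.73], materialType: 0 }, // diffuse
  { colour: [15.0, 15.0, 15.0], materialType: 1 }, // light, emission baked into colour
];

const data = new ArrayBuffer(materials.length * 16);
const f32 = new Float32Array(data);
const u32 = new Uint32Array(data);
materials.forEach((m, i) => {
  f32.set(m.colour, i * 4);        // floats 0..2 of this material
  u32[i * 4 + 3] = m.materialType; // float 3, reinterpreted as a u32
});

// Upload once; each object then just carries an index into this array.
const materialBuffer = device.createBuffer({
  size: data.byteLength,
  usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_DST,
});
device.queue.writeBuffer(materialBuffer, 0, data);
```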

Made the rendering loop progressive. I already gave a brief explanation of what that is, but basically: before, 30 rays were being shot out per pixel, per frame. Since that's really expensive, it now shoots only 1 ray per pixel, but accumulates the results across frames, building up the effective sample count over time. Now, instead of <4 fps, I'm getting 80-100 fps, although it does take longer for the image to converge. There's also the issue of floating-point precision, but I'm mostly avoiding that by limiting the number of rerenders.
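
The accumulation itself is just a running average; in JS terms (the real version works on an accumulation buffer in the shader):

```js
// After frameCount accumulated frames, fold in a new 1-spp sample so the
// stored colour stays the average of every sample taken so far.
function accumulate(accumColour, newSample, frameCount) {
  return accumColour.map((c, i) => (c * frameCount + newSample[i]) / (frameCount + 1));
}
```

Equivalently, it's a lerp from the accumulated colour towards the new sample with weight 1/(frameCount + 1), which is also why precision becomes a concern: that weight eventually shrinks below what f32 can resolve.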

Implemented rendering of quads. Even though the entire point of this project was for me to learn the maths behind path tracing, I don't think I want to learn the full derivation of ray-quad intersections. Fortunately, a quad is just a general 2D primitive, and it's really easy to adapt it to other 2D shapes, most notably the triangle. On the flip side, with 30 samples per pixel at a screen resolution of 1920x1080, I'm getting below 4 fps. Next up is probably progressive path tracing (rendering one sample per frame and averaging the result with previous frames). If that isn't enough, then I'll have to optimise properly.
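
A sketch of the test in JS (using the common corner-plus-edge-vectors formulation; whether the project parameterises quads this way is an assumption):

```js
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const sub = (a, b) => a.map((c, i) => c - b[i]);
const cross = (a, b) => [
  a[1] * b[2] - a[2] * b[1],
  a[2] * b[0] - a[0] * b[2],
  a[0] * b[1] - a[1] * b[0],
];

// Quad defined by corner q and edge vectors u, v.
function hitQuad(origin, dir, q, u, v) {
  const n = cross(u, v); // plane normal (unnormalised)
  const denom = dot(n, dir);
  if (Math.abs(denom) < 1e-8) return null; // ray parallel to the plane
  const t = dot(n, sub(q, origin)) / denom;
  if (t < 1e-4) return null; // behind the ray, or a self-intersection
  const p = origin.map((c, i) => c + dir[i] * t);
  // Express the hit point as q + alpha*u + beta*v.
  const w = sub(p, q);
  const invLen2 = 1 / dot(n, n);
  const alpha = dot(cross(w, v), n) * invLen2;
  const beta = dot(cross(u, w), n) * invLen2;
  // A quad needs both coordinates in [0, 1]; a triangle would instead check
  // alpha >= 0, beta >= 0, and alpha + beta <= 1.
  if (alpha < 0 || alpha > 1 || beta < 0 || beta > 1) return null;
  return { t, point: p };
}
```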

Added some basic performance metrics. These include the time per frame spent on the GPU ONLY, and the fps as calculated by 1/frameTime. These aren't super useful now but will come in handy later, when I start optimising. As you can see from the screenshot, with 30 samples per pixel, my crappy laptop is struggling to reach even 40 fps at 1920x1080. But to be fair, over 62 million rays are being shot out per frame (1920 × 1080 × 30 ≈ 62.2 million).
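
For reference, the standard way to get GPU-only timings in WebGPU is timestamp queries; presumably something like this is going on (a sketch, assuming the device was requested with the 'timestamp-query' feature):

```js
const querySet = device.createQuerySet({ type: 'timestamp', count: 2 });
const resolveBuf = device.createBuffer({
  size: 16, // two 64-bit timestamps
  usage: GPUBufferUsage.QUERY_RESOLVE | GPUBufferUsage.COPY_SRC,
});
const readBuf = device.createBuffer({
  size: 16,
  usage: GPUBufferUsage.COPY_DST | GPUBufferUsage.MAP_READ,
});

const encoder = device.createCommandEncoder();
const pass = encoder.beginComputePass({
  // Record a timestamp at the start and end of the compute pass.
  timestampWrites: { querySet, beginningOfPassWriteIndex: 0, endOfPassWriteIndex: 1 },
});
// ... set the pipeline and bind groups, dispatch the path-tracing shader ...
pass.end();
encoder.resolveQuerySet(querySet, 0, 2, resolveBuf, 0);
encoder.copyBufferToBuffer(resolveBuf, 0, readBuf, 0, 16);
device.queue.submit([encoder.finish()]);

await readBuf.mapAsync(GPUMapMode.READ);
const [start, end] = new BigUint64Array(readBuf.getMappedRange());
const gpuFrameMs = Number(end - start) / 1e6; // timestamps are in nanoseconds
readBuf.unmap();
```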

Shading is now ray traced! Currently, it just uses a basic Lambertian BRDF to determine how the rays bounce. Also took the time to add anti-aliasing (currently at 30 samples per pixel) and gamma correction (gamma = 1.5). Gamma is hardcoded for now because I can't be bothered making and binding another buffer.
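
Both additions are tiny; in JS terms (the real code is in the WGSL shader):

```js
// Anti-aliasing: jitter each sample's position within its pixel so that
// averaging many samples smooths out the edges.
function pixelSampleUV(x, y, width, height) {
  return [(x + Math.random()) / width, (y + Math.random()) / height];
}

// Gamma correction: raise each colour channel to 1/gamma before display.
function gammaCorrect(colour, gamma = 1.5) {
  return colour.map((c) => Math.pow(c, 1 / gamma));
}
```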

Implemented some ray traced spheres. Don't get me wrong - rays are being shot into the scene and there is maths calculating the nearest collision, but the lighting isn't ray traced at all; rather, it's just computed with a basic diffuse model, where the sphere's colour is multiplied by the dot product of the surface normal and the light direction. Even though the shading technically isn't ray tracing, it was still useful to implement as proof that my surface normals are correct.
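
That model is a one-liner per pixel; a JS sketch (both vectors assumed to be unit length):

```js
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];

// Basic N·L diffuse: scale the surface colour by the cosine of the angle
// between the surface normal and the direction towards the light.
function shadeDiffuse(colour, normal, lightDir) {
  const nDotL = Math.max(dot(normal, lightDir), 0);
  return colour.map((c) => c * nDotL);
}
```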

Technically have some actual ray tracing now! Rays are being shot out of the camera into the (empty) scene and returning the sky colour, which is a gradient depending on the vertical direction of the ray. Already ran into so many bugs though, including my worst nightmare: memory alignment issues in the shader. Next up is adding some spheres to the scene.
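
The gradient is just a lerp on the ray's vertical component; a sketch (the exact colours are an assumption):

```js
// Map the unit ray direction's y from [-1, 1] to [0, 1], then blend
// from white at the horizon to blue overhead.
function skyColour(rayDir) {
  const t = 0.5 * (rayDir[1] + 1);
  const white = [1, 1, 1];
  const blue = [0.5, 0.7, 1.0];
  return white.map((w, i) => (1 - t) * w + t * blue[i]);
}
```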

Here we go. The start of my second path tracer. Not much to say about this first devlog. All that's happened is boilerplate, really: a framebuffer rendered to by a compute shader, which is then displayed to the screen. People familiar with computer graphics will recognise the classic UV gradient being rendered.
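
For the unfamiliar, the UV gradient is just each pixel's normalised coordinates used directly as colour; the same math in JS:

```js
// Classic UV test pattern: red ramps up with x, green with y.
function uvGradient(x, y, width, height) {
  return [x / width, y / height, 0, 1]; // rgba
}
```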
