I was interested in learning about inverse rendering and Mitsuba 3.0. For this project I reproduced work from the Realistic Graphics Lab (RGL) at EPFL, Switzerland. I'm currently looking at setting up resin 3D printing to produce these lenses in real life.
Here are some results I rendered of caustic lenses produced with inverse rendering:
Skip to 4: Custom Caustics for the interesting stuff.
0_installation.ipynb contains the commands used to install Mitsuba, Dr.Jit, and ipywidgets.
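For reference, the install boils down to two pip commands (Mitsuba 3 is distributed as prebuilt wheels on PyPI, and pulls in Dr.Jit as a dependency):

```shell
# Mitsuba 3 from PyPI; this also installs the matching drjit wheel
pip install mitsuba

# ipywidgets enables the interactive widgets used in the notebooks
pip install ipywidgets
```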
1_hello world.ipynb contains the "Hello World" equivalent of graphics: rendering a Cornell box.
2_mitsuba quickstart - Learned Mitsuba features, such as variants. Downloaded the scene files provided by Mitsuba, then rendered a modified Cornell box and saved it as a PNG file.
3_gradient based optimization.ipynb - Solved a basic gradient-based optimization problem. Began by rendering a Cornell box, using the traverse function to list the modifiable scene parameters, changing the wall colour, and re-rendering the scene. Then, given an image of a Cornell box and a Cornell box scene with a modified wall colour, used gradient descent to restore the original wall colour: select the scene parameters to optimize (the wall colour), render the scene, compute the mean squared error between the rendered image and the original image, use this difference to update the scene, and repeat until reaching a desired error threshold or number of iterations.
Here's an image of some caustics projected onto my desk:
I want to produce a lens that projects caustics in the shape of an image. Start by setting up the scene as follows:
Use a heightmap to control the shape of the lens, and use gradient descent to optimize the heightmap.
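The real notebook differentiates through the renderer itself. As a deliberately tiny stand-in (pure NumPy, my own simplification, not the RGL code), the sketch below replaces the renderer with a fixed blur of the heightmap and fits the heightmap so the blurred result matches a target pattern; the loop structure (forward pass, MSE, gradient, update) mirrors the real optimization:

```python
import numpy as np

kernel = np.array([0.25, 0.5, 0.25])          # symmetric, so it is its own adjoint
target = np.sin(np.linspace(0.0, np.pi, 64))  # toy "caustic" pattern to match
height = np.zeros(64)                         # heightmap, initially flat

def forward(h):
    # Stand-in for the differentiable renderer: a fixed blur of the heightmap.
    return np.convolve(h, kernel, mode='same')

losses = []
lr = 10.0
for _ in range(200):
    img = forward(height)
    residual = img - target
    losses.append(float(np.mean(residual ** 2)))  # mean squared error
    # Analytic gradient of the MSE: the adjoint blur of the residual
    # (identical to the forward blur because the kernel is symmetric).
    grad = 2.0 * np.convolve(residual, kernel, mode='same') / residual.size
    height -= lr * grad                           # gradient-descent update
```

The loss drops by orders of magnitude over the 200 steps; in the real problem the forward model is a light simulation and the gradient comes from Dr.Jit's automatic differentiation rather than a hand-derived adjoint.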
Here are some visualizations of the lens shape being optimized:
5_showcase.ipynb contains some code I wrote to render a demonstration of the caustic lens moving in and out of the focal plane.
