Physically Based Rendering
I had a great discussion with Tim Davison, Evil Genius and Ph.D. candidate in the Computer Science department. We also happen to share an office, so it makes for some excellent deep dives into theory and the current state of game engines, AR, VR and sometimes Sci-fi. I’ve been obsessed with Octane rendering in Unity, attempting to replicate real-world objects and place them in augmented environments. Problems occur with lighting conditions, and lightmaps have to be baked for playback performance because no current mobile device can render in real time what the most advanced GPUs can. PBR, or physically based rendering, treats light as it behaves in the natural world: light from every source in an environment is simulated and path traced so that it accurately re-creates shadows, reflections, etc.
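The core idea — integrating the light arriving from every direction over the hemisphere above a surface point — can be sketched as a tiny Monte Carlo estimator. This is an illustrative Python toy for a single diffuse (Lambertian) point under a constant environment light, not Octane’s actual implementation; all names here are my own:

```python
import math, random

def sample_uniform_hemisphere(rng):
    # Uniformly sample a direction on the unit hemisphere around the
    # surface normal (0, 0, 1); pdf = 1 / (2*pi).
    u1, u2 = rng.random(), rng.random()
    z = u1                           # cos(theta), uniform in [0, 1)
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), z)

def estimate_outgoing_radiance(albedo, env_radiance, n_samples, seed=0):
    """Monte Carlo estimate of the rendering equation at one diffuse point:
    L_o = integral over the hemisphere of (albedo/pi) * L_i(w) * cos(theta) dw.
    Here L_i is a constant environment light, so the exact answer is
    albedo * env_radiance."""
    rng = random.Random(seed)
    brdf = albedo / math.pi          # Lambertian BRDF
    pdf = 1.0 / (2.0 * math.pi)      # uniform hemisphere pdf
    total = 0.0
    for _ in range(n_samples):
        wx, wy, wz = sample_uniform_hemisphere(rng)
        cos_theta = wz               # angle to the normal (0, 0, 1)
        total += brdf * env_radiance * cos_theta / pdf
    return total / n_samples

# With albedo 0.8 under a unit environment light, the estimate should
# converge toward 0.8 as the sample count grows.
print(estimate_outgoing_radiance(0.8, 1.0, 200_000))
```

A real path tracer repeats this at every bounce, following rays recursively through the scene — which is exactly why it is so expensive on mobile hardware and why lightmaps get baked offline instead.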
Previously, Tim pointed out the issue of true blacks in AR HMDs: they are washed out by the way light is reflected onto the see-through materials used to overlay the virtual imagery. I’m not a fan of having to hold a mobile device as the interface to AR/MR/XR because it breaks the plane between the natural and the virtual too easily. Tim suggested doing the whole thing in VR and skipping AR entirely because of these shortcomings. This idea has become more attractive as I begin to model and render samples. An intermediary could be the Zed Mini, using passthrough cameras with low latency, but Tim said, ‘why not control everything in VR and tell participants it’s AR’.
Reading I did over the Xmas break suggested that XR is the future: AR and VR will merge, and MR will become the standard. This is easily foreseeable, and part of what I’m doing is researching future tech and possibilities as they relate to discerning the natural from the virtual. Tim explained that part of the thesis can be what may happen in the future.
Time to dig even deeper into Unity, Cinema 4D and Octane!