This year, Monique Dewanchand and I were able to attend SIGGRAPH in Vancouver. Together we run a small company called At Mind, doing lots of Blender development and projects. Below is my report on the state of open source at SIGGRAPH.
In the past years, several VFX companies have opened up parts of their pipelines. This has resulted in open source projects where they tackle their common challenges.
We went to several Open Source BoF meetings and met the people and studios behind the projects.
USD / Hydra
When I first heard of Universal Scene Description, I thought it was just a file format for data exchange that was capable of loading your scene very fast. At SIGGRAPH I saw the real power of USD and the problems it aims to solve.
USD is positioned as the data backbone for studios. Studios can use it from layout to final rendering; it is optimized for artistic iteration and even makes it possible for multiple artists to work on the same asset at the same time. USD can be used to transfer data between departments and to integrate the results of their work back easily.
When working with large scenes in Blender, you need to load all the data. Blender controls every asset that is loaded, which makes working with huge scenes slower than you would want it to be.
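The layering idea behind this can be illustrated with a toy sketch in plain Python. This is not the real pxr API; the names and the composition rule are simplified assumptions. Each department contributes a sparse layer of "opinions", and composing the layer stack yields the final asset, so multiple artists can edit the same asset without touching each other's files:

```python
# Toy illustration of USD-style layer composition (NOT the real pxr API).
# Each layer holds sparse "opinions" (attribute values); composing the
# stack produces the final asset.

def compose(layers):
    """Compose layers from weakest to strongest: later layers win."""
    result = {}
    for layer in layers:  # weakest layer first
        result.update(layer)
    return result

base = {"radius": 1.0, "color": "gray"}   # modelling department
shading = {"color": "red"}                # look-dev overrides only the color
anim = {"radius": 1.2}                    # animation overrides only the radius

print(compose([base, shading, anim]))     # {'radius': 1.2, 'color': 'red'}
```

Real USD composition is considerably richer (sublayers, references, variants), but the principle of a strongest opinion winning per attribute is the same.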
Hydra is a viewport renderer that tools can integrate to display USD scenes. You are able to select which renderer draws the scene to your display. It has an OpenGL back-end, but back-ends for interactive renderers can also be developed. I would like to see Cycles hooked up with Hydra.
USD/Hydra has its own dependency graph that is optimized for large scenes. I saw a demonstration where a Hydra viewport integrated in Maya displayed a very complex VFX scene. Only the part the artist was working on (an armature) was loaded in Maya, so it remained very responsive.
In theory Hydra fits with Blender's 2.8 viewport concept. There are technical challenges; for example, Hydra requires OpenGL 4.0 to run. Further research on this subject will probably find a solution.
MaterialX / ShaderX / OSL / Gaffer
When using multiple applications materials can be rendered differently. MaterialX (http://www.materialx.org/) addresses the lack of a common, open standard for representing the data values and relationships required to transfer the complete look of a model from one application or rendering platform to another. This includes shading setup, patterns and texturing, nested materials and geometric assignments.
MaterialX describes materials. A material can also have multiple looks, for example one for dry conditions and one for wet conditions. MaterialX is integrated in many tools nowadays, which gives a similar representation of materials across these different tools.
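As a rough illustration of the dry/wet example, here is a sketch that builds a MaterialX-like document using only Python's standard library. The element names (material, shaderref, bindinput, look, materialassign) follow my reading of the MaterialX 1.36 specification, and the shader node and geometry path are made up; treat this as a schematic rather than a validated document:

```python
# Sketch of a MaterialX-style document: two material variants and two
# looks that assign them. Element names approximate the MaterialX 1.36
# spec; "standard_surface" and "/car/body" are illustrative placeholders.
import xml.etree.ElementTree as ET

root = ET.Element("materialx", version="1.36")

# One material per condition, differing only in roughness.
for name, rough in [("paint_dry", "0.4"), ("paint_wet", "0.05")]:
    mat = ET.SubElement(root, "material", name=name)
    sr = ET.SubElement(mat, "shaderref", name="sr", node="standard_surface")
    ET.SubElement(sr, "bindinput", name="specular_roughness",
                  type="float", value=rough)

# Each look assigns one of the materials to the same geometry.
for look_name, mat_name in [("dry", "paint_dry"), ("wet", "paint_wet")]:
    look = ET.SubElement(root, "look", name=look_name)
    ET.SubElement(look, "materialassign", name="ma",
                  material=mat_name, geom="/car/body")

print(ET.tostring(root, encoding="unicode"))
```

Switching a shot from dry to wet then means selecting a different look, not editing every shader by hand.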
MaterialX has some criteria before it can be used: you need to support a common set of BxDF shaders across the tools you use, otherwise there might be visual differences. The MaterialX library currently provides OSL definitions for some basic nodes to make them straightforward to implement. In the future they hope that GLSL definitions and other common shading languages can be made available.
MaterialX has a well-defined set of standard nodes for texturing, including a reference implementation. For shading, however, there is no description or standard library: surface, light and volume shaders are treated in MaterialX as black boxes. ShaderX is an extension for MaterialX. ShaderX defines building blocks for describing shaders, as well as supplying functionality to convert these shaders to OSL/GLSL source code. This code can then be compiled and used by applications and renderers.
A number of studios nowadays use Gaffer (http://www.gafferhq.org/). I had never heard of Gaffer, but met one of its developers at Beers of a Feather. He described Gaffer as an open source project for look development. I was interested, as I worked on the look development in Blender 2.8.
Gaffer provides a node-based configuration of your scene, render layers, material assignments, and the triggering of rendering or compositing tasks. The configuration can be shared between scenes: for example, a character will always be rendered in the character render layer, which saves a lot of time setting up your renders. Blender also has render layers, but they need to be set up per scene. A node-based system to configure the render layers and when the renderer/compositor/sequencer is called would give Blender a lot of extra flexibility.
OpenColorIO (OCIO)
In the past years not much has happened to OCIO. But nowadays even Autodesk is replacing their internal color management with OCIO. The changes they made will be released with the v2.0 release.
In previous versions there is a difference between the GPU and CPU implementations of OCIO. The GPU uses a baked LUT, but the CPU interprets the color configuration directly. V2.0 gives you the choice on the GPU between using a baked LUT or evaluating the color configuration directly.
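The practical difference can be shown with a toy example in plain Python (this is not the OCIO API): baking a transfer curve into a coarse 1D LUT and interpolating it introduces errors compared to evaluating the transform directly, especially near black where gamma-like curves are steep.

```python
# Toy example of baked-LUT error (NOT the OCIO API): bake a simple
# gamma curve into a coarse 1D LUT and compare interpolated lookups
# with direct evaluation of the transform.

def transfer(x):
    """Toy 'color transform': a plain 1/2.2 gamma curve."""
    return x ** (1.0 / 2.2)

N = 16  # deliberately coarse LUT to make the error visible
lut = [transfer(i / (N - 1)) for i in range(N)]

def apply_lut(x):
    """Piecewise-linear lookup into the baked LUT."""
    pos = x * (N - 1)
    i = min(int(pos), N - 2)
    t = pos - i
    return lut[i] * (1 - t) + lut[i + 1] * t

x = 0.01
print(transfer(x), apply_lut(x))  # near black the LUT deviates noticeably
```

With a direct evaluation of the transform, as OCIO v2.0 allows on the GPU, this class of error disappears (at the cost of doing more work per pixel).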
Studios need to create their own OCIO configuration. Most parts of these configurations are exactly the same between studios, for example the OCIO information for a camera or standard color spaces. The studios want to address this duplication. Perhaps it won't be part of OCIO itself, as it does not fit in the scope of OCIO.
VFX Reference Platform
The VFX Reference Platform (https://www.vfxplatform.com/) is something completely different. It is not an open source project, but an attempt by the industry to standardize on software versions, to ensure better binary compatibility. Ideally, studios can develop add-ons that run in multiple applications from a single code base or binary.
The power of this kind of platform is that the industry comes together and talks about very difficult topics, like the migration from Python 2 to Python 3. Blender did it years ago (Blender 2.5, 2009), but in my experience this is a very difficult task. I was impressed by how it was discussed between studios and software vendors. The result: in 2019 the major software packages will support both Python 2 and 3, so the studios can start migrating their pipelines. The aim is for Python 3 to be the standard in 2020.
For Blender this reference platform is something we could look into so we can support studios better.