Deep compositing is one of the most anticipated features of the OpenEXR 2.0 integration with major render packages. Rendering deep image data is currently supported only by RenderMan and Mantra. Arnold can also render deep data, but that capability is not publicly available yet. V-Ray, on the other hand, has a "deep pixels" plugin that can write this type of data from Maya and 3ds Max, as well as a "deep reader" plugin for Nuke. In this tutorial I am going to show you how you can write deep images with Maya/3ds Max and V-Ray.
Installing the V-Ray plugins
Download the plugins from the Chaos Group forum and follow the instructions in the "readme.txt" to install them. In Maya 2014 you don't need to replace any files, because the bundled V-Ray plugin, "vray_CameraStereoscopic.dll" in "autodesk\maya2014\vray\vrayplugins", is already updated to write deep images. In 3ds Max, replace the "vraystereoscopicNNNN.dlo" plugin in vrayplugins with the one provided.
Now that you can render out deep images, you need to be able to load them into Nuke. For that, place the Nuke reader plugin, "vrstReaderDeep.dll", in your Nuke plugin directory. This plugin won't add any special Read node to Nuke; instead, it lets you read V-Ray deep images with Nuke's native deep nodes.
The 3D scene
My scene is just a simple city made with the Qtown script. I divided it into render layers: one for the floor, one for the front buildings, and one for the back buildings.
How to render deep
If you are working with Maya, open the Render Settings, go to the VRay Common tab, scroll down until you find "Post-translate Python script", and enter this code:
from vray.utils import *

p = create("OutputDeepWriter", "deepWriter")
p.set("file", "path/to/deepfilename.vrst")

# disable normal output to prevent errors
findByType("SettingsOutput")[0].set("img_file", "")
Basically, this script overrides the current output settings and renders out a ".vrst" file containing your deep data along with the beauty render and any other render elements you may have in your scene.
In case you want to work with EXRs, you can easily convert the ".vrst" file to ".exr" with the "vrst2exr" tool by dragging the image file onto vrst2exr.exe. Both ".vrst" and ".exr" files read fine with the DeepRead node inside Nuke.
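Dragging files one by one gets tedious for whole image sequences. Since drag-and-drop on Windows simply passes the file path as the first command-line argument, the converter can presumably be driven from a script as well (an assumption on my part; check the tool's readme for its actual arguments). A minimal Python sketch that builds one conversion command per frame:

```python
import glob
import os

def build_convert_commands(folder, exe="vrst2exr.exe"):
    """Build one vrst2exr command per .vrst file in `folder`.

    Assumes the converter takes the input file as its first
    argument, mirroring drag-and-drop behaviour (unverified).
    """
    vrst_files = sorted(glob.glob(os.path.join(folder, "*.vrst")))
    return [[exe, f] for f in vrst_files]
```

The resulting command lists could then be executed one by one with `subprocess.run(cmd)`.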
For 3ds Max users: create a "VRayStereoscopic" helper (Helpers >> VRay >> VRayStereoscopic), set the shademap mode to "Render shade map", turn on the "adjust resolution" and "Deep pixel mode" options, and select a file location. Render the scene and you should find your shademap (deep image) at the selected location.
Reading deep images in Nuke
Now create your DeepRead nodes and combine them with DeepMerge. Here is where you can see the deep data goodness: your images merge with each other without the need for any mattes/masks. The order in which you merge them doesn't matter, because the deep data in each file stores samples for every pixel along the z axis.
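To see why the merge order is irrelevant, it helps to think of each deep pixel as a list of samples at different depths; merging just pools those samples, and re-sorting them by depth afterwards erases any trace of input order. A toy illustration (a conceptual sketch, not Nuke's actual implementation), where a sample is a (depth, alpha) pair:

```python
def deep_merge(*pixels):
    """Merge deep pixels by pooling their samples and sorting by depth.

    Each pixel is a list of (depth, alpha) samples. Because the result
    is re-sorted by depth, the order of the inputs doesn't matter --
    which is why DeepMerge needs no mattes and no particular input order.
    """
    merged = [sample for pixel in pixels for sample in pixel]
    return sorted(merged)

front = [(1.0, 0.5)]              # samples from one DeepRead
back = [(5.0, 1.0), (3.0, 0.2)]   # samples from another

# Swapping the inputs yields the exact same deep pixel.
assert deep_merge(front, back) == deep_merge(back, front)
```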
You can see your deep render pass in the channels viewer, or by loading it into the alpha channel of the viewer and hovering the mouse over your image.
Exploring other deep nodes
With the DeepToPoints node you can see a point cloud of your scene inside the Nuke 3D viewer, which can be really useful for placing Nuke cards/OBJs. For this you will need to export the camera from your 3D app and load it into Nuke.
Now place some Nuke 3D objects in your scene so we can combine them later with our deep image.
Combine those objects in a Scene node and render them with a ScanlineRender. Now we can extract the deep data from that image with the DeepFromImage node. Back in "deep space", we can merge our existing comp with the newly generated image, and again you won't need any masks to combine the images; both have deep data and "know" how to merge with each other.
You can play with the position of your objects and you will see how they get integrated into the scene, behind or in front of your original objects, without any roto.
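This "behind or in front" behaviour falls out of how deep samples are flattened back to a regular image: sort by depth, then composite front to back with the over operator, so nearer samples automatically occlude farther ones. A simplified single-channel sketch (an assumed model for illustration, not Nuke's code):

```python
def flatten(samples):
    """Flatten (depth, color, alpha) samples front-to-back with `over`."""
    color_out, alpha_out = 0.0, 0.0
    for _depth, color, alpha in sorted(samples):
        # Standard over: what is already accumulated occludes what's behind.
        color_out += (1.0 - alpha_out) * color * alpha
        alpha_out += (1.0 - alpha_out) * alpha
    return color_out, alpha_out

building = [(10.0, 0.8, 1.0)]   # opaque city sample at depth 10
card = [(5.0, 0.2, 1.0)]        # opaque Nuke object placed in front

print(flatten(building + card))  # the card fully hides the building
```

Moving the card's depth beyond 10.0 makes the building win instead; no matte ever enters the equation.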
As you can see, there are a lot of possibilities with deep compositing, and I have only "scratched the surface". With more complex scenes, fluid effects and such, you can really take advantage of this workflow. As a final tip, you can use the DeepRecolor node to combine a V-Ray deep pass with a beauty render from another render engine that doesn't support deep data.