This is a very quick demo of how to install (on Mac) and use the mmColorTarget gizmo, or at least how I use it in my texturing/reference and lighting process. The gizmo itself was created by Marco Meyer.
IBL and sampling in Clarisse /
Using IBLs with huge dynamic ranges for natural light (sun) is just great. They give you very consistent lighting conditions, and the behaviour of the shadows is fantastic.
But sampling those massive values can be a bit tricky sometimes. Your render will have a lot of noise and artifacts, and you will have to resort to tricks like creating cropped versions of the HDRIs or clamping values in Nuke.
Fortunately, in Clarisse we can deal with this issue quite easily.
Shading, lighting and anti-aliasing are completely independent in Clarisse. You can tweak one of them without affecting the others, saving a lot of rendering time. In many renderers shading sampling is multiplied by anti-aliasing sampling, which forces users to tweak all the shaders in order to get decent render times.
- We are going to start with this noisy scene.
- The first thing you should do is change the Interpolation Mode to MipMapping in the Map File of your HDRI.
- Then we need to tweak the shading sampling.
- Go to the raytracer and activate previz mode. This will remove lighting information from the scene; all the noise left comes from the shaders.
- In this case we get a lot of noise from the sphere. Just go to the sphere's material and increase the reflection quality under sampling.
- I increased the reflection quality to 10 and can't see any noise in the scene any more.
- Select the raytracer again and deactivate previz mode. All the remaining noise now comes from lighting.
- Go to the gi monte carlo and disable affect diffuse. With this off, GI won't contribute to the lighting, so only direct lighting is left. If you see some noise, just increase the sampling of your direct lights.
- Go to the gi monte carlo and re-enable affect diffuse. Increase the quality until the noise disappears.
- The render is noise free now, but it still looks a bit low-res; this is because of the anti-aliasing. Go to the raytracer and increase the samples. Now the render looks just perfect.
- Finally, there is a global sampling setting that you usually won't have to touch. But just for your information: with shading oversampling set to 100%, the shading rays are multiplied by the anti-aliasing samples, like in most of the render engines out there (see the sketch after this list). This helps refine the render, but render times increase quite a bit.
- Now, if you want quick and dirty results for look-dev or lighting, just play with the image quality. You will not get pristine renders, but they will be good enough for establishing looks.
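To make that oversampling point a bit less abstract, here is a rough back-of-the-envelope comparison in Python. The numbers are only illustrative (they are not Clarisse's real internal ray counts); the idea is simply that decoupled sampling adds the costs, while 100% oversampling multiplies them:

# Rough ray budget per pixel; numbers are illustrative, not Clarisse internals.
aa_samples = 8            # raytracer anti-aliasing samples
reflection_quality = 10   # material reflection sampling quality
light_samples = 12        # direct light / gi_monte_carlo sampling quality

# Decoupled sampling (Clarisse's default): each domain is paid for independently.
decoupled = aa_samples + reflection_quality + light_samples

# Shading oversampling at 100%: shading/lighting rays fire for every AA sample,
# which is roughly what coupled render engines do.
oversampled = aa_samples * (reflection_quality + light_samples)

print("decoupled   ~", decoupled, "rays/pixel")    # ~ 30
print("oversampled ~", oversampled, "rays/pixel")  # ~ 176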
Image Based Lighting in Clarisse /
I've been using Isotropix Clarisse in production for a little while now. Recently the VFX facility where I work announced that Clarisse will be its primary look-dev and lighting tool, so I decided to start writing about this powerful raytracer on my blog.
Today I'm writing about how to set up Image Based Lighting.
- We can start by creating a new context called ibl. We will put all the elements needed for ibl inside this context.
- Now we need to create a sphere to use as "world" for the scene.
- This sphere will be the support for the equirectangular HDRI texture.
- I just increased the radius a lot. Keep in mind that this sphere needs to contain all the assets in your scene.
- In the image view tab we can see the render in real time.
- Right now the sphere is lit by the default directional light.
- Delete that light.
- Create a new matte material. This material won't be affected by lighting.
- Assign it to the sphere.
- Once assigned the sphere will look black.
- Create an image to load the HDRI texture.
- Connect the texture to the color input of the matte shader.
- Select the desired HDRI map in the texture path.
- Change the projection type to "parametric" (there is a short sketch of the underlying lat-long mapping after this list).
- HDRI textures are usually 32-bit linear images, so you need to indicate this in the texture properties.
- I created two spheres to check the lighting. Just press "f" to fit them in the viewport.
- I also created two standard materials, one for each sphere. I'm creating lighting checkers here.
- And a plane, just to check the shadows.
- If I go back to the image view, I can see that the HDRI is already affecting the spheres.
- Right now only the secondary rays, like reflections, are being affected.
- In order to create proper lighting, we need to use a light called "gi_monte_carlo".
- Right now the noise in the scene is insane. This is because of all the high-frequency detail in the HDRI map.
- The first thing to do to reduce noise is to change the interpolation of the texture to MipMapping.
- To have a noise free image we will have to increase the sampling quality of the "gi_monte_carlo" light.
- Noise reduction can also be managed with the anti-aliasing sampling of the raytracer.
- The most common approach is to combine raytracer sampling, lighting sampling and shading sampling.
- Around 8 raytracing samples and around 12 lighting samples are common settings in production.
- There is another method to do IBL in Clarisse without the cost of GI.
- Delete the "gi_monte_carlo" light.
- Create an "ambient_occlusion" light.
- Connect the HDRI texture to the color input.
- In the render only the secondary rays are affected.
- Select the environment sphere and deactivate the "cast shadows" option.
- Now everything works fine.
- To clean the noise increase the sampling of the "ambient_occlusion" light.
- This is a cheaper IBL method.
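As promised above, a side note on why a simple sphere with a parametric projection works as the "world": an equirectangular (lat-long) HDRI just maps a direction to a UV coordinate. Here is a minimal Python sketch of that mapping (only the underlying math, not Clarisse code; the exact orientation convention varies per application):

import math

def latlong_uv(x, y, z):
    # Map a normalized direction to (u, v) on an equirectangular (lat-long) image.
    # u covers the full 360 degrees of longitude, v the 180 degrees of latitude,
    # which is why one lat-long HDRI can wrap a whole environment sphere.
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)   # longitude -> [0, 1]
    v = 0.5 - math.asin(y) / math.pi                # latitude  -> [0, 1]
    return u, v

print(latlong_uv(0.0, 1.0, 0.0))   # straight up -> v = 0.0 (top row of the image)
print(latlong_uv(0.0, 0.0, -1.0))  # forward     -> centre of the image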
Normalize textures in Softimage /
Just a quick video tutorial where I talk about my process for normalizing textures in Softimage. Spanish audio.
Would you like to see my tutorials in English? Drop me a line.
Cheers.
Love Vray's IBL /
When you work for a big VFX or animation studio, you usually light your shots with complex light rigs, often developed by highly talented people.
But when you are working at home, for small studios, on freelance tasks or whatever else, you need to simplify your techniques and try to reach the best quality you can.
For those reasons, I have to say that I’m switching from Mental Ray to V-Ray.
One of the features that I most love about V-Ray is the awesome dome light to create image based lighting setups.
Let me tell you a couple of things that make this dome light so great.
- First of all, the technical setup is incredibly simple. Just a few clicks: activate a linear workflow, correct the gamma of your textures and choose a nice HDRI image.
- It is quick and simple to reduce the noise generated by the HDRI image. Increasing the maximum subdivisions and decreasing the threshold should be enough. Something between 25 and 50, or 100 at most, as max subdivisions works in common situations, and around 0.005 is a good value for the threshold (see the sketch after this list).
- Render times are very fast for raytraced effects.
- Even with global illumination the render times are more than acceptable.
- Displacement, motion blur and that kind of heavy stuff is also welcome.
- Another thing I love about the dome light with HDRI images is the great quality of the shadows. Usually you don't need to add direct lights to the scene; if the HDRI is good enough you can match the footage quickly and accurately enough.
- The dome light has parameters to control the orientation of your HDRI image, and it is quite simple to get a nice preview in Maya's viewport.
- In all the renders you can see here, you probably noticed that I'm using an HDRI image with "a lot" of different lighting points, around 12 different lights in the picture. In this example I put a black color on the background and replaced all the lights with white spots. It is a good test to get a better idea of how the dome light treats direct lighting. And it is great.
- The natural light is soft and nice.
- These are some of the key points why I love V-Ray's dome light :)
- On the other hand, I don't like doing look-dev with the dome light. It is really, really slow, so I can't recommend it for that kind of task.
- The trick is to turn off your dome light and create a traditional IBL setup using a sphere and direct lights, or to plug your HDRI image into V-Ray's environment and turn on global illumination.
- Work there on your shaders and then move on to the dome light again.
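About those subdivisions and threshold values: V-Ray's subdivs roughly translate into subdivs squared actual samples, and Monte Carlo noise falls off with the square root of the sample count, so doubling the subdivs roughly halves the noise. A tiny illustrative Python sketch of that relationship (the maths, not V-Ray internals):

import math

# V-Ray "subdivs" roughly mean subdivs ** 2 actual samples, and Monte Carlo
# noise scales with 1 / sqrt(samples), so doubling subdivs halves the noise.
for subdivs in (8, 25, 50, 100):
    samples = subdivs ** 2
    relative_noise = 1.0 / math.sqrt(samples)
    print("subdivs", subdivs, "->", samples, "samples, relative noise ~", round(relative_noise, 3))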
My favourite V-Ray passes /
Recently, working with V-Ray, I realized these are the render passes I use most often.
A simple scene, simple asset, simple texturing, shading and lighting, just to show my render passes and pre-compositing setup.
- Global Illumination
- Direct lighting
- Normals
- Reflection
- Specular
- Z-Depth
- Occlusion
- Snow (or up/down)
- UVs
- XYZ (or global position)
(Renders shown: RGB, GI, direct lighting, normals, occlusion, reflection, snow, specular, UVs, XYZ global position, and the final slapcomp; see the rebuild sketch below.)
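The slapcomp itself is just an additive rebuild of the lighting passes. Here is a minimal NumPy sketch of the idea; the pass names are placeholders and the exact combination depends on your V-Ray setup, so treat it as a sketch rather than a drop-in script:

import numpy as np

def slapcomp(passes):
    # Rebuild a quick beauty from additive V-Ray lighting passes.
    # `passes` maps pass names to float arrays of shape (height, width, 3), linear.
    beauty = (passes["gi"]
              + passes["direct_lighting"]
              + passes["reflection"]
              + passes["specular"])
    if "occlusion" in passes:
        # Optional: multiply occlusion on top as a cheap artistic grade.
        beauty = beauty * passes["occlusion"]
    return beauty

# Tiny synthetic example, just to show the call:
dummy = {name: np.full((4, 4, 3), 0.1)
         for name in ("gi", "direct_lighting", "reflection", "specular")}
print(slapcomp(dummy).shape)  # (4, 4, 3)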
Cinematographer Style /
I found this series of videos on YouTube. They are simply amazing; if you have the chance, take some time to watch them.
Linear Workflow in Maya with Vray 2.0 /
I'm starting a new project with V-Ray 2.0 for Maya. I have never worked with this render engine before, so first things first.
One of my first tasks is to create a nice neutral light rig for testing shaders and textures. Setting up a linear workflow is one of my priorities at this point.
Below is a quick way to set this up.
- Set up your gamma. In this case I'm using 2.2.
- Enable "don't affect colors" if you want to keep the render linear and apply the gamma correction in post; leave it off to bake the correction into the final render. Either way, no big deal.
- The linear workflow option was created by Chaos Group to fix old V-Ray scenes that don't use a linear workflow. You shouldn't use it at all.
- Click on affect swatches to see color pickers with the gamma applied.
- Once you are working with the gamma applied, you need to correct your color textures. There are two different ways to do it.
- First option: add a gamma correction node to each color texture node. In this case I'm using gamma 2.2, which means I need to use a value of 0.455 (1/2.2) on my gamma node (there is a scripted sketch of this after the list).
- Second option: instead of using gamma correction nodes for each color texture node, you can select the texture node and add a V-Ray attribute to control this.
- By default all texture nodes are read as linear. Change your color textures to be read as sRGB.
- Click on "view as sRGB" in the V-Ray frame buffer, otherwise you'll see your renders in the wrong color space.
- This is the difference between rendering with the "don't affect colors" option enabled or disabled. As I said, no big deal.
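If you prefer to script the first option, the gammaCorrect utility node can be created and wired with a few maya.cmds calls. A minimal sketch, assuming a file texture and a material with a color input; the node names are just examples:

import maya.cmds as cmds

def insert_gamma_correct(file_node, material, gamma=0.455):
    # Insert a gammaCorrect node between a file texture and a material's color.
    # 0.455 ~= 1/2.2, so the sRGB-encoded texture ends up linear before shading.
    gc = cmds.shadingNode("gammaCorrect", asUtility=True)
    for axis in "XYZ":
        cmds.setAttr("{0}.gamma{1}".format(gc, axis), gamma)
    cmds.connectAttr(file_node + ".outColor", gc + ".value", force=True)
    cmds.connectAttr(gc + ".outValue", material + ".color", force=True)
    return gc

# Usage (node names are just examples):
# insert_gamma_correct("fileTexture1", "myMaterial1")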
Linear Workflow in Softimage /
A walkthrough video about setting up a linear workflow in Softimage. The audio is only in Spanish, but it's quite simple to follow by watching the movie.
Physical Sun and Sky and Linear Workflow /
- First of all activate Mental Ray in the Rendering Options.
- Create a Physical Sun and Sky system.
- Activate Final Gather. For now, selecting the Preview Final Gather preset should be enough; it's just for testing purposes.
- Check that the mia_exposure_simple lens shader has been added to the camera, and check that its gamma is set to 2.2.
- Launch a render and you'll see that everything looks washed out.
- We need to add a gamma correction node after each texture node, even procedural color shaders.
- Connect the texture file's outColor to the gamma correct node's value, then connect the gamma correct node's outValue to the shader's diffuse.
- Use the value 0.455 in the gamma node.
- The gamma correction for sRGB devices (with a gamma of approximately 2.2) is 1/2.2 = 0.4545. If your texture files are gamma corrected for gamma 2.2, put 0.455 into the Gamma attribute text boxes (there is a quick check of this after the list).
- If you launch a render again, everything should look fine.
- Once you are happy with the look of your scene, to do a batch render you need to set the gamma value of the camera lens shader back to 1.
- Under the quality tab, in the framebuffer options, select RGBA float, set the gamma to 1 and the colorspace to raw.
- Render to OpenEXR and that's it.
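The 0.455 on the textures and the 2.2 on the lens shader are simply inverse operations, which is why shading happens on linear values while the monitor still sees something close to sRGB. A quick Python check of the round trip:

# Gamma 0.455 (= 1/2.2) on the texture effectively raises the encoded value to ~2.2,
# and the gamma 2.2 lens shader / view transform applies the ~1/2.2 power back,
# so the two cancel out: shading is linear, the monitor still sees ~sRGB.
value = 0.5                       # an sRGB-encoded texture value
linear = value ** 2.2             # what the 0.455 gamma node feeds the renderer
display = linear ** (1.0 / 2.2)   # what the gamma 2.2 lens shader shows
print(round(display, 6))          # ~0.5 again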
Black holes with final gather contribution /
Black holes are a key feature in 3D lighting and compositing, but black holes with bounced information are super!
- Apply the Mental Ray production shader called "mip_rayswitch_advanced" to your black hole object.
- In the "eye" channel, connect a surface shader with its "out_matte_opacity" parameter set to pure black.
- In the Final Gather input, connect the original shader of your object (a Blinn shader, for example); see the wiring sketch below.
Beauty channel.
Alpha channel.
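For reference, here is a rough maya.cmds sketch of that wiring. The mip_rayswitch_advanced attribute names ("eye", "finalgather") and the shading group connection follow the mental ray production shader declarations and may differ in your Maya build, so treat this as a guide rather than a drop-in script:

import maya.cmds as cmds

# Black hole that still contributes its original look to final gather.
# Assumes the production shaders are enabled and that the attribute names
# ("eye", "finalgather", "miMaterialShader", "outValue") match your build.
rayswitch = cmds.createNode("mip_rayswitch_advanced", name="blackhole_switch")

# Eye rays: a surface shader with black color and black matte opacity.
black = cmds.shadingNode("surfaceShader", asShader=True, name="blackhole_srf")
cmds.setAttr(black + ".outColor", 0, 0, 0, type="double3")
cmds.setAttr(black + ".outMatteOpacity", 0, 0, 0, type="double3")
cmds.connectAttr(black + ".outColor", rayswitch + ".eye", force=True)

# Final gather rays: the object's original shader (a blinn, for example).
original_shader = "blinn1"  # placeholder name
cmds.connectAttr(original_shader + ".outColor", rayswitch + ".finalgather", force=True)

# Finally, plug the switch into the object's shading group as the mental ray material:
# cmds.connectAttr(rayswitch + ".outValue", "blackholeSG.miMaterialShader", force=True)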