Hello patrons,
I just posted the first part of the Clarisse scatterers series. In this video I'll walk you through some of the point clouds and scatterers available in Clarisse. We will do three very simple production exercises, but hopefully they will help you understand the workflow well enough to use these tools in more complicated shots.
In the first exercise we'll be using the point array to create a simple but effective crowd of soldiers. Then we will use the point cloud particle system to generate the effect that you can see in the video attached to this post, a very common effect these days.
And finally we will use the point uv sampler to generate huge environments like forests or cities.
We will continue with more exercises in the second and final part of this Clarisse scatterers series.
Check it out on my Patreon feed.
Thanks,
Xuan.
Custom attributes in Clarisse /
Using custom attributes to establish different looks is something that we have to deal with in many shots. In this short video I show you how to work with custom attributes between Maya and Clarisse.
Isotropix Clarisse Fastrack trailer /
This is the trailer for my upcoming Isotropix Clarisse online training, "Clarisse Fastrack".
It should be available by the end of the month. Thanks!
Clarisse training (wip) /
Last week I was invited by Isotropix to talk about the work that I'm doing with Clarisse, as part of the Siggraph London events. This was my little talk.
Clarisse shading layers: Crowd in 5 minutes /
One feature that I really like in Clarisse is shading layers. They let you drive shaders based on naming conventions or the location of assets in the scene. With this method you can assign shaders to a very complex scene structure in no time. In this particular case I'll be showing you how to shade an entire army and create shading/texturing variations in just a few minutes.
I'll be using an alembic cache simulation exported from Maya using Golaem. Usually you will get thousands of objects with different naming conventions, which makes the shading assignment task a bit laborious. With shading layer rules in Clarisse we can speed up this tedious process a lot (there's a tiny conceptual sketch of how rules resolve to shaders at the end of this post).
- Import an alembic cache with the crowd simulation through file -> import -> scene
- In this scene I have 1518 different objects.
- I'm going to create an IBL rig with one of my HDRIs to get some decent lighting in the scene.
- I created a new context called geometry where I placed the army and also created a ground plane.
- I also created another context called shaders where I'm going to place all my shaders for the soldiers.
- In the shaders context I created a new material called dummy, just a lambertian grey shader.
- We are going to be using shading layers to apply shaders globally based on context and naming convention. I created a shading layer called army (new -> shading layer).
- With the pass (image) selected, select the 3D layer and apply the shading layer.
- Using the shading layer editor, add a new rule to apply the dummy shader to everything in the scene.
- I'm going to add a rule for everything called heavyArmor.
- Then just configure the shader for the heavyArmor with metal properties and its corresponding textures.
- Create a new rule for the helmets and apply the shader that contains the proper textures for the helmets.
- I keep adding rules and shaders for the different parts of the soldiers.
- If I want to create random variation, I can create shading layers for specific part names or, even easier and faster, I can put a few items in a new context and create a new shading rule for them. For the bodies I want to use both caucasian and black skin soldiers. I grabbed a few bodies and placed them inside a new context called black, then created a new shading rule that applies a shader with different skin textures to all the bodies in that context.
- I repeated the same process for the shields and other elements.
- At the end of the process I can have a very populated army with a lot of random texture variations in just a few minutes.
- This is what my shading layers look like at the end of the process.
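If it helps to picture what the shading layer is doing, here is a tiny conceptual sketch in plain Python (not the Clarisse API, and the material names are hypothetical): each rule pairs a wildcard pattern with a material, and I'm assuming later rules override earlier ones, which matches the dummy-first, specific-rules-later stack used above.

# Conceptual sketch of shading layer rules -- plain Python, not the Clarisse API.
from fnmatch import fnmatch

rules = [
    ("*",                       "dummy"),          # catch-all grey lambert
    ("*heavyArmor*",            "mtl_metal"),      # hypothetical material names
    ("*helmet*",                "mtl_helmet"),
    ("*/geometry/black/*body*", "mtl_skin_black"),
]

def resolve_material(object_path):
    material = None
    for pattern, mtl in rules:      # evaluated in order, last match wins
        if fnmatch(object_path, pattern):
            material = mtl
    return material

print(resolve_material("/project/geometry/soldier_0042/heavyArmor_chest"))  # mtl_metal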
Combining Zbrush and Mari displacements in Clarisse /
We all have to work with displacement maps painted in both Zbrush and Mari.
Sometimes we use 32-bit floating point maps, sometimes 16-bit maps, etc. Combining different displacement depths and scales is a common task for a look-dev artist working in the film industry.
Let's see how to set up different displacement maps exported from Zbrush and Mari in Isotropix Clarisse.
- First of all, have a look at all the individual displacement maps to be used.
- The first one has been sculpted in Zbrush and exported as a 32-bit .exr displacement map. The non-displacement value is zero.
- The second one has been painted in Mari and also exported as a 32-bit .exr displacement map. Technically this map is exactly the same as the Zbrush one; the only difference here is the scale.
- The third displacement map in this exercise also comes from Mari, but in this case it's a 16-bit .tif displacement map, which means that the mid-point will be 0.5 instead of zero.
- We need to combine all of them in Clarisse and get the expected result.
- Start by creating a displacement node and assigning it to the mesh.
- We consider the Zbrush displacement as our main displacement layer, so the displacement node has to be set up as in the image below. The offset or non-displacement value has to be zero, and the front value 1. This will give us exactly the same look that we have in Zbrush.
- In the material editor I'm connecting a multiply node after every single displacement layer. The input 2 is 1,1,1 by default; increasing or reducing this value controls the strength of each displacement layer. It is not necessary to control the intensity of the Zbrush layer unless you want to, but it is necessary to reduce the intensity of the Mari displacement layers, as they are way off compared with the Zbrush intensity.
- I also added an add node right after the 16-bit Mari displacement, adding a value of -0.5 in order to remap it to the same level as the other 32-bit maps, whose non-displacement value is zero (see the small remapping sketch after this list).
- Finally I used add nodes to mix all the displacement layers.
- It is a good idea to set up and check all the layers individually to find the right look.
- No displacement at all.
- Zbrush displacement.
- Mari high frequency detail.
- Mari low frequency detail.
- All displacement layers combined.
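To make the remapping explicit, this is the arithmetic the multiply/add node graph above is performing per texel, written as a small Python sketch (the scale values are hypothetical; in Clarisse they live in the multiply nodes):

# Per-texel combination of the three displacement layers described above.
# zbrush_32 and mari_32 are zero-centred; mari_16 is centred at 0.5, so shift it first.
def combined_displacement(zbrush_32, mari_32, mari_16,
                          zbrush_scale=1.0, mari32_scale=0.1, mari16_scale=0.1):
    mari_16_remapped = mari_16 - 0.5        # bring the 16-bit map to a zero mid-point
    return (zbrush_32 * zbrush_scale
            + mari_32 * mari32_scale
            + mari_16_remapped * mari16_scale)

# A texel that is flat in every map stays flat.
print(combined_displacement(0.0, 0.0, 0.5))   # 0.0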
Rendering OpenVDB in Clarisse /
Clarisse is perfectly capable of rendering volumes while maintaining its flexible rendering options like instances and scatterers. In this particular example I'm going to render a very simple smoke simulation.
Start by creating an IBL setup. Clarisse allows you to do it with just one click.
Using a couple of matte and chrome spheres will help to establish the desired lighting situation.
To import the volume simulation just go to import -> volume.
Clarisse will show you a basic representation of the volume in the viewport, always in real time.
To improve the visual representation of the volume in the viewport just click on Progressive Rendering. Lighting will also affect the volume in the viewport.
Volumes are treated pretty much like geometry in Clarisse. You can render volumes with standard shaders if you wish.
The ideal situation, of course, would be to use volume shaders for volume simulations.
In the material editor I'm going to use a utility -> extract property node to read any embedded property in the simulation. In this case I'm reading the temperature.
Finally I drive the temperature color with a gradient map (there's a tiny sketch of this remapping at the end of this post).
If you get a lot of noise in your renders, don't forget to increase the volume sampling of your light sources.
Final render.
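Conceptually, the extract property plus gradient setup just remaps a scalar field to a colour ramp. Something like this, sketched with a hypothetical two-colour ramp (not the Clarisse node itself):

# Map a temperature sample to a colour by interpolating along a simple two-colour ramp.
def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def temperature_to_color(temperature, t_min=0.0, t_max=1.0,
                         cold=(0.0, 0.0, 0.0), hot=(1.0, 0.45, 0.1)):
    t = min(max((temperature - t_min) / (t_max - t_min), 0.0), 1.0)
    return lerp(cold, hot, t)

print(temperature_to_color(0.75))   # a warm orange for a hot voxel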
Rendering Maya particles in Clarisse /
This is a very simple tutorial explaining how to render particle systems simulated in Maya inside Isotropix Clarisse. I already have a few posts about using Clarisse for different purposes; if you check the tag "Clarisse" you will find all the previous posts. I hope to publish more soon.
In this particular case we'll be using a very simple particle system in Maya. We are going to export it to Clarisse and use custom geometries and Clarisse's powerful scatterer system to render millions of polygons very fast and nicely.
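Before jumping into Clarisse, if you prefer scripting the Maya-side export rather than using the menus, something along these lines should work with Maya's Alembic plug-in (node names and the output path are placeholders, so double-check the flags against your Maya version):

# Maya-side Alembic export of the simulated particles, ground grid and camera.
# Placeholder node names and path -- adapt them to your scene.
import maya.cmds as cmds

cmds.loadPlugin("AbcExport", quiet=True)
job = ("-frameRange 1 120 -uvWrite -worldSpace "
       "-root |particles_typeA -root |particles_typeB -root |grid -root |renderCam "
       "-file /path/to/particles.abc")
cmds.AbcExport(j=job)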
- Once your particle system has been simulated in Maya, export it via Alembic, one of the standard 3D formats for exchanging information in VFX.
- Create an IBL rig in Clarisse. In a previous post I explain how to do it; it is quite simple.
- With Clarisse 2.0 it is so simple to do, just one click and you are ready to go.
- Go to File -> Import -> Scene and select the Alembic file exported from Maya.
- It comes with two types of particles, a grid acting as the ground, and the render camera.
- Create a few contexts to keep everything tidy. Geo, particles, cameras and materials.
- In the geo context I imported the toy_man and the toy_truck models (.obj) and moved the grid from the main context to the geo context.
- I moved the two particle systems and the camera to their corresponding contexts.
- In the materials context I created 2 materials and 2 color textures for the models. Very simple shaders and textures.
- In the particles context I created a new scatterer called scatterer_typeA.
- In the geometry support of the scatterer add the particles_typeA and in the geometry section add the toy_man model.
- I’m also adding some variation to the rotation.
- If I move my timeline I will see the particle animation using the toy_man model.
- Do not forget to assign the material created before.
- Create another scatterer for the particles_typeB and configure the geometry support and the geometry to be used.
- Also add some rotation and position variation.
- As these models are quite big compared with the toy figurine, I’m offsetting the particle effect to reduce the presence of toy_trucks in the scene.
- Before rendering, I’d like to add some motion blur to the scene. Go to raytracer -> Motion Blur -> 3D motion blur. Now you are ready to render the whole animation.
Clarisse AOVs overview /
This is a very quick overview of how to use AOVs in Clarisse.
I started from this very simple scene.
Select your render image and then the 3D layer.
Open the AOV editor and select the components that you need for your compositing. In my case I only need diffuse, reflection and sss.
Click on the plus button to enable them.
Now you can check every single AOV in the image view frame buffer.
Create a new context called "compositing" and inside of it create a new image called "comp_image".
Add a black color layer.
Add an add filter and texture it using a constant color. This will be the entry point for our comp.
Drag and drop the constant color to the material editor.
Drag and drop the image render to the material editor.
If you connect the image render to the constant color input, you will see the beauty pass. Let's split it into AOVs.
Rename the map to diffuse and select the diffuse channel.
Repeat the process with all the AOVs, you can copy and paste the map node.
Add a few add nodes to merge all the AOVs until you get the beauty pass. That is it: your comp in a real time 3D environment. Whatever you change or add in your scene will be updated automatically (there's a small reconstruction sketch at the end of this post).
Let's say that you don't need your comp inside Clarisse. Fine, just select your render image, configure the output and bring up the render manager to output your final render.
- Just do the comp in Nuke as usual.
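For reference, the comp inside Clarisse is nothing more than adding the AOVs back together. This is roughly what the add nodes compute per pixel, sketched in Python and assuming diffuse, reflection and sss are the only components you exported:

# Per-pixel beauty reconstruction from the three AOVs used in this example.
def rebuild_beauty(diffuse, reflection, sss):
    # Each argument is an (r, g, b) tuple for one pixel; the comp just sums them.
    return tuple(d + r + s for d, r, s in zip(diffuse, reflection, sss))

print(rebuild_beauty((0.5, 0.5, 0.5), (0.25, 0.25, 0.25), (0.125, 0.125, 0.125)))
# (0.875, 0.875, 0.875) -> should match the beauty pass for that pixel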
Clarisse UV interpolation /
When subdividing models in Clarisse for rendering displacement maps, the software subdivides both the geometry and the UVs. Sometimes we might need to subdivide only the mesh while keeping the UVs as they originally are.
This depends on production requirements and obviously on how the displacement maps were extracted from Zbrush or any other sculpting package.
If you don't need to subdivide the UVs, first of all you should extract the displacement map with the SmoothUV option turned off.
Then in Clarisse, select the option UV Interpolation Linear.
IBL and sampling in Clarisse /
Using IBLs with huge ranges for natural light (sun) is just great. They give you very consistent lighting conditions and the behaviour of the shadows is fantastic.
But sampling those massive values can be a bit tricky sometimes. Your render will have a lot of noise and artifacts, and you will have to deal with tricks like creating cropped versions of the HDRIs or clamping values in Nuke.
Fortunately in Clarisse we can deal with this issue quite easily.
Shading, lighting and anti-aliasing are completely independent in Clarisse. You can tweak one of them without affecting the others, saving a lot of rendering time. In many renderers shading sampling is multiplied by anti-aliasing sampling, which forces users to tweak all the shaders in order to have decent render times.
- We are going to start with this noisy scene.
- The first thing you should do is change the Interpolation Mode to MipMapping in the Map File of your HDRI.
- Then we need to tweak the shading sampling.
- Go to raytracer and activate previz mode. This will remove lighting information from the scene. All the noise here comes from the shaders.
- In this case we get a lot of noise from the sphere. Just go to the sphere's material and increase the reflection quality under sampling.
- I increased the reflection quality to 10 and can't see any noise in the scene any more.
- Select the raytracer again and deactivate the previz mode. All the remaining noise is now coming from lighting.
- Go to the gi monte carlo and disable affect diffuse. This way GI won't affect the lighting, and we are left with only direct lighting. If you see some noise, just increase the sampling of your direct lights.
- Go to the gi monte carlo and re-enable affect diffuse. Increase the quality until the noise disappears.
- The render is noise free now but it still looks a bit low res; this is because of the anti-aliasing. Go to the raytracer and increase the samples. Now the render looks just perfect.
- Finally, there is a global sampling setting that you usually won't have to play with. Just for your information, setting the shading oversampling to 100% will multiply the shading rays by the anti-aliasing samples, like most of the render engines out there. This will help to refine the render but rendering times will increase quite a bit (see the little arithmetic example at the end of this list).
- Now if you want quick and dirty results for look-dev or lighting, just play with the image quality. You will not get pristine renders but they will be good enough for establishing looks.
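To put numbers on that oversampling note, here is a small illustrative calculation. It is not Clarisse's exact internal ray accounting, just the general decoupled-versus-multiplied idea:

# Illustrative reflection-ray counts per pixel -- not Clarisse's exact internals.
aa_samples = 8            # raytracer anti-aliasing samples
reflection_quality = 10   # shading sampling on the sphere's material

decoupled  = reflection_quality                # shading sampling independent of anti-aliasing
multiplied = aa_samples * reflection_quality   # shading oversampling at 100%

print(decoupled, multiplied)   # 10 vs 80 reflection rays per pixel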
Zbrush displacement in Clarisse /
This is a very quick guide to setting up Zbrush displacements in Clarisse.
As usual, the most important thing is to extract the displacement map from Zbrush correctly. To do so, just check my previous post about this procedure.
Once your displacement maps are exported follow this mini tutorial.
- In order to keep everything tidy and clean I will put all the stuff related to this tutorial inside a new context called "hand".
- In this case I imported the base geometry and created a standard shader with a gray color.
- I'm just using a very simple Image Based Lighting set-up.
- Then I created a map file and a displacement node. Rename everything to keep it tidy.
- Select the displacement texture for the hand and set the image to raw/linear (I'm using 32-bit .exr files).
- In the displacement node set the bounding box to something like 1 to start with.
- Add the displacement map to the front value, leave the value at 1m (which is not actually one meter; it's more like a global unit), and set the front offset to 0.
- Finally add the displacement node to the geometry.
- That's it. Render and you will get a nice displacement.
- If you are still working with 16-bit displacement maps, remember to set the displacement node offset to 0.5 and play with the value until you find the correct behaviour.
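As a rule of thumb, the displacement node is doing something like this per texel, which is why 32-bit maps want an offset of 0 and 16-bit maps want an offset of 0.5 (a simplification that ignores units and the bounding box):

# Simplified view of how the offset and front value map a texel to a height.
def displaced_height(texel_value, front_value=1.0, offset=0.0):
    return (texel_value - offset) * front_value

print(displaced_height(0.0, offset=0.0))   # 32-bit map: value 0.0 stays flat
print(displaced_height(0.5, offset=0.5))   # 16-bit map: value 0.5 stays flat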
Introduction to scatterers in Clarisse /
Scatterers in Clarisse are just great. They are very easy to control, reliable and they render in no time.
I've been using them for matte painting purposes: just feed them with a bunch of different trees to create a forest in two minutes, add some nice lighting and render at an insane resolution. Then use all the 3D material with all the needed AOVs in Nuke and you'll have full control to create stunning matte paintings.
To make this demo a bit more fun, instead of trees I'm using cool Lego pieces :)
- Create a context called obj and import the grid.obj and the toy_man.obj
- Create another context called shaders and create generic shaders for the objs.
- Also create two textures and load the images from the hard drive.
- Assign the textures to the diffuse input of each shader and then assign each shader to the corresponding obj.
- Set the camera to see the Lego logo.
- Create a new context called crowd, and inside of it create a point cloud and a scatterer.
- In the point cloud set the parent to be the grid.
- In the scatterer set the parent to be the grid as well.
- In the scatterer set the point cloud as geometry support.
- In the geometry section of the scatterer add the toy_man.
- Go back to the point cloud and in the scattering geometry add the grid.
- Now play with the density. In this case I'm using a value of 0.7.
- As you can see, all the toy_men start to populate the image.
- In the decimate texture add the Lego logo. Now the toy_men stick to the logo (see the little sketch after this list for the idea behind the decimation).
- Add some variation in the scatterer position and rotation.
- That's it. Did you realise how easy it was to set up this cool effect? And did you check the polycount? 108.5 million :)
- In order to make this look a little bit better, we can remove the default lighting and do some quick IBL setup.
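If you are curious about what the decimate texture is doing conceptually, here is a tiny sketch in plain Python (not the Clarisse API, with a hypothetical mask function, and assuming points are kept where the texture is bright):

# Conceptual sketch of scattering with a decimation mask.
import random

def logo_mask(u, v):
    # Hypothetical stand-in for sampling the Lego logo texture: 1.0 inside the logo, 0.0 outside.
    return 1.0 if 0.3 < u < 0.7 and 0.4 < v < 0.6 else 0.0

points = [(random.random(), random.random()) for _ in range(10000)]
kept = [(u, v) for u, v in points if logo_mask(u, v) > 0.5]   # decimate by mask value

print(len(kept), "of", len(points), "points survive the decimation")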
Clarisse, layers and passes /
I will continue writing about my experiences working with Clarisse. This time I'm gonna talk about working with layers and passes, a very common topic in the rendering world no matter what software you are using.
Clarisse allows you to create very complex organization systems using contexts, layers/passes and images. In addition to that we can compose all the information inside Clarisse in order to create different outputs for compositing.
Clarisse has very clever organization methods for huge scenes.
- For this tutorial I'm going to use a very simple scene. The goal is to create one render layer for each element of the scene. At the end of this article we will have the foreground, midground, background, floor and shadows isolated.
- At this point I have an image with a 3DLayer containing all the elements of the scene.
- I've created 3 different contexts for foreground, midground and background.
- Inside each context I put the corresponding geometry.
- Inside each context I created an empty image.
- I created a 3DLayer for each image.
- We need to indicate which camera and renderer need to be used in each 3DLayer.
- We also need to indicate which lights are going to be used in each layer.
- At this point you probably realized how powerful Clarisse can be for organization purposes.
- In the background context I'm rendering both the sphere and the floor.
- In the scene context I've created a new image. This image will be the recipient for all the other images created before.
- In this case I'm not creating 3DLayers but Image Layers.
- In the layers options select each one of the layers created before.
- I put the background at the bottom and the foreground at the top (the classic "over" stack; there's a tiny sketch of it at the end of this post).
- We face the problem that only the sphere has working shadows. This is because there is no floor in the other contexts.
- In order to fix this I moved the floor to another context called shadow_catcher.
- I created a new 3DLayer where I selected the camera and renderer.
- I created a group with the sphere, cube and cylinder.
- I moved the group to the shadows parameter of the 3DLayer.
- In the recipient image I place the shadows at the bottom. That's it, we have shadows working now.
- Oh wait, not that fast. If you check the first image of this post you will realize that the cube is actually intersecting the floor, but in this render that is not happening at all. This is because the floor is not in the cube context acting as a matte object.
- To fix this just create an instance of the floor in the cube context.
- In the shading options of the floor I localize the parameters matte and alpha (RMB and click on localize).
- Then I activated those options and set the alpha to 0%.
- That's it, working perfectly.
- At this point everything is working fine, but we have the floor and the shadows together. Maybe you would like to have them separated so you can tweak both of them independently.
- To do this, I created a new context only with the floor.
- In the shadows context I created a new "decal" material and assigned it to the floor.
- In the decal material I activated receive illumination.
- And finally I added the new image to the recipient image.
- You can download the sample scene here.
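For reference, stacking the image layers like this is the classic premultiplied "over" operation. A minimal per-pixel sketch, assuming premultiplied RGBA (the usual convention for render layers):

# Premultiplied "over": foreground over background, one RGBA pixel at a time.
def over(fg, bg):
    fg_alpha = fg[3]
    return tuple(f + b * (1.0 - fg_alpha) for f, b in zip(fg, bg))

# Foreground over midground over background, as in the layer stack above.
fg = (0.5, 0.0, 0.0, 0.5)
mg = (0.0, 0.3, 0.0, 0.3)
bg = (0.0, 0.0, 0.2, 1.0)
print(over(fg, over(mg, bg)))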
Image Based Lighting in Clarisse /
I've been using Isotropix Clarisse in production for a little while now. Recently the VFX facility where I work announced the adoption of Clarisse as its primary look-dev and lighting tool, so I decided to start talking about this powerful raytracer on my blog.
Today I'm writing about how to set up Image Based Lighting.
- We can start by creating a new context called ibl. We will put all the elements needed for ibl inside this context.
- Now we need to create a sphere to use as "world" for the scene.
- This sphere will be the support for the equirectangular HDRI texture.
- I just increased the radius a lot. Keep in mind that this sphere has to contain all your assets.
- In the image view tab we can see the render in real time.
- Right now the sphere is lit by the default directional light.
- Delete that light.
- Create a new matte material. This material won't be affected by lighting.
- Assign it to the sphere.
- Once assigned the sphere will look black.
- Create an image to load the HDRI texture.
- Connect the texture to the color input of the matte shader.
- Select the desired HDRI map in the texture path.
- Change the projection type to "parametric" (see the lat-long mapping sketch at the end of this post).
- HDRI textures are usually 32-bit linear images, so you need to indicate this in the texture properties.
- I created two spheres to check the lighting. Just press "f" to fit them in the viewport.
- I also created two standard materials, one for each sphere. I'm creating lighting checkers here.
- And a plane, just to check the shadows.
- If I go back to the image view, I can see that the HDRI is already affecting the spheres.
- Right now, only the secondary rays are being affected, like the reflection.
- In order to create proper lighting, we need to use a light called "gi_monte_carlo".
- Right now the noise in the scene is insane. This is because of all the crazy detail in the HDRI map.
- The first thing to do to reduce noise is to change the interpolation of the texture to MipMapping.
- To have a noise free image we will have to increase the sampling quality of the "gi_monte_carlo" light.
- Noise reduction can also be managed with the anti-aliasing sampling of the raytracer.
- The most common approach is to combine raytracer sampling, lighting sampling and shading sampling.
- Around 8 raytracing samples and something around 12 lighting samples are common settings in production.
- There is another method to do IBL in Clarisse without the cost of GI.
- Delete the "gi_monte_carlo" light.
- Create an "ambient_occlusion" light.
- Connect the HDRI texture to the color input.
- In the render only the secondary rays are affected.
- Select the environment sphere and deactivate the "cast shadows" option.
- Now everything works fine.
- To clean the noise increase the sampling of the "ambient_occlusion" light.
- This is a cheaper IBL method.
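For the curious, the "parametric" projection used for the environment sphere is essentially the standard lat-long (equirectangular) mapping from a direction to UVs. Something like this, although conventions vary between packages, so treat it as an illustration rather than Clarisse's exact formula:

# Lat-long (equirectangular) lookup: world-space direction -> UV on the HDRI.
import math

def latlong_uv(direction):
    x, y, z = direction
    length = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / length, y / length, z / length
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)   # horizontal angle around the sphere
    v = 0.5 - math.asin(y) / math.pi                # vertical angle from the equator
    return u, v

print(latlong_uv((0.0, 1.0, 0.0)))   # straight up -> (0.5, 0.0), the top of the map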