look-dev

Clarisse UV interpolation by Xuan Prada

When subdividing models in Clarisse to render displacement maps, the software subdivides both the geometry and the UVs. Sometimes we need to subdivide only the mesh while keeping the UVs as they originally are.

This depends on production requirements and obviously on how the displacement maps were extracted from Zbrush or any other sculpting package.

If you don't need to subdivide the UVs, first of all extract the displacement map with the SmoothUV option turned off.
Then, in Clarisse, set the UV Interpolation option to Linear.

By default Clarisse sets the UVs to Smooth.

You can easily change it to Linear.

Render with smooth UVs.

Render with linear UVs.

IBL and sampling in Clarisse by Xuan Prada

Using IBLs with huge dynamic ranges for natural light (sun) is just great. They give you very consistent lighting conditions, and the behaviour of the shadows is fantastic.
But sampling those massive values can be a bit tricky sometimes. Your render will have a lot of noise and artifacts, and you will have to resort to tricks like creating cropped versions of the HDRIs or clamping values in Nuke.

Fortunately in Clarisse we can deal with this issue quite easily.
Shading, lighting and anti-aliasing sampling are completely independent in Clarisse. You can tweak one of them without affecting the others, which saves a lot of rendering time. In many renderers, shading sampling is multiplied by anti-aliasing sampling, which forces users to tweak every shader just to keep render times decent. The difference is sketched below.
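To put rough numbers on that difference, here is a tiny back-of-the-envelope Python sketch. The sample counts are made up for illustration, and treating the decoupled budgets as roughly additive is a simplification, not Clarisse's actual internals.

```python
# Illustrative ray budgets per pixel; the numbers are invented for the example.
aa_samples = 16        # anti-aliasing samples per pixel
shading_samples = 10   # e.g. a material's reflection quality

# Decoupled sampling (Clarisse-style): raising shading quality does not
# multiply the anti-aliasing cost, so the budgets roughly add up.
decoupled = aa_samples + shading_samples

# Multiplied sampling (many other renderers, or shading oversampling at 100%):
# every anti-aliasing sample fires the full set of shading rays.
multiplied = aa_samples * shading_samples

print(decoupled, multiplied)   # 26 vs 160 rays per pixel, roughly
```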

  • We are going to start with this noisy scene.
  • The first thing you should do is to change the Interpolation Mode to MipMapping in the Map File of your HDRI.
  • Then we need to tweak the shading sampling.
  • Go to the raytracer and activate previz mode. This removes lighting information from the scene, so all the noise you see here comes from the shaders.
  • In this case we get a lot of noise from the sphere. Just go to the sphere's material and increase the reflection quality under sampling.
  • I increased the reflection quality to 10 and can't see any noise in the scene any more. 
  • Select the raytracer again and deactivate previz mode. All the noise here now comes from lighting.
  • Go to the gi_monte_carlo and disable affect diffuse. This way GI won't affect the lighting and we are left with direct lighting only. If you see some noise, just increase the sampling of your direct lights.
  • Go to the gi_monte_carlo and re-enable affect diffuse. Increase the quality until the noise disappears.
  • The render is noise free now, but it still looks a bit low-res; this is because of the anti-aliasing. Go to the raytracer and increase the samples. Now the render looks just perfect.
  • Finally, there is a global sampling setting that you usually won't have to play with. Just for your information: setting the shading oversampling to 100% multiplies the shading rays by the anti-aliasing samples, like most render engines out there. This helps refine the render, but render times will increase quite a bit.
  • Now, if you want quick and dirty results for look-dev or lighting, just play with the image quality. You will not get pristine renders, but they will be good enough for establishing looks.

Zbrush displacement in Clarisse by Xuan Prada

This is a very quick guide to setting up Zbrush displacements in Clarisse.
As usual, the most important thing is to extract the displacement map from Zbrush correctly. To do so, just check my previous post about this procedure.

Once your displacement maps are exported follow this mini tutorial.

  • In order to keep everything tidy and clean I will put all the elements related to this tutorial inside a new context called "hand".
  • In this case I imported the base geometry and created a standard shader with a gray color.
  • I'm just using a very simple Image Based Lighting set-up.
  • Then I created a map file and a displacement node. Rename everything to keep it tidy.
  • Select the displacement texture for the hand and set the image to raw/linear (I'm using 32-bit .exr files).
  • In the displacement node set the bounding box to something like 1 to start with.
  • Add the displacement map to the front value, leave the value at 1m (which is not actually one metre; it works more like a global unit), and set the front offset to 0.
  • Finally add the displacement node to the geometry.
  • That's it. Render and you will get a nice displacement.

Render with displacement map.

Render without displacement map.

  • If you are still working with 16-bit displacement maps, remember to set the displacement node offset to 0.5 and play with the value until you find the correct behaviour (see the sketch below).
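A small Python sketch of the usual midpoint convention behind those settings. The values are hypothetical and the formula shows the general idea, not Clarisse's exact implementation.

```python
def displace(texel, front_value, front_offset):
    """Signed displacement along the normal: (texel - offset) * front value."""
    return (texel - front_offset) * front_value

# 32-bit float map: values are already signed around 0.0, so the offset stays 0.
print(displace(0.25, 1.0, 0.0))    # 0.25 units outwards
print(displace(-0.10, 1.0, 0.0))   # 0.10 units inwards

# 16-bit map: values are normalised 0..1 with 0.5 as neutral mid grey,
# so the offset must be 0.5 to recentre them around zero.
print(displace(0.75, 1.0, 0.5))    # 0.25 units outwards
print(displace(0.40, 1.0, 0.5))    # 0.10 units inwards
```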

Image Based Lighting in Clarisse by Xuan Prada

I've been using Isotropix Clarisse in production for a little while now. Recently the VFX facility where I work announced that Clarisse will be its primary look-dev and lighting tool, so I decided to start talking about this powerful raytracer on my blog.

Today I'm writing about how to set up Image Based Lighting.

  • We can start by creating a new context called ibl. We will put all the elements needed for ibl inside this context.
  • Now we need to create a sphere to use as "world" for the scene.
  • This sphere will be the support for the equirectangular HDRI texture.
  • I just increased the radius a lot. Keep in mind that this sphere has to contain all your assets inside it.
  • In the image view tab we can see the render in real time.
  • Right now the sphere is lit by the default directional light.
  • Delete that light.
  • Create a new matte material. This material won't be affected by lighting.
  • Assign it to the sphere.
  • Once assigned the sphere will look black.
  • Create an image to load the HDRI texture.
  • Connect the texture to the color input of the matte shader.
  • Select the desired HDRI map in the texture path.
  • Change the projection type to "parametric" so the equirectangular map wraps around the sphere (see the lat-long sketch after this list).
  • HDRI textures are usually 32-bit linear images, so you need to indicate this in the texture properties.
  • I created two spheres to check the lighting. Just press "f" to fit them in the viewport.
  • I also created two standard materials, one for each sphere. I'm creating lighting checkers here.
  • And a plane, just to check the shadows.
  • If I go back to the image view, I can see that the HDRI is already affecting the spheres.
  • Right now, only the secondary rays are being affected, like the reflection.
  • In order to create proper lighting, we need to use a light called "gi_monte_carlo".
  • Right now the noise in the scene is insane. This is because of all the high-frequency detail in the HDRI map.
  • The first thing to do to reduce noise is to change the interpolation of the texture to MipMapping.
  • To get a noise-free image we will have to increase the sampling quality of the "gi_monte_carlo" light.
  • Noise reduction can also be managed with the anti-aliasing sampling of the raytracer.
  • The most common approach is to combine raytracer sampling, lighting sampling and shading sampling.
  • Around 8 raytracing samples and something around 12 lighting samples are common settings in production.
  • There is another method to do IBL in Clarisse without the cost of GI.
  • Delete the "gi_monte_carlo" light.
  • Create an "ambient_occlusion" light.
  • Connect the HDRI texture to the color input.
  • In the render only the secondary rays are affected.
  • Select the environment sphere and deactivate the "cast shadows" option.
  • Now everything works fine.
  • To clean the noise increase the sampling of the "ambient_occlusion" light.
  • This is a cheaper IBL method.
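As a side note on the parametric projection used above, this is the lat-long mapping an equirectangular HDRI relies on, sketched in plain Python. Orientation conventions vary between packages, so treat it as the general idea rather than Clarisse's exact maths.

```python
import math

def latlong_uv(direction):
    """Map a unit direction vector to equirectangular (lat-long) UVs."""
    x, y, z = direction
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)   # longitude -> U
    v = 0.5 + math.asin(y) / math.pi                # latitude  -> V
    return u, v

# A ray looking straight down -Z samples the centre of the map,
# while a ray looking straight up samples the top edge.
print(latlong_uv((0.0, 0.0, -1.0)))   # (0.5, 0.5)
print(latlong_uv((0.0, 1.0, 0.0)))    # (0.5, 1.0)
```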

Colorway in VFX - chapter 2 by Xuan Prada

A few days ago I did my first tests in Colorway. My idea is to use Colorway as texturing and look-development tool for VFX projects.

I think it can be a really powerful and artist-friendly tool for working on different types of assets.
It is also a great tool to present individual assets, because you can do quick and simple post-processing tasks like color correction, lens effects, etc. And of course Colorway allows you to create different variations of the same asset in no time.

With this second test I wanted to create an entire asset for VFX, make different variations and put everything together in a dailies template or similar to showcase the work.

At the end of the day I'm quite happy with the result and with the workflow combining Modo, Mari and Colorway. I found some limitations, but I truly believe that Colorway will soon fit my needs as a texture painter and look-dev artist.

Transferring textures

One of the limitations that I found as a texture painter is that Colorway doesn't manage UDIMs yet. I textured this character a while ago at home using Mari, following VFX standards, and of course I used UDIMs, around fifty 4K UDIMs actually.

I had to create a new UV mapping using the 1001 UDIM only. In order to keep enough texture resolution I divided the asset into different parts: head, both arms, both legs, pelvis and torso.
Then, using the great "transfer" tool in Mari, I baked the high-resolution UDIM textures onto the low-resolution UVs laid out in a single UV space. I created one 8K texture for each part of the asset, using only three texture channels: Color, Specular and Bump. A quick texel-budget comparison follows below.
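Here is a quick back-of-the-envelope comparison of the texel budget before and after the re-bake, assuming square maps and the seven parts listed above.

```python
# Rough texel budget before and after re-baking, assuming square maps.
udim_count, udim_res = 50, 4096     # original Mari setup: ~50 x 4K UDIMs
part_count, part_res = 7, 8192      # head, 2 arms, 2 legs, pelvis, torso at 8K

original_texels = udim_count * udim_res ** 2
rebaked_texels = part_count * part_res ** 2

print(f"{original_texels / 1e6:.0f} Mpx vs {rebaked_texels / 1e6:.0f} Mpx")
# ~839 Mpx vs ~470 Mpx: splitting the asset into parts keeps the loss manageable.
```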

Layer Transfer tool in Mari.

All the new textures already baked into the default UV space 1001.

My lighting setup in Modo couldn't be simpler. I'm just using an equirectangular HDRI map of Beverly Hills, an image that actually ships with Modo.
Image Based Lighting works great in Modo, and it is also very easy to mix different IBLs in the same scene.

Shading-wise it is also quite simple: just one shading layer with Color, Specular and Bump maps connected. I'm using one shader for each part of the asset.

The render takes only around 3 minutes on my tiny MacBook Air.
Rendering for Colorway takes more than that but obviously you will save a lot of time later.
Once in Colorway I can easily play with colours and textures. I created a color texture variation in Mari, and now in Colorway I can plug it in and see the shading change in no time.

All the different parts exported from Modo are on the left side toolbar.

On the right side all the lights will be available to play with. In this case I only have the IBL.

All the materials are listed on the right side. It is possible to change color, intensity and diffuse textures. This gives you a huge amount of freedom to create different variations of the same asset.

I really like the possibility of using post-processing effects like lens distortion or dispersion. You get quick visual feedback on very common lens effects used in VFX projects.

Finally I created a couple of color variations for this asset.

Notes

A couple of things that I noticed while working on this asset:

  • One part of the asset had its normals flipped. I didn't realise this, and when rendering for Colorway, Modo crashed. Once I inverted the normals of that part, it never crashed again.
  • It would be nice to store looks, or to have the option to export looks from one project to another. Let's say I'm working only on the upper part of the character, render it for Colorway and create some nice looks (including effects like lens distortion, color corrections, etc.). It would be great to keep those for the next time I export the whole character to Colorway.

Colorway for Look-Development in VFX by Xuan Prada

A few days ago (or weeks) The Foundry released their latest cool product called "Colorway", and they did it for free.

Colorway is a product created to help designers with their workflow, especially when dealing with color changes, texture updates, lighting, etc.; looks in general.
This software allows us to change those small things once the render is done. We can do it in real time, without waiting long hours for another render, and we can change different things related to shading and lighting.

This is obviously quite an advantage when we are dealing with clients and they ask us for small changes related to color, saturation, brightness, etc. We don't need to render again any more; we just use Colorway to make those changes live in no time.
The clients themselves can even change some things and send us back a file with their changes.

Great idea, great product.

I'm not a designer; I'm a vfx artist doing mainly textures and look-development, and even if Colorway wasn't designed for vfx, it can potentially be used in the vfx industry, at least for some tasks.

There are a few things that I'd like to have inside Colorway for it to be a more productive texturing and look-dev tool, but even now it can be used in some ways to create different versions of the same asset.

To test Colorway I used my model of War Machine.

  • Colorway allows us to render an asset using a base shader. Later we can apply different versions of the same textures, or just flat colors.
  • It all begins inside Modo (Cinema4D support is on its way).
  • It's very important how you organize your asset and shaders inside Modo. If you want to have a lot of control in Colorway, you will have to split your scene into different parts.
  • In this example, I separated the head in different parts, so I can select them individually later on in Colorway.
  • Even if I'm using the same shader for the whole head, I made different copies so I can tweak them one by one if I want to have even more control in Colorway.
  • In Modo work on the look as you usually do. Once you are happy with the results export to Colorway.
  • In this case I'm using textures to create the look. Maybe you can do it without textures and apply them later in Colorway. You will also be able to remove all the textures in Colorway and start from scratch there. This is down to personal taste.
  • Once happy just click on the Colorway button.
  • You can export all the materials and lights used in the scene or only those selected.
  • Click on the render button and that's it.
  • Once the render is done, just open the file exported from Modo and Colorway should pop up.
  • The workspace is super simple and well organized. There are selection groups and looks on the right and shaders, lighting and effects on the left.
  • Just select one of the parts on your left, one of the shaders on your right, or simply select in the viewport.
  • Automatically the controls for the material will pop up.
  • In the material options you can change the textures used by the shaders, or remove them if you want to start with a flat color.
  • Here I'm changing the textures for just one of the materials, and later for all of them, creating a new version of my asset.
  • As I said before we can remove all the textures and use only the base shaders plus flat colors in order to create a new version of the asset.
  • Finally the versions that I created for this post :)

A few things that I'd like to see in future versions of Colorway in order to have more control and power for look-dev tasks:

  • Right now we can only change RGB textures. It would be nice to have control over secondary maps. Blending textures with masks would be also great.
  • We can't control the shaders parameters. Having that control for look-dev would be amazing.
  • UDIM support has to be a must.
  • Not sure how Colorway manages IBL. If you are using several lights it seems to be OK, but if you are using only an IBL it doesn't seem to work entirely fine.
  • Transparency, glow and other shading options don't work in the current version.

Mari to Modo with just one click by Xuan Prada

UDIM workflow has been around for the last 10 years or so. It became more popular when Mari came out and these days it’s being used by everyone in the vfx industry.

In this blog you can find different ways to setup UDIMs in different software and render engines.
With Modo 801 it has never been so easy, fast and great!
With just one click you are ready to go!

  • Export your textures from Mari. I always use the naming convention “component_UDIM.exr”, e.g. “RGB_1001.exr” (see the small parsing sketch after this list).
  • Once in Modo, assign a new shader to your asset.
  • Add a new layer with a texture map, as usual. Add layer -> image map -> load udims.
  • Select the UDIM sequence that you exported from Mari.
  • Change the “effect” to point to the desired shader channel.
  • By default Modo enables the option “use clip udim”. You can check this in the “uv” properties. This means that you don’t need to do anything, Modo will handle the UDIM stuff by itself.
  • That’s it, all done :)
  • As an extra, you can go to the image manager, select one single map and check the UDIM coordinate.
  • Another cool thing is that you can select the whole UDIM sequence in the image manager and change the color space with one single click! This is great if you are working with a linear workflow or another color space.
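For reference, this is all the UDIM convention really encodes in those file names; a small Python sketch assuming the “component_UDIM.exr” naming from the first bullet.

```python
import re

def udim_to_tile(filename):
    """Extract the UDIM number from a 'component_UDIM.exr' style name
    (e.g. 'RGB_1001.exr') and return its UV tile column and row."""
    udim = int(re.search(r"_(\d{4})\.", filename).group(1))
    index = udim - 1001
    return index % 10, index // 10   # (u tile, v tile), both starting at 0

print(udim_to_tile("RGB_1001.exr"))   # (0, 0) -> first tile
print(udim_to_tile("RGB_1012.exr"))   # (1, 1) -> second column, second row
```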

Vector displacement in Modo by Xuan Prada

Another quick entry with my tips&tricks for Modo.
This time I’m going to write about setting up Mudbox’s vector displacements in Modo.

  • Check your displacement in Mudbox and clean your layer stack as much as you can. This will make the extraction process faster.
  • The extraction process is very simple. Just select your low and high resolution meshes.
  • Set the vector space to Absolute if your asset is a static element, like props or environments.
  • Set the vector space to Relative if your asset will be deformed. Like characters.
  • Always use 32-bit images.
  • As I said, export the maps as 32-bit EXRs.
  • Before moving to Modo or any other 3D package, check your maps in Nuke (a minimal sketch follows this list).
  • Once in Modo, select your asset and go to the geometry options.
  • Check Linear UVs and set the render subdivision level.
  • Assign a new shader to your asset.
  • Add a new texture layer with your vector displacement map.
  • Set it up as the Displacement effect.
  • Set the low and high value to 0 and 100.
  • You will see a displacement preview in viewport.
  • Set the gamma to 1.0. Remember that 32-bit images shouldn't be gamma corrected in a linear workflow.
  • In the shader options, set the Displacement Distance to 1m; this should give you the same result as Mudbox.
  • In the render options you can control the displacement rate, which is, more or less, your displacement quality.
  • 1.0 is fine; play with it. Lower values will give you sharper results but will need more time to render.
  • Finally render a quick test to see if everything looks as expected.
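For the Nuke check mentioned in the list, here is a minimal Python sketch. The file path is hypothetical; the point is simply to read the EXR without any colour conversion and confirm the values look like raw signed offsets.

```python
import nuke

# Load the 32-bit vector displacement EXR without any colour conversion,
# so the raw XYZ offsets stored in RGB stay untouched.
read = nuke.nodes.Read(file="/path/to/asset_vdisp.exr")   # hypothetical path
read["raw"].setValue(True)   # bypass the colourspace conversion

# Sample a pixel to confirm the values behave like signed offsets,
# not gamma-corrected colours.
print(read.sample("red", 512, 512))
```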

Zbrush displacement in Modo by Xuan Prada

Another of those steps that I need to do when I'm working on any kind of vfx project, and one that I consider a must.
This is how I set up my Zbrush displacements in Modo.

  • Once you have finished your sculpting work in Zbrush, with all the layers activated go to the lowest subdivision level.
  • Go to the morph target panel, click on StoreMT and import your base geometry. Omit this step if you started your model in Zbrush.
  • Once the morph target is created, you will see it in the viewport. Go back to your sculpted mesh by clicking on the switch button.
  • Export all the displacement maps using the multi map exporter. I would recommend always using 32-bit maps.
  • Check my settings to export the maps. The most important parameters are scale and intensity. Scale should be 1 and intensity will be calculated automatically.
  • Check the maps in Nuke and use the roto paint tool to fix small issues.
  • Once in Modo, import your original asset. Select your asset in the item list, check linear UVs and set the number of subdivisions that you want to use.
  • Assign a new shader to your asset, add the displacement texture as texture layer and set the effect as displacement.
  • Low value and high value should be set to 0 and 100.
  • In the gamma texture options, set the value to 1.0.
  • We are working in a linear workflow, which means that scalar textures don’t need to be gamma corrected.
  • In the shader options, go to the surface normal options and use 1m as value for the displacement distance. If you are using 32bit displacements this value should be the standard.
  • Finally in the render options, play with the displacement rate to increase the quality of your displacement maps.
  • Values from 0.5 to 1 are welcome. Lower values are great but take more time to render, so be careful.
  • Render a displacement checker to see if everything works fine.

Multi UDIM workflow in Modo by Xuan Prada

You know I’m migrating to Modo.
Working with multiple UDIMs is one of my daily tasks, so this is how I do the setup.

  • First of all, check textures and UDIMs in Mari and export all the textures.
  • Check the asset and UVs in Modo.
  • Load all your textures in the Modo’s image manager.
  • Create a new material for the asset.
  • Add all the UDIM textures as image layers for each required channel.
  • In the texture locator for each texture, change the horizontal repeat and vertical repeat to Reset, and change the UV offset. It works with negative values (unlike Softimage or Maya); see the small helper after this list.
  • That’s it. Make a render check to see if everything works fine.

Arm texture breakdown by Xuan Prada

I did a simple and quick texture breakdown for a human arm.
These are the textures that I usually create when I need to texture digital doubles for films or any kind of humanoid character.

These are the most basic textures used.
Usually, when working on movies, we need additional textures depending on the render engine, other pipeline tools or artistic decisions.
But as I said, take this example as a base or starting point for your work.

These are quick renders using a neutral lighting rig for look-dev.

Diffuse textures.

Overall textures.

Scatter textures.

Displacement textures.

Fine displacement textures.

Specular textures.

Zbrush displacement in V-Ray for Maya by Xuan Prada

It is always a bit tricky to set up Zbrush displacements in the different render engines.
If you recently moved from Mental Ray or another engine to V-Ray for Maya, maybe you should know a few things about displacement maps extracted from Zbrush.

Here is a simple example of my workflow for dealing with this kind of map in V-Ray.

  • First of all, drag and drop your 16-bit displacement onto the displacement channel inside the shading group attributes.
  • Maya will create a displacement node for you in the hypershade. Don't worry too much about this node; you don't need to change anything there.
  • Select your geometry and add a V-Ray extra attribute to control the subdivisions and displacement properties.
  • If you exported your displacement subdividing the UVs, you should check that property in the V-Ray attributes.
  • Edge length and Max subdivs are the most important parameters. Play with them until you reach nice results.
  • Displacement amount is the strength of your displacement, and the displacement shift should be minus half of your displacement amount if you are using 16-bit textures (see the sketch after this list).
  • If you are using 32-bit .exr textures, the displacement shift should be 0 (zero).
  • Select your 32-bit .exr file and add a V-Ray attribute called allow negative colors.
  • Render and check that your displacement is looking good.
  • I’ve been using these displacement maps. 16 bits and 32 bits.

Vray sss test by Xuan Prada

Just testing Vray’s SSS shader for realistic skin look-dev purposes.
I ended up with the theory that it would be quite simple to set up a nice, realistic and cheap SSS shader for human and creature assets. I love the raytraced solid scatter, but with complex models I can't get rid of some of the artifacts in the SSS channel.
I will post more quite soon.

  • To achieve better results, I like to combine SSS shaders with Vray Mtl shaders, which have better solutions for speculars and reflections. With this method the reflection of the surface is controlled by the BRDF instead of the poor spec control of the SSS shader.

Texturing for VFX film projects. Case study by Xuan Prada

These are the key points of an introductory lecture I gave about texturing for VFX film projects.
We used different assets in the class, but this is the only one that is not copyrighted and that I can show here.
I created this asset specifically for this course.

Summary

- Check the model.
- Render a checker scene.
- Decide about the quality needed for the textures. Is it a hero asset?
- UV mapping.
- Organization methods.
- How many UDIMs?
- Photo Shoot.
- What kind of lighting do I need?
- Accessories. (Color checkers, tripod, polarized filters, angular base, etc).
- Bakes. (dirt maps, dust maps, UVs, etc).
- Grading reference images. Create presets.
- Clean reference images for projections.
- Create cameras and guides in Maya/Softimage for projections.
- Adapt graded and cleaned reference images for projection guides.
- Project in 3D software or Mari. (Mari should be faster).
- Work on the projections inside Mari. (We can use Photoshop, Mari or both of them. Even Nuke).
- Create 16-bit sRGB colour textures.
- Test the colour channel in the light rig.
- Create 16-bit grayscale specular textures.
- Create 16-bit grayscale bump textures.
- Create 16-bit grayscale displacement textures.
- Create 8-bit grayscale ISO textures.
- Look-Dev blocking.
- Import the light rig.
- Create a basic pass.
- Checker render (matte).
- Checker render (reflective).
- Create clusters.
- Block materials.
- Look-Dev primary.
- Set up diffuse.
- Set up specular and reflections.
- Balance materials.
- Look-Dev secondary.
- Set up bump.
- Set up displacement.
- Rebalance materials.
- Set up ISO’s.
- Look-Dev refinement.
- Rebalance materials if needed.
- Create material libraries.
- Render turntables.

Basic displacement in RenderMan by Xuan Prada

  • Select the object’s shape node in the Attribute Editor and then go to Attribute -> RenderMan -> Add Sudvision Scheme. This will create a smooth surface.
  • Load your displacement texture in the Hypershade.
  • Play with the Alpha Gain and Alpha Offset to scale the image.
  • Alpha Offset should be minus half of Alpha Gain, so if Alpha Gain is 2, Alpha Offset should be -1 (see the sketch after this list).
  • Drag the displacement texture on to the displacement material in the shading group attributes.
  • This will create a displacement node.
  • Select the displacement node and go to Attributes -> RenderMan -> Add Displacement Attributes.
  • Set the displacement bound to something similar to your highest displacement value.
  • If you are using ray trace rendering you need to add ray traced attributes to your displacement.
  • Select your shape node and go to Attribute -> RenderMan -> Manage attributes and select TraceDisplacement.
  • Turn the shading rate down to increase the quality of your displacement. You can add a RenderMan attribute to control this instead of changing the global render options, and you'll save a lot of render time.
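The alpha gain/offset relationship from the list above, as a minimal Maya Python sketch; the file node name is a placeholder.

```python
import maya.cmds as cmds

file_node = "dispMap"   # placeholder: the file node holding the displacement map

# Scale the displacement with alphaGain and recentre a mid-grey map with
# alphaOffset = -alphaGain / 2, as described above.
gain = 2.0
cmds.setAttr(file_node + ".alphaGain", gain)
cmds.setAttr(file_node + ".alphaOffset", -gain / 2.0)   # -1.0 in this case
```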

Linear Workflow in Maya with Vray 2.0 by Xuan Prada

I’m starting a new work with V-Ray 2.0 for Maya. I never worked before with this render engine, so first things first.
One of my first things is create a nice neutral light rig for testing shaders and textures. Setting up linear workflow is one of my priorities at this point.
Find below a quick way to  set up this.

  • Set up your gamma. In this case I'm using 2.2.
  • Tick "don't affect colors" if you want the render output to stay linear and correct the gamma in post; untick it if you want the gamma correction baked into the final render. No big deal either way.
  • The linear workflow option is something created by Chaos Group to fix old V-Ray scenes that don't use a linear workflow. You shouldn't use it at all.
  • Click on affect swatches to see color pickers with the gamma applied.
  • Once you are working with the gamma applied, you need to correct your color textures. There are two different ways to do it.
  • First option: add a gamma correction node to each color texture node. In this case I'm using gamma 2.2, which means I need to use a value of 0.4545 (1/2.2) on my gamma node (see the sketch after this list).
  • Second option: Instead of using gamma correction nodes for each color texture node, you can click on the texture node and add a V-Ray attribute to control this.
  • By default all the texture nodes are being read as linear. Change your color textures to be read as sRGB.
  • Click on view as sRGB in the V-Ray frame buffer; otherwise you'll see your renders in the wrong color space.
  • This is the difference between rendering with the option “don’t affect colors” enabled or disabled. As I said, no big deal.
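A minimal Maya Python sketch of that first option (one gamma correction node per colour texture); the file node name is a placeholder.

```python
import maya.cmds as cmds

color_tex = "diffuseFile"   # placeholder: an sRGB colour texture (file node)

# 1/2.2 is roughly 0.4545: this removes the baked-in gamma so the texture
# enters the render as linear data.
gamma_node = cmds.shadingNode("gammaCorrect", asUtility=True)
for axis in "XYZ":
    cmds.setAttr("{0}.gamma{1}".format(gamma_node, axis), 1.0 / 2.2)

cmds.connectAttr(color_tex + ".outColor", gamma_node + ".value", force=True)
# ...then plug gamma_node + ".outValue" into the shader's colour slot
# instead of the file node itself.
```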