Unity – Realtime Mesh Deformation – Package

During the last week I was inspired by a friend to have a look at realtime mesh deformation and what can be achieved with it. On a long day at the university he stumbled across a deformation shader which was easy and fun to use on a bunch of objects lying around (original inspiration).
We discussed a bit about what could be done with it and which features were missing before we could use it in our own projects: access to the deformed mesh data and support for collision and the rest of Unity’s physics system.
And here is the Unity package which came out of it.
The package contains:

  1. Deformable.cs; a script which marks an object as deformable
  2. Deformer.cs; a script which marks an object as a deformer that influences deformables
  3. DeformShader.shader; a shader which handles texture blending depending on the deformation grade
  4. A bunch of textures; those are just grabbed from the internet, making use of “fair use” for educational purposes
  5. Two example materials; one with melting snow/water/stone and one with melting earth
  6. Two example models; one high-res sphere and one high-res plane
  7. One example scene with both objects set up as deformables; just have fun and play with it

The shader and scripts are mostly self-explanatory and contain several comments explaining what is done and why.

How to use it:

  • Create a new gameobject in your scene: either create an empty one and add a mesh filter, mesh renderer and/or mesh collider to it, drop a prefab with the mentioned components into the scene, or use one of the built-in 3D objects.
  • Add the Deformable component to your new object
  • Create another gameobject; this one does not need any components, as only its transform is relevant.
  • Add the Deformer component to your second object
  • On the Deformable component add the Deformer object to the “Deformers” list to have it influence the deformable during runtime.
  • Last you need to play a bit with the settings to get the effect you are looking for (a minimal setup sketch follows below this list).
    The Deformer only has one property, which defines the radius of a virtual sphere and thus the range of effect the deformer has around its point of origin.
    The Deformable has some more properties. Besides the list of deformers which can affect it, these are (variable names in parentheses):

    1. Deformers (_deformers): list of deformer objects
    2. Restore (_restore): flag indicating whether the mesh should restore to its original state (uncheck to keep the deformation once done)
    3. Update Collider (_updateCollider): flag indicating whether any mesh collider should be updated every frame (uncheck to save physics processing time if you really don’t need it)
    4. Use Dynamic Normal (_useDynamicNormal): flag indicating whether the deformable should deform along dynamic normals or use the mesh’s given normals on each vertex
    5. Dynamic Normal (_dynamicNormal): the preset normal vector used as dynamic normal. Set to (0, 0, 0) to use the inverse direction from each vertex to the deformer
    6. Fixed Normal Dir (_fixedNormalDir): changes the orientation of the normals when the mesh’s normals are used (defaults to -1 to deform the vertices inwards)
    7. Deform Magnifier (_deformMagnifier): multiplier applied to the actual deformation of a vertex
    8. Effect Range (_effectRange): tweaking value which increases the range of effect of each deformer on the deformable
    9. Deformer Min Distance (_deformerMinDistance): maximum distance a deformer may have to a vertex to still affect it
      If no deformer is inside the minimum distance, processing of the deformable is paused for performance reasons. This can easily be changed in the Update() method by removing the early exit via the “return;” statements.
    10. Fade Speed (_fadeSpeed): speed at which the deformation is applied; higher values make the deformation nearly instant, lower values delay/slow it down.
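
A minimal setup sketch of the steps above, assuming the deformer list is exposed through a public accessor (in the stock package it is protected and wired up in the Inspector instead):

using UnityEngine;

public class DeformationSetupExample : MonoBehaviour
{
    void Start()
    {
        // deformable: a built-in primitive already carries a MeshFilter,
        // MeshRenderer and a collider
        GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        Deformable deformable = sphere.AddComponent<Deformable>();

        // deformer: an empty object, only its transform matters
        GameObject tip = new GameObject("Deformer");
        tip.transform.position = sphere.transform.position + Vector3.up * 2.0f;
        Deformer deformer = tip.AddComponent<Deformer>();
        deformer.Spacing = 0.25f; // radius of the virtual sphere of influence

        // hypothetical accessor (not in the stock script); otherwise add the
        // deformer to the "Deformers" list in the Inspector
        // deformable.Deformers.Add(deformer);
        Debug.Log(deformable.name + " will be deformed by " + deformer.name);
    }
}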

Download WInt.RealtimeDeformer.unitypackage

Deformable.cs – Source Code

/*
 * inspired by
 * http://unitycoder.com/blog/2015/03/24/mesh-melt-shader-test/
 * 
 * This script and the content of the package are provided as-is.
 * You are free to modify and redistribute the package and code as you need.
 * 
 * As I put some effort into it I would be happy to see you mention the use of the package in your project
 * Tell me about it at mailme[at]tobiaspott.de
 * 
 */
using UnityEngine;
using System.Collections;
using System;
using System.Collections.Generic;

public class Deformable : MonoBehaviour
{
    /// <summary>
    /// protected field to store the deformers associated with this deformable.
    /// </summary>
    [SerializeField()]
    protected List<Deformer> _deformers = new List<Deformer>();
    /// <summary>
    /// protected field to determine whether the deformable should restore its state.
    /// </summary>
    [SerializeField()]
    protected bool _restore = true;
    /// <summary>
    /// protected field to determine whether collider data should be updated
    /// </summary>
    [SerializeField()]
    protected bool _updateCollider = true;

    /// <summary>
    /// protected field to determine whether the deformable should use dynamic normals instead of mesh's normals.
    /// </summary>
    [SerializeField()]
    protected bool _useDynamicNormal = false;
    /// <summary>
    /// protected field to store the dynamic normal. Set to Vector3.zero to force deformer-dependent dynamic normals.
    /// </summary>
    [SerializeField()]
    protected Vector3 _dynamicNormal = Vector3.zero;
    /// <summary>
    /// protected field to store the orientation of deformation when the mesh's fixed normals are used.
    /// </summary>
    [SerializeField()]
    protected float _fixedNormalDir = -1.0f;

    /// <summary>
    /// protected field to store the deform magnifier value applied to a displaced vertex
    /// </summary>
    [SerializeField()]
    protected float _deformMagnifier = 1.0f;

    /// <summary>
    /// protected field to store the effective range of a deformer on the deformable
    /// </summary>
    [SerializeField()]
    protected float _effectRange = 1.0f;
    /// <summary>
    /// protected field to store the minimum distance at which a deformer affects the deformable
    /// </summary>
    [SerializeField()]
    protected float _deformerMinDistance = 4.0f;
    /// <summary>
    /// protected field to store the speed at which the deformation is processed (the higher the faster).
    /// </summary>
    [SerializeField()]
    protected float _fadeSpeed = 2.0f;

    float viewDistance = 1.0f;

    // components
    private MeshFilter _meshFilter;
    private MeshCollider _meshCollider;
    // mesh duplication
    private Mesh _mesh0;
    private Vector3[] _bVertices, _bNormals;
    private Vector2[] _bUVs;
    private Color[] _bColors;
    private int[] _bTriangles;
    // mesh deformation
    private bool modified = false;
    private float orient = 0.0f;
    private float minDist = 0.0f;
    private float vtxDist = 0.0f;
    private float remapDist = 0.0f;
    private int i = 0;
    private int j = 0;
    private int closestDeformer = -1;
    private List<Vector3> _localDeformerPosition = new List<Vector3>();
    private Vector3 dynNormal = Vector3.zero;


    void Start()
    {
        // retrieve the gameObject's mesh filter and renderer
        _meshFilter = this.GetComponent<MeshFilter>();
        _meshCollider = this.GetComponent<MeshCollider>();
        // get the current mesh into a temporary object 
        Mesh baseMesh = _meshFilter.sharedMesh;

        // when no colors
        if (baseMesh.colors == null || baseMesh.colors.Length == 0)
        {
            Debug.LogWarning("Missing vertex colors, original mesh will be duplicated and reassigned to the mesh filter.");
            _mesh0 = new Mesh();

            // copy vertex positions
            _bVertices = new Vector3[baseMesh.vertexCount];
            Array.Copy(baseMesh.vertices, _bVertices, baseMesh.vertexCount);
            // copy mesh normals if any exist
            if (baseMesh.normals != null && baseMesh.normals.Length != 0)
            {
                _bNormals = new Vector3[baseMesh.vertexCount];
                Array.Copy(baseMesh.normals, _bNormals, baseMesh.vertexCount);
            }
            // copy mesh uvs if any exist
            if (baseMesh.uv != null && baseMesh.uv.Length != 0)
            {
                _bUVs = new Vector2[baseMesh.vertexCount];
                Array.Copy(baseMesh.uv, _bUVs, baseMesh.vertexCount);
            }
            // init color and flush with black
            _bColors = new Color[baseMesh.vertexCount];
            for (int i = 0; i < baseMesh.vertexCount; i++)
                _bColors[i] = Color.black;
            // copy mesh triangle indices
            _bTriangles = new int[baseMesh.triangles.Length];
            Array.Copy(baseMesh.triangles, _bTriangles, baseMesh.triangles.Length);

            // assign copied mesh data array to working mesh
            _mesh0.vertices = _bVertices;
            _mesh0.normals = _bNormals;
            _mesh0.uv = _bUVs;
            _mesh0.colors = _bColors;
            _mesh0.triangles = _bTriangles;

            // assign working mesh to the gameObject's mesh filter
            _meshFilter.mesh = null;
            _meshFilter.sharedMesh = _mesh0;
        }
        else
        {
            Color[] colors = baseMesh.colors;
            for (int i = 0; i < baseMesh.vertexCount; i++)
            {
                colors[i] = Color.black;
            }
            baseMesh.colors = colors;
            _meshFilter.sharedMesh = baseMesh;
        }

        if (baseMesh.normals == null || baseMesh.normals.Length == 0)
        {
            // activate dynamic normal usage (which does not require mesh normals)
            _useDynamicNormal = true;
            Debug.LogWarning("Missing vertex normals, dynamic normal will be used. " 
			+ "Either assign a value to the dynamic normal or runtime normals will apply.");
        }



        // assign the mesh to the collider if one exists
        if (_updateCollider && _meshCollider != null)
        {
            _meshCollider.sharedMesh = null;
            _meshCollider.sharedMesh = _mesh0;
        }
    }

    void Update()
    {
        // get the shared mesh from the gameObject's mesh filter
        _mesh0 = _meshFilter.sharedMesh;
        // get references to the mesh's vertices and vertex colors
        Vector3[] vertices = _mesh0.vertices;
        Color[] colors = _mesh0.colors;

        // process either using dynamic normals or mesh's normals
        if (_useDynamicNormal)
        {
            if (!this.DeformByDynamicNormal(vertices, colors))
                return;
        }
        else
        {
            // get reference to the mesh's normals
            Vector3[] normals = _mesh0.normals;
            if (!this.DisplaceByMeshNormal(vertices, normals, colors))
                return;
        }

        // push back vertices and colors to the working mesh
        _mesh0.vertices = vertices;
        _mesh0.colors = colors;

        // assign the updated mesh to the collider if one exists
        if (_updateCollider && _meshCollider != null) // && (Time.frameCount % 10 == 0))
        {
            _meshCollider.sharedMesh = null;
            _meshCollider.sharedMesh = _meshFilter.sharedMesh;
        }
    }

    /// <summary>
    /// transforms the deformers' (heat sources') world positions into the deformable's local space
    /// </summary>
    private void LocalizeHeatSources()
    {
        // if the number of previously localized deformers differs from the current number of deformers
        if (_localDeformerPosition.Count != _deformers.Count)
        {
            // clear the previous localized list 
            _localDeformerPosition.Clear();
            // iterate over all deformers and localize their positions
            for (j = 0; j < _deformers.Count; j++)
                _localDeformerPosition.Add(this.transform.InverseTransformPoint(_deformers[j].transform.position));
        }
        // if the number of deformers didn't change, just update the values
        else
        {
            // iterate over all deformers and localize their positions
            for (j = 0; j < _deformers.Count; j++)
                _localDeformerPosition[j] = this.transform.InverseTransformPoint(_deformers[j].transform.position);
        }
    }

    /// <summary>
    /// displace the passed vertices and colors using the given normals
    /// </summary>
    /// <param name="verts">array of vertices to deform</param>
    /// <param name="normals">array of normals used for deformation</param>
    /// <param name="clrs">array of vertex colors to map the deformation back </param>
    /// <returns>true if any deformation was applied, false otherwise</returns>
    private bool DisplaceByMeshNormal(Vector3[] verts, Vector3[] normals, Color[] clrs)
    {
        // init values for processing
        modified = false;
        orient = Mathf.Clamp(_fixedNormalDir, -1.0f, 1.0f);
        minDist = 0.0f;
        vtxDist = 0.0f;
        remapDist = 0.0f;
        j = 0;
        // localize the deformers' positions
        this.LocalizeHeatSources();

        for (i = 0; i < _mesh0.vertexCount; i++)
        {
            // init minDist with maximum value
            minDist = float.MaxValue;
            for (j = 0; j < _deformers.Count; j++)
            {
                // if the deformer was not transformed since the last frame
                // or the deformer component is disabled, continue with the next one
                if (!_deformers[j].Transformed || !_deformers[j].enabled)
                    continue;
                // calculate the distance between current vertex and the current localized deformer position
				// (subtracting the deformer's spacing)
                vtxDist = Vector3.Distance(verts[i], _localDeformerPosition[j]) - _deformers[j].Spacing;
                // if distance to vertex is above minimum distance for deformers to apply to the deformable
                if (vtxDist > _deformerMinDistance)
                    continue;
                // if the distance to the deformer is lower than the minDist set it as new minDist
                if (vtxDist < minDist)
                    minDist = vtxDist;
            }

            // if minDist is unchanged
            if (minDist == float.MaxValue)
                continue;

            // if not modified before mark it now!
            if (!modified) modified = true;

            // remap the actual distance to the range 0..1 using a range of 0.._effectRange
            remapDist = this.Remap(minDist, 0.0f, _effectRange, 0.0f, 1.0f);

            // check whether the mesh should restore itself
            if (_restore)
                clrs[i].a = Mathf.Clamp01(clrs[i].a - (viewDistance - remapDist) * Time.deltaTime * _fadeSpeed);
            else
                // using Mathf.Min keeps the color at its lowest ever set value which results in a non-restoring mesh
                clrs[i].a = Mathf.Min(clrs[i].a, 
				Mathf.Clamp01(clrs[i].a - (viewDistance - remapDist) * Time.deltaTime * _fadeSpeed));

            // displace the vertex by the mesh's normal
            verts[i] = _bVertices[i] + (normals[i] * (1.0f - clrs[i].a) * orient * _deformMagnifier);
        }

        // return whether the mesh was deformed
        return modified;
    }

    /// <summary>
    /// deforms the passed vertices and colors using dynamic normals
    /// </summary>
    /// <param name="verts">array of vertices to deform</param>
    /// <param name="clrs">array of vertex colors to map the deformation back </param>
    /// <returns>true if any deformation was applied, false otherwise</returns>
    private bool DeformByDynamicNormal(Vector3[] verts, Color[] clrs)
    {
        // init values for processing
        modified = false;
        orient = Mathf.Clamp(_fixedNormalDir, -1.0f, 1.0f);
        minDist = 0.0f;
        vtxDist = 0.0f;
        remapDist = 0.0f;
        closestDeformer = -1;

        // set later used dynamic normal to _dynamicNormal
        dynNormal = this.transform.InverseTransformDirection(_dynamicNormal);
        // localize the deformers' positions
        this.LocalizeHeatSources();

        // iterate over each vertex of the mesh
        for (i = 0; i < _mesh0.vertexCount; i++)
        {
            // init minDist with maximum value
            minDist = float.MaxValue;
            // iterate over all deformers
            for (j = 0; j < _deformers.Count; j++)
            {
                // if the deformer was not transformed since the last frame
                // or the deformer component is disabled, continue with the next one
                if (!_deformers[j].Transformed || !_deformers[j].enabled)
                    continue;
                // calculate the distance between current vertex and the current localized deformer position 
				// (subtracting the deformer's spacing)
                vtxDist = Vector3.Distance(verts[i], _localDeformerPosition[j]) - _deformers[j].Spacing;
                // if distance to vertex is above minimum distance for deformers to apply to the deformable
                if (vtxDist > _deformerMinDistance)
                    continue;
                // if the distance to the deformer is lower than the minDist
				// set it as new minDist and save the index of the deformer
                if (vtxDist < minDist)
                {
                    minDist = vtxDist;
                    closestDeformer = j;
                }
            }

            // if minDist is unchanged continue with next iteration
            if (minDist == float.MaxValue) continue;
            // if no closest deformer was detected continue with the next iteration
            if (closestDeformer == -1) continue;

            // if not modified before mark it now!
            if (!modified) modified = true;

            // update dynamic normal when no custom dynamic normal is set
            if (_dynamicNormal == Vector3.zero)
                // calculate dynamic normal by the direction from the current vertex to the closest deformer
                dynNormal = (_bVertices[i] - _localDeformerPosition[closestDeformer]).normalized;

            // remap the actual distance to the range 0..1 using a range of 0.._effectRange
            remapDist = this.Remap(minDist, 0.0f, _effectRange, 0.0f, 1.0f);

            // check whether the mesh should restore itself
            if (_restore)
                clrs[i].a = Mathf.Clamp01(clrs[i].a - (viewDistance - remapDist) * Time.deltaTime * _fadeSpeed);
            else
                // using Mathf.Min keeps the color at its lowest ever set value which results in a non-restoring mesh
                clrs[i].a = Mathf.Min(clrs[i].a, 
				Mathf.Clamp01(clrs[i].a - (viewDistance - remapDist) * Time.deltaTime * _fadeSpeed));

            // displace the vertex by the dynamic normal
            verts[i] = _bVertices[i] + (dynNormal * (1.0f - clrs[i].a) * _deformMagnifier);

        }

        // return whether the mesh was deformed
        return modified;
    }

    // helper function
    float Remap(float val, float inLowerEnd, float inUpperEnd, float outLowerEnd, float outUpperEnd)
    {
        return outLowerEnd + (val - inLowerEnd) * (outUpperEnd - outLowerEnd) / (inUpperEnd - inLowerEnd);
    }

}

If you wonder about the bunch of class-scope variables: most of them could be placed inside methods, but that would increase garbage collection activity depending on the complexity of the mesh.
Having the GC run every third frame or so drops performance significantly, which is avoided by keeping the data alive instead of letting the GC collect it. This might not be the best approach on platforms with less memory and may require adjustments.
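
A minimal sketch of that caching pattern, reduced to the allocation aspect (class and method contents here are illustrative only):

using UnityEngine;

public class BufferCacheExample : MonoBehaviour
{
    private MeshFilter _meshFilter;
    private Vector3[] _vertexBuffer; // allocated once, reused every frame

    void Start()
    {
        _meshFilter = GetComponent<MeshFilter>();
        // Mesh.vertices returns a fresh copy on every access, so read it once
        _vertexBuffer = _meshFilter.sharedMesh.vertices;
    }

    void Update()
    {
        // mutate the cached buffer in place instead of allocating a new array,
        // so no per-frame garbage is produced for the GC to collect
        for (int i = 0; i < _vertexBuffer.Length; i++)
            _vertexBuffer[i].y = Mathf.Sin(Time.time + _vertexBuffer[i].x);
        _meshFilter.sharedMesh.vertices = _vertexBuffer;
    }
}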

Deformer.cs – Source Code

/*
 *  
 * This script and the content of the package are provided as-is.
 * You are free to modify and redistribute the package and code as you need.
 * 
 * As I put some effort into it I would be happy to see you mention the use of the package in your project
 * Tell me about it at mailme[at]tobiaspott.de
 * 
 */

using UnityEngine;
using System.Collections;

public class Deformer : MonoBehaviour
{
    /// <summary>
    /// protected field to store the spacing of the deformer
    /// </summary>
    [SerializeField()]
    protected float _spacing = 0.0f;
    /// <summary>
    /// gets or sets the spacing the deformer has to a deformable
    /// </summary>
    public virtual float Spacing
    {
        get { return _spacing; }
        set { _spacing = value; }
    }

    /// <summary>
    /// protected field to store whether the Transformed property should be ignored/overridden
    /// </summary>
    [SerializeField()]
    protected bool _ignoreTransformed = true;

    protected bool _transformed = false;
    /// <summary>
    /// gets whether the deformer has been transformed since the last frame (running in Update())
    /// </summary>
    public bool Transformed
    { get { return _transformed || _ignoreTransformed; } }

    // private fields to store last frame transformation data
    private Vector3 _lastPosition;
    private Quaternion _lastRotation;
    private Vector3 _lastScale;

    // runs a simple update loop to determine if it was transformed during the last frame
    void Update()
    {
        _transformed = false;
        if (this.transform.position != _lastPosition
            || this.transform.rotation != _lastRotation
            || this.transform.lossyScale != _lastScale)
            _transformed = true;

        _lastPosition = this.transform.position;
        _lastRotation = this.transform.rotation;
        _lastScale = this.transform.lossyScale;
    }
}

DeformShader.shader – Source Code

Shader "Labertorium/DeformShader" {
	Properties {
		_MainTex ("Texture", 2D) = "white" {}
		// _MainNormalTex ("Normal", 2D) = "bump" {} // unused to keep shader more simple
		_SecondTex ("Second Texture", 2D) = "white" {}
		// _SecondNormalTex ("Second Normal", 2D) = "bump" {}// unused to keep shader more simple
		_ThirdTex ("Third Texture", 2D) = "white" {}
		// _ThirdNormalTex ("Third Normal", 2D) = "bump" {}// unused to keep shader more simple
		// _Amount ("Extrusion Amount", Range(-1,1)) = 0.5 // unused as script no longer access it
	}
	
	SubShader {
		Tags { "RenderType" = "Opaque" }
	
		CGPROGRAM
		#pragma surface surf Lambert vertex:vert
	
		// declare input structure with uv and color fields	
		struct Input 
		{
			float2 uv_MainTex;
			float4 color;
		};
		
		void vert (inout appdata_full v,out Input o)
		{
			UNITY_INITIALIZE_OUTPUT(Input,o);
			o.color = v.color; // pass the vertex color to the output (required for blending textures)
			// for deformation on the graphics card the vertex shader could be used
			// as this does not affect the physics the following lines are commented out
			// but left for easier legacy support
			// v.vertex.xyz += v.normal * v.color.a * _Amount; // move in normal direction
			// v.vertex.xyz += float3(0, 1, 0) * v.color.a * _Amount; // move down
		}
		
		sampler2D _MainTex; //, _MainNormalTex;
		sampler2D _SecondTex; //, _SecondNormalTex;
		sampler2D _ThirdTex; //, _ThirdNormalTex;
	
		// remaps a value to the range specified by outLower and outUpper
		// using the inLower and inUpper as reference range
		float Remap(float val, float inLower, float inUpper, float outLower, float outUpper)
		{
			return outLower + (val - inLower) * (outUpper - outLower) / (inUpper - inLower);
		}

		void surf (Input IN, inout SurfaceOutput o) 
		{
			// legacy uv scrolling; the scroll speed increases with the deformation grade
			float2 scroll = float2(_Time.x * 0.064f * (1 - IN.color.a), _Time.y * 0.04f * (1 - IN.color.a));

			// get rgb + alpha for all three textures
			float4 main = tex2D(_MainTex, IN.uv_MainTex);
			// the scroll offset is applied to the second to have a scrolling uv-animation
			float4 second = tex2D(_SecondTex, IN.uv_MainTex + scroll * 4); 
			// apply scroll to this one too if the texture should be animated!
			float4 third = tex2D(_ThirdTex, IN.uv_MainTex); 

			// remap the vertex colors alpha (set via deformable script)
			// to use it as blend factor for the first and second texture
			float blend1To2 = Remap(IN.color.a, 0.25, 1.0, 0.0, 1.0);
			// remap the vertex colors alpha (set via deformable script) 
			// to use it as blend factor for the second and third
			float blend2To3 = Remap(IN.color.a, 0.0, 0.25, 0.0, 1.0);

			// init c with white
			float3 c = float3(1, 1, 1);
			// if the vertex color's alpha is above 0.25 blend between the second and first texture
			if (IN.color.a > 0.25)
				c = lerp(second.rgb, main.rgb, blend1To2);

			// if the vertex color's alpha is below 0.25 blend between the third and second texture
			if (IN.color.a < 0.25)
				c = lerp(third.rgb, second.rgb, blend2To3);

			// put the color into the output's albedo value
			o.Albedo = c;

			// you could use emission but it doesn't make sense for the examples
			// o.Emission = c * (1 - IN.color.a);
		}
		ENDCG
	} 
	Fallback "Diffuse"
}

Unity – Texture To VertexColor mapping – Script

To pick up my previous post on a vertex color shader I’ve written a short routine which takes a snapshot of a skinned & animated mesh and maps its texture colors to its vertex colors. Although the script uses a SkinnedMeshRenderer, the actual mapping process can be done with any sort of mesh from a MeshRenderer component. It only requires the mesh to have uv coordinates (procedural materials might not work with this approach).
You can use the VertexColor shader from http://labertorium.de/unity/773/unity-vertexcolor-shader/ to visualize the results.

To read pixels from a texture you need to import it as read/write enabled. Select the texture in your project folder, change the texture type to “Advanced” and check “Read/Write enabled”. Unity will throw an exception if you try to read pixels from a resource texture not marked as read/write enabled (this does not apply to textures generated entirely from code; those are always readable & writable).
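
A small guard sketch for this, assuming a Unity version that exposes Texture.isReadable (newer releases do; the class name here is just for illustration):

using UnityEngine;

public static class TextureReadGuard
{
    // returns the texture's pixels, or null with a helpful error when the
    // texture was not imported with "Read/Write enabled"
    public static Color[] TryGetPixels(Texture2D tex)
    {
        if (tex == null || !tex.isReadable)
        {
            Debug.LogError("Texture is missing or not imported as read/write enabled.");
            return null;
        }
        return tex.GetPixels();
    }
}

And here is the full snapshot & mapping routine: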

// variable used to prevent multiple simultaneous executions of the coroutine
bool _corCreateVCDuplicate = false;
IEnumerator COR_CreateVCDuplicate()
{
	// check if the blocking flag is set or not
	// if not execute the rest of the coroutine
	// if so just go to the end of the coroutine and end its execution
	if (!_corCreateVCDuplicate)
	{
		// setting the blocking variable so the coroutine cannot be executed multiple times simultaneously
		_corCreateVCDuplicate = true;

		// receive the SkinnedMeshRenderer component to retrieve the current mesh from 
		// it would also be possible to use an existing mesh (from a simple MeshRenderer)
		// it is only required to have a uv-set on the mesh (in the following example I'm using the first set 'uv')
		SkinnedMeshRenderer smr = this.GetComponentInChildren<SkinnedMeshRenderer>();
		// REMARK: comment the upper line and uncomment the lower if the SkinnedMeshRenderer component is on the same object as the script this code is placed in!!!
		// SkinnedMeshRenderer smr = this.GetComponent<SkinnedMeshRenderer>();
		// create a new Mesh object, which is empty and does not contain any data
		Mesh m = new Mesh();
		// bake the current mesh from the SkinnedMeshRenderer into the new empty mesh object
		smr.BakeMesh(m);

		// Here the actual mapping begins (you could use any mesh you have at hand for this)
		// create a temporary array to store the vertex colors in (after baking the mesh might not have any as vertex colors are rarely used)
		Color[] mColors = new Color[m.vertices.Length];

		// retrieve the currently used texture from the SkinnedMeshRenderer material 
		// REMARK: this script only works flawlessly on meshes using a single material
		Texture2D tex = (Texture2D)smr.sharedMaterial.GetTexture("_MainTex");
		// get all pixels from the texture and put them into a new flat Color-array 
		Color[] pixels = tex.GetPixels();
		// yield the execution to continue the next frame
		yield return null;

		// calculate a yieldThreshold used to spread the work over roughly 10 frames to prevent lags on the main thread
		// I'm using a 10th of the total number of uv coordinates on the mesh (e.g. 24000 uvs gives a threshold of 2400 coordinates the script processes per frame)
		// Mathf.Max guards against a threshold of 0 (and a modulo-by-zero below) on meshes with fewer than 10 uvs
		int yieldThreshold = Mathf.Max(1, m.uv.Length / 10);

		// iterate over all uvs (first set)
		for (int i = 0; i < m.uv.Length; i++)
		{
			// put the uv coordinate at index 'i' into a local variable (visible only inside the for-loop)
			Vector2 uv = m.uv[i];
			// convert the uv coordinate into pixel space using the texture width and height
			// Mathf.FloorToInt converts a float into an int, rounding it down to the next lower int value (4.22 becomes 4)
			// uv.x * tex.width is used to calculate the x-axis pixel coordinate (I assume uv.x is between 0.0 and 1.0 which should be the case for most manually and well laid-out uvs on a mesh)
			int x = Mathf.FloorToInt(uv.x * tex.width);
			// the same on the y axis
			int y = Mathf.FloorToInt(uv.y * tex.height);
			// security check to prevent the pixel space coordinates getting out of bounds
			if (x >= tex.width) x = tex.width - 1;
			if (y >= tex.height) y = tex.height - 1;
			// whenever i is a multiple of yieldThreshold, yield the execution at this point until the next frame
			if (i % yieldThreshold == 0) yield return null;

			// calculate the index to access the flat Color-array
			// the colors are aligned line by line and simply concatenated in that array
			int index = x + (y * tex.width);
			// use the pixel's color to set the color in our temporary vertex color array.
			mColors[i] = pixels[index];
		}

		// set the vertex color array of our previously baked mesh to our temporary vertex color array
		m.colors = mColors;

		// create a new gameobject with auto-added MeshFilter and MeshRenderer
		GameObject goTest = new GameObject("Duplicate", typeof(MeshFilter), typeof(MeshRenderer));
		// set the sharedMesh of the MeshFilter to our baked mesh 
		goTest.GetComponent<MeshFilter>().sharedMesh = m;
		// create a new Material from resources which can visualize the vertex colors
		// I'm using a testing material from my project
		// you need to adjust this line to load a material which uses a vertex color shader and is placed somewhere in your projects resources folder
		goTest.GetComponent<MeshRenderer>().sharedMaterial = new Material(Resources.Load<Material>("Core.Materials/_Testing_Mat"));
		// set the duplicate's position and rotation so it matches the original 
		// it would be otherwise placed somewhere at the world-origin
		goTest.transform.position = this.transform.position;
		goTest.transform.rotation = this.transform.rotation;

		// resetting the blocking variable to false to let the coroutine be executed again
		_corCreateVCDuplicate = false;
	}

}

// make a manual invocation inside the FixedUpdate function
// you can also put it into Update, it does not matter which one you use
void FixedUpdate()
{
	// check if the keyboard key 'Z' is down for the first time
	if(Input.GetKeyDown(KeyCode.Z))
	{	
		// start the function as a coroutine
		this.StartCoroutine(this.COR_CreateVCDuplicate());
	}
}

Unity – VertexColor – Shader

This time it is a simple surface shader which displays the vertex colors of a mesh and optionally multiplies them with a base color.
I’m using it to visualize the vertex colors on meshes (for debugging) or on solid block looking particles without textures.
See the code for comments and explanation.

Shader "Labertorium/Vertex Colored/Solid" {
	Properties {
		// introduces the base color property which can be used to multiply the vertex colors with (e.g. dim to black or a specific color)
		_Color ("Color", Color) = (1.00, 1.00, 1.00, 1.00) // white
	}
	SubShader {
		// Unity shader lab attributes
		// set the render type to be opaque (not using any transparencies or similar)
		Tags { "RenderType"="Opaque" }
		LOD 200
		
        // actual shader code        
		CGPROGRAM
		// set the 'surf' function to be used by the rendering pipeline as the surface shader function and let it use the Lambertian lighting model
		#pragma surface surf Lambert

		// map the property to a vector4 of type fixed to use it inside the surf function
		fixed4 _Color;

		// declare the input structure we expect unity to pass to the surf-function
		struct Input {
			// add the uv coordinate for the main texture to the input structure
			float2 uv_MainTex;
			// add the vertex color of the rendered mesh to the input structure 
			// remark: the ': COLOR' tells Unity which mesh attribute it should put into the field
			float4 color : COLOR;
		};

		// actual surface function which does the color calculation
		void surf (Input IN, inout SurfaceOutput o) {
			// get the color value from the input structure and multiply it with the _Color property of the shader
			o.Albedo = IN.color.rgb * _Color.rgb;
			// for compatibility reasons (the shader can be easily made to support transparencies)
			// set the surfaces alpha value to the product of the vertex colors alpha and the _Color properties alpha value
			// it depends on how the alpha should be calculated (set by vertex color, set by the shader's property, a mixture of both or any other way)
			o.Alpha = IN.color.a * _Color.a;
		}
		ENDCG
	} 
	// set a fallback for Unity to use when the shader fails to compile/load for rendering 
	// disable this line when you want to test the shader's functionality
	FallBack "Diffuse"
}

Unity – Multicast – Class

The Multicast class is one of the more powerful tools I’m using in Ouroboros. Its functionality is quite simple but allows me to do physics checks in customized ways Unity itself does not offer. The initial idea behind the Multicast class was a generic class which handles raycasting in various geometric patterns just by providing a few simple parameters instead of doing tons of calculations. At the moment the Multicast class can cast rays in a circular and spherical manner, which can be adjusted to your needs to cover only a specific area of a circle or a sphere. This gives some advantages over Unity’s colliders or spherecasts, as those always collide with the complete shape.
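
This is not the package’s actual Multicast API (that ships in the download); just a minimal sketch of the underlying idea, fanning a set of raycasts over an arc of a circle:

using UnityEngine;

public static class CircularCastSketch
{
    // casts 'count' rays from 'origin', sweeping 'arcDegrees' around 'axis'
    // starting at 'forward'; returns true on the first hit found
    public static bool CastArc(Vector3 origin, Vector3 forward, Vector3 axis,
                               float arcDegrees, int count, float distance,
                               out RaycastHit hit)
    {
        float step = arcDegrees / Mathf.Max(1, count - 1);
        for (int i = 0; i < count; i++)
        {
            // rotate the base direction around the axis for each ray
            Vector3 dir = Quaternion.AngleAxis(step * i, axis) * forward;
            if (Physics.Raycast(origin, dir, out hit, distance))
                return true;
        }
        hit = default(RaycastHit);
        return false;
    }
}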

Download the unitypackage containing the Multicast class and an example script ready to use in your scene.
Multicast.unitypackage

How to test:

Drop the “Multicaster” component onto an object and look at it inside the editor. Play around a bit with the values of the “Multicaster” component and see what happens. You’ll see the Multicast draw colored lines which represent the rays it is going to cast.
For more detailed information on what each value does, you can either wait for the property’s tooltip to show up or look inside the Multicast.cs file and read what the variables are used for.

Remarks:

The Multicast class doesn’t do any black magic; it wraps the Unity Physics.Raycast() method in more comfortable ways to work with. You should take into account that the number of rays created by a multicast can increase really quickly.
You also might consider using the Multicast class itself rather than the Multicaster (see the component’s code for how it uses the Multicast) to reduce the overhead created by drawing all rays inside the editor’s scene view (the drawing is meant for debugging purposes and gets dropped in builds, but you will notice a performance drop when the editor has to handle a lot of line/ray drawing in edit mode).

Required additional code:

The Extensions class contains some helper functions, of which only the Compare(float, float, float) function is required. The function compares two floating point values for equality and applies an optional threshold to cope with the inaccuracies of the floating point type.
Unity will bother you with a compiler error (something about a type or method which cannot be found) if you don’t create an Extensions.cs file and paste the code below into it (or paste the code into another C# script file).

public class Extensions
{
    public static bool Compare(float value, float compare, float threshold)
    {
        if (threshold == 0.0f)
            return (value == compare);
        else
            return ((compare - threshold) <= value) && (value <= (compare + threshold));
    }
}
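
A quick usage sketch showing why the threshold matters (the values are illustrative):

using UnityEngine;

public class CompareExample : MonoBehaviour
{
    void Start()
    {
        float sum = 0.0f;
        for (int i = 0; i < 10; i++)
            sum += 0.1f; // accumulates rounding error, ends up at ~1.0000001

        Debug.Log(sum == 1.0f);                          // False: exact compare fails
        Debug.Log(Extensions.Compare(sum, 1.0f, 1e-5f)); // True: within threshold
    }
}
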
License


Microsoft Reciprocal License (Ms-RL)
This license governs use of the accompanying software. If you use the software, you accept this license. If you do not accept the license, do not use the software.
Definitions
The terms "reproduce," "reproduction," "derivative works," and "distribution" have the same meaning here as under U.S. copyright law.
A "contribution" is the original software, or any additions or changes to the software.
A "contributor" is any person that distributes its contribution under this license. "Licensed patents" are a contributor's patent claims that read directly on its contribution. Grant of Rights
(A) Copyright Grant- Subject to the terms of this license, including the license conditions and limitations in section 3, each contributor grants you a non-exclusive, worldwide, royalty-free copyright license to reproduce its contribution, prepare derivative works of its contribution, and distribute its contribution or any derivative works that you create.
(B) Patent Grant- Subject to the terms of this license, including the license conditions and limitations in section 3, each contributor grants you a non-exclusive, worldwide, royalty-free license under its licensed patents to make, have made, use, sell, offer for sale, import, and/or otherwise dispose of its contribution in the software or derivative works of the contribution in the software.
Conditions and Limitations
(A) Reciprocal Grants- For any file you distribute that contains code from the software (in source code or binary format), you must provide recipients the source code to that file along with a copy of this license, which license will govern that file. You may license other files that are entirely your own work and do not contain code from the software under any terms you choose.
(B) No Trademark License- This license does not grant you rights to use any contributors' name, logo, or trademarks.
(C) If you bring a patent claim against any contributor over patents that you claim are infringed by the software, your patent license from such contributor to the software ends automatically.
(D) If you distribute any portion of the software, you must retain all copyright, patent, trademark, and attribution notices that are present in the software.
(E) If you distribute any portion of the software in source code form, you may do so only under this license by including a complete copy of this license with your distribution. If you distribute any portion of the software in compiled or object code form, you may only do so under a license that complies with this license.
(F) The software is licensed "as-is." You bear the risk of using it. The contributors give no express warranties, guarantees, or conditions. You may have additional consumer rights under your local laws which this license cannot change. To the extent permitted under your local laws, the contributors exclude the implied warranties of merchantability, fitness for a particular purpose and non-infringement.

Unity – OctreeNode – class

This time I’m providing a base class to generate an octree. The OctreeNode represents a single node inside an octree which can be consecutively subdivided into further levels.
See http://en.wikipedia.org/wiki/Octree for some use cases of an octree.
The class is hardly optimized for performance and efficiency (memory consumption) yet, but it shows the basic approach to creating an octree.

The OctreeNode class is written in general C# but uses a Debug.DrawLine call to display its subdivisions inside the Unity3D game engine.

// Update is called once per frame
void Update()
{
    // creates the initial root node with a size of 128 world units on each axis
    OctreeNode node = new OctreeNode(this.transform.position, new Vector3(128, 128, 128));
    // calculates the octree down to depth level 2 (resulting in 3 octree levels in total, with 64 nodes at the deepest level)
    node.NextLevel(true, 2);
}

To test the octree inside Unity, add the above code to your C# script (it should do the trick in UnityScript/JavaScript as well, although you need to take care of the class creation and compilation yourself). The above code creates a three-level-deep octree with an overall size of 128 world units on all axes.
You also need to add the code below to an existing or new C# file (note: the OctreeNode class is not derived from MonoBehaviour).

using UnityEngine;

/// <summary>
/// OctreeNode class to generate a simple octree structure.
/// This can be adjusted to various requirements e.g. 3D space pathfinding
/// </summary>
public class OctreeNode
{
    // stores the current octree level of the node
    private int _level = 0;
    // stores the center position of the node
    private Vector3 _position;
    // stores the extends of the node
    private Vector3 _extends;
    // stores all subnodes of the node
    private OctreeNode[, ,] _subnodes = new OctreeNode[2, 2, 2];

    /// <summary>
    /// gets the current octree level
    /// </summary>
    public int Level
    { get { return _level; } }
    /// <summary>
    /// gets the current node position
    /// </summary>
    public Vector3 Position
    { get { return _position; } }
    /// <summary>
    /// gets the current node extends
    /// </summary>
    public Vector3 Extends
    { get { return _extends; } }
    /// <summary>
    /// gets a specific subnode
    /// </summary>
    /// <param name="x">index along the x axis (0 or 1)</param>
    /// <param name="y">index along the y axis (0 or 1)</param>
    /// <param name="z">index along the z axis (0 or 1)</param>
    /// <returns>the subnode at the specified indices</returns>
    public OctreeNode this[int x, int y, int z]
    {
        get { return _subnodes[x, y, z]; }
        set { _subnodes[x, y, z] = value; }
    }


    public OctreeNode(Vector3 position, Vector3 extends)
    {
        _position = position;
        _extends = extends;
    }

    private OctreeNode(int level, Vector3 position, Vector3 extends)
    {
        _level = level;
        _position = position;
        _extends = extends;
    }

    /// <summary>
    /// calculates the next level of octree nodes
    /// </summary>
    /// <param name="recursive">repeat the calculation on all descendant nodes</param>
    /// <param name="maxLevel">the maximum level to calculate octree nodes for (break condition for recursive calculation)</param>
    public void NextLevel(bool recursive = false, int maxLevel = 4)
    {
        Vector3 childPosition;
        Vector3 childExtend = _extends * 0.5f;
        Vector3 childExtendHalf = _extends * 0.25f;
        int nextLevel = _level + 1;

        for (int x = 0; x < 2; x++)
        {
            // calculate subnodes x position
            if (x == 0) childPosition.x = _position.x - childExtendHalf.x;
            else childPosition.x = _position.x + childExtendHalf.x;

            for (int y = 0; y < 2; y++)
            {
                // calculate subnodes y position
                if (y == 0) childPosition.y = _position.y - childExtendHalf.y;
                else childPosition.y = _position.y + childExtendHalf.y;

                for (int z = 0; z < 2; z++)
                {
                    // calculate subnodes z position
                    if (z == 0) childPosition.z = _position.z - childExtendHalf.z;
                    else childPosition.z = _position.z + childExtendHalf.z;

                    // insert subnode
                    _subnodes[x, y, z] = new OctreeNode(nextLevel, childPosition, childExtend);
                    // calculate next level if recursive is set and max octree level is not reached yet
                    if (recursive && nextLevel < maxLevel)
                        _subnodes[x, y, z].NextLevel(recursive, maxLevel);

                    // Debug drawing the line to the nodes (only do on the last octree level)
                    if (nextLevel == maxLevel)
                        Debug.DrawLine(this.Position, childPosition, new Color(x, y, z));
                }
            }
        }
    }
}

Unity – Brownian Tree (snippet)

This is a simple script which provides a Brownian tree reference implementation in C# targeting the Unity3D game engine.
The original C# code was taken from Rosettacode.org, removing the System.Drawing reference and reimplementing it using the Unity API.

using UnityEngine;
using System.Collections;
using System.IO;

public class BrownianTree : MonoBehaviour 
{
    // Use this for initialization
    void Start () 
    {
        // example use:
        // creating a brownian tree on a Texture2D and writing it as .png to the project's directory
        Texture2D brownianTreeTexture = BrownianTree.Create(256, 3000);
        byte[] png = brownianTreeTexture.EncodeToPNG();
        File.WriteAllBytes("brownianTree.png", png);
        GameObject.Destroy(brownianTreeTexture);
        Debug.Log("Finished writing brownian tree to file.");
    }
	
    public static Texture2D Create(int size, int numparticles)
    {
        Texture2D tex = new Texture2D(size, size);
        Rect bounds = new Rect(0,0,size, size);
        BrownianTree.FlushTexture(tex, Color.black);

        tex.SetPixel(Random.Range(0, size), Random.Range(0, size), Color.red);
        int ptX = 0; // current point x coordinate
        int ptY = 0; // current point y coordinate
        int newptX = 0; // new point x coordinate
        int newptY = 0; // new point y coordinate
        for (int i = 0; i < numparticles; i++)
        {
            // create the current point from two randomized numbers inside the range of 0 to size of texture
            ptX = Random.Range(0, size);
            ptY = Random.Range(0, size);
            // enter a do-while loop
            do
            {
                // create a new point (based on the current point) by adding -1, 0 or +1 to its x and y coordinates
                newptX = ptX + Random.Range(-1, 2); // what to add is determined by a random number inside the range of -1 to 1
                newptY = ptY + Random.Range(-1, 2); // what to add is determined by a random number inside the range of -1 to 1

                // check if the new point is not inside the texture bounds (is an invalid pixel coordinate)
                if (!bounds.Contains(new Vector2(newptX, newptY)))
                {
                    // randomize the current point coordinates again
                    ptX = Random.Range(0, size);
                    ptY = Random.Range(0, size);
                }
                // check if the new point's red channel value is above zero
                else if (tex.GetPixel(newptX, newptY).r > 0)
                {
                    // set the current point's color to red (which is just an example color)
                    tex.SetPixel(ptX, ptY, Color.red);
                    // break the do-while loop and iterate the next particle
                    break;
                }
                else
                {
                    // set the current point to the new point's coordinates
                    ptX = newptX;
                    ptY = newptY;
                }

            } 
            while (true);
        }
        // apply all changes made to the texture
        tex.Apply();
        // return the texture containing the brownian tree
        return tex;
    }

    /// <summary>
    /// flushes the given texture with the given color
    /// </summary>
    /// <param name="tex">texture to flush</param>
    /// <param name="flushColor">color to be flushed to the texture</param>
    public static void FlushTexture(Texture2D tex, Color flushColor)
    {
        // create an array of Color which is as big as the texture's raw pixel data
        Color[] pixels = new Color[tex.width * tex.height];
        // iterate over the array and set each 'pixel' to the flush color
        for (int i = 0; i < pixels.Length; i++)
            pixels[i] = flushColor;
        // set the texture's pixels to the array of colors
        tex.SetPixels(pixels);
        // apply the changes made to the texture
        tex.Apply();
    }

}

miLabel – Gizmo

So this is the first entry in our nuke section. This post is about the miLabel.gizmo, a node that works with the pmsk.rendering.RenderPass.ObjectID() function.
 
First let’s set up our environment (Windows only). To be able to use the gizmo in nuke we need to locate our .nuke directory, which should be found at:
Windows XP: “C:\Dokumente und Einstellungen\Dominik\.nuke” or “C:\Documents and Preferences\Dominik\.nuke” or
Windows 7: “C:\Benutzer\Dominik\.nuke” or “C:\Users\Dominik\.nuke”
 
Dominik is my local user name and you should replace it with your own. Also notice that the .nuke directory won’t be visible until you have turned on the option to show hidden files in your Windows Explorer settings.

Download the archive (here) and copy the miLabel.gizmo into the directory, then restart nuke. Afterwards you can access the gizmo by pressing “x” in the node graph and typing the node name (miLabel) into the textfield.
 
Well that’s it 🙂 have fun… just kidding, that was just how adding custom gizmos to nuke works. These steps can be adapted for most custom gizmos on this website.
 
Now let’s take a look at the node a little bit. As mentioned above this node works in conjunction with the objectID pass, so I assume you have at least 2 images (1 beauty and 1 objectID).

As you can see in the image above, simply connect the miLabel input to the read node holding your miLabel pass and you are ready to go. The node will give you an RGBA output with the mask for the selected miLabel.
The mask input is for masking parts out of the id with a roto node for example.

The AdditionalLabel should be used when you want to make a combined mask of 2 miLabel nodes. (Important: make sure you set anti alias = 0, otherwise you will get double anti-aliasing, which shows up as a semitransparent line at the intersection border of both objects.)

OK, that wasn’t that hard I assume, but until now nothing has happened, so what to do next?

If you double-click the node a little dot labeled screenPosition should appear in the viewer. Drag this dot onto the object you want to generate the mask of, then press the setID button in the node. If you look at the output of the miLabel node you should now have a mask of that object.

This can be used as a mask for color correction or other stuff; we will stick with color correction to keep it simple.

I applied a simple gain in the color correction to shift the color of the handlebar towards yellow.

So let’s go on with the node attributes. Below the setID you will find 3 color fields which hold the values of the mask. These shouldn’t be touched unless you want to mess up your mask.

Right below are the anti alias settings. The following picture shows the 3 different modes.

no anti alias = 0 // simple anti alias = 1 // advanced anti alias = 2

In the top picture we have a visible border around our object, due to the fact that we can’t anti-alias our miLabel pass, so we get some artifacts around our objects.

To get rid of this you can set anti alias to 1. This will blur the mask a little bit and should work for most cases. You can control the amount of blur with the blur slider (default 1 = 1 pixel blur). Values around 0.5–1.5 should give pleasant results.
 

If you set anti alias to 2 (bottom picture), you can also increase/decrease the size of the mask with the size slider. This is handy if your mask is too small (typically when your selected object lies behind another), so you can increase it a little bit and afterwards blur out the edges a little bit.
 

And last but not least: the masking channel is the channel at the mask input which should be used for masking; blend increases/decreases the mask intensity (still not sure if this works correctly).
 

Finally that’s it, hope you can get some nice things out of it.
We are pleased to see results created with this gizmo.
 
 

PointWorld – Gizmo

We now present a nuke gizmo which works in conjunction with the image(s) you get as rendering results when using the pmsk.rendering.RenderPass.PointWorld() class in Maya. The original script was given to us by our prof Timo Schnitt, big thanks to him.
Download the gizmo here.
 

For instructions on how to get a custom gizmo to run in nuke, take a look at the miLabel – Gizmo post.

In general it is a masking tool to create 3D masks for static objects during compositing stage.
You can specify a size and a shape, also different falloffs are possible, which makes it really handy.

The image below shows the basic setup for the connections to make. There are a few traps you need to be aware of.
Ignore the red error on the node; this happens from time to time because of missing masking channels.

Make sure you put your PointWorld pass/image into input 1 of the PointWorld node. Next, set the AOV Channel to PointWorld inside the node. After that you can grab the screenPoint and place it where you want the center of the mask, click the set Point button inside the node, and some values should appear inside the AOV Value r, g, b boxes. That’s it 🙂
If you want to pick another point, adjust the screenPoint and press set Point again.

Shouldn’t there be a mask? If you check the output there isn’t anything like a mask inside the rgb or alpha channel. The resulting mask comes in the mask channel, which makes it not that easy to visualise. The easiest way to do so is a colorcorrect node with the mask applied: dial the colorcorrect gain down or up and play with the settings of the PointWorld node to get the shape of the mask the way you want.
 
Explanation of available options:

Screen Point is the actual Pixel Position of the sampled pixel (Center of the mask).

Set Point will set the pixel sample to the actual pixel below the screenPoint dot.

AOV Channel : Choose the PointWorld Channel/Pass you have rendered out of maya.

AOV Value : Should be equal to the rgb data in the PointWorld Channel/Pass. Don’t mess with this; it’s there to spot errors if something isn’t working right.

Radius : Size of the box or sphere which is used to generate the mask. It is related to the world size, not to the actual pixels of the rendered image.

Shape : 0 = Box || 1 = Sphere; anything in between is treated as 0. I know the tool-tip says it’s the opposite way around, but I don’t want to change the node because it’s not my own child; perhaps in a later release I will make some improvements to it.

Blur Width : Anti-aliases the mask; most of the time you will have to use this. (The PointWorld pass is a technical pass, so you can’t anti-alias it at render time and you will get problems where objects overlap.)

Mask: Like the word says, here you can apply a mask to the resulting output to subtract areas. Make sure you connect the mask to the mask input of the node and choose the right channel. Multiply is used to dial the amount of the mask up or down.

mask by id: don’t touch this. These are special masks which get written out during normal rendering; they are company-internal and might be removed in future releases.

So that concludes this node.

 

pmsk.rendering – RenderPass.ObjectID

This class is used to create an image which includes a custom matte for each geometry object in the scene.

Make sure the MentalRay Plugin is loaded (Mayatomr.mll). Otherwise you will get an error.

To set it up you have to run:

import pmsk;
## creates the render pass for objectID
pmsk.rendering.RenderPass.ObjectID.CreateIDPass();
## iterate all objects and set ids
pmsk.rendering.RenderPass.ObjectID.AllocateIDs();

After the rendering you will find an extra .hdr file which should be named like: FileName_id.hdr.

Check out the nuke section to find a gizmo which will work with this pass.

At the moment you have to change the IDs by hand if you duplicate an object with an assigned miLabel. The id attribute can be found on the transform node under extra attributes.

Or you just set the IDs up right before you render, so there won’t be any duplicates.

pmsk.rendering – RenderPass.PointWorld

This class is used to create a PointWorld render pass for each object in the scene. This pass can then be used to do really cool stuff in nuke. We will add some nuke gizmos to use the pass later on.

So far here is the code you should run:

import pmsk;
## creating the render pass for pointWorld
renderPass = pmsk.rendering.RenderPass.PointWorld.CreateRenderPass();
## creating the sampler info to retrieve world position of each object
samplerInfo = pmsk.rendering.RenderPass.PointWorld.SamplerInfo4PointWorldPass();
## create color buffer node for render pass and connect all previously created nodes
pmsk.rendering.RenderPass.PointWorld.CreateColorBuffer(renderPass, samplerInfo);

The first line creates a customColorPass and registers it with the active render passes.
Afterwards we create a samplerInfo node which is used to get the world point position of each object.
With the last function call we create a writeToColorBuffer node for every object in the scene and connect everything together.