Author Archives: vcdearing

Site Move and Archiving

We have moved! Please visit my new site to view my up-to-date CV and portfolio.

That being said, since there are still some good blog posts and info on this site, I will be keeping it around as an archive from my time as a student at the Columbus College of Art & Design. Everything here is provided as-is, and no tools or downloads are being maintained. Thank you for your understanding, and please visit me at my new site listed above.


Posted by on July 27, 2017 in Uncategorized


Emblem Beta Release

After many redesigns, rewrites, and optimizations, I am finally releasing my new Maya tool Emblem in a beta test state. Emblem is a tool designed to quickly and painlessly generate texture maps by duplicating decals you input across an object’s surface. These decals have a random position and size (within a settable range), but the positions are hit tested against the object’s UV shells so they will always appear within those shells. Decals will never be generated in blank space (wasting file size) or across a UV seam (drawing attention to that seam).

Here is a picture demonstration:


As you can see, I generated 600 small blue dots, 300 medium green dots below the blue ones, and 150 larger red dots below both. Emblem uses alpha testing when placing images to allow them to overlap. All you need to do is run the tool and set your decal map save location, format, and sizes. Then add as many decal layers as you want, setting the number of decals each layer will have, the size variances, and the decal’s image (double clicking the source image location will load a file browser). Finally, select the object you want to generate the map for, target it with the target button, and hit the generate button. The tool will do its work and show you progress bars at each step of the way. Then you can grab the generated image from the location you specified and use it as you see fit.
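For readers curious about the overlap handling, here is a minimal, hypothetical sketch of alpha ("over") compositing in plain Python. It illustrates the idea of alpha-tested overlapping layers, not Emblem's actual code:

```python
# Minimal sketch of alpha-over compositing, as used when stacking
# overlapping decal layers onto one output map. Pixels are (r, g, b, a)
# tuples with components in 0.0-1.0. Hypothetical illustration only.

def composite_over(dst, src):
    """Blend a source pixel over a destination pixel (standard 'over' operator)."""
    sr, sg, sb, sa = src
    dr, dg, db, da = dst
    out_a = sa + da * (1.0 - sa)
    if out_a == 0.0:
        return (0.0, 0.0, 0.0, 0.0)
    out_r = (sr * sa + dr * da * (1.0 - sa)) / out_a
    out_g = (sg * sa + dg * da * (1.0 - sa)) / out_a
    out_b = (sb * sa + db * da * (1.0 - sa)) / out_a
    return (out_r, out_g, out_b, out_a)

# A fully opaque red decal over a blue background stays red.
print(composite_over((0.0, 0.0, 1.0, 1.0), (1.0, 0.0, 0.0, 1.0)))
```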

Some disclaimers:

  1. The program is still in a testing phase, so while it functions, there are no help files or extra documentation at the moment
  2. Due to the beta state, bugs, errors, and other inconsistencies may happen. Please report them to me so I can fix them
  3. Emblem’s GUI runs on PyQt. That means you will not be able to open the tool unless you have PyQt installed alongside your Maya install
  4. Emblem was built for Maya 2013 x64 and may not function, or may look odd, in other versions. I am very interested in seeing how it looks/works elsewhere, so if you try it in other versions of Maya please send me the results

Where to get PyQt:

A pre-built version of PyQt for Maya 2013 x64 can be found here: simply unzip the file Nathan has provided and drop the contents into C:\Program Files\Autodesk\Maya2013\Python\Lib\site-packages. This file will -NOT- work for the 32-bit version of Maya or any other Maya release.


Simply unzip the file linked below and place the Emblem folder into your Documents\Maya\Scripts directory. Then open Maya and run the script by opening your script editor, selecting Source Script, and running it from the Emblem folder you just placed.



Posted by on September 19, 2013 in Uncategorized


Random Texture Decal Propagation Overview Part I (WIP)

For my next “white paper” post on the texture processing system I developed, I want to talk about the random texture decal propagation script that is an integral part of that system. On the most basic level, the script generates a random point in UV space and hit tests it to see if that point is within a UV shell boundary of the currently selected object. If the point is within a shell boundary, a texture decal can be placed in that spot by moving the placement of the decal via its place 2D node. Now, simply repeat that process and connect all the decals with a layered texture node, and you have a texture stack of randomly placed decals. That stack can then be layered on top of the base textures if you want something like a bunch of stickers, or used to modulate textures, like adding random scratches across a reflection/specularity map. The script is more complicated than this, but that should give you a general idea of what is going on before I talk about each part in more detail.
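The generate-and-test loop described above can be sketched in plain Python. Note that `point_in_shell` here is a hypothetical stand-in for the real UV shell hit test (it just checks a rectangle), and the place 2D node wiring is reduced to collecting points:

```python
import random

# Sketch of the generate-and-test loop: keep generating random UV points
# and keep only those that land inside the shell. point_in_shell is a
# hypothetical stand-in for the real hit test described later in this post.

def point_in_shell(u, v):
    """Hypothetical shell test: a rectangular shell from (0.2, 0.2) to (0.8, 0.8)."""
    return 0.2 <= u <= 0.8 and 0.2 <= v <= 0.8

def generate_decal_points(count, max_tries=10000):
    """Return `count` random UV points that pass the shell hit test."""
    points = []
    tries = 0
    while len(points) < count and tries < max_tries:
        u, v = random.random(), random.random()
        if point_in_shell(u, v):
            # In the real script, a decal's place 2D node offset
            # would be set to (u, v) here.
            points.append((u, v))
        tries += 1
    return points

pts = generate_decal_points(50)
print(len(pts), "decal positions generated")
```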

Generating a random point is easy; the hard part is the UV hit testing. First things first, what does UV hit testing entail? The idea is to use a method called polygonal ray cast testing, which lets you check whether a point lies inside a polygon. The method walks around the boundary of the polygon, casting a ray from the test point along its y value. If that y value is above the highest point or below the lowest point of the line segment between two boundary points, that segment is ignored. If, however, the y value falls between the two points, a second test checks whether the test point is to the left of the spot where the ray crosses that segment. If it is, a counter is increased by one. After every line segment has been tested, the counter is checked: if it is even (including zero), the point is outside the polygon; if it is odd, the point is inside. The intuition is that a ray from an outside point crosses the boundary an even number of times (possibly zero), while a ray from an inside point must cross it an odd number of times.
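Here is a minimal Python sketch of that even-odd ray cast test. It is a generic implementation of the technique, not the script's actual code:

```python
# Even-odd ray-cast test: cast a ray in +x from the test point and count
# crossings with each polygon edge whose y-span brackets the point's y
# value. An odd crossing count means the point is inside. Perfectly
# horizontal edges are skipped by the mixed strict/non-strict comparison,
# which is one common way to handle that special case.

def point_in_polygon(pt, poly):
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Does this edge's y range bracket the test point's y value?
        if (y1 > y) != (y2 > y):
            # x coordinate where the edge crosses the ray's y level
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:  # test point is to the left of the crossing
                inside = not inside
    return inside

square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
print(point_in_polygon((0.5, 0.5), square))  # True
print(point_in_polygon((1.5, 0.5), square))  # False
```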

Here are a few sites that go over the process with some pictures:

-Note: There are other methods of doing polygonal hit testing, but this system is easy to conceptualize and implement, so it is the one I decided to use. Also, there are special cases for perfectly horizontal line segments that the script accounts for, but which I won’t go over here.

Now, what does all that have to do with UV hit testing? Well, if you take just the boundary edges of a UV shell, they form a closed-loop polygon that can be tested with the above method! The tricky part is isolating the boundary edges/UVs and organizing them so that you can walk them in order. I used a series of selections and selection conversions to isolate just the boundary edges. It is important to do this in series, or you will get junk edges that will mess up the testing process. You can view the process in the source code. Once you know the boundary edges, you can grab the boundary UVs, which are used for the ray cast testing just like the polygonal points above. The problem at this point is figuring out how to walk the UVs. If you do not walk them in the correct order, lines will be “drawn” across the UV shell, giving you incorrect intercept counts (an extra line will cause points inside to count as outside and vice versa). The situation is like a connect-the-dots image with no numbers (UV ids are arbitrary and not usable for this process), but with a cheat sheet showing the finished image next to it. It is easy for a person to walk, but much trickier for a computer to figure out.

In order to solve that problem, I developed an algorithm that does its best to figure out where to go next. The algorithm starts at a given UV from the list of UVs making up a UV shell boundary. The first step grabs every UV adjacent to the current UV by converting it to boundary edges and then back to UVs. From there the filtering begins. The first stage makes sure the new list of possible destination UVs are all in the current shell; it then removes both the current UV and the last UV tested. The second stage is designed to remove UVs that appear on the wrong side of a shell when the shell comes from a closed mesh, which causes the edges and vertices to wrap around to the other side of the shell (like a cylindrical projection). This stage converts the current and last UVs to their related vertices, then back into UVs; those UVs can then be removed, since the computer now knows they cannot be valid destinations. The third stage removes any points that have already been tested. Finally, the last stage guesses based on the closest point to the current UV. Even with that guess, the combination of the initial edge filtering and the UV filter sequence has proved very successful at walking UV shells. Errors can still occur, but the better the UV layout, the better the results. Bad in, bad out, as they say.
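The final "closest point" guess can be sketched in a few lines of Python (the candidate data here is hypothetical):

```python
import math

# Sketch of the final filtering stage: after the earlier stages have pruned
# the candidate list, guess the next boundary UV by picking the candidate
# closest to the current one. The UV values here are hypothetical.

def closest_uv(current, candidates):
    """Return the candidate UV with the smallest Euclidean distance to `current`."""
    cu, cv = current
    return min(candidates, key=lambda uv: math.hypot(uv[0] - cu, uv[1] - cv))

current = (0.50, 0.50)
candidates = [(0.52, 0.50), (0.90, 0.90), (0.10, 0.40)]
print(closest_uv(current, candidates))  # the nearest remaining candidate
```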

Here are some images illustrating what happens when trying to walk a base Maya sphere shell, and what the filter stages do: (Click to Enlarge)

Initial UV selection via boundary edges (6 possible UVs to walk to)


First Stage Filtering (down to 4 UVs)


Second Stage Filtering (down to 2 UVs)


Fourth Stage Filtering (closest UV chosen as destination)



Posted by on December 7, 2012 in Uncategorized


Distressed Texture Processing Overview

The distressed texture processing system I developed is part of a three-pronged attack on the challenge of differentiating many similar objects in a scene. The goal was to create a system capable of taking a single set of standard textures, meshes, and UVs, and making each object look unique. This component of my answer was a network that applies a sun damage and/or wear and tear texture processing effect to the base input textures. The other two parts create the functionality and organization to send different signals to each object via switches and modulation lines, and a decal propagation system that randomly generates texture stacks for each object. For this post, however, I want to focus solely on the distressed texture processing effect and how it is made.

Firstly, I want to talk about the general organizational structure I used to wrangle the network into an easy-to-use system. I used a series of nested asset containers to subdivide the different functions into makeshift objects; think object-oriented programming. I then created a series of null nodes spread throughout the network to hold all of my custom attributes for user input. From there, I simply published the attributes on those null nodes to the single wrapper asset that encloses the whole network. The network finally outputs the processed textures to another null node. This output method allows the network to be used with any render engine or individual shader, since it simply takes the textures in one end, processes them to add the effects, and sends them out the other end.

Here is the closed network, showing the consolidation of the network and its null node published attributes (very important, since null nodes do not appear in the hypershade tabs by default):

Click to Enlarge

Here is a picture of the master asset open, holding only the upper level nodes (note: these pictures are of an older build, so the final null output node is not shown):

Click to Enlarge

Next up is the overview of the dynamic sun damage system. The idea was to emphasize the difference in position and rotation of these many objects by highlighting a single facing angle with sun damage. The system uses a directional light’s transform to let a user point an arrow at a group of objects, and generates a rotation matrix from that transform. The matrix is then converted into a vector originating along the negative Z axis. That vector is projected onto the object’s vertex positions in object space (read from a sampler info node) using a vector dot product. The resulting value is passed through a set range node to apply the user’s additional spread input, and is then used to drive the V value of a V ramp. Ultimately, this process lets a user project a directional light’s direction onto a dynamically controlled ramp radiating out from the light’s intersection center, while affecting only the object’s surfaces that face the light. The ramp is projected in object space, so it is animation/transformation independent and will move with the object instead of swimming across its surface. It is then simply used in the final texture processing step as a screen blend over the final color output from the rest of the network, and its reverse is used as a multiply against the processed reflection gloss value. Thus, the effect desaturates and lightens a surface’s color, and dulls its reflection, in the area affected by the generated/projected sun damage ramp.
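A rough Python sketch of that projection math, with hypothetical values standing in for the light and vertex data (the remap mimics the clamped behavior of Maya's setRange node):

```python
# Sketch of the projection math described above: dot the light's forward
# vector with a vertex position (object space), then remap the result into
# 0-1 to drive the ramp's V coordinate. All values here are hypothetical.

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def set_range(value, old_min, old_max, new_min=0.0, new_max=1.0):
    """Linearly remap `value` from the old range to the new, clamped."""
    t = (value - old_min) / (old_max - old_min)
    t = max(0.0, min(1.0, t))
    return new_min + t * (new_max - new_min)

light_dir = (0.0, 0.0, -1.0)      # light aiming down -Z
vertex_pos = (0.0, 0.0, -2.0)     # a vertex facing the light
projected = dot(light_dir, vertex_pos)
ramp_v = set_range(projected, old_min=-1.0, old_max=3.0)  # spread is a user input
print(ramp_v)  # value driving the V ramp for this vertex
```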

Here is a picture of the sun damage network:

Click to Enlarge

You can see some sample images from a separate demo in another post on my blog here:

Next, I want to go over the more important texture processor: the wear and tear network. Its purpose is to provide a dynamic, post-texture-creation distressed effect. The system uses a simple texture map, white for on and black for off, to apply the effect, similar to a bump map; the map only provides a boundary for the effect, however. Three user-input values provide the dynamic nature of the effect, with a final option to work in coated or non-coated mode. First is the color processing amount, which works the same in both modes, and lightens and desaturates the base texture’s color map in the affected area. Next is the bump processing amount, which also works in both modes, and smooths out the base bump or normal map to make it flatter. Finally, there is the reflection processing amount, which lightens the base reflection map and increases the glossiness in non-coated mode, or darkens the base reflection and decreases the glossiness in coated mode. This is an important step that gives the effect of wearing off a surface coat (like varnished wood or a waxed car), or wearing down a solid material (like eroded stone or rubbed tarnished metal).
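As a rough illustration of the color processing step, here is one plausible per-pixel formulation in Python; the exact blend the Maya node network performs may differ:

```python
# Hypothetical sketch of the wear-and-tear color processing: lighten and
# desaturate the base color by a user amount, masked by the wear map value
# (white = full effect, black = none). One plausible formulation only.

def process_wear_color(rgb, mask, amount):
    r, g, b = rgb
    luma = (r + g + b) / 3.0          # simple desaturation target
    k = mask * amount                 # effect strength at this pixel
    # desaturate toward the luma value...
    r, g, b = [c + (luma - c) * k for c in (r, g, b)]
    # ...then lighten toward white
    r, g, b = [c + (1.0 - c) * k * 0.5 for c in (r, g, b)]
    return (r, g, b)

# A saturated red at full mask and full amount becomes a light gray.
print(process_wear_color((0.8, 0.2, 0.2), mask=1.0, amount=1.0))
```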

Here is the top wear and tear asset:

Click to Enlarge

Here is the fully opened asset:

Hopefully this explains what this part of the whole texture processing system can do. I will be making similar posts soon explaining the other two parts of the system, since they are just as important (if not more so) in differentiating multiple similar objects. I will leave this post with one last set of images showing highlights of the effects on the duck demo scene I made to showcase the whole system, as well as the fully opened network for that scene:

-Individual Texture Processing Demo

Click to Enlarge

-Texture Processing Highlights.  Blue is wear and tear, red is sun damage, and purple is decals

Click to Enlarge

-Fully opened texture processing network for this scene

Click to Enlarge

You can see this scene animated, all three components of this system in action, and more in my shading demo reel here:


Posted by on November 6, 2012 in Uncategorized


Shader R&D Update Post 3

I wanted to post up the current dev build of my Advanced Water Shading System.  I have moved the water surface and foam layer shader, as well as the control script, to version 1.0.  That means all three have primary development complete, and have had their code cleaned up, commented, and formatted.

I compiled the current build with a looping camera fly-through, and have posted it up here:

If you don’t have the Unity web player (and don’t want to download it), you can see a desktop stream here:

Keep in mind, the compiled version is much smoother and higher quality.

That build is using the current code, but still has the old terrain map in place. I have Jillian working on a new terrain map, and when it is done I will merge the system into it. I will consider that version my demonstration copy, and will release the finished system alongside it. I am basically ready to release the system now, but there is no documentation, so waiting for a demo copy before release gives me time to write the docs.

Here are the main feature improvements since the last post:

  1. New layered foam shader that runs along the coasts
  2. Improved animation controller with feathered ends instead of the old linear animations
  3. Set up texture and variable sharing between the water surface shader and the foam shader to reduce user setup time
  4. More organized shader and control script properties, to make them easier to work with
  5. Improved displacement system
  6. Individual height controls for each wave normal map
  7. Tweaked textures to disable some mip-mapping (for a sharper appearance at longer distances), and increased anisotropic filtering to counter as much of the light flickering over the water’s surface as possible

Posted by on October 1, 2012 in Uncategorized


Shader R&D Post 2

I have been working really hard on the wear and tear/sun damage systems for my new shader, and have managed to finish and integrate both of them.

For the sun damage system, I managed to create a technique using an Euler angle to direction vector conversion, via a rotation matrix. It lets a user correctly place the sun damage effect by pointing a directional light at the spot where they want the damage, and that damage will stay in place regardless of rotation/translation/scale (with no baking or caching required). Here are the results:

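As a sketch of the conversion described above (assuming XYZ rotation order and a light that points down its local -Z axis, as Maya directional lights do; the real network may use a different order):

```python
import math

# Sketch of the Euler-to-direction conversion: rotate the light's local
# forward vector (0, 0, -1) by XYZ Euler angles, applying the axis
# rotations in sequence. Rotation order XYZ is an assumption here.

def euler_to_direction(rx, ry, rz):
    """Rotate (0, 0, -1) by XYZ Euler angles given in radians."""
    cx, sx = math.cos(rx), math.sin(rx)
    cy, sy = math.cos(ry), math.sin(ry)
    cz, sz = math.cos(rz), math.sin(rz)
    x, y, z = 0.0, 0.0, -1.0          # the light's local forward vector
    # rotate about X
    y, z = y * cx - z * sx, y * sx + z * cx
    # rotate about Y
    x, z = x * cy + z * sy, -x * sy + z * cy
    # rotate about Z
    x, y = x * cz - y * sz, x * sz + y * cz
    return (x, y, z)

# No rotation: the light still points straight down -Z.
print(euler_to_direction(0.0, 0.0, 0.0))
```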
The wear and tear system has two modes (that a user can toggle between using a check box). The first mode uses additive reflectance to simulate rough materials that have been worn down to a smooth base. An example is rusted metal with a section worn down, showing the smooth metal underneath the rust. The second mode simulates coated materials that are worn down to a rougher base. An example would be a glossy varnished wooden table worn down to the raw wood underneath. Here are some samples of the wear and tear system in action:

The two systems have been merged together as well, so they can be used in combination:

Both systems have been combined and organized using Maya’s hypershade asset system.  This means the entire shading system is condensed down into one asset node.  I then published the attributes that control the main shader (an mia_X for the time being), and the two system controls.  Here is a screen grab of the asset node and accompanying controls:

I am now going to move on to the decal system and decal propagation script.  I should have at least that ready for this time next week; if not the distressed blend as well.  Once the shader system is done, I will move on to the more practical demonstrations of the shader using a better object, environment, and textures.


Posted by on September 25, 2012 in Uncategorized


Shader R&D Update Post 1

After figuring out the “too many texture interpreters” issue I was having, I have managed to wrap up the first draft of the second version of my Unity water shader.  Here are the major improvements I have added or changed since the first version:

  1. Added normal map stacking to allow three individual normal maps to be used to create the desired water effect (deep waves, sub-surface waves, and surface turbulence)
  2. I redesigned the color system to be completely different.  It now has a main and alternate color chosen by the user that is slowly blended between using a texture map.  This slow, undulating, blend allows for more detail in the water system, while also making it appear more dynamic than before.
  3. I added the displacement mapping back into the system.  The design is based on a single broad texture map that moves back and forth, moving the verts by fairly small user-controlled increments.  This new effect makes the water seem like it is slowly swelling up and down, as opposed to the old static plane it was before.
  4. I made some level of transparency the norm, unlike the old system that was meant to be completely opaque.  In addition, I made the transparency view-angle dependent, so the water is more transparent at direct angles, and less transparent at glancing angles.
  5. I also applied the view-angle calculations to the reflection cube that already existed in the old system.  This change makes reflections more realistic by making the surface more reflective at glancing angles, and less reflective at direct angles.
  6. The texture map movement script (which is separate from the main shader, but meant to be used in conjunction with it) has been upgraded to give the user more control over the direction and speed of each individual texture map that moves.
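Items 4 and 5 above both lean on a view-angle term; here is a plausible reconstruction in plain Python of how such a term could drive transparency and reflectivity (not the shader's actual code):

```python
# Hypothetical sketch of view-angle dependence: the dot product of the
# view direction (surface toward camera) and the surface normal is near 1
# at direct angles and near 0 at glancing angles, so it can blend
# transparency one way and reflectivity the other.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def view_angle_terms(view_dir, normal):
    """Return (transparency, reflectivity) from unit view and normal vectors."""
    facing = max(0.0, dot(view_dir, normal))   # 1 = head-on, 0 = glancing
    transparency = facing                      # more transparent at direct angles
    reflectivity = 1.0 - facing                # more reflective at glancing angles
    return transparency, reflectivity

# Looking straight down at an upward-facing surface:
print(view_angle_terms((0.0, 1.0, 0.0), (0.0, 1.0, 0.0)))
```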

All of these improvements have drastically increased the quality of the shader’s output.  There is still more work to be done however…

  1. I need to make a code cleaning pass to make sure it is organized and formatted well.
  2. I need to make sure the published properties (shader attributes) are organized and labeled well.
  3. I need to make a set of default property settings, game objects, and texture maps that, together, will be considered my “release package.” The idea is to be able to load up a scene and immediately have the shader up and running, without having to do a lot of work.
  4. Finally, my last major addition will be the development of a secondary shader designed to be layered on top of the current water shader. This secondary shader will be used for coastal effects. It will use the same terrain map as the main water shader to map an animated, tiled decal onto the coastal areas of the terrain. My current thought is to have the decal duplicated and animated through a slight expansion of the map’s UVs. This animation would create a cycle of growth, fading out of the now-larger map, and fading in of the now back-to-normal-size second map. The specific animation is subject to change once I see it in action, but this plan will work for now. I will be tying the animation controls into the existing water shader’s animation script, so as to keep things as centralized as possible.

Here is a short video capture of the new water shader in action:

Note that the terrain I am using is untextured and was made fairly quickly as a test bed. I will be getting a much better terrain for official demos and documentation photos in the near future.


Posted by on September 13, 2012 in Uncategorized


Website Redesign

I have spent the past few weeks fixing up the entire website, with a new theme, new portfolio pages, and complete reworkings of my MEL script pages.  You can check out the new pages by clicking “Online Portfolio” above, and the new script instructions can also be found above.  Both Synch Up and Real Scale Toolbox are up to date, but I have temporarily taken Sun and Sky Plus down as I fix a few issues with it and create a UI for it.  Check back soon for a brand new look and level of usability from Sun and Sky Plus.


Posted by on February 28, 2012 in Uncategorized


Real Scale Tool Box Version 1.4

I just uploaded the new version of Real Scale Tool Box to its page, under the tab of the same name at the top of this page. It contains extensive changes from the last version, so make sure to upgrade if you have been using it. Please check it out and send me any problems, concerns, or ideas you may have.


Posted by on October 20, 2011 in Uncategorized


Finished Movie

Here is the final composited version of the movie:


Posted by on May 11, 2011 in Uncategorized