Porting a shader

Introduction

This document provides some guidelines for porting shaders between FX Composer, RenderMonkey, Mental Mill, ShaderFX, and Unity. Porting shaders is a manual process, but most of the work is just copy and paste; the main differences are slight semantic differences on each platform.

Shader Applications

There are several applications that facilitate shader development and prototyping:
  • ShaderFX - A 3ds Max plugin capable of exporting to FX Composer. Like Mental Mill, it is aimed at artists.
  • Mental Mill - An artist-oriented shader development application bundled with FX Composer; similar to ShaderFX, but free.
  • Unity - Our target shader platform.

Getting Shader Help

  • AMD - Developer forums

Resources

  • Google search for DxMaterial_Effect_format.htm, which has information on semantics for porting between 3ds Max and FX Composer.

Shader Porting Practice

I found it helpful to port the shader to FX Composer first; the FX Composer version makes a good baseline to compare your other ports against. Before porting to Unity, I ported to RenderMonkey so I could identify all the correct semantics to use. Lastly, I ported to Unity while checking the docs and chat for the Unity semantics.


RenderMonkey has a Stream Mapping node that controls how vertex data semantics are mapped into your vertex and fragment programs. You will likely need to double-click the stream mapping outside of the pass and add Normal, Tangent, and Binormal entries. They should all map to index 0, and use names like rmNormal, rmTangent, and rmBinormal.
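
A vertex input struct that matches such a stream mapping might look like the following. This is a minimal sketch and the member names are hypothetical; the rm* names live in the Stream Mapping editor, while the shader binds by semantic:

  struct VS_INPUT
  {
      float4 Position : POSITION0;   // stream mapping index 0
      float3 Normal   : NORMAL0;     // supplied by the rmNormal entry
      float3 Tangent  : TANGENT0;    // supplied by the rmTangent entry
      float3 Binormal : BINORMAL0;   // supplied by the rmBinormal entry
      float2 TexCoord : TEXCOORD0;
  };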


Note that when using vertex and fragment programs, Unity creates the ShaderLab binding parameters for you, so you should not provide that redundant (and possibly wrong) information in your shader.


Vertex and fragment programs have you define input and output structures. It is recommended that you use your vertex program's output structure as the fragment program's input structure as well; otherwise the interpolated values may not line up with what the fragment program expects.
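
A minimal Cg/HLSL sketch of the shared-struct pattern (the names are hypothetical, and how worldViewProj gets supplied depends on the platform):

  struct v2f
  {
      float4 pos : POSITION;
      float2 uv  : TEXCOORD0;
  };

  float4x4 worldViewProj; // supplied per platform (semantic, IDE variable, etc.)

  v2f vert(float4 vertex : POSITION, float2 texcoord : TEXCOORD0)
  {
      v2f o;
      o.pos = mul(worldViewProj, vertex);
      o.uv  = texcoord;
      return o;
  }

  // The fragment program takes the SAME struct, so its inputs are exactly
  // the semantics the vertex program wrote.
  half4 frag(v2f i) : COLOR
  {
      return half4(i.uv, 0, 1);
  }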

Using FX Composer

When using FX Composer you must first create a project.


Occasionally the FX Composer default layout gets scrambled, so you may want to restore it with View->Layouts->Reset layout.


In the Materials pane, right-click and select Add Material From File, then select your FX file (exported from ShaderFX, perhaps). Next, create some geometry to test the material (Create->Teapot), and drag your material from the Materials pane onto the teapot in the Render window.


Double-click a material and click the Editor tab in the top middle pane to see the code.


FX Composer .fx files keep their UI elements and shader code in the same file. They still use the "uniform" keyword to link the UI variables to the shader code.
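
A minimal .fx sketch of this layout; the parameter names and UI strings here are hypothetical:

  // The tweakable and its UI annotations live next to the shader code.
  uniform float4 DiffuseColor <
      string UIName   = "Diffuse Color";
      string UIWidget = "Color";
  > = float4(1, 1, 1, 1);

  uniform float4x4 WorldViewProjection : WORLDVIEWPROJECTION;

  float4 mainVS(float4 pos : POSITION) : POSITION
  {
      return mul(pos, WorldViewProjection);
  }

  float4 mainPS() : COLOR
  {
      return DiffuseColor; // tweaking the color in the UI updates this uniform
  }

  technique Main
  {
      pass p0
      {
          VertexShader = compile vs_2_0 mainVS();
          PixelShader  = compile ps_2_0 mainPS();
      }
  }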


FX Composer vertex and fragment programs can share helper functions that you only have to define once, which is nice.


FX composer can define uniform input parameters on the vertex and fragment program entry functions.
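
A minimal sketch of that pattern, with hypothetical names; the uniform entry-function argument is bound at compile time in the pass:

  uniform float4x4 gWorldViewProj : WORLDVIEWPROJECTION;

  float4 mainVS(float4 pos : POSITION,
                uniform float4x4 wvp) : POSITION
  {
      return mul(pos, wvp);
  }

  technique Main
  {
      pass p0
      {
          // The uniform parameter is supplied here.
          VertexShader = compile vs_2_0 mainVS(gWorldViewProj);
      }
  }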


Using RenderMonkey

RenderMonkey uses a single XML file to store the project and shaders, which is convenient.


You may end up with a lot of copied and pasted code if you have multiple passes that use the same functions. That's because you have to redefine the same function in every vertex or fragment program that uses it.


RenderMonkey cannot define uniform input parameters on the entry functions. Instead, place your uniform parameter declarations above your entry function (see the sketch below).


RenderMonkey requires that you define variables in the IDE for them to be detected as uniform parameters. This includes matrices like modelViewProjection etc.


Functions have to be defined before they can be used. You may already have the function defined, but need to move it above its first use.
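
A minimal RenderMonkey-style sketch combining these notes. The names are hypothetical, and it assumes matViewProjection has been added as a matrix variable in the workspace:

  // Declared at global scope, above the entry function, and also defined
  // as a matrix variable in the RenderMonkey IDE so it gets detected.
  float4x4 matViewProjection;

  // Helper defined BEFORE the entry function that calls it.
  float3 safeNormalize(float3 n)
  {
      return normalize(n);
  }

  struct VS_OUTPUT
  {
      float4 Position : POSITION0;
      float3 Normal   : TEXCOORD0;
  };

  VS_OUTPUT vs_main(float4 inPos : POSITION0, float3 inNormal : NORMAL0)
  {
      VS_OUTPUT o;
      o.Position = mul(matViewProjection, inPos);
      o.Normal   = safeNormalize(inNormal);
      return o;
  }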


Using Unity

Using Unity is very similar to using RenderMonkey in that functions need to be redefined in each pass, as sketched below.
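
A minimal ShaderLab sketch (the shader name and helper are hypothetical): each pass has its own CGPROGRAM block, so the helper must be defined twice:

  Shader "Hypothetical/TwoPassExample" {
      SubShader {
          Pass {
              CGPROGRAM
              #pragma vertex vert
              #pragma fragment frag
              #include "UnityCG.cginc"

              // Helper must be defined inside THIS pass's program block...
              half3 tint(half3 c) { return c * half3(1.0, 0.5, 0.5); }

              struct v2f { float4 pos : POSITION; };

              v2f vert(appdata_base v) {
                  v2f o;
                  o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                  return o;
              }

              half4 frag(v2f i) : COLOR { return half4(tint(half3(1,1,1)), 1); }
              ENDCG
          }
          Pass {
              CGPROGRAM
              #pragma vertex vert
              #pragma fragment frag
              #include "UnityCG.cginc"

              // ...and defined AGAIN here, because program blocks do not
              // share code between passes.
              half3 tint(half3 c) { return c * half3(0.5, 0.5, 1.0); }

              struct v2f { float4 pos : POSITION; };

              v2f vert(appdata_base v) {
                  v2f o;
                  o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                  return o;
              }

              half4 frag(v2f i) : COLOR { return half4(tint(half3(1,1,1)), 1); }
              ENDCG
          }
      }
  }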


Unity 2.1 supports Shader Model 3.0, but applies both the DirectX and OpenGL constraints to your shaders. The next version is planned to remove this constraint.


Including UnityCG.cginc provides some useful variables, like the camera eye position in object space. You'll notice that a lot of shaders calculate this value by multiplying matrices that don't currently seem to be available in Unity; those operations can be optimized out.
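
For instance, here is a minimal sketch assuming the Unity 2.x-era built-in _ObjectSpaceCameraPos (check the current built-in variable list if your version differs):

  CGPROGRAM
  #pragma vertex vert
  #pragma fragment frag
  #include "UnityCG.cginc"

  struct v2f {
      float4 pos     : POSITION;
      float3 viewDir : TEXCOORD0;
  };

  v2f vert(appdata_base v) {
      v2f o;
      o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
      // Object-space view direction, with no inverse-matrix math needed
      // in the shader itself.
      o.viewDir = _ObjectSpaceCameraPos.xyz - v.vertex.xyz;
      return o;
  }

  half4 frag(v2f i) : COLOR {
      return half4(normalize(i.viewDir) * 0.5 + 0.5, 1);
  }
  ENDCG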


Consult the manual for more built-in variable documentation.


For uniform properties to be detected, they must first be declared in the Properties section. Creating a new shader in Unity prepopulates some properties so you can see how this is done.
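
A minimal sketch of the pairing (the shader and property names are hypothetical):

  Shader "Hypothetical/PropertyExample" {
      Properties {
          _Color ("Main Color", Color) = (1, 1, 1, 1)
      }
      SubShader {
          Pass {
              CGPROGRAM
              #pragma vertex vert
              #pragma fragment frag
              #include "UnityCG.cginc"

              // Must match the property name declared above.
              uniform float4 _Color;

              struct v2f { float4 pos : POSITION; };

              v2f vert(appdata_base v) {
                  v2f o;
                  o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                  return o;
              }

              half4 frag(v2f i) : COLOR { return _Color; }
              ENDCG
          }
      }
  }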


Useful Forum Posts Related to Porting

  • Successfully ported a relief shader from ShaderFX to Unity (post).
  • I read the docs. I want more docs (post).