Add Unity's Standard (Roughness) shader properties to custom shader? - c#

I have created a custom shader that is used to blend two materials easily. How do I include properties such as Normal map and emission properties and other properties as well from Unity's Standard (Roughness) shader?
Shader "Myshaders/ChangeMaterial" {
Properties {
_Tint ("Tint Color", Color) = (.9, .9, .9, 1.0)
_TexMat1 ("Base (RGB)", 2D) = "white" {}
_TexMat2 ("Base (RGB)", 2D) = "white" {}
_Blend ("Blend", Range(0.0,1.0)) = 0.0
}
Category {
ZWrite On
Alphatest Greater 0
Tags {Queue=Transparent}
Blend SrcAlpha OneMinusSrcAlpha
ColorMask RGB
SubShader {
Pass {
Material {
Diffuse [_Tint]
Ambient [_Tint]
}
Lighting On
SetTexture [_TexMat1] { combine texture }
SetTexture [_TexMat2] { constantColor (0,0,0,[_Blend]) combine texture lerp(constant) previous }
SetTexture [_TexMat2] { combine previous +- primary, previous * primary }
}
}
FallBack " Diffuse", 1
}
}

In your example you are using legacy fixed-function ShaderLab commands (Material, Lighting, SetTexture), which apply neither to the physically based rendering pipeline (contemporary) nor to the scriptable render pipeline (bleeding edge).
You should get familiar with the built-in shader code located at https://github.com/TwoTailsGames/Unity-Built-in-Shaders. Observe the #include statements located in the various passes here: https://github.com/TwoTailsGames/Unity-Built-in-Shaders/blob/master/DefaultResourcesExtra/Standard.shader
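As a rough sketch of the direction to head in (not the actual Standard (Roughness) source), the same two-texture blend could be rewritten as a surface shader that also exposes normal-map and emission properties. The shader name and the _BumpMap, _EmissionColor, _Glossiness and _Metallic property names below are placeholders of mine, and the Roughness variant of the Standard shader would additionally swap the smoothness input for a roughness map taken from the built-in sources linked above:

Shader "Myshaders/ChangeMaterialStandardSketch" {
    Properties {
        _Tint ("Tint Color", Color) = (1,1,1,1)
        _TexMat1 ("Base 1 (RGB)", 2D) = "white" {}
        _TexMat2 ("Base 2 (RGB)", 2D) = "white" {}
        _Blend ("Blend", Range(0.0,1.0)) = 0.0
        _BumpMap ("Normal Map", 2D) = "bump" {}
        _EmissionColor ("Emission", Color) = (0,0,0,1)
        _Glossiness ("Smoothness", Range(0,1)) = 0.5
        _Metallic ("Metallic", Range(0,1)) = 0.0
    }
    SubShader {
        Tags { "RenderType"="Opaque" }
        LOD 200

        CGPROGRAM
        // physically based Standard lighting model instead of the fixed-function passes
        #pragma surface surf Standard fullforwardshadows
        #pragma target 3.0

        sampler2D _TexMat1;
        sampler2D _TexMat2;
        sampler2D _BumpMap;
        fixed4 _Tint;
        fixed4 _EmissionColor;
        half _Glossiness;
        half _Metallic;
        half _Blend;

        struct Input {
            float2 uv_TexMat1;
            float2 uv_TexMat2;
            float2 uv_BumpMap;
        };

        void surf (Input IN, inout SurfaceOutputStandard o) {
            // blend the two albedo textures, then tint
            fixed4 c = lerp(tex2D(_TexMat1, IN.uv_TexMat1), tex2D(_TexMat2, IN.uv_TexMat2), _Blend) * _Tint;
            o.Albedo = c.rgb;
            // normal map and emission, analogous to the Standard shader's slots
            o.Normal = UnpackNormal(tex2D(_BumpMap, IN.uv_BumpMap));
            o.Emission = _EmissionColor.rgb;
            o.Metallic = _Metallic;
            o.Smoothness = _Glossiness;
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Diffuse"
}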

Related

Hiding UI elements behind an image

I am trying to hide my UI elements, such as text, images, etc., behind another transparent image. The problem is that I am using the latest Unity version and these custom shaders have no effect on the UI elements. The two custom shaders I have applied for testing are:
Shader "Custom/DepthReserve"
{
Properties
{
}
SubShader
{
Tags { "RenderType" = "Opaque" "Queue"="Geometry-1" }
LOD 100
Blend Zero One
Pass
{
}
}
}
and
Shader "Custom/InvisibleMask" {
SubShader {
// draw after all opaque objects (queue = 2001):
Tags { "Queue"="Geometry+1" }
Pass {
Blend Zero One // keep the image behind it
}
}
}
So is there a way to use a transparent image to hide text, images and even gameobjects?

Shaders are always shown with fixed values in the material editor? (Unity)

I'm trying to get a property value from a shader attached to an object at runtime, but it never changes in the material editor. Maybe I misunderstood how shaders work?
I read about GetFloat, GetColor, etc., but I haven't figured out yet how to properly use them to get information from a shader in Update(). The real objective here is to catch a specific value from the shader (in real time) and do something with it in a C# script, if that's possible.
C# example:
public Color colorInfo;
Renderer rend;

void Awake()
{
    rend = GetComponent<Renderer>(); // get the renderer
    rend.material.shader = Shader.Find("Unlit/shadowCreatures"); // assign the shader
}

void Update() // I want the info in real time
{
    // get the current color state of the shader
    colorInfo = rend.material.GetColor("_Color");
    // If I set up a white color in the shader properties, the color in the material editor is always white
}
Shader:
Shader "Unlit/shadowCreatures"
{
Properties {
_Color ("Color", Color) = (1,1,1,1)
[PerRendererData]_MainTex ("Sprite Texture", 2D) = "white" {}
_Cutoff("Shadow alpha cutoff", Range(0,1)) = 0.5
}
SubShader {
Tags
{
"Queue"="Geometry"
"RenderType"="TransparentCutout"
"PreviewType"="Plane"
"CanUseSpriteAtlas"="True"
}
LOD 200
Cull Off
CGPROGRAM
// Lambert lighting model, and enable shadows on all light types
#pragma surface surf Lambert addshadow fullforwardshadows
// Use shader model 3.0 target, to get nicer looking lighting
#pragma target 3.0
sampler2D _MainTex;
fixed4 _Color;
fixed _Cutoff;
struct Input
{
float2 uv_MainTex;
};
void surf (Input IN, inout SurfaceOutput o) {
fixed4 c = tex2D (_MainTex, IN.uv_MainTex) * _Color;
o.Albedo = c.rgb;
o.Alpha = c.a;
clip(o.Alpha - _Cutoff);
_Color = (0,0,0,0); //trying turn to black just for a test, and nothing happens
}
ENDCG
}
FallBack "Diffuse"
}
Thank you for your time
I think I figured out what you're trying to do, and that's not how things work, sorry.
Piecing together your comments and question, this is what I think you're trying to fiddle with:
void surf (Input IN, inout SurfaceOutput o) {
    _Color = (0,0,0,0); //trying turn to black just for a test, and nothing happens
}
That doesn't do what you think it does. That line sets this _Color:
#pragma target 3.0
sampler2D _MainTex;
fixed4 _Color; //here
fixed _Cutoff;
Not this one:
Properties {
    _Color ("Color", Color) = (1,1,1,1) //here
    [PerRendererData]_MainTex ("Sprite Texture", 2D) = "white" {}
    _Cutoff("Shadow alpha cutoff", Range(0,1)) = 0.5
}
That second one is what is shown in the inspector panel, and its linkage with the CGPROGRAM block is effectively one-way: the frag, surf, and geom functions are all called many times, in parallel, and rely on receiving the same input data, so the Properties value is read into the CGPROGRAM and the CGPROGRAM's values are discarded when it is done.
I don't think there's any way you can make the shader CGPROGRAM do anything that you can read from C# (because that code runs hundreds of times per frame, how would you know which invocation to read from?).
I know you can get at the properties (including changing a property for one instance or for all instances), but not the underlying CGPROGRAM data. The only way I can even think of getting around this would be to render to a texture and then read the texture data, and that would be slow (and again, you get into "which pixel has the value you want?").
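As a rough sketch of that property-level access (a minimal MonoBehaviour of my own, assuming the property declared as _Color in the Properties block), reading with GetColor and overriding a single instance with a MaterialPropertyBlock could look like this:

using UnityEngine;

public class ColorProbe : MonoBehaviour
{
    public Color colorInfo;
    Renderer rend;
    MaterialPropertyBlock block;

    void Awake()
    {
        rend = GetComponent<Renderer>();
        block = new MaterialPropertyBlock();
    }

    void Update()
    {
        // reads the declared property value, not whatever the CGPROGRAM wrote last frame
        colorInfo = rend.material.GetColor("_Color");

        // per-instance override without duplicating the material; note that a
        // property-block override is not reflected back by material.GetColor
        rend.GetPropertyBlock(block);
        block.SetColor("_Color", Color.Lerp(Color.white, Color.black, Mathf.PingPong(Time.time, 1f)));
        rend.SetPropertyBlock(block);
    }
}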

Render world space canvas first

I have a canvas (in world space / 3D) which is not visible to the camera because of this shader, which is applied to another (sphere) object.
Shader "Custom/TextureHolderCustom" {
Properties {
_MainTex ("Albedo (RGB)", 2D) = "white" {}
_Mask ("Mask Texture", 2D) = "white" {}
}
SubShader {
Tags{"Queue" = "Transparent"}
Cull Off
Lighting On
Zwrite On
Blend SrcAlpha OneMinusSrcAlpha
Pass
{
SetTexture[_Mask]{combine texture}
SetTexture[_MainTex]{combine texture,previous}
}
}
FallBack "Diffuse"
}
The sphere object my shader above is attached to shows first, but my canvas is not showing. I want to show the canvas first. I thought of attaching the above shader to my canvas, but it is not possible to apply m
I have zero experience in shader programming.

Fog Of War for Unity3D

I've been experimenting with a fog of war. I have been following this tutorial; apparently it is for Unity 4 and I'm using Unity 5. I'm currently getting this error:
Surface shader vertex function 'vert' not found
I read the comment section of the YouTube video and followed the suggestions there, but it still gives me the error. I tried the original version (the one in the video), but it makes my plane always render underneath, even though its Y axis is zero and the main map's Y axis is -10.
By the way, here is the code for my shader:
Shader "Custom/FogOWarShader" {
Properties {
_Color ("Main Color", Color) = (1,1,1,1)
_MainTex ("Base (RGB)", 2D) = "white" {}
}
SubShader {
Tags { "RenderType"="Transparent" "LightMode"="ForwardBase" }
Blend SrcAlpha OneMinusSrcAlpha
Lighting Off
LOD 200
CGPROGRAM
//#pragma surface surf NoLighting Lambert alpha:blend --the one that makes the map always on top
#pragma surface surf Lambert vertex:vert alpha:blend
fixed4 LightingNoLighting(SurfaceOutput s, fixed3 lightDir, float aten){
fixed4 color;
color.rgb = s.Albedo;
color.a = s.Alpha;
return color;
}
fixed4 _Color;
sampler2D _MainTex;
struct Input {
float2 uv_MainTex;
};
void surf (Input IN, inout SurfaceOutput o) {
half4 baseColor = tex2D (_MainTex, IN.uv_MainTex);
o.Albedo = _Color.rgb * baseColor.b;
o.Alpha = _Color.a - baseColor.g;
}
ENDCG
}
FallBack "Diffuse"
}
You have a line in that shader which says:
#pragma surface surf Lambert vertex:vert alpha:blend
So you say you have a vertex shader called vert, but then there is no such function in the rest of your code. That is what it's complaining about.
Having taken a look at the actual shader, all you need to do is set up the blending. You can do that by modifying the pragma that is in the original shader to
#pragma surface surf NoLighting noambient alpha:blend
That should do the trick.
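For reference, here is a sketch of the whole shader with that change applied, keeping your surf and LightingNoLighting functions and dropping the vertex:vert directive (the "Queue"="Transparent" tag is my assumption, so the transparent plane sorts after the opaque map):

Shader "Custom/FogOWarShader" {
    Properties {
        _Color ("Main Color", Color) = (1,1,1,1)
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType"="Transparent" "Queue"="Transparent" }
        Blend SrcAlpha OneMinusSrcAlpha
        Lighting Off
        LOD 200

        CGPROGRAM
        // custom unlit lighting model; no vertex:vert, so no vert function is needed
        #pragma surface surf NoLighting noambient alpha:blend

        fixed4 LightingNoLighting(SurfaceOutput s, fixed3 lightDir, fixed atten) {
            fixed4 color;
            color.rgb = s.Albedo;
            color.a = s.Alpha;
            return color;
        }

        fixed4 _Color;
        sampler2D _MainTex;

        struct Input {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutput o) {
            half4 baseColor = tex2D(_MainTex, IN.uv_MainTex);
            o.Albedo = _Color.rgb * baseColor.b;
            o.Alpha = _Color.a - baseColor.g;
        }
        ENDCG
    }
    FallBack "Diffuse"
}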

HLSL shaders, how to draw a model with its original colors?

I made this very simple 3D model exported to .x format with Google SketchUp; the model is a simple cube and has no textures, only a different color for each of its faces.
I have written this HLSL code to try to render this model with its original colors, but it doesn't work: I get a "The current vertex declaration does not include all the elements required by the current vertex shader. Color0 is missing" error when calling the ModelMesh.Draw() method. Here is my HLSL code:
float4x4 World;
float4x4 View;
float4x4 Projection;

struct AppToVertex
{
    float4 Position : POSITION0;
    float4 Color : COLOR0;
};

struct VertexToPixel
{
    float4 Position : POSITION0;
    float4 Color : COLOR0;
};

VertexToPixel ColoredVS(AppToVertex input)
{
    VertexToPixel output = (VertexToPixel)0;
    float4 iTransformed = mul(input.Position, World);
    float4 iView = mul(iTransformed, View);
    float4 iProjection = mul(iView, Projection);
    output.Position = iProjection;
    output.Color = input.Color;
    return output;
}

float4 ColoredPS(VertexToPixel input) : COLOR
{
    return input.Color;
}

technique Colored
{
    pass Pass0
    {
        VertexShader = compile vs_2_0 ColoredVS();
        PixelShader = compile ps_2_0 ColoredPS();
    }
}
This is probably a noob question and I am aware that I could draw this model by simply using the BasicEffect class (which works), but I am doing this just for learning HLSL, and up to now all I was able to do was draw a model with another color, defined in the shader, all over the model :[
This is the first time I've looked at a .X file, but I don't believe that SketchUp is exporting vertex colours for your model.
(Someone who is more experienced with this format please correct me if you see anything wrong!)
template Mesh {
    <3D82AB44-62DA-11cf-AB39-0020AF71E433>
    DWORD nVertices;
    array Vector vertices[nVertices];
    DWORD nFaces;
    array MeshFace faces[nFaces];
    [...]
}

template MeshVertexColors {
    <1630B821-7842-11cf-8F52-0040333594A3>
    DWORD nVertexColors;
    array IndexedColor vertexColors[nVertexColors];
}
Looking at your model's .X 'declarations', I can see the mesh's geometry and channels such as colours and normals are stored as separate 'objects' (just like they are in the DOM in XNA's Content Pipeline).
Whereas your model has definitions for position data (Mesh), normal data (MeshNormals) and texture coords (MeshTextureCoords), all defined below the templates section, I can see no section of the type MeshVertexColors.
Instead it appears that four Materials are defined and applied to your model's faces using the
template MeshMaterialList {
    <F6F23F42-7686-11cf-8F52-0040333594A3>
    DWORD nMaterials;
    DWORD nFaceIndexes;
    array DWORD faceIndexes[nFaceIndexes];
    [Material]
}
section.
So you don't have any vertex colours for ModelProcessor to import, or your shader to display.
There is probably an option in SketchUp to enable vertex colours. However, if you can't find it or don't want to look for it, you can add and remove vertex channels in the Content Pipeline by creating a Content Processor in a Content Pipeline Extension project.
(I know this sounds really unpleasant but actually this is a really great feature when you get used to it!)
There are quite a few tutorials online about this but in a nutshell:
Go into VS, right click, Add New Project > Content Pipeline Extension Library
In that new project, add a new Content Processor item, something along the lines of that below.
Recompile, and then in the Content Processor field of your model's .X file entry in the Content project, select your new processor (by the name you gave it in the DisplayName attribute).
What this does is essentially intercept the data from your .X file after it has been imported into XNA's Document Object Model and before it is processed into a new Model class. The processor checks for the presence of a vertex colour channel and, if one is not found, generates and adds it before the ModelProcessor creates the VertexDeclaration and packs all the vertex data into a VertexBuffer to be loaded by the GPU.
This will just display the cube as solid purple, but at least you can see whether your shader is working.
using System.Collections.Generic;
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Content.Pipeline;
using Microsoft.Xna.Framework.Content.Pipeline.Graphics;
using Microsoft.Xna.Framework.Content.Pipeline.Processors;

[ContentProcessor(DisplayName = "Processor to make sure we have vertex colours in all models")]
public class Character_Model_Processor : ModelProcessor
{
    public override ModelContent Process(NodeContent input, ContentProcessorContext context)
    {
        foreach (NodeContent c in input.Children)
        {
            if (c is MeshContent)
            {
                foreach (GeometryContent g in (c as MeshContent).Geometry)
                {
                    //Stop here and check out the VertexContent object (g.Vertices) and the Channels member of it to see how
                    //vertex data is stored in the DOM, and what you can do with it.
                    System.Diagnostics.Debugger.Launch();
                    AddVertexColorChannel(g.Vertices);
                }
            }
        }
        ModelContent model = base.Process(input, context);
        return model;
    }

    private void AddVertexColorChannel(VertexContent content)
    {
        if (content.Channels.Contains(VertexChannelNames.Color(0)) == false)
        {
            // no colour channel was exported, so fill one with a placeholder colour
            List<Microsoft.Xna.Framework.Color> VertexColors = new List<Microsoft.Xna.Framework.Color>();
            for (int i = 0; i < content.VertexCount; i++)
            {
                VertexColors.Add(Color.Purple);
            }
            content.Channels.Add(VertexChannelNames.Color(0), VertexColors);
        }
    }
}
Good luck, and sorry if you already knew most of that. I guess you probably know all about the pipeline modifications if you are writing your own shaders, but better safe than sorry :)
EDIT: Just a note about that System.Diag... line: if you break into the debugger, make sure that when you're done exploring you hit F5 and let it run its course, otherwise it takes your main VS window with it when it exits - not a problem, just really annoying when you forget to comment it out ;)
