I've been banging my head against this problem for quite a while now and finally realized that I need serious help...
So basically I wanted to implement proper shadows in my project, which I'm writing in MonoGame. For this I wrote a deferred shader in HLSL using multiple tutorials, mainly ones for old XNA.
The problem is that although my lighting and shadows work for a spotlight, the light on the floor of my scene is very dependent on my camera, as you can see in the images: https://imgur.com/a/TU7y0bs
I tried many different things to solve this problem:
A bigger DepthBias widens the radius that is "shadow free" at the cost of massive peter panning, and the described issue is not fixed at all.
One paper suggested using an exponential shadow map, but I didn't like the results at all: the light bleeding was unbearable, and smaller shadows (like the one behind the torch on the wall) were not rendered.
I switched my GBuffer depth map to 1 - z/w to get more precision, but that did not fix the problem either.
I am using a
new RenderTarget2D(device, Width, Height, false, SurfaceFormat.Vector2, DepthFormat.Depth24Stencil8)
to store the depth from the light's perspective.
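For context, a minimal sketch of the pass that fills this target; scene.DrawDepth and lightDepthEffect are hypothetical names, and the depth effect is assumed to write length(lightPos - Position) / LightFarplane into the red channel:

device.SetRenderTarget(shadowMap);   // shadowMap is the RenderTarget2D created above
device.Clear(Color.White);           // initialize to maximum depth
scene.DrawDepth(lightDepthEffect, light.View, light.Projection); // hypothetical helper
device.SetRenderTarget(null);        // back to the backbuffer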
I calculate the shadow using this pixel shader function.
Note that I want to adapt this shader for point lights in the future - that's why I'm simply using length(LightPos - PixelPos).
SpotLight.fx - PixelShader
float4 PS(VSO input) : SV_TARGET0
{
// Fancy Lighting equations
input.ScreenPosition.xy /= input.ScreenPosition.w;
float2 UV = 0.5f * (float2(input.ScreenPosition.x, -input.ScreenPosition.y) + 1) - float2(1.0f / GBufferTextureSize.xy);
// Sample Depth from DepthMap
float Depth = DepthMap.Sample(SampleTypeClamp, UV).x;
// Getting the PixelPosition in WorldSpace
float4 Position = 1.0f;
Position.xy = input.ScreenPosition.xy;
Position.z = Depth;
// Transform Position to WorldSpace
Position = mul(Position, InverseViewProjection);
Position /= Position.w;
float4 LightScreenPos = mul(Position, LightViewProjection);
LightScreenPos /= LightScreenPos.w;
// Calculate Projected UV from Light POV -> ScreenPos is in [-1;1] Space
float2 LightUV = 0.5f * (float2(LightScreenPos.x, -LightScreenPos.y) + 1.0f);
float lightDepth = ShadowMap.Sample(SampleDot, LightUV).r;
// Linear depth model
float closestDepth = lightDepth * LightFarplane; // Depth is stored in [0, 1]; bring it to [0, farplane]
float currentDepth = length(LightPosition.xyz - Position.xyz) - DepthBias;
float ShadowFactor = step(currentDepth, closestDepth); // 1 if currentDepth <= closestDepth -> lit; 0 -> occluded, in shadow.
float4 phong = Phong(...);
return ShadowFactor * phong;
}
LightViewProjection is simply light.View * light.Projection.
InverseViewProjection is Matrix.Invert(camera.View * camera.Projection).
Phong() is a function I call to finalize the lighting.
The lightDepthMap simply stores length(lightPos - Position).
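For reference, a minimal sketch of how these might be computed and handed to the effect each frame; the parameter names come from the shader above, while light.Position and light.FarPlane are assumed properties:

Matrix lightViewProjection = light.View * light.Projection;
Matrix inverseViewProjection = Matrix.Invert(camera.View * camera.Projection);
effect.Parameters["LightViewProjection"].SetValue(lightViewProjection);
effect.Parameters["InverseViewProjection"].SetValue(inverseViewProjection);
effect.Parameters["LightPosition"].SetValue(light.Position);
effect.Parameters["LightFarplane"].SetValue(light.FarPlane);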
I'd like to get rid of the artifact shown in the pictures so that I can adapt the code to point lights as well.
Could this be a problem with the way I retrieve the world position from screen space, or does my depth map have too low a resolution?
Help is much appreciated!
--- Update ---
I changed my lighting shader to display the difference between the distance stored in the shadow map and the distance calculated on the spot in the pixel shader:
float4 PixelShaderFct(...) : SV_TARGET0
{
// Get Depth from Texture
float4 Position = 1.0f;
Position.xy = input.ScreenPosition.xy;
Position.z = Depth;
Position = mul(Position, InverseViewProjection);
Position /= Position.w;
float4 LightScreenPos = mul(Position, LightViewProjection);
LightScreenPos /= LightScreenPos.w;
// Calculate Projected UV from Light POV -> ScreenPos is in [-1;1] Space
float2 LUV = 0.5f * (float2(LightScreenPos.x, -LightScreenPos.y) + 1.0f);
float lightZ = ShadowMap.Sample(SampleDot, LUV).r;
float Attenuation = AttenuationMap.Sample(SampleType, LUV).r;
float ShadowFactor = 1;
// Linear depth model; lightZ stores (LightPos - Pos)/LightFarPlane
float closestDepth = lightZ * LightFarPlane;
float currentDepth = length(LightPosition.xyz - Position.xyz) - DepthBias;
return (closestDepth - currentDepth);
}
As I am basically outputting Length - (Length - Bias), one would expect an image with "DepthBias" as its color. But that is not the result I'm getting here:
https://imgur.com/a/4PXLH7s
Based on this result, I'm assuming that either I've got precision issues (which I find weird, given that I'm working with near and far planes of [0.1, 50]), or something is wrong with the way I'm recovering the world position of a given pixel from my depth map.
I finally found the solution, and I'm posting it here in case someone stumbles across the same issue:
The tutorial I used was for XNA / DX9. But as I'm targeting DX10+, a tiny change needs to be made:
In XNA / DX9 the UV coordinates are not aligned with the actual pixels and need to be aligned. That is what the - float2(1.0f / GBufferTextureSize.xy) in float2 UV = 0.5f * (float2(input.ScreenPosition.x, -input.ScreenPosition.y) + 1) - float2(1.0f / GBufferTextureSize.xy); was for. This is NOT needed in DX10 and above, and keeping it will result in the issue I had.
Solution:
UV Coordinates for a Fullscreen Quad:
For XNA / DX9:
input.ScreenPosition.xy /= input.ScreenPosition.w;
float2 UV = 0.5f * (float2(input.ScreenPosition.x, -input.ScreenPosition.y) + 1) - float2(1.0f / GBufferTextureSize.xy);
For MonoGame / DX10+:
input.ScreenPosition.xy /= input.ScreenPosition.w;
float2 UV = 0.5f * (float2(input.ScreenPosition.x, -input.ScreenPosition.y) + 1);
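For completeness, GBufferTextureSize in the DX9 variant is just the size of the G-buffer, set from the C# side; a sketch (the parameter name is taken from the shader above, everything else is assumed):

effect.Parameters["GBufferTextureSize"].SetValue(new Vector2(device.Viewport.Width, device.Viewport.Height));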
Related
I am getting 2D coordinates from a Vector3. I have to correct the positions to get the right results, and it seems I do get correct positions this way, but I do not know how to correct the position when the object is rotated.
Here is the WorldViewMatrix I apply my operations to. Those operations are not passed on to my VertexData, so I try to correct the positions afterwards:
WorldViewMatrix = Matrix.Scaling(Scale) * Matrix.RotationX(Rotation.X) * Matrix.RotationY(Rotation.Y) * Matrix.RotationZ(Rotation.Z) * Matrix.Translation(Position.X, Position.Y, Position.Z) * viewProj;
I am trying to correct it like:
public Vector2 Convert_3Dto2D(Vector3 position, Vector3 translation, Vector3 scale, Vector3 rotation, Matrix viewProj, RenderForm_EX form)
{
    position += translation;
    position = Vector3.Multiply(position, scale);
    //ROTATION ?
    var project = Vector3.Project(position, 0, 0, form.ClientSize.Width, form.ClientSize.Height, 0, 1, viewProj);
    Console.WriteLine(project.X + " " + project.Y);
    return new Vector2(project.X, project.Y);
}
What can I do to correct the rotated position?
If you can, post a little more information about "correct positions". I will take a stab at this and assume you want to move your vertex into world space, then work out what pixel it occupies.
Usually you order your multiplication as
Translate * Rotate * Scale;
and if you want the view-projection to apply correctly, I believe it should be at the start: V * (T * R * S).
The following link on gamedev.stackexchange goes into this: matrix order
Also, your Project takes in a Vector3 that has already been multiplied by the WVP matrix; I don't see you multiplying by it in your Convert_3Dto2D function.
Basically, execute a TRS matrix multiply on your original vertex, then multiply by your WVP matrix, then execute your Project. You will then get your screen-space pixel.
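To illustrate, a minimal sketch of Convert_3Dto2D along those lines, assuming the SharpDX math types already used in the question; the world matrix mirrors the Scale * RotationX * RotationY * RotationZ * Translation order of WorldViewMatrix above:

public Vector2 Convert_3Dto2D(Vector3 position, Vector3 translation, Vector3 scale, Vector3 rotation, Matrix viewProj, RenderForm_EX form)
{
    // Rebuild the same world transform that went into WorldViewMatrix.
    Matrix world = Matrix.Scaling(scale)
                 * Matrix.RotationX(rotation.X)
                 * Matrix.RotationY(rotation.Y)
                 * Matrix.RotationZ(rotation.Z)
                 * Matrix.Translation(translation);
    // Project with world * viewProj so the rotation is actually included.
    var project = Vector3.Project(position, 0, 0, form.ClientSize.Width, form.ClientSize.Height, 0, 1, world * viewProj);
    return new Vector2(project.X, project.Y);
}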
How do you map the points on a normalized sphere (radius of 0.5, center of (0, 0, 0)) to a fish-eye texture image? I am using C# and OpenGL. The results should be UV coordinates into the image. The sphere is simply a list of 3D coordinates, one for each vertex of the sphere, so each of these would get a UV coordinate into the image.
The end result would be a full sphere, with the fish-eye image wrapping all the way around 360 degrees when textured onto the sphere.
Example fish eye image:
There isn't a single way to map to a sphere. The texture in your post looks like the seam would be up in a nominated world plane. Compare that to a typical skydome-type texture, where the sphere usually joins at the bottom (where the camera can't see the join).
You might use shader code like this to map a point based on UV:
float3 PointOnSphere(float phi, float theta, float radius)
{
float3 pos = float3(0, 0, 0);
pos.x = radius * cos(phi) * sin(theta);
pos.y = radius * sin(phi) * sin(theta);
pos.z = radius * cos(theta);
return pos;
}
Or reverse that to get UV from a point on the surface:
float2 AngleFromPoint(float3 pos)
{
    float phi = atan2(pos.y, pos.x); // atan2 keeps the correct quadrant, unlike atan(pos.y / pos.x)
    float theta = atan2(length(pos.xy), pos.z); // equivalent to acos(pos.z / radius)
    return float2(phi, theta);
}
Which way is up and where the seam joins is something you'll have to work out yourself. If the texture looks compressed because the mapping is non-linear, you may need to try something like:
texcoord = pow(texcoord, 0.5f);
Edit: and obviously, normalise the angles to 0 - 1 for texture coordinates.
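As a small illustration of that normalization in C# (the ranges here - phi from atan2 in [-pi, pi], theta in [0, pi] - are assumptions matching the function above, not something prescribed by the answer):

Vector2 UVFromAngles(float phi, float theta)
{
    float u = (phi + (float)Math.PI) / (2f * (float)Math.PI); // phi in [-pi, pi] -> u in [0, 1]
    float v = theta / (float)Math.PI;                         // theta in [0, pi] -> v in [0, 1]
    return new Vector2(u, v);
}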
I am making a small RPG game and I'm currently working on window rendering (things like inventory/quest windows).
I draw the actual window with 9 quads textured from a "skin": 4 corners, 4 borders and the middle.
The problem is that, at arbitrary coordinates, a triangle or two shifts slightly downwards.
It happens consistently, and only if PointClamp or PointWrap is used. Other options work fine, but they give a blurry look.
The same bug actually happens with the green bar above; however, I "fixed" it by using Linear instead of Point.
I use this function to convert from pixel coordinates to screen coordinates:
Note: This function works as I originally expected in MonoGame.
public static Vector3 PixelToScreen(GraphicsDevice device, float X, float Y)
{
float xscale = (float)device.Viewport.Width / 2;
float yscale = (float)device.Viewport.Height / 2;
return new Vector3(((int)X / xscale) - 1f, 1f - ((int)Y / yscale), 0);
}
I suspect this function might be the source of my problem. Is there a "right way" to do it?
A picture is worth a thousand words, so here's a screenshot with the problem.
http://i.stack.imgur.com/rNsnH.png
I am quite sure the solution is trivial, but I just can't catch it.
UPDATE
After some more digging and researching, I finally found a solution. It turns out that it has something to do with how texture pixels are mapped to screen pixels.
UPDATE #2
After porting this code to MonoGame, I've noticed a different bug where everything looks "blurry". Curiously enough, removing the offset (reverting to the original function) fixes the problem!
The Fix
public static Vector3 PixelToScreen(GraphicsDevice device, float X, float Y)
{
X -= 0.5f; // Offset the "pixel value" by half a pixel
Y -= 0.5f; // To provide "expected results" use negative value
float xscale = (float)device.Viewport.Width / 2;
float yscale = (float)device.Viewport.Height / 2;
return new Vector3((X / xscale) - 1f, 1f - (Y / yscale), 0);
}
More on the topic:
Directly Mapping Texels to Pixels (Direct3D 9)
Understanding Half-Pixel and Half-Texel Offsets
This question is obviously resolved now, but I hope my findings will help someone stuck in the same situation.
Do not use these offsets in MonoGame, as it somehow takes care of the matter internally.
I am encountering an issue when updating a bullet's position when I rotate the ship that shoots it. Currently the code almost works, except that at most angles the bullet is not fired from where I want it to be. Sometimes it fires slightly above the centre where it should be, and sometimes below. I'm pretty sure it has something to do with the trigonometry, but I'm having a hard time finding the problem.
This code is in the constructor; it sets the bullet's initial position and rotation based on the ship's position (sp.position) and its rotation (Controls.rotation):
this.backgroundRect = new RotatedRectangle(Rectangle.Empty, Controls.rotation);
//get centre of ship
this.position = new Vector2(sp.getPosition().X + sp.getTexture().Width / 2,
                            sp.getPosition().Y + sp.getTexture().Height / 2);
//get blast pos
this.position = new Vector2(
    (float)(this.position.X + (sp.getTexture().Width / 2 * Math.Cos(this.backgroundRect.Rotation))),
    (float)(this.position.Y + (sp.getTexture().Width / 2 * Math.Sin(this.backgroundRect.Rotation))));
//get blast pos + texture height / 2
this.position = new Vector2(
    (float)(this.position.X + (this.texture.Height / 2 * Math.Cos(this.backgroundRect.Rotation))),
    (float)(this.position.Y + (this.texture.Height / 2 * Math.Sin(this.backgroundRect.Rotation))));
this.backgroundRect = new RotatedRectangle(new Rectangle((int)this.position.X, (int)this.position.Y,
    this.texture.Width, this.texture.Height), Controls.rotation);
speed = defSpeed;
Then in the update method I use the following code; I'm pretty sure this part is working fine, though:
this.position.X = this.position.X + defSpeed * (float)Math.Cos(this.getRectangle().Rotation);
this.position.Y = this.position.Y + defSpeed * (float)Math.Sin(this.getRectangle().Rotation);
this.getRectangle().CollisionRectangle.X = (int)(position.X + defSpeed * (float)Math.Cos(this.getRectangle().Rotation));
this.getRectangle().CollisionRectangle.Y = (int)(position.Y + defSpeed * (float)Math.Sin(this.getRectangle().Rotation));
Also, I should mention that my ship had not been rotating correctly, since the origin (center of rotation) was (0, 0). To remedy this I changed the origin to the center of the ship (width/2, height/2) and also added those values to the drawing rectangle so that the sprite would draw correctly. I imagine this could be the problem, and I suppose there's a better way of going about it. The game is in 2D, by the way. For reference, the draw call now looks roughly like the sketch below.
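(Reconstructed for illustration only; the overload and names are assumptions, not the actual code:)

spriteBatch.Draw(texture, destinationRect, null, Color.White, Controls.rotation,
    new Vector2(texture.Width / 2f, texture.Height / 2f), // origin = center of the sprite
    SpriteEffects.None, 0f);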
The problem is that the sprites are not drawn at the positions they are supposed to be. When calculating the center position, you assume that the sprite is not rotated, and this causes some trouble. Furthermore, rotating an arbitrary vector is not as easy as you did it: you can only rotate a vector by multiplying its components by sin/cos when it lies on the x-axis.
Here is how you can rotate arbitrary vectors:
Vector2 vecRotate(Vector2 vec, double phi)
{
    double length = vec.Length(); // Save length of vector
    vec.Normalize();
    double alpha = Math.Acos(vec.X); // Angle of vector against x-axis
    if (vec.Y < 0)
        alpha = 2 * Math.PI - alpha;
    alpha += phi;
    vec.X = (float)(Math.Cos(alpha) * length);
    vec.Y = (float)(Math.Sin(alpha) * length);
    return vec;
}
Make sure the angle is in radians.
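As an aside, the same rotation can be written without the acos/quadrant handling by using the standard 2D rotation formula; a minimal drop-in sketch:

Vector2 vecRotate(Vector2 vec, double phi)
{
    float cos = (float)Math.Cos(phi);
    float sin = (float)Math.Sin(phi);
    // Standard 2D rotation: (x, y) -> (x*cos - y*sin, x*sin + y*cos)
    return new Vector2(vec.X * cos - vec.Y * sin, vec.X * sin + vec.Y * cos);
}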
With that you can calculate the correct position of the ship:
Vector2 ShipCenter = sp.Position() + vecRotate(new Vector2(sp.getTexture().Width/2, sp.getTexture().Height/2), Controls.Rotation);
You can use the same function to determine the position of the bullet's center. I don't know exactly where you want to place the bullet; I will assume it is at the center of the right edge:
Vector2 offset = new Vector2(sp.getTexture().Width/2, 0);
this.position = ShipCenter + vecRotate(offset, Controls.Rotation);
If you then need the position of the upper left corner of the bullet, you can go further:
this.position -= vecRotate(new Vector2(this.texture.Width/2, this.texture.Height/2), Controls.Rotation);
However, as I stated, it is probably easier to handle the ship's and bullets' positions as central positions. Doing so, you need far less trigonometry.
Our game uses a 'block atlas': a grid of square textures which correspond to specific faces of blocks in the game. We're aiming to reduce vertex memory by storing texture data in the vertex as shorts instead of float2s. Here's our vertex definition (XNA, C#):
public struct BlockVertex : IVertexType
{
public Vector3 Position;
public Vector3 Normal;
/// <summary>
/// The texture coordinate of this block in reference to the top left corner of an ID's square on the atlas
/// </summary>
public short TextureID;
public short DecalID;
// Describe the layout of this vertex structure.
public static readonly VertexDeclaration VertexDeclaration = new VertexDeclaration
(
new VertexElement(0, VertexElementFormat.Vector3,
VertexElementUsage.Position, 0),
new VertexElement(sizeof(float) * 3, VertexElementFormat.Vector3,
VertexElementUsage.Normal, 0),
new VertexElement(sizeof(float) * 6, VertexElementFormat.Short2,
VertexElementUsage.TextureCoordinate, 0)
);
public BlockVertex(Vector3 Position, Vector3 Normal, short TexID)
{
this.Position = Position;
this.Normal = Normal;
this.TextureID = TexID;
this.DecalID = TexID;
}
// Describe the size of this vertex structure.
public const int SizeInBytes = (sizeof(float) * 6) + (sizeof(short) * 2);
VertexDeclaration IVertexType.VertexDeclaration
{
get { return VertexDeclaration; }
}
}
I have little experience with HLSL, but when I tried to adapt our current shader to use this vertex (as opposed to the old one, which stored the texture and decal IDs as Vector2 coordinates), I got nothing but transparent blue models, which I believe means that the texture coordinates for the faces are all the same?
Here's the HLSL where I try to interpret the vertex data:
int AtlasWidth = 25;
int SquareSize = 32;
float TexturePercent = 0.04; //fraction of the entire texture taken up by one atlas square (1 / AtlasWidth)
[snipped]
struct VSInputTx
{
float4 Position : POSITION0;
float3 Normal : NORMAL0;
short2 BlockAndDecalID : TEXCOORD0;
};
struct VSOutputTx
{
float4 Position : POSITION0;
float3 Normal : TEXCOORD0;
float3 CameraView : TEXCOORD1;
short2 TexCoords : TEXCOORD2;
short2 DecalCoords : TEXCOORD3;
float FogAmt : TEXCOORD4;
};
[snipped]
VSOutputTx VSBasicTx( VSInputTx input )
{
VSOutputTx output;
float4 worldPosition = mul( input.Position, World );
float4 viewPosition = mul( worldPosition, View );
output.Position = mul( viewPosition, Projection );
output.Normal = mul( input.Normal, World );
output.CameraView = normalize( CameraPosition - worldPosition );
output.FogAmt = 1 - saturate((distance(worldPosition,CameraPosition)-FogBegin)/(FogEnd-FogBegin)); //This sets the fog amount (since it needs position data)
// Convert texture coordinates to short2 from blockID
// When they transfer to the pixel shader, they will be interpolated
// per pixel.
output.TexCoords = short2((short)(((input.BlockAndDecalID.x) % (AtlasWidth)) * TexturePercent), (short)(((input.BlockAndDecalID.x) / (AtlasWidth)) * TexturePercent));
output.DecalCoords = short2((short)(((input.BlockAndDecalID.y) % (AtlasWidth)) * TexturePercent), (short)(((input.BlockAndDecalID.y) / (AtlasWidth)) * TexturePercent));
return output;
}
I changed nothing else from the original shader, which displayed everything fine; you can see the entire .fx file here.
If I could just debug the thing I might be able to get it, but... well, anyway, I think it has to do with my limited knowledge of how shaders work. I imagine my attempt to use integer arithmetic is less than effective. Also, that's a lot of casts; I could believe it if values got forced to 0 somewhere in there. In case I am way off the mark, here is what I aim to achieve:
Shader gets a vertex which stores two shorts as well as other data. The shorts represent an ID of a certain corner of a grid of square textures. One is for the block face, the other is for a decal which is drawn over that face.
The shader uses these IDs, as well as some constants which define the size of the texture grid (henceforth referred to as "Atlas"), to determine the actual texture coordinate of this particular corner.
The X of the texture coordinate is (ID % AtlasWidth) * TexturePercent, where TexturePercent is a constant representing the fraction of the entire texture taken up by one atlas square.
The Y of the texture coordinate is (ID / AtlasWidth) * TexturePercent.
How can I go about this?
Update: at some point I got the error "vs_1_1 does not support 8- or 16-bit integers" - could this be part of the issue?
output.TexCoords = short2((short)(((input.BlockAndDecalID.x) % (AtlasWidth)) * TexturePercent), (short)(((input.BlockAndDecalID.x) / (AtlasWidth)) * TexturePercent));
output.DecalCoords = short2((short)(((input.BlockAndDecalID.y) % (AtlasWidth)) * TexturePercent), (short)(((input.BlockAndDecalID.y) / (AtlasWidth)) * TexturePercent));
These two lines contain the following divisions:
(input.BlockAndDecalID.x) / (AtlasWidth)
(input.BlockAndDecalID.y) / (AtlasWidth)
These are both integer divisions. Integer division cuts off everything past the decimal point. I'm assuming AtlasWidth is always greater than each of the coordinates, so both of these divisions will always result in 0. If this assumption is incorrect, I'm assuming you still want that decimal data in some way?
You probably want a float division, something that returns a decimal result, so you need to cast at least one (or both) of the operands to a float first, e.g.:
((float)input.BlockAndDecalID.x) / ((float)AtlasWidth)
EDIT: I would use NormalizedShort2 instead; it scales your value by the max short value (32767), allowing "decimal-like" use of short values (in other words, 0.5 = 16383). Of course, if your intent was just to halve your texture coordinate data size, you can achieve this while still using floating point by using HalfVector2. If you still insist on Short2, you will probably have to scale it like NormalizedShort2 already does before submitting it as a coordinate.
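To make the NormalizedShort2 suggestion concrete: in the vertex declaration from the question, only the element format would change, and the shader would then read TEXCOORD0 as a float2 with each component scaled into [-1, 1]. A sketch under those assumptions, not tested against the asker's project:

new VertexElement(sizeof(float) * 6, VertexElementFormat.NormalizedShort2,
    VertexElementUsage.TextureCoordinate, 0)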
Alright, after I tried to create a short in one of the shader functions and got a compiler error stating that vs_1_1 doesn't support 8- or 16-bit integers (why didn't you tell me that before??), I changed the function to look like this:
float texX = ((float)((int)input.BlockAndDecalID.x % AtlasWidth) * (float)TexturePercent);
float texY = ((float)((int)input.BlockAndDecalID.x / AtlasWidth) * (float)TexturePercent);
output.TexCoords = float2(texX, texY);
float decX = ((float)((int)input.BlockAndDecalID.y % AtlasWidth) * (float)TexturePercent);
float decY = ((float)((int)input.BlockAndDecalID.y / AtlasWidth) * (float)TexturePercent);
output.DecalCoords = float2(decX, decY);
This seems to work just fine. I'm not sure if it's the change to ints or the (float) cast at the front; I think it might be the float, which means Scott was definitely on to something. My partner somehow wrote this shader to use a short2 as the texture coordinate; how he did that, I do not know.