Opengl OpenTK - White screen when drawing depth Buffer [duplicate] - c#

This question already has an answer here:
OpenGL 4.2 LookAt matrix only works with -z value for eye position
(1 answer)
Closed 2 years ago.
I am currently trying to add shadows with Shadow Mapping to my 3D Engine.
First I render the scene from the light's point of view and save the depth values in a texture. Then I use the default FBO to draw from that texture, just like in this tutorial.
The problem is that my screen stays white, no matter where I move.
GL.GetError() reports NoError, and the SSBOs I use in the vertex shader have the right values. GL.CheckFramebufferStatus() returns FramebufferCompleteExt.
This is how I create the FBO for depth values:
_depthMapFBO = GL.GenFramebuffer();
_depthMapFBOColorBuffer = BufferObjects.FBO_TextureAttachment(_depthMapFBO, PixelInternalFormat.DepthComponent, PixelFormat.DepthComponent, FramebufferAttachment.DepthAttachment, 1024, 1024);
GL.BindFramebuffer(FramebufferTarget.Framebuffer, _depthMapFBO);
GL.DrawBuffer(DrawBufferMode.None);
GL.ReadBuffer(ReadBufferMode.None);
====================================
public static int FBO_TextureAttachment(int FrameBuffer, PixelInternalFormat PixelInternalFormat, PixelFormat PixelFormat, FramebufferAttachment FramebufferAttachment, int Width, int Height)
{
// PixelInternalFormat = DepthComponent && PixelFormat = DepthComponent && FramebufferAttachment = DepthAttachment && Width, Height = 1024,
GL.BindFramebuffer(FramebufferTarget.Framebuffer, FrameBuffer);
int _texture = GL.GenTexture();
GL.BindTexture(TextureTarget.Texture2D, _texture);
GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat, Width, Height, 0, PixelFormat, PixelType.Float, IntPtr.Zero);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)All.Nearest);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (int)All.Nearest);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureWrapS, (int)All.Repeat);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureWrapT, (int)All.Repeat);
GL.FramebufferTexture2D(FramebufferTarget.Framebuffer, FramebufferAttachment, TextureTarget.Texture2D, _texture, 0);
return _texture;
}
In my Render function it looks like this:
GL.BindFramebuffer(FramebufferTarget.Framebuffer, _depthMapFBO);
GL.Clear(ClearBufferMask.DepthBufferBit);
GL.Viewport(0, 0, 1024, 1024);
_simpleDepthProgram.Use();
float _nearPlane = 1.0f, _farPlane = 100f;
_lightProjection = Matrix4.CreateOrthographicOffCenter(-100.0f, 100.0f, -100.0f, 100.0f, _nearPlane, _farPlane);
_ligthView = Matrix4.LookAt(_allLamps[0].Position, new Vector3(0f), new Vector3(0.0f, 1.0f, 0.0f));
_lightSpaceMatrix = _lightProjection * _ligthView;
GL.UniformMatrix4(21, false, ref _lightSpaceMatrix);
// Copy all SSBOs
GL.ActiveTexture(TextureUnit.Texture2);
GL.BindTexture(TextureTarget.Texture2D, _depthMapFBOColorBuffer);
Scene();
And the shader where I draw the depthMap:
#version 450 core
out vec4 FragColor;
uniform sampler2D scene;
uniform sampler2D bloomed;
uniform sampler2D depthMap;
uniform float zNear;
uniform float zFar;
float LinearizeDepth(float depth)
{
float z = depth * 2.0 - 1.0; // Back to NDC
return (2.0 * zNear * zFar) / (zFar + zNear - z * (zFar - zNear));
}
in vec2 TexCoord;
void main()
{
float depthValue = texture(depthMap, TexCoord).r;
//float depth = LinearizeDepth(gl_FragCoord.z) / far; // only for perspective
FragColor = vec4(vec3(depthValue), 1.0);
}

The computation of the _lightSpaceMatrix is wrong. The OpenTK matrix multiplication is reversed. See Problem with matrices #687:
Because of how matrices are treated in C# and OpenTK, multiplication order is inverted from what you might expect in C/C++ and GLSL. This is an old artefact in the library, and it's too late to change now, unfortunately.
Swap _ligthView and _lightProjection when you multiply the matrices. Instead of
_lightSpaceMatrix = _lightProjection * _ligthView;
use
_lightSpaceMatrix = _ligthView * _lightProjection;
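For illustration, a minimal sketch of the corrected setup, assuming the light-space uniform is still at layout location 21 as in the question. OpenTK's Matrix4 uses the row-vector convention (v' = v * M), so transforms compose left to right on the C# side, while the GLSL side keeps the usual column-vector order:
// OpenTK: view matrix first, projection last
Matrix4 lightProjection = Matrix4.CreateOrthographicOffCenter(-100.0f, 100.0f, -100.0f, 100.0f, 1.0f, 100.0f);
Matrix4 lightView = Matrix4.LookAt(_allLamps[0].Position, Vector3.Zero, Vector3.UnitY);
Matrix4 lightSpaceMatrix = lightView * lightProjection;
GL.UniformMatrix4(21, false, ref lightSpaceMatrix);
// GLSL (column vectors, usual order): gl_Position = lightSpaceMatrix * vec4(aPosition, 1.0);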

Related

How can I scale a texture to a given size in pixels with SharpDX.Direct3D9?

I'm currently trying to scale a texture to a given size in pixels via SharpDX.Direct3D9.
I have the following code, which draws a texture on the screen (2D):
public static bool DrawTexture(IntPtr device, IntPtr txt, RectangleF rect, float rotation, Color tint)
{
try {
Texture texture = (Texture)txt;
Matrix m = Matrix.Identity * Matrix.Translation(-0.5f, -0.5f, 0.0f) * Matrix.Scaling(rect.Width, rect.Height, 1.0f) * Matrix.RotationZ(rotation) * Matrix.Translation(rect.X, rect.Y, 0.0f);
using (Sprite s = new Sprite((Device)device)) {
s.Begin();
s.Transform = m;
s.Draw(texture, tint.ToRawColorBGRA());
s.End();
}
return true;
}
catch (Exception ex) {
Main.managerInstance.console.PrintError(string.Format("[Direct3D9] An error occured while trying to draw texture. Details: {0}", ex.ToString()));
}
return false;
}
Matrix.Scaling(rect.Width, rect.Height, 1.0f) is responsible for scaling my texture to the given size (128x128 pixels).
But as far as I understand, the Matrix.Scaling function takes a float where 1 is the full texture size and 2 would be double the texture size. But I would like to enter the size in pixels and not in units(?).
So I tried the following:
Size res = CGame.Resolution;
float cW = rect.Width / res.Width;
float cH = rect.Height / res.Height;
Matrix m = Matrix.Identity * Matrix.Translation(-0.5f, -0.5f, 0.0f) * Matrix.Scaling(cW, cH, 1.0f) * Matrix.RotationZ(rotation) * Matrix.Translation(rect.X, rect.Y, 0.0f);
I divide the given texture width and height (128x128 pixels) by the width and height of the current screen resolution (which in my case is 1920x1080).
This leaves me with the following:
The result of the division by the screen resolution
As you can see, there is a red rectangle in the image, which is actually 128x128 pixels in size, and behind it my texture, which is supposed to be scaled to 128x128 but is clearly larger than the red rectangle.
Here is how I load my texture:
// D3DX.DefaultNonPowerOf2 = -2
Texture t = Texture.FromFile(device, filePath, D3DX.DefaultNonPowerOf2, D3DX.DefaultNonPowerOf2, 1, Usage.None, Format.Unknown, Pool.Managed, Filter.None, Filter.None, 0);
If someone could help me out with this problem, I would be really grateful!
Got it working!
I needed to divide the target size of the texture by the actual texture size, like so:
SurfaceDescription sd = texture.GetLevelDescription(0);
float cW = rect.Width / sd.Width;
float cH = rect.Height / sd.Height;
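Folded back into the original draw call, the fix would look roughly like this (a sketch assembled from the snippets above, not re-tested against a live device; the Sprite drawing part stays as in the original code):
Texture texture = (Texture)txt;
// Scale factors are relative to the texture's native size, so dividing the
// target size (in pixels) by the texture size gives the correct factors.
SurfaceDescription sd = texture.GetLevelDescription(0);
float cW = rect.Width / sd.Width;
float cH = rect.Height / sd.Height;
Matrix m = Matrix.Identity
    * Matrix.Translation(-0.5f, -0.5f, 0.0f)
    * Matrix.Scaling(cW, cH, 1.0f)
    * Matrix.RotationZ(rotation)
    * Matrix.Translation(rect.X, rect.Y, 0.0f);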

Rotate an image in openTK

I want to rotate an image shown in my GLControl by 10 degrees. I rotated the bitmap in C# code and passed the rotated bitmap to the OpenGL shader code, but in the resulting image the rotated part is cut off. Do I need to change the viewport while rotating, or is it better to rotate the image in the shader code itself?
public void DrawImage(int image, int glcontrolWidth, int glcontrolHeight, Matrix4 transformMatrix)
{
GL.Viewport(new Rectangle(0, 0, glcontrolWidth, glcontrolHeight));
GL.MatrixMode(MatrixMode.Projection);
GL.PushMatrix();
GL.LoadIdentity();
GL.MatrixMode(MatrixMode.Modelview);
GL.PushMatrix();
GL.LoadIdentity();
GL.Disable(EnableCap.Lighting);
GL.Enable(EnableCap.Texture2D);
GL.ActiveTexture(TextureUnit.Texture0);
GL.BindTexture(TextureTarget.Texture2D, image);
GL.Uniform1(positionLocation1, 0);
RunShaders();
GL.Disable(EnableCap.Texture2D);
GL.PopMatrix();
GL.MatrixMode(MatrixMode.Projection);
GL.PopMatrix();
GL.MatrixMode(MatrixMode.Modelview);
}
public void RunShaders()
{
GL.UseProgram(program);
GL.UniformMatrix4(transformLocation, false, ref transformMatrix);
GL.DrawArrays(PrimitiveType.Triangles, 0, vertices.Length / 3);
ErrorCode ec = GL.GetError();
if (ec != 0)
System.Console.WriteLine(ec.ToString());
Console.Read();
}
public void Init()
{
CreateShaders();
CreateProgram();
InitBuffers();
}
public void CreateProgram()
{
program = GL.CreateProgram();
GL.AttachShader(program, vertShader);
GL.AttachShader(program, fragShader);
GL.LinkProgram(program);
}
public void InitBuffers()
{
buffer = GL.GenBuffer();
positionLocation = GL.GetAttribLocation(program, "a_position");
positionLocation1 = GL.GetUniformLocation(program, "sTexture");
transformLocation = GL.GetUniformLocation(program, "u_transform");
GL.EnableVertexAttribArray(positionLocation);
GL.BindBuffer(BufferTarget.ArrayBuffer, buffer);
GL.BufferData(BufferTarget.ArrayBuffer, (IntPtr)(vertices.Length * sizeof(float)), vertices, BufferUsageHint.StaticDraw);
GL.VertexAttribPointer(positionLocation, 3, VertexAttribPointerType.Float, false, 0, 0);
}
public void CreateShaders()
{
/***********Vert Shader********************/
vertShader = GL.CreateShader(ShaderType.VertexShader);
GL.ShaderSource(vertShader, @"attribute vec3 a_position;
varying vec2 vTexCoord;
uniform mat4 u_transform;
void main() {
vTexCoord = (a_position.xy+1)/2 ;
gl_Position = u_transform * vec4(a_position, 1);
}");
GL.CompileShader(vertShader);
/***********Frag Shader ****************/
fragShader = GL.CreateShader(ShaderType.FragmentShader);
GL.ShaderSource(fragShader, @"precision highp float;
uniform sampler2D sTexture_2;varying vec2 vTexCoord;
void main ()
{
vec4 color = texture2D (sTexture_2, vec2(vTexCoord.x, vTexCoord.y));
gl_FragColor =color;
}"); GL.CompileShader(fragShader);
}
Do not rotate the image, but rotate and scale the vertex coordinates.
Add a transformation matrix to the vertex shader:
attribute vec3 a_position;
varying vec2 vTexCoord;
uniform mat4 u_transform;
void main()
{
vTexCoord = (a_position.xy+1)/2;
gl_Position = u_transform * vec4(a_position, 1);
}
Get the location of the transformation matrix uniform (u_transform) after the program is linked.
int transformLocation = GL.GetUniformLocation(program, "u_transform");
Compute the scale dependent on the angle:
double diagonal = Math.Sqrt(bmp.Width * bmp.Width + bmp.Height * bmp.Height);
double dia_angle1 = Math.Atan2(bmp.Height, bmp.Width) + angle * Math.PI / 180;
double dia_angle2 = Math.Atan2(bmp.Height, -bmp.Width) + angle * Math.PI / 180;
double rot_w = Math.Max(Math.Abs(diagonal * Math.Cos(dia_angle1)), Math.Abs(diagonal * Math.Cos(dia_angle2)));
double rot_h = Math.Max(Math.Abs(diagonal * Math.Sin(dia_angle1)), Math.Abs(diagonal * Math.Sin(dia_angle2)));
double scale = Math.Min(bmp.Width / rot_w, bmp.Height / rot_h);
Define a transformation matrix that scales and rotates the image taking into account the aspect ratio:
Matrix4 transformMatrix =
Matrix4.CreateScale((float)scale) *
Matrix4.CreateScale(this.Width, this.Height, 1.0f) *
Matrix4.CreateRotationZ((float)(angle * Math.PI / 180)) *
Matrix4.CreateScale(1.0f / this.Width, 1.0f / this.Height, 1.0f);
Set the matrix uniform after the program has been made current (after GL.UseProgram):
GL.UniformMatrix4(transformLocation, false, ref transformMatrix);
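Tying it together on the caller side, a rough sketch: scale and angle come from the computation above, while glControl, renderer and textureId are placeholder names, not from the question.
// build the transform once per frame, then hand it to the existing draw path
Matrix4 transformMatrix =
    Matrix4.CreateScale((float)scale) *
    Matrix4.CreateScale(glControl.Width, glControl.Height, 1.0f) *
    Matrix4.CreateRotationZ((float)(angle * Math.PI / 180)) *
    Matrix4.CreateScale(1.0f / glControl.Width, 1.0f / glControl.Height, 1.0f);

// pass the unrotated bitmap's texture and let the vertex shader do the rotation
renderer.DrawImage(textureId, glControl.Width, glControl.Height, transformMatrix);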

OpenTK Text rendering without GL.Begin()

I am writing an application with OpenTK and got to the point where I want to render text. From examples, I patched together a version that creates a bitmap with the characters I need, using Graphics.DrawString(). That version works quite okay, but I am annoyed that it relies on GL.Begin(BeginMode.Quads) and GL.End() to render the text, which is why I want to use VAOs and VBOs from now on.
I am having a problem somewhere in my program, because I always get single-colored squares where the text should appear.
What I did so far to update my functions is the following:
I create the bitmap the same as before; I don't see why the problem should lie there.
After that I create a list of "Char" objects, each creating a VBO that stores the position and texture coordinates like this:
float u_step = (float)GlyphWidth / (float)TextureWidth;
float v_step = (float)GlyphHeight / (float)TextureHeight;
float u = (float)(character % GlyphsPerLine) * u_step;
float v = (float)(character / GlyphsPerLine) * v_step;
float x = -GlyphWidth / 2, y = 0;
_vertices = new float[]{
x/rect.Width, -GlyphHeight/rect.Height, u, v,
-x/rect.Width, -GlyphHeight/rect.Height, u + u_step, v,
-x/rect.Width, y/rect.Height, u + u_step, v + v_step,
x/rect.Width, -GlyphHeight/rect.Height, u, v,
-x/rect.Width, y/rect.Height, u + u_step, v + v_step,
x/rect.Width, y/rect.Height, u, v + v_step
};
_VBO = GL.GenBuffer();
GL.BindBuffer(BufferTarget.ArrayBuffer, _VBO);
GL.BufferData<float>(BufferTarget.ArrayBuffer, (IntPtr)(sizeof(float) * _vertices.Length),
_vertices, BufferUsageHint.DynamicDraw);
Next I generate a texture, set texture unit 0 as active and bind the texture as TextureTarget.Texture2D. Then I load the bitmap into the texture as follows:
_shader.Use();
_vertexLocation = _shader.GetAttribLocation("aPosition");
_texCoordLocation = _shader.GetAttribLocation("aTexCoord");
_fontTextureID = GL.GenTexture();
GL.ActiveTexture(TextureUnit.Texture0);
GL.BindTexture(TextureTarget.Texture2D, _fontTextureID);
using (var image = new Bitmap("container.png")) //_fontBitmapFilename))//
{
var data = image.LockBits(
new Rectangle(0, 0, image.Width, image.Height),
ImageLockMode.ReadOnly,
System.Drawing.Imaging.PixelFormat.Format32bppArgb);
GL.TexImage2D(TextureTarget.Texture2D,
0,
PixelInternalFormat.Rgba,
image.Width,
image.Height,
0,
OpenTK.Graphics.OpenGL.PixelFormat.Bgra,
PixelType.UnsignedByte,
data.Scan0);
}
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)TextureMinFilter.Linear);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (int)TextureMagFilter.Linear);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureWrapS, (int)TextureWrapMode.Repeat);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureWrapT, (int)TextureWrapMode.Repeat);
I now create a VAO, bind it, and bind one VBO to it. Then I set up the VertexAttribPointers to interpret the VBO data:
_VAO = GL.GenVertexArray();
GL.BindVertexArray(_VAO);
GL.BindBuffer(BufferTarget.ArrayBuffer, _charSheet[87].VBO);
GL.EnableVertexAttribArray(_vertexLocation);
GL.VertexAttribPointer(_vertexLocation, 2, VertexAttribPointerType.Float, false, 4 * sizeof(float), 0);
GL.EnableVertexAttribArray(_texCoordLocation);
GL.ActiveTexture(TextureUnit.Texture0);
GL.BindTexture(TextureTarget.Texture2D, _fontTextureID);
GL.VertexAttribPointer(_texCoordLocation, 2, VertexAttribPointerType.Float, false, 4 * sizeof(float), 2);
GL.BindVertexArray(0);
_shader.StopUse();
My render function starts by using the shader, binding the VAO and enabling the vertex attribute arrays. Then I bind the VBO (a fixed one for testing in this function) and call the VertexAttribPointer functions again, so that the VAO updates its data (I might be wrong thinking so). At the end I draw two triangles, which make a square where the letter should appear.
_shader.Use();
GL.BindVertexArray(_VAO);
GL.EnableVertexAttribArray(_texCoordLocation);
GL.EnableVertexAttribArray(_vertexLocation);
GL.BindBuffer(BufferTarget.ArrayBuffer, _charSheet[87].VBO);
GL.VertexAttribPointer(_vertexLocation, 2, VertexAttribPointerType.Float, false, 4 * sizeof(float), 0);
GL.ActiveTexture(TextureUnit.Texture0);
GL.BindTexture(TextureTarget.Texture2D, _fontTextureID);
GL.VertexAttribPointer(_texCoordLocation, 2, VertexAttribPointerType.Float, false, 4 * sizeof(float), 2);
GL.DrawArrays(PrimitiveType.Triangles, 0, 6);
GL.BindVertexArray(0);
_shader.StopUse();
I do not know where I'm going wrong in my program; maybe someone has a tip for me.
Vertex Shader
#version 330 core
layout(location = 0) in vec2 aPosition;
layout(location = 1) in vec2 aTexCoord;
out vec2 texCoord;
void main(void)
{
texCoord = aTexCoord;
gl_Position = vec4(aPosition, 0.0, 1.0);
}
Fragment Shader:
#version 330
out vec4 outputColor;
in vec2 texCoord;
uniform sampler2D texture0;
void main()
{
outputColor = texture(texture0, texCoord);
}
If a named buffer object is bound, then the last parameter of GL.VertexAttribPointer is treated as a byte offset into the buffer object's data store.
The offset has to be 2 * sizeof(float) rather than 2. Change
GL.VertexAttribPointer(_texCoordLocation, 2, VertexAttribPointerType.Float, false, 4 * sizeof(float), 2);
to
GL.VertexAttribPointer(_texCoordLocation, 2,
    VertexAttribPointerType.Float, false, 4 * sizeof(float), 2 * sizeof(float));
See glVertexAttribPointer and Vertex Specification.
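For reference, the complete attribute setup inside the VAO would then read roughly as follows (a sketch using the names from the question; both offsets are byte offsets into the interleaved x, y, u, v layout):
GL.BindVertexArray(_VAO);
GL.BindBuffer(BufferTarget.ArrayBuffer, _charSheet[87].VBO);

int stride = 4 * sizeof(float);                // x, y, u, v per vertex
GL.EnableVertexAttribArray(_vertexLocation);
GL.VertexAttribPointer(_vertexLocation, 2, VertexAttribPointerType.Float, false, stride, 0);
GL.EnableVertexAttribArray(_texCoordLocation);
GL.VertexAttribPointer(_texCoordLocation, 2, VertexAttribPointerType.Float, false, stride, 2 * sizeof(float));

GL.BindVertexArray(0);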

C# OpenTK - Draw string on window

I have a big problem: I have an OpenTK window open where I draw textures, images, etc. I have to make a little video game in this manner for a test, and I'd like to show text on it that displays game info.
So far I've only been able to open a Windows Form with text, and that's not what I need.
Is there a way to show text in an OpenTK window?
I can't use OpenTK 3.0, so QuickFont has to be excluded.
I can use the GL class.
Thank you very much!
One possibility would be to use the FreeType library to load a TrueType font into texture objects.
SharpFont provides cross-platform FreeType bindings for C#.
The source can be found at GitHub - Robmaister/SharpFont.
(x64 SharpFont.dll and freetype6.dll from MonoGame.Dependencies)
A full example can be found at GitHub - Rabbid76/c_sharp_opengl.
The example is based on LearnOpenGL - Text Rendering.
Load the font and glyph information for the characters and create a texture object for each character:
public struct Character
{
public int TextureID { get; set; }
public Vector2 Size { get; set; }
public Vector2 Bearing { get; set; }
public int Advance { get; set; }
}
// initialize library
Library lib = new Library();
Face face = new Face(lib, "FreeSans.ttf");
face.SetPixelSizes(0, 32);
// set 1 byte pixel alignment
GL.PixelStore(PixelStoreParameter.UnpackAlignment, 1);
// Load first 128 characters of ASCII set
for (uint c = 0; c < 128; c++)
{
try
{
// load glyph
//face.LoadGlyph(c, LoadFlags.Render, LoadTarget.Normal);
face.LoadChar(c, LoadFlags.Render, LoadTarget.Normal);
GlyphSlot glyph = face.Glyph;
FTBitmap bitmap = glyph.Bitmap;
// create glyph texture
int texObj = GL.GenTexture();
GL.BindTexture(TextureTarget.Texture2D, texObj);
GL.TexImage2D(TextureTarget.Texture2D, 0,
PixelInternalFormat.R8, bitmap.Width, bitmap.Rows, 0,
PixelFormat.Red, PixelType.UnsignedByte, bitmap.Buffer);
// set texture parameters
GL.TextureParameter(texObj, TextureParameterName.TextureMinFilter, (int)TextureMinFilter.Linear);
GL.TextureParameter(texObj, TextureParameterName.TextureMagFilter, (int)TextureMagFilter.Linear);
GL.TextureParameter(texObj, TextureParameterName.TextureWrapS, (int)TextureWrapMode.ClampToEdge);
GL.TextureParameter(texObj, TextureParameterName.TextureWrapT, (int)TextureWrapMode.ClampToEdge);
// add character
Character ch = new Character();
ch.TextureID = texObj;
ch.Size = new Vector2(bitmap.Width, bitmap.Rows);
ch.Bearing = new Vector2(glyph.BitmapLeft, glyph.BitmapTop);
ch.Advance = (int)glyph.Advance.X.Value;
_characters.Add(c, ch);
}
catch (Exception ex)
{
Console.WriteLine(ex);
}
}
Create a Vertex Array Object which draws a quad with two triangles:
// bind default texture
GL.BindTexture(TextureTarget.Texture2D, 0);
// set default (4 byte) pixel alignment
GL.PixelStore(PixelStoreParameter.UnpackAlignment, 4);
float[] vquad =
{
// x y u v
0.0f, -1.0f, 0.0f, 0.0f,
0.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 1.0f, 1.0f,
0.0f, -1.0f, 0.0f, 0.0f,
1.0f, 0.0f, 1.0f, 1.0f,
1.0f, -1.0f, 1.0f, 0.0f
};
// Create a Vertex Buffer Object (https://www.khronos.org/opengl/wiki/Vertex_Specification#Vertex_Buffer_Object)
_vbo = GL.GenBuffer();
GL.BindBuffer(BufferTarget.ArrayBuffer, _vbo);
GL.BufferData(BufferTarget.ArrayBuffer, 4 * 6 * 4, vquad, BufferUsageHint.StaticDraw);
// Vertex Array Object (https://www.khronos.org/opengl/wiki/Vertex_Specification#Vertex_Array_Object)
_vao = GL.GenVertexArray();
GL.BindVertexArray(_vao);
GL.EnableVertexAttribArray(0);
GL.VertexAttribPointer(0, 2, VertexAttribPointerType.Float, false, 4 * 4, 0);
GL.EnableVertexAttribArray(1);
GL.VertexAttribPointer(1, 2, VertexAttribPointerType.Float, false, 4 * 4, 2 * 4);
Furthermore, create a method which draws a string at a specified position in a given direction:
public void RenderText(string text, float x, float y, float scale, Vector2 dir)
{
GL.ActiveTexture(TextureUnit.Texture0);
GL.BindVertexArray(_vao);
float angle_rad = (float)Math.Atan2(dir.Y, dir.X);
Matrix4 rotateM = Matrix4.CreateRotationZ(angle_rad);
Matrix4 transOriginM = Matrix4.CreateTranslation(new Vector3(x, y, 0f));
// Iterate through all characters
float char_x = 0.0f;
foreach (var c in text)
{
if (_characters.ContainsKey(c) == false)
continue;
Character ch = _characters[c];
float w = ch.Size.X * scale;
float h = ch.Size.Y * scale;
float xrel = char_x + ch.Bearing.X * scale;
float yrel = (ch.Size.Y - ch.Bearing.Y) * scale;
// Now advance cursors for next glyph (note that advance is number of 1/64 pixels)
char_x += (ch.Advance >> 6) * scale; // Bitshift by 6 to get value in pixels (2^6 = 64 (divide amount of 1/64th pixels by 64 to get amount of pixels))
Matrix4 scaleM = Matrix4.CreateScale(new Vector3(w, h, 1.0f));
Matrix4 transRelM = Matrix4.CreateTranslation(new Vector3(xrel, yrel, 0.0f));
Matrix4 modelM = scaleM * transRelM * rotateM * transOriginM; // OpenTK `*`-operator is reversed
GL.UniformMatrix4(0, false, ref modelM);
// Render glyph texture over quad
GL.BindTexture(TextureTarget.Texture2D, ch.TextureID);
// Render quad
GL.DrawArrays(PrimitiveType.Triangles, 0, 6);
}
GL.BindVertexArray(0);
GL.BindTexture(TextureTarget.Texture2D, 0);
}
Vertex shader:
#version 460
layout (location = 0) in vec2 in_pos;
layout (location = 1) in vec2 in_uv;
out vec2 vUV;
layout (location = 0) uniform mat4 model;
layout (location = 1) uniform mat4 projection;
void main()
{
vUV = in_uv.xy;
gl_Position = projection * model * vec4(in_pos.xy, 0.0, 1.0);
}
Fragment shader:
#version 460
in vec2 vUV;
layout (binding=0) uniform sampler2D u_texture;
layout (location = 2) uniform vec3 textColor;
out vec4 fragColor;
void main()
{
vec2 uv = vUV.xy;
float text = texture(u_texture, uv).r;
fragColor = vec4(textColor.rgb*text, text);
}
See the example:
Matrix4 projectionM = Matrix4.CreateScale(new Vector3(1f/this.Width, 1f/this.Height, 1.0f));
projectionM = Matrix4.CreateOrthographicOffCenter(0.0f, this.Width, this.Height, 0.0f, -1.0f, 1.0f);
GL.ClearColor(0.2f, 0.3f, 0.3f, 1.0f);
GL.Clear(ClearBufferMask.ColorBufferBit);
GL.Enable(EnableCap.Blend);
GL.BlendFunc(BlendingFactor.SrcAlpha, BlendingFactor.OneMinusSrcAlpha);
text_prog.Use();
GL.UniformMatrix4(1, false, ref projectionM);
GL.Uniform3(2, new Vector3(0.5f, 0.8f, 0.2f));
font.RenderText("This is sample text", 25.0f, 50.0f, 1.2f, new Vector2(1f, 0f));
GL.Uniform3(2, new Vector3(0.3f, 0.7f, 0.9f));
font.RenderText("(C) LearnOpenGL.com", 50.0f, 200.0f, 0.9f, new Vector2(1.0f, -0.25f));

XNA: Orthographic Projection that matches Screen Coordinates

I am using XNA with SpriteBatch and custom-drawn vertices in parallel. The goal is to have the same coordinate system for both techniques.
That means I need a projection matrix that maps to screen coordinates: (0, 0) is in the top left screen corner, while width and height are determined by the screen resolution.
Matrix.CreateOrthographicOffCenter(0, width, 0, height, -1, 1);
Works well, but has the origin in the bottom-left corner.
Matrix.CreateOrthographicOffCenter(0, width, height, 0, -1, 1);
Does not display anything at all.
Trying to combine the first projection matrix with a translation and scaling y by -1 does not display anything at all either. Scaling by positive values works well, translation too. But as soon as I scale by a negative value I do not get any output at all.
Any ideas?
PS: For testing purposes, I am drawing vertices far beyond the screen coordinates, so I would at least see something if there were an error in the translation.
I use this code to initialize my 2D camera for drawing lines, and use a basic custom effect to draw.
Vector2 center;
center.X = Game.GraphicsDevice.Viewport.Width * 0.5f;
center.Y = Game.GraphicsDevice.Viewport.Height * 0.5f;
Matrix View = Matrix.CreateLookAt( new Vector3( center, 0 ), new Vector3( center, 1 ), new Vector3( 0, -1, 0 ) );
Matrix Projection = Matrix.CreateOrthographic( center.X * 2, center.Y * 2, -0.5f, 1 );
Effect
uniform float4x4 xWorld;
uniform float4x4 xViewProjection;
void VS_Basico(in float4 inPos : POSITION, in float4 inColor: COLOR0, out float4 outPos: POSITION, out float4 outColor:COLOR0 )
{
float4 tmp = mul (inPos, xWorld);
outPos = mul (tmp, xViewProjection);
outColor = inColor;
}
technique Lines
{
pass Pass0
{
VertexShader = compile vs_2_0 VS_Basico();
FILLMODE = SOLID;
CULLMODE = NONE;
}
}
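A sketch of how those matrices might be fed into the effect and used to draw lines (the effect and lineVertices variables are assumed to be loaded elsewhere; the parameter names match the .fx above):
// world is identity here; combine view and projection once per frame
effect.Parameters["xWorld"].SetValue(Matrix.Identity);
effect.Parameters["xViewProjection"].SetValue(View * Projection);

foreach (EffectPass pass in effect.CurrentTechnique.Passes)
{
    pass.Apply();
    // lineVertices is a VertexPositionColor[] in screen coordinates
    GraphicsDevice.DrawUserPrimitives(PrimitiveType.LineList, lineVertices, 0, lineVertices.Length / 2);
}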
