How do I scale the graphics of a game? - c#

I'm making a game in C# and XNA 4.0. It's pretty much finished, but now I want to add settings so that players can change the window size if they want to. The current setup goes like this:
void Initialize()
{
    // The window size is initially 800x480
    graphics.PreferredBackBufferWidth = 800;
    graphics.PreferredBackBufferHeight = 480;
    graphics.ApplyChanges();
}
void Update()
{
    // If the player completes an action, the window size is changed
    if (windowSizeChanged)
    {
        graphics.PreferredBackBufferWidth = 1024;
        graphics.PreferredBackBufferHeight = 720;
        graphics.ApplyChanges();
    }
}
Using this code, this is what the game looks like at two specific resolutions, 800x480 and 1024x720:
As you can hopefully see, when the window size is changed it does not affect the actual graphics of the game. The sprites and hitboxes of all of the objects stay the same size, so they instead fill up a small box in the corner of the screen rather than the entire window. Can anyone tell me how I can scale the sprites so that they fill up the window? I assume I would need to use a matrix of some sort, but can anyone point me in the right direction?
Edit:
Here's the draw code.
void Draw(GameTime gameTime)
{
    GraphicsDevice.Clear(Color.CornflowerBlue);
    base.Draw(gameTime);

    spriteBatch.Begin();

    // Background texture drawn at the default window size
    spriteBatch.Draw(background, new Rectangle(0, 0, 800, 480), Color.White);

    // Each object in the level (player, block, etc.) is drawn with a specific texture
    // and a rectangle variable specifying its size and coordinates.
    // E.g. each block is 64x64 pixels and in this level they are placed at X-coordinates 0, 64, 128 and so on.
    // Each Y-coordinate for the blocks in this level is '480 - 64' (window height minus block height).
    foreach (/*Object in level*/)
    {
        spriteBatch.Draw(object.Texture, object.TextureSize, Color.White);
    }

    spriteBatch.End();
}

By default, SpriteBatch assumes that your world space is the same as client space, which is the size of the window. You can read about SpriteBatch and different spaces in a post by Andrew Russell.
When you resize the backbuffer, the window size changes with it, and so does the world space (which you don't want). To prevent that, you should insert a transformation matrix into the transformation pipeline to make the correction.
SpriteBatch.Begin allows exactly that in one of its overloads.
There are numerous ways to approach the scaling, but I assume that you want to scale uniformly, meaning that sprites don't get stretched out when the aspect ratio changes compared to the initial aspect ratio. The following code will adjust the scaling based on initial screen height.
...
const float initialScreenHeight = 480f;
Matrix transform = Matrix.CreateScale(GraphicsDevice.Viewport.Height / initialScreenHeight);
spriteBatch.Begin(SpriteSortMode.Deferred, null, null, null, null, null, transform);
...
Note that when you change the resolution so that the aspect ratio differs from the initial aspect ratio, you will run into issues such as drawing off the right edge of the screen, or not reaching it (leaving a strip of the blue background like you currently see).
Also, you don't want to calculate that scaling matrix every frame in the Draw method, but only when the resolution is changed.
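For example, a minimal sketch of that (assuming the 800x480 design resolution from the question; the worldScale field and ApplyResolution method are placeholder names, not part of the original code): rebuild the matrix only when the back buffer actually changes, then pass it to SpriteBatch.Begin every frame.
const float initialScreenHeight = 480f;   // resolution the game was authored for
Matrix worldScale = Matrix.Identity;

void ApplyResolution(int width, int height)
{
    graphics.PreferredBackBufferWidth = width;
    graphics.PreferredBackBufferHeight = height;
    graphics.ApplyChanges();

    // Recompute the scale once here, not every frame in Draw.
    float scale = GraphicsDevice.Viewport.Height / initialScreenHeight;
    worldScale = Matrix.CreateScale(scale);
}

protected override void Draw(GameTime gameTime)
{
    GraphicsDevice.Clear(Color.CornflowerBlue);

    spriteBatch.Begin(SpriteSortMode.Deferred, null, null, null, null, null, worldScale);
    // ... draw everything in 800x480 world coordinates exactly as before ...
    spriteBatch.End();

    base.Draw(gameTime);
}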

Related

Monogame: Render only inside specified area

This may be a strange question, but I'm trying to find a way to render sprites only inside a specific allowed area rather than the entire buffer/texture.
Like so:
Basically allowing me to draw to the buffer or Texture2D as I normally would, but with the actual drawing happening only inside this specified area, leaving the pixels outside of it untouched.
Why this is needed: I'm building my own UI system and I would like to avoid using intermediary buffers, since that is quite slow when there are many UI components on the screen (each would have to draw to its own buffer to prevent child elements from being drawn outside their parent's bounds).
And just to clarify - this is all for simple 2D rendering, not 3D.
If your UI is actually drawn with SpriteBatch, you can use ScissorRectangle:
GraphicsDevice.RasterizerState = new RasterizerState { ScissorTestEnable = true };
spriteBatch.GraphicsDevice.ScissorRectangle = ...
In 3D, you can render to a texture and draw just a portion of it, or use a shader: you could pass the allowed rectangle's dimensions in as a parameter and have the pixel shader output black (or whatever you want to accomplish) for pixels outside that rectangle.
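Putting the SpriteBatch pieces together, a minimal sketch of the scissor approach (uiBounds is just an example value; in XNA 4/MonoGame a state that is bound to the device becomes immutable, so create the RasterizerState up front and hand it to Begin):
// Created once and reused.
RasterizerState scissorState = new RasterizerState { ScissorTestEnable = true };

// Only pixels inside this rectangle will be written.
Rectangle uiBounds = new Rectangle(100, 100, 300, 200);

GraphicsDevice.ScissorRectangle = uiBounds;
spriteBatch.Begin(SpriteSortMode.Deferred, null, null, null, scissorState);
// ... draw the UI element and its children; anything outside uiBounds is clipped ...
spriteBatch.End();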
You can use:
spriteBatch.Draw(yourTexture,
    // where and the size of what you want to draw on screen
    // for example, new Rectangle(100, 100, 50, 50) // position and width, height
    destinationRectangle,
    // the area you want to draw from the original texture
    // for example, new Rectangle(0, 0, 50, 50) // position and width, height
    sourceRectangle,
    Color.White);
Then it will only draw the area that you chose before. Hope this helps!

Drawing a circular magnifying lens showing scaled underlying content in XNA/Monogame (in 2D)

I have a 2D scene in Monogame with some primitives and sprites (i.e. in PrimitiveBatches and SpriteBatches) and I would like to create a magnifying glass effect with a circular lens showing a zoomed view of the content under it. How do I do that?
Thanks.
I do not use your environment, but I have always done this effect with pixel displacement. If you have pixel access to the rendered scene (ideally while it is still in a back buffer, so it does not flicker), just move the pixels inside your lens outward. Either use a constant displacement, or better, move them more (bigger zoom) in the middle and less near the edges.
A typical implementation looks like this:
1. Copy the lens area to some temp buffer.
2. Loop (x,y) through the lens area.
3. Compute the actual radius r of the processed pixel from the lens center (x0,y0).
4. Ignore pixels outside the lens area (r > R).
5. Compute the actual zoom m of the processed pixel. I like to use cos for this, like this:
   m = 1.0 + (1.5 * cos(0.5 * M_PI * double(r) / double(r0))); // M_PI = 3.1415...
   You can play with the 1.0 and 1.5 constants; they determine the minimal (1.0) and maximal (1.0 + 1.5) zoom. Also, this is for a cos taking the angle in [rad]; if yours needs [deg] instead, change the 0.5*M_PI to 90.0.
6. Copy the pixel from temp to the backbuffer or screen:
   backbuffer(x,y) = temp(x0 + (x-x0)/m, y0 + (y-y0)/m)
Here is a C++/VCL example:
void TMain::draw()
{
    // clear bmp (if image not covering whole area)
    bmp->Canvas->Brush->Color=clBlack;
    bmp->Canvas->FillRect(TRect(0,0,xs,ys));
    // copy background image
    bmp->Canvas->Draw(0,0,jpg);   // DWORD pyx[ys][xs] is bmp direct pixel access, (xs,ys) is bmp size
    // here comes the important stuff:
    int x0=mx,y0=my;              // position = mouse
    const int r0=50;              // radius
    DWORD tmp[2*r0+3][2*r0+3];    // temp buffer
    double m;
    int r,x,y,xx,yy,xx0,xx1,yy0,yy1;
    // zoom area bounding box
    xx0=x0-r0; if (xx0<0)   xx0=0;
    xx1=x0+r0; if (xx1>=xs) xx1=xs-1;
    yy0=y0-r0; if (yy0<0)   yy0=0;
    yy1=y0+r0; if (yy1>=ys) yy1=ys-1;
    // copy bmp to tmp
    for (y=yy0;y<=yy1;y++)
     for (x=xx0;x<=xx1;x++)
      tmp[y-yy0][x-xx0]=pyx[y][x];
    // render zoomed area
    for (y=yy0;y<=yy1;y++)
     for (x=xx0;x<=xx1;x++)
        {
        // compute radius
        xx=x-x0;
        yy=y-y0;
        r=sqrt((xx*xx)+(yy*yy));
        if (r>r0) continue;
        if (r==r0) { pyx[y][x]=clWhite; continue; }
        // compute zoom: 2.5 at center, 1.0 at edges
        m=1.0+(1.5*cos(0.5*M_PI*double(r)/double(r0))); // M_PI=3.1415...
        // compute displacement
        xx=double(double(xx)/m)+x0;
        yy=double(double(yy)/m)+y0;
        // copy
        if ((xx>=xx0)&&(yy>=yy0)&&(xx<=xx1)&&(yy<=yy1))
         pyx[y][x]=tmp[yy-yy0][xx-xx0];
        }
    // just refresh screen with backbuffer
    Canvas->Draw(0,0,bmp);
}
And here is an animated GIF preview (quality and fps are lowered by the GIF encoding):
If you need help understanding the gfx access in my code, see:
gfx rendering in C++
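For XNA/MonoGame specifically, the same displacement loop can be sketched on the CPU with Texture2D.GetData/SetData, assuming you have already rendered the scene into a texture (for example a RenderTarget2D); all names here are placeholders and this is only a direct translation of the loop above, not tuned code:
// Returns a copy of sceneTexture with a circular magnified region centered at (x0, y0), radius r0.
// Assumes the lens center lies inside the texture.
Texture2D Magnify(GraphicsDevice device, Texture2D sceneTexture, int x0, int y0, int r0)
{
    int w = sceneTexture.Width, h = sceneTexture.Height;
    Color[] src = new Color[w * h];
    sceneTexture.GetData(src);
    Color[] dst = (Color[])src.Clone();

    for (int y = Math.Max(0, y0 - r0); y <= Math.Min(h - 1, y0 + r0); y++)
        for (int x = Math.Max(0, x0 - r0); x <= Math.Min(w - 1, x0 + r0); x++)
        {
            int dx = x - x0, dy = y - y0;
            double r = Math.Sqrt(dx * dx + dy * dy);
            if (r > r0) continue;

            // Zoom: 2.5 at the center, 1.0 at the lens edge (same cos falloff as above).
            double m = 1.0 + 1.5 * Math.Cos(0.5 * Math.PI * r / r0);

            // Sample the displaced source pixel.
            int sx = x0 + (int)(dx / m);
            int sy = y0 + (int)(dy / m);
            dst[y * w + x] = src[sy * w + sx];
        }

    Texture2D result = new Texture2D(device, w, h);
    result.SetData(dst);
    return result;
}
In practice a pixel shader is the faster route (GetData/SetData every frame is slow); this version only mirrors the C++ loop for clarity.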

How to scale texture2d in XNA with window resizing

I'm developing a UI for a project for school, and I've tried methods similar to those listed here for scaling my texture, but here is the issue:
Our project is developed at 1440 x 900, so I've made my own images to fit that screen resolution. When we have to demo our project in class, the projector can only render up to 1024 x 768, so many things on the screen go missing. I have added window-resizing capability, and I'm doing my scaling like this: I have my own class called Button, which has a Texture2D and a Vector2 position, constructed by Button(Texture2D img, float width, float height).
My idea is to set the position of the image to a scalable % of the window width and height, so I'm attempting to set the position of the img to a number between 0-1 and then multiply by the window width and height to keep everything scaled properly.
(This code is not the proper syntax; I'm just trying to convey the point.)
Button button = new Button(texture, 0.01f, 0.01f);
int height = (int)(GraphicsDevice.Viewport.Height * button.Position.Y);
int width = (int)(GraphicsDevice.Viewport.Width * button.Position.X);
Rectangle rect = new Rectangle(0, 0, width, height);
spriteBatch.Begin();
spriteBatch.Draw(button.Img, rect, Color.White);
spriteBatch.End();
It doesn't end up scaling anything when I draw it and resize the window by dragging the mouse around. If I hard-code a different buffer width and height to begin with, the image stays around the same size regardless of resolution, except that the smaller the resolution is, the more pixelated the image looks.
What is the best way to design my program to allow for dynamic Texture2D scaling?
As Hannesh said, if you run it in fullscreen you won't have these problems. However, you also have a fundamental problem with the way you are doing this. Instead of using the position of the sprite, which will not change at all during window resize, you must use the size of the sprite. I often do this using a property called Scale in my Sprite class. So instead of clamping the position of the sprite between 0 and 1, you should be clamping the Size property of the sprite between 0 and 1. Then as you rescale the window it will rescale the sprites.
In my opinion, a better way to do this is to have a default resolution, in your case 1440 x 900. Then, if the window is rescaled, just multiply all sprites' scaling factors by the ratio of the new screensize to the old screensize. This takes only 1 multiplication per resize, instead of a multiplication per update (which is what your method will do, because you have to convert from the clamped 0-1 value to the real scale every update).
Also, the artifacts you noticed during manual rescaling of the sprites are normal. Rescaling images to arbitrary sizes causes artifacts in the rendered image because the graphics device doesn't know what to do at most sizes. A good way around this is to use filler art during development and then create the final art in the correct resolution(s). Obviously this doesn't apply in your situation, because you are resizing a window to an arbitrary size, but in games you will usually only be able to switch between certain fixed resolutions.
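A minimal sketch of that ratio-based approach, assuming a hypothetical Button class with Img, Position, DesignPosition and Scale members and the 1440x900 design resolution from the question:
const float designWidth = 1440f;
const float designHeight = 900f;

// Call once whenever the window/backbuffer size changes, not every Update.
void OnResize(int newWidth, int newHeight)
{
    float scaleX = newWidth / designWidth;
    float scaleY = newHeight / designHeight;

    foreach (Button button in buttons)
    {
        button.Scale = new Vector2(scaleX, scaleY);
        button.Position = new Vector2(button.DesignPosition.X * scaleX,
                                      button.DesignPosition.Y * scaleY);
    }
}

void DrawButton(SpriteBatch spriteBatch, Button button)
{
    // The scale overload of Draw actually resizes the texture on screen.
    spriteBatch.Draw(button.Img, button.Position, null, Color.White,
                     0f, Vector2.Zero, button.Scale, SpriteEffects.None, 0f);
}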

Sprite.Draw() draws my textures too small

I declared a device and a sprite in a Windows Forms form like this:
PresentParameters presentParameters = new PresentParameters();
presentParameters.Windowed = true;
presentParameters.SwapEffect = SwapEffect.Copy;
var device = new Device(Manager.Adapters.Default.Adapter, DeviceType.Hardware, this, CreateFlags.HardwareVertexProcessing, presentParameters);
var sprite = new Sprite(device);
I loaded a texture via TextureLoader.FromFile(device, "image.png");
In my Draw method I started the device scene, then the sprite scene, then I wrote
sprite.Draw2D(texture, PointF.Empty, 0, PointF.Empty, Color.White);
The drawing itself works, but it draws only a big portion of the image (about 90%) scaled up to the screen.
I tried it with a source rectangle of the given texture size too, but the same bug occurred.
Any suggestions?
I am experienced in C++ DirectX, but not C# DirectX, so take this with a grain of salt.
In my experience with the Sprite interface, you need to scale, rotate, and translate just as you would with 3D objects. You may be forgetting to scale. Here is the code of my Update function.
void Button::Update()
{
    Sprite->Begin(D3DXSPRITE_ALPHABLEND);

    D3DXMATRIX trans;
    D3DXMATRIX scale;
    D3DXMATRIX world;
    D3DXMatrixIdentity(&world);
    D3DXMatrixTranslation(&trans, pos.x, pos.y, 0.0f);
    D3DXMatrixScaling(&scale, scaleFactor, scaleFactor, 1.0f);
    world = scale * trans;
    Sprite->SetTransform(&world);

    Sprite->Draw(buttonTexture, NULL, NULL, &D3DXVECTOR3(-width2, -height2, 0.0), whitecol);

    Sprite->End();
}
Admittedly, this isn't a very object-oriented way of doing things, but it suits my needs.
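A hedged rough equivalent in C# (Managed DirectX), reusing the Draw2D call from the question; Matrix.Scaling/Matrix.Translation and the sprite's Transform property are assumed to behave like their D3DX counterparts, and the field names are placeholders:
// scaleFactor, pos and buttonTexture are assumed fields on your class.
void UpdateButton()
{
    sprite.Begin(SpriteFlags.AlphaBlend);

    // Scale first, then translate, as in the C++ version above.
    sprite.Transform = Matrix.Scaling(scaleFactor, scaleFactor, 1.0f)
                     * Matrix.Translation(pos.X, pos.Y, 0.0f);

    sprite.Draw2D(buttonTexture, PointF.Empty, 0, PointF.Empty, Color.White);

    sprite.End();
}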
Caveat: I am not a DirectX expert, but I had the same problem.
When you load the sprite, it expands the texture to a size where each dimension is a power of 2. For example, if your sprite was 200 x 65, the texture will have a width of 256 (the image will be expanded to a width of 256, increasing it slightly) by a height of 128 (almost doubling the height).
When you draw the image, it will be almost twice the height you expected.
My solution was to modify my image file to have a height and width that are powers of 2, and then only draw the portion that was the original size.
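A small sketch of the arithmetic behind that workaround (the loader rounding each dimension up to a power of two, and the source rectangle that keeps only the original pixels); the names here are illustrative only:
// Next power of two >= n, e.g. 200 -> 256, 65 -> 128.
static int NextPowerOfTwo(int n)
{
    int p = 1;
    while (p < n) p *= 2;
    return p;
}

// If a 200x65 image was placed on a 256x128 power-of-two canvas, draw only this region,
// using a Draw/Draw2D overload that accepts a source rectangle.
static Rectangle OriginalPortion(int originalWidth, int originalHeight)
{
    return new Rectangle(0, 0, originalWidth, originalHeight);
}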

"Wrap-around" effect with a Direct3D.Texture

Given a destination rectangle and an x/y offset value, I need an image to be drawn within the confines of that destination rectangle. If the offset would push the image off the edge of the rectangle, then the part that "pushes out" should appear on the opposite side of the destination rectangle. In simplest terms, I need a scrolling background.
In GDI, I can accomplish this with an "ImageAttributes" object that uses a tile wrap mode:
ImageAttributes attributes = new ImageAttributes();
attributes.SetWrapMode(System.Drawing.Drawing2D.WrapMode.Tile);
Rectangle rectangle = new Rectangle(0, 0, (int)width, (int)height);
g.DrawImage(bmp, rectangle, -x, -y, width, height, GraphicsUnit.Pixel, attributes);
Now, I need a way to do this in DirectX. Assume that this is the method I have right now:
public void RenderTexture(PrismDXObject obj, D3D.Texture texture, int xOffset, int yOffset)
{
    if (obj != null && texture != null)
    {
        _renderSprite.Begin(D3D.SpriteFlags.AlphaBlend);
        _renderSprite.Draw(texture,
            new Rectangle(0, 0, (int)obj.Width, (int)obj.Height),
            new Vector3(0.0f, 0.0f, 0.0f),
            new Vector3((int)obj.Left, (int)obj.Top, 0.0f),
            obj.RenderColor);
        _renderSprite.End();
    }
}
...where "_renderSprite" is a D3D.Sprite, and PrismDXObject is a simple class that stores x/y/width/height/color. How can I update this method so that xOffset and yOffset can be used to make the texture wrap? Remember, my end-goal is a scrolling background that loops as the player walks forward.
Incidentally, that RenderTexture() method is meant to be a "library method" which can be called from anywhere in my program... so if I'm doing something really inefficient or ill-advised, I'd welcome a friendly warning! My main concern is getting the wrapping background to work, though.
I'm not sure that the sprite mechanism allows for what I'm about to explain, but 2 triangles certainly do. If this does not work with sprites, use triangles directly:
What you're asking for is directly supported by the texturing subsystem, it is called texture wrapping.
When you specify the texture coordinates that your quad will use, instead of using the (0,0)-(1,1) range, you can use (0 + xOffset/tex_x_size, 0 + yOffset/tex_y_size) to (1 + xOffset/tex_x_size, 1 + yOffset/tex_y_size) for your texture coordinates.
Then the only thing left to do is to specify that the texture sampler you use to map your background does texture wrapping. To do this, set the D3DSAMP_ADDRESSU and D3DSAMP_ADDRESSV sampler states to D3DTADDRESS_WRAP. Note that this is the default for the sampler state.
That's it. Now, getting back to D3D.Sprite specifically: the Draw method takes a rectangle that tells it which part of the texture to use. Have you tried drawing with (xOffset, yOffset, xOffset + obj.Width, yOffset + obj.Height)? This will only work if the sprite subsystem uses a sampler that has wrapping on, and I don't know how Sprite is implemented internally.
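For comparison, the same wrap trick with XNA's SpriteBatch (the framework in the main question) is just a wrapping sampler plus an offset source rectangle; a sketch, assuming the background is a standalone texture rather than part of an atlas:
// The source rectangle is shifted by (xOffset, yOffset) and may extend past the
// texture bounds; with SamplerState.LinearWrap it simply tiles.
void DrawScrollingBackground(SpriteBatch spriteBatch, Texture2D background,
                             Rectangle destination, int xOffset, int yOffset)
{
    spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.AlphaBlend,
                      SamplerState.LinearWrap, null, null);
    spriteBatch.Draw(background,
                     destination,
                     new Rectangle(xOffset, yOffset, destination.Width, destination.Height),
                     Color.White);
    spriteBatch.End();
}
Note that under the XNA Reach profile, wrap addressing requires power-of-two texture sizes.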
