Draw travel distance in XNA 4 - c#

So basically I am creating an XNA game at college and I need some help with something, as I cannot seem to figure it out myself and I'm pretty new to this.
I have a spaceship with a scrolling background of stars, plus falling asteroids, and the point of my game is to travel as far as possible without being hit by said asteroids.
I'm really looking for some guidance on how I could measure a theoretical distance travelled by the ship and then draw it on screen.
Any help would be greatly appreciated.
Many thanks.

Solution A
Somewhere in your code you are defining the offset of the backdrop for each frame. You could just invert* this value and add it to the total amount every frame:
totalDistance += -backdropOffset;
If the offset is defined in pixels you have to convert it to your game world unit (kilometers, lightyears, ...) before displaying the distance.
* If the ship moves forward, the backdrop "slides" in the other direction.
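A minimal sketch of Solution A, assuming a `backdropOffset` field that holds this frame's backdrop movement and a hypothetical `PixelsPerKilometer` conversion factor (the `spriteBatch` and `font` fields are the usual XNA game-class members):

```csharp
// Fields on the Game class (names are illustrative).
float totalDistancePixels = 0f;
const float PixelsPerKilometer = 100f; // hypothetical pixels-to-km factor

protected override void Update(GameTime gameTime)
{
    // backdropOffset is negative while the ship flies forward,
    // so inverting it accumulates a positive distance.
    totalDistancePixels += -backdropOffset;
    base.Update(gameTime);
}

protected override void Draw(GameTime gameTime)
{
    float km = totalDistancePixels / PixelsPerKilometer;
    spriteBatch.Begin();
    spriteBatch.DrawString(font,
        string.Format("Distance: {0:0.0} km", km),
        new Vector2(10, 10), Color.White);
    spriteBatch.End();
    base.Draw(gameTime);
}
```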
Solution B (more work but less headaches)
It is actually not the backdrop that is moving; it's the ship. So why not move the ship and follow it with the camera?
You will be able to do all kinds of motion. Right now you have to invert every movement of the ship and then apply it to the backdrop, which is kind of counter-intuitive, don't you think? With this solution your code will be much closer to reality => fewer headaches during debugging, easier maintenance of your application, and you will be quicker when adding new features.
And of course, getting the total distance would be as trivial as
var totalDistance = myShip.Position.Y;
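A sketch of Solution B, assuming a `myShip` object with a `Position` and a simple 2D follow-camera built from a translation matrix (field names are illustrative):

```csharp
// Move the ship itself; the backdrop and asteroids stay in world space.
myShip.Position += new Vector2(0f,
    -shipSpeed * (float)gameTime.ElapsedGameTime.TotalSeconds);

// A basic follow-camera: translate the world so the ship stays centered.
Matrix cameraTransform = Matrix.CreateTranslation(
    -myShip.Position.X + screenWidth / 2f,
    -myShip.Position.Y + screenHeight / 2f,
    0f);

spriteBatch.Begin(SpriteSortMode.Deferred, null, null, null, null, null,
    cameraTransform);
// ... draw backdrop, ship and asteroids at their world positions ...
spriteBatch.End();

// Total distance falls straight out of the ship's position
// (negate if "forward" is negative Y in your coordinate system).
float totalDistance = myShip.Position.Y;
```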

Related

The camera passes through part of a wall in first person Unity 3D

When I was testing my game written in Unity 3D, I saw that the camera passes through the wall in the upper-left part, but in the upper-right part it doesn't.
Can anybody help me please?
A picture - the error is at the upper left:
The wall is closer than the shortest distance at which the camera can render. That distance is called the near clipping plane. You can change it from the Inspector after you select the camera. Set it to a very small number like 0.01 to render objects at a very close distance and avoid this issue.
Keep in mind that as your far/near ratio increases, your depth buffer precision will decrease, which can cause artifacts. So only reduce the near plane / increase the far plane as much as you really need.
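The same fix in script form, as a minimal sketch (the clip-plane values here are illustrative):

```csharp
using UnityEngine;

[RequireComponent(typeof(Camera))]
public class ClipPlaneSetup : MonoBehaviour
{
    void Start()
    {
        // A small near plane lets very close geometry render,
        // at the cost of some depth-buffer precision.
        Camera cam = GetComponent<Camera>();
        cam.nearClipPlane = 0.01f;
        cam.farClipPlane = 1000f; // keep the far/near ratio as low as you can
    }
}
```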

How could you create a ultrasonic vision in Unity3d? (Bat Vision)

I had the idea to create a game where you would see everything like a bat. There are some people who have achieved something like that.
The problem is I would like to have another type of "bat vision". It is pretty hard to explain, but if I look in one direction and then rotate the character, I would still see something I looked at before.
Explanation with timestamps:
Sec 0: I am looking at point A, where an object is 300 meters away
Sec 1: I rotate to point B and see an object 100 meters away
Sec 2: I see object A and B at the same time
Sec 3: I still see object B
Explanation with picture:
As you can see, the player first shoots ultrasonic waves (a) at A and afterwards waves (b) at B. They both come back at the same time, so both objects are seen at the same time.
In case it is not clear why objects are seen with a delay: sound travels at roughly 1/3 km/s, so an object 300 meters away would be seen after about 2 seconds (the waves travel 600 meters there and back).
I already tried to shoot raycasts or GameObjects in the view direction; the further away they hit, the darker the pixel at that position becomes. But as you can imagine, shooting 2 million raycasts every frame is not very healthy. I was also thinking about shaders, but I don't know how one could still see the effect while looking in another direction. Also, I don't have much experience with shaders and would need help there anyway.
I would be glad if someone has another Idea how I could create this.
To achieve this, the easiest way would be an image-effect shader that accesses the depth buffer.
The value of each pixel in the depth buffer is by definition the distance from that pixel in the world to the camera.
You can decrease the value of each pixel by some amount every second and reset it to maximum every X seconds based on the distance given by the depth buffer; this gives the effect of sound bouncing off that object back to your player.
https://www.youtube.com/watch?v=OKoNp2RqE9A
Tweak the colors to achieve the needed effect.
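A hedged sketch of the script side of such an image effect: a `MonoBehaviour` that enables the camera's depth texture and feeds a decaying "echo" parameter to a post-processing material. The shader itself is not shown, and the `_EchoStrength` property name is an assumption:

```csharp
using UnityEngine;

[RequireComponent(typeof(Camera))]
public class EcholocationEffect : MonoBehaviour
{
    public Material effectMaterial;    // material using a depth-reading shader
    public float fadePerSecond = 0.5f; // how fast the echo image decays
    float echoStrength;

    void Start()
    {
        // Make _CameraDepthTexture available to the shader.
        GetComponent<Camera>().depthTextureMode = DepthTextureMode.Depth;
    }

    void Update()
    {
        // Decay every frame; reset to full strength on a new "ping".
        echoStrength = Mathf.Max(0f, echoStrength - fadePerSecond * Time.deltaTime);
        if (Input.GetKeyDown(KeyCode.Space))
            echoStrength = 1f; // emit a new ultrasonic pulse
    }

    void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        effectMaterial.SetFloat("_EchoStrength", echoStrength); // assumed property
        Graphics.Blit(src, dst, effectMaterial);
    }
}
```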
It appears to me that maybe you want to duplicate any meshes and surfaces that your player has 'seen' and keep them in view with some grayish light settings, while the 'real' objects are all hidden, that is, they all have their MeshRenderer components disabled or removed.
If the player should see an item like a shadow in front of him even though he moves, you could have the visible clones move along with the player for some time until they fade.
To find out what to duplicate, I am sure you would need far fewer raycasts than doing it per pixel. Clone the mesh, then check visibility for each vertex and store the visibility in the duplicated vertex's color and alpha.
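The duplication idea might be sketched like this; the `ghostMaterial` parameter and the assumption that the object has a single `MeshRenderer` are illustrative:

```csharp
using UnityEngine;

public static class GhostFactory
{
    // Clone a "seen" object as a grayish ghost and hide the original.
    public static GameObject CreateGhost(GameObject seenObject, Material ghostMaterial)
    {
        GameObject ghost = Object.Instantiate(seenObject,
            seenObject.transform.position, seenObject.transform.rotation);
        ghost.GetComponent<MeshRenderer>().material = ghostMaterial;

        // Hide the real object; only the ghost stays visible.
        seenObject.GetComponent<MeshRenderer>().enabled = false;
        return ghost;
    }
}
```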

Unity Constant rotation whilst using Transform

This is something I've struggled with all week. I'm working on a 2D project, and what I want is for my enemies to move from the right to the left hand side of the screen (landscape). I'm moving them like this:
transform.Translate (new Vector3(1,0,0) * speed * Time.deltaTime);
At the same time, I want them to constantly rotate on a pivot in the middle of the sprite. As an example, imagine they are in space and are sort of floating uncontrollably; they would spin. I've asked this question a couple of times with no response, so I guess my explanation isn't very good. This is what I've tried:
Animating the objects. This didn't work because changing the Z rotation made the sprites spin in a tornado motion rather than travel from one side of the screen to the other.
This: transform.Rotate (0,0,50*Time.deltaTime); I messed around with the X, Y and Z properties, but they pretty much all made a tornado-type effect or rotated in 3D, so the sprite disappeared at 180 degrees.
I also imported a spritesheet where they are at different points in the "spin", so at 10, 20, 30...360 degrees etc., but this wasn't smooth at all.
I hope this makes sense. I've spent quite a few hours on it now! I can't get my head around it, as I've moved from Xcode where this sort of thing is one line of code. A point in the right direction would be amazing.
Note: if my question makes no sense, please ask!
Have you tried transform.rotation = Quaternion.Lerp()? I used this myself to rotate an object using an input, though you could use it to do a rotation on the Z axis easily enough.
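An alternative sketch that combines both motions directly; the likely cause of the "tornado" effect is that `transform.Translate` defaults to local space, so once the object has rotated, the movement direction rotates with it. Translating in world space keeps the path straight (field values are illustrative):

```csharp
using UnityEngine;

public class DriftAndSpin : MonoBehaviour
{
    public float speed = 2f;      // movement across the screen
    public float spinSpeed = 50f; // degrees per second around the sprite pivot

    void Update()
    {
        // Translate in WORLD space so the spin doesn't bend the path.
        transform.Translate(Vector3.left * speed * Time.deltaTime, Space.World);

        // Spin around Z only; X/Y rotation turns a 2D sprite edge-on.
        transform.Rotate(0f, 0f, spinSpeed * Time.deltaTime);
    }
}
```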

Starfield Screensaver Equations

For those of you who don't remember exactly what the old windows Starfield screensaver looked like, here's a YouTube video: http://www.youtube.com/watch?v=r5AoFiVs2ME
Right now, I can generate random particles ("stars") inside a certain radius. What I'm having trouble with is figuring out the best way to achieve the effect seen in the afore-linked video.
Question: Given that I have the coordinates (vectors) of my randomly generated particles, what is the best way and/or equation to give them a direction (vector) so that they move across the screen in a way which closely resembles the old screensaver?
Thanks!
They seem to move away from the center. You could calculate the vector from the center point of the screen to the generated particle position, then use that direction to move the particle, accelerating it until it is outside the screen.
A basic algorithm for you to work with:
Generate stars at a random location with a Gaussian distribution (most likely near the middle of the screen, less likely as you go farther out). Note that the motion vector of the star is determined by this starting point: the motion effectively travels outward along the line formed by the screen center and the starting location.
Assign each newly generated star a distance. Note that distance is irrespective of starting location.
Move the star in a straight line at an exponentially increasing speed while simultaneously decreasing its distance. You'll have to tweak these parameters yourself.
The star should disappear when it passes the boundary of the screen, regardless of speed.
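The algorithm above might be sketched as a simple particle update; the constants and the `Star` class are illustrative:

```csharp
using System;
using Microsoft.Xna.Framework;

class Star
{
    public Vector2 Position;
    public Vector2 Direction; // unit vector pointing away from screen center
    public float Speed;

    static readonly Random Rng = new Random();

    public static Star Spawn(Vector2 screenCenter)
    {
        // Spawn near the center; direction is center -> spawn point.
        Vector2 pos = screenCenter + new Vector2(
            (float)(Rng.NextDouble() - 0.5) * 40f,
            (float)(Rng.NextDouble() - 0.5) * 40f);
        Vector2 dir = pos - screenCenter;
        if (dir != Vector2.Zero) dir.Normalize();
        return new Star { Position = pos, Direction = dir, Speed = 10f };
    }

    public void Update(float dt)
    {
        // Exponentially increasing speed approximates perspective.
        Speed *= 1f + 1.5f * dt;
        Position += Direction * Speed * dt;
        // Remove the star once Position leaves the screen bounds.
    }
}
```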

Math - Displaying multiple objects in a single view using trigonometry

It might be that my math is rusty or I'm just stuck in my box after trying to solve this for so long, either way I need your help.
Background: I'm making a 2d-based game in C# using XNA. In that game I want a camera to be able to zoom in/out so that a certain part of objects always are in view. Needless to say, the objects move in two dimensions while the camera moves in three.
Situation: I'm currently using basic trigonometry to calculate which height the camera should be at for all objects to show. I also position the camera between those objects.
It looks something like this:
1. Loop through all objects to find the outer edges of our objects: farRight, farLeft, farUp, farDown.
2. When we know what the edges of what has to be shown are, calculate the center, also known as the camera position:
CenterX = farLeft + (farRight - farLeft) * 0.5f;
CenterY = farUp + (farDown - farUp) * 0.5f;
3. Loop through our edges to find the largest value compared to our camera position, thus the furthest distance from the center of screen.
4. Using the largest distance-value we can easily calculate the needed height to show all of those objects (points):
float T = 90f - Constants.CAMERA_FIELDOFVIEW * 0.5f;
float height = (float)Math.Tan(MathHelper.ToRadians(T)) * (length);
So far so good, the camera positions itself perfectly based on the calculations.
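Steps 1-4 might be sketched like this, assuming `objects` is a collection of `Vector2` positions and `using System.Linq;` is in scope (`Constants.CAMERA_FIELDOFVIEW` is taken from the question):

```csharp
// Step 1: bounding edges of all objects.
float farLeft  = objects.Min(p => p.X);
float farRight = objects.Max(p => p.X);
float farUp    = objects.Min(p => p.Y);
float farDown  = objects.Max(p => p.Y);

// Step 2: camera sits over the midpoint of the bounding box.
float centerX = farLeft + (farRight - farLeft) * 0.5f;
float centerY = farUp + (farDown - farUp) * 0.5f;

// Step 3: furthest edge distance from the center.
float length = Math.Max((farRight - farLeft) * 0.5f,
                        (farDown - farUp) * 0.5f);

// Step 4: height at which the field of view just covers 'length'.
float T = 90f - Constants.CAMERA_FIELDOFVIEW * 0.5f;
float height = (float)Math.Tan(MathHelper.ToRadians(T)) * length;
```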
Problem:
a) My rendering target is 1280*720 with a field of view of 45 degrees, so one always sees a bit more on the X-axis, 560 pixels more actually. This is not a problem per se, but it matters for b)...
b) I want the camera to be a bit further out than it is, so that one sees a bit more of what is happening beyond the furthest point. Sure, this already happens on the X-axis, but that is technically the result of my flawed logic. I want to be able to see more on both the X- and Y-axis and to control this behavior.
Question
Uhm, so to clarify. I would like to have some input on a way to make the camera position itself, creating this state:
Objects won't get closer than say... 150 pixels to the edge of the X-axis and 100 pixels to the edge of the Y-axis. To do this the camera shall position itself along the Z-axis so that the field of view covers it all.
I don't need help with the coding, just the math and logic of calculating the height of my camera. As you probably can see, I have a hard time wrapping this inside my head and even harder time trying to explain it to you.
If anyone out there has been dealing with this or is just better than me at math, I'd appreciate whatever you have to say! :)
Don't you just need to add 150 or 100 pixels (depending on which edge you are looking at) to each distance measurement in your loop at step 3 and carry this larger value into length at step 4? Or am I missing something?
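One way to sketch that padding: since the margins are a fixed fraction of the screen, scale `length` so that an object which would have landed on the screen edge instead lands the margin distance inside it (the screen size and margins are taken from the question; the scaling approach is an assumption):

```csharp
// Desired margins in pixels, per axis, from the question.
const float marginX = 150f, marginY = 100f;
const float halfWidth = 1280f / 2f, halfHeight = 720f / 2f;

// Scale so the furthest object lands 'margin' pixels inside the edge.
float scaleX = halfWidth / (halfWidth - marginX);
float scaleY = halfHeight / (halfHeight - marginY);

float paddedLength = length * Math.Max(scaleX, scaleY);
// ... then use paddedLength in place of length in step 4.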
I can't explore this area further at the moment, but if anyone is having the same issue and is not satisfied by the provided answer, there is another possibility in XNA.
Viewport.Unproject()
This nifty feature converts a screen space coordinate to a world space one.
Viewport.Project()
Does the opposite thing, namely converting world space to screen space. Just thought that someone might want to go further than me. As much as my OCD hates to leave things imperfect, I can't keep perfecting this... yet.
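A usage sketch of the two `Viewport` helpers; the `projectionMatrix`, `viewMatrix`, and `myObject` names are assumed to come from your own camera setup:

```csharp
// Screen -> world: where does the top-left margin point land in the world?
Vector3 screenPoint = new Vector3(150f, 100f, 0f); // z = 0 is the near plane
Vector3 worldPoint = GraphicsDevice.Viewport.Unproject(
    screenPoint, projectionMatrix, viewMatrix, Matrix.Identity);

// World -> screen: check how close an object is to the screen edge.
Vector3 onScreen = GraphicsDevice.Viewport.Project(
    new Vector3(myObject.Position, 0f), projectionMatrix, viewMatrix,
    Matrix.Identity);
```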
