Run two animations on an object in Unity simultaneously - C#

I'm still pretty fresh to the Unity 3D world. I'm trying to play an animation on two children of an object (a door that has an exterior face pointing outward and an interior face pointing inward). Maybe I'm approaching this the wrong way? See below:
Is there a way that, by pressing one key, I can get the "open" animations for both the inward- and outward-facing doors to play at the same time? Note: I originally created this FBX in Blender.
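One way to do this is to give each child door an Animator whose controller has an "Open" trigger, and fire that trigger on both children from a script on the parent. A minimal sketch (the key binding, component layout, and trigger name here are assumptions, not taken from the question):

```csharp
using UnityEngine;

// Attach to the parent door object. Assumes each child door has an Animator
// whose controller transitions into its open clip on an "Open" trigger
// (the trigger name and key are hypothetical).
public class DoorOpener : MonoBehaviour
{
    [SerializeField] private KeyCode openKey = KeyCode.E;

    private Animator[] childAnimators;

    void Start()
    {
        // Collect the Animators on both child doors (inward and outward faces).
        childAnimators = GetComponentsInChildren<Animator>();
    }

    void Update()
    {
        if (Input.GetKeyDown(openKey))
        {
            // Fire the same trigger on every child so both clips start together.
            foreach (Animator animator in childAnimators)
            {
                animator.SetTrigger("Open");
            }
        }
    }
}
```

Because both triggers are set in the same frame, the two "open" clips start on the same frame and stay in sync.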

Related

Sprites disappear after game is built (Unity engine, C#), not an issue of the layer they are on

I am currently writing a 2D game in Unity, and whenever I build the actual game, all the sprites disappear; the player is the only thing that remains.
However, the sprites are still there, just invisible. It's not an issue of sorting layer or order, and I'm having trouble fixing it.
I am using SpriteShapeController.
The SpriteShapeController shows up as already having a sprite, which made me think I had a sprite assigned.
I just made custom square sprites and added them in.

Adjusting different touch spots on an object

I'm developing a game using the Unity3D game engine, and there is a little problem that I need help with.
Basically, I want to create multiple touch spots on my object: for example, if I touch the bottom of the object, it does something,
but if I touch the top of the object, it should do something different. How can I do this?
The solution was simple: create different collision boxes, spheres, etc. on the object.
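One common way to tell those colliders apart is to raycast from the touch position and check which one was hit. A minimal sketch, assuming two child colliders named "TopSpot" and "BottomSpot" (both names and the logged actions are hypothetical):

```csharp
using UnityEngine;

// Attach to any scene object. Assumes the target object has child colliders
// named "TopSpot" and "BottomSpot" (hypothetical names).
public class TouchSpots : MonoBehaviour
{
    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Cast a ray from the touch position into the scene.
        Ray ray = Camera.main.ScreenPointToRay(touch.position);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            // React according to which collider the ray hit.
            if (hit.collider.name == "TopSpot")
            {
                Debug.Log("Top of the object touched");
            }
            else if (hit.collider.name == "BottomSpot")
            {
                Debug.Log("Bottom of the object touched");
            }
        }
    }
}
```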

How to create multiplayer / shared AR game with Vuforia?

I would like to create a shared AR game on Android phones, where I would like to:
spawn a cube for each player on an ImageTarget
allow them to control the position of their cube
allow them to see the movements of all players' cubes
I'm using Vuforia as my AR library and PUN 2 as my networking library. I have no issue synchronizing the positions and rotations of all cubes. However, the cubes do not stay on the ImageTarget properly and "jump" around. On the other hand, if I place my two phones very close together and point them at the ImageTarget at roughly the same angle, the cubes do not jump as much.
This leads me to think that the 2 instances of ARCamera fail to realize that they are pointing at the same ImageTarget from 2 different angles, and instead think that the ImageTarget exists in 2 different orientations at the same time.
Is there any way for me to tell Vuforia that I'm using multiple instances of ARCamera pointing at the same ImageTarget? (Or if my hypothesis is completely wrong, how do I actually make a multiplayer AR game?)
Thanks so much in advance!
p.s. I know the Vuforia forums are a better place to ask this question but unfortunately that forum is not particularly active, so I'm trying my luck here.
I've solved the issue by going to the ARCamera GameObject, then, in the Vuforia Behaviour component, changing the World Center Mode from DEVICE to FIRST_TARGET. This allows multiple instances of ARCamera to be in different positions.
More info on World Center Mode can be found in the Vuforia documentation.

C# XNA Enter Houses

I'm making a role playing game in C# using XNA. I already have a map and some stuff, but that's not interesting at the moment. My question is: How can I give the player the possibility to enter houses or rooms?
To create the worlds, I've used standard int arrays where each number represents a different type of tile. That all works fine, but the house isn't enterable; it's just a solid textured block.
By the way, I've used a Vector3 to determine which world the player is currently in and which one the program must load next.
Any suggestions on how I can make the houses enterable?
One easy way to make houses enterable is to create a trigger object at the door of the house.
This object can be a simple Rectangle along with an id.
While the player moves around your map, check for a collision between the player and the trigger.
When the player enters this trigger, you can swap the displayed map for a new one (the interior of the house) and move the player to where the door should lead.
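A minimal XNA sketch of such a trigger; the player fields and the LoadWorld helper are hypothetical, only Rectangle.Intersects is standard XNA:

```csharp
using Microsoft.Xna.Framework;

// A door trigger placed in front of a house on the tile map.
public class DoorTrigger
{
    public Rectangle Bounds;      // trigger area in world coordinates
    public int TargetWorldId;     // which map to load when the player steps in
    public Vector2 SpawnPosition; // where the player appears inside the house

    public bool Intersects(Rectangle playerBounds)
    {
        return Bounds.Intersects(playerBounds);
    }
}

// Somewhere in the game's Update loop (player and LoadWorld are assumed
// to exist elsewhere in your code):
//
// if (doorTrigger.Intersects(player.Bounds))
// {
//     LoadWorld(doorTrigger.TargetWorldId);        // swap the tile array
//     player.Position = doorTrigger.SpawnPosition; // move the player inside
// }
```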
If you're using C# to make a game, you might want to consider switching to MonoGame (which is practically XNA's successor since XNA is dead) or Unity.
As mentioned previously, your best option in XNA is to create a Rectangle which will trigger on collision and change the game world.

Unity3D Runtime Error

At the moment I've set up a dual-camera scene in Unity. I've created an empty game object for my cameras to inherit from and attached some of the first-person controller scripts to this empty game object.
When I run the program in the editor, it runs fine. When I build the project, the game crashes and my camera objects fall through the ground. I've never experienced something like this before in Unity. Attached is a copy of my current FPS setup values.
In the picture, you will see that I have turned gravity off (set it to 0), yet the object still falls down when I run the built program.
Has anyone ever come across something like this before? I've spent all day trying to fix this, but I'm getting nowhere.
I experienced a similar issue when my parent object (I usually use a capsule to "carry" the camera as the player's head, about 1.70 m above the ground) was placed too low. What happens if you move your camera rig together with the game object one meter upwards, so that it drops slightly onto the surface when you start?
Maybe there is some difference in these relative positions between the editor build and the release build.
Or, in case this isn't the solution, check the spatial positions of the involved objects again. Falling through a terrain is often caused by misplaced reference objects. (I have sometimes hung a carrier object under a camera instead of hanging the camera under a carrier object.)
Since you say it works in the editor build, I assume you have enabled collision for the relevant objects.
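If you want to test the "move it one meter up" idea without editing the scene by hand, a small sketch (the one-meter offset is just an assumed test value) could lift the carrier object when the scene starts:

```csharp
using UnityEngine;

// Attach to the carrier object (the capsule holding the cameras) to check whether
// starting slightly above the ground stops it from falling through in a build.
public class StartLift : MonoBehaviour
{
    [SerializeField] private float liftAmount = 1.0f; // assumed test value

    void Start()
    {
        // Lift the rig so it settles onto the surface instead of spawning inside it.
        transform.position += Vector3.up * liftAmount;
    }
}
```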
