Vuforia Video Playback is flickering in Unity app?

I am developing an augmented reality application using Vuforia (8.1.10) in Unity.
Everything is working perfectly fine, but the video flickers very often. In fact, every time the camera moves even slightly, the video wavers and shakes, which degrades the augmented reality experience.
I attach a quad to the ImageTarget, and the Video Player is attached to that quad. The video is a chroma key video, and I use a shader to remove the chroma.
What should I do to make the video playback more stable? Any help or suggestion would be much appreciated. Thanks!

You need to change your ARCamera's Clipping Planes Near and Far options.
Try 0.05 for Near and 2000 for Far.
I hope that helps, and if it doesn't, I'd suggest tinkering with the Clipping Plane values until the flickering stops.
Updating to the latest Vuforia Engine 8.1.x release is also recommended, in case a later patch fixes the bug.
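If you prefer to set this from a script rather than the inspector, a minimal sketch (assuming it is attached to the GameObject that holds the ARCamera's Camera component) would be:

using UnityEngine;

public class SetClippingPlanes : MonoBehaviour
{
    void Start()
    {
        // Same values as suggested above; tweak them if the flickering persists.
        Camera cam = GetComponent<Camera>();
        cam.nearClipPlane = 0.05f;
        cam.farClipPlane = 2000f;
    }
}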

Try changing the Vuforia Configuration like this:
ARCamera --> Open Vuforia Engine Configuration --> Device Tracker --> Track Device Pose = checked (True) and Tracking Mode = Positional
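If you would rather enable the same setting from code, here is a hedged sketch assuming the Vuforia 8.x Unity scripting API (TrackerManager / PositionalDeviceTracker); the configuration asset above is the usual way to do it:

using UnityEngine;
using Vuforia;

public class EnableDeviceTracking : MonoBehaviour
{
    void Start()
    {
        // Wait until Vuforia has started before touching its trackers.
        VuforiaARController.Instance.RegisterVuforiaStartedCallback(OnVuforiaStarted);
    }

    void OnVuforiaStarted()
    {
        // Initialise and start positional device tracking (the "Track Device Pose" option).
        var tracker = TrackerManager.Instance.InitTracker<PositionalDeviceTracker>();
        if (tracker != null)
        {
            tracker.Start();
        }
    }
}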

Related

Vuforia 6.2.2 Unity 5.4.4 Camera does not work

Hi everyone.
The Vuforia camera does not work; it shows only a black screen on Android 6.0+ versions.
The Vuforia version is 6.2.2 and the Unity version is 5.4.4.
The Vuforia camera does work on Android versions below 5.0, though.
How can I fix this issue?
Any guidance would be appreciated.
This is an interesting problem, and there is a (far from acceptable) temporary fix I am using while testing this system out. On the iPhone 7, to get past it, suspend the app and then return to it; after about 2 seconds the camera will work. I am guessing it will behave similarly on Android. I will update this with a better fix if I come across a real one during my testing, if I decide to use this system.
Edit:
Short answer:
Delete the .meta data for any pre-existing custom camera controller script. If you are using your own camera controller instead, you need to disable the Vuforia one and delete its .meta data. Either way, one script is basically hijacking the camera feed after the other starts it.
Long answer: I began this application by building out my own system, and to test Vuforia I disabled these items (like the camera feed). I went through the logs and saw that, even with these items disabled, my camera feed was still running, and this feed started AFTER the Vuforia camera, so my own Start() methods (even though they were disabled) were hijacking the camera from Vuforia. It turned out that the .meta data for my camera controller script was still poised to run the script even though everything was disabled. After deleting my camera controller script's .meta data it worked fine. You can also just delete the camera controller and that will delete the .meta file. By camera controller I mean my custom-written camera controller that was built before I added in Vuforia. This is a hard fix to find because everything works fine in the Unity editor, but not when you build out to a device: the .meta data doesn't seem to update for the device, just for the Unity editor.
If you are using the Vuforia camera, make sure you either use the Vuforia plane that comes as a child of the camera or delete the .meta data for whatever camera script you wrote. You should get a camera feed in a new empty project just by dropping in the Vuforia camera; there is no need to build your own script, and if you do, make sure one isn't overriding the other like mine was.
If you simply want to verify that it isn't your device or your code, create a NEW empty Unity project, import Vuforia (no need to import the database, just the SDK), drop the Vuforia camera into the scene, and test it. Don't add anything extra or do any image recognition. If this works, the problem is somewhere in your code.
I am using Vuforia 6.2.10, Unity 5.4.4f (64-bit) and a Nexus 7 running Android 6. I had the same issue where the camera was black. I started over, adding one component at a time. The AR Camera alone worked fine. Adding an Image Target also worked. I added a plane and image to the Image Target and the camera failed to work. Setting the image's texture type to Sprite (2D and UI) seemed to help.
I also discovered that deleting the app from the device and creating a fresh APK each time made it work.
I am not sure I'd count on Vuforia in the near term.

How to create a responsive scene

I'm currently creating a 2D Android and iOS game using the Unity3D engine. I'm testing the game on a Nexus 5 and an iPhone 5s. Everything has worked fine so far and I am pretty happy with the result, but when I test the application on an iPad or a Samsung tablet, all the objects in my game scene are no longer in the correct position. Is this a common problem in Unity3D?
I know I am missing something. I tried to do some research, and what I found is that changing the orthographic camera size might fix this problem, but that looks like a big amount of code to write, as my game has not just one scene but multiple scenes, and every scene has its own game objects.
Is there any other method, a good and simple workaround for this problem?
It's all about setting the Anchors right.
If you're using the new UI System, make sure you anchor the objects where you want them to be, that's how you will achieve resolution independence.
For more information about anchoring, see this tutorial
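As a rough illustration of what anchoring does (normally you would set this up in the editor as described above, and the offset values here are made up), pinning an element to its parent's bottom-right corner from script looks like this:

using UnityEngine;

public class AnchorToBottomRight : MonoBehaviour
{
    void Start()
    {
        var rt = GetComponent<RectTransform>();
        rt.anchorMin = new Vector2(1f, 0f);           // anchor to the parent's bottom-right
        rt.anchorMax = new Vector2(1f, 0f);
        rt.pivot = new Vector2(1f, 0f);
        rt.anchoredPosition = new Vector2(-16f, 16f); // small inset from the corner
    }
}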
Don't have separate scenes for separate devices. You can use the Screen object to check the height and width of your display. Then you can use this to set the orthographic size of your camera to something that makes everything visible as expected.
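A minimal sketch of that idea (the 10-unit design width is an assumed value, not something from the question):

using UnityEngine;

[RequireComponent(typeof(Camera))]
public class FitOrthographicCamera : MonoBehaviour
{
    public float designWidth = 10f; // world-space width the scenes were laid out for

    void Start()
    {
        float aspect = (float)Screen.width / Screen.height;
        // orthographicSize is half of the visible height, so derive it from the target width.
        GetComponent<Camera>().orthographicSize = designWidth / (2f * aspect);
    }
}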
Update: I misunderstood your question; you are talking about GameObjects and I read it as UI.
Please check this. I don't have this issue in my game, but when I try it on Mac or Windows machines it is problematic, so maybe this can solve it.
The other, more common solution is what you and Agumander say: change the orthographic size of the camera.
This is for UI:
You can use Unity UI, and you don't need separate scenes. There are so many devices with different resolutions and densities that you would need to create far too many scenes, so separating scenes per device is pointless.
Unity UI has pixel-based solutions which can be very helpful across the many density and resolution options. For example, it has VerticalLayoutGroup and HorizontalLayoutGroup for easy list-like element layouts (a small sketch follows below).
The most important question is: do you want to change the UI for different screen sizes or resolutions? For example, an iPad has a larger screen, so the user can see more content. That changes the UX; maybe you need to consider this.
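As a rough sketch of the layout-group idea (you would normally add and configure this in the editor; the values here are just illustrative):

using UnityEngine;
using UnityEngine.UI;

public class MakeListPanel : MonoBehaviour
{
    void Start()
    {
        // Stack the panel's children vertically, like a simple list.
        var layout = gameObject.AddComponent<VerticalLayoutGroup>();
        layout.spacing = 8f;
        layout.childForceExpandWidth = true;    // rows stretch to the panel's width
        layout.childForceExpandHeight = false;  // rows keep their own height
    }
}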

How to use Vuforia in a PC application

I'm new to Vuforia and I need some help with this topic.
Can someone tell me if it's possible to use Vuforia with a webcam in a PC application?
I get the webcam frames onto a plane texture and I tried to pass those textures to Vuforia, but I haven't got it to work.
I used WebCamTexture to get the frames from the webcam.
While I'm not familiar with Vuforia (although it looks interesting; I'm going to try it out myself this weekend!), I might have a suggestion.
Is it possible Vuforia doesn't know what to do because the images are a WebCamTexture instead of something like a Texture2D?
WebCamTexture is a Texture (Texture2D, WebCamTexture and RenderTexture all extend Texture, but they aren't interchangeable).
So, try converting your texture and passing that on to Vuforia:
var webCam = go.GetComponent<Renderer>().material.mainTexture as WebCamTexture;
var tx2d = new Texture2D(webCam.width, webCam.height);
tx2d.SetPixels(webCam.GetPixels());
tx2d.Apply(); // upload the copied pixels so the texture is actually usable
EDIT: What I found here, under the section Running in the editor:
There is a specific Web Cam Behaviour script.
To use Play Mode for Vuforia in Unity Pro, simply select the attached, or built-in, webcam that you want to use from the Camera Device menu, and then activate Play Mode using the Play button at the top of the Editor UI.
You can also use the standard Unity Play Mode with non-Pro Unity versions and by setting ‘Don't use for Play Mode' in the Web Cam Behaviour component.
To use standard Play Mode, adjust the transform of the ARCamera object to get your entire scene in view, and then run the application in the Unity editor. There is no live camera image or tracking in standard Play Mode, instead all Targets are assumed to be visible. This allows you to test the non-AR components of your application, such as scripts and animations, without having to deploy to the device each time.
Have you tried the $250 plugin at http://holographi.space/ ? I haven't paid $250 for it yet.

Monogame on iOS: different behavior when interpreting touches compared to Windows Phone?

Just to do some tests, I have executed Microsoft's XNA sample "Shooter" on Windows 7, Windows Phone 7 and iPhone (using MonoGame).
Besides the fact that MonoGame runs the game in portrait instead of landscape, I have noticed that the movement of the player sprite behaves really differently.
On Windows Phone 7, you tap anywhere, keep your finger on the screen, and while moving it the sprite follows the movement relative to the finger.
On the iPhone, however, the sprite first doesn't move at all, then moves extremely fast up to the screen's bounds. This makes the player uncontrollable.
The code used is:
while ( TouchPanel.IsGestureAvailable )
{
    GestureSample gesture = TouchPanel.ReadGesture();
    if ( gesture.GestureType == GestureType.FreeDrag )
    {
        player.Position += gesture.Delta;
    }
}
Does that have to be different for iOS?
The MonoGame Team is working on achieving parity with XNA. There are a few hurdles but we still think these are surmountable.
The two areas on iOS we are hoping to get right for the MonoGame 2.5 release are gestures and proper landscape mode support.
I hope this helps.
D.
I haven't updated from GitHub in a little while, but when we started using MonoGame for a project we found the TouchPanel didn't work quite right.
It is a hard problem: how would you flatten event-driven touch events to a static class like XNA has?
To work around it, we forked MonoGame to have our own TouchPanel implementation that just used regular C# events instead. This could be ugly if you need to support regular XNA as well.
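A hedged sketch of what that looks like (none of these names come from MonoGame itself; they just illustrate replacing the polled TouchPanel with plain C# events):

using System;
using Microsoft.Xna.Framework;

public static class TouchEvents
{
    // Raised with the drag delta every time a finger moves.
    public static event Action<Vector2> Dragged;

    // The platform-specific touch handler (e.g. the iOS touch-moved callback) calls this.
    public static void RaiseDragged(Vector2 delta)
    {
        var handler = Dragged;
        if (handler != null)
        {
            handler(delta);
        }
    }
}

Game code then subscribes once, e.g. TouchEvents.Dragged += delta => player.Position += delta;, instead of polling gestures every frame.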

C# Mobile Game Development

I'm currently trying to implement a marble maze game for a WM 5.0 device and have been struggling to develop a working prototype. The prototype needs to let the user control the ball using the directional keys and display realistic acceleration and friction.
I was wondering if anyone has experience with this and can give me some advice or point me in the right direction as to what is essential and the best way to go about doing such a thing.
Thanks in advance.
Frank.
When reading your question I didn't get the feeling you are looking for a game framework, but rather: how can I easily model a ball with acceleration and friction?
For this you don't need a full-fledged physics framework, since it is relatively simple to do:
First create a timer which fires 30 times a second, and in the timer callback do the following:
Draw the maze background
Draw a ball at ballX, ballY (both floating point variables)
Add ballSpdX to ballX and add ballSpdY to ballY (the speed)
Now check the keys...
if the directional key is left, then subtract a small amount from ballSpdX
if the directional key is top-left, then subtract a small amount from both ballSpdX and ballSpdY
etc.
For collision do the following:
First move the ball in the horizontal direction, then check for collisions with the walls. If a collision is detected, move the ball back to its previous position and reverse the speed: ballSpdX = -ballSpdX
Then move the ball in the vertical direction and check for collisions with the walls again. If a collision is detected, move the ball back to its previous position and reverse the speed: ballSpdY = -ballSpdY
By handling the horizontal and vertical movement separately, the collision handling is much easier, since you know which way the ball needs to bounce.
Last but not least, friction: friction is just doing this every frame: ballSpdX *= friction; (and the same for ballSpdY)
where friction is something like 0.99. This makes sure the speed of the ball gets smaller every frame due to friction.
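Putting those steps together, here is a minimal sketch in plain C#. The input and collision helpers (IsKeyDown, CollidesWithWall) are placeholders you would wire up to your own drawing and input code, not a real API:

public class MarbleSim
{
    float ballX, ballY;          // ball position (floating point)
    float ballSpdX, ballSpdY;    // ball speed
    const float Accel = 0.2f;    // speed added per frame while a key is held (made-up value)
    const float Friction = 0.99f;

    // Call this from a timer firing ~30 times per second, after drawing the maze and ball.
    public void Tick()
    {
        // Keys change the speed, not the position directly.
        if (IsKeyDown(Key.Left))  ballSpdX -= Accel;
        if (IsKeyDown(Key.Right)) ballSpdX += Accel;
        if (IsKeyDown(Key.Up))    ballSpdY -= Accel;
        if (IsKeyDown(Key.Down))  ballSpdY += Accel;

        // Horizontal move; on a hit, undo the move and reverse the horizontal speed.
        ballX += ballSpdX;
        if (CollidesWithWall(ballX, ballY))
        {
            ballX -= ballSpdX;
            ballSpdX = -ballSpdX;
        }

        // Vertical move, handled separately so the bounce axis is known.
        ballY += ballSpdY;
        if (CollidesWithWall(ballX, ballY))
        {
            ballY -= ballSpdY;
            ballSpdY = -ballSpdY;
        }

        // Friction: shrink the speed a little every frame.
        ballSpdX *= Friction;
        ballSpdY *= Friction;
    }

    enum Key { Left, Right, Up, Down }

    // Placeholders for the platform-specific parts.
    bool IsKeyDown(Key key) { /* read the directional pad here */ return false; }
    bool CollidesWithWall(float x, float y) { /* test the ball against the maze walls here */ return false; }
}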
Hope this helped
I would recommend checking out XNA Game Studio 3. It has built-in support for PC, Xbox 360 and mobile devices, and it's an official and free spin-off of Visual Studio from Microsoft.
http://creators.xna.com/en-US/
http://blogs.msdn.com/xna/
If you search around, people have written tutorials using physics (velocity on this one)
http://www.xnamachine.com/2007/12/fun-with-very-basic-physics.html
Try XFlib. It is in C++, but most cool things for mobile have to be in C++, unfortunately. The site has some very cool free games, and you can also see the source of most of them. Many have the physics you want.
Unfortunately, XNA doesn't support the Windows Mobile platform. However, since your problem doesn't seem to be the technical issue of drawing on the WM device, but rather the logic required to implement physics-based movement, it's not a bad idea to consider XNA for prototyping the physics and movement code.
Check out some of the educational topics at creators.xna.com, and also "gamedev.net"
If you are at a loss, there's no harm in trying a "lighter" tool for the prototype. I would try Torque Game Builder - it spits out XNA, although it may not be meant for your platform.
Among the samples of the Windows Mobile SDK (check out the WM 6.0 SDK too), there are a couple of game applications. One of them is a simple puzzle game; not much, but it is a starting point.
The use of physics in game development is not specific to Windows Mobile; you can find a huge amount of literature on the subject. If you are serious about game development, on any platform, you should do a little research first.
I don't know if this may help, but I saw a marble application for the Android platform on Google Code. Check it out here; it may give some insight into the actual logic of the game.
The code is open source and written in Java (using the Android SDK), but nevertheless it may be useful. Also, to better understand the code, check out the documentation for SensorManager, SensorEvent, etc. here.
I wouldn't recommend using the same architecture as this application, though.
