Inside GvrViewer.cs I've tried setting the defaultDeviceProfile by using:
public Uri DefaultDeviceProfile = new Uri("http://vr.google.com/cardboard/download/?p=CgZPd2wgVlISBlZpZXdlch0xCCw9JdV4aT0qEAAASEIAAEhCAABIQgAASEJYATUpXA89Oggfhes-H4VrPlAAYAM");
We've launched on the Play and App Stores, and it has been reported that the app isn't defaulting to the correct device profile. When the user scans the barcode the image distortion looks correct; however, on launch the headset is not calibrated.
I'm aware that this only works if a headset hasn't been previously paired.
Question 1
How can I debug this? When we delete the app on both iOS and Android, the previous profile is still loaded :/
Question 2
Can I force this elsewhere (maybe in BaseVRDevices.cs)?
Any help would be much appreciated; unfortunately this slipped through our QA...
Related
I'm working on a Unity project that uses the device camera for AR purposes (I'm using AR Foundation). I currently have it set so that the device won't go to sleep (because I want it to operate for long periods without user interaction). Specifically, Screen.sleepTimeout = SleepTimeout.NeverSleep. If I didn't have this, the device would go to sleep and the program would stop functioning. Does anyone know if there is a way to keep the program working in the background after the device goes to sleep... similar to how music apps keep playing music? I've come up empty-handed after a few hours of searching for a solution. Any help would be greatly appreciated!
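For reference, the sleep override I mentioned is just one line in a startup script; a minimal sketch (the "KeepAwake" class name is just a placeholder):

using UnityEngine;

// Minimal sketch: keeps the screen awake for long unattended AR sessions.
// "KeepAwake" is a placeholder name for whatever startup MonoBehaviour is used.
public class KeepAwake : MonoBehaviour
{
    void Awake()
    {
        // Prevents the OS from dimming or sleeping the screen while the app is in the foreground.
        Screen.sleepTimeout = SleepTimeout.NeverSleep;
    }
}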
Some people aren't finding this in more recent versions of Unity, but it's still there: in Player Settings, under the "Resolution and Presentation" section, you'll find "Resolution" and, below that, "Run In Background".
This is only available for Web Player and standalone builds (anything except iOS and Android, pretty much); you won't find it when you have the iOS, Android or Flash tab selected.
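The same toggle is also exposed from script as Application.runInBackground, if you'd rather set it at runtime. A minimal sketch (and, as above, the flag is ignored on iOS and Android):

using UnityEngine;

// Sketch only: mirrors the "Run In Background" player setting from code.
// On mobile platforms this flag is ignored, matching the editor behaviour described above.
public class BackgroundRunner : MonoBehaviour
{
    void Awake()
    {
        Application.runInBackground = true;
    }
}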
I just started messing with Firebase Analytics. When I build an Android Unity app, it doesn't seem to automatically track screen time and views per user... it's always at 0 min 0 sec, the screen class is (not set) and the screen name is (not set)...
I can see other events but cannot track screen time...
I added Firebase.Analytics.FirebaseAnalytics.SetCurrentScreen(), but it still doesn't show the time. Has anyone had this issue? If so, how can I fix it?
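Roughly what I'm calling; the screen name and class here are just placeholders:

using Firebase.Analytics;

// Sketch of the call described above; "MainScene" and "MainSceneController" are placeholder names.
public static class ScreenTracking
{
    public static void ReportScreen()
    {
        // First argument is the screen name, second is the screen class.
        FirebaseAnalytics.SetCurrentScreen("MainScene", "MainSceneController");
    }
}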
Thanks
Guess I just had to wait... seems like Firebase Analytics is not real-time like I thought it was...
My computer has two webcams. How do I tell CefSharp which camera to record from? How can I switch cameras via C# on the fly? My use case is changing the default camera for Hangouts chats in a CefSharp browser inside a WPF project.
I tried to find some info on the CEF forums but did not find any workaround there.
As a workaround I tried switching one device off entirely (through devcon.exe), but that requires rebooting the PC, which is unacceptable for me.
Moreover, there is no notion of a "default" device for webcams in Windows, which means I can't change the default camera at the OS level.
So I want to change it at the CefSharp level.
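A direction that might be worth trying (an unverified sketch; it only affects scripts that open their own stream, so it may not change anything for Hangouts, which manages its own capture): inject JavaScript through CefSharp that enumerates the cameras and requests a specific one by label:

using CefSharp;
using CefSharp.Wpf;

// Unverified sketch: injects script that looks for a camera whose label contains
// "labelFragment" and opens it with getUserMedia. Pages that manage their own
// media streams will not necessarily pick this up.
public static class CameraScript
{
    public static void RequestCamera(ChromiumWebBrowser browser, string labelFragment)
    {
        string script = @"
            navigator.mediaDevices.enumerateDevices().then(function (devices) {
                var cam = devices.filter(function (d) {
                    return d.kind === 'videoinput' && d.label.indexOf('" + labelFragment + @"') !== -1;
                })[0];
                if (cam) {
                    navigator.mediaDevices.getUserMedia({ video: { deviceId: { exact: cam.deviceId } } });
                }
            });";
        browser.ExecuteScriptAsync(script);
    }
}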
I'm working with maps in my Xamarin.Forms project. When I start the simulator I get the map while it is zoomed out, but when I click a button (that moves the map closer, i.e. zooms it in) the map doesn't load and remains gray. The simulator I am using is the iPhone 6s. If I use the iPad Air simulator, however, it works just fine and the map loads when it gets more zoomed in.
When I deploy the app to my phone it remains gray as well. If I zoom out again it loads, but if I go further into the map it turns gray again.
I have both the "Location Always Usage" and "Location When In Use Usage" permissions set, and I use the standard iOS map.
I use version 2.1.0.6529 of both Xamarin.Forms.Maps and Xamarin.Forms in my main project and the iOS project (the latest version).
I have "Xamarin.FormsMaps.Init();" in my AppDelegate.cs file as well as some other code (push notifications); might that interfere somehow?
I am very confused about what the issue might be.
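For context, the AppDelegate wiring mentioned above looks roughly like this (simplified; the push-notification setup is left out):

using Foundation;
using UIKit;

[Register("AppDelegate")]
public partial class AppDelegate : global::Xamarin.Forms.Platform.iOS.FormsApplicationDelegate
{
    public override bool FinishedLaunching(UIApplication app, NSDictionary options)
    {
        global::Xamarin.Forms.Forms.Init();
        Xamarin.FormsMaps.Init();   // must be called before any Map is rendered
        LoadApplication(new App()); // "App" is the standard Xamarin.Forms Application class
        return base.FinishedLaunching(app, options);
    }
}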
My first question here...
I have been searching for how to solve this issue. I start an app using MonoTouch, creating my MPMoviePlayerController like this:
this.mp = new MPMoviePlayerController(new NSUrl("http://stream3.dyndns.org:1935/iphone/xeco.stream/playlist.m3u8"));
this.mp.UseApplicationAudioSession = false; // also tried with true
this.mp.PrepareToPlay();
this.mp.Play();
Everything works great; I hear the audio.
Notes:
I already searched for how to fix this and added UIBackgroundModes with the string "audio" to my plist file, as many members advise (using the plist editor provided with MonoTouch). I also added this code to set the category of my audio session:
AudioSession.Initialize();
AudioSession.Interrupted += delegate {
    Console.WriteLine("interrupted");
};
AudioSession.Category = AudioSessionCategory.MediaPlayback; // also tried AmbientSound and others
The problem is:
How can I keep the audio playing when the app goes to the background, when the device locks, or when the home button is pushed?
The stream always seems to get muted when I press home or lock the device; when I return to the app the audio starts again very quickly, which makes me think the app is streaming the whole time and the audio just isn't audible.
I am only testing on the iPhone Simulator, so I am starting to think that maybe this happens only in the simulator. Any advice?
Thanks in advance.
UPDATE:
I received my MonoTouch license and my registration in the iOS Dev Program, so I tested my app on the device. It works; the important part is the AudioSession category:
AudioSession.Category = AudioSessionCategory.MediaPlayback;
and setting the player to:
this.mp.UseApplicationAudioSession = false;
The app works great on the device. MonoTouch rules!