Can someone help me use the native Android camera in Unity the same way we use WebCamTexture? I want the camera preview image available in Unity as a Texture2D or Texture.
Thanks in advance.
I did this once using https://bitbucket.org/Unity-Technologies/graphicsdemos as a starting point. You will be mostly interested in RenderingPlugin.cpp.
Back in the day I had to write Java code as a bridge to the Android NDK, since it was not possible to attach an ImageReader to the camera feed at the C level for security reasons.
It looks like things have been updated since then, and it might be possible to simplify the architecture by using https://github.com/googlesamples/android-ndk/tree/master/camera.
Hope this helps.
PS: in case you want a code walkthrough, I put together an end-to-end demo here: https://github.com/robsondepaula/unity-android-native-camera
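For orientation, the Unity side of that architecture boils down to creating a Texture2D, handing its native pointer to the plugin, and asking the plugin to update it on the render thread each frame. This is only a sketch: the plugin name "nativecamera" and the exports SetTextureFromUnity / GetRenderEventFunc are placeholders modelled on the RenderingPlugin sample, not a ready-made API.

```csharp
using System;
using System.Collections;
using System.Runtime.InteropServices;
using UnityEngine;

public class NativeCameraPreview : MonoBehaviour
{
    // Placeholder plugin name and exports -- match them to your own native library.
    [DllImport("nativecamera")]
    private static extern void SetTextureFromUnity(IntPtr texture, int width, int height);

    [DllImport("nativecamera")]
    private static extern IntPtr GetRenderEventFunc();

    private Texture2D previewTexture;

    IEnumerator Start()
    {
        // Texture Unity will display; the native side writes camera frames into it.
        previewTexture = new Texture2D(1280, 720, TextureFormat.RGBA32, false);
        previewTexture.Apply();
        GetComponent<Renderer>().material.mainTexture = previewTexture;

        // Hand the GPU texture handle to the plugin once.
        SetTextureFromUnity(previewTexture.GetNativeTexturePtr(),
                            previewTexture.width, previewTexture.height);

        // Each frame, ask the plugin (on the render thread) to copy the
        // latest camera frame into the texture.
        while (true)
        {
            yield return new WaitForEndOfFrame();
            GL.IssuePluginEvent(GetRenderEventFunc(), 1);
        }
    }
}
```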
Related
I have been testing an app that I developed in Unity3D on multiple Android devices.
For some reason, on some devices the camera seems blurry. Can anyone help me out, please?
FOCUS_MODE_CONTINUOUSAUTO works well on all of my Android devices. Read the article below for more info.
https://developer.vuforia.com/library/articles/Solution/Camera-Focus-Modes
We encourage using FOCUS_MODE_CONTINUOUSAUTO in your applications whenever it is available on the device. When setting this mode, if the return value of setFocusMode() is TRUE, your application will provide sharp camera images for both superior rendering and robust tracking performance.
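In Unity this amounts to a single call once Vuforia has started the camera. A minimal sketch, assuming a reasonably recent Vuforia SDK (older releases register the started callback differently):

```csharp
using UnityEngine;
using Vuforia;   // recent SDKs; older versions expose these classes without a namespace

public class ContinuousAutofocus : MonoBehaviour
{
    void Start()
    {
        // Apply the focus mode only after Vuforia has started the camera.
        VuforiaARController.Instance.RegisterVuforiaStartedCallback(OnVuforiaStarted);
    }

    void OnVuforiaStarted()
    {
        bool focusSet = CameraDevice.Instance.SetFocusMode(
            CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);

        if (!focusSet)
        {
            // Device doesn't support continuous autofocus; fall back to the default.
            Debug.Log("Continuous autofocus not supported, using FOCUS_MODE_NORMAL.");
            CameraDevice.Instance.SetFocusMode(CameraDevice.FocusMode.FOCUS_MODE_NORMAL);
        }
    }
}
```

If SetFocusMode() returns false, the device simply does not support continuous autofocus, and the blur you see on those devices is most likely a fixed-focus or normal-focus camera.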
I am looking for an audio player feature to add to my app; I would like it to have play, stop/pause, and a progress bar. I referred to https://developer.xamarin.com/samples/monodroid/Example_WorkingWithAudio/. I want to implement the feature for Android, iOS, and Windows.
Are there any available plugins I can use, or any other reference links that could help?
Take a look at the Xamarin-Forms-Labs plugin; it has an AudioService.
Here is the code for the interface.
And here is a usage example.
For those facing the same issue: you can achieve this by building the audio player yourself with the help of DependencyService and the native (platform-specific) APIs.
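To make that concrete, here is a minimal sketch of the DependencyService approach. The interface name, namespaces, and file layout are placeholders of my own; the Android side uses Android.Media.MediaPlayer, and the iOS (AVAudioPlayer) and Windows implementations would follow the same pattern.

```csharp
// Shared project: the contract your Xamarin.Forms code programs against.
namespace MyApp.Services
{
    public interface IAudioPlayer
    {
        void Play(string filePath);
        void Pause();
        void Stop();
        double CurrentPositionSeconds { get; }   // feed this into the progress bar
        double DurationSeconds { get; }
    }
}
```

```csharp
// Android project: implementation registered with DependencyService.
using Android.Media;
using MyApp.Droid.Services;
using MyApp.Services;
using Xamarin.Forms;

[assembly: Dependency(typeof(AndroidAudioPlayer))]
namespace MyApp.Droid.Services
{
    public class AndroidAudioPlayer : IAudioPlayer
    {
        private MediaPlayer player;

        public void Play(string filePath)
        {
            Stop();
            player = new MediaPlayer();
            player.SetDataSource(filePath);
            player.Prepare();
            player.Start();
        }

        public void Pause() => player?.Pause();

        public void Stop()
        {
            player?.Stop();
            player?.Release();
            player = null;
        }

        public double CurrentPositionSeconds => player?.CurrentPosition / 1000.0 ?? 0.0;
        public double DurationSeconds => player?.Duration / 1000.0 ?? 0.0;
    }
}
```

In shared code you then call DependencyService.Get&lt;IAudioPlayer&gt;().Play(path) from your play button, and poll CurrentPositionSeconds / DurationSeconds on a timer to drive the progress bar.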
I am looking at using the Kinect version 2 for PC to control an avatar, i.e. apply a skin to the Body Basics example provided by Microsoft.
The avatar isn't to be used in a game or anything at this point. I simply plan to be able to control the avatar using the Kinect 2 and record to .avi. I have already implemented a demo where I can record the Kinect footage of the skeleton directly to .avi.
After searching, I found that it was possible to control an avatar with the original Kinect using XNA Game Studio. As XNA Game Studio seems a bit outdated now and I'm unsure whether it would be compatible with version 2, is there an alternative approach? Does anybody know of a project where someone has done this already? Or can anyone suggest a good approach?
I have also seen some articles referring to a plugin that lets you use Unity with Kinect 2, but from what I can see I don't believe it is publicly available?
Many thanks for any help.
Yes, the Unity plugin works great and it's the best out there. The creator is called Rumen; he will help you if you need it, but of course you have to buy the plugin first. It's worth it, it makes Kinect way simpler.
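Whichever plugin you use, driving an avatar comes down to the same loop: open the sensor, read body frames, and map the tracked joint positions/orientations onto your rig's bones every frame. A rough sketch against the Kinect for Windows SDK v2 (the Unity plugin exposes the same types under the Windows.Kinect namespace):

```csharp
using System;
using Microsoft.Kinect;

class BodyTracker
{
    private KinectSensor sensor;
    private BodyFrameReader reader;
    private Body[] bodies;

    public void Start()
    {
        sensor = KinectSensor.GetDefault();
        sensor.Open();
        bodies = new Body[sensor.BodyFrameSource.BodyCount];
        reader = sensor.BodyFrameSource.OpenReader();
        reader.FrameArrived += OnFrameArrived;
    }

    private void OnFrameArrived(object sender, BodyFrameArrivedEventArgs e)
    {
        using (BodyFrame frame = e.FrameReference.AcquireFrame())
        {
            if (frame == null) return;
            frame.GetAndRefreshBodyData(bodies);

            foreach (Body body in bodies)
            {
                if (!body.IsTracked) continue;

                // Joint positions are in meters, camera space. Drive your avatar's
                // bone transforms from these, or from body.JointOrientations.
                CameraSpacePoint hand = body.Joints[JointType.HandRight].Position;
                Console.WriteLine("Right hand: {0:F2}, {1:F2}, {2:F2}", hand.X, hand.Y, hand.Z);
            }
        }
    }
}
```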
I have some points like:
1,2,3
1,1,1
2,3,4
2,5,6
9,10,2
66,43,23
I want to draw them and view them on a page, but I don't know how to do this. I read a little about XNA, but I think there may be a better way. Can you help me?
Update
I want to display the 3D points the way 3ds Max does, i.e. have a 3D scene that I can look at from all angles.
What you need is a 3D graphics API.
There are two main ones:
OpenGL
Direct3D
OpenGL is cross platform and is used on mobile devices like Android and iPhone.
Direct3D is Microsoft specific and generally is for Windows platform only.
Don't overestimate the value of building a cross-platform app. If it's a small pet project and you're not planning to go cross platform in the future, don't choose a more difficult API just because it's cross platform.
Getting started with OpenGL is quite easy via GLUT from C++, and there are tons of simple examples on the web.
You can also use Direct3D via XNA, which is also easy to start with. There are a huge number of XNA 4.0 tutorials on YouTube (I haven't watched them, they just have very green like bars!) as well as the MSDN tutorials.
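If you do go the XNA route, here is a rough sketch of drawing the coordinates from the question with an orbiting camera, so you can see them from all angles. Note that XNA 4.0 dropped PrimitiveType.PointList, so each point is drawn as a small cross of line segments instead.

```csharp
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

public class PointViewerGame : Game
{
    private GraphicsDeviceManager graphics;
    private BasicEffect effect;
    private VertexPositionColor[] vertices;
    private float angle;

    // The points from the question.
    private static readonly Vector3[] Points =
    {
        new Vector3(1, 2, 3), new Vector3(1, 1, 1), new Vector3(2, 3, 4),
        new Vector3(2, 5, 6), new Vector3(9, 10, 2), new Vector3(66, 43, 23),
    };

    public PointViewerGame() { graphics = new GraphicsDeviceManager(this); }

    protected override void LoadContent()
    {
        effect = new BasicEffect(GraphicsDevice) { VertexColorEnabled = true };

        // Six line vertices (three crossing segments) per point.
        const float s = 0.5f;
        vertices = new VertexPositionColor[Points.Length * 6];
        for (int i = 0; i < Points.Length; i++)
        {
            Vector3 p = Points[i];
            vertices[i * 6 + 0] = new VertexPositionColor(p - Vector3.UnitX * s, Color.Yellow);
            vertices[i * 6 + 1] = new VertexPositionColor(p + Vector3.UnitX * s, Color.Yellow);
            vertices[i * 6 + 2] = new VertexPositionColor(p - Vector3.UnitY * s, Color.Yellow);
            vertices[i * 6 + 3] = new VertexPositionColor(p + Vector3.UnitY * s, Color.Yellow);
            vertices[i * 6 + 4] = new VertexPositionColor(p - Vector3.UnitZ * s, Color.Yellow);
            vertices[i * 6 + 5] = new VertexPositionColor(p + Vector3.UnitZ * s, Color.Yellow);
        }
    }

    protected override void Draw(GameTime gameTime)
    {
        GraphicsDevice.Clear(Color.Black);

        // Orbit the camera around the data so the points can be viewed from every side.
        angle += (float)gameTime.ElapsedGameTime.TotalSeconds;
        Vector3 target = new Vector3(10, 10, 5);
        Vector3 eye = target + new Vector3((float)System.Math.Cos(angle) * 80f, 30f,
                                           (float)System.Math.Sin(angle) * 80f);

        effect.World = Matrix.Identity;
        effect.View = Matrix.CreateLookAt(eye, target, Vector3.Up);
        effect.Projection = Matrix.CreatePerspectiveFieldOfView(
            MathHelper.PiOver4, GraphicsDevice.Viewport.AspectRatio, 0.1f, 500f);

        foreach (EffectPass pass in effect.CurrentTechnique.Passes)
        {
            pass.Apply();
            GraphicsDevice.DrawUserPrimitives(PrimitiveType.LineList, vertices, 0, vertices.Length / 2);
        }

        base.Draw(gameTime);
    }
}
```

Run it with `using (var game = new PointViewerGame()) game.Run();` in your Program.Main.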
I'm a student at Nanyang Technological University (NTU), Singapore, currently developing a project using the Kinect SDK.
My question is: does anyone know how to develop a program that detects a finger (or fingertip) using the Kinect SDK, or perhaps have some reference code? I also tried searching on Google, but the only references I found use OpenNI instead of the Kinect SDK.
Thanks and Regards
I was looking into that myself, although I haven't gone deep into it.
OpenNI has some constants for fingertip/wrist detection, but they aren't implemented yet,
and OpenNI isn't an option for your setup anyway.
Here's a list of resources that will hopefully get you started (there is also a rough do-it-yourself sketch after the list):
MIT CSAIL Hand detection
FORTH ICS - Efficient model-based 3D tracking of hand articulations using Kinect
Candescent NUI project
Other random videos
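For what it's worth, the simplest do-it-yourself approach with the Microsoft SDK (v1.x) is to segment the hand by depth around the tracked hand joint and then take an extremal pixel of that blob as the fingertip. This is only a heuristic sketch, not an official SDK feature: rawDepth is the array you get from DepthImageFrame.CopyPixelDataTo, and the hand's depth-space coordinates come from CoordinateMapper.MapSkeletonPointToDepthPoint.

```csharp
using System;
using Microsoft.Kinect;   // for DepthImageFrame.PlayerIndexBitmaskWidth

// Crude fingertip heuristic:
//  1. Track a hand joint from the SkeletonStream (e.g. JointType.HandRight).
//  2. Map it into depth space, giving (handX, handY, handDepthMm) below.
//  3. Threshold pixels whose depth is close to the hand's depth (the hand "blob").
//  4. Take the topmost blob pixel as the fingertip candidate (assumes a raised finger).
public static class FingertipFinder
{
    public static bool TryFindFingertip(short[] rawDepth, int width, int height,
                                        int handX, int handY, int handDepthMm,
                                        out int tipX, out int tipY,
                                        int searchRadius = 80, int toleranceMm = 60)
    {
        tipX = tipY = -1;
        bool found = false;

        // Only scan a window around the hand joint.
        for (int y = Math.Max(0, handY - searchRadius); y < Math.Min(height, handY + searchRadius); y++)
        {
            for (int x = Math.Max(0, handX - searchRadius); x < Math.Min(width, handX + searchRadius); x++)
            {
                // Strip the player-index bits to get depth in millimetres.
                int depthMm = rawDepth[y * width + x] >> DepthImageFrame.PlayerIndexBitmaskWidth;

                if (Math.Abs(depthMm - handDepthMm) < toleranceMm && (!found || y < tipY))
                {
                    tipX = x;
                    tipY = y;
                    found = true;
                }
            }
        }
        return found;
    }
}
```

This only works for a raised or extended finger; for anything robust you will want contour and convex-hull analysis along the lines of the Candescent NUI project above.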