I've got a problem importing a .png file into my project in code.
This is my .png file before importing:
After this code:
string pngPath = Application.persistentDataPath + "/images/testImage.png";
var pngImage = LoadPNG(pngPath);
var meshRenderer = GameObject.Find("SimInput").GetComponent<MeshRenderer>();
meshRenderer.material.mainTexture = pngImage;
public static Texture2D LoadPNG(string filePath)
{
Texture2D tex = null;
byte[] fileData;
if (File.Exists(filePath))
{
fileData = File.ReadAllBytes(filePath);
tex = new Texture2D(2, 2); // size is a placeholder, replaced by LoadImage
tex.LoadImage(fileData); // loads the PNG and resizes the texture to its dimensions
}
return tex;
}
I get the result as shown below:
Where am I making a mistake?
I have tried saving this file again as a .png, but the result was the same as in the first picture. Is there a property I need to change in Unity?
Thank you in advance.
This is a transparency issue. You are using the Standard material, which has its "Rendering Mode" set to "Opaque". You have to set it to "Fade" or "Transparent"; in this case, "Fade" should work better. After this, you can adjust the Metallic and Smoothness sliders to make it darker or lighter. You can also use another shader such as Sprites ---> Default, UI ---> Default or Unlit ---> Transparent, and those should work without having to set anything else.
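If you need to switch the Standard shader to Fade mode from code rather than in the Inspector, a sketch like the following is commonly used. The property and keyword names below are the Standard shader's internal ones (the same ones Unity's material inspector sets); treat this as an illustration, not an official API:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public static class MaterialModeHelper
{
    // Switch a Standard-shader material to Fade rendering mode at runtime.
    public static void SetToFade(Material mat)
    {
        mat.SetFloat("_Mode", 2f); // 2 corresponds to Fade in the Standard shader
        mat.SetInt("_SrcBlend", (int)BlendMode.SrcAlpha);
        mat.SetInt("_DstBlend", (int)BlendMode.OneMinusSrcAlpha);
        mat.SetInt("_ZWrite", 0); // transparent surfaces don't write depth
        mat.DisableKeyword("_ALPHATEST_ON");
        mat.EnableKeyword("_ALPHABLEND_ON");
        mat.DisableKeyword("_ALPHAPREMULTIPLY_ON");
        mat.renderQueue = (int)RenderQueue.Transparent;
    }
}
```

You would call `MaterialModeHelper.SetToFade(meshRenderer.material)` after assigning the texture.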
While this will solve your issue, if all you need to do is display the loaded Texture, use the RawImage component. This is the appropriate way to display a Texture2D. To create one, go to GameObject ---> UI ---> RawImage, then use the simple code below to display the texture on the RawImage.
//Set a RawImage in the Inspector
public RawImage rawImage;
void Start()
{
Texture2D pngImage = LoadPNG(pngPath);
rawImage.texture = pngImage;
}
Related
I have CamFeed.rendertexture and CamFeed.mat files. I'm loading them with my script, setting the RenderTexture as the camera's target texture and the material on a quad, but it just shows a black screen.
Material newMat = Resources.Load("CamFeed", typeof(Material)) as Material;
RenderTexture rendertexture = Resources.Load("CamFeed", typeof(Material)) as RenderTexture;
FirstPersonCamera.targetTexture = rendertexture;
quad.GetComponent<Renderer>().material = newMat;
How can I be able to sort it out?
Well, firstly, why are you converting a typeof(Material) into RenderTexture? You should have one of each in your Resources folder, named distinctively, so that you instantiate them the following way:
Material newMat = Resources.Load<Material>("CamFeedMaterial");
RenderTexture rendertexture = Resources.Load<RenderTexture>("CamFeedTexture");
Also make sure that your material uses a shader such as Unlit/Texture, since you are going to need to attach the RenderTexture to it, either in the Editor or from code:
newMat.mainTexture = rendertexture;
Finally, your last bit of code stays the same:
FirstPersonCamera.targetTexture = rendertexture;
quad.GetComponent<Renderer>().material = newMat;
I am developing a HoloLens project that needs to reference .txt files. I have the files stored in Unity's Resources folder and they work perfectly fine when run via Unity:
string basePath = Application.dataPath;
string metadataPath = String.Format(@"\Resources\...\metadata.txt", list);
// If metadata exists, set title and introduction strings.
if (File.Exists(basePath + metadataPath))
{
using (StreamReader sr = new StreamReader(new FileStream(basePath + metadataPath, FileMode.Open)))
{
...
}
}
However, when building the program for HoloLens deployment, the code runs but doesn't work: none of the resources show up, and when examining the HoloLens Visual Studio solution (created by selecting Build in Unity), I don't even see a Resources or Assets folder. Am I doing something wrong, or is there a special way to handle such resources?
Also with image and sound files...
foreach (string str in im)
{
spriteList.Add(Resources.Load<Sprite>(str));
}
The string 'str' is valid; it works absolutely fine in Unity. However, again, it loads nothing when running on the HoloLens.
You can't read the Resources directory with the StreamReader or the File class. You must use Resources.Load.
1. The path is relative to any Resources folder inside the Assets folder of your project.
2. Do not include file extensions such as .txt, .png or .mp3 in the path parameter.
3. Use forward slashes instead of backslashes when you have another folder inside the Resources folder; backslashes won't work.
Text files:
TextAsset txtAsset = (TextAsset)Resources.Load("textfile", typeof(TextAsset));
string tileFile = txtAsset.text;
Supported TextAsset formats:
.txt, .html, .htm, .xml, .bytes, .json, .csv, .yaml, .fnt
Sound files:
AudioClip audio = Resources.Load("soundFile", typeof(AudioClip)) as AudioClip;
Image files:
Texture2D texture = Resources.Load("textureFile", typeof(Texture2D)) as Texture2D;
Sprites - Single:
Image with Texture Type set to Sprite (2D and UI) and
Image with Sprite Mode set to Single.
Sprite sprite = Resources.Load("spriteFile", typeof(Sprite)) as Sprite;
Sprites - Multiple:
Image with Texture Type set to Sprite (2D and UI) and
Image with Sprite Mode set to Multiple.
Sprite[] sprites = Resources.LoadAll<Sprite>("spriteFile");
Video files (Unity >= 5.6):
VideoClip video = Resources.Load("videoFile", typeof(VideoClip)) as VideoClip;
GameObject Prefab:
GameObject prefab = Resources.Load("shipPrefab", typeof(GameObject)) as GameObject;
3D Mesh (such as FBX files)
Mesh model = Resources.Load("yourModelFileName", typeof(Mesh)) as Mesh;
3D Mesh (from GameObject Prefab)
MeshFilter modelFromGameObject = Resources.Load("yourGameObject", typeof(MeshFilter)) as MeshFilter;
Mesh loadedMesh = modelFromGameObject.sharedMesh; //Or modelFromGameObject.mesh
3D Model (as GameObject)
GameObject loadedObj = Resources.Load("yourGameObject") as GameObject;
//MeshFilter meshFilter = loadedObj.GetComponent<MeshFilter>();
//Mesh loadedMesh = meshFilter.sharedMesh;
GameObject object1 = Instantiate(loadedObj) as GameObject;
Accessing files in a sub-folder:
For example, if you have a shoot.mp3 file which is in a sub-folder called "Sound" that is placed in the Resources folder, you use the forward slash:
AudioClip audio = Resources.Load("Sound/shoot", typeof(AudioClip)) as AudioClip;
Asynchronous Loading:
IEnumerator loadFromResourcesFolder()
{
//Request data to be loaded
ResourceRequest loadAsync = Resources.LoadAsync("shipPrefab", typeof(GameObject));
//Wait till we are done loading
while (!loadAsync.isDone)
{
Debug.Log("Load Progress: " + loadAsync.progress);
yield return null;
}
//Get the loaded data
GameObject prefab = loadAsync.asset as GameObject;
}
To use: StartCoroutine(loadFromResourcesFolder());
If you want to load a text file from Resources, do not specify the file extension; simply follow the code below:
private void Start()
{
TextAsset t = Resources.Load<TextAsset>("textFileName");
List<string> lines = new List<string>(t.text.Split('\n'));
foreach (string line in lines)
{
Debug.Log(line);
}
}
Debug.Log will print all the data in the text file.
You can also try storing your text files in the StreamingAssets folder. It works fine in my project.
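A minimal sketch of that approach, assuming a file named "metadata.txt" placed directly in Assets/StreamingAssets (the file name is a placeholder; note that on some platforms, such as Android and WebGL, StreamingAssets cannot be read with System.IO and requires UnityWebRequest instead):

```csharp
using System.IO;
using UnityEngine;

public class StreamingAssetReader : MonoBehaviour
{
    void Start()
    {
        // StreamingAssets files are copied into the build verbatim,
        // unlike Resources, which are packed into asset bundles.
        string path = Path.Combine(Application.streamingAssetsPath, "metadata.txt");
        if (File.Exists(path))
        {
            string contents = File.ReadAllText(path);
            Debug.Log(contents);
        }
        else
        {
            Debug.LogWarning("metadata.txt not found at " + path);
        }
    }
}
```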
I need help converting an Emgu CV VideoCapture image for Unity's LoadRawTextureData, in order to display images/videos in Unity3D 2018.1.
I am able to display images, but they show strange lines and distortion effects, some kind of stuttering/scattering.
I read the question in this post and applied the solution, but the problem was not solved: Convert Mat to Texture2d (Stack Overflow).
I think the color space of the Emgu CV image does not match the Texture2D's.
Code:
void Start () {
vc = new VideoCapture(0);
vc.FlipVertical = true; //The image I am getting from webcam is flipped
myMaterial = GetComponent<Renderer>().material;
frameHeight = (int)vc.GetCaptureProperty(Emgu.CV.CvEnum.CapProp.FrameHeight);
frameWidth = (int)vc.GetCaptureProperty(Emgu.CV.CvEnum.CapProp.FrameWidth);
camera = new Texture2D(frameHeight,frameWidth, TextureFormat.RGB24, false,false);
}
I get images in this part of the code and convert the color space to match the Texture2D format:
void Update () {
Mat secondimage = new Mat();
Mat myimage = vc.QueryFrame();
CvInvoke.CvtColor(myimage, secondimage, Emgu.CV.CvEnum.ColorConversion.Bgr2Rgb);
Image<Rgb, byte> hello = secondimage.ToImage<Rgb, byte>();
camera.LoadRawTextureData(hello.bytes);
camera.Apply();
myMaterial.mainTexture = camera;
}
Results:
The result is kind of strange; take a look at the following result images.
This is my hand held in front of the camera. This image is the original Texture2D image.
Image1
This image is displayed on the 3d plane in unity3d.
Image2
As the code shows, I already convert the image from BGR to RGB.
Update:
I simply exchanged the width and height in the Texture2D and it worked:
camera = new Texture2D(frameWidth, frameHeight, TextureFormat.RGB24,false,false);
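The reason the swap matters: LoadRawTextureData expects exactly width × height × bytesPerPixel bytes laid out row by row. With the dimensions swapped, the total byte count still matches, so Unity raises no error; the rows are simply cut at the wrong stride, which produces the slanted, scattered look. A sanity check along these lines can catch the mismatch early (assuming the Emgu CV image exposes its raw buffer via a `Bytes` property, as in the code above):

```csharp
// "hello" is the Image<Rgb, byte> from the Update method above.
// TextureFormat.RGB24 means 3 bytes per pixel, row-major order.
int expectedBytes = frameWidth * frameHeight * 3;
if (hello.Bytes.Length != expectedBytes)
{
    Debug.LogWarning("Raw buffer is " + hello.Bytes.Length
        + " bytes, expected " + expectedBytes
        + " - check the texture's width/height and format.");
}
camera.LoadRawTextureData(hello.Bytes);
camera.Apply();
```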
I've done a lot of research, but I can't find a suitable solution that works with Unity3D/C#. I'm using a Fove HMD and would like to record a video from the integrated camera. So far I've managed to take a snapshot of the camera every update, but I can't find a way to merge these snapshots into a video. Does someone know a way of converting them, or can someone point me in the right direction for my research?
public class FoveCamera : SingletonBase<FoveCamera>{
private bool camAvailable;
private WebCamTexture foveCamera;
private List<Texture2D> snapshots = new List<Texture2D>(); // must be initialized before Update adds to it
void Start ()
{
//-------------just checking if webcam is available
WebCamDevice[] devices = WebCamTexture.devices;
if (devices.Length == 0)
{
Debug.LogError("FoveCamera could not be found.");
camAvailable = false;
return;
}
foreach (WebCamDevice device in devices)
{
if (device.name.Equals("FOVE Eyes"))
foveCamera = new WebCamTexture(device.name);//screen.width and screen.height
}
if (foveCamera == null)
{
Debug.LogError("FoveCamera could not be found.");
return;
}
//-------------camera found, start with the video
foveCamera.Play();
camAvailable = true;
}
void Update () {
if (!camAvailable)
{
return;
}
//loading snap from camera
Texture2D snap = new Texture2D(foveCamera.width,foveCamera.height);
snap.SetPixels(foveCamera.GetPixels());
snapshots.Add(snap);
}
}
The code works so far. The first part of the Start method just finds and enables the camera. In the Update method I take a snapshot of the video every frame.
After I "stop" the Update method, I would like to convert the gathered Texture2D objects into a video.
Thanks in advance
Create a MediaEncoder:
using UnityEditor; // VideoBitrateMode
using UnityEditor.Media; // MediaEncoder
var vidAttr = new VideoTrackAttributes
{
bitRateMode = VideoBitrateMode.Medium,
frameRate = new MediaRational(25),
width = 320,
height = 240,
includeAlpha = false
};
var audAttr = new AudioTrackAttributes
{
sampleRate = new MediaRational(48000),
channelCount = 2
};
var enc = new MediaEncoder("sample.mp4", vidAttr, audAttr);
Convert each snapshot to a Texture2D if it isn't one already, then call AddFrame for each one to add it to the MediaEncoder:
enc.AddFrame(tex);
Once done, call Dispose to close the file:
enc.Dispose();
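Putting the steps together, a minimal end-to-end sketch might look like this. Note that MediaEncoder lives in UnityEditor.Media, so this only works in the Editor, and the class name, output file name, and frame rate below are illustrative choices, not requirements:

```csharp
using System.Collections.Generic;
using UnityEditor.Media; // MediaEncoder, VideoTrackAttributes
using UnityEngine;

public static class SnapshotEncoder
{
    // Encodes a list of same-sized Texture2D snapshots into an MP4 file.
    public static void Encode(List<Texture2D> snapshots, int width, int height)
    {
        var vidAttr = new VideoTrackAttributes
        {
            frameRate = new MediaRational(25), // assumed capture rate
            width = (uint)width,
            height = (uint)height,
            includeAlpha = false
        };
        // MediaEncoder implements IDisposable; "using" closes the file for us.
        using (var enc = new MediaEncoder("snapshots.mp4", vidAttr))
        {
            foreach (Texture2D tex in snapshots)
                enc.AddFrame(tex); // each texture must match the track's width/height
        }
    }
}
```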
I see two methods here: one is fast to implement, dirty, and not for all platforms; the second is harder but prettier. Both rely on FFmpeg.
1) Save every frame into an image file (snap.EncodeToPNG()) and then call FFmpeg to create a video from the images (FFmpeg create video from images) - slow due to the many disk operations.
2) Use FFmpeg via the wrapper implemented in AForge and supply its VideoFileWriter class with the images you have.
Image sequence to video stream?
The problem here is that it uses System.Drawing.Bitmap, so to convert a Texture2D to a Bitmap you can use: How to create bitmap from byte array?
So you end up with something like:
Bitmap bmp;
Texture2D snap;
using (var ms = new MemoryStream(snap.EncodeToPNG()))
{
bmp = new Bitmap(ms);
}
vFWriter.WriteVideoFrame(bmp);
Neither method is the fastest, though, so if performance is an issue you might want to operate on lower-level data such as DirectX or OpenGL textures.
In Unity3D, I am using the WWW class to download a texture from a link, then creating a Sprite with the obtained texture and displaying it on the screen. This works just fine in the Unity Editor, but when I run it in the browser as HTML5, instead of displaying my Sprite there is a red question mark. Why?
This is my C# code:
GameObject myImage;
Sprite neededSprite;
IEnumerator Start () {
WWW getMyImage= new WWW("http://previews.123rf.com/images/burakowski/burakowski1202/burakowski120200228/12221967-Grunge-Example-stamp-Stock-Vector-demo.jpg");
myImage = GameObject.Find("myImage");
yield return getMyImage;
Texture2D rawImage = getMyImage.texture;
neededSprite = Sprite.Create(rawImage, new Rect(0.0f, 0.0f, rawImage.width, rawImage.height), new Vector2(0.5f, 0.5f), 100.0f);
myImage.GetComponent<Image>().sprite = neededSprite;
}
After placing Header set Access-Control-Allow-Origin "*" in the .htaccess file, everything is working as expected.
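For reference, the server-side configuration that enables this looks roughly as follows. It goes in the .htaccess of the server hosting the image, not anywhere in Unity, and assumes Apache with mod_headers available:

```
# Allow cross-origin requests so the WebGL build can fetch the image.
<IfModule mod_headers.c>
    Header set Access-Control-Allow-Origin "*"
</IfModule>
```

Using `"*"` allows any origin; in production you would typically restrict this to the domain your WebGL build is served from.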