Unity Create directory accessible by players - C#

I am working on a game and want players to be able to place their own ships and islands into the game. I want players to be able to access folders containing the files the ships use for configuration. How would I make and access these files?

I think you need to explain your question in more detail to get a more exact answer, but I will answer as best I can.
Technically it is possible to let players modify the game's directory, but I don't suggest it. If this were music or sound, you could simply build a system for your game that plays songs one after another, like the menus in GTA and other games do; you can find examples of that with a little searching. But you are talking about game content that needs to be controlled. Is your game 3D or 2D? If it is 3D, would players have to work with a 3D modeling program? Also be aware that once Unity builds your game and players install it, the original file formats are gone: Unity packages everything into its own asset format. It is usually better to build an in-game editor system that lets players modify or choose what they want, as many games do.
Here is an example based on the official Unity documentation showing how players can add their own files to your game:
// Opens a file selection dialog for a PNG file and overwrites any
// selected texture with the contents. Note: EditorUtility is an
// editor-only API, so this works inside the Unity editor, not in builds.
using System.IO;
using UnityEngine;
using UnityEditor;

public class EditorUtilityOpenFilePanel
{
    [MenuItem("Examples/Overwrite Texture")]
    static void Apply()
    {
        Texture2D texture = Selection.activeObject as Texture2D;
        if (texture == null)
        {
            EditorUtility.DisplayDialog(
                "Select Texture",
                "You must select a texture first!",
                "Ok");
            return;
        }
        string path = EditorUtility.OpenFilePanel(
            "Overwrite with png", "", "png");
        if (path.Length != 0)
        {
            // Read the file from disk and load it into the selected texture.
            texture.LoadImage(File.ReadAllBytes(path));
        }
    }
}
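For a shipped build (outside the editor), a common approach is to read player-supplied files from a writable folder such as Application.persistentDataPath. Below is a minimal sketch with assumptions of my own: the "Mods" folder name, the JSON format, and the ShipConfig class are all illustrative, not part of the original answer.

```csharp
using System.IO;
using UnityEngine;

public class ModLoader : MonoBehaviour
{
    void Start()
    {
        // A writable per-user folder that survives game updates.
        string modFolder = Path.Combine(Application.persistentDataPath, "Mods");

        // Create the folder on first run so players know where to drop files.
        if (!Directory.Exists(modFolder))
            Directory.CreateDirectory(modFolder);

        // Load every ship configuration file the player has placed there.
        foreach (string file in Directory.GetFiles(modFolder, "*.json"))
        {
            string json = File.ReadAllText(file);
            // Hypothetical config type; replace with your own ship data class.
            ShipConfig config = JsonUtility.FromJson<ShipConfig>(json);
            Debug.Log("Loaded ship config: " + config.shipName);
        }
    }
}

[System.Serializable]
public class ShipConfig
{
    public string shipName;
    public float maxSpeed;
}
```

Tell players where the folder is (log Application.persistentDataPath on startup), since its location differs per platform.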

Related

Azure Kinect DK, Body tracking: Mapping separate avatars on separate skeletons

To recreate my issue, I've set up a project in Unity3D 2020, using this sample project.
Here I can successfully map an avatar to a single skeleton. In my project, however, I want to map multiple skeletons - not only the closest, as the sample provides.
This far I've successfully rendered multiple skeletons, but when I try to map separate avatars on each of them - the two avatars will follow the motion of only one of the skeletons in the scene.
The following code is used to setup the avatars:
var avatar = Instantiate(avatarPrefab, Vector3.zero, Quaternion.identity);
avatar.GetComponent<PuppetAvatar>().KinectDevice = this;
avatar.GetComponent<PuppetAvatar>().CharacterRootTransform = avatar.transform;
avatar.GetComponent<PuppetAvatar>().RootPosition = this.transform.GetChild(i).transform.GetChild(0).gameObject; // gets the pelvis of each "rootBody" prefab
The creator of the PuppetAvatar.cs script has yet to release any updates to support multibody tracking, but I posed a similar question in this thread.
Take a look at MoveBox: https://github.com/microsoft/MoveBox-for-Microsoft-Rocketbox
MoveBox can parse the SMPL body models extracted by an external tool for 3D multi-person human pose estimation from RGB videos, and it supports Azure Kinect DK.

Selecting a camera and outputting video when connecting multiple cameras in Unity

I am trying to show AR video in Unity by connecting a hand-tracking camera device (Leap Motion: Windows Device Manager list: 0) and a webcam camera (Windows Device Manager list: 1).
The Leap Motion connects using the manufacturer's SDK and package files, so no additional coding is needed to connect that camera in Unity.
The problem occurs when I connect the webcam camera (Windows Device Manager list: 1) in Unity and show AR video.
When the following code is applied to an object with both the Leap Motion and the webcam connected, the Leap Motion is recognized and output as video, and video output from the webcam becomes impossible.
If only the webcam is connected, after unplugging the Leap Motion from the PC, video output from the webcam works.
I want to select the webcam camera (Windows Device Manager list: 1) on the object and output its video while both the Leap Motion and the webcam are connected to the PC.
Since I am a beginner in Unity, I would appreciate a simple modification of the code below.
Waiting for help.
using UnityEngine;
using System.Collections;
public class WebCam : MonoBehaviour {
// Use this for initialization
void Start () {
WebCamTexture web = new WebCamTexture(1280,720,60);
GetComponent<MeshRenderer>().material.mainTexture = web;
web.Play();
}
// Update is called once per frame
void Update () {
}
}
There is a constructor of WebCamTexture that takes the parameter
deviceName: The name of the video input device to be used.
You can list all available devices via WebCamTexture.devices and get the name like e.g.
var devices = WebCamTexture.devices;
var webcamTexture = new WebCamTexture(devices[1].name);
You might also be able then to filter out the device you need like e.g.
using System.Linq;
...
var device = devices.Select(d => d.name).FirstOrDefault(n => !n.Contains("Leap"));
For finding out how the cameras are called and to be able to filter by name you could print them all like e.g.
Debug.Log(string.Join("\n", devices.Select(d => d.name)));
Theoretically you could even feed them into a dropdown and let the user decide which device to use before creating the WebCamTexture; then you wouldn't have to hard-code the name at all ;)
Also note:
Call Application.RequestUserAuthorization before creating a WebCamTexture.
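Putting these pieces together, a complete component might look like the sketch below. The "Leap" name filter and the 1280x720 resolution are assumptions, not tested against your hardware; print the device names first to confirm what the Leap Motion is actually called.

```csharp
using System.Collections;
using System.Linq;
using UnityEngine;

public class WebCamSelector : MonoBehaviour
{
    IEnumerator Start()
    {
        // Request camera permission before touching any device.
        yield return Application.RequestUserAuthorization(UserAuthorization.WebCam);
        if (!Application.HasUserAuthorization(UserAuthorization.WebCam))
            yield break;

        // Log all device names so the filter below can be verified.
        Debug.Log(string.Join("\n", WebCamTexture.devices.Select(d => d.name)));

        // Pick the first camera whose name does not contain "Leap".
        var device = WebCamTexture.devices
            .FirstOrDefault(d => !d.name.Contains("Leap"));

        var webcamTexture = new WebCamTexture(device.name, 1280, 720, 60);
        GetComponent<MeshRenderer>().material.mainTexture = webcamTexture;
        webcamTexture.Play();
    }
}
```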

Creating a material with texture in Autodesk Revit Forge Design Automation

I'm currently working on some Revit API code which is running in the Autodesk Forge Design Automation cloud solution. Basically, I'm trying to create a material and attach a texture to it via the following code:
private void AddTexturePath(AssetProperty asset, string texturePath) {
Asset connectedAsset = null;
if (asset.NumberOfConnectedProperties == 0)
asset.AddConnectedAsset("UnifiedBitmapSchema");
connectedAsset = (Asset) asset.GetConnectedProperty(0);
AssetPropertyString path = (AssetPropertyString) connectedAsset.FindByName(UnifiedBitmap.UnifiedbitmapBitmap);
if (!path.IsValidValue(texturePath)) {
File.Create("texture.png").Dispose(); // dispose so the file handle is released
texturePath = Path.GetFullPath("texture.png");
}
path.Value = texturePath;
}
This is actually working well, as the value for the texture path:
path.Value = texturePath;
needs to be a reference to an existing file. I do not have this file on the cloud instance of Forge, because the path to the texture is specified by the user when they send the request for the Workitem.
The problem is that this sets the texture path for the material as something like this:
T:\Aces\Jobs\<workitem_id>\texture.png
Which is basically the working folder for the Workitem instance. This path is useless, because a material with texture path like this needs to be manually re-linked in Revit.
The perfect outcome for me would be if I could somehow map the material texture path to some user-friendly directory like "C:\Textures\texture.png" and it seems that the Forge instance has a "C:\" drive present (being probably a Windows instance of some sorts), but my code runs on low privileges, so it cannot create any kind of directories/files outside the working directory.
Does somebody have any idea how this could be resolved? Any help would be greatly appreciated!
After a whole day of research I pretty much arrived at a satisfying solution. Just for clarity - I am going to refer to the Autodesk Forge Design Automation API for Revit simply as "Forge".
Basically the code provided above is correct. I did not find any possible way to create a file on Forge instance, in a directory different than the Workitem working directory which is:
T:\Aces\Jobs\<workitem_id>\texture.png
Interestingly, there is a C:\ drive on the Forge instance, which contains Windows, Revit and .NET Framework installations (as Forge instance is basically some sort of Windows instance with Revit installed). It is possible to enumerate a lot of these directories, but none of the ones I've tried (and I've tried a lot - mostly the most obvious, public access Windows directories like C:\Users\Public, C:\Program Files, etc.) allow for creation of directories or files. This corresponds to what is stated in "Restrictions" area of the Forge documentation:
Your application is run with low privileges, and will not be able to freely interact with the Windows OS :
Write access is typically restricted to the job’s working folder.
Registry access is mostly restricted, writing to the registry should be avoided.
Any sub-process will also be executed with low privileges.
So after trying to save the "dummy" texture file somewhere on the Forge C:\ drive, I've found another solution - the texture path for your texture actually does not matter.
This is because Revit offers an alternative for re-linking your textures. If you fire up Revit, you can go to File -> Options -> Rendering, and under the "Additional render appearance paths" field, specify the directories on your local machine that Revit should use to look for missing textures. With these, you can do the following operations in order to have full control over creating materials on Forge:
Send Workitem to Forge, create the materials.
Create a dummy texture in working directory, with the correct file name.
Attach the dummy texture file to the material.
Output the resulting file (.rvt or .rfa, depending on what you're creating on Forge).
Place all textures into one folder (or multiple, this doesn't matter that much).
Add the directories with the textures to the Additional render appearance paths.
Revit will successfully re-link all the textures to new paths.
I hope someone will find this useful!
Additionally, as per Jeremy's request, I post a code sample for creating a material with a texture and modifying various Appearance properties in Revit by using the Revit API (in C#):
private void SetAppearanceParameters(Document project, Material mat, MaterialData data) {
using(Transaction setParameters = new Transaction(project, "Set material parameters")) {
setParameters.Start();
AppearanceAssetElement genericAsset = new FilteredElementCollector(project)
.OfClass(typeof(AppearanceAssetElement))
.ToElements()
.Cast<AppearanceAssetElement>()
.Where(i => i.Name.Contains("Generic"))
.FirstOrDefault();
AppearanceAssetElement newAsset = genericAsset.Duplicate(data.Name);
mat.AppearanceAssetId = newAsset.Id;
using(AppearanceAssetEditScope editAsset = new AppearanceAssetEditScope(project)) {
Asset editableAsset = editAsset.Start(newAsset.Id);
AssetProperty assetProperty = editableAsset["generic_diffuse"];
SetColor(editableAsset, data.MaterialAppearance.Color);
SetGlossiness(editableAsset, data.MaterialAppearance.Gloss);
SetReflectivity(editableAsset, data.MaterialAppearance.Reflectivity);
SetTransparency(editableAsset, data.MaterialAppearance.Transparency);
if (data.MaterialAppearance.Texture != null && data.MaterialAppearance.Texture.Length != 0)
AddTexturePath(assetProperty, $@"C:\{data.MaterialIdentity.Manufacturer}\textures\{data.MaterialAppearance.Texture}");
editAsset.Commit(true);
}
setParameters.Commit();
}
}
private void SetTransparency(Asset editableAsset, int transparency) {
AssetPropertyDouble genericTransparency = editableAsset["generic_transparency"] as AssetPropertyDouble;
genericTransparency.Value = Convert.ToDouble(transparency);
}
private void SetReflectivity(Asset editableAsset, int reflectivity) {
AssetPropertyDouble genericReflectivityZero = (AssetPropertyDouble) editableAsset["generic_reflectivity_at_0deg"];
genericReflectivityZero.Value = Convert.ToDouble(reflectivity) / 100;
AssetPropertyDouble genericReflectivityAngle = (AssetPropertyDouble) editableAsset["generic_reflectivity_at_90deg"];
genericReflectivityAngle.Value = Convert.ToDouble(reflectivity) / 100;
}
private void SetGlossiness(Asset editableAsset, int gloss) {
AssetPropertyDouble glossProperty = (AssetPropertyDouble) editableAsset["generic_glossiness"];
glossProperty.Value = Convert.ToDouble(gloss) / 100;
}
private void SetColor(Asset editableAsset, int[] color) {
AssetPropertyDoubleArray4d genericDiffuseColor = (AssetPropertyDoubleArray4d) editableAsset["generic_diffuse"];
Color newColor = new Color((byte) color[0], (byte) color[1], (byte) color[2]);
genericDiffuseColor.SetValueAsColor(newColor);
}
private void AddTexturePath(AssetProperty asset, string texturePath) {
if (asset.NumberOfConnectedProperties == 0)
asset.AddConnectedAsset("UnifiedBitmapSchema");
Asset connectedAsset = (Asset) asset.GetConnectedProperty(0);
AssetPropertyString path = (AssetPropertyString) connectedAsset.FindByName(UnifiedBitmap.UnifiedbitmapBitmap);
// Create a dummy texture file in the working directory and dispose
// the returned stream so the file handle is not left open.
string fileName = Path.GetFileName(texturePath);
File.Create(fileName).Dispose();
texturePath = Path.GetFullPath(fileName);
path.Value = texturePath;
}

Unity3D. Accessing camera torch/flash issues - Torch works but can't access while streaming webcamtexture

I am developing an app in unity in which the user can take photos using their device camera. This is working great using Unity's webcamtexture. However, there is no flash support for webcamtexture, so I have written my own code to access the device Torch. The code WORKS - However it doesn't work while streaming the webcamtexture (the camera is in use so the java service call returns an error). Does anyone have any suggestions for how to get around this issue? Is there any way to use Unity's WebCamTexture to activate the camera torch? Here is my code for activating the camera torch:
AndroidJavaClass cameraClass = new AndroidJavaClass("android.hardware.Camera");
// This is an ugly hack to make Unity
// generate camera permissions
WebCamDevice[] devices = WebCamTexture.devices;
int camID = 0;
camera = cameraClass.CallStatic<AndroidJavaObject>("open", camID);
// I'm pretty sure camera will never be null at this point
// It will either be a valid object or Camera.open would throw an exception
if (camera != null)
{
AndroidJavaObject cameraParameters = camera.Call<AndroidJavaObject>("getParameters");
cameraParameters.Call("setFlashMode","torch");
camera.Call("setParameters",cameraParameters);
Active = true;
}
Try checking out Camera Capture Kit for Unity. It provides the functionality you want on Android, along with the source code for it.
https://www.assetstore.unity3d.com/en/#!/content/56673

C# XNA Loading in textures

I am having a lot of issues loading textures into my simple game. First off, I am able to load a texture when I'm inside "Game1.cs". However, I am currently trying to create a level, so I want to load all the pictures in the Level class.
public Level(IServiceProvider _serviceProvider)
{
content = new ContentManager(_serviceProvider, "Content");
mNrOfTextures = 3;
mTextures = new Texture2D[mNrOfTextures];
mTextures[0] = Content.Load<Texture2D>("sky");
//And then more textures and other stuff..
}
But the program can never find the file "sky". I don't really get any useful error messages, and I'm moving away from tutorials currently. Can anyone point me in the right direction?
Full path to file: C:\c++\ProjIV\ProjIV\ProjIVContent\
I personally just pass my ContentManager to my level class, instead of passing the service provider as others do.
In this case, you need to use your local content instance, not the static Content
mTextures[0] = content.Load<Texture2D>("sky");
EDIT: I see this did not work, can you attach a picture of your solution layout with the content?
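To illustrate the suggestion, here is a minimal sketch of a Level that receives the ContentManager directly from Game1. The names and the texture count are illustrative, not taken from the original project:

```csharp
using Microsoft.Xna.Framework.Content;
using Microsoft.Xna.Framework.Graphics;

public class Level
{
    private ContentManager content;
    private Texture2D[] mTextures;

    public Level(ContentManager contentManager)
    {
        // Keep a reference to the game's ContentManager instead of
        // constructing a new one from a service provider.
        content = contentManager;
        mTextures = new Texture2D[3];
        // Use the local instance that was passed in, not a static Content.
        mTextures[0] = content.Load<Texture2D>("sky");
    }
}

// In Game1.LoadContent():
// mLevel = new Level(Content);
```

Passing Content this way guarantees both classes share the same root directory ("Content"), so the asset name "sky" resolves against the same folder the rest of the game uses.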
