Searching through all subfolders when using Resources.Load in Unity - c#

Is it possible to have Resources.Load(name, type) search for a fitting asset not just in the base Resources folder or a specified subfolder, but in the full subfolder structure under Resources?
Example folder structure:
Resources
- Subfolder
- image.png
I would like something like Resources.Load("image", typeof(Texture2D)) to return the image without the user having to specify "Subfolder/image".
I know it's ugly, but it's supposed to be a "drop it in your bashed together project without worrying about your folder structure"-type utility script and I won't know the subfolder.

The accepted answer only works in the editor context and stops working after you build the game. The Resources folder is packed during the build, so it is no longer a file hierarchy you can loop through with Directory.GetDirectories. The only way to get this to work is to save all file paths while still in the editor context and use that saved hierarchy to load the assets at runtime. In my project I used the following code to add a button to the component that uses the assets in the Resources folder, named CharacterGen. When this button is clicked, all png files in all subfolders of the Resources folder are saved to the public field named filePaths on CharacterGen.
using System;
using System.Collections.Generic;
using System.IO;
using UnityEngine;
using UnityEditor;

[CustomEditor(typeof(CharacterGen))]
public class RefreshCharacterList : Editor
{
    CharacterGen charGen;

    public void OnEnable()
    {
        charGen = (CharacterGen)target;
    }

    public override void OnInspectorGUI()
    {
        base.OnInspectorGUI();
        if (GUILayout.Button("Load resource paths"))
        {
            List<string> paths = new List<string>();
            LoadPathsRecursive("", ref paths);
            charGen.filePaths = paths;
            EditorUtility.SetDirty(charGen); //original post didn't have this line, but it is needed to make sure your changes are saved
        }
    }

    void LoadPathsRecursive(string path, ref List<string> paths)
    {
        var fullPath = Application.dataPath + "/Resources/" + path;
        Debug.Log("fullPath: " + fullPath);
        DirectoryInfo dirInfo = new DirectoryInfo(fullPath);
        foreach (var file in dirInfo.GetFiles())
        {
            //take .png files (any casing) and skip the accompanying .meta files
            if (file.Name.EndsWith(".png", StringComparison.OrdinalIgnoreCase) && !file.Name.EndsWith(".meta"))
            {
                //store the path in the form Resources.Load expects: relative to Resources, no extension, no leading slash
                paths.Add((path.Length == 0 ? "" : path + "/") + Path.GetFileNameWithoutExtension(file.Name));
            }
        }
        foreach (var dir in dirInfo.GetDirectories())
        {
            LoadPathsRecursive(path.Length == 0 ? dir.Name : path + "/" + dir.Name, ref paths);
        }
    }
}
In CharacterGen I later invoke Resources.Load, using the paths that were saved when clicking the button.
foreach (var filePath in filePaths)
{
    var loadedSprite = Resources.Load<Sprite>(filePath);
    //Do something with loadedSprite
}
Edit: I failed to mention an important detail in my original post. filePaths is a MonoBehaviour field, and I marked it with [SerializeField] (alternatively it could be marked public) so that Unity actually serializes the values of this field and includes them in the build.
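For reference, a minimal sketch of how that field might be declared on the component (the rest of CharacterGen is omitted; only the class and field names come from the post above):
using System.Collections.Generic;
using UnityEngine;

public class CharacterGen : MonoBehaviour
{
    //Serialized so the editor-time list of resource paths survives into the build.
    //([SerializeField] on a private field would work just as well.)
    public List<string> filePaths = new List<string>();
}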

There is no way to change the Resources.Load() static method functionality; it's Unity internal. However, you can write your own custom class that provides the desired functionality. The code needs to find all the directories inside the Resources folder and search each of them for the file. Let's call the class ResourcesExtension.
using System.IO;
using UnityEngine;

public class ResourcesExtension
{
    public static string ResourcesPath = Application.dataPath + "/Resources";

    public static UnityEngine.Object Load(string resourceName, System.Type systemTypeInstance)
    {
        //try the Resources root first, then every subdirectory
        UnityEngine.Object result = Resources.Load(resourceName, systemTypeInstance);
        if (result != null)
            return result;

        string[] directories = Directory.GetDirectories(ResourcesPath, "*", SearchOption.AllDirectories);
        foreach (var item in directories)
        {
            //Resources.Load expects forward slashes and paths relative to the Resources folder
            string itemPath = item.Substring(ResourcesPath.Length + 1).Replace('\\', '/');
            result = Resources.Load(itemPath + "/" + resourceName, systemTypeInstance);
            if (result != null)
                return result;
        }
        return null;
    }
}
Then all you need to do is call the static method.
ResourcesExtension.Load("image", typeof(Texture2D))

If you want to find all objects of a type and return them as a List, here is Emad's code, edited a bit.
private List<T> FindAllObject<T>()
{
    List<T> tmp = new List<T>();
    string resourcesPath = Application.dataPath + "/Resources";
    string[] directories = Directory.GetDirectories(resourcesPath, "*", SearchOption.AllDirectories);
    foreach (string item in directories)
    {
        //Resources.LoadAll expects forward slashes and paths relative to the Resources folder
        string itemPath = item.Substring(resourcesPath.Length + 1).Replace('\\', '/');
        T[] result = Resources.LoadAll(itemPath, typeof(T)).Cast<T>().ToArray();
        foreach (T x in result)
        {
            if (!tmp.Contains(x))
            {
                tmp.Add(x);
            }
        }
    }
    return tmp;
}
To use it.
List<MyClass> myClasses = FindAllObject<MyClass>();

Related

Get file names from Resources sub folder

In my Resources folder I have a subfolder for images, and I would like to get all the file names of those images from within that folder.
I tried several Resources.LoadAll variants and then reading .name afterwards, but without success.
What is the right practice to achieve what I'm trying to do here?
There is no built-in API to do this, because the information is not available after you build. You can't even do this with what's in the accepted answer; that would only work in the Editor. When you build the project, that code will fail.
Here's what to do:
1. Detect when the build button is clicked or when a build is about to happen in the OnPreprocessBuild function.
2. Get all the file names with Directory.GetFiles, serialize them to JSON and save the result to the Resources folder. JSON just makes it easier to read individual file names back; you don't have to use it. You must exclude the ".meta" extension.
Steps 1 and 2 are done in the Editor.
3. After a build or during run-time, you can access the saved file that contains the file names as a TextAsset with Resources.Load<TextAsset>("FileNames"), then de-serialize the JSON from TextAsset.text.
Below is a very simplified example. There is no error handling; that's up to you to implement. The Editor script below saves the file names when you click the Build button:
using System;
using System.IO;
using System.Linq;
using UnityEngine;
using UnityEditor;
using UnityEditor.Build;

//Keep FileNameInfo in a runtime (non-Editor) script so it is also available in the build
[Serializable]
public class FileNameInfo
{
    public string[] fileNames;

    public FileNameInfo(string[] fileNames)
    {
        this.fileNames = fileNames;
    }
}

class PreBuildFileNamesSaver : IPreprocessBuildWithReport
{
    public int callbackOrder { get { return 0; } }

    public void OnPreprocessBuild(UnityEditor.Build.Reporting.BuildReport report)
    {
        //The Resources folder path
        string resourcesPath = Application.dataPath + "/Resources";

        //Get file names except the ".meta" extension
        string[] fileNames = Directory.GetFiles(resourcesPath)
            .Where(x => Path.GetExtension(x) != ".meta").ToArray();

        //Convert the names to JSON to make them easier to access when reading back
        FileNameInfo fileInfo = new FileNameInfo(fileNames);
        string fileInfoJson = JsonUtility.ToJson(fileInfo);

        //Save the JSON to the Resources folder as "FileNames.txt"
        File.WriteAllText(Application.dataPath + "/Resources/FileNames.txt", fileInfoJson);

        AssetDatabase.Refresh();
    }
}
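Note that Directory.GetFiles returns full file-system paths. If you want to feed the saved names back into Resources.Load, trim them to paths relative to the Resources folder and drop the extension first; a rough sketch (this helper is my own addition, not part of the original answer):
//Hypothetical helper: turns "C:/Project/Assets/Resources/Images/cat.png"
//into "Images/cat", the form Resources.Load expects.
string ToResourceName(string fullPath)
{
    string resourcesPath = Application.dataPath + "/Resources/";
    string relative = fullPath.Replace('\\', '/').Substring(resourcesPath.Length);
    return Path.ChangeExtension(relative, null); //strips ".png", ".txt", ...
}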
During run-time, you can retrieve the saved file names with the example below:
//Load as TextAsset
TextAsset fileNamesAsset = Resources.Load<TextAsset>("FileNames");

//De-serialize it
FileNameInfo fileInfoLoaded = JsonUtility.FromJson<FileNameInfo>(fileNamesAsset.text);

//Use the data
foreach (string fName in fileInfoLoaded.fileNames)
{
    Debug.Log(fName);
}
Hmm... why not try this.
using System.IO;

const string path = ""; //file path
private void GetFiles()
{
    string[] files = Directory.GetFiles(path, "*.*");
    foreach (string sourceFile in files)
    {
        string fileName = Path.GetFileName(sourceFile);
        Debug.Log(fileName);
    }
}

C# Reading of text file stopped working - with Unity

I have a program that reads in any number of .txt files in a directory and assembles them into a list; this list is then displayed to the user one item at a time. The user makes a decision at this point and the reaction time is stored. Recently the text import script stopped working for no apparent reason. I haven't changed any of the text import code in months.
I've checked the code that reads in the files and nothing was being recognised at all.
This is part of a Unity program, just FYI.
TextImport
using UnityEngine;
using System.Collections;
using System.IO;
using System.Collections.Generic;

public class TextImport : MonoBehaviour
{
    public static List<string> TextLines;
    bool FileRead = false;

    // Use this for initialization
    void Awake()
    {
        TextLines = new List<string>();
        string path;
        if (Application.platform == RuntimePlatform.WindowsEditor)
        {
            path = Application.dataPath + "/Resources/";
        }
        else if (Application.platform == RuntimePlatform.WindowsPlayer)
        {
            path = Application.dataPath;
        }
        else if (Application.platform == RuntimePlatform.Android)
        {
            FileRead = true;
            path = "/StroopTest/";
            TextAsset Allfiles = Resources.Load("Android/AllText") as TextAsset;
            string[] AllLinesAndroid = Allfiles.text.Split("\n"[0]);
            foreach (string Line in AllLinesAndroid)
            {
                TextLines.Add(Line);
                break;
            }
        }
        else
        {
            path = Application.dataPath;
        }

        if (!FileRead)
        {
            DirectoryInfo info = new DirectoryInfo(path);
            FileInfo[] fileInfo = info.GetFiles("*.txt"); //I've changed the "*.txt" to nothing to read all files instead of just .txt files
            foreach (FileInfo file in fileInfo)
            {
                //I created another List here to store the file names that were read, however none were.
                string[] AllLines = File.ReadAllLines(file.FullName);
                foreach (string line in AllLines)
                {
                    TextLines.Add(line);
                }
            }
        }
    }
}
I suggest using Application.persistentDataPath.
Application.dataPath is a read-only directory where (if packaged to do so) you can find game-related assets.
Application.persistentDataPath, however, points to a writeable directory (different for each platform) where you can add things at runtime (e.g. writing a file, downloading a file, etc.).
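For example, a minimal sketch (the file name here is just an illustration) of writing and then reading a file under persistentDataPath at runtime:
//needs: using System.IO; using UnityEngine;
string savePath = Path.Combine(Application.persistentDataPath, "results.txt");
File.WriteAllText(savePath, "reaction time: 512 ms");
Debug.Log(File.ReadAllText(savePath)); //prints the line that was just written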
I hope that helps.

C# Copy file or folder

I am trying to write a program to keep multiple folders in sync. To do this, I need to copy and delete files and subfolders.
To me, it doesn't make a difference whether an object is a file or a folder: I want to create all necessary parent folders and copy the object, overwriting if necessary. I'm currently using a jagged array of FileSystemInfo to hold my files/folders.
This has the advantage of avoiding a duplication of code to sync files and folders separately.
However, I can't figure out how to Copy a FileSystemInfo. I'm looking for a way to be able to copy/delete/read creation or modified time that will work on both files and folders.
FileSystemInfo doesn't have a Copy method, but it is the base class for DirectoryInfo and FileInfo.
So when you loop over your FileSystemInfo objects you have to cast to the proper concrete class and use its specific copy/delete methods.
foreach (var fsi in fileSystemInfoObjects)
{
    if (fsi is DirectoryInfo)
    {
        var directory = (DirectoryInfo)fsi;
        //do something
    }
    else if (fsi is FileInfo)
    {
        var file = (FileInfo)fsi;
        //do something
    }
}
I used sam's answer to help me solve my problem. What I did was put the copying logic in my custom class so that I don't need to duplicate the logic whenever I use it in my code.
public class myFSInfo
{
    public FileSystemInfo Dir;
    public string RelativePath;
    public string BaseDirectory;

    public myFSInfo(FileSystemInfo dir, string basedir)
    {
        Dir = dir;
        BaseDirectory = basedir;
        //skip the base directory and, if it doesn't already end with one, the separator after it
        RelativePath = Dir.FullName.Substring(basedir.Length + (basedir.Last() == '\\' ? 0 : 1));
    }

    private myFSInfo() { }

    /// <summary>
    /// Copies a FileInfo or DirectoryInfo object to the specified path, creating folders and overwriting if necessary.
    /// </summary>
    /// <param name="path"></param>
    public void CopyTo(string path)
    {
        if (Dir is FileInfo)
        {
            var f = (FileInfo)Dir;
            Directory.CreateDirectory(path.Substring(0, path.LastIndexOf("\\")));
            f.CopyTo(path, true);
        }
        else if (Dir is DirectoryInfo) Directory.CreateDirectory(path);
    }
}

C# How to loop through the source folder or subfolders that match the destination

I need to overwrite files from a source to a destination directory.
The structure of each folder is different, so I'm trying to do it in a generic way.
The thing is, each folder (source and destination) could have numerous subdirectories or none at all.
The code I currently have is this:
//copy and overwrite the files depending on whatever is in the destination
//search through the destination to find the file
foreach (var dstfile in Directory.GetFiles(targetDir))
{
    //search through the source to find the matching file
    foreach (var srcfile in Directory.GetFiles(sourceDir))
    {
        //cut off the source file from the source path
        strSrcFile = srcfile.Split(Path.DirectorySeparatorChar).Last();
        strDstFile = dstfile.Split(Path.DirectorySeparatorChar).Last();
        //if the destination and source files match up, replace the destination with the source
        if (strSrcFile == strDstFile)
        {
            File.Copy(srcfile, Path.Combine(targetDir, Path.GetFileName(strSrcFile)), true);
        }
    }
}
//look through the subfolders to see if any files match up
foreach (var srcFolder in Directory.GetDirectories(sourceDir))
{
    //search through the source for the files
    foreach (var srcFile in Directory.GetFiles(srcFolder))
    {
        //search through the destination for the files
        foreach (var dstFile in Directory.GetFiles(targetDir))
        {
As you can see there are a lot of foreach loops; is there a way to streamline this?
Make a hash (dictionary) of the destination directory, then walk the source directory and see if the files already exist.
Dictionary<string, string> lut1 = new Dictionary<string, string>();
foreach (var dstfile in Directory.GetFiles(targetDir))
{
    string strDstFile = dstfile.Split(Path.DirectorySeparatorChar).Last();
    lut1[strDstFile] = dstfile;
}
foreach (var srcfile in Directory.GetFiles(sourceDir))
{
    string strSrcFile = srcfile.Split(Path.DirectorySeparatorChar).Last();
    string dstfile;
    if (lut1.TryGetValue(strSrcFile, out dstfile))
    {
        File.Copy(srcfile, dstfile, true);
    }
}
I haven't tested this, but it should work (not 100% efficient). It should give you some pointers at least.
public void UpdateFiles(string directory, string otherDir)
{
    var dirFiles = Directory.EnumerateFiles(directory, "*",
        SearchOption.AllDirectories);
    var otherDirFiles = Directory.EnumerateFiles(otherDir, "*",
        SearchOption.AllDirectories);

    foreach (var file in dirFiles)
    {
        string fi = Path.GetFileName(file);
        var newFile = otherDirFiles.Where(x => fi == Path.GetFileName(x));
        foreach (var foundFile in newFile)
            File.Copy(file, foundFile, true);
    }
}
I just did it this way in a console app and tested it to work for the main target folder and subfolders, although it's probably not the most efficient.
Call this:
OperateOnSourceFiles(sourceDir, targetDir);
Which will check the current files in the source, and then recursively look through all source subdirectories.
private static void OperateOnSourceFiles(string source, string targetDir)
{
    //Processes current source folder files
    foreach (var file in Directory.GetFiles(source))
    {
        OverWrite(targetDir, file);
    }

    //Recursively processes files in source subfolders
    List<string> subfolders = Directory.GetDirectories(source).ToList();
    foreach (var subfolder in subfolders)
    {
        OperateOnSourceFiles(subfolder, targetDir);
    }
}
Then your overwrite function could look something like this:
private static void OverWrite(string target, string sourcefile)
{
    //Grab file name
    var strSrcFile = sourcefile.Split(Path.DirectorySeparatorChar).Last();

    //Search current target directory FILES, and copy only if same file name
    List<string> targetfiles = Directory.GetFiles(target).Select(file => file.Split(Path.DirectorySeparatorChar).Last()).ToList();
    if (targetfiles.Contains(strSrcFile))
    {
        File.Copy(sourcefile, Path.Combine(target, Path.GetFileName(strSrcFile)), true);
    }

    //Recursively search current target directory SUBFOLDERS if any
    List<string> subfolders = Directory.GetDirectories(target).ToList();
    foreach (var subfolder in subfolders)
    {
        OverWrite(subfolder, sourcefile);
    }
}
Feel free to correct me :)
Note: I realize it's still quite a lot of foreach loops, but at least they aren't nested, which makes life easier when debugging.
I liked the idea, so I tried this myself. It turned out a bit more complicated than I thought it would. Let's go into a deep dive, okay?
The basic idea is to synchronize directories, so we want references to DirectoryInfo instances.
var source = new DirectoryInfo(@"C:\SynchSource");
var target = new DirectoryInfo(@"C:\SynchTarget");
Synchronize(source, target);
Synchronize pretty much works in the following manner:
make sure all files are identical
make sure all directories are identical
go through all subdirectories and traverse
My implementation looks like this:
void Synchronize(DirectoryInfo sourceDir, DirectoryInfo targetDir)
{
    SynchronizeFiles(sourceDir, targetDir);
    SynchronizeDirectories(sourceDir, targetDir);
    TraverseDirectories(sourceDir, targetDir);
}
Note the .Single() call in TraverseDirectories further down: we can never assume there is just one person/process working in the directories, so it can throw if the folder structure changes mid-sync.
SynchronizeFiles does two things, one per helper below:
Copy/overwrite all files in the current directory into the target directory (MoveFiles).
Remove redundant files that no longer exist in the source directory (RemoveRedundantFiles).
void MoveFiles(DirectoryInfo sourceDir, DirectoryInfo targetDir)
{
    foreach (FileInfo sourceFile in sourceDir.GetFiles())
    {
        string targetFilePath = Path.Combine(targetDir.FullName, sourceFile.Name);
        File.Copy(sourceFile.FullName, targetFilePath, true); //overwrite if the target already exists
    }
}

void RemoveRedundantFiles(DirectoryInfo sourceDir, DirectoryInfo targetDir)
{
    foreach (var targetFile in targetDir.GetFiles())
    {
        var sourceFilePath = Path.Combine(sourceDir.FullName, targetFile.Name);
        if (!File.Exists(sourceFilePath))
        {
            targetFile.Delete();
        }
    }
}
We can now assume all files in the current directory are the same, no more and no less. In order to traverse through the subdirectories, we first have to make sure the directory structure is the same. We do it in a similar manner to SynchronizeFiles:
Create missing directories in the target directory (CreateMissingDirectories)
Remove redundant directories that no longer exist in the source directory (RemoveRedundantDirectories)
void CreateMissingDirectories(DirectoryInfo sourceDir, DirectoryInfo targetDir)
{
    foreach (DirectoryInfo sourceSubDir in sourceDir.GetDirectories())
    {
        string targetSubDirPath = Path.Combine(targetDir.FullName, sourceSubDir.Name);
        if (!Directory.Exists(targetSubDirPath))
        {
            Directory.CreateDirectory(targetSubDirPath);
        }
    }
}

void RemoveRedundantDirectories(DirectoryInfo sourceDir, DirectoryInfo targetDir)
{
    foreach (DirectoryInfo targetSubDir in targetDir.GetDirectories())
    {
        string sourceSubDirPath = Path.Combine(sourceDir.FullName, targetSubDir.Name);
        if (!Directory.Exists(sourceSubDirPath))
        {
            targetSubDir.Delete(true);
        }
    }
}
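For completeness, SynchronizeFiles and SynchronizeDirectories (called from Synchronize above but not shown explicitly) just chain these helpers; a minimal sketch:
void SynchronizeFiles(DirectoryInfo sourceDir, DirectoryInfo targetDir)
{
    MoveFiles(sourceDir, targetDir);             //copy/overwrite everything from source
    RemoveRedundantFiles(sourceDir, targetDir);  //drop what no longer exists in source
}

void SynchronizeDirectories(DirectoryInfo sourceDir, DirectoryInfo targetDir)
{
    CreateMissingDirectories(sourceDir, targetDir);
    RemoveRedundantDirectories(sourceDir, targetDir);
}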
We are now at the point where the files and directories at the current level of the hierarchy are equal. We can now iterate through all subdirectories and call Synchronize:
void TraverseDirectories(DirectoryInfo sourceDir, DirectoryInfo targetDir)
{
    foreach (DirectoryInfo sourceSubDir in sourceDir.GetDirectories())
    {
        DirectoryInfo targetSubDir = targetDir.GetDirectories(sourceSubDir.Name).Single();
        Synchronize(sourceSubDir, targetSubDir);
    }
}
And we are done.
For huge directory hierarchies, a large number of files, very large files, or even concurrent processes working in the directory, there is much room for improvement. There is a lot of work to do for it to be fast: you may want to cache GetFiles / GetDirectories results, and skip unnecessary File.Copy calls (compare a file hash before assuming a copy is needed).
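A minimal sketch of the file-hash idea, assuming it is acceptable to read both files fully in order to hash them (comparing size and last-write time first would be cheaper still):
//Returns true when both files already have identical content, so the copy can be skipped.
static bool SameContent(string pathA, string pathB)
{
    using (var md5 = System.Security.Cryptography.MD5.Create())
    using (var a = File.OpenRead(pathA))
    using (var b = File.OpenRead(pathB))
    {
        byte[] hashA = md5.ComputeHash(a);
        byte[] hashB = md5.ComputeHash(b);
        return hashA.SequenceEqual(hashB); //needs using System.Linq;
    }
}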
Just as a side note: Other than synchronizing files every now and then, depending on the requirement, you may want to have a look at FileSystemWatcher, which can detect all changes recursively in a selected directory.
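A minimal sketch of setting one up, reusing the example source path from above:
//needs: using System; using System.IO;
var watcher = new FileSystemWatcher(@"C:\SynchSource")
{
    IncludeSubdirectories = true,  //watch the whole tree, not just the top level
    EnableRaisingEvents = true
};
watcher.Created += (s, e) => Console.WriteLine("Created: " + e.FullPath);
watcher.Changed += (s, e) => Console.WriteLine("Changed: " + e.FullPath);
watcher.Deleted += (s, e) => Console.WriteLine("Deleted: " + e.FullPath);
watcher.Renamed += (s, e) => Console.WriteLine("Renamed: " + e.OldFullPath + " -> " + e.FullPath);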

How do I compare one collection of files to another in c#?

I am just learning C# (have been fiddling with it for about 2 days now) and I've decided that, for learning purposes, I will rebuild an old app I made in VB6 for syncing files (generally across a network).
When I wrote the code in VB 6, it worked approximately like this:
Create a Scripting.FileSystemObject
Create directory objects for the source and destination
Create file listing objects for the source and destination
Iterate through the source object, and check to see if it exists in the destination
if not, create it
if so, check to see if the source version is newer/larger, and if so, overwrite the other
So far, this is what I have:
private bool syncFiles(string sourcePath, string destPath) {
    DirectoryInfo source = new DirectoryInfo(sourcePath);
    DirectoryInfo dest = new DirectoryInfo(destPath);
    if (!source.Exists) {
        LogLine("Source Folder Not Found!");
        return false;
    }
    if (!dest.Exists) {
        LogLine("Destination Folder Not Found!");
        return false;
    }
    FileInfo[] sourceFiles = source.GetFiles();
    FileInfo[] destFiles = dest.GetFiles();
    foreach (FileInfo file in sourceFiles) {
        // check exists on file
    }
    if (optRecursive.Checked) {
        foreach (DirectoryInfo subDir in source.GetDirectories()) {
            // create-if-not-exists destination subdirectory
            syncFiles(sourcePath + subDir.Name, destPath + subDir.Name);
        }
    }
    return true;
}
I have read examples that seem to advocate using the FileInfo or DirectoryInfo objects to do checks with the "Exists" property, but I am specifically looking for a way to search an existing collection/list of files rather than doing live checks against the file system for each file, since I will be doing this across the network, and constantly going back to a multi-thousand-file directory is slow slow slow.
Thanks in Advance.
The GetFiles() method will only get you files that do exist; it doesn't make up random files that don't exist. So all you have to do is check whether each one exists in the other list.
Something along the lines of this could work:
var sourceFiles = source.GetFiles();
var destFiles = dest.GetFiles();

foreach (var file in sourceFiles)
{
    if (!destFiles.Any(x => x.Name == file.Name))
    {
        // Do whatever
    }
}
Note: You have of course no guarantee that something hasn't changed after you have done the calls to GetFiles(). For example, a file could have been deleted or renamed if you try to copy it later.
Could perhaps be done nicer somehow by using the Except method or something similar. For example something like this:
var sourceFiles = source.GetFiles();
var destFiles = dest.GetFiles();
var sourceFilesMissingInDestination = sourceFiles.Except(destFiles, new FileNameComparer());

foreach (var file in sourceFilesMissingInDestination)
{
    // Do whatever
}
Where the FileNameComparer is implemented like so:
public class FileNameComparer : IEqualityComparer<FileInfo>
{
    public bool Equals(FileInfo x, FileInfo y)
    {
        return Equals(x.Name, y.Name);
    }

    public int GetHashCode(FileInfo obj)
    {
        return obj.Name.GetHashCode();
    }
}
Untested though :p
One little detail: instead of
sourcePath + subDir.Name
I would use
System.IO.Path.Combine(sourcePath, subDir.Name)
Path does reliable, OS-independent operations on file and folder names.
Also I notice optRecursive.Checked popping out of nowhere. As a matter of good design, make that a parameter:
bool syncFiles(string sourcePath, string destPath, bool checkRecursive)
And since you mention it may be used for large numbers of files, keep an eye out for .NET 4; it has an IEnumerable replacement for GetFiles() that will let you process this in a streaming fashion.
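That replacement is EnumerateFiles; a minimal sketch against the source DirectoryInfo from the code above:
// EnumerateFiles (new in .NET 4) yields files as they are found instead of
// buffering them all into an array first, which helps with multi-thousand-file folders.
foreach (FileInfo file in source.EnumerateFiles())
{
    // check against the destination list as above
}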
