So I've been building a reusable class that handles recursion and copies one directory to another. The folder structure is:
Root (D:\Arrigotti)
Source (C:\inetpub\wwwroot)
Destination (D:\Arrigotti\Backup)
Archive (D:\Arrigotti\Archive)
Those are the critical areas. For the example code I'm going to leave out some validation / error handling for simplicity's sake.
public static class FileSystem
{
    public static void CopyDirectory(string source, string destination)
    {
        DirectoryInfo directory = new DirectoryInfo(source);
        DirectoryInfo[] directories = directory.GetDirectories();
        // Note: files sitting directly in source itself are never copied by
        // this loop, only the contents of its subdirectories.
        foreach (DirectoryInfo dir in directories)
        {
            Console.WriteLine("Found Directory: {0}", dir.FullName);
            if (!Directory.Exists(Path.Combine(destination, dir.Name)))
            {
                Console.WriteLine("Attempting to create directory...");
                Directory.CreateDirectory(Path.Combine(destination, dir.Name));
                Console.WriteLine("Created Directory: {0}", dir.Name);
            }
            CopyDirectory(dir.FullName, Path.Combine(destination, dir.Name));
            FileInfo[] files = dir.GetFiles();
            foreach (FileInfo file in files)
            {
                Console.WriteLine("Found: {0}", file.FullName);
                Console.WriteLine("Attempting to copy...");
                // Copy into the matching destination subdirectory, not its parent.
                file.CopyTo(Path.Combine(Path.Combine(destination, dir.Name), file.Name), true);
            }
        }
    }
}
I believe that part is quite accurate and working. However, my problem stems from the calling code.
public static class Backup
{
    public static void Save()
    {
        string[] drives = Directory.GetLogicalDrives();
        foreach (string drive in drives)
        {
            DriveInfo diagnose = new DriveInfo(drive);
            if (diagnose.VolumeLabel == "Backup" && diagnose.DriveType == DriveType.Fixed)
            {
                FileSystem.CopyDirectory(
                    ConfigurationManager.AppSettings["Source"],
                    ConfigurationManager.AppSettings["Destination"]);
            }
        }
    }
}
The code runs and it looks like smooth sailing at first: it completes five out of the hundred directories of web-sites it is copying, then it randomly throws an exception. (I left error handling out for simplicity, but this is the error.)
IOException: The device is not ready.
It randomly stops reading and writing
I'm not entirely sure why this would occur, any advice would be terrific.
I would recommend just using the built-in FileSystem.CopyDirectory method (from Microsoft.VisualBasic.FileIO), which handles the recursive copy properly without custom code.
Using the built-in method is definitely more appropriate here:
Microsoft.VisualBasic.FileIO.FileSystem.CopyDirectory(source, destination, true);
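In a C# project this needs a reference to the Microsoft.VisualBasic assembly. A minimal call site, reusing the question's own app settings (the fully qualified name also avoids a clash with the custom FileSystem class above):

using System.Configuration;

// Requires a project reference to Microsoft.VisualBasic.
Microsoft.VisualBasic.FileIO.FileSystem.CopyDirectory(
    ConfigurationManager.AppSettings["Source"],
    ConfigurationManager.AppSettings["Destination"],
    true); // true = overwrite existing files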
However, that is not the heart of the issue. The problem is when each drive is inspected, specifically the request for .VolumeLabel on the DriveInfo. You can read some background on the issue on MSDN: http://msdn.microsoft.com/en-us/library/system.io.driveinfo.volumelabel(v=vs.110).aspx
What it breaks down to is that when a drive is examined, it has to be ready. Sometimes it is not ready, and when it is accessed at that point you get an exception:
IOException - An I/O error occurred (for example, a disk error or a drive was not ready).
To remedy this, check diagnose.IsReady to make sure the drive is ready before accessing its other properties:
public static void Save()
{
    string[] drives = Directory.GetLogicalDrives();
    foreach (string drive in drives)
    {
        DriveInfo diagnose = new DriveInfo(drive);
        // Make sure the drive is ready before examining its other properties.
        if (diagnose.IsReady && diagnose.VolumeLabel == "Backup" && diagnose.DriveType == DriveType.Fixed)
        {
            FileSystem.CopyDirectory(
                ConfigurationManager.AppSettings["Source"],
                ConfigurationManager.AppSettings["Destination"]);
        }
    }
}
Related
I'm trying to create a program that runs through all the drives on a PC and lists the files.
I have 9 drives in my PC and the program runs fine and lists files on all of them except the drive I'm running the program from. (It doesn't matter which drive that is.)
I have a recursive function that takes all the files and directories it finds and compiles a list.
The function runs fine on all other drives, but for the one I'm running the program from it says Could not find file 'D:\CreateFileList.deps.json'. and then falls into the catch() for that drive.
Here's the part of the code that does that.
static void DirSearch(string sDir, string file)
{
try
{
// Get files from root of the drive
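// firstPass is assumed to be an int field of the class, initialized to 1
// elsewhere, so the root's own files are listed only once.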
if ( firstPass == 1 )
{
foreach (string f in Directory.GetFiles(sDir))
{
if (CheckExclusion(f))
{
WriteToFile(f, file);
}
}
firstPass = 0;
}
// Get files recursively
foreach (string d in Directory.GetDirectories(sDir))
{
if (CheckExclusion(d))
{
foreach (string f in Directory.GetFiles(d))
{
if (CheckExclusion(f))
{
WriteToFile(f, file);
}
}
}
DirSearch(d, file);
}
}
catch (System.Exception excpt)
{
Console.WriteLine(excpt.Message);
}
}
Obviously this file is not at the root of the drive but in the same directory as the .exe file.
Does anyone have any idea what might be wrong? Do I have some settings wrong or includes or what?
You need to look at the actual exception being returned. It will give you a hint as to what is happening.
For example if the OS is throwing UnauthorizedAccessException your process is not running as admin and will not be allowed to look at the directory/files.
You need to handle (catch) each of the exceptions listed for Directory.GetDirectories in your try/catch.
By handling all (or just the most likely) exceptions, you will be able to have a working program.
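As a sketch, here is the recursion wrapped in catches for the exceptions the Directory.GetDirectories documentation lists (handle or log each as appropriate for your program):

try
{
    foreach (string d in Directory.GetDirectories(sDir))
    {
        DirSearch(d, file);
    }
}
catch (UnauthorizedAccessException) { /* no rights to this directory; skip it */ }
catch (DirectoryNotFoundException) { /* directory vanished between calls; skip it */ }
catch (PathTooLongException)       { /* path exceeds the OS limit; skip it */ }
catch (IOException)                { /* disk or network error; log and move on */ }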
I have this code to copy all files from the source directory, F:\, to a destination directory.
public void Copy(string sourceDir, string targetDir)
{
//Exception occurs at this line.
string[] files = System.IO.Directory.GetFiles(sourceDir, "*.jpg",
SearchOption.AllDirectories);
foreach (string srcPath in files)
{
File.Copy(srcPath, srcPath.Replace(sourceDir, targetDir), true);
}
}
and I'm getting an exception.
If I omit SearchOption.AllDirectories it works, but it only copies files from the root of F:\.
Use the following function instead of System.IO.Directory.GetFiles:
IEnumerable<String> GetAllFiles(string path, string searchPattern)
{
    return System.IO.Directory.EnumerateFiles(path, searchPattern).Union(
        System.IO.Directory.EnumerateDirectories(path).SelectMany(d =>
        {
            try
            {
                // Materialize the subtree here (ToList) so that an access
                // failure surfaces inside this try; the enumerable is lazy
                // and would otherwise throw later, outside the catch.
                return GetAllFiles(d, searchPattern).ToList();
            }
            catch (UnauthorizedAccessException)
            {
                return Enumerable.Empty<String>();
            }
        }));
}
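A hypothetical call site wired into the question's copy loop (F:\ and targetDir as in the original code):

// Hypothetical usage: enumerate lazily and copy as we go.
foreach (string srcPath in GetAllFiles(@"F:\", "*.jpg"))
{
    File.Copy(srcPath, srcPath.Replace(@"F:\", targetDir), true);
}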
File system objects are subject to security. Some file system objects are secured in such a way that they can only be accessed by certain users. You are encountering a file to which the user executing the code does not have sufficient rights to access.
The reason that you don't have access rights for this particular folder is to protect the security of the different users on the system. The folder in question is the recycle bin on that drive. And each different user has their own private recycle bin, that only they have permission to access. If anybody could access any other user's recycle bin, then users would be able to read each other's files, a clear violation of the system's security policy.
Perhaps the simplest way around this is to skip hidden folders at the root level of the drive. That simple change would be enough to solve your problem because you surely don't want to copy recycle bins.
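For example, a sketch of that filter (assumes the F:\ source from the question; root-level files can be handled separately):

// Only descend into top-level folders that are not hidden.
DirectoryInfo root = new DirectoryInfo(@"F:\");
foreach (DirectoryInfo dir in root.GetDirectories())
{
    if ((dir.Attributes & FileAttributes.Hidden) == 0)
    {
        foreach (string srcPath in Directory.GetFiles(dir.FullName, "*.jpg", SearchOption.AllDirectories))
        {
            // copy as in the original loop
        }
    }
}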
That folder is a secured system folder (the recycle bin; each drive has its own). Just wrap your File.Copy in a try/catch and ignore/log all the failures. That way you will only copy actual files and skip system files/folders.
If you really want to avoid the try/catch, use the FileInfo and DirectoryInfo classes to figure out which folders/files belong to the system and would throw an exception.
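For instance (illustrative only, again assuming the F:\ root from the question):

// Skip anything flagged as a system object before recursing into it.
foreach (DirectoryInfo dir in new DirectoryInfo(@"F:\").GetDirectories())
{
    if ((dir.Attributes & FileAttributes.System) != 0)
        continue; // recycle bin, System Volume Information, and the like
    // ... recurse / copy files from dir ...
}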
This should do the trick:
private IEnumerable<string> RecursiveFileSearch(string path, string pattern, ICollection<string> filePathCollector = null)
{
try
{
filePathCollector = filePathCollector ?? new LinkedList<string>();
var matchingFilePaths = Directory.GetFiles(path, pattern);
foreach(var matchingFile in matchingFilePaths)
{
filePathCollector.Add(matchingFile);
}
var subDirectories = Directory.EnumerateDirectories(path);
foreach (var subDirectory in subDirectories)
{
RecursiveFileSearch(subDirectory, pattern, filePathCollector);
}
return filePathCollector;
}
catch (Exception error)
{
bool isIgnorableError = error is PathTooLongException ||
error is UnauthorizedAccessException;
if (isIgnorableError)
{
return Enumerable.Empty<string>();
}
throw; // rethrow without resetting the stack trace
}
}
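A hypothetical call, matching the question's drive and pattern:

IEnumerable<string> jpgFiles = RecursiveFileSearch(@"F:\", "*.jpg");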
I want to index all my music files and store them in a database.
I have this function that I call recursively, starting from the root of my music drive.
i.e.
start > ReadFiles(C:\music\);
ReadFiles(path){
foreach(file)
save to index;
foreach(directory)
ReadFiles(directory);
}
This works fine, but while the program runs the amount of memory it uses grows and grows until... finally my system runs out of memory.
Does anyone have a better approach that doesn't need 4 GB of RAM to complete this task?
Best Regards, Tys
Alxandr's queue-based solution should work fine.
If you're using .NET 4.0, you could also take advantage of the new Directory.EnumerateFiles method, which enumerates files lazily, without loading them all in memory:
void ReadFiles(string path)
{
IEnumerable<string> files =
Directory.EnumerateFiles(
path,
"*",
SearchOption.AllDirectories); // search recursively
foreach(string file in files)
SaveToIndex(file);
}
Did you check for the . and .. entries that show up in every directory except the root?
If you don't skip those, you'll have an infinite loop.
You can implement this with a queue. I think (but I'm not sure) that this will save memory; at least it will free up your stack. Whenever you find a folder you add it to the queue, and whenever you find a file you just read it. This avoids recursion.
Something like this:
Queue<string> dirs = new Queue<string>();
dirs.Enqueue(@"C:\music"); // the base directory
while (dirs.Count > 0)
{
    string current = dirs.Dequeue();
    foreach (string directory in Directory.GetDirectories(current))
        dirs.Enqueue(directory);
    foreach (string file in Directory.GetFiles(current))
        SaveToIndex(file); // index the file, as in the question
}
Beware, though, that EnumerateFiles() will stop running if you don't have access to a file or if a path is too long or if some other exception occurs. This is what I use for the moment to solve those problems:
public static List<string> getFiles(string path, List<string> files)
{
    IEnumerable<string> fileInfo = null;
    IEnumerable<string> folderInfo = null;
    try
    {
        // GetFiles is eager (unlike EnumerateFiles), so an access failure
        // is thrown here, inside the try, where the catch can swallow it.
        fileInfo = Directory.GetFiles(path);
    }
    catch
    {
        // no access to this folder; skip it
    }
    if (fileInfo != null)
    {
        files.AddRange(fileInfo);
        // recurse through the subfolders
        folderInfo = Directory.GetDirectories(path);
        foreach (string s in folderInfo)
        {
            try
            {
                getFiles(s, files);
            }
            catch
            {
                // skip subfolders that cannot be read
            }
        }
    }
    return files;
}
Example use:
List<string> files = new List<string>();
files = folder.getFiles(path, files);
My solution is based on the code at this page: http://msdn.microsoft.com/en-us/library/vstudio/bb513869.aspx.
Update: A MUCH faster method to get files recursively can be found at http://social.msdn.microsoft.com/Forums/vstudio/en-US/ae61e5a6-97f9-4eaa-9f1a-856541c6dcce/directorygetfiles-gives-me-access-denied?forum=csharpgeneral. Using Stack is new to me (I didn't even know it existed), but the method seems to work. At least it listed all files on my C and D partitions with no errors.
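For reference, the Stack-based approach from that thread boils down to something like this sketch (not the forum code verbatim; note the yield has to sit outside the try, because C# forbids yield inside a try block that has a catch):

public static IEnumerable<string> GetFilesIterative(string root)
{
    Stack<string> pending = new Stack<string>();
    pending.Push(root);
    while (pending.Count > 0)
    {
        string current = pending.Pop();
        string[] files;
        try
        {
            files = Directory.GetFiles(current);
            foreach (string dir in Directory.GetDirectories(current))
                pending.Push(dir);
        }
        catch (UnauthorizedAccessException) { continue; }
        catch (PathTooLongException) { continue; }
        foreach (string file in files)
            yield return file;
    }
}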
It could be junction folders, which lead to an infinite loop when doing recursion, but I am not sure; check this out and see for yourself. Link: https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/mklink
I'm trying to iterate over the items on my start menu, but I keep receiving the UnauthorizedAccessException. I'm the directory's owner and my user is an administrator.
Here's my method (it's in a dll project):
// root = C:\Users\Fernando\AppData\Roaming\Microsoft\Windows\Start Menu
private void walkDirectoryTree(DirectoryInfo root) {
try {
FileInfo[] files = root.GetFiles("*.*");
foreach (FileInfo file in files) {
records.Add(new Record {Path = file.FullName});
}
DirectoryInfo[] subDirectories = root.GetDirectories();
foreach (DirectoryInfo subDirectory in subDirectories) {
walkDirectoryTree(subDirectory);
}
} catch (UnauthorizedAccessException e) {
// do some logging stuff
throw; //for debugging
}
}
The code fails when it starts to iterate over the subdirectories. What else should I do? I've already tried to create the manifest file, but it didn't work.
Another point (if it is relevant): I'm just running some unit tests with Visual Studio (which is executed as administrator).
Based on your description, it appears there is a directory to which your user does not have access when running with UAC enabled. There is nothing inherently wrong with your code and the behavior in that situation is by design. There is nothing you can do in your code to get around the fact that your account doesn't have access to those directories in the context it is currently running.
What you'll need to do is account for the directory you don't have access to. The best way is probably by adding a few extension methods. For example:
public static FileInfo[] GetFilesSafe(this DirectoryInfo root, string path) {
try {
return root.GetFiles(path);
} catch ( UnauthorizedAccessException ) {
return new FileInfo[0];
}
}
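A matching helper for subdirectories keeps the walk going past protected folders (same pattern, illustrative):

public static DirectoryInfo[] GetDirectoriesSafe(this DirectoryInfo root) {
    try {
        return root.GetDirectories();
    } catch ( UnauthorizedAccessException ) {
        return new DirectoryInfo[0];
    }
}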
I am just learning C# (have been fiddling with it for about 2 days now) and I've decided that, for learning purposes, I will rebuild an old app I made in VB6 for syncing files (generally across a network).
When I wrote the code in VB 6, it worked approximately like this:
Create a Scripting.FileSystemObject
Create directory objects for the source and destination
Create file listing objects for the source and destination
Iterate through the source object, and check to see if it exists in the destination
if not, create it
if so, check to see if the source version is newer/larger, and if so, overwrite the other
So far, this is what I have:
private bool syncFiles(string sourcePath, string destPath) {
DirectoryInfo source = new DirectoryInfo(sourcePath);
DirectoryInfo dest = new DirectoryInfo(destPath);
if (!source.Exists) {
LogLine("Source Folder Not Found!");
return false;
}
if (!dest.Exists) {
LogLine("Destination Folder Not Found!");
return false;
}
FileInfo[] sourceFiles = source.GetFiles();
FileInfo[] destFiles = dest.GetFiles();
foreach (FileInfo file in sourceFiles) {
// check exists on file
}
if (optRecursive.Checked) {
foreach (DirectoryInfo subDir in source.GetDirectories()) {
// create-if-not-exists destination subdirectory
syncFiles(sourcePath + subDir.Name, destPath + subDir.Name);
}
}
return true;
}
I have read examples that seem to advocate using the FileInfo or DirectoryInfo objects to do checks with the Exists property, but I am specifically looking for a way to search an existing collection/list of files rather than doing live checks against the file system for each file, since I will be doing this across the network and constantly going back to a multi-thousand-file directory is slow, slow, slow.
Thanks in Advance.
The GetFiles() method will only return files that do exist. It doesn't make up random files that don't exist. So all you have to do is check whether each one exists in the other list.
Something in the lines of this could work:
var sourceFiles = source.GetFiles();
var destFiles = dest.GetFiles();
foreach (var file in sourceFiles)
{
if(!destFiles.Any(x => x.Name == file.Name))
{
// Do whatever
}
}
Note: You have of course no guarantee that something hasn't changed after you have done the calls to GetFiles(). For example, a file could have been deleted or renamed if you try to copy it later.
This could perhaps be done more nicely by using the Except method or something similar. For example, something like this:
var sourceFiles = source.GetFiles();
var destFiles = dest.GetFiles();
var sourceFilesMissingInDestination = sourceFiles.Except(destFiles, new FileNameComparer());
foreach (var file in sourceFilesMissingInDestination)
{
// Do whatever
}
Where the FileNameComparer is implemented like so:
public class FileNameComparer : IEqualityComparer<FileInfo>
{
public bool Equals(FileInfo x, FileInfo y)
{
return Equals(x.Name, y.Name);
}
public int GetHashCode(FileInfo obj)
{
return obj.Name.GetHashCode();
}
}
Untested though :p
One little detail, instead of
sourcePath + subDir.Name
I would use
System.IO.Path.Combine(sourcePath, subDir.Name)
The Path class does reliable, OS-independent operations on file and folder names.
Also I notice optRecursive.Checked popping out of nowhere. As a matter of good design, make that a parameter:
bool syncFiles(string sourcePath, string destPath, bool checkRecursive)
And since you mention it may be used for large numbers of files, keep an eye out for .NET 4: it has an IEnumerable-based replacement for GetFiles() that will let you process this in a streaming fashion.
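That replacement is Directory.EnumerateFiles; a sketch of how the sync loop could consume it lazily (names here match the question's syncFiles method):

// Streams file paths one at a time instead of materializing a FileInfo[] up front.
foreach (string filePath in Directory.EnumerateFiles(sourcePath))
{
    string fileName = Path.GetFileName(filePath);
    // ... look up fileName in the destination list and copy if missing/newer ...
}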