I'm trying to copy files from a "source" folder to my "destination" folder without duplicates. I can't use the destination folder for comparison, as files will eventually get deleted from there. I had gotten help here to make a batch file
Robocopy "source" "destination" "*chr.txt*" "*hdr.txt*" /M
that uses the archive attribute to keep track of files copied before. However, I need to do this in C# instead. I know there's a command to copy
System.IO.File.Copy(sourceFile, destFile, true);
but I'm not sure how to go about matching specific file names like "*chr.txt" and taking care of duplicates.
I need to do this in C# instead
Process.Start("robocopy", "\"source\" \"destination\" *chr.txt *hdr.txt /M");
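If you go that route, it may help to wait for robocopy to finish and check its exit code (codes below 8 indicate success). A minimal sketch, assuming the placeholder paths and patterns from the question; it needs using System.Diagnostics:
var startInfo = new ProcessStartInfo
{
    FileName = "robocopy",
    Arguments = "\"source\" \"destination\" *chr.txt *hdr.txt /M",
    UseShellExecute = false,
    CreateNoWindow = true
};
// Wait for the copy to complete before continuing.
using (var process = Process.Start(startInfo))
{
    process.WaitForExit();
    Console.WriteLine($"robocopy exit code: {process.ExitCode}");
}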
Here is a method I created a while ago that has worked well for me (aside from the duplicate check, which I just added and haven't tested). Let me know if any part of it does not work for what you need.
private static void CopyDirectory(string from, string to)
{
    // Make sure the destination exists before listing its files.
    Directory.CreateDirectory(to);

    // Names already present in the destination, used as the duplicate check.
    var toFileNames = new DirectoryInfo(to)
        .GetFiles()
        .Select(f => f.Name)
        .ToList();

    var directory = new DirectoryInfo(from);

    // Copy every file the destination does not already contain.
    foreach (var file in directory.GetFiles())
    {
        if (!toFileNames.Contains(file.Name))
            file.CopyTo(Path.Combine(to, file.Name));
    }

    // Recurse into subdirectories, creating each one in the destination first.
    foreach (var subDirectory in directory.GetDirectories())
    {
        var newDirectory = Directory.CreateDirectory(Path.Combine(to, subDirectory.Name));
        CopyDirectory(subDirectory.FullName, newDirectory.FullName);
    }
}
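If you would rather mimic robocopy's /M switch in pure C# (copy only files whose archive attribute is set, then clear it), something along these lines might work; the folder names and patterns are the ones from the question:
// Copy files matching the patterns whose archive bit is set, then clear the
// bit so they are skipped on the next run (this is what robocopy /M does).
foreach (var pattern in new[] { "*chr.txt", "*hdr.txt" })
{
    foreach (var sourceFile in Directory.GetFiles(@"source", pattern))
    {
        var attributes = File.GetAttributes(sourceFile);
        if ((attributes & FileAttributes.Archive) == FileAttributes.Archive)
        {
            var destFile = Path.Combine(@"destination", Path.GetFileName(sourceFile));
            File.Copy(sourceFile, destFile, true);
            File.SetAttributes(sourceFile, attributes & ~FileAttributes.Archive);
        }
    }
}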
I'm using the code below to make a copy of a folder available on a network. This folder has subfolders and files: 455 files and 13 folders in total, about 409 MB in size.
My method calls itself recursively to create a copy of the subfolders and the files in them. Overall, this method is taking more than 10 minutes to finish the task, and I'm looking to speed up the process.
So far I've gone through different posts but did not find a better solution. Is there a better way to achieve my task, or any improvements to my code for faster execution?
void CopyDirectoryAndFiles(string sourceDirectory, string destinationDirectory, bool recursive)
{
// Get information about the source directory
var dir = new DirectoryInfo(sourceDirectory);
// Check if the source directory exists
if (!dir.Exists)
throw new DirectoryNotFoundException($"Source directory not found: {dir.FullName}");
// Cache directories before we start copying
DirectoryInfo[] dirs = dir.GetDirectories();
// Create the destination directory
Directory.CreateDirectory(destinationDirectory);
// Get the files in the source directory and copy to the destination directory
foreach (FileInfo file in dir.GetFiles())
{
string targetFilePath = Path.Combine(destinationDirectory, file.Name);
file.CopyTo(targetFilePath);
}
// If recursive and copying subdirectories, recursively call this method
if (recursive)
{
foreach (DirectoryInfo subDir in dirs)
{
string newDestinationDirectory = Path.Combine(destinationDirectory, subDir.Name);
CopyDirectoryAndFiles(subDir.FullName, newDestinationDirectory, true);
}
}
}
Thanks for your help.
I don't think there's a way to increase performance drastically, but I can suggest a couple of things to try:
replace foreach with Parallel.ForEach to copy data in several streams (see the sketch below);
you can use an external tool (e.g. xcopy), which is optimized for the task, and call it from your C# code. xcopy can copy folders recursively if you specify the /e flag.
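For instance, here is a sketch of the Parallel.ForEach idea applied to the file-copy loop; the degree of parallelism is a guess you would need to tune for your network share:
// Copy the files of one directory level in parallel; measure before adopting.
void CopyFilesParallel(string sourceDirectory, string destinationDirectory)
{
    Directory.CreateDirectory(destinationDirectory);
    Parallel.ForEach(
        Directory.GetFiles(sourceDirectory),
        new ParallelOptions { MaxDegreeOfParallelism = 4 }, // tune this value
        sourceFile =>
        {
            string target = Path.Combine(destinationDirectory, Path.GetFileName(sourceFile));
            File.Copy(sourceFile, target, true);
        });
}
The xcopy route would be a call like Process.Start("xcopy", "\"source\" \"destination\" /e /i"), letting the external tool handle the recursion.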
I want to create a simple program that can access all the files in a folder, all of which are the same type of file, and then use a for each to rename all of them.
The new name will be IMG### with the ### being incremental. I know how to do that part. How do I reference all the images to be used in the for each?
Example:
For Each X in Folder {
rename stuff here for x
}
All I need to know is how to reference the file in the spot of x.
You can go about it along this line of thought:
int counter = 1;
var files = Directory.GetFiles(path, "*.*", SearchOption.AllDirectories);
foreach (string item in files)
{
    // You can rename by moving the files into their 'new names'.
    // The names are full paths, so keep the directory and the extension.
    string newName = Path.Combine(Path.GetDirectoryName(item),
        $"IMG{counter++:D3}{Path.GetExtension(item)}");
    File.Move(item, newName);
}
I know that if you want to delete a directory, you have to delete all of its files first.
However, if you want to delete a directory which contains empty sub-directories, do you have to delete those sub-directories first, or can you just go ahead and delete the main directory?
Directory.Delete with the recurse flag set to true should do the job; there is no need to empty the sub-directories first.
Directory.Delete(path, true);
I have just noticed that your tag refers to IsolatedStorage, in which case you will need to enumerate all the files and folders and delete them as you go.
How to: Delete Files and Directories in Isolated Storage
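A rough sketch of that enumerate-and-delete approach, assuming a store obtained via IsolatedStorageFile.GetUserStoreForApplication() (path handling simplified):
// IsolatedStorageFile has no recursive delete, so walk the tree ourselves.
static void DeleteIsolatedDirectory(IsolatedStorageFile store, string dir)
{
    // Delete the files at this level first.
    foreach (string file in store.GetFileNames(dir + "/*"))
        store.DeleteFile(dir + "/" + file);

    // Then recurse into sub-directories and remove the now-empty directory.
    foreach (string subDir in store.GetDirectoryNames(dir + "/*"))
        DeleteIsolatedDirectory(store, dir + "/" + subDir);

    store.DeleteDirectory(dir);
}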
You can try to delete recursively:
var path = Path.GetFullPath(@"C:\Temp\DeleteMe");
Directory.Delete(path,true); // true for recursive
This should delete everything including files if you have the proper permissions.
Why check whether it is empty or not when you are going to delete it anyway?
You can use the Directory.Delete(yourPath, true) method only if you are sure that there isn't any read-only file in the directory; otherwise it will throw an exception. Instead, you can use your own recursive method like the one below, which first marks each file as normal before deleting it.
public static void DeleteDirectory(string target_dir)
{
    string[] files = Directory.GetFiles(target_dir);
    string[] dirs = Directory.GetDirectories(target_dir);

    // Clear the read-only flag before deleting each file.
    foreach (string file in files)
    {
        File.SetAttributes(file, FileAttributes.Normal);
        File.Delete(file);
    }

    // Recurse into sub-directories first so the directory is empty at the end.
    foreach (string dir in dirs)
    {
        DeleteDirectory(dir);
    }

    Directory.Delete(target_dir, false);
}
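Usage is then a single call; the path here is just an example:
DeleteDirectory(@"C:\Temp\DeleteMe");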
I am trying to read all .txt files in a folder using a StreamReader. I have this now and it works fine for one file, but I need to read all files in the folder. This is what I have so far. Any suggestions would be greatly appreciated.
using (var reader = new StreamReader(File.OpenRead(@"C:\ftp\inbox\test.txt")))
You can use the Directory.EnumerateFiles() method instead:
Returns an enumerable collection of file names that match a search
pattern in a specified path.
var txtFiles = Directory.EnumerateFiles(sourceDirectory, "*.txt");
foreach (string currentFile in txtFiles)
{
...
}
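Combining that with the StreamReader from the question gives a minimal sketch; sourceDirectory here is whatever folder you are scanning:
var txtFiles = Directory.EnumerateFiles(sourceDirectory, "*.txt");
foreach (string currentFile in txtFiles)
{
    using (var reader = new StreamReader(File.OpenRead(currentFile)))
    {
        // Read the whole file; replace this with your per-file processing.
        string contents = reader.ReadToEnd();
    }
}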
You can call Directory.EnumerateFiles() to find all files in a folder.
You can retrieve the files of a directory:
string[] filePaths = Directory.GetFiles(@"c:\MyDir\");
Then you can iterate over each file, performing whatever operation you need, e.g. reading all lines.
You can also pass a file mask as the second argument to the GetFiles method.
Edit:
Inside this post you can see the difference between EnumerateFiles and GetFiles.
What is the difference between Directory.EnumerateFiles vs Directory.GetFiles?
I'm trying to write a function in C# that gets a directory path as parameter and returns a dictionary where the keys are the files directly under that directory and the values are their last modification time.
This is easy to do with Directory.GetFiles() and then File.GetLastWriteTime(). However, this means that every file must be accessed, which is too slow for my needs.
Is there a way to do this while accessing just the directory? Does the file system even support this kind of requirement?
Edit, after reading some answers:
Thank you guys, you are all saying pretty much the same thing: use the FileInfo object. Still, it is just as slow to use Directory.GetFiles() (or Directory.EnumerateFiles()) to get those objects, and I suspect that getting them requires access to every file. If the file system keeps the last modification time of its files in the files themselves only, there can't be a way to extract that info without file access. Is this the case here? Do GetFiles() and EnumerateFiles() of DirectoryInfo access every file, or do they get their info from the directory entry? I know that if I wanted to get just the file names, I could do this with the Directory class without accessing every file. But getting attributes seems trickier...
Edit, following Henk's response:
It seems that it really is faster to use the FileInfo object. I created the following test:
static void Main(string[] args)
{
Console.WriteLine(DateTime.Now);
foreach (string file in Directory.GetFiles(@"\\169.254.78.161\dir"))
{
DateTime x = File.GetLastWriteTime(file);
}
Console.WriteLine(DateTime.Now);
DirectoryInfo dirInfo2 = new DirectoryInfo(@"\\169.254.78.161\dir");
var files2 = from f in dirInfo2.EnumerateFiles()
select f;
foreach (FileInfo file in files2)
{
DateTime x = file.LastWriteTime;
}
Console.WriteLine(DateTime.Now);
}
For about 800 files, I usually get something like:
31/08/2011 17:14:48
31/08/2011 17:14:51
31/08/2011 17:14:52
I didn't do any timings but your best bet is:
DirectoryInfo di = new DirectoryInfo(myPath);
FileInfo[] files = di.GetFiles();
I think all the FileInfo attributes are available in the directory's file records, so this should (could) require the minimum I/O.
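If that holds, the dictionary the question asks for can be built from a single GetFiles() call. A minimal sketch (needs using System.Linq):
// Map each file directly under the directory to its last write time.
Dictionary<string, DateTime> GetLastWriteTimes(string path)
{
    return new DirectoryInfo(path)
        .GetFiles()
        .ToDictionary(f => f.FullName, f => f.LastWriteTime);
}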
The only other thing I can think of is using the FileInfo class. As far as I can see, this might help you, or it might read the file as well (read permissions are required).