We're working on moving files from one server to another. The last time we did this, we used VBScript. We moved around a million files, which took several days.
We're looking to speed up the process, as this looks like it's going to be a recurring process with possibly more files to be moved going forward.
Parameters:
Files to move dictated by two text files with multiple lines
Multiple files will be associated with each line in .txt files
Will need to search for files with parameters from text files
Copy the found files over to the new drive
Here's the idea:
Store all parameters into an array
Loop through the array and find the met parameters using something like this:
System.IO.Directory.GetFiles("PATH").Where(s => s.Contains("three"));
If C# can gather the results of that search, then copy those files using RoboCopy so we can multithread
Question:
Can this be done?
If not, is there a way to multithread the code in C# to get through this faster?
My experience with C# is what I've learned in the last few days. If you need some of the code I've created already, let me know.
I would suggest reversing the order to minimize the GetFiles:
Store all parameters into an array
Store all the files on the path:
var files = Directory.GetFiles("PATH");
Loop through the array and find the met parameters:
var ans = parameters.SelectMany(parameter => files.Where(s => s.Contains(parameter)))
.Distinct()
.ToList();
Then copy those files
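Putting the steps above together, here is a minimal sketch. The server paths and parameter-file names are placeholders for your environment, and the matching step is pulled into its own method so it can be checked in isolation. As a design note, for very large batches you could write the matched names out and let robocopy's `/MT` switch do the multithreaded copying instead of `File.Copy`:

```csharp
using System;
using System.IO;
using System.Linq;

public class Mover
{
    // Pure matching step: every file whose name contains any parameter, listed once.
    public static string[] FindMatches(string[] files, string[] parameters) =>
        parameters.SelectMany(p => files.Where(f => Path.GetFileName(f).Contains(p)))
                  .Distinct()
                  .ToArray();

    public static void Main()
    {
        // Hypothetical locations -- adjust to your environment.
        string sourceDir = @"\\oldserver\share";
        string destDir   = @"\\newserver\share";

        // 1. Store all parameters from both text files in one array.
        string[] parameters = File.ReadLines(@"C:\params1.txt")
            .Concat(File.ReadLines(@"C:\params2.txt"))
            .Where(line => !string.IsNullOrWhiteSpace(line))
            .ToArray();

        // 2. List the source files once, up front.
        string[] files = Directory.GetFiles(sourceDir);

        // 3. Filter, then copy each match to the destination.
        foreach (var file in FindMatches(files, parameters))
            File.Copy(file, Path.Combine(destDir, Path.GetFileName(file)), overwrite: true);
    }
}
```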
I'm new to writing in C# and .NET for that matter, and would like to make my program run more efficiently so I'm wondering if someone could help me out here. In a nutshell, my program goes through a series of Excel files and copies data into them. This is all working fine, but I would like to make it even more efficient. As of now I make a list of all the Excel file paths in the directory Clients like this:
listClientExcelPaths = new List<String>(Directory.GetFiles(PATH_Root + PATH_Clients, "*.xlsx", SearchOption.AllDirectories));
I then remove the Excel files from the list which are either temporary or obviously incorrect in some way, and then cycle through listClientExcelPaths and process each file. When I open an Excel file, I make sure its contents are what I want, and if they aren't, I close it.
Now this is all well and good, however in order to make the processing a little more efficient I would like only the Excel files that are in the Admin folder under every client. So the directory structure of every client folder is such: ClientName\Admin.
My questions are: What would the most efficient way of doing this be? I'm thinking going through each path in the list and removing the path that doesn't include Admin? Could anyone give me an example of this?
Any help would be greatly appreciated!
thanks,
Justin
Your idea is fine. If you have no problem using LINQ, then the task is a one-liner like this:
var paths = new List<string>
{
@"a\b\admin\c",
@"x\y\z\",
@"ddd\ggg\hhh\admin",
@"zzz\yyy\rrr"
};
var filteredPaths = paths.Where(p => p.ToLower().Contains("admin")).ToList();
Output is:
a\b\admin\c
ddd\ggg\hhh\admin
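One caveat with `Contains("admin")`: it also matches paths like `ClientA\Administration\...`. If that could occur, a sketch that matches only an exact `Admin` directory segment (the sample paths below are hypothetical) would be:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

public class AdminFilter
{
    // Keep only paths that contain an exact "Admin" directory segment,
    // so "ClientA\Administration\x" is not a false positive.
    public static List<string> AdminPaths(IEnumerable<string> paths) =>
        paths.Where(p => p.Split('\\', '/')
                          .Any(seg => seg.Equals("Admin", StringComparison.OrdinalIgnoreCase)))
             .ToList();

    public static void Main()
    {
        var paths = new List<string>
        {
            @"ClientA\Admin\report.xlsx",
            @"ClientA\Administration\report.xlsx",
            @"ClientB\Admin\budget.xlsx",
        };
        // Prints only the two paths whose folder is exactly "Admin".
        foreach (var p in AdminPaths(paths))
            Console.WriteLine(p);
    }
}
```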
I'm creating a program that scans for changes (not creations or deletions) in ALL the files in a given directory, and all of its subdirectories, within the past 24 hours.
I've seen lots of other examples/tutorials but not all of them do what I'm looking for.
This is my code so far:
using System.IO;
public static void Main(string[] args)
{
string myDirectory = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments), @"C:\Path");
var directory = new DirectoryInfo(myDirectory);
DateTime from_date = DateTime.Now.AddDays(-1);
DateTime to_date = DateTime.Now;
var files = directory.GetFiles()
.Where(file => file.LastWriteTime >= from_date && file.LastWriteTime <= to_date)
.ToArray();
Console.WriteLine();
}
The problem is that the code above only tells me the last change in the directory. What I want is a log of all the changes in the directory.
I am a learning student, so lots of explanation would be great! :)
I am NOT, I repeat NOT, looking for FileSystemWatcher; I don't want my server to stay on for 24 hours straight. I want to start this program once every 24 hours, have it write a log, and close it.
If anyone can help me out with this, or at least give me something to start with, I would very much appreciate it!
EDIT
I finally got this to work via other code, a whole different approach.
I want to thank you all for helping me and improving my understanding of a couple of things.
Without a FileSystemWatcher you would have to have an external file which you would use to compare differences.
So on first run you would get
File 1
File 2
File 3
No changes!
Next run you might get
File 1
File 3
File 4
and you would be able to compare the first list to show that File 2 is missing and File 4 is new...
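A minimal sketch of that snapshot approach. The directory and snapshot-file paths are placeholders; the diff itself is a pure function so it's easy to reason about:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

public class SnapshotDiff
{
    // Compare the previous listing to the current one.
    public static (string[] added, string[] removed) Diff(
        IEnumerable<string> previous, IEnumerable<string> current)
    {
        var prev = new HashSet<string>(previous);
        var curr = new HashSet<string>(current);
        return (curr.Except(prev).ToArray(), prev.Except(curr).ToArray());
    }

    public static void Main()
    {
        string dir = @"C:\WatchedFolder";          // assumption: the folder to scan
        string snapshotFile = @"C:\snapshot.txt";  // assumption: where the last listing is kept

        var current = Directory.GetFiles(dir, "*", SearchOption.AllDirectories);
        var previous = File.Exists(snapshotFile)
            ? File.ReadAllLines(snapshotFile)
            : Array.Empty<string>();

        var (added, removed) = Diff(previous, current);
        foreach (var f in added)   Console.WriteLine($"New:     {f}");
        foreach (var f in removed) Console.WriteLine($"Missing: {f}");

        // Save the current listing for the next run.
        File.WriteAllLines(snapshotFile, current);
    }
}
```

Running this once every 24 hours (e.g. from Task Scheduler) gives a daily log of additions and deletions without any resident watcher.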
It is not possible directly. Files do not keep a log of their changes, and the OS does not write such a log either. The only information you can get is the last write time. If you need a full log, you have to create it yourself (e.g. a Windows service with FileSystemWatcher), or consider using a version control system, which tracks all changes to files (in that case, changes to the files must be made through the version control software).
As others have already written: Without a file system watcher or storing a list of files you have no chance to detect deleted files. Also it is possible to change the file date (however, I do not know whether it is possible to change last write time or only create time). Also renaming a file usually does not change any file date.
The problem is that the code above only tells me the last change in the directory
I tested your code in VS 2010, and in the debugger I see multiple entries in the files variable. Maybe you only have one modified file in the directory itself and the other changed files in subdirectories (see below)? Maybe your output is wrong?
If you really have multiple recently changed files in the directory, I'd suggest using an additional variable for intermediate results to check where the error occurs. (Does directory.GetFiles() give you an incomplete file list, or is there a problem with the filter in Where?)
Last but not least, you write
and all of his sub directories
According to MSDN, GetFiles only returns the files in the current directory. If you also want subdirectories, you have to recurse into them. Of course, you should address the problem of the incomplete file list before adding recursion.
Hope that helps
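For the recursion part, you don't actually have to write the recursion yourself: `GetFiles` has an overload that takes `SearchOption.AllDirectories`. A sketch along the lines of the question's code (the root path is a placeholder):

```csharp
using System;
using System.IO;
using System.Linq;

public class RecentChanges
{
    public static void Main()
    {
        var directory = new DirectoryInfo(@"C:\Path");   // assumption: root to scan
        DateTime fromDate = DateTime.Now.AddDays(-1);

        // AllDirectories makes GetFiles descend into every subdirectory.
        var files = directory.GetFiles("*", SearchOption.AllDirectories)
            .Where(f => f.LastWriteTime >= fromDate)
            .OrderBy(f => f.LastWriteTime)
            .ToArray();

        // Print one log line per changed file, oldest first.
        foreach (var f in files)
            Console.WriteLine($"{f.LastWriteTime}  {f.FullName}");
    }
}
```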
Depending on your needs, this may (as far as I know) not even be possible to do cleanly in .NET: what you're really looking for could be done using a File System Minifilter Driver, which runs quietly in kernel space and can intercept all IRP packets within a certain file path (e.g. C:\Sub*). Such a driver can be loaded with the fltmc.exe command.
I'm new here. I have been trying to compare one file to another and, if the previous file in the folder is bigger, move the file. I have tried to find a solution in VBA, batch, and C#, but have had no luck yet. There are many files in the folder, sorted by date, and we need to compare each file to the next one, and so on. If I could get ANY help with that I would greatly appreciate it!
For C#:
Use the static methods on the Directory class to get a list of the files in a folder.
Use the FileInfo class to get information about the files (i.e. file size)
Use File.Move() to relocate the files if they match your criteria.
Have a look over the IOException and UnauthorizedAccessException documentation to see all the bad things that might happen when your program runs.
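The steps above could be sketched like this. The folder paths are placeholders, and since the question doesn't say which of the pair should move, this version moves the newer, smaller file; swap the indices if you meant the other one:

```csharp
using System;
using System.IO;
using System.Linq;

public class SizeCompare
{
    public static void Main()
    {
        string folder = @"C:\Incoming";   // assumption: folder with the dated files
        string moveTo = @"C:\Smaller";    // assumption: where matched files go

        // Sort by creation time so "previous" and "next" are well defined.
        FileInfo[] files = new DirectoryInfo(folder).GetFiles()
            .OrderBy(f => f.CreationTime)
            .ToArray();

        for (int i = 1; i < files.Length; i++)
        {
            // If the previous file is bigger than the current one, move the current file.
            if (files[i - 1].Length > files[i].Length)
                files[i].MoveTo(Path.Combine(moveTo, files[i].Name));
        }
    }
}
```

Wrap the `MoveTo` call in a try/catch for `IOException` and `UnauthorizedAccessException` in real use.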
My question is actually related to this post.
VB script + read files (only files with "log" name) and copy content files into one file.txt
I want to do exactly what he has done (combine *.log files that I have a user browse for), however I need them to be inserted into the new log file in time order. For example:
1.log (12:15:66)
2.log (10:09:33)
3.log (15:11:10)
I need the output to end up in the final.log file, but in the order (2.log, 1.log, 3.log), because that's the order in which they were created time-wise. I will also have different numbers of log files, so it needs to either combine all files in a directory or ask for each file until I don't specify any more. I am also going to be using C#, not VB as in the example.
Help is much appreciated!
Once the user has selected all of the logs that he wants to include, you can get the FileInfo for each file. Store those in a list and sort by timestamp. Then use a simple loop to copy each one to the output file.
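That loop might look like the sketch below. The list of selected logs is assumed to come from your file-browse dialog; the paths here are placeholders:

```csharp
using System;
using System.IO;
using System.Linq;

public class CombineLogs
{
    // Sort the selected logs by creation time, then append them in that order.
    public static void Combine(string[] selectedLogs, string outputPath)
    {
        var ordered = selectedLogs.Select(p => new FileInfo(p))
                                  .OrderBy(f => f.CreationTime);

        using (var writer = new StreamWriter(outputPath))
            foreach (var log in ordered)
                writer.Write(File.ReadAllText(log.FullName));
    }

    public static void Main()
    {
        // Hypothetical selection from the user's browse dialog.
        string[] selected = { @"C:\logs\1.log", @"C:\logs\2.log", @"C:\logs\3.log" };
        Combine(selected, @"C:\logs\final.log");
    }
}
```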
I am creating an application to back up files from a source directory into a destination directory. I store the files information from the source and destination folders in separate lists then I compare the lists based on their size, date modified etc to see which files need to be copied.
Anyway, the point is that I end up with a list of the files that need to be copied, and I would like to know how much time every file is taking, so I have tried the following techniques:
Technique 1
Technique 2
Technique 3: the regular File.Copy("source...", "Destination")
The first two techniques are great because I can see the progress. The problem is that when I copy some files with those techniques, the new file sometimes has different dates; I would like both files to have the same modified date and also the same creation date. Moreover, if my program crashes for whatever reason, the file being copied will be corrupted. I tried copying a large file (one that takes about a minute to copy in Windows), and if I exit my program while the file is being copied, the partial file sometimes has the same attributes and the same size, so I want to make sure I don't end up with corrupted files if my program crashes.
Maybe I should use either technique 1 or 2 and then, at the end, copy the attributes from the source file and assign them to the destination file. I don't know how to do that, though.
FileInfo has settable members CreationTime and LastWriteTime, so you could stick with your preferred technique and set the dates afterwards if that helps.
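A small sketch of "copy, then carry the dates over" (the paths in `Main` are placeholders):

```csharp
using System;
using System.IO;

public class CopyPreserveDates
{
    // Copy the file, then transfer the source timestamps onto the copy.
    public static void Copy(string source, string dest)
    {
        File.Copy(source, dest, overwrite: true);

        var info = new FileInfo(source);
        File.SetCreationTime(dest, info.CreationTime);
        File.SetLastWriteTime(dest, info.LastWriteTime);
    }

    public static void Main()
    {
        Copy(@"C:\src\data.bin", @"C:\backup\data.bin");  // hypothetical paths
    }
}
```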
Have you considered just writing a shell script that calls robocopy? Any time I've had to run backup tasks like this, I just write a script -- robocopy already does the heavy lifting for me, so there's often no need to create a bespoke application.
A solution that I have, but it's long:
I know I can copy the file from the source and give the destination file a different name, like "fileHasNotBeenCopiedYet", with a hidden attribute. When my program finishes copying the file, it renames it to the source name and copies the attributes. Later, I know that if a file with that name ("fileHasNotBeenCopiedYet") exists, the file is corrupted.
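That marker-name idea can be sketched compactly. This version uses a hypothetical ".partial" suffix instead of a fixed marker name, so several copies can be in flight at once; the real destination name only appears after the copy has completed, and any leftover ".partial" file means an interrupted copy:

```csharp
using System;
using System.IO;

public class SafeCopy
{
    // Copy under a temporary marker name, then rename on success.
    public static void Copy(string source, string dest)
    {
        string temp = dest + ".partial";       // assumption: marker suffix
        File.Copy(source, temp, overwrite: true);

        // Atomic-ish publish step: the real name appears only when the data is complete.
        if (File.Exists(dest)) File.Delete(dest);
        File.Move(temp, dest);
    }
}
```

On startup, the program can delete any "*.partial" files it finds and re-copy them.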