Does anybody know why I am not allowed to "read" the size of a file that has the "ReadOnly" bit set? I am running my program as Administrator and I am not attempting to write to the file. I can read the properties and file size from File Explorer just fine, even with lower credentials, but my software is not allowed to read from a read-only file and throws an UnauthorizedAccessException. I don't see the logic behind this; does anybody? Is there a workaround?
private static double DirSize(DirectoryInfo tdir) {
    double size = 0;
    // Sum the sizes of the files directly in this directory...
    FileInfo[] files = tdir.GetFiles();
    foreach (FileInfo file in files) { size += file.Length; }
    // ...then recurse into each subdirectory and add its total.
    DirectoryInfo[] dirs = tdir.GetDirectories();
    foreach (DirectoryInfo dir in dirs) { size += DirSize(dir); }
    return size;
}
Edit: the file it's complaining about is a shortcut to a directory that is read-only. The Security tab shows no problems on either the directory itself or the shortcut. I guess it's not a big deal because it's just a shortcut, but I'd like to understand what's going on, and I want to count that 1 KB shortcut towards my totals.
On Windows, files and directories have separate security settings. You may be able to access a file inside a directory easily, but when it comes to the directory itself, you will need to grant privileges to the user your application runs as.
Properties -> Security -> Groups or Usernames
And you must give access to the user the application runs as (for example, the IIS application pool user for web applications).
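If you want to do the equivalent of that Security dialog in code, something along these lines should work on .NET Framework; the path is only a placeholder, and on .NET Core/5+ the same calls are extension methods in the System.IO.FileSystem.AccessControl package:

using System.IO;
using System.Security.AccessControl;
using System.Security.Principal;

// Hypothetical path standing in for the read-only directory from the question.
var dir = new DirectoryInfo(@"D:\SomeReadOnlyDir");

// Grant the built-in Users group read/list rights, inherited by children.
DirectorySecurity acl = dir.GetAccessControl();
acl.AddAccessRule(new FileSystemAccessRule(
    new SecurityIdentifier(WellKnownSidType.BuiltinUsersSid, null),
    FileSystemRights.ReadAndExecute,
    InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
    PropagationFlags.None,
    AccessControlType.Allow));
dir.SetAccessControl(acl);

A more pragmatic workaround for the DirSize code in the question is simply to wrap the per-file and per-directory work in a try/catch for UnauthorizedAccessException and skip (or log) the entries you cannot read.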
Related
Using Windows 10 Home (20H2) on HP Spectre
I am trying to use the following code to clear the contents of a directory
public int clearDirectory(string path)
{
    DirectoryInfo targetDir = new DirectoryInfo(path);

    // Delete every file directly under the target directory...
    foreach (FileInfo file in targetDir.GetFiles())
    {
        file.Delete();
    }

    // ...then delete each subdirectory, including its contents.
    foreach (DirectoryInfo dir in targetDir.GetDirectories())
    {
        dir.Delete(true);
    }

    return 0;
}
The target directory is on a SanDisk USB drive. Its root contains one directory that I created (which has a number of subdirectories) and the following SanDisk files:
SanDiskMemoryZone_AppInstaller.apk
SanDiskMemoryZone_QuickStartGuide.pdf
I replaced the path to the USB drive with a path to a directory on my C drive and that worked fine.
How does the bootmgr get involved in this?
In Windows you can declare for each drive whether it is bootable or not.
The bootable partition in Windows is a hidden drive called the system partition (and that's also the reason why no error was shown when you tried the delete on the C: drive).
Your SanDisk drive seems to be using a file system that has a hidden folder whose ACL grants access only to the system account.
That's why you get this error: you have no right to delete that file/folder.
One solution:
change your code so that it only keeps files without a leading dot, i.e. skip names matching the regex ^\. (see the sketch below).
If this still doesn't fix your error, then you should try using an elevated process (with admin rights) to tackle this.
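A minimal sketch of that filter, applied to the clearDirectory code from the question; the method name ClearDirectorySafely is just an example, and it also swallows access-denied errors instead of aborting the whole run:

using System;
using System.IO;

static void ClearDirectorySafely(string path)
{
    var targetDir = new DirectoryInfo(path);

    foreach (FileInfo file in targetDir.GetFiles())
    {
        if (file.Name.StartsWith(".")) continue;   // skip dot-files
        try { file.Delete(); }
        catch (UnauthorizedAccessException) { /* no rights - leave it */ }
    }

    foreach (DirectoryInfo dir in targetDir.GetDirectories())
    {
        if (dir.Name.StartsWith(".")) continue;    // skip dot-folders
        try { dir.Delete(true); }
        catch (UnauthorizedAccessException) { /* system-owned - leave it */ }
    }
}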
REMARK: before you delete everything from a USB stick, it would be easier to just reformat it rather than deleting every file.
Maybe look into Diskpart for this.
Apologies for the poor title wording,
I have a StreamWriter set up in my C# program that creates and then writes to multiple text files on a local storage drive. The issue is that as I test this program on multiple machines, the drive letters are inconsistent from machine to machine and a C:, D:, etc. is not always present. As a result I get errors when trying to write to drives that do not exist.
I have attempted not to specify the drive to write to, in the hope that it would default to an existing drive, since the specific location is unimportant for my needs, i.e. "C:\\wLocation.txt" becomes "wLocation.txt", but this did not seem to fix anything.
Code:
public static string getWeatherLocation()
{
    String locationFile = "C:\\wLocation.txt"; // hard-coded drive letter
    String location;
    try
    {
        // Read the first line of the location file.
        System.IO.StreamReader reader = new StreamReader(locationFile);
        location = reader.ReadLine();
        reader.Close();
        return location;
    }
    catch (Exception ex)
    {
        return null;
    }
}
I'm not particularly knowledgeable with regard to StreamWriter, so the solution may be fairly simple, but any help would be appreciated.
You can use System.IO.DriveInfo.GetDrives to get a list of drives on the machine:
DriveInfo[] allDrives = DriveInfo.GetDrives();
foreach (DriveInfo d in allDrives)
{
    Console.WriteLine(d.Name); // C:\ etc.
}
You can then simply compose the file path from the drive's root name and your desired file name:
var filePath = d.Name + "filename.txt";
Or better:
var filePath = Path.Combine(d.Name, "filename.txt");
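Putting the two together, a sketch that picks the first drive that is actually ready and builds a path on it (the file name is just the one from the question):

using System;
using System.IO;

string filePath = null;
foreach (DriveInfo d in DriveInfo.GetDrives())
{
    // Skip drives that are not mounted/ready (empty card readers, DVD drives, ...).
    if (d.IsReady)
    {
        filePath = Path.Combine(d.Name, "wLocation.txt");
        break;
    }
}

if (filePath != null)
{
    File.WriteAllText(filePath, "London"); // example content
}

Note that writing to a drive root often requires elevated rights, which is one reason the special-folder approach suggested below is usually safer.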
In order to cope with different drives on different machines, you have several options:
Use relative file paths, e.g. locationFile = "wLocation.txt" will put the file in the process's current working directory.
Use a special folder, e.g. Documents or AppData. You can use the Environment.GetFolderPath method to get one of those directories and build the full path like this: locationFile = Path.Combine(sysFolderPath, "wLocation.txt"); (see the sketch below).
Make the folder configurable in your application.
Please note that besides choosing a folder path that exists on the specific machine, you also need to pay attention to the permissions on the directory. The first option in particular might fail due to permissions if you install your application under Program Files.
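For the second option, a minimal sketch using AppData (the folder name MyWeatherApp is just an example):

using System;
using System.IO;

// %APPDATA%\MyWeatherApp\wLocation.txt - created if it does not exist yet.
string appData = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
string folder = Path.Combine(appData, "MyWeatherApp");
Directory.CreateDirectory(folder); // no-op if it already exists

string locationFile = Path.Combine(folder, "wLocation.txt");
File.WriteAllText(locationFile, "London"); // example content
string location = File.ReadAllText(locationFile);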
I want to make an exact copy of some files, directories and subdirectories that are on my USB drive I:/ and want them to be in C:/backup (for example)
My USB drive has the following structure:
(just so you know, this is an example; my drive has more files, directories and subdirectories)
courses/data_structures/db.sql
games/pc/pc-game.exe
exams/exam01.doc
Well, I am not sure how to start with this, but my first idea is to get all the files by doing this:
string[] files = Directory.GetFiles("I:");
The next step could be to make a loop and use File.Copy specifying the destination path:
string destinationPath = @"C:/backup";
foreach (string file in files)
{
    File.Copy(file, destinationPath + "\\" + Path.GetFileName(file), true);
}
At this point everything works well, but not as I wanted, because this doesn't replicate the folder structure. Also some errors happen, like the following...
The first one happens because my PC is configured to show hidden files in every folder, and my USB drive has an AUTORUN.INF file that is no longer hidden, so the loop tries to copy it and in the process throws this exception:
Access to the path 'AUTORUN.INF' is denied.
The second one happens when some paths are too long and this generates the following exception:
The specified path, file name, or both are too long. The fully
qualified file name must be less than 260 characters, and the
directory name must be less than 248 characters.
So, I am not sure how to achieve this and handle each possible error case. I would like to know if there is another way to do this and how (maybe some library), or something simpler like a built-in method with the following shape:
File.CopyDrive(driveLetter, destinationFolder)
(VB.NET answers will be accepted too).
Thanks in advance.
public static void Copy(string src, string dest)
{
    // copy all files
    foreach (string file in Directory.GetFiles(src))
    {
        try
        {
            File.Copy(file, Path.Combine(dest, Path.GetFileName(file)));
        }
        catch (PathTooLongException)
        {
        }
        // catch any other exception that you want.
        // List of possible exceptions here: http://msdn.microsoft.com/en-us/library/c6cfw35a.aspx
    }

    // go recursive on directories
    foreach (string dir in Directory.GetDirectories(src))
    {
        // First create directory...
        // Instead of new DirectoryInfo(dir).Name, you can use any other way to get the dir name,
        // but not Path.GetDirectoryName, since it returns the full dir name.
        string destSubDir = Path.Combine(dest, new DirectoryInfo(dir).Name);
        Directory.CreateDirectory(destSubDir);

        // and then go recursive
        Copy(dir, destSubDir);
    }
}
And then you can call it:
Copy(@"I:\", @"C:\Backup");
Didn't have time to test it, but I hope you get the idea...
Edit: in the code above there are no checks like Directory.Exists and such; you might want to add those if some kind of directory structure already exists at the destination path. And if you're trying to create some kind of simple sync app, then it gets a bit harder, as you also need to delete or take other action on files/folders that don't exist anymore.
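For example, a minimal tweak to the file loop above that skips files already present at the destination instead of throwing (same variable names as in the Copy method above):

// Inside the file loop of Copy(): skip files that are already there.
string target = Path.Combine(dest, Path.GetFileName(file));
if (!File.Exists(target))
{
    File.Copy(file, target);
}
// A real sync would also compare timestamps/sizes and remove destination
// entries that no longer exist in the source.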
This is generally done with a recursive directory traversal. Here is a good example: http://msdn.microsoft.com/en-us/library/bb762914.aspx
You might want to look into the overloaded CopyDirectory method (Microsoft.VisualBasic.FileIO.FileSystem):
CopyDirectory(String, String, UIOption, UICancelOption)
It will recurse through all of the subdirectories.
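That method lives in the Microsoft.VisualBasic.FileIO namespace and can be called from C# too, after adding a reference to the Microsoft.VisualBasic assembly; a minimal sketch:

using Microsoft.VisualBasic.FileIO;

// Copies I:\ into C:\Backup, recursing through all subdirectories.
// UIOption.AllDialogs shows the familiar Explorer progress/conflict dialogs;
// UICancelOption.DoNothing simply stops if the user cancels.
FileSystem.CopyDirectory(@"I:\", @"C:\Backup",
    UIOption.AllDialogs, UICancelOption.DoNothing);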
If you want a standalone application, I have written an application that copies from one selected directory to another, overwriting newer files and adding subdirectories as needed.
Just email me.
To copy folders from the local machine to a folder on the server: the complete set of files and folders from the local machine, i.e. the folder/directory path selected by the user (all files within that path), is to be copied into a folder on the web server where the web application is hosted.
Well, I've had quite a difficult time understanding your English. As I understand it, your task is to make an exact copy of one folder, including all nested folders and files, in some other location?
If yes, then I would highly recommend using the console command xcopy for that, as it is optimized performance-wise and (with the right switches) has the benefit of copying the file structure along with the related security permissions etc.
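If you would rather drive xcopy from the C# program instead of a console window, a rough sketch (the switches: /E copies subdirectories including empty ones, /H includes hidden and system files, /C continues on errors, /I treats the destination as a directory, /Y suppresses overwrite prompts):

using System.Diagnostics;

var psi = new ProcessStartInfo
{
    FileName = "xcopy",
    Arguments = @"I:\ C:\Backup /E /H /C /I /Y",
    UseShellExecute = false,
    CreateNoWindow = true
};

using (Process p = Process.Start(psi))
{
    p.WaitForExit();
    // Exit code 0 means xcopy copied everything without errors.
}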
Try this:
string[] SourceFilez = System.IO.Directory.GetFiles("path", "*.*", System.IO.SearchOption.AllDirectories);
string[] targetFilez = new string[SourceFilez.Length];
SourceFilez.CopyTo(targetFilez, 0);

for (int i = 0; i < targetFilez.Length; ++i)
{
    // Map the source path onto the destination by swapping the folder names.
    targetFilez[i] = targetFilez[i].Replace("oldfolder", "newfolder");

    // Make sure the destination directory exists before copying into it.
    string strThisDirectory = System.IO.Path.GetDirectoryName(targetFilez[i]);
    if (!System.IO.Directory.Exists(strThisDirectory))
    {
        System.IO.Directory.CreateDirectory(strThisDirectory);
    }

    System.IO.File.Copy(SourceFilez[i], targetFilez[i]);
}
You might want to wrap the body of the for loop in a try-catch-finally block.
To account for empty directories, you might want to repeat the same code block without the File.Copy call, replacing SourceFilez with:
string[] SourceDirectories = System.IO.Directory.GetDirectories("path", "*.*", System.IO.SearchOption.AllDirectories);
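That second pass could look roughly like this (same "oldfolder"/"newfolder" placeholders as in the code above):

foreach (string sourceDir in SourceDirectories)
{
    // Map the source directory onto the destination tree and create it;
    // this also picks up directories that contain no files at all.
    string targetDir = sourceDir.Replace("oldfolder", "newfolder");
    if (!System.IO.Directory.Exists(targetDir))
    {
        System.IO.Directory.CreateDirectory(targetDir);
    }
}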
If you are on IIS 7 and the user does not have the necessary permissions to write to the target folder, you need to use identity impersonation and switch the app pool to classic mode.
Edit:
Or do you mean a way to upload all files in a folder on a web application user's computer to the server at once? In that case you could look at the jQuery Uploadify plugin:
http://www.uploadify.com/
I'm trying to write a function in C# that gets a directory path as parameter and returns a dictionary where the keys are the files directly under that directory and the values are their last modification time.
This is easy to do with Directory.GetFiles() and then File.GetLastWriteTime(). However, this means that every file must be accessed, which is too slow for my needs.
Is there a way to do this while accessing just the directory? Does the file system even support this kind of requirement?
Edit, after reading some answers:
Thank you guys, you are all saying pretty much the same thing: use a FileInfo object. Still, it is just as slow to use Directory.GetFiles() (or Directory.EnumerateFiles()) to get those objects, and I suspect that getting them requires access to every file. If the file system keeps the last modification time of its files in the files themselves only, there can't be a way to extract that info without file access. Is this the case here? Do GetFiles() and EnumerateFiles() of DirectoryInfo access every file, or do they get their info from the directory entry? I know that if I only wanted the file names, I could get those with the Directory class without accessing every file. But getting attributes seems trickier...
Edit, following henk's response:
It seems that it really is faster to use the FileInfo object. I created the following test:
static void Main(string[] args)
{
    Console.WriteLine(DateTime.Now);

    foreach (string file in Directory.GetFiles(@"\\169.254.78.161\dir"))
    {
        DateTime x = File.GetLastWriteTime(file);
    }
    Console.WriteLine(DateTime.Now);

    DirectoryInfo dirInfo2 = new DirectoryInfo(@"\\169.254.78.161\dir");
    var files2 = from f in dirInfo2.EnumerateFiles()
                 select f;
    foreach (FileInfo file in files2)
    {
        DateTime x = file.LastWriteTime;
    }
    Console.WriteLine(DateTime.Now);
}
For about 800 files, I usually get something like:
31/08/2011 17:14:48
31/08/2011 17:14:51
31/08/2011 17:14:52
I didn't do any timings but your best bet is:
DirectoryInfo di = new DirectoryInfo(myPath);
FileInfo[] files = di.GetFiles();
I think all the FileInfo attributes are available in the directory file records so this should (could) require the minimum I/O.
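For completeness, a sketch that builds the dictionary the question asks for from a single enumeration (the path is a placeholder):

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

// Maps each file directly under the directory to its last write time.
static Dictionary<string, DateTime> GetLastWriteTimes(string path)
{
    var di = new DirectoryInfo(path);
    return di.EnumerateFiles()
             .ToDictionary(f => f.FullName, f => f.LastWriteTime);
}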
The only other thing I can think of is using the FileInfo class. As far as I can see this might help you, or it might end up reading each file as well (read permissions are required).