C#: new File() - where is the root save location?

If I make the call File myFile = new File('myfile.txt'); where does this get saved to?

It's relative to the process's current directory. What that is will depend on how your application was started and what else you've done - for example, some things like the file-open dialog can change the current working directory.
EDIT: If you're after a temporary file, Path.GetTempFileName() is probably what you're after. You can get the temp folder with Path.GetTempPath().
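For example, a minimal sketch of both calls (the comments describe typical locations, not guaranteed ones):
using System;
using System.IO;

class TempFileDemo
{
    static void Main()
    {
        // Creates a zero-byte file in the temp folder and returns its full path.
        string tempFile = Path.GetTempFileName();

        // Returns the temp folder itself, e.g. C:\Users\<user>\AppData\Local\Temp\
        string tempFolder = Path.GetTempPath();

        Console.WriteLine(tempFile);
        Console.WriteLine(tempFolder);

        File.WriteAllText(tempFile, "scratch data");
        File.Delete(tempFile); // clean up when done
    }
}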

That won't compile.
EDIT: However, if you're asking where a text file created with StreamWriter or File.Create("text.txt"), etc., ends up, then I defer to the other answers above: it will be wherever the application is running from. Be aware, as others mentioned, that if you're running a debug build it will be in the debug folder, etc.

NORMALLY it gets saved to the same directory the executable is running in. I've seen exceptions. (I can't remember the specifics, but it threw me for a loop. I seem to recall it happening when it's run as a scheduled task, or from a shortcut.)
The only reliable way to know where it is going to save if you are using the code snippet you provided is to check the value of System.Environment.CurrentDirectory. Better yet, explicitly state the path where you want it to save.
Edit - added
I know this is a bit late in modifying this question, but I found a better answer to the problem of ensuring that you always save the file to the correct location, relative to the executable. The accepted answer there is worth up-votes, and is probably relevant to your question.
See here: Should I use AppDomain.CurrentDomain.BaseDirectory or System.Environment.CurrentDirectory?
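As an illustration, a minimal sketch that anchors a save path to the executable's folder rather than to the working directory (the file name is just a placeholder):
using System;
using System.IO;

class PathDemo
{
    static void Main()
    {
        // Folder containing the executable - stable regardless of how the app was started.
        string baseDir = AppDomain.CurrentDomain.BaseDirectory;

        // Current working directory - can change at runtime (e.g. after an open-file dialog).
        string currentDir = Environment.CurrentDirectory;
        Console.WriteLine("BaseDirectory: " + baseDir);
        Console.WriteLine("CurrentDirectory: " + currentDir);

        // Anchoring relative files to the executable avoids surprises:
        string savePath = Path.Combine(baseDir, "myfile.txt");
        File.WriteAllText(savePath, "saved next to the exe");
    }
}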

Related

How do I debug my C# app and simulate that it's running from a specific folder?

How do I debug my C# app when it really needs to be running from a specific folder and not from bin/debug? One of the first things my program does, for example, is determine what set of tools will be presented based on the executable file it finds. But since it's running from the debug folder it can't find them. I can add the file there, but that seems silly. There has to be a better way. Plus, there are a bunch of other things it does that really require it to be running from the proper folder, e.g. Z:\test.
I thought it might be the "Working Directory" setting under the "Debug" tab in Properties but that didn't seem to do anything. I'm using VS2010 and C# btw...
I hope I'm making sense.
Thanks
You can specify another output folder for your build.
Try setting Environment.CurrentDirectory to the directory you want to simulate.
Environment.CurrentDirectory = @"C:\SimulateThisDirectory\";
You can use
System.IO.Directory.SetCurrentDirectory("your-path");
from your code.
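For instance, a minimal sketch that only applies the simulated folder while a debugger is attached (Z:\test is taken from the question; the Debugger.IsAttached guard is just one way to keep this out of normal runs):
using System;
using System.Diagnostics;
using System.IO;

class Program
{
    static void Main(string[] args)
    {
        // Debug-only tweak: pretend the app was launched from Z:\test
        // while stepping through it in Visual Studio.
        if (Debugger.IsAttached)
        {
            Directory.SetCurrentDirectory(@"Z:\test");
        }

        // ... rest of the application resolves relative paths as usual ...
        Console.WriteLine(Environment.CurrentDirectory);
    }
}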
You need to distinguish between two things:
the path from which the process was started
the current working directory of the process
The first one you can't simulate easily; it would probably involve creating a symbolic link or some sort of rootkit.
As for the second one, your method is fine, and you can check the working directory at runtime using Directory.GetCurrentDirectory or set it using Directory.SetCurrentDirectory.
Take note that if you look up the directory of the executing assembly, you will get the path from which the process was started.
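A quick sketch that prints both values so you can see them differ:
using System;
using System.IO;
using System.Reflection;

class WhereAmI
{
    static void Main()
    {
        // Current working directory - whatever you (or the launcher) set it to.
        Console.WriteLine(Directory.GetCurrentDirectory());

        // Folder the executing assembly was loaded from - not affected by SetCurrentDirectory.
        Console.WriteLine(Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location));
    }
}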
You can run your program as normal from the specific directory and then just attach the debugger:
http://msdn.microsoft.com/en-us/library/c6wf8e4z.aspx

Intermittent "File in Use" Error

I have some code that I wrote to basically clear out the directory every time the program runs through this point. I didn't want to bother enumerating files. If this is a bad way to do this, please tell me.
My main question, however, is about how to deal with the following: one of the files in the folder appears to be in use when it is most certainly not. The program runs on a ButtonClick event, and it exploded the first four or five times, but it worked after I confirmed that nobody was using the file on the server. There is only one person besides myself that would have been using it, and he confirmed that there was nothing running on his side that would be touching the file. Any ideas for what would cause this error/how to avoid it/how to handle it?
I am also having trouble reproducing the error...
string directory = @"\\server\directory\folder\";
DirectoryInfo di = new DirectoryInfo(directory);
if (di.Exists)
    di.Delete(true);
Directory.CreateDirectory(directory);
If you are using Windows XP, this may help : http://msdn.microsoft.com/en-us/library/dd997370.aspx#remove_open_handles
Just an extract from the top of this page :
"If you are running Windows XP or earlier, a delete operation on a file or directory that follows an enumeration could fail if there is an open handle that remains on one of the enumerated directories or files."
You may also use a software like Unlocker to identify the process locking your file.
If the file is in use, then someone is most certainly using it. :)
If you can access the server the files reside on, you can use a tool such as Process Explorer to find out which process has opened the file.
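If the lock turns out to be transient (an antivirus scanner, the search indexer, a copy still completing), one pragmatic workaround is to retry the delete briefly before giving up. A sketch, with the path structure from the question treated as a placeholder:
using System.IO;
using System.Threading;

class FolderCleaner
{
    static void ClearDirectory(string directory)
    {
        // Retry a few times in case something is briefly holding one of the files open.
        for (int attempt = 1; attempt <= 5; attempt++)
        {
            try
            {
                if (Directory.Exists(directory))
                    Directory.Delete(directory, true);
                break;
            }
            catch (IOException) when (attempt < 5)
            {
                Thread.Sleep(500); // back off and try again; the last failure propagates
            }
        }

        Directory.CreateDirectory(directory);
    }
}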

Include a batch file in a batch file

I have a problem calling a batch file from another batch file when trying to run everything via Process.Start. Basically I run a batch file from my C# program that looks like this:
call include.bat
rem execute the rest of the batch file here
The include.bat file sets up paths and can be used by a number of other batch files. When I run this via Process.Start, sometimes it works and sometimes I get "ERROR: cannot find include.bat". First of all, any idea why this happens? And any ideas on how to fix this from the batch file?
To switch to the directory your batch file is located in, use this:
cd %~dp0
I do this in almost all of my batch scripts. That way relative paths should always work.
I know this is an old question but I thought it would be worth noting that the approach promoted by the accepted answer (i.e. changing the working directory) may not always be appropriate.
A better general approach is to refer to dependencies by full path:
call "%~dp0include.bat"
(Since %~dp0 already ends with a backslash, we don't need to add another one.)
Here are some benefits of not changing the working directory:
The rest of the batch file can still use the original working directory.
The original working directory in the command prompt is preserved, even without "SETLOCAL".
If the first batch file is run via a UNC path (such as "\\server\share\file.bat"), the full-path call will succeed while changing the directory (even with "cd /d") will fail. (Using pushd/popd would handle this point, but they have their own set of problems.)
These benefits are particularly important for alias-type batch files, even if they are not as important for the specific situation that motivated this question.
Before the script, try CD /D %~dp0
First thing I'd try is to use full path information in the call statement for include.bat. If that fixes it, you probably are just not running the batch file from the proper location. I'm sure there's a "working directory" capability in C#, I'm just not sure what it is.
Do you set ProcessStartInfo.WorkingDirectory ( http://msdn.microsoft.com/en-us/library/system.diagnostics.processstartinfo.workingdirectory.aspx ) on the ProcessStartInfo that you pass to Process.Start?
Since include.bat sometimes cannot be found, working directory may be wrong (not the folder where include.bat is located).
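A minimal sketch of what that looks like (the folder and file names are placeholders):
using System.Diagnostics;

class BatchRunner
{
    static void RunBatch()
    {
        // WorkingDirectory controls where "call include.bat" inside main.bat is resolved from.
        var psi = new ProcessStartInfo
        {
            FileName = "cmd.exe",
            Arguments = "/c main.bat",
            WorkingDirectory = @"C:\scripts",
            UseShellExecute = false
        };

        using (var process = Process.Start(psi))
        {
            process.WaitForExit();
        }
    }
}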

How to know the next temp file to be created in windows?

I am by no means a programmer, but I am currently wondering about the case where an application creates a temp file that Windows names. For example, if the file it creates is tmp001, is there a way I can take that name tmp001 and ask Windows to give me the name of the next temp file it would create, before it creates it?
Thanks,
Mike
There is no direct means to get to know the next temporary filename to be created.
For example, programmers use the System.IO.Path.GetTempFileName method, but one can add application-specific prefixes or suffixes in order to make it easier for the application to find its newly created files.
One can even choose to save this temporary file elsewhere than the system Temp folder.
You would need to define a "temp file" much more explicitly in order to answer this question with a "Yes". The problem is that a "temp file" is just something not meant to be kept. It could exist anywhere on the system and be created by a user, application, or service. This would make it nearly (or actually) impossible to answer your question with a "Yes".
If you constrain the definition of a temp file to just the files in the official temp folder (or a subfolder), you still have a problem if you're trying to catch names not generated by windows. Any app could produce a particularly named temp file in that folder, without Windows caring.
If you further constrain the definition to be only those files named by Windows, you might be able to get somewhere. But, does that really meet your needs?
After all of that, maybe it would be better to describe the problem you're trying to solve. There may be a much better (workable) solution that would address the issue.
Typically applications use the Win32 API GetTempFileName to have the system generate a temporary file name (and GetTempPath to get the temporary directory itself).
The process of how the temp file is generated is described there.
I'm not sure why you want this info, but perhaps you could for example register for directory changes via a Win32 API like ReadDirectoryChangesW or by using a mini filter driver.
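From managed code, the closest off-the-shelf equivalent to ReadDirectoryChangesW is a FileSystemWatcher on the temp folder. A sketch; note it can only report names as the files appear, it cannot predict the next one:
using System;
using System.IO;

class TempFolderWatcher
{
    static void Main()
    {
        using (var watcher = new FileSystemWatcher(Path.GetTempPath()))
        {
            // Fires whenever anything creates a file in the temp folder.
            watcher.Created += (sender, e) =>
                Console.WriteLine("New temp file: " + e.FullPath);
            watcher.EnableRaisingEvents = true;

            Console.WriteLine("Watching the temp folder; press Enter to stop.");
            Console.ReadLine();
        }
    }
}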
This kind of code just cannot work reliably on a multi-tasking operating system. Another thread in another process might pre-empt yours and claim the file name you are hoping to create.
This is otherwise easy enough to work around, just name your own files instead of relying on Windows doing it for you. Do so in the AppData folder so you'll minimize the risk of another process messing it up.
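A sketch of that approach ("MyApp" is a placeholder for your own application's name):
using System;
using System.IO;

class ScratchFiles
{
    static string CreateScratchFile()
    {
        // A per-application folder under AppData keeps other processes out of the way.
        string folder = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
            "MyApp", "scratch");
        Directory.CreateDirectory(folder);

        // Naming the file yourself (a GUID here) avoids guessing what Windows would pick.
        string path = Path.Combine(folder, Guid.NewGuid().ToString("N") + ".tmp");
        using (File.Create(path)) { }
        return path;
    }
}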

ignore files in flight

When I run the code below, it fills my array with a list of files in the specified directory.
This is good.
However, it also grabs files that are 'in flight' - meaning files that are currently being copied to that directory.
This is bad.
How do I go about ignoring those 'in-flight' files? Is there a way to check each file to make sure it's 'fully there' before I process it?
string[] files = Directory.GetFiles(ConfigurationSettings.AppSettings.Get("sourcePath"));
if (files.Length > 0)
{
    foreach (string filename in files)
    {
        string filenameonly = Path.GetFileName(filename);
        AMPFileEntity afe = new AMPFileEntity(filenameonly);
        afe.processFile();
    }
}
Unfortunately there is no way to achieve what you are looking for. Robert's suggestion of opening the file for writing solves a subset of the problem, but it does not solve the bigger issue, which is:
The file system is best viewed as a multi-threaded object over which you have no synchronization capabilities
No matter what synchronization construct you try to use to put the file system into a "known state", there is a way for the user to beat it.
The best way to approach this problem is to process the files as normal and catch the exceptions that result from using files that are "in flight". This is the only sane way to deal with the file system.
Yes, you can try to open the file for writing. If you are able to open for writing without an exception then it's likely not "in-flight" anymore. Unfortunately, I have encountered this problem several times before and have not come across a better solution.
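A sketch of that kind of check: FileShare.None makes the open fail while any other handle is still open on the file. Keep in mind the result can go stale the moment the method returns.
using System.IO;

class FileChecks
{
    static bool IsFileReady(string path)
    {
        try
        {
            // Fails with IOException while another process still has the file open.
            using (new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.None))
            {
                return true;
            }
        }
        catch (IOException)
        {
            return false;
        }
    }
}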
Get a list of files in the directory.
Get a list of open handles (see below).
Remove the latter from the former.
You can get a list of open handles by p/invoking NtQuerySystemInformation. There's a project on CodeProject that shows how to do this. Alternatively, you can call Handle.exe, from Sysinternals, and parse its output.
Try to rename/move the file. If you can rename it, it's no longer in use.
Carra's answer gave me an idea.
If you have access to the program that copies the files to this directory, modify it so that it:
Writes files to a temporary directory on the same disk.
Move the files to the appropriate folder after they're finished writing to disk.
On the same filesystem, a move operation just updates the directory entries rather than changing the file's physical location on disk. Which means that it's extremely fast.
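A sketch of the producing side under that scheme (the folder names are placeholders):
using System.IO;

class AtomicDrop
{
    static void DropFile(string sourceFile)
    {
        // Both folders must live on the same disk/share so the final Move is a rename,
        // not a second copy.
        string stagingDir = @"\\server\share\staging\";
        string targetDir = @"\\server\share\incoming\";

        string fileName = Path.GetFileName(sourceFile);
        string temp = Path.Combine(stagingDir, fileName + ".partial");

        // The slow part happens in the staging folder, out of sight of the consumer.
        File.Copy(sourceFile, temp, overwrite: true);

        // The rename is effectively instant, so the consumer never sees a half-written file.
        File.Move(temp, Path.Combine(targetDir, fileName));
    }
}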
