In my application I scan images and build a treeview; when I click on a file name I show the image in the imageViewer.
When I want to delete the directory, I first clear the treeview and the imageViewer:
treevImage.Nodes.Clear();
imageViewer.Image.Dispose();
imageViewer.Image = null;
Directory.Delete(localPath, true);
I get an exception saying that one image is being used by another process. The problem is that it is random: it can be the first image or any other one!
The process cannot access the file 'image9.tif' because it is being used by another process.
Is there a way to know which part of my application uses that image?
Edit:
When I add
imageViewer.Dispose();
the app deletes the files without an exception, but when I scan again I get an exception when showing a new image in the imageViewer:
Object reference not set to an instance of an object.
Edit 2:
The exception due to Dispose() was corrected by MikeNakis, but now, after I delete the directory and run a new scan to show the new images in the imageViewer, I can't see the new images from the new scan, even after calling Dispose() and creating a new instance.
Once you have disposed the imageViewer with imageViewer.Dispose(); you need to then re-create it with imageViewer = new ImageViewer();.
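For example, something along these lines on the form (just a sketch: it assumes imageViewer was added to the form's Controls collection and that ImageViewer is your viewer control type):
// Sketch: tear down the old viewer and build a fresh one before the next scan.
private void ResetImageViewer()
{
    if (imageViewer.Image != null)
    {
        imageViewer.Image.Dispose();   // release the handle held by the displayed bitmap
        imageViewer.Image = null;
    }
    this.Controls.Remove(imageViewer);
    imageViewer.Dispose();

    imageViewer = new ImageViewer();   // re-create the control
    imageViewer.Dock = DockStyle.Fill; // re-apply whatever layout/settings you had
    this.Controls.Add(imageViewer);
}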
I occasionally have to wait a few ms after closing and disposing a file before it is available for reading on the same thread - just as you describe above. I solved this by writing a method that waits until I have exclusive access to the file:
File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None)
Remember to put a Thread.Sleep in that loop. If this locks up your program, you're forgetting to dispose of something.
(I never investigated why this was required since the workaround was successful and has been in use for almost 10 years now. It could be a bug in my code, but since I was writing to the file before closing/disposing it, I suspect that our anti-virus software placed a brief lock on the file.)
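A rough sketch of such a wait method (the names and the timeout are illustrative, not my original code):
// requires: using System; using System.IO; using System.Threading;
static bool WaitForExclusiveAccess(string path, int timeoutMs)
{
    DateTime deadline = DateTime.UtcNow.AddMilliseconds(timeoutMs);
    while (DateTime.UtcNow < deadline)
    {
        try
        {
            // Opens and immediately closes the file; succeeds only when nothing else holds it.
            using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
            {
                return true;
            }
        }
        catch (IOException)
        {
            Thread.Sleep(50);   // still locked; the sleep keeps the loop from spinning
        }
    }
    return false;   // gave up after the timeout
}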
Another solution is to prevent the image object from locking the file at all. You can read the image file into a memory stream and instantiate the image from it. Just make sure you do this right so that you don't end up copying the image around in memory more than necessary.
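A minimal sketch of that approach (assuming a System.Drawing Image, as in a WinForms viewer):
// requires: using System.Drawing; using System.IO;
static Image LoadImageWithoutLocking(string path)
{
    // The whole file is copied into memory once; the Image keeps the MemoryStream
    // alive, but no handle on the file itself remains open.
    byte[] bytes = File.ReadAllBytes(path);
    return Image.FromStream(new MemoryStream(bytes));
}
You would then assign the result to imageViewer.Image instead of loading the file directly.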
Related
I've been working on a C# program that takes a screenshot every second, but it always crashes after the second screenshot.
I guess this is because it fails to save the image, since the name is already taken by the previous screenshot.
It crashes at exactly this line:
screenshot.Save("Screenshot.png", ImageFormat.Png);
I want it to overwrite the image every time without any crashes.
You can use the code below to replace an already created file, but be sure that you have released the previous file handle first. If you use a using block like the code below, disposal, and therefore the release of the file handle, happens automatically:
using (FileStream fs = new FileStream(filePath, FileMode.Create,
    FileAccess.ReadWrite, FileShare.None))
{
    image.Save(fs, ImageFormat.Png); // example format for saving the file
}
I am using DotNetZip 1.9.6 in my application which uses a file structure similar to e.g. *.docx: Zip file containing XML files.
Now every module of the application can store such XML files in my custom file management, and on "save" they are serialized to streams which are then written to the Zip file via DotNetZip.
To update the entries I use ZipFile.UpdateEntry(path, stream).
This works fine and the first time I save my file via calling ZipFile.Save() everything works.
But if I do this a second time (first some UpdateEntry calls, then Save) on the same instance, the Zip file is corrupted: the file structure and metadata (e.g. the uncompressed size of each file) are still there, but all files are 0 bytes in compressed size.
If I create a new instance from the just-saved file after saving, everything works fine, but shouldn't it be possible to avoid that and "reuse" the same instance?
The following example (also see https://dotnetfiddle.net/mHxEIy) can be used to reproduce the problem:
using System.IO;
using System.Text;

public class Program
{
    public static void Main()
    {
        var zipFile = new Ionic.Zip.ZipFile();

        var content1 = new MemoryStream(Encoding.Default.GetBytes("Content 1"));
        zipFile.UpdateEntry("test.txt", content1);
        zipFile.Save("test.zip"); // here the Zip file is correct

        //zipFile = new Ionic.Zip.ZipFile("test.zip"); // uncomment and it works too

        var content2 = new MemoryStream(Encoding.Default.GetBytes("Content 2"));
        zipFile.UpdateEntry("test.txt", content2);
        zipFile.Save(); // after that it is corrupt
    }
}
To run this you need to add the "DotNetZip 1.9.6" NuGet package.
(Screenshots in the original question show the archive contents after the first save and after the second save.)
This looks like it's a bug in the library, around removing an entry. If you just remove an entry and then save again, it correctly removes the file.
However, if you remove an entry and then add another one with the same name - which is what UpdateEntry is documented to do if the entry already exists - the old entry appears to be used instead.
The reason you're ending up with an empty file the second time is that the original MemoryStream is being read again - but by now, it's positioned at the end of the data, so there's no data to read. If you reset the position to the start of the stream (content1.Position = 0;) it will rewrite the original data. If you modify the data within content1, you end up with invalid compressed data.
The only workaround I can immediately think of is to keep your own map from filename to MemoryStream, and replace the contents of each MemoryStream when you want to update it... or just load the file each time, as per your existing workaround.
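To illustrate the stream-position point against the repro code above (this only demonstrates the behaviour, it is not a fix):
// Rewinding the original stream before the second Save lets DotNetZip re-read it,
// so the entry is no longer 0 bytes; it still contains "Content 1", not the update.
content1.Position = 0;
zipFile.UpdateEntry("test.txt", content2);
zipFile.Save();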
It's definitely worth filing a bug around this though, as it should work as far as I can tell.
As already suspected this was a bug in DotNetZip up to version 1.9.6.
I think I was able to fix this with THIS change which was just released as version 1.9.7 on NuGet. At least for me the problem does not happen anymore.
Some background on what happened, as far as I found out:
When you call Save, the library sets an internal flag which remembers that the ZIP file was just saved, and on the second Save call, instead of "recompressing" all entries in the ZIP file, it copies them from the just-saved file.
This works fine for adding/removing entries, but breaks when one of the entries was changed, because it then "mixes" the old and the new entry and produces the inconsistent ZIP file.
My fix basically disables that "copy from old file" logic if an entry was changed.
I have a Windows Forms Application that uses 2 forms, with both writing to separate files (file paths given by inclusion of strings in textboxes on the form).
For Form1, I have a number of functions that write data to the file on various button clicks. This being the case, I used StreamWriter consoleFile = new StreamWriter(File.OpenWrite(fileName)); for the first write to the file and StreamWriter consoleFile = File.AppendText(fileName); for any subsequent ones. This has worked fine.
When it came to implementing the same feature for Form2, the main difference is that all the text is written at once (one function containing four sub-functions to try and keep the code tidy). I went about it like this...
public void writeChecklistToFile()
{
    //open new file for writing
    StreamWriter checklistFileStart = new StreamWriter(File.OpenWrite(getChecklistFile()));
    checklistFileStart.WriteLine("Pre-Anaesthetic Checklist\n");

    //sub-functions (one for each section of list)
    //append tool used in separate functions
    //StreamWriter checklistFile = File.AppendText(getChecklistFile());
    writeAnimalDetails();
    writeAnimalHistory();
    writeAnimalExamination();
    writeDrugsCheck();
}
Each of the sub-functions then contains the appendText variable shown above:
public void writeAnimalDetails()
{
    StreamWriter checklistFile = File.AppendText(getChecklistFile());
    //...
}
Whenever I click the button that calls the main function, it throws an exception on the first File.AppendText() method. It states that the destination file cannot be accessed because it is already being used in another process.
Presumably the culprit is the OpenWrite(), as the file is not used anywhere before that, but I don't understand why this error occurs in my Form2 when it doesn't in Form1!
If anyone could help me get around this, or can point me in the direction of an easier way to do it, I'd really appreciate that.
Thanks
Mark
Read the error as "File cannot be accessed because [the file is still open for use by this] process".
The problem is that the file resource - from File.OpenWrite - is not being Disposed correctly and an unmanaged file handle, with an exclusive lock, is kept open. This in turn results in exceptions when trying to open the still-open file for writing. Use the using statement to help with lifetime management, as discussed here.
In this particular case I recommend supplying the StreamWriter - created once - as an argument to the functions that need to write to it and then Dispose the entire open file resource once at the end when complete. This ensures a more visible resource lifetime and avoids several open-close operations.
public void writeChecklistToFile()
{
    // Open file for writing once..
    using (var checklistWriter = new StreamWriter(File.OpenWrite(getChecklistFile())))
    {
        // .. write everything to it, using the same Stream
        checklistWriter.WriteLine("Pre-Anaesthetic Checklist\n");
        writeAnimalDetails(checklistWriter);
        writeAnimalHistory(checklistWriter);
        writeAnimalExamination(checklistWriter);
        writeDrugsCheck(checklistWriter);
    }
    // And the file is really closed here, thanks to using/Dispose
}
Also see
File cannot be accessed because it is being used by another program
I think the reason it works in your first form is that you only ever have one StreamWriter existing at a time. You click a button, a StreamWriter is created, the function ends, and the StreamWriter is automatically closed before the next button click calls a function.
With your second form, however, you're calling your sub functions with their StreamWriters within a main function that also has a StreamWriter. What that amounts to is you have more than one StreamWriter trying to open the file at the same time, and thus the error.
To fix this, you can put the following after your call to WriteLine in your writeChecklistToFile function:
checklistFileStart.Close();
This will close your first FileStream, and allow your subsequent ones to open up the file.
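In context, the asker's method would then look like this (only the Close call is new):
public void writeChecklistToFile()
{
    StreamWriter checklistFileStart = new StreamWriter(File.OpenWrite(getChecklistFile()));
    checklistFileStart.WriteLine("Pre-Anaesthetic Checklist\n");
    checklistFileStart.Close();   // releases the file so the sub-functions can re-open it

    writeAnimalDetails();
    writeAnimalHistory();
    writeAnimalExamination();
    writeDrugsCheck();
}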
EDIT: I finally found the error. It is completely unrelated to the bitmaps, streams, or static members. It turns out that one of my colleagues had forgotten to remove the email attachment after sending the mail, so the attachment kept the file open. I wrapped the whole mail-sending process in a using statement, and it is solved. Thanks everyone.
I know you might say that there are billions of threads with the same title and this is a duplicate, but believe me it is not. I have been searching for the solution like 7 hours, but nothing helped so far.
The problem is the following: this is a photo capture application which uses WebcamSource as the webcam. The application runs well when the first photo is taken and emailed to the user. However, when the user starts the process all over again (from where it started before the first run), the application throws the error below. The offending code is:
public static void SaveImageCapture(BitmapSource bitmap)
{
    JpegBitmapEncoder encoder = new JpegBitmapEncoder();
    // bitmap = BitmapFrame.Create(BitmapCreateOptions.None, BitmapCacheOption.OnLoad);
    encoder.Frames.Add(BitmapFrame.Create(bitmap));
    encoder.QualityLevel = 100;
    encoder.Rotation = Rotation.Rotate270;

    try
    {
        using (FileStream fstream = new FileStream("Photos\\" + fileName + ".jpg", FileMode.Create))
        {
            encoder.Save(fstream);
            fstream.Close();
        }
    }
    catch (Exception e)
    {
        System.Windows.Forms.MessageBox.Show(e.ToString());
    }
}
The code crashes at FileStream fstream = new FileStream("Photos\\" + fileName + ".jpg", FileMode.Create) and gives the error:
The process cannot access the file "C:\Users\[username]\Dropbox\[projectname]\[projectname]\bin\Debug\Photos" because it is being used by another process.
I have tried closing the webcam stream, surrounding the code with try/catch, putting it into a using statement, adding FileAccess and FileShare fields, trying to add BitmapCreateOptions.None and BitmapCacheOption.OnLoad (it did not allow me), creating new images with different names rather than overwriting the same image, deleting the image after sending the email (it gave me the same error), and some small changes that might affect file access.
I would suggest making the filename something more generic so appending a timestamp or something, but additionally, in the using call fstream.Flush() before fstream.Close()
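Roughly like this, reusing the encoder from the code above (the timestamp format is only an example):
// Timestamped name so every capture gets its own file, plus an explicit Flush before Close.
string uniqueName = "capture_" + DateTime.Now.ToString("yyyyMMdd_HHmmss_fff");
using (FileStream fstream = new FileStream("Photos\\" + uniqueName + ".jpg", FileMode.Create))
{
    encoder.Save(fstream);
    fstream.Flush();
    fstream.Close();   // redundant inside using, but matches the suggestion above
}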
Do you reference that file location (mainly the Photos folder) anywhere else in your code before it reaches this point?
It seems like you have accessed it elsewhere in your code and the connection to it has not been closed off; it's handy to use a using block whenever calling IO methods.
Although it's not ideal, try to call GC.Collect() after encoder.Rotation = Rotation.Rotate270;
My application uses a FileSystemWatcher to raise an event when a TXT file is created by an "X" application, and then reads its content.
The "X" application creates the file (my application detects it successfully), but it takes some time to fill it with data, so the TXT file cannot be read at creation time. I'm looking for a way to wait until the TXT file becomes available for reading: not a static delay, but something related to that file.
Any help? Thanks.
Create the file like this:
myfile.tmp
Then when it's finished, rename it to
myfile.txt
and have your filewatcher watch for the .txt extension
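A sketch of both sides (this assumes you can change how the writing application saves the file, which the question leaves open; paths and names are illustrative):
// requires: using System; using System.IO;

// Writer side: write under a temporary name, then rename once the content is complete.
static void WriteCompletedFile(string directory, string name, string content)
{
    string tmpPath = Path.Combine(directory, name + ".tmp");
    string finalPath = Path.Combine(directory, name + ".txt");
    File.WriteAllText(tmpPath, content);
    File.Move(tmpPath, finalPath);   // the rename happens only after the data is fully written
}

// Reader side: only watch for the .txt extension, so Created fires for finished files only.
static FileSystemWatcher WatchForCompletedFiles(string directory)
{
    var watcher = new FileSystemWatcher(directory, "*.txt");
    watcher.Created += (s, e) => Console.WriteLine("Ready to read: " + e.FullPath);
    watcher.EnableRaisingEvents = true;
    return watcher;
}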
The only way I have found to do this is to put the attempt to read the file in a loop, and exit the loop when I don't get an exception. Hopefully someone else will come up with a better way...
bool FileRead = false;
while (!FileRead)
{
    try
    {
        // code to read file, which you already know
        FileRead = true;
    }
    catch (Exception)
    {
        // do nothing or optionally cause the code to sleep for a second or two
    }
}
You could track the file's Changed event, and see if it's available for opening on change. If the file is still locked, just watch for the next change event.
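A sketch of that approach (the path and filter are illustrative; a still-locked file simply gets picked up on the next Changed event):
// requires: using System; using System.IO;
var watcher = new FileSystemWatcher(@"C:\watched", "*.txt");
watcher.NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.Size;
watcher.Changed += (sender, e) =>
{
    try
    {
        // If the writer still holds the file, this throws and we wait for the next event.
        using (var reader = new StreamReader(
            File.Open(e.FullPath, FileMode.Open, FileAccess.Read, FileShare.Read)))
        {
            string content = reader.ReadToEnd();
            // ... process content ...
        }
    }
    catch (IOException)
    {
        // still locked; the next Changed event will retry
    }
};
watcher.EnableRaisingEvents = true;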
You can open and read a locked file like this
using (var stream = new FileStream(@"c:\temp\file.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    using (var file = new StreamReader(stream))
    {
        while (!file.EndOfStream)
        {
            var line = file.ReadLine();
            Console.WriteLine(line);
        }
    }
}
However, make sure your file writer flushes, otherwise you may not see any changes.
The application X should lock the file until it closes it. Is application X also a .NET application and can you modify it? In that case you can simply use the FileInfo class with the proper value for FileShare (in this case FileShare.Read).
If you have no control over application X, the situation becomes a little more complex. But then you can always attempt to open the file exclusively via the same FileInfo.Open method. Provide FileShare.None in that case. It will attempt to open the file exclusively and will fail if the file is still in use. You can perform this action inside a loop until the file is closed by application X and ready to be read.
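A sketch of that loop (the polling interval is arbitrary):
// requires: using System.IO; using System.Threading;
static FileStream WaitUntilReleased(string path)
{
    var info = new FileInfo(path);
    while (true)
    {
        try
        {
            // Fails with an IOException while application X still has the file open.
            return info.Open(FileMode.Open, FileAccess.Read, FileShare.None);
        }
        catch (IOException)
        {
            Thread.Sleep(100);   // arbitrary back-off before the next attempt
        }
    }
}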
We have a virtual printer for creating pdf documents, and I do something like this to access that document after it's sent to the printer:
using (FileSystemWatcher watcher = new FileSystemWatcher(folder))
{
if(!File.Exists(docname))
for (int i = 0; i < 3; i++)
watcher.WaitForChanged(WatcherChangeTypes.Created, i * 1000);
}
So I wait for a total of 6 seconds (some documents can take a while to print but most come very fast, hence the increasing wait time) before deciding that something has gone awry.
After this, I also read in a for loop, in just the same way that I wait for it to be created. I do this just in case the document has been created, but not released by the printer yet, which happens nearly every time.
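The read attempt afterwards can look much the same as the wait above (a sketch; docname comes from the earlier snippet):
// requires: using System.IO; using System.Threading;
byte[] data = null;
for (int i = 0; i < 3 && data == null; i++)
{
    try
    {
        data = File.ReadAllBytes(docname);   // succeeds once the printer releases the file
    }
    catch (IOException)
    {
        Thread.Sleep(i * 1000);              // same increasing wait as when waiting for creation
    }
}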
You can use the same class to be notified when the file changes.
The Changed event is raised when changes are made to the size, system attributes, last write time, last access time, or security permissions of a file or directory in the directory being monitored.
So I think you can use that event to check if file is readable and open it if it is.
If you have a DB at your disposal, I would recommend using a DB table as a queue with the file names and then monitoring that instead. Nice and transactional.
You can check whether the file's size has changed, although this will require you to poll its value with some frequency.
Also, if you want to get the data faster, you can .Flush() while writing, and make sure to .Close() the stream as soon as you finish writing to it.
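A rough sketch of the size-polling idea (the interval and the "two identical readings" rule are just one way to decide the writer has finished):
// requires: using System.IO; using System.Threading;
static void WaitUntilSizeStopsChanging(string path, int pollMs)
{
    long lastSize = -1;
    while (true)
    {
        long size = new FileInfo(path).Length;
        if (size > 0 && size == lastSize)
        {
            return;   // two identical readings in a row: assume the writer has finished
        }
        lastSize = size;
        Thread.Sleep(pollMs);   // polling interval, tune to taste
    }
}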