Out of memory crash when trying to change image? - c#

(C#) I get an out-of-memory crash when I try to set the image of a PictureBox to one loaded from a file.
My code:
string file = openImageBox.Text; // Our file
if (File.Exists(file))
{
    File.Open(file, FileMode.Open); // Open the file for use.
    Output.Text = "File Open Success!"; // Informing the user on how successful they are.
    Output.ForeColor = System.Drawing.Color.Black;
    Image img = Image.FromFile(file);
    Display.Image = img;
}

Probably not the right answer (who knows, it could be causing you all sorts of issues), but:
You don't need to "Open the file for use". That call holds a handle to the file that you never use or release. Just call Image.FromFile directly and it will work fine.
So remove this:
File.Open(file, FileMode.Open); // Open the file for use.
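With that line gone, the block reduces to something like this (a sketch reusing the control names from the question):
string file = openImageBox.Text; // Our file
if (File.Exists(file))
{
    Output.Text = "File Open Success!"; // Informing the user of their success.
    Output.ForeColor = System.Drawing.Color.Black;
    Display.Image = Image.FromFile(file); // FromFile opens and reads the file itself.
}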
EDIT:
For completeness (and to help you learn): you need to store a reference to the stream if you want to close it. The call I told you to remove above holds a handle to the file, so the file is essentially open until you close it.
For other code (where you're not using a method like Image.FromFile), you would either store a reference to the stream so you can close it yourself, or use a using statement to close it for you.
Option A:
var stream = File.Open(file, FileMode.Open);
// do stuff here
stream.Close();
Option B (preferred):
using (var stream = File.Open(file, FileMode.Open)) {
    // do stuff here
} // stream.Close automatically called for you

Related

Need to check whether a particular text file is in open state ... if yes need to close the editor using c#

I have placed a text file at c:\my_files\test1.txt. Using C#, I need to check whether that file is currently open; if it is, I need to close the editor.
With the code below, execution never reaches the catch block even when the file is open.
string path = @"c:\my_files\test1.txt";
FileStream stream = null;
try
{
    stream = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None);
}
catch (IOException)
{
    //the file is unavailable because it is:
    //still being written to
    //or being processed by another thread
    //or does not exist (has already been processed)
    //return true;
    Console.WriteLine("true");
}
finally
{
    if (stream != null)
        stream.Close();
}
You cannot determine whether Notepad.exe has a given file open, because Notepad does not keep an open file stream; it reads the file into memory and releases the handle. You might be able to check whether a Notepad window title contains the file's name, but the title does not include the path, so this is extremely fragile.
Since Notepad does not actually hold the file open, there is no file handle to close. You could try to close Notepad itself if you believe the copy in its buffer is the file you care about, but that is guesswork.
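If you do want to try the window-title route anyway, here is a rough sketch (fragile, as noted; "notepad" and the title format are assumptions about how the file was opened):
using System;
using System.Diagnostics;
using System.IO;

// Fragile sketch: Notepad usually titles its window "test1 - Notepad" or "test1.txt - Notepad"
// (file name only, no path), so this can match a different file with the same name.
string fileName = Path.GetFileNameWithoutExtension(@"c:\my_files\test1.txt");
foreach (Process p in Process.GetProcessesByName("notepad"))
{
    if (p.MainWindowTitle.StartsWith(fileName, StringComparison.OrdinalIgnoreCase))
    {
        p.CloseMainWindow(); // ask the editor to close; it may prompt to save unsaved changes
    }
}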

Why is File.Open so expensive?

I have the following code:
try
{
    string fileName = imageQueue.Dequeue();
    FileStream fileStream = File.Open(
        fileName, FileMode.Open, FileAccess.ReadWrite, FileShare.None);
    Bitmap bitmap = new Bitmap(fileStream);
    Image picture = (Image)bitmap;
    pb.Tag = fileName;
    pb.Image = picture;
    return true;
}
catch (Exception ex)
{
    errorCount++;
    //If another PC has this image open it will error
    return false;
}
Because this program runs on two PCs that pick up files from the same folder, it throws an exception when one PC already has a file open, and it then moves on to the next file in its list.
When I open the application on both PCs at the same time, the first PC manages to open the image but the second doesn't. I am displaying 4 images at once on screen, but some debugging shows that the second PC takes 10.5 seconds failing to open 4 files before it finds one it can open.
Why is this so expensive and what can I do to speed it up?
UPDATE: I give it exclusive access because I want the applications to show unique images, so PC1 shows images 1,2,3,4 and PC2 shows 5,6,7,8 because it can't get access to 1,2,3,4. I also free the FileStream once I'm done with it, at the last possible moment, to prevent other applications from opening the file in the meantime.
You are opening the file with FileAccess.ReadWrite (you don't appear to be writing), and you tell it that you don't want to share the file at all, FileShare.None (so the first PC to get the file wins).
Also, you never close the stream, so the PC that gets the file first holds on to it until the garbage collector closes the stream for you. When you get your stream, wrap it in a using block so that the file is closed automatically:
using (FileStream fileStream = File.Open(fileName, FileMode.Open, FileAccess.ReadWrite, FileShare.None))
{
    // Do stuff with the filestream
}
// The stream will be closed when the closing brace is passed.
I can't answer definitively, but my best guess is that something in the system, either in the .NET Framework classes or in the file system, is implementing a timeout/retry mechanism when a sharing violation occurs. That would explain the inordinate delay you report.
After Edit
Since you want the files to be locked, you might consider rolling a lightweight database (SQLite, an XML file, etc.) that you could use to flag a file as "in use". Then the method would check that flag first, which eliminates waiting for File.Open to time out on a locked file.
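A simpler variant of the same idea is a marker file next to each image; FileMode.CreateNew fails if the marker already exists, so only one PC can claim a given image (a sketch, not the poster's implementation):
// Hypothetical helper: claim an image by creating "<image>.lock" exclusively.
// FileMode.CreateNew throws IOException if another PC created the marker first.
static bool TryClaim(string imagePath)
{
    try
    {
        using (File.Open(imagePath + ".lock", FileMode.CreateNew)) { }
        return true;  // we own it; delete the .lock file when finished with the image
    }
    catch (IOException)
    {
        return false; // another PC got there first, move on to the next image
    }
}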
Original
I guess I should have answered instead of commenting...
try
{
    string fileName = imageQueue.Dequeue();
    using (FileStream fileStream = File.Open(fileName, FileMode.Open, FileAccess.Read, FileShare.Read))
    {
        Bitmap bitmap = new Bitmap(fileStream);
        Image picture = (Image)bitmap;
        pb.Tag = fileName;
        pb.Image = picture;
    }
    return true;
}
catch (Exception ex)
{
    errorCount++;
    //If another PC has this image open it will error
    return false;
}
Have you tried experimenting with these stream properties? You may be able to minimize the timeout, if nothing else:
http://msdn.microsoft.com/en-us/library/470w48b4.aspx

My C# app is locking a file; how can I find where it does it?

I'm writing code that takes a file path, calculates the file's SHA1 hash, and copies the file.
I made sure that I do not lock the files, for example by using
public static string SHA1(string filePath)
{
    var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read);
    var formatted = string.Empty;
    using (var sha1 = new SHA1Managed())
    {
        byte[] hash = sha1.ComputeHash(fs);
        foreach (byte b in hash)
        {
            formatted += b.ToString("X2");
        }
    }
    return formatted;
}
So how can I find, in Visual Studio, where my code locks the file?
Can you change the above to the following and give it a try?
using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
{
    //Your code goes here.
}
There is a small Windows tool, Process Explorer, in which you can find which process has a handle on a file:
http://technet.microsoft.com/en-us/sysinternals/bb896653.aspx
Locking usually happens whenever you create a file stream on a file without later closing that stream. Unless you call fs.Close(); in your code, your application will keep the file open (and thus locked).
You could wrap this in a try-finally block or try the code that Siva Gopal posted.
Your assumption that opening the file stream with just FileAccess.Read will not lock the file is faulty; the file is locked from the moment it is opened for a file operation until the stream is closed.
A FileStream does not close the opened file until the FileStream is garbage collected or you explicitly call its Close or Dispose method. Either insert such an explicit call as soon as you are done with the file you opened, or wrap the use of the FileStream in a using statement, which implies the call to Dispose, as the other answers suggest.
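Putting that together, the hash method from the question would look something like this (same signature, with the stream wrapped in using):
public static string SHA1(string filePath)
{
    var formatted = string.Empty;
    using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
    using (var sha1 = new SHA1Managed())
    {
        byte[] hash = sha1.ComputeHash(fs);
        foreach (byte b in hash)
        {
            formatted += b.ToString("X2");
        }
    } // fs is disposed here, so the file lock is released
    return formatted;
}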

The process cannot access the file because it is being used by another process

I am getting binary data from a SQL Server database field and am creating a document locally in a directory my application has permissions to. However, I am still getting the error specified in the title. I have tried numerous suggestions posted on the web, including those in previous posts on Stack Overflow. I have also used Process Explorer > Find Handle to locate the lock, and it returns nothing, as if the file were not locked.
I am using the code below to save the file to the file system and I then try to copy this file to a new location later in the application process within another method. It is this copy method that takes the path of the newly created file that throws the exception.
The file itself is created with its content and I can open it through Windows Explorer without any problems.
Am I missing something completely obvious? Am I creating the file correctly from the database? Any help on solving or better diagnosing the problem would be much appreciated.
// Get file from DB
FileStream fs = new FileStream(
    @"C:\myTempDirectory\myFile.doc", FileMode.OpenOrCreate, FileAccess.Write);
BinaryWriter br = new BinaryWriter(fs);
br.Write("BinaryDataFromDB");
fs.Flush();
fs.Close();
fs.Dispose();
// Copy file
File.Copy(sourceFileName, destinationFilename, true);
Try adding a call to GC.Collect() after you have disposed of your streams to force the garbage collector to clean up.
// Get file from DB
using (FileStream fs = new FileStream(@"C:\myTempDirectory\myFile.doc", FileMode.OpenOrCreate, FileAccess.Write))
using (BinaryWriter br = new BinaryWriter(fs))
{
    br.Write("BinaryDataFromDB");
    fs.Flush();
}
//Force clean up
GC.Collect();
// Copy file
File.Copy(sourceFileName, destinationFilename, true);
Change your code as follows; the problem is that the FileStream isn't being released when you need it to be:
// Get file from DB
using (FileStream fs = new FileStream(@"C:\myTempDirectory\myFile.doc", FileMode.OpenOrCreate, FileAccess.Write))
{
    BinaryWriter br = new BinaryWriter(fs);
    br.Write("BinaryDataFromDB");
    fs.Flush();
    fs.Close();
}
// Copy file
File.Copy(sourceFileName, destinationFilename, true);
What about a using statement?
string source = @"C:\myTempDirectory\myFile.doc";
using (FileStream fs = new FileStream(
    source, FileMode.OpenOrCreate, FileAccess.Write, FileShare.Read))
{
    BinaryWriter br = new BinaryWriter(fs);
    br.Write("BinaryDataFromDB");
}
File.Copy(sourceFileName, destinationFilename, true);
EDIT: Try specifying the FileShare.Read permission.
I came to understand that although the debugger says the error lies in this code, that isn't really where the error is.
I faced a similar issue and came up with this solution after considering all the posts above; maybe it will help someone.
I retrieved the image from the database, saved it as "temp.bmp", and showed it in a PictureBox, originally with plain code that did not use the using keyword:
PictureBox1.Image = Image.FromFile("temp.bmp");
That raised an error I could not make head or tail of, so I came up with this solution.
Instead of assigning it directly, try this code:
Bitmap img;
using (Bitmap bmp = new Bitmap("temp.bmp"))
{
    img = new Bitmap(bmp);
}
pictureBox1.Image = img;
For the FileStream part I just used plain code as follows:
FileStream fs = new FileStream("filepath",FileMode.Create);
and it worked like a charm. It really helped.
Is it that a FileStream and a BinaryWriter cannot be used at the same time? For instance, a StreamWriter and a StreamReader cannot both be opened on the same file. As far as I know that should not be the issue here, but perhaps try closing br as well?
Try disposing your BinaryWriter object before doing the copy.
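For example (same variables as the question; closing the writer also flushes it and closes the stream it wraps):
BinaryWriter br = new BinaryWriter(fs);
br.Write("BinaryDataFromDB");
br.Close(); // flushes the writer and closes fs as well
// Copy file
File.Copy(sourceFileName, destinationFilename, true);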
Is there a background process such as antivirus or file-system indexing locking the file just long enough to cause problems with your application? Try pausing for 3 seconds to see if your problem disappears.
using System.Threading;
fs.Dispose();
Thread.Sleep(3000);
// Copy file
This was what I did:
Stream stream = new MemoryStream();
var tempStream = new FileStream(pathToFile, FileMode.Open);
tempStream.CopyTo(stream);
tempStream.Close();
Then use the stream object wherever you want.
I seem to encounter this only after I have published an application. If you close the message box and try to debug again, nothing happens.
MS Help says to get all the way out of Visual Studio and restart it. It's easier to bring up Task Manager and end the process. Its name will be the solution name followed by ".vshost" (e.g. project name: PLC.sln; process name: PLC.vshost.exe). The process will restart automatically. In Windows 7 one could usually see something happen when ending a process, but in Windows 10 the Task Manager window rarely changes.
I get this message when I try testing a program after changing code. Apparently the .exe is not closed when one stops debugging, and the error occurs when VS tries to write a newly compiled .exe file. I sometimes have to end the process twice to keep the error message from coming back. (I have to encounter the error twice; ending the process twice and then rerunning doesn't help.)
This is a bug in Visual Studio.

Reusing a filestream

In the past I've always used a FileStream object to write or rewrite an entire file, after which I would immediately close the stream. However, now I'm working on a program in which I want to keep a FileStream open in order to allow the user to retain access to the file while they are working in between saves (see my previous question).
I'm using XmlSerializer to serialize my classes to and from an XML file, but now I'm keeping the FileStream open to save (reserialize) my class instance later. Are there any special considerations I need to make if I'm reusing the same FileStream over and over again, versus using a new FileStream? Do I need to reset the stream to the beginning between saves? If a later save is smaller than the previous save, will the FileStream leave the remainder bytes from the old file and thus create a corrupted file? Do I need to do something to clear the file so it behaves as if I'm writing an entirely new file each time?
Your suspicion is correct - if you reset the position of an open file stream and write content that's smaller than what's already in the file, it will leave trailing data and result in a corrupt file (depending on your definition of "corrupt", of course).
If you want to overwrite the file, you really should close the stream when you're finished with it and create a new stream when you're ready to re-save.
I notice from your linked question that you are holding the file open in order to prevent other users from writing to it at the same time. This probably wouldn't be my choice, but if you are going to do that, then I think you can "clear" the file by invoking stream.SetLength(0) between successive saves.
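A sketch of that in-place save, where serializer and document stand in for the question's XmlSerializer and object graph:
// Assumes: stream is the FileStream held open between saves.
stream.SetLength(0);               // drop whatever the previous save wrote
stream.Seek(0, SeekOrigin.Begin);  // rewind before writing
serializer.Serialize(stream, document);
stream.Flush();                    // push the new content out to the file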
There are various ways to do this; if you are re-opening the file, perhaps set it to truncate:
using (var file = new FileStream(path, FileMode.Truncate)) {
    // write
}
If you are overwriting the file while already open, then just trim it after writing:
file.SetLength(file.Position); // assumes we're at the new end
I would try to avoid delete/recreate, since this loses any ACLs etc.
Another option might be to use SetLength(0) to truncate the file before you start rewriting it.
Recently ran into the same requirement. In fact, previously, I used to create a new FileStream within a using statement and overwrite the previous file. Seems like the simple and effective thing to do.
using (var stream = new FileStream(path, FileMode.Create, FileAccess.Write))
{
    ProtoBuf.Serializer.Serialize(stream, value);
}
However, I ran into locking issues where some other process was locking the target file. In an attempt to thwart this I retried the write several times before pushing the error up the stack.
int attempt = 0;
while (true)
{
    try
    {
        using (var stream = new FileStream(path, FileMode.Create, FileAccess.Write))
        {
            ProtoBuf.Serializer.Serialize(stream, value);
        }
        break;
    }
    catch (IOException)
    {
        // could be locked by another process
        // make up to X attempts to write the file
        attempt++;
        if (attempt >= X)
        {
            throw;
        }
        Thread.Sleep(100);
    }
}
That seemed to work for almost everyone. Then that problem machine came along and forced me down the path of maintaining a lock on the file the entire time. So instead of retrying the write when the file is already locked, I'm now making sure I get and hold the stream open, so there are no locking issues with later writes.
int attempt = 0;
while (true)
{
    try
    {
        _stream = new FileStream(path, FileMode.Open, FileAccess.ReadWrite, FileShare.Read);
        break;
    }
    catch (IOException)
    {
        // could be locked by another process
        // make up to X attempts to open the file
        attempt++;
        if (attempt >= X)
        {
            throw;
        }
        Thread.Sleep(100);
    }
}
Now when I write the file the FileStream position must be reset to zero, as Aaronaught said. I opted to "clear" the file by calling _stream.SetLength(0). Seemed like the simplest choice. Then using our serializer of choice, Marc Gravell's protobuf-net, serialize the value to the stream.
_stream.SetLength(0);
ProtoBuf.Serializer.Serialize(_stream, value);
This works just fine most of the time and the file is completely written to the disk. However, on a few occasions I've observed the file not being immediately written to the disk. To ensure the stream is flushed and the file is completely written to disk I also needed to call _stream.Flush(true).
_stream.SetLength(0);
ProtoBuf.Serializer.Serialize(_stream, value);
_stream.Flush(true);
Based on your question, I think you'd be better served closing and re-opening the underlying file. You don't seem to be doing anything other than writing the whole file, so the value you gain by managing Open/Close/Flush/Seek yourself will be next to zero. Concentrate on your business problem.
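That simpler approach looks something like this (a sketch; serializer and document stand in for the question's XmlSerializer and object graph):
// Open, overwrite, and close on every save; FileMode.Create truncates any previous content.
using (var stream = new FileStream(path, FileMode.Create, FileAccess.Write))
{
    serializer.Serialize(stream, document);
} // the stream is flushed and closed here, so nothing stays locked between saves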
