Attempted to seek before beginning of stream inside a using() statement - c#

So I got the IOException: "Attempted to Seek before the beginning of the stream." But when I looked into it, the seek statement was inside a using statement. I might be misunderstanding using(), because as far as I knew it initializes the FileStream (in this case) before running the enclosed code.
private string saveLocation = string.Empty;
// This gets called inside the UI to visualize the save location
public string SaveLocation
{
get
{
if (string.IsNullOrEmpty(saveLocation))
{
saveLocation = Environment.GetFolderPath(Environment.SpecialFolder.DesktopDirectory) + @"\Pastes";
Initializer();
}
return saveLocation;
}
set { saveLocation = value; }
}
And this is the function it calls
private void Initializer()
{
// Check if the set save location exists
if (!Directory.Exists(saveLocation))
{
Debug.Log("Save location did not exist");
try
{
Directory.CreateDirectory(saveLocation);
}
catch (Exception e)
{
Debug.Log("Failed to create Directory: " + e);
return;
}
}
// Get executing assembly
if (string.IsNullOrEmpty(executingAssembly))
{
string codeBase = Assembly.GetExecutingAssembly().CodeBase;
UriBuilder uri = new UriBuilder(codeBase);
executingAssembly = Uri.UnescapeDataString(uri.Path);
}
// Get the last received list
if (!string.IsNullOrEmpty(executingAssembly))
{
var parent = Directory.GetParent(executingAssembly);
if (!File.Exists(parent + @"\ReceivedPastes.txt"))
{
// empty using to create file, so we don't have to clean up behind ourselves.
using (FileStream fs = new FileStream(parent + @"\ReceivedPastes.txt", FileMode.CreateNew)) { }
}
else
{
using (FileStream fs = new FileStream(parent + @"\ReceivedPastes.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
if (fs.Seek(-20000, SeekOrigin.End) >= 0)
{
fs.Position = fs.Seek(-20000, SeekOrigin.End);
}
using (StreamReader sr = new StreamReader(fs))
{
while (sr.ReadLine() != null)
{
storedPastes.Add(sr.ReadLine());
}
}
}
}
}
isInitialized = true;
}

As the commenters have posted: the file is less than 20000 bytes. It seems like you assume that Seek will stay at position 0 if the file is not large enough. It doesn't. It throws the IOException you are seeing in that case.
Another thing: Seek will move the position for you, so there is no need to do both. Either use:
fs.Seek(-20000, SeekOrigin.End);
or set the position:
fs.Position = fs.Length - 20000;
So what you really wanted to write is:
if (fs.Length > 20000)
fs.Seek(-20000, SeekOrigin.End);
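Putting it together, a minimal sketch of the corrected read block (assuming storedPastes is a list of strings); this also reads each line only once, where the original loop called ReadLine twice per iteration and so skipped every other line:
using (FileStream fs = new FileStream(parent + @"\ReceivedPastes.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    // Only seek backwards when the file is actually large enough.
    if (fs.Length > 20000)
        fs.Seek(-20000, SeekOrigin.End);

    using (StreamReader sr = new StreamReader(fs))
    {
        string line;
        while ((line = sr.ReadLine()) != null)
        {
            storedPastes.Add(line);
        }
    }
}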

Related

XML Serialization leaves file blank after restart

We have a problem where our industrial equipment's software's .XML settings files become blank, yet they still have the correct number of bytes.
I have a feeling it might be caused by the way the customers are shutting down the PC, as it tends to happen after they've done a shutdown, isolate, and boot. The way I save the files is:
1. Serialize to %temp% file
2. Validate that the newly created file starts with <?xml
3. If the /backup folder's version of the file is older than a day, copy the existing file to the /backup folder
4. Copy new file to overwrite existing file.
I thought maybe it's related to encoding, disk caching, Windows Update, or Windows Recovery.
Looking for ideas as I've spent two years chasing down why this is happening.
As per request, here is the code.
public static bool SerializeObjXml(object Object2Serialize, string FilePath, Type type, bool gzip = false)
{
if (!Path.IsPathRooted(FilePath))
FilePath = Path.Combine(ApplicationDir, FilePath);
bool isSuccess = false;
var tmpFile = Path.GetTempFileName();
try
{
for (int i = 0; i < 3; i++)
{
try
{
Directory.CreateDirectory(Path.GetDirectoryName(FilePath));
if (gzip)
{
using (var ms = new MemoryStream())
{
XmlSerializer bf = new XmlSerializer(type);
bf.Serialize(ms, Object2Serialize);
ms.Position = 0;
using (var fileStream = new BinaryWriter(File.Open(tmpFile, FileMode.Create)))
{
using (GZipStream gzipStream = new GZipStream(fileStream.BaseStream, CompressionMode.Compress))
{
byte[] buffer = new byte[4096];
int numRead;
while ((numRead = ms.Read(buffer, 0, buffer.Length)) != 0)
{
gzipStream.Write(buffer, 0, numRead);
}
}
}
}
if (!FileChecker.isGZip(tmpFile))
throw new XmlException("Failed to write valid XML file " + FilePath);
}
else
{
using (var fs = new StreamWriter(File.Open(tmpFile, FileMode.Create), Encoding.UTF8))
{
XmlSerializer bf = new XmlSerializer(type);
bf.Serialize(fs, Object2Serialize);
}
if (!FileChecker.isXML(tmpFile))
throw new XmlException("Failed to write valid XML file " + FilePath);
}
isSuccess = true;
return true;
}
catch (XmlException)
{
return false;
}
catch (System.IO.DriveNotFoundException) { continue; }
catch (System.IO.DirectoryNotFoundException) { continue; }
catch (System.IO.FileNotFoundException) { continue; }
catch (System.IO.IOException) { continue; }
}
}
finally
{
if (isSuccess)
{
lock (FilePath)
{
try
{
//Delete existing .bak file
if (File.Exists(FilePath + ".bak"))
{
File.SetAttributes(FilePath + ".bak", FileAttributes.Normal);
File.Delete(FilePath + ".bak");
}
}
catch { }
try
{
//Make copy of file as .bak
if (File.Exists(FilePath))
{
File.SetAttributes(FilePath, FileAttributes.Normal);
File.Copy(FilePath, FilePath + ".bak", true);
}
}
catch { }
try
{
//Copy the temp file to the target
File.Copy(tmpFile, FilePath, true);
//Delete .bak file if no error
if (File.Exists(FilePath + ".bak"))
File.Delete(FilePath + ".bak");
}
catch { }
}
}
try
{
//Delete the %temp% file
if (File.Exists(tmpFile))
File.Delete(tmpFile);
}
catch { }
}
return false;
}
public static class FileChecker
{
const string gzipSig = "1F-8B-08";
static string xmlSig = "EF-BB-BF";// <?x";
public static bool isGZip(string filepath)
{
return FileChecker.CheckSignature(filepath, (3, gzipSig)) != null;
}
public static bool isXML(string filepath)
{
return FileChecker.CheckSignature(filepath, (3, xmlSig)) != null;
}
public static bool isGZipOrXML(string filepath, out bool isGZip, out bool isXML)
{
var sig = FileChecker.CheckSignature(filepath, (3, gzipSig), (3, xmlSig));
isXML = (sig == xmlSig);
isGZip = (sig == gzipSig);
return isXML || isGZip;
}
public static string CheckSignature(string filepath, params (int signatureSize, string expectedSignature)[] pairs)
{
if (String.IsNullOrEmpty(filepath))
throw new ArgumentException("Must specify a filepath");
if (String.IsNullOrEmpty(pairs[0].expectedSignature))
throw new ArgumentException("Must specify a value for the expected file signature");
int signatureSize = 0;
foreach (var pair in pairs)
if (pair.signatureSize > signatureSize)
signatureSize = pair.signatureSize;
using (FileStream fs = new FileStream(filepath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
if (fs.Length < signatureSize)
return null;
byte[] signature = new byte[signatureSize];
int bytesRequired = signatureSize;
int index = 0;
while (bytesRequired > 0)
{
int bytesRead = fs.Read(signature, index, bytesRequired);
bytesRequired -= bytesRead;
index += bytesRead;
}
foreach (var pair in pairs)
{
string actualSignature = BitConverter.ToString(signature, 0, pair.signatureSize);
if (actualSignature == pair.expectedSignature)
return actualSignature;
}
}
return null;
}
}
Using the operating system's move or copy to overwrite an existing file is an atomic operation, meaning it either wholly succeeds or doesn't, and it doesn't overlap other file operations.
Therefore what you have should work, if that is how you are achieving step 4:
Copy new file to overwrite existing file.
If instead you are blanking out the existing file and re-writing the data, I suspect that could be the point of failure.
The issue is that while file space is being allocated, the write may not actually complete during shutdown, which leaves you with a file that has bytes allocated but no data flushed to disk.
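If you want step 4 to be a single atomic swap at the OS level, File.Replace is one option: it replaces the destination with the source and can keep the previous version as a backup in one call. A minimal sketch with placeholder paths (note that File.Replace generally requires both files to be on the same volume, so the temp file may need to be created next to the target rather than in %temp%):
// tmpFile holds the newly serialized, validated data; FilePath is the live settings file.
if (File.Exists(FilePath))
{
    // Swap in the new file and keep the old contents as FilePath + ".bak".
    File.Replace(tmpFile, FilePath, FilePath + ".bak");
}
else
{
    File.Move(tmpFile, FilePath);
}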
During the OS shutdown, likely a ThreadAbortException is raised which triggers your finally block.
You can attempt to reproduce by calling Process.Start("shutdown", "-a") before your return statement but after you have set success = true.
I would suggest simplifying your code and having everything run inside of your try {} statement. This removes the possibility of having a state where success = true before you have actually attempted your write to disk, with the copy then being triggered in a finally block by a Windows shutdown.
public static bool SerializeObjXml(
object Object2Serialize,
string FilePath,
Type type,
bool gzip = false)
{
if (!Path.IsPathRooted(FilePath))
FilePath = Path.Combine(ApplicationDir, FilePath);
Directory.CreateDirectory(Path.GetDirectoryName(FilePath));
for (int i = 0; i < 3; i++)
{
try
{
var tempFi = SerializeToXmlFile(Object2Serialize, type, gzip);
var fi = new FileInfo(FilePath);
if (fi.Exists)
fi.CopyTo(fi.FullName + ".bak", true);
tempFi.CopyTo(fi.FullName, true);
tempFi.Delete();
return true;
}
catch (Exception ex)
{
string message = $"[{DateTime.Now}] Error serializing file {FilePath}. {ex}";
File.WriteAllText(FilePath + ".log", message);
}
}
return false;
}
As a side note, you can simply use Stream.CopyTo and write directly to your temp file, without the need for intermediary streams or for manual buffer/byte read/write operations:
private static FileInfo SerializeToXmlFile(
object Object2Serialize,
Type type,
bool gzip)
{
var tmpFile = Path.GetTempFileName();
var tempFi = new FileInfo(tmpFile);
if (!gzip)
{
using (var fs = File.Open(tmpFile, FileMode.Create))
(new XmlSerializer(type)).Serialize(fs, Object2Serialize);
if (!FileChecker.isXML(tmpFile))
throw new Exception($"Failed to write valid XML file: {tmpFile}");
}
else
{
using (var fs = File.Open(tmpFile, FileMode.Create))
using (var gz = new GZipStream(fs, CompressionMode.Compress))
(new XmlSerializer(type)).Serialize(gz, Object2Serialize);
if (!FileChecker.isGZip(tmpFile))
throw new Exception($"Failed to write valid XML gz file: {tmpFile}");
}
return tempFi;
}

Reading and writing very large text files in C#

I have a very large file, almost 2GB in size. I am trying to write a process to read the file in and write it out without the first row. So far I have pretty much only been able to read and write one line at a time, which takes forever. I can open it, remove the first row and save it faster in TextPad, though that is still very slow.
I use this code to get the number of records in the file:
private long getNumRows(string strFileName)
{
long lngNumRows = 0;
string strMsg;
try
{
lngNumRows = 0;
using (var strReader = File.OpenText(@strFileName))
{
while (strReader.ReadLine() != null)
{
lngNumRows++;
}
strReader.Close();
strReader.Dispose();
}
}
catch (Exception excExcept)
{
strMsg = "The File could not be read: ";
strMsg += excExcept.Message;
System.Windows.MessageBox.Show(strMsg);
//Console.WriteLine("Thee was an error reading the file: ");
//Console.WriteLine(excExcept.Message);
//Console.ReadLine();
}
return lngNumRows;
}
This only takes seconds to run. When I add the following code it takes forever to run. Am I doing something wrong? Why does the write add so much time? Any ideas on how I can make this faster?
private void ProcessTextFiles(string strFileName)
{
string strDataLine;
string strFullOutputFileName;
string strSubFileName;
int intPos;
long lngTotalRows = 0;
long lngCurrNumRows = 0;
long lngModNumber = 0;
double dblProgress = 0;
double dblProgressPct = 0;
string strPrgFileName = "";
string strOutName = "";
string strMsg;
long lngFileNumRows;
try
{
using (StreamReader srStreamRdr = new StreamReader(strFileName))
{
while ((strDataLine = srStreamRdr.ReadLine()) != null)
{
lngCurrNumRows++;
if (lngCurrNumRows > 1)
{
WriteDataRow(strDataLine, strFullOutputFileName);
}
}
srStreamRdr.Dispose();
}
}
catch (Exception excExcept)
{
strMsg = "The File could not be read: ";
strMsg += excExcept.Message;
System.Windows.MessageBox.Show(strMsg);
//Console.WriteLine("The File could not be read:");
//Console.WriteLine(excExcept.Message);
}
}
public void WriteDataRow(string strDataRow, string strFullFileName)
{
//using (StreamWriter file = new StreamWriter(@strFullFileName, true, Encoding.GetEncoding("iso-8859-1")))
using (StreamWriter file = new StreamWriter(@strFullFileName, true, System.Text.Encoding.UTF8))
{
file.WriteLine(strDataRow);
file.Close();
}
}
Not sure how much this will improve the performance, but surely, opening and closing the output file for every line that you want to write is not a good idea.
Instead, open both files just one time and then write the line directly:
using (StreamWriter file = new StreamWriter(@strFullFileName, true, System.Text.Encoding.UTF8))
using (StreamReader srStreamRdr = new StreamReader(strFileName))
{
while ((strDataLine = srStreamRdr.ReadLine()) != null)
{
lngCurrNumRows++;
if (lngCurrNumRows > 1)
file.WriteLine(strDataLine);
}
}
You could also remove the check on lngCurrNumRows by simply making an empty read before entering the while loop:
strDataLine = srStreamRdr.ReadLine();
if(strDataLine != null)
{
while ((strDataLine = srStreamRdr.ReadLine()) != null)
{
file.WriteLine(strDataLine);
}
}
Depending on the memory of your machine, you could try the following (my big file was "D:\savegrp.log"; I had a 2 GB file knocking about). This used about 6 GB of memory when I tried it:
int counter = File.ReadAllLines(@"D:\savegrp.log").Length;
Console.WriteLine(counter);
It does depend on the memory available:
File.WriteAllLines(@"D:\savegrp2.log", File.ReadAllLines(@"D:\savegrp.log").Skip(1));
Console.WriteLine("file saved");

Rewriting a text file after reading it

I have a file stored in my application directory that contains a list of sites.
I don't have any problem reading it, but when I want to write to it, I get
System.ArgumentException: Stream is not writeable
This is how I access the file:
FileStream theTextFileStream = new FileStream(Environment.CurrentDirectory + "/fourmlinks.txt",FileMode.OpenOrCreate);
And this is the function that throws the exception:
public static void WriteNewTextToFile(string text, FileStream theFile)
{
string fileText = GetAllTextFromFile(theFile);
ArrayList fileLIst = populateListFromText(fileText);
using (StreamWriter fileWriter = new StreamWriter(theFile))
{
fileWriter.Write(String.Empty);
for (int i = 0; i < fileLIst.Count; i++)
{
fileWriter.WriteLine(fileLIst[i].ToString());
}
}
}
The function reads the old and new text and adds it to an array. Then I clear the file of everything and rewrite it with the old and new data from the array I made.
I don't know if this will help, but here are the file properties:
Build Action: None
Copy to Output Directory: Copy always
Why can't I rewrite the file?
This is the function I use to read the file content:
public static string GetAllTextFromFile(FileStream theFile)
{
string fileText = "";
using (theFile)
{
using (StreamReader stream = new StreamReader(theFile))
{
string currentLine = "";
while ((currentLine = stream.ReadLine()) != null)
{
fileText += currentLine + "\n";
}
}
}
return fileText;
}
You have to use Read/Write file access as the third parameter:
FileStream theTextFileStream = new FileStream(Environment.CurrentDirectory + "/fourmlinks.txt", FileMode.OpenOrCreate, FileAccess.ReadWrite);
Important - Remove using(theFile) statement:
public static string GetAllTextFromFile(FileStream theFile)
{
string fileText = "";
using (StreamReader stream = new StreamReader(theFile))
{
string currentLine = "";
while ((currentLine = stream.ReadLine()) != null)
{
fileText += currentLine + "\n";
}
}
return fileText;
}
Do not use the using construct here, as it will close the underlying stream; in your case you have to open and close the stream objects manually.
This will allow you to write to the file as well.
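Putting both parts together, one possible shape for the corrected read-then-rewrite flow; this is only a sketch, and it assumes .NET 4.5+ so the StreamReader can be told to leave the underlying stream open (the new entry written at the end is a placeholder):
using (FileStream theTextFileStream = new FileStream(
    Environment.CurrentDirectory + "/fourmlinks.txt",
    FileMode.OpenOrCreate, FileAccess.ReadWrite))
{
    string fileText;
    // leaveOpen: true keeps theTextFileStream usable after the reader is disposed.
    using (var reader = new StreamReader(theTextFileStream, Encoding.UTF8, true, 1024, leaveOpen: true))
    {
        fileText = reader.ReadToEnd();
    }

    // Rewind and truncate before rewriting, otherwise the new text is appended after the old.
    theTextFileStream.Seek(0, SeekOrigin.Begin);
    theTextFileStream.SetLength(0);

    using (var writer = new StreamWriter(theTextFileStream))
    {
        writer.Write(fileText);                          // old content
        writer.WriteLine("http://example.com/new-site"); // placeholder for the new entry
    }
}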
For more information, refer to the following links:
FileStream Constructor
FileAccess Enumeration

Can not delete the existing file in C#.net

I am trying to upload a file in ASP.NET. The file may be an image or a PDF. If the file already exists then I have to remove the existing file and upload the new one. But if I try to delete the existing file, it shows the error "The process cannot access the file because it is being used by another process".
This is the code for my file upload.
if (FileUploadFollowUpUN.HasFile)
{
if (Request.QueryString.Count > 0 && Request.QueryString["PCD"] != null)
{
filename = System.IO.Path.GetFileName(FileUploadFollowUpUN.FileName.Replace(FileUploadFollowUpUN.FileName, Request.QueryString["PCD"] + " " + "D" + Path.GetExtension(FileUploadFollowUpUN.FileName)));
SaveFilePath = Server.MapPath("~\\ECG\\") + filename;
DirectoryInfo oDirectoryInfo = new DirectoryInfo(Server.MapPath("~\\ECG\\"));
if (!oDirectoryInfo.Exists)
Directory.CreateDirectory(Server.MapPath("~\\ECG\\"));
if (File.Exists(SaveFilePath))
{
File.SetAttributes(SaveFilePath, FileAttributes.Normal);
File.Delete(SaveFilePath);
}
FileUploadFollowUpUN.SaveAs(Server.MapPath(this.UploadFolderPath) + filename);
Session["FileNameFollowUpUN"] = filename;
if (System.IO.Path.GetExtension(FileUploadFollowUpUN.FileName) == ".pdf")
{
imgPhoto.ImageUrl = "~/Images/pdf.jpg";
ZoomImage.ImageUrl = "~/Images/pdf.jpg";
imgPhoto.Enabled = true;
}
else
{
imgPhoto.ImageUrl = "~/ECG/" + filename;
imgPhoto.Enabled = true;
ZoomImage.ImageUrl = "~/ECG/" + filename;
}
}
}
How can I get rid of this error?
There is a similar question here on how to find out what process is using a file.
You should try to dispose of any open file streams before trying to delete the file.
You could stick the check below in a while loop if you need something that blocks until the file is accessible:
public static bool IsFileReady(String sFilename)
{
// If the file can be opened for exclusive access it means that the file
// is no longer locked by another process.
try
{
using (FileStream inputStream = File.Open(sFilename, FileMode.Open, FileAccess.Read, FileShare.None))
{
if (inputStream.Length > 0)
{
return true;
}
else
{
return false;
}
}
}
catch (Exception)
{
return false;
}
}
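For example, a rough sketch of the wait-then-delete idea using the helper above, with an arbitrary retry cap so it cannot spin forever:
int retries = 10;
while (!IsFileReady(SaveFilePath) && retries-- > 0)
{
    System.Threading.Thread.Sleep(200);   // give whatever holds the file a moment to let go
}

if (File.Exists(SaveFilePath))
{
    File.SetAttributes(SaveFilePath, FileAttributes.Normal);
    File.Delete(SaveFilePath);
}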

image file is not released after it is disposed

I am working on a project that downloads some images and puts them in an ArrayList to be processed later. The following portion of code is where the problem is. It works for the first download, but somehow the file the images are saved to is locked after the first download, and I can't seem to find a way to unlock it. File.Delete("BufferImg"); gives an error saying the file is being used by another process, even though "BufferImg" is not used anywhere else in the program. What am I doing wrong?
int attempcnt=0;
if (ok)
{
System.Net.WebClient myWebClient = new System.Net.WebClient();
try
{
myWebClient.DownloadFile(pth, "BufferImg");
lock (IMRequest) { IMRequest.RemoveAt(0); }
attempcnt = 0;
}
catch // will attempt 3 times (attempcnt) before it removes the request from the queue
{
attempcnt++;
myWebClient.Dispose();
myWebClient = null;
if(attempcnt >2)
{
lock (IMRequest) { IMRequest.RemoveAt(0); }
attempcnt = 0;
}
goto endofWhile;
}
myWebClient.Dispose();
myWebClient = null;
using (Image img = Image.FromFile("BufferImg"))
{
lock (IMBuffer)
{
IMBuffer.Add(img.Clone());
MessageBox.Show("worker filled: " + IMBuffer.Count.ToString() + ": " + pth);
}
img.Dispose();
}
}
endofWhile:
File.Delete("BufferImg");
continue;
}
The following line is why the image is not being released:
IMBuffer.Add(img.Clone());
When you clone something loaded through a resource (file), the file is still attached to the cloned object. You will have to use a FileStream, like so:
FileStream fs = new FileStream("BufferImg", FileMode.Open, FileAccess.Read);
using (Image img = Image.FromStream(fs))
{
lock (IMBuffer)
{
IMBuffer.Add(img);
MessageBox.Show("worker filled: " + IMBuffer.Count.ToString() + ": " + pth);
}
}
fs.Close();
This should release the file after you've loaded it in the buffer.
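An alternative sketch that avoids keeping any handle on "BufferImg" at all is to copy its bytes into a MemoryStream first; GDI+ still wants the stream kept alive for the lifetime of the Image, but a MemoryStream does not lock the file, so it can be deleted right away:
// Read the file once, then detach from it immediately.
byte[] bytes = File.ReadAllBytes("BufferImg");
var ms = new MemoryStream(bytes);        // keep this alive as long as the Image is in use
Image img = Image.FromStream(ms);

lock (IMBuffer)
{
    IMBuffer.Add(img);
}

File.Delete("BufferImg");                // nothing holds a handle on the file any more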
