I am writing a payments system which communicates with a very old POS interface through files.
The POS system tries to open/create a file (fixed to a single name). If it opens the file, it writes the payment request and closes the file; if it fails to open the file, the POS system waits and tries again (sorry, I don't know how long it waits between attempts, but it's very short).
I have to monitor the directory, and when I see a new file arrive I have to open it and block the POS system from writing any new entries. While I have the file open I read through the records and send them to the payments system; once I receive confirmation I have to close the file and delete it.
My concern/problem is - how do I block the POS system from writing anything between the close and delete statements?
I can set the FileShare to Delete, which allows my program to delete the file before I close it, but this means other processes could also delete the file regardless of whether or not I processed the records successfully.
Option 1
void Main()
{
string filename = #"C:\Temp\CHARGES.TXT";
using (var fs = new FileStream(filename, FileMode.Open, FileAccess.Read, FileShare.None))
{
var buffer = new byte[1024];
while (true)
{
var bytesRead = fs.Read(buffer, 0, buffer.Length);
if (bytesRead == 0)
break;
var text = ASCIIEncoding.ASCII.GetString(buffer, 0, bytesRead);
Console.Write(text);
}
}
File.Delete(filename);
}
Option 2
void Main()
{
string filename = #"C:\Temp\CHARGES.TXT";
using (var fs = new FileStream(filename, FileMode.Open, FileAccess.Read, FileShare.Delete))
{
var buffer = new byte[1024];
while (true)
{
var bytesRead = fs.Read(buffer, 0, buffer.Length);
if (bytesRead == 0)
break;
var text = ASCIIEncoding.ASCII.GetString(buffer, 0, bytesRead);
Console.Write(text);
}
File.Delete(filename);
}
}
As this is a payments system, any loss of records is not acceptable. Also, as I have no control over the POS system, I cannot change the way it works.
Thank you in advance
Thank you to @Lasse and @Sam for your helpful comments. I have made the following changes to the code:
void Main()
{
string filename = #"C:\Temp\CHARGES.TXT";
int start = 0;
int end = 0;
using (var fs = new FileStream(filename, FileMode.Open, FileAccess.ReadWrite, FileShare.Read))
{
// get a big enough buffer to hold the whole file and read it in
long fileSize = new FileInfo(filename).Length;
var buffer = new byte[fileSize];
var bytesRead = fs.Read(buffer, 0, buffer.Length);
try
{
// convert each line to ASCII to process through to API
while (end < bytesRead)
{
while (end < bytesRead && buffer[end] != 0x0A && buffer[end] != 0x0D)
{
end++;
}
var text = ASCIIEncoding.ASCII.GetString(buffer, start, end - start);
// Send to API
Console.WriteLine("Start [{0}] End [{1}] Text [{2}]", start, end, text);
if (end < bytesRead)
{
end += 2;
start = end;
}
// simulate a failure halfway through, to prove the partial write-back below
if (end > bytesRead / 2)
{
throw new NotImplementedException();
}
}
}
catch (Exception e)
{
Console.WriteLine("just to prove the partial");
}
finally
{
// convert any remaining records back to bytes and write back to file
fs.SetLength(0);
fs.Write(buffer, end, bytesRead - end);
}
}
}
The process will now go through the file and write back any failed/unprocessed messages to the file at the end - this covers off my locking concerns as well as covering the individual record failures.
Thank you again.
I want to read a file continuously, like GNU tail with the "-f" param. I need it to live-read a log file.
What is the right way to do it?
A more natural approach is to use FileSystemWatcher:
var wh = new AutoResetEvent(false);
var fsw = new FileSystemWatcher(".");
fsw.Filter = "file-to-read";
fsw.EnableRaisingEvents = true;
fsw.Changed += (s,e) => wh.Set();
var fs = new FileStream("file-to-read", FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
using (var sr = new StreamReader(fs))
{
var s = "";
while (true)
{
s = sr.ReadLine();
if (s != null)
Console.WriteLine(s);
else
wh.WaitOne(1000);
}
}
wh.Close();
Here the main reading loop stops to wait for incoming data, and the FileSystemWatcher is used only to wake the main reading loop.
You want to open a FileStream in binary mode. Periodically, seek to the end of the file minus 1024 bytes (or whatever), then read to the end and output. That's how tail -f works.
Answers to your questions:
Binary because it's difficult to randomly access the file if you're reading it as text. You have to do the binary-to-text conversion yourself, but it's not difficult. (See below)
1024 bytes because it's a nice convenient number, and should handle 10 or 15 lines of text. Usually.
Here's an example of opening the file, reading the last 1024 bytes, and converting it to text:
static void ReadTail(string filename)
{
using (FileStream fs = File.Open(filename, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
// Seek 1024 bytes from the end of the file
fs.Seek(-1024, SeekOrigin.End);
// read 1024 bytes
byte[] bytes = new byte[1024];
fs.Read(bytes, 0, 1024);
// Convert bytes to string
string s = Encoding.Default.GetString(bytes);
// or string s = Encoding.UTF8.GetString(bytes);
// and output to console
Console.WriteLine(s);
}
}
Note that you must open with FileShare.ReadWrite, since you're trying to read a file that's currently open for writing by another process.
Also note that I used Encoding.Default, which in US/English and for most Western European languages will be an 8-bit character encoding. If the file is written in some other encoding (like UTF-8 or another Unicode encoding), it's possible that the bytes won't convert correctly to characters. You'll have to handle that by determining the encoding if you think this will be a problem. Search Stack Overflow for info about determining a file's text encoding.
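If you only need the common cases, a minimal sketch of BOM sniffing looks something like this; it only recognizes the UTF-8 and UTF-16 byte order marks and falls back to Encoding.Default, so treat it as an illustration rather than a complete detector:
static Encoding DetectEncoding(string filename)
{
    // Read the first few bytes and look for a byte order mark (BOM).
    var bom = new byte[4];
    using (var fs = new FileStream(filename, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        fs.Read(bom, 0, 4);

    if (bom[0] == 0xEF && bom[1] == 0xBB && bom[2] == 0xBF) return Encoding.UTF8;
    if (bom[0] == 0xFF && bom[1] == 0xFE) return Encoding.Unicode;           // UTF-16 little-endian
    if (bom[0] == 0xFE && bom[1] == 0xFF) return Encoding.BigEndianUnicode;  // UTF-16 big-endian
    return Encoding.Default;  // no BOM found, assume the system default
}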
If you want to do this periodically (every 15 seconds, for example), you can set up a timer that calls the ReadTail method as often as you want. You could optimize things a bit by opening the file only once at the start of the program. That's up to you.
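As a rough sketch, a System.Threading.Timer could drive that; the 15-second interval and the log path here are just placeholders:
// Call ReadTail immediately, then every 15 seconds.
// Keep a reference to the timer (e.g. in a field) so it isn't garbage collected.
var timer = new System.Threading.Timer(
    _ => ReadTail(@"C:\Temp\app.log"),    // hypothetical log file
    null,
    TimeSpan.Zero,
    TimeSpan.FromSeconds(15));

Console.ReadLine();   // keep the process alive while the timer fires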
To continuously monitor the tail of the file, you just need to remember the previous length of the file.
public static void MonitorTailOfFile(string filePath)
{
var initialFileSize = new FileInfo(filePath).Length;
var lastReadLength = initialFileSize - 1024;
if (lastReadLength < 0) lastReadLength = 0;
while (true)
{
try
{
var fileSize = new FileInfo(filePath).Length;
if (fileSize > lastReadLength)
{
using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
fs.Seek(lastReadLength, SeekOrigin.Begin);
var buffer = new byte[1024];
while (true)
{
var bytesRead = fs.Read(buffer, 0, buffer.Length);
lastReadLength += bytesRead;
if (bytesRead == 0)
break;
var text = ASCIIEncoding.ASCII.GetString(buffer, 0, bytesRead);
Console.Write(text);
}
}
}
}
catch { }
Thread.Sleep(1000);
}
}
I had to use ASCIIEncoding because this code isn't smart enough to cater for the variable character lengths of UTF-8 across buffer boundaries.
Note: You can change the Thread.Sleep part to use different timings, and you can also combine it with a file watcher and a blocking pattern (Monitor.Enter/Wait/Pulse). For me the timer is enough; at most it checks the file length once a second if the file hasn't changed.
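If you do want the blocking variant, a rough sketch of pairing a FileSystemWatcher with Monitor.Wait/Pulse could look like this (the directory and file name are made up for illustration):
var gate = new object();
var fsw = new FileSystemWatcher(@"C:\logs", "app.log");   // hypothetical directory and file
fsw.EnableRaisingEvents = true;
fsw.Changed += (s, e) => { lock (gate) Monitor.Pulse(gate); };

while (true)
{
    // ... read from lastReadLength to the end of the file, as in MonitorTailOfFile ...
    lock (gate)
        Monitor.Wait(gate, TimeSpan.FromSeconds(1));   // wake on a change, or poll once a second
}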
This is my solution:
static IEnumerable<string> TailFrom(string file)
{
using (var reader = File.OpenText(file))
{
while (true)
{
string line = reader.ReadLine();
if (reader.BaseStream.Length < reader.BaseStream.Position)
reader.BaseStream.Seek(0, SeekOrigin.Begin);
if (line != null) yield return line;
else Thread.Sleep(500);
}
}
}
So, in your code you can do:
foreach (string line in TailFrom(file))
{
Console.WriteLine($"line read= {line}");
}
You could use the FileSystemWatcher class, which can send notifications for different events happening on the file system, such as a file being changed.
private void button1_Click(object sender, EventArgs e)
{
if (folderBrowserDialog.ShowDialog() == DialogResult.OK)
{
path = folderBrowserDialog.SelectedPath;
fileSystemWatcher.Path = path;
string[] str = Directory.GetFiles(path);
string line;
fs = new FileStream(str[0], FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
tr = new StreamReader(fs);
while ((line = tr.ReadLine()) != null)
{
listBox.Items.Add(line);
}
}
}
private void fileSystemWatcher_Changed(object sender, FileSystemEventArgs e)
{
string line;
line = tr.ReadLine();
listBox.Items.Add(line);
}
If you are just looking for a tool to do this, then check out the free version of BareTail.
I want to read a CD and write its content to a file, but I get an IOException (Data Error: Cyclic Redundancy Check) after a few reads/writes to my output file.
I'm trying to make an ISO creator program. The disc I'm reading is a 500 MB game CD, and the exception arises after reading about 1.9 MB of data. The CD is not broken and is perfectly usable.
I don't know if there are limitations on the chunk size or buffer size (I'm using 4096 bytes).
I'm very interested in any ideas or experience that could help fix this problem.
EDIT
I get this error with every CD, not only this one. And this one isn't broken, as I used it to install the game it contains just before I started writing the program.
I use this code to read the CD:
const int BUFFER_SIZE = 4096;
private void MakeISO()
{
_HDEV = NativeMethods.CreateFileR(DRIVE_NAME);
string targetFile = TARGET + "\\" + NAME;
try
{
_FSR = new FileStream(_HDEV, FileAccess.Read, BUFFER_SIZE);
_FSW = new FileStream(targetFile, FileMode.Create, FileAccess.Write, FileShare.None, BUFFER_SIZE);
byte[] buffer = new byte[BUFFER_SIZE];
int length;
while ((length = _FSR.Read(buffer, 0, buffer.Length)) > 0)
{
_FSW.Write(buffer, 0, length);
}
// Don't work either
//do
//{
// _FSR.Read(buffer, 0, BUFFER_SIZE);
// _FSW.Write(buffer, 0, BUFFER_SIZE);
//}
//while (_fsw.Position == _fsr.Position);
}
catch (Exception ex)
{
//
}
finally
{
CloseAll();
}
}
I've got a rare case where a file cannot be read from a UNC path immediately after it was written. Here's the workflow:
plupload sends a large file in chunks to a WebAPI method
Method writes the chunks to a UNC path (a storage server). This loops until the file is completely uploaded.
After a few other operations, the same method tries to read the file again and sometimes it cannot find the file
It only seems to happen after our servers have been idle for a while. If I repeat the upload a few times, it starts to work.
I thought it might be a network configuration issue, or something to do with the file not completely closing before being read again.
Here's part of the code that writes the file (is the filestream OK in this case?)
SaveStream(stream, new FileStream(fileName, FileMode.Append, FileAccess.Write));
Here's SaveStream definition:
private static void SaveStream(Stream stream, FileStream fileStream)
{
using (var fs = fileStream)
{
var buffer = new byte[1024];
var l = stream.Read(buffer, 0, 1024);
while (l > 0)
{
fs.Write(buffer, 0, l);
l = stream.Read(buffer, 0, 1024);
}
fs.Flush();
fs.Close();
}
}
Here's the code that reads the file:
var fileInfo = new FileInfo(fileName);
var exists = fileInfo.Exists;
It's the fileInfo.Exists that is returning false.
Thank you
These kinds of errors are mostly due to files not being closed yet.
Try passing the fileName to SaveStream and then use it as follows:
private static void SaveStream(Stream stream, string fileName)
{
using (var fs = new FileStream(fileName, FileMode.Append, FileAccess.Write))
{
var buffer = new byte[1024];
var l = stream.Read(buffer, 0, 1024);
while (l > 0)
{
fs.Write(buffer, 0, l);
l = stream.Read(buffer, 0, 1024);
}
fs.Flush();
} // end of using will close and dispose fs properly
}
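The call at the original site would then look something like:
SaveStream(stream, fileName);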
I'm writing an application in which I split a WMV file and transfer it to another location (in 'x' KB chunks). After the transfer completes, the file doesn't play; it gives a message that the format is not supported. Is there another way to do it?
Sorry, I will explain what I'm doing now.
I wrote a remoting application; I want to transfer a .wmv file from one machine to another. I want to split the .wmv, send it to the remote machine and use it there. If I try to send the complete file in one go it will take a lot of memory, which seems very bad, so I want to split it and send it. But the transferred file doesn't play; it raises an exception that the format is not supported.
The following is the code I'm using; I have only done it on the local machine itself so far (not remoting):
try
{
FileStream fswrite = new FileStream("D:\\Movie.wmv", FileMode.Create);
int pointer = 1;
int bufferlength = 12488;
int RemainingLen = 0;
int AppLen = 0;
FileStream fst = new FileStream("E:\\Movie.wmv", FileMode.Open);
int TotalLen = (int)fst.Length;
fst.Close();
while (pointer != 0)
{
byte[] svid = new byte[bufferlength];
using (FileStream fst1 = new FileStream("E:\\Movie.wmv", FileMode.Open))
{
pointer = fst1.Read(svid, AppLen, bufferlength);
fst1.Close();
}
fswrite.Write(svid, 0, pointer);
AppLen += bufferlength;
RemainingLen = TotalLen-AppLen;
if(RemainingLen < bufferlength)
{
byte[] svid1 = new byte[RemainingLen];
using (FileStream fst2 = new FileStream("E:\\Movie.wmv", FileMode.Open))
{
pointer = fst2.Read(svid1, 0, RemainingLen);
fst2.Close();
}
fswrite.Write(svid, 0, pointer);
break;
}
}
fswrite.Close();
}
catch(Exception ex)
{
MessageBox.Show(ex.Message);
}
You'll probably find Good way to send a large file over a network in C# helpful.
I'm going to make the assumption that you're splitting the file as you're sending it, and not trying to have the WMV in 3 different files on the remote machine.
When you're sending the file, what you basically do is this:
Local machine
1) Read 16k bytes (or whatever number you prefer)
2) Send those 16k bytes over the network
3) Repeat the above steps until done
Remote machine
1) Listen for a connection
2) Get 16k bytes
3) Write 16k bytes
4) Repeat until done.
This method will work, but you're kind of reinventing the wheel. I would recommend either something as simple as File.Copy (which works fine over the network) or, if that does not meet your needs, perhaps an FTP client/server solution (there are plenty of C# examples on the net that can be hosted inside your application).
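If you do decide to roll it yourself, here is a minimal sketch of that chunked send/receive over a plain TcpClient/TcpListener; the host name, port and file paths are made up for illustration, and there is no error handling or framing:
// Sender (local machine): read the source file in chunks and push each one down the socket.
static void SendFile(string path, string host, int port)
{
    using (var client = new TcpClient(host, port))
    using (var network = client.GetStream())
    using (var source = File.OpenRead(path))
    {
        var buffer = new byte[16 * 1024];
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            network.Write(buffer, 0, read);   // write only the bytes actually read
    }
}

// Receiver (remote machine): accept one connection and stream the chunks straight to disk.
static void ReceiveFile(string path, int port)
{
    var listener = new TcpListener(IPAddress.Any, port);
    listener.Start();
    using (var client = listener.AcceptTcpClient())
    using (var network = client.GetStream())
    using (var target = File.Create(path))
    {
        var buffer = new byte[16 * 1024];
        int read;
        while ((read = network.Read(buffer, 0, buffer.Length)) > 0)
            target.Write(buffer, 0, read);
    }
    listener.Stop();
}
The important detail is writing only the number of bytes each Read actually returned, so the output file ends up byte-for-byte identical to the source.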
I tried this:
private void Splitinthread()
{
int bufferlength = 2048;
int pointer = 1;
int offset = 0;
int length = 0;
byte[] buff = new byte[2048];
FileStream fstwrite = new FileStream("D:\\TEST.wmv", FileMode.Create);
FileStream fst2 = new FileStream("E:\\karthi.wmv", FileMode.Open);
int Tot_Len = (int)fst2.Length;
int Remain_Buff = 0;
//Stream fst = File.OpenRead("E:\\karth.wmv");
while (pointer != 0)
{
try
{
fst2.Read(buff, 0, bufferlength);
fstwrite.Write(buff, 0, bufferlength);
offset += bufferlength;
Remain_Buff = Tot_Len - offset;
Fileprogress.Value = CalculateProgress(offset, Tot_Len);
if (Remain_Buff < bufferlength)
{
byte[] buff1 = new byte[Remain_Buff];
pointer = fst2.Read(buff1, 0, Remain_Buff);
fstwrite.Write(buff1, 0, pointer);
Fileprogress.Value = CalculateProgress(offset, Tot_Len);
fstwrite.Close();
fst2.Close();
break;
}
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
}
}
MessageBox.Show("Completed");
}