C# reading and writing from & to same file [duplicate] - c#

This question already has answers here:
Possible Duplicate: How to both Read/Write File in C#
Closed 10 years ago.
I want to write and read text to and from the same text file. Here is the code
TextWriter notesWriter = new StreamWriter("text.txt");
TextReader notesReader = new StreamReader("text.txt");
Once the file is open for writing, it is locked, which prevents reading from it.
So an exception is thrown like:
The process cannot access the file 'text.txt' because it is being used by another process.
What is the workaround for this? Thank you.
EDIT:
Suppose I want to change the text in a textbox, save it to a file, and later read that file back to set the textbox's text.

Every time you finish writing you should Close the file, unless you need to write AND read at the same time. That is something you should not do with standard text files used to store data: it is unsafe, hard to get right, and rarely useful.

Try something like
using(TextWriter notesWriter = new StreamWriter("text.txt"))
{
//do write-things here
}
After the closing brace the StreamWriter will be disposed and you can read the file.

The workaround is not to do it. While technically this can be done, the way you want to do it (by accessing the file using stream semantics) is almost impossible to get right: even if you fix the file sharing, it would imply you're reading back the same data you just wrote, in an infinite loop.
You can use a paging based file access metaphor, which again is very unlikely what you want to do.
The most likely option is that you want to write into a different file, a (modified?) copy of the original file, and then swap the copy with the original file.
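A minimal sketch of that swap pattern (file names and the line transformation are illustrative):

```csharp
// Write the modified copy to a temp file, then swap it with the original.
// File.Replace performs the swap and keeps a backup of the original file.
using System.IO;

File.WriteAllText("text.txt", "hello");

using (var reader = new StreamReader("text.txt"))
using (var writer = new StreamWriter("text.tmp"))
{
    string line;
    while ((line = reader.ReadLine()) != null)
        writer.WriteLine(line.ToUpperInvariant()); // example modification
}

File.Replace("text.tmp", "text.txt", "text.bak");
```

Because the original file is only read while the copy is written, there is no moment where a reader and a writer contend for the same handle.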

Sure you can read and write at the same time, but you only need one reference:
Stream l_fileStream = File.Open( "text.txt", FileMode.Open, FileAccess.ReadWrite );
http://msdn.microsoft.com/en-us/library/s67691sb.aspx
Now you can read/write to the file:
StreamReader l_reader = new StreamReader( l_fileStream );
StreamWriter l_writer = new StreamWriter( l_fileStream );
Why would you want to do this? I have no idea. Seems like you'd want to finish one operation before beginning the next, unless you want to get down and dirty in the actual byte array (like an advanced paging system), in which case you may not be quite at the experience level to pull such a thing off.

You don't need to read and write at the same time, considering your edits.
Open the application
Read the file. Put the file's content in the textbox. Close the file
Save the textbox content into the file. Close the file.
As you can see, you never need to read and write at the same time if you close the file in between your uses.
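In code, the sequence above amounts to something like this (a sketch; the path and method names are illustrative):

```csharp
// Open-read-close, then open-write-close: the file is never held open
// between operations, so no sharing conflict can occur.
using System.IO;

static void SaveNotes(string path, string text) =>
    File.WriteAllText(path, text);   // opens, writes, and closes the file

static string LoadNotes(string path) =>
    File.Exists(path) ? File.ReadAllText(path) : string.Empty;

SaveNotes("text.txt", "notes from the textbox");
string restored = LoadNotes("text.txt");
```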

Related

Guidelines for designing a robust file format writer?

Suppose you want to write a .WAV file format writer like so:
using var stream = File.Create("test.wav");
using var writer = new WavWriter(stream, ... /* .WAV format parameters */);
// write the file
// writer.Dispose() does a few things:
// - writes user-added chunks
// - updates the file header (chunk size) so the file is valid
There is a conceptual problem in doing so:
the user can change the stream position and therefore screw the writing process
You may suggest the following:
the writer should own the stream, this would work if writing to a file, but not to a stream
own its own memory stream so it can write to streams too, okay but memory concerns
I guess you get the point...
To me, the only viable thing would be to document that aspect but I may have missed something, hence the question.
Question:
How can a file format writer write to a stream yet defend itself against external changes to the stream's position?
My suggestion would be to keep an internal position field in the WavWriter. Each time you do some operation you can check that this matches the position in the backing stream and throw an exception if it does not. Update this value at the end of each write operation.
Ideally you should also handle streams that do not support seeking, but it does not sound like your design would permit that anyway. It might be a good idea to check CanSeek in the constructor and throw if seeking is not supported. In general it is a good idea to validate any arguments before use.
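A sketch of that suggestion, with the WavWriter specifics stripped away (the class and method names here are illustrative, not from the question's code):

```csharp
// The writer records where it expects the stream to be and verifies
// that before every write, so an external Seek is caught immediately.
using System;
using System.IO;

class GuardedWriter
{
    private readonly Stream _stream;
    private long _expectedPosition;

    public GuardedWriter(Stream stream)
    {
        if (!stream.CanSeek)
            throw new ArgumentException("A seekable stream is required.", nameof(stream));
        _stream = stream;
        _expectedPosition = stream.Position;
    }

    public void Write(byte[] data)
    {
        if (_stream.Position != _expectedPosition)
            throw new InvalidOperationException("Stream position was moved externally.");
        _stream.Write(data, 0, data.Length);
        _expectedPosition = _stream.Position; // update after each write
    }
}
```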

How can I write to the column I want with StreamWriter? [duplicate]

I am trying to use StreamReader and StreamWriter to Open a text file (fixed width) and to modify a few specific columns of data. I have dates with the following format that are going to be converted to packed COMP-3 fields.
020100718F
020100716F
020100717F
020100718F
020100719F
I want to be able to read in the dates from a file using StreamReader, then convert them to packed fields (5 characters), and then output them using StreamWriter. However, I haven't found a way to use StreamWriter to write to a specific position, and I am beginning to wonder if it is possible.
I have the following code snip-it.
System.IO.StreamWriter writer;
this.fileName = @"C:\Test9.txt";
reader = new System.IO.StreamReader(System.IO.File.OpenRead(this.fileName));
currentLine = reader.ReadLine();
currentLine = currentLine.Substring(30, 10); //Substring Containing the Date
reader.Close();
...
// Convert currentLine to Packed Field
...
writer = new System.IO.StreamWriter(System.IO.File.Open(this.fileName, System.IO.FileMode.Open));
writer.Write(currentLine);
Currently what I have does the following:
After:
!##$%0718F
020100716F
020100717F
020100718F
020100719F
!##$% = Ascii Characters SO can't display
Any ideas? Thanks!
UPDATE
Information on Packed Fields COMP-3
Packed Fields are used by COBOL systems to reduce the number of bytes a field requires in files. Please see the following SO post for more information: Here
Here is a picture of the date "20120123" packed in COMP-3. This is my end result; I have included it because I wasn't sure whether it would affect possible answers.
My question is how do you get StreamWriter to dynamically replace data inside a file and change the lengths of rows?
I have always found it better to read the input file, filter/process the data, and write the output to a temporary file. When finished, delete the original file (or make a backup) and copy the temporary file over. This way you haven't lost half your input file if something goes wrong in the middle of processing.
You should probably be using a Stream directly (probably a FileStream). This would allow you to change position.
However, you're not going to be able to change record sizes this way, at least, not in-line. You can have one Stream reading from the original file, and another writing to a new, converted copy of the file.
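For same-length replacements, an in-place overwrite with a raw FileStream might look like this (a sketch; the offsets assume ASCII records of a known, fixed length, and the replacement bytes are placeholders):

```csharp
// Overwrite one fixed-width field in place. This only works when the
// replacement is exactly the same length as the original field.
using System.IO;
using System.Text;

File.WriteAllText("data.txt", "020100718F\n020100716F\n020100717F\n");

const int recordLength = 11;   // 10 data chars + '\n'
const int fieldOffset = 0;     // column where the field starts
const int lineIndex = 1;       // second record

using (var fs = new FileStream("data.txt", FileMode.Open, FileAccess.ReadWrite))
{
    fs.Seek((long)lineIndex * recordLength + fieldOffset, SeekOrigin.Begin);
    byte[] field = Encoding.ASCII.GetBytes("XXXXXXXXXX"); // placeholder bytes
    fs.Write(field, 0, field.Length);
}
```

Note that a 5-character packed field is shorter than the 10-character date, so this approach alone cannot shrink the records; that still requires writing a new file.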
However, I haven't found a way to use StreamWriter to write to a specific position, and I am beginning to wonder if it is possible.
You can use StreamWriter.BaseStream.Seek method
using (StreamWriter wr = new StreamWriter(File.Create(@"c:\Temp\aaa.txt")))
{
wr.Write("ABC");
wr.Flush();
wr.BaseStream.Seek(0, SeekOrigin.Begin);
wr.Write("Z");
}

Handling strings more than 2 GB

I have an application where an XLS file with lots of data entered by the user is opened and the data in it is converted to XML. I have already mapped the columns in the XLS file to XML Maps. When I try to use the ExportXml method in XMLMaps, I get a string with the proper XML representation of the XLS file. I parse this string a bit and upload it to my server.
The problem is, when my XLS file is really large, the string produced for the XML is over 2 GB and I get an OutOfMemory exception. I understand that the limit for CLR objects is 2 GB, but in my case I need to handle this scenario. Presently I just show a message asking the user to send less data.
Any ideas on how I can do this?
EDIT:
This is just a gist of the operations I need to perform on the generated XML.
Remove certain fields which are not needed for the server data.
Add something like ID numbers for each row of data.
Modify the values of certain elements.
Do validation on the data.
While the XmlReader stream is a good idea, I cannot perform these operations by that method alone. While data validation can be done by Excel itself, the other things cannot be done there.
Using XmlTextReader and XmlTextWriter and creating a custom method for each of the steps is a solution I had thought of. But to go through the gist above, it requires the XML document to be processed 4 times. This is just not efficient.
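For what it's worth, all four steps can often be combined into a single streaming pass: read one row at a time with XmlReader, materialize just that row as an XElement via XNode.ReadFrom, transform it, and write it out. A sketch, assuming the element names ("Rows", "Row", "Internal", "Value") stand in for the real schema:

```csharp
// One streaming pass that drops a field, adds an ID per row, and leaves
// room for value edits and validation, without holding the document in memory.
using System;
using System.Collections.Generic;
using System.IO;
using System.Xml;
using System.Xml.Linq;

File.WriteAllText("input.xml",
    "<Rows><Row><Internal>x</Internal><Value>1</Value></Row>" +
    "<Row><Value>2</Value></Row></Rows>");

static IEnumerable<XElement> StreamRows(string path)
{
    using var reader = XmlReader.Create(path);
    reader.MoveToContent();
    reader.Read();                       // step into the root element
    while (!reader.EOF)
    {
        if (reader.NodeType == XmlNodeType.Element && reader.Name == "Row")
            yield return (XElement)XNode.ReadFrom(reader); // one row in memory at a time
        else
            reader.Read();
    }
}

using (var writer = XmlWriter.Create("output.xml"))
{
    writer.WriteStartElement("Rows");
    int id = 0;
    foreach (var row in StreamRows("input.xml"))
    {
        row.Element("Internal")?.Remove();   // 1. remove fields the server doesn't need
        row.SetAttributeValue("id", ++id);   // 2. add an ID per row
        // 3./4. modify and validate row values here
        row.WriteTo(writer);
    }
    writer.WriteEndElement();
}
```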
If the XML is that large, then you might be able to use Export to a temporary file, rather than using ExportXML to a string - http://msdn.microsoft.com/en-us/library/microsoft.office.interop.excel.xmlmap.export.aspx
If you then need to parse/handle the XML in C#, then for handling such large XML structures, you'll probably be better off implementing a custom XMLReader (or XMLWriter) which works at the stream level. See this question for some similar advice - What is the best way to parse large XML (size of 1GB) in C#?
I guess there is no other way than using a 64-bit OS and framework if you really need to hold the whole thing in RAM, but using some other way to process the data, as suggested by Stuart, may be the better way to go...
What you need to do is to use "stream chaining", i.e. you open up an input stream which reads from your excel file and an output stream that writes to your xml file. Then your conversion class/method will take the two streams as input and read sufficient data from the input stream to be able to write to the output.
Edit: very simple minimal Example
Converting from file:
123
1244125
345345345
4566
11
to
<List>
<ListItem>123</ListItem>
<ListItem>1244125</ListItem>
...
</List>
using
void Convert(Stream fromStream, Stream toStream)
{
using(StreamReader from= new StreamReader(fromStream))
using(StreamWriter to = new StreamWriter(toStream))
{
to.WriteLine("<List>");
while(!from.EndOfStream)
{
string bulk = from.ReadLine(); //in this case, a single line is sufficient
//some code to parse the bulk or clean it up, e.g. remove '\r\n'
to.WriteLine(string.Format("<ListItem>{0}</ListItem>", bulk));
}
to.WriteLine("</List>");
}
}
Convert(File.OpenRead("source.xls"), File.OpenWrite("source.xml"));
Of course you could do this in a much more elegant, abstract manner, but this is only to show my point.

Should I build a string first and then write to file?

A program I am working on right now has to generate a file. Is it better for me to generate the file's contents as a string first and then write that string to the file, or should I just directly add the contents to the file?
Are there any advantages of one over the other?
The file will be about 0.5 - 1MB in size.
If you write to a file as-you-go, you'll have the benefit of not keeping everything in memory, if it's a big enough file and you constantly flush the stream.
However, you'll be more likely to run into problems with a partially-written file, since you're doing your IO over a period of time instead of in a single shot.
Personally, I'd build it up using a StringBuilder, and then write it all to disk in a single shot.
I think it's a better idea, in general, to create a StreamWriter and just write to it. Why keep things in memory when you don't have to? And it's a whole lot easier. For example:
using (var writer = new StreamWriter("filename"))
{
writer.WriteLine(header);
// write all your data with Write and WriteLine,
// taking advantage of composite formatting
}
If you want to build multiple lines with StringBuilder you have to write something like:
var sb = new StringBuilder();
sb.AppendLine(string.Format("{0:N0} blocks read", blocksRead));
// etc., etc.
// and finally write it to file
File.WriteAllText("filename", sb.ToString());
There are other options, of course. You could build the lines into a List&lt;string&gt; and then use File.WriteAllLines. Or you could write to a StringWriter and then write that to the file. But all of those approaches have you handling the data multiple times. Just open the StreamWriter and write.
The primary reasons I think it's a better idea in general to go directly to output:
You don't have to refactor your code when it turns out that your output data is too big to fit in memory.
The planned destination is the file anyway, so why fool with formatting it in memory before writing to the file?
The API for writing multiple lines to a text file is, in my opinion, cleaner than the API for adding lines to a StringBuilder.
I think it is better to use a string or StringBuilder to store your data; then you can write it to a file using the File.Write functions.

what is bad about this TextWriter method?

(The code was posted as a screenshot; the image link is now dead.)
The tf.txt file has 0 bytes, and when calling this method several times in a loop I get:
the process cannot access " " because it is being used by another process
Yes, you're not closing the TextWriter. Thus the file handle remains open, so you can't create another one writing to the same file.
Use a using statement:
// Consider using File.CreateText instead, btw
using (TextWriter writer = new StreamWriter(...))
{
...
}
I'm surprised that your file is empty, admittedly... did it throw an exception the first time you called it, e.g. in GetTerms()? That would explain it. You might need a using statement for IndexReader as well, by the way - we can't really tell.
Why is tw.Close commented out? This might be the cause of "is being used by another process", since the file would be held open until closed.
