I want to delete lines in a text file that contain a specific string such as "21309#003", where item1 is the file name, but I get a runtime exception saying that the file (item1) is already in use by another process. How can I solve this? I am new to .NET/C#.
private void button1_Click(object sender, EventArgs e)
{
    var selectedItems = listBox1.SelectedItems.Cast<String>().ToList();
    foreach (var item in selectedItems)
    {
        listBox1.Items.Remove(item);
    }
    foreach (var item1 in selectedItems)
    {
        listBox1.Items.Remove(item1);
        string line = null;
        //string line_to_delete = "the line i want to delete";
        using (StreamReader reader = new StreamReader(item1))
        // item1 = "C:\\IMP2711\\textpresent.txt"
        {
            using (StreamWriter writer = new StreamWriter(item1))
            {
                while ((line = reader.ReadLine()) != null)
                {
                    //if (String.Compare(line, @"*21349#003*") == 0)
                    //if (!line.Contains("21349#003"))
                    if (!line.StartsWith("21349#003"))
                    {
                        writer.WriteLine(line);
                    }
                }
            }
        }
    }
}
You are reading and writing to the same file at the same time.
If the file is not too big, you can read the lines into memory and then write the lines you want to keep back to the file. We can even simplify your code a little bit:
File.WriteAllLines(item1,
File.ReadLines(item1).Where(l => !l.StartsWith("21349#003")).ToList());
Another option, if the file is very large, is to write to a temporary file, delete the original, and then rename the temporary file:
var tmp = Path.GetTempFileName();
File.WriteAllLines(tmp, File.ReadLines(item1).Where(l => !l.StartsWith("21349#003")));
File.Delete(item1);
File.Move(tmp, item1);
If your file is small, first read it into memory and then write back to it. You currently have two streams on the same file; a file can be shared between multiple streams, but you cannot modify a file while it is open in another stream. If your file is huge and cannot be held in memory, create a temp file, write to it while reading the original, and when the reading is finished replace the original file with the temp file and remove the temp file.
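A minimal sketch of the small-file variant, reusing item1 and the filter string from the code above (File.ReadAllLines reads eagerly, so the file is already closed before it is rewritten; requires using System.IO and using System.Linq):
// read the whole file into memory; File.ReadAllLines closes the file when it returns
var keptLines = File.ReadAllLines(item1)
                    .Where(l => !l.StartsWith("21349#003"))
                    .ToList();
// now it is safe to overwrite the same file, since no other stream holds it open
File.WriteAllLines(item1, keptLines);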
There's some process that's locking the file c:\imp2711\textpresent.txt. You have to find and kill it.
To find it out, please refer to this question: https://superuser.com/questions/117902/find-out-which-process-is-locking-a-file-or-folder-in-windows
I work with large XML files (~1,000,000 lines, 34 MB) that are stored in a ZIP archive. The XML file is used at runtime to store and load app settings and measurements. The file gets loaded with this function:
public static void LoadFile(string path, string name)
{
    using (var file = File.OpenRead(path))
    {
        using (var zip = new ZipArchive(file, ZipArchiveMode.Read))
        {
            var foundConfigurationFile = zip.Entries.First(x => x.FullName == ConfigurationFileName);
            using (var stream = new StreamReader(foundConfigurationFile.Open()))
            {
                var xmlSerializer = new XmlSerializer(typeof(ProjectConfiguration));
                var newObject = xmlSerializer.Deserialize(stream);
                CurrentConfiguration = null;
                CurrentConfiguration = newObject as ProjectConfiguration;
                AddRecentFiles(name, path);
            }
        }
    }
}
This works for most of the time.
However, some files don't get read to the end and I get an error that the file contains invalid XML. I used
foundConfigurationFile.ExtractToFile();
and found that the extracted file stops at line ~800,000. But this only happens inside this code. When I open the file in an editor, everything is there.
It looks like the zip doesn't get loaded correctly, or for that matter, completely.
Am I running into some limitation? Or is there an error in my code that I don't see?
The file is saved via:
using (var file = File.OpenWrite(Path.Combine(dirInfo.ToString(), fileName.ToString()) + ".pwe"))
{
    var zip = new ZipArchive(file, ZipArchiveMode.Create);
    var configurationEntry = zip.CreateEntry(ConfigurationFileName, CompressionLevel.Optimal);
    var stream = configurationEntry.Open();
    var xmlSerializer = new XmlSerializer(typeof(ProjectConfiguration));
    xmlSerializer.Serialize(stream, CurrentConfiguration);
    stream.Close();
    zip.Dispose();
}
Update:
The problem was the File.OpenWrite() method.
If you try to overwrite a file with this method and the new content is shorter than the old content, the result is a mix of the old file and the new file.
File.OpenWrite() doesn't truncate the old file first, as stated in the docs.
To do it correctly, it was necessary to use the File.Create() method, because that method truncates the old file first.
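For illustration, here is a minimal sketch of the corrected save, using the same variables as the save code above (File.Create truncates an existing file, unlike File.OpenWrite):
using (var file = File.Create(Path.Combine(dirInfo.ToString(), fileName.ToString()) + ".pwe"))
using (var zip = new ZipArchive(file, ZipArchiveMode.Create))
{
    var configurationEntry = zip.CreateEntry(ConfigurationFileName, CompressionLevel.Optimal);
    using (var stream = configurationEntry.Open())
    {
        var xmlSerializer = new XmlSerializer(typeof(ProjectConfiguration));
        xmlSerializer.Serialize(stream, CurrentConfiguration);
    }
    // disposing the ZipArchive at the end of the outer using writes the central directory and closes the file
}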
I'm trying to open an XML file inside a zip archive (without extracting it to a physical directory) as an in-memory stream, make changes to it, and save it. But the archived XML file doesn't get overwritten; instead the archive entry ends up with two copies of the XML data: the original copy and the changed/modified/edited copy.
Here is my code. Please help me overwrite the existing XML data with the changes made, rather than having two copies of the XML data in the same archive entry.
static void Main(string[] args)
{
    string rootFolder = @"C:\Temp\MvcApplication5\MvcApplication5\Package1";
    string archiveName = "MvcApplication5.zip";
    string folderFullPath = Path.GetFullPath(rootFolder);
    string archivePath = Path.Combine(folderFullPath, archiveName);
    string fileName = "archive.xml";
    using (ZipArchive zip = ZipFile.Open(archivePath, ZipArchiveMode.Update))
    {
        var archiveFile = zip.GetEntry(fileName);
        if (archiveFile == null)
        {
            throw new ArgumentException(fileName, "not found in Zip");
        }
        if (archiveFile != null)
        {
            using (Stream stream = archiveFile.Open())
            {
                XDocument doc = XDocument.Load(stream);
                IEnumerable<XElement> xElemAgent = doc.Descendants("application");
                foreach (var node in xElemAgent)
                {
                    if (node.Attribute("applicationPool").Value != null)
                    {
                        node.Attribute("applicationPool").Value = "MyPool";
                    }
                }
                doc.Save(stream);
            }
            Console.WriteLine("Document saved");
        }
    }
}
You are first reading the XML data from the stream and then writing to the same stream, which is pointing to the end of the file. To illustrate, let's say the old file contains ABCD and we want to replace this with 123.
The current approach would result in ABCD123, since the stream is pointing to the last char in ABCD.
If you reset the stream to position 0 (e.g. stream.Seek(0, SeekOrigin.Begin)) before writing the changed data, the file would contain 123D, because writing alone doesn't reduce the file length.
The solution is to delete your old ZipArchiveEntry and create a new one.
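A rough sketch of that delete-and-recreate approach, reusing the names from the question (an untested outline, not a drop-in replacement):
using (ZipArchive zip = ZipFile.Open(archivePath, ZipArchiveMode.Update))
{
    var oldEntry = zip.GetEntry(fileName);

    XDocument doc;
    using (Stream readStream = oldEntry.Open())
    {
        doc = XDocument.Load(readStream);
    }
    // ... modify the document here ...

    oldEntry.Delete();                        // removing an entry is only allowed in Update mode
    var newEntry = zip.CreateEntry(fileName); // add a fresh entry under the same name
    using (Stream writeStream = newEntry.Open())
    {
        doc.Save(writeStream);
    }
}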
I came across this same issue just now, and I fixed it by adding this first line:
stream.SetLength(0);
xmlDoc.Save(stream);
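In the question's code this fix sits right before the save, roughly like this (a sketch keeping the question's variable names):
using (Stream stream = archiveFile.Open())
{
    XDocument doc = XDocument.Load(stream);
    // ... modify doc here ...
    stream.SetLength(0); // discard the old entry contents so nothing of the old XML remains
    doc.Save(stream);    // the rewritten XML now starts at position 0
}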
edit: I see you came across the same solution as you mentioned in the comments of the previous answer. You can add an answer to your own question. It would have helped someone like me :]
I'm using the two functions below to read and write huge files (writing to multiple files). I want to keep the file operations inside the functions because the lines may be read from or written to other sources.
Update:
C# doesn't really have coroutines. Is this a good use case for Reactive Extensions?
foreach (var line in ReadFrom("filename"))
{
    try
    {
        .... // Some actions based on the line
        var l = .....
        WriteTo("generatedFile1", l);
    }
    catch (Exception e)
    {
        var l = ..... // get some data from line, e and other objects etc.
        WriteTo("generatedFile2", l);
    }
}
The following function opens the file once, keeps it open until all the lines are read, and then closes it and releases the resource.
private static IEnumerable<string> ReadFrom(string file)
{
    string line;
    using (var reader = File.OpenText(file))
    {
        while ((line = reader.ReadLine()) != null)
            yield return line;
    }
}
However, the following function, which writes lines instead of reading them, opens and closes the file for each line it writes. Is it possible to implement it so that it opens the file only once and continues writing until the end of the input is reached?
private static void WriteTo(string file, string line)
{
    if (!File.Exists(file)) // Remove and recreate the file if existing
        using (var tw = File.CreateText(file))
        {
            tw.WriteLine(line);
        }
    else
        using (var tw = new StreamWriter(file, true))
        {
            tw.WriteLine(line);
        }
}
Just use File.WriteAllLines. It will write all of the lines in a sequence to a file, and it won't open/close the file for each line.
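For example (a sketch; the hypothetical TransformLine stands in for the per-line work in the question's loop, and Select requires using System.Linq):
var outputLines = ReadFrom("filename").Select(TransformLine); // lazily transforms each line as it is read
File.WriteAllLines("generatedFile1", outputLines);            // opens the file once and writes every line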
You can remove the entire second method, and replace the call with var writer = new StreamWriter(file, true), as that constructor creates the file if it does not exist.
You can then use writer.WriteLine() until you're done writing, and Dispose() it afterwards.
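A minimal sketch of that suggestion, reusing the question's loop and file names (the writer names and the per-line work are placeholders):
using (var goodWriter = new StreamWriter("generatedFile1", true)) // opened once; created if it doesn't exist
using (var errorWriter = new StreamWriter("generatedFile2", true))
{
    foreach (var line in ReadFrom("filename"))
    {
        try
        {
            var l = line; // ... some actions based on the line ...
            goodWriter.WriteLine(l);
        }
        catch (Exception e)
        {
            errorWriter.WriteLine(line + ": " + e.Message); // ... or whatever data you need ...
        }
    }
} // disposing flushes and closes both files exactly once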
I'm trying to write a .csv file in C# from a List<string> with roughly 350 lines (13 columns).
I write to the file with a loop, but only part of my list ends up in the file (206 and a half lines).
This is my code:
StreamWriter file = new StreamWriter(@"C:\test.csv", true);
foreach (string s in MyListString)
{
    Console.WriteLine(s); // Display all the data
    file.WriteLine(s);    // Write only a part of it
}
Why isn't my file filled properly? Is there any limit I should consider?
Looks like you might need to Flush or Close the writer. Also, most of the time you'd want to wrap the writer in a using statement.
Fortunately, disposing automatically closes the writer and flushes the final batch of items to be written, so it solves your issue as well as releasing any unmanaged resources you are now finished with.
Try the following:
using (StreamWriter file = new StreamWriter(@"C:\test.csv", true))
{
    foreach (string s in MyListString)
    {
        Console.WriteLine(s); // Display all the data
        file.WriteLine(s);    // Write only a part of it
    }
}
You have to close your stream:
using (StreamWriter file = new StreamWriter(@"C:\test.csv", true))
{
    foreach (string s in MyListString)
    {
        Console.WriteLine(s); // Display all the data
        file.WriteLine(s);    // Write only a part of it
    }
}
I am able to do read/write/append operations on a text file stored in isolated storage in a WP7 application.
My scenario is that I am storing space-separated values in a text file inside isolated storage.
So if I have to find a particular line that starts with some key, how can I overwrite the value for that key without affecting the lines before and after it?
Example:
Key       Value    SomeOtherValue
status    read     good
status1   unread   bad
status2   null     cantsay
So if I have to change the whole second line based on some condition, keeping the same key, to:
status1 read good
How can I achieve this?
There are a number of ways you could do this, and the method you choose should be best suited to the size and complexity of the data file.
One option to get you started is to use the string.Replace() method. This is crude, but if your file is only small then there is nothing wrong with it.
class Program
{
    static void Main(string[] args)
    {
        StringBuilder sb = new StringBuilder();
        sb.AppendLine("status read good");
        sb.AppendLine("status1 unread bad");
        sb.AppendLine("status2 null cantsay");
        string input = sb.ToString();

        var startPos = input.IndexOf("status1");
        var endPos = input.IndexOf(Environment.NewLine, startPos);
        var modifiedInput = input.Replace(input.Substring(startPos, endPos - startPos), "status1 read good");

        Console.WriteLine(modifiedInput);
        Console.ReadKey();
    }
}
If you store this information in text files then there won't be a way around replacing whole files. The following code does exactly this and might even be what you are doing right now.
// replace a given line in a given text file with a given replacement line
private void ReplaceLine(string fileName, int lineNrToBeReplaced, string newLine)
{
    using (IsolatedStorageFile isf = IsolatedStorageFile.GetUserStoreForApplication())
    {
        // the memory writer will hold the read and modified lines
        using (StreamWriter memWriter = new StreamWriter(new MemoryStream()))
        {
            // this is for reading lines from the source file
            using (StreamReader fileReader = new StreamReader(new IsolatedStorageFileStream(fileName, System.IO.FileMode.Open, isf)))
            {
                int lineCount = 0;
                // iterate file and read lines
                while (!fileReader.EndOfStream)
                {
                    string line = fileReader.ReadLine();
                    // check if this is the line which should be replaced; check is done by line
                    // number but could also be based on content
                    if (lineCount++ != lineNrToBeReplaced)
                    {
                        // just copy line from file
                        memWriter.WriteLine(line);
                    }
                    else
                    {
                        // replace line from file
                        memWriter.WriteLine(newLine);
                    }
                }
            }
            memWriter.Flush();
            memWriter.BaseStream.Position = 0;
            // re-create file and save all lines from memory to this file
            using (IsolatedStorageFileStream fileStream = new IsolatedStorageFileStream(fileName, System.IO.FileMode.Create, isf))
            {
                memWriter.BaseStream.CopyTo(fileStream);
            }
        }
    }
}

private void button1_Click(object sender, RoutedEventArgs e)
{
    ReplaceLine("test.txt", 1, "status1 read good");
}
And I agree with slugster: using a SQL CE database might be a solution with better performance.