TextWriter not writing all files - C#

I've written a service with a separate thread that reads roughly 400 records from a database and serializes them into XML files. It runs fine, there are no errors, and it reports that all files have been exported correctly, yet only a handful of XML files appear afterwards, and it's always a different number each time. I've checked whether a particular record causes the problem, but they all read out fine, and seem to write fine, but don't...
After experimenting, adding a 250ms delay between each write makes them all export properly, so I assume it must have something to do with writing so many files in such quick succession. But I have no idea why; I would have thought some kind of error would be reported if they didn't write properly, yet there's nothing.
Here is the code for anyone who wants to try it:
static void Main(string[] args)
{
ExportTestData();
}
public static void ExportTestData()
{
List<TestObject> testObjs = GetData();
foreach (TestObject obj in testObjs)
{
ExportObj(obj);
//Thread.Sleep(10);
}
}
public static List<TestObject> GetData()
{
List<TestObject> result = new List<TestObject>();
for (int i = 0; i < 500; i++)
{
result.Add(new TestObject()
{
Date = DateTime.Now.AddDays(-1),
AnotherDate = DateTime.Now.AddDays(-2),
AnotherAnotherDate = DateTime.Now,
DoubleOne = 1.0,
DoubleTwo = 2.0,
DoubleThree = 3.0,
Number = 345,
SomeCode = "blah",
SomeId = "wobble wobble"
});
}
return result;
}
public static void ExportObj(TestObject obj)
{
try
{
string path = Path.Combine(@"C:\temp\exports", String.Format("{0}-{1}{2}", DateTime.Now.ToString("yyyyMMdd"), String.Format("{0:HHmmssfff}", DateTime.Now), ".xml"));
SerializeTo(obj, path);
}
catch (Exception ex)
{
}
}
public static bool SerializeTo<T>(T obj, string path)
{
XmlSerializer xs = new XmlSerializer(obj.GetType());
using (TextWriter writer = new StreamWriter(path, false))
{
xs.Serialize(writer, obj);
}
return true;
}
Try commenting/uncommenting the Thread.Sleep(10) to see the problem.
Does anybody have any idea why it does this, and can suggest how I can avoid the problem?
Thanks
EDIT: Solved. The time-based filename wasn't unique enough and was overwriting previously written files. Should've spotted it earlier; thanks for your help.

Perhaps try putting the writer in a using block for immediate disposal? Something like
XmlSerializer xs = new XmlSerializer(obj.GetType());
using(TextWriter writer = new StreamWriter(path, false))
{
xs.Serialize(writer, obj);
}

OK, I've found the problem: I was using a time-based filename that I thought would be unique enough for each file. It turns out that in a loop that tight they come out with the same filenames and overwrite each other.
If I change it to use actually unique filenames, it works! Thanks for your help.
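For anyone hitting the same thing, here is a sketch of a collision-proof naming scheme (the helper name below is illustrative, not from the original code). A timestamp with millisecond precision can repeat inside a tight loop; a GUID cannot:

```csharp
using System;
using System.IO;

static class UniqueNames
{
    // Keep the human-readable timestamp, but append a GUID so two files
    // created in the same millisecond still get distinct names.
    public static string MakeExportPath(string folder)
    {
        string name = String.Format("{0:yyyyMMdd-HHmmssfff}-{1:N}.xml",
            DateTime.Now, Guid.NewGuid());
        return Path.Combine(folder, name);
    }
}
```

An incrementing counter appended in the loop works just as well if the names need to stay short.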

Dispose the writer
public static bool SerializeTo<T>(T obj, string path)
{
XmlSerializer xs = new XmlSerializer(obj.GetType());
using(TextWriter writer = new StreamWriter(path, false)) {
xs.Serialize(writer, obj);
writer.Close();
}
return true;
}

If you're not getting any exceptions, then the using statements proposed by other answers won't help - although you should change to use them anyway. At that point, you don't need the close call any more:
XmlSerializer xs = new XmlSerializer(obj.GetType());
using(TextWriter writer = new StreamWriter(path, false))
{
xs.Serialize(writer, obj);
}
I don't think the problem lies in this code, however. I suspect it's something like the "capturing a loop variable in a lambda expression" problem which crops up so often. If you can come up with a short but complete program which demonstrates the problem, it will be a lot easier to diagnose.
I suggest you create a simple console application which tries to create (say) 5000 files serializing some simple object. See if you can get that to fail in the same way.

Multi-threading may be causing the problem; the fact that a 250ms delay fixes it is evidence of that.
Do you have multiple threads doing that?


I'm trying to save and load choices and it's giving me the same error over and over.

I keep getting:
The process cannot access the file because it is being used by another process
Can someone tell me what's wrong?
public void SaveCheckedChoices()
{
using(StreamWriter writer = new StreamWriter(dataFile))
{
try
{
// Autorun
writer.WriteLine(sublimeAutorun);
writer.WriteLine(sublimePackagesAutorun);
writer.WriteLine(sharpDevelopAutorun);
writer.WriteLine(eclipseAutorun);
writer.WriteLine(outlookAutorun);
writer.WriteLine(youtubeAutorun);
writer.WriteLine(githubAutorun);
writer.WriteLine(trelloAutorun);
File.SetAttributes(dataFile, FileAttributes.Hidden);
}
finally
{
writer.Close();
}
}
}
public void LoadCheckedChoices()
{
StreamReader sr = new StreamReader(dataFile);
// Autorun
sublime.Checked = Convert.ToBoolean(sr.ReadLine());
sublimePackages.Checked = Convert.ToBoolean(sr.ReadLine());
sharpDevelop.Checked = Convert.ToBoolean(sr.ReadLine());
eclipse.Checked = Convert.ToBoolean(sr.ReadLine());
outlook.Checked = Convert.ToBoolean(sr.ReadLine());
youtube.Checked = Convert.ToBoolean(sr.ReadLine());
github.Checked = Convert.ToBoolean(sr.ReadLine());
trello.Checked = Convert.ToBoolean(sr.ReadLine());
sr.Close();
}
I made sure the file was created and that the only thing running was SharpDevelop, and it still gives me the error. Could someone please tell me what the problem is?
File.SetAttributes(dataFile, FileAttributes.Hidden);
Must be moved to after the writer.Close();. I believe it needs exclusive access to the file.
Also you can use File.ReadAllLines in the second function.
And the most important part of my answer. Never, ever, ever, ever, I must insist, NEVER hide a configuration file; EVER. It will come back to haunt you.
EDIT:
For proper serialization, read up on XML or JSON serialization, or the Newtonsoft.Json documentation.
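A sketch combining the two suggestions above (the method and variable names are illustrative): hide the file only after the writer has been disposed, and read it back with File.ReadAllLines. Note that overwriting a file that is already hidden throws UnauthorizedAccessException, so the attribute is cleared before writing; that is one more reason the warning above about hiding configuration files is worth heeding:

```csharp
using System;
using System.IO;

static class CheckedChoices
{
    public static void Save(string dataFile, bool[] choices)
    {
        // Overwriting a file that is already hidden throws
        // UnauthorizedAccessException, so clear the attribute first.
        if (File.Exists(dataFile))
            File.SetAttributes(dataFile, FileAttributes.Normal);

        using (StreamWriter writer = new StreamWriter(dataFile))
        {
            foreach (bool choice in choices)
                writer.WriteLine(choice);
        } // writer is disposed (and the file closed) here

        // Only now can SetAttributes get the access it needs.
        File.SetAttributes(dataFile, FileAttributes.Hidden);
    }

    public static bool[] Load(string dataFile)
    {
        // One call instead of eight ReadLine()s, and nothing left open.
        return Array.ConvertAll(File.ReadAllLines(dataFile), Convert.ToBoolean);
    }
}
```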

StreamWriter skip items to serialize in foreach loop

I'm having trouble with StreamWriter when serializing a large number of objects in a foreach loop. Here's my code:
public bool Export(ItemToSerialize it){
try{
using(StreamWriter sw = new StreamWriter(Path.Combine(MySettings.ExportPath, randomFileName + ".xml"))){
XmlSerializer ser = new XmlSerializer(typeof(ItemToSerialize));
ser.Serialize(sw,it);
}
return true;
}
catch{throw;}
}
public bool ExportAll(){
List<ItemToSerialize> lst = RetrieveListToSerialize();
foreach(ItemToSerialize it in lst){
Export(it);
}
return true;
}
When I have a lot of data to export, it skips most of it. At first I thought I had to flush the writer, but flushing/closing when the export is done doesn't change anything.
Surprisingly, when I add a sleep (System.Threading.Thread.Sleep(1000)), it works. More surprisingly, when I decrease the sleep to 500ms, it goes back to skipping some of them.
I suspect it's writing faster than the writer can open/close or simply write. However, I expect the Export function not to return until the file is completely written. Is there something like a 'background' task when writing with a StreamWriter?
Because with the code provided, I really don't understand this behaviour.
Thanks !
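The symptom here is identical to the first question on this page, which turned out to be non-unique time-based filenames silently overwriting each other. The original doesn't show how randomFileName is generated, so this is only a hedged guess; a quick way to check is to count how many distinct clock-derived names a tight loop actually produces:

```csharp
using System;
using System.Collections.Generic;

static class FileNameCheck
{
    // Returns how many of n clock-derived names were duplicates.
    public static int CountCollisions(int n)
    {
        var names = new HashSet<string>();
        int collisions = 0;
        for (int i = 0; i < n; i++)
        {
            string name = DateTime.Now.ToString("yyyyMMddHHmmssfff") + ".xml";
            if (!names.Add(name))
                collisions++; // same millisecond => same name => silent overwrite
        }
        return collisions;
    }
}
```

If this returns anything above zero, the "skipped" files were simply overwritten, and the sleep only helped by pushing each iteration into a new millisecond.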

How to use SharpSVN to modify file xml and commit modified file

I am a SharpSVN newbie.
We are currently busy with rebranding, and this entails updating all our reports with new colours etc. There are too many reports to do manually, so I am trying to find a way to find and replace the colours/fonts etc in one go.
Our reports are serialized and stored in a database, which is easy to replace, but we also want to apply the changes in the .rdl reports in our source control, which is Subversion.
My question is the following:
I know you can write files to a stream with SharpSVN, which I have done, now I would like to push the updated xml back into Subversion as the latest version.
Is this at all possible? And if so, how would I go about doing it? I have googled a lot, but haven't been able to find any definitive answer.
My code so far (keep in mind this is a once-off thing, so I'm not too concerned about clean code etc.):
private void ReplaceFiles()
{
SvnCommitArgs args = new SvnCommitArgs();
SvnCommitResult result;
args.LogMessage = "Rebranding - Replace fonts, colours, and font sizes";
using (SvnClient client = new SvnClient())
{
client.Authentication.DefaultCredentials = new NetworkCredential("mkassa", "Welcome1");
client.CheckOut(SvnUriTarget.FromString(txtSubversionDirectory.Text), txtCheckoutDirectory.Text);
client.Update(txtCheckoutDirectory.Text);
SvnUpdateResult upResult;
client.Update(txtCheckoutDirectory.Text, out upResult);
ProcessDirectory(txtCheckoutDirectory.Text, args, client);
}
MessageBox.Show("Done");
}
// Process all files in the directory passed in, recurse on any directories
// that are found, and process the files they contain.
public void ProcessDirectory(string targetDirectory, SvnCommitArgs args, SvnClient client)
{
var ext = new List<string> { ".rdl" };
// Process the list of files found in the directory.
IEnumerable<string> fileEntries = Directory.EnumerateFiles(targetDirectory, "*.*", SearchOption.AllDirectories)
.Where(s => ext.Any(e=> s.EndsWith(e)));
foreach (string fileName in fileEntries)
ProcessFile(fileName, args, client);
}
private void ProcessFile(string fileName, SvnCommitArgs args, SvnClient client)
{
using (MemoryStream stream = new MemoryStream())
{
SvnCommitResult result;
if (client.Write(SvnTarget.FromString(fileName), stream))
{
stream.Position = 0;
using (var reader = new StreamReader(stream))
{
string contents = reader.ReadToEnd();
DoReplacement(contents);
client.Commit(txtCheckoutDirectory.Text, args, out result);
//if (result != null)
// MessageBox.Show(result.PostCommitError);
}
}
}
}
Thank you to anyone who can provide some insight on this!
You don't want to perform a merge on the file, as you would only use that to merge the changes from one location into another location.
If you can't just checkout your entire tree and replace+commit on that, you might be able to use something based on:
string tmpDir = @"C:\tmp\mytmp";
using(SvnClient svn = new SvnClient())
{
List<Uri> toProcess = new List<Uri>();
svn.List(new Uri("http://my-repos/trunk"), new SvnListArgs() { Depth = SvnDepth.Infinity },
delegate(object sender, SvnListEventArgs e)
{
if (e.Path.EndsWith(".rdl", StringComparison.OrdinalIgnoreCase))
toProcess.Add(e.Uri);
});
foreach(Uri i in toProcess)
{
Console.WriteLine("Processing {0}", i);
Directory.Delete(tmpDir, true);
// Create a sparse checkout with just one file (see svnbook.org)
string name = SvnTools.GetFileName(i);
string fileName = Path.Combine(tmpDir, name);
svn.CheckOut(new Uri(i, "./"), tmpDir, new SvnCheckOutArgs { Depth = SvnDepth.Empty });
svn.Update(fileName);
ProcessFile(fileName); // Read file and save in same location
// Note that the following commit is a no-op if the file wasn't
// changed, so you don't have to check for that yourself
svn.Commit(fileName, new SvnCommitArgs { LogMessage="Processed" });
}
}
Once you've updated trunk, I would recommend merging that change to your maintenance branches... and only fixing them afterwards if necessary. Otherwise further merges will be harder to perform than necessary.
I managed to get this done. Posting the answer for future reference.
Basically all I had to do was create a new .rdl file with the modified XML, and replace the checked out file with the new one before committing.
string contents = reader.ReadToEnd();
contents = DoReplacement(contents);
// Create an XML document
XmlDocument doc = new XmlDocument();
string xmlData = contents;
doc.Load(new StringReader(xmlData));
//Save XML document to file
doc.Save(fileName);
client.Commit(txtCheckoutDirectory.Text, args, out result);
Hopefully this will help anyone needing to do the same.

Does XML deserialize lock the file from reading?

I have an XML file which is used by multiple processes for reading. Here is the code snippet used for deserializing the XML. I want to make sure the code below does not take a read lock on the file.
public Address TestReadLock(string myXmlFile)
{
using (StreamReader sr = new StreamReader(myXmlFile))
{
XmlReaderSettings xrs = new XmlReaderSettings();
xrs.ValidationType = ValidationType.None;
xrs.XmlResolver = null;
using (XmlReader reader = XmlReader.Create(sr, xrs))
{
return (Address)xmlSerializer.Deserialize(reader);
}
}
}
I tried testing this by building the function above into a DLL and loading the file through PowerShell and VS in a loop at the same time; it worked fine.
public void Main()
{
for (int i = 0; i < 1000; i++)
{
Address myaddress = TestReadLock(@"C:\MyDetails.xml");
}
}
Based on my understanding, the above code should read-lock the file, but testing shows it doesn't.
Is it possible that my test is wrong, or is my understanding incorrect?
new StreamReader(string) uses FileAccess.Read and FileShare.Read - it will not prevent other readers. If you want different control: use FileStream directly to control the access / sharing.
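Here is a sketch of the explicit form (the helper name is mine, not from the answer). The FileShare value is the knob: ReadWrite also tolerates concurrent writers, while FileShare.None would take the exclusive lock the question was worried about:

```csharp
using System.IO;

static class SharedOpen
{
    // Like new StreamReader(path), but with the access and sharing
    // flags spelled out so they can be changed deliberately.
    public static StreamReader OpenForSharedRead(string path)
    {
        var fs = new FileStream(path, FileMode.Open,
                                FileAccess.Read,
                                FileShare.ReadWrite); // other readers AND writers allowed
        return new StreamReader(fs);
    }
}
```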

Clearing contents of memory mapped file in C#

I am using a MemoryMappedFile for communication between two programs. Program "A" creates the MMF and reads its contents on a timer. Program "B" writes XML data to the MMF on a timer. I have the memory map working, but I run into an issue where, when the previous iteration of the XML data is longer than the current one, old data gets carried over to the next round.
so for simplicity lets say program B writes
aaaa
Program A will read correctly,
Then the next write from program B is:
b
Program A reads
baaa
It seems like there should be some simple way to flush the contents of the memory mapped file but I can't seem to figure it out. It's very possible that I'm totally wrong in the way I'm going about this.
Here's what I'm currently doing.
Program A:
using (MemoryMappedFile mmf = MemoryMappedFile.OpenExisting("testmap",MemoryMappedFileRights.ReadWrite))
{
Mutex mutex = Mutex.OpenExisting("testmapmutex");
mutex.WaitOne();
string outputtext;
using (MemoryMappedViewStream stream = mmf.CreateViewStream(0,0))
{
XmlSerializer deserializer = new XmlSerializer(typeof(MyObject));
TextReader textReader = new StreamReader(stream);
outputtext = textReader.ReadToEnd();
textReader.Close();
}
mutex.ReleaseMutex();
return outputtext; //ends up in a textbox for debugging
}
Program B
using (MemoryMappedFile mmf = MemoryMappedFile.OpenExisting("testmap", MemoryMappedFileRights.ReadWrite))
{
Mutex mutex = Mutex.OpenExisting("testmapmutex");
mutex.WaitOne();
using (MemoryMappedViewStream stream = mmf.CreateViewStream(0, 0))
{
XmlSerializer serializer = new XmlSerializer(typeof(MyObject));
TextWriter textWriter = new StreamWriter(stream);
serializer.Serialize(textWriter, myObjectToExport);
textWriter.Flush();
}
mutex.ReleaseMutex();
}
Assuming length is reasonably small, you could really clear it out
textWriter.BaseStream.Seek(0, System.IO.SeekOrigin.Begin);
textWriter.BaseStream.Write(new byte[length], 0, length);
textWriter.BaseStream.Seek(0, System.IO.SeekOrigin.Begin);
EDIT: I think I misunderstood the OP's question. The problem he was having was not with clearing the contents of the MMF, but with stream manipulation. This should fix the problem:
textWriter.BaseStream.Seek(0, System.IO.SeekOrigin.Begin);
textWriter.Write("");
textWriter.Flush();
That being said, you might want to do both.
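Combining the two snippets above into one helper (a sketch; it assumes the map was created with a fixed capacity, and the names are mine): zero the whole region first, rewind, then write the new payload.

```csharp
using System.IO;
using System.IO.MemoryMappedFiles;

static class MmfMessenger
{
    public static void Write(MemoryMappedFile mmf, long capacity, string text)
    {
        using (MemoryMappedViewStream stream = mmf.CreateViewStream(0, capacity))
        {
            // Zero the whole region so a short message can't leave
            // stale bytes behind from a previous, longer one.
            stream.Write(new byte[capacity], 0, (int)capacity);
            stream.Seek(0, SeekOrigin.Begin);

            var writer = new StreamWriter(stream);
            writer.Write(text);
            writer.Flush();
        }
    }

    public static string Read(MemoryMappedFile mmf, long capacity)
    {
        using (MemoryMappedViewStream stream = mmf.CreateViewStream(0, capacity))
        {
            // Trailing NULs are the zeroed remainder of the region.
            return new StreamReader(stream).ReadToEnd().TrimEnd('\0');
        }
    }
}
```

The mutex from the original code is omitted here for brevity; in the real two-process setup it still belongs around each Write/Read call.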
I haven't really worked with memory-mapped streams much, but this question seemed interesting, so I took a crack at it. I wrote a really basic Windows example with two buttons (read/write) and a single text box. I didn't pass "0, 0" to the CreateViewStream calls, I created the file with a fixed length using a call to CreateOrOpen, and everything worked well. The following are the key pieces of code:
WRITE The File
// create the file if it doesn't exist
if (sharedFile == null) sharedFile = MemoryMappedFile.CreateOrOpen("testmap", 1000, MemoryMappedFileAccess.ReadWrite);
// process safe handling
Mutex mutex = new Mutex(false, "testmapmutex");
if (mutex.WaitOne()) {
try {
using (MemoryMappedViewStream stream = sharedFile.CreateViewStream()) {
var writer = new StreamWriter(stream);
writer.WriteLine(txtResult.Text);
writer.Flush();
}
}
finally { mutex.ReleaseMutex(); }
}
READ The File
// create the file if it doesn't exist
if (sharedFile == null) sharedFile = MemoryMappedFile.CreateOrOpen("testmap", 1000, MemoryMappedFileAccess.ReadWrite);
// process safe handling
Mutex mutex = new Mutex(false, "testmapmutex");
if (mutex.WaitOne()) {
try {
using (MemoryMappedViewStream stream = sharedFile.CreateViewStream()) {
var textReader = new StreamReader(stream);
txtResult.Text = textReader.ReadToEnd();
textReader.Close();
}
}
finally { mutex.ReleaseMutex(); }
}
Dispose the file (after finished)
if (sharedFile != null) sharedFile.Dispose();
For the full example, see here: https://github.com/goopyjava/memory-map-test. Hope that helps!
EDIT/NOTE - If you look at the example provided, you can write to the file as many times as you want, and any time you read, you will read exactly (and only) what was written last. I believe this was the original goal of the question.
