I'm basically trying to add podcasts to an XML file and eventually retrieve them from that same XML, using the data in different parts of the application.
I successfully write the data to the XML, but every time I restart the application (debugging) and press the "submit" button, the XML file resets to 0 entries.
The submit code:
PodcastList podList = new PodcastList();

private void btnAddPod_Click(object sender, EventArgs e)
{
    var podUrl = txtAddPod.Text;
    var podName = txtEnterName.Text;
    podList.AddPod(podUrl, podName, 0);
    Xml.SaveListData(podList.GetPodcastList(), "Podcasts.xml");
}
Save to XML:
public static void SaveListData(object obj, string filename)
{
    var serializer = new XmlSerializer(obj.GetType());
    using (var stream = new StreamWriter(filename))
    {
        serializer.Serialize(stream, obj);
    }
}
I guess the application creates a new XML file every time I press submit and starts with fresh objects. What am I doing wrong? Cheers
XML files are generally not 'appended to', because of the need for matching opening and closing tags; this is unlike other types of text file, such as log files, where appending makes more sense.
When you call serializer.Serialize, the whole file gets overwritten.
What your program needs to do is read in the already-existing XML file on startup and store it as a PodcastList. Your program can then add to it (in memory) and save the whole list back to the file.
PodcastList podList = new PodcastList(); is at the class level.
If you want to maintain state, reload the XML file and deserialize it into the PodcastList in the constructor, or at load time. Then you will be able to retain and reuse the collection and write the data back to the XML file.
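A minimal sketch of that load step (assuming GetPodcastList() returns a List<Podcast>; the LoadListData helper and the Podcast property names here are hypothetical, mirroring SaveListData above):

public static T LoadListData<T>(string filename) where T : new()
{
    // First run: no file yet, so start with an empty collection.
    if (!File.Exists(filename))
        return new T();

    var serializer = new XmlSerializer(typeof(T));
    using (var stream = new StreamReader(filename))
    {
        return (T)serializer.Deserialize(stream);
    }
}

// In the form's constructor, repopulate the class-level list once:
// foreach (var p in Xml.LoadListData<List<Podcast>>("Podcasts.xml"))
//     podList.AddPod(p.Url, p.Name, 0);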
New to Blazor (web-assembly), so go easy :)
I would like to be able to select an XML file from local disk (in the region of 100 MB in size) via InputFile and have it loaded into an XmlDocument so I can interrogate it.
Upon trying to load a file of this size, it crashes out on XmlDocument.Load(). I'm unsure why.
I can get it to load smaller files via an OpenReadStream and setting maxAllowedSize to something like 105000000, but they take a complete age compared to, say, loading from a WPF C# app.
I'm unsure whether the stream is causing the slowness, or whether I'm missing something fundamental about accessing local disk files of this size during the XmlDocument load process.
Any help is greatly appreciated.
So basically, all I want to be able to do is something like this...
<InputFile OnChange="LoadFile">Select file...</InputFile>

@code {
    private void LoadFile()
    {
        XmlDocument newXml = new XmlDocument();
        newXml.Load(ChosenFilePath); // ChosenFilePath, or a hardcoded path for testing, still fails, i.e. @"C:\temp\TestFile.xml"
    }
}
You need to accept the InputFileChangeEventArgs and get your file through it. That's relatively simple; your event handler should look like this:
private void LoadFile(InputFileChangeEventArgs args)
{
    // The argument increases the maximum size; by default it only allows
    // you to open files of 500 KiB or less. The number below is 200 MiB.
    // Alternatively, you could pass 'args.File.Size' to allow opening an
    // arbitrarily large file.
    var fileStream = args.File.OpenReadStream(200 * 1024 * 1024);
    var newXml = new XmlDocument();
    newXml.Load(fileStream);
}
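One caveat worth checking (this is an assumption about Blazor WebAssembly, not something from the original post): the stream returned by OpenReadStream only supports asynchronous reads, while XmlDocument.Load reads synchronously, so for larger files you may need to buffer into a MemoryStream first. A sketch:

private async Task LoadFile(InputFileChangeEventArgs args)
{
    using var fileStream = args.File.OpenReadStream(200 * 1024 * 1024);

    // Copy the async-only browser stream into memory so the
    // synchronous XmlDocument.Load can consume it.
    using var buffer = new MemoryStream();
    await fileStream.CopyToAsync(buffer);
    buffer.Position = 0;

    var newXml = new XmlDocument();
    newXml.Load(buffer);
}

Blazor awaits Task-returning event handlers, so the <InputFile OnChange="LoadFile"> markup from the question works unchanged.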
I am building a C# winforms application which needs to read in data from multiple files and allow the user to view/edit the data. There will be a large amount of data, so the user needs to be able to save their changes, close the program, and resume their work later.
I am struggling with the best approach for retaining this data after the user exits the program. I've followed a tutorial for data binding to objects, but in this tutorial the data is hardcoded into the Form_Load event, and the changes are lost when you exit the program. The author alludes to preferring an object-based data source over a database for data binding, but doesn't describe how/if he saves data after the user exits.
Is there a way to store the data in the object-based data source between sessions, without setting up a local database or manually writing to some type of file? Or must I set up a local database in order to save data?
As users Jimi and bhmahler mentioned in the comments, the concept I was looking for was Object Serialization.
I created the following method to save my data:
private void Serialize()
{
    // FileMode.Create both creates the file on the first save and
    // truncates any existing content on later saves.
    Stream s = File.Open("data.txt", FileMode.Create);
    BinaryFormatter b = new BinaryFormatter();
    List<Airplane> data = new List<Airplane>();
    foreach (var a in bsAirplanes) // bsAirplanes is the BindingSource object
    {
        data.Add((Airplane)a);
    }
    b.Serialize(s, data);
    s.Close();
}
And this method to load saved data:
private void Deserialize()
{
    Stream s = File.OpenRead("data.txt");
    BinaryFormatter b = new BinaryFormatter();
    List<Airplane> data = (List<Airplane>)b.Deserialize(s);
    foreach (Airplane a in data)
    {
        bsAirplanes.Add(a);
    }
    s.Close();
}
I also had to mark the Airplane class with the [Serializable()] attribute.
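One caveat: BinaryFormatter is marked obsolete in newer .NET versions because deserializing untrusted data is a security risk. If you're on .NET 5 or later, System.Text.Json supports the same pattern (a sketch, assuming Airplane exposes its state as public properties; no [Serializable] attribute is needed):

// requires: using System.Text.Json;

private void SerializeJson()
{
    var data = new List<Airplane>();
    foreach (var a in bsAirplanes)
        data.Add((Airplane)a);
    File.WriteAllText("data.json", JsonSerializer.Serialize(data));
}

private void DeserializeJson()
{
    var data = JsonSerializer.Deserialize<List<Airplane>>(File.ReadAllText("data.json"));
    foreach (Airplane a in data)
        bsAirplanes.Add(a);
}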
I'm currently working on a project that has an external site posting XML data to a specified URL on our site. My initial thought was to first save the XML data to a physical file on our server as a backup. I then insert the data into the cache, and from then on all requests for the data are made to the cache instead of the physical file.
At the moment I have the following:
[HttpPost]
public void MyHandler()
{
    // filePath = path to my xml file

    // Delete the previous file
    if (File.Exists(filePath))
        File.Delete(filePath);

    using (Stream output = File.OpenWrite(filePath))
    using (Stream input = request.InputStream)
    {
        input.CopyTo(output);
    }

    // Deserialize and save the data to the cache
    var xml = new XmlTextReader(filePath);
    var serializer = new XmlSerializer(typeof(MyClass));
    var myClass = (MyClass)serializer.Deserialize(xml);

    HttpContext.Current.Cache.Insert(myKey,
        myClass,
        null,
        myTimespan,
        Cache.NoSlidingExpiration,
        CacheItemPriority.Default, null);
}
The issue I have is that I'm always getting exceptions thrown because the file that I'm saving to 'is in use' when I try a second post to update the data.
A colleague suggested using a Mutex class just before I left work on the Friday so I wonder if that is the correct approach here?
Basically I'm just trying to sanity check that this is a good way of managing the data? I can see there's clearly an issue with how I'm writing the data to a file but aside from this, does my approach make sense?
Thanks
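For what it's worth, a likely culprit for the 'file in use' error is the XmlTextReader above, which is never disposed and so keeps a handle on the file. Within a single web process, a static lock is also a simpler alternative to a Mutex for serializing the posts. A rough sketch combining both (not the original code; same placeholder names as in the question):

private static readonly object _fileLock = new object();

[HttpPost]
public void MyHandler()
{
    lock (_fileLock) // one post writes/reads the file at a time, in this process
    {
        // File.Create overwrites the previous file, so no Delete is needed.
        using (Stream output = File.Create(filePath))
        {
            Request.InputStream.CopyTo(output);
        }

        // Disposing the reader releases the file handle for the next post.
        using (var xml = new XmlTextReader(filePath))
        {
            var serializer = new XmlSerializer(typeof(MyClass));
            var myClass = (MyClass)serializer.Deserialize(xml);
            HttpContext.Current.Cache.Insert(myKey, myClass, null, myTimespan,
                Cache.NoSlidingExpiration, CacheItemPriority.Default, null);
        }
    }
}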
I'm building a WPF program, and I use this code to save a list of objects to a file:
var list = new ArrayList();
list.Add("item1");
list.Add("item2");

// Serialize the list to a file
var serializer = new BinaryFormatter();
using (var stream = File.OpenWrite("test.dat"))
{
    serializer.Serialize(stream, list);
}
My problem is where to save this file on the disk. I read that I can't use the Program Files folder because sometimes only an admin user can save files there.
Is there any universal folder that I can use to save files?
I'd save it to Application Data. You can get its path using
Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData)
Is this data internal to your program, or user generated? If the data is internal, you probably want to use the Application Data folder. If it's user generated, you should probably default to My Documents, but let the user decide where to save it.
You can call Environment.GetFolderPath() to get the location of these special folders.
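Putting the two together might look like this (a sketch reusing the serializer and list from the question; the "MyApp" folder name is just a placeholder):

string appData = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
string folder = Path.Combine(appData, "MyApp");
Directory.CreateDirectory(folder); // does nothing if the folder already exists
string savePath = Path.Combine(folder, "test.dat");

using (var stream = File.Create(savePath))
{
    serializer.Serialize(stream, list);
}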
I have an XML file like:
<CurrentProject>
  <!-- Elements like:
       the last opened project file, to reopen it when the app starts,
       and more global, project-independent settings -->
</CurrentProject>
Now I asked myself whether I should ship this XML file, with the above empty elements, in the installer for my app, or whether I should create the file on the fly at application start if it doesn't exist, and otherwise read the values from it.
Consider also that the user could delete this file, and that should not stop my application from working.
What is better and why?
UPDATE:
What I did felt OK to me, so I'll post my code here :) It just creates the XML + structure on the fly with some safety checks...
public ProjectService(IProjectDataProvider provider)
{
    _provider = provider;
    string applicationPath = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
    _projectPath = Path.Combine(applicationPath, @"TBM\Settings.XML");

    if (!File.Exists(_projectPath))
    {
        string dirPath = Path.Combine(applicationPath, @"TBM");
        if (!Directory.Exists(dirPath))
            Directory.CreateDirectory(dirPath);

        using (var stream = File.Create(_projectPath))
        {
            XElement projectElement = new XElement("Project");
            projectElement.Add(new XElement("DatabasePath"));
            projectElement.Save(stream, SaveOptions.DisableFormatting);
        }
    }
}
In a similar scenario, I recently went for creating the initial file on the fly. The main reason I chose this was that I wasn't depending on the file being there and being valid. As this is a file that's often read from/written to, there's a chance it could get corrupted (e.g. if the power is lost while the file is being written).
In my code I attempted to open this file for reading and then read the data. If anywhere during these steps I encountered an error, I simply recreated the file with default values and displayed a corresponding message to the user.
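That read-or-recreate step might look roughly like this (a sketch; the element names follow the UPDATE code above, and the blanket catch mirrors the "any error means recreate with defaults" behaviour described):

private XElement LoadOrCreateSettings(string path)
{
    try
    {
        return XElement.Load(path);
    }
    catch (Exception) // missing, locked, or corrupted file
    {
        // Recreate the file with default values...
        var defaults = new XElement("Project", new XElement("DatabasePath"));
        defaults.Save(path);
        // ...and display a corresponding message to the user here.
        return defaults;
    }
}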