I'm currently working on a project where an external site posts XML data to a specified URL on our site. My initial thought was to first save the XML data to a physical file on our server as a backup. I then insert the data into the cache, and from then on all requests for the data are served from the cache instead of the physical file.
At the moment I have the following:
[HttpPost]
public void MyHandler()
{
    // filePath = path to my xml file

    // Delete the previous file
    if (File.Exists(filePath))
        File.Delete(filePath);

    using (Stream output = File.OpenWrite(filePath))
    using (Stream input = Request.InputStream)
    {
        input.CopyTo(output);
    }

    // Deserialize and save the data to the cache
    var xml = new XmlTextReader(filePath);
    var serializer = new XmlSerializer(typeof(MyClass));
    var myClass = (MyClass)serializer.Deserialize(xml);

    HttpContext.Current.Cache.Insert(myKey,
        myClass,
        null,
        myTimespan,
        Cache.NoSlidingExpiration,
        CacheItemPriority.Default, null);
}
The issue I have is that I always get exceptions because the file I'm saving to 'is in use' when a second post arrives to update the data.
A colleague suggested using the Mutex class just before I left work on Friday, so I wonder if that is the correct approach here?
Basically I'm just trying to sanity-check that this is a good way of managing the data. I can see there's clearly an issue with how I'm writing the data to the file, but aside from this, does my approach make sense?
Thanks
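For what it's worth, one likely culprit in the handler above: the XmlTextReader is never disposed, so it keeps a handle on the XML file until the garbage collector gets around to it, and the next POST then fails to delete or overwrite the file. Below is a minimal sketch of the same handler with the reader wrapped in a using block and the file access serialized with a lock (a cross-process Mutex, as the colleague suggested, would only be needed if several processes touched the file). MyClass, myKey and myTimespan are the question's own placeholders:

private static readonly object SyncRoot = new object();

[HttpPost]
public void MyHandler()
{
    lock (SyncRoot) // serialize concurrent posts within this process
    {
        // FileMode.Create truncates any previous content, so no Delete is needed.
        using (Stream output = new FileStream(filePath, FileMode.Create))
        {
            Request.InputStream.CopyTo(output);
        }

        // Disposing the reader releases the file handle immediately.
        using (var xml = new XmlTextReader(filePath))
        {
            var serializer = new XmlSerializer(typeof(MyClass));
            var myClass = (MyClass)serializer.Deserialize(xml);

            HttpContext.Current.Cache.Insert(myKey, myClass, null, myTimespan,
                Cache.NoSlidingExpiration, CacheItemPriority.Default, null);
        }
    }
}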
public async Task UploadParquetFromObjects<T>(string fileName, T objects)
{
    var stringJson = JArray.FromObject(objects).ToString();
    var parsedJson = ChoJSONReader.LoadText(stringJson);
    var desBlob = blobClient.GetBlockBlobClient(fileName);

    using (var outStream = await desBlob.OpenWriteAsync(true).ConfigureAwait(false))
    using (ChoParquetWriter parser = new ChoParquetWriter(outStream))
    {
        parser.Write(parsedJson);
    }
}
I'm using this code to write some data to a file in Azure Blob Storage. At first it seemed to work fine: it created the file, put some information in it, and the file was readable. But on closer investigation, it only writes a fraction of the data I send. For example, I send a list of 15 items and it only writes 3. I tried different datasets of different sizes, composed of different objects; the number of records written varies, but it never reaches 100%.
Am I doing something wrong?
This issue is being tracked and addressed in the GitHub issues section:
https://github.com/Cinchoo/ChoETL/issues/230
The issue was that the input JSON has inconsistent members, so the JSON reader sets missing datetime members to null, and the Parquet writer couldn't handle such null datetime values. A fix has been applied.
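For anyone pinned to an older package, a hedged sketch of a pre-normalization workaround using Newtonsoft.Json: give every object the same members before handing the JSON to the writer, so no datetime member arrives as null. The member name CreatedOn and the sentinel date are hypothetical stand-ins for whichever members are missing in your data:

var array = JArray.Parse(stringJson);
foreach (var item in array.OfType<JObject>())
{
    // "CreatedOn" is a hypothetical datetime member; substitute your own.
    JToken value = item["CreatedOn"];
    if (value == null || value.Type == JTokenType.Null)
        item["CreatedOn"] = new DateTime(1900, 1, 1);
}
var parsedJson = ChoJSONReader.LoadText(array.ToString());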
Sample fiddle: https://dotnetfiddle.net/PwxNWX
Packages used:
ChoETL.JSON.Core v1.2.1.49 (beta2)
ChoETL.Parquet v1.0.1.23 (beta6)
I'm making a game in C# (a console application) and its variables need to be saved.
I've tried using Settings, but there's a big problem with it: if the file name is changed or the file is moved somewhere else, the settings are lost.
So what is a good alternative to Settings for saving variables and retrieving them later in the application?
EDIT: I'd like to save the variables to a text file and retrieve them later. Is that possible? If yes, then how?
And please don't suggest online servers, because I'm working on a single-player game without keeping track of the players whatsoever.
One simple way to store data of a fixed type is serialization with the BinaryFormatter class.
See the MSDN documentation for BinaryFormatter; I've copied some of the relevant code here.
using System;
using System.IO;
using System.Collections;
using System.Runtime.Serialization.Formatters.Binary;
using System.Runtime.Serialization;
void SaveData()
{
    // Create a hashtable of values that will eventually be serialized.
    Hashtable addresses = new Hashtable();
    addresses.Add("Jeff", "123 Main Street, Redmond, WA 98052");
    addresses.Add("Fred", "987 Pine Road, Phila., PA 19116");
    addresses.Add("Mary", "PO Box 112233, Palo Alto, CA 94301");

    // To serialize the hashtable and its key/value pairs,
    // you must first open a stream for writing.
    // In this case, use a file stream.
    FileStream fs = new FileStream("DataFile.dat", FileMode.Create);

    // Construct a BinaryFormatter and use it to serialize the data to the stream.
    BinaryFormatter formatter = new BinaryFormatter();
    try
    {
        formatter.Serialize(fs, addresses);
    }
    catch (SerializationException e)
    {
        Console.WriteLine("Failed to serialize. Reason: " + e.Message);
        throw;
    }
    finally
    {
        fs.Close();
    }
}
void LoadData()
{
    // Declare the hashtable reference.
    Hashtable addresses = null;

    // Open the file containing the data that you want to deserialize.
    FileStream fs = new FileStream("DataFile.dat", FileMode.Open);
    try
    {
        BinaryFormatter formatter = new BinaryFormatter();

        // Deserialize the hashtable from the file and
        // assign the reference to the local variable.
        addresses = (Hashtable)formatter.Deserialize(fs);
    }
    catch (SerializationException e)
    {
        Console.WriteLine("Failed to deserialize. Reason: " + e.Message);
        throw;
    }
    finally
    {
        fs.Close();
    }

    // To prove that the table deserialized correctly,
    // display the key/value pairs.
    foreach (DictionaryEntry de in addresses)
    {
        Console.WriteLine("{0} lives at {1}.", de.Key, de.Value);
    }
}
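Adapting the MSDN sample to the question is mostly a matter of swapping the Hashtable for your own type. A minimal sketch, assuming a hypothetical GameState class that holds the variables you want to persist:

[Serializable]
class GameState
{
    public string PlayerName;
    public int Level;
    public int Score;
}

void SaveGame(GameState state)
{
    using (FileStream fs = new FileStream("savegame.dat", FileMode.Create))
    {
        new BinaryFormatter().Serialize(fs, state);
    }
}

GameState LoadGame()
{
    using (FileStream fs = new FileStream("savegame.dat", FileMode.Open))
    {
        return (GameState)new BinaryFormatter().Deserialize(fs);
    }
}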
If your app is dynamic by nature, so its name can be changed and its location can be changed (even if, to be honest, I don't understand the reasons behind that), the only possibility I can see is relying on an external source for storing and retrieving your config information.
In short: set up a server somewhere that holds your app's configuration data, and on first startup try to reach that server, load the file from it, and read the data. If that fails, just load default information.
Good candidates could be: DropBox, SkyDrive, GoogleDrive, Box... find a suitable C# API for any of them and store/read the data you need. The only thing I would draw your attention to with this solution is licensing. Keep an eye on it, and be sure that you can use the service in the way you decide to use it.
Saving the values out to a flat file...
Storing the values in an XML file, or a database file...
The Windows Registry...
There are many places you can store information, and only experience will really teach you what to put where... To make an intelligent guess, you need to be familiar with all the approaches. The flat-file option is sketched below.
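For the flat-file route (which the question's edit asks about), a minimal sketch of saving variables as key=value lines and reading them back; the file name and keys are hypothetical, and it needs System.IO and System.Linq:

// Write the variables out as simple key=value lines.
File.WriteAllLines("save.txt", new[] { "PlayerName=Alice", "Level=3", "Score=1200" });

// Read them back into a dictionary for easy lookup.
var saved = File.ReadAllLines("save.txt")
    .Select(line => line.Split(new[] { '=' }, 2))
    .ToDictionary(parts => parts[0], parts => parts[1]);

int level = int.Parse(saved["Level"]);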
The only real option that isn't susceptible to the user intentionally changing the data stored on their computer, losing it when changing machines, etc. would be to not store the data on their computer at all: have a database or other server that you host, which users connect to over the network and which stores their data for them.
You may try to use Isolated Storage. Note:
Isolated storage is not available for Windows Store apps. Instead, use the application data classes in the Windows.Storage namespaces included in the Windows Runtime API to store local data and files.
You may also try using an XML file to store the user's settings, and then store it in the SpecialFolder.ApplicationData directory.
You can also use the app.config file to save application-level settings
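A minimal sketch of the XML-in-ApplicationData suggestion above, assuming a hypothetical GameSettings type; the folder name is also a placeholder:

// Resolve a per-user folder that survives the exe being moved or renamed.
string dir = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
    "MyGame"); // hypothetical application folder
Directory.CreateDirectory(dir);
string path = Path.Combine(dir, "settings.xml");

var serializer = new XmlSerializer(typeof(GameSettings)); // GameSettings is hypothetical

// Save
using (var stream = File.Create(path))
    serializer.Serialize(stream, settings);

// Load
using (var stream = File.OpenRead(path))
    settings = (GameSettings)serializer.Deserialize(stream);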
I have a strange problem. My code flows as follows:
1. The exe takes some data from the user.
2. It calls a web service to write the file (and create a CSV for the data) at a particular network location (say \\some-server\some-directory). Although this web service is hosted at the same location as this folder (i.e. I could also change it to c:\some-directory), it returns after writing the file.
3. The exe checks that the file exists; if it exists, it continues with further processing, else it quits with an error.
The problem I am having is at step 3. When I try to read the file immediately after it has been written, I always get a file-not-found exception (even though the file is present). I do not get this exception when I am debugging (because stepping through the code introduces a delay) or when I Thread.Sleep(3000) before reading the file.
This is really strange, because I close the StreamWriter before I return the call to the exe. According to the documentation, Close should force a flush of the stream. It is also not related to the size of the file, and I am not making async calls for writing and reading the file; they run serially in the same thread, one after another (only the writing is done by a web service and the reading by the exe, but the calls are still serial).
I do not know, but it feels like there is some time difference between when the file actually gets written to the disk and when Close() returns. This is baffling because it is not related to size at all; it happens for every file size. I have tried files with 10, 50, 100 and 200 lines of data.
Another thing I suspected was that, since I was writing this file to a network location, Windows might be optimizing the call by writing first to a cache and only later to the network location. So I changed the code to write to a local drive (i.e. c:\some-directory) rather than the network location, but it resulted in the same error.
There is no error in the code for reading and writing; as explained earlier, it starts working fine once I introduce a delay. Some other useful information:
The exe is .NET Framework 3.5.
Windows Server 2008 (64-bit, 4 GB RAM).
Edit 1
File.AppendAllText() is not a correct solution, as it creates a new file if one does not exist.
Edit 2
code for writing
using (FileStream fs = new FileStream(outFileName, FileMode.Create))
{
    using (StreamWriter writer = new StreamWriter(fs, Encoding.Unicode))
    {
        writer.WriteLine(someString);
    }
}
code for reading
StreamReader rdr = new StreamReader(File.OpenRead(CsvFilePath));
string header = rdr.ReadLine();
rdr.Close();
Edit 3
I used TextWriter; same error:
using (TextWriter writer = File.CreateText(outFileName))
{
}
Edit 4
Finally, as suggested by some users, I check for the file in a while loop a certain number of times before throwing the file-not-found exception.
int i = 1;
while (i++ < 10)
{
    bool fileExists = File.Exists(CsvFilePath);
    if (!fileExists)
        System.Threading.Thread.Sleep(500);
    else
        break;
}
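One caveat with polling File.Exists: the file can exist while the writer still holds a lock on it, so a variant that retries opening the stream itself may be more reliable. A hedged sketch along the same lines:

StreamReader rdr = null;
for (int attempt = 0; attempt < 10; attempt++)
{
    try
    {
        // Opening the stream proves both that the file exists and that nobody holds it.
        rdr = new StreamReader(File.OpenRead(CsvFilePath));
        break;
    }
    catch (IOException) // covers FileNotFoundException and sharing violations
    {
        System.Threading.Thread.Sleep(500);
    }
}

if (rdr == null)
    throw new FileNotFoundException("File never became available.", CsvFilePath);

using (rdr)
{
    string header = rdr.ReadLine();
}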
So you are writing a stream to a file, then reading the file back into a stream? Do you need to write the file and then post-process it, or could you just use the source stream directly?
If you need the file, I would use a loop that keeps checking whether the file exists every second until it appears (or until a silly amount of time has passed); the writer would give you an error if it couldn't write the file, so you know it will turn up eventually.
Since you're writing over a network, the most robust solution would be to save your file on the local system first, then copy it to the network location. This way you avoid network connection problems, and you also have a backup in case of network failure.
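A minimal sketch of that write-locally-then-copy idea; the local file name is hypothetical:

// Write to a local file first...
string localPath = Path.Combine(Path.GetTempPath(), "out.csv"); // hypothetical name
File.WriteAllText(localPath, someString);

// ...then copy it to the network share in one step, overwriting any old copy.
File.Copy(localPath, @"\\some-server\some-directory\out.csv", true);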
Based on your update, try this instead:
File.WriteAllText(outFileName, someString);

header = null;
using (StreamReader reader = new StreamReader(CsvFilePath))
{
    header = reader.ReadLine();
}
Have you tried reading after disposing the writer's FileStream? Like this:
using (FileStream fs = new FileStream(outFileName, FileMode.Create))
{
    using (StreamWriter writer = new StreamWriter(fs, Encoding.Unicode))
    {
        writer.WriteLine(someString);
    }
}

using (StreamReader rdr = new StreamReader(File.OpenRead(CsvFilePath)))
{
    string header = rdr.ReadLine();
}
I have developed a website in ASP.NET MVC that reads from an XML file to display some data. This file is regularly updated by my backend process, which builds the XML file and finally uploads it to my webhost via FTP.
This has been working fine, but in the last week or so I have run into a problem: I get the ASP.NET exception "file is being used by another process". I have no clue what this can be, and I find it very odd; I haven't even changed anything in my code for some months now.
Below is a typical method that I use for deserializing my XML file:
public static IEnumerable<FStreamObject> GetStreams()
{
    using (FileStream fs = new FileStream(HttpRuntime.AppDomainAppPath + "/ff.xml", FileMode.Open))
    {
        XmlReader ffXML = XmlReader.Create(fs);
        XmlSerializer ser = new XmlSerializer(typeof(FXmlModel));
        var sList = (FXmlModel)ser.Deserialize(ffXML);
        fs.Close();
        return sList.FSObjectList.OrderByDescending(x => x.Cash);
    }
}
I guess the other process, which updates the content of your file, still has a lock on it. So you may need to fix that code to make sure you release any locks/connections to this file once you are done with the file operations.
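If you can't change the uploader, a hedged stopgap on the reading side is to open the file with an explicit FileShare mode that tolerates a concurrent writer; note you may then read a half-uploaded file, so this is a trade-off rather than a fix. The body of GetStreams might become:

// FileShare.ReadWrite lets the read succeed even while the FTP
// process still holds the file open for writing.
using (FileStream fs = new FileStream(HttpRuntime.AppDomainAppPath + "/ff.xml",
    FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (XmlReader ffXML = XmlReader.Create(fs))
{
    var ser = new XmlSerializer(typeof(FXmlModel));
    var sList = (FXmlModel)ser.Deserialize(ffXML);
    return sList.FSObjectList.OrderByDescending(x => x.Cash);
}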
The problem could be that an exception is being thrown before the fs.Close() invocation.
Try using a try-catch-finally and put the fs.Close() in the finally block.
Also bear in mind that the Close() method does not destroy the object immediately; you could also try calling Dispose() to be sure the reference is cleaned up.
Hope it helps!
I need to download an XML file from a secure link and store the content in a database.
Can I use a TextReader, or do I need to store the file in my local file system first, then read the content from the file system and store it in my database?
HttpWebRequest downloadRequest = (HttpWebRequest)WebRequest.Create("https://line-to-xml-file.xml");
string Content;

downloadRequest.Credentials = new NetworkCredential()
{
    UserName = this._userCredentials.UserName,
    Password = this._userCredentials.Password
};
downloadRequest.PreAuthenticate = true;

using (HttpWebResponse downloadHTTPResponse = (HttpWebResponse)downloadRequest.GetResponse())
{
    using (Stream downloadResponseStream = downloadHTTPResponse.GetResponseStream())
    using (TextReader tReader = new StreamReader(downloadResponseStream))
    {
        Content = tReader.ReadToEnd();
    }
}

return Content;
Since the remote file is huge (up to 100 MB), I can't see anything useful in the debugger.
And when I try to save it:
using (TransactionScope trans = new TransactionScope()) // <-- the exception is thrown at this line
{
    // perform update, save the content into the database
    // send a notification message to the message bus, indicating the content has been updated
}
It complains about an MSDTC transaction timeout/cancellation.
By having it in a stream you should be fine... if you have any problem with that specific stream, you could use a MemoryStream instead of a FileStream and use it the same way. But I highly doubt this is your case.
I think you should also make sure to open the connection RIGHT BEFORE you are going to save the stream, once you have it completely loaded... You can also play with your command's CommandTimeout property if saving is taking really, really long; per the docs, "A value of 0 indicates no limit", but this should be avoided.
There is an XmlReader, but if the XML is already well-formed and you don't need to parse it, because your database is just taking it as a blob or the database engine is going to parse it, you could use whatever.
It depends on your database schema.
It sounds like the database is having trouble inserting it, but we need more info.
100 MB of text is a lot to cram into a table. Your statement is almost certainly timing out. Check your context and your SQL command object (if you have one) and increase the timeout value.
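A minimal sketch of raising the timeout on a plain ADO.NET insert, assuming System.Data.SqlClient and a hypothetical XmlStore table (Content is the string from the question's code); ORMs expose an equivalent command-timeout setting:

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "INSERT INTO XmlStore (Payload) VALUES (@payload)", conn)) // hypothetical table
{
    cmd.CommandTimeout = 600; // seconds; the default of 30 is far too short for 100 MB
    cmd.Parameters.AddWithValue("@payload", Content);
    conn.Open();
    cmd.ExecuteNonQuery();
}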
You will probably need to set a longer timeout duration to solve the timeout problem, and make sure you call the Complete method:
using (TransactionScope trans = new TransactionScope(
    TransactionScopeOption.Required, new TimeSpan(1, 4, 3))) // sets the timeout to 1 hour, 4 minutes and 3 seconds
{
    // perform update, save the content into the database
    // send a notification message to the message bus, indicating the content has been updated
    trans.Complete();
}
As for reading the file, you could possibly use WebClient, which allows you to monitor the progress:
WebClient wc = new WebClient();
wc.Credentials = new NetworkCredential()
{
    UserName = this._userCredentials.UserName,
    Password = this._userCredentials.Password
};
wc.DownloadProgressChanged += new DownloadProgressChangedEventHandler(wc_DownloadProgressChanged);
wc.DownloadFile("https://line-to-xml-file.xml", "C:\\local.xml");
The handler can log the progress if necessary:
void wc_DownloadProgressChanged(object sender, DownloadProgressChangedEventArgs e)
{
    // Log or show the current progress (e.ProgressPercentage or e.BytesReceived)
}
You could use DownloadString instead of DownloadFile if you want the string directly, without needing to read it back from the file:
wc.DownloadString("https://line-to-xml-file.xml");
How about using the WebClient and its DownloadFile method? Just save the file locally, use it, and delete it when you're done.