Save List to file on disk - C#

I'm building a WPF program, and I use this code to save a List of objects to a file:
var list = new ArrayList();
list.Add("item1");
list.Add("item2");
// Serialize the list to a file
var serializer = new BinaryFormatter();
using (var stream = File.OpenWrite("test.dat"))
{
    serializer.Serialize(stream, list);
}
My problem is where to save this file on disk. I read that I can't use the Program Files folder, because sometimes only an admin user can save files to that folder.
Is there any universal folder that I can use to save files?

I'd save it to Application Data. You can get its path using
Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData)

Is this data internal to your program, or user generated? If the data is internal, you probably want to use the Application Data folder. If it's user generated, you should probably default to My Documents, but let the user decide where to save it.
You can call Environment.GetFolderPath() to get the location of these special folders.
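To make this concrete, here is a minimal sketch of building a per-user, per-application save location under Application Data; the "MyWpfApp" folder name is just a placeholder, not something from the question:

```csharp
using System;
using System.IO;

public static class AppPaths
{
    // Returns a per-user, per-application data folder, creating it if needed.
    // "MyWpfApp" is a placeholder; use your own application's name.
    public static string GetAppDataFolder(string appName = "MyWpfApp")
    {
        string root = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
        string folder = Path.Combine(root, appName);
        Directory.CreateDirectory(folder); // no-op if it already exists
        return folder;
    }
}
```

You would then save to something like Path.Combine(AppPaths.GetAppDataFolder(), "test.dat") instead of a bare relative path, which otherwise resolves against the (possibly read-only) working directory.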

Related

How to obtain the full file path when using InputFile in Blazor Server?

I need to be able to extract the full file name, including the path when the user selects a file using my InputFile element.
So, as an example, using this
<InputFile OnChange="FileSelected" />
I can see the filename in the event handler like so
void FileSelected(InputFileChangeEventArgs eventArgs)
{
    // eventArgs.File.Name has just the name of the file, e.g. ABC.csv,
    // but I need the full path, like c:\userfolder\ABC.csv
}
But after various googling attempts, I haven't been able to figure out how to get the full file name.
The purpose here is to present the user with a file dialog box where they could pick a file and then I could load a few other files that are needed using the full file path.
Thanks
then I could load a few other files that are needed using the full file path
Nope.
The server cannot read from the client’s file system. Any file that needs to reach the server has to be sent by the client.
Even the client-side code is very restricted by the browser’s sandboxed environment. The user needs to supply the file in order to grant permission. See: https://developer.mozilla.org/en-US/docs/Web/API/File
You’ll likely need to re-think the use case. Because browsers specifically don’t allow what you want to do.
Try this (note that UploadChangeEventArgs comes from a third-party upload component rather than the built-in InputFile, and this saves the uploaded file on the server — it still does not recover the client-side path):
public void OnChangeUpload(UploadChangeEventArgs args)
{
    foreach (var file in args.Files)
    {
        // Save each uploaded file under wwwroot\Images on the server
        var path = Path.GetFullPath("wwwroot\\Images\\") + file.FileInfo.Name;
        using (var filestream = new FileStream(path, FileMode.Create, FileAccess.Write))
        {
            file.Stream.WriteTo(filestream);
        }
        file.Stream.Close();
        pathUrl = path;
    }
}

Out Of Memory Exception in Foreach

I am trying to create a function that will retrieve all the uploaded files (which are now saved as byte in the database) and download it in a single zip file. I currently have 6000 files to download (and the number could grow).
The functionality is already working (from retrieval to download) if I limit the number of files being downloaded, otherwise, I get an OutOfMemoryException on the ForEach loop.
Here's some pseudo-code (the files variable is a list of byte arrays and file names):
var files = getAllFilesFromDB();
foreach (var file in files)
{
    var tempFilePath = Path.Combine(path, file.filename);
    using (FileStream stream = new FileStream(tempFilePath, FileMode.Create, FileAccess.ReadWrite))
    {
        stream.Write(file.byteArray, 0, file.byteArray.Length);
    }
}
private readonly IEntityRepository<File> fileRepository;

IEnumerable<FileModel> getAllFilesFromDb()
{
    return fileRepository.Select(f => new FileModel() { fileData = f.byteArray, filename = f.fileName });
}
My question is, is there any other way to do this to avoid getting such errors?
To avoid this problem, you could avoid loading the contents of all the files in one go. Most likely you will need to split your database call into two:
1. Retrieve a list of all the files without their contents, but with some identifier - like the PK of the table.
2. A method which retrieves the contents of an individual file.
Then your (pseudo)code becomes
get list of all files
for each file
get the file contents
write the file to disk
Another possibility is to alter the way your query currently works so that it uses deferred execution - this means it will not actually load all the files at once, but stream them one at a time from the database. But without seeing more code from your repository implementation, I cannot/will not guess the right solution for you.
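The two-call approach above can be sketched as follows; the repository interface and the in-memory stand-in are hypothetical, invented here to illustrate the shape of the split, since the question doesn't show the real repository API:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Hypothetical repository abstraction -- the real fileRepository from the
// question would expose equivalent queries.
public interface IFileRepository
{
    IEnumerable<int> GetAllFileIds();               // step 1: ids only, no blobs
    (string Name, byte[] Data) GetFileById(int id); // step 2: one blob at a time
}

public static class FileExporter
{
    // Writes each file to disk one at a time, so only a single byte array
    // is held in memory at any moment.
    public static void ExportAll(IFileRepository repo, string outputDir)
    {
        Directory.CreateDirectory(outputDir);
        foreach (int id in repo.GetAllFileIds())
        {
            var (name, data) = repo.GetFileById(id);
            using (var stream = new FileStream(Path.Combine(outputDir, name),
                                               FileMode.Create, FileAccess.Write))
            {
                stream.Write(data, 0, data.Length);
            }
        }
    }
}

// In-memory stand-in used only to illustrate the interface; a real
// implementation would run two separate database queries.
public class InMemoryFileRepository : IFileRepository
{
    public IEnumerable<int> GetAllFileIds() => new[] { 1, 2 };
    public (string Name, byte[] Data) GetFileById(int id) =>
        ("file" + id + ".bin", new byte[] { (byte)id });
}
```

With 6000 files this keeps peak memory at roughly the size of the largest single file rather than the sum of all of them.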

Creating files programmatically and getting the file names from WP7 isolated storage

I am building an app which requires saving a form whenever the user enters the details. I want each form to be stored in a separate .dat file. Using GUIDs as file names, I am now able to store the data in separate files. But I am not able to retrieve all the file names and bind them to a ListBox of HyperlinkButtons, clicking which will show all the data stored in the particular file. Please help!
Maybe you want to have a look at:
http://msdn.microsoft.com/en-us/library/cc190395(v=vs.95).aspx
"GetFileNames" is a method of "IsolatedStorageFile" which shows all the files in the directory it is pointing to.
In this question you can get an example:
How to read names of files stored in IsolatedStorage
Try using the GetFileNames method of your IsolatedStorageFile. Use a wildcard (*.dat) to retrieve a list of your files.
For example:
using (var store = IsolatedStorageFile.GetUserStoreForApplication())
{
    string pattern = "*.dat";
    string[] files = store.GetFileNames(pattern);
}

Deploy an application's xml file with installer or create it on the fly if it does not exist

I have an XML file like:
<CurrentProject>
// Elements like
// last opened project file to reopen it when app starts
// and more global project independend settings
</CurrentProject>
Now I asked myself whether I should deliver this XML file (with the above empty elements) with the installer for my app, or create it on the fly at application start if it does not exist, and otherwise read the values from it.
Consider also that the user could delete this file, and that should not prevent my application from working.
What is better and why?
UPDATE:
What I did felt OK to me, so I'll post my code here :) It just creates the XML structure on the fly, with some safety checks...
public ProjectService(IProjectDataProvider provider)
{
    _provider = provider;
    string applicationPath = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
    _projectPath = Path.Combine(applicationPath, @"TBM\Settings.XML");

    if (!File.Exists(_projectPath))
    {
        string dirPath = Path.Combine(applicationPath, @"TBM");
        if (!Directory.Exists(dirPath))
            Directory.CreateDirectory(dirPath);

        using (var stream = File.Create(_projectPath))
        {
            XElement projectElement = new XElement("Project");
            projectElement.Add(new XElement("DatabasePath"));
            projectElement.Save(stream, SaveOptions.DisableFormatting);
        }
    }
}
In a similar scenario, I recently went for creating the initial file on the fly. The main reason I chose this was the fact that I wasn't depending on this file being there and being valid. As this was a file that's often read from/written to, there's a chance that it could get corrupted (e.g. if the power is lost while the file is being written).
In my code I attempted to open this file for reading and then read the data. If anywhere during these steps I encountered an error, I simply recreated the file with default values and displayed a corresponding message to the user.
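The read-or-recreate pattern described above can be sketched like this; the path and element names are illustrative, borrowed from the question's "Project"/"DatabasePath" structure rather than from any real application:

```csharp
using System;
using System.IO;
using System.Xml.Linq;

public static class SettingsStore
{
    // Loads the settings file, recreating it with default values if it is
    // missing or corrupted. Element names here are illustrative.
    public static XElement LoadOrCreate(string path)
    {
        try
        {
            if (File.Exists(path))
                return XElement.Load(path);
        }
        catch (System.Xml.XmlException)
        {
            // Corrupted file (e.g. power loss during a write): fall through
            // and recreate it with defaults.
        }
        var defaults = new XElement("Project", new XElement("DatabasePath"));
        string dir = Path.GetDirectoryName(path);
        if (!string.IsNullOrEmpty(dir))
            Directory.CreateDirectory(dir);
        defaults.Save(path);
        return defaults;
    }
}
```

Because every failure path ends in "recreate with defaults", a deleted or corrupted file can never stop the application from starting, which addresses the question's concern directly.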

OPEN a Resource.resx file instead of creating it, which overwrites the previous Resource.resx file

I start my application from within Visual Studio 2010.
I then add some files to my application, and each file type's icon (doc, docx, xls, pdf, etc.) is added as a String/Bitmap key/value pair to my IconImages.Resx file via
private void DumpTempResourceToRealResourceFile(IDictionary<String, Bitmap> tempResource)
{
    using (ResXResourceWriter writer = new ResXResourceWriter("IconImages.Resx"))
    {
        foreach (KeyValuePair<String, Bitmap> item in tempResource)
        {
            writer.AddResource(item.Key, item.Value);
        }
        writer.Generate();
    }
}
When the icons are added to the resource I close the application.
Then I start my application again with VS 2010 and add some files within my document application. The file types are written again to my IconImages.Resx.
Then I close my application and check the IconImages.Resx file under the \bin\ folder, and the previously saved images are gone; I have new/different ones now.
Why can I not OPEN a .resx file and append stuff to it? Every time I create a ResourceWriter object with the same name "IconImages.Resx", I overwrite the previously added stuff, and that's stupid.
How can my IconImages.Resx file stay alive over an application session without being overwritten by other stuff I add?
I haven't used ResXResourceWriter, but usually *Writer classes simply write a data file from scratch.
If you want to "append" new data you would typically have to use a *Reader class to deserialise the existing data into memory, then merge/add in any new data you wish to, and use a *Writer object to then write the resulting data back out. Take a look at ResXResourceReader to see if it supports what you need to do this.
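The general read-merge-write pattern the answer describes looks like this; since ResXResourceReader/ResXResourceWriter live in the Windows Forms assembly, the sketch below uses a plain key/value text file as a stand-in for the resource file, so the file format and class names here are illustrative, not the real ResX API:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

public static class KeyValueStore
{
    // Step 1: deserialise the existing data (empty if no file exists yet).
    public static Dictionary<string, string> Read(string path) =>
        File.Exists(path)
            ? File.ReadAllLines(path)
                  .Select(l => l.Split(new[] { '=' }, 2))
                  .ToDictionary(p => p[0], p => p[1])
            : new Dictionary<string, string>();

    // Steps 2 and 3: merge in the new entries, then rewrite the whole file.
    public static void Append(string path, IDictionary<string, string> newEntries)
    {
        var merged = Read(path);
        foreach (var pair in newEntries)
            merged[pair.Key] = pair.Value; // new values win on key collision
        File.WriteAllLines(path, merged.Select(p => p.Key + "=" + p.Value));
    }
}
```

With the real classes, the same shape applies: enumerate the existing entries with ResXResourceReader into a dictionary, add the new key/value pairs, then write everything back out with ResXResourceWriter.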
I now have a lookup table "FiletypeImage" with the file type (".docx") and the raw binary data as a blob. This table is retrieved in my DocumentService and cached in a static variable, with Get and Add methods which are called by my DocumentListViewModel. It's very fast, thanks to SQLite :)
