I would like to hold a file in a class.
I am writing an attachment user control. After a file is uploaded through the upload control, I would like to hold it in a class before I upload it to SharePoint. The user can upload more than one file.
Only when the user clicks the save button do I save everything: the files to SharePoint and the other data to the database.
Here is my class
public class Document
{
    public string documentName, documentPath, spServerURL, spDocumentLibraryURL;
    public DateTime lodgementDate;
    public System.Web.HttpPostedFile postedFile;
}
How should I handle this? Is it OK to use HttpPostedFile? And during an update, can I convert an SPFile to an HttpPostedFile?
The way that we handle this is: when the file is uploaded, it is saved in a well-known directory under a temporary file name based on a GUID. The temporary file name and the original uploaded file name are then stored in a list of FileDetails within our class. Our class is then serialized to the page's ViewState, but it could also be stored in session state (I wouldn't recommend this, in case your users open multiple pages or sign on to multiple computers with the same login).
When the save button is pressed, we loop through the list of FileDetails in our class, retrieve each one from the temporary directory, and send it to SharePoint (or wherever it needs to go).
We also bind the uploaded files to a grid so that the user can see the list of uploaded files; if they want to remove one, they can tick a box in the grid (Deleted) that we inspect before processing the files.
Note that this process can also support automatic unzipping of zipped files: if you detect that the uploaded file is zipped, you can unzip each of its files to the temporary directory and add an entry for each one to the list of files in the class. This could be a big time saver for your users.
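A minimal sketch of that approach, assuming ASP.NET Web Forms; the FileDetails class and all member names here are illustrative, not the actual implementation:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Web;

// Must be serializable so the list survives a round trip through ViewState.
[Serializable]
public class FileDetails
{
    public string OriginalFileName;  // name the user uploaded
    public string TempFileName;      // GUID-based name in the temp directory
    public bool Deleted;             // set when the user ticks "delete" in the grid
}

public static class AttachmentHelper
{
    // On upload: park the posted file on disk and remember both names.
    public static FileDetails SaveToTemp(HttpPostedFile posted, string tempDirectory)
    {
        var details = new FileDetails
        {
            OriginalFileName = Path.GetFileName(posted.FileName),
            TempFileName = Guid.NewGuid().ToString("N") + Path.GetExtension(posted.FileName)
        };
        posted.SaveAs(Path.Combine(tempDirectory, details.TempFileName));
        return details;
    }

    // On save: push every file the user didn't delete to its final destination.
    public static void ProcessFiles(List<FileDetails> files, string tempDirectory)
    {
        foreach (FileDetails f in files.Where(x => !x.Deleted))
        {
            byte[] content = File.ReadAllBytes(Path.Combine(tempDirectory, f.TempFileName));
            // e.g. spFolder.Files.Add(f.OriginalFileName, content) for SharePoint
            File.Delete(Path.Combine(tempDirectory, f.TempFileName));   // tidy up
        }
    }
}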
Keeping documents in memory is not an efficient use of web server RAM. It is much better to store them in either a temporary database or a file share.
However, I would not recommend writing this yourself if you can avoid it.
Try looking at the RadUpload control from Telerik; they even give you a 60-day trial.
I have a requirement like this: I have a bunch of folders, and each folder contains some set of .pdf, .doc, and .xls files. I need to upload one folder at a time to a database and read the folder's data; in the database it will be stored in the format of file type, file name, and file size. I have to do this in C#. Please can someone help me urgently? Thanks in advance.
Regards,
Snehasis
The how-to guide for iterating through files and folders here could be a good starting point. The sample code described in the guide iterates through folders recursively and saves the list of files inside each folder in a System.IO.FileInfo[] array, from which you can get all the required information about a particular file, such as name, type, and size, and then save it to the database.
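A minimal sketch of that recursion, assuming SQL Server via System.Data.SqlClient; the Files table and its columns are assumptions to adapt to your schema:

using System;
using System.Data.SqlClient;
using System.IO;

static class FolderImporter
{
    // Recursively walks a folder tree and records type, name, and size for
    // every file found. Assumes the SqlConnection is already open.
    public static void ImportFolder(string path, SqlConnection connection)
    {
        var dir = new DirectoryInfo(path);

        foreach (FileInfo file in dir.GetFiles())
        {
            using (var cmd = new SqlCommand(
                "INSERT INTO Files (FileType, FileName, FileSize) VALUES (@type, @name, @size)",
                connection))
            {
                cmd.Parameters.AddWithValue("@type", file.Extension);
                cmd.Parameters.AddWithValue("@name", file.Name);
                cmd.Parameters.AddWithValue("@size", file.Length);
                cmd.ExecuteNonQuery();
            }
        }

        foreach (DirectoryInfo sub in dir.GetDirectories())
            ImportFolder(sub.FullName, connection);   // recurse into subfolders
    }
}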
I am building an interface whose primary function would be to act as a file renaming tool (the underlying task here is to manually classify each file within a folder according to rules that describe their content). So far, I have implemented a customized file explorer and a preview window for the files.
I now have to find a way to inform a user if a file has already been renamed (this will show up in the file explorer's listView). The program should be able to read as well as modify that state as the files are renamed. I simply do not know what method is optimal to save this kind of information, as I am not fully used to C#'s potential yet. My initial solution involved text files, but again, I do not know if there should be only one text file for all files and folders or simply a text file per folder indicating the state of its contained items.
A colleague suggested that I use an Excel spreadsheet and then simply import the row or columns corresponding to my query. I tried to find more direct data structures, but again I would feel a lot more comfortable with some outside opinion.
So, what do you think would be the best way to store this kind of data?
PS: There are many thousands of files, all of them TIFF images, located on a remote server to which I have complete access.
I'm not sure what you're asking for, but if you simply want to keep some of the files' information, such as name, date, and size, you could start from the FileInfo class. It is marked as [Serializable], but note that XmlSerializer needs a type with a public parameterless constructor, which FileInfo doesn't have; so to write the data to an XML file, copy the properties you need into a small class of your own and serialize an array of those with an XmlSerializer.
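A sketch of that idea; FileRecord and its members are illustrative names for the small class you would write yourself:

using System;
using System.IO;
using System.Linq;
using System.Xml.Serialization;

// XmlSerializer can't build a FileInfo (no parameterless constructor), so
// copy the fields you need into a class of your own.
public class FileRecord
{
    public string Name;
    public long Size;
    public DateTime Modified;
    public bool Renamed;   // the "already renamed" flag from the question
}

public static class FileStateStore
{
    public static void Save(string folder, string xmlPath)
    {
        FileRecord[] records = new DirectoryInfo(folder)
            .GetFiles("*.tif")
            .Select(f => new FileRecord { Name = f.Name, Size = f.Length, Modified = f.LastWriteTime })
            .ToArray();

        using (var writer = new StreamWriter(xmlPath))
            new XmlSerializer(typeof(FileRecord[])).Serialize(writer, records);
    }

    public static FileRecord[] Load(string xmlPath)
    {
        using (var reader = new StreamReader(xmlPath))
            return (FileRecord[])new XmlSerializer(typeof(FileRecord[])).Deserialize(reader);
    }
}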
I am not sure I understand your question, but from what I gather you basically want to store the metadata for each file. If that is the case, I can make two suggestions.
Store the metadata in a simple XML file, one XML file per folder if you have multiple folders; the XML file could be a hidden file. Your custom application can then load the file, if it exists, when you navigate to the folder and present the data to the user.
If you are using NTFS, and you know this will always be the case, you can store the metadata for the file in a file stream. This is not a .NET stream, but an extra stream of data that can be stored and moved around with each file without impacting the actual file's content. The nice thing about this is that no matter where you move the file, the metadata will move with it, as long as it stays on NTFS.
Here is more info on file streams:
http://msdn.microsoft.com/en-us/library/aa364404(VS.85).aspx
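A minimal sketch of using such an alternate data stream, assuming a recent .NET runtime on NTFS (older .NET Framework versions reject the colon in the path and require P/Invoke via CreateFile); the path and stream name are hypothetical:

using System.IO;

class AdsExample
{
    const string DataFile = @"C:\scans\page001.tif";   // hypothetical file

    static void Main()
    {
        // Write metadata into a stream attached to the file itself.
        File.WriteAllText(DataFile + ":renameState", "renamed=true");

        // Read it back later; the stream travels with the file as long as
        // it stays on an NTFS volume.
        string state = File.ReadAllText(DataFile + ":renameState");
        System.Console.WriteLine(state);
    }
}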
You could create an object-oriented structure and then serialize the root object to a binary file or to an XML file. You could represent just about any structure this way, so you wouldn't have to struggle with the
"I do not know if there should be only one text file for all files and folders or simply a text file per folder indicating the state of its contained items."
design issues. You would just have one file containing all of the metadata that you need to store. If you want speedier opening/saving and smaller size, go with binary; if you want something that other people could open, view, and potentially write their own software against, use XML.
There are lots of variations on how to do this, but to get you started, here is one article from a quick Google:
http://www.codeproject.com/KB/cs/objserial.aspx
I have numerous byte[] arrays representing PDFs. Each byte array needs to be loaded at the start of the application and shown as a thumbnail in my GUI. So far I have managed to write the files to a temp location using:
System.IO.Path.GetTempPath();
Then, using this path, I write out each PDF with
System.IO.File.WriteAllBytes(fileName, arrayOfPdfs[i]);
and then navigate to that directory, get all the PDF files, and turn them into thumbnails in my app.
The thing is, I only want the PDFs I have just put in the temp location, so where else can I store the PDFs, or how can I be sure that the files I read back for thumbnailing are the ones I have just written? This is so the user only sees the PDFs that relate to their search in my system.
Thanks.
Build a randomly named directory in the base temporary directory:
string directoryPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
Directory.CreateDirectory(directoryPath);
Store your files in there.
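Continuing that snippet, a sketch of writing the search results into the new directory and reading back only those files; arrayOfPdfs is the array from the question, and the naming scheme is just an example:

// Write each byte[] from this search into the session-specific directory.
for (int i = 0; i < arrayOfPdfs.Length; i++)
    File.WriteAllBytes(Path.Combine(directoryPath, i + ".pdf"), arrayOfPdfs[i]);

// Only this search's PDFs live here, so these are exactly the files to thumbnail.
string[] pdfFiles = Directory.GetFiles(directoryPath, "*.pdf");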
I would recommend your user's ApplicationData/LocalApplicationData folder, provided by the OS for your app:
Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
Of course if the storage doesn't need to persist very long (really temporary) then you could just use the temp folder and create a folder inside of it to isolate your files.
Could you just create a subdirectory in the Temp path?
string dir = System.IO.Path.GetTempPath() + "\\<searchstring>";
Use Path.GetTempFileName and keep track of the temporary files that you've allocated during the session. You should clean up when your program exits.
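A minimal sketch of that bookkeeping; the TempFiles helper is an illustrative name, not a framework class:

using System;
using System.Collections.Generic;
using System.IO;

static class TempFiles
{
    static readonly List<string> allocated = new List<string>();

    public static string Create()
    {
        string path = Path.GetTempFileName();   // creates an empty file and returns its path
        allocated.Add(path);
        return path;
    }

    public static void CleanUp()                // call this when the program exits
    {
        foreach (string path in allocated)
            if (File.Exists(path)) File.Delete(path);
    }
}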
You can either:
Record the creation time of the first and last item, then only edit files which were created in that creation window
Move the files to a holding folder, create the thumbnails, and then move them to the folder they're meant to be in, ensuring that the holding folder is empty at the end of the run
I am writing a website to consolidate a bunch of XML files with data into one MySQL database. I need to have a way to allow users to select a directory on their computer that contains a bunch of xml files. The site then reads each of those files and takes care of consolidating the information.
Is there a simple way (like the default open-file dialog for WinForms and WPF) to bring up a file dialog on the user's computer, let the user pick a directory, and then read the XML files in the selected directory? Would I have to upload them to the site temporarily first? Or could I just access them on the user's computer?
Thanks!!
You can't access files on the user's machine directly from a web server. You would need to write an ActiveX control if you really can't find another way.
The standards-conformant way is simply uploading one or more files with the browser's file upload control:
http://msdn.microsoft.com/en-us/library/system.web.ui.webcontrols.fileupload.aspx
I would suggest that the user should zip the files and just upload the zip file.
There are some hacks, but I don't think they fit:
http://the-stickman.com/web-development/javascript/upload-multiple-files-with-a-single-file-element/
http://dotnetslackers.com/articles/aspnet/Upload_multiple_files_using_the_HtmlInputFile_control.aspx
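A sketch of the zip-upload suggestion, assuming .NET 4.5+ for System.IO.Compression and an <asp:FileUpload ID="XmlUpload" /> control on the page; all names here are illustrative:

using System.IO;
using System.IO.Compression;   // reference System.IO.Compression.FileSystem
using System.Web.UI;

public partial class ImportPage : Page
{
    protected void UploadButton_Click(object sender, System.EventArgs e)
    {
        if (!XmlUpload.HasFile) return;

        // Unpack the uploaded zip into a per-request temp directory.
        string workDir = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        Directory.CreateDirectory(workDir);

        string zipPath = Path.Combine(workDir, "upload.zip");
        XmlUpload.SaveAs(zipPath);
        ZipFile.ExtractToDirectory(zipPath, workDir);

        foreach (string xml in Directory.GetFiles(workDir, "*.xml"))
        {
            // parse each file and consolidate into the MySQL database here
        }
    }
}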
I think you have to have a web dialog to upload the files to a temp location, as you already mentioned, and do the consolidation there before committing to your database. Or maybe you can do the consolidation in JavaScript in the user's browser.
I am currently learning C# during my studies, and I am writing a small movie database application in it. I know it's not good to save pictures (etc.) inside the database, especially movie covers, which are rather big. But I don't want the files to just be saved in a folder, as this creates a mess as more and more movies are added to the list.
Does anyone know a way to store the files in some sort of container file (like covers.xxx)? The container would hold all covers in one big file, and the files could then be retrieved by an address or name.
Thanks :)
http://dotnetzip.codeplex.com/
Use the above library and the following code snippet.
using (ZipFile zip = new ZipFile())   // Ionic.Zip, from the DotNetZip library
{
    // add this map file into the "images" directory in the zip archive
    zip.AddFile("c:\\images\\personal\\7440-N49th.png", "images");
    // add the report into a different directory in the archive
    zip.AddFile("c:\\Reports\\2008-Regional-Sales-Report.pdf", "files");
    zip.AddFile("ReadMe.txt");
    zip.Save("MyZipFile.zip");
}
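Reading a cover back out by name works with the same library; this is a sketch, and the archive and entry names are just examples:

using (ZipFile zip = ZipFile.Read("MyZipFile.zip"))
using (var ms = new System.IO.MemoryStream())
{
    zip["images/7440-N49th.png"].Extract(ms);   // the indexer looks an entry up by its path in the archive
    ms.Position = 0;
    // e.g. pictureBox.Image = System.Drawing.Image.FromStream(ms);
}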
I can't see why storing the files as binary in the DB is necessarily a bad idea.
However, if that's definitely out, then it sounds like you want a way to store an uncompressed compilation of files: basically a .zip that isn't compressed. You could achieve this yourself by creating a file that is simply the data of the files you want, appended together with some sort of unique header string between them that you can split on when you read the file back. Ultimately this is simulating a basic file DB, so I'm not sure what you'd accomplish, but it's an idea.
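If you did roll your own container, a length-prefixed layout is safer than a separator string, since a separator could legitimately appear inside a file's bytes. A minimal sketch with illustrative names:

using System.Collections.Generic;
using System.IO;

static class CoverContainer
{
    // Append each file as [name][length][bytes]; BinaryWriter length-prefixes the name.
    public static void Pack(string containerPath, IEnumerable<string> files)
    {
        using (var writer = new BinaryWriter(File.Create(containerPath)))
        {
            foreach (string file in files)
            {
                byte[] data = File.ReadAllBytes(file);
                writer.Write(Path.GetFileName(file));
                writer.Write(data.Length);
                writer.Write(data);
            }
        }
    }

    // Scan the container until the requested name is found.
    public static byte[] Unpack(string containerPath, string name)
    {
        using (var reader = new BinaryReader(File.OpenRead(containerPath)))
        {
            while (reader.BaseStream.Position < reader.BaseStream.Length)
            {
                string entryName = reader.ReadString();
                int length = reader.ReadInt32();
                byte[] data = reader.ReadBytes(length);
                if (entryName == name) return data;
            }
        }
        return null;   // not found
    }
}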