asp.net mvc file upload security breach - c#

OK, apart from checking the file type and file size (both server-side), how can I avoid a security breach during file upload that may compromise my system (basically from an advanced hacker)? Is this enough? I am not talking about any special scenario, just a simple file upload.
Security is one of the major concerns in my application.

You could make sure you don't store those files in a folder which is publicly accessible and executable by the web server. You could also use heuristics, checking the first few bytes of the file for known file formats. For example, common image formats have standard beginning headers, so you might check for the presence of those headers.
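A minimal sketch of that header check, assuming the upload arrives as a seekable Stream (the class and method names are just for illustration; the byte signatures are the standard JPEG/PNG/GIF magic numbers):

using System.IO;
using System.Linq;

public static class FileSignature
{
    // Standard magic numbers for common image formats.
    private static readonly byte[][] KnownHeaders =
    {
        new byte[] { 0xFF, 0xD8, 0xFF },                                // JPEG
        new byte[] { 0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A }, // PNG
        new byte[] { 0x47, 0x49, 0x46, 0x38 },                          // GIF
    };

    public static bool LooksLikeKnownImage(Stream uploaded)
    {
        var buffer = new byte[8];
        int read = uploaded.Read(buffer, 0, buffer.Length);
        uploaded.Position = 0; // rewind so the caller can still save the stream

        return KnownHeaders.Any(header =>
            read >= header.Length && header.SequenceEqual(buffer.Take(header.Length)));
    }
}

Bear in mind this only raises the bar: a crafted file can carry a valid image header and still contain a malicious payload, which is why not serving uploads from an executable location is the more important control.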


Upload large files to sitecore media library

I have a media-centric website that requires us to upload large images and videos to the media library.
I have the default values for the following settings in web.config:
Media.MaxSizeInDatabase (20MB)
httpRuntime maxRequestLength
I do not want to increase MaxSizeInDatabase limit on the production server for security reasons.
Also, Media.UploadAsFiles is set to false.
So, my question is - Is there a way to configure sitecore such that if the file being uploaded is less than 20MB, it gets stored in the database and the files larger than 20MB get stored on the file system?
As Martijn says, there is nothing built in to automatically detect this, but if you know that the file is going to be large (or the upload fails due to the large size) then you can manually "force it" to save to file on a per-upload basis.
You need to use the Advanced Upload option and select the "Upload as Files" option.
EDIT: If you are able to use YouTube then consider the following modules, which are nicely/tightly integrated with Sitecore. There are a couple of other ways of achieving the same thing for different providers.
YouTube Integration
YouTube Uploader
No, not that I know of. At least not automatically. Uploaded files are either stored in the DB or on the filesystem, based on your setting.
You might want to create an override upload method which could automatically handle this for you (one possible shape is sketched below), or use the manual checkbox in the Advanced Media Upload dialog, as Jammykam says.
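One possible shape for that override is a custom processor in the uiUpload pipeline that flips large uploads to file storage before the Save processor runs. Treat this as a rough sketch only; the UploadArgs members used here (Files, Destination) differ between Sitecore versions, so verify them against your own assemblies:

using System.Web;
using Sitecore.Pipelines.Upload;

public class ForceFileStorageForLargeUploads
{
    // Mirror the Media.MaxSizeInDatabase setting (20MB by default).
    private const long MaxDatabaseBytes = 20L * 1024 * 1024;

    public void Process(UploadArgs args)
    {
        foreach (string key in args.Files)
        {
            HttpPostedFile file = args.Files[key];
            if (file != null && file.ContentLength > MaxDatabaseBytes)
            {
                // Equivalent to ticking "Upload as Files" in the Advanced dialog.
                args.Destination = UploadDestination.File;
                return;
            }
        }
    }
}

You would patch this processor into the uiUpload pipeline ahead of the Save processor via an include config.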

how to find the timestamp of an online pdf file using c#?

I am writing an application that would download and replace a pdf file only if the timestamp is newer than that of the already existing one ...
I know it's possible to read the timestamp of a file on a local computer via the line of code below:
MessageBox.Show(File.GetCreationTime("C:\\test.pdf").ToString());
Is it possible to read the timestamp of a file that is online without downloading it?
Unless the directory containing the file on the site is configured to show raw file listings there's no way to get a timestamp for a file via HTTP. Even with raw listings you'd need to parse the HTML yourself to get at the timestamp.
If you had FTP access to the files then you could do this. If just using the basic FTP capabilities built into the .NET Framework, you'd still need to parse the directory listing to get at the date. However, there are third-party FTP libraries that fill in the gaps, such as editFTPnet, where you get an FTPFile class.
Updated:
Per comment:
If I were to set up a simple html file with the dates and filenames written manually, I could simply read that to find out which files have actually been updated and download just the required files. Is that a feasible solution?
That would be one approach; or, if you have scripting available (ASP.NET, ASP, PHP, Perl, etc.), you could automate this and have the script get the timestamps of the file(s) and render them for you. Or you could write a very simple web service that returns a JSON or XML blob containing the timestamps for the files, which would be less hassle to parse than some HTML (sketched below).
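A sketch of that web-service idea as a plain .ashx handler (the file list and handler name are invented for illustration):

using System;
using System.IO;
using System.Text;
using System.Web;

public class FileTimestamps : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // The files you want to expose; adjust to taste.
        string[] files = { "docs/a.pdf", "docs/b.pdf" };

        var json = new StringBuilder("{");
        for (int i = 0; i < files.Length; i++)
        {
            string physical = context.Server.MapPath("~/" + files[i]);
            DateTime stamp = File.GetLastWriteTimeUtc(physical);
            if (i > 0) json.Append(",");
            json.AppendFormat("\"{0}\":\"{1:o}\"", files[i], stamp); // ISO 8601
        }
        json.Append("}");

        context.Response.ContentType = "application/json";
        context.Response.Write(json.ToString());
    }

    public bool IsReusable { get { return true; } }
}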
It's only possible if the web server explicitly serves that data to you. The creation date for a file is part of the file system. However, when you're downloading something over HTTP it's not part of a file system at that point.
HTTP doesn't have a concept of "files" in the way people generally think. Instead, what would otherwise be a "file" is transferred as response data with a response header that gives information about the data. The header can specify the type of the data (such as a PDF "file") and even specify a default name to use if the client decides to save the data as a file on the client's local file system.
However, even when saving that, it's a new file on the client's local file system. It has no knowledge of the original file which produced the data that was served by the web server.
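One concrete case of the server explicitly serving that data is the Last-Modified response header. A sketch that asks for headers only via a HEAD request (it tells you nothing if the server omits the header, and the URL is a placeholder):

using System;
using System.Net;

class RemoteTimestampCheck
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("http://example.com/test.pdf");
        request.Method = "HEAD"; // headers only, no body is downloaded

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            // HttpWebResponse.LastModified silently defaults to DateTime.Now
            // when the header is absent, so check the raw header first.
            string raw = response.Headers["Last-Modified"];
            if (raw != null)
                Console.WriteLine("Remote Last-Modified: " + response.LastModified);
            else
                Console.WriteLine("Server did not report a timestamp.");
        }
    }
}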

Document Management System - Where to Store Files?

I'm in charge of building an ASP.NET MVC Document Management System. It has to be able to do basic document management tasks like adding, editing and searching entries, and also perform versioning.
Anyway, I'm targeting PDF, Office and many image formats as the file attached to each document entry in the database. My question is: what design guidelines do pros follow when building the storage mechanism? Do they store the document files in the file system? The database? How is file uploading handled?
I used to upload the files to a temporary location while the user was editing the data and move them to permanent storage when the user confirmed the entry creation. Is this good? Any suggestions for improvement?
Files should generally be stored on a filesystem, rather than a database.
You will, however, have to consider some other things:
Are you planning on ever supporting load-balancing, replication, etc for your system?
If so, you'll need to support saving / loading files from a network location of some sort.
This can be trickier than you may imagine.
Are you planning to secure access to the files?
If so, you'll need to ensure they can't be read by someone who happens to know the URL,
e.g. by returning the file as an attachment to a request (see the sketch below).
This also prevents user-provided files from being executed on your server, e.g. someone uploading an .aspx or .exe file and then accessing it directly.
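A sketch of such an attachment action in ASP.NET MVC; DocumentRepository, UserCanRead and the storage path are placeholders for whatever your system provides:

using System.IO;
using System.Web;
using System.Web.Mvc;

public class DocumentsController : Controller
{
    [Authorize]
    public ActionResult Download(int id)
    {
        var doc = DocumentRepository.Find(id);      // hypothetical lookup
        if (doc == null || !UserCanRead(User, doc)) // hypothetical permission check
            throw new HttpException(404, "Not found");

        // The store lives outside the web root, so nothing in it can be
        // requested (or executed) by URL directly.
        string path = Path.Combine(@"D:\DocumentStore", doc.StoredFileName);

        // Controller.File sends Content-Disposition: attachment with this name.
        return File(path, doc.ContentType, doc.OriginalFileName);
    }
}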

Silverlight 3: Best way to select a local file for input / upload

I'm using Silverlight 3 to write a LOB application that takes an input file, does some stuff, and then returns an output file. What is the easiest way to get the input file from the user and then return a file back to the user? Can I access the local file system to do this? How? Most likely the files will be ASCII files, but could be Excel some day (hopefully soon).
You can access the local file system provided you go via the OpenFileDialog (for reading files) and the SaveFileDialog (for writing files). You can't access arbitrary files, only the ones where the user has seen and OKed the file dialog.
There is one exception to what itowlson says: you do have access to IsolatedStorage. However, this is limited. Access to "normal" files can only happen through interaction of the user. This makes Silverlight a much safer playground, from a user perspective, than older technologies (like ActiveX).
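A minimal sketch of the dialog round-trip in Silverlight 3 (this must run from a user-initiated event such as a button click; the filter strings and processing are placeholders):

// Inside e.g. a Button.Click handler:
var open = new OpenFileDialog { Filter = "Text files (*.txt)|*.txt|All files (*.*)|*.*" };
if (open.ShowDialog() == true)
{
    string contents;
    using (var reader = open.File.OpenText())
        contents = reader.ReadToEnd();

    // ... do some stuff with contents ...

    var save = new SaveFileDialog { Filter = "Text files (*.txt)|*.txt" };
    if (save.ShowDialog() == true)
    {
        using (var stream = save.OpenFile())
        using (var writer = new System.IO.StreamWriter(stream))
            writer.Write(contents);
    }
}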

How can I tell if a file on an FTP is identical to a local file with out actually downloading the file?

I'm writing a simple program that is used to synchronize files to an FTP. I want to be able to check if the local version of a file is different from the remote version, so I can tell if the file(s) need to be transferred. I could check the file size, but that's not 100% reliable because obviously it's possible for two files to be the same size but contain different data. The date/time the files were modified is also not reliable, as the user's computer date could be set wrong.
Is there some other way to tell if a local file and a file on an FTP are identical?
There isn't a generic way. If the ftp site includes a checksum file, you can download that (which will be a lot quicker since a checksum is quite small) and then see if the checksums match. But of course, this relies on the owner of the ftp site creating a checksum file and keeping it up to date.
Other than that, you are S.O.L.
If the server is plain-old FTP, you can't do any better than checking the size and timestamps.
FTP has no mechanism for giving you the hashes/checksums of files, so you would need to do something like keeping a special "listing file" that has all the file names and hashes, or doing a separate request via HTTP, or some other protocol.
Ideally, you should not be using FTP anyway, it's really an obsolete protocol. If you have control of the system, you could use rsync or something like it.
Use a checksum. You generate the md5 (or sha1, sha2 etc) hash of both files, and if the files are identical, then the hashes will be identical.
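A sketch of the hashing side in C# (MD5 shown; swap in SHA1.Create() or SHA256.Create() for the stronger variants):

using System;
using System.IO;
using System.Security.Cryptography;

static string HashFile(string path)
{
    using (var algorithm = MD5.Create())
    using (var stream = File.OpenRead(path))
    {
        byte[] hash = algorithm.ComputeHash(stream);
        return BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant();
    }
}

Note the catch already raised above: hashing the remote copy still means reading its bytes somehow, so this helps only when the server publishes its hashes or you keep your own records (see below).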
IETF tried to achieve this by adding new FTP commands such as MD5 and MMD5.
http://www.faqs.org/rfcs/ftp-rfcs.html
However, not all FTP servers support them. So you must check whether the target FTP server your application will work against supports MD5/MMD5. If not, you can fall back on the workarounds mentioned above.
Couldn't you use a FileSystemWatcher and just have the client remember what changed?
http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.aspx
Whenever your client uploads files to the FTP server, map each file to its hash and store it locally on the client computer (or anywhere you can access later; the format doesn't matter, it can be an XML file or plain text, as long as you can retrieve the key/value pairs). Then, when you upload files again, just check the local files against the hash table you created; if a file's hash is different, upload it. This way you don't have to rely on the server to maintain a checksum file, and you don't have to have a process running to monitor FileSystemWatcher events.
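A sketch of that bookkeeping, reusing the HashFile idea from the earlier answer; the map file format and the upload call are placeholders:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Security.Cryptography;

class SyncSketch
{
    static string HashFile(string path)
    {
        using (var md5 = MD5.Create())
        using (var stream = File.OpenRead(path))
            return BitConverter.ToString(md5.ComputeHash(stream)).Replace("-", "");
    }

    static void Main()
    {
        const string mapFile = "hashes.txt"; // one "path|hash" pair per line
        var map = File.Exists(mapFile)
            ? File.ReadAllLines(mapFile).Select(l => l.Split('|'))
                  .ToDictionary(p => p[0], p => p[1])
            : new Dictionary<string, string>();

        foreach (string path in Directory.GetFiles(@"C:\ToSync"))
        {
            string hash = HashFile(path);
            string previous;
            if (!map.TryGetValue(path, out previous) || previous != hash)
            {
                // UploadToFtp(path); // your FTP upload call goes here
                map[path] = hash;
            }
        }

        File.WriteAllLines(mapFile, map.Select(kv => kv.Key + "|" + kv.Value).ToArray());
    }
}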
