Upload large files to sitecore media library - c#

I have a media-centric website that requires us to upload large images and videos to the media library.
I have the default values for the following settings in web.config:
Media.MaxSizeInDatabase (20MB)
httpRuntime maxRequestLength
I do not want to increase MaxSizeInDatabase limit on the production server for security reasons.
Also, Media.UploadAsFiles is set to false.
So, my question is: is there a way to configure Sitecore such that if the file being uploaded is less than 20MB it gets stored in the database, and files larger than 20MB get stored on the file system?

As Martijn says, there is nothing built in to automatically detect this, but if you know that the file is going to be large (or the upload fails due to its size) then you can manually "force it" to save to file on a per-upload basis.
You need to use the Advanced Upload option and select the "Upload as Files" option.
EDIT: If you are able to use YouTube then consider the following modules, which are nicely/tightly integrated with Sitecore. There are a couple of other ways of achieving the same thing for different providers.
YouTube Integration
YouTube Uploader

No, not that I know of. At least not automatically. Uploaded files are either stored in the DB or on the filesystem, based on your setting.
You might want to create an override of the upload method which could automatically handle this for you, or use the manual checkbox in the Advanced Media Upload as Jammykam says.
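For reference, a rough sketch of what such an override could look like: a custom processor patched into the uploadMedia pipeline ahead of the standard Save processor, forcing file-based storage when the upload exceeds the database limit. The member names used here (UploadArgs.Destination, UploadDestination, Settings.Media.MaxSizeInDatabase) are from memory and should be verified against your Sitecore version; treat this as an illustration, not a drop-in solution.

// Hypothetical processor for the <uploadMedia> pipeline, patched in before the Save processor.
// Verify the UploadArgs members against the Sitecore.Kernel version you are running.
using System.Web;
using Sitecore.Configuration;
using Sitecore.Pipelines.Upload;

public class ForceFileStorageForLargeUploads
{
    public void Process(UploadArgs args)
    {
        foreach (string key in args.Files)
        {
            HttpPostedFile file = args.Files[key];
            if (file != null && file.ContentLength > Settings.Media.MaxSizeInDatabase)
            {
                // Too large for the database: store this upload on the file system instead.
                args.Destination = UploadDestination.File;
                return;
            }
        }
    }
}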

Related

Editing a file in Azure storage

I am using the Azure Storage File Shares client library for .NET in order to save files in the cloud, read them and so on. I have a file saved in the storage which is supposed to be updated every time I perform a specific action in my code.
The way I'm doing it now is by downloading the file from the storage using
ShareFileDownloadInfo download = file.Download();
And then I edit the file locally and upload it back to the storage.
The problem is that the file can be updated frequently, which means lots of downloads and uploads of a file that keeps growing in size.
Is there a better way of editing a file on Azure storage? Maybe some way to edit the file directly in the storage without the need to download it before editing?
Downloading and uploading the file is the correct way to make edits with the way you're currently handling the data. If you find yourself doing this often, there are some strategies you could use to reduce traffic:
If you are the only one editing the file, you could cache a copy of it locally, apply your updates to that copy, and upload it instead of downloading the file each time.
Cache pending updates and only push them to the file at regular intervals instead of with each change (sketched below).
Break the single file up into multiple time-boxed files, say one per hour. This wouldn't help with frequency, but it can help with size.
FYI, when pushing logs to storage, many Azure services use a combination of #2 and #3 to minimize traffic.
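To illustrate #2, here is a minimal sketch that buffers pending changes in memory and flushes them to the share on a fixed interval, so the file is touched once per interval instead of once per change. The flush delegate is a placeholder for whatever download/edit/upload code you already run against your ShareFileClient.

using System;
using System.Text;
using System.Threading;

// Buffers pending updates and pushes them out on a timer.
public class BufferedFileUpdater : IDisposable
{
    private readonly StringBuilder _pending = new StringBuilder();
    private readonly object _lock = new object();
    private readonly Timer _timer;
    private readonly Action<string> _flush; // placeholder: your existing download/edit/upload logic

    public BufferedFileUpdater(Action<string> flush, TimeSpan interval)
    {
        _flush = flush;
        _timer = new Timer(_ => Flush(), null, interval, interval);
    }

    public void Append(string update)
    {
        lock (_lock) { _pending.AppendLine(update); }
    }

    public void Flush()
    {
        string batch;
        lock (_lock)
        {
            if (_pending.Length == 0) return;
            batch = _pending.ToString();
            _pending.Clear();
        }
        _flush(batch); // one round trip to storage per interval
    }

    public void Dispose()
    {
        _timer.Dispose();
        Flush();
    }
}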

Document Management System - Where to Store Files?

I'm in charge of building an ASP.NET MVC Document Management System. It has to be able to do basic document management tasks like adding, editing and searching entries, and also perform versioning.
Anyway, I'm targeting PDF, Office and many image formats as the file attached to each document entry in the database. My question is: what design guidelines do pros follow when building the storage mechanism? Do they store the document files in the file system? The database? How is file uploading handled?
I used to upload the files to a temporary location while the user was editing the data and move them to permanent storage when the user confirmed the entry creation. Is this good? Any suggestions for improvement?
Files should generally be stored on a filesystem, rather than a database.
You will, however, have to consider some other things:
Are you planning on ever supporting load-balancing, replication, etc for your system?
If so, you'll need to support saving / loading files from a network location of some sort.
This can be trickier than you may imagine.
Are you planning to secure access to the files?
If so, you'll need to ensure they can't be read by someone who happens to know the URL.
eg: by returning the file as an attachment to a request.
This also prevents user-provided files from being executed on your server - eg someone uploading an .aspx or .exe file and then accessing it.
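To make the "return the file as an attachment" point concrete, here is a rough ASP.NET MVC sketch. The repository and DocumentInfo types are placeholders for whatever your data layer looks like; the key points are that the files live in a folder the web server does not serve directly, and that every request goes through an authorized controller action.

using System.Web.Mvc;

public interface IDocumentRepository
{
    DocumentInfo Find(int id);
}

public class DocumentInfo
{
    public string PhysicalPath { get; set; }   // path outside the web root
    public string ContentType { get; set; }
    public string FileName { get; set; }
}

public class DocumentsController : Controller
{
    private readonly IDocumentRepository _repository;

    public DocumentsController(IDocumentRepository repository)
    {
        _repository = repository;
    }

    [Authorize] // only authenticated/authorized users ever reach the bytes
    public ActionResult Download(int id)
    {
        DocumentInfo document = _repository.Find(id);
        if (document == null)
            return HttpNotFound();

        // Stream the file back as an attachment; there is no direct URL to the file
        // on disk, and uploaded content is never executed by the server.
        return File(document.PhysicalPath, document.ContentType, document.FileName);
    }
}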

asp.net mvc file upload security breach

OK, apart from checking the file type and file size server side, how can I avoid a security breach during file upload that could compromise my system (basically from an advanced hacker)? Is this enough? I am not talking about any special scenario, just a simple file upload.
Well, security is one of the major concerns in my application.
You could make sure you don't store those files to a folder which is publicly accessible and executable by the web server. Also you could use heuristics by checking the first few bytes of the file for known file formats. For example common image formats have standard beginning headers so you might check for the presence of those headers.
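For example, here is a small helper that checks the leading bytes of the uploaded stream against the signatures of a few common image formats. The signature list is deliberately minimal; extend it to cover the formats you actually accept.

using System.IO;
using System.Linq;

public static class ImageSignature
{
    // Magic numbers for a few common image formats.
    private static readonly byte[][] Signatures =
    {
        new byte[] { 0xFF, 0xD8, 0xFF },                               // JPEG
        new byte[] { 0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A }, // PNG
        new byte[] { 0x47, 0x49, 0x46, 0x38 },                         // GIF ("GIF8")
        new byte[] { 0x42, 0x4D },                                     // BMP ("BM")
    };

    public static bool LooksLikeImage(Stream stream)
    {
        var header = new byte[8];
        int read = stream.Read(header, 0, header.Length);
        stream.Position = 0; // rewind so the caller can still save the file

        return Signatures.Any(sig =>
            read >= sig.Length && sig.SequenceEqual(header.Take(sig.Length)));
    }
}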

How do CMS upload images?

I am just playing around and trying to make a very simple CMS. Right now, what I do is use "FtpWebRequest" to get the file that they want to change and stick it into a jQuery plugin called htmlarea (a rich HTML editor).
Now I am wondering how I could allow them to add images that are not already hosted, i.e. not on ImageShack or something.
I am guessing I need to somehow upload the file and then store it somewhere but not sure how to do all that.
Thanks
A common approach for CMS systems that need to work in low-trust environments (like shared hosting) is to use the FileUpload control, and save the uploaded file as a binary (BLOB) in a database. This avoids dealing with the headache of disk access rights on the web server.
If you're using SQL Server, here's a great article on the database side of things (storing images as BLOBs).
The .NET side of things is pretty straightforward. The FileUpload.PostedFile property has all the information about the uploaded file, including a byte stream of its data.
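A rough sketch of that approach, assuming an Images table with Name, ContentType and Data (varbinary(max)) columns; the table and the "CmsDb" connection string name are assumptions for this example.

using System.Configuration;
using System.Data.SqlClient;
using System.IO;
using System.Web.UI.WebControls;

public static class ImageStore
{
    // Saves the posted file as a BLOB in the database.
    public static void Save(FileUpload upload)
    {
        if (!upload.HasFile)
            return;

        byte[] data;
        using (var reader = new BinaryReader(upload.PostedFile.InputStream))
        {
            data = reader.ReadBytes(upload.PostedFile.ContentLength);
        }

        string connectionString = ConfigurationManager.ConnectionStrings["CmsDb"].ConnectionString;
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "INSERT INTO Images (Name, ContentType, Data) VALUES (@name, @type, @data)", connection))
        {
            command.Parameters.AddWithValue("@name", Path.GetFileName(upload.PostedFile.FileName));
            command.Parameters.AddWithValue("@type", upload.PostedFile.ContentType);
            command.Parameters.AddWithValue("@data", data);

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}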

Best process for auto-zipping of multiple MP3s

I've got a project which requires a fairly complicated process and I want to make sure I know the best way to do this. I'm using ASP.NET C# with Adobe Flex 3. The app server is Mosso (a cloud server) and the file storage server is Amazon S3. The existing site can be viewed at NoiseTrade.com
I need to do this:
Allow users to upload MP3 files to an album "widget"
After the user has uploaded their album/widget, I need to automatically zip the mp3s (for other users to download) and upload the zip along with the mp3 tracks to Amazon S3
I actually have this working already (using client-side processing in Flex) but this no longer works because of Adobe's Flash 10 "security" update. So now I need to implement this server-side.
The way I am thinking of doing this is:
Store the mp3s in a temporary folder on the app server
When the artist "publishes", create a zip of the files in that folder using a C# library
Start the Amazon S3 upload process (zip and mp3s) and email the user when it is finished (as well as deleting the temporary folder)
The major problem I see with this approach is that if a user deletes or adds a track later on, I'll have to update the zip file but the temporary files will no longer exist.
I'm at a loss at the best way to do this and would appreciate any advice you might have.
Thanks!
The bit about updating the zip but not having the temporary files if the user adds or removes a track leads me to suspect that you want to build zips containing multiple tracks, possibly complete albums. If this is incorrect and you're just putting a single mp3 into each zip, then StingyJack is right and you'll probably end up making the file (slightly) larger rather than smaller by zipping it.
If my interpretation is correct, then you're in luck. Command-line zip tools frequently have flags which can be used to add files to or delete files from an existing zip archive. You have not stated which library or other method you're using to do the zipping, but I expect that it probably has this capability as well.
MP3's are compressed. Why bother zipping them?
I would say it is not necessary to zip a compressed file format; you are only going to get about a five percent reduction in file size, give or take. MP3s don't really zip up well because, by their nature, they have already compressed most of the data.
DotNetZip can zip up files from C#/ASP.NET. I concur with the prior posters regarding the compressibility of MP3s: DotNetZip will automatically skip compression on MP3s and just store the file, for exactly this reason. It may still be useful to use a zip as a packaging/archive container, aside from the compression.
If you change the zip file later (the user adds a track), you could grab the .zip file from S3 and just update it. DotNetZip can update zip files, too. But in this case you would have to pay the transfer cost into and out of S3.
DotNetZip can do all of this with in-memory handling of the zips, though that may not be feasible for large archives with lots of MP3s and lots of concurrent users.
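A small sketch of both operations with DotNetZip (the Ionic.Zip library): building the initial album zip, and later updating an existing archive when a track is added or removed. The paths and entry names are placeholders.

using System.Collections.Generic;
using Ionic.Zip;

public static class AlbumZipper
{
    // Build the initial album zip from a set of MP3 paths.
    public static void CreateAlbumZip(IEnumerable<string> mp3Paths, string zipPath)
    {
        using (var zip = new ZipFile())
        {
            foreach (string path in mp3Paths)
            {
                // MP3s are stored rather than recompressed, so this is mostly packaging.
                zip.AddFile(path, "");
            }
            zip.Save(zipPath);
        }
    }

    // Update an existing zip (e.g. pulled back down from S3) when the track list changes.
    public static void UpdateAlbumZip(string zipPath, string addedTrackPath, string removedEntryName)
    {
        using (ZipFile zip = ZipFile.Read(zipPath))
        {
            if (!string.IsNullOrEmpty(addedTrackPath))
                zip.AddFile(addedTrackPath, "");

            if (!string.IsNullOrEmpty(removedEntryName) && zip.ContainsEntry(removedEntryName))
                zip.RemoveEntry(removedEntryName);

            zip.Save(); // rewrites the archive in place
        }
    }
}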
