File upload to cloud from application without exposing credentials - C#

I have an image upload application (C# desktop) for end users and I want to switch to cloud storage (the VPS is too expensive and unlimited hosting providers don't allow image hosting scripts).
In order to do that I would need to embed the login credentials inside my application (delivered to the end users) and perhaps update them as they change. That isn't a solution for me, for security reasons (someone could crack the app and upload things that aren't images).
One solution would be to host a PHP script so that my application uploads to that script (which checks whether the file is an image) and then re-uploads the image to the cloud storage. The problem is that I would use double the bandwidth.
Is there any cloud hosting model that enables this (without paying double bandwidth or additional fees for running the PHP application or other technology), or is there a way to "hide" the credentials?
I have about 5000 unique visitors a day with about 70 users online at any moment (Google Analytics). I am offering a free service in a free app without ads, so I have no earnings and I would really like to keep it that way with minimal hosting costs :(

Try out Google App Engine and Blobstore. It's relatively easy to use (sorry, no PHP support tho!) and is free up to a specified limit, so you can develop without having to spend.
https://developers.google.com/appengine/docs/python/blobstore/overview
How it works:
1. Your C# app GETs a short Python script that simply returns a URL containing a random key (that Google gives your Python script).
2. Your C# app POSTs to that URL using an appropriate MIME type.
3. Your C# app is redirected to a final URL once the upload completes. That URL is again a Python script, which then records the uploaded URL (and any metadata) to prevent it from becoming orphaned.
Another advantage of using Google App Engine is that it offers image transforms (crop, scale) that are handled on their end -- allowing you to ask for thumbnails, for example, without having to download the full-sized image first.
For step 3, you should record the URL (really just the Blobstore entry ID) in your main application. The point is: you don't want to lose that random key, because it's the only way of retrieving the image file.
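Here is a minimal C# sketch of that flow, assuming two hypothetical App Engine handlers: /get_upload_url (which would call blobstore.create_upload_url() and return the one-time URL) and a completion handler that records the blob key and echoes it back. It illustrates the three steps above; it is not a drop-in implementation.

```csharp
// Minimal sketch of the two-step Blobstore upload from a C# desktop client.
// The endpoint path "/get_upload_url" and the returned blob key format are
// assumptions about your own App Engine handlers.
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class BlobstoreUploader
{
    static readonly HttpClient client = new HttpClient();

    public static async Task<string> UploadImageAsync(string imagePath)
    {
        // Step 1: ask the App Engine script for a one-time upload URL.
        string uploadUrl = await client.GetStringAsync(
            "https://your-app-id.appspot.com/get_upload_url"); // hypothetical endpoint

        // Step 2: POST the image to that URL as multipart/form-data.
        using (var form = new MultipartFormDataContent())
        using (var fileStream = File.OpenRead(imagePath))
        {
            var fileContent = new StreamContent(fileStream);
            fileContent.Headers.ContentType =
                new System.Net.Http.Headers.MediaTypeHeaderValue("image/jpeg");
            form.Add(fileContent, "file", Path.GetFileName(imagePath));

            // Step 3: Blobstore redirects to your completion handler, which
            // records the blob key server-side and returns it to the client.
            HttpResponseMessage response = await client.PostAsync(uploadUrl, form);
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync(); // e.g. the blob key
        }
    }
}
```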


Mitigating the risk of Server-Side Request Forgery when downloading files with the .NET Framework

Question: If I have an untrusted, user-supplied URL to a file, how do I protect myself against server-side request forgery when I download that file? Are there tools in the .NET Framework (4.8) base class library that help me, or is there some canonical reference implementation for this use case?
Details: Our web application (an online product database) allows users to upload product images. We have the requirement that users should be allowed to supply the URL to a (self-hosted) image instead of uploading an image.
So far, so good. However, sometimes our web application will have to fetch the image from the (external, user-supplied) URL to do something with it (for example, to include the image in a PDF product data sheet).
This exposes my web application to the risk of Server-Side Request Forgery. The OWASP Cheat Sheet documents this use case as "Case 2" and suggests mitigations such as validating URLs and blacklisting known internal IP addresses.
This means that I cannot use the built-in methods for downloading files such as WebClient or HttpWebRequest, since those classes take care of DNS resolution, and I need to validate IP addresses after DNS resolution but before performing the HTTP request. I could perform DNS resolution myself and then create a web request with the (validated) IP address and a custom Host header, but that might mess up TLS certificate checking.
To make a long story short, I feel like I am reinventing the wheel here, for something that sounds like a common-enough use case. (I am surely not the first web developer who has to fetch files from user-supplied URLs.) The .NET Framework has tools for protection against CSRF built-in, so I'm wondering if there are similar tools available for SSRF that I just haven't found.
Note: There are similar questions (such as this one) in the ssrf tag, but, contrary to them, my goal is not to "get rid of a warning", but to actually protect my system against SSRF.
Confirm the requirement with the business stakeholders. It's very possible they don't care how the file is obtained -- they just want the user to be able to specify a URL rather than a local file. If that is the case, your application can use JavaScript to download the file in the browser and then upload it from there. This avoids the server-side problem completely.
If you have to do it server-side, ask for budget for a dedicated server. Locate this in your DMZ (between the perimeter firewall and the firewall that isolates your web servers from the rest of your network). Use this server to run a program that downloads the URLs and puts the data where your main application can get it, e.g. a database.
If you have to host it on your existing hardware, use a dedicated process, running in a dedicated application pool with a dedicated user identity. The proper location for this service is on your web server (not application or database servers).
Audit and monitor the security logs for the dedicated user.
Revoke any permission to private keys or local resources such as the filesystem.
Validate the protocol (http or https only).
To the extent possible, validate the IP address, and maintain a blacklist.
Validate the domain name to ensure it is a public URL and not something within your network. If possible, use a proxy server with public DNS.
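A minimal C# sketch of the resolve-then-validate idea from the list above (the helper names are mine; it deliberately ignores DNS rebinding, where a second lookup at request time could return a different address, and a real deployment would pin the connection to the validated IP):

```csharp
// Sketch only: resolve the host first, reject loopback/private/link-local
// addresses, and only then download over http/https.
using System;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Net.Sockets;
using System.Threading.Tasks;

static class SafeDownloader
{
    public static async Task<byte[]> DownloadAsync(string url)
    {
        var uri = new Uri(url);
        if (uri.Scheme != Uri.UriSchemeHttp && uri.Scheme != Uri.UriSchemeHttps)
            throw new InvalidOperationException("Only http/https URLs are allowed.");

        // Resolve first so we can inspect the addresses before any request is made.
        IPAddress[] addresses = await Dns.GetHostAddressesAsync(uri.DnsSafeHost);
        if (addresses.Length == 0 || addresses.Any(IsPrivateOrReserved))
            throw new InvalidOperationException("URL resolves to an internal address.");

        using (var client = new HttpClient())
            return await client.GetByteArrayAsync(uri);
    }

    static bool IsPrivateOrReserved(IPAddress ip)
    {
        if (IPAddress.IsLoopback(ip)) return true;
        if (ip.AddressFamily == AddressFamily.InterNetwork)
        {
            byte[] b = ip.GetAddressBytes();
            return b[0] == 10                                 // 10.0.0.0/8
                || (b[0] == 172 && b[1] >= 16 && b[1] <= 31)  // 172.16.0.0/12
                || (b[0] == 192 && b[1] == 168)               // 192.168.0.0/16
                || (b[0] == 169 && b[1] == 254);              // 169.254.0.0/16 (link-local)
        }
        return ip.IsIPv6LinkLocal || ip.IsIPv6SiteLocal;      // basic IPv6 checks only
    }
}
```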

Opening cmd from a website page - PHP or other option?

I built a WPF C# application that maps drives to the local machine. However, I'm trying to do the same from a website. Is that possible?
I'm doing it by opening cmd with Process.Start in C# in my application.
I read about the exec() function in PHP, but can it help me reach my goal, or should I try something else?
In short, no you can't do what you want from a web application.
A website runs on a remote server and (rightly) has no control over the machine of the end user who views it. All that happens is the user's browser makes a request to the remote website, and the website returns some HTML (essentially HTML is just text in an agreed format) to display inside the browser. JavaScript code can run to manipulate the structure of the page within the browser, and can send/receive more data from the server.
But for security reasons the JavaScript cannot do anything which changes the state of the end user's machine. It is caged within the browser environment. It also cannot get much useful information about the user's machine without the user's explicit, manual consent. Think about what would happen if it could: malicious websites (or, arguably worse, malicious code injected into legitimate-looking websites) could steal information from your machine, delete files, map unwanted drives (!), change passwords and so on -- without having to ask your permission. The web would be unusable for all practical purposes.

.NET Upload files to Amazon S3 directly from client

I am building an app using ASP.NET C# MVC5. You can add records to a database and these records can have images attached to them (it can be a single image for now if it makes it more complex).
I have seen some other posts on storage uploads but I don't see them working for me:
Upload File to Amazon S3
Upload files directly to Amazon S3 from ASP.NET application
How to upload a file to amazon S3 without passing it by a server?
Idea:
Upload a file (image) to Amazon S3 without the data passing through my server.
Persist any data on the form that was entered by the user (I'm happy with a post-back, modal or no-refreshing of the page)
Save a database value in a File table with the name and type of the file etc. that was uploaded (this will also be linked to the item that they are associated with)
Process:
User selects file to upload and presses upload button
File is uploaded directly to Amazon (C# async method? AJAX?)
A thumbnail version is also created and uploaded to Amazon using the same method ('-thumb' is appended to the filename)
Database records are created with the filename, Amazon file key, file type etc. for both files
(optional) thumbnail is then shown on-screen next to the uploader
User presses 'Save' to save the record along with the image information
When the user goes to save the item information (which could be creating a new record or editing an existing one), the file information can be attached (if it wasn't already attached at the point of uploading the file)
Constraints:
Avoid a client-based solution, for improved app performance and accessibility across platforms (so avoid JavaScript, and definitely avoid Flash)
Minimise the possibility of the user uploading a file and forgetting to save it (or the browser crashing and losing the link to the file that has been uploaded)
If I knew more about ASP.NET MVC I would have more suggestions or ideas of my own. I wonder about using async methods or AJAX calls to stop the page reloading and losing inputted data.
I have downloaded the Amazon SDK and built some classes around it, but realised that it wasn't that useful if I'm not uploading via my app!
I took samples for client-side upload from https://github.com/floatingfrisbee/amazonfileupload which was fairly useful (I managed to get it working in my app for client-side uploads), but there is a lot missing from that solution for my problem and I would like a neat, reliable solution to this more complex problem.
Any ideas very welcome!! I'm really struggling to come up with something.
UPDATE 15.12.2016
To clarify, I would like to minimise the amount of client-side processing to the point that a low-powered mobile device wouldn't struggle to process the request (obviously if the device can't handle a simple image upload then I don't need to worry).
In terms of server-side processing, the main thing I want to avoid is uploading the image via the server, for bandwidth reasons; any other processing I am very happy to have happen on the server.
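For reference, one commonly suggested pattern that fits these constraints is to have the server hand out a short-lived pre-signed PUT URL, so the client uploads straight to S3 and only the key and metadata are posted back to the MVC app. A rough sketch with the AWS SDK for .NET (the bucket name and expiry are placeholders):

```csharp
// Sketch: the server generates a pre-signed PUT URL; the file bytes never
// pass through the web server, only the resulting key does.
using System;
using Amazon.S3;
using Amazon.S3.Model;

public class PresignedUrlFactory
{
    private readonly IAmazonS3 _s3 = new AmazonS3Client(); // credentials/region from config

    public string CreateUploadUrl(string fileKey, string contentType)
    {
        var request = new GetPreSignedUrlRequest
        {
            BucketName = "my-upload-bucket",        // placeholder bucket
            Key = fileKey,                          // e.g. "items/123/photo.jpg"
            Verb = HttpVerb.PUT,
            ContentType = contentType,
            Expires = DateTime.UtcNow.AddMinutes(15)
        };
        return _s3.GetPreSignedURL(request);
    }
}
```

The client would then PUT the file (and the '-thumb' version) to the returned URL and post the key back to a controller action that creates the File record, which keeps the uploaded blob from becoming orphaned even if the page is never saved.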

Implement "cloud saving" in C#

I have a game I've been working on that I want to do a sort of "cloud saving" with. My issue is securely uploading save files so that we don't expose our website or FTP server. Right now, I'm using FTP with a severely restricted account that has access to /saves, but it also has access to each user's save directory. Malicious destruction of save data was solved with some clever design, and it's not what I'm worried about.
I am worried about someone getting hold of the FTP account I use to log in (which wouldn't be too hard, because it has to be stored in code) and using it to make multiple connections and upload massive files. I don't want to place an upload restriction on the account, because all of my users have to use the same account for uploading, and I don't want legitimate users running into issues. However, this still presents an issue.
Users have a WordPress username and password they use to launch the game, and the launcher validates permissions through WordPress. Ideally, when people buy the game I'd like to create a directory for them, as well as a username, a password and an upload limit of probably 10MB/day, but I doubt our hosting service provides this, so I'm looking at alternate methods.
tl;dr How do I restrict users of my game into a specific directory with an upload limit, potentially without using FTP? I tried to do uploading with PHP before, but it's generally frowned upon when a remote PHP script tries to access files on a user's machine without any sort of FORM element. I guess it might work if I could initiate some sort of upload from the client... I'd still have to find a way to prevent malicious uploads, though.
Any ideas, anyone? This is something I'd really like to do, and to do it I need to make it secure against attacks.
Thanks!
Isn't this the kind of problem that web services were created to solve? You can create a web service and integrate it with your user database, so your game would call the service to upload and download the data with an authentication token from WordPress. It won't stop anyone from DDoSing your web service, but at least there is no risk of a leaked password. Do note, according to this article, there's a hard limit on the uploaded data of 4MB per request. Of course, you can simply split the file before sending it and handle the joining on the server.
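As a rough illustration (not a drop-in implementation), the service could look something like the ASP.NET Web API sketch below. WordPressAuth stands in for whatever token validation your launcher already performs, and the route, save path and size limit are placeholders:

```csharp
// Sketch of a save-upload endpoint: the game sends a token plus the save data,
// the server validates the token, enforces a size cap, and writes the file
// into that user's own directory. No FTP credentials ever ship with the game.
using System.IO;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;

public class SaveController : ApiController
{
    private const long MaxSaveBytes = 10 * 1024 * 1024; // placeholder per-upload cap

    [HttpPost]
    [Route("api/saves/{userToken}")]
    public async Task<IHttpActionResult> Upload(string userToken)
    {
        // Placeholder: validate the WordPress token server-side.
        if (!WordPressAuth.IsValid(userToken))
            return Unauthorized();

        // Reject oversized uploads before reading the body into memory.
        if (Request.Content.Headers.ContentLength > MaxSaveBytes)
            return StatusCode(HttpStatusCode.RequestEntityTooLarge);

        byte[] data = await Request.Content.ReadAsByteArrayAsync();

        // Each user gets a dedicated directory keyed off their account.
        string userDir = Path.Combine(@"D:\saves", WordPressAuth.UserIdFor(userToken));
        Directory.CreateDirectory(userDir);
        File.WriteAllBytes(Path.Combine(userDir, "save.dat"), data);

        return Ok();
    }
}
```

A daily quota (the 10MB/day idea) would be tracked in the same user database and checked in the handler before writing.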

ASP.NET: Clustered environment communication Problem

I'm pretty new to ASP.NET programming and I'm a bit confused by a problem I'm facing right now.
We, as devs in my company, live in a clustered environment like the one shown in the figure.
Nothing really special.
You see, the IIS websites are duplicated on every front-end server. The business logic resides on back ends that sit, together with the DB and the NAS file system, behind a firewall.
So communication between the public space and the protected one is permitted only through particular channels, with particular requests, to particular IPs and protocols.
Now, we were asked to build a site where the user can customize his own environment, upload images that will be displayed on his home page, and other features.
In a classical configuration, a user uploads an image that is written to a folder in the site root, and then the HTML refers to that image to populate whatever control displays it.
But when a user connects to our site, the load balancer will choose one particular front end, which is not the same for every session.
So a user will upload his file, disconnect and then come back only to find that his image is gone, because the load balancer has routed his request to a different front end where the image does not exist.
Hence the need to write a piece of code that pulls the file from the NAS behind the firewall.
The upload part is straightforward, and I understand it.
My problem is:
when the user connects to his page, how do I reference in the HTML an image that is not on the machine the site is running on, but on a completely hidden file system?
I thought of writing a WCF service that serves the image as a byte stream, but which ASP.NET control should I use on the page to put the stream content on, and how?
I know that asking the experts' community will bring me the best way to accomplish this.
Hope this is clear enough.
Thanks so much for the replies, and excuse me for the bad English.
Instead of using your file system for user images, I would recommend going for a CDN solution like Amazon S3 or another one you like. This way you don't need to care where the image is stored or requested from; it is always accessed through the CDN URL (returned when you upload an image to the CDN programmatically), and that URL is kept in your database. Using a CDN relieves you of many concerns that you would face when storing images at your own location.
If you don't want to use S3, your option of developing a simple WCF/web service to upload and serve the images is good enough.
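To make the second option concrete: rather than binding a stream to a control, you can expose an MVC action that returns the bytes and point a plain img tag at it. A rough sketch (ImageStore is a placeholder for the WCF/back-end call that fetches the file from the NAS):

```csharp
// Sketch: the front end proxies the image bytes from the protected back end,
// so the HTML only ever references a URL on the web server itself.
using System.Web.Mvc;

public class ImagesController : Controller
{
    public ActionResult Show(int id)
    {
        // Placeholder call to the service behind the firewall.
        byte[] imageBytes = ImageStore.GetImageBytes(id);
        return File(imageBytes, "image/jpeg");
    }
}
```

In the view, the image is then referenced like any other URL, e.g. <img src="@Url.Action("Show", "Images", new { id = Model.ImageId })" />, so no special control is needed.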
