I'm fairly new to ASP.NET programming and I'm a bit confused by a problem I'm facing right now.
We, as devs at my company, live in a clustered environment like the one shown in the figure.
Nothing really special.
You see, the IIS websites are duplicated on every front-end server. The business logic resides on back ends that sit, together with the DB and a NAS file system, behind a firewall.
So communication between the public space and the protected one is permitted only through particular channels, with particular requests, to particular IPs and protocols.
Now, we were asked to build a site where the user can customize his own environment, upload images that will be displayed on his home page, and so on.
In a classical configuration, a user uploads an image that is written to a folder in the site root, and the HTML then refers to that image to populate whatever control displays it.
But when a user connects to our site, the load balancer will pick one particular front end, and it's not the same one for every session.
So a user will upload his file, disconnect, and then come back only to find that his image is gone, because the load balancer has routed his request to a different front end where the image does not exist.
Hence the need to write a piece of code that pulls the file from the NAS behind the firewall.
The upload part is trivial, and I understand it.
My problem is:
when the user connects to his page, how do I reference in the HTML an image that's not on the machine the site is running on, but on a completely hidden file system?
I thought of writing a WCF service that serves the image as a byte stream, but which ASP.NET control do I put the stream's content in on the page, and how?
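For illustration, this is the kind of thing I have in mind: a generic handler that proxies the byte stream, so a plain <img> (or an asp:Image with its ImageUrl set) can point at it. A rough sketch only; ImageServiceClient is a hypothetical WCF proxy for the back-end service:

    // ImageHandler.ashx.cs: rough sketch, not production code.
    // ImageServiceClient is a hypothetical WCF proxy generated from the
    // back-end service contract; GetImageBytes(userId) is assumed to
    // fetch the file from the NAS behind the firewall.
    using System.Web;

    public class ImageHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            string userId = context.Request.QueryString["user"];

            using (var client = new ImageServiceClient())
            {
                byte[] imageBytes = client.GetImageBytes(userId);

                context.Response.ContentType = "image/jpeg";
                context.Response.BinaryWrite(imageBytes);
            }
        }

        public bool IsReusable
        {
            get { return true; }
        }
    }

The page would then just reference <img src="/ImageHandler.ashx?user=123" /> and never know the bytes came from behind the firewall. Is that the right approach?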
I know that asking the experts' community will bring me the best way to accomplish this.
Hope this is clear enough.
Thanks so much for the replies, and excuse my English.
Instead of using your file system for user images, I would recommend a CDN-backed storage solution like Amazon S3 (or another one you like). This way you don't need to care where the image is stored or requested from; it is always accessed through its CDN URL (returned when you upload the image programmatically), and that URL is what you keep in your database. Using a CDN relieves you of many of the concerns you would face storing files at your own location.
Even if you don't want to use S3, your own option is good enough: develop a simple WCF/web service to upload and serve the images.
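For the programmatic upload, something along these lines with the AWS SDK for .NET (a rough sketch; the bucket name and key scheme are placeholders):

    using Amazon;
    using Amazon.S3;
    using Amazon.S3.Model;

    public class CdnUploader
    {
        // Uploads a local file to S3 and returns the URL to store in the database.
        public string UploadUserImage(string filePath, string userId)
        {
            using (var client = new AmazonS3Client(RegionEndpoint.EUWest1))
            {
                string key = "user-images/" + userId + "/" +
                             System.IO.Path.GetFileName(filePath);

                client.PutObject(new PutObjectRequest
                {
                    BucketName = "my-site-images",  // placeholder bucket
                    Key = key,
                    FilePath = filePath
                });

                // Persist this URL (or just the key) against the user.
                return "https://my-site-images.s3.amazonaws.com/" + key;
            }
        }
    }

Credentials come from your config/app settings, so nothing sensitive lives in the page code.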
I am building an app using ASP.NET C# MVC5. You can add records to a database, and these records can have images attached to them (a single image is fine for now, if multiple images make it more complex).
I have seen some other posts on storage uploads but I don't see them working for me:
Upload File to Amazon S3
Upload files directly to Amazon S3 from ASP.NET application
How to upload a file to amazon S3 without passing it by a server?
Idea:
Upload a file (image) to Amazon S3 without the data passing through my server.
Persist any data on the form that was entered by the user (I'm happy with a post-back, modal or no-refreshing of the page)
Save a row in a File table with the name, type, etc. of the uploaded file (this will also be linked to the item it is associated with)
Process:
User selects a file to upload and presses the upload button
File is uploaded directly to Amazon (a C# async method? AJAX?); see the pre-signed-URL sketch after this list
A thumbnail version is also created and uploaded to Amazon by the same method ('-thumb' is appended to the filename)
Database records are created with the filename, Amazon file key, file type, etc. for both files
(optional) The thumbnail is then shown on-screen next to the uploader
User presses 'Save' to save the record along with the image information
When the user goes to save the item information (could be creating a new record or editing an existing one), the file information is attached (if it wasn't already attached at the point of uploading the file)
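One way to cover step 2 (a sketch under assumptions, not a definitive implementation): have a controller hand out a short-lived pre-signed PUT URL, so the browser sends the bytes straight to S3 and they never pass through your server. The bucket and key names below are placeholders:

    using System;
    using System.Web.Mvc;
    using Amazon;
    using Amazon.S3;
    using Amazon.S3.Model;

    public class UploadController : Controller
    {
        // Returns a pre-signed URL the client can PUT the file to directly.
        public JsonResult GetUploadUrl(string fileName, string contentType)
        {
            using (var client = new AmazonS3Client(RegionEndpoint.EUWest1))
            {
                string key = "uploads/" + Guid.NewGuid() + "/" + fileName;

                string url = client.GetPreSignedURL(new GetPreSignedUrlRequest
                {
                    BucketName = "my-app-uploads",   // placeholder
                    Key = key,
                    Verb = HttpVerb.PUT,
                    ContentType = contentType,
                    Expires = DateTime.UtcNow.AddMinutes(10)
                });

                // Recording the key as "pending" here would let a cleanup job
                // delete uploads that are never saved (see the constraints below).
                return Json(new { uploadUrl = url, fileKey = key },
                            JsonRequestBehavior.AllowGet);
            }
        }
    }

Note the actual PUT from the browser still needs a small script (or an HTML form POST against a signed S3 policy), which rubs against the no-JavaScript constraint below, so treat this as one option rather than the answer.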
Constraints:
Avoid client-based solutions, for app performance and accessibility across platforms (so minimise JavaScript, and definitely no Flash)
Minimise the chance of the user uploading a file and forgetting to save (or of the browser crashing and losing the link to the file that has been uploaded)
If I knew more about ASP.NET MVC I would have more suggestions or ideas. I wonder about using async methods or AJAX calls to stop the page reloading and losing entered data.
I have downloaded the Amazon SDK and built some classes around it, but realised that it wasn't much use if I'm not uploading via my app!
I took samples for client-side upload from https://github.com/floatingfrisbee/amazonfileupload, which was fairly useful (I managed to get it working in my app for client-side uploads), but there is a lot missing from that solution for my problem, and I would like a neat, reliable solution for my more complex case.
Any ideas very welcome!! I'm really struggling to come up with something.
UPDATE 15.12.2016
To clarify, I would like to minimise the amount of client-side processing, to the point that a low-powered mobile device wouldn't struggle with the request (obviously if the device can't handle a simple image upload then I don't need to worry).
In terms of server-side processing, the main thing I want to avoid is routing the image through the server, for bandwidth reasons; I am happy for any other processing to happen on the server.
I have an ASP.NET Web API (MVC) application. On the site I need to show images. These images live on a separate server on the local network. I've tried the following:
<img src='file://///server_name/...'>
I tried various src addresses and only that one worked, and only in IE; I got errors in Chrome and Firefox. I know these are security limitations.
So my question is: what is the best approach to get images from a separate server? I also have a WCF application on the main server. Maybe there's a better way, or maybe I should write a console application (or another web application), host it on the image server, and have it serve HTTP requests from my main site? Performance is the main goal for me.
I can't move the images, because there are hundreds of terabytes of them, and I don't want to use ASP.NET impersonation.
Map a virtual directory in IIS Manager that points at the image share.
And then reference the image:
<img src='/MyVirtualDirectory/photo.jpg'>
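If a virtual directory isn't workable (for example, you want auth or logging around each image), a small Web API controller can stream files off the share instead. A sketch, assuming the app-pool identity can read \\server_name\images:

    using System.IO;
    using System.Net;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Web.Http;

    public class ImagesController : ApiController
    {
        private const string ShareRoot = @"\\server_name\images"; // placeholder

        [HttpGet]
        public HttpResponseMessage Get(string name)
        {
            // Strip any directory parts so callers can't walk the share.
            string path = Path.Combine(ShareRoot, Path.GetFileName(name));
            if (!File.Exists(path))
                return Request.CreateResponse(HttpStatusCode.NotFound);

            var response = Request.CreateResponse(HttpStatusCode.OK);
            // Streaming (rather than buffering a byte[]) keeps memory flat,
            // which matters at hundreds of terabytes.
            response.Content = new StreamContent(File.OpenRead(path));
            response.Content.Headers.ContentType = new MediaTypeHeaderValue("image/jpeg");
            return response;
        }
    }

Then the markup is just <img src='/api/images?name=photo.jpg'>.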
I have an image upload application (C# desktop) for end users, and I want to switch to cloud storage (a VPS is too expensive, and unlimited hosting providers don't allow image-hosting scripts).
To do that I would need to embed the login credentials inside my application (delivered to the end users) and perhaps update them as they change. That's not a solution for me, for security reasons (cracking the app and uploading things that aren't images).
One option would be to host a PHP script, so that my application uploads to that script (which checks that the file is an image) and then re-uploads it to the cloud storage. The problem is that I'd use double the bandwidth.
Is there any cloud hosting model that allows this (without paying double bandwidth, or extra fees for running the PHP application or other technology), or a way to 'hide' the credentials?
I have about 5000 unique visitors a day, with about 70 users online at any moment (Google Analytics). I'm offering a free service in a free app without ads, so I have no earnings, and I would really like to keep it that way, with minimal hosting costs :(
Try out Google App Engine and the Blobstore. It's relatively easy to use (sorry, no PHP support, though!) and is free up to a specified limit, so you can develop without having to spend.
https://developers.google.com/appengine/docs/python/blobstore/overview
How it works:
Your C# app GETs a short Python script that simply returns a URL containing a random key (which Google gives your Python script).
Your C# app POSTs to that URL using an appropriate MIME type.
Your C# app is redirected to a final URL once the upload completes. That URL is again a Python script, which then records the uploaded URL (and any metadata) to prevent it from becoming orphaned.
Another advantage of using Google App Engine is that they offer image transforms (crop, scale) that will be handled on their end -- allowing you to ask for thumbnails, for example, without having to download the full sized image first.
For step 3, you could record the URL (really just the Blobstore entry ID) in your main application. The point is: you don't want to lose that random key, because it's the only way of retrieving the image file.
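From the C# side, the two round trips in steps 1 and 2 are plain HTTP; a rough sketch (the endpoint paths are placeholders for whatever your Python handlers expose):

    using System.Net.Http;
    using System.Threading.Tasks;

    public class BlobstoreUploader
    {
        public async Task<string> UploadAsync(string imagePath)
        {
            using (var http = new HttpClient())
            {
                // Step 1: ask the App Engine script for a one-time upload URL.
                string uploadUrl = await http.GetStringAsync(
                    "https://my-app.appspot.com/get-upload-url"); // placeholder

                // Step 2: POST the image as multipart/form-data to that URL.
                using (var form = new MultipartFormDataContent())
                using (var file = new StreamContent(System.IO.File.OpenRead(imagePath)))
                {
                    file.Headers.ContentType =
                        new System.Net.Http.Headers.MediaTypeHeaderValue("image/jpeg");
                    form.Add(file, "file", System.IO.Path.GetFileName(imagePath));

                    // Step 3: the response (after Blobstore's redirect) can carry
                    // back the blob key your recording handler stored.
                    HttpResponseMessage response = await http.PostAsync(uploadUrl, form);
                    return await response.Content.ReadAsStringAsync();
                }
            }
        }
    }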
The place where I work has two servers and a load balancer. The setup is horrible, since I have to manually make sure both servers have the same files. I know there are ways to automate this, but it hasn't been implemented; hopefully soon (I have no control over this).
I wrote an application that collects a bunch of information from a user, then creates a folder named after the user's email on one of the servers. The problem is that I can't control which server the folder gets created on. Say a user comes in, fills in his stuff, and his folder gets created on server 1. The user goes away for a while, comes back to the site, and this time the load balancer throws him onto server 2. Now the user does something that needs to be saved into his folder, but since it wasn't created on this server, an error occurs. What can I do about this? Any suggestions?
Thanks
It sounds like you could solve a few issues by using a cloud file service for the file writes, such as Amazon S3: http://aws.amazon.com/s3/
Disk size management would no longer be a concern
Files are now written and read from S3 so load balancer concerns are solved
You get the benefits of AWS's semi-edge network (not truly edge, but in my experience better than most internally hosted solutions)
Don't store your data in the file system, store it in a database.
If you really can't avoid using the file system, you could look at storing the files in a network share both servers have access to. This would be a terrible hack, however.
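A rough sketch of the database route, assuming a UserFiles table with a varbinary(max) Contents column (the table and column names are made up):

    using System.Data.SqlClient;

    public static class UserFileStore
    {
        // Saves the uploaded bytes in SQL Server, so every server behind
        // the load balancer sees the same data.
        public static void Save(string connectionString, string userEmail,
                                string fileName, byte[] contents)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                "INSERT INTO UserFiles (UserEmail, FileName, Contents) " +
                "VALUES (@email, @name, @contents)", conn))
            {
                cmd.Parameters.AddWithValue("@email", userEmail);
                cmd.Parameters.AddWithValue("@name", fileName);
                cmd.Parameters.AddWithValue("@contents", contents);

                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }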
It sounds like you may be having a session-state issue. It sounds odd the way you describe it, but have a look at this article. It's old, but it covers the basics. If it doesn't help, try googling "asp.net session state web farm".
http://ondotnet.com/pub/a/dotnet/2003/03/24/sessionstate.html
Use a NAS or SAN to centralize storage. That same network-accessible storage can hold the shared configuration that IIS can be set up to use.
Web Deploy v2 has just been released by Microsoft; I would encourage the powers that be to investigate it, along with Application Request Routing and the greater Web Farm Framework.
This is a normal infrastructure setup. Below are the two commonly used solutions for the situation you are in.
If you have network-attached storage available (e.g. NetApp), you can use it to centrally store all of the user files that need to be available across every server in your web farm.
Redesign your application to store all user specific data in a database.
I need to let a company push information up to my site.
The best way to explain what I am talking about is to explain how it is currently done with their previous website:
This company uploads a CSV file to an FTP server set up by the website. The website then processes the CSV file and loads it into a SQL database so that it can be used by the website.
In this case, I am the website and I am working with the company. Both sides are willing to change what they do. So my question is...
What is the best way to accept batch information like this? Is there a more automated way that doesn't involve FTP? In the future I may have a lot of companies wanting to do this, and I'd hate to have to set up accounts for each one.
The project is C# / ASP.NET / MS SQL.
Let me know if you need more information...
Set up a web service to accept incoming data. That way you can validate immediately and reject bad data before it ever gets into your system.
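For example, something like this Web API endpoint (a sketch; the Record fields are placeholders for whatever the CSV columns actually are):

    using System.Collections.Generic;
    using System.Web.Http;

    public class Record
    {
        public string Sku { get; set; }     // assumed columns
        public decimal Price { get; set; }
    }

    public class BatchController : ApiController
    {
        [HttpPost]
        public IHttpActionResult Post(List<Record> records)
        {
            if (records == null || records.Count == 0)
                return BadRequest("Empty batch.");

            // Reject bad rows before anything touches the database.
            foreach (var r in records)
            {
                if (string.IsNullOrWhiteSpace(r.Sku) || r.Price < 0)
                    return BadRequest("Invalid record: " + r.Sku);
            }

            // TODO: bulk-insert into SQL Server (e.g. with SqlBulkCopy).
            return Ok(records.Count + " records accepted.");
        }
    }

Each company then just needs an API key instead of a full FTP account.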
If you want to eliminate FTP, you could let them upload files to your site using the FileUpload control. Once the file is uploaded, you can do your server-side processing.
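A minimal code-behind sketch (FileUploadControl is assumed to be the designer-generated field for an <asp:FileUpload> in the markup):

    using System;
    using System.IO;
    using System.Web.UI;

    public partial class UploadPage : Page
    {
        protected void UploadButton_Click(object sender, EventArgs e)
        {
            if (FileUploadControl.HasFile &&
                Path.GetExtension(FileUploadControl.FileName)
                    .Equals(".csv", StringComparison.OrdinalIgnoreCase))
            {
                // App_Data is never served to browsers, so it's a safe drop folder.
                string path = Server.MapPath("~/App_Data/" +
                    Path.GetFileName(FileUploadControl.FileName));
                FileUploadControl.SaveAs(path);
                // ...parse the CSV and load it into SQL Server here...
            }
        }
    }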
EDIT: From the OP's comments, this seems to be an automated process. That said, if their process generates the file, you could:
Allow them to continue their current process, which would involve generating their file and placing it somewhere it can be accessed via a URI with authentication; you could then fetch the file on a schedule and process it. From what it seems right now, they generate a file and upload it to your FTP server, so there is a manual element to begin with.
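The scheduled pull could be as simple as this (a sketch; the URL and credentials are placeholders):

    using System.Net;

    public static class BatchFetcher
    {
        // Downloads the partner's latest export over authenticated HTTPS.
        public static void FetchLatest()
        {
            using (var client = new WebClient())
            {
                client.Credentials = new NetworkCredential("feeduser", "secret"); // placeholder
                client.DownloadFile(
                    "https://partner.example.com/exports/latest.csv", // placeholder
                    @"C:\Feeds\latest.csv");
            }
        }
    }

A Windows scheduled task (or a Quartz.NET job) runs it hourly and hands the file to your existing CSV import routine.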