I have a service for uploading files which works well.
Is it possible to submit a file upload to an ASP.NET method (like a normal upload), but then forward the uploaded file's details to the remote service? In other words, use the ASP.NET method as a proxy rather than as the actual upload handler?
The actual file saving will be done at the remote service.
Note
I'm using C# and .NET 3.5
Regards
If you are using an ASP.NET method, the file MUST be uploaded to the server. However, it doesn't have to be saved with the "SaveAs" method or any other method. You can access the file directly as a stream, which you can pass to your other service if that service accepts streams.
The idea is explained in this blog post (slightly different use but same idea):
http://weblogs.asp.net/meligy/archive/2008/02/18/unit-test-friendly-file-upload-handling-in-n-tier-applications.aspx
So, if your remote service call can be simplified as a method like:
public void MyServiceMethod(Stream inputStream) { ........ }
You can pass the file content from the page, without saving it, along these lines:
myService.MyServiceMethod(myFileUploadControl.PostedFile.InputStream);
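For instance, the forwarding itself is just a buffered copy from the posted file's stream to whatever stream your service client exposes. A minimal sketch (the helper name is my own; note that Stream.CopyTo does not exist in .NET 3.5, hence the manual loop):

```csharp
using System;
using System.IO;

public static class UploadForwarder
{
    // Copies the posted file's stream to the service stream without
    // ever saving anything to disk. Returns the number of bytes copied.
    public static long Forward(Stream input, Stream serviceStream)
    {
        byte[] buffer = new byte[8192];
        long total = 0;
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            serviceStream.Write(buffer, 0, read);
            total += read;
        }
        return total;
    }
}
```

In the page you would call `UploadForwarder.Forward(myFileUploadControl.PostedFile.InputStream, serviceStream)` if your proxy hands you a writable stream, or simply pass `PostedFile.InputStream` directly as in the call above.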
Related
A little background on our system: we have an MVC application that creates and displays forms, then posts those forms to a controller within the MVC application, which performs verification etc.
I want to use this to upload a file (currently via a POST, with the controller pulling out the HttpPostedFileBase) and have it send that file to a separate application API, which will pull out the file information, store the information in the database, and save the file under a generic name.
I have a method that can do all the pull-apart/save-file work, and I have a controller that accepts my form post and gets all the relevant data, including an HttpPostedFileBase. What I need is a way to send that file (which is not saved yet) over to our API.
We are hoping to avoid turning the file into a base64 string.
This is in C#.
Have you looked at this answer:
Web API: how to access multipart form values when using MultipartMemoryStreamProvider?
I think it will provide some ideas on how to handle streaming files in memory.
Solution we used:
1. take the HttpPostedFileBase from the multipart form
2. create a byte array
3. stream the file contents into the byte array
4. convert the byte array to a Base64 string
5. add it to a JSON object along with the file headers (name and extension)
6. POST the JSON object to the API using HttpClient
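The steps above can be sketched as follows. The API URL and JSON field names are made up, and the JSON is built by hand here only to keep the example dependency-free; in practice a serializer such as JavaScriptSerializer or Json.NET would be used:

```csharp
using System;
using System.Net.Http;
using System.Text;

public static class FilePoster
{
    // Builds the JSON payload: file bytes as Base64, plus the
    // file headers (name and extension) as plain fields.
    public static string BuildPayload(byte[] fileBytes, string name, string extension)
    {
        string base64 = Convert.ToBase64String(fileBytes);
        return "{\"name\":\"" + name + "\",\"extension\":\"" + extension +
               "\",\"content\":\"" + base64 + "\"}";
    }

    // Posts the payload to the API (the URL is hypothetical).
    public static void Post(string json)
    {
        using (var client = new HttpClient())
        {
            var content = new StringContent(json, Encoding.UTF8, "application/json");
            client.PostAsync("http://example.com/api/files", content).Wait();
        }
    }
}
```

In the controller, the byte array would come from the posted file, e.g. `new BinaryReader(file.InputStream).ReadBytes(file.ContentLength)`.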
My ASP.NET MVC application will be deployed to a series of load-balanced web servers. One problem I'm still working out is how to handle dynamically-uploaded file content, such as user-uploaded images -- obviously, saving them on the server where they were uploaded won't allow them to be accessed from the other servers in the load balanced group.
Currently I'm planning to save these to a shared storage location, specifically a UNC path referring to a directory on our NAS; but I'm not sure how best to retrieve these files to display them to the client. I'm thinking I'll need to write a custom route handler of some kind to retrieve them from the non-web-accessible storage location on the server side and then stream them back to the client. This seems relatively straightforward to do, yet I'm struggling with how to begin to approach this in ASP.NET.
Another solution I've considered is creating a Virtual Directory in each application directory which points to the network directory.
I've even considered uploading the files to Amazon S3 (via the file upload handling code) and using CloudFront to deliver them, but I'd rather avoid the external service dependency.
Which approach do you recommend, and are there established best practices or even existing components/libraries available for accomplishing this sort of thing?
In ASP.NET MVC you can handle this with a controller action, like so:
public class SharedImageController : Controller {
    public ActionResult GetImage(String imageId) {
        String uncPath = GetImageUncLocationFromId( imageId );
        // FileResult is abstract; the File() helper returns a FilePathResult
        // that streams the file from the UNC path.
        return File( uncPath, "image/jpeg" ); // change content type as appropriate
    }
}
and in your HTML:
<img src="<%= Url.Action("GetImage", "SharedImage", new { imageId = "SomeImage.jpg" }) %>" alt="Some descriptive text" />
You could make a custom HtmlHelper extension method to make this less error-prone if you'll be using this a lot.
I think there are two ways to fix it:
1. Use a tool to synchronize the files between machines.
This duplicates the files, so each machine has its own copy.
2. Upload the files to a network address like //192.168.1.1/upload, host a site on IIS (e.g. img.domain.com) pointing at it, and use that domain in your image URLs.
The files are not duplicated, but you must make sure browsers can reach them; the image domain is not load balanced.
Or upload the files to a cloud service.
I have a custom ASP.Net server in a WinForms application. The server was created using CreateApplicationHost and a simple HttpWorkerRequest implementation.
I find that the custom server only processes requests for aspx files. If I try to access xml / txt / png files from the browser, it gives a "The resource cannot be found." error.
My question is: what must be done to be able to serve such files?
The answer is that the HttpWorkerRequest.SendResponseFromFile method must be overridden to send the file via the response stream.
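The override itself just needs to stream the requested byte range of the file back to the client. A helper along these lines (names are my own) can be called from your SendResponseFromFile(string, long, long) override, writing to whatever output stream your worker request uses:

```csharp
using System;
using System.IO;

public static class StaticFileSender
{
    // Streams "length" bytes of the file, starting at "offset", into the
    // response stream in buffered chunks. Intended to be called from an
    // HttpWorkerRequest.SendResponseFromFile override.
    public static long SendRange(string filename, long offset, long length, Stream response)
    {
        using (var fs = new FileStream(filename, FileMode.Open, FileAccess.Read))
        {
            fs.Seek(offset, SeekOrigin.Begin);
            byte[] buffer = new byte[8192];
            long sent = 0;
            while (sent < length)
            {
                int toRead = (int)Math.Min(buffer.Length, length - sent);
                int read = fs.Read(buffer, 0, toRead);
                if (read <= 0) break; // file shorter than requested range
                response.Write(buffer, 0, read);
                sent += read;
            }
            return sent;
        }
    }
}
```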
Hi all,
I am able to return a stream from my WCF RESTful JSON webservice, and everything works fine. But when I mix the stream with another piece of data (both wrapped into a custom class), consuming the webservice from my client gives the error "An existing connection was forcibly closed by the remote host".
Any advice on how I can achieve the above? What my webservice needs is to allow downloading of a file, with the file length as an additional piece of information for validation at the client end.
Thanks in advance! :)
There are various restrictions when using Stream in WCF service contracts: as per this MSDN link, only one (output) parameter or return value (of type Stream) can be used while streaming.
In another MSDN documentation page (a good resource in any case if you want to stream large data over WCF), it is hinted that you can combine a stream with some additional input/output data by using a Message Contract.
For example, see this blog post where the author used an explicit message contract to upload both the file name and the file data. You would do the same thing from the download perspective.
Finally, if nothing works then you can always push the file length as a custom (or standard such as content-length) HTTP header. If you are hosting in IIS then enable ASP.NET compatibility and use HttpContext.Current.Response to add your custom header.
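The download side can then be modeled with an explicit message contract, where the file length travels as a message header and the file body is the single Stream member (a sketch with made-up names; WCF requires that when the body is streamed, all other data sits in headers):

```csharp
using System.IO;
using System.ServiceModel;

[MessageContract]
public class FileDownloadResponse
{
    // Scalar data must travel in message headers when streaming.
    [MessageHeader]
    public long FileLength { get; set; }

    // Exactly one body member is allowed, and it must be the Stream.
    [MessageBodyMember]
    public Stream FileData { get; set; }
}

[ServiceContract]
public interface IFileService
{
    [OperationContract]
    FileDownloadResponse Download(string fileName);
}
```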
Is there a way to upload a file from the local filesystem to a folder on a server using ASMX web services (no WCF, don't ask why :))?
UPD
P.S. The file size can be 2-10 GB.
Sure:
[WebMethod]
public void Upload(byte[] contents, string filename)
{
    // Note: the whole file arrives as a single byte[] in memory,
    // so this is only practical for reasonably small files.
    var appData = Server.MapPath("~/App_Data");
    // Path.GetFileName strips any directory components from the
    // client-supplied name, guarding against path traversal.
    var file = Path.Combine(appData, Path.GetFileName(filename));
    File.WriteAllBytes(file, contents);
}
then expose the service, generate a client proxy from the WSDL, invoke, standard stuff.
--
UPDATE:
I see your update now about handling large files. The MTOM protocol with streaming which is built into WCF is optimized for handling such scenarios.
When developing my free tool to upload large files to a server, I was also using .NET 2.0 and web services.
To make the application more error tolerant for very large files, I decided not to upload one large byte[] array but instead do a "chunked" upload.
I.e. to upload a 1 MB file, I call my upload SOAP function 20 times, each call passing a byte[] array of 50 KB, and concatenate the chunks back together on the server.
I also number the packages; when one drops, I retry uploading it several times.
This makes the upload more error tolerant and the UI more responsive.
If you are interested, this is a CodeProject article about the tool.
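The chunked approach can be sketched as follows; the SOAP proxy call is replaced by a delegate so the splitting and retry logic stands alone, and the chunk size and retry count are arbitrary choices:

```csharp
using System;
using System.Collections.Generic;

public static class ChunkedUploader
{
    // Splits data into ordered chunks of at most chunkSize bytes.
    public static List<byte[]> Split(byte[] data, int chunkSize)
    {
        var chunks = new List<byte[]>();
        for (int offset = 0; offset < data.Length; offset += chunkSize)
        {
            int size = Math.Min(chunkSize, data.Length - offset);
            byte[] chunk = new byte[size];
            Array.Copy(data, offset, chunk, 0, size);
            chunks.Add(chunk);
        }
        return chunks;
    }

    // Sends each numbered chunk via the supplied upload call (e.g. a
    // generated SOAP proxy method), retrying a failed chunk up to
    // maxRetries times before giving up.
    public static void Upload(byte[] data, int chunkSize,
                              Action<int, byte[]> uploadChunk, int maxRetries)
    {
        var chunks = Split(data, chunkSize);
        for (int i = 0; i < chunks.Count; i++)
        {
            for (int attempt = 0; ; attempt++)
            {
                try { uploadChunk(i, chunks[i]); break; }
                catch { if (attempt >= maxRetries) throw; }
            }
        }
    }
}
```

On the server, the numbered chunks are appended in order to rebuild the original file.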
For very large files, the only efficient way to send them to web services is with MTOM. And MTOM is only supported in WCF, which you have ruled out. The only way to do this with old-style .asmx web services is the answer that @Darin Dimitrov gave. And with that solution, you'll have to suffer the cost of the file being Base64 encoded (33% more bandwidth).
We had the same requirement, basically uploading a file via HTTP POST using the standard FileUpload controls on the client side.
In the end we just added an ASPX page to the ASMX web service project (after all, it's just a web project). This allowed us to upload to e.g. http://foo/bar/Upload.aspx when the web service was at http://foo/bar/baz.asmx, and it kept the functionality within the web service project, even though it uses a separate web page.
This might or might not fit your requirements. @Darin's approach would work as a workaround as well, but you would have to make modifications on the client side for that, which wasn't an option for us.
You can try converting the file to Base64, passing it as a string to the service, and then converting it back to a byte array on the server.
https://forums.asp.net/t/1980031.aspx?Web+Service+method+with+Byte+array+parameter+throws+ArgumentException