Is there a way to upload a file from the local filesystem to a folder on a server using ASMX web services (no WCF, don't ask why :))?
UPD
P.S. The file size can be 2-10 GB.
Sure:
[WebMethod]
public void Upload(byte[] contents, string filename)
{
    // Resolve the target folder and strip any client-supplied path information
    var appData = Server.MapPath("~/App_Data");
    var file = Path.Combine(appData, Path.GetFileName(filename));
    File.WriteAllBytes(file, contents);
}
then expose the service, generate a client proxy from the WSDL, invoke, standard stuff.
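For completeness, the client side is then just a matter of calling the generated proxy, something like this (a sketch; the proxy class name and file path are hypothetical):

// Proxy class as generated by "Add Web Reference" from the service's WSDL
var client = new UploadService();
var path = @"C:\temp\report.pdf";
// Note this buffers the whole file in memory, which is exactly what breaks
// down for multi-GB files (see the update below)
client.Upload(File.ReadAllBytes(path), Path.GetFileName(path));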
--
UPDATE:
I see your update now about handling large files. The MTOM protocol with streaming, which is built into WCF, is optimized for exactly this scenario.
When developing my free tool to upload large files to a server, I was also using .NET 2.0 and web services.
To make the application more error tolerant for very large files, I decided not to upload one large byte[] array but instead do a "chunked" upload.
I.e. to upload a 1 MB file, I call my upload SOAP function 20 times, each call passing a 50 KB byte[] array, and concatenate them together again on the server.
I also count the packages; when one drops, I try to upload it again several times.
This makes the upload more error tolerant and more responsive in the UI.
If you are interested, there is a CodeProject article about the tool.
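A minimal sketch of the idea, assuming a hypothetical UploadChunk web method and the usual generated client proxy (all names and the App_Data target are illustrative):

[WebMethod]
public void UploadChunk(string filename, int chunkIndex, byte[] chunk)
{
    var path = Path.Combine(Server.MapPath("~/App_Data"), Path.GetFileName(filename));
    // Create the file on the first chunk, append on later ones; the index also
    // lets the server detect a dropped chunk so the client can resend it
    using (var fs = new FileStream(path, chunkIndex == 0 ? FileMode.Create : FileMode.Append))
    {
        fs.Write(chunk, 0, chunk.Length);
    }
}

and the client loop:

const int ChunkSize = 50 * 1024; // 50 KB per SOAP call
using (var fs = File.OpenRead(localPath))
{
    var buffer = new byte[ChunkSize];
    int read, index = 0;
    while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
    {
        var chunk = new byte[read];
        Array.Copy(buffer, chunk, read);
        proxy.UploadChunk(Path.GetFileName(localPath), index++, chunk); // retry here on failure
    }
}

Note the server-side append assumes the chunks arrive in order, which holds when the client sends them sequentially as above.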
For very large files, the only efficient way to send them to web services is with MTOM. And MTOM is only supported in WCF, which you have ruled out. The only way to do this with old-style .asmx web services is the answer that @Darin Dimitrov gave. And with that solution, you'll have to suffer the cost of the file being Base64-encoded (33% more bandwidth).
We had the same requirement: basically uploading a file via HTTP POST using the standard FileUpload controls on the client side.
In the end we just added an ASPX page to the ASMX web service project (after all, it's just a web project). This allowed us to upload to e.g. http://foo/bar/Upload.aspx while the web service was at http://foo/bar/baz.asmx. It kept the functionality within the web service, even though it was using a separate web page.
This might or might not fit your requirements. @Darin's approach would work as a workaround as well, but it would have required modifications on the client side, which wasn't an option for us.
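For illustration, the code-behind of such an Upload.aspx can be as simple as this (a sketch; the form field name "file" and the App_Data target are assumptions):

protected void Page_Load(object sender, EventArgs e)
{
    // Grab the file posted by the client's FileUpload control
    HttpPostedFile posted = Request.Files["file"];
    if (posted != null && posted.ContentLength > 0)
    {
        var target = Path.Combine(Server.MapPath("~/App_Data"), Path.GetFileName(posted.FileName));
        posted.SaveAs(target);
    }
}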
You can try converting the file to Base64, passing it as a string to the service, and then converting it back to a byte array. See:
https://forums.asp.net/t/1980031.aspx?Web+Service+method+with+Byte+array+parameter+throws+ArgumentException
How to convert file to base64 in JavaScript?
The input is not a valid Base-64 string as it contains a non-base 64 character
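For illustration, the round trip is just this (a sketch; localPath is a hypothetical local file):

// Client side: encode the file contents as a Base64 string
string encoded = Convert.ToBase64String(File.ReadAllBytes(localPath));

// Server side: decode back into bytes. If the string travels through URL
// encoding on the way, '+' can arrive as ' ', which is one common cause of
// the "not a valid Base-64 string" error linked above.
byte[] decoded = Convert.FromBase64String(encoded);

Note that, as mentioned elsewhere on this page, Base64 inflates the payload by roughly 33%.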
I am using RestSharp to send a POST request; the POST request contains a zip file along with the following parameters:
request.AddParameter("username", this.username, ParameterType.GetOrPost);
request.AddParameter("userid", this.userid, ParameterType.GetOrPost);
request.AddParameter("projectid", this.projectid, ParameterType.GetOrPost);
//add the file
request.AddFile("os_serverpackage", this.filelocation);
request.AlwaysMultipartFormData = true;
In C# I am sending the request synchronously.
The file is making it to the server, we can see it coming through, so we don't think it is a RestSharp issue.
The problem we are having is that the zip file is not being properly handled by our server (a Node-backed server running on an Ubuntu machine).
So we think the problem is the way we're using C# libraries to create the zip file.
The problem: when I send the same document as a zip file created with the built-in "send to zip" feature on the Windows desktop, it works. However, when I create the zip in C#, the server fails to respond.
What is the difference between the two?
I am using C#'s native System.IO.Compression.ZipFile.CreateFromDirectory method to create the zip file.
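For reference, that call looks like this (the paths here are hypothetical):

// Requires .NET 4.5 and a reference to System.IO.Compression.FileSystem
System.IO.Compression.ZipFile.CreateFromDirectory(@"C:\work\package", @"C:\work\package.zip");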
We're not experts on zip compression, or on using .NET to compress files, but we've done some basic troubleshooting. When we open one of the zip files produced by our .NET compression on a Mac (a *nix machine of sorts), the zip is turned into a CPGZ and things look odd, as outlined in this link: http://osxdaily.com/2013/02/13/open-zip-cpgz-file/.
Another option we have is to change the way our node server handles the zip files. The server at the receiving end of the POST request is using adm-zip to extract the zip file.
Do any compression experts have any tips on ways to leverage these libraries to ensure that our zip file works cross platform?
Seems to me that you're missing this (caveat: I'm not familiar with RestSharp):
request.AddHeader("Content-Type", "application/zip, application/octet-stream");
I know that Linux servers tend to be fussy about correct Content-Type HTTP headers, and if you don't set one correctly the content may not be seen at all. At the very least you should use something like Fiddler2 to check whether your messages are being transmitted correctly.
I have a pretty big video file I upload to a web service via multipart/form-data.
It takes ~30 seconds to arrive, and I would prefer not to wait that long simply to access the parameters I send along with the file.
My question is simple, can I access parameters sent with the form without waiting for the video payload to be uploaded?
Can this be done using headers or any other methods?
Streaming vs. Buffering
It's about how the webserver is set up. For IIS you can enable Streaming.
Otherwise, by default, IIS will use 'buffering' - the whole request is loaded into memory first (IIS's memory that you can't get to) before your app running in IIS can get it.
Not using IIS? You have to figure out how to get the webserver to do the same thing.
How to stream using IIS:
Streaming large file uploads to ASP.NET MVC
Note the way the file is read in the inner loop:
// rgbBody is a reusable read buffer (the size is arbitrary);
// cbRead is how many bytes the last Read actually returned
byte[] rgbBody = new byte[32 * 1024];
int cbRead;
while ((cbRead = clientRequest.InputStream.Read(rgbBody, 0, rgbBody.Length)) > 0)
{
    fileStream.Write(rgbBody, 0, cbRead);
}
Here, instead of just saving the data as that question does, you will have to parse the XML/JSON/whatever contains the file parameters you speak of, and expect the video to be sent afterwards. You can process the parameters right away if it's a quick process, then get the rest of the video, or you can send them to a background thread.
You probably won't be able to parse it by just dumping what you have into a JSON or XML parser; there will be an unclosed tag or } at the top that isn't closed until after the video data is uploaded (however that is done). Or if it's multipart data from a form submission, as you imply, you will have to parse that partial upload yourself, instead of just asking IIS for the post data.
So this will be tricky. You can start by writing 1 KB at a time to a log file with a timestamp, to prove that you're getting the data as it comes; after that it's just a coding headache.
Getting this to work also means you'll have to have some control over the client and how it sends the data.
That's because you'll at least have to ensure it sends the file parameters FIRST!
Which concerns me, because if you have control of the client, why can't you take the simple route (as Nobody and Nkosi imply) and use two requests? You mention you need one, but why not write js client code to send the parameters first in an XHR and then the file in a second request, using a correlation ID in both to tie them together? (The server could return the ID from the first request, and you could send it with the second.)
Obviously, if you just have a form with some inputs and a file upload and do a submit, then you need one request ;-) But if you have control over the client side, you're not stuck with that.
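For instance, a rough sketch of the two-request idea, shown in C# for brevity (the endpoints, the header name, and localVideoPath are all hypothetical):

using (var client = new WebClient())
{
    // Request 1: send the parameters; the server replies with a correlation ID
    client.Headers[HttpRequestHeader.ContentType] = "application/json";
    string correlationId = client.UploadString("http://example.com/api/video-meta", "{\"title\":\"clip\"}");

    // Request 2: send the video itself, tagged with the same ID
    client.Headers["X-Correlation-Id"] = correlationId;
    client.UploadFile("http://example.com/api/video-data", localVideoPath);
}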
Good luck, there is some advanced programming here, but nothing super high-tech. You will make it work!!!
If you don't have control over the server code, you are probably stuck: if the server app's web server is buffering, the server app won't get anything until the upload completes. Of course, if you want to do something with the file parameters first, that really implies you have control of the server side ;-)
Hihi all,
I am able to return a stream from my WCF RESTful JSON web service, and everything works fine. But when I mix the stream with another piece of data (both wrapped into a custom class), consuming the web service from my client gives the error message "An existing connection was forcibly closed by the remote host".
Any advice on how I can achieve the above? What my web service needs to do is allow downloading a file, with the file length as an additional piece of information for validation at the client end.
Thanks in advance! :)
There are various restrictions when using Stream in WCF service contracts: as per this MSDN link, only one (output) parameter or return value (of type Stream) can be used while streaming.
Another MSDN article (a good resource anyway, if you want to stream large data using WCF) hints that you can combine a stream and some input/output data by using a Message Contract.
For example, see this blog post, where the author uses an explicit message contract to upload both the file name and the file data. You would do the same thing from the download perspective.
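A sketch of what the download side might look like (type and member names are illustrative; the binding would use a streamed transfer mode):

[MessageContract]
public class FileDownloadResponse
{
    // Metadata rides in a SOAP header...
    [MessageHeader(MustUnderstand = true)]
    public long FileLength;

    // ...while the single body member carries the streamed file content
    [MessageBodyMember(Order = 1)]
    public Stream FileData;
}

[ServiceContract]
public interface IFileService
{
    [OperationContract]
    FileDownloadResponse DownloadFile(string fileName);
}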
Finally, if nothing else works, you can always push the file length as a custom (or standard, such as Content-Length) HTTP header. If you are hosting in IIS, enable ASP.NET compatibility and use HttpContext.Current.Response to add your custom header.
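That fallback is essentially a one-liner once compatibility is enabled (the header name here is an assumption):

// fileLength computed earlier; X-File-Length is an arbitrary custom header
HttpContext.Current.Response.AddHeader("X-File-Length", fileLength.ToString());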
I am trying to create a web service that allows its consumers to download files (which can be very large). At the server, I have many files that need to be sent back to the consumer, so I am compressing all of them into one big zip file and streaming it back. Right now, my web service starts compressing the files when the request comes in, forms the zip file, and streams it back. Sometimes compression can take a lot of time, and the request may time out.

What can I do to avoid such situations? My solution right now is to separate the data into smaller zip files, send a response to the consumer saying there will be this many smaller files, and let the consumer send requests for the individual smaller files. So if I have a 1 GB zip file, I will break it into 10 smaller zip files and ask the consumer to request the smaller files in 10 requests.

Is this the correct approach? What problems could I be facing? Has anyone dealt with such issues before? I would be glad if you could share your experiences. Also, is it possible to start streaming a zip file without forming it fully?
Treat the request and the delivery as asynchronous operations.
The client can make a request for the file using one method. Another method can let the client know the status of the file packaging (whether they are ready for download yet). A third method can actually download the files.
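In ASMX terms, the three methods could look something like this (a sketch; BuildZip and PathForTicket are hypothetical helpers you would implement):

[WebMethod]
public string RequestPackage(string[] fileIds)
{
    // Start packaging in the background and hand back a ticket immediately
    string ticket = Guid.NewGuid().ToString("N");
    ThreadPool.QueueUserWorkItem(delegate { BuildZip(ticket, fileIds); });
    return ticket;
}

[WebMethod]
public bool IsPackageReady(string ticket)
{
    return File.Exists(PathForTicket(ticket));
}

[WebMethod]
public byte[] DownloadPackage(string ticket)
{
    return File.ReadAllBytes(PathForTicket(ticket));
}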
It may be worth looking at a RESTful approach rather than a SOAP web service. As OrbMan suggested, an asynchronous approach may be best.
With REST you could expose a resource as: http://yourlocation/generatefile
which (when called with a POST) returns an HTTP response with a status code of 202 'Accepted' and a Location header of http://yourlocation/generatefile/id00124, giving the location of the data.
You can then poll the http://yourlocation/generatefile/id00124 resource (perhaps with just a HEAD request) to get the status, i.e. processing / complete.
When processing is complete, do a GET on http://yourlocation/generatefile/id00124 to download your file. The response HTTP message should identify the file and its format, i.e. encryption and compression types, so any consumer knows how to read it.
This is a nice solution for operations that are long-running and return data in formats other than SOAP and general XML.
I hope this helps
I would poll from the calling client as part of the method which gets the file. The client code might flow something like this:
byte[] GetFile()
{
    // Ask the server to start generating the file; it answers with the resource location
    var response = request.Post("http://yourlocation/generatefile");
    string dataResource = response.Headers["Location"];

    // Poll the resource (headers only) until the server reports it is complete
    while (true)
    {
        var resH = request.Header(dataResource);
        if (resH.Headers["Status"] == "complete")
            break;
        Thread.Sleep(1000); // one second, or whatever interval suits
    }

    // Now download the generated file itself
    var fileRes = request.Get(dataResource);
    return fileRes.ToByteArray();
}
This is only pseudocode (request here stands for whatever HTTP client you use), but I hope it makes sense...
In our project we want to query a document management system for a specific document or movie. The DMS returns a URL with the document location (for example: http://mydomain.myserver1.share/mypdf.pdf or http://mydomain.myserver2.share/mymovie.avi).
We want to expose the document to internet users and intranet users. The requested file can be large (large video files).
Our architecture is like:
request goes like: webapp1 -> webapp2 -> webapp3 -> dms
response goes like: dms -> webapp3 -> webapp2 -> webapp1
webapp1 could be on the internet.
I have been thinking about how we can obfuscate the real URL from the DMS, for security reasons. I have seen implementations in other web apps where the PDF URL was obfuscated by creating a temp file for the requested document, specific to the session and user, so that users cannot easily guess the document names of other users.
My question: is there a pattern that deals with exposing sensitive company/user data to the public?
Our development is in C# 3.5.
The easiest way to handle it is to create an ashx handler (or some other way of creating a URL) and have it serve the PDF. Since WCF supports REST, you could always do it through that too. Just load the PDF into memory and push the byte contents into the response stream.
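A minimal sketch of such a handler (ResolveToken, the token-to-URL lookup, is a hypothetical piece you'd implement against your DMS):

public class DocumentHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Map the opaque per-user token back to the real DMS URL on the server,
        // so the internal location never reaches the browser
        string realUrl = ResolveToken(context.Request.QueryString["id"]);

        byte[] contents = new WebClient().DownloadData(realUrl);
        context.Response.ContentType = "application/pdf";
        context.Response.OutputStream.Write(contents, 0, contents.Length);
    }

    public bool IsReusable { get { return true; } }
}

For the large video case you would stream in chunks rather than buffering the whole file, but the obfuscation idea is the same.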
Alternatively, you might want to look into these:
http://www.microsoft.com/forefront/edgesecurity/isaserver/en/us/
http://www.isapirewrite.com/