I'm looking for a way to pause or resume an upload process via C#'s WebClient.
pseudocode:
WebClient Client = new WebClient();
Client.UploadFileAsync(new Uri("http://mysite.com/receiver.php"), "POST", @"C:\MyFile.jpg");
Maybe something like...
Client.Pause();
Any ideas?
WebClient doesn't have this kind of functionality - even the slightly-lower-level HttpWebRequest doesn't, as far as I'm aware. You'll need to use an HTTP library which gives you more control over exactly when things happen (which will no doubt involve more code as well, of course). The point of WebClient is to provide a very simple API to use in very simple situations.
As stated by Jon Skeet, this is not available in the WebClient or HttpWebRequest classes.
However, if you have control of the server that receives the upload, perhaps you could upload small chunks of the file using WebClient and have the server assemble the chunks once everything has been received. That would make it somewhat easier to build pause/resume functionality.
If you do not have control of the server, you will need to use an API that gives you more control, which in turn gives you more to worry about. And even then, the server might time you out if you pause for too long.
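To make the chunk idea concrete, here is a minimal sketch of a pausable chunked upload with WebClient. Everything protocol-related is an assumption: the query-string fields, the 64 KB chunk size, and the idea that the server-side script appends chunks identified by (id, index).

using System;
using System.IO;
using System.Net;
using System.Threading;

class ChunkedUploader
{
    const int ChunkSize = 64 * 1024;       // 64 KB per request (arbitrary)
    volatile bool paused;

    public void Pause()  { paused = true; }
    public void Resume() { paused = false; }

    // Uploads the file one chunk at a time; a hypothetical server-side
    // script reassembles the chunks in order of 'index'.
    public void Upload(string path, string url)
    {
        string id = Guid.NewGuid().ToString();
        var buffer = new byte[ChunkSize];

        using (var file = File.OpenRead(path))
        {
            int index = 0, read;
            while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
            {
                while (paused) Thread.Sleep(200);   // crude pause: stop sending

                var chunk = new byte[read];
                Array.Copy(buffer, chunk, read);

                using (var client = new WebClient())
                {
                    client.UploadData(
                        string.Format("{0}?id={1}&index={2}", url, id, index),
                        "POST", chunk);
                }
                index++;
            }
        }
    }
}

Because each chunk is its own HTTP request, pausing between chunks cannot trigger a server timeout the way pausing mid-request would.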
OK, without giving you code examples, I will tell you what you can do.
Write a WCF service for your upload; that service needs to use streaming.
Things to remember:

- The client and server need to identify the file somehow; I suggest using a Guid, so the server knows which file to append the extra data to.
- The client needs to keep track of its position in the array so it knows where to begin streaming after it resumes. (You can even get the server to tell the client how much data it has, but make sure the client knows too.)
- The server needs to keep track of how much data it has already received and how much is still missing. Files should have a lifetime on the server; you don't want half-uploaded and forgotten files stored on the server forever.
- Remember that streaming does not allow authentication, since the whole call is just one HTTP request. You can use SSL, but remember that it adds overhead.
- You will need to create the service contract at message level; the standard method won't do.
I'm currently writing a blog post about this very subject; it will be posted this week with code samples showing how to get it working. You can check it on my blog.
I know this answer does not contain code samples, but the blog post will; all in all, this is one way of doing stop and resume of file uploads to a server.
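Until that post is up, here is a rough sketch of what the message-level, streamed contract could look like. All names here (UploadChunk, IUploadService, Append, GetReceivedLength) are invented for illustration, and the binding would need transferMode set to Streamed.

using System;
using System.IO;
using System.ServiceModel;

// Hypothetical message contract: the file id and resume offset travel as
// headers, while the chunk content travels in the streamed body.
[MessageContract]
public class UploadChunk
{
    [MessageHeader] public Guid FileId;      // which file to append to
    [MessageHeader] public long Offset;      // where this data starts
    [MessageBodyMember] public Stream Data;  // the data itself
}

[ServiceContract]
public interface IUploadService
{
    [OperationContract]
    void Append(UploadChunk chunk);          // server appends Data at Offset

    [OperationContract]
    long GetReceivedLength(Guid fileId);     // lets a resuming client ask
                                             // how much the server already has
}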
To do something like this you must write your own worker thread that performs the actual HTTP POST stepwise.
Before sending each chunk of file content, check whether the operation is paused, and stop sending until it is resumed.
However, depending on the server, the connection may be closed if it isn't active for a certain period of time, and that can be just a couple of seconds.
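A sketch of that worker-thread approach, using a plain HttpWebRequest and a ManualResetEvent as the pause switch (the buffer size is arbitrary, and as noted above, a long pause may still get the connection closed by the server):

using System.IO;
using System.Net;
using System.Threading;

class PausableUpload
{
    // Signaled = running, unsignaled = paused.
    readonly ManualResetEvent gate = new ManualResetEvent(true);

    public void Pause()  { gate.Reset(); }
    public void Resume() { gate.Set(); }

    // Run this on a worker thread; it writes the file to the request
    // body in small steps, blocking on the gate between steps.
    public void Run(string path, string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "POST";
        request.AllowWriteStreamBuffering = false;    // stream as we go
        request.ContentLength = new FileInfo(path).Length;

        using (var body = request.GetRequestStream())
        using (var file = File.OpenRead(path))
        {
            var buffer = new byte[8192];
            int read;
            while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
            {
                gate.WaitOne();                       // blocks while paused
                body.Write(buffer, 0, read);
            }
        }
        using (request.GetResponse()) { }             // complete the request
    }
}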
I am currently building a very small API in PHP. Depending on the data the client is requesting, it can take hours until the data is collected and can be returned. My client is currently a C# program; it times out after a while.
Is there a way in php to notify the client that the server is still working?
I do not want to increase the client's timeout span.
I do not want to write whitespace to prevent the timeout; that would damage the format of the response (a CSV file) and would require sending the header before being sure that everything worked.
Wikipedia lists the status code 102 Processing, which notifies the client that the server is still working. This is exactly what I need. Does somebody know how to send that without canceling the execution of the script?
If you think I need to do this with threading, I can try that, but it looks like a lot of work and I would prefer a simpler way.
Thanks for reading!
The simplest solution, in my opinion, is to return a URL that the client can poll to check whether the result is ready.
This is precisely how it should behave: http://farazdagi.com/blog/2014/rest-long-running-jobs/
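Since the client here is a C# program, its side of that pattern might look roughly like the sketch below. The /start and /status endpoints, the job-id handshake, and the "pending" marker are all assumptions about how the PHP side could be laid out.

using System.Net;
using System.Threading;

class PollingClient
{
    // Starts the long-running job, then polls until the result is ready.
    public static string FetchReport(string baseUrl)
    {
        using (var client = new WebClient())
        {
            // The server answers immediately with a job id instead of
            // blocking until the data is collected.
            string jobId = client.DownloadString(baseUrl + "/start");

            while (true)
            {
                string status = client.DownloadString(
                    baseUrl + "/status?id=" + jobId);

                if (status != "pending")
                    return status;        // the finished CSV payload

                Thread.Sleep(5000);       // poll every 5 seconds
            }
        }
    }
}

This sidesteps the timeout entirely: each individual request returns quickly, so the client's timeout never has to be raised.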
I need to transfer 1 GB using a web service. I'm thinking of transferring it piecewise using MSMQ. Maybe there is an easier way?
If you CAN break the data up into smaller chunks, then do. Web services aren't designed to transport that much data in one go, so even though it's possible, it's gonna be a bumpy ride.
But the world doesn't work in an efficient way, so here's what you do:
1. Write the data as binary to a local file.
2. Create a StreamWriter that writes to your web service, using a StreamReader to read from the file.
3. If anything happens, catch the exception and try to resume from where your file pointer is.
4. If you can modify the web service, have it read the data and write it to a binary file, catching any errors and, on resume, writing any new data to the file at the current pointer.
The trick is going to be to figure out how to tell the service you're trying to resume an interrupted request.
If this isn't clear, I'll try to expand some more.
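To expand a little on steps 3 and 4, the catch-and-resume loop could look something like this. ITransferService and its two operations are stand-ins for whatever proxy your web service actually gives you:

using System;
using System.IO;

class ResumableTransfer
{
    // Stand-ins for the web service calls; both operations are assumptions.
    public interface ITransferService
    {
        long GetReceivedLength(string name);      // bytes already stored
        void Append(string name, byte[] chunk);   // append to the end
    }

    public static void Send(ITransferService service, string path)
    {
        string name = Path.GetFileName(path);
        var buffer = new byte[64 * 1024];

        while (true)
        {
            try
            {
                // Ask the server where to resume from.
                long position = service.GetReceivedLength(name);

                using (var file = File.OpenRead(path))
                {
                    file.Seek(position, SeekOrigin.Begin);
                    int read;
                    while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        var chunk = new byte[read];
                        Array.Copy(buffer, chunk, read);
                        service.Append(name, chunk);
                    }
                }
                return;       // finished without error
            }
            catch (Exception)
            {
                // Connection dropped: loop around, re-query the server's
                // position, and resume from there. A real implementation
                // should cap the number of retries.
            }
        }
    }
}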
I need to transfer 1 GB using a web service. I'm thinking of transferring it piecewise using MSMQ.
I want to transport people with a car. I think of using a plane.
Get it? Either a web service or MSMQ; they do not magically mix.
THAT SAID: a web service plus large data = bad idea. Even JSON has overhead. Streaming or non-streaming? Those are a LOT of open variables, and in most cases a web service here makes relatively little sense.
Up (sent to the service) or down (from the service)? More questions. I would not really want a 1 GB upload to a web service.
If you have to, split the data and make an API to ask for all the "parts" and then get it part by part; that also allows a progress bar to be shown. Your software MUST handle re-requests for parts due to failures which MAY happen in transit.
I would seriously consider not using a web service here if the data is binary, and just go with a REST API, at least for downloads and likely for uploads too. A lot depends on all the things you did not even know how to ask about or did not bother to describe.
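A rough sketch of that part-by-part approach with re-requests; the /parts and /part/{i} URL layout is made up, and the retry cap of three is arbitrary:

using System;
using System.IO;
using System.Net;

class PartDownloader
{
    // Downloads part 0 .. part (count-1) and concatenates them.
    public static void Download(string baseUrl, string destination)
    {
        using (var client = new WebClient())
        using (var output = File.Create(destination))
        {
            int parts = int.Parse(client.DownloadString(baseUrl + "/parts"));

            for (int i = 0; i < parts; i++)
            {
                byte[] data = FetchWithRetry(client, baseUrl + "/part/" + i);
                output.Write(data, 0, data.Length);
                Console.WriteLine("Progress: {0}/{1}", i + 1, parts);  // progress bar hook
            }
        }
    }

    static byte[] FetchWithRetry(WebClient client, string url)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                return client.DownloadData(url);
            }
            catch (WebException)
            {
                if (attempt >= 3) throw;   // give up after three tries
            }
        }
    }
}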
You could make a service that first creates a buffer at the destination, then splits the data and sends it through the service, and finally finalizes it.
Need some help figuring out what I am looking for. Basically, I need a service in which the server dumps a bunch of XML into a stream (over a period of time) and, every time a dump occurs, N clients read it.
Example: every time one of 1000 stocks goes up by 5 cents, the service dumps some XML into a stream. The connecting applications grab the information from the stream.
I don't think the connection will ever close, as there needs to be something reading the stream for new data.
This needs to adhere to WCF REST standards; is there something out there that does what I'm looking for? In the end, it's just a non-stop stream of data.
Update: it looks like the service needs to use a multipart/mixed content type.
An application I'm working on has a similar architecture, and I'm planning to use SignalR to push updates to clients, using long-polling techniques. I haven't implemented it yet, so I can't swear it will work for you, but their documentation seems promising. Update: I have implemented this now, and it works very well.
Pushing data from the server to the client (not just browser clients) has always been a tough problem. SignalR makes it dead easy and handles all the heavy lifting for you.
Scott Hanselman has a good blog post on the subject, and there is a useful article (involving WCF, REST, and SignalR) here: http://www.codeproject.com/Articles/324841/EventBroker
Instead of using WCF, have you looked into ASP.NET MVC WebAPI?
For more information about using PushStreamContent in WebAPI, Henrik has a nice blog post with an example (under the heading 'Push Content').
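As a minimal sketch of that "Push Content" pattern in Web API (the controller name, the XML payload, and the one-second timer are placeholders for a real event source):

using System.IO;
using System.Net.Http;
using System.Text;
using System.Threading;
using System.Web.Http;

public class StockFeedController : ApiController
{
    // GET api/stockfeed keeps the response open and pushes a line of
    // XML to the client whenever there is something new.
    public HttpResponseMessage Get()
    {
        var response = Request.CreateResponse();
        response.Content = new PushStreamContent((stream, content, context) =>
        {
            using (var writer = new StreamWriter(stream, Encoding.UTF8))
            {
                try
                {
                    while (true)
                    {
                        // Stand-in for a real "stock moved 5 cents" event.
                        writer.WriteLine("<tick symbol=\"ABC\" change=\"0.05\"/>");
                        writer.Flush();       // push it to the client now
                        Thread.Sleep(1000);
                    }
                }
                catch (IOException)
                {
                    // Client disconnected; disposing the writer closes
                    // the stream and ends the push.
                }
            }
        });
        return response;
    }
}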
Have you considered archived Atom feeds? They are 100% RESTful (hypermedia controls and all) and most importantly, they are very scalable.
Specifically, the archive documents never change, so you can set a cache expiry of one year or more. The subscription document is where all the newest events go and is constantly changing, but with the appropriate HTTP caching headers you can make it so that you return 304 Not Modified if nothing has changed between client requests. Also, if your service has a natural time resolution, you can set the max-age to take advantage of that. For instance, if your data has a 20-minute resolution, you could include the following header in the subscription document response:
Cache-Control: max-age=1200
That way you can let your caches do most of the heavy lifting, and the clients can poll the subscription document as often as they like without bringing your service to its knees.
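The client side of that dance is cheap too. Here is a sketch of a conditional GET with HttpWebRequest: the client sends If-Modified-Since, and on 304 Not Modified it simply reuses its cached copy (note that HttpWebRequest surfaces the 304 as a WebException):

using System;
using System.IO;
using System.Net;

class FeedPoller
{
    static DateTime lastModified = DateTime.MinValue;
    static string cachedBody;

    public static string Poll(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        if (lastModified != DateTime.MinValue)
            request.IfModifiedSince = lastModified;

        try
        {
            using (var response = (HttpWebResponse)request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                lastModified = response.LastModified;
                cachedBody = reader.ReadToEnd();
                return cachedBody;
            }
        }
        catch (WebException ex)
        {
            var response = ex.Response as HttpWebResponse;
            if (response != null &&
                response.StatusCode == HttpStatusCode.NotModified)
                return cachedBody;     // nothing new since the last poll
            throw;
        }
    }
}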
Problem: I need to download hundreds of images from different hosts. Each host has anywhere between 20 and several hundred images.
Solution: use a new WebClient every time an image needs to be downloaded, through WebClient's DownloadData method.
Or would it be better to keep a pool of open socket connections and make the HTTP requests using lower-level calls?
Is it expensive to open/close a TCP connection (I'm assuming that is what WebClient does), so that using a pool would be more efficient?
I believe the underlying infrastructure which WebClient uses will already pool HTTP connections, so there's no need to do this. You may want to check using something like Wireshark of course, with some sample URLs.
Fundamentally, I'd take the same approach to this as with other programming tasks - write the code in the simplest way that works, and then check whether it performs well enough for your needs. If it does, you're done. If it doesn't, use appropriate tools (network analyzers etc) to work out why it's not performing well enough, and use more complicated code only if it fixes the problem.
My experience is that WebClient is fine if it does what you need, but it doesn't give you quite as much fine-grained control as WebRequest. If you don't need that control, go with WebClient.
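In code, the simple-first version is just a loop over a single WebClient. If profiling later shows you want more parallel connections per host, ServicePointManager.DefaultConnectionLimit is the knob to turn (the value of 4 below is only an example, and it only matters once you parallelize the downloads):

using System;
using System.IO;
using System.Net;

class ImageFetcher
{
    public static void DownloadAll(string[] urls, string folder)
    {
        // Max HTTP connections per host (the default is 2); only relevant
        // once downloads happen in parallel.
        ServicePointManager.DefaultConnectionLimit = 4;

        using (var client = new WebClient())
        {
            foreach (string url in urls)
            {
                string name = Path.GetFileName(new Uri(url).LocalPath);
                // Keep-alive lets the same connection be reused per host.
                client.DownloadFile(url, Path.Combine(folder, name));
            }
        }
    }
}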
I use HttpWebRequest and HttpWebResponse to scrape anything I want. Unless, of course, there are services available for the requirement; but even then there are sometimes (business) limitations, and I often prefer to dig the HTML out of a pure HTTP request. Sometimes it just makes me feel more like a developer, you know...
I'm using WebClient to mine a bunch of data. To conserve bandwidth (for both the client and web server), and speed my program up, I'd like to abort certain downloads early if it becomes evident that the file I'm downloading doesn't contain the information I'm looking for.
I'd like to base this decision on the headers (MIME type and file size), and possibly some of the content.
I'm presently using webClient.DownloadData, but I'd obviously have to switch this to an asynchronous method call. However, the async version doesn't pass the information I need either (headers and data). Is there perhaps another freely available class that meets these requirements?
Something that fires an event as soon as the headers have completed downloading would be nice, and periodically with progress updates.
If you want to decide whether or not to download something based on the headers, you can also send an HTTP HEAD request, which tells the server to reply with only its headers.
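A sketch of the HEAD approach with HttpWebRequest; the filter criteria are examples, and note that not every server implements HEAD correctly:

using System.Net;

class HeadCheck
{
    // Issues a HEAD request: only headers come back, no body.
    public static bool LooksInteresting(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "HEAD";

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            return response.ContentType.StartsWith("text/") &&
                   response.ContentLength < 512 * 1024;   // example criteria
        }
    }
}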
Use the WebRequest class.
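With WebRequest/HttpWebRequest the headers are available as soon as GetResponse() returns, and the body is only transferred as you read the response stream, so you can inspect the headers and bail out before downloading any content. A sketch, with example filter criteria:

using System.IO;
using System.Net;

class SelectiveDownloader
{
    // Returns null without pulling the body if the headers look wrong;
    // disposing the response early aborts the rest of the transfer.
    public static byte[] DownloadIfWanted(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            if (response.ContentType != "text/csv" ||          // example filter
                response.ContentLength > 10 * 1024 * 1024)
                return null;                                   // body never read

            using (var stream = response.GetResponseStream())
            using (var memory = new MemoryStream())
            {
                stream.CopyTo(memory);      // read the body in full
                return memory.ToArray();
            }
        }
    }
}

The same pattern extends to peeking at the first buffer of content before deciding whether to keep reading.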