I'm trying to create a collection of FTP web requests to download a collection of files.
This was working correctly in a single thread, but now that I'm trying to do it with multiple threads I'm getting a timeout exception. I think I'm missing something pretty simple but can't seem to work it out.
Here is the code:
internal static void DownloadLogFiles(IEnumerable<string> ftpFileNames, string localLogsFolder)
{
BotFinder.DeleteAllFilesFromDirectory(localLogsFolder);
var ftpWebRequests = new Collection<FtpWebRequest>();
// Create web request for each log filename
foreach (var ftpWebRequest in ftpFileNames.Select(filename => (FtpWebRequest) WebRequest.Create(filename)))
{
ftpWebRequest.Credentials = new NetworkCredential(BotFinderSettings.FtpUserId, BotFinderSettings.FtpPassword);
ftpWebRequest.KeepAlive = false;
ftpWebRequest.UseBinary = true;
ftpWebRequest.CachePolicy = NoCachePolicy;
ftpWebRequest.Method = WebRequestMethods.Ftp.DownloadFile;
ftpWebRequests.Add(ftpWebRequest);
}
var threadDoneEvents = new ManualResetEvent[ftpWebRequests.Count];
for (var x = 0; x < ftpWebRequests.Count; x++)
{
var ftpWebRequest = ftpWebRequests[x];
threadDoneEvents[x] = new ManualResetEvent(false);
var threadedFtpDownloader = new ThreadedFtpDownloader(ftpWebRequest, threadDoneEvents[x]);
ThreadPool.QueueUserWorkItem(threadedFtpDownloader.PerformFtpRequest, localLogsFolder);
}
WaitHandle.WaitAll(threadDoneEvents);
}
class ThreadedFtpDownloader
{
private ManualResetEvent threadDoneEvent;
private readonly FtpWebRequest ftpWebRequest;
/// <summary>
///
/// </summary>
public ThreadedFtpDownloader(FtpWebRequest ftpWebRequest, ManualResetEvent threadDoneEvent)
{
this.threadDoneEvent = threadDoneEvent;
this.ftpWebRequest = ftpWebRequest;
}
/// <summary>
///
/// </summary>
/// <param name="localLogsFolder">
///
/// </param>
internal void PerformFtpRequest(object localLogsFolder)
{
try
{
// TIMEOUT IS HAPPENING ON LINE BELOW
using (var response = ftpWebRequest.GetResponse())
{
using (var responseStream = response.GetResponseStream())
{
const int length = 1024*10;
var buffer = new Byte[length];
var bytesRead = responseStream.Read(buffer, 0, length);
var logFileToCreate = string.Format("{0}{1}{2}", localLogsFolder,
ftpWebRequest.RequestUri.Segments[3].Replace("/", "-"),
ftpWebRequest.RequestUri.Segments[4]);
using (var writeStream = new FileStream(logFileToCreate, FileMode.OpenOrCreate))
{
while (bytesRead > 0)
{
writeStream.Write(buffer, 0, bytesRead);
bytesRead = responseStream.Read(buffer, 0, length);
}
}
}
}
threadDoneEvent.Set();
}
catch (Exception exception)
{
BotFinder.HandleExceptionAndExit(exception);
}
}
}
It seems to download the first two files (using two threads, I'm assuming), but then a timeout occurs when these complete and the application tries to move on to the next file.
I can confirm that the FtpWebRequest which is timing out is valid and the file exists, so I think I may have left a connection open or something.
Was going to post a comment but it's probably easier to read as an answer:
Firstly, if I set the ftpRequest.Timeout property to Timeout.Infinite, the timeout issue disappears. However, an infinite timeout is probably not best practice, so I'd prefer to solve this another way...
Debugging the code, I can see that when it gets to:
ThreadPool.QueueUserWorkItem(threadedFtpDownloader.PerformFtpRequest, localLogsFolder);
It enters the PerformFtpRequest method for each FTP web request and calls ftpWebRequest.GetResponse(), but only the first two requests progress any further. The rest stay active but go no further until the first two finish, so they are effectively left open while waiting for the other requests to complete.
I think the solution would be either allowing all the requests to execute at once (the ConnectionLimit property is having no effect here) or preventing execution from calling GetResponse until it's actually ready to use the response.
Any good ideas on the best way to solve this? At the moment all I can think of are hacky solutions which I'd like to avoid :)
Thanks!
You should get the ServicePoint for the request and set its ConnectionLimit:
ServicePoint sp = ftpRequest.ServicePoint;
sp.ConnectionLimit = 10;
The default ConnectionLimit is 2 -- that's why you're seeing that behavior.
UPDATE: See this answer for a more thorough explanation:
How to improve the Performance of FtpWebRequest?
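In the poster's code, that could look something like this (a sketch only; the limit of 10 is an arbitrary value and the rest of the request setup is unchanged):
foreach (var filename in ftpFileNames)
{
    var ftpWebRequest = (FtpWebRequest)WebRequest.Create(filename);
    ftpWebRequest.Credentials = new NetworkCredential(BotFinderSettings.FtpUserId, BotFinderSettings.FtpPassword);
    ftpWebRequest.Method = WebRequestMethods.Ftp.DownloadFile;
    // All requests to the same FTP host share one ServicePoint, so raising the limit once per host is enough.
    ftpWebRequest.ServicePoint.ConnectionLimit = 10; // arbitrary value; tune to what the server allows
    ftpWebRequests.Add(ftpWebRequest);
}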
The documentation for DataReader's DetachBuffer and DetachStream is very vague. It just says "Detaches a buffer that was previously attached to the reader".
In short
When should reader.DetachBuffer(); be used?
Background
Reading
An example read method for a SerialDevice could look something like this:
using (var reader = new DataReader(inputStream))
{
var bytesReceived = await reader.LoadAsync(EXPECTED_RESPONSE_LENGTH);
var receivedBuffer = new byte[bytesReceived];
reader.ReadBytes(receivedBuffer);
reader.DetachStream();
return receivedBuffer;
}
This code works and seems to be stable, but since I write and read multiple times a second on an embedded device, I want to avoid creating the receivedBuffer buffer each time. I modified my method to something like the code below.
byte[] _receivedBuffer = new byte[EXPECTED_RESPONSE_LENGTH];
private async Task<byte[]> ReadOnceAsync(IInputStream inputStream)
{
using (var reader = new DataReader(inputStream))
{
reader.InputStreamOptions = InputStreamOptions.Partial;
uint bytesReceived = await reader.LoadAsync(EXPECTED_RESPONSE_LENGTH);
var isExpectedLength = (bytesReceived == EXPECTED_RESPONSE_LENGTH);
if (isExpectedLength)
{
reader.ReadBytes(_receivedBuffer);
}
reader.DetachStream();
return isExpectedLength ? _receivedBuffer: null;
}
}
This code crashes my application within minutes of starting, sometimes with an Access Violation message, or within seconds if the connected device stops responding.
After I added reader.DetachBuffer(); the code is stable again, but I still don't know whether DetachBuffer should always be called, only sometimes, or not at all.
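For reference, this is where the call ended up in the method above (I put it just before DetachStream; that placement is my own choice rather than something the documentation spells out):
private async Task<byte[]> ReadOnceAsync(IInputStream inputStream)
{
    using (var reader = new DataReader(inputStream))
    {
        reader.InputStreamOptions = InputStreamOptions.Partial;
        uint bytesReceived = await reader.LoadAsync(EXPECTED_RESPONSE_LENGTH);
        var isExpectedLength = (bytesReceived == EXPECTED_RESPONSE_LENGTH);
        if (isExpectedLength)
        {
            reader.ReadBytes(_receivedBuffer);
        }
        // Release the reader's internal buffer before the reader is disposed,
        // then hand the stream back so the next call can reuse it.
        reader.DetachBuffer();
        reader.DetachStream();
        return isExpectedLength ? _receivedBuffer : null;
    }
}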
Writing
My write method does not call writer.DetachBuffer(), but I don't know if it should or not. The code is:
using (var writer = new DataWriter(outputStream))
{
writer.WriteBytes(toSend);
var bytesWritten = await writer.StoreAsync();
//Should writer.DetachBuffer(); be called?
writer.DetachStream();
return bytesWritten;
}
I am aware that this question has been asked multiple times, but the existing answers are dated and I wonder whether a solution is available today. This is not a feature request; I'm just looking for any workaround that works.
I'm using RestSharp on a client which talks to my API. The API replies with an "application/octet-stream" which can take up to several minutes to download. Link to the GitHub code here.
public void Download()
{
if (NewerApp == null) return;
var client = new RestClient(ServerAddress);
var request = new RestRequest("/Apps/", Method.POST);
request.AddParameter("id", CurrentApp.EncryptedId());
request.AddParameter("action", "download");
var asyncHandle = client.ExecuteAsync<App>(request, response => {
HandleResponseToDownloadRequest(response);
});
}
I would need to report the progress of the reception of "response" to my UI to build a progress bar or something like that. I already know the expected amount of data to be received (through a previous API response), I would just need to know how many bytes were received before the full response is received and parsed.
I don't believe RestSharp currently offers a 'report progress' type of event, right? I can see several approaches:
Use client.DownloadData(request).SaveAs(path);. Maybe the file is created while it is being downloaded, so I could read the file's size to report the progress. But my impression is that the client first downloads all the data and then saves the file, in which case that wouldn't help.
Using a stream to load the response. Evaluate the size of the stream at regular intervals, or every time the buffer grows.
Change the type of response from the API to another one (send the data in a JSON, for example?).
Any other option?
What do you think? Has anyone managed to report the progress of the download?
Can I access the RawBytes of 'response' while ExecuteAsync is still executing? Should I use Execute (without Async) instead? Should I use a stream and watch its size at regular intervals?
I managed to get a progress report, but not using RestSharp; I used System.Net.Http.HttpClient, based on kievic's answer here. As kievic highlighted, the key is that aClient.SendAsync returns once the HTTP headers have been received and read, while the content is still loading. The content is then read gradually through a stream in the while loop. RestSharp doesn't seem to enable this, or at least I couldn't achieve it.
public async Task DownloadAsync(Action<bool> callback)
{
if (NewerApp == null) return;
// Your original code.
HttpClientHandler aHandler = new HttpClientHandler();
aHandler.ClientCertificateOptions = ClientCertificateOption.Automatic;
HttpClient aClient = new HttpClient(aHandler);
aClient.DefaultRequestHeaders.ExpectContinue = false;
HttpRequestMessage message = new HttpRequestMessage(HttpMethod.Post, ServerAddress + "/Apps/");
string content = "id=" + CurrentApp.EncryptedId() + "&action=download";
message.Content = new StringContent(content);
HttpResponseMessage response = await aClient.SendAsync(message,
HttpCompletionOption.ResponseHeadersRead); // Important! ResponseHeadersRead.
// New code.
Stream stream = await response.Content.ReadAsStreamAsync();
MemoryStream memStream = new MemoryStream();
// Start reading the stream
var res = stream.CopyToAsync(memStream);
// While the copy is still running, poll its progress
while (true)
{
// Report progress
this.DownloadedSize = memStream.Length;
this.Progress = 100.0 * (double)memStream.Length / (double)NewerApp.Filesize;
// Yield briefly so this loop doesn't spin a CPU core while polling
await Task.Delay(100);
// Leave once the copy has completed
if (res.IsCompleted)
{
// Report progress one last time
this.DownloadedSize = memStream.Length;
this.Progress = 100.0 * (double)memStream.Length / (double)NewerApp.Filesize;
break;
}
}
// Get the bytes from the memory stream
byte[] responseContent = new byte[memStream.Length];
memStream.Position = 0;
memStream.Read(responseContent, 0, responseContent.Length);
// Function has ended - return whether the app was downloaded
// properly and verified, or not
callback(HandleResponseToDownloadRequest(responseContent));
}
Every time this.DownloadedSize and this.Progress are assigned a new value, they fire an event which can be caught by the UI.
private double progress = 0;
/// <summary>
/// Progress of the download of the App. From 0.0 (%) to 100.0 (%)
/// </summary>
public double Progress
{
get { return progress; }
set
{
// Max / Min
double val = value;
if (val > 100.0) val = 100;
else if (val < 0.0) val = 0.0;
// Assign value
if (progress != val)
{
progress = val;
OnProgressReport("Progress");
OnPropertyChanged("Progress");
}
}
}
public long downloadedSize = 0;
/// <summary>
/// Quantity of bytes downloaded of the app.
/// Note: more bytes can be downloaded than advertised because
/// the advertised filesize is that of the unencrypted file, while the
/// received bytes are encrypted.
/// TODO: advertise the size of the encrypted file.
/// </summary>
public long DownloadedSize
{
get
{
return downloadedSize;
}
set
{
if (downloadedSize != value)
{
downloadedSize = value;
OnDownloadedSizeReport("DownloadedSize");
}
}
}
/// <summary>
/// Fired when the progress of the download of the app file
/// has updated (more bytes received).
/// </summary>
public event PropertyChangedEventHandler ProgressReport;
protected void OnProgressReport(string name)
{
PropertyChangedEventHandler handler = ProgressReport;
if (handler != null)
{
handler(this, new PropertyChangedEventArgs(name));
}
}
/// <summary>
/// Fired when the progress of the download of the app file
/// has updated (more bytes received).
/// </summary>
public event PropertyChangedEventHandler DownloadedSizeReport;
protected void OnDownloadedSizeReport(string name)
{
PropertyChangedEventHandler handler = DownloadedSizeReport;
if (handler != null)
{
handler(this, new PropertyChangedEventArgs(name));
}
}
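On the UI side you just subscribe to these events, something like this (progressBar and statusLabel are hypothetical UI controls, not part of the code above):
// Hypothetical subscriber: update the UI whenever the updater reports progress.
Updater.ProgressReport += (sender, e) =>
{
    // Marshal to the UI thread here if your framework requires it (Dispatcher/Invoke).
    progressBar.Value = (int)Updater.Progress; // 0 .. 100
};
Updater.DownloadedSizeReport += (sender, e) =>
{
    statusLabel.Text = string.Format("{0} bytes received", Updater.DownloadedSize);
};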
I call DownloadAsync like this:
// Start the download - Task.Run executes the method in the background
// so that the UI can refresh and stay responsive.
// downloaded: bool, true if file properly downloaded
// Updater_AppDownloaded: function called once download has ended (failed or succeeded, either way)
await System.Threading.Tasks.Task.Run(() =>
Updater.DownloadAsync(downloaded =>
Updater_AppDownloaded(downloaded)));
I just want to download 'n' files from an FTP server at the same time. My code is as follows...
Each time I run this code, only one file gets downloaded and then an exception is raised on the GetResponse() line:
The remote server returned an error: (501) Syntax error in parameters or arguments.
class main
{
public static void Main()
{
Multiple_File_Downloader MFD = new Multiple_File_Downloader();
MFD.Multi_Thread();
}
}
class Multiple_File_Downloader
{
public void Multi_Thread()
{
Thread a = new Thread(new ThreadStart(() => Downloadfile("7.jpg")));
Thread b = new Thread(new ThreadStart(() => Downloadfile("8.jpg")));
a.IsBackground = true;
b.IsBackground = true;
a.Start();
b.Start();
}
public void Downloadfile(string _filename)
{
string localPath = @"E:\FTPTrialPath\";
FtpWebRequest requestFileDownload = (FtpWebRequest)WebRequest.Create("ftp://url/" + _filename);
requestFileDownload.Credentials = new NetworkCredential("Login","password");
requestFileDownload.Method = WebRequestMethods.Ftp.DownloadFile;
requestFileDownload.UsePassive = true;
using(FtpWebResponse responseFileDownload = (FtpWebResponse)requestFileDownload.GetResponse()) //<<< ERROR HERE...
{
Stream responseStream = responseFileDownload.GetResponseStream();
FileStream writeStream = new FileStream(localPath + _filename, FileMode.Create);
int Length = 2048;
Byte[] buffer = new Byte[Length];
int bytesRead = responseStream.Read(buffer, 0, Length);
while (bytesRead > 0)
{
writeStream.Write(buffer, 0, bytesRead);
bytesRead = responseStream.Read(buffer, 0, Length);
}
}
requestFileDownload = null;
}
}
Is it possible to do this without interfering with the parameters of the other thread?
Thanks for the help in advance :)
Each time you call a method it gets its own set of parameters, specific to that call only.
Unless you
Pass something by reference (by using the ref keyword)
Pass a reference type (for example a class)
Pass a value type containing a reference type (i.e. a structure containing a class)
Modify a global variable (for example a class level variable)
...there will be no problem running the same method in multiple threads.
In your code you are creating two different strings which will only be accessible by the "method instance" you passed them to.
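As a tiny illustration (a sketch, not your code), each call below gets its own copy of the string argument, so the two threads never see each other's value:
using System;
using System.Threading;

class ParameterIsolationDemo
{
    static void Print(string name)
    {
        // Each call has its own 'name' parameter; the threads do not share it.
        Console.WriteLine(name);
    }

    static void Main()
    {
        new Thread(() => Print("7.jpg")).Start();
        new Thread(() => Print("8.jpg")).Start();
    }
}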
Your problem has nothing to do with a concurrent access. Your code is perfectly thread-safe.
I see two possible problems:
You do not wait for the threads to finish, so your application aborts abruptly (the threads are background threads). The exception can be just a side effect of that abort.
Use Thread.Join to wait for the threads to finish at the end of the Multi_Thread method:
a.Join();
b.Join();
The server may have problems with multiple parallel transfers due to a lack of available ports. Did you test parallel transfers from the same server using a standalone FTP client?
I think I'm missing something about how HttpWebRequest works when streaming large file uploads.
Basically, I found out that I get a timeout exception when sending large files to the server, so a post suggested doing it asynchronously and handling the timeout myself.
The thing is that, after debugging, I found out that the "GetRequestStreamAsync" method and writing to the stream do nothing on the server side; the server is only called when doing GetResponseAsync.
So my question is:
- In the code marked as //1, the file is written to the request stream, but I don't see the memory increasing or the server receiving any request - where does the streamed data go?
This is basically my code:
HttpWebRequest request = RESTUtils.InitializeRequest(...);
request.AllowWriteStreamBuffering = false;
request.ContentLength = i_InputStream.Length;
request.Timeout = 5000;
using (Stream requestStream = request.GetRequestStreamWithTimeout())
{
if (requestStream != null) //1
{
// We will write the stream to the request
byte[] buffer = new byte[UPLOAD_FILE_BUFFER_SIZE];
int read = i_InputStream.Read(buffer, 0, buffer.Length);
while (read > 0)
{
requestStream.Write(buffer, 0, read);
read = i_InputStream.Read(buffer, 0, buffer.Length);
}
}
}
using (var response = request.GetResponseWithTimeout(-1))
{
using (var responseStream = response.GetResponseStream())
{
}
}
public static class WebRequestExtensions
{
public static Stream GetRequestStreamWithTimeout(
this WebRequest request,
int? millisecondsTimeout = null)
{
return AsyncToSyncWithTimeout(
request.BeginGetRequestStream,
request.EndGetRequestStream,
millisecondsTimeout ?? request.Timeout);
}
public static WebResponse GetResponseWithTimeout(
this HttpWebRequest request,
int? millisecondsTimeout = null)
{
return AsyncToSyncWithTimeout(
request.BeginGetResponse,
request.EndGetResponse,
millisecondsTimeout ?? request.Timeout);
}
private static T AsyncToSyncWithTimeout<T>(
Func<AsyncCallback, object, IAsyncResult> begin,
Func<IAsyncResult, T> end,
int millisecondsTimeout)
{
var iar = begin(null, null);
if (!iar.AsyncWaitHandle.WaitOne(millisecondsTimeout))
{
var ex = new TimeoutException();
throw new WebException(ex.Message, ex, WebExceptionStatus.Timeout, null);
}
return end(iar);
}
}
Thanks!
== Edit 9/9/15 ==
Something even weirder happens: if I attach a breakpoint right after GetResponseAsync, I see that the server receives the call.
After that, I kill the client process -> the file still finishes uploading successfully on the server.
This also happens if I call "Abort".
Does anyone know why?
Instead of using the old-style begin/end async pattern, you should consider switching to async/await, which would greatly simplify your code.
You would then set the Timeout property on the request to a large value to accommodate your waiting time; then, instead of the callback-based async code, you could just do:
var request = SomeMethodToCreateRequest();
request.Timeout = int.MaxValue; // (don't do this)
var response = await request.GetResponseAsync();
The timeout should be respected internally, and you get to simplify your code.
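For what it's worth, here is a sketch of the question's upload path rewritten with async/await (RESTUtils.InitializeRequest and i_InputStream are the question's own names; the argument list is elided just as in the question). One caveat: per the WebRequest documentation, Timeout only applies to the synchronous calls, so on the fully async path you may still want your own timeout handling, e.g. via Abort as in the question's extension methods:
private static async Task UploadAsync(Stream i_InputStream)
{
    HttpWebRequest request = RESTUtils.InitializeRequest(...); // arguments elided, as in the question
    request.AllowWriteStreamBuffering = false;
    request.ContentLength = i_InputStream.Length;

    // Write the body; with buffering disabled the bytes go out over the wire as you write them
    // (the server-side handler may still only run once the full body has arrived).
    using (Stream requestStream = await request.GetRequestStreamAsync())
    {
        await i_InputStream.CopyToAsync(requestStream);
    }

    // Only here is the server's reply read; this is the point where a slow server would time out.
    using (WebResponse response = await request.GetResponseAsync())
    using (Stream responseStream = response.GetResponseStream())
    {
        // consume the response as needed
    }
}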
Sorry if the title is not clear or correct; I didn't know what title to put. Please correct it if it's wrong.
I have this code to download images from an IP camera, and it downloads the images fine. The problem is: how can I download images from all the cameras at the same time when I have two or more cameras?
private void GetImage()
{
string IP1 = "example.IPcam1.com:81/snapshot.cgi";
string IP2 = "example.IPcam2.com:81/snapshot.cgi";
.
.
.
string IPn = "example.IPcamn.com:81/snapshot.cgi";
for (int i = 0; i < 10; i++)
{
string ImagePath = Server.MapPath("~\\Videos\\liveRecording2\\") + string.Format("{0}", i, i + 1) + ".jpeg";
string sourceURL = ip;
WebRequest req = (WebRequest)WebRequest.Create(sourceURL);
req.Credentials = new NetworkCredential("user", "password");
WebResponse resp = req.GetResponse();
Stream stream = resp.GetResponseStream();
Bitmap bmp = (Bitmap)Bitmap.FromStream(stream);
bmp.Save(ImagePath);
}
}
You should not run long-running code like that from an ASP.NET application; it is meant to simply respond to requests.
You should place this code in a service (Windows Services are easy), and control the service through a WCF service running inside of it.
You're also going to get into trouble because you don't have your WebResponse and Stream in using blocks.
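For example, the inner part of the loop could be wrapped like this (same calls as in the question, just disposed properly):
using (WebResponse resp = req.GetResponse())
using (Stream stream = resp.GetResponseStream())
using (Bitmap bmp = (Bitmap)Bitmap.FromStream(stream))
{
    bmp.Save(ImagePath);
}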
There are several methods that will depend on how you want to report feedback to the user. It all comes down to multi-threading.
Here is one example, using the ThreadPool. Note that this is missing a bunch of error checking throughout... It is here as an example of how to use the ThreadPool, not as a robust application:
private Dictionary<String, String> _cameras = new Dictionary<String, String> {
{ "http://example.IPcam1.com:81/snapshot.cgi", "/some/path/for/image1.jpg" },
{ "http://example.IPcam2.com:81/snapshot.cgi", "/some/other/path/image2.jpg" },
};
public void DoImageDownload() {
int finished = 0;
foreach (KeyValuePair<String, String> pair in _cameras) {
KeyValuePair<String, String> item = pair; // local copy so each work item captures its own pair
ThreadPool.QueueUserWorkItem(delegate {
BeginDownload(item.Key, item.Value);
Interlocked.Increment(ref finished); // thread-safe increment of the shared counter
});
}
while (finished < _cameras.Count) {
Thread.Sleep(1000); // sleep 1 second
}
}
private void BeginDownload(String src, String dest) {
WebRequest req = (WebRequest) WebRequest.Create(src);
req.Credentials = new NetworkCredential("username", "password");
WebResponse resp = req.GetResponse();
Stream input = resp.GetResponseStream();
using (Stream output = File.Create(dest)) {
input.CopyTo(output);
}
}
This example simply takes the work you are doing in the for loop and off-loads it to the thread pool for processing. The DoImageDownload method will return very quickly, as it is not doing much actual work.
Depending on your use case, you may need a mechanism to wait for the images to finish downloading from the caller of DoImageDownload. A common approach would be the use of event callbacks at the end of BeginDownload to notify when the download is complete. I have put a simple while loop here that will wait until the images finish... Of course, this needs error checking in case images are missing or the delegate never returns.
Be sure to add your error checking throughout... Hopefully this gives you a place to start.
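If you would rather block until everything finishes without polling a counter, one alternative (a sketch under the same assumptions as the code above) is a CountdownEvent:
// Each work item signals the countdown when it finishes; Wait() blocks until all have signalled.
using (var countdown = new CountdownEvent(_cameras.Count))
{
    foreach (KeyValuePair<String, String> pair in _cameras)
    {
        KeyValuePair<String, String> item = pair; // local copy for the closure
        ThreadPool.QueueUserWorkItem(delegate
        {
            try { BeginDownload(item.Key, item.Value); }
            finally { countdown.Signal(); }
        });
    }
    countdown.Wait();
}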