Low download speed when mouse is over window - C#

I have the following code to download some files from an FTP server:
EDIT: I've solved the problem using DotNet, a good FTP library for WPF!
public partial class MainWindow
{
DispatcherTimer dispatcherTimer = new System.Windows.Threading.DispatcherTimer();
private byte[] downloadedData;
string FTPAddress = "ftp://ftp.cluster007.ovh.net";
double currentBytes;
double oldBytes;
public MainWindow()
{
InitializeComponent();
// DispatcherTimer setup
dispatcherTimer.Tick += new EventHandler(dispatcherTimer_Tick);
dispatcherTimer.Interval = new TimeSpan(0, 0, 1);
}
public static void DoEvents()
{
Application.Current.Dispatcher.Invoke(DispatcherPriority.Background,
new Action(delegate { }));
}
private void dispatcherTimer_Tick(object sender, EventArgs e)
{
currentBytes = Dl_ProgressBar.Value;
Dl_Speed.Content = "Vitesse : " + ((currentBytes - oldBytes) / 1000000).ToString("0.00") + " Mo/s";
oldBytes = Dl_ProgressBar.Value;
// Forcing the CommandManager to raise the RequerySuggested event
CommandManager.InvalidateRequerySuggested();
}
private void Dl_Button_Click(object sender, RoutedEventArgs e)
{
downloadFile();
}
private void downloadFile()
{
downloadedData = new byte[0];
try
{
//Create FTP request
//Note: format is ftp://server.com/file.ext
FtpWebRequest request = FtpWebRequest.Create(FTPAddress + "/" + filename) as FtpWebRequest;
//Get the file size first (for progress bar)
request.Method = WebRequestMethods.Ftp.GetFileSize;
request.Credentials = new NetworkCredential(username, password);
request.UsePassive = true;
request.UseBinary = true;
request.KeepAlive = true; //don't close the connection
int dataLength = (int)request.GetResponse().ContentLength;
Dl_Status.Content = "Téléchargement en cours...";
DoEvents();
//Now get the actual data
request = FtpWebRequest.Create(FTPAddress + "/" + filename) as FtpWebRequest;
request.Method = WebRequestMethods.Ftp.DownloadFile;
request.Credentials = new NetworkCredential(username, password);
request.UsePassive = true;
request.UseBinary = true;
request.KeepAlive = false; //close the connection when done
//Set up progress bar
Dl_ProgressBar.Value = 0;
Dl_ProgressBar.Maximum = dataLength;
//Streams
FtpWebResponse response = request.GetResponse() as FtpWebResponse;
Stream reader = response.GetResponseStream();
//Download to memory
//Note: adjust the streams here to download directly to the hard drive
MemoryStream memStream = new MemoryStream();
byte[] buffer = new byte[1024]; //downloads in chunks
dispatcherTimer.Start();
while (true)
{
DoEvents(); //keep the UI responsive while downloading
int bytesRead = reader.Read(buffer, 0, buffer.Length);
if (bytesRead == 0)
{
//Nothing was read, finished downloading
Dl_ProgressBar.Value = Dl_ProgressBar.Maximum;
Dl_Percent.Content = "Progression : 100%";
DoEvents();
break;
}
else
{
//Write the downloaded data
memStream.Write(buffer, 0, bytesRead);
//Update the progress bar
if (Dl_ProgressBar.Value + bytesRead <= Dl_ProgressBar.Maximum)
{
Dl_ProgressBar.Value += bytesRead;
Dl_Percent.Content = "Progression : " + ((Dl_ProgressBar.Value / dataLength) * 100).ToString("0.00") + "%";
DoEvents();
}
}
}
//Convert the downloaded stream to a byte array
downloadedData = memStream.ToArray();
//Clean up
reader.Close();
memStream.Close();
response.Close();
Dl_Status.Content = "Téléchargement terminé";
DoEvents();
}
catch (Exception)
{
Dl_Status.Content = "Erreur de connexion au FTP";
}
}
}
My problem is that when I move the mouse over the window, the download speed drops significantly: it falls from 3.70 MB/s to 2.20 MB/s.
When the mouse is outside the window there is no problem, but when it is over the window the transfer slows down; with very short mouse movements the download speed drops to 0.20 MB/s.
I've tried using threads and the Dispatcher, but the result was the same.

To answer your specific question, WPF's Dispatcher uses a priority queue, and Input level events (like those originating from mouse movement) take priority over Background level events. Your DoEvents() method periodically drains the message queue of all events with Background priority or higher, so when you move the mouse over the window, the queue fills up with input events to process. This means that DoEvents takes longer to return, and more time elapses before you can resume processing the download.
That said, this is a terrible way to accomplish a download; you should never use this kind of DoEvents() hack in WPF; do some research on the async and await features of C# (or, if that is not an option, BackgroundWorker). You will find many examples on StackOverflow of how to perform asynchronous downloads without having to resort to this sort of Dispatcher trickery to keep the UI responsive.
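To illustrate the suggested direction, here is a minimal async/await sketch of the same download. It is an assumption-laden rewrite, not the asker's code: `filename`, `username`, and `password` are taken from the question's scope, and the `IProgress<T>` wiring and buffer size are mine.

```csharp
// Sketch: replace the DoEvents loop with an awaited background read.
// Progress<T> marshals progress reports back onto the UI thread.
private async void Dl_Button_Click(object sender, RoutedEventArgs e)
{
    var progress = new Progress<long>(bytes => Dl_ProgressBar.Value = bytes);
    downloadedData = await DownloadFileAsync(FTPAddress + "/" + filename, progress);
    Dl_Status.Content = "Téléchargement terminé";
}

private static async Task<byte[]> DownloadFileAsync(string url, IProgress<long> progress)
{
    var request = (FtpWebRequest)WebRequest.Create(url);
    request.Method = WebRequestMethods.Ftp.DownloadFile;
    request.Credentials = new NetworkCredential(username, password);

    using (var response = (FtpWebResponse)await request.GetResponseAsync())
    using (var stream = response.GetResponseStream())
    using (var memStream = new MemoryStream())
    {
        var buffer = new byte[8192];
        long total = 0;
        int bytesRead;
        while ((bytesRead = await stream.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            memStream.Write(buffer, 0, bytesRead);
            total += bytesRead;
            progress.Report(total); // no DoEvents: the UI thread stays free
        }
        return memStream.ToArray();
    }
}
```

Because the read loop runs off the UI thread, mouse input no longer competes with the download for dispatcher time.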

Related

How can I make my splash screen start and finish according to the file download time?

In my Form1 constructor I have:
while (splash_flag == true)
{
splash.Show();
Thread.Sleep(3000);
splash_flag = false;
}
if (splash_flag == false)
{
splash.Close();
}
fileDownloadRadar(remote_image_on_server,combinedTemp);
This is the fileDownloadRadar method:
HttpWebRequest request;
void fileDownloadRadar(string uri, string fileName)
{
request = (System.Net.HttpWebRequest)System.Net.WebRequest.Create(uri);
request.CookieContainer = new CookieContainer();
request.AllowAutoRedirect = true;
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
if (response.ContentType == "")
{
Logger.Write("ContentType is Empty download was not fine !!!!!");
}
if ((response.StatusCode == HttpStatusCode.OK ||
response.StatusCode == HttpStatusCode.Moved ||
response.StatusCode == HttpStatusCode.Redirect) &&
response.ContentType.StartsWith("image", StringComparison.OrdinalIgnoreCase))
{
Logger.Write("ContentType is not empty meaning download is fine");
using (Stream inputStream = response.GetResponseStream())
using (Stream outputStream = File.OpenWrite(fileName))
{
byte[] buffer = new byte[4096];
int bytesRead;
do
{
bytesRead = inputStream.Read(buffer, 0, buffer.Length);
outputStream.Write(buffer, 0, bytesRead);
} while (bytesRead != 0);
}
FinishWebRequest();
}
else
{
timer1.Stop();
timer3.Start();
}
}
The way it works now, the splash screen waits 3 seconds, closes, and then the form loads.
But I want the splash screen to open when the file download starts, to close when the download completes, and only then for the form to load.
The FinishWebRequest method is where the download is completed.
My problem is how to coordinate the timing: how can I make the splash screen start with the file download and close when it has finished?
The splash screen and the download will have to be run in different threads. You can either create a new thread or you can switch to using an asynchronous download method.
Either way, instead of a splash_flag boolean, I'd use an AutoResetEvent (it could retain the name I guess). It would start off in the Reset state. The FinishWebRequest() method would call splash_flag.Set() (or whatever you decided to name it). This would trigger the splash screen to close. You'd replace your while loop with something simple like:
splash.Show();
splash_flag.WaitOne();
splash.Close();
splash_flag.WaitOne() will hang until FinishWebRequest() calls splash_flag.Set().
EDIT: Since your splash screen has to be animated, you'll have to poll for completion instead; a blocking loop in the UI thread would make the form unresponsive.
You could just create a timer to do the polling:
splash.Show();
// Poll every 100ms. Change as desired.
var timer = new System.Timers.Timer(100) { AutoReset = true };
timer.Elapsed += (s, e) =>
{
if (splash_flag.WaitOne(0)) // Check for signal and return without blocking
{
// Use Invoke() so you only deal with the form in the UI thread
splash.Invoke(new Action(() => { splash.Close(); }));
timer.Stop();
}
};
timer.Start();

Why is the BackgroundWorker + BlockingCollection combination slower?

I have a program that accesses a database and downloads images. I was using a BlockingCollection for that purpose. However, to access some UI elements I decided to use a combination of BackgroundWorker and BlockingCollection. That reduced the processing speed considerably compared to when only the BlockingCollection was used. What can be the reason? Is it simply because I am now accessing UI elements?
Here is the code I am working on:
private void button_Start_Click(object sender, System.EventArgs e)
{
BackgroundWorker bgWorker = new BackgroundWorker();
bgWorker.DoWork += bw_DoWork;
bgWorker.RunWorkerCompleted += bw_RunWorkerCompleted;
bgWorker.ProgressChanged += bw_ProgressChanged;
bgWorker.WorkerSupportsCancellation = true;
bgWorker.WorkerReportsProgress = true;
Button btnSender = (Button)sender;
btnSender.Enabled = false;
bgWorker.RunWorkerAsync();
}
and bw_DoWork() is as follows:
{
HttpWebRequest request = null;
using (BlockingCollection<ImageFileName> bc = new BlockingCollection<ImageFileName>(30))
{
using (Task task1 = Task.Factory.StartNew(() =>
{
foreach (var fileName in fileNames)
{
string baseUrl = "http://some url";
string url = string.Format(baseUrl, fileName);
request = (HttpWebRequest)WebRequest.Create(url);
request.Method = "GET";
request.ContentType = "application/x-www-form-urlencoded";
var response = (HttpWebResponse)request.GetResponse();
Stream stream = response.GetResponseStream();
img = Image.FromStream(stream);
FileNameImage = new ImageFileName(fileName.ToString(), img);
bc.Add(FileNameImage);
Thread.Sleep(100);
Console.WriteLine("Size of BlockingCollection: {0}", bc.Count);
}
}))
{
using (Task task2 = Task.Factory.StartNew(() =>
{
foreach (ImageFileName imgfilename2 in bc.GetConsumingEnumerable())
{
if (bw.CancellationPending == true)
{
e.Cancel = true;
break;
}
else
{
int numIterations = 4;
Image img2 = imgfilename2.Image;
for (int i = 0; i < numIterations; i++)
{
img2.Save("C:\\path" + imgfilename2.ImageName);
ZoomThumbnail = img2;
ZoomSmall = img2;
ZoomLarge = img2;
ZoomThumbnail = GenerateThumbnail(ZoomThumbnail, 86, false);
ZoomThumbnail.Save("C:\\path" + imgfilename2.ImageName + "_Thumb.jpg");
ZoomThumbnail.Dispose();
ZoomSmall = GenerateThumbnail(ZoomSmall, 400, false);
ZoomSmall.Save("C:\\path" + imgfilename2.ImageName + "_Small.jpg");
ZoomSmall.Dispose();
ZoomLarge = GenerateThumbnail(ZoomLarge, 1200, false);
ZoomLarge.Save("C:\\path" + imgfilename2.ImageName + "_Large.jpg");
ZoomLarge.Dispose();
// progressBar1.BeginInvoke(ProgressBarChange);
int percentComplete = (int)(((i + 1.0) / (double)numIterations) * 100.0);
//if (progressBar1.InvokeRequired)
//{
// BeginInvoke(new MethodInvoker(delegate{bw.ReportProgress(percentComplete)};))
//}
}
Console.WriteLine("This is Take part and size is: {0}", bc.Count);
}
}
}))
Task.WaitAll(task1, task2);
}
}
}
A better option might be to make retrieving the data and writing it to disk run synchronously, and instead use Parallel.ForEach() to allow multiple requests to be in-flight at the same time. That should reduce the amount of waiting in a couple spots:
No need to wait for one HTTP request to complete before issuing subsequent requests.
No need to block on that BlockingCollection
No need to wait for one disk write to complete before firing off the next one.
So perhaps something more like this:
Parallel.ForEach(fileNames,
(name) =>
{
string baseUrl = "http://some url";
string url = string.Format(baseUrl, name);
var request = (HttpWebRequest)WebRequest.Create(url);
request.Method = "GET";
request.ContentType = "application/x-www-form-urlencoded";
var response = (HttpWebResponse)request.GetResponse();
Stream stream = response.GetResponseStream();
var img = Image.FromStream(stream);
// Cutting out a lot of steps from the 2nd Task to simplify the example
img.Save(Path.Combine("C:\\path", name.ToString()));
});
One possible problem you could run into with this approach is that it will start generating too many requests at once. That might cause resource contention issues, or perhaps the webserver will interpret it as malicious behavior and stop responding to you. You can limit the number of requests that happen at the same time by setting the MaxDegreeOfParallelism. The following example shows how you could limit the operation to process no more than 4 files at the same time.
var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };
Parallel.ForEach(fileNames, (name) => { /* do stuff */ }, options);

98% usage of CPU - simple C# service

I wrote a simple application in C#. It downloads a log file via FTP, checks whether Firefox is running on the PC, changes the log string, and uploads the log back to the server.
I am running it every 10 seconds using a Timer.
When the service starts, its memory usage is 10Mb and CPU usage <1%. After about two minutes, its memory usage is ~12Mb, but the CPU usage jumps to over 90%!
This is what my app does every 10 seconds:
1) Download the log via FTP and store it in the string log.
2) Go through the list of processes running on the PC, and if there is a firefox.exe process, change the log string appropriately to indicate Firefox is running.
3) Save the log string to a txt file, then read the file to send it via FTP back to the server.
I doubt saving and reading a couple of lines of text on the hard drive requires that much CPU power.
Any guesses on what might be going on? Thanks!!
EDIT: Here is my whole class
class Program : System.ServiceProcess.ServiceBase
{
private static System.Timers.Timer timer;
static string myIP = "";
static void start()
{
string strHostName = Dns.GetHostName();
IPHostEntry ipEntry = Dns.GetHostEntry(strHostName);
IPAddress[] addr = ipEntry.AddressList;
int i = 0;
foreach (IPAddress address in addr)
{
if (("" + addr[i].AddressFamily).Equals("InterNetwork"))
myIP = "" + addr[i];
i++;
}
timer = new System.Timers.Timer();
timer.Elapsed += new ElapsedEventHandler(firefoxChecker); // Everytime timer ticks, timer_Tick will be called
timer.Interval = (1000) * (5);
timer.Enabled = true; // Enable the timer
timer.Start();
}
protected override void OnStart(string[] args)
{
start();
}
public static void Main()
{
System.ServiceProcess.ServiceBase.Run(new Program());
}
static string downloadLog()
{
FtpWebRequest reqFTP = (FtpWebRequest)FtpWebRequest.Create(new Uri("ftp://server/log.txt"));
// Provide the WebPermission Credintials
reqFTP.Credentials = new NetworkCredential("username", "password");
reqFTP.Proxy = null;
reqFTP.Method = WebRequestMethods.Ftp.DownloadFile;
FtpWebResponse response = (FtpWebResponse)reqFTP.GetResponse();
Stream responseStream = response.GetResponseStream();
StreamReader reader = new StreamReader(responseStream);
string log = reader.ReadToEnd();
reader.Close();
reader.Dispose();
return log;
}
static void sendLogThroughFTP(string log)
{
FtpWebRequest reqFTP = (FtpWebRequest)FtpWebRequest.Create(new Uri("ftp://server/log.txt"));
reqFTP.Credentials = new NetworkCredential("username", "password");
reqFTP.Proxy = null;
reqFTP.Method = WebRequestMethods.Ftp.UploadFile;
StreamWriter wr = new StreamWriter(@"C:\logs\temp.txt");
wr.Write(log);
wr.Close();
StreamReader sourceStream = new StreamReader(@"C:\logs\temp.txt");
byte[] fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd());
sourceStream.Close();
reqFTP.ContentLength = fileContents.Length;
Stream requestStream = reqFTP.GetRequestStream();
requestStream.Write(fileContents, 0, fileContents.Length);
requestStream.Close();
sourceStream.Dispose();
}
static void firefoxChecker(object sender, EventArgs e)
{
string firefoxOwner = "----------";
TerminalServicesManager manager = new TerminalServicesManager();
ITerminalServer server = null;
string log = downloadLog();
bool diceFirefoxRunning = false;
bool monsterFirefoxRunning = false;
bool careerbuilderFirefoxRunning = false;
try
{
server = manager.GetLocalServer();
server.Open();
foreach (ITerminalServicesSession session in server.GetSessions())
{
if (session.ConnectionState == ConnectionState.Active)
{
firefoxOwner = session.UserAccount.ToString();
string ip = session.ClientIPAddress.ToString();
string user = session.UserAccount.ToString();
System.Collections.Generic.IList<Cassia.ITerminalServicesProcess> list = session.GetProcesses();
foreach (ITerminalServicesProcess process in list)
{
if (Equals(process.ProcessName, "firefox.exe"))
{
// change firefoxOwner string appropriately
log = updateLog(log, user, firefoxOwner);
}
}
}
}
server.Close();
sendLogThroughFTP(log);
}
catch
{
// do nothing
}
}
static string updateLog(string log, string username, string ffOwner)
{
// make some changes to log string
return log;
}
}
}
Thanks for all the inputs!
Disable the timer when you start doing your work and re-enable it when you are done.
You are downloading and uploading via FTP, which could take more than the 5 seconds you have set for your timer. If you disable the timer before you start and re-enable it at the end, you will poll 5 seconds after the last upload completed.
You may also want to consider upping your polling time to something a little more reasonable. Do you really need to poll every 5 seconds to make sure firefox is still running?
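A sketch of that stop/restart pattern against the question's own `timer`, `downloadLog`, and `sendLogThroughFTP` (the process-checking body is elided here):

```csharp
static void firefoxChecker(object sender, ElapsedEventArgs e)
{
    timer.Stop();                    // no new ticks while we are working
    try
    {
        string log = downloadLog();  // the FTP round-trips may take > 5s
        // ... inspect sessions/processes and update the log string ...
        sendLogThroughFTP(log);
    }
    finally
    {
        timer.Start();               // next tick fires 5s after this run ends
    }
}
```

The `finally` block ensures polling resumes even if the FTP transfer throws, which the original swallow-all catch would otherwise hide.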

Uploading HTTP progress tracking

I've got a WPF application I'm writing that posts files to a social network.
The upload itself works just fine, but I'd like to provide some indication of how far along the upload is.
I've tried a bunch of ways to do this:
1) The HttpWebRequest.GetRequestStream method:
using (
var FS = File.Open(
localFilePath, FileMode.Open, FileAccess.Read, FileShare.Read))
{
long len = FS.Length;
HttpWebRequest request = (HttpWebRequest) WebRequest.Create(url);
request.Method = "POST";
request.ProtocolVersion = HttpVersion.Version11;
request.ContentType = "multipart/form-data; boundary=--AaB03x";
//predata and postdata is two byte[] arrays, that contains
//strings for MIME file upload (defined above and is not important)
request.ContentLength = predata.Length + FS.Length + postdata.Length;
request.AllowWriteStreamBuffering = false;
using (var reqStream = request.GetRequestStream())
{
reqStream.Write(predata, 0, predata.Length);
int bytesRead = 0;
int totalRead = 0;
do
{
bytesRead = FS.Read(fileData, 0, MaxContentSize);
totalRead += bytesRead;
reqStream.Write(fileData, 0, bytesRead);
reqStream.Flush(); //trying with and without this
//this part will show progress in percents
sop.prct = (int) ((100*totalRead)/len);
} while (bytesRead > 0);
reqStream.Write(postdata, 0, postdata.Length);
}
HttpWebResponse responce = (HttpWebResponse) request.GetResponse();
using (var respStream = responce.GetResponseStream())
{
//do things
}
}
2) WebClient way (much shorter):
void UploadFile (url, localFilePath)
{
...
WebClient client = new WebClient();
client.UploadProgressChanged += new UploadProgressChangedEventHandler(UploadPartDone);
client.UploadFileCompleted += new UploadFileCompletedEventHandler(UploadComplete);
client.UploadFileAsync(new Uri(url), localFilePath);
done.WaitOne();
//do things with responce, received from UploadComplete
JavaScriptSerializer jssSer = new JavaScriptSerializer();
return jssSer.Deserialize<UniversalJSONAnswer>(utf8.GetString(UploadFileResponce));
//so on...
...
}
void UploadComplete(object sender, UploadFileCompletedEventArgs e)
{
UploadFileResponce=e.Result;
done.Set();
}
void UploadPartDone(object sender, UploadProgressChangedEventArgs e)
{
//this part expected to show progress
sop.prct=(int)(100*e.BytesSent/e.TotalBytesToSend);
}
3) Even TcpClient way:
using (
var FS = File.Open(
localFilePath, FileMode.Open, FileAccess.Read, FileShare.Read))
{
long len = FS.Length;
long totalRead = 0;
using (var client = new TcpClient(urli.Host, urli.Port))
{
using (var clearstream = client.GetStream())
{
using (var writer = new StreamWriter(clearstream))
using (var reader = new StreamReader(clearstream))
{
//set progress to 0
sop.prct = 0;
// Send request headers
writer.WriteLine("POST " + urli.AbsoluteUri + " HTTP/1.1");
writer.WriteLine("Content-Type: multipart/form-data; boundary=--AaB03x");
writer.WriteLine("Host: " + urli.Host);
writer.WriteLine("Content-Length: " + (predata.Length + len + postdata.Length).ToString());
writer.WriteLine();
//some data for MIME
writer.Write(utf8.GetString(predata));
writer.Flush();
int bytesRead;
do
{
bytesRead = FS.Read(fileData, 0, MaxContentSize);
totalRead += bytesRead;
writer.BaseStream.Write(fileData, 0, bytesRead);
writer.BaseStream.Flush();
sop.prct = (int) ((100*totalRead)/len);
} while (bytesRead > 0);
writer.Write(utf8.GetString(postdata));
writer.Flush();
//read a line of the response and do other things...
respStr = reader.ReadLine();
...
}
}
}
}
In all cases the file was successfully sent to the server.
But the progress always looks like this: for a few seconds it runs from 0 to 100, and then it waits until the file has actually uploaded (about 5 minutes; the file is 400 MB).
So I think the data from the file is buffered somewhere, and I'm tracking not the upload but the buffering of the data, and then I must wait until it has actually been uploaded.
My questions are:
1) Is there any way to track the actual upload of the data, so that Stream.Write() or Flush() (which, as I read somewhere, does nothing for NetworkStream) does not return until the server confirms the TCP packets were received?
2) Or can I disable the buffering (AllowWriteStreamBuffering for HttpWebRequest doesn't work)?
3) And does it make sense to go further "down" and try with Sockets?
Updated:
To remove any doubt about how the progress is displayed in the UI, I rewrote the code to log to a file.
So, here is the code:
using (var LogStream=File.Open("C:\\123.txt",FileMode.Create,FileAccess.Write,FileShare.Read))
using (var LogWriter=new StreamWriter(LogStream))
using (var FS = File.Open(localFilePath, FileMode.Open, FileAccess.Read, FileShare.Read))
{
long len = FS.Length;
HttpWebRequest request = (HttpWebRequest) WebRequest.Create(url);
request.Timeout = 7200000; //2 hour timeout
request.Method = "POST";
request.ProtocolVersion = HttpVersion.Version11;
request.ContentType = "multipart/form-data; boundary=--AaB03x";
//predata and postdata is two byte[] arrays, that contains
//strings for MIME file upload (defined above and is not important)
request.ContentLength = predata.Length + FS.Length + postdata.Length;
request.AllowWriteStreamBuffering = false;
LogWriter.WriteLine(DateTime.Now.ToString("o") + " Start write into request stream. ");
using (var reqStream = request.GetRequestStream())
{
reqStream.Write(predata, 0, predata.Length);
int bytesRead = 0;
int totalRead = 0;
do
{
bytesRead = FS.Read(fileData, 0, MaxContentSize);
totalRead += bytesRead;
reqStream.Write(fileData, 0, bytesRead);
reqStream.Flush(); //trying with and without this
//sop.prct = (int) ((100*totalRead)/len); //this part will show progress in percents
LogWriter.WriteLine(DateTime.Now.ToString("o") + " totalRead= " + totalRead.ToString() + " / " + len.ToString());
} while (bytesRead > 0);
reqStream.Write(postdata, 0, postdata.Length);
}
LogWriter.WriteLine(DateTime.Now.ToString("o") + " All sent!!! Waiting for responce... ");
LogWriter.Flush();
HttpWebResponse responce = (HttpWebResponse) request.GetResponse();
LogWriter.WriteLine(DateTime.Now.ToString("o") + " Responce received! ");
using (var respStream = responce.GetResponseStream())
{
if (respStream == null) return null;
using (var streamReader = new StreamReader(respStream))
{
string resp = streamReader.ReadToEnd();
JavaScriptSerializer jssSer = new JavaScriptSerializer();
return jssSer.Deserialize<UniversalJSONAnswer>(resp);
}
}
}
and here is result (I cut the middle):
2011-11-19T22:00:54.5964408+04:00 Start write into request stream.
2011-11-19T22:00:54.6404433+04:00 totalRead= 1048576 / 410746880
2011-11-19T22:00:54.6424434+04:00 totalRead= 2097152 / 410746880
2011-11-19T22:00:54.6434435+04:00 totalRead= 3145728 / 410746880
2011-11-19T22:00:54.6454436+04:00 totalRead= 4194304 / 410746880
2011-11-19T22:00:54.6464437+04:00 totalRead= 5242880 / 410746880
2011-11-19T22:00:54.6494438+04:00 totalRead= 6291456 / 410746880
.......
2011-11-19T22:00:55.3434835+04:00 totalRead= 408944640 / 410746880
2011-11-19T22:00:55.3434835+04:00 totalRead= 409993216 / 410746880
2011-11-19T22:00:55.3464837+04:00 totalRead= 410746880 / 410746880
2011-11-19T22:00:55.3464837+04:00 totalRead= 410746880 / 410746880
2011-11-19T22:00:55.3464837+04:00 All sent!!! Waiting for responce...
2011-11-19T22:07:23.0616597+04:00 Responce received!
As you can see, the program thinks it uploaded ~400 MB in about 2 seconds. The file then actually uploads over the following 7 minutes, after which I receive the response.
Updated again:
This seems to happen under Windows 7 (not sure whether it depends on x64 or x86).
When I run my code under XP, everything works perfectly and the progress is shown absolutely correctly.
It's been more than a year since this question was posted, but I think my post can be useful for someone.
I had the same problem with showing progress, and it behaved exactly as you described. So I decided to use HttpClient, which shows upload progress correctly. Then I encountered an interesting bug: when I had Fiddler running, HttpClient started to report its upload progress in the same unexpected way as WebClient/HttpWebRequest above. So I thought maybe that was the reason WebClient showed upload progress incorrectly (I think I had Fiddler running at the time). I then tried WebClient again (without any Fiddler-like apps running) and everything works as it should; the upload progress has correct values. I have tested it on several PCs with Win7 and XP, and in all cases the progress was shown correctly.
So I think a program like Fiddler (and probably not only Fiddler) affects how WebClient and the other .NET classes report upload progress.
This discussion supports it:
HttpWebRequest doesn't work except when fiddler is running
You could use WebClient's UploadFile to upload the file rather than writing it as a file stream. To track the percentage of data received and uploaded, you can use UploadFileAsync and subscribe to its events.
In the code below I've used UploadFileAsync to upload files synchronously, but it need not be synchronous as long as you don't dispose the instance of the uploader.
class FileUploader : IDisposable
{
private readonly WebClient _client;
private readonly Uri _address;
private readonly string _filePath;
private bool _uploadCompleted;
private bool _uploadStarted;
private bool _status;
public FileUploader(string address, string filePath)
{
_client = new WebClient();
_address = new Uri(address);
_filePath = filePath;
_client.UploadProgressChanged += FileUploadProgressChanged;
_client.UploadFileCompleted += FileUploadFileCompleted;
}
private void FileUploadFileCompleted(object sender, UploadFileCompletedEventArgs e)
{
_status = !e.Cancelled && e.Error == null;
_uploadCompleted = true;
}
private void FileUploadProgressChanged(object sender, UploadProgressChangedEventArgs e)
{
if(e.ProgressPercentage % 10 == 0)
{
//This writes the percentage of data uploaded and downloaded
Console.WriteLine("Sent: {0}, Received: {1}", e.BytesSent, e.BytesReceived);
//You can have a delegate or a callback here to update your UI with the percentage uploaded
//Without the condition (i.e. e.ProgressPercentage % 10 == 0) on the percentage,
//the callback will slow your upload down
}
}
public bool Upload()
{
if (!_uploadStarted)
{
_uploadStarted = true;
_client.UploadFileAsync(_address, _filePath);
}
while (!_uploadCompleted)
{
Thread.Sleep(1000);
}
return _status;
}
public void Dispose()
{
_client.Dispose();
}
}
Client Code:
using (FileUploader uploader = new FileUploader("http://www.google.com", @"C:\test.txt"))
{
uploader.Upload();
}
You can register a custom callback (for example a delegate) in the FileUploadProgressChanged event handler to update your WPF UI.
The UploadProgressChanged event is raised frequently, so if your callback does any I/O it will slow the upload down. It's best to update infrequently; e.g. the following code updates only at every 10% step.
private int _percentageDownloaded;
private void FileUploadProgressChanged(object sender, UploadProgressChangedEventArgs e)
{
if (e.ProgressPercentage % 10 == 0 && e.ProgressPercentage > _percentageDownloaded)
{
_percentageDownloaded = e.ProgressPercentage;
//Any callback instead of printline
Console.WriteLine("Send: {0} Received: {1}", e.BytesSent, e.BytesReceived);
}
}
My suggestion is to use the new HttpClient class (available in .NET 4.5). It supports progress reporting.
This article helped me a lot with this:
http://www.strathweb.com/2012/06/drag-and-drop-files-to-wpf-application-and-asynchronously-upload-to-asp-net-web-api/
My code for upload file:
private void HttpSendProgress(object sender, HttpProgressEventArgs e)
{
HttpRequestMessage request = sender as HttpRequestMessage;
Console.WriteLine(e.BytesTransferred);
}
private void Window_Loaded_1(object sender, RoutedEventArgs e)
{
ProgressMessageHandler progress = new ProgressMessageHandler();
progress.HttpSendProgress += new EventHandler<HttpProgressEventArgs>(HttpSendProgress);
HttpRequestMessage message = new HttpRequestMessage();
StreamContent streamContent = new StreamContent(new FileStream("e:\\somefile.zip", FileMode.Open));
message.Method = HttpMethod.Put;
message.Content = streamContent;
message.RequestUri = new Uri("{Here your link}");
var client = HttpClientFactory.Create(progress);
client.SendAsync(message).ContinueWith(task =>
{
if (task.Result.IsSuccessStatusCode)
{
}
});
}
This one has been bugging me for at least a day. I started with WebClient.UploadFileAsync, then tried the ProgressMessageHandler for HttpClient, then rolled my own HttpContent for the HttpClient API. None of those approaches worked for me.
It appears that HttpWebRequest, which sits at the bottom of most (all?) .NET HTTP abstractions like WebClient and HttpClient, buffers the request and response streams by default, which I confirmed by looking at it in ILSpy.
As others have noted, you can make your request use chunked encoding one way or another which will effectively disable buffering the request stream, but still this is not going to fix the progress reporting.
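For reference, a minimal sketch of requesting chunked encoding on an HttpWebRequest (note that this replaces setting ContentLength; the two are mutually exclusive, and `uploadUrl` is a placeholder):

```csharp
var request = WebRequest.CreateHttp(uploadUrl);
request.Method = "PUT";
request.SendChunked = true;                // chunked transfer encoding, no ContentLength
request.AllowWriteStreamBuffering = false; // do not buffer the whole body in memory
```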
I found that it was necessary to flush the request stream after each block I send in order to accurately reflect sending progress; otherwise the data is simply buffered one step further down the pipeline (probably somewhere in NetworkStream or the OS, I didn't check). The sample code below works for me and also does a minimalistic job of translating an HttpWebResponse back into an HttpResponseMessage (which you may not need, YMMV).
public async Task<HttpResponseMessage> UploadFileAsync( string uploadUrl, string absoluteFilePath, Action<int> progressPercentCallback )
{
var length = new FileInfo( absoluteFilePath ).Length;
var request = WebRequest.CreateHttp( new Uri(uploadUrl) );
request.Method = "PUT";
request.AllowWriteStreamBuffering = false;
request.AllowReadStreamBuffering = false;
request.ContentLength = length;
const int chunkSize = 4096;
var buffer = new byte[chunkSize];
using (var req = await request.GetRequestStreamAsync())
using (var readStream = File.OpenRead(absoluteFilePath))
{
progressPercentCallback(0);
int read = 0;
for (int i = 0; i < length; i += read)
{
read = await readStream.ReadAsync( buffer, 0, chunkSize );
await req.WriteAsync( buffer, 0, read );
await req.FlushAsync(); // flushing is required or else we jump to 100% very fast
progressPercentCallback((int)(100.0 * i / length));
}
progressPercentCallback(100);
}
var response = (HttpWebResponse)await request.GetResponseAsync();
var result = new HttpResponseMessage( response.StatusCode );
result.Content = new StreamContent( response.GetResponseStream() );
return result;
}
At a quick guess, you are running this code on the UI thread. You need to run the upload on a new thread.
At that point you have 2 options: 1) run a timer on the UI thread and update the UI from it, or 2) update the UI using Invoke calls (because you can't access the UI from another thread).
In the first example I think your progress bar is showing how fast you write into the stream from the file on disk, not the actual upload progress (which is why it jumps to 100% really quickly and then the upload chugs on*).
I might be wrong ^^ and have no WPF experience, but I have uploaded massive files from Silverlight to WCF, and the model used there is (as you do) to break the file into blocks and send each block. When you get a response from the server ("block 26 received OK"), update the progress bar, because really you can't (or should not) update the progress bar unless you /know/ that block x made it, and a good way to know that is if the server says it got it.
*I wish I could upload 400 MB in 5 minutes. It would take me all day...
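The per-block acknowledgment model described above can be sketched as follows. `sendBlock` is a hypothetical callback standing in for whatever transport you use (a WCF call, an HTTP PUT of a chunk, etc.) that returns true only once the server confirms receipt:

```csharp
using System;
using System.IO;

static class BlockUploader
{
    // Uploads the stream in fixed-size blocks; the progress callback only
    // advances after sendBlock confirms the server received the block.
    public static void UploadInBlocks(
        Stream source,
        int blockSize,
        Func<byte[], int, bool> sendBlock,   // (buffer, bytesInBuffer) -> server ack
        Action<int> progressPercent)
    {
        long total = source.Length;
        long confirmed = 0;
        var buffer = new byte[blockSize];
        int read;
        while ((read = source.Read(buffer, 0, blockSize)) > 0)
        {
            if (!sendBlock(buffer, read))
                throw new IOException("Server did not acknowledge the block.");
            confirmed += read;
            progressPercent((int)(100 * confirmed / total));
        }
    }
}
```

With a real transport you would retry or resend unacknowledged blocks rather than throwing, but the progress-reporting idea is the same: the bar only moves on confirmed data.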
I had the same problem. I spent a lot of time on it and finally tracked it down: the Avast antivirus. When I turn it off, my program works perfectly...

Download file from FTP with Progress - TotalBytesToReceive is always -1?

I am trying to download a file from an FTP server with a progress bar.
The file downloads, and the DownloadProgressChanged event fires, except that in the event args TotalBytesToReceive is always -1. BytesReceived increases, but I am unable to calculate a percentage without the total.
I imagine I could find the file size through other FTP commands, but I wonder why this doesn't work?
My code:
FTPClient request = new FTPClient();
request.Credentials = credentials;
request.DownloadProgressChanged += new DownloadProgressChangedEventHandler(request_DownloadProgressChanged);
//request.DownloadDataCompleted += new DownloadDataCompletedEventHandler(request_DownloadDataCompleted);
request.DownloadDataAsync(new Uri(folder + file));
while (request.IsBusy) ; // busy-wait until the download finishes
....
static void request_DownloadProgressChanged(object sender, DownloadProgressChangedEventArgs e)
{
    if (e.TotalBytesToReceive == -1)
    {
        l.reportProgress(-1, FormatBytes(e.BytesReceived) + " out of ?");
    }
    else
    {
        l.reportProgress(e.ProgressPercentage, "Downloaded " + FormatBytes(e.BytesReceived) + " out of " + FormatBytes(e.TotalBytesToReceive) + " (" + e.ProgressPercentage + "%)");
    }
}
....
class FTPClient : WebClient
{
    protected override WebRequest GetWebRequest(System.Uri address)
    {
        FtpWebRequest req = (FtpWebRequest)base.GetWebRequest(address);
        req.UsePassive = false;
        return req;
    }
}
Thanks.
So I had the same issue. I got around it by retrieving the file size first.
// Get the object used to communicate with the server.
FtpWebRequest request = (FtpWebRequest)WebRequest.Create("URL");
request.Method = WebRequestMethods.Ftp.GetFileSize;
request.Credentials = networkCredential;
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
Stream responseStream = response.GetResponseStream();
bytes_total = response.ContentLength; // member variable stored for later use in the progress callback
Console.WriteLine("Fetch Complete, ContentLength {0}", response.ContentLength);
response.Close();
webClient = new MyWebClient();
webClient.Credentials = networkCredential;
webClient.DownloadDataCompleted += new DownloadDataCompletedEventHandler(FTPDownloadCompleted);
webClient.DownloadProgressChanged += new DownloadProgressChangedEventHandler(FTPDownloadProgressChanged);
webClient.DownloadDataAsync(new Uri("URL"));
Then do the math in the callback.
private void FTPDownloadProgressChanged(object sender, DownloadProgressChangedEventArgs e)
{
    progressBar.Value = (int)(((float)e.BytesReceived / (float)bytes_total) * 100.0);
}
With FTP protocol, WebClient in general does not know total download size. So you commonly get -1 with FTP.
Note that the behavior actually contradicts the .NET documentation, which says for FtpWebResponse.ContentLength (where the value of TotalBytesToReceive comes from):
For requests that use the DownloadFile method, the property is greater than zero if the downloaded file contained data and is zero if it was empty.
But you will easily find many questions about this, effectively showing that the behavior is not always as documented. FtpWebResponse.ContentLength has a meaningful value for the GetFileSize method only.
FtpWebRequest/WebClient makes no explicit attempt to find out the size of the file it is downloading. All it does is look for an "(xxx bytes)" string in the 125/150 responses to the RETR command. No FTP RFC mandates that the server include such information. ProFTPD (see data_pasv_open in src/data.c) and vsftpd (see handle_retr in postlogin.c) seem to include this information; other common FTP servers (IIS, FileZilla) do not.
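For illustration, the size extraction amounts to something like the following (my own sketch, not the actual framework code); it returns null when the server's 125/150 reply carries no size hint, which is exactly the case where TotalBytesToReceive ends up as -1:

```csharp
using System;
using System.Text.RegularExpressions;

static class FtpReplyParser
{
    // Looks for an "(xxx bytes)" hint in a 125/150 transfer-start reply,
    // e.g. "150 Opening BINARY mode data connection for file.zip (1234 bytes)."
    public static long? TryParseSize(string replyLine)
    {
        Match m = Regex.Match(replyLine, @"\((\d+) bytes\)");
        return m.Success ? long.Parse(m.Groups[1].Value) : (long?)null;
    }
}
```

Since the hint is optional, a robust client must treat its absence as "total unknown" rather than an error.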
If your server does not provide size information, you have to query for size yourself before download. A complete solution using FtpWebRequest and Task:
private void button1_Click(object sender, EventArgs e)
{
    // Run Download on a background thread
    Task.Run(() => Download());
}

private void Download()
{
    try
    {
        const string url = "ftp://ftp.example.com/remote/path/file.zip";
        NetworkCredential credentials = new NetworkCredential("username", "password");

        // Query the size of the file to be downloaded
        WebRequest sizeRequest = WebRequest.Create(url);
        sizeRequest.Credentials = credentials;
        sizeRequest.Method = WebRequestMethods.Ftp.GetFileSize;
        int size = (int)sizeRequest.GetResponse().ContentLength;

        progressBar1.Invoke(
            (MethodInvoker)(() => progressBar1.Maximum = size));

        // Download the file
        WebRequest request = WebRequest.Create(url);
        request.Credentials = credentials;
        request.Method = WebRequestMethods.Ftp.DownloadFile;

        using (Stream ftpStream = request.GetResponse().GetResponseStream())
        using (Stream fileStream = File.Create(@"C:\local\path\file.zip"))
        {
            byte[] buffer = new byte[10240];
            int read;
            while ((read = ftpStream.Read(buffer, 0, buffer.Length)) > 0)
            {
                fileStream.Write(buffer, 0, read);
                int position = (int)fileStream.Position;
                progressBar1.Invoke(
                    (MethodInvoker)(() => progressBar1.Value = position));
            }
        }
    }
    catch (Exception e)
    {
        MessageBox.Show(e.Message);
    }
}
The core download code is based on:
Upload and download a binary file to/from FTP server in C#/.NET
FTP won't give you content sizes like HTTP does; you are probably better off retrieving the size yourself:
FtpWebRequest FTPWbReq = WebRequest.Create("somefile") as FtpWebRequest;
FTPWbReq.Method = WebRequestMethods.Ftp.GetFileSize;
FtpWebResponse FTPWebRes = FTPWbReq.GetResponse() as FtpWebResponse;
long length = FTPWebRes.ContentLength;
FTPWebRes.Close();
