I have a Windows service that uploads files to another website, which processes them. With small files it works fine and gets a response, but with large files (about 6 minutes to process) it waits forever.
Here is the relevant part of the external website's POST method:
try
{
...
LogResults();
return string.Empty;
}
catch (Exception e)
{
return e.Message;
}
The problem is that I can see logs even for large files, which means the website always returns a value; for large files, though, my Windows service never receives it.
And here is the code from the Windows service:
var valuesp = new NameValueCollection
{
{ "AccountId", datafeed.AccountId }
};
byte[] resultp = UploadHelper.UploadFiles(url, uploadFiles, valuesp);
response = Encoding.Default.GetString(resultp);
The UploadFiles method returns a value for small files but waits forever for large ones.
Here is the complete code of UploadFiles:
public static byte[] UploadFiles(string address, IEnumerable<UploadFile> files, NameValueCollection values)
{
var request = WebRequest.Create(address);
request.Timeout = System.Threading.Timeout.Infinite; // no client-side timeout (previously 3600000, i.e. 60 minutes)
request.Method = "POST";
var boundary = "---------------------------" +
DateTime.Now.Ticks.ToString("x", NumberFormatInfo.InvariantInfo);
request.ContentType = "multipart/form-data; boundary=" + boundary;
boundary = "--" + boundary;
using (var requestStream = request.GetRequestStream())
{
// Write the values
if (values != null)
{
foreach (string name in values.Keys)
{
var buffer = Encoding.ASCII.GetBytes(boundary + Environment.NewLine);
requestStream.Write(buffer, 0, buffer.Length);
buffer =
Encoding.ASCII.GetBytes(string.Format("Content-Disposition: form-data; name=\"{0}\"{1}{1}", name,
Environment.NewLine));
requestStream.Write(buffer, 0, buffer.Length);
buffer = Encoding.UTF8.GetBytes(values[name] + Environment.NewLine);
requestStream.Write(buffer, 0, buffer.Length);
}
}
// Write the files
if (files != null)
{
foreach (var file in files)
{
var buffer = Encoding.ASCII.GetBytes(boundary + Environment.NewLine);
requestStream.Write(buffer, 0, buffer.Length);
buffer =
Encoding.UTF8.GetBytes(
string.Format("Content-Disposition: form-data; name=\"{0}\"; filename=\"{1}\"{2}", file.Name,
file.Filename, Environment.NewLine));
requestStream.Write(buffer, 0, buffer.Length);
buffer =
Encoding.ASCII.GetBytes(string.Format("Content-Type: {0}{1}{1}", file.ContentType,
Environment.NewLine));
requestStream.Write(buffer, 0, buffer.Length);
file.Stream.CopyTo(requestStream); // Stream.Write cannot take a Stream; copy the file body instead
buffer = Encoding.ASCII.GetBytes(Environment.NewLine);
requestStream.Write(buffer, 0, buffer.Length);
}
}
var boundaryBuffer = Encoding.ASCII.GetBytes(boundary + "--");
requestStream.Write(boundaryBuffer, 0, boundaryBuffer.Length);
}
using (var response = request.GetResponse())
using (var responseStream = response.GetResponseStream())
using (var stream = new MemoryStream())
{
responseStream.CopyTo(stream);
return stream.ToArray();
}
}
What am I doing wrong here?
EDIT: Locally it works even when processing takes 7-8 minutes, but in the live environment it doesn't. Could it be related to the main app's IIS settings? Could it be related to the Windows service server's settings?
EDIT 2: Remote server web.config httpRuntime settings
<httpRuntime enableVersionHeader="false" maxRequestLength="300000" executionTimeout="12000" targetFramework="4.5" />
The problem was with the Azure Load Balancer, which has an Idle Timeout of 4 minutes by default. A post on the Windows Azure blog says this timeout is now configurable to a value between 4 and 30 minutes:
http://azure.microsoft.com/blog/2014/08/14/new-configurable-idle-timeout-for-azure-load-balancer/
However, the problem was solved by sending additional keep-alive bytes over TCP, which told the load balancer not to kill the request:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(address);
...
request.Proxy = null;
request.ServicePoint.SetTcpKeepAlive(true, 30000, 5000); // first keep-alive probe after 30 seconds, then every 5 seconds
Setting Proxy to null (i.e. not using a proxy) is mandatory here, because otherwise the proxy will not pass the TCP keep-alive packets through to the server.
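As a side note, if every request in the process needs this behavior, the same keep-alive can be enabled process-wide instead of per request. A minimal sketch, assuming the same intervals as above:
ServicePointManager.SetTcpKeepAlive(true, 30000, 5000); // applies to all connections created afterwards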
Upload your files with a method that complies with RFC 1867.
See this
And then:
UploadFile[] files = new UploadFile[]
{
new UploadFile(fileName1),
new UploadFile(fileName2)
};
NameValueCollection form = new NameValueCollection();
form["name1"] = "value1";
form["name2"] = "xyzzy";
string response = UploadHelper.Upload(url, files, form);
That's all folks!
EDIT :
I use the method above to upload files over 100 MB in size. I don't use ASP at all, and it works just perfectly!
I had a similar issue where the upload would work locally but time out on the server. There are two things at play here; I'm assuming you're talking about IIS 7+. If not, let me know and I'll gladly delete my answer.
So first, there's ASP.NET, which looks at this setting (under system.web):
<!-- maxRequestLength is in KB - 10 MB here. This is needed by ASP.NET. Must match maxAllowedContentLength under system.webServer/security/requestLimits -->
<httpRuntime targetFramework="4.5" maxRequestLength="10240" />
Then there's IIS:
<system.webServer>
  <security>
    <requestFiltering>
      <!-- maxAllowedContentLength is in bytes - 10 MB -->
      <requestLimits maxAllowedContentLength="10485760" />
    </requestFiltering>
  </security>
</system.webServer>
You need both of these settings to be present and the limit to match for things to work. Notice that one is in KB and the other one is in bytes.
Have you checked the IIS file size limit on the remote server? It defaults to 30 MB, so if the files you are trying to upload are larger than that, the request will fail. Here's how to change the upload limit (IIS 7): http://www.web-site-scripts.com/knowledge-base/article/AA-00696/0/Increasing-maximum-allowed-size-for-uploads-on-IIS7.html
You can achieve this task with the AsyncCallback technique.
What's AsyncCallback?
When the async method finishes processing, the AsyncCallback method is invoked automatically, and any post-processing statements can be executed there. With this technique there is no need to poll or wait for the async thread to complete.
You can find more details at this link:
AsyncCallback
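For illustration, here is a minimal sketch of the pattern using HttpWebRequest.BeginGetResponse; the URL is hypothetical and error handling is omitted:
var request = (HttpWebRequest)WebRequest.Create("http://example.com/upload"); // hypothetical URL
request.BeginGetResponse(new AsyncCallback(ar =>
{
    // Runs automatically when the response arrives; the caller never blocks waiting for it.
    var req = (HttpWebRequest)ar.AsyncState;
    using (var response = (HttpWebResponse)req.EndGetResponse(ar))
    {
        Console.WriteLine(response.StatusCode);
    }
}), request);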
Maybe it is caused by the IIS upload limit?
In IIS 7 the standard value is 30 MB. See MSDN.
EDIT1:
Please be aware that if you are uploading multiple files in one request, the sizes add up.
EDIT2:
In all MS examples there is always only one Stream.Write().
MSDN states: "After the Stream object has been returned, you can send data with the HttpWebRequest by using the Stream.Write method." My interpretation of this sentence is that you should call Write() only once: put all the data you want to send into a buffer and call Write() once afterwards.
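If you want to try that interpretation, a minimal sketch: build the complete multipart body in a MemoryStream first, then push it to the request stream with a single Write (the buffering code itself stays the same as in the question):
var body = new MemoryStream();
// ... write the boundaries, form fields and file bytes into `body` exactly as before ...
request.ContentLength = body.Length; // must be set before GetRequestStream
using (var requestStream = request.GetRequestStream())
{
    requestStream.Write(body.GetBuffer(), 0, (int)body.Length); // one single Write call
}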
You could try increasing the executionTimeout="12000" value, but note that this attribute is expressed in seconds, not milliseconds, so 12000 already allows 200 minutes of execution; a 6-minute request should not be hitting this limit.
I am experiencing some strange behaviour from the code I use to stream files to my clients.
I have an MS SQL server that acts as a file store, with files accessed via a UNC path.
On my web server I have some .NET code running that handles streaming the files (in this case pictures and thumbnails) to my clients.
My code works, but I am experiencing a constant delay of ~12 seconds on the initial file request. After I have made the initial request, it is as if the server wakes up and suddenly becomes responsive, only to fall back to the same behaviour some time later.
At first I thought it was my code, but from what I can see in the server activity log there is no resource-intensive code running. My theory is that on each call to the server the path must first be mounted, and that is what causes the delay; it then unmounts some time later and has to be remounted.
For reference I am posting my code (maybe I just cannot see the problem):
public async static Task StreamFileAsync(HttpContext context, FileInfo fileInfo)
{
//This controls how many bytes to read at a time and send to the client
int bytesToRead = 512 * 1024; // 512KB
// Buffer to read bytes in chunk size specified above
byte[] buffer = new Byte[bytesToRead];
// Clear the current response content/headers
context.Response.Clear();
context.Response.ClearHeaders();
//Indicate the type of data being sent
context.Response.ContentType = FileTools.GetMimeType(fileInfo.Extension);
//Name the file
context.Response.AddHeader("Content-Disposition", "filename=\"" + fileInfo.Name + "\"");
context.Response.AddHeader("Content-Length", fileInfo.Length.ToString());
// Open the file
using (var stream = fileInfo.OpenRead())
{
// The number of bytes read
int length;
do
{
// Verify that the client is connected
if (context.Response.IsClientConnected)
{
// Read data into the buffer
length = await stream.ReadAsync(buffer, 0, bytesToRead);
// and write it out to the response's output stream
await context.Response.OutputStream.WriteAsync(buffer, 0, length);
try
{
// Flush the data
context.Response.Flush();
}
catch (HttpException)
{
// Cancel the download if a HttpException happens
// (ie. the client has disconnected by we tried to send some data)
length = -1;
}
// No need to clear or reallocate the buffer; only the first `length` bytes are written each pass
}
else
{
// Cancel the download if client has disconnected
length = -1;
}
} while (length > 0); //Repeat until no data is read
}
// Tell the response not to send any more content to the client
context.Response.SuppressContent = true;
// Tell the application to skip to the EndRequest event in the HTTP pipeline
context.ApplicationInstance.CompleteRequest();
}
If anyone could shed some light on this problem I would be very grateful!
I'm working in C# on a program to list all course resources for a MOOC (e.g. Coursera). I don't want to download the content, just get a listing of all the resources (e.g. pdf, videos, text files, sample files, etc...) which are made available to the course.
My problem lies in parsing the html source (currently using HtmlAgilityPack) without downloading all the content.
For example, if you go to this intro video for a banking course on Coursera and check the source (F12 in Chrome for Developer Tools), you can see the page source. I can stop the video download which autoplays, but still see the source.
How can I get the source in C# without downloading all the content?
I've looked at the HttpWebRequest headers (problem: timeout) and at DownloadDataAsync with Cancel (problem: the completed Result object is invalid when cancelling the async request). I've also tried various Loads from HtmlAgilityPack, but with no success.
Time out:
HttpWebRequest postRequest = (HttpWebRequest)WebRequest.Create(url);
postRequest.Timeout = TIMEOUT * 1000000; //Really long
postRequest.Referer = "https://www.coursera.org";
if (headers != null)
{ //headers here }
//Deal with cookies
if (cookie != null)
{ cookieJar.Add(cookie); }
postRequest.CookieContainer = cookieJar;
postRequest.Method = "GET";
postRequest.AllowAutoRedirect = allowRedirect;
postRequest.ServicePoint.Expect100Continue = true;
HttpWebResponse postResponse = (HttpWebResponse)postRequest.GetResponse();
Any tips on how to proceed?
There are at least two ways to do what you're asking. The first is to use a range get. That is, specify the range of the file you want to read. You do that by calling AddRange on the HttpWebRequest. So if you want, say, the first 10 kilobytes of the file, you'd write:
request.AddRange(-10240);
Read carefully what the documentation says about the meaning of that parameter. If it's negative, it specifies the ending point of the range. There are also other overloads of AddRange that you might be interested in.
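For example, one of those overloads takes an explicit start and end offset, which avoids the sign ambiguity entirely:
request.AddRange(0, 10239); // request bytes 0 through 10239, i.e. the first 10 KB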
Not all servers support range gets, though. If that doesn't work, you'll have to do it another way.
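One quick way to check is to inspect the Accept-Ranges response header (a sketch; be aware that some servers omit the header even when ranges work):
using (var response = (HttpWebResponse)request.GetResponse())
{
    // "bytes" indicates range support; "none" or a missing header suggests there is none
    string acceptRanges = response.Headers["Accept-Ranges"];
    Console.WriteLine(acceptRanges ?? "(no Accept-Ranges header)");
}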
What you can do is call GetResponse and then start reading data. Once you've read as much data as you want, you can stop reading and close the stream. I've modified your sample slightly to show what I mean.
string url = "https://www.coursera.org/course/money";
HttpWebRequest postRequest = (HttpWebRequest)WebRequest.Create(url);
postRequest.Method = "GET";
postRequest.AllowAutoRedirect = true; //allowRedirect;
postRequest.ServicePoint.Expect100Continue = true;
HttpWebResponse postResponse = (HttpWebResponse) postRequest.GetResponse();
int maxBytes = 1024*1024;
int totalBytesRead = 0;
var buffer = new byte[maxBytes];
using (var s = postResponse.GetResponseStream())
{
int bytesRead;
// read up to `maxBytes` bytes from the response
while (totalBytesRead < maxBytes && (bytesRead = s.Read(buffer, totalBytesRead, maxBytes - totalBytesRead)) != 0)
{
// Here you can save the bytes read to a persistent buffer,
// or write them to a file.
Console.WriteLine("{0:N0} bytes read", bytesRead);
totalBytesRead += bytesRead;
}
}
Console.WriteLine("total bytes read = {0:N0}", totalBytesRead);
That said, I ran this sample and it downloaded about 6 kilobytes and stopped. I don't know why you're having trouble with timeouts or too much data.
Note that sometimes trying to close the stream before the entire response is read will cause the program to hang. I'm not sure why that happens at all, and I can't explain why it only happens sometimes. But you can solve it by calling request.Abort before closing the stream. That is:
using (var s = postResponse.GetResponseStream())
{
// do stuff here
// abort the request before continuing
postRequest.Abort();
}
Trying to get to the bottom of this!
I have a very basic app that uses HttpWebRequests to log in, navigate to a page, and then grab the HTML of that page. It then performs another web request to a third page every 5 minutes in a loop.
It's all working fine and is single-threaded (and fairly old); however, circumstances have changed and I now need to run multiple instances of this app close together (I have a .bat starting the app every 2 seconds as a temporary measure until I am able to code a new multithreaded solution).
When the first instances of the app start, everything is fine: the first request completes in ~2 seconds, the second in about 3 seconds.
However, as more and more instances of this app run concurrently (>100), something strange starts to happen.
The first web request still takes ~2 seconds, but the second request gets delayed much more, >1 minute, up to the point of timeout. I can't think why. The second page is larger than the first, but there is nothing out of the ordinary that would take over a minute to download.
The internet connection and hardware of this server are more than capable of handling these requests.
CookieContainer myContainer = new CookieContainer();
// first request is https
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://mysite.com/urlone");
request.CookieContainer = myContainer;
request.Proxy = proxy;
Console.WriteLine(System.DateTime.Now.ToLongTimeString() + " " + "Starting login request");
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream resStream = response.GetResponseStream();
string tempString = null;
int count = 0;
do
{
// fill the buffer with data
count = resStream.Read(buf, 0, buf.Length);
// make sure we read some data
if (count != 0)
{
// translate from bytes to ASCII text
tempString = Encoding.ASCII.GetString(buf, 0, count);
// continue building the string
sb.Append(tempString);
}
}
while (count > 0); // any more data to read?
sb.Clear();
response.Close();
resStream.Close();
string output6;
Console.WriteLine(System.DateTime.Now.ToLongTimeString() + " " + "login request complete");
HttpWebRequest request6 = (HttpWebRequest)WebRequest.Create("http://mysite.com/page2");
request6.CookieContainer = myContainer;
response = (HttpWebResponse)request6.GetResponse();
resStream = response.GetResponseStream();
tempString = null;
count = 0;
do
{
count = resStream.Read(buf, 0, buf.Length);
if (count != 0)
{
tempString = Encoding.ASCII.GetString(buf, 0, count);
sb.Append(tempString);
}
}
while (count > 0);
output6 = sb.ToString();
sb.Clear();
response.Close();
resStream.Close();
Any ideas? I'm not very advanced with HTTP web requests, so if someone could check that I haven't made any silly code mistakes above, I'd appreciate it. I'm at a loss as to what other information I may need to include here; if I have missed anything out, please tell me and I will do my best to provide it.
Thanks in advance.
EDIT 1:
I used Fiddler to find the source of the issue. It looks like the issue lies with the application (or Windows) not sending the requests for some reason; the physical request actually takes < 1 second according to Fiddler.
Check out a few things:
ServicePointManager.DefaultConnectionLimit: if you are planning to open more than 100 connections, set this value to something like 200-300.
If possible, use HttpWebRequest.KeepAlive = true.
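A minimal sketch of both suggestions together (300 is an assumed value; set the limit once at startup, before any request is created):
ServicePointManager.DefaultConnectionLimit = 300;
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://mysite.com/urlone");
request.KeepAlive = true; // reuse the underlying TCP connection between requests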
Try wrapping your response in a using statement to make sure it's always properly closed. If you hit the maximum number of connections, you otherwise have to wait for the earlier ones to time out before new ones can connect.
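Something like this (a sketch; the point is that Dispose runs even if an exception is thrown mid-read):
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (Stream resStream = response.GetResponseStream())
{
    // read from resStream here; both objects are closed when the block exits
}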
I have tried to implement a REST WCF service in order to explore the difference between the PUT and POST verbs. I have uploaded a file to a location using the service.
The service implementation is as follows:
[OperationContract]
[WebInvoke(UriTemplate = "/UploadFile", Method = "POST")]
void UploadFile(Stream fileContents);
public void UploadFile(Stream fileContents)
{
byte[] buffer = new byte[32768];
MemoryStream ms = new MemoryStream();
int bytesRead, totalBytesRead = 0;
do
{
bytesRead = fileContents.Read(buffer, 0, buffer.Length);
totalBytesRead += bytesRead;
ms.Write(buffer, 0, bytesRead);
} while (bytesRead > 0);
using (FileStream fs = File.OpenWrite(@"C:\temp\test.txt"))
{
ms.WriteTo(fs);
}
ms.Close();
}
Client code is as following:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://localhost:1922/EMPRESTService.svc/UploadFile");
request.Method = "POST";
request.ContentType = "text/plain";
byte[] fileToSend = File.ReadAllBytes(@"C:\TEMP\log.txt"); // the file to upload
request.ContentLength = fileToSend.Length;
using (Stream requestStream = request.GetRequestStream())
{
// Send the file as body request.
requestStream.Write(fileToSend, 0, fileToSend.Length);
//requestStream.Close();
}
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
Console.WriteLine("HTTP/{0} {1} {2}", response.ProtocolVersion, (int)response.StatusCode, response.StatusDescription);
Console.ReadLine();
The file is uploaded and the response status code comes back as "200 OK". The status code is the same whether or not the file already exists in the upload location.
I changed the REST verb to PUT, and the status code is the same as above.
Could anybody explain how I can identify the difference between the verbs in this context? I have not been able to simulate sending continuous requests from the client code. If the behaviour differs when doing so, could anybody help me modify the client code to send several requests in a row?
The POST verb is used when you are creating a new resource (a file in your case), and repeated operations create multiple resources on the server. This verb would make sense if uploading a file with the same name multiple times created multiple files on the server.
The PUT verb is used when you are updating an existing resource or creating a new resource with a predefined id. Repeated operations recreate or update the same resource on the server. This verb would make sense if uploading a file with the same name a second or third time overwrote the previously uploaded file.
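To see the difference from the client side, here is a sketch; the id-based URI template is assumed, not part of the service above. A PUT names the target resource explicitly, so repeating the call overwrites the same file instead of creating a new one:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(
    "http://localhost:1922/EMPRESTService.svc/Files/log.txt"); // hypothetical id-based URI
request.Method = "PUT"; // idempotent: running it twice still leaves exactly one log.txt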
I have a monitoring system, and I want to save a snapshot from a camera when an alarm triggers.
I have tried many ways to do this, and it all works fine: stream the snapshot from the camera, then save it as a JPG on the PC (JPG format, 1280*1024, 140 KB). That's fine.
But my problem is the application's performance...
The app needs about 20-30 seconds to read the stream, which is not acceptable because the method will be called every 2 seconds. I need to know what is wrong with that code and how I can make it much faster.
Many thanks in advance.
Code:
string sourceURL = "http://192.168.0.211/cgi-bin/cmd/encoder?SNAPSHOT";
byte[] buffer = new byte[200000];
int read, total = 0;
WebRequest req = (WebRequest)WebRequest.Create(sourceURL);
req.Credentials = new NetworkCredential("admin", "123456");
WebResponse resp = req.GetResponse();
Stream stream = resp.GetResponseStream();
while ((read = stream.Read(buffer, total, 1000)) != 0)
{
total += read;
}
Bitmap bmp = (Bitmap)Bitmap.FromStream(new MemoryStream(buffer, 0,total));
string path = JPGName.Text+".jpg";
bmp.Save(path);
I very much doubt that this code is the cause of the problem, at least for the first method call (but read further below).
Technically, you could produce the Bitmap without saving to a memory buffer first, or, if you don't need to display the image, save the raw data without ever constructing a Bitmap, but that's not going to buy you multiple seconds of improvement. Have you checked how long it takes to download the image from that URL using a browser, wget, curl or whatever tool? I suspect something is going on at the encoding source.
Something you should do is clean up your resources and close the stream properly. Failing to do so can cause exactly this kind of problem when you call the method regularly, because .NET only opens a few connections to the same host at any one point.
// Make sure the stream gets closed once we're done with it
using (Stream stream = resp.GetResponseStream())
{
// A larger buffer size would be beneficial, but it's not going
// to make a significant difference.
while ((read = stream.Read(buffer, total, 1000)) != 0)
{
total += read;
}
}
I cannot test the network behavior of the WebResponse stream, but note that you handle the data twice (once in your loop and once with your memory stream).
I don't think that's the whole problem, but I'd give this a try:
string sourceURL = "http://192.168.0.211/cgi-bin/cmd/encoder?SNAPSHOT";
WebRequest req = (WebRequest)WebRequest.Create(sourceURL);
req.Credentials = new NetworkCredential("admin", "123456");
WebResponse resp = req.GetResponse();
Stream stream = resp.GetResponseStream();
Bitmap bmp = (Bitmap)Bitmap.FromStream(stream);
string path = JPGName.Text + ".jpg";
bmp.Save(path);
Try reading bigger chunks of data than 1000 bytes at a time. I can see no problem with, for example:
read = stream.Read(buffer, 0, buffer.Length);
Try this to download the file.
using(WebClient webClient = new WebClient())
{
webClient.DownloadFile("http://192.168.0.211/cgi-bin/cmd/encoder?SNAPSHOT", "c:\\Temp\\myPic.jpg");
}
You can use a DateTime to put a unique stamp on the shot.
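For example (a small sketch; the path is assumed):
string fileName = string.Format("c:\\Temp\\snapshot_{0:yyyyMMdd_HHmmss}.jpg", DateTime.Now);
webClient.DownloadFile("http://192.168.0.211/cgi-bin/cmd/encoder?SNAPSHOT", fileName);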