Why does WebClient.UploadValues overwrite my html web page? - c#

I'm familiar with WinForms and WPF, but new to web development. One day I saw WebClient.UploadValues and decided to try it.
static void Main(string[] args)
{
    using (var client = new WebClient())
    {
        var values = new NameValueCollection();
        values["thing1"] = "hello";
        values["thing2"] = "world";
        // A single file that contains plain html
        var response = client.UploadValues("D:\\page.html", values);
        var responseString = Encoding.Default.GetString(response);
        Console.WriteLine(responseString);
    }
    Console.ReadLine();
}
After running it, nothing is printed, and the html file's content becomes this:
thing1=hello&thing2=world
Could anyone explain this? Thanks!

The UploadValues method is intended to be used with the HTTP protocol. This means that you need to host your html on a web server and make the request like this:
var response = client.UploadValues("http://some_server/page.html", values);
In this case the method will send the values to the server by using application/x-www-form-urlencoded encoding and it will return the response from the HTTP request.
I have never used UploadValues with a local file, and the documentation doesn't mention anything about it; it only mentions the HTTP and FTP protocols. So I suppose this is a side effect of using it with a local file: it simply overwrites the contents of the file with the payload being sent.

You are not using WebClient as it was intended.
The purpose of WebClient.UploadValues is to upload the specified name/value collection to the resource identified by the specified URI.
That resource should not be some local file on your disk; it should be a web service listening for requests and issuing responses.
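To see the intended round trip end to end, here is a minimal sketch that pairs the question's client code with an HttpListener standing in for the web server. The localhost port is arbitrary, and HttpListener may need URL-ACL rights on some machines:
using System;
using System.Collections.Specialized;
using System.IO;
using System.Net;
using System.Text;
using System.Threading.Tasks;

class UploadValuesDemo
{
    static void Main()
    {
        // A throwaway local server so UploadValues has something to talk to.
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:8080/");
        listener.Start();
        Task server = Task.Run(() =>
        {
            HttpListenerContext ctx = listener.GetContext();
            // The form-encoded body the client sent: thing1=hello&thing2=world
            string body = new StreamReader(ctx.Request.InputStream).ReadToEnd();
            byte[] reply = Encoding.UTF8.GetBytes("server saw: " + body);
            ctx.Response.OutputStream.Write(reply, 0, reply.Length);
            ctx.Response.Close();
        });

        using (var client = new WebClient())
        {
            var values = new NameValueCollection();
            values["thing1"] = "hello";
            values["thing2"] = "world";
            byte[] response = client.UploadValues("http://localhost:8080/", values);
            Console.WriteLine(Encoding.UTF8.GetString(response));
        }

        server.Wait();
        listener.Stop();
    }
}
This prints "server saw: thing1=hello&thing2=world" — the same form-encoded payload that ended up inside page.html in the question.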

Related

C# WPF POST Request to php file on https site on strato server not working

I want to post one or more values to a php file on a strato (the host) server on an https domain using a C# WPF desktop application. However, after several attempts with a test program, nothing seems to work. The testing value is not posted to the server; the $_POST array is empty. I do get an echoed answer from the server, but it's always 'Error' (see script below).
I have tried this with several techniques as well:
WebClient / HttpClient
HttpWebRequest
adjusting SecurityProtocolType
sending the value as byte[]
and what not.
$_GET, by the way, works perfectly fine, as I always get back the computed testing value from the php script. However, I would like to use a POST request since I am sending user data to the server.
More precisely, I already have tried these solutions (and several similar ones):
How to make HTTP POST web request (basically all of them except 3rd Party)
C# HttpClient not sending POST variables
https://holger.coders-online.net/118
http://www.howtosolvenow.com/how-to-send-https-post-request-with-c/
Testing PHP Script:
$newnumber = 'Error';
if (isset($_POST['number']))
{
    $newnumber = $_POST['number'] + 1;
}
echo $newnumber;
//keeps on returning 'Error'
Latest attempt of C# code:
string resultString = null;
string _url = "https://myURL.com/test.php";
//ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls11;
HttpClient client = new HttpClient();
var content = new FormUrlEncodedContent(new[] {
    new KeyValuePair<string, string>("number", "2"),
});
var response = await client.PostAsync(_url, content);
resultString = await response.Content.ReadAsStringAsync();
this._txt.Text = resultString;

How do I send a file on a self hosted web API and process it on the server?

I have a self hosted web api using Owin and Katana. I would like to send files (can be pretty large, a few hundred MB) from a sample client, and would like to save these files on the server's disk. Currently just testing the server on my local machine.
I have the following on the test client's machine (it says image here, but it's not always going to be an image):
using System;
using System.IO;
using System.Net.Http;

class Program
{
    // These fields must be static, since they are used from static methods.
    static string port = "1234";
    static string fileName = "whatever file I choose will be here";

    static void Main(string[] args)
    {
        string baseAddress = "http://localhost:" + port;
        InitiateClient(baseAddress);
    }

    static void InitiateClient(string serverBase)
    {
        Uri serverUri = new Uri(serverBase);
        using (HttpClient client = new HttpClient())
        {
            client.BaseAddress = serverUri;
            HttpResponseMessage response = SendImage(client, fileName);
            Console.WriteLine(response);
            Console.ReadLine();
        }
    }

    private static HttpResponseMessage SendImage(HttpClient client, string imageName)
    {
        using (var content = new MultipartFormDataContent())
        {
            byte[] imageBytes = System.IO.File.ReadAllBytes(imageName);
            content.Add(new StreamContent(new MemoryStream(imageBytes)), "File", "samplepic.png");
            client.DefaultRequestHeaders.Accept.Add(new System.Net.Http.Headers.MediaTypeWithQualityHeaderValue("multipart/form-data"));
            return client.PostAsync("api/ServiceA", content).Result;
        }
    }
}
First, is this the right way of sending a file using POST?
And now here is where I'm really lost. I am not sure how to save the file I receive in the Post method of my ServiceAController which inherits ApiController. I saw some other examples which used HttpContext.Current, but since it's self hosted, it seems to be null.
I would split the files into chunks before uploading; 100 MB is rather large for a single HTTP POST request, and most web servers also impose a limit on HTTP request size.
By using chunks you won't need to resend all data again if connection times out.
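As a client-side sketch of that idea — the endpoint route, the fileId scheme, and the chunk size below are all assumptions, not an existing API, and the server would need a matching action that reassembles the chunks:
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

static class ChunkedUploader
{
    const int ChunkSize = 4 * 1024 * 1024; // 4 MB per request; arbitrary

    public static async Task UploadInChunksAsync(string fileName)
    {
        using (var client = new HttpClient { BaseAddress = new Uri("http://localhost:1234/") })
        using (var file = File.OpenRead(fileName))
        {
            var buffer = new byte[ChunkSize];
            int read, index = 0;
            while ((read = await file.ReadAsync(buffer, 0, buffer.Length)) > 0)
            {
                // Hypothetical route: one POST per chunk, identified by file id and index.
                var chunk = new ByteArrayContent(buffer, 0, read);
                var response = await client.PostAsync("api/ServiceA/chunk?fileId=abc123&index=" + index++, chunk);
                response.EnsureSuccessStatusCode(); // on failure, retry only this chunk
            }
        }
    }
}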
It should not matter whether you use self hosting or IIS, and whether it is an image file or any type of file.
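For the receiving side, the self-host-friendly pattern is to read the multipart content through a MultipartFormDataStreamProvider instead of HttpContext.Current. A minimal sketch; the target folder is a placeholder and must exist before the first request arrives:
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;

public class ServiceAController : ApiController
{
    public async Task<HttpResponseMessage> Post()
    {
        if (!Request.Content.IsMimeMultipartContent())
            return Request.CreateResponse(HttpStatusCode.UnsupportedMediaType);

        // Placeholder folder; parts are streamed to disk as they are read.
        var provider = new MultipartFormDataStreamProvider(@"C:\uploads");
        await Request.Content.ReadAsMultipartAsync(provider);

        foreach (MultipartFileData file in provider.FileData)
        {
            // file.LocalFileName is a server-generated temp name under C:\uploads;
            // the client's original name is in file.Headers.ContentDisposition.FileName.
        }

        return Request.CreateResponse(HttpStatusCode.OK);
    }
}
Because the provider writes each part to disk while reading, even large uploads are not buffered fully in memory.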
You can check my answer, which gives you simple code to do so:
https://stackoverflow.com/a/10765972/71924
Regarding size, it is definitely better to chunk if possible, but this means more work for your clients (unless you also own the API client code) and for you on the server, since you must rebuild the file.
It depends on whether all files will be over 100 MB or only a few. If they are consistently large, I would suggest looking into HTTP byte-range support. It is part of the HTTP standard, and I am sure you can find somebody who has implemented it with Web API.

How to get server response in JSON?

I am uploading a .json file from my local drive:
using (WebClient client = new WebClient())
{
    client.Headers.Add("Content-Type", "application/json");
    byte[] resp = client.UploadFile("http://mycoolWebsite.com", "POST", "path to file");
    string textResponse = System.Text.Encoding.ASCII.GetString(resp);
}
The response from client.UploadFile is of type byte[], but I want JSON so I can parse it more easily. How can I ask the server to give me back JSON?
The method is defined as returning byte[] with good reason: it allows the method to be used with any web service and to return the raw response from the server. Defining the response format is the responsibility of the server (obviously). Your best bet is to take the raw response, decode it as text (as you're doing), and then check whether the response contains well-formed JSON, parsing it at that point.
The response will be whatever the server returns; it's up to you to handle it.
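A sketch of that check, assuming the Json.NET (Newtonsoft.Json) package is available; the URL and file path are the question's placeholders:
using System;
using System.Net;
using Newtonsoft.Json.Linq;

class JsonResponseDemo
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            client.Headers.Add("Content-Type", "application/json");
            byte[] resp = client.UploadFile("http://mycoolWebsite.com", "POST", "path to file");
            string text = client.Encoding.GetString(resp);

            try
            {
                JToken json = JToken.Parse(text); // throws if the body is not well-formed JSON
                Console.WriteLine(json.ToString());
            }
            catch (Newtonsoft.Json.JsonReaderException)
            {
                Console.WriteLine("Server did not return JSON: " + text);
            }
        }
    }
}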

Resend HTTP header

I have an application. It sends a request to my proxy class. The proxy must parse the HTTP header string (I have done this) and resend the request to the server to get a video.
First, the media component connects to the proxy:
var uri = new Uri("http://127.0.0.1:2233/files/1.mp4");
videoPlayer.Source = uri;
Play();
The proxy gets the HTTP header string:
"GET /files/1.mp4 HTTP/1.1\r\nCache-Control: no-cache\r\nConnection: Keep-Alive\r\nPragma: getIfoFileURI.dlna.org\r\nAccept: */*\r\nUser-Agent: NSPlayer/12.00.7601.17514 WMFSDK/12.00.7601.17514\r\nGetContentFeatures.DLNA.ORG: 1\r\nHost: 127.0.0.1:2233\r\n\r\n"
I replace the host:
"GET /files/1.mp4 HTTP/1.1\r\nCache-Control: no-cache\r\nConnection: Keep-Alive\r\nPragma: getIfoFileURI.dlna.org\r\nAccept: */*\r\nUser-Agent: NSPlayer/12.00.7601.17514 WMFSDK/12.00.7601.17514\r\nGetContentFeatures.DLNA.ORG: 1\r\nHost: myserver.ru\r\n\r\n"
Now the proxy must get the video from the server. What must I do?
When using .NET, you don't have to manually create the HTTP message itself. Instead, use the classes in the System.Net.Http namespace to form and send an HTTP message and process the response.
For example, sending an HTTP GET message to a URL can be as simple as:
var uri = new Uri("http://www.foobar.com/");
var client = new HttpClient();
string body = await client.GetStringAsync(uri);
Note that this general approach will download the entire contents of the resource at the given URI. In your case, you may not want to wait for the whole video to download before you start playing/processing/storing it, in which case you might want to use HttpClient.GetStreamAsync(), which returns a stream you can read from until the stream closes.
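For example, a minimal sketch of reading the body incrementally; the upstream URL is the one from the question's rewritten Host header, and the buffer size is arbitrary:
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class ProxyFetch
{
    // Copies the upstream video to 'destination' chunk by chunk
    // instead of buffering the whole file in memory.
    static async Task CopyVideoAsync(Stream destination)
    {
        using (var client = new HttpClient())
        using (Stream source = await client.GetStreamAsync("http://myserver.ru/files/1.mp4"))
        {
            var buffer = new byte[81920];
            int read;
            while ((read = await source.ReadAsync(buffer, 0, buffer.Length)) > 0)
            {
                // Forward each chunk to the media component as it arrives.
                await destination.WriteAsync(buffer, 0, read);
            }
        }
    }
}
GetStringAsync and GetByteArrayAsync buffer the whole response; GetStreamAsync hands you the body as it arrives, which is what a proxy wants.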

System.Net.WebClient unreasonably slow

When using the System.Net.WebClient.DownloadData() method I'm getting an unreasonably slow response time.
When fetching a URL using the WebClient class in .NET, it takes around 10 seconds before I get a response, while the same page is fetched by my browser in under 1 second.
And this is with data that's 0.5kB or smaller in size.
The request involves POST/GET parameters and a user-agent header, in case that could cause problems.
I haven't (yet) tried if other ways to download data in .NET gives me the same problems, but I'm suspecting I might get similar results. (I've always had a feeling web requests in .NET are unusually slow...)
What could be the cause of this?
Edit:
I tried doing the exact same thing using System.Net.HttpWebRequest instead, using the following method, and all requests finish in under 1 second.
public static string DownloadText(string url)
{
    var request = (HttpWebRequest)WebRequest.Create(url);
    var response = (HttpWebResponse)request.GetResponse();
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        return reader.ReadToEnd();
    }
}
While this (old) method using System.Net.WebClient takes 15-30s for each request to finish:
public static string DownloadText(string url)
{
    var client = new WebClient();
    byte[] data = client.DownloadData(url);
    return client.Encoding.GetString(data);
}
I had that problem with WebRequest. Try setting Proxy = null;
WebClient wc = new WebClient();
wc.Proxy = null;
By default, WebClient and WebRequest try to determine what proxy to use from the IE settings, and sometimes this results in a delay of around 5 seconds before the actual request is sent.
This applies to all classes that use WebRequest, including WCF services with HTTP binding.
In general you can use this static code at application startup:
WebRequest.DefaultWebProxy = null;
Download Wireshark here http://www.wireshark.org/
Capture the network packets and filter the "http" packets.
It should give you the answer right away.
There is nothing inherently slow about .NET web requests; that code should be fine. I regularly use WebClient and it works very quickly.
How big is the payload in each direction? Silly question maybe, but is it simply bandwidth limitations?
IMO the most likely thing is that your web-site has spun down, and when you hit the URL the web-site is slow to respond. This is then not the fault of the client. It is also possible that DNS is slow for some reason (in which case you could hard-code the IP into your "hosts" file), or that some proxy server in the middle is slow.
If the web-site isn't yours, it is also possible that they are detecting atypical usage and deliberately injecting a delay to annoy scrapers.
I would grab Fiddler (a free, simple web inspector) and look at the timings.
WebClient may be slow on some workstations when Automatic Proxy Settings is checked in the IE settings (Connections tab - LAN Settings).
Setting WebRequest.DefaultWebProxy = null; or client.Proxy = null didn't do anything for me, using Xamarin on iOS.
I did two things to fix this:
I wrote a downloadString function which does not use WebRequest or System.Net.WebClient:
public static async Task<string> FnDownloadStringWithoutWebRequest(string url)
{
    using (var client = new HttpClient())
    {
        // Define headers
        client.DefaultRequestHeaders.Accept.Clear();
        client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

        var response = await client.GetAsync(url);
        if (response.IsSuccessStatusCode)
        {
            string responseContent = await response.Content.ReadAsStringAsync();
            //dynamic json = Newtonsoft.Json.JsonConvert.DeserializeObject(responseContent);
            return responseContent;
        }

        Logger.DefaultLogger.LogError(LogLevel.NORMAL, "GoogleLoginManager.FnDownloadString", "error fetching string, code: " + response.StatusCode);
        return "";
    }
}
This is however still slow with Managed HttpClient.
So secondly, in Visual Studio Community for Mac, right click on your Project in the Solution -> Options -> set HttpClient implementation to NSUrlSession, instead of Managed.
Screenshot: Set HttpClient implementation to NSUrlSession instead of Managed
Managed is not fully integrated into iOS, doesn't support TLS 1.2, and thus does not support the ATS standards set as default in iOS9+, see here:
https://learn.microsoft.com/en-us/xamarin/ios/app-fundamentals/ats
With both these changes, string downloads are always very fast (<<1s).
Without both of these changes, on every second or third try, downloadString took over a minute.
Just FYI, there's one more thing you could try, though it shouldn't be necessary anymore:
//var authgoogle = new OAuth2Authenticator(...);
//authgoogle.Completed...

if (authgoogle.IsUsingNativeUI)
{
    // Step 2.1 Creating Login UI
    // In order to access the SFSafariViewController API the cast is necessary
    SafariServices.SFSafariViewController c = null;
    c = (SafariServices.SFSafariViewController)ui_object;
    PresentViewController(c, true, null);
}
else
{
    PresentViewController(ui_object, true, null);
}
Though in my experience, you probably don't need the SafariController.
Another alternative (also free) to Wireshark is Microsoft Network Monitor.
What browser are you using to test?
Try using the default IE install. System.Net.WebClient uses the local IE settings (proxy etc.). Maybe those have been mangled?
Another cause of extremely slow WebClient downloads is the destination medium to which you are downloading. If it is a slow device like a USB key, this can massively impact download speed. To my HDD I could download at 6 MB/s; to my USB key, only 700 KB/s, even though I can copy files to this USB key at 5 MB/s from another drive. wget shows the same behavior. This is also reported here:
https://superuser.com/questions/413750/why-is-downloading-over-usb-so-slow
So if this is your scenario, an alternative solution is to download to the HDD first and then copy the files to the slow medium after the download completes.
