Box API v2: creating a folder with Cyrillic letters in its name - C#

I'm trying to create a folder using the new API.
If the folder name contains Cyrillic letters, I get HTTP 400 Bad Request.
However, it works fine with Latin letters.
Is this a known issue?

I found the correct answer here: Detecting the character encoding of an HTTP POST request. The default encoding of an HTTP POST is ISO-8859-1.
The only thing I needed to do was set the encoding of the request manually.
By the way, here is the working code:
public static Task<string> Post(string url, string data, string authToken) {
    // Force UTF-8 so non-Latin characters survive the POST
    var client = new WebClient { Encoding = Encoding.UTF8 };
    client.Headers.Add("Content-Type:application/x-www-form-urlencoded");
    client.Headers.Add(AuthHeader(authToken));
    return client.UploadStringTaskAsync(new Uri(url), "POST", data);
}

Problems with international characters in Box API calls usually just need minor adjustments to the request encoding. My guess is that you only need to URL-encode the target folder name.
If that doesn't do the trick, we may be able to help more if you post a sample request or code snippet. If you do, keep your API key and auth token to yourself.
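For illustration, here is a minimal sketch of that suggestion built on the Post helper above; the endpoint URL and the name/parent_id form fields are assumptions for the example, not confirmed Box API details:
// Hypothetical request using the Post() helper above; endpoint and field names are assumptions.
string folderName = "Новая папка";                        // Cyrillic folder name
string body = "name=" + Uri.EscapeDataString(folderName)  // Uri.EscapeDataString percent-encodes as UTF-8
            + "&parent_id=0";
string reply = await Post("https://api.box.com/2.0/folders", body, authToken);  // inside an async method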

Related

C# MVC Android URL encoded parameters

I have a simple website I am playing around with that takes in an email and password to log in. This all works fine in a PC browser and on Apple phones/tablets.
However, Android devices will not log in. I routed my test phone through my PC and used Fiddler to look at the HTTP traffic, and Android is URL-encoding email addresses, so instead of foo@here.com I am seeing foo%40here.com, and my server-side validation fails because that is not a valid email address.
The easy way, to my mind, of detecting this is to simply look at the email string before I validate it and check for the %40 sequence. But if Android is going to URL-encode everything by default, this could give me issues further down the line, and having to examine strings and guess whether a % is part of some URL encoding or a genuine percent sign will become a problem.
So is there an attribute or filter I can apply to my method so that the framework will decode it as required, or do I need to handle this on a case-by-case basis?
Try adding a Content-Type header to your request before sending it,
like this: request.setHeader("Content-Type", "application/x-www-form-urlencoded;charset=UTF-8");
See the example:
HttpClient httpClient = new DefaultHttpClient();
HttpPost request = new HttpPost(Constants.mWebURL + Constants.mSendBloodReq);
request.setHeader("Content-Type", "application/x-www-form-urlencoded;charset=UTF-8");
// adding parameters
List<NameValuePair> postParameters = new ArrayList<NameValuePair>();
postParameters.add(new BasicNameValuePair("user_id", params[0]));
postParameters.add(new BasicNameValuePair("password", params[1]));
UrlEncodedFormEntity formEntity = new UrlEncodedFormEntity(postParameters, "UTF-8");
request.setEntity(formEntity);
HttpResponse httpresponse = httpClient.execute(request);
// getting response
response = EntityUtils.toString(httpresponse.getEntity());
My method on the controller wasn't decorated with an [HttpPost] attribute. Adding that seems to have fixed it for me.
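For reference, a hedged sketch of what that looks like; the controller and action names here are made up for illustration:
using System.Web.Mvc;

public class AccountController : Controller
{
    // With [HttpPost], MVC model-binds the posted form body and URL-decodes the
    // values, so "foo%40here.com" arrives here as "foo@here.com".
    [HttpPost]
    public ActionResult Login(string email, string password)
    {
        // email is already decoded at this point; validate it as usual
        return View();
    }
}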

Character Encoding causes my ASP Web API request to fail with HTTP 400: Bad Request

I have a JSON object which I serialize and post. I found that if one of the string members of my JSON object contains certain Unicode characters such as é or ó (other Unicode characters I tested do not trigger the error), the request returns HTTP 400 Bad Request.
Here is the simplified code for how the call is being made:
var client = new WebClient();
Foo myObject = new Foo();
myObject.StringField1 = "é";
string serializedObjects = Newtonsoft.Json.JsonConvert.SerializeObject(myObject);
client.Headers[HttpRequestHeader.ContentType] = "application/json;charset=utf-8";
var response = client.UploadString(url, serializedObjects);
Here is the code for how the call is received on the server side:
public async Task<IHttpActionResult> Post([FromBody] IList<Foo> myObjects){}
Here are some things that I have researched/tried and some assumptions I have made:
1) Get the string object as UTF8 bytes, then transform it back to a string
Encoding.UTF8.GetString(Encoding.UTF8.GetBytes(serializedObjects))
2) When I make the request, capture the traffic with Fiddler, and inspect the request in Raw view: the Unicode characters that don't cause a 400 error are replaced with a '?', whereas the ones that do cause a 400 are displayed as a blank. In the JSON view they show up as a null character '�'.
3) HTTP 400 is returned if ASP Web API can't deserialize the object correctly, so I tried using the Newtonsoft library to serialize and deserialize the object, and this works just fine. (The default serializer for ASP Web API is JSON.NET, so it should be performing the same serialization: http://www.asp.net/web-api/overview/formats-and-model-binding/json-and-xml-serialization)
string b = JsonConvert.SerializeObject(a);
Foo c = (Foo) JsonConvert.DeserializeObject(b, typeof(Foo));
// And I tried deserializing using the generic method
Foo d = JsonConvert.DeserializeObject<Foo>(b);
4) I was convinced that this was an issue with my server rejecting these special characters, but when I used Fiddler to retry the request (inserting the é back in the place where Fiddler had removed it from in the original request) the call succeeds.
5) I validated my Content-Type on http://www.iana.org/assignments/media-types/media-types.xhtml and also specify the utf8 charset explicitly.
So this seems to me to be an encoding issue, but é should be a valid UTF-8 character. I validated it in Jon Skeet's Unicode Explorer widget on his page. So I'm out of ideas for why this is causing issues...
As @roeland and @dbc pointed out, there is an Encoding property on the WebClient. Setting this to Encoding.UTF8 resolved the issue.
client.Encoding = Encoding.UTF8;
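In context, a minimal sketch of the corrected client (Foo, url and the JSON.NET serializer are as in the question):
using (var client = new WebClient())
{
    client.Encoding = Encoding.UTF8;  // without this, UploadString uses WebClient's default encoding (the system ANSI code page on .NET Framework)
    client.Headers[HttpRequestHeader.ContentType] = "application/json;charset=utf-8";
    // the action binds IList<Foo>, so serialize a list/array
    string payload = Newtonsoft.Json.JsonConvert.SerializeObject(new[] { new Foo { StringField1 = "é" } });
    string response = client.UploadString(url, payload);
}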

Can I allow raw unicode in HTTP headers using NSUrlSession?

I'm constructing an NSUrlSession as follows:
NSUrlSessionConfiguration sessionCfg = NSUrlSessionConfiguration.CreateBackgroundSessionConfiguration("mySpecialSessionName");
NSUrlSessionDelegate sessionDelegate = new MySessionDelegate();
urlSession = NSUrlSession.FromConfiguration(sessionCfg, sessionDelegate, NSOperationQueue.MainQueue);
And invoking background downloads with custom HTTP headers:
NSMutableUrlRequest mutableRequest = new NSMutableUrlRequest();
mutableRequest.HttpMethod = "POST";
mutableRequest.Url = NSUrl.FromString(someEndpoint);
mutableRequest["MyCustomHeader"] = someStringWithUnicodeChars;
mutableRequest.Body = NSData.FromString(somePostBody);
NSUrlSessionDownloadTask downloadTask = m_UrlSession.CreateDownloadTask(mutableRequest);
downloadTask.Resume();
However, the header value string seems to get truncated at the first character above 255. For example, the header value:
SupeЯ Σario Bros
is received by the server as
Supe
When instead using .NET HttpClient on xamarin, unicode header strings successfully make it to the server unmodified. However, I'd like to make use of NSUrlSession's background downloading feature.
(I realize that support of unicode in HTTP headers is hit-and-miss, but since the HTTP server in this case is a particular custom server that doesn't currently support things like base64 encoding, passing the raw string is desired)
I don't know whether you'll be able to make that work, but two things come to mind:
What you have here is equivalent to calling setValue:forKey: on the URL request. I don't think that will do what you're expecting. Try calling the setValue:forHTTPHeaderField: method instead.
Try specifying the encoding before you specify your custom header value, e.g. [theRequest setValue:@"...; charset=UTF-8" forHTTPHeaderField:@"Content-Type"];
If neither of those helps, you'll probably have to encode the data in some way. I would suggest using URL encoding, because that's a lot simpler to implement on the server side than Base64. For the iOS side, see this link for info on how to URL-encode a string:
https://developer.apple.com/library/mac/documentation/Cocoa/Conceptual/URLLoadingSystem/WorkingwithURLEncoding/WorkingwithURLEncoding.html
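If URL-encoding the header value is acceptable, a sketch of it in the Xamarin C# from the question could look like this (the header and variable names mirror the snippet above; the server must URL-decode the value again):
// Percent-encode the header value as UTF-8 so only ASCII bytes travel in the header.
string rawValue = "SupeЯ Σario Bros";
mutableRequest["MyCustomHeader"] = Uri.EscapeDataString(rawValue);
// on the wire: "Supe%D0%AF%20%CE%A3ario%20Bros"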

German characters sending data using POST method from ASP page to PHP page

I have a problem with sending data from an ASP page to a PHP page using the POST method.
I would like to send mail with names. Since I live in Austria, the names are in German and contain some special characters, and these characters don't arrive right.
I'm still pretty new to programming with C#, by the way. I had the website in JavaScript before, but I had to connect it to a database, so I switched to C#, and now I'm like a "babe in the woods".
this.hdnDaten.Value = "ÄÖÜ|äöü|ß|é|#";
// try it with POST
using (var client = new WebClient())
{
    var postData = new System.Collections.Specialized.NameValueCollection();
    postData.Add("von", this.hdnVon.Value);
    postData.Add("an", this.hdnAn.Value);
    postData.Add("betreff", this.hdnBetreff.Value);
    postData.Add("daten", this.hdnDaten.Value);
    byte[] response = client.UploadValues("http://xxxxxx.php", "POST", postData);
    var responsebody = Encoding.UTF8.GetString(response);
}
And this is how the characters (in this.hdnDaten.Value) from above arrive in the mail-body:
ÄÖÜ|äöü|ß|é|#
Does anybody know what I can do to get the same characters in the end?
Edit 20143013: I think I have a clue: I have to encode the postData into ANSI (Codepage 1252). I tried to do this, but it doesn't work. Does anybody have an idea how I could do this?
Edit 20140320: I don't even dare to give you the answer: I was looking in the wrong place all the time (somewhat like MH370). The problem was with the receiving side of the mail (I was using a POP3 viewer for testing); when I downloaded the mail into Outlook everything was OK. The funny thing is that this didn't happen in the original (JavaScript) version, which is why I was looking in the wrong place.
Thanks
Eddie
Try setting client.Encoding to UTF-8 before calling UploadValues. Also ensure that you read the text as UTF-8 on the server.
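A minimal sketch of that suggestion applied to the snippet above (whether it is enough also depends on how the PHP side reads the values):
using (var client = new WebClient())
{
    client.Encoding = Encoding.UTF8;  // set before calling UploadValues, as suggested
    var postData = new System.Collections.Specialized.NameValueCollection();
    postData.Add("daten", this.hdnDaten.Value);
    byte[] response = client.UploadValues("http://xxxxxx.php", "POST", postData);
    var responsebody = Encoding.UTF8.GetString(response);  // read the reply as UTF-8 too
}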
Try this.hdnDaten.Value = HttpUtility.UrlEncode("ÄÖÜ|äöü|ß|é|#"); on your POST parameters.
On the PHP side you'll need to decode the parameters via html_entity_decode.

Encoding non UTF-8 text in Parameters in ASP.NET MVC

Background
I have a web application that uses ISO-8859-1 encoding. When I pass parameters using Html.ActionLink(), the value is decoded to UTF-8:
Web.config:
<globalization requestEncoding="iso-8859-1" responseEncoding="iso-8859-1"
fileEncoding="iso-8859-1" />
Index.aspx
This is a <%= Html.ActionLink("test", "Read", new { name="Cosméticos" }) %>
generates the following:
This is a test
The problem is the value I receive in my controller is UTF-8, not iso-8859-1:
TestController:
public ActionResult Read(string name) {
    // name arrives here as "CosmÃ©ticos" (the UTF-8 bytes read as Latin-1), not "Cosméticos"!
}
Question
Why is the string not decoded to Cosméticos?
Are your aspx files physically saved in iso-8859-1?
Use "File / Save Xyz As" and the dropdown to the right of the Save button to get more encoding options to save your file with.
A guess
public static string ActionLinkNoEncode(this HtmlHelper htmlHelper, string linkText, ActionResult action)
{
    var urlHelper = new UrlHelper(htmlHelper.ViewContext.RequestContext);
    var url = Uri.UnescapeDataString(urlHelper.Action(action)).ToLowerInvariant();
    var linkTagBuilder = new TagBuilder("a");
    linkTagBuilder.MergeAttribute("href", url);
    linkTagBuilder.InnerHtml = linkText;
    return linkTagBuilder.ToString();
}
I found the problem and the workaround: the value I receive is UTF-8, but if I try to use System.Text.Encoding.UTF8.GetBytes(name) it converts the characters "é" to UTF-8 values instead of "É".
The workaround is to copy the string to a byte[] and then use System.Text.Encoding.Convert().
I don't know if this is the best way, but now everything is working for me.
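For what it's worth, here is a sketch of one way that repair can look; it assumes the string arriving in the action is the UTF-8 bytes mis-decoded as ISO-8859-1, and it is not necessarily the exact code the OP used:
// Recover the original UTF-8 bytes by encoding the mis-decoded string back to
// ISO-8859-1 (a 1:1 byte mapping for code points 0-255), then decode them as UTF-8.
byte[] rawBytes = Encoding.GetEncoding("ISO-8859-1").GetBytes(name);
string fixedName = Encoding.UTF8.GetString(rawBytes);  // "Cosméticos"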
A few things you might want to consider.
First, if you haven't already read it, I highly recommend Joel Spolsky's article 'The Absolute Minimum Every Software Developer Absolutely, Positively Must Know About Unicode and Character Sets (No Excuses!)'. It sets the stage for learning about character encoding and programming.
Second, looking at the docs on the globalization element in the web.config it sounds like there are ways to (accidentally?) override the specified encoding scheme. From the docs:
requestEncoding: Specifies the assumed encoding of each incoming request, including posted data and the query string. If the request comes with a request header containing an Accept-Charset attribute, it overrides the requestEncoding in configuration. The default encoding is UTF-8, specified in the <globalization> tag included in the Machine.config file created when the .NET Framework is installed. If request encoding is not specified in a Machine.config or Web.config file, encoding defaults to the computer's Regional Options locale setting. In single-server applications, requestEncoding and responseEncoding should be the same. For the less common case (multiple-server applications where the default server encodings are different), you can vary the request and response encoding using local Web.config files.
Have you tried using something like Fiddler to see what the Accept-Charset attribute is set to?
