I'm pretty new to this subject, but I've written an ASP.NET Core Web API using .NET 5. The API just reads and writes values to an Azure DB. It works perfectly fine on my machine, but after the upload/publish to Azure I get a "500 - Internal Server Error":
{
"statusCode": 500,
"message": "Internal server error",
"activityId": "15ed7068-cb40-4e79-8205-cc4c6d4be8be"
}
I've followed the Microsoft doc for publishing my API: https://learn.microsoft.com/en-us/aspnet/core/tutorials/publish-to-azure-api-management-using-vs?view=aspnetcore-5.0
Once I got to the step at https://learn.microsoft.com/en-us/aspnet/core/tutorials/publish-to-azure-api-management-using-vs?view=aspnetcore-5.0, it no longer works. Here is the trace:
api-inspector (0.238 ms)
{
"request": {
"method": "GET",
"url": "http://xxx.azure-api.net/v1/Tests",
"headers": [
{
"name": "sec-ch-ua",
"value": "\"Chromium\";v=\"92\",\" Not A;Brand\";v=\"99\",\"Google Chrome\";v=\"92\""
},
{
"name": "sec-ch-ua-mobile",
"value": "?0"
},
{
"name": "Ocp-Apim-Subscription-Key",
"value": ""
},
{
"name": "Sec-Fetch-Site",
"value": "cross-site"
},
{
"name": "Sec-Fetch-Mode",
"value": "cors"
},
{
"name": "Sec-Fetch-Dest",
"value": "empty"
},
{
"name": "X-Forwarded-For",
"value": "xxx.xxx.xxx.xxx"
},
{
"name": "Cache-Control",
"value": "no-cache, no-store"
},
{
"name": "Connection",
"value": "Keep-Alive"
},
{
"name": "Content-Type",
"value": "text/plain;charset=UTF-8"
},
{
"name": "Accept",
"value": "*/*"
},
{
"name": "Accept-Encoding",
"value": "gzip,deflate,br"
},
{
"name": "Accept-Language",
"value": "en,de;q=0.9,en-US;q=0.8,it;q=0.7,it-IT;q=0.6,und;q=0.5"
},
{
"name": "Host",
"value": "xxx.azure-api.net"
},
{
"name": "Referer",
"value": "https://apimanagement.hosting.portal.azure.net/"
}
]
}
}
api-inspector (0.003 ms)
{
"configuration": {
"api": {
"from": "/v1",
"to": {
"scheme": "https",
"host": "xxx.azurewebsites.net",
"port": 80,
"path": "/",
"queryString": "",
"query": {},
"isDefaultPort": false
},
"version": null,
"revision": "1"
},
"operation": {
"method": "GET",
"uriTemplate": "/Tests"
},
"user": "-",
"product": "-"
}
}
cors (0.020 ms)
"Origin header was missing or empty and the request was classified as not cross-domain. CORS policy was not applied."
Backend
(25.737 ms)
forward-request (0.093 ms)
{
"message": "Request is being forwarded to the backend service. Timeout set to 300 seconds",
"request": {
"method": "GET",
"url": "https://xxx.azurewebsites.net:80/Tests",
"headers": [
{
"name": "Host",
"value": "xxx.azurewebsites.net:80"
},
{
"name": "sec-ch-ua",
"value": "\"Chromium\";v=\"92\",\" Not A;Brand\";v=\"99\",\"Google Chrome\";v=\"92\""
},
{
"name": "sec-ch-ua-mobile",
"value": "?0"
},
{
"name": "Ocp-Apim-Subscription-Key",
"value": ""
},
{
"name": "Sec-Fetch-Site",
"value": "cross-site"
},
{
"name": "Sec-Fetch-Mode",
"value": "cors"
},
{
"name": "Sec-Fetch-Dest",
"value": "empty"
},
{
"name": "X-Forwarded-For",
"value": "xxx.xxx.xxx.xxx,xxx.xxx.xxx.xxx"
},
{
"name": "Cache-Control",
"value": "no-cache, no-store"
},
{
"name": "Content-Type",
"value": "text/plain;charset=UTF-8"
},
{
"name": "Accept",
"value": "*/*"
},
{
"name": "Accept-Encoding",
"value": "gzip,deflate,br"
},
{
"name": "Accept-Language",
"value": "en,de;q=0.9,en-US;q=0.8,it;q=0.7,it-IT;q=0.6,und;q=0.5"
},
{
"name": "Referer",
"value": "https://apimanagement.hosting.portal.azure.net/"
}
]
}
}
forward-request (25.643 ms)
{
"messages": [
"The underlying connection was closed: An unexpected error occurred on a send.",
"Error occured while calling backend service.",
"The handshake failed due to an unexpected packet format."
]
}
I've already tried making a completely new project and republishing that one, without success.
I had the same issue. I solved it by changing the port from 80 to 443 (if you use HTTPS). Hope it helps. You can also reference this: https://github.com/ThreeMammals/Ocelot/issues/912
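For reference, the mismatch is visible in the trace above: the backend is configured with "scheme": "https" but "port": 80, so API Management attempts a TLS handshake against a plain-HTTP port, which is exactly what produces "The handshake failed due to an unexpected packet format." A corrected backend section would look roughly like this (host redacted as in the trace); in practice you get this by setting the API's Web service URL in API Management to an https URL, so the default port 443 is used:
"to": {
    "scheme": "https",
    "host": "xxx.azurewebsites.net",
    "port": 443,
    "path": "/",
    "queryString": "",
    "query": {},
    "isDefaultPort": true
}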
I have been following the tutorial listed here - https://learn.microsoft.com/en-us/azure/cognitive-services/anomaly-detector/quickstarts/detect-data-anomalies-csharp
I have set up my Anomaly Detector API, so I have the endpoints and API keys ready to go. I have set up my own time series data, which I will post here for clarity. (Note: while the data below does NOT work, the sample data at https://github.com/Azure-Samples/anomalydetector/blob/master/example-data/request-data.json DOES work with the same code.):
{
"granularity": "daily",
"series": [
{
"timestamp": "2019-03-27T00:00:00",
"value": 10781.70
},
{
"timestamp": "2019-03-25T00:00:00",
"value": 4058.13
},
{
"timestamp": "2019-03-20T00:00:00",
"value": 8124132.33
},
{
"timestamp": "2019-03-19T00:00:00",
"value": 1571398.97
},
{
"timestamp": "2019-03-18T00:00:00",
"value": 2097703.40
},
{
"timestamp": "2019-03-15T00:00:00",
"value": 10624.76
},
{
"timestamp": "2019-03-14T00:00:00",
"value": 11647.00
},
{
"timestamp": "2019-03-13T00:00:00",
"value": 45937.16
},
{
"timestamp": "2019-03-08T00:00:00",
"value": 4237.20
},
{
"timestamp": "2019-03-07T00:00:00",
"value": 3315.40
},
{
"timestamp": "2019-03-04T00:00:00",
"value": 3218.77
},
{
"timestamp": "2019-02-28T00:00:00",
"value": 11271.00
},
{
"timestamp": "2019-02-27T00:00:00",
"value": 48605.08
},
{
"timestamp": "2019-02-26T00:00:00",
"value": 6181.12
},
{
"timestamp": "2019-02-25T00:00:00",
"value": 45069.00
},
{
"timestamp": "2019-02-22T00:00:00",
"value": 108860.84
},
{
"timestamp": "2019-02-21T00:00:00",
"value": 24924.50
},
{
"timestamp": "2019-02-20T00:00:00",
"value": 4068.50
},
{
"timestamp": "2019-02-19T00:00:00",
"value": 4329.60
},
{
"timestamp": "2019-02-18T00:00:00",
"value": 7615.20
},
{
"timestamp": "2019-02-14T00:00:00",
"value": 56974.10
},
{
"timestamp": "2019-02-13T00:00:00",
"value": 73393.52
},
{
"timestamp": "2019-02-12T00:00:00",
"value": 29991.99
},
{
"timestamp": "2019-02-11T00:00:00",
"value": 2906769.50
},
{
"timestamp": "2019-02-05T00:00:00",
"value": 1956853.85
},
{
"timestamp": "2019-02-04T00:00:00",
"value": 46863.31
},
{
"timestamp": "2019-01-31T00:00:00",
"value": 31602.31
},
{
"timestamp": "2019-01-30T00:00:00",
"value": 13149.59
},
{
"timestamp": "2018-10-10T00:00:00",
"value": 19380.60
},
{
"timestamp": "2018-08-21T00:00:00",
"value": 61801.45
},
{
"timestamp": "2018-08-16T00:00:00",
"value": 843.80
},
{
"timestamp": "2018-08-15T00:00:00",
"value": 52326.20
},
{
"timestamp": "2018-08-14T00:00:00",
"value": 136384.88
},
{
"timestamp": "2018-08-09T00:00:00",
"value": 7224.30
},
{
"timestamp": "2018-07-26T00:00:00",
"value": 16493.08
},
{
"timestamp": "2018-07-24T00:00:00",
"value": 1665163.72
},
{
"timestamp": "2018-07-23T00:00:00",
"value": 38642.88
},
{
"timestamp": "2018-07-13T00:00:00",
"value": 49913.00
},
{
"tim estamp": "2018-07-12T00:00:00",
"value": 49193.00
},
{
"timestamp": "2018-07-11T00:00:00",
"value": 37205.30
},
{
"timestamp": "2018-07-10T00:00:00",
"value": 44527.30
},
{
"timestamp": "2018-07-09T00:00:00",
"value": 148737.01
},
{
"timestamp": "2018-07-06T00:00:00",
"value": 138887.90
},
{
"timestamp": "2018-07-05T00:00:00",
"value": 74346.00
},
{
"timestamp": "2018-07-04T00:00:00",
"value": 71181.50
},
{
"timestamp": "2018-07-03T00:00:00",
"value": 215164.43
},
{
"timestamp": "2018-07-02T00:00:00",
"value": 83817.50
}
]
}
When I run my test code, I get back a 400 BAD REQUEST (with no additional information to suggest why). Here is my console app code (with the obvious parts redacted):
using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

namespace anomalyDetectionMachineLearning
{
    class Program
    {
        // Replace the subscriptionKey string value with your valid subscription key.
        const string subscriptionKey = "";

        // Replace the endpoint URL with the correct one for your subscription.
        // Your endpoint can be found in the Azure portal. For example: https://westus2.api.cognitive.microsoft.com
        const string endpoint = "https://westus2.api.cognitive.microsoft.com/";

        // Replace the dataPath string with a path to the JSON formatted time series data.
        const string dataPath = @"C:\Temp\data.txt";

        const string latestPointDetectionUrl = "/anomalydetector/v1.0/timeseries/last/detect";
        const string batchDetectionUrl = "/anomalydetector/v1.0/timeseries/entire/detect";

        static void Main(string[] args)
        {
            try
            {
                var requestData = File.ReadAllText(dataPath);
                //Console.Write(requestData.ToString());
                detectAnomaliesBatch(requestData);
                //detectAnomaliesLatest(requestData);
                System.Console.ReadKey();
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.ToString());
                Console.ReadKey();
            }
        }

        public static void detectAnomaliesBatch(string requestData)
        {
            System.Console.WriteLine("Detecting anomalies as a batch");
            var result = Request(
                endpoint,
                batchDetectionUrl,
                subscriptionKey,
                requestData).Result;

            dynamic jsonObj = Newtonsoft.Json.JsonConvert.DeserializeObject(result);
            System.Console.WriteLine(jsonObj);

            bool[] anomalies = jsonObj["isAnomaly"].ToObject<bool[]>();
            System.Console.WriteLine("\n Anomalies detected in the following data positions:");
            for (var i = 0; i < anomalies.Length; i++)
            {
                if (anomalies[i])
                {
                    System.Console.Write(i + ", ");
                }
            }
        }

        static async Task<string> Request(string baseAddress, string endpoint, string subscriptionKey, string requestData)
        {
            using (HttpClient client = new HttpClient { BaseAddress = new Uri(baseAddress) })
            {
                System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls;
                client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
                client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", subscriptionKey);

                var content = new StringContent(requestData, Encoding.UTF8, "application/json");
                var res = await client.PostAsync(endpoint, content);
                if (res.IsSuccessStatusCode)
                {
                    return await res.Content.ReadAsStringAsync();
                }
                else
                {
                    Console.WriteLine($"{res.Content.ToString()}");
                    Console.ReadKey();
                    return $"ErrorCode: {res.StatusCode}";
                }
            }
        }
    }
}
Here is the working sample data that causes no issues (pasted here at the request of someone in the comments):
{
"granularity": "daily",
"series": [
{
"timestamp": "2018-03-01T00:00:00Z",
"value": 32858923
},
{
"timestamp": "2018-03-02T00:00:00Z",
"value": 29615278
},
{
"timestamp": "2018-03-03T00:00:00Z",
"value": 22839355
},
{
"timestamp": "2018-03-04T00:00:00Z",
"value": 25948736
},
{
"timestamp": "2018-03-05T00:00:00Z",
"value": 34139159
},
{
"timestamp": "2018-03-06T00:00:00Z",
"value": 33843985
},
{
"timestamp": "2018-03-07T00:00:00Z",
"value": 33637661
},
{
"timestamp": "2018-03-08T00:00:00Z",
"value": 32627350
},
{
"timestamp": "2018-03-09T00:00:00Z",
"value": 29881076
},
{
"timestamp": "2018-03-10T00:00:00Z",
"value": 22681575
},
{
"timestamp": "2018-03-11T00:00:00Z",
"value": 24629393
},
{
"timestamp": "2018-03-12T00:00:00Z",
"value": 34010679
},
{
"timestamp": "2018-03-13T00:00:00Z",
"value": 33893888
},
{
"timestamp": "2018-03-14T00:00:00Z",
"value": 33760076
},
{
"timestamp": "2018-03-15T00:00:00Z",
"value": 33093515
},
{
"timestamp": "2018-03-16T00:00:00Z",
"value": 29945555
},
{
"timestamp": "2018-03-17T00:00:00Z",
"value": 22676212
},
{
"timestamp": "2018-03-18T00:00:00Z",
"value": 25262514
},
{
"timestamp": "2018-03-19T00:00:00Z",
"value": 33631649
},
{
"timestamp": "2018-03-20T00:00:00Z",
"value": 34468310
},
{
"timestamp": "2018-03-21T00:00:00Z",
"value": 34212281
},
{
"timestamp": "2018-03-22T00:00:00Z",
"value": 38144434
},
{
"timestamp": "2018-03-23T00:00:00Z",
"value": 34662949
},
{
"timestamp": "2018-03-24T00:00:00Z",
"value": 24623684
},
{
"timestamp": "2018-03-25T00:00:00Z",
"value": 26530491
},
{
"timestamp": "2018-03-26T00:00:00Z",
"value": 35445003
},
{
"timestamp": "2018-03-27T00:00:00Z",
"value": 34250789
},
{
"timestamp": "2018-03-28T00:00:00Z",
"value": 33423012
},
{
"timestamp": "2018-03-29T00:00:00Z",
"value": 30744783
},
{
"timestamp": "2018-03-30T00:00:00Z",
"value": 25825128
},
{
"timestamp": "2018-03-31T00:00:00Z",
"value": 21244209
},
{
"timestamp": "2018-04-01T00:00:00Z",
"value": 22576956
},
{
"timestamp": "2018-04-02T00:00:00Z",
"value": 31957221
},
{
"timestamp": "2018-04-03T00:00:00Z",
"value": 33841228
},
{
"timestamp": "2018-04-04T00:00:00Z",
"value": 33554483
},
{
"timestamp": "2018-04-05T00:00:00Z",
"value": 32383350
},
{
"timestamp": "2018-04-06T00:00:00Z",
"value": 29494850
},
{
"timestamp": "2018-04-07T00:00:00Z",
"value": 22815534
},
{
"timestamp": "2018-04-08T00:00:00Z",
"value": 25557267
},
{
"timestamp": "2018-04-09T00:00:00Z",
"value": 34858252
},
{
"timestamp": "2018-04-10T00:00:00Z",
"value": 34750597
},
{
"timestamp": "2018-04-11T00:00:00Z",
"value": 34717956
},
{
"timestamp": "2018-04-12T00:00:00Z",
"value": 34132534
},
{
"timestamp": "2018-04-13T00:00:00Z",
"value": 30762236
},
{
"timestamp": "2018-04-14T00:00:00Z",
"value": 22504059
},
{
"timestamp": "2018-04-15T00:00:00Z",
"value": 26149060
},
{
"timestamp": "2018-04-16T00:00:00Z",
"value": 35250105
}
]
}
According to the "400 Possible Errors" listed in this doc, https://westus2.dev.cognitive.microsoft.com/docs/services/AnomalyDetector/operations/post-timeseries-entire-detect, it seems there are two issues in your data:
The timestamps are not in ascending order.
There seem to be more than 10% of points missing in the given time range.
I investigated the sample. The reason you could not see the error message is that the code should be:
if (res.IsSuccessStatusCode)
{
    return await res.Content.ReadAsStringAsync();
}
else
{
    Console.WriteLine(res.StatusCode);
    return await res.Content.ReadAsStringAsync();
}
The first request failed because of this:
{
"tim estamp": "2018-07-12T00:00:00",
"value": 49193.00
}
But it should be:
{
"timestamp": "2018-07-12T00:00:00",
"value": 49193.00
}
However, the request still could not be submitted successfully, as the Anomaly Detector service requires the input to be sorted by timestamp in ascending order, and the missing-value rate of the input time series cannot exceed 10%; a small sorting sketch follows after the rules below. You can find more useful information about the service in the API reference and the best-practices section.
Thanks for your feedback; we have also released a .NET SDK. You may consider using the SDK to handle the client work. We will also update the quickstart to cover the error messages for failed requests.
There are some rules you should strictly follow:
The file should contain two and only two columns, timestamp and value, all in lowercase.
The timestamp should be in ISO 8601 format.
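Given the ascending-order requirement above, here is a minimal sketch of sorting the request before posting it, using Newtonsoft.Json as the question's code already does and assuming the request shape shown above. It does not fix the other reported problem (more than 10% of points missing in the covered range), which requires filling or trimming gaps in the data.

using System;
using System.Linq;
using Newtonsoft.Json.Linq;

// Sorts the "series" array of an Anomaly Detector request by timestamp, ascending.
// Assumes every point has a well-formed "timestamp" key (i.e. the "tim estamp"
// typo above is already fixed).
static string SortSeriesByTimestamp(string requestJson)
{
    var root = JObject.Parse(requestJson);
    var sorted = root["series"].OrderBy(point => (DateTime)point["timestamp"]).ToList();
    root["series"] = new JArray(sorted);
    return root.ToString();
}

In the question's Program class this could be applied in Main, e.g. requestData = SortSeriesByTimestamp(requestData); before calling detectAnomaliesBatch(requestData);.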
I have this JSON:
{
"total": 23695,
"total_pages": 1185,
"results": [{
"id": "r7bVvV7MLdQ",
"created_at": "2018-01-17T06:38:03-05:00",
"updated_at": "2018-05-09T03:35:24-04:00",
"width": 4032,
"height": 2526,
"color": "#F7EDE7",
"description": null,
"urls": {
"raw": "https://images.unsplash.com/photo-1516189050082-44d4deb5ceef?ixlib=rb-0.3.5\u0026ixid=eyJhcHBfaWQiOjEyMDd9\u0026s=8b6caac6353f390fbbabde8441dd1959",
"full": "https://images.unsplash.com/photo-1516189050082-44d4deb5ceef?ixlib=rb-0.3.5\u0026q=85\u0026fm=jpg\u0026crop=entropy\u0026cs=srgb\u0026ixid=eyJhcHBfaWQiOjEyMDd9\u0026s=89ca725623d794116d3741907c93ceab",
"regular": "https://images.unsplash.com/photo-1516189050082-44d4deb5ceef?ixlib=rb-0.3.5\u0026q=80\u0026fm=jpg\u0026crop=entropy\u0026cs=tinysrgb\u0026w=1080\u0026fit=max\u0026ixid=eyJhcHBfaWQiOjEyMDd9\u0026s=666eb6ac25c7fec68d5994545b933726",
"small": "https://images.unsplash.com/photo-1516189050082-44d4deb5ceef?ixlib=rb-0.3.5\u0026q=80\u0026fm=jpg\u0026crop=entropy\u0026cs=tinysrgb\u0026w=400\u0026fit=max\u0026ixid=eyJhcHBfaWQiOjEyMDd9\u0026s=3dbc611c97d323ff8b4b043cff19317b",
"thumb": "https://images.unsplash.com/photo-1516189050082-44d4deb5ceef?ixlib=rb-0.3.5\u0026q=80\u0026fm=jpg\u0026crop=entropy\u0026cs=tinysrgb\u0026w=200\u0026fit=max\u0026ixid=eyJhcHBfaWQiOjEyMDd9\u0026s=0c760185c35eadb31a7bba9b9794d424"
},
"links": {
"self": "https://api.unsplash.com/photos/r7bVvV7MLdQ",
"html": "https://unsplash.com/photos/r7bVvV7MLdQ",
"download": "https://unsplash.com/photos/r7bVvV7MLdQ/download",
"download_location": "https://api.unsplash.com/photos/r7bVvV7MLdQ/download"
},
"categories": [],
"sponsored": false,
"likes": 0,
"liked_by_user": false,
"current_user_collections": [],
"slug": "cloud-smoke-steam",
"user": {
"id": "G69mdFHx0X0",
"updated_at": "2018-05-03T14:00:07-04:00",
"username": "maxkuk",
"name": "Max Kukurudziak",
"first_name": "Max",
"last_name": "Kukurudziak",
"twitter_username": null,
"portfolio_url": "http://www.instagram.com/makckuk",
"bio": "Product Designer at MacPaw, Lecturer at Projector",
"location": "Kiev, Ukraine",
"links": {
"self": "https://api.unsplash.com/users/maxkuk",
"html": "https://unsplash.com/#maxkuk",
"photos": "https://api.unsplash.com/users/maxkuk/photos",
"likes": "https://api.unsplash.com/users/maxkuk/likes",
"portfolio": "https://api.unsplash.com/users/maxkuk/portfolio",
"following": "https://api.unsplash.com/users/maxkuk/following",
"followers": "https://api.unsplash.com/users/maxkuk/followers"
},
"profile_image": {
"small": "https://images.unsplash.com/profile-1518780839522-ee199eceaf8c?ixlib=rb-0.3.5\u0026q=80\u0026fm=jpg\u0026crop=faces\u0026cs=tinysrgb\u0026fit=crop\u0026h=32\u0026w=32\u0026s=c37d2f2844b45f52c0f66cd580a200c8",
"medium": "https://images.unsplash.com/profile-1518780839522-ee199eceaf8c?ixlib=rb-0.3.5\u0026q=80\u0026fm=jpg\u0026crop=faces\u0026cs=tinysrgb\u0026fit=crop\u0026h=64\u0026w=64\u0026s=93647049c20b6a323870fe0886eee329",
"large": "https://images.unsplash.com/profile-1518780839522-ee199eceaf8c?ixlib=rb-0.3.5\u0026q=80\u0026fm=jpg\u0026crop=faces\u0026cs=tinysrgb\u0026fit=crop\u0026h=128\u0026w=128\u0026s=aff2d13afe9fe418b562b85c226b7e8e"
},
"instagram_username": "makckuk",
"total_collections": 0,
"total_likes": 6,
"total_photos": 56
},
"tags": [{
"title": "cloud"
},
{
"title": "smoke"
},
{
"title": "steam"
},
{
"title": "mountain"
},
{
"title": "volcano"
},
{
"title": "blue"
},
{
"title": "rock"
},
{
"title": "glacier"
},
{
"title": "field"
},
{
"title": "geysir"
},
{
"title": "iceland"
}],
"photo_tags": [{
"title": "cloud"
},
{
"title": "smoke"
},
{
"title": "steam"
},
{
"title": "mountain"
},
{
"title": "volcano"
},
{
"title": "blue"
},
{
"title": "rock"
},
{
"title": "glacier"
},
{
"title": "field"
},
{
"title": "geysir"
},
{
"title": "iceland"
}]
},
I need to get results.profile_image.small. I have tried many ways, but I never figured out how to access the profile_image fields.
Basically I want to do something like this:
dynamic array = JsonConvert.DeserializeObject(responz);
foreach (var itemx in array["results"])
{
MessageBox.Show(itemx.profile_image.small.ToString());
}
I have spent the last few hours figuring it out, searching Stack Overflow. The last option is to do this with regex, which would be a very stupid thing to do.
Based on your JSON, the actual path should be:
itemx.user.profile_image.small
So if you modify your code to include the missing "user" portion:
dynamic array = JsonConvert.DeserializeObject(responz);
foreach (var itemx in array["results"])
{
MessageBox.Show(itemx.user.profile_image.small.ToString());
}
That should solve your problem.
However, this problem would likely not have presented itself if you were using a concrete class to deserialize into. You would have type safety and the assistance of IntelliSense. You can easily convert your JSON sample into concrete classes using Json2CSharp or the "Paste JSON as Classes" function of modern Visual Studio versions.
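For illustration, a minimal set of concrete classes covering just the path you need might look like the sketch below. The class names are assumptions; only the properties used here are included, and since the properties are named exactly as the JSON keys, Json.NET maps them without any attributes:

using System.Collections.Generic;
using Newtonsoft.Json;

public class SearchResponse
{
    public List<Result> results { get; set; }
}

public class Result
{
    public User user { get; set; }
}

public class User
{
    public ProfileImage profile_image { get; set; }
}

public class ProfileImage
{
    public string small { get; set; }
    public string medium { get; set; }
    public string large { get; set; }
}

// Typed deserialization instead of dynamic, using the question's responz variable:
var response = JsonConvert.DeserializeObject<SearchResponse>(responz);
foreach (var item in response.results)
{
    MessageBox.Show(item.user.profile_image.small);
}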
Problem Definition:
I have an ExpandoObject that I need to convert to a type such as a "Customer" type. The problem is not achieving that; I am wondering whether I should approach it server side or client side using JavaScript. I can achieve what I want both ways, but which is more effective and less time consuming?
Server Side approach:
public IEnumerable<T> Convert<T>(dynamic self)
{
    List<T> model = new List<T>();
    var jsSerializer = new JavaScriptSerializer();
    foreach (var obj in self)
    {
        model.Add(jsSerializer.ConvertToType<T>(obj));
    }
    return model.AsEnumerable();
}
Client Side approach:
var model = [];
data.forEach(function (item) {
    var property = {}; // plain object used as a key/value map (an array also "works" but is not idiomatic)
    item.forEach(function (pair) {
        property[pair.Key] = pair.Value;
    });
    model.push(property);
});
Used ORM: Rob Conery's Massive
Original Response(No Conversion):
[[{ "Key": "ID", "Value": 2 }, { "Key": "customerID", "Value": 1 }, { "Key": "orderID", "Value": 1 }, { "Key": "address", "Value": "25 Ibrahim Nagy Street, NasrCity, Cairo" }, { "Key": "deliveryDateTime", "Value": "/Date(1381528800000)/" }, { "Key": "deliveryPersonnelID", "Value": 1 }], [{ "Key": "ID", "Value": 3 }, { "Key": "customerID", "Value": 2 }, { "Key": "orderID", "Value": 2 }, { "Key": "address", "Value": "14 Ibrahim Nagy Street, NasrCity, Cairo" }, { "Key": "deliveryDateTime", "Value": "/Date(1386972000000)/" }, { "Key": "deliveryPersonnelID", "Value": 2 }], [{ "Key": "ID", "Value": 4 }, { "Key": "customerID", "Value": 1 }, { "Key": "orderID", "Value": 3 }, { "Key": "address", "Value": "30 Abbas Akad Street, NasrCity, Cairo" }, { "Key": "deliveryDateTime", "Value": "/Date(1387922400000)/" }, { "Key": "deliveryPersonnelID", "Value": 2 }], [{ "Key": "ID", "Value": 5 }, { "Key": "customerID", "Value": 3 }, { "Key": "orderID", "Value": 4 }, { "Key": "address", "Value": "25 Hassan Maamoon Street, NasrCity, Cairo" }, { "Key": "deliveryDateTime", "Value": "/Date(1388354400000)/" }, { "Key": "deliveryPersonnelID", "Value": 3 }]]
I decided to do the conversion client side; it just feels right to me. I mean, let the client do some of the work; it's not much work anyway, so why waste time server side?
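For completeness, here is hypothetical usage of the server-side helper above against Massive's dynamic results. The Delivery class is an assumption modeled on the response shown; adjust it to your actual schema:

// Hypothetical POCO matching the key/value rows in the response above.
public class Delivery
{
    public int ID { get; set; }
    public int customerID { get; set; }
    public int orderID { get; set; }
    public string address { get; set; }
    public DateTime deliveryDateTime { get; set; }
    public int deliveryPersonnelID { get; set; }
}

// Massive returns IEnumerable<dynamic> (ExpandoObjects implement
// IDictionary<string, object>, which JavaScriptSerializer.ConvertToType<T>
// can map onto the POCO):
IEnumerable<Delivery> deliveries = Convert<Delivery>(deliveriesTable.All());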
namespace WebApplication1.Site
{
    public partial class WebForm1 : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            var accessToken = ""; // Token
            var client = new FacebookClient(accessToken);
            dynamic myInfo = client.Get("me/friends", new { fields = "name,id,work" });
            foreach (dynamic friend in myInfo)
            {
                foreach (dynamic work in friend.work ?? new[] { new { employer = new { name = string.Empty }, position = new { name = string.Empty } } })
                {
                    Response.Write("Employer: " + work.employer.name);
                }
            }
        }
    }
}
I am getting the following error on line 21. I cannot figure out what is causing it.
'System.Collections.Generic.KeyValuePair' does not contain a definition for 'work'
Sample JSON returned from the Facebook Graph API. This is only the first three friends; there are closer to 4000 friends I am parsing, but this gives some context for the structure of the data:
{
"data": [
{
"name": "Mia xxx",
"id": "11381",
"work": [
{
"employer": {
"id": "100982923276720",
"name": "New-York Historical Society"
},
"location": {
"id": "108424279189115",
"name": "New York, New York"
}
}
]
},
{
"name": "Leilah xxx",
"id": "1133"
},
{
"name": "xxx,
"id": "1231",
"work": [
{
"employer": {
"id": "104362369673437",
"name": "Bye Bye Liver: The Philadelphia Drinking Play"
},
"location": {
"id": "101881036520836",
"name": "Philadelphia, Pennsylvania"
},
"position": {
"id": "121113421241569",
"name": "Actress/Bartender"
},
"description": "A sketch comedy/improv show every Saturday night at Downey's on South & Front. Come thirsty!",
"start_date": "2011-09"
},
{
"employer": {
"id": "100952634348",
"name": "Act II Playhouse"
},
"location": {
"id": "109249869093538",
"name": "Ambler, Pennsylvania"
},
"position": {
"id": "125578900846788",
"name": "My Fair Lady"
},
"description": "11 actor version of the classic musical.",
"start_date": "0000-00"
},
An alternative to relying on dynamic is to capture and parse the JSON with Json.NET; it's designed for querying JSON data and is much safer than using dynamic:
http://json.codeplex.com/
And deserializing into classes:
http://dotnetbyexample.blogspot.ca/2012/02/json-deserialization-with-jsonnet-class.html
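As a rough sketch of that approach against the JSON shown in the question (the json variable name is an assumption for the raw response string):

using Newtonsoft.Json.Linq;

var root = JObject.Parse(json);
// "work" is optional per friend, so fall back to an empty array when it is absent.
foreach (var friend in root["data"])
{
    foreach (var work in friend["work"] ?? new JArray())
    {
        Response.Write("Employer: " + (string)work["employer"]["name"]);
    }
}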
Look at this:
Response.Write("Employer: " + myInfo.work.employer.name);
I suspect you meant:
Response.Write("Employer: " + work.employer.name);
Put it this way - if that's not what you meant, what's the purpose of your work variable?