Unable to update WorkItem Status with WorkItemTrackingHttpClient - c#

I am automating updates to work item hours but changes to Status are ignored. I'd like to set status from "Active" to "Resolved".
I have found information stating that you also need to set a "Reason" when changing the status, but my code isn't changing Reason or Status, even though all the other field updates are working. I suspect the Status field is read-only, but we are unable to find a rule that makes it so (we're using the CMMI template).
Can someone tell me whether the problem is the setup in DevOps, my code, or something else?
//Executing from LINQPad, so no need to point out the blocking .Result calls on the async methods...
WorkItem targetWorkItem = client.GetWorkItemAsync(123456).Result;
JsonPatchDocument patchDocument = new JsonPatchDocument();
patchDocument.Add(
new JsonPatchOperation()
{
Operation = Operation.Replace,
Path = "/fields/Microsoft.VSTS.Scheduling.CompletedWork",
Value = 123
}
);
patchDocument.Add(
new JsonPatchOperation()
{
Operation = Operation.Replace,
Path = "/fields/Microsoft.VSTS.Scheduling.RemainingWork",
Value = 0
}
);
/*
These don't work! I think because "Reason" field is read only
*/
patchDocument.Add(
new JsonPatchOperation()
{
Operation = Operation.Add, //Tried Replace as well as Add
Path = "/Fields/System.Reason",
Value = "Complete and Requires Review/Test"
}
);
patchDocument.Add(
new JsonPatchOperation()
{
Operation = Operation.Add, //Tried Replace as well as Add
Path = "/Fields/System.State",
Value = "Resolved"
}
);
//Succeeds for any field except Status and Reason
WorkItem result = client.UpdateWorkItemAsync(patchDocument, 123456).Result;
Namespaces used:
Microsoft.TeamFoundation.WorkItemTracking.WebApi
Microsoft.TeamFoundation.WorkItemTracking.WebApi.Models
Microsoft.VisualStudio.Services.Common
Microsoft.VisualStudio.Services.WebApi
Microsoft.VisualStudio.Services.WebApi.Patch
Microsoft.VisualStudio.Services.WebApi.Patch.Json

You have a typo: the path should be /fields/System.State, with a lowercase f, not /Fields with a capital F.
And changing the state is enough; the reason will be changed automatically.

Your JSON should end up looking like this:
{
"id": xx,
"rev": yy,
"fields": [{
"field": {
"refName": "System.State"
},
"value": "Resolved"
},
{
"field": {
"refName": "System.Reason"
},
"value": "Status Reason"
},
{
"field": {
"refName": "Microsoft.VSTS.Common.ActivatedBy"
},
"value": null
},
{
"field": {
"refName": "Microsoft.VSTS.Common.ActivatedDate"
},
"value": null
},
{
"field": {
"refName": "Microsoft.VSTS.Common.ResolvedDate"
},
"value": "2014-08-25T19:14:04.594Z"
},
{
"field": {
"refName": "Microsoft.VSTS.Common.ResolvedBy"
},
"value": "User Name"
},
{
"field": {
"refName": "Microsoft.VSTS.Common.ResolvedReason"
},
"value": "Resolved Reason"
},
{
"field": {
"refName": "Microsoft.VSTS.Common.ClosedDate"
},
"value": <null or "2014-08-25T19:14:04.594Z">
},
{
"field": {
"refName": "Microsoft.VSTS.Common.ClosedBy"
},
"value": <null, "John Doe">
}]
}
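Applied to the JsonPatchDocument code in the question, a corrected version of the failing operation might look like this (a minimal sketch; only the state change is shown, the work item ID is the question's placeholder, and the other field updates stay as they are):

// Lowercase "/fields/..." in the path; setting System.State alone is enough,
// since the reason is updated automatically on the transition.
JsonPatchDocument patchDocument = new JsonPatchDocument();
patchDocument.Add(
    new JsonPatchOperation()
    {
        Operation = Operation.Add,
        Path = "/fields/System.State",
        Value = "Resolved"
    }
);
WorkItem result = client.UpdateWorkItemAsync(patchDocument, 123456).Result;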

Related

Recursively create JSON in C# without losing the JProperties from the first iteration

Here, I am trying to create JSON using the Newtonsoft.Json library. I came up with a function that partially does the job, but when the same function is called recursively, my previous data on the JObject is lost. I am new to JProperty, JObject, JArray, and JToken, but managed to come up with this code.
The parent is initialized inside the function itself, so when the function is called recursively it gets initialized and wiped again. So I added another JObject, head, which only made it more complicated.
string statment = "(Entity.Country = 'USA' AND Entity.ShortName = 'Adele' AND (Entity.CIFNumber = '12345' OR Statement.StatementYear = '2015'))";
public static JObject ConvertToJsonObject(string text)
{
JObject parent = new JObject();
string bracketContents = getWhatsInsideBrackets(text);
parent.Add(new JProperty("operator", ReturnOperator(text)));
parent.Add(new JProperty("rules"));
string[] operators = splitWithOperator(bracketContents);
List<JObject> req = new List<JObject>();
for (int i = 0; i < splitWithOperator(bracketContents).Length; i++)
{
if (!checkIfBracketsExists(operators[i].Trim()))
{
req.Add(GetEachCondition(operators, i));
}
else if (checkIfBracketsExists(operators[i]))
{
parent["rules"] = new JArray(ConvertToJsonObject(operators[i]));
head = parent;
parent["rules"] = (ConvertToJsonObject(operators[i]));
}
}
parent["rules"] = new JArray(req);
head["rules"] = parent;
return parent;
}
I am trying to achieve this JSON output (I am only trying to achieve the key/value of DataSetCommonQuery; don't worry about the other keys in the JSON):
{
"context": {
"wfId": "00000000-0000-0000-0000-000000000000",
"taskId": "00000000-0000-0000-0000-000000000000"
},
"payLoad": {
"DataSetCommonQuery": {
"operator": "AND",
"rules": [
{
"field": "ENTITY.CIFNumber",
"condition": "<>",
"value": "3123"
},
{
"field": "ENTITY.Country",
"condition": "LIKE",
"value": "USA"
},
{
"operator": "OR",
"rules": [
{
"field": "ENTITY.FYEMonth",
"condition": "=",
"value": "May"
},
{
"field": "STATEMENT.ProfitBeforeTax",
"condition": ">=",
"value": 123123
},
{
"field": "STATEMENT.NetSales",
"condition": "<=",
"value": 234234
},
{
"field": "STATEMENT.statementdatekey_",
"condition": "=",
"value": "2019-07-01 12:00:00"
}
]
}
]
},
"PeerType": "DEFAULT",
"Name": "API TEST",
"Description": "API TEST",
"BmkPeerFormatId": "DBBmk",
"OperationType": "Create"
}
}
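One common way to avoid losing earlier data in a recursive build like this (a hedged sketch that reuses the helper methods from the question and assumes they work as described) is to give each call its own JObject and append nested results to that call's rules array, rather than reassigning a shared head:

public static JObject ConvertToJsonObject(string text)
{
    // Each call owns its own group object, so recursion cannot wipe earlier data.
    JObject group = new JObject();
    string bracketContents = getWhatsInsideBrackets(text);
    group.Add(new JProperty("operator", ReturnOperator(text)));

    JArray rules = new JArray();
    string[] parts = splitWithOperator(bracketContents);
    for (int i = 0; i < parts.Length; i++)
    {
        if (checkIfBracketsExists(parts[i].Trim()))
        {
            // Nested group: recurse and append the returned object to this group's rules.
            rules.Add(ConvertToJsonObject(parts[i]));
        }
        else
        {
            // Leaf condition: append the single condition object.
            rules.Add(GetEachCondition(parts, i));
        }
    }

    group["rules"] = rules;
    return group;
}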

400 BAD Request attempting to contact Azure Anomaly Detection API

I have been following the tutorial listed here - https://learn.microsoft.com/en-us/azure/cognitive-services/anomaly-detector/quickstarts/detect-data-anomalies-csharp
I have set up my Anomaly Detector API, so I have the endpoints and API keys ready to go. I have set up my own time series data, which I will post here for clarity. (Note: while the data below does NOT work, the sample data here DOES work using the same code - https://github.com/Azure-Samples/anomalydetector/blob/master/example-data/request-data.json):
{
"granularity": "daily",
"series": [
{
"timestamp": "2019-03-27T00:00:00",
"value": 10781.70
},
{
"timestamp": "2019-03-25T00:00:00",
"value": 4058.13
},
{
"timestamp": "2019-03-20T00:00:00",
"value": 8124132.33
},
{
"timestamp": "2019-03-19T00:00:00",
"value": 1571398.97
},
{
"timestamp": "2019-03-18T00:00:00",
"value": 2097703.40
},
{
"timestamp": "2019-03-15T00:00:00",
"value": 10624.76
},
{
"timestamp": "2019-03-14T00:00:00",
"value": 11647.00
},
{
"timestamp": "2019-03-13T00:00:00",
"value": 45937.16
},
{
"timestamp": "2019-03-08T00:00:00",
"value": 4237.20
},
{
"timestamp": "2019-03-07T00:00:00",
"value": 3315.40
},
{
"timestamp": "2019-03-04T00:00:00",
"value": 3218.77
},
{
"timestamp": "2019-02-28T00:00:00",
"value": 11271.00
},
{
"timestamp": "2019-02-27T00:00:00",
"value": 48605.08
},
{
"timestamp": "2019-02-26T00:00:00",
"value": 6181.12
},
{
"timestamp": "2019-02-25T00:00:00",
"value": 45069.00
},
{
"timestamp": "2019-02-22T00:00:00",
"value": 108860.84
},
{
"timestamp": "2019-02-21T00:00:00",
"value": 24924.50
},
{
"timestamp": "2019-02-20T00:00:00",
"value": 4068.50
},
{
"timestamp": "2019-02-19T00:00:00",
"value": 4329.60
},
{
"timestamp": "2019-02-18T00:00:00",
"value": 7615.20
},
{
"timestamp": "2019-02-14T00:00:00",
"value": 56974.10
},
{
"timestamp": "2019-02-13T00:00:00",
"value": 73393.52
},
{
"timestamp": "2019-02-12T00:00:00",
"value": 29991.99
},
{
"timestamp": "2019-02-11T00:00:00",
"value": 2906769.50
},
{
"timestamp": "2019-02-05T00:00:00",
"value": 1956853.85
},
{
"timestamp": "2019-02-04T00:00:00",
"value": 46863.31
},
{
"timestamp": "2019-01-31T00:00:00",
"value": 31602.31
},
{
"timestamp": "2019-01-30T00:00:00",
"value": 13149.59
},
{
"timestamp": "2018-10-10T00:00:00",
"value": 19380.60
},
{
"timestamp": "2018-08-21T00:00:00",
"value": 61801.45
},
{
"timestamp": "2018-08-16T00:00:00",
"value": 843.80
},
{
"timestamp": "2018-08-15T00:00:00",
"value": 52326.20
},
{
"timestamp": "2018-08-14T00:00:00",
"value": 136384.88
},
{
"timestamp": "2018-08-09T00:00:00",
"value": 7224.30
},
{
"timestamp": "2018-07-26T00:00:00",
"value": 16493.08
},
{
"timestamp": "2018-07-24T00:00:00",
"value": 1665163.72
},
{
"timestamp": "2018-07-23T00:00:00",
"value": 38642.88
},
{
"timestamp": "2018-07-13T00:00:00",
"value": 49913.00
},
{
"tim estamp": "2018-07-12T00:00:00",
"value": 49193.00
},
{
"timestamp": "2018-07-11T00:00:00",
"value": 37205.30
},
{
"timestamp": "2018-07-10T00:00:00",
"value": 44527.30
},
{
"timestamp": "2018-07-09T00:00:00",
"value": 148737.01
},
{
"timestamp": "2018-07-06T00:00:00",
"value": 138887.90
},
{
"timestamp": "2018-07-05T00:00:00",
"value": 74346.00
},
{
"timestamp": "2018-07-04T00:00:00",
"value": 71181.50
},
{
"timestamp": "2018-07-03T00:00:00",
"value": 215164.43
},
{
"timestamp": "2018-07-02T00:00:00",
"value": 83817.50
}
]
}
When I run my test code, I get back a 400 BAD REQUEST (with no additional information to suggest why). Here is my console app code (with the obvious parts redacted):
using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
namespace anomalyDetectionMachineLearning
{
class Program
{
// Replace the subscriptionKey string value with your valid subscription key.
const string subscriptionKey = "";
// Replace the endpoint URL with the correct one for your subscription.
// Your endpoint can be found in the Azure portal. For example: https://westus2.api.cognitive.microsoft.com
const string endpoint = "https://westus2.api.cognitive.microsoft.com/";
// Replace the dataPath string with a path to the JSON formatted time series data.
const string dataPath = @"C:\Temp\data.txt";
const string latestPointDetectionUrl = "/anomalydetector/v1.0/timeseries/last/detect";
const string batchDetectionUrl = "/anomalydetector/v1.0/timeseries/entire/detect";
static void Main(string[] args)
{
try
{
var requestData = File.ReadAllText(dataPath);
//Console.Write(requestData.ToString());
detectAnomaliesBatch(requestData);
//detectAnomaliesLatest(requestData);
System.Console.ReadKey();
}
catch (Exception ex)
{
Console.WriteLine(ex.ToString());
Console.ReadKey();
}
}
public static void detectAnomaliesBatch(string requestData)
{
System.Console.WriteLine("Detecting anomalies as a batch");
var result = Request(
endpoint,
batchDetectionUrl,
subscriptionKey,
requestData).Result;
dynamic jsonObj = Newtonsoft.Json.JsonConvert.DeserializeObject(result);
System.Console.WriteLine(jsonObj);
bool[] anomalies = jsonObj["isAnomaly"].ToObject<bool[]>();
System.Console.WriteLine("\n Anomalies detected in the following data positions:");
for (var i = 0; i < anomalies.Length; i++)
{
if (anomalies[i])
{
System.Console.Write(i + ", ");
}
}
}
static async Task<string> Request(string baseAddress, string endpoint, string subscriptionKey, string requestData)
{
using (HttpClient client = new HttpClient { BaseAddress = new Uri(baseAddress) })
{
System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls;
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", subscriptionKey);
var content = new StringContent(requestData, Encoding.UTF8, "application/json");
var res = await client.PostAsync(endpoint, content);
if (res.IsSuccessStatusCode)
{
return await res.Content.ReadAsStringAsync();
}
else
{
Console.WriteLine($"{res.Content.ToString()}");
Console.ReadKey();
return $"ErrorCode: {res.StatusCode}";
}
}
}
}
}
Here is the working sample data that has no issues (pasted here at the request of someone in the comments):
{
"granularity": "daily",
"series": [
{
"timestamp": "2018-03-01T00:00:00Z",
"value": 32858923
},
{
"timestamp": "2018-03-02T00:00:00Z",
"value": 29615278
},
{
"timestamp": "2018-03-03T00:00:00Z",
"value": 22839355
},
{
"timestamp": "2018-03-04T00:00:00Z",
"value": 25948736
},
{
"timestamp": "2018-03-05T00:00:00Z",
"value": 34139159
},
{
"timestamp": "2018-03-06T00:00:00Z",
"value": 33843985
},
{
"timestamp": "2018-03-07T00:00:00Z",
"value": 33637661
},
{
"timestamp": "2018-03-08T00:00:00Z",
"value": 32627350
},
{
"timestamp": "2018-03-09T00:00:00Z",
"value": 29881076
},
{
"timestamp": "2018-03-10T00:00:00Z",
"value": 22681575
},
{
"timestamp": "2018-03-11T00:00:00Z",
"value": 24629393
},
{
"timestamp": "2018-03-12T00:00:00Z",
"value": 34010679
},
{
"timestamp": "2018-03-13T00:00:00Z",
"value": 33893888
},
{
"timestamp": "2018-03-14T00:00:00Z",
"value": 33760076
},
{
"timestamp": "2018-03-15T00:00:00Z",
"value": 33093515
},
{
"timestamp": "2018-03-16T00:00:00Z",
"value": 29945555
},
{
"timestamp": "2018-03-17T00:00:00Z",
"value": 22676212
},
{
"timestamp": "2018-03-18T00:00:00Z",
"value": 25262514
},
{
"timestamp": "2018-03-19T00:00:00Z",
"value": 33631649
},
{
"timestamp": "2018-03-20T00:00:00Z",
"value": 34468310
},
{
"timestamp": "2018-03-21T00:00:00Z",
"value": 34212281
},
{
"timestamp": "2018-03-22T00:00:00Z",
"value": 38144434
},
{
"timestamp": "2018-03-23T00:00:00Z",
"value": 34662949
},
{
"timestamp": "2018-03-24T00:00:00Z",
"value": 24623684
},
{
"timestamp": "2018-03-25T00:00:00Z",
"value": 26530491
},
{
"timestamp": "2018-03-26T00:00:00Z",
"value": 35445003
},
{
"timestamp": "2018-03-27T00:00:00Z",
"value": 34250789
},
{
"timestamp": "2018-03-28T00:00:00Z",
"value": 33423012
},
{
"timestamp": "2018-03-29T00:00:00Z",
"value": 30744783
},
{
"timestamp": "2018-03-30T00:00:00Z",
"value": 25825128
},
{
"timestamp": "2018-03-31T00:00:00Z",
"value": 21244209
},
{
"timestamp": "2018-04-01T00:00:00Z",
"value": 22576956
},
{
"timestamp": "2018-04-02T00:00:00Z",
"value": 31957221
},
{
"timestamp": "2018-04-03T00:00:00Z",
"value": 33841228
},
{
"timestamp": "2018-04-04T00:00:00Z",
"value": 33554483
},
{
"timestamp": "2018-04-05T00:00:00Z",
"value": 32383350
},
{
"timestamp": "2018-04-06T00:00:00Z",
"value": 29494850
},
{
"timestamp": "2018-04-07T00:00:00Z",
"value": 22815534
},
{
"timestamp": "2018-04-08T00:00:00Z",
"value": 25557267
},
{
"timestamp": "2018-04-09T00:00:00Z",
"value": 34858252
},
{
"timestamp": "2018-04-10T00:00:00Z",
"value": 34750597
},
{
"timestamp": "2018-04-11T00:00:00Z",
"value": 34717956
},
{
"timestamp": "2018-04-12T00:00:00Z",
"value": 34132534
},
{
"timestamp": "2018-04-13T00:00:00Z",
"value": 30762236
},
{
"timestamp": "2018-04-14T00:00:00Z",
"value": 22504059
},
{
"timestamp": "2018-04-15T00:00:00Z",
"value": 26149060
},
{
"timestamp": "2018-04-16T00:00:00Z",
"value": 35250105
}
]
}
According to the possible 400 errors in this doc - https://westus2.dev.cognitive.microsoft.com/docs/services/AnomalyDetector/operations/post-timeseries-entire-detect - it seems there are two issues in your data:
The timestamps are not in ascending order.
There seem to be more than 10% of the points missing in the given time range.
I investigated the sample; the reason you could not see the error message is that the code here should be:
if (res.IsSuccessStatusCode)
{
return await res.Content.ReadAsStringAsync();
}
else
{
Console.WriteLine(res.StatusCode);
return await res.Content.ReadAsStringAsync();
}
The reason the first request failed is this entry:
{
"tim estamp": "2018-07-12T00:00:00",
"value": 49193.00
}
But it should be
{
"timestamp": "2018-07-12T00:00:00",
"value": 49193.00
}
However, the request still could not be submitted successfully, because the Anomaly Detector service requires the input series to be sorted by timestamp in ascending order, and the missing-value rate of the input time series cannot exceed 10%. You can find more useful information about the service in the API reference and the best-practices section.
Thanks for your feedback. We have also released a .NET SDK, so you may consider using the SDK to handle the client work. We will also update the quickstart to cover the error messages for failed requests.
There are some rules you should strictly follow:
The file should contain two and only two columns, timestamp and value, all in lowercase.
The timestamp should be in ISO 8601 format.
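As a rough illustration of the ordering requirement above (a sketch, not from the official quickstart; it assumes the payload has the granularity/series shape shown earlier and that System.Linq and Newtonsoft.Json.Linq are available), the request can be re-sorted by timestamp before it is posted:

// Load the request, sort the series ascending by timestamp, and post the sorted payload.
JObject request = JObject.Parse(File.ReadAllText(dataPath));
JArray series = (JArray)request["series"];
request["series"] = new JArray(series.OrderBy(p => (DateTime)p["timestamp"]));
string sortedRequestData = request.ToString();
// Sorting does not fix missing points: gaps beyond 10% of the time range still need
// to be filled (or the granularity adjusted) before the service accepts the data.
detectAnomaliesBatch(sortedRequestData);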

How to Regex Match by keyword in MongoDB and sort the results by most relevant results using C#?

I'm running a regex search for a {keyword} against the [Name] field of one of my MongoDB collections, [Brands], using C#, with the results sorted by textMatchScore in descending order. The results look right, but the format looks pretty weird.
The expected results would be like this.
[
{
"id": "5c8bcbc36ad6840725182158",
"name": "Window XP",
"dateAdded": "2019-03-15T15:58:58.925Z"
},
{
"id": "5c8bcbc96ad6840725182159",
"name": "Wind",
"dateAdded": "2019-03-15T15:59:05.429Z"
},
{
"id": "5c8bcbd16ad684072518215a",
"name": "Wired",
"dateAdded": "2019-03-15T15:59:13.292Z"
}
]
This is the actual result though.
[
[
{
"name": "_id",
"value": "5c8bcbc36ad6840725182158"
},
{
"name": "Name",
"value": "Window XP"
},
{
"name": "DateAdded",
"value": "2019-03-15T15:58:58.925Z"
},
{
"name": "textMatchScore",
"value": 0
}
],
[
{
"name": "_id",
"value": "5c8bcbc96ad6840725182159"
},
{
"name": "Name",
"value": "Wind"
},
{
"name": "DateAdded",
"value": "2019-03-15T15:59:05.429Z"
},
{
"name": "textMatchScore",
"value": 0
}
]
]
This is the C# code:
[HttpGet]
public async Task<IActionResult> QueryAsync([FromQuery]string q, [FromQuery]int limit = 10, [FromQuery]int skip = 0)
{
var mongoUrl = new MongoUrl(_mongoConfig.ConnectionString);
var client = new MongoClient(mongoUrl);
var database = client.GetDatabase(mongoUrl.DatabaseName);
var collection = database.GetCollection<Brand>("Brands");
var regex = new BsonRegularExpression($".*{q}.*","i");
var filter = Builders<Brand>.Filter.Regex("Name", regex);
var projection = Builders<Brand>.Projection.MetaTextScore("textMatchScore");
var sort = Builders<Brand>.Sort.MetaTextScore("textMatchScore");
var result = await collection.Find(filter).Project(projection).Sort(sort).Skip(skip).Limit(limit).ToListAsync();
return Ok(result);
}
Has the query been implemented correctly? And what can I do to get the results into a tidy JSON format?
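One likely cause of the key/value shape is that Project(projection) returns raw BsonDocument results, and the framework serializer renders each BsonDocument as a list of name/value pairs. A hedged sketch of projecting into a typed result instead (the BrandResult class and its attributes are illustrative, not from the question, and require the MongoDB.Bson and MongoDB.Bson.Serialization.Attributes namespaces):

// Illustrative result type; field names follow the documents shown above.
public class BrandResult
{
    [BsonId]
    [BsonRepresentation(BsonType.ObjectId)]
    public string Id { get; set; }

    public string Name { get; set; }

    public DateTime DateAdded { get; set; }

    [BsonElement("textMatchScore")]
    public double TextMatchScore { get; set; }
}

// In the action, project straight into BrandResult so Ok(result) serializes cleanly:
var result = await collection.Find(filter)
    .Project<BrandResult>(projection)
    .Sort(sort)
    .Skip(skip)
    .Limit(limit)
    .ToListAsync();
return Ok(result);

Note that textMatchScore is normally populated by a $text search against a text index; with a plain regex filter there is no real score to sort on (it shows as 0 above), so relevance-style ordering would need a text index and a $text filter instead.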

Query the DocDB based on document children value in c#

I have documents like the ones below in my Azure Cosmos DB collection.
{
"TemplateID": "73",
"TemplateName": "Test -template",
"Read": [{
"devicename": "",
"timestamp": "2017-09-19T21:05:12.8550708+05:30",
"value": "038452735329RIV5"
},
{
"devicename": "",
"timestamp": "2017-09-19T21:05:12.8550708+05:30",
"value": "038452735330RIV5"
}
],
"eventTime": "2017-09-19T21:05:18.7954106+05:30"
}
{
"TemplateID": "73",
"TemplateName": "Test -template",
"Read": [{
"devicename": "",
"timestamp": "2017-09-19T21:05:12.8550708+05:30",
"value": "019452755319RIV5"
},
{
"devicename": "",
"timestamp": "2017-09-19T21:05:12.8550708+05:30",
"value": "138452715310RIV5"
}
],
"eventTime": "2017-09-19T21:05:18.7954106+05:30"
}
I need to get the document in the following structure by querying the value node of the Read array (for example, if I enter the code 038452735329RIV5, I need the following output):
{
"TemplateID": "73",
"TemplateName": "Test -template",
"eventTime": "2017-09-19T21:05:18.7954106+05:30",
}
I tried the following query:
var docs = _documentClient.CreateDocumentQuery<Document>(DocumentPath, queryOptions).Where(x => x.Read.value == code) .AsDocumentQuery();
I am not getting the required document; I am getting null as the doc, even though the document exists in Azure.
Can anyone please check and help me with how to query this?
Thanks
Based on your description, I suggest you try the ARRAY_CONTAINS built-in function in DocumentDB.
It can check the values in an array.
For more details, refer to the query below:
SELECT c.TemplateID, c.TemplateName, c.eventTime FROM c WHERE ARRAY_CONTAINS(c.Read, { 'value': '038452735329RIV5' }, true)
C# code:
string EndpointUrl = "xxxxxx";
string PrimaryKey = "xxxxxxxx";
DocumentClient client = new DocumentClient(new Uri(EndpointUrl), PrimaryKey);
FeedOptions queryOptions = new FeedOptions { MaxItemCount = -1 };
var familyQuery = client.CreateDocumentQuery<test>(
UriFactory.CreateDocumentCollectionUri("testdb", "coll1"), "SELECT c.TemplateID,c.TemplateName,c.eventTime FROM c WHERE ARRAY_CONTAINS(c.Read, { 'value': '038452735329RIV5' }, true)", queryOptions).ToList();
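If the search value comes from user input, the same query can also be written as a parameterized SqlQuerySpec instead of concatenating the value into the SQL string (a sketch, reusing the database and collection names above):

// Parameterized version of the same ARRAY_CONTAINS query.
var querySpec = new SqlQuerySpec(
    "SELECT c.TemplateID, c.TemplateName, c.eventTime FROM c WHERE ARRAY_CONTAINS(c.Read, { 'value': @value }, true)",
    new SqlParameterCollection { new SqlParameter("@value", "038452735329RIV5") });
var docs = client.CreateDocumentQuery<dynamic>(
    UriFactory.CreateDocumentCollectionUri("testdb", "coll1"), querySpec, queryOptions).ToList();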

'System.Collections.Generic.KeyValuePair<string,object>' does not contain a definition for 'work'

namespace WebApplication1.Site
{
public partial class WebForm1 : System.Web.UI.Page
{
protected void Page_Load(object sender, EventArgs e)
{
var accessToken = ""; // Token
var client = new FacebookClient(accessToken);
dynamic myInfo = client.Get("me/friends", new { fields = "name,id,work" });
foreach (dynamic friend in myInfo)
{
foreach (dynamic work in friend.work ?? new[] { new { employer = new { name = string.Empty }, position = new { name = string.Empty } } })
{
Response.Write("Employer: " + work.employer.name);
}
}
}
}
}
I am getting the following error on line 21. I cannot figure out what is causing it.
'System.Collections.Generic.KeyValuePair' does not contain a definition for 'work'
Here is a sample JSON return from the Facebook Graph API. This is only the first three friends; there are closer to 4,000 friends I am parsing, but this gives some context for the structure of the data:
{
"data": [
{
"name": "Mia xxx",
"id": "11381",
"work": [
{
"employer": {
"id": "100982923276720",
"name": "New-York Historical Society"
},
"location": {
"id": "108424279189115",
"name": "New York, New York"
}
}
]
},
{
"name": "Leilah xxx",
"id": "1133"
},
{
"name": "xxx,
"id": "1231",
"work": [
{
"employer": {
"id": "104362369673437",
"name": "Bye Bye Liver: The Philadelphia Drinking Play"
},
"location": {
"id": "101881036520836",
"name": "Philadelphia, Pennsylvania"
},
"position": {
"id": "121113421241569",
"name": "Actress/Bartender"
},
"description": "A sketch comedy/improv show every Saturday night at Downey's on South & Front. Come thirsty!",
"start_date": "2011-09"
},
{
"employer": {
"id": "100952634348",
"name": "Act II Playhouse"
},
"location": {
"id": "109249869093538",
"name": "Ambler, Pennsylvania"
},
"position": {
"id": "125578900846788",
"name": "My Fair Lady"
},
"description": "11 actor version of the classic musical.",
"start_date": "0000-00"
},
An alternative to relying on dynamic is to capture and parse the JSON with Json.NET; it's designed for querying JSON data and is really much safer than using dynamic:
http://json.codeplex.com/
And deserializing into classes:
http://dotnetbyexample.blogspot.ca/2012/02/json-deserialization-with-jsonnet-class.html
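A minimal sketch of that approach (the class and property names are assumptions based on the sample JSON above, not an official SDK type; it also needs System.Collections.Generic):

// Illustrative classes matching the "data"/"work"/"employer" shape of the sample JSON.
public class FriendsResponse
{
    public List<Friend> data { get; set; }
}

public class Friend
{
    public string name { get; set; }
    public string id { get; set; }
    public List<Work> work { get; set; }
}

public class Work
{
    public NamedItem employer { get; set; }
    public NamedItem position { get; set; }
    public NamedItem location { get; set; }
}

public class NamedItem
{
    public string id { get; set; }
    public string name { get; set; }
}

// json is the raw JSON string returned by the Graph API call.
var friends = Newtonsoft.Json.JsonConvert.DeserializeObject<FriendsResponse>(json);
foreach (var friend in friends.data)
{
    foreach (var work in friend.work ?? new List<Work>())
    {
        Response.Write("Employer: " + work.employer.name);
    }
}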
Look at this:
Response.Write("Employer: " + myInfo.work.employer.name);
I suspect you meant:
Response.Write("Employer: " + work.employer.name);
Put it this way - if that's not what you meant, what's the purpose of your work variable?
