I am making a request to an API that returns information about a number of objects as a JSON string, similar to this (handwritten, so please ignore any typos):
{
  "count": 2,
  "object": [{
    "name": "object 1",
    "fields": {
      "Attribute1": 2,
      "Attribute2": "string",
      (...)
    },
    "links": [{
      "rel": "SomeString",
      "url": "http://UrlString"
    },
    {
      ...
    }]
  },
  {
    "name": "object 2",
    "fields": {
      "Attribute1": 3,
      "Attribute4": 5,
      "Attribute6": "foo",
      (...)
I turned this into a class by making a sample request and using "Paste JSON as Classes" in Visual Studio, but because different objects have different fields (the API won't return a field when its value is null, and I don't have a complete list of all possible fields), the class is already 135 lines long, and that was with a very basic request.
I am also worried about what might happen when I get a result that has a field that isn't specified in the class. For all I know it might throw an exception, or simply ignore everything that isn't explicitly specified.
Is there a way to work with these objects (I am trying to save them in an Azure SQL Server) without losing any information?
I was thinking about KeyValuePairs, but I don't think that will work (because I don't know whether a value is a string or an integer). It would also make writing to SQL awkward, because every time I "discover" a new field the whole database would have to be re-created; although, since Azure SQL has JSON support, this might be easier than it sounds.
I would be perfectly happy to take the "name" field, make it a column in the database, and then store the "fields" and "links" members directly as a JSON string. However, I need to access some of the fields during computation, so I'd have to convert from response to object and back to string, and if my class doesn't contain every field I am again worried that I'll lose something.
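One way to address the "unknown fields" worry with Json.NET is a [JsonExtensionData] property, which catches every member not declared on the class, so nothing is silently dropped. A minimal sketch (the class and property names here are illustrative, not from the actual API):

```csharp
using System.Collections.Generic;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

// Hypothetical class for one entry in the "object" array.
public class ApiObject
{
    [JsonProperty("name")]
    public string Name { get; set; }

    // Any member not declared above ("fields", "links", future additions)
    // lands in this dictionary instead of being discarded.
    [JsonExtensionData]
    public IDictionary<string, JToken> Extra { get; set; }
}

class Demo
{
    static void Main()
    {
        var json = @"{ ""name"": ""object 1"", ""fields"": { ""Attribute1"": 2 } }";
        var obj = JsonConvert.DeserializeObject<ApiObject>(json);

        // Re-serializing round-trips the unknown members unchanged,
        // so they could be stored as a JSON string column in Azure SQL.
        var roundTripped = JsonConvert.SerializeObject(obj);
        System.Console.WriteLine(roundTripped);
    }
}
```

This keeps the strongly-typed "name" column while preserving everything else for storage.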
Sorry for the long text; I hope I'm not missing something here :)
I am using Azure search service to search for the documents in my Azure CosmosDB Account.
Using the portal, I have created an Azure search service and given my existing CosmosDB as data source.
Following is the sample document stored in CosmosDB
{
  "id": "Engine",
  "Sub-Components Price": [
    //Price list
  ],
  "Sub-Components": [
    "List of sub components here"
  ],
  "Brand": "Brand Name here"
}
When the CosmosDB collection containing the above document is given as a data source to Azure Search, the id field is internally converted to some other string (perhaps by automatic indexing).
I am able to set other fields like Sub-Components and Brand as search parameters (using C#) and search only those specific fields. I want to do the same with the id field, but the id field is encrypted/encoded to some other string, as follows:
{
  "id": "UkVRX1ZFSF9DVVNUX0",
  "Sub-Components Price": [
    //Price list
  ],
  "Sub-Components": [
    "List of sub components here"
  ],
  "Brand": "Brand Name here"
}
How to retrieve my original id and set it as search parameter?
Thanks in advance !!
UkVRX1ZFSF9DVVNUX0 is a base64 encoded string and when you decode it you get REQ_VEH_CUST_.
Why are the values being converted to base64 encoded strings?
Please check the indexer details. Since there are limitations on the value in the key field (https://learn.microsoft.com/en-us/rest/api/searchservice/naming-rules - see "Document Key"), there is probably a setting in the indexer (look under the field mappings section and check whether the base64Encode mapping function is applied to the id field mapping) that is converting and storing the value as a base64 encoded string.
If you're confident that the value of id in source (i.e. key field in index) will not violate the rule for key field value, you can remove this base64encode mapping function, save the indexer, reset the indexer and run it again. This time the data will be saved as it is in the source.
Based on @GauravMantri's comment: since your id is base64 encoded before it is stored, you can either remove the encoding when storing the Id (if the value is a valid unique key by itself), as suggested.
Alternatively, you can encode the value you already have, e.g. System.Convert.ToBase64String(System.Text.Encoding.UTF8.GetBytes("IdBeforeEncodingAsString")), and use that as the search parameter; it should work, since the base64 encoded value of this string is what is stored as the Id.
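A small round-trip sketch of both directions (note that the indexer's base64Encode mapping function may use a URL-safe, unpadded variant, so the exact encoded form can differ slightly from Convert.ToBase64String output):

```csharp
using System;
using System.Text;

class Base64IdDemo
{
    static void Main()
    {
        // Decode the id stored by the indexer. .NET's FromBase64String needs
        // the length padded to a multiple of 4, so pad with '=' as required.
        var stored = "UkVRX1ZFSF9DVVNUX0";
        var padded = stored.PadRight((stored.Length + 3) / 4 * 4, '=');
        var original = Encoding.UTF8.GetString(Convert.FromBase64String(padded));
        Console.WriteLine(original); // REQ_VEH_CUST_

        // Encode an original id before using it as a search parameter.
        var encoded = Convert.ToBase64String(Encoding.UTF8.GetBytes(original));
        Console.WriteLine(encoded);
    }
}
```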
I have a situation where my API can accept objects whose inner data contains either all the information or only an ID. Now, if the client sends object A, for example:
{
  "description": "value",
  "childA": {
    "id": 123,
    "otherA": {
      "id": 321
    }
  },
  "childB": {
    "id": 123,
    "otherBNew": {
      "value": "some new text for new record"
    }
  }
}
I want my API to detect when "id" is set and ignore the other values, and likewise for the inner child: if the child's "id" is set, it should not look at the value or any other fields. The problem with context.Update is that it tries to insert the child data with all the values (nulls and defaults) for fields that weren't supplied. Also, the child has a "type" which is an enum and required, but I want it ignored when id is set.
All child elements are of the same type, and the same goes for inner child elements. The structure is simple, like so:
{ id: 123, value: "text", childTag: {id: 4, value: "child", childTag: null} }
Is there any way to do this? Or do I have to fetch data from DB each time and only update fields that are changed?
If you want to update only the fields that were set, you can certainly achieve that. The easiest approach is to use JsonPatch to apply updates to your entity, but as a simple example you would do the following:
var entity = new YourClass();
entity.id = update.id;
dbContext.Attach(entity);
Now any fields that change are marked and can be saved in isolation.
entity.Description = "xxxx";
The issue with this is that for every field you want to update, you would need to check whether the value actually needs updating, or waste time potentially re-saving the same data. Alternatively, if you are able to use JsonPatch, you apply the patch to that temporary entity and then save your context.
For further reading on JsonPatch:
https://dotnetcoretutorials.com/2017/11/29/json-patch-asp-net-core/
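A minimal sketch of the JsonPatch route, assuming ASP.NET Core with the Microsoft.AspNetCore.JsonPatch package (the Child type and its properties are illustrative, not from the question):

```csharp
using Microsoft.AspNetCore.JsonPatch;

public class Child
{
    public int Id { get; set; }
    public string Value { get; set; }
}

class PatchDemo
{
    static void Main()
    {
        // Build a patch containing only the fields the client actually sent.
        var patch = new JsonPatchDocument<Child>();
        patch.Replace(c => c.Value, "text");

        // Apply it to a stub entity; untouched properties keep their
        // defaults and are never marked as modified when the stub is
        // attached to the DbContext before saving.
        var entity = new Child { Id = 123 };
        patch.ApplyTo(entity);

        System.Console.WriteLine(entity.Value);
    }
}
```

In a real controller the JsonPatchDocument<Child> would arrive as the request body, so only the client-supplied operations ever touch the entity.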
Before the storm (hahaha):
Let's agree on something: DataSet has a kind of magic (at least for me) thanks to its cached-changes philosophy. It's great for optimistic concurrency, since I have both the old (DataRowVersion.Original) and new (DataRowVersion.Current) values. I've seen the way it works with web services (an old architecture): it holds the changes for each row in XML format, the exchange data structure for old and some current apps. And here come my questions:
Is it possible to do the same with JSON, Entity Framework Core and Web API?
If not, should I implement a JsonConverter, some kind of JSON parser behavior for Web API, or a media formatter? (This is the part where I'm a little bit lost.)
What's my purpose:
I have this object coming from the client:
[{ //to modify
  "id": 1,
  "name": "Pedro",
  "original:name": "Peter"
}, { //to delete
  "original:id": "2",
  "original:name": "Amy"
}, { //to add
  "name": "Bob"
}]
My parser would do something like this for each item:
If all properties are "original:" prefixed, delete the matching entity.
If both kinds are present, modify the entity matched by the "original:" prefixed properties.
If there are no "original:" properties, add the entity.
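The per-item decision above could be sketched with Json.NET's JObject like this (all names here are illustrative; this is only the classification step, not the EF Core apply step):

```csharp
using Newtonsoft.Json.Linq;

static class ChangeSetParser
{
    public enum ChangeKind { Add, Modify, Delete }

    // Decide what to do with one item based on its "original:"-prefixed properties.
    public static ChangeKind Classify(JObject item)
    {
        bool hasOriginal = false, hasCurrent = false;
        foreach (var prop in item.Properties())
        {
            if (prop.Name.StartsWith("original:")) hasOriginal = true;
            else hasCurrent = true;
        }

        if (hasOriginal && hasCurrent) return ChangeKind.Modify;
        if (hasOriginal) return ChangeKind.Delete;
        return ChangeKind.Add;
    }
}
```

For Modify and Delete, the "original:" values would then be used both to look up the entity and as the expected values for the optimistic-concurrency check.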
Finally: is it a valid approach to do this kind of job?
I generate objects with EF and LINQ to SQL, and have complex types returned from stored procedures. Now I want to serialize those objects to nested JSON files using the Newtonsoft Json library.
The problem is that all the serialized JSON files are flat (because the results returned from the procedures are all rows). What I want to ask is:
What techniques can I use to get nicely structured (nested) JSON automatically (I need to serialize a lot of procedures)?
Is there a way to configure EF or LINQ to SQL to have functionality like polymorphic associations (like this, but that's old)?
Example:
[{"key":value,"key":value,"key":value...}] --> generated JSON
I want it to look like:
{
  "key": value,
  "key": value,
  ...
  "table1": {          <------ structured like this
    "key": value,
    "key": value
  },
  "table2": {
    "key": value,
    "key": value
  }
}
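One common technique, sketched below, is to project the flat rows into a nested object graph with LINQ before serializing, and let Json.NET emit the structure (the row shape and column names here are assumptions, since the actual procedure results aren't shown):

```csharp
using System.Linq;
using Newtonsoft.Json;

// Hypothetical flat row shape, as returned by a stored procedure.
public class Row
{
    public int Id { get; set; }
    public string Table1Key { get; set; }
    public string Table2Key { get; set; }
}

class NestDemo
{
    static void Main()
    {
        var rows = new[]
        {
            new Row { Id = 1, Table1Key = "a", Table2Key = "x" },
        };

        // Project each flat row into a nested anonymous object;
        // the serializer then produces nested JSON automatically.
        var nested = rows.Select(r => new
        {
            id = r.Id,
            table1 = new { key = r.Table1Key },
            table2 = new { key = r.Table2Key },
        });

        var json = JsonConvert.SerializeObject(nested, Formatting.Indented);
        System.Console.WriteLine(json);
    }
}
```

The projection step has to be written per procedure (or generated), which is the trade-off against a fully automatic mapping.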
I need to pull and display some json data from a google stock feed:
https://finance.google.com/finance/info?client=ig&q=NYSE:BHP
The only catch is that I don't know in advance what the data is. The customer would like to pass in a list of comma separated values to tell my code which items to pull. I therefore plan to pass in an array of named items telling the feed which items I want values for, so that, in theory, I can match the incoming names with the equivalent JSON item names. It won't always be the same items, or the same number of items.
How can I do this dynamically (I am using json.net) ?
Sample Json Data:
[{
"id": "4905",
"t": "BHP",
"e": "NYSE",
"l": "26.90",
"l_fix": "26.90",
"l_cur": "26.90",
"s": "0",
"ltt": "6:01PM EST",
"lt": "Dec 2, 6:01PM EST",
"lt_dts": "2015-12-02T18:01:42Z",
"c": "-0.41",
"c_fix": "-0.41",
"cp": "-1.50",
"cp_fix": "-1.50",
"ccol": "chr",
"pcls_fix": "27.31"
}]
Sample of CSV values the user might pass in:
t, e, l, cp_fix
You can use the Json class from the System.Web.Helpers namespace to deserialize a JSON string into a dynamic object, like below:
dynamic Data = Json.Decode(json);
It is included with the MVC framework as an additional download to the .NET 4 framework. Then you can access the properties you want using Data.PropertyName.
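Since the question mentions Json.NET specifically: JObject/JArray also work without defining a class, and pair naturally with the caller's CSV list. A sketch using field names from the sample feed:

```csharp
using System;
using Newtonsoft.Json.Linq;

class FeedDemo
{
    static void Main()
    {
        // Trimmed-down stand-in for the feed response.
        var json = @"[{ ""t"": ""BHP"", ""e"": ""NYSE"", ""l"": ""26.90"", ""cp_fix"": ""-1.50"" }]";
        var requested = "t, e, l, cp_fix";

        var array = JArray.Parse(json);
        var first = (JObject)array[0];

        // Look up each requested field by name; missing fields come back null,
        // so unknown names in the CSV list are handled gracefully.
        foreach (var name in requested.Split(','))
        {
            var token = first[name.Trim()];
            Console.WriteLine($"{name.Trim()} = {token?.ToString() ?? "(not present)"}");
        }
    }
}
```

This avoids the dynamic keyword entirely, so it works regardless of which fields the customer asks for.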