I have the JSON file below under ServiceMap. The key values are not fixed:
{
"ServiceMap": {
"rackAC1": {
"Env": "Public",
"Center": "north"
},
"rackD1": {
"Env": "Public",
"Center": "south"
},
"rackD2": {
"Env": "Public",
"Center": "North-south"
},
"rackD3": {
"Env": "Public",
"Center": "south"
},
...,
"rackD1000": {
"Env": "Public",
"Center": "south"
},
"rackBO": {
"Env": "Public",
"Center": "East"
},
"rackB1": {
"Env": "Public",
"Center": "West"
}
}
}
I want to extract the keys from rackD1 to rackD1000 whose Center is "south", using LINQ.
Is it possible to extract the key values with something like JsonContent["serviceMap"][rackD1*]?
You don't need any classes to get the list of keys:
// uses Json.NET (using Newtonsoft.Json.Linq;)
List<string> keys = ((JObject)JObject.Parse(json)["ServiceMap"]).Properties()
    .Where(p => p.Name.Contains("rackD")
        && ((string)p.Value["Center"]).ToLower() == "south")
    .Select(p => p.Name)
    .ToList();
Just deserialize it into a proper class structure and use LINQ on the results.
public class Root
{
// use dictionary to represent dynamic property names
public Dictionary<string, ServiceMap> ServiceMap { get; set; }
}
public class ServiceMap
{
public string Env { get; set; } // can be excluded if not needed
public string Center { get; set; }
}
And filter results:
var root = ... ; // deserialized root
var result = root.ServiceMap
.Where(kvp => kvp.Value.Center == "south") // or kvp.Value.Center.Equals("south", StringComparison.InvariantCultureIgnoreCase) for case-insensitive
.Select(kvp => kvp.Key) // keys like "rackD1", ...
.ToList();
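For reference, a minimal sketch of the deserialization step, assuming Newtonsoft.Json (System.Text.Json's JsonSerializer.Deserialize<Root>(json) would work the same way):
// using Newtonsoft.Json;
// 'json' is the raw JSON string from the question
Root root = JsonConvert.DeserializeObject<Root>(json);
// root.ServiceMap is now a Dictionary<string, ServiceMap> keyed by rack name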
I have the following JSON that gets put into a list of objects. Each object has one property and a list of codes. I’m struggling to figure out a way to query the list to find a specific code value for a specific company. I’m ultimately looking for the “AbsenceType” value based on the Company/PayCode.
This is what I've tried, but it's not working. I'm looking for any suggestions.
companyAbsenceType = AbsenceCodesList.First(c =>
(c.Company == companyCode) && (c.Codes.Find(x => x.PayCode == ppc.PayCode));
companyAbsenceType = AbsenceCodesList.Select(c =>
c.Company == companyCode && c.Codes.Find(x => x.PayCode == ppc.PayCode)).FirstOrDefault();
JSON:
[
{
"Company": "Company1",
"Codes": [
{
"PayCode": "SCK",
"AbsenceType": "Illness"
},
{
"PayCode": "VAC",
"AbsenceType": "Vacation"
},
{
"PayCode": "BRV",
"AbsenceType": "Bereavement"
},
{
"PayCode": "JUR",
"AbsenceType": "Jury Duty"
},
{
"PayCode": "PER",
"AbsenceType": "Personal"
}
]
},
{
"Company": " Company2",
"Codes": [
{
"PayCode": "SCK",
"AbsenceType": "Sick"
},
{
"PayCode": "VAC",
"AbsenceType": "Vacation"
},
{
"PayCode": "BRV",
"AbsenceType": "Bereavement"
},
{
"PayCode": "JUR",
"AbsenceType": "Jury Duty"
},
{
"PayCode": "PER",
"AbsenceType": "Personal"
},
{
"PayCode": "PRNU",
"AbsenceType": "Personal"
}
]
}
]
public class AbsenceCodes
{
public string Company { get; set; }
public List<AbsenceCode> Codes { get; set; }
}
public class AbsenceCode
{
public string PayCode { get; set; }
public string AbsenceType { get; set; }
}
UPDATE
Thanks to Moho and Eric Magers for pointing me to a query. The query from Moho worked.
var absenceType = AbsenceCodesList
    .FirstOrDefault(c => c.Company == companyCode && c.Codes.Any(x => x.PayCode == ppc.PayCode))
    ?.Codes.First(c => c.PayCode == ppc.PayCode)
    ?.AbsenceType;
You were close; use .Any instead of .Find for the Codes when filtering. List<T>.Find returns the matching element (or null), not a bool, so it can't be used directly inside the predicate:
var absenceType = AbsenceCodesList
// first find a valid top level item
.FirstOrDefault(c =>
// is specific company
c.Company == companyCode
// has the target paycode
&& c.Codes.Any(x => x.PayCode == ppc.PayCode))
// then select the desired value
?.Codes.First(c => c.PayCode == ppc.PayCode)
.AbsenceType;
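If you prefer to flatten first, an equivalent query can be written with SelectMany (a sketch using the same AbsenceCodesList, companyCode, and ppc variables; not part of the original answer):
var absenceType = AbsenceCodesList
    .Where(c => c.Company == companyCode)          // keep only the target company
    .SelectMany(c => c.Codes)                      // flatten its codes
    .FirstOrDefault(x => x.PayCode == ppc.PayCode) // find the matching paycode, or null
    ?.AbsenceType;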
I've got incoming JSON arrays that I need to deserialize into a class. The class is basically a two-property class, but the incoming JSON arrays may have different key names; they will always come as a key-name/value pair, though.
Is there a way to do this without having to create a different class for each variation?
My class:
public class MyGenericClass
{
public string ColumnName { get; set; }
public string Value { get; set; }
}
Two possible JSON strings that I want to deserialize into the above class:
[
{
"Test": "BTO"
},
{
"Test": "Test I"
},
{
"Test": "Test II"
},
{
"Test": "Regi"
},
{
"Test": "ISV"
},
{
"Test": "Bill"
}
]
Example 2:
[
{
"State": "Fried"
},
{
"State": "Shredded"
},
{
"State": "Licked"
},
{
"State": "Rated"
},
{
"State": "Smooth"
},
{
"State": "WY"
}
]
As suggested in the comments, the solution is to deserialize to Dictionary<string, string>[] and then transform that into MyGenericClass instances.
// 'data' is a string variable with the raw JSON (uses System.Text.Json's JsonSerializer)
Dictionary<string, string>[] dicts = JsonSerializer.Deserialize<Dictionary<string, string>[]>(data);
IEnumerable<MyGenericClass> result = dicts
    .Select(dict => dict.Single()) // take the single key-value pair of each dictionary
    .Select(kvp => new MyGenericClass { ColumnName = kvp.Key, Value = kvp.Value }); // build a 'MyGenericClass' from each pair
// print out the results
foreach (MyGenericClass obj in result) {
Console.WriteLine($"ColumnName = {obj.ColumnName}, Value = {obj.Value}");
}
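If you are on Newtonsoft.Json rather than System.Text.Json (an assumption; the answer above uses System.Text.Json), only the deserialization call changes:
// Newtonsoft.Json equivalent of the call above
Dictionary<string, string>[] dicts =
    JsonConvert.DeserializeObject<Dictionary<string, string>[]>(data);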
I'm new to Elasticsearch and ran into a problem while mapping my documents into an ES index.
My document structure is:
public class DocumentItem
{
public string Id { get; set; }
public DocumentType DocType { get; set; }
public Dictionary<string, string> Props { get; set; } = new Dictionary<string, string>();
}
And here's my mapping:
var indexResponseFiles = dClient.CreateIndex(sedIndex, c => c
.InitializeUsing(indexConfig)
.Mappings(m => m
.Map<DocumentItem>(mp => mp.AutoMap()
)
));
As you can see, I'm trying to map a Dictionary type. In every document the keys of the dictionary are different.
My goal is to apply my custom analyzer to all text values of the dictionary. I have no idea how to do this.
The dynamic templates feature will help you here. You can configure a dynamic template for all string fields below the props object, which will create a mapping for such fields with a given analyzer.
Here is an example that creates text fields with the english analyzer:
var createIndexResponse = await client.CreateIndexAsync("index_name",
c => c.Mappings(m => m
.Map<Document>(mm => mm.DynamicTemplates(dt => dt
.DynamicTemplate("props_fields", t => t
.PathMatch("props.*")
.MatchMappingType("string")
.Mapping(dm => dm.Text(text => text.Analyzer("english"))))))));
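The Document type in this example is not shown in the original answer; it is assumed to look roughly like the question's DocumentItem, for instance:
// hypothetical POCO assumed by the example above and the indexing call below
public class Document
{
    public string Id { get; set; }
    public string Name { get; set; }
    public Dictionary<string, string> Props { get; set; } = new Dictionary<string, string>();
}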
and here is the mapping after indexing the following document:
var document = new Document { Id = "1", Name = "name"};
document.Props.Add("field1", "value");
var indexDocument = await client.IndexDocumentAsync(document);
The resulting mapping:
{
"index_name": {
"mappings": {
"document": {
"dynamic_templates": [
{
"props_fields": {
"path_match": "props.*",
"match_mapping_type": "string",
"mapping": {
"analyzer": "english",
"type": "text"
}
}
}
],
"properties": {
"id": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"name": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"props": {
"properties": {
"field1": {
"type": "text",
"analyzer": "english"
}
}
}
}
}
}
}
}
Hope that helps.
I have a table of log records and I want to run a simple search by date.
For example, I want to find all records before 01.06.2019 00:00:00 (dd.MM.yyyy hh:mm:ss), so I wrote this query:
var query = client.Search<SearchEventDto>(s => s
.AllIndices()
.AllTypes()
.Query(q => q
.MatchAll() && +q
.DateRange(r =>r
.Field(f => f.timestamp)
.LessThanOrEquals(new DateTime(2019,06,01, 0, 0, 0))
)
)
);
My DTO looks like this:
public class SearchEventDto : IDto
{
[KendoColumn(Hidden = true, Editable = true)]
public string id { get; set; }
[KendoColumn(Order = 2, DisplayName = "Level")]
public string level { get; set; }
[KendoColumn(Order = 4, DisplayName = "Message")]
public string message { get; set; }
[KendoColumn(Hidden = true)]
public string host { get; set; }
[KendoColumn(Order = 3, DisplayName = "Source")]
public string src { get; set; }
[KendoColumn(Order = 1, DisplayName = "Timestamp", UIType = UIType.DateTime)]
public DateTime timestamp { get; set; }
[KendoColumn(Hidden = true)]
public DateTime time { get; set; }
}
Unfortunately, it is returning all the records without filtering anything.
Where am I going wrong in this?
Thanks in advance!
PS: ES version: 6.7.0, NEST: 6.8
PS: I have integrated the logs with NLog, so now every day it creates a new index with the date as the name. Here is the mapping for 2019-06-28 (I am using #timestamp):
{
"logstash-2019-06-28": {
"mappings": {
"logevent": {
"properties": {
"#timestamp": {
"type": "date"
},
"host": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"level": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"message": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"src": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"time": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
}
}
}
}
I'll post what we figured out in the comments as an answer, since I think there are a couple of things which could be improved for performance and readability.
Solution:
The query from the question was using .Field(f => f.timestamp), which NEST translated to the field name timestamp rather than #timestamp. A simple change to .Field("#timestamp") resolves the problem, as that is the proper field name in the index mapping:
{
"logstash-2019-06-28": {
"mappings": {
"logevent": {
"properties": {
"#timestamp": {
"type": "date"
},
..
}
}
}
}
}
We could also mark the timestamp property with the PropertyName attribute to tell NEST to use #timestamp as the field name instead of timestamp:
public class SearchEventDto : IDto
{
[KendoColumn(Order = 1, DisplayName = "Timestamp", UIType = UIType.DateTime)]
[PropertyName("#timestamp")]
public DateTime timestamp { get; set; }
}
and the query
var query = client.Search<SearchEventDto>(s => s
.AllIndices()
.AllTypes()
.Query(q => q
.MatchAll() && +q
.DateRange(r =>r
.Field(f => f.timestamp)
.LessThanOrEquals(new DateTime(2019,06,01, 0, 0, 0))
)
)
);
would then just work as well.
Improvements:
Query only specific indices:
var query = client.Search<SearchEventDto>(s => s
.AllIndices()
.AllTypes()
..
By using AllIndices() we are telling Elasticsearch to try to gather documents from all of the indices; we could change it a little to query only the indices with log data:
var query = client.Search<SearchEventDto>(s => s
.Index("logstash-*")
.Type("logevent")
..
Use filter context for date range filter:
.Query(q => q.Bool(b => b.Filter(f => f.DateRange(..))))
This way your query should be faster, as Elasticsearch doesn't have to calculate a relevance score for the filter. You can read more about it here.
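Putting both improvements together, a sketch of the full query in filter context (assuming the "#timestamp" field name from the fix above and the narrowed index/type from the previous point):
var query = client.Search<SearchEventDto>(s => s
    .Index("logstash-*")
    .Type("logevent")
    .Query(q => q
        .Bool(b => b
            .Filter(f => f
                .DateRange(r => r
                    .Field("#timestamp")
                    .LessThanOrEquals(new DateTime(2019, 06, 01, 0, 0, 0))
                )
            )
        )
    )
);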
Hope that helps.
I am querying Elasticsearch with NEST, using this code:
var mustTerms = new List<Func<QueryDescriptor<ElasticCheckIn>, Nest.QueryContainer>>();
var dmaxPrice = maxPrice.HasValue ? (double?)maxPrice.Value : 100000d;
var dminPrice = minPrice.HasValue ? (double?)minPrice.Value : 0d;
mustTerms.Add(mt => mt.Range(rd => rd.OnField("price").LowerOrEquals(dmaxPrice).GreaterOrEquals(dminPrice)));
Func<QueryDescriptor<ElasticCheckIn>, Nest.QueryContainer> queryFunc = qd => qd
.FunctionScore(fsq => fsq
.Query(fsqd => fsqd
.Bool(bqd => bqd
.Must(mustTerms.ToArray())
.Should(shouldTerms.ToArray())))
.Functions(fsd => fsd
.Linear("createDate", fsdfd => fsdfd
.Scale("1d")
.Decay(0.5d)
.Origin(DateTime.UtcNow.ToString("O")))));
Func<SearchDescriptor<ElasticCheckIn>, SearchDescriptor<ElasticCheckIn>> searchFunc = q => q
.Index(_indexName)
.Type(_typeName)
.Query(queryFunc)
.Size(limit);
This code produces this query:
{
"query": {
"function_score": {
"functions": [
{
"linear": {
"createDate": {
"origin": "2015-03-16T12:48:14.2660667Z",
"scale": "1d",
"decay": 0.5
}
}
}
],
"query": {
"bool": {
"must": [
{
"range": {
"price": {
"gte": "29.97",
"lte": "67.5"
}
}
}
]
}
}
}
}
}
As you can see, NEST renders the double values in the range part as strings:
{
"range": {
"price": {
"gte": "29.97",
"lte": "67.5"
}
}
}
This results in:
NumberFormatException[For input string: "29.97"]
When I correct the double values by hand, Elasticsearch correctly returns the query results. How can I get NEST to render the double values correctly?
Update:
Mapping for this index:
{
"myindice": {
"mappings": {
"mytype": {
"properties": {
"createDate": {
"type": "date",
"format": "dateOptionalTime"
},
"id": {
"type": "string"
},
/* other fields */
"price": {
"type": "long"
}
/* other fields */
}
}
}
}
}
We did not explicitly map the price field as long. Some of the price values on the docs are integers, some are doubles.
This may be a bug in NEST with the syntax you've chosen; you could create a new issue in their GitHub repo. A more standard approach would be to create a C# POCO to represent your ES data, something like this:
[ElasticType]
public class Type1
{
public string Id { get; set; }
public DateTime CreationDate { get; set; }
public double Price { get; set; }
}
With that, your query syntax becomes much, much simpler:
var result2 = ElasticClient.Search<Type1>(s=>s
.Query(q=> q
.Filtered(f=>f
.Filter(ff=> ff
.Range(r=>r
.OnField(rf=>rf.Price)
.GreaterOrEquals(29.97)
.LowerOrEquals(67.5)
)
)
)
)
);