This is my problem: I have a nested dictionary (primary keys: orderIds, secondary keys: productIds).
My structure:
Dictionary<string, Dictionary<string, List<Task>>>
I need to look up a productId and return its value (the list of Task objects).
A product key usually does not appear in several orders, so it is effectively unique.
Here is a json example:
{
  "O1": {
    "P1": [ { "Field": "V1" }, { "Field": "V7" } ],
    "P2": [ { "Field": "V2" }, { "Field": "V8" } ]
  },
  "O2": {
    "P1": [ { "Field": "V3" }, { "Field": "V5" } ],
    "P2": [ { "Field": "V4" }, { "Field": "V6" } ]
  }
}
If I look for productId "P5" I want to get...
[{"Field":"V5"}]
The only way I found that works is...
return base.Values.Single(x => x.ContainsKey(productId))[productId];
//base is the nested Dictionary
But I don't like this approach, because I first detach the matching inner dictionary from the collection of values (the one that contains the key) and then fetch the value from it by key (collection[key]).
Those are basically two steps, but I suspect there is an easier way to do it in one step; I just can't find it.
Maybe you can help me out. :)
For performance tuning, consider a HashSet instead of a List if you don't need ordering or duplicate entries.
I can also recommend ISBN 0321637003 (LINQ to Objects); maybe there is an updated edition, but it is still very good content.
If you have millions of entries in your dict, you can try PLINQ.
return base.Values.Single(x => x.ContainsKey(productId))[productId];
The sample JSON throws an exception, because your productId (P1, P2) is not unique and P5 does not exist.
If I look for productId "P5" I want to get...
[{"Field":"V5"}]
There is no P5.
I am not sure it helps, but here is my sample code. I put your JSON into a file and deserialized it.
var lFile = new FileInfo(@"C:\_test\data.json");
using var lReader = lFile.OpenText();
var lJsonStr = lReader.ReadToEnd();
var lDicDic = JsonConvert.DeserializeObject<Dictionary<string, Dictionary<string, List<ProductId>>>>(lJsonStr);
//var lTest1 = lDicDic.Values.Single(x => x.ContainsKey("P1"))["P1"]; //Not working, P1 is not unique!
var lTest2 = lDicDic["O2"]["P1"];
//contains => "Field": "V3" + "Field": "V5"
var lTest3 = lDicDic
.SelectMany(p => p.Value.Values.SelectMany(qItem => qItem))
.FirstOrDefault(qProducts => qProducts.Field == "V5");
//contains => "Field": "V5"
var lTest4 = lDicDic["O2"]["P1"].Last();
//contains => "Field": "V5"
EDIT:
After your Fiddler code I was able to test it.
public List<MyTask> GetByProductId(string productId)
{
    var lProductDict = Tasks.Values.SingleOrDefault(x => x.ContainsKey(productId));
    return lProductDict?.GetValueOrDefault(productId);
}
My gut tells me your current approach is hard to improve ;)
If you make a lot of calls on this nested dict and it does not change very often, it could make sense to invert the dict from top-down (orderId first) to bottom-up (productId first) once, at the beginning of the processing.
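To illustrate that idea, here is a minimal sketch (using the Tasks/MyTask names from the EDIT below; the helper name is mine) of inverting the nested dict once so that later lookups go straight from productId to its tasks:
// Sketch only: flatten the order -> product nesting into a productId index.
// Assumes productIds are unique across orders, as stated in the question;
// ToDictionary would throw on a duplicate productId.
public Dictionary<string, List<MyTask>> BuildProductIndex()
{
    return Tasks
        .SelectMany(order => order.Value)                    // (productId, tasks) pairs
        .ToDictionary(pair => pair.Key, pair => pair.Value); // productId -> tasks
}
// Usage: build once after loading, then look up in O(1):
// var productIndex = BuildProductIndex();
// var tasks = productIndex[productId];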
Anyway, if my comments are helpful, please upvote my answer :)
If I were you, I would make some test cases with real data.
Don't forget: if you are in DEBUG mode, enable "Optimize Code" on the project. In my test case, I had an average of 47 ms without and 42 ms with optimized code:
You can give PLINQ a try:
https://learn.microsoft.com/en-us/dotnet/standard/parallel-programming/introduction-to-plinq
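For example, here is a rough PLINQ sketch of the same lookup (same Tasks/MyTask names as above); it still scans the orders, it just does so on several cores, so it is only worth trying for very large dictionaries:
// Sketch of a parallel variant of GetByProductId.
// FirstOrDefault instead of SingleOrDefault: it stops at the first match
// and skips the uniqueness check.
public List<MyTask> GetByProductIdParallel(string productId)
{
    var lProductDict = Tasks.Values
        .AsParallel()
        .FirstOrDefault(x => x.ContainsKey(productId));
    return lProductDict?.GetValueOrDefault(productId);
}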
Here is my test scenario:
var lStopWatch = new Stopwatch();
lStopWatch.Restart();
var lJobs = new Jobs("F1");
const int TestOrderCount = 1000000;
const int TestAvgCount = 1000;
for (var lIndex = 1; lIndex < TestOrderCount; lIndex++)
{
    var lNoStr = lIndex.ToString("D6");
    lJobs.Add($"O{lNoStr}", $"P{lNoStr}", $"V{lNoStr}");
}
var GetTimes = new List<long>();
var lRandom = new Random();
var lTestCases = Enumerable
    .Range(1, TestAvgCount - 1)
    .Select(r => $"P{lRandom.Next(1, TestOrderCount - 1):D6}")
    .ToList();
var lSetupTimeMs = lStopWatch.ElapsedMilliseconds;
Debug.WriteLine($"SetupTimeMs: {lSetupTimeMs}");
foreach (var lTestCase in lTestCases)
{
    lStopWatch.Restart();
    var lTest = lJobs.GetByProductId(lTestCase);
    GetTimes.Add(lStopWatch.ElapsedMilliseconds);
}
var lAvg = GetTimes.Sum() / GetTimes.Count; //AVG ms per get, divided by the actual number of samples
Debug.WriteLine($"AVG: {lAvg}");
I want to run a MultiGet (mget) search query on several IDs across two indexes. This is because I have two indexes, but I don't know which index contains my ID. This is the query:
GET _mget
{
  "docs": [
    { "_id": "id1", "_index": "index1" },
    { "_id": "id1", "_index": "index2" }
    /* .... */
  ]
}
The query works great manually - I get the results and I just ignore the result that returns found: false.
NEST does not seem to support this functionality, only mget against a single index. So I tried to use the low-level client to achieve it, like so:
var data = PostData.Serializable(new
{
    docs = new[]
    {
        new { _id = "1", _index = "index1" },
        new { _id = "1", _index = "index2" }
    }
});
var response = await lowLevelClient.MultiGetAsync<MultiGetResponse>(data);
However, I'm getting the following exception: Elasticsearch.Net.UnexpectedElasticsearchClientException: 'Constructor on type 'Nest.MultiGetResponseFormatter' not found.'.
Is this the right way to achieve what I want?
The following will help you achieve what you are looking for with NEST:
var request = new MultiGetRequest();
request.Documents = new IMultiGetOperation[]
{
new MultiGetOperation<object>("id1") { Index = "index1" },
new MultiGetOperation<object>("id1") { Index = "index2" },
};
var multiGetResponse = await client.MultiGetAsync(request);
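As a follow-up, you could then keep only the hit that was actually found, mirroring the manual mget behaviour of ignoring found: false. This is only a sketch; I have not verified the exact response property names against every NEST version:
// Sketch: pick the document from whichever index actually contained it.
var hit = multiGetResponse.Hits.FirstOrDefault(h => h.Found);
if (hit != null)
{
    Console.WriteLine($"Found {hit.Id} in index {hit.Index}");
}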
Objective
Create a mock object, using Moq and xUnit, for loading the specific section "Character:Skills", to enhance coverage in unit testing.
The SUT (at some point) loads the setting in this way:
var skills = Configuration.GetSection("Character:Skills");
From the following appSettings:
{
  "dummyConfig1": {
    "Description": "bla bla bla..."
  },
  "Character": {
    "Name": "John Wick",
    "Description": "A retired hitman seeking vengeance for the killing of the dog given to him...",
    "Skills": [
      { "Key": "CQC Combat", "Id": "15465" },
      { "Key": "Firearms", "Id": "14321" },
      { "Key": "Stealth", "Id": "09674" },
      { "Key": "Speed", "Id": "10203" }
    ],
    "DummyConf2": "more bla bla bla..."
  }
}
Previous Reading
Reading these posts (and others found via Googling), I noticed that we can only use a primitive "string" data type or a new Mock<IConfigurationSection> object (with no settings):
Stack Overflow - how to mock Configuration.GetSection(“foo:bar”),
Mocking IConfiguration extension method
Mocking IConfiguration Getvalue() extension method in Unit Test
Constraint: copying the appSettings file into the test project (or creating a MemoryStream) to load the real settings could solve this scenario, but the test would then be an "Integration" test instead of a "Unit" test, since there would be an I/O dependency.
The approach
The idea of the code (shown later) is to mock each property (Key/Id) and then merge them into a tree similar to this:
"Character" ------ Configuration to be read, using GetSection() and then Get<T>()
"Skills" ------ Configuration list with merged attribute
"Key" - "CQC Combat" ------ Primitive value 1
"Id" - "15465" ------ Primitive value 2
The Code
var skillsConfiguration = new List<SkillsConfig>
{
    new SkillsConfig { Key = "CQC Combat", Id = "15465" },
    new SkillsConfig { Key = "Firearms", Id = "14321" },
    new SkillsConfig { Key = "Stealth", Id = "09674" },
    new SkillsConfig { Key = "Speed", Id = "10203" },
};
var configurationMock = new Mock<IConfiguration>();
var mockConfSections = new List<IConfigurationSection>();
foreach (var skill in skillsConfiguration)
{
    var index = skillsConfiguration.IndexOf(skill);
    //Set the Key string value
    var mockConfSectionKey = new Mock<IConfigurationSection>();
    mockConfSectionKey.Setup(s => s.Path).Returns($"Character:Skills:{index}:Key");
    mockConfSectionKey.Setup(s => s.Key).Returns("Key");
    mockConfSectionKey.Setup(s => s.Value).Returns(skill.Key);
    //Set the Id string value
    var mockConfSectionId = new Mock<IConfigurationSection>();
    mockConfSectionId.Setup(s => s.Path).Returns($"Character:Skills:{index}:Id");
    mockConfSectionId.Setup(s => s.Key).Returns("Id");
    mockConfSectionId.Setup(s => s.Value).Returns(skill.Id);
    //Merge the Key/Id attributes into one configuration section per skill
    var mockConfSection = new Mock<IConfigurationSection>();
    mockConfSection.Setup(s => s.Path).Returns($"Character:Skills:{index}");
    mockConfSection.Setup(s => s.Key).Returns(index.ToString());
    mockConfSection.Setup(s => s.GetChildren()).Returns(new List<IConfigurationSection> { mockConfSectionKey.Object, mockConfSectionId.Object });
    //Add the skill object with merged attributes
    mockConfSections.Add(mockConfSection.Object);
}
// Add the Skills list
var skillsMockSections = new Mock<IConfigurationSection>();
skillsMockSections.Setup(cfg => cfg.Path).Returns("Character:Skills");
skillsMockSections.Setup(cfg => cfg.Key).Returns("Skills");
skillsMockSections.Setup(cfg => cfg.GetChildren()).Returns(mockConfSections);
//Mock the whole section, for use with GetSection() within the SUT
configurationMock.Setup(cfg => cfg.GetSection("Character:Skills")).Returns(skillsMockSections.Object);
Expected result
Running the original system, I get the instantiated list with its respective values.
Here is the screenshot:
Mocked result
With the code above, I only get the instantiated list, but all the attributes return null.
Here is the screenshot:
Finally, I refactored the code, getting rid of the whole foreach block and replacing the list initialization var mockConfSections = new List<IConfigurationSection>(); with the following piece of code, which is simpler and cleaner.
var fakeSkillSettings = skillsConfiguration.SelectMany(
    skill => new Dictionary<string, string> {
        { $"Character:Skills:{skillsConfiguration.IndexOf(skill)}:Key", skill.Key },
        { $"Character:Skills:{skillsConfiguration.IndexOf(skill)}:Id", skill.Id },
    });
var configBuilder = new ConfigurationBuilder();
var mockConfSections = configBuilder.AddInMemoryCollection(fakeSkillSettings)
    .Build()
    .GetSection("Character:Skills")
    .GetChildren();
Explanation
As the previous implementation built a configuration tree out of mocked nodes, a setup and a return were needed for each one, resulting in a bloated solution.
Based on the article Keeping Configuration Settings in Memory, I projected the list into a flattened Key/Id dictionary using LINQ's SelectMany, then built the in-memory configuration and finally mocked the setting with "real nodes", resulting in a single mock setup.
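As a small follow-up sketch (an assumption on my part, not part of the original solution): since the in-memory ConfigurationBuilder already produces a real section, the IConfiguration mock can return that section directly, and the skillsMockSections mock is not needed at all:
// Sketch: build a real in-memory configuration once (same fakeSkillSettings
// as above) and hand the real section to the mock, instead of mocking every node.
var realConfig = new ConfigurationBuilder()
    .AddInMemoryCollection(fakeSkillSettings)
    .Build();
var configurationMock = new Mock<IConfiguration>();
configurationMock
    .Setup(cfg => cfg.GetSection("Character:Skills"))
    .Returns(realConfig.GetSection("Character:Skills"));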
I have a document like:
{
  "Id": "1",
  "Name": "product1",
  "Categories": ["Cat1", "Cat2", "Cat3"]
},
{
  "Id": "2",
  "Name": "product2",
  "Categories": ["Cat3", "Cat2", "Cat6"]
}
Now I want to return a distinct list of all categories.
I tried CollapseParameters, but it doesn't work.
var foo = _solr.Query(new SolrQuery("*"), new QueryOptions
{
    Collapse = new CollapseParameters("Categories")
});
var bar = foo.CollapseExpand; //NULL
How can I get a list of all categories without iterating through all the documents? Do I have to create new Documents for the categories? Or should I work with Faceting?
It looks like the JSON is not valid, so I created a solution after changing the JSON:
{
"items": [{
"Id": "1",
"Name": "product1",
"Categories": ["Cat1", "Cat2", "Cat3"]
}, {
"Id": "2",
"Name": "product2",
"Categories": ["Cat3", "Cat2", "Cat6"]
}]
}
And then I used foreach and LINQ:
var x = categories.SelectMany(item => item.Categories).Distinct();
If you are using Solr and you want all distinct values for a specific field, then you should start reading the Solr faceting docs; they are amazing.
Just add the parameters below to your query and you will get the full list of all distinct categories.
&facet=true&facet.field=Categories
That's it. It will give you the result set you expected.
QueryOptions options = new QueryOptions
{
    StartOrCursor = new StartOrCursor.Start(0),
    Rows = 1,
    Facet = new FacetParameters { Queries = new[] { new SolrFacetFieldQuery("Categories") } }
};
var result = _solrConnection.Query(new SolrQuery("*"), options);
if (result.FacetFields.ContainsKey("Categories"))
{
    return result.FacetFields["Categories"]
        .Where(li => li.Value > 0)
        .Select(y => y.Key)
        .ToList();
}
That's my current solution. It works, but I hoped there would be a better way; I'm running a whole search just to get all the categories.
I have a document like this
{
  "_id": "63dafa72f21d48312d8ca405",
  "tasks": [
    {
      "_ref": "63d8d8d01beb0b606314e322",
      "data": {
        "values": [
          { "key": "Deadline", "value": "2014-10-13" }
        ]
      }
    },
    {
      "_ref": "84dd046c6695e32322d842f5",
      "data": {
        "values": []
      }
    }
  ]
}
Now I want to update the value inside values (which is inside data) if the _ref field matches my input.
My code so far:
public bool updateProject(Project dbPro, Project pro)
{
    var collection = db.GetCollection<BsonDocument>("projects");
    var filter = Builders<BsonDocument>.Filter.Eq("_id", ObjectId.Parse(dbPro.Id));
    var update = Builders<BsonDocument>.Update.AddToSetEach("tasks", pro.Tasks);
    var result = collection.UpdateOne(filter, update);
    if (result.IsModifiedCountAvailable)
    {
        if (result.ModifiedCount == 1)
        {
            return true;
        }
    }
    return false;
}
At the moment this code only appends the documents as new tasks instead of appending the values to the matching tasks. Maybe someone has an idea how to achieve this behavior?
UPDATE
I tried it like @Shane Oborn said, but it's still not working for me.
var collection = db.GetCollection<BsonDocument>("projects");
var filter = Builders<BsonDocument>.Filter.Eq("_id", ObjectId.Parse(dbPro.Id));
var update = Builders<BsonDocument>.Update.Push("tags", buildBsonArrayFromTags(pro.Tags));
var result = collection.UpdateOne(filter, update);
if (result.IsModifiedCountAvailable)
{
    if (result.ModifiedCount == 1)
    {
        return true;
    }
}
return false;
Instead of overriding the data, it appends an array to my array.
UPDATE
OK, instead of Push I needed Set, and then it worked.
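For completeness, here is a rough sketch (with hypothetical variables taskRef and newValues, not the exact code from the question) of what a Set-based update with the positional $ operator could look like, so that only the matching tasks element is touched:
// Sketch: match the document by _id AND the array element by tasks._ref,
// then overwrite data.values on exactly that element via the positional operator.
var collection = db.GetCollection<BsonDocument>("projects");
var filter = Builders<BsonDocument>.Filter.And(
    Builders<BsonDocument>.Filter.Eq("_id", ObjectId.Parse(dbPro.Id)),
    Builders<BsonDocument>.Filter.Eq("tasks._ref", taskRef));
var update = Builders<BsonDocument>.Update.Set("tasks.$.data.values", newValues); // newValues: e.g. a BsonArray
var result = collection.UpdateOne(filter, update);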
I don't have the exact code accessible, but this is close. I have a method that performs "upserts" (which adds if new, or updates if existing). This should get you close:
// The variable "doc" below is a BsonDocument
var updateRequests = new List<WriteModel<BsonDocument>>();
updateRequests.Add(new ReplaceOneModel<BsonDocument>(
    CreateBsonDocumentFilterDefinition(filterKeyName, filterKeyValue), doc)
{
    IsUpsert = true
});
var writeResult = await collection.BulkWriteAsync(updateRequests);
The key objects here for you are "ReplaceOneModel" and the "IsUpsert" property for the filter definition.
Good luck!
UPDATE:
Another method I have that does updates in subdocuments looks like this:
// Below, "subDocument" is a BsonDocument, and "subDocArrayName" is a string
// that should match the name of the array that contains your sub-document
// that will be updated.
var collection = _database.GetCollection<BsonDocument>(collectionName);
var builder = Builders<BsonDocument>.Update;
var update = builder.Push(subDocArrayName, subDocument);
await collection.UpdateOneAsync(CreateBsonDocumentFilterDefinition(filterKeyName, filterKeyValue), update);
I have a block of code that works in .NET 4.0+, but I need to use this code in an SSIS package that only supports up to .NET 3.5. The problem is that I can't use dynamic below 4.0. I'm unable to find a workaround; any ideas?
string json = File.ReadAllText(@"C:json.txt");
dynamic deserialisedJson = JsonConvert.DeserializeObject(json);
var locations = new List<Location>();
foreach (var root in deserialisedJson)
{
    foreach (var state in root)
    {
        foreach (var city in state)
        {
            foreach (var location in city)
            {
                Location loc = new Location();
                loc.CafeId = location.First["cafeID"];
                loc.CafeName = location.First["cafeName"];
                loc.CafeState = location.First["cafeState"];
                loc.CafeCity = location.First["cafeCity"];
                loc.CafeStreetName = location.First["cafeStreetName"];
                loc.CafeZip = location.First["cafeZip"];
                locations.Add(loc);
            }
        }
    }
}
UPDATE
Adding JSON schema
{
  "AK": {
    "Anchorage": [{
      "Name": "John Doe",
      "Address": "123 Main St.",
      "City": "Anchorage",
      "State": "AK",
      "Zip": "12345"
    }],
    "Fairbanks": [{
      "Name": "Sally Smith",
      "Address": "987 Main St.",
      "City": "Fairbanks",
      "State": "AK",
      "Zip": "98765"
    }]
  }
}
UPDATE 2
I am attempting the IEnumerable workaround, but I am not sure what the correct syntax is to grab the values I need:
string json = File.ReadAllText(@"C:json.txt");
var deserialisedJson = (IEnumerable)JsonConvert.DeserializeObject(json);
var locations = new List<Location>();
foreach (var root in deserialisedJson)
{
    foreach (var state in (IEnumerable)root)
    {
        foreach (var city in (IEnumerable)state)
        {
            foreach (var location in (IEnumerable)city)
            {
                Location loc = new Location();
                loc.Name = //What goes here???
                loc.Address = //What goes here???
                loc.City = //What goes here???
                loc.State = //What goes here???
                loc.Zip = //What goes here???
                locations.Add(loc);
            }
        }
    }
}
From another post - Newtonsoft JSON Deserialize:
class MyData
{
public string t;
public bool a;
public object[] data;
public string[][] type;
}
and then use the generic version of DeserializeObject:
MyData tmp = JsonConvert.DeserializeObject<MyData>(json);
foreach (string typeStr in tmp.type[0])
{
// Do something with typeStr
}
Without an example JSON I can only speculate - but it looks like you already know the (relevant) schema of your JSON. You don't need dynamic in the first place, and even in .NET 4.0 and higher I'd advise against using it. Code using dynamic is typically slower, more error prone, and harder to debug than statically typed code, not just because of compile-time checks, but also because runtime errors surface earlier (closer to their cause).
From your limited example and without knowing what that First thing is, it looks to me like you could do something like...
class LocationFromJson {
public LocationContentsFromJson First;
}
class LocationContentsFromJson {
public string cafeID, cafeName, cafeState, cafeCity, cafeStreetName, cafeZip;
}
//Later usage; this should be equivalent to your example:
var deserialisedJson = JsonConvert.DeserializeObject<LocationFromJson[][][][]>(json);
var locations =
deserialisedJson //4 levels of enumerable
.SelectMany(o => o) //3 levels of enumerable
.SelectMany(o => o) //2 levels of enumerable
.SelectMany(o => o) //1 level of enumerable
.Select(o => new Location {
CafeId = o.First.cafeID,
CafeName = o.First.cafeName,
CafeState = o.First.cafeState,
CafeCity = o.First.cafeCity,
CafeStreetName = o.First.cafeStreetName,
CafeZip = o.First.cafeZip,
}).ToArray();
To be explicit: this may or may not work unaltered. Your example does not include the type declaration of Location nor an example json, so I'm speculating here a little: Location.CafeId might also be an int; I can't tell from your question.
But this shouldn't be too far from what you need.
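Given the JSON schema posted in the first UPDATE (state -> city -> list of location objects), here is an additional sketch without dynamic; it assumes the Location class has the string properties Name, Address, City, State and Zip used in UPDATE 2:
// Sketch: the posted schema maps onto nested dictionaries instead of jagged arrays.
class LocationJson
{
    public string Name, Address, City, State, Zip;
}
var parsed = JsonConvert.DeserializeObject<
    Dictionary<string, Dictionary<string, List<LocationJson>>>>(json);
var locations = parsed
    .SelectMany(state => state.Value)   // flatten states to their cities
    .SelectMany(city => city.Value)     // flatten cities to location entries
    .Select(l => new Location
    {
        Name = l.Name,
        Address = l.Address,
        City = l.City,
        State = l.State,
        Zip = l.Zip
    })
    .ToList();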