I have a collection like below
{
"id":"5fd13c33ac0277c435117519",
"content":"test",
"votes":[
{
"user":"22",
"isLike":true
},
{
"user":"25",
"isLike":false
},
{
"user":"43",
"isLike":false
}
]
},
{
"id":"5fd13c33ac0277c435237443",
"content":"test 2",
"votes":[
{
"user":"25",
"isLike":true
},
{
"user":"43",
"isLike":false
}
]
}
How can I get the result below with the C# driver by querying on votes.user (i.e. 25) and then merging those two fields from the parent document and the nested array?
{
"id":"5fd13c33ac0277c435117519",
"isLike":false
},
{
"id":"5fd13c33ac0277c435237443",
"isLike":true
}
EDIT: I've got the result after some trial and error on mongoplayground.net, but I'm still not sure how to convert it to the C# driver.
db.collection.aggregate([
{
$unwind: "$votes"
},
{
$match: {
"votes.user": "25"
}
},
{
$replaceWith: {
id: "$id",
isLike: "$votes.isLike"
}
}
])
I've managed it as shown below and wanted to share it in case someone needs it in the future.
I personally decided to go the one-to-many route with a separate vote collection after some reading.
var match = new BsonDocument("votes.user", "25"); // user ids are stored as strings in the sample documents
var replace = new BsonDocument(new List<BsonElement> {
new BsonElement("id", "$id"),
new BsonElement("isLike", "$votes.isLike"),
});
return await _collection.Aggregate()
.Unwind(c => c.votes)
.Match(match)
.ReplaceWith<UserVote>(replace)
.ToListAsync();
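For reference, the UserVote class used in ReplaceWith&lt;UserVote&gt; above isn't shown; a minimal sketch of what it could look like (the property-to-field mapping is my assumption, chosen to match the replacement document built above):

using MongoDB.Bson.Serialization.Attributes;

public class UserVote
{
    // Maps to the "id" field produced by the $replaceWith stage
    [BsonElement("id")]
    public string Id { get; set; }

    // Maps to the "isLike" field copied from votes.isLike
    [BsonElement("isLike")]
    public bool IsLike { get; set; }
}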
I have the following document structure:
{
"Agencies": [
{
"name": "tcs",
"id": "1",
"AgencyUser": [
{
"UserName": "ABC",
"Code": "ABC40",
"Link": "http.ios.com",
"TotalDownloads": 0
},
{
"UserName": "xyz",
"Code": "xyz20",
"Link": "http.ios.com",
"TotalDownloads": 0
}
]
}
]
}
So I have multiple agencies, and each agency contains a list of agents.
What I am trying to do is pass the Code and update the TotalDownloads field of the agent that matches that code.
For example, if someone uses the code ABC40, I need to update the TotalDownloads field of the agent called "ABC".
What I have tried is as below:
public virtual async Task UpdateAgentUsersDownloadByCode(string Code)
{
var col = _db.GetCollection<Agencies>(Agencies.DocumentName);
FilterDefinition<Agencies> filter = Builders<Agencies>.Filter.Eq("AgencyUsers.Code", Code);
UpdateDefinition<Agencies> update = Builders<Agencies>.Update.Inc(x => x.AgencyUsers.FirstOrDefault().TotalDownloads, 1);
await col.UpdateOneAsync(filter, update);
}
It is giving me the following error:
Unable to determine the serialization information for x => x.AgencyUsers.FirstOrDefault().TotalDownloads.
Where am I going wrong?
Note: In the attached sample document, the array property name AgencyUser does not match the property name you specified in the update operation, AgencyUsers.
Use arrayFilters with the $[<identifier>] filtered positional operator to update the element(s) in the array.
MongoDB syntax
db.Agencies.update({
"AgencyUsers.Code": "ABC40"
},
{
$inc: {
"AgencyUsers.$[agencyUser].TotalDownloads": 1
}
},
{
arrayFilters: [
{
"agencyUser.Code": "ABC40"
}
]
})
Demo @ Mongo Playground
MongoDB .NET Driver syntax
UpdateDefinition<Agencies> update = Builders<Agencies>.Update
.Inc("AgencyUsers.$[agencyUser].TotalDownloads", 1);
UpdateOptions updateOptions = new UpdateOptions
{
ArrayFilters = new[]
{
new BsonDocumentArrayFilterDefinition<Agencies>(
new BsonDocument("agencyUser.Code", Code)
)
}
};
UpdateResult result = await col.UpdateOneAsync(filter, update, updateOptions);
Demo
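Put together with the filter from your question, the whole method could look roughly like this (a sketch; it reuses the names from the question and assumes the C# array property is serialized as AgencyUsers):

public virtual async Task UpdateAgentUsersDownloadByCode(string Code)
{
    var col = _db.GetCollection<Agencies>(Agencies.DocumentName);

    // Match any agency document whose AgencyUsers array contains an element with this code
    FilterDefinition<Agencies> filter = Builders<Agencies>.Filter.Eq("AgencyUsers.Code", Code);

    // Increment TotalDownloads only on the array element(s) picked out by the array filter
    UpdateDefinition<Agencies> update = Builders<Agencies>.Update
        .Inc("AgencyUsers.$[agencyUser].TotalDownloads", 1);

    UpdateOptions updateOptions = new UpdateOptions
    {
        ArrayFilters = new[]
        {
            new BsonDocumentArrayFilterDefinition<Agencies>(
                new BsonDocument("agencyUser.Code", Code))
        }
    };

    await col.UpdateOneAsync(filter, update, updateOptions);
}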
I want to run a MultiGet (mget) search query on several IDs across two indexes. This is because I have two indexes, but I don't know which index contains my ID. This is the query:
GET _mget
{
"docs" : [
{
"_id": "id1",
"_index": "index1"
},
{
"_id": "id1",
"_index": "index2"
}
/* .... */
]
}
The query works great when run manually - I get the results and just ignore the result that returns found: false.
NEST does not support this functionality, only against a single index. So I tried to use the low-level client to achieve this, like so:
var data = PostData.Serializable(new
{
docs = new[]
{
new {
_id = "1",
_index = "index1"
},
new
{
_id = "1",
_index = "index2"
}
}
});
var response = await lowLevelClient.MultiGetAsync<MultiGetResponse>(data);
However, I'm getting the following exception: Elasticsearch.Net.UnexpectedElasticsearchClientException: 'Constructor on type 'Nest.MultiGetResponseFormatter' not found.'.
Is this the right way to achieve what I want?
The following will help you achieve what you are looking for with NEST:
var request = new MultiGetRequest();
request.Documents = new IMultiGetOperation[]
{
new MultiGetOperation<object>("id1") { Index = "index1" },
new MultiGetOperation<object>("id1") { Index = "index2" },
};
var multiGetResponse = await client.MultiGetAsync(request);
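Reading the response is then just a matter of skipping the misses, similar to ignoring found: false in the manual query. A rough sketch (I believe the hits expose Found, Id, Index and Source, but treat the exact member names as an assumption for your NEST version):

var multiGetResponse = await client.MultiGetAsync(request);

foreach (var hit in multiGetResponse.Hits)
{
    // The id will only exist in one of the two indices; skip the miss
    if (!hit.Found)
        continue;

    Console.WriteLine($"Found {hit.Id} in index {hit.Index}");
    var document = hit.Source; // object here; use a concrete type instead of object if you have one
}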
Perhaps someone can help me out with a problem converting a MongoDB aggregation query to C# using the MongoDB .NET driver.
Following my problem here, I tried to convert the following to C#:
db.getCollection('test').aggregate([
{ "$facet": {
"allInRoot1": [{
"$match": { "rootReferenceId": LUUID("9f3a73df-bca7-48b7-b111-285359e50a02") }
}],
"allInRoot2": [{
"$match": { "rootReferenceId": LUUID("27f2b4a6-5471-406a-a39b-1e0b0f8c4eb9") }
}]
}},
{ "$project": {
"difference": {
"$filter": {
"input": "$allInRoot1",
"as": "this",
"cond": { "$in": [ "$$this.reference.id", { "$setDifference": [ "$allInRoot1.reference.id", "$allInRoot2.reference.id" ] } ] }
}
}
}}
])
So far I have this
var matchFilterOne = new ExpressionFilterDefinition<NodeModel>(node => node.RootReferenceId == baseId);
var matchStageOne = PipelineStageDefinitionBuilder.Match(matchFilterOne);
var pipelineOne = PipelineDefinition<NodeModel, NodeModel>.Create(new IPipelineStageDefinition[] { matchStageOne });
var matchFilterTwo = new ExpressionFilterDefinition<NodeModel>(node => node.RootReferenceId == idToExclude);
var matchStageTwo = PipelineStageDefinitionBuilder.Match(matchFilterTwo);
var pipelineTwo = PipelineDefinition<NodeModel, NodeModel>.Create(new IPipelineStageDefinition[] { matchStageTwo });
var facetPipelineOne = AggregateFacet.Create("allInRoot1", pipelineOne);
var facetPipelineTwo = AggregateFacet.Create("allInRoot2", pipelineTwo);
var test = testCollection.Aggregate()
.Facet(facetPipelineOne, facetPipelineTwo)
/* This seems to fail because the facet structure is wrong and it can't access the $allInRoot1 field ...
.Project(#"{
'difference': {
'$filter': {
'input': '$allInRoot1',
'as': 'this',
'cond': {
'$in': [ '$$this.reference.id', { '$setDifference': [ '$allInRoot1.reference.id', '$allInRoot2.reference.id' ] }]
}
}}}")
*/
.FirstOrDefault();
Perhaps someone has a clue to point me in the right direction? Is it also possible to use the projection with types?
Any help is appreciated!
The best solution I've seen so far is to copy the whole aggregation pipeline into MongoDB Compass's Aggregations tab and then export it to C#.
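If you go that route, the exported code is essentially the raw pipeline written as BsonDocument stages and passed to Aggregate. A rough, untyped sketch of what that could look like for the $facet/$project pipeline above (names taken from the question; the GUID handling is an assumption and may need GuidRepresentation.CSharpLegacy rather than Standard depending on how rootReferenceId is stored):

var facet = new BsonDocument("$facet", new BsonDocument
{
    { "allInRoot1", new BsonArray { new BsonDocument("$match",
        new BsonDocument("rootReferenceId", new BsonBinaryData(baseId, GuidRepresentation.Standard))) } },
    { "allInRoot2", new BsonArray { new BsonDocument("$match",
        new BsonDocument("rootReferenceId", new BsonBinaryData(idToExclude, GuidRepresentation.Standard))) } }
});

var project = new BsonDocument("$project", new BsonDocument("difference",
    new BsonDocument("$filter", new BsonDocument
    {
        { "input", "$allInRoot1" },
        { "as", "this" },
        { "cond", new BsonDocument("$in", new BsonArray
            {
                "$$this.reference.id",
                new BsonDocument("$setDifference",
                    new BsonArray { "$allInRoot1.reference.id", "$allInRoot2.reference.id" })
            }) }
    })));

PipelineDefinition<NodeModel, BsonDocument> pipeline = new[] { facet, project };
var result = testCollection.Aggregate(pipeline).FirstOrDefault();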
I am setting up a ChangeStream to notify me when a document has changed in a collection so that I can upsert the "LastModified" element for that document to the time of the event. Since this update will cause a new event to occur on the ChangeStream, I need to filter out these updates to prevent an infinite loop (updating the LastModified element because the LastModified element was just updated...).
I have the following code that is working when I specify the exact field:
ChangeStreamOptions options = new ChangeStreamOptions();
options.ResumeAfter = resumeToken;
string filter = "{ $and: [ { operationType: { $in: ['replace','insert','update'] } }, { 'updateDescription.updatedFields.LastModified': { $exists: false } } ] }";
var pipeline = new EmptyPipelineDefinition<ChangeStreamDocument<BsonDocument>>().Match(filter);
var cursor = collection.Watch(pipeline, options, cancelToken);
However, instead of hard-coding the "updateDescription.updatedFields.LastModified", I would like to provide a list of element names that I don't want to exist in the updatedFields document.
I attempted:
string filter = "{ $and: [ { operationType: { $in: ['replace','insert','update'] } }, { 'updateDescription.updatedFields': { $nin: [ 'LastModified' ] } } ] }";
but this didn't work as expected (I still got the update events for the LastModified change).
I originally was using the Filter Builder:
FilterDefinitionBuilder<ChangeStreamDocument<BsonDocument>> filterBuilder = Builders<ChangeStreamDocument<BsonDocument>>.Filter;
FilterDefinition<ChangeStreamDocument<BsonDocument>> filter = filterBuilder.In("operationType", new string[] { "replace", "insert", "update" }); //Only include the change if it was one of these types. Available types are: insert, update, replace, delete, invalidate
filter &= filterBuilder.Nin("updateDescription.updatedFields", ChangedFieldsToIgnore); //If this is an update, only include it if the field(s) updated contains 1+ fields not in the ChangedFieldsToIgnore list
where ChangedFieldsToIgnore is a List containing the field names that I do not want to get events for.
Can anyone help with the syntax that I need to use? Or do I need to create a loop around my ChangedFieldsToIgnore list and create a new "$exists: false" entry in the filter for each item? (This doesn't seem very efficient.)
EDIT:
I attempted the following code based on the answer by @wan-bachtiar, but I'm getting an exception on my enumerator.MoveNext() call:
var match1 = new BsonDocument { { "$match", new BsonDocument { { "operationType", new BsonDocument { { "$in", new BsonArray(new string[] { "replace", "insert", "update" }) } } } } } };
var match2 = new BsonDocument { { "$addFields", new BsonDocument { { "tmpfields", new BsonDocument { { "$objectToArray", "$updateDescription.updatedFields" } } } } } };
var match3 = new BsonDocument { { "$match", new BsonDocument { { "tmpfields.k", new BsonDocument { { "$nin", new BsonArray(updatedFieldsToIgnore) } } } } } };
var pipeline = new[] { match1, match2, match3 };
var cursor = collection.Watch<ChangeStreamDocument<BsonDocument>>(pipeline, options, Profile.CancellationToken);
enumerator = cursor.ToEnumerable().GetEnumerator();
enumerator.MoveNext();
ChangeStreamDocument<BsonDocument> doc = enumerator.Current;
The exception is: "{"Invalid field name: \"tmpfields\"."}"
I suspect the problem might be that I'm getting "replace" and "insert" events which do not contain the updateDescription field, so the $addFields/$objectToArray stages are failing. I'm too new to figure out the syntax, but I think I need to use a filter that does:
{ $match: { "operationType": { $in: ["replace", "insert"] } } }
OR
{ $eq: { "operationType": "update" }} AND { $addFields....}
Also, it appears that the C# driver does not include a Builder that helps with the $addFields and $objectToArray operations. I was only able to build the pipeline variable using the new BsonDocument {...} approach.
ChangedFieldsToIgnore is a List containing the field names that I do not want to get events for.
If you would like to filter based on multiple keys (whether updatedFields contains certain fields), it's easier if you convert the keys to values first.
You can convert the document contained within updatedFields into values by utilising aggregation operator $objectToArray. For example:
pipeline = [{"$addFields": {
"tmpfields":{
"$objectToArray":"$updateDescription.updatedFields"}
}},
{"$match":{"tmpfields.k":{
"$nin":["LastModified", "AnotherUnwantedField"]}}}
];
The above aggregation pipeline adds a temporary field called tmpfields. This new field pivots the content of updateDescription.updatedFields, turning {name:value} into [{k:name, v:value}]. Once we have those keys as values, we can utilise $nin in an array filter.
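For example (with made-up field names and values), if updateDescription.updatedFields is { "LastModified": ISODate("..."), "Status": "Done" }, the $addFields stage would produce a tmpfields array like:

[
  { "k": "LastModified", "v": ISODate("...") },
  { "k": "Status", "v": "Done" }
]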
UPDATED
The reason you're getting an exception about tmpfields being invalid is that the result is cast into the ChangeStreamDocument model, which does not have a recognizable field called tmpfields.
For other operations that do not have the updateDescription.updatedFields field, the value of tmpfields will just be null.
Below is an example of a MongoDB change stream in .NET/C# using MongoDB .NET driver v2.5, along with an aggregation pipeline that modifies the output change stream.
This example is not type safe and returns BsonDocument:
var database = client.GetDatabase("database");
var collection = database.GetCollection<BsonDocument>("collection");
var options = new ChangeStreamOptions { FullDocument = ChangeStreamFullDocumentOption.UpdateLookup };
// Aggregation Pipeline
var addFields = new BsonDocument {
{ "$addFields", new BsonDocument {
{ "tmpfields", new BsonDocument {
{ "$objectToArray",
"$updateDescription.updatedFields" }
} }
} } };
var match = new BsonDocument {
{ "$match", new BsonDocument {
{ "tmpfields.k", new BsonDocument {
{ "$nin", new BsonArray{"LastModified", "Unwanted"} }
} } } } };
var pipeline = new[] { addFields, match };
// ChangeStreams
var cursor = collection.Watch<BsonDocument>(pipeline, options);
foreach (var change in cursor.ToEnumerable())
{
Console.WriteLine(change.ToJson());
}
I wrote the piece of code below as I was having the same issues you were having. No need to mess around with BsonDocument objects ...
//The operationType can be one of the following: insert, update, replace, delete, invalidate
//Ignore the LastRun and IsRunning fields, otherwise we would end up in an endless loop
var pipeline = new EmptyPipelineDefinition<ChangeStreamDocument<ATask>>()
.Match("{ operationType: { $in: [ 'replace', 'update' ] } }")
.Match(#"{ ""updateDescription.updatedFields.LastRun"" : { $exists: false } }")
.Match(#"{ ""updateDescription.updatedFields.IsRunning"" : { $exists: false } }");
var options = new ChangeStreamOptions { FullDocument = ChangeStreamFullDocumentOption.UpdateLookup };
var changeStream = Collection.Watch(pipeline, options);
while (changeStream.MoveNext())
{
var next = changeStream.Current;
foreach (var obj in next)
yield return obj.FullDocument;
}
I am writing a program that takes in an XML file of vehicle reflash data and converts it to JSON so it can be stored in a MongoDB database. The XML starts like this:
<FlashReportGeneratorTag>
<VehicleEntry>
<VehicleStatus>PASSED</VehicleStatus>
</VehicleEntry>
<VehicleEntry>
<VehicleStatus>PASSED</VehicleStatus>
</VehicleEntry>
</FlashReportGeneratorTag>
After I convert it to JSON and add the project identifier, I am left with a format roughly like this:
{
"FlashReportGeneratorAddedTag" : {
"VehicleEntry" : [
{
"VehicleStatus" : "PASSED"
},
{
"VehicleStatus" : "PASSED"
}
]
},
"project_id" : "1234"
}
What I would like to do is get an aggregate count of the number of vehicles that passed and the number that failed within each document for project 1234, but I have had no luck.
I have tried using the basic aggregation skills I know, but I cannot simply group by project_id since that groups by document, when I need to aggregate over an array inside of it. I also haven't found any resources that say whether you can aggregate two values at once (get the sum of passed and the sum of failed counts).
As a very last resort I could change the document structure so that each VehicleEntry is its own document, but I would like to take and store the XML as it is if I can.
EDIT: Using Unwind, I was able to set up an aggregation for the array that I'm looking for:
var aggregate = collection.Aggregate().Match(new BsonDocument { { "project_id", "1234" } }).Unwind(i => i["FlashReportGeneratorAddedTag.VehicleEntry"]);
However, I cannot find the proper way to group these in order to get the pass/fail counts across the array. I assume there is some way I need to use the Match function, but I can't figure out how to do that without excluding one of the two conditions. Do I have to run the aggregation twice, once for passed and once for failed?
Thanks to a hint from JohnnyHK and some more digging, I was able to work this out. First I had to use the Unwind method to unwind the VehicleEntry array in order to aggregate on it:
var aggregate = collection.Aggregate().Match(new BsonDocument { { "project_id", "1234" } })
.Unwind(i => i["FlashReportGeneratorAddedTag.VehicleEntry"])
Once I had that, I was able to nest BsonDocuments in order to sum based on a condition. To get the passed count I used this:
{ "passed", new BsonDocument { { "$sum", new BsonDocument { { "$cond", new BsonArray { new BsonDocument { { "$eq", new BsonArray { "$FlashReportGeneratorAddedTag.VehicleEntry.VehicleStatus", "PASSED" } } }, 1, 0 } } } } } }
Similarly I added a failed count. The whole thing (not yet formatted) looks like this:
var collection = _database.GetCollection<BsonDocument>(Vehicles);
var aggregate = collection.Aggregate()
.Match(new BsonDocument{ { "project_id", "1234" } })
.Unwind(i => i["FlashReportGeneratorAddedTag.VehicleEntry"])
.Group(new BsonDocument
{
{ "_id", "$project_id" },
{ "passed", new BsonDocument
{ { "$sum", new BsonDocument
{ { "$cond", new BsonArray
{ new BsonDocument
{ { "$eq", new BsonArray
{
"$FlashReportGeneratorAddedTag.VehicleEntry.VehicleStatus",
"PASSED"
} }
},
1,
0 } } } } }
},
{ "failed", new BsonDocument
{ { "$sum", new BsonDocument
{ { "$cond", new BsonArray
{ new BsonDocument
{ { "$eq", new BsonArray
{
"$FlashReportGeneratorAddedTag.VehicleEntry.VehicleStatus",
"FAILED"
} }
},
1,
0 } } } } }
},
});
var results = await aggregate.ToListAsync();
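As an aside, I believe the same $group stage can also be passed to Group as a JSON string (the driver converts the string to a projection definition), which is less noisy than nesting BsonDocuments. Treat this untyped sketch as an assumption about the overloads available in your driver version:

var aggregate = collection.Aggregate()
    .Match(new BsonDocument { { "project_id", "1234" } })
    .Unwind(i => i["FlashReportGeneratorAddedTag.VehicleEntry"])
    // Same stage as above, written as a JSON string instead of nested BsonDocuments
    .Group(@"{
        _id: '$project_id',
        passed: { $sum: { $cond: [ { $eq: [ '$FlashReportGeneratorAddedTag.VehicleEntry.VehicleStatus', 'PASSED' ] }, 1, 0 ] } },
        failed: { $sum: { $cond: [ { $eq: [ '$FlashReportGeneratorAddedTag.VehicleEntry.VehicleStatus', 'FAILED' ] }, 1, 0 ] } }
    }");

var results = await aggregate.ToListAsync();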