This code is used with the C# driver to select items from a collection whose location field value is in a set of location id values; I am just providing it as an example:
var locations = new BsonValue[] { 1, 2, 3, 4 };
var data = collection
.Find(Builders<BsonDocument>.Filter.In("LocationId", locations))
.Project(x => Mapper.Map<BsonDocument, ItemViewModel>(x))
.ToListAsync().Result;
Does BsonValue just serve to initialize an array here? Where do I get more information? How do I convert a regular C# list/array into that bson value?
BsonDocument provides a flexible way to represent JSON/BSON in C#. Creating a BsonDocument is similar to creating a JSON object.
Simple document
new BsonDocument("name", "Joe")
creates JSON { "name" : "Joe" }
More complex document
new BsonDocument
{
{"Name", "Joe"},
{
"Books", new BsonArray(new[]
{
new BsonDocument("Name", "Book1"),
new BsonDocument("Name", "Book2")
})
}
}
creates JSON {"Name":"Joe", "Books" : [ { "Name":"Book1" },{ "Name":"Book2" } ]}
Array
new BsonArray(new [] {1, 2, 3})
creates JSON [1,2,3]
Convert C# class to BsonDocument
var product = new Product { Name = "Book", Pages = 3 }.ToBsonDocument();
creates JSON {"Name":"Book","Pages":3}
Implicit conversions make it easy to initialize BsonValue variables:
BsonValue bsonInt = 1;
BsonValue bsonBool = true;
new BsonValue[] { 1, 2, 3, 4 }
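To address the conversion part of the question directly: an ordinary C# list or array can be turned into BSON values with LINQ or the BsonArray constructor. A minimal sketch (assuming the same "LocationId" field as above; the variable names are mine):
using System.Collections.Generic;
using System.Linq;
using MongoDB.Bson;
using MongoDB.Driver;

var locationIds = new List<int> { 1, 2, 3, 4 };

// Option 1: build a BsonValue[] via the implicit int -> BsonValue conversion
var asBsonValues = locationIds.Select(id => (BsonValue)id).ToArray();

// Option 2: wrap the list in a BsonArray directly
var asBsonArray = new BsonArray(locationIds);

// Filter.In also accepts the plain IEnumerable<int>, so the explicit conversion is optional
var filter = Builders<BsonDocument>.Filter.In("LocationId", asBsonValues);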
Related
I am working on a C# app that integrates with a 3rd-party vendor. They've requested that our JSON be formatted as follows (note the "owner" array value):
var good =
{
"id": "0DE570C8-E578-48A9-B22A-F95BC6211B4F",
"make": "Subaru",
"index": "Forester",
"year": "2013",
"owner":
[
{
"First": "Bob",
"Last": "Smith"
}
]
}
In my code, I am unable to get the formatting correct. I can produce valid C#, but it doesn't add "owner" as an array, and any attempt to make it an array (e.g. adding "[]" after the "new") produces an error ("The name 'x' does not exist in the current context"). Here is my attempted C# code:
var car = new
{
id = "0DE570C8-E578-48A9-B22A-F95BC6211B4F",
make = "Subaru",
model = "Forester",
year = "2013",
Owner = new
{
First = "Bob",
Last = "Smith"
}
};
I tried making a "Car" class and tried using an anonymous type.
Your JSON "owner" field is an array, but in your code you assign it as an object, so you need to assign Owner an array like below:
var car = new
{
id = "0DE570C8-E578-48A9-B22A-F95BC6211B4F",
make = "Subaru",
model = "Forester",
year = "2013",
Owner = new[]
{
new { First = "Bob", Last = "Smith" }
}
};
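As a quick sanity check (assuming Json.NET, which the question doesn't name), serializing the anonymous object now emits owner as a JSON array; the property-name casing follows the C# names, so rename Owner to owner (or configure the serializer) if the vendor is strict about case:
using System;
using Newtonsoft.Json;

var json = JsonConvert.SerializeObject(car, Formatting.Indented);
Console.WriteLine(json);
// prints "Owner": [ { "First": "Bob", "Last": "Smith" } ] alongside the other properties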
I was using the Equals(obj) method, but it doesn't work correctly because the array elements in the database are not ordered; if the order is different, the element is not considered a match.
I tried that solution in two ways:
var filteredObj = objects.Find(obj => obj.Elements.OrderBy(x => x).Equals(elements)).FirstOrDefault();
var filteredObj = objects.Find(obj => obj.Elements.Equals(elements.OrderBy(x => x))).FirstOrDefault();
But in both of them I get the following exception:
Unsupported filter: {document}{Elements}.Equals(value(System.Collections.Generic.List[System.Int32]).OrderBy(x => x)).
It seems that the MongoDB driver doesn't support that kind of thing.
The method responsible for that:
public string GetIdByElements(List<int> elements)
{
var filteredObj = objects.Find(obj => obj.Elements.Equals(elements)).FirstOrDefault();
if (filteredObj is null)
throw new ObjectNotFoundException("Object not found");
return filteredObj.Id.ToString();
}
You can use this Find query with the $setEquals operator as follows, assuming the MongoDB collection's array field is named arrayField:
var elements = new List<int>() { 1, 900, 33 }; // input list
var bsonElements = new BsonArray(elements);
var query =
new BsonDocument("$expr",
new BsonDocument("$setEquals",
new BsonArray {
"$arrayField",
bsonElements
})
);
var doc = collection.Find(query).FirstOrDefault();
Console.WriteLine(doc); // doc is of type BsonDocument
A sample collection's document with an array field: { "arrayField" : [ 1, 33, 900 ] }.
The mongo shell query to match the array elements using the $setEquals operator (the order of the elements does not matter):
db.collection.find( { $expr: { $setEquals: [ "$arrayField", [ 1, 900, 33 ] ] } } )
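If GetIdByElements should keep returning data from the typed collection rather than a BsonDocument, the same $expr/$setEquals document can be passed straight to Find, since a BsonDocument converts implicitly to a filter definition. A sketch, assuming objects is the typed collection from the question and its array is stored under the field name "Elements":
public string GetIdByElements(List<int> elements)
{
    // order-insensitive comparison is done server-side by $setEquals
    var query = new BsonDocument("$expr",
        new BsonDocument("$setEquals",
            new BsonArray { "$Elements", new BsonArray(elements) }));

    var filteredObj = objects.Find(query).FirstOrDefault();
    if (filteredObj is null)
        throw new ObjectNotFoundException("Object not found");
    return filteredObj.Id.ToString();
}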
I am trying to upsert documents within an array of documents via the C# Driver for MongoDB. I manage to modify existing array elements via $set & arrayFilters, but struggle to add non-existing elements via $addToSet.
I would be glad about any suggestion, even if there is a completely different way.
My simplified class in C#
internal class TimeSeries
{
[BsonId]
internal string Name;
[BsonDictionaryOptions(DictionaryRepresentation.ArrayOfDocuments)]
internal Dictionary<DateTime, double> Container;
}
// add a test document
void foo()
{
var coll = _myDatabase.GetCollection<TimeSeries>("myColl");
var res = coll.InsertOne(new TimeSeries()
{ Name = "abc", Container = new Dictionary<DateTime, double>()
{{new DateTime(2000,1,1),20}}
});
}
$set and $addToSet in the Mongo Shell work fine:
// modify the existing value to 30
db.myColl.update( {"_id":"abc"}, {$set: {"Container.$[loc].v":30}}, {arrayFilters:[{"loc.k":new Date("2000-01-01")}]})
// add if no existent
db.myColl.update( {"_id":"abc"}, {$addToSet: {"Container": {"k":new Date("2000-02-01"),"v":200}}})
In C# I can reproduce $set, but get a "Specified cast is not valid" error for $addToSet.
var filter = Builders<TimeSeries>.Filter.Eq("_id", "abc");
var arrayFilters = new List<ArrayFilterDefinition<BsonDocument>>()
{new BsonDocument("loc.k", new DateTime(2000,1,1))};
// $set
var upsert = Builders<TimeSeries>.Update.Set("Container.$[loc].v", 30);
var resUpt = coll.UpdateOne(filter, upsert, new UpdateOptions { ArrayFilters = arrayFilters });
// $addToSet
var upsert_add = Builders<TimeSeries>.Update.AddToSet("Container", new BsonDocument { { "k", new DateTime(2000, 2, 1) }, { "v", 50} });
var res_add = coll.UpdateOne(filter, upsert_add); // Specified cast is not valid
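One possible workaround (untested, and an assumption on my part) is to hand AddToSet a typed KeyValuePair instead of a raw BsonDocument, so the item serializer the driver resolves for the ArrayOfDocuments dictionary is not asked to cast a BsonDocument:
// Hypothetical workaround, not verified against the question's driver version
var upsert_add = Builders<TimeSeries>.Update.AddToSet(
    "Container",
    new KeyValuePair<DateTime, double>(new DateTime(2000, 2, 1), 50));
var res_add = coll.UpdateOne(filter, upsert_add);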
I have a lot of different collections of values that I generate at runtime and want to send to Elasticsearch. I can represent them as List<object> or, if it really doesn't work any other way, as List<string>. But I can't find any example of how to do that. Here is an example of code which doesn't work. There is probably a lot wrong with it, so any additional pointers are highly appreciated.
var client = new ElasticClient(new Uri("http://localhost:9200"));
client.CreateIndex("testentry");
var values = new List<object> {"StringValue", 123, DateTime.Now};
var indexResponse = client.Index(values, descriptor => descriptor.Index("testentry"));
Console.WriteLine(indexResponse.DebugInformation);
Which results in:
Invalid NEST response built from a unsuccessful low level call on POST: /testentry/list%601
# Audit trail of this API call:
- [1] BadResponse: Node: http://localhost:9200/ Took: 00:00:00.0600035
# ServerError: ServerError: 400Type: mapper_parsing_exception Reason: "failed to parse" CausedBy: "Type: not_x_content_exception Reason: "Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes""
and
[2016-09-17 14:16:20,955][DEBUG][action.index ] [Gin Genie] failed to execute [index {[t
estentry][list`1][AVc4E3HaPglqpoLcosDo], source[_na_]}] on [[testentry][1]]
MapperParsingException[failed to parse]; nested: NotXContentException[Compressor detection can only
be called on some xcontent bytes or compressed xcontent bytes];
at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:156)
I'm using Elasticsearch.Net 2.4.3 and NEST 2.4.3.
In addition to Henrik's answer, you could also index values in a Dictionary<string, object>
public class MyType
{
public MyType()
{
Values = new Dictionary<string, object>();
}
public Dictionary<string, object> Values { get; private set; }
}
void Main()
{
var pool = new SingleNodeConnectionPool(new Uri("http://localhost:9200"));
var connectionSettings = new ConnectionSettings(pool);
var client = new ElasticClient(connectionSettings);
var myType = new MyType
{
Values =
{
{ "value1", "StringValue" },
{ "value2", 123 },
{ "value3", DateTime.Now },
}
};
client.Index(myType, i => i.Index("index-name"));
}
The Dictionary<string, object> will be serialized to a JSON object with property names matching the dictionary keys:
{
"values": {
"value1": "StringValue",
"value2": 123,
"value3": "2016-09-18T18:41:48.7344837+10:00"
}
}
Within Elasticsearch, the mapping will be inferred as an object type.
Arrays with a mixture of datatypes are not supported.
You could convert all of the values to strings:
client.CreateIndex("testentry");
var values = new List<string> { "StringValue", "123", DateTime.Now.ToString() };
var indexResponse = client.Index(new { Values = values}, descriptor => descriptor.Index("testentry").Type("test"));
Or specify the fields that the values should be indexed to:
client.CreateIndex("testentry");
var values = new { Field1 = "StringValue", Field2 = 123, Field3 = DateTime.Now };
var indexResponse = client.Index(values, descriptor => descriptor.Index("testentry").Type("test"));
Consider specifying the type of the document with the IndexDescriptor, or creating a class for the document.
Can I retrieve basic information about all collections in a MongoDB with F#?
I have a MongoDB with > 450 collections. I can access the db with
open MongoDB.Bson
open MongoDB.Driver
open MongoDB.Driver.Core
open MongoDB.FSharp
open System.Collections.Generic
let connectionString = "mystring"
let client = new MongoClient(connectionString)
let db = client.GetDatabase(name = "Production")
I had considered trying to just get all collections then loop through each collection name and get basic information about each collection with
let collections = db.ListCollections()
and
db.GetCollection([name of a collection])
but the db.GetCollection([name]) requires me to define a type to pull the information about each collection. This is challenging for me as I don't want to have to define a type for each collection, of which there are > 450, and frankly, I don't really know much about this DB. (Actually, no one in my org does; that's why I'm trying to put together a very basic data dictionary.)
Is defining the type for each collection really necessary? Can I use the MongoCollection methods available here without having to define a type for each collection?
EDIT: Ultimately, I'd like to be able to output collection name, the n documents in each collection, a list of the field names in each collection, and a list of each field type.
I chose to write my examples in C# as I'm more familiar with the C# driver and it is a listed tag on the question. You can run an aggregation against each collection to find all top-level fields and their (MongoDB) types for each document.
The aggregation is done in 3 steps. Let's assume the input is 10 documents which all have this form:
{
"_id": ObjectId("myId"),
"num": 1,
"str": "Hello, world!"
}
1. $project: convert each document into an array of documents with values fieldName and fieldType. Outputs 10 documents, each with a single array field; the array field will have 3 elements.
2. $unwind the arrays of field infos. Outputs 30 documents, each with a single field corresponding to an element from the output of step 1.
3. $group the fields by fieldName and fieldType to get distinct values. Outputs 3 documents. Since all fields with the same name always have the same type in this example, there is only one final output document per field. If two different documents defined the same field, one as a string and one as an int, there would be separate entries in this result set for both.
// Define our aggregation steps.
// Step 1, $project:
var project = new BsonDocument
{ {
"$project", new BsonDocument
{
{
"_id", 0
},
{
"fields", new BsonDocument
{ {
"$map", new BsonDocument
{
{ "input", new BsonDocument { { "$objectToArray", "$$ROOT" } } },
{ "in", new BsonDocument {
{ "fieldName", "$$this.k" },
{ "fieldType", new BsonDocument { { "$type", "$$this.v" } } }
} }
}
} }
}
}
} };
// Step 2, $unwind
var unwind = new BsonDocument
{ {
"$unwind", "$fields"
} };
// Step 3, $group
var group = new BsonDocument
{
{
"$group", new BsonDocument
{
{
"_id", new BsonDocument
{
{ "fieldName", "$fields.fieldName" },
{ "fieldType", "$fields.fieldType" }
}
}
}
}
};
// Connect to our database
var client = new MongoClient("myConnectionString");
var db = client.GetDatabase("myDatabase");
var collections = db.ListCollections().ToEnumerable();
/*
We will store the results in a dictionary of collections.
Since the same field can have multiple types associated with it the inner value corresponding to each field is `List<string>`.
The outer dictionary keys are collection names. The inner dictionary keys are field names.
The inner dictionary values are the types for the provided inner dictionary's key (field name).
List<string> fieldTypes = allCollectionFieldTypes[collectionName][fieldName]
*/
Dictionary<string, Dictionary<string, List<string>>> allCollectionFieldTypes = new Dictionary<string, Dictionary<string, List<string>>>();
foreach (var collInfo in collections)
{
var collName = collInfo["name"].AsString;
var coll = db.GetCollection<BsonDocument>(collName);
Console.WriteLine("Finding field information for " + collName);
var pipeline = PipelineDefinition<BsonDocument, BsonDocument>.Create(project, unwind, group);
var cursor = coll.Aggregate(pipeline);
var lst = cursor.ToList();
allCollectionFieldTypes.Add(collName, new Dictionary<string, List<string>>());
foreach (var item in lst)
{
var innerDict = allCollectionFieldTypes[collName];
var fieldName = item["_id"]["fieldName"].AsString;
var fieldType = item["_id"]["fieldType"].AsString;
if (!innerDict.ContainsKey(fieldName))
{
innerDict.Add(fieldName, new List<string>());
}
innerDict[fieldName].Add(fieldType);
}
}
Now you can iterate over your result set:
foreach(var collKvp in allCollectionFieldTypes)
{
foreach(var fieldKvp in collKvp.Value)
{
foreach(var fieldType in fieldKvp.Value)
{
Console.WriteLine($"Collection {collKvp.Key} has field name {fieldKvp.Key} with type {fieldType}");
}
}
}
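The EDIT also asks for the number of documents in each collection, which the aggregation above does not report. A small addition (assuming a driver version that exposes CountDocuments, 2.7 or later; older versions use Count instead):
// rough document counts per collection, reusing the collection names gathered above
foreach (var collKvp in allCollectionFieldTypes)
{
    var count = db.GetCollection<BsonDocument>(collKvp.Key)
                  .CountDocuments(FilterDefinition<BsonDocument>.Empty);
    Console.WriteLine($"Collection {collKvp.Key} contains {count} documents");
}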