We have a problem with our indexes. We have a unique index on our emails, but inserts throw errors like this:
> db.User.insert({email: "hell33o#gmail.com", "_id" : BinData(3,"iKyq6FvBCdd54TdxxX0JhA==")})
WriteResult({
"nInserted" : 0,
"writeError" : {
"code" : 11000,
"errmsg" : "E11000 duplicate key error index: placetobe.User.$email_text dup key: { : \"com\", : 0.6666666666666666 }"
}
})
The index is created with our C# driver like this:
CreateIndexOptions options = new CreateIndexOptions {Unique = true};
_collection.Indexes.CreateOneAsync(Builders<User>.IndexKeys.Text(_ => _.email), options);
resulted in
{
"v" : 1,
"unique" : true,
"key" : {
"_fts" : "text",
"_ftsx" : 1
},
"name" : "email_text",
"ns" : "placetobe.User",
"weights" : {
"email" : 1
},
"default_language" : "english",
"language_override" : "language",
"textIndexVersion" : 2
}
But if we create it from the MongoDB shell like this, it works:
{
"v" : 1,
"unique" : true,
"key" : {
"email" : 1
},
"name" : "email_1",
"ns" : "placetobe.User"
}
I don't understand the difference between the two indexes, but they behave differently. We also have problems with a collection that stores names: we get duplicate-key exceptions on "Molly" if we try to insert "Molli". With the emails, it seems to give us errors whenever we have two "gmail" emails in the collection, or two ".com" emails, and so on.
This is a university project and we have to turn it in tomorrow. We're really in trouble; any help would be much appreciated.
You don't want your email to be a text index. Text indexes let you search large amounts of text in MongoDB, for example when searching through comments. All you want is to make sure your emails aren't duplicated, so you should use a regular ascending (or descending) index.
CreateIndexOptions options = new CreateIndexOptions {Unique = true};
_collection.Indexes.CreateOneAsync(Builders<User>.IndexKeys.Ascending(_ => _.email), options);
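To see why the text index complains about duplicates: a text index tokenizes the string and indexes each term separately, so with unique: true two emails that share any term (like "gmail" or "com") collide. A rough sketch of the idea in JavaScript (a crude approximation for illustration, not MongoDB's actual tokenizer):

```javascript
// Crude approximation of text-index tokenization: split the value into terms.
// With { unique: true }, two documents sharing ANY term collide.
function textTerms(value) {
  return [...new Set(value.toLowerCase().split(/[^a-z0-9]+/).filter(Boolean))];
}

const a = textTerms("hello@gmail.com"); // ["hello", "gmail", "com"]
const b = textTerms("world@gmail.com"); // ["world", "gmail", "com"]
const shared = a.filter(t => b.includes(t));
console.log(shared); // ["gmail", "com"] -- the terms the unique text index trips on
```

An ascending index instead stores the whole email as a single key, so only exact duplicates conflict.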
I'm new to C# with Mongo; I earlier worked with Node and Mongo.
I have a collection called tasks. Below is a sample record:
{
"_id" : ObjectId("6193bfba23855443a127466a"),
"taskIdentifier" : LUUID("00000000-0000-0000-0000-000000000000"),
"title" : "PR Liquidators",
"company" : "iuytreugdfh",
"purpose" : "test purpose",
"column" : "Search",
"assignTo" : "Shiva",
"assignToId" : ObjectId("61933b47a79ac615648a7855"),
"assignToImage" : null,
"notes" : "ggh#William james ",
"done" : 0,
"taskID" : "00029",
"status" : "Pending",
"states" : [
"Alabama - AL",
"Alaska - AK"
],
"active" : true,
"updatedAtUtc" : ISODate("2021-11-18T12:26:37.616Z"),
"updatedBy" : ""
}
In my C# Web API project I always get an array called filterCriteria from the API request, in the form below:
filterCriteria=[
{key:"purpose",value:"test purpose",type:"eq"},
{key:"active",value:true,type:"eq"}
]
Now I want to query the tasks collection using the given filterCriteria.
I tried something with LINQ statements, but with no luck: hardcoded filters work, but building them dynamically does not.
How can I achieve this?
Maybe you are looking for Builders:
public enum FilterType {
    eq = 1, // equal
    gt = 2  // greater than
}

// ************
var builder = Builders<FilterCriteriaModel>.Filter;
var query = builder.Empty;
foreach (var filterCriteriaItem in filterCriteria) {
    switch (filterCriteriaItem.Type) {
        case FilterType.eq:
            query &= builder.Eq(filterCriteriaItem.Key, filterCriteriaItem.Value);
            break;
        case FilterType.gt:
            query &= builder.Gt(filterCriteriaItem.Key, filterCriteriaItem.Value);
            break;
        // ... remaining cases
    }
}
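For a driver-agnostic sanity check, the same fold can be sketched as building a plain MongoDB query document, mapping each criterion type to its query operator (opMap and buildQuery are hypothetical helper names, not driver API):

```javascript
// Map criterion types to MongoDB query operators.
const opMap = { eq: "$eq", gt: "$gt", lt: "$lt" };

// Fold a filterCriteria array into one query document.
function buildQuery(filterCriteria) {
  const query = {};
  for (const { key, value, type } of filterCriteria) {
    query[key] = { [opMap[type]]: value };
  }
  return query;
}

const q = buildQuery([
  { key: "purpose", value: "test purpose", type: "eq" },
  { key: "active", value: true, type: "eq" },
]);
console.log(JSON.stringify(q));
// {"purpose":{"$eq":"test purpose"},"active":{"$eq":true}}
```

The C# Builders loop above does the same thing, except the driver composes typed FilterDefinitions instead of a raw document.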
Let's say we have a collection of documents like this one:
{
"_id" : ObjectId("591c54faf1c1f419a830b9cf"),
"fingerprint" : "3121733676",
"screewidth" : "1920",
"carts" : [
{
"cartid" : 391796,
"status" : "New",
"cart_created" : ISODate("2017-05-17T13:50:37.388Z"),
"closed" : false,
"items" : [
{
"brandid" : "PIR",
"cai" : "2259700"
}
],
"updatedon" : ISODate("2017-05-17T13:51:24.252Z")
},
{
"cartid" : 422907,
"status" : "New",
"cart_created" : ISODate("2017-10-23T08:57:06.846Z"),
"closed" : false,
"items" : [
{
"brandid" : "PIR",
"cai" : "IrHlNdGtLfBoTlKsJaRySnM195U"
}
],
"updatedon" : ISODate("2017-10-23T09:46:08.579Z")
}
],
"createdon" : ISODate("2016-11-08T10:29:55.120Z"),
"updatedon" : ISODate("2017-10-23T09:46:29.486Z")
}
How do you extract only the documents where no item in the $.carts array has $.carts.closed set to true and $.carts.updatedon greater than $.updatedon minus 3 days?
I know how to find all the documents where no item in the array satisfies the condition $and: [{closed: {$eq: true}}, {updatedon: {$gt: new ISODate("2017-10-20T20:15:31Z")}}]
But how can you reference the parent element $.updatedon for the comparison?
An answer in plain MongoDB shell query language would already be of help.
But I am actually accessing it using c# driver, so my query filter is like this:
FilterDefinition<_visitorData> filter;
filter = Builders<_visitorData>.Filter
.Gte(f => f.updatedon, DateTime.Now.AddDays(-15));
filter = filter & (
Builders<_visitorData>.Filter
.Exists(f => f.carts, false)
| !Builders<_visitorData>.Filter.ElemMatch(f =>
f.carts, c => c.closed && c.updatedon > DateTime.Now.AddDays(-15)
)
);
How can I replace DateTime.Now.AddDays(-15) with a reference to the document root element updatedon?
You can project the difference of carts.updatedon and updatedon and then filter the results in this aggregation pipeline:
coll.aggregate([
    {'$unwind': '$carts'},
    {'$match': {'carts.closed': {'$ne': true}}},
    {'$project': {
        'carts.cartid': 1, 'carts.status': 1, 'carts.cart_created': 1,
        'carts.closed': 1, 'carts.items': 1, 'carts.updatedon': 1,
        'updatedon': 1,
        'diff': {'$subtract': ['$carts.updatedon', '$updatedon']}
    }},
    {'$match': {'diff': {'$gte': 1000 * 60 * 60 * 24 * days}}}
])
With days = 3, this filters out documents whose difference is more than 3 days.
This is just an example of how you can use $subtract to compute a date difference and filter documents based on it.
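To sanity-check the date arithmetic, the question's condition can also be expressed client-side in plain JavaScript (a sketch over in-memory documents, not the server-side pipeline; keepDocument is a hypothetical helper):

```javascript
const DAY_MS = 1000 * 60 * 60 * 24;

// Keep a document when no cart is both closed and updated within `days`
// days before the document-level updatedon (the condition in the question).
function keepDocument(doc, days) {
  const cutoff = doc.updatedon.getTime() - days * DAY_MS;
  return !(doc.carts || []).some(c =>
    c.closed === true && c.updatedon.getTime() > cutoff);
}

const doc = {
  updatedon: new Date("2017-10-23T09:46:29Z"),
  carts: [
    { closed: false, updatedon: new Date("2017-10-23T09:46:08Z") },
    { closed: true,  updatedon: new Date("2017-05-17T13:51:24Z") },
  ],
};
console.log(keepDocument(doc, 3)); // true: the only closed cart is months old
```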
I was in a similar situation a few days back.
I tackled it by using JObject from Newtonsoft.Json.
Create a function that takes each document as input and returns a bool:
JObject jOb = JObject.Parse(<your document string>);
JArray jAr = (JArray)jOb["carts"];
if (jOb["updatedon"] == <your business validation>)
{
    foreach (var item in jAr)
        if (item["closed"] == <your validation>) { return true; }
}
return false;
I hope this helps :)
If you are handling null values in those properties, please use TryParse with an out variable.
Since MongoDB provides the flexibility to store unstructured data, is there any way in the MongoDB C# driver to find the distinct field names across a collection?
What I mean is, given:
{
"_id" : ObjectId("52fb69ff1ecf0322f0ab3129"),
"Serial Number" : "1",
"Name" : "Sameer Singh Rathoud",
"Skill" : "C++",
"City" : "Pune",
"Country" : "India"
}
{
"_id" : ObjectId("52fb69ff1ecf0322f0ab312a"),
"Serial Number" : "2",
"Name" : "Prashant Patil",
"DOB" : "31/07/1978",
"Location" : "Hinjewadi",
"State" : "Maharashtra",
"Country" : "India"
}
I want to get [_id, Serial Number, Name, DOB, Skill, City, State, Country]
I also faced this issue. If you still haven't found a proper solution, or for anyone new searching for an answer to this kind of question, you can use this:
var keys = [];
db.Entity.find().forEach(function(doc){
for (var key in doc){
if(keys.indexOf(key) < 0){
keys.push(key);
}
}
});
print(keys);
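The same key-collection logic can be run anywhere over in-memory objects, which is handy for testing the idea without a server (distinctKeys is a hypothetical helper; the sample documents are abbreviated from the question):

```javascript
// Collect the union of field names across documents, in first-seen order,
// mirroring the shell loop above.
function distinctKeys(docs) {
  const keys = new Set();
  for (const doc of docs) {
    for (const key of Object.keys(doc)) keys.add(key);
  }
  return [...keys];
}

const docs = [
  { _id: 1, "Serial Number": "1", Name: "Sameer Singh Rathoud",
    Skill: "C++", City: "Pune", Country: "India" },
  { _id: 2, "Serial Number": "2", Name: "Prashant Patil", DOB: "31/07/1978",
    Location: "Hinjewadi", State: "Maharashtra", Country: "India" },
];
console.log(distinctKeys(docs));
// ["_id", "Serial Number", "Name", "Skill", "City", "Country", "DOB", "Location", "State"]
```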
I'm looking for the most efficient way of performing summing queries against MongoDB.
Currently we insert documents that contain various information and a date time stamp of when the document was created.
We need to sum this data to be viewed in the following ways:
Documents by hour of the day 1-24
Documents by day of the month 1-28/31
Documents by month of the year 1-12
Documents by year
This summed data will be accessed often, and we're afraid that summing this massive amount of data in Mongo on every request will cause problems.
We thought that when a document is inserted into Mongo we could maintain another document containing these counts and increment it at insertion time. That way we can quickly pull the counts without summing the data on each request. Our concern is that this may not be the most efficient way to perform this type of operation in Mongo.
Any thoughts on the best way to accomplish this? My dev team and I are new to MongoDB, and we want to make sure we don't fall into a performance trap when summing large sets of data.
The Aggregation Framework is perfectly suited for this type of query.
I've done some examples for you below.
To start, let's populate some documents:
db.myDocumentCollection.insert({"date" : new Date('01/01/2012'), "topic" : "My Title 1"});
db.myDocumentCollection.insert({"date" : new Date('01/02/2012'), "topic" : "My Title 2"});
db.myDocumentCollection.insert({"date" : new Date('01/02/2012'), "topic" : "My Title 3"});
db.myDocumentCollection.insert({"date" : new Date('01/02/2012'), "topic" : "My Title 4"});
db.myDocumentCollection.insert({"date" : new Date('01/04/2012'), "topic" : "My Title 5"});
db.myDocumentCollection.insert({"date" : new Date('01/05/2012'), "topic" : "My Title 6"});
db.myDocumentCollection.insert({"date" : new Date('01/07/2013'), "topic" : "My Title 7"});
db.myDocumentCollection.insert({"date" : new Date('01/07/2013'), "topic" : "My Title 8"});
db.myDocumentCollection.insert({"date" : new Date('02/07/2013'), "topic" : "My Title 9"});
db.myDocumentCollection.insert({"date" : new Date('02/08/2013'), "topic" : "My Title 10"});
Return number of documents grouped by full date
db.myDocumentCollection.group(
{
$keyf : function(doc) {
return { "date" : doc.date.getDate()+"/"+doc.date.getMonth()+"/"+doc.date.getFullYear() };
},
initial: {count:0},
reduce: function(obj, prev) { prev.count++; }
})
Output
[
{
"date" : "1/0/2012",
"count" : 1
},
{
"date" : "2/0/2012",
"count" : 3
},
{
"date" : "4/0/2012",
"count" : 1
},
{
"date" : "5/0/2012",
"count" : 1
},
{
"date" : "7/0/2013",
"count" : 2
},
{
"date" : "7/1/2013",
"count" : 1
},
{
"date" : "8/1/2013",
"count" : 1
}
]
Return number of documents grouped by day of month for the year 2013
This is perhaps a little more relevant for the kinds of queries you want to do.
Here, we use cond so that only documents dated on or after 1/1/2013 are grouped.
You could use $gte and $lte to do date ranges here.
db.myDocumentCollection.group(
{
$keyf : function(doc) {
return { "date" : doc.date.getDate()+"/"+doc.date.getMonth()};
},
cond: {"date" : {"$gte": new Date('01/01/2013')}},
initial: {count:0},
reduce: function(obj, prev) { prev.count++; }
})
Output
[
{
"date" : "7/0",
"count" : 2
},
{
"date" : "7/1",
"count" : 1
},
{
"date" : "8/1",
"count" : 1
}
]
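The $keyf/reduce pair above is just a count grouped by a date key; the same logic in plain JavaScript (runnable without a server, with a hypothetical countByDate helper) looks like:

```javascript
// Count documents grouped by day/month/year, mirroring the $keyf + reduce
// logic above. Note getMonth() is zero-based, hence "1/0/2012" for Jan 1.
function countByDate(docs) {
  const counts = {};
  for (const { date } of docs) {
    const key = date.getDate() + "/" + date.getMonth() + "/" + date.getFullYear();
    counts[key] = (counts[key] || 0) + 1;
  }
  return counts;
}

const docs = [
  { date: new Date(2012, 0, 1) },
  { date: new Date(2012, 0, 2) },
  { date: new Date(2012, 0, 2) },
];
console.log(countByDate(docs)); // { "1/0/2012": 1, "2/0/2012": 2 }
```

Grouping by hour, day of month, month, or year is the same pattern with a different key function.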
I have been battling strange index-creation behavior for hours. I am trying to rebuild my sample data, so I drop my collection before inserting new data, and before inserting I create the indexes again, like this:
db.GetCollection("Posts").EnsureIndex("Name","Title","Owner");
After that I try to execute a sorted query, but MongoDB throws an exception saying:
QueryFailure flag was too much data for sort() with no index. add an index or specify a smaller limit
But if I put the line db.GetCollection("Post").EnsureIndex("Name"); before executing the query, it works without problems. Then I realized that it also works if I use it before rebuilding the data. There seems to be a bug in the overloaded method, or something I have missed.
I am using the 10gen .NET driver version 1.2, and I checked which indexes exist before executing the query. Here it is:
db.GetCollection("Posts").EnsureIndex("Name","Title","Owner");
db.GetIndexes();//result
[0]: { "v" : 1, "key" : { "_id" : 1 }, "ns" : "Posts", "name" : "_id_" }
[1]: { "v" : 1, "key" : { "Name" : 1, "Title" : 1, "Owner" : 1 }, "ns" : "Posts", "name" : "Name_1_Title_1_Owner_1_" }
db.GetCollection("Posts").EnsureIndex("Title") // i call this for other indexes too
db.GetIndexes();
[0]: { "v" : 1, "key" : { "_id" : 1 }, "ns" : "Posts", "name" : "_id_" }
[1]: { "v" : 1, "key" : { "Name" : 1 }, "ns" : "Posts", "name" : "Name_1" }
[2]: { "v" : 1, "key" : { "Title" : 1 }, "ns" : "Posts", "name" : "Title_1" }
[4]: { "v" : 1, "key" : { "Owner" : 1 }, "ns" : "Posts", "name" : "Owner_1" }
I can't tell from your example exactly what you think isn't working.
One thing to keep in mind is that EnsureIndex only knows what's going on within your own process. So if you remove an index or drop a collection using the mongo shell EnsureIndex won't pick up on that. You can use CreateIndex instead of EnsureIndex if you want to make sure the index exists regardless of what other processes might have done in the meantime.
Let me know if you can provide more details on how to reproduce what you are seeing.
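To illustrate the difference described above: EnsureIndex keeps a per-process cache of index names it has already ensured, so a drop done from another process (or the shell) is invisible to it. A sketch of that caching behaviour (hypothetical, not the actual driver internals):

```javascript
// Hypothetical sketch of EnsureIndex's per-process cache.
const ensured = new Set();           // names this process believes exist

function ensureIndex(name, createOnServer) {
  if (ensured.has(name)) return false;  // skipped: cache says it exists
  createOnServer(name);
  ensured.add(name);
  return true;
}

const serverIndexes = new Set();      // stand-in for the server's index list
const create = n => serverIndexes.add(n);

ensureIndex("Name_1", create);        // created on the "server"
serverIndexes.delete("Name_1");       // dropped from another process/shell
ensureIndex("Name_1", create);        // NOT recreated: the cache is stale
console.log(serverIndexes.has("Name_1")); // false
```

CreateIndex has no such cache, which is why it is the safer call after dropping collections externally.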