I have just started using MongoDB to handle bulk data for my new project. I set up the database and installed the C# driver for MongoDB, and here is what I tried:
public IHttpActionResult insertSample()
{
var client = new MongoClient("mongodb://localhost:27017");
var database = client.GetDatabase("reznext");
var collection = database.GetCollection<BsonDocument>("sampledata");
List<BsonDocument> batch = new List<BsonDocument>();
for (int i = 0; i < 300000; i++)
{
batch.Add(
new BsonDocument {
{ "field1", 1 },
{ "field2", 2 },
{ "field3", 3 },
{ "field4", 4 }
});
}
collection.InsertManyAsync(batch);
return Json("OK");
}
But when I check the collection for documents, I see only 42k out of the 0.3 million records inserted. I use Robomongo as a client and would like to know what is wrong here. Is there any insertion limit per operation?
You call an async method and don't wait for the result. Either wait for it:
collection.InsertManyAsync(batch).Wait();
Or use the synchronous call:
collection.InsertMany(batch);
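Putting it together, here is a sketch of the action from the question rewritten to await the insert (assuming an ASP.NET Web API controller, as in the post):
public async Task<IHttpActionResult> InsertSample()
{
    var client = new MongoClient("mongodb://localhost:27017");
    var database = client.GetDatabase("reznext");
    var collection = database.GetCollection<BsonDocument>("sampledata");
    var batch = new List<BsonDocument>();
    for (int i = 0; i < 300000; i++)
    {
        batch.Add(new BsonDocument
        {
            { "field1", 1 },
            { "field2", 2 },
            { "field3", 3 },
            { "field4", 4 }
        });
    }
    // Awaiting ensures all documents are written before the request returns.
    await collection.InsertManyAsync(batch);
    return Json("OK");
}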
Related
In the code below, InsertMany successfully inserts the documents, but is there any way to get the count/number of documents inserted successfully by the InsertMany operation?
var mongo = new MongoClient("mongodb://10.44.4.59");
MongoServer server = mongo.GetServer();
MongoDatabase test = server.GetDatabase("server_info");
var category = test.GetCollection("test_collection");
List<BsonDocument> batch = new List<BsonDocument>();
for (int i = 0; i < 300000; i++)
{
batch.Add(
new BsonDocument {
{ "field1", 1 },
{ "field2", 2 },
{ "field3", 3 },
{ "field4", 4 }
});
}
category.InsertMany(batch);
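(Not an answer from the original thread, just a sketch of one way to check the count, assuming the 2.x driver used in the first question: InsertMany is ordered and throws on the first failure, so comparing the collection count before and after the call gives the number actually inserted.)
var before = collection.CountDocuments(new BsonDocument());
collection.InsertMany(batch);   // throws MongoBulkWriteException if any insert fails
var after = collection.CountDocuments(new BsonDocument());
Console.WriteLine($"Inserted {after - before} documents");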
I'm trying to create a bulk delete job in Dynamics 365 using the Web API. As a reference I've used the following web pages:
https://learn.microsoft.com/en-us/dynamics365/customer-engagement/web-api/bulkdelete?view=dynamics-ce-odata-9
Unable to call the BulkDelete action from Microsoft Dynamics CRM WebAPI
I'm using api-version 9.1.
I've gotten most of it to work and have removed quite a few validation errors, so I know I'm on the right track. However, now I get the following error message: "The Entity bookableresourcebooking does not support Synchronous Bulk Delete".
When I try to create the same bulk delete job manually in Dynamics, no errors occur.
Can anyone help me resolve this error?
The relevant code I'm using:
var relativeUrl = "BulkDelete()";
var bulkDelete = new BulkDeleteRequest("Delete all future bookings");
var querySet = new QuerySet();
querySet.EntityName = "bookableresourcebooking";
querySet.Distinct = false;
var conditionStarttimeGreaterEqualToday = new Condition();
conditionStarttimeGreaterEqualToday.AttributeName = "starttime";
conditionStarttimeGreaterEqualToday.Operator = "OnOrAfter";
conditionStarttimeGreaterEqualToday.Values = new List<ValueClass>();
conditionStarttimeGreaterEqualToday.Values.Add(new ValueClass(new DateTime(DateTime.Now.Year, DateTime.Now.Month, DateTime.Now.Day).ToUniversalTime().ToString("o"), "System.DateTime"));
var conditionVoltooidOpEmpty = new Condition();
conditionVoltooidOpEmpty.AttributeName = "new_voltooidop";
conditionVoltooidOpEmpty.Operator = "Null";
conditionVoltooidOpEmpty.Values = new List<ValueClass>();
querySet.Criteria = new Criteria();
querySet.Criteria.FilterOperator = "And";
querySet.Criteria.Conditions.Add(conditionStarttimeGreaterEqualToday);
querySet.Criteria.Conditions.Add(conditionVoltooidOpEmpty);
bulkDelete.QuerySet.Add(querySet);
await _crmClient.PostCRMData(relativeUrl, JsonConvert.SerializeObject(bulkDelete)); //Dependency injected httpclient.
Extra info:
bookableresourcebooking is a standard entity that comes with Field Service
new_voltooidop is a custom datetime field I've added to this entity
We have to make this job asynchronous, and for that we have to pass RunNow: false. Then this error will disappear.
I have tested this working code in CRM REST Builder, but it is the JS equivalent.
var parameters = {};
var queryset1 = {
EntityName: "account",
ColumnSet: {
AllColumns: true
},
Distinct: false,
};
queryset1["#odata.type"] = "Microsoft.Dynamics.CRM.QueryExpression";
parameters.QuerySet = [queryset1];
parameters.JobName = "arun test";
parameters.SendEmailNotification = false;
var torecipients1 = {};
torecipients1.activitypartyid = "00000000-0000-0000-0000-000000000000"; //Delete if creating new record
torecipients1["#odata.type"] = "Microsoft.Dynamics.CRM.activityparty";
parameters.ToRecipients = [torecipients1];
var ccrecipients1 = {};
ccrecipients1.activitypartyid = "00000000-0000-0000-0000-000000000000"; //Delete if creating new record
ccrecipients1["#odata.type"] = "Microsoft.Dynamics.CRM.activityparty";
parameters.CCRecipients = [ccrecipients1];
parameters.RecurrencePattern = "FREQ=DAILY;";
parameters.StartDateTime = JSON.stringify(new Date("05/07/2021 13:30:00").toISOString());
parameters.RunNow = false;
var bulkDeleteRequest = {
QuerySet: parameters.QuerySet,
JobName: parameters.JobName,
SendEmailNotification: parameters.SendEmailNotification,
ToRecipients: parameters.ToRecipients,
CCRecipients: parameters.CCRecipients,
RecurrencePattern: parameters.RecurrencePattern,
StartDateTime: parameters.StartDateTime,
RunNow: parameters.RunNow,
getMetadata: function() {
return {
boundParameter: null,
parameterTypes: {
"QuerySet": {
"typeName": "Collection(mscrm.QueryExpression)",
"structuralProperty": 4
},
"JobName": {
"typeName": "Edm.String",
"structuralProperty": 1
},
"SendEmailNotification": {
"typeName": "Edm.Boolean",
"structuralProperty": 1
},
"ToRecipients": {
"typeName": "Collection(mscrm.activityparty)",
"structuralProperty": 4
},
"CCRecipients": {
"typeName": "Collection(mscrm.activityparty)",
"structuralProperty": 4
},
"RecurrencePattern": {
"typeName": "Edm.String",
"structuralProperty": 1
},
"StartDateTime": {
"typeName": "Edm.DateTimeOffset",
"structuralProperty": 1
},
"RunNow": {
"typeName": "Edm.Boolean",
"structuralProperty": 1
}
},
operationType: 0,
operationName: "BulkDelete"
};
}
};
Xrm.WebApi.online.execute(bulkDeleteRequest).then(
function success(result) {
if (result.ok) {
var results = JSON.parse(result.responseText);
}
},
function(error) {
Xrm.Utility.alertDialog(error.message);
}
);
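Translated back to the question's C# HttpClient approach, the only change needed is to send RunNow: false in the serialized request. A hypothetical sketch, assuming the poster's own BulkDeleteRequest DTO is extended with a RunNow property that serializes to the action's RunNow parameter:
// RunNow is assumed to be a new bool property on the poster's BulkDeleteRequest class.
var bulkDelete = new BulkDeleteRequest("Delete all future bookings")
{
    RunNow = false   // run the job asynchronously; synchronous bulk delete is not supported for bookableresourcebooking
};
bulkDelete.QuerySet.Add(querySet);   // querySet built exactly as in the question
await _crmClient.PostCRMData("BulkDelete()", JsonConvert.SerializeObject(bulkDelete));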
I have an Azure Function triggered by a timer in which I want to update documents inside Cosmos DB. Currently I'm using UpdateOneAsync with the option IsUpsert = true to make the update (or insert the document if it doesn't exist).
However, I'm doing the update operation inside a foreach loop, so an update operation is performed for each item. How can I do a bulk update (upsert), performing just one operation after the foreach loop finishes?
Here is my code right now:
foreach (var group in GetGroups(date, time, hour))
{
dic = new MyDictionary<string>();
//... some operations
List<BsonElement> documents = new List<BsonElement>();
documents.Add(new BsonElement("$inc", new BsonDocument(dic)));
documents.Add(new BsonElement("$set", new BsonDocument(new Dictionary<string, string>() { { "c", key }, { "d", date } })));
var doc = clicksDoc.UpdateOneAsync(t => t["_id"] == "c-" + key + "-" + date, new BsonDocument(documents), new UpdateOptions() { IsUpsert = true }).Result;
}
Instead I'd like to perform just one update after the loop. How can I do that?
2020 answer
Bulk support has been added to the .NET SDK:
Introducing Bulk support in the .NET SDK
To use it, first enable bulk execution when you create your client:
CosmosClient client = new CosmosClientBuilder(options.Value.ConnectionString)
.WithConnectionModeDirect()
.WithBulkExecution(true)
.Build();
Then get your container as normal:
Container container = client.GetContainer("databaseName", "containerName");
Then do your bulk operation, e.g. upsert:
public async Task BulkUpsert(List<SomeItem> items)
{
var concurrentTasks = new List<Task>();
foreach (SomeItem item in items)
{
concurrentTasks.Add(container.UpsertItemAsync(item, new PartitionKey(item.PartitionKeyField)));
}
await Task.WhenAll(concurrentTasks);
}
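With bulk execution enabled, the SDK transparently groups the queued point operations into batches per partition key range, so the Task.WhenAll over many UpsertItemAsync calls ends up as a handful of bulk requests rather than one round trip per item. A call site might look like this (a sketch; SomeItem and PartitionKeyField are just the placeholder names from the snippet above):
var items = new List<SomeItem>
{
    new SomeItem { Id = "1", PartitionKeyField = "tenant-a" },
    new SomeItem { Id = "2", PartitionKeyField = "tenant-a" }
};
await BulkUpsert(items);   // all upserts are dispatched as one bulk workload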
You can use the BulkUpdateAsync method from the BulkExecutor library:
List<UpdateItem> updateList = initialDocuments.Select(d =>
new UpdateItem(
d.id,
d.AccountNumber,
new List<UpdateOperation> {
new SetUpdateOperation<string>(
"NewSimpleProperty",
"New Property Value"),
new SetUpdateOperation<dynamic>(
"NewComplexProperty",
new {
prop1 = "Hello",
prop2 = "World!"
}),
new UnsetUpdateOperation(nameof(FakeOrder.DocumentIndex)),
}))
.ToList();
var updateSetResult = BulkUpdatetDocuments(_database, _collection, updateList).GetAwaiter().GetResult();
and then run the update with the executor:
var executor = new BulkExecutor(_documentClient, collectionResource);
await executor.InitializeAsync();
return await executor.BulkUpdateAsync(updates);
SAMPLE
I am using the API and method below to try to get all pull requests that merge a given commitId, but the API only returns a maximum of 250. Is there a parameter or pagination technique to get the remaining ones?
public void GetAllPullRequestsForCommit(Guid repoId, string commitId)
{
var query = new GitPullRequestQuery();
var input = new GitPullRequestQueryInput() { Type = GitPullRequestQueryType.Commit, Items = new List<string>() { commitId } };
query.QueryInputs = new List<GitPullRequestQueryInput>() { input };
var response = _gitClient.GetPullRequestQueryAsync(query, repoId).Result;
}
I am trying to read data from a remote MongoDB instance from a C# console application but keep getting an OutOfMemoryException. The collection that I am trying to read data from has about 500,000 records. Does anyone see any issue with the code below?
var mongoCred = MongoCredential.CreateMongoCRCredential("xdb", "x", "x");
var mongoClientSettings = new MongoClientSettings
{
Credentials = new[] { mongoCred },
Server = new MongoServerAddress("x-x.mongolab.com", 12345),
};
var mongoClient = new MongoClient(mongoClientSettings);
var mongoDb = mongoClient.GetDatabase("xdb");
var mongoCol = mongoDb.GetCollection<BsonDocument>("Persons");
var list = await mongoCol.Find(new BsonDocument()).ToListAsync();
This is a simple workaround: you can page your results using .Limit(int?) and .Skip(int?). In totNum you have to store the number of documents in your collection, using
coll.Count(new BsonDocument()) /*use the same filter you will apply in the next Find()*/
and then
for (int _i = 0; _i < totNum / 1000 + 1; _i++)
{
var result = coll.Find(new BsonDocument()).Limit(1000).Skip(_i * 1000).ToList();
foreach(var item in result)
{
/*Write your document in CSV file*/
}
}
I hope this can help...
P.S.
I used 1000 as the page size in .Skip() and .Limit(), but obviously you can use whatever value you want :-)
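As an alternative to paging (not from the original answer, just a sketch using the 2.x driver's cursor API), you can stream the documents instead of materializing everything with ToList(), so only one batch is held in memory at a time:
await mongoCol.Find(new BsonDocument())
    .ForEachAsync(doc =>
    {
        // Write the document to the CSV file here; only the current batch is buffered in memory.
    });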