C# MongoDB update definition from filled model

How can I combine a filled C# model with the MongoDB update language to build an update request?
I tried this, but I get a type error:
var update = new ObjectUpdateDefinition<User>(user) & Builders<User>.Update.Inc(f => f.FindCount, 1);
Full code:
var user = new Model
{
    Email = email,
    Language = language,
    Password = password,
    // +17 fields
};

// How can I convert all fields into $set operations and combine them with the Inc (something like AllFieldToSet)?
var update = user.AllFieldToSet() & Builders<User>.Update.Inc(f => f.FindCount, 1);

Models.FindOneAndUpdate<Model>(
    f => f.Email == email,
    update,
    new FindOneAndUpdateOptions<Model> { IsUpsert = true });

First, I would ask why you update so many fields at once; this can have a significant effect on performance and scalability, especially when your collection has indexes, replication and/or sharding involved. Whenever you update an indexed field, the index needs to be updated as well.
There are multiple solutions:
ReplaceOne: Inc only increments the value, so that part can be done manually in the model (FindCount++) and the whole document replaced; a sketch of this approach follows the usage example below.
Manually build the update operator: just build the update by hand using the builder the driver provides.
Write your own extension using reflection: personally I like type safety, but you can use this extension that I wrote (you will just have to extend it and build in null checks, etc.):
public static class UpdateBuilderExtensions
{
    public static UpdateDefinition<TDocument> SetAll<TDocument, TModel>(this UpdateDefinition<TDocument> updateBuilder, TModel value, params string[] excludeProperties)
    {
        // Add a $set for every public instance property that is not explicitly excluded.
        foreach (var property in value.GetType().GetProperties(BindingFlags.Public | BindingFlags.Instance).Where(x => excludeProperties?.Contains(x.Name) == false))
            updateBuilder = Builders<TDocument>.Update.Combine(updateBuilder.Set(property.Name, property.GetValue(value)));

        return updateBuilder;
    }
}
Usage: you have to exclude the properties that have already been set in the update, as well as the BsonId field (if you do not set it manually):
var update = Builders<User>.Update.Inc(x => x.FindCount, 1).SetAll(model, nameof(model.Id), nameof(model.FindCount));

Models.UpdateOne(x => x.Email == "some-email", update, new UpdateOptions
{
    IsUpsert = true
});
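For completeness, here is roughly what the ReplaceOne route could look like. This is a sketch only: it assumes user is fully populated (ReplaceOne overwrites the whole document) and uses ReplaceOptions, which newer driver versions expect (older ones take an UpdateOptions overload here):

// Inc done manually in the model, as mentioned above, then the whole document is replaced.
user.FindCount++;
Models.ReplaceOne(
    f => f.Email == email,
    user,
    new ReplaceOptions { IsUpsert = true });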

Related

Replace property values in a class from List<Dictionary> values

I have a method that takes a List<Dictionary<string,object>> as a parameter. The plan is to use that parameter, but only update the values held in a particular class. Here is the (partially written) method:
public async Task<Errors> UpdatePageForProject(Guid projectId, List<Dictionary<string, object>> data)
{
    if (!IsValidUserIdForProject(projectId))
        return new Errors { ErrorMessage = "Project does not exist", Success = false };

    if (data.Count == 0)
        return new Errors { ErrorMessage = "No data passed to change", Success = false };

    var page = await _context.FlowPages.FirstOrDefaultAsync(t => t.ProjectId == projectId);

    foreach (var d in data)
    {
    }

    return new Errors { Success = true };
}
My original plan is to take each dictionary, check if the key and the property in page match and then alter the value (so I can pass in 1 dictionary or 8 dictionaries in the list and then alter page to save back to my entity database).
I'd rather not use reflection due to the speed hit (though C# 9 is really fast, I'd still rather not use it), but I'm not sure how else this can be done. I did consider using AutoMapper, but for now I would rather not (it's a PoC, so it is possibly overkill).
If you want to do this without Reflection (which I agree is a good idea, not just for performance reasons) then you could use a "map" or lookup table with actions for each property.
var map = new Dictionary<string, Action<Page, object>>()
{
    { "Title", (p,o) => p.Title = (string)o },
    { "Header", (p,o) => p.Field1 = (string)o },
    { "DOB", (p,o) => p.DateOfBirth = (DateTime)o }
};
You can then iterate over your list of dictionaries and use the map to execute actions that update the page.
foreach (var dictionary in data)
{
    foreach (var entry in dictionary)
    {
        var action = map[entry.Key];
        action(page, entry.Value);
    }
}
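If a dictionary can contain a key that has no mapping, a guarded variant of the same loop (just a sketch, not needed when the input is always well-formed) avoids a KeyNotFoundException:

foreach (var dictionary in data)
{
    foreach (var entry in dictionary)
    {
        // Skip (or log) keys that have no corresponding action instead of throwing.
        if (map.TryGetValue(entry.Key, out var action))
            action(page, entry.Value);
    }
}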

Unable to save lists to MongoDB using UpdateDefinitionBuilder in C#

I have the following helper function that uses reflection to look at the supplied model, check if any fields are non-null, and overwrite those with new values in MongoDB.
It works fine until I try updating a List<string> field. Instead of saving the list as an array, it saves the string value "(Collection)" in MongoDB. What am I missing to make this work?
I don't want to hardcode that any List should default to List<string> either. It's possible I could have lists of ints.
public override MyObj Update(MyObj model)
{
    var builder = Builders<MyObj>.Update;
    var builder_def = builder.Set(x => x.Id, model.Id);

    foreach (PropertyInfo prop in model.GetType().GetProperties())
    {
        var value = model.GetType().GetProperty(prop.Name).GetValue(model, null);
        if (value != null)
        {
            builder_def = builder_def.Set(prop.Name, value); // Not setting lists correctly
        }
    }

    var filter = Builders<MyObj>.Filter;
    var filter_def = filter.Eq(x => x.Id, model.Id);
    Connection.Update(filter_def, builder_def);
    return model;
}
Edit: I think my problem is related to this open bug: https://jira.mongodb.org/browse/CSHARP-1984. Given that, is there any way for me to make this work in that context?
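One possible workaround, sketched here on the assumption that MyObj serializes cleanly through the driver's class map (collection stands in for whatever IMongoCollection<MyObj> the Connection.Update helper wraps): serialize the whole model to a BsonDocument, drop the null and _id elements, and send the result as the $set document, so lists become real BSON arrays instead of being boxed as object:

// Sketch only: uses MongoDB.Bson's ToBsonDocument() and the implicit
// BsonDocument -> UpdateDefinition<T> conversion.
var doc = model.ToBsonDocument();
var setDoc = new BsonDocument(doc.Where(e => e.Name != "_id" && !e.Value.IsBsonNull));
var update = new BsonDocument("$set", setDoc);

var filter = Builders<MyObj>.Filter.Eq(x => x.Id, model.Id);
collection.UpdateOne(filter, update);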

DapperExtensions: Add "insert ... update on duplicate key"

Now I'm using Dapper + Dapper.Extensions. And yes, it's easy and awesome. But I ran into a problem: Dapper.Extensions only has an Insert command, not an InsertUpdateOnDuplicateKey. I want to add such a method, but I don't see a good way to do it:
I want to make this method generic, like Insert.
I can't get the cached list of properties for a particular type, because I don't want to use reflection directly to build raw SQL.
One possible way is to fork it on GitHub, but I want to keep the change in my project only. Does anybody know how to extend it? I understand this feature ("insert ... update on duplicate key") is supported only in MySQL, but I can't find extension points in DapperExtensions to add this functionality from outside.
Update: this is my fork https://github.com/MaximTkachenko/Dapper-Extensions/commits/master
This piece of code has helped me enormously in MySQL-related projects; I definitely owe you one.
I do a lot of database-related development on both MySQL and MS SQL. I also try to share as much code as possible between my projects.
MS SQL has no direct equivalent for "ON DUPLICATE KEY UPDATE", so I was previously unable to use this extension when working with MS SQL.
While migrating a web application (that leans heavily on this Dapper.Extensions tweak) from MySQL to MS SQL, I finally decided to do something about it.
This code uses the "IF EXISTS => UPDATE ELSE INSERT" approach that basically does the same as "ON DUPLICATE KEY UPDATE" on MySQL.
Please note: the snippet assumes that you are taking care of transactions outside this method. Alternatively you could append "BEGIN TRAN" to the beginning and "COMMIT" to the end of the generated sql string.
public static class SqlGeneratorExt
{
    public static string InsertUpdateOnDuplicateKey(this ISqlGenerator generator, IClassMapper classMap, bool hasIdentityKeyWithValue = false)
    {
        var columns = classMap.Properties.Where(p => !(p.Ignored || p.IsReadOnly || (p.KeyType == KeyType.Identity && !hasIdentityKeyWithValue))).ToList();
        var keys = columns.Where(c => c.KeyType != KeyType.NotAKey).Select(p => $"{generator.GetColumnName(classMap, p, false)}=@{p.Name}");
        var nonkeycolumns = classMap.Properties.Where(p => !(p.Ignored || p.IsReadOnly) && p.KeyType == KeyType.NotAKey).ToList();

        if (!columns.Any())
        {
            throw new ArgumentException("No columns were mapped.");
        }

        var tablename = generator.GetTableName(classMap);
        var columnNames = columns.Select(p => generator.GetColumnName(classMap, p, false));
        var parameters = columns.Select(p => generator.Configuration.Dialect.ParameterPrefix + p.Name);
        var valuesSetters = nonkeycolumns.Select(p => $"{generator.GetColumnName(classMap, p, false)}=@{p.Name}").ToList();
        var where = keys.AppendStrings(seperator: " and ");

        var sqlbuilder = new StringBuilder();
        sqlbuilder.AppendLine($"IF EXISTS (select * from {tablename} WITH (UPDLOCK, HOLDLOCK) WHERE ({where})) ");
        sqlbuilder.AppendLine(valuesSetters.Any() ? $"UPDATE {tablename} SET {valuesSetters.AppendStrings()} WHERE ({where}) " : "SELECT 0 ");
        sqlbuilder.AppendLine($"ELSE INSERT INTO {tablename} ({columnNames.AppendStrings()}) VALUES ({parameters.AppendStrings()}) ");

        return sqlbuilder.ToString();
    }
}
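A hypothetical call site, only to illustrate the transaction note above; the implementor, connection, entities and MyEntity names follow the IDapperImplementor example further down and are assumptions, not part of the original snippet:

// Generate the IF EXISTS / UPDATE / ELSE INSERT statement and run it inside
// an explicit transaction, since the generator itself does not open one.
var classMap = implementor.SqlGenerator.Configuration.GetMap<MyEntity>();
var sql = implementor.SqlGenerator.InsertUpdateOnDuplicateKey(classMap);

using (var transaction = connection.BeginTransaction())
{
    connection.Execute(sql, entities, transaction);
    transaction.Commit();
}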
Actually, I closed my pull request and removed my fork because:
I saw some open pull requests created in 2014.
I found a way to "inject" my code into Dapper.Extensions.
To recap my problem: I want to create more generic queries for Dapper.Extensions, which means I need access to the mapping cache for entities, the SqlGenerator, etc. So here is my way. I want to add the ability to do INSERT ... ON DUPLICATE KEY UPDATE for MySQL. I created an extension method for ISqlGenerator:
public static class SqlGeneratorExt
{
    public static string InsertUpdateOnDuplicateKey(this ISqlGenerator generator, IClassMapper classMap)
    {
        var columns = classMap.Properties.Where(p => !(p.Ignored || p.IsReadOnly || p.KeyType == KeyType.Identity));
        if (!columns.Any())
        {
            throw new ArgumentException("No columns were mapped.");
        }

        var columnNames = columns.Select(p => generator.GetColumnName(classMap, p, false));
        var parameters = columns.Select(p => generator.Configuration.Dialect.ParameterPrefix + p.Name);
        var valuesSetters = columns.Select(p => string.Format("{0}=VALUES({1})", generator.GetColumnName(classMap, p, false), p.Name));

        string sql = string.Format("INSERT INTO {0} ({1}) VALUES ({2}) ON DUPLICATE KEY UPDATE {3}",
            generator.GetTableName(classMap),
            columnNames.AppendStrings(),
            parameters.AppendStrings(),
            valuesSetters.AppendStrings());

        return sql;
    }
}
One more extension method, for IDapperImplementor:
public static class DapperImplementorExt
{
    public static void InsertUpdateOnDuplicateKey<T>(this IDapperImplementor implementor, IDbConnection connection, IEnumerable<T> entities, int? commandTimeout = null) where T : class
    {
        IClassMapper classMap = implementor.SqlGenerator.Configuration.GetMap<T>();
        var properties = classMap.Properties.Where(p => p.KeyType != KeyType.NotAKey);
        string emptyGuidString = Guid.Empty.ToString();

        foreach (var e in entities)
        {
            foreach (var column in properties)
            {
                if (column.KeyType == KeyType.Guid)
                {
                    object value = column.PropertyInfo.GetValue(e, null);
                    string stringValue = value.ToString();
                    if (!string.IsNullOrEmpty(stringValue) && stringValue != emptyGuidString)
                    {
                        continue;
                    }

                    // Generate a comb guid for unset guid keys, like Dapper.Extensions does on Insert.
                    Guid comb = implementor.SqlGenerator.Configuration.GetNextGuid();
                    column.PropertyInfo.SetValue(e, comb, null);
                }
            }
        }

        string sql = implementor.SqlGenerator.InsertUpdateOnDuplicateKey(classMap);
        connection.Execute(sql, entities, null, commandTimeout, CommandType.Text);
    }
}
Now I can create a new class derived from the Database class to use my own SQL:
public class Db : Database
{
    private readonly IDapperImplementor _dapperIml;

    public Db(IDbConnection connection, ISqlGenerator sqlGenerator) : base(connection, sqlGenerator)
    {
        _dapperIml = new DapperImplementor(sqlGenerator);
    }

    public void InsertUpdateOnDuplicateKey<T>(IEnumerable<T> entities, int? commandTimeout) where T : class
    {
        _dapperIml.InsertUpdateOnDuplicateKey(Connection, entities, commandTimeout);
    }
}
Yeah, it's necessary to create another DapperImplementor instance because the DapperImplementor instance from the base class is private :(. So now I can use my Db class to call both my own generic SQL queries and the native queries from Dapper.Extensions. Examples of using the Database class instead of the IDbConnection extensions can be found here.
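For illustration, a hypothetical way to wire the Db wrapper up; the configuration, dialect and MySqlConnection setup here are assumptions and should mirror however your project already configures Dapper.Extensions:

// Build a SqlGenerator the same way Dapper.Extensions does internally,
// then use the Db wrapper to run the custom upsert for a batch of entities.
var config = new DapperExtensionsConfiguration(typeof(AutoClassMapper<>), new List<Assembly>(), new MySqlDialect());
var sqlGenerator = new SqlGeneratorImpl(config);

using (var connection = new MySqlConnection(connectionString))
{
    connection.Open();
    var db = new Db(connection, sqlGenerator);
    db.InsertUpdateOnDuplicateKey(entities, commandTimeout: null);
}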

Sitefinity: Dynamic Content query optimization with field values

I will attempt to be as specific as possible. We are using Sitefinity 8.1.5800, and I have a couple of dynamic content modules named ReleaseNotes and ReleaseNoteItems. ReleaseNotes has some fields but no reference to ReleaseNoteItems.
ReleaseNoteItems has fields and a related data field pointing to ReleaseNotes.
I can query all ReleaseNoteItems as dynamic content pretty quickly, in less than a second.
I then take these objects provided by Sitefinity and map them to a C# object so I can use strong typing. This mapping process takes almost a minute and uses over 600 queries for only 322 items (N+1).
In short: I need to get all the Sitefinity objects and map them to a usable C# object faster than I currently am.
The method for fetching the dynamic content items (takes milliseconds):
private IList<DynamicContent> GetAllLiveReleaseNoteItemsByReleaseNoteParentId(Guid releaseNoteParentId)
{
    DynamicModuleManager dynamicModuleManager = DynamicModuleManager.GetManager(String.Empty);
    Type releasenoteitemType = TypeResolutionService.ResolveType("Telerik.Sitefinity.DynamicTypes.Model.ReleaseNoteItems.Releasenoteitem");
    string releaseNoteParentTypeString = "Telerik.Sitefinity.DynamicTypes.Model.ReleaseNotes.Releasenote";
    var provider = dynamicModuleManager.Provider as OpenAccessDynamicModuleProvider;
    int? totalCount = 0;

    var cultureName = "en";
    Thread.CurrentThread.CurrentUICulture = new CultureInfo(cultureName);

    Type releasenoteType = TypeResolutionService.ResolveType("Telerik.Sitefinity.DynamicTypes.Model.ReleaseNotes.Releasenote");

    // This is how we get the releasenote items through filtering
    DynamicContent myCurrentItem = dynamicModuleManager.GetDataItem(releasenoteType, releaseNoteParentId);
    var myMasterParent = dynamicModuleManager.Lifecycle.GetMaster(myCurrentItem) as DynamicContent;

    var relatingItems = provider.GetRelatedItems(
        releaseNoteParentTypeString,
        "OpenAccessProvider",
        myMasterParent.Id,
        string.Empty,
        releasenoteitemType,
        ContentLifecycleStatus.Live,
        string.Empty,
        string.Empty,
        null,
        null,
        ref totalCount,
        RelationDirection.Parent).OfType<DynamicContent>();

    IList<DynamicContent> allReleaseNoteItems = relatingItems.ToList();
    return allReleaseNoteItems;
}
This is the method that takes almost a minute, mapping the Sitefinity objects to C# objects:
public IList<ReleaseNoteItemModel> GetReleaseNoteItemsByReleaseNoteParent(ReleaseNoteModel releaseNoteItemParent)
{
    return GetAllLiveReleaseNoteItemsByReleaseNoteParentId(releaseNoteItemParent.Id).Select(rn => new ReleaseNoteItemModel
    {
        Id = rn.Id,
        Added = rn.GetValue("Added") is bool ? (bool)rn.GetValue("Added") : false,
        BugId = rn.GetValue<string>("bug_id"),
        BugStatus = rn.GetValue<Lstring>("bugStatus"),
        Category = rn.GetValue<Lstring>("category"),
        Component = rn.GetValue<Lstring>("component"),
        Content = rn.GetValue<Lstring>("content"),
        Criticality = rn.GetValue<Lstring>("criticality"),
        Customer = rn.GetValue<string>("customer"),
        Date = rn.GetValue<DateTime?>("date"),
        Grouped = rn.GetValue<string>("grouped"),
        Override = rn.GetValue<string>("override"),
        Patch_Num = rn.GetValue<string>("patch_num"),
        PublishedDate = rn.PublicationDate,
        Risk = rn.GetValue<Lstring>("risk"),
        Title = rn.GetValue<string>("Title"),
        Summary = rn.GetValue<Lstring>("summary"),
        Prod_Name = rn.GetValue<Lstring>("prod_name"),
        ReleaseNoteParent = releaseNoteItemParent,
        McProductId = GetMcProductId(rn.GetRelatedItems("McProducts").Cast<DynamicContent>()),
    }).ToList();
}
Is there any way to optimize this all into one query, or a better way of doing this? Taking almost a minute to map these objects is too long for what we need to do with them.
If there is no way we will have to cache the items or make a SQL query. I would rather not do caching or SQL query if I do not have to.
Thank you in advance for any and all help you can provide, I am new to posting questions on stackoverflow so if you need any additional data please let me know.
Is there a reason why you are doing a .ToList() on the items? Is it possible for you to avoid that? In my opinion, most of the time (of the 1 minute) is taken to convert all your items into a list. The conversion from Sitefinity object to C# object is not the culprit here.
Look at Arno's answer here: https://plus.google.com/u/0/112295105425490148444/posts/QrsVtxj1sCB?cfem=1
You can use the "Content links manager" to query dynamic modules relationships (both by parent -ParentItemId- or by child -ChildItemId-) much faster:
var providerName = String.Empty;
var parentTitle = "Parent";
var relatedTitle = "RelatedItem3";

DynamicModuleManager dynamicModuleManager = DynamicModuleManager.GetManager(providerName);
Type parentType = TypeResolutionService.ResolveType("Telerik.Sitefinity.DynamicTypes.Model.ParentModules.ParentModule");
Type relatedType = TypeResolutionService.ResolveType("Telerik.Sitefinity.DynamicTypes.Model.RelatedModules.RelatedModule");
ContentLinksManager contentLinksManager = ContentLinksManager.GetManager();

// get the live version of all parent items
var parentItems = dynamicModuleManager.GetDataItems(parentType).Where(i => i.GetValue<string>("Title").Contains(parentTitle) && i.Status == ContentLifecycleStatus.Live && i.Visible);

// get the ids of the related items.
// We use the OriginalContentId property since we work with the live versions of the dynamic modules
var parentItemIds = parentItems.Select(i => i.OriginalContentId).ToList();

// get the live versions of all the related items
var relatedItems = dynamicModuleManager.GetDataItems(relatedType).Where(i => i.Status == ContentLifecycleStatus.Live && i.Visible && i.GetValue<string>("Title").Contains(relatedTitle));

// get the content links
var contentLinks = contentLinksManager.GetContentLinks().Where(cl => cl.ParentItemType == parentType.FullName && cl.ComponentPropertyName == "RelatedField" && parentItemIds.Contains(cl.ParentItemId) && cl.AvailableForLive);

// get the IDs of the desired parent items
var filteredParentItemIds = contentLinks.Join<ContentLink, DynamicContent, Guid, Guid>(relatedItems, (cl) => cl.ChildItemId, (i) => i.OriginalContentId, (cl, i) => cl.ParentItemId).Distinct();

// get the desired parent items by the filtered IDs
var filteredParentItems = parentItems.Where(i => filteredParentItemIds.Contains(i.OriginalContentId)).ToList();
I would imagine that every release note item under a single release note would be related to the same product, wouldn't it?
If so, do you need to call the GetMcProductId method for every item? See the sketch below.
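A rough sketch of that idea (purely illustrative; it assumes every item under one release note really does share the same product and that GetMcProductId returns a Guid):

// Resolve the related product once, from the first item, and reuse it for the whole batch.
var items = GetAllLiveReleaseNoteItemsByReleaseNoteParentId(releaseNoteItemParent.Id);
var firstItem = items.FirstOrDefault();
var sharedProductId = firstItem != null
    ? GetMcProductId(firstItem.GetRelatedItems("McProducts").Cast<DynamicContent>())
    : Guid.Empty;

return items.Select(rn => new ReleaseNoteItemModel
{
    // ... map the other fields exactly as before ...
    ReleaseNoteParent = releaseNoteItemParent,
    McProductId = sharedProductId,
}).ToList();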

MongoDB: update only specific fields

I am trying to update a row in a (typed) MongoDB collection with the C# driver. When handling data of that particular collection of type MongoCollection<User>, I tend to avoid retrieving sensitive data from the collection (salt, password hash, etc.)
Now I am trying to update a User instance. However, I never actually retrieved sensitive data in the first place, so I guess this data would be default(byte[]) in the retrieved model instance (as far as I can tell) before I apply modifications and submit the new data to the collection.
Maybe I am overlooking something trivial in the MongoDB C# driver: how can I use MongoCollection<T>.Save(T item) without updating specific properties such as User.PasswordHash or User.PasswordSalt? Should I retrieve the full record first, update the "safe" properties there, and write it back? Or is there a fancy option to exclude certain fields from the update?
Thanks in advance
Save(someValue) is for the case where you want the resulting record to be or become the full object (someValue) you passed in.
You can use the FindAndModify method:
var query = Query.EQ("_id", "123");
var sortBy = SortBy.Null;
var update = Update.Inc("LoginCount", 1).Set("LastLogin", DateTime.UtcNow); // some update, you can chain a series of update commands here

// usersCollection is your MongoCollection<User> instance
usersCollection.FindAndModify(query, sortBy, update);
Using FindAndModify you can specify exactly which fields in an existing record to change and leave the rest alone.
You can see an example here.
The only thing you need from the existing record is its _id; the two secret fields need not be loaded or ever mapped back into your POCO object.
It's possible to add more criteria in the Where statement, like this:
var db = ReferenceTreeDb.Database;
var packageCol = db.GetCollection<Package>("dotnetpackage");
var filter = Builders<Package>.Filter.Where(_ => _.packageName == packageItem.PackageName.ToLower() && _.isLatestVersion);
var update = Builders<Package>.Update.Set(_ => _.isLatestVersion, false);
var options = new FindOneAndUpdateOptions<Package>();
packageCol.FindOneAndUpdate(filter, update, options);
I had the same problem, and since I wanted one generic method for all types and didn't want to create my own implementation using reflection, I ended up with the following generic solution (simplified to show it all in one method):
async Task<bool> Update(string Id, T item)
{
    var serializerSettings = new JsonSerializerSettings()
    {
        NullValueHandling = NullValueHandling.Ignore,
        DefaultValueHandling = DefaultValueHandling.Ignore
    };

    // Serialize the model without null/default fields and send the result as the $set document.
    var bson = new BsonDocument() { { "$set", BsonDocument.Parse(JsonConvert.SerializeObject(item, serializerSettings)) } };
    var result = await database.GetCollection<T>(collectionName).UpdateOneAsync(Builders<T>.Filter.Eq("Id", Id), bson);
    return result.ModifiedCount > 0;
}
Notes:
Make sure all fields that must not be updated are left at their default values.
If you need to set a field to its default value, you either need to use DefaultValueHandling.Include or write a custom update method for that case.
When performance matters, write custom update methods using Builders<T>.Update.
P.S.: This should obviously be provided by the MongoDB .NET driver itself; however, I couldn't find it anywhere in the docs. Maybe I just looked the wrong way.
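A hypothetical call site for the generic Update above (the User type, repository instance and id value are illustrative only): because null and default-valued properties are stripped during serialization, only the populated fields end up in the $set document:

// Only Email is serialized here, so stored fields like PasswordHash stay untouched.
var partialUser = new User { Email = "new-address@example.com" };
var changed = await repository.Update("5f43a1b2c3d4e5f6a7b8c9d0", partialUser);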
Well, there are many ways to update a value in MongoDB.
Below is one of the simplest ways I chose to update a field value in a MongoDB collection:
public string UpdateData()
{
    string data = string.Empty;
    string param = "{$set: { name:'Developerrr New' } }";
    string filter = "{ 'name' : 'Developerrr '}";
    try
    {
        //******get connections values from web.config file*****
        var connectionString = ConfigurationManager.AppSettings["connectionString"];
        var databseName = ConfigurationManager.AppSettings["database"];
        var tableName = ConfigurationManager.AppSettings["table"];

        //******Connect to mongodb**********
        var client = new MongoClient(connectionString);
        var dataBases = client.GetDatabase(databseName);
        var dataCollection = dataBases.GetCollection<BsonDocument>(tableName);

        //****** convert filter and updating value to BsonDocument*******
        BsonDocument filterDoc = BsonDocument.Parse(filter);
        BsonDocument document = BsonDocument.Parse(param);

        //********Update value using UpdateOne method*****
        dataCollection.UpdateOne(filterDoc, document);
        data = "Success";
    }
    catch (Exception err)
    {
        data = "Failed - " + err;
    }
    return data;
}
Hoping this will help you :)
