Prevent Azure TableEntity property from being serialized in MVC 4 WebAPI - c#

So I have a Model Subscription which inherits from Azure's TableEntity class for use in a WebApi Get method as follows:
[HttpGet]
public IEnumerable<Subscription> Subscribers()
In this method, I do a Select query on my subscribers table to find all subscribers, but I only want to return a few of the columns (properties) as follows:
var query = new TableQuery<Subscription>().Select(new string[] {
"PartitionKey",
"RowKey",
"Description",
"Verified"
});
The definition for the model is below:
public class Subscription : TableEntity
{
[Required]
[RegularExpression(@"[\w]+",
ErrorMessage = @"Only alphanumeric characters and underscore (_) are allowed.")]
[Display(Name = "Application Name")]
public string ApplicationName
{
get
{
return this.PartitionKey;
}
set
{
this.PartitionKey = value;
}
}
[Required]
[RegularExpression(@"[\w]+",
ErrorMessage = @"Only alphanumeric characters and underscore (_) are allowed.")]
[Display(Name = "Log Name")]
public string LogName
{
get
{
return this.RowKey;
}
set
{
this.RowKey = value;
}
}
[Required]
[EmailAddressAttribute]
[Display(Name = "Email Address")]
public string EmailAddress { get; set; }
public string Description { get; set; }
public string SubscriberGUID { get; set; }
public bool? Verified { get; set; }
}
The following is the XML response of the API query:
<ArrayOfSubscription>
<Subscription>
<ETag>W/"datetime'2013-03-18T08%3A54%3A32.483Z'"</ETag>
<PartitionKey>AppName1</PartitionKey>
<RowKey>Log1</RowKey>
<Timestamp>
<d3p1:DateTime>2013-03-18T08:54:32.483Z</d3p1:DateTime>
<d3p1:OffsetMinutes>0</d3p1:OffsetMinutes>
</Timestamp>
<ApplicationName>AppName1</ApplicationName>
<Description>Desc</Description>
<EmailAddress i:nil="true"/>
<LogName>Log1</LogName>
<SubscriberGUID i:nil="true"/>
<Verified>false</Verified>
</Subscription>
</ArrayOfSubscription>
As you can see, the model not only has additional properties such as SubscriberGUID that I do not want serialized in the response (and since they are not in the select query, they are null anyway), but TableEntity itself contributes fields such as PartitionKey, RowKey, ETag, and Timestamp, which are also being serialized.
How do I continue to use Azure tables while keeping these unwanted fields out of the serialized response?

Not disagreeing with the answer of using a specific DTO, but the Microsoft.WindowsAzure.Storage assembly now provides an attribute, the IgnorePropertyAttribute, that you can decorate your public property with to avoid serialization.
I haven't actually tried it yet but there is a method on TableEntity called ShouldSkipProperty() that checks a number of things before returning false (i.e. don't skip):
Is the Property Name one of "PartitionKey", "RowKey", "Timestamp" or "ETag" -> skip
Is either the getter or the setter non-public -> skip
Is it static -> skip
Does the property have the attribute IgnorePropertyAttribute -> skip
Looks like it'll do the trick.
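A minimal sketch of the attribute in use (untested; assumes the attribute lives in the Microsoft.WindowsAzure.Storage.Table namespace):
using Microsoft.WindowsAzure.Storage.Table;

public class Subscription : TableEntity
{
    // ...other properties as in the question...

    [IgnoreProperty] // ShouldSkipProperty() sees this attribute and skips the property
    public string SubscriberGUID { get; set; }
}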

I would suggest using DTOs (data transfer objects) to solve this type of issue. DTOs might mean more code (more classes) but will benefit you in the long term. You get much better control over what is put on the wire. They are also better from a security standpoint than relying on serializer-specific attributes to control what goes on the wire.
Refer to this asp.net web API tutorial for more.
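As a rough sketch (the DTO name and projection below are illustrative, not from the tutorial; table is assumed to be your CloudTable):
public class SubscriptionDto
{
    public string ApplicationName { get; set; }
    public string LogName { get; set; }
    public string Description { get; set; }
    public bool? Verified { get; set; }
}

// Project the table entities into the DTO before returning them,
// so only these four fields ever go on the wire:
return table.ExecuteQuery(query).Select(s => new SubscriptionDto
{
    ApplicationName = s.ApplicationName,
    LogName = s.LogName,
    Description = s.Description,
    Verified = s.Verified
}).ToList();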

Using a DTO is the way to go, IMHO, but to clarify something that wasn't obvious from the other posts: where to implement the DTO. I was hoping I could use it directly in the query, which I could not. Instead, I had to do this:
// Select only the columns you need; everything else on the entity comes back null
query.SelectColumns = new List<string> { "QuoteId", "RateId", "Date" };
var results = await MyCloudTable.ExecuteQuerySegmentedAsync(query, null);
// Project the sparsely populated table entities into the DTO
return results.Select(d => new MyDto { QuoteId = d.QuoteId, RateId = d.RateId, Date = d.Date }).ToList();
You have to return your TableEntity derived object from your TableQuery, but since all the properties are null (from explicitly selecting the columns you want) there is no additional data on the wire. You then project into your DTO so you can return exactly the object you need.
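MyDto itself isn't shown in the answer; it is just a plain class matching the selected columns (the property types below are guesses):
public class MyDto
{
    public string QuoteId { get; set; }
    public string RateId { get; set; }
    public DateTime Date { get; set; }
}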

You do not need to inherit from TableEntity class. You can use TableEntity.Flatten method to create a DynamicTableEntity from your Subscription class and write to table storage. And you can use TableEntity.ConvertBack method to recompose your subscription object when you read the DynamicTableEntity back from azure table storage. These static helper methods are available in Azure Table Storage SDK version >= 8.0.0
TableEntity.Flatten: https://msdn.microsoft.com/en-us/library/azure/mt775434.aspx
TableEntity.ConvertBack: https://msdn.microsoft.com/en-us/library/azure/mt775432.aspx
This eliminates the need to write converter classes between DTOs and business data models.
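A rough sketch of that round trip (untested; assumes SDK >= 8.0.0 and a CloudTable instance named table):
// Write: flatten the POCO into entity properties and wrap them in a DynamicTableEntity
var flattened = TableEntity.Flatten(subscription, new OperationContext());
var entity = new DynamicTableEntity(partitionKey, rowKey) { Properties = flattened };
table.Execute(TableOperation.Insert(entity));

// Read: fetch the DynamicTableEntity and recompose the original object
var retrieved = (DynamicTableEntity)table.Execute(
    TableOperation.Retrieve(partitionKey, rowKey)).Result;
var roundTripped = TableEntity.ConvertBack<Subscription>(
    retrieved.Properties, new OperationContext());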

Related

How to save/pass MongoDB UpdateDefinition for logging and later use?

I am stumped on how to save/pass MongoDB UpdateDefinition for logging and later use
I have created general functions for MongoDB in Azure use on a collection for get, insert, delete, update that work well.
The purpose is to be able to have a standard, pre-configured way to interact with the collection. For update especially, the goal is to be able to flexibly pass in an appropriate UpdateDefinition where that business logic is done elsewhere and passed in.
I can create/update/set/combine the UpdateDefinition itself, but when I try to log it by serializing it, it shows null:
JsonConvert.SerializeObject(updateDef)
When I try to log it, save it to another class, or pass it to another function, it displays null:
public class Account
{
[BsonElement("AccountId")]
public int AccountId { get; set; }
[BsonElement("Email")]
public string Email { get; set; }
}
var updateBuilder = Builders<Account>.Update;
var updates = new List<UpdateDefinition<Account>>();
//just using one update here for brevity - purpose is there could be 1:many depending on fields updated
updates.Add(updateBuilder.Set(a => a.Email, email));
//Once all the logic and field update determinations are made
var updateDef = updateBuilder.Combine(updates);
//The updateDef does not serialize to string, it displays null when logging.
_logger.LogInformation("{0} - Update Definition: {1}", actionName, JsonConvert.SerializeObject(updateDef));
//Class Created for passing the Account Update Information for Use by update function
public class AccountUpdateInfo
{
[BsonElement("AccountId")]
public int AccountId { get; set; }
[BsonElement("Update")]
public UpdateDefinition<Account> UpdateDef { get; set; }
}
var acct = new AccountUpdateInfo();
acct.UpdateDef = updateDef;
//This also logs a null value for the Update Definition field when the class is serialized.
_logger.LogInformation("{0} - AccountUpdateInfo: {1}", actionName, JsonConvert.SerializeObject(acct));
Any thoughts or ideas on what is happening? I am stumped on why I cannot serialize it for logging or pass the value around in a class like I would expect.
Give this a try:
var json = updateDef.Render(
BsonSerializer.SerializerRegistry.GetSerializer<Account>(),
BsonSerializer.SerializerRegistry)
.AsBsonDocument
.ToString();
And to turn a JSON string back into an update definition (using the implicit conversion operator), you can do:
UpdateDefinition<Account> updateDef = json;
This is off the top of my head and untested; the only thing I'm unsure of (without an IDE) is the .AsBsonDocument.ToString() part above.
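Tying that back to the original logging call (a sketch under the same assumptions as above):
var updateJson = updateDef.Render(
    BsonSerializer.SerializerRegistry.GetSerializer<Account>(),
    BsonSerializer.SerializerRegistry).AsBsonDocument.ToString();
_logger.LogInformation("{0} - Update Definition: {1}", actionName, updateJson);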

Most efficient way to convert an object to another (Model to ViewModel)

Suppose I have a model with 20 fields, and in my index page, I want to list all models that are stored in my database.
On the index page, instead of listing all fields of the model, I only want to list 3 fields.
So, I made two classes:
class CompleteModel {
public int Id { get; set; }
public string Field01 { get; set; }
public string Field02 { get; set; }
public string Field03 { get; set; }
public string Field04 { get; set; }
public string Field05 { get; set; }
...
public string Field20 { get; set; }
}
Now, in my Controller, I can use:
await _context.CompleteModel.ToListAsync();
but I feel that this is not the right way to do it, because I'm fetching all fields while using only 3 of them.
So, I made this code:
class ViewModel {
public string Field02 { get; set; }
public string Field04 { get; set; }
public string Field08 { get; set; }
}
var results = await _context.CompleteModel.Select(
x => new {
x.Field02,
x.Field04,
x.Field08
}).ToListAsync();
var listResults = new List<ViewModel>();
if (results != null)
{
listResults.AddRange(results.Select(x => new ViewModel
{
Field02 = x.Field02,
Field04 = x.Field04,
Field08 = x.Field08
}));
}
I think this is a lot of code to do something so simple: first I select all the fields I want, then copy everything to another object.
Is there a more direct way to do the same thing?
Like:
_context.CompleteModel.Select(x => new ViewModel { Field02, Field04, Field08 });
You could use AutoMapper to reduce the boilerplate so you're not manually copying field values over.
If you include the AutoMapper NuGet package then you'd need to have the following in your startup somewhere to configure it for your classes:
Mapper.Initialize(cfg => cfg.CreateMap<CompleteModel, ViewModel>());
You could then do something like the following:
var results = await _context.CompleteModel.ToListAsync();
var viewModelResults = results.Select(Mapper.Map<ViewModel>).ToList();
There are a lot of configuration options for the package so do take a look at the documentation to see if it suits your needs and determine the best way to use it if it does.
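AutoMapper can also project inside the query itself via its ProjectTo extension (in AutoMapper.QueryableExtensions), which composes with EF so only the mapped columns are fetched; a sketch, assuming an AutoMapper version where the static ProjectTo overload is available and the map configured above:
using AutoMapper.QueryableExtensions;

var viewModelResults = await _context.CompleteModel
    .ProjectTo<ViewModel>() // translated into a column-level SELECT by EF
    .ToListAsync();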
In my view this is one of the weaknesses of over-abstraction and layering. The VM contains the data that is valuable to your application within its context of use (screen, process, etc.). The data model contains all the data that could be stored and might be relevant. At some point you need to match the two.
Use EF Projection to fetch only the data you need from the database into projected data model classes (using the EF POCO layer to define the query, but not to store the resultant data).
Map the projected classes onto your VM, using AutoMapper or similar if there is a naive mapping. However, unless you are just writing CRUD screens, a simple field-by-field mapping is of little value; the data you fetch from your data store via EF is in its raw, probably relational form, and the data required by your VM is probably not going to fit that form very neatly (again, unless you are doing a simple CRUD form), so you are going to need to add some value by coding the relationship between the data store and the View Model.
I think concentrating on the line count would lead you to the wrong approach. Look at that code and ask "is it adding any value?". If you can delegate the copying to AutoMapper, then great; but if all your VM does is mirror the data model field for field, it isn't really pulling its weight beyond carrying some validation annotations.
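A minimal sketch of the projection step described above, selecting straight into the view model so EF only fetches the three columns:
var listResults = await _context.CompleteModel
    .Select(x => new ViewModel
    {
        Field02 = x.Field02,
        Field04 = x.Field04,
        Field08 = x.Field08
    })
    .ToListAsync();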

Azure table storage doesn't add new fields

I created a table in Azure Table Storage and added a few records using my entity class derived from TableEntity. After this I added two more properties to the class and tried to insert more records, but it looks like the new fields are not added to the storage; only the old fields are written and read.
Am I missing something? Do I have to do something more to change the layout of the table?
Thanks for the info, but in the end I found that the problem was something else. I had used VS to generate the properties in the class, and VS created them with an internal setter.
It looks like in this situation the Azure Storage client simply ignores these properties: it does not create the fields and neither writes nor reads them, giving no errors at all.
Removing the internal keyword made it work correctly.
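In other words (a minimal illustration of the pitfall, consistent with the ShouldSkipProperty() rules mentioned above):
// Before: silently skipped by the table client because the setter is not public
public string NewField { get; internal set; }

// After: written and read as expected, both accessors are public
public string NewField { get; set; }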
First, I want to make sure that I understand your scenario: you modified the existing entity class by adding two more properties, then added new entities to the table. You were not able to see the two newly added properties, and could only update/retrieve the old ones. If that is your scenario, check the following; you should be able to add new properties.
Add the new properties to the class you derived from TableEntity. Optional1 and Optional2 below are the new properties you are trying to add.
public class CustomerEntity : TableEntity
{
public CustomerEntity() { }
public CustomerEntity(string lastName, string firstName)
{
this.PartitionKey = lastName;
this.RowKey = firstName;
}
public string Email { get; set; }
public string CellPhoneNumber { get; set; }
public string Optional1 { get; set; }
public string Optional2 { get; set; }
}
Make sure you set Optional1 and Optional2 values. See the sample code below.
var customer = new CustomerEntity(LastName, FirstName)
{
Email = Email,
CellPhoneNumber = cellPhoneNumber,
Optional1 = optional1,
Optional2 = optional2,
};
TableOperation insertOperation = TableOperation.Insert(customer);
TableName.Execute(insertOperation);
Note: I have not compiled the above code so there may be typos and etc.
Thanks,
Aung
In my case the fields that I defined were nullable. Even though I provided values for them, the fields were still not created. I had to make them non-nullable so the fields would be created.

MongoDB C# Driver - Ignore fields on binding

When using FindOne() with MongoDB and C#, is there a way to ignore fields not found in the object?
EG, example model.
public class UserModel
{
public ObjectId id { get; set; }
public string Email { get; set; }
}
Now we also store a password in the MongoDB collection, but we do not want to bind it to our object above. When we do a Get like so,
var query = Query<UserModel>.EQ(e => e.Email, model.Email);
var entity = usersCollection.FindOne(query);
We get the following error
Element 'Password' does not match any field or property of class
Is there any way to tell Mongo to ignore fields it can't match with the models?
Yes. Just decorate your UserModel class with the BsonIgnoreExtraElements attribute:
[BsonIgnoreExtraElements]
public class UserModel
{
public ObjectId id { get; set; }
public string Email { get; set; }
}
As the name suggests, the driver would ignore any extra fields instead of throwing an exception. More information here - Ignoring Extra Elements.
Yet another possible solution is to register a convention for this.
This way, we do not have to annotate every class with [BsonIgnoreExtraElements].
Somewhere when creating the mongo client, setup the following:
var pack = new ConventionPack();
pack.Add(new IgnoreExtraElementsConvention(true));
ConventionRegistry.Register("My Solution Conventions", pack, t => true);
Yes. Another way (instead of editing your model class) is to use RegisterClassMap with SetIgnoreExtraElements.
In your case just add this code when you initialize your driver:
BsonClassMap.RegisterClassMap<UserModel>(cm =>
{
cm.AutoMap();
cm.SetIgnoreExtraElements(true);
});
You can read more about ignoring extra elements using class mapping here - Ignoring Extra Elements.

Is it wrong to dynamically add "data-val" and "data-val-required" in the View?

I have a ViewModel that I can decorate with the [Required] attribute (see below). I've come to the point where I need to let the client control which fields are required. They can configure this through XML, and all this info is stored in the Model when it's first created. Now I have fields that are not decorated with [Required] but still need to be validated (per the user's settings) before submitting (for example the Phone field).
public class MyBusinessObjectViewModel
{
[Required]
public string Email { get; set; } //compulsory
public string Phone { get; set; } //not (yet) compulsory, but might become
}
If the user does not enter the Phone number, the data will still get posted. Not wanting to mess with custom validators, I just add the "data-val" and "data-val-required" attributes to the HTML, like this:
Dictionary<string, object> dict = new Dictionary<string, object>();
dict.Add("data-val", "true");
dict.Add("data-val-required", "This field is required.");
@Html.TextBoxFor(x => x, dict);
This forces client-side validation for all the properties that are dynamically set as required. Is this good practice? What kind of side effects can I expect?
You should look into extending the metadata model framework with your own metadata provider to do the actual binding between your site's configuration and the model metadata. You can actually set the required flag to true on the property's model metadata during the metadata creation process. I can't remember for sure whether this causes the built-in editor templates to generate the attribute, but I think it does. Worst case scenario you can actually create and attach a new RequiredAttribute to the property, which is a tad kludgy, but works pretty well in certain scenarios.
You could also do this with IMetadataAware attributes, especially if Required is the only metadata aspect your users can customize, but the implementation really depends on what you're trying to do.
One major advantage of using a custom ModelMetadataProvider in your specific case is that you can use dependency injection (via ModelMetadataProviders) to get your customer settings persistence mechanism into scope, whereas with the data attribute you only get to write an isolated method that runs immediately after the metadata model is created.
Here is a sample implementation of a custom model metadata provider, you'd just have to change the client settings to whatever you wanted to use.
UPDATED but not tested at all
public class ClientSettingsProvider
{
public ClientSettingsProvider(/* db info */) { /* init */ }
public bool IsPropertyRequired(string propertyIdentifier)
{
// check the property identifier here and return status
}
}
public class ClientRequiredAttribute : Attribute
{
string _identifier;
public string Identifier { get { return _identifier; } }
public ClientRequiredAttribute(string identifier)
{ _identifier = identifier; }
}
public class RequiredModelMetadataProvider : DataAnnotationsModelMetadataProvider
{
ClientSettingsProvider _clientSettings;
public RequiredModelMetadataProvider(ClientSettingsProvider clientSettings)
{
_clientSettings = clientSettings;
}
protected override ModelMetadata CreateMetadata(IEnumerable<Attribute> attributes, Type containerType, Func<object> modelAccessor, Type modelType, string propertyName)
{
// alternatively here is where you could 'inject' a RequiredAttribute into the attributes list
var clientRequiredAttribute = attributes.OfType<ClientRequiredAttribute>().SingleOrDefault();
if(clientRequiredAttribute != null && _clientSettings.IsPropertyRequired(clientRequiredAttribute.Identifier))
{
// By injecting the Required attribute here it will seem to
// the base provider we are extending as if the property was
// marked with [Required]. Your data validation attributes should
// be added, provided you are using the default editor templates in
// your view.
attributes = attributes.Union(new [] { new RequiredAttribute() });
}
var metadata = base.CreateMetadata(attributes, containerType, modelAccessor, modelType, propertyName);
// REMOVED, this is another way but I'm not 100% sure it will add your attributes
// Use whatever attributes you need here as parameters...
//if (_clientSettings.IsPropertyRequired(containerType, propertyName))
//{
// metadata.IsRequired = true;
//}
return metadata;
}
}
USAGE
public class MyModel
{
[ClientRequired("CompanyName")]
public string Company { get; set; }
}
public class MyOtherModel
{
[ClientRequired("CompanyName")]
public string Name { get; set; }
public string Address { get; set; }
}
Both of these models would validate the string "CompanyName" against your client settings provider.
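One piece the sample doesn't show is registering the provider; a sketch (untested, like the rest of the sample) of wiring it up in Global.asax:
protected void Application_Start()
{
    // ...routes, bundles, etc...
    var clientSettings = new ClientSettingsProvider(/* db info */);
    ModelMetadataProviders.Current = new RequiredModelMetadataProvider(clientSettings);
}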
"Not wanting to mess with custom validators", you instead messed with the View, obfuscating your validation logic by removing it from the place where it is expected to be found.
Really, don't be afraid of creating a custom validation attribute. What you are doing right now is accumulating technical debt.
