Azure Table Storage doesn't add new fields - C#

I created a table in Azure Table Storage and added a few records using my Entity class derived from TableEntity. After this I added two more properties to the class and tried to insert more records, but it looks like the new fields are not added to the storage; only the old fields are written and read.
Am I missing something? Do I have to do something more to change the layout of the table?

Thanks for the info, but in the end I found that the problem was something else. I used Visual Studio to generate the properties in the class, and it created them with an internal setter.
It looks like in this situation the Azure Storage client simply ignores these properties: it does not create the fields and does not write or read them, giving no errors at all.
After removing the internal keyword it started to work correctly.
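For illustration, a minimal sketch of the difference (the entity and property names here are hypothetical, not from my actual class):
public class MyEntity : TableEntity
{
    // Silently ignored by the table storage client: the setter is not public,
    // so no column is created and the value is neither written nor read.
    public string NewField1 { get; internal set; }

    // Persisted as expected: both accessors are public.
    public string NewField2 { get; set; }
}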

First, I want to make sure that I understand your scenario. You modified the existing entity class by adding two more properties, then added new entities to the table. You were not able to see the two newly added properties, and were only able to update/retrieve the old ones. If that is the scenario you are trying to implement, check the following; you should be able to add new properties.
Add the new properties to the entity class you derived from TableEntity. Optional1 and Optional2 are the new properties you are trying to add.
public class CustomerEntity : TableEntity
{
    public CustomerEntity() { }

    public CustomerEntity(string lastName, string firstName)
    {
        this.PartitionKey = lastName;
        this.RowKey = firstName;
    }

    public string Email { get; set; }
    public string CellPhoneNumber { get; set; }
    public string Optional1 { get; set; }
    public string Optional2 { get; set; }
}
Make sure you set Optional1 and Optional2 values. See the sample code below.
var customer = new CustomerEntity(lastName, firstName)
{
    Email = email,
    CellPhoneNumber = cellPhoneNumber,
    Optional1 = optional1,
    Optional2 = optional2
};

// table is your CloudTable reference
TableOperation insertOperation = TableOperation.Insert(customer);
table.Execute(insertOperation);
Note: I have not compiled the above code, so there may be typos, etc.
Thanks,
Aung

In my case the fields I defined were nullable. Even though I provided values for them, the fields were still not created. I had to make them non-nullable for the fields to be created.

Related

How to save/pass MongoDB UpdateDefinition for logging and later use?

I am stumped on how to save/pass a MongoDB UpdateDefinition for logging and later use.
I have created general functions for MongoDB in Azure that work on a collection for get, insert, delete, and update, and they work well.
The purpose is to have a standard, pre-configured way to interact with the collection. For update especially, the goal is to be able to flexibly pass in an appropriate UpdateDefinition whose business logic is done elsewhere.
I can create/update/set/combine the UpdateDefinition itself, but when I try to log it by serializing it, it shows null:
JsonConvert.SerializeObject(updateDef)
When I try to log it, save it to another class, or pass it to another function, it displays null:
public class Account
{
    [BsonElement("AccountId")]
    public int AccountId { get; set; }

    [BsonElement("Email")]
    public string Email { get; set; }
}
var updateBuilder = Builders<Account>.Update;
var updates = new List<UpdateDefinition<Account>>();
//just using one update here for brevity - purpose is there could be 1:many depending on fields updated
updates.Add(updateBuilder.Set(a => a.Email, email));
//Once all the logic and field update determinations are made
var updateDef = updateBuilder.Combine(updates);
//The updateDef does not serialize to string, it displays null when logging.
_logger.LogInformation("{0} - Update Definition: {1}", actionName, JsonConvert.SerializeObject(updateDef));
//Class Created for passing the Account Update Information for Use by update function
public class AccountUpdateInfo
{
    [BsonElement("AccountId")]
    public int AccountId { get; set; }

    [BsonElement("Update")]
    public UpdateDefinition<Account> UpdateDef { get; set; }
}
var acct = new AccountUpdateInfo();
acct.UpdateDef = updateDef;
//This also logs a null value for the Update Definition field when the class is serialized.
_logger.LogInformation("{0} - AccountUpdateInfo: {1}", actionName, JsonConvert.SerializeObject(acct));
Any thoughts or ideas on what is happening? I am stumped on why I cannot serialize it for logging or pass the value around in a class like I would expect.
Give this a try:
var json = updateDef.Render(
BsonSerializer.SerializerRegistry.GetSerializer<Account>(),
BsonSerializer.SerializerRegistry)
.AsBsonDocument
.ToString();
And to turn a JSON string back into an update definition (using the implicit operator), you can do:
UpdateDefinition<Account> updateDef = json;
This is off the top of my head and untested. The only thing I'm unsure of (without an IDE) is the .AsBsonDocument.ToString() part above.
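As a usage sketch (also untested, assuming the Account class and _logger from the question), the rendered document can be plugged straight into the logging call:
var registry = BsonSerializer.SerializerRegistry;
var json = updateDef
    .Render(registry.GetSerializer<Account>(), registry)
    .AsBsonDocument
    .ToString();

// Log the rendered update as plain JSON instead of the UpdateDefinition object.
_logger.LogInformation("{0} - Update Definition: {1}", actionName, json);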

Backendless C# desktop application saving data but with empty or null values

I have a desktop app written in C#. I added the app ID and key ID
and used this code to add data to the database, but the data is always empty or null.
var film = new Film();
film.setName("soooft");
film.setGenre("aaa");
film.setPlot("fdgveqw");
film.setUrl("gdfwrw");
var f = Backendless.Data.Of<Film>().Save(film);
I googled Backendless and it's a third-party solution. (See https://github.com/Backendless/.NET-SDK)
Usage gets explained at https://backendless.com/docs/dotnet/data_data_object.html
But I'm suspicious about why you use setName(), setGenre(), setPlot() and setUrl() in your code. It seems your Film class is missing properties. I would expect you'd be writing this instead:
var film = new Film();
film.Name = "soooft";
film.Genre = "aaa";
film.Plot = "fdgveqw";
film.Url = "gdfwrw";
But that would mean those fields are declared as public properties in your class like this:
public class Film
{
    public string Name { get; set; }
    public string Genre { get; set; }
    public string Plot { get; set; }
    public string Url { get; set; }
}
So I don't know why you have those setName and other methods. The Backendless API specifies that these fields need to be public properties so it can read them through reflection. Your code seems to suggest that they're not proper properties, as indicated by the Backendless example and my Film class above.
Make sure to use public get/set properties instead of private fields and the data will be saved properly.
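Putting the pieces together, a minimal sketch (untested, assuming the Film class with public auto-properties as shown above):
// With public get/set properties, Backendless can read the values via reflection.
var film = new Film
{
    Name = "soooft",
    Genre = "aaa",
    Plot = "fdgveqw",
    Url = "gdfwrw"
};

var saved = Backendless.Data.Of<Film>().Save(film);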

Most efficient way to convert an object to another (Model to ViewModel)

Suppose I have a model with 20 fields, and on my index page I want to list all models that are stored in my database.
On the index page, instead of listing all fields of the model, I only want to list 3 fields.
So, I made two classes:
class CompleteModel {
    public int Id { get; set; }
    public string Field01 { get; set; }
    public string Field02 { get; set; }
    public string Field03 { get; set; }
    public string Field04 { get; set; }
    public string Field05 { get; set; }
    ...
    public string Field20 { get; set; }
}
now, in my Controller, I can use:
await _context.CompleteModel.ToListAsync();
but that does not feel like the right way to do it, because I'm fetching all the fields and using only 3 of them.
So, I made this code:
class ViewModel {
    public string Field02 { get; set; }
    public string Field04 { get; set; }
    public string Field08 { get; set; }
}
var result = await _context.CompleteModel.Select(
    x => new {
        x.Field02,
        x.Field04,
        x.Field08
    }).ToListAsync();
var listResults = new List<ViewModel>();
if (result != null)
{
    listResults.AddRange(result.Select(x => new ViewModel
    {
        Field02 = x.Field02,
        Field04 = x.Field04,
        Field08 = x.Field08
    }));
}
I think this is a lot of code to do this.
First, I selected all the fields that I want, then copied everything to another object.
Is there a more direct way to do the same thing?
Like:
_context.CompleteModel.Select(x => new ViewModel { Field02, Field04, Field08 });
You could use AutoMapper to reduce the boilerplate so you're not manually copying field values over.
If you include the AutoMapper NuGet package then you'd need to have the following in your startup somewhere to configure it for your classes:
Mapper.Initialize(cfg => cfg.CreateMap<CompleteModel, ViewModel>());
You could then do something like the following:
var results = await _context.CompleteModel.ToListAsync();
var viewModelResults = results.Select(Mapper.Map<ViewModel>).ToList();
There are a lot of configuration options for the package so do take a look at the documentation to see if it suits your needs and determine the best way to use it if it does.
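If you also want to avoid pulling all 20 columns from the database, AutoMapper's QueryableExtensions can apply the mapping inside the query itself. A rough, untested sketch assuming the same Mapper.Initialize configuration as above (newer AutoMapper versions require passing the configuration, e.g. ProjectTo<ViewModel>(Mapper.Configuration)):
using AutoMapper.QueryableExtensions;

// The mapping is translated into the query projection, so only the mapped columns are selected.
var viewModelResults = await _context.CompleteModel
    .ProjectTo<ViewModel>()
    .ToListAsync();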
In my view this is one of the weaknesses of over-abstraction and layering. The VM contains the data that is valuable to your application within the context of use (screen, process, etc.). The data model contains all the data that could be stored and might be relevant. At some point you need to match the two.
Use EF projection to fetch only the data you need from the database into projected data model classes (using the EF POCO layer to define the query, but not to store the resultant data).
Map the projected classes onto your VM, using AutoMapper or similar if there is a naive mapping. However, unless you are just writing CRUD screens, a simple field-by-field mapping is of little value; the data you fetch from your data store via EF is in its raw, probably relational form, and the data required by your VM is probably not going to fit that form very neatly (again, unless you are doing a simple CRUD form), so you are going to need to add some value by coding the relationship between the data store and the view model.
I think concentrating on the line count would lead you to the wrong approach. Look at the code and ask "is it adding any value?". If you can consistently delegate the data-model-to-VM copying to AutoMapper, then great, but in that case the VM isn't really pulling its weight beyond adding some validation annotations.
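For the simple field-by-field case in the question, a rough sketch of what that projection might look like (untested, assuming the ViewModel class from the question): you can project straight into the view model inside the EF Select, which avoids the intermediate anonymous type entirely.
var listResults = await _context.CompleteModel
    .Select(x => new ViewModel
    {
        // EF translates this into a query that selects only these three columns.
        Field02 = x.Field02,
        Field04 = x.Field04,
        Field08 = x.Field08
    })
    .ToListAsync();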

Prevent Azure TableEntity property from being serialized in MVC 4 WebAPI

So I have a Model Subscription which inherits from Azure's TableEntity class for use in a WebApi Get method as follows:
[HttpGet]
public IEnumerable<Subscription> Subscribers()
In this method, I do a Select query on my subscribers table to find all subscribers, but I only want to return a few of the columns (properties) as follows:
var query = new TableQuery<Subscription>().Select(new string[] {
"PartitionKey",
"RowKey",
"Description",
"Verified"
});
The definition for the model is below:
public class Subscription : TableEntity
{
    [Required]
    [RegularExpression(@"[\w]+",
        ErrorMessage = @"Only alphanumeric characters and underscore (_) are allowed.")]
    [Display(Name = "Application Name")]
    public string ApplicationName
    {
        get { return this.PartitionKey; }
        set { this.PartitionKey = value; }
    }

    [Required]
    [RegularExpression(@"[\w]+",
        ErrorMessage = @"Only alphanumeric characters and underscore (_) are allowed.")]
    [Display(Name = "Log Name")]
    public string LogName
    {
        get { return this.RowKey; }
        set { this.RowKey = value; }
    }

    [Required]
    [EmailAddressAttribute]
    [Display(Name = "Email Address")]
    public string EmailAddress { get; set; }

    public string Description { get; set; }
    public string SubscriberGUID { get; set; }
    public bool? Verified { get; set; }
}
The following is the XML response of the API query:
<ArrayOfSubscription>
<Subscription>
<ETag>W/"datetime'2013-03-18T08%3A54%3A32.483Z'"</ETag>
<PartitionKey>AppName1</PartitionKey><RowKey>Log1</RowKey>
<Timestamp>
<d3p1:DateTime>2013-03-18T08:54:32.483Z</d3p1:DateTime>
<d3p1:OffsetMinutes>0</d3p1:OffsetMinutes>
</Timestamp>
<ApplicationName>AppName1</ApplicationName>
<Description>Desc</Description>
<EmailAddress i:nil="true"/>
<LogName>Log1</LogName>
<SubscriberGUID i:nil="true"/>
<Verified>false</Verified>
</Subscription>
</ArrayOfSubscription>
As you can see, not only does the model have a few additional properties such as SubscriberGUID which I do not want serialized in the response (and since they are not in the select query, they are null anyway), but TableEntity itself has fields such as PartitionKey, RowKey, ETag, and Timestamp which are also being serialized.
How do I continue to use Azure tables but avoid serializing these undesired fields, which I do not want the user to see, in the response?
Not disagreeing with the answer of using a specific DTO, but the Microsoft.WindowsAzure.Storage assembly now provides an attribute, the IgnorePropertyAttribute, that you can decorate your public property with to avoid serialization.
I haven't actually tried it yet but there is a method on TableEntity called ShouldSkipProperty() that checks a number of things before returning false (i.e. don't skip):
Is the Property Name one of "PartitionKey", "RowKey", "Timestamp" or "ETag" -> skip
Is either the getter or the setter non-public -> skip
Is it static -> skip
Does the property have the attribute IgnorePropertyAttribute -> skip
Looks like it'll do the trick.
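For illustration, a rough, untested sketch of how the attribute might be applied, using the SubscriberGUID property from the question:
using Microsoft.WindowsAzure.Storage.Table;

public class Subscription : TableEntity
{
    // Skipped by the table entity serializer because of the attribute.
    [IgnoreProperty]
    public string SubscriberGUID { get; set; }

    // ... remaining properties as in the question ...
}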
I would suggest using DTOs (data transfer objects) to solve this type of issue. DTOs might mean more code (more classes) but will benefit you in the long term. You have much better control over what is put on the wire. They are also better from a security standpoint than using serializer-specific attributes to control what is put on the wire.
Refer to this asp.net web API tutorial for more.
Using a DTO is the way to go, IMHO, but to clarify something that wasn't obvious from the other posts: where to implement the DTO. I was hoping I could have just used it as part of the query, which I could not. Instead, I had to do this:
query.SelectColumns = new List<string> { "QuoteId", "RateId", "Date" };
var results = await MyCloudTable.ExecuteQuerySegmentedAsync(query, null);
return results.Select(d => new MyDto { QuoteId = d.QuoteId, RateId = d.RateId, Date = d.Date }).ToList();
You have to return your TableEntity-derived object from your TableQuery, but since all the unselected properties are null (from explicitly selecting only the columns you want) there is no additional data on the wire. You then project into your DTO so you can return exactly the object you need.
You do not need to inherit from the TableEntity class. You can use the TableEntity.Flatten method to create a DynamicTableEntity from your Subscription class and write it to table storage, and you can use the TableEntity.ConvertBack method to recompose your Subscription object when you read the DynamicTableEntity back from Azure table storage. These static helper methods are available in Azure Table Storage SDK version >= 8.0.0.
TableEntity.Flatten: https://msdn.microsoft.com/en-us/library/azure/mt775434.aspx
TableEntity.ConvertBack: https://msdn.microsoft.com/en-us/library/azure/mt775432.aspx
This eliminates the need to write converter classes between DTOs and business data models.
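A rough, untested sketch of how that might look, assuming Subscription is now a plain POCO (no TableEntity base) with its own key properties, and that table is your CloudTable reference:
// Write: flatten the POCO into entity properties and wrap them in a DynamicTableEntity.
var flattened = TableEntity.Flatten(subscription, new OperationContext());
var entity = new DynamicTableEntity(subscription.ApplicationName, subscription.LogName)
{
    Properties = flattened
};
table.Execute(TableOperation.InsertOrReplace(entity));

// Read: retrieve the DynamicTableEntity and recompose the POCO from its properties.
var retrieved = (DynamicTableEntity)table.Execute(
    TableOperation.Retrieve(subscription.ApplicationName, subscription.LogName)).Result;
var roundTripped = TableEntity.ConvertBack<Subscription>(
    retrieved.Properties, new OperationContext());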

Entity Framework Code First, nonempty setter or getter?

I am working with an EF Code First project, and all is well. I have a simple class, Customer. In my Customer class I have a field I want to encrypt (yes, I know I can encrypt at the DB level, but requirements dictate I encrypt at the domain/code level), so I am hoping that I can do something like the following:
public class Customer
{
    public int CustomerID { get; set; }

    private string _fieldToEncrypt;
    public string FieldToEncrypt
    {
        get { return _fieldToEncrypt; }
        set { _fieldToEncrypt = MyEncryptionFunction.Encrypt(value); }
    }
}
However, I assume that if the setter has a body, Entity Framework Code First may ignore that property when generating the schema. So my question is: is there a way to do EF Code First with provided getters/setters, or should I move this functionality into a constructor? Or should I override one of the methods/events that happens when the context is saving, instead?
EDIT ********************
As a note, I am using DataService to transmit the data over an OData protocol service. This automatically generates insert/update/select methods. Some of the suggestions require creating a second property, but the DataService class does not seem to pass through NotMapped properties. This throws a bit of a kink into my earlier question.
public class Customer
{
    public int CustomerID { get; set; }

    public string EncryptedField { get; private set; }

    [NotMapped]
    public string Field
    {
        get { return MyEncryptionFunction.Decrypt(EncryptedField); }
        set { EncryptedField = MyEncryptionFunction.Encrypt(value); }
    }
}
