I am new to Code First. I am interested in using Code First going forward on my projects. I am using EF 6.x. I will be creating several projects using an existing database but will be adding additional tables/views/stored procedures where necessary. Perhaps a silly question... Can I develop a library of POCOs that are tagged with the appropriate Fluent API tags and then pick and choose which of those tagged POCO library classes I want to include in the OnModelCreating method for a particular project? I'm interested in re-using the same POCOs from project to project. Is this what others are doing, or are they re-creating the POCOs in every project?
Thanks in advance,
Terry
You can certainly re-use the POCO classes between applications. If a class is not referenced directly by a DbSet property on your DbContext subclass, or indirectly by another class that is already referenced, it won't be included in the EF model.
You can use attributes (what I think you mean by tags) on the various POCO classes as long as those attributes are the same between all projects that will use them - e.g. column name etc.
For anything that changes between projects you'll definitely want to use Code First's fluent interface to configure it in OnModelCreating.
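For illustration, here is a minimal sketch of that split in EF 6; the Product class, the column name, and the StoreContext are hypothetical. Attributes that stay the same everywhere live on the shared POCO, while anything project-specific goes in each project's OnModelCreating.

```csharp
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;
using System.Data.Entity;

// Shared class library: the same POCO is reused by every project.
public class Product
{
    [Key]
    public int Id { get; set; }

    [Column("ProductName")]   // same column name in every project, so the attribute can live here
    public string Name { get; set; }

    public decimal Price { get; set; }
}

// Project-specific DbContext: only the POCOs it exposes become part of this project's model.
public class StoreContext : DbContext
{
    public DbSet<Product> Products { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Project-specific mapping, e.g. a different table name in this database.
        modelBuilder.Entity<Product>().ToTable("StoreProducts");
    }
}
```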
I am building a new ASP.NET Core 5 MVC app. I want to use clean architecture as outlined in Microsoft's web app architecture ebook.
I am also studying the eShopOnWeb sample application, available here:
https://github.com/dotnet-architecture/eShopOnWeb
What I understand from the ebook and sample app is that EF Core entity classes (say Customer, Product, Order) go inside the ApplicationCore project's Entities folder, and the DbContext goes in the Infrastructure project.
My confusion is: is it alright to add data annotation schema attributes such as [Table], [DatabaseGenerated], and [Key] on these entity classes inside the ApplicationCore project? If not, where should I add these data annotations?
Any advice in this regard is highly appreciated.
Thank you.
In the eShopOnWeb example the entities and their configuration are kept separate, which means that for a clean architecture you don't need annotations directly on the entities.
You can use the Fluent API instead, as the sample does in the Infrastructure/Data/Config directory.
So if you have a separate project for your DbContext, that is the best place to describe your entities with the Fluent API.
More information about the Fluent API:
https://learn.microsoft.com/en-us/ef/ef6/modeling/code-first/fluent/types-and-properties
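For illustration, a minimal sketch of how that separation looks in EF Core; the Customer entity, AppDbContext, and column details here are hypothetical. The entity stays free of data annotations in ApplicationCore, while the mapping class lives in Infrastructure.

```csharp
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Metadata.Builders;

// ApplicationCore/Entities/Customer.cs - plain entity, no EF attributes.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Infrastructure/Data/Config/CustomerConfiguration.cs - Fluent API mapping.
public class CustomerConfiguration : IEntityTypeConfiguration<Customer>
{
    public void Configure(EntityTypeBuilder<Customer> builder)
    {
        builder.ToTable("Customers");
        builder.HasKey(c => c.Id);
        builder.Property(c => c.Name).HasMaxLength(200).IsRequired();
    }
}

// Infrastructure/Data/AppDbContext.cs
public class AppDbContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Picks up every IEntityTypeConfiguration<T> in this assembly.
        modelBuilder.ApplyConfigurationsFromAssembly(typeof(AppDbContext).Assembly);
    }
}
```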
I am building an ASP.NET MVC application that is using Entity Framework 6. We have the challenge of building several implementations of this same application, so we have created a core library called MyApp.Core which contains the following:
DbContext
Models
    Customer
    Product
    (other models)
Repositories
We need to extend models for different implementations of the application. For example, we might want to put SomeProperty on the customer table for one customer and SomeOtherProperty for another.
How can we improve the structure so it doesn't break the EF Code First migrations or cause any other issues?
Should we have a separate ASP.NET project for each customer that references MyApp.Core? And should we reference it via a NuGet package, or something else like a git submodule?
Any suggestions on the organization of the custom implementations of this type of structure?
Your solution may be an "is-a" database structure that can be created from your Code First model using the TPC (table-per-concrete-type) method described here:
http://weblogs.asp.net/manavi/inheritance-mapping-strategies-with-entity-framework-code-first-ctp5-part-3-table-per-concrete-type-tpc-and-choosing-strategy-guidelines
You might create an abstraction such as ICustomer, or CustomerBase which contains a reference to a table containing the implementations of your application (is that Product?).
You might also transform your Customer class into a base class, and other classes with additional fields would inherit from it.
In either event, the article deals with bringing code first into a database model that can handle this kind of thing. You may also want to take a look at multi-tenant architecture, just to say you've done your homework. That's here: https://msdn.microsoft.com/en-us/library/aa479086.aspx
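For illustration, a minimal EF 6 sketch of the TPC approach the article describes; the CustomerBase, AcmeCustomer, and AcmeContext names are hypothetical. Each implementation's subclass gets its own table containing the inherited columns, so its extra properties (and migrations) stay out of the shared core.

```csharp
using System.Data.Entity;

// Shared core library: common properties for every implementation.
public abstract class CustomerBase
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Implementation-specific project: adds its own extra column(s).
public class AcmeCustomer : CustomerBase
{
    public string SomeProperty { get; set; }
}

public class AcmeContext : DbContext
{
    public DbSet<AcmeCustomer> Customers { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Table-per-concrete-type: map the inherited properties into the
        // subclass's own table, so this implementation owns its schema.
        modelBuilder.Entity<AcmeCustomer>().Map(m =>
        {
            m.MapInheritedProperties();
            m.ToTable("Customers");
        });
    }
}
```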
I am using the database-first approach with Entity Framework 6 and generating POCO objects using T4 templates to use them in my WCF service application.
When creating new instances of the POCO objects on the server side, if there are any missing required fields I can, as a last resort, call GetValidationResult() on DbContext.Entry to get any validation errors. However, I would like to be able to handle these on the client end by emitting the data annotations on the POCO objects, since I am sharing the POCOs between the server and client applications.
I was thinking of adding buddy classes to define the metadata and creating partial classes to attach them. I was wondering if there are any T4 templates that would help me generate these partial classes with the metadata, since there are a lot of entities, and so that whenever the tables are updated the metadata classes can be regenerated easily instead of going through each object and updating it by hand.
Please let me know whether my thought process is heading in the right direction, and suggest any other approaches.
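For reference, a minimal sketch of the buddy-class pattern described above; the Customer entity and its properties are hypothetical stand-ins for the T4-generated classes.

```csharp
using System.ComponentModel.DataAnnotations;

// Stand-in for the T4-generated half of the entity (normally in the generated file).
public partial class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

// Hand-written (or template-generated) partial that attaches the buddy class.
// Lives in a separate file so it survives regeneration of the .tt output.
[MetadataType(typeof(CustomerMetadata))]
public partial class Customer
{
}

// Buddy class carrying the validation attributes for the generated properties.
public class CustomerMetadata
{
    [Required]
    [StringLength(100)]
    public string Name { get; set; }

    [Required]
    [EmailAddress]
    public string Email { get; set; }
}
```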
All,
We are using EF as our primary data access technology. Like many apps out there, we have a business objects/domain layer. This layer talks to our repository, which, in turn, talks to EF.
My question is: What is the best mechanism for passing the data back and forth to/from EF? Should we use the EF-generated entity classes (we did DB-first development, so we have entity classes that EF generated), create our own DTOs, use JSON or something else?
Of course, I could make an argument for each of these, as well as a counter-argument against them. I'm looking for opinions based on experience building a non-trivial application using a layered architecture and EF.
Thanks,
John
I would use POCOs and use them with EF. You can still do that with the DB first approach.
The main benefit is that your business objects will not be tied to any data access technology.
Your underlying storage mechanism can, and will, change but your POCOs remain. All that business logic is easily re-used and tested.
As you're looking for cons, I would say it might take longer. However, that cost is well worth it.
With T4 templates I put the actual EF-generated entities in a common project that is referenced by all other projects. I use the EF database-first models throughout the entire application (including as view models). If I need to add additional properties to an entity that are not in the database, I just extend the entity's partial class in the common project. I have written dozens of large n-tier applications using this model and it has worked great.
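For illustration, a minimal sketch of extending a generated entity this way; the Customer entity and its properties are hypothetical.

```csharp
using System.ComponentModel.DataAnnotations.Schema;

// Stand-in for the T4-generated half of the entity (normally in the generated file).
public partial class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string CustomerNumber { get; set; }
}

// Hand-written half, in a separate file in the same common project,
// so it survives regeneration of the .tt output.
public partial class Customer
{
    [NotMapped]   // not a database column; convenience property for views
    public string DisplayName
    {
        get { return string.Format("{0} ({1})", Name, CustomerNumber); }
    }
}
```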
I will be using the Solr search server with my ASP.NET 4.5 application. I've already installed Solr on my Windows 8 laptop. According to this SolrNet documentation, I need to use specific attributes on my POCOs.
The thing is that I am using Entity Framework and my classes are auto-generated. Is there a way to assign those kinds of Solr attributes and also make sure they are persistent and won't be erased, if your suggested solution is based on editing the template (.tt) file?
I want to use Entity Framework, but if it is not possible I will just copy the POCOs and create the classes myself with those attributes. But I would prefer a solution that allows me to use SolrNet with Entity Framework. Thanks.
I would suggest that you create separate classes that map to your Solr index schema, since typically the structure of your EF classes and your index schema will not be identical. This way you have a clean separation between your persistence classes (those auto-generated by EF) and your index mapping classes, and you can control how the mapping between the two occurs. I recommend using AutoMapper to translate your objects from EF to Solr and back again as needed.
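For illustration, a minimal sketch of that separation, assuming a hypothetical EF-generated Product entity, SolrNet's mapping attributes, and AutoMapper's instance-based API.

```csharp
using AutoMapper;
using SolrNet.Attributes;

// Stand-in for the EF-generated entity (normally produced by the .tt template).
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

// Index document: mirrors the Solr schema, not the database schema,
// so the generated EF classes are never edited.
public class ProductSolrDocument
{
    [SolrUniqueKey("id")]
    public int Id { get; set; }

    [SolrField("name")]
    public string Name { get; set; }

    [SolrField("price")]
    public decimal Price { get; set; }
}

// Translate the EF entity into the index document before sending it to Solr.
public static class SolrMapping
{
    private static readonly IMapper Mapper = new MapperConfiguration(cfg =>
        cfg.CreateMap<Product, ProductSolrDocument>()).CreateMapper();

    public static ProductSolrDocument ToSolrDocument(Product entity)
    {
        return Mapper.Map<ProductSolrDocument>(entity);
    }
}
```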