We're currently trying SQLite-Net Extensions (PCL) as an ORM.
We're wondering whether the mapping is supposed to build a single SELECT with INNER JOINs on the children when they are correctly configured in the entity?
public class Project
{
    [PrimaryKey]
    public long Id { get; set; }

    [ForeignKey(typeof(EnterpriseClient))]
    public long EnterpriseClientId { get; set; }

    [ManyToOne]
    public EnterpriseClient EnterpriseClient { get; set; }

    [OneToMany(CascadeOperations = CascadeOperation.All)]
    public List<WorkOrderHead> WorkOrderHeads { get; set; }
}
If we get all the Projects with GetAllWithChildren:
var x = _db.GetAllWithChildren<Project>(p => true);
Our result is multiple SELECTs, one per child (EnterpriseClient), and we were hoping it would end up as one SELECT with a JOIN that collects all the data at once.
Is our configuration wrong, or is it supposed to work that way?
Right now SQLite-Net Extensions performs a SELECT for each property to be fetched, so it suffers from the N+1 issue in read operations (this is already solved for write operations). It's implemented as a very thin layer over SQLite.Net that provides some convenience methods for accessing entity relationships, so the behavior you describe is currently intended. Accessing records by primary key or an indexed property is very fast, and performance is not an issue for the small databases used in most mobile projects.
SQLite-Net Extensions is an evolving project, so feature requests (and pull requests, of course) are always welcome. However, INNER JOINs would break the SQLite.Net mapping, so a single SELECT returning all the required information would require re-implementing the SQLite.Net mapping mechanism.
It is theoretically possible to work around the N+1 issue by performing a single SELECT per relationship property, so recursive to-many operations would see a performance improvement. I've created an issue to keep track of this request.
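In the meantime you can avoid the per-project lookups yourself by reading each table once and stitching the relationship together in memory. This is only a sketch against the classes from the question; it assumes EnterpriseClient exposes an Id primary key and that _db is the SQLiteConnection you already have.
var projects = _db.Table<Project>().ToList();           // one SELECT for all projects
var clients = _db.Table<EnterpriseClient>().ToList()    // one SELECT for all clients
                 .ToDictionary(c => c.Id);              // assumes an Id primary key
foreach (var project in projects)
{
    EnterpriseClient client;
    if (clients.TryGetValue(project.EnterpriseClientId, out client))
        project.EnterpriseClient = client;
}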
Happy coding!
Related
I have a question regarding the setup of foreign keys in Entity Framework 6. Our project stores data from a few other services (to have faster access to the data) and provides the users with charts and statistics based on the stored data. For storing the data we've set up a cron job which runs daily at about 3 AM.
Here are 2 example database models:
public class Project {
    public string Id { get; set; }
    public string Title { get; set; }
}

public class Issue {
    public string Id { get; set; }
    public string Title { get; set; }

    [ForeignKey("Project")]
    public string ProjectId { get; set; }

    [ForeignKey("ProjectId")]
    public Project Project { get; set; }
}
The problem now is that for some issues we don't save the project they depend on, but we still have to save the ProjectId (because at a later point the project might exist in our database). So when I try to save these issues it tells me that I can't save them because the project does not exist.
Is there any way I can tell Entity Framework that it doesn't matter whether the project exists or not? Currently I've just removed the foreign keys, but this makes it very slow when I try to get the full list of issues with their projects.
Or is there any other way to read out all issues with their projects if there are no foreign keys? Currently I'm using a foreach loop to go through each issue and then I search for the project, but with more than 10,000 issues this gets very slow.
The navigation property you've defined requires the data in the Project table in order to save an Issue. This isn't an Entity Framework limitation; it's a SQL Server foreign key constraint issue. Entity Framework is doing preliminary validation so it doesn't waste a connection that will ultimately fail. While you can turn off enforcement of the foreign key constraint in the database, there is no good way to turn this validation off in Entity Framework.
Keep in mind that having a foreign key does not mean it will help your query's performance; it's simply a way to enforce referential integrity. I suspect that your real problem is the way you've written your query. Without seeing your query and metrics around "slow", it is hard to point you in the right direction.
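That said, the foreach lookup you describe is a classic N+1 pattern; a single LINQ left join over the stored ProjectId removes it and does not need a foreign key or navigation property at all. This is only a sketch; the MyContext, Issues and Projects names are assumptions rather than taken from your post.
using (var ctx = new MyContext())
{
    // One round trip: left-join issues to projects on the stored ProjectId.
    var issuesWithProjects =
        (from i in ctx.Issues
         join p in ctx.Projects on i.ProjectId equals p.Id into gj
         from p in gj.DefaultIfEmpty()             // keep issues whose project is missing
         select new { Issue = i, Project = p }).ToList();
}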
I've got a SQLite table in Xamarin (native Android / PCL):
[Table("Customer")]
public class Customer
{
[PrimaryKey, AutoIncrement]
public int Id { get; set; }
public Address Address{ get; set; }
}
"Address" represents a second table.
1) Is it possible to automatically create the "Address" table when I call
connection.CreateTable<Customer>();
because it is its dependency?
2) Is it possible to use a LINQ expression which automatically maps the correct "Address" to this "Customer"?
In my .NET Standard library I'm using:
"sqlite-net": "1.0.8"
"sqlite-net-pcl": "1.3.1"
My approach was to create "initial state models" of all the tables, marked as abstract (so there is no risk of anybody instantiating them), defining only the fields required in the database and the primary keys (GUIDs in my case). These are used only to create the tables at the beginning; subsequent changes to the data structures are always made with ALTER statements.
In another namespace I duplicated all the models, this time with getters/setters and other utilities, and I use these as the "real" models.
To represent linked models I use one field for the id and another one for the model itself (refreshed when necessary):
public int IdAddress { get; set; }
[Ignore]
public Address Address { get; set; }
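Refreshing the linked model is then just a primary-key lookup; a minimal sketch, assuming Address has an int primary key and connection is your SQLiteConnection:
// Find<T> returns the row with the given primary key, or null if it doesn't exist.
customer.Address = connection.Find<Address>(customer.IdAddress);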
I don't think sqlite-net can do what you are asking because it's a very lightweight ORM, and even if it could, I prefer not to automate too much because of my past experiences with Hibernate.
https://github.com/praeclarum/sqlite-net
https://components.xamarin.com/view/sqlite-net
It sounds like you should look at using Entity Framework, because that will allow you to use LINQ with SQLite. The standard library (not Entity Framework) is very light and doesn't have much of the ORM-like functionality you are looking for.
If you're looking for a more lightweight library, you can use this, but it will not allow you to write LINQ expressions without writing your own ORM:
https://github.com/MelbourneDeveloper/SQLite.Net.Standard
While using EF with WCF I came across the situation where I need to map entities to data contracts and vice versa, because EF objects are burdened with additional data provided by EF. So I tried a few functions for the mapping.
[DataContract]
public class WebsitesD
{
    [DataMember]
    public int Id { get; set; }

    [DataMember]
    public string Domain { get; set; }

    [DataMember]
    public string UserId { get; set; }

    [DataMember]
    public string Title { get; set; }
}
// Maps the EF entity to the WCF data contract.
private WebsitesD mapWebsite(Website w)
{
    WebsitesD wd = new WebsitesD();
    wd.Id = w.Id;
    wd.Title = w.Title;
    wd.UserId = w.UserId;
    wd.Domain = w.Domain;
    return wd;
}

// Reverse mapping (data contract -> EF entity), needed so the insert below compiles.
private Website mapWebsite(WebsitesD d)
{
    Website w = new Website();
    w.Id = d.Id;
    w.Title = d.Title;
    w.UserId = d.UserId;
    w.Domain = d.Domain;
    return w;
}

public int insertWebsite(WebsitesD d)
{
    try
    {
        using (MyInfoEntities entities = new MyInfoEntities())
        {
            entities.Websites.Add(mapWebsite(d));
            entities.SaveChanges();
            return 1;
        }
    }
    catch (Exception)
    {
        throw; // rethrow without resetting the stack trace
    }
}
Here WebsitesD is my data contract and Website is the entity object. With this I can achieve my objective, but the problem is that whenever I need to perform any database operation I need to do the mapping, which I think can be a costly operation.
Should I leave Entity Framework and go with ADO.NET, as I don't need to do any mapping there? Please suggest the pros and cons of each approach.
As with any ORM there is a trade-off between performance and developer productivity. As you said, ADO.NET will be the fastest way to fill your data contracts from a DataReader/DataSet; with EF/NHibernate you will always have this situation. However, mapping is not expensive for single entities; it becomes expensive when you map lists of entities. If you don't need mapping at all, you can also put [DataContract] on the entity classes and [DataMember] on the members WCF should send to the client, but when your EF code is regenerated after a schema change, those attributes are all wiped out.
You can also opt for the EF Code First approach.
Another approach, which involves less coding for the mapping, is to use AutoMapper.
Check this out.
Also, there is a good thread on ORM trade-offs here.
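For reference, AutoMapper turns the hand-written mapping into a one-time configuration. A minimal sketch against the classes above (using the MapperConfiguration API; the exact calls depend on the AutoMapper version you install, and existingEntity is just a placeholder):
using AutoMapper;

var config = new MapperConfiguration(cfg =>
{
    cfg.CreateMap<Website, WebsitesD>();   // entity -> data contract
    cfg.CreateMap<WebsitesD, Website>();   // data contract -> entity
});
var mapper = config.CreateMapper();

WebsitesD dto = mapper.Map<WebsitesD>(existingEntity);
Website entity = mapper.Map<Website>(dto);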
Do what's best for the code base. Consider maintainability and productivity. Servers are cheap and developers are expensive. Very few companies in the world are of such a scale that it is worth maintaining more complicated code rather than buying another server.
With EF 6, you can use the code generation item EF 6.x DbContext Generator with WCF Support.
Just right-click on the designer surface of the EDMX and go to Add Code Generation Item...
Click Online on the left and search for DbContext.
Using this template will auto-generate the DataContract and DataMember attributes on your classes.
Also, you'll probably want to delete the regular template generated with the EDMX, or you will end up with two sets of entities.
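To give an idea of the result (this is an illustration, not the template's literal output, which depends on the template version), the generated entities end up decorated roughly like this:
[DataContract(IsReference = true)]
public partial class Website
{
    [DataMember]
    public int Id { get; set; }

    [DataMember]
    public string Domain { get; set; }

    [DataMember]
    public string UserId { get; set; }

    [DataMember]
    public string Title { get; set; }
}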
I am using C# with Fluent NHibernate and auto mapping.
Here is some code (truncated for clarity), then I'll explain the problem.
public class Company
{
    public virtual string Description { get; set; }
}

public class Stock
{
    public virtual Product Product { get; set; }
    public virtual Company Company { get; set; }
}
Mapping
mappings.Conventions.Add<CascadeConvention>()
        .Conventions.Add<CustomForeignKeyConvention>()
        .Conventions.Add<HasManyConvention>()
        .Conventions.Add<VersionConvention>();
CascadeConvention just sets everything to All.
CustomForeignKeyConvention removes the _id suffix that NHibernate usually appends to foreign key id columns.
HasManyConvention sets all HasManys to inverse.
VersionConvention looks like this:
instance.Column("Version");
instance.Default(1);
The problem is that when I insert a new Stock record, NHibernate also updates the version number on the related Company.
If I had an IList<Stock> property on the Company then that would make sense but I don't.
I've done a lot of reading around:
NHibernate emitting extraneous update statements regardless of proper Inverse (fluent nhibernate) settings on relations
Cascade Save-Update in NHibernate and version column
NHibernate Reference - Chapter 17. Example: Parent/Child
Ayende # Rahien - NHibernate Mapping
From these, I've tried a whole bunch of things including adding .Not.OptimisticLock() all over the place. I even added an IList<Stock> property on Company so that I could specifically set it as Inverse, Not.OptimisticLock, etc. Nothing I do seems to make any difference.
We eventually sorted this out by moving to a session-per-request paradigm. I'm not sure why it was going wrong or why this fixed it; I wrote numerous unit tests trying to reproduce the behaviour in a more controlled environment, without success.
In any case, it works now, and there are good reasons session-per-request is often given as the best-practice way to manage NHibernate sessions in a web application.
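For anyone wiring this up, the usual session-per-request shape in an ASP.NET application looks roughly like the following. It's a sketch rather than our exact code; it assumes current_session_context_class is set to "web" and that SessionFactory is a singleton built at startup.
// In Global.asax.cs
protected void Application_BeginRequest(object sender, EventArgs e)
{
    // Open one ISession per HTTP request and bind it to the current context.
    var session = SessionFactory.OpenSession();
    NHibernate.Context.CurrentSessionContext.Bind(session);
}

protected void Application_EndRequest(object sender, EventArgs e)
{
    // Unbind and dispose the session when the request ends.
    var session = NHibernate.Context.CurrentSessionContext.Unbind(SessionFactory);
    if (session != null)
        session.Dispose();
}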
I am currently developing a web service which provides basic CRUD operations on business objects. The service will be used by legacy applications which currently use direct database access.
I decided to use ServiceStack instead of WCF due to ServiceStack's great architecture.
However, now I am trying to decide whether to use OrmLite, NHibernate or Entity Framework to access the existing legacy database.
Requirements for the ORM are as follows:
Support for joins
Support for stored procedures
I already tried OrmLite (as it's fast and already included with ServiceStack). The only way I managed to join two tables was by using SQL (not an option). Is there any better way?
// #stackoverflow: This is my POCO DTO
public class Country
{
    public long Id { get; set; }
    public string Alpha2 { get; set; }
    public string Alpha3 { get; set; }
    public string ShortText { get; set; }
    public string LongText { get; set; }
}

public class CountryRepository : ICountryRepository
{
    // #stackoverflow: This is the query to join countries with translated names stored in another table
    private const string CountriesSql =
        @"SELECT C.Id, C.Alpha2, C.Alpha3, L.ShortText, L.LongText FROM COUNTRY AS C INNER JOIN LOCALIZATION AS L ON C.LocId = L.Id WHERE (L.Lang_Id = {0})";

    private const string CountrySql = CountriesSql + " AND C.Id={1}";

    private IDbConnection db;

    public IDbConnectionFactory DbFactory { get; set; }

    private IDbConnection Db
    {
        get { return db ?? (db = DbFactory.Open()); }
    }

    public List<Country> GetAll()
    {
        return Db.Select<Country>(CountriesSql, 0);
    }

    public Country GetById(long id)
    {
        return Db.SingleOrDefault<Country>(CountrySql, 0, id);
    }
}
The example above shows one of the simple business objects. Most others require Insert, Update, Delete, multiple Joins, and Read with many filters.
If all you need are joins (lazy loading or eager loading) and stored procedure support, and you want to get set up quickly, then Entity Framework and NHibernate are great options. Here is a good link about Entity Framework and the repository and unit of work patterns: http://blogs.msdn.com/b/adonet/archive/2009/06/16/using-repository-and-unit-of-work-patterns-with-entity-framework-4-0.aspx
If you are very concerned with performance and want more control over how your classes will look (i.e. POCOs) and behave, then you can try something more lightweight like OrmLite or Dapper. These two are just thin wrappers with fewer features, but they will give you the best performance and the most flexibility, even if that means writing some SQL every once in a while.
You can also use hybrid approaches. Don't be afraid to mix and match. This will be easiest when using POCOs.
I think the important thing is to code for your current database and current needs. However, to do so using proper interfaces so if the time came to switch to a different database or storage mechanism then you simply have to create a new data provider and plug it in.
OrmLite supports primitive join functions using expressions. The new JoinSqlBuilder class can help with this. For stored procedures, I have added a new T4 file to generate corresponding C# functions. Currently the SP generation code supports SQL Server alone; if you are using any other database, you can easily add support for it.
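For the country/localization join from the question, JoinSqlBuilder usage would look roughly like this. Treat it as a sketch: the Localization POCO and a LocId property on Country are assumptions (they aren't shown in the question), and the method signatures are quoted from memory of the older OrmLite API, so double-check them against the version you're using.
// Assumed POCO mirroring the LOCALIZATION table referenced in the raw SQL.
public class Localization
{
    public long Id { get; set; }
    public long Lang_Id { get; set; }
    public string ShortText { get; set; }
    public string LongText { get; set; }
}

var join = new JoinSqlBuilder<Country, Country>()
    .Join<Country, Localization>(
        c => c.LocId,                              // source join column
        l => l.Id,                                 // destination join column
        c => new { c.Id, c.Alpha2, c.Alpha3 },     // columns taken from COUNTRY
        l => new { l.ShortText, l.LongText })      // columns taken from LOCALIZATION
    .Where<Localization>(l => l.Lang_Id == 0);

var countries = Db.Select<Country>(join.ToSql());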
You might consider LLBLGen Pro: it has great support for database-first design and also has designer tooling that speeds up getting started if you use NHibernate or EF. But it is $$.
http://llblgen.com
As a follow-up to this, Matt Cowan has created an AWESOME template generator for building this sort of thing with LLBLGen. Check out the blog post here:
http://www.mattjcowan.com/funcoding/2013/03/10/rest-api-with-llblgen-and-servicestack/
and demo here:
http://northwind.mattjcowan.com/
The demo is entirely autogenerated!
Also check out this comparison, from an OO perspective, between NHibernate 3.x and Entity Framework 5/6:
http://www.dennisdoomen.net/2013/03/entity-framework-56-vs-nhibernate-3.html