I have a SQLite table in Xamarin (native Android / PCL):
[Table("Customer")]
public class Customer
{
[PrimaryKey, AutoIncrement]
public int Id { get; set; }
public Address Address { get; set; }
}
"Address" represents a second table.
1) Is it possible to automatically create the "Address" table when I call
connection.CreateTable<CustomerDto>();
because it is its dependency?
2) Is it possible to use a LINQ expression which automatically maps the correct "Address" to this "Customer"?
In my .NET Standard library I'm using:
"sqlite-net": "1.0.8"
"sqlite-net-pcl": "1.3.1"
My approach was to create "initial state models" of all the tables, marked as abstract (so there is no risk that somebody could instantiate them), defining only the fields necessary in the database and the primary keys (GUIDs in my case); these are used only to create the tables at the beginning. Subsequent modifications to the data structures are always done with ALTER statements.
In another namespace I duplicated all the models, this time with getters/setters and other utilities, and used these as the "real" models.
To represent linked models I used one property for the Id and another for the model itself (refreshed when necessary):
public int IdAddress { get; set; }
[Ignore]
public Address Address { get; set; }
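To make this concrete, here is a minimal sketch of how I wire it up with sqlite-net (the connection and variable names are illustrative):
// sqlite-net will not create "Address" just because "Customer" references it,
// so both tables are created explicitly:
conn.CreateTable<Address>();
conn.CreateTable<Customer>();
// Loading a customer and "refreshing" its linked address by hand:
var customer = conn.Find<Customer>(customerId);
customer.Address = conn.Find<Address>(customer.IdAddress);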
I don't think sqlite-net can do what you are asking because it's a very lightweight ORM, and even if it could, I prefer not to automate too much because of my past experiences with Hibernate.
https://github.com/praeclarum/sqlite-net
https://components.xamarin.com/view/sqlite-net
It sounds like you should look at using Entity Framework, because that will allow you to use LINQ with SQLite. The standard library on the web (not Entity Framework) is very light and doesn't have much of the ORM-like functionality you are looking for.
If you're looking for a more lightweight library, you can use this, but it will not allow you to write LINQ expressions without writing your own ORM:
https://github.com/MelbourneDeveloper/SQLite.Net.Standard
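For illustration, a rough sketch of what the Entity Framework route could look like against SQLite; this assumes EF Core with the Microsoft.EntityFrameworkCore.Sqlite provider rather than the original sqlite-net packages:
using System.Linq;
using Microsoft.EntityFrameworkCore;

public class AppDbContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
    public DbSet<Address> Addresses { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlite("Data Source=app.db");
}

// e.g. somewhere in start-up or query code: both tables are created from the
// model, and LINQ loads the related Address through the navigation property.
using (var db = new AppDbContext())
{
    db.Database.EnsureCreated();
    var customers = db.Customers.Include(c => c.Address).ToList();
}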
Related
I have a SQL Server based ASP.NET MVC 5 app, and I'm using Entity Framework 6 to talk to the database.
We're using a "hybrid" approach - we manage all the database structure with classic SQL scripts which we deploy onto our DB server, and then we generate the "code-first" classes from that SQL Server database. This works reasonably well, for the most part.
One thing that bugs me is that if a given table has multiple FK links to another table, the naming convention used by the EF6 code generation is pretty lame....
Assume I have a table (and therefore entity) Site which represents a site somewhere, and this site has three links to a Contact table for various roles - the "main" contact, the "support" contact, and a "sales" contact. So my table in SQL Server looks something like this:
CREATE TABLE dbo.Site
(
SiteID INT NOT NULL
CONSTRAINT PK_Site PRIMARY KEY CLUSTERED,
.... some other properties, of no interest or relevance here .....
MainContactId INT NOT NULL
CONSTRAINT FK_Site_MainContact FOREIGN KEY REFERENCES dbo.Contact(ContactId),
SalesContactId INT NOT NULL
CONSTRAINT FK_Site_SalesContact FOREIGN KEY REFERENCES dbo.Contact(ContactId),
SupportContactId INT NOT NULL
CONSTRAINT FK_Site_SupportContact FOREIGN KEY REFERENCES dbo.Contact(ContactId)
)
I had been hoping that the EF6 code-first from existing database generation would be smart enough to read those column names and come up with meaningful names for the navigation properties on the entity - but alas, this is what I get instead:
[Table("Site")]
public partial class Site
{
[DatabaseGenerated(DatabaseGeneratedOption.None)]
public int SiteID { get; set; }
public int MainContactId { get; set; }
public int SalesContactId { get; set; }
public int SupportContactId { get; set; }
public virtual Contact Contact { get; set; }
public virtual Contact Contact1 { get; set; }
public virtual Contact Contact2 { get; set; }
}
While the actual FK columns are OK, the "deduced" navigation properties are horrible - Contact, Contact1 and Contact2 - seriously, is this the best naming??? I think not... I would much prefer to call them "MainContact", "SalesContact", "SupportContact" - wouldn't that make a lot more sense? And be clearer for later use?
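For reference, this is roughly what I end up writing by hand after every regeneration (using the standard [ForeignKey] data annotation; the property names are simply my preference, not anything the generator produces):
[ForeignKey("MainContactId")]
public virtual Contact MainContact { get; set; }
[ForeignKey("SalesContactId")]
public virtual Contact SalesContact { get; set; }
[ForeignKey("SupportContactId")]
public virtual Contact SupportContact { get; set; }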
I installed the custom T4 templates (Nuget package "EntityFramework.CodeTemplates.CSharp"), and I see that there are a few interesting and potentially very useful helper classes being used (CSharpCodeHelper, EdmHelper, EntitySet, DbModel from the Microsoft.Data.Entity.Design and System.Data.Entity.Infrastructure namespaces) - unfortunately, most of them are sparsely documented, and also often their constructors are internal - so I cannot really build my own tool based on those ready-made classes.
Any other approach? I'd really like to teach the code generation a few smarts - this right now is just not up to usual standards and requires me to make a lot of manual changes to generated files - a labor in vain, flushed down the digital toilet each time I need to re-generate the classes from the database.....
I have a code-first EF model and I want to use native SQL for the more complex select statements.
When I try to execute:
using (VaultsDbContext db = new VaultsDbContext())
{
var contracts = db.Contracts.SqlQuery("select * from Contracts").ToList<Contract>();
}
I get:
Cannot create a value for property 'MetaProps' of type
'DskVault.Models.DbModels.MetaProps'. Only properties of primitive or
enumeration types are supported.
MetaProps is a class that holds deleteflag, creator, etc., and it's a property of all my classes. It's not mapped to a different table; every table has deleteflag, creator, etc.
public class Contract
{
public long Id { get; set; }
...
public MetaProps MetaProps { get; set; }
}
Is there a way to map from the native SQL to the class if the class contains a complex type, or does EF not support that? Also, what if the complex type is an entity mapped to another table (join)?
Edit:
Version: Entity Framework 6
I know from experience not all the fields in your table have to be contained in your model. This is a good thing when it comes to installing updates into production.
Have you tried reverse engineering your tables on a SEPARATE temporary project using the Entity Framework Power Tools? This is a NuGet package that I have found to be extremely useful in code-first programming. Reverse engineering will overwrite existing files, so make sure not to do this on your live code.
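If you need the raw SQL path regardless, one common workaround is to query into a flat shape via Database.SqlQuery<T> and rebuild the complex type yourself. A sketch (the row class, column names and MetaProps property names here are assumptions):
// using System.Collections.Generic; using System.Linq;
public class ContractRow
{
    public long Id { get; set; }
    public bool DeleteFlag { get; set; }     // assumed column names
    public string Creator { get; set; }
}

public static List<Contract> LoadContracts()
{
    using (var db = new VaultsDbContext())
    {
        // Database.SqlQuery<T> can materialize a plain class with only
        // primitive properties, so MetaProps is reassembled by hand:
        return db.Database
                 .SqlQuery<ContractRow>("select * from Contracts")
                 .ToList()
                 .Select(r => new Contract
                 {
                     Id = r.Id,
                     MetaProps = new MetaProps { DeleteFlag = r.DeleteFlag, Creator = r.Creator }
                 })
                 .ToList();
    }
}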
Say I have the below entities. (Heavily Simplified for brevity, but the key properties are included)
public class Crime
{
[Key]
public int CrimeId {get;set;}
public virtual ICollection<Victim> Victims {get;set;}
public virtual CrimeDetail Detail {get;set;}
}
public class Victim
{
[Key]
public int VictimId {get;set;}
public string VictimCategory {get;set;}
}
public class CrimeDetail
{
[Key]
public int id {get;set;}
public string DetailText {get;set;}
}
I have 600,000+ of these records to insert into SQL Server 2008 Express R2, which takes quite some time using Entity Framework 4.4.
Ideally I'd like to use SqlBulkCopy to batch insert these records, but since that doesn't support complex types (at least not out of the box), I'm trying to find some other potential solutions.
I'm open to changing my model if necessary.
EDIT: would the AsDataReader Extension method from MSDN help in this instance?
When we had the same issue we ended up with a code-first generated database from EF and strongly typed generated DataSets to be used with SqlBulkCopy.
(We never really coded those classes; they were generated with the xsd utility from the XSD definition of a 1-10 GB XML file. I'm trying to recall whether we ever skipped generating the typed DataSets from the same XSD, but that seems irrelevant to your issue.)
Depending on how you are getting those 600k+ records, you can either change the code to use the generated strongly-typed DataSets, or use some object-to-object mapper to map your EF POCO objects to the DataSets, since the properties are going to be named the same.
Here is a link on generating strongly typed datasets.
Here is an example of how to use SqlBulkInsert.
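For the POCO-to-SqlBulkCopy direction, a rough sketch of the DataTable route (crimes and connectionString stand in for your data and connection string; the destination table and FK column names are assumptions about the underlying schema, and the Victim and CrimeDetail tables would be copied the same way):
// using System.Data; using System.Data.SqlClient;
var table = new DataTable();
table.Columns.Add("CrimeId", typeof(int));
table.Columns.Add("CrimeDetailId", typeof(int));   // assumed FK column to CrimeDetail

// Flatten the entities into rows that mirror the Crime table:
foreach (var crime in crimes)
    table.Rows.Add(crime.CrimeId, crime.Detail.id);

using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.Crime";
    bulk.BatchSize = 5000;
    bulk.WriteToServer(table);
}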
I am currently developing a web service which provides basic CRUD operations on business objects. The service will be used by legacy applications which currently use direct database access.
I decided to use ServiceStack instead of WCF due to ServiceStacks great architecture.
However, now I am trying to decide whether to use OrmLite, NHibernate or Entity Framework to access the existing legacy database.
Requirements for the ORM are as follows:
Support for joins
Support for stored procedures
I already tried OrmLite (as it's fast and already included with ServiceStack). The only way I managed to join two tables was by using SQL (not an option). Is there any better way?
// #stackoverflow: This is my POCO DTO
public class Country
{
public long Id { get; set; }
public string Alpha2 { get; set; }
public string Alpha3 { get; set; }
public string ShortText { get; set; }
public string LongText { get; set; }
}
public class CountryRepository : ICountryRepository
{
// #stackoverflow: This is the query to join countries with translated names stored in another table
private const string CountriesSql =
#"SELECT C.Id, C.Alpha2, C.Alpha3, L.ShortText, L.LongText FROM COUNTRY AS C INNER JOIN LOCALIZATION AS L ON C.LocId = L.Id WHERE (L.Lang_Id = {0})";
private const string CountrySql = CountriesSql + " AND C.Id={1}";
private IDbConnection db;
public IDbConnectionFactory DbFactory { get; set; }
private IDbConnection Db
{
get { return db ?? (db = DbFactory.Open()); }
}
public List<Country> GetAll()
{
return Db.Select<Country>(CountriesSql, 0);
}
public Country GetById(long id)
{
return Db.SingleOrDefault<Country>(CountrySql, 0, id);
}
}
The example above shows one of the simple business objects. Most others require Insert, Update, Delete, multiple Joins, and Read with many filters.
If all you need are joins (lazy-loading or eager loading) and stored procedure support, and you want to get set up quickly, then Entity Framework and NHibernate are great options. Here is a cool link about Entity Framework and the repository and unit of work pattern. http://blogs.msdn.com/b/adonet/archive/2009/06/16/using-repository-and-unit-of-work-patterns-with-entity-framework-4-0.aspx
If you are very concerned with performance and want more control over how your classes will look (i.e. POCOs) and behave, then you can try something more lightweight like OrmLite or Dapper. These two are just thin wrappers with fewer features, but they will give you the best performance and the most flexibility -- even if that means writing some SQL every once in a while.
You can also use hybrid approaches. Don't be afraid to mix and match. This will be easiest when using POCOs.
I think the important thing is to code for your current database and current needs. However, to do so using proper interfaces so if the time came to switch to a different database or storage mechanism then you simply have to create a new data provider and plug it in.
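A minimal sketch of that interface idea, reusing the Country types from the question (names are illustrative):
public interface ICountryRepository
{
    List<Country> GetAll();
    Country GetById(long id);
}

// The consuming service depends only on the interface, so the storage
// technology behind it (OrmLite today, EF or NHibernate tomorrow) can be
// swapped by registering a different implementation:
public class CountryService
{
    private readonly ICountryRepository repository;
    public CountryService(ICountryRepository repository) { this.repository = repository; }
    public List<Country> GetAll() { return repository.GetAll(); }
}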
OrmLite supports primitive join functions using expressions. The new JoinSqlBuilder class can help with this. For SPs, I have added a new T4 file to generate corresponding C# functions. Currently the SP generation code supports SQL Server alone; if you are using any other DB, you can easily add support for it.
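Usage looks roughly like the sketch below; treat it as an approximation and check the OrmLite source for the exact signatures. The Localization POCO, and a LocId column on Country, are assumptions based on the SQL in the question:
var joinSql = new JoinSqlBuilder<Country, Country>()
    .Join<Country, Localization>(c => c.LocId, l => l.Id)
    .ToSql();
var countries = Db.Select<Country>(joinSql);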
You might consider LLBLGen Pro -- it's got great support for database first design and also has designer tools that speed up getting started if you use nHibernate or EF. But it is $$.
http://llblgen.com
As a follow-up to this, Matt Cowan has created an AWESOME template generator for building this sort of thing with LLBLGen. Check out the blog post here:
http://www.mattjcowan.com/funcoding/2013/03/10/rest-api-with-llblgen-and-servicestack/
and demo here:
http://northwind.mattjcowan.com/
The demo is entirely autogenerated!
Also check this comparison from an OO perspective between NHibernate 3.x and Entity Framework 5/6
http://www.dennisdoomen.net/2013/03/entity-framework-56-vs-nhibernate-3.html
I have the following in Entity Framework.
Table - Country
Fields
Country_ID
Dialing_Code
ISO_Alpha2
ISO_Alpha3
ISO_Full
I would like to map only selected fields from this entity model to my domain class.
My domain model class is
public class DomainCountry
{
public int Country_ID { get; set; }
public string Dialing_Code { get; set; }
public string ISO_3166_1_Alpha_2 { get; set; }
}
The following will work; however, insert or update is not possible. In order to get insert or update we need to use ObjectSet<>, but that is not supported in my case.
IQueryable<DomainCountry> countries =
context.Countries.Select(
c =>
new DomainCountry
{
Country_ID = c.Country_Id,
Dialing_Code = c.Dialing_Code,
ISO_3166_1_Alpha_2 = c.ISO_3166_1_Alpha_2
});
Is there a nice solution for this? It would be really fantastic.
Ideally it would be some kind of proxy class which supports all the features but is highly customizable.
That is, only the columns we want to expose to the outer world.
The term for "plain .NET classes" is POCO - plain old CLR objects (inspired by POJO, plain old Java objects).
Read this blog post series, it helped me a lot:
http://blogs.msdn.com/b/adonet/archive/2009/05/21/poco-in-the-entity-framework-part-1-the-experience.aspx
I want to do the same thing. My goal is to build a WCF service that can use the same set of objects as the application I'm building, by sharing a DLL and sending/receiving the same classes. Additionally, I want to limit what fields are exposed. After thinking about this for a while, it seems a user-defined cast might do the trick. Have a look to see if it works for you.
http://www.roque-patrick.com/windows/final/bbl0065.html
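A minimal sketch of the user-defined cast idea, defined on the domain class (the entity property names follow the projection shown in the question):
public class DomainCountry
{
    public int Country_ID { get; set; }
    public string Dialing_Code { get; set; }
    public string ISO_3166_1_Alpha_2 { get; set; }

    // Conversion from the EF entity; only the exposed columns are copied.
    public static explicit operator DomainCountry(Country c)
    {
        return new DomainCountry
        {
            Country_ID = c.Country_Id,
            Dialing_Code = c.Dialing_Code,
            ISO_3166_1_Alpha_2 = c.ISO_3166_1_Alpha_2
        };
    }
}

// EF cannot translate the cast inside a LINQ-to-Entities query, so
// materialize the entity first, e.g.:
// var domain = (DomainCountry)context.Countries.First(x => x.Country_Id == id);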