I have an application running in both production and development environments. I would like to utilize the databases better (and save money on my hosting bill), so I want my LINQ to SQL code to run against two different schemas instead of two different databases (there are ~15 tables per schema). How do I set this up in LINQ to SQL?
Or should I go the distance and read up on Entity Framework instead (and is it possible to segment the tables based on schemas in that one)?
Any other solutions to this problem are welcome too.
This is actually easier to do in LINQ to SQL than it is in EF. Not that it is terribly easy, mind you. I wrote a blog post a couple of years ago on how to do it, but the heart of it is to specify the mapping source in your context constructor:
XmlMappingSource source = XmlMappingSource.FromUrl("TestLINQ.map");
// Could also use XmlMappingSource.FromXml(string)
using (LINQ.TestLINQDataContext context = new LINQ.TestLINQDataContext(Properties.Settings.Default.TestConnectionString, source))
{
    // Query the context as usual; the external mapping file decides which schema each table maps to.
}
Using this method, you can alter your mapping source to point at whichever schema (or table name) you want.
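For illustration, a minimal sketch of switching the schema through XmlMappingSource.FromXml. The database, table, and column names here are invented, and the mapping must match the types your DataContext actually exposes:

// A minimal sketch, assuming a Customers table that exists in both a dev and a prod schema.
using System.Data.Linq.Mapping;

string schema = "dev"; // or "prod", chosen per environment
string mappingXml = @"
<Database Name=""TestDB"" xmlns=""http://schemas.microsoft.com/linqtosql/mapping/2007"">
  <Table Name=""" + schema + @".Customers"" Member=""Customers"">
    <Type Name=""Customer"">
      <Column Name=""Id"" Member=""Id"" IsPrimaryKey=""true"" />
      <Column Name=""Name"" Member=""Name"" />
    </Type>
  </Table>
</Database>";
XmlMappingSource source = XmlMappingSource.FromXml(mappingXml);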
I have searched back and forth but could not seem to get hold of what I need. I am sorry if this has already been answered; a redirect to that discussion would do me good.
This is the scenario. I have been instructed by my boss to move from Microsoft Visual FoxPro (MS is withdrawing support come 2015) to .NET C#. For the sake of a good foundation and the adoption of best practices, I decided to first learn, piece the pertinent information together, and then start coding. This is the second year.
We are a bureau company that offers payroll processing outsourcing services to over 50 clients. Each client currently has their own database. The databases have tables with completely identical structures.
I am a newbie. Totally new to .net world.
I started off with raw SQL using DataTables and DataReaders, but in my research I found discussions discouraging this. Many were of the view that Entity Framework should serve the purpose, though one is allowed to mix approaches, especially when complex queries are involved.
Can someone point me to some 'good read' on implementing Entity Framework with over 50 identical databases? Each database is totally independent and has nothing to do with any other. When the user logs in, they select which client they need to process payroll for, and EF then points to that database.
EF needs 2 different pieces of information to work with data from a database:
1) The database schema: This is included as compiled code in your application and cannot normally be changed at runtime.
2) The connection string: This is provided at runtime, normally from a config file.
In your case, all the databases have the same schema, so you can just model one database and it will work for all the others.
The piece you want to change is the connection string. This tells EF how to find the database and can be provided at runtime.
There is an overload of the DbContext constructor which takes a connection string as a parameter: MSDN: DbContext Constructor (String)
And there are even classes in the framework that help create connection strings for you:
MSDN: EntityConnectionStringBuilder Class
MSDN: Connection String Builders
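As a minimal sketch of how those pieces fit together (the context, entity, server, and database names below are assumptions for illustration, not from the question):

using System.Data.Entity;
using System.Data.SqlClient;
using System.Linq;

public class Employee
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class PayrollContext : DbContext
{
    // Accepts either a full connection string or a "name=..." reference to config.
    public PayrollContext(string nameOrConnectionString) : base(nameOrConnectionString) { }

    public DbSet<Employee> Employees { get; set; }
}

// At login time, build the connection string for whichever client database was selected.
var builder = new SqlConnectionStringBuilder
{
    DataSource = "PAYROLLSERVER",  // assumed server name
    InitialCatalog = "Client042",  // the database the user picked
    IntegratedSecurity = true
};

using (var context = new PayrollContext(builder.ConnectionString))
{
    int employeeCount = context.Employees.Count();
}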
It is very simple.
I had,
// WMSEntities is the connection string name in web.config
// and also the name of the Entity Framework model
public WMSEntities() : base("name=WMSEntities")
{
}
already in the autogenerated Model.Context.cs of the edmx folder.
To connect to multiple databases at runtime, I created another constructor that takes a connection string name as a parameter, like below, in the same file Model.Context.cs:
public WMSEntities(string connStringName)
: base("name=" + connStringName)
{
}
Then I added the other connection strings to Web.config, for example:
<add name="WMSEntities31" connectionString="data source=TESTDBSERVER_NAME;
initial catalog=TESTDB;user id=TestUser;password=TestUserPW" />
<add name="WMSEntities" connectionString="data source=TESTDBSERVER_NAME12;
initial catalog=TESTDB12;user id=TestUser12;password=TestUserPW12" />
Then, when connecting to the database, I call the method below, passing the connection string name as a parameter:
public static List<v_POVendor> GetPOVendorList(string connectionStringName)
{
    using (WMSEntities db = new WMSEntities(connectionStringName))
    {
        return db.v_POVendor.ToList();
    }
}
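Calling it then looks like this, using one of the connection string names registered above:

// Pulls vendors from the database behind the "WMSEntities31" entry in Web.config.
List<v_POVendor> vendors = GetPOVendorList("WMSEntities31");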
Hrmmm I happen to really like EF Code First but I'm not certain it suits what you're doing. How often does your schema change?
Should You Be Using EF?
Advantages of EF
If the schema changes somewhat regularly, the Migrations part of EF Code First might save you a lot of time and effort because you can often do away with SQL scripts for schema upgrades - schema changes end up in your source repository with the rest of your code instead. You'd start here:
https://stackoverflow.com/a/8909092/176877
I also happen to really like how easy EF is to setup, and how easy it is to write LINQ queries against it and return exactly the POCOs I built from the DB.
But EF might not be the best fit.
Other ORMs to consider
Many other ORMs support LINQ and POCOs, with better support for existing databases (there are things that can be pretty difficult to map in EF Code First) and existing support for asynchronous operation (EF was on 5.0 when this was first written; 6.0 added async). (Update: EF6 is now the latest and its async support is great. Its bulk delete is terrible, though, and should be avoided like the plague; drop to plain SQL for that.)
In particular, NHibernate is the beast on the scene for existing DB support, but it's a bit of a configuration chore, and what appears to be political infighting has caused the documentation to be conflicting across different versions and forks of it.
Much simpler are the many "Micro ORMs" - that link is to a short list from 2011, but if you poke around you'll find 30 or so in .NET. Some generate better or less optimal queries, some none at all, some make you write SQL (don't use those) - you'll have to poke around to decide which is for you. This can be a bigger research task, but I suspect the easy configuration and small learning curve of one of these best suits what you're trying to do.
Answer to your specific question
Talk to All client Dbs at once
If you're connecting to all 50 databases from one app at the same time you'll need to instantiate 50 DbContexts like:
var dbClient1 = new DbClient1();
var dbClient2 = new DbClient2();
Assuming you went around making little wrapper classes like:
public class DbClient1 : CoreDbContext
{
    public DbClient1()
        : base("DbClient1") // Means use the connection string named "DbClient1" in Web.Config
    {
    }
}
where CoreDbContext is the main EF class in your project that extends DbContext (a standard part of any EF project).
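A minimal sketch of that base class, assuming a hypothetical Employee entity standing in for your real model:

using System.Data.Entity;

public class Employee
{
    public int Id { get; set; }
}

public class CoreDbContext : DbContext
{
    // Each DbClientN wrapper passes its own Web.Config connection string name here.
    public CoreDbContext(string nameOrConnectionString)
        : base(nameOrConnectionString) { }

    public DbSet<Employee> Employees { get; set; }
}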
Talk to just one at a time
If you're using just the one per app then any EF tutorial will do.
The only major trick will be migrating those DBs when schema changes occur. There are two basic approaches. Either way, grab a backup and restore a copy of each locally so you can test your migrations against them (update-database -f -verbose). If you don't, you risk data errors - say, changing a column to NOT NULL and finding your local test instance had no NULLs, but one client's did; kaboom. Once you get them working, you're on to deciding how you want to update Production. There are a lot of ways you might do this, ranging from writing (or finding) a custom roll-forward/back tool with SQL scripts checked into git, to hiring a DBA, to something much simpler:
The Obvious - SQL Script
Dump the migration to SQL (update-database -script) and run it against the actual production database.
My Crazy Way for Small Numbers of Dbs
Add entries for each db to Web.Config, and create a Project Configuration for each of them like "DbDeployClient1," "DbDeployClient2," etc. In each of those make a build define like DbDeployClient1, and then add this to your DbContext class:
public CoreDbContext()
#if DbDeployClient1
    : base("DbDeployClient1")
#elif DbDeployClient2
    : base("DbDeployClient2")
// etc.
#endif
{
}
That allows you to quickly switch to your DbDeploy config and run the migration directly from Visual Studio against the target database. Obviously, if you're doing this you'll need to temporarily open a port on the actual SQL Server instance you're migrating, preferably only allowing in your IP. One nicety is that you get clear errors from your migration right there, and full rollback capability, without any real work - all that rollback support you're leveraging is just part of EF. And one dev can do it without a bunch of other bottlenecks. But it leaves plenty of room to reduce risk and improve automation.
Currently I'm adjusting a system that works with Entity Framework to connect to a SQL Server 2008 R2 database.
For the new part, the key users need to add, change, and remove entities that the normal users can then use. Before I build a system that saves objects with names and attributes, I wanted to see whether it is possible to create the database dynamically from the entities that the key users define (through a simplified entity designer).
I've searched around the internet a bit but didn't find anything quite like this. Maybe someone here knows something to push me in the right direction?
It sounds like you are best off really defining tables, columns, indexes and foreign keys dynamically. If you were to use a "database of databases" schema with entities and attributes, you would be unable to effectively index the database; queries become extraordinarily slow and nasty.
You can query and change the database schema using SQL Server Management Objects (SMO). I have used them multiple times. They work and are quite nice to work with.
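To give a flavor of SMO, here is a hedged sketch; the instance, database, table, and column names are all invented:

using Microsoft.SqlServer.Management.Smo;

// Connect to the instance and pick the target database.
var server = new Server(@".\SQLEXPRESS");           // assumed instance name
var database = server.Databases["DynamicEntities"]; // assumed database name

// Create a table from user-supplied metadata.
var table = new Table(database, "CustomerRequest");
table.Columns.Add(new Column(table, "Id", DataType.Int) { Nullable = false });
table.Columns.Add(new Column(table, "Title", DataType.NVarChar(200)));
table.Create(); // emits and executes the CREATE TABLE statement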
I'm not convinced that Entity Framework brings much to the table here. EF is good for expressing queries and DML on a static schema. If you were to use a dynamic schema you lose most of the benefits. Of course, some benefits remain such as entity key management and being able to use Entity SQL instead of T-SQL. On the downside you have to create all EF metadata at runtime (probably generate EDMX files or dynamic assemblies).
I think it is not worth it. I'd strongly consider building a database schema at runtime and executing queries against it using dynamically built T-SQL. It is much easier to do this than work against the system with EF.
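A sketch of that dynamically built T-SQL approach, assuming the table name is validated against your own metadata first (table names cannot be passed as SQL parameters, so never take them from raw user input; the values themselves stay parameterized):

using System.Data.SqlClient;

string connectionString = "Server=.;Database=DynamicEntities;Integrated Security=true;"; // assumed
string tableName = "CustomerRequest"; // must come from your metadata catalog, not the user

string sql = "SELECT * FROM [" + tableName + "] WHERE [Id] = @id";
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(sql, connection))
{
    command.Parameters.AddWithValue("@id", 42);
    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // materialize each row into whatever dynamic structure the UI needs
        }
    }
}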
In that sense you are back to DataTables and GridViews, which were considered good style even 5 years ago. It's probably not too bad.
I'm having trouble choosing an appropriate data access framework, partly because I'm very picky with my preferences and mostly because I don't have much experience with most of them :-)
I need a framework that will allow me to easily map between the DB tables (SQL Server) and my entities, and that will handle the CRUD operations for me (for the most part).
I want my entities to reside in a separate assembly from my DAL.
I prefer using attributes for the mappings over external file like XML.
It doesn't have to be an ORM, and I want to code my entities myself.
I don't mind writing stored procedures.
The project's database won't be very big. Less than 50 tables.
I'd like some of my entities to correspond to an inner join of two tables - one for static data entered manually during development, and the other with data filled in at runtime - without using two entities that reference one another (the result of this join will be a single entity).
Entity Framework sounded perfect until I realized it doesn't support Enums (yet - and I can't wait for EF 5.0).
I want these entities to include Enums, and plan on using lookup tables for the enums + code generation for the enum to keep it synchronized with the database.
Linq-to-SQL seems like a good candidate, but I don't know if it copes well with my previous demands.
Using Enterprise Library 5.0 DAAB with its RowMapper, and extending its abilities to perform updates and inserts, is also an option (but would require more coding on my part).
I plan on implementing the Repository Pattern.
How about NHibernate? Would it do? No experience there either.
I would be happy to hear all suggestions.. the more the merrier! Thanks in advance!
I think NHibernate is the way to go, although some of its main strengths (ORM, stored procedure generation, etc.) are things you listed as non-requirements. Anyway, NHibernate will do everything you want it to do. Technically it does use XML mappings, but these can easily be auto-generated using fluent mapping (e.g., Fluent NHibernate). I like this, as it IS done for you, but you get the customization too, just in case you need it. Good luck!
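For instance, a hedged sketch of a Fluent NHibernate class map (the Product entity and its columns are invented); the XML mapping is generated from this at configuration time:

using FluentNHibernate.Mapping;

public class Product
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

// Generates the .hbm.xml mapping for you; nothing to maintain by hand.
public class ProductMap : ClassMap<Product>
{
    public ProductMap()
    {
        Table("Products");
        Id(x => x.Id);
        Map(x => x.Name).Not.Nullable();
    }
}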
I'm working on an application that uses SQL Server and NHibernate. We have the concept of default data (complex entities) that needs to be created for each new entity. This data can be changed on a per-user basis. However, we're struggling with the best way to create this data.
For example, let's say my application has a Store entity which has several default Products that I want to create when a new Store gets created. Anything about a Product can be modified by the managers of each Store.
As I see it, there are two main options:
Keep the default data in code and write it to the database once the new entity is created.
Keep the default data in the database and move it over with a stored procedure/raw SQL when the entity is created.
Instinctively, I lean toward option two, since databases are great at moving and manipulating sets of data, and option one would require a ton of messy code that could get out of hand.
However, writing a stored procedure or raw SQL presents its own issues:
We would have to re-write the stored procedure or SQL depending on the database we're using
We would be subverting the ORM in a way (not sure if this is actually wrong). That is, we'd be moving data around without using NHibernate
I found this article by Ayende Rahien which outlines how to perform a bulk delete. I am thinking that doing something similar for inserting default data would be fine. I also found an nhibernate users groups post (called "Schema export and default data"--SO won't let me post two links) that describes a similar situation, but it doesn't seem like there's a consensus on what the right solution is (although Ayende does offer some feedback and suggests that the data live in the database).
After writing this, I'm leaning even more toward using a stored procedure; I'm just worried about the possible pitfalls of mixing two database access strategies (directly calling SProcs and using an ORM).
Any feedback is appreciated!
Edit: Removed "immutable" language. I'm specifically talking about default data that can change so I think this term was incorrect/confusing here.
I would create a default data service that creates that data in code, and use a factory to create your Store, having the factory use the default data service to generate the default entities.
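A minimal sketch of that shape (all type and member names here are illustrative, not from the question):

using System.Collections.Generic;

public class Store
{
    public Store() { Products = new List<Product>(); }
    public virtual string Name { get; set; }
    public virtual IList<Product> Products { get; protected set; }
}

public class Product
{
    public virtual string Name { get; set; }
}

// Owns the default catalog; the defaults live in code, not in a SQL script.
public class DefaultProductService
{
    public IEnumerable<Product> CreateDefaults()
    {
        yield return new Product { Name = "Starter Product A" };
        yield return new Product { Name = "Starter Product B" };
    }
}

public class StoreFactory
{
    private readonly DefaultProductService _defaults;
    public StoreFactory(DefaultProductService defaults) { _defaults = defaults; }

    public Store Create(string name)
    {
        var store = new Store { Name = name };
        foreach (var product in _defaults.CreateDefaults())
            store.Products.Add(product);
        return store; // persist with session.Save(store) as usual
    }
}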
Using a Stored Procedure definitely defeats the point of having an ORM.
I am trying to leverage ORM given the following requirements:
1) Using .NET Framework (latest Framework is okay)
2) Must be able to use Sybase, Oracle, MSSQL interchangeably
3) The schema is mostly static, BUT there are dynamic parts.
I am somewhat familiar with SubSonic and NHibernate, but not deeply.
I get the nagging feeling that the ORM can do what I want, but I don't know how to leverage it at the moment.
SubSonic probably isn't optimal, since it doesn't currently support Sybase, and writing my own provider for it is beyond my resources and ability right now.
For #3 (above), there are a couple of metadata tables, which describe tables which the vendors can "staple on" to the existing database.
Let's call these MetaTables, and MetaFields.
There is a base static schema, which the ORM (NHibernate ATM) handles nicely.
However, a vendor can add a table to the database (physically) as long as they also add the data to the metadata tables to describe their structure.
What I'd really like is for me to be able to somehow "feed" the ORM with that metadata (in a way that it understands) and have it at that point allow me to manipulate the data.
My primary goal is to reduce the amount of generic SQL statement building I have to do on these dynamic tables.
I'd also like to avoid having to worry about the differences in the SQL being sent to Sybase, Oracle, or MSSQL.
My primary problem is that I don't have a way to let the ORM know about the dynamic tables until runtime, when I'll have access to the metadata.
Edit: An example of the usage might be like the one outlined here:
IDataReader rdr=new Query("DynamicTable1").WHERE("ArbitraryId",2).ExecuteReader();
(However, it doesn't look like SubSonic will work, as there is no Sybase provider; see above.)
According to this blog you can in fact use NHibernate with dynamic mapping. It takes a bit of tweaking, though...
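A hedged sketch of the general idea, assuming the mapping XML is generated from your MetaTables/MetaFields rows at startup. The table and column names are invented, and the exact dynamic-entity support varies by NHibernate version (the linked blog covers the tweaks):

using NHibernate;
using NHibernate.Cfg;
using NHibernate.Criterion;

var cfg = new Configuration().Configure(); // reads hibernate.cfg.xml

// Mapping XML built from the metadata tables at runtime; an entity-name mapping
// lets NHibernate work without a compiled class per dynamic table.
string mappingXml = @"
<hibernate-mapping xmlns='urn:nhibernate-mapping-2.2'>
  <class entity-name='DynamicTable1' table='DynamicTable1'>
    <id name='Id' type='Int32'><generator class='native' /></id>
    <property name='ArbitraryId' type='Int32' />
  </class>
</hibernate-mapping>";
cfg.AddXmlString(mappingXml);

using (var sessionFactory = cfg.BuildSessionFactory())
using (var session = sessionFactory.OpenSession())
{
    // Dynamic entities round-trip as dictionaries rather than POCOs.
    var rows = session.CreateCriteria("DynamicTable1")
        .Add(Restrictions.Eq("ArbitraryId", 2))
        .List();
}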
We did some of this using NHibernate, but we stopped the project since it didn't provide us with the ROI we wanted. We ended up writing our own ORM/SQL layer, which worked very well (it worked when I left the company; I'm guessing it still works).
Our system used an open source project to generate the SQL (I don't remember the name any more), and we built all our queries in our own XML-based language (Query Markup Language - QML). We could then build an XmlDocument with selects, wheres, groups, etc., and send that to the SqlEngine, which would turn it into a SQL statement and execute it. We discussed, but never implemented, a cache for all of this, which would have allowed us to cache the QML for frequently used queries.
I am a little confused as to how the ORM would be used at runtime. If the ORM dynamically builds something at runtime, how does the runtime code know what the ORM did dynamically?
"have it at that point allow me to manipulate the data" - what do you mean by manipulating the data?
I may be missing something here, and I apologize if that's the case. (I have only really used the bottom-up approach with an ORM.)
IDataReader doesn't map anything to an object, you know, so your example would have to be written using a classic query builder.
Have you looked into using the ADO.NET Entity Framework?
MSDN: LINQ to Entities
It allows you to map database tables to an object model in such a manner that you can code without thinking about which database vendor is being used, and without worrying about minor variations made by a DBA to the actual tables. The mapping is kept in configuration files that can be modified when the db tables are modified without requiring a recompile.
Also, using LINQ to Entities, you can build queries in an OO manner, so you aren't writing actual SQL query strings.
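As a small sketch of that (the VendorEntities context and its Orders set are hypothetical names, standing in for whatever the designer generates from your model):

using System;
using System.Linq;

using (var context = new VendorEntities()) // hypothetical generated context
{
    // The query is an expression tree, translated to the current provider's SQL at runtime.
    var matching =
        from order in context.Orders
        where order.ArbitraryId == 2
        orderby order.CreatedOn descending
        select order;

    foreach (var order in matching)
    {
        Console.WriteLine(order.ArbitraryId);
    }
}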