Save objects to a database? - c#

So far in my .Net coding adventures I've only had a need to save information to files. So I've used XmlSerializer and DataContractSerializer to serialize attributed classes to XML files. My next project, however, requires that I save and retrieve information from a SQL server database. I'm wondering what my options are for doing this.
The current version of the app, which was not created by me, uses a lot of hard-coded SQL commands. But now I'm trying to avoid doing anything where I have to read or write individual fields to or from the database or objects. I especially want to avoid a lot of hard-coded SQL in my code. I like how the serializer classes just figure out how to read and write XML files based on the attributes and/or public properties of the class. Is there something similar for a database rather than XML?

Object Relational Mapping
There are a bunch of products out there, the best known being NHibernate, and there are a couple of competing products offered by Microsoft in LINQ to SQL and Entity Framework (you're supposed to use the latter, but many people use the former because it's waaaay simpler).
You can see a nice (although I suspect biased) comparison of ORM offerings at http://ormbattle.net/

I believe you're referring to Object Relational Mappers. These provide a wealth of functionality, including simple object CRUD plumbing.
Check out:
NHibernate
Entity Framework
Linq to SQL
There are many others, but that'll get you going.
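To give a feel for how close this is to the XmlSerializer model you're used to, here is a minimal sketch using LINQ to SQL's attribute mapping. The Customer class, the Customers table, and the connection string are made-up examples, not anything from your app:
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

[Table(Name = "Customers")]
public class Customer
{
    [Column(IsPrimaryKey = true)]
    public int Id { get; set; }

    [Column]
    public string Name { get; set; }
}

// Querying and saving without hand-written SQL:
// using (var db = new DataContext(connectionString))
// {
//     var customer = db.GetTable<Customer>().Single(c => c.Id == 1);
//     customer.Name = "Updated";
//     db.SubmitChanges();
// }
The attributes play the same role that [XmlElement] and friends play for XmlSerializer: the framework works out the reads and writes from the metadata on the class.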

There is no generic object type when you deal with databases. Only tables and fields.
The combination of these could make an object though. Your best bet is to use stored procedures if you are concerned with hard coded SQL on the client code.
I'm also mainly referring to the actual field types in a database. ORMs are a different story. Look into NHibernate if you want an object-relational mapper that can help with INSERTs, SELECTs, etc.
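For the stored-procedure route mentioned above, the ADO.NET call looks roughly like this. This is only a sketch; the procedure name, parameter, and connection string are hypothetical:
using System.Data;
using System.Data.SqlClient;

string connectionString = "...";   // your connection string here

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("GetCustomerById", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;          // no SQL text in client code
    cmd.Parameters.AddWithValue("@Id", 1234);
    conn.Open();
    using (IDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            string name = reader.GetString(reader.GetOrdinal("Name"));
        }
    }
}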

Depending on the project an ORM like NHibernate might be what you're looking for. Something where you map your database information to classes and the ORM takes care of the inserts, deletes, and selects for you without hand-written SQL. This also allows for migration to a different database system without a ton of rewrite.
I say it depends on the project because other things come into play here like performance and how the data is actually structured.

I think you should read up on Linq to SQL. This will allow you to work "primarily" with classes that are representations of your database tables and their relations.
// MyDataContext, Table1, Id and Name are assumed to come from the LINQ to SQL designer (.dbml)
using (var context = new MyDataContext())
{
    var obj = context.Table1.Single(row => row.Id == 1234);
    obj.Name = "Test1234";
    context.SubmitChanges();
}
This could be a good place to start learning about LINQ to SQL.
Hope this is what you are looking for.

I agree with (and prefer) the previous suggestions to use an ORM. But just to make sure you have a full menu of options, here is another one: if you're comfortable with the XML representation, (de)serialization, etc., you could also look into SQLXML. That said, you shouldn't use this to avoid doing proper database design, although it can be a totally reasonable choice for some solutions.
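As a rough illustration of the SQLXML-style approach (the table, columns, and connection string here are made up), you can have SQL Server return XML directly and read it with ExecuteXmlReader, then process or deserialize it the way you already do with files:
using System.Data.SqlClient;
using System.Xml;

string connectionString = "...";   // your connection string here

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT Id, Name FROM Customers FOR XML AUTO", conn))
{
    conn.Open();
    using (XmlReader xml = cmd.ExecuteXmlReader())
    {
        var doc = new XmlDocument();
        doc.Load(xml);   // now work with the XML just as you would with a file
    }
}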

Related

translate large queries to linq to Entity framework

I have a few very large queries which I need to convert to LINQ because we are using Entity Framework and I can't use stored procedures (that breaks compatibility with other databases).
Using a tool like Linqer didn't help, and even when I get it to work with some modifications to the generated LINQ, there is a huge performance issue.
So, what is the best option in a situation like this where EF fails?
Please don't ask me to divide it into smaller queries, because that's not possible.
Moving this to an "answer" because what I want to say is too long for a comment.
It sounds like you're running into an inherent limitation of ORMs. You won't get perfect performance trying to do everything in code. It sounds like you're trying to use an ORM like a T-SQL interface rather than a mapping between objects and a relational instance of data.
You say you want to maintain compatibility between databases, but that's already a nonstarter if you consider schema differences from database to database. If you're already implementing a schema validation step to ensure your code doesn't break, then there should be no reason why you can't use something like views.
You can say you don't want to support these things all day long but the simple point is that these things exist because they address certain problems. If you wholesale abandon them, then you can't really expect to get rid of the problem. Some things the database simply does better.
So, I think you're expecting something out of the technology that it wasn't meant to solve. You'll need to either reevaluate your strategy or use another tool to accomplish it. I think you may even need a couple different tools.
What you've been doing may have worked when your scale was smaller. I could see such a thing working for quite a while actually. However, it does have a scale limit, and I think you're coming up against it.
I think you need to make a determination on what databases you want to support. Saying "we support all databases" is untenable. Then, compare features and use the ones in common. If it's a MS SQL vs. MySQL thing, then there's no reason why you can't use views or stored procedures.
Check out LinqKit - this is a very useful tool for building up large, complex EF queries.
http://www.albahari.com/nutshell/linqkit.aspx
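For example, LinqKit's PredicateBuilder can compose a large WHERE clause at runtime instead of one enormous hand-written expression. This is only a sketch; the Product entity, the keywords list, and the context are assumed names, not anything from your model:
using System.Collections.Generic;
using System.Linq;
using LinqKit;

var predicate = PredicateBuilder.False<Product>();
foreach (string keyword in keywords)
{
    string temp = keyword;                                   // avoid capturing the loop variable
    predicate = predicate.Or(p => p.Description.Contains(temp));
}

// AsExpandable() lets Entity Framework translate the composed expression tree.
var results = context.Products.AsExpandable().Where(predicate).ToList();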

Data Access Framework that addresses my needs

I'm having trouble choosing an appropriate data access framework, partly because I'm very picky with my preferences and mostly because I don't have much experience with most of them :-)
I need a framework that will allow me to easily map between the DB tables (SQL Server) and my entities, and that will handle the CRUD operations for me (for the most part).
I want my entities to reside in a separate assembly from my DAL.
I prefer using attributes for the mappings over an external file like XML.
It doesn't have to be an ORM, and I want to code my entities myself.
I don't mind writing stored procedures.
The project's database won't be very big. Less than 50 tables.
I'd like some of my entities to correspond to an inner join of two tables - one for static data entered manually during development and the other with data filled during runtime - without using two entities that reference one another (the result of this join will be a single entity).
Entity Framework sounded perfect until I realized it doesn't support Enums (yet - and I can't wait for EF 5.0).
I want these entities to include Enums, and plan on using lookup tables for the enums + code generation for the enum to keep it synchronized with the database.
Linq-to-SQL seems like a good candidate, but I don't know if it copes well with my previous demands.
Using Enterprise Library 5.0 DAAB with its RowMapper, and extending its abilities to perform updates and inserts, is also an option (but will require more coding on my part).
I plan on implementing the Repository Pattern.
How about NHibernate? Would it do? No experience there either.
I would be happy to hear all suggestions.. the more the merrier! Thanks in advance!
I think NHibernate is the way to go, although some of its main strengths (ORM, stored procedure generation, etc.) are things you listed as non-requirements. Anyway, NHibernate will do everything you want it to do. Technically it does use XML mappings, but these can easily be auto-generated using fluent or attribute-based mapping. I like this, as it IS done for you, but you still get the customization just in case you need it. Good luck!
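For what it's worth, here is a rough sketch of what a code-based (Fluent NHibernate) mapping looks like, so the mapping lives in compiled C# rather than hand-maintained .hbm.xml files. The Person class and People table are made-up names:
using System;
using FluentNHibernate.Mapping;

public class Person
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
    public virtual DateTime DateOfBirth { get; set; }
}

public class PersonMap : ClassMap<Person>
{
    public PersonMap()
    {
        Table("People");
        Id(x => x.Id);
        Map(x => x.Name);
        Map(x => x.DateOfBirth);
    }
}
Since the mapping is ordinary C#, it also plays nicely with keeping entities in a separate assembly and with the Repository pattern you're planning.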

Suggestions for dynamic SQL Server access using C#

I'm looking for a good solution to make my life easier with regards to writing/reading to a SQL Server DB in a dynamic manner. I started with Entity Framework to make my life easier to begin with, but as the software becomes more general and config-driven I'm finding that Entity becomes less and less appropriate, because it relies on specific objects defined at design time.
What I'd like to do:
Generate Tables/Fields at runtime.
Select rows from tables by table name with an unknown schema into a generic data type (e.g. a Dictionary).
Insert rows into tables by table name using generic data types (a dictionary where the string maps to the field name), where the data-type mapping between typeof(object) and the field type is taken care of.
I've started implementing this stuff myself, but I imagine someone has already done it before.
Any suggestions?
Thanks.
I'm having trouble understanding how what you are describing is any different than plain old ADO.NET. DataTables are dynamically constructed based on a SQL query and a DataRow is just a special case of an IndexedDictionary (sometimes called an OrderedDictionary where you can access values via a string name or an integer index like a list). I make no judgment as to whether choosing ADO.NET is actually right or wrong for your needs, but I'm trying to understand why you seem to have ruled it out.
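For illustration, here is a rough sketch of that plain ADO.NET approach: select from a table whose name is only known at runtime and expose each row as a dictionary. The helper names are mine, and note the comment about validating the table name:
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

DataTable LoadTable(string connectionString, string tableName)
{
    // The table name is concatenated here for brevity; validate it against
    // sys.tables (or a whitelist) before doing this in real code.
    var table = new DataTable();
    using (var conn = new SqlConnection(connectionString))
    using (var adapter = new SqlDataAdapter("SELECT * FROM [" + tableName + "]", conn))
    {
        adapter.Fill(table);   // schema and data are discovered at runtime
    }
    return table;
}

IEnumerable<Dictionary<string, object>> AsDictionaries(DataTable table)
{
    foreach (DataRow row in table.Rows)
    {
        var dict = new Dictionary<string, object>();
        foreach (DataColumn col in table.Columns)
            dict[col.ColumnName] = row[col];
        yield return dict;
    }
}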
You can use Sql.Net ( http://sqlom.sourceforge.net ) to easily generate dynamic SQL statements in C#.
The iBATIS.NET (now MyBatis.NET) Data Mapper framework doesn't automatically generate tables or fields at runtime, but it does allow you to select and commit data via Dictionary objects.
It's probably not going to suit your needs completely (it's kind of tedious to set up, but pretty easy to maintain once it is), but it might be worth a look. Here's a link to the online documentation.
Other popular frameworks might do the same or similar, such as NHibernate.

c# ado.net : best way to map database fields to property

Using VS2005, .net 2.0 , C#
Hi all,
What is the best way to map stored proc columns to C# object properties without creating tight coupling?
For example, I don't like to do the following:
DataRow row = Getmyrows();
MyObject.MyProperty1 = (string)row["col1"];
MyObject.MyProperty2 = (string)row["col2"];
So, when a column in the stored proc gets renamed to colxyz, the compiled code will break. What is the best practice to address this? A code sample would be helpful. Thank you in advance.
I would look into OR mappers. LINQ to SQL, nHibernate, Entity Framework, LLBLGen, etc. These allow you to configure your mapping via XML or some other external configuration source. Most of them also provide a way to completely decouple your entities from the persistence framework, allowing your entities to be POCO (Plain Old CLR Objects). Another benefit of OR mappers is they generate SQL for you on the fly, which allows you to largely eliminate your stored proc layer, which is also a coupling that can cause problems (on both ends...in your code as well as in your DB schema.)
A couple of approaches:
If you're forced to stick w/ ADO.NET proper, use a strongly typed DataSet. All that mapping between objects and data structures is down in a schema, where it belongs. Then you'd be able to hydrate your objects w/ code like this:
MyObject.MyProperty1 = dataSet.TableName[0].ColumnName;
I noticed everyone else said the same thing I was going to ;-) Go w/ an ORM. I know premature optimization is a slippery slope, but you inevitably will find clear justification for going that route. You won't regret it as your requirements become more complex, and you'll be learning a valuable skillset that's clearly gaining a lot of momentum in the .NET space.
Best solution is to use a proper O/RM like NHibernate, or if you can settle for "less" Linq to SQL or Entity Framework.
However, if you must, I suggest using IDataReader/SqlDataReader instead (simplest, best performance), but you won't get away from having to map to column names if you want to do it the "hard way".
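A small sketch of that "hard way", assuming a SqlCommand named cmd has already been set up, reusing the names from the question and assuming the columns are strings:
using (SqlDataReader reader = cmd.ExecuteReader())
{
    int col1 = reader.GetOrdinal("col1");   // resolve ordinals once, outside the loop
    int col2 = reader.GetOrdinal("col2");
    while (reader.Read())
    {
        MyObject.MyProperty1 = reader.GetString(col1);
        MyObject.MyProperty2 = reader.GetString(col2);
    }
}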
An approach that coworkers and I used back in the 2.0 days was to create a custom attribute which specified the field name from the data table, and we tagged our objects' properties with it. Then we built a generic entity builder that took a data reader as a parameter (EntityBuilder(IDataReader rdr)); as it worked through the data reader, it would create an empty T, reflect on the class, go through the properties to get the custom attribute information and the type, and set the value based on that.
We also had another custom attribute that specified the parameter used in our Insert and Update sprocs, to automatically populate the parameters too.
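A rough sketch of that first idea (the attribute and builder names below are mine, not the original code): a custom attribute carries the column name, and a generic builder maps an IDataReader row onto the object via reflection.
using System;
using System.Data;
using System.Reflection;

[AttributeUsage(AttributeTargets.Property)]
public class DataFieldAttribute : Attribute
{
    public string ColumnName { get; private set; }
    public DataFieldAttribute(string columnName) { ColumnName = columnName; }
}

public static class EntityBuilder
{
    public static T Build<T>(IDataRecord record) where T : new()
    {
        var entity = new T();
        foreach (PropertyInfo prop in typeof(T).GetProperties())
        {
            object[] attrs = prop.GetCustomAttributes(typeof(DataFieldAttribute), false);
            if (attrs.Length == 0)
                continue;                                        // not mapped to a column

            string column = ((DataFieldAttribute)attrs[0]).ColumnName;
            object value = record[column];
            if (value != DBNull.Value)
                prop.SetValue(entity, Convert.ChangeType(value, prop.PropertyType), null);
        }
        return entity;
    }
}

// Usage per row:
//   while (reader.Read()) { var item = EntityBuilder.Build<MyObject>(reader); }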
If you have to do it that way and don't want to use an ORM, store the column-name mapping in an XML file.
<appSettings>
  <add key="prop1" value="col1" />
</appSettings>
Then in the code do something like:
myObject.Prop1 = (string)row[ConfigurationManager.AppSettings["prop1"]]; // the config supplies the column name, so a rename only touches the XML
I know it's clunky, and it involves reading from XML (or the config file) for each property, but it will work.

How can I leverage an ORM for a database whose schema is unknown until runtime?

I am trying to leverage ORM given the following requirements:
1) Using .NET Framework (latest Framework is okay)
2) Must be able to use Sybase, Oracle, MSSQL interchangeably
3) The schema is mostly static, BUT there are dynamic parts.
I am somewhat familiar with SubSonic and NHibernate, but not deeply.
I get the nagging feeling that the ORM can do what I want, but I don't know how to leverage it at the moment.
SubSonic probably isn't optimal, since it doesn't currently support Sybase, and writing my own provider for it is beyond my resources and ability right now.
For #3 (above), there are a couple of metadata tables, which describe tables which the vendors can "staple on" to the existing database.
Let's call these MetaTables, and MetaFields.
There is a base static schema, which the ORM (NHibernate ATM) handles nicely.
However, a vendor can add a table to the database (physically) as long as they also add the data to the metadata tables to describe their structure.
What I'd really like is for me to be able to somehow "feed" the ORM with that metadata (in a way that it understands) and have it at that point allow me to manipulate the data.
My primary goal is to reduce the amount of generic SQL statement building I have to do on these dynamic tables.
I'd also like to avoid having to worry about the differences in SQL being sent to Sybase, Oracle, or MSSQL.
My primary problem is that I don't have a way to let the ORM know about the dynamic tables until runtime, when I'll have access to the metadata.
Edit: An example of the usage might be like the one outlined here:
IDataReader rdr = new Query("DynamicTable1").WHERE("ArbitraryId", 2).ExecuteReader();
(However, it doesn't look like SubSonic will work, as there is no Sybase provider; see above.)
According to this blog you can in fact use NHibernate with dynamic mapping. It takes a bit of tweaking though...
We did some of this using NHibernate; however, we stopped the project since it didn't provide us with the ROI we wanted. We ended up writing our own ORM/SQL layer, which worked very well (it worked while I was there; since I no longer work there, I'm guessing it still does).
Our system used an open source project to generate the SQL (I don't remember the name any more) and we built all our queries in our own XML-based language (Query Markup Language - QML). We could then build an XmlDocument with selects, wheres, groups, etc. and send that to the SqlEngine, which would turn it into a SQL statement and execute it. We discussed, but never implemented, a cache in all of this. That would've allowed us to cache the QMLs for frequently used queries.
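Very roughly, the idea was something like the sketch below. The XML shape here is invented (the real QML format isn't shown in this answer); it just illustrates translating an XML query document into a SQL string:
using System.Collections.Generic;
using System.Text;
using System.Xml;

// <query table="Orders">
//   <select>Id</select>
//   <select>Total</select>
//   <where column="CustomerId" op="=" value="42" />
// </query>
static string ToSql(XmlDocument qml)
{
    XmlElement root = qml.DocumentElement;

    var columns = new List<string>();
    foreach (XmlElement sel in root.SelectNodes("select"))
        columns.Add(sel.InnerText);

    var sql = new StringBuilder();
    sql.Append("SELECT ").Append(string.Join(", ", columns.ToArray()));
    sql.Append(" FROM ").Append(root.GetAttribute("table"));

    var where = root.SelectSingleNode("where") as XmlElement;
    if (where != null)
        sql.AppendFormat(" WHERE {0} {1} @p0", where.GetAttribute("column"), where.GetAttribute("op"));
    // the value attribute would be bound to @p0 as a command parameter

    return sql.ToString();
}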
I am a little confused as to how the ORM would be used at runtime. If the ORM would dynamically build something at runtime, how does the runtime code know what the ORM did dynamically?
"have it at that point allow me to manipulate the data" - what does manipulating the data mean here?
I may be missing something, and I apologize if that's the case. (I have only really used a bottom-up approach with an ORM.)
IDataReader doesn't map anything to an object, you know. So your example should be written using a classic query builder.
Have you looked into using the ADO.NET Entity Framework?
MSDN: LINQ to Entities
It allows you to map database tables to an object model in such a manner that you can code without thinking about which database vendor is being used, and without worrying about minor variations made by a DBA to the actual tables. The mapping is kept in configuration files that can be modified when the db tables are modified without requiring a recompile.
Also, using LINQ to Entities, you can build queries in an OO manner, so you aren't writing actual SQL query strings.
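A minimal sketch of what such a query looks like; the MyEntities context and Customers set are placeholder names from a hypothetical generated EF model:
using (var context = new MyEntities())   // designer-generated context; name is hypothetical
{
    var londonCustomers = from c in context.Customers
                          where c.City == "London"
                          orderby c.Name
                          select c;

    foreach (var customer in londonCustomers)
        Console.WriteLine(customer.Name);   // EF translates the query into the provider's SQL dialect
}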
