I want to build a generic data access layer targeting multiple databases (SQL Server, Oracle, DB2, etc.). The tricky part is that I also need to support web services, so my data source can be a web service as well. Each data source holds similar data, but the column names are different, and the return types of the web services differ too.
How can I create a generic data access layer that targets any data source and returns a generic object/list to the UI in .NET/C#?
I am using .NET 4.5, C#, Web API, and VS 2013.
Please suggest. Any help is greatly appreciated.
Cheers
You are making things way too complicated in my opinion. Just create data access components for every data source type you have, and have them return business or domain entities that you define. Use the power of each data source to its fullest instead of trying to make them 'generic'.
Also, why treat a web service as a data source? It's a web service, already abstracted out. Just use it in your application layer and use its entities.
Bottom line: if it's too hard to do, then don't do it and keep it simple.
Also read http://ayende.com/blog/4567/the-false-myth-of-encapsulating-data-access-in-the-dal.
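A minimal sketch of that advice (all names here, like `Customer` and `ICustomerRepository`, are illustrative, not from the question): one shared entity type, one repository interface, and one implementation per source that maps its own columns or return types onto the shared entity.

```csharp
using System.Collections.Generic;

// Shared domain entity: the one shape the UI ever sees.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// One interface, implemented once per data source.
public interface ICustomerRepository
{
    IList<Customer> GetCustomers();
}

// SQL Server implementation: free to use SQL Server-specific
// features and column names internally, mapping them here.
public class SqlServerCustomerRepository : ICustomerRepository
{
    public IList<Customer> GetCustomers()
    {
        // Real code would run a query; stubbed for the sketch.
        return new List<Customer> { new Customer { Id = 1, Name = "From SQL Server" } };
    }
}

// The web service is just another implementation that adapts
// the service proxy's return types to the same entity.
public class WebServiceCustomerRepository : ICustomerRepository
{
    public IList<Customer> GetCustomers()
    {
        // Real code would call the generated service proxy; stubbed here.
        return new List<Customer> { new Customer { Id = 2, Name = "From web service" } };
    }
}
```

The UI depends only on `ICustomerRepository`, so which source sits behind it becomes a wiring detail rather than something "generic" the layer itself has to solve.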
I'm learning MVC. There are plenty of code samples that work with SQL Server. The author has the database created on the fly from his/her classes, which enables a very clean and rapid development workflow.
I'm working with an Oracle DB.
Can I also abstract data from tables that I already have? I don't need to abstract all columns (I need only two out of 50). I need only read access, and I want to use either a web service or Oracle as input.
Do you know of any sample code that shows how to abstract data with a web service or Oracle as the data source?
You could take a look at Entity Framework. It allows you to abstract your data access code from the underlying database. This really is not MVC-specific, and you could use it in any .NET application you wish.
As far as web services are concerned, I would recommend designing a data access layer that is called from your MVC controllers and delegates the calls to the underlying web service. An abstraction over this web service will be beneficial if you want to unit test your controllers in isolation.
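As a sketch of that abstraction (the interface and class names below are assumptions, not from the answer): the controller depends on an interface, the production implementation delegates to the web service proxy, and unit tests substitute a fake.

```csharp
using System.Collections.Generic;

// Abstraction the controller depends on.
public interface IProductService
{
    IList<string> GetProductNames();
}

// Production implementation: delegates to the generated web
// service proxy (the actual call is stubbed out in this sketch).
public class WebProductService : IProductService
{
    public IList<string> GetProductNames()
    {
        // e.g. return _proxy.GetProducts().Select(p => p.Name).ToList();
        return new List<string>();
    }
}

// Fake used by unit tests: no web service needed.
public class FakeProductService : IProductService
{
    public IList<string> GetProductNames()
    {
        return new List<string> { "Widget", "Gadget" };
    }
}

// The controller takes the interface, so it can be tested in isolation.
public class ProductsController
{
    private readonly IProductService _service;

    public ProductsController(IProductService service)
    {
        _service = service;
    }

    public IList<string> Index()
    {
        return _service.GetProductNames();
    }
}
```

In a real MVC app the controller would derive from `Controller` and return a `ViewResult`; the shape of the dependency is the point here.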
I have the following scenario:
Server
SQL Server 2008
Core (Entity Framework and business logic)
WCF Service
MVC Web application (for backend management)
Client
Local Database - a simplified model of the main database
WPF Client
Requirements
The client has to work fully offline, and persist data
Changed data should be pulled from the server over WCF service
Client should not change the data directly, but call a WCF method (and queue the call if the service is not available)
Possible Solutions
Microsoft Sync Framework - I think it's overkill, because I mainly need one-way syncing, and the data structures are not the same.
DataSet serialization over WCF - DataSets do support merging and offline scenarios, but isn't it out of date?
Entity Framework? I tried to build a prototype, but EF doesn't seem to support my needs very well (I need to search for an entity and change it if modified, or add it if it doesn't exist)
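For reference, the search-then-update-or-add pattern described above can be written against an EF `DbContext` roughly like this (the `LocalContext`/`Order` names are assumptions; `Find` is part of the `DbContext`/`DbSet` API, EF 4.1 and later):

```csharp
// Upsert sketch: look the entity up by key, update it if found,
// insert it otherwise. EF tracks the change and issues the
// matching INSERT or UPDATE on SaveChanges().
public void Upsert(LocalContext ctx, int id, decimal total)
{
    var existing = ctx.Orders.Find(id);   // search by primary key
    if (existing == null)
    {
        ctx.Orders.Add(new Order { Id = id, Total = total });  // not found: add
    }
    else
    {
        existing.Total = total;           // found: modify the tracked entity
    }
    ctx.SaveChanges();
}
```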
Question
What, do you think, is the most appropriate approach?
Is SQL Server Compact a good local db?
I am very interested in your thoughts. Thank you!
The Microsoft Sync Framework is in my opinion not appropriate because you have differing schemas. I can also imagine that you have some business rules about what data is allowed to change and how it should be synced.
The choice between DataSets and Entity Framework depends on your needs. An Object-Relational Mapper comes into view when you are really using an object model.
If your domain is complex enough and you have the knowledge, a full-fledged Domain Model is definitely a nice solution that can scale really well and handle complex projects.
If your project is somewhat simpler and you don't want to build a Domain Model, you can choose DataSets.
Personally, I think that learning the ins and outs of Domain Modeling and of Entity Framework as an ORM is a nice choice for projects. Once you have enough experience with these technologies, you will favor them even on small projects.
About the problems you were having with your EF prototype:
Because the data schemas of the client and the server are different, I would use custom Data Transfer Objects for moving data between the two. This way you decouple the object models, and they can change independently of each other.
The client knows everything there is to know about the data changes. Because it has a local representation of the server data, it knows whether data was added, changed, or deleted. Why don't you add this knowledge to your server call? If you use a field in your DTO that states whether the object is Added, Modified, or Deleted, the server won't have to detect this.
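A sketch of that idea (type names are mine, not the poster's): the DTO carries its own change state, and the server simply switches on it instead of diffing against its database.

```csharp
// The client sets this flag when it records a local change.
public enum ChangeState { Unchanged, Added, Modified, Deleted }

public class OrderDto
{
    public int Id { get; set; }
    public decimal Total { get; set; }
    public ChangeState State { get; set; }
}

// Server side: no change detection needed, the DTO says what happened.
// The returned string stands in for the real persistence operation.
public static class OrderSync
{
    public static string Apply(OrderDto dto)
    {
        switch (dto.State)
        {
            case ChangeState.Added:    return "INSERT";  // map DTO -> new entity
            case ChangeState.Modified: return "UPDATE";  // copy fields onto entity
            case ChangeState.Deleted:  return "DELETE";  // remove entity by key
            default:                   return "NOOP";
        }
    }
}
```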
I have probably written the same LINQ to SQL statement 4-5 times across multiple projects. I don't even want to have to paste it. We use DBML files combined with Repository classes. I would like to share the same Library across multiple projects, but I also want to easily update it and ensure it doesn't break any of the projects. What is a good way to do this? It is OK if I have to change my approach, I do not need to be married to LINQ to SQL and DBML.
We have both console apps and MVC web apps accessing the database with their own flavor of the DBML, and there have been times when a major DB update has broken them.
Also, currently each project accesses the DB directly, and the DB is sometimes on another server, etc. Would it be possible to eliminate the DB layer from each project altogether? It might help with the problem above, and it would be better for security and data integrity if I could route all database access through a centralized application that my other applications use, rather than having them call the database directly.
Any ideas?
The way I handle this is using WCF Data Services. I have my data models and services in one project and host this on IIS. My other projects (whatever they may be) simply add a service reference to the URI and then access data it needs over the wire. My database stuff happens all on the service, my individual projects don't touch the database at all - they don't even know a database exists.
It's working out pretty well but there are a few "gotchas" with WCF. You can even create "WebGet" methods to expose commonly used methods via the service.
Let me know if you want to see some example code :-)
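A sketch of the service side of that setup (the `NorthwindEntities`/`Customers` names are assumed EF model names, not from the answer): a WCF Data Services class over an EF context, with a `[WebGet]` service operation exposing a commonly used query.

```csharp
using System.Data.Services;
using System.Data.Services.Common;
using System.Linq;
using System.ServiceModel.Web;

// Hosted in IIS as e.g. NorthwindDataService.svc; clients add a
// service reference to its URI and query entities over the wire.
public class NorthwindDataService : DataService<NorthwindEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // Expose the entity set read-only; everything else stays locked down.
        config.SetEntitySetAccessRule("Customers", EntitySetRights.AllRead);
        config.SetServiceOperationAccessRule("CustomersInCity",
            ServiceOperationRights.AllRead);
        config.DataServiceBehavior.MaxProtocolVersion =
            DataServiceProtocolVersion.V2;
    }

    // Commonly used query exposed as a URL-addressable operation,
    // e.g. /NorthwindDataService.svc/CustomersInCity?city='London'
    [WebGet]
    public IQueryable<Customers> CustomersInCity(string city)
    {
        return CurrentDataSource.Customers.Where(c => c.City == city);
    }
}
```

The calling projects never see a connection string; they only know the service URI, which is the "they don't even know a database exists" property described above.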
I'm working for my company on a .NET N-tier architecture and have several questions. Basics: the project should be split into layers and should make it as easy as possible to develop additional modules such as logging, ratings, user management, etc.
Environment: VS2010, EF4, SQL Server 2008, LinqToSql, c#
Current situation is as follows:
I have a Data Access Layer where I'm using Entity Framework to access database entities.
Then I have a Business Layer with all the insert, update, and delete logic, plus additional methods for searching, etc.
The next layer is for WCF Service Contracts
Finally there is a presentation layer (desktop and web).
First I created a desktop application. After adding a service reference in the presentation layer, the Data Sources tab on the left side showed all the tables from the database. This is great.
Then I used a DataGridView, bound its DataSource to the WCF service via a BindingSource, and finally used the wizard to order the columns, etc. This worked fine.
The next try was to build a web site with the same functionality. Although I added a WCF service reference, no data sources are displayed, so it is not possible to use the wizard for quick changes, and everything must be coded by hand! This is difficult and needs a lot of code just to cover basic operations like insert, update, and delete. But if I add a reference to the Data Access Layer and put a connection string in web.config, then I can use the wizard. But then access to the data source (DAL) does not go through WCF.
I found out that an ASP.NET Dynamic Data web site has insert, update, and delete support built in. Would it probably be better to use this, or am I wrong?
Question 1: Should the DAL for the .edmx use the auto-generated code (code generation add items)?
Question 2: How can I get the same data sources (in the tab) as in the desktop application?
Question 3: Is it possible to use WCF in conjunction with Telerik's DataGrid, as it already supports Ajax and insert, update, and delete operations by default?
Question 4: How to design architecture that will support modularity?
I have spent hours and hours trying to find concrete information about this, and I still don't know the correct and best way. I searched for articles demonstrating solutions to such specific problems, but found none.
I really hope to get answers/help from you.
Any help is welcome and thanks in advance.
Greetings
Yes, as long as it gets your work done; otherwise, extend it.
This seems like a purely design-related question; but if you are using Telerik, I expect you can find the same controls for web as well as desktop.
Yes. But you will have to tweak the objects returned from the web service to set them into the Telerik data grid's data source.
You've already torn it down into business, data, and UI levels, so it's somewhat modular already. ASP.NET MVC can be your best bet. You can also consider using ASP.NET modules to keep concerns like logging and authentication separate.
I need some expert advice on strongly typed DataSets in ADO.NET generated by Visual Studio. Here are the details. Thank you in advance.
I want to write a N-tier application where Presentation layer is in C#/windows forms, Business Layer is a Web service and Data Access Layer is SQL db.
So, I used Visual Studio 2005 for this and created 3 projects in a solution.
Project 1 is the data access layer. In it, I used the Visual Studio DataSet designer to create a strongly typed DataSet and table adapter (to test, I created this on the Customers table in Northwind). The DataSet is called NorthWindDataSet and the table inside is CustomersTable.
Project 2 has the web service, which exposes only one method, GetCustomersDataSet. This uses project 1's table adapter to fill the DataSet and return it to the caller. To be able to use the NorthWindDataSet and table adapter, I added a reference to project 1.
Project 3 is a WinForms app; it references the web service and calls it to get the DataSet.
In the process of building this application, I added a reference in the PL to the DataSet generated above in project 1, and in the form's Load event I call the web service and assign the DataSet received from the web service to this DataSet. But I get the error:
    Cannot implicitly convert type 'PL.WebServiceLayerReference.NorthwindDataSet' to 'BL.NorthwindDataSet' e:\My Documents\Visual Studio 2008\Projects\DataSetWebServiceExample\PL\Form1.cs
Both data sets are the same, but because I added the references from different locations, I think I am getting the above error.
So what I did was add a reference to project 1 (which defines the data set) to project 3 (the UI), and use the web service to get the DataSet and assign it to the right type. Now when project 3 (which has the form) runs, I get the runtime exception below.
    System.InvalidOperationException: There is an error in XML document (1, 5058). ---> System.Xml.Schema.XmlSchemaException: Multiple definition of element 'http://tempuri.org/NorthwindDataSet.xsd:Customers' causes the content model to become ambiguous. A content model must be formed such that during validation of an element information item sequence, the particle contained directly, indirectly or implicitly therein with which to attempt to validate each item in the sequence in turn can be uniquely determined without examining the content or attributes of that item, and without any information about the items in the remainder of the sequence.
I think this might be because of some cross-referencing errors.
My question is, is there a way to use the visual studio generated DataSets in such a way that I can use the same DataSet in all layers (for reuse) but separate the Table Adapter logic to the Data Access Layer so that the front end is abstracted from all this by the web service?
If I have to hand-write the code, I lose the goodness the DataSet generator gives, and if columns are added later I need to add them by hand, etc. So I want to use the Visual Studio wizard as much as possible.
I would stay away from datasets if I were you. They are very much a .NET 2.0 solution to the problem, and they weren't a very good solution, either.
I would use Entity Framework as a data layer - they do not have the issues a DataSet has, including the one you're seeing. A DataSet has to live in both worlds - both XML and relational, and they don't always fit. Entity Framework can build you entity models that need only conform to standard programming concepts like inheritance and association.
Entity Framework also has fewer issues when transferred over web services. You should use WCF for all your new web service work (Microsoft now considers ASMX web services to be "legacy technology"), but even with ASMX web services, EF entities will transfer fairly cleanly. There are some relatively minor issues in .NET 3.5, but those are addressed in .NET 4.0.
My question is, is there a way to use the visual studio generated DataSets in such a way that I can use the same DataSet in all layers (for reuse) but separate the Table Adapter logic to the Data Access Layer so that the front end is abstracted from all this by the web service
If you don't want to rewrite your app for EF or add DTOs, and you know the schemas are equal, you could set the data set schema in your presentation layer via the web service.
    Project3DataSet.ReadXmlSchema(
        new StringReader(Project2WebService.GetCustomersDataSetSchema()));

    [WebMethod]
    public string GetCustomersDataSetSchema()
    {
        return new Project1DataSet().GetXmlSchema();
    }
Your data set schema acts as the object model. Not pretty but should do the job.
With that being said, if this is a new project, I agree with the other answers - avoid data sets.
I'd back up John here; we use EF v1.0 in an N-tier app, and of course it has its own problems, but you get regular objects which you can push through the service. However, I'd advise (in case you go with EF1, not EF4, which can expose its objects as POCOs) having a separate layer of DTO objects that are mapped to the domain objects from your DAL, and using the DTOs for transfer through the service. Also consider using .NET RIA Services; they are really fantastic.
#sb: on DTOs, on EF, a quick overview of RIA Services,
and an old article on DTOs with DataSets, which is what you are trying to do.