Using a WCF Service with Entity Framework 4 and...DTO?

As described above, I'm implementing a multi-tier architecture to work with WCF and Entity Framework 4 (with POCO). Since I already have persistence ignorance with POCO, do I need to implement DTOs, or can I use WCF in its pure way?
The main question is: do I need DTOs to pass a lightweight object over the network, or can I use my POCO entities?
What do you guys recommend?

It's hard to answer unless you define what the "pure way" is. Are we talking SOA pure or WCF pure?
WCF proxies already are DTOs in a way because they do not bring along any business logic across your service contract. Creating another layer of DTOs on top of the proxy classes generated by WCF seems redundant.
The biggest question you want to answer is "how SOA is this solution?". You cannot share your POCO entities across service boundaries if you want to be SOA compliant. SOA is all about disparate contracts.
If you go fully SOA-based, then you lose a lot of functionality, because the classes your web tier will be working with most of the time will be stupid proxies. You'll have to repeat a lot of logic, and you lose a lot of the "metadata, convention over configuration" functionality that MVC 2 provides.
If you throw the SOA buzzword into the shredder, which you should do ( http://soafacts.com/ ), then you'll have a much easier time sharing business logic and metadata information across tiers. If the only consumer of your web service is yourself, then this method is probably your best choice.
This is where you could use DTOs to send across the wire instead of your POCO entities. The only downside is, again, repeated logic and lots of boilerplate, ceremonial code that does nothing. It really depends on the size of your project. If it's small, forget about DTOs; but if you have 20 developers working with 200,000 LoC, then DTOs are probably worth creating.
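To make that "boilerplate, ceremonial code" concrete, here is a minimal sketch of the hand-written mapping a DTO typically requires (the Customer types are hypothetical, not from the question):

    // Hypothetical POCO entity used inside the service tier.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public string Email { get; set; }
    }

    // DTO that crosses the wire: same shape, no behaviour.
    public class CustomerDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public string Email { get; set; }
    }

    public static class CustomerMapper
    {
        // The ceremonial copy code that must be kept in sync by hand.
        public static CustomerDto ToDto(Customer entity)
        {
            return new CustomerDto
            {
                Id = entity.Id,
                Name = entity.Name,
                Email = entity.Email
            };
        }
    }

Multiply that by every entity in the model and the cost/benefit trade-off mentioned above becomes clear.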

As jfar said, it depends on whether others are going to consume the service, or whether you alone will be the consumer (i.e., it only serves your own presentation tier).
If it's the latter and it's only going to be you using your service, then you can serialise your POCOs across the WCF service boundaries. This is something I have done recently, and I wrote this blog post about getting it to work. This will allow you to use the same entities in the app tier as well as the presentation tier.
Hope it helps.

The strongest reason to recommend a DTO when using WCF with EF is that EF database-first classes drag implementation dependencies into your proxy classes. If you are using code-first with POCO classes, then there should be no implementation dependencies.
Try returning just your POCO classes, but then take a close look at the generated proxy classes. Make sure that there is nothing in those classes that is part of the EF infrastructure. If the proxy classes are clean, then you should be all set.
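As a rough illustration (the Order types are mine, not the poster's), a clean code-first POCO exposed straight through a service contract might look like this; WCF's DataContractSerializer will serialize plain public properties without any EF attributes involved:

    using System.Collections.Generic;
    using System.ServiceModel;

    // A clean code-first POCO: no EF base class, no EF attributes,
    // so nothing from the EF infrastructure can leak into the proxy.
    public class Order
    {
        public int Id { get; set; }
        public decimal Total { get; set; }
    }

    [ServiceContract]
    public interface IOrderService
    {
        // Inspect the generated client proxy to confirm that only
        // Id and Total come across the wire.
        [OperationContract]
        IList<Order> GetOrders();
    }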

Related

Separation of Concerns - How to separate GET/PUT/PATCH/POST/DELETE/ETC into one Microservice that gets its models and DTOs externally

Let's say you have a typical C# .NET Core Web API that you want to use in a microservices architecture environment. It uses Entity Framework to connect to a SQL database, and it has models and DTOs.
If you want to separate the 'restfulness', the actions of actually responding to the individual GET/PUT/PATCH/POST/DELETE/etc. methods, from the data models (and into microservices), what approach would you take?
I.e., instead of having to create 100 microservices that each expose the exact same RESTful functionality within the APIs but each have their own specific data models and DTOs, I'd want to create one API that exposes RESTful GET/PUT/PATCH/POST/DELETE/etc. and separate it from static models, DTOs, and entitybuilder configurations. So I'd have 100 microservices concerned with passing data to the one REST microservice to get whatever job I need done, in a dynamic fashion.
I am not super experienced with object-oriented programming methods, and I thought maybe it would be possible to have my CRUD microservice, which my child microservices talk to (through an API gateway or some other method I haven't worked out yet), accept a set of models, DTOs, and Entity Framework entitybuilder parameters through the CRUD microservice's Program.cs Main method.
Am I on the right path here?
Thank you in advance for any advice or helpful examples!
You don't. What you describe is still having a single monolith and putting 100 microservices as a facade. That makes about as much sense as ordering a large pizza and a dozen small salads as dessert because you were told that having a small salad for a meal would help you lose weight. That's not how that works.
Either look into microservices and what you gain by using that architecture, or go with your single service that does it all. Both might be valid choices, just don't pretend you are doing the other one too.

Placement of DTO / POCO in a three tier project

I've been in the process of re-writing the back-end for a web site and have been moving it towards a three-tiered architecture.
My intention is to structure it so:
Web site <--> WCF Service (1) <--> Business Layer (2) <--> Data Layer (3)
My issue is with the placement of the DTOs within this structure. I'll need DTOs to move data between the business layer and the WCF service, and from the WCF service to the consuming web site.
During my research on here I've read some excellent discussions though I've been left scratching my head a bit:
Davide Piras makes some great points in this post and if I were to follow this design then I'd declare interfaces for the POCOs in a separate project. These would then be implemented by tiers (1) and (2). Whilst I like the use of interfaces it does seem like I'd be making more work for myself by declaring POCOs in (1) and (2) and then copying their data back and forth using something like AutoMapper.
This post uses a system where a business objects project is created which would be referenced by all tiers. This seems simpler than the other solution and leads me to a design that would be:
Web site <--> WCF Service (1) <--> Business Layer (2) <--> Data Layer (3)
                    ^                      ^                      ^
                    |                      |                      |
                    [ --- Business Objects referenced here --- ]
My question is this: is there a code smell from sharing the business objects across three layers of the solution or are the two methods I've listed above just two different ways of cracking the same nut?
Let's think of your UI (website) and service (WCF + BL + DAL) as two distinct entities.
If you have 100% control over both, you should choose approach #2 since it will avoid translation between WCF proxy objects and your business objects in the UI layer.
Else, you are better off with approach #1 since one of the entities is kind of a 'black box' and is subject to change by external stakeholders. So, it's safer to maintain an internal set of business objects. This will need translation between your business objects and the WCF proxy objects (through Extension methods or a translator framework).
Now, I'm not exactly sure of your UI layer's complexity or its implementation (MVC, WebForms, etc.), so you may or may not need View-specific objects (lighter for data-binding, faster to serialize to JSON, etc.); let's call this object the Model.
If you don't need a distinct UI-specific Model, I suggest marking your business objects as DataContracts (in the context of WCF) and using them across layers. Don't forget to explicitly mark entities as Serializable if you are invoking WCF via the web ($.ajax).
Else, use DataContract in the service and a translator to convert DataContract to Model in the UI layer. A Service Adapter is a good fit here - it sits in the UI layer and is responsible for consuming the WCF service and translating between DataContract and Model. You may use a Service Proxy in the UI layer, which is a wrapper over your WCF service and is consumable over the web.
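A minimal sketch of both ideas, using hypothetical Product types (the DataContract attributes mark the business object for WCF serialization, and the adapter does the DataContract-to-Model translation in the UI layer):

    using System.Runtime.Serialization;

    // Business object shared with the service, marked for WCF serialization.
    [DataContract]
    public class ProductContract
    {
        [DataMember]
        public int Id { get; set; }

        [DataMember]
        public string Name { get; set; }
    }

    // Lighter, UI-specific Model.
    public class ProductModel
    {
        public int Id { get; set; }
        public string DisplayName { get; set; }
    }

    // Hypothetical WCF proxy interface.
    public interface IProductService
    {
        ProductContract GetProduct(int id);
    }

    // Service Adapter in the UI layer: consumes the service and
    // translates DataContract -> Model.
    public class ProductServiceAdapter
    {
        private readonly IProductService _service;

        public ProductServiceAdapter(IProductService service)
        {
            _service = service;
        }

        public ProductModel GetProduct(int id)
        {
            ProductContract contract = _service.GetProduct(id);
            return new ProductModel { Id = contract.Id, DisplayName = contract.Name };
        }
    }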
Lastly, aren't you missing a reference to the Business Objects in your Data Layer? I believe you will populate your Business Objects from the data store in the Data Layer itself.
I would say that it often depends on the complexity of the project you're building. For smaller projects that I've built, I shared my 'entities' (they were just simple DTOs, serializable data buckets with getters and setters) across the layers and didn't care too much about this. One of the biggest downsides was that the logic got scattered across the project (not only in the 'Business Layer') and had a procedural style all around. This really resembles the Anemic Domain Model, but the complexity didn't creep in because the project didn't grow too much. Entity Framework has some templates you can build your entities off of (if I'm not mistaken, they're called self-tracking entities).
For medium and bigger projects, I wouldn't use this approach and would keep entities and DTOs separate, as they serve different roles. The DTOs might have a totally different shape than your entities, and trying to share them between the tiers/layers would often be smelly.

Expose Entity Framework entities to other clients

I have a .NET 4 class library that contains an Entity Framework data model and a collection of classes that provide common functionality using those entities. These classes are used across different types of applications.
So, my question is would it be considered good practice to expose entities contained within the class library to other applications?
If your entities are advanced enough to meet your persistence needs and your domain needs (or external application needs), and there is a low likelihood of "cross-layer pollution", then I say yes, it's a good practice. It can also be a good practice in the agile development sense: good enough for now.
Given a longer period of time and if being a purist is more your inclination, then it starts to become a bad practice because you're increasing the coupling, for example, if you start adding attributes to deal with persistence, validation, and serialization.
Some ways to avoid that are to use something like AutoMapper, generated code, or hand-coded facades, service layers, and/or adapters to minimize the effects.
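For instance, a minimal AutoMapper sketch (the Invoice types are hypothetical; AutoMapper's conventions map identically-named properties, so the hand-written copy code disappears):

    using AutoMapper;

    // Hypothetical entity and DTO with matching property names.
    public class Invoice
    {
        public int Id { get; set; }
        public decimal Amount { get; set; }
    }

    public class InvoiceDto
    {
        public int Id { get; set; }
        public decimal Amount { get; set; }
    }

    public static class InvoiceMapping
    {
        public static InvoiceDto ToDto(Invoice entity)
        {
            // In real code, build the configuration once at startup.
            var config = new MapperConfiguration(cfg => cfg.CreateMap<Invoice, InvoiceDto>());
            var mapper = config.CreateMapper();
            return mapper.Map<InvoiceDto>(entity);
        }
    }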
Because Entity Framework entities change whenever the database changes, I don't believe they would provide an adequate API. Instead, I would recommend creating a Data Transfer Object to act as an intermediary between external code and each entity that needs to be accessed. Furthermore, consider creating a Facade class (service layer) that will mediate between the internals of your library and the external "client." A good article on DTOs: http://msdn.microsoft.com/en-us/magazine/ee236638.aspx

Designing an API: Use the Data Layer objects or copy/duplicate?

Struggling with this one today.
Rewriting a web-based application; I would like to do this in such a way that:
All transactions go through a web services API (something like http://api.myapplication.com) so that customers can work with their data the same way that we do: everything they can do through our provided web interface, they can also do programmatically
A class library serves as a data layer (SQL + Entity Framework), for a couple of design reasons not related to this question
Problem is, if I choose not to expose the Entity Framework objects through the web service, it's a lot of work to re-create "API" versions of the Entity Framework objects and then write all the "proxy" code to copy properties back and forth.
What's the best practice here? Suck it up and create an API model class for each object, or just use the Entity Framework versions?
Any shortcuts here from those of you who have been down this road and dealt with versioning / backwards compatibility, other headaches?
Edit: After feedback, what makes more sense may be:
Data/Service Layer - DLL used by public web interface directly as well as the Web Services API
Web Services API - almost an exact replica of the Service Layer methods / objects, with API-specific objects and proxy code
I would NOT have the website post data through the web services interface for the API. That way leads to potential performance issues of your main website. Never mind that as soon as you deploy a breaking API change you have to redeploy the main website at the same time. There are reasons why you wouldn't want to be forced to do this.
Instead, your website AND web services should both communicate directly to the underlying business/data layer(s).
Next, don't expose the EF objects themselves. The web service interface should be cleaner than this. In other words, it should try to simplify the act of working with your backend as much as possible. Will this require a fair amount of effort on your part? Yes. However, it will pay dividends when you have to change the model slightly without impacting currently connected clients.
It depends on project complexity and how long you expect it to live. For small, short-lived projects you can share domain objects across all layers. But if it's a big project, and you expect it to exist, work well, and be updated for the next 5 years...
In my current project (which is big), I first started with shared entities across all layers; then I discovered that I needed separate entities for Presentation, and now (six months later) I'm using separate classes for each layer (persistence, service, domain, presentation). That's not because I'm paranoid or was following some rules; I simply couldn't make everything work with a single set of classes across layers. Draw your own conclusions.
P.S. There are tools that can help you convert your objects, like AutoMapper and ValueInjecter.
I would just buck up and create an API aimed specifically at the needs of the application. It doesn't make much sense to do what amounts to exposing the whole DB layer. Just expose what needs to be exposed in order to make the app work, and nothing else.

Sharing domain model with WCF service

Is it good practice to reference my web application's domain layer class library from a WCF service application?
Doing so gives me easy access to the already existing classes in my domain model, so I will not need to re-define similar classes for the WCF service.
On the other hand, I don't like the coupling it creates between the application and the service, and I am curious whether it could create difficulties for me in the long run.
I also think having dedicated classes for my WCF app would be more efficient, since those classes would only contain the members that will be used by the service and nothing else. If I use the classes from my domain layer, there will be many fields in the classes that are not used by the service, and that will cause unnecessary data transfer.
I would appreciate your thoughts from your experience.
No, it's not. Entities are all about behaviour; data contracts are all about... data. Plus, as you mentioned, you wouldn't want to couple them together, because it would cripple your ability to react to change very soon.
For those still coming across this post, like I did...
Check out this site; it's a good explanation of the topic.
Conclusion: Go through the effort of keeping the boundaries of your architecture clear and clean. You will get some credit for it some day ;)
I personally frown on passing domain objects directly through WCF. As Krzysztof said, it's about a data contract, not a contract about the behavior of the thing you are passing over the wire.
I typically do this:
Define the data contracts in their own assembly
The service has a reference to both the data contracts assembly and the business entity assemblies.
Create extension methods in the service namespace that map the entities to their corresponding data contracts and vice versa, as sketched below.
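A rough sketch of that third step, with hypothetical Employee types (the extension methods keep the mapping in one place, next to the service):

    using System.Runtime.Serialization;

    // Business entity (behaviour omitted for brevity).
    public class Employee
    {
        public int Id { get; set; }
        public string FullName { get; set; }
    }

    // Data contract that travels over the wire.
    [DataContract]
    public class EmployeeContract
    {
        [DataMember]
        public int Id { get; set; }

        [DataMember]
        public string FullName { get; set; }
    }

    // Extension methods in the service namespace mapping both ways.
    public static class EmployeeMappingExtensions
    {
        public static EmployeeContract ToContract(this Employee entity)
        {
            return new EmployeeContract { Id = entity.Id, FullName = entity.FullName };
        }

        public static Employee ToEntity(this EmployeeContract contract)
        {
            return new Employee { Id = contract.Id, FullName = contract.FullName };
        }
    }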
Putting the conceptual purity of what a "Data Contract" is aside, if you begin to pass entities around you are setting your shared entity up to be pulled in different design directions by each side of the WCF boundary. Inevitably you'll end up with behaviors that only belong to one side, or even worse, have to expose methods that conceptually do the same thing but in a different way for each side of the WCF boundary. It can potentially get very messy over the long term.
