BreezeJS + EF + Angular capabilities with DTOs - C#

I'm evaluating Breeze.js for a large, data-oriented enterprise Angular 5 application, in order to take advantage of the following features that are missing from the vanilla Angular framework:
client-side data store
client-side model state tracking
client-side model validation rules
bulk data persistence (a SaveChanges() method that persists all entities).
For test purposes I've written the following simple Breeze controller on my ASP.NET Web API + Entity Framework server side:
[EnableCors(origins: "*", headers: "*", methods: "*")]
[BreezeController]
public class PeopleController : ApiController
{
    private AdventureWorksDbContext db = new AdventureWorksDbContext();

    #region "Breeze"

    readonly EFContextProvider<AdventureWorksDbContext> _contextProvider =
        new EFContextProvider<AdventureWorksDbContext>();

    // ~/breeze/People/Metadata
    [HttpGet]
    public string Metadata()
    {
        return System.Text.Encoding.UTF8.GetString(AdventureWorks.WebApi.Properties.Resources.WebApiMetadata);
    }

    // ~/breeze/People/GetPeople
    // ~/breeze/People/GetPeople?$filter=...&$orderby=...
    [HttpGet]
    public IQueryable<PersonDTO> GetPeople()
    {
        return db.People.ProjectTo<PersonDTO>();
    }

    // ~/breeze/People/SaveChanges
    [HttpPost]
    public SaveResult SaveChanges(Newtonsoft.Json.Linq.JObject saveBundle)
    {
        return _contextProvider.SaveChanges(saveBundle);
    }

    #endregion
}
As you can see in my example (it uses the AdventureWorks DB), I've made the following modifications:
1) The "GetPeople()" endpoint returns a queryable of DTOs (the "ProjectTo" extension is provided by AutoMapper). I need to do this in order to shape the model in a way that is usable for the client, and to avoid recursion, deep dives into the schema, serialization of large fields, and so on.
2) The "Metadata()" endpoint returns a string resource that represents the metadata of the DTO class. I built it using the "PocoMetadata" tool of the "Breeze Tooling Suite" (https://github.com/Breeze/breeze.tooling). This is needed because I can't return the _contextProvider.Metadata() result as long as I'm using DTOs and not the EF POCO classes.
Now, if my Angular 5 client issues an OData query like the following, I can see that the executeQuery() method actually works:
export class BreezeDataStoreComponent implements OnInit {

  private _em: EntityManager;

  constructor() {
    this._em = new EntityManager({
      serviceName: 'http://localhost:31328/breeze/People'
    });
  }

  ngOnInit() {
    const query = EntityQuery.from('GetPeople')
      .where('FirstName', FilterQueryOp.StartsWith, 'F')
      .orderBy('LastName', true);

    this._em.executeQuery(query).then(res => {
      // Here I can get all People instances.
      // Now I try to get the first one, edit the name and call saveChanges.
      (res.results[0] as any).FirstName = 'Franklino';
      this._em.saveChanges().then(saveResult => {
        const test = saveResult.entities;
      });
    });
  }
}
Unfortunately, the problems come with SaveChanges().
When the Angular client calls that method, I get the following error on the server side:
System.InvalidOperationException: Sequence contains no matching element
I think it's because I'm calling SaveChanges() on an EF context provider while passing a JObject bundle that refers to DTOs instead of the POCO classes.
So my questions are:
Is it possible to use BreezeJS querying and bulk persistence (the SaveChanges() method) with DTOs? It's a pretty common need in big data-centric enterprise applications, since I think exposing EF POCOs on a Web API is bad practice.
Should I instead rely on a classic Web API that responds to the POST/PUT/DELETE HTTP verbs? In that case, how do I configure the Breeze client to contact those endpoints instead of "SaveChanges" when persisting data?
If Breeze is not suitable for these needs, are there other technologies that provide the 4 points listed above?
Thank you very much.

To make SaveChanges work with your DTOs, you would need to either
1) write your own method to unpack the JObject saveBundle, or
2) use the ContextProvider's BeforeSaveEntities hook to modify the dictionary of DTOs and replace them with entities that EF understands.
Number 2 seems like the better choice; a sketch follows below. If you do not have a 1:1 match between entities and DTOs, some logic will be required when doing the mapping.
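A minimal sketch of option 2, assuming a PersonDTO that maps 1:1 onto the EF Person entity (the property names and the hand-written mapping are illustrative; AutoMapper could do the reverse mapping as well). The controller would then instantiate this derived provider instead of the plain EFContextProvider:

using System;
using System.Collections.Generic;
using Breeze.ContextProvider;       // package names may differ by Breeze version
using Breeze.ContextProvider.EF6;

public class PeopleContextProvider : EFContextProvider<AdventureWorksDbContext>
{
    // Called once per SaveChanges with the full map of incoming "entities" (here: DTOs).
    protected override Dictionary<Type, List<EntityInfo>> BeforeSaveEntities(
        Dictionary<Type, List<EntityInfo>> saveMap)
    {
        List<EntityInfo> dtoInfos;
        if (saveMap.TryGetValue(typeof(PersonDTO), out dtoInfos))
        {
            var personInfos = new List<EntityInfo>();

            foreach (var dtoInfo in dtoInfos)
            {
                var dto = (PersonDTO)dtoInfo.Entity;

                // Map the DTO back to the EF entity (illustrative property names).
                var person = new Person
                {
                    BusinessEntityID = dto.Id,
                    FirstName = dto.FirstName,
                    LastName = dto.LastName
                };

                // Re-wrap as an EntityInfo carrying the original entity state.
                // A real implementation would also translate dtoInfo.OriginalValuesMap
                // so that EF only updates the columns that actually changed.
                personInfos.Add(CreateEntityInfo(person, dtoInfo.EntityState));
            }

            // Persist Person entities instead of PersonDTOs.
            saveMap.Remove(typeof(PersonDTO));
            saveMap[typeof(Person)] = personInfos;
        }

        return saveMap;
    }
}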

Related

How do I map the DTO files to my Models in my .Net Core project

I've never worked with a .NET Core project before but have a history with .NET, including MVC and Entity Framework. I'm working with a new .NET Core project which has five solution folders: EHA.PROJ.API, EHA.PROJ.DTO, EHA.PROJ.Repository, EHA.PROJ.Repository.Test and EHA.PROJ.Web. The EHA.PROJ.DTO folder has a number of files, such as CategoryDTO.cs, which looks like this:
namespace EHA.PROJ.DTO
{
    public class CategoryDescDTO
    {
        public int CategoryRef { get; set; }
        public string CategoryName { get; set; }
    }
}
I'm looking to set up a mapping arrangement to get the data from the EHA.PROJ.DTO files into the model files in the Models folder of my EHA.PROJ.Web project. I've been browsing, as I've never done anything like this before; I've previously worked with data from a DAL folder using Entity Framework, with the connection made through connection strings. I'm guessing that there must be some process to map the data in my DbContext and connect the files in both folders. I did find some information on AutoMapper but was unsure how to implement it.
This arrangement with .NET Core is new to me, so if anyone can help with any examples or point me in the right direction I would be grateful.
Your first problem is having your entities in your web project. Right off the bat, you have tight-coupling between the web project and your data layer, which then pretty much negates the point of all your other layers: DTO, repository, etc. You want to move out your entities and context into a true data layer (i.e. a class library project separate from your web project).
Then, you want to decide how far your data layer should extend. If the API is to feed the Website, then you want to actually remove all dependencies on the data layer from the web project. Your DTO project would be shared between the API and Web projects and your API would send/receive your DTOs, mapping back and forth from your entities under the hood.
However, if you're going to do that, then the repository project should just go away entirely. Just have your API work directly with EF and your entities. Your abstraction is the API itself; there is no need for another. The only reason to have the repository layer is if the API and the Web project would both directly utilize the repositories, which isn't a very good pattern anyway. You'll inevitably end up with a bunch of duplicated logic specific to each project.
Simply, the repository pattern is superfluous when using an ORM like EF. The ORM is your data layer. You're simply using a DAL provided by a third-party, rather than one you created yourself. The repository pattern only makes sense when working directly with SQL using something like ADO.NET directly. Otherwise, get rid of it.
Having an API is enough of an abstraction, if your goal is simply to hide the data layer. The website knows nothing of the underlying data source, and an API is really just a service layer that returns JSON over HTTP rather than object instances directly, i.e. the API is essentially your "repository" layer.
The situation can be improved even further by moving to a microservices-based architecture. With that, you essentially have multiple small, self-contained APIs that work with just one part of your domain or piece of functionality. Each can utilize EF directly, or an entirely different ORM, or even an entirely different stack. You could have APIs built on Node.js or Python, etc. The website simply makes requests to the various services to get the data it needs and doesn't know or care how those services actually work.
I have been using AutoMapper for quite some time in .NET Core projects due to its ease of use and built-in dependency injection.
Install from PM:
Install-Package AutoMapper
Install-Package AutoMapper.Extensions.Microsoft.DependencyInjection
Register it in the ConfigureServices method of Startup.cs:
services.AddAutoMapper(typeof(Startup));
Create a class to keep your mappings, e.g. MappingProfile.cs; by deriving from AutoMapper's Profile class you can define your mappings.
public class MappingProfile : Profile
{
    public MappingProfile()
    {
        CreateMap<Operator, OperatorDto>().ReverseMap();
    }
}
The above mapping tells AutoMapper that Operator can be mapped to OperatorDto and OperatorDto can be mapped to Operator.
In your controller, you can inject an IMapper
private readonly IMapper _mapper;

public OperatorsController(IMapper mapper)
{
    _mapper = mapper;
}
and map values like below:
var dto = _mapper.Map<OperatorDto>(op); // Map op object to dto
var op = _mapper.Map<Operator>(dto); // Map dto to op object
AutoMapper offers custom mappings, should you need them; for example:
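Below is a sketch of a custom member mapping (the FullName, FirstName and LastName properties are made up for illustration):

public class MappingProfile : Profile
{
    public MappingProfile()
    {
        // Hypothetical example: compose OperatorDto.FullName from two Operator properties.
        CreateMap<Operator, OperatorDto>()
            .ForMember(dest => dest.FullName,
                       opt => opt.MapFrom(src => src.FirstName + " " + src.LastName));
    }
}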
While it is very easy to perform mappings with AutoMapper, you do need to learn the framework.
I believe it is worth the effort to learn it as it will save you a lot of time writing mapping code in the future.
This article is a good reference to start: https://buildplease.com/pages/repositories-dto/
My suggestion is to have a DTO assembler that maps your model to the DTO object. So, you start with your DTO class:
namespace EHA.PROJ.DTO
{
    public class CategoryDescDTO
    {
        public int CategoryRef { get; set; }
        public string CategoryName { get; set; }
    }
}
Then build the assembler:
public class CategoryDescAssembler {
    public CategoryDescDTO WriteDto(CategoryDesc categoryDesc) {
        var categoryDescDto = new CategoryDescDTO();
        categoryDescDto.CategoryRef = categoryDesc.CategoryRef;
        categoryDescDto.CategoryName = categoryDesc.CategoryName;
        return categoryDescDto;
    }
}
Now you implement the service to do all the work required to get the DTO object:
public class CategoryDescService : ICategoryDescService {
    private readonly IRepository<CategoryDesc> _categoryDescRepository;
    private readonly CategoryDescAssembler _categoryDescAssembler;

    public CategoryDescService(IRepository<CategoryDesc> categoryDescRepository, CategoryDescAssembler categoryDescAssembler) {
        _categoryDescRepository = categoryDescRepository;
        _categoryDescAssembler = categoryDescAssembler;
    }

    public CategoryDescDTO GetCategoryDesc(int categoryRef) {
        var categDesc = _categoryDescRepository.Get(x => x.CategoryRef == categoryRef);
        return _categoryDescAssembler.WriteDto(categDesc);
    }
}
With the interface looking like this:
public interface ICategoryDescService
{
    CategoryDescDTO GetCategoryDesc(int categoryRef);
}
You would then need to add the service to your Startup.cs:
public void ConfigureServices(IServiceCollection services)
{
    ...
    services.AddTransient<ICategoryDescService, CategoryDescService>();
}
Now you can call your service from your view controller, as sketched below.
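A minimal sketch of that call, assuming an MVC controller named CategoriesController and a Razor view that renders a CategoryDescDTO (both names are illustrative):

using Microsoft.AspNetCore.Mvc;

public class CategoriesController : Controller
{
    private readonly ICategoryDescService _categoryDescService;

    public CategoriesController(ICategoryDescService categoryDescService)
    {
        _categoryDescService = categoryDescService;
    }

    // GET /Categories/Details?categoryRef=5
    public IActionResult Details(int categoryRef)
    {
        // The service hides the repository and the DTO assembly from the controller.
        var dto = _categoryDescService.GetCategoryDesc(categoryRef);
        if (dto == null)
        {
            return NotFound();
        }
        return View(dto);
    }
}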

Is it possible to use DbContext code to access JSON data from a WebApi?

I have an ASP.NET Core 2.1 web app that I created following the steps in the Create Razor pages web app in the ASP.NET Core docs. Everything went as expected and I used the default localdb that was created when I scaffolded the Model. That part is fine.
What I need to do, however, is to try to use the same general code in a production environment where all CRUD functions are performed by calling WebApi methods, using JSON data that directly matches my Model objects.
For example, the page created by the scaffolding for Create looks like this:
public async Task<IActionResult> OnPostAsync()
{
    if (!ModelState.IsValid)
    {
        return Page();
    }

    _context.Registration.Add(Registration); // Here
    await _context.SaveChangesAsync(); // Here
    return RedirectToPage("./Index");
}
and
public async Task OnGetAsync()
{
    Registration = await _context.Registration.ToListAsync(); // Here
}
Is it possible, or even feasible, to simply change every occurrence of _context operations, like the lines marked "//Here", to use calls to WebApi methods? Is there a better way to accomplish what I'm trying to do?
You can only work with entity classes. However, you can convert JSON into an instance of one of your entity classes. For example:
using (var response = await _client.GetAsync("/foo/1"))
{
    var foo = await response.Content.ReadAsAsync<Foo>();
    _context.Add(foo);
    await _context.SaveChangesAsync();
}
The ReadAsAsync method is syntactic sugar that simply reads the response as a string and then passes that on to JSON.NET to convert into the type specified. If you have just a JSON string, you can instead utilize:
var foo = JsonConvert.DeserializeObject<Foo>(jsonString);
That said, this is a code smell. If your API is already interacting with your entity classes, all the work should be done there. If you want to add a new entity, there should be an endpoint on your API for that. Likewise for all the other CRUD operations. Having two different apps working with your same context, and only in pieces at that, is a recipe for disaster.

How to disable c# ODataController client side query

According to this tutorial: http://www.asp.net/web-api/overview/odata-support-in-aspnet-web-api/using-select-expand-and-value
"Web API 2 adds support for the $expand, $select, and $value options in OData. These options allow a client to control the representation that it gets back from the server"
My question is, how can I disable the representation manipulation done at the client side? In other words, my server makes sure that filtering/selecting etc. are done properly, and thus I do not want the client side to do it again. It is more of an overhead.
I think you misunderstand the purpose of query options like $expand, $select, etc. They do not cause data to be manipulated on the client. Rather, they are instructions to the service. In the Web API OData implementation, query options are typically handled by the EnableQuery attribute or the Queryable attribute. If you don't use these attributes, then you are responsible for writing the code that handles query options. Or you are free to not support them.
In your controller action, such as the Get method, add the [EnableQuery] attribute (this is for OData v4).
In your client, send out a request like ~/EntitySet?$filter=...&$select=...
Then the response will only contain the filtered and selected content; a sketch follows below.
Refer to https://github.com/OData/ODataSamples/tree/master/WebApi/v4/ODataQueryableSample for an example.
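A minimal sketch of such an action (Web API 2 with OData v4; the Product entity, Price/Name properties and AppDbContext are illustrative):

using System.Linq;
using System.Web.OData;

public class ProductsController : ODataController
{
    private readonly AppDbContext _db = new AppDbContext();

    // GET ~/Products?$filter=Price gt 10&$select=Name
    // [EnableQuery] applies the query options on the server, against the IQueryable,
    // before the results are serialized and returned to the client.
    [EnableQuery]
    public IQueryable<Product> Get()
    {
        return _db.Products;
    }
}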
You can create a custom attribute which would inherit from EnableQueryAttribute and then override the ValidateQuery method to limit the allowed query options as well as allowed functions and page size.
using System.Net.Http;
using System.Web.OData;
using System.Web.OData.Query;

public class SecureApiQueryAttribute : EnableQueryAttribute
{
    public override void ValidateQuery(HttpRequestMessage request, ODataQueryOptions queryOptions)
    {
        base.AllowedQueryOptions = AllowedQueryOptions.None;
        base.PageSize = 30;
        base.AllowedFunctions = AllowedFunctions.AllFunctions;
        base.ValidateQuery(request, queryOptions);
    }
}
Then you can use this custom attribute like this
[SecureApiQuery]
public IHttpActionResult Get([FromODataUri] int? key = null)
{
}

How to properly separate model validation from Controller into Service?

I'm in the middle of re-factoring the project I'm working on. In my existing controllers I do use the repository pattern, but I was still performing a little too much scaffolding than I felt comfortable with. That and some of my controllers could have 10+ repositories passed in (via Ninject). So, I decided to introduce a service layer where my intention is to have one service per controller and each service will instead have the multiple repositories injected into it and do the work I need. This works great so far, but I'm running into a confusion of sorts: How do I move the model validation away from the controller and into the service layer?
For example, take a look at this Edit method on my OfficesController:
[HttpPost]
public async Task<RedirectToRouteResult> Edit(
    short id,
    FormCollection form,
    [Bind(Prefix = "Office.Coordinates", Include = "Latitude,Longitude")] Coordinate[] coordinates) {
    if (id > 0) {
        Office office = await this.OfficesService.GetOfficeAsync(id);
        if ((office != null)
            && base.TryUpdateModel(office, "Office", new string[2] {
                "Name",
                "RegionId"
            }, form)
            && base.ModelState.IsValid) {
            this.OfficesService.UpdateOfficeAsync(office, coordinates);
        }
        return base.RedirectToAction("Edit", new {
            id = id
        });
    }
    return base.RedirectToAction("Default");
}
The problem with it, compared to the other methods on the controller, is that I still grab an Office object from the database, do the update, validate it, and then save it again. In this case the complexity increased rather than decreased. Before, I called the repository in the method; now I call the service, which calls the repository to perform the same function. So far this increase in complexity has only shown itself in my Edit methods; everywhere else the complexity decreased substantially, which is what I want.
So, what would be a proper way to move the validation, and, now that I think about it, the model-updating logic out of the controller and into the service? Recommendations are appreciated!
For reference, here's how my project is structured:
Data: Contains all of my model classes
Data.Google.Maps: Contains all classes I need to deserialize a specific Kml
Data.Models: Contains my DbContext, configurations, view models and partial view models
Data.Repositories: Contains all of my repositories that talk to the DbContext. Since EF is a pseudo-repository on its own, I'm leveraging my "repositories" as a more specific way of querying for data. For example: FindTechnicians() or FindActive(), etc.
Data.Services: Contains all of the services I will use. The services will have one or more repositories injected into them and perform all of the logic I need done before I pass a completed view model back to the controller.
Identity: Contains my implementation of ASP.NET Identity.
Web.Private: Contains the actual MVC project.
Here are 2 articles you should read if you haven't already:
https://cuttingedge.it/blogs/steven/pivot/entry.php?id=91
https://cuttingedge.it/blogs/steven/pivot/entry.php?id=92
The answers to your problem are FluentValidation.NET and dependency decoration.
With those in place, you could do something like this:
private readonly IExecuteCommands _commands;

[HttpPost]
public async Task<ActionResult> Edit(short id, UpdateOffice command) {
    // with FV.NET plugged in, if your command validator fails,
    // ModelState will already be invalid
    if (!ModelState.IsValid) return View(command);
    await _commands.Execute(command);
    return RedirectToAction(orWhateverYouDoAfterSuccess);
}
The command is just a plain DTO, like a viewmodel. Might look something like this:
public class UpdateOffice
{
    public int OfficeId { get; set; }
    public int RegionId { get; set; }
    public string Name { get; set; }
}
... and the magic validator:
public class ValidateUpdateOfficeCommand : AbstractValidator<UpdateOffice>
{
    public ValidateUpdateOfficeCommand(DbContext dbContext)
    {
        RuleFor(x => x.OfficeId)
            .MustFindOfficeById(dbContext);

        RuleFor(x => x.RegionId)
            .MustFindRegionById(dbContext);

        RuleFor(x => x.Name)
            .NotEmpty()
            .Length(1, 200)
            .MustBeUniqueOfficeName(dbContext, x => x.OfficeId);
    }
}
Each of these validation rules will be run before your action method even gets executed, provided you have the validators set up for dependency injection and that you are using the FV MVC validation provider. If there is a validation error, ModelState.IsValid will be false.
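As a rough sketch of that setup (assuming classic ASP.NET MVC 5 with the FluentValidation.Mvc package; the MvcValidatorFactory below is a hypothetical helper, and container-specific factories such as the one in FluentValidation.Ninject do the same job):

using System;
using System.Web.Mvc;
using FluentValidation;
using FluentValidation.Mvc;

// Resolves validators (and their DbContext dependency) through MVC's DependencyResolver.
public class MvcValidatorFactory : ValidatorFactoryBase
{
    public override IValidator CreateInstance(Type validatorType)
    {
        return DependencyResolver.Current.GetService(validatorType) as IValidator;
    }
}

// Global.asax.cs
public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        // Hook FluentValidation into MVC model validation so ModelState.IsValid
        // already reflects the command validators when the action runs.
        FluentValidationModelValidatorProvider.Configure(provider =>
        {
            provider.ValidatorFactory = new MvcValidatorFactory();
        });
    }
}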
You have also just solved the over-injection problems in both your controller and (maybe) your service layers. You can run any query, execute any command, or validate any object with only 3 interface dependencies; a sketch of the command side follows below.
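For completeness, a rough sketch of what the command side could look like; IExecuteCommands, IHandleCommand and the handler below are illustrative assumptions rather than part of FluentValidation or the answer above. An implementation of IExecuteCommands would simply resolve the matching IHandleCommand<TCommand> from the container and invoke it.

using System.Data.Entity;      // EF6 DbContext (assumption)
using System.Threading.Tasks;

public interface IExecuteCommands
{
    Task Execute<TCommand>(TCommand command);
}

public interface IHandleCommand<in TCommand>
{
    Task Handle(TCommand command);
}

// Handles the UpdateOffice command: load the entity, apply the changes, save.
public class UpdateOfficeHandler : IHandleCommand<UpdateOffice>
{
    private readonly DbContext _db;

    public UpdateOfficeHandler(DbContext db)
    {
        _db = db;
    }

    public async Task Handle(UpdateOffice command)
    {
        // The FluentValidation rules have already run, so the office is known to exist.
        var office = await _db.Set<Office>().FindAsync(command.OfficeId);
        office.Name = command.Name;
        office.RegionId = command.RegionId;
        await _db.SaveChangesAsync();
    }
}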

Entity Framework + WCF DataServices + Extension Methods

I'm trying to build a library project which will assist me in my other projects with some extension methods. The type to be extended is DbContext. Here is an example:
public static bool Insert<TEntity>(this DbContext Context, TEntity entity) where TEntity : class
{
    if (Context.Entry(entity).State == EntityState.Detached) // Entity is detached
    {
        Context.Set<TEntity>().Add(entity);
    }
    else // Entity is attached
    {
        Context.Entry(entity).State = EntityState.Added;
    }

    var validationErrors = Context.GetValidationErrors();
    if (validationErrors.Any()) return false;

    Context.SaveChanges();
    return true;
}
Where is the problem?
I'm exposing the context through WCF Data Services, and it shrinks my context's capabilities, which to me is contradictory: the goal here is to expose data, and how can you expose data without the means to reach it? So, how can I accomplish this task of exposing my extension methods as extension methods on my client-side context operations?
EDIT
I have been reading around and found this answer from Ladislav Mrnka:
Implement WCF Data Service using the Repository Pattern
The DbContext API differs from the API of the context generated when you add a reference to a data service. Moreover, the capabilities of the generated context are limited compared to DbContext. The client-side context is a helper for building OData queries, and it is not perfect. I don't think you could port every extension method without re-implementation (if it is possible at all).
