I have some sensitive data that I want to keep encrypted in the database and decrypt on the fly in code. Since it's an existing application, I would like the encryption/decryption to run as low as possible in the pipeline, so I don't have to amend my service layer. My first idea was to amend the getters and setters of selected properties to run them through my encryption helper. Then I thought it would be nice to have this wrapped in an attribute. I found the PostSharp extension that could do it, but it seems to be overkill for this scenario. Are there any other alternatives to achieve what I want?
Depending on how you use EF, you could build a wrapper on top of the EF DbContext: a generic repository into which you can inject hooks (interceptors).
One of those interceptors would be something like an ICryptographyInterceptor, which would handle, based on the entity type, the logic for encrypting (on insert) and decrypting (on retrieve). It would not pollute your business logic or models, since the interceptors take care of this concern.
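To make the idea concrete, here is a minimal sketch of what such a wrapper could look like. All the type names (ICryptographyInterceptor, Repository<T>) are illustrative, not from an existing library, and the EF 6 DbContext API is assumed:

using System.Data.Entity;

// Illustrative only: the interceptor encapsulates the encryption concern.
public interface ICryptographyInterceptor
{
    void OnInserting(object entity);    // encrypt marked properties before save
    void OnMaterialized(object entity); // decrypt properties after load
}

public class Repository<T> where T : class
{
    private readonly DbContext _context;
    private readonly ICryptographyInterceptor _crypto;

    public Repository(DbContext context, ICryptographyInterceptor crypto)
    {
        _context = context;
        _crypto = crypto;
    }

    public void Insert(T entity)
    {
        _crypto.OnInserting(entity);      // hook runs below the service layer
        _context.Set<T>().Add(entity);
    }

    public T Find(params object[] keys)
    {
        var entity = _context.Set<T>().Find(keys);
        if (entity != null)
            _crypto.OnMaterialized(entity); // decrypt on the way out
        return entity;
    }
}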
An existing implementation can be found here.
If the project is too complex for this kind of change, maybe an ICryptographyService for transforming entities would be a better solution.
I'm starting out with Xamarin and SQLite, specifically this package:
https://github.com/oysteinkrog/SQLite.Net-PCL
There is no fluent API (i.e. like in Entity Framework), which is OK, but does anyone have a pattern to keep attributes such as [PrimaryKey] out of the domain model, so that my domain library doesn't need a reference to the SQLite libraries? The only way I can see is to create separate classes in my SQLite repository library for each of my domain classes and employ some kind of mapping scheme. But that seems like a lot of work just to avoid an attribute. In fact, in this case it's probably easier to just bang out the SQL and do it that way instead of using SQLite.Net-PCL's built-in ORM.
It's probably worth it in my case to just litter my domain with the attributes and create a dependency on the SQLite.Net-PCL library.
Are there any other libraries that I can use with Xamarin/PCL that might help? Or is there a better technique?
If there really is no configuration-based approach like the fluent API you mention (I don't know of one), you have two options:
Use a separate persistence model
This is usually not as much effort as it initially appears, and it provides real separation of concerns. By going down this path, you also have a natural location for schema migration logic and custom data mappings.
The ugly part of this is the mapping between the persistence model and the domain model, which can typically be solved elegantly with an object-to-object mapper. For C#, a very good option is AutoMapper.
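As a rough sketch of how little code the mapping can be (type names are placeholders; this assumes AutoMapper's older static API and the SQLite.Net-PCL attributes):

using AutoMapper;
using SQLite.Net;            // SQLite.Net-PCL
using SQLite.Net.Attributes;

// Domain class: attribute-free, no reference to SQLite libraries.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Persistence class: lives in the SQLite repository library and carries the attributes.
public class CustomerRecord
{
    [PrimaryKey, AutoIncrement]
    public int Id { get; set; }
    public string Name { get; set; }
}

public class CustomerRepository
{
    private readonly SQLiteConnection _connection;

    public CustomerRepository(SQLiteConnection connection)
    {
        _connection = connection;
        // Normally configured once at startup; newer AutoMapper versions
        // use MapperConfiguration instead of the static API.
        Mapper.CreateMap<Customer, CustomerRecord>();
        Mapper.CreateMap<CustomerRecord, Customer>();
    }

    public void Add(Customer customer)
    {
        _connection.Insert(Mapper.Map<CustomerRecord>(customer));
    }

    public Customer Get(int id)
    {
        return Mapper.Map<Customer>(_connection.Get<CustomerRecord>(id));
    }
}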
Re-evaluate your decision to go with DDD
This is a rather drastic change, but given the fact that you're going to build a mobile app, there is a chance that DDD is a bit heavy anyway. If that's the case*, the door is open to just make the trade-off and have a unified persistence and "domain" model.
*: Take this decision seriously and consider all requirements. Don't base your decision solely on the "implementation detail" problem that you're having right now.
Does anyone know of any tools that can generate a C# WebApi REST Interface for a model?
What I would like is to define my model and add attributes that describe properties of the resource in the context of standard REST architecture. After defining the model and adding attributes, the hypothetical tool would generate all the code needed for implementation.
I'm not 100% sure if this tool will work for your very specific case, but it is a code generator nonetheless and appears to be relatively versatile.
http://www.codesmithtools.com/product/generator
I'm assuming you already searched for a code generator to suit your needs, so if this one doesn't work for you, you may have to face the music and write one yourself.
A co-worker of mine wrote a generator in C# that pulled schema/data-structure info from a SQL database and simply output C# code to a .cs file.
I know this isn't the best answer, but hopefully it helps in at least some way. :)
Use OData to create simple REST services (GET, POST, PUT, DELETE). OData is widely supported by UI components, and it allows dynamic querying at the request level.
You can simply replace the model and DTO names in an OData controller to reuse it as the controller for another model.
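For illustration, a typical OData controller is only a few lines; Product and MyDbContext here are placeholders for your own model and EF context:

using System.Linq;
using System.Web.Http;
using System.Web.OData; // older Web API versions: System.Web.Http.OData and [Queryable]

public class ProductsController : ODataController
{
    private readonly MyDbContext _db = new MyDbContext();

    // GET /odata/Products; supports $filter, $orderby, $top, etc. at request level.
    [EnableQuery]
    public IQueryable<Product> Get()
    {
        return _db.Products;
    }

    // POST /odata/Products
    public IHttpActionResult Post(Product product)
    {
        if (!ModelState.IsValid)
            return BadRequest(ModelState);
        _db.Products.Add(product);
        _db.SaveChanges();
        return Created(product);
    }
}

Swapping this to another model is mostly a find-and-replace of Product and the DbSet name, which is what makes the approach cheap.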
This post is meant to collect a list of suggestions on the MVVM approach: what tools do you use, what do you do to speed up development, how do you maintain your application, and do you have any special ways of finding defects in this design pattern?
Here is what I do:
First I create my model/database with EF.
Then I create views (either user controls or windows) with their respective viewmodels. I usually place the viewmodel in the same location as my view, but while the name of my view starts with "UC-name", I call my viewmodel just "name-model".
In my viewmodel I implement INotifyPropertyChanged (see the sketch after these steps).
In my XAML view I have a resource of my viewmodel, and I bind my grids/controls via ItemsSource to the StaticResource.
I try to do a lot of front-end logic with triggers and styles, and I also place some code in the xaml.cs file if it concerns the behaviour of my controls.
So I can reach my viewmodel from my view (XAML + xaml.cs).
For communication between viewmodels I use MVVM Light.
That's pretty much it.
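For reference, the INotifyPropertyChanged part mentioned in the steps above boils down to something like this (class names are my own):

using System.ComponentModel;

// Base class my viewmodels inherit from.
public abstract class ViewModelBase : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    protected void OnPropertyChanged(string propertyName)
    {
        var handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(propertyName));
    }
}

public class CustomerModel : ViewModelBase
{
    private string _name;
    public string Name
    {
        get { return _name; }
        set { _name = value; OnPropertyChanged("Name"); }
    }
}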
Things I'm thinking about
I'm thinking of using T4 templates for generating the viewmodel/view. What do you think of this? Is it worth it?
When using the MVVM Light Messenger, we get subscription-based communication, and sometimes I find it hard to track what has changed in my DataContext. Any suggestions on this?
Any other improvements or suggestions are more than welcome!
Answering your first question, regarding View/ViewModel generation: I think it makes sense to use tooling for CRUD cases; otherwise it won't be that beneficial.
A pretty nice example of a basic scaffolding implementation can be found here: WPF CRUD Generator. The WPF scaffolding solution by DevExpress also looks really promising.
There are at least a couple of CodePlex projects addressing View/ViewModel generation:
WPF Scaffolder
ViewModel Tool by Clarius
But I am quite pessimistic about T4 for such scenarios. I think writing and polishing your own T4 templates will take much more time than adopting an existing tool.
Regarding the MVVM Light Messenger, I can say that it will take some time to get used to it. As soon as you understand the difference between "regular" and message-driven MVVM, you'll be able to use it in the most efficient way. A very nice article about the Messenger is Messenger and View Services in MVVM, and I want to add a really important quote from there:
A Word of Caution About Messenger
Messenger is a powerful component that can greatly facilitate the task of communication, but it also makes the code more difficult to debug because it is not always clear at first sight which objects are receiving a message. Use with care!
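To make that caution concrete, here is a minimal example of a Messenger exchange (the message and viewmodel types are placeholders). Note that neither side reveals the other, which is exactly what makes it hard to debug:

using GalaSoft.MvvmLight.Messaging;

// A dedicated message type per event makes it much easier to search
// for all senders and receivers when you are tracking a change.
public class CustomerSelectedMessage
{
    public int CustomerId { get; set; }
}

public class DetailViewModel
{
    public DetailViewModel()
    {
        // Subscribe: nothing here tells you who sends this message.
        Messenger.Default.Register<CustomerSelectedMessage>(
            this, msg => LoadCustomer(msg.CustomerId));
    }

    private void LoadCustomer(int customerId)
    {
        // load and expose the selected customer...
    }
}

public class ListViewModel
{
    public void OnSelectionChanged(int customerId)
    {
        // Send: likewise, nothing here tells you who is listening.
        Messenger.Default.Send(new CustomerSelectedMessage { CustomerId = customerId });
    }
}

One mitigation for the tracking problem you mention is to define one small message class per event, as above, so that "find all references" on the message type shows every sender and receiver.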
I'm very much a proponent of Domain-Driven Design (DDD). First I have the designer write specifications, roughly adhering to the methodologies of Behavior-Driven Development (BDD). These then form the basis of unit tests for Test-Driven Development (TDD), for which I use NUnit. For the domain layer itself I start with an anemic domain model, i.e. entity classes containing mostly properties and virtually no methods; there are plenty of arguments both for and against this, but personally I find it works well. Coupled with this is the Business Logic Layer (BLL), which knows only about the domain entities.
For the Data Access Layer (DAL) I prefer NHibernate. It supports all the usual things you would expect, like lazy loading and repository management, but particularly important is the Object-Relational Mapping (ORM), i.e. the bit that translates between your domain entities and the underlying database representation.
One of the problems with NHibernate, in my opinion, is that it uses XML files to do the mapping for the ORM. This means two things: first, any errors you introduce won't get picked up until run-time; second, it's not really a proper "solution" to ORM at all, because instead of writing mapping classes you just wind up writing XML files. Both of these problems can be solved by using Fluent NHibernate. It solves the first problem by replacing the XML files with C# files, so your mapping declarations are done in code, which will usually pick up errors at compile-time. It solves the second problem by providing an auto-mapper, which looks at your entities and generates the necessary mappings automatically. This can be manually overridden if and where needed, although in practice I find I seldom need to. Since the auto-mapper uses reflection it does tend to be a bit slow, but it can be run in an offline utility and then saved to a configuration file that is loaded at run-time for near-instant start-up; the same utility can also be used to create your database automatically. I've used this tech with MySQL, MS SQL Server and SQL Server CE... they've all worked fine.
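A minimal Fluent NHibernate auto-mapping configuration, to give a feel for it (the User entity and connection details are placeholders):

using FluentNHibernate.Automapping;
using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;
using NHibernate;
using NHibernate.Tool.hbm2ddl;

public class User
{
    public virtual int Id { get; set; }          // virtual so NHibernate can proxy
    public virtual string UserName { get; set; }
}

public static class SessionFactoryBuilder
{
    public static ISessionFactory Build(string connectionString)
    {
        return Fluently.Configure()
            .Database(MsSqlConfiguration.MsSql2008.ConnectionString(connectionString))
            // Auto-map every entity in the assembly containing User;
            // per-entity overrides can be added where the conventions don't fit.
            .Mappings(m => m.AutoMappings.Add(AutoMap.AssemblyOf<User>()))
            // Create the schema automatically, as described above.
            .ExposeConfiguration(cfg => new SchemaExport(cfg).Create(false, true))
            .BuildSessionFactory();
    }
}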
On the other side of the tier is your view model. I've seen a lot of projects create an almost 1:1 mapping of domain entities to view model classes. I may infuriate MVVM purists by saying this, but I really don't see the point in doing all that extra work for something that isn't really needed. NHibernate allows you to provide proxies for the classes it creates: using Castle DynamicProxy you can attach an interceptor to your NHibernate session factory that automatically injects INotifyPropertyChanged notification into all of your entity properties so that they work with the WPF binding mechanism. Another product, uNhAddIns, allows you to replace any lists with an ObservableCollection in order to get INotifyCollectionChanged support (for reasons I won't go into, you can't just put an ObservableCollection into your entities without seriously affecting performance).
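A stripped-down version of that INotifyPropertyChanged injection trick looks something like this; the real NHibernate session factory wiring is more involved, so treat this as a sketch of the Castle DynamicProxy part only:

using System.ComponentModel;
using Castle.DynamicProxy;

public class NotifyPropertyChangedInterceptor : IInterceptor
{
    private PropertyChangedEventHandler _handler;

    public void Intercept(IInvocation invocation)
    {
        // The proxy (not the entity) implements INotifyPropertyChanged,
        // so the event subscription is managed here.
        if (invocation.Method.Name == "add_PropertyChanged")
        {
            _handler += (PropertyChangedEventHandler)invocation.Arguments[0];
            return;
        }
        if (invocation.Method.Name == "remove_PropertyChanged")
        {
            _handler -= (PropertyChangedEventHandler)invocation.Arguments[0];
            return;
        }

        invocation.Proceed();

        // Raise PropertyChanged after any property setter runs.
        if (invocation.Method.Name.StartsWith("set_") && _handler != null)
            _handler(invocation.Proxy,
                new PropertyChangedEventArgs(invocation.Method.Name.Substring(4)));
    }
}

public static class NotifyingEntityFactory
{
    private static readonly ProxyGenerator Generator = new ProxyGenerator();

    // Entity properties must be virtual, which NHibernate requires anyway.
    public static T Create<T>() where T : class
    {
        return (T)Generator.CreateClassProxy(
            typeof(T),
            new[] { typeof(INotifyPropertyChanged) },
            new NotifyPropertyChangedInterceptor());
    }
}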
If you're designing and building your application properly using technologies like these, and fully unit-testing along the way, then you're going to need some way of handling Inversion of Control (IoC) so that you aren't passing object references around all over the place, and for that you'll need a dependency injection framework. My personal preference is Ninject, but Unity is pretty good too. Dependency injection is particularly important for good database session management (so that all relevant objects reference the same session); a good rule is one session per WPF form, or one per web request.
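The wiring for that is pleasantly small. A sketch with Ninject, reusing the SessionFactoryBuilder from the earlier snippet (the repository types are placeholders for whatever your BLL exposes):

using Ninject;
using NHibernate;

public static class CompositionRoot
{
    public static IKernel BuildKernel(string connectionString)
    {
        var kernel = new StandardKernel();

        // One session factory for the lifetime of the application.
        kernel.Bind<ISessionFactory>()
              .ToMethod(ctx => SessionFactoryBuilder.Build(connectionString))
              .InSingletonScope();

        // Scope sessions per WPF form; on the web, use .InRequestScope()
        // from Ninject.Web.Common instead.
        kernel.Bind<ISession>()
              .ToMethod(ctx => ctx.Kernel.Get<ISessionFactory>().OpenSession());

        kernel.Bind<IUserRepository>().To<UserRepository>(); // placeholder types

        return kernel;
    }
}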
There are lots of other little things I use to make life easier (MVVM Light, log4net, Moq for mocking objects in unit tests, etc.), but this is my core architecture. It takes a while to set up, but once you've got it all going you can build fully functional database applications in literally minutes, without any of the headaches traditionally associated with layer management in tiered enterprise applications: you just create your domain entities and then start coding against them. Your schema is created automatically, your database is created automatically, you can use your entity classes to fill your database for immediate stress testing, and you have full WPF support without having to pollute your entity classes with code or attributes not actually related to the domain. And since all development is driven by anemic domain entities, your data is already in the perfect format for serialization into HTML/AJAX/SOAP etc. when you want to give your app web capabilities.
You'll notice that I haven't discussed the presentation/XAML layer, mainly because that part is now straightforward. Using a decent architecture you can actually create a fully working and tested application that then only needs pure XAML added to turn it into a releasable product.
I am pretty new to MVC 4, and up to this point I have worked mostly with Web Forms in C#. I understand the MVC pattern, the routing, calling actions and so on.
But what about the actions which are responsible for fetching data from the database, for example by firing stored procedures? I have seen some tutorials where they put the logic for connecting to the database directly in the actions.
However I am thinking of a more centralized way to do it. For example, I can put all the functions which are firing stored procedures in a separate class named DatabaseCoordinator.cs in a folder named Helpers for example. Then I can call them from the actions in the controllers.
In that way I will know that I can find all of my database methods in one class, which I think is a very clean solution (or at least it was in Web Forms). However, I want to follow the MVC pattern and use only models, views and controllers, as the name of the pattern itself implies.
So what is the best practice for that? Should I make a separate class for this, or implement the logic directly in the controllers, or perhaps somewhere else?
You should certainly make a separate repository class to contain all of your data access operations.
There is a good worked example here:
http://www.asp.net/mvc/tutorials/getting-started-with-ef-using-mvc/implementing-the-repository-and-unit-of-work-patterns-in-an-asp-net-mvc-application
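The shape of it, loosely following the linked tutorial's Student example (the entity and context here are minimal placeholders):

using System.Data.Entity;
using System.Linq;

public class Student
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class SchoolContext : DbContext
{
    public DbSet<Student> Students { get; set; }
}

// Controllers depend on this interface; whether GetAll is backed by
// LINQ, a stored procedure, etc. stays hidden behind it.
public interface IStudentRepository
{
    IQueryable<Student> GetAll();
    Student GetById(int id);
    void Add(Student student);
    void Save();
}

public class StudentRepository : IStudentRepository
{
    private readonly SchoolContext _context;

    public StudentRepository(SchoolContext context) { _context = context; }

    public IQueryable<Student> GetAll() { return _context.Students; }
    public Student GetById(int id) { return _context.Students.Find(id); }
    public void Add(Student student) { _context.Students.Add(student); }
    public void Save() { _context.SaveChanges(); }
}

A controller then takes an IStudentRepository through its constructor and never touches the DbContext directly.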
I recommend that you put your data access code somewhere other than in your controller. The controller's primary purpose is to gather together the information for display on a page or the reverse - to take the data from the page that is posted back and feed it to the code responsible for business rules and data access.
For most MVC projects (heck, for most projects really!) I build separate class library projects - at minimum one for business rules and data access, though typically I'll make those two separate projects. The purpose of separating the logic is really for simpler future maintenance and reusability. If you keep your various logical parts separate, you can easily swap them out if your logic or database needs to change, or you can easily consume the business rules and data from a new type of user interface; for example, if you decided to implement your project as a Windows forms application in addition to your web system, you could (theoretically) just reuse your business logic and data access logic libraries and only rebuild the user layer. However, if you build your logic into your controller, you really can't reuse that logic without extracting it and converting it to the new application model you're using.
So, simply put, definitely keep 99% of your logic and data access out of your controller. Only put what you must put into your controller, the rest in a separate class, or where appropriate, in separate class libraries.
Good luck!
The controllers and views tend to stay within the same project, but it's common to split the data access classes and models into their own separate class library, as this allows other projects to utilise them.
This will allow you, in the future, to add perhaps a Windows Forms/WPF interface or a mobile device interface, leveraging the work you already have in the standalone class library.
Another thing to consider is looking into how to use view models in your MVC application. It's a common technique when views require more than one domain object; see Using View Models in MVC.
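For example, a view model for a page that shows a customer together with their orders might look like this (the domain classes are stand-ins):

using System.Collections.Generic;

public class Customer { public int Id { get; set; } public string Name { get; set; } }
public class Order { public int Id { get; set; } public decimal Total { get; set; } }

// One class composing everything this particular view needs.
public class CustomerOrdersViewModel
{
    public Customer Customer { get; set; }
    public IList<Order> RecentOrders { get; set; }
    public int TotalOrderCount { get; set; }
}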
Check out the Unit of Work (UoW) pattern combined with the Repository pattern. It doesn't matter whether you ultimately call a stored procedure or an inline LINQ query to return results; your caller shouldn't know or care how GetPersons is ultimately implemented. The UoW pattern combined with the Repository pattern is a very popular way to expose an Entity Framework database in the ASP.NET community. You will find different ways to do it, some of which are overkill and some of which just create dependencies with no actual benefit, but you will find a way that feels right to you with those patterns.
After more experience, I would like to change my answer and state that the Repository pattern, and thus the Unit of Work pattern, are pointless layers of abstraction that keep you from working directly with Entity Framework, which is already your data layer abstraction.
Other than being able to swap out databases, say from Microsoft SQL Server to PostgreSQL (when would this ever happen in the real world?), and controlling the structure of complex queries that you don't want repeated in your code, I see no real value in the repository pattern. To stamp CreatedBy/ModifiedBy values on insert/update, you need only override SaveChanges on your Entity Framework context. To encapsulate queries that include business rules, such as where active = 1 and isdeleted = 0, just extend LINQ queries with extension methods, as sketched below.
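A sketch of both techniques (entity, property, and context names are placeholders; EF 6 assumed):

using System.Data.Entity;
using System.Linq;

public class Person
{
    public int Id { get; set; }
    public bool Active { get; set; }
    public bool IsDeleted { get; set; }
    public string CreatedBy { get; set; }
    public string ModifiedBy { get; set; }
}

public static class PersonQueryExtensions
{
    // Encapsulates the "active = 1 and isdeleted = 0" rule in one place,
    // without hiding Entity Framework behind a repository.
    public static IQueryable<Person> Live(this IQueryable<Person> query)
    {
        return query.Where(p => p.Active && !p.IsDeleted);
    }
}

public class AppContext : DbContext
{
    public DbSet<Person> People { get; set; }

    // Stamp audit values by overriding SaveChanges instead of
    // funnelling every write through a repository method.
    public override int SaveChanges()
    {
        foreach (var entry in ChangeTracker.Entries<Person>())
        {
            if (entry.State == EntityState.Added)
                entry.Entity.CreatedBy = "current-user"; // resolve the real user here
            if (entry.State == EntityState.Modified)
                entry.Entity.ModifiedBy = "current-user";
        }
        return base.SaveChanges();
    }
}

// Usage: new AppContext().People.Live().ToList()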
Platform: C# 2.0
Using: Castle.DynamicProxy2
I have been struggling for about a week now trying to find a good strategy to rewrite my DAL. I tried NHibernate and, unfortunately, it was not a good fit for my project. So, I have come up with this interaction thus far:
I first start by registering my DTOs and my data mappers:
MetaDataMapper.RegisterTable(typeof(User));
MapperLocator.RegisterMapper(typeof(User), typeof(UserMapper));
This maps each DTO as it is registered, using custom attributes on the DTO's properties, essentially:
[Column(Name = "UserName")]
I then have a mapper belonging to each DTO, so for this type it would be UserMapper. This data mapper handles calling my ADO.NET wrapper and then mapping the result to the DTO. However, I am now in the process of enabling deep loading, and subsequently lazy loading, and this is where I am stuck. Basically, my User DTO may have an Address object (FK) which requires another mapper to populate it, but I have to determine at run time to use the AddressMapper.
My problem is handling the types without having to explicitly go through a list of them (not to mention the headache of always having to keep that list updated) each time I need to determine which mapper to return. So my solution was a MapperLocator class that I register with (as above) and that returns an IDataMapper interface, which all of my data mappers implement. Then I can just cast it to UserMapper if I am dealing with User objects. This, however, is not so easy when I am trying to determine the type of data mapper to return at run time. Since generics have to know what they are at compile time, using AOP and then passing in the type at run time is not an option without using reflection. I am already doing a fair bit of reflection when I map the DTOs to the table, reading attributes and such, and my MapperFactory uses reflection to instantiate the proper data mapper. So I am trying to do this without reflection, to keep those expensive calls down as much as possible.
I thought the solution could be found in passing around an interface, but I have yet to be able to implement that idea. Then I thought the solution might be in using delegates; the rough sketch below is as far as I got with that idea. So... frankly... I am lost. Can you help, please?
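Here is the delegate-based registration sketch (it changes the RegisterMapper signature from the snippet above; C# 2.0, so anonymous methods rather than lambdas):

using System;
using System.Collections.Generic;

public interface IDataMapper { /* Load, Insert, etc. */ }

public delegate IDataMapper MapperFactory();

public static class MapperLocator
{
    private static readonly Dictionary<Type, MapperFactory> factories =
        new Dictionary<Type, MapperFactory>();

    // Register one factory delegate per DTO type, so no reflection
    // is needed when a mapper is requested later.
    public static void RegisterMapper(Type dtoType, MapperFactory factory)
    {
        factories[dtoType] = factory;
    }

    public static IDataMapper GetMapper(Type dtoType)
    {
        MapperFactory factory;
        if (!factories.TryGetValue(dtoType, out factory))
            throw new InvalidOperationException("No mapper registered for " + dtoType.Name);
        return factory();
    }
}

// Registration:
//   MapperLocator.RegisterMapper(typeof(User), delegate { return new UserMapper(); });
// While deep-loading User.Address at run time:
//   IDataMapper addressMapper = MapperLocator.GetMapper(typeof(Address));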
I will suggest a couple of things.
1) Don't prematurely optimize. If you need to use reflection to instantiate your *Mappers, do it with reflection. Look at the headache you're causing yourself by not doing it that way. If you have problems later, then profile to see if there are faster ways of doing it.
2) My question to you would be: why are you trying to implement your own DAL framework? You say that NHibernate isn't a good fit, but you don't elaborate on that. Have you tried any of the dozens of other ORMs? What are your criteria? Your posted code looks remarkably like Linq2Sql's mapping.
Lightspeed and SubSonic are both great lightweight ORM packages. Linq2Sql is an easy-to-use mapper, and of course there's Microsoft's Entity Framework, which has a whole team at Microsoft dealing with the problems you're describing.
You might save yourself a lot of time, especially maintenance wise, by looking at these rather than implementing it yourself. I would highly recommend any of those that I mentioned.