WPF - MVVM Architecture (Visual Studio Solution and Projects) [closed] - c#

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
I'm working on a Windows desktop WPF project using the MVVM pattern. At the moment the app is fairly simple (I can't really explain the nature of the product), but it is expected to grow into a more complex app.
The WPF app has a local database and also connects to a REST service.
Development time is not really the top concern; maintainability and testability are.
I will use an IoC container and DI.
I'm planning on a 1:1 mapping between ViewModels and Views.
I don't want to use any WPF/MVVM frameworks, as this is my first WPF-MVVM app (much like writing bare-DOM JavaScript first, even though jQuery exists).
I decided to use multiple projects, and here's what I came up with so far:
Product.Windows.Common (Utils, Logging, Helpers, etc.)
Product.Windows.Entities (Database and REST entities)
Product.Windows.Contracts (All Interfaces will reside in this namespace/project)
Product.Windows.Data (for local Database)
Product.Windows.ServiceClients (for REST client)
Product.Windows.App (the main WPF project, contains the Views/XAML)
Product.Windows.Models (INotifyPropertyChanged)
Product.Windows.ViewModels (INotifyPropertyChanged and ICommands)
Product.Windows.Tests (Unit Tests)
I just want to ask:
Is this architecture a bit overkill?
Do I need to create a Product.Windows.Business for the business logic? Or should I just put business logic in the ViewModels?
Thank you in advance :)

I'm currently working on an app with a similar structure. The project structure looks OK; in my project I did things a little differently, though.
The Data and ServiceClients assemblies might represent your DAL. It's good that these are separated into different assemblies: in the Data assembly you'll have the repositories, and in ServiceClients you'll have the service agents. The Entities and Contracts assemblies might represent your BL. Here, I think you could have used a single assembly, referenced by both DAL assemblies.
It's good that logging is implemented separately, and if you have security, it should also be implemented in Common. From what I've read recently in a great book, Dependency Injection in .NET, utils and helpers are a result of poor or incomplete design; these classes usually contain static methods. But I don't think that's relevant to this discussion.
On my projects I usually implement the VMs in the same assembly as the views. This includes the RelayCommand (the ICommand implementation) and the ViewModelBase that implements INotifyPropertyChanged.
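The VM infrastructure mentioned above is small enough to write by hand. A minimal sketch (the class and member names are the conventional ones, but this is an illustration, not a prescribed implementation):

```csharp
using System;
using System.ComponentModel;
using System.Runtime.CompilerServices;
using System.Windows.Input;

// Minimal INotifyPropertyChanged base class for all ViewModels.
public abstract class ViewModelBase : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    protected void OnPropertyChanged([CallerMemberName] string propertyName = null)
        => PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(propertyName));
}

// Minimal ICommand implementation that delegates to two lambdas.
public sealed class RelayCommand : ICommand
{
    private readonly Action<object> _execute;
    private readonly Predicate<object> _canExecute;

    public RelayCommand(Action<object> execute, Predicate<object> canExecute = null)
    {
        _execute = execute ?? throw new ArgumentNullException(nameof(execute));
        _canExecute = canExecute;
    }

    // Let WPF re-query CanExecute when it re-evaluates command bindings.
    public event EventHandler CanExecuteChanged
    {
        add => CommandManager.RequerySuggested += value;
        remove => CommandManager.RequerySuggested -= value;
    }

    public bool CanExecute(object parameter) => _canExecute?.Invoke(parameter) ?? true;
    public void Execute(object parameter) => _execute(parameter);
}
```

A ViewModel then derives from ViewModelBase, raises OnPropertyChanged() in its property setters, and exposes ICommand properties built from RelayCommand.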
I recently watched a presentation by Robert Martin. From what I can remember, he said that an application's architecture should scream what the application does: classes should not be grouped into projects or folders named after patterns (MVC or MVVM), because that tells us nothing about what the app does. Classes should be grouped by what they do, by the features they implement. I'm not at that phase yet; I'm still grouping things like you do :).
I see that you only have a single test project. That can be fine if you add a directory in it for each assembly you plan to test; if you don't, it will be a little hard to find the tests for a particular assembly. Alternatively, you might want to add a test project for every assembly you plan to test.

You can organize your components however you want, but I prefer the following structure:
- Create two class libraries (dlls) for each screen in your project: one holds the views + view models for that screen, the other holds its business logic. That way you can reuse a view and view model with different business logic, you can change or update each screen's business/view separately, and an update takes effect simply by replacing a dll.
Keep all of your components except:
Product.Windows.ViewModels
Product.Windows.Models

It's a bit of overkill, but I think only you can vouch for your own program. I would put Contracts inside Common and Entities (depending on functionality). Also, I don't think you need to completely separate the View from the ViewModel; keeping them in the same project will also ease the changing/debugging process.
If your program is client-side only, you can keep the BL in the ViewModel (at least while it's not TOO complicated to follow). If you have a main server and multiple clients, then you should not implement ANY logic (except cosmetics) in your ViewModel, and yes, create a new project.

Related

How to ensure Database Independence & Loose Coupling between Application and Infrastructure layers? [closed]

Closed 4 months ago.
I am trying to implement Clean Architecture, and my current understanding is that it is meant to increase loose coupling and database independence, mostly through dependency injection and dependency inversion. I am currently using EF Core in the infrastructure layer with MassTransit (mediator and messaging) in the application layer. I use a generic repository that sits in the infrastructure layer, where EF-related methods like "ToListAsync" and "FindAsync" are implemented, and I access them from the application layer through an interface. My LINQ specification code also sits in the application layer.
This all made sense as long as I assumed that the point of moving the EF dependency into the infrastructure layer was to make my code framework- and database-independent, and that LINQ would work with other databases or with another DAL library. It just so happens that I recently decided to replace EF with Dapper, and all this talk about database independence and loose coupling started to make little sense to me. I still have to rewrite everything from the ground up, since LINQ queries can't be used with Dapper, which means my extension methods and the other abstractions I built are now useless. Besides, there are many other very relevant (NoSQL) databases that have no LINQ mapping.
Now here's my question: how do I ensure that my Core project (domain and application layers) stays agnostic of anything that relates to the persistence layer? That includes not only EF but also LINQ queries.
Why on earth do we need to make the application layer seemingly "independent" (but not really) from EF Core when it makes no difference at the end of the day? It comes with no added value at all. The reliance of the application code on the database and data access libraries is still there.
You almost got it, but instead of concluding that you did something wrong, you concluded that there's something wrong with Clean Architecture. But, as you say, why would you make it seemingly independent but not really? Well, you don't! You have to really make it independent.
There are two important things to note from the description of your implementation:
Using (EF Core) LINQ in the application layer. Querying a DB using LINQ is a very specific EF thing. The fact that you managed somehow to hide part of the expression (ToListAsync) in the infrastructure layer doesn't mean that you have abstracted anything. Your application code is custom-made for EF and EF only.
You are using a generic repository. A generic repository, even behind a (single) interface, is not Clean Architecture friendly. In Clean Architecture it's the Core or business logic code which defines a very concrete interface for each specific scenario. As all scenarios are different, you can't create a (single) generic repository interface covering them all without forcing some scenarios to depend on functionality they don't need. Not only is this not SOLID, it can also complicate your life a lot.
For example, as Clean Architecture promises, you should be able to replace your DB, but not as a big-bang change. You should be able to move your Products (for example) to MongoDB while leaving the rest of the application in SQL Server. Of course, if all your data access is behind a generic repository, you are forced to change everything at once. Instead, your business logic code should define an interface for every use case (IProductsRepository, ICustomersRepository, etc.). Each interface has only the concrete methods required by that case, no more. Note that if you wanted, you could still implement all interfaces with a single class, or with a lot of shared code in a base class in the infrastructure layer, but you can always move one interface to a completely different implementation. Of course, the interface has to abstract the whole data access implementation, not only the ToListAsync part.
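To make the second point concrete, here is a sketch of what narrow, per-use-case repository interfaces in the Core project might look like (the entity and member names are invented for illustration):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// Illustrative entities; names are assumptions, not from the question.
public sealed class Product  { public int Id { get; set; } public bool Discontinued { get; set; } }
public sealed class Customer { public string Email { get; set; } }

// Each use case gets its own narrow interface, defined in the Core project.
// No IQueryable, no LINQ expressions: the whole data access is abstracted.
public interface IProductsRepository
{
    Task<Product> GetByIdAsync(int id);
    Task<IReadOnlyList<Product>> GetDiscontinuedAsync();
    Task AddAsync(Product product);
}

public interface ICustomersRepository
{
    Task<Customer> GetByEmailAsync(string email);
}
```

The infrastructure layer is free to implement both interfaces with a single EF Core or Dapper class, yet each interface can later be moved to a completely different store (e.g. Products to MongoDB) without touching the others.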
Maximizing the separation of the business from storage-layer technologies has always been one of my concerns. Just try to change how you define your repositories. Take a look at a very simple library I use: https://github.com/mzand111/MZBase. MZBase.Infrastructure (also available as a NuGet package) is a .NET Standard library which contains some base interfaces and classes.
Also, to handle paging, sorting and filtering using only basic data types, I have developed another library, https://github.com/mzand111/MZSimpleDynamicLinq, which provides two main classes to use in your repositories: LinqDataRequest and LinqDataResult. So if you take a look at ILDRCompatibleRepositoryAsync, it has minimal technology dependence, and implementing this interface is possible in most ORM technologies.
The idea of Clean Architecture is to separate your business logic so thoroughly from any external service, DB or IO that you do NOT have to rewrite anything in your business logic when you replace one technology with another.
If you still have to rewrite parts of your business logic, then it is obviously not separated properly. If your LINQ statements only work when the implementation is EF, then the interface adapter is not really an adapter, and the business logic is making an assumption about the implementation of the DAL.
Additionally this thread might be interesting in this context: How can Clean Architecture's interface adapters adapt interfaces if they cannot know the details of the infrastructure they are adapting?

Should I use a Bootstrapper class in a WebApi project?

Starting with ASP.NET 5, I wanted to lay the foundation of my project. So far, I have created two projects:
Project - the Web API project, which comes with a Startup class.
Project.Server - a dll project that will hold all the business logic.
At first I thought I should write a Bootstrapper class in "Project.Server" that would allow me to hide many parts of that dll (parts that "Project" doesn't need to know about), but then I found myself thinking I may be doing extra work; in "Project"'s Startup class I'm already calling many members of my Bootstrapper class.
Is this extra layer of abstraction needed in a Web API project?
"Project.Server" is currently only referenced by "Project", but I still want to structure it correctly...
Different people will have different opinions on how to structure your web app. Personally, for me, it's a matter of how much work is involved. If it's fairly easy for you to separate out your business logic into a separate DLL, then do it. Even though there may not be any immediate advantages now (since Project is the only consumer of Project.Server), in the future, if you ever decide there needs to be another consumer of the business logic, it will be a lot easier to make that work. However, if it's a lot of work to create this extra layer, then I'd say it's not worth it, since you can't really predict what the future might bring, and so why spend a ton of effort trying to code for a future that is unknown.
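One common middle ground, if you do keep the bootstrapper, is a single public extension method in Project.Server so that implementation types stay hidden and Startup makes exactly one call. A sketch, assuming the ASP.NET Core-style IServiceCollection and invented service names:

```csharp
using Microsoft.Extensions.DependencyInjection;

namespace Project.Server
{
    // Hypothetical service; in the real dll these are your BL types.
    // The interface is public; the implementation stays internal to the dll.
    public interface IOrderService { }
    internal sealed class OrderService : IOrderService { }

    // The only public bootstrapping entry point of the assembly.
    public static class ServerBootstrapper
    {
        public static IServiceCollection AddProjectServer(this IServiceCollection services)
        {
            // Registrations are centralized here; "Project" never sees
            // the concrete types, only this one method.
            services.AddScoped<IOrderService, OrderService>();
            return services;
        }
    }
}
```

In Startup.ConfigureServices the host project then only needs a single services.AddProjectServer() call, so adding a second consumer later costs nothing.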

Restructuring my application [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I am handling a project which needs restructuring of a set of projects belonging to a website. It also has tightly coupled dependencies between the web application and the referenced projects. Kindly help me with some ideas and tools on how the calls could be refactored into more maintainable code.
The main issue is that there are calls to apply promotions (the promotion class has more than 4 different methods) consumed from various functions, which cannot be streamlined easily.
Kindly help me here with best practices.
Sorry guys, I could not share much code due to restrictions, but I hope the below helps.
My project uses NHibernate for data access.
Project A - web project - aspx and ascx with code-behind
Project B - contains class definitions consumed by project C (data operation classes)
Project C - business logic with save-to-database methods (customer, order, promotion etc.)
The problem is with project C, which I am not sure does too many things or needs to be broken down. But there are already many other sub-projects.
Project C supports things like saving details to the DB based on parameters.
Some of the class methods in it call the promotion logic based on some condition. I would like to make things more robust - sample code below.
Project C:
Class - OrderLogic

public void UpdateOrderItem(....)
{
    ....
    Order order = itm.Order;
    promoOrderSyncher.AddOrderItemToRawOrderAndRunPromosOnOrder(itm, ref order);
    orderItemRepository.SaveOrUpdate(itm);
}
Just like in the class above, the promotion logic is called from many other places; I would like to streamline these calls to the promotion class. So I am looking for some concepts.
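One way to streamline this is to pull every promotion call behind a single interface and inject it where needed. The names below are hypothetical and only illustrate the seam:

```csharp
// Illustrative stand-ins for the existing types.
public class Order { }
public class OrderItem { public Order Order { get; set; } }
public interface IOrderItemRepository { void SaveOrUpdate(OrderItem item); }

// Hypothetical seam: all promotion logic is reached through one interface.
public interface IPromotionService
{
    void ApplyPromotions(OrderItem item, ref Order order);
}

public class OrderLogic
{
    private readonly IPromotionService _promotions;
    private readonly IOrderItemRepository _orderItemRepository;

    public OrderLogic(IPromotionService promotions, IOrderItemRepository orderItemRepository)
    {
        _promotions = promotions;
        _orderItemRepository = orderItemRepository;
    }

    public void UpdateOrderItem(OrderItem itm)
    {
        Order order = itm.Order;
        _promotions.ApplyPromotions(itm, ref order);  // the single entry point to promotions
        _orderItemRepository.SaveOrUpdate(itm);
    }
}
```

Callers throughout the codebase then depend on IPromotionService instead of the concrete promoOrderSyncher, which makes the promotion rules swappable and testable in isolation.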
The most important thing in any project, especially web projects that often need to communicate with a persistence layer, is to leverage dependency injection.
But before you do that, you need to make sure that the classes that provide services to communicate with the database all have an interface. Typically these classes are called data access objects (DAOs). So you'd have something like:
public class UserDao : IUserDao
{
    public User GetUserById(int id)
    {
        // fetch the user from the database by primary key
        ...
    }
}
As a rule of thumb for these data access objects: if they contain conditional logic, you should probably refactor that out into a more business-oriented service (class). It's best that your interface to the database contains as little logic as possible. It has to be thin, because this layer is hard to unit test due to its dependency on the database.
Once you've done this, use a dependency injection container and register IUserDao and its implementation.
Now, moving forward, you'll be able to create unit tests that completely mock the database by mocking the IUserDao implementation.
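As a sketch of what that buys you (a hand-rolled fake here; FakeItEasy or RhinoMocks would generate the equivalent), assuming a hypothetical IUserDao interface matching the class above:

```csharp
// Illustrative interface and entity matching the UserDao above.
public class User { public int Id { get; set; } public string Name { get; set; } }
public interface IUserDao { User GetUserById(int id); }

// The service under test depends only on the interface.
public sealed class UserService
{
    private readonly IUserDao _userDao;
    public UserService(IUserDao userDao) { _userDao = userDao; }

    public string GetDisplayName(int id) => _userDao.GetUserById(id)?.Name ?? "unknown";
}

// Hand-rolled fake: no database needed in the unit test.
public sealed class FakeUserDao : IUserDao
{
    public User GetUserById(int id) => new User { Id = id, Name = "test" };
}
```

A test can now construct new UserService(new FakeUserDao()) and assert on GetDisplayName without ever touching the database.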
May I suggest:
Microsoft Unity for dependency injection
FakeItEasy for unit testing (mocking framework)
Other fine ones:
Castle Windsor (DI)
Ninject (DI)
RhinoMocks (unit testing - mocking framework)
Good luck!
Hope it helps.
I strongly suggest not to start restructuring your application without a strong knowledge of SOLID principles and dependency injection. I did this mistake and now I have an application full of service locator (anti)pattern implementations that are not making my life simpler than before.
I suggest you read at least the following books before starting:
http://www.amazon.com/Agile-Principles-Patterns-Practices-C/dp/0131857258 (for SOLID principles)
http://www.manning.com/seemann/ (for .NET dependency injection)
http://www.amazon.com/Working-Effectively-Legacy-Michael-Feathers/dp/0131177052
A possible strategy is not to refactor just for the sake of it, but to consider refactoring only the parts that are touched more than others. If something works and nobody is going to change it, there's no need to refactor it; that can be a waste of time.
Good luck!

Structuring of Solution(s) [closed]

Closed 9 years ago.
I am in the very early phases of a WinForms product rewrite, and I am trying to determine the "best" strategy for implementing a new solution structure. The current solution contains 50+ projects, and (for the most part) it contains all of the logic required to run the application. Several of the projects do have dependencies to projects that exist in a separate "Framework" solution, but that is being replaced / modified as well.
As I said, the current solution produces a WinForms product; furthermore everything is tightly coupled together, front to back. Additionally, we want to start offering a Web / Mobile solution in addition to/alongside our WinForms product. Because of the desired changes, I am considering breaking this out into several separate solutions. Details follow.
Product.Framework Solution becomes Product.Core - A shared set of assemblies containing common interfaces, enums, structs, "helpers", etc.
Product.Windows - MVC pattern. Contains all views and business logic necessary to run the WinForms product.
Product.Web - MVC Pattern. Contains all views and business logic necessary to run the Web product.
Product.Services - Hostable WCF services. Contains the public service layer that Web/Win/Mobile call into with the underlying DAL.
This is where I am looking for a sanity check: I am planning on implementing DI/IoC in both the WinForms and Web projects (I am not so worried about injecting into the WCF services). In my mind it makes sense to have interfaces for all the concrete entities (representations of database tables) and services in the Product.Core solution. The only reference I would possibly need to Product.Services in the Web and WinForms solutions would be to register the concrete types with the container.
Does this make sense? Is there something glaring that I have overlooked? Thank you for any and all feedback!
The way I think about solutions is "all of the things necessary to run my program". In your case, the WinForms application is the final step: the goal is to be able to run the output executable from that project. The solution, then, should consist of every project necessary to build that executable from scratch. The last thing you want is a new developer checking out your source code from version control and then having to rely on tribal knowledge to figure out which solutions need to be built in which order and how to tie them all together.
Now you mentioned that you may be adding some more final step applications such as a web application. Assuming that the dependencies for your WinForms application are similar to your web application, I am of the opinion that you should just add the web application to the same solution as your WinForms application. However, sometimes it makes sense to have a different solution for each, and then have each solution reference a similar set of projects.
One of the key things to remember is that when a project dependency is introduced, you will need to update all of your solutions to have that new dependency. This is the primary reason why I tend to have a single solution for most things.
Don't forget, in Visual Studio you can have solution folders to help you visually manage the solution as a whole. Also, you can utilize the build configurations and dependency tree such that building doesn't require compiling everything when you only need one final project built. Finally, you can utilize the Set Startup Project option to switch between which final output you want to work with.
Remember, any given project can very easily be part of multiple solutions. If you have a core set of frameworks that are used across an array of different products you can include the framework projects in each "Product" solution. If the framework is not primarily worked on by the same team that uses it you may want to consider splitting the framework into a separate repository and only distribute the output assemblies (which would be committed into other repositories and referenced in your other solutions).
In general, my opinion is to have a single solution for everything and utilize various features of Visual Studio so managing such a large and complex solution isn't very painful. The one thing I would advise against is having the build of one solution depend on the build output of another solution. If you are doing this, the two projects should reside in separate repositories and the build output should be copied and committed as needed (basically treat the output as a 3rd party library).
There is no "best" answer here. Here is an observation from your question:
Why do you need an interface for all concrete entities?
It appears that these are just data model classes. Unless you are looking to Mock these classes or write generic methods(or classes) that can operate on a category of classes such as all data model classes that implement an IEntity interface for instance so that you can constrain your generic method/class by the data model type.
Example:
public void MyGenericMethod<T>(T t) where T : IEntity { /* do something */ }
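For completeness, a small self-contained version of that idea (the IEntity interface and Invoice class are invented for the example):

```csharp
public interface IEntity { int Id { get; } }

public sealed class Invoice : IEntity   // illustrative entity
{
    public int Id { get; set; }
}

public static class Auditing
{
    // The constraint lets the method use IEntity members on any entity type.
    public static void MyGenericMethod<T>(T entity) where T : IEntity
    {
        System.Console.WriteLine($"Auditing entity {entity.Id}");
    }
}

// Usage: Auditing.MyGenericMethod(new Invoice { Id = 42 });
// prints "Auditing entity 42"
```

Unless you need that kind of constraint or mockability, interfaces on plain data-model classes are mostly ceremony.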
It sounds like the refactoring / restructuring you are doing will have major impact on the business you are working for and the design/ architectural decisions made now will need to sustainable for the business as it evolves. I would highly suggest involving an architect in this process who can understand the needs and the nature of the business and come up with a game plan accordingly.

Separation of business logic [closed]

Closed 4 years ago.
While optimizing the architecture of the applications on our website, I ran into a problem that I don't know the best solution for.
At the moment we have a small dll based on this structure:
Database <-> DAL <-> BLL
The DAL uses business objects that it passes to the BLL, which passes them on to the applications that use this dll.
Only the BLL is public, so any application that includes this dll can see the BLL.
In the beginning, this was a good solution for our company.
But as we add more and more applications on top of that dll, the bigger the BLL gets. Now we don't want some applications to be able to see BLL logic belonging to other applications.
I don't know what the best solution for that is.
The first thing I thought of was to move and split the BLL into separate dlls which I can include per application. But then the DAL must be public so the other dlls can get the data... and I'm not sure that's a good solution.
My other idea is just to separate the BLL into different namespaces, and include only the namespaces you need in each application. But with this solution you can still directly access the other BLLs if you want.
So I'm asking for your opinions.
You should have a distinct BLL and DAL for each "segment" of business... for example:
MyCompany.HumanResources.BLL
MyCompany.Insurance.BLL
MyCompany.Accounting.BLL
I agree with @MikeC: separate the BLL into namespaces for each segment. Also separate the DAL, like this:
MyCompany.HumanResources.DAL
MyCompany.Insurance.DAL
Another thing to do is to separate the dlls. That way, you don't need to make the DAL public. There would be a business layer (like WCF or a web service) responsible for the BLL and DAL of each system, making support and maintenance easier. I don't know if it's the most affordable approach for your company right now (in terms of complexity), but it's a better approach for design purposes.
In the past, the applications developed here in the company used a component architecture - sharing components across applications. We realized it wasn't the best design, yet today many systems (in the production environment) still use that approach.
Furthermore: if you want more complexity, you could also create a generic dbHelper component responsible for data access, including operations that control connections, commands and transactions, thereby preventing rewritten code. That assembly could make use of the Enterprise Library or other components. An example operation could be:
public DbCommand CreateCommand()
{
    // If a transaction is open, enlist the new command in it.
    if (this._baseCommand.Transaction != null)
    {
        DbCommand command = this._baseConnection.CreateCommand();
        command.Transaction = this._baseCommand.Transaction;
        return command;
    }
    return this._baseConnection.CreateCommand();
}
You can make it virtual and implement a SqlCommand-specific CreateCommand, and so on.
Remember: the generic dbHelper idea I described is just an idea!
I suggest you separate your business logic into different dlls according to the domain each part belongs to (in line with the previous post). These classes would implement specific interfaces, while the interfaces are declared in your business logic consumer. Then I suggest you use containers (see Inversion of Control theory) to resolve the dll implementations. This lets you separate the business logic implementation from its consumption, and you will be able to replace one implementation with another without difficulty.
I advocate using a provider with IoC rather than consuming business manager classes directly (think about references, which can turn into a nightmare). This solution solves the problem of dll isolation and their optimized consumption.
It sounds like you have common business logic that applies to your organization in general, and more specific logic per section or department. You could set up your code so that each department only depends on its specific logic, which behind the scenes uses the generic functionality in the "base" logic. To that end, you could set up the following projects:
Business.BLL
Business.Finance.BLL
Business.IT.BLL
(etc, ad infinitum, and so on...)
Note that each of these can be a separate project, which compiles to its own assembly. A department would only need to use their own assembly.
As far as data access goes, you can have generic data access routines in your base BLL. Your specific BLLs can have their own specialized data queries that are funnelled to the base BLL, which in turn uses the generic DAL and returns results back up the chain.
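A sketch of that layering (assembly, class and method names are invented for illustration):

```csharp
using System.Collections.Generic;

// Base BLL: generic functionality shared by every department assembly.
namespace Business.BLL
{
    public class GenericDataAccess
    {
        public IList<T> GetAll<T>() where T : new()
        {
            // ...the generic DAL call is hidden here...
            return new List<T>();
        }
    }
}

// Department-specific BLL: a separate assembly referencing only the base.
namespace Business.Finance.BLL
{
    using Business.BLL;

    public class Invoice { }

    public class InvoiceLogic
    {
        private readonly GenericDataAccess _data = new GenericDataAccess();

        // A specialized query funnelled through the generic base.
        public IList<Invoice> GetOpenInvoices() => _data.GetAll<Invoice>();
    }
}
```

Each department then references only its own assembly (plus the base), so the Finance dll never sees IT's logic, while all data access still flows through one place.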
