Best way to share code between multiple MVC applications and deploy different versions [closed] - c#

We currently have a single database with users, customers, products, and orders logically separated by schemas. We then have several MVC.NET applications accessing the database via their own BLLs. Each of these applications has its own functionality and shares some aspects with some or all of the other applications.
Currently, some code is duplicated in these BLLs and it's a bit of a mess to maintain. It does, however, allow us to develop features quickly and deploy each application independently (assuming no major database work here).
We have started to develop a single access layer, properly separated out, that sits above the database and is used by all of our MVC.NET applications. Logically this makes sense, as we can now share code between our applications. For example, application A can retrieve a customer record in the same way as application B. The issue comes when we want to deploy: we would no longer be able to deploy one application on its own; we'd need to deploy them all.
What other architectural approaches could we consider that would allow us to share code between our applications and deploy those applications independently?

A common solution is to factor out services (based on a communication layer of your choice: REST, WCF, a message bus, with versioning) and deploy these services to your infrastructure as standalone services.
Now you can evolve, scale and deploy your services independently of the consumers. Instead of deploying all applications, you now only have to deploy the changed services (side by side with the old ones) and the new application.
This adds quite a lot of complexity around service versioning, configuration management, integration testing, a little communication overhead, etc., so you have to balance the pros and cons. There are quite a few articles on the web about how to build such an architecture.
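As an illustration only, here is a minimal sketch of such a standalone, versioned service, assuming ASP.NET Web API as the communication layer; the CustomerDto type and routes are made up for the example:

```csharp
// Hypothetical standalone "customer service", versioned in the route so that a
// v2 with a changed contract can later run side by side with v1.
using System.Web.Http;

public class CustomerDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

[RoutePrefix("api/v1/customers")]
public class CustomersV1Controller : ApiController
{
    // Application A and application B both call GET api/v1/customers/{id}
    // instead of duplicating the customer lookup in their own BLLs.
    [Route("{id:int}")]
    public IHttpActionResult Get(int id)
    {
        // A real implementation would load the record from the shared database.
        return Ok(new CustomerDto { Id = id, Name = "Sample customer" });
    }
}
```

Because the version is part of the contract, a changed service can be deployed side by side with the old one while existing consumers keep calling v1.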

Related

Design ASP.NET MVC 5 layered solution for horizontal scaling [closed]

I have been reading about using Azure for ASP.NET solutions and I am sold. I have questions about a ton of stuff, but I would like to know how to scale a layered application. I read in a book that, for performance and scaling's sake, we can have our models, business logic, and DbContext in a separate project, and that this project can be on a separate server. So I guess my question is: can a .NET class library be hosted in IIS? How would doing this scale and give me an advantage? Sorry, I am an advanced beginner, so you will need to bear with me. Thanks.
Technically, you can't host just a class library.
Having the models in a separate project is not done for scaling reasons. It's done so you can reuse them in unit tests, etc.
One thing you can do, of course, is create an API project, which can be hosted in e.g. an Azure App Service. Then you can build an MVC project that uses this API through HttpClient and the like. This separates your front-end and back-end, allowing both apps to scale independently depending on their load. This would of course require them to be in separate App Service Plans in Azure, as otherwise they share the server instances. The plan can be changed later, though, so you can start with a common one for now and move them to separate plans later.
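As a rough sketch of that split (the base address, route, and CustomerDto shape are assumptions for illustration, not an existing API), the MVC front-end could call the API project like this:

```csharp
// MVC front-end calling the separately hosted API project over HTTP.
using System;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Mvc;
using Newtonsoft.Json;

public class CustomersController : Controller
{
    // One shared HttpClient; the base address points at the API's App Service.
    private static readonly HttpClient ApiClient = new HttpClient
    {
        BaseAddress = new Uri("https://my-api.azurewebsites.net/") // hypothetical
    };

    public async Task<ActionResult> Details(int id)
    {
        // The front-end only knows the API contract, not the database,
        // so the two apps can scale and deploy independently.
        var response = await ApiClient.GetAsync($"api/customers/{id}");
        response.EnsureSuccessStatusCode();

        var json = await response.Content.ReadAsStringAsync();
        var customer = JsonConvert.DeserializeObject<CustomerDto>(json);
        return View(customer);
    }

    // Assumed shape of the data returned by the API project.
    public class CustomerDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }
}
```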
If you want to break down your app into even smaller pieces, I would advise looking into microservices architecture.

Is Owin/Katana supposed to replace Web API? [closed]

When ASP.NET MVC came out, Microsoft announced many times in many places that it wasn't supposed to replace ASP.NET Web Forms. In other words, it's just another technology that you might find useful, or you might use Web Forms in other scenarios.
However, as companies enter the market, they can't maintain a jungle of technologies, because that's too expensive. They usually select a mature technology, stick to it, build on it, extend it, and reuse elements of it to reduce costs.
Now we're trying to decide whether to move from Web API to OWIN/Katana. We just wonder whether it's OK to move 100% to OWIN.
The reason I'm asking is that we've created a very rich codebase for Web API, including streaming, compression, authentication, normalization of UGC, support for I18N & L10N, and more.
If we want to move to OWIN, we would need to re-create these facilities/utilities for OWIN, because its architecture is different from Web API's.
We want to move to OWIN because it's faster, lighter, can be self-hosted, and seems to be the future of Microsoft's service technologies.
Is it safe for us to move to OWIN completely, imagining a future in which all of our services are delivered through OWIN and we discontinue using Web API?
OWIN is just a specification, nothing more. It describes a common interface that servers and applications can both use, so that applications don't need to be tightly coupled to servers.
Katana was the first step towards decoupling ASP.NET from IIS. Work on Katana has stopped now, according to the official roadmap. The ideas and technologies developed for Katana have made their way into the next version of ASP.NET (ASP.NET Core).
It rarely makes sense to build applications on top of OWIN itself, because you're operating at the lowest level of abstraction above HTTP (literally dealing with raw requests and responses). That's usually only necessary if you are building middleware components that need low-level access.
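To give a sense of how low-level that is, here is a rough sketch of a raw OWIN application function; the environment keys come from the OWIN specification, while the hosting/startup wiring (e.g. Katana's IAppBuilder) is assumed and not shown:

```csharp
// Raw OWIN: the "application" is just a function from an environment
// dictionary to a Task (Func<IDictionary<string, object>, Task>).
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;

public static class RawOwinApp
{
    public static Task Invoke(IDictionary<string, object> environment)
    {
        // Everything (path, headers, body) is pulled out of the dictionary
        // by well-known string keys - no controllers, no routing, no model binding.
        var path = (string)environment["owin.RequestPath"];
        var responseHeaders = (IDictionary<string, string[]>)environment["owin.ResponseHeaders"];
        var responseBody = (Stream)environment["owin.ResponseBody"];

        var payload = Encoding.UTF8.GetBytes("You requested " + path);
        responseHeaders["Content-Type"] = new[] { "text/plain" };
        return responseBody.WriteAsync(payload, 0, payload.Length);
    }
}
```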
In other words: you shouldn't rebuild your application on OWIN, because you'd be spending a lot of time reinventing all of the stuff already in ASP.NET.
ASP.NET Core is the next evolution of ASP.NET and Web API. It has all the things you mentioned: it's fast, lightweight, and can self-host. If you need to rebuild your architecture, do it on ASP.NET Core.

Is a web service better than the regular query method? [closed]

We have a desktop application that we need to install on client PCs and connect to a database on a remote server. Which method is better for connecting to the database (in terms of speed and performance)?
1. Normal query method (specify the server name in the connection string).
2. Create a web service and get the data in XML or JSON format.
Both solutions have positive and negative points.
Direct query to the server -> implies that your client software knows the database schema. If you change the database schema, you need to test its integration in the client app.
Web service -> a limited API means your database is known only to its data web service. The client app only knows about the small web service API. When the database evolves, there is very little chance of negatively impacting the client code.
From an architectural point of view, it is encouraged to limit the size of the contract between two pieces of technology.
From a development cost point of view, creating and maintaining such a service has a cost and may require a new set of technical skills on your team.
It depends on your requirements, budget, and time constraints.
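As a rough illustration of option 2, here is a minimal sketch of such a limited contract, assuming ASP.NET Web API as the host; the OrderDto shape and the route are made up for the example:

```csharp
using System.Collections.Generic;
using System.Web.Http;

// The only thing the desktop client ever sees: a small, stable contract.
public class OrderDto
{
    public int Id { get; set; }
    public string CustomerName { get; set; }
    public decimal Total { get; set; }
}

public class OrdersController : ApiController
{
    // GET api/orders - serialized to JSON (or XML) by content negotiation,
    // so the database schema can change without touching the client.
    public IEnumerable<OrderDto> Get()
    {
        // A real service would query the database and map rows to DTOs here.
        return new List<OrderDto>
        {
            new OrderDto { Id = 1, CustomerName = "Sample", Total = 10.0m }
        };
    }
}
```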
If there is any possibility that this desktop software will later be extended to a mobile app or other platforms, then go for creating web services, preferably with JSON.
Keeping the data access layer in the client desktop application saves a little development time, but makes testing, reusability, and maintenance harder.
Also, the trend is to use SOA, so I'd always prefer creating web services. It's secure, reusable, and very friendly to future modifications of the project.

Why do we need DNX or cross-platform for the web? [closed]

As per my understanding, DNX (.NET Execution Environment) is provided to support cross-platform web applications, which sounds good, but it would be more useful if it supported desktop applications.
Why would you need a cross-platform web-based application? Usually a web application/website is hosted once, and it shouldn't be an issue to host it on IIS on a Windows machine. Is there something about DNX that I am completely missing, or is it somewhat useful for desktop/console-based applications as well?
What if you had a web-based application that you intended to run on both embedded devices like a Raspberry Pi and more conventional servers? The Pi may not be able to run a full Windows installation and thus may need to run Mono or some alternative solution.
The idea at a previous workplace was to have a self-configured, low-power solution for doing some tracking through RFID. The embedded devices would have a scaled-down version of the system but be able to synchronize with the bigger systems, as various reports and other data would be generated on the big servers in the overall system. Imagine tracking wildlife, or a big farmer's field with various sensors, reporting data that then has to be sent up to a big central DB so it can be compared over time with bigger resources than the embedded device would have. Thus, you could have a dozen or so of the small embedded devices in the field and a beefy server back at a home base, with traditional infrastructure in terms of electricity, connectivity, etc., that could generate reports, maintain dashboards, and so on.
There was also the potential for this to lead to something like Skynet if the embedded devices could form a collective consciousness, but the project never got to that stage.

Best practice to share data and notifications between applications [closed]

What is the best practice to share data between different applications on the same machine and notify them if the data has changed?
I have four applications which use the same settings project to change their settings. When I change a setting via the settings project, the other applications have to know that the setting was changed and act on this change.
I thought about using IPC to make setting changes and then broadcast the change information to all users, but it would be great if such a library already existed.
EDIT:
I found a solution that works for me. We decided not to spend a lot of time on this functionality because it's not extremely critical that the other applications are updated immediately.
We save our settings, as we did before, in an XML file, and I registered a FileSystemWatcher on that file to pick up all changes. So if I change the settings, all four applications read the settings file and determine whether or not they have to take action.
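For reference, a minimal sketch of that FileSystemWatcher approach might look like this (the full settings path and the reload callback are assumptions for illustration):

```csharp
using System;
using System.IO;

public class SettingsChangeListener : IDisposable
{
    private readonly FileSystemWatcher _watcher;

    // settingsPath is assumed to be a full path to the shared XML settings file.
    public SettingsChangeListener(string settingsPath, Action reloadSettings)
    {
        _watcher = new FileSystemWatcher(
            Path.GetDirectoryName(settingsPath),
            Path.GetFileName(settingsPath));

        // Raised when the settings file is written; each application re-reads
        // the file and decides whether it needs to act on the change.
        _watcher.NotifyFilter = NotifyFilters.LastWrite;
        _watcher.Changed += (sender, e) => reloadSettings();
        _watcher.EnableRaisingEvents = true;
    }

    public void Dispose()
    {
        _watcher.Dispose();
    }
}
```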
The solution to choose depends on different parameters:
How much effort you can invest in the implementation.
How critical it is that the applications are updated quickly.
Which environment is available to you / your customers.
...
For example:
Save changes to a database/config file and let each application run a separate thread dedicated to checking for setting changes every n seconds (a sketch of this follows below). This solution is cheap and easy to implement, yet not "nice", and many developers will reject it.
Create a WCF service which "publishes" changes to the applications. In that case, using duplex (dual) bindings, applications will be updated instantly. Of course, this solution is more costly.
Those are only two examples out of many available solutions (shared memory, shared application domain, etc.).
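A rough sketch of the first (polling) option mentioned above; the settings path and interval are placeholders, not part of the original answer:

```csharp
using System;
using System.IO;
using System.Threading;

public class SettingsPoller : IDisposable
{
    private readonly string _settingsPath;
    private readonly Action _onChanged;
    private readonly Timer _timer;
    private DateTime _lastWriteUtc;

    public SettingsPoller(string settingsPath, Action onChanged, TimeSpan interval)
    {
        _settingsPath = settingsPath;
        _onChanged = onChanged;
        _lastWriteUtc = File.GetLastWriteTimeUtc(settingsPath);

        // Dedicated timer checks every n seconds whether the file has changed.
        _timer = new Timer(_ => CheckForChanges(), null, interval, interval);
    }

    private void CheckForChanges()
    {
        var writeTime = File.GetLastWriteTimeUtc(_settingsPath);
        if (writeTime > _lastWriteUtc)
        {
            _lastWriteUtc = writeTime;
            _onChanged();
        }
    }

    public void Dispose()
    {
        _timer.Dispose();
    }
}
```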
What you have done seems wise. I have also had experience using MSMQ.
You can create private or public queues. Since you have all of your apps on the same machine, a private queue is fine; otherwise, you should use public queues.
At that time I had chosen Spring.NET as my framework (object builder & dependency injector). Spring.NET has brilliant QuickStarts, and one of them uses MSMQ as a communication bridge between applications.
If I were you, I would use a queuing approach, because you can also notify apps running on different machines.
Also, WCF provides a convenient means of developing a distributed service over the underlying MSMQ component.
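For illustration, a minimal sketch of notifying other local applications through a private MSMQ queue might look like this (the queue path and message format are assumptions):

```csharp
using System.Messaging;

public static class SettingsNotifier
{
    // Private queue on the local machine; all four apps run on the same box.
    private const string QueuePath = @".\Private$\SettingsChanged";

    public static void Publish(string settingKey)
    {
        if (!MessageQueue.Exists(QueuePath))
            MessageQueue.Create(QueuePath);

        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Send(settingKey, "Setting changed");
        }
    }

    public static string WaitForNotification()
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
            var message = queue.Receive(); // blocks until a message arrives
            return (string)message.Body;
        }
    }
}
```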
Furthermore, Publish-Subscribe is a common design pattern that is widely used in client/server communication applications. In WCF service development, the Publish-Subscribe pattern also helps in scenarios where the service application exposes data to certain groups of interested clients and the data is actively pushed to the clients (instead of being polled by the client).
How about using the built-in dependency objects?
like:
CacheDependency - http://msdn.microsoft.com/en-us/library/system.web.caching.cachedependency.aspx
SqlDependency - http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqldependency.aspx
These are pretty simple to implement and they work quite well.
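A hedged sketch of the SqlDependency approach from the second link; the Settings table, its columns, and the connection string are assumptions, and Service Broker must be enabled on the database:

```csharp
using System;
using System.Data.SqlClient;

public class SettingsSqlWatcher
{
    private readonly string _connectionString;

    public SettingsSqlWatcher(string connectionString)
    {
        _connectionString = connectionString;
        // Must be called once per application before subscribing.
        SqlDependency.Start(_connectionString);
    }

    public void Subscribe(Action onSettingsChanged)
    {
        using (var connection = new SqlConnection(_connectionString))
        // The query must follow notification rules: explicit columns, two-part table name.
        using (var command = new SqlCommand("SELECT [Key], [Value] FROM dbo.Settings", connection))
        {
            var dependency = new SqlDependency(command);
            // Fires once per subscription; resubscribe in the handler if needed.
            dependency.OnChange += (sender, e) => onSettingsChanged();

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read()) { /* executing the command registers the notification */ }
            }
        }
    }
}
```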
How about network sockets? You can create listeners and senders on different ports.
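A very small sketch of that socket idea, assuming a fixed local port and a plain-text message:

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;

public static class SocketNotifier
{
    private const int Port = 56789; // hypothetical local port

    // Run in a receiving application (e.g. on a background thread).
    public static string WaitForNotification()
    {
        var listener = new TcpListener(IPAddress.Loopback, Port);
        listener.Start();
        using (var client = listener.AcceptTcpClient())
        using (var stream = client.GetStream())
        {
            var buffer = new byte[1024];
            int read = stream.Read(buffer, 0, buffer.Length);
            listener.Stop();
            return Encoding.UTF8.GetString(buffer, 0, read);
        }
    }

    // Run in the application that changed the setting.
    public static void Notify(string settingKey)
    {
        using (var client = new TcpClient())
        {
            client.Connect(IPAddress.Loopback, Port);
            var payload = Encoding.UTF8.GetBytes(settingKey);
            client.GetStream().Write(payload, 0, payload.Length);
        }
    }
}
```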
