We're a development department that has 'inherited' a large BizTalk-based middleware system (100+ orchestrations that run a large number of financial transactions).
The system is based on BizTalk 2003, which is now reaching end of life, and our systems team wants to retire it.
Is it possible to migrate from BizTalk 2003/2004 to AppFabric?
I presume that if we took this approach we'd have to re-develop the orchestrations in Windows Workflow (which we already use).
Or is a migration to BizTalk 2013 a better (easier) option?
Has anyone come across this situation before?
Thanks
Steve.
The two pieces of software have nothing in common, except that they both use the WF libraries.
AppFabric is a set of services, one of which is to host WF workflows. The workflows themselves are plain low-level WF4 workflows. There are no special activities or adapters to other systems like BizTalk has.
BizTalk is a server product with a rules engine, adapters to various systems, the ability to read various business process languages, define orchestrations at a high level, etc., etc.
It's almost like comparing IL to LINQ. Yes, there is common ground but not much.
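To make the gap concrete, here is a minimal sketch of my own (not from the original answer) of what a plain WF4 workflow looks like when run from code. Under AppFabric you would typically host such a workflow as a WF service (a .xamlx endpoint) rather than invoking it directly, and every BizTalk adapter, pipeline and map would have to be rebuilt by hand around it.

    using System.Activities;
    using System.Activities.Statements;

    class Program
    {
        static void Main()
        {
            // A plain WF4 workflow: a Sequence of standard activities,
            // with none of BizTalk's adapters or orchestration features.
            Activity workflow = new Sequence
            {
                Activities =
                {
                    new WriteLine { Text = "Re-implemented orchestration step 1" },
                    new WriteLine { Text = "Re-implemented orchestration step 2" }
                }
            };

            // WorkflowInvoker runs the workflow synchronously in-process;
            // AppFabric would instead host it in IIS/WAS and add monitoring
            // and persistence around it.
            WorkflowInvoker.Invoke(workflow);
        }
    }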
As per my understanding, DNX (the .NET Execution Environment) is provided to support cross-platform web applications, which sounds good, but it would be more useful if it supported desktop applications.
Why would you need a cross-platform web-based application? Usually a web application/website is hosted once, and it shouldn't be an issue to host it on IIS on a Windows machine. Is there something about DNX that I am completely missing, or is it somewhat useful for desktop/console-based applications as well?
What if you had a web-based application that you intended to run both on embedded devices like a Raspberry Pi and on more conventional servers? The Pi may not be able to run a full Windows installation and thus may need to run Mono or some alternative solution.
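As a rough illustration (my own sketch, not part of the original answer), the appeal is that one small self-hosted web endpoint can run unchanged on the Pi and on a normal server. The namespaces below are the ones that shipped after the DNX betas were renamed to .NET Core, so treat the exact API names as an assumption:

    using Microsoft.AspNetCore.Builder;
    using Microsoft.AspNetCore.Hosting;
    using Microsoft.AspNetCore.Http;

    class Program
    {
        static void Main()
        {
            // Hypothetical sensor endpoint: the same code runs on a
            // Raspberry Pi (Linux) or on a conventional Windows server.
            var host = new WebHostBuilder()
                .UseKestrel()
                .UseUrls("http://0.0.0.0:5000")
                .Configure(app =>
                    app.Run(ctx => ctx.Response.WriteAsync("sensor reading: 42")))
                .Build();

            host.Run();
        }
    }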
The idea at a previous workplace was to have a self-configured, low-power solution for doing some tracking through RFID. The embedded devices would have to run a scaled-down version of the system but be able to synchronize with the bigger systems, as there could be various reports and other data to be generated on the big servers in the overall system. Imagine tracking wildlife, or a big farmer's field with various sensors, where the data then has to be sent up to a big central DB so it can be compared over time with more resources than the embedded device would have. Thus, you could have a dozen or so of the small embedded devices in the field and a beefy server back at a home base, on traditional infrastructure in terms of electricity, connectivity, etc., that could generate reports, maintain dashboards, and so on.
There was also the potential for this to lead to something like Skynet if the embedded devices could form a collective consciousness but the project never got to that stage of things.
We currently have a single database with users, customers, products and orders logically separated by schemas. We then have several ASP.NET MVC applications accessing the database via their own BLLs. Each of these applications has its own functionality and shares some aspects with some or all of the other applications.
Currently, some code is duplicated in these BLLs and it's a bit of a mess to maintain. It does, however, allow us to develop features quickly and deploy each application independently (assuming no major database work here).
We have started to develop a single access layer, properly separated out, that sits above the database and is used by all of our ASP.NET MVC applications. Logically this makes sense, as we can now share code between our applications. For example, application A can retrieve a customer record in the same way as application B. The issue comes when we want to deploy: we wouldn't be able to deploy one application on its own; we'd need to deploy them all.
What other architectural approaches could we consider that would allow us to share code between our applications and deploy those applications independently?
A common solution is to factor out services (based on a communication layer of your choice: REST, WCF, a message bus, etc., with versioning) and deploy them to your infrastructure as standalone services.
Now you can evolve, scale and deploy your services independently of the consumers. Instead of deploying all applications, you now only have to deploy the changed services (side by side with the old ones) and the updated application.
This adds quite a lot of complexity around service versioning, configuration management, integration testing, plus a little communication overhead, so you have to balance the pros and cons. There are quite a few articles on the net about how to build such an architecture.
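As a rough sketch of my own (the controller and DTO names are made up), a versioned Web API 2 endpoint is one way to expose such a shared service so that a v2 can later be deployed side by side with v1:

    using System.Web.Http;

    // Hypothetical shared customer service, exposed over REST with an
    // explicit version in the route so old and new versions can run side
    // by side. Requires attribute routing to be enabled at startup via
    // config.MapHttpAttributeRoutes().
    public class CustomerDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    [RoutePrefix("api/v1/customers")]
    public class CustomersV1Controller : ApiController
    {
        [HttpGet, Route("{id:int}")]
        public IHttpActionResult Get(int id)
        {
            // In a real service this would call the shared data-access layer.
            return Ok(new CustomerDto { Id = id, Name = "Example customer" });
        }
    }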
I was looking on the Internet to see whether there are any articles about trends in ASP.NET. There are many such articles, and each one gives its own suggestions.
Could you point out some technologies that are commonly used to build an ASP.NET website?
I'm interested in trends in:
standard websites
client-server apps with the client based on ASP.NET (maybe a bad idea?)
I want to find out about the technologies, useful libraries, etc.
Please don't hate this thread; as far as I've noticed (in posts/comments across the Web), many people who want to start learning, e.g., ASP.NET with the best-known and best technologies of the moment have trouble finding a clear answer on their own.
Specialists could share their experience and say something about the technologies used in their companies/projects. Please describe the kind of app you build and the technologies you used. Thanks.
If you read the .NET Technology Guide for Business Applications from Microsoft Press, you will have a better idea of what options there are (mostly the Microsoft stack). It all depends on your requirements, so you might want to think about them first.
For me personally, in a lot of cases I end up with ASP.NET MVC 5 (with Bootstrap 3 for responsive design) and Web API 2 for the backend services, and deploy them to Azure.
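For what it's worth, here is a minimal sketch of that kind of setup (the names are illustrative only): an MVC 5 controller renders a Razor view whose layout pulls in Bootstrap 3, while data calls go to a separate Web API 2 backend.

    using System.Web.Mvc;

    // Hypothetical MVC 5 controller. The Razor view it returns would live at
    // Views/Home/Index.cshtml and use a Bootstrap 3 based layout; JavaScript
    // in that view would call Web API 2 endpoints for data.
    public class HomeController : Controller
    {
        public ActionResult Index()
        {
            ViewBag.Title = "Trends demo";
            return View();
        }
    }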
I need to do some performance tests on a solution that uses WCF services, serialization and a lot more (.NET Framework 4.0), deployed on several machines.
Using a Visual Studio 2010 test project, would it be possible to configure an option to point at the different machines where the solution is installed?
Would WCF cause any problems if I try to test against it using a particular binding?
I would create a test project, add references to the WCF services you want to test, and simply start testing. If you want something for complete end-to-end performance testing, you can look at Microsoft's performance testing suite (you can use this for load/stress testing, etc.).
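A minimal sketch of that approach (OrderServiceClient is a hypothetical generated proxy, not something from the question): because the proxy's endpoint address and binding live in the test project's app.config, pointing the test at a different machine or binding is only a configuration change.

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class OrderServiceSmokeTests
    {
        [TestMethod]
        public void GetOrder_CompletesWithinBudget()
        {
            var stopwatch = System.Diagnostics.Stopwatch.StartNew();

            // Hypothetical proxy generated via "Add Service Reference";
            // its binding and address come from app.config.
            using (var client = new OrderServiceClient())
            {
                var order = client.GetOrder(42);
                Assert.IsNotNull(order);
            }

            Assert.IsTrue(stopwatch.ElapsedMilliseconds < 500,
                "Call took longer than the 500 ms budget.");
        }
    }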
What you are looking for is called "Lab Management". Those features are built into Team Foundation Server for Visual Studio 2010 and newer.
Depending on how complicated a setup you want, you may need a server running Hyper-V and a license for System Center Virtual Machine Manager (SCVMM), which is not cheap.
I'm coding a program for a small business. The application is going to be used for store-keeping and the ordering of parts for a mechanical shop. I need some type of backend to store orders, article numbers and prices, customers and so on (obviously).

Right now I'm using a local MySQL server and running queries directly from the code. This is not ideal because of the risk of a system meltdown or similar. I've thought about running a local MySQL server with a scheduled backup to a remote host, but I'm hoping there's a better solution. For previous applications I've written a PHP wrapper and used a web hotel to host the MySQL server, which isn't ideal either, for security reasons. I suppose I should mention the application is written with Windows Forms in the Visual Studio .NET environment (in C#).

My question is this: how do I set up a MySQL server (or another type of database system) on a remote host that I can run queries against and return the results to the application? Preferably I wouldn't want to manage the MySQL server myself but outsource it. I don't mind renting a server from some host if it will spare me the hassle of setting up a local server machine to run separately. Are there any solutions that you can rent for this purpose? I'm sure there must be tons of information about this on the interwebs, but I can't find anything. I would be very thankful if anyone could give me some pointers!
One way you could do this is to rent a cheap VPS and host MySQL on it. I have been using DigitalOcean, and it is pretty good. For your needs, a $5-per-month VPS would be enough.
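Once the server is reachable (whether on a VPS or a managed host), the Windows Forms app can talk to it directly with Connector/NET. A minimal sketch, with a made-up host name, database and credentials:

    using System;
    using MySql.Data.MySqlClient; // MySQL Connector/NET

    class RemoteDbExample
    {
        static void Main()
        {
            // Hypothetical connection string: "db.example.com" stands in for
            // the VPS or managed MySQL host. Keep real credentials out of
            // source code and require SSL for traffic over the Internet.
            const string connectionString =
                "Server=db.example.com;Port=3306;Database=shop;" +
                "Uid=appuser;Pwd=secret;SslMode=Required;";

            using (var conn = new MySqlConnection(connectionString))
            {
                conn.Open();
                using (var cmd = new MySqlCommand("SELECT COUNT(*) FROM orders", conn))
                {
                    long orderCount = Convert.ToInt64(cmd.ExecuteScalar());
                    Console.WriteLine("Orders: " + orderCount);
                }
            }
        }
    }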
Or you could use Azure. http://www.windowsazure.com/en-us/services/data-management/