Run web service API on same or separate servers? [closed] - c#

I have a web portal, and the web portal has a web services API.
Which solution would be best and why?
Should I....
1) Run the web portal and the web portal API on the same server, or
2) Run the web portal and the web portal API on separate servers?

It's all a matter of trading off different forces; there just can't be one answer that fits everybody.
Here are a few things to consider:
Having the UI (portal) and its dependent services on the same box makes for a very clear set of dependencies; when diagnosing problems, you've got just one place to look. You can scale by adding more such boxes, each being self-contained. Clarity has a lot of operational value.
But it's likely that the portal and the services will have different resource requirements, so you end up scaling (say) the portal at times when the services are not using much resource. Hence you have more copies of something (portal or service) than you strictly need, which can carry considerable costs. Examples:
Licence costs. Suppose you have 10 copies of the portal but really only need 5; that's 5 licences wasted.
Memory consumption. Suppose there's a fixed overhead in getting the services (or portal) up irrespective of load (think caching or database connections); you pay that cost for every unneeded instance.
Back-end costs. Your services may connect to enterprise systems, e.g. a database. Each connection costs resources on the back-end, so unneeded instances incur needless costs.
Platform tuning. You may need to tune the platform differently for the portal and the services. This issue is more noticeable when considering whether to co-locate the database too.
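
For example, if you go the separate-server route with WCF, a minimal sketch of self-hosting the API in its own process might look like the following; the IPortalApi contract, PortalApi class, and port are made up for illustration:

    // Self-hosting the API in its own process so it can be deployed, scaled,
    // and tuned independently of the portal. IPortalApi/PortalApi are
    // hypothetical stand-ins for your real API contract and implementation.
    using System;
    using System.ServiceModel;

    [ServiceContract]
    public interface IPortalApi
    {
        [OperationContract]
        string GetStatus();
    }

    public class PortalApi : IPortalApi
    {
        public string GetStatus() { return "OK"; }
    }

    class Program
    {
        static void Main()
        {
            // The API gets its own base address; the portal (on the same box
            // or another one) calls it over HTTP.
            using (var host = new ServiceHost(typeof(PortalApi),
                                              new Uri("http://localhost:8080/api")))
            {
                host.AddServiceEndpoint(typeof(IPortalApi), new BasicHttpBinding(), "");
                host.Open();
                Console.WriteLine("API listening on http://localhost:8080/api");
                Console.ReadLine();
            }
        }
    }

Hosting the API this way (or as a separate IIS application) is what lets you scale and tune the API tier without touching the portal tier.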

Related

When to use WCF and ASP.NET? [closed]

I want to build an application that uses a database and can show the information in data grids, execute commands with buttons, and so on. I want to use it both inside the LAN and outside it.
I think I have two options. The first is to create a desktop application that uses WCF to connect to a service that accesses the database. The second is to create an ASP.NET application, so I can access the database with any browser over the internet.
Are both options a good solution? What are the pros and cons of WCF, and the pros and cons of ASP.NET?
When should I use ASP.NET, and when WCF?
When you only have one client and your only need is to access it via the LAN and the internet, developing an ASP.NET application is less overhead, because you don't need to set up an extra service that you have to configure and secure. On the other hand, creating a good UI for an ASP.NET application can be much harder than for a WinForms or WPF application (depending on your UI needs).
But what if you're planning new clients in the future? Maybe a (native) mobile app, or another (Windows/web) client for a different group of users with different needs? Then a web service gives you some advantages.
For example, say you want to build a new Windows Phone application (in addition to your web application) for some CRUD operations.
When all the database logic and business rules live in your web application, you can't use them directly in your Windows Phone app. Okay, maybe you can reuse the assembly if it is compatible with that .NET Framework profile. But what if you want to create an Android application without using Xamarin or something similar? Then you can't reuse the assemblies from your web application and you have to rewrite your logic again. If you had a web service (for example a REST web service), every client could call the service for all the database and (shared) business logic, and none of them would have to reimplement it correctly on its own. As you can see, maintainability can also be an advantage of a web service, because all the logic is centralized in the service.
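
To make that concrete, here is a minimal sketch of a WCF REST-style contract that the web application, a Windows Phone app, and an Android app could all call; ICustomerService and Customer are made-up names used only for illustration:

    // Shared REST-style contract: every client calls this instead of
    // reimplementing the database/business logic itself.
    using System.Runtime.Serialization;
    using System.ServiceModel;
    using System.ServiceModel.Web;

    [ServiceContract]
    public interface ICustomerService
    {
        [OperationContract]
        [WebGet(UriTemplate = "customers/{id}", ResponseFormat = WebMessageFormat.Json)]
        Customer GetCustomer(string id);

        [OperationContract]
        [WebInvoke(Method = "POST", UriTemplate = "customers",
                   RequestFormat = WebMessageFormat.Json,
                   ResponseFormat = WebMessageFormat.Json)]
        Customer CreateCustomer(Customer customer);
    }

    // Plain data contract shared by all clients.
    [DataContract]
    public class Customer
    {
        [DataMember] public string Id { get; set; }
        [DataMember] public string Name { get; set; }
    }

Because the Android client only sees JSON over HTTP, it doesn't matter that it can't load your .NET assemblies.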

Very simple web service: take inputs, email results [closed]

I work at a small startup as a Data Scientist, and I'm looking for ways to make my analysis a bit more visible/useful to the organization. I'd like to be able to put up a simple web service which allows internal users to run my scripts remotely. They should be able to input a few parameters via a very simple UI, and they should have the option to have the results appear in the browser window (after a possibly long wait), or have them emailed. Results may be a few pdf figures, and they may be Excel spreadsheets (maybe more exotic in the future, but this is it for now).
The scripts are going to be all in Python, which will handle the analysis.
So, I'd like to know what the pros and cons are of using C#/WCF vs. something like Django or Python. I have significant experience in C# from working in the client-side code base here, but much less experience with WCF. All of my analysis work is done in Python (and R, to a lesser extent). The main goal is to not spend all of my time building a fancy web service/UI; the front end just has to be friendly enough not to intimidate the marketing people. I don't have to worry about encryption, since the server will be behind our firewall. I'm pretty platform agnostic, but I think the servers are all Windows based, if that helps.
Thanks in advance.
For extra credit, how does your answer change if some of my scripts are in F#?
You might consider using the Django web framework. You could set up a small app with your Python scripts as different views. https://www.djangoproject.com/
And if you don't want to put much effort into creating a friendly UI, you could use Twitter Bootstrap. http://twitter.github.com/bootstrap/
Then just run the app internally to gather and display data, either via HTTP GETs or via e-mail.
Edit: I'm sorry, I did not read carefully: "pros and cons are of using C#/WCF vs. something like Django". I recently made a Django app and it was fairly straightforward.

Consuming a SOAP Web Service with the lowest overhead [closed]

I'm implementing a SOAP web service consumer for sending thousands of emails and storing thousands of XML response records in a local database (C#.NET, Visual Studio 2012).
I would like to make my service consumer as fast and lightweight as possible.
I need to know some of the considerations. I always have a feeling that my code should run faster than it does.
E.g.
I've read that using DataSets increases overhead. So should I use lists of objects instead?
Does using an ORM introduce slowness into my code?
Is a console application faster than a WinForms one? The user needs no GUI to deal with; there are simply some parameters sent to the app that invoke some methods.
What are the most efficient ways to deal with a SOAP Web Service?
Make it work, then worry about making it fast. If you try to guess where the bottlenecks will be, you will probably guess wrong. The best way to optimize something is to measure real code before and after.
DataSets, ORMs, WinForms apps, and console apps can all run plenty fast. Use the technologies that suit you, then tune for speed if you actually need to.
Finally, if you do have a performance problem, changing your choice of algorithms to better suit your problem will likely yield a much greater performance impact than changing any of the technologies you mentioned.
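
As a minimal sketch of "measure real code before and after" (SendBatch here is a hypothetical stand-in for your actual SOAP call plus database write), wrapping the real work in a Stopwatch tells you where the time actually goes:

    using System;
    using System.Diagnostics;

    class Benchmark
    {
        static void Main()
        {
            var sw = Stopwatch.StartNew();

            SendBatch();   // e.g. call the SOAP service and store the responses

            sw.Stop();
            Console.WriteLine("Batch took {0} ms", sw.ElapsedMilliseconds);
        }

        static void SendBatch()
        {
            // placeholder for the real service call + database insert
        }
    }

Compare the numbers before and after each change (DataSet vs. list, ORM vs. hand-written SQL) instead of assuming.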
Considering my personal experience with SOAP, in this scenario I would say your main concern should be how you retrieve this information from your database (procedures, views, triggers, indexes, etc.).
The difference between a console app, a WinForms app, and a web app isn't that relevant.
After the app is done, you should run a thorough stress test to see where your performance problem lies, if it exists.

Best way to organize/architect a web site [closed]

I have to make a key decision about our web site's organization/architecture.
Here is my context.
Our main web site will be available in different countries. Even though the business is nearly the same, there are some region-specific features. Of course this concerns translations, but also masters/layouts and business processes. These differences exist because of different legislation. At the beginning we will have 4 or 5 derivations, but the target could be 20.
A simple comparison would be Stack Overflow and the Stack Exchange network: the main features are largely the same between sites, but there are site-specific business rules.
To my mind, there are basically two possible approaches:
Having a single web site that manages region/country-specific features.
This keeps core features on the same site, but involves coupling between all regions. There is also a risk of "IF"s spreading through the code. Dev & maintainability is optimal (one fix for all) but risky (a change could break the other regions). A way to do this is a combination of portable areas and a custom view engine (a generic view template in a parent folder, with derivations in a sub-folder); see the sketch below this list.
Having one web site per region/country.
A common base web site would be implemented; there would be some shared components, but each web site would have its own lifecycle. Dev & maintainability is easier but more costly (if there are many derivations).
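
As an illustration of the first approach, here is a minimal sketch of a custom Razor view engine that prefers a region-specific view and falls back to the shared one. It assumes the region is fixed per deployment, and RegionConfig.Current is a hypothetical accessor for it:

    using System.Web.Mvc;

    // Looks for a region-specific view first, then falls back to the shared view.
    public class RegionViewEngine : RazorViewEngine
    {
        public RegionViewEngine(string region)
        {
            ViewLocationFormats = new[]
            {
                "~/Views/" + region + "/{1}/{0}.cshtml",  // e.g. ~/Views/fr-FR/Home/Index.cshtml
                "~/Views/{1}/{0}.cshtml"                  // shared fallback
            };
            PartialViewLocationFormats = ViewLocationFormats;
        }
    }

    // In Application_Start:
    // ViewEngines.Engines.Clear();
    // ViewEngines.Engines.Add(new RegionViewEngine(RegionConfig.Current));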
Please note: another impact of this organization is deployment and availability.
What is the best way to organize this ?
Edit:
We already have some experience with MVC, and as a general guideline we are aware of MVC best practices: thin controllers, DI, ViewModels, action filters, ...

What are some best practices for making sure your .NET code will scale well? [closed]

Last week I interviewed for a position at a triple-A MMORPG game company here in NE. I didn't get the job, but one of the areas that came up during the interview was the scalability of the code you write and how it should be considered early on in the design of your architecture and classes.
Sadly, I've never thought very much about the scalability of the .NET code I've written (I work on single-user desktop and mobile applications, and our major concerns are usually device memory and data transmission rates). I'm interested in learning more about writing code that scales well, so it can handle a wide range of remote users in a client-server environment, specifically MMORPGs.
Are there any books, web sites, best practices, etc. that could get me started researching this topic?
Here are some places to start:
http://highscalability.com/blog/2010/2/8/how-farmville-scales-to-harvest-75-million-players-a-month.html
http://www.cs.cornell.edu/people/~wmwhite/papers/2009-ICDE-Virtual-Worlds.pdf
In particular, http://highscalability.com is full of articles about huge websites that scale and how they do it (Digg, Flickr, Facebook, YouTube, ...)
Just one point I'd like to highlight here: cache your reads. Work out a proper caching policy where you determine which objects can be cached and for what periods. Having a distributed caching farm will take load off your DB servers, which will greatly benefit performance.
Even just caching some pieces of data for a few seconds - in a very high load multi-user scenario - will provide you with substantial benefit.
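
As a minimal sketch of caching reads, here is the in-process version using System.Runtime.Caching; a distributed cache (e.g. Redis) has the same get/set shape, and LoadPlayerProfileFromDb is a hypothetical data-access call:

    using System;
    using System.Runtime.Caching;

    public class PlayerProfile
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public static class ProfileCache
    {
        private static readonly MemoryCache Cache = MemoryCache.Default;

        public static PlayerProfile GetProfile(int playerId)
        {
            string key = "profile:" + playerId;
            var cached = Cache.Get(key) as PlayerProfile;
            if (cached != null)
                return cached;                                // served from cache

            var profile = LoadPlayerProfileFromDb(playerId);  // the expensive read
            Cache.Set(key, profile, DateTimeOffset.UtcNow.AddSeconds(10)); // short TTL
            return profile;
        }

        private static PlayerProfile LoadPlayerProfileFromDb(int playerId)
        {
            // placeholder for the real database query
            return new PlayerProfile { Id = playerId };
        }
    }

Even a 10-second expiry like this turns thousands of identical reads per second into a single database hit.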
If you are looking for practical validation, what I usually find helps is doing some prototyping. This usually gives you a good idea of any unforeseen problems in your design and of how easy it is to build onto it. I would try to apply design patterns where possible to allow for future scalability. Design Patterns: Elements of Reusable Object-Oriented Software is a great reference for that. Here are some good examples that show before-and-after code using design patterns, which can help you visualize how design patterns could make your code more scalable as well. Here is an SO post about specific design patterns for software scalability.
