Hosting a RESTful API - c#

I'm a junior C# ASP.NET dev and have worked with Code First databases, RESTful APIs, MVC, and Vue (a frontend framework somewhat like React) to create websites.
However, neither at work nor during my education have I ever handled deployment.
At the moment I have a personal project. I have successfully hosted my relational MySQL database (which I manage through phpMyAdmin) and can update it from my local desktop.
My hosting site let me know they do not host C# or anything of the sort.
I found some posts suggesting Azure, AWS, and others, but for every post recommending one, I find just as many people arguing against it.
What is a good site to host my first REST API? I'm looking for something that can go beyond a Minimum Viable Product, and I'd like to keep my website with the hosting service I'm currently using (so the API would not share hosting with the website).
What would the cost look like for an API that's deployed and being used by clients?
I realize this cost depends on the amount of traffic, but assume a basic API used for, let's say, posting orders in an online shop (whether from a website, app, or whatever else, it would all communicate through the API).
Any tips are welcome as I feel I'm swimming in the dark researching this.
Thank you

Any hosting service that grants you real access to a machine will be able to run your API, as will hosts specialized in the .NET/Core ecosystem.
I assume you know about PHP, given the phpMyAdmin service and the ecosystem of hosts that support PHP; although cheaper, those hosts don't exactly give you access to the machine and probably won't support .NET/Core, as with countless other tech stacks.
As a junior developer I believe you should get a little practice in more than one deployment ecosystem, so I encourage you to try most of the big clouds (Azure, GCP, AWS), but also some smaller hosts, to gain experience and understand a bit more about the differences in deployment and ecosystem.
Azure will be really easy: you can create an account and publish your API at no cost using a free Web App, and Visual Studio even has publishing tools that handle 90% of the job. GCP will be a little trickier and will require you to know a bit about containers and clusters. If you go for a non-specialized host like DigitalOcean, you will need to understand more about the operating system and the associated servers/controllers to deploy and publish.
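For a sense of scale, a minimal ASP.NET Core Web API, something like the sketch below (the /orders route and Order record are just placeholders I made up), is all you'd need to publish to a free Azure Web App from Visual Studio:

```csharp
// Program.cs - minimal ASP.NET Core Web API (.NET 6+ minimal hosting model).
// The /orders route and Order record are illustrative placeholders only.
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Sample endpoint a shop frontend might post orders to.
app.MapPost("/orders", (Order order) =>
{
    // A real API would persist the order (e.g. to your MySQL database) here.
    return Results.Created($"/orders/{order.Id}", order);
});

app.Run();

record Order(int Id, string Product, int Quantity);
```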
The cost part is a lot more difficult. It will depend on the host you are using and on the load (processing, memory, storage, and data throughput). In my experience, some very small-scale APIs that did work like PDF generation needed more processing or memory than a medium-scale API that only passed JSON back and forth.

Related

What is the ideal method for creating a Windows application and service package?

I have a project I am working on where I need to create an app and service package for Windows. I would like the service process to run as SYSTEM or LOCALSYSTEM so that credentials are irrelevant. The application frontend will be installed and executable by any user on the machine. Data from the frontend application will be passed to the service - most likely paths to directories selected by users. Once started, the service will listen for a command to do some action while accepting the aforementioned paths.
I'm using C# on the .NET platform and I've looked into creating a standalone service and a standalone application separately as well as creating a WCF service library and host application - that's as far as I've gotten.
All of these methods seem overly complex for what I am trying to achieve. What is modern convention when attempting something like this? I'm willing and able to learn the best method for moving forward.
Edit: This was flagged duplicate. I'm not looking for information on HOW to communicate with a Windows service. That's remedial and not at all what I'm asking. I'm looking for validation that I'm on the right track and if I'm not, I'm looking for suggestions. I've been told that I'm on the right track and pointed towards named pipe binding.
A Windows Service is certainly an option for hosting WCF, although it can be a deployment nightmare. It really depends on your environment and on the capability and support of your system admins; I've had many clients where deploying a Windows Service was simply not practical, since you need admin rights to install and update it.
Console applications may sound like a terrible idea, but the practicality of being able to drop them on a share and run a PowerShell script to start them is very compelling.
But frankly, IIS hosting has the most advantages in my mind, as the product is designed for ease of deployment and uptime. And you can use any transport binding in IIS that you can use in a Windows Service or a console application.
As for the binding itself, named pipes are not really a popular option in many enterprise scenarios, as they are incompatible with anything but .NET. The same can be said for binary, which is one of the more performant bindings. WSHttpBinding is probably the most popular binding in scenarios that involve unknown callers. WebHttpBinding is an interesting option as it's HTTP/REST based, although it requires further decoration of your operations, and honestly, if you're going that route you should really be using Web API.
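To make the self-hosting option concrete, here is a rough sketch of hosting a WCF service from a console application (the contract, address, and port are made up for illustration; the same ServiceHost code could just as well live in a Windows Service's OnStart):

```csharp
using System;
using System.ServiceModel;

// Placeholder contract for illustration; substitute your own operations.
[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    string Ping();
}

public class OrderService : IOrderService
{
    public string Ping() { return "pong"; }
}

class Program
{
    static void Main()
    {
        // Self-host the service in-process.
        using (var host = new ServiceHost(typeof(OrderService),
            new Uri("http://localhost:8080/OrderService")))
        {
            // WSHttpBinding for unknown callers; swap in NetNamedPipeBinding
            // or another binding as discussed above.
            host.AddServiceEndpoint(typeof(IOrderService), new WSHttpBinding(), "");
            host.Open();
            Console.WriteLine("Service running. Press Enter to stop.");
            Console.ReadLine();
        }
    }
}
```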

Send fax from ASP.NET MVC application

I am developing ASP.NET MVC application as a part of summer experience job from my college.
I have a requirement where I need to implement fax functionality.
Because I am still a student, and this is my first real application, I'm kind of confused about how to build this function and which libraries I should use. Third-party services (such as mail-to-fax and so on) are not an option due to the nature of the application: it is going to be a service that helps doctors access patient encounters. This data is private and cannot be sent to any third-party service.
I am using Visual Web Developer 2008 express edition if this matters.
The fax machine is going to be installed on the server, and faxes will be sent from there.
I am looking for advice or, maybe some good resources that can help me.
Thanks.
I've worked on faxing in our application. Our app integrates with various fax services through configuration options. I recommend buying or using something off-the-shelf and integrating with it. Some examples include:
Microsoft Fax Server, which is free with existing Windows Server licenses.
Faxman, which costs a nominal fee and works great. It can fax PDFs with an optional add-on.
RightFax, a very expensive solution. It can also fax PDFs with an optional add-on.
You might look into the Adapter pattern to abstract the faxing implementation from your service.
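To sketch what that Adapter might look like (all names here are hypothetical, and the actual fax-product call is left as a comment):

```csharp
// Hypothetical abstraction so the MVC code never depends on a specific fax product.
public interface IFaxSender
{
    void SendFax(string recipientNumber, string documentPath);
}

// One adapter per product (Microsoft Fax Server, Faxman, RightFax, ...).
public class MicrosoftFaxServerSender : IFaxSender
{
    public void SendFax(string recipientNumber, string documentPath)
    {
        // Call the Microsoft Fax Server API here.
    }
}

// Application code talks only to the interface, so the concrete product
// can be swapped through configuration without touching the controllers.
public class EncounterFaxService
{
    private readonly IFaxSender _faxSender;

    public EncounterFaxService(IFaxSender faxSender)
    {
        _faxSender = faxSender;
    }

    public void FaxEncounter(string faxNumber, string encounterDocumentPath)
    {
        _faxSender.SendFax(faxNumber, encounterDocumentPath);
    }
}
```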
The easiest approach here would be to use some sort of email-to-fax facility, either hosted locally or in the cloud, to send the faxes. It will make your life much, much easier, as you won't have to take it back to the old school and get down with dial tones.
If you do have to send stuff, you should probably look at externalizing the operation to your own service for a few reasons:
1) Testability/maintainability/flow: if it's external, you can create a stubbed API and write your web app to talk to that first, then get down with dial tones later. But it's not a blocking issue, and your test suite doesn't need a fax modem.
2) Usability: faxes take a while to send, if they succeed at all. Passing off the request quickly, telling the user it's being sent, and then notifying them of success makes a bit more sense than a really long-running "PROCESSING" graphic.
I'd start by looking at TAPI, which is Microsoft's Telephony API. There are .Net wrappers listed in the Wikipedia page.

Shaky connectivity - favor web or desktop app?

I'm a desktop application developer who is temporarily working in the web. I'm working with a client that wants me to build an app for use by locations all over the state; however, these locations have very shaky connectivity.
They really want a centralized web app and are suggesting I build a "lean" web app. I don't know what a "lean web app" means: small HTTP requests but lots of them, or large HTTP requests with few of them? I tend to favor chunky over chatty, but I've never had to worry about connectivity before.
Do I suggest a desktop app that replicates data when connectivity exists? If not, what's the best way to approach a web app when connectivity is shaky?
EDIT:
I must qualify my question with further information. Assuming the web option, they've disallowed the use of browser runtime technologies and anything that requires installation. Thus, Silverlight is out, Flash is out, Gears is out; only ASP.NET and JavaScript are available to me. Having stated this, part of my question was whether to use a desktop app; I suppose that can be extended to "thicker technologies".
EDIT #2: Network is homogeneous - every node is Windows. This won't be changing.
You should get a definition of what the client means by "lean" so that you don't have confusion surrounding it. Maybe present them with several options of lean that you think they might mean. One thing I've found is it's no good at all to guess about client requirements. Just get clarification before you waste a bunch of time.
Shaky connectivity definitely favors a desktop application. Web apps are great for users that have always-on Internet connections, and that might be using a variety of different browsers and operating systems.
Your client probably has locations that are all using Windows, so a desktop application is an appropriate choice. One remaining advantage of web applications is that they make the deployment issue easy to deal with, but auto-update technologies like ClickOnce make the deployment and update of desktop applications almost as easy.
And not to knock Google Gears, but it's relatively new and would have to be considered more risky than a tried-and-true desktop application.
Update: and if you're limited to just JavaScript on the client side, you definitely do not want to make this a web app. Your application simply will not be available whenever the Internet connection is down. There are ways to save data locally in JavaScript using cookies, user stores, and whatnot, but you just don't want to do this.
If connectivity is so bad, I would suggest that you write a WinForm app that downloads information, locally edits it and then uploads it. This way, if your connection goes down, all you have to do is retry until it works.
They seem to be suggesting a plain vanilla web app that doesn't use AJAX or rely on .NET postbacks or do anything that might make it break down horribly if your connection goes away for a bit. Instead, it should be designed so that you can hit Refresh until it works. In other words, they seem to want the closest thing to a WinForm app, only uglier.
You may consider using a framework like Google Gears to help provide functionality during network down time. This allows users to connect to the web page once (with a functioning connection) and then be able to use the web app from then on, even without a connection.
When the network is restored, the framework can sync changes back with the central database.
There is even a tutorial for using Google Gears with the .Net Framework.
Gears with other languages
You mention that connectivity is shaky at these locations, but that the app needs to be centralized. One thing you might consider is using multiple decentralized read database servers and a single centralized write server. MySQL makes this possible and affordable if your app is small.
Have the main database server at the datacenter/central office. Put up small web/db servers at each location, with your app installed. You can even run them off a user computer if the remote location is not too big. Make the local database servers connect to the centralized database server as replication slaves. As changes come in to the centralized database, the slave servers will pull down the data and make it available locally. When the connection is unavailable, your app data is still at least available, if not up to date. When the connection is available, the database handles replicating all relevant data down.
Now all you have to do is make your app use two separate database handles: for reading data it uses the local database, and for writing data it uses the central database.
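A minimal sketch of that two-handle split, assuming MySQL Connector/NET and an invented orders table (the connection strings and schema are placeholders, not part of the original setup):

```csharp
using System.Data;
using MySql.Data.MySqlClient; // MySQL Connector/NET

public class OrderRepository
{
    // Hypothetical connection strings: reads hit the local replication slave,
    // writes go to the central master at the datacenter.
    private const string LocalRead =
        "Server=localhost;Database=shop;Uid=app;Pwd=secret;";
    private const string CentralWrite =
        "Server=central.example.com;Database=shop;Uid=app;Pwd=secret;";

    public DataTable GetOrders()
    {
        using (var conn = new MySqlConnection(LocalRead))
        using (var adapter = new MySqlDataAdapter("SELECT * FROM orders", conn))
        {
            var table = new DataTable();
            adapter.Fill(table);
            return table;
        }
    }

    public void SaveOrder(string product, int quantity)
    {
        using (var conn = new MySqlConnection(CentralWrite))
        using (var cmd = new MySqlCommand(
            "INSERT INTO orders (product, quantity) VALUES (@p, @q)", conn))
        {
            cmd.Parameters.AddWithValue("@p", product);
            cmd.Parameters.AddWithValue("@q", quantity);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```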

What sort of web host lets you run crawlers on it?

I'm working on a graduation project for one of my university courses, and I need to find some place to run several crawlers I wrote in C#. With no web hosting experience, I'm a bit lost. Is this something that any site allows? Do I need a special host that gives more access to the server? The crawler is a simple app that does its work, then periodically writes information to a remote database.
A web crawler simulates a normal user. It accesses sites the way browsers do, getting the HTML (JavaScript, etc.) returned from the server, so it has no internal access to server code. Because of that, any site can be crawled.
Be aware of web crawler ethics guidelines. There are pages you shouldn't index or whose links you shouldn't follow, and web developers publish files with instructions for crawlers (such as robots.txt) saying what you may index or follow.
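As a rough illustration of honoring those instructions, a deliberately naive robots.txt check might look like this (a real crawler should also respect per-user-agent sections and Crawl-delay):

```csharp
using System;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

static class RobotsCheck
{
    // Naive check: fetch /robots.txt and see whether any "Disallow:" rule
    // matches the start of the path we want to crawl.
    public static async Task<bool> IsAllowedAsync(HttpClient http, Uri page)
    {
        var robotsUri = new Uri(page, "/robots.txt");
        string robots;
        try { robots = await http.GetStringAsync(robotsUri); }
        catch (HttpRequestException) { return true; } // no robots.txt -> assume allowed

        return !robots
            .Split('\n')
            .Select(l => l.Trim())
            .Where(l => l.StartsWith("Disallow:", StringComparison.OrdinalIgnoreCase))
            .Select(l => l.Substring("Disallow:".Length).Trim())
            .Any(rule => rule.Length > 0 && page.AbsolutePath.StartsWith(rule));
    }
}
```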
If you can't run it off your desktop for some reason, you'll need a host that lets you execute arbitrary C# code. Most cheap web servers don't do this due to the potential security implications, since there will be several other people running on the same server.
This means you'll need to be on a server where you have your own OS. Either a VPS - Virtual Private Server, where virtualization is used to give you your own OS but share the hardware - or your own dedicated server, where you have both the hardware and software to yourself.
Note that if you're running on a server that's shared in any way, you'll need to make sure to throttle yourself so as not to cause problems for your neighbors; your primary concern will be not using too much CPU or bandwidth. This isn't just politeness: most web hosts will suspend your hosting if you're causing problems on their network, such as starving the other users on your hardware of resources by consuming them all yourself. You can usually burst to higher usage levels, but they'll cut you off if you sustain them for a significant period of time.
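A crude but effective way to throttle is simply to pause between requests; the one-second delay below is an arbitrary placeholder, not a figure from any particular host's policy:

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

static class PoliteCrawler
{
    public static async Task CrawlAsync(HttpClient http, IEnumerable<Uri> pages)
    {
        foreach (var page in pages)
        {
            string html = await http.GetStringAsync(page);
            // ... parse the page and write results to the remote database ...

            // Fixed pause between requests keeps sustained CPU and bandwidth use low.
            await Task.Delay(TimeSpan.FromSeconds(1));
        }
    }
}
```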
This doesn't seem to have anything to do with web hosting. You just need a machine with an internet connection and a database server.
I'd check with your university if I were you. At least in my time, a lot was possible to arrange in-house when it came to graduation projects.
Failing that, you could look into a simple VPS (Virtual Private Server) account. Unless you are sure your app runs under Mono, you will need a Windows one. The resource limits are usually a lot lower than you'd get from a dedicated server, but they're relatively affordable. Some providers offer an MS SQL Server database you can use alongside the VPS account (on another machine); installing SQL Server on the VPS itself can be a problem license-wise.
Make sure you check the terms of usage before you open an account, as well as the (virtual) system specs though. Also check if there is some kind of minimum contract period. Sometimes this can be longer than a single month, especially if there is no setup fee.
If at all possible, find a host that's geographically close to you. A server on the other side of the world can get a little annoying to access remotely using Remote Desktop.
80legs lets you use their crawlers to process millions of web pages with your own program.
The rates are:
$2.00 per million pages
$0.03 per CPU-hour
They claim to crawl 2 billion web pages a day.
You will need a VPS (Virtual Private Server) or a full-on dedicated server. Crawlers are nothing more than applications that "crawl" the internet. While you could set up a website to act as a crawler, it is not practical, because the web page would have to be accessed for your crawler to work. You will have to read the ToS (Terms of Service) for the host to see what the terms are for usage. Some of the lower-priced hosts will cut your connection, citing "negatively impacting the network", if you try to use too much bandwidth, even though they have given you plenty to use.
VPSes run around $30-80 for a Linux server and $60+ for a Windows server.
Dedicated servers run $100+ for both Linux and Windows.
You don't need any web hosting to run your spider. Just ask for a PC with an internet connection that can act as a dedicated server, configure the database, and run the crawler from there.

Is it possible to create a standalone, C# web service deployed as an EXE or Windows service?

Is it possible to create a C# EXE or Windows Service that can process Web Service requests? Obviously, some sort of embedded, probably limited, web server would have to be part of the EXE/service. The EXE/service would not have to rely on IIS being installed. Preferably, the embedded web service could handle HTTPS/SSL type connections.
The scenario is this: the customer wants to install a small agent (a Windows service) on their corporate machines. The agent would have two primary tasks: 1) monitor the system over time and gather certain pieces of data, and 2) respond to web service requests (SOAP vs. REST is still being haggled over) for data gathering or system change purposes. The customer likes the idea of web service APIs so that any number of clients (in any language) can be written to tap into the various agents running on the corporate machines. They want the installation to be relatively painless (install .NET, some assemblies, and a service, modify the Windows firewall, start the service) without requiring IIS to be installed and configured.
I know that I can do this with Delphi. But the customer would prefer to have this done in C# if possible.
Any suggestions?
Yes, it's possible; you may want to have a look at WCF and self-hosting.
Yes, it is possible (and fairly easy).
Here is a CodeProject article showing how to make a basic HTTP server in C#. This could easily be put in a standalone EXE or service, and used as a web service.
One technology you might want to check out is WCF. WCF can be a bit of a pain to get into but there's a great screencast over at DNRTV by Keith Elder that shows how to get started with WCF in a very simple fashion.
http://www.dnrtv.com/default.aspx?showNum=135
You could take a look at HttpListener in the .Net framework.
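For a sense of scale, a bare-bones HttpListener host looks something like the sketch below (the prefix, port, and JSON payload are placeholders; HTTPS would additionally require binding a certificate to the port with netsh):

```csharp
using System;
using System.Net;
using System.Text;

class MiniWebService
{
    static void Main()
    {
        // Listening on a prefix like this typically needs admin rights
        // or a URL reservation (netsh http add urlacl).
        var listener = new HttpListener();
        listener.Prefixes.Add("http://+:8080/agent/");
        listener.Start();
        Console.WriteLine("Listening on http://+:8080/agent/ ...");

        while (true)
        {
            HttpListenerContext context = listener.GetContext(); // blocks until a request arrives
            byte[] body = Encoding.UTF8.GetBytes("{\"status\":\"ok\"}");
            context.Response.ContentType = "application/json";
            context.Response.OutputStream.Write(body, 0, body.Length);
            context.Response.Close();
        }
    }
}
```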
I would highly recommend WCF. It would fit very well into a product like you are describing. There are a good number of books available.
Sure, you can do that. Be sure to change the Output Type of the project to Console Application. Then, in your Main function, add a string[] parameter. Based on some switch that you receive on the command line, you can branch to ServiceBase.Run to run as a Windows Service, or branch to some other code to run as a console application.
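A sketch of that branching (the --console switch and the AgentService members are hypothetical names chosen for this example):

```csharp
using System;
using System.ServiceProcess;

public class AgentService : ServiceBase
{
    protected override void OnStart(string[] args) { StartAgent(); }
    protected override void OnStop() { StopAgent(); }

    // Hypothetical helpers that would start/stop the embedded web listener.
    public void StartAgent() { /* start HttpListener / WCF host here */ }
    public void StopAgent() { /* shut it down */ }
}

class Program
{
    static void Main(string[] args)
    {
        if (args.Length > 0 && args[0] == "--console")
        {
            // Run interactively for debugging.
            var agent = new AgentService();
            agent.StartAgent();
            Console.WriteLine("Running as a console app. Press Enter to exit.");
            Console.ReadLine();
            agent.StopAgent();
        }
        else
        {
            // Started by the Service Control Manager: run as a Windows Service.
            ServiceBase.Run(new AgentService());
        }
    }
}
```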
This question is somewhat older, but since I needed something similar a while ago, it still feels relevant.
I wrote a small REST API with NancyFx and OWIN. OWIN is a standard interface between .NET applications and web servers; with OWIN it is possible to create a self-hosted web API. Nancy, on the other hand, is "a lightweight, low-ceremony framework for building HTTP based services on .NET".
The combination of those two makes it possible to create a self-hosted C# Web service.
I am quite sure that there are many more possibilities to create something like this by now, but since I used it this way, I thought the information might be useful to someone.
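For reference, the Nancy 1.x + OWIN self-host combination looked roughly like this (the port and route are placeholders; it assumes the Nancy, Nancy.Owin, Microsoft.Owin.Hosting, and Microsoft.Owin.Host.HttpListener packages):

```csharp
using System;
using Microsoft.Owin.Hosting;
using Nancy;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Plug Nancy into the OWIN pipeline (extension method from Nancy.Owin).
        app.UseNancy();
    }
}

public class HelloModule : NancyModule
{
    public HelloModule()
    {
        // Nancy 1.x route syntax.
        Get["/hello"] = _ => "Hello from a self-hosted Nancy service";
    }
}

class Program
{
    static void Main()
    {
        using (WebApp.Start<Startup>("http://localhost:9000/"))
        {
            Console.WriteLine("Listening on http://localhost:9000/ - press Enter to quit.");
            Console.ReadLine();
        }
    }
}
```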
