I'm developing a client/server application (C#, WinForms for the GUI).
We have a module that performs tasks to import/export data between the database and other external sources. These activities are managed by users from any client station. The next step will be to allow tasks to be scheduled for automatic execution (e.g., start at time X and repeat hourly, daily, weekly, or monthly, and so on).
Each task imports or exports a large amount of data from any data source (Excel, Access, or a DBMS), so these are long-running activities.
Currently the DLL that implements this logic is distributed to each client station. This is not a good solution, because we have to install all the potential prerequisites on each client (for example, ADO/OLE DB/ODBC drivers for every supported DBMS).
I have to move this logic to the server. From each client I want to see task progress, stop or start any task, or change the schedule table and restart the process.
I'm considering what the best solution is: build a Web API or a WCF service. Probably WCF, because it is service-oriented, but I've seen projects and articles combining Web APIs with libraries like Quartz or Hangfire.
I'm also considering whether it is better to use a Windows service and to host WCF inside it.
What is the best solution? Or are there other solutions I'm not considering?
Thank you
EDIT:
From any client workstation the user can schedule tasks to be executed according to the applied settings (frequency, repeat each day/week/month). I should probably use a Windows service, because when the server machine is switched on this service must start automatically and check whether there are tasks to run. At the same time, the user can decide to run any task manually, without scheduling it; in that case it is queued and processed when its turn comes.
Now I'm thinking of hosting a WCF service inside a Windows service on the server machine. It will automatically start a background worker that checks for scheduled tasks to run. In addition, all clients can invoke a method to start one or more tasks. To push progress notifications to all clients I'll use a duplex contract.
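For illustration, a minimal sketch of what that duplex contract could look like (the interface and method names are hypothetical):

using System.ServiceModel;

[ServiceContract(CallbackContract = typeof(ITaskProgressCallback))]
public interface ITaskService
{
    [OperationContract]
    int StartTask(string taskName);              // queue a task now, return its id

    [OperationContract]
    void StopTask(int taskId);                   // cancel a queued or running task
}

// Implemented on each client; the service pushes progress through this channel.
public interface ITaskProgressCallback
{
    [OperationContract(IsOneWay = true)]
    void OnProgress(int taskId, int percentComplete);
}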
You will need to compare WCF and Web API and choose which technology to use according to your requirements.
If you only need HTTP as the transport protocol and lightweight web-hosted services, go with Web API.
I would recommend Hangfire, as it has more features than a plain Windows service: it is distributed, persistent, and also has an out-of-the-box dashboard that shows you all your scheduled, processing, succeeded, and failed jobs.
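For example, a quick sketch of Hangfire usage (DataTasks and its methods are placeholders for your import/export logic):

using Hangfire;

// Fire-and-forget: queued, persisted, and picked up by the next free worker.
BackgroundJob.Enqueue(() => DataTasks.Import("customers.xlsx"));

// Recurring: run the export at the start of every hour.
RecurringJob.AddOrUpdate("hourly-export", () => DataTasks.Export(), Cron.Hourly());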
Also check this article:
Running Background Tasks in ASP.NET
If this is an internal application and the clients are using WinForms, behind the scenes you can make GETs/POSTs to Web API endpoints. This allows users to retrieve/export data without having to install database drivers.
Web API driven, IMO. I'm not very familiar with Windows services, but one benefit I see is that a service can resume running after a reboot.
Related
I have a .NET Core console application that I need to deploy to Azure and schedule to run once a day. The application creates a TCP socket to get market data. I need to schedule it to run in the morning; the application will receive a close message from the market near the end of the day and close automatically. Approximate run time is estimated at 16 hours a day, 5 days a week.
Here are the options I've researched:
Cloud Service, which might be deprecated (I'm having a hard time validating the comments I've read to this effect)
Service Fabric - but this really looks like it's tailored for stateless applications that can spin up and down for scale. In my case, it should always be a single instance (I do like the self-healing: if my service goes down, it would be great if it were automatically restarted or a new one spun up)
Azure WebJob and Azure Scheduler. It looks like I could set this to "Always On" and add a settings file with a cron configuration, but it seems like a waste of resources to keep it "Always On". This option also appears to be limited in its deployment options - I can't set up (as far as I can see) Git integration and auto-deploy. Still, this does seem like the way to go
I'm looking for the pros and cons of the options above for my use case, or any other options I might have missed.
There's one thing that seems to be overlooked here. This part:
The application is creating a TCP Socket to get market data.
Is that 80/TCP or 443/TCP? Does it talk HTTP over one of those ports?
Because if your application talks a custom protocol over an arbitrary TCP port, you can't use WebJobs. The App Service sandbox does not allow arbitrary port binding. This applies to ingress only; for egress (outbound) there's no restriction - you can make raw TCP requests from the WebJob to any destination and port.
From https://github.com/projectkudu/kudu/wiki/Azure-Web-App-sandbox#network-endpoint-listening:
Network endpoint listening
The only way an application can be accessed via the internet is through the already-exposed HTTP (80) and HTTPS (443) TCP ports; applications may not listen on other ports for packets arriving from the internet.
There's no need to involve the Azure Scheduler service. WebJobs have a built-in cron implementation which is completely free.
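For a triggered (non-continuous) WebJob, that cron schedule lives in a settings.job file deployed next to the WebJob executable. The expression has six fields (seconds first); for example, to fire at 8:00 AM Monday through Friday:

{ "schedule": "0 0 8 * * 1-5" }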
Also, the Always On feature really means "hit this site with synthetic requests every couple of minutes so it serves a 200 OK", thus preventing the application pool from being unloaded from memory due to inactivity.
I would use the tank. Can't find anything wrong with the tank if you can pick your tank size. There's also very little maintenance with tanks.
Size (id)     Cores   RAM       Net bandwidth   Total disk size
---------------------------------------------------------------
ExtraSmall    1       0.75 GB   Low             19 GB
Small         1       1.75 GB   Moderate        224 GB
...
ServiceDefinition.csdef if you need to listen on a socket:
<Endpoints>
  <InputEndpoint name="aRawTCPEndpoint" protocol="tcp" port="54321" localPort="54321" />
</Endpoints>
Where does your application keep state? Memory/disk/off-the-box database? Cloud Service roles are stateless in nature and if one instance gets sick it's barbecued and a new one is spun up. It's crucial that state be kept off-the-box, in durable storage - Blob/Table storage, Azure SQL, DocumentDB, etc.
Imagine you built your house 6 years ago using a material called ClassicBrick. It is a good material: strong, waterproof, scratch-resistant. But recently a newer material - let's call it Armritis (which, by the way, is designed to be used in bridges, not houses, but I digress) - came out, and everybody tells you it is better in every way. Do you tear down the house? Cloud Services are not deprecated, and until I see an official Microsoft roadmap saying otherwise I'm not going to entertain this idea in any way.
On the topic of Service Fabric, it CAN do stateful services, and that's actually one of its biggest selling points:
From https://azure.microsoft.com/en-us/documentation/articles/service-fabric-reliable-services-quick-start/:
Create a stateful service
Service Fabric introduces a new kind of service that is stateful. A stateful service can maintain state reliably within the service itself, co-located with the code that's using it. State is made highly available by Service Fabric without the need to persist state to an external store.
Azure Functions is also worth a good look if you can speak HTTP over standard ports.
First, if you don't want to waste resources, compare the pricing of Cloud Service, Service Fabric, and WebJob & Scheduler; here is the pricing calculator. Because your console application runs only at specific scheduled times, it is better not to pay while the job isn't working. So a WebJob (if you already have a web app) plus the Scheduler would be a good choice to achieve your purpose.
Using a cloud service for such a tiny job is like using a tank to go to work.
Service Fabric is mainly for building microservices-style applications, not console apps or jobs that run once a day.
WebJobs require a web app, so you are left with the remaining option, App Service Web Apps, which is what hosts WebJobs.
You can set up the schedule to run the job every day at a specific time, or execute it manually on demand.
I would go with one of these solutions, in this order of priority:
App Service: Web Apps (Web Job)
Virtual Machine
Cloud Service
App Service: Web Apps (WebJob) provides a Free plan, and WebJobs are now supported on the Free plan. You will be able to work with files, should you need that. As mentioned in other answers, just set up a schedule. If you have doubts and think it is not kosher to do this on a website, then think of it as getting a free website as a bonus (if you use a paid plan). Either way, everything runs on a machine, be it with a web server or without. Maybe you will even revive some long-forgotten web project of yours?
Cloud Service and Virtual Machine are both straightforward and simple. To be honest, I haven't used Cloud Service, but I think you can connect to it via Remote Desktop just like an ordinary VM, and you will have complete control. I would choose Virtual Machine over Cloud Service, though, simply because it is cheaper.
Solutions that will NOT work:
Azure Scheduler does not fit, because it only allows HTTP/S requests and posting messages to Azure Storage queues, Azure Service Bus queues, or Azure Service Bus topics.
Personally I would go with a WebJob without AlwaysOn, and use Azure Scheduler to fire the WebJob at the desired times using HTTPS. Then the WebJob can do the calls needed to get the data. This does not need AlwaysOn since the Scheduler call wakes it up anyway.
Azure Functions might also be worth a look, though they aren't meant for long-running tasks.
This would also probably be the cheapest option as the Web App can probably run on Free tier (depending on how long the job takes) and the Scheduler can also be Free tier.
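For reference, a sketch of the HTTPS call such a Scheduler job makes: Kudu exposes a run endpoint for triggered WebJobs (the site name, job name, and credentials below are placeholders; authentication uses the site's deployment credentials):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class TriggerWebJob
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // Basic auth with the site's deployment (publishing) credentials.
            var token = Convert.ToBase64String(Encoding.ASCII.GetBytes("$mysite:publishPassword"));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);

            // POST with an empty body starts the triggered WebJob named "MarketDataJob".
            var response = await client.PostAsync(
                "https://mysite.scm.azurewebsites.net/api/triggeredwebjobs/MarketDataJob/run", null);
            Console.WriteLine(response.StatusCode);
        }
    }
}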
I have some experience with WCF service development, but for this requirement I want to get some help/suggestions from experienced developers here. Here is my scenario:
I will have a REST service (let's call it Service 1) which will receive requests with some parameters from a different service (let's call it Service Main). I am planning to save these parameters in a database so that I can track the progress in later steps. Then, from Service 1, I have to start a process on the server that will run for an indeterminate time (based on the parameters); let's call it Process A. When Process A finishes its task with good results, I have to start a different process, Process B, which will use the files generated by Process A. When Process B is done and sends an acknowledgement to Service 1, I have to send the information back to Service Main.
For the database I am planning to use a NoSQL database, since there are no relationships involved and it is more like a cache. I am having a hard time figuring out how to architect this entire process so that all of these steps/tasks run asynchronously and it can scale to handle a lot of requests.
Approach 1: My initial idea was to have a WCF or ASP.NET Web API (REST) service use the TPL to launch Process A, wait for it to complete via an async callback, and then launch Process B on a new Task. But I am not sure if that is a good solution, or even possible.
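To make Approach 1 concrete, a rough sketch of the chaining (every type and helper here - RequestParameters, RunProcessA, RunProcessB, and so on - is hypothetical):

using System.Threading.Tasks;

public class PipelineCoordinator
{
    public async Task HandleRequestAsync(RequestParameters parameters)
    {
        await SaveParametersAsync(parameters);          // persist for status tracking

        // Launch external Process A off the request thread and await its completion.
        var resultA = await Task.Run(() => RunProcessA(parameters));

        if (resultA.Succeeded)
        {
            // Process B consumes the files Process A produced.
            var resultB = await Task.Run(() => RunProcessB(resultA.OutputFiles));
            await NotifyServiceMainAsync(resultB);      // report back to Service Main
        }
    }
}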
Approach 2: After a lot of reading, I thought maybe I should have a Windows service on the server launch Process A and Process B. The WCF service would talk to the Windows service to start the process.
Hopefully I have explained the problem clearly; I'm waiting to hear some advice.
To be precise: I have a .NET Web Forms system. I need a way to periodically check some values and perform tasks depending on those values. For example: every month I have to check whether my customers' credit cards are still valid. There are other tasks/checks that run at shorter intervals.
What is the best approach here? I thought about a Windows service, but I have also read about WCF. Please advise on a modern, solid way to solve this task. I'm targeting .NET 4.0.
WCF is just an interface layer that can run in either a Windows service or IIS. You use this WCF interface to trigger synchronous or asynchronous actions.
Your case sounds like you want a Windows service on a timer that performs validation on data stored in a database or file.
If you want to start the process on demand, then adding a WCF endpoint might be useful; if the timer approach is good enough, you need not bother with WCF.
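A minimal sketch of that timer-driven service (the interval and RunDueChecks are placeholders; RunDueChecks would contain the credit card validation and the shorter-period checks):

using System.ServiceProcess;
using System.Timers;

public class ValidationService : ServiceBase
{
    private Timer _timer;

    protected override void OnStart(string[] args)
    {
        // Wake up periodically; each pass decides which checks are due.
        _timer = new Timer(60 * 60 * 1000.0);            // hourly tick
        _timer.Elapsed += delegate { RunDueChecks(); };  // hypothetical
        _timer.AutoReset = true;
        _timer.Start();
    }

    protected override void OnStop()
    {
        if (_timer != null) _timer.Stop();
    }
}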
References for hosting WCF in a Windows service:
microsoft.com
codeproject.com
As you've surmised, a Windows Service is a good approach to this problem.
Similarly, you could write a Console application and have it run via a scheduled task in Windows.
It depends on how your backend works and what you're most familiar with really.
Writing a console application is very simple, but it's perhaps not the best approach, as you need to ensure a user is logged on so that the scheduled task can run.
A service is slightly more complicated to implement, but it has the benefits of being integrated into the OS properly.
MSDN has a good guide to writing a service in C#, and you don't necessarily need WCF:
http://msdn.microsoft.com/en-us/library/aa984464(v=vs.71).aspx
You could use something like Quartz.NET. See: http://quartznet.sourceforge.net/
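A quick sketch of the Quartz.NET approach, using the 2.x fluent API (the job class body is a placeholder):

using Quartz;
using Quartz.Impl;

public class CardValidationJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // check the customers' credit cards here
    }
}

public static class SchedulerBootstrap
{
    public static void Start()
    {
        IScheduler scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<CardValidationJob>().Build();
        ITrigger trigger = TriggerBuilder.Create()
            .WithCronSchedule("0 0 3 1 * ?")   // 03:00 on the 1st of every month
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}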
If you have limited control over the server (i.e., only regular HTTP pages are allowed):
You can also use a web page to trigger the task; this way you don't need any additional components installed on the server. Then have some other machine make periodic requests to the page(s) that trigger the tasks. Make sure the tasks are restartable and short enough that each one can finish within a regular page request. The page can respond with "next task to run" data so your client can keep pinging the server until the whole operation is finished.
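For instance, a sketch of such a trigger page as an ASP.NET handler (TaskRunner.RunNextChunk is a placeholder for one short, restartable chunk of your work):

using System.Web;

public class RunTaskHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Run one short, restartable chunk of the long operation.
        string nextTask = TaskRunner.RunNextChunk();

        // Tell the caller which task to request next (empty when finished),
        // so the client keeps pinging until the whole operation is done.
        context.Response.ContentType = "text/plain";
        context.Response.Write(nextTask ?? string.Empty);
    }

    public bool IsReusable { get { return true; } }
}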
Note: trying to run long-running tasks inside the web server's worker process is unreliable due to app pool/app domain recycles.
The wording of the question doesn't necessarily do the issue justice...
I've got a client UI sitting on a local box, with a background Windows service to support it while it performs background functions.
The client UI is just the presentation layer, and the Windows service does all the heavy lifting... so there needs to be communication between the two. After spending a while on Google reading best practices, I decided to build the service layer using WCF and named pipes.
The client UI is the WCF client and the windows service acts as the WCF host (hosting locally only) to support the client.
So this works fine, as it should. The client UI can pass data to the WCF host. But my question is, how do I make that data useful? I've got a couple of engines running on the Windows service/WCF host, but the WCF host is completely unaware of the existence of any background engines. I need the client's communication requests to be able to interact with those engines.
Does anybody have any idea of a good design pattern or methodology for facilitating communication between a WCF host and running threads?
I think your best bet is to have some static properties or methods that can be used to exchange data between the service threads/processes and the WCF service.
Alternatively, the way we approach this is with a database: the client or WCF service queues up requests for the worker service to respond to, and the worker service, when it is available, updates the database with the responses. The client then polls the database (through WCF) on a regular basis to retrieve the results of any outstanding requests.
For example, if the client needs a report generated, we fire off a request through WCF and WCF creates a report generation request in the database.
The service responsible for generating reports regularly polls this table and, when it finds a new entry, it spins off a new thread/process that generates the report.
When the report has completed (either successfully or in failure), the service updates the database table with the result.
Meanwhile, the client asks the WCF service on a regular basis whether any of the submitted reports have completed yet. The WCF service in turn polls the table for any requests that have been completed but not yet delivered to the client, gathers the information from them, and returns it to the client.
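In code, the worker side of this pattern boils down to something like the following sketch (the data access helpers are illustrative stand-ins for the actual data layer):

using System;
using System.Threading;
using System.Threading.Tasks;

public class ReportWorker
{
    public void Poll(CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            // Atomically claim one pending request row (e.g. inside a transaction).
            ReportRequest request = TryDequeuePendingRequest();   // hypothetical
            if (request != null)
            {
                Task.Run(() =>
                {
                    var result = GenerateReport(request);         // hypothetical, long-running
                    MarkCompleted(request.Id, result);            // the client's next poll picks this up
                });
            }
            else
            {
                Thread.Sleep(TimeSpan.FromSeconds(5));            // idle back-off between polls
            }
        }
    }
}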
This mechanism allows us to do a couple of things:
1) We can scale the number of services processing these requests across multiple physical/virtual machines as the workload increases.
2) A given service can support numerous clients.
3) Through the WCF interface, we can extend this support to any client platform that we choose to support (web, win, tablet, phone, etc).
Forgot to mention:
Just because we elect to use a database doesn't mean that you have to in order to implement this pattern. You can easily implement the same functionality by creating a static request collection that the WCF service and worker service access in much the same way that we use the database.
You will just need to be very careful about properly obtaining and releasing locks on the static properties to avoid cross-thread collisions or deadlocks.
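For example, a minimal in-memory stand-in for the database queue (a sketch; ConcurrentQueue sidesteps most of the manual locking concerns mentioned above):

using System.Collections.Concurrent;

public class WorkRequest { /* id, payload, etc. */ }

// Shared between the WCF service (producer) and the worker service (consumer).
public static class RequestQueue
{
    private static readonly ConcurrentQueue<WorkRequest> Pending =
        new ConcurrentQueue<WorkRequest>();

    public static void Enqueue(WorkRequest request)
    {
        Pending.Enqueue(request);
    }

    public static bool TryDequeue(out WorkRequest request)
    {
        return Pending.TryDequeue(out request);
    }
}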
Edit (again): Let me simplify my problem. I have a Windows Service that exposes some WCF endpoints with methods like:
int ExecuteQuery(string query) {
// asynchronously execute query that may take 1 second to 20 minutes
return queryId;
}
string GetStatus(int queryId) {
// return the status of the query (# of results so far, etc)
}
What is the best way to implement the ExecuteQuery method? Should I just call ThreadPool.QueueUserWorkItem to get my query going?
Note that the actual work behind executing a query is done by load-balanced black box. I want to be able to have several queries going at the same time.
The analogy is a web browser that is downloading multiple files simultaneously and you have a download manager that can track the status of each file.
Take a look at Microsoft Message Queuing (MSMQ):
Microsoft Message Queuing (MSMQ) technology enables applications running at different times to communicate across heterogeneous networks and systems that may be temporarily offline. MSMQ provides guaranteed message delivery, efficient routing, security, and priority-based messaging. It can be used to implement solutions for both asynchronous and synchronous messaging scenarios.
It's good to know that Windows Communication Foundation (WCF) can leverage queuing services offered by MSMQ.
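For example, pointing a WCF client at a queue instead of a live endpoint is mostly a binding change (a sketch; the queue path and IQueryService contract are placeholders, and note that operations on a queued contract must be marked one-way):

using System.ServiceModel;

// MSMQ transport: messages are durably queued even if the service is offline.
var binding = new NetMsmqBinding(NetMsmqSecurityMode.None);
var address = new EndpointAddress("net.msmq://localhost/private/queryQueue");
var factory = new ChannelFactory<IQueryService>(binding, address);
IQueryService proxy = factory.CreateChannel();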
Either this is a trick question or a no-brainer... ThreadPool.QueueUserWorkItem is about the easiest way to go when you want to execute a piece of code concurrently. I'm sure you already knew that, so technically you have already answered your own question.
So if this is not a trick question, are you asking exactly how to pass the query to ThreadPool.QueueUserWorkItem?
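If that's the question: the state argument (or a closure) carries it in. A sketch, where NextQueryId and RunAgainstBlackBox are placeholders for your id generator and the call into the load-balanced back end:

using System.Threading;

public int ExecuteQuery(string query)
{
    int queryId = NextQueryId();                 // hand out an id immediately
    ThreadPool.QueueUserWorkItem(state =>
    {
        var q = (string)state;                   // the query passed as the state argument
        RunAgainstBlackBox(queryId, q);          // long-running; records status as it goes
    }, query);
    return queryId;                              // caller polls GetStatus(queryId)
}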
I use a Windows service for a very similar task and it works very well. I use database tables to queue requests and responses, as this gives me a persistent queue that can be accessed over the network from remote ASP.NET applications, plus concurrency control through transactions.
A supervisor thread on a timer spawns workers whenever incoming requests need servicing. I use separate database tables for configuration and control so that I can administer the service and pause the supervisor from an admin application while leaving the service core running. Logging to a separate table is a convenient way to see what's happening from the web apps and a local admin app.
I wouldn't use the ThreadPool for long-running threads, but instead create a worker class that runs in its own thread and uses callback methods to update the supervisor with progress and completion status.
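A bare-bones sketch of that worker shape (the delegates stand in for whatever callback interface the supervisor exposes):

using System;
using System.Threading;

public class Worker
{
    private readonly Action<int> _reportProgress;      // e.g. percent complete
    private readonly Action<bool> _reportCompletion;   // success or failure

    public Worker(Action<int> reportProgress, Action<bool> reportCompletion)
    {
        _reportProgress = reportProgress;
        _reportCompletion = reportCompletion;
    }

    public void Start()
    {
        var thread = new Thread(() =>
        {
            try
            {
                // ... service one request, reporting progress along the way ...
                _reportProgress(100);
                _reportCompletion(true);
            }
            catch
            {
                _reportCompletion(false);
            }
        });
        thread.IsBackground = true;
        thread.Start();
    }
}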
Adding to the MSMQ answer: you could look at using an Enterprise Service Bus (ESB) to handle these sorts of things if future scalability is a concern. Check out NServiceBus for one .NET example.
I would use WF (Windows Workflow Foundation) 4.0:
You can run long-running workflows that can be handled across several machines, execute tasks in parallel, get failure/persistence support and friendly coding, manage it all with AppFabric, and it's free...
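As a taste of the programming model, the simplest way to run a WF 4 activity (a sketch; a real long-running workflow would use WorkflowApplication with persistence instead):

using System.Activities;
using System.Activities.Statements;

// Compose activities declaratively; WF handles scheduling and state.
Activity workflow = new Sequence
{
    Activities =
    {
        new WriteLine { Text = "step 1" },
        new WriteLine { Text = "step 2" }
    }
};

WorkflowInvoker.Invoke(workflow);   // runs synchronously on the calling thread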