WCF service call - C#

I have a WCF service which takes computer IDs (IDs of computers on a network) as input parameters and saves each computer's stats for the past 12 hours [like how much time the computer was locked, active, idle, etc.] in the database.
I also have a website from which I can schedule stats collection [for the past 12 hours, as mentioned above] for a few computers at some time t. This scheduling information [computer IDs and time] is saved to the database.
Now the issue is how to make sure the WCF service runs at that particular scheduled time, and how to show the computer stats on the website once the service has been called and the stats have been generated. If I use a Windows service to call the WCF service, how do I ensure that it runs at the scheduled time, and how do I inform the website that the stats have been generated?
Any help would be appreciated.
Cheers!

Set up a Scheduled Task that calls the WCF service method you want to run, using something like cURL.
I believe the website needs refreshing so it can pick up the data after the WCF method has executed, so you can again use cURL to make a web page call.
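As an illustration, here is a minimal console app the Scheduled Task could run at the chosen time instead of cURL; the endpoint URL, method name, and computerId parameter are assumptions, so adjust them to your actual service contract:

    // Minimal sketch of a console app that Task Scheduler runs at the scheduled time.
    // The endpoint URL, method name, and computerId value are placeholders.
    using System;
    using System.Net.Http;

    class RunScheduledStats
    {
        static void Main()
        {
            using (var client = new HttpClient())
            {
                // Hypothetical REST-style (webHttpBinding) endpoint on the WCF service.
                var response = client.GetAsync(
                    "http://localhost:8080/StatsService/GenerateStats?computerId=42").Result;
                Console.WriteLine((int)response.StatusCode);
            }
        }
    }

Once the call completes and the stats are saved, the website only needs to re-query the database (or be refreshed, as above) to show them.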

Related

How To Keep My Web Services Awake?

I have written two web services that I am running on GoDaddy. One is a Microsoft WCF web service and the other is a RESTful Web API service. They are both working, but they rarely get traffic. If I don't call the web services for some period of time they seem to go to sleep. Then when I load the pages that call the web services they take some 20 to 30 seconds to retrieve data from the services. After that if I continue to call them repeatedly they load in just a second or two. Is this normal or did I do something wrong in my configuration? Is there some way to keep them active?
Entirely normal. You can either increase the recycle time limit in IIS (but you will still get recycled eventually) or you can write a quick scheduled task like the following to run every 10 minutes or so:
powershell Invoke-WebRequest -Uri "http://example.com"
Although I would caution that you should forcefully restart the service sometime during low usage hours just to clear the process memory / resource utilization.
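If you would rather keep everything in C#, a rough equivalent of that one-liner that you could register as the scheduled task instead (the URL is a placeholder) might look like this:

    // Keep-alive sketch: request the service URL so IIS keeps the worker process warm.
    // Register this console app as a scheduled task running every 10 minutes or so.
    using System;
    using System.Net.Http;

    class KeepAlive
    {
        static void Main()
        {
            using (var client = new HttpClient())
            {
                // Placeholder URL; point it at your WCF .svc or Web API endpoint.
                var status = client.GetAsync("http://example.com/MyService.svc").Result.StatusCode;
                Console.WriteLine($"{DateTime.Now:u} -> {status}");
            }
        }
    }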

Getting a notification when my web service is down

I have created a WCF SOAP web service in C# and a C# application. Other users work with the application and get information from a local database via the web service.
Now, it is possible that the web service goes down. When that happens I want to be notified as quickly as possible, say by email.
I could let the client application send an email when it can't connect to the web service, but there are more than 10,000 users, and I don't want to get 500 emails when 500 of them are online.
Is there a better solution for getting an email when my web service is down?
I would suggest a sort of heartbeat program that runs on another server / location.
This will test not only the server but also, for example, your internet connection, and it will keep working even when the power at your site is down.
There are programs that do this already. The ones I know of are Microsoft System Center and Nagios.
There are a lot of monitoring software packages out there if you want a pre-made tool. The company I work for uses New Relic. (http://newrelic.com/server-monitoring).
If you add a simple function to your web service that just returns true, you can call it periodically (maybe every minute?). If it times out, your service is down, and your monitoring program can email you.
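A minimal sketch of that idea, assuming a standard WCF contract and an SMTP server you control (the addresses, SMTP host, and contract names are placeholders):

    // Heartbeat sketch: a trivial Ping operation on the existing WCF service, plus a
    // monitor that runs on another machine. Addresses and SMTP settings are placeholders.
    using System;
    using System.Net.Mail;
    using System.ServiceModel;
    using System.Threading;

    [ServiceContract]
    public interface IHeartbeat
    {
        [OperationContract]
        bool Ping();   // the implementation simply returns true
    }

    class HeartbeatMonitor
    {
        static void Main()
        {
            var factory = new ChannelFactory<IHeartbeat>(
                new BasicHttpBinding { SendTimeout = TimeSpan.FromSeconds(10) },
                new EndpointAddress("http://myserver/MyService.svc")); // placeholder address

            bool wasUp = true;
            while (true)
            {
                bool isUp;
                var channel = factory.CreateChannel();
                try { isUp = channel.Ping(); }
                catch { isUp = false; }
                finally { ((IClientChannel)channel).Abort(); }

                if (wasUp && !isUp)
                {
                    // Mail only on the up -> down transition, so you get one email, not 500.
                    new SmtpClient("smtp.example.local").Send(
                        "monitor@example.local", "admin@example.local",
                        "Web service down", "No heartbeat at " + DateTime.Now.ToString("u"));
                }
                wasUp = isUp;
                Thread.Sleep(TimeSpan.FromMinutes(1));
            }
        }
    }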

WCF/ASP.NET Web API service architecture suggestions

I have some experience with WCF service development, but with this requirement I want to get some help/suggestions from the experienced developers here. Here is my scenario:
I will have a REST service (let's call it Service 1) which will receive requests with some parameters from a different service (let's call it Service Main). I am planning to save these parameters in a database so that I can track the progress in future steps. Then, from Service 1, I have to start a process on the server which will run for an indeterminate time (based on the parameters); let's call it Process A. When Process A finishes its task and comes back with good results, I have to start a different process, Process B, which will use the files generated by Process A. When Process B is done and sends an acknowledgement to Service 1, I have to send the information back to Service Main.
For the database I am planning to use a NoSQL database, since there are no relationships involved and it is more like a cache. I am having a hard time working out how to architect this entire process so that all of these steps/tasks run asynchronously and can scale to handle a lot of requests.
Approach 1: My initial idea was to have a WCF or ASP.NET Web API (REST) service use the TPL to launch Process A, wait for it to complete via an async callback, and then launch Process B on a new Task. But I am not sure whether that is a good solution or even possible.
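A very rough sketch of Approach 1, simplified to a background Task that waits synchronously for each process (executable paths, the job id, and the status store are placeholders):

    // Rough sketch of Approach 1: chain the two external processes on a background Task.
    // Paths, job id handling, and status persistence are placeholders.
    using System.Diagnostics;
    using System.Threading.Tasks;

    public static class JobRunner
    {
        public static Task RunJobAsync(string jobId, string parameters)
        {
            return Task.Run(() =>
            {
                UpdateStatus(jobId, "ProcessA running");
                RunProcess(@"C:\tools\ProcessA.exe", parameters);   // blocks until A exits

                UpdateStatus(jobId, "ProcessB running");
                RunProcess(@"C:\tools\ProcessB.exe", jobId);        // consumes files produced by A

                UpdateStatus(jobId, "Completed");                   // Service 1 can now notify Service Main
            });
        }

        static void RunProcess(string path, string args)
        {
            using (var p = Process.Start(new ProcessStartInfo(path, args) { UseShellExecute = false }))
            {
                p.WaitForExit();
            }
        }

        static void UpdateStatus(string jobId, string status)
        {
            // Placeholder: write the status to the (NoSQL) job store.
        }
    }

One caveat (which Approach 2 below addresses) is that background Tasks die if the web host process recycles, so a separate Windows service is usually safer for long-running work.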
Approach 2: After a lot of reading, I thought about having a Windows service on the host server launch Process A and Process B. The WCF service would talk to the Windows service to start the processes.
Hopefully I explained the problem clearly; I am waiting to hear some advice.

Self hosted WCF Service with Timer

I am following this example to create a self-hosted WCF service. Ideally I would like the service to hook up a timer that checks every half hour whether a certain value has been updated in the database; if it has, the service would perform some task, otherwise it keeps checking every half hour. I have read online that using a timer in an IIS-hosted WCF service is not a good idea; how about using one in a self-hosted WCF service? Any examples?
Thanks,
I think a better option for you would be to create a simple console app that performs your task if the value is updated and then create a Scheduled Task in Windows that runs this console app every half hour. That way you can let Windows manage the timing part and you just have to write the code that checks the DB and updates it if necessary.
Not sure what version of Windows you are running, but you can get to Scheduled Tasks from the Control Panel.
Create a Scheduled Task on XP
Create a Scheduled Task on Windows 7
The reason a timer in an IIS-hosted WCF service is "not a good idea" is that an IIS-hosted service has a much different lifetime than a self-hosted service. See this SO question and answer for some details and this MSDN article for even more.
Basically, a WCF service hosted inside IIS can be "shut down" if no one has connected to it within a timeout period. If you need regular periodic maintenance like you are describing, you will need to use a self-hosted service and have that service start a timer, in its OnStart() call, that fires every half hour.
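A minimal sketch of that idea, assuming a Windows-service host for the WCF service (the contract, its implementation, the endpoint configuration, and the database check are placeholders):

    // Minimal sketch: a Windows service that self-hosts WCF and runs a half-hourly timer.
    // Contract, implementation, and the database check are placeholders.
    using System;
    using System.ServiceModel;
    using System.ServiceProcess;
    using System.Timers;

    [ServiceContract]
    public interface IMyWcfService
    {
        [OperationContract]
        string GetStatus();
    }

    public class MyWcfService : IMyWcfService
    {
        public string GetStatus() { return "ok"; }
    }

    public class SchedulerService : ServiceBase
    {
        private ServiceHost _host;
        private Timer _timer;

        protected override void OnStart(string[] args)
        {
            _host = new ServiceHost(typeof(MyWcfService));   // endpoints come from app.config
            _host.Open();

            _timer = new Timer(TimeSpan.FromMinutes(30).TotalMilliseconds);
            _timer.Elapsed += (s, e) => CheckDatabaseAndRunTask();
            _timer.Start();
        }

        protected override void OnStop()
        {
            if (_timer != null) _timer.Stop();
            if (_host != null) _host.Close();
        }

        private void CheckDatabaseAndRunTask()
        {
            // Placeholder: query the database and, if the value was updated, perform the task.
        }
    }

    // Entry point (normally in Program.cs): ServiceBase.Run(new SchedulerService());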

ASP.NET - Offload processing to external application

I have an ASP.NET website which processes requests using a 3rd-party exe. Currently my workflow is:
User accesses website using any browser and fills out a form with job details
Website calls a self-hosted WCF Windows service which is listening on a port
Windows service launches 3rd party exe to process the job and returns the result to website
Website displays the returned result to the user
The above website was a prototype which now needs to be turned into a production ready deployment. I realize that the above architecture has many points that could break. For example, if the machine is powered off or if the windows service crashes and is no longer listening on the port, all current requests will stop processing. To make the architecture more robust, I am considering the following
User accesses website using any browser and fills out a form with job details
Website writes out the job details to a database
Windows service which is polling the database every 10 seconds for a new job picks up the job and executes it using the 3rd party application. The results are written back to the database.
Website, which has now started polling the database, picks up the results and displays them to the user.
The second architecture provides me with more logging capabilities, and jobs can start again if they are in a queue. However, it involves a large amount of polling, which may not be scalable. Can anyone recommend a better architecture?
Instead of polling I would go with MSMQ or RabbitMQ.
That way you could offload your processing to multiple consumers of the queue (possibly on servers separate from the web server) and process more requests in parallel.
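A minimal MSMQ sketch of that idea (the queue path and Job type are placeholders; RabbitMQ would look similar with its .NET client):

    // Minimal MSMQ sketch: the website enqueues jobs, a consumer (Windows service or a
    // separate server) dequeues and runs them. Queue path and Job type are placeholders.
    using System;
    using System.Messaging;

    public class Job
    {
        public Guid Id { get; set; }
        public string Details { get; set; }
    }

    public static class JobQueue
    {
        const string Path = @".\Private$\JobQueue";

        public static void Enqueue(Job job)           // called by the website
        {
            if (!MessageQueue.Exists(Path))
                MessageQueue.Create(Path);

            using (var queue = new MessageQueue(Path))
            {
                queue.Formatter = new XmlMessageFormatter(new[] { typeof(Job) });
                queue.Send(job, job.Id.ToString());
            }
        }

        public static Job Dequeue()                    // called by a consumer process
        {
            using (var queue = new MessageQueue(Path))
            {
                queue.Formatter = new XmlMessageFormatter(new[] { typeof(Job) });
                return (Job)queue.Receive().Body;      // blocks until a message arrives
            }
        }
    }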
I have implemented the same architecture in one of my applications, where users make multiple requests to process. So I have:
Users go to the website, select parameters, etc., and submit the request
The request is stored in a database table with all the details + user name, etc.
A service looks at the database table and picks up requests in FIFO order
After a request is processed, its status is updated to Failed or Completed in the database table against that requestId, which users can see on the website
The service picks up the next request if there is one, otherwise it stops
The service runs every 30 minutes
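A rough sketch of that polling service's main loop, assuming a Requests table with Status and Details columns (connection string, schema, and the Process method are placeholders):

    // Rough sketch of the polling service: pick up pending requests FIFO, process them,
    // and record Completed/Failed. Connection string and schema are placeholders.
    using System;
    using System.Data.SqlClient;

    class RequestProcessor
    {
        const string ConnStr = "Server=.;Database=Jobs;Integrated Security=true";

        static void Main()   // in practice this is triggered every 30 minutes
        {
            using (var conn = new SqlConnection(ConnStr))
            {
                conn.Open();
                while (true)
                {
                    // Oldest pending request first (FIFO).
                    var pick = new SqlCommand(
                        "SELECT TOP 1 RequestId, Details FROM Requests " +
                        "WHERE Status = 'Pending' ORDER BY CreatedAt", conn);

                    int requestId;
                    string details;
                    using (var reader = pick.ExecuteReader())
                    {
                        if (!reader.Read()) break;          // nothing left to do, stop
                        requestId = reader.GetInt32(0);
                        details = reader.GetString(1);
                    }

                    string status;
                    try { Process(details); status = "Completed"; }
                    catch { status = "Failed"; }

                    var update = new SqlCommand(
                        "UPDATE Requests SET Status = @s WHERE RequestId = @id", conn);
                    update.Parameters.AddWithValue("@s", status);
                    update.Parameters.AddWithValue("@id", requestId);
                    update.ExecuteNonQuery();
                }
            }
        }

        static void Process(string details)
        {
            // Placeholder: run the actual work for this request.
        }
    }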
