I have a database and an MVC application hosted on IIS. I periodically gather data from the internet and save it in the SQL database, then calculate statistics and graphs from that data and publish them in the MVC application.
The problem is that IIS has a recycling period of about 1 hour, meaning that my timer (function) which gathers data from the internet is stopped whenever the server restarts, the application pool recycles, or there are no requests to the web page.
The solutions I have found are:
Turn off recycling - I don't own the server, so I can't do that.
A Windows service - 99% of hosting providers don't allow hosting one...
So is there any solution, service, or framework whose purpose is to gather data, where I can be sure it will not stop after some inactivity time or a server restart? Or is my logic completely wrong and I need to gather the data differently? Can it be done on hosting I don't own? Can it be done using IIS?
If the IIS in question has AppFabric installed, then it supports an auto-start feature, which effectively lets you write 'service-like' code that keeps running in the background.
Quick overview here
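For reference, the hook behind that auto-start feature is the IProcessHostPreloadClient interface: you register a preload provider under serviceAutoStartProviders in applicationHost.config, set the application pool's startMode to AlwaysRunning, and point the application's serviceAutoStartProvider attribute at your provider. A minimal sketch of such a provider (class name and method body are illustrative, not taken from the overview linked above):

```csharp
using System.Web.Hosting;

// Registered in applicationHost.config under <serviceAutoStartProviders>
// and referenced by the application's serviceAutoStartProvider attribute.
public class WarmUp : IProcessHostPreloadClient
{
    public void Preload(string[] parameters)
    {
        // Called by IIS when the application pool starts (before any request
        // arrives), so this is where you would kick off the background timer
        // that gathers data from the internet.
    }
}
```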
I have a scenario where I need to create an application (Windows service or WinForms app) which runs twice every day automatically on users' PCs. These users are internal employees on the same network. The application has to run in the morning and in the evening, but it doesn't need to show any window or information saying it's running. It would be good to have a simple notification in the system tray that it has started execution.
My experience is with web application development, so I got a little stuck deciding which approach is best for such an application. My understanding is that if it's a standalone exe, we could ask all users to download and install it.
If it's a Windows service, we may depend on InstallUtil to install the service.
So I really need some advice on this. The application is simple: it just calls a TFS API and the resulting JSON has to be stored in a table. The JSON will be per user, using their Windows authentication.
Please suggest a good solution to achieve this in the best, most secure, and easiest way, even for non-tech-savvy users.
Instead of all users communicating with the TFS server twice a day, I think a better way is to install a Windows service on one centralized machine which runs twice a day, and have that machine host a service using WCF so that the other users communicate with that machine. This will help you distribute the load on the TFS API. I used the same approach in my case, where one machine talks to ALM and the others talk to that machine to get the files.
Creating a Windows service is pretty simple and straightforward.
Follow this link to make one:
https://www.c-sharpcorner.com/UploadFile/naresh.avari/develop-and-install-a-windows-service-in-C-Sharp/
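As a rough sketch of the kind of service that article walks you through (assuming a simple System.Timers.Timer; the names TfsPollService and CallTfsAndStoreResult are placeholders, not from the linked article):

```csharp
using System;
using System.ServiceProcess;
using System.Timers;

public class TfsPollService : ServiceBase
{
    private Timer _timer;

    protected override void OnStart(string[] args)
    {
        // Poll every 12 hours; replace with a proper scheduler if you need
        // the morning/evening runs at exact times.
        _timer = new Timer(TimeSpan.FromHours(12).TotalMilliseconds);
        _timer.Elapsed += (s, e) => CallTfsAndStoreResult();
        _timer.Start();
    }

    protected override void OnStop()
    {
        _timer.Stop();
        _timer.Dispose();
    }

    private void CallTfsAndStoreResult()
    {
        // Hypothetical: call the TFS API under the service account's
        // Windows credentials and write the resulting JSON to the table.
    }

    public static void Main()
    {
        ServiceBase.Run(new TfsPollService());
    }
}
```

It still needs to be installed with InstallUtil (or a setup project), as the question already notes.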
You can host the WCF service using IIS, TCP, a web service, or a console application; it's up to you. Follow this link:
https://www.codeproject.com/Articles/550796/A-Beginners-Tutorial-on-How-to-Host-a-WCF-Service
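A minimal self-hosting sketch, assuming a console host and BasicHttpBinding (the contract, implementation, and address are made up for illustration; hosting in IIS or over TCP only changes the binding and endpoint configuration):

```csharp
using System;
using System.ServiceModel;

[ServiceContract]
public interface ITfsDataService
{
    [OperationContract]
    string GetLatestJson();
}

public class TfsDataService : ITfsDataService
{
    public string GetLatestJson()
    {
        // Hypothetical: return the JSON the centralized machine last
        // fetched from TFS (e.g. read it back from the database).
        return "{}";
    }
}

class Program
{
    static void Main()
    {
        using (var host = new ServiceHost(typeof(TfsDataService),
                                          new Uri("http://localhost:8080/tfsdata")))
        {
            host.AddServiceEndpoint(typeof(ITfsDataService),
                                    new BasicHttpBinding(), "");
            host.Open();
            Console.WriteLine("Service is running. Press Enter to stop.");
            Console.ReadLine();
        }
    }
}
```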
I hope this helped you :)
I need some architectural advice on how to build a background service application.
Background:
I have 2 websites, and I need to transfer some data from website A to website B. The service would have to run in the background (as a Windows service) and should connect (every 5 minutes) directly to website A's database (MSSQL), grab some data, and insert this data through website B's API (the API is built on MVC Web API). Both websites are hosted on the same virtual machine (Windows Server 2008 R2 Datacenter), but this might change (website B could be moved to another virtual server or to cloud hosting such as Windows Azure or Amazon AWS).
Question:
What do you suggest (best practices) and what guidelines can you give me? I want this to be as scalable and fast as possible, as the service will receive multiple requests.
Thank you,
Jani
If it is important to know what data was transferred, then:
Add logs - log4net, for instance (see the sketch after this list).
Issue tickets if the process stops and close the ticket when it restarts; this way you will know if a process fails. Depending on the amount of data, you could use Redis/Riak for this.
Put monitoring on each service (A and B), and you might also consider restarting the service via IIS if it goes down.
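For the logging point, a minimal log4net sketch (assuming the appenders and levels are configured in App.config; the job class and row-count variable are placeholders for your own transfer code):

```csharp
using System;
using log4net;
using log4net.Config;

class TransferJob
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(TransferJob));

    static void Main()
    {
        // Reads the <log4net> section (appenders, levels) from App.config.
        XmlConfigurator.Configure();

        try
        {
            Log.Info("Transfer run started");

            int rowsPushed = 0;
            // Hypothetical: read rows from website A's database and push
            // them to website B's API here, incrementing rowsPushed as you go.

            Log.InfoFormat("Transfer run finished, {0} rows pushed", rowsPushed);
        }
        catch (Exception ex)
        {
            Log.Error("Transfer run failed", ex);
            throw;
        }
    }
}
```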
I'm making an application in C# with VS 2012 that checks a database every 15 seconds and performs some actions when it finds data. Right now I've created a console application so I can debug it easily, but for release this application needs to run on an IIS server.
How can I do that? I've read this question, but it looks like some sort of workaround because I need to perform those steps just to run it. Right now I'm reading the docs about Windows Service Applications; is this the right way?
EDIT: Sorry, but I've never used Windows Server before. As people pointed out, IIS is only a web server; what I need to do is run my application in a Windows Server environment.
IIS is a web server and accordingly it should be used for hosting web applications.
Develop a Windows service which does the job of checking the database at intervals and invokes a web service (which you can host in IIS).
If your application is performing some data querying and manipulation on the server, then I would recommend hosting it in a Windows service.
Some advantages to this are:
The service will start and run independently of a user logging into the server.
You can configure the service to recover should it experience an exception (ideally not!).
The service will start automatically (if configured) when the server restarts.
You can configure which user group (or user) the service should run under, so you have a more granular approach to security.
As it's running as a separate process, you can monitor its memory and processor utilisation.
Debugging is slightly more cumbersome but not difficult; one approach I've used is to install the service locally, start it, and then attach to it via the debugger. This question describes another approach I've also used.
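Another debugging option, sketched below, is to make the same executable run either as a service or as a plain console app depending on how it was started, so you can just press F5 in Visual Studio (PollingService and its Start/Stop methods are placeholders for your own 15-second polling code):

```csharp
using System;
using System.ServiceProcess;

public class PollingService : ServiceBase
{
    public void StartPolling() { /* start the 15-second database timer here */ }
    public void StopPolling()  { /* stop and dispose the timer */ }

    protected override void OnStart(string[] args) { StartPolling(); }
    protected override void OnStop() { StopPolling(); }
}

static class Program
{
    static void Main()
    {
        var service = new PollingService();

        if (Environment.UserInteractive)
        {
            // Launched from the console / debugger: run the polling loop
            // directly so breakpoints work without installing the service.
            service.StartPolling();
            Console.WriteLine("Running. Press Enter to stop.");
            Console.ReadLine();
            service.StopPolling();
        }
        else
        {
            // Launched by the Service Control Manager: run as a real service.
            ServiceBase.Run(service);
        }
    }
}
```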
I am new to WCF and I am designing a project in which I want to run a crawler program (coded in C#) which crawls some websites and stores the crawled data in database tables (SQL Server). I want the crawler to run repeatedly every 30 minutes and update the database.
I then want to use the service on my hosted platform so that I can use the data from the tables in a web form (i.e. an .aspx page).
Is it okay to use WCF for this purpose?
Please suggest how I should move on.
Thanks
Windows Communication Foundation (WCF) is responsible for communication between two points over different channel technologies. You use WCF if you want to send/receive data between two points regardless of the channel technology (TCP/UDP/NetPipe/MSMQ, ...).
But you first need to design your crawler application, configured to fetch data from your target websites, and then design a scheduler application, for example using Quartz.NET (http://quartznet.sourceforge.net/), to run your crawlers.
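A minimal Quartz.NET sketch (2.x-style synchronous API; 3.x uses async equivalents, and CrawlJob is a placeholder for your own crawler code):

```csharp
using System.Threading;
using Quartz;
using Quartz.Impl;

public class CrawlJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // Hypothetical: crawl the target websites and save the results
        // to the SQL Server tables.
    }
}

class Program
{
    static void Main()
    {
        IScheduler scheduler = new StdSchedulerFactory().GetScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<CrawlJob>()
            .WithIdentity("crawlJob")
            .Build();

        ITrigger trigger = TriggerBuilder.Create()
            .StartNow()
            .WithSimpleSchedule(s => s.WithIntervalInMinutes(30).RepeatForever())
            .Build();

        scheduler.ScheduleJob(job, trigger);

        // Keep the host process alive (e.g. a console app or Windows service).
        Thread.Sleep(Timeout.Infinite);
    }
}
```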
After crawling and storing your web pages, you can use WCF if you need to do replication or synchronization with a central server, but that is optional.
You could use a WCF service to do this but I would go for another setup:
I'd build a Windows application that is scheduled to run every 30 minutes by the Windows Task Scheduler. A simple console application might be fine (see the sketch after these two points).
I'd use a web application (possibly ASP.NET MVC) to query the database.
As you can see there is no need to use WCF at all.
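A minimal sketch of the console-application route (the Crawler class is a placeholder for your own crawl-and-save code):

```csharp
using System;

static class Crawler
{
    public static void RunOnce()
    {
        // Hypothetical: crawl the target sites and write the results to
        // the SQL Server database, then return.
    }
}

class Program
{
    static int Main()
    {
        try
        {
            // Task Scheduler starts the exe again in 30 minutes,
            // so no timer loop is needed inside the program.
            Crawler.RunOnce();
            return 0;
        }
        catch (Exception ex)
        {
            Console.Error.WriteLine(ex);
            return 1; // non-zero exit code so Task Scheduler records the failure
        }
    }
}
```

Scheduling it is then a one-off step, either through the Task Scheduler UI or with something like schtasks /Create /SC MINUTE /MO 30 /TN "Crawler" /TR "C:\jobs\Crawler.exe" (task name and path are examples).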
An exception can (or must) be made when the server is not yours and your hosting provider doesn't allow you to schedule a Windows task. In that case you might want to kick off the crawling process by hand through the web application and have it repeat itself every 30 minutes.
Some hosting providers do allow scheduling tasks, but in a different way, so it might be worth investigating.
I am developing a project for college and I need some suggestions on the development. It's a website which shows information from other websites, like links, images, etc.
I have prepared the model given below for the website.
A Home.aspx page which shows data from tables (SQL Server).
I have coded a crawler (in C#) which can crawl (fetch) the required website data.
I want some way to run the crawler in the background at some time interval so it can insert updates into the tables. I want my database to receive updated information so that Home.aspx shows up-to-date info. (It's like a smaller version of the Google News website.)
I want to host the website in a shared hosting environment (i.e. a 3rd-party hosting provider company, which may use the IIS platform).
I posted a similar situation to different .NET forums and communities and they suggested lots of different things, such as:
Create a web service (is it really necessary?)
Use WCF
Create a console application and run it with the Windows Task Scheduler (is that okay with an ASP.NET Web Forms website and in shared hosting?)
Run the crawler on a local machine and update the database accordingly. (No, I want everything online.) Etc.
Please suggest a clear way forward so that I can complete the task, and please elaborate on the technologies and methods that suit my project.
Waiting...
Thanks...
Your shared-host constraint really restricts which technologies you can use.
In theory, the best way to host your crawler would have been a Windows service, since you can take advantage of Windows service configuration: a service is always up, can be started automatically at boot, writes errors to the event log, can be restarted automatically after a failure...
Then, your Home.aspx would have been a regular website in IIS.
If you really have to stay on a shared host (where you cannot set up a service), I would make the crawler a module which is run on your application startup.
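A minimal sketch of that idea in Global.asax.cs, assuming a System.Threading.Timer started from Application_Start (the 30-minute interval and RunCrawl method are placeholders):

```csharp
using System;
using System.Threading;
using System.Web;

public class Global : HttpApplication
{
    // Keep a static reference so the timer isn't garbage collected.
    private static Timer _crawlTimer;

    protected void Application_Start(object sender, EventArgs e)
    {
        // Runs once when the application pool spins up; the timer dies
        // whenever the pool recycles or unloads (see the caveat below).
        _crawlTimer = new Timer(_ => RunCrawl(), null,
                                TimeSpan.Zero, TimeSpan.FromMinutes(30));
    }

    private static void RunCrawl()
    {
        // Hypothetical: fetch the source sites and update the SQL Server
        // tables that Home.aspx reads from.
    }
}
```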
The problem is that the IIS application pool doesn't live forever if your web site is not in use, and that may stop the crawler. This is configurable, but I don't know to what extent on a shared host.
In IIS 7.5, think about starting your module at application warm-up.
Finally, if you need to run the crawler at set times (like every day at midnight) and your shared host does not let you set up task scheduling, think about the Quartz framework, which allows you to perform task scheduling inside your application (without the intervention of the OS).
Integrate your crawler code into an aspx page.
Set up a task scheduler on your host to call that page every X minutes.
When the page is called, check whether it was called from localhost.
If localhost called it, run the crawl routine.
If localhost didn't call it, throw a 404 error.
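A minimal sketch of the localhost check described above, assuming a Web Forms page (the page name is a placeholder):

```csharp
using System;
using System.Web.UI;

public partial class RunCrawl : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Only the scheduled task running on the server itself may trigger
        // the crawl; anyone else gets a 404 as if the page didn't exist.
        if (!Request.IsLocal)
        {
            Response.StatusCode = 404;
            Response.End();
            return;
        }

        // Run the crawl routine here and update the tables.
        Response.Write("done");
    }
}
```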