I am new to WCF and I am designing a project in which I want to run a crawler program (coded in C#) that crawls some websites and stores the crawled data in database tables (SQL Server). I want the crawler to run repeatedly every 30 minutes and update the database.
I then want to use the service on my hosted platform so that I can use the data from the tables in a web form (i.e. an .aspx page).
Is it okay to use WCF for this purpose?
Please suggest how I should proceed.
Thanks
Windows Communication Foundation (WCF) is responsible for communication between two endpoints over different channel technologies. You would use WCF if you want to send/receive data between two points regardless of the channel technology (TCP/UDP/NetPipe/MSMQ, ...).
But first you need to design your crawler application, configured to fetch data from your target websites; then you need to design a scheduler application using
http://quartznet.sourceforge.net/
to run your crawlers.
After crawling and storing your web pages, you can use WCF if you need to do replication or synchronization with a central server, but that is optional.
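To make the scheduling part concrete, here is a minimal sketch wiring a crawler job into Quartz.NET, assuming the 2.x fluent API (the body of CrawlerJob is a hypothetical placeholder for your own crawl-and-store code):

    using Quartz;
    using Quartz.Impl;

    // A job that wraps the crawler; Quartz instantiates and runs it per trigger.
    public class CrawlerJob : IJob
    {
        public void Execute(IJobExecutionContext context)
        {
            // Hypothetical: run the crawler and write results to SQL Server.
        }
    }

    class Program
    {
        static void Main()
        {
            IScheduler scheduler = StdSchedulerFactory.GetDefaultScheduler();
            scheduler.Start();

            IJobDetail job = JobBuilder.Create<CrawlerJob>()
                .WithIdentity("crawlerJob")
                .Build();

            // Fire immediately, then repeat every 30 minutes.
            ITrigger trigger = TriggerBuilder.Create()
                .StartNow()
                .WithSimpleSchedule(s => s.WithIntervalInMinutes(30).RepeatForever())
                .Build();

            scheduler.ScheduleJob(job, trigger);
        }
    }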
You could use a WCF service to do this but I would go for another setup:
I'd build a Windows application that is scheduled to run every 30 minutes by the Windows Task Scheduler. A simple console application might be fine.
I'd use a web application (possibly ASP.NET MVC) to query the database.
As you can see there is no need to use WCF at all.
An exception can/must be made when the server is not yours and you are using a hosting provider that doesn't allow you to schedule a Windows task. In that case you might want to kick off the crawling process by hand through the web application and have it repeat itself every 30 minutes.
Some hosting providers do allow the scheduling of tasks, just in a different way, so it is worth investigating.
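For the scheduled console approach, a minimal sketch (CrawlSites and SaveResults are hypothetical placeholders for your own crawl and database code, and the schtasks line in the comment is just one way to register the 30-minute schedule):

    // Program.cs - runs once per invocation; the Windows Task Scheduler
    // calls it every 30 minutes, e.g.:
    //   schtasks /create /tn "Crawler" /tr "C:\crawler\Crawler.exe" /sc minute /mo 30
    using System;

    class Program
    {
        static int Main()
        {
            try
            {
                var results = CrawlSites();   // hypothetical: fetch the target pages
                SaveResults(results);         // hypothetical: write them to SQL Server
                return 0;
            }
            catch (Exception ex)
            {
                Console.Error.WriteLine(ex);
                return 1;                     // non-zero so the scheduler records a failure
            }
        }

        static object CrawlSites() { /* fetch and parse the target pages */ return new object(); }
        static void SaveResults(object results) { /* insert into the SQL Server tables */ }
    }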
I need some architectural advice on how to build a background service application.
Background:
I have 2 websites, and I need to transfer some data from website A to website B. The service would have to run in the background (as a Windows service) and should connect (every 5 minutes) directly to website A's database (MSSQL), grab some data, and insert this data through website B's API (the API is built on ASP.NET MVC Web API). Both websites are hosted on the same virtual machine (Windows Server 2008 R2 Datacenter), but this might change (website B could be moved to another virtual server or to cloud hosting such as Windows Azure or Amazon AWS).
Question:
What do you suggest (best practices), and what guidelines can you give me? I want this to be as scalable and fast as possible, since the service will receive multiple requests.
Thank you,
Jani
If it is important to know what data was transferred, then:
Add logging - log4net, for instance (see the sketch after this list)
Issue a ticket if the process stops, and close the ticket when it restarts; this way you will know when a process fails. Depending on the amount of data involved, you could use Redis/Riak for this.
Put monitoring on both services A and B, and you might also consider restarting the service via IIS when it goes down.
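For the logging item above, a minimal log4net sketch (it assumes log4net is configured via app.config; the transfer-loop contents are hypothetical):

    using System;
    using log4net;
    using log4net.Config;

    class TransferService
    {
        private static readonly ILog Log = LogManager.GetLogger(typeof(TransferService));

        static void Main()
        {
            XmlConfigurator.Configure();   // reads the log4net section from app.config

            Log.Info("Transfer run started");
            try
            {
                // Hypothetical: read from website A's database, push to website B's API,
                // logging each batch so you know exactly what was transferred.
                Log.Info("Transfer run finished");
            }
            catch (Exception ex)
            {
                Log.Error("Transfer run failed", ex);
                throw;
            }
        }
    }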
I'm making an application in C# with VS 2012 that checks a database every 15 seconds and performs some actions when it finds data. Right now I've created a console application so I can debug it easily, but for release this application needs to run on an IIS server.
How can I do that? I've read this question but it looks like some sort of workaround, because to run it I need to perform these steps. Right now I'm reading the docs about Windows Service Applications. Is this the right way?
EDIT: Sorry, but I've never used Windows Server before. As people pointed out, IIS is only a web server; what I need to do is run my application in a Windows Server environment.
IIS is a web-server and accordingly it should be used for hosting web applications.
Develop a Windows service that does the job of checking the database at intervals, and have it invoke a web service (which you can host in IIS).
If your application is performing data queries and manipulation on the server, then I would recommend hosting it in a Windows service.
Some advantages to this are:
The service will start and run independently of a user logging into the server.
You can configure the service to recover should it experience an exception (ideally not!).
The service will start automatically (if configured) when the server restarts.
You can configure which user group (or user) the service runs under, so you have a more granular approach to security.
As it's running as a separate process, you can monitor its memory and processor utilisation.
Debugging is slightly more cumbersome but not difficult; one approach I've used is to install the service locally, start it, and then attach to it via the debugger. This question describes another approach I've also used.
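To make this concrete, here is a minimal sketch of such a service using the question's 15-second poll (CheckDatabase is a hypothetical placeholder for your own query logic):

    using System.ServiceProcess;   // reference System.ServiceProcess.dll
    using System.Timers;

    public class PollingService : ServiceBase
    {
        private Timer _timer;

        public static void Main()
        {
            ServiceBase.Run(new PollingService());
        }

        protected override void OnStart(string[] args)
        {
            _timer = new Timer(15000);                 // poll every 15 seconds
            _timer.Elapsed += (s, e) => CheckDatabase();
            _timer.Start();
        }

        protected override void OnStop()
        {
            _timer.Stop();
            _timer.Dispose();
        }

        private void CheckDatabase()
        {
            // Hypothetical: query the table and act on any new rows.
        }
    }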
Environment: C#.NET VS 2012
We need to write an order delivery process. Basically it runs through the orders tables every night and creates a file containing the orders received that day.
Traditionally we build this as a Windows console application, and a scheduled task wakes up the console application every night (or every 6 hours) to deliver the files.
We are planning to rewrite this console application and are weighing two approaches:
Approach 1: a scheduled task runs every night to deliver the orders
Approach 2: an ASP.NET web app that would also deliver the orders
I am new to WCF and haven't tried it yet; is this a good situation in which to use WCF?
If so, can someone give me some basic pointers on how to implement it?
FYI: I have implemented another approach for another client, where an ASMX web service does this job and the console application just calls the web service.
One disadvantage of that approach is that the file creation and delivery all happen through IIS, and for performance reasons we prefer not to involve IIS when the job is triggered from the Windows scheduler.
Thanks
Suresh
Keep it simple. Run a console application as a scheduled task.
An IIS app (WCF service or Web API) would only be useful if you receive job requests, i.e. if you are acting as a server.
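A minimal sketch of such a console task, assuming a hypothetical Orders table with OrderId, Total and ReceivedDate columns (the connection string and schtasks line are illustrative only):

    using System;
    using System.Data.SqlClient;
    using System.IO;

    class OrderExport
    {
        static void Main()
        {
            // One file per day, e.g. Orders_20240101.csv
            string path = string.Format("Orders_{0:yyyyMMdd}.csv", DateTime.Today);

            using (var conn = new SqlConnection(
                "Data Source=.;Initial Catalog=Orders;Integrated Security=True"))
            using (var writer = new StreamWriter(path))
            {
                conn.Open();
                var cmd = new SqlCommand(
                    "SELECT OrderId, Total FROM Orders WHERE ReceivedDate = @today", conn);
                cmd.Parameters.AddWithValue("@today", DateTime.Today);
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        writer.WriteLine("{0},{1}", reader["OrderId"], reader["Total"]);
                }
            }
            // Schedule it nightly, e.g.:
            //   schtasks /create /tn "OrderExport" /tr "C:\jobs\OrderExport.exe" /sc daily /st 00:00
        }
    }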
I am developing a project for college and I need some suggestions on the development. It's a website that shows information from other websites, like links, images, etc.
I have prepared the model given below for the website.
A Home.aspx page that shows data from tables (SQL Server).
I have coded a crawler (in C#) that can crawl (fetch) the required website data.
I want some way to run the crawler in the background at some time interval so that it can insert updates into the tables. I want my database to get updated information so that Home.aspx shows updated info. (It's like a smaller version of the Google News website.)
I want to host the website in a shared hosting environment (i.e. a third-party hosting provider, which may use the IIS platform).
I posted a similar question to different .NET forums and communities, and they suggested a lot of different things, such as:
Create a web service (is it really necessary?)
Use WCF
Create a console application and run it with the Windows Task Scheduler (is that okay with an ASP.NET Web Forms website and in shared hosting?)
Run the crawler on a local machine and update the database accordingly (no, I want everything online), etc.
Please suggest a clear way forward so that I can complete the task, and please elaborate on the technologies and methods that suit my project.
Waiting...
Thanks...
Your shared-host constraint really restricts the technologies you can use.
In theory, the best way to host your crawler would have been a Windows service, since you could take advantage of Windows service configuration: a service is always up, can be started automatically at boot, writes errors to the event log, can be restarted automatically after a failure...
Then your Home.aspx would be a regular website in IIS.
If you really do stay on a shared host (where you cannot set up a service), I would make the crawler a module that runs at your application's startup.
The problem is that an IIS application pool doesn't live forever when your website is not in use, and recycling it may stop the crawler. This is configurable, but I don't know to what extent on a shared host.
In IIS 7.5, think about starting your module at application warm-up.
Finally, if you need to run the crawler at fixed intervals (like every day at midnight) and your shared host does not let you set up task scheduling, think about the Quartz framework, which allows you to perform task scheduling inside your application (without the intervention of the OS).
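For the run-at-startup approach, a minimal sketch of what Global.asax.cs might look like, using a plain System.Threading.Timer rather than a full scheduler (CrawlerModule is a hypothetical placeholder, and an app-pool recycle will stop the timer until the next request restarts the application):

    using System;
    using System.Threading;

    public class Global : System.Web.HttpApplication
    {
        // Static reference so the timer is not garbage collected.
        private static Timer _crawlTimer;

        protected void Application_Start(object sender, EventArgs e)
        {
            _crawlTimer = new Timer(
                _ => CrawlerModule.RunOnce(),   // hypothetical crawl entry point
                null,
                TimeSpan.Zero,                  // first run immediately
                TimeSpan.FromMinutes(30));      // then every 30 minutes
        }
    }

    // Hypothetical placeholder for the crawler.
    public static class CrawlerModule
    {
        public static void RunOnce() { /* crawl and update the tables */ }
    }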
Integrate your crawler code into an .aspx page
Set up a task scheduler on your host to call that page every X minutes
When the page is called, check that localhost is the caller
If localhost called it, run the crawl routine
If localhost didn't call it, return a 404 error
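A minimal sketch of that page's code-behind; Request.IsLocal is the built-in check for whether the request came from the local machine, and Crawler.Run is a hypothetical placeholder:

    using System;

    public partial class Crawl : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            if (!Request.IsLocal)           // only the host's own scheduler may call this
            {
                Response.StatusCode = 404;  // pretend the page doesn't exist
                Response.End();
                return;
            }
            Crawler.Run();                  // hypothetical crawl routine
        }
    }

    // Hypothetical placeholder for the crawl routine.
    public static class Crawler
    {
        public static void Run() { /* fetch pages, update the database */ }
    }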
I have a C# application that needs to always be running. I originally planned on making this a windows service but I now have a requirement to make the application host a web admin console.
I haven't played with IIS in quite a few years so my question is this:
What would you recommend I use?
I've thought about making a Windows service and embedding a web server such as Cassini, but so far I'm not very happy with the open-source web servers I've looked at.
Can IIS handle this? Do people use it for this type of scenario, and if so how?
This sounds like a job for two separate projects.
One is the original Windows Service. Windows Services are well suited for what you're doing.
The second is the Web Project that will be used to administer the Windows Service. This is the part that runs in IIS.
It depends on what you mean by always running. An ASP.NET web application deployed in IIS could very well be unloaded by the web server if there aren't any requests for a certain amount of time, killing all background threads. So if you want an ever-running background thread, you'd be better served by a Windows service. As far as the web admin is concerned, you don't have much choice here: ASP.NET in IIS. For those two applications to do something useful together, they need a common language to talk; for example, a database that stores the results and can be read by both applications.
IIS will run your app on first request, not on server boot. So you will still need to run a service to ensure your app is always running.
You can use IIS as a web server for your web admin part, and link your ASP.NET app with your service by means of a configuration database (easy) or web services (a little more tricky).
Windows services and web services are two very different creatures. A web service exposes external methods that you can call from an application, while a Windows service is an entity unto itself. If you're planning on performing an operation on a timed interval, a Windows service is the right way to go. If you use a web service, you will need a secondary application to invoke the method you wish to run.
If you need to queue commands against your Windows service, you could always create a database that is accessible by both your website and your Windows service. This way you can send commands and query data between the two. Placing a web service in between to serve as an intermediary may be overkill.
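A minimal sketch of that shared-database pattern (the Commands table, connection string, and method names are all hypothetical):

    using System.Data.SqlClient;

    static class CommandQueue
    {
        const string ConnStr = "Data Source=.;Initial Catalog=Shared;Integrated Security=True";

        // Called by the website: enqueue a command for the service.
        public static void Enqueue(string name)
        {
            using (var conn = new SqlConnection(ConnStr))
            {
                conn.Open();
                var cmd = new SqlCommand(
                    "INSERT INTO Commands (Name, Processed) VALUES (@name, 0)", conn);
                cmd.Parameters.AddWithValue("@name", name);
                cmd.ExecuteNonQuery();
            }
        }

        // Called by the Windows service on a timer: claim one pending command,
        // marking it processed and returning its name (or null if none).
        public static string DequeueOrNull()
        {
            using (var conn = new SqlConnection(ConnStr))
            {
                conn.Open();
                var cmd = new SqlCommand(
                    @"UPDATE TOP (1) Commands SET Processed = 1
                      OUTPUT inserted.Name
                      WHERE Processed = 0", conn);
                return cmd.ExecuteScalar() as string;
            }
        }
    }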