Getting the list of every web address being accessed? - C#

I've been trying to write code in C# that tells me which web addresses are being accessed from my computer, regardless of which browser I am using or which piece of software is making the request.
This is important because there may be software on my computer opening web pages in the background that I am not aware of.
I need the code to keep a list of the web addresses; even if recording only begins whenever the program is run, that's fine.

It sounds like what you might want is a proxy server.
Check out Squid.

Use Fiddler; you can get it here:
http://www.fiddler2.com/fiddler2/
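If you'd rather do this from your own C# program, Fiddler's engine is also available as an embeddable library (FiddlerCore). Here's a minimal sketch of the idea; treat the exact types and flags as assumptions to verify against the FiddlerCore version you install:

    using System;
    using Fiddler; // FiddlerCore assembly (assumed reference)

    class UrlLogger
    {
        static void Main()
        {
            // Log the full URL of every request passing through the proxy.
            FiddlerApplication.BeforeRequest += session =>
                Console.WriteLine("{0:T}  {1}", DateTime.Now, session.fullUrl);

            // Listen on port 8877 and register as the system proxy so that
            // browsers and background apps route their traffic through us.
            FiddlerApplication.Startup(8877,
                FiddlerCoreStartupFlags.Default | FiddlerCoreStartupFlags.RegisterAsSystemProxy);

            Console.WriteLine("Logging URLs; press Enter to stop.");
            Console.ReadLine();
            FiddlerApplication.Shutdown();
        }
    }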

Related

How would I go about creating an external URL monitor?

How would I go about creating an application that can somehow see URLs I am viewing in most popular browsers?
Can it be done out of process?
In what ways can this be done?
Can it be done without browser plugins?
The simplest answer is to use a proxy server: block outbound HTTP except from the proxy so that everything has to go through it. All URL requests then pass through the proxy (even one you build yourself), and you can collect the URLs from its logs.
You will have to install something in your network infrastructure and then collect data from it.
That could be a proxy server (e.g. Fiddler), or you can look at tools like Wireshark (pcap).
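To make the proxy idea concrete, here's a minimal sketch of the logging half in C#: a listener you point the browser's HTTP proxy setting at, which records the request line of each connection. A browser talking to a proxy puts the absolute URL in the request line ("GET http://example.com/ HTTP/1.1"), which is exactly what we want to capture. This sketch only logs plain-HTTP requests and does not forward them; a usable proxy (Squid, Fiddler) also relays the traffic and handles HTTPS CONNECT tunnels.

    using System;
    using System.IO;
    using System.Net;
    using System.Net.Sockets;

    class LoggingListener
    {
        static void Main()
        {
            var listener = new TcpListener(IPAddress.Loopback, 8080);
            listener.Start();
            Console.WriteLine("Point the browser's HTTP proxy at 127.0.0.1:8080");

            while (true)
            {
                using (TcpClient client = listener.AcceptTcpClient())
                using (var reader = new StreamReader(client.GetStream()))
                {
                    // First line of a proxied HTTP request: METHOD absolute-URL VERSION
                    string requestLine = reader.ReadLine();
                    if (requestLine != null)
                        Console.WriteLine("{0:T}  {1}", DateTime.Now, requestLine);
                }
            }
        }
    }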

C# - How to detect whether a website has a Shared or Dedicated IP Address?

Is it possible to detect whether a website has a dedicated or shared IP address from its URL using C# (in a Windows Forms application)? I want to add a feature to my application where the user types a web address into a TextBox and clicks a Test button; the app then shows a Success MessageBox if the site has a dedicated IP address, or a Failure MessageBox otherwise.
How can I detect whether a website has a shared or dedicated IP address using C#.NET?
You can try, but you'll never get a good result. The best you could do, I think, is to check the PTR records of the IP and then see whether there are associated A records from different websites. Even that would be unreliable, since a website could have two seemingly different domains that belong to the same organization (googlemail.com/gmail.com, for example).
Also, this assumes the existence of PTR records, multiple ones at that. I don't think I've seen such a setup supported by most VPS/shared-hosting providers.
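For what it's worth, the reverse-lookup part is easy to try in C#: Dns.GetHostEntry called with an IPAddress performs the PTR lookup. A sketch, with example.com as a placeholder domain:

    using System;
    using System.Net;
    using System.Net.Sockets;

    class ReverseLookup
    {
        static void Main()
        {
            string domain = "example.com"; // placeholder domain

            foreach (IPAddress ip in Dns.GetHostAddresses(domain))
            {
                try
                {
                    // Dns.GetHostEntry on an IPAddress performs the PTR lookup.
                    IPHostEntry entry = Dns.GetHostEntry(ip);
                    Console.WriteLine("{0} -> {1}", ip, entry.HostName);
                }
                catch (SocketException)
                {
                    Console.WriteLine("{0} -> no PTR record found", ip);
                }
            }
        }
    }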
Well, the way I would do it is:
1. Send an HTTP GET to the URL and save the result.
2. Resolve the URL to an IP.
3. Send an HTTP GET to the IP and save the result.
4. Compare the two results (sample checks between them are enough).
If the results are the same, it is dedicated hosting; if they differ, it is shared hosting.
Limitations of this method that I can think of right now:
1. It will take you time to figure out a proper method of comparing the two results.
2. The shared host may be configured to route requests to the very site you are checking by default.
Functions to resolve URLs and make web requests are scattered across the Internet for every programming language.
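Here's a rough sketch of those steps in C#, using WebClient and a naive whole-body comparison (a placeholder for the smarter sampling mentioned in limitation 1); the domain is a placeholder:

    using System;
    using System.Net;

    class DedicatedIpCheck
    {
        static void Main()
        {
            string host = "example.com"; // placeholder domain

            using (var client = new WebClient())
            {
                // Step 1: GET by name.
                string byName = client.DownloadString("http://" + host + "/");

                // Step 2: resolve the name to an IP.
                IPAddress ip = Dns.GetHostAddresses(host)[0];

                // Step 3: GET by bare IP. The request carries no meaningful
                // Host header, so a shared host typically serves its default
                // site instead (limitation 2 above notes the exception).
                string byIp;
                try
                {
                    byIp = client.DownloadString("http://" + ip + "/");
                }
                catch (WebException)
                {
                    Console.WriteLine("Request by IP failed; likely shared hosting.");
                    return;
                }

                // Step 4: naive whole-body comparison.
                Console.WriteLine(byName == byIp
                    ? "Responses match: possibly dedicated."
                    : "Responses differ: possibly shared.");
            }
        }
    }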
From a technical standpoint, there's no such thing as a "shared" or "dedicated" IP address; the protocol makes no distinction. Those are terms used to describe how an IP is used.
As such, there's no programmatic method to answer "is this shared or dedicated?" Some of the other answers to this question suggest some ways to guess whether a particular domain is on a shared IP, but those methods are at best guesses.
If you really want to go down this road, you could crawl the web and store resolved IPs for every domain. (Simple, right?) Then you could query your massive database for all the domains hosted on a given IP. (There are tools that seem to do this already, although only the first one was able to identify the multiple domains I have hosted on my server.)
Of course, this is all for naught with VPS (or things like Amazon EC2) where the server hardware itself is shared, but every customer (domain) gets one or more dedicated IPs. From the outside, there's no way to know how such servers are set up.
TL;DR: This can't be done in a reliable manner.

C# Know if server is being accessed from application

I have recently created an API on my server in PHP, but I have discovered that I shouldn't call my API directly with an API key, because sensitive information like that can't be held securely inside an EXE. I did some research, and people recommend creating a proxy between the API and your application, but even that can be broken into.
I was wondering: how can my server know whether it is being accessed from my C# application or from another source? The reason I want to know is to stop potential hackers from accessing my gateway and using it themselves.
Thanks
SSL with a login?
There is no way for you to be certain someone is using your application to access a web service. I'm in a similar boat, and the most you can do is ensure the communication channel is secure (SSL) and use a username/password or something similar. You also have to be aware that anything done on the client's computer can be compromised; so much so that you should pretty much assume your application will be open source to anyone who wants it.
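As a concrete illustration of that advice, here's a hedged sketch of a client call that sends per-user credentials over HTTPS instead of embedding a global API key; the URL and field names are placeholders:

    using System;
    using System.Collections.Specialized;
    using System.Net;
    using System.Text;

    class ApiClient
    {
        static void Main()
        {
            using (var client = new WebClient())
            {
                var form = new NameValueCollection
                {
                    { "username", "alice" },  // per-user credentials rather than
                    { "password", "secret" }  // a global API key baked into the EXE
                };

                // SSL protects the credentials in transit; nothing protects
                // them from the user who owns the machine the EXE runs on.
                byte[] response = client.UploadValues(
                    "https://api.example.com/login", form); // placeholder URL

                Console.WriteLine(Encoding.UTF8.GetString(response));
            }
        }
    }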

run C# code on client side in a web app

I have code on my server which works very well. It must crawl a few pages on remote sites to work properly. I know some users may want to abuse my site, so instead of running the code (which uses WebClient and HttpRequest) on the server, I would like it to run on the client side, so that if it is abused the user's IP may be blacklisted instead of my server's. How might I run this code client-side? I am thinking Silverlight may be a solution, but I know nothing about it.
Yes, Silverlight is a solution that lets you run a limited subset of .NET code on the client's machine. Just google for Silverlight limitations to get more information about what's not available.
I don't know what scenario you're trying to implement, or whether you need real-time results, but I guess caching the crawl results could be a good idea?
If you're after web scraping, you should be able to find a couple of JavaScript frameworks that do it for you.
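One caveat worth knowing before you commit to this: Silverlight's WebClient is asynchronous-only, and cross-domain requests are only permitted when the target site publishes a clientaccesspolicy.xml (or crossdomain.xml) file, which most sites you'd want to crawl will not. A sketch of what a client-side fetch looks like, with a placeholder URL:

    using System;
    using System.Net;

    public class ClientSideCrawler
    {
        public void Fetch()
        {
            var client = new WebClient();
            client.DownloadStringCompleted += (sender, e) =>
            {
                if (e.Error == null)
                    ProcessPage(e.Result); // e.g. post the results back to your server
            };
            client.DownloadStringAsync(new Uri("http://example.com/page")); // placeholder URL
        }

        private void ProcessPage(string html)
        {
            // parse the page / upload the results here
        }
    }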
I think your options here are Silverlight or some sort of desktop app, unless maybe there is a jQuery library or other client-side scripting approach that can do the same things.
That's an interesting request (no pun intended). If you do use Silverlight, then maybe instead of porting your logic to it, create a simple Proxy class in it that receives requests from your server app and shuttles them forward to do the dirty work. Same with the incoming responses: have your Silverlight proxy send them back to the server app.
This way you have the option of running your server app through the Silverlight proxy in some instances, and on its own (with no proxy) in other scenarios. The Silverlight plugin should provide a consistent API to program against no matter which browser it's running in.
If you use a proxy solution in the web browser, you might even be able to skip Silverlight altogether and use JavaScript/AJAX calls. Of course, this kind of thing is usually fraught with browser-compatibility issues, and it would be an obscure push/pull implementation for sure, but JavaScript can access domains and URLs and, in some usage patterns, is not restricted to the one it originated from.
If Silverlight security stands in the way, you might look into other kinds of programmable (Turing-complete) browser plugins like Java, Flash, etc. If memory serves, the Java plugin can only communicate over the network with the domain it originated from; that kind of security is too restrictive for your crawling needs.

How to detect the current SharePoint pages from the client machine?

On the client machine, I need to be able to somehow detect which sites the current user is looking at right now.
I know the base URL of the SharePoint app, say sharepoint.thecompany.net, but how the heck do I get the last requested URL from the server?
I hit a dead stop when trying to iterate the current processes and cast the iexplore process to something I can work with; I just don't know what kind of object to cast the process to :-(
I hope to implement this logic in a C# assembly that will run on the client box.
Any other approach that might work?
Thanks in advance
WatiN will allow you to attach to IE instances and get the current URL from them. It will also allow you to do the same with Firefox instances.
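If you only need IE and want to avoid taking a WatiN dependency, the classic alternative is the SHDocVw COM API (add a COM reference to "Microsoft Internet Controls"): ShellWindows enumerates running IE and Explorer windows, and LocationURL gives the address each one is showing. A sketch:

    using System;
    using SHDocVw; // COM reference: "Microsoft Internet Controls"

    class IeUrlLister
    {
        static void Main()
        {
            // ShellWindows enumerates IE browser windows and Explorer windows.
            foreach (InternetExplorer ie in new ShellWindows())
            {
                string url = ie.LocationURL;

                // Explorer windows report file:// paths; keep only web pages.
                if (!string.IsNullOrEmpty(url) && url.StartsWith("http"))
                    Console.WriteLine(url);
            }
        }
    }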
It might be more efficient, however, to capture requested URLs at the network level, Wireshark-style: just listen to HTTP traffic on the computer and keep track of the URLs. A solution like that is a bit over my head, though.
EDIT: I came across this while looking for a solution: http://www.codeproject.com/KB/IP/networkmonitor.aspx
From what I can see, I think you could adapt the monitoring code to watch for HTTP request packets and parse the headers for the URL information you need.
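For reference, the core of that technique is a raw socket put into SIO_RCVALL mode, which delivers all of the machine's IPv4 traffic to your program. A rough sketch (requires administrator rights, sees nothing inside HTTPS, and assumes a single active network interface):

    using System;
    using System.Linq;
    using System.Net;
    using System.Net.Sockets;
    using System.Text;

    class HttpSniffer
    {
        static void Main()
        {
            // Bind to the first local IPv4 address (assumes one active NIC).
            IPAddress local = Dns.GetHostEntry(Dns.GetHostName()).AddressList
                .First(a => a.AddressFamily == AddressFamily.InterNetwork);

            var socket = new Socket(AddressFamily.InterNetwork, SocketType.Raw, ProtocolType.IP);
            socket.Bind(new IPEndPoint(local, 0));
            socket.SetSocketOption(SocketOptionLevel.IP, SocketOptionName.HeaderIncluded, true);

            // SIO_RCVALL: deliver every inbound/outbound IP packet to this socket.
            socket.IOControl(IOControlCode.ReceiveAll, BitConverter.GetBytes(1), null);

            var buffer = new byte[65535];
            while (true)
            {
                int read = socket.Receive(buffer);
                string payload = Encoding.ASCII.GetString(buffer, 0, read);

                // Crude parse: the HTTP text sits after the binary IP/TCP
                // headers, so search for it rather than splitting on newlines.
                PrintLineAt(payload, payload.IndexOf("GET "));
                PrintLineAt(payload, payload.IndexOf("POST "));
                PrintLineAt(payload, payload.IndexOf("Host: "));
            }
        }

        static void PrintLineAt(string payload, int start)
        {
            if (start < 0) return;
            int end = payload.IndexOf('\r', start);
            Console.WriteLine(end > start ? payload.Substring(start, end - start)
                                          : payload.Substring(start));
        }
    }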
