Programmatic way to temporarily block specific Web Sites? - c#

I need a way to programmatically block and then later unblock specific websites based on their domain names. I only need to block browsers (so HTTP & HTTPS should be sufficient, I guess?), and not just Internet Explorer; it should also work for anyone trying to run Chrome or Firefox.
This needs to work on Windows XP and be usable from a .NET program (Vb.net or C#).
(P.S. I had found this question: How to unblock website which is blocked, using C#?, which seems to be asking the same thing; at the time I could not understand it, but now I see it. Thanks, all.)
Thanks,

This line in the hosts file will redirect to localhost. Though I have nothing against Nascar ;)
127.0.0.1 www.nascar.com
Block websites using a hosts file.

A down and dirty way would be to dynamically update the hosts file.
c:\Windows\System32\drivers\etc\hosts

You could add entries to the hosts file to achieve this. It would only work on Windows.

First off, know that you need to be an Administrator to do this stuff.
Well, you can just add a line to the hosts file in ( c:\Windows\System32\drivers\etc\hosts ) with the site you want, such as:
127.0.0.1 www.example.com
After that is done, just run the following command:
ipconfig /flushdns
Some web browsers, including Firefox, will also have to be restarted.
To unblock a blocked site, just delete its entry from the hosts file, or place a # sign at the beginning of the line.
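The append/remove approach above can be sketched in C#; the method names here are mine, and the program must run elevated to touch the hosts file:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;

class HostsBlocker
{
    // Path to the hosts file on a default Windows install.
    static readonly string HostsPath = Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.System),
        @"drivers\etc\hosts");

    // Block a site by redirecting its hostname to localhost.
    public static void BlockSite(string host)
    {
        File.AppendAllText(HostsPath, Environment.NewLine + "127.0.0.1 " + host);
        FlushDns();
    }

    // Unblock by removing any line that ends with the hostname.
    public static void UnblockSite(string host)
    {
        List<string> kept = new List<string>();
        foreach (string line in File.ReadAllLines(HostsPath))
        {
            if (!line.TrimEnd().EndsWith(" " + host))
                kept.Add(line);
        }
        File.WriteAllLines(HostsPath, kept.ToArray());
        FlushDns();
    }

    // Clear the DNS cache so the change takes effect immediately.
    static void FlushDns()
    {
        Process.Start("ipconfig", "/flushdns").WaitForExit();
    }
}
```

As noted above, some browsers cache DNS lookups themselves, so a restart of the browser may still be needed even after the flush.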

Making a proxy to block websites

I am making a proxy for Windows. I would like to be able to block certain documents under specific URLs.
For example, everything on google.com would work fine, but google.com/index.html could be blocked.
Can anyone help with this please?
I want the proxy to run on the same PC that uses it.
I have found what I needed. Sorry if I wasn't specific enough.
For any future users who come across this post and know what they are after - try FiddlerCore for .NET applications. It's a great package enabling you to capture and 'fiddle' with HTTP and HTTPS responses before they reach your device.
This means that FiddlerCore can be used to analyse a request's header, check for a certain URL and then drop the request.
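A hedged sketch of that idea, using API names from the classic FiddlerCore .NET package (verify them against the version you install), that passes google.com through but drops google.com/index.html:

```csharp
using System;
using Fiddler;

class BlockingProxy
{
    static void Main()
    {
        FiddlerApplication.BeforeRequest += delegate(Session session)
        {
            // Block one document while leaving the rest of the site alone.
            if (session.uriContains("google.com/index.html"))
            {
                // Answer the request ourselves instead of forwarding it.
                session.utilCreateResponseAndBypassServer();
                session.oResponse.headers.SetStatus(403, "Blocked");
                session.utilSetResponseBody("This page is blocked by the local proxy.");
            }
        };

        // Listen on port 8877 and register as the system proxy on this PC.
        FiddlerApplication.Startup(8877, true, false);
        Console.WriteLine("Proxy running; press Enter to stop.");
        Console.ReadLine();
        FiddlerApplication.Shutdown();
    }
}
```

Passing false for SSL decryption keeps HTTPS traffic tunneled untouched; blocking by path inside HTTPS pages would require enabling decryption and installing Fiddler's root certificate.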
There are two methods:
(1) using FiddlerCore package
(2) modification of "hosts" file
1)
FiddlerCore is a .NET package used to intercept and analyse HTTP(S) traffic on the machine.
Some familiarity with FiddlerCore is required, because it exposes dangerous functions alongside the most helpful ones.
2)
You can block websites by appending the corresponding hostname to the "hosts" file.
Navigate to C:\Windows\System32\drivers\etc, changing C: to whatever drive letter you have Windows installed on. Open the "hosts" file.
Just append a new line of the form 127.0.0.1 followed by the hostname, for example:
127.0.0.1 www.example.com
Your application needs administrative privileges to modify the hosts file.

Extremely strange ClickOnce Web deployment behavior (caching)

so I recently deployed my application via ClickOnce to a web server (WAMP to be exact), and had VS2010 auto-generate the webpage and all that jazz. The users were able to download the application just fine.
The strangeness began when I pushed out my first update. Two different scenarios occurred. When they went to the website and hit install, it always installed the first version and not the update. Also, I have a "Check for Updates" button in the app itself, and when they'd click on that it would say "No update available" (using a variation of this code).
On a hunch I had them clear their browser cache and try the "Check for Updates" button in-app again... and lo and behold it worked.
What's going on here? Is it caching the webpage and thus not seeing the updates? When they visit it the text on the webpage has been updated saying it's the new version but they cannot install until they clear the cache. Furthermore, is that check for update code hitting the webpage too (How else would that not work either)? Would placing a NO-CACHE HTML line in the auto-generated webpage's header fix this? Any suggestions/insights are welcome.
I'd look into how your Apache is set up for caching, like you said. Look into what headers it's sending out. Make sure that it's sending out the .application file with the correct MIME type, application/x-ms-application.
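If Apache is the host, one hedged example of doing both of those things in httpd.conf (this assumes mod_headers is loaded; the MIME types are the standard ClickOnce ones):

```apache
# Serve ClickOnce files with the MIME types the installer expects.
AddType application/x-ms-application .application
AddType application/x-ms-manifest    .manifest
AddType application/octet-stream     .deploy

# Tell browsers and proxies not to cache the deployment manifest,
# so update checks always see the latest published version.
<FilesMatch "\.application$">
    Header set Cache-Control "no-cache, no-store, must-revalidate"
    Header set Pragma "no-cache"
</FilesMatch>
```

The versioned payload files under Application Files/ can stay cacheable; it's only the top-level .application manifest that must always be fetched fresh.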
After the ClickOnce app is installed, it will always fetch the same URI (example.com/app/app.application) and compare the installed version number with the one it just downloaded. When you Publish through Visual Studio it overwrites the file at that location. So, yes, I could see it being a caching issue. It's odd to me that the ApplicationDeployment API would be using the same browser cache, but who knows, maybe it uses IE internally.
I have my test ClickOnce application written on top of MSDN's asynchronous example. There's a progress string where you can see it downloading the .application on each run. I haven't seen the same issue hosting the deployments either on a UNC path or on AWS S3 with static web hosting enabled. That's why I think it may be something in Apache.

process.start from ASP page, where to put exe file

Just a quick question: where do I put the exe that I am trying to launch in an ASP project? It can't see it in the bin folder.
You need to pass a full path to Process.Start:
Process.Start(Server.MapPath("~/bin/whatever.exe"));
However, ~/bin/ is meant for .Net assemblies; it's generally better to put external EXEs in ~/App_Data/.
Note that you can only execute programs on the server, not the client.
Remember that your Website is running on a server so if you Shell anything (Process.Start in an ASP.Net application) it will be opened on the Server - not the Client PC.
Be sure to test this in production as well, and when you do shell processes on a server (which I would not advise), don't forget to close them, otherwise you will end up with strange errors on the server.
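One way to make sure a shelled process gets closed is to dispose it and wait for exit with a timeout; a sketch (the exe name and location are from the answer above and purely illustrative), run inside a page or handler where Server is available:

```csharp
using System.Diagnostics;

ProcessStartInfo psi = new ProcessStartInfo
{
    FileName = Server.MapPath("~/App_Data/whatever.exe"),
    UseShellExecute = false,
    CreateNoWindow = true
};

using (Process p = Process.Start(psi))
{
    // Block until the tool finishes so the process doesn't linger on the server;
    // kill it if it runs longer than 30 seconds.
    if (!p.WaitForExit(30000))
        p.Kill();
}
```

Blocking a request thread like this is exactly why the queue-plus-Windows-service design suggested below scales better than shelling out directly from the page.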
To avoid security concerns and multiple processes running in parallel, rather than starting the process from the ASP page I would queue up the request in some table, and have another process, such as a Windows service running in the background, pick it up from the queue and execute it.
Path.Combine(Server.MapPath(Request.ApplicationPath), "PathToExeFromWWWRoot");

Move a Web Site project with a web service to AWS windows server

I've started up a new instance of Windows Server 2008 and am trying to move and launch a web service I've created in Visual Studio. How do I move the project from my local computer to the remote desktop? Grateful for all help!
I've tried the really simple approach and just copied the directory to the remote desktop, in the same location as on my local computer. It did not work. When I try to access the same address that it has on my local computer (http://localhost:80/somesite), all I get is this:
HTTP Error 403.14 - Forbidden
The Web server is configured to not list the contents of this directory.
I'm probably going about this the wrong way, but don't know where to start..
Sounds like you need to set up IIS. See the following link: http://www.codeproject.com/Articles/28693/Deploying-ASP-NET-Websites-on-IIS-7-0
I would make sure ASP.NET is enabled on the IIS server. Also try to hit your page explicitly, such as:
http://localhost:80/somesite/myhome.aspx
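If ASP.NET turns out not to be registered with IIS, commands along these lines (run in an elevated command prompt; the framework folder and site names depend on your install, so treat them as placeholders) are a common fix for 403.14:

```batch
:: Register ASP.NET with IIS; pick the framework version your site targets.
%windir%\Microsoft.NET\Framework\v4.0.30319\aspnet_regiis.exe -i

:: On IIS 7, also make the copied folder an actual application, e.g. with appcmd:
%windir%\system32\inetsrv\appcmd add app /site.name:"Default Web Site" /path:/somesite /physicalPath:C:\inetpub\wwwroot\somesite
```

403.14 specifically means IIS fell through to directory listing because no default document or handler matched, which is why hitting the .aspx page directly is a useful diagnostic.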
I'm sure there is a quick answer to your particular issue, but if you're going to be doing this sort of thing often, it is best to take some time up front and read up, then click around and get a feel for IIS.
http://msdn.microsoft.com/en-us/library/ms178477.aspx
Visual Studio has abstracted much of the site/virtual directory setup and configuration, chances are you can't just copy the files over and have it work. There are lots of things to think about: websites versus virtual directories and their configurations, application pools and their identities, file permissions, default documents, etc. enjoy.

saving a text file in client side without asking for permission any time

I need to save a text file on the client side, possibly without asking permission. The catch is that I need to save this text file in a shared folder, on this or another machine on the LAN. This text file is going to be read automatically by the fiscal printer, which will print the fiscal invoice. I have an ASP.NET web application and the server is not on the same LAN as the fiscal printer, so I have to write the file on the client side. Any idea how to do this without asking the user every time because of the security issue?
I need a cross-browser solution.
I can accept a solution where the client is asked only once, at the first printing, but not every time he wants to print a bill: some way of asking the client for permission to trust this website, so that the permission prompt is not repeated.
Obviously, this would be a major security breach: downloading files to the user's computer without them knowing. All browsers have precautions in place to prevent this from happening.
No, you can not do this. Saving a file to a computer without permission in a public folder is not allowed.
You can, however, have your Client install your application which will have the ability to read and write where you want.
A common way that Trojan viruses do this is by giving the client some goofy program to run that displays a fireworks show or something else quite trivial. While the client is busy wondering what he's looking at, the virus is installing quietly in the background.
Now, you are probably saying to yourself, "But I am not installing a virus." However, there is no way for a Browser to know if your application is a virus or not. That is why it is not allowed and why you can not do it.
The more applicable scenario for me is:
1- Do your work inside your web application.
2- Get the information that you need to print.
3- Send it to another computer directly (or to a hosted web service); this computer will act as a host for these files.
4- Let that host write into the shared folder, and print what you want.
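The hand-off in steps 3 and 4 might look like the sketch below; the helper name, share path, and file naming are all hypothetical, and this code would run on the host machine sitting on the printer's LAN, not in the browser:

```csharp
using System.IO;

class InvoiceDropper
{
    // Hypothetical helper: drop the invoice text into the shared folder
    // that the fiscal printer polls. Adjust the UNC path to your share.
    public static void WriteInvoice(string invoiceText, string invoiceId)
    {
        string sharePath = @"\\PRINTER-HOST\FiscalInvoices";
        string target = Path.Combine(sharePath, invoiceId + ".txt");

        // Write to a temporary name first, then rename, so the printer
        // never picks up a half-written file.
        string temp = target + ".tmp";
        File.WriteAllText(temp, invoiceText);
        File.Move(temp, target);
    }
}
```

Because the writer is a normal desktop service rather than a browser, it can be granted share permissions once at install time, which sidesteps the per-print permission prompt entirely.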
You could use a cookie, which won't ask permission. Of course that would only work when cookies are enabled and can store limited amounts of data.
