What you want to do is pretty complex.
First, your best choice for a filter is Squid. You can write access control lists (ACLs) that allow or deny connections to remote sites based on IP address, domain name, or text in the URL.
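For example, a few lines in squid.conf will do that kind of blocking. The domain, subnet, and regex below are just placeholders, so substitute whatever you actually want to allow or block:
Code:
# squid.conf fragment -- names and addresses are examples only
acl localnet src 192.168.1.0/24
acl blocked_domains dstdomain .facebook.com .example.com
acl blocked_words url_regex -i games chat
http_access deny blocked_domains
http_access deny blocked_words
http_access allow localnet
http_access deny all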
HTTPS is a whole other ball game. Putting a filter between clients and remote sites that use SSL will generate repeated warnings from the client's browser that it is being subjected to a "man-in-the-middle" attack, since the filter breaks the encrypted connection between the browser and the server. Squid 3.2 has some clever solutions for this problem, but they are not easy to implement. If you're curious, do a search for "SSLBump".
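Very roughly, an SSLBump setup in squid.conf looks something like the fragment below. The certificate path and the ssl_crtd location are just examples (they vary by distro), and you still have to generate that local CA yourself and install it as trusted in every client browser, or the warnings never go away:
Code:
# squid.conf fragment -- paths are examples, adjust for your distro
# assumes a local CA has been generated at /etc/squid/ssl_cert/myCA.pem
http_port 3128 ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=4MB cert=/etc/squid/ssl_cert/myCA.pem
sslcrtd_program /usr/lib/squid/ssl_crtd -s /var/lib/ssl_db -M 4MB
ssl_bump server-first all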
At one client site where I manage filtering, we use iptables firewall rules to block remote HTTPS sites. That's not an easy task, either. Many sites with heavy traffic loads have multiple servers listening on multiple IP addresses. You'd need to identify all the possible server addresses for each site, then write iptables rules like this:
Code:
/sbin/iptables -A OUTPUT -d 173.252.110.27 -p tcp --dport 443 -j REJECT
This blocks HTTPS connections to one of Facebook's servers.
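If you go this route, you can at least script the lookups. Something along these lines would reject HTTPS to every address a site currently resolves to (same Facebook example as above; the results change over time, so you'd have to re-run it regularly, e.g. from cron):
Code:
# Resolve the current A records and reject HTTPS to each address
# (the grep drops any CNAME names that dig +short returns along with the IPs)
for ip in $(dig +short www.facebook.com A | grep -E '^[0-9.]+$'); do
    /sbin/iptables -A OUTPUT -d "$ip" -p tcp --dport 443 -j REJECT
done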
I would not suggest any of this for newbie Linux users. I can't tell from your request whom you're trying to filter, but a better approach may simply be to enforce good behavior.