While trying to find a solution to a problem I posted here, I found a way to have squid block URLs, including HTTPS URLs. My other post covers how to lock a user, or group of users, to a single PC; the objective of this post is to block traffic between certain web servers and a single LAN IP. If the user is not restricted to a specific machine, he or she can bypass the restriction simply by using a colleague's PC.
This solution is not a replacement for iptables rules, but it adds another layer of redundancy, and it allows the option of directing users to custom error pages.
Regarding SSL: since squid has to decide what to do with an SSL request (such as allowing a direct connection) and log it, there is an opportunity for squid to consult its access controls and respond accordingly, for example by denying the connection. This got me thinking.
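For HTTPS, the browser asks the proxy to open a tunnel with a CONNECT request, and that request names the destination host, which is exactly what the ACL below gets to match against. For example:
Code:
CONNECT www.facebook.com:443 HTTP/1.1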
All I did was write an ACL to define bad sites:
Code:
acl Badsites dstdomain "/etc/squid/badsites.acl"
Current contents of /etc/squid/badsites.acl:
Code:
.facebook.com
.twitter.com
31.13.76.10
31.13.77.42
31.13.77.58
69.171.228.0/25
69.63.190.0/25
66.220.146.0/25
66.220.147.0/25
66.220.149.0/25
66.220.153.0/25
66.220.156.0/25
66.220.158.0/25
69.63.189.0/25
69.171.224.0/25
69.171.229.0/25
69.171.234.0/25
69.171.242.0/25
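One caveat on mixing entry types: dstdomain matches the host name in the request, so the bare IPs only match if someone browses to a site by its IP literally, and I'm not certain squid will accept the CIDR ranges in a dstdomain list at all. If squid complains on startup, the subnet entries can be split into their own dst-type ACL, which matches against the resolved destination address. A sketch (the file name is just illustrative):
Code:
# subnets matched against the resolved destination IP
# (the file name is a placeholder -- use your own)
acl BadsiteIPs dst "/etc/squid/badsite_subnets.acl"
http_reply_access deny BadsiteIPs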
I included the IPs I added to iptables, just in case someone tries to navigate to a site by IP. (Again, even if I hadn't added the IPs here, my firewall would have blocked them, but redundancy doesn't hurt.)
Then, under my access controls, before anything else, I define a reply access rule to deny the bad sites:
Code:
http_reply_access deny Badsites
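Since I mentioned custom error pages: squid's deny_info directive can replace the stock error page for requests denied via a given ACL. A minimal sketch, assuming an intranet page of your own (the URL is illustrative):
Code:
# send users who hit a blocked site to a custom page instead of the stock error
# (the URL is a placeholder -- point it at your own page)
deny_info http://intranet.example.lan/blocked.html Badsites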
Edit:
Or
Code:
http_reply_access deny ManagerPC Badsites
as I have it, combined with the example in my other post, so that bad sites are blocked only on a specific PC, and for specific users, should you so wish.
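(When two ACLs appear on one deny line, squid ANDs them, so the rule only fires when both match. ManagerPC is the src ACL from my other post; if you don't have it handy, the definition looks something like this, with a placeholder address:)
Code:
# placeholder -- substitute the actual LAN address of the restricted PC
acl ManagerPC src 192.168.0.25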