
View Full Version : [SOLVED] Squid, Blocking Every Website



Hovercat
February 11th, 2011, 12:14 PM
I'm trying to block a few websites on my network, but with Squid set up as my proxy, every website is blocked. I get this error for every site:
ERROR: The requested URL could not be retrieved

The following error was encountered while trying to retrieve the URL: http://www.google.com/

Access Denied. Access control configuration prevents your request from being allowed at this time. Please contact your service provider if you feel this is incorrect.

Your cache administrator is webmaster.

Generated Fri, 11 Feb 2011 11:11:17 GMT by NETWORK-SERVER (squid/2.7.STABLE9)

The only changes I've made to my squid.conf are:
acl bad url_regex "/etc/squid/squid-block.acl"
http_access deny bad

and /etc/squid/squid-block.acl contains:

.4chan.org
.redtube.com
.brazzers.com
.brazzersmobile.com

EDIT: The forum flattened this when it parsed my post; in the actual files each entry is on its own line, as shown above.

Blutkoete
February 11th, 2011, 01:15 PM
I think Squid's default setting is to deny all traffic, so you have to explicitly allow the other computers on your network to access the internet.

Example (for all computers within 192.168.1.x):



# define an ACL covering every machine on the 192.168.1.x subnet
acl allcomputers src 192.168.1.0/255.255.255.0
# allow those machines to use the proxy
http_access allow allcomputers

EDIT: WARNING: I'm no expert on the subject and this might cause security problems.
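For reference, here is a minimal sketch of how the relevant pieces could sit together in squid.conf, using the subnet and block-list path from the posts above (adjust them for your own network). Squid evaluates http_access rules top to bottom and stops at the first match, so the deny rule for the block list has to come before the allow rule for the LAN:

acl allcomputers src 192.168.1.0/255.255.255.0
acl bad url_regex "/etc/squid/squid-block.acl"

# order matters: deny the blocked sites first, then allow the LAN
http_access deny bad
http_access allow allcomputers
# keep the final catch-all deny (this assumes the stock "acl all src all" line is still present)
http_access deny all

After editing, reload Squid with "squid -k reconfigure" (or restart the service) so the changes take effect.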

Hovercat
February 11th, 2011, 01:40 PM
Thanks, I'll try it when I get home.

Hovercat
February 11th, 2011, 09:57 PM
Oh my god! It works! Thank you soooooo much!