
Thread: Blocking websites with ufw

  1. #1
    Join Date
    Mar 2013
    Beans
    4

    Blocking websites with ufw

    Hi.

    I have set up an Ubuntu 12.04 Server installation as a router, using iptables/ufw. It is set up virtually in VMware, where I have the router, a Windows Server 2008 machine and a Windows 7 client.
    I have been trying to find a way to block Facebook and other pages, but without success.
    One problem I have found is that Facebook uses an https page for the login, so I need to be able to block https sites as well as http.
    Is it possible to do this using ufw, or do I have to use iptables or something else?

    Oksendal
    Last edited by oksendal; March 11th, 2013 at 02:56 PM.

  2. #2
    Join Date
    Nov 2012
    Location
    Halloween Town
    Beans
    Hidden!
    Distro
    Xubuntu Development Release

    Re: Blocking websites with ufw

    With IPTables:
    Code:
    sudo iptables -A OUTPUT -d <ip address> -j DROP
    This syntax also lets you block an entire subnet by giving the destination in CIDR notation. For example, 192.168.13.0/24 covers the range 192.168.13.1 to 192.168.13.254.
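    For example (a sketch only; the addresses a site resolves to change over time, so the subnet below is just a placeholder):
    Code:
    # look up the site's current addresses
    host facebook.com
    # drop outgoing traffic to the whole subnet around one of the returned addresses (placeholder)
    sudo iptables -A OUTPUT -d 192.0.2.0/24 -j DROP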

    With UFW:
    Code:
    sudo ufw deny out to <ip address>
    To block an IP range you should do:
    Code:
    sudo ufw deny out to 192.168.13.0/24
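    One caveat (a sketch, since it depends on your setup): when the Ubuntu machine is acting as a router/NAT gateway, traffic from the Windows clients is forwarded rather than generated locally, so it traverses the FORWARD chain, not OUTPUT. To affect the clients you would need something like this on the router (the subnet is a placeholder):
    Code:
    # drop forwarded traffic from the LAN towards the blocked subnet
    sudo iptables -I FORWARD -d 192.0.2.0/24 -j DROP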

  3. #3
    Join Date
    Mar 2013
    Beans
    4

    Re: Blocking websites with ufw

    Hi.

    It did not work. Might it be because I have enabled IP masquerading on the server/router?
    As you might have understood, I am pretty new to Ubuntu and Linux.
    Is there a way to use URLs for blocking, or is blocking by IP the best way to do it?

  4. #4
    Join Date
    Nov 2008
    Location
    Lleida, Spain
    Beans
    1,157
    Distro
    Ubuntu 12.04 Precise Pangolin

    Re: Blocking websites with ufw

    Of course you can use iptables for this, but for web filtering (you want to block some pages) iptables is not the best option. I think using Squid with SquidGuard in transparent mode is a better option for you.

    http://www.ubuntugeek.com/how-to-set...in-ubuntu.html
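
    Roughly, "transparent mode" means the router silently redirects the clients' web traffic into Squid, so nothing needs to be configured on the Windows machines. A minimal sketch, assuming Squid listens on port 3128 and eth1 is the LAN-facing interface (both assumptions; Squid 3.1, which Ubuntu 12.04 ships, uses the http_port keyword "transparent", while 3.2+ uses "intercept"):
    Code:
    # /etc/squid3/squid.conf: listen in interception (transparent) mode
    http_port 3128 transparent
    # on the router: redirect the clients' outgoing HTTP into Squid
    sudo iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 -j REDIRECT --to-port 3128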

  5. #5
    Join Date
    Nov 2012
    Location
    Halloween Town
    Beans
    Hidden!
    Distro
    Xubuntu Development Release

  6. #6
    Join Date
    Mar 2013
    Beans
    4

    Re: Blocking websites with ufw

    Quote Originally Posted by albandy View Post
    Of course you can use iptables for doing this, but in web filtering (you want to block some pages) iptables are not the best option. I think using squid with squidguard in transparent mode is a better option for you.

    http://www.ubuntugeek.com/how-to-set...in-ubuntu.html

    Is it possible to block https sites using squid?
    I have not yet mentioned that I am running the server without a GUI, just the command line. Does this affect any functionality when it comes to blocking pages?

  7. #7
    Join Date
    Dec 2007
    Beans
    12,521

    Re: Blocking websites with ufw

    Quote Originally Posted by oksendal View Post
    Is it possible to block https sites ...
    You should add the bit about wanting to block https sites to your original post.

  8. #8
    Join Date
    Nov 2012
    Location
    Halloween Town
    Beans
    Hidden!
    Distro
    Xubuntu Development Release

    Re: Blocking websites with ufw

    Blocking websites can be done with squid dstdomain access lists (ACL).

    Edit the squid.conf configuration file:
    Code:
    gksudo geany /etc/squid/squid.conf
    Then add these three lines to it:
    Code:
    acl lan src 192.168.10.0/24                           # client ip range to block web sites
    acl bad_sites dstdomain .foo.com .fooo.com            # Block two domains in single acl
    http_reply_access deny bad_sites lan
    This will make the Squid server deny browsing whenever anyone in the "lan" ACL accesses the domain foo.com or fooo.com. In any case, you should keep in mind that Squid will also deny all subdomains of the blocked domains.
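    After saving squid.conf you can check the syntax and reload Squid without restarting it (assuming the Ubuntu squid3 package, where the binary is called squid3; with the older squid package the command is just squid):
    Code:
    # check the configuration file for errors
    sudo squid3 -k parse
    # tell the running Squid to re-read its configuration
    sudo squid3 -k reconfigure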

  9. #9
    Join Date
    Mar 2013
    Beans
    4

    Re: Blocking websites with ufw

    Does it matter where in the squid.conf file I add this?
    What do I do with all the documentation that is already there? Is all of that commented out so that it's not being used?



    Quote Originally Posted by slickymaster View Post
    Blocking websites can be done with squid dstdomain access lists (ACL).

    Edit the squid.conf configuration file:
    Code:
    gksudo geany /etc/squid/squid.conf
    Then add these three lines to it:
    Code:
    acl lan src 192.168.10.0/24                           # client ip range to block web sites
    acl bad_sites dstdomain .foo.com .fooo.com            # Block two domains in single acl
    http_reply_access deny bad_sites lan
    This will make the Squid server deny browsing whenever anyone in the "lan" ACL accesses the domain foo.com or fooo.com. In any case, you should keep in mind that Squid will also deny all subdomains of the blocked domains.

  10. #10
    Join Date
    Nov 2008
    Location
    Boston MetroWest
    Beans
    16,326

    Re: Blocking websites with ufw

    Quote Originally Posted by oksendal View Post
    Is it possible to block https sites using squid?
    Yes, but it takes quite a bit of work. Read this for more details. I'm testing the current 3.3 version now to see how well it works with transparent proxying of HTTPS requests. I've gotten 3.2 to work with SSLBump, but it requires that the browser be configured to use the proxy for HTTPS requests.
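
    For reference, the SSLBump part boils down to a couple of squid.conf directives plus a locally generated CA certificate that every client browser has to trust. This is only a rough sketch (Squid 3.2-era syntax; the certificate path is a placeholder, the exact directives differ between 3.1, 3.2 and 3.3, and the stock Ubuntu squid3 package may not be built with SSL support, so a rebuild could be required):
    Code:
    # squid.conf: accept proxied HTTPS and bump (decrypt) it
    http_port 3128 ssl-bump cert=/etc/squid/ssl/proxyCA.pem key=/etc/squid/ssl/proxyCA.pem
    ssl_bump server-first all
    # ordinary dstdomain ACLs then apply to the decrypted requests
    acl bad_sites dstdomain .facebook.com
    http_access deny bad_sites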

    None of my servers and gateways run GUIs. In fact the server version of Ubuntu comes without any GUI at all.
