
Thread: website hijacking

  1. #11
    Join Date
    Mar 2010
    Location
    Squidbilly-Land
    Beans
    Hidden!
    Distro
    Ubuntu

    Re: website hijacking

For a trivial, personal website that you don't want the world to find automatically, just don't put the entry page at the top. Bury it a little.

I haven't used Apache in years, so I don't remember the default locations, but
/var/www/index.html can just exist with a bare <html></html> to stop most casual visitors.
Then make a directory /var/www/sasfwae8fs/ and put your website there. If you want extra security, use virtualhosts: leave the default so everyone who arrives with just the IP or www.domain.TLD is sent to the empty page, but set up sas.domain.TLD/sasfwae8fs/ to hit your website/webapp - something like the sketch below. Unless you or a visitor gets careless (say, by using a URL shortener or an online bookmark tool), it is unlikely that most of the world will find it. Be certain to remind all your users that you'd rather they didn't share the URL anywhere, with anyone.
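A minimal sketch of that virtualhost arrangement, using the placeholder names from above (/var/www/empty is a hypothetical directory holding only the bare index.html):
Code:
# Default vhost: visitors arriving by bare IP or www.domain.TLD
# see only the directory with the empty placeholder page
<VirtualHost *:80>
    ServerName www.domain.TLD
    DocumentRoot /var/www/empty
</VirtualHost>

# Unadvertised vhost: the real site is buried one level down, so a
# visitor needs both the hostname and the path
# (sas.domain.TLD/sasfwae8fs/)
<VirtualHost *:80>
    ServerName sas.domain.TLD
    DocumentRoot /var/www
</VirtualHost>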

And you can add a basic-auth password if you like. Shared passwords always get known eventually, however.

I used to have an online photo gallery that I shared with friends and family. Then someone posted a link to it on Twitter or FB (I don't recall which) and all the web crawlers started hitting it. So much for my buried URL. I changed the name and stopped sharing it. The Firefox "awesome bar" isn't helping any of us either. Other browsers have the same thing: anything typed in there gets submitted to the configured search engine, so it won't be long before they know about it.

You'll want to watch the log files for unauthorized access and probably set up some analytics to see which parts of the website are being read at all.

  2. #12
    Join Date
    Feb 2014
    Beans
    140

    Re: website hijacking

    Quote Originally Posted by Drone4four View Post
Is this password protect feature like the one I demonstrated in my lorem ipsum post?
I don't know enough to say whether your security is breakable, but if all you're interested in is stopping a web crawler from mirroring your site, it will probably be enough.

    Quote Originally Posted by Drone4four View Post
Does this mean that I'd have to get each user to sign up for login credentials? That's overkill for my needs, I think. My master plan all along was to have a business card with my URL and the password (without a username). I would hand this business card out to people I meet who want a way to connect with me on fbook without exchanging a tiny scrap of paper with my contact information. Once they get there, they can explore who I am. I wouldn't want to hassle my acquaintances by making each of them create login credentials and all that jazz.
If you were paranoid about security, I suspect getting each user to sign up would be a fairly basic measure to protect your data. From your response I can see you're not, so a simple shared access password would be OK until it gets widely known.

Have you thought about using a QR code? Is it just to direct your contacts to the popular social networking site implied by the term fbook, or does the site itself contain the information you want to share? I haven't got a clue how you'd program it, but apparently a mechanism has been developed to log in using a QR code.

  3. #13
    Join Date
    Mar 2010
    Location
    Squidbilly-Land
    Beans
    Hidden!
    Distro
    Ubuntu

    Re: website hijacking

Most QR codes are just URL redirectors. If you use an external service (bit.ly, goo.gl, etc.), then those guys capture the redirection traffic for their stats, violating your users' privacy. If you set up a redirecting service yourself, fine, great - see the sketch below.
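A self-hosted redirector can be a single mod_alias line in your own vhost - a sketch, with /r/card and the target URL as made-up examples:
Code:
# The QR code encodes the short https://sas.domain.TLD/r/card URL;
# Apache forwards visitors to the real page, so no third party
# sees the redirection traffic
Redirect temp /r/card https://sas.domain.TLD/sasfwae8fs/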

It is possible to put more data inside the code itself - up to about 4,000 alphanumeric characters - depending on the size (version) of the 2D barcode.

    QR is great if your peeps use it with their smartphones. Not so great for the rest of the world.

I've seen them used like business cards - but more and more are just URLs to other data. Sometimes that data is like a business card, but 99.9999% of the time it is a URL redirector (chained to another redirector, and another) marketing things I don't want, need, or have any interest in. ;( Burn me once ...

  4. #14
    Join Date
    May 2007
    Location
    West Indies
    Beans
    497
    Distro
    Ubuntu

    concealing web content on my webserver

Why is this thread closed? I'd like to make a follow-up post today. Here is my follow-up in a new thread.

I'm trying to conceal my content from the Google, Bing, and YaCy indices. To achieve this I've added <meta name='robots' content='noindex,nofollow' /> to all my HTML. This meta tag doesn't prevent the NSA from indexing my content, but I don't care about them, or the Chinese Communist Party for that matter. What I am trying to achieve is ensuring that when recruiters and prospective employers search for my name on Google, they don't stumble upon my website.

    In the original thread, TheFu commented:
For a trivial, personal website that you don't want the world to find automatically, just don't put the entry page at the top. Bury it a little. I haven't used Apache in years, so I don't remember the default locations, but /var/www/index.html can just exist with a bare <html></html> to stop most casual visitors. Then make a directory /var/www/sasfwae8fs/ and put your website there.
I know what TheFu means. My question today is: would my content be better concealed by burying it further down a directory tree, like /var/www/html/[remotehost]/public_html/qwerty/uiop/asdf/[content_homepage]/index.html? I would only use this scheme with the intention of sharing the link directly via private IM. I wouldn't put such a link on the face of a business card, because typing the entire lengthy web address into a browser would be difficult. For the business card I'm thinking about setting up a QR code URL redirector, as was explained in the original thread.

I'll also be looking into adding Header set "X-Robots-Tag" "noindex,nofollow" to my apache2.conf. I thought about adding a
    Code:
* {
    /* disable text selection in every browser engine; purely
       cosmetic - it doesn't stop view-source, crawlers, or copying */
    -webkit-user-select: none;
    -khtml-user-select: none;
    -moz-user-select: none;
    -ms-user-select: none;
    -o-user-select: none;
    user-select: none;
}
rule to all my CSS, though at this point I don't think such a rule is necessary. I'm also not interested in asking visitors to create login credentials. When my audience navigates to my content using Firefox or Chrome, my URL gets submitted to the browsers' configured search engines. To circumvent this I could change the directory structure every so often, revoking access to older links. Then I'd have to change all my QR codes.
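For what it's worth, the X-Robots-Tag header mentioned above could look like this in apache2.conf - a sketch, assuming mod_headers is enabled (a2enmod headers):
Code:
# Send noindex,nofollow with every response, which also covers
# non-HTML files (images, PDFs) that a <meta> tag can't reach
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>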

There are all sorts of things to consider when avoiding public exposure and maintaining secrecy. What do you folks think?
    My rig:
IBM Personal System/2 Model 30-286 - - Intel 80286 (16 bit) 10 MHz - - 1MB DRAM - - Integrated VGA Display adapter
    1.44MB capacity Floppy Disk - - PS/2 keyboard (no mouse)

  5. #15
    Join Date
    Oct 2009
    Beans
    Hidden!
    Distro
    Ubuntu 22.04 Jammy Jellyfish

    Re: concealing web content on my webserver

    Quote Originally Posted by Drone4four View Post
Why is this thread closed? I'd like to make a follow-up post today. Here is my follow-up in a new thread.
Threads get auto-closed one year after their last post. It helps prevent necro-posting.

    Anyway, I have merged your new thread with the old one to keep things together.

As for your question - I don't really know. Most search engines obey robots.txt, but if you want to keep your site from being indexed, why not just require a password to view it?
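For reference, the robots.txt version of that request is just two lines at the document root - a sketch (it is purely advisory; only well-behaved crawlers honor it):
Code:
# /var/www/robots.txt - ask all compliant crawlers to skip the site
User-agent: *
Disallow: /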
    Come to #ubuntuforums! We have cookies! | Basic Ubuntu Security Guide

    Tomorrow's an illusion and yesterday's a dream, today is a solution...

  6. #16
    Join Date
    May 2007
    Location
    West Indies
    Beans
    497
    Distro
    Ubuntu

    Re: website hijacking

    Quote Originally Posted by CharlesA View Post
As for your question - I don't really know. Most search engines obey robots.txt, but if you want to keep your site from being indexed, why not just require a password to view it?
I thought about using a password, making it trivial, and providing it on the business card. I could figure out how to implement the password-protect feature here: https://angeles4four.wordpress.com/2...orem-ipsum-ra/

    The password is 'gizelle'. Is this what you were suggesting, CharlesA?

This protective measure won't stop malicious script kiddies from accessing my content, but it would do what I set out to do: prevent relatively technically illiterate job recruiters and prospective employers from accessing my controversial content.
    My rig:
IBM Personal System/2 Model 30-286 - - Intel 80286 (16 bit) 10 MHz - - 1MB DRAM - - Integrated VGA Display adapter
    1.44MB capacity Floppy Disk - - PS/2 keyboard (no mouse)

  7. #17
    Join Date
    Oct 2009
    Beans
    Hidden!
    Distro
    Ubuntu 22.04 Jammy Jellyfish

    Re: website hijacking

I was thinking of just password-protecting the entire site, so you'd get a "login required" box (see the sketch below), but if you can do it from within WordPress itself, that might be an easier way to manage it.
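A sketch of that server-level "login required" box in the Apache vhost - the paths, realm name, and username are examples, and the credentials file would be created first with htpasswd:
Code:
# Protect the whole document root with HTTP basic auth.
# Create the credentials file first, e.g.:
#   htpasswd -c /etc/apache2/.htpasswd friend
<Directory /var/www>
    AuthType Basic
    AuthName "Private site"
    AuthUserFile /etc/apache2/.htpasswd
    Require valid-user
</Directory>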
    Come to #ubuntuforums! We have cookies! | Basic Ubuntu Security Guide

    Tomorrow's an illusion and yesterday's a dream, today is a solution...


