
Thread: website hijacking


  1. #1
    Join Date
    May 2007
    Location
    Ontario
    Beans
    303
    Distro
    Ubuntu

    website hijacking

    About a week ago I deployed my first DigitalOcean droplet. I used 12.04 and installed Apache, MySQL and PHP. I successfully got my index.html to load in my browser by entering the droplet's IP address shown in my DigitalOcean control panel. That same weekend, in a humble state of accomplishment, I shared the IP address of my freshly deployed LAMP stack with a friend from work who is in our IT department. He is the same person who recommended DigitalOcean to me in the first place, and we share a lot in common.

    Yesterday I Googled a string of uniquely worded content from my site in quotation marks and discovered that someone had mirrored my HTML and CSS onto their own server under a domain name. A whois query showed the domain registered to someone in Texas in December 2013. My site was live on someone else's server before I had a chance to bind my IP address to the domain name I registered, which I was planning to do this weekend.

    I didn’t want my content mirrored like this, so I destroyed my droplet last night. An hour or two later, the third-party clone of my site indexed on Google went down too.

    What the hell could be going on here?

    I have a suspicion. I shared my droplet's IP address with only one person: the co-worker I mentioned earlier. He is a very gifted individual who has worked at Fortune 500 companies and is studying to write advanced-level Cisco certifications. The domain name the third party used was goofy, in a way that matches my co-worker's playful character. I suspect it could be him playing a prank on me, but I don't know for sure.

    Is there any other logical explanation for someone getting hold of my droplet's IP address, hijacking my content and hosting it on their own public server? Could this be a malicious third party who somehow discovered my IP address and decided to copy my HTML? If this happens to other people, why would third parties do such things? What could the motive be? Do they have something financial to gain? Do these hijackers act for reasons similar to DNS squatters, who register potential names to sell to the highest bidder? Or do they use copied sites for hidden, malicious purposes involving spam or malware?

    Is there any other possible explanation for this phenomenon I am experiencing?
    Gigabyte GA-P45T-ES3G LGA 775 Intel P45
    Intel Core 2 Quad Q9650 @3.0GHz
    16GB RAM DDR3 1333MHz
    EVGA nVidia GeForce GTX 560 Ti 2GB

  2. #2
    Join Date
    Mar 2010
    Location
    Metro-ATL; PM free zone.
    Beans
    Hidden!
    Distro
    Lubuntu 14.04 Trusty Tahr

    Re: website hijacking

    IPs are public. Scanning the entire IPv4 internet for anything listening on port 80 takes less than 24 hours these days.
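To make that concrete, here is a minimal sketch of the building block such scanners loop over: a plain TCP connect with a short timeout. The function name and timeout are my own illustration, not any particular scanner's code.

```python
import socket

def is_http_open(host: str, port: int = 80, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Refused, unreachable, or timed out - treat all as "not open".
        return False

# A whole-internet scanner just runs this over every routable IPv4 address,
# thousands of sockets at a time, which is why a bare IP is never "hidden".
```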

    I've been on the internet since the 1990s and have seen my content copied many times, usually by well-meaning people who were afraid it would disappear. It usually takes just a nicely worded email to the webmaster to handle.

    If you don't want people to use iframes with your content, then block that. Check out YaCy to see what's possible.

    I'd be concerned about putting anything based on PHP on the internet, and I'm not the only one. OWASP (https://www.owasp.org/index.php/PHP_...ty_Cheat_Sheet) is a group focused on creating best practices for web-app security. There is a local chapter in my metro area and I've spoken at a few others around the world. Nice folks.

    Be extremely careful with PHP. E-X-T-R-E-M-E-L-Y.

  3. #3
    Join Date
    May 2007
    Location
    Ontario
    Beans
    303
    Distro
    Ubuntu

    Re: website hijacking

    Thank-you TheFu for your reply.

    Code:
    <meta name='robots' content='noindex,nofollow' />
    If I bind my IP address to a domain name and add the above tag inside my <head> element, what will this do? According to Google's support docs, it will prevent all robots from indexing the page on my site. But is it conceivable that the bots you speak of (the ones which scan the entire internet in 24 hours) still get hold of the IP address bound to my domain? Or should the meta tag prevent that? I suppose I am just asking hypothetically here.

    Is it at all possible to keep my domain private and unlisted, so as to make it accessible only to the people who I directly give the address to?

    I checked out YaCy. I found the Wikipedia entry but the actual homepage for the project is offline right now. To use the YaCy search engine, do I have to be a member of the peer-to-peer network and access it via my localhost:8090?

    I don't intend to use PHP. I am not a programmer; I just installed PHP for the sake of it. But is there any danger in having PHP installed on my server even though it's mostly inactive? What if I install something like WordPress, which uses PHP? Should I worry about PHP then? If I were careless with PHP, what's the worst that could happen?

  4. #4
    Join Date
    Mar 2010
    Location
    Metro-ATL; PM free zone.
    Beans
    Hidden!
    Distro
    Lubuntu 14.04 Trusty Tahr

    Re: website hijacking

    Quote Originally Posted by Drone4four View Post
    Is it conceivable that the bots you speak of (the ones which scan the entire internet in 24 hours) still get hold of the IP address bound to my domain? ... Is there any danger having PHP installed on my server even though it's mostly inactive?
    Servers can ask clients for all sorts of things, and clients are free to ignore every one of them. robots.txt is ignored all the time.

    The YaCy reference was just to show that anyone can run a webcrawler from anywhere in the world and find any website, running on any port.

    No, you CANNOT prevent people from connecting to your URL unless you put in a highly restrictive firewall, which sort of defeats the point of having a website on the internet. So you could take it off the public internet and force your friends to access it through a VPN, but that is usually more trouble than it is worth for a hobby. OTOH, YOU might want a VPN so that when you are away, it is safe to remote in to the home network. OpenVPN is the tool of choice for that, or you can just use ssh like most UNIX folks do.

    Should you worry about PHP? This is an opinion question, and I think so. My CSO thinks so. Our corporate policy doesn't allow PHP on the internet. We can run it internally, accessible only over VPN, but we have to get a "variance" approved by a VP. It is easier to use Perl, Python, Ruby, or almost anything else, except Java. That is prohibited here too since Oracle took over. I don't think most Java is nearly as bad as most PHP, but any code can be dangerous. I think that WordPress, when run by professionals, is as secure as any other blog engine. However, last month I helped a guy here discover that his WordPress blog was hacked. He came looking for help with web analytics that had stopped working, and left knowing his server had been hacked and was probably being used as C&C for a botnet.

    The internet isn't a friendly place for the server or network administrator. We need to be proactive, aggressive, protective. BTW, backups are the #1 security tool. Be certain you have those working perfectly and stored offline, so you have something to compare against when you are hacked. "When", not if.

    As for people copying your content: there are all sorts of ways to make it more difficult, but in the end nothing stops a determined copier. If you put text and/or images on the web, they will be copied. If you put them at hard-to-find URLs, one of your friends will share a URL somewhere (probably inside Gmail) and then Google will start crawling it. That happened to me. If you don't want others to access the data, then don't let them see it, or allow access only over a VPN.

  5. #5
    Join Date
    Feb 2014
    Beans
    87

    Re: website hijacking

    Quote Originally Posted by Drone4four View Post
    Is it at all possible to keep my domain private and unlisted, so as to make it accessible only to the people who I directly give the address to?
    No-one else has mentioned it, but if you want your content to remain private you could always restrict access with password protection. If you're really paranoid, you should also protect it with SSL so that no-one can pick up the encoded username/password from the HTTP request headers.
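For the record, a minimal sketch of what that looks like in Apache with HTTP Basic authentication. The docroot, realm name and htpasswd path are placeholders (create the file with something like `htpasswd -c /etc/apache2/.htpasswd alice`), and the exact directives can vary between Apache 2.2 and 2.4:

```apache
# Require a valid username/password for everything under the docroot.
<Directory /var/www/html>
    AuthType Basic
    AuthName "Private area"
    AuthUserFile /etc/apache2/.htpasswd
    Require valid-user
</Directory>
```

Remember that Basic auth only base64-encodes the credentials, which is exactly why the SSL suggestion above matters on an untrusted network.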

  6. #6
    Join Date
    May 2007
    Location
    Ontario
    Beans
    303
    Distro
    Ubuntu

    Re: website hijacking

    In the future, when I want to protect my content, I could password protect it as demonstrated at this blog post. The password is gizelle.

    I can also use this tag, these attributes and these values:
    Code:
    <meta name='robots' content='noindex,nofollow' />

    I’ll use the following CSS rules:
    Code:
    -webkit-user-select: none; 
    -khtml-user-select: none; 
    -moz-user-select: none; 
    -ms-user-select: none; 
    -o-user-select: none; 
    user-select: none;
    Thanks to SeijiSensei, I can also append some provisions to my apache configuration file:
    Quote Originally Posted by SeijiSensei View Post
    Instead of placing "nofollow,noindex" in a <meta> tag on each page, you can tell Apache to send these restrictions in advance of delivering the page content like this:
    Code:
    Header set "X-Robots-Tag" "noindex,nofollow"
    To set this globally, add it to the bottom of /etc/apache2/apache2.conf right above the "IncludeOptional" directives.
    It’s a dog-eat-dog security arms race out there. The internet is a fragile piece of technology. The NSA and the Chinese Communist Party could easily circumvent the protective measures I listed above. So could cyberpunk script kiddies. But I don’t have a problem with that, because I am pronoid (check out the Wikipedia entry for a fascinating read). So why would I want to protect my content if I don’t mind the NSA knowing everything about me? Because I want to prevent prospective employers from finding traces on Google of the controversial political beliefs I intend to post to my site. I take my privacy very seriously, particularly when it comes to employment. The measures listed should be sufficient for my needs. My intended audience is maybe two dozen people, most of whom aren’t savvy enough to use Ctrl+U in Firefox or Chrome.

    What do you folks think? Is there anything else I should consider?

    Quote Originally Posted by m-dw View Post
    No-one else mentioned it, but if you want your content to remain private you could always restrict access with password protection.
    Is this password-protect feature like the one I demonstrated in my lorem ipsum post?

    Quote Originally Posted by m-dw View Post
    If you're really paranoid you should also protect it with SSL so that no-one can pick up the encoded username/password in the http request headers.
    Does this mean I’d have to get each user to sign up for login credentials? I think that is overkill for my needs. My master plan all along was to have a business card with my URL and the password (without a username). I would hand this card out to people I meet who want a way to connect with me on fbook without exchanging a tiny scrap of paper with my contact information. Once they get there, they can explore who I am. I wouldn’t want to hassle my acquaintances by making each of them create login credentials and all that jazz.
    Last edited by Drone4four; April 1st, 2014 at 11:20 PM. Reason: grammar correction

  7. #7
    Join Date
    Mar 2010
    Location
    Metro-ATL; PM free zone.
    Beans
    Hidden!
    Distro
    Lubuntu 14.04 Trusty Tahr

    Re: website hijacking

    For a trivial, personal website that you don't want the world to see automatically, just don't put the entry page at the top. Bury it a little.

    I haven't used Apache in years, so I don't recall the default locations, but
    /var/www/index.html can just exist with an empty <html></html> to stop most casual people.
    Then make a directory /var/www/sasfwae8fs/ and put your website there. If you want extra security, use virtual hosts: leave the default so that everyone who arrives with just the IP or www.domain.TLD is sent to the empty page, but set up sas.domain.TLD/sasfwae8fs/ to hit your website/webapp. Unless you or a visitor gets careless, say by using a URL shortener or an online bookmark tool, it is unlikely that most of the world will find it. Be certain to remind all the users that you would rather they didn't share the URL anywhere, with anyone else.
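A rough sketch of that two-vhost layout. The hostnames are placeholders (swap in your real domain), and the directory name reuses the throwaway example above:

```apache
# Default vhost: anyone arriving by bare IP or the main name
# gets the near-empty placeholder page.
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www
</VirtualHost>

# Buried site: only served when the request carries this hostname.
<VirtualHost *:80>
    ServerName sas.example.com
    DocumentRoot /var/www/sasfwae8fs
</VirtualHost>
```

Name-based virtual hosting keys on the Host header the browser sends, so a visitor who only knows the IP never sees the second vhost.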

    And you can add a basic-auth password if you like. Shared passwords always get known, however.

    I used to have an online photo gallery that I shared with friends and family. Then someone posted a link to it on Twitter or FB (I don't recall which), and all the web crawlers started hitting it. So much for my buried URL. I changed the name and stopped sharing it. The Firefox "awesome bar" isn't helping any of us either. Other browsers have the same thing: anything typed in there gets submitted to the configured search engine, so it won't be long before they know about it.

    You'll want to watch the log files for unauthorized access, and probably set up some analytics to see which parts of the website are being read at all.
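A tiny sketch of that kind of log watching, assuming Apache's stock common/combined log format (the regex would need adjusting for a custom LogFormat); the function name and status codes to flag are my own choices:

```python
import re
from collections import Counter

# Common Log Format prefix: client IP, identd, user, [timestamp], "request", status.
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (\d{3})')

def suspicious_hits(lines):
    """Count 401/403/404 responses per client IP - a crude probe detector."""
    hits = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group(2) in ("401", "403", "404"):
            hits[m.group(1)] += 1
    return hits
```

Feeding it `open("/var/log/apache2/access.log")` and printing the top few IPs is usually enough to spot a scanner hammering the site.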

  8. #8
    Join Date
    Feb 2014
    Beans
    87

    Re: website hijacking

    Quote Originally Posted by Drone4four View Post
    Is this password protect feature like the one I demonstrated in my lorum ipsum post?
    I don't know enough to say whether your security is breakable, but if all you're interested in is stopping a web crawler from mirroring your site, then it will probably be enough.

    Quote Originally Posted by Drone4four View Post
    Does this mean that I’d have to get each user to sign up for login credentials? This is overkill I think for my needs. My master plan all along was to have a business card with my URL and the password (without a username). I would hand this business card out to people who I meet who want a way to connect with me on fbook without having to exchange a tiny scrap piece of paper with my contact information. Once they get there, they can explore who I am. I wouldn’t want to hassle my acquaintances with having each of them to create login credentials and all that jazz.
    If you were paranoid about security, I suspect getting each user to sign up would be a fairly basic measure to protect your data. From your response I can see you're not, so a simple shared access password would be OK, until it becomes widely known.

    Have you thought about using a QR code? Is it just to direct your contacts to the popular social networking site implied by "fbook", or does the site itself contain the information you want to share? I haven't got a clue how you'd program it, but apparently a mechanism has been developed to log in using a QR code.

  9. #9
    Join Date
    Mar 2010
    Location
    Metro-ATL; PM free zone.
    Beans
    Hidden!
    Distro
    Lubuntu 14.04 Trusty Tahr

    Re: website hijacking

    Most QR codes are just URL redirectors. If you use an external service (bit.ly, goo.gl, etc.), those guys capture the redirection traffic for their stats, violating your users' privacy. If you set up a redirecting service yourself, fine, great.

    It is possible to put more data inside, up to about 4K characters, depending on the size of the 2D barcode.

    QR is great if your peeps use it with their smartphones. Not so great for the rest of the world.

    I've seen them used like business cards, but more and more are just URLs to other data. Sometimes that data is like a business card, but 99.9999% of the time it is a URL pointing to redirector after redirector, marketing things I don't want, need, or have any interest in. ;( Burn me once ...

  10. #10
    Join Date
    Nov 2008
    Location
    Metro Boston
    Beans
    9,341
    Distro
    Kubuntu 14.04 Trusty Tahr

    Re: website hijacking

    I think you and your higher-ups are overly paranoid about PHP. It's certainly a lot stronger since the Zend folks took over development. Most problems with PHP sites have to do with bad programming practices, not inherent deficiencies in the language itself.

    OP, if there are no .php (or .ph*) pages on your site, then nothing will invoke the PHP engine.
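If you want to be belt-and-braces about it anyway, you can also tell Apache to refuse PHP-ish filenames outright. A sketch in Apache 2.2 syntax (matching the 12.04-era stock LAMP install; 2.4 uses `Require all denied` instead, and the `php_admin_flag` line assumes mod_php is the loaded handler):

```apache
# Deny any request for a PHP-like filename, whether or not an engine is loaded.
<FilesMatch "\.ph(p[3-5]?|tml)$">
    Order Allow,Deny
    Deny from all
</FilesMatch>

# With mod_php loaded, this switches the interpreter off entirely
# for the enclosing vhost or directory.
php_admin_flag engine off
```

Either measure alone covers the "I installed PHP but don't use it" case; together they also cover a stray .php file being uploaded later.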

    Instead of placing "nofollow,noindex" in a <meta> tag on each page, you can tell Apache to send these restrictions in advance of delivering the page content like this:
    Code:
    Header set "X-Robots-Tag" "noindex,nofollow"
    To set this globally, add it to the bottom of /etc/apache2/apache2.conf right above the "IncludeOptional" directives.
    If you ask for help, please have the courtesy to check for responses and thank the people who helped you.

    Blog · Linode System Administration Guides · Android Apps for Ubuntu Users
