
Thread: website hijacking

  1. #1
    Join Date
    May 2007
    Location
    West Indies
    Beans
    497
    Distro
    Ubuntu

    website hijacking

    About a week ago I deployed my first DigitalOcean droplet. I used Ubuntu 12.04 and installed Apache, MySQL and PHP. I successfully got my index.html to load in my browser by entering the IP address shown in my DigitalOcean control panel. That same weekend, in a humble state of accomplishment, I shared the IP address of my freshly deployed LAMP stack with a friend of mine from work who works in our IT department. This is the same person who recommended DigitalOcean to me in the first place. He and I share a lot in common.

    I Googled a string of uniquely worded content from my site in quotation marks yesterday and discovered that someone had mirrored my HTML and CSS code onto their server with a domain name. I did a whois query and discovered the domain registered to someone in Texas in December of 2013. My site was live on someone else’s server before I had the opportunity to bind my ip address to the domain name I registered, which was something I was planning on doing this weekend.

    I didn’t want my content mirrored like this so I destroyed my droplet last night. About an hour or two later this third party clone of my site indexed on Google went down too.

    What the hell could be going on here?

    I have a suspicion. I shared my DigitalOcean droplet's IP address with only one person: that co-worker I mentioned earlier. This co-worker is a very gifted individual. He has worked at Fortune 500 companies in the past and is in the process of studying to write advanced-level Cisco certifications. The domain name which the third party used was goofy, goofy in a way that matches my co-worker's playful character. I suspect it could be my co-worker playing a prank on me, but I don't know for sure.

    Is there any other logical explanation for someone getting hold of my droplet's IP address, hijacking my content and hosting it on their own public server? Could this be a malicious third party who somehow discovered my droplet's IP address and decided to copy my HTML? If this happens to other people, why would third parties do such things? What could be his or her motive? Do they have something financial to gain? Do these kinds of hijackers do so for similar reasons that domain squatters register potential names to sell to the highest bidder? Do they use it for hidden, malicious intent involving spam or malware or something?

    Is there any other possible explanation for this phenomenon I am experiencing?
    My rig:
    IBM Personal System/2 Model 30-286 - - Intel 80286 (16 bit) 10 Mhz - - 1MB DRAM - - Integrated VGA Display adapter
    1.44MB capacity Floppy Disk - - PS/2 keyboard (no mouse)

  2. #2
    Join Date
    Mar 2010
    Location
    Squidbilly-Land
    Beans
    Hidden!
    Distro
    Ubuntu

    Re: website hijacking

    IPs are public. Scanning the entire internet for stuff on port 80 takes less than 24 hours these days.

    I've been on the internet since the 1990s and have seen my content stolen many times, usually by well-meaning people who were afraid that it would disappear. It usually just takes a nicely worded email to the webmaster to handle.

    If you don't want people to use iframes with your content, then block that. Check out YaCy to see what's possible.

    I'd be concerned over putting anything based on PHP on the internet. I'm not the only one. OWASP (https://www.owasp.org/index.php/PHP_...ty_Cheat_Sheet) is a group built around creating best practices for web-app security, and that link is their PHP security cheat sheet. There is a local chapter in my metro area and I've spoken at a few others around the world. Nice folks.

    Be extremely careful with PHP. E-X-T-R-E-M-E-L-Y.

  3. #3
    Join Date
    May 2007
    Location
    West Indies
    Beans
    497
    Distro
    Ubuntu

    Re: website hijacking

    Thank you, TheFu, for your reply.

    Code:
    <meta name='robots' content='noindex,nofollow' />
    If I bind my IP address to a domain name and add the above tag inside my page's <head> section, what will this do? According to Google's support docs, it will prevent all robots from indexing that page on my site. But is it conceivable that the bots you speak of (the ones which scan the entire internet in 24 hours) still get hold of the IP address bound to my domain? Or should the code in my meta tag prevent that? I suppose I am just asking hypothetically here.
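
    To make my own question concrete, here is a sketch of what I mean (the address below is made up): even with that meta tag in place, anyone who already has the bare IP can still pull the page down; the tag only asks crawlers not to index what they fetch.
    Code:
    # the meta tag doesn't stop the request itself; 203.0.113.10 stands in for my droplet's IP
    curl http://203.0.113.10/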

    Is it at all possible to keep my domain private and unlisted, so as to make it accessible only to the people who I directly give the address to?

    I checked out YaCy. I found the Wikipedia entry but the actual homepage for the project is offline right now. To use the YaCy search engine, do I have to be a member of the peer-to-peer network and access it via my localhost:8090?

    I don't intend on using PHP. I am not a programmer. I just installed PHP for the sake of it. But is there any danger in having PHP installed on my server even though it's mostly inactive? What if I install a service like WordPress which uses PHP? Should I worry about PHP then? If I were careless with PHP, what's the worst that could happen?

  4. #4
    Join Date
    Mar 2010
    Location
    Squidbilly-Land
    Beans
    Hidden!
    Distro
    Ubuntu

    Re: website hijacking

    Quote Originally Posted by Drone4four View Post
    Thank you, TheFu, for your reply.

    Code:
    <meta name='robots' content='noindex,nofollow' />
    If I bind my IP address to a domain name and add the above tag inside my page's <head> section, what will this do? According to Google's support docs, it will prevent all robots from indexing that page on my site. But is it conceivable that the bots you speak of (the ones which scan the entire internet in 24 hours) still get hold of the IP address bound to my domain? Or should the code in my meta tag prevent that? I suppose I am just asking hypothetically here.

    Is it at all possible to keep my domain private and unlisted, so as to make it accessible only to the people who I directly give the address to?

    I checked out YaCy. I found the Wikipedia entry but the actual homepage for the project is offline right now. To use the YaCy search engine, do I have to be a member of the peer-to-peer network and access it via my localhost:8090?

    I don't intend on using PHP. I am not a programmer. I just installed PHP for the sake of it. But is there any danger in having PHP installed on my server even though it's mostly inactive? What if I install a service like WordPress which uses PHP? Should I worry about PHP then? If I were careless with PHP, what's the worst that could happen?
    Servers can request all sorts of things from clients, and those requests can be completely ignored. robots.txt is ignored all the time.

    The YaCy reference was just to show that anyone can run a webcrawler from anywhere in the world and find any website, running on any port.

    No, you CANNOT prevent people from connecting to your URL unless you put in a highly restrictive firewall ... which sorta defeats the point of having a website on the internet. So ... you can stop putting it on the internet and force only your friends to access it through a VPN, but that is usually more trouble than it is worth for a hobby. OTOH, YOU might want a VPN so that when you are away, it is safe to remote in to the home network. openvpn is the tool of choice for that or you can just use ssh like most UNIX folks do.
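
    If you go the ssh route, here's a minimal sketch (the user name and address are placeholders): forward a local port to the droplet's web server and browse through the tunnel instead of exposing the site to the whole internet.
    Code:
    # forward local port 8080 to port 80 on the droplet, over ssh
    ssh -L 8080:localhost:80 user@your-droplet-ip
    # then point a browser at http://localhost:8080 on your own machine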

    Should you worry about PHP? This is an opinion question - I think so. My CSO thinks so. Our corporate policy doesn't allow PHP on the internet. We can run it internally, only accessible over VPN, but we have to get a "variance" approved by a VP. It is easier to use perl, python, ruby, ... almost anything else ... except Java. That is prohibited here too since Oracle took over. I don't think most java is nearly as bad as most php, but any code can be dangerous. I think that wordpress, when run by professionals, is as secure as any other blog engine. However, last month, I helped a guy here discover that his wordpress blog was hacked. He came here looking for help on web analytics that had stopped working and left thinking his server had been hacked and was probably being used as C&C for a botnet.

    The internet isn't a friendly place for the server or network administrator. We need to be proactive, aggressive, protective. BTW, backups are the #1 security tool - be certain you have those working perfectly and stored offline so you can compare against something when you are hacked. "When", not if.
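
    For a small site, something as simple as this is enough (the paths and host names are only examples); the part that matters is pulling the copy down to a machine the server itself can't touch.
    Code:
    # dated tarball of the web root and the Apache config, made on the droplet
    tar czf /tmp/www-backup-$(date +%F).tar.gz /var/www /etc/apache2

    # pull it home from another machine, and keep it to compare against after a suspected compromise
    scp user@your-droplet-ip:/tmp/www-backup-*.tar.gz ~/backups/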

    As to people copying your content - there are all sorts of ways to make it more difficult, but in the end, that doesn't work for a determined copier. If you put text and/or images on the web, they will be copied. If you put them into non-easily found URLs, one of your friends will share that URL somewhere (probably inside gmail) and then google will start searching for it. That happened to me. If you don't want others to access the data, then don't let them see it or force only a VPN to have access.

  5. #5
    Join Date
    Nov 2008
    Location
    Boston MetroWest
    Beans
    16,326

    Re: website hijacking

    I think you and your higher-ups are overly paranoid about PHP. It's certainly a lot stronger since the Zend folks took over development. Most problems with PHP sites have to do with bad programming practices, not inherent deficiencies in the language itself.

    OP, if there are no .php (or .ph*) pages on your site, then nothing will invoke the PHP engine.

    Instead of placing "nofollow,noindex" in a <meta> tag on each page, you can tell Apache to send these restrictions in advance of delivering the page content like this:
    Code:
    Header set "X-Robots-Tag" "noindex,nofollow"
    To set this globally, add it to the bottom of /etc/apache2/apache2.conf right above the "IncludeOptional" directives.
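
    One caveat, assuming a stock Apache install on Ubuntu: the Header directive is provided by mod_headers, which may not be enabled yet, and it is worth checking the result from outside. A quick sketch:
    Code:
    # enable mod_headers if it isn't already, then restart Apache
    sudo a2enmod headers
    sudo service apache2 restart

    # from any other machine, confirm the header is actually sent (substitute your own IP or domain)
    curl -I http://your-droplet-ip/ | grep -i x-robots-tag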
    If you ask for help, do not abandon your request. Please have the courtesy to check for responses and thank the people who helped you.

    Blog · Linode System Administration Guides · Android Apps for Ubuntu Users

  6. #6
    Join Date
    Sep 2006
    Beans
    8,627
    Distro
    Ubuntu 14.04 Trusty Tahr

    ssi

    I've used PHP in the past and am leery of its use. I've come to see PHP as a last resort.

    http://me.veekun.com/blog/2012/04/09...of-bad-design/

    Unfortunately there are many things that, purely from a user perspective, are great but from a development or sysadmin perspective are a terrible nightmare.

    There are many sites that use PHP just to get standardized headers, footers and navigation menus. If that is what you are looking to do, then you can do all that with Server-Side Includes more securely. Just be sure to use IncludesNOEXEC in the options, for extra protection.
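
    As a rough sketch of the Apache side (the paths and the .shtml extension are only examples), it looks something like this:
    Code:
    # mod_include provides SSI; enable it once and restart
    sudo a2enmod include
    sudo service apache2 restart

    # then, in the virtual host or a conf.d snippet:
    <Directory /var/www>
        # allow includes but not #exec, which is the dangerous part
        Options +IncludesNOEXEC
    </Directory>
    AddType text/html .shtml
    AddOutputFilter INCLUDES .shtml
    After that, a page saved as something.shtml can pull in a shared header with <!--#include virtual="/includes/header.html" --> and you get the effect people usually reach for PHP to achieve.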

    About the copying, you can try complaining to the site's owner or to the organization where the IP number is registered. About the meta tags, I'd use index rather than noindex because, usually, you want the honest search engines like YaCy and Google to find your page for people. The dishonest search engines will happily ignore both robots.txt and meta tags.

    A more obnoxious route would be to PGP-sign the page's content and then, when it gets copied, come down on them with the DMCA or EUCD. But I don't think highly of that option; the recommendation of a polite letter is good and may be enough to clear up the misunderstanding.

  7. #7
    Join Date
    Feb 2014
    Beans
    140

    Re: website hijacking

    Quote Originally Posted by Drone4four View Post
    Is it at all possible to keep my domain private and unlisted, so as to make it accessible only to the people who I directly give the address to?
    No-one else mentioned it, but if you want your content to remain private you could always restrict access with password protection. If you're really paranoid you should also protect it with SSL so that no-one can pick up the encoded username/password in the http request headers.
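
    For what it's worth, a minimal sketch of the Apache side (the account name, password file location and directory are only examples):
    Code:
    # htpasswd comes from the apache2-utils package; -c creates the file, so use it only the first time
    sudo htpasswd -c /etc/apache2/.htpasswd friend

    # then, in the virtual host or apache2.conf:
    <Directory /var/www>
        AuthType Basic
        AuthName "Private site"
        AuthUserFile /etc/apache2/.htpasswd
        Require valid-user
    </Directory>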

  8. #8
    Join Date
    Nov 2008
    Location
    Boston MetroWest
    Beans
    16,326

    Re: ssi

    Quote Originally Posted by Lars Noodén View Post
    There are many sites that use PHP just to get standardized headers, footers and navigation menus.
    All my sites contain dynamic content and rely on a database backend. For that I need some type of programming environment; PHP works for me.

    That article you cited raises a lot of tiny issues and points to problems that most of us never encounter. Sure, I don't like having to remember that the order of parameters in strstr() is different from that in preg_match(), but those are quibbles. One of his security examples concerns a broken crypt() function. Does anyone even use crypt() for password hashing these days? Certainly not me. The section on debugging seems to criticize PHP for not offering functionality that is irrelevant to a scripting language. His discussion of globals is entirely off the mark since he fails to acknowledge the existence of the $GLOBALS built-in array. I could go on, but I've seen these kinds of rants from programmers, and they all fail to move me to switch languages.

    I've been writing sites like these for some 15 years now, so I'm probably better able to protect my code from potential dangers. Still, if you think about the security implications of what you are writing while you are writing it, I don't think PHP is per se less secure than competing languages for developing web applications. I'll just observe that it is the language of choice for many large projects like WordPress and Joomla.
    Last edited by SeijiSensei; April 1st, 2014 at 10:35 PM.

  9. #9
    Join Date
    Mar 2010
    Location
    Squidbilly-Land
    Beans
    Hidden!
    Distro
    Ubuntu

    Re: ssi

    The core PHP project has a history of releasing code that THEY KNEW contained major bugs. If the project really is under new management, that would be a good step.

    I know some really sharp php developers. One works at the company behind WordPress. He's learned all the traps in php, but the barrier to entry for php programming is just so very low that the average php program is ... er ... crap code. Best to stick only with EXTREMELY well-made code if php is the solution, OR stick with languages that have a better history of trying to protect programmers from themselves and other users.

    I'm still waiting for any other language to come close to the "taint" checking in perl.

    Oh - and the concerns over PHP aren't just mine. The OWASP group (Open Web Application Security Project) says this:
    Serious issues abound in all aspects of PHP, making it difficult to write secure PHP applications. If you’re forced to use PHP, then you must be aware of all its pitfalls.
    https://www.owasp.org/index.php/PHP_...ty_Cheat_Sheet

    Even professional php developers have trouble with security: http://www.zdnet.com/wordpress-attac...ts-7000014256/ Every year (approximately), there is a major php-based tool that gets hacked. They are the Microsoft of scripting languages, IMHO.

    BTW, the website for my company is currently php and has been for years. When it was deployed by the CEO's brother, we sat down and explained that we couldn't stop it from being hacked - all we could do was have a response ready for after that occurs. The CEO wasn't happy. Actually, he was pissed. There isn't anything on the public website that couldn't be done mostly with CSS and extremely light JS, if any. I reproduced it using Template::Toolkit and static files that built the pages to appear like dynamic stuff. Matching headers, footers and sidebars are relatively trivial in TT (actually ttree is the tool name). I still use TT to build some internal webpages - it is a simple but extremely powerful language. http://template-toolkit.org/docs/tutorial/Web.html Of course, there are many similar tools that will build a static website that doesn't look or behave like a static website.

  10. #10
    Join Date
    May 2007
    Location
    West Indies
    Beans
    497
    Distro
    Ubuntu

    Re: website hijacking

    In the future, when I want to protect my content, I could password-protect it as demonstrated at this blog post. The password is gizelle.

    I can also use this tag, these attributes and these values:
    <meta name='robots' content='noindex,nofollow' />

    I’ll use the following CSS rules:
    Code:
    -webkit-user-select: none; 
    -khtml-user-select: none; 
    -moz-user-select: none; 
    -ms-user-select: none; 
    -o-user-select: none; 
    user-select: none;
    Thanks to SeijiSensei, I can also append some provisions to my apache configuration file:
    Quote Originally Posted by SeijiSensei View Post
    Instead of placing "nofollow,noindex" in a <meta> tag on each page, you can tell Apache to send these restrictions in advance of delivering the page content like this:
    Code:
    Header set "X-Robots-Tag" "noindex,nofollow"
    To set this globally, add it to the bottom of /etc/apache2/apache2.conf right above the "IncludeOptional" directives.
    It's a dog-eat-dog security arms race out there. The internet is a fragile piece of technology. The NSA and the Chinese Communist Party could easily circumvent the protective measures I listed above. So could cyberpunk script kiddies. But I don't have a problem with that because I am pronoid (check out the Wikipedia entry for a fascinating read). Then why would I want to protect my content if I don't mind the NSA knowing everything about me? Because I want to prevent prospective employers from finding traces of my controversial political beliefs, which I intend on posting to my site, on Google. I take my privacy very seriously, in particular when it comes to employment. The listed measures should be sufficient to meet my needs. My intended audience for my site is maybe two dozen people, most of whom aren't smart enough to use Ctrl + U in Firefox or Chrome.

    What do you folks think? Is there anything else I should consider?

    Quote Originally Posted by m-dw View Post
    No-one else mentioned it, but if you want your content to remain private you could always restrict access with password protection.
    Is this password-protect feature like the one I demonstrated in my lorem ipsum post?

    If you're really paranoid you should also protect it with SSL so that no-one can pick up the encoded username/password in the http request headers.
    Does this mean that I'd have to get each user to sign up for login credentials? This is overkill, I think, for my needs. My master plan all along was to have a business card with my URL and the password (without a username). I would hand this business card out to people I meet who want a way to connect with me on fbook without having to exchange a tiny scrap of paper with my contact information. Once they get there, they can explore who I am. I wouldn't want to hassle my acquaintances with having each of them create login credentials and all that jazz.
    Last edited by Drone4four; April 1st, 2014 at 11:20 PM. Reason: grammar correction
