Originally Posted by DuckHook
I have personally conducted a similar experiment in a lesser and unsystematic way. If you install NoScript on FF and turn off absolutely all scripting, the Internet essentially becomes unusable. There are no sites these days that are scriptless.
I've had a different experience. Most sites will display the content I want without any scripting enabled. For example, ubuntuforums includes googletagmanager.com, but works fine without allowing any JavaScript from, or any connection to, that site. In fact:
Code:
$ ping googletagmanager.com
PING googletagmanager.com (127.0.0.1) 56(84) bytes of data.
64 bytes from localhost (127.0.0.1): icmp_seq=1 ttl=64 time=0.040 ms
64 bytes from localhost (127.0.0.1): icmp_seq=2 ttl=64 time=0.056 ms
On my current system that name doesn't even resolve to a real address; as the ping above shows, it goes straight to localhost. I block many domains at the network layer, not just through DNS, and the web still works fine for the most part; even casual Google use works. Take a site I'll call "xyz123". It is all about tracking and pulls in 13 third-party tracking includes. All of those are prevented from running; I only need to allow xyz123 and xyz123cdn for the site itself to work. Those 13 other trackers? Meh. The same thing happens over and over and over: allow only the parts that actually need JavaScript to display the content you want, and nothing else.
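For anyone who wants to try the same thing, here is a rough sketch of what host-level plus network-layer blocking can look like. The domain list, table name, and address range below are placeholders, not my actual configuration:
Code:
# /etc/hosts entries: point tracker hostnames at localhost so lookups never
# reach the real servers (this is what produces the 127.0.0.1 ping above)
127.0.0.1   googletagmanager.com
127.0.0.1   www.googletagmanager.com

# Network-layer blocking, sketched with nftables: drop outbound traffic to
# known tracker addresses even if a program ignores /etc/hosts entirely
sudo nft add table inet blocklist
sudo nft add set inet blocklist trackers '{ type ipv4_addr; flags interval; }'
sudo nft add chain inet blocklist out '{ type filter hook output priority 0; }'
sudo nft add rule inet blocklist out ip daddr @trackers drop
sudo nft add element inet blocklist trackers '{ 203.0.113.0/24 }'   # example range only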
After a few sites, you learn which domains never need to be allowed and which are worth banning outright.
Another trick is to use the old mobile version of a website, the kind built for phones that didn't support JavaScript at all. All the content, fast, and none of the extra junk.
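As a rough example of that trick (m.example.com is just a stand-in; each site names its mobile version differently, and some have retired them):
Code:
# Fetch the lightweight mobile page directly; no JavaScript engine involved
$ curl -sL -A "Mozilla/5.0 (Linux; Android 10)" https://m.example.com/article
# Or read it in a text-mode browser, which ignores scripts entirely
$ lynx https://m.example.com/article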
How much tracking are we willing to accept? Everyone has a different answer, but we certainly don't need to blindly allow everything. It is always a negotiation. Always.