Description:
A simple Python 2 script: the first time it runs it downloads a copy of the web page for each URL listed in a file; on every later run it compares the saved copies against the live pages. If a change is detected, it opens the changed pages in Firefox. Run the script with the "-v" option for more verbose output.
To set up the script from your bash terminal:
mkdir ~/.checkurls; nano ~/.checkurls/checkurls.py
Copy and paste the code below into the file and save it with Ctrl+X, then Y, then Enter.
Substitute your actual username for the $USER placeholder in the script: sed -i "s/\$USER/$USER/g" ~/.checkurls/checkurls.py
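For example, if your username were alice (a made-up name here), the savepath line in the script would end up as:
Code:
savepath = '/home/alice/.checkurls/'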
Create a symlink so it's on your PATH: sudo ln -s ~/.checkurls/checkurls.py /usr/bin/checkurls
Make it executable and create the URL list: sudo chmod a+x /usr/bin/checkurls; nano ~/.checkurls/urls.txt
Then enter some URLs, one per line, into the file and you're done! It's not hard and only takes a few seconds. n_n
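For example, ~/.checkurls/urls.txt might look like this (these addresses are only placeholders; list whatever pages you want to watch):
Code:
www.ubuntu.com
http://www.example.com/news
example.org/blog
After that, run checkurls once to take the first snapshots, or checkurls -v to watch it work.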
Script:
Code:
#!/usr/bin/env python
import os, sys, urllib

savepath = '/home/$USER/.checkurls/'
errormsg = '%s: %s: No such file or directory'
urlspath = savepath + 'urls.txt'

# Bail out if the save directory or the URL list is missing.
for path in savepath, urlspath:
    if not os.path.exists(path):
        print errormsg % (__file__, path)
        sys.exit(0)

urlfile = open(urlspath, 'r').readlines()
urlstring = ''

for url in urlfile:
    # Normalise the URL: add the scheme if missing and strip the newline.
    if 'http://' not in url:
        url = 'http://' + url
    url = url.replace('\n', '')
    # Saved copies are named after the URL, with '/' escaped as '%2f'.
    filename = url.replace('/', '%2f')
    if '-v' in sys.argv:
        print '%s: %s ...' % (__file__, url)
    # First run for this URL: just download a reference copy.
    if not os.path.isfile(savepath + filename):
        urllib.urlretrieve(url, savepath + filename)
    # Compare the saved copy with the live page.
    filelines = open(savepath + filename, 'r').readlines()
    urllines = urllib.urlopen(url).readlines()
    if filelines != urllines:
        # The page changed: update the saved copy and remember the URL.
        open(savepath + filename, 'w').writelines(urllines)
        urlstring += '"' + url + '" '

if urlstring:
    # Open all of the changed pages in Firefox.
    os.popen('firefox ' + urlstring)
elif '-v' in sys.argv:
    print '%s: nothing for today' % __file__

Making the script run when your computer starts:
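If you want the check to happen automatically when you log in to your desktop, one possible approach (a minimal, untested sketch assuming a freedesktop-style desktop such as GNOME or Xfce; the file name is just an example) is to drop an autostart entry into ~/.config/autostart that launches the checkurls command created by the symlink above:
Code:
# Example file: ~/.config/autostart/checkurls.desktop
[Desktop Entry]
Type=Application
Name=checkurls
Comment=Check saved URLs for changes at login
Exec=checkurls
Terminal=false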



