Hello all,
So I'm writing a little shell script to perform all the actions needed to update my Android phone. This is relatively laborious because the phone runs an unofficial build of LineageOS, so many operations have to be performed manually with every update: restoring contacts and messages, reinstalling apps, and so on. I've got the core functions of the script sorted, but I still have to download the most recent build, as well as all my apps, manually with a web browser. Ideally I'd like wget to automatically retrieve the latest versions of the files I need when the script runs, but there are a couple of problems.
Because each new LineageOS build is posted to the maintainer's Google Drive as a new file, rather than as an updated version of the same file, I think I'm right in thinking that wget's '--timestamping' feature won't work (unless I download the entire repo, in which case it would work, but that would be a rubbish workaround). This isn't such an issue, as the bulk of the manual work is actually downloading all my apps, but the same problem applies there. For example, this page is where I download Firefox for Android, but I have no idea how I'd make wget download the latest version while intelligently choosing the correct CPU architecture for my device. I wondered whether there was some way to call wget from within the script and perhaps use regular expressions to select the correct download link (assuming some sort of consistent naming convention is in use).
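For what it's worth, the link-scraping idea can be done with nothing more than wget, grep, and sort. A minimal sketch, assuming the download page is plain HTML and the APK filenames embed the architecture as something like "arm64-v8a" (both of those are assumptions; check the real page's markup and naming convention first):

```shell
#!/bin/bash
# Hypothetical sketch: pick the newest arm64 APK link out of a download
# page's HTML on stdin. The "arm64-v8a" substring and .apk suffix are
# assumed naming conventions, not verified against any real page.
latest_apk_url() {
    grep -Eo 'href="[^"]*arm64-v8a[^"]*\.apk"' \
        | sed 's/^href="//; s/"$//' \
        | sort -V \
        | tail -n 1
}

# Usage (hypothetical URL):
#   url=$(wget -qO- 'https://example.org/firefox-downloads' | latest_apk_url)
#   wget "$url"
```

The `sort -V` is GNU's version-aware sort, so the last line is the newest release even when version numbers cross a digit boundary (1.9 vs 1.10).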
Basically, I don't understand how wget interacts with the webpage/server, and I have no idea where to begin. Perhaps I can't do this with Bash/wget at all and it would require another scripting language? I'd love to hear what the options might be from someone who understands it a little better.
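You can do this in Bash: wget just issues an ordinary HTTP GET, so `wget -qO- URL` hands you the page's HTML on stdout, and from there it's plain text processing. As for '--timestamping' not working because every build is a brand-new file, one workaround is to record the name of the last file you fetched and only download when the name changes. A rough sketch (the stamp-file convention and the URL are made up for illustration):

```shell
#!/bin/bash
# Hypothetical "download only if new" check, standing in for
# --timestamping when each release is a new file rather than an
# updated one. $1 = URL of the latest build, $2 = stamp file that
# remembers the last filename we downloaded.
fetch_if_new() {
    url=$1
    stamp=$2
    name=${url##*/}                      # filename part of the URL
    last=$(cat "$stamp" 2>/dev/null)
    if [ "$name" != "$last" ]; then
        wget -q "$url" && printf '%s\n' "$name" > "$stamp"
    fi
}
```

If the Drive listing itself has to be scraped to discover the newest filename, the same grep-a-link approach applies, though Google Drive pages are JavaScript-heavy and may not yield to plain wget; a direct link or mirror from the maintainer would be much easier to script against.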