I don't know of any way to flush the L1 or L2 cache from a CPU. Sorry.
I still think that the best answer was what I suggested in post #6 - prevent **anything** from being written to begin with. If it is never on disk, then you don't have to delete it.
But ... basic bash scripting isn't hard. There are subtleties that are hard to explain in a post, so it's best to share a screen with some people who know how to do it and talk through the decisions.
Beginning Bash scripting Guide: https://tldp.org/LDP/Bash-Beginners-...ers-Guide.html
Advanced Bash Scripting Guide: https://tldp.org/LDP/abs/html/
Also, your systems have different programs than my systems have, so the caches on my system are different from those on yours. There's no way any of us can tell you, with 100% certainty, how to find each program, then find its documentation (if any exists) and say that its disk cache is always in the /x/y/z directory. There are many, many things that can change the locations. Different DEs can use different cache locations - and depending on the release of each program, each DE, and each user's settings, the cache directory for each program can be in a different place.
With all that said, you can look for directories with "cache" in their name. There are system caches and per-userid caches. System caches, say for APT, are under /var/cache (APT keeps downloaded packages in /var/cache/apt/archives). I honestly don't know where system caches are for snap packages. For userids, 99.999% of the time, the default locations would be under the userid's $HOME directory, which is held in the HOME environment variable, set at login time, based on this command:
Code:
getent passwd {userid}
The man page for the passwd file (not the command) explains which field holds the HOME directory setting, but most people will recognize it among the :-separated fields. I could head off on a tangent about the passwd file, LDAP databases, and how /home/ isn't mandatory, just common on home Linux systems. In a corporate environment with hundreds of users, I'd be surprised if /home/ were used on any Unix workstation. It just isn't done.
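For example, the home directory is field 6 of the :-separated passwd entry, so you can pull it out with cut. This is just a sketch; "$USER" is a stand-in for whatever userid you care about.
Code:
```shell
# Print field 6 (the home directory) of the ':'-separated passwd entry.
getent passwd "$USER" | cut -d: -f6
```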
There are probably GUI tools that will clear the big cache directories for the top 5 popular programs. But for the 20,000 other programs, who knows where they put their files? In my own code, I would always put cache files under /tmp/{userid}+{PID}/ and clean them up when the program closed, for any reason. /tmp gets automatically cleaned up at reboot as well, so end-users rarely need to deal with it, unless their systems run for a month. That can happen; here's one of my systems:
Code:
$ uptime
22:24:42 up 21 days, 9:35, 2 users,
But that box got patched today and needs to be rebooted. All my systems need to be rebooted after today's patches. I know this because the file /var/run/reboot-required exists.
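The /tmp/{userid}+{PID} cache pattern mentioned above could be sketched like this - an assumption-laden sketch that uses mktemp instead of hand-building the name, plus a trap so cleanup happens however the script exits:
Code:
```shell
#!/bin/bash
# Private, per-process scratch/cache directory under /tmp.
# mktemp avoids name collisions; the trap removes the directory
# on any exit - normal, error, or interrupt.
cachedir=$(mktemp -d "/tmp/$(id -un)-$$.XXXXXX")
trap 'rm -rf "$cachedir"' EXIT

# ... the program would write its cache files into "$cachedir" here ...
echo "scratch data" > "$cachedir/example.tmp"
```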
Anyway, after you find all the cache directories, the script would just be rm commands.
Code:
#!/bin/bash
# Remove the files directly inside each cache directory (not subdirectories).
# -f keeps rm quiet when a glob matches nothing.
/bin/rm -f /path/to/cache/directory/*
/bin/rm -f /path/to/cache/directory2/*
/bin/rm -f /another/path/to/different/cache/*
Is that helpful? Be certain to give the file you put those commands into execute permissions.
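Setting execute permissions is just chmod. "clean-caches.sh" here is a hypothetical filename; this little demo creates a trivial script, marks it executable, and runs it.
Code:
```shell
# Create a tiny placeholder script, make it executable, then run it.
printf '#!/bin/bash\necho cache cleanup would run here\n' > clean-caches.sh
chmod +x clean-caches.sh
./clean-caches.sh
```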
Another way would be to do this:
Code:
#!/bin/bash
/bin/rm -f /path/to/cache/directory/* /path/to/cache/directory2/* /another/path/to/different/cache/*
And another way that would only delete files over 1 day old is this:
Code:
#!/bin/bash
/usr/bin/find /path/to/cache/directory /path/to/cache/directory2 /another/path/to/different/cache -type f -mtime +1 -delete
Be careful with these commands. Any files matched, either by the globbing or by 'find', will be removed. If files you didn't intend to match get caught, they'll be removed too. I didn't show recursive removal in the first 2 example scripts, so any damage done would be limited. But the last script, the one using 'find', can wipe out millions of files if the paths are bad. Be very careful. Of course, the find in my example lists 3 directories to search in 1 command, but you could split that up into 3 find commands, each with 1 directory.
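One way to do that split, sketched with the same placeholder paths as above: loop over the directories one at a time and skip any path that doesn't exist, so a typo can't send find somewhere unintended.
Code:
```shell
#!/bin/bash
# Clean each cache directory separately; silently skip paths that
# don't exist, so a bad path can't widen the search.
for dir in /path/to/cache/directory /path/to/cache/directory2 /another/path/to/different/cache
do
    [ -d "$dir" ] || continue
    /usr/bin/find "$dir" -type f -mtime +1 -delete
done
```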
My scripting 101 article: https://blog.jdpfu.com/2014/04/01/li...-101-scripting
Ok, a quick concrete example.
Code:
#!/bin/bash
/usr/bin/find /u/thefu/.cache -type f -mtime +1 -delete
Let's see how much damage I could do with that script ....
Code:
$ /usr/bin/find ~/.cache -type f -mtime +1|wc -l
/usr/bin/find: ‘~/.cache/dconf’: Permission denied
13064
So, there are over 13,000 files in my personal "cache" directory that were last modified over 1 day ago. How bad would it be to delete all of those? Is it safe to delete files while they are actively being used? I obviously have a browser open right now, and I can see many, many browser cache files in there; some of them are certainly open. That's why I want the last-modified time to be at least 1 day ago. 'find' has many, many, many options to home in on exactly the files or directories we seek. This is one of the few commands where a "top 50 find examples" webpage is extremely helpful.
BTW, in a script, never use ~/ and never rely on $HOME. Those may not be set the way you expect when the script runs from crontab, and we really don't want a script to "find" anywhere we didn't intend. What would happen if / was "found" and we started deleting all the files not modified in the last day? There's good news: Unix permissions should prevent a normal userid from deleting system files, but every file in our HOME directory would likely get deleted. I'm just trying to convey what could happen.

In 1996, I was a new Unix admin and wrote a little clean-up script. Because it needed to delete old files from many different users, I ran it as root. It found ./, which happened to be / at the time, and started deleting the entire OS on a relatively new server. I was fortunate that the day prior I'd made a full system backup to tape, but I didn't know how to restore it yet. I ended up running to the local Microcenter and buying the book Essential System Administration. I gave away that 1996 edition long ago, but the 3rd edition is still within arm's reach.
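One defensive habit that guards against exactly that failure: when the target path comes from a variable, use bash's ${var:?} expansion so the script aborts instead of letting an unset or empty variable collapse the path. CACHE_DIR is a hypothetical variable here, not anything standard.
Code:
```shell
#!/bin/bash
# ${CACHE_DIR:?} makes bash exit with an error if CACHE_DIR is unset
# or empty, so the rm below can never expand to "/*".
CACHE_DIR="/path/to/cache/directory"
/bin/rm -f "${CACHE_DIR:?}"/*
```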
Anyway, be careful deleting files in a script. For the first few runs, use a non-destructive command, perhaps 'ls' rather than delete, and look carefully at the files it lists.
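With find, a simple way to do that dry run is to use -print first and only switch to -delete after the list looks right. The path below is the same placeholder as in the earlier examples.
Code:
```shell
# Dry run: list what WOULD be deleted; nothing is removed yet.
/usr/bin/find /path/to/cache/directory -type f -mtime +1 -print
# When the list looks right, replace -print with -delete.
```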