the chief reason is that there are all kinds of characters that are, let's say, unprintable but otherwise completely legal in filenames, because on Linux filesystems everything except / and the null char \0 is allowed in a name. ls will do something with these chars for display purposes, e.g. print ? in their place, and as a result the name presented is not the name the file actually has.
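A quick demo of that mangling - a minimal sketch, and the exact output depends on your ls version and options (-q, -b, --quoting-style) and on whether output goes to a terminal:
Code:
# a filename with a literal tab in it, perfectly legal on Linux
touch "$(printf 'foo\tbar')"
# on a terminal ls will typically show something like foo?bar or a quoted form,
# which is not the real name
ls
# the glob hands the script the real, unmangled name
for f in *; do printf '%q\n' "$f"; done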
Another problem is cutting the output into individual names. Newline is legit in a filename, so even ignoring funky chars you are unable to resolve the ambiguity.
Let's say you have this output:
Code:
a
b
c
such output is produced in 4 cases:
Code:
1. three files: a, b, c
2. two files:   a, b\nc
3. two files:   a\nb, c
4. one file:    a\nb\nc
and you have no way of telling which scenario you are dealing with. This is an inherent problem of transforming filenames into plain text, because the crucial information about the name boundaries is lost.
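You can see it for yourself in an empty directory - a minimal sketch of case 2 (GNU ls prints names raw when its output is piped):
Code:
# case 2: two files, one of them with an embedded newline
touch 'a' "$(printf 'b\nc')"
# piped ls prints three lines - indistinguishable from cases 1, 3 and 4
ls | cat
ls | wc -l      # says 3, but there are only 2 files
# the glob still knows there are exactly 2 names
set -- *; echo "$#"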
And if you are saying you will never have filenames with funky chars or newlines - never say never. It's rather easy to create weird filenames by mistake in scripts, for example by assuming that some command always returns a single, filename-friendly line and then getting a multiline value instead. An ls based janitor script would be hard pressed to mop such files up.
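A sketch of how that happens (the log file and the pattern are made up for illustration):
Code:
# the script assumes grep matches exactly once...
id=$(grep -o 'job-[0-9]*' spool.log)
# ...but with two matches $id now contains a newline,
# and this quietly creates a file with a newline in its name
touch "report-$id"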
There are only 2 ways to support filenames in a rock solid way:
- native globs, which expand to an array-like structure with the original names and no boundary problems
- find and other tools that support the \0 char as a separator, usually combined with while read -d $'\0' - the null char is disallowed in names, so it never collides with legit chars and there is no ambiguity
Every other approach can be tricked.
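Minimal sketches of both, assuming bash (the directory and the pattern are just examples). The glob version - the shell expands the pattern into separate words, no parsing of text involved:
Code:
#!/bin/bash
shopt -s nullglob                 # skip the loop entirely if nothing matches
for f in /var/tmp/*.log; do
    printf 'found: %q\n' "$f"     # $f is the exact name, newlines and all
done
And the find version - -print0 separates names with \0, which can never occur inside a filename, so the read loop reconstructs each name exactly:
Code:
#!/bin/bash
find /var/tmp -name '*.log' -print0 |
while IFS= read -r -d '' f; do
    printf 'found: %q\n' "$f"
done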