
Real Unix is not an acceptable Unix.

The Unix command line is full of surprises. For example, did you know that ls, the tool most often used to list the files in the current directory, recognizes at least 38 different flags in its OS X version?



I didn't know, so I posted about it. I got a couple of replies, one of which made me wonder: is Unix itself really to blame for this?



As far as I know, neither Linux nor OS X was designed in strict accordance with the Unix philosophy. It would be hypocritical to base a critique of "Unix" solely on the Unix derivatives we happen to have today.


Still, I will try to show how many of the problems with command-line interfaces in modern Unix descendants trace back to the roots of Unix itself. In particular, I will try to explain my skepticism about the idea that a Unix command-line environment could ever support an ecosystem of programs that each do one thing well.



But I'm getting ahead of myself. Before I go there, let's take a closer look at ls and figure out exactly what it does wrong.



Doing many things, and doing them poorly



Different versions of ls recognize different sets of flags, but broadly the flags fall into a few categories. ls users use flags to:

- change how the resulting list of files is formatted as text;
- filter which files appear in the list;
- sort the list;
- transform individual entries before they are displayed.

Setting the first category aside, the other three are interesting. Fans of functional programming, in particular, may notice something familiar in them.



And indeed: each of these three categories corresponds roughly to a common higher-order function over sequences!





ls feels bloated because it really is bloated. A handful of higher-order functions could take over most of the functionality that is currently wedged into ls in the form of flags.
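To make the correspondence concrete, here is an illustrative sketch (in Python rather than shell, with made-up file names) of the work done by several typical ls flags, expressed as ordinary higher-order operations over a list of names:

```python
import os
import tempfile

# Set up a throwaway directory with one hidden and two visible files.
d = tempfile.mkdtemp()
for name in ("notes.txt", "report.txt", ".hidden"):
    open(os.path.join(d, name), "w").close()

files = os.listdir(d)

# Hiding dotfiles by default (the absence of -a) is just a filter:
visible = [f for f in files if not f.startswith(".")]

# Sorting by modification time (like -t) is just a sort with a key:
by_mtime = sorted(visible,
                  key=lambda f: os.path.getmtime(os.path.join(d, f)),
                  reverse=True)

# Any per-entry change to how a file is displayed is just a map:
shouted = [f.upper() for f in visible]

print(sorted(visible))  # ['notes.txt', 'report.txt']
```

Nothing here is specific to listing files; filter, sort-by-key, and map are general-purpose tools that any sequence-processing program could reuse.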



So what went wrong?



The idea that each program should be a small, self-contained unit of functionality is by no means new. For decades, Unix advocates have praised the virtues of pipelines: "programs" created on the fly by chaining small, composable filters one after another. How, then, do we explain the evolution of a tool as conceptually simple as ls toward ever-greater complexity?



The extreme terseness of the Unix command line suggests one possible explanation. When Unix was invented, screens were limited to about 80 characters in width, and using a computer meant sitting at a terminal and typing commands on a keyboard. In such an environment it made sense to sacrifice readability and composability in order to cram as much information as possible into as few characters as possible.



In that era, the authors of the most popular utilities abbreviated aggressively wherever possible. ls is called ls for the same reason its flags are cryptic single-character runes rather than meaningful words or, God forbid, whole phrases: it was designed for a small group of highly specialized experts, in an environment where every keystroke and every character on the screen carried a real, tangible cost.



The flags themselves, likewise, are shortcuts for the most common real-world scenarios. Why bother adding a filtering step to the pipeline to exclude hidden files from the output, when 90% of the time nobody wants to see hidden files anyway? Why display full information about each file, when 90% of the time the user only needs the names?



This way of thinking, that keystrokes are expensive and that there should be a shortcut for absolutely everything, shaped many of the problems of ls and of the Unix command-line environment in general.



"Universal Interface"



But why not write a simpler alternative to ls: a function that takes an arbitrary directory (or defaults to the working directory) and returns the list of files in it, with no flags at all? After all, nothing is easier than hacking on Unix: if you don't like ls, you can replace it.



I'll answer that question with a thought experiment. Imagine a programming language in which every function takes exactly one argument (a string) and returns exactly one result (another string).



Oh, look - such a language exists, and it is called a shell.



Unix allows programs to communicate with each other and with the user exclusively through streams of characters. You cannot write a function that returns a list of files, because the shell doesn't know what a "list" is, doesn't know what "files" are, and couldn't tell you the difference between a "function" and a "program" if its life depended on it.



Programs don't "take arguments" and don't "return values"; they read characters from stdin and write characters to stdout!
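Here is a minimal sketch (hypothetical, in Python) of what a program looks like under this contract: its entire interface is characters in, characters out.

```python
import sys

def strip_dotfiles(stdin_text: str) -> str:
    # The "argument" is whatever text arrived on stdin; the "return
    # value" is whatever text we emit. Here we drop lines that start
    # with a dot, mimicking how ls hides dotfiles by default.
    kept = [line for line in stdin_text.splitlines()
            if not line.startswith(".")]
    return "\n".join(kept)

if __name__ == "__main__":
    # Read all of stdin, write the result to stdout: the whole contract.
    sys.stdout.write(strip_dotfiles(sys.stdin.read()))
```

Saved as a script, this slots into a pipeline like any other filter (e.g. `ls -a | python strip_dotfiles.py`), precisely because the only type it knows about is "string".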



Write programs to handle text streams, because that is a universal interface.

Douglas McIlroy, "Program Design in the UNIX Environment"


The original Unix architects viewed the "simplicity" of text streams as a virtue, and so refused to impose any structure on the data passed between programs. A decision meant to avoid unnecessary complexity instead simply pushed that complexity further downstream.



Remember the first category of ls flags, the ones we could not pass off as shortcuts for standard sequence transformations? It turns out they are simply shortcuts for informally encoding lists of files as strings that certain other programs (or, in some cases, human beings) know how to parse.
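For example (a hypothetical sketch; the line of output below is made up), a downstream program that wants the size field from ls -l has to undo that informal string encoding by convention:

```python
# One line of `ls -l` output, as a downstream program receives it:
line = "-rw-r--r--  1 alice  staff  1024 Mar  1 12:00 my notes.txt"

# The "list of files" arrives as flat text, so we split it back apart
# by informal convention: whitespace-separated, at most 9 fields.
fields = line.split(None, 8)
mode, links, owner, group, size, month, day, time_, name = fields

# The file name survives only because we capped the split; nothing in
# the format itself marks where the fields end and the name begins.
print(size, name)  # 1024 my notes.txt
```

Every consumer of ls -l output has to re-implement this parsing, and each one breaks in its own way on edge cases the textual encoding never made explicit.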



The system fails to provide the abstractions its users need. In response, users reinvent them: poorly, inconsistently, and in the wrong places. This is a frustratingly common pattern.



Unacceptable Unix



Knowing the history of computing, and the constraints under which our current mental models were formed, gives us a kind of superpower: we can recognize when a once-necessary compromise has become absurd and obsolete.



Many of the usability problems Don Norman raised in his 1981 critique of Unix remain essentially unaddressed to this day. As a concession, we built graphical interfaces that keep "normal users" away from the command line, but we still expect "serious developers" to immerse themselves in a distinctly inhumane environment in order to get anything meaningful done.



Instead of re-examining the Unix command line with usability in mind, now that modern hardware no longer constrains the interface, we wrote terminal emulators that faithfully reproduce the limitations of the mid-1970s. We demand that new alternative shells stay compatible with sh, and we blindly believe that hierarchical file systems are the best way to organize information.



What are the odds that, 40 years ago, we somehow stumbled onto the best possible interface for interacting with computers? In other words, what are the odds that the way we do things today actually makes sense?



Even the earliest versions of Unix were only particular, flawed implementations of the Unix philosophy. If we want that philosophy to spread more widely, we shouldn't defend the implementation by minimizing its shortcomings. Instead, we should confront those shortcomings directly: build systems that eliminate them while staying true to the spirit, if not the letter, of the principles on which Unix was built.

Source: https://habr.com/ru/post/326176/


