We continue our translation of Paul Graham's essays and his book "Hackers & Painters."
At the end of the article, the technical director of the Edison company tells how they ported Lisp to C#.
"We were after the C++ programmers. We managed to drag a lot of them about halfway to Lisp."
Guy Steele, co-author of the Java specification.

Originals:
Revenge of the Nerds, May 2002
and
What Made Lisp Different, December 2001
Thanks to Yana Shchyokotova for the translation.
Start: Paul Graham: "Revenge of the Nerds", part 1
Part two
What Made Lisp Different
When Lisp was first developed, it embodied nine new ideas. Today we take some of them for granted, others can be seen only in more advanced languages, and two are still unique to Lisp. The nine ideas are listed below in the order of their adoption by the mainstream.
1. Conditionals. A conditional is an if-then-else construct. We take it for granted today, but Fortran I didn't have it. It had only a conditional goto closely based on the underlying machine instruction.
2. A function type. In Lisp, functions are a data type just like integers or strings. They have a literal representation, can be assigned to variables, can be passed as arguments, and so on.
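Not a Lisp example, but the same idea is easy to see in Python, which also treats functions as first-class values (this sketch is an illustration added by the translator, not code from the article):

```python
# Functions are ordinary values: they have a literal form (lambda), can be
# assigned to variables, passed as arguments, and returned from functions.
def apply_twice(f, x):
    return f(f(x))

double = lambda n: 2 * n      # a function literal bound to a variable
op = double                   # assigning a function to another variable
print(apply_twice(op, 3))     # 12
```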
3. Recursion. Lisp was the first programming language to support it.
4. Dynamic typing. In Lisp, all variables are effectively pointers. Types belong to values, not to variables, and assigning or binding a variable means copying a pointer, not what it points to.
5. Garbage collection.
6. Programs composed of expressions. Lisp programs are trees of expressions, each of which returns a value. This is in contrast to Fortran and most of the succeeding languages, which distinguish between "expressions" and "statements".
In Fortran I this distinction was natural, because statements could not be nested. So while you needed expressions for math to work, there was no point in making anything else return a value, because that value could not be stored anywhere.
This restriction became a thing of the past with the arrival of block-structured languages, but by then it was too late. The distinction between expressions and statements was deeply entrenched and spread from Fortran into Algol, and then into all their descendants.
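The split is still visible in Python, which inherited it: an if statement produces no value, while a conditional expression does and can therefore be nested inside a larger expression (an illustrative sketch, not from the article):

```python
# Statement form: the if/else itself yields nothing, so the result
# has to be stored in a variable as a side effect.
def sign_stmt(n):
    if n < 0:
        s = -1
    else:
        s = 1
    return s

# Expression form: the whole conditional returns a value and can be
# embedded directly wherever a value is expected.
def sign_expr(n):
    return -1 if n < 0 else 1

print(sign_stmt(-5), sign_expr(-5))  # -1 -1
```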
7. A symbol type. Symbols are in fact pointers to strings stored in a hash table. So equality can be tested by comparing pointers, rather than comparing each character.
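Python exposes the same mechanism through string interning, which is presumably why the article says Python "has a form of" this idea (a small illustration, not code from the article):

```python
import sys

# Interned strings behave like Lisp symbols: equal contents share a
# single canonical object, so equality reduces to one pointer
# comparison ("is") instead of a character-by-character scan.
a = sys.intern("velocity")
b = sys.intern("".join(["velo", "city"]))  # built at runtime, then interned
print(a is b)  # True
```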
8. A notation for code, using trees of symbols and constants.
9. The whole language always available. There is no real distinction between read-time, compile-time, and runtime. You can compile or run code while reading, read or run code while compiling, and read or compile code while it is running.
Running code at read-time lets users reprogram Lisp's syntax. Running code at compile-time is the basis of macros. Compiling at runtime is the basis of Lisp's use as an extension language in programs like Emacs. And reading at runtime enables programs to communicate using S-expressions, an idea recently reinvented as XML.
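To make "reading at runtime" concrete, here is a minimal S-expression reader sketched in Python (an illustration added by the translator; function names are made up): it turns textual S-expressions into nested lists, the same shape Lisp code itself has.

```python
def tokenize(src):
    """Split an S-expression string into parenthesis and atom tokens."""
    return src.replace("(", " ( ").replace(")", " ) ").split()

def read(tokens):
    """Read one S-expression from a token list into nested Python lists."""
    tok = tokens.pop(0)
    if tok == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(read(tokens))
        tokens.pop(0)          # drop the closing ")"
        return expr
    return int(tok) if tok.lstrip("-").isdigit() else tok

print(read(tokenize("(+ 1 (* 2 3))")))  # ['+', 1, ['*', 2, 3]]
```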
When Lisp first appeared, these ideas were far removed from ordinary programming practice, which was dictated largely by the hardware available in the late 1950s. Over time, the default language, embodied in a succession of popular languages, has gradually evolved toward Lisp. Ideas 1-5 are now widespread. Idea 6 is only beginning to appear in the mainstream. Python has a form of idea 7, though there doesn't seem to be any syntax for it.
As for idea 8, it may be the most interesting of them. Ideas 8 and 9 became part of Lisp only by accident, because Steve Russell implemented something McCarthy had never intended to be implemented. And yet these ideas turned out to be responsible both for Lisp's strange appearance and for its most distinctive features. Lisp looks strange not so much because it has a strange syntax as because it has no syntax at all: you write programs directly in the parse trees that in other languages get built behind the scenes during parsing, and these trees are made of lists, which are Lisp data structures.
Expressing the language in its own data structures turns out to be a very powerful feature. Ideas 8 and 9 together mean that you can write programs that write programs. That may sound like nonsense, but it is an everyday thing in Lisp. The most common way to do it is with macros.
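A rough Python analogue of "programs that write programs" (far weaker than Lisp macros, since the generated program is a string rather than a tree, but it shows the flavor; all names here are invented for the illustration):

```python
# Build source code as data at runtime, then compile and run it.
def make_adder_source(n):
    # The program below is constructed, not written by hand.
    return f"def add_{n}(x):\n    return x + {n}\n"

namespace = {}
exec(compile(make_adder_source(5), "<generated>", "exec"), namespace)
print(namespace["add_5"](10))  # 15
```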
Macros (in the Lisp sense) are still, as far as I know, unique to Lisp. This is partly because in order to have macros you probably have to make your language look as strange as Lisp. It may also be because if you do add that final increment of power, you can no longer claim to have invented a new language, but only a new dialect of Lisp.
I present this mostly as a joke, but there is some truth in it. If you define a language that has car, cdr, cons, quote, cond, atom, eq, and a notation for functions expressed as lists, then you can build all the rest of Lisp out of it. That is, in fact, the defining quality of Lisp: it was in order to make Lisp this way that McCarthy gave it the shape it has.
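To sketch that claim: given just those primitives over nested lists, a few dozen lines suffice for an evaluator. Below is a toy version in Python (a loose illustration of McCarthy's idea added by the translator, not a faithful Lisp; the function name and environment handling are invented):

```python
def eval_sexp(x, env):
    """Evaluate a toy Lisp expression given as nested Python lists."""
    if isinstance(x, str):                    # a symbol: look it up
        return env[x]
    if not isinstance(x, list):               # a constant (e.g. a number)
        return x
    op = x[0]
    if op == "quote":                         # (quote e) -> e, unevaluated
        return x[1]
    if op == "atom":                          # true if the value is not a list
        return not isinstance(eval_sexp(x[1], env), list)
    if op == "eq":
        return eval_sexp(x[1], env) == eval_sexp(x[2], env)
    if op == "car":
        return eval_sexp(x[1], env)[0]
    if op == "cdr":
        return eval_sexp(x[1], env)[1:]
    if op == "cons":
        return [eval_sexp(x[1], env)] + eval_sexp(x[2], env)
    if op == "cond":                          # (cond (p1 e1) (p2 e2) ...)
        for test, branch in x[1:]:
            if eval_sexp(test, env):
                return eval_sexp(branch, env)
        return None                           # no clause matched
    if op == "lambda":                        # functions expressed as lists
        params, body = x[1], x[2]
        return lambda *args: eval_sexp(body, {**env, **dict(zip(params, args))})
    fn = eval_sexp(op, env)                   # otherwise: function application
    return fn(*[eval_sexp(arg, env) for arg in x[1:]])

prog = ["cons", ["quote", "a"], ["cdr", ["quote", ["x", "b", "c"]]]]
print(eval_sexp(prog, {}))  # ['a', 'b', 'c']
```

Note that the program being evaluated is itself a list, which is exactly the property ideas 8 and 9 describe.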
Where Languages Matter
So suppose Lisp does represent a kind of limit that mainstream languages are approaching asymptotically. Does that mean you should actually use it to write software? How much do you lose by using a less powerful language? Isn't it wiser, sometimes, to stay away from the cutting edge? And isn't popularity to some extent its own justification? Isn't the ignorant boss right, for example, to use a language for which he can easily hire programmers?
Of course, there are projects where the choice of programming language doesn't matter. As a rule, the more demanding the application, the more leverage you get from using a powerful language. But many projects are not demanding at all. Most programming probably consists of writing little glue programs, and for those you can use any language you already know that has good libraries for whatever you need to do. If you just need to feed data from one Windows application to another, sure, use Visual Basic.
You can write glue programs in Lisp too (I use it as a desktop calculator), but the biggest win from languages like Lisp is at the other end of the spectrum, where you need to write sophisticated programs to solve hard problems in the face of fierce competition. A good example is the airline fare search program that ITA Software licenses to Orbitz. These guys entered a market already dominated by two big, entrenched competitors, Travelocity and Expedia, and seem to have simply humiliated them technologically.
The core of ITA's application is a 200,000-line Common Lisp program that searches many orders of magnitude more possibilities than its competitors, who apparently still use mainframe-era programming techniques. (Though ITA is also, in a sense, using a mainframe-era programming language.) I have never seen a single line of ITA's code, but according to one of their top specialists they use a lot of macros, which does not surprise me at all.
To be continued. (Anyone who wants to help translate Paul Graham's articles, please message me.)
A joke (not) invented by the programmer who ported Lisp
A programmer sits deep in debugging. His son walks up:
- Dad, why does the sun rise in the east every day and set in the west?
- Did you check that?
- I checked.
- Did you test it well?
- Well.
- Does it work?
- It works.
- Does it work every day?
- Yes, every day.
- Then for God's sake, son, don't touch anything, don't change anything!
Lisp was widely used to build expert systems on punch cards, back when programming was just entering industry and defense. As a result, there are "pieces" of infrastructure with super-critical functions that people are simply afraid to touch, because a failure could ripple across the whole country.
The interface, the wrapper, the math: all of that can be rewritten. But rewriting the expert system carries great political, organizational, and reputational risks and responsibility if something goes wrong as a result.
Imagine a server with huge computational power running an algorithm written for 30-year-old, low-performance hardware (a math coprocessor? never heard of it). And you can scale it only by bluntly adding more power, because the code itself was hand-tuned over decades.
So, the CEO of Edison: We ported to C# a computation module written long ago by a Lisp developer. The module consisted partly of an expert system and partly of mathematics (Fourier transforms, statistics). Lisp had been chosen because the algorithm's creator knew only Lisp; they said he had spent almost his whole life refining it.
The reason for the migration: the old Lisp compiler in use did not deliver high performance. For many years the Lisp calculations were packaged as a library, glued to modern programs, and invoked like this: the input matrices were handed to the library, it thought for a long time, and then returned the result. Now the customer wanted to use new Windows features and speed up the computation. Nevertheless, the code had to remain logically unchanged, so as not to break the "heuristic" part.
The solution consisted of two parts:
- a "literal" rewriting of the code, which in places contained very poor-quality, even clumsy expressions, as if written in haste;
- optimization, accompanied by testing of the new library. When the algorithm was fed 30 different input data schemes, our task was to make sure that the results produced by the old (Lisp) and new (C#) libraries were identical.
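A cross-implementation regression check of this kind can be sketched as follows (a hypothetical harness in Python; the article gives no details of the actual test setup, and the implementation names here are stand-ins):

```python
import math

def compare_outputs(schemes, old_impl, new_impl, rel_tol=1e-9):
    """Feed every input scheme to both implementations; collect mismatches."""
    mismatches = []
    for scheme in schemes:
        old, new = old_impl(scheme), new_impl(scheme)
        same_len = len(old) == len(new)
        close = all(math.isclose(a, b, rel_tol=rel_tol)
                    for a, b in zip(old, new))
        if not (same_len and close):
            mismatches.append(scheme)
    return mismatches

# Stand-ins for the real Lisp and C# libraries (illustration only):
legacy_lisp = lambda xs: [x * 2.0 for x in xs]
ported_csharp = lambda xs: [x + x for x in xs]
print(compare_outputs([[1.0, 2.5], [0.0]], legacy_lisp, ported_csharp))  # []
```

Floating-point results from two different compilers rarely match bit for bit, which is why the comparison uses a tolerance rather than strict equality.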
To give a sense of the coverage: the system is deployed across the country and processes tens of thousands of objects.