
Blue. No! Yellow! - or - Do new programming languages increase development speed?

What language were the very first programs written in, on the very first stored-program computers?

Binary machine language, of course.

Why?

Obviously, because there was no symbolic assembler yet. The first programs had to be written in binary code.

How much easier is it to write programs in assembler than in binary machine language?

Much easier.
Can you put a number on it? How many times easier?

Well, hell, the assembler does all the tedious "routine" work for you. That is, it computes all the physical addresses. It builds all the physical machine instructions. It makes sure you cannot emit physically impossible instructions, such as addresses outside the address space. And then it produces an easily loadable binary output.

The savings in effort are huge.

How much? Can you estimate it?

OK. If I had to write a simple program, say one that prints the squares of the first 25 integers, in assembler on an old PDP-8, it would take me about two hours. If I had to write the same program in binary machine language, it would take twice as long.

I say twice, because I would first write the program in symbolic form on paper, and then assemble the machine code by hand, also on paper. After that I would have to key the binary code into the computer by hand as well. And all that extra work would take about as much time as writing the program in the first place. Perhaps more.
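
Just for scale, here is that same toy program, the squares of the first 25 integers, sketched in a modern language (Java, chosen only because it comes up later in the conversation). The point of the comparison is that the two hours on the PDP-8 went into bookkeeping that today's toolchain does for you.

```java
// The "squares of the first 25 integers" example from the dialogue,
// written in Java purely as a point of reference.
public class Squares {
    public static void main(String[] args) {
        for (int i = 1; i <= 25; i++) {
            System.out.println(i + " squared is " + (i * i));
        }
    }
}
```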

OK, fair enough. So using a symbolic assembler cuts the work in half?

Actually, I think it is a lot more than that. Squaring integers is a fairly simple program. The bigger the program, the harder it is to assemble and load it by hand. I suspect the real gain depends on the size of the program. For large programs the time savings are enormous.

Please explain.

Well, suppose you need to change one line of a program written in symbolic assembler. On an old PDP-8 with paper tape that would take me about 20 minutes. But if I were assembling by hand, I would then have to recompute all the addresses and re-assemble all the machine instructions manually. Depending on the size of the original program, that could take hours. Keying it all in again by hand would take at least as long.

I could save some time by splitting the program into modules loaded at fixed addresses with free gaps between them. I could save a little more by writing a small program to help load the big one. But even so, the "routine" load would still be very, very high.

Good. But still, can you put a number on it? On average, how much easier does an assembler make the work compared to writing a program in binary code?

Okay. I suppose you could say about 10 times.

In other words, a symbolic assembler lets one programmer do the work of ten programmers working in binary code?

Yes, that is probably close to the truth.

If the symbolic assembler cut the effort by about a factor of ten, how much did Fortran cut it?

A fair amount. Remember, we are talking about the 50s, and Fortran was quite simple back then. In other words, it was only somewhat more than a symbolic assembler; I'm not sure whether that conveys what I mean.

Does that mean it reduced the effort by another factor of ten?

Oh, certainly not! The "routine" load of a symbolic assembler was not that high to begin with. I would say Fortran reduced the effort comparatively little. Perhaps by about 30%.

In other words, 10 Fortran programmers can replace 13 assembler programmers?

If you want to look at it that way, then yes, it seems so.
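
One way to read these headcount figures (a gloss of mine, not something stated in the dialogue): take "reduces the effort by r" to mean that one programmer in the newer language does the work of roughly (1 + r) programmers in the older one. The later figures in the conversation, 105, 107 and 10,500, follow the same pattern.

$$N_{\text{old}} \approx (1 + r)\,N_{\text{new}}, \qquad r = 0.30 \Rightarrow 1.3 \times 10 = 13, \qquad r = 0.05 \Rightarrow 1.05 \times 100 = 105.$$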

Let's continue: how much time does a language like C save compared to Fortran?

Well, C carries a little less "routine" work than Fortran. In old Fortran you had to keep track of things like statement numbers and the ordering of COMMON statements. And the code was riddled with GOTO statements. C is a far more comfortable language to program in than Fortran 1. I would say it reduced the effort by about 20%.

Good. That is, 10 C programmers can replace 12 Fortran programmers?

Well, that is only a guess, of course, but I would say a reasonable one.

Good. Now: how much did C++ reduce the effort relative to C?

Hold on, let's pause for a moment. We are overlooking a much bigger effect here.

Really? What exactly?

The development environment. I mean that in the 50s we used punched cards and paper tape. Compiling a simple program took at least half an hour, and that is assuming you could get access to the machine at all. But by the late 80s, when C++ became popular, programmers kept their programs on disk, and compiling a simple program took only two or three minutes.

Is that a reduction in effort? Or just a reduction in waiting time?

Ah, I see what you are getting at. Fair question. Yes, back then we spent a long time waiting for the machine.

A request: when you give your effort estimates, please leave out the waiting time. I am interested only in the savings that come from the language itself.

Understood, understood. So, you asked about C++. Honestly, I do not think C++ reduced the effort all that much. There was something, of course, but I would say no more than 5%. The point is that the routine load in C was already small, so the relative saving from moving to C++ could not be large.

If we take 5%, that means 100 C++ programmers can replace 105 C programmers. Is that really so?

By and large, yes. But only for small and medium-sized programs. For large programs C++ brings some additional benefits.

What kind?

It is rather hard to explain. But the point is that the object-oriented features of C++, polymorphism in particular, made it possible to split large programs into independently developed and independently deployable modules. And that, for very large programs, significantly reduces the routine load.
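
A minimal sketch of the kind of decoupling being described. The dialogue is talking about C++, but the idea carries over directly to Java, which is used here for consistency with the later examples; the names are invented for illustration. The high-level module depends only on an abstract interface, so each concrete implementation can be developed, compiled and deployed on its own.

```java
// Hypothetical example: a report module that depends only on an interface.
// The concrete formatters live in separate modules and can be swapped,
// rebuilt, and deployed without touching ReportPrinter.
interface ReportFormatter {
    String format(String body);
}

class PlainTextFormatter implements ReportFormatter {
    public String format(String body) {
        return body;
    }
}

class HtmlFormatter implements ReportFormatter {
    public String format(String body) {
        return "<html><body>" + body + "</body></html>";
    }
}

public class ReportPrinter {
    private final ReportFormatter formatter;

    ReportPrinter(ReportFormatter formatter) {
        this.formatter = formatter;   // polymorphism: any formatter will do
    }

    void print(String body) {
        System.out.println(formatter.format(body));
    }

    public static void main(String[] args) {
        new ReportPrinter(new HtmlFormatter()).print("Quarterly totals");
    }
}
```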

Can you put a number on it?

Well, you clearly intend to keep twisting my arm... Considering how many really large programs were written in the 80s and 90s, I would say that overall C++ reduced the effort by perhaps 7%.

That did not sound particularly confident.

True. But let's go with that number: 7%.

Good. So 100 C++ programmers can replace 107 C programmers?

That seems to be what I said. Let's go with it.

How much time does Java save compared to C++?

Hard to say. It saves some. Java is a simpler language. It has automatic memory management (garbage collection). It has no header files. It runs on a virtual machine. It has many virtues. And a few flaws.

What about numbers?

I have a feeling we are skipping over a lot here... But since you keep pressing me, I would say that, all other things being equal (which they never are), working in Java gives about a 5% reduction in effort compared to C++.

So 100 Java programmers can replace 105 C++ programmers?

Yes! Although, no. That is not right. The variance is too large. If we randomly picked 100 Java programmers and compared them with 105 randomly picked C++ programmers, I would not dare to predict the outcome. To see a real difference you would need far more programmers.

How much more?

At least two orders of magnitude.

In other words, 10,000 randomly selected Java programmers can replace 10,500 equally randomly selected C++ programmers?

Perhaps so.

Very good. How much does a language like Ruby reduce the effort compared to Java?

Oh dear! (sighs) What can I tell you? Look, Ruby really is a wonderful language. It is at once simple and sophisticated, elegant and quirky. It is much slower than Java, but computers are so cheap these days that...

Sorry, but that is not what I am asking about.

You're right. I know. So, the main area where Ruby takes less effort than a language like Java is types. In Java you have to build a formal type structure and keep it consistent. In Ruby you can play fast and loose with types.

That sounds like a productivity gain.

Not really. It turns out that playing fast and loose with the type structure gives rise to a class of runtime errors that simply does not occur when programming in Java. So Ruby programmers carry a heavier testing and debugging load.
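
To make the contrast concrete, here is a hypothetical Java snippet of my own, not something from the original post. The parameter types are part of the compiled contract, so a whole class of mistakes is rejected before the program ever runs; in Ruby the equivalent mistake would only show up at runtime, which is exactly the extra testing and debugging load being described.

```java
import java.util.List;

public class TypeContract {
    // The parameter type is checked at compile time; callers cannot pass
    // a list of strings here by accident.
    static int sumOfSquares(List<Integer> values) {
        int total = 0;
        for (int v : values) {
            total += v * v;
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sumOfSquares(List.of(1, 2, 3, 4, 5)));  // prints 55
        // sumOfSquares(List.of("1", "2"));  // would not compile: String is not Integer
    }
}
```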

In other words, do these effects cancel out?

It depends on who you ask.

I'm asking you.

Okay. I would say the effects do not cancel out. The effort of working in Ruby is lower than in Java.

How much? 20%?

People used to think so. Indeed, back in the 90s many believed that Smalltalk programmers were several times more productive than C++ programmers.

You're confusing me. Why bring up those languages?

Because C++ is pretty close to Java, and Smalltalk is pretty close to Ruby.

I see. So does Ruby reduce the effort several times over compared to Java?

No, most likely not. Back in the 90s the waiting-time problem was still quite pronounced. The compile time for a typical C++ program was a few minutes. The compile time for a Smalltalk program was practically zero.

Zero?

Practically, yes. The thing is that with languages like Java and C++ a lot of work goes into making all the types line up. With Smalltalk and Ruby there is no such problem. So in the 90s the difference was minutes versus milliseconds.

I see. But since all of that is just waiting time, we should not count it.

Not exactly. You see, when the compile time is nearly zero, it enables a different programming style and discipline. You can work in very short cycles: seconds instead of minutes. That gives you extremely fast feedback. With long compile times, fast feedback is impossible.

Does fast feedback reduce effort?

Yes, to an extent. When your cycles are extremely short, the "routine" load within each cycle is tiny. The amount of bookkeeping you have to keep track of shrinks. Lengthening the cycle increases the "routine" load, and it does so nonlinearly.

Nonlinearly?

Yes, the "routine" load grows disproportionately with the length of the cycle. It might grow as, say, O(N^2). I don't know. But I am fairly sure the relationship is nonlinear.

Wonderful! So Ruby wins!

No. And that is the whole point. Thanks to the hardware improvements of the past twenty years, compile times for Java have also become nearly zero. A Java programmer's cycle time is no longer (or should be no longer) than a Ruby programmer's.

Please clarify.

I am saying that programmers who follow the short-cycle discipline will see only a small difference in effort between Java and Ruby, or none at all. The difference will be so small that it will be hard to measure.

An unmeasurable difference?

I believe that to get a statistically reliable result on this difference, you would need an experiment involving thousands of programmers.

But you said earlier that Ruby reduces the effort compared to Java.

I think it does, but only when the cycle time is long. If the edit/compile/test cycle is kept very short, the effect is negligible.

Zero?

Of course not; more like about 5%. But the variance will be gigantic.

So 10,500 programmers working in short cycles in Java do the same work as 10,000 programmers working in short cycles in Ruby?

If we add another order of magnitude to the sample size, then I would venture to agree.

Are there languages superior to Ruby?

You might get another 5% from a language like Clojure, since it is, on the one hand, quite simple and, on the other, functional.

You give a functional language only 5%?

No, I am saying that the short-cycle discipline practically erases the productivity differences between modern languages.

If you work with short cycles, it hardly matters what modern language you use.

So: Swift? Dart? Go?

Irrelevant.

Scala? F#?

Irrelevant.

In other words, we have reached the top. No future language will be better than what we have now.

Not exactly. I am only saying that we are well into diminishing returns. No future language will give a tenfold gain, the way assembler did over binary code. No future language will cut the effort by 50%, or 20%, or even 10% compared to the languages we already have. The short-cycle discipline has shrunk the differences to the point where they are practically unmeasurable.

Then why do new languages keep appearing?

It is the quest for the Holy Grail.

Ah. So in the end it is really just a question of your favorite color.



Translator's note: the title of this post and its closing line refer to the scene in the film "Monty Python and the Holy Grail" in which the Knights of the Round Table must each answer three questions to cross the Bridge of Death.

Source: https://habr.com/ru/post/308550/

