Someone who writes in C will say that code written in C is shorter than code written in Java or Basic. But is it really shorter?
You may be surprised, but it turns out that if you take any two programming languages, call them A and B, there is a single constant that bounds the difference between the lengths of the shortest programs in these two languages, for all programs.
Seriously!
Let's start with a bit of notation.
For a language A and a function φ, let Length(A, φ) be the number of characters in the shortest program written in language A that computes the function φ. (This is worth pausing over.)
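To make the definition concrete, here is a toy sketch of my own (not from the post): Length(A, φ) is uncomputable in general, but we can always get an upper bound by listing a few Python programs that compute the same function (here, squaring) and taking the shortest.

```python
# A toy illustration, not a real Length() oracle: measure the shortest
# of several Python sources that all compute the squaring function.
candidates = [
    "def f(x):\n    return x * x",
    "f = lambda x: x * x",
    "f = lambda x: x ** 2",
]
shortest = min(candidates, key=len)
print(len(shortest))  # an upper bound on Length(Python, squaring)
```

Any such measurement only bounds Length from above: a still shorter program computing the same function might exist.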
Enumerate all computable functions as F(1), F(2), F(3), ... (there are infinitely many of them).
As a result, we get a single constant c such that for any i:
Length(A, F(i)) <= Length(B, F(i)) + c
and vice versa:
Length(B, F(i)) <= Length(A, F(i)) + c
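Combining the two inequalities gives exactly the claim from the opening: the difference in shortest-program lengths is bounded by one constant for every function.

```latex
\lvert \mathrm{Length}(A, F(i)) - \mathrm{Length}(B, F(i)) \rvert \le c \qquad \text{for all } i
```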
Why?
Because in language A you can write an interpreter for language B and pass a language-B program, as data, to that interpreter written in A. And vice versa. Then our constant c can be taken as the sum of the lengths of the two interpreters: the one for B written in A, and the one for A written in B.
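Here is a minimal sketch of that argument (my own toy example, not from the post). Language A is Python; language B is a hypothetical one-instruction-per-character language where 'i' increments and 'd' decrements an accumulator. The interpreter text has a fixed length, independent of whichever B program it runs, so embedding any B program costs only a constant overhead.

```python
# Interpreter for toy language B, written in language A (Python).
# Its source length is FIXED, no matter which B program it will run.
INTERPRETER = (
    "def run(p):\n"
    "    acc = 0\n"
    "    for op in p:\n"
    "        acc += 1 if op == 'i' else -1\n"
    "    return acc\n"
)

def a_program_for(b_program: str) -> str:
    """Build an A-language (Python) source: fixed interpreter + B program as data."""
    return INTERPRETER + f"print(run({b_program!r}))\n"

b_prog = "iiid"          # a program in language B: +1 +1 +1 -1
src = a_program_for(b_prog)
exec(src)                # runs the B program inside A; prints 2
# The A program is longer than the B program only by a constant
# (interpreter + quoting glue), whichever B program we embed:
print(len(src) - len(b_prog))
```

The printed difference is the same for every B program of this form, which is exactly the constant c of the argument (for this direction of the simulation).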
P.S. Don't object that an interpreter cannot be written in some particular programming language: we are considering the general case here, and in the general case it can.
Literature: Ming Li, Paul Vitányi — An Introduction to Kolmogorov Complexity and Its Applications.