
Programming on the Post machine

Recently, two articles about languages from the “Big Four of Turing tarpits” appeared on Habr: one about Markov algorithms and one about Brainfuck. For completeness, I think it would be interesting to compare these esoteric systems with another important algorithmic primitive, the Post machine, which is what I will do here.

The Post machine (wiki; for simplicity, the variant of the syntax described there is used) is similar to the well-known Turing machine, but it has a few interesting features. It has only 6 commands, and each memory cell can hold only one of 2 states, a mark or a blank (information is coded in binary). Naturally, there is no additional memory either; it is not called esoteric for nothing!
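To make the command set concrete, here is a minimal interpreter sketch in Python (the representation and the name run_post are mine, not something from the article or the wiki). It assumes the syntax used in the listing below: ← and → move the head, 1 sets a mark, 0 erases it, ? a, b jumps to line a if the current cell is empty and to line b if it is marked, and ! stops the machine.

# A minimal Post machine interpreter (a sketch; the representation is mine).
# A program is a dict {line_number: instruction}, where an instruction is:
#   ('R', nxt) / ('L', nxt)  - move the head right/left, then go to line nxt
#                              (None means "fall through to line + 1")
#   ('1', nxt) / ('0', nxt)  - set / erase the mark under the head
#   ('?', a, b)              - go to line a if the cell is empty, to b if marked
#   ('!',)                   - stop
def run_post(program, marks, head=0, start=1, max_steps=100_000):
    marks = set(marks)                  # positions of marked cells
    line = start
    for _ in range(max_steps):
        instr = program[line]
        op = instr[0]
        if op == '!':
            return marks, head
        if op == '?':
            line = instr[2] if head in marks else instr[1]
            continue
        if op == 'R':
            head += 1
        elif op == 'L':
            head -= 1
        elif op == '1':
            marks.add(head)
        else:                           # op == '0'
            marks.discard(head)
        line = instr[1] if instr[1] is not None else line + 1
    raise RuntimeError("step limit exceeded")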

Thus, when programming the Post machine, besides coping with the Occam-like austerity of the syntax, you have to think about how to record all intermediate results on the tape without losing, along the way, the path back to what is left of the input data. Why "what is left"? Because, owing to the lack of additional memory, the input often has to be consumed iteratively (and sometimes recursively), piece by piece. I hope the above convincingly shows that writing familiar algorithms for the Post machine is a good warm-up for the brain and quite an engaging activity.


Example


Consider one of the shortest implementations of multiplication of two natural numbers. The numbers n and m are written on the tape in unary, separated by a single empty cell. The input/output of the algorithm looks as follows (the initial position of the head is marked):
   v
..01110110..  →  ..01111110..
   3 * 2 = 6
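For experiments it is handy to build such tapes programmatically. Here is a small sketch in the same set-of-marked-cells representation as run_post above (the helper name encode_factors is mine):

# Unary encoding of the two factors: n marks, one empty cell, m marks.
# The head is assumed to start on cell 0, the first mark of the left factor.
def encode_factors(n, m):
    return set(range(n)) | set(range(n + 1, n + 1 + m))

print(sorted(encode_factors(3, 2)))   # [0, 1, 2, 4, 5]  ~  ..01110110..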


The idea of the algorithm is multiplication by repeated addition. On each pass of the loop the machine “bites off” one mark from the left factor and “copies” the rightmost available block (at first this is the second factor, later its latest copy). When the left factor is exhausted, n blocks of m marks each are left on the tape. Merging them gives the desired number n * m.
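Restated in plain Python (my paraphrase of the idea, not the machine itself): every full pass of the main loop appends one more copy of the rightmost block, and the truncated final pass only detects that the left factor is exhausted.

# After the loop the tape holds n blocks of m marks each;
# merging them leaves n * m marks in total.
def block_sizes_after_loop(n, m):
    blocks = [m]                    # the second factor is the first block
    for _ in range(n - 1):          # n - 1 copying passes
        blocks.append(blocks[-1])   # copy the rightmost block
    return blocks

print(sum(block_sizes_after_loop(3, 2)))   # 6 == 3 * 2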

Code

  0. !        - stop command; execution starts from line 1
  1. 0        - main loop
  2. →
  3. ? 29, 4  - 29: the left factor is exhausted
  4. →
  5. ? 6, 4
  6. →
  7. ? 8, 4
  8. ←
  9. ←
 10. 0        - copying the rightmost block
 11. →
 12. ? 13, 11
 13. →
 14. ? 15, 13
 15. 1
 16. ←
 17. ? 18, 16
 18. ←
 19. ? 20, 18
 20. 1
 21. ←
 22. ? 23, 10 - end of the copying block
 23. ←
 24. ? 25, 23
 25. ←
 26. ? 27, 23
 27. →
 28. → 1
 29. →        - merging the copies of the second factor
 30. 0
 31. →
 32. ? 33, 31
 33. 1
 34. →
 35. ? 0, 36  - exit from the program
 36. ←
 37. ? 29, 36


You can check that the algorithm is correct in your head, on paper, or with the help of this program.
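Another way to check it is to feed the listing to the run_post sketch above. Below is a transcription into that representation (the encoding is mine; the line numbers and jump targets are exactly those of the listing, with R and L standing for the arrows), run on the 3 * 2 example built with encode_factors:

MULT = {
     0: ('!',),
     1: ('0', None),  2: ('R', None),  3: ('?', 29, 4),
     4: ('R', None),  5: ('?', 6, 4),  6: ('R', None),  7: ('?', 8, 4),
     8: ('L', None),  9: ('L', None),
    10: ('0', None), 11: ('R', None), 12: ('?', 13, 11),
    13: ('R', None), 14: ('?', 15, 13), 15: ('1', None),
    16: ('L', None), 17: ('?', 18, 16), 18: ('L', None), 19: ('?', 20, 18),
    20: ('1', None), 21: ('L', None), 22: ('?', 23, 10),
    23: ('L', None), 24: ('?', 25, 23), 25: ('L', None), 26: ('?', 27, 23),
    27: ('R', None), 28: ('R', 1),
    29: ('R', None), 30: ('0', None), 31: ('R', None), 32: ('?', 33, 31),
    33: ('1', None), 34: ('R', None), 35: ('?', 0, 36),
    36: ('L', None), 37: ('?', 29, 36),
}

marks, head = run_post(MULT, encode_factors(3, 2))
print(len(marks))      # 6 == 3 * 2
print(sorted(marks))   # six consecutive cells: the merged result block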

This is the shortest implementation of multiplication known to me. However, it could potentially be shortened even further if one figures out how to economically combine the creation of the copies with merging them into a single block.

If you are interested in the topic, I suggest thinking about the following tasks:


P.S. By “the Big Four” I mean the Turing machine, the Post machine, Markov algorithms, and Brainfuck: the most studied Turing tarpits.

Source: https://habr.com/ru/post/111272/

