“One of the reasons to actually go to university is to go beyond simple vocational training and cling to deeper ideas instead.”

Let's think about this a little. A few years ago, Computer Science departments at a number of universities invited me to give lectures. Almost by chance, I asked my first audience, consisting of undergraduates, graduate students, and professors, for their definition of “Computer Science”. All of them could give only an engineering definition. I repeated this in every new place, with similar results everywhere.
Another question was: “Who is Douglas Engelbart?” Several people said: “Wasn’t he somehow connected with the computer mouse?” (This disappointed me greatly, because my research community put a lot of effort into making it possible to answer that question with two or three mouse clicks, and to establish that Engelbart really was “somehow connected” with the computer mouse.)
Part of the problem was a lack of curiosity, part was the narrowness of personal goals unrelated to learning, part was having no idea what this science even is, and so on.
For several years I have been working part-time in the Computer Science department at the University of California (in fact, I am a professor, but I do not have to attend department meetings). Periodically I teach classes, sometimes to freshmen. Over the years, the already low level of curiosity about Computer Science has dropped significantly (while its popularity has risen, since computing is seen as the road to a well-paid job if you can program and hold a degree from one of the top ten schools). Accordingly, not a single student has yet complained that C++ is the first language taught at the University of California!
It seems to me that we face a situation in which the meanings of both “Computer” and “Science” have been eroded by weak mass-market notions in order to create a new term, a kind of designer label on jeans, that sounds good but is fairly empty. A related term that was similarly destroyed is “software engineering”, which, again, did not take the best ideas from “programming” and “engineering” but simply glued the words together (this was done deliberately in the sixties, when the term was coined).
One of the reasons to actually go to university is to go beyond simple vocational training and cling to deeper ideas instead. It seems quite reasonable to me for an introductory course to try, with examples where possible, to expose students to real problems and to start them understanding what is genuinely interesting, important, and central in this field.
First graders are delighted when they are shown how one ruler placed on top of another becomes an adding machine, with which they can out-add fifth graders because it handles fractional parts too. And then they will happily take part in designing improved adding machines. They have touched a real computer: a physical and mental tool that helps us think. They have learned a genuinely effective way of representing numbers, more effective than the one taught in schools!
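The two-ruler adder can be sketched in a few lines of code (my own illustration, not from the essay): to add a and b, slide the top ruler so that its 0 mark sits at position a on the bottom ruler, then read the bottom-ruler position that lies under the top ruler's mark b.

```python
# A toy model of two rulers as an adding machine.
# Sliding the top ruler by `a` means every mark `b` on it
# lines up with position `a + b` on the bottom ruler.
# Marks may be fractional, which is what lets first graders
# "out-add" fifth graders who work only with whole numbers.

def slide_rule_add(a, b):
    top_offset = a           # the top ruler's 0 now aligns with a
    return top_offset + b    # read the bottom ruler under the top mark b

print(slide_rule_add(2.5, 1.25))  # 3.75
```

The machine itself is trivial; the point is that a physical arrangement (an offset) directly embodies an operation (addition).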
They were able to connect their intuitive idea of “adding” as “accumulating” with something similar that has powerful new properties. They programmed it so that it could solve various problems.
They also extended it. And so on. This is not a digital computer. Nor is it a stored-program computer. But it is the essence of a computer, in the same way that the Antikythera mechanism is, broadly, the essence of computers and computing.
How far can we go, and how much can we do, before everything gets out of control and we get lost in abstractions? I have always been partial to the characterization given by Alan Perlis, the first Turing Award winner, who may have coined the term “Computer Science”, and who said in the sixties: “Computer Science is the science of processes.” All processes.
For the sake of Quora, let's not push this further or turn it into religious dogma. Let's simply and happily use Al Perlis's idea to think better about our field, and especially about how to teach it. Now we need to look at the modern meaning of “science”; Perlis was quite sure that it should not be diluted with older meanings (such as “gathering knowledge”) or usages (such as “library science” or even “social science”). By “science” he meant understanding phenomena by building models/maps that try to describe, “track”, and predict those phenomena.

I have given several interviews about how the best maps and models can often fit on a T-shirt, just as Maxwell's equations do. The analogy is that there is a “science of bridges”, even though most bridges are man-made. As soon as a bridge is built, it exhibits phenomena; scientists can study them, many kinds of models of bridges can be made, and comprehensive, useful “bridge theories” can be formed. The fun is that you can then design and build new bridges (I have already mentioned that there is hardly anything more fun than scientists and engineers working together to solve big and important problems!).
Herbert Simon, winner of both the Turing Award and the Nobel Prize, called all of this “the sciences of the artificial” (and wrote a great book of the same name).
Let me give you an example. In the 1950s, companies and universities were building stored-program computers and starting to program them, and there was a special moment when Fortran appeared in 1956. It was not the first high-level language, but it was perhaps the first done well enough to be used in many different areas, including many that had previously been handled only in machine language.
All this gave rise to "phenomena".
Lisp's story is more complicated, but John McCarthy became interested in finding a “mathematical theory of computation” and was determined to make everything work cleanly. The eval function that interprets Lisp fits easily on a T-shirt! Compared with a “programming system”, it is vanishingly small. More importantly, this “theory of computation” carried a more powerful concept than Fortran. It was the better idea of a bridge!
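To make the “T-shirt sized” claim concrete, here is a minimal evaluator in the spirit of McCarthy's eval, sketched in Python rather than Lisp (this is my own illustration, not McCarthy's actual definition). Expressions are nested lists: numbers, symbols as strings, and forms like `["+", 1, 2]`, `["if", test, then, else]`, `["lambda", [params], body]`, and `["let", name, value, body]`.

```python
# A tiny Lisp-flavored evaluator: the whole interpreter is one function.

def evaluate(exp, env):
    if isinstance(exp, (int, float)):      # numbers are self-evaluating
        return exp
    if isinstance(exp, str):               # symbols are looked up in the environment
        return env[exp]
    op, *args = exp
    if op == "if":
        test, then, alt = args
        return evaluate(then if evaluate(test, env) else alt, env)
    if op == "lambda":                     # build a closure over the current env
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    if op == "let":                        # bind one name, then evaluate the body
        name, val, body = args
        return evaluate(body, {**env, name: evaluate(val, env)})
    fn = evaluate(op, env)                 # otherwise: apply a function
    return fn(*(evaluate(a, env) for a in args))

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
square = evaluate(["lambda", ["x"], ["*", "x", "x"]], env)
print(square(7))                                        # 49
print(evaluate(["let", "y", 3, ["+", "y", 4]], env))    # 7
```

Twenty lines is not a T-shirt, but it is close, and that is the point: the entire theory of evaluation is small enough to hold in your head at once.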
Lisp’s compactness allows the whole idea of programming to be grasped at a deeper level and thought through in a way that seems simply impossible when you look at huge artifacts (this is one of the reasons scientists like mathematics to be compact and powerful). The mathematics used here is a new mathematics, because it admits concepts such as “before” and “after”, and this leads to a logic of variables that change over time, one that preserves both functional dependence and logical reasoning while also admitting the passage of time. (This is still poorly understood today in the harsh world of everyday programming.)
Lisp, as a powerful programming language and a metalanguage able to express its own theory, is an example of real computer science. If you learn it and other things like it, you can think more deeply and take more responsibility for your own destiny than if you had merely learned to program in Fortran or its modern equivalents (... and you can still get along with programmers!).
You will learn much more about the special kinds of design that computing requires (these are usually underappreciated, since computing often demands going beyond its surrounding environment: one of the special properties of stored-program computing is that its material is not just stuff for a program, but stuff for a brand-new computer).
Another reason for choosing Perlis's definition is that, in general, computing is much more about building many kinds of systems than about algorithms, data structures, or even programming as such. For example, a computer is a system, computing is a system, a local network and the Internet are systems, and most programs ought to be much better systems than they are (the old style of programming has persisted since the 1950s, and many still assume programming must be like that; nothing could be further from the truth).
The Internet is a good example: unlike most programs today, the Internet does not need to be stopped in order to repair or improve something. It is, by design, more like a biological system than like what most people consider a computing system. And it is far more scalable and reliable than almost any software system in existence today. This is really worth thinking about before teaching newcomers to program with weaker concepts!
So, what we need to do in the first year of Computer Science is take into account what students can already do at the very beginning, and then try to stay within their “cognitive load” while helping them reach what is really important. It is vital to “stay real” and find approaches that are intellectually honest and suitable for those just starting out. (Please do not teach bad ideas just because they seem a bit simpler; lots of bad ideas really are simpler!)
Students should begin by building something that has many of the important characteristics discussed here. It should be a system of several dynamically interacting parts, and so on. A good test when choosing a programming language is simply whether it can be used to make something with thousands of interacting parts; if it cannot, find one that can. The worst thing you can do is set students on a path of too-weak fluency that severely limits large-scale ideas. That simply kills them, and we want to grow students, not kill them.