
For the first time in Russian: Nicholas Carr. Does Google make us dumber?

I really liked the article; here is the source: www.theatlantic.com/magazine/archive/2008/07/is-google-making-us-stupid/6868
The material is almost three years old, but it is still relevant.

Nicholas Carr. Is Google Making Us Stupid? (The Atlantic, July/August 2008)
Translation: Alina Lepeshkina

“Dave, stop. Stop, will you? Stop, Dave. Will you stop?” So the supercomputer HAL pleads with the implacable astronaut Dave Bowman in the famous and extraordinarily dramatic scene of Stanley Kubrick's film 2001: A Space Odyssey. Bowman, having nearly been sent to his death in the depths of space by the malfunctioning machine, calmly and coldly disconnects the memory circuits that control its artificial brain. “Dave, my mind is going,” HAL says forlornly. “I can feel it. I can feel it.”
I can feel it too. Over the past few years I've had the uncomfortable sense that someone, or something, has been tinkering with my brain, remapping my neural circuitry, reprogramming my memory.

It's not that I'm losing my mind (at least I don't think so), but it's changing. I don't think the way I used to think. I feel it most strongly when I'm reading.
Immersing myself in a book or a long article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I could spend hours strolling through long stretches of prose. That's rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread of the reasoning, and begin looking for something else to do. I feel as if I'm constantly forcing my wayward brain back to the text. The deep, thoughtful reading that used to come naturally has become a struggle.
I think I know what's going on. For more than a decade now I've been spending a lot of time online, searching for information, moving from site to site, and sometimes adding to the Internet's great databases myself.
The Web has been a godsend to me as a writer. Research that once required days in a library's stacks or periodical rooms can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I've got the telltale fact or pithy quote I was after. Even when I'm not working, I go online with just as much pleasure.
I read and write e-mails, scan headlines and blog posts, watch videos and listen to podcasts, or just travel from link to link. (Unlike footnotes, with which they are sometimes confused, hyperlinks don't merely point to related works; they pull you away from the text you're reading.)
For me, as for many others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they've been widely described and duly applauded. "The perfect recall of silicon memory," Clive Thompson wrote in Wired, "can be an enormous boon to thinking." But that boon comes at a price.

As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the material for thought, but they also shape the process of thought itself. And what the Net seems to be doing is chipping away at my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in a sea of words. Now I zip along the surface like a guy on a jet ski.

The Net as a way of thinking

And I'm not the only one. When I mention my troubles with reading to friends and acquaintances (most of them writers), many say they've noticed something similar. The more they use the Web, the harder they find it to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. "I was a lit major in college and used to be a voracious book reader," he wrote. "What happened?" He speculates on the answer: "What if I do all my reading on the web not so much because it's more convenient, but because the way I think has changed?"
Bruce Friedman, who blogs regularly about the use of computers in medicine, has also described how the Internet has altered his mental habits. "I now have almost totally lost the ability to read and absorb a longish article, on the web or in print," he wrote. A pathologist who has long taught at the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a staccato quality: he quickly scans short passages of text from many sources at the same time.

“I can't read War and Peace anymore,” he admitted. “I've lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb easily. I skim it.”

Anecdotes like these don't prove much on their own. We're still awaiting the neurological and psychological experiments that will provide a definitive picture of how Internet use affects our ability to learn. But a recent study of online research habits, conducted by scholars from University College London, suggests that we may be in the midst of a sea change in the way we read and think. As part of a five-year research program, the scholars analyzed usage data from two popular research sites, one operated by the British Library and one by a U.K. educational consortium, both of which provide access to journal articles, e-books, and other written sources. They found that people using the sites exhibited a form of "skimming activity," hopping from one source to another and rarely returning to any source they had already visited. They typically read no more than one or two pages before jumping to another site. Sometimes they would save a long text, but there is no evidence that they ever went back and actually read it. The study's authors report:
Clearly, people are not reading online in the traditional sense; indeed, there are signs that a new form of "reading" is emerging: users skim "diagonally" through titles, contents pages, and abstracts. It almost seems that they go online precisely to avoid reading in the traditional sense.
Thanks to the ubiquity of text on the Internet, not to mention the popularity of text messaging, we may well be reading more than we did in the 1970s and '80s, when television was the medium of choice. But it's a different kind of reading, and behind it may lie a different kind of thinking, perhaps even a new sense of the self.

“We are not only what we read,” says Maryanne Wolf, a psychologist at Tufts University and the author of Proust and the Squid: The Story and Science of the Reading Brain. “We are how we read.”
Wolf worries that the style of reading promoted by the Net, a style that puts efficiency and immediacy above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. When we read online, she says, we tend to become “mere decoders of information.” Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.

Reading, Wolf explains, is not an instinct. It is not etched into our genes the way speech is. We have to teach our minds to translate the symbolic characters we see into the language we understand.
And the media and other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains. Experiments demonstrate that readers of ideographic writing systems, such as Chinese, develop mental circuitry for reading that is very different from that of people whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli. We can expect that the circuits woven by our use of the Net will differ from those woven by our reading of books and other printed works.

How the typewriter influenced Nietzsche's style

In 1882, Friedrich Nietzsche bought a typewriter, a Malling-Hansen Writing Ball, to be precise. His vision was failing, and keeping his eyes focused on a page had become difficult and painful, often bringing on crushing headaches. He had been forced to write less, and he feared he would soon have to give up writing altogether. The machine rescued him, at least for a time. Once he had mastered touch-typing, he could write even with his eyes closed, using only the tips of his fingers. Words could once again flow from his mind to the page.
But the machine affected his work. One of Nietzsche's friends, a composer, noticed a change in his writing style. His already terse prose had become even tighter, more telegraphic. “Perhaps you will through this instrument even take to a new idiom,” the friend wrote in a letter, noting that, in his own work, his “thoughts in music and language often depend on the quality of pen and paper.”

“You are right,” Nietzsche replied. “Our writing equipment takes part in the forming of our thoughts.” Under the sway of the typewriter, writes the German media scholar Friedrich Kittler, Nietzsche's prose “changed from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style.”

The human brain is almost infinitely malleable. People used to assume that the network of roughly 100 billion neurons inside our skulls was largely fixed by the time we reached adulthood. But brain researchers have discovered that this is not the case. James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study, says that even the adult mind is very plastic. Nerve cells routinely break old connections and form new ones. "The brain," says Olds, "has the ability to reprogram itself on the fly, altering the way it functions."
As we use what the sociologist Daniel Bell called "intellectual technologies," the tools that extend our mental rather than our physical capacities, we inevitably come under their influence.

The mechanical clock, which came into common use in the 14th century, is a compelling example. In Technics and Civilization, the historian and cultural critic Lewis Mumford described how the clock "disassociated time from human events and helped create the belief in an independent world of mathematically measurable sequences." The abstract framework of divided time became the point of reference for both action and thought.

The clock's methodical ticking helped bring into being the scientific mind and the scientific man. But it also took something away. As the computer scientist Joseph Weizenbaum observed in his 1976 book Computer Power and Human Reason: From Judgment to Calculation, the conception of the world that emerged from the widespread use of timekeeping instruments "remains an impoverished version of the older one, for it rests on a rejection of those direct experiences that formed the basis for, and indeed constituted, the old reality." In deciding when to eat, to work, to sleep, and to rise, we stopped listening to our senses and started obeying the clock.

The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating like clockwork. Today, in the age of software, we have come to think of them as operating like computers. But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain's plasticity, the adaptation occurs at the biological level as well.
The all-consuming media environment

The Internet promises to have particularly far-reaching effects on human cognition. In a paper published in 1936, the British mathematician Alan Turing proved that a digital computer, which at the time existed only as a theoretical machine, could be programmed to perform the function of any other information-processing device. And that is what we see today. The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It is becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, our radio and our TV.
When the Web absorbs a medium, that medium is re-created in the Net's image. Its content is injected with hyperlinks, blinking banners, and other digital gewgaws. The arrival of a new e-mail message, for instance, may announce itself while we're glancing over the latest headlines. The result is scattered attention and diffused concentration.
The Net's influence doesn't end at the edges of a computer screen, either. As people's minds become attuned to the crazy quilt of online media, traditional media have to adapt to the audience's new expectations. Television programs add text crawls and pop-up windows, and magazines and newspapers shorten their articles, introduce capsule summaries, and crowd their pages with easy-to-browse snippets of information.
When, in March 2008, The New York Times decided to devote the second and third pages of every edition to article abstracts, its design director, Tom Bodkin, explained that the "shortcuts" would give harried readers a quick taste of the day's news, sparing them the "less efficient" method of actually turning the pages and reading the articles. Old media have little choice but to play by the new-media rules.
Never has a communications system played so many roles in our lives, or exerted such broad influence over our thoughts, as the Internet does today. Yet, for all that's been written about the Net, there has been little consideration of how, exactly, it is reprogramming us. The Net's intellectual ethic remains obscure.

Google and the "new Taylorism"
About the time that Nietzsche started using his typewriter, a young man named Frederick Winslow Taylor carried a stopwatch into the Midvale steel plant in Philadelphia and began a series of experiments aimed at improving the efficiency of the plant's machinists. With the approval of the company's owners, he recruited a group of workers, set them to work on various metalworking machines, and recorded and timed their every movement as well as the operations of the machines. By breaking each job down into a sequence of small, discrete steps and then testing different ways of performing each one, Taylor created a set of precise instructions, an "algorithm," as we would say today, for how each worker should work. Midvale's employees grumbled about the strict new regime, claiming that it reduced them to the level of machines, but the factory's productivity soared.
More than a hundred years after the invention of the steam engine, the Industrial Revolution had at last found its philosophy and its philosopher. Taylor's tight industrial choreography, his "system," as he liked to call it, was embraced by manufacturers throughout the country and, in time, around the world. Seeking maximum speed, maximum efficiency, and maximum output, factory owners used his studies to organize their work. The goal, as Taylor defined it in his celebrated 1911 treatise The Principles of Scientific Management, was to identify and adopt, for every job, the "one best method" of work, thereby effecting "the gradual substitution of science for rule of thumb throughout the mechanic arts."
Once his system was applied to all acts of manual labor, Taylor assured his followers, it would bring about a restructuring not only of industry but of society, creating a utopia of perfect efficiency. "In the past the man has been first," he declared; "in the future the system must be first."
Taylor's system is still very much with us; it remains the ethic of industrial manufacturing. And now, thanks to the growing power that computer engineers and software programmers wield over our intellectual lives, Taylor's ideas are beginning to govern the realm of the mind with equal success.
The Internet is a machine designed for the efficient and automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the "one best method," the perfect algorithm, to carry out every mental movement of what we have come to describe as "thinking."
Google's headquarters in Mountain View, California, the Googleplex, is the Internet's high church, and the religion practiced inside its walls is Taylorism. Google, says its chief executive Eric Schmidt, is "a company that's founded around the science of measurement," and it strives to "systematize everything" it does. Drawing on the terabytes of behavioral data it collects through its search engine and other sites, the company carries out thousands of experiments a day, according to the Harvard Business Review, and uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it.
What Taylor did for the work of the hand, Google is doing for the work of the mind.

The company has declared that its mission is "to organize the world's information and make it universally accessible and useful." It seeks to develop "the perfect search engine," which it defines as something that "understands exactly what you mean and gives you back exactly what you want." In Google's view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can access and the faster we can extract their gist, the more productive we become as thinkers.
Where does it end? Sergey Brin and Larry Page, the gifted young men who founded Google while pursuing doctoral degrees in computer science at Stanford, speak frequently of their desire to turn their search engine into an artificial intelligence, a HAL-like machine that might be connected directly to our brains. "The ultimate search engine is something as smart as people, or smarter," Page said a few years ago. "For us, working on search is a way to work on artificial intelligence." In a 2004 interview with Newsweek, Brin said: "Certainly if you had all the world's information directly attached to your brain, or an artificial brain that was smarter than your brain, you'd be better off." Page once told a convention of scientists that Google is "really trying to build artificial intelligence and to do it on a large scale."
Such ambitions are natural ones, even admirable ones, for a pair of math whizzes with vast quantities of cash at their disposal and a small army of computer scientists in their employ. Google's scientific enterprise is, in essence, motivated by a desire, in Eric Schmidt's words, "to solve problems that have never been solved before," and artificial intelligence is the hardest problem out there. Why wouldn't Brin and Page want to be the ones to crack it?
Still, their easy assumption that we would be better off if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized.
In Google's world, the world we enter when we go online, there is little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.
The idea that our minds should operate like high-speed computers is not only built into the workings of the Internet; it is also the network's reigning business model. The faster we move around the Web, the more links we click and the more pages we view, the more opportunities Google and other companies gain to collect information about us and to feed us advertisements.
Most of the proprietors of online advertising have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link, and the more crumbs, the better. It is not in their economic interest to encourage leisurely reading or slow, concentrated thought.
What is the danger of the "Internet galaxy"?
Perhaps I worry in vain. Just as there's a tendency to glorify technological progress, there's a countertendency to expect the worst of every new tool or machine. In Plato's Phaedrus, Socrates bemoaned the development of writing. He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their heads, they would, in the words of one of the dialogue's characters, "cease to exercise their memory and become forgetful." And because they would be able to "receive a quantity of information without proper instruction," they would "be thought very knowledgeable when they are for the most part quite ignorant." They would be "filled with the conceit of wisdom instead of real wisdom." Socrates wasn't wrong; the new technology did often have the effects he feared. But he was shortsighted. He couldn't foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not wisdom).
The arrival of Gutenberg's printing press in the 15th century set off another round of teeth gnashing over new technology. The Italian humanist Hieronimo Squarciafico worried that the easy availability of books would lead to intellectual laziness, making men "less studious" and weakening their minds. Others argued that cheaply printed books and broadsheets would undermine religious authority, demean the work of scholars and scribes, and spread sedition and debauchery.
As Clay Shirky, a professor at New York University, has written, "Most of the arguments made against the printing press were correct, even prescient." But, again, the doomsayers were unable to imagine the myriad blessings that the printed word would deliver.
So, yes, you should be skeptical of my skepticism. Perhaps those who dismiss critics of the Internet as Luddites or nostalgists will be proved correct, and from our hyperactive, information-soaked minds will spring a golden age of intellectual discovery and universal wisdom. Then again, the Net isn't the alphabet, and although it may replace the printing press, it produces something altogether different.
The kind of deep reading that the printed page promotes is valuable not just for the knowledge we acquire from the author's words but for the resonance those words set off within our own minds.
In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas. Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking.
If we lose those quiet spaces, or fill them up with abstract "content," we will sacrifice something important not only in ourselves but in our culture. In a recent essay, the playwright Richard Foreman eloquently described what's at stake:
"I come from a tradition of Western culture, in which the ideal (my ideal) was the complex, dense, and cathedral-like structure of the highly educated and articulate personality: a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West. [But now] I see within us all (myself included) the replacement of complex inner density with a new kind of self, evolving under the pressure of information overload and the technology of the instantly available."
As we are drained of our "inner repertory of dense cultural inheritance," Foreman concludes, we risk turning into "pancake people," spread wide and thin as we connect with the vast network of information accessed by the mere touch of a button.

Kubrick's dark prophecy
I'm haunted by that scene from 2001: A Space Odyssey. What makes it so poignant, and so weird, is the computer's emotional response to the disassembly of its mind: its despair as one circuit after another goes dark, its childlike pleading with the astronaut ("I can feel it. I can feel it. I'm afraid"), and its final reversion to what can only be called a state of innocence. HAL's outpouring of feeling contrasts sharply with the emotionlessness of the human figures in the film, who go about their business with an almost robotic efficiency. Their thoughts and actions feel scripted, as if they were following the steps of an algorithm. In the world of the film, people have become so machinelike that the most human character turns out to be a machine. That is the essence of Kubrick's dark prophecy: as we come to rely on computers to mediate our understanding of the world, our own intelligence flattens into artificial intelligence.
Author: Vladimir Stepanov

Source: https://habr.com/ru/post/116079/

