Our time is often called the information age, yet information has been critical to the human race throughout its existence. Humans have never been the fastest, strongest, or most enduring of animals. We owe our position in the food chain to two things: sociality and the ability to pass information down through more than one generation.
How information was stored and spread through the centuries has remained, quite literally, a matter of life and death: from the survival of a tribe and the preservation of traditional medicine recipes to the survival of the species and the processing of complex climate models.
Look at the infographic. It traces the evolution of storage devices, and the scale is truly impressive. Yet this picture is far from complete: it covers only a few decades of human history, a period when we were already living in an information society. Meanwhile, data has been accumulated, transmitted, and stored for as long as recorded human history has existed. At first the medium was ordinary human memory, while in the near future storage in holographic layers and quantum systems awaits us. Habr has written more than once about the history of magnetic drives, punch cards, and disks the size of a house. But there has never been a journey to the very beginning, when there was no hardware and no notion of data, only biological and social systems that learned to accumulate, preserve, and transmit information. Let's try to scroll through the whole story in one post.
Image source: Flickr
Before the invention of writing
Before the appearance of anything that could unambiguously be called writing, the main way to preserve important facts was oral tradition. Social customs, important historical events, personal experience, and the narrator's invention were all transmitted in this form. Its importance is difficult to overstate: it continued to flourish well into the Middle Ages, long after the appearance of writing. Yet despite its undeniable cultural value, the oral form is a benchmark of inaccuracy and distortion. Imagine a game of "broken telephone" played out over several centuries. Lizards turn into dragons, people sprout extra heads, and reliable information about the life and customs of entire nations becomes indistinguishable from myth and legend.
Boyan

From cuneiform to printing press
For most historians, the birth of Civilization with a capital C is inseparably linked with the advent of writing. According to the prevailing theories, civilization in its modern sense is the result of food surpluses, the division of labor, and the appearance of trade. In the valley of the Tigris and Euphrates this is exactly what happened: fertile fields gave rise to commerce, and commerce, unlike epic poetry, demands accuracy. This was around 2700 BC, that is, some 4700 years ago. The lion's share of Sumerian cuneiform tablets is filled with an endless series of trade transactions. Not everything is so mundane, of course: deciphered Sumerian cuneiform has also preserved for us the oldest literary work known to date, the Epic of Gilgamesh.
Clay tablet with cuneiform

Cuneiform was unquestionably a great invention. Clay tablets keep well, to say nothing of cuneiform carved into stone. But cuneiform has one unambiguous minus: speed, along with the physical weight (not in megabytes) of the finished "documents". Imagine that you urgently need to write out and deliver several bills to a nearby city. With clay tablets, such a task could literally become back-breaking.
In many lands, from Egypt to Greece, people searched for ways to record information quickly, conveniently, and reliably. Again and again they arrived at some variation of thin sheets of organic origin and contrasting "ink". This solved the problems of speed and, so to speak, of "capacity" per kilogram of weight. Thanks to parchment, papyrus, and ultimately paper, humanity acquired its first information network: mail.
With the new advantages, however, came new problems: everything written on materials of organic origin tends to decompose, fade, and simply burn. In the era from the Dark Ages up to the invention of the printing press, copying a book was a major undertaking: it meant literal rewriting, letter by letter. If we picture the complexity and laboriousness of this process, it is easy to understand why reading and writing remained the privilege of a very narrow stratum of monks and nobles. In the middle of the fifteenth century, however, there occurred what could be called the First Information Revolution.
From Gutenberg to the vacuum tube
Attempts to simplify and speed up printing using sets of pre-cast word forms or individual letters and a manual press were made in China as early as the 11th century. Why, then, do we know so little about this and habitually consider Europe the birthplace of printing? The spread of movable type in China was hindered by the complexity of Chinese writing itself: producing type for full-scale printing in Chinese was far too labor-intensive.
Thanks to Gutenberg, the concept of a copy entered the world of books. The Gutenberg Bible was printed in a run of about 180 copies, and every copy increased the chances that fires, floods, lazy scribes, and hungry rodents would not come between the text and future generations of readers.
Gutenberg printing press

A manual press and manual typesetting, however, are of course far from optimal in terms of speed and labor. With each century, human society sought not only ways to preserve information but also ways to spread it to the widest possible audience. As technology developed, both printing and copying evolved with it.
The rotary printing press was invented at the end of the nineteenth century, and its variations are used to the present day. These machines, with continuously rotating cylinders to which the printing plates are fixed, were the quintessence of the industrial approach and marked a very important stage in humanity's informational development: thanks to newspapers, leaflets, and cheaper books, information became a mass phenomenon.
Mass circulation, however, does not always benefit a particular piece of information. The basic medium, paper and ink, is still subject to wear, decay, and loss. Libraries covering every conceivable area of human knowledge grew ever more voluminous, occupying vast spaces and demanding ever more resources for maintenance, cataloging, and search.
The next paradigm shift in information storage came with the invention of the photographic process. Several engineers hit on the bright idea that miniature photographic copies of technical documents, articles, and even books could prolong the life of the originals and reduce the space required to store them. The resulting microfilms (miniature photographs, together with the equipment for viewing them) became common in financial, technical, and scientific circles in the 1920s. Microfilm has many advantages: it combines ease of copying with durability. It seemed that the development of information storage methods had reached its apogee.
Microfilm, still in use

From punched cards and magnetic tapes to modern data centers
Engineering minds had been trying to devise a universal method of processing and storing information since the 17th century: Blaise Pascal built one of the first mechanical calculators, and Gottfried Leibniz observed that if calculations are carried out in the binary number system, mathematical regularities allow solutions to be brought to a form from which a universal computing machine could be built. The dream of such a machine remained a beautiful theory for centuries; only in the middle of the 20th century were these ideas embodied in hardware, spawning a new information revolution. Some believe it is still ongoing.
What we now call "analog" methods of storing information meant that sound, text, images, and video each used their own recording and playback technologies. Computer memory is universal: everything that can be recorded is expressed as zeros and ones and reproduced by specialized algorithms. The very first medium for storing digital information was neither convenient, compact, nor reliable: punched cards, plain cardboard with holes punched in designated positions. A gigabyte of such "memory" could weigh up to 20 tons. Under such conditions it was hard to speak of proper systematization or backup.
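That figure is easy to sanity-check. Below is a back-of-envelope sketch in Python; the card parameters (an 80-column, 12-row IBM-style card punched in binary mode, weighing roughly 2.5 grams) are our own assumptions, not numbers from the article.

```python
# Rough estimate of the mass of one gigabyte stored on punched cards.
# Assumed card format: 80 columns x 12 rows, fully binary-punched.

BITS_PER_CARD = 80 * 12      # 960 bits per card (assumption)
CARD_MASS_G = 2.5            # approximate mass of one card in grams (assumption)

gigabyte_bits = 8 * 10**9    # 1 GB = 8 billion bits
cards = gigabyte_bits / BITS_PER_CARD
tonnes = cards * CARD_MASS_G / 1_000_000

print(f"{cards:,.0f} cards, about {tonnes:.0f} tonnes per gigabyte")
# -> 8,333,333 cards, about 21 tonnes: the same order of magnitude
#    as the "up to 20 tons" mentioned above.
```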
Punched card

The computer industry developed rapidly and quickly penetrated every conceivable area of human activity. In the 1950s, engineers "borrowed" data recording on magnetic tape from analog audio and video recording. Tape drives with cassettes holding up to 80 MB were used for storing and backing up data right up to the 90s. It was a good medium, with a relatively long shelf life (up to 50 years) and a small physical size. Moreover, its convenience and the standardization of storage formats brought the concept of the backup into everyday use.
One of the first IBM hard drives, 5 MB

Magnetic tapes and the systems built around them have one serious drawback: sequential access to data. The farther a record lies from the beginning of the tape, the longer it takes to reach it.
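A toy model makes the difference tangible; the speeds below are illustrative numbers of our own choosing, not specifications of any real drive.

```python
# Toy comparison of sequential (tape) vs. direct (disk) access times.

TAPE_SPEED_MB_S = 2.0    # assumed tape winding/streaming speed
DISK_SEEK_S = 0.02       # assumed average disk seek time

def tape_access_time(offset_mb: float) -> float:
    """Seconds to wind the tape to a record offset_mb from the start."""
    return offset_mb / TAPE_SPEED_MB_S

def disk_access_time(offset_mb: float) -> float:
    """Seconds for a disk head to reach any sector: independent of offset."""
    return DISK_SEEK_S

# Records near the start, middle, and end of an 80 MB cassette:
for offset in (1, 40, 80):
    print(f"offset {offset:>2} MB: tape {tape_access_time(offset):5.1f} s, "
          f"disk {disk_access_time(offset):.2f} s")
```

On tape the worst case grows linearly with the length of the medium, which is exactly why tape survived mainly as a backup format while disks took over random-access workloads.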
In the 1970s, the first hard disk drive (HDD) appeared in the form familiar to us today: a stack of platters coated with magnetizable material plus read/write heads. Variations of this technology are still in use, gradually losing ground to solid-state drives (SSDs). From this moment, through the entire computer boom of the 80s, the main paradigms of storing, protecting, and backing up information took shape. Thanks to the mass adoption of home and office computers with modest memory and computing power, the client-server model took hold. At first the "servers" were mostly local, one per organization, institute, or company. There was no system and no rules; information was duplicated mainly onto floppy disks or magnetic tape.
The emergence of the Internet, however, spurred the development of data storage and processing systems. In the 90s, at the dawn of the dot-com bubble, the first data processing centers, or data centers, began to appear. Requirements for the reliability and availability of digital resources grew, and with them grew the complexity of meeting those requirements. From dedicated rooms in the depths of an enterprise or institute, data centers turned into separate buildings with their own intricate infrastructure. At the same time, a kind of anatomy of the data center crystallized: the computers themselves (servers), communication links to Internet providers, and everything related to utilities (cooling, fire suppression, and physical access to the premises).
The closer we come to the present day, the more we depend on data stored somewhere in the "clouds" of data centers. Banking systems, e-mail, online encyclopedias, and search engines have become a new standard of living, a physical extension, one might say, of our own memory. The way we work, rest, and even receive medical care can all be harmed by a simple loss of data or even a temporary disconnection from the network. In the 2000s, data center reliability standards were developed, ranging from Tier I to Tier IV.
At the same time, backup technologies began to flow in from the space and medical industries. People had, of course, long known how to copy and reproduce information to protect it against the destruction of the original; what distinguishes serious data centers is the duplication not only of data carriers but also of the various engineering systems, along with the need to anticipate points of failure and possible human error. For example, a Tier I data center has only limited storage redundancy. The Tier II requirements already include redundant power supplies and protection against elementary human error, while Tier III provides for redundancy of all engineering systems and protection against unauthorized access. Finally, the highest level of data center reliability, Tier IV, requires additional duplication of all backup systems and the complete absence of single points of failure. The redundancy multiplicity (how many reserve elements there are per primary one) is usually written in terms of the letter N: N+1, 2N, and so on. Over time, redundancy requirements have only grown.
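As a hypothetical illustration of how that notation is read (the four-UPS example is ours, not the article's):

```python
# How many units to install under common redundancy schemes,
# where N is the number of units the load actually requires.

def installed_units(n_required: int, scheme: str) -> int:
    """Units installed for a given redundancy scheme."""
    return {
        "N": n_required,                 # no reserve at all
        "N+1": n_required + 1,           # one spare for the whole group
        "2N": 2 * n_required,            # a full duplicate set
        "2(N+1)": 2 * (n_required + 1),  # duplicate sets, each with a spare
    }[scheme]

for scheme in ("N", "N+1", "2N", "2(N+1)"):
    print(f"{scheme:>6}: install {installed_units(4, scheme)} units "
          f"for a load that needs 4")
```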
Building a Tier III data center is a project that only an exceptionally qualified company can handle. This level of reliability and availability means that both the engineering utilities and the communication systems are duplicated, and the data center is down for only about 90 minutes per year in total.
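That 90-minute figure follows from the availability targets commonly quoted for the TIA-942 tiers (the percentages below are those widely cited values, not numbers from this article):

```python
# Annual downtime implied by a given availability percentage.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def annual_downtime_min(availability_pct: float) -> float:
    """Expected minutes of downtime per year."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for tier, avail in (("Tier II", 99.741), ("Tier III", 99.982), ("Tier IV", 99.995)):
    print(f"{tier}: {annual_downtime_min(avail):7.1f} min/year")
# Tier III -> about 94.6 minutes, in line with the "about 90 minutes" above.
```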
We at Safedata have this experience: in January 2014, in cooperation with the Russian Scientific Center Kurchatov Institute, we commissioned the second SAFEDATA data center, Moscow-II, which meets the Tier III requirements of the TIA-942 standard. Earlier (2007-2010) we built the Moscow-I data center, which likewise meets the Tier III requirements of TIA-942 and belongs to the category of data storage and processing centers with a secure network infrastructure.
We see another paradigm shift occurring in IT, and it is connected with data science. Processing and storing large amounts of data are more relevant than ever. In a sense, every business should be ready to become a bit of a scientist: you collect a huge amount of data about your customers, process it, and gain a new perspective on your own operations. Implementing such projects requires renting a large number of powerful servers, and running them is far from cheap. Or perhaps your internal IT systems have grown so complex that maintaining them consumes too many of the company's resources.
In any case, whatever you need significant computing power for, we offer our "Virtual Data Center" service. Infrastructure as a service is not a new field, but we stand out with a holistic approach, ranging from purely IT tasks, such as migrating corporate resources to the "Virtual Data Center", to legal ones, such as consulting on current Russian legislation on data protection.
The development of information technology is like a train rushing ruthlessly forward; not everyone manages to jump aboard when given the chance. Paper documents are still in use in places, hundreds of non-digitized microfilms sit in old archives, and government agencies may still rely on floppy disks. Progress is never linear and uniform. Nobody knows how much important information we have lost forever as a result, or how many hours have been wasted on processes that are still far from optimal. But we at Safedata know how to prevent waste and irrecoverable loss in your particular case.