The importance and application of virtualization extend far beyond virtual machines. Few advances in information technology over the past sixty years have delivered as much value as virtualization. Many IT professionals think of virtualization in terms of virtual machines (VMs) and their associated hypervisors and operating systems, but this is only the tip of the iceberg. An ever wider range of virtualization technologies, strategies, and capabilities is redefining the core elements of IT in organizations around the world.
Virtualization Definition
In the broadest sense, virtualization is the science and art of making an object or resource simulated or emulated in software functionally equivalent to the corresponding physically implemented object.
In other words, we use abstraction to make software look and behave like hardware, with significant advantages in flexibility, cost, scalability, overall capability, and performance, across a wide range of applications. Virtualization thus makes real what is not, replacing a physical implementation with a software equivalent while retaining the flexibility and convenience of software features and services.
Virtual Machines (VM)
The VM era began on a small number of mainframes in the 1960s, most notably the IBM 360/67, and became standard in the mainframe world in the 1970s. With the advent of the Intel 386 in 1985, VMs took up residence in the microprocessors at the heart of personal computers. The modern VM function, embedded in microprocessors with the necessary hardware support and implemented either via hypervisors or at the OS level, is essential to computing productivity: it captures machine cycles that would otherwise be lost in today's high-performance 3+ GHz processors.
Virtual machines also provide additional security, integrity, and convenience at modest computational cost. Moreover, VM capabilities can be extended with emulation functions for interpreters, such as the Java virtual machine, and even full system simulation. Running Windows under macOS? Easy. Commodore 64 code on your modern Windows PC? No problem.
Crucially, the software running inside a virtual machine is unaware of the fact: even a guest OS originally designed to run on bare metal believes it is running on its own hardware platform. This is the essence of virtualization itself: implementing information systems on top of the isolation provided by well-defined APIs and protocols.
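Whether a guest really is running under a hypervisor can be probed from inside: on x86, hypervisors set a CPUID "hypervisor present" bit, which Linux surfaces as the `hypervisor` flag in `/proc/cpuinfo`. A minimal sketch, assuming that file format; the parsing function and sample excerpts below are illustrative, not a complete detection tool:

```python
def has_hypervisor_flag(cpuinfo_text: str) -> bool:
    """Return True if any 'flags' line in /proc/cpuinfo-style text advertises
    the x86 'hypervisor' bit (set by hypervisors in CPUID leaf 1, ECX bit 31)."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            _, _, flags = line.partition(":")
            if "hypervisor" in flags.split():
                return True
    return False

# Illustrative /proc/cpuinfo excerpts (heavily abbreviated)
bare_metal = "processor : 0\nflags : fpu vme de pse tsc msr\n"
virtualized = "processor : 0\nflags : fpu vme de pse tsc msr hypervisor\n"

print(has_hypervisor_flag(bare_metal))    # False
print(has_hypervisor_flag(virtualized))   # True
```

In practice real tools (e.g. `systemd-detect-virt`) go further and query CPUID directly to identify which hypervisor is present.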
In fact, the roots of virtualization can be traced to the era of time-sharing, which also began to emerge in the late 1960s. Mainframes at that time were anything but portable, so the rapidly improving quality and availability of dial-up and leased telephone lines, together with better modem technology, made it possible to bring the mainframe virtually to the user, usually as an alphanumeric terminal. A virtual machine indeed: thanks to advances in microprocessor technology and economics, this computing model led directly to the personal computers of the 1980s, which combined local computing with data transfer over telephone lines, then over local networks, and ultimately over today's ever-present Internet access.
Virtual memory
The concept of virtual memory, which also developed rapidly in the 1960s, is as important as that of virtual machines. The mainframe era was notorious for its extraordinarily expensive magnetic-core memory, and mainframes with more than one megabyte of memory were rare until well into the 1970s. As with virtual machines, virtual memory is enabled by relatively small additions to the hardware and instruction set: blocks of storage, usually called segments and/or pages, can be written out to secondary storage, and memory addresses within those blocks are translated dynamically as pages are swapped back in from disk.
One real megabyte of RAM on an IBM 360/67, for example, could support the full 24-bit address space (16 MB) defined by the computer's architecture, and, properly implemented, each virtual machine could have its own full complement of virtual memory as well. Thanks to these innovations, hardware designed to run a single program or operating system could be shared among multiple users, even when they ran different operating systems or needed more memory than was physically installed. The benefits of virtual memory, like those of virtual machines, are numerous: separation of users and applications, improved security and data integrity, and dramatically improved ROI. Sound familiar?
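The dynamic translation described above can be sketched in a few lines: split a virtual address into a page number and an offset, look the page up in a (toy) page table, and either produce a physical address or raise a page fault so the page can be brought back from disk. All names and sizes here are illustrative and not tied to any real MMU:

```python
PAGE_SIZE = 4096  # 4 KiB pages, a common modern choice

# Toy page table: virtual page number -> physical frame number.
# Pages absent from the table are "on disk" and fault on access.
page_table = {0: 7, 1: 3, 5: 12}

def translate(vaddr: int) -> int:
    """Translate a virtual address to a physical one, as an MMU would."""
    vpn, offset = divmod(vaddr, PAGE_SIZE)
    if vpn not in page_table:
        raise LookupError(f"page fault: virtual page {vpn} not resident")
    return page_table[vpn] * PAGE_SIZE + offset

print(hex(translate(0x1004)))  # virtual page 1 -> frame 3, i.e. 0x3004
```

Real hardware does this lookup in a multi-level page table with a TLB cache, and the operating system's page-fault handler, not an exception, fetches missing pages from secondary storage.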
Virtual desktops
After virtualizing machines and memory and embedding them in inexpensive microprocessors and PCs, the next step was to virtualize the desktop and, with it, access to applications, both single-user and collaborative. Again we return to the time-sharing model described above, but here a desktop PC is mimicked on a server, and the graphics and other user-interface elements are delivered over a network connection via client software, often running on an inexpensive, easily managed, and secure thin-client device. Every leading operating system today supports this capability in one form or another, alongside a broad range of additional hardware and software products, including VDI, X Windows, and the very popular (and free) VNC.
Virtual storage
The next major advance, now widespread, is the virtualization of processors, storage, and applications in the cloud: the ability to pull in whatever resource is needed at any given moment, and to add or scale capacity with little or no effort from IT staff. Savings in physical space, capital expenditure, maintenance, failure-related downtime, labor-intensive troubleshooting, and the serious costs of performance problems and outages can genuinely pay for cloud-hosted service solutions. Storage virtualization, for example, offers many possibilities in such cases.
Widespread adoption of cloud storage, not merely as backup but as primary storage, will become more common as both wired and wireless networks deliver data rates of 1 Gbps and above. Such rates are already available over Ethernet and 802.11ac Wi-Fi, and in one of the most anticipated high-speed networks, 5G, which is currently being trialed in many countries.
Virtual networks
Even in the world of networks, virtualization is increasingly applied: "network as a service" (NaaS) is now in many cases a promising and highly demanded option. The trend will only grow with the further rollout of network functions virtualization (NFV), which is of particular interest to carriers and service providers, especially in mobile communications. Notably, network virtualization gives mobile operators a real opportunity to broaden their range of services, increase capacity, and thereby raise the value and appeal of their services to corporate clients. Over the next few years, a growing number of organizations will likely deploy NFV in their own and even hybrid networks (again, a factor in customer appeal). Meanwhile, VLANs (802.1Q) and virtual private networks (VPNs) make their own enormous contributions to modern virtualization practice.
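The VLAN mechanism mentioned above works by inserting a 4-byte 802.1Q tag after the source MAC address in an Ethernet frame; the tag's TPID value 0x8100 marks the frame as tagged, and the tag carries a 12-bit VLAN ID that partitions one physical network into many virtual ones. A minimal parser sketch; the sample frame bytes are made up for illustration:

```python
import struct

TPID_8021Q = 0x8100  # Tag Protocol Identifier marking an 802.1Q tag

def vlan_id(frame: bytes):
    """Return the 12-bit VLAN ID of an 802.1Q-tagged Ethernet frame,
    or None if the frame is untagged."""
    # Bytes 0-11: destination + source MAC; bytes 12-13: TPID or EtherType.
    tpid, = struct.unpack_from("!H", frame, 12)
    if tpid != TPID_8021Q:
        return None
    # Tag Control Information: PCP (3 bits) | DEI (1 bit) | VID (12 bits).
    tci, = struct.unpack_from("!H", frame, 14)
    return tci & 0x0FFF

# Illustrative frame: zeroed MACs, 802.1Q tag with VID 42, then EtherType/payload.
frame = bytes(12) + struct.pack("!HH", TPID_8021Q, 42) + b"\x08\x00payload"
print(vlan_id(frame))  # 42
```

Switches use exactly this VID field to keep traffic from different virtual LANs isolated while it traverses the same physical links.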
Virtualization reduces costs
Even considering the wide range of significant functionality virtualization can offer, the economics of large-scale virtualization still attract the most attention. Competition in the rapidly evolving cloud-based business model means that the traditional, labor-intensive operating expenses borne daily by customer organizations will decline as service providers leverage their own experience to develop offerings that save money substantially, with market competition pushing end-user prices lower still.
Cloud services also make it easy to improve reliability and resiliency by using multiple providers in fully redundant or hot-standby configurations, virtually eliminating single points of failure. As a result, many IT costs that were once capital expenditures become operating expenses: for the most part, funds go to service providers rather than to ever more equipment, capacity, and staff. Again, thanks to the power of modern microprocessors, improvements in systems and architectures, and the dramatic performance gains of both local networks and WANs (including wireless), virtually every element of the IT industry today can in fact be virtualized and, where needed, delivered as a scalable cloud service.
Virtualization itself is not a paradigm shift, although it is often described that way.
The essence of virtualization, in any form, is to let IT processes, through the huge range of capabilities described above, become more flexible, efficient, convenient, and productive.
Given that most cloud services in IT are built on a virtualization strategy, virtualization today offers the best alternative operating model, one whose economic advantages free organizations from traditional ways of working.
Virtualization's growth in this area stems from a substantial economic inversion of the IT operating model, one whose roots go back to the earliest commercialization of information technology.
In the early days of computing, attention centered on expensive and often overloaded hardware such as mainframes. It was their enormous cost that motivated the first virtualization efforts described above.
As hardware became cheaper, more powerful, and more accessible, the focus shifted to applications running in essentially standardized and virtualized environments, from PCs to browsers.
The result of this evolution is what we see today. Where computers and computing were once the foundation of IT, attention has turned to processing information and making it available anytime, anywhere. This "infocentricity" spurred the evolution of the mobile and wireless era, so that end users can now obtain information at any moment, regardless of location, and keep it at hand.
What began as thinking about how to work more effectively with a slow and very expensive mainframe has led to virtualization becoming the central strategy for the entire future of the IT industry. No IT innovation has had as much impact, and with the transition to cloud virtualization infrastructure, we are really only starting the path toward something global.
Original article:
What is virtualization?