See also: “What Really Happened with Vista: An Insider Retrospective”
I usually write about things I was directly involved in - either writing code or managing a project. In this article I take a different approach and write about my view of the root causes of the Windows Vista fiasco (codename Longhorn). Although it happened more than a decade ago, it was a key period in the shift to mobile devices, and those events had long-term consequences inside Microsoft. I have found that many attempts to describe Microsoft's problems, especially in connection with the transition to mobile platforms, are unconvincing and do not match my understanding of what happened. The Vanity Fair article
“Microsoft's Lost Decade” describes bureaucratic rot and internecine warfare, and cultural decay caused by the corrosive effects of the competitive stack-ranking review system.
A subsequent article in The Atlantic describes the situation as a classic “innovator's dilemma.”
I think that the situation can be stated differently - with a better connection to specific facts about projects and the true motives of key parties. This is not an attempt to write an alternative history - I have no idea what would happen if those mistakes were not made. But they certainly did not help Microsoft get through this crucial moment in the computer industry.
This is not a journalistic investigation - I did not conduct a large series of interviews with key participants in the events. Here is my personal opinion based on what I saw at that time and what I learned later. Although at the time I worked in the Office division, I had to work closely with many colleagues from the Windows division, so I am well aware of the processes that took place there.
I apologize for the length of this article. Here is a summary:
- Microsoft misjudged key trends in hardware evolution, in particular the abrupt halt, around 2003, of the rapid growth in single-threaded processor speed and of the matching improvements in other key elements of the PC. Vista was planned and built for hardware that did not exist. It was bad for desktops, worse for laptops and terrible for mobile devices.
- The bet on C# and managed code was poorly justified and poorly executed. Responsibility for this particular failure can be laid directly at the feet of Bill Gates and his fruitless attempts to create the holy grail of a universal storage layer with a universal application infrastructure. This failure had especially long-lasting consequences.
- Project management in the Windows division remained catastrophic throughout its history, including projects that never shipped at all. Vista was a disaster, but it was merely the culmination of a series of near-catastrophic failures in the core mission of executing complex projects.
Since it matters greatly to this story, I want to start with a short digression about the structure of the industry and where value is created.
Any device consists of hardware, an operating system (OS) and applications. At the most basic level, the OS manages hardware resources and exposes them so that applications can share them. The OS also provides software interfaces (APIs) that let different kinds of hardware connect to the device, and APIs that give applications access to this hardware and to OS system services. Although at the basic level the OS only provides hardware resources, in practice an “OS” includes many higher-level functions: a graphical user interface, sophisticated controls for displaying and editing formatted text or embedding HTML, high-level networking support, file management, facilities for sharing data and functionality between applications, and even entire applications such as a browser, an email client, and tools for photos and cameras. The history of the OS, especially in the consumer world, is one of absorbing ever more high-level services, which are either offered directly to users or exposed as APIs for applications.
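To make that mediation concrete, here is a minimal Win32 sketch in C++ (the file name and the simplified error handling are mine, purely for illustration). The application never touches the disk controller; it asks the OS through a stable API, and the OS arbitrates access to the shared hardware.

```cpp
#include <windows.h>
#include <iostream>

int main() {
    // CreateFileW goes through the OS object manager, security checks and the
    // file-system driver stack before any hardware is involved.
    HANDLE h = CreateFileW(L"example.txt", GENERIC_WRITE, 0, nullptr,
                           CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, nullptr);
    if (h == INVALID_HANDLE_VALUE) {
        std::cerr << "CreateFileW failed: " << GetLastError() << "\n";
        return 1;
    }
    const char data[] = "hello";
    DWORD written = 0;
    // The OS queues the I/O, drives the disk hardware, and shares the device
    // fairly among all running applications.
    WriteFile(h, data, sizeof(data) - 1, &written, nullptr);
    CloseHandle(h);
    return 0;
}
```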
This accumulation of high-level functionality is driven by the virtuous cycle and the multi-sided network effects inherent in the OS business. More OS users attract more developers. More developers create more applications, which makes the OS more attractive to users. This produces a cycle in which even more users draw in even more developers. The API the operating system exposes is what makes this business strategy so successful and stable for the winners of the contest. Millions of developers collectively pour enormous effort into programming against the system APIs and the services behind them. The more an application depends on the complex APIs of a particular OS, the harder it is to move that application to another OS. So even if a competitor manages to replicate the key functionality of another OS, it still will not get those applications. No single OS vendor can possibly duplicate the effort expended by millions of developers.
Given this dynamic, an OS vendor has many mutually reinforcing reasons to keep adding richer functionality and APIs that make that functionality easy for developers to reach. Rich functionality attracts developers, and simple APIs let them quickly build better applications. Those better applications feed straight back into the virtuous cycle of attracting more users. A classic example: Windows was the first OS that let developers embed HTML documents directly into their applications. Crucially, once an application used this functionality, it became harder to move it to another OS.
If you look at Windows, iOS and Android, they all follow the same strategy, although Microsoft, Apple and Google monetize it differently. Microsoft classically charges a license fee per device, paid by the OEMs who sell devices with Windows. This is a horizontal business strategy with a large number of OEMs, each paying Microsoft for every device built and sold. Apple monetizes by manufacturing and selling the devices directly. Google relies on post-sale monetization, mainly through search. In fact, the fear that Apple and Microsoft would lock up the mobile search market (and mobile services generally) was the main reason for Google's investment in Android. Microsoft, too, has been moving toward direct monetization through device sales (the Surface line) and toward post-sale monetization (through Bing and paid subscriptions like Office 365).
Another important piece of the picture is third-party middleware such as Java and Adobe Flash. In a sense it is no different from high-level OS services, except that it is built and supplied by a third party. OS vendors have a love-hate relationship with middleware vendors. The love is that middleware lets developers quickly build great applications for the platform. The hate has several sources. Certain kinds of middleware exist specifically to solve the problem of building applications that run across platforms. Middleware like Java and Flash was marketed under the slogan “write once, run everywhere.” Applications built on such middleware do not depend directly on the OS APIs and can therefore run on any platform where the middleware exists; the middleware translates its own API into the native APIs of each operating system. (Modern readers may think of Java as server infrastructure for websites or as the preferred language for Android applications. I mean its original incarnation as a language for on-demand applications in the browser - which is how it was seen when Vista was being planned.)
Cross-platform middleware destroys the network effects created by binding applications to a particular OS through APIs exclusive to it. Applications built on such middleware also tend to target the “lowest common denominator” of all the operating systems and are slow to adopt new OS features. Some kinds of OS functionality create network effects of their own, where the more applications use the functionality, the better all applications work. A classic example is formatted copy-paste: the more applications can copy and paste formatted content between one another, the more valuable the OS is to every user. If third-party middleware blocks this dynamic, over time it forecloses any durable differentiation of the OS.
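As a concrete illustration of the copy-paste network effect, here is a small C++ sketch of reading formatted HTML from the Win32 clipboard. “HTML Format” is the registered clipboard format that applications share; the error handling is deliberately minimal.

```cpp
#include <windows.h>
#include <iostream>

int main() {
    // Formats beyond the built-in ones (CF_TEXT, CF_BITMAP, ...) are registered
    // by name; every application registering the same name gets the same id,
    // which is what lets unrelated apps exchange rich content.
    UINT cfHtml = RegisterClipboardFormatW(L"HTML Format");
    if (!OpenClipboard(nullptr)) return 1;
    if (IsClipboardFormatAvailable(cfHtml)) {
        HGLOBAL hMem = GetClipboardData(cfHtml);   // owned by the clipboard
        if (hMem) {
            const char* html = static_cast<const char*>(GlobalLock(hMem));
            if (html) {
                std::cout << html << "\n";         // UTF-8 payload with a header
                GlobalUnlock(hMem);
            }
        }
    }
    CloseClipboard();
    return 0;
}
```

Every additional application that understands the same format makes every other one more valuable - which is exactly the dynamic a lowest-common-denominator middleware layer tends to suppress.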
The browser as an application delivery platform is probably the most durable example of middleware disrupting the dynamics of system APIs. Across 35 years of computing history other approaches have surfaced from time to time, but they all failed for reasons we will not go into here. What matters for our story is that 20 years ago it was far from obvious how all this would turn out. Fear of middleware, and of the erosion of durable API differentiation, were major factors in the Vista era.
Now let's look at the history of Windows and its chronic problems with project execution.
I am going to summarize aggressively, but I do not think we will lose the essence. Typically, each Windows release had a main theme and a rough time frame. Windows 95, for example, set out to bring the consumer Windows line to 32 bits, with a modern file system, a new UI and standard networking tools (including a browser). Beyond the main themes, individual developers and groups independently defined the key features in their own areas and started building. The product under development stayed unstable for long stretches, because the new features could not yet work reliably with one another. At a certain point, leadership would decide that enough functionality had been built, and the team would switch to stabilization and release preparation. Historically, Windows usually blew through its planned release dates (Windows 95 was originally Windows 93), and important planned functionality was either dropped or sharply scaled back from the original plans. The run-up to release often turned into a “death march” of fixing bugs late into the night and on weekends while the deadlines kept slipping. I want to note a key difference between Windows and Office: after Office 97, the Office team would set the release date of the next version and usually hit it, which made broad coordination possible with minimal overhead.
This process was very different from modern development practice. Whether the idea for a feature comes top-down from a broad vision or bottom-up from individual developers and teams, modern practice calls for continuous validation and very frequent releases to users. Services may be updated in production several times a day, while new client code ships weekly or monthly (updating a client is costly for both the provider and the users, which argues against updating too often). This requires architectural and engineering infrastructure that keeps a large, complex system such as Windows or Office reliably close to shipping quality at all times. The process does not always make major breakthroughs in complex functionality easier, but it dramatically increases a team's agility and its ability to respond to external events and realities. It also gives a far more honest picture of actual current progress. How the Office division made the transition to these modern practices might deserve an article of its own; suffice it to say that the Windows division at the time was not even close to them.
Windows XP was a major release that also followed this sad pattern. It unified the business and consumer platforms on the reliable Windows NT kernel with a user-friendly interface. Achieving compatibility with all the applications written for the consumer Windows line was hard, but it was the key to moving to a single business and consumer platform. Unfortunately, a zero-day vulnerability in Windows XP hit the headlines on the very day of its public release. This and other security catastrophes forced Microsoft to overhaul its software and rethink its security development practices - and eventually to ship a massive service pack for Windows XP, a project that consumed much of the Windows division. In addition, the core Windows engineering groups (the Core groups) were working on the 64-bit version, which was especially important for unifying Windows client and server installations (Windows was lagging behind other enterprise platforms like Sun Solaris in 64-bit computing). This mattered critically because Windows was competing, successfully, for ever larger enterprise servers.
While these large parts of the Windows division concentrated on those initiatives, an equally large organization was busy building the next generation of Windows on top of the “managed” C# platform. Some background is needed here.
From its earliest days, the browser began to evolve into an application delivery platform. Marc Andreessen's notorious quip that Netscape would soon reduce Windows to “a poorly debugged set of device drivers” dates back to 1995. This prompted the embrace-and-extend strategy that caused Microsoft so much trouble in the antitrust investigation. Microsoft built and evolved its own browser and its proprietary ActiveX mechanism for embedding native code. Java then emerged as an alternative application delivery strategy: developers could write in Java, a high-level language with its own rich set of APIs, and the code would be automatically downloaded and run in the browser. Java was marketed as “write once, run everywhere” technology, which put it squarely in the “hate” category of responses to middleware. Notably, Microsoft signed a licensing agreement with Sun for Java, but was then sued when it extended Java to give direct access to native Windows APIs (which violated the “run everywhere” principle, but gave Java developers access to the platform's richer and growing API set). Microsoft settled the Java lawsuit and ultimately decided to go its own way with the C# programming language. That decision proved disastrous for many reasons. (Note that C# itself is an excellent piece of technology - the disaster was in the strategy.)
C # is a "managed" language. Mostly, this means that developers do not need to manage the allocation and freeing of memory "manually." The language and runtime use
garbage collection to automatically free any memory that is no longer in use. It is important that the runtime also prevented the types of memory errors that caused many security vulnerabilities at the time. At that time, and in reality over the next decade, passionate arguments continued about the effect of automatic memory management on programming performance and security. I will not retell those arguments here. Suffice it to say that the most successful modern iOS operating system has decided not to play these games (Android is sold in larger quantities, but iOS gets the lion's share of the profits). Managed environments have a cost overrun compared to unmanaged environments, so they need more memory to work. Most of the environments that take advantage of the performance of managed code, try to carefully limit its use only where it makes the most sense, and not blindly use it everywhere.
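To make the trade-off concrete, here is a small C++ sketch (mine, not Microsoft's code) of the bug classes a managed runtime removes, next to the bookkeeping it buys them with:

```cpp
#include <cstring>
#include <memory>
#include <string>

void unmanaged_style(const char* input) {
    char buf[16];
    // Classic unmanaged bug: no bounds check. Input longer than 15 characters
    // corrupts the stack -- the buffer-overrun class behind many exploits of
    // that era. Shown deliberately; do not do this.
    std::strcpy(buf, input);

    char* p = new char[32];
    delete[] p;
    // p now dangles; any later use would be a use-after-free, another class of
    // error a garbage-collected runtime rules out by design.
}

void managed_style(const std::string& input) {
    // The managed-style equivalent: the library tracks the allocation, grows it
    // as needed, and frees it automatically. The price is extra bookkeeping and
    // a larger working set -- the overhead discussed above.
    auto copy = std::make_unique<std::string>(input);
}
```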
Programmers meeting this kind of environment for the first time (which described nearly 100% of the developers on the project at the time) tend to stop taking memory seriously. But however memory is managed, automatically or manually, it is a resource, and a careless attitude toward a resource produces bloated code that needs ever more memory to run. Indeed, even “high productivity” (generating a large amount of code) is not by itself a criterion of a project's success (“if I had more time, I would have written a shorter letter”). At the time, consuming more resources was effectively part of the value system, because it played up the value of the rich client and the computing power of the local machine (compared with a “thin client” like a simple application delivered through a web page). While building Longhorn, Windows developers would brag about how many new APIs they had written.
Part of the C# bet was also a bet on a rich base class library, with the new client technology built as a set of class libraries on top of that base. The base library provided simple types such as strings and arrays, along with more complex data structures and services such as lists and hash tables. The idea was to bring consistency to the entire Windows API. Win32 had begun as a relatively small, uniform API, but over the preceding decade it had been badly bloated by the efforts of many groups adding to it without any single coherent vision. The new library was seen as the chance to put everything back in order.
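Here, in miniature, is the inconsistency being described: two of the several “return a string” conventions that had accumulated in Win32, next to the kind of uniform wrapper a base class library promises. (The wrapper names are mine.)

```cpp
#include <windows.h>
#include <string>

// Convention 1: ask for the length, then fill a caller-supplied buffer.
std::wstring WindowText(HWND hwnd) {
    int len = GetWindowTextLengthW(hwnd);
    std::wstring s(len, L'\0');
    GetWindowTextW(hwnd, s.data(), len + 1);
    return s;
}

// Convention 2: fixed buffer, truncation signaled through the return value.
std::wstring ModulePath() {
    wchar_t buf[MAX_PATH];
    DWORD n = GetModuleFileNameW(nullptr, buf, MAX_PATH);
    return std::wstring(buf, n);
}

// (Convention 3, not shown: APIs that allocate the result themselves and make
// the caller free it with LocalFree or SysFreeString.) One string type with one
// ownership rule, as in a managed base library, removes this whole zoo.
```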
The fact that no other operating system had taken such a path was seen as the mark of a “big bet” - a fundamental part of the value system in Microsoft's internal culture. Unfortunately, beyond the problem of bloated resource use, there were fundamental challenges in using the new technology inside an operating system (especially how to handle resource exhaustion in critical parts of the OS, how to let applications, the managed runtime and the OS be updated independently, and how to let different parts of the OS evolve independently of one another) that were unsolved, or not even fully understood, at the time. There was literally no migration strategy for the existing applications built on unmanaged Win32. Despite all this, huge armies of developers started building on top of this unstable platform. What were they working on?
This is the point where Bill Gates enters the story.
Microsoft's entire history since the company's founding had asserted the primacy of software over hardware. Hardware is an interchangeable commodity; the value is in the software. If you look at where the sustained profits of the PC ecosystem went, that view seems accurate - certainly with respect to the relationship between the OEMs and Microsoft, while Intel took the lion's share of the profits among all the hardware makers. At the same time, the dynamism of the whole ecosystem was sustained by the endless exponential improvement of the hardware under Moore's law and other exponential trends. Software caught that wave and captured the main economic benefit of the final product. But the deeper truth is that it was hardware innovation setting the pace - software was surfing on it.
The iPhone is instructive here. The reaction of the engineers at RIM (makers of the Blackberry) to the iPhone announcement was: “That's impossible!” It is impossible to build a lightweight full-screen phone with a touch interface and performance such that the battery still lasts long enough. But in reality it was possible. For the last ten years the market has been driven forward by continuous hardware innovation (better screens, faster communication, more capable processors, more memory, better cameras, new sensors, bigger batteries, lower weight, instant-on), exposed through the OS software.
Software innovation in iOS matters, of course, but it is hardware that has set the pace, and Apple's ability to integrate hardware and software innovation is at the heart of iOS's success.
With that context, let's get back to Windows Vista. As mentioned above, while much of the Windows division was tied up in security and 64-bit work, a separate large organization was building Longhorn's major new subsystems.
The pillars of Longhorn were to be written in C#. WinFS was a new storage subsystem that combined the file system with relational database technology. Applications would store their data in it, share it, and query and relate it in rich new ways; it was the centerpiece of Bill's long-standing vision of unified storage. That dream went back at least to the Cairo project, which preceded Windows 95.
The Avalon group (later Windows Presentation Foundation) was to rethink the presentation layer on top of powerful graphics processors. Avalon focused on a universal canvas in which the user interface and rich application content blended seamlessly, so the result was part document and part user interface, all driven by GPUs capable of handling 3D games.
A third group wrote the code that eventually shipped as Windows Communication Foundation (WCF), for building networking functionality - a critical component of almost any modern application.
All of this sat on top of the C# bet, and it did not stop with those three pillars: the plan was that new Windows APIs across the board would be managed APIs.
So what happened? The project dragged on for two years before collapsing under its own weight, in what became known as the Longhorn reset. When the Core groups finished the 64-bit Windows release and returned to Longhorn, what they found was an enormous amount of code. Unfortunately, when you build a complex system without clear constraints and deadlines, the right mental model for a team that has generated a lot of code is not a crew laying a railway across the country that is already 90% of the way there. A better comparison is a team that has dug an incredibly deep hole and now has to work out how to climb back out and fill it in. The team was only then coming to grips with the full consequences of trying to deliver OS functionality on top of this managed infrastructure. They realized they had a ton of work just to make the basic premises a reality. On top of that, not one of the major components had been brought anywhere near readiness. They were also beginning to understand the performance cost of introducing major new subsystems into existing code: WinFS and Avalon did not replace the existing OS infrastructure but sat on top of it, so their substantial resource costs were simply added on top.
As explained in a 2005 Wall Street Journal article, Jim Allchin decided to pull these major components from the release while their development continued. As a result, three years into the product cycle, work effectively had to start over from scratch. All the managed functionality had to be removed from the base OS and delivered separately. Excluding it was clearly the right decision, but the removal of both subsystems exposed, and caused, problems that would persist for a decade.
The bet on C# and managed code included a strategy of reduced investment in the unmanaged Win32 layers. I remember long meetings trying to persuade representatives of the Windows division to make relatively small commitments to the text and graphics functionality Office needed. The removal of the C# components from the release made it even more obvious that Windows would be left for years without key user-interface controls for developers (such as Office) in its core Win32 API.
The bet on Avalon also went hand in hand with gutting IE. The IE team was largely redirected to Avalon and IE itself was left in maintenance mode. The longer-term plan was that HTML would become just another legacy content type hosted inside the Avalon canvas, with developers moving from HTML to Avalon.
This was a huge strategic mistake that opened the door to the spread of Firefox and then of Google's Chrome browser. It is impossible to say whether continued investment in IE would have prevented that outcome, but it certainly would not have made things worse. The halt in investment also hollowed out the IE team, leaving it unprepared and understaffed in the face of the continuing rapid evolution of web technologies, which ruined IE's reputation among web developers. That a mistake had been made soon became obvious to everyone in the company: you did not need to be a visionary. Office and other divisions were investing heavily in the web and HTML. There was no realistic path by which those tools, let alone the whole industry, could move to Avalon. In fact, the Avalon team never described such a path: something magical was supposed to happen, and suddenly everyone would be writing Avalon applications instead of HTML. It was as absurd as it was brazen. Right after we “won” the browser war and watched AOL take over Netscape, we drastically cut further development of these open-standard technologies. Only with the start of the Windows 7 project was the IE group restaffed and active investment in IE and standard web technologies resumed.
Other problems flowed from Avalon as well.
The Avalon model was built around Bill's concept of a universal canvas - a runtime for applications. As I explained in my post “Leaky by Design,” one of the central problems for the designers of frameworks like Avalon is how to expose functionality at different levels, so that applications can bind at the level appropriate to them without paying unnecessary performance costs. If you expose functionality only at the highest level, all of the underlying work is effectively out of reach for more complex applications (such as the Office apps) that need to bind at lower levels. It took another ten years, until Windows 10, before these problems were solved at the architectural level.
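A sketch of that layering principle, with invented names rather than Avalon's actual types: the convenience layer is built openly on primitives that remain reachable, so a complex application can bind lower without paying twice.

```cpp
#include <iostream>
#include <string>

namespace lowlevel {                       // primitives: explicit and verbose
    struct GlyphRun { std::wstring text; float x, y; };
    void DrawGlyphRun(const GlyphRun& run) {
        // Stand-in for a real renderer.
        std::wcout << L"glyphs at (" << run.x << L"," << run.y << L"): "
                   << run.text << L"\n";
    }
}

namespace highlevel {                      // convenience built on the primitives
    void DrawParagraph(const std::wstring& text, float x, float y) {
        // Real line breaking and shaping would happen here, then delegate down.
        lowlevel::DrawGlyphRun({text, x, y});
    }
}

int main() {
    highlevel::DrawParagraph(L"Hello", 10, 20);        // simple app binds high
    lowlevel::DrawGlyphRun({L"Custom layout", 0, 0});  // Office-class app binds low
    return 0;
}
```

A framework that hides the lower namespace forces heavyweight applications either to pay for work they then redo, or to bypass the framework entirely.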
Avalon also bet on power-hungry graphics cards. The mobile graphics model, although it shares some of the same elements, focuses mainly on achieving smooth animation through pre-rendered textures, or layers, which are scaled, panned and blended. The number of layers is carefully limited so that the mobile GPU can sustain the high frame rate needed for smooth animation and interaction while consuming little power. Avalon's graphics model moved in effectively the opposite direction.
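A minimal sketch of the mobile-style model described above, with stand-in types rather than a real graphics API: content is rasterized once into a few cached layers, and each frame only transforms and blends those textures, which is cheap in power.

```cpp
#include <cstdio>
#include <vector>

struct Texture { int id; };                        // pre-rendered pixels on the GPU
struct Layer   { Texture tex; float scale, dx, dy, opacity; };

// Stand-in for the GPU compositor: per frame, only transform cached textures.
void ComposeFrame(const std::vector<Layer>& layers) {
    for (const Layer& l : layers)
        std::printf("draw tex %d scale=%.2f offset=(%.0f,%.0f) alpha=%.2f\n",
                    l.tex.id, l.scale, l.dx, l.dy, l.opacity);
}

int main() {
    std::vector<Layer> scene = {
        {{1}, 1.0f, 0.0f,   0.0f, 1.0f},   // page content, rendered once
        {{2}, 1.0f, 0.0f, -40.0f, 1.0f},   // scrolling = changing an offset only
    };
    ComposeFrame(scene);  // Avalon, by contrast, re-evaluated a rich scene graph
    return 0;
}
```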
The problems with WinFS turned out to be, in some ways, even more fundamental than those with Avalon. Avalon at least shipped separately, and some of its key concepts became the basis of the UI components in Windows 8 and 10; the WinFS project was stopped completely.
Initially, WinFS was supposed to be the file system itself, not a layer above it. But putting a relational engine on the critical file-system path raised enormous performance problems, and the core file system team (which was busy with the 64-bit Windows release) was deeply skeptical. WinFS therefore ended up running as a service on top of NTFS, which meant its heavy resource costs were simply added to those of the existing system. Performance aside, the vision depended on an ecosystem of applications adopting WinFS as the place to store and share their structured data. There was no “killer app” pulling users or developers toward it, so Microsoft would have had to seed that ecosystem itself - and even internal teams could not see what problem, whose solution users were actually waiting for, WinFS would solve for them.
At a deeper level, Bill's vision of an ecosystem of applications storing everything in, and sharing their data through, this relational repository directly contradicts the way applications build their data models. While some desktop applications (and almost all internal IT applications) use relational storage for their internal data models, they do not want to expose those data models for uncontrolled reading and writing by other applications. I explained some of the fundamental reasons in the “Leaky by Design” post mentioned above. There were (and are) plenty of other options for applications that want relational storage. And of course the long-term trend was for all this data to move to the cloud, not to be locked into local PC storage.
Cutting these components from Vista did not end the internal battles over managed code. Responsibility for the managed layers moved into the developer division (DevDiv), and while the Windows team built Windows 7 largely without them, DevDiv kept pushing managed code as the future of the Windows API. When Steven Sinofsky took over Windows and began planning Windows 8, he decided that Windows would own its developer platform itself rather than depend on DevDiv. The new Windows APIs were native, and the XAML team rebuilt a native-code version of the Avalon UI technology inside Windows. To the managed-code faithful this was “heresy” (although, notably, the biggest clients of the platform, such as Office, were native-code applications). In the same spirit of reasserting control over the platform, Flash (and Microsoft's own Flash competitor, Silverlight) were shut out of the new touch-first browsing environment. A decade of churn in Microsoft's developer platform was the direct legacy of the original managed-code bet.
If middleware is one of an OS vendor's worst nightmares - the loss of its role as the mediator - then it becomes clear that “we have met the enemy and he is us.” I do not claim to have insider information from that period, but I remember the frustration with this fixation on the managed layers and with their uselessness for most Office scenarios, even though I could not yet articulate the strategic issues clearly. In fact, it was the systems innovation in iOS that made plain, in hindsight, just how wrong the chosen direction of work was.
Even with the C# components stripped out, Vista as it shipped was immensely resource-hungry. Windows XP had launched with a minimum requirement of 64 MB of RAM, later raised to 128 MB. Vista's formal minimum was 512 MB, but in practice it needed 1 GB (hence the “Vista Capable” lawsuit). Machines stickered and sold as ready for Vista often were not: under pressure from Intel, Microsoft agreed to certify hardware that could not deliver the full Vista experience. So for many users the first encounter with Vista happened on machines where it ran terribly. Windows 7 put performance back at the center, but by then the damage to Vista's reputation had been done.
Another blow to Vista's reputation for stability came from problematic drivers - the key software written and supplied by hardware makers (graphics cards, network cards, hard drives and so on) to plug into the OS. Vista made an important change to the driver model, moving drivers out of the OS kernel into a layer that could be managed more robustly. The famous Windows XP “blue screen of death” after a crash was almost always caused by the failure of some third-party driver. By getting this code out of the kernel, Windows should have become more reliable overall.
The change in the driver model required massive changes to the Windows driver code from every hardware maker. The breadth of that driver ecosystem - a major point of differentiation from competitors - becomes a millstone around your neck when you try to push changes of this scale through the entire ecosystem. Because Vista's release kept slipping, hardware makers struggled to prioritize and schedule the work, and many were simply not ready when Vista shipped. That meant many users' first experience of the new system came with missing or badly buggy drivers.
The collapse of processor speed scaling that I mentioned at the start of this article also contributed to the performance problems. The computer industry had evolved under exponential growth in data processing - the amount of data that can be stored and processed, processing speed, bandwidth, and communication latency between devices. Much of this traces back to Moore's law, the regular doubling of the number of transistors that fit in a given area of an integrated circuit. Users know this simple doubling law well: they expect Moore's law to show up as faster processors, more RAM, bigger storage and faster connections.
The reality is a bit more complicated. Rising processor speeds bring rising power consumption and heat dissipation. I remember a vivid chart of heat in Intel processors: on a logarithmic scale, a straight line ran from the earliest Intel chips, through the Pentium, up to the heat flux at the surface of the Sun. Stein's law comes to mind: “If something cannot go on forever, it will stop.” The computer industry ran headlong into the “power wall”: processor speeds could not keep scaling without unacceptable growth in power consumption and heat. If you look at the graph of processor speeds, the turn came in 2003, at the height of the Vista fiasco. Other trends also made naive expectations of performance improvement dangerous. Chip makers kept producing denser memory chips, but the “memory wall” - the latency between the CPU and memory - made it ever harder to use all that memory efficiently. Perhaps the biggest obstacle to building a balanced PC was that disk capacity kept growing while the number of random I/O operations per second grew far more slowly. Larger programs fit on the larger disks, but they loaded much more slowly. Because of this imbalance, fast processors could easily issue I/O requests faster than the disk could service them; the result was sluggish systems despite faster processors and more memory.
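A back-of-envelope calculation makes the disk imbalance vivid. The numbers are rough, period-typical assumptions of mine, not measurements:

```cpp
#include <iostream>

int main() {
    const double random_iops   = 100;                  // ~7200 rpm desktop drive
    const double io_size_bytes = 64.0 * 1024;          // assumed average read size
    const double working_set   = 512.0 * 1024 * 1024;  // assumed cold-start bytes

    // Faulting the working set in via random I/O:
    double random_secs = working_set / (random_iops * io_size_bytes);   // ~82 s
    // Reading the same bytes sequentially at ~60 MB/s:
    double seq_secs = working_set / 60.0e6;                             // ~9 s

    std::cout << "random I/O: ~" << random_secs << " s, "
              << "sequential: ~" << seq_secs << " s\n";
    return 0;
}
```

Capacity let the working set grow, but random IOPS barely moved, so each doubling of code and data pushed cold-start times further out - which is exactly how a bigger OS on a bigger disk ends up feeling slower.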
Vista came out just as the shift to mobile computing was accelerating. Laptop sales passed desktop sales in revenue terms in 2003, and by 2005 laptops had overtaken desktops in units sold as well. Because Vista ran so poorly on the new cheap laptops (“netbooks”), Microsoft had to allow OEMs to keep shipping Windows XP on these lower-priced machines.
An important part of what was happening was a deeper issue: the fundamental sufficiency of the desktop PC form factor for the tasks it was used for. The main uses - productivity (mostly Office), communications, browsing (including search, websites and web applications), some internal business applications, and front ends for specialized devices (think of your dentist's X-ray machine) - had broadly stabilized by around 2000 and have not changed much since. Microsoft could create new APIs, but computers already did almost everything users needed. The improvements people wanted were better manageability, stability, performance and security on the software side, and longer battery life, lower weight, faster processors, faster communications and bigger screens on the hardware side. Many of those improvements called for less software, not more.
For an industry raised on relentless improvement, this is a hard thing to admit - it smells of the dreaded word “commoditization.” “Sufficiency” is close to heresy in the technology business. But sufficiency does not mean that nothing changes; it means that the character of change shifts. Improvements become incremental rather than transformative, competition moves toward price, reliability and convenience, and the energy and profit in the ecosystem move elsewhere. That is exactly what was happening to the desktop PC.
Notice that I am talking about the sufficiency of the form factor. Overall demand for computing in the economy kept growing explosively. But faster, ubiquitous communication meant more flexibility in how an application distributes its computing needs (data and processing) across the nodes of a system. Over the past two decades, driven by many factors, an ever larger share of processing has moved to the server, or to the cloud. I would rather have those compute cycles powered by the Grand Coulee Dam, with the server sitting in eastern Washington state, than carry the battery around with me. And if data must be reached from several devices or by several users, you want it stored and processed on a server, not on a local PC. Advances in wireless communication (and in overall end-to-end bandwidth) have made this division of labor between devices exceptionally stable.
Nor is this confined to desktop computers (including laptops). Tablets have probably demonstrated form-factor sufficiency more clearly than anything else, though in the longer view it did not look that way: for decades, successive reincarnations of the tablet disappointed on weight, battery life, computing power, input methods and overall responsiveness. But when the iPad arrived, with its combination of screen size, weight, battery life, touch input, processor speed and instant-on, we reached sufficiency. The changes since then have been mostly incremental - which drives crazy the engineers who pour enormous energy and creativity into them. The Maytag engineers working on a new washing machine must feel much the same.
Vista, in other words, landed at the worst possible moment: a heavier OS that demanded ever more from the hardware, precisely when what users needed was software that demanded less. The mismatch between what was built and what the moment required could hardly have been greater.
So what are the lessons?
For me, they come back to the three root causes I began with. Microsoft misread the hardware trends; it made a poorly grounded and poorly executed bet on managed code; and the Windows organization lacked the project discipline to see, in time, how far off course it was. Would avoiding those mistakes have changed the larger outcome? I do not know. The forces behind mobile and the cloud were enormous, and others rode them brilliantly; perhaps the endgame would have looked much the same. But Microsoft would have met that moment from a position of strength, instead of spending years digging itself out of a hole - and the years it lost were precisely the years in which the industry was being redefined.
Thinking about sufficiency and what it does to an industry, I keep coming back to a wonderful Alec Guinness film, “The Man in the White Suit,” from 1951. An inventor creates a fabric that never wears out and never gets dirty, and everyone - mill owners and workers alike - turns against him. A product that is finally good enough is a wonderful thing for its users, and a terrifying one for everyone whose business depends on it never being finished.