
XGI Technology, the story of an ambitious company

Today, the market for discrete graphics accelerators is divided between two giants, AMD (formerly ATi) and NVIDIA, but it was not always so. At the beginning of the 21st century there was a third ambitious player in the graphics accelerator market. Many have since forgotten XGI Technology, and yet not so long ago, in 2003, these guys were threatening to kick the door open, climb into the market for high-performance graphics cards, and push aside today's industry leaders...



XGI Technology was founded in June 2003 in Taipei, Taiwan.


The story of XGI, strange as it may seem, should start with another company, namely Trident Microsystems. Trident, a supplier of budget graphics processors, became widely known in the 90s. At the time, Trident provided the video in most computers with 286 and 386 processors; in most cases these were 2D accelerators. Despite its reach and success in the 2D accelerator sector, in the late 90s its attempts to do something in 3D collapsed over and over again. In the end, after a string of failures, the company's engineers managed a small breakthrough, and in 1999 the Trident Blade3D entered the market. Many probably still remember this accelerator. The Blade3D could not claim a place in the sun: its architecture was 64-bit, the maximum video memory was only 8 MB, and in performance it was no match for the monsters of the time, the 3dfx Voodoo Banshee, Matrox Millennium G200 and NVIDIA Riva TNT. It could, however, compete with the budget Intel i740, and thanks to innovations such as the Trident Hardware-Assisted MPEG-2 Architecture for DVD (hardware-accelerated DVD playback) and full-scene anti-aliasing, as well as a price of about $20, it sold well enough and found a place in many system units of the era. Trident later refined the Blade3D and even prepared to compete in the DirectX 8.1 video card sector, then dominated by the GeForce4 Ti 4600 and ATi Radeon 9700, with its new brainchild, the Trident XP4. Priced at under $99 with the performance level of a GeForce4 Ti 4200, the XP4 was supposed to be the company's ticket into the high-performance graphics market. Unfortunately, the need to constantly increase the 3D development budget and fierce competition with Intel in the low-end graphics segment slowly killed the company. In June 2003, Trident announced a complete restructuring.
It was then that the heroine of today's article, XGI Technology, bought Trident's graphics division, while Trident itself moved on to developing chips for home appliances.


Almost immediately after its founding in 2003, XGI Technology absorbed the graphics division of Trident Microsystems.


At the turn of the 21st century, another well-known company, Silicon Integrated Systems (SiS), held a prominent share of the motherboard chipset market. The SiS range included chipsets for all the platforms of the time, and a certain share of them came with integrated graphics, SiS Mirage Graphics. The company competed successfully with VIA and Intel in the chipset segment, but its attempts to enter the discrete card market failed miserably again and again. At the time SiS had a video chip, the SiS Xabre, which it was trying to promote with all its might. Chaintech, which at one point produced a lot of SiS Xabre-based products, was giving them away practically for nothing, trying to somehow free up its warehouses and cut losses. At retail these cards sat on the shelves for about $30, and even at that price nobody wanted them. Understandably, this situation did not suit SiS management at all, and the board of directors pondered what to do next with the company's graphics division. A decision had to be made: either close the graphics development unit, or give it money, its own staff and a good kick in the rear, and set it off to swim freely. Trident ran into trouble just in time for SiS. The board chose the second option: it was decided to spin off the graphics solutions division into a separate company and take in the engineers of the struggling Trident. Thus was born the SiS subsidiary, XGI Technology.

Less than three months passed between the founding of XGI and the announcement of its first products.



From its foundation, XGI Technology had two teams of engineers at once, from SiS and from Trident. Both teams had nearly finished products: the SiS Xabre and the Trident XP4. Management hoped that with two groups of highly qualified specialists under its wing, XGI could create competitive products and grab a decent market share from the leaders, NVIDIA and ATi. The plans even mentioned a figure of 10% of the market within a year! Already in September 2003 a whole Volari product line was announced, including, no less, a dual-chip video accelerator, the Volari Duo. The full lineup of XGI Volari graphics chips covered both desktop and mobile systems.


The release of finished products was scheduled for the end of 2003, six months after the founding of XGI.



In fact, the Volari V3 chip was nothing more than a modified Trident XP4. Likewise, the mobile Volari XP5 was the same old chip, reworked by Trident engineers for use in mobile computers. The rest of the cards were developed by the SiS team, drawing on its SiS Xabre experience. The Volari V3 and XP5 supported only DirectX 8.1, while all the other products already boasted DirectX 9 and OpenGL 1.4 support. Interestingly, XGI's product announcement was not just a paper launch. Already at Computex 2003, at the end of September 2003, an engineering prototype of the dual-chip card based on the Volari Duo V8 Ultra was shown at the company's booth, and its first benchmark results were presented to the public: 5370 points in 3DMark 2003. That was quite impressive, although it fell short of the big-name rivals, the GeForce FX 5900 Ultra and Radeon 9800 Pro.
XGI planned to ship finished products by the end of 2003; prospective partners included CP Technology, Gigabyte and Club-3D, and negotiations with ASUS and MSI were under way. Many reputable print publications and reviewers were looking forward to an explosion in the graphics accelerator market. After all, the appearance of another strong competitor in a market divided between two giants is always interesting, and no one knows how it might end. As the well-known saying goes, three's a crowd, and everything suggested that XGI wanted, and was ready, to destroy that stereotype.


The first Volari cards reached reviewers only by January-February 2004.



When the press finally got samples of XGI products, reviewers set about studying the performance of the new cards. Practically everyone was, of course, interested in the top product, the Volari Duo V8 Ultra. The product's main problem came to light immediately. In synthetic tests, including 3DMark, the dual-chip monster showed tolerable performance; it lagged behind the flagships of its main competitors of the time, ATi and NVIDIA, but that lag could have been excused by the final product's price. In real games, however, performance was simply nonexistent. Not only did the second GPU fail to work in almost all applications, so that reviewers effectively had a single-chip Volari V8 Ultra in their hands, there were also serious problems with the drivers and with rendering quality. Much of this stemmed from a reluctance to cooperate and a lack of real work with game developers while the games themselves were still in development. Often developers could not even test their games on Volari cards, since they simply did not have any. As a result, XGI had to sort everything out on its own in the drivers. Incidentally, the interface of the driver settings utility was an exact outward copy of the old SiS utility. To improve performance at least somewhat, concessions had to be made: rendering quality was lowered here and there, some features were disabled entirely, and some games did not run on XGI cards at all. It got absurd: the driver developers blocked the ability to take screenshots in games, so that image quality could not be compared with other cards. Funny but true: XGI probably made a profit only on selling card samples to the press, because it practically never sent samples out for free but was happy to sell them.
The Volari Duo V8 Ultra went on sale at a price of about $500, and there were few takers for a product on which some games simply did not work.


In early 2004, only one major partner decided to release Volari cards.



In 2004, the company seemed entirely unconcerned by the failure of its own products: XGI kept trumpeting the future of its graphics chips at every opportunity, not forgetting to show off its roadmap, and the plans were ambitious indeed, a DirectX 10 accelerator by 2005! A hasty transition to the then-promising PCI Express x16 bus was announced. At the same time, XGI began selling video adapters under its own brand, because its partners had started scattering from the sinking ship, and, as the example of 3dfx teaches, producing your own cards is an expensive and very risky business. Prices immediately went down in an attempt to stimulate buyers: XGI, together with the Chinese manufacturers that remained loyal, began selling the Volari Duo V8 Ultra 256MB DDR2 for under $200 and the Volari V8 Ultra for under $100. By the end of 2004, XGI introduced its server solutions, the Volari Z7 line for accelerating 2D graphics, which, incidentally, did win a place in the server segment; Volari Z7 chips still work in many servers that have survived to this day. Unfortunately, what everyone expected from XGI in 2004 was something rather different, namely new products, and those never appeared. The company kept postponing deadlines. Logically, that was natural and understandable: even for the industry leaders (ATi and NVIDIA) it takes about a year and a half to develop a fully new product, and that is with the enormous experience of engineering teams honed on previous generations of graphics chips. Could a company formed just over a year earlier really roll out a truly powerful product, and a dual-GPU one at that? At the time, none of the market leaders had even contemplated cards with two GPUs.
At Computex 2004, the same good old Volari V3, V5 and V8 accelerators and their two-headed brethren were presented, with one difference: the accelerators had migrated from the AGP bus to the then-promising PCI Express x16.


In February 2005, XGI entered the North American market with its Volari products.



In 2005, new monsters from NVIDIA and ATi, the GeForce 6800 Ultra and Radeon X850 XT, were already on the market. Even with serious reworking and improved drivers, XGI products could not compete with these cards. If in 2004 the cards still sold at least somehow, by 2005 sales withered completely; there were practically no volunteers willing to buy Volari cards even for pennies. Once again it fell to the Trident team to salvage the situation. In 2005, a new budget accelerator developed by Trident engineers, the Volari 8300, was rolled out to the public, priced at $49.95. The card had a PCI Express interface, and the chip had a built-in MPEG-2 decoder, meaning that with this card DVD decoding was handled by the video card rather than the processor. Only DirectX 9.0 with Shader Model 2.0 was supported, while the GeForce 6 series already supported Shader Model 3.0. The Volari 8300 could output HDTV images up to 720p/1080i via S-Video. All in all, the card turned out quite successful for multimedia PCs and media centers, but alas, it could no longer save the company.


By the end of 2005, XGI had shown no new chips in the High-End segment.


By 2006, no one believed in a bright future for XGI; even if a miracle had happened and the new products had seen the light of day, they would have had to compete with the new generation of cards from the "graphics duo". XGI was on the verge of bankruptcy, and rumors circulated about the sale of the company and the winding down of its activities. In March 2006, the final point in the fate of this once-promising manufacturer was reached: it became official that ATI Technologies was buying XGI. According to experts, the deal was closed for a mere 8-12 million dollars. By its own account, ATi was interested in two things: the XGI engineering team and its extensive contacts with partners in the Chinese market. Thus ended the history of a once-ambitious company. XGI existed for less than three years from the day of its founding, yet managed to stir up the tech community quite noticeably. XGI was quickly forgotten, but you can still buy the old cards on eBay, mostly the Volari Z7, though sometimes truly interesting items turn up, like the Volari Duo V8 Ultra; fans of computer antiques snatch up such lots instantly...

Source: https://habr.com/ru/post/123555/

