Today the market for discrete graphics accelerators is divided between two giants, AMD (formerly ATi) and NVIDIA, but this was not always the case. At the beginning of the 21st century there was a third ambitious player in the graphics accelerator market. Many have forgotten XGI Technology, yet not so long ago, in 2003, this company threatened to kick open the door to the high-performance graphics card market and push aside today's industry leaders...
 At the time, Trident supplied the video hardware for most computers with 286 and 386 processors, in most cases plain 2D accelerators. Despite its ubiquity and success in the 2D accelerator sector, the company's attempts to do something in 3D in the late 90s collapsed over and over again. In the end, after a series of failures, Trident's engineers managed a small breakthrough, and in 1999 the Trident Blade3D entered the market. Many probably still remember this accelerator. The Blade3D could not claim the best place in the sun: its architecture was 64-bit, the maximum amount of video memory was only 8 MB, and it could not compete in performance with the monsters of the time, the 3dfx Voodoo Banshee, Matrox Millennium G200 and NVIDIA Riva TNT. It could, however, compete with the budget Intel i740, and given innovations such as the Trident Hardware-Assisted MPEG-2 Architecture for DVD (hardware support for DVD playback) and full-scene anti-aliasing, as well as a price of about $20, it sold well enough and won a place in many system units of the era. Later, Trident refined the Blade3D and even prepared to compete in the DirectX 8.1 video card sector, then dominated by the GeForce4 Ti 4600 and ATi Radeon 9700, with its Trident XP4. The XP4, priced at under $99 and offering performance on the level of a GeForce4 Ti 4200, was supposed to become the company's ticket into the high-performance graphics market. Unfortunately, the need to constantly increase the 3D development budget and intense competition with Intel in the lower price segment of graphics cards slowly killed the company. In June 2003, Trident announced a complete restructuring. It was then that its graphics division was bought by the heroine of today's article, XGI Technology, while Trident itself moved into developing chips for home appliances.
 SiS Mirage Graphics.
 SiS itself competed successfully with VIA and Intel in the motherboard chipset segment, but its attempts to enter the discrete card market failed miserably time after time. At that point SiS had one video chip, the SiS Xabre, which the company was trying to promote with all its might. Chaintech, which at one time produced a lot of SiS Xabre-based products, was practically giving them away, trying to clear its warehouses and cut its losses. At retail these cards sat on the shelves at about $30, and even at that price nobody wanted them. Understandably, this situation did not suit SiS management at all, and the board of directors was pondering what to do next with the company's graphics division. A decision had to be made: either close the graphics development unit, or give it money, its own staff and a good kick in the rear, and set it off to swim on its own. Trident ran into trouble just in time for SiS. The board chose the second option: the graphics division was spun off into a separate company, and the engineers of the struggling Trident were taken on board. Thus was born the daughter company of SiS, XGI Technology.


 Not only did the second GPU fail to work in almost all applications, so that reviewers effectively had a single-processor Volari V8 Ultra in their hands, there were also big problems with drivers and rendering quality. Many of these problems stemmed from an unwillingness to cooperate and a lack of effective work with game developers during the development of the games themselves. Developers often could not even test their games on Volari cards, because they simply did not have any. As a result, XGI had to patch everything up on its own in the drivers. Incidentally, the interface of the driver settings utility was an exact outward copy of the old SiS utility. To improve performance at least somewhat, concessions had to be made: rendering quality was lowered in places, some functions were disabled altogether, and some games did not run on XGI cards at all. It reached the point of absurdity: the driver developers blocked the ability to take screenshots in games so that image quality could not be compared with other cards. Funny but true, XGI probably made a profit only from selling card samples to the press, since it hardly ever sent samples out for free but was happy to sell them. The Volari Duo V8 Ultra went on sale at a price of about $500, and there were few people willing to buy a product on which some games simply did not work.
 Producing its own cards proved an expensive and very risky business. Prices immediately went down: trying to stimulate buyers, XGI, together with the Chinese manufacturers that remained committed to it, began selling the Volari Duo V8 Ultra 256MB DDR2 for less than $200 and the Volari V8 Ultra for less than $100. By the end of 2004, XGI introduced its server solutions, the Volari Z7 line of 2D accelerators, which, by the way, did win a place in the server segment: Volari Z7 cards still run in many servers that have survived to our time. Unfortunately, in 2004 something different was expected from XGI, namely new products, which never appeared. The company kept postponing deadlines. Logically, this was natural and understandable: it takes the industry leaders (ATi and NVIDIA) about a year and a half to develop a fully fledged new product, and that is with the enormous experience their engineering teams accumulated while creating previous generations of graphics chips. It would have been naive to expect a company formed just over a year earlier to roll out a really powerful product, and a dual-GPU one at that! At the time, none of the market leaders had even considered shipping cards with two GPUs. At Computex 2004, the same good old Volari V3, V5 and V8 accelerators and their double-headed brethren were presented, with one difference: the accelerators had migrated from the AGP bus to the then-promising PCI Express x16.
 Somehow the former Trident team had to improve the situation again. In 2005, a new budget accelerator developed by the Trident engineers, the Volari 8300, was rolled out to the public. It was decided to sell the new card for $49.95. The card had a PCI Express interface, and the chip had a built-in MPEG-2 decoder, meaning that DVD decoding was handled by the video card rather than the CPU. Only DirectX 9.0 with Shader Model 2.0 was supported, while the GeForce 6 series already supported Shader Model 3.0. The Volari 8300 could output an HDTV image up to 720p/1080i via S-Video. All in all, the card turned out to be quite a good fit for multimedia PCs and media centers, but alas, it could no longer save the company.
 XGI was on the verge of bankruptcy, and there were rumors about the sale of the company and the winding down of its activities. In March 2006, the final point was put in the fate of this once-promising manufacturer: official word came that ATI Technologies was buying XGI. According to experts, the deal closed for a pittance, some 8 to 12 million dollars. By its own account, ATi was interested in two things: the team of XGI engineers and its extensive contacts with partners in the Chinese market. So ended the history of the once ambitious company. XGI existed for less than three years from the day of its foundation, yet managed to stir up the industry considerably. XGI was quickly forgotten, but you can still buy the old cards on eBay, mostly the Volari Z7, though sometimes really interesting items turn up, such as the Volari Duo V8 Ultra; fans of computer antiques snatch them up instantly...
Source: https://habr.com/ru/post/123555/