... While the workshop is once again grinding away four days out of five, with hardware, chip and motherboard scraps and screenshots flying out of it in large numbers every day, I can say that there is already some material, which I will try to assemble into digestible form below.
To begin with, an explanation of the reasons and goals of this post. The post essentially makes three very important points:
Don't trust - verify!
While you're young, hammer that anvil!
Take everything you can from the hardware (and don't give it back)!
Now about the goals.
First, let us recall (without quotes or proof, for the sake of peace of mind) what reviews of early hardware 3D on the PC looked like in the late 90s and early 2000s ...
- Exhaustive attention to the beloved 800x600 and 1024x768 resolutions. Doesn't it seem a little strange to be measuring, in '99, which of the cards with all the newfangled features would show 60 fps at 800x600? After all, those cards already supported high resolutions in 3D - that is what needed testing!
- A detailed review of the "2D component" (only oscilloscope traces were missing). So important when you have a CRT monitor!
- Plenty of coverage of theory, but no specifics; even predictions for the future!
- A bunch of beautiful Excel graphs with some outrageous numbers. Have you ever wondered what those numbers actually meant? Dollar amounts, I'd guess, what else.
... Nowadays it generally goes like this: plug in the video card, the drivers download themselves, launch the game, it patches itself and sets up the graphics, switch on Fraps and off you go. What can you say, only two live APIs are left, and even then OpenGL turns up less and less in top games.
What about before? There were more vendors and more APIs, so comparing performance was rather more problematic - much harder and much slower. And nobody thought about monitoring that way at all back then. Honestly, show me at least one review from those years where cards are compared and, at the same time, screenshots from some demanding game (say, on the Unreal Engine) are provided with live monitoring of at least basic FPS and CPU utilization?
Since November '95. There were none. Not a single review containing such shots. Shock, heart attack :x
Back then hardly anyone seriously used FRAPS at all. Which means everything was done by eye (well, almost).
- And for dessert, remember the configs everything was tested on? Turning a blind eye to the motherboard (chipset), it was good if at least the CPU kept up with what the video card demanded. In general, something else should have been monitored in the background (free resources, maybe?).
Okay, the hardware aside: where could you even get a decent comparison of two cards from different vendors, if most often both cards were compared in ... well, off the top of my head: 3DMark, Quake3, one_more_mass-market_game_of_middling_requirements.
For some reason nobody even remembered such giants as Unreal or Hitman. Yet these were top-notch engines (which no card of the time could drive properly), with lots of settings, support for several APIs, and support for decent resolutions (above 1024x768).
And don't tell me about the state of the economy or 15-inch CRT monitors. Stores at the time were already selling, if not LCDs, then at least flat CRTs supporting decent resolutions! The potential was there. Potency, as they say, is an important parameter. That is what we measure here.
By March all of this had finally robbed me of my peace: "how is it that I am 15 years late???" So I decided to run a couple of reviews with my own merciless analysis, chip by chip.
But what will be here?
Here there will be no manipulations with the PC: memory upgrades, CPU overclocking and similar gestures. I assembled the top of its generation - it makes no sense to overclock or upgrade this machine.
CPUs and motherboards are the least interesting components of an old PC, until you want something unusual. What is the point of keeping nearly identical computers? They are not works of art, just faceless, soulless boxes. Early 3D video and sound - now that is interesting. CPUs and motherboards are all the same and boring.
And there will be plenty of detailed stops at each board and its features: monitoring, comparisons across different APIs, with different tweaks and with overclocking. Analysis of the input data, modeling of a discrete random variable over twenty-six iterations, calculating the variance, plotting an integral function that accurately describes the resulting picture, subjective and analytical estimates. In short, a thoroughly nerdy approach, which reviewers have for some reason always neglected.
ATTENTION Config:
MB: Gigabyte 6VTXE (Apollo Pro133T chipset)
CPU: Pentium 3-S 1.4GHz / 512KB / 133FSB
RAM: 3x256MB PC133
HDD: a healthy UDMA-5 WD with a decent 2MB cache and 5400RPM
OS: WinME
Video:
<awaiting insertion ...>
Why WinME? Well, flash drives work right out of the box, and it doesn't bluescreen over 768MB of memory ... Not enough? Here is what else I loved WinME for:
- Auto-updates are absent almost everywhere;
- Complete OS instability. A full-screen process hangs? Kill the damn thing;
- But, traditionally, you can exit a blue screen by pressing Any key.
- A conservative OS. It's fine that you can't really get on the Internet: the newest third-party browser for it barely understands JavaScript, never mind HTML5. On the other hand, it boots like Windows 7 on an SSD. In general, an OS made for productive work.
- The hardest quest is not drivers at all. Try finding monitoring software that works as a 3D overlay. There is Everest - it runs in the background. But what do you output the overlay in 3D with? I had to study the question for a month or two. The best part was the first time I tried to install something:
- I install the application. I run it. "Please update your Windows to a newer version!"
After that I chose more carefully. Even so, monitoring never really worked in D3D ...
Now, the game. The game I chose is Deus Ex. There are several reasons for this.
First, it is that unfinished childhood game, unreachable because of its system requirements. My brother and I played it, on and off, for a month. Very vague memories of sleep deprivation as "5 more minutes" stretched out to 2 am. So vague that during the recent playthrough it turned out I had actually gotten quite far back then ... I have long wanted to play through Deus Ex without stuttering, at maximum settings (!!! so that the level of detail makes you shiver !!!) - isn't that reason enough?
Secondly, and seriously: the game runs on Unreal Engine 1. Look at the power this engine offered back in 1998:
- support for the vast majority of 3D features such as trilinear and anisotropic filtering, multitexturing and all kinds of lighting;
- support for 16-channel sound, effects and sound positioning, plus support for hardware sound processing (yes, that was once relevant too ;));
- support for 5 3D rendering APIs!!! You read that right: besides DirectX there were at least 4 more;
- for owners of 2D-only cards (the ones without hardware 3D acceleration): a full-featured software renderer awaits you!!! :yes:
- all of the above somehow had to be crammed in and made to actually work :whistling:
- and from all of the above follows the bottom line: the engine sold well and was widely adopted.
At a glance

Before you, once again, is that same legendary "Curve", the 3dfx killer (here in the guise of an ASUS V3800PRO). That claim always seemed doubtful to me, which is why we begin with it.

It supports AGP4x, though, by the way, it has no sideband addressing. Well, now we will check whether any of that actually matters ;)
... As for the rest of the features, such as the feature connector and the TV tuner, let's set them aside ...
Closer to the heart
The card is built around the RIVA TNT2 PRO chip, born in 1999.

By the way, TNT stands for TwiN Texel. It was the first chip in desktop history (!) capable of applying two textures per clock. And this here is already TNT2.
This card (the PRO) is essentially a factory-overclocked version of the standard TNT2 (by the way, there was also an ULTRA in the lineup - essentially a factory-overclocked version of the PRO). It runs at 142/166 MHz (chip/memory).

By the way, this is the coolest video card on which you had to choose between either honest trilinear filtering or multitexturing. Scary stuff already ...
But what does all this actually mean, and what does it look like in practice? That is exactly what we are going to figure out.
Implementation
The drivers built into WinME are very old; with them even Quake3 refuses to start, complaining that your OpenGL is outdated or simply missing.
The ASUS drivers for the TNT2 are no good either. Just look at the abundance of options, clear and accessible to any schoolkid, such as "Buffer area expansion" or the "Transon mode" ... Trapson ... damn ...

OMFG, poor users! There is also a ForceWare driver from as late as 2005(!), with support for no less than D3D9. With it, Deus Ex artifacts wonderfully. Now it becomes clear why owners were so desperate to trade this card for a Voodoo - it seems nobody wrote decent drivers for it in its entire history.
There is also the well-known utility written by our compatriot Unwinder - RivaTuner. That user interface!!!
I must say, a worthy continuation of the series. I only figured out how to use the utility a couple of days later, and even then the abundance of tabs and hidden registry features remains a mystery to me ... In any case, the main thing I got out of it was low-level overclocking of the card.
Application layer
ATTENTION Config:
MB: Gigabyte 6VTXE (Apollo Pro133T chipset)
CPU: Pentium 3-S 1.4GHz / 512KB / 133FSB
RAM: 3x256MB PC133
HDD: a healthy UDMA-5 WD with a decent 2MB cache and 5400RPM
OS: WinME
Video: Riva TNT2 PRO
IMPORTANT: All graphics settings in the game are set to the maximum!!! Roughly speaking: if something beautiful and/or useful can be enabled, we enable it.
We immediately overclock the card from the stock 142/166 to nearly Ultra level (180/210). Why not? In practice the overclock gives 5-7 extra frames, which is not bad at all. I did not try pushing higher - I am sure artifacts would start showing up.
The in-game resolution for this card was 1280x1024. Anything lower doesn't interest me (well, it didn't interest me at the time this paragraph was written). Anything higher is beyond its level - it drops below 30 frames almost everywhere.
So, let's go through it technology by technology.
API and color
Out of the box the card supports Direct3D and OpenGL.
What?... Today, probably, not everyone remembers what a graphics API even is. Let's put it this way: everyone knows about DirectX. It consists of DirectInput (today XInput), DirectDraw, Direct3D, DirectSound and a few other things nowadays. Each of those is a separate API, which is clear from the name. Take the Direct3D API: it is essentially a software interface for drawing 3D using the hardware capabilities of the video card itself. This is the important point. A card's support for a given API tells you whether you can draw through that specific API on that card at all. And performance in each API is, of course, different, because the implementations are different ...
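To make the "implementations differ" point tangible, here is a tiny, purely illustrative C sketch of my own (the names and the single draw_triangle entry point are invented, not taken from DirectX, OpenGL or UE1): the game talks to an abstract render device, and each API sits behind its own implementation of that interface.

```c
#include <stdio.h>

/* Hypothetical abstraction: the game only ever sees a "render device". */
typedef struct {
    const char *name;
    void (*draw_triangle)(float x0, float y0, float x1, float y1,
                          float x2, float y2);
} RenderDevice;

/* Two imaginary backends. A real one would translate the call into
 * Direct3D or OpenGL commands that the card's driver understands. */
static void d3d_draw(float x0, float y0, float x1, float y1, float x2, float y2)
{
    printf("D3D backend draws (%.0f,%.0f)-(%.0f,%.0f)-(%.0f,%.0f)\n",
           x0, y0, x1, y1, x2, y2);
}

static void ogl_draw(float x0, float y0, float x1, float y1, float x2, float y2)
{
    printf("OGL backend draws (%.0f,%.0f)-(%.0f,%.0f)-(%.0f,%.0f)\n",
           x0, y0, x1, y1, x2, y2);
}

int main(void)
{
    RenderDevice devices[] = { { "Direct3D", d3d_draw },
                               { "OpenGL",   ogl_draw } };

    /* Same scene, different backend: whether a backend exists at all
     * depends on the card and its driver, and how fast it runs depends
     * on how well that particular implementation is written. */
    for (int i = 0; i < 2; i++) {
        printf("Using %s: ", devices[i].name);
        devices[i].draw_triangle(0, 0, 100, 0, 0, 100);
    }
    return 0;
}
```

The idea in a real engine is the same, just with far more entry points than a single triangle call.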
Let me also remind you that at the time, 32-bit color was slowly being introduced and offered in every second game. So there was a choice between fast 16-bit and slow 32-bit.
If anyone is interested in what the difference is - a few points, starting with the theory. Take an image at 16 bits per pixel. Does that mean each of the three colors gets some fixed number of bits? Damn it, no, it does not, because a typical game image is most often encoded in one of the following ways:

For those who missed it: a 16-bit image can be encoded as RGB565 (5 bits red, 6 green, 5 blue), or as ARGB1555, where A is the alpha channel. We need it to indicate texture transparency. I'll leave you to ponder the purpose of the other options on your own.
Essentially, two things follow: not only does the "thousands of colors" gamut (aka High Color, i.e. 16-bit: 2^5 x 2^6 x 2^5 = 65536 color values) clearly fall short of covering human vision, but you also had to choose which channel to favor depending on the situation.
Both problems are solved by True Color (32-bit). Almost like Marx: to each according to his needs, everything possible. Strictly speaking, the color itself is 24-bit, with 8 bits left for transparency.
The flip side of the coin is, you guessed it, size. Such textures (if we are talking about a game) weigh up to twice as much. And don't think that games offering a choice between 16- and 32-bit textures actually shipped both kinds - of course, the 32-bit ones were simply converted down.
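For the curious, here is a small self-contained C sketch (my own illustration, not code from any game) of the formats mentioned above: unpacking RGB565 and ARGB1555 texels into 8-bit channels, and converting a 32-bit ARGB8888 texel down to RGB565, roughly what happens to a 32-bit texture when you pick 16-bit color.

```c
#include <stdint.h>
#include <stdio.h>

/* RGB565: 5 bits red, 6 bits green, 5 bits blue, no alpha. */
static void unpack_rgb565(uint16_t p, uint8_t *r, uint8_t *g, uint8_t *b)
{
    *r = (p >> 11) & 0x1F; *r = (*r << 3) | (*r >> 2); /* expand 5 -> 8 bits */
    *g = (p >> 5)  & 0x3F; *g = (*g << 2) | (*g >> 4); /* expand 6 -> 8 bits */
    *b =  p        & 0x1F; *b = (*b << 3) | (*b >> 2);
}

/* ARGB1555: 1 bit alpha (on/off transparency), 5 bits per color channel. */
static void unpack_argb1555(uint16_t p, uint8_t *a,
                            uint8_t *r, uint8_t *g, uint8_t *b)
{
    *a = (p >> 15) ? 255 : 0;
    *r = (p >> 10) & 0x1F; *r = (*r << 3) | (*r >> 2);
    *g = (p >> 5)  & 0x1F; *g = (*g << 3) | (*g >> 2);
    *b =  p        & 0x1F; *b = (*b << 3) | (*b >> 2);
}

/* ARGB8888 (true color) -> RGB565: drop the alpha and the low bits. */
static uint16_t argb8888_to_rgb565(uint32_t p)
{
    uint8_t r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

int main(void)
{
    uint32_t texel32 = 0xFF3366CC;                   /* opaque bluish texel */
    uint16_t texel16 = argb8888_to_rgb565(texel32);
    uint8_t a, r, g, b;

    unpack_rgb565(texel16, &r, &g, &b);
    printf("32bpp 0x%08X -> 16bpp 0x%04X -> roughly (%u,%u,%u)\n",
           (unsigned)texel32, (unsigned)texel16, r, g, b);

    unpack_argb1555(0x83E0, &a, &r, &g, &b);         /* opaque pure green */
    printf("ARGB1555 0x83E0 -> a=%u rgb=(%u,%u,%u)\n", a, r, g, b);
    return 0;
}
```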
Speaking of games: the shots were taken unprofessionally, but look at the water and the sky.


And as a bonus to you, the consequences of incorrectly encoding the alpha channel: not many transparent details remain - for example, the HUD above, or the softened shadow of the car.

... Alright, the flock disperses ...
Nvidia positioned the card as a powerful 32-bit pipeline. Its competitor, a certain 3dfx, had only 16-bit color and fell in this battle. Had everyone gone blind?
Well, let's look at the picture in different APIs, and at the same time try to find the difference between 16-bit and 32-bit color;)
As promised, from here on there will be monitoring in the shots wherever it is possible; original dimensions (the yellow captions at the bottom left are mine, I confess ...)




Well what can I say ...
- First of all, I want to note that in the actual game everything looks "somewhat" brighter ... These screenshots are honest, taken by the engine itself. Honestly, this spot was very illustrative, though, I did try :D
In general, the game's brightness is a big problem. What you see here is with brightness cranked up to 100%; by default it sits at 50%. Get it? I heard about the fix, but it was somehow too late to install it. Besides, it is nicely atmospheric this way ... In the end, who the hell needs these screenshots anyway? They will do as they are ... Don't worry, there will be more screenshots.
- Secondly, you can see the difference between 16 and 32bpp in Direct3D: if you go back to the shots, you will notice terrible rippling in the center of d3d_16bpp. That is the rendering quality of the TNT2; other cards usually don't have this.
In OpenGL there is no difference in color at all. This suggests that the game cheats and always hands over textures in some fixed format - which one exactly is not that interesting. What is interesting is that the color depth setting plays no role in OGL.
In general, if you download the shots and crank up the brightness in Photoshop, you will see the difference between APIs. It is there, both in color and in rendering, but it is fairly insignificant. So 3dfx users were hardly blind back then ... I don't think so.
- Thirdly, you only saw monitoring (yellow, top right) in OGL. Why? Because back in those distant times, only programs unknown to me could overlay monitoring in the version of D3D used by Unreal Engine 1.
Plus, the CPU in WinME is always loaded at 100%. Apparently that is its credo. It is enough to boot into Windows and press START!
As for memory, there is clearly more than enough. So further monitoring is obviously pointless, but the Unreal Engine itself can count frames (below, under the inventory, in white at the center).
AGP
As already mentioned somewhere above, the card supports AGP4x. At the time that was an innovation. Does it actually give you anything? We are about to find out, but first let me specifically point out the Unreal engine's settings for D3D (for OGL there is no option of interest to us at all).

As you can see, I enabled DiME. In our case, though, this option will not affect anything: the TNT2 "Curve" did not yet actually use AGP texturing. According to rumor, the AGP aperture was used only as an intermediate store before textures were sent to local memory. This is a known flaw of the chip and you can easily google it.
Fine, now we can start measuring the potency:
AGP2X
A reminder: !!! FPS is counted by the engine, at the bottom of the shot !!!


By the way, here you can see the difference between APIs. Look at the reflection of the ball.
AGP4X
Notice that the AGP speed did not affect a single parameter!!! In practice:


Well, fine, it is clear that the card doesn't even reach 35 frames at AGP2x, but shouldn't the jump to AGP4x change anything at all? I really don't think the bottleneck is a 1.4GHz server CPU which, excuse me, came out three years later. There is plenty of memory. The FSB:DRAM ratio is 1:1, and the hard drive, I am sure, has nothing to do with it.
I think the chip simply chokes and nothing can be done about it. Marketing being marketing.
Trilinear
Now for the most interesting part. Allow me to quote myself:
By the way, this is the coolest video card on which you had to choose between either honest trilinear filtering or multitexturing. Scary stuff already ...
The wonderful UE1 engine lets us check all of this on the spot - but only in D3D. In OGL, multitexturing cannot be toggled through the game options.
Check what, exactly? Well ... Trilinear filtering was first introduced with the TNT2, but the chip could not handle it simultaneously with multitexturing. And it had to be there, because competitors! So nominally it was present, but it worked in one of two modes: real trilinear filtering without multitexturing, or the cheaper mip-map dithering ("approximation", as it was wrongly called back then) together with multitexturing.
To begin with, I will try to explain what trilinear filtering even is (perhaps not too successfully) ...
Trilinear filtering is a development of bilinear. It matters for textured surfaces that stretch away from the player - a good example is a floor or a wall made of identical tiles laid right next to each other. As the surface recedes from the player, smaller copies of the texture are used (this, by the way, is called mipmapping), which is what happens, but the switch between those copies shows up as a sharp transition and the overall picture falls apart. Trilinear filtering is meant to fix this: roughly speaking, it averages the boundary samples into an overall color value ...
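If it is easier to read as code, here is a minimal C sketch of the idea (my own simplification: a grayscale texture and invented helper names, nothing engine-specific). Trilinear filtering bilinearly samples the two nearest mip levels and blends between them; that second fetch is the extra work that plain bilinear filtering avoids.

```c
#include <stdio.h>

/* Hypothetical mip-mapped grayscale texture: level 0 is full size, each
 * following level is half the size in each dimension. */
typedef struct {
    const unsigned char *levels[8]; /* one 8-bit image per mip level */
    int w0, h0;                     /* dimensions of level 0         */
    int num_levels;
} Texture;

/* Bilinear sample of one mip level at normalized coordinates (u, v). */
static float bilinear_sample(const Texture *t, int level, float u, float v)
{
    int w = t->w0 >> level; if (w < 1) w = 1;
    int h = t->h0 >> level; if (h < 1) h = 1;
    float x = u * (w - 1), y = v * (h - 1);
    int x0 = (int)x, y0 = (int)y;
    int x1 = x0 + 1 < w ? x0 + 1 : x0;
    int y1 = y0 + 1 < h ? y0 + 1 : y0;
    float fx = x - x0, fy = y - y0;
    const unsigned char *img = t->levels[level];
    float top = img[y0 * w + x0] * (1 - fx) + img[y0 * w + x1] * fx;
    float bot = img[y1 * w + x0] * (1 - fx) + img[y1 * w + x1] * fx;
    return top * (1 - fy) + bot * fy;
}

/* Trilinear sample: bilinear on the two nearest mip levels, then blend. */
static float trilinear_sample(const Texture *t, float u, float v, float lod)
{
    if (lod < 0) lod = 0;
    int l0 = (int)lod;
    int l1 = l0 + 1 < t->num_levels ? l0 + 1 : l0;
    float f = lod - l0;  /* how far we are between the two levels */
    return bilinear_sample(t, l0, u, v) * (1 - f)
         + bilinear_sample(t, l1, u, v) * f;
}

int main(void)
{
    /* 4x4 checkerboard, its 2x2 average, and a single 1x1 gray texel */
    const unsigned char m0[16] = {  0,255,  0,255,
                                  255,  0,255,  0,
                                    0,255,  0,255,
                                  255,  0,255,  0 };
    const unsigned char m1[4]  = { 128,128,128,128 };
    const unsigned char m2[1]  = { 128 };
    Texture tex = { { m0, m1, m2 }, 4, 4, 3 };

    /* sliding the LOD from 0 to 1 blends smoothly toward the averaged level */
    for (float lod = 0.0f; lod <= 1.01f; lod += 0.25f)
        printf("lod %.2f -> %.1f\n", lod, trilinear_sample(&tex, 0.0f, 0.0f, lod));
    return 0;
}
```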
So, that is the real thing. What about mip-map dithering?
Here is an explanation of mip-map dithering, using the Voodoo5 as an example:
Mip Mapping introduces a new annoying artifact: you can notice where the blurrier level begins. This swap discontinuity can be hidden using two techniques: trilinear filtering and Mip Map Dithering. Trilinear filtering blends the two mip levels together, but that costs you a texture layer (2x multitexturing effectively turns into single texturing). Mip Map Dithering is a bit different. The basic idea behind it is that instead of changing from the higher level to the lower level in one go, you do it gradually. This is done by adding a random value to the mip level selector; as a result you will swap between the levels for a while (jumping over the boundary and back) rather than switching in one move at the change-over point. While Mip Map Dithering successfully hides the sudden change, it also results in blurrier graphics. Some people consider Mip Map Dithering cheating, but remember - it does not require you to perform two texture passes for each pixel.
...
www.beyond3d.com/content/articles/57/4
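As a counterpart to the trilinear sketch above, here is a rough illustration of the mip-map dithering idea from the quote (again my own sketch; real hardware would use a fixed screen-space dither pattern rather than rand()): instead of blending two levels, jitter the level selection, so a pixel near a mip boundary keeps flipping between adjacent levels, and only one texture fetch per pixel is needed.

```c
#include <stdio.h>
#include <stdlib.h>

/* Pick a mip level with dithering: add a small random offset to the
 * desired LOD and round. Pixels near a mip boundary then flip between
 * adjacent levels and the hard seam dissolves into noise, at the cost
 * of a grainier look. */
static int dithered_mip_level(float lod, int num_levels)
{
    float jitter = (float)rand() / (float)RAND_MAX - 0.5f; /* [-0.5, 0.5) */
    int level = (int)(lod + 0.5f + jitter);                /* jittered rounding */
    if (level < 0) level = 0;
    if (level >= num_levels) level = num_levels - 1;
    return level;  /* only this one level gets bilinear-sampled */
}

int main(void)
{
    /* a pixel sitting at LOD 1.4 keeps jumping between mip 1 and mip 2 */
    for (int i = 0; i < 10; i++)
        printf("%d ", dithered_mip_level(1.4f, 8));
    printf("\n");
    return 0;
}
```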
Okay, better yet, let me show you some shots - we will do it the way reviewers of the mid-90s did. Set 640x480. Now everybody look at the smoothing of the lamps on the corridor ceiling and the smoothing of the "Radioactive" sign to the left of the passage!!!


Still don't see the difference? It actually is there, you just need to stretch the 640x480 shot across your 20+ inch monitor and justice will prevail (these days every browser has zoom). Naturally, at a high screen resolution small pictures all look the same ...
If anyone did stretch them, they noticed that with multitexturing enabled:
- The overall brightness level in the game drops. This is a known engine bug.
- Trilinear combined with multitexturing looks a little worse on screen than plain bilinear. In the game, the difference gets more noticeable the larger your monitor is. That is a fact.
Let's go back to 1280x1024.
Here it is easier to see what mip-map dithering looks like. Horrifying, right? Honestly, if I had not been told, I would never have noticed. In my opinion, the chip earned its "Curve" nickname undeservedly. It is not its fault that you were playing at 640x480 in 1998, is it?
And although hunting for mip-map dithering flaws with a magnifying glass at 640x480, among the pixel squares, seems idiotic to me, for those special cases where you need the checkers rather than the ride, our compatriot Unwinder offers an equally special solution:
no more comments ...
Well, you know ...
Very worthy. 1280x1024x32bpp, 20-30 fps when overclocked. The way I see it, if the mouse gets a bit sluggish in the menus and the inventory in combat draws with a half-second delay, you can put up with it. What was impossible was, for example, waiting 5-7 minutes for the computer's turn in Heroes 3 on a Pentium 150 with a 2D video card, or playing Quake 2 co-op where the rockets appeared late and the sound lagged by about a second. So 20-30 frames is already quite playable.
Don't like it, don't want to overclock? Well, play at 1024x768 or below. Or in 16-bit color.
But still, of course, the Voodoo3 3000 was faster, albeit only with its 22-bit color. A pity about 3dfx, but it did leave us something ...
Thanks
P.S.
While testing I ran into the monitoring problem. On the forum I come from, I recently sketched out briefly how to do monitoring on Win9x/ME. I invite everyone who understands and shares the idea to share their experience, because I have never found such information anywhere (unsurprisingly).
... tbc ...
Used literature and greetings