GeForce GTX 260 driver update

In their race to get ahead of each other, NVIDIA and ATI often sacrifice driver quality when releasing new graphics processors and the video cards based on them. Yet it is no secret that the NVIDIA GeForce (formerly ForceWare) and ATI Catalyst drivers are among the determining factors in the success of new graphics solutions. As an example, recall that at the launch of the G200 and RV770 GPUs and the cards based on them there were no official driver versions at all, and reviewers had to test the cards on various beta versions that were far not only from ideal but from elementary stability. As a result, the test results could hardly be called fully objective, and the scatter between different authors' numbers was significant even within a single game or test application.

The release of official certified driver versions does not always, and not in all applications, make video cards faster: most often it is aimed at fixing bugs and supporting new games. Nevertheless, both NVIDIA and ATI are not shy about announcing performance gains in certain games and tests. In practice, these claims are confirmed less often than you and I would like. To try to measure the speed gains across different versions of the NVIDIA GeForce and ATI Catalyst drivers, I decided to prepare two separate articles comparing video card performance on several driver versions. In the next article, once the promising Catalyst 8.12 is released, I will test the Radeon HD 4870 on various drivers. In today's article we will check how far the NVIDIA GeForce drivers have evolved in this regard, using the GeForce GTX 260 (new version, with 216 stream processors) as an example. ZOTAC will help us with this, having kindly provided its GeForce GTX 260 AMP2! Edition card for testing.

Review of the ZOTAC GeForce GTX 260 AMP2! Edition 896 MB video card

ZOTAC International (MCO) Limited can still be considered a newcomer to the video card market, having entered this segment only last year. The ZOTAC GeForce GTX 260 AMP2! Edition 896 MB comes in a large, vertically oriented cardboard box. The front shows a formidable dragon with outstretched wings:



There you can also find some of the card's specifications, a label about the extended warranty period, and a sticker noting that the Race Driver GRID game is bundled with the card. The back of the box describes the key features of the NVIDIA GPU, mentioning 192 unified shader processors, while the card under test is equipped with 216. Apparently, the box is left over from the older "AMP!" series of ZOTAC cards, with an "AMP2!" sticker applied to the front.

Inside, the video card is securely fixed in a plastic shell in the central compartment, and around and under it you can find the following components:



I will list them:

splitter adapter from S-Video output to component cable;
one DVI → HDMI adapter;
one DVI → D-Sub adapter;
instructions for installing a video card and drivers;
two cables for connecting additional power to the video card (6-pin connectors);
audio cable (S/PDIF, for audio output via HDMI);
CD with video card drivers;
Race Driver GRID full DVD.

In this respect the card compares favorably with competitors' products thanks to the two auxiliary power cables (most bundles include only one), as well as the current and graphically rich Race Driver GRID game.

The external difference between the new video card and the reference GeForce GTX 260 lies only in the sticker on the plastic casing of the cooling system, which depicts the same dragon as on the box:


The dual-slot cooling system covers the card from three sides:






The card's dimensions are standard: 270 x 100 x 32 mm. The ZOTAC GeForce GTX 260 AMP2! Edition offers no exotic connectors: a PCI-Express 2.0 x16 interface, two dual-link DVI-I ports (with support for high resolutions), and an S-Video output next to the grille that exhausts air from the system case:



At the top of the card are two 6-pin connectors for auxiliary power, a connector for an S/PDIF audio cable, and two MIO connectors for building SLI and 3-way SLI systems from two or three identical NVIDIA-based video cards:


Let me remind you that, according to the specifications, the peak power consumption of the GeForce GTX 260 (192SP) is 182 W, so a 500 W power supply is recommended for a system with one such card, and 600 W for SLI configurations. For the version with 216 unified shader processors the power requirements are not significantly higher. Of course, these are official recommendations, whose authors assume your system may contain a power-hungry processor and several hard drives; otherwise the card will run on less powerful units - as our measurements show, a GeForce GTX 260 (192SP) card by itself consumes about 136 W.
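The PSU reasoning above can be sketched as a quick calculation: sum the worst-case component draws and keep some headroom. Only the 136 W card measurement comes from the text; every other wattage below is an illustrative assumption.

```python
# Rough PSU sizing check following the reasoning above: sum worst-case
# component draws, keep ~30% headroom. All figures except the 136 W
# card measurement quoted in the text are illustrative assumptions.
def psu_sufficient(psu_watts, component_watts, headroom=0.30):
    """True if the PSU covers the summed draw plus the given headroom."""
    return psu_watts >= sum(component_watts.values()) * (1 + headroom)

build = {
    "GeForce GTX 260 (192SP), measured": 136,  # from the text
    "overclocked quad-core CPU": 130,          # assumption
    "motherboard, RAM, fans": 60,              # assumption
    "drives": 30,                              # assumption
}

print(sum(build.values()))          # 356 W total for this sketch build
print(psu_sufficient(500, build))   # True: a 500 W unit has headroom
```

This also shows why the official 500 W recommendation is conservative: even with a hot quad-core, the total stays well below the limit.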

Let's see what a video card looks like without a cooling system:


Power part:



The GPU of the ZOTAC GeForce GTX 260 AMP2! Edition was produced in Taiwan in the 27th week of 2008 and is marked G200-103-A2:



The chip is of the second revision (A2). It has 216 unified shader processors, 72 texture units and 28 raster operation units (ROPs). The GPU frequencies, in contrast to the official 575/1242 MHz of the GeForce GTX 260, are raised to 648/1404 MHz, or +12.7/+13.0%! Quite good, even if these are not record frequencies for factory-overclocked cards. The GPU voltage is 1.06 V. I will add that, to save energy and reduce heat, the GPU frequencies drop to 300/600 MHz in 2D mode.

The card's 896 MB of GDDR3 memory is made up of 14 chips on the front and back of the board, on a 448-bit bus. As with all GeForce GTX 260 cards we have examined before, the ZOTAC card uses Hynix chips with a rated access time of 1.0 ns:


The rated frequency of the chips, marked H5RS5223CFR NOC, is 2000 MHz. According to the GeForce GTX 260 specifications its memory should run at an effective 1998 MHz, while on the ZOTAC GeForce GTX 260 AMP2! Edition it runs at 2106 MHz, only 5.4% above nominal. Modest, as GeForce GTX 260 cards with higher memory frequencies are commercially available.
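The factory-overclock percentages quoted above are easy to verify with a one-line calculation (the clock values are the ones stated in the review):

```python
# Verify the factory-overclock percentages quoted in the review.
def gain_pct(actual, reference):
    """Percentage increase of an actual clock over the reference, in MHz."""
    return (actual / reference - 1) * 100

print(round(gain_pct(648, 575), 1))    # 12.7 - GPU core
print(round(gain_pct(1404, 1242), 1))  # 13.0 - shader domain
print(round(gain_pct(2106, 1998), 1))  # 5.4  - effective memory clock
```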

Be that as it may, the specifications of the new video card are as follows:


Well, more detailed and complete characteristics can be seen in the table:


The cooling system has not changed compared to conventional cards based on GTX 280/260 chips:



I have always wondered at the thick, thick layer of thermal paste on the GPU. One gets the impression that assembly workers are paid piece-rate for the amount of thermal paste consumed, much as Soviet collective-farm tractor drivers once were for diesel fuel consumed. Only our "clever guys" poured the fuel into the ground at the edge of the field, while the diligent Chinese assemblers spread the entire allotted thermal interface over the GPU heat spreader. The problem is that, ideally, thermal paste should not form a continuous layer between chip and heatsink; it should only smooth out the metal's roughness by filling the irregularities, since its thermal conductivity is high compared to air but low compared to the heatsink metal.

Let's check the card's thermals. Hereinafter, the load on the GPU and the card as a whole was created by ten cycles of the Firefly Forest test from the 3DMark 2006 synthetic suite at 1920 x 1200 with 16x anisotropic filtering. Full-screen anti-aliasing was not enabled, since with it the GPU load and temperature are lower. All tests were carried out in a closed Ascot 6AR2-B case (its fan configuration is given in the test methodology section below). Room temperature during testing was 23.5 °C. Card frequencies and temperatures were monitored with RivaTuner v2.20 (by Alexey Nikolaychuk). Since the card had been disassembled before the tests, the stock thermal interface on the GPU was replaced with Gelid GC1 high-performance thermal paste, applied in the thinnest possible layer.

Let's look at the test results in automatic turbine operation:


The GPU temperature did not exceed 80 °C, despite the card's increased frequencies and the low noise level of the cooler's turbine, which spun up to only 1860 rpm out of a possible maximum of 3200 rpm.

And here's how the reference cooler cools the video card at 100% turbine power:


Here the temperatures are atypically low for a top-end video card. Interestingly, at the lower card temperatures the peak voltage-regulator current turned out to be 5 A lower than at the higher temperatures.

Considering the reference cooler's high efficiency, the card's overclocking potential was checked without replacing the cooling system, but the turbine speed was manually set to 2050 rpm, the highest speed that remained subjectively comfortable in terms of noise. As a result, the video memory was overclocked to 2448 MHz (+22.5%) without loss of stability or image quality, while the GPU could not be overclocked at all:


Alas, the core of the ZOTAC sample provided for testing already runs at the limit of its capabilities. Moreover, raising the core voltage from 1.06 V to 1.15 V in the card's BIOS did not improve the GPU's overclocking potential. What can I say? Probably just the luck of this particular sample, nothing more.

The temperature regime of the card additionally overclocked by video memory is as follows:


At the time this material was finished, the recommended price of the ZOTAC GeForce GTX 260 AMP2! Edition 896 MB was US$320-350, but prices for NVIDIA's entire new line of cards are expected to drop soon, which will certainly affect ZOTAC's products as well. At the time of writing, the card's street price in Moscow stores was a little over 10 thousand rubles.

Evolution stages of GeForce drivers (ForceWare)

First of all, note that this year NVIDIA renamed its drivers from ForceWare to GeForce, which, in my opinion, now only creates confusion between the names of the video cards and the names of the drivers. Well, we will get used to it, not for the first time. As of 11/19/2008 (the start of testing), NVIDIA had released countless beta drivers and five official certified versions for the GeForce GTX 260 and GTX 280 line, four of which we will examine today.

GeForce 177.41 (06/26/2008) is the second WHQL-certified driver for the GTX 280/260 line. Why not the first official version, 177.35? Because 177.41 came out only nine days after 177.35 and did not introduce any significant changes. Here is the official list of innovations:

added support for GeForce GTX 280 and GeForce GTX 260 GPU;
supports one GPU and NVIDIA SLI technology on DirectX 9, DirectX 10 and OpenGL, including 3-way SLI technology with GeForce GTX 280 and GeForce GTX 260 GPUs;
added support for CUDA technology;
added support for the Folding@home distributed computing system;
supports HybridPower, Hybrid SLI technology, on the following GPUs and motherboards:

GeForce GTX 280 GPU;
GeForce GTX 260 GPU;
nForce 780a SLI;
nForce 750a SLI;


support for GPU overclocking and temperature monitoring when installing NVIDIA System Tools (minor bugs fixed).

As you can see, no changes were announced in terms of increasing the performance of video cards in the new (at that time) driver. In addition to the above, a couple of bugs were fixed under Windows Vista x64 - and that's it.

GeForce 178.13 (25.09.2008) - the third WHQL-certified driver for GeForce GTX 280/260, released after an atypically long break for NVIDIA (almost 3 months have passed). Of the interesting changes, the following can be noted:

added support for NVIDIA PhysX acceleration for all GeForce 8, 9 and 200 series GPUs with at least 256 MB of video memory (now PhysX version 8.09.04 is packed with the driver);
added support for 2-way and 3-way NVIDIA SLI technology for GeForce GTX 200 series GPUs on Intel D5400 XS motherboards;


performance improvements in the following 3D applications for a single GPU:

up to 11% in 3DMark Vantage;
up to 15% in BioShock (DX10);
up to 15% in Call of Duty 4;

performance improvements in the following 3D applications for 2-way SLI GPUs:

up to 7% in BioShock (DX10);
up to 10% in World in Conflict (DX10);

fixed various compatibility issues with 3D applications.

In addition, the notes for this version say about fixing some bugs in games for a single video card and SLI-bundles. By the way, this version of the GeForce driver was characterized by many users and testers as the most stable and problem-free.

GeForce 178.24 (15.10.2008) - Released just 20 days after the release of version 178.13 and also has a WHQL certificate. Despite the very short period of time, there are more than enough changes. I will list the key ones related to improving performance in games:

performance improvements in the following 3D applications for a single GPU:

up to 11% in 3DMark Vantage;
up to 11% in Assassin's Creed (DX10);
up to 15% in BioShock (DX10);
up to 15% in Call of Duty 4;
up to 8% in Enemy Territory: Quake Wars;


performance improvements in the following 3D applications for 2-way SLI GPUs:

up to 7% in BioShock (DX10);
up to 10% in Company of Heroes: Opposing Fronts (DX10);
up to 12% in Enemy Territory: Quake Wars;
up to 10% in World in Conflict (DX10).

Attentive readers will surely notice that all these optimizations are identical to those of the 178.13 driver. This is not my mistake: the official site states that all the performance gains are measured against the 178.19 beta driver, which came out after 178.13. An interesting point: in effect, NVIDIA announced the same speed-up in the same games twice, across two drivers. Or rather, it should have happened twice...

Besides the speed optimizations, the new driver brought fixes for DVI-HDMI devices and Hybrid SLI mode, and also fixed errors in the World in Conflict game menu when full-screen anti-aliasing is used on the GeForce 6600 (apparently someone at NVIDIA has a sense of humor: in a game this demanding of GPU power, that card cannot reach acceptable performance anyway). As with 178.13, the PhysX 8.09.04 libraries are packaged with this driver.

GeForce 180.48 (11/19/2008) is the newest certified GeForce driver for NVIDIA-based cards at the moment. Long awaited and advertised in advance by NVIDIA itself, this driver of the new 180 series should bring, besides the traditional bug fixes, significant changes and noticeable performance gains. In more detail:

new features:

NVIDIA SLI technology certified for motherboards based on the Intel X58 chipset (Intel Core i7 processors) with GeForce GTX 280, GeForce GTX 260, GeForce 9800 GX2, GeForce 9800 GTX+ and GeForce 9800 GTX GPUs;
added the ability to connect multiple monitors to video cards combined in an SLI bundle, in both desktop and 3D modes;
PhysX technology now enables hardware physics acceleration on GeForce 8, 9 and 200 series graphics cards;


announced performance gains for the following 3D applications:

up to 10% in 3DMark Vantage;
up to 13% in Assassin's Creed;
up to 13% in BioShock;
up to 15% in Company of Heroes: Opposing Fronts;
up to 10% in Crysis Warhead;
up to 25% in Devil May Cry 4;
up to 38% in Far Cry 2;
up to 18% in Race Driver: GRID;
up to 80% in Lost Planet: Colonies;
up to 18% in World in Conflict.

Particularly impressive is the 80% gain in the new Lost Planet: Colonies - one might jump to the erroneous conclusion that before driver 180.48 there were only 100 fps where there are now 180. Moreover, the official page appends a postscript to all the declared gains: "results may vary on different configurations", i.e. nothing is guaranteed. Still, the announced changes are very interesting. Will they prove real in practice? Let's check.
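The point above is that an "up to N%" claim fixes only the ratio between old and new frame rates, not the absolute numbers. A quick sketch (the fps values are hypothetical):

```python
# An "up to N%" driver gain means new_fps = old_fps * (1 + N/100);
# the absolute frame rates are anything that satisfies that ratio.
def implied_old_fps(new_fps, gain_pct):
    """Frame rate before a driver gain of `gain_pct` percent."""
    return new_fps / (1 + gain_pct / 100)

print(implied_old_fps(180, 80))  # 100.0 - the jokey reading above
print(implied_old_fps(45, 80))   # 25.0  - the same 80% at playable rates
```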

Test configuration, tools and testing methodology

The video card and different driver versions were tested on a computer with the following configuration:

Motherboard: DFI LANParty DK X48-T2RS (Intel X48, LGA 775, BIOS 03/10/2008);
Processor: Intel Core 2 Extreme QX9650 (3.0 GHz, 1.25 V, L2 2 x 6 MB, FSB 333 MHz x 4, Yorkfield, C0);
Processor cooling system: Thermalright SI-128 SE with Scythe Ultra Kaze fan (1320 rpm);
Thermal interface: Gelid GC1;
RAM:

2 x 1024 MB DDR2 Corsair Dominator TWIN2X2048-9136C5D (Spec: 1142 MHz / 5-5-5-18 / 2.1 V);
2 x 1024 MB DDR2 CSX DIABLO CSXO-XAC-1200-2GB-KIT (Spec: 1200 MHz / 5-5-5-16 / 2.4 V);


Disk subsystem: SATA-II 300 GB, Western Digital VelociRaptor, 10,000 rpm, 16 MB, NCQ;
HDD cooling and soundproofing system: Scythe Quiet Drive;
Drive: SATA DVD RAM & DVD ± R / RW & CD ± RW Samsung SH-S183L;
Case: Ascot 6AR2-B (120mm Scythe Slip Stream case fans at 960 rpm with silicone pins are installed for intake and exhaust, the same fan at 960 rpm on the side wall);
Control and monitoring panel: Zalman ZM-MFC2;
Power supply: Thermaltake Toughpower 1500 W, W0218 (standard 140mm fan);
Monitor: 24 "BenQ FP241W (1920 x 1200 / 60Hz).

In order to reduce the dependence of the video card tested today on the platform speed, the quad-core processor was overclocked to 4.0 GHz at a voltage of 1.575 V:


During the tests, the RAM operated with the timings lowered to 5-4-4-12 at Performance Level = 6 and a voltage of 2.175 V.

All tests were conducted on Windows Vista Ultimate Edition x86 SP1 (plus all critical updates as of 11/10/2008). The start date of the tests was 11/19/2008, so the following drivers available at that time were used:

motherboard chipset: Intel Chipset Drivers 9.1.0.1007;
DirectX libraries: November 2008.

For each of the GeForce / ForceWare drivers, a separate PhysX package was installed, available at the time the driver was released, or included directly in the driver. The drivers were tested in the same sequence in which they were released. Each of the drivers and the PhysX package with them were installed only after uninstalling the drivers of the previous version and additional cleaning of the system using Driver Sweeper v1.5.5.

After installing each driver, the following settings were changed in its control panel: graphics quality from "Quality" to "High Quality", Transparency antialiasing from "Disable" to "Multisampling", and vertical sync forcibly disabled ("Force off"). No other changes were made. Anisotropic filtering and full-screen anti-aliasing were enabled directly in the game settings; if a game did not expose these settings, they were forced in the GeForce driver control panel.

The video cards were tested in two resolutions: 1280 x 1024/960 and widescreen 1920 x 1200. The following set of applications was used for the tests, consisting of two synthetic tests, one techno-demo and twelve games of different genres:

3DMark 2006 (Direct3D 9/10) - build 1.1.0, default settings and 1920 x 1200 + AF16x + AA4x;
3DMark Vantage (Direct3D 10) - v1.0.1, "Performance" and "Extreme" profile (only basic tests were tested);
Unigine Tropics Demo (Direct3D 10) - v1.1, built-in demo test, maximum quality settings, resolution 1280 x 1024 without methods and resolution 1920 x 1200 with AF16x and AA4x;
World in Conflict (Direct3D 10) - game version 1.0.0.9 (b89), "Very High" graphics quality profile, but UI texture quality = Compressed; Water reflection size = 512; DirectX 10 rendering activated;
Enemy Territory: Quake Wars (OpenGL 2.0) - game version 1.5, maximum graphics settings, demo at the "Salvage" level, Finland;
Call of Duty 4: Modern Warfare (Direct3D 9) - game version 1.7.568, graphics and texture settings at the "Extra" level, demo "d3" on the "Bog" level;
Unreal Tournament 3 (Direct3D 9) - game version 1.3, maximum in-game graphics settings (level 5), Motion Blur and Hardware Physics enabled, the Fly By scene on the DM-ShangriLa level (two consecutive cycles), tested with HardwareOC UT3 Bench v1.3.0.0;
Devil May Cry 4 (Direct3D 10) - game version 1.0, maximum graphics quality settings ("Super High"), the result is the average of a double sequential run of the second test scene;
S.T.A.L.K.E.R.: Clear Sky (Direct3D 10) - game version 1.5.07, "Improved DX10 Full Lighting" quality profile plus 16x anisotropic filtering and all other graphics settings at maximum, own demo recording "s04" (triple test cycle);
Crysis WARHEAD (Direct3D 10) - game version 1.1.1.690, "Very High" settings profile, double test cycle on the "Frost" level in HardwareOC Crysis WARHEAD Bench v1.1.1.0;
Far Cry 2 (Direct3D 10) - game version 1.00, "Ultra High" settings profile, double "Ranch Small" test cycle in Far Cry 2 Benchmark Tool (v1.0.0.1);
X3: Terran Conflict (Direct3D 10) - game version 1.2.0.0, maximum texture and shadow quality, fog on, "More Dynamic Light Sources" and "Ship Color Variations" enabled, the result is the average speed across one run of all four demos;
Left 4 Dead (Direct3D 9) - game version 1.0.0.5, maximum quality, an action-packed demo (two passes) recorded in the first scene of "The Seven" in the "No Mercy" campaign;
Lost Planet: Colonies (Direct3D 10) - game version 1.0, graphics level "Maximum quality", HDR Rendering DX10, built-in test, consisting of two scenes.

The last game is of little interest in itself, since it is rather old by gaming-industry standards. Nevertheless, I decided to add it to the list, because the GeForce release notes claim a performance gain of up to 80% in Lost Planet: Colonies on the 180.48 driver! I wonder how real this figure is.

Testing in each application was carried out twice (not to be confused with the double runs of demo recordings!). The final result, shown in the diagrams, is the better speed figure (or score) of the two test cycles. Let's move on to the results and their analysis.

Test results

Drivers are listed in the diagrams in order of release. The first driver, version 177.41, is highlighted in yellow; the two 178.xx drivers are marked in blue-green shades, while the new 180.48 is dark blue. The list of drivers in the tests thus looks like this:

GeForce 177.41;
GeForce 178.13;
GeForce 178.24;
GeForce 180.48.

Before proceeding to the analysis of the results, one very important point needs attention. Driver version 180.48 presented an unpleasant "surprise": the still very popular (if not the most popular) resolution of 1280 x 1024 pixels disappeared from it, and 1280 x 960 was offered instead in games and tests. Attempts to add 1280 x 1024 as a custom resolution in the driver's control panel failed: every try ended with an error. There was no time to wait for a reply from NVIDIA support, to which a request had been sent. Alternatively, I could simply have replaced 1280 x 1024 with 1280 x 960 for all the other driver versions, but, as you remember, version 180.48 was tested last, and the prospect of repeating the full test cycle for the three previous drivers did not appeal to me at all. In any case, 1280 x 960 is only 6.25% smaller than 1280 x 1024 in pixel count, and video card performance does not scale linearly with resolution anyway. Besides, we have the 1920 x 1200 mode, identical for all drivers and less CPU-bound, which I suggest focusing on.
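The pixel-count comparison above is a two-line calculation:

```python
# Pixel-count difference between the two low test resolutions.
full = 1280 * 1024     # 1,310,720 pixels
reduced = 1280 * 960   # 1,228,800 pixels
shortfall = (full - reduced) / full * 100
print(shortfall)  # 6.25 - percent fewer pixels at 1280 x 960
```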

First, the results of testing video card drivers in two synthetic tests:

3DMark 2006



In the 3DMark 2006 benchmark, there is no performance increase when moving to new driver versions. A slight improvement compared to version 177.41 was brought by the drivers of the 178th series, and 180.48 is in the lead due to a slightly lower resolution of 1280 x 960 and is equal to the others in 1920 x 1200.

3DMark Vantage



The three versions preceding the driver 180.48 are equal in performance in 3DMark Vantage, but the last of the official versions, 180.48, demonstrates a slight speed increase in both resolutions. The total score at 1280 x 960 pixels in 3DMark Vantage is not calculated, so I only included the GPU Score data.

Unigine Tropics Demo

I draw your attention to the fact that today's article uses the new version 1.1 of this beautiful demo. The settings look like this (resolution, AF and AA have changed):



Let's see the results:



In the test from the new version of the Unigine Tropics demo, the driver version 180.48 also allows the video card to run slightly faster.

So, in two of the three synthetic tests we observed, albeit a very small, but still an increase in the performance of the video card on the new drivers of version 180.48. How will the situation develop in real games?

World in Conflict






In the mode without anti-aliasing and anisotropic filtering, the performance gain is observed in all three driver versions released after 177.41. The latest driver, version 180.48, demonstrates an increase in the mode with full-screen anti-aliasing and anisotropic filtering, and quite close to the promised 18 percent.

Enemy Territory: Quake Wars







Call of Duty 4: Modern Warfare MP







Unreal Tournament 3






In the three games above - Enemy Territory: Quake Wars, Call of Duty 4: Modern Warfare MP and Unreal Tournament 3 - there was no difference in GeForce GTX 260 performance across driver versions (remember that 180.48 gets a slight head start thanks to the 1280 x 960 resolution). Although gains were promised in Enemy Territory: Quake Wars and Call of Duty 4, apparently they only materialize on those unspecified "different configurations".

Devil May Cry 4






But in the game Devil May Cry 4, a significant jump in performance is already obvious on the driver version 180.48. Both in the mode with full-screen anti-aliasing and anisotropic filtering, and in the mode without these methods of improving the quality of graphics, the GeForce 180.48 driver allows the video card to demonstrate a higher average speed than in its previous versions.

S.T.A.L.K.E.R.: Clear Sky



There is a subtle change in performance depending on driver versions. It is hardly worth paying attention to.

Crysis WARHEAD

The Crysis WARHEAD game settings are as follows:


Results:






GeForce 180.48 is also 1-2 frames per second faster in Crysis WARHEAD, which in practice is, of course, completely imperceptible in the game.

Far Cry 2






The results without image-quality enhancements are of no interest, but with full-screen anti-aliasing and anisotropic filtering enabled, GeForce 180.48 takes a clear lead. It is nowhere near the promised 38% gain, but still noteworthy. Fortunately, no difference in picture quality between driver versions could be found.

X3: Terran Conflict

In the new game X3: Terran Conflict, testing was carried out with the following settings (resolution, AF and AA were changed):


The test results did not bring any surprises:







Left 4 Dead

Since testing in this new game is carried out for the first time, I will give here its detailed settings:


Screen resolution, anisotropic filtering and full-screen anti-aliasing varied with the testing mode. Since the game engine is not too heavy for modern graphics cards, MSAA8x was used instead of the usual 4x full-screen anti-aliasing.

The test was carried out on the first stage of "The Seven" in the "No Mercy" campaign, with plenty of explosions, rapidly dying zombies and other effects:



Let's see the results:






As we remember, the GeForce release notes make no mention of performance gains in Left 4 Dead. Moreover, when all the drivers up to and including 178.24 were released, the game was not yet on the market. Nevertheless, the gain is quite noticeable, and it grows stepwise from the 177.xx series to 178.xx and on to 180.xx. No changes in picture quality were found, either in motion or on close examination of screenshots.

Lost Planet: Colonies

Since the testing in this game was carried out after the completion of the main block of tests, the resolution of 1280 x 960 was used not only in the driver version 180.48, but also in the three previous versions of GeForce. Results:






Indeed, there is a small performance gain when switching to newer driver versions in both test scenes of Lost Planet: Colonies, but 180.48 is too far from the declared 80%.

Graphics quality and speed on the example of 3DMark 2006

First of all, a reservation: this section is a supplement and does not pretend to be a full-fledged study. Here I propose to evaluate how the performance of the GeForce GTX 260 (216SP) in the synthetic 3DMark 2006 package depends on the graphics quality mode, and to assess the quality itself. To this end, 3DMark 2006 was run in eight modes: first the quality setting in the GeForce driver control panel was stepped from "High Performance" to "High Quality", and then, in 3DMark 2006 itself, anisotropic filtering and three levels of full-screen anti-aliasing (2x, 4x and 8x) were enabled in turn. Let me remind you that 3DMark 2006 does not respond to anti-aliasing enabled from the GeForce control panel, so other multisampling modes were not tested. This synthetic package was chosen for its almost unique ability to capture a screenshot of a given frame, which is essential for this comparison.

The tests were carried out using the GeForce 180.48 driver. First, let's take a look at the diagram with the speed change depending on the quality mode:



Obviously, the quality mode selected directly in the driver does not affect the speed at all, since the 3DMark 2006 scores vary within the margin of error in one direction or the other in both resolutions. But enabling anisotropic filtering and, especially, full-screen anti-aliasing, already leads to a decrease in speed, which, however, is quite predictable.

Now let's look at how the picture quality changes with different settings in the control panel of the GeForce drivers. For this, frame 1350 from the Canyon Flight scene was selected in the maximum resolution for the monitor 1920 x 1200:


High Performance / Performance


Quality / High Quality


To see the details, it is more convenient to download all the screenshots at once and switch between them in an image viewer (ACDSee, for example). The first two screenshots, in the "High Performance" and "Performance" modes, do not differ in quality at all; at least, I could not find any differences. The second pair of the four screenshots, in the "Quality" and "High Quality" modes, also do not differ from each other, but they do differ from the previous pair: at the "Quality" and "High Quality" settings the picture is a little darker, and the background does not look as washed out as in the first pair of screenshots.

Now let's see how the picture changes when various methods of improving the quality of graphics are activated, be it anti-aliasing or anisotropic filtering:


HQ + AF16x / HQ + AF16x + AA2x


HQ + AF16x + AA4x / HQ + AF16x + AA8x


Enabling 16x anisotropic filtering literally transforms the image! Textures appear on the body of the sea monster and on the rocks, the boards on the rudder and keel of the escorted ship become visible, and the surface of the balloon and its plumage also look more natural and detailed. In general, turning off anisotropic filtering is strongly discouraged. In the remaining three screenshots, enabling full-screen anti-aliasing gradually raises the image quality to its maximum. The smoothed cables and edges of the ship, the blades of the engine propeller, the rigging of the balloon from which the ship is suspended - all this is the result of multisampling. As for the difference in picture quality between anti-aliasing levels, it is very noticeable between AA2x and AA4x, while the transition to AA8x is far less obvious.

Initially, I planned to carry out similar testing in the new game Crysis WARHEAD, but the attempt ended in failure. Although the HardwareOC Crysis WARHEAD Bench utility allows you to capture a screenshot of a given frame of the demo, the screenshots came out slightly blurred around the edges, so comparing them for image quality would have been a stretch. I went through all 13 demos built into the test, but the pictures always came out with an offset. Alternatively, one could try taking screenshots immediately after loading a save, but even then the image was not static, so an accurate quality comparison was impossible. Nevertheless, this did not prevent me from tracking how the card's performance in the game changes depending on the quality mode:



Among the peculiarities of testing in Crysis WARHEAD, we can note that anisotropic filtering does not affect the already low frame rate of the GeForce GTX 260 in this game, and that the results in the various full-screen anti-aliasing modes are practically identical.

Conclusion

Despite the fact that the performance gains promised by NVIDIA for the gradual transition to newer GeForce driver versions were not confirmed under my conditions, a speed increase was nevertheless observed in today's testing in World in Conflict, Devil May Cry 4, Crysis WARHEAD (a very small increase), Far Cry 2, Left 4 Dead and Lost Planet: Colonies, as well as in the 3DMark Vantage and Unigine Tropics Demo benchmarks. So there is undoubtedly a point in installing new drivers. In addition, some of the older games that appear in the drivers' release notes were not tested here, and for some of the newer ones listed there, no sufficiently precise method of measuring performance simply exists. On other configurations and operating systems, the gain can vary both upward and downward. It should also be remembered that new drivers are not only meant to improve performance but are often aimed at fixing errors in new games, which is another argument in favor of installing new versions.

At the end of the article, a few words about the tested card. The ZOTAC GeForce GTX 260 AMP2! Edition turned out to be an interesting product with expressive packaging, a very complete bundle, higher-than-nominal frequencies and a very efficient, quiet cooling system. Unfortunately, the graphics processor showed no overclocking headroom at all, although the video memory overclocked quite successfully. The GeForce GTX 260 AMP2! Edition is priced comparably to competitors' products. It should be noted that the range of GeForce GTX 260 video cards on the market is quite extensive, so it will not be difficult to choose a model that suits you.

In our next article on this topic, I plan to tell you about the evolution of the speed of ATI Catalyst drivers using the example of the Radeon HD 4870 1024 MB video card.

P.S. We would like to thank the Russian representative office of ZOTAC and 3Logic, as well as personally Nadezhda Dymova for the video card provided for testing.

Other materials on this topic


Fallout 3 and modern video cards: expectations and reality
Inexpensive ATI Radeon HD 4830 Gaming Graphics
Choosing a video card for games: autumn-winter 2008

This is a WHQL release from the Release 260 family of drivers. This driver package supports GeForce 6, 7, 8, 9, 100, 200, 300, and 400-series desktop GPUs as well as ION desktop GPUs.

New in Release 260.89

New GPU Support

  • Adds support for the newly released GeForce GPU.

Performance

  • Increases performance for GeForce GTX 400 Series GPUs in several PC games vs. v258.96 WHQL drivers. The following are examples of some of the most significant improvements measured on Windows 7. Results will vary depending on your GPU and system configuration:
  • GeForce GTX 480:

      • Up to 10% in StarCraft II (2560x1600 4xAA / 16xAF Ultra)
      • Up to 14% in S.T.A.L.K.E.R.: Call of Pripyat (1920x1200 4xAA / 16xAF)
      • Up to 16% in S.T.A.L.K.E.R.: Call of Pripyat (SLI - 2560x1600 4xAA / 16xAF)
      • Up to 6% in Aliens vs. Predator (SLI - 1920x1200 noAA - Tessellation on)

    GeForce GTX 460:

      • Up to 19% in StarCraft II (SLI - 1920x1200 4xAA / 16xAF Ultra)
      • Up to 15% in Battlefield Bad Company 2 (SLI - 2560x1600 4xAA / 16xAF)
      • Up to 12% in S.T.A.L.K.E.R.: Call of Pripyat (2560x1600 noAA)
      • Up to 9% in Aliens vs. Predator (1680x1050 4xAA / 16xAF - Tessellation on)
      • Up to 7% in Metro 2033 (1680x1050 noAA - Tessellation on)
      • Up to 11% in Dirt 2 (SLI - 2560x1600 4xAA / 16xAF)
      • Up to 12% in Crysis: Warhead (SLI - 1920x1200 4xAA / 16xAF Gamer)
      • Up to 13% in Far Cry 2 (2560x1600 4xAA / 16xAF)
      • Up to 12% in H.A.W.X (SLI - 1920x1200 4xAA / 16xAF SSAO Very High)
      • Up to 5% in Just Cause 2 (1920x1200 4xAA / 16xAF)
      • Up to 22% in Riddick: Assault on Dark Athena (1920x1200 noAA)
      • Up to 5% in 3DMark Vantage (Extreme Preset)

Blu-ray 3D

  • Adds support for playing back Blu-ray 3D discs when connecting your GPU to an HDMI 1.4 3D TV. Playback requires a compatible software application from CyberLink, ArcSoft, Roxio, or Corel.

HD Audio

  • Adds lossless DTS-HD Master Audio and Dolby TrueHD audio bitstreaming support for compatible Blu-ray movies with GeForce GTX 460 GPUs*.
  • Adds high definition 24-bit, 96 and 192 KHz multi-channel audio sampling rate support for compatible Blu-ray movies with GeForce GTX 400 Series, GT 240, GT 220 and 210 GPUs*.
  • Upgrades HD Audio driver to version 1.1.9.0.

* Note: A Blu-ray movie player update may be required to enable these new features; check with your movie player software manufacturer for more details. These features are only supported on Windows 7.

Installation

  • New driver installer with enhanced user interface and new Express and Custom installation options.
    • Express - fast and easy one-click installation
    • Custom - customized installation
      • Option to perform a clean installation (completely removes older drivers from your system prior to installing the new driver).
      • Option to choose which driver components (i.e., PhysX or 3D Vision) to install.
    • Improved installation time for multi-GPU PCs.

NVIDIA Surround

  • Updated NVIDIA Surround setup wizard
    • After first setup, wizard allows users to jump to any setup step.
    • Improved display connection diagrams and tooltips.
    • Improved UI for setup and arrangement of displays.
    • Improved bezel correction setup experience.
    • Adds help page to highlight which in-game resolution to select (e.g. how to pick bezel corrected resolutions)
    • Option to dedicate an extra GPU to PhysX or to drive an additional display.
    • Allows for portrait or landscape setup directly from the setup wizard.
  • Updated 3D Vision Surround and NVIDIA Surround game support list. A full list of supported games is available online.

NVIDIA 3D Vision

  • WHQL Certified driver
  • With Release 260 drivers, the installation process for 3D Vision has changed. Please view this knowledgebase article for more information on the changes.
  • Fixed an issue where the glasses would lose sync with the 3D Vision IR emitter, causing them to flicker and lose the 3D effect.
  • Adds NVIDIA 3D Vision streaming support for Firefox 4 and Google Chrome web browsers.
  • Adds support for Sony's 3D Sweep Panorama picture format added to NVIDIA 3D Photo Viewer (Sony digital cameras that can capture 3D Sweep Panorama pictures include NEX-5 / NEX-3, Alpha a560 / a580 and Cyber-shot DSC-WX5 / DSC-TX9 / DSC-T99 models).
  • Adds support for new 3D Vision Desktop LCD monitors: BenQ XL2410T and NEC F23W2A
  • Adds support for new 3D Vision projectors: Sanyo PDG-DWL2500 and ViewSonic PJD6251
  • Added the following game profiles:
    • Arcania Gothic 4
    • Fallout: New Vegas
    • Ferrari Virtual Academy 2010 (new in 260.89)
    • Ferrari Virtual Race (new in 260.89)
    • FIFA 11
    • Formula 1 Racing
    • Final Fantasy XIV Benchmark
    • Guild Wars 2
    • Kane & Lynch 2 - Dog Days
    • Lead and Gold
    • Lego Harry Potter
    • Live for Speed
    • Lost Planet 2
    • Moonbase Alpha
    • Serious Sam HD - The Second Encounter
    • Shrek Forever After
    • Singularity
    • Virtua Tennis 2009
    • Virtua Tennis 3
  • Updated the following game profiles:
    • Civilization V - updated from v260.63 to 3D Vision Ready rating
    • Dead Rising 2 - updated from v260.63 to 3D Vision Ready rating
    • Drakensang: The Dark Eye - updated in-game compatibility text
    • Mafia II - updated profile for a proper 3D Vision rating
    • StarCraft II - fixed profile to properly recognize the retail game executable name and match the 3D Vision rating of “Good”
    • Super Commander - fixed HUD elements
    • TRINE - new profile fixes that allow the game to be rated "3D Vision-Ready" when used with the TRINE patch v1.08, available via Steam.

NVIDIA SLI

  • Adds or enhances SLI profiles for the following PC games:
    • City of Heroes: Going Rogue
    • Alien Swarm
    • Dead Rising 2
    • Front Mission Evolved
    • Kane & Lynch 2: Dog Days
    • LEGO Harry Potter

Other Improvements

  • Adds support for OpenGL 4.1 for GeForce 400 series GPUs.
  • Upgrades System Software to version 9.10.0514.
  • Improves compatibility for older PC games (DirectX 7 to DirectX 9) running on Windows 7 (examples: Gothic, Gothic II, Falcon 4.0: Allied Force, Links 2003, Independence War II - Edge of Chaos, and X2: Wolverine's Revenge).
  • Adds drag and drop display arrangement support to the “Set up multiple displays” page.
  • Includes numerous bug fixes. Refer to the release notes on the documentation tab for information about the key bug fixes in this release.
  • Users without US English operating systems can select their language and download the International driver.

Additional Information

  • Supports new GPU-accelerated application features.
  • Supports GPU acceleration for smoother online HD videos with Adobe Flash 10.1.
  • Supports the new version of MotionDSP's video enhancement software, vReveal, which adds support for HD output. NVIDIA customers can download a free version of vReveal that supports up to SD output.
    GeForce 7 series:
    7950 GX2, 7950 GT, 7900 GTX, 7900 GT / GTO, 7900 GS, 7800 SLI, 7800 GTX, 7800 GS, 7650 GS, 7600 LE, 7600 GT, 7600 GS, 7550 LE, 7500 LE, 7350 LE, 7300 SE / 7200 GS, 7300 LE, 7300 GT, 7300 GS, 7150 / NVIDIA nForce 630i, 7100 GS, 7100 / NVIDIA nForce 630i, 7100 / NVIDIA nForce 620i, 7050 PV / NVIDIA nForce 630a, 7050 / NVIDIA nForce 630i, 7050 / NVIDIA nForce 610i, 7025 / NVIDIA nForce 630a

    GeForce 6 series:
    6800 XT, 6800 XE, 6800 Ultra, 6800 LE, 6800 GT, 6800 GS / XT, 6800 GS, 6800, 6700 XL, 6610 XL, 6600 VE, 6600 LE, 6600 GT, 6600, 6500, 6250, 6200 TurboCache, 6200SE TurboCache, 6200 LE, 6200 A-LE, 6200, 6150SE nForce 430, 6150LE / Quadro NVS 210S, 6150 LE, 6150, 6100 nForce 420, 6100 nForce 405, 6100 nForce 400, 6100

The NVIDIA GeForce GTX 260 graphics adapter first appeared on the market in 2008 and has long been outdated, although it is still ahead of almost all integrated video solutions and can run even some modern games, albeit at minimal settings. It should be noted that the model was not the flagship at the time of its release, but it compared favorably with other offerings on the market, nearly matching the older flagship GTX 280 in specifications.

Having received not the worst characteristics from the manufacturer at that time, the GTX 260 made it possible to use the computer on which it was installed for the following purposes:

  • running games and other graphics applications that support DirectX 10.0 or OpenGL 2.1;
  • watching videos in Full HD format via the TV-Out output;
  • outputting an image at 2048x1536 or 2560x1600 when connected to monitors through VGA or HDMI adapters, respectively (the card itself carries only a DVI-I connector).

The presence of 896 MB of GDDR3 video memory at 999 (1998 effective) MHz is not impressive even against the background of its 2008 competitors. However, the 448-bit memory bus made the video adapter more efficient than some more recent office models with 1 or 2 GB of memory, such as the GT 610 or GT 720. In tests it roughly matches the integrated Intel HD 630.

The parameters, which were quite good at the time of the model's release, led to its high cost: the average market price of the GTX 260 started at $400. Now, more than ten years later, the card can be had much more cheaply; on the secondary market, the NVIDIA GeForce GTX 260 starts at about 1,500 rubles.

GTX 260 review

The appearance of the graphics adapter, regardless of its manufacturer, remains impressive, reminiscent of modern cards. Among the possibilities of the card are:

  • support for PhysX technologies for obtaining game effects, universal computing CUDA, DirectX and OpenCL, thanks to which the card is also suitable for encoding or video editing;
  • the ability to combine with 1 or 2 of the same adapters when installed on a special SLI-compatible motherboard;
  • a certain margin of safety for working with 3D images with a compatible monitor.

On the other hand, the capabilities of the card largely depend on the computer's central processor and the amount of RAM. Quite good performance will be achieved on a computer with a processor like an Intel Core i5 and 4-8 GB of RAM.

The disadvantages include the power consumption of the GTX 260, which reaches 182 W - because of it, the cooling system of the video adapter is quite noisy, and for normal operation you will need a powerful power supply.

Another serious drawback is the lack of support for DirectX 11, which is why some new games may simply not launch, even if the card meets their minimum requirements.

What kind of power supply is needed for the GTX 260

The need to provide almost 200 watts of power only for the operation of the video card requires the use of a sufficiently powerful power supply. The recommended value for the PSU of a basic gaming computer is 500 watts. The minimum allowable power value is 450 watts.

Such requirements do not allow installing a video card on a regular office PC equipped with a 300-400 W unit. Therefore, even when assembling an inexpensive system from used components, it is worth providing a sufficiently powerful power supply.
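The PSU recommendation above boils down to simple arithmetic: sum the peak draw of the components and add headroom. The sketch below illustrates this, using the card's 182 W figure from the article; the CPU and "everything else" wattages, the 30% headroom factor, and the list of common PSU sizes are assumptions for illustration, not measured values.

```python
# Rough PSU sizing sketch for a GTX 260 build.
# Component wattages other than the GPU are assumed typical values.

def recommended_psu_watts(component_watts, headroom=0.3):
    """Sum component draw, add headroom, round up to a common PSU size."""
    total = sum(component_watts.values()) * (1 + headroom)
    common_sizes = [300, 400, 450, 500, 550, 600, 650, 750]
    for size in common_sizes:
        if size >= total:
            return size
    return common_sizes[-1]

build = {
    "gpu_gtx260": 182,                  # peak draw cited in the article
    "cpu": 95,                          # assumed mid-range quad-core TDP
    "motherboard_ram_drives_fans": 70,  # rough estimate
}

print(recommended_psu_watts(build))  # 500
```

With these assumed numbers the calculation lands on the same 500 W figure the article recommends, and shows why a 300-400 W office PSU falls short.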

How to overclock an NVIDIA GeForce GTX 260 graphics card

To improve graphics performance, you can overclock the GTX 260 using utilities such as FireStorm or RivaTuner. Manufacturers' own software (for example, from Zotac or Palit) can also be used.

After overclocking the NVIDIA GeForce GTX 260, the operating frequencies increase by 8-16%. Frame rates in games rise accordingly, making gameplay more comfortable.
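To put the 8-16% range in concrete terms, the sketch below projects the resulting clocks from the GTX 260's reference values (core 576 MHz, shader 1242 MHz, memory 999 MHz, the last of which the article cites). These are back-of-the-envelope estimates only; actual headroom varies per card, as the review's own sample showed.

```python
# Projects GTX 260 clocks after the 8-16% overclock range mentioned above.
# Reference clocks for the stock GTX 260; individual cards will differ.
STOCK_CLOCKS_MHZ = {"core": 576, "shader": 1242, "memory": 999}

def projected_clocks(gain):
    """Return clocks scaled by a fractional overclocking gain (e.g. 0.08 for 8%)."""
    return {name: round(mhz * (1 + gain)) for name, mhz in STOCK_CLOCKS_MHZ.items()}

print(projected_clocks(0.08))  # lower bound of the range
print(projected_clocks(0.16))  # upper bound of the range
```

At the top of the range the core would sit around 668 MHz and the memory around 1159 (2318 effective) MHz, which is roughly where factory-overclocked models of the era ended up.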

Given its high power consumption and relatively low speed, the video adapter is not worth using for cryptocurrency mining. Even after overclocking, mining on a GTX 260 yields a profit barely covering the electricity bill.

Therefore, it is not recommended to assemble a mining farm from such GPUs. For this, more modern versions with lower power consumption and more memory capacity and frequency are better suited.

Testing in games

Testing the GeForce GTX 260 in games yields the following figures:

  • in the shooter Call of Duty 4, frame rates range from 60 to 130 FPS, depending on the settings;
  • in the first Assassin's Creed, the card delivers from 46 to 102 FPS;
  • in Unreal Tournament 3, the figure rises to 170 FPS at a resolution of 1280x1024;
  • in Crysis, the frame rate can be 30, 60 or 80 FPS, depending on the settings and resolution.

For more modern games, the numbers are less impressive. Most modern shooters and action games will not run comfortably on it; at best, a strategy game or an MMORPG will.

The GTX 260 is considered the minimum acceptable video card for the online game Total War: Arena; it easily runs the popular fifth installment of The Elder Scrolls series, Skyrim, and even Fallout 4.

However, trying to play The Witcher 3 or GTA V is not recommended: they will start, but the gameplay can hardly be called comfortable.

How to update NVIDIA GeForce GTX 260 drivers

The need to download drivers for the NVIDIA GeForce GTX 260 can arise when the software already installed on the computer has problems. Drivers for the video card will also have to be found when reinstalling Windows (or another supported operating system).

It is recommended to download drivers from the official NVIDIA website. There you can also download the GeForce Experience utility, which allows you not only to optimize the card's performance but also to improve its behavior in games.
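Deciding whether a downloaded driver is actually newer than the installed one comes down to comparing version strings. GeForce desktop driver versions of this era are "major.minor" numbers such as 180.48 or 260.89, so a hypothetical comparison helper can be sketched as follows (the function names are illustrative, not part of any NVIDIA tool):

```python
# Hedged sketch: compare GeForce driver versions of the "major.minor"
# form used throughout this article (e.g. "180.48", "260.89").

def parse_version(version):
    """Split a 'major.minor' driver version into a comparable tuple."""
    major, minor = version.split(".")
    return (int(major), int(minor))

def update_available(installed, latest):
    """True if the candidate driver is strictly newer than the installed one."""
    return parse_version(latest) > parse_version(installed)

print(update_available("180.48", "260.89"))  # True
print(update_available("260.89", "260.89"))  # False
```

Tuple comparison handles the numeric ordering correctly (258.96 sorts before 260.89), which a plain string comparison would not guarantee for versions with differing digit counts.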
