GeForce GTX 260 driver update

In their race to get ahead of each other, NVIDIA and ATI often sacrifice driver quality when launching new graphics processors and the video cards based on them. Yet it is no secret that the NVIDIA GeForce (formerly ForceWare) and ATI Catalyst drivers are one of the decisive success factors for new graphics solutions. It suffices to recall that at the launch of the G200 and RV770 GPUs and the cards built on them there were no official driver versions at all, and reviewers had to test the cards on assorted beta versions that were far not only from ideal, but from basic stability. As a result, the test results could hardly be called fully objective, and the spread between results obtained by different authors was considerable even within a single game or test application.

The release of official, certified driver versions does not always add speed, and not in all applications - most often it is aimed at fixing bugs and supporting new games. Nevertheless, both NVIDIA and ATI routinely claim performance gains in selected games and tests, and in practice these claims are confirmed less often than we would like. To try to pin down the speed gains across different versions of the NVIDIA GeForce and ATI Catalyst drivers, I decided to prepare two separate articles comparing video card performance on several driver versions. In the next article, after the release of the promising Catalyst 8.12, I will test the Radeon HD 4870 on various driver versions. In today's article we will check how far the NVIDIA GeForce drivers have evolved in this respect, using the GeForce GTX 260 (the new version, with 216 shader processors) as an example. ZOTAC will help us here, having kindly provided its GeForce GTX 260 AMP2! Edition video card for testing.

Overview of the ZOTAC GeForce GTX 260 AMP2! Edition 896 MB video card

ZOTAC International (MCO) Limited can still be considered a newcomer to the graphics card market, having entered this segment only last year. The ZOTAC GeForce GTX 260 AMP2! Edition 896 MB comes in a large, vertically oriented cardboard box, whose front side depicts a formidable dragon with outstretched wings:



There you will also find some of the card's specifications, a label noting the extended warranty period, and a sticker stating that the Race Driver: GRID game is bundled with the card. The back of the box describes the key features of the NVIDIA GPU and mentions 192 unified shader processors, while the card tested today carries 216. Apparently the box is left over from ZOTAC's older AMP! series cards, with only an AMP2! sticker added to the front.

Inside, the video card is securely fixed in a plastic shell in the central compartment, and around and under it you can find the following components:



I will list them:

a splitter adapter from the S-Video output to a component cable;
one DVI → HDMI adapter;
one DVI → D-Sub adapter;
instructions for installing a video card and drivers;
two cables for connecting additional power to the video card (6-pin connectors);
audio cable (S/PDIF, for sound output via HDMI interface);
CD with video card drivers;
DVD with the full version of Race Driver GRID.

Here the card compares favorably with competitors' products thanks to the two auxiliary power cables (most bundles include only one) and the quite current, graphically rich Race Driver: GRID.

The external difference between the new video card and the reference GeForce GTX 260 is only in the sticker on the plastic casing of the cooling system, which depicts the same dragon as on the box:


The dual-slot cooling system covers the card from three sides:






The card's dimensions are standard at 270 x 100 x 32 mm. The ZOTAC GeForce GTX 260 AMP2! Edition offers no frills in terms of connectors: a PCI Express x16 2.0 interface, two DVI-I ports (dual-link, supporting high resolutions), and an S-Video output next to the grille that exhausts air out of the system case:



At the top of the card are two 6-pin auxiliary power connectors, a connector for the S/PDIF audio cable, and two MIO connectors for building SLI and 3-way SLI systems out of two or three identical NVIDIA-based video cards:


Let me remind you that, according to the specifications, the peak power consumption of the GeForce GTX 260 (192SP) is 182 W, so a 500 W power supply is recommended for a system with one such card, and 600 W or more for SLI configurations. For a card with 216 unified shader processors the power requirements are slightly higher. These are, of course, official recommendations, which assume your system may contain a power-hungry processor and several hard drives; otherwise the card will run fine on less powerful units - as our measurements show, a GeForce GTX 260 (192SP) card by itself draws about 136 W.

Let's see what a video card looks like without a cooling system:


Its power circuitry:



The GPU of the ZOTAC GeForce GTX 260 AMP2! Edition, marked G200-103-A2, was produced in Taiwan in week 27 of 2008:



The chip is of the second revision (A2). It has 216 unified shader processors, 72 texture units and 28 raster operation units. The GPU frequencies, unlike the official 575/1242 MHz of the GeForce GTX 260, are raised to 648/1404 MHz - gains of +12.7/+13.0%! Quite good, even if these are not record frequencies for factory-overclocked cards. The GPU voltage is 1.06 V. I will add that, to save energy and reduce heat, the GPU frequencies drop to 300/600 MHz in 2D mode.
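The overclock percentages are easy to verify. A small Python sketch (my addition; the clock values are taken from the review above):

```python
# Reference vs. factory clock speeds for the GeForce GTX 260 (values from the review)
reference = {"core": 575, "shader": 1242}  # MHz, official specification
factory = {"core": 648, "shader": 1404}    # MHz, ZOTAC AMP2! Edition

def gain_percent(stock: float, oc: float) -> float:
    """Relative increase of the factory clock over the stock clock, in percent."""
    return round((oc / stock - 1) * 100, 1)

for domain in reference:
    print(f"{domain}: +{gain_percent(reference[domain], factory[domain])}%")
# core: +12.7%, shader: +13.0%
```

Note that the shader-domain gain rounds to 13.0%, not the 13.1% sometimes quoted.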

The card's 896 MB of GDDR3 memory are made up of 14 chips on the front and back of the board, on a 448-bit memory bus. As with all previously reviewed GeForce GTX 260 cards, the ZOTAC card uses Hynix chips with a rated access time of 1.0 ns:


The rated frequency of the chips, marked H5RS5223CFR NOC, is 2000 MHz. Per the GeForce GTX 260 specifications the memory should run at an effective 1998 MHz, while on the ZOTAC GeForce GTX 260 AMP2! Edition it runs at 2106 MHz - only 5.4% above the specification. Rather modest, since GeForce GTX 260 cards with higher memory frequencies are commercially available.
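As a quick arithmetic check (my sketch, with the frequencies quoted above):

```python
# Effective GDDR3 memory frequencies, in MHz (values from the review)
chip_rating = 2000   # nominal frequency of the 1.0 ns Hynix chips
gtx260_spec = 1998   # GeForce GTX 260 specification
zotac_amp2 = 2106    # ZOTAC AMP2! Edition factory setting

over_spec = round((zotac_amp2 / gtx260_spec - 1) * 100, 1)
print(f"+{over_spec}% over the GTX 260 specification")  # +5.4%
```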

Be that as it may, the specifications of the new video card are as follows:


More detailed and complete specifications can be seen in the table:


The cooling system has not changed compared to conventional cards based on GTX 280/260 chips:



I have always been surprised by the thick layer of thermal paste on the GPU. It seems the factory workers are paid piecework by the amount of thermal paste used, much as collective-farm tractor drivers were once paid in Soviet times by the amount of diesel fuel consumed. Except that our "wise men" poured the surplus into the ground at the edge of the field, while the industrious Chinese workers diligently deposit the entire allotted thermal interface onto the GPU heat spreader. The problem is that, ideally, thermal paste should not form a continuous layer between the chip and the heatsink; it should only smooth out the roughness of the metal by filling in the pits - its thermal conductivity is high compared to air, but low compared to the metal of the heatsink.

Let's check the card's temperatures. Here and below, the load on the GPU and the card as a whole was created by ten cycles of the Firefly Forest test from the 3DMark 2006 synthetic suite at 1920 x 1200 with 16x anisotropic filtering. Full-screen anti-aliasing was not enabled, because with it the GPU load and temperature are lower. All tests were carried out in a closed Ascot 6AR2-B case (its fan configuration is listed in the testing methodology section below). Room temperature during testing was 23.5 degrees Celsius. Frequencies and temperatures were monitored with RivaTuner v2.20 (by Alexey Nikolaychuk). Because the card had been disassembled before the tests, the stock thermal interface on the GPU was replaced with highly effective Gelid GC1 thermal paste, applied as thinly as possible.

Let's look at the test results with the turbine in automatic mode:


The GPU temperature did not exceed 80 °C, despite the card's raised frequencies and the low noise of the cooling turbine, which spun up to only 1860 rpm during testing out of a possible 3200 rpm.

And here is how the reference cooler cools the video card at 100% of the turbine power:


Here the temperatures are atypically low for a top-end card. Interestingly, at peak load the voltage regulator current turned out to be 5 A lower at the lower card temperatures than at the higher ones.

Given the reference cooler's high efficiency, the card's overclocking potential was tested without replacing the cooling system, but with the turbine speed manually set to 2050 rpm - the highest speed that remained subjectively comfortable in terms of noise. As a result, the video memory was overclocked to 2448 MHz (+22.5%) without loss of stability or image quality, while the GPU could not be overclocked at all:


Alas, the core of the ZOTAC card provided to us is already running at the limit of its capabilities. Moreover, raising the core voltage from 1.06 V to 1.15 V in the card's BIOS did not improve the GPU's overclocking potential. What can I say? Most likely a trait of this particular sample, nothing more.
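The +22.5% memory overclock figure checks out against the official specification (a quick sketch of mine, not part of the original measurements):

```python
spec_mhz = 1998         # official effective GDDR3 frequency of the GeForce GTX 260
overclocked_mhz = 2448  # effective memory frequency reached in this review

gain = (overclocked_mhz / spec_mhz - 1) * 100
print(f"memory overclock: +{gain:.1f}%")  # memory overclock: +22.5%
```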

The temperature regime of the card additionally overclocked by video memory is as follows:


At the time this material was prepared, the recommended price of the ZOTAC GeForce GTX 260 AMP2! Edition 896 MB was 320-350 US dollars, but prices for the entire new line of NVIDIA-based graphics cards are expected to drop soon, which will certainly affect ZOTAC's products as well. At the time of writing, the street price of this card in Moscow stores was a little over 10 thousand rubles.

Stages in the evolution of GeForce drivers (ForceWare)

First of all, I should draw your attention to the fact that NVIDIA renamed its drivers from "ForceWare" to "GeForce" this year, which, in my opinion, now only invites confusion between the names of the video cards and the names of the drivers. Oh well - we'll get used to it; it's not the first time. As of 11/19/2008 (the start date of testing), NVIDIA had released countless beta drivers and five official certified versions for the GeForce GTX 260 and GTX 280 line, four of which we will examine today.

GeForce 177.41 (06/26/2008) is the second WHQL-certified driver for the GTX 280/260 line. Why not the first official version, 177.35? Because 177.41 came out only nine days after 177.35 and made no significant changes. Here is the official list of what's new:

added support for GeForce GTX 280 and GeForce GTX 260 GPUs;
single GPU and NVIDIA SLI technology supported on DirectX 9, DirectX 10 and OpenGL, including 3-way SLI technology with GeForce GTX 280 and GeForce GTX 260 GPUs;
added support for CUDA technology;
added support for the Folding@home distributed computing client;
support for HybridPower, Hybrid SLI technology, on the following GPUs and motherboards:

GeForce GTX 280 GPU;
GeForce GTX 260 GPU;
nForce 780a SLI;
nForce 750a SLI;


support for GPU overclocking and temperature monitoring when installing NVIDIA System Tools (minor bugs fixed).

As you can see, no performance gains were announced in the (then) new driver. Beyond the above, a couple of bugs under Windows Vista x64 were fixed - and that's all.

GeForce 178.13 (09/25/2008) is the third WHQL-certified driver for the GeForce GTX 280/260, released after an unusually long break for NVIDIA (almost three months). Interesting changes include the following:

Added support for NVIDIA PhysX acceleration for all GeForce 8, 9 and 200 series GPUs with at least 256 MB of video memory (PhysX version 8.09.04 is now packaged with the driver);
Added support for 2-way and 3-way NVIDIA SLI technology for GeForce GTX 200 series GPUs on Intel D5400 XS motherboards;


performance improvements in the following single-GPU 3D applications:

up to 11% in 3DMark Vantage;
up to 15% in BioShock (DX10);
up to 15% in Call of Duty 4;

performance improvements in the following 3D applications for 2-way SLI GPUs:

up to 7% in BioShock (DX10);
up to 10% in World in Conflict (DX10);

fixed various compatibility issues with 3D applications.

In addition, the notes for this release mention fixing some bugs in games for a single video card and SLI bundles. By the way, this version of the GeForce driver was characterized by many users and testers as the most stable and problem-free.

GeForce 178.24 (10/15/2008) was released just 20 days after version 178.13 and is also WHQL-certified. Despite the short interval, there are more than enough changes; I will list the key ones related to game performance:

performance improvements in the following single-GPU 3D applications:

up to 11% in 3DMark Vantage;
up to 11% in Assassin's Creed (DX10);
up to 15% in BioShock (DX10);
up to 15% in Call of Duty 4;
up to 8% in Enemy Territory: Quake Wars;


performance improvements in the following 3D applications for 2-way SLI GPUs:

up to 7% in BioShock (DX10);
up to 10% in Company of Heroes: Opposing Fronts (DX10);
up to 12% in Enemy Territory: Quake Wars;
up to 10% in World in Conflict (DX10).

Attentive readers will notice that all the optimizations are identical to those in driver 178.13. This is not a mistake on my part: the official site states that all the gains are measured against the 178.19 beta, which was released after 178.13. A curious situation - in effect, NVIDIA claimed the same speed-up in the same games twice, with two different drivers. Or rather, the speed should have increased twice over...

Besides the speed optimizations, the new driver brought fixes for DVI-HDMI devices and Hybrid SLI mode, and fixed menu errors in World in Conflict with full-screen anti-aliasing on the GeForce 6600 (apparently somebody at NVIDIA has a sense of humor - that card cannot deliver acceptable performance in such a GPU-hungry game anyway). As with 178.13, the PhysX 8.09.04 libraries ship with this driver.

GeForce 180.48 (11/19/2008) is the newest certified GeForce driver for NVIDIA-based video cards at the moment. The long-awaited driver of the new 180 series, advertised in advance by NVIDIA itself, is supposed to bring significant changes and a noticeable performance boost in addition to the traditional bug fixes. In more detail, it looks like this:

new features:

NVIDIA SLI technology certification for motherboards based on Intel X58 chipsets (Intel Core i7 processors) for GeForce GTX 280, GeForce GTX 260, GeForce 9800 GX2, GeForce 9800 GTX+, and GeForce 9800 GTX GPUs;
added the ability to connect multiple monitors to video cards combined into an SLI bundle, both in desktop and 3D modes;
PhysX acceleration is now enabled on GeForce 8, 9 and 200 series graphics cards;


Performance gains are claimed for the following 3D applications:

up to 10% in 3DMark Vantage;
up to 13% in Assassin's Creed;
up to 13% in BioShock;
up to 15% in Company of Heroes: Opposing Fronts;
up to 10% in Crysis Warhead;
up to 25% in Devil May Cry 4;
up to 38% in Far Cry 2;
up to 18% in Race Driver: GRID;
up to 80% in Lost Planet: Colonies;
up to 18% in World in Conflict.

Particularly impressive is the 80% speed-up in the new game Lost Planet: Colonies - though one might hastily conclude that before driver 180.48 there were only 100 fps where there are now 180. Moreover, all the claimed gains carry a note on the official page: "results may vary on different configurations" - that is, nothing is guaranteed. Still, the changes in the new driver look very interesting. Will they hold up in practice? Let's check.
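A percentage claim like "up to 80%" fixes only the ratio, not the absolute frame rates. A small illustrative sketch (mine, with hypothetical frame rates, not measured data):

```python
def fps_after_gain(fps_before: float, gain_percent: float) -> float:
    """Frame rate implied by a claimed relative speed-up."""
    return fps_before * (1 + gain_percent / 100)

# The same "up to 80%" figure is consistent with very different absolute numbers:
for before in (10, 50, 100):  # hypothetical pre-180.48 frame rates
    print(f"{before} fps -> {fps_after_gain(before, 80):.0f} fps")
# 10 fps -> 18 fps, 50 fps -> 90 fps, 100 fps -> 180 fps
```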

Test configuration, tools and testing methodology

Testing of the video card and different versions of drivers was carried out on a computer with the following configuration:

Motherboard: DFI LANParty DK X48-T2RS (Intel X48, LGA 775, BIOS 03/10/2008);
Processor: Intel Core 2 Extreme QX9650 (3.0 GHz, 1.25 V, L2 2 x 6 MB, FSB 333 MHz x 4, Yorkfield, C0);
CPU Cooler: Thermalright SI-128 SE with Scythe Ultra Kaze fan (1320 rpm);
Thermal interface: Gelid GC1;
RAM:

2 x 1024MB DDR2 Corsair Dominator TWIN2X2048-9136C5D (Spec: 1142MHz/5-5-5-18/2.1V);
2 x 1024MB DDR2 CSX DIABLO CSXO-XAC-1200-2GB-KIT (Spec: 1200MHz / 5-5-5-16 / 2.4V);


Disk Subsystem: SATA-II 300 GB, Western Digital VelociRaptor, 10,000 rpm, 16 MB, NCQ;
HDD cooling and soundproofing system: Scythe Quiet Drive;
Drive: SATA DVD RAM & DVD±R/RW & CD±RW Samsung SH-S183L;
Case: Ascot 6AR2-B (120 mm Scythe Slip Stream case fans at 960 rpm on silicone studs are installed on the intake and exhaust, the same fan at 960 rpm is installed on the side wall);
Control and monitoring panel: Zalman ZM-MFC2;
Power supply: Thermaltake Toughpower 1500 W, W0218 (standard 140 mm fan);
Monitor: 24" BenQ FP241W (1920 x 1200 / 60Hz)

In order to reduce the dependence of the video card tested today on the speed of the platform, the quad-core processor was overclocked to a frequency of 4.0 GHz at a voltage of 1.575 V:


During the tests, the RAM ran with its timings tightened to 5-4-4-12 at Performance Level = 6 and a voltage of 2.175 V.

All tests were performed on Windows Vista Ultimate Edition x86 SP1 (plus all critical updates as of 11/10/2008). The test start date is 11/19/2008, so the following drivers available at that time were used:

chipset motherboard: Intel Chipset Drivers 9.1.0.1007;
DirectX libraries: November 2008.

For each of the GeForce/ForceWare drivers, a separate PhysX package was installed, available at the time the driver was released, or included directly in the driver. Drivers were tested in the same sequence in which they were released. Each of the drivers and the PhysX package with them were installed only after removing the drivers of the previous version and additionally cleaning the system using the Driver Sweeper v1.5.5 program.

After installing each driver, the following control panel settings were changed: graphics quality from "Quality" to "High Quality", transparency anti-aliasing from "Disable" to "Multisampling", and vertical sync forced off ("Force off"). No other changes were made. Anisotropic filtering and full-screen anti-aliasing were enabled directly in the game settings; if a game did not expose these settings, they were adjusted in the GeForce driver control panel.

Testing of video cards was carried out in two resolutions: 1280 x 1024/960 and widescreen 1920 x 1200. The following set of applications was used for tests, consisting of two synthetic tests, one techno demo and twelve games of different genres:

3D Mark 2006(Direct3D 9/10) - build 1.1.0, default settings and 1920 x 1200 + AF16x + AA4x;
3D Mark Vantage(Direct3D 10) - v1.0.1, "Performance" and "Extreme" profile (only the main tests were tested);
Unigine Tropics Demo(Direct3D 10) - v1.1, built-in demo test, maximum quality settings, resolution 1280 x 1024 without methods and resolution 1920 x 1200 with AF16x and AA4x;
World in Conflict (Direct3D 10) - game version 1.0.0.9 (b89), "Very High" graphics quality profile, but with UI texture quality = Compressed; Water reflection size = 512; DirectX 10 rendering enabled;
Enemy Territory: Quake Wars(OpenGL 2.0) – game version 1.5, maximum graphics settings, demo at Salvage level, Finland;
Call of Duty 4: Modern Warfare MP (Direct3D 9) – game version 1.7.568, graphics and texture settings set to "Extra", demo "d3" at the "Bog" level;
Unreal Tournament 3 (Direct3D 9) - game version 1.3, maximum in-game graphics settings (level 5), Motion Blur and Hardware Physics enabled, the "Fly By" scene on the "DM-ShangriLa" level was tested (two consecutive cycles) using the HardwareOC UT3 Bench v1.3.0.0 utility;
Devil May Cry 4(Direct3D 10) – version 1.0 of the game, maximum graphics quality settings ("Super High"), the average value of a double sequential run of the second scene of the test was taken as the result.
S.T.A.L.K.E.R.: Clear Sky(Direct3D 10) - game version 1.5.07, quality settings profile "Enhanced full illumination DX10" plus anisotropic filtering level 16x and other maximum graphics quality settings, own demo "s04" was used (triple test cycle);
Crysis Warhead (Direct3D 10) - game version 1.1.1.690, "Very High" settings profile, two cycles of the benchmark at the "Frost" level using the HardwareOC Crysis WARHEAD Bench v1.1.1.0 utility;
Far Cry 2(Direct3D 10) - game version 1.00, "Ultra High" settings profile, two cycles of the "Ranch Small" test from the Far Cry 2 Benchmark Tool (v1.0.0.1);
X3: Terran Conflict (Direct3D 10) – game version 1.2.0.0, maximum texture and shadow quality, fog enabled, the "More Dynamic Light Sources" and "Ship Color Variations" options enabled; the average speed over one run of all four demos was taken as the result;
Left 4 Dead (Direct3D 9) - game version 1.0.0.5, maximum quality, a "meaty" demo tested (two passes) in the first scene of "The Sewer", the third chapter of the "No Mercy" campaign;
Lost Planet: Colonies(Direct3D 10) - game version 1.0, graphics level "Maximum quality", HDR Rendering DX10, built-in test consisting of two scenes.

The last game is of little interest any more, as it is quite old (by gaming-industry standards). Nevertheless, I decided to add it to the list, since the GeForce drivers claim up to an 80% performance increase in Lost Planet: Colonies with driver 180.48! I wonder how realistic that figure is.

Testing in each application was carried out twice (not to be confused with the double run of demos!). The better of the two results (fps or points) was taken as the final figure shown in the diagrams. Let's move on to the test results and their analysis.

Test results

In the diagrams the drivers are listed in release order. The first driver, version 177.41, is highlighted in yellow, the two 178-series drivers are marked in blue-green, and the new 180.48 in dark blue. Thus, the list of drivers in the tests looks like this:

GeForce 177.41;
GeForce 178.13;
GeForce 178.24;
GeForce 180.48.

Before analyzing the results, one very important point must be mentioned. Driver 180.48 presented an unpleasant "surprise": the still very popular (if not the most popular) resolution of 1280 x 1024 pixels disappeared from it, with 1280 x 960 offered in games and tests instead. It proved impossible to add it in the "Custom" resolutions section of the driver control panel - every attempt to test the "alternative" 1280 x 1024 resolution ended with an error. There was no time to wait for a reply from NVIDIA support, to which a request was sent. Alternatively, I could simply have replaced the 1280 x 1024 resolution used with all the other driver versions with 1280 x 960, but, as you remember, driver 180.48 was tested last... and the prospect of repeating the full test cycle on the three previous driver versions did not appeal to me at all. However, 1280 x 960 contains only 6.25% fewer pixels than 1280 x 1024, and video card performance does not scale linearly with resolution anyway. Besides, we have the 1920 x 1200 mode, identical for all drivers and less processor-bound, which I suggest we focus on.
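The pixel-count difference between the two resolutions is easy to compute (my sketch, purely illustrative):

```python
full = 1280 * 1024     # the resolution missing from driver 180.48
reduced = 1280 * 960   # the substitute it offers instead

deficit = (1 - reduced / full) * 100
print(f"1280 x 960 renders {deficit:.2f}% fewer pixels than 1280 x 1024")  # 6.25%
```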

First, the results of testing video card drivers in two synthetic tests:

3D Mark 2006



In 3DMark 2006 there is essentially no performance gain from newer driver versions. The 178-series drivers bring a barely measurable improvement over 177.41, while 180.48 takes the lead only thanks to the slightly lower 1280 x 960 resolution and is level with the others at 1920 x 1200.

3D Mark Vantage



The three versions preceding 180.48 perform equally in 3DMark Vantage, while 180.48, the latest of the official drivers, shows a slight speed gain at both resolutions. The total score is not calculated at 1280 x 960, so only the GPU Score is given.

Unigine Tropics Demo

I draw your attention to the fact that today's article uses the new version 1.1 of this beautiful demo. The settings look like this (resolution, AF and AA changed):



Let's look at the results:



In the test from the new version of the Unigine Tropics demo, driver version 180.48 also allows the video card to run a little faster.

So, in two of the three synthetic tests we observed a small but real performance gain on the new 180.48 drivers. How will things look in real games?

World in Conflict






In the mode without anti-aliasing and anisotropic filtering there is a performance gain with all three drivers released after 177.41. The latest driver, 180.48, shows a gain in the mode with full-screen anti-aliasing and anisotropic filtering as well - quite close to the promised 18 percent.

Enemy Territory: Quake Wars







Call of Duty 4: Modern Warfare MP







Unreal Tournament 3






In the three games tested above - Enemy Territory: Quake Wars, Call of Duty 4: Modern Warfare MP and Unreal Tournament 3 - there was no difference in the performance of the GeForce GTX 260 across driver versions (recall that 180.48 gets a slight head start from the 1280 x 960 resolution). Gains were promised in Enemy Territory: Quake Wars and Call of Duty 4: Modern Warfare MP, but apparently they appear only on those "different configurations".

Devil May Cry 4






In Devil May Cry 4, however, a significant performance jump with driver 180.48 is obvious. Both with full-screen anti-aliasing and anisotropic filtering and without them, GeForce 180.48 lets the card deliver a higher average frame rate than any of the previous versions.

S.T.A.L.K.E.R.: Clear Sky



Performance varies only subtly between driver versions here - hardly worth paying attention to.

Crysis Warhead

Crysis WARHEAD game settings are as follows:


Results:






GeForce 180.48 is also 1-2 frames per second faster in Crysis WARHEAD, which in practice is, of course, completely unnoticeable in the game.

Far Cry 2






The results in the mode without image quality enhancements are of no interest, but with full-screen anti-aliasing and anisotropic filtering enabled, GeForce 180.48 surges ahead. It is nowhere near the promised 38 percent gain, but interesting nonetheless. Fortunately, no difference in image quality could be found between driver versions.

X3: Terran Conflict

In the new game X3: Terran Conflict, testing was carried out with the following settings (resolution, AF and AA were changed):


The test results did not bring any surprises:







Left 4 Dead

Since testing in this new game is carried out for the first time, I will give here its detailed settings:


Screen resolution, anisotropic filtering, and full-screen anti-aliasing changed depending on the testing mode. Since the game engine is not too heavy for modern graphics cards, MSAA8x was used instead of the commonly used 4x full screen anti-aliasing.

The test was carried out on the first stage of "The Sewer", the third chapter of the No Mercy campaign, amid numerous explosions, rapidly expiring undead and other effects:



Let's look at the results:






As you will remember, the GeForce release notes make no mention of a performance boost in Left 4 Dead - indeed, the game was not yet on the market when every version up to and including 178.24 was released. Nevertheless, the gain is very noticeable, arriving in steps from the 177.xx series to 178.xx and on to 180.xx. No changes in image quality were found, either in motion or on close examination of screenshots.

Lost Planet: Colonies

Since testing in this game was carried out after the completion of the main block of tests, the resolution of 1280 x 960 was used not only in driver version 180.48, but also in the three versions of GeForce that preceded it. Results:






Indeed, there is a slight performance increase with newer driver versions in both Lost Planet: Colonies test scenes, but it is nowhere near the 80% claimed for 180.48.

Graphics quality and speed on the example of 3DMark 2006

First, a caveat: this section is an addendum and does not pretend to be a full-fledged study. Here I propose to evaluate how the performance of the GeForce GTX 260 (216SP) in the 3DMark 2006 synthetic suite degrades depending on the graphics quality mode, and to assess the quality itself. To do this, 3DMark 2006 was run eight times at each of the two resolutions: first the quality setting in the GeForce driver control panel was stepped from "High Performance" through to "High Quality", then, in 3DMark 2006 itself, anisotropic filtering and three levels of full-screen anti-aliasing (2x, 4x and 8x) were enabled in succession. Let me remind you that 3DMark 2006 does not respond to anti-aliasing enabled in the GeForce control panel, so other multisampling types were not tested. This particular synthetic suite was chosen for its almost unique ability to capture a screenshot of a given frame, which is essential for this kind of comparison.

The tests were performed with GeForce driver version 180.48. First, let's look at the diagram of speed versus quality mode:



Obviously, the quality mode selected in the driver does not affect speed at all - the 3DMark 2006 scores fluctuate within the margin of error in both resolutions. Enabling anisotropic filtering and, especially, full-screen anti-aliasing does reduce speed, which is entirely predictable.

Now let's look at how the picture quality changes with different settings in the GeForce Drivers Control Panel. For this, frame number 1350 from the “Canyon Flight” scene was selected at the maximum resolution for the monitor of 1920 x 1200:


High Performance / Performance


Quality / High Quality


To examine the details, it is most convenient to download all the screenshots at once and switch between them in an image viewer (ACDSee, for example). The first two screenshots, taken in the "High Performance" and "Performance" modes, do not differ in quality at all; at least I could not find any differences. The second pair of screenshots, in the "Quality" and "High Quality" modes, also do not differ from each other, but a difference from the first pair can be identified: at the "Quality" and "High Quality" settings the picture is a little darker, and the background does not look as washed out as in the first pair of screenshots.

Now let's see how the picture changes when various graphics quality improvement techniques are activated, be it anti-aliasing or anisotropic filtering:


HQ+AF16x / HQ+AF16x+AA2x


HQ+AF16x+AA4x / HQ+AF16x+AA8x


Enabling 16x anisotropic filtering literally transforms the image! Textures appear on the body of the sea monster and the rocks, the planks on the rudder and keel of the airship become visible, and the surface of the balloon and its fins look more natural and detailed. In general, turning off anisotropic filtering is strictly not recommended. In the remaining three screenshots, full-screen anti-aliasing gradually raises image quality to its maximum level: the smoothed cables and edges of the ship, the guards of the engine propellers, the fins of the balloon from which the ship is suspended - all this is the result of multisampling. As for the difference between the anti-aliasing levels, the step from AA2x to AA4x is very noticeable, but the transition to AA8x is much less obvious.

Initially, I planned to perform similar testing in the new game Crysis WARHEAD, but the attempt ended in failure. Although the HardwareOC Crysis WARHEAD Bench utility allows you to take a screenshot at a given frame of the demo, the screenshots turned out slightly blurred at the edges, and comparing them for graphics quality would have been a stretch. I went through all 13 demos built into the test, but the frames came out offset in every case. Alternatively, one could try taking screenshots immediately after loading a save, but in that case the image is not static, so an accurate quality comparison was impossible. Nevertheless, this did not prevent us from tracking how the card's performance in the game changes with the quality mode:



The notable findings in Crysis WARHEAD are that anisotropic filtering does not affect the already low frame rate on the GeForce GTX 260, and that the results in the various full-screen anti-aliasing modes are almost equal.

Conclusion

Although the performance gains promised by NVIDIA for successive GeForce driver versions were not confirmed under my conditions, speed increases were observed during today's testing in World in Conflict, Devil May Cry 4, Crysis WARHEAD (a very small increase), Far Cry 2, Left 4 Dead and Lost Planet: Colonies, as well as in the 3DMark Vantage and Unigine Tropics Demo test applications. So there is no doubt that installing new drivers is worthwhile. That said, some of the older games mentioned in the drivers' release notes were not tested here, and for some of the newer titles listed there no sufficiently precise method of measuring performance exists. On other configurations and operating systems, the performance change may go either way. It should also be remembered that new drivers are intended not only to improve performance but also, frequently, to fix bugs in new games, which is another argument in favor of installing new versions.

Finally, a few words about the tested card. The ZOTAC GeForce GTX 260 AMP2! Edition turned out to be an interesting product with expressive packaging, a very complete bundle, factory-increased frequencies and a very efficient, quiet cooling system. The overclocking potential of the card's graphics processor was, unfortunately, completely absent, but memory overclocking proved quite successful. The price of the GeForce GTX 260 AMP2! Edition is comparable to that of competitors' products. It should be noted that the range of GeForce GTX 260 cards on the market is quite extensive, so choosing a model that suits you will not be difficult.

In our next article on this topic, I plan to tell you about the evolution of the speed of ATI Catalyst drivers using the Radeon HD 4870 1024 MB video card as an example.

P.S. We thank the Russian representative office of ZOTAC and 3Logic, as well as personally Nadezhda Dymova for providing the video card for testing.

Other materials on this topic


Fallout 3 and modern video cards: expectations and reality
Inexpensive gaming graphics card ATI Radeon HD 4830
Choosing a video card for games: autumn-winter 2008

This is a WHQL release from the Release 260 family of drivers. This driver package supports GeForce 6, 7, 8, 9, 100, 200, 300, and 400-series desktop GPUs as well as ION desktop GPUs.

New in Release 260.89

New GPU Support

  • Adds support for the newly released GeForce GPU.

Performance

  • Increases performance for GeForce GTX 400 Series GPUs in several PC games vs. v258.96 WHQL drivers. The following are examples of some of the most significant improvements measured on Windows 7. Results will vary depending on your GPU and system configuration:
  • GeForce GTX 480:

      • Up to 10% in StarCraft II (2560x1600 4xAA/16xAF Ultra)
      • Up to 14% in S.T.A.L.K.E.R.: Call of Pripyat (1920x1200 4xAA/16xAF)
      • Up to 16% in S.T.A.L.K.E.R.: Call of Pripyat (SLI - 2560x1600 4xAA/16xAF)
      • Up to 6% in Aliens vs. Predator (SLI - 1920x1200 noAA - Tessellation on)

    GeForce GTX 460:

      • Up to 19% in StarCraft II (SLI - 1920x1200 4xAA/16xAF Ultra)
      • Up to 15% in Battlefield Bad Company 2 (SLI - 2560x1600 4xAA/16xAF)
      • Up to 12% in S.T.A.L.K.E.R.: Call of Pripyat (2560x1600 noAA)
      • Up to 9% in Aliens vs. Predator (1680x1050 4xAA/16xAF - Tessellation on)
      • Up to 7% in Metro 2033 (1680x1050 noAA - Tessellation on)
      • Up to 11% in Dirt 2 (SLI - 2560x1600 4xAA/16xAF)
      • Up to 12% in Crysis:Warhead (SLI - 1920x1200 4xAA/16xAF Gamer)
      • Up to 13% in Far Cry 2 (2560x1600 4xAA/16xAF)
      • Up to 12% in H.A.W.X (SLI - 1920x1200 4xAA/16xAF SSAO Very High)
      • Up to 5% in Just Cause 2 (1920x1200 4xAA/16xAF)
      • Up to 22% in Riddick: Assault on Dark Athena (1920x1200 noAA)
      • Up to 5% in 3DMark Vantage (Extreme Preset)

Blu-ray 3D

  • Adds support for playing back Blu-ray 3D discs when connecting your GPU to an HDMI 1.4 3D TV. Playback requires a compatible software application from CyberLink, ArcSoft, Roxio, or Corel.

HD Audio

  • Adds lossless DTS-HD Master Audio and Dolby TrueHD audio bitstreaming support for compatible Blu-ray movies with GeForce GTX 460 GPUs*.
  • Adds high definition 24-bit, 96 and 192 KHz multi-channel audio sampling rate support for compatible Blu-ray movies with GeForce GTX 400 Series, GT 240, GT 220 and 210 GPUs*.
  • Upgrades HD Audio driver to version 1.1.9.0.

*Note: A Blu-ray movie player update may be required to enable these new features; check with your movie player's software manufacturer for more details.

Installation

  • New driver installer with enhanced user interface and new Express and Custom installation options.
    • Express – fast and easy one-click installation
    • Custom - customized installation
      • Option to perform a clean installation (completely removes older drivers from your system prior to installing the new driver).
      • Option to choose which driver components (e.g. PhysX or 3D Vision) to install.
    • Improved installation time for multi-GPU PCs.

NVIDIA Surround

  • Updated NVIDIA Surround setup wizard
    • After first setup, the wizard allows users to jump to any setup step.
    • Improved display connection diagrams and tooltips.
    • Improved UI for setup and arrangement of displays.
    • Improved bezel correction setup experience.
    • Adds help page to highlight which in-game resolution to select (e.g. how to pick bezel corrected resolutions)
    • Option to dedicate an extra GPU to PhysX or to drive an additional display.
    • Allows for portrait or landscape setup directly from the setup wizard.
  • Updated 3D Vision Surround and NVIDIA Surround game support list. Please visit the full list of supported games.

NVIDIA 3D Vision

  • WHQL certified driver
  • With Release 260 drivers, the installation process for 3D Vision has changed. Please view this knowledgebase article for more information on the changes.
  • Fixed an issue where glasses would lose sync with the 3D Vision IR emitter, causing the glasses to flicker and the 3D effect to be lost.
  • Adds NVIDIA 3D Vision streaming support for Firefox 4 and Google Chrome web browsers.
  • Adds support for Sony's 3D Sweep Panorama picture format added to NVIDIA 3D Photo Viewer (Sony digital cameras that can capture 3D Sweep Panorama pictures include NEX-5/NEX-3, Alpha a560/a580 and Cyber-shot DSC-WX5/DSC-TX9 /DSC-T99 models).
  • Adds support for new 3D Vision Desktop LCD monitors: BenQ XL2410T and NEC F23W2A
  • Adds support for new 3D Vision projectors: Sanyo PDG-DWL2500 and ViewSonic PJD6251
  • Added the following 3D Vision game profiles:
    • Arcania Gothic 4
    • Fallout: New Vegas
    • Ferrari Virtual Academy 2010 (new in 260.89)
    • Ferrari Virtual Race (new in 260.89)
    • FIFA 11
    • Formula 1 Racing
    • Final Fantasy XIV Benchmark
    • Guild Wars 2
    • Kane & Lynch 2
    • Lead and Gold
    • Lego Harry Potter
    • Live For Speed
    • Lost Planet 2
    • Moonbase Alpha
    • Serious Sam HD
    • Shrek Forever After
    • Singularity
    • Virtua Tennis 2009
    • Virtua Tennis 3
  • Updated the following 3D Vision game profiles:
    • Civilization V – updated from v260.63 to 3D Vision Ready rating
    • Dead Rising 2 – updated from v260.63 to 3D Vision Ready rating
    • Drakensang: The Dark Eye – updated in-game compatibility text
    • Mafia II – updated profile for a proper 3D Vision rating
    • StarCraft II – fixed profile to properly recognize the retail game executable name and match the 3D Vision rating of “Good”
    • Super Commander – fixed HUD elements
    • TRINE – new profile fixes that allow the game to be rated “3D Vision-Ready” when used with the TRINE patch v1.08, available via Steam.

NVIDIA SLI

  • Adds or enhances SLI profiles for the following PC games:
    • City of Heroes: Going Rogue
    • Alien Swarm
    • Dead Rising 2
    • Front Mission Evolved
    • Kane and Lynch 2: Dog Days
    • LEGO: Harry Potter

Other Improvements

  • Adds support for OpenGL 4.1 for GeForce 400 series GPUs.
  • Upgrades System Software to version 9.10.0514.
  • Improves compatibility for older PC games (DirectX 7 to DirectX 9) running on Windows 7 (examples: Gothic, Gothic II, Falcon 4.0: Allied Force, Links 2003, Independence War II - Edge of Chaos, and X2: Wolverine's Revenge).
  • Adds drag and drop display arrangement support to the “Set up multiple displays” page.
  • Includes numerous bug fixes. Refer to the release notes on the documentation tab for information about the key bug fixes in this release.
  • Users without US English operating systems can select their language and download the International driver.

Additional Information

  • Supports the new GPU-accelerated features in .
  • Supports GPU-acceleration for smoother online HD videos with Adobe Flash 10.1.
  • Supports the new version of MotionDSP's video enhancement software, vReveal, which adds support for HD output. NVIDIA customers can download a free version of vReveal that supports up to SD output.

    GeForce 7 series:
    7950 GX2, 7950 GT, 7900 GTX, 7900 GT/GTO, 7900 GS, 7200 GS, 7300 LE, 7300 GT, 7300 GS, 7150 / NVIDIA nForce 630i, 7100 GS, 7100 / NVIDIA nForce 630i, 7100 / NVIDIA nForce 620i, 7050 PV / NVIDIA nForce 630a, 7050 / NVIDIA nForce 610i, 7025 / NVIDIA nForce 630a

    GeForce 6 series:
    6800 XT, 6800 XE, 6200 TurboCache, 6200 LE, 6200 A-LE, 6200, 6150SE nForce 430

The NVIDIA GeForce GTX 260 graphics adapter first appeared on the market in 2008 and has long since become outdated. Still, it is ahead of almost all integrated video cards and can run even some modern games, albeit at the lowest settings. Notably, the model was not the flagship even at the time of its release, but it compared favorably with other offers on the market, nearly matching the older flagship GTX 280 in specifications.

With respectable characteristics for its time, the GTX 260 allowed the computer it was installed in to be used for the following purposes:

  • running games and other graphics applications that use DirectX 10.0 or OpenGL 2.1;
  • watching video in Full HD format via the TV-Out connector;
  • outputting images at 2048x1536 and 2560x1600 when connected to monitors via VGA and HDMI adapters, respectively (although the card itself carries only DVI-I connectors).

The card's 896 MB of GDDR3 video memory at 999 (1998 effective) MHz was unimpressive even against the competitors of 2008. However, the 448-bit bus made the video adapter more productive than some more modern but office-class models with 1 or 2 GB of memory, such as the GT 610 or GT 720. In tests it performs on a par with the integrated Intel HD 630.
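
Why the wide bus matters can be shown with simple arithmetic: peak memory bandwidth is the effective memory clock times the bus width in bytes. A minimal sketch using the figures quoted above (GDDR3 transfers on both clock edges, so 999 MHz becomes 1998 MT/s):

```python
# Peak memory bandwidth from bus width and effective memory clock.
# Figures taken from the text above: 448-bit bus, 999 (1998) MHz GDDR3.

def memory_bandwidth_gb_s(effective_mhz: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    bytes_per_transfer = bus_width_bits / 8  # 448 bits = 56 bytes per transfer
    return effective_mhz * 1e6 * bytes_per_transfer / 1e9

print(f"{memory_bandwidth_gb_s(1998, 448):.1f} GB/s")  # prints "111.9 GB/s"
```

The result, roughly 112 GB/s, is several times what a narrow-bus card like the GT 610 can reach even with more memory, which is exactly the comparison made above.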

Good specifications at the time of release meant a high price: the GTX 260 started at around $400. Now, 10 years later, those who want a graphics adapter can get one for much less; on the secondary market, an NVIDIA GeForce GTX 260 starts at about 1,500 rubles.

GTX 260 Overview

The appearance of the graphics adapter, regardless of its manufacturer, remains impressive and reminiscent of modern cards. Its features include:

  • support for PhysX game effects, CUDA general-purpose computing, DirectX and OpenCL, which also makes the card suitable for video encoding and editing;
  • the ability to pair it with one or two identical adapters on an SLI-compatible motherboard;
  • a certain margin of capability for working with 3D images on a compatible monitor.

On the other hand, the card's capabilities largely depend on the computer's central processor and the amount of RAM. Good performance requires a processor of the Intel Core i5 class and 4-8 GB of RAM.

The disadvantages include the GTX 260's power consumption, which reaches 182 W: because of it, the video adapter's cooling system is quite noisy, and normal operation requires a powerful power supply.

Another major downside is the lack of DirectX 11 support, which can cause some new games to simply fail to run even if the card meets their minimum requirements.

What power supply is needed for the GTX 260

Needing almost 200 W for the video card alone calls for a sufficiently powerful power supply. The recommended unit for a basic gaming PC is 500 W; the minimum allowable is 450 W.

Such requirements rule out installing the card in a typical office PC equipped with a 300-400 W unit. So even when assembling an inexpensive system from used components, it is worth providing a sufficiently powerful power supply.
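
The 450-500 W recommendation can be reproduced with back-of-the-envelope arithmetic: sum the component draws and add a headroom margin. A minimal sketch, where every wattage except the card's 182 W figure quoted above is an illustrative assumption:

```python
# PSU sizing sketch. Only the GTX 260's 182 W comes from the text above;
# the other component draws are illustrative assumptions for a mid-range PC.
COMPONENT_DRAW_W = {
    "GTX 260": 182,
    "CPU": 95,
    "motherboard + RAM": 50,
    "drives + fans": 30,
}

def recommended_psu_w(draws: dict, headroom: float = 0.3) -> int:
    """Total system draw plus a headroom margin, rounded up to a 50 W step."""
    total = sum(draws.values()) * (1 + headroom)
    return int(-(-total // 50) * 50)  # ceiling to the next multiple of 50

print(recommended_psu_w(COMPONENT_DRAW_W))  # prints 500
```

With 30% headroom the sketch lands exactly on the 500 W unit recommended above; dropping the headroom to about 15% gives the 450 W minimum.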

How to overclock NVIDIA GeForce GTX 260 graphics card

To improve graphics performance, you can overclock the GTX 260 with utilities such as FireStorm or RivaTuner; vendors (for example, ZOTAC or Palit) also offer their own overclocking software.

A completed overclock of the NVIDIA GeForce GTX 260 raises its operating frequencies by 8-16%, with a corresponding increase in game performance and gameplay comfort.
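
What an 8-16% overclock means in absolute terms is easy to compute from the original GTX 260's reference clocks (576 MHz core, 1242 MHz shader, 999 MHz memory); a small sketch:

```python
# Scale the GTX 260's reference clocks by the 8-16% range quoted above.
REFERENCE_CLOCKS_MHZ = {"core": 576, "shader": 1242, "memory": 999}

def overclocked(clocks: dict, percent: float) -> dict:
    """Return all clocks scaled up by the given percentage, rounded to 1 MHz."""
    return {name: round(mhz * (1 + percent / 100)) for name, mhz in clocks.items()}

print(overclocked(REFERENCE_CLOCKS_MHZ, 8))   # lower bound of the quoted range
print(overclocked(REFERENCE_CLOCKS_MHZ, 16))  # upper bound
```

So the quoted range corresponds to roughly 622-668 MHz on the core and 1079-1159 MHz on the memory; actual stable values depend on the specific sample and its cooling.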

Given its high power consumption and relatively low speed, the video adapter is not worth using to mine cryptocurrency: even after overclocking, mining on a GTX 260 yields profit comparable to the electricity bill.

Therefore, assembling a mining farm from such GPUs is not recommended; more modern cards with lower power consumption and larger, faster memory are better suited for that.
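
The profit-versus-electricity claim is simple arithmetic: daily revenue minus 24 hours of the card's power draw. A sketch where the revenue and tariff figures are purely hypothetical; only the 182 W draw comes from the text above:

```python
# Daily mining profitability sketch. Revenue and electricity prices below
# are hypothetical placeholders, not real market data.

def daily_mining_profit(revenue_per_day: float,
                        card_power_w: float,
                        price_per_kwh: float) -> float:
    """Revenue minus 24 hours of electricity for the card alone."""
    power_cost = card_power_w / 1000 * 24 * price_per_kwh
    return revenue_per_day - power_cost

# e.g. $0.50/day of revenue, 182 W draw, $0.12 per kWh:
print(f"{daily_mining_profit(0.50, 182, 0.12):.2f}")  # prints "-0.02": a net loss
```

Even under these generous assumptions the card roughly breaks even before counting the rest of the system's draw, which is why a GTX 260 mining farm makes no sense.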

Testing in games

Testing the GeForce GTX 260 in games yields the following figures:

  • in the shooter Call of Duty 4, the frame rate ranges from 60 to 130 FPS, depending on the settings;
  • in the first Assassin's Creed, the card produces 46 to 102 FPS;
  • in Unreal Tournament 3, the figure rises to 170 FPS at a resolution of 1280x1024;
  • in Crysis, the frame rate can be 30, 60 or 80 FPS, depending on the settings and resolution.

For more modern games the results are less impressive: most recent shooters and action games will not run acceptably on it, at best strategies or MMORPGs will.

The GTX 260 is considered the minimum acceptable video card for the online game Total War: Arena, and it easily runs the popular fifth installment of The Elder Scrolls series, Skyrim, and even Fallout 4.

Trying to play The Witcher 3 or GTA V, however, is not recommended: they will start, but the gameplay can hardly be called comfortable.

How to Update NVIDIA GeForce GTX 260 Drivers

The need to download drivers for the NVIDIA GeForce GTX 260 may arise when there are problems with the software already installed on the computer, or when reinstalling Windows (or another supported operating system).

To download the driver, use the official NVIDIA website. There you can also download the GeForce Experience utility, which not only optimizes the card's performance but also improves its functionality in games.
