Talk:GeForce FX series

From Wikipedia, the free encyclopedia

This is the current revision of this page, as edited by Cewbot (talk | contribs) at 09:04, 2 February 2024 (Maintain {{WPBS}} and vital articles: 2 WikiProject templates. Create {{WPBS}}. Keep majority rating "C" in {{WPBS}}. Remove 1 same rating as {{WPBS}} in {{WikiProject Computing}}.). The present address (URL) is a permanent link to this version.


"Specification" Section - Clarity


The second sentence in this area: ' With GeForce 3, NVIDIA introduced programmable shader units into their 3D rendering capabilities, in line with the release of Microsoft's DirectX 8.0 release, and the GeForce 4 Ti was an optimized version of the GeForce 3.' seems completely unclear to me. What is this supposed to be saying, and how does it relate to the 5-series? 208.102.5.40 (talk) 21:40, 12 December 2007 (UTC)[reply]

Poor performance of the 5800 Ultra?


The article alludes to the poor performance of the entire range of FX GPUs compared to their ATI counterparts. "With the exception of the FX 5700 series (a late revision), the FX series lacked performance compared to equivalent ATI parts."

As far as I am aware, the performance of the FX 5800 was pretty much equivalent to the 9700 and 9800 Pro: faster in some things, slower in others, but on the whole relatively equal. The issues with the 5800 in comparison with the equivalent ATI parts were a) its cost, b) its power requirements, c) the heat produced and the consequent noisy cooling solution, and d) the fact that the FX 5800 came out so much later than the 9700 Pro but was only equivalent in performance (it had been hyped considerably). http://techreport.com/articles.x/4966/1 —Preceding unsigned comment added by 82.69.41.89 (talk) 22:02, 6 January 2008 (UTC)[reply]

"Issues" Section - Decreasing driver support


This section seems very contrary to the release notes provided by nvidia for every driver release. I don't see these claims being substantiated by their release information. Can anyone verify their validity?--192.35.35.34 23:13, 20 September 2007 (UTC)[reply]


What more do you want? References fully supporting this information are cited in that section, including NVIDIA's own Vista FAQ (see article for link). In addition, if you go to NVIDIA's driver download portal, choose "Manually Find Drivers by Product", and plug in any GeForce FX 5 series GPU with Vista as the OS, it will always bring you to version 96.85, released October 17, 2006 (when Vista was still at beta RC2).
Regarding the release notes, look again. The release notes for every Vista driver since 96.85 have dropped GeForce FX 5 Series models from the list of supported products. The Windows XP/2000 driver continues to support GeForce FX 5 Series, but not the Vista driver. You may be confusing the GeForce FX with certain Quadro products, such as Quadro FX 5500 and 5600. In spite of the virtually identical naming/model designation, these two Quadro FX parts have no relationship to the GeForce FX parts. Quadro FX 5500 and FX 5600 are based on G71 and G80, respectively. All GeForce FX 5 models are based on NV30 (and NV3x variants).
There are several Quadro models based on NV3x, such as Quadro FX 500, FX 1000, and FX 2000. As with the GeForce FX, the last Vista driver to support these NV3x based Quadro models is the same v96.85 released on October 17, 2006. --Brewster1971 (talk) 05:17, 5 December 2007 (UTC)[reply]

"Disappointment: Questionable Tactics" Section


The following section is in dire need of references. It makes a number of somewhat controversial, or at least borderline derogatory, statements about NVIDIA's business practices; without substantiation (if not of the facts, then at least that it is/was widely perceived to be true), I think it either needs to be toned down or removed. The section as currently written is reproduced below. --Kadin2048 16:39, 2 May 2007 (UTC)[reply]

NVIDIA's GeForce FX era was one of great controversy for the company. The competition had soundly beaten them on the technological front and the only way to get the FX chips competitive with the Radeon R300 chips was to optimize the drivers to the extreme.

This took several forms. NVIDIA historically has been known for their impressive OpenGL driver performance and quality, and the FX series certainly maintained this. However, with image quality in both Direct3D and OpenGL, they aggressively began various questionable optimization techniques not seen before. They started with filtering optimizations by changing how trilinear filtering operated on game textures, reducing its accuracy, and thus quality, visibly. Anisotropic filtering also saw dramatic tweaks to limit its use on as many textures as possible to save memory bandwidth and fillrate. Tweaks to these types of texture filtering can often be spotted in games from a shimmering phenomenon that occurs with floor textures as the player moves through the environment (often signifying poor transitions between mip-maps). Changing the driver settings to "High Quality" can alleviate this occurrence at the cost of performance.
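The filtering tweak described in the paragraph above can be illustrated with a small sketch. This is purely illustrative pseudocode, not NVIDIA's actual driver logic; `sample_mip` is a hypothetical stand-in for a bilinear texture fetch. It shows how narrowing the blend band between mip levels (so-called "brilinear" filtering) approximates full trilinear filtering while skipping the blend for most pixels, which is what produces the visible banding and shimmering at mip transitions.

```python
def sample_mip(level):
    # Hypothetical stand-in for a bilinearly filtered texel fetch from
    # mip `level`; returns a synthetic brightness value for this demo.
    return 1.0 / (2 ** level)

def trilinear(lod):
    """Full trilinear: always blend the two nearest mip levels."""
    lo = int(lod)
    frac = lod - lo
    return (1 - frac) * sample_mip(lo) + frac * sample_mip(lo + 1)

def brilinear(lod, band=0.25):
    """Reduced trilinear: blend only within a narrow band around the
    mip transition; elsewhere snap to one level (cheaper, but causes
    visible seams/shimmering at mip boundaries)."""
    lo = int(lod)
    frac = lod - lo
    if frac < 0.5 - band / 2:
        return sample_mip(lo)       # pure bilinear from the lower mip
    if frac > 0.5 + band / 2:
        return sample_mip(lo + 1)   # pure bilinear from the upper mip
    # remap frac into [0, 1] across the narrow blend band
    t = (frac - (0.5 - band / 2)) / band
    return (1 - t) * sample_mip(lo) + t * sample_mip(lo + 1)
```

Away from the transition band the two functions disagree (that difference is the saved work, and the visible artifact); exactly at the transition they agree.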

NVIDIA also began to clandestinely replace pixel shader code in software with hand-coded optimized versions with lower accuracy, through detecting what program was being run. These "tweaks" were especially noticed in benchmark software from Futuremark. In 3DMark03 it was found that NVIDIA had gone to extremes to limit the complexity of the scenes through driver shader changeouts and aggressive hacks that prevented parts of the scene from even rendering at all. This artificially boosted the scores the FX series received. Side by side analysis of screenshots in games and 3DMark03 showed vast differences between what a Radeon 9800/9700 displayed and what the FX series was doing. NVIDIA also publicly attacked the usefulness of these programs and the techniques used within them in order to undermine their influence upon consumers. (Emph mine, Kadin2048)

Basically, NVIDIA programmed their driver to look for specific software and apply aggressive optimizations tailored to the limitations of the NV3x hardware. Upon discovery of these tweaks there was a very vocal uproar from the enthusiast community, and from several popular hardware analysis websites. Unfortunately, disabling most of these optimizations showed that NVIDIA's hardware was dramatically incapable of rendering the scenes on a level of detail similar to what ATI's hardware was displaying. In some cases, the ATi Mobility Radeon 9600 beat the GeForce FX Go 5650 by as much as 481%. So most of the optimizations stayed, except in 3DMark where the Futuremark company began updates to their software and screening driver releases for hacks.

Both NVIDIA and ATI are guilty of optimizing drivers like this historically. However, NVIDIA went to a new extreme with the FX series. Both companies optimize their drivers for specific applications even today (2006), but a tight rein and watch is kept on the results of these optimizations by a now more educated and aware user community.
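The application-detection behavior the quoted section describes can be sketched in a few lines. Everything here is hypothetical for illustration: real drivers fingerprint executables and shader bytecode, and the shader names below are invented. The point is only the mechanism, where a lookup keyed on the detected application silently swaps a submitted shader for a cheaper hand-tuned one.

```python
# Hypothetical (application, shader id) -> cheaper hand-tuned replacement.
# All names are invented for this sketch.
HAND_TUNED_SHADERS = {
    ("3DMark03", "water_ps_2_0"): "water_ps_1_4_lowprec",
}

def select_shader(app_name, shader_id):
    """Return a hand-tuned replacement when the driver recognizes the
    running application; otherwise use the shader as submitted."""
    return HAND_TUNED_SHADERS.get((app_name, shader_id), shader_id)
```

This is also why benchmark vendors responded by screening drivers and randomizing their workloads: the substitution only fires when the detection key matches.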

FX 5500


I have the Gainward Pro/685 GeForce FX 5500. Is that worth adding here? --Migs 08:27, 20 January 2006 (UTC)[reply]

  • Yes, 5500 is appropriate (of course, without Gainward Pro/685). Thanks. Actually, since you suggested this a while ago, I'll go ahead and add it. Please feel free to edit if you want to add more info about it. --Shion Uzuki 17:45, 8 February 2006 (UTC)[reply]

What is the minimum power supply required for a 5200?

GDDR2?


There isn't GDDR2, right? Only GDDR3, developed by ATI.

FX 5300?


I can't see this card anywhere in this article, but I know it exists.

The one I've seen is a PCI-express 64MB card, but there may be variations.

Am I being duped? Or is this the real deal?

Nope, they are made. NV Official Info and photo. It's based on FX5200. It looks like it uses the same AGP -> PCIe bridge chip (under the smaller heatsink) as GeForce 6600 and others. --Swaaye 20:15, 1 April 2006 (UTC)[reply]

Its actual name is PCX 5300, to highlight that the PCI Express interface is being used.

Core Design of NV31, NV34 and NV36


Since shaders rely on a more complex architecture, traditional core descriptions should be limited to cases where pixel shading isn't used, with a note indicating that only 2 ALUs are present for running fragment programs/pixel shaders.

Windows Vista and GeForce FX PCI cards


This section should be tweaked, as Vista has now finally been released.

table


The information is wrong: the NV31, NV34, and NV36 have 2 pipelines with 2 TMUs each, while the NV30, NV35, and NV38 have 4 real pipelines with 2 TMUs each.

No, it is correct. NV3x behaves oddly: the effective configuration depends on the workload. With no color writes it is 4x1; with color writes, 2x2. It's very complex. That's why I'm holding a discussion at http://www.beyond3d.com/forum/showthread.php?p=934346#post934346.
also look at: http://www.beyond3d.com/misc/chipcomp/?view=chip&orderby=chip_name&order=ASC&n=0 --Swaaye 03:06, 23 February 2007 (UTC)[reply]

I'm well aware of that information, but the fact remains that it is still a 2x2 and a 4x2 configuration; that is, there are only 2 pipelines on the NV31, 34, and 36, while the NV30, 35, and 38 each have 4 real pipelines. How they behave does matter, but the core config is still wrong in that table. We don't say a 1900 XTX has 48 pipelines; we say it has 16 with 48 shaders. This works much the same way here: a different principle, but the same idea. Candle 86 16:28, 23 February 2007 (UTC)[reply]

Ok, but it's not as simple as the Radeons or even later GeForces. That 4x1 situation does exist. Still, the fact that it is primarily 2x2 is worth being the primary mention. 4x1 needs to be noted or we miss a critical note for the arch. --Swaaye 17:25, 23 February 2007 (UTC)[reply]
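The two modes being debated above can be captured in a tiny sketch. The numbers follow this discussion, not an official NVIDIA specification: in color-writing passes NV3x reportedly acts as 2 pipes with 2 TMUs each, while Z/stencil-only passes run as 4x1, so the texel rate is the same but the pixel rate doubles.

```python
def per_clock_rates(color_write):
    """Effective per-clock pixel and texel rates for the two NV3x modes
    described in the discussion (illustrative, not an official spec)."""
    if color_write:
        pipes, tmus_per_pipe = 2, 2   # 2x2: 2 color pixels, 4 texels
    else:
        pipes, tmus_per_pipe = 4, 1   # 4x1: 4 Z-only pixels, 4 texels
    return {"pixels": pipes, "texels": pipes * tmus_per_pipe}
```

This is why both sides are partly right: listing only "2x2" in the table hides the 4x1 Z-only case, while listing "4x2" overstates the color-write case.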

I do agree about that, though. Candle 86 18:39, 23 February 2007 (UTC)[reply]

I popped in my Geforce FX5600XT, the NV31 has 2 vertex shaders and 4 pixel shaders Candle 86 22:37, 25 February 2007 (UTC)[reply]

How about approximate dates of release? --Sedotes (talk) 00:09, 1 February 2008 (UTC)[reply]

Mice teeth and interlacing


I'm assuming that the GeForce FX interlaces, because of the mice-teeth (combing) artifacts found on videos and in the Windows Movie Maker preview. How do I force progressive scan? Is it a feature? How do you make the card use progressive scan without the need to deinterlace? 96.228.19.6 (talk) 22:11, 9 December 2007 (UTC)[reply]
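For context on the question above, the combing ("mice teeth") comes from how interlaced fields are combined, which a minimal sketch can show. Frames are represented as lists of rows; this illustrates the two classic strategies, not anything specific to the GeForce FX driver: "weave" interleaves both fields into one frame (combing appears wherever there is motion between fields), while "bob" rebuilds a frame from a single field (no combing, but halved vertical detail).

```python
def weave(top_field, bottom_field):
    """Interleave two fields into one full frame (rows alternate)."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame += [t, b]
    return frame

def bob(field):
    """Build a full frame from a single field by line doubling."""
    frame = []
    for row in field:
        frame += [row, row]   # simple duplication; real bob interpolates
    return frame
```

A player or driver set to progressive output typically applies something like `bob` (or a motion-adaptive blend) before display; previewing the raw weaved frame is what exposes the teeth.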

FX 5100


I have an FX 5100; it came in an old Compaq system. It is a card, not an integrated chip. Has anyone else seen one? I think it should be added to the list; I can provide all the details. - Stony (talk) 10:17, 25 December 2007 (UTC)[reply]

Fair use rationale for Image:NVidia Dawn.jpg


Image:NVidia Dawn.jpg is being used on this article. I notice the image page specifies that the image is being used under fair use but there is no explanation or rationale as to why its use in this Wikipedia article constitutes fair use. In addition to the boilerplate fair use template, you must also write out on the image description page a specific explanation or rationale for why using this image in each article is consistent with fair use.

Please go to the image description page and edit it to include a fair use rationale. Using one of the templates at Wikipedia:Fair use rationale guideline is an easy way to ensure that your image is in compliance with Wikipedia policy, but remember that you must complete the template. Do not simply insert a blank template on an image page.

If there is other fair use media, consider checking that you have specified the fair use rationale on the other images used on this page. Note that any fair use images lacking such an explanation can be deleted one week after being tagged, as described on criteria for speedy deletion. If you have any questions please ask them at the Media copyright questions page. Thank you.

BetacommandBot (talk) 16:37, 8 March 2008 (UTC)[reply]

Fair use of the Dawn image


This obviously is a screenshot from a user's computer, and as such, because the poster most likely took the screenshot, he has a right to post it. NVIDIA has no dispute rights, because nobody owns an image taken from a screenshot generated by your own video card. Nobody can copyright an interface. Also, you have the right to post screenshots, because nobody owns the likeness of a program. Distributing the program may be an infringement, but the program's screenshots are not protected. It upsets me with great indignation when people try to be so meticulous. I feel like my rights of freedom of speech are violated, and it makes me feel like saying things that are better not said on Wikipedia. Nobody can tell me what I can and cannot post as far as screenshots go. Delete it and I will repost it, because I ripped it to my hard drive. —Preceding unsigned comment added by 71.201.151.133 (talk) 19:01, 13 March 2008 (UTC)[reply]

Comparison of 5800 to 9700


That is a bad comparison. The 9x00 cards are very, very old GPUs. I would compare a 5x00 GeForce to an X1000 series card, and in that respect the NVIDIA does fall short. —Preceding unsigned comment added by 71.201.151.133 (talk) 19:11, 13 March 2008 (UTC)[reply]

The 9x00 series was released around the same time as the 5x series and is therefore compared against it. Why would you compare a card from 2003 against a newer video card released in 2005–2006 (series release dates here: http://en.wikipedia.org/wiki/Radeon_R520) instead of the cards the FX line was competing against? 9x series link: http://en.wikipedia.org/wiki/Radeon_R300 —Preceding unsigned comment added by 208.103.43.93 (talk) 18:57, 11 May 2008 (UTC)[reply]

Neutrality issues


The bulk of this article is highly critical of the GeForce FX and makes a lot of comparisons to ATI, seemingly biased towards ATI. Granted, it is a lower-end card, but this article reads more like a negative review, and I'm tempted to tag it with {{review}} or {{NPOV}}. -- OlEnglish (Talk) 01:20, 11 May 2009 (UTC)[reply]

External links modified

Hello fellow Wikipedians,

I have just modified 3 external links on GeForce FX series. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 18:55, 8 January 2017 (UTC)[reply]