VDPAU was proprietary to NVIDIA for a long time, although it is now open source.
ATI gets video acceleration through VA-API with the XvBA backend, as you know. That doesn't yet provide as rich a set of acceleration options. MPEG-2 VLD and MPEG-4 Part 2 are actually implemented in the driver, but they aren't exposed yet. AMD's anticipated XvBA SDK is expected to address that.
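If you want to see exactly which decode profiles your VA-API driver actually exposes (as opposed to what's merely implemented), `vainfo` lists the profile/entrypoint pairs. A quick sketch -- the package name `libva-utils` is an assumption and varies by distro:

```shell
# List the decode profiles the VA-API driver exposes.
# (vainfo usually ships in the libva-utils package -- package name
# is an assumption; check your distro.)
if command -v vainfo >/dev/null 2>&1; then
    # Each "VAProfile... : VAEntrypointVLD" line is a profile the
    # driver will hardware-decode.
    vainfo 2>/dev/null | grep -i 'VAProfile'
    STATUS=checked
else
    echo "vainfo not installed; install libva-utils (or your distro's equivalent)"
    STATUS=missing
fi
```

On the NVIDIA side, `vdpauinfo` does the same job for VDPAU. If a codec doesn't show up in this output, the hardware may support it but the driver isn't exposing it -- which is exactly the MPEG-2/MPEG-4 situation described above.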
That is a red herring, anyway. Notice that I wrote "problematic". ATI GPUs are no more problematic than NVIDIA's.
If you select a high-end GPU, make sure your PSU can handle it. 300 W is too low for most high-end cards.
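As a rough sanity check -- every figure below is an illustrative assumption, not a measurement, so check your actual card's spec sheet:

```shell
# Back-of-the-envelope PSU sizing (illustrative numbers only).
GPU_TDP=195     # e.g. a high-end desktop card like the GTX 680
CPU_TDP=95      # typical desktop quad-core
REST=75         # rough allowance for board, RAM, drives, fans
HEADROOM=135    # ~30% margin for load spikes and PSU aging
TOTAL=$((GPU_TDP + CPU_TDP + REST + HEADROOM))
echo "${TOTAL} W -> look for a quality 500 W+ unit"
```

Which is why a 300 W supply falls short for this class of card even before you add margin.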
As a GNU/Linux and X-Plane/FlightGear user, I'm recommending Nvidia.
http://ggka.online.fr/xp/xp10-63-th.jpg http://ggka.online.fr/xp/xp10-70-th.jpg http://ggka.online.fr/xp/xp10-72-th.jpg http://ggka.online.fr/xp/B777-005-th.jpg
Meanwhile, some of us don't care about hardware (or semantics). In fact, I'll flat out admit it: ATI makes better hardware. But who cares? I suppose if you expect the driver to one day actually support the features you want, then ATI is the way to go. But if you actually want 1080p playback in Linux right now? ATI would be a mistake.
In the last couple of years I've had a few machines with AMD graphics cards, and for various reasons they all got replaced with Nvidia models. There are few reasons to buy one for Linux over an Nvidia alternative.
My usage in the above cases was an HTPC, a gaming machine, and a general workstation.
I'm still trying to find the post where I said "as good as". Call me pedantic, I guess.
Strangely, if one were to look at the ATI wiki in my signature under "Enabling Video Hardware Acceleration", I specifically say that it is not as full featured as other OEMs.
I also said to pick what has the features you want rather than getting embroiled in a flame war - which is what this seems to have become.
I really believe you when you say the ATI cards are good hardware, but why would I buy a piece of hardware when I can't use all or most of its features?
If I have the choice between a €100 Nvidia card whose drivers let me use 90% of the features (compared to the Windows drivers) or a €100 ATI card whose drivers support 70% of the features, then my money goes to Nvidia. ATI wants my money? They can have it when their drivers don't lag so far behind the Windows drivers. I know better than to think Nvidia is doing a great job at this, but at least they do the least-bad job in the industry.
I hear you and I agree with that line of reasoning.
But there is often an undertone in such discussions that ATI doesn't care about Linux or that their drivers are hard to install or they don't work, etc.
I think that is a myth carried over from before AMD bought ATI, when ATI support for Linux really did suck.
AMD does care about Linux, but with ATI they started not just from behind, but headed in the wrong direction. They have worked pretty darn hard to turn that around. You should buy what works. But make it a features/cost decision rather than an "ATI is horrid" decision.
That they have gotten what they have into the driver, even if it is not yet exposed, indicates that they are interested in putting effort and resources into Linux. It even goes so far that they have a tight relationship with Canonical: every April and October they make sure there is a good driver for the Canonical repo even before it is available to the general Linux community.
NVIDIA has a longer history of Linux support. It's not that ATI can't get it together. They started with a lesser hand.
By the way, even though ATI has done a better job of working with open source developers, I think Linus' comment about NVIDIA was a little over the top.
The AnandTech article "NVIDIA's GeForce 600M Series: Mobile Kepler and Fermi Die Shrinks" says:
. . . "At the same time, it wouldn't be unreasonable to expect a cut down GK104 to materialize as the GTX 680M; the desktop GTX 680 only has a TDP of 195 watts, and some careful binning and pruning of clocks (keep in mind that the desktop card is running the GPU at 1GHz and the power-hungry GDDR5 at a staggering 6GHz) could theoretically produce a competitive top-end notebook GPU. It wouldn't be unheard of; NVIDIA's crammed cut down GF100/GF110 Fermi chips into notebooks with a 100W TDP, and the GTX 680 is already very close to that level. Give NVIDIA some time to make a bunch of money selling all the GTX 680 cards they can to early adopters and then we're likely to start seeing trickle down parts, including our presumed GTX 680M."
SOURCE -- http://www.anandtech.com/show/5697/n...hrinks-oh-my/3
Article by Dustin Sklavos & Jarred Walton on 3/22/2012 8:59:00 AM
Also see http://www.compuvest.com/Desc.jsp?iid=1771500 -- not an M suffix, but this is where I buy graphics cards as they become available.