View Full Version : Radeon X800 or a cheap nVidia 7 series?

October 2nd, 2008, 09:03 PM
When I upgrade my machine I'm planning on making a Myth box (for no other reason than that it'd be interesting and I want to do something with the spare parts - which will be half of the computer!). While I use Fedora on the desktop and have never quite got on with Ubuntu as easily when I've tried it, Linux Format in the UK have just done a Mythbuntu review and it looked like it had some nice extra features, so it's currently the front-runner :)

Talking to a guy at work who has only ever used nVidia and has a Myth box or two, I've been recommended that my Radeon X800 XL might not be the best idea and that it might be good to grab a 7-series nVidia before they vanish. What I was wondering was whether that's actually necessary, or whether graphics cards and the ATI drivers have come on far enough that the old "only nVidia" advice is no longer quite so applicable.

Here's the hardware I'll be using:

AMD Athlon 64 3500+
Radeon X800 XL (256MB)
Goodmans HD-capable TV with VGA input (possibly this model (http://www.goodmans.co.uk/productdetails.aspx?pid=GTVL26W8HD&language=en-GB) - the size and look is about right), so a WXGA 1366 x 768 screen resolution
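Incidentally, if a TV like that doesn't report its native mode properly over VGA, an explicit modeline in xorg.conf sometimes helps. This is only a sketch: generate the timing numbers yourself with `cvt 1360 768 60` (1360 rather than 1366 because many cards want a width divisible by 8), and treat the identifiers and the modeline below as placeholder assumptions.

```
# Hypothetical xorg.conf fragment - regenerate the Modeline locally with
# "cvt 1360 768 60" (or "gtf 1360 768 60") rather than trusting these numbers.
Section "Monitor"
    Identifier "GoodmansTV"
    # Example CVT output for 1360x768 @ ~60 Hz - verify on your own system
    Modeline "1360x768_60.00"  84.75  1360 1432 1568 1776  768 771 781 798 -hsync +vsync
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "GoodmansTV"
    DefaultDepth 24
    SubSection "Display"
        Modes "1360x768_60.00"
    EndSubSection
EndSection
```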

If I'm trying to watch satellite (UK Sky) and recordings from a box like that then does anyone know whether quality will be any good? Or will my friend be right and a cheap 7300 GT would be better because of the scaling support?

There's an entry in the PVR hardware database (http://pvrhw.goldfish.org/tiki-view_pvrent.php?systemid=Silverstone%20SG-01) that makes it look like it should be okay, but it's ~18 months old and doesn't mention the resolution of the TV.


October 2nd, 2008, 09:44 PM
If my experience is anything to go by, go for the nVidia. I unwittingly bought a mobo with onboard ATI graphics. After about two weeks of trying to get it usable, I gave in and got an old FX7200 for £14 off eBay. Ten minutes after putting the new card in, booting and installing from a flash drive, I was watching TV.

Seriously, 10 mins. I still can't get over how fast Ubuntu is to install - especially from flash - vs Windows.

Also, I'm not sure VGA is the way to go for watching video. In a side-by-side comparison DVI / HDMI will always win out over an analogue signal like VGA. If you're getting an HDTV and you have a DVI output you'd be mad to use VGA.

October 3rd, 2008, 09:04 AM
The thing I'm thinking of is that a cheap nVidia is still ~£30 (I wouldn't trust eBay for computer components unless it was an eBay store), and I've not had a problem with my Radeon in the past year. On top of that, the Radeon drivers are improving all the time and the X800 is probably more powerful than a 7300 (although I don't know how that'll affect scaling).

In terms of DVI, we've already got the TV and it doesn't have DVI. It does have HDMI, but it'd have to be a more modern and expensive card to have HDMI out. That leaves VGA as the only real option (except S-Video, if I have a cable).

To be honest I'm always dubious of people's "never use VGA" comments. I just replaced my monitor (with DVI) with a new Samsung (which I accidentally bought with just VGA) and I can't see any loss of quality. Everything still looks perfectly clean and crisp, and I can't see what difference DVI would make (except at work, where I use a KVM and it'd stop me having to re-sync the monitors to get them both perfect). It's probably one of those audiophile-like things, and the reason why HD hasn't taken off: 99% of the population can't tell the difference in general usage, and even if it is fractionally better in a side-by-side comparison, the lesser one is more than good enough for what they want anyway.

October 3rd, 2008, 09:32 AM
I've bought a fair few PC bits off eBay and never had a problem except once, when I got a refund without a quibble.

You do know that you can get a DVI-to-HDMI cable (£6 on eBay!), don't you? You plug the DVI end into your card and the HDMI end into the TV. I'm using an FX5200 to drive my Panasonic 32" LCD like this, and the picture is excellent.

I must admit to being a teeny bit influenced by the fact that I work for a large semiconductor company, and the bit I work in designs the latest processors for flat-panel TVs. When making my choice, I spoke to many of the engineers who spend all day staring at TV pictures, and the response was unanimous: DVI/HDMI is preferable.

Still, it is entirely your choice, of course.

My point, I guess, to answer your OP, is that nVidia, IMHO, is the better choice. And if you're going to get a card that has DVI out, you would be chocolate-coated nuts to use the VGA instead, for the sake of the cost of a cable.

I was in Dixons one day, 4-5 years ago, looking longingly at the flat panels and marvelling at the godawful quality of the signal they were driving them with. I overheard a couple stood next to me, and the guy said "What's the point buying one of these new tellys? The picture's crap." That's part of the reason things took a long time to take off.

October 3rd, 2008, 04:10 PM
I have two similarly capable systems: a 2600XP + nVidia 7600GT AGP (MythTV backend/frontend, here on Debian Etch) and a 2500XP + ATI X800 Pro AGP (Ubuntu desktop). In the three years I've been running the X800 Pro, support has greatly improved - especially in the last year, I can definitely say that. That said, for the generation of cards we're discussing, nVidia drivers still perform much better. ATI has definitely made some gains and will continue to improve, but who knows by how much and how long it will take? Given the current ATI development model vs. nVidia's, I think it's inevitable that ATI will eventually overtake nVidia as the de facto choice for the best Linux support; I just think that when it does happen, it'll happen with whatever the current generation is first, and continue going forward from there. I don't expect the X800, which is over three years old, to ever exceed the driver support of nVidia cards from the same generation.

October 4th, 2008, 11:19 AM
SiHa: Yeah, electrical shops always use low quality input split between too many TVs, so it never looks as good as it could. In terms of our TV, though, we basically bought one of the cheapest we could, and that was only because my grandparents had offered to pay for part of it.

The difference between 99.9% and 100% perfect isn't likely to show for us, and even the £30 for the new card is a push on our budget at the moment (which is why I'm asking if it's necessary/sufficiently worthwhile).

I've already got the X800, so it's not as if the question is "buy an X800 or a 7300?"; it's "is the X800 good enough for an LCD panel, or would I be better off getting a 7300 before they all disappear?"

I've already hooked my wife's laptop up to the VGA, and while Vista got a bit upset and flickered half of the screen at times, the general quality looked crisp to me. In terms of HDMI, I just had a quick look on eBuyer and they've got brand new DVI to HDMI converters (http://www.ebuyer.com/product/144025) for under £3, and I'd trust them more than I'd trust eBay.

paulg: Looks like similar specs, it's just a shame the X800 wasn't your Myth box. I know the X800 can power a 1280x1024 display with Compiz and the lot without a problem, it's all just a question of performance and quality when it comes to up-scaling video (and whether that would even be a graphics card thing or whether the processor would take the brunt of it).
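For what it's worth, the upscale involved isn't trivial. A quick back-of-the-envelope sketch (assuming a UK PAL SD source, stored as 720x576 and displayed anamorphically at 1024x576 for 16:9 material):

```python
# Rough upscale factors from a PAL SD broadcast to a 1366x768 panel.
# 720x576 is the stored PAL frame; 16:9 material is stretched to
# 1024x576 for display, so that's the fairer starting point.
src_w, src_h = 1024, 576
dst_w, dst_h = 1366, 768
print(round(dst_w / src_w, 3))  # horizontal scale factor -> 1.334
print(round(dst_h / src_h, 3))  # vertical scale factor -> 1.333
```

So every frame gets blown up by about a third in each direction, and that filtering is the work that either the card's Xv scaler or the CPU has to absorb.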

October 6th, 2008, 04:48 PM
paulg: Looks like similar specs, it's just a shame the X800 wasn't your Myth box. I know the X800 can power a 1280x1024 display with Compiz and the lot without a problem, it's all just a question of performance and quality when it comes to up-scaling video (and whether that would even be a graphics card thing or whether the processor would take the brunt of it).

Oh I know. I've had Compiz on my system for some time now, but it hasn't been without problems. Since I mainly use my desktop system for audio stuff now, I actually use Fluxbox as my window manager instead of Compiz - Compiz can't compete with Fluxbox's speed. I have a 1440x900 widescreen monitor and Compiz works there too.

The real benefit with the nVidia 7000-, 6000- and 5000-series cards and MythTV comes with XvMC-assisted playback of MPEG-2 files (basically anything you record). That frees up CPU cycles to enable things like deinterlacing on slower machines such as mine. Even HD content in MPEG-2 should be easier on my system. I haven't been able to configure XvMC properly on my system yet, but I think it's next on my to-do list now that everything else is working smoothly ;)
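For reference, on distros of this era XvMC setup mostly amounted to pointing X at the nVidia XvMC wrapper library and then picking an XvMC decoder profile in mythfrontend's playback settings. A sketch - the library name/path can vary by distro and driver version, so treat it as an assumption to verify locally:

```
# /etc/X11/XvMCConfig - should contain a single line naming the XvMC
# wrapper library; the nVidia binary driver ships this one (exact name
# and location may differ on your distro):
libXvMCNVIDIA_dynamic.so.1
```

After that, select a playback profile in mythfrontend that uses the XvMC decoder rather than the plain software one.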

October 7th, 2008, 07:37 PM
Thanks, that was the kind of response I was looking for.

As I said, I've got the X800 already and I know it works perfectly fine (for me at least). The only problem I've had on Fedora recently was some stuttering that turned out to be a conflict between fglrx and the kernel's PAT (http://en.wikipedia.org/wiki/Page_Attribute_Table) support, and that was fixed by adding "nopat" to the kernel args in my Grub config. Some people might have problems, but I've got no complaints with ATI at the moment.
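In case anyone hits the same thing, the change is just one word appended to the kernel line of the Grub config (menu.lst on the Grub of this era). The title, kernel version and root device below are placeholders from an example setup, not anything to copy literally:

```
# /boot/grub/menu.lst - append "nopat" to the kernel line
# (example entry only; keep your own kernel version and root= as-is)
title Fedora
    root (hd0,0)
    kernel /vmlinuz-2.6.26 ro root=/dev/sda2 quiet nopat
    initrd /initrd-2.6.26.img
```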

Someone at work has a 7950GT that they've replaced and will sell for £15, or I can get a passively cooled MSI 7300GT for £20 (plus ~£5 postage, until they sell out).

I'm not planning on watching high-def content at the moment (I don't want to shell out on Blu-ray, and there's only something like three HD channels on the main part of Sky at the moment, at least one of which will be paid-for, and they just show the same content as the normal versions), so I'm not sure how much I'll encounter HD issues.

As for de-interlacing, I'd guess the 3500+ is powerful enough to do that without much of a struggle. So, with a tight budget, if HD is the only thing that'll suffer given the machine specs, I think "would realistically be best spending a max of 1p" wins out over "slightly less processor usage for £15+" (unless anyone has any other information).