Should ATI still burn in hell? Very likely so!

There’s probably a week’s worth of ranting in me about ATI’s and NVIDIA’s proprietary gfx card drivers, but for now I’m keeping it short. A short introduction is in order. Years ago I knew exactly which card was needed to get the best performance. It took time reading up and keeping up to date with all the latest developments. Since then my interests have changed from knowing the exact details to something along these lines: I don’t care how it works, as long as it works as well as you can reasonably expect (and I expect a lot by default. If you can’t deliver a decent piece of hardware, software or service, please do the world a favor and burn down your company as quickly as you can. Stopping myself here, as this is also one of those subjects I can go on about for days).

Anywaaaay… all those years ago I bought ATI gfx cards for the simple reason that they were the fastest. Not that you’d ever notice it in real-life usage, but synthetic benchmarks ruled. But as you know, a decent piece of hardware is only half the story. So imagine you’ve got this sexy (the nerdy kind of sexy, not the erotic variant) piece of hardware lying in front of you, you wipe the drool off your chin, you plug it in and boot the latest Linux kernel. Next is installing ATI’s latest driver, only to discover you’ve ended up in hell. It may have been years ago, but my mind still bears the scars of agony and frustration… hours on end. For this alone ATI deserves to burn in hell for all eternity (which is a pretty pointless thought if you don’t believe in heaven and hell (but the creation of an artificial hell would be most welcome. We could stuff all those greedy corporate bastards there. Maybe even add a webcam and a Running Man-style show for our entertainment (which in turn we would watch via networks run by the same corporate bastards, thus negating the entire idea… anywaaaaay :)))).

Long story short, I swore never to use ATI again. Since that time I’ve switched to NVIDIA on systems that required high-performance 3D and embedded Intel for everything else. While NVIDIA cards also come with a proprietary driver, their installation process was (or is) much less frustrating. Intel on the other hand sucks for 3D, but just works for everything else. For non-gaming purposes Intel is a blessing and I can highly recommend it if you don’t want to fiddle around. Now fast forward to two weeks ago, when I ordered the MicroServer. I never thought of the ATI horror and focused on a card that fit and satisfied the max power limitation (25W for the PCIe x16 slot). And from what I’ve read, ATI’s Linux support is still something to cry about. I don’t care about 3D, but for HTPC usage hardware decoding support is more or less mandatory.

So when all the components have arrived I’ll be attempting to get a Radeon HD 5450 GPU up and running. This GPU has a UVD2 video engine (UVD = Unified Video Decoder) and uses the XvBA (X-Video Bitstream Acceleration) API. The UVD2 engine does full bitstream decoding of H.264/MPEG-4 AVC, VC-1 and MPEG-2 video streams. If rumours are correct it sucks balls compared to NVIDIA’s Video Decode and Presentation API for Unix (VDPAU). Rumour also has it that this proprietary crap has to do with DRM. Just when you think your hatred of a certain technology can’t get any deeper, it does. In any case I’m already preparing for some torture, and if all else fails I’m just gonna get a GeForce G 210 or GT 220 card (the only realistic options considering the 25W PCIe limitation).
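For the record, my understanding is that players on Linux don’t usually talk to XvBA directly; they go through a VA-API wrapper (the xvba-video backend), so my first sanity check will simply be whether the driver advertises any decode profiles at all. Running vainfo (from libva-utils) does that for you, but below is a rough C sketch of the same check using libva, just so you can see what’s going on. The xvba-video assumption is mine, not anything ATI documents, so treat this as a starting point rather than gospel.

```c
/* check_decode.c - rough sketch: ask VA-API which decode profiles the
 * installed driver advertises. Assumes some VA-API backend is present
 * (e.g. the XvBA-backed xvba-video driver on fglrx - my assumption).
 * Compile with: gcc check_decode.c -o check_decode -lva -lva-x11 -lX11
 */
#include <stdio.h>
#include <stdlib.h>
#include <X11/Xlib.h>
#include <va/va.h>
#include <va/va_x11.h>

int main(void)
{
    Display *x11 = XOpenDisplay(NULL);
    if (!x11) {
        fprintf(stderr, "cannot open X display\n");
        return 1;
    }

    /* Bind VA-API to the X display and initialize the backend driver. */
    VADisplay va = vaGetDisplay(x11);
    int major, minor;
    if (vaInitialize(va, &major, &minor) != VA_STATUS_SUCCESS) {
        fprintf(stderr, "vaInitialize failed - no usable VA-API backend\n");
        return 1;
    }
    printf("VA-API %d.%d, vendor: %s\n", major, minor, vaQueryVendorString(va));

    /* List every profile the driver claims to support. */
    int max = vaMaxNumProfiles(va);
    VAProfile *profiles = malloc(max * sizeof(*profiles));
    int num = 0;
    vaQueryConfigProfiles(va, profiles, &num);

    for (int i = 0; i < num; i++) {
        const char *name = "other";
        switch (profiles[i]) {
        case VAProfileH264High:    name = "H.264 High";    break;
        case VAProfileVC1Advanced: name = "VC-1 Advanced"; break;
        case VAProfileMPEG2Main:   name = "MPEG-2 Main";   break;
        default: break;
        }
        printf("profile %d: %s\n", (int)profiles[i], name);
    }

    free(profiles);
    vaTerminate(va);
    XCloseDisplay(x11);
    return 0;
}
```

If that prints the H.264, VC-1 and MPEG-2 profiles, at least the plumbing is there and the remaining pain is “only” about how well the decoder actually performs.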

Btw, anyone else think writing your entire company name in capital letters makes you think of sad 16-year-olds screaming for attention on the net?
