Posted 06 March 2010 - 11:32 PM
I haven't had any problems installing DirectX 9.0c on any systems so far. If you want to run something on the system that needs DX9, then you'll need to install it. Apart from programs that require it, the biggest reason to install the most current DirectX is security - older versions can have vulnerabilities which can be exploited by malware.
The important thing to understand about DirectX is that it is an API (Application Programming Interface) that provides a set of multimedia functions in software, which programs can use in an identical way regardless of the hardware in the system. The program and its authors don't have to know anything about the video card etc. in your system. When DX9 is installed, all the DX9 features are available.
The DirectX "level" of a video card describes to what degree the card duplicates the DirectX API functions in its hardware, which is faster than the DirectX software. So if you install DX9 on a system with a DX7 video card, DirectX functions up to version 7 will be recognised by the card and processed in hardware. DX8 and DX9 functions will be handled by the DirectX software and translated into alternative instructions the video card can handle. This is slower than it would be if a DX9 card were fitted, which could execute the functions directly in the video chip.
So it's not pointless for you to have DX9: the features will be there and usable, just not high performance. If you run a benchmark, the DX9 benchmark figures will not be for the video card alone - they will be for the combination of the DX9 API code running on the CPU and the video card.
If you try to install DirectX on a system with a video card that is just too old to support that version of DX at all, the installer will terminate with an error message saying what video capability the card lacks that DX needs. Sometimes when this happens, installing a more recent video driver will fix it, because the driver emulates the missing function and enables the card to work with the newer DX.