Last year I bought a new PC, and because I didn’t know which graphics card to get at the time, I decided to do a little self-experimentation and see what life is like with integrated graphics. After all, more and more people buy cheap laptops these days, and integrated graphics are getting better all the time. I bought an Intel G33 board with the GMA 3100 (no X) graphics core. The specs say it can do DirectX 9.0b and Shader Model 2.0 – enough for quite a lot of games, but not for the latest and greatest.
As a result of what can only be called bad testing by the developers, a lot of games simply crash without warning. I mean, come on – if you don’t plan to support integrated graphics, at least pop up a dialog telling me the game won’t run. Don’t crash. If you’re a game developer, ask yourself: how are you testing for integrated graphics? And if you’re not, what makes you think you can afford to ignore them?
I should add that I’m running on a quad-core machine with XP x64, so some of the problems running these games might well be related to the OS or a failure to deal properly with multicore CPUs – but I’m pretty confident it’s the graphics chip most of the time.