The second core would, I think, be trying to take over the video card's job, and no one wants to give up either memory or processing power to video functions. Those extra cores will already be busy handling lots of image, audio, and video tasks, and those are far more important than games.
This is a general question, and I'd be surprised if I'm the first person to think of it. With the proliferation of multiple-core computers, and with decent clock speeds and large RAM support, did it not occur to anyone to utilize a secondary core of a multi-core system for graphics acceleration (instead of requiring a graphics card)--pretty much for the sole benefit of gamers?

It seems (and perhaps this is where I'm wrong) that graphics processing would be much more efficient if it were done on the CPU by a secondary core through some integrated chipset on the motherboard. It's not like you do much multitasking while playing a game, and games are as of yet not multi-threaded (I think), so the second core is sitting there doing nothing. I mean, why bother with expensive GPUs if CPUs are much faster? With computers available with gobs of RAM (~4 GB), it seems like one core running at, say, 2.4 GHz with a second set of, say, 2 GB of dual-channel RAM would yield better graphics performance than even the most top-of-the-line graphics card available, and it seems like it shouldn't be that hard.

Perhaps I'm wrong, or perhaps it's merely due to pressure from ATI and NVIDIA to keep the graphics card market going. But it seems like a reasonable next step. I mean, it would do away with the newer bulky PCI-E cards, possibly require less power, and would seem to be a cheaper alternative. Does anyone have any insight into this? I'm pretty interested to see whether developers have considered this, or whether it's even feasible. Cheers.
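For what it's worth, a quick back-of-envelope calculation shows why a single CPU core struggles here. This is just a sketch with assumed figures (1024x768 resolution, 60 fps, 3x overdraw are my illustrative picks, not numbers from any benchmark), but it gives a feel for the cycle budget:

```python
# Back-of-envelope: how many CPU cycles one 2.4 GHz core can spend
# per shaded pixel. All workload figures below are assumptions for
# illustration, not measurements.

clock_hz = 2.4e9            # one CPU core at 2.4 GHz
width, height = 1024, 768   # a modest game resolution
fps = 60                    # target frame rate
overdraw = 3                # assume each pixel gets shaded ~3 times per frame

pixels_per_second = width * height * fps * overdraw
cycles_per_pixel = clock_hz / pixels_per_second

print(f"Pixels/sec to shade: {pixels_per_second:,.0f}")
print(f"Cycles available per pixel: {cycles_per_pixel:.0f}")
```

That works out to only about 17 cycles per pixel for all texturing, lighting, and blending combined, with nothing left over for geometry or game logic. A GPU gets around this not with clock speed but with dozens of pixel pipelines working in parallel, which is why raw CPU GHz doesn't translate into graphics performance.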
