Guess what quadrant of Moore’s model CPUs and GPUs are in. Intel is going into high-end graphics to protect its position in the processor market. What isn’t mentioned is that Intel’s direct CPU competitor, AMD, now owns ATI. I guess Intel couldn’t strike a deal with NVIDIA.

Still, this move doesn’t seem to be too much of a stretch for Intel. I wonder, though, how it will change integrated motherboard architecture. In the past computers I’ve built and the Blackbird I now own, the video cards draw quite a bit of power, and as performance has increased they require a lot of hardware to keep the components and the inside of the case relatively cool. This translates into the graphics hardware occupying a lot of space. Heat and power consumption are tough problems, so I imagine that if Intel can handle them better than AMD and NVIDIA, it would have meaningful differentiation, especially in laptops (gaming and media center laptops are HOT). I’m not sure how important increasing interface throughput is to customers outside of hardcore gamers, but as a first go-around it would be good to extract some cash from them to help pay for a more mainstream solution.
Speaking of heat, has anyone ever thought about how to capture the heat thrown off by hot processors and use it to power something else? An analogy would be regenerative braking in cars.