Monday, November 15, 2010

Future of Computing: CPU + GPU

Personal computing continues to advance at an unprecedented pace. At first, chip makers tried to outdo each other by making CPUs (central processing units) faster. Then they decided it would be more efficient to combine multiple CPUs on one chip, which is why we now have what we call multicore CPUs. What these chips can do is split computer processing loads among their cores.

But soon you won't need a separate graphics card either. Intel and AMD are working on a new breed of CPU that combines a graphics processor with a traditional multicore CPU.

Intel is using what it calls the Sandy Bridge architecture to achieve this combination. AMD's initiative, on the other hand, is called AMD Fusion.

Back in 2006, Intel started exploring making its own graphics hardware, which probably pushed AMD to buy ATI, the graphics card company.

One has to wonder what NVIDIA is thinking or doing right now. If we no longer need a separate graphics card in the future, that will be a big problem for them.