Is the Desire for Chip Power Efficiency Making Moore’s Law Irrelevant?

Moore’s Law has proved remarkably reliable throughout the history of computing, but some experts now argue that it is losing its relevance. The law, based on a trend first observed by Gordon E. Moore in 1965, states that the number of transistors on an integrated circuit doubles roughly every two years (a figure sometimes quoted as 18 months). Moore initially expected the trend to hold for about a decade, yet it has remained accurate right up to the present day. There are now signs, however, that this law may no longer matter as much as it once did.
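The doubling trend is easy to see with a quick back-of-the-envelope calculation. The sketch below is illustrative only: the starting transistor count and time span are made-up numbers, not Moore’s actual figures.

```python
# Illustrative sketch of exponential doubling under Moore's Law.
# The starting count and time span are hypothetical example values.

def transistors(start_count, years, doubling_period=2):
    """Project a transistor count forward, doubling every `doubling_period` years."""
    return start_count * 2 ** (years // doubling_period)

# A hypothetical chip with 1,000 transistors, projected 20 years ahead:
print(transistors(1_000, 20))  # 10 doublings -> 1,024,000
```

Ten doublings in twenty years multiplies the count by 1,024, which is why the trend compounds so dramatically.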

Power Efficiency, Not Computing Power

In recent years the computing industry’s attention has shifted from raw computing power to power efficiency. The rise of mobile devices has driven much of this change: to make computing truly mobile, hardware has to extract more work from a limited power budget. The pursuit of power efficiency has been the hallmark of tablets and mobile phones, and the same approach is now spreading into PCs and servers. It increasingly looks as though the hunt for energy-efficient chips, rather than ever-faster ones, is the goal of modern computing, and this makes Moore’s Law less important.

Moore’s Law has proved more reliable than anyone could have expected. Some optimists predict it may hold for another twenty years, but it seems more likely that progress will slow over the next five, with each doubling taking at least three years. That would still deliver an impressive increase in computing power over relatively short periods, and as raw performance becomes less critical, the growing focus on efficiency should mean great improvements there.

The benefits of improved power efficiency are not limited to mobile devices. Companies running large servers can spend a small fortune keeping these monsters up and running, and chips that draw less electricity would translate into significant savings. It would also give data centres more freedom in where they can be located: some companies pay huge sums to site their data centres in cooler parts of the world to cut cooling costs. With lower-powered chips, that will no longer be such a necessity.

In the future it will be possible to build smaller, lower-powered servers and other computer hardware. This is going to be a real game changer, and it will no doubt be good news for the industry. Some of the biggest names in computing already have plans to introduce power-efficient chips, so this is likely to happen sooner rather than later.