Moore's Law: Fast, Cheap Computing and What It Means for the Manager

Moore's Law, named for Intel co-founder Gordon Moore, describes the expected pace of advances in computing hardware over time: roughly speaking, the number of transistors that can be packed onto a chip doubles about every eighteen months to two years. In reality, it describes much more than processor speed, since similar improvements show up in data storage and networking as well. Read this chapter and attempt the exercises to gain a broader understanding of the importance and costs associated with Information Systems.

The Death of Moore’s Law?

LEARNING OBJECTIVES

  1. Describe why Moore's Law continues to advance and discuss the physical limitations of this advancement.
  2. Name and describe various technologies that may extend the life of Moore's Law.
  3. Discuss the limitations of each of these approaches.

Moore simply observed that we're getting better over time at squeezing more stuff into tinier spaces. Moore's Law is possible because the distance between the pathways inside silicon chips gets smaller with each successive generation. While chip plants (semiconductor fabrication facilities, or fabs) are incredibly expensive to build, each new generation of fabs can crank out more chips per silicon wafer. And since the pathways are closer together, electrons travel shorter distances. If electrons now travel half the distance to make a calculation, that means the chip is twice as fast.
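To get a feel for how quickly this kind of doubling compounds, the short sketch below projects transistor counts across several chip generations. It is an illustration only; the starting count, the eighteen-month doubling period, and the thirty-year horizon are assumptions chosen for the example, not figures from the chapter.

```python
# Rough illustration of Moore's Law-style doubling (assumed numbers, not chapter data).
# Assumption: the transistor count doubles roughly every 18 months.

start_transistors = 2_300          # assumed starting count, roughly an early-1970s chip
doubling_period_months = 18        # one common statement of the doubling interval
years = 30                         # assumed horizon for the illustration

generations = (years * 12) // doubling_period_months
count = start_transistors
for g in range(generations + 1):
    elapsed_years = g * doubling_period_months / 12
    print(f"After {elapsed_years:4.1f} years: about {count:,} transistors")
    count *= 2
```

Run over thirty years, twenty doublings turn a few thousand transistors into billions, which is why even small slowdowns in the doubling rate matter so much.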

But the shrinking can't go on forever, and we're already starting to see three interrelated forces - size, heat, and power - threatening to slow down Moore's Law's advance. When you make processors smaller, the more tightly packed electrons will heat up a chip - so much so that unless today's most powerful chips are cooled down, they will melt inside their packaging. To keep the fastest computers cool, most PCs, laptops, and video game consoles need fans, and most corporate data centers have elaborate and expensive air conditioning and venting systems to prevent a meltdown. A trip through the Facebook data center during its recent rise would show that the firm was a "hot" start-up in more ways than one. The firm's servers ran so hot that the Plexiglas sides of the firm's server racks were warped and melting! The need to cool modern data centers draws a lot of power and that costs a lot of money.

The chief eco officer at Sun Microsystems has claimed that computers draw 4 to 5 percent of the world's power. Google's chief technology officer has said that the firm spends more to power its servers than the cost of the servers themselves. Microsoft, Yahoo!, and Google have all built massive data centers in the Pacific Northwest, away from their corporate headquarters, specifically choosing these locations for access to cheap hydroelectric power. Google's facility in The Dalles, Oregon, pays the local power provider just two cents per kilowatt-hour, less than one-fifth of the eleven-cent rate the firm pays in Silicon Valley. This difference means big savings for a firm that runs more than a million servers.
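A back-of-the-envelope comparison shows why the two-cent versus eleven-cent spread matters at this scale. In the sketch below, only the two electricity rates and the million-server figure come from the text; the per-server wattage and round-the-clock operation are assumptions made purely for illustration.

```python
# Back-of-the-envelope electricity cost comparison.
# Rates and server count come from the text; wattage and uptime are assumed.

servers = 1_000_000            # "more than a million servers" per the text
watts_per_server = 250         # assumed average draw per server, in watts
hours_per_year = 24 * 365      # assume servers run around the clock

kwh_per_year = servers * watts_per_server / 1000 * hours_per_year

cost_the_dalles = kwh_per_year * 0.02       # two cents per kWh in The Dalles
cost_silicon_valley = kwh_per_year * 0.11   # eleven cents per kWh in Silicon Valley

print(f"Annual energy use:     {kwh_per_year:,.0f} kWh")
print(f"Cost at 2 cents/kWh:   ${cost_the_dalles:,.0f}")
print(f"Cost at 11 cents/kWh:  ${cost_silicon_valley:,.0f}")
print(f"Annual savings:        ${cost_silicon_valley - cost_the_dalles:,.0f}")
```

Under these assumed figures the cheaper rate saves on the order of two hundred million dollars a year, which helps explain why firms site data centers next to cheap hydroelectric power.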

And while these powerful shrinking chips are getting hotter and more costly to cool, it's also important to realize that chips can't get smaller forever. At some point Moore's Law will run into the unyielding laws of nature. While we're not certain where these limits are, chip pathways certainly can't be shorter than a single molecule, and the actual physical limit is likely larger than that. Get too small and a phenomenon known as quantum tunneling kicks in, with electrons sliding off their intended paths. Yikes!