Each time the size of transistors shrank by half, the number of circuits that could fit in a given area of chip quadrupled, since area scales with the square of linear size. Computer speed and capacity would therefore keep increasing while costs fell and computers themselves shrank. It was a straightforward insight, but for those who made the leap it was the mind-expanding equivalent of taking a psychedelic drug.
In 1965, Intel cofounder Gordon Moore noted the phenomenon, which later became known as Moore's Law and grew into Silicon Valley's defining principle. By the 1980s and 1990s, Moore's Law had emerged as the underlying assumption that governed almost everything in the Valley, from technology to business, education, and even culture. The "law" said the number of transistors on a chip would double every couple of years. It dictated that nothing stays the same for more than a moment; no technology is safe from its successor; costs fall and computing power increases not at a constant rate but exponentially. If you're not running on what became known as "Internet time," you're falling behind.