In the 1960s, a young computer engineer named Douglas Engelbart broke new ground with a technological concept he called “scaling.” Scaling, or “scaling up” as it is commonly called, is the capability of a system to handle a growing amount of work.
Dr. Engelbart went on to invent remarkable devices such as the computer mouse. He theorized that as computer components became smaller, they would be able to scale up, or handle more work: the smaller computer chips became, the more efficient they would be and the cheaper they would be to produce. His theory proved correct.
Gordon Moore, who later co-founded Intel, put specific numbers to Engelbart’s scaling-up theory. According to a New York Times article, Moore predicted that the number of transistors that could be etched on a computer chip would double annually for at least a decade, leading to astronomical increases in computing power. He published this prediction, now known as Moore’s Law, in Electronics magazine in 1965.
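To see why annual doubling leads to such astronomical growth, a short sketch can run the numbers. The starting count of 64 transistors is an illustrative figure, not one from the article; only the doubling rate and the ten-year horizon come from Moore’s prediction.

```python
# Moore's 1965 prediction: transistor counts double roughly once a year.
# Start from a hypothetical 64-transistor chip (illustrative assumption).
transistors = 64
for year in range(10):  # ten annual doublings, per the decade-long prediction
    transistors *= 2

# Ten doublings multiply the count by 2**10 = 1024, i.e. roughly a
# thousandfold increase in a single decade.
print(transistors)  # 64 * 1024 = 65536
```

The exact starting count does not matter; whatever the chip holds today, ten doublings multiply it by about a thousand, which is why the prediction implied such dramatic gains in computing power.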
Moore’s Law proved true: today a computer chip the size of a fingernail holds billions of transistors. In the 1960s, a single transistor was the size of a cotton fiber; transistors have shrunk dramatically in the 50 years since.