“Cramming more components onto integrated circuits.” That was the blunt title of Gordon E. Moore's essay on silicon chips published in Electronics magazine in April 1965. In the space of just three pages, the director of semiconductor R&D at Fairchild Camera and Instrument Corp. outlined one of the most powerful observations in modern business and science. He couldn't have known it at the time, but his observation would also become a guiding precept: semiconductor leaders stay ahead only for as long as they keep spending.
Later dubbed “Moore's Law” by noted scientist and engineer Carver Mead, that paper in the early days of electronics posited that the number of components per integrated circuit would double every year, a rate Moore revised in 1975 to a doubling roughly every two years. Moore, who went on to co-found Intel Corp., expected the trend to hold for at least the following 10 years. Almost six decades later, it still does.
There's almost no other sector in history that's shown the same level of consistent development for so long. Cars are only a little faster than they were in 1965, although fuel economy almost doubled, from 14.5 miles per gallon to 28.3 mpg. Battery technology, key to the future of electric vehicles, saw a more impressive 40-fold improvement in cost per kilowatt-hour between 1991 and 2018. Even so, neither comes close to the pace of silicon.
Since Moore's prescient remarks, the number of transistors per chip has increased from roughly 100 to almost 50 billion, even as individual components have shrunk. Put simply, transistor counts have gained a zero, a tenfold increase, roughly every six and a half years. Moore published his original prediction as a chart on a base-two logarithmic scale; replotted on a linear scale, the same curve looks like a hockey stick.
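The growth rate implied by those two endpoints is easy to sanity-check. The sketch below, using the article's figures of roughly 100 transistors in 1965 and almost 50 billion by 2021 (the 2021 endpoint is an assumption for illustration), derives the doubling period and the time to add a zero:

```python
import math

transistors_1965 = 100      # components per chip cited for 1965
transistors_2021 = 50e9     # "almost 50 billion"; 2021 endpoint assumed
years = 2021 - 1965         # 56 years of scaling

growth = transistors_2021 / transistors_1965   # 5e8-fold increase
doublings = math.log2(growth)                  # ~28.9 doublings
doubling_period = years / doublings            # ~1.9 years per doubling
years_per_zero = years / math.log10(growth)    # ~6.4 years per tenfold jump

print(f"doubling every {doubling_period:.1f} years")
print(f"one added zero every {years_per_zero:.1f} years")
```

The result lands almost exactly on Moore's revised two-year doubling cadence, which is why the law is usually quoted in that form.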
An important implication of this trend is that the cost of computing has plummeted. Chips work by feeding binary units (bits) of data
Read more on tech.hindustantimes.com