Moore's Law

Moore’s Law describes the trend that the number of transistors on a microchip doubles approximately every two years, driving exponential growth in computing power. It is not a scientific law but an empirical observation first made by Gordon Moore in 1965, and revised to the two-year doubling rate in 1975, about the rapid pace of progress in semiconductor technology.
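As a rough illustration (not part of Moore's original formulation), an idealized two-year doubling implies a transistor count of about N0 × 2^(t/2) after t years. The short Python sketch below projects counts under that assumption; the function name and the starting value, chosen to be on the order of an early-1970s microprocessor, are illustrative and not taken from the text above.

```python
# Illustrative sketch only: projects transistor counts under an idealized
# two-year doubling rule, N(t) = N0 * 2 ** (t / 2).

def projected_transistors(n0: float, years: float, doubling_period: float = 2.0) -> float:
    """Return the projected transistor count after `years`,
    assuming a fixed doubling period (two years by default)."""
    return n0 * 2 ** (years / doubling_period)

if __name__ == "__main__":
    n0 = 2_300  # illustrative starting count, roughly an early-1970s chip
    for years in (2, 10, 20, 40):
        count = projected_transistors(n0, years)
        print(f"after {years:>2} years: ~{count:,.0f} transistors")
```

Under this simple model, forty years of sustained doubling multiplies the starting count by about a million, which is why the trend is described as exponential rather than incremental.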

This pattern of rapid hardware advancement has shaped the modern tech landscape, allowing computers and devices to become faster, smaller, and more efficient over the decades. The improvements predicted by Moore’s Law have made it possible for demanding applications, from artificial intelligence to high-performance computing and cryptocurrency mining, to run on affordable, widely available hardware.

Although the pace of transistor miniaturization has slowed in recent years, Moore’s observation continues to drive innovation and to set expectations for improvements in computing performance.