Visualizing Moore's Law

By Kevin Wong & Andrew H. Yang

Chapter 1

What is Moore's Law?

Moore's Law, first articulated by Gordon Moore in 1965, is the observation that the number of transistors on a computer chip doubles roughly every two years. Remarkably, this observation has held as a golden rule of technological innovation for the last 50 years, as our computational progress has stayed roughly on track with it.
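
As a quick illustration (ours, not part of the original article), the doubling rule can be written as N(year) = N₀ · 2^((year − year₀)/2). A minimal Python sketch, seeded with the Intel 4004's roughly 2,300 transistors from 1971:

```python
# Minimal sketch of the Moore's Law doubling rule (our illustration,
# not the article's model): transistor counts double every two years.
def moores_law(n0: float, start_year: int, year: int) -> float:
    """Expected transistor count in `year`, given `n0` transistors in `start_year`."""
    return n0 * 2 ** ((year - start_year) / 2)

# Seeded with the Intel 4004 (1971, ~2,300 transistors):
for year in (1971, 1981, 1991, 2001):
    print(year, f"{moores_law(2300, 1971, year):>13,.0f}")
```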

This suggestion, that our technology will become significantly faster, smaller, and more efficient over time, is widely regarded as one of the hallmark predictions of the 20th and 21st centuries, and it carries important implications for the future of technological innovation: both for our progress and for our limitations.

To help you understand the history of Moore's Law, we present the milestones of microchip development: the breakthroughs that kept the industry on pace with Moore's Law.

[Interactive chart: Number of Transistors on a Microprocessor Through Time. X-axis: Year; y-axis: Transistors; series: Expected vs. Observed.]



Yet Moore's Law isn't a perfect relation. Setbacks in innovation make it inevitable that transistor counts will not trace a perfect line that doubles every two years. Even so, for the last half-century it has remained influential as a goal that companies strive to achieve. Now that we are reaching the second half of the 2020s, however, there is evidence that Moore's Law is slowly winding down.

In 2015, Moore himself predicted that Moore's Law would end within ten years, a deadline that is now fast approaching. We are steadily nearing the theoretical limits of computing power, and, as components continue to shrink, physical limits loom as well: transistors cannot shrink much further once they approach the size of individual atoms.

[The interactive transistor chart again, now spanning 1970–2021.]

Chapter 2

Why Does Moore's Law Matter?

At first glance, it might seem as if Moore's Law only matters for transistors and computing power. Yet Moore's Law became a hallmark prediction for a reason: it foretold not only computing power but something just as important, cost. That is the final part of the equation; Moore predicted faster, more efficient, smaller, and CHEAPER.

We plotted the costs of memory, flash, SSDs, and disk drives going back to 1955 and observed a continual downtrend in cost that follows Moore's Law to a tee, except that instead of an exponential climb, this time we see a steep decline in cost. Let's observe how Moore's Law affects things other than transistors, in this case the cost of these computing components.

Before we show you the graph, let's play a game first!

[Interactive: enter how much storage your device has.]

How much do you think 5 GB cost in [year]?

Ready?

Surprising, right? Let's show you what that graph looks like now.

[Interactive chart: Cost of Components Through Time (Logarithmic Scale). X-axis: Year; y-axis: Cost ($/TB); series: Memory, Disk, Flash, SSD.]
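
As an aside from us (not the original authors), a chart like this is straightforward to reproduce; the key presentation trick is the logarithmic y-axis, which renders exponential decay as a straight line. A minimal matplotlib sketch with placeholder data:

```python
# Sketch: plotting component cost over time on a logarithmic y-axis.
# The numbers below are illustrative placeholders, NOT the article's dataset.
import matplotlib.pyplot as plt

years = [1960, 1980, 2000, 2020]
memory_cost = [1e12, 1e9, 1e6, 1e4]  # $/TB, placeholder values

plt.plot(years, memory_cost, marker="o", label="Memory")
plt.yscale("log")  # exponential decay plots as a straight line
plt.xlabel("Year")
plt.ylabel("Cost ($/TB)")
plt.title("Cost of Components Through Time (Logarithmic Scale)")
plt.legend()
plt.show()
```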

Immediately, when we observed the cost of memory in 1956, we were shocked. It's an enormous amount of money, more than most of us will ever see in a lifetime, all for a single terabyte of memory.

Fortunately, as Moore predicted, these components became cheaper and cheaper as time went on. However, we seem to be slowly approaching a hard floor on costs: the tail end of our graph shows little decline since 2010.

Chapter 3

Predicting With Moore's Law

Moore's Law is almost universally accepted within the tech industry, mostly because of how well it has predicted the future, especially for transistors and chips. But how well does it actually do at predicting the trajectories of other technologies that are commonplace in households today, like phones, tablets, fridges, and more?

To find out, we looked through prices of color TVs, from their introduction up to the 1990s, when TV prices started to hit their lows. We wanted to observe whether the price of color TVs also decays exponentially and thus adheres to Moore's Law.

[Chart: TV Prices Through Time. X-axis: Year; y-axis: Price ($); series: Original Price vs. Inflation-Adjusted Price.]

Surprisingly (and amusingly), we found that the price of color TVs, after accounting for inflation, does follow an exponential decay and thus Moore's Law. In fact, after fitting an exponential line of best fit, we discovered that the price of color TVs dropped about 5.9% per year.
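
For the curious, here is a minimal sketch of how such a fit can be made (our illustration; the five price points below are placeholders, not the actual dataset): fit log(price) linearly against year, and the slope gives the annual rate of decline.

```python
# Sketch: fitting an exponential decay, price ≈ A * exp(k * year),
# by fitting log(price) linearly against year.
# The data points are hypothetical placeholders, NOT the article's TV data.
import numpy as np

years = np.array([1954, 1965, 1975, 1985, 1995])
prices = np.array([12000.0, 6000.0, 3500.0, 1800.0, 1000.0])  # inflation-adjusted, illustrative

slope, intercept = np.polyfit(years, np.log(prices), 1)
annual_change = np.exp(slope) - 1  # fractional price change per year
print(f"fitted change: {annual_change:.1%} per year")  # ~ -5.9% with these placeholders
```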

By applying this knowledge with a (massive) stretch, we can predict how other pieces of technology would decrease in price over the years. So, finally, we present the 'TOTALLY ACCURATE TECH PRICE PREDICTOR' (TATPP® for short).

TOTALLY ACCURATE TECH PRICE PREDICTOR (TATPP®)

[Interactive: enter the price of your tech item to see its predicted future price.]

Applying our observation to technology today, an Apple Vision Pro ($3,499.99) could be yours for $100 in 58 years. You could even buy a SpaceX rocket ($62 million) for $100 in 218 years, if the rule holds. Unfortunately, but also expectedly, a relationship derived from a staggering 5 data points most likely will not hold.
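
The arithmetic behind those figures is plain exponential decay; here is a minimal sketch of it (ours), assuming the 5.9%-per-year rate holds forever:

```python
# Sketch of the TATPP arithmetic: solve price * (1 - rate)**t = target for t,
# assuming a constant 5.9%-per-year price decline (a very big assumption).
import math

def years_until(price: float, target: float = 100.0, rate: float = 0.059) -> float:
    """Years until `price` decays to `target` at a fixed annual `rate` of decline."""
    return math.log(price / target) / -math.log(1.0 - rate)

print(f"Apple Vision Pro ($3,499.99): {years_until(3499.99):.0f} years")  # ~58
print(f"SpaceX rocket ($62,000,000): {years_until(62e6):.0f} years")      # ~219 (the article rounds to 218)
```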

There are a ton of factors that go into cost besides an observation a certain CEO made almost 60 years ago, but that doesn't take away from the fact that Moore's Law has been one of the most important observations and rules in technology, one that has lodged its way into almost every chip innovation for the last 60 years. Is it dead?

Well, that's really up to you to decide. Who knows: with the ingenuity and smarts of scientists all over the world, perhaps we could see a new golden age for what we now know as Moore's Law.

End of the Curve?

The End of a Visual Journey of Moore's Law