This week marks the 50th anniversary of Intel's 4004, the first commercially available microprocessor.
At its simplest, a computer sees only two things, zeros and ones, and you don't need electronics for that. Charles Babbage designed his Analytical Engine in the 1840s, but he died around 150 years ago without seeing his machines built. In 2008, curious engineers finished constructing one of them, his Difference Engine No. 2, from his original plans, and it worked.
By the 1940s, computing had moved to electronics. ENIAC, the first general-purpose electronic digital computer, used about 18,000 vacuum tubes, which consumed enormous amounts of electricity and gave off a great deal of heat.
Next came the transistor, which was far smaller, produced much less heat, and switched much faster. Still, it was much larger than what we have today.
The microprocessor changed all of that. Instead of wiring together thousands of individual transistors, engineers etched them onto a silicon wafer.
The 4004 packed 2,300 transistors for data processing, logic, and control onto a single chip for the first time. Though it was only the size of a fingernail, it was roughly as powerful as ENIAC.
We've spent the last 50 years making things smaller. The smaller the transistor, the faster it can switch, and faster switching means better performance.
Today's chips hold billions of transistors tied together using connections measured in tens of nanometers.
Some folks think we're approaching the end of the line; there are physical limits to how far transistors can shrink.
However, just as chips replaced the discrete transistors that had replaced tubes, we may be approaching another change. The next wave of computing may use light instead of electrons. And as innovation continues, we're finding ways to make things even smaller with new materials.