Moore’s law has been prophetic, but nothing lasts forever. When Gordon Moore predicted in 1965 that the number of transistors in a dense integrated circuit would double roughly every two years, it seemed like a very aggressive prediction. More than half a century later, it has largely come to pass, with transistor counts doubling on roughly that schedule. Now the tried-and-true prediction seems to be coming to a close, so we thought we’d look at how technology will continue growing when Moore’s law actually becomes obsolete.
After Moore made his famous prediction, he went on to co-found Intel, which, if you know anything about computers, is probably the most important name in semiconductor technology of the past 50 years. Its first microprocessor, the Intel 4004, had 2,300 transistors. Today’s microprocessors have billions. It goes to show that his prediction was right on point.
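To see what that doubling cadence actually implies, here’s a quick back-of-envelope sketch in Python. The 2,300-transistor count and 1971 start date come from the 4004; the two-year doubling period is the commonly cited form of the law, and the 2021 end date is just an illustrative choice:

```python
# Back-of-envelope: compound the 4004's 2,300 transistors
# at one doubling every two years.
start_year, start_count = 1971, 2_300
end_year = 2021  # illustrative cutoff, 50 years later

doublings = (end_year - start_year) // 2   # 25 doublings in 50 years
projected = start_count * 2 ** doublings

print(f"{doublings} doublings -> ~{projected:,} transistors")
# 25 doublings -> ~77,175,193,600 transistors
```

Tens of billions of transistors, which is right in line with today’s largest chips.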
The fact is that today’s microprocessors already pack in about as much computing capability as physics allows. This all ties back to the speed of light. That’s right, the speed of light. The speed of light is finite and constant, and it caps how quickly a signal can get from one transistor to another. Since nothing can exceed the speed of light, and computing is basically electrons moving through matter, the rate at which bits can flow (which is how traditional computing is measured) is also finite. In other words, it is impossible to build computation that moves faster than the physical universe allows. Physicist James R. Powell has done the calculations and figures that Moore’s law will be obsolete by 2036.
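To make that limit concrete, here’s a rough sketch of how far any signal can possibly travel during one clock cycle of a modern processor (the 5 GHz clock is just an illustrative figure for a high-end chip):

```python
# How far can a signal travel in one clock tick?
# Nothing moves faster than light, so this bounds on-chip communication.
SPEED_OF_LIGHT = 3.0e8      # meters per second (approx.)
clock_hz = 5.0e9            # an illustrative 5 GHz clock

cycle_seconds = 1 / clock_hz
max_distance_m = SPEED_OF_LIGHT * cycle_seconds

print(f"One cycle lasts {cycle_seconds * 1e12:.0f} ps")
print(f"Light travels at most {max_distance_m * 100:.0f} cm in that time")
# One cycle lasts 200 ps; light travels at most 6 cm in that time.
```

A signal can’t even cross a desktop motherboard in a single tick, which is exactly why chips have to keep shrinking to keep getting faster.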
Couple those hardware limitations with other hurdles, such as keeping these microprocessors cool and the spiraling cost of fabricating ever-faster chips with billions upon billions of transistors, and it seems we’ve already begun to see the end of consistent growth in computing speeds.
If one thing is certain, it’s that humans are going to press the issue; far too many industries depend on continued gains. As the end of what is physically possible with the microprocessor draws closer, you will see the growth of what is called quantum computing.
Quantum computers compute with qubits (quantum bits) and exploit two quantum effects called superposition and entanglement. A qubit isn’t locked into being a 0 or a 1; until it is measured, it can exist in a combination of both. This lets quantum computers sidestep the miniaturization problems of traditional computing and solve certain problems in minutes that would take even a cutting-edge 5nm microprocessor decades.
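For a flavor of what superposition means in practice, here’s a minimal sketch (simulated on a classical machine with NumPy, not the full machinery of a real quantum computer) that starts a single qubit in the 0 state, puts it into an equal superposition with a Hadamard gate, and reads off the measurement probabilities:

```python
import numpy as np

# A qubit is a 2-component state vector; |0> is [1, 0].
state = np.array([1.0, 0.0])

# The Hadamard gate turns a definite bit into an equal superposition.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

state = H @ state

# Measurement probabilities are the squared amplitudes.
probs = np.abs(state) ** 2
print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}")
# P(0) = 0.50, P(1) = 0.50 -- the qubit is effectively both until measured
```

Simulating even a few dozen qubits this way overwhelms a classical computer, which is precisely where the quantum advantage comes from.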
With processing-intensive applications such as AI becoming more relevant across many industries, the continued innovation of the computer is a sure thing; it’s just going to have to come about in ways that don’t look like the computers we use each day.
What are your thoughts about the long-term innovation of the computer? Do you think that the end of the microprocessor as we know it will come in the next quarter century? Leave your thoughts in the comments section below and stop back to our blog soon for more great technology content.