When I was coming up with names for snuff-themed game shows, I figured 'The Slice Is Right' and 'Family Blood Feud' were a little bit too cheesy. And 'Jeopardy' would take on a whole new meaning!
Iggy and Freddy to the rescue! (I think)
I guess Iggy could just let Manu get killed and then have him cloned, sort of like when you accidentally kill your little sister's Siamese fighting fish and then go buy her a new one from the pet store and she's none the wiser. Fish are stupid pets.
In 2016, Moore's Law was officially pronounced dead by Intel. You probably didn't notice.
Moore's Law, in case you didn't already know, is the idea that the number of transistors on an integrated circuit doubles every two years. Basically, computers double their processing power every two years. In some extreme interpretations, it's been taken almost as gospel that this exponential growth would continue unabated until computers eventually become so powerful and intelligent that they become our masters and we the slaves… or, more benignly, that they usher in an age of boundless prosperity and leisure for humankind. Alas, this dream may have hit a bump in the road.
Personally, I've known Moore's Law was sick for quite some time, and it's painfully obvious to me every time I go out to buy a new computer. In theory, if I buy a new computer every four years, I should be able to get one that's something like four times as powerful as my old one for about the same money. Instead, I usually end up with something very close to what I bought four or five years ago for the same price, only now they throw in bells and whistles I don't care about, like a touchscreen or some cruddy new operating system that mostly just gets in the way. Computers may still be getting faster, but nowhere near the dizzying rate of the latter part of the twentieth century, and that shouldn't be a big surprise. Back then, integrated circuit technology had so much room to grow. Now it's running out of space.
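If you want to put actual numbers on that expectation, here's a quick back-of-the-envelope sketch of the idealized doubling math. This is just my own illustrative arithmetic, not anything Intel publishes:

```python
# Idealized Moore's Law math: transistor count (and, loosely, "power")
# doubles every two years. Purely illustrative -- real chips stopped
# tracking this curve a while ago, which is the whole point of this post.

def moores_law_factor(years, doubling_period=2.0):
    """Idealized improvement factor if transistor counts really did
    double every `doubling_period` years."""
    return 2 ** (years / doubling_period)

for years in (2, 4, 6, 10):
    print(f"After {years} years: about {moores_law_factor(years):.0f}x as powerful")

# After 2 years: about 2x
# After 4 years: about 4x   <- the "four times as powerful" I keep hoping for
# After 6 years: about 8x
# After 10 years: about 32x
```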
Moore's Law is not a true law. It was an observation made by Intel co-founder Gordon Moore that the number of transistors you could fit onto an integrated circuit seemed to double every year, a figure he later revised to every two years. When the microprocessor industry took note, manufacturers began coordinating their development cycles around Moore's observation, turning it into something of a self-fulfilling prophecy for a very long time. Up through the 1990s, they were able to keep pace mainly by geometrically shrinking features, but by the 2000s they had to find cleverer solutions like exotic materials, tri-gate transistors, and layering… and these fixes all have their limits, while the additional heat generated by these overcrowded chips is becoming a real problem.
In their 2016 annual report, Intel announced that they would no longer be operating on two-year schedules, opting instead for a longer development period. Moore's Law was officially dead.
The current state-of-the-art microprocessors have features as small as 10 nanometers across, which is smaller than most viruses. Some industry experts predict that by the 2030s, assuming we can solve a lot of waste-heat and materials problems, we may be able to get down to 3 nanometers… but if we can manage that, it would be the end of the line. At 3 nanometers, circuit features would be only about ten atoms across in some places, and quantum uncertainty would begin to affect electron behavior to the point that the circuits would no longer be reliable.
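For the curious, here's a rough sanity check on that "ten atoms" figure. The atom-spacing numbers are my own ballpark assumptions (a silicon atom spans very roughly 0.2 to 0.3 nanometers, depending on how you measure it), so treat this as illustrative arithmetic rather than chip-industry gospel:

```python
# Rough sanity check on the "about ten atoms across" claim.
# Assumption (mine): silicon atom spacing of roughly 0.22-0.30 nm.

feature_size_nm = 3.0                  # the hoped-for 2030s feature size
atom_size_estimates_nm = (0.22, 0.30)  # low and high ballpark spacings

for spacing in atom_size_estimates_nm:
    atoms_across = feature_size_nm / spacing
    print(f"At {spacing} nm per atom: roughly {atoms_across:.0f} atoms across")

# At 0.22 nm per atom: roughly 14 atoms across
# At 0.3 nm per atom: roughly 10 atoms across
# Either way, you're counting atoms on your fingers.
```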
This does not mean the end of computer evolution is on the horizon. It does mean we may be approaching the limit of this particular branch of computer technology. Developers will continue to find creative ways to make computers faster and smarter, but at some point we may need to reconsider the way we design computers, or even what a computer is. The core concepts behind how we design computers (the von Neumann architecture) have remained more or less unchanged since 1945. Computers have many all-too-familiar limitations that, so far at least, make them incapable of recreating the way a human thinks, or of ever being truly intelligent in a sentient sense. This may ultimately be due to fundamental barriers built into the very concept of what a computer is, and computers will eventually need to be supplanted by some completely new kind of thinking machine that we have yet to envision.