I tried to model Moore’s Law and computing power over time, and wow, those data-processing assumptions are pretty exponential. Note that Moore’s Law is really made up of two things: the scientific progress toward more CPU output (more FLOPS) AND the growth and investment of the Computer Industry.
Linus Torvalds is worried about the end of Moore’s Law’s rate of improvement. Why? I’ll try to explain that below.
Growth works like annual compound interest: with the progress of computer science, you have your Moore’s Law rate of improvement, which is roughly a 40% increase in computing power per year.
Note that in the next couple of years, the newest “Standard” desktop chip may have double the processing power at pretty much the same cost as today’s model.
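Here’s a quick back-of-the-napkin check of that doubling claim (just a sketch, assuming the flat 40%-per-year figure above actually holds):

```python
# Toy compound-growth check (my assumed rate, not measured data): how long does
# a steady 40% annual improvement take to double computing power?
import math

annual_rate = 0.40  # the rough Moore's-Law-style improvement rate quoted above

doubling_time = math.log(2) / math.log(1 + annual_rate)
print(f"Doubling time at 40%/year: {doubling_time:.2f} years")  # ~2.06 years

for years in range(1, 7):
    factor = (1 + annual_rate) ** years
    print(f"After {years} year(s): {factor:.2f}x today's performance")
```

At 40% a year you double in roughly two years and sit at about 7.5x after six, which is why the longer replacement cycles below still leave you with a decent machine.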
Longer Cycles, but the Market Adapts
Hardware cycles will get LONGER, around 4-6 years! What does that mean for the consumer? Computers won’t need replacing every 2-3 years anymore; they’ll be on 4-6 year replacement cycles now. I think smaller computers like Smart Phones, Watches, and accessories are where the new growth will come from.
For those of us who replace hardware much more slowly than companies do (typically every 4-6 years), that means being stuck with the same computer performance for up to 6 years (which is not bad). Marginal improvements come from Collective Computing or Swarm Intelligence. That means a new path for consumer computer electronics is through merchandising: through Computer Accessories and letting multiple computers share cores.
More than ever, Digital Ecosystems will matter… and notice which OS is finding its way into many smaller computers and accessories. That’s why I’m not so worried about Linux, since it empowers non-programmers like me to self-study and learn to program while keeping our day jobs. The first to reap the full benefits of a Digital Ecosystem on a micro scale (individuals and small companies) will be those who see the trend and the raw productivity enhancement now. Basically, increased multi-tasking (via variable-priority processes cycling faster between tasks) in a wearable, intuitive input medium.
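To make that multi-tasking idea a little more concrete, here’s a toy sketch (the task names and weights are invented for illustration) of a weighted round-robin, where higher-priority tasks simply get more turns per cycle:

```python
# Hypothetical sketch of "variable-priority processes cycling faster between
# tasks": a weighted round-robin where higher-priority tasks get more turns.
from itertools import cycle

priorities = {"heads-up notifications": 3, "biometrics logging": 2, "background sync": 1}

# Expand each task by its priority weight, then loop through the schedule endlessly.
schedule = [name for name, weight in priorities.items() for _ in range(weight)]
runner = cycle(schedule)

for slot in range(12):  # simulate a dozen scheduling slots
    print(f"slot {slot}: {next(runner)}")
```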
So the Smart Watch/Band, Phone, HUD eyewear, that computer stick you carry, and all these smaller computers you wear will slowly add up their processing for greater feats of CPU performance, or extraordinary feats of multi-tasking and coordination.
Imagine my biometrics computer (like the Fitbit but on steroids: measuring heart rate, blood pressure, and kinesthetic motion through NFC accessories, plus holding emergency contact and medical information; a marketing ploy for medical insurers to better track health), my visual HUD computer, my Smart Phone, my Smart Watch/Band, and the computer in my car or in my bag (which has power supply accessories for all my computers) all wirelessly depending on each other. With a digital ecosystem, they all mesh perfectly.
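As a rough sketch of how those devices might “add up their processing” (every device name and GFLOPS figure below is made up, just to show proportional sharing):

```python
# Purely illustrative sketch of small wearables pooling their spare compute.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    spare_gflops: float  # spare processing the device can lend to the pool

devices = [
    Device("biometrics band", 0.5),
    Device("smart watch/band", 1.0),
    Device("HUD eyewear", 2.0),
    Device("smart phone", 10.0),
    Device("compute stick in the bag", 20.0),
]

pool = sum(d.spare_gflops for d in devices)
print(f"Combined spare compute: {pool:.1f} GFLOPS")

# Hand out a batch of background tasks in proportion to each device's spare capacity.
tasks = 100
for d in devices:
    share = round(tasks * d.spare_gflops / pool)
    print(f"{d.name}: takes ~{share} of {tasks} tasks")
```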
Well, I was going to write this up for a game, but decided to write up all these assumptions and trends for my sci-fi short stories instead, lolz. Still, I’ll try to make some rules regarding CPUs. I’ve talked about these before when I discussed GURPS Ultratech. Now I guess I’m fleshing out my sci-fi assumptions for a series of stories I’m drawing up.
It’s easier to write stories with well-researched (by a munchkin) backgrounds and assumptions. Unfortunately, research is not an entertaining read for normal people, but a good story built out of that data can be interesting.
Relevant News
First Carbon Nanotube Computer