I saw this article the other day talking about Nvidia's Kepler chips and how they will "serve up a desktop experience from the cloud". This whole idea is crazy to me.
Don't get me wrong, I'm sure the Kepler chip is awesome. Based on what's been said about the chips themselves, they seem to have some massive power behind them, and they will be incredibly helpful in software that can make use of the GPU. I just don't buy into the whole stream-your-desktop-from-the-cloud idea.
The premise is just a bad one. What happens if your network is down? What happens if the cloud goes down? What happens if you have some latency issues? Network is slow? Bandwidth caps? You get the idea.
Then Rob Enderle takes it a step further and asks "what if you could run Windows on a Mac, or an iPad, or anything that would host a tiny client". Another terrible idea! I've used remote desktop software to access a Windows desktop from my iPad, and it was not a great experience. Windows is not built for touchscreens; that is the whole purpose behind Metro in Windows 8. Microsoft has already tried putting the traditional desktop on touch devices, and it didn't work. Also, if you want to run Windows on your Mac, you can: use VMware, Parallels, or any other virtualization software. You could even use Boot Camp and have a full native install to boot into.
The concept of having a central operating system with a thin client on your device is appealing, but there are just too many problems with it in practice. I think a central server with preferences that sync to devices is a better idea: store the preferences on the device, and sync them whenever a connection is available. This is the same concept behind iCloud and Dropbox. Dropbox is specifically for files, of course, but it has been used this way for BBEdit's preferences. With this sort of setup, your device isn't useless when the internet is down. Not being able to play Angry Birds just because the internet is down would be incredibly frustrating; I was always mad when I couldn't play Half-Life because I couldn't get on Steam.
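Here's a minimal sketch of that local-first idea in Python. The sync endpoint, file path, and JSON layout are all made up for illustration; the point is just that the app reads and writes locally first, and treats the network as optional.

```python
# A minimal sketch of "local-first, sync when you can".
# The server URL and prefs layout here are hypothetical.
import json
import os
import urllib.request

PREFS_PATH = os.path.expanduser("~/.myapp/prefs.json")
SYNC_URL = "https://example.com/prefs"  # hypothetical sync endpoint

def load_prefs():
    """Always read from the local copy, so the app works offline."""
    try:
        with open(PREFS_PATH) as f:
            return json.load(f)
    except (FileNotFoundError, json.JSONDecodeError):
        return {}  # sensible defaults when nothing is stored yet

def save_prefs(prefs):
    """Write locally first, then try to push to the server."""
    os.makedirs(os.path.dirname(PREFS_PATH), exist_ok=True)
    with open(PREFS_PATH, "w") as f:
        json.dump(prefs, f)
    try:
        req = urllib.request.Request(
            SYNC_URL,
            data=json.dumps(prefs).encode(),
            headers={"Content-Type": "application/json"},
            method="PUT",
        )
        urllib.request.urlopen(req, timeout=5)
    except OSError:
        pass  # offline: the local copy stays authoritative, sync later
```

Nothing in that flow blocks on the network, which is the whole point: the device keeps working, and the server catches up when it can.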
Later in the article, Rob Enderle does get into some use cases that are great, like modeling galaxy-scale events where you are tracking millions to billions of objects over a billion years. This kind of computation can be of enormous use to scientists. The datasets being worked with now are on that kind of scale, so a chip that can process them quickly is a big deal.
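To give a sense of why a GPU matters here: the heart of an N-body simulation is the same pairwise gravity math repeated for every object, which is exactly the kind of work that spreads across thousands of GPU cores. Here's a toy version in NumPy, plain CPU code with made-up constants, just to show the shape of the computation; a real galaxy model would run a kernel like this over millions of bodies.

```python
# Toy N-body gravity step: every body feels every other body.
# Constants, step size, and softening are arbitrary for illustration.
import numpy as np

def gravity_step(pos, vel, mass, dt=1.0, g=1.0, eps=1e-3):
    """Advance every body one time step under mutual gravity."""
    # Pairwise displacements: diff[i, j] points from body i to body j.
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]
    # Softened cube of the distance, so i == j contributes nothing nasty.
    dist3 = (np.sum(diff**2, axis=-1) + eps**2) ** 1.5
    # Acceleration on each body: mass-weighted sum over all others.
    acc = g * np.sum(
        mass[np.newaxis, :, np.newaxis] * diff / dist3[:, :, np.newaxis],
        axis=1,
    )
    return pos + vel * dt, vel + acc * dt

# 1,000 random bodies; a real galaxy model would use millions or more.
n = 1_000
pos = np.random.randn(n, 3)
vel = np.zeros((n, 3))
mass = np.ones(n)
pos, vel = gravity_step(pos, vel, mass)
```

Every body's update is independent of every other body's update within a step, so the work parallelizes almost perfectly, which is why chips like Kepler are built for it.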
Finally, Rob Enderle goes into possible uses in robotics. This is another frontier that can use this sort of processing power. I personally believe that to build AI approaching our intelligence, we will need massive parallelism. The human brain runs hundreds, if not thousands, of concurrent "threads" just to keep us alive; a robot will need a similar amount of computation to stand, walk, and talk.
Looking to the future of what we can do with our most advanced technology is great, but you have to stay realistic. Robots and massive-dataset modeling are feasible. Streaming operating systems may get there someday, but we just don't have the infrastructure to support them.