I first saw this technology concept video about 20 years ago at a presentation at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign. Part of NCSA’s mission was to explore personal computing applications that required supercomputer-level processing at the time, but which would make sense in a future where everyone had a supercomputer on their desk.
I vaguely remember my impressions at the time:
- I was skeptical about the voice recognition and natural-language understanding in the interface. That seemed like an awfully hard thing to do back then, and it still seems pretty hard. Modern voice recognition is a lot better, but it’s still hard to get a computer to understand the structure of natural language.
- I was also skeptical about the document-searching capability shown in the video. I wouldn’t have thought it was possible to do useful searches of millions of documents without natural-language processing to understand what the documents were about, but it turns out you can do a useful amount of information retrieval with relatively simple keyword algorithms (there’s a toy sketch of the idea after this list). We even have some form of the video’s query completion.
- I thought the talking head used to represent the computer was excessive and pointless. I think the worldwide hatred for Microsoft’s “Clippy” proves I was right.
- I thought the streaming video conferencing was excessive too. I was mostly wrong about that. The technology is well within reach, and we could have it whenever we want, but we don’t seem to want it very much. Phone calls are intrusive enough without having to worry about how we look.
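To make that concrete, here is a minimal sketch (mine, not anything from the video or from NCSA) of the kind of simple keyword matching that turns out to be surprisingly useful: build an inverted index from words to documents, then rank results by how many query terms each document contains.

```python
# Toy keyword search: an inverted index plus a crude "count the matching
# terms" ranking. Real search engines add term weighting (e.g., TF-IDF),
# link analysis, and much more, but the core idea really is this simple.
from collections import defaultdict

documents = {
    1: "global climate simulation on a supercomputer",
    2: "keyword search over millions of documents",
    3: "climate data retrieval with simple keyword matching",
}

# word -> set of document ids containing that word
index = defaultdict(set)
for doc_id, text in documents.items():
    for word in text.lower().split():
        index[word].add(doc_id)

def search(query):
    """Return document ids ranked by how many query terms they contain."""
    scores = defaultdict(int)
    for word in query.lower().split():
        for doc_id in index.get(word, ()):
            scores[doc_id] += 1
    return sorted(scores, key=scores.get, reverse=True)

print(search("climate keyword"))  # document 3 ranks first: it matches both terms
```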
The most interesting part for me was the computing technology behind the real-time climate simulation. The NCSA had one of the most powerful computers available—a multi-million dollar liquid-cooled Cray-2 supercomputer—and it would take thousands of them running in parallel to perform that kind of climate simulation at the speed shown in the video.
The capability shown in the video implied some sort of computing utility—the term “grid computing” would later become fashionable—that could quickly and cheaply provide massive amounts of computing from a shared resource pool, much the same way you can quickly grab a few kilowatts of electricity off the power grid whenever you need it.
We are tantalizingly close to reality here:
- Making some rough assumptions about relative computing power and speed two decades ago and now, I think I could rent the modern equivalent of 1000 Cray-2 computers from the Amazon Elastic Compute Cloud service for about $40/hour (a back-of-envelope version of that arithmetic follows this list). It’s not quite like the video yet, because it would take about 10 minutes to bring them all online and get them running a particular program.
- Every time you do a Google search, you probably grab that much computing power for the split second it takes to run the search. Google’s datacenters have the search software pre-installed on pre-provisioned hardware, though, so you couldn’t do that with an arbitrary computer program of your own design as in the video.
- A 3D graphics gaming card for a personal computer uses a specialized graphics processing unit. Although optimized for shading computer-generated images, modern GPUs are becoming flexible enough to perform general-purpose computing. A top-end gaming card for under $500 is probably comparable to hundreds of Cray-2 computers for certain specialized problems.
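For the curious, the $40/hour figure comes from back-of-envelope arithmetic like the following. The numbers are my rough assumptions, not quoted specs or prices: a Cray-2 peaked at roughly 1.9 GFLOPS, and I’m guessing a rentable instance delivers a few GFLOPS for something like ten cents an hour.

```python
# Back-of-envelope check on "1000 Cray-2s for about $40/hour".
# All figures are rough assumptions for illustration, not quoted specs.
cray2_gflops = 1.9        # approximate Cray-2 peak performance
instance_gflops = 5.0     # assumed throughput of one rented instance
price_per_hour = 0.10     # assumed hourly price per instance, in dollars

target_gflops = 1000 * cray2_gflops                # ~1.9 TFLOPS total
instances_needed = target_gflops / instance_gflops
hourly_cost = instances_needed * price_per_hour

print(f"{instances_needed:.0f} instances, about ${hourly_cost:.0f}/hour")
# 380 instances, about $38/hour (in the neighborhood of $40)
```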
It won’t be much longer.
I found this video at Google Blogoscoped, which also has some interesting examples of more recent concept videos from other companies.