Is the recent virus epidemic waking developers to the need to design their software differently?
People still don't recognize the scope of what we have to do. You can't simply write a new, multimillion-line program in C and expect it to be reliable unless you're willing to work on it for 20 years. It takes such a long time because that language doesn't support the easy detection of the kinds of flaws most viruses exploit to bring down systems. Instead, you need to use a programming language with solid rules so that you can have the software equivalent of chemistry: the predictable interaction of code as it runs. But on the network, where part of the software works here and part of it works there, programs also behave in emergent ways that are more biological and difficult to predict. So until you have a science of doing distributed computing, software developers will continue to just throw stuff out there. That's why the Net is not going to be secure.
Also, distributed software systems have to be a lot simpler than they are now for us to have any hope of understanding even the mechanistic consequences, much less the nonlinear, biological consequences. You may not want to print this, but why have we been so fortunate that no one has done a Sobig virus that wipes your hard disk clean? It's just one more line of code. Just one line.
That said, I suspect some of these virus writers never expected their bugs to replicate quite the way they did. The fact that a virus goes hypercritical doesn't necessarily mean it was intended to. You could take a loop of code that is perfectly functional and add or delete a single character and unintentionally turn it into an exponential. Then again, perhaps they were just curious what would happen.
Spam is a different matter. It is mainly the result of the Internet having no friction. As long as e-mail is free, you're going to get a lot of spam because there's no disincentive to send it. A simple thing like requiring every Internet service provider to charge for sending mail could be a limiting factor.
Another reason spam is so bad is that so many companies use Microsoft Outlook for reading e-mail. Again, because that program is written in C, it's quite easy to design a virus to go through your e-mail address book and broadcast spam to all the people you know. As soon as your company starts using Outlook, you can see emergent, horrible, almost biological things start to happen. So by using Outlook, you're not practicing safe e-mail. We need a "condomized" version of it.
Is it really fair to blame Microsoft for so many of the Net's woes?
The problem with Windows isn't so much that it's insecure, but that it is stale. The company has flailed away, making changes mainly to protect its monopoly. So lately, instead of getting better with each new release, Windows is just getting different.
Also, Windows isn't well architected. There's a simple way to find out whether an operating system has been well designed: when you get an error message, go to the help system and look up the exact words in that message. That tells you whether there was enough of an architectural concept behind the system for its builders to share a consistent vocabulary for talking about what's broken.
All you have to do is try it on a Mac and on a PC to see the difference. Apple took the time to come up with a concise vocabulary, but in Windows the designers of the help system used different terminology from the programmers. That reflects a lack of design discipline, which means that as the system grows, so does the ambiguity of the software itself. The result is a system encrusted with multiple layers of things that weren't really designed in so much as bolted on. Plus there are inessential artifacts of DOS from 20 years ago that still peek through and make trouble.
Now Microsoft's working on a new version of Windows called Longhorn. But there are so many people working on it that it can't be conceptually simple. Bill Gates is a very smart person and is very dedicated, but you can't change the fact that it is human nature for people to carve up a problem and try to own things, for the complexity to accrete in corners, and for the vocabulary of the project not to make it all the way across.
Describe the trajectory of your career and where it might lead next.
I'd divide it into six chunks. As an undergraduate at the University of Michigan, I did numerical supercomputing and got to program some of the early Crays. Then I went to Berkeley and started working on Unix and building Internet protocols into it. My third stage was when we started Sun and built workstations and a distributed network file system and the Sparc microprocessor.
I was all set to leave Sun in 1987 when the company entered into a contract with AT&T, which actually owned Unix, and asked me to completely rewrite it in a modular way. But I couldn't find the right programming language, so my fourth career didn't really go anywhere. Then, after the San Francisco earthquake in 1989, I moved to Aspen and started a research lab for Sun called Smallworks, where I messed around some more with the Sparc chip and some other odds and ends.
In 1994, when a large block of ten-year options vested, I was thinking about leaving Sun again, but then the web came along and [CEO Scott McNealy] asked me to stick around a little longer. So I re-enlisted for the second time. That turned out to be the fifth stage, when I worked on the Java programming language, the Jini and JXTA concepts [networking and peer-to-peer technologies, respectively], Java chips for cellphones and smart cards, all that J-stuff. And finally, what really sucked up a lot of time the past couple of years was the aftermath of my Wired article, when I decided to try to expand it into a book that warns about why biotech, nanotechnology, and robotics have the power to render human beings extraneous. That is what I'd have to call the sixth phase of my career.
But I also did other things on the side. I had a gallery in San Francisco that sold the work of untrained, "primitive" artists. I was on the board of the Oregon Shakespeare Festival for four or five years. I'm also really into architecture and architectural modeling on computers. I've worked with Christopher Alexander [the renowned Berkeley professor, artist, and author of A Pattern Language] and Richard Meier [who designed the Getty Museum in Los Angeles]. Great architects are the last of the purists. What they do is not derivative.
When I think of my own work, most of it is built upon the efforts of others. The Unix work I did was derived from the work of Bell Labs and was more like a remodel than new construction. I'd really like to go and do something that's more like Java: something that starts from a clean sheet and isn't forced to be complicated by compatibility with anything else. Unfortunately, too few people get to do that in our industry.