Class 2

2-1) How is the transition from PCs to PDAs similar to, or different from, the transition from cable-driven to hydraulic excavators?

As with all disruptive technologies, each new wave starts out completely inadequate for the mainstream tasks of the dominant technology's niche, but the new "roaches under the floorboards" niche provides an incubator in which the new technology can mature, grow, and improve. As long as the new technology's capabilities improve faster than the market's needs grow, it will eventually be able to handle mainstream uses and displace the old technology, generally without ever having abandoned its original userbase.

That said, the switch in earth-moving equipment was a one-time event, while computer technology has undergone a series of disruptive changes over the past fifty years. DEC minicomputers bit into the soft underbelly of IBM's mainframe business, offering individual departments access to computers where previously an entire corporation might have only one. Then the PC came along, putting a computer on each individual's desk. An intermediate niche skipped by the PC vs PDA comparison is laptops, which (especially combined with wireless internet access) took computers off the desk and put them in people's briefcases and bookbags.

PDAs put computers in people's pockets. The most complicated piece of electronic equipment currently in most people's pockets is the cell phone, and with the advent of cell phones with graphical displays, internet access, scheduler functions, and downloadable games, it seems that the cell phone niche is likely to consume (or at least merge with) the PDA. A modern cell phone is already considerably more powerful (and with a better display) than the early-1980s 8-bit microcomputers or the first generation of IBM PC hardware and software. Third-party programmability has been missing, but we're starting to see it now with cell phones that run Linux internally.

PDAs are held back by user interface concerns, but this is not an insoluble problem. As a transitional strategy, they can be plugged into larger displays and keyboards the way a laptop can, or be synchronized with a PC and used as portable data storage with attached processing capacity (a la the iPod MP3 player). The same basic technology is currently being adapted to multiple niches, but the internal components remain generic. As processing, storage, and battery capacity increase, exposing the general-purpose multi-function nature of these devices to end users becomes just one more user interface issue. However it goes, 2.5 inch hard drives are going to steal away the unit volumes of 3.5 inch drives, the LCD is supplanting the CRT, and the UPS is just a special case of battery power...

(Servers are interesting because in clusters, heat and power consumption are the main limiting factor. So clustering laptop components makes sense at the extreme high end. Remove the LCD and keyboard, and call it a "blade". Small world, isn't it?)

2-2) How important are disruptive technology issues in the "build or buy" decisions of established companies considering acquisitions of startups?

Very important, but the answer is neither: they should lease.

I wrote about this at length in one of the Motley Fool articles I sent you. http://www.fool.com/news/foth/2000/foth000918.htm

An established corporate organization cannot develop a disruptive technology in-house (with rare multi-sigma exceptions, as Christensen explained), unless it spins off a protected "fishbowl" environment, such as the thinktanks at Bell Labs or Xerox PARC. (The problem with this arrangement is that the firewall that keeps the corporation out also keeps the new ideas in; these companies repeatedly fail to make good use of their own inventions.)

Unfortunately, the innovative individuals who pioneer these technologies are usually flushed out by a corporate acquisition. Companies that buy immature companies that are still inventing the future tend to crush them. It is possible to wait for the innovators to move on to other projects naturally, and acquire the stable company (with a well-defined core business) they leave behind. But you usually can't afford to just sit around: while waiting you can lose your market, and the eventual price to buy in skyrockets.

The best course may be to partner with these companies, as a customer, supplier, or both, leaving open the option of acquisition or merger years down the line. Sign a contract that includes a boatload of stock options that don't trigger for at least five years, giving you a huge stake in the company (probably even a controlling interest), but NOT YET. Then try to stay as hands-off as possible in the meantime.

Keep in mind that the distorting effects of a single large demanding customer upon whom a company's future depends can be just as disruptive as an acquisition. And keep in mind that individuals are vital to innovation. Understand why IBM made sure that Ray Ozzie's contract was secure when buying Lotus, for example. Don't take all the fun out of working there.


Class 3

3-1) Question: Compare QWERTY's effects as a standard to a particular standard in the computer industry? Does it have similar benefits and drawbacks, or is it different because it is primarily an "end user" standard?

First, I'd like to point out which technologies lost. If it was patented, it lost: Edison's patented wheel typewriter, the patents filed on improved keyboard layouts. Patented technologies fighting against unpatented technologies (or ones where the patent had expired) got their heads handed to them. From the patent application in 1867 through the 1880's when it expired, the QWERTY keyboard remained a tiny niche market; it took off after the patent expired, for the same reason the IBM PC took off and the Apple II didn't. IBM lost to Compaq, but Apple beat Franklin; the clones made it a success.

(I'd also like to beat the author of the QWERTY piece about the head until he loses the desire to write flowery purple prose, and would like to ask him if he's ever heard of the phrase "local peak" and if so why he's dancing around just coming out and saying it.)

That said, let's talk about standards.

Technically speaking, QWERTY is a protocol standard. The mechanism by which data enters or exits a system must be rigidly standardized in order for systems to communicate. The underlying implementation can be ripped out and redone, but the old interface needs to be maintained in order to interoperate with the existing network. It doesn't matter whether the machine is interoperating with trained humans, or with other machines, the principle is the same.

Most interesting standards are protocol standards, but the most similar computer standard to QWERTY keyboards is probably the x86 instruction set. Modern Pentium or Athlon processors are multi-issue RISC internally with out of order execution, speculative execution with branch prediction, register renaming, and a partridge in a pear tree. They could execute java bytecode as readily as x86 instructions, but x86 instructions are what all the software is written in, so that's what the processors support. Even Transmeta, which put the translation layer in software, still hasn't bothered to support any instruction set other than x86.

The analogy is actually stronger than it appears on the surface, because it turns out that the penalty for using QWERTY isn't as bad as most people think. More recent studies have failed to reproduce the 1940's Navy results, and instead show Dvorak keyboards as roughly equivalent to QWERTY in real world use. The ability to type faster doesn't translate into the ability to compose text faster. QWERTY remains in large part because it's not a real-world bottleneck worth the pain of switching for the vast majority of people.

Similarly, x86 assembly turns out to have unexpected benefits over RISC, because the performance bottleneck on real world systems isn't inside the processor, but in the memory bus. RISC instructions are effectively padded out to a fixed length, wasting space, and the simpler nature of the instructions means you need more of them to accomplish the same task. This puts more pressure on the memory bus, requiring potentially megabytes of cache to hold all the shared library and operating system code that the program calls through in the course of normal operation. And then there are multitasking context switches. The penalty for translating code can be avoided by pipelining: translating x86 as part of the prefetch cycle and executing the translated instructions in parallel with more translation. (You pretty much have to do this kind of thing anyway to handle instruction reordering and speculative execution past predicted branches.)
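
To make the code-density argument concrete, here's a quick back-of-the-envelope sketch in Python. The average x86 instruction length and the RISC instruction-count expansion factor below are assumed round numbers for illustration, not measurements of any real program:

    # Rough code-density comparison; all constants are illustrative assumptions.
    X86_AVG_BYTES = 3      # assumed average variable-length x86 instruction size
    RISC_BYTES = 4         # fixed-length 32-bit RISC encoding
    RISC_EXPANSION = 1.3   # assumed: ~30% more RISC instructions for the same work

    def footprint_kb(x86_instruction_count):
        """Return (x86_kb, risc_kb) code footprint for the same logical workload."""
        x86_kb = x86_instruction_count * X86_AVG_BYTES / 1024
        risc_kb = x86_instruction_count * RISC_EXPANSION * RISC_BYTES / 1024
        return x86_kb, risc_kb

    for count in (100_000, 1_000_000):
        x86_kb, risc_kb = footprint_kb(count)
        print(f"{count} instructions: {x86_kb:.0f} KB x86 vs {risc_kb:.0f} KB RISC "
              f"({risc_kb / x86_kb:.1f}x the cache and bus footprint)")

The exact ratio doesn't matter much; the point is that the denser encoding is a constant multiplier on every cache level and on memory bus traffic.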

An example of an actually bad standard that has a real downside to the user base being tied to it is Windows, but beating on them is just too easy. :)

3-2) Question: Discuss the 64-bit CPU standards war, Itanium vs AMD-64 vs the IBM Power architecture, using this paper's classification scheme.

This is a good article.

AMD has adopted an evolution strategy, while Intel has a revolution strategy, making this a classic evolution vs revolution battle. Unfortunately for Intel, the "superior performance of Revolution" the paper talks about is NOT evidenced by Itanium.

AMD-64 leverages 32-bit x86 the way the 386 leveraged the 16-bit x86 installed base. Opteron and Athlon 64 processors are also AMD's fastest chips for executing 32-bit code: 95 percent of the circuitry is the same as a 32-bit Athlon, and they have an improved branch predictor, faster front-side bus, and larger cache. Upgrading to AMD's 64-bit chips, even to run 32-bit code, is a no-brainer as long as the price is right.

Itanium, meanwhile, is an incompatible brick wall that's faster at emulating 32-bit code in software than at running it with its built-in compatibility circuitry (which is being removed from future versions). Itanium hasn't targeted the desktop since the name was changed from Merced. (The name change and the retreat from the desktop were in the same press release.) Thus it can't have the economy of scale in hardware _or_ software that the desktop does, and there's correspondingly less investment in development tools or programmer training for the new platform.

Itanium also has performance problems in native mode (as alluded to earlier: the memory bus is the real bottleneck, plus the sucker's so complicated they can't clock it very high).

Hooking up with HP as a partner was only a way of sharing the pain. Both HP and Intel have been forced to continue their previous non-Itanium processor designs: HP kept extending PA-RISC, and Intel did the Pentium 4 to avoid abandoning the x86 market to AMD's Athlons. (Itanium's most significant competitor is Intel's own Pentium 4.)

In terms of the seven key assets: AMD has the installed base of 32-bit software to leverage, and Intel has superior manufacturing capabilities (which has always been Intel's core competency).

The intellectual property rights for x86 code are irrelevant: there are no patents on it, and the copyrights to specific pieces of software are owned by diverse third parties. Design innovation hasn't really helped either side (although hyper-threading is cool). Both sides have sufficient brand name. In terms of complements, both AMD and Intel have third parties manufacturing motherboards, although Intel has a clear lead here.

The first mover advantage theoretically should have gone to Intel, but they squandered it. Itanium first taped out almost five years ago, but performance simply sucked. In terms of 64 bit x86 extensions, AMD has the first mover advantage, with Intel potentially releasing a me-too Yamhill product if AMD pulls too far ahead.


Class 4

4-1A) What has happened now that some parts of Microsoft's business are past the rapid growth phase?

Strangely enough, I wrote about this at length in a Motley Fool article that ended the "three waves" series I sent you:

http://www.fool.com/portfolios/rulemaker/2000/rulemaker000928.htm

So I'll do the other one:

4-1B) Are operating systems a natural monopoly? Is this the main reason for Microsoft's success?

No. Operating systems are no more a natural monopoly than PC hardware is. They are a natural standard, but that's not the same thing. It's quite possible to have multiple interoperable PC manufacturers, web browsers, or operating systems that work based on a defined API. The Unix market is one example of this, although that example also shows that a closed source software standard is generally pretty hard to get right.

Microsoft benefited from mistakes by IBM and Gary Kildall of CP/M. IBM chose two vendors of compatible products: DOS 1.0 and CP/M-86. They ran the same software and were compatible with each other, but the expansion path each one took was very different. Microsoft's Paul Allen added Unix features to DOS 2.0, as a migration path to Xenix (another standardized operating system, part of the Unix family that works according to POSIX and the Single Unix Specification). Kildall, on the other hand, explicitly did not want a business larger than he could run out of his living room, and resisted expanding Digital Research at all. (This is why Microsoft became the primary OS vendor, even though their product was a clone of Kildall's.) When Kildall did upgrade CP/M, he went for multi-processing capability (with MP/M), which was kind of useless back when RAM cost $5000/megabyte.

When Digital Research realized that Microsoft's Unix extensions had run away with the market and came out with their own extensions (DR-DOS), Microsoft used every dirty trick in the book to put them out of business, and settled an antitrust suit on the matter for $150 million in 2000. Digital Research also got sued by Apple over its attempt at doing a GUI (GEM), leaving Microsoft's Windows unfettered. (Apple also tried to sue Microsoft, but Microsoft threatened to yank their office suite from the Apple platform and Gil Amelio caved.)

Everybody else was kept out by exclusive preload contracts with per-processor licensing (the "CPU tax"), which is what got Microsoft in trouble during its first antitrust trial in 1995 with Judge Sporkin:

http://www.procompetition.org/litigation/timeline.html

The 1998 antitrust trial was theoretically for violating the 1995 consent decree, although the actual complaint that led to it was not about Netscape's web browser but about Netscape's web _server_, and Microsoft's attempts to keep it off NT. Microsoft's license for the "client" version of NT forbade running your own servers on it (including Netscape's web server), despite it being technically capable of doing so. The "server" version cost several thousand dollars, and came bundled with Microsoft's web server. This 1996 complaint is what eventually led to the second antitrust trial. You can read the details here:

http://www.oreillynet.com/pub/a/oreilly/news/tim_justice_nt.html

4-2) Discuss the pros and cons of various remedies proposed for Microsoft's monopolistic practices in terms of their likely effects on the industry. Make a recommendation.

The real remedy is to push Linux as far as it will go. Competition thrives on commodity standards (like the ISA and PCI buses, or 8 zillion ISPs mediating between the commodity internet and commodity modems). Linux is a commodity standard operating system, allowing applications to differentiate on top of it. If the government wanted to drive its adoption, rather than mandating anything for private companies it could simply require an open source base in its own purchasing decisions.

The breakup would have been nice for Microsoft itself (one of my Motley Fool articles talked about that), but at this point it's a bit late. Every new acquisition has been welded to the big lump that is Microsoft, so that you can't use MSN on Linux (unless you're a really skilled techie who can bypass a lot of pointless checks). Microsoft has exactly two product lines that make money: Windows and Office. Everything else they do either loses money or breaks even. In the absence of a monopoly, Microsoft would be hard pressed to make any profit at all.

Another commonly suggested remedy is forcing them to license their source code. This might help the Wine project, but the Linux developers don't care: they've been ahead of Windows technically for about five years now, and continue to widen the gap. (They've only started to focus on the desktop in the past year or so, and the 2.6 kernel is the first one where latency is explicitly optimized for rather than just throughput. They wanted to secure the server market first, which they have.)

Also, Microsoft itself can't build Windows 2000 derivatives; its build process is so dysfunctional that it had to back up to the NT4 base and graft Windows Millennium code onto it to create Windows XP. Giving competitors access to year-old undocumented code that can't even be compiled outside of a very specific environment (using development tools that Microsoft only has as binaries: they themselves don't have source code to some of their stuff because the people who wrote it left the company)... It's a distraction from trying to replace them entirely with something new.

The way to defeat Microsoft is to go around it, and render it irrelevant with open source software. Don't try to patch it, throw it out.


Class 5

5-1) Question: Suppose that Linux and a rich open source software suite (including dosemu) were fully developed a couple of years after the introduction of the i386 in 1985. What impact, if any, would that have had on the development of PC technology?

Glossing over the fact that the Free Software Foundation's GNU project hadn't visibly failed yet (and if Richard Stallman's project had succeeded, his anti-corporate rhetoric would possibly have scared business away from "Free Software" permanently). And assuming Linux didn't get caught up in the USL vs BSD lawsuit. And glossing over the fact that internet access was considerably more restricted five years earlier along an exponential growth curve...

The fundamental problem is the PC hardware mix. The 386 was introduced on October 17, 1985. It took Linus three months to create the Linux 0.01 kernel in our world (from his first stab at a boot sector to posting the source code), and he didn't begin until three months after he got his 386. The first mostly functional version (0.12) came at about the one year anniversary point, so however it goes Linux couldn't have been available before late 1986.

But there's more. Due to manufacturing glitches, 386 volume shipments didn't happen until 1987, and it was still priced out of most people's reach for years after that because it was the high-end product. The introduction of the 486 (April 10, 1989) dropped it into the mid-range, but the low-end system for new sales (let alone the installed base) remained the 286 until the introduction of the Pentium (March 22, 1993).

Fully 32-bit systems require a 32-bit processor, and won't run without one. Until at least 50% of the installed base of PC hardware is 32-bit capable, network effects work against a 32-bit system in favor of one that can run on the larger installed base of 16-bit hardware. 386 and successor chips did not become the majority of PC hardware until well after their introduction; 286 systems continued to be sold at the low end (where the highest unit volume is) until the introduction of the Pentium kicked the 386 down into the low-end. Adding in the years of backlog of older 16-bit systems (which are generally replaced every three to five years), the "tipping point" of 50% market share came somewhere in 1994 or 1995. (Developers tend to lead the mainstream; whenever it matters we develop for the high end so our products have the longest possible shelf life. So for us, the tipping point came in 1992 or 1993.)

Users were actively looking for a replacement for DOS in the late 1980's due to Moore's Law rendering the 640k memory capacity insufficient, for very simple reasons. Moore's Law doubles memory capacity every 1.5 years, and the original IBM PC was introduced in mid-1981 in configurations ranging from 16k to 64k of RAM. The 640k limit is exceeded at the high end after four doublings (mid-1987 for developers and power users) and at the low end two doublings later (mid-1990 for the mass market).
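
A quick sketch of that arithmetic in Python, using the starting configurations and the 1.5-year doubling period from the paragraph above:

    # Moore's-law doubling of typical RAM configurations, starting from the
    # original IBM PC (mid-1981: 16k low end, 64k high end), doubling every
    # 1.5 years, until the 640k limit is exceeded.
    DOUBLING_YEARS = 1.5
    LIMIT_KB = 640

    def limit_exceeded(start_kb, start_year=1981.5):
        kb, year, doublings = start_kb, start_year, 0
        while kb <= LIMIT_KB:
            kb *= 2
            year += DOUBLING_YEARS
            doublings += 1
        return doublings, year

    for label, start_kb in (("high end (64k)", 64), ("low end (16k)", 16)):
        doublings, year = limit_exceeded(start_kb)
        print(f"{label}: passes 640k after {doublings} doublings, around {year:.1f}")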

But the 386 was at the high end when it was first introduced, dropped into the midrange with the introduction of the 486, and only hit the low end when the Pentium came out. And that's new sales, so add a couple more years for the installed base to catch up. Therefore, people wanted to be able to stick more than 1 megabyte of memory into their old 16-bit systems somehow. Hence they needed a transitional system that could (however clumsily) use the 286's expanded and extended memory features for code and data. (OS/2 1.x and Windows 3.1 were aimed at this niche.) Windows 95 was introduced a year or two after the optimum time, but not much more than that.

That said, Linux would have survived (and in our timeline did survive) the introduction of Windows 95. The "anything but Microsoft" consolidation that led to the rise of Java in 1996 would probably have gone to Linux instead, as the last alternative standing. The Java crowd poured into Linux in 1998 (hence the 212% growth in a single year), meaning that the alternative history would have accelerated Linux adoption by two to three years, so it might have been in a better position to get a boost from the Y2K hype.

Linux would also have been in a MUCH better position to take advantage of Microsoft's 1995 misunderstanding of the internet (labeling MSN an "internet killer", etc), and the stagnant Windows 98 release with entrenched Internet Explorer. Netscape came out in 1995, and in 1996 and 1997 Netscape was strong enough that it could have forced a platform migration if the new version came out on Linux significantly before coming out on Windows, or was free for Linux and cost money on Windows. Netscape was a killer app people would buy a new computer for, but with Linux they wouldn't need to change their hardware. This could have effectively walled Microsoft out of the entire dot-com boom, and made Linux the standard desktop client. (The word processing problem, with the need to read and write .doc files, would still have needed to be solved. But there would have been enough industry momentum to push Corel's WordPerfect, StarOffice, AbiWord, etc. into the spotlight, especially if they read old Word files (the easy part; it's writing them that's hard) and agreed on a common open document standard for new ones.)

So however it went, Linux probably would not have made its big mainstream splash before 1995 or 1996. It might have had an OS/2-like niche before then, but like OS/2 it would have suffered from a lack of OEM preinstalls and poor vendor hardware support. However, if it had taken the #2 spot from the Macintosh before 1997, it potentially could have killed Apple before the return of Steve Jobs, at which point the remaining mainstream application vendors (Intuit's Quicken, Adobe's Photoshop, Symantec, various game companies, etc.) would have jumped on board. (Apple's resurgence has delayed this; it's hard to tell when Linux becomes the #2 platform, and when it does Apple will claim to be part of a larger "Desktop Unix" platform to prevent the loss of third party support by playing to inertia...)

I'll stop now.

5-2) Question: Suppose your goal is to derive a proprietary advantage from a product based on open source technology. How would you go about doing so?

Since this question is about the Dalle article, I'd like to point out a couple of things he missed at the end. First, DOS died because 640 kilobytes wasn't enough, and people were actively looking to replace it after Moore's Law equipped computers with more than that. Otherwise, people would simply have bolted a standard GUI library on top of it (a graphical version of Borland's OWL, for example), stuck a multitasker under it (a la DESQview), and limped along for another ten years. But because of the memory address space, they _had_ to leave their old software behind. This is relevant today because there's a similar barrier at 4 gigabytes, where we switch over to Opterons, and the mainstream will hit that barrier in 2005. At that point, people will be actively looking to replace 32-bit Windows, which they aren't quite yet.
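
The same doubling arithmetic gives a feel for where 2005 comes from; the 64 megabyte mainstream configuration around 1996 used as a starting point here is an assumption for illustration, not a measured figure:

    # When does a Moore's-law doubling curve cross the 32-bit 4 GB address limit?
    year, megabytes = 1996.0, 64   # assumed mainstream starting point
    while megabytes < 4096:        # 4 GB = 4096 MB
        megabytes *= 2
        year += 1.5
    print(f"Crosses 4 GB around {year:.0f} ({megabytes} MB)")

Six doublings at 1.5 years each is nine years, which lands right around 2005.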

Secondly, Microsoft's main competition has always been its own installed base. They've had trouble forcing people to upgrade ever since their first "good enough" 32-bit product, Windows 95. They've had to forcibly discontinue products (and support for products) like Windows 98 and Windows 2000 to have any hope of selling XP, and in hopes of mitigating this in the future they're switching to a rental model for Windows 2003 and Longhorn. (This is not universal yet, but they're forcing it on some of their big corporate customers, such as the City of Austin.) This forced upgrade treadmill and Microsoft's new licensing initiative ("our servers stopped working, did we forget to pay our Microsoft bill this month?") are both big benefits to Linux. An even bigger help to Linux is the "Business Software Alliance" audits, and the license clause that allows Microsoft to snoop in your computer through the internet. This built-in spyware is illegal for computers in hospitals, financial institutions, government agencies... All this means that plenty of people who don't really want to go to Linux are being forced to consider it, by Microsoft's own actions.

Now, on to the question:

The primary method of gaining an open-source-compatible proprietary advantage is branding. This is the strategy of Red Hat, which used it to become the leading Linux distributor in North America before its founder (Robert Young) quit. (It's drifted a bit since then, in the hands of venture-capitalist-appointed corporate officers.) Young liked to compare Red Hat Linux to Heinz Ketchup: taste tests show that people who have never tried ketchup before universally dislike it, and they dislike the various brands of it equally. But Heinz has convinced people to prefer a type of ketchup that doesn't even come out of the bottle easily. You can invest in trademarks and brand name, even if you're selling bottled water or near-identical soda, and that brand equity is a sustainable proprietary advantage separate from the product itself. Before SCO went nuts, their CEO compared it with bottled water, and the markups one can get by selling gallon jugs in the supermarket, let alone Evian or Perrier.

Beyond that, Eric Raymond's "Widget Frosting" is sometimes a workable strategy, but only in certain cases. People don't mind "leaf nodes" being proprietary. If end users never derive their own IP from your IP, then they don't mind not owning your IP. Games are one example: they can be as proprietary as you like because if at some point in the future you become evil, it's not the end of the world. Switching to another game has relatively little cost.

But if your own work is built on top of somebody else's word processor, or somebody else's spreadsheet, or somebody else's operating system, then the supplier can potentially charge you a fee to use your own work. And that is deeply offensive. (It's not just software: I reserve the right to move my web page to another ISP. The ISP does not own my page, I do.)

So any new extension that other people extensively customize or use as infrastructure to build their own work on top of (PowerPoint, for example) is eventually going to be reverse engineered as open source, and is not a sustainable business to be in. This is a problem for Oracle, which is why they've been doing a classic upward retreat in the face of MySQL, PostgreSQL, and the new generation of fully in-RAM database programs that only write a transaction log and periodic snapshots to disk.
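
As an aside, here's a minimal sketch of that transaction-log-plus-snapshot design, to show why it's so simple to replicate; the class and file names are made up for illustration, not any particular product's API:

    import json, os

    class InMemoryStore:
        """Toy key-value store: reads hit RAM only; each write is appended to a
        log before being acknowledged; a periodic snapshot bounds recovery time
        and lets the log be truncated."""

        def __init__(self, snapshot="snapshot.json", logfile="txlog.jsonl"):
            self.snapshot_path, self.log_path = snapshot, logfile
            self.data = {}
            self._recover()
            self.log = open(self.log_path, "a")

        def _recover(self):
            if os.path.exists(self.snapshot_path):
                with open(self.snapshot_path) as f:
                    self.data = json.load(f)
            if os.path.exists(self.log_path):
                with open(self.log_path) as f:
                    for line in f:              # replay writes made since the snapshot
                        key, value = json.loads(line)
                        self.data[key] = value

        def set(self, key, value):
            self.log.write(json.dumps([key, value]) + "\n")
            self.log.flush()
            os.fsync(self.log.fileno())         # durable before acknowledging
            self.data[key] = value

        def get(self, key):
            return self.data.get(key)           # pure RAM read, no disk I/O

        def checkpoint(self):
            with open(self.snapshot_path, "w") as f:
                json.dump(self.data, f)         # write the full state...
            self.log.close()
            open(self.log_path, "w").close()    # ...then truncate the log
            self.log = open(self.log_path, "a")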


Class 6

6-1) Question: Many fundamental cluster effects seem to be rooted in interpersonal relationships and a common employment pool in a local area. These also drive up the costs of doing business in a cluster area and promote exporting clusters. How do you see the advent of the Internet, telecommuting, cellphones, and other changes in communication and workplace structure affecting the relative strengths of positive and negative cluster effects? Will clusters be more, less, or equally important in the future as a result?

Will speculating about the future ever produce concrete answers?

There will always be clusters, but how important geography will be is a different question. The important thing is people: there was no Hollywood cluster before people went there to make movies (to get away from Edison's patents; the climate came later and doesn't explain soundstage work). Birds of a feather flock together, hence aspiring actors move to California.

I work in a cluster called the open source community. I read the linux-kernel mailing list, as well as the ones for the uClibc and busybox projects. Recently I had to install an IRC client, despite a personal dislike for IRC (it eats too much time), because that's where many of the developers I want to interact with hang out. (Physically, a large number of them appear to live in Australia, although part of that may be because I tend to do most of my programming late at night, in the middle of their daytime.)

I've been paid to telecommute: email reports, cell phone conversations, etc. This doesn't work too well as a full-time job, because employers like to see you at a desk in an office. But for either billable hours where you produce an itemized timelog, or piecework (like writing a periodic column or preparing research reports), it works fine.

Software development can also be done via a telecommuting arrangement, with patches serving the place of periodic reports. As long as you can show some tangible results of the work you do on a regular basis, your actual presence isn't always required. However, rather a lot of "software development" work isn't pure development. There's a huge support element (some of which is hand holding), and interacting with people to try to turn vague inarticulate desires into specifications you can code to. The other people involved tend to get frustrated on a regular basis, without ever wanting to blame themselves for anything, and anything new becomes a convenient target. Telecommuting has thus been tried and rejected on a number of occasions because "if you were here in person we'd have this solved by now" is a way pointy-haired idiots can vent, whether it's true or not.

On the other hand, telecommuting can be just as much of a plus to employees as physical presence is to employers. If the best programmer is in Australia and doesn't want to move, or the lowest bidders are in India and you don't care at all about being able to maintain or upgrade the finished product after it ships... And in an environment where telecommuting is no longer new and scary, it tends not to get blamed for daily stresses.

There's also the question of what a given company is really doing. Are they producing a product, or rendering a service? Product production already has manufacturing outsourced. A service can be rendered at a distance, or the client may want suits carrying briefcases to show up in person, for purely emotional reasons. It all depends on what they're willing to pay for. Services rendered at a distance will generally be cheaper, because there's a greater selection of providers to choose from, and usually less overhead. It may take a few generations for the old managers to die off, though. But at least we don't have to wear a suit and tie so often for the privilege of typing on a keyboard...

6-2) Question: Porter's explanation of cluster effects in the paper above was pretty generic, and as such doesn't really distinguish very well between characteristics of clusters in the same business, such as those of Silicon Valley and Route 128. How would you modify or extend Porter's explanation to account for the arguments in this article?

Boston was the headquarters of Digital Equipment Corporation (and its affiliates). Silicon Valley came out of Shockley Semiconductor. Boston got sidelined when the minicomputer industry died; Silicon Valley blossomed with the integrated circuit.

The cluster in Boston spun out of a university, MIT. The cluster in California spun out of for-profit companies: Shockley Semiconductor and HP. (Shockley left Bell Labs to start Shockley Semiconductor in California; his reasons for picking the location are mentioned in the book "Crystal Fire", which I haven't got a copy of with me at the moment. I do remember that one big factor was that forty years ago land was cheap in the Santa Clara Valley, and expensive near Boston. That didn't last, but it helped catalyze the migration and establish a new cluster based on start-ups rather than established firms.)

The breakup of Shockley Semiconductor giving rise to Fairchild Semiconductor, which in turn gave rise to Intel, helped set a pattern of perpetual start-ups. If the "traitorous eight" could make it big, there was no real point in traditional company loyalty. On the east coast, people expected to stay with one company and make a career of it.


Class 7

Today's theme is "Focus".

7-1) Question: What stage of the technology adoption cycle is videoconferencing technology currently in? What would be the best strategy for a company selling videoconferencing equipment to use to drive the technology and marketplace to the next stage of the cycle?

First, one comment on the article: disruptive technologies go from niche to niche. This is a complementary view that says you need to focus on which niche you're attacking next.

Videoconferencing is stalled at the "Visionary" stage. The technology is there; the techies were using CU-SeeMe five years ago, and now anybody can stream RealVideo or MPEG-4 over the internet. But it's a little more complicated than just finding investors and commercializing it: nobody will actually invest money in a market Microsoft's already in, and "Microsoft NetMeeting" is waiting to squash anybody who actually figures out how to profit from it. Meanwhile, the open source guys are worried about patents (see http://www.videolan.org/).

There are actually two distinct technologies that don't interoperate yet: internet-based TCP/IP streaming video, and 3G wireless phones with built-in cameras. The phone makers are quite happy to sell the hardware straight to individuals, since as long as you're carrying around an Orwellian tracking device anyway, it might as well have a spy camera in it. Video cell phones are likely to flood the market even if people only use them to take snapshots, and once the hardware is widely distributed the younger generation will make use of it.

Obviously the paper would suggest bundling together a specific solution for specific customers, but that's already happening. The pornography industry has been an early adopter of all sorts of technologies, including videotape and daguerreotype photographs. The BBC wrote about them using video streaming:

http://news.bbc.co.uk/1/hi/technology/2992914.stm

I think what's more likely to get it into business use is that one of the open source packages will be adapted and cleaned up for in-house use somewhere like IBM, directly by a customer who knows what they want. Then that technology may be adopted elsewhere, the way Cisco's home-grown Linux-based printing system paved the way for CUPS and LPRng.

http://www.linuxjournal.com/article.php?sid=2907

The principle is the same, though: adapting technology to a specific niche. One such potential niche is security systems: the 8 zillion hidden video cameras that security guards monitor. Another potential niche is fast food order windows, actually letting you see a human on the drive-through screen. But for most potential niches, videoconferencing turns out not to buy you anything beyond a warm fuzzy feeling. Speech goes just fine over the telephone; the ability to see the other person conveys very little additional information while taking up a lot more attention. Still, corporate boardrooms might want it for psychological reasons, with the proper marketing, and the profit margins are insanely high at the executive level (to make up for the low volume).

The more interesting applications of video streaming (so far) have been broadcasting canned video streams, but the MPAA is just as reactionary as the RIAA, and has been trying very hard to suppress this technology, since it renders them irrelevant as well. Buggy whip manufacturers again.

7-2) Question: Is Treacy's analysis appropriate for technology companies? Why or why not?

I'm not sure Treacy's analysis is appropriate to any company. Why would printing your bill on both sides of the page not fall under being cost-conscious? What does the smell of bread in Costco have to do with operational efficiency? He's shoehorning stuff into his categories left and right.

His basic idea seems to be that a company can excel in inventory management, product development, or customer relationships, but not more than one. (I'd bring up Home Depot as a counter-example, although now that its founders Bernie Marcus and Arthur Blank have retired, it's degrading rapidly if you ask me.)

I wrote about companies like Dell, Home Depot, and Wal-Mart back at The Motley Fool. Our portfolio was called the "cash king portfolio", focusing on dominant companies with high margins and large cash reserves, and I was interested in companies that optimized the cash conversion cycle instead, basically excelling in inventory management. Treacy points to Home Depot as an example of excellent customer service, yet I always saw their real strength as inventory management. They're the Wal-Mart of hardware stores (a niche known for being tolerant of inventory, since what they sell has a multi-year shelf life, yet they focus on moving it rapidly anyway). Home Depot's basic idea was that if you let customers into the warehouse, you don't need a separate warehouse and retail site. This cuts a stage out of the inventory management chain. On top of that, they hold how-to seminars and hire extra staff who are instructed to walk you to your destination rather than just give you directions. Initially, they did _both_ better than their competition.
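
For reference, here's the standard cash conversion cycle calculation I had in mind, as a small sketch; the sample figures at the bottom are made up for illustration, not any company's actual numbers:

    def cash_conversion_cycle(inventory, cogs, receivables, revenue, payables, days=365):
        """CCC = days inventory outstanding + days sales outstanding
                 - days payables outstanding.
        Negative means suppliers and customers finance the inventory for you."""
        dio = days * inventory / cogs        # how long goods sit before selling
        dso = days * receivables / revenue   # how long customers take to pay you
        dpo = days * payables / cogs         # how long you take to pay suppliers
        return dio + dso - dpo

    # Illustrative figures only (same units throughout, e.g. $ millions):
    print(cash_conversion_cycle(inventory=400, cogs=6000,
                                receivables=500, revenue=8000,
                                payables=900))

A company that turns inventory fast and collects from customers before it pays suppliers drives this number toward zero or below, which is exactly the inventory-management edge I'm talking about.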

So basically, I think the guy's full of it.


Class 8

In the computer use/productivity growth article, page 9, footnote 14: federal employment grew at an annual rate of 0.5% (less than the 1.4% annual growth in employment as a whole), but the federal government's total percentage of employment went UP from 2.3% to 2.5%. They grew slower than the rest, so their share went up. What the...?

8-1) Question: What are the three most important business-process changes that Wal-Mart made to be able to realize productivity benefits from technology?

Well, according to the McKinsey study, Wal-Mart's prime advantage was the big box format, but on more than one occasion it repeats "inventory management, electronic data interchange and [barcode] scanning systems" as the three big areas where IT helped boost productivity.

Everything Wal-Mart does is a big exercise in logistical optimization. Stock what sells, draw a straight line from supplier to consumer, try to have in stock exactly what the customer is currently trying to buy (no extra you have to store, no shortages that mean missed sales). Wal-Mart's electronic tracking system is conceptually similar to the electronic package tracking systems of UPS or FedEx. Scan it on delivery, scan it whenever it's forwarded from place to place, and scan it when it's delivered.

The "delivery" scanning at the cash register has been widespread for over ten years (and according to page 6 of the harvard business school article, was introduced by wal-mart in 1983). What's different from a traditional store model (and what most other stores did NOT copy when their checkouts grew barcode scanners) is that instead of thinking of themselves as storing goods (warehousing), package delivery services like FedEx see themselves as _transporting_ goods. They care how long it takes to get from point A to point B, and try to minimize that time. Shutting down FedEx to take inventory would be unthinkable, it's just not a part of the business.

Now Wal-Mart is tracking its goods from start to finish the way FedEx does, taking the rate of delivery into account. (And what gets measured is what gets optimized.) And "start" is getting pushed back into the vendors via EDI, although part of that is just a huge company making demands on its suppliers, such as forcing them to carry its inventory for it to look better to Wall Street (a trick Dell also employs, and the reason Coca-Cola created Coca-Cola Enterprises).
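
A toy version of that scan-everywhere model, to show how measuring transit time (rather than just counting stock) falls naturally out of the data; the item number and scan points are made up for illustration:

    from datetime import datetime

    class TrackedItem:
        """Record a scan event at every handoff, so dwell time between any two
        points in the supply chain can be measured (and therefore optimized)."""

        def __init__(self, item_id):
            self.item_id = item_id
            self.scans = []                  # list of (location, timestamp) pairs

        def scan(self, location, when=None):
            self.scans.append((location, when or datetime.now()))

        def hours_between(self, start, end):
            times = dict(self.scans)
            return (times[end] - times[start]).total_seconds() / 3600

    item = TrackedItem("0012345678905")      # a barcode
    item.scan("supplier dock",       datetime(2003, 11, 1, 8, 0))
    item.scan("distribution center", datetime(2003, 11, 2, 14, 0))
    item.scan("store shelf",         datetime(2003, 11, 3, 9, 0))
    item.scan("checkout",            datetime(2003, 11, 5, 17, 30))
    print(item.hours_between("supplier dock", "checkout"), "hours from supplier to sale")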

I also thought the bit about catching shoplifters trying to get refunds was clever. The data they used to send across the satellite they put up can now be sent more cheaply through land-line internet links.


Class 9

9-1) Question: Analyze a recent patent case using the framework presented by Rivette and Kline. For example: Is it wise for SCO to try to make money by suing IBM?

The Rivette and Kline article is about Dell. What Dell did was take over the market niche of Computer Shopper: people knew for years they could get a cheaper PC by mail-ordering parts from "the direct channel". All Dell did was read Computer Shopper for you and pre-assemble the box. There was a HUGE market inefficiency, with companies like Compaq keeping months of inventory on hand during manufacturing, and keeping computers on retail shelves for three months beyond that, so after half a year of Moore's Law depreciation the machine you were buying cost 30% more than it had any reason to. Compaq's biggest problem copying Dell was that it was tied down by existing reseller relationships: if they sold direct they'd be in competition with their own resellers.

Patents have never promoted economic development. Dell didn't get where it is because of patents. Eli Whitney's patent on the cotton gin was universally ignored. (Whitney got rich later through the invention of the "American system" of mass-produced interchangeable parts.) The Wright Brothers were more or less driven out of the airplane business by 1909. The article mentions Alexander Graham Bell, but _doesn't_ mention that the way he wound up with a stake in AT&T was that the executives had a change of heart after trouncing him in court.

In any fast moving niche, effort spent laying roadblocks to keep competitors from following you is effort not spent moving forward. In a niche like pharmaceuticals, where a dozen years of regulatory approval are required before you can sell something, such roadblocks may make sense. But it's stupid in a niche where what you just did will either be obsolete in three years (not just computers but Hollywood movies and popular music) or be a commodity by then.

This is why everybody went to mutually assured destruction. Suing over patents is seldom a profitable strategy, so you use your patents to prevent anyone ELSE from suing you, and cross-license the lot as a big lump to anybody who agrees not to sue you. (If they're smaller than you, you squeeze money out of them in the process, but that's corporate America for you.)

It is not wise for SCO to sue IBM because they haven't got any defensive arsenal. SCO hasn't _got_ any relevant patents; the remaining Unix patents stayed with Novell, and the core concepts of Unix are old enough that patents on the original ideas would have expired by now. In its counter-suit, IBM pulled four random patents out of its portfolio, one for each of SCO's remaining product lines. It has zillions more; there's just no point in pulling them out unless SCO can shoot down the first ones. (Why tip your hand?) If SCO gets any of IBM's original four patents invalidated, IBM can just pull out another one, and SCO has to start the fight all over again from ground zero. SCO doesn't know which patents IBM will pick next, so the only way to prepare would be to research thousands of patents, which isn't practical. IBM's countersuit can keep SCO in court _forever_; it only ends when SCO either loses or runs out of money to pay lawyers.


Class 10

10-1) Is there (or will there be) an Internet Patent War?

The first article again mentions the mutually assured destruction nature of modern software patents. The main way of getting around this is moving software development to Europe, which is why there's currently a big brouhaha over Europe voting to destroy its software industry by approving software patents. (The next stops after Europe may be India, China, Brazil, Africa...)

The patent examiners use previous patents to tell if something's novel or not, and because nobody patented software twenty years ago, people are getting patents on twenty-year-old software. That said, the current generation of truly crappy patents should expire in another decade or so, and most of the basic principles should be back in the public domain by then. In the meantime, mutually assured destruction is the technique people wait out the clock with. The people who waste their time on legal infighting tend to get eaten alive by competitors (as Lotus and Borland found out), or find out that the destruction is not necessarily mutually assured when an idiot starts trouble (as SCO is learning).

10-2) Will patent strategy help or hinder the continued growth of Open Source software? What should the Open Source community do about patents?

If patents could stop open source, Microsoft would have used them years ago. These days, the open source crowd is cozying up to IBM, which has a patent portfolio larger than those of many entire countries. Red Hat has also started its own GPL patent pool for MAD purposes, as they explain here:

http://www.redhat.com/legal/patent_policy.html

The patents on MP3 led to the creation of Ogg Vorbis. Red Hat yanked its MP3 player for legal reasons (even though Fraunhofer's patents don't cover playback, just encoding). It's interesting to note that shortly after Red Hat started allowing lawyers to make technical decisions (such as not shipping an MP3 player with Red Hat 9), their distribution folded, and was outsourced to the Fedora project.

In August 2002, Linus Torvalds expressed his policy on dealing with patents in the Linux Kernel here:

http://groups.google.com/groups?selm=Pine.LNX.4.44.0208111553010.1233-100000%40home.transmeta.com&rnum=2

In brief: he ignores them until somebody sues, and doesn't even want to be told about them so he's not willfully infringing on anything.

Earlier (in 1999), an excellent exchange about patents in Linux occurred, including contributions from a former patent examiner and a discussion of the Patent and Trademark Office's administrative procedures. This discussion laid out the Linux kernel developers' strategy, which is basically to wait to be sued and then hit back hard:

http://kt.zork.net/kernel-traffic/kt19991115_43.html#9

According to the next major linux-kernel patent discussion, 90% of patents are never defended in court, and of those that are 80% are overturned, so only 2% of patents are ever successfully defended in court.

http://kt.zork.net/kernel-traffic/kt20001016_89.html#7

Next, Microsoft patented loading a "trusted" OS into a "trusted" CPU, a la Palladium:

http://kt.zork.net/kernel-traffic/kt20011231_148.html#1

The linux developers' aversion to even liberally licensed patents showed up in the discussion of the rtlinux patent:

http://kt.zork.net/kernel-traffic/kt20020609_170.html#1

Which led to the creation of the AdeOS project, an improved way of doing the same thing as rtlinux without infringing upon their patent:

http://www.nongnu.org/adeos/

Similarly, the patents on the Unix compress algorithm (from Unisys and IBM) explicitly led to the creation of gzip, whose developers spent more time reading patents than reading compression research:

http://www.gzip.org/#faq11

If the patent office knew what they were doing, they wouldn't allow patents on mathematically impossible processes, like compression of random data:

http://gailly.net/05533051.html

These days, the linux kernel developers try to team up with companies like IBM, who have enormous defensive patent portfolios, for the explicit purpose of countersuing. Just as the GPL uses copyright against itself, the patent system has become so dysfunctional that it is being used against itself.

Some patents are licensed for the sole purpose of preventing anyone else from suing other licensees of the patent over other patent infringement claims relating to the technology. IPMI and USB are two technologies like this.

http://kt.zork.net/kernel-traffic/kt20021028_190.html#4


Class 11

11-1) Which is more critical to the rapid pace of innovation in the technology industry, the culture and infrastructure of start-up companies or the proper organization of established companies to meet the challenges of disruptive technology and new competition? Why?

Start-ups, hands down. Established companies aren't very good at disruptive technologies until they become sustaining technologies. Never were, aren't about to start. They acquire disruptive technologies by buying smaller companies, once they've been proven.

Marc Andreessen (co-founder of Netscape) said in a recent interview: "One of the problems big companies tend to have with innovation is not that they don't have ideas. It's just they're so big that the next innovative idea -- if it's not equally huge -- isn't going to move the needle on their financials."

You can read the interview at http://www.sfgate.com/cgi-bin/article.cgi?file=/chronicle/archive/2003/12/07/BUGMP3GOVK1.DTL

11-2) What is your view of the role of university research in the technological ecology in this country? Are universities properly structured to perform this role?

Universities used to do this thing called "teaching", long long ago. Back in those days, their research efforts were largely side effects of having a lot of smart educated people (professors) together with a lot of smart naive people (students), along with a lot of toys. Now universities are funded as an alternative to corporate think tanks, since corporations have noticed that they don't value their in-house research that much. (Why commercialize something risky when you can keep it off the market instead?)

In software, I think open source is where all the new stuff is going to come from. I think universities can help, since what the open source community could really use is academic fellowships for people like Larry Wall. But since universities aren't stepping up and doing it, we're getting organizations like OSDL, largely funded by corporations, which seems to be working for now. Universities could fund it, government grants could fund it, or people could just go back to doing open source in evenings and weekends (which still outperforms proprietary development but slows us down to maybe half the speed we go at when we can really put time into it.)

The point is to get the new technology out there so SOMEBODY picks it up. Every time you put a leash on it to restrict the pool of potential adopters, the result is bad for the economy. "Company B could bring this to market in six months with a version that would sell ten million copies. Company A is going to take five years to produce crap, and will then sell three, two of which will be returned for refunds, and will then SIT on it until the patent expires." Even good companies can do this: IBM is sitting on the "butterfly keyboard" patents until they expire. Right now, that technology is doing nobody any good, despite the fact customers loved it.

Oh sure, IBM's labs are still puttering with it (see http://reviews.cnet.com/2100-1044_3-5098732.html), but they're not going to do anything with it. Quote from that article:

Hill emphasized that there are no current plans to commercialize any of the designs but said there was no intrinsic reason why this couldn't be done if demand warranted it. "The technology required to do it is not enormously sophisticated," he said.

If start-ups had access to the technology, it never would have gone OFF the market in the first place. (Sharp's Zaurus has a fold-out keyboard now, but can't do it right for fear of hitting IBM's patent...)


Class 12

12-1) Question: Is the venture capital system as it is practiced in this country good for the investing public and good at spurring innovation? Can you think of a better alternative for the investing public and/or entrepreneurs?

People offering you money are almost never doing it for your benefit. There are a few exceptions (the Pell Grant comes to mind), but even there the money is given to further the agenda of the person giving it, and only coincidentally that of the person getting it.

The venture capital system is designed for investors, although not so much the 'investing public' as the big boys: big institutional portfolio managers who want to put 10% of their portfolio into high-risk things, just as they put an opposing 10% into treasury bonds or similarly low-risk investments.

On a certain level, venture capital funds make sense as the market's reaction to Sturgeon's Law in start-ups. If 90% of them fail, then the remaining winners must produce enough profit for the venture capital fund to make up for it, so the fund has to bleed them white.

Venture capital has nothing to do with innovation. Money doesn't make ideas happen. It comes afterwards, when you're trying to develop your idea into a product. Venture capitalists want to fund something similar to a previous success, not something truly innovative. They're also at a loss dealing with disruptive technologies: if the market starts small and grows slowly, where's their big short-term payoff?

My main objection to venture capital is that it distracts companies from funding themselves from operations. A three-man company can survive off of $250k/year fairly comfortably, and a $200k opportunity looks incredibly appealing. Add $10 million of venture capital and suddenly $200k opportunities stop demanding so much of your company's attention. The attractive strategy becomes pursuing more venture capital. And once you've hired more people and gotten a bigger place, you need more than $250k/year to survive. Venture capital grows the business without naturally growing the customer base to support it. Funding growth from operations doesn't have that problem.


Class 13

After reading the article, I still don't know how task orientation is supposed to differ from single-mindedness. But that's just the tip of the iceberg of "not liking this article".

Yes, entrepreneurs see people as people rather than roles. Yes, they tend to want to do things themselves if they're the guy who invented it and are better at it than anybody else they've managed to train. What else is new?

The rest of the article was case studies of composites the guy made up, and meant nothing to me. "During the four-hour meeting, the group would force itself to distill, from a list of ten, three key initiatives to be accomplished during the next 90 days." Oh yeah, this tells me all SORTS of stuff about how to run a good company. It's bland. It's fake. The names have been changed to unsuccessfully disguise the fact that the author's making it all up, or at the very least sucking all human interest out of it...


Class 14

Best quotes from "Engineers as Entrepreneurs": "HOUSE: But a lot more people are apt to succeed at doing the small to medium company, I suspect, than to become the next Bill Gates. So I wouldn't want our readership to feel that, hey, the only game in town is to do a Yahoo or Microsoft." And McMinn's follow-up.

Programming is a skill, and the traditional business model for brokering skills is the law firm, or doctors'/dentists'/opticians' offices. You come to The Great Geek when you need something specific, or you contract out to him. But it doesn't scale up to huge businesses, because you can't find 8 gazillion more of them to put on an assembly line. The point of software being infinitely replicable, though, is that you don't have to. You pay to have it created or customized. After that, copying it doesn't actually cost anything, no matter what the law says about it. When the law or existing business models don't match reality, reality is not what eventually erodes.


Class 15

I suppose I trust angel investors more than I trust venture capitalists, but in either case their objective is to get their money back out in less time than it really takes to grow the business. My own investments were in standard stocks and such, but I never invested in anything where I had to care about an exit strategy. Personal preference, I suppose.

Obviously there's a group of investors out there exploiting an otherwise underserved niche. If there's money to be made, somebody's trying to make it. That's what the free market's all about. (Even where there ISN'T money to be made, somebody's prospecting for it...)

But still, the businesses that impress me are the ones financed by the founder's credit cards. I remember reading some statistics a few years ago on the amount of funding successful small businesses had had, and there was a definite spike at credit card limits (which the article discussed). Some people also took out a home equity loan, or simply got a $20-30k loan based on their personal credit.

If you can line up customers and deliver a product or service to them in a single-digit number of months, you're in good shape. If you can't, you have more than a 50% chance of failure anyway. If your business plan involves doing something two years from now, you may want another line of business to pay the bills until then. This is what Intel did, taking side consulting gigs designing custom chips for calculator companies while they worked on their DRAM master plan. The sideline, microprocessors, took over the business. It still took them a while to realize how important it was (for years they thought of microprocessors as controllers for elevators and such), but they had money coming in, which gave them time and leeway to screw up.

A business is a process, sucking in money from customers and delivering product. Without that flow, you haven't got a business yet. Angel investors and venture capitalists don't create businesses, they create things that can sometimes mature into businesses.


Class 16

Peter Salus is cool, but I've been following him for a while; I read A Quarter Century of Unix a long time ago. I could write up some stuff on Unix history, but I think I'll just point you to the Unix history chapter of The Art of Unix Programming, which I more or less co-authored:

http://www.catb.org/~esr/writings/taoup/html/historychapter.html

You could also check out the Unix history bits in Halloween 9, which link to MP3s of a couple of computer history talks Peter Salus gave, by the way: http://www.opensource.org/halloween/halloween9.php


Class 17

I'd just like to contrast the Chesbrough article's "Open Innovation" with open source software. Specifically table I-1, which could have come from The Cathedral and the Bazaar. We know this stuff already. The corporate world is catching up to what the geeks have been doing for years.

However, he treats it like it's something new. Fifty years ago, AT&T totally failed to commercialize the transistor. Hollywood went to California (initially Baja California) to escape Edison's patents. Modern electrical power is based on Tesla's AC, not Edison's DC. The Wright brothers were not working for a big company when they invented the airplane. Big institutions have _never_ been the source of the majority of the world's technological progress. The Catholic Church certainly didn't support Galileo's research.

The recent "dot com" bubble of internet commercialization echoes the bubbles in electronics and plastics earlier in the century. The autmobile arose from over a hundred start-up companies fighting for supremacy a hundred years ago. Institutional R&D was never the dominant factor. (A few notable exceptions like World War II and the Apollo program were from government, not corporate, activity.)

17-1) Question: How much of the value created by new technologies can be captured by the organizations that create them?

Potentially, they can capture as much as their competitors can. But see question 17-2. Any time and effort you spend digging a moat around your business is time and effort you're not spending attacking the market opportunity. You don't win races by worrying about slowing down the other runners.

Some of the investment is in infrastructure. Intel doesn't have few competitors because it knows stuff other companies don't. Intel has few competitors because almost nobody else can afford billions of dollars per year to keep building and upgrading state-of-the-art manufacturing fabs. Intel gets enormous economies of scale from that, and can amortize that investment over literally millions of processors per year. Its competitors may know exactly how to make such a fab; they just haven't got the resources or momentum. (If they did build a fab and come out with a chip, and it took a year to convince Dell to pick it up, the chip would already be obsolete...)
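
The amortization argument in rough numbers (a Python sketch with assumed figures, not Intel's actual costs or volumes):

    # Spreading a fixed annual fab cost over very different unit volumes.
    fab_cost_per_year = 4_000_000_000   # assumed annual spend on building/upgrading fabs

    for units in (1_000_000, 100_000_000):   # a new entrant vs. an incumbent
        print(f"{units:>12,} chips/year -> ${fab_cost_per_year / units:,.0f} of fab cost per chip")
    # The same fixed cost works out to $4,000 per chip at the small volume
    # and $40 per chip at the large one.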

17-2) Question: How should companies manage their research to benefit their business?

As a rapidly depreciating asset.

Moore's Law is good for the computer industry for a number of reasons, and one of the big ones is that nobody spends much time being particularly defensive about anything that's already shipped. It's going to wind up in a landfill in three years anyway; you have to focus on what comes next or you're history. Therefore, you hit the market hard with a lot of product as fast as you can, fill up the channel with as much as it can hold, and then discontinue production while the inventory is still selling well, so you can tool up for the next generation. Having other people chase your taillights is actually good, because if they get a product to market nine months after you did, they're going to get half as much money out of it (if they're lucky).

Your technology is going to leak out the instant you ship it to customers anyway; it'll be in the hands of anybody who wants to reverse engineer it. (Sure you've got patents, but by the time one works its way through court you're three generations on, and you're fighting a cheap Taiwanese holding company with no assets anyway.) As soon as you advertise to customers you're tipping your hand to competitors, letting them know what you're doing if not how. (However big or bright your company is, the rest of the world outnumbers you. There are always people out there smarter than you, and more of them, and if you could figure it out so can they.)

What you have, therefore, is lead time. Use it. Execute rapidly. Come out with rapid upgrades. Go big. Try to establish an installed base in the market so that customers come back to you for support and upgrades by default. Don't hesitate to cannibalize your own installed base; it's that or watch your competitors do it.

The hard drive industry is one example. Video games are another. Video games are nice because there's a "mod" industry: if you can get third parties to license your engine instead of cloning it for their inevitable flood of follow-on products, then you get free money from otherwise dead technology. Why do they do this? To save three months development time, of course. (And yes they do it all the time: Epic has licensed the Unreal engine to all sorts of people.)

The only question is: does licensing your technology make more economic sense to your competitors than reverse engineering it? This is a question of dollars and time: there's a price at which it does and a price at which it doesn't. Pick the right one.
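
As a sketch (Python, with invented numbers) of the crossover the competitor is computing:

    def cheaper_to_license(license_fee, clone_cost, months_saved, revenue_per_month):
        """True if paying the fee beats re-implementing the technology in-house."""
        cost_of_cloning = clone_cost + months_saved * revenue_per_month  # dev cost plus delayed sales
        return license_fee < cost_of_cloning

    # e.g. a $500k engine license vs. $300k of development plus three lost
    # months at $150k/month of sales: licensing wins.
    print(cheaper_to_license(500_000, 300_000, 3, 150_000))  # True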


Class 18

I liked the Chesbrough article about Xerox, but it was _screaming_ for references to The Innovator's Dilemma. It was published in 2003, and the book had been a bestseller for five years already. Come ON, guy...

The Harvard article struck me as largely pointless.

Question: For a particular company, the primary value-capture question is "how much value can my company capture"? But for society as a whole, the primary value-capture question may be "how much benefit (value capture!) does the typical member of society receive?" Can you propose a mechanism for balancing value capture by innovators and by customers that would (theoretically) maximize benefit to the median member of the public?

Yeah, patents and copyrights that expire after a reasonable amount of time. That was the original idea, after all.

The only problem we've got with the modern system is that the expiration date is no longer reasonable. Patents originally lasted 14 years and copyrights 14 (renewable once); now patents last 20 years and copyrights last the author's life plus 70. (Ah yes, got to provide incentive for dead inventors.) And on top of that, Congress extends copyrights retroactively with the Sonny Bono Act. Stupid.


Class 19

Question: How might Intel adapt or augment its research funding strategy to address the concerns raised in the last two pages of Chapter 6?

The chapter missed the real problems, actually. Quantum computing's a fad, and Moore's Law can keep going for another decade with a diamond substrate; there's a Wired article on this at http://www.wired.com/wired/archive/11.09/diamond.html

But Intel's 64-bit strategy (Itanic) is doomed; it's chasing AMD's tail with Yamhill. There's a memory barrier at 4 gigabytes that people are going to hit in 2005. I detailed this at http://www.opensource.org/halloween/halloween9.php#itanium
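
For reference, the 4 gigabyte figure is just pointer width: a 32-bit address can only name 2^32 distinct bytes. A few lines of Python arithmetic show where the ceiling comes from:

    print(2 ** 32)                 # 4294967296 addressable bytes
    print(2 ** 32 / 1024 ** 3)     # 4.0 GiB -- the 32-bit ceiling
    print(2 ** 64 / 1024 ** 6)     # 16.0 EiB -- what 64-bit addressing allows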

Beyond that, prices for new systems have been going through the floor ever since 1998. Intel was forced to do the Celeron because the 56k modem was the bottleneck in the system, and more processor power didn't make the internet go faster. Now we've got broadband, but the momentum has been broken somewhat. (People are also resisting upgrading to ever more wasteful new Windows versions. Games require powerful 3D cards more than powerful CPUs. For just about everybody, a gigahertz is plenty of speed to do everything they need. The metrics are shifting to low power consumption in laptops, an area the Pentium 4 is not well suited for.) Plus the developed world is somewhere near market saturation now, and although there's an upgrade stream, there's downward price pressure because existing systems are "good enough", so people can wait for new, more powerful systems to get cheap. The real growth is in the developing world, which means more price pressure (India's Simputer, etc).

What Intel's been trying to do is diversify into areas like networking. See http://www.internetweek.com/interviews/barrett102300.htm for example. Unfortunately, they've tried to do it with internal start-ups, and it blew up in their face a bit. A mature conglomerate is not the best place for in-house startups. They may have the cash to bash their way into new areas, but an easier course probably would have been growth by acquisition. (If they'd wanted to make a big splash they could have bought a company like Nokia. But they didn't.)


Class 20

I preferred the original book. He hasn't got an actual solution here.

I liked the bit about the company outsourcing itself out of a job. My take on commoditization is to root for it rather than try to avoid it. I side with the open source guys. :)


Class 21

I remember All Things Considered covering Michael Lewis's new baseball book. http://discover.npr.org/features/feature.jhtml?wfId=1338452 Good interview. (I reviewed Liar's Poker for The Motley Fool way back, "http://www.fool.com/portfolios/rulemaker/1999/rulemaker990217.htm". That column was edited within an inch of its life, but oh well...)

21-1) Question: Research institutions (e.g. universities) have difficulty quantifying the productivity of their workers (e.g. faculty). How do they attempt to do so? What do today's readings say about the best approach for doing so? Do these same questions apply to research personnel in a company?

Ah, the old problem of the metric replacing the thing being measured. The only real solution is to have managers who understand what the people who work for them actually do, which is harder than it sounds.

The computer industry's particularly vulnerable to this because it's a full-time job just keeping up to speed, and even a techie who's out of the loop for six months can have a lot of catching up to do when they get back. An MBA who has never BEEN up to speed in the first place can at best guess. And if he's being advised by an MCSE, all bets are off. QED. :)


Class 22

Question: How would you formulate a national research policy for the 21st century that optimizes the economic benefits per research dollar? Consider the strengths and weaknesses of the three research vehicles we've examined: industry labs, academia, and research consortia like Sematech. Which should be encouraged and funded to do what? Do we need them all? Do we need different models?

If it were government dollars, I'd restart the space program so we're actually trying to accomplish something specific. Take the Starlite project (launch ten probes for $100 million each rather than one probe for $1 billion) as one example. Kill the stupid space station we've got now, which is just an excuse to funnel money to Russian rocket scientists and has been COMPLETELY hijacked by that foreign policy goal, to the point where it's a worthless piece of junk otherwise... And basically get a moon base that private enterprises can rent space on.

If it were corporate dollars, the Open Source Development Labs (OSDL) seem like a pretty good model.

Academia's too fluffy for my tastes. They tend to come up with things that have no purpose. Of course you could give research fellowships to proven performers, but what that has to do with academia is an open question, unless the MacArthur genius grants count as part of academia...


Class 23

Joel is cool. I still think the Amazon/Ben & Jerry's article was better, though. (Any time you hold up something that happened at Microsoft as an example of a _good_ way to do it, I lose interest.)


Class 24

These articles seem to miss an important element of creativity: play. Creativity involves, at some point along the way, elements of play. Daydreaming, thinking about what if, trying stuff you don't expect to work, etc. Just playing around. (They do occasionally use "play with" as a synonym for "fiddle with", but manage to miss the point anyway.)

Many great inventions were created out of pure sarcasm. The potato chip, for example. There's a whole group of people who are intensely motivated by being told something is impossible.

Deadlines help move beyond play into actually trying to make something work (necessity being the mother of invention, and duct tape being a very common component of the first implementation of anything). But a challenge can be an important part of play. (Games need rules.)

All of these articles look at creativity from a strange Martian point of view: "I'm an accountant without a creative bone in my body; let's put on a white lab coat and see what this creativity thing is like. If you break it down to its component atoms, you will eventually find the 'art' molecule..."

Makes me want to hit them.

Managing creative people is like herding cats; the Xerox article from a few weeks ago had excellent examples of it. Know when to stay out of their way, know when to squeeze a stopgap prototype out of them, but never EVER take the fun out of it or you take the creativity out of it.


Class 25

25-1) Question: What forces are most likely to shape the Internet for the next decade? Government? Private industry? The open-source movement? How will things change?

I was one of those 8-bit BBS kids twenty years ago, and although I couldn't stand AOL, my mother and brother used it, so Lessig's description of spaces is a bit old hat to me. :)

All three (government, private industry, and collaborating end users) will have a hand in it, but the dominant one is the collaborating end users (of which open source is one expression). Collaborating users built the internet, and the web on top of that. Industry comes in to profit from it because that's where the customers are, and government comes in to regulate it because that's where the citizens are. But they are not and never have been the driving force behind it.

All three boil down to "what people want", albeit sometimes only certain people. When different people want different things, competing forces tend to fight it out for a while. I consider the open source movement the most direct expression of what people want (cutting out the middle man and writing it for themselves, without commercial or regulatory interests involved). But there are decades of drama to work through before open source becomes the standard.

The internet is just the printing press all over again. This is not a new observation. (I wrote about it at length for The Motley Fool years ago, see http://www.fool.com/portfolios/rulemaker/2000/rulemaker000413.htm ) But it remains true. The printing press commoditized printed pages and sparked a larger social movement that broke the power of the Catholic Church, which was effectively Europe's federal government at the time. The church had the authority to burn people at the stake simply for disagreeing with it, but that could only delay the effects of the printing press. Then again, those delaying tactics allowed the church to hang on to governmental and regulatory power for hundreds of years after the arrival of the printing press. And the church itself found the printing press quite lucrative (printing indulgences) along the way.

The internet is faster and more versatile than the printing press, but still exactly the same concept. The internet allows individuals to become publishers, in a way that quickly and cheaply scales to a global audience of millions of readers if the content merits their attention. Text is already a done deal; the Drudge Report can compete on an equal footing with Time magazine or CNN. (This even helps traditional books: Wil Wheaton's self-published "Dancing Barefoot" sold out its first print run and has a backlog for the second, because the 250,000 regular readers of his blog were better grassroots advertising than a big publisher's entire advertising budget.) Audio streams are currently in progress, and video content is happening any time now as the infrastructure scales up to meet the bandwidth requirements.

There are a number of attempts by the entrenched interests to fight these changes and preserve their obsolescent position of privilege. One approach is technological, such as the content scrambling system on DVDs. But technical approaches don't work: DeCSS was written in a single afternoon by three bored Europeans, and released by a fourth. It is impossible to give a user both a lock and a key and prevent them from opening the lock at will. If they can get into the content on their own without external permission, then they have everything they need to do anything they want with it. Either they can access the content or they can't; there's no "sort of access".

And just as the MPAA wanted to suppress DeCSS, the federal government originally wanted to suppress public key cryptography, ala Zimmermann's PGP. It leaked as open source a decade ago; the government fought for a bit, then tried to regulate just its export (see barn doors and horses), then finally caved to the inevitable. And now https is an important part of the basic infrastructure of the internet (without which credit card transactions online would be too risky to be widespread), and Harvard professors are writing about it as a great thing. There's a pattern here.

Legal approaches don't do much good in the face of widespread civil disobedience. Napster attracted millions of nontechnical end users, who didn't care about intellectual property law and still don't. The destruction of Napster simply led to the creation of numerous alternatives: Kazaa, Limewire, BearShare, BitTorrent, Audio Galaxy, etc. The legal game quickly became whack-a-mole, and new technologies were designed not to have an easily suppressible central point of failure.

Throwing money at the problem allowed entrenched interests to smother individual instances of new business models, such as the recent destruction of mp3.com's song archive (see http://www.theregister.co.uk/content/7/34009.html ). But that just means that the new system that springs up to replace it won't have that particular vulnerability next time. (The new independent music index will be distributed, and mirrored, and not have a central point of failure.)

Product activation seems like a good idea, until you try it. One of the big driving forces behind Linux adoption on the desktop (in corporate settings at least) is XP's need to contact Microsoft's servers for permission to run. Smaller companies without a monopoly advantage are severely punished for even attempting it: Intuit had to publicly apologize to its customers for playing with product activation earlier this year, and many customers did not accept the apology but permanently switched to competing products. See "http://news.com.com/2100-1046_3-5088604.html".

The fact is, end users don't want it, and to force it down their throats you not only have to prevent them from using your product without your restrictions, but also prevent any competitors from offering an alternative that doesn't have the restrictions. Bertelsmann has a patent on MP3 generation, but if they successfully enforced it people would switch to using Ogg files. The only reason Ogg has not yet replaced MP3 is that Bertelsmann's attempts to enforce its MP3 patent have not yet been effective enough to make the inert entrenched base of MP3 users care.

Leveraging existing monopolies to preserve themselves and squelch alternatives is the classic delaying tactic. Microsoft's Palladium is one example, but it can't prevent competitors from offering products that are not crippled by artificial restrictions, and guess where consumer demand goes? Any monopoly that actually can prevent domestic competitors from offering competing products is subject to antitrust suits in the short term, and foreign competition in the long term. There are always competitors happy to participate in a commodity market and help that commoditization along, because even if the market is only worth a tenth as much afterwards, they get a bite of that tenth, whereas they were shut out of the lucrative monopoly entirely. This is neither new, nor specific to the internet. Open source is just one expression of this trend.

The RIAA and MPAA appealing to congress to pass laws against their competition is another predictable clock-stopping approach. But these are still only delaying tactics, just like the Catholic Church used, and now even the delaying tactics operate on internet time.

P.S. An interesting thing to point out is that government and private industry have both proven fairly powerless against spam, but one thing that's helping to reduce it is the rise of spam baiting as a sport. People respond to spammers and draw them out, getting them to spend a lot of time trying to hook a potential victim: answering messages, sending handwritten letters and photos, hanging around the airport waiting for the arrival of someone who isn't coming (and in extreme cases spending their own money booking a hotel for that person's visit)... See www.419eater.com for one example site.


Class 26

Here are my notes from the lecture:

Austin Ventures

Venture capital is a subset of "private equity", which is a subset of "alternative assets".

Venture capitalists are looking to build great companies.  To do this, they are looking for:

A great team of founders and managers
  - absolutely key -- the team is the core to success.  Without the right people, I don't care how good the idea is.
  - focused, passionate/committed and creative people
  - looking for a balance of skills (general management, technical expertise, marketing/business development) with experience in relevant fields
  - team does not have to be complete; better open slots than "weak people". A good VC firm can help fill the open slots.
  - the set of people who meet these criteria will be small; the challenge is to find and recruit them.

A good business plan
A blockbuster product


"All you need in this life is ignorance and confidence, and then success is sure." - Mark Twain

Team, market, technology.  Risk in one or two, not all three.
If it needs less than $5 million to get to cash-flow break-even, it doesn't need venture capital.
If it needs more than $10 million, venture capital opens the way.

thoughts:
  unlimited demand model (gold mine?).  Otherwise competitors try to take
  business from you.

Class 27

Question: Pick a modern technological change (e.g. the widespread adoption of email), and specify who is economically harmed by the widespread adoption of the new technology.

In the case of email, the post office and fax machine manufacturers. The decline in fax machines is also hurting phone companies: the shortage of phone numbers let up considerably when everybody started getting rid of their dedicated fax lines. (Cell phones are making up for it, but that's another cut into the monopoly local carriers' business, and cable is all set to cut into what's left with voice over IP...)

It's not hard to find these. The refrigerator put "ice men" out of business. There used to be a whole industry cutting blocks of ice out of rivers in the winter, storing them under sawdust all summer, and delivering them every morning, the way milkmen used to, so you could keep your "ice box" cold. How many people used to make saddles and tack for horses? Scribes were put out of business by moveable type... This is normal.