9 Waging a Standards War

North versus South in railroad gauges, Edison versus Westinghouse in electricity, RCA versus CBS in color TV, Sony versus Matsushita in VCRs, the United States versus Japan in HDTV, 3Com versus Rockwell and Lucent in modems. It’s fine to talk about the advantages of standard setting and alliances, but agreement is not always reached on technology standards. Time and again, incompatible technologies battle it out in the market in a high-stakes, winner-take-all struggle.

When two new incompatible technologies struggle to become a de facto standard, we say that they are engaged in a standards war. These wars can end in a truce (as happened in modems), a duopoly (as in video games today), or a fight to the death (as with VCRs). Standards wars are unique to network markets with powerful positive feedback. Traditional principles of strategy, while helpful, are not enough when it comes to standards wars.

We do not mean to suggest that every new information technology must endure a standards war. Take the CD technology, for instance. Sony and Philips adopted a discontinuity strategy: they openly licensed their CD patents as a means of establishing a new technology completely incompatible with the existing audio technologies of phonographs, cassette players, and reel-to-reel tapes. They were not in a battle with another new technology. They merely (!) had to convince consumers to take a leap and invest in a CD player and compact disks.

What is distinct about standards wars is that two firms, or alliances, vie for dominance, each employing one of the four generic strategies discussed in Chapter 7. One of the combatants may be an incumbent that controls a significant base of customers who use an older technology, as when Nintendo battled Sony in the video game market in the mid-1990s: Nintendo had a large installed base from the previous generation of video games when the two companies introduced their next-generation systems. Or both sides may be starting from scratch, as in the battle between Sony and Matsushita in VCRs.

The outcome of a standards war can determine the very survival of the companies involved. How do you win one?

CLASSIFICATION OF STANDARDS WARS

Not all standards wars are alike. A critical distinguishing feature is the magnitude of the switching costs, or more generally the adoption costs, for each rival technology. We can classify standards wars according to how compatible each player’s proposed new technology is with the current technology.

If both your technology and your rival’s technology are compatible with the older, established technology but incompatible with each other, we say the battle is one of rival evolutions. Competition between DVD and Divx (both of which will play CDs), the 56k modem battle (both types communicate with slower modems), and competition between various flavors of Unix (all of which run programs written for older versions of plain vanilla Unix) all fit this pattern.

If your technology offers backward compatibility and your rival’s does not, we have evolution versus revolution: a contest between backward compatibility (evolution) and superior performance (revolution). Evolution versus revolution includes the important case of an upstart fighting against an established technology that is offering compatible upgrades. The battle between Lotus 1-2-3 and Excel in the late 1980s and early 1990s in the market for spreadsheets followed this pattern. So did the contemporaneous struggle between dBase IV and Paradox in the market for desktop database software. (The mirror image of this occurs if your rival offers backward compatibility but you do not: revolution versus evolution.)

Table 9.1. Types of Standards Wars

                                Rival Technology
  Your Technology     Compatible                      Incompatible
  Compatible          Rival evolutions                Evolution versus revolution
  Incompatible        Revolution versus evolution     Rival revolutions

Finally, if neither technology is backward-compatible, we have rival revolutions. The contest between Nintendo 64 and the Sony PlayStation and the historical example of AC versus DC in electrical systems follow this pattern. These four types of standards battles are categorized in Table 9.1.

INFORMATION-AGE STANDARDS WARS

We start with three case studies of information-age standards wars. They illustrate several of the tactics that can be employed and some possible outcomes. One war, that over AM stereo radio, was mutually destructive. Another, over digital wireless telephones, has led to the continued use of multiple incompatible technologies. The third battle, over 56k modems, was resolved through a standards agreement.

AM Stereo Radio

Some wars have no winners. AM stereo is a good example. Never heard of AM stereo? Our point exactly. The failure of AM stereo radio to gain popularity in the 1980s resulted from a battle between rival revolutions that left no winners.

As early as 1959, petitions were filed with the FCC to adopt an AM stereo standard. By the late 1970s, several incompatible systems were competing for FCC endorsement, sponsored by Magnavox, Motorola, Harris, Belar, and Kahn. The FCC actually picked the Magnavox system in 1980, only to be met with a storm of protest. In an echo of the color television fiasco, the FCC reversed itself in 1982, voting 6 to 1 to let the “market” decide. Four of the five rival systems started to compete in the market, seeking to attract both radio broadcasters and receiver manufacturers.

Since the radio industry itself was quite fragmented, the pivotal player was General Motors’ Delco Electronics Division, the dominant manufacturer of radio receivers. Delco picked the Motorola system. AM stereo was estimated to add $20 to $40 to the retail price of a car radio. But radio stations saw little reason to invest in equipment, especially in the face of uncertainty over which technology would prevail. Some 30 percent of radio stations cited “market confusion” as a reason for not broadcasting in stereo. The second most-cited reason was “insufficient audience,” which is almost the same thing.

We see several lessons in this experience. First, it is a reminder that rival, incompatible approaches to a new technology can indeed kill or at least greatly retard the growth of that technology. Second, a new technology had better offer significant value-added to start a bandwagon rolling. Third, the AM stereo experience shows that adoption is especially difficult when multiple groups of buyers (automobile companies/drivers and radio stations) need to coordinate. Fourth, the example suggests that the best strategy was that adopted by Motorola, namely, to focus on the buyer group that was more concentrated, auto manufacturers, and specifically on Delco, potentially a pivotal buyer. Finally, we note with dismay that neighboring radio stations were unable to coordinate to at least pick the same technology in their own local geography, in part because the National Association of Broadcasters warned its members that this type of coordination might subject station owners to antitrust scrutiny.

Digital Wireless Telephones

Digital wireless telephones permit an interesting comparison of formal standardization in Europe with a standards war in the United States. As with HDTV, the United States has adopted a market-oriented approach, while Europe centralized the selection of new technology. As with HDTV, the U.S. system encouraged the emergence of a promising new technology initially backed by an upstart. Unlike with HDTV, however, the Europeans managed to adopt new digital wireless telephone technology more rapidly than the United States did. So far, at least, the U.S. standards battle has delayed adoption of a promising technology, without any evident benefit in terms of greater product variety.

In Europe, the Global System for Mobile Communications (widely known as GSM) is a well-established standard for digital wireless telephone service. GSM was officially endorsed back in 1992, and valuable spectrum was provided to support GSM implementation. As of 1997, some 40 million Europeans were using GSM. Worldwide, GSM is the dominant technology for digital wireless telephones, with 108 countries adopting it as a standard.

In the United States, by contrast, three systems are offering rival revolutions. The three incompatible technologies vying for leadership in the market for digital telephone systems are (1) GSM, (2) Time Division Multiple Access (TDMA, a close cousin of GSM), and (3) Code Division Multiple Access (CDMA), a radically different system sponsored by the company Qualcomm. The three systems are incompatible in the sense that consumers buying a phone for one system will not be able to switch to another without buying an expensive new phone. However, they are compatible in the sense that users of one system can make calls to users on another system. Fragmentation of the market not only raises consumer switching costs; it also undermines economies of scale in the manufacture of the telephones and gear.

As of 1997, TDMA was in the lead in the United States with more than 5 million subscribers; CDMA had about half that amount. GSM was a distant third with around a million subscribers. Older analog technology retains the lead in the U.S. cellular telephone industry, with nearly 50 million subscribers, but sooner or later analog will surely be displaced by either CDMA or TDMA. Some would say the United States is five years behind Europe in the adoption of digital wireless telephone service, but others argue that CDMA is technologically superior.

Since the buyers in this market, the cellular telephone and personal communication services (PCS) providers, are large, the three-way battle has led to an intricate mating dance between wireless carriers and manufacturers. Ericsson, the champion for TDMA, has AT&T Wireless, SBC, and BellSouth on board. Qualcomm, which created CDMA and has championed it, has signed up Primeco (a joint venture between Bell Atlantic, US West, and AirTouch), Sprint PCS, and most of the other PCS providers. This industry offers a good example of how large, pivotal buyers can obtain attractive terms and conditions if they are willing to make early commitments to specific technologies.

Qualcomm has aggressively pursued a performance play strategy. Qualcomm has been persistent in promoting CDMA, going back to a time when many industry observers dismissed its technology as futuristic but unrealistic. In 1990, when Bell Atlantic and Nynex picked CDMA, the industry was shocked. The Cellular Telephone Industry Association had endorsed TDMA in early 1989 (over Frequency Division Multiple Access, FDMA, a technology supported by Motorola and AT&T that has since disappeared), at which time Qualcomm had not even announced its technology. Many thought CDMA would not be workable for another decade. To this day, Qualcomm’s assertions that CDMA has far greater capacity than GSM or TDMA are hotly disputed. Qualcomm managed to shake up this industry much as General Instrument did in the HDTV competition, stunning the larger players with an all-digital system. By bringing on board Bell Atlantic and Nynex, Qualcomm forced equipment manufacturers to make CDMA products.

The precise extent of network externalities is critical to the dynamics of this battle. Consider first the geographic scope of network externalities. If users stayed near their homes, network externalities would apply only within each cellular franchise territory. Consumers in one area would benefit if both cellular providers used the same system, so they could switch systems without buying new handsets. But these same consumers would care little about the technology used in other areas (apart from the chance of relocating to another area). Under these circumstances, there would be little reason to expect a single system to dominate the entire U.S. market. As roaming becomes more important to wireless telephone customers, however, national market shares matter and positive feedback becomes stronger. Moreover, there is always the prospect of positive feedback based on traditional (supply-side) economies of scale in the manufacture of equipment.

How large are the network externalities within a region? Strong, but not overwhelming. Customers need not worry very much about being stranded: if a local carrier has invested in a CDMA system, say, there is little danger that CDMA service will become unavailable (since the infrastructure investments are largely sunk and cannot be moved to other regions). Most important, a user of a CDMA system has no difficulty placing a call to a user of a GSM system. Still, with incompatible local systems, a consumer buying an expensive wireless telephone is locked in. The natural solution to this problem is for consumers to obtain discounts on telephones in exchange for signing service contracts. The conclusion: the market for digital wireless telephone systems is subject to consumer lock-in (wireless carriers are heavily locked into the technology they deploy, and subscribers are somewhat locked in when they buy a phone), but not especially prone to tipping.

What can we learn from this example? First, a decentralized, market-oriented approach may be slower, but it also gives smaller players a chance to succeed with revolutionary new technology. By contrast, picking new technology through a more political process tends to favor larger, established players, even if they are not as imaginative and do not take the same risks. Second, remember that not every market tips. There is surely some positive feedback in the digital wireless telephone market, both around the world and in the United States, but it is not a winner-take-all business. Third, we see Qualcomm successfully executing a performance play strategy based on enlisting large and influential customers, starting with Bell Atlantic and Nynex. We discuss preemption tactics in standards wars below. Even if CDMA is truly the superior technology (which many doubt), Qualcomm could not claim victory simply on technical grounds. Preemption and expectations management were critical to its success.

56k Modems

A standards battle involving two distinct sets of buyers recently played out in the market for 56k modems. The battle was waged between U.S. Robotics (now owned by 3Com) and a team led by Rockwell and Lucent. This was a battle over rival evolutions, since both versions of the modem communicate well with older, slower, standardized modems.

The fact that there are 56k modems is somewhat of a surprise, even to experienced modem engineers. For years the accepted wisdom was that modems simply could not operate faster than around 28.8 kbps over regular telephone lines; 28.8 kbps was close to the theoretical limit, and the corresponding ITU standard, V.34, was widely expected to be the “last” modem standard. Integrated Services Digital Network (ISDN) was seen as the only way to speed things up, but ISDN has been slow in coming and a tough sell for household adoption.

Well, theoretical limits just aren’t what they used to be. Earlier modem standards had been designed for a roughly symmetric flow of inbound and outbound information. For downloading from the Internet, however, the flow is highly asymmetric: users receive information, and ISPs send it out. Using this idea to redesign modems has led to the 56k category (although performance is highly sensitive to conditions on the telephone line and the higher speeds only apply to downloading).

Everyone knew there was tremendous pent-up demand for faster modems, with consumers impatient with the sluggish pace of downloading information from the Internet at 28.8k. The available market has been estimated at more than $5 billion per year. So the 56k technology represented a major market opportunity, made all the more attractive because backward compatibility with 28.8k modems (and lower) was available to all under ITU specifications.

U.S. Robotics, the leader of one camp, controlled roughly 25 percent of the market for modems, enjoyed strong brand-name recognition, and asserted control over patents crucial to the 56k technology. Rockwell was the leader of the rival team. Rockwell’s chief advantage was that it manufactured most of the chipsets that are the electronic heart of modems. But neither player could move forward smoothly without the other, and, in any event, a formal ITU recommendation was widely seen as critical for the legitimacy of any modem standard.

U.S. Robotics attempted to preempt with its “x2” products. The company signed up most ISPs, including America Online, Prodigy, MCI, and CompuServe. In doing this, it attacked the most concentrated part of the demand side, which is an excellent way to execute a preemption strategy in a standards battle, so long as pivotal buyers like America Online do not capture all of the profits from the new technology. This strategy is in harmony with the key assets of U.S. Robotics as a leading modem maker with strong ties to ISPs. 3Com’s acquisition of U.S. Robotics only strengthened its hand vis-à-vis ISPs. U.S. Robotics was also poised to take the lead in building an installed base, exploiting what looked in early 1997 like a genuine time-to-market advantage.

But Rockwell and Lucent were not sitting on their hands. First, since Rockwell and Lucent are leading manufacturers of modem chipsets, they were well placed to control the actual implementation of 56k technology by modem manufacturers. Second, Rockwell accelerated its efforts and successfully narrowed the timing gap with U.S. Robotics by coming to market with its own “K56flex” brand. Perhaps most important, Rockwell and Lucent boldly adopted an alliance strategy, assembling an impressive coalition of modem manufacturers, computer OEMs, and networking equipment manufacturers such as Ascend Communications and Cisco Systems. Computer OEMs are increasingly key, since more consumers than ever purchase computers already fitted with a modem. In February 1997 the “Open 56k Forum” was unveiled to great fanfare (expectations management), consisting of companies that sell 70 percent of the modems worldwide.

Both sides in this battle worked hard to manage expectations and instill an aura of inevitability to their approach. One ad for Ascend 56k modems used the headline “If you are going to take sides in the 56k battle, make sure you choose the right one.” Knowing that consumers, afraid of being stranded, would try to pick the winner, each side claimed it was capturing a large share of the market. At one point, the Rockwell team claimed that 93 percent of ISPs were using Rockwell-based hardware, while U.S. Robotics asserted that 80 percent of ISPs supported its design. While jarring, these claims were not necessarily inconsistent, since many ISPs were indeed supporting both protocols in order not to lose business.

The battle for users’ minds—or at least their modems—was also waged on the Internet. Rockwell/Lucent and U.S. Robotics both maintained Web sites touting their products. In August 1997, Rockwell/Lucent listed 650 supporters on its Web site and U.S. Robotics listed around 500. PC World contacted the eighteen ISPs listed on the K56flex Web site that “supported and planned to deploy” this standard and found that only three actually offered the service, while eight others planned to do so. The U.S. Robotics site was a little better; fourteen of the twenty-one ISPs on the x2 list of supporters actually offered x2 support, and four others said they planned to do so.

At times, it looked like this standards battle would play out in a crazy way, with ISPs largely choosing U.S. Robotics x2 technology and households mostly buying Rockwell/Lucent technology. With this adoption pattern, no one would be able to take advantage of the higher speeds! An outcome with consumers using one standard and ISPs using another would not be a happy one, nor would it be sustainable.

Fears over incompatibility surely slowed down the market during 1997. Pressure mounted on ISPs to offer separate dial-in lines for each of the two protocols. But in the end, this was a standards battle in which consumers were not badly burned. Crucially, both camps promised free upgrades that would make their modems compatible with the ultimate ITU standard. This eased consumer fears to some degree, but consumers were rightly wary of “patches” meant to solve compatibility problems down the line, and were uncertain in any event whether they would see improved performance right away.

The battle wound down in early December 1997, when a working committee of the ITU announced that 3Com and Rockwell had reached a tentative agreement on a compromise standard now known as the ITU V.90 standard. 3Com stock jumped dramatically on the news, with Rockwell’s stock making more modest gains. The new international standard encompasses technical aspects of both transmission methods. Each side claimed victory. Industry observers agreed that the accord would spur modem sales: Dataquest estimated that sales of 56k modems would rise from 10.8 million in 1997 to 33 million in 1998.

KEY ASSETS IN NETWORK MARKETS

Just what does it take to win a standards war? Your ability to successfully wage a standards war depends on your ownership of seven key assets: (1) control over an installed base of users, (2) intellectual property rights, (3) ability to innovate, (4) first-mover advantages, (5) manufacturing abilities, (6) strength in complements, and (7) brand name and reputation. What these assets have in common is that they place you in a potentially unique position to contribute to the adoption of a new technology. If you own these assets, your value-added to other players is high.

The very same assets that bolster your position in a standards war also strengthen your hand in standards negotiations. For just this reason, we have already noted some of the key assets in network markets in our treatment of standard setting in Chapter 8. Here we offer a more complete list of assets, noting that some companies have used these assets to fight standards wars, while others have used them to help establish standards favorable to their interests.

  1. Control over an installed base of customers. An incumbent firm like Microsoft, with a large base of loyal or locked-in customers, is uniquely placed to pursue an evolution strategy offering backward compatibility. Control over an installed base can also be used to block cooperative standard setting and force a standards war.
  2. Intellectual property rights. Firms with patents and copyrights controlling valuable new technology or interfaces are clearly in a strong position. Qualcomm’s primary asset in the digital wireless telephone battle was its patent portfolio. The core assets of Sony and Philips in the CD and DVD areas were their respective patents. Usually, patents are stronger than copyrights, but computer software copyrights that can be used to block compatibility can be highly valuable.
  3. Ability to innovate. Beyond your existing IPRs, the ability to make proprietary extensions in the future puts you in a strong position today. In the color TV battle, RCA’s R&D capabilities were crucial. If you have a crackerjack R&D group, it may be worth some current sacrifices if you think you can outrun your competitors in the long haul. Hewlett-Packard’s engineering skills are legendary in Silicon Valley; it is often in HP’s interest to compromise on standards, since it can out-engineer the competition once the standard has been defined, even if it has to play some initial catch-up.
  4. First-mover advantages. If you already have done a lot of product development work and are farther along the learning curve than the competition, you are in a strong position. Canon is a good example. It created the personal laser printer market and has continued to dominate the manufacture of the engines in laser printers, in part by exploiting the experience curve to keep costs lower and quality higher than its competitors. Netscape obtained stunning market capitalization based on its ability to bring new technology to market quickly.
  5. Manufacturing abilities. If you are a low-cost producer, owing to either scale economies or manufacturing competence, you are in a strong position. Cost advantages can help you survive a standards war or capture share competing to sell a standardized product. Compaq and Dell have both pushed hard in driving down their manufacturing costs, which gives them a strong competitive advantage in the PC market. Rockwell has lower costs than its competitors in making chipsets for modems. These companies benefit from open standards, which emphasize the importance of manufacturing skills.
  6. Strength in complements. If you produce a product that is a significant complement for the market in question, you will be strongly motivated to get the bandwagon rolling. This, too, puts you in a natural leadership position, since acceptance of the new technology will stimulate sales of the other products you produce. The larger your gross margins on your established products, the stronger this force is. Intel’s thirst to sell more CPUs has driven its efforts to promote new standards for other PC components, including interfaces between motherboards and CPUs, buses, chipsets, and graphics controllers.
  7. Reputation and brand name. A brand-name premium in any large market is highly valuable. But reputation and brand name are especially valuable in network markets, where expectations are pivotal. It’s not enough to have the best product; you have to convince customers that you will win. Previous victories and a recognized name count for a lot in this battle. Microsoft, HP, Intel, Sony, and Sun each have powerful reputations in their respective domains, giving them instant credibility.

Don’t forget that customers, as well as technology suppliers, can control key assets. A big customer is automatically in “control” of a piece of the installed base. America Online recognized this in the recent 56k modem standards battle. Content providers played a major role in the DVD standards battle. IBM was pivotal in moving the industry from 5¼″ diskettes to 3½″ disks. Most recently, TCI has not been shy about flexing its muscle in the battle over the technology used in TV set-top boxes.

No one asset is decisive. For example, control over an older generation of technology does not necessarily confer the ability to pick the next generation. Sony and Philips controlled CDs but could not move unilaterally into DVDs. Atari had a huge installed base of first-generation video games in 1983, but Nintendo’s superior technology and hot new games caught Atari flat-footed. The early leader in modems, Hayes, tried to buck the crowd when modems operating at 9,600 bps were introduced and ended up in Chapter 11 bankruptcy.

TWO BASIC TACTICS IN STANDARDS WARS

Whichever generic strategy you are pursuing in a standards battle, there are two basic marketplace tactics that you will need to employ: preemption and expectations management.

Preemption

The logic of preemption is straightforward: build an early lead, so positive feedback works for you and against your rival. The same principle applies in markets with strong learning-by-doing: the first firm to gain significant experience will have lower costs and can pull even farther ahead. Either way, the trick is to exploit positive feedback. With learning-by-doing, the positive feedback is achieved through lower costs. With network externalities, the positive feedback comes on the demand side; the leader offers a more valuable product or service.
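The mechanics of tipping are easy to see in a toy simulation. The sketch below is our own illustrative construction, not a model from the text, and the parameter values are arbitrary: each arriving consumer compares the two technologies’ installed bases plus some idiosyncratic taste, and a head start snowballs only when the network benefit is strong.

```python
import random

# Toy model of demand-side positive feedback (illustrative assumptions only):
# each arriving consumer picks technology A or B by comparing
#   value = network_benefit * installed_base + idiosyncratic taste (noise).
def simulate(head_start_a, network_benefit, adopters=10_000, seed=7):
    rng = random.Random(seed)
    base_a, base_b = head_start_a, 0
    for _ in range(adopters):
        value_a = network_benefit * base_a + rng.gauss(0, 1)
        value_b = network_benefit * base_b + rng.gauss(0, 1)
        if value_a >= value_b:
            base_a += 1
        else:
            base_b += 1
    return base_a / (base_a + base_b)

# A 100-customer head start is irrelevant without network effects,
# but tips essentially the whole market to A when they are strong.
print(f"no network effects: A ends with {simulate(100, 0.0):.0%} share")
print(f"strong feedback:    A ends with {simulate(100, 0.01):.0%} share")
```

The point is the mechanism, not the particular numbers: once A’s lead grows, each additional adopter makes A still more attractive to the next one, which is why an early installed base is worth paying for.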

One way to preempt is simply to be first to market. Product development and design skills can be critical to gaining a first-mover advantage. But watch out: early introduction can also entail compromises in quality and a greater risk of bugs, either of which can doom your product. Recall the examples of CBS in color television and Japan in HDTV. The race belongs to the swift, but that speed should be gained by R&D, not by marketing an inferior system.

In addition to launching your product early, you need to be aggressive early on to build an installed base of customers. Find the “pioneers” (aka gadget freaks) who are most keen to try new technology and sign them up swiftly. Pricing below cost—that is, penetration pricing—is a common tactic used to build an installed base. Discounting to attract large, visible, or influential customers is virtually unavoidable in a standards war.

In some cases, especially for software with a zero marginal cost, you can go beyond free samples and actually pay people to take your product. As we see it, there is nothing special about zero as a price, as long as you have multiple revenue streams to recover costs. Some programmers pay cable operators to distribute their programming, knowing that a larger audience will augment their advertising revenues. In the same fashion, Netscape is prepared to give away its browser or even pay OEMs to load it on new machines in order to increase the usage of Navigator and thus direct more traffic to the Netscape Web site.

The big danger with negative prices is that someone will accept payment for “using” your product but then not really use it. This problem is easily solved in the cable television context, where programmers simply insist that cable operators actually carry their programming once they are paid to do so. Likewise, Netscape can check that an OEM loads Navigator (in a specified way) on new machines and can conduct surveys to see just how the OEM configuration affects use of Navigator. Manufacturers do the same thing when they pay “slotting allowances” to supermarkets for shelf space by checking that their products are actually displayed where they are supposed to be displayed.

Before you go overboard by giving your product away or paying customers to take it, you need to ask three questions. First, if you pay someone to take your product, will they really use it and generate network externalities for other, paying customers? Second, how much is it really worth to you to build up your installed base? Where is the offsetting revenue stream, and when will it arrive? Third, are you fooling yourself? Beware the well-known winner’s curse, in which the most optimistic participant wins a bidding war only to find that the other bidders were more realistic.

Penetration pricing may be difficult to implement if you are pursuing an openness strategy. The sponsor of a network can hope to recoup the losses incurred during penetration pricing once it controls an established technology. Without a sponsor, no single supplier will be willing to make the necessary investments to preempt using penetration pricing. For precisely this reason, penetration pricing can be particularly effective when used by a company employing a control strategy against a rival adopting an openness strategy.

Another implication is that the player in a standards battle with the largest profit streams from related products stands to win the war. We have seen this with smart cards in Europe. They were introduced with a single application—public telephone service—but soon were expanded to facilitate other transactions involving small purchases. Eventually, many more applications, such as identification and authentication, will be introduced. Visa, MasterCard, and American Express are already jockeying for position in the smart card wars. Whichever player can figure out the most effective way to generate multiple revenue streams from an installed base of smart card holders will be able to bid most aggressively, but still profitably, to build up the largest base of customers.
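A back-of-the-envelope calculation shows why. The figures below are our own hypothetical illustration, not industry numbers: the deepest subsidy a player can profitably offer per customer is the discounted value of the follow-on margins that customer will generate, so richer revenue streams support more aggressive bids.

```python
# Hypothetical figures: each locked-in customer yields `annual_margin` dollars
# of follow-on profit per year for `years` years, discounted at `rate`.
# The result is the largest up-front acquisition loss that remains profitable.
def max_subsidy(annual_margin, years, rate):
    return sum(annual_margin / (1 + rate) ** t for t in range(1, years + 1))

# One application worth $12/year supports a ~$45 subsidy per customer;
# several applications worth $30/year together support ~$114.
print(f"single revenue stream:    ${max_subsidy(12, 5, 0.10):.2f}")
print(f"multiple revenue streams: ${max_subsidy(30, 5, 0.10):.2f}")
```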

Expectations Management

Expectations are a key factor in consumer decisions about whether or not to purchase a new technology, so make sure that you do your best to manage those expectations. Just as incumbents will try to knock down the viability of emerging new technologies, so will those very entrants strive to establish credibility.

Vaporware is a classic tactic aimed at influencing expectations: announce an upcoming product so as to freeze your rival’s sales. In the 1994 antitrust case brought by the Justice Department against Microsoft, Judge Sporkin cited vaporware as one reason he found the proposed consent decree insufficient. In an earlier era, IBM was accused of the same tactic. Of course, drawing the line between “predatory product pre-announcements” and simply being late bringing a product to market is not so easy, especially in the delay-prone software market. Look at what happened to Lotus in spreadsheets and Ashton-Tate in database software. After both of these companies repeatedly missed launch dates, industry wags said they should be merged and use the stock ticker symbol “LATE.” And we must note with some irony that Microsoft’s stock took a 5.3 percent nosedive in late 1997 after Microsoft announced a delay in the launch of Windows 98 from the first to the second quarter of 1998.

The most direct way to manage expectations is by assembling allies and making grand claims about your product’s current or future popularity. Sun has been highly visible in gathering allies in support of Java, including taking out full-page advertisements listing the companies in the Java coalition. Showing just how important expectations management is in markets with strong network externalities, WordPerfect even filed a court complaint against Microsoft to block Microsoft from claiming that its word processing software was the most popular in the world. Barnes & Noble did the same thing to Amazon, arguing that Amazon’s claim to being the “world’s largest bookstore” was misleading.

ONCE YOU’VE WON

Moving on from war to the spoils of victory, let’s consider how best to proceed once you have actually won a standards war. You probably made some concessions to achieve victory, such as promises of openness or deals with various allies. Of course, you have to live with those, but there is still a great deal of room for strategy. In today’s high-tech world, the battle never really ends. So, take a deep breath and be ready to keep moving.

Staying on Your Guard

Technology marches forward. You have to keep looking out for the next generation of technology, which can come from unexpected directions. Microsoft, with all its foresight and savvy, has had to scurry to deal with the Internet phenomenon, trying to defuse any threat to its core business.

You may be especially vulnerable if you were victorious in one generation of technology through a preemption strategy. Going early usually means making technical compromises, which gives others that much more room to execute an incompatible revolution strategy against you. Apple pioneered the market for personal digital assistants, but U.S. Robotics perfected the idea with the PalmPilot. If your rivals attract the power users, your market position and the value of your network may begin to erode.

The hazards of moving early and then lacking flexibility can be seen in the case of the French Minitel system. Back in the 1980s, the French were world leaders in on-line transactions with the extensive Minitel computer network. The network is sponsored and controlled by France Telecom. Before the Internet was widely known, much less used, millions of French subscribers used the Minitel system to obtain information and conduct secure on-line transactions. Today, Minitel boasts more than 35 million French subscribers and 25,000 vendors. One reason Minitel has attracted so many suppliers is that users pay a fee to France Telecom each time they visit a commercial site, and a portion of these fees is passed along to vendors. Needless to say, this business model is quite different from what we see on the Web.

Nonetheless, the Minitel system is beginning to seem limited when compared with the Internet, and France is lagging behind in moving onto the Internet. Just as companies that invested in dedicated word processing systems in the 1970s were slow to move to more generalized PCs in the 1980s, the French have been slow to invest in equipment that can access the Internet. Only about 3 percent of the French population uses the Internet, far short of the estimated 20 percent in the United States and 9 percent in the United Kingdom and Germany. Roughly 15 percent of French companies have a Web site, versus nearly 35 percent of U.S. businesses. Only in August 1997 did the French government admit that the Internet, not Minitel, was the way of the future rather than an instrument of American cultural imperialism. France Telecom is now planning to introduce next-generation Minitel terminals that will access the Internet as well as Minitel.

What is the lesson here? The French sluggishness in moving to the Internet stems from two causes that are present in many other settings. First, France Telecom and its vendors had an incentive to preserve the revenue streams they were earning from Minitel. This is understandable, but it should be recognized as a choice to harvest an installed base, with adverse implications for the future. Milking the installed base is sometimes the right thing to do, but make this a calculated choice, not a default decision. Second, moving to the Internet presents substantial collective switching costs, and offers less incremental value, to French consumers than to, say, American consumers. Precisely because Minitel was a success, it reduced the attractiveness of the Internet.

The strategic implication is that you need a migration path or roadmap for your technology. If you cannot improve your technology with time, while offering substantial compatibility with older versions, you will be overtaken sooner or later. Rigidity is death, unless you build a really big installed base, and even this will fade eventually without improvements.

The key is to anticipate the next generation of technology and co-opt it. Look in all directions for the next threat, and take advantage of the fact that consumers will not switch to a new, incompatible technology unless it offers a marked improvement in performance. Microsoft has been the master of this strategy with its “embrace and extend” philosophy of anticipating or imitating improvements and incorporating them into its flagship products. Avoid being frozen in place by your own success. If you cater too closely to your installed base by emphasizing backward compatibility, you open the door to a revolution strategy by an upstart. As we discussed in Chapter 7, this is precisely what happened to Ashton-Tate in databases, allowing Borland and later Microsoft to offer far superior performance with their Paradox and FoxPro products. Your product roadmap has to offer your customers a smooth migration path to ever-improving technology, and stay close to, if not on, the cutting edge.

One way to avoid being dragged down by the need to retain compatibility with your own installed base is to give users of older versions free or inexpensive upgrades to a recent, but not the current, version of your product. This is worth doing for many reasons: users of much older versions have revealed that they do not need the latest bells and whistles and thus are less likely to actually buy the latest version; the free “partial” upgrade can restore some lost customer loyalty; you can save on support costs by avoiding “version creep”; and you can avoid being hamstrung in designing your latest products by a customer-relations need to maintain compatibility with older and older versions. To compromise the performance of your latest version in the name of compatibility with ancient versions presents an opening for a rival to build an installed base of more demanding users. Happily, this “lagged upgrade” approach is easier and easier to implement with distribution so cheap over the Internet. Lagged upgrades also tie in well with the versioning approach to software we described in Chapter 3.

Microsoft did a good job handling this problem with migration to Windows 95. Politely put, Windows 95 is a kludge, with all sorts of special workarounds to allow DOS programs to execute in the Windows environment, thereby maintaining compatibility with customers’ earlier programs. Microsoft’s plan with Windows 98 is to move this consumer version of Windows closer to the professional version, Windows NT, eventually ending up with only one product, or at least only one user interface. It will still want to version its operating system’s capabilities for all the reasons described in Chapter 3.

Commoditizing Complementary Products

Once you’ve won, you want to keep your network alive and healthy. This means that you’ve got to attend not only to your own products but to the products produced by your complementors as well. Your goal should be to retain your franchise as the market leader but encourage a vibrant and competitive market for complements to your product.

This can be tricky. Apple has flipped back and forth on its developer relations over the years. First it just wanted to be in the computer business and let others develop applications. Then it established a subsidiary, Claris, to do applications development. When this soured relations with other developers, Apple spun Claris off. And so it went—a back-and-forth dance.

Microsoft faced the same problem, but with a somewhat different strategy. If an applications developer became successful, Microsoft just bought it out! Or tried to—Microsoft’s intended purchase of Intuit was blocked by the Department of Justice. Nowadays a lot of new business plans in the software industry have the same structure: “Produce product, capture emerging market, be bought by Microsoft.”

Our view is that you should try to maintain a competitive market in complementary products and avoid the temptation to meddle. Enter into these markets only if (1) integration of your core product with adjacent products adds value to consumers or (2) you can inject significant additional competition to keep prices low. If you are truly successful, like Intel, you will need to spur innovation in complementary products to help fuel growth.

Competing with Your Own Installed Base

You may need to improve performance just to compete with your installed base, even without an external threat. How can you continue to grow when your information product or technology starts to reach market saturation? One answer is to drive innovation ever faster. Intel is pushing to improve hardware performance of complementary products and to develop applications that crave processing power so as to drive the hardware upgrade cycle. Competition with one’s own installed base is not a new problem for companies selling durable goods. The stiffest competition Steinway faces in selling pianos comes from used Steinways.

One way to grow even after you have a large installed base is to start discounting as a means of attracting the remaining customers who have demonstrated (by waiting) that they have a relatively low willingness to pay for your product. As we saw in Chapters 2 and 3, this is a good instinct, but be careful. First, discounting established products is at odds with a penetration pricing strategy to win a standards war. Second, if you regularly discount products once they are well established, consumers may learn to wait for the discounts. The key question: can you expand the market and not spoil your margins for traditional customers?

Economists have long recognized this as the “durable-goods monopoly” problem. Ronald Coase, recent winner of the Nobel Prize in economics, wrote twenty-five years ago about the temptation of a company selling a durable product to offer lower and lower prices to expand the market once many consumers have already purchased the durable good. He conjectured that consumers would come to anticipate these price reductions and hold off buying until prices fall. Since then, economists have studied a variety of strategies designed to prevent the resulting erosion of profits. The problem raised by Coase is especially severe for highly durable products such as information and software.
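A two-period version of the model makes the erosion concrete. The sketch below is our own numerical rendering of the standard textbook setup, not the authors’ calculation: buyer values are uniform on [0, 1], production is costless, and the seller cannot commit to the second-period price, so the marginal buyer waits for the anticipated markdown.

```python
DELTA = 0.9  # common discount factor (illustrative)

def profit_without_commitment(v1):
    """Seller's profit when buyers with value >= v1 buy in period 1.

    Facing residual demand uniform on [0, v1], the seller's best
    period-2 price is v1 / 2, and the period-1 price must leave the
    cutoff buyer indifferent between buying now and waiting.
    """
    p2 = v1 / 2
    p1 = v1 - DELTA * (v1 - p2)
    return (1 - v1) * p1 + DELTA * p2 * (v1 - p2)

# Grid-search the seller's best period-1 cutoff.
best_v1 = max((i / 10_000 for i in range(1, 10_001)),
              key=profit_without_commitment)
p2 = best_v1 / 2
p1 = best_v1 - DELTA * (best_v1 - p2)
print(f"no commitment: p1 = {p1:.3f}, p2 = {p2:.3f}, "
      f"profit = {profit_without_commitment(best_v1):.3f}")   # ~0.233
print("commitment:    constant price 0.500, profit = 0.250")
```

Prices fall over time, buyers who anticipate the markdown hold out, and total profit (about 0.233 here) ends up below the committed-price benchmark of 0.250—exactly the erosion Coase described.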

One of the prescriptions for solving the durable goods monopoly problem is to rent your product rather than sell it. This will not work for a microprocessor or a printer, but rapid technological change can achieve the same end. If a product becomes obsolete in two or three years, used versions won’t pose much of a threat to new sales down the line. This is a great spur for companies like Intel to rush ahead as fast as possible in increasing the speed of their microprocessors. The same is true on the software side, where even vendors dominant in their category, such as Autodesk in computer-aided design, are forced to improve their programs to generate a steady stream of revenues.

Protecting Your Position

A variety of defensive tactics can help secure your position. This is where antitrust limits come in most sharply, however, since it is illegal to “maintain a monopoly” by anticompetitive means. We’ll discuss those limits further in Chapter 10.

One tactic is to offer ongoing attractive terms to important complementors. For example, Nintendo worked aggressively to attract developers of hit games and used its popularity to gain very strong distribution. This tactic can, however, cross the legal line if you insist that your suppliers or distributors deal with you to the exclusion of your rivals. For example, FTD, the floral network, under pressure from the Justice Department, had to cancel its program giving discounts to florists who used FTD exclusively. Since FTD had the lion’s share of the floral delivery network business, this quasi-exclusivity provision was seen as protecting FTD’s near-monopoly position. Ticketmaster was subjected to an extensive investigation for adopting exclusivity provisions in its contracts with stadiums, concert halls, and other venues. And the Justice Department has attacked Microsoft’s contracts with OEMs for having an effect similar to that of exclusive licenses.

A less controversial way to protect your position is to take steps to avoid being held up by others who claim that your product infringes on their patents or copyrights. Obviously, there is no risk-free way to do this. But it makes a great deal of sense to ask those seeking access to your network to agree not to bring the whole network down in an infringement action. Microsoft took steps along these lines when it launched Windows 95, including a provision in the Windows 95 license for OEMs that prevented Microsoft licensees from attempting to use certain software patents to block Microsoft from shipping Windows 95. Intel regularly asks companies taking licenses to its open specifications to agree to offer royalty-free licenses to other participants for any patents that would block the specified technology. This “two-sided openness” strategy prevents ex post hold-up problems and helps to safely launch a new specification.

Leveraging Your Installed Base

Once you have a strong installed base, basic principles of competitive strategy dictate that you seek to leverage into adjacent product spaces, exploiting the key assets that give you a unique ability to create value for consumers in those spaces. We discussed such leveraging in Chapter 6, but some new wrinkles come up in the network context. For example, control over an interface can be used to extend leadership from one side of the interface to the other.

But don’t get carried away. As we have just seen in this chapter, you may be better off encouraging healthy competition in complementary products, which stimulates demand for your core product, than trying to dominate adjacent spaces. In acquiring companies selling neighboring products, you should be driven by true synergies of bringing both products into the same company, not simply by a desire to expand your empire. Again, legal limits on both “leveraging” and on vertical acquisitions can come into play. For example, the FTC forced Time Warner to agree to carry on its cable systems a rival news channel when Time Warner acquired CNN in its merger with Turner.

Geographic expansion is yet another way to leverage your installed base. This is true for traditional goods and services, but with a new twist for network products: when expanding the geographic scope of your network, make sure your installed base in one region becomes a competitive advantage in another region. Just don’t build a two-way bridge to another region where you face an even stronger rival; in that case, more troops will come across the bridge attacking you than you can send to gain new territory.

Geographic effects were powerful in the FCC auctions of spectrum space for PCS services, the successor to the older cellular telephone technology. If you provide personal digital assistant (PDA) wireless services in Minneapolis, you have a big advantage if you also provide such services in St. Paul. The market leader in one town would therefore be willing to outbid rivals in neighboring locations. In the PCS auctions, bidders allegedly “signaled” their most-preferred territories by encoding them into their bids in an attempt to avoid a mutually unprofitable bidding war. The Department of Justice is investigating these complaints. Our point is not to offer bidding strategy but to remind you that geographic expansion of a network can be highly profitable. Network growth generates new customers and offers more value to existing customers at the same time.

Staying Ahead

How can you secure a competitive advantage for yourself short of maintaining direct control over the technology, through patent or copyright protection, for instance? Even without direct control over the installed base or ownership of patents, you may be able to make the other factors work for you while garnering enough external support to set the standards you want.

If you have a good development team, you can build a bandwagon using the openness approach of ceding current control over the technology—through licenses at low or nominal royalties, for example—while keeping tight control over improvements and extensions. If you know better than others how the technology is likely to evolve, you can use this informational advantage to preserve important future rights without losing the support of your allies. IBM chose to open up the PC, but then lost control because it didn’t see what the key assets would be in the future. Besides the now-obvious ones (the design of the operating system and manufacture of the underlying microprocessor), consider the example of interface standards between the PC and the monitor. During the 1980s, IBM set the first four standards: the monochrome display adapter (MDA), the color graphics adapter (CGA), the enhanced graphics adapter (EGA), and the video graphics array (VGA), the last in 1987. But by the time of the VGA, IBM was losing control, and the standard started to splinter with the Super VGA around 1988. Soon, with the arrival of the VESA interface, standard setting passed out of IBM’s hands altogether. By anticipating advances in the resolution of monitors, IBM could have done more to preserve its power to set these interface standards, without jeopardizing the initial launch of the PC.

Developing proprietary extensions is a valuable tactic to recapture at least partial control over your own technology. You may not be able to execute a control strategy now, but you will gain some control later if you launch a technology that takes off and you can be first to market with valuable improvements and extensions.

One difficulty with such an approach is that your new technology may be too successful. If the demand for your product grows too fast, too many of your resources may end up being devoted to meeting current demand rather than investing in R&D for the future. This happened to Cisco. All of its energies were devoted to the next generation of networking gear, leaving little time for long-run research. If you are lucky enough to be in Cisco’s position, do what it did: use all the profits you are making to identify and purchase firms that are producing the next-generation products. As Cisco’s CEO, John Chambers, puts it: “We don’t do research—we buy research!”

Allow complementors, and even rivals, to participate in developing standards, but on your terms. Clones are fine, so long as you set the terms under which they can operate. Don’t flip-flop in your policies, as Apple did with its clone manufacturers: stay open, but make sure that you charge enough for access to your network, in the form of licensing fees, say, so that your bottom line does not suffer when rivals displace your own sales. Build the opportunity costs of lost sales into your access prices or licensing fees.

REAR-GUARD ACTIONS

What happens if you fall behind? Can you ever recover?

That depends on what you mean by “recover.” Usually it is not possible to wrest leadership from another technology that is equally good and more established, unless your rival slips up badly. However, if the network externalities are not crushing, you may be able to protect a niche in the market. And you can always position yourself to make a run at leadership in the next generation of technology.

Atari, Nintendo, Sega, and Sony are good examples. Atari was dominant in the first generation of video games, then Nintendo in second-generation, 8-bit systems. Sega made inroads by being first to market with a 16-bit system, and recently Sony’s PlayStation has been giving Nintendo’s 64-bit system a run for its money. Losing one round does not mean you should give up, especially if backward compatibility is not paramount.

The question is, how should you manage your customers if you have done poorly in one round of the competition? Stranding even a small installed base of customers can have lasting reputational effects. IBM was concerned about this when it dropped the PCjr in the mid-1980s. Apart from consumer goodwill, retaining a presence in the market can be vital in keeping up customer relations and brand identity, even if you have little prospect of making major sales until you introduce a new generation of products. Apple faces this problem with its new operating system, OS X. How does it maintain compatibility with its loyal followers while still building a path to what it hopes will be a dramatic improvement in the operating environment?

Adapters and Interconnection

A tried and true tactic to use when falling behind is to add an adapter or to somehow interconnect with the larger network. This can be a sign of weakness, but one worth bearing if the enhanced network externalities of plugging into a far larger network are substantial. We touched on this in our discussion of how to negotiate a truce; if you are negotiating from weakness, you may simply seek the right to interconnect with the larger network.

The first question to ask is whether you even have the right to build an adapter. Sometimes the large network can keep you out. Atari lacked the intellectual property rights to include an adapter in its machines to play Nintendo cartridges because of Nintendo’s lock-out chip. In other cases, you may be able to break down the door, or at least try. Discover Card wanted the right to issue Visa cards; American Express hoped to offer cards that could be used as Visa cards if a cardholder went to a merchant that did not accept American Express. Discover sued Visa, but did not gain the right to issue Visa cards. However, in Canada, the dominant ATM network, Interac, was compelled to let nonmember banks interconnect. In the telephone area, the FCC is implementing elaborate rules that will allow competitive local exchange carriers to interconnect with the incumbent monopoly telephone networks.

The most famous legal case of a less-popular network product maneuvering to achieve compatibility is the battle between Borland and Lotus in spreadsheets. To promote its Quattro Pro spreadsheet as an alternative to the dominant spreadsheet of the day, Lotus 1-2-3, Borland not only made sure that Quattro Pro could import Lotus files but copied part of the menu structure used by Lotus. Lotus sued Borland for copyright infringement. The case went all the way to the Supreme Court, where the vote was deadlocked, so Borland prevailed based on its victory in the First Circuit Court of Appeals. This case highlights the presence of legal uncertainty over what degree of imitation is permissible; the courts are still working out the limits on how patents and copyrights can be used in network industries.

There are many diverse examples of “adapters.” Conversion of data from another program is a type of adapter. Translators and emulators can serve the same function when more complex code is involved. Converters can be one-way or two-way, with very different strategic implications. Think about WordPerfect and Microsoft Word today. WordPerfect is small and unlikely to gain much share, so it benefits from two-way compatibility. Consumers will be more willing to buy or upgrade WordPerfect if they can import files in Word format and export files in a format that is readable by users of Word. So far, Word will import files in WordPerfect format, but if Microsoft ever eliminates this feature of Word, WordPerfect should attempt to offer an export capability that preserves as much information as possible.

The biggest problem with adapters, when they are technically and legally possible, is performance degradation. Early hopes that improved processing power would make emulation easy have proven false: as machines have grown faster, the tasks we ask of them have grown more complex just as quickly, so emulation continues to exact a real performance penalty.

Digital’s efforts with its Alpha microprocessor illustrate some of the ways in which less popular technologies seek compatibility. The Alpha chip has been consistently faster than the fastest Intel chips on the market. But Digital sells systems with Alpha chips into the server market, a far smaller market than the desktop and workstation markets, and Digital’s systems are far more expensive than systems using Intel chips. As a result, despite its technical superiority, the Alpha sold only 300,000 chips in 1996, compared with 65 million sold by Intel. This leaves Digital in the frustrating position of having a superior product but suffering from a small network. Recognizing that Alpha is in a precarious position, Digital has been looking for ways to interconnect with the Intel (virtual) network. Digital offers an emulator that lets its Alpha chip run like an Intel architecture chip, but the emulator neutralizes most of the performance advantages that Alpha offers. Hoping to improve the performance of systems using the Alpha chip, Digital and Microsoft announced in January 1998 an enhanced Alliance for Enterprise Computing, under which Windows NT server-based products will be released concurrently for Alpha- and Intel-based systems. Digital has also secured a commitment from Microsoft to provide source-code compatibility between Alpha- and Intel-based systems for Windows NT application developers, making it far easier for them to develop applications that run on Alpha-based systems in native mode.

Adapters and converters among software programs are also highly imperfect. Converting files from WordStar to WordPerfect, and now from WordPerfect to Word, is a notoriously buggy process. Whatever the example, consumers are rightly wary of translators and emulators, partly because of raw performance concerns and partly because of lurking doubts about just how compatible the conversion really is: consider the problems users have faced converting between Intel and Motorola architectures, or between dBase and Paradox databases.

Apple offers a good example of a company that responded to eroding market share by adding adapters. In the mid-1980s, Apple put in disk drives that could read floppy disks formatted on DOS and Windows machines; in 1993 it introduced a machine that included an Intel 486 chip and could run DOS and Windows software along with Macintosh software. But Apple’s case also exposes the deep tension underlying an adapter strategy: the adapter adds (some) value but undermines confidence in the smaller network itself.

Finally, be careful about the large network changing its interface specifications to thwart compatibility. IBM was accused of this tactic in mainframe computers. Indeed, we suggested this very tactic in the section above on strategies for winners, so long as the new specifications are truly superior, not merely an attempt to exclude competitors.

Survival Pricing

As we saw in Chapter 2, the marginal cost of producing information goods is close to zero. This means that you can cut your price very low and still cover (incremental) costs. Hence, when you find yourself falling behind in a network industry, it is tempting to cut price to spur sales, a tactic we call survival pricing.

This temptation should be resisted. Survival pricing is unlikely to work. It shows weakness, and it is hard to find examples in which it has made much difference. Our very first case study of the Encyclopedia Britannica versus Encarta illustrated this problem.

Computer Associates gave away Simply Money (for a $6.95 shipping and handling fee), but the giveaway didn’t matter: Simply Money still failed to take off in its battle against Quicken and Money. On the other hand, Computer Associates got the name and vital statistics of each buyer, which was worth something in the mailing-list market, so it wasn’t a total loss. IBM offered OS/2 for as little as $50, but look at the result. Borland priced Quattro Pro very aggressively when squeezed between Lotus 1-2-3 and Microsoft Excel back in 1993.

The problem is that the purchase price of software is minor in comparison with the costs of deployment, training, and support. Corporate purchasers, and even individual consumers, were much more worried about picking the winner of the spreadsheet wars than they were about whether their spreadsheet cost $49.95 or $99.95. At the time of the cut-throat pricing, Borland was a distant third in the spreadsheet market. Lotus and Microsoft both said they would not respond to the low price. Frank Ingari, Lotus’s vice president for marketing, dismissed Borland as a “fringe player” and said the $49 price was a “last gasp move.”
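
To put rough numbers on the point (the figures here are purely hypothetical): if deploying, training, and supporting a spreadsheet costs a corporation $500 per seat, the difference between a $49.95 and a $99.95 license changes the total cost per seat by less than 10 percent, while betting on the losing standard can strand the entire investment in training and files. Small wonder that buyers look past the sticker price to the likely winner.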

Survival pricing—cutting your price after the tide has moved against you—should be distinguished from penetration pricing, which is offering a low price to invade another market. Borland used penetration pricing very cleverly in the early 1980s with its Turbo Pascal product. Microsoft and other compiler companies ignored Turbo Pascal, much to their dismay later on.

Legal Approaches

If all else fails, sue. No, really. If the dominant firm has promised to be open and has reneged on that promise, you should attack its bait-and-switch approach. The Supreme Court in the landmark Kodak case, discussed in Chapter 6, opened the door to antitrust attacks along these lines, and many companies have taken up the invitation. The FTC case against Dell Computer also fits into the category of reneging on promises of openness. All of this corresponds with our earlier admonition: get clear and explicit protections early on, if you can, or else give serious thought to fighting a standards war.

CAPSTONE CASE: MICROSOFT VERSUS NETSCAPE

We conclude our discussion of strategic standard setting by applying our framework to one of the most widely watched and reported standards wars of the last several years: the Battles of the Browsers. During one heated skirmish in this war, interest was so intense that Business Week reported that President Clinton queried Netscape chief executive James L. Barksdale about his strategy. “That the contest caught even the President’s eye underscores just how seminal it is: This battle is for nothing less than the soul of the Internet.”1

In one corner we have the company that popularized the very idea of an Internet browser: the Internet pioneer, darling of the stock market, and still reigning champion in the browser category, Netscape Communications Corporation. In the other corner we have the heavyweight of high tech: the world’s largest software supplier, dominant on the desktop, grimly intent on catching the Internet wave, none other than the mighty Microsoft.

For the past three years, Microsoft has been pulling out all the stops to overtake Netscape, trying to displace Netscape Navigator with its own Internet Explorer. Each company has brought substantial competitive assets to bear. When Microsoft went on the attack, Netscape had a far superior product and a substantial installed base of satisfied users. Microsoft, however, had its brand name, a history of dominating one software application after another, control over the underlying operating system, and seemingly limitless financial resources at its disposal.

Let’s follow the steps laid out in the last few chapters.

The first step is to gauge the importance of positive feedback in the browser category. Are strong network externalities present for browser users? To date, we would say the network externalities are modest, not strong. First, there appears to be little by way of training needed for someone to effectively use a browser. Indeed, one of the attractions of the Netscape Navigator is that many people find it simple and intuitive. Nor do most users have any “data” in a Navigator-specific format. To the contrary, Navigator relies on HTML, which is quite open, and bookmark files are easily transferred between browsers. So individual switching costs are not large.

What about collective switching costs? Are there strong forces stopping any one user from employing an unpopular browser? Not yet. So far at least, either brand of browser can view the vast majority of Web pages with equal effectiveness. This is not universally true, so network externalities are present to some degree, but they remain small so far. Indeed, some observers have expressed concern that Microsoft will find a way to strengthen these network externalities, through control over software for servers, if and when it has a stronger position on the client/browser side. If Microsoft is able to get the majority of servers to display material in a superior fashion for Internet Explorer, strong positive feedback might kick in. However, the most popular product in the Internet server market is Apache, which enjoys a 47 percent market share and is completely open. Microsoft and Netscape servers have 22 percent and 10 percent of the market, respectively.

In fact, the relatively weak network externalities explain in part why the browser war has turned into trench warfare, not a blitzkrieg. Netscape’s position is hardly impenetrable, and Microsoft, especially given its offsetting advantages, could credibly stay in the game even with a market share of around 30 percent of shipments during 1997.

What are the market shares in the browser war, anyhow? An unusual but handy aspect of the browser market is that market shares can be measured in terms of usage rather than purchases of the product, since Web sites can determine the browser used by a visitor. For the purposes of assessing network externalities, usage is far more important than purchase: the “active” installed base is what matters. Products given away for free but not used just don’t matter. Recent data indicate that Netscape Navigator’s share of usage is 54 percent, with Microsoft’s Internet Explorer weighing in at 33 percent. (Cyberdog, for the Macintosh, is a distant third with around 5 percent of hits.)
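
How can a Web site tell which browser a visitor is running? Every HTTP request carries a User-Agent header identifying the browser, so usage shares can be tallied directly from server logs. The minimal sketch below, in Python, illustrates the idea; the log format and the matching rules are illustrative assumptions, and real user-agent strings are messier in practice.

    import re
    from collections import Counter

    def browser_family(user_agent):
        # Internet Explorer announces itself with "MSIE" inside a
        # "Mozilla/..." string; Navigator reports plain "Mozilla/...".
        if "MSIE" in user_agent:
            return "Internet Explorer"
        if user_agent.startswith("Mozilla/"):
            return "Netscape Navigator"
        return "Other"

    def usage_shares(log_lines):
        # Assumes combined-format log lines that end with a quoted
        # User-Agent field; real logs call for sturdier parsing.
        counts = Counter()
        for line in log_lines:
            match = re.search(r'"([^"]*)"\s*$', line)
            if match:
                counts[browser_family(match.group(1))] += 1
        total = sum(counts.values())
        if total == 0:
            return {}
        return {name: hits / total for name, hits in counts.items()}

    # Example: usage_shares(open("access.log")) might return
    # {"Netscape Navigator": 0.54, "Internet Explorer": 0.33, "Other": 0.13}

Counting unique visitors rather than raw hits would track the “active” installed base more faithfully, but the principle is the same.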

The browser wars involve rival evolutions. Consumers bear little of the costs of adopting either brand of browser: so far, at least, both browsers are compatible with existing hardware and software systems. If Microsoft were ever to design Windows to make Navigator incompatible with it, that move would convert the war into one of evolution versus revolution. We doubt this will happen, however, so long as the Justice Department remains alert.

Most of the action involves four of the tactics for waging a standards war that we have discussed above: (1) preemption, (2) penetration pricing, (3) expectations management, and (4) jockeying for allies. Let’s look at these in turn.

Preemption

Netscape enjoyed a big head start with Navigator, which was introduced in late 1994. Microsoft licensed the original Mosaic source code from Spyglass and rushed Internet Explorer to market. Microsoft’s haste showed, and Internet Explorer was widely regarded as a joke until Internet Explorer 3.0 was released in August 1996. By that time, many firms and individuals had already installed Netscape Navigator. With technology moving so rapidly, however, and in the absence of substantial consumer lock-in, an ongoing race developed to bring out new and improved versions ahead of the competition. As in other software categories, sales surge with the release of a new version, then drift until the cycle repeats itself.

Preemption and leapfrogging play out differently in different distribution channels. The primary channels are (1) direct distribution to consumers, either over the Internet or through retail outlets, (2) sales to OEMs for placement on new machines, and (3) indirect distribution through ISPs. Once a user has downloaded one browser, there is little reason to use another unless it offers superior functionality. OEMs can and do place multiple browser icons on the desktop to give their customers a choice when they turn on their new machine. In this channel, preemption can still occur if one browser supplier obtains exclusive rights to have its browser on that OEM’s desktop, or if the OEM is given an incentive not to load the rival browser on its machines. So far, browser software does not occupy so much disk storage space as to crowd out another browser, and antitrust oversight makes it risky for Microsoft to sign exclusive deals with OEMs.

Preemption is also possible through the ISP channel. Microsoft structured deals with America Online, CompuServe, Prodigy, AT&T Worldnet, Netcom, and MCI, among others, that made Internet Explorer the “preferred browser” for those ISPs. Since many consumers are inclined to follow the advice of their ISP in picking a browser, these bundled offerings can have a material effect on market shares. Precisely for this reason, the Justice Department has scrutinized Microsoft’s contracts with ISPs, and in early 1998 Microsoft modified these contracts to permit ISPs to promote rival browsers.

Penetration Pricing

Both Netscape and Microsoft are masters at penetration pricing, each in its own way.

Netscape led the way in making its software available free of charge over the Internet. As we saw in Chapter 4, one of the wonderful things about the Internet is that it can serve as an extremely efficient, low-cost means of distributing information products, be they content or tools such as software. So, even while Netscape sold Navigator in retail stores for $49, with printed documentation, the same software was generally available free on-line. Of course, many users who were new to the whole on-line world faced a chicken-and-egg problem: downloading Navigator is far easier if you already have a browser in hand.

Netscape also pioneered the idea of plug-ins: third-party software that enhances the functionality of the basic Navigator program. Netscape provided links to these publishers from its Web site to make it easy for users to customize their browsers. Making quality enhancements available free is a variant on penetration pricing. In this way, Netscape was able to build up a network of software developers tied to its technology.

For a while, Netscape tried to charge customers who were downloading Navigator. This attempt was half-hearted, however: Navigator 4.0 was available free for a trial period, after which Netscape requested that users pay if they wished to keep using the software. In early 1998, Netscape went way beyond simply offering Navigator for free. It released the source code for Navigator, so people can now both use it freely and modify it at will.

Microsoft’s first step was to make Internet Explorer available free on-line. This tactic made a lot of sense as part of Microsoft’s catch-up strategy. In fact, Microsoft has gone even further, actually paying OEMs and ISPs to give preference to Internet Explorer over Navigator, by making Internet Explorer the “default” browser. The company has also publicly stated that Explorer will be free “now and in the future,” an obvious attempt at expectations management.

Why are both companies willing to engage in such aggressive penetration pricing? These giveaways are taking a toll on Netscape: revenues from “client licenses” declined from more than half of Netscape’s revenues in 1996 to less than 40 percent in the second quarter of 1997. Our discussion in Chapter 2 raises one possibility: competition has driven the price of this information good down to marginal cost, which is tiny. But this explanation is incomplete. Clearly, each company sees longer-term strategic benefits from increased use of its browser. What are these benefits, and how do they relate to the giveaways? To answer that question, we have to follow the money: what revenues are at stake in this standards war?

Start with Netscape. The key is that placements of Navigator help Netscape earn revenues from its other products. For example, Netscape’s Web site is one of the most heavily accessed pieces of real estate on the Net, in large part because many of the 65 million users of Navigator have never changed the default home-page setting on their browsers. This gives Netscape a very attractive platform for advertising. Netscape is clearly moving to establish its Web site as a major “portal” to the Internet. This will bring Netscape more directly into competition with Yahoo! and Excite, while helping to wean it from its dependence on browser revenues.

Beyond that, Netscape recently released its push-media software, Netcaster, which is piggybacking on Netscape’s browser: as customers download the browser, they have the option of taking the whole package. The more people use Navigator and Netcaster, the more of users’ time and attention Netscape has to sell to its advertisers, and the more revenue Netscape can earn by selling space on Netcaster. Yahoo!, for example, recently announced that it will be paying $4.7 million for the rights to the Netscape Guide button. Not surprisingly, advertising revenues are a growing share of Netscape’s total revenues.

Netscape’s grander plan is to offer an entirely new user interface. The new Netscape Constellation is nothing less than a complete user environment, centered around the browser. Constellation thus can serve as a layer between users and the existing operating system, just as Windows initially was a layer between the user and the aging DOS. In addition, this user interface is a gateway to the Internet. Viewed this way, the browser wars are but a skirmish in the larger battle over users and gateways to the Internet, which is, of course, a major threat to Microsoft. Who cares about access to the desktop if someone else controls access to the Internet?

Microsoft’s motives are also based on augmenting revenue streams “adjacent” to the browser itself. As we noted above, Microsoft has publicly stated that it intends never to charge consumers for a stand-alone browser. Microsoft’s plan is to tie the browser into its operating system, replacing the Windows 95 user interface with an interface much more like that of today’s stand-alone browsers. Viewed in this way, it is easier to see why Microsoft is willing to invest so heavily in building up the installed base of Internet Explorer users: doing so will both ease the transition to Windows 98 and deny Netscape the chance to challenge Microsoft’s control over the user interface. Control of the user interface is enormously valuable because it gives Microsoft access to that most valuable item in the information age: human attention. Indeed, one of Microsoft’s weaknesses is that many people fear it will use its browser to somehow control on-line sales. These fears were fueled by a statement by Microsoft’s Nathan Myhrvold that Microsoft hoped to earn a “vig,” or vigorish, on every transaction over the Internet that uses Microsoft’s technology. However, testifying before Congress, Bill Gates denied that this was Microsoft’s goal.

Expectations Management

Netscape has stated recently that it plans to place its browser on as many as 100 million desktops. The company also announced that one hundred industry partners will package the Navigator browser with their products. Trumpeting grand plans for future sales, as well as extensive distribution agreements, is a classic method of building favorable expectations in the hope that they will be self-fulfilling. The very name of Netscape’s recent marketing campaign for Navigator says it all: “Netscape Everywhere.”

Microsoft is not pulling its punches, either, in its attempts to convince consumers that Internet Explorer is the browser of the future. Microsoft stated clearly and at an early stage that it planned to integrate Internet Explorer further into its Windows operating system. By doing so, Microsoft is simultaneously making it more difficult for any operating-system challenger to offer dramatic improvements in the user interface, guaranteeing enormous distribution for Internet Explorer, and making it harder for Netscape to convince consumers that they need Navigator.

Alliances

Allies are especially important to Netscape, given its small size and young age. Netscape and Sun Microsystems are strong allies, with Netscape supporting Sun’s Java and Sun helping lend credibility to Netscape. Arthur Andersen’s support has helped Netscape make big inroads into the corporate intranet market. Netscape has also made arrangements with publishers to distribute on-line material to Navigator users and with Internet service providers to offer Navigator to their customers.

As already noted, Microsoft has assembled its share of allies by offering attractive financial terms to content providers, ISPs, and OEMs. Indeed, even Microsoft’s 1997 investment in Apple was designed to promote Internet Explorer by increasing the distribution of the browser on Macintosh machines. Oddly, most of the press reports at the time missed this important aspect of the new Microsoft/Apple accommodation.

LESSONS

  • Understand what type of standards war you are waging. The single most important factor is the compatibility between the dueling new technologies and established products. Standards wars come in three forms: rival evolutions, rival revolutions, and revolution versus evolution.
  • Strength in the standards game is determined by ownership of seven critical assets. The key assets are (1) control of an installed base, (2) intellectual property rights, (3) ability to innovate, (4) first-mover advantages, (5) manufacturing abilities, (6) presence in complementary products, and (7) brand name and reputation.
  • Preemption is a critical tactic during a standards war. Rapid design cycles, early deals with pivotal customers, and penetration pricing are the building blocks of a preemption strategy.
  • Expectations management is also crucial to building positive feedback. Your goal is to convince customers and complementors that you will emerge as the victor; such expectations can easily become a self-fulfilling prophecy. To manage expectations, you should engage in aggressive marketing, make early announcements of new products, assemble allies, and make visible commitments to your technology.
  • When you’ve won your war, don’t rest easy. Cater to your installed base and avoid complacency. Don’t let the desire for backward compatibility hobble your ability to improve your product; doing so will leave you open to an entrant employing a revolution strategy. Commoditize complementary products to make your systems more attractive to consumers.
  • If you fall behind, avoid survival pricing. A better tactic is to interconnect with the prevailing standard using converters and adapters.