8 Cooperation and Compatibility

Armed with an understanding of how positive feedback works and informed by historical precedent, we are now ready to explore in depth the different strategies for competing in network markets. This chapter focuses on the openness strategies, open migration and discontinuity, which are fundamentally based on cooperation with allies. The next chapter focuses on the control strategies, controlled migration and performance play, in the context of a battle between incompatible technologies.

Strategy in network markets is distinct from strategy in markets for information content, not to mention traditional industrial markets. Figuring out early on who your allies are and who your enemies are is especially important in network markets because of the winner-take-all nature of these markets. Do you really want an “open” standard? Do others? Which allies do you need to win, and how can you most effectively attract them? Can you assemble allies to launch your technology successfully while keeping some control over how it evolves? Should you fight a standards war or seek an early truce? And what should you do if you have a declining market share in a network industry? We will systematically look into these questions in the pages that follow.

Many commentators have likened cyberspace to the Wild West, where old patterns of behavior no longer apply and everything is up for grabs. Perhaps, but the lone cowboy approach rarely works in the information age. Network economics and positive feedback make cooperation more important than ever. Most companies need to cooperate with others to establish standards and create a single network of compatible users. But as soon as the ink is dry on the standards agreement, these same companies shift gears and compete head to head for their share of that network. The term coopetition captures the tension between cooperation and competition prevalent in network industries. When distinct components have to work together as a system, the paramount strategic questions involve cooperation and coordination: with whom should you cooperate, how broadly, and under what terms?

HOW STANDARDS CHANGE THE GAME

As you map out your strategy in the face of positive feedback and network effects, you will need to identify your natural allies early in the game. This can be a difficult process, since there are no clear battle lines in network markets. For example, you cannot take it on faith that the other market participants truly want to establish a standard. Rather, an incumbent supplier may prefer to see a new technology die from lack of standardization, hoping to prolong its profits from older technology. We doubt that Microsoft had much interest in seeing a unified Unix standard, or a unified Java standard for that matter, since these technologies pose far more of a threat to Microsoft than an opportunity. Beware of companies participating in the standard-setting process, formally or informally, that deep down have no interest in seeing a successful standard emerge.

Even if your allies all welcome a standard, they may disagree over how extensive or detailed that standard should be. As we saw, a big, if late-breaking, issue in the HDTV standards process was whether the standard would include specifications regarding scanning formats and line resolution. The scope of the standard is also under attack in DVD, with an apparent breakdown regarding the “write” part of the standard. The major players in the DVD industry have agreed to a “read” standard under the pressure of the content providers, who naturally prefer to provide content in a standardized format. But the content providers don’t care about write standards. If anything, they would be happy to see incompatible standards, since it would make piracy more difficult. Without the harmonizing pressure from the content providers, the DVD manufacturers have succumbed to their natural instinct to use their own proprietary write formats.

To figure out who really wants a standard, and who doesn’t, you need to envision how the market is likely to evolve with or without an agreed-upon standard. Standards alter the very nature of competition in several important ways.

Expanded Network Externalities

First and foremost, standards enhance compatibility, or interoperability, generating greater value for users by making the network larger. To illustrate, consider format standards for information media, such as the VHS standard for videotapes or the 3½″ standard for computer disks. These standards fuel beneficial network externalities in two ways. First, and most directly, the standard makes it possible to share information with a larger network (without the need to convert the data from one format to another). Second, and indirectly, the enhanced ability to share data attracts still more consumers using this format, further expanding the available network externalities. This analysis applies equally to real communications networks, like fax machines and ATM networks, and to virtual networks, such as users of compatible computer software or compatible disk drives. Either way, the larger network is a real boon to consumers.
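The arithmetic behind these externalities can be made concrete with a stylized calculation. The sketch below is ours, not the authors’; it assumes, Metcalfe-style, that a network’s value is proportional to the number of user-to-user connections it enables, and it compares one compatible network with two incompatible camps of equal size.

```python
# Stylized model (an assumption, not from the text): value a network by the
# number of ordered user-to-user connections it makes possible.

def network_value(users: int) -> int:
    """Metcalfe-style value proxy: each user can reach every other user."""
    return users * (users - 1)

# One compatible standard with 100 users versus two incompatible camps of 50:
unified = network_value(100)
split = 2 * network_value(50)

print(unified)          # 9900
print(split)            # 4900
print(unified / split)  # ~2.0: one standard roughly doubles the externalities
```

Under this (admittedly crude) assumption, merging two equal-sized incompatible networks roughly doubles the total value available to users, which is the indirect feedback loop the paragraph describes: more sharing attracts more users, which enables still more sharing.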

If you ever lose sight of this basic tenet of network markets—that compatibility creates substantial consumer benefits—think of the Baltimore fire of 1904: when firemen from neighboring towns arrived to help fight the fire, many of their fire hoses did not fit into Baltimore’s hydrants. The information age equivalent occurs when your wireless telephone fails to work in an incompatible distant PCS system, or when you cannot plug in your laptop or download your e-mail in a foreign country.

Reduced Uncertainty

Standards reduce the technology risk faced by consumers. This, too, accelerates acceptance of a new technology. A standard with many backers can go far to bolster the credibility of the technology, which then becomes self-fulfilling. In contrast, with incompatible products, consumer confusion and fear of stranding may delay adoption. Consumer confusion helped kill AM stereo radio a decade ago. More recently, the growth of the market for 56k modems was retarded until modem manufacturers could agree on a common standard.

We have stressed the importance of expectations as a driver of positive feedback in network markets: confidence breeds success, while doubt spells doom. One of the risks in a standards war is that the battle to win market share will undermine consumer confidence that either technology will prevail, resulting in a war with no victor. As each side strives to convince consumers that it will be the winner, consumers may take the easy way out and sit on the sidelines, especially if a serviceable older technology is already available and standardized. The same fate can easily befall a single new technology that lacks the support of sufficient market participants to become a standard.

Reduced Consumer Lock-In

If the standard is truly open, consumers will be less concerned about lock-in. They can count on future competition. This has worked nicely for CDs, where holders of patents essential to the CD standard, including Sony, Philips, and DiscoVision Associates, have charged only modest royalties. Likewise, consumers expected competition on the PC platform, owing to IBM’s open approach. And competition they got—among hardware providers, that is—but not among operating systems, which came to be dominated by Microsoft.

Netscape is now touting the open nature of its product line to convince users that they will not be locked into a proprietary solution. Indeed, in June 1997 it even offered an “Open Standards Guarantee” on its Web site, and in early 1998 Netscape published the source code to its browser, Navigator. Even mighty Microsoft has been forced to move toward open standards such as XML in order to reassure its clientele that they will be able to exchange data with other users.

Competition for the Market versus Competition in the Market

Precisely because standards reduce lock-in, they shift the locus of competition from an early battle for dominance to a later battle for market share. Instead of competing for the market, companies compete within the market, using the common standards. Aggressive penetration pricing is far less likely under a common standard, but so is lock-in. One of the worst outcomes for consumers is to buy into a standard that is widely expected to be open, only to find it “hijacked” later, after they are collectively locked in. Motorola has been accused of precisely this tactic in promoting standards for public safety radio equipment and modems.

Dow Jones recently renegotiated contracts with firms that distributed quotes on the Dow Jones Industrial Average (DJIA), proposing to charge $1 per month per user for real-time quotes and 25 cents a month for quotes delayed by twenty minutes. (Note the versioned prices.) Dow Jones waited to announce these new charges until after a derivative securities market started that was based on the DJIA. The company argued that the new derivative securities made its quotes more valuable, but some providers of on-line financial services certainly felt that a formerly open standard had been slammed shut.

Competition on Price versus Features

Standards shift competition away from features and toward price, for the simple reason that many features are common across all brands. How many? This depends on how specific the standard is: the more detailed the standard, the harder it is for each producer to differentiate its product and still comply with the standard.

So, while a more extensive standard leads to fewer compatibility problems and stronger network externalities, it can also reduce the ability of each supplier to differentiate its products, thereby intensifying price competition. For this very reason, consumers tend to seek more extensive standards than do suppliers.

It follows that rival manufacturers may all be better off living with some incompatibilities and with a smaller total market in order to deemphasize pricing competition and focus competition more on product features.
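The logic of the preceding paragraphs can be sketched numerically. The toy model below is ours, not the authors’: it treats a fully detailed standard as making rival products interchangeable, so that firms undercut one another until price falls to cost (Bertrand-style price competition), whereas differentiation lets a firm keep a margin. Prices are in integer cents to keep the arithmetic exact; all numbers are illustrative.

```python
# Hypothetical illustration of the trade-off described above: commoditized
# price competition versus differentiated margins. Not from the text.

COST_CENTS = 1000  # unit cost: $10.00 (assumed)

def undercut_to_cost(start_cents: int, cost_cents: int, step: int = 1) -> int:
    """Identical products: each rival shaves the price by `step` cents to
    capture the whole market, until no profitable undercut remains."""
    price = start_cents
    while price - step > cost_cents:
        price -= step
    return price

# Fully standardized: the margin is competed away to a single cent.
standardized_price = undercut_to_cost(2000, COST_CENTS)

# Differentiated: a firm with loyal buyers keeps a margin ($5.00, assumed).
differentiated_price = COST_CENTS + 500

print(standardized_price)    # 1001 (one cent above cost)
print(differentiated_price)  # 1500
```

The sketch makes the suppliers’ incentive visible: even if incompatibilities shrink the total market somewhat, each firm may earn more at the differentiated price than at a commoditized price near cost.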

Competition to Offer Proprietary Extensions

Over time, there are strong incentives for suppliers to differentiate themselves by developing proprietary extensions, while still maintaining some degree of backward compatibility. This is one reason why hardware and software incompatibilities tend to crop up even on the relatively standardized PC platform. Competition to extend a standard can certainly be a boon to consumers, as new features are designed in a highly competitive race to offer improvements. But the resulting incompatibilities can be a major source of irritation.

Both the fruits and the frustrations of competition to extend a standard can be forestalled by an owner of proprietary rights who uses those rights to control the evolution of the technology. We described in Chapter 7 how a firm sponsoring an industry standard can control its evolution. Successful sponsors can commoditize certain components of the system, while making sure that network externalities are not lost over time owing to incompatibilities. Of course, the sponsor will seek to capture profits for itself. This is what Sony and Philips did, both by charging royalties to manufacturers of CD players and disks and by limiting the manufacture of certain enhanced CD players (such as players of interactive and high-density CDs). Sony and Philips decided that it was worth forgoing these improvements, which might have spurred sales of both players and disks, to avoid unfavorable publicity surrounding incompatibilities and thus preserve consumer confidence in the integrity of the standard.

Intel is taking a similar approach with the PC platform. Intel Labs is playing a major role in developing interfaces and standards such as “plug and play” and the “accelerated graphics port,” then making them available to component manufacturers. Of all the players on the hardware side of the PC world, Intel has the greatest interest in seeing that components interconnect smoothly and perform well. The faster, cheaper, and easier to use the components are, the more demand there is for Intel CPUs.

Component versus Systems Competition

Standards shift the locus of competition from systems to components. When Nintendo competes against Sega, consumers compare the Nintendo system of hardware and available software with the Sega system. The firm that can offer the superior total package stands to win. Compare this with audio and video equipment (stereo systems, televisions, and VCRs), where the various components are (by and large) compatible. A company can do well making the best or cheapest television, even if it sells no VCRs. Similarly, a different company can profit by selling stereo speakers, even if it makes no receivers or CD players. The same is true for PCs: HP has a very profitable printer business, even though its computer sales are modest. Sony has done well selling monitors, with essentially no presence in the PC box business, at least in the United States.

And so it goes. Specialists tend to thrive in the mix-and-match environment created by interface standards. Generalists and systems integrators tend to thrive in the absence of compatibility.

WHO WINS AND WHO LOSES FROM STANDARDS?

We have seen how standards change the nature of the game; here we examine how they affect the players.

Consumers

Consumers generally welcome standards: they are spared having to pick a winner and face the risk of being stranded. They can enjoy the greatest network externalities in a single network or in networks that seamlessly interconnect. They can mix and match components to suit their tastes. And they are far less likely to become locked into a single vendor, unless a strong leader retains control over the technology or wrests control in the future through proprietary extensions or intellectual property rights.

Standardization does have some downsides for consumers, however. The main one is a loss of variety: the standard may be poorly suited to some customers’ needs, or it may just turn out to be an inferior technology, like QWERTY. Standardization can also deprive consumers of the benefits of aggressive penetration pricing during a standards battle. This loss is most likely to be significant for large or influential users that can play a pivotal role in the battle, like the large ISPs in the browser battle between Microsoft and Netscape. For consumers as a whole, however, penetration pricing is largely a down payment on future lock-in, so this factor should be of secondary concern.

Standards that “don’t quite work” are the bane of customers. It used to be that you were never quite sure exactly which video cards would work with which sound cards; your PC maker added value by making sure that the components in the system you ordered all worked together. Nowadays, pretty much all PC hardware works together because of efforts by Intel and Microsoft to promulgate industry standards. This has been great for Intel and Microsoft but has partially commoditized the PC OEM business, in which competition is increasingly based on being the low-cost producer and distributor.

We’re at the same point in software standards now that we were with PC hardware standards a decade ago—you’re never quite sure what works together. The problem is that there isn’t an industry player with enough clout to coordinate independent suppliers’ efforts. Microsoft, naturally enough, pushes its own solutions; Sun, Oracle, and Netscape are trying to build an industry alliance around a different set of solutions, but seamless component integration just isn’t here yet.

Complementors

Like consumers, sellers of complements welcome standards, so long as their products comply with the standard. AOL sells Internet access, a complement to modems. AOL benefits from the use of standardized, high-speed modems in that AOL itself does not need to maintain separate banks of modems with different formats. More important, the demand for on-line services is stimulated when modem sales rise as a result of standards. In fact, influential complementors can affect the choice of a standard, just as influential consumers can. For example, content providers such as studios have been influential in the development of each generation of consumer electronics equipment.

The markets for audio and video entertainment illustrate just who the complementors are. Recording studios and retail music stores are complementors to music CDs and therefore beneficiaries of the CD standard. Phonograph manufacturers, on the other hand, offered a product that was a direct competitor to CD players. The CD was a grave threat to these companies; they had to learn to make CD players, a very different business from making phonographs, or go out of business.

In the case of the emerging DVD standard, content providers such as movie studios and software houses offer a complement to the new disks and stand to benefit from the new standard. Now it is makers of videocassette players that are in danger, since DVD players promise eventually to make VCRs obsolete. The impact of DVD on a distributor like Blockbuster is not as clear: as a distributor of video content, Blockbuster sells a complement to the DVD technology and stands to gain as higher-quality video images (with improved sound) become available. However, precisely because of the flexibility that DVD disks will allow, they are well suited to new channels of distribution, threatening to devalue Blockbuster’s network of retail locations.

Incumbents

Product standards for new technologies can pose a grave threat to established incumbents. After all, if standards fuel the positive feedback cycle and help launch a new technology, they can easily cannibalize sales from an older technology. RCA, the leading maker of black-and-white television sets during the 1940s, was not eager to see a color television standard established that would challenge its leadership. Atari was none too happy when Nintendo managed to get positive feedback working for the Nintendo Entertainment System back in the mid-1980s.

Incumbents have three choices. First, an incumbent can try to deny backward compatibility to would-be entrants with new technology in the hope of blockading entry altogether, thereby extending the life of its own technology. This is what AT&T tried to do in the 1960s and 1970s when faced with demands that it permit various equipment, such as telephone handsets and PBXs, to interconnect with the AT&T system. Regulatory rules forced AT&T to open up its network to interconnection, first with equipment and later with other carriers, most notably MCI.

Second, an incumbent can rush to introduce its own new generation of equipment, perhaps with the unique advantage of backward compatibility, to win a standards war. This is what Atari did (unsuccessfully) when faced with Nintendo’s entry into the U.S. video game market in the mid-1980s. Atari’s second-generation equipment, the Atari 7800, could play games written for Atari’s dominant first-generation system, the Atari 2600. Unfortunately for Atari, these older games held little appeal for a new generation of boys entranced by the superior games available on Nintendo’s system.

Finally, an incumbent can ally itself with the new technology, hoping to benefit from its established brand name, an expanded market, and perhaps from royalty and technology licensing income. This is what Sony and Philips have done in the transition from CDs to DVDs.

An incumbent with little to offer to the new generation of technology, offensively or defensively, will have a greater interest in sabotaging new standards than in promoting them. Sun is learning this lesson the hard way in its battle with Microsoft over Java.

Innovators

Companies developing new technology collectively tend to welcome standards, because standards typically expand the total size of the market and may even be vital for the emergence of the market in the first place. Whenever a group of innovators collectively benefit from a standard, there is always some way for them to structure an agreement in support of that standard. For precisely this reason, we see literally hundreds of standards introduced each year.

Smart cards offer a good example. These are plastic cards containing a small computer chip that can store 500 times the data of a magnetic strip card. Banks are keen to see smart cards take off because they will be able to use this technology to offer a far greater range of value-added services to their customers. Digital money can be downloaded into a smart card, enhancing the value of on-line banking. And smart cards will enable banks to capture more transaction volume from cash, especially small transactions for which credit cards are too expensive. For all of these reasons, Visa and MasterCard are working to establish a smart card standard that will allow smart cards offered by different suppliers to work in the same card readers.

When a group of innovators collectively benefit from setting a standard, but the standard impacts them in very different ways, a complex negotiation ensues. Standards tend to have markedly different effects on different suppliers based on their underlying assets. Companies with a large installed base have the most to lose, while companies controlling far-superior technology have the most to gain. Size is important as well; as we have already noted, small players may especially welcome a standard, since standards tend to level the playing field between big and small suppliers. We explore standards negotiations below when we discuss how to build an alliance in support of a new standard.

FORMAL STANDARD SETTING

Most standard setting takes place through formal standard-setting processes established by various standards bodies. Never before have such cooperative, political processes been so important to market competition.

There are hundreds of official standard-setting bodies throughout the world. Some, like Underwriters Laboratories (UL), which sets safety standards, are household names. Others, like the International Telecommunication Union (ITU), seem far removed from everyday experience but exert significant, behind-the-scenes influence. Some are independent professional organizations, like the Institute of Electrical and Electronics Engineers (IEEE); others are government bodies, like the National Institute of Standards and Technology (NIST). About the only thing they have in common is their reliance on acronyms. And these are only the official standard-setting bodies. On top of these, we have any number of unofficial groups haggling over product specifications, as well as various special interest groups that offer forums for the exchange of information about product specs. For example, there are thirty-six such groups operating under the auspices of the Association for Computing Machinery (ACM) alone, including SIGART (artificial intelligence), SIGCOMM (data communications), SIGGRAPH (computer graphics), and SIGIR (information retrieval).

Participants often complain about the formal standard-setting process: it is too slow, it is too political, it doesn’t pick the “best” technology, and so on. But history proves that the consensus process of formal standard setting is time and again critical to launching new technologies. The telecommunications industry, for example, has relied on the ITU to set international standards, starting with the telegraph in the 1860s, through radio in the 1920s, to a panoply of standards today: from the assignment of telephone numbers, to protection against interference, to data protocols for multimedia conferencing. Whether you consider formal standard setting a necessary evil or a godsend, it is here to stay.

Formal standard setting is designed to be open to all participants and to foster consensus. This sounds good, but it often results in a very slow process. The HDTV story is one example: it took roughly ten years to set a technical standard for digital television in the United States, and HDTV has yet to be adopted there on a commercial scale.

A fundamental principle underlying the consensus approach to standards is that they should be “open,” with no single firm or small group of firms controlling the standard. Thus, a quid pro quo for having one’s technology adopted in a formal standard is a commitment to license any patents essential to implementing the standard on “fair, reasonable, and nondiscriminatory” terms. Note that this duty does not extend to nonessential patents, which can lead to an amusing dance in which companies claim that their patents merely cover valuable enhancements to the standard and are not actually essential to complying with the standard.

The openness promise of a formal standards body is a powerful tool for establishing credibility. However, be aware that most standards bodies have no enforcement authority. Aggrieved parties must resort to the courts, including the court of public opinion, if they feel the process has been abused.

In the late nineteenth and early twentieth centuries, as part of the industrial revolution, formal standard setting focused on traditional manufacturing standards, such as those needed for interchangeable parts and mass production. As the twentieth century closes, the information revolution has shifted more and more formal standard setting into the high-tech and information areas.

TACTICS IN FORMAL STANDARD SETTING

If you are involved in setting a formal standard, it is important to determine your goal at the outset. If your goal is to quickly establish a standard incorporating your proprietary technology, you had better not rely on formal standard setting. It’s wise to participate, but you should be following a market-oriented track in parallel. If most network externalities occur at the national level, you can likely avoid the entanglements of the global standard-setting organizations. If you are not too picky about the actual standard, but want to make sure that no private entity controls the chosen standard, ANSI and ITU rules are well suited to your objectives. Very often, the most important rule is simply to show up at standard-setting meetings to make sure a “consensus” adverse to your interests does not form. Smaller companies sometimes find attendance burdensome, allowing larger firms to steer the process to their advantage. If you cannot spare someone to attend, consider teaming up with other small players whose interests are aligned with yours to send a representative.

Formal standard setting often involves a dance in which companies negotiate based on quite different strengths. In setting the standard for 28.8k modems, for example, AT&T, British Telecom, and Motorola brought their patents to the table, Hayes and U.S. Robotics brought strong brand names to the table, and Rockwell brought its chipset manufacturing skills to the table, as they all negotiated the terms on which each company could manufacture these modems. Multiple patent holders jockeyed to get their patents built into the standard to ensure royalty income and to gain time-to-market advantages.

To navigate in this type of environment, you are well advised to gather information about the objectives of the other participants. This intelligence and analysis can be enormously helpful in targeting common interests, allies, and potential compromises. For example, if you can ascertain who is in a rush and who stands to gain from delay, you will be able to play the standards “game” far better.

Once you have assessed the strengths and objectives of the other players, you should apply the following principles of strategic standard setting:

Don’t automatically participate. If you can follow a control strategy or organize an alliance outside the formal standard-setting process, you may be far better off: you can move more quickly, you can retain more control over the technology and the process, you will not be bound by any formal consensus process, and you need not commit to openly licensing any controlling patents. For example, Motorola did not participate in the ITU T.30 recommendation for facsimile equipment and later sought royalties from manufacturers of that equipment. This generated some ill will, since Motorola had previously agreed to license this same technology on reasonable terms for modems as part of the V.29 modem standard-setting process, but nonparticipation also generated significant royalty income for Motorola. To cite another example, the Federal Trade Commission sued Dell Computer over Dell’s attempt to collect royalties on patents essential to the VESA bus standard, after Dell had previously represented that it held no such patents. In its defense, Dell asserted that it was unaware at the time that it held any such patents, but the case makes clear that participation brings with it real responsibilities.

Keep up your momentum. Don’t freeze your activities during the slow standard-setting process. Actively prosecute any patent applications you have pending, keep up your R&D efforts, and prepare to start manufacturing. Remember how CBS was caught flat-footed, not ready to manufacture sets even after the FCC picked CBS’s color TV standard.

Look for logrolling opportunities. The term logrolling refers to the trading of votes by legislators to obtain passage of favorable legislation. Logrolling has always been a part of the political process. The standard-setting process is a wild mix of politics and economics, including explicit side payments and side deals. Typically, such deals include agreements to incorporate pieces of technology from different players, as was done for HDTV in the United States and for modems at the ITU. Side deals can also involve agreements between those who have intellectual property rights (aka the “IPR club”) such as patents to share those patents on a royalty-free basis, while imposing royalties on participants who are not members of the club. Whatever deals you offer to attract allies, make them selectively to the stronger players. But be sure to abide by the rules of engagement, including any nondiscrimination rules. Form or join an alliance, and make sure the other members do not defect.

Be creative about cutting deals. Figure out what key assets you bring to the table, and use those to assemble a coalition or to extract favorable terms when you pick sides. Consider low-cost licensing, second sourcing, hybrid standards, grantbacks of improvement patents, and commitments to participate in future joint development efforts. Whatever cards you have in your hand, play them when you are most likely to make a pivotal difference. Don’t confine your deal making to the technology or product in question; think broadly of ways to structure deals that are mutually beneficial.

Beware of vague promises. The formal standard-setting process has a great deal of momentum. Don’t count on vague promises of openness made early on; these may evaporate once a standard is effectively locked in. In the ITU, for example, individual companies are expected to support whatever position the State Department takes on behalf of the United States, since the department consults first with the industry. As a result, companies lose the ability to stop or steer the process once national positions are set; to do so would be regarded as treason. For just this reason, make sure early on that holders of key patents are explicit about their commitment to license for “reasonable” royalties. Reasonable should mean the royalties that the patent holder could obtain in open, up-front competition with other technologies, not the royalties that the patent holder can extract once other participants are effectively locked in to use technology covered by the patent. This is like the medieval concept of the “just price”: the just price of a horse was the price that would prevail at the open market at the annual fair, not the price that could be extracted from a traveler in desperate need of a horse.

Search carefully for blocking patents. Beware of picking a standard that will require using a patent held by a company not participating in the standard-setting process. Suppose a standard is selected, production begun, and positive feedback is achieved. Then a company that did not participate in the standard-setting process suddenly appears and asserts that everyone complying with the standard is infringing on a patent held by that company. Remember, a nonparticipating patent holder is not required to license its patents on fair and reasonable terms. This is the nightmare of every participant, since the interloper can potentially control the entire market the participants have built. You cannot fully protect yourself from this contingency, but any technology not clearly in the public domain, or controlled by participants, should be thoroughly searched. Note that our advice to search for blocking patents is the flip side of our suggestion that some companies not participate in the process but instead seek to pursue a control strategy by establishing a proprietary standard with the aim of collecting substantial royalty payments.

Consider building an installed base preemptively. This is risky, and not always possible, but it can strengthen your bargaining position. Establishing manufacturing sources and building an installed base are akin to moving your troops into a stronger position while negotiating for peace. You might undermine the peace process and your efforts may go to waste, but flanking maneuvers are one way to kick-start a slow negotiation. U.S. Robotics/3Com and Rockwell/Lucent each marketed their modems actively, even while they negotiated under ITU auspices for the 56k modem standard. In this case, both camps offered free upgrades to the final ITU standard. The same thing happened in the previous generation of 28.8k modems. Rockwell offered “V.FC” (“fast class”) modems in advance of the V.34 ITU standard, but then had to face infringement claims from Motorola. Among other things, Motorola asserted that its commitment to license patents essential to the V.34 standard did not apply until after the V.34 standard was formally in place.

BUILDING ALLIANCES

Whether you are participating in a formal standard-setting process or simply trying to build momentum behind your product, you need allies to ignite positive feedback. This requires identifying the companies that are your natural allies and then negotiating to obtain their support for your technology.

As you seek to build an alliance in support of a new standard, you should keep firmly in mind the competitive advantages you aim to retain for yourself. Promising sources of advantage include a time-to-market advantage, a manufacturing cost advantage, a brand-name advantage, and/or an edge in developing improvements. One or all of these competitive advantages can persist, even if the technology is freely available to all so you are barred from asserting IPRs to exclude competition. We’ve seen companies fight tooth and nail to have their technology included in a standard, even if they anticipate little or no royalty income as a result. In the HDTV story, Zenith’s stock price surged after key components of its technology were selected for inclusion in the HDTV standard, even though Zenith had already agreed to extensive cross-licenses with General Instrument and others in the HDTV technical competition.

Assembling Allies

Look broadly for allies. Your allies can include your customers, your suppliers, your rivals, and the makers of complementary products. For each potential ally, try to figure out how your proposed standard will affect their fortunes, using the framework we developed earlier in this chapter for predicting how standards will alter competition.

What will it take to attract each ally? When is the opportune time to make an offer? Building a coalition is very much a political process. It is critical to understand both the concerns and the options of your potential partners to design a deal that will appeal to them.

Pivotal or influential customers should get special deals. For example, when Microsoft introduced Internet Explorer, it signed a deal with Dow Jones, giving Explorer users free access to the Wall Street Journal, a complementary product. As we mentioned in Chapter 3, many digital cameras are bundled with a stripped-down version of Adobe’s Photoshop. The camera or scanner doesn’t have big network externalities or switching costs, but Photoshop certainly does. It is a powerful and complex piece of software that has a wide following in the industry. Adobe has done a marvelous job of creating software that is easy to use out of the box and yet powerful enough to whet the consumer’s appetite for the full-fledged version.

DigiMarc, initiator of the digital watermarking system described in Chapter 4, has partnered with providers of image manipulation software such as Adobe, Corel, and Micrografx, allowing them to include a low-end version of the DigiMarc system with their products in an attempt to get the bandwagon rolling for the DigiMarc standard.

It is tempting to offer very good deals to the early signers in an effort to get the bandwagon rolling. But if these deals cannot be extended to their competitors, you may have a hard time attracting other partners in the same industry because they would find themselves in an untenable competitive position. If you set a 10 percent royalty for the first firms to adopt your technology, it will be hard to get the later signers to accept a 20 percent royalty because they will find it difficult to compete with their lower-cost rivals. This is what happened to DiscoVision Associates, a company controlling key patents for the encoding and manufacturing of compact disks: after signing up a number of licensees on attractive terms early in the lifetime of the CD technology, DiscoVision was unable to raise its royalty rates to new licensees who had to compete in the low-margin CD replication business, even though the CD standard was by then well established.

A better strategy is to offer the early birds a temporary price break on the royalty. This gives them an incentive to climb on board, but it doesn’t preclude higher rates for latecomers. One way to structure royalties to achieve this end is to offer a discounted royalty rate up to a certain cumulative output level, after which the royalty reverts to the “standard” rate. This is the opposite of the popular royalty structure in which rates decline with volume. Our proposed structure reduces the risks faced by early licensees, gives an advantage to early allies, and preserves more options for the licenser in the future.
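The early-bird royalty structure described here can be sketched numerically. The rates, threshold, and unit price below are hypothetical, chosen only to show how an early licensee’s effective rate converges toward the standard rate as its volume grows:

```python
def royalty_due(units, intro_rate=0.10, standard_rate=0.20,
                discount_threshold=1_000_000, price=5.0):
    """Total royalty owed by an early licensee: a discounted rate
    applies up to a cumulative output threshold, after which the
    rate reverts to the standard rate (all figures hypothetical)."""
    discounted_units = min(units, discount_threshold)
    full_rate_units = max(units - discount_threshold, 0)
    return price * (discounted_units * intro_rate +
                    full_rate_units * standard_rate)

# The early licensee's effective rate rises toward the standard rate
# as cumulative volume grows, so latecomers paying the standard rate
# are not permanently undercut by their lower-cost rivals.
for units in (500_000, 1_000_000, 4_000_000):
    total = royalty_due(units)
    effective_rate = total / (units * 5.0)
    print(units, round(effective_rate, 3))
```

Under these illustrative numbers, the early licensee pays an effective rate of 10 percent at low volumes but 17.5 percent by four million units, approaching the 20 percent charged to latecomers.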

Don’t forget to address the question of who will bear the risk of failure if the bandwagon collapses. Will your partners be left holding the bag? In general, the costs of collapse should end up falling on those who are best positioned to prevent such a collapse and on those who can most easily absorb any unavoidable risk. Normally, both of these factors point to the larger firms, but not always. If smaller firms are in a better position to seek bankruptcy protection, it may be that they are better placed to absorb a lot of risk. Of course, in this case it’s the creditors of the bankrupt firms that end up holding the bag.

One clever approach is to shift some risk to a really big player, such as the government or a regulated monopolist. As we noted earlier, smart cards have not had much success in the United States but have done well in Europe. One reason is that the European state telephone monopolies mandated smart cards for pay phones. This was enough to build a critical mass for that technology. Other vendors felt comfortable adopting the technology, figuring that the government would support the system if necessary to prevent its failure.

There is nothing dishonorable about piggybacking on government efforts to establish a new standard. The U.S. Congress has mandated that U.S. benefit payments must be electronic by January 1, 1999. Smart cards may well play a role in such electronic benefit transfers, so the new government rules could significantly aid smart card deployment in the United States. Effectively, a very large and well-heeled customer is making a commitment to smart cards.

How much do you need allies? We discussed this in Chapter 7 when we compared the openness and control strategies. We identified three key assets that govern your ability to ignite positive feedback: existing market position, technical capabilities, and control over intellectual property rights. The stronger your position in terms of these three critical assets, the less important are allies and the more easily you can play those allies off against each other. In the mid-1980s, Nintendo had a distinctly superior system, strong copyright and patent protection for that system, and a solid installed base in Japan with which to attract game developers. Thus, Nintendo could afford to charge game developers for the right to put their games on the Nintendo system. No individual game created by these developers was crucial to Nintendo, but access to the Nintendo installed base was soon critical to each of them.

Be careful of building an alliance consisting of companies with very different interests; such unions can prove unwieldy. In consumer electronics, equipment manufacturers and content providers often come to loggerheads because they have very different interests regarding copying. The current standards war surrounding “write” technology for DVDs, mentioned earlier in this chapter, illustrates the problem.

Interconnection among Allies

We have emphasized that today’s virtual networks of compatible users have much in common with the more familiar transportation and communications networks. We can exploit these similarities to learn from the experience of alliances in the more traditional network industries. Just as Apple agonized over the terms on which it permitted Macintosh clones to be built, flip-flopping several times, so too did railroads, telephone companies, and broadcast networks ponder interconnection terms in their day.

For as long as there have been networks, there has been interconnection: passengers or cargo brought by one network to its extremities are carried farther along by an adjacent network. National postal services developed interconnection procedures centuries ago, while telephone systems figured out interconnection roughly one hundred years ago. Airlines and railroads regularly exchange traffic. Over the years, smaller carriers have regularly complained about the terms on which larger carriers would interconnect with them. This issue is beginning to surface on the Internet, and it is endemic to the virtual networks that populate the information economy.

We can all learn much from historical interconnection agreements. While the technology underlying the Internet is new, the economic issues surrounding interconnection are not. Networks that deliver messages or physical goods typically involve four parties: the sender, the sender’s carrier, the recipient, and the recipient’s carrier. (More parties are involved if intermediate carriers handle the traffic; only three parties are involved if one carrier takes the message from end to end.) When you send a letter from the United States to France to your friend Jean, the four parties are you, the U.S. postal service, the French postal service, and Jean. (FedEx and DHL speed things up by routing traffic entirely over their own proprietary networks, reducing the transaction to three parties.) The same pattern applies to the Internet, with different carriers. Many of the economic issues surrounding interconnection that apply to the Internet today have been present in postal systems for centuries: how should payments be split between sender and recipient, and what “carrier-to-carrier” charges apply? In our example, who pays for the letter, you or Jean, and what payment, if any, must the U.S. postal service make to the French postal service as compensation for its delivery of the message to Jean?

Postal services have been dealing with these problems for centuries. Mail services arose more than two thousand years ago, initially to serve kings and emperors. Religious orders and universities also set up their own systems, with relay stations, and eventually permitted private individuals to send messages using their systems. Opening the systems to private customers was a way of spreading the fixed costs of these messenger systems over more users. Charges were based on the type of message, its size, and the distance traveled, and were generally paid by the recipient, not the sender. (In an unreliable system, both incentives and risk are better managed by making the recipient pay for the delivery of the message.)

Interconnection issues arose when one postal system sought to hand off mail to another for delivery. Bilateral agreements between European countries were negotiated in the seventeenth century to govern interconnection. By the nineteenth century, most large European countries were party to at least a dozen of these treaties, requiring that multiple detailed accounts be kept. This complex and costly system was finally replaced in 1874 by the Treaty of Berne, which led to the Universal Postal Union, now a part of the United Nations. Then, as now, a multilateral agreement and a centralized clearinghouse greatly reduced interconnection costs among “end-to-end” networks.

Interconnection became more strategic once networks began to compete against each other over the same routes: side-by-side networks rather than end-to-end networks. For as long as there have been competing networks, these networks have used interconnection terms and conditions to gain competitive advantage. For decades, U.S. telephone companies have been paying outrageous fees to foreign state-run telecommunications monopolies for completion of outbound calls in foreign countries. As we saw in Chapter 7, early in this century AT&T used its control over the long-distance telephone network to consolidate control over the majority of local telephone service in the United States.

All of these practices have their equivalents in computer and information networks, virtual or real. Take the Apple Mac network. Apple limited “access” to its network by refusing to license independent manufacturers, so-called clones, until roughly a decade after the introduction of the Mac. Apple did not aggressively seek to establish the largest network or to connect with the PC network using adapters, the virtual equivalent of interlining. Rather, Apple was content at the outset to have a cool product with a loyal following in the education and graphics markets. But niche strategies are inherently dangerous in markets with strong network externalities. Apple’s strategy was akin to having a specialty fax network for the design and publishing industries, based on superior image resolution and color capabilities. This is fine until the makers of mass-market fax machines learn to match you in performance, and then you’re dead before you know what hit you. To get on the right side of the positive-feedback curve requires a strategy based on broad appeal, along with a broad, compatible product line. Only the impressive performance of the Macintosh and the technological sluggishness of Microsoft in matching the ease of use of the Macintosh have allowed Apple to survive as long as it has with its niche strategy.

In the presence of strong network externalities, interconnection and network access strategies can make the difference between achieving critical mass and floundering. It is all too easy to try to retain tight control over your network by refusing to license critical technology or by manipulating interface specifications to disadvantage rival suppliers, only to find that this strategy has backfired by steering customers and suppliers to rival networks. In hindsight, this is where Sony went wrong with VCRs: it lost out to Matsushita’s open licensing program. Today, many industry observers believe that Apple went wrong in personal computers by refusing to license its hardware and software, thereby losing out to IBM and its clones.

In assembling allies, we advise you to offer interconnection or compatibility, but on terms that reflect your underlying strength, and with limitations to reduce the risk that you will lose control over the network with time. Java gives us a sobering example of the dangers of losing control. Sun was eager to license Java to as many producers as possible and was even happy to offer a license to its fiercest competitor, Microsoft. But Microsoft cleverly retained the right to “improve” Java in the licensing agreement. Microsoft then proceeded to add its own “improvements” that worked only in the Windows environment! Microsoft critics called this a foul attempt to fragment a standard; Microsoft itself says it is only trying to give customers better performance. It’s likely that both positions are correct—but it’s still a big headache for Sun.

Negotiating a Truce

In standard setting as in diplomacy, alliances form between potential combatants as a means of preventing war, not merely to solidify common interests. In both situations, the alliance arising out of a negotiated truce can be a lifesaver, even though it is an uneasy union. We’ll discuss standards wars in the next chapter; here we consider the rewards and perils of negotiating a truce to avoid such a war.

If you control one of two incompatible technologies competing for market ascendancy, you may well be better off negotiating a truce than fighting a costly and lengthy standards war. Ideally, these negotiations will take place not through the slow, formal standard-setting process but by fashioning a creative agreement between your camp and the rival camp.

A standards truce should be possible if both sides can make more money in peaceful coexistence than in a standards war. If cooperation increases the players’ joint profits, there should be a way to structure a truce to make both sides happy. (Usually, such deals do not run afoul of antitrust laws; we’ll consider the legal limits on standard setting in Chapter 10.)

There is plenty of reason to think a truce will normally lead to higher profits. Basically, if the total value created by the technology is enhanced by standardization, suppliers and customers should be able to divide up this value. If the pie is larger, everyone should be able to get a bigger piece, including consumers. But the hard part comes in dividing up the enlarged pie. This is where the standard-setting tactics listed above enter into the picture: picking a hybrid technology, licensing and cross-licensing, most-favored customer terms, commitments to openness, and so on.

As in any truce negotiations, both sides need to determine how they would fare if war were to break out. Based on the assets held by the two companies or coalitions, the negotiations can take one of three basic forms: (1) an inevitable standards war, (2) a game of chicken in which each side tries to assert its own technology over the other but will concede rather than fight, or (3) an unbalanced game between a strong team that would rather fight and a weak team that would rather negotiate a truce. These three possibilities are shown in Table 8.1.

First, it may be that both sides would rather fight than join forces. That is, they would rather compete to set their own standard than agree on a common standard. This happens when consumers put a high value on variety as well as network externalities, when price competition to sell a standardized product would erode profit margins, and when each side is confident it will win the war. The force of the “not invented here” syndrome should not be underestimated. If both key players would rather compete than agree on a standard, a standards battle is inevitable. Each team should start lining up allies for the fight and moving troops into position. See Chapter 9 about tactics you can use to wage—and win—a standards war.

Table 8.1. The Standards Game

[Table 8.1 graphic not reproduced; it lays out the three negotiation scenarios, based on whether each side would rather fight or accept a common standard, as described in the text that follows.]

The second possibility is that each side would prefer to establish its own technology as a standard but is prepared to accept the other’s technology as a standard rather than waging a ruinous winner-take-all battle. That is, each side prefers its own technology but would rather switch than fight. In this case, the standards negotiations are like a game of chicken: each side will try to convince the other that it is the more stubborn of the two. War may come, but the two sides are better off cutting a deal.

In the third scenario, one player is strong and confident of winning a standards battle. This player would prefer to compete with incompatible products. The other side is weak, and knows it. The weak player would like to adopt the strong player’s technology to ensure compatibility and reduce or neutralize its disadvantages. The stronger firm may be able to prevent the weaker firm(s) from achieving full compatibility, either by asserting intellectual property rights or by changing interfaces frequently. In this third case, there will be a predictable dynamic in which the strong team tries to limit access to its network or at least charge for interconnection or compatibility. See Chapter 9 for advice on how to play the two roles in this game, those of the strong and the weak companies.
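The second scenario, the game of chicken, can be made concrete with a small payoff matrix. The numbers below are hypothetical; only their ordering matters. Each side most prefers that its own technology become the standard, accepts the rival’s technology as second best, and dreads an all-out standards war:

```python
# Hypothetical payoffs (A, B) for each pair of strategies.
# "fight" = push your own incompatible technology;
# "accept" = adopt the rival's technology as the common standard.
payoffs = {
    ("fight", "fight"):   (1, 1),   # standards war: worst for both
    ("fight", "accept"):  (4, 2),   # A's technology becomes the standard
    ("accept", "fight"):  (2, 4),   # B's technology becomes the standard
    ("accept", "accept"): (0, 0),   # incoherent: neither pushes a standard
}

def best_response(player, rival_move):
    """Return the move that maximizes this player's payoff,
    taking the rival's move as given."""
    idx = 0 if player == "A" else 1
    def payoff(move):
        key = (move, rival_move) if player == "A" else (rival_move, move)
        return payoffs[key][idx]
    return max(["fight", "accept"], key=payoff)

# Each side wants to fight only if the other will back down --
# the defining structure of a game of chicken:
print(best_response("A", "accept"))  # fight
print(best_response("A", "fight"))   # accept
```

In this structure each side tries to convince the other that it is the stubborn one, since whoever credibly commits to “fight” induces the rival to accept.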

As with any negotiation, stubborn players can erode or destroy the gains from trade. Our advice: don’t be proud. Be prepared to join, even with a bitter rival, to establish a standard if it helps you both. Of course, you need to stay on your guard in dealing with a direct rival. Will the standard picked give your rival an edge? Is the proposed standard really neutral now and in the future? Above all, remember that maximizing your return does not mean maximizing your control over the technology. As we said in Chapter 7:

Your reward = Total value added to industry
× your share of industry value

Avoiding a standards battle will increase the value to all firms operating in the industry if consumer confusion, fear of stranding, and lack of consensus would otherwise have stalled the technology. The critical issue you face is how much of that extra value you will be able to appropriate.
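The reward identity above can be illustrated with hypothetical figures showing why openness can pay even when it shrinks your share: a common standard enlarges the pie faster than it dilutes your slice.

```python
# All figures are hypothetical, purely for illustration.

def reward(industry_value, share):
    """Your reward = total industry value x your share of that value."""
    return industry_value * share

# Control strategy: you keep most of a small, stalled market.
control = reward(100, 0.70)   # standards war stalls adoption

# Openness strategy: the common standard ignites adoption,
# but allies claim much of the enlarged value.
open_standard = reward(500, 0.25)

print(control)        # 70.0
print(open_standard)  # 125.0
```

With these numbers, a 25 percent slice of the enlarged pie beats a 70 percent slice of the stalled one; the critical question is how much of the extra value you can appropriate.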

The imperative to find common ground, and the fact that savvy companies can get past their differences and cooperate to enable new technologies, can be seen in the dealings between Microsoft and Netscape. Much has been made of the browser wars between Netscape and Microsoft, which we discuss in some detail below. But focus for a moment on the spheres in which these two implacable enemies have agreed to follow a common standard.

First consider the problem of protecting privacy on the Internet. Consumer fears over loss of confidential information are clearly a drag on on-line commerce, to the detriment of both Microsoft and Netscape. Netscape took the first step, proposing the Open Profiling Standard (OPS) along with the Firefly Network and VeriSign. The OPS employs profiles that enable personal computer users to control the information about themselves that is disclosed to a particular Web site. To get things rolling, Netscape lined up about forty other companies in support of the standard, including IBM and Sun Microsystems as well as some on-line publishers. Microsoft was conspicuously absent from the coalition. For a brief time, it looked like the two arch-rivals would promote different standards for privacy software. But they quickly avoided this mutually destructive approach. Just weeks after Netscape had made its move, Microsoft announced its support in June 1997 for the Netscape-sponsored standard. This standard will now become part of the Platform for Privacy Preferences (P3) being developed by the World Wide Web Consortium.

Neither company was proud, but both were cautious. Netscape has a policy of not inviting Microsoft into its standard-setting efforts too early, for fear of giving Microsoft the opportunity to use the process to gain a proprietary advantage. According to Mike Homer, Netscape’s vice president for marketing, “Nobody tells Microsoft of these things if they want to gain a broad consensus.” For its part, Microsoft stated that it would have supported the OPS earlier had Netscape agreed to share its specifications at that time.

A second arena in which Microsoft and Netscape were able to cooperate involved 3-D on the Internet. In August 1997, they agreed to support compatible versions of Virtual Reality Modeling Language (VRML), a 3-D viewing technology, in their browsers. Again, Microsoft was pragmatic rather than proud, adopting a language invented at Silicon Graphics. There is no doubt compatibility will create a larger pie to split: VRML had been slow to gain acceptance, both because it was embedded in incompatible browsers and because consumers had to download plug-in software for displaying the graphics. Problems still remain—3-D files are large and slow to download—but at least consumers will not have to worry whether their browser will work at a particular Web site. Both Navigator 4.0 and Internet Explorer 4.0 now contain VRML capability.

A third example of Microsoft and Netscape teaming up involves security for on-line transactions. In February 1996, Visa and MasterCard announced the Secure Electronic Transactions (SET) standard. SET was a method of protecting the security of electronic payments by encrypting credit card numbers sent to on-line merchants. It was backed not only by Visa and MasterCard but also by Microsoft, Netscape, and IBM.

That Visa and MasterCard could cooperate is less surprising on its face than the joint endorsement of Microsoft and Netscape: Visa and MasterCard are both controlled by roughly the same set of banks, and they cooperate extensively to route transactions between their two merchant acceptance and cardholder networks. But, again, Microsoft and Netscape were smart enough to figure out how not to compete, at least on this dimension. Such a dispute would undoubtedly delay widespread Internet commerce and work to the detriment of both firms as well as consumers.

The path to peace was rocky. Back in June 1995, MasterCard and Visa had said they would coordinate. But by the fall of 1995 a standards war was brewing: Microsoft and Visa proposed what they called Secure Transaction Technology, while MasterCard, Intuit, IBM, and Netscape pushed for a system called Secure Courier. The Microsoft/Visa proposal was touted by them as “open”—that is, available to any company—but the underlying computer software needed to create actual products was only to be made available through licenses from Visa or Microsoft. When it became clear that this wouldn’t fly, the companies capitulated and settled on a truly open standard.

Alliances in Action

XEROX AND ETHERNET. The story of the Ethernet standard shows how you can use a formal standards body to establish credibility. Bob Metcalfe developed Ethernet at Xerox PARC in the mid-1970s as a way to send vast amounts of data at high speed to the laser printers that Xerox was designing. Xerox patented Ethernet, and Metcalfe left PARC to start 3Com, a company dedicated to networking products.

His first client was Digital, which asked him to develop a new high-speed network standard that didn’t infringe on Xerox’s patents and that Digital could use to network its workstations. Metcalfe suggested that Digital talk to Xerox first; why reinvent the wheel if Xerox would license it on attractive terms?

Xerox realized, quite correctly, that it would have to offer an open networking standard to get computer manufacturers to adopt the Ethernet interface for their printers. If that same standard could be used for connecting computers, so much the better. Digital, Xerox, and 3Com recognized the value of having an open standard, and Metcalfe went to the National Bureau of Standards to try to set the process in motion. While there, he ran into an Intel representative who was looking for new technologies to embed in integrated circuits.

Digital, Intel, and Xerox subsequently recognized their common interest and formed the DIX group, named after the first letters of their names. (Metcalfe says it was spelled DI3X, but the 3 is silent.) The coalition convinced the IEEE, a highly respected and neutral industrywide organization, to adopt Ethernet as an open standard, subject to the usual “fair and reasonable” licensing terms, and Xerox agreed to license Ethernet to all takers at a nominal $1,000 flat fee. Adoption by the IEEE did much to create self-fulfilling expectations that Ethernet would emerge as the accepted industry standard.

A few years later, IBM made its Token Ring an open standard on similar terms, but by that time Ethernet had such a large installed base that IBM wasn’t able to catch up. Ethernet became the LAN standard because the DIX group recognized the value of openness from the beginning.

ADOBE POSTSCRIPT. Adobe PostScript is another wonderful example of opening up to establish a standard. Xerox had an earlier page description language called Interpress that it kept proprietary. Interpress ran only on Xerox hardware, dooming it to a small market share. John Warnock, the leader of the Interpress team, left Xerox to create PostScript. He realized that PostScript would succeed only if it was open, so Adobe publicly announced that it was not restricting other uses of its page description language: anyone could write and market a PostScript interpreter. Adobe asserted no intellectual property rights to the language itself. Several vendors took Adobe up on the offer, and today there are multiple suppliers of PostScript interpreters, including GhostScript, a free PostScript interpreter from the GNU Project.

How did Adobe profit from this alliance strategy? Adobe was already far down the learning curve, and it managed to keep a few tricks to itself, including “font hints,” which made Adobe PostScript look better on low-resolution devices. The strategy worked well. PostScript became a standard, and Adobe maintained a leading position in the page-description industry and managed to leverage this position in several complementary products in the publishing field.

Several years later Adobe managed to follow a similar strategy with its portable document format (PDF). The company allowed PDF to become an open standard but cleverly exploited the complementarities between creating and viewing a document. Adobe charged for the PDF creation software, while giving away the viewing software.

MICROSOFT’S ACTIVEX. A more recent example of giving away a technology is Microsoft’s ActiveX protocols, which allow programs on one computer to communicate with programs on another remote machine. Microsoft did not just say it would make ActiveX open, it actually gave responsibility for managing ActiveX to the Open Group, an independent industry group. ActiveX is competing with a rival technology called CORBA, a much more sophisticated, cross-platform technology backed by nearly everyone else in the industry.

Microsoft reportedly spent more than $100 million to develop ActiveX and yet was willing to give it away, at least in part. Microsoft rightly recognized that the relevant question was not how much it cost to develop the technology but rather how much it would cost if Microsoft kept it proprietary. In that case, CORBA would be the only open standard for object calls, and Microsoft could find itself with an orphaned technology and stranded customers. Sunk costs are sunk—it is future costs that matter. But note that Microsoft will continue to make and sell its own enhancements to ActiveX, provided they meet the specifications that will be managed by the Open Group.

A key issue in ceding control of ActiveX is Microsoft’s reputation. According to the Wall Street Journal, “In the past, some software developers were hurt when Microsoft unexpectedly changed key specifications for the technologies it controlled, including its mainstay, the Windows operating system. On occasion, Microsoft has also been able to get a head start on rivals by exploiting new technologies it has developed.”1 Merely announcing that ActiveX would be open would not be enough to convince people to use it—Microsoft actually had to give up some control of the system to make its claims credible.

Assigning control of a standard to a “neutral” industry group has its own dangers: who will invest in the technology, and how will the standard be improved over time? A modern version of the “tragedy of the commons” can be the sad result: just as few took the trouble to protect the common grazing land from overuse in the seventeenth century, few today will make major investments to advance a technology that is in the public domain. Indeed, for just this reason, an article in Byte magazine reported that Microsoft has in fact retained effective control over ActiveX/COM, just as Sun has retained effective control over Java: “Both leading object technologies—and the Java environment—are now controlled by single vendors. Our industry has finally learned a crucial lesson: Technologies controlled by slow-moving standards bodies can’t keep up with rapidly changing markets.”2

MANAGING OPEN STANDARDS

What happens once an open standard is accepted and successful?

Managing successful open standards can be especially tricky. Truly open standards face two fundamental threats. First, if there is no clear sponsor, who will be in charge of setting the direction in which the standard will evolve? Will the standard stagnate, or will crippling incompatibilities arise, since no one can exert control? Second, without a sponsor, who will invest the resources to make improvements and thus keep the standard from stagnating? Who will be willing to invest in the installed base by pricing below cost—penetration pricing—if that is needed to stave off a threat?

Open standards are prone to “splintering,” or “fragmentation.” Splintering of a standard refers to the emergence of multiple, incompatible versions of a standardized technology.

The classic example of the perils of managing open standards, and the dangers of splintering, is the story of the Unix operating system. Unix was originally developed at Bell Labs as a research tool. AT&T gave away the source code to academic researchers for many years, and it became a standard in the research community.

When the minicomputer market took off in the 1970s, Unix was modified and sold by many different companies; the workstation boom of the 1980s led to still more versions of Unix, and no industry standard was established. Seeking to differentiate their products, add value, and make improvements, several hardware vendors, including IBM, Sun, Hewlett-Packard, Silicon Graphics, and Novell, created their own flavors of Unix. None of them wanted to wait for formal approval of their improvements and thereby lose both a timing and a differentiation advantage.

Beginning in the mid-1980s there were efforts to agree on a standard, but these were hampered by infighting among hardware and software vendors. Even the growing and common threat of Windows NT was not sufficient to create harmony among the various Unix vendors in the early 1990s.

In March 1993 the major suppliers of Unix attempted yet again to adopt a common approach that would make it possible for Unix applications to look and work the same on different computers. This alliance consisted of several major players in the Unix industry, including Sun Microsystems, Novell, Santa Cruz Operation, IBM, and Hewlett-Packard. HP and IBM in particular had been direct rivals of Sun and had not generally collaborated on software matters with Sun. The threat posed by Windows NT helped spur these rivals to try to cooperate.

In June 1993, Novell tried to take on a leadership role in the Unix world by acquiring Unix System Laboratories from AT&T in a stock swap valued at about $320 million. Later that year, Novell freely gave away the Unix trademark. Novell’s plan was to give the Unix name away to the X/Open Company, a London-based consortium of fourteen hardware and software companies founded in 1985 to promote standardized approaches to Unix. The idea was to let any company call its product Unix as long as it met the X/Open specifications.

How did Novell plan to gain from this? Novell continued marketing its own flavor of Unix, UnixWare, hoping that X/Open would give Unix new momentum and that UnixWare would get a decent share of a growing Unix market. Novell’s plan ran into snags, however, as IBM, HP, Sun, and Santa Cruz Operation expressed concerns that Novell was attempting to make UnixWare the de facto Unix standard. They asserted that UnixWare was an inferior version of Unix. Meanwhile, Windows NT continues to make inroads in markets that were once the exclusive province of Unix.

Open standards can also be “hijacked” by companies seeking to extend them in proprietary directions, and thus in time gain control over the installed base. Microsoft has been accused of trying to extend both Java and HTML in proprietary directions.

The Standard Generalized Markup Language (SGML) is an open standard for storing and managing documents. Its best-known instance is HyperText Markup Language (HTML), but SGML goes far beyond HTML in its capabilities. SGML’s development was pushed by the Department of Defense and other large agencies for whom multiple formats of documents are a huge headache. Despite its claim of being a lingua franca for documents, SGML has never taken off because no large firm has emerged to champion it. Recently there has been some excitement about the Extensible Markup Language (XML), which is a subset of SGML. The danger, of course, is that XML will splinter in the same way Unix did, with multiple dialects being promulgated.

Sun faces this problem with Java. Sun’s competitors and complementors would like to see Java open. However, Sun has been reluctant to give up control over the development of Java, fearful that without a champion, Java could fragment. This puts Sun in a difficult position with other players in the Java coalition.

A final warning on alliances: they can collapse, too. You need to worry not only about forming them but also about keeping them together. The Unix example of splintering is one way in which an alliance can come apart, but not the only way. The “grand alliance” of HDTV offers a good example of a shaky alliance; television manufacturers, broadcast networks, computer manufacturers, and software firms are all sniping at each other about various extensions of the original agreement. Many broadcasters, for example, are planning to use their new spectrum space to deliver multiple channels using digital signals, not to send HDTV signals. Set manufacturers, hoping to sell lots of pricey HDTV sets, are understandably distressed at this prospect. The cable TV networks, which were not involved in the original negotiations, are yet another wild card. They, too, are planning to use digital compression technology to offer more lower-quality channels rather than fewer high-quality channels.

LESSONS

We can distill from this chapter a number of lessons useful to any company participating in an alliance in support of a compatibility standard:

  • To compete effectively in network markets, you need allies. Choosing and attracting allies is a critical aspect of strategy in the network economy. Competition thus becomes a mixture of politics and economics. You must assemble allies to promote a standard and then compete against these same companies once the standard is established.
  • To find your natural allies, you must determine how a proposed standard will affect competition. Standards alter competition in several predictable ways. Standards expand network externalities, reduce uncertainty, and reduce consumer lock-in. Standards also shift competition from a winner-take-all battle to a more conventional struggle for market share, from the present into the future, from features to prices, and from systems to components.
  • Standards tend to benefit consumers and suppliers of complements at the expense of incumbents and sellers of substitutes. Look for your allies among the groups that will benefit from a standard. Then be creative in finding ways to split the enlarged pie that results from a successful standard.
  • Formal standard setting is now being used to develop more standards than ever before. Formal standard setting is slow, but it can give a new technology enormous credibility. Several key tactics will make you more effective in the formal standard-setting process. Don’t slow down your competitive efforts just because you are engaged in formal standard setting. Look for opportunities to build alliances by cutting creative deals, such as licensing arrangements, with selected participants of the standard-setting effort. Beware of companies that hold key patents and are not participating in the process.
  • Find your natural allies and negotiate to gain their support for your technology. Allies can include customers, complementors, suppliers, and competitors. Be prepared to offer special deals to early supporters; with positive feedback, a few visible early supporters can be enough to tip expectations in your favor, making it easier to attract more allies over time.
  • Before you engage in a standards battle, try to negotiate a truce and form an alliance with your would-be rival. An agreed-upon standard may lead to a far bigger overall market, making for a larger pie that you can split with your partners. Don’t be proud; be prepared to cut a deal even with your most bitter enemy.
  • Try to retain limited control over your technology even when establishing an open standard. Without a champion, open standards can stagnate or splinter into incompatible pieces. Allies may be happy to let you guide the future evolution of the standard, so long as you have a lasting commitment to openness.