Chapter 20

Market Analysts Morphing

At the end of this chapter, we present the Annual Shareholder Letter that Jeffrey P. Bezos, CEO of Amazon, sent to investors. It starts with “Random forests, naïve Bayesian estimators, RESTful services, gossip protocols, eventual consistency, data sharding, anti-entropy, Byzantine quorum, erasure coding, vector clocks …”

Huh? This is from a retailer?

Amazon has good reason to talk technology and related math and science. It is the largest online retailer.1 Its Kindle, eBook, and Amazon Web Services businesses account for about 10 percent of its revenues and are among its fastest-growing product lines. Nonetheless, stop and ask: How many Wall Street analysts can relate to the language that Bezos used?

The Geeks on Wall Street

Wall Street spends plenty on technology. Much has been written about the “quants,” especially during the market meltdown of 2008. Scott Patterson of the Wall Street Journal, who wrote a book on the topic, says about them:

By the early 2000s, such tech-savvy investors had come to dominate Wall Street, helped by theoretical breakthroughs in the application of mathematics to financial markets, advances that had earned their discoverers several shelves of Nobel Prizes.2

Yet, even as Wall Street invests massively in its own technologies, its analysis of the technology in the industries it follows has not kept pace.

Early in 2011, Amazon stock took a hammering even as the company announced robust growth—net income grew an impressive 28 percent over the year ended December 31, 2010, but it also reported lower margins. Evan Schuman, who covers the retail sector, wrote, “It's really about Wall Street's difficulty in understanding IT investments.”3 He went on to add, “… its actions with Amazon a few days ago really raise serious doubts about whether any Wall Street trader should be allowed to have a driver's license.”

Amazon's sin was that its capital expenditures (capex) grew from around $373 million in 2009 to $979 million in 2010. As a percentage of revenues, capex increased from 1.5 percent in 2009 to 2.9 percent in 2010. Schuman's exasperation stemmed from Wall Street's failure to understand that the increase was driven by investment in fulfillment centers, which grew from 39 to 52 (a 33 percent increase) in 2010. Additionally, the company was adding technology infrastructure capacity to support its fast-growing cloud computing business and Kindle manufacturing.

Amazon prompted another session of analyst soul-searching after a significant outage in its cloud services in April 2011. The blueGecko blog pointed to widespread ignorance about cloud computing, using an article in the Wall Street Journal as an example. The blog wrote:

Here the authors seem to be referring to EC2 availability zones. As most who have worked even a little with EC2 know, when you run an instance or store volumes in one availability zone, there is no automatic mechanism available to “reroute capacity” between availability zones.4

Financial Analysts and Domain Knowledge

Dr. Paul Kedrosky is an investor, author, entrepreneur, and a technology analyst for CNBC. On his blog, Infectious Greed, and in presentations at various industry events, he covers a wide range of technology and financial topics.

His perspective on financial analysts keeping up with technology:

Wall Street's sell-side (those that work for investment banks and publish much of the stock research we see) has a very low competency hurdle when it comes to domain expertise. As long as your results are nontoxic (that is, you make money for the firm), and your clients and trading desk don't hate you, that's all that matters.

There are some very good analysts with deep domain expertise who make terrible picks, and there are very bad analysts with no/errant domain expertise who make good picks. Further, the financial side of the Street more than trumps the industry side, so being able to geek out about thin-film poly-Si is much less important than having a developed cash flow model from 2011–2015.

On the buy side, things are very different. Long- and short-side analysts from fundamentally driven firms have fantastically smart analysts who generally know a great deal about domains. The reason, of course, is that there is direct feedback to them from their investments. They buy and sell stuff, and they are held accountable for real portfolios, so they have to be able to bridge industry, financial, and picking expertise.

None of this is to say that there aren't sell-side analysts capable of same—because there are many of them. It's just that the incentive structure doesn't encourage that kind of specialization (nor does it actively discourage it), with the result being that it's a coin-flip what sort of expertise you run across on the sell-side of Wall Street when it comes to highly technical domains.

A fund manager goes even further:

What is a sell-side researcher's job? It's not to know the most about all aspects of the company (even though we would like to think so). It's to know enough about what moves the stock and also how to build a reputation for conveying that information to clients who will pay for their perspective.

And even on the buy-side, knowing too much about a particular company or sector can IMPAIR your investment returns. Sometimes it's easier to really zero in on “what matters” in terms of a company and its stock price when it's greenfield territory, versus something where you have all the nooks and crannies figured out.

So, financial analysts end up balancing “knowing too much” against the increasingly technical talk they are presented with. Mark Little, Senior Vice President of GE Global Research, uses terms like biomimetics in presentations to financial analysts. Biomimetics is the discipline of drawing engineering inspiration from nature, as in GE modeling moisture repellents on the lotus leaf. As we saw in the UPS case study in Chapter 1, its CEO tells investors “we're about half a transportation company, half a technology company.”

As we saw in Chapter 2, Tony Prophet of HP explains to financial analysts the gory details of its global technology supply chain. Then in Chapter 18, we saw David Meline, CFO of 3M, present to Wall Street a slide on 3M's “periodic table” that summarizes 46 different areas in which it offers technology products.

The more companies talk that way to investors, the more tech-savvy Wall Street analysts will have to become. In the end, behind all that technology are financial numbers. As Bezos points out in his letter: “Now, if the eyes of some shareowners dutifully reading this letter are by this point glazing over, I will awaken you by pointing out that, in my opinion, these techniques are not idly pursued—they lead directly to free cash flow.”

The Evolving Industry Analyst

Gartner, the technology research firm (as distinct from Wall Street analysts), extensively covered the Amazon outage mentioned earlier and has reports for its clients with titles such as “Protecting Sensitive Data in Amazon EC2 Deployments.”

But search the Gartner database for 3M and you get very few hits. In Chapter 18 we saw 3M has products in 46 different technology platforms. Gartner is the largest technology research firm. What gives?

The reality is that the industry analyst marketplace is itself morphing. Over the course of the book we have mentioned firms like Redmonk, Horses for Sources, iSuppli, and other specialized analyst firms. In the Lexmark Genesis story in Chapter 16, we saw how Lexmark targeted technology blogs for its product launch, rather than traditional technology media and analysts.

Evangelos Simoudis, a Senior Managing Director at the venture capital firm Trident Capital, says, “We are a Forrester customer and were a Gartner customer. When it comes to emerging areas for our investments like the social Internet, we use firms like Altimeter Group (which has several Forrester alumni), as well as independent analysts like Esteban Kolsky (who is ex-Gartner), who we see as better ‘plugged in’ to the ecosystem and the market's dynamics. We find that when it comes to business executives driving a decision around emerging technologies the larger, traditional analyst firms are still lacking the market's pulse.”

Chris Selland, a former analyst at Yankee Group in the 1990s and later at Aberdeen Group (both industry analyst firms), says:

During the '90s, there was a tremendous thirst among corporate executives for better understanding of how they could leverage technology and particularly this newfangled “World Wide Web” thing.

During the past 10 years that's been changing dramatically. The typical executive today has a much better understanding of technology, and much less motivation or desire to read a 100-page report that takes a team of analysts six months to develop.

Today's best “analysts”—those who deliver the most true insight and perspective—are almost entirely individuals, not big brand-name firms. Many models for confederating analysts and influencers are being brought to market. I participate in two confederations of independents: the Enterprise Irregulars, which is a loosely connected discussion group for many [former big-firm analysts (including the author of the book)], and Focus Research, which has applied a community and Q&A model to the market.

Frank Scavo of Computer Economics, introduced in Chapter 18, says:

At first glance, the IT analyst business would appear to be consolidating, with Gartner, Forrester, and IDC having acquired many of the smaller firms over the past decade. Beneath the surface, however, there has been an explosion of independent analysts and small analyst firms with a laser-like focus on certain markets and industries.

I first started blogging as the Enterprise System Spectator in 2002, as an experiment to see whether blogging would be useful in conjunction with my consulting work at Strativa, a management consulting firm I co-founded in 2000. At the time there were only a handful of tech bloggers, mostly consumer-tech-oriented blogs, such as Engadget, and only a few like me focused on enterprise IT. In fact, in 2004, I was voted as one of the top 10 independent tech bloggers. That just goes to show you how small the field was back then.

Today, there are hundreds if not thousands of tech bloggers and small analyst firms that provide an independent voice and counterweight to the traditional analyst firms. My early experiment in blogging got me involved as a contributing analyst to Computer Economics and our eventual acquisition of that 30-year-old IT research firm. It also put me in contact with many like-minded independent analysts around the world, with whom I collaborate. In 2010, some of us joined together to form Constellation Research, under the leadership of R. “Ray” Wang, a former Forrester analyst.

So, on the one hand, there may be only a few major analyst firms left standing. On the other hand, there are many new sources of expertise. Some of the best insights are coming from many voices that are enabled by blogging and other social media, though it does take some effort on the part of buyers to find them.

Conclusion

As the technology elite we profile in the book, such as Amazon, 3M, HP, and UPS, pioneer ways of communicating their own technology prowess to financial and industry analysts, they raise the bar for those analysts. They also raise it for peer companies they compete with or are benchmarked against. In the next section we present the complete text of a groundbreaking letter that Bezos sent to his shareholders.

Case Study: Amazon 2010 Shareholder Letter

Below we present the complete text of a letter Jeffrey P. Bezos, CEO of Amazon, sent to his shareholders, and by proxy to financial and industry analysts. It is groundbreaking in that it does not hesitate to use technology and math terms you would typically hear at a university lecture or in a scientific forum. The language raises the bar for financial and industry analysts who track Amazon. It also raises expectations for Amazon's peers and competitors to be as sophisticated in their use of technology and in their external communications about it.

To our shareowners:

Random forests, naïve Bayesian estimators, RESTful services, gossip protocols, eventual consistency, data sharding, anti-entropy, Byzantine quorum, erasure coding, vector clocks … walk into certain Amazon meetings, and you may momentarily think you've stumbled into a computer science lecture.

Look inside a current textbook on software architecture, and you'll find few patterns that we don't apply at Amazon. We use high-performance transactions systems, complex rendering and object caching, workflow and queuing systems, business intelligence and data analytics, machine learning and pattern recognition, neural networks and probabilistic decision making, and a wide variety of other techniques. And while many of our systems are based on the latest in computer science research, this often hasn't been sufficient: our architects and engineers have had to advance research in directions that no academic had yet taken. Many of the problems we face have no textbook solutions, and so we—happily—invent new approaches.

Our technologies are almost exclusively implemented as services: bits of logic that encapsulate the data they operate on and provide hardened interfaces as the only way to access their functionality. This approach reduces side effects and allows services to evolve at their own pace without impacting the other components of the overall system. Service-oriented architecture—or SOA—is the fundamental building abstraction for Amazon technologies. Thanks to a thoughtful and far-sighted team of engineers and architects, this approach was applied at Amazon long before SOA became a buzzword in the industry. Our e-commerce platform is composed of a federation of hundreds of software services that work in concert to deliver functionality ranging from recommendations to order fulfillment to inventory tracking. For example, to construct a product detail page for a customer visiting Amazon.com, our software calls on between 200 and 300 services to present a highly personalized experience for that customer.
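[To make the service-composition idea concrete for readers, here is a deliberately miniature sketch. Amazon's real platform federates hundreds of networked services; in this toy, each “service” is a plain function that owns its own data and exposes a single hardened call. All names and data are hypothetical.]

```python
# Each "service" encapsulates its data; callers can only use its interface.

def recommendations_service(customer_id: str) -> list[str]:
    catalog = {"c1": ["Kindle case", "USB cable"]}   # hypothetical data
    return catalog.get(customer_id, [])

def inventory_service(product_id: str) -> int:
    stock = {"B00123": 17}                           # hypothetical data
    return stock.get(product_id, 0)

def pricing_service(product_id: str) -> float:
    prices = {"B00123": 139.00}                      # hypothetical data
    return prices.get(product_id, 0.0)

def detail_page(customer_id: str, product_id: str) -> dict:
    # The page is composed from independent services; none shares state
    # with another, so each can evolve at its own pace.
    return {
        "price": pricing_service(product_id),
        "in_stock": inventory_service(product_id) > 0,
        "recommended": recommendations_service(customer_id),
    }
```

[The point of the pattern is that `detail_page` knows nothing about how prices or stock are stored; swapping a service's internals never touches its callers.]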

State management is the heart of any system that needs to grow to a very large size. Many years ago, Amazon's requirements reached a point where many of our systems could no longer be served by any commercial solution: our key data services store many petabytes of data and handle millions of requests per second. To meet these demanding and unusual requirements, we've developed several alternative, purpose-built persistence solutions, including our own key-value store and single table store. To do so, we've leaned heavily on the core principles from the distributed systems and database research communities and invented from there. The storage systems we've pioneered demonstrate extreme scalability while maintaining tight control over performance, availability, and cost. To achieve their ultra-scale properties these systems take a novel approach to data update management: by relaxing the synchronization requirements of updates that need to be disseminated to large numbers of replicas, these systems are able to survive under the harshest performance and availability conditions. These implementations are based on the concept of eventual consistency. The advances in data management developed by Amazon engineers have been the starting point for the architectures underneath the cloud storage and data management services offered by Amazon Web Services (AWS). For example, our Simple Storage Service, Elastic Block Store, and SimpleDB all derive their basic architecture from unique Amazon technologies.
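[One of the terms doing heavy lifting in that paragraph is “vector clocks,” the mechanism such eventually consistent stores commonly use to detect conflicting replica versions. The sketch below is an illustrative toy, not Amazon's implementation: each write carries a per-replica counter map, and a read reconciles divergent versions by discarding any version dominated by another.]

```python
def vc_compare(a: dict, b: dict) -> str:
    """Return 'before', 'after', 'equal', or 'concurrent' for vector clocks a, b."""
    keys = set(a) | set(b)
    a_le_b = all(a.get(k, 0) <= b.get(k, 0) for k in keys)
    b_le_a = all(b.get(k, 0) <= a.get(k, 0) for k in keys)
    if a_le_b and b_le_a:
        return "equal"
    if a_le_b:
        return "before"
    if b_le_a:
        return "after"
    return "concurrent"

def reconcile(versions):
    """Drop versions dominated by a newer one; survivors are concurrent siblings
    that the application (or the user) must merge."""
    survivors = []
    for clock, value in versions:
        dominated = any(vc_compare(clock, other) == "before"
                        for other, _ in versions if other is not clock)
        if not dominated:
            survivors.append((clock, value))
    return survivors
```

[A version written after another replica's version dominates it and wins outright; only truly concurrent writes survive to be merged, which is how such systems stay available during partitions without silently losing updates.]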

Other areas of Amazon's business face similarly complex data processing and decision problems, such as product data ingestion and categorization, demand forecasting, inventory allocation, and fraud detection. Rule-based systems can be used successfully, but they can be hard to maintain and can become brittle over time. In many cases, advanced machine learning techniques provide more accurate classification and can self-heal to adapt to changing conditions. For example, our search engine employs data mining and machine learning algorithms that run in the background to build topic models, and we apply information extraction algorithms to identify attributes and extract entities from unstructured descriptions, allowing customers to narrow their searches and quickly find the desired product. We consider a large number of factors in search relevance to predict the probability of a customer's interest and optimize the ranking of results. The diversity of products demands that we employ modern regression techniques like trained random forests of decision trees to flexibly incorporate thousands of product attributes at rank time. The end result of all this behind-the-scenes software? Fast, accurate search results that help you find what you want.
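[For readers unfamiliar with the “trained random forests of decision trees” Bezos mentions, here is a toy version of the idea: many small trees, each trained on a random bootstrap sample and a randomly chosen feature, voting on a classification. Real systems use far deeper trees and thousands of product attributes; everything below is a made-up miniature, not Amazon's ranking code.]

```python
import random
from collections import Counter

def train_stump(X, y, feature):
    """Pick the threshold/direction on one feature with lowest training error."""
    best = None
    for t in sorted({row[feature] for row in X}):
        for sign in (1, -1):
            preds = [1 if sign * (row[feature] - t) >= 0 else 0 for row in X]
            err = sum(p != label for p, label in zip(preds, y))
            if best is None or err < best[0]:
                best = (err, feature, t, sign)
    return best[1:]

def train_forest(X, y, n_trees=51, seed=1):
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]  # bootstrap sample
        feature = rng.randrange(len(X[0]))                    # random feature
        forest.append(train_stump([X[i] for i in idx],
                                  [y[i] for i in idx], feature))
    return forest

def predict(forest, row):
    # Majority vote across the randomized trees.
    votes = Counter(1 if sign * (row[f] - t) >= 0 else 0
                    for f, t, sign in forest)
    return votes.most_common(1)[0][0]
```

[The randomization is the point: no single tree is reliable, but averaging many decorrelated trees yields a robust classifier that degrades gracefully as attributes change.]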

All the effort we put into technology might not matter that much if we kept technology off to the side in some sort of R&D department, but we don't take that approach. Technology infuses all of our teams, all of our processes, our decision-making, and our approach to innovation in each of our businesses. It is deeply integrated into everything we do.

One example is Whispersync, our Kindle service designed to ensure that everywhere you go, no matter what devices you have with you, you can access your reading library and all of your highlights, notes, and bookmarks, all in sync across your Kindle devices and mobile apps. The technical challenge is making this a reality for millions of Kindle owners, with hundreds of millions of books, and hundreds of device types, living in over 100 countries around the world—at 24 × 7 reliability. At the heart of Whispersync is an eventually consistent replicated data store, with application defined conflict resolution that must and can deal with device isolation lasting weeks or longer. As a Kindle customer, of course, we hide all this technology from you. So when you open your Kindle, it's in sync and on the right page. To paraphrase Arthur C. Clarke, like any sufficiently advanced technology, it's indistinguishable from magic.
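[The “application defined conflict resolution” phrase is worth unpacking: when two device replicas of a book's reading state diverge after weeks apart, the application merges them with domain-specific rules rather than generic last-writer-wins. The rules below, furthest page wins and highlights are unioned, are illustrative guesses at what such a merge might look like, not Amazon's actual logic.]

```python
def merge_reading_state(a: dict, b: dict) -> dict:
    """Merge two divergent replicas of a book's reading state (toy rules)."""
    return {
        # A reader never wants to be sent backward, so the furthest page wins.
        "furthest_page": max(a["furthest_page"], b["furthest_page"]),
        # Highlights made on either device are all kept.
        "highlights": sorted(set(a["highlights"]) | set(b["highlights"])),
    }
```

[Generic timestamp-based resolution would discard one device's highlights entirely; encoding the merge rule in the application is what lets both updates survive.]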

Now, if the eyes of some shareowners dutifully reading this letter are by this point glazing over, I will awaken you by pointing out that, in my opinion, these techniques are not idly pursued—they lead directly to free cash flow.

We live in an era of extraordinary increases in available bandwidth, disk space, and processing power, all of which continue to get cheap fast. We have on our team some of the most sophisticated technologists in the world—helping to solve challenges that are right on the edge of what's possible today. As I've discussed many times before, we have unshakable conviction that the long-term interests of shareowners are perfectly aligned with the interests of customers.

And we like it that way. Invention is in our DNA and technology is the fundamental tool we wield to evolve and improve every aspect of the experience we provide our customers. We still have a lot to learn, and I expect and hope we'll continue to have so much fun learning it. I take great pride in being part of this team.

As always, I attach a copy of our original 1997 letter. Our approach remains the same, and it's still Day 1.

Jeffrey P. Bezos

Founder and Chief Executive Officer

Amazon.com, Inc.

Source: Amazon 2010 Annual Shareholder Letter.

Ready for a quiz based on comprehension of the Bezos letter? Not to worry. If you glossed over the letter or, as Bezos puts it, your eyes glazed over while reading it, what you need to take away is that the techniques and technologies described in the letter are not idly pursued. They lead directly to free cash flow. They are also likely to lead other CEOs to talk unabashedly about their own technologies to their stakeholders.
