Chapter 5
How to Solve Any Problem

The majority of students who take my course are concentrating their studies in engineering, psychology, or economics. Typically, very few of them consider themselves artists. At the start of a semester, I ask my new students whether they can draw a copy of a photo of Brad Pitt – I mean a near‐flawless replica of the photo. The overwhelming majority responds with a resounding “No.” Some claim they don't possess the artistic gene, whereas others concede that with enough training, they probably could do it.

The truth is, I can teach just about anyone to draw a near perfect replica of this photo in a matter of seconds. It doesn't require a particular genetic gift or the development of a physical skill. The problem isn't one of artistry, but of decision‐making.

For so long, we have defined creativity as an ability to devise something representative, perhaps even functional, out of abstraction. Thinking creatively, or creative problem solving, is often described as thinking “outside the box,” which is itself an abstract concept. It raises the questions: What is the box? Who made it? And how?

The answer is, the box is actually a frame, constructed by you as a function of all the information you've gathered leading up to this moment, and how you went about gathering it. In observing how others behave, solve problems, and gather their own data, you build the foundation of your framework. The way it is delivered to you, complete with the biases possessed by the people who have conveyed it, affects the frame through which all new information is considered and problems are approached. Therefore, there is no one box but, rather, a unique box assembled by each and every one of us for ourselves. You can't think outside your own box, only the box of someone else. In order for me to teach you how to draw a perfect replica of Brad's image, I don't need to teach you a new physical skill. Instead, I must reframe the problem for you. By doing that, your frame expands, allowing you to solve it for yourself.

Take a look at Figure 5.1. When I ask my students, “Can you create a near perfect replica of this image, in which each box contains a solid pigment?” they giggle and answer with great confidence, “Yes, of course!”


Figure 5.1 A portion of Brad Pitt's eyeball at 30x zoom.

Well, this second image is a portion of Mr. Pitt's eyeball, as seen when the original photo is viewed at 30x zoom. It is arguably the most difficult part of the photo to replicate. If you believe with great confidence that you can replicate the most difficult part of the photo, it must follow that you can do the same for all the other segments. Therefore, you must be able to replicate the photo in its entirety.

The same goes for any problem you face, whether it is how to lose weight, gather more assets, or generate better returns. Simply by reframing the problem, new solutions – often obvious ones – will suddenly appear, seemingly out of thin air. In that moment, your frame expands. With this new solution lying firmly within the bounds of your new frame, and certainly well within your skillset, it becomes impossible to understand how you missed such an apparent solution.

Truth be told, in showing you how to create this image of Brad Pitt, I've actually given you the skill necessary to become a world‐class artist. I suspect you're thinking, “that's not real art, and it certainly wouldn't make me a world‐class artist,” but you'd be wrong. Chuck Close, one of the highest‐earning artists in the world for many decades, creates his art using this very technique. Recently, he's taken it in a slightly different direction, but the principles are the same.

Close Enough

What you see in Figure 5.2 is a poorly lit photo of a Chuck Close painting. For those unfamiliar with his work, here is what British art historian Tim Marlow had to say about this American artist: “[Close is] a kind of lone figure in contemporary art – no one else is doing what he is doing. He's a painter's painter, but his reputation is still growing. I'd put him among the top 10 most important American artists since abstract expressionism, no question.” Of course, Close's paintings aren't everyone's cup of tea, but he is a giant in contemporary art, having had major retrospectives at the most prestigious museums in the world, and his paintings consistently fetch many millions of dollars at auction.


Figure 5.2 Close‐up of Chuck Close painting.

It may not be apparent from Figure 5.2, but Chuck Close is a portraiture artist, beginning as a photorealist before moving into more abstract work. He has created massive portraits composed of dots made with a pencil eraser dipped in charcoal, some using torn pieces of paper in different shades of gray and still others using nothing but his fingerprint and an inkpad. In the photo shown in Figure 5.2, he uses pixels filled with colorful, amoeba‐like shapes to generate a beautiful self‐portrait.

Don't see it? That's because I cut out the rest of the image. I truncated your frame of reference and selected only what I wanted you to see in order to tell you a specific story. If I had shown you the entire painting, you might have missed all the intricate details of this section, including the interplay of lively colors and childlike shapes. I could have selected a similarly sized section of any one of his many paintings that employs this style to create a portrait. With only that section, you would see beauty, color, and artistry, but you could never see depth or how that section contributes to and is affected by those surrounding it.

If I provided you with each individual section of every Close creation in which he employed this technique, you could hold your own in an intelligent, in‐depth conversation about the intricacies of Close's paintings with any expert on his work. However, you would know so little about the paintings themselves. You'd be hard‐pressed to recognize whether a given painting depicts a man, a woman, or even a person at all. In discussing the fine details, you might sound intelligent, even to someone who knows the big picture, and would likely believe it yourself, but the truth is you'd know very little about the subject.

Storm of the Century

In January 2015, Anderson Cooper began the night's episode of 360 by telling his viewers that we could look forward to a special two‐hour edition of the show that would bring us all the latest on the Storm of the Century. The first 27 minutes featured live reports from around the northeast where reporters spoke of 30‐foot waves, erosion of the shoreline, hurricane force winds, and snow drifts that forced the closure of schools and transportation systems. Anderson brought in an expert, a CNN weatherman, who went to great lengths to explain why they would never cry wolf about such a storm, because then no one would listen to them in the future. They went on and on about the catastrophic consequences and the intelligent decision‐making by governors and mayors in the region to deal with them proactively.

Over the course of 27 minutes, one of the most viewed news outlets in the world ignored every other story it had been presenting as “vital” information just hours earlier. It was as though nothing mattered except the weather in the northeast. They showed us just one small portion of the entire canvas of what was newsworthy, and we were glued to the screens. For hours, it dominated social media and phone conversations. Millions adjusted their schedules, canceled meetings, and moved cars off the streets. Everything else in the world took a back seat.

The truth is, the weather wasn't the only newsworthy story that night, any more than bedbugs, ISIS, the Brexit vote, or Trump's election were when the entire world was hyperfocused on them for a time. Ultimately, they are all just tiny sections of a much bigger painting. They are all essential to the big picture, but they must be understood within the context of the whole to be of value.

We See What We Are Shown

Visual illusions are terrific metaphors for the cognitive illusions we fall prey to all the time. The problem is, we have so much faith in our ability to gather the right information, accurately assess it, and make proper decisions based upon it that even in the face of opposing evidence, we tend to trust our gut over the facts. Sure, we see the mistakes being made by others, and, occasionally, we may even see how we ourselves were vulnerable to the same mistakes in the past, but in the moment it is occurring, we are blind to our blindness.

In Figure 5.3, there are 12 black dots that you can easily see one, two, or even three at a time. However, it's nearly impossible to see all 12 simultaneously. There is much speculation and study surrounding why this is the case. As it relates to the topic at hand, though, I'm less concerned with why, and more interested in the fact that we are blind to three‐quarters of the relevant information that is staring us right in the face. We are fully cognizant of the existence of all 12 black dots, and can see to all the edges of the image (the big picture), but can't process it all simultaneously.


Figure 5.3 Twelve black dots on a dark‐gray grid.

SOURCE: J. Ninio and K.A. Stevens, “Variations on the Hermann Grid: An Extinction Illusion,” Perception (2000). Reproduced with permission of SAGE.

However, if I make one minor adjustment to the image (Figure 5.4), dimming the gray lines that connect the dots, suddenly it all comes into focus. In other words, if I reduce the irrelevant information, what we in the business of investment management call the noise, suddenly what is relevant becomes blatantly obvious.


Figure 5.4 Twelve black dots on a light‐gray grid.

SOURCE: J. Ninio and K.A. Stevens, “Variations on the Hermann Grid: An Extinction Illusion,” Perception (2000). Reproduced with permission of SAGE.

Going back to the Rubik's cube example from Chapter 2, when asked what color the two center pieces are, we instantly ignore all the extraneous information. Well, that's what we believe happens at least. The reality is, as this example shows, once information is processed by our brains, it becomes incredibly difficult to ignore it. Not until the irrelevant information is deemphasized can we see the environment as it actually exists.

In 1999, the CIA declassified articles written for the agency between 1955 and 1992. One of them, by Richards Heuer, was called “Do You Really Need More Information?” In it, Heuer describes a study wherein horseracing handicappers were asked to rank the importance of 88 handicapping factors, such as the weight of the jockeys, age of the horses, and footing of the track. The handicappers were then given the data for the five factors each had ranked as most important and asked to handicap races. The exercise was repeated with their top 10 factors, their top 20, and finally their top 40.

Surprisingly, as they added more information, their accuracy actually began to decline. Although that alone is informative, the real value of this study and the many that followed was in discovering that the confidence of the handicappers grew as more data was made available to them. In fact, by the time they had all 40 top factors at their fingertips, their confidence was twice what it had been with just the top five. In other words, there is empirical evidence of a negative correlation between the accuracy of predictions and the confidence predictors have in them.
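The handicapper result is easy to reproduce in miniature. Below is a minimal simulation sketch – my own illustration with made‐up numbers, not Heuer's data. The outcome of each “race” is driven by five real factors, and a forecaster who treats every factor he is handed as meaningful simply adds irrelevant ones to his score. His subjective probabilities drift further from a coin flip (more confidence) while his hit rate stagnates or sinks.

```python
import numpy as np

# Hypothetical setup: the outcome of each "race" depends on 5
# informative factors; everything else the forecaster sees is noise.
rng = np.random.default_rng(0)
n_races = 20_000
signal = rng.normal(size=(n_races, 5))
outcome = (signal.sum(axis=1) + rng.normal(scale=2.0, size=n_races)) > 0

def handicap(n_noise_factors):
    """Forecast using the 5 real factors plus n_noise_factors irrelevant
    ones, all weighted equally -- a stand-in for a handicapper who treats
    every factor he is given as meaningful."""
    noise = rng.normal(size=(n_races, n_noise_factors))
    score = signal.sum(axis=1) + noise.sum(axis=1)
    prob = 1 / (1 + np.exp(-score))            # subjective win probability
    accuracy = ((prob > 0.5) == outcome).mean()
    confidence = np.abs(prob - 0.5).mean()     # distance from a coin flip
    return accuracy, confidence

for extra in (0, 5, 15, 35):                   # 5, 10, 20, 40 total factors
    acc, conf = handicap(extra)
    print(f"{5 + extra:>2} factors: accuracy {acc:.2f}, confidence {conf:.2f}")
```

The extra columns contain nothing useful; all they do is push the score, and with it the forecaster's felt certainty, toward the extremes, while the connection between his picks and the actual outcomes weakens.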

This can have a big impact on our returns. Since we tend to size our positions relative to the confidence we have in them, we are likely to overweight positions merely as a result of the availability of unnecessary information.

In his book The Psychology of Intelligence Analysis, Heuer explains that “experienced analysts have an imperfect understanding of what information they actually use in making judgments. They are unaware of the extent to which their judgments are determined by a few dominant factors, rather than by the systematic integration of all available information. Analysts use much less available information than they think they do.” He found the same held true for doctors diagnosing illnesses, as well as experts in other professions, too.

In the end, he concluded that “individuals overestimate the importance [they] attribute to factors that have only a minor impact on [their] judgment, and underestimate the extent to which [their] decisions are based on a very few major variables.”

In his book Fooled by Randomness, Nassim Taleb shares his take on the subject when he writes, “It takes a huge investment in introspection to learn that the thirty or more hours spent ‘studying’ the news last month neither had any predictive ability during your activities of that month nor did it impact your current knowledge of the world.”

Decades earlier, Amos Tversky and Daniel Kahneman had first published their findings on the matter in an article entitled “Judgment under Uncertainty: Heuristics and Biases” in Science. They explained that in order to make decisions when the outcome is uncertain, we rely on our beliefs to assign probabilities to each of the potential outcomes. What they discovered is that very often the heuristics, or mental shortcuts, we employ lead to biased expectations that can result in “severe and systematic errors.”

They describe a phenomenon known as judgment by representativeness through a series of examples and experiments, one of which is particularly applicable to Taleb's point. In it, they told subjects that a group consists of 70 engineers and 30 lawyers. Without providing any additional information, they asked the subjects what the probability is that a particular individual selected from that group is an engineer. Participants correctly judged it to be 70%.

They then provided the following personality sketch of the individual in question.

“Dick is a 30‐year‐old man. He is married with no children. A man of high ability and high motivation, he promises to be quite successful in his field. He is well liked by his colleagues.”

The description was meant to convey no information relevant to the question. Therefore, when asked again what the probability is of him being an engineer, the answer should have remained 70%. However, the subjects now judged the probability to be 50% after reading what was essentially worthless information.

“Evidently, people respond differently when given no evidence and when given worthless evidence. When no specific evidence is given, prior probabilities are properly utilized; when worthless evidence is given, prior probabilities are ignored.”
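The arithmetic behind “prior probabilities are properly utilized” is just Bayes' rule. Here is a minimal sketch of that logic – my own illustration, not part of the original study: when a description is equally likely to fit an engineer or a lawyer, the likelihoods cancel and the prior should pass through untouched.

```python
def posterior_engineer(prior, p_desc_given_eng, p_desc_given_lawyer):
    """Bayes' rule: P(engineer | description)."""
    numerator = p_desc_given_eng * prior
    denominator = numerator + p_desc_given_lawyer * (1 - prior)
    return numerator / denominator

# Dick's sketch is as likely for an engineer as for a lawyer, so the
# evidence is worthless and the 70% base rate should survive:
print(posterior_engineer(0.70, 0.5, 0.5))  # -> 0.70
# Subjects instead answered as if the base rate didn't exist:
print(posterior_engineer(0.50, 0.5, 0.5))  # -> 0.50
```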

Armed with the evidence that even perfectly innocuous information can have a detrimental impact on our ability to make rational decisions, we should question the real value of all the news and research being packaged for our consumption. But let's take it one step further.

Imagine for a moment that I am the publisher of a global macro research newsletter. I understand that the global macro landscape moves at a rather slow pace, with new information powerful enough to alter the medium‐ to long‐term trajectory of markets, economies, and policy – the type that truly qualifies as “signal” – occurring just a few times a year. However, very few clients would subscribe to a weekly newsletter in which most issues consisted solely of the sentence, “Same views as last week. Nothing of significance has changed.” To stay in business, I have to find interesting topics to write about and make a compelling argument for why they are both significant and relevant to the reader. The same goes for every newscast, newspaper, and magazine. When nothing of significance is occurring, they must still produce content, and convince readers that it is of great importance to their investments, businesses, and lives. It is then left to us to distinguish the signal from the noise in all that content. (As we've seen, we do a very poor job of it.) In essence, the very thing we go to these content providers for – a way to distinguish signal from noise – is actually made worse by relying on them.

So how do we use all this to improve our results? It is clear that we as decision‐makers must make a concerted effort to first correctly identify the key factors and then work to actively avoid overexposure to any additional ones. In fact, any time and effort we expend to suppress our natural urge to gather more data and read more research may actually contribute more to our bottom line than giving in to it. Yes, I'm suggesting that gathering smaller quantities of higher‐quality information is actually better than gathering large quantities of unvetted narratives.

Stay Home, See the World

When I first started trading emerging markets, it was firmly believed that locals had a distinct competitive advantage over the rest of us. Those who worked at major institutions with strong networks and a local presence were feared. Guys like me didn't have a chance. It was believed then, and many still believe, that you must visit the countries in which you invest. We did the road trips, meeting with central bankers, government officials, and local business leaders. I took notes on who we met, where we went, and what we learned. I also tracked my performance in the days, weeks, and months that followed. What I discovered was a distinct lack of correlation between the trips and my performance. On those trips, I was actively seeking confirmation of the views I held before the trip. If I was bullish before, I would be even more bullish when I returned, and vice versa. In other words, I learned for myself exactly what Heuer had discovered decades earlier.

Yes, I sounded important when I sent around my notes, showing all the high‐profile executives and policymakers with whom I had met. My analysis gained credibility by association, but the reality was, it didn't do a thing for my bottom line, and as a trader, only the bottom line matters. So for more than two decades now, I have had a strict rule not to visit those countries in which I am actively involved. The trips were a waste of time and energy, no more than a distraction. As the research has shown, when we seek confirmation of our views and inevitably find it, confidence in our expectations increases, and so too does their proportional weight in our portfolios and returns. The reality is, no matter how many meetings I have, companies I meet with, factories I visit, or behaviors I observe, they will always represent just a sliver of the whole picture. How representative that sliver is of the whole is impossible to know, and without that knowledge, the sliver is potentially more harmful than helpful.

I know it is difficult to believe that traveling less, gathering less data, and reading less research can actually improve your returns. It goes against everything we believe. Our brains utilize heuristics, or shortcuts, more commonly than we'd like to think, and those shortcuts depend upon the deeply engrained beliefs we hold. When information is presented that challenges those beliefs, we experience cognitive dissonance, best described as a feeling of great unease that comes from holding two conflicting views. To rid ourselves of that uncomfortable feeling, we are driven to resolve it, typically by ignoring the evidence supporting the new argument and seeking out whatever confirms the more deeply engrained belief. It's how we maintain inner peace, and it is why, even after reading this and other books on the subject, it is still very difficult to stop reading research, newspapers, and magazines, watching the news, going to industry conferences, and meeting with experts – things I forced myself to stop doing more than 20 years ago.

ABQs of Research (Always Be Questioning)

I came across an article titled “The Fascinating Story Behind Why So Many Nail Technicians Are Vietnamese” on my Facebook newsfeed today. It was shared from a Yahoo feed, which was fed by an article on TakePart.com. In the article, the writer mentions that 80% of nail technicians in Southern California are Vietnamese and that they account for 51% of the total across the United States. She then goes on to quote a number of other statistics, such as the size of the nail industry ($8 billion) and the GDP of Vietnam ($12 billion), and to note how important it is to these US‐based technicians to send a sizable proportion of their income back home. It's such a feel‐good, fun, and truly fascinating narrative – including a famous celebrity and grand historical context – that it's hard not to want to share it with others. In other words, it's highly likely this story would spread like wildfire. (I mentioned the story to my wife and she informed me she had read about it last week in People magazine.)

However, knowing how susceptible we are to availability bias, the part of my brain responsible for critical thinking immediately awakened. As entertaining as it is, the entire story hinges on one important fact: that the majority of nail technicians in America, and particularly in Southern California, are of Vietnamese descent, and by a very wide margin. Take that away and suddenly its significance is dramatically reduced. So naturally I did some digging. Since neither the Yahoo version nor the original TakePart post included a source, I had to search the web. I came across the same story on blog after blog and numerous well‐established media sites, but only one, NPR, referenced the source. It turns out the source was a poll taken by Nails Magazine, which was kind enough to share the results and methodology online, in a very colorful pdf. Finally, I had the information necessary to confirm what I'd read, to feel confident that it was factual and therefore worthy of being forwarded. Except it didn't confirm the story at all. According to the Nails Big Book 2014–2015, one of the key sources for the most important statistic in the article is described as follows: “We surveyed the readers of VietSALON, our Vietnamese language publication for salon professionals.”

I don't know how much it biased the results to have a Vietnamese language publication for salon professionals as the primary contributor to the polling, but I believe it is significant enough to make a rational reader skeptical of the findings. Now, I know what you're thinking. Who cares? It's just a silly story. In a world filled with far more important problems, can't you just let us enjoy a fun one?

Imagine LeBron James chooses a diet of Twinkies and soda combined with a regimen of couch sitting in the off‐season. His coach chides him, complaining that he won't be able to compete when the season begins again. LeBron, one of the best players in the league year after year, consoles his coach by saying, “Relax, it's not like I'm going to do this when the season starts. When I'm on the court, I'll be in tip‐top shape, as always.”

The brain is a muscle that requires exercise just as much as any other, and it is the most important muscle for our chosen professions. Unless you want to be susceptible to the availability bias, the availability cascade, or any of the other potentially damaging cognitive biases in your investment process, you must train yourself to think critically at all times. Unfortunately, you cannot simply turn critical thinking on when it's important and off when it isn't. It requires serious mental effort at all times, making it a way of life.

Working Smarter, Not Harder

Recently, one of the best investors in the world, someone who has had entire book chapters dedicated to understanding his process, asked a very thoughtful question related to the Vietnamese nail salon story: “Given that today we are flooded with information and stories, how do you cut through the information noise and figure out which stories to spend time approaching analytically? How do you force rank all the things that you could think about today so that you get the best ‘bang for your buck’ from exerting this effort?” (I believe the fact that he still asks these questions has a lot to do with why he is so good at this.) My advice, as mentioned above, is to stop reading all of these stories, and here's why.

Cognitive science is essentially the study of how we gather, process, draw conclusions from, and ultimately use information to make decisions. It's important to note that our brains take in information at an extraordinary pace. We see it, read it, hear it, feel it, and taste it, and as soon as we come into contact with it, our brains automatically go to work integrating it with everything that came before, in an effort to maintain a clear, consistent, and cohesive picture of the world in which we live.

The problem is, our brains can't thoughtfully process all that data at such a rapid rate. Neither can we set any of it aside until we have the time to give it our undivided attention. It is an automated response that we cannot turn off. The evolutionary solution we've developed is known as heuristics, or mental shortcuts, which most of the time work just fine. That's what cognitive scientists tell us at least, but it's a bit misleading. It is true – most data can and should be handled by employing heuristics. We don't need to contemplate why the light is red; we simply need to stop. The identification of a tree as a source of shade when we're hot, and of water as something that can satisfy our thirst, should be left to our automated system, and incoming data like that does in fact make up the majority of information we receive and process. However, we should not take that to mean that our automated systems do a similarly satisfactory job of processing the information we depend upon to make decisions of real consequence. You see, what cognitive scientists also agree on is that, more often than we'd like to believe, heuristics result in systematic errors in judgment, otherwise known as bias, and worst of all, we are almost always oblivious to when they are occurring or what triggered them.

Our best defense is to recognize when we will be most vulnerable and take action to defend ourselves at those moments. This requires being proactive. It means you must first acknowledge a future moment as one of vulnerability, and one that you are unlikely to recognize as such at the moment it is occurring.

Unfortunately, much of the information we gather is processed by our automated system, which Kahneman calls System 1. Fundamentally, System 1 seeks cognitive ease. It doesn't contemplate questions like “How?,” “Why?,” or “Is that accurate?” when it is processing incoming data. It simply picks out what it wants, what feels right. It seeks confirmation of the world order as you have already firmly established it. It perks up when it identifies something as familiar and gives it priority. What are the things that will feel familiar to you? Things you heard first. Things you read last. Things you've heard repeatedly. Information delivered via sources that have established themselves in your mind as trustworthy will be treated as familiar‐by‐proxy, and thereby attributed a higher level of importance and value. To be honest, System 1 isn't very discriminating. The right word grouping and even carefully selected color hues can evoke a sense of familiarity, thereby keeping System 2, the more skeptical, thoughtful part of your intellectual process, at bay. Once that information gets in, it automatically has priority over anything new. It becomes the standard by which all new information is assessed.

What this means is that all future information you gather will be affected, tainted even, by what you have allowed in previously. All decisions you make going forward will be similarly affected by what has become an integral part of that clear, consistent, and cohesive picture of the world you have drawn for yourself. So you can see, as much as we'd like to believe we have properly analyzed every bit of information that we use as a basis for judgment, it's just not true, or even possible.

What can we do then to improve our decisions? Well, as the saying goes, “Garbage in, garbage out,” meaning, if the information you've gathered is subpar, it's very likely that the decisions based upon that information will also be subpar, no matter how smart you may be. Therefore, in order to improve your decisions, invest time in assessing and improving your process for gathering information. Here is an incomplete list of tips for doing so.

First and foremost, it all begins with one seemingly obvious realization: you cannot and will not ever know everything. You can't read, see, hear, feel, and taste everything there is to experience, so stop pretending you can. No matter how many news feeds you subscribe to, newsletters you read, analysts you speak to, and policymakers you have on speed dial, there will be a nearly infinite number of things you will not know. That means the time and energy you are wasting (or worse) skimming information cursorily in order to convince yourself you are well informed can be reallocated to endeavors that will deliver significantly better returns on effort.

Source Assessment

The halo effect is a cognitive bias we employ constantly, though we are rarely aware of it. Essentially, it describes how we let our overall impression of a person, firm, news source, etc. affect our feelings and thoughts about other aspects of that entity. If we like one aspect of something, we will have a positive predisposition toward everything about it, and even things associated with it. This works in both directions, positive and negative. Be aware: anyone who seeks your attention is cognizant of your penchant for this particular heuristic and bias, so be on guard.

There is little we can do about how often we are affected by this bias. However, knowing this, we can take special care to be certain we are properly appraising people and entities. If you refer to an analyst as “Goldman's US guy,” for instance, you can be fairly certain you've outsourced your appraisal of that analyst's skillset and the dependability of his accuracy to those who hired him. We joke all the time about how laughable it is to trust something we read just because it's on the Internet, but we do exactly that all the time with so many other sources.

The goal here is to put in the time and effort to properly appraise our sources like any good detective would. Then, in those moments when we do unwittingly fall prey to the halo effect (and it will be often) we will have lowered the odds of it doing serious damage.

Here are some ways attention seekers will capitalize on the halo effect, effectively gaining credibility by proxy, perhaps without you even realizing it.

  • Articles with the name of a highly regarded individual in the title. Example: “What Warren Buffett Is Doing Today.”
  • Ads with celebrity endorsers.
  • Products distancing themselves from things perceived as bad. Example: “No high fructose corn syrup!”
  • Articles written by a highly regarded individual, particularly on a topic unrelated to the basis for that high regard.
  • Analysts quoting key policymakers or even implying an inside connection to them.

Engaged Reading

If you do a proper cost/benefit analysis on skimming articles, reading headlines, or even having conversations while watching your screens, you'll stop doing all those things in an instant. The reasons are confirmation bias and the impact of familiarity. When we skim, our brains don't contemplate meaning or develop new context. We simply pull out words and word groupings that look familiar, that enable cognitive ease, and that fit neatly with our worldview. Have you ever noticed that when something relatively obscure enters your consciousness, it suddenly appears everywhere?

When my sister became engaged to a man from Costa Rica, it suddenly appeared to my family and me as though this tiny little country was being mentioned everywhere. Of course, Costa Rica hadn't suddenly taken on global significance; it had simply become familiar to us, and System 1 was now automatically drawn to any mention of it. The more we heard the country's name, the greater its weight in our worldview. The same happens in our research. If your information sources are all highly connected, they will become highly correlated for this very reason. (It helps explain why the greatest factor in determining correlation among hedge funds is geography.) If all your sources are mentioning the same thing, that thing's weight will increase dramatically in your worldview. This phenomenon is known as the “mere exposure effect.”

If instead of skimming, you select only a very few uncorrelated, highly reliable sources of information to truly engage with, to contemplate and further investigate on your own, your worldview will be far more balanced and reflective of reality. In those moments when connections develop between seemingly disparate topics covered by those sources, you'll know you've hit on something worthy of serious investigation.

Contemplation

What I mean by contemplation is the act of stepping away from everything. Having accepted that you can't read everything or speak to everyone, you should be free to truly absorb the information you have identified as worthy of gathering. Away from the scrolling headlines and incessant display of tick data, you are better able to see the world as it actually exists. Most importantly, you develop the patience and space to better understand context and connections. You will formulate better questions, which will lead to more informed investigations.

Taking Care of Yourself

Studies have shown that our brains function similarly when we are sleep‐deprived as when we are intoxicated. Israeli judges have been observed doling out radically different sentences just before lunch versus immediately afterward. We know sleep and nutrition affect us, but very often we sacrifice these things in favor of reading one more research piece or taking one more “peek” at the markets. The rational you knows that the marginal gain of checking your email for the fifth time before ordering appetizers is nearly zero, but the habitual System 1 isn't rational; it is impetuous. Hand your phone to your child at the dinner table. Leave your desk to read the research you value. Go on a hike. Eat a proper breakfast. Read a good novel. Learn a new skill. Unplug.

How Information Can Lead Us Astray

Cognitive psychologists like to use visual illusions to make the point that while we may be very intelligent, rational beings, our brains rely on heuristics, or mental shortcuts, to gather clues about our surroundings and leap to conclusions based upon them. Although we tend to benefit greatly from this setup, very often those conclusions are severely flawed. It is easy to see the shortcomings visually, but recognizing when we make cognitive mistakes requires greater effort, not to mention a serious dose of humility.

The Problem with Flow Info

Recently I spoke at Drobny Global's annual macro conference in Santa Monica. Afterwards, I sat and chatted with an allocator from one of the large government pension funds who asked, “Don't you think flows matter?” She had been a reader of Seeds for some time, so we were both aware that she already knew my answer. That didn't make her question rhetorical, however; it simply meant I hadn't yet convinced her. There are numerous reasons why I cut myself off from flow information long ago, all of them statistical in nature. I'll explain my skepticism about the value of fund flow information with a story from an unrelated field: education.

The Gates Foundation is widely accepted as the leader in education reform. The organization is, of course, backed by multibillionaire Bill Gates, viewed by most as one of the great minds of our time and an exceptional business leader. As with many similar organizations, the belief is that by applying the rigorous analytical tools that led to Microsoft's incredible success in software, the same can be achieved with the many social causes to which his foundation turns its attention. None of this is contentious, and in fact, the foundation carries so much weight that the mere mention of its backing for a nonprofit or research finding will virtually guarantee an avalanche of additional support from others as well.

In the United States, one of the great frustrations is that K–12 (kindergarten through 12th grade) education, once the gold standard for the rest of the world, has fallen well behind, despite nearly $650 billion being spent on it every year. Not only are public funds thrown at it, but it is a favorite pet project of many who come from the business world, armed with massive war chests rivaled only by their confidence in their own ability to effect positive change. Theories abound as to how we got here and what should be done to fix the problem. School lunches, higher pay for teachers, and computers in every classroom are but a few of the solutions that have been proposed, received tremendous funding, and been implemented.

In the late 1990s, the argument that smaller schools produced better results was gaining steam and powerful adherents. Research, such as that based on test results from the Pennsylvania System of School Assessments, made a compelling case in favor of smaller schools. Based on cold, raw data, it was the kind of result that appeals to great business minds like Bill Gates.

They'd gathered scores from 3rd‐, 5th‐, 8th‐, and 11th‐grade math and reading tests, plus writing scores for 6th, 9th, and 11th grade. In analyzing the data provided by 1662 separate schools, they found that of the 50 top‐scoring schools (the top 3%), 6 of them were among the 50 smallest (the smallest 3%). If the size of the school were unrelated to performance, the smallest schools should have represented just 3% of the top 50, but according to the data they actually represented 12% (6 out of 50). That's an overrepresentation by a factor of four!

The Gates Foundation was sold. They began pouring money into programs designed to support small schools nationwide. By 2001, they had provided roughly $1.7 billion in grants to education projects and were quickly joined by the upper echelon of not‐for‐profits, including the Annenberg Foundation, Carnegie Corporation, Harvard's Change Leadership Group, the Pew Charitable Trusts, and the U.S. Department of Education's Smaller Learning Communities Program. As Wainer and Zwerling so accurately stated in their follow‐up research, “The availability of such large amounts of money to implement a smaller‐schools policy yielded a concomitant increase in the pressure to do so, with programs to splinter large schools into smaller ones being proposed and implemented broadly in New York, Los Angeles, Chicago and Seattle.”

In market parlance, the smart money flows became evident, and had become a powerful force in the direction of education reform in their own right. Lives were uprooted and impacted for years to come. Westlake Terrace High School, a suburban school in Seattle with 1,800 students, was broken into five smaller schools, enabled by a Gates grant of nearly $1 million. It was just one of many such changes being carried out across the country. School boards were taking action, not based on the underlying data that had created the flows, but on the flows themselves. Articles were written about the flows, who was behind them, and the action being taken as a result. Politicians jumped on board, ideologically and financially.

Then, in 2005, the Gates Foundation made a stunning announcement: they were moving away from converting large schools into smaller ones. They had decided that improving classroom instruction mattered more to improving schools than shrinking their size. What led them to stop, seemingly midcourse, after leading the charge? It turns out their initial analysis was severely flawed, and both it and the follow‐up data made a very strong case that not only were smaller schools not better, they might actually be worse.

The researchers who made the argument that smaller schools were better than larger ones had shown an exaggerated faith in small samples, a very common problem in research of all kinds, even today. Recent advances in cognitive psychology have taught us that we pay more attention to the content of messages than to information about their reliability. As Daniel Kahneman points out in Thinking, Fast and Slow, “Many facts are due to chance, including accidents of sampling. Causal explanations of chance events are inevitably wrong.”

That's what happened in the case of the findings regarding smaller schools. You might look at the fact that the data included 1662 schools and think that's a sufficient sample size from which to draw conclusions, but that isn't the problem. The error occurred in not recognizing that each of the small schools represented a far smaller sample size than the big schools did. What you should expect when dealing with small sample sizes versus large ones is a higher degree of variability. As an example, the chance of tossing a fair coin 5 times and having it land on heads every time is 1 in 32, or about 3%, whereas the chance of 500 straight heads is so vanishingly small as to be effectively zero. The researchers, Gates, Harvard, Pew, and the others who had reviewed the data focused on only one side of the results. Had they gone one extra step and looked exclusively at the opposite end of the spectrum, they would have arrived at the exact opposite conclusion.

Going back to the Pennsylvania System of School Assessments data, among the 50 worst‐performing schools, you will find that 9 of them were among the 50 smallest schools. In other words, there was an overrepresentation by a factor of six – far bigger even than what they found on the positive side. The truth is, drawing conclusions based solely on that information would have been just as flawed as the other way around. All it proved is that smaller samples generate data with greater variance. When a regression was run on all the data, it showed no relationship between results and school size. However, when applied to the scores of high school students only, the regression line showed a significant positive slope. In other words, the larger the high school, the better the scores.
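The small‐sample effect is easy to demonstrate for yourself. Below is a minimal simulation sketch with made‐up numbers (not the Pennsylvania data): every school draws its students from exactly the same ability distribution, so size genuinely has no effect on quality, yet the smallest schools still crowd both the top and the bottom of the rankings, purely because their averages bounce around more.

```python
import numpy as np

# Hypothetical setup: 1662 schools of very different sizes, every
# student drawn from the SAME score distribution, so school size
# truly has no effect on quality.
rng = np.random.default_rng(1)
n_schools = 1662
sizes = rng.integers(30, 3000, size=n_schools)         # students per school
school_means = np.array(
    [rng.normal(500, 100, size=s).mean() for s in sizes]
)

ranked = np.argsort(school_means)
top50, bottom50 = set(ranked[-50:]), set(ranked[:50])
smallest50 = set(np.argsort(sizes)[:50])                # the 50 smallest

print("smallest schools among the 50 best: ", len(smallest50 & top50))
print("smallest schools among the 50 worst:", len(smallest50 & bottom50))
# Chance alone would put only about 1.5 of the 50 smallest schools in
# each tail (50 * 50/1662); they appear many times more often, simply
# because small-sample averages have the highest variance.
```

Run it a few times with different seeds and the smallest schools dominate both extremes every time, which is exactly the pattern the researchers mistook for a causal effect of school size.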

The Gates Foundation recognized the error and stopped out of the position. Harvard, Pew, Carnegie, and the others followed suit shortly thereafter. Unfortunately, the students, teachers, and communities were left holding the position, with no one around to help them put the schools back together and no new funding to fix the problems. Principals at the larger schools that had been broken up were held accountable for the declining performance of their students. School boards had to answer for the inefficiencies that resulted from the greater administrative overhead per student required by the smaller schools. In other words, those who had invested based on the smart money flows were left holding the bag. If instead they had done their own research, dug into the data themselves, and recognized where their time and effort would have been better spent, they would have been years ahead of everyone else, held up as the standard to which all others aspire.

So when you ask me why I don't want to know about trading flow information, my simple answer is that it is more noise than signal. Whatever limited selection of flow information might make its way to me is but a small sample of the total, leading to the very same problem experienced by the Gates Foundation. I'd prefer to depend upon fundamental data, which, if you think about it, is also flow data – it represents behaviors that are affecting price action, but from a much larger sample set, and is therefore worthy of much greater confidence.

Besides, if I were to generate poor returns and an investor or client came to me after the fact to ask why, I never want my answer to be, “Well, everyone else was doing it.”

Postscript

In a speech at a global macro conference, I made the argument that markets were exhibiting extreme risk aversion, to which I received some serious pushback. Audience members suggested (quite vehemently at times) that the rally in equity markets, and particularly the compressed spreads, served as evidence of quite the opposite of what I had proposed. I will go into why these arguments are flawed later, but I thought it worth noting that immediately following my talk and the counterarguments, the conference hosts posed a number of questions to the audience related to positioning. Nearly every single one of them produced an almost perfect bell‐shaped curve, prompting the host to remark that he'd rarely seen results so symmetrical: the preponderance of audience members selected No Position, with a rapidly diminishing distribution as you approached either Very Overweight or Very Underweight. In other words, the very people who were arguing that the markets were not exhibiting indications of extreme risk aversion had just provided clear evidence of exactly that in their own behaviors.
