CHAPTER 2

The Components of Thinking

Introduction

Now that we have outlined the definition of thinking and the various types of thinking, we need to examine how we arrive at our ideas. This chapter provides an overview of cognitive biases, fallacies, and mental models. It is difficult to be an effective strategic thinker and writer if you lack even the most basic understanding of these components of thought. In the 1960 movie Inherit the Wind, Henry Drummond questions Matthew Brady on the scientific authority of the Bible. Drummond asks Brady whether he believes that the first day God created lasted 24 hours. Brady responds, “I don’t know.” Drummond queries, “What do you think?” to which Brady quips, “I do not think about things that I do not think about.” After a brief pause Drummond snaps, “Do you ever think about things that you do think about?!”

If you do not think about things that you do not think about, how can you accurately label yourself a strategic thinker? Do you ever think about the things that you do think about? How do you know? And how do you know that your thinking process maintains any level of sophistication? Understanding cognitive biases, fallacies, and mental models will provide you with the necessary foundation as you look to improve your ability to think and write strategically.

The Top 20 Cognitive Biases

Strategic thinking, and more specifically, strategic business writing, requires that you have a basic understanding of the most common cognitive biases. Doing so will provide insight into how you think about what you think. A cognitive bias is a mistake in reasoning, evaluating, remembering, or other cognitive processes, often occurring as a result of holding onto one’s preferences and beliefs regardless of information to the contrary.

Biases can arise in many areas of daily life. From how we choose a college to attend to picking out milk at the grocery store, we often make unconscious, suboptimal decisions. From the pioneering work of psychologists Amos Tversky and Daniel Kahneman, work that later earned Kahneman a Nobel Prize, to more recent findings, more than 80 different cognitive biases have been identified over the last 40 years. Here is a list of the 20 most common cognitive biases:

  1. Anchoring bias: People are overreliant on the first piece of information they hear. Example: In a salary negotiation, whoever makes the first offer establishes a range of reasonable possibilities in each person’s mind.

  2. Availability heuristic: People overestimate the importance of information that is available to them. Example: A person might argue that smoking is not unhealthy because they know someone who lived to 100 and smoked three packs a day.

  3. Bandwagon effect: The probability of one person adopting a belief increases based on the number of people who hold that belief. This is a powerful form of groupthink and is one reason meetings are often unproductive. Example: When choosing between two restaurants that are similar in cost, menu, and atmosphere, we often choose the one with more people seated inside.

  4. Blind-spot bias: Failing to recognize your own cognitive biases is a bias in itself. People notice cognitive and motivational biases much more in others than in themselves. Example: Executives and senior leaders often have a blind spot and fail to recognize the changing competitive reality in their industries and in their markets until it is too late.

  5. Choice-supportive bias: When you choose something, you tend to feel positive about it, even if that choice has flaws. Example: You think your dog is awesome even though it bites people.

  6. Clustering illusion: The tendency to see patterns in random events. This illusion underlies various gambling fallacies. Example: Believing that red is more or less likely to come up on a roulette wheel after a string of reds, even though each spin is independent of the ones before it.

  7. Confirmation bias: We tend to listen to information that confirms our preconceptions, which is one of the many reasons it is difficult to have an intelligent conversation about climate change. Example: Many people believe that left-handed people are more creative than right-handed people and accept only the evidence that seems to confirm it.

  8. Conservatism bias: Where people favor prior evidence over new evidence or information that has emerged. Example: People were slow to accept that the Earth was round because they maintained their earlier understanding that the planet was flat.

  9. Information bias: The tendency to seek information even when it cannot affect action. More information is not always better; with less information, people can often make more accurate predictions. Example: Believing that the more information you can acquire before a decision, the better, even if that extra information is irrelevant to the decision.

10. Ostrich effect: The decision to ignore dangerous or negative information by “burying” one’s head in the sand, like an ostrich. Example: Research suggests that investors check the value of their holdings significantly less often during bad markets.

11. Outcome bias: Judging a decision based on the outcome rather than on how exactly the decision was made in the moment. Example: Just because you won a lot in Vegas does not mean that gambling was a smart decision.

12. Overconfidence: Some people are too confident about their abilities, and this causes them to take greater risks in their daily lives. Experts are more prone to this bias than laypeople, since they are more convinced that they are right. Example: Ninety-three percent of American drivers claim to be better than the median, which is statistically impossible because, by definition, no more than half of any group can be above the median.1

13. Placebo effect: When simply believing that something will have a certain effect on you causes it to have that effect. Example: In medicine, people given fake pills often experience the same physiological effects as people given the real thing.

14. Pro-innovation bias: When a proponent of an innovation tends to overvalue its usefulness and undervalue its limitations. Example: Roger Smith, then chairman of General Motors, said in 1986: “By the turn of the century, we will live in a paperless society.”2

15. Recency: The tendency to weight the latest information more heavily than older data. Example: Investors often think the market will always look the way it looks today, which leads to unwise decisions.

16. Salience: Our tendency to focus on the most easily recognizable features of a person or concept. Example: When you think about dying, you might worry about being mauled by a lion, as opposed to what is statistically more likely, such as dying in a car accident.

17. Selective perception: Allowing our expectations to influence how we perceive the world. Example: An experiment involving a football game between students from two universities showed that each team saw the opposing team commit more infractions.

18. Stereotyping: Expecting a group or person to have certain qualities without having real information about the person. Stereotyping allows us to quickly identify strangers as friends or enemies, but people tend to overuse and abuse it. Example: Believing that people of a certain ethnicity or background act a certain way.

19. Survivorship bias: An error in judgment that comes from focusing only on surviving examples. Example: We might think that being an entrepreneur is easy because we have not heard of all those who failed.

20. Zero-risk bias: Sociologists have found that we love certainty, even if pursuing it is counterproductive; a brief illustration follows this list. Example: “Zero-risk bias occurs because individuals worry about risk, and eliminating it entirely means that there is no chance of harm being caused,” says decision science blogger Steve Spaulding. “What is economically efficient and possibly more relevant, however, is not bringing risk from 1% to 0%, but from 50% to 5%.”3
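To put Spaulding’s percentages in concrete terms, consider the following minimal sketch in Python (the figure of 1,000 exposed people is hypothetical, chosen only to make the arithmetic visible):

    # Illustrative only: compares the two risk-reduction choices from the
    # zero-risk bias example above. The exposure figure is hypothetical.

    exposed_people = 1000  # hypothetical number of people facing each risk

    # Choice A: eliminate a small risk entirely (1% -> 0%)
    harms_averted_a = exposed_people * (0.01 - 0.00)  # 10 expected harms averted

    # Choice B: sharply reduce a large risk (50% -> 5%)
    harms_averted_b = exposed_people * (0.50 - 0.05)  # 450 expected harms averted

    print(f"Choice A (1% -> 0%) averts {harms_averted_a:.0f} expected harms")
    print(f"Choice B (50% -> 5%) averts {harms_averted_b:.0f} expected harms")

Zero-risk bias pulls us toward Choice A because it feels certain, even though Choice B averts 45 times as much expected harm.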

In today’s social media-driven world, people fall into information silos and listen to, watch, or read only what interests them. Doing so compounds many cognitive biases. If you want to improve your thinking and reduce the number of cognitive biases you fall prey to, watch a movie, read a book, or listen to a podcast you would normally shun. Engaging with new thoughts, ideas, or issues can help you refine what it is you value, whether it’s a style, a story line, or an argument. Remember, thinking is hard work, and understanding how you think is harder still. Strategic thinkers spend a good deal of time examining what and how they think.

Self-Awareness Check

Check off each of the cognitive biases that you have previously used:

  1. Anchoring bias:   _____

  2. Availability heuristic:   _____

  3. Bandwagon effect:   _____

  4. Blind-spot bias:   _____

  5. Choice-supportive bias:   _____

  6. Clustering illusion:   _____

  7. Confirmation bias:   _____

  8. Conservatism bias:   _____

  9. Information bias:   _____

10. Ostrich effect:   _____

11. Outcome bias:   _____

12. Overconfidence:   _____

13. Placebo effect:   _____

14. Pro-innovation bias:   _____

15. Recency:    _____

16. Salience:    _____

17. Selective perception:   _____

18. Stereotyping:   _____

19. Survivorship bias:   _____

20. Zero-risk bias:   _____

-----

Were you aware of your cognitive bias at the time? If so, what did you do to improve your thinking process?

How will you adapt your thinking moving forward now that you are more familiar with 20 common cognitive biases?

Knowledge Check

Which of the options best describes the biases listed?

  1. When you are overreliant on the first piece of information you hear.

a. Anchoring bias

b. Availability heuristic

c. Bandwagon effect

d. Blind-spot bias

  2. When you fail to recognize your own cognitive biases.

a. Anchoring bias

b. Availability heuristic

c. Bandwagon effect

d. Blind-spot bias

  3. When you tend to listen to information that confirms your preconceptions.

a. Choice-support bias

b. Clustering illusion

c. Confirmation bias

d. Conservatism bias

  4. When you ignore dangerous or negative information by “burying” your head in the sand.

a. Ostrich effect

b. Clustering illusion

c. Confirmation bias

d. Conservatism bias

  5. When you expect a group or person to have certain qualities without having real information about the person.

a. Stereotyping

b. Selective perception

c. Recency

d. Salience

Fallacies

Fallacies are common errors in reasoning that undermine the logic of your argument. Fallacies can be either illegitimate arguments or irrelevant points, and they are often identified by the lack of evidence supporting their claims. Avoid these 10 common fallacies in your own arguments and watch for them in the arguments of others.

  1. Ad hominem: This is an attack on the character of a person rather than on his or her opinions or arguments. Example: Greenpeace’s strategies aren’t effective because they are all dirty, lazy hippies. In this example, the author doesn’t even name particular strategies Greenpeace has suggested, much less evaluate those strategies on their merits. Instead, the author attacks the characters of the individuals in the group.

  2. Begging the claim: The conclusion that the writer should prove is validated within the claim. Example: Filthy and polluting coal should be banned. Arguing that coal pollutes the earth and thus should be banned would be logical. But the very conclusion that should be proved, that coal causes enough pollution to warrant banning its use, is already assumed in the claim by referring to it as “filthy and polluting.”

  3. Circular argument: This restates the argument rather than actually proving it. Example: George Bush is a good communicator because he speaks effectively. In this example, the conclusion that Bush is a “good communicator” and the evidence used to prove it (“he speaks effectively”) are basically the same idea. Specific evidence, such as using everyday language, breaking down complex problems, or illustrating his points with humorous stories, would be needed to prove either half of the sentence.

  4. Either/or: This is a conclusion that oversimplifies the argument by reducing it to only two sides or choices. Example: We can either stop using cars or destroy the earth. In this example, the two choices are presented as the only options, yet the author ignores a range of choices in between such as developing cleaner technology, car-sharing systems for necessities and emergencies, or better community planning to discourage daily driving.

  5. Hasty generalization: This is a conclusion based on insufficient or biased evidence. In other words, you are rushing to a conclusion before you have all the relevant facts. Example: Even though it’s only the first day, I can tell that this is going to be a boring course. In this example, the author is basing his evaluation of the entire course on only the first day, which is notoriously boring and full of housekeeping tasks for most courses. To make a fair and reasonable evaluation the author must attend not one but several classes, and possibly even examine the textbook, talk to the professor, or talk to others who have previously finished the course in order to have sufficient evidence to form a conclusion.

  6. Moral equivalence: This fallacy compares minor misdeeds with major atrocities. Example: That parking attendant who gave me a ticket is as bad as Hitler. In this example, the author is comparing the relatively harmless actions of a person doing their job with the horrific actions of Hitler. This comparison is unfair and inaccurate.

  7. Post hoc ergo propter hoc: This is a conclusion that assumes that if “A” occurred after “B” then “B” must have caused “A.” Example: I drank bottled water and now I am sick, so the water must have made me sick. In this example, the author assumes that if one event chronologically follows another the first event must have caused the second. But the illness could have been caused by the burrito eaten the night before, a flu bug that had been working on the body for days, or a chemical spill across campus. There is no reason, without more evidence, to assume the water caused the person to be sick.

  8. Red herring: This is a diversionary tactic that avoids the key issues, often by sidestepping opposing arguments rather than addressing them. Example: The level of mercury in seafood may be unsafe, but what will fisherfolk do to support their families? In this example, the author switches the discussion away from the safety of the food and talks instead about an economic issue, the livelihood of those catching fish. While one issue may affect the other, it does not mean we should ignore possible safety issues because of possible economic consequences to a few individuals.

  9. Slippery slope: This is a conclusion based on the premise that if A happens, then eventually, through a series of small steps (B, C, . . . , X, Y), Z will happen too, basically equating A and Z. So, if we don’t want Z to occur, A must not be allowed to occur either. Example: If we ban Hummers because they are bad for the environment, eventually the government will ban all cars, so we should not ban Hummers. In this example, the author is equating banning Hummers with banning all cars, which is not the same thing.

10. Straw man: This move oversimplifies an opponent’s viewpoint and then attacks that hollow argument. Example: People who don’t support the proposed state minimum wage increase hate the poor. In this example, the author attributes the worst possible motive to an opponent’s position. In reality, however, the opposition probably has more complex and sympathetic arguments to support their point. By not addressing those arguments, the author is not treating the opposition with respect or refuting their position.4

Knowledge Check

Which of the options best describes the fallacies listed?

  1. Greenpeace’s strategies aren’t effective because they are all dirty, lazy hippies.

a. Ad hominem

b. Begging the claim

c. Circular argument

d. Either/or

  2. Filthy and polluting coal should be banned.

a. Ad hominem

b. Begging the claim

c. Circular argument

d. Straw man

  3. We can either stop using cars or destroy the earth.

a. Ad hominem

b. Begging the claim

c. Circular argument

d. Either/or

  4. Even though it’s only the first day, I can tell that this is going to be a boring course.

a. Hasty generalization

b. Moral equivalence

c. Post hoc ergo propter hoc

d. Red herring

  5. I drank bottled water and now I am sick, so the water must have made me sick.

a. Hasty generalization

b. Moral equivalence

c. Post hoc ergo propter hoc

d. Red herring

Mental Models and the Difference between Bias and Fallacy

It’s important to realize that a cognitive bias is different from a logical fallacy. A fallacy is an actual mistake in reasoning, whereas a cognitive bias is a tendency to commit certain sorts of mistakes. With practice, we can learn to recognize and avoid mistakes of logic. This is not true of biases. Faults of logic come from how we think, so we can change our thinking to be more logical; biases, by contrast, arise from the mental models that allow us to think.

A cognitive bias is not necessarily a thinking error. Biases can manifest as a sort of prejudice, but it’s best to think of them as thinking tendencies. Biases slant our thinking toward certain avenues and conclusions, and often those conclusions are useful. Behind every cognitive bias is a mental model that is automatic and to which our conscious minds have no access. Biases result from the mental models the brain uses to make quick sense of information and experiences.

Mental models are frameworks through which you view life. They help you understand why you are thinking what you are thinking. Mental models, however, are difficult to identify and even more challenging to change. They can either help us overcome obstacles or hinder our progress through difficult situations.

In their 2017 publication Red Ocean Traps, researchers W. Chan Kim and Renée Mauborgne wrote: “Though mental models lie below people’s cognitive awareness, they’re so powerful a determinant of choices and behaviors that many neuroscientists think of them almost as automated algorithms that dictate how people respond to changes and events.”5 Such automation of our thoughts requires a constant assessment of how and what we think, in order to achieve the level of strategic thinking required to solve problems, answer questions, or address issues.

One of the more famous mental models is the law of the instrument, identified by Abraham Maslow in his 1966 book The Psychology of Science. This mental model is defined as an overreliance on a familiar tool. In popular culture, people are familiar with Maslow’s observation: “I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.”

One example of the law of the instrument is the mechanic who specializes in transmissions. When you drive a car with a radiator problem into the transmission shop, you are more likely to have a new transmission put in than to have the actual problem fixed. When confronted with unfamiliar problems, a person with one mental model, a hammer, will resort to old techniques of questionable effectiveness rather than formulate new and better ones. Recent research sheds further light on mental models.

Nobel laureate and best-selling author Daniel Kahneman published Thinking, Fast and Slow in 2011 and explained a dichotomy between two modes of thought: “System 1” is fast, instinctive, and emotional; “System 2” is slower, more deliberative, and more logical. System 1 is fast, automatic, frequent, emotional, stereotypic, and unconscious. Examples include judging that one object is more distant than another, localizing the source of a specific sound, completing the phrase “war and . . . ,” displaying disgust at a gruesome image, and reading text on a billboard. System 2 is slow, effortful, infrequent, logical, calculating, and conscious. Examples include bracing yourself before the start of a sprint, directing your attention toward someone at a loud party, looking out for the woman with the grey hair, digging into your memory to recognize a sound, and sustaining a faster-than-normal walking pace.

To understand just how challenging it is to change your mental model, consider Kahneman’s own proclamation: “Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely. . . . And I have made much more progress in recognizing the errors of others than my own.” If one of the leading authorities on mental models finds it difficult to improve his own thinking, rest assured you will most likely find it equally challenging, or perhaps even more so.

Self-Awareness Check

Do you have difficulty finding solutions to problems?

Are you aware of your mental models? Is your only tool a hammer? And if so, do you treat everything in your life as if it were a nail?

What have you done lately to help yourself develop additional mental models so that you are prepared to provide yourself with alternative solutions when the next issue arises?

A Profile in Strategy #1: Chobani

The poet Rumi wrote: “As you start to walk on the way, the way appears.” People like Chobani founder Hamdi Ulukaya did just that. As Ulukaya said, “When I started Chobani, I’d never run a company before and there was no plan.”6 Ulukaya is a Turkish businessman, entrepreneur, investor, and philanthropist based in the United States. He is the owner, founder, chairman, and CEO of Chobani, the #1-selling strained (Greek-style) yogurt brand in the United States.

On April 26, 2016, Ulukaya announced to his employees that he would be giving them 10 percent of the shares in Chobani.7 Born into a dairy-farming family in a small village in Turkey, Ulukaya immigrated to the United States in 1994 to study English and took a few business courses as well. In an interview with CNN Money, Ulukaya explained that he was deeply committed to Kurdish rights and left Turkey because of the Turkish state’s oppression of its Kurdish minority.

He started a modest feta cheese factory in 2002 on the advice of his father. His greater success came from taking a major risk: in 2005 Ulukaya purchased a large, defunct yogurt factory in upstate New York, in a region that was once a center of the dairy and cheese industry. With no prior experience in the yogurt business, he built Chobani into a yogurt empire with facilities in several states. The company surpassed $1 billion in annual sales less than five years after launch, becoming the leading yogurt brand in the United States by 2011.

The popularity of his Greek-style yogurt also helped drive Greek yogurt’s share of the U.S. market from less than 1 percent in 2007 to more than 50 percent in 2013. Ernst & Young named Ulukaya the World Entrepreneur of the Year in 2013.8 The success of his yogurt empire has made Ulukaya a billionaire, and he has helped create new jobs in several regions across the United States. According to Forbes, his net worth as of 2016 was $1.92 billion.9 Ulukaya figured out a way to translate his dream into reality without a plan or a path. Can you?

A Profile in Strategy #2: 20 Miles a Day

In the austral summer of 1911-12, Roald Amundsen and Robert Falcon Scott led separate teams that reached the South Pole within about a month of each other: Amundsen in December 1911 and Scott in January 1912. But while Scott and his four companions died on the return journey, Amundsen’s party returned without loss of life. Each team used a different strategy for the journey. Scott’s team would walk as far as possible on good weather days and rest up on the bad days. Conversely, Amundsen’s team adhered to a strict regimen of consistent progress, walking 20 miles every day no matter the weather. Even on days when the team could have walked farther, Amundsen stopped at 20 miles to conserve energy for the next day’s 20 miles. In his best-selling book Great by Choice, Jim Collins highlights this trek of roughly 1,400 miles as a case study in diligent consistency. The team that took consistent action made it back safely.

This 20-miles-a-day strategy is a good case study, as it includes at least seven specific components to consider as you consistently apply your own strategy to achieve a personal or professional goal. The following seven components are in no particular order.10 A brief sketch after the list illustrates the first two.

  1. Performance indicators: A good 20 Mile March uses performance markers that delineate a lower bound of acceptable achievement. These create productive discomfort and must be challenging (but not impossible) to achieve in difficult times.

  2. Limitations: A good 20 Mile March has self-imposed constraints. This creates an upper bound for how far you’ll march when facing robust opportunity and exceptionally good conditions. These constraints should also produce discomfort in the face of pressures and fear that you should be going faster and doing more.

  3. Customized: There’s no all-purpose 20 Mile March for all organizations or individuals. A good 20 Mile March is tailored to the specific goal in any given situation.

  4. Self-imposed: A good 20 Mile March is designed and self-imposed by the organization or individual, not imposed from the outside or blindly copied by others.

  5. Time limit: A good 20 Mile March has a Goldilocks time frame, not too short and not too long but just right. Make the timeline of the march too short, and you’ll be more exposed to uncontrollable variability; make the timeline too long, and it loses power.

  6. Achievable: A good 20 Mile March lies largely within your control to achieve.

  7. Consistency: A good 20 Mile March must be achieved with great consistency. Good intentions do not count.
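To make the first two components concrete, here is a minimal sketch in Python (the 25-mile ceiling is hypothetical; Amundsen’s own floor and ceiling were effectively the same 20 miles):

    # Illustrative only: a 20 Mile March pairs a performance floor
    # (component 1) with a self-imposed ceiling (component 2).

    LOWER_BOUND = 20  # miles: minimum acceptable progress, even in bad weather
    UPPER_BOUND = 25  # miles: hypothetical cap, even in good weather

    def todays_march(possible_miles: float) -> float:
        """Clamp the day's progress between the floor and the ceiling."""
        return max(LOWER_BOUND, min(possible_miles, UPPER_BOUND))

    print(todays_march(35))  # good weather: stop at 25 to conserve energy
    print(todays_march(12))  # bad weather: push on to reach the 20-mile marker

Treating both bounds as non-negotiable is what turns a goal into a march: the floor creates productive discomfort in hard times, and the ceiling restrains you when conditions are good.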

Thinking Exercise #2: 360 Degrees

F. Scott Fitzgerald observed: “The test of a first-rate intelligence is the ability to hold two opposed ideas in mind at the same time, and still retain the ability to function.” Strategic thinkers often hold the two opposing ideas of “this might work” and “this might not work” in their mind as they make a decision and move forward. Can you? Have you? What can you do to improve your ability to hold two opposing ideas in your mind simultaneously and still retain the ability to function?

One exercise you can do is called “360 Degrees.” This exercise helps you develop not just two opposing ideas but the full spectrum of thoughts involved with a specific topic. Most stories admit multiple interpretations. When analyzing a story, effective strategic thinkers work hard at identifying its various aspects before making a decision. Such an approach allows a strategic thinker to exercise a level of sophistication in their assessment. The end product often provides detailed, nuanced, and dynamic insight involving multiple viewpoints for the reader to consider.

The adjacent image of a circle contains 12 segments ranging from 30 to 330 degrees and represents a good analogy for you to keep in mind as you go about developing strategies to resolve a problem, answer a question, or address an issue. Remember Fitzgerald’s observation about intelligence. Once you hear of an idea, challenge yourself to think about the spectrum of elements involved with this idea so that you can get a complete 360-degree view of the situation. Only then can you begin to think strategically.

Let’s take gun control as an example. It is one of the most divisive issues in contemporary American society, and the typical debate is two-dimensional: you are either for or against gun control, as illustrated in the adjacent circle. Unfortunately, many topics today are split into this simple dichotomy. But American society is far more complex, and any strategic thinking about this topic should account for such complexities.

For example, in his 2016 book, American Character: A History of the Epic Struggle Between Individual Liberty and the Common Good, journalist Colin Woodard explores the cultural differences in American history. He argues that the United States is made up of 11 culturally disparate regions that could easily be their own unique countries, and that the dynamic between these regions has shaped the American character11:

“The American effort to achieve consensus on the appropriate balance between individual and collective freedom is hampered by the simple fact that America is not a unitary society with a single set of broadly accepted cultural norms, like Japan, Sweden, or Hungary. It’s a contentious federation comprising eleven competing regional cultures, most of them centuries old, each with a different take on the balance between individual liberty and the common good.”12

As a result of these competing regional cultures, there is often a third component to many issues that I will label “conditional,” otherwise known as the “grey area.” Individuals who fall into the conditional category are neither for nor against a specific topic. They fall into the grey area and often consider the nuances involved. On the issue of gun control, a nuanced examination would involve supporting gun ownership, but only after certain conditions are met. Effective strategic thinkers and writers can often help build the bridge of consensus necessary for conditionals to agree.

Conditionals delve into the nuances involved with a topic and seldom fall into the dichotomy of “for versus against.” The adjacent pie chart includes seven specific topics within the larger issue of gun control: 1. Age of gun owner; 2. Location to purchase a gun; 3. Restrictions on who can buy a gun; 4. Regulations at the state and federal levels; 5. Sales policy to purchase a gun; 6. The role of background checks; and 7. Types of permits required. If you subscribe to the “for versus against” dichotomy you will most likely ignore these specific topics. If you would like to think strategically about the larger issue of gun control, however, you will need to spend considerable time thinking about each of these concerns.

For example, let’s say you are against guns. According to your thought process, no one should own a gun regardless of age, employment position, or other conditions. If that is the case, then you would agree with the following statements:

Women should never own a gun to protect themselves.

People who work in remote areas of the country, where emergency support would take a great deal of time to arrive, should not own a gun.

Celebrities should never own a gun to ward off stalkers.

Members of the military or police who retire should not be allowed to keep or own any firearms after they leave the service.

Conducting this exercise will help you see the grey areas in most issues. Remember that you are conducting an exercise with opposing ideas, some of which may run against what you believe. Strategic thinkers train themselves to be aware of as many sides of an issue as possible. If everything you read, watch, and listen to feeds into your current view on a topic, it will be rather difficult for you to think, act, or write strategically. A myopic view on any subject often illustrates a lack of thinking. Spend time developing your ability to think strategically. Doing so will help you grow both personally and professionally.

-----

360 Degree Exercise

Select a topic, other than gun control, and create a 360-degree view of the key concerns within that issue so that you can prepare to think and write strategically. Try to identify at least six conditions that are involved with the issue you select. The preceding gun ownership example had seven conditions.

1Don Moore, “Overconfidence,” Psychology Today, January 22, 2018.

2Howard F. Didsbury, Thinking Creatively in Turbulent Times (World Future Society, 2004), p. 41.

3Shana Lebowitz and Drake Baer, “20 cognitive biases that screw up your decisions,” Business Insider, August 6, 2015.

4For more information on fallacies, visit the Purdue Online Writing Lab at https://owl.purdue.edu.

5W. Chan Kim and Renée Mauborgne, Red Ocean Traps (Boston: Harvard Business Classics, 2017).

6Forbes, “Greatest Living Business Minds,” https://www.forbes.com/100-greatest-business-minds/person/hamdi-ulukaya (accessed May 16, 2018).

7S. Strom, “At Chobani, Now It’s Not Just the Yogurt That’s Rich,” The New York Times, April 2016, https://www.nytimes.com/2016/04/27/business/a-windfall-for-chobani-employees-stakes-in-the-company.html (accessed April 10, 2018).

8For a complete list of Entrepreneur of the Year award winners, visit https://www.ey.com/gl/en/about-us/entrepreneurship/entrepreneur-of-the-year/world-entrepreneur-of-the-year---past-winners.

9I. Bosilkovski, “Billionaire Hamdi Ulukaya’s Chobani to Spend $5 Million to Train, Assist Young Entrepreneurs,” Forbes, July 2017, https://www.forbes.com/sites/igorbosilkovski/2017/07/07/billionaire-hamdi-ulukayas-chobani-to-spend-5-million-to-train-assist-young-turkish-entrepreneurs/#ff1893e5d2e2 (accessed March 3, 2018).

10Brett & Kate McKay. 2013. “What’s Your 20 Mile March?” The Art of Manliness. https://www.artofmanliness.com/articles/whats-your-20-mile-march (data accessed April 2, 2018).

11C. Lynch, “America May Be More Divided Now Than at Any Time Since the Civil War,” Salon, October 2017, https://www.salon.com/2017/10/14/america-may-be-more-divided-now-than-at-any-time-since-the-civil-war/ (accessed August 4, 2018).

12Ibid.
