Multiple studies suggest public relations people have about the same level of ethical development as the average college-educated adult. On the standard tool customarily used to measure ethical development, public relations practitioners score just above business people but below journalists. Not surprisingly, philosophers score highest of all and prison inmates, lowest.1
And yet when one researcher examined the philosophy stacks in 31 leading academic libraries, he discovered that books on ethics were more likely to be missing than other philosophy texts (Schwitzgebel, 2009). In fact, obscure ethics texts of interest only to scholars were about twice as likely to be missing as comparable non-ethics titles. It is a wonder prison library shelves are not emptier than they are.
All of which suggests that, contrary to Aristotle’s notion of character as a settled disposition to act in a certain way, our ethical behavior is malleable and dynamic, the product of psychological and social forces operating below the level of consciousness in the darkest recesses of our minds.
Situational Influences
In a classic series of experiments conducted by Stanley Milgram at Yale University in the 1960s, participants were instructed to administer what they believed were electrical shocks to someone they could not see, but could certainly hear (Milgram, 1963). The results were always the same—a majority of the participants continued administering shocks at higher and higher voltages, even when the unseen subject was screaming in agony. “Stark authority was pitted against the subjects’ strongest moral imperatives against hurting others,” Milgram (1973, p. 62) wrote, “and, with the subjects’ ears ringing with the screams of the victims, authority won more often than not.”
A similar experiment by Philip Zimbardo and colleagues at Stanford University explored the psychological effects of becoming a prison guard or a prisoner (Haney et al., 1973). Volunteer university students were assigned roles as “guards” or “prisoners” in a mock prison in the basement of the psychology building. The “guards” carried wooden batons and wore military-style khaki uniforms and mirrored sunglasses. The “prisoners” wore badly fitting smocks, stocking caps, and a chain around one ankle. They were addressed only by the number sewn onto their smocks. Within six days of a planned two-week experiment, the guards were exhibiting such sadistic behavior, and the prisoners were suffering such extreme stress, that the whole thing was called off.2
Both experiments were heavily criticized at the time, but they were also replicated elsewhere with substantially the same results, strongly suggesting our ethical behavior is highly influenced by the concrete situations in which we find ourselves.3
A long litany of psychological experiments demonstrates those influences can be quite subtle. One study showed that someone standing outside a bakery with the smell of fresh bread in the air is more likely to help a stranger than someone standing outside a “neutral-smelling hardware store” (Baron, 1997). Someone asked to read sentences with words like “honor” and “respect” is more polite, minutes later, than someone who read words like “obnoxious” and “bluntly” (Bargh et al., 1996). People are more likely to litter if there is a lot of trash lying around (Cialdini et al., 1990). Graffiti leads to more graffiti and even to more theft (Keizer et al., 2008).
Implicit Cognition
And if that is not discouraging enough, it turns out our deepest attitudes are not as pure as we thought. The relatively new field of “implicit social cognition” studies the unconscious associations and impressions we accumulate indiscriminately as we go about our daily lives. Unlike explicitly held knowledge, these impressions are not fact-checked or reconsidered as they form. Yet they become deeply rooted assumptions about the world and the people around us. As social psychologists Brian Nosek and Jeffrey Hansen (2008, p. 554) put it, they operate “without the encumbrance of awareness, intention, and control.” And they manifest themselves as positive or negative attitudes that reflect what we like or dislike, favor or disfavor, approach or avoid.
Many people would repudiate those negative attitudes if they were aware of them. Nevertheless, they do affect our behavior. For example, Eugene Caruso, a professor of behavioral science at the University of Chicago, gave participants in a trivia game the option of choosing partners based on certain traits such as IQ or weight. Although participants said weight was “the single least important factor in their choice,” a clear preference for thin partners emerged. In fact, participants sacrificed between 10 and 12 IQ points to work with thinner teammates. In another study, participants were willing to accept a 20 percent lower salary to work for a man instead of a woman (Caruso et al., 2009). Leaving IQ points or money on the table, especially when one explicitly reports that weight and gender do not matter, is hardly rational.
In 2004, behavioral economists at MIT and the University of Chicago sent resumes out to prospective employers in Boston and Chicago. All the resumes listed the same backgrounds, experience, and qualifications, but half were for candidates with names like “Emily” or “Greg,” while the others were for people named “Lakisha” or “Jamal.” The “white-sounding” names received 50 percent more callbacks (Bertrand and Mullainathan, 2004). Research in Europe with people who had Muslim-sounding names produced similar results (Rooth, 2010).
Other research has demonstrated that the effects of unconscious negative racial attitudes extend into every aspect of life, from the serious to the trivial. Doctors are more likely to prescribe life-saving care to whites (Green et al., 2007), people feel less empathy toward someone in pain if they are of a different race (Avenanti et al., 2010), and basketball referees subtly favor players with whom they share a racial identity (Price and Wolfers, 2010). As Harvard Law professor Jon Hanson once said, “Our brains, it seems, have a mind of their own.”4 Our brain’s “mind” operates at levels far below our awareness, at frightening velocity, with powerful biases. “What we think we know about what is moving us is only a tiny, and often a misleading, part of what is actually going on in those parts of our brains that elude introspection but that can nonetheless manifest in our perceptions, emotions, and actions,” Hanson says. We may think we are colorblind, but our brain knows differently. We do not choose our unconscious attitudes. We are bombarded every day by cultural messages associating white with good and black with bad. (If you would like to better understand your own implicit biases, take one of the online surveys at http://implicit.harvard.edu).5
Compounding matters, a long series of experiments has demonstrated that as soon as humans bunch together we start to copy other members of our group, favor members of our own group over outsiders, look for a leader to worship, and eagerly fight anyone not in the group (De Dreu et al., 2010). And it is amazing how little it takes to corral people into herds. Psychologists assigned teenage boys to different groups based solely on their stated preferences for paintings by Klee and Kandinsky (Tajfel et al., 1971). The boys did not know who else was in their group and had no idea what significance the assignment had. But when each boy was asked to distribute money to the members of both groups, he gave more to those in his own group than to those in the other, even though he seemingly had nothing to gain from it. The implication is that people build their identities from their group memberships; the boys were boosting their own identities by rewarding their group. Such is the power of group membership.
Moral Balance
Even when external influences are not at play, our behavior can be surprisingly inconsistent with our values and beliefs. Psychologist Mordecai Nisan suggests we internalize a “sort of moral balance … of all morally relevant actions within a given time span” (1991, p. 213). We make deposits and withdrawals from that account, but we will not allow ourselves to go below a certain “personal standard.” Faced with an ethical choice, we select the option that allows us to maintain a “satisfactory balance,” taking into account all we have done to the present and what we have committed to do in the future.
Research supports his theory. For example, several experiments show that, if the probability of getting caught is low enough, many people will seize the opportunity to advance their self-interest. Whether participants reported on their ability to add columns of numbers (Gino et al., 2009) or to win simple coin tosses (Batson et al., 2003), a significant proportion would fudge their results to win a small reward if their chance of discovery appeared low. But interestingly, they changed their results just a little, suggesting there is a limit to how much people will cheat. Other experiments (Rosenhan et al., 1981) have shown a rise in altruistic behavior (deposits in the moral balance) following an ethical transgression (withdrawals from the moral balance).
Inconsistencies between behavior and belief can also be the product of moral disengagement. Psychologists have known since the 1950s that behavior inconsistent with beliefs creates a psychological tension, known as cognitive dissonance, that can only be relieved by changing one or the other (Festinger, 1962). When the cost of changing behavior is high enough, many people will unconsciously change their beliefs instead. And they will find creative excuses to justify their behavior, for example claiming it serves a moral purpose, blaming it on external factors, minimizing its consequences, or dehumanizing its victims (Shu et al., 2009). And in the end, no matter what ethical theory we follow, research shows we are far more likely to condemn a behavior when it happens to lead to a bad outcome than when the same behavior turns out well (Gino et al., 2010).
Bounded Rationality
All this has clear implications for the application of ethical theory. Faced with an ethical dilemma, we never have enough time to decide what to do, and how we use the time available depends to a great extent on unconscious influences. We never have enough hard data either, and whatever information we do have goes through the filter of our past experience, our current concerns, and our innate prejudices, biases, and cognitive illusions. That is not to suggest we should ignore our deepest sensations and inclinations, as if they are devoid of meaning. On the contrary, our feelings and passions brim with information. We must strive to understand and articulate their meaning so we can control them and factor what is useful and true into the ethical choices we make. For example, knowing what makes us angry or uncomfortable can help us manage our feelings. Recognizing our unconscious biases and inclinations can be the first step in controlling them. Nobel Prize-winning economist Herbert Simon (1916–2001), a pioneer of behavioral decision research, called decision making within these limits of time, information, and cognitive capacity “bounded rationality” (Simon, 1982).
Relative or Universal
Adding another element of complexity to ethical decision making is an age-old debate over the universality of ethical standards. Some people believe ethical rules are relative and apply differently in different cultures. They point out, for example, that in many Western countries, we would consider it unjust to give relatives preferential treatment in hiring and promotion decisions. But in some Asian and Arab countries it would be considered unfair and discourteous to do otherwise.
Others believe ethical rules are universal and apply everywhere, regardless of local norms and customs. A large U.S. computer company discovered how naïve this can be when it required its Saudi Arabian engineers to attend the same sexual harassment training as its U.S.-based managers, including a case in which a manager makes sexually explicit remarks to a female employee over drinks in a bar. The Saudi engineers were so baffled and offended by that scenario that they missed the main message about sexual harassment.
We believe the basic problem is confusing “universal” with “absolute.” An absolute approach allows no exceptions and no room for interpretation or expression that may vary from culture to culture. A universal approach recognizes that basic ethical tenets apply to all human beings in like circumstances, but their interpretation and application can vary from society to society. Since ethical decisions are often based on deciding what is best for society, in practice their application is highly influenced by cultural values, or what a group believes to be good, right, and desirable as passed on from generation to generation (Herskovits, 1952, p. 634).
The Dutch social psychologist Geert Hofstede (2010) constructed a framework for differentiating cultures by their respective “values,” which he defined as “broad tendencies to prefer certain states of affairs over others.” Initially, Hofstede’s framework had four primary dimensions (here, highly simplified):6
Whether the culture is individualistic or collective. Are people and their families essentially on their own? Or do they belong to strong groups from birth onwards?
How power is distributed within a culture. Is the culture hierarchical or nonhierarchical? Is unequal distribution of power expected and accepted?
How a culture handles uncertainty. Are people comfortable in unstructured or new situations? Or do they like familiar situations and strict rules and standards?
How “masculine” or “feminine” a culture is. Are the men assertive and competitive or more modest and caring? Are women modest and caring or competitive and assertive?
While these dimensions can help in understanding a culture, it would be a mistake to assume they work mechanistically. For example, it is generally accepted that the culture of the United States is toward the “individualistic” end of the scale, while Latin and Asian cultures tend to be more “collectivist.” But those are broad generalizations. A group of Stanford University researchers (Morris et al., 2001) studied the conditions under which Citibank employees in different countries would agree to help a colleague with a task. They could have had any of a range of reasons for complying with the request—the rank of the employee making it, the requestor’s past cooperation, or maybe they just liked him or her. But their actual reason tended to follow a similar pattern, depending on the country. As expected, in the individualistic culture of the United States, reciprocity was the key motivator. U.S. employees usually asked, “What’s this guy or gal done for me before?” In the more collectivist cultures of China and Spain, employees reflected those leanings, but in different ways. The Chinese asked themselves, “Is the person making the request connected to someone of higher authority?” while the Spaniards asked themselves, “Is the requestor connected to any of my friends?” The Germans, whose culture is individualistic but places a high value on rules and structure, asked, “What do the rules require?” Even though employees in each country behaved in ways broadly consistent with Hofstede’s dimensions, they approached the same request very differently.7
So Hofstede’s research, as groundbreaking as it was, should only be considered a starting point in understanding other cultures. One nation’s culture can only be described relative to another’s. And even then, there are no absolutes. Hofstede is dealing with the central tendency within one culture as compared to others. But there is always variation around a central tendency, the “standard deviation” in math-speak. There is also nothing magical about the number of dimensions Hofstede identified—he started with four and, as more data came in, he added two—whether a culture’s orientation is toward the future or the past and whether a culture fosters immediate gratification or restraint.8
But what of ethical standards themselves? Are they universal or relative? Our own view is that ethical standards are the product of reason informed by local culture. More importantly, as we noted in Chapter Two, we are constantly refining our understanding of good and evil, right and wrong. For example, it is pretty clear our understanding of human equality is more complete today than it was a century ago. And it is probably fair to speculate it will be better a hundred years in the future. But our progress is almost always uneven and halting, especially at the margins of human activity.
Nor should we assume progress in understanding ethical standards is a phenomenon of western civilization. Western ideas about the ethical rights of women may be superior to those in some less developed societies, but we should also be open to the possibility that the reverence some of the least developed societies have for the natural environment is superior to our relative indifference. And who knows? A hundred years from now vegetarianism may be a widely held ethical standard rather than a culinary preference. So we should always approach questions of ethics with some humility and even a dose of uncertainty. Our goal should not necessarily be to find the absolutely right answer that applies everywhere all the time, but the best answer under the circumstances—and to justify it to the best of our abilities.
Ethical Relativism
Respecting differences and recognizing our own limits as human beings do not equate to ethical relativism. As Bill George, the former CEO of Medtronic and now a professor at the Harvard Business School, once wrote in BusinessWeek, “To sustain their success, companies must follow the same standards of business conduct in Shanghai, Mumbai, Kiev, and Riyadh as in Chicago.”9 Ethical values are not something we put on and take off like a comfortable overcoat depending on the temperature of the country we are in. And there is a big difference between etiquette and ethics. Respect for people’s human dignity is a matter of ethics; whether a woman chooses to wear a veil or not is a matter of etiquette. We can respect the latter without denying the former.
It is also important to distinguish between customary behavior in some societies and underlying ethical standards that may or may not be consistent with them. For example, we cannot think of a single culture that does not value honesty. But in some countries reporters expect compensation for covering a news conference or for writing a story based on a company news release. Some public relations practitioners do not consider this very different from tipping a waiter. “Reporters in some countries do not make much money,” they have told us. “These payments are considered part of their compensation.”
That may be true, as far as the bribe-taking reporters are concerned. But it seems to us paying reporters to run a news release violates a number of ethical principles. In terms of consequences, it harms the reporter’s readers. When they read a newspaper, they expect articles free from outside influences. Even on the assumption that a news release contains no misleading information, its very appearance in the paper gives it more significance than it might otherwise have, which makes it misleading. It also violates a public relations person’s duty to engage in fair and open communications. Public relations people are supposed to contribute to the free flow of information. This behavior corrupts one of any democracy’s key institutions—a free press. And on the level of virtue, it is clearly dishonest; otherwise, why hide it? Tipping a waiter is done in the open for everyone, including the waiter’s employer, to see. But the waiter’s employer would likely frown on a gratuity quietly slipped to a server prior to the meal to ensure priority service. Such behavior would put other diners at a disadvantage and endanger the employer’s reputation. That is more analogous to the situation at hand.
Local Customs
Local customs clearly complicate the situation. It is true that “tipping” journalists is condoned in some circles in some countries. But even in those countries, newspaper readers would likely consider it corrupt and unethical. At minimum, an ethical public relations practitioner would insist on disclosure of the payment so readers can draw their own conclusion about the resulting article’s newsworthiness and read it with full knowledge of its sourcing. Bribing reporters is not really an accommodation to cultural differences; it is capitulation to a dishonest practice no culture should accept.
As in many areas of ethical decision making, hard and fast rules in global public relations are rare. But some standards are universal, starting with the most basic principle of the United Nations’ Universal Declaration of Human Rights—everyone is born equal under the law with basic rights and freedoms. “Cultural relativism is morally blind,” writes Thomas Donaldson (1996, September–October), professor of law and business ethics at the Wharton School of the University of Pennsylvania. “There are fundamental values that cross cultures, and companies must uphold them.” In Donaldson’s view, and ours, all organizations have an ethical duty to respect human dignity, to respect people’s basic rights, and to practice good citizenship.
The first of these duties—to respect human dignity—means treating people as ends, not simply as means to accomplish corporate purposes. It means respecting their autonomy and right to reason, giving them a safe place to work, and producing safe products and services. Respecting people’s basic rights means acting in ways that support their rights under the UN Universal Declaration of Human Rights, including full equality, liberty, and personal security. And practicing good citizenship means supporting social institutions that further these rights, such as the economic system, the educational system, and organizations to protect the environment. And, yes, a free press.
Business ethicist Richard DeGeorge (2000, September 1, p. 50) proposes a very similar set of guidelines to address international business ethics questions:
• Do no direct intentional harm.
• Produce more good than harm for the host country.
• Respect the rights of employees and of all others affected by one’s actions or policies.
• To the extent consistent with ethical norms, respect the local culture and work with and not against it.
• Pay your fair share of taxes and cooperate with the local governments in developing equitable laws and other background institutions.
In practice, even adhering to broad ethical standards such as these will present ethical dilemmas. Google, for example, operates under the standard of “Don’t be evil.” Yet when the Chinese government instructed the company to censor the results of its search engines based in Mainland China to omit subjects deemed “offensive” or “subversive,” the company complied.
There is little question American companies have to obey the laws of their host countries if they want to operate within their borders. The real question is whether or not they want to operate there at all. As in many ethical questions, it all comes down to finding the right balance between benefits and costs. And as Google correctly determined, the relevant costs and benefits were not only those the company itself would endure or enjoy—lost market opportunity if it left, revenue and criticism if it stayed—but also the effects of its decision on the people of China.
On that basis, Google decided that a censored search service, which was its only option since it needed Chinese government approval to locate its servers in the country, would be better than nothing. Google cofounder Sergey Brin explained his reasoning to Fortune magazine. “We felt that by participating there, and making our services more available, even if not to the 100 percent that we ideally would like, that it will be better for Chinese Web users, because ultimately they would get more information, though not quite all of it.”10
Importantly, Google made all the limitations of its China service known. If a computer user typed something like “Tiananmen Square” into the Google China search engine, the results pages would not show the protestors and government tanks that show up on the same search from any other country, but it would include a small disclaimer at the bottom of the page—“Local regulations prevent us from showing all the results.” Meanwhile, Google did not shut down the existing uncensored search engines located outside China, and it stayed away from e-mail or blogging services based on the mainland to avoid future government demands to cough up user identities.
Nevertheless, Google was criticized for its “surrender” to the Chinese. Amnesty International, for example, said Google’s decision showed that “when it comes to the crunch, profits have come before principles.”11 Google lived with that criticism for four years. But in 2010, when the Chinese government increased censorship of search results even further and the offshore Gmail accounts of Chinese dissidents were hacked, the company pulled its Web search engine from Mainland China and explained its reasoning online.12 Still, as the Wall Street Journal observed, “stepping back from these countries is financially risky for Google because they are large economies with growing online populations.”13 And as this was written the company was thinking about dipping its toes back into China through its mobile app store.14
Implications for Public Relations Practice
Sometimes—as in South Africa during apartheid—the answer to these ethical questions will be “doing business here will cost the local people more than it will benefit them.” There are no general rules of thumb to make these decisions easier. Different companies, operating under different conditions, may even come to different conclusions. But a global company needs to know how to make those decisions, drawing on the best available advice, if possible from the people most directly affected, and with full transparency. Part of the secret to global success is knowing how to be local without sacrificing one’s core values (“thinking global, acting local”). And that involves a lot more than knowing what side of the road to drive on.
But it would be a mistake to assume western companies will only encounter ethical conflicts in countries with authoritarian regimes. For example, conceptions of privacy are very different in the United States than in Europe, which has granted its citizens “a right to be forgotten” that extends beyond its borders. Many observers believe the U.S. Constitution would prohibit such a provision in the United States. Nevertheless, American companies doing business in Europe have to find a way to work within those standards because even if they manage to get the rules changed, they will have to deal with the public attitudes that underlie them. And that points up another issue American companies bump up against worldwide—their preference for light regulation conflicts with other countries’ political history and reality.
Summary
As this chapter illustrates, public relations operates on the ragged edges of the social and psychological sciences. Whether practicing at home or abroad, we operate in a gray area of ambiguity and uncertainty. Some impediments arise from the particular situation we are in, some from our personal psychological makeup, some from unconscious biases, and some from an unfamiliar cultural context. So what are we to do when facing a thorny ethical dilemma?
• First, we should ensure we have the cognitive and emotional space to think clearly.
• Second, we should recognize any factors in the situation or in ourselves that could influence our decision making, consciously or unconsciously.
• And third, we should question our knee-jerk thinking patterns, carefully adhering to a systematic and organized approach.
That systematic and organized approach—a framework for ethical reasoning—will be the topic of our next chapter.
________________
1 Psychologist Lawrence Kohlberg divided ethical development into three primary levels of two stages each. The first or “preconventional” level is guided by punishment or reward. The second or “conventional” level is guided by the expectations of a given society, as in “doing one’s duty.” The third or “postconventional” level is guided by universal, shared principles such as justice and care. Another psychologist, James Rest, developed the “Defining Issues Test” (DIT) to quantify Kohlberg’s model. It presents six ethical dilemmas, each accompanied by 12 statements that correspond to Kohlberg’s six stages. Respondents are instructed to rate these statements according to their perceived importance in making an ethical decision about the dilemma presented. The score obtained from these rankings is considered a reflection of moral development. Since the DIT was developed, it has been taken by thousands of people, providing average scores for a number of professions. Several researchers have applied the DIT to public relations practitioners. For example, Paul Lieber’s (1998) master’s thesis used it to gauge the ethical decision-making patterns of public relations practitioners. He expanded on this work in a 2008 paper for Public Relations Review. In 2009, Lieber’s thesis advisor, Renita Coleman, did her own analysis of PR practitioners’ moral development with colleague Lee Wilkins for Public Relations Research. The DIT scores cited here are drawn from that research, which showed the following “scores”: prison inmates, 23.7; business professionals, 38.13; adults in general, 40; graduate students, 44.9; public relations practitioners, 46.2; journalists, 48.68; philosophers, 65.1. Links to all these papers are provided in the References section.
2 The Stanford Prison Experiment was conducted on behalf of the U.S. Navy and was documented in a paper by the principal researchers, Craig Haney, Curtis Banks, and Philip Zimbardo (1973). See http://www.zimbardo.com/downloads/1973%20A%20Study%20of%20Prisoners%20and%20Guards,%20Naval%20Research%20Reviews.pdf. There is also a website dedicated to the experiment. See http://www.prisonexp.org/psychology/41. It has even inspired a movie. See http://www.imdb.com/title/tt0420293/
3 For replications of Milgram’s experiments, see Burger (2009). More shocking results: New research replicates Milgram’s findings. Monitor on Psychology, Vol. 40, p. 3. http://www.apa.org/monitor/2009/03/milgram.aspx. For cross-cultural implications, see Shanab and Yahya (1978). A cross-cultural study of obedience. Bulletin of the Psychonomic Society, Vol. 11, pp. 267–269. http://link.springer.com/article/10.3758/BF03336827#page-2. Although contemporary ethical concerns have made it difficult to repeat the Stanford prison experiment in later years, it was recreated for a BBC television program, which was also halted early. See Wells (2002, January 24). BBC halts “prison experiment.” The Guardian. http://www.theguardian.com/uk/2002/jan/24/bbc.socialsciences.
4 Hanson, J. (2009, February 19). Why race may influence us even when we “know” it doesn’t. The Situationist. https://thesituationist.wordpress.com/2009/02/19/why-race-may-influence-us-even-when-we-know-it-doesnt/. Accessed July 22, 2015.
5 Harvard University is conducting an online study on implicit bias in a wide variety of contexts such as attitudes toward fat people, people of color, or people who are gay. See https://implicit.harvard.edu. Nearly 80 percent of everyone who has taken the test—including Black, non-Black, Hispanic, and Asian respondents—has shown “pro-white” biases.
6 We have taken the liberty of slightly changing Hofstede’s nomenclature in the interests of clarity and succinctness. The actual names of the six dimensions are: Individualism, Power Distance, Uncertainty Avoidance, Masculinity-Femininity, Long-Term Orientation, and Indulgence versus Restraint. For more, consult Geert Hofstede’s website: www.geerthofstede.nl or his book, Cultures and Organizations: Software of the Mind, McGraw-Hill, New York, 2010.
7 This brief summary of the Citibank study merely touches the surface. The full study is available at http://www1.gsb.columbia.edu/mygsb/faculty/research/pubfiles/1913/1913.pdf. It’s well worth reading for anyone interested in better understanding inter-cultural persuasion.
8 Hofstede’s work has been validated through a number of studies, and dozens of books and articles have been written on his theory of national cultures. However, his theories have not been free of criticism. Some researchers believe cultures are too complicated to be measured the way we measure, say, weather trends in different countries. Others take issue with the specific dimensions Hofstede identified. And, of course, some question his methodology. The International Business Center has a web page that lists the most prominent critiques, along with links to the original publications. See http://geert-hofstede.international-business-center.com/
9 George, B. (2008, February 12). Ethics must be global, not local. BusinessWeek. http://www.businessweek.com/stories/2008-02-12/ethics-must-be-global-not-localbusinessweek-business-news-stock-market-and-financial-advice. Accessed July 22, 2015.
10 Kirkpatrick, D. (2006, January 25). Google founder defends China portal. Fortune. http://money.cnn.com/2006/01/25/news/international/davos_fortune/. Accessed July 22, 2015.
11 Amnesty International. (2006, January 26). http://www.amnesty.org.uk/press-releases/china-google-and-others-must-end-complicity-restricting-freedoms. Accessed July 22, 2015.
12 A New Approach to China: An Update. (2010, March 22). Google Blog. http://googleblog.blogspot.com/2010/03/new-approach-to-china-update.html. Accessed July 22, 2015.
13 Sonne, P., & Schechner, S. (2014, December 12). Google to shut engineering office in Russia. Wall Street Journal. http://www.wsj.com/articles/google-to-shut-engineering-office-in-russia-1418401852?KEYWORDS=google. Accessed July 22, 2015.
14 Winkler, R., Barr, A., & Ma, W. (2014, November 20). Google looks to get back into China. Wall Street Journal. http://www.wsj.com/articles/google-looks-to-get-back-into-china-1416527873. Accessed July 22, 2015.