CHAPTER 50

USING SOCIAL PSYCHOLOGY TO IMPLEMENT SECURITY POLICIES

M. E. Kabay, Bridgitt Robertson, Mani Akella, and D. T. Lang

50.1 INTRODUCTION

50.2 RATIONALITY IS NOT ENOUGH

50.2.1 Schema

50.2.2 Theories of Personality

50.2.3 Explanations of Behavior

50.2.4 Errors of Attribution

50.2.5 Intercultural Differences

50.2.6 Framing Reality

50.2.7 Getting Your Security Policies Across

50.2.8 Reward versus Punishment

50.3 BELIEFS AND ATTITUDES

50.3.1 Beliefs

50.3.2 Attitudes

50.3.3 Changing Attitudes toward Security

50.4 ENCOURAGING INITIATIVE

50.4.1 Prosocial Behavior

50.4.2 Conformity, Compliance, and Obedience

50.5 GROUP BEHAVIOR

50.5.1 Social Arousal

50.5.2 Locus of Control

50.5.3 Group Polarization

50.5.4 Groupthink

50.6 TECHNOLOGICAL GENERATION GAPS

50.7 SUMMARY OF RECOMMENDATIONS

50.8 FURTHER READING

50.9 NOTES

50.1 INTRODUCTION.1

Most security personnel have commiserated with colleagues about the difficulty of getting people to pay attention to security policies—to comply with what seems like good common sense. They shake their heads in disbelief as they recount tales of employees who hold secured doors open for their workmates—or for total strangers, thereby rendering million-dollar card-access systems useless. In large organizations, upper managers who decline to wear their identification badges discover that soon no one else will either. In trying to implement security policies, practitioners sometimes feel that they are involved in turf wars and personal vendettas rather than rational discourse.

These problems reflect the social nature of human beings; however, they also reflect the fact that although people involved in information systems security and network management have a wide variety of backgrounds, many lack training in social or organizational psychology.

Security policies and procedures affect not only what people do, but also how they see themselves, their colleagues, and their world. Despite these psychosocial issues, security personnel pay little or no attention to what is known about social psychology. The established principles of human social behavior have much to teach in any attempts to improve corporate and institutional information assurance (IA).

IA specialists concur that security depends on people more than on technology. Another commonplace is that employees are a far greater threat to IA than outsiders.

It follows from these observations that improving security necessarily involves changing beliefs, attitudes, and behavior, both of individuals and of groups. Social psychology can help us understand how best to work with human predilections and predispositions to achieve our goals of improving security:

  • Research on social cognition looks at how people form impressions about reality. Knowing these principles, we can better teach our colleagues and clients about effective security.
  • Work on attitude formation and beliefs helps us present information effectively and so convince employees and others to cooperate in improving security.
  • Scientists studying persuasion and attitude change have learned how best to change people's minds about unpopular views, such as those regarding the security community.
  • Studies of factors enhancing prosocial behavior provide insights on how to foster an environment where corporate information is willingly protected.
  • Knowledge of the phenomena underlying conformity, compliance, and obedience can help to enhance security by encouraging compliance and by protecting staff against social pressure to breach security.
  • Group psychology research provides warnings about group pathology and hints for working better with groups in establishing and maintaining IA in the face of ingrained resistance.

This chapter reviews well-established principles of social psychology that help security and network management personnel implement security policies more effectively. Any recent introductory social psychology college textbook will provide ample references to the research underpinning the principles applied here to security policy implementation.2

50.2 RATIONALITY IS NOT ENOUGH.

IA policies sometimes evoke strong emotions. People can get very angry about what they perceive as interference with their way of getting their work done. Traditional information security professionals still perceive information security as a technical problem, but as recent research reveals, it is more of a management problem, and the prevalent security culture offers insight into how management handles it.

To put the discussion in context, here is a definition of rationality from a respected academic, Jonathan Baron: rationality is “the kind of thinking we would all want to do, if we were aware of our own best interests, in order to achieve our goals.”3

Applied to security, this definition implies that rational thought would direct us toward the best compromise between what we perceive as our security needs and what appears to be the most convenient process.

50.2.1 Schema.

Psychologists use the word “schema” to summarize the complex picture of reality upon which we base our judgments. A schema is what social psychologists call the framework through which people make sense of their social interactions. IA practitioners must often change their colleagues' schemata.

Schemata are self-consistent views of reality. They help us pay attention to what we expect to be important and to ignore irrelevant data. They also help us organize our behavior. For example, our schema for relations at the office includes polite greetings, civil discussions, written communications, and businesslike clothes. The schema excludes obscene shrieks, abusive verbal attacks, spray-painted graffiti, and colleagues dressed in swim suits. It is the schema that lets people know what is appropriate or inappropriate in a given situation.

Unfortunately, security policies and procedures conflict with most people's schemata. Office workers' schemata include sharing office supplies (“Lend me your stapler, please?”), trusting their team members to share information (“Take a look at these figures, Sally”), and letting their papers stay openly visible when they leave their desks.

Sharing user IDs, showing sensitive information to someone who lacks the appropriate clearance, and leaving workstations logged on without protection are gross breaches of a different schema—that of the IA specialist. Think about access controls: Normal politeness dictates that when a colleague approaches the door we have just opened, we hold the door open for the person; when we see a visitor, we smile politely—after all, it might be a customer. In contrast, access-control policies require that we refuse to let even well-liked colleagues piggyback their way through an access-card system; security policies insist that unbadged strangers be challenged or reported to security personnel. Common sense tells us that when the chief executive officer (CEO) of the company wants something, we do not oppose it; yet good IA dictates that we train computer room operators to forbid entry to anyone without documented authorization—including the CEO.

Sometimes people subvert IA by systematically getting around the rules because their normal social schema supersedes the security schema. It is not uncommon for naive staff to give keys or the door lock combination for access into secured areas to regularly scheduled outside delivery and maintenance persons. Such delivery people are rarely subjected to security checks, and yet their potential for intentional or inadvertent damage is great; nonetheless, the naive staff members are acting without authorization, subverting normal security controls, and entrusting the safety and security of corporate resources to relative unknowns. Why? Are they deliberately violating security policy with evil intent? Of course not: The employees are simply acting in a friendly fashion and extending trust that might be appropriate in other circumstances—but they are using the wrong schema for a high-security corporate environment. In contrast, an IA specialist's schema in the same circumstances includes all the potentially untrustworthy friends of those outsiders. Until the security administration group alters the employees' perception of the appropriateness of the security regulations, the conflict between different schemata will continue to cause security violations and fuel resentment on all sides.

Indeed, a common response to attempts at enforcing existing policies and procedures, or to new security rules, is a charge of paranoia leveled against security personnel. Other accusations include authoritarian behavior and undue interference with job functions. These responses usually indicate a conflict between accepted norms of behavior and the need to change behavior to conform to security principles. They imply that social and cultural needs and behavior were not accounted for in the design of the principles. They also indicate a need for security personnel to understand that basic social graces and accepted norms of social activity conflict with the basic needs of security—and that the employees violating security rules are not inherently bad people.

Some redesign might allow for better security by recognizing that security rules can violate accepted social norms. For example, in the case of security-locked doors, one could allow for relatively free access to common areas (thus allowing people to hold the door for colleagues following them) while forbidding such actions in secured locations. A posted explanation of both the reasons for, and consequences of, the policy explicitly addressing the difference between the needs of normal politeness and the needs of high security could lead to better adherence to stated policy. The text might read something like this:

This is a high-security area. Preventing anyone from entering without swiping their own access card is not rude: it's common sense. If employees enter secured areas without using their own access cards, security staff will not know who is still inside, which could put those employees at risk in an emergency. You are welcome to exercise normal politeness by holding the doors open for your colleagues and badge-wearing visitors in the nonsecured areas.

If we persist in assuming that we can influence our colleagues to change their perception of IA solely by informing, cajoling, nagging, or browbeating them, we will continue to fail. IA must be integrated into the corporate culture by changing our colleagues' schemata, a process that needs to use all of the techniques that social psychology can teach us.

A simple measure of this reality is to be found in the persistent avoidance practiced by many U.S. Government agencies in applying the information security program requirements of the Federal Information Security Management Act of 2002 (FISMA). It took almost five years and a series of undesirable security incidents before the various agencies woke up to the reality of the need to implement appropriate protection.4 Another illustration of the reluctance to implement security policies is the Veterans Affairs debacle in the late 2000s involving loss of control over personally identifiable information on unencrypted disk drives.5

50.2.2 Theories of Personality.

One of the most pervasive obstacles to cooperation in organizations is interpersonal conflict. Many conflicts are rooted in differences of personality style. For example, one widely used set of categories for describing people's personalities uses this schema:

  • Extroversion
    • High: active, assertive, energetic, outgoing, talkative
    • Low: quiet, reserved, shy, silent, withdrawn
  • Agreeableness
    • High: affectionate, appreciative, kind, soft-hearted, sympathetic
    • Low: cold, fault-finding, hard-hearted, quarrelsome, unfriendly
  • Conscientiousness
    • High: efficient, organized, planful, responsible, thorough
    • Low: careless, disorderly, frivolous, irresponsible, slipshod
  • Emotional stability
    • High: calm, contented, stable, unemotional
    • Low: anxious, moody, nervous, tense, worrying
  • Openness or culturedness
    • High: imaginative, insightful, intelligent, original, wide interests
    • Low: commonplace, shallow, simple, narrow interests, unintelligent

The adjectives used in this summary are positive for the “high” side of each trait and negative for the “low” side. However, the assumption that different personality types are easily characterized as superior and inferior seriously interferes with respectful communications among colleagues. For example, people with “low” characteristics might view the preceding summary in this way:

  • Extroversion
    • High: nervous, aggressive, excitable, pushy, chattering
    • Low: dignified, respectful, unassuming, attentive, self-sufficient
  • Agreeableness
    • High: clinging, gushy, soft-headed, knee-jerk reactive, uncritical
    • Low: stately, analytical, rational, principled, reserved
  • Conscientiousness
    • High: obsessive, compulsive, unspontaneous, pompous, slavish
    • Low: free, spontaneous, creative, fun, youthful, having perspective
  • Emotional stability
    • High: frozen, ambitionless, boring, dead
    • Low: vibrant, romantic, alive, strong, sensible
  • Openness or culturedness
    • High: flaky, theoretical, complicated, off-the-wall, dilettante
    • Low: earthy, smart, grounded, focused, practical

In discussing corporate culture change, leaders must be on guard to defuse conflicts based on the misperception that one particular response or view of an issue is necessarily good and another necessarily bad. The conflict may be rooted in personality styles rather than in problems of understanding. If the security working group proposes that all employees must challenge anyone in the secured areas who is not wearing a badge, some people—those who have low extroversion, for example—may have a great deal of difficulty with the concept that they should tell anyone else what to do, especially a manager of a higher rank than their own. Arguing only over the reasons why such a policy would be useful would sidestep the fundamental problem: that the required behavior is in direct conflict with possibly lifelong and firmly held views on appropriate behavior.

Security personnel must remember that failure to comply with policy is not necessarily the result of a bad attitude. When it becomes obvious that conflicts are rooted in personality, security personnel will have to try to arrive at a useful compromise. Instead of requiring that everyone confront the unbadged individual personally, the security policies could include a proviso allowing for individuals to choose simply to inform security personnel immediately.

Role-playing exercises sometimes can defuse a problem in accepting security policies by desensitizing resistant personnel. Going through the motions of what they fear or dislike sometimes can help them come to realize that the proposed change in behavior is not as bad as they originally thought. Returning to the example of confronting violations of security, many people have difficulty imagining that they could tell a superior in the management hierarchy not to piggyback. This term, like “hitchhiking” and “tailgating,” describes entering through a secured door that has been opened by someone else using a valid access code or token. Going through exercises in which each person pretends in turn to be the upper manager and then the challenger helps to break down resistance to this particular security policy. Trainers can emphasize that the challenge must be seen using a different schema from the normal situation:

  • It is socially acceptable to apply security policies within the organization.
  • Higher-status employees can encourage lower-status employees by articulating their support for the policy and showing that they are not offended by the request.
  • Participants of all hierarchical levels can introspect to see that they themselves do not feel offended when they are politely asked to use their badge during the role-playing exercise.

In general, leaders of the security team responsible for implementing security policies should be on the lookout for conflicts of style that interfere with the central task of making the enterprise more secure. If an individual likes short, direct instructions without chitchat about nonessentials, the security team member should adapt and stick to essentials; if an individual likes getting to know a stranger and wants to spend a few minutes learning about family background, that preference should be accommodated. Communicating ideas in a way that is likely to be acceptable is more important than imposing one's own interpersonal style preferences on others.

Above all, security personnel—and management in general—ought to be doing a great deal more listening and a great deal less commanding.

Some important psychological issues for security leaders to consider include:

  • Digital security is extremely complicated, and the explanations can be very technical—both attributes that are unfavorable to fostering management attention.
  • Most security incidents stem from insiders rather than from outsiders; prevention requires consistent nagging—not something management or anyone else normally regards favorably.
  • Success in digital security (in fact, all of security) is best shown by having nothing happen, which is a tough thing to measure and tougher to sell. So the personal payoff for a well-executed security strategy is often little to nothing—and no management executive wants to put up nothing as a true measure of success.

The practical implications of these observations include:

  • Discussions of IA should be down-to-earth and practical whenever possible.
  • Awareness can be achieved by more positive means than nagging.
  • One can create metrics of success by using security games and competitions that are fun as well as informative and effective at maintaining security awareness.

Here is an abstract of an article on the general manager's contribution to security:

Few senior executives pay much attention to computer security. They either hand off responsibility to their technical people or bring in consultants. But given the stakes involved, an arm's-length approach is extremely unwise. According to industry estimates, security breaches affect 90% of all businesses every year and cost some $17 billion. Fortunately,…senior executives don't need to learn about the more arcane aspects of their company's IT systems to take a hands-on approach. Instead, they should focus on the familiar task of managing risk. Their role should be to assess the business value of their information assets, determine the likelihood that those assets will be compromised, and then tailor a set of risk abatement processes to their company's particular vulnerabilities. This approach, which views computer security as an operational rather than a technical challenge, is akin to a classic quality assurance program in that it attempts to avoid problems rather than fix them and involves all employees, not just IT staffers. The goal is not to make computer systems completely secure—that's impossible—but to reduce the business risk to an acceptable level.6

We have italicized the key sentence in the quote to emphasize the critical role of changing corporate culture in successful security management.7

50.2.3 Explanations of Behavior.

In practice, trying to change corporate culture can be a frustrating and long-drawn-out project. One aspect of this process that security group leaders should monitor closely is the interpretation of employee behavior (called attribution theory in the social psychology literature) by members of the security team. In general, people can be viewed as interpreting (i.e., explaining) other people's behavior according to two independent dimensions: internal or external and stable or unstable. Here are some explanations of why Betty has failed to log off her session for the fourth time this week before leaving the office:

  • Internal, stable. “That's just the way she is—she never pays attention to these rules.”
  • Internal, unstable. “She's been under strain lately because her child is sick—that's why she's forgotten.”
  • External, stable. “The system doesn't respond properly to the logoff command.”
  • External, unstable. “This week, the system has not been responding properly to the logoff command.”

This simple four-way classification is useful for leaders in understanding and avoiding classic errors of attribution. Such attribution errors can cause conflicts between the security staff and other employees, or even among employees with different degrees of compliance to policy.

50.2.4 Errors of Attribution.

Some well-established misinterpretations of others' behavior can interfere with the acceptance of security policies. Such errors interfere with the ability of security personnel to communicate the value of security policies. Security group leaders should sensitize their staff to the consequences of these errors.

50.2.4.1 Fundamental Attribution Error.

The most important error people make when explaining other people's behavior is to assume that a person's actions reflect stable, internal traits; a typical example of this error is the naive belief that an actor's personality is essentially what that person portrays in performance. Anyone who has ever experienced surprise at the demeanor and speech of a favorite actor who is being interviewed has committed the fundamental attribution error. Some actors who play bad characters in fictional situations have even been verbally and physically assaulted by viewers who cannot resist the fundamental attribution error, and who genuinely believe that the actors are as bad as the nasty people they portray. The abstract of a conference presentation in 2008 summarized some significant findings illustrating this point:

Two studies attempted to document the occurrence of the psychological phenomenon known as the fundamental attribution error (FAE) in the audiovisual medium. The FAE refers to the human tendency to attribute people's behavior to internal attributes more than external factors. In Study 1, we demonstrated that in the audiovisual medium, viewers tend to attribute an actor's behavior in television dramas to the actor's personality, ignoring the existence of a script dictating the actor's behavior. Study 2 replicated this finding, and also demonstrated that the tendency to make the FAE is related to the degree to which the person reports being transported into the narrative of the TV drama. Furthermore, we showed that the tendency to attribute character traits to the actor is not diminished following exposure to the same actor playing two opposing roles. The last scene viewed was found to determine the evaluation of the actor's characteristics.8

In security work, being on guard against the fundamental attribution error helps to smooth relations with other employees. For example, if a security group member sees an employee, Jill, who is not wearing her badge, it is easy to assume that she never wears her badge and is refusing to wear it because of a character flaw. The security officer may act according to these assumptions by being harsh or unfriendly in correcting Jill's behavior. The harshness generates resentment, and Jill may come to associate security with unpleasant people, thus reducing the likelihood that she will comply with policy or encourage others to do so.

In fact, however, much of people's behavior is unstable and externally driven rather than stable and internal. For example, if the security officer simply smiled and pointed gently to the lack of a badge instead of jumping to conclusions, he might discover that Jill's lack of a badge today was due simply to her having taken her jacket off just before an urgent call from the vice president, interrupting her normal procedure of moving the badge from jacket to shirt pocket. Thus, her lack of a badge would not be stable behavior at all—it would be a temporary aberration of no long-lasting significance. A solution would be to get used to clipping the badge to her trousers or her skirt instead of her jacket or her blouse. Similarly, just by asking nicely, the security officer might learn that Jill normally does wear her badge, but today her four-year-old son took it off her jacket to play with it, without his mother's noticing the change. In this example, Jill's behavior is externally based and has nothing to do with character. The kindly interaction between Jill and the security officer increases the sense of social relation and makes it more likely that she will remember the incident positively and comply with the security policy in the future.

In summary, by being aware of the fundamental attribution error, security personnel can be trained to avoid the judgmental, quick-draw mentality that can alienate other employees and damage security programs.

50.2.4.2 Actor-Observer Effect.

The actor-observer effect consists of interpreting one's own behavior as appropriate, unstable, externally motivated responses to environmental conditions, whereas other people's behavior is viewed, in the light of the fundamental attribution error, as stable, internally motivated expressions of character traits. Becoming aware of this tendency helps security personnel resist the fundamental attribution error.

50.2.4.3 Self-Serving Bias.

The counterpart of the actor-observer effect is the self-serving bias, which fools people into believing that their own behavior is due to stable, internal aspects of their character. Security officers who are unaware of this dangerous error may come to feel that they are in some sense superior to other people who do not know as much about security as they do or who do not comply as fully as they do with security policy. The officers may have failed to integrate the fact that hours of training and coaching by their security group leaders are at least as responsible for their own knowledge of, and compliance with, security policies as any innate superiority.

By bringing this kind of erroneous thinking to light during training and supervision of security staff, managers can help reduce the conflicts that naturally result from an air of assumed superiority.

50.2.4.4 Salience and Prejudice.

When people are asked to guess which person in a group is the most influential (or least influential) person, social psychologists find that whichever person stands out the most, for whatever reason, is more often attributed with the special properties in question. Such effects apply to any characteristic that the psychologists ask about: most (or least) intelligent, aggressive, sympathetic, and so on. This phenomenon is known as the salience effect.

An application of the salience effect might occur if security officers see a group of employees who are violating security policies. A natural and counterproductive tendency is to leap to the conclusion that the tallest or shortest, the thinnest or fattest, the whitest or blackest person in the group must be to blame. This error can result in unfair treatment of perfectly innocent people.

This problem of misinterpreting salience is exacerbated by prejudice; for example, imagine there were an identifiable group called the “Ogunians” (as far as we can determine, there is no such group) who traditionally wear, say, a seven-sided symbol of their identity. If an anti-Ogunian security officer sees a noncompliant group where one of the members is wearing the characteristic heptagon of Ogun, it may be hard for the officer to resist blaming the noncompliance on the Ogunian even if, in fact, the Ogunian was waiting to use a valid access card in full compliance with security policy.

Worse, people can be so strongly influenced by expectation—part of their schema—that they actually misperceive a situation altogether. For example, in some classic experiments studying prejudice in the 1950s, psychologists showed subjects a drawing of two people, one light-colored and the other dark-colored, standing in a tramway car. One was holding a knife. When questioned about the image afterward, white subjects consistently reported that the black figure had been holding the knife, but actually it was the white figure in the drawing who had the knife.

Thus, even observation itself can be twisted by prejudice and expectations; for example, if the anti-Ogunian security officer sees a group of people passing through an open doorway into a secured area without using their badges, the officer may incorrectly report that it was the fault of an Ogunian even if there was no Ogunian in the group. Such a mistaken report would not only infuriate innocent Ogunians and possibly cause general Ogunian resentment or hostility toward “security,” but it also could mislead the security group itself into trying to correct the behavior of the wrong person or people.

Similarly, any minority—whether in terms of gender, gender orientation, religion, race, or disability—can be the focus of a prejudiced security officer's blame when a group disobeys policy. Security leaders should make their staff aware of the danger of applying this erroneous method of explaining group behavior. In many organizations, such discrimination is a violation of corporate policy and may even be illegal. In any case, prejudice is not constructive and must be monitored and overcome.

50.2.5 Intercultural Differences.

Many countries in the world are experiencing changes in their population due to immigration. Especially in areas where people have heretofore been largely homogeneous, cultural, religious, and racial diversity can lead to interpersonal and intergroup conflicts. Such conflicts may be based in part on prejudice, but they also may be the result of differing values and assumptions.

This definition of culture helps frame the discussion:

Culture as Mental Programming

Every person carries within him- or herself patterns of thinking, feeling, and potential acting that were learned throughout their lifetime. Much of it has been acquired in early childhood, because at that time a person is most susceptible to learning and assimilating. As soon as certain patterns of thinking, feeling, and acting have established themselves within a person's mind, he or she must unlearn these before being able to learn something different, and unlearning is more difficult than learning for the first time.

Using the analogy of the way computers are programmed, this book will call such patterns of thinking, feeling and acting mental programs, or … software of the mind….

A customary term for such mental software is culture.9

Security personnel engaged in the process of corporate culture change should be sensitive to the possibility that people with different real-world cultural backgrounds can respond differently to proposed security policies. For example, in 2001 the fundamentalist extremists of the Taliban in Afghanistan decreed that non-Muslim people would have to wear badges in public.10 One can imagine that a Hindu Afghan refugee in the United States who is told to wear a badge for security reasons might have an unexpectedly emotional response to the order. Before pressuring (or becoming hostile to) anyone who seems to be resisting a policy, it is valuable to inquire about the person's beliefs and attitudes and to explain the foundation for the policies in question. Especially where there are intercultural differences, such inquiry and discussion can forestall difficulties and dissension and assuage unexpected, culturally rooted anxiety.

Security professionals need to be acutely aware of the cultural differences of individuals in their target audiences. For example, in some cultures (mostly New World and Western), reality is directly related to facts and verifiable calculations. Other cultures may put a stronger emphasis on personal feelings, intuition, and culturally ingrained beliefs in their comprehension of reality. Hence, different people may not share the same understanding of reality or implement policies by the same principles unless they make their assumptions known and discuss them with the intention of coming to agreement.

When considering culture, the changing dynamics of modern society also need to be accounted for. Any modern city or leading public institution today is a complex combination of people from different cultural and social backgrounds. Working together for large parts of their active days leads to a melting-pot situation, with all the individual cultures and social leanings being added into the mix and emerging as a (mostly) different culture. The individuals then take this new culture back home, where it mixes with their family and friends' contributions, often resulting in further modifications and variation. The organization itself is constantly changing as people join and leave. All of this makes culture a dynamic phenomenon, albeit not as rapidly changing as the technology front. Hence, any security policy or framework needs to account for and accommodate this changing paradigm if it is to be successful in securing the enterprise.

50.2.6 Framing Reality.

How can we make the corporate culture more supportive of IA?

Schemata influence what we perceive. For example, an employee refuses to take vacations, works late every night, is never late, and is never sick. A model employee? Perhaps, in one schema. From the security point of view, the employee's behavior is suspect. There have been cases where such people have been embezzlers unable to leave their employment: Even a day away might result in discovery of their crimes. Saint or sinner? Our expectations determine what we see.11

To change the schema so that people take IA seriously, we should provide participants in training and security awareness with real-life examples of computer crime and security breaches, so that security policies make obvious sense rather than seeming to be arbitrary.

Schemata influence what we remember. When information inconsistent with our preconceptions is mixed with details that fit our existing schemata, we selectively retain what fits and discard what conflicts. When we have been fed a diet of movies and television shows illustrating the premise that information is most at risk from brilliant hackers, why should we remember the truth: that carelessness and incompetence by authorized users of information systems cause far more harm than evil intentions and outsiders ever do?

Instructors should emphasize the practical side of IA by showing how policies protect all employees against false accusations, prevent damage to the organization's reputation and profits, and even play a role in national security. This is especially true where business touches the technical infrastructure on which we all depend.

Most important of all, teaching others about IA cannot be an occasional and haphazard affair. Before attempting to implement policies and procedures (aside from emergency measures that are needed at once), we should ensure that we build up a consistent view of IA among our colleagues. In light of the complexity of social cognition, our usual attempts to implement security policies and procedures seem pathetically inept. A couple of hours of lectures followed by a video, a yearly ritual of signing a security policy that seems to have been written by Martians—these are not methods that will improve security. These efforts merely pay lip service to the idea of security.

According to research on counterintuitive information, people's judgment is influenced by the manner in which information is presented. For example, even information contrary to established schemata can be assimilated if people have enough time to integrate the new knowledge into their worldviews. It follows that nonemergency security policies should be introduced over a long time, not rushed into place.

An effective IA program includes frequent reminders of security. To change the corporate culture, practitioners should use methods such as a security corner in the corporate publication, security bulletins detailing the latest computer crime or security breach that has hit the news, contests for identifying the problems in realistic scenarios, and write-in columns to handle questions about policies. IA has to become part of the framework of reality, not just an imposition from management.

In every security course or awareness program, instructors and facilitators should explicitly address the question of corporate culture, expectations, and social schemata. Do not rely solely on intellectual discourse when addressing a question of complex perceptions and feelings. Use simulations, videos, and role-playing exercises to bridge the gap between intellect and emotion.

Address the feelings and perceptions of all participants as they learn about the counterintuitive behaviors that improved security will demand. Encourage learners to think about how they might feel and respond in various situations that can arise during the transition to a more secure environment. For example, ask participants to imagine:

  • Asking colleagues not to step through a secured entrance without passing through the access-control system with their own identity
  • Telling their boss that they will not copy software without a license to do so
  • Questioning a visitor or employee who is not wearing an identity badge12

50.2.7 Getting Your Security Policies Across.

What are some ways to change our colleagues' schemata so that they become more receptive to IA policies?

  • Initial exposure. Preliminary information may influence people's responses to information presented later. For example, merely exposing experimental subjects to words such as “reckless” or “adventurous” affects their judgment of risk-taking behavior in a later test.

    It follows that when preparing to increase employee awareness of security issues, presenting case studies is likely to have a beneficial effect on participants' readiness to examine security requirements.

  • Counterexamples. Preexisting schemata can be challenged by several counterexamples, each of which undermines a component of the schema. For example, prejudice about an ethnic group is more likely to be changed by contact with several people, each of whom contradicts a different aspect of the prejudiced schema.

    It follows that security awareness programs should include many realistic examples of security requirements and breaches. In a counterexample, students in college IA courses have commented on the unrealistic scenario in a training video they were shown: a series of disastrous security breaches occurring in the same company. Based on the findings of cognitive social psychologists, the film would be more effective for training if the incidents had been dramatized as occurring in different companies.

    In practical terms, practitioners should stay current and update their materials. Many IA publications provide useful case studies that will help make awareness and training more effective.

  • Choice of wording. Perceptions of risks and benefits are profoundly influenced by the wording in which situations and options are presented. For example, experimental subjects responded far more positively to reports of a drug with “50 percent success” than to the same drug described as having “50 percent failure.”

    It follows that practitioners should choose their language carefully during security awareness campaigns. Instead of focusing on reducing failure rates (violations of policy), we should emphasize improvements in our success rates. Unfortunately, some rates cannot be expressed in positive terms; for example, it is not easy to measure the success rate of security measures designed to foil attacks on systems.

    Judgments are easily distorted by the tendency to rely on personal anecdotes, small samples, easily available information, and faulty interpretation of statistical information. Basically, we humans are not always rational processors of factual information. If security awareness programs rely strictly on presentation of factual information about risks and proposed policies and procedures, they are likely to run up against a stubborn refusal to act logically. Security program implementation must engage more than the rational mind. We must appeal to our colleagues' imagination and emotion as well. We must inspire a commitment to security rather than merely describing it.

50.2.8 Reward versus Punishment.

When enforcing security policies, too many organizations focus entirely on punishing those who break the rules. However, everything we know about modifying behavior teaches us to use reward rather than punishment. Punishing people who do not comply with security rules often generates resentment and hostility that carry over into future interactions. Instead of seeing information assurance as a benefit to the organization and to their interests, victims of harsh treatment can resist even well-intentioned, sensible changes in security policies simply because of the emotional overlay associated with the embarrassment and frustration generated by criticism and penalties.

In addition to avoiding negativity and push-back, reward may simply work better than punishment at changing behavior. For example, a security officer from a large corporation experimented with reward and punishment in implementing security policies. Employees were supposed to log off their mainframe terminals when leaving the office, but compliance rates were only around 40 percent. In one department, the security officer used the usual techniques recommended in the literature and common among security professionals; for example, she put up nasty notes on terminals that were not logged off, changed the passwords on delinquent accounts, and humiliated violators by forcing them to report to their bosses for authorization to obtain a new password. However, in a different department, she simply left a Hershey's Chocolate Kiss on the keyboard of every terminal whose user had indeed logged off before leaving. After one month of these two strategies, compliance rates in the department subject to punishment had climbed to around 60 percent. Compliance in the department getting chocolates had reached around 80 percent—and feelings toward security there were much more favorable than the norm.

This case illustrates some of the benefits of reward:

  • Compliance rates were significantly higher than in the group subjected to punishment.
  • Attitudes (see Section 50.3) are more likely to be positive.
  • Costs can be lower because small rewards delivered en masse may be much cheaper and quicker to apply than administrative procedures applied one by one through management intervention.

50.3 BELIEFS AND ATTITUDES.

Psychologists distinguish between beliefs and attitudes. A belief refers to cognitive information that need not have an emotional component. An attitude refers to an evaluation or emotional response. Thus, a person may believe correctly that copying a large number of proprietary software packages without authorization is a felony while nonetheless having the attitude that it does not matter to him. A rational employee may believe that malware can be downloaded onto company computers through unauthorized software available on unvetted Web sites (and thus answer a questionnaire evaluating security awareness correctly) yet have the attitude that the risk is negligible—and cheerfully go on downloading unauthorized software at work.

50.3.1 Beliefs.

Beliefs can change when contradictory information is presented, but some research suggests that it can take up to a week before significant shifts are measurable. Other studies suggest that when people hold contradictory beliefs, providing an opportunity to articulate and evaluate those beliefs may lead to changes that reduce inconsistency.

These findings imply that corporate security must explore the current structure of beliefs among employees and managers. Questionnaires, focus groups, and interviews may not only help the security practitioner, they actually may help move the corporate culture in the right direction. The Hawthorne Effect is the name given to improvements in measured behavior resulting simply from employee responses to being studied; done correctly, honestly, and nonpunitively, inquiring into employee beliefs and attitudes may communicate a genuine interest by management in improving security policy and practice with input from everyone involved.

50.3.2 Attitudes.

An attitude, in the classical definition, is a learned evaluative response, directed at specific objects, which is relatively enduring and influences behavior in a generally motivating way. The advertising industry spends over $50 billion yearly to influence public attitudes in the hope that these attitudes will lead to changes in spending habits—that is, in behavior.

Research on classical conditioning suggests that attitudes can be learned even through simple word association. If we wish to move our colleagues toward a more negative view of computer criminals, it is important not to portray computer crime using positive images and words. Movies that show criminal hackers as pleasant, smart, physically attractive, and likable people may do harm by minimizing the seriousness of industrial espionage and cybervandalism. When teaching security, we should avoid praising the criminals we describe in case studies.

Studies of how attitudes are developed consistently show that rewards and punishments are important motivators of behavior. Studies show that even apparently minor encouragement can influence attitudes. A supervisor or instructor should praise any comments that are critical of computer crime or that support the established security policies. Employees who dismiss security concerns, or who flout the regulations, should be challenged on their attitudes, not ignored. Such challenges are best carried out in private to avoid causing embarrassment to the skeptics and possibly generating resistance due to pride or a sense of machismo.

50.3.3 Changing Attitudes toward Security.

Persuasion—changing someone's attitudes—has been described in terms of communications. The four areas of research include:

  1. Communicator variables. Who is trying to persuade?
  2. Message variables. What is being presented?
  3. Channel variables. By what means is the attempt taking place?
  4. Audience variables. At whom is the persuasion aimed?

50.3.3.1 Communicator Variables.

Attractiveness, credibility, and social status have strong effects immediately after the speaker or writer has communicated with the target audience; however, over a period of weeks to a month, the effects decline until the predominant issue is message content. We can use this phenomenon by identifying the senior executives most likely to succeed in setting a positive tone for subsequent security training. We should look for respected, likable people who understand the issues and sincerely believe in the policies they are advocating.

One personality style in particular can threaten the success of security policies: the authoritarian personality. A body of research suggests that some people, often those raised by punitive parents highly concerned with social status, become rigidly devoted to conventional beliefs, submit to authority, exercise authority harshly themselves, and are hostile to groups they perceive as unpopular. An authoritarian person might make a terrible security officer. Such an officer might derive more satisfaction from ordering people around and punishing them than from long-term success in implementing security policies.

50.3.3.2 Message Variables.

Fear can work to change attitudes only if judiciously applied. Excessive emphasis on the terrible results of poor security is likely to backfire, with participants in the awareness program rejecting the message altogether. Frightening consequences should be coupled immediately with effective and achievable security measures.

Some studies suggest that presenting a balanced argument helps convince those who initially disagree with a proposal. Presenting objections to a proposal and offering counterarguments is more effective than one-sided diatribes. Popular training videos from the Software & Information Industry Association use this technique: they show people such as “college students, college faculty and publishers of all types of media discuss[ing] the legal and ethical implications of copying other people's works” and fairly present the arguments of copyright violators before rebutting them.13

Modest repetition of a message can help generate a more positive response. Thus security awareness programs that include imaginative posters, mugs, special newsletters, audio and videotapes, and lectures are more likely to build and sustain support for security than occasional intense sessions of indoctrination. The use of multiple communications channels (discussed in the next section) also increases the effectiveness of the message.

50.3.3.3 Channel Variables.

The channel through which we communicate has a strong effect on attitudes and on the importance of superficial attributes of the communicator. In modern organizations, most people assume that a meeting is the ideal way to communicate new information. However, the most effective medium for convincing someone to pay attention to any topic is face-to-face persuasion. Security training should include more than tapes and books; a charismatic teacher or leader can help generate enthusiasm for—or at least reduce resistance to—better security.

In addition, security educators should not introduce new ideas to decision makers in a meeting. There is too much danger of confounding responses to policy with nonpolicy matters rooted in relationships among the participants. It is not uncommon for one executive to oppose a new policy simply because another has supported it. A good way to introduce security policies is to have individual meetings with one executive at a time in order to explain the issues and proposals and to ask for support.

Psychologists testing cognitive response theory have studied many subtle aspects of persuasion. Experiments have shown that rhetorical questions, such as “Are we to accept invasions of our computer systems?” are effective when the arguments are solid but counterproductive when arguments are weak. Security officers should not ask rhetorical questions unless they are certain that almost everybody will inevitably have the same answer—the one the security officers are looking for.

Consideration of facts and logical arguments, as the central route to persuasion, has been found to lead to more lasting attitudes and attitude changes than the peripheral influences from logically unrelated factors, such as physical attractiveness of a speaker.

50.3.3.4 Audience Variables.

As mentioned, questionnaires and interviews may help cement a favorable change in attitude by leading to commitment. Once employees have publicly avowed support for better security, some will begin to change their perception of themselves. Specific employees should be encouraged to take on various areas of public responsibility for IA within their work group. These roles should periodically be rotated among the employees to give everyone the experience of public commitment to improved security.

To keep up interest in security, regular meetings of enthusiasts to discuss recent security news can keep the subject fresh and interesting. New cases can help security officers explain policies with up-to-date references that will interest their fellow employees and motivate managers to pay attention to security policies.

50.4 ENCOURAGING INITIATIVE.

The ideal situation would be for everyone to help enforce security policies. In practice, however, some people are cooperative and helpful whereas others—or even the same people in different circumstances—are reluctant and suspicious about new policies. What can we do to increase cooperation and reduce rejection?

50.4.1 Prosocial Behavior.

Studies of people who have come to the aid of others can help to encourage everyone in an organization to do the right thing. Some people intervene to stop crimes; others ignore crimes or watch passively. Social psychologists have devised a schema that describes the steps leading to prosocial behavior:

  1. People have to notice the emergency or the crime before they can act. Thus, security training has to include information on how to tell that someone may be engaging in computer crime.
  2. The situation has to be defined as an emergency—something requiring action. Security training that provides facts about the effects of computer crime on society and solid information about the need for security within the organization can help employees recognize security violations as emergencies.
  3. Everyone must take responsibility for acting, but the larger the number of people in a group confronted with an emergency, the slower the average response time. Larger groups seem to lead to a diffusion of responsibility; each person feels that someone else is more responsible for dealing with the emergency. Another possible factor is uncertainty about the social climate; people fear appearing foolish or overly emotional in the eyes of those present. To overcome this effect, a corporate culture must be established that rewards responsible individual behavior, such as reporting security violations.
  4. Once responsibility for solving a problem has been accepted, appropriate decisions and actions must be taken. Clearly written security policies and procedures will make it more likely that employees act to improve security. In contrast, contradictory policies, poorly documented procedures, and inconsistent support from management will interfere with the decision to act.

Another analysis proposes that people implicitly analyze costs of helping and of not helping when deciding whether to act prosocially. The combination of factors most conducive to prosociality is low cost for helping and high cost for not helping.

Security procedures should make it easy to act in accordance with security policy. There should be a hot line for reporting security violations, and anonymity should be respected if desired. Psychological counseling and follow-up should be available if people feel upset about their involvement. Conversely, failing to act responsibly should be a serious matter; personnel policies should document clear and meaningful sanctions for failing to act when a security violation is observed. Penalties would include critical remarks in employment reviews and, where appropriate, even dismissal.

One method that does not work to increase prosocial behavior is exhortation; merely lecturing people in the abstract about what they ought to do has little or no positive effect.

Significantly, the general level of stress and pressure to focus on difficult tasks with seemingly impossible deadlines can greatly reduce the likelihood that people will act on their moral and ethical principles. Security is likely to flourish in an environment that provides sufficient time and support for employees to work professionally. Offices where everyone responds to a continuing series of apparent emergencies will not be likely to pay attention to security violations.

Some findings from research confirm common sense. For example, guilt motivates many people to act more prosocially. This effect works best when people are forced to assume responsibility. Thus, enforcing standards of security using reprimands and sanctions can indeed increase the likelihood that employees subsequently will act more cooperatively; however, as suggested earlier, punishment should not replace reward.

In addition, mood affects susceptibility to prosocial pressures. Bad moods make prosocial behavior less likely, whereas good moods increase prosociality. A working environment in which employees are respected is more conducive to good security than one that devalues and abuses them.

Even cursory acquaintance with other people makes it more likely that we will help them; it thus makes sense for security supervisors to get to know the staff from whom they need support. Encouraging social activities in an office (e.g., lunchtime discussion groups, occasional parties, and charitable projects) enhances interpersonal relationships and can improve the climate for effective security training. Management by walking around is an excellent practice at many levels, including fostering at least the first stage of interpersonal relationship among coworkers.14

50.4.2 Conformity, Compliance, and Obedience.

These days, many people react negatively to the words “conformity,” “compliance,” and “obedience,” but ignoring social phenomena will not help security practitioners to attain their goals. Despite the unpopularity of this subject area, it is valuable to understand how people can work together in reinforcing security policies. The next sections look at how to increase conformity with a culture of cooperation for increased security, compliance with rational security rules, and a bias toward obedience to IA authorities in security matters.

50.4.2.1 Social Pressure and Behavior Change.

Turning a group into a community provides a framework within which social pressures can operate to improve an organization's IA. Most people respond to the opinions of others by shifting their own opinions, sometimes unconsciously, toward what statisticians call the mode—the most popular opinion. Security programs must aim to shift the normative values, the sense of what one should do, toward protecting confidentiality, possession or control, integrity, authenticity, availability, and utility of data.
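
As a toy illustration of this drift toward the mode (our own sketch, not a finding from the literature), the following fragment models a group in which each member, with some probability per round, adopts the group's current modal opinion; the starting distribution and the shift probability are arbitrary assumptions.

```python
# Toy model of opinions drifting toward the mode (purely illustrative).
import random
from collections import Counter

random.seed(42)  # reproducible run
opinions = ["pro-security"] * 12 + ["indifferent"] * 5 + ["anti-security"] * 3

def one_round(opinions, shift_prob=0.3):
    """Each member adopts the current modal opinion with probability shift_prob."""
    mode = Counter(opinions).most_common(1)[0][0]
    return [mode if random.random() < shift_prob else o for o in opinions]

for rnd in range(6):
    print(rnd, dict(Counter(opinions)))
    opinions = one_round(opinions)
```

Even with a modest shift probability, the modal opinion quickly absorbs the entire group, which is why establishing a pro-security mode early matters so much.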

An informal survey conducted by Mani Akella, a coauthor of this chapter, at three leading financial firms on Wall Street yielded these inferences from a test group of 80 respondents:

  • Older employees prefer to model their reactions on common group preferences, even when some of those reactions go against their own gut feelings. The rationale seems to be that the group provides anonymity and even insulates them from management reaction. Younger employees, however, tend to buck the group trend when they disagree with proposed concepts.
  • Leadership has a large role to play—and the group modifies its reactions very quickly to adapt to leadership changes. If the leader likes to follow a specific path and not ask questions, the entire group tends to let issues lie rather than disturb the even tenor of the organization, for fear of disturbing the leader, even at the cost of risking serious security lapses (see Section 50.5.4 on groupthink). If the leader fosters a dynamic, open, and collaborative environment with a measured adaptability to evolving threats, however, the group enlivens itself with innovation and puts out additional effort to stay abreast (or even ahead) of the current threat landscape.
  • When leaders challenge individuals to greater achievement without threats of punitive reaction, the group responds positively. Leadership can create a security environment that exceeds the enterprise's expectations by encouraging individuals to increase productivity and rewarding them with greater job satisfaction. Security, like most other organizational management efforts, is all about people. Responsible, satisfied, and aware personnel naturally lead to better overall security for the organization.

50.4.2.2 Changing Expectations.

As has been evident in public campaigns aimed at eliminating drunken driving, it is possible to shift the mode. In the United States in the mid-twentieth century, many people regarded driving while intoxicated as amusing; today, a drunken driver is a social pariah. High school students used to kill themselves in large numbers on the nights of their proms; today, many spontaneously arrange for safe rides home. Between 1982 and 2006, the percentage of U.S. traffic fatalities attributed to alcohol fell from 60 percent to 41 percent.15 In much the same way, we must move toward making computer crime as distasteful as public drunkenness.

The trend toward similar behavior increases when people within the group like or admire each other. In addition, the social status of an individual within a group influences that individual's willingness to conform to group standards. High-status people (those liked by most people in the group) and low-status people (those disliked by the group) both tend to be more autonomous and less compliant than people liked by some and disliked by others. Therefore, security officers should pay special attention to those outliers during instruction programs. Managers should monitor compliance more closely at both ends of the popularity range. If security practices are currently poor, and allies are needed to change the norm, working with the outliers to resist the majority's anti-security bias may be the most effective approach. The most popular people may be disastrous agents of rebellion if they do not sign on to the security program; paradoxically, the most unpopular people may be helpful if they can be persuaded to comply.

50.4.2.3 Norm of Reciprocity.

According to social psychologists, the norm of reciprocity indicates that, in social relations, favors are usually returned. A small, unexpected, unsolicited, or even unwanted gift increases the likelihood that we will respond to requests. For example, members of various religious cults often hand out flowers or books at airports, knowing that the norm of reciprocity will increase the frequency and amount of donations from basically uninterested passersby.

A security awareness program that includes small gifts, such as an attractive mug labeled “SECURITY IS EVERYONE'S BUSINESS” or an inexpensive but useful booklet summarizing security policies, can help get people involved in security. The combination of such programs with rewards for compliance can be a powerful tool for improving security.

Combining a token of appreciation with direct personal contact, starting with the statement “I need your help” and followed by a frank exposition of the security situation, can be effective at every level of the organization. The approach works in several ways at once: establishing personal relations, building on the norm of reciprocity, and changing the schema.

50.4.2.4 Incremental Change.

The foot-in-the-door technique suggests that a small initial request should be followed by a larger second one. Political field workers, for example, know that they can start small by asking people to let them put candidate stickers in their windows; then they ask to put a candidate's poster on the lawn; eventually they can ask for volunteer time or money. Each compliance with a request increases the likelihood that the person will agree to the next step in an escalating series. It is as if agreeing to one step helps to change the targets' sense of themselves. To reduce the discomfort of holding beliefs inconsistent with their behavior (what psychologists call cognitive dissonance), people change their beliefs to conform with their behavior.

Employees can be asked personally to set a good example by blanking screens and locking terminals when leaving their desks. Later, once they have begun the process of redefining themselves (“I am a person who cares about computer security”), they can be asked for something more intense, such as participating in security training by asking others to blank their screens and lock their terminals—or rewarding those who do with the famous chocolate tidbit. By applying the same methods to various tasks, the corporate culture can change so that a majority of people feel personally committed to good security practices.
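
Where the requested behavior is as concrete as locking an idle screen, compliance can even be spot-checked in software. The following is a hedged sketch only: it assumes the conventional Windows per-user registry values for screen-saver locking (ScreenSaveActive, ScreenSaverIsSecure, and ScreenSaveTimeOut, all of which group policy may override) and an assumed 900-second policy threshold.

```python
# Hedged sketch: read conventional Windows screen-saver lock settings.
# Value names are standard but may be overridden by group policy;
# the 900-second threshold is an assumed policy value.
import winreg

def screen_lock_settings():
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER,
                        r"Control Panel\Desktop") as key:
        def read(name):
            try:
                return winreg.QueryValueEx(key, name)[0]
            except FileNotFoundError:
                return None  # value absent on this machine
        return {
            "saver_active": read("ScreenSaveActive"),
            "requires_password": read("ScreenSaverIsSecure"),
            "timeout_seconds": read("ScreenSaveTimeOut"),
        }

if __name__ == "__main__":
    s = screen_lock_settings()
    compliant = (s["saver_active"] == "1" and s["requires_password"] == "1"
                 and s["timeout_seconds"] is not None
                 and int(s["timeout_seconds"]) <= 900)
    print(s, "compliant" if compliant else "NON-COMPLIANT")
```

Such a check should supplement, not replace, the personal appeal described above; the goal is commitment, not mere conformity.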

Some security specialists have proposed that we should not ask the audience to think. The reasoning is that each incremental policy step should not require the target audience to reason about or explain the behavior; rather, the goal is to build conditioned reflexes to specific environmental and usage factors, so that the organization and the security team can count on a common, predictable reaction to any security threat from every insider. A countervailing view, however, is that every behavior proposed to improve security must be grounded in an understandable schema. In other words, although one need not force members of the audience to articulate the rationale, the rules must make sense if they are to be integrated, remembered, and applied in the long term.

50.5 GROUP BEHAVIOR.

Some groups of people are referred to as teams, while others are called gangs; the label reflects how the group behaves. Social psychological insights into group behavior can improve success rates for IA policies.

50.5.1 Social Arousal.

Studies on the behavioral effects of being in groups produced contradictory results; sometimes people did better at their tasks when there were other people around, and sometimes they did worse. Eventually, psychologists realized that the presence of other people is socially arousing; that is, people become more aware both of their own behavior and of social norms when they are in groups. Social arousal facilitates well-learned habits, but it inhibits poorly learned habits. Thus, when trying to teach employees new habits to improve security, it is counterproductive to put them into large groups. Individualized learning (e.g., by means of computer-based training and videotapes) can overcome inhibitory effects of groups in the early stages of behavioral change.

50.5.2 Locus of Control.

Another factor that interferes with implementation of security policies is the locus of control. People do not like feeling that they have no control over their environment. For example, in a classic experiment reported in social psychology textbooks, two equivalent teams of people were both subjected to loud and disruptive noise coming through a loudspeaker in their work area. One group had no control whatever over the noise, whereas the other had a large button with which they could stop the noise at once. The group with the stop button did noticeably better at their complex task than the other group—yet in no case did anyone actually press the button. Simply feeling that they could exert control, if they wanted to, significantly altered the performance of the experimental subjects.

Similarly, in studies of healing among older patients, three groups were defined: (1) controls, (2) people given a plant in a pot, and (3) people given a plant in a pot plus instructions to water it regularly. The third group did significantly better than the second in their recovery. Once again, the sense of control over the environment appeared to influence outcomes.

In security policy implementation, experience confirms that those organizations with the most participation and involvement by all sectors do best at developing and implementing information protection plans. A common phrase that refers to this phenomenon is “buy-in,” as in: “The different departmental representatives felt that they could genuinely buy into the new policies because they had fully participated in framing them.”16

50.5.3 Group Polarization.

Another branch of research into group psychology deals with group polarization. Groups tend to make more extreme decisions than would the individuals in them acting alone. In group discussions of the need for security, polarization can mean deciding to take more risks—by reducing or ignoring security concerns—than any individual would have judged reasonable. Again, one-on-one discussions of the need for security will generally be more effective in building a consensus that supports cost-effective security provisions than will large meetings.

50.5.4 Groupthink.

In the extreme, a group can display groupthink, in which consensus is reached because of strong desires for social cohesion. When groupthink prevails, evidence contrary to the received view is discounted, opposition is viewed as disloyal, and dissenters are discredited. Especially worrisome for security professionals, people in the grip of groupthink tend to ignore risks and contingencies. To prevent such aberrations, the leader must remain impartial and encourage open debate. Respected security consultants from the outside could be invited to address the group, bringing their own experience to bear on the group's requirements. After a consensus—not the imposition of a dominant person's opinions—has been achieved, the group should meet again and focus on playing devil's advocate, trying to come up with additional challenges and alternatives.

In summary, security experts should pay attention to group dynamics and be prepared to counter possible dysfunctional responses that interfere with acceptance of IA policies.

50.6 TECHNOLOGICAL GENERATION GAPS.

There are growing gaps in our society among the group that grew up interacting in real-world communities (unwired), the group that grew up with the Internet (wired), and the newest group, which is growing up with the always-on technology of our complex and content-rich wireless social networks.

  • The unwired generation. In today's “always-on” world of ubiquitous wireless communications, we sometimes forget about the unwired generation: those born in the early 1960s or before, who grew up actually playing outside with their friends and communicating face to face. For these employees, online discussion groups, streaming video training, blogs, and e-mail may not be as effective as they are with other, more technological groups. The unwired generation frequently sees the use of impersonal technology in training as a sign of management apathy, an indication that the topic is not worth the time for real-world interaction. To this generation, if security training is important, someone should take the time to deliver it face to face. The unwired generation is also the most susceptible to many of the social psychology techniques and pitfalls discussed in this chapter.
  • The wired generation. Those born from the early 1960s to the late 1970s make up the wired generation. This transitional generation grew up at the dawn of the Internet, from 300-baud dial-up access to ISDN, from MS-DOS to Windows 98, and from COBOL to C++; it is the bridge from the real world to the cyber world. It is also the generation currently coming to power in both business and government. Although accustomed to meeting face to face for important business, the wired generation lives by e-mail and cell phones, tolerating both unwired and wired methods of communicating and learning.
  • The always-on generation. Those born after 1980 make up the always-on generation. This generation grew up with high-speed Internet, cell phones, video games, portable electronics, and online virtual communities. This is the connected, networked, MySpace generation, and it is the one that seems to confound traditional security screening and implementation policy.

Although the unwired and wired generations share psychological roots in the real world, the always-on generation has two homes: the real world and the cyber world. With instant messaging (IM) and cellular wireless, this generation moves in and out of cyberspace the way previous generations moved between the worlds of work and home. Moreover, just as those generations kept separate frameworks for work and home, the always-on generation maintains different psychological and sociological frameworks for its two worlds—a serious information assurance concern.

To the unwired and wired generations, personal communication meant face to face. To the always-on generation, personal communication means IM, e-mail, and digital phones; social interaction often means blogs, Web pages, message boards, and wikis. Add to this the always-on generation's keen ability to transfer information quickly between its tightly integrated worlds, and you have the ingredients for a security officer's nightmare.

Recognizing these differences and reacting accordingly can pay sizable dividends in both security compliance and general management success. As a first step, one can ensure that security teams include members from more than the oldest generation in the enterprise; younger people may be able to act as intermediaries or translators between increasingly disparate cultures.

50.7 SUMMARY OF RECOMMENDATIONS.

This chapter has reviewed the major findings of social psychology that can help to improve IA programs. These ideas can prove useful to readers who think about social psychology as they work to implement security policies:

  • Recognize that IA policies often conflict with the schema for trusting, polite behavior in situations outside the work arena.
  • Train IA personnel to recognize that failure to comply with security policies may be rooted in many factors other than a simple bad attitude.
  • Listen more than you command.
  • Teach security personnel to avoid the classic errors of attribution when trying to understand their colleagues' motivations.
  • Openly discuss and counter prejudice before it causes conflicts.
  • Take intercultural differences into account when setting and implementing security policies.
  • Before attempting to implement policies and procedures, ensure a consistent view of IA among colleagues.
  • Whenever possible, security policies should be introduced over a long time, not rushed into place.
  • Presenting case studies is likely to have a beneficial effect on participants' readiness to examine security requirements.
  • Security awareness programs should include many realistic examples of security requirements and breaches.
  • Attempt to inspire a commitment to security rather than merely describing it.
  • Emphasize improvements rather than reduction of failure.
  • Create a new concern for corporate security by exploring the current structure of beliefs among employees and managers.
  • Never portray computer crime using positive images and words.
  • Praise any comments that are critical of computer crime or that support the established security policies.
  • Employees who dismiss security concerns or flout the regulations should be challenged on their attitudes, not ignored.
  • Identify the senior executives most likely to succeed in setting a positive tone for subsequent security training and engage their cooperation to act as role models.
  • Examples of frightening consequences used in awareness and training materials should be coupled immediately with descriptions of effective and achievable security measures to forestall such consequences.
  • Presenting objections to a proposal and offering counterarguments is more effective than one-sided diatribes.
  • Security awareness programs should include many, frequent, and preferably novel and entertaining reminders of security issues.
  • In addition to tapes and books, rely on a charismatic teacher or leader to help generate enthusiasm for better security.
  • Encourage specific employees to take on public responsibility for IA within their work groups.
  • Rotate security roles periodically.
  • Security training should include information on how to tell that someone may be engaging in computer crime.
  • Build a corporate culture that rewards responsible behavior, such as reporting security violations.
  • Develop clearly written security policies and procedures.
  • Security procedures should make it easy to act in accordance with security policy.
  • Treat failures to act in accordance with security policies and procedures as very serious matters.
  • Enforcing standards of security can increase the likelihood that employees will subsequently act more cooperatively.
  • A working environment in which employees are respected is more conducive to good security than one that devalues and abuses them.
  • Get to know the staff from whom you need support.
  • Encourage social activities in the office.
  • Pay special attention to social outliers during instruction programs.
  • Monitor compliance more closely at both ends of the popularity range.
  • Work with the outliers to resist a group's anti-security bias.
  • Include small gifts in your security awareness program.
  • Start improving security a little at a time, and work up to more intrusive procedures.
  • Before discussing security at a meeting, have one-on-one discussions with the participants.
  • Remain impartial, and encourage open debate in security meetings.
  • Bring in experts from the outside when faced with groupthink.
  • Meet again after a consensus has been built, and play devil's advocate.
  • Recognize the generational technology gaps in our culture and communicate accordingly; include people from different generations in your security teams.

None of these suggestions is essential; none of them is appropriate in all situations. However, building on the accumulated experience and wisdom of social psychologists will support the smooth integration of information assurance into any corporate culture. We hope that readers will explore the literature of social and organizational psychology and will try out new ideas that will enrich the field of information assurance in years to come.

50.8 FURTHER READING

Adler, N. J., and A. Gunderson. International Dimensions of Organizational Behavior, 5th ed. Cincinnati, OH: South-Western College Publications, 2007.

Greenberg, J. Managing Behavior in Organizations, 5th ed. Upper Saddle River, NJ: Prentice-Hall, 2008.

Kinicki, A. Organizational Behavior, 4th ed. New York: McGraw-Hill, 2008.

Kowert, P. Groupthink or Deadlock: When Do Leaders Learn from Their Advisors? Albany, NY: State University of New York Press, 2002.

Lesko, W. A., ed. Readings in Social Psychology: General, Classic, and Contemporary Selections, 7th ed. New York: Allyn & Bacon, 2008.

Mills, J. H. Understanding Organizational Change. New York: Routledge, 2009.

Myers, D. G. Social Psychology, 9th ed. New York: McGraw-Hill, 2006.

Senior, C., and M. Butler. Social Cognitive Neuroscience of Organizations. Hoboken, NJ: Wiley-Blackwell, 2008.

Smith, E. R., and D. M. Mackie. Social Psychology, 3rd ed. Philadelphia: Taylor & Francis, 2007.

50.9 NOTES

1. This chapter is based on original work by M. E. Kabay as a contributed paper at the Sixteenth National Computer Security Conference organized in 1993 by the National Computer Security Center. That work was updated over the years and became a chapter in the third edition of this Handbook. It has been updated for this edition with contributions from colleagues teaching and studying in the MSIA program at the School of Graduate Studies at Norwich University.

2. No specific references to the scholarly literature of social psychology research are included in this chapter except for quoted materials. For details of the information presented, consult any college-level introduction to social psychology.

3. J. Baron, Thinking and Deciding (Cambridge, UK: Cambridge University Press, 1994).

4. Government Accountability Office, “Information Security: Emerging Cybersecurity Issues Threaten Federal Information Systems,” United States Government Accountability Office Report GAO-05-231 (May 2005); www.gao.gov/new.items/d05231.pdf.

5. M. E. Kabay, “The VA Data Insecurity Saga.” (2008), www2.norwich.edu/mkabay/infosecmgmt/vasaga.pdf.

6. R. D. Austin and C. A. R. Darby, “The Myth of Secure Computing,” Harvard Business Review (June 2003); http://harvardbusinessonline.hbsp.harvard.edu/b01/en/common/item_detail.jhtml?id=4031&referral=2340.

7. For an in-depth discussion of the manager's role in IA, see Chapter 63 in this Handbook; for a discussion of the role of the chief information security officer, see Chapter 65.

8. N. Tal-Or and Y. Papirman, “The Fundamental Attribution Error in Attributing Fictional Figures' Characteristics to the Actors.” Paper presented at the annual meeting of the International Communication Association, Sheraton New York, New York, NY, April 13, 2008; www.allacademic.com/meta/p13476_index.html.

9. G. Hofstede and G. J. Hofstede, Cultures and Organizations: Software of the Mind, 2nd ed. (New York: McGraw-Hill, 2004).

10. S. Salahuddin, “Taliban Defend Yellow Badges for Non-Muslim Afghans,” Reuters, May 23, 2001; www.afghanistannewscenter.com/news/2001/may/may23c2001.html.

11. See Chapter 45 in this Handbook for a discussion of such an example.

12. For more ideas on effective security-awareness programs, see Chapter 49.

13. SIIA Resource eStore: www.siia.net/estore/10browse.asp?Category=Anti-piracy.

14. See FUTURECents, “Management by Walking Around,” www.futurecents.com/mainmbwa.htm; and R. Amis, “Time Again for Management by Walking Around?” IT Manager's Journal, January 15, 2006; www.itmanagersjournal.com/feature/10227.

15. MADD, “Total Traffic Fatalities vs. Alcohol Related Traffic Fatalities—1982–2006,” Mothers Against Drunk Driving, http://maddtx.org/stats/11882.

16. See Chapter 66 in this Handbook for a discussion of the importance of widespread participation in security-policy development.
