In earlier chapters, we took a broad look at the hazards and challenges organizations face when their businesses depend on data networks. Whether a government agency or a private firm, all organizations face similar security challenges – how best to protect assets without impairing productivity and the bottom line. We also examined various measures for protecting assets, primarily performed by trained system administrators, as well as recommended procedures for reacting to adverse events, thereby controlling damage and minimizing the impact on the organization.
In this chapter, we will step away from the technical world and discuss administrative mechanisms available to security analysts and system administrators. These mechanisms allow security administrators to guide the behaviors of IT users in the organization in a manner that reduces easily avoidable security hazards. Without these mechanisms, system administrators would spend enormous amounts of time fixing security problems that should not have occurred in the first place, at significant cost to the organization.
At the end of the chapter, you should be able to:
The administrative mechanisms used in the industry to guide end-user behaviors are policies, standards, and guidelines. These mechanisms allow security administrators to obtain executive-level endorsement for information security objectives within the organization and translate these objectives into specific actionable items for all organizational members. When their experience with handling incidents suggests to security administrators that the organization needs to change the way it deals with information security, they can bring suggested changes to the attention of top management for review. While top management is concerned about information security, it is also concerned that additional security usually impedes work and can add substantial training costs to deal with the change. However, if the change is still warranted, top management will allow it. The resulting information security practices are released as policies. Standards and guidelines emanate from these policies. Policies, standards, and guidelines need to be targeted and have clear objectives. To accomplish this, it is important to understand the basic principles of information security valued by the organization and use those principles as the underlying support for the policy. We will discuss some of these principles in the next few paragraphs.
First of all, the organization must understand that security affects the organization, its employees, and its customers on a daily basis. Security is not something you do today, skip tomorrow, and then try again next week. Sound principles of security must be embedded in all of the organization's activities.
Next, understand the concept of “layers of security.” There is no “one size fits all” solution for security problems. As a security analyst or a systems administrator you will find many companies out there trying to tell you that their product is absolutely indispensable and will resolve all your security problems. That will never be the case. If that were the case, we would not see repeated in the news media, over and over again, cases of virus outbreaks, data leaks, and web defacements. The best way to fight off hackers, malware, and fraudsters is to implement multiple security systems to protect your assets. So, to protect the data in a file server you may have implemented a login system with complex passwords, biometric scans, a firewall, EPP, and encryption, hoping that one of these systems will catch a threat action.
Understanding other positions may also help with the writing of policies. Does the company prefer open source or commercial software? The different approaches may bring up different policy requirements. Does the company adopt one of the industry standards across the board, or is it more selective on what it adopts? Does it hire temporary consultants or does it strive to keep knowledge in-house?
According to the COBIT framework,1 “a policy is a document that records a high-level principle or course of action that has been decided on.” The emphasis here is on “high-level.” Policies reflect principles endorsed at the highest levels of the organization. Executive time at these levels is very expensive, and these executives try very hard not to revisit an issue a second time. Therefore policies are written in a language that is general enough to deal with routine developments in business and technology. The other administrative mechanisms – standards, guidelines, and procedures – emanate from policies and provide specific actionable directions to all employees. Standards, guidelines, and procedures are written by experts such as system administrators and can change as the specific circumstances within the organization change. Thus, while a policy specifies a general direction for the organization to follow, without concerns for how to get there, standards, guidelines, and procedures focus on how to get where the policy desires to go.
For instance, the University of South Florida states the following on policy 0-516, SSN Appropriate Use Policy2:
Paper and electronic files containing Social Security Numbers will be disposed of in a secure fashion in accordance with state and federal retention and disposal policies.
There is no detail on how to dispose of paper containing SSNs. The only requirement is that it is done “in a secure fashion” according to the law. The focus of the policy is that the records are disposed of, not how the disposal is to be implemented. That depends on available technology, cost, and other factors, and will be described in standards, procedures, and guidelines.
A standard is a defined set of rules, accepted and adopted by several organizations. Some standards are referred to as “industry standards.” These are activities, settings, and measurements that are accepted by all firms in an industry and should be considered the norm for operations.
NIST, the National Institute of Standards and Technology, is one of the foremost sources of IT security standards, at least for organizations within the United States. Even though its documents are usually carefully labeled as “recommendations” or “guidelines,” they are seen as de facto standards for all organizations in the United States. Some examples have been seen throughout this textbook, and include the “Guidelines for Conducting Risk Assessments.”
The International Organization for Standardization (ISO) is another organization accepted worldwide to produce standards with international scope. One of the most widely used ISO standards is 17799/27002, which deals with information security. According to their website, ISO 27002 “establishes guidelines and general principles for initiating, implementing, maintaining, and improving information security management in an organization. The objectives outlined provide general guidance on the commonly accepted goals of information security management.” ISO/IEC 27002:2005 contains best practices of control objectives and controls in the following areas of information security management:
Once accepted by the organization, standards are mandatory. For instance, in order for an organization to declare itself ISO 27002-compliant, the organization must adhere to all regulations put forth by the standard. There is no such thing as “partial compliance.”
Sidebar
In 2009, Symantec put together a nice poster referencing different standards and regulatory compliance requirements in a clear fashion, so the reader can quickly identify the similarities among each class of security requirement. You can find a copy here:
Standards are also directly related to, and backed by, a policy. For instance, a policy could declare that all computers in the organization must have installed an end point protection solution put forth by the IT department and made available at the IT website. The standard would then specify which EPP should be installed. The advantage in this situation is simple. As we will see, a policy is usually harder to modify than a standard. By allowing IT to keep the EPP standard, the policy allows IT to make decisions about EPP without the burden of going through the entire policy life cycle and approval process.
Finally, standards make a policy more meaningful. Take the example in the previous paragraph. Without a standard, EPP becomes a bit vague. EPP is a collection of applications that protect an end point. They always include an antivirus solution, but the following are optional:
Without the standard, units would have a hodgepodge of solutions. The standard determines which of these options are important for the organization and forces all units to implement that solution.
Guidelines are the procedures you give units when “it would be nice if” things were operated or accomplished in a certain way, but it is not a requirement to do so. For instance, let's assume there is a new antivirus application that runs on iOS, and it works wonderfully. The IT department may be able to “suggest” that everyone should install and run this app on their devices, but without a policy stating IT has the authority to require it, this suggestion will remain a suggestion and will not be mandatory.
Some guidelines may later evolve into standards. In a university without a centralized IT department, it may be difficult for the security organization to muster the support for a unified antivirus solution for all. That would involve convincing all units to give up their right to run any AV software they would like. Instead, IT security may be forced to put together a guideline specifying the use of vendor “A” and the reasons why this should be done.
This scenario, when the reasons may be more political than technical, is when it is extremely useful to use the “carrot vs. stick” principle. If the IT security organization is able to offer the software provided by vendor “A” for free to the other units on campus, many units will find the idea appealing and convert to using the software. This is the carrot approach.
But, to the point of this section, guidelines are adopted strictly on a volunteer basis. The document generated will continue to remain a guideline until there is enough authority granted by management to be able to make it a standard (Figure 13.1).
A typical complaint of technical folks is “why do we have to go through all this trouble? Why so much paperwork?” IT personnel are generally very hands-on people, and documentation is not necessarily their forte. But the need to maintain policies goes beyond compliance. Done right, documenting these operations is not just added red tape and bureaucracy; it can actually improve the organization's functioning.
For example, security policies are a sign to customers, end users, and even employees, that the organization takes security seriously. For example, the City of Tampa, FL, posts the following on their security policy:
Providing you with a secure online experience is a high priority of the City of Tampa. We recognize that your information security is of the utmost importance, and we have devoted a great deal of effort to ensure that your personal information is safeguarded.3
This policy indicates the importance the organization places on information security, which should be reassuring to concerned users. Policies also provide roadmaps for new employees and users. One common policy we will see in the next few sections is known as the “acceptable use policy” or AUP. The AUP describes to users the do's and don'ts of the system, things that are acceptable to do as well as things that would cause an end to services or employment. Here's a sample of AT&T's IP Services AUP:
Threatening Material or Content: IP Services shall not be used to host, post, transmit, or re-transmit any content or material (or to create a domain name or operate from a domain name), that harasses, or threatens the health or safety of others. In addition, for those IP Services that utilize AT&T provided web hosting, AT&T reserves the right to decline to provide such services if the content is determined by AT&T to be obscene, indecent, hateful, malicious, racist, defamatory, fraudulent, libelous, treasonous, excessively violent or promoting the use of violence or otherwise harmful to others.4
Policies force organizations to determine the value of information they generate in support of actual assets. Sometimes it may be advantageous to make this determination and document it in case of litigation. For instance, MIT has the following paragraph on their policy on retention of DHCP logs.
The DHCP server is configured to provide dynamic addresses automatically as needed. The logs of information are maintained on an IS&T-managed server. Each log is tagged with its creation date; once a day, the system deletes logs that are 30 days old.5
MIT is not the only organization to have their DHCP log policies in writing. Many universities do it for a very specific reason: to make organizations tracking violation of copyright laws aware that, if they are to pursue any infringement actions, they must notify the organization within 30 days (in the case of MIT) of the event detection.
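The retention rule quoted above – delete logs once they are 30 days old – is simple enough to express in a few lines of code. The sketch below is a hypothetical illustration of how such a standard might be implemented on a log server, not MIT's actual mechanism; the function name and directory layout are assumptions for the example.

```python
import os
import time

def purge_old_logs(log_dir, max_age_days=30):
    """Delete files in log_dir whose modification time is older than
    max_age_days (the retention period set by policy). Returns the
    names of the deleted files so the run can be audited."""
    cutoff = time.time() - max_age_days * 86400  # 86400 seconds per day
    deleted = []
    for entry in os.scandir(log_dir):
        # Only plain files are considered; subdirectories are left alone.
        if entry.is_file() and entry.stat().st_mtime < cutoff:
            os.remove(entry.path)
            deleted.append(entry.name)
    return deleted
```

A job like this would typically be run once a day from a scheduler such as cron, which matches the “once a day, the system deletes logs” language of the MIT policy. Note how the policy fixes only the retention period; everything else (directory, scheduling, audit trail) is an implementation detail left to the administrators.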
Policies also ensure consistency across the organization. And consistency is a good thing. Academic organizations, for instance, are notoriously decentralized: each college, and each administrative unit, may have its own computing group. While in terms of desktop support this model ensures that the individuals working on a workstation answer to the same person as the owner of the workstation, usually the College Dean, in terms of security it has the potential to create a patchwork of solutions. For instance, the School of Architecture may decide that antivirus software is a waste of funds and not purchase any. The College of Engineering may install a low-quality AV solution just because it is cheaper than others. The College of Fine Arts may pay an exorbitant price for its license because it does not have enough computers to negotiate a better deal. A campus-wide policy governing antivirus solutions would unify these units, forcing them to work with each other to achieve standardization, better pricing models, and the many other benefits associated with consistent use across campus.
Note that this is not a statement that “centralized IT systems work better.” Even if support is decentralized in an organization, certain aspects (such as security) must have a common baseline. Usually this can be accomplished by setting minimum common denominators. For instance, the Windows Account Management Standard may state something to the effect that “passwords will not be shorter than 8 characters.” This does not keep any department from adopting its own internal policy requiring passwords of a minimum of 12 characters.
Finally, the ultimate reason for IT personnel to support the development of a security policy is management backing. If the policy was developed the right way, input from all affected units and stakeholders would have been considered before its promulgation. This greatly improves acceptance of any constraints imposed by the policy. And much like the law, once the policy is in place, alleging ignorance will not exempt individuals from consequences. For instance, if the organization has a policy that computers not updating their virus definitions daily will be pulled off the network, and a user's computer is pulled off the network for this reason, the user would have no recourse to complain.
Much like the incident response cycle, policies also work with cycles. Actually, incidents are often the driving force behind the creation of a new policy or the revision of an existing one. In the late 1990s, massive outbreaks of the “Melissa” and “ILOVEYOU” viruses drove the creation of centralized security policies for decentralized university organizations, and the naming of Information Security Officers for universities.
There are two separate and distinct audiences for a policy. An organization either writes a policy for its employees and customers, or the policy is written to satisfy a state or federal regulation. Ideally the security administrator will use the policy to address both audiences.
At times there may be a specific need for formality in the writing of policies. We have all seen those policies before, when you need a translator next to you to figure out what the policy is saying. As a rule of thumb, don't use legal language unless you have to. Policies need to be written in such a way that employees and customers clearly and quickly understand what the writer is trying to say. If they are not written in clear language, regulatory compliance becomes the only audience the policy addresses, and it becomes “a policy for the sake of a policy”: no one reads it, and everyone has only a vague idea of what it is about.
As often as possible, policies should be targeted at a specific issue. Some organizations write pages and pages of policies, expecting users to read and understand the entire document. By breaking up the policies into targeted segments regulatory compliance can be satisfied, and users will have an easier time finding what they are looking for. For instance, you may have one policy that addresses data protection, another that talks about user access, yet another that discusses data backup.
We previously mentioned that policies are often spurred by adverse events. When an event is large enough and grabs the attention of management, it is common to see knee-jerk reactions in the form of policy. We will discuss impact assessment in a few sections, but it is a good idea to make sure policies that stifle productivity and usability are not put in place simply as a knee-jerk reaction to an adverse event. Policies need to be well thought out and their impact analyzed before being adopted by an organization. This is especially true for security policies.
On one end of the spectrum, security policies have to be strong enough to protect the confidentiality, integrity, and availability of the assets. Because of the need to protect, many organizations err on the protection side of the issue. However, you cannot strangle productivity and the mission of the organization to satisfy that goal. Your employees will have a tendency to keep you honest as far as policies go. They are an extremely resourceful bunch, dedicated to doing their job or achieving their goal in the easiest possible manner. If they deem an activity or behavior to be the best way to accomplish their goal and you forbid them from doing it, they will find a way around it. When your organization puts a policy together, make sure users will actually be able to abide by it.
The stages of the policy cycle are the following:
Now it is time to put pen to paper and actually start writing. Your organization may have a specific format for writing policies. In the absence of that, one of the best things to do is to search the web for similar policies from the same industry. So, if you work at a K-12 school, look at other schools and school districts. It is useful to look at your state first, since the state may have regulatory compliance issues that have to be addressed on the topic you are tackling. Then, look at other policies, both nationally and internationally. This will ensure you cover as many sub-topics as possible.
Here we present a generic template which includes the sections you will find in almost any policy. The names may change a little, for instance some may choose to call the “Overview” the “Introduction”, but the content of the sections will remain. For the sake of maintaining a similar thread throughout the section, the examples we will examine are all from higher education.
This is the first section in a policy. The overview tells users why the organization decided it would be appropriate to have such a policy. Let's look at an example from the University of Arizona. This is from a general Security Policy6:
University resources, information and technology have become increasingly important to faculty, staff and students for academic and administrative purposes. At the same time, internal and external threats to the confidentiality, integrity, and availability of these resources have increased. Security breaches are commonplace and universities continue to be popular targets for attack. Critical university resources, such as research, patient care, business transaction, student, and employee nonpublic personal data, must be protected from intrusion and inappropriate use or disclosure. Devices must be set up and routinely maintained and updated so that they prevent intrusion and other malicious activities.
In the first paragraph above, the university outlines that they value their institutional data. They also give a glimpse of some of the issues that will be covered on the policy. In the second paragraph, they elaborate on the purpose for writing the policy.
The purpose of this policy is to ensure that all individuals within its scope understand their responsibility in reducing the risk of compromise and take appropriate security measures to protect university resources. Access to university resources is a privilege, not a right, and implies user responsibilities. Such access is subject to Arizona Board of Regents and University policies, standards, guidelines and procedures, and federal and state laws.
With this paragraph, they go back to some of the guiding principles we discussed earlier. Security is not the job of IT alone. Instead, securing their data is the responsibility of every individual in the university. They also set the stage for the enforcement piece, stating that access, including the access given to students, is not a right earned by paying tuition, but a privilege. And if a user abuses these privileges, there may be consequences.
The scope section tells the user what or who is covered by the policy. Policies will always have a scope associated with them. Here's the scope for the Workstation Security Policy7 at Emory College:
The workstation security policy is applicable to all workstations (Windows, Mac OS X, Linux)(including desktops, portables, and virtual machines) that fall under the administrative scope of ECCS.
This is a very clear-cut scope. At one glance, the user can read it and determine whether their workstation is covered under this policy or not. However, organizations must be careful not to over-specify the target of the policy, unless there is a need for it. By listing Windows, Mac OS X, and Linux, ECCS is opening the door to loopholes. A simple “… and other Operating Systems” at the end of that list would close the loophole. As written, however, a faculty member running an older Solaris desktop workstation would not be covered by the policy.
In another example of scope, Kansas State University attaches the following scope to its Incident Management Policy:8
These procedures apply to all University personnel, units, and affiliates with responsibility to respond to security incidents involving University IT resources or data.
KSU's policy is an excellent example of an Incident Management Policy in itself, including data classification and very clear lines of responsibility. If security is not the sole responsibility of the IT security group but is instead shared by every user, this scope effectively includes all employees and affiliates of the university whenever institutional data is involved.
Still in the “pre-policy” sections, where we set the stage for the actual policy, you may see a separate section for definitions. This is particularly useful when the subject matter of the policy may be unclear to the audience, or if the organization needs a bit more clarification on the scope.
As an example consider Georgetown's definition of ePHI, Electronic Protected Health Information:
Electronic Protected Health Information: ePHI includes any computer data relating to the past, present or future physical or mental health, health care treatment, or payment for health care. ePHI includes information that can identify an individual, such as name, social security number, address, date of birth, medical history or medical record number, and includes such information transmitted or maintained in electronic format, but excluding certain education and student treatment records. Not included within ePHI are student education records, including medical records (which are protected under FERPA), medical records of employees received by Georgetown University in its capacity as an employer, and workers' compensation records. Although these records are not covered under the HIPAA Privacy or Security Rules, other University Policies cover the confidentiality and security of these materials. There are special provisions in the law governing the release of psychotherapy records.
This definition is extremely important for Georgetown's HIPAA Policy,9 since ePHI is at the heart of the HIPAA regulations. This definition not only specifies what is considered ePHI but also some clear examples of what is not considered ePHI, such as student records.
A popular term used in IT policies is “Information Resources.” But what exactly is an information resource? Does it include an employee's smartphone? A student's laptop? A departmental fax machine? A faculty's telephone? Here's how the Marist College10 defines Information Resources:
For the purpose of this policy, information resources refer to:
This is a very thorough definition of Information Resources. From this point forward in the policy, every time the words “Information Resources” appear, there should be no question about what they refer to.
We finally get to the section which explains to readers the actual policy we want to establish. The section picks up all the concepts introduced in the first couple of sections – the purpose, the organization's guiding principles, the targets for the policy, and the definitions – and moves them forward to the conclusion. The policy statement formulates how the organization will deal with a particular situation.
The next paragraph contains a piece from the University of Massachusetts Boston's Wireless Requirements and Procedures, discussing Wireless Access Points. WAPs are the point of connection between a mobile device and the rest of the network:
All WAPs connected to university infrastructure must be registered with IT and must comply with the technical standards and naming conventions specified by IT. The registration process requires information including the responsible university unit and designated liaison, as well as the location, purpose, and technical and operational information about the WAP. Registration can be accomplished using the online form located at the IT website. Such registration is intended for the identification of the WAP, to facilitate communications between all parties responsible for wireless network support and operation, and to ensure compliance with all applicable UMass policies, standards, and guidelines, as well as federal, state, and local rules and regulations.
This is a common type of policy. Wireless Access Points or WAPs carelessly deployed on campus could easily cause problems. If settings are not configured properly, individuals walking around campus could associate with the WAP accidentally, opening themselves up to sniffing attacks. Also, tracking connections back to a particular user may not be possible.
Statements of policy will vary in length depending on the subject matter and the organization's choice of either grouping multiple security issues into one policy or splitting them up into multiple policies. As much as possible, the statement of policy will also outline the responsibilities for implementing the policy.
The Coordinator of Incident Response upon receiving a report is responsible for assessing its veracity, determining whether or not the event constitutes an IT Incident and classifying the IT Incident, and initiating handling procedures.
The above statement is part of Purdue's Data Security Incident Response Policy. It is just one of several statements specifying the responsibilities of the coordinator of incident response.
The enforcement section is usually the last section of the policy. It may refer to other policies for penalties. It is also rarely specific in the penalty. It will usually mention a range of possible measures, with phrases such as “up to and including” and “appropriate measures.” These sections will also tend to use “may” instead of the more absolute “shall” or “must” used in the rest of the policy. Take Carnegie Mellon's enforcement section:11
Violations of this Policy may result in suspension or loss of the violator's use privileges, with respect to Institutional Data and University owned Information Systems. Additional administrative sanctions may apply up to and including termination of employment or contractor status with the University. Civil, criminal and equitable remedies may apply.
The enforcement section may also mention exceptions to the policy, or means by which a user would be able to apply for an exception of the policy. The same policy from CMU adds the following:
Exceptions to this Policy must be approved by the Information Security Office, under the guidance of the Executive Steering Committee on Computing (“ESCC”), and formally documented. Policy exceptions will be reviewed on a periodic basis for appropriateness.
The Information Security Office will not only approve or reject any requests for an exception but also review these requests from time to time to make sure they remain appropriate given current technology and threats, without putting the university at risk. Also note something common in policies in this paragraph: the exceptions will be reviewed “periodically.” Not yearly, not every month, but periodically. This is done so that the ISO does not break its own policy by failing to review them on a specific timetable. If other pressing matters appear, the review can be delayed.
With all policies approved by the organization, compliance is mandatory. The following is another example, a snippet of the USDA security policy. It first specifies what was mentioned, that everyone involved with USDA data must comply with the security policy. Then, on the last paragraph, it speaks to the enforcement of the policy:
All users of information and AIS, including contractors working for USDA, are responsible for complying with this information systems security policy as well as procedures and practices developed in support of this policy. Any contractor handling sensitive USDA data is subject to the security requirements specified in this Departmental Regulation.
Anyone suspecting misuse or attempted misuse of USDA information systems resources are responsible for reporting such activity to their management officials and to the ISSPM.
Violations of standards, procedures, or practices in support of this policy will be brought to the attention of management officials for appropriate action which will result in disciplinary action, that could include termination of employment.12
This paragraph does something very common to policies when it comes to enforcement. Instead of specifically stating that anyone who does this will be fired immediately, it softens the blow by saying that disciplinary action “could include” termination of employment. Making the statement in this manner allows management officials to apply their own penalties without necessarily having to fire the individual employee. In fact, this policy doesn't even set a lower threshold for enforcement. For all intents and purposes, a simple slap on the wrist may be enough.
Not to “pick” on this policy, but it is important to mention that this particular policy is also missing another important point required by COBIT guidelines: the method for exemption. One alternative that would make this policy COBIT-compliant would be to state that the perpetrator will be terminated, while allowing an avenue for appeal due to special circumstances.
Once the policy is written, it is strongly recommended that the policy be reviewed by all affected stakeholders. During this phase, the draft of the policy is circulated among stakeholders and feedback is requested. One of the questions posed to the stakeholders is whether the new policy or change in existing policy will have an impact on their department, beneficial or not. The organization must also consider the impact of failing to pass the new policy, as well as the impact of passing it.
When we discuss policies and vetting, the issue of governance immediately comes up. Governance is the hierarchy of who makes decisions within the organization. In terms of policy, governance reflects the committees or groups that have the ability to veto a policy before it becomes official. The University of Michigan13 lists the following:
The following identifies the different levels of governance review and vetting of policies, standards and guidelines (initially drafted by IT policy development working groups):
CISO/IIA Executive Director: Initial review of policies, guidelines, and standards
IIA Council: First level of governance review for IT policies, standards, and guidelines
CIO: Second level of governance review for IT policies; final approval of guidelines and standards before adoption and dissemination to campus
IT Council: Third level of governance review for IT policies; new or substantially revised policies require IT Council approval
IT Executive Committee: Final level of governance review for IT policies; policies recommended for adoption as a new or revised Standard Practice Guide require approval of the IT Executive Committee.
There may be other levels of approval involved before the policy becomes official. Generally, in order for a policy to be applied to an entire organization, it also has to be vetted by other groups. Faculty members may have a say on the policy, and perhaps even student organizations. Some universities have specific “Policy Groups” set up with cross-campus representation, responsible for reviewing and approving or rejecting policies. Other universities handle policies within the Office of General Counsel. Here's an example from Cornell University:14
With the responsible executive's approval, the UPO will distribute the draft of the policy document to members of the Policy Advisory Group (PAG) in advance of a PAG review meeting. The responsible executive or the responsible office will present the draft policy to the meeting, where the document will be reviewed for practicality and clarity. After the PAG meeting, the UPO and responsible office will review and make accepted changes proposed by the PAG. Then, the PAG will recommend that the EPRG approve the reviewed document.
With the responsible executive's approval, the UPO will distribute the final draft of the policy to members of the EPRG in advance of the EPRG meeting. The responsible executive will present the final policy draft to this meeting, where the EPRG will deliberate on final approval of the policy, in particular its principles. The UPO and responsible office will make changes as directed by the EPRG.
Once the EPRG and the responsible executive have approved the document, the UPO will note on the document the date of final approval as the date the policy was “Originally Issued,” and will promulgate the policy to the university community through a formal announcement.
The UPO (University Policy Office) handles the mechanics of the policy promulgation process. The PAG is the cross-functional group responsible for the approval. At Cornell the PAG actually meets from time to time to make policy decisions. In other universities, the vetting process may be done over email, with a deadline for comments to be brought forth.
As you can see, it may take several weeks before a policy goes through the promulgation process and is made enforceable. This is one of the reasons why technical details should be left out of policies as much as possible. By adding a reference to a standard in the policy and putting the IT organization in charge of the standard, items like minimum password length, supported operating systems, and other dynamic IT details can be modified more easily with just an internal review.
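To illustrate this separation, a standard referenced by the policy can capture the volatile technical details in a short, easily revised document. The fragment below is a hedged sketch; every name and value in it is invented for illustration, not drawn from any particular organization's standard:

```yaml
# Hypothetical "Password Standard v2.1", referenced by the Authentication Policy.
# Changing any value here requires only an internal IT review,
# not the full policy promulgation process.
password_standard:
  minimum_length: 12
  complexity: "at least three of: upper, lower, digit, symbol"
  maximum_age_days: 365
  reuse_history: 10          # number of previous passwords that cannot be reused
supported_operating_systems:
  - "Windows 11"
  - "macOS 14 or later"
  - "Ubuntu LTS releases under vendor support"
```

The policy itself would only say something like “passwords must comply with the current Password Standard maintained by the IT organization,” leaving the numbers free to change as technology does.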
While these extensive reviews may appear to be unnecessary bureaucracy and red tape, they prevent the organization from developing policies that are difficult to implement, or that create unintended consequences among stakeholders. Rather than revise policies after encountering resistance from these groups, wasting your time as well as your credibility with top management, it is better to consider all possible problems with a policy before bringing it to top management for approval.
Once the policy or standard is created and promulgated, when should it be reviewed? There are several triggers that should be considered, but one of the most common is the periodic review of policies. Universities are usually accredited by an outside academic organization. In Florida, that organization is SACS, the Southern Association of Colleges and Schools. Every 5 years, SACS sends a team of academic investigators to look at the university, its degree offerings, and overall policies and procedures. One of the things SACS specifically looks at is whether policies, including IT policies, are reviewed periodically. If a policy is 10 years old, has it been reviewed recently? Does it meet the current requirements of the institution? If not, does this reflect systematic negligence on the part of IT? The rule of thumb is to conduct an internal review of all policies, standards, and guidelines at least once a year. Usually the people in IT responsible for writing policies are also the ones in charge of administering the systems, to one extent or another. The yearly review is the time when all triggers are considered to determine whether a policy has to be revised.
One of the things that may expedite the review of a policy or standard is technology change. Ideally, the policy was written in such a way that new technologies can be addressed in the standard, instead of having to go through the entire promulgation process.
New projects deploying new or updated applications also may require a review. For instance, changing your employee portal to a new application could be a massive endeavor, which may very well require the change of policy, standard, and procedures at the same time.
Changes in regulatory compliance may require a reevaluation of governance. For example, the Higher Education Opportunity Act of 2008 forced universities to take a more concrete stance against the illegal sharing of copyrighted material, such as movies or songs. According to EDUCAUSE,15 several sections of the HEOA deal with unauthorized file sharing on campus networks, imposing three general requirements on all US colleges and universities:
Universities were required to make a good-faith effort toward compliance by August 2008, even though the law was not going to be enforced until 2010. Failure to comply could result in massive financial losses for the university in terms of Financial Aid funds. Changes in compliance requirements resulted in changes in operations, which in turn had to be reflected in existing policies.
Before we look at examples and some key policy issues, let's take a look at a topic that is commonly misunderstood: compliance. More importantly, let's understand the difference between a secure environment and a compliant environment.
Compliance, sometimes referred to as regulatory compliance, involves following specifications put forth by policies or legal requirements. Policies often originate from (a) industry standards for the area, themselves driven by regulatory compliance, or (b) events with adverse effects on the organization. These legal specifications are often vague and confusing, especially in the case of compliance mandated by state and federal law. For instance, leaks of Social Security Numbers in the past couple of years generated many state laws requiring protection of SSNs, without addressing the reason why organizations are at times required to collect SSNs.
But for security analysts it is important to understand the difference between security and compliance. Let's assume, for instance, that you maintain a highly secure server which stores restricted data for your company. On this hypothetical server, your company stores credit card information for thousands of its customers. You have set up 20 different controls to keep the system secure, from a single account known only to you, to multi-factor authentication, firewalls, and more. Yet if the credit card data is not encrypted, no matter what else you have done to protect the data, the system may be out of compliance with the Payment Card Industry (PCI) standard.
This is not to say that compliance is not important. Internal Audit and Compliance departments ensure that administrators and other employees adhere to the laws and policies governing the organization, so as not to put the organization at undue risk. In the IT arena, the absence of an internal Audit and Compliance department that partners with IT on projects implies that the responsibility for complying with all of the sources shown in Figure 13.2 falls to the IT department. If the compliance context is not available, security management and operations teams may well be doing what they believe to be the “right” things, but could, in fact, be wasting effort and not achieving the needed results.
Compliance is a critical aspect of any project and, as such, should be considered at the planning stages of any endeavor. It is a lot easier to design with compliance in mind than to try to retrofit and accommodate those requirements later.
Each state has its own set of regulatory compliance requirements. Some are directly aimed towards IT resources, such as California's Breach Notification Law. According to datagovernance.com, SB 1386 (the California Security Breach Information Act) is a California State law that requires companies that collect personal information to notify each person in their database should there be a security breach involving personal information such as their Social Security number, driver's license number, account number, credit or debit card number, or security code or password for accessing their financial account.
Others indirectly affect IT operations, such as Florida's record retention laws. These are an extremely complex set of regulations with retention schedules that identify agency records and establish minimum periods of time for which the records must be retained based on the records' administrative, fiscal, legal, and historical values.16
As if keeping up with state requirements were not enough, there are also federal compliance requirements put forth by many different laws and regulations, based on the type of industry or type of data your organization handles. Just like state regulations, some are aimed directly at IT resources; others involve IT indirectly. Here are some of the best-known and most complex federal regulations. Any of these would take at least an entire week to study, so we will give you just a brief summary.
The HIPAA Privacy Rule provides federal protections for personal health information held by covered entities and gives patients an array of rights with respect to that information. At the same time, the Privacy Rule is balanced so that it permits the disclosure of personal health information needed for patient care and other important purposes.
The Security Rule specifies a series of administrative, physical, and technical safeguards for covered entities to use to assure the confidentiality, integrity, and availability of electronic protected health information.17
The Gramm–Leach–Bliley Act (GLB or Financial Modernization Act) requires “financial institutions” to protect the privacy of their customers, including customers' non-public, personal information. Because universities deal with a variety of financial records from students and their parents, they also have a responsibility to secure the personal records of their students. Among the institutions that fall under FTC jurisdiction for purposes of the GLB Act are non-bank mortgage lenders, loan brokers, some financial or investment advisers, tax preparers, providers of real estate settlement services, and debt collectors. At the same time, the FTC's regulation applies only to companies that are “significantly engaged” in such financial activities.
GLB is composed of two “rules”: the Safeguards Rule and the Privacy Rule. According to the FTC website, the Safeguards Rule requires companies to develop a written information security plan that describes their program to protect customer information. The plan must be appropriate to the company's size and complexity, the nature and scope of its activities, and the sensitivity of the customer information it handles. As part of its plan, each company must:
The Family Educational Rights and Privacy Act (FERPA) (20 U.S.C. § 1232g; 34 CFR Part 99) is a Federal law that protects the privacy of student education records. The law applies to all schools that receive funds under an applicable program of the US Department of Education.
Schools may disclose, without consent, “directory” information such as a student's name, address, telephone number, date and place of birth, honors and awards, and dates of attendance. However, schools must tell parents and eligible students about directory information and allow parents and eligible students a reasonable amount of time to request that the school not disclose directory information about them. Schools must notify parents and eligible students annually of their rights under FERPA. The actual means of notification (special letter, inclusion in a PTA bulletin, student handbook, or newspaper article) is left to the discretion of each school.18
The Sarbanes–Oxley Act of 2002 (SOX) introduced significant changes to financial practice and corporate management regulation. Passed in the wake of numerous corporate scandals, SOX is a complex piece of legislation that requires companies to make major changes to bring their organizations into compliance. The Act holds top executives personally responsible for the accuracy and timeliness of their company's financial data – under threat of criminal prosecution. Thus, SOX compliance has become a top priority for publicly traded companies.
To comply with the Federal Act, the IT department plays a major role in securing the accuracy and reliability of corporate data. With the implementation of the Sarbanes–Oxley Act, information technology controls have become more prominent. Here are some of the IT processes that would likely be investigated when checking for compliance:
Export Control Laws are a hot topic for research universities around the country. The laws prohibit the unlicensed export of certain materials or information for reasons of national security or protection of trade. Export controls usually arise for one or more of the following reasons:
University research is subject to export control laws including the International Traffic in Arms Regulations (ITAR). These laws may also apply to research activities on campus, to the temporary export of controlled university-owned equipment including laptop computers containing controlled software or technical data, and to the shipment of research materials to foreign collaborators.
Here is a quick rundown of some of the key issues any organization should be prepared to handle, either at the policy or standard level.
Acceptable use – The AUP is one of the main policies for an organization. It gives guidelines to users and customers on what is and is not appropriate to do with information technology resources. Scope definition is important so that users understand what and who falls under the policy. The customer AUP may be different from the employee AUP, although in a university environment they are usually the same. If there are any exceptions to the coverage, they should also be mentioned. For instance, a College of Medicine may need a stricter AUP due to the need for HIPAA compliance.
Information classification – This is the policy which outlines the definitions of criticality and sensitivity of assets. Examples are important to clarify the intent of the classification. Definitions of data ownership and custodianship are also part of this policy.
Network access – This policy spells out which types of users are allowed to connect to network resources. Students in the residence halls may not have access to datacenter subnets. Visiting professors may have to go through a special process to get network privileges. Visitors may only be able to access the guest wireless network if they register with their cell number.
Remote access – Specifies the acceptable means by which an employee is allowed to access resources from outside the organization network. This may also include requirements on accessing data through smartphones and other personal devices. Is remote desktop an acceptable option, or should the employee use a VPN connection?
Encryption – What type of data requires encryption? When is a web server required to use SSL? Do test and development environments also require encryption? Can certificates be self-signed? Is it acceptable to send restricted information unencrypted over email?
Contingency planning – Specifies the disaster recovery plans. The policy should establish a clear line of command in case of a localized or generalized disaster, with reporting lines and alternatives in case someone cannot be reached. It designates an executive as the appropriate person to be responsible for the declaration of a disaster. It refers to other standards and procedures for the specifics on what to do with each system in case of disaster.
Incident response – The incident response policy describes the general procedure in case of an incident with adverse effects on the organization. It specifies who is supposed to lead the incident response team and who will be in charge of communications, both internal and external. It determines when an incident has to be escalated, and how to handle the escalation. It gives the chair of the IRT the latitude to make quick unilateral decisions in order to protect the organization's assets.
Authentication and authorization – What are the accepted methods of authentication? What roles can an individual user take? How soon after termination of employment will the user's account be revoked? Are departments allowed to request an extension to this time period? Who has the right to receive an account on a system?
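A standard under the authentication policy would typically also specify how credentials must be stored. As a hedged illustration (the iteration count and parameters below are invented for this example, not drawn from any particular standard), a salted, deliberately slow hash makes a stolen credential database far harder to exploit:

```python
import hashlib
import hmac
import os

# Hypothetical parameter a password standard might mandate:
# a high iteration count slows brute-force attacks on stolen hashes.
ITERATIONS = 100_000

def hash_password(password: str) -> tuple:
    """Return (salt, digest) for storage; the salt is unique per user."""
    salt = os.urandom(16)  # random salt defeats precomputed (rainbow) tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("guess", salt, digest))                         # False
```

The policy answers the "who and when" questions above; numeric details like the iteration count belong in the standard, where IT can raise them as hardware gets faster.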
HB Gary is an information security company with a subsidiary called HB Gary Federal. As the name suggests, this subsidiary aimed to attract information security business from US federal agencies such as the CIA and FBI. Unfortunately, due to weaknesses in its software and poor password practices, most of the company's emails were stolen by hacktivists affiliated with Anonymous and LulzSec in or about February 2011. The publicity associated with the incident made a laughingstock of a security company seeking security business from some of the highest-security organizations in the world. On May 2, 2012, federal prosecutors charged five individuals with this and other associated crimes.
An article in Ars Technica describes the attack in detail. HB Gary had developed a custom content management system (CMS) with the help of third parties. Weaknesses in the CMS made the software vulnerable to SQL injection attacks. Exploiting this vulnerability, the attackers downloaded the site's entire user database. Though the passwords had been stored in hashed form, two key executives at the security firm – CEO Aaron Barr and COO Ted Vera – violated common password recommendations in two important ways: (1) they used simple passwords and (2) they used the same password not only for the CMS, but also for email, Twitter, and LinkedIn. These credentials for two of the company's most privileged executives, combined with some clever social engineering, allowed the attackers to download all the company's email and deface the company's website.
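The CMS flaw at the heart of the breach is easy to reproduce in miniature. The sketch below uses an in-memory SQLite database with invented users; it is illustrative, not HB Gary's actual code. It shows why a query built by string concatenation is injectable, while a parameterized query treats the attacker's input as plain data:

```python
import sqlite3

# Hypothetical user table for the demonstration
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("aaron", "secret1"), ("ted", "secret2")])

def login_unsafe(name, password):
    # VULNERABLE: user input is concatenated directly into the SQL text
    query = ("SELECT name FROM users WHERE name = '%s' AND password = '%s'"
             % (name, password))
    return conn.execute(query).fetchall()

def login_safe(name, password):
    # SAFE: placeholders let the driver pass input as data, never as SQL
    query = "SELECT name FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchall()

injection = "' OR '1'='1"
print(login_unsafe("x", injection))  # every row comes back: injection succeeds
print(login_safe("x", injection))    # empty result: injection fails
```

The injected string turns the unsafe query's WHERE clause into `... OR '1'='1'`, which is true for every row; the parameterized version simply looks for a user whose password literally contains the quote characters, and finds none.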
The indictment and the Ars Technica article are interesting reading. As is the Wikipedia article on Lulzsec.20
http://arstechnica.com/tech-policy/2011/02/anonymous-speaks-the-inside-story-of-the-hbgary-hack/
http://www.wired.com/images_blogs/threatlevel/2012/03/Ackroydet-al.-Indictment.pdf
http://en.wikipedia.org/wiki/LulzSec
In this chapter, we distinguished between compliance and security: whereas compliance refers to following specified procedures, security refers to minimizing harm. We also distinguished between policies, standards, and guidelines, the three primary forms of formal documents that guide information security in organizations. Given the relative permanence of policies, we walked through the generic process to establish policies that are most likely to accomplish their objectives. We also listed a minimal set of information security policies that we believe every organization should draft.
SANS Policy Templates, http://www.sans.org/security-resources/policies/
(Recommended: Form a group in your class to complete this activity.)
With your group, create an acceptable use policy for the use of Sunshine State computing accounts. When constructing the policy, consider the following information:
Once your group has created the policy, save it on your Linux Virtual Machine as /home/shared/business_finance/information_technology/website/it/policy.html which can be viewed at http://it.sunshine.edu/policy.html.
Create a three- to four-line summary of the policy suitable for warning users of the terms of the policy at system login and save it as /etc/motd. Be sure to include the URL of the policy so that users can read the full text.
/etc/motd is the “Message of the Day” file. It is a text file which is printed out to users when they login to relay important messages such as acceptable use guidelines.
Open a new terminal window and use SSH to view the message of the day:
[root@sunshine ~]# ssh [email protected]
Deliverables
Aaron Swartz was a highly respected and extremely skilled computer programmer. At the age of 14, he was one of the creators of the popular RSS standard, which allows users to follow content from websites. Aaron also had an intense desire to free information from behind paywalls so users could access it at no cost. Unfortunately, the intensity of this desire led him to surreptitiously download almost all the contents of the JSTOR digital library of academic articles using MIT's network. Almost 4.8 million articles were downloaded in spite of repeated attempts to stop him.
This incident led to a criminal prosecution under the Computer Fraud and Abuse Act (CFAA), and apparently a demand from prosecutors that he go to prison as part of any plea deal. Aaron, long suffering from depression, was unable to deal with the stress and was found dead in his apartment on January 12, 2013. The death was considered a suicide.
Aaron's death generated intense pressure to revise the Computer Fraud and Abuse Act. Aaron's supporters were aghast that a genius such as Aaron would be pushed to his death merely for violating acceptable use policies and other contractual agreements. Rep. Zoe Lofgren of San Jose drafted a bill called “Aaron's Law” that would prohibit prosecution under the CFAA for violating acceptable use policies or other contractual obligations if such violation was the sole basis for determining that unauthorized access had occurred.
Other experts argue that Aaron's crime was not merely a violation of acceptable use policies. Aaron stealthily overcame the attempts by JSTOR and MIT to stop the contentious downloads, arguably making him guilty of false representation.
(There are many articles about Aaron Swartz on the Internet. The following are the primary sources for this case.)
Schwartz, J. “Internet activist, a creator of RSS, is dead at 26, apparently a suicide,” New York Times, 01/12/2013
Sellars, A. “The impact of ‘Aaron's Law’ on Aaron Swartz's case,” 01/18/2013, http://www.dmlp.org/blog/2013/impact-aarons-law-aaron-swartzs-case (accessed 07/16/2013)
Healey, J. “One bit of Aaron Swartz's legacy: Fixing a bad law?” Los Angeles Times, 01/16/2013
Computer Fraud and Abuse Act, http://www.law.cornell.edu/uscode/text/18/1030 (accessed 07/16/2013)
During a meeting involving state auditors and university officials you notice that many Sunshine University Deans and Directors make extensive use of tablets and smartphones. Not only that, but they also use applications such as Google Drive and Dropbox to move documents from one device to another seamlessly. You bring this concern to the Dean of the College of Business, who is sitting next to you in this meeting.
Write a policy on the use of cloud-based personal storage space to store university institutional data. The policy must have an overview, scope, definitions, statement of policy, and enforcement. At a minimum, consider and address the following:
In addition to the policy, research and outline the Terms of Service for two personal cloud-storage services, indicating any possible problems these terms of services may bring to the university and to the user, including limited liability, uptime restrictions, etc.
1COBIT 5 Glossary, http://www.isaca.org/Knowledge-Center/Documents/Glossary/glossary.pdf
2SSN Appropriate Use Policy, University of South Florida, http://generalcounsel.usf.edu/policies-and-procedures/pdfs/policy-0-516.pdf
3City of Tampa Internet Policies, http://www.tampagov.net/about_us/tampagov/internet_policies/security_policy.asp
4AT&T Acceptable Use Policy, http://www.corp.att.com/aup/
5MIT DHCP Usage Logs, http://ist.mit.edu/about/policies/dhcp-usage-logs
6UA Security Policy, http://security.arizona.edu/is100
7Workstation Security Policy, Emory College, https://wiki.as.emory.edu/display/ecitprocedures/Workstation+Security+Policy
8Incident Management, Kansas State, http://www.k-state.edu/its/security/procedures/incidentmgt.html
9HIPAA Policy, Georgetown University, http://policies.georgetown.edu/hipaa/sections/security/62953.html
10Information Security Policy, Marist College, http://security.marist.edu/policy.pdf
11Information Security Policy, CMU, http://www.cmu.edu/iso/governance/policies/information-security.html
12USDA Security Policy, http://www.ocio.usda.gov/sites/default/files/docs/2012/DR3140-001.htm
13Policy Development Framework, University of Michigan, http://cio.umich.edu/policy/framework.php
14Formulation and Issuance of University Policies, Cornell University, http://www.dfa.cornell.edu/cms/treasurer/policyoffice/policies/volumes/governance/upload/vol4_1.pdf
15HEOA of 2008, EDUCAUSE, http://www.educause.edu/library/higher-education-opportunity-act-heoa
16Record Retention Scheduling and Disposition, http://dlis.dos.state.fl.us/recordsmgmt/scheduling.cfm
17Understanding Health Information Privacy, http://www.hhs.gov/ocr/privacy/hipaa/understanding/index.html
18US Department of Education website, http://www2.ed.gov/policy/gen/guid/fpco/ferpa/index.html
19UC Berkeley, Export Controls, http://www.spo.berkeley.edu/policy/exportcontrol.html
20Stephen Colbert had his take on the incident, “Corporate hacker tries to take down Wikileaks,” http://www.colbertnation.com/the-colbert-report-videos/375428/february-24-2011/corporate-hacker-tries-to-take-down-wikileaks (accessed 07/23/2013). Warning: NSFW (not safe for work)