Privacy Act of 1974: This act requires an individual's consent before a
federal agency may disclose the records it holds about that individual.
Fourth Amendment: This amendment to the Constitution states that
the right “to be secure in their persons, houses, papers, and effects,
against unreasonable searches and seizures, shall not be violated,
and no warrants shall issue, but upon probable cause, supported
by Oath or affirmation, and particularly describing the place to be
searched and the persons or things to be seized.”
Health Insurance Portability and Accountability Act (HIPAA) of
1996: This act specifies safeguards (administrative, physical, and
technical) to ensure the confidentiality, integrity, and availability of
electronic health information.
3.13 Operations Security
Along with the other topics in this chapter, operations security also deals
with a range of activities to minimize the risk of threats and vulnerabilities
to the critical information of an organization. Operations security primarily
entails identifying the information that needs protection and determining
the most effective way to protect the information.
Understanding the key security operations concepts is a priority. An exam-
ple is implementing access control mechanisms (discussed in Section 3.4)
such as the concept of “need to know” where an individual only has access if
the information is needed to perform his or her job. Another way to minimize
risk is to implement “separation of duties,” where no individual has control
over an entire process, thus reducing the possibility of fraud. For example,
changing a rewall rule should be performed by two people. The banking
industry uses this concept in almost all of its processes; for example, no one
person has the full combination to the bank vault. Thus, to open the vault you
need two people, each knowing one-half of the combination. Other security
operations concepts are job rotation (a person being rotated out of a job is
less likely to attempt illegal activity) and monitoring the special privileges
granted to system administrators (database, application, network), as such
administrators have the opportunity to corrupt data with privileged access.
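The bank-vault example of separation of duties can be sketched in a few lines of Python (the combination and the two-way split are hypothetical, purely for illustration):

```python
# Toy illustration of separation of duties: no single person holds the
# full bank-vault combination, so opening the vault requires two people.
VAULT_COMBINATION = "18-24-33-07"  # hypothetical secret

# Each officer knows only one-half of the combination.
first_half = VAULT_COMBINATION[: len(VAULT_COMBINATION) // 2]
second_half = VAULT_COMBINATION[len(VAULT_COMBINATION) // 2 :]

def open_vault(half_a: str, half_b: str) -> bool:
    """The vault opens only when both halves are supplied together."""
    return half_a + half_b == VAULT_COMBINATION

print(open_vault(first_half, ""))           # one person alone: False
print(open_vault(first_half, second_half))  # two people together: True
```

Neither officer alone can reconstruct the secret, which is the point of the control: fraud now requires collusion rather than a single insider.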
Resource protection and incident response should also be considered as
part of the operations security plan. The resources are physical entities such
as hardware (e.g., equipment life cycle, cabling, and wireless networks) and
software (e.g., licensing, access control, and source code protection). Incident
response refers to a plan in place in the event that interruption of normal
operations occurs. This plan would include how to detect and determine the
type of incident, as well as a response, reporting, and recovery strategy. This
will be covered in detail in an upcoming chapter.
Additional areas to consider are preventive measures against malicious
attacks (e.g., DoS, theft, social engineering), patch and vulnerability man-
agement, configuration and change management (i.e., tracking all versions
of and changes to processes, hardware, software, etc.), and the fault tolerance
of system components.
3.14 Information Security Governance and Risk Management
Information security governance is the organizational structure needed to
implement a successful information security program. To apply security
governance (e.g., processes, policies, roles, compliance, etc.), one must first
understand the organization's goals, mission, and objectives. Once these are
understood, one can identify the assets of the organization and implement
an effective risk management plan, using tools to assess and mitigate threats
to, and vulnerabilities of, those assets.
Risk assessment can be accomplished quantitatively, qualitatively, or
both. A quantitative assessment puts a dollar amount on a risk. For
example, if 1,000 records of patient data were exposed and it costs $30 to
contact a patient, change his or her account number, and print a new health
card, then the loss associated with this risk is $30,000 (Sims 2012). A quali-
tative assessment identifies characteristics of an asset or activity (Gregory 2010):
Vulnerabilities: Unintended weaknesses in a product
Threats: Activities that would exploit a vulnerability
Threat probability: The likelihood that a threat will occur (low,
medium, high, or a numeric scale)
Countermeasures: Tools that reduce the risk associated with a threat
or vulnerability
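The quantitative example above is a single-loss calculation; it can be sketched in Python (the function name is my own, with the record count and per-record cost taken from the patient-data example):

```python
# Quantitative risk assessment: put a dollar amount on a potential loss.
# Figures follow the patient-record example: 1,000 exposed records at
# $30 each to contact the patient, change the account number, and
# print a new health card.

def single_loss_expectancy(records_exposed: int, cost_per_record: float) -> float:
    """Dollar loss if the risk is realized once."""
    return records_exposed * cost_per_record

loss = single_loss_expectancy(1_000, 30.00)
print(f"Estimated loss: ${loss:,.2f}")  # Estimated loss: $30,000.00
```

Such a figure lets management weigh the cost of a countermeasure directly against the loss it is meant to prevent.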
Other considerations include managing personnel security and developing
security training and awareness. Secure hiring practices, such as performing
reference checks, verifying education, and using employment agreements
and policies between the employer and employee (e.g., nondisclosure
agreements), should be in place. In addition, once the person is hired, there should be
security education, training, and awareness to mitigate risks. Training will
be detailed in the “Preparing for an Incident” chapter.
In addition to software patches and fixes to protect against security vulnerabilities, sound
judgment and caution are needed (Microsoft 2012). Here are the 10 immutable laws of
security according to Microsoft:
Law #1: If a bad guy can persuade you to run his program on your computer, it’s not your
computer anymore.
Law #2: If a bad guy can alter the operating system on your computer, it’s not your computer
anymore.
Law #3: If a bad guy has unrestricted physical access to your computer, it’s not your computer
anymore.
Law #4: If you allow a bad guy to upload programs to your website, it’s not your website
anymore.
Law #5: Weak passwords trump strong security.
Law #6: A computer is only as secure as the administrator is trustworthy.
Law #7: Encrypted data are only as secure as the decryption key.
Law #8: An out-of-date virus scanner is only marginally better than no virus scanner at all.
Law #9: Absolute anonymity isn’t practical, in real life or on the web.
Law #10: Technology is not a panacea.
References
Bowen, P., Hash, J., and Wilson, M. 2006. Information security handbook: A guide for
managers. NIST special publication 800-100.
Conrad, E. 2011. CISSP study guide. Waltham, MA: Syngress.
Dey, M. 2011. Business continuity planning (BCP) methodology—Essential
for every business. IEEE GCC Conference and Exhibition, February 19–22,
pp.229–232.
Evans, D., Bond, P., and Bement, A. 2004. Standards for security categorization of fed-
eral information and information systems. FIPS PUB 199, February 2004.
Felten, E. 2008. What’s the cyber in cyber-security? Freedom to Tinker, July 24, 2008,
https://freedom-to-tinker.com/blog/felten/whats-cyber-cyber-security/
(retrieved August 12, 2012).
Gregory, P. 2010. CISSP guide to security essentials. Boston: Course Technology, Cengage
Learning.
Jaeger, T. 2008. Operating systems security. San Rafael, CA: Morgan & Claypool.
Khan, M., and Zulkernine, M. 2008. Quantifying security in secure software devel-
opment phases. Annual IEEE International Computer Software Applications
Conference.
King, R. 2009. Lessons from the data breach at Heartland. Bloomberg Business Week,
July 6, 2009, http://www.businessweek.com/stories/2009-07-06/lessons-from-
the-data-breach-at-heartlandbusinessweek-business-news-stock-market-and-
financial-advice (retrieved August 9, 2012).
Locke, G., and Gallagher, P. 2009. Recommended security controls for federal
information systems and organizations. NIST special publication 800-53.
Martin, B., Brown, M., Paller, A., and Kirby, D. 2011. 2011 CWE/SANS top 25 most
dangerous software errors. The MITRE Corporation.
Microsoft. 2012. 10 Immutable laws of security. http://technet.microsoft.com/
library/cc722487.aspx (retrieved September 8, 2012).
Payne, J. 2010. Integrating application security into software development. IT
Professional 12 (2): 6–9.
Sims, S. 2012. Qualitative vs. quantitative risk assessment. SANS Institute, http://
www.sans.edu/research/leadership-laboratory/article/risk-assessment
(retrieved December 27, 2012).
Swanson, M., and Guttman, B. 1996. Generally accepted principles and practices for
securing information technology systems. NIST special publication 800-14.
Swanson, M., Hash, J., and Bowen, P. 2006. Guide for development of security plans
for federal information systems. NIST special publication 800-18.
Talukder, A. K., Maurya, V. K., Santosh, B. J., Jangam, E., Muni, S. V., Jevitha, K. P.,
Saurabh, S., Pais, A. R., and Pais, A. 2009. Security-aware software development
life cycle (SaSDLC)—Processes and tools. IFIP International Conference on
Wireless and Optical Communications Networks. WOCN ‘09, pp. 1–5.
Theoharidou, M., and Gritzalis, D. 2007. Common body of knowledge for informa-
tion security. IEEE Security and Privacy 5 (2): 64–67.
Title 44 United States Code Section 3542. US Government Printing Office. http://
www.gpo.gov/fdsys/pkg/USCODE-2009-title44/pdf/USCODE-2009-title44-
chap35-subchapIII-sec3542.pdf (retrieved August 15, 2012).
Tracy, M., Jansen, W., Scarfone, K., and Butterfield, J. 2007. Guidelines on electronic
mail security. NIST special publication 800-45.
Vijayan, J. 2010. Update: Heartland breach shows why compliance is not enough.
Computerworld, http://www.computerworld.com/s/article/9143158/Update_
Heartland_breach_shows_why_compliance_is_not_enough (retrieved August 9,
2012).
Whitman, M., and Mattord, H. 2012. Principles of information security. Stamford, CT:
Cengage Learning, Course Technology.