Chapter 6. Threats and Vulnerabilities Associated with Operating in the Cloud

This chapter covers the following topics related to Objective 1.6 (Explain the threats and vulnerabilities associated with operating in the cloud) of the CompTIA Cybersecurity Analyst (CySA+) CS0-002 certification exam:

Cloud service models: Describes Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS)

Cloud deployment models: Covers public, private, community, and hybrid clouds

Function as a Service (FaaS)/serverless architecture: Discusses the concepts of FaaS

Infrastructure as code (IaC): Investigates the use of scripting in the environment

Insecure application programming interface (API): Identifies vulnerabilities in the use of APIs

Improper key management: Discusses best practices for key management

Unprotected storage: Describes threats to storage systems

Logging and monitoring: Covers issues related to insufficient logging and monitoring and inability to access logging tools

Placing resources in a cloud environment has many benefits, but also introduces a host of new security considerations. This chapter discusses these vulnerabilities and some measures that you can take to mitigate them.

“Do I Know This Already?” Quiz

The “Do I Know This Already?” quiz enables you to assess whether you should read the entire chapter. If you miss no more than one of these eight self-assessment questions, you might want to skip ahead to the “Exam Preparation Tasks” section. Table 6-1 lists the major headings in this chapter and the “Do I Know This Already?” quiz questions covering the material in those headings so that you can assess your knowledge of these specific areas. The answers to the “Do I Know This Already?” quiz appear in Appendix A.

Table 6-1 “Do I Know This Already?” Foundation Topics Section-to-Question Mapping


Caution

The goal of self-assessment is to gauge your mastery of the topics in this chapter. If you do not know the answer to a question or are only partially sure of the answer, you should mark that question as wrong for purposes of the self-assessment. Giving yourself credit for an answer you correctly guess skews your self-assessment results and might provide you with a false sense of security.


1. In which cloud deployment model does an organization provide and manage some resources in-house and have others provided externally via a public cloud?

a. Private

b. Public

c. Community

d. Hybrid

2. Which of the following cloud service models is typically used as a software development environment?

a. SaaS

b. PaaS

c. IaaS

d. FaaS

3. Which of the following is an extension of the PaaS model?

a. FaaS

b. IaC

c. SaaS

d. IaaS

4. Which of the following manages and provisions computer data centers through machine-readable definition files?

a. IaC

b. PaaS

c. SaaS

d. IaaS

5. Which of the following can enhance security of APIs?

a. DPAPI

b. SGX

c. SOAP

d. REST

6. Which of the following contains recommendations for key management?

a. NIST SP 800-57 Rev. 5

b. PCI-DSS

c. OWASP

d. FIPS

7. Which of the following is the most exposed part of a cloud deployment?

a. Cryptographic functions

b. APIs

c. VMs

d. Containers

8. Which of the following is lost with improper auditing? (Choose the best answer.)

a. Cryptographic security

b. Accountability

c. Data security

d. Visibility

Foundation Topics

Cloud Deployment Models

Cloud computing is all the rage these days, and it comes in many forms. The basic idea of cloud computing is to make resources available in a web-based data center so the resources can be accessed from anywhere. When a company pays another company to host and manage this type of environment, it is considered to be a public cloud solution. If the company hosts this environment itself, it is considered to be a private cloud solution. The different cloud deployment models are as follows:


Public: A public cloud is the standard cloud deployment model, in which a service provider makes resources available to the public over the Internet. Public cloud services may be free or may be offered on a pay-per-use model. An organization needs to have a business or technical liaison responsible for managing the vendor relationship but does not necessarily need a specialist in cloud deployment. Vendors of public cloud solutions include Amazon, IBM, Google, Microsoft, and many more. In a public cloud deployment model, subscribers can add and remove resources as needed, based on their subscription.

Private: A private cloud is a cloud deployment model in which a private organization implements a cloud in its internal enterprise, and that cloud is used by the organization’s employees and partners. Private cloud services require an organization to employ a specialist in cloud deployment to manage the private cloud.

Community: A community cloud is a cloud deployment model in which the cloud infrastructure is shared among several organizations from a specific group with common computing needs. In this model, agreements should explicitly define the security controls that will be in place to protect the data of each organization involved in the community cloud and how the cloud will be administered and managed.

Hybrid: A hybrid cloud is a cloud deployment model in which an organization provides and manages some resources in-house and has others provided externally via a public cloud. This model requires a relationship with the service provider as well as an in-house cloud deployment specialist. Rules need to be defined to ensure that a hybrid cloud is deployed properly. Confidential and private information should be limited to the private cloud.

Cloud Service Models

There is a trade-off to consider when deciding between cloud architectures. A private solution provides the most control over the safety of your data but also requires the staff and the knowledge to deploy, manage, and secure the solution. A public cloud puts your data’s safety in the hands of a third party, but that party typically has deeper expertise in protecting data and managing cloud environments. With a public solution, various cloud service models can be purchased. Some of these models include the following:


Software as a Service (SaaS): With SaaS, the vendor provides the entire solution, including the operating system, the infrastructure software, and the application. The vendor may provide an email system, for example, in which it hosts and manages everything for the customer. An example of this is a company that contracts to use Salesforce or Intuit QuickBooks using a browser rather than installing the application on every machine. This frees the customer company from performing updates and other maintenance of the applications.

Platform as a Service (PaaS): With PaaS, the vendor provides the hardware platform or data center and the software running on the platform, including the operating systems and infrastructure software. The customer is still involved in managing the system. An example of this is a company that engages a third party to provide a development platform for internal developers to use for development and testing.

Infrastructure as a Service (IaaS): With IaaS, the vendor provides the hardware platform or data center, and the customer installs and manages its own operating systems and application systems. The vendor simply provides access to the data center and maintains that access. An example of this is a company hosting all its web servers with a third party that provides the infrastructure. With IaaS, customers can benefit from the dynamic allocation of additional resources in times of high activity, while those same resources are scaled back when not needed, which saves money.

Figure 6-1 illustrates the relationship of these services to one another.


Figure 6-1 Cloud Service Models

Function as a Service (FaaS)/Serverless Architecture

Function as a Service (FaaS) is an extension of PaaS that goes further and completely abstracts the virtual server from the developers. In fact, charges are based not on server instance sizes but on consumption and executions. This is why it is sometimes also called serverless architecture. In this architecture, the focus is on a function, operation, or piece of code that is executed as a function. These services are event-driven in nature.

Although FaaS is not perfect for every workload, for transactions that happen hundreds of times per second, there is a lot of value in isolating that logic to a function that can be scaled. Additional advantages include the following:


Ideal for dynamic or burstable workloads: If you run something only once a day or month, there’s no need to pay for a server 24/7/365.

Ideal for scheduled tasks: FaaS is a perfect way to run a certain piece of code on a schedule.
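To make the event-driven model concrete, Example 6-1 is a minimal sketch of a FaaS handler written in the style of an AWS Lambda Python function. The event shape (an S3-style object-upload record) and the thumbnail task are illustrative assumptions rather than anything this chapter prescribes.

Example 6-1 An Event-Driven FaaS Handler (Sketch)

import json

def handler(event, context):
    # The platform invokes this function once per event; you pay per
    # execution rather than for an always-on server.
    for record in event.get("Records", []):
        key = record["s3"]["object"]["key"]  # e.g., an uploaded image
        print(f"Would generate a thumbnail for {key}")
    return {"statusCode": 200, "body": json.dumps({"processed": True})}

Because the provider scales instances of the function with the event rate, the same code serves one invocation per month or hundreds per second.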

Figure 6-2 shows a useful car analogy for comparing traditional computing (own a car), cloud computing (rent a car), and FaaS/serverless computing (car sharing). VPS in the rent-a-car analogy stands for virtual private server and refers to provisioning a virtual server from a cloud service provider.


Figure 6-2 Car Analogy for Serverless Computing


The following are top security issues with serverless computing:

• Function event data injection: Injection can be triggered through many kinds of event sources, not only untrusted input such as a web API call

• Broken authentication: Coding issues ripe for exploits and attacks, which can lead to unauthorized access

• Insecure serverless deployment configuration: Human error in setup

• Over-privileged function permissions and roles: Failure to implement the least privilege concept

Infrastructure as Code (IaC)

In another rethinking of how data centers are managed, Infrastructure as Code (IaC) manages and provisions computer data centers through machine-readable definition files rather than through physical hardware configuration or interactive configuration tools. IaC can use either scripts or declarative definitions in place of manual processes, but the term is more often associated with declarative approaches.

Naturally, there are advantages to this approach:


• Lower cost

• Faster speed

• Risk reduction (remove errors and security violations)

Figure 6-3 illustrates how code can change the actual state of the configurations in the cloud on its own, without manual intervention.


Figure 6-3 IaC in Action
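To make the declarative approach concrete, Example 6-2 is a minimal sketch, in plain Python, of the reconciliation loop at the heart of IaC tooling: compare a machine-readable desired state against the actual state and apply only the differences. The resource names and the provision/deprovision steps are hypothetical stand-ins for the calls a real tool such as Terraform makes against a cloud provider’s API.

Example 6-2 A Declarative Reconciliation Loop (Sketch)

desired_state = {"web-server": {"size": "small"}, "database": {"size": "large"}}
actual_state = {"web-server": {"size": "small"}}

def reconcile(desired: dict, actual: dict) -> None:
    for name, config in desired.items():
        if actual.get(name) != config:
            print(f"Provisioning {name} with {config}")  # create or update
            actual[name] = config
    for name in list(actual):
        if name not in desired:
            print(f"Deprovisioning {name}")  # remove configuration drift
            del actual[name]

reconcile(desired_state, actual_state)  # brings actual in line with desired

Because the definition files describe an end state rather than a sequence of manual steps, they can be version controlled, reviewed, and reapplied, which is where the cost, speed, and risk-reduction benefits come from.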


Security issues with Infrastructure as Code (IaC) include

• Compliance violations: Policy guardrails based on standards are not enforced

• Data exposures: Lack of encryption

• Hardcoded secrets: Storing plaintext credentials, such as SSH keys or account secrets, within source code (see Example 6-3 after this list)

• Disabled audit logs: Failure to utilize audit logging services like AWS CloudTrail and Amazon CloudWatch

• Untrusted image sources: Templates may inadvertently refer to OS or container images from untrusted sources
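Returning to the hardcoded secrets issue, one common mitigation is to scan template files before they are committed. Example 6-3 is a minimal sketch of such a check; it assumes Terraform-style .tf files, and the two patterns (the AWS access key ID format and a generic password assignment) are illustrative, not exhaustive.

Example 6-3 Scanning IaC Files for Hardcoded Secrets (Sketch)

import re
from pathlib import Path

SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # AWS access key ID format
    re.compile(r"(?i)(password|secret)\s*=\s*['\"][^'\"]+['\"]"),
]

def scan(path: str) -> list[tuple[str, int]]:
    """Return (file, line number) pairs where a pattern matches."""
    findings = []
    for file in Path(path).rglob("*.tf"):
        for lineno, line in enumerate(file.read_text().splitlines(), 1):
            if any(p.search(line) for p in SECRET_PATTERNS):
                findings.append((str(file), lineno))
    return findings

for file, lineno in scan("."):
    print(f"Possible hardcoded secret: {file}:{lineno}")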

Insecure Application Programming Interface (API)

Interfaces and APIs tend to be the most exposed parts of a system because they’re usually accessible from the open Internet. APIs are used extensively in cloud environments. With respect to APIs, a host of approaches, including Simple Object Access Protocol (SOAP), Representational State Transfer (REST), and JavaScript Object Notation (JSON), are available, and many enterprises find themselves using all of them.

The use of diverse protocols and APIs is also a challenge to interoperability. With networking, storage, and authentication protocols, support and understanding of the protocols in use is required of both endpoints. It should be a goal to reduce the number of protocols in use in order to reduce the attack surface. Each protocol has its own history of weaknesses to mitigate.

One API that can enhance cloud security is the Data Protection API (DPAPI) offered by Windows. Let’s look at what it offers. Among other features, DPAPI supports in-memory processing, an approach in which all data in a set is processed from memory rather than from the hard drive. In-memory processing assumes that all the data is available in memory rather than just the most recently used data, as is usually the case when using RAM or cache memory. This results in faster reporting and decision making in business. Securing in-memory processing requires encrypting the data in RAM. DPAPI lets you encrypt data using the user’s login credentials. One of the key questions is where to store the key, because storing it in the same location as the data typically is not a good idea (the next section discusses key management). Intel’s Software Guard Extensions (SGX), shipping with Skylake and newer CPUs, allows you to load a program into your processor, verify that its state is correct (remotely), and protect its execution. The CPU automatically encrypts everything leaving the processor (that is, everything that is offloaded to RAM) and thereby ensures security.
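As a concrete illustration of the DPAPI behavior described above, Example 6-4 is a minimal sketch that assumes the third-party pywin32 package on a Windows system. The protected blob is tied to the current user’s logon credentials, so another account (or an attacker who copies the blob off the machine) cannot decrypt it.

Example 6-4 Protecting Data with DPAPI via pywin32 (Sketch)

import win32crypt  # requires: pip install pywin32 (Windows only)

secret = b"api-key-material"

# Encrypt under the current user's logon credentials (user scope).
blob = win32crypt.CryptProtectData(secret, "example key", None, None, None, 0)

# Decrypt; only the same Windows user account can unprotect the blob.
description, recovered = win32crypt.CryptUnprotectData(blob, None, None, None, 0)
assert recovered == secret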

Even the most secure devices have some sort of API that is used to perform tasks. Unfortunately, untrustworthy people use those same APIs to perform unscrupulous tasks. APIs are used in the Internet of Things (IoT) so that devices can speak to each other without users even knowing they are there. APIs are used to control and monitor things we use every day, including fitness bands, home thermostats, lighting, and automobiles. Comprehensive security must protect the entire spectrum of devices in the digital workplace, including apps and APIs. API security is critical for an organization that is exposing digital assets.

Guidelines for providing API security include the following:


• Use the same security controls for APIs as for any web application in the enterprise.

• Use Hash-based Message Authentication Code (HMAC) to authenticate requests (see Example 6-5 after this list).

• Use encryption when passing static keys.

• Use a framework or an existing library to implement security solutions for APIs.

• Implement password encryption instead of single key-based authentication.
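Example 6-5 sketches the HMAC recommendation from the list above, using only the Python standard library. Signing the method, path, timestamp, and body lets the server verify both the sender and the integrity of the request; the header names are illustrative assumptions.

Example 6-5 Signing and Verifying API Requests with HMAC (Sketch)

import hashlib
import hmac
import time

def sign_request(secret_key: bytes, method: str, path: str, body: bytes) -> dict:
    """Client side: return headers carrying an HMAC-SHA256 signature."""
    timestamp = str(int(time.time()))
    message = b"\n".join([method.encode(), path.encode(), timestamp.encode(), body])
    signature = hmac.new(secret_key, message, hashlib.sha256).hexdigest()
    return {"X-Timestamp": timestamp, "X-Signature": signature}

def verify_request(secret_key: bytes, method: str, path: str, body: bytes,
                   headers: dict, max_skew: int = 300) -> bool:
    """Server side: recompute the HMAC and compare in constant time."""
    if abs(int(time.time()) - int(headers["X-Timestamp"])) > max_skew:
        return False  # reject stale requests to limit replay attacks
    message = b"\n".join([method.encode(), path.encode(),
                          headers["X-Timestamp"].encode(), body])
    expected = hmac.new(secret_key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, headers["X-Signature"])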

Improper Key Management

Key management is essential to ensure that the cryptography provides confidentiality, integrity, and authentication in cloud environments. If a key is compromised, it can have serious consequences throughout an organization.

Key management involves the entire process of ensuring that keys are protected during creation, distribution, transmission, and storage. As part of this process, keys must also be destroyed properly. When you consider the vast number of networks over which the key is transmitted and the different types of systems on which a key is stored, the enormity of this issue really comes to light.

Because key management is the most demanding and critical aspect of cryptography, it is important that security professionals understand its principles.

Keys should always be stored in ciphertext when stored on a noncryptographic device. Key distribution, storage, and maintenance should be automated by integrating these processes into the application.

Because keys can be lost, backup copies should be made and stored in a secure location. A designated individual should have control of the backup copies, and other individuals should be designated to serve as emergency backups. The key recovery process should also require more than one operator, to ensure that only valid key recovery requests are completed. In some cases, keys are even broken into parts and deposited with trusted agents, who provide their part of the key to a central authority when authorized to do so. Although other methods of distributing parts of a key are used, all the solutions involve trusted agents entrusted with part of the key and a central authority tasked with assembling the key from its parts. Also, key recovery personnel should span the entire organization and not just be members of the IT department.

Organizations should also limit the number of keys that are used. The more keys that you have, the more keys you must ensure are protected. Although a valid reason for issuing a key should never be ignored, limiting the number of keys issued and used reduces the potential damage.

When designing the key management process, you should consider how to do the following (Example 6-6, after the list, illustrates generating random keys of sufficient length):


• Securely store and transmit the keys

• Use random keys

• Issue keys of sufficient length to ensure protection

• Properly destroy keys when no longer needed

• Back up the keys to ensure that they can be recovered
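The “random keys” and “sufficient length” items lend themselves to a one-line illustration. Example 6-6 uses Python’s secrets module, which draws from the operating system’s cryptographically secure random number generator; the 32-byte (256-bit) length is a common choice for symmetric keys.

Example 6-6 Generating a Random Key of Sufficient Length (Sketch)

import secrets

key = secrets.token_bytes(32)  # 256 bits from the OS CSPRNG
print(key.hex())               # hex form for storage or transmission

Never derive keys from predictable sources such as timestamps or the random module, which is not cryptographically secure.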

Systems that process valuable information require controls in order to protect the information from unauthorized disclosure and modification. Cryptographic systems that contain keys and other cryptographic information are especially critical. Security professionals should work to ensure that the protection of keying material provides accountability, audit, and survivability.

Accountability involves the identification of entities that have access to, or control of, cryptographic keys throughout their life cycles. Accountability can be an effective tool to help prevent key compromises and to reduce the impact of compromises when they are detected. Although it is preferred that no humans be able to view keys, as a minimum, the key management system should account for all individuals who are able to view plaintext cryptographic keys. In addition, more sophisticated key management systems may account for all individuals authorized to access or control any cryptographic keys, whether in plaintext or ciphertext form.

Two types of audits should be performed on key management systems:


Security: The security plan and the procedures that are developed to support the plan should be periodically audited to ensure that they continue to support the key management policy.

Protective: The protective mechanisms employed should be periodically reassessed with respect to the level of security they currently provide and are expected to provide in the future. They should also be assessed to determine whether the mechanisms correctly and effectively support the appropriate policies. New technology developments and attacks should be considered as part of a protective audit.

Key management survivability entails backing up or archiving copies of all keys used. Key backup and recovery procedures must be established to ensure that keys are not lost. System redundancy and contingency planning should also be properly assessed to ensure that all the systems involved in key management are fault tolerant.

Key Escrow

Key escrow is the process of storing keys with a third party to ensure that decryption can occur. This is most often used to collect evidence during investigations. Key recovery is the process whereby a key is archived in a safe place by the administrator.

Key Stretching

Key stretching, also referred to as key strengthening, is a cryptographic technique that involves making a weak key stronger by increasing the time it takes to test each possible key. In key stretching, the original key is fed into an algorithm to produce an enhanced key, which should be at least 128 bits for effectiveness. If key stretching is used, an attacker would need to either try every possible combination of the enhanced key or try likely combinations of the initial key. Key stretching slows down the attacker because the attacker must compute the stretching function for every guess in the attack. Systems that use key stretching include Pretty Good Privacy (PGP), GNU Privacy Guard (GPG), Wi-Fi Protected Access (WPA), and WPA2. Widely used password key-stretching algorithms include Password-Based Key Derivation Function 2 (PBKDF2), bcrypt, and scrypt.
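Example 6-7 shows key stretching with PBKDF2 from Python’s standard library. The salt is random per password, and the iteration count is what slows each guess; the count shown is an illustrative value in line with current public guidance, not a value from this chapter.

Example 6-7 Key Stretching with PBKDF2 (Sketch)

import hashlib
import os

password = b"correct horse battery staple"
salt = os.urandom(16)  # unique random salt per password

# Each verification (and each attacker guess) must repeat all iterations,
# which is what makes the stretched key expensive to brute-force.
key = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000, dklen=32)
print(key.hex())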

Unprotected Storage

While cloud storage may seem like a great idea, it presents many unique issues. Among them are the following:


Data breaches: Although cloud providers may include safeguards in service-level agreements (SLAs), ultimately the organization is responsible for protecting its own data, regardless of where it is located. When this data is not in your hands—and you may not even know where it is physically located at any point in time—protecting your data is difficult.

Authentication system failures: These failures allow malicious individuals into the cloud. This issue sometimes is made worse by the organization itself when developers embed credentials and cryptographic keys in source code and leave them in public-facing repositories.

Weak interfaces and APIs: Interfaces and APIs tend to be the most exposed parts of a system because they’re usually accessible from the open Internet.

Transfer/Back Up Data to Uncontrolled Storage

In some cases, users store sensitive data in cloud storage that is outside the control of the organization, using sites such as Dropbox. These storage providers have had their share of data loss issues as well. Policies should address and forbid this type of data storage, including storage from mobile devices.

Cloud services give end users more accessibility to their data. However, this also means that end users can take advantage of cloud storage to access and share company data from any location. At that point, the IT team no longer controls the data. This is the case with both public and private clouds.


With private clouds, organizations can ensure the following:

• That the data is stored only on internal resources

• That the data is owned by the organization

• That only authorized individuals are allowed to access the data

• That data is always available

However, a private cloud is protected only by the organization’s internal resources, and that protection is often only as strong as the knowledge of the security professionals responsible for managing the cloud’s security.

With public clouds, organizations can ensure the following:

• That data is protected by enterprise-class firewalls and within a secured facility

• That attackers and disgruntled employees are unsure of where the data actually resides

• That the cloud vendor provides security expertise and maintains the level of service detailed in the contract

However, public clouds can grant access to any location, and data is transmitted over the Internet. Also, the organization depends on the vendor for all services provided. End users must be educated about cloud usage and limitations as part of their security awareness training. In addition, security policies should clearly state where data can be stored, and ACLs should be configured properly to ensure that only authorized personnel can access data. The policies should also spell out consequences for storing organizational data in cloud locations that are not authorized.

Big Data

Big data is a term for sets of data so large or complex that they cannot be analyzed by using traditional data processing applications. These data sets are often stored in the cloud to take advantage of the immense processing power available there. Specialized applications have been designed to help organizations with their big data. The big data challenges that may be encountered include data analysis, data capture, data search, data sharing, data storage, and data privacy.

While big data is used to determine the causes of failures, generate coupons at checkout, recalculate risk portfolios, and find fraudulent activity before it ever has a chance to affect the organization, its existence creates security issues. The first issue is its unstructured nature. Traditional data warehouses process structured data and can store large amounts of it, but there is still a requirement for structure.

Big data typically uses Hadoop, which requires no structure. Hadoop is an open source framework used for running applications and storing data. With the Hadoop Distributed File System, individual servers that are working in a cluster can fail without aborting the entire computation process. There are no restrictions on the data that this system can store. While big data is enticing because of the advantages it offers, it presents a number of issues when deployed in the cloud.


• Organizations still do not understand it very well, and unexpected vulnerabilities can easily be introduced.

• Big data deployments typically include open source code, which can result in unrecognized backdoors and may contain default credentials.

• Attack surfaces of the nodes may not have been reviewed, and servers may not have been hardened sufficiently.

Logging and Monitoring

Without proper auditing, you have no accountability. You also have no way of knowing what is going on in your environment. While the next two chapters include ample discussion of logging and monitoring and its application, this section briefly addresses the topic with respect to cloud environments.

Insufficient Logging and Monitoring

Unfortunately, although most technicians agree with and support the notion that proper auditing is necessary, in the case of cloud deployments, the logging and monitoring can leave much to be desired. “Insufficient Logging and Monitoring” is one of the categories in the Open Web Application Security Project’s (OWASP) Top 10 list and covers the list of best practices that should be in place to prevent or limit the damage of security breaches.

Security professionals should work to ensure that cloud SLAs include access to logging and monitoring tools that give the organization visibility into the cloud system in which its data is held.
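Where the SLA does grant such access, log review can be scripted. Example 6-8 is a hedged sketch using boto3, the AWS SDK for Python; it assumes AWS credentials with CloudTrail read access are already configured in the environment.

Example 6-8 Querying CloudTrail Audit Events with boto3 (Sketch)

import boto3

client = boto3.client("cloudtrail")

# Pull the ten most recent console logins recorded by CloudTrail.
response = client.lookup_events(
    LookupAttributes=[{"AttributeKey": "EventName",
                       "AttributeValue": "ConsoleLogin"}],
    MaxResults=10,
)
for event in response["Events"]:
    print(event["EventTime"], event.get("Username", "unknown"))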

Inability to Access

One of the issues with utilizing standard logging and monitoring tools in a cloud environment is the inability to access the environment in a way that provides visibility into it. In some cases, the vendor will resist allowing access to its environment. The time to demand such access is when the SLA is being negotiated.

Exam Preparation Tasks

As mentioned in the section “How to Use This Book” in the Introduction, you have several choices for exam preparation: the exercises here, Chapter 22, “Final Preparation,” and the exam simulation questions in the Pearson Test Prep Software Online.

Review All Key Topics

Review the most important topics in this chapter, noted with the Key Topics icon in the outer margin of the page. Table 6-2 lists a reference of these key topics and the page numbers on which each is found.


Table 6-2 Key Topics in Chapter 6


Define Key Terms

Define the following key terms from this chapter and check your answers in the glossary:

Software as a Service (SaaS)

Platform as a Service (PaaS)

Infrastructure as a Service (IaaS)

public cloud

private cloud

community cloud

hybrid cloud

Function as a Service (FaaS)

Infrastructure as Code (IaC)

Data Protection API (DPAPI)

NIST SP 800-57 Rev. 5

key escrow

key stretching

big data

Review Questions

1. With ______________, the vendor provides the entire solution, including the operating system, the infrastructure software, and the application.

2. Match the terms on the left with their definitions on the right.


3. List at least one advantage of IaC.

4. ___ ________________ tend to be the most exposed parts of a cloud system because they’re usually accessible from the open Internet.

5. APIs are used in the ___________________ so that devices can speak to each other without users even knowing the APIs are there.

6. List at least one of the security issues with serverless computing in the cloud.

7. Match the key state on the left with its definition on the right.


8. In the _______________ phase of a key, the keying material is not yet available for normal cryptographic operations.

9. List at least one security issue with cloud storage.

10. ________________ is a term for sets of data so large or complex that they cannot be analyzed by using traditional data processing applications.
