Chapter 30

Understanding the Key Aspects of Digital Forensics

This chapter covers the following topics related to Objective 4.5 (Explain the key aspects of digital forensics) of the CompTIA Security+ SY0-601 certification exam:

  • Documentation/evidence

    • Legal hold

    • Video

    • Admissibility

    • Chain of custody

  • Timelines of sequence of events

    • Time stamps

    • Time offset

  • Tags

  • Reports

  • Event logs

  • Interviews

  • Acquisition

    • Order of volatility

    • Disk

    • Random-access memory (RAM)

    • Swap/pagefile

    • OS

    • Device

    • Firmware

    • Snapshot

    • Cache

    • Network

    • Artifacts

  • On-premises vs. cloud

    • Right-to-audit clauses

    • Regulatory/jurisdiction

    • Data breach notification laws

  • Integrity

    • Hashing

    • Checksum

    • Provenance

  • Preservation

    • E-discovery

    • Data recovery

    • Non-repudiation

    • Strategic intelligence/counterintelligence

Digital forensic science is a branch of forensic science that is focused on the recovery and investigation of material found in digital devices, often in relation to cybercrime. Digital forensics is the process of identifying, preserving, analyzing, and documenting digital evidence so that it can be used to solve a crime and presented in a court of law. There are five steps to digital forensics:

  1. Identification

  2. Preservation

  3. Analysis

  4. Documentation

  5. Presentation

Digital forensics also has nine phases, which are covered in this chapter.

“Do I Know This Already?” Quiz

The “Do I Know This Already?” quiz enables you to assess whether you should read this entire chapter thoroughly or jump to the “Chapter Review Activities” section. If you are in doubt about your answers to these questions or your own assessment of your knowledge of the topics, read the entire chapter. Table 30-1 lists the major headings in this chapter and their corresponding “Do I Know This Already?” quiz questions. You can find the answers in Appendix A, “Answers to the ‘Do I Know This Already?’ Quizzes and Review Questions.”

Table 30-1 “Do I Know This Already?” Section-to-Question Mapping

Foundation Topics Section                     Questions
Documentation/evidence                        1–4
Acquisition                                   5–7
Integrity                                     8
On-premises vs. Cloud                         9
Preservation                                  10
E-discovery                                   11
Data Recovery                                 12
Nonrepudiation                                13
Strategic Intelligence/Counterintelligence    14

Caution

The goal of self-assessment is to gauge your mastery of the topics in this chapter. If you do not know the answer to a question or are only partially sure of the answer, you should mark that question as wrong for purposes of the self-assessment. Giving yourself credit for an answer you correctly guess skews your self-assessment results and might provide you with a false sense of security.

1. Which of the following is the first step when litigation has been filed or will soon be filed, with a focus on saving data in its current state?

  1. Data recovery

  2. Data collection

  3. Data destruction

  4. Data preservation

2. Which of the following is true regarding the correct format for forensic video evidence?

  1. Video settings should always match the video recorder; otherwise, the playback will be corrupted.

  2. Video formats and settings can vary.

  3. Videos are typically recorded at the same FPS and resolution.

  4. The frames per second (FPS) and resolution cannot be changed.

3. Chain of custody in forensic investigations requires investigators to perform which of the following?

  1. Tag some evidence so that it can be chronologically stored in a locked storage facility.

  2. Clearly and accurately document the collection process, methods, and path of evidence from collection to presentation.

  3. Store evidence in humidity-free lockers to ensure integrity and chain of custody.

  4. Take responsibility for all evidence collection after the site has been turned over from the analyst.

4. Which of following describes the use of a tag in forensic investigations?

  1. Tags provide a history identifying when items sold.

  2. Tags can contain date/time and investigator initials to help determine who collected items and when.

  3. Tags allow forensic investigators to sell the items later after the investigation has been completed.

  4. Tags help investigators determine whether the evidence was collected onsite or during the reverse-engineering process.

5. When you are collecting evidence, the order of volatility requires you to do what?

  1. Start evidence collection with the most volatile data/information.

  2. Start evidence collection after the volatile data has been removed from the site.

  3. Start with the least volatile evidence to make sure your evidence is intact when received at the office.

  4. Ensure that the most volatile data/information that is collected is handled by the police only.

6. Which artifacts can be collected during a forensic investigation of a suspected workstation that will help or contribute to the validity of the investigation?

  1. Data and evidence stored in appropriate containers

  2. Art, photos, and images that contain facts about the attack and that can be used to piece together a picture of what took place

  3. Log files collected from previous years, log file configuration settings for global policies, and SCCM data

  4. All logs, the registry, the RDP cache, and Windows Error Reporting (WER)

7. What volatile memory item contains more trusted information than disk, magnetic, or optical storage?

  1. ROM

  2. RAM

  3. RTU

  4. CPU

8. When dealing with a forensic investigation, you must take images of suspect hard drives. Which of the following best describes the process to ensure a hard drive hasn’t been tampered with after the collection/image has been completed?

  1. Checksums provide a thin piece of paper that acts as a seal over the evidence to ensure no one opens that checked evidence.

  2. Checksums provide checks and balances during an investigation to ensure investigators are working from the most important data to the least important.

  3. Checksums provide a check and sum for each bit of data collected, and when they are combined with a hash, you can track data.

  4. Checksums of a newly imaged drive/data can be created, so if the device/data is modified, the checksum will not match, letting you know it was modified.

9. Which statement regarding on-premises and cloud-based systems is true?

  1. Maintaining a chain of custody isn’t very challenging in a cloud environment versus a traditional forensics environment.

  2. On-premises and cloud-based are simply terms describing where data is stored.

  3. When a cyber incident happens, legal jurisdiction and the laws that govern the region do not normally present unique challenges.

  4. In traditional forensics environments, the external security team has control over who is conducting forensics operations on a machine.

10. What happens when digital evidence is not properly preserved and collected in accordance with forensic best practices?

  1. It becomes volatile.

  2. It is known as nonprovenant.

  3. It becomes an artifact.

  4. It is not admissible in court.

11. What should organizations have to perform to preserve information during e-discovery?

  1. Legal hold process

  2. Nonrepudiation

  3. Order of nonvolatility

  4. Counterintelligence

12. Which of the following is the extraction of data from damaged, deleted, or purposely destroyed evidence sources in a forensically sound manner?

  1. E-discovery

  2. Admissibility

  3. Data recovery

  4. Strategic intelligence

13. Which term best defines the assurance that someone cannot deny the validity of something—where a statement’s author cannot dispute its authorship?

  1. Snapshots

  2. Timestamps

  3. Checksums

  4. Nonrepudiation

14. Which of the following is information gathered and activities conducted to protect against espionage, other intelligence activities, or sabotage conducted by or on behalf of other elements?

  1. Counterintelligence

  2. Strategic intelligence

  3. Artificial intelligence

  4. Intellectual property (IP) intelligence

Foundation Topics

Documentation/Evidence

Digital forensics processes include documentation of evidence and reporting. In digital forensics, the goal is uncovering and interpreting electronic data. The process is to preserve any evidence in its original form while performing a structured investigation: collecting, identifying, validating, and clearly documenting the digital information to reconstruct past events. Careful and meticulous records of all the data needed to re-create the crime scene must be kept in order for a forensic case to be successfully presented and defended in court.

Legal Hold

A legal hold plan provides essential guidance for information governance and e-discovery strategy: it establishes when the duty to preserve begins, what it entails, how to implement it, and when it ends. A legal hold is a legal document used to communicate to a party or custodian that litigation is imminent or reasonably anticipated, that potentially relevant information must be preserved, and that electronically stored information may be relevant to pending litigation. Data preservation is the first step when litigation has been filed or will soon be filed, with a focus on preserving data in its current state, such as emails, SMS, MMS, and deleted messages (still on disk and not destroyed) on all devices, including cell phones, PCs, and other mobile devices. All organizations should have a corporate policy in place for dealing with third parties that gives them the ability to place a legal hold over data.

Video

Forensic video analysis is the scientific examination, comparison, and/or evaluation of video in legal matters. Forensic video expertise is essential with the proliferation of smartphones and security cameras. Today more incidents are being captured on video than ever before. The increase in the amount of video evidence cuts both ways: a major incident can overwhelm law enforcement agencies with the sheer amount of video that needs to be stored, processed, and reviewed.

Beyond the vast amount of video now available, digital video is much more complex, not only in the images but also in the data it contains. In most cases, if video has not undergone proper forensic analysis, the courts will not accept it at face value. There are a large number of factors to consider, from the method by which the video was recorded to the color calibration of the monitor used to view it. All of these factors can have a significant impact on what people see and how they interpret the video, and the way video and images are captured can likewise dramatically affect interpretation. Either can easily lead to incorrect conclusions about what really happened.

Many different camera makes and models are on the market, including dedicated digital photography cameras, action cameras, smartphones, and security system cameras. File formats can vary, as can settings for recording video, such as frames per second (FPS) and video resolution. These features all factor into how and what video information is stored. A forensic video analyst must understand all of these issues to prevent a misinterpretation of what a video appears to show, such as colors or the apparent speed of events unfolding in the video. In a criminal case, where a video could be central to proving someone’s innocence or guilt, these settings and other factors can be very important. With the rise of deepfakes and artificial intelligence (AI), it is more important than ever to follow proper investigative processes and procedures when handling video evidence.

Admissibility

Digital evidence is admissible if it establishes a fact of the matter asserted in the case, and it must remain unaltered throughout the digital forensics process. The results of the examination must be valid, reliable, and peer reviewed. Forensically sound methods and tools must be used to obtain the digital evidence, and the authenticity, integrity, and reliability of this evidence must be supported by the expert witnesses and digital forensic analysts who processed it.

Three standards of evidence must always be considered:

  • Is it sufficient (which is to say, convincing without question)?

  • Is it competent (which means it is legally qualified)?

  • Is it relevant (which means it must matter to the case at hand)?

To be admissible, the findings should be interpreted in an unbiased manner, and errors and uncertainties in the findings, as well as limitations in the interpretations of results, should be disclosed.

Whether evidence is admissible is determined by following three rules:

  • Best evidence means that courts prefer original evidence to copies to avoid alteration of evidence.

  • The exclusionary rule means that data collected in violation of the Fourth Amendment (no unreasonable searches or seizures) is not admissible.

  • Hearsay is second-hand evidence and is often not admissible, although some exceptions apply.

Chain of Custody

For evidence to be credible in court, you must follow strict rules. Following the proper chain of custody consistently and methodically has been challenging due to the dynamic nature of digital evidence. Cyber attacks and cybercrimes are evolving, and the often invisible nature of these attacks makes it difficult to gather evidence. Investigators are required to adapt to changing crime scenes and the digital media that has been used to commit these crimes. It is imperative that standard procedures be coherent and ensure consistency among the legal parties involved.

Note

The chain of custody provides a clear record of the path that evidence takes from acquisition to disposal. It is often required in court proceedings to prove that evidence has not been tampered with.

Timelines of Sequence of Events

The ultimate goal of timelines in digital forensics is to reconstruct what happened, presenting a narrative about an event that is coherent, believable, and supported by sufficient evidence. Timelines have become a mainstay of digital forensic analysis in both public and private sectors. They help explain what was happening on a given device or set of devices during a cybersecurity incident, crime, or other event. The timeline helps frame the situation to explain it to attorneys, juries, and other stakeholders.

Timestamps

Timestamps play a very important role in many digital forensic examinations, so it’s very important for any forensic examiner or analyst to clearly understand how they work. To prove that a certain document was created before or at a certain time, you can timestamp it.

An examiner must know the ins and outs of each operating system. For example, NTFS has eight timestamp values, called Modified, Accessed, Changed, and Birth (MACB) times; the $STANDARD_INFORMATION and $FILE_NAME attributes each contain a set of four. Certain timestamps can be modified by users, so knowing which values can be changed and which reflect the correct date/time is important to an investigation.

To view them, the investigator runs MFTRCRD.exe c:\file\file.a -d indxdump=off 1024 -s.

Time Offset

In a forensic examination, establishing the time zone of the suspect system is one of the first tasks for a forensic examiner. If the offset information is not established at an early stage and taken into account, the validity of date/time evidence may be brought into question. This is true for the examination of browser history and related artifacts, as well as the examination of file system metadata. Some date/time values stored in binary files are affected by the time zone setting of the original computer, and many digital forensic applications can alter the representation of these dates based on the time zone setting of the forensic workstation. The situation becomes particularly complicated when the suspect computer was set to an incorrect time zone and the computer clock was set to correspond to the local time zone. Many date/timestamps store data as Coordinated Universal Time (UTC) values. If this is the case, the operating system and some applications have to convert the value between local time and UTC. On Windows systems, you can check the system registry at HKEY_LOCAL_MACHINE\System\Select, where four keys detail the Current, Default, Failed, and LastKnownGood control sets. UTC = LOCAL TIME + BIAS, where the active time bias is the current time difference from UTC in minutes.
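
The following is a minimal Python sketch of the UTC = LOCAL TIME + BIAS formula, assuming the active time bias (in minutes) has already been recovered from the suspect system's registry; the sample timestamp and bias value are illustrative only, not data from a real case.

from datetime import datetime, timedelta, timezone

def local_to_utc(local_time, active_time_bias_minutes):
    # Apply UTC = LOCAL TIME + BIAS. The bias is the value (in minutes)
    # recovered from the suspect system's registry.
    return (local_time + timedelta(minutes=active_time_bias_minutes)).replace(tzinfo=timezone.utc)

# Example: a timestamp recorded in local time on a system with a +300-minute
# bias (UTC-5 with no daylight saving in effect).
local_stamp = datetime(2021, 3, 15, 9, 30, 0)
print(local_to_utc(local_stamp, 300))   # 2021-03-15 14:30:00+00:00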

Note

Time offset is recorded against a verified time standard. Recording the time offset is critical to an accurate examination when dates and times are at issue. It should be recorded at the beginning of any examination.

Tags

The actual collection and processing of evidence are critical steps in the investigative process. Each piece of evidence collected must be handled in a way that preserves its integrity and any trace evidence. The collection process must provide a detailed record of an item’s whereabouts from the time of collection to the time it arrives in a courtroom. Failure to pay proper attention to any one of these areas can easily result in one or more pieces of evidence having no value in court or in administrative proceedings. After an object is identified as evidence, it must be tagged. Evidence tagging helps identify the collected item. The tag can consist of something as little as a sticker with the date, time, control number, and name or initials of the investigator. Using a control number is an easy way to identify a piece of evidence in documentation such as a chain of custody. A tag can also be an actual document that contains general information about the item and the incident under investigation.

After the evidence is tagged, the investigator should photograph it in a way that also displays the tag information. Photographing evidence becomes another way to document what was collected and how it was processed. When taking pictures of computing devices, you should include all interfaces. If a cable is attached to an interface, it should remain connected during the picture-taking process. It’s a good practice to clearly label each attached cable with the associated peripheral device before taking interface photos. You should place all items in protective bagging. For electronic devices, you should use a Faraday/antistatic bag to ensure no outside radio frequency (RF) can influence the device.

Reports

Every step in the investigation needs to be reported on. Many reports can make up the final report provided as a summation of the entire case. Each report offers its own insights into intent and activity, and the tools deployed are always directly tied to the case at hand and specific hurdles that are faced. As with all evidence, well-documented reports ensure that no corners are being cut and each aspect of the process is forensically defensible. Investigators can use existing templates to guide them along the path of process, procedures, controls, and requirements.

Event Logs

On Windows systems, event logs contain a lot of useful information about the system and its users. Many analysts rely on Windows event logs to help gain context about attacker activity on a system. Unfortunately, some things cannot be determined using only the event log. Collecting Windows file activity is a massive undertaking. Locating and storing logs from the system should be your first step in evidence collection. For example, Windows Event Viewer logs are stored in %SystemRoot%\System32\Winevt\Logs. Each system stores these critical and useful files in different places depending on the operating system, so based on the system you are investigating, you need to determine the appropriate location for these files (prior to touching the device).
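
As a simple illustration of this collection step, the following Python sketch copies .evtx files from the default Event Viewer location and records a SHA-256 hash of each copy; the destination path is hypothetical, and in practice you would work from a mounted, verified forensic image rather than the live device.

import hashlib
import os
import shutil

# Default Event Viewer log location, expanded on the examiner's mounted copy
# of the suspect system (adjust the root as needed).
log_dir = os.path.expandvars(r"%SystemRoot%\System32\Winevt\Logs")
dest = r"E:\case_1234\evtx"          # hypothetical evidence folder

os.makedirs(dest, exist_ok=True)
for name in os.listdir(log_dir):
    if not name.lower().endswith(".evtx"):
        continue
    copied = shutil.copy2(os.path.join(log_dir, name), os.path.join(dest, name))
    with open(copied, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    print(f"{name},{digest}")        # record each hash in the collection log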

TIP

You should know the various Windows Event Viewer logs and what they do. The Windows Event Viewer security log provides information related to the success and failure of login attempts, as well as information related to other audited events. The application log provides information related to applications run on the local system. The system log records error messages and other information generated by the Windows operating system itself.

Interviews

Different types of crimes such as computer crime, fraud, and hacking require different interview methods. These types of crimes can be performed using computers or mobile devices, including smartphones, tablets, and digital cameras. To have a successful interview, the interviewer should collect the following information:

  • Background details regarding the victim or device

  • The focus of the investigation

  • The possible perpetrators and their background information

During interviews, interviewers should try to recognize new details and other witnesses who could help in resolving the case. Different interview techniques exist, but interviews usually should try to answer simple questions such as who, when, where, what, how, and why. The initial interview is typically the best chance to collect basic evidence from perpetrators who used computers and mobile devices as well as information from victims. After information necessary for the interview is gathered, the interview process can start. At this point, the following must be ready:

  • Privacy Act statement

  • List of official papers from the interviewee

  • Checklist with information gathered prior to the interview

  • List of questions

  • Copies of all official papers planned to show to the perpetrator or victim

  • The method of recording the interview

The interview should be conducted in a peaceful and comfortable setting. Treating the interviewees as individuals and cooperating with them in a serious and careful manner allows the interviewer to develop a rapport. A good relationship during the interview between the investigator and the victims or perpetrators is important to achieving good results. The interviewer should always try to establish common ground to make interviewees relaxed and comfortable. At the same time, the interviewer should avoid a heavy-handed approach that enforces authority. An authoritative approach yields less successful results because it makes victims or perpetrators less comfortable and puts them on the defensive.

Acquisition

Forensic acquisition is the process of collecting specific data related to an attack, intrusion, or investigation. Items can include computer media such as hard drives, thumb drives, servers, phones, tablets, and other devices that store electronic data. Investigators need to follow clear and precise steps when acquiring data from a crime/investigative scene so that all collected data retains its integrity.

Order of Volatility

The collection of evidence should start with the most volatile item and end with the least volatile. The order of volatility is the order in which the digital evidence is collected. Highly volatile data resides mostly in memory, cache, or CPU registers, and it will be lost as soon as the power to the computer is turned off. Less volatile data is not lost so easily and is relatively stable because it may be stored on disk drives or other permanent storage media, such as CD-ROM discs. Crime scene technicians should collect evidence beginning with the most volatile and then move toward the least volatile; a simple collection-plan sketch following this order appears after the list. The following order of volatility was taken from RFC 3227, Guidelines for Evidence Collection and Archiving:

  1. Cache, registers

  2. ARP cache, routing table, memory, kernel statistics, process table

  3. Temporary files

  4. Disk

  5. Monitoring data and remote logging pertaining to the computer in question

  6. Physical configurations, network topology

  7. Archival media
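
The following minimal Python sketch simply encodes the RFC 3227 ordering as a data structure that drives a collection plan; the collection routine here is a placeholder, not a real acquisition tool.

def collect(item):
    # Placeholder: call the appropriate acquisition routine for each item.
    print(f"Collecting: {item}")

# RFC 3227 order of volatility, most volatile first.
collection_plan = [
    "cache and registers",
    "ARP cache, routing table, memory, kernel statistics, process table",
    "temporary files",
    "disk",
    "remote logging and monitoring data",
    "physical configuration and network topology",
    "archival media",
]

for item in collection_plan:        # always work from most to least volatile
    collect(item)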

Disk

The hard drive or solid-state drive (SSD) is usually the center of most cyber forensic investigations. A forensic image is an electronic copy of a drive. It’s a bit-by-bit or bitstream file that’s an exact, unaltered copy of the media being duplicated. Even though you might think that the data you place on a disk will be around forever, that is not always the case. The likelihood that data on a disk cannot be extracted is very low. The first step in hard drive forensics is the identification of storage devices at the scene.

Forensic disk controllers are most commonly associated with the process of creating a disk image, or acquisition, during forensic analysis. Their use is to prevent inadvertent modification of evidence. Disk drives are considered a nonvolatile form of data storage.

Random-Access Memory

There are several reasons that a complete random-access memory (RAM) capture might prove useful; most often they revolve around the key differences between data stored in RAM and data stored on a hard disk drive. RAM is considered volatile memory and is perceived to be more trusted than nonvolatile memory, such as ROM, disk, magnetic, or optical storage. Investigations using live forensic techniques require special handling because the volatile data in RAM can be lost if the system is turned off.

If we consider data that is either not stored on a hard disk or somehow protected there, yet present in plaintext in RAM, many data types immediately come to mind: passwords, financial transaction information, encryption keys, and similar types of high-value data. The information in RAM might not even be there intentionally (poorly written apps can leave residual data), or it could be a byproduct of malware. Certain malware can be completely memory resident and leave behind critical information that may lead to its author. RAM contains

  • Unsaved documents

  • Passwords

  • Credentials

  • Who is logged in to the system

RAM can also contain code and data from programs that were never written to disk or saved but are resident in memory, including printed pictures, emails, chat messages, malware, running processes, and more. During every second of the computer’s use, RAM is changing as the computer writes or reads whatever it is actively working on. It may now be apparent why collecting RAM is the priority in a live forensic triage or on-scene digital investigation. To dump volatile memory, options are available for all platforms. On Windows, you can perform a kernel or full dump directly or use a third-party tool specific to your platform. Due to operating system security restrictions, forensic programs often cannot run entirely in user space; instead, a special kernel driver or module must be loaded into the core of the operating system to obtain memory dumps.

Swap/Pagefile

A swap file or pagefile is an area of the hard drive that the operating system can use to store data from RAM that has not been used recently, moving that data from RAM to disk and bringing it back into RAM when it is needed again, hence the name “swapping.”

Pagefile.sys is used within Windows operating systems to store data from RAM when RAM becomes full. Pagefile.sys is a contiguous file, so it can be read more quickly; it is located on the root of the hard drive; and normally, the more infrequently used data is stored to it. It can also be used as a data backup in the event of a system crash. By default, the Windows operating system configures the size of Pagefile.sys; however, it can also be altered by the user. Pagefile.sys can hold a significant proportion of the data present on the hard drive, but removing it can greatly reduce the operating speed of the computer.

The swap file or pagefile is also called virtual memory. During a forensics investigation, a pagefile is very important and should be treated as such. Although not as volatile as RAM, it should always be inspected using forensic tools because it might reveal critical details such as passwords and other important forensic artifacts.
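
As a rough illustration of why the pagefile is worth inspecting, the following Python sketch pulls runs of printable ASCII out of a copied pagefile.sys, much like the Unix strings utility; the file path and keyword are hypothetical, and a real examination would use dedicated forensic tools rather than this crude triage.

import re

printable_run = re.compile(rb"[ -~]{8,}")      # runs of 8+ printable ASCII bytes

def extract_strings(path, chunk_size=16 * 1024 * 1024):
    # Yield printable-ASCII runs from a copied pagefile.sys. Strings that
    # straddle a chunk boundary may be split, which is fine for rough triage.
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            for match in printable_run.finditer(chunk):
                yield match.group().decode("ascii")

for text in extract_strings(r"E:\case_1234\pagefile.sys"):   # hypothetical copy
    if "password" in text.lower():                           # crude keyword triage
        print(text)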

Operating System

Operating system forensics is the process of retrieving useful information from the operating system of the computer or mobile device in question. The most commonly used operating systems are Windows, macOS, and Linux. The file system identifies how a disk drive stores data. Many file systems have been introduced for various operating systems, such as FAT, exFAT, and NTFS for Windows and Ext2fs or Ext3fs for Linux. Data and file recovery techniques for these file systems include data carving, slack space analysis, and detection of data hiding.

Another important aspect of OS forensics is memory forensics, which includes virtual storage, Windows memory, Linux memory, macOS memory, memory extraction, and swap spaces. OS forensics also involves web browsing artifacts, like messaging and email artifacts. Some indispensable aspects of OS forensics are discussed in subsequent sections.

Another option for forensic analysts in identifying, collecting, and preserving evidence from computers is to use a bootable forensic toolset/disc, one that has a special operating system and tools for ensuring that the original system is not modified. The SIFT workstation, Sleuth Kit, Autopsy, FTK (Forensic Toolkit), Kali forensics, and many others can help with your investigation.

Device

Depending on the forensic investigation and the device(s) involved, a multitude of devices, software, scripts, and systems can be used. Forensic examination can be performed on cell phones, tablets, PCs, networking devices, printers, and many other computer-based devices (such as desk phones and voice assistant devices). You should choose the appropriate tool and methods for the device and the expected results. For example, when you’re performing forensics on a mobile platform such as a cell phone, there are several methods and many tools, ranging from automated, physical, and logical acquisition to brute-force acquisition tools like svStrike, IP-Box, FTK Imager, Zune, and X-Ways. Two of the most common tools for cell phone forensics are Cellebrite and EnCase. As previously mentioned, you need to use the right platform, software, process, and procedures for the specific device you are investigating.

Firmware

System firmware, such as the basic input/output system (BIOS) or the Unified Extensible Firmware Interface (UEFI) on more modern systems, is the first program that runs on the CPU when a computer is turned on. System firmware occupies a much more privileged software layer in computer systems and has more recently become a target in sophisticated computer attacks. Malware such as high-profile rootkits can be almost completely invisible to standard forensic procedures and can be detected only with special mechanisms. Researchers have provided insights into well-known open-source memory forensic tools and have evaluated this approach within both physical and virtual environments. The firmware code initially runs from the ROM chip. It then moves from ROM into RAM by manipulating special registers, called Programmable Attribute Maps (PAMs), to shadow the ROM-mapped regions with RAM and decompress its code image into memory. Firmware then starts initializing devices on the system bus and maps their registers and memory into the physical address space as required.

The pre-boot phase, which is the actual power-on self-test (POST), loads firmware settings, checks for a valid disk system, and, if the system is good, moves to the next phase, verifying that the computer has a valid master boot record (MBR) before loading the Windows boot manager. As you can see, firmware is referenced and loaded multiple times along this path, giving rise to the potential for an attacker to interact with the load-in process. A report from Wired magazine shows that many BIOSes share portions of the same code, which leads to incursion vulnerability levels in the 80 percent range. This code reuse is present even among big-brand PC makers. Attackers look to modify and sabotage firmware to target specific sections of the operating system and infiltrate other running software. Securing firmware is largely in the product designers’ hands. Intel’s Hardware Shield, Microsoft’s updated OS protection, and Dell’s enhanced BIOS verifications are all evidence of vendors taking responsibility. You too can help by updating your firmware when an update is released. If a USB drive is required, make sure it is one you can trust. If you’re purchasing new hardware, ensure it has advanced firmware security.

Snapshot

A snapshot preserves the entire state and data of a virtual machine at the time it is taken. A snapshot includes the virtual machine settings and the state of the machine’s virtual drives. It can also be used as a restore point if software testing or configuration changes fail.

Because computers are vulnerable to cyber attacks, cybercriminals can counterfeit and fabricate evidence stored on the computer in question. Forensic analysts therefore must protect the evidence from loss. Before an investigator starts examining the computer for digital evidence, the entire drive of the computer should be imaged (copied) to preserve data and verify its integrity. In various circumstances, crime scene investigators create a bitstream copy of a storage device with the help of a forensic imaging tool, writing it to a forensically clean storage device. They then secure the original storage media and subsequently work only on the image obtained.

Proper checks are necessary for successful image duplication, requiring the image makers to perform a hash calculation before and after the creation of a forensic image. A hash calculation verifies that the image wasn’t altered or damaged during an imaging process. If the duplication is successful, the hash of both the original copy and imaged copy should be the same.
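
The following Python sketch shows the verification idea: the hash recorded at acquisition time is compared with a freshly computed hash of the image file. The image path and the recorded hash value are placeholders; real imaging tools record these hashes for you.

import hashlib

def sha256_of(path, block_size=1024 * 1024):
    # Stream the file so even very large images can be hashed.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(block_size), b""):
            digest.update(block)
    return digest.hexdigest()

acquisition_hash = "9f2c..."   # placeholder: value recorded when the image was made
image_hash = sha256_of(r"E:\case_1234\suspect_drive.dd")   # hypothetical image file

if image_hash == acquisition_hash:
    print("Image verified: hash matches the acquisition record.")
else:
    print("WARNING: hash mismatch; the image may have been altered or damaged.")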

Cache

The contents of the CPU cache and registers are extremely volatile because they are changing all the time. Nanoseconds literally make the difference here. An examiner needs to get to the cache and registers immediately and extract that evidence before it is lost or corrupted. Users rely on web browsers for many functions, such as searching for information, accessing email, engaging in e-commerce or banking, instant messaging, visiting online blogs, and accessing social networks. Web browsers record many data-related user activities; information such as URLs visited by users, search terms, cookies, cache files and images, access times, and usage times is held in memory on the system. Examining browser evidence that may relate to criminal activity is an important step, and uncovering an offender’s profile and connections often depends on these web records. The evidence obtained from the use of the web browser is a key component of a forensic expert’s case. It is possible to analyze evidence such as history, cache, cookies, downloads, URL addresses, access times, and frequency of visits from a suspect’s computer. The analysis process should first identify which web browsers are present on the system.

Network

Network traffic and logs can provide empirical evidence if forensic analysts properly collect and preserve them. Various network devices and tools, including switches, routers, and VPN appliances, as well as SIEMs, proxies, and firewalls, can be configured to record logs of the activities and events that occur on them and can provide evidence for court proceedings. Network forensic analysis tools (NFATs) typically provide the same functionality as packet sniffers, protocol analyzers, and SIEM software, sometimes in a single product. Whereas SIEM software concentrates on correlating events among existing data sources that typically include multiple network traffic and related sources, NFAT software focuses primarily on collecting, examining, and analyzing network traffic. NFAT software also offers additional features that further facilitate network forensics, such as reconstructing events by replaying network traffic within the tool, ranging from an individual session, such as instant messaging (IM) between two users, to all sessions during a particular time period. The speed of the replay can typically be adjusted as needed.
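
As a small taste of what an NFAT automates, the following Python sketch (assuming the third-party scapy package is installed) loads a packet capture and groups it into sessions; the capture filename is hypothetical.

from scapy.all import rdpcap    # third-party package: pip install scapy

packets = rdpcap("incident_capture.pcap")      # hypothetical capture file

# Group packets into sessions (e.g., "TCP 10.0.0.5:51522 > 203.0.113.7:443")
# and summarize each one, a tiny slice of what NFAT software automates.
for session, session_packets in packets.sessions().items():
    total_bytes = sum(len(p) for p in session_packets)
    print(f"{session}: {len(session_packets)} packets, {total_bytes} bytes")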

Artifacts

Forensic artifacts are remnants of an intrusion that can be identified on a host or network. Artifacts that can be extracted from hosts include logs, the registry, browser data, the RDP cache, and Windows Error Reporting (WER). Artifacts extracted from hosts within the network, or from the compromised network itself, can lead investigators to a threat actor’s entry point and exfiltration point(s).

Common examples of indicators of compromise (IOCs), which can be matched against collected evidence as shown in the sketch after this list, include

  • IP addresses (IPv4 and IPv6)

  • URLs and FQDN + path

  • File hashes (such as MD5 and SHA-1)

  • Filenames and file types

  • Windows registry keys

  • Windows drivers
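
The following minimal Python sketch shows how such indicators might be matched against an exported log file; all IOC values and the log filename are made-up examples, not real threat intelligence.

# Hypothetical IOC values supplied by threat intelligence for the case.
ioc_values = {
    "203.0.113.45",                            # IP address
    "44d88612fea8a8f36de82e1278abb02f",        # file hash (MD5)
    "svch0st.exe",                             # suspicious filename
}

def scan_log(path):
    # Report any log line that mentions a known indicator of compromise.
    with open(path, "r", errors="replace") as f:
        for lineno, line in enumerate(f, start=1):
            lowered = line.lower()
            for ioc in ioc_values:
                if ioc.lower() in lowered:
                    print(f"line {lineno}: matched {ioc}: {line.strip()}")

scan_log("firewall_export.log")                # hypothetical exported log file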

On-premises vs. Cloud

Cloud computing has transformed the IT industry; services can now be deployed in a fraction of the time they used to take. Scalable computing solutions have spawned large cloud computing companies such as Amazon Web Services (AWS), Google Cloud, and Microsoft Azure. With a click of a button, personnel can create or reset the entire infrastructure of a computing resource in three main cloud computing service models: software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS). These models present unique challenges to conducting cloud forensic investigations.

Forensic issues that are unique to cloud computing are jurisdiction, multitenancy, and dependency on cloud service providers (CSPs). Cloud forensics is a subset of digital forensics based on the unique approach to investigating cloud environments. CSPs have servers around the world to host customer data. When a cyber incident happens, legal jurisdiction and the laws that govern the region present unique challenges. A court order issued in a jurisdiction where a data center resides is likely not applicable to the jurisdiction for a different host in another country. In modern CSP environments, the customer can choose the region in which the data will reside, and this decision should be made carefully.

A main concern for an investigator is to ensure that the digital evidence has not been tampered with by third parties so that it can be admissible in a court of law. In PaaS and SaaS service models, customers must depend on the cloud service providers for access to the logs because they do not have control over the hardware. In some cases, CSPs intentionally hide the details of logs from customers. In other cases, CSPs have policies that they do not offer services to collect logs.

Maintaining a chain of custody is very challenging in a cloud environment compared to a traditional forensics environment. In a traditional forensics environment, the internal security team has control over who conducts forensics operations on a machine, whereas in cloud forensics, the security team has no control over whom the CSP chooses to gather evidence. If the investigator is not trained according to a forensic standard, the chain of custody may not hold up in a court of law.

TIP

Cloud-based and on-premises are simply terms that describe where systems store data. Many of the same vulnerabilities that affect on-premises systems also affect cloud-based systems.

Right-to-Audit Clauses

In cloud-based systems, the right-to-audit provision of hosting, services, and data storage contracts is important. It should clearly state that the buyer has the right to examine the books and records of the seller to verify that the seller has complied with the contract and that no employee of the buyer received any funds, either directly or indirectly. Vendor audit clauses can help control fraud and abuse by providing a discovery device in a fraud examination. When the right to audit is exercised, fraud examiners or auditors are generally looking for fraud by vendors and violations of company ethics policies. For example, they might look for faulty or inferior goods, short shipments, high prices when the goods could be bought directly or more cheaply from the same or another vendor, goods not delivered, kickbacks, gifts and gratuities to company employees, or conflicts of interest.

Regulatory/Jurisdiction

When data is stored, managed, and accessed in the cloud and users are located in any number of countries, regulatory and jurisdictional authority might not be clearly delineated. Forensic regulatory and jurisdictional questions arise when data resides in private data centers, offsite data centers, and cloud virtual environments where users have on-demand access to processing resources, data, infrastructure, and applications through a private connection or via the Internet. Liability for compliance shifts from solely your responsibility to a joint responsibility between you and the provider. When it comes to multi-jurisdiction engagement, whether for investigation or e-discovery, there will always be issues with data protection laws in various locations. Different countries have different laws when it comes to data protection or privacy; for example, the EU is highly regulated by the General Data Protection Regulation (GDPR).

GDPR is obviously relevant to any organization that has offices in Europe, but it goes well beyond that. For example, if a client organization stores any data for EU residents, that organization is subject to GDPR as well. When you’re engaging in any data collection, it is important to discern whether your organization needs to comply with GDPR or other country-specific laws and regulations, and if so, what steps need to be taken to ensure that only data relevant to the matter is collected rather than everything. Generally, forensic teams overcollect to some extent to ensure that important documents are not missed, but with privacy changes, it is important to understand the laws and collect only what is relevant. Certain jurisdictions may not allow you to collect data at all based on where you are located. Therefore, you must check all local laws, regulations, restrictions, and privacy rights before collecting data; otherwise, fines could reach millions of dollars. In contrast, when data is stored locally or in your on-premises owned-and-managed data center, you are responsible for maintaining compliance. Keeping clear records of your team’s part in the process is essential to proving compliance during an audit.

Data Breach Notification Laws

On-premises breaches are handled differently from cloud-based breaches. In some cases, you might not be made aware of a cloud breach within the notification period stated by regulation and could thereby be responsible for fines. This is an important factor in compliance requirements when working with third-party cloud hosting. Another important factor is that you are responsible for initiating the forensic investigation for onsite breaches. Depending on the cloud vendor and the agreement, you might have to wait until they perform the entire investigation before you get to the actual issue. In some cases, you can request a joint cloud investigation, in which you send a team to the provider’s data center to perform the investigation on behalf of both parties. Just about every U.S. state has some sort of data breach law. California SB 1386, breach notification legislation, radically changed the nature and tenor of responding to data breaches by introducing mandatory notification requirements and opening the door to significant regulatory fines and civil damages. These laws have affected the way evidence in data breaches must be collected and treated.

The gathering of evidence has always been a significant step in analyzing the cause and extent of data breaches. However, the use of forensic evidence and methodologies, such as preserving data so that findings can be verified and authenticated in litigation, has become more complicated and important with the advent of cloud hosting over the last few years. Not all network breaches lead to unauthorized access to personally identifiable information (PII) or protected health information (PHI), so forensic analysis can remove the need for expensive notification requirements that can be detrimental to the company.

Another growing trend is for regulators to question the procedures used by organizations in determining the scope of a breach and the numbers of persons to be notified. There has been a surge in civil claims following data breaches, with plaintiffs’ attorneys being ready to argue that inadvertent loss of data during the initial breach response spoiled relevant evidence, leading to negative presumptions against the breached organization.

Integrity

Forensic investigators play crucial roles within the legal system and are constantly under various pressures when performing analytical work, generating reports based on their analyses, or testifying to the content of those reports. Maintaining the scientific integrity of these actions is paramount to supporting a functional legal system and the sound practice of the science itself. There must be transparency and professionalism in the forensic field to ensure integrity and to reduce conflicts of interest. Beyond the investigators’ and analysts’ integrity, the evidence must maintain its integrity through a closely documented and controlled chain of custody. There must be a systematic, objective, scientific analysis of the incident, not only because this is the most reliable approach, but because a true scientific understanding of how an incident or failure happened is the first step to empowering others to prevent such incidents in the future.

Hashing

A hash calculation is performed before and after the creation of a forensic image. It ensures that the image wasn’t altered during the duplication process. In addition, it is essential to verify periodically that the hash of the image copy being used for forensic examination has not changed or been altered. Therefore, a hash calculation confirms that the results drawn from the image copy would legally apply to the original source.

Hash functions have four defining properties that make them useful. Hash functions are

  • Collision resistant: A collision occurs when multiple inputs produce the same output or hash value. Because potential inputs are infinite and the output has a fixed length, collisions are bound to exist; collision resistance means it should be computationally infeasible to find two inputs that produce the same hash value.

  • Deterministic: For any given input, a hash function must return the same value each and every time that input is processed.

  • Computationally efficient: You can expect that a hash function will be computationally efficient or, in other words, speedy.

  • Pre-image resistant: All hash functions must be “pre-image resistant,” meaning that given only a hash value, it should be computationally infeasible to recover the original input. The hash value should not provide any clue about the size or content of the input.

Checksums

To produce a checksum, you run a program that puts a file through an algorithm. Typical algorithms used for this process include MD5, SHA-1, SHA-256, and SHA-512. The algorithm uses a cryptographic hash function that takes an input and produces a string (a sequence of numbers and letters) of a fixed length. The input file can be a small 100KB file or a massive 4GB file, but either way, you end up with a checksum of the same length. Checksums are also called hashes. Small changes in the file produce very different-looking checksums. You can use checksums to check files and other data for errors that occur during transmission or storage, as well as to verify that evidence in a forensic investigation has not been tampered with.
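
The following short Python example, using the standard library hashlib module, shows that checksums have a fixed length regardless of input size and change completely when a single character of the input changes; the two sample inputs are arbitrary.

import hashlib

sample_1 = b"Forensic image, block 0001"
sample_2 = b"Forensic image, block 0002"   # differs by a single character

for data in (sample_1, sample_2):
    print("MD5:    ", hashlib.md5(data).hexdigest())     # always 32 hex characters
    print("SHA-256:", hashlib.sha256(data).hexdigest())  # always 64 hex characters

# The digests are the same fixed length for any input size, yet the
# one-character difference produces completely different values.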

Provenance

Data provenance refers to the establishment of a chain of custody for information that can describe its generation and all subsequent modifications that have led to its current state. An advanced data provenance practice or system gives forensic investigators a transparent idea about the data’s lineage and helps resolve disputes over controversial pieces of data by providing digital evidence. Likewise, provenance research is a forensic method employed to reconstruct legal chains of ownership that establish an artwork’s whereabouts from the moment of creation to its present circumstances, which is helpful in defending the chain of custody in cases where data is subject to handling scrutiny.

The first step to being able to make use of provenance for forensic purposes is to be able to ensure that it is collected in a secure and trustworthy fashion. However, the collection process alone raises several significant challenges with approaches to provenance collection from application to operating system level, and this process should rely on a provenance monitor to assure the complete collection of data. Such information can be invaluable for a forensics investigator.

Preservation

Handling of digital evidence is the most important aspect in digital forensics. This process is known as preservation of evidence—isolation and protection of digital evidence exactly as found without alteration so that it can later be analyzed. Sometimes this process involves a forensic copy of a hard disk from a system or logs being collected by another system. Collection is the gathering of devices and duplication of electronically stored information (ESI) for the purpose of preserving digital evidence, an exact copy of the original that remains untouched while digital forensics is performed. In certain cases you may have to turn off the device or isolate ESI in a way that will not alter evidence.

Dead box forensic collection (imaging a device after it is powered off in order to collect digital evidence) remains an essential part of the digital forensic process. However, live box forensics is growing increasingly important with today’s technology, because exploit code may reside only in memory. If a device is encrypted and you do not have the passcode or encryption key, you may never have another chance to acquire valuable evidence if that device powers off or locks due to inactivity.

The other problem with live collection is that relevant data could be permanently lost due to continued use of the device. The data must be preserved for collection if it is to be considered for litigation. Time and date stamps will change, system log files will rotate, and valuable information can be lost if you attempt to log in and see what was done before forensic images are taken of a device. A copy of digital evidence must be properly preserved and collected in accordance with forensic best practices. Otherwise, the digital evidence may be inadmissible in court, or spoliation sanctions may be imposed.

E-discovery

E-discovery is the process of identifying, preserving, collecting, processing, reviewing, and analyzing electronically stored information in litigation. The digital forensics process involves identifying, preserving, collecting, analyzing, and reporting on digital information. In an e-discovery matter, the role of the expert is to provide the information to legal teams in a reviewable format for the analysis. When leveraging digital forensics, the expert performs the analysis of the information and reports the findings to the legal teams. The party performing the analysis of the electronic information is the primary differentiator between e-discovery and digital forensics. The terms e-discovery and digital forensics are often used interchangeably, but there are clear differences. The critical difference is the analysis of the information. In an e-discovery engagement, the legal teams review and analyze the information. In digital forensics, the expert reviews the digital information and provides the findings in an expert report.

TIP

Organizations should have a legal hold process in place to perform an e-discovery to preserve and gather information.

Data Recovery

Forensic data recovery is the extraction of data from damaged, deleted, or purposely destroyed evidence sources in a forensically sound manner. This means that any evidence resulting from the recovery can later be relied on in a court of law. Deleting files from your computer doesn’t mean the files are necessarily gone, even after the Recycle Bin has been emptied. What really happens is that you delete the pointers to those files, which merely hides them from the operating system. Emptying the Recycle Bin frees up disk space and removes the pointer details from the file directory, but the file contents remain. Unless that data has been physically overwritten or removed from the hard drive, those files can be recovered with forensic recovery tools, even if the entire drive has been formatted and seemingly wiped clean.

Nonrepudiation

Nonrepudiation is the assurance that someone cannot deny the validity of something, where a statement’s author cannot dispute its authorship. Nonrepudiation is a legal concept that is widely used in information security and refers to a service that provides proof of the origin and integrity of the data. Nonrepudiation makes it difficult to successfully deny who and where a message came from as well as the authenticity and integrity of that message. Digital signatures can offer nonrepudiation when it comes to online transactions, where it is crucial to ensure that the party of a contract or a communication can’t deny the authenticity of the signature on a document or sending the communication in the first place. In this context, nonrepudiation refers to the ability to ensure that a party to a contract or a communication must accept the authenticity of his or her signature on a document or the sending of a message. In forensics and digital security, nonrepudiation means

A service or system that provides proof of the integrity and origin of data.

An authentication that can be said to be genuine with high confidence.

Proof of data integrity is typically the easiest of these requirements to accomplish. A data hash such as SHA2 usually ensures that the data will not be changed undetectably. Even with this safeguard, it is possible to tamper with data in transit, either through an on-path (formerly known as man-in-the-middle) attack or phishing.

As a result, data integrity is best asserted when the recipient already possesses the necessary verification information, such as after being mutually authenticated. The most common method to provide nonrepudiation in the context of digital communications or storage is through digital signatures, a more powerful tool that provides nonrepudiation in a publicly verifiable manner.
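
As a minimal sketch of how a digital signature supports nonrepudiation, the following Python example uses the third-party cryptography package’s Ed25519 API to sign an examiner’s report and verify the signature; the report text is illustrative, and in practice keys would be generated, protected, and distributed under a formal key-management process.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

report = b"Examiner findings for case 1234: image hash 9f2c..."   # illustrative

private_key = Ed25519PrivateKey.generate()   # held only by the signer
public_key = private_key.public_key()        # shared with relying parties

signature = private_key.sign(report)

try:
    public_key.verify(signature, report)     # raises if report or signature changed
    print("Signature valid: origin and integrity confirmed.")
except InvalidSignature:
    print("Signature invalid: the report was altered or not signed by this key.")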

Strategic Intelligence/Counterintelligence

Strategic intelligence is the practice of initially narrowing an investigation to relevant information gathering: using all available resources to make determinations that limit the investigation’s scope to a manageable level. Counterintelligence is information gathered and activities conducted to protect against espionage, other intelligence activities, or sabotage conducted by or on behalf of other elements. This intelligence is designed to quickly direct resources to the most significant problems first and address them head on. The strategic intelligence plan should aid in planning, resource allocation, priorities for intelligence protection, and the overall process.

Chapter Review Activities

Use the features in this section to study and review the topics in this chapter.

Review Key Topics

Review the most important topics in the chapter, noted with the Key Topic icon in the outer margin of the page. Table 30-2 provides a reference for these key topics and the page number on which each is found.

Table 30-2 Key Topics for Chapter 30

Key Topic Element    Description                                    Page Number
Section              Legal Hold                                     842
Section              Video                                          842
Section              Admissibility                                  843
Section              Chain of Custody                               844
Section              Timelines of Sequence of Events                844
Section              Tags                                           845
Section              Reports                                        846
Section              Event Logs                                     846
Section              Interviews                                     846
Section              Order of Volatility                            848
Section              Disk                                           848
Section              Random-Access Memory                           848
Section              Swap/Pagefile                                  849
Section              Operating System                               850
Section              Device                                         850
Section              Firmware                                       851
Section              Snapshot                                       851
Section              Cache                                          852
Section              Network                                        852
Section              Artifacts                                      853
Section              Right-to-Audit Clauses                         854
Section              Regulatory/Jurisdiction                        855
Section              Data Breach Notification Laws                  855
Section              Hashing                                        856
Section              Checksums                                      857
Section              Provenance                                     857
Section              Preservation                                   858
Section              E-discovery                                    858
Section              Data Recovery                                  859
Section              Nonrepudiation                                 859
Section              Strategic Intelligence/Counterintelligence     860

Define Key Terms

Define the following key terms from this chapter, and check your answers in the glossary:

legal hold

chain of custody

tag

event logs

acquisition

order of volatility

swap file

pagefile

artifacts

right-to-audit

provenance

preservation

E-discovery

data recovery

nonrepudiation

strategic intelligence

counterintelligence

Review Questions

Answer the following review questions. Check your answers with the answer key in Appendix A.

1. What are the three rules for evidence?

2. What are three standards for evidence?

3. When are checksums useful?

4. What is the role of a hash in computer forensics?

5. What does NFAT mean in the context of network forensics?
