How it works...

This script, like the others, begins with import statements for the libraries required for its execution. The two new libraries here are olefile, which, as we discussed, parses the Windows Sticky Note OLE streams, and StringIO, a built-in library used to interpret a string of data as a file-like object. We will use StringIO to convert the pytsk file object into a stream the olefile library can interpret:

from __future__ import print_function
from argparse import ArgumentParser
import unicodecsv as csv
import os
import StringIO

from utility.pytskutil import TSKUtil
import olefile
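To illustrate the role StringIO plays here, the snippet below wraps raw bytes in an in-memory, file-like object. This is a minimal sketch using Python 3's io.BytesIO, the byte-oriented equivalent of the Python 2 StringIO.StringIO call used in this recipe; the sample bytes shown are the standard OLE magic signature, used purely for illustration.

```python
import io

# Minimal sketch: wrap raw bytes as a file-like object, the same trick this
# recipe uses to hand pytsk file content to olefile. (io.BytesIO is the
# Python 3 equivalent of Python 2's StringIO.StringIO for byte content.)
raw = b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1"  # the 8-byte OLE magic signature

file_like = io.BytesIO(raw)
print(file_like.read(4))   # supports read(), like a real file handle
file_like.seek(0)          # and seek(), so olefile can rewind it
print(file_like.read(2))
```

Because the object exposes the same read() and seek() interface as an open file, olefile can consume it without ever touching the disk.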

We specify a global variable, REPORT_COLS, which represents the report columns. These static columns will be used across several functions.

REPORT_COLS = ['note_id', 'created', 'modified', 'note_text', 'note_file']

This recipe's command-line handler takes three positional arguments, EVIDENCE_FILE, IMAGE_TYPE, and REPORT_FOLDER, which represent the path to the evidence file, the type of evidence file, and the desired output directory path, respectively. This is similar to the previous recipe, with the exception of the REPORT_FOLDER, which is a directory that we will write the Sticky Note RTF files to:

if __name__ == '__main__':
    parser = ArgumentParser(
        description=__description__,
        epilog="Developed by {} on {}".format(
            ", ".join(__authors__), __date__)
    )
    parser.add_argument('EVIDENCE_FILE', help="Path to evidence file")
    parser.add_argument('IMAGE_TYPE', help="Evidence file format",
                        choices=('ewf', 'raw'))
    parser.add_argument('REPORT_FOLDER', help="Path to report folder")
    args = parser.parse_args()
    main(args.EVIDENCE_FILE, args.IMAGE_TYPE, args.REPORT_FOLDER)

Our main function starts similarly to the last, by handling the evidence file and searching for the files we seek to parse. In this case, we are looking for the StickyNotes.snt file, which is found within each user's AppData directory. For this reason, we limit the search to the /Users folder and look for a file matching the exact name:

def main(evidence, image_type, report_folder):
    tsk_util = TSKUtil(evidence, image_type)
    note_files = tsk_util.recurse_files('StickyNotes.snt', '/Users',
                                        'equals')

We then iterate through the resulting files, splitting out the user's home directory name and setting up the file-like object required for processing by the olefile library. Next, we call the parse_snt_file() function to process the file and return a list of results to iterate through. At this point, if the note_data is not None, we write the RTF file with the write_note_rtf() method. Additionally, we append the processed data from prep_note_report() to the report_details list. Once the for loop completes, we write the CSV report with the write_csv() method by providing the report name, report columns, and the list we have built of the sticky note information.

    report_details = []
    for note_file in note_files:
        user_dir = note_file[1].split("/")[1]
        file_like_obj = create_file_like_obj(note_file[2])
        note_data = parse_snt_file(file_like_obj)
        if note_data is None:
            continue
        write_note_rtf(note_data, os.path.join(report_folder, user_dir))
        report_details += prep_note_report(note_data, REPORT_COLS,
                                           "/Users" + note_file[1])
    write_csv(os.path.join(report_folder, 'sticky_notes.csv'), REPORT_COLS,
              report_details)

The create_file_like_obj() function takes our pytsk file object and reads the size of the file. This size is used in the read_random() function to read the entire sticky note content into memory. We feed the file_content into the StringIO() class to convert it into a file-like object the olefile library can read before returning it to the parent function:

def create_file_like_obj(note_file):
    file_size = note_file.info.meta.size
    file_content = note_file.read_random(0, file_size)
    return StringIO.StringIO(file_content)

The parse_snt_file() function accepts the file-like object as its input and is used to read and interpret the sticky note file. We begin by validating that the file-like object is an OLE file, returning None if it is not. If it is, we open the file-like object using the OleFileIO() method. This provides a list of streams, allowing us to iterate through each element of each sticky note. As we iterate over the list, we check whether the first element of the stream name contains three dashes, as this suggests the stream holds a unique identifier for a sticky note. The file can contain one or more sticky notes, each identified by its unique ID. The sticky note data is read either as raw RTF data or as UTF-16 encoded data, depending on the value of the second element (index 1) of the stream name.

We also read the created and modified information from the stream using the getctime() and getmtime() functions, respectively. Next, we extract the sticky note RTF or UTF-16 encoded data to the content variable. Note that we must decode the UTF-16 encoded data before storing it. If there is content to save, we add it to the note dictionary and continue processing all remaining streams. Once all streams are processed, the note dictionary is returned to the parent function:

def parse_snt_file(snt_file):
    if not olefile.isOleFile(snt_file):
        print("This is not an OLE file")
        return None
    ole = olefile.OleFileIO(snt_file)
    note = {}
    for stream in ole.listdir():
        if stream[0].count("-") == 3:
            if stream[0] not in note:
                note[stream[0]] = {
                    # Read timestamps
                    "created": ole.getctime(stream[0]),
                    "modified": ole.getmtime(stream[0])
                }

            content = None
            if stream[1] == '0':
                # Parse RTF text
                content = ole.openstream(stream).read()
            elif stream[1] == '3':
                # Parse UTF text
                content = ole.openstream(stream).read().decode("utf-16")

            if content:
                note[stream[0]][stream[1]] = content

    return note
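The dash-count check can be seen in isolation below. The stream names are hypothetical, but they mirror the two-element lists that olefile's listdir() returns for a .snt file: a GUID-style note identifier paired with a stream index ('0' for RTF, '3' for UTF-16).

```python
# Hypothetical listdir()-style output for a .snt file holding one note.
# The GUID-style identifier is invented for illustration, not from a real file.
streams = [
    ["8fd79d36-0000-0000-0000", "0"],  # RTF-formatted note body
    ["8fd79d36-0000-0000-0000", "3"],  # UTF-16 plain-text note body
    ["Metafile", "0"],                 # bookkeeping stream; no dashes
]

# Only stream names containing exactly three dashes identify a sticky note,
# so the "Metafile" entry is skipped
note_ids = sorted({s[0] for s in streams if s[0].count("-") == 3})
print(note_ids)
```

This is why a single .snt file can hold several notes: each unique identifier groups its own '0' and '3' streams, and the dictionary built in parse_snt_file() is keyed on those identifiers.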

To create the RTF files, we pass the dictionary of note data to the write_note_rtf() function. If the report folder does not exist, we use the os library to create it. At this point, we iterate through the note_data dictionary, splitting the note_id keys from stream_data values. The note_id is used to create the output RTF filename prior to opening it.

The data stored in stream zero is then written to the output RTF file before it is closed and the next sticky note is handled:

def write_note_rtf(note_data, report_folder):
    if not os.path.exists(report_folder):
        os.makedirs(report_folder)
    for note_id, stream_data in note_data.items():
        fname = os.path.join(report_folder, note_id + ".rtf")
        with open(fname, 'w') as open_file:
            open_file.write(stream_data['0'])

With the content of the sticky notes written, we now move on to the CSV report itself, which is handled a little differently by the prep_note_report() function. This function translates the nested dictionary into a flat list of dictionaries better suited to a CSV spreadsheet. We flatten it by including the note_id key and naming the fields using the keys specified in the global REPORT_COLS list.

def prep_note_report(note_data, report_cols, note_file):
    report_details = []
    for note_id, stream_data in note_data.items():
        report_details.append({
            "note_id": note_id,
            "created": stream_data['created'],
            "modified": stream_data['modified'],
            "note_text": stream_data['3'].strip("\x00"),
            "note_file": note_file
        })
    return report_details
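To see the flattening in action, the sketch below walks a hypothetical note dictionary, shaped like parse_snt_file()'s output, into a flat report row; every value (the identifier, timestamps, note text, and path) is invented for illustration.

```python
# Hypothetical note dictionary mirroring the shape built by parse_snt_file();
# all values are invented for illustration
note_data = {
    "8fd79d36-0000-0000-0000": {
        "created": "2017-06-01 10:30:00",
        "modified": "2017-06-02 08:15:00",
        "0": "{\\rtf1 buy milk}",   # raw RTF stream
        "3": "buy milk\x00",        # decoded UTF-16 stream, null-padded
    }
}

# Flatten the nested mapping: one dictionary per note, keyed by the
# REPORT_COLS field names, with trailing null bytes stripped from the text
rows = []
for note_id, stream_data in note_data.items():
    rows.append({
        "note_id": note_id,
        "created": stream_data["created"],
        "modified": stream_data["modified"],
        "note_text": stream_data["3"].strip("\x00"),
        "note_file": "/Users/jdoe/StickyNotes.snt",  # hypothetical path
    })
print(rows[0]["note_text"])
```

Each row now maps one-to-one onto a CSV line, which is exactly what DictWriter expects in the next step.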

Lastly, in the write_csv() method, we use a csv.DictWriter object to create an overview report of the sticky note data. This CSV writer also uses the unicodecsv library and writes the list of dictionaries to the file, using the REPORT_COLS list of columns as the fieldnames.

def write_csv(outfile, fieldnames, data):
    with open(outfile, 'wb') as open_outfile:
        csvfile = csv.DictWriter(open_outfile, fieldnames)
        csvfile.writeheader()
        csvfile.writerows(data)
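The same DictWriter pattern can be exercised with the standard library alone. This sketch writes one hypothetical report row to an in-memory buffer instead of a file; Python 3's csv and io modules stand in for the unicodecsv library and the 'wb' file handle used above, and all row values are invented for illustration.

```python
import csv
import io

REPORT_COLS = ['note_id', 'created', 'modified', 'note_text', 'note_file']

# One hypothetical report row; all values are invented for illustration
rows = [{
    "note_id": "8fd79d36-0000-0000-0000",
    "created": "2017-06-01 10:30:00",
    "modified": "2017-06-02 08:15:00",
    "note_text": "buy milk",
    "note_file": "/Users/jdoe/StickyNotes.snt",
}]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=REPORT_COLS)
writer.writeheader()    # header line built from the fieldnames
writer.writerows(rows)  # one CSV line per dictionary

print(buf.getvalue().splitlines()[0])  # prints the header line
```

Because DictWriter looks up each field by key, the order of keys inside the row dictionaries does not matter; the fieldnames list alone controls the column order.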

We can then view the output as we have a new directory containing the exported sticky notes and report:

Opening our report, we can view the note metadata and some of the note content, though most spreadsheet viewers have difficulty interpreting non-ASCII characters:

Lastly, we can open the output RTF files and view the raw content:
