How it works...

For this script, we import the axiom library and the datetime library. Notice that we have removed the previous argparse and csv imports, as they are unnecessary here.

from __future__ import print_function
from axiom import *
from datetime import datetime

Next, we must paste in the ProcessDailyOut class from the prior recipe, minus the write_csv() method and the argument-handling code, for use in this script. Because the current version of the API does not allow imports, we have to bundle all the code we need into a single script. To save pages and avoid redundancy, we omit that code block here (though it exists, as you'd expect, in the code file bundled with this chapter).
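Although the full ProcessDailyOut implementation is omitted here, a minimal stand-in can sketch the interface this plugin relies on: a constructor that accepts an open file object and a run() method that returns a list of dictionaries keyed by the column names consumed later in Hunt(). The class name ProcessDailyOutSketch and the hard-coded sample values below are invented for illustration only; the real parsing logic lives in the prior recipe:

```python
import io
from datetime import datetime


class ProcessDailyOutSketch(object):
    """Hypothetical stand-in for the prior recipe's ProcessDailyOut class,
    sketching only the call contract: accept an open file object and
    return a list of dictionaries from run()."""

    def __init__(self, daily_out):
        # An open file-like object for the daily.out log
        self.daily_out = daily_out

    def run(self):
        # The real parser walks each dated snapshot and its disk-status
        # table; this sketch returns a single hard-coded entry (sample
        # values are invented) using the same keys that Hunt() reads.
        return [{
            'event_date': datetime(2017, 8, 1, 3, 15, 2),
            'event_tz': 'EDT',
            'Filesystem': '/dev/disk1',
            'Size': '233Gi',
            'Used': '180Gi',
            'Capacity': '78%',
            'Mounted on': '/',
        }]


# Usage mirrors the Hunt() method: pass an open file, then call run()
events = ProcessDailyOutSketch(io.StringIO(u'placeholder log text')).run()
```

Each dictionary key here matches an AddValue() call made when publishing hits, which is the only contract the plugin code actually depends on.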

The next class is the DailyOutArtifact, a subclass of the Artifact class provided by the Axiom API. We call the AddHunter() method, providing an instance of our (not yet shown) DailyOutHunter class, before defining the plugin's name within the GetName() method.

class DailyOutArtifact(Artifact):
    def __init__(self):
        self.AddHunter(DailyOutHunter())

    def GetName(self):
        return 'daily.out parser'

The last method of this class, CreateFragments(), specifies how to handle a single entry of the processed daily.out log results. A fragment, with respect to the Axiom API, is the term used to describe a single entry of an artifact. This code block allows us to add custom column names and assign the proper categories and data types for those columns. The categories include date, location, and other special values defined by the tool. The majority of columns for our artifact will be in the None category, as they don't display a specific kind of information.

One important categorical difference is DateTimeLocal versus DateTime: a DateTime value is presented to the user as UTC, so we need to be conscious of selecting the proper date category. Because we extracted the time zone from the daily.out log entries, we use the DateTimeLocal category in this recipe. The FragmentType property is a string for all of the values, as the class does not convert values from strings into any other data type.

    def CreateFragments(self):
        self.AddFragment('Snapshot Date - LocalTime (yyyy-mm-dd)',
                         Category.DateTimeLocal, FragmentType.DateTime)
        self.AddFragment('Snapshot Timezone', Category.None,
                         FragmentType.String)
        self.AddFragment('Volume Name',
                         Category.None, FragmentType.String)
        self.AddFragment('Filesystem Mount',
                         Category.None, FragmentType.String)
        self.AddFragment('Volume Size',
                         Category.None, FragmentType.String)
        self.AddFragment('Volume Used',
                         Category.None, FragmentType.String)
        self.AddFragment('Percentage Used',
                         Category.None, FragmentType.String)

The next class is our Hunter. This parent class is used to run the processing code and, as you will see, specifies the platform and content that will be provided to the plugin by the Axiom engine. In this instance, we only want to run this against the computer platform and a file that goes by a single name. The RegisterFileName() method is one of several options for specifying what files will be requested by the plugin. We can also use regular expressions or file extensions to select the files we would like to process.

class DailyOutHunter(Hunter):
    def __init__(self):
        self.Platform = Platform.Computer

    def Register(self, registrar):
        registrar.RegisterFileName('daily.out')

The Hunt() method is where the magic happens. To start, we open the temporary copy of the file that the sandbox exposes (context.Searchable.FileCopy) and assign the resulting file object to the temp_daily_out variable. We then hand this open file object to the ProcessDailyOut class and call its run() method to parse the file, just like in the last recipe.

    def Hunt(self, context):
        temp_daily_out = open(context.Searchable.FileCopy, 'r')

        processor = ProcessDailyOut(temp_daily_out)
        parsed_events = processor.run()

After gathering the parsed event information, we are ready to "publish" the data to the software and display it to the user. In the for loop, we first instantiate a Hit() object and add data to a new fragment using the AddValue() method. Once we have assigned the event values to a hit, we publish the hit to the platform with the PublishHit() method and continue the loop until all parsed events have been published:

        for entry in parsed_events:
            hit = Hit()
            hit.AddValue(
                "Snapshot Date - LocalTime (yyyy-mm-dd)",
                entry['event_date'].strftime("%Y-%m-%d %H:%M:%S"))
            hit.AddValue("Snapshot Timezone", entry['event_tz'])
            hit.AddValue("Volume Name", entry['Mounted on'])
            hit.AddValue("Filesystem Mount", entry["Filesystem"])
            hit.AddValue("Volume Size", entry['Size'])
            hit.AddValue("Volume Used", entry['Used'])
            hit.AddValue("Percentage Used", entry['Capacity'])
            self.PublishHit(hit)

The last bit of code checks whether the file object is not None and closes it if so. This is the end of the processing code, which may be called again if another daily.out file is discovered on the system!

        if temp_daily_out is not None:
            temp_daily_out.close()

The last line registers our hard work with Axiom's engine to ensure it is included and called by the framework.

RegisterArtifact(DailyOutArtifact())

To use the newly developed artifact in Axiom, we need to take a few more steps to import and run the code against an image. First, we need to launch Axiom Process. This is where we will load, select, and run the artifact against the provided evidence. Under the Tools menu, we select the Manage custom artifacts option:

Within the Manage custom artifacts window, we will see any existing custom artifacts and can import new ones as seen here:

We will add our custom artifact and the updated Manage custom artifacts window should show the name of the artifact:

Now we can press OK and continue through Axiom, adding the evidence and configuring our processing options. When we reach the COMPUTER ARTIFACTS selection, we want to confirm that the custom artifact is selected to run. It probably goes without saying, but we should only run this artifact if the machine is running macOS or has a macOS partition on it:

After completing the remaining configuration options, we can start processing the evidence. With processing complete, we run Axiom Examine to review the processed results. As seen in the following screenshot, we can navigate to the CUSTOM pane of the artifact review and see the parsed columns from the plugin! These columns can be sorted and exported using the standard options in Axiom, without any additional code on our part:
