Audit log monitoring solutions

To internally develop a monitoring solution based on the audit log data, a PowerShell script that searches the audit log and exports the results to a CSV file can be scheduled. This CSV file is then used as the source of an extract-transform-load (ETL) or extract-load-transform (ELT) process that persists the log data in a data source, such as a SQL Server database. Finally, Power BI Desktop can be used to implement the remaining lightweight transformations, create DAX measures, and develop the monitoring reports.

The following list of steps and supporting screenshots describe the monitoring workflow in detail:

  1. A PowerShell script (.ps1) is executed on a schedule and generates a CSV file of Power BI activities:

The following sample script searches the audit log for Power BI activities since yesterday and exports the data to a CSV file:

# Prompt for Office 365 credentials and compute the one-day search window
$UserCredential = Get-Credential
$CurrentDate = Get-Date
$Yesterday = $CurrentDate.AddDays(-1)
$csvFile = "C:\Users\Brett Powell\Desktop\PowerBIAuditLogs.csv"

# Connect to Exchange Online, which exposes the Search-UnifiedAuditLog cmdlet
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection

Import-PSSession $Session

# Search the audit log for Power BI activities and export the results to CSV
Search-UnifiedAuditLog -StartDate $Yesterday -EndDate $CurrentDate -RecordType PowerBI -ResultSize 5000 |
    Export-Csv $csvFile -NoTypeInformation

# Close the remote session
Remove-PSSession $Session

The ResultSize parameter (count of rows) of the Search-UnifiedAuditLog cmdlet is limited to 5,000. If not specified, the default value is 100. Depending on the level of usage in the organization, the frequency of the script's execution (and of the overall process) will need to be adjusted to capture all Power BI activities.
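Alternatively, the SessionId and SessionCommand parameters of Search-UnifiedAuditLog support retrieving larger result sets in pages. The following is a minimal sketch of this approach, assuming the Exchange Online session from the preceding script is still open:

# Page through results within one search session;
# ReturnLargeSet supports up to 50,000 results in 5,000-row pages
$SearchSession = [Guid]::NewGuid().ToString()
$AllResults = @()
do {
    $Page = Search-UnifiedAuditLog -StartDate $Yesterday -EndDate $CurrentDate `
        -RecordType PowerBI -ResultSize 5000 `
        -SessionId $SearchSession -SessionCommand ReturnLargeSet
    $AllResults += $Page
} while ($Page)
$AllResults | Export-Csv $csvFile -NoTypeInformation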

  2. An ETL (or ELT) process is executed to access the CSV file and load the new data to the data source:
    •  The results of each audit log search can contain duplicate rows. However, the Identity column included in the search results can be used to eliminate these duplicates, as shown in the sketch following this step.

In an on-premises MSBI environment, a combination of SQL Server Agent, SQL Server Integration Services (SSIS), and the SQL Server relational database engine could be used to implement the data retrieval process. For example, a SQL Server Agent job could be scheduled to sequentially execute the PowerShell script, an SSIS package, and optionally a SQL Server stored procedure.
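For example, duplicate rows could be eliminated in PowerShell before the load, using the Identity column; a minimal sketch, in which the distinct file path is illustrative:

# Keep one row per Identity value before loading to the database
$distinctFile = "C:\Users\Brett Powell\Desktop\PowerBIAuditLogs_Distinct.csv"
Import-Csv $csvFile |
    Sort-Object -Property Identity -Unique |
    Export-Csv $distinctFile -NoTypeInformation

The same de-duplication could instead be implemented in the database layer, such as in the optional stored procedure noted above.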

  3. A Power BI dataset (.PBIX) with a connection to the data source in step 2 is refreshed:
    • The M query used to load the data model can include transformations to parse the JSON column and expose all columns to the data model, as shown in the following screenshot.
    • As an import mode dataset, additional data sources, such as Active Directory, could be included in the refresh process.
    • Additionally, the refresh of this dataset could be triggered to execute immediately following the completion of step 2 via the Power BI REST API, as described in the Power BI REST API section of Chapter 8, Managing Application Workspaces and Power BI Content.
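As a brief sketch of triggering such a refresh, the MicrosoftPowerBIMgmt PowerShell module (an alternative to calling the REST API directly) could be used; the workspace and dataset IDs below are placeholders:

# Authenticate and request an on-demand refresh of the monitoring dataset
Connect-PowerBIServiceAccount
$WorkspaceId = "00000000-0000-0000-0000-000000000000"   # placeholder
$DatasetId   = "00000000-0000-0000-0000-000000000000"   # placeholder
Invoke-PowerBIRestMethod -Method Post `
    -Url "groups/$WorkspaceId/datasets/$DatasetId/refreshes"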

In the following screenshot, Power BI Desktop has been connected to the audit log search result data:

Power BI Desktop: Parse JSON transformation

As shown in the preceding screenshot, a Parse JSON command is available on the Transform tab of the Power Query Editor. Selecting the AuditData column containing the JSON and then applying the Parse JSON transform converts each cell value into a record value. Once parsed, select the outward-facing arrows next to the AuditData column header to convert these records into individual columns.

Technically, the two steps described here in the Power Query Editor are converted into M expressions, which utilize the Table.TransformColumns() and Table.ExpandRecordColumn() functions, respectively. As this M query is essential to the monitoring solution, reviewing its syntax via the Advanced Editor and applying enhancements, such as parameterizing the source file location, is recommended.
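As a minimal sketch, the generated M expressions take the following form; the file path is illustrative (and should be parameterized), and only a subset of the AuditData fields is expanded for brevity:

let
    // Load the exported CSV file and promote the header row
    Source = Csv.Document(File.Contents("C:\PowerBIAuditLogs.csv"), [Delimiter = ",", Encoding = 1252]),
    PromotedHeaders = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Parse JSON: convert each AuditData text value into a record
    ParsedJSON = Table.TransformColumns(PromotedHeaders, {{"AuditData", Json.Document}}),
    // Expand the record column into individual, prefixed columns
    ExpandedAuditData = Table.ExpandRecordColumn(ParsedJSON, "AuditData",
        {"Id", "CreationTime", "Operation", "UserId", "Workload"},
        {"AuditData.Id", "AuditData.CreationTime", "AuditData.Operation", "AuditData.UserId", "AuditData.Workload"})
in
    ExpandedAuditData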

With the parsed JSON column expanded, 19 columns with an AuditData prefix will be available to load to the data model, as shown in the following screenshot:

Transformed audit log data

As shown in the preceding screenshot, the expanded columns will be of the Any data type in M (the ABC123 icon) and, as Any type columns, will be loaded to the data model as Text data type columns. Therefore, the CreationDate column, which is available outside the AuditData (JSON) column and is stored as a Date/Time type, should be used at the reporting layer.
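Alternatively, data types could be assigned explicitly in M after the expansion; a brief sketch, continuing the illustrative query above:

// Assign explicit types to selected expanded columns (names are illustrative)
TypedColumns = Table.TransformColumnTypes(ExpandedAuditData,
    {{"AuditData.CreationTime", type datetime}, {"AuditData.UserId", type text}})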

With a sound data retrieval process in place, DAX measures can be authored, such as the count of active users, the average number of users per day and per month, and the count of created reports or dashboards. To support security and compliance, measures and visualizations could target high-risk or undesirable activities, such as exporting report visual data or publishing reports to the web. For example, a card visual representing the count of data export activities could be pinned to a Power BI dashboard and a data alert configured against this dashboard tile.
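As a minimal sketch, assuming the log is loaded to a table named 'Audit Log' with UserId and Operation columns and a date-grain CreationDate column (all names are illustrative), such measures could take the following form:

Active Users = DISTINCTCOUNT('Audit Log'[UserId])

Avg Users Per Day =
AVERAGEX(
    VALUES('Audit Log'[CreationDate]),
    CALCULATE(DISTINCTCOUNT('Audit Log'[UserId]))
)

Export Activities =
CALCULATE(
    COUNTROWS('Audit Log'),
    'Audit Log'[Operation] = "ExportReport"  -- operation name is illustrative
)

The export count measure could then back the dashboard card and data alert described above.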
