Logging

PowerShell comes with extensive logging capabilities, most of which surface in the Windows event logs.

Logs for Windows PowerShell:

This log source contains basic information about Windows PowerShell. We already used it earlier, when we searched for the engine version by filtering on event ID 400.
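
A query along these lines can be used to repeat that lookup. This is a minimal sketch that reads the classic Windows PowerShell log and extracts the engine version from the event message; the regular expression is an assumption about the message layout, not something mandated by the log itself:

# Query the classic "Windows PowerShell" log for engine lifecycle events (ID 400)
# and pull the reported engine version out of the event message text.
Get-WinEvent -FilterHashtable @{ LogName = 'Windows PowerShell'; Id = 400 } |
    ForEach-Object {
        if ($_.Message -match 'EngineVersion=(\S+)') {
            [PSCustomObject]@{
                TimeCreated   = $_.TimeCreated
                EngineVersion = $Matches[1]
            }
        }
    }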

Remoting Logs:

These logs are mainly used for troubleshooting, to investigate misbehaving remoting sessions. They can also be used forensically to validate which connections were established from or to specific machines.
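
As a starting point for such an investigation, the WinRM operational log can be browsed directly. The following is only a sketch; the number of events to retrieve is an arbitrary choice:

# List recent entries from the WinRM operational log, which records
# incoming and outgoing remoting connections on this machine.
Get-WinEvent -LogName 'Microsoft-Windows-WinRM/Operational' -MaxEvents 50 |
    Select-Object TimeCreated, Id, Message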

PowerShell Admin and Operational logs:

The last two, Admin and Operational, can be found in the Event Viewer under the following path: Applications and Services Logs | Microsoft | Windows | PowerShell. The Admin log records administrative tasks; it is important to validate this log, as a re-enabled PowerShell version 2 would show up here. The Operational log is where the module logging and script block logging events described below are written.
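
Both logs can also be inspected from the console, and the version 2 engine feature can be checked at the same time. This is a hedged sketch and has to be run elevated; the feature name filter assumes the standard optional feature names for the PowerShell 2.0 engine:

# Show the Admin and Operational logs and their record counts.
Get-WinEvent -ListLog 'Microsoft-Windows-PowerShell/Admin', 'Microsoft-Windows-PowerShell/Operational'

# Check whether the PowerShell 2.0 engine feature has been (re-)enabled.
Get-WindowsOptionalFeature -Online |
    Where-Object FeatureName -like 'MicrosoftWindowsPowerShellV2*' |
    Select-Object FeatureName, State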

PowerShell code logging can generally be split into the following three log types:

  • Transcription logging
  • Module logging
  • Script Block logging

For each of these important logging mechanisms, group policies are available:

Module Logging:

The first one is module logging, which records pipeline execution details, including variable initialization and command invocations. It has been available since PowerShell 3 and its events are saved with the event ID 4103.

Unless you have a specific reason to limit it, it is recommended to enable module logging for all modules. The Group Policy setting (GPO) contains a list of module names; to monitor all available and used modules, simply enter an asterisk (*) as the first value.
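
If you prefer to script the setting instead of using the GPO editor, the policy maps to registry values under HKLM. The following is a minimal sketch of that registry equivalent, run elevated, with the * wildcard covering all modules:

# Registry equivalent of the "Turn on Module Logging" policy.
$basePath = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\PowerShell\ModuleLogging'

New-Item -Path $basePath -Force | Out-Null
New-ItemProperty -Path $basePath -Name EnableModuleLogging -Value 1 -PropertyType DWord -Force | Out-Null

# Log all modules by registering the * wildcard under ModuleNames.
New-Item -Path "$basePath\ModuleNames" -Force | Out-Null
New-ItemProperty -Path "$basePath\ModuleNames" -Name '*' -Value '*' -PropertyType String -Force | Out-Null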

With module logging enabled, you can always discover which cmdlets have been executed, Invoke-Expression being a good example. In the following snippet, it is executed against an obfuscated script, and the deobfuscated command is still caught in the event logs:

function SuperDecrypt {
    param($script)
    $bytes = [Convert]::FromBase64String($script)
    ## XOR encryption
    $xorKey = 0x42
    for ($counter = 0; $counter -lt $bytes.Length; $counter++) {
        $bytes[$counter] = $bytes[$counter] -bxor $xorKey
    }
    [System.Text.Encoding]::Unicode.GetString($bytes)
}

$decrypted = SuperDecrypt 'FUIwQitCNkInQm9CCkItQjFCNkJiQmVCEkI1QixCJkJlQg=='
Invoke-Expression $decrypted

# The same call as a one-liner:
Invoke-Expression (SuperDecrypt 'FUIwQitCNkInQm9CCkItQjFCNkJiQmVCEkI1QixCJkJlQg==')
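
Assuming module logging is enabled, the resulting 4103 events can be retrieved with a query such as the following sketch; the message filter on Invoke-Expression is just one example of what to search for:

# Search module logging events (ID 4103) in the Operational log for
# the Invoke-Expression invocation shown above.
Get-WinEvent -FilterHashtable @{
    LogName = 'Microsoft-Windows-PowerShell/Operational'
    Id      = 4103
} | Where-Object { $_.Message -like '*Invoke-Expression*' } |
    Select-Object TimeCreated, Message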

Transcription Logging:

Transcription logging is often described as over-the-shoulder logging, because it records every input and output of every session exactly as it appears. This mechanism is especially useful for investigating attacks from the internal network, which, as explained in the introduction, make up more than half of all attacks.

In addition, transcription logging creates a unique record for every PowerShell session and stores it in files in a very storage-efficient way. These files can be saved either locally or on a network share with dedicated ACLs. As a first step, we recommend enabling this log type to write locally, with invocation headers included, since the size of all PowerShell logs will increase drastically.
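
The corresponding policy can again be set through the registry. This is a hedged sketch of that equivalent; C:\Transcripts is only a placeholder output directory:

# Registry equivalent of the "Turn on PowerShell Transcription" policy.
# C:\Transcripts is a placeholder path and must already exist.
$basePath = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\PowerShell\Transcription'

New-Item -Path $basePath -Force | Out-Null
New-ItemProperty -Path $basePath -Name EnableTranscripting -Value 1 -PropertyType DWord -Force | Out-Null
New-ItemProperty -Path $basePath -Name EnableInvocationHeader -Value 1 -PropertyType DWord -Force | Out-Null
New-ItemProperty -Path $basePath -Name OutputDirectory -Value 'C:\Transcripts' -PropertyType String -Force | Out-Null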

The locally configured folder path must already exist.

Under the configured logging folder, a dedicated subfolder is created for each day. Inside these daily folders, a dedicated file is created for every PowerShell session, named with a generic ID in the following pattern:

PowerShell_transcript.%COMPUTERNAME%.%GUID%.txt
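
The collected transcript files can be enumerated with a simple listing such as this sketch, again using the placeholder path from above:

# List all transcript files under the configured output directory.
Get-ChildItem -Path 'C:\Transcripts' -Recurse -Filter 'PowerShell_transcript.*.txt' |
    Select-Object FullName, LastWriteTime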


The content of each file is separated into the invocation header, the command start time, the actual input and output of the session, and finally, the end time:

**********************
Windows PowerShell transcript start
Start time: 20180519114630
Username: %domain%\%username%
RunAs User: %domain%\%userName%
Configuration Name:
Machine: %ComputerName% (Microsoft Windows NT 10.0.16257.0)
Host Application: C:\Windows\System32\WindowsPowerShell\v1.0\powershell_ise.exe C:\4_Signing\1 Signing with Cert from file.ps1
Process ID: 16244
PSVersion: 5.1.16257.1
PSEdition: Desktop
PSCompatibleVersions: 1.0, 2.0, 3.0, 4.0, 5.0, 5.1.16257.1
BuildVersion: 10.0.16257.1
CLRVersion: 4.0.30319.42000
WSManStackVersion: 3.0
PSRemotingProtocolVersion: 2.3
SerializationVersion: 1.1.0.1
**********************
**********************
Command start time: 20180519114630
**********************
PS>Get-AuthenticodeSignature -FilePath 'C:\Users\dadasnev\AppData\Local\Temp\2c0b15ce-d7ad-4531-880d-ab591c9eccd7.ps1'

Directory: C:\Users\dadasnev\AppData\Local\Temp

SignerCertificate Status Path
----------------- ------ ----
78C61E31456784A5721187320D95E3BF481 UnknownError 2c0b15ce-d7ad-4...

**********************
Windows PowerShell transcript end
End time: 20180519114630
**********************

Script Block Logging:

Script block logging records blocks of code as they are executed by the PowerShell engine. It captures the full contents of the executed code, including scripts and commands, and its events are saved with event ID 4104. You can also enable script block start and stop events, which can make sense for performance analysis of specific cmdlets; from a security perspective, however, they create a lot of unnecessary overhead and should be disabled if not needed.

The start and stop events are stored with the event IDs 4105 and 4106.
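
As before, the policy can be scripted through its registry equivalent. The following sketch enables script block logging and leaves the start/stop (invocation) events disabled, in line with the recommendation above:

# Registry equivalent of the "Turn on PowerShell Script Block Logging" policy.
$basePath = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\PowerShell\ScriptBlockLogging'

New-Item -Path $basePath -Force | Out-Null
New-ItemProperty -Path $basePath -Name EnableScriptBlockLogging -Value 1 -PropertyType DWord -Force | Out-Null

# Keep the 4105/4106 start and stop events disabled to avoid the extra volume.
New-ItemProperty -Path $basePath -Name EnableScriptBlockInvocationLogging -Value 0 -PropertyType DWord -Force | Out-Null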

In particular, script block logging entries with event ID 4104 and the level Warning should be validated. PowerShell has built-in keyword detection for suspicious content, such as SeEnableDebugPrivilege, and these warnings are generated even if script block logging has not been enabled, which makes them especially worth reviewing.
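
A query such as the following sketch surfaces exactly those entries; level 3 corresponds to Warning in the event log:

# Pull script block events (ID 4104) flagged as Warning (Level 3), which
# PowerShell raises for suspicious script content even without the policy enabled.
Get-WinEvent -FilterHashtable @{
    LogName = 'Microsoft-Windows-PowerShell/Operational'
    Id      = 4104
    Level   = 3
} | Select-Object TimeCreated, Message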

Logging Recommendation:

From a recommendation perspective, you should enable all three logging capabilities in your environment. As a first step, a good approach is to log everything on the machines and store it locally; this allows you to do further forensics after a security incident. But always think like an attacker: they will surely try to manipulate and delete log files in order not to leave any fingerprints on the machines.

The next step is therefore to centralize the logs. For the transcription logs, this means a dedicated share with carefully configured ACLs; for the gathered event logs (module logging and script block logging), this means Windows Event Forwarding (WEF) up to a dedicated Security Information and Event Management (SIEM) solution.

Centralizing the logs alone will not bring you any advantage, though. The next step is to create incidents out of this data by hunting, either by writing your own hunting queries or by using dedicated software for this task. Keep in mind that you also need an incident response process up and running, and probably a Security Operations Center (SOC), to handle the resulting incidents at their different levels. The problem we frequently see is a lack of resources to actually handle the incidents; in the end, all the effort put into enabling, collecting, and validating the log material is then wasted.

A nicely described approach using WEF can be found in the following blog article: https://blogs.technet.microsoft.com/jepayne/2017/12/08/weffles/.