In this chapter, we will explore the Azure Monitor Log Analytics platform, which is used to store all the log data that will be analyzed by Azure Sentinel. This is the first component that needs to be designed and configured when implementing Azure Sentinel, and will require some ongoing maintenance to configure the data storage options and control costs of the solution.
This chapter will also explain how to create a new workspace using the Azure portal, PowerShell, and the CLI. Once a workspace has been created, we will learn how to attach various resources to it so that information can be gathered, and we will explore the other navigation menu options.
By the end of this chapter, you will know how to set up a new workspace, connect resources to gather data, enable Azure Sentinel for data analysis, and configure some of the advanced features to ensure security and cost management.
We will cover the following topics in this chapter:
Before you start creating Log Analytics workspaces and using Azure Sentinel, you will need to set up an Azure tenant and subscription. It does not matter what type of Azure tenant you have, just as long as you have one that you can use. If you do not have access to an Azure tenant, you can set up a free trial by following the instructions at https://azure.microsoft.com/en-us/free/.
Once you have a tenant, you will need a subscription as well, if there is not one that you can use already. Depending on the type of Azure tenant you have access to, you may need to contact someone else to create the subscription. If you need help creating a new subscription, go to https://docs.microsoft.com/en-us/azure/active-directory/fundamentals/active-directory-how-subscriptions-associated-directory.
Azure Monitor is the name of a suite of solutions built within the Azure platform to collect logs and metrics, then use that information to create insights, visualizations, and automated responses. Log Analytics is one of the main services, created to analyze the gathered logs. The platform supports near real-time scenarios, scales automatically, and is available to multiple services across Azure (including Azure Sentinel). Using the Kusto Query Language (KQL), the query language used to obtain information from logs, you can run complex queries quickly and save them for future use. In this book, we will refer to this service simply as Log Analytics.
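To give a flavor of KQL, a simple query might look like the following sketch. The SecurityEvent table used here is just an example; the tables available in your workspace depend on which data sources you have connected:

```kusto
// Count security events per computer over the last hour,
// then show the ten busiest computers
SecurityEvent
| where TimeGenerated > ago(1h)
| summarize EventCount = count() by Computer
| top 10 by EventCount
```

Queries like this can be saved in the workspace and reused later, which is one of the features Azure Sentinel builds on.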
In order to create a Log Analytics workspace, you must first have an Azure subscription. Each workspace is created in a specific geographic location (region), which ties the data storage to that region. Choose the region based on where you want your data to be stored; consider the distance between the data sources and the Azure data center, alongside your organization's data sovereignty and other legal requirements. The region you select will also impact the costs associated with both Log Analytics and Azure Sentinel.
Each workspace has its own separate data repository, and each can be configured uniquely to meet the business and technical requirements for security, governance, and cost management. Azure Sentinel can only be enabled against a single Log Analytics workspace; we therefore recommend that you centralize all your security logs in a dedicated workspace. If your organization must keep data in more than one location for the legal or technical reasons mentioned earlier, then you will need to run multiple instances of Azure Sentinel (one per Log Analytics workspace), each of which must be monitored; in that case, you should consider deploying Azure Lighthouse.
Azure Lighthouse
Microsoft has addressed the need to manage multiple Azure subscriptions and resources in a centralized console. This is usually deployed by managed service providers who have multiple customers, although it may also be used by organizations with complex requirements that have deployed multiple Azure subscriptions. Azure Sentinel is now a supported resource for this portal, and more features are expected to be added over time to bring it closer to parity with interacting with Azure Sentinel directly.
The following diagram shows how Log Analytics workspaces relate to the rest of Azure. Each workspace resides in a single resource group, although a single resource group can contain multiple workspaces and, most likely, other Azure resources. Each resource group belongs to a single subscription, and each subscription belongs to a single Azure tenant. There can be, and usually are, multiple resource groups in a subscription, and many companies will have multiple subscriptions in a tenant:
Once created, a workspace can be used to gather information from many different sources, including the following:
To protect the data collected, security and compliance standards are built into this solution; the Log Analytics service manages your data in a secure cloud data repository and ensures the data is secured with multiple layers of protection, including the following:
For more information on the way data is secured, we recommend reading the official Microsoft documentation: https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-security.
While it is easy to just go and create a workspace, it is better to plan out the workspace configuration beforehand to avoid having to perform rework later. Some of the aspects to take into account include the following:
If this raises concerns that the name will make the workspace a bigger target for bad actors, use whatever naming convention makes sense and meets corporate standards. Just remember that the name must be unique across all of Azure. It must also be between 4 and 63 characters long, may contain only letters, digits, and hyphens, and a hyphen cannot be the first or last character.
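The naming rules above can be expressed as a single regular expression. The following sketch checks a candidate name against the limits as described in this chapter; since portal validation rules can change, confirm them against the current Azure documentation before relying on this:

```shell
#!/usr/bin/env bash
# Sketch: validate a candidate Log Analytics workspace name.
# Rules assumed here: 4-63 characters; letters, digits, and hyphens only;
# a hyphen cannot be the first or last character.
valid_workspace_name() {
    [[ "$1" =~ ^[A-Za-z0-9][A-Za-z0-9-]{2,61}[A-Za-z0-9]$ ]]
}

valid_workspace_name "soc-logs-prod" && echo "soc-logs-prod is valid"
valid_workspace_name "-bad-name" || echo "-bad-name is rejected"
```

Note that uniqueness across Azure cannot be checked locally; the portal or deployment command will report a conflict if the name is already taken.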
For more information on Log Analytics pricing, see https://azure.microsoft.com/en-us/pricing/details/monitor.
a) Free: There is no charge for the data being ingested, although there is a 500 MB daily limit and the data is retained for only 7 days. This tier is only suitable for lab and research purposes.
b) Standalone (Per GB): This is the same as Per GB (2018).
c) Per Node (OMS): Use this tier if the OMS E1 Suite, OMS E2 Suite, or OMS Add-On for System Center has been purchased, to take advantage of the entitlements included with those purchases.
Planning your workspace before you create it is very important. Making sure to select a unique and meaningful name, the proper location to avoid egress charges, the correct resource group, and other decisions before deploying will save you frustration or complete rework later.
This section will describe how to create the Log Analytics workspace using the Azure portal website. This is a graphical representation of the PowerShell and CLI commands discussed later, and as such may be the easiest way to start working with workspaces:
a) Enter the name for the workspace.
b) Select a Subscription where this will reside.
c) Choose the Resource group for this workspace.
d) Select the Location where this workspace will reside.
e) For the Pricing tier, Per GB (2018) will automatically be selected.
The blade will look something like this:
That is all there is to creating a workspace using the Azure portal. While this is very easy to do, there may be times when you will want to perform these same actions using command-line actions, and that will be described next.
There are times when you need to be able to consistently recreate an Azure Sentinel environment. Perhaps you are just testing all the various configuration options, creating environments for many different subscriptions for an international company, or creating instances for customers. No matter the reason, if you need to create many Azure Sentinel environments that are all the same, using PowerShell or the Command-Line Interface (CLI) is a better option than doing it in the Azure portal.
When creating a new Log Analytics workspace using PowerShell in this lab, you will use an Azure Resource Management (ARM) template to perform the actual configuration. While you can create the workspace directly using either technology, using an ARM template provides additional benefits, including being able to easily recreate the workspace, using the ARM template in a DevOps workflow, or using it in Azure Blueprints.
Note
A complete discussion of ARM templates is beyond the scope of this lab, but briefly, an ARM template is a JSON file that describes what needs to be created in Azure. It contains parameters, which are the values that a user will provide to determine items such as name, location, and pricing tier. It can also have variables, which are internal values that can be used to determine other values. It will also have a list of one or more resources, which are the Azure resources to create.
Go to https://docs.microsoft.com/en-us/azure/azure-monitor/platform/template-workspace-configuration and copy the JSON text. You will be pasting this into a file that you create later.
In this example, you will be prompted for the workspace name and the location, but the pricing tier will default to pergb2018 due to the presence of a defaultValue entry. If you do not wish to have those defaults, you can either change the values shown or remove the entire defaultValue line, including the comma at the end, in which case you will be prompted for the values when executing the command.
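As a sketch of what that template looks like, the structure below is trimmed and simplified from the Microsoft template referenced above; the exact parameter names, allowed values, and apiVersion may differ in the current version, so copy the real JSON from the documentation page for the lab itself:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "workspaceName": {
      "type": "string",
      "metadata": { "description": "Name of the Log Analytics workspace (prompted at deploy time)." }
    },
    "location": {
      "type": "string",
      "metadata": { "description": "Region for the workspace (prompted at deploy time)." }
    },
    "sku": {
      "type": "string",
      "defaultValue": "pergb2018",
      "metadata": { "description": "Pricing tier; defaults to Per GB (2018) because of defaultValue." }
    }
  },
  "resources": [
    {
      "type": "Microsoft.OperationalInsights/workspaces",
      "apiVersion": "2020-03-01-preview",
      "name": "[parameters('workspaceName')]",
      "location": "[parameters('location')]",
      "properties": {
        "sku": { "name": "[parameters('sku')]" }
      }
    }
  ]
}
```

Notice that workspaceName and location have no defaultValue, which is why the deployment prompts for them, while sku does not prompt because its defaultValue is present.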
JSON is just text, so you can view it with a program such as Notepad; however, we recommend using an editor such as Visual Studio or Visual Studio Code, which provides features such as color coding and command suggestions. We will be using a version of Visual Studio Code in the Azure portal for this lab.
In your Azure portal, click on the Cloud Shell icon in the top right-hand corner of the screen:
If prompted, select the subscription you are currently using for Azure Sentinel and then click Create Storage. This will only have to be done once, so if you have used the Cloud Shell before you will not be prompted for this.
At the bottom of the screen will be the Cloud Shell, which should look like the following screenshot. The text may not match exactly what is shown:
If in the top left-hand corner it says Bash rather than PowerShell, use the dropdown to change it to PowerShell, as that will have the command we need for this lab.
Once the Cloud Shell has finished loading, enter the following:
code deployworkspacetemplate.json
This will start a version of Visual Studio Code that you can use in the Cloud Shell. Paste in the JSON template you copied earlier. On the right side of the screen, click on the context menu and click Save. You can then either click the X to close the editor or click on the context menu and select Close Editor:
Note
If you want to run this on your own computer rather than via the Azure portal, go to https://docs.microsoft.com/en-us/powershell/azure/install-az-ps?view=azps-3.2.0 to learn how to install the Azure PowerShell module, or https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest to install the Azure CLI module.
That is how you can use the Cloud Shell editor to create a new Log Analytics ARM template. This is the external file that we will be using in the following sections to create the new workspace.
PowerShell is a scripting language, built on top of .NET, that can be used across various machines, including Windows and Linux computers. Because of this, it can accept .NET objects, making it incredibly powerful. PowerShell has many different commands, including some created specifically for working with Azure, which will be used here.
Note
You may not have the Azure PowerShell module loaded on your local computer. To install it, follow the instructions located at https://docs.microsoft.com/en-us/powershell/azure/install-az-ps?view=azps-3.5.0
Follow these steps to create a new workspace using PowerShell:
New-AzResourceGroup -Name <resource-group-name> -Location <location>
Replace <resource-group-name> with the name of the new resource group and <location> with the location where the resource group will reside, such as EastUS or WestUS. If you do not know what to use for your location, run this command:
Get-AzLocation
Find the location you want and use the value listed under Location.
New-AzResourceGroupDeployment -Name <deployment-name> -ResourceGroupName <resource-group-name> -TemplateFile $HOME/deployworkspacetemplate.json
Replace <deployment-name> with the name of this deployment. You can use something like <labworkspace>. It is not important what you enter, as this is just a placeholder name so that if you look at the resource group you can see the various deployments. Replace <resource-group-name> with the name of the resource group where the workspace will reside.
If you get an error screen, read the message, as the messages are usually quite specific as to what caused the error.
That is how you can create a new Log Analytics workspace using an ARM template and PowerShell. This can be preferable to using the Azure portal as it is repeatable. Next, we will look at using the Azure CLI and see how to create a new workspace without using the ARM template.
The Azure CLI is also a cross-platform scripting tool developed by Microsoft. Initially, it was the only cross-platform option, so if you were working on a computer that was not running Windows, it was your only choice. PowerShell is now cross-platform as well; one notable difference is that the Azure CLI can create a Log Analytics workspace directly with a single command, without using an ARM template.
Note
The following steps describe how to run the CLI from the Azure portal. If you want to run this on your local computer, you will need to make sure you have the CLI installed. Go to https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest for instructions on how to perform the installation:
az group create --name <resource-group-name> --location <location>
az account list-locations
az group deployment create --resource-group <my-resource-group> --name <my-deployment-name> --template-file deployworkspacetemplate.json
You will be prompted for the Log Analytics workspace name as well. Enter a valid name and press Enter to continue.
Once the command has finished running, it will show either the JSON values for this workspace as shown in the following screenshot or an error message. Note that not all the JSON is shown for brevity:
Note that as stated earlier, you can use the Azure CLI to create the Log Analytics workspace directly using the az monitor log-analytics workspace create command. Go to https://docs.microsoft.com/en-us/cli/azure/monitor/log-analytics/workspace?view=azure-cli-latest#az-monitor-log-analytics-workspace-create for more information on this command.
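As a sketch of that direct approach (the resource group and workspace names below are placeholders, and the command requires an authenticated session via az login):

```shell
# Hypothetical names; adjust resource group, workspace name, and location
az monitor log-analytics workspace create \
    --resource-group my-sentinel-rg \
    --workspace-name my-sentinel-workspace \
    --location eastus
```

This trades the repeatability benefits of an ARM template for a quicker one-off deployment, so it is best suited to labs and experiments.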
No matter how you created your Log Analytics workspace, the rest of the work in this lab will be done using the Azure portal:
Note that this is just a partial screen shown due to the amount of information on this page.
a) Resource group: The resource group where the workspace resides. Selecting [change] will allow you to move to another resource group.
b) Status: The status of the workspace should show Active.
c) Location: The Azure location where the workspace resides.
d) Subscription name: The subscription this resource is associated with. Selecting [change] will allow you to move to another subscription.
e) Subscription ID: The unique GUID for the preceding subscription, which is useful when calling Microsoft for technical support.
f) Workspace name: The name of the Log Analytics workspace.
g) Workspace ID: The GUID for the workspace, which is also useful when calling Microsoft for technical support.
h) Pricing tier: The pricing tier for the workspace.
i) Management services: View the activity log for the workspace.
j) Access control mode: How users are granted permission to access the information in this workspace. See the following section for more information.
The previous sections described the various ways that you can create a new Log Analytics workspace to use with Azure Sentinel. This can be done either through the Azure portal or programmatically using either PowerShell or CLI commands. Once the workspace has been created, we next need to ensure access is restricted to only those users that need to access it.
Before we connect and store data in the workspace and enable Azure Sentinel to carry out analytics on the data, let’s review the options to secure access to this new resource. Azure provides three main levels of access to resources:
These permissions can be granted at four different levels:
Table Level RBAC
While there is no user interface available to set permissions on individual tables within the log, you can create Azure custom roles to set these permissions. See https://docs.microsoft.com/en-us/azure/azure-monitor/platform/manage-access#table-level-rbac for more information on how to do this.
Permissions can be applied using built-in roles, or you can make a custom role for specific access if you need to be more granular. To make this simpler, there are several built-in user roles we recommend you use to manage access to Log Analytics for the purpose of using Azure Sentinel, and we recommend you apply these to the specific resource group used for Azure Sentinel:
a) Azure Sentinel Contributor: Provides the ability to create and edit dashboards, analytics rules, and other Azure Sentinel resources
b) Log Analytics Reader: Provides read-only visibility to all Azure resources and Azure Sentinel logs
a) Azure Sentinel Responder: Provides the ability to manage incidents, view data, workbooks, and other Azure Sentinel resources
b) Log Analytics Reader: Provides read-only visibility to all Azure resources and Azure Sentinel logs
If additional permissions are required, keep to the idea of providing the minimal permissions and applying only to the specific resources required. It may take some trial and error to get the right outcome, but it is a safer option than providing broad and excessive permissions. For further information, please take a look at the following article: https://docs.microsoft.com/en-us/azure/azure-monitor/platform/manage-access.
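As a sketch, assigning one of these built-in roles at the resource group level with the Azure CLI might look like the following (the user and resource group names are placeholders):

```shell
# Grant a SOC analyst the Azure Sentinel Responder role, scoped to the
# resource group that holds the Sentinel workspace (placeholder names)
az role assignment create \
    --assignee "analyst@example.com" \
    --role "Azure Sentinel Responder" \
    --resource-group my-sentinel-rg
```

Scoping the assignment to the resource group, rather than the subscription, follows the least-privilege recommendation above.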
Once you have created a Log Analytics workspace that you want to use with Azure Sentinel, it is very easy to attach it to Azure Sentinel:
Congratulations! You now have your Azure Sentinel environment created and ready to go. The News & guides page is where you land immediately after attaching a Log Analytics workspace to Azure Sentinel; if you leave Azure Sentinel and come back later, you will be taken instead to the Azure Sentinel Overview page, which is described next.
The Azure Sentinel Overview page is the page that you will automatically go to when entering Azure Sentinel after you have associated the Log Analytics workspace with it. This page provides a general overview of the information in your Azure Sentinel environment and will look like the following screenshot. The actual numbers and data being shown will vary depending on your environment, of course:
The page is broken up into various sections, each of which is described next.
The header bar allows you to refresh the screen to see any updates, as well as to select how far back in time to look for the data. You can select the icon that looks like a clock to change how far back you want to look.
The summary bar will show you how much data has been ingested in the selected time period as well as how many alerts were raised, and the number of incidents those alerts created. In addition, the incidents are broken down by their status.
This section will show the logs that have ingested the most data and the number of incidents created in the selected time frame. This is an interactive chart, so when you mouse over a specific time, the information will be filtered to show what happened at that time.
This section will show up to the last five created incidents as well as the number of alerts that have generated the incident. You can click on the incident name to get more information about the incident.
This section will show up to two different data sources that Azure Sentinel’s Machine Learning has determined contain anomalies. You can click on the log name to get more information about the anomaly.
This section, not shown, will show an interactive map where any potential malicious events will be highlighted. You can zoom into the map to get a very precise indication of where the event occurred.
This section, not shown, provides some general information on Azure Sentinel’s use of Machine Learning and provides a link where you can obtain more information.
That is the Azure Sentinel Overview page. It is a great place to go to get an overview of what is going on in your Azure Sentinel environment and is the landing page of Azure Sentinel. While Figure 2.14 shows lots of data, when you first create a Log Analytics workspace, it will be empty. The next section will explain how to start getting data into your workspace.
Before we dig into the details of the Azure Sentinel data connectors (see Chapter 3, Data Collection and Management), we will review how Log Analytics enables connectivity to a range of different sources in order to receive data to store and analyze. Some of the data source options include the following:
In this section, we will show you how you can enable log collection from Azure virtual machines.
To have the virtual machines (VMs) populate a Log Analytics workspace, they need to be connected to it. This is done from the Log Analytics workspace Overview page.
There are two different ways to get to this page. First, you can select Log Analytics in the Azure portal navigation menu and then select the appropriate workspace. The second, and perhaps easier, way is to select Settings from the Azure Sentinel navigation menu and then select Workspace settings from the menus at the top of the page, as shown in the following screenshot:
No matter which method you use to get to the page, it will look similar to the following screenshot:
Under Connect a data source in the Get started with Log Analytics section, select Azure virtual machines (VMs). This will take you to the Virtual machines page, which lists each VM and shows whether it is connected, as well as the OS, subscription GUID, the resource group, and the location it belongs to. The following screenshot is an example of what this page looks like:
You can see that the first three VMs are connected to this workspace, the fourth one, called LinuxWebServer, is connected to another workspace, and the final one, ThreatHuntDemo, is not connected to any workspace.
To change the connection status of any of the VMs, click on the row containing it. This will open a new blade, where you can either connect or disconnect the VM:
Select either the Disconnect or Connect link to perform the action you desire.
Connecting a VM to a Log Analytics workspace downloads and installs the Microsoft Monitoring Agent on the VM, so this step can be performed automatically when provisioning the VM using tools such as PowerShell Desired State Configuration. However, the actual steps to perform this task are beyond the scope of this book.
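One way to automate the agent installation on an Azure Windows VM is the VM extension mechanism. The following Azure CLI sketch uses placeholder names, and the workspace ID and key placeholders must be filled in from the workspace's settings (Linux VMs use a different extension, and the exact extension versions and settings should be confirmed against the current documentation):

```shell
# Install the Microsoft Monitoring Agent extension on an existing
# Windows VM and point it at a workspace (all names are placeholders)
az vm extension set \
    --resource-group my-rg \
    --vm-name my-windows-vm \
    --name MicrosoftMonitoringAgent \
    --publisher Microsoft.EnterpriseCloud.Monitoring \
    --settings '{"workspaceId": "<workspace-id>"}' \
    --protected-settings '{"workspaceKey": "<workspace-key>"}'
```

Passing the workspace key via --protected-settings keeps it out of the extension's readable configuration.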
In a large-scale deployment, especially with VMs that are not hosted in Azure, you may not want each individual server directly sending their logs to the Log Analytics workspace. Instead, you may consider deploying the Syslog/CEF connector to centralize log collection and data ingestion. Each VM would then point towards the CEF collector server instead of Log Analytics.
The advanced settings for Log Analytics allow you to perform actions such as connecting on-premises and other non-Azure Windows and Linux servers, Azure Storage, and System Center Operations Manager management groups. You can also choose what information to import from Windows and Linux servers, import IIS logs and Syslog events, and add custom logs and fields. Finally, you can create groups of computers, or use groups already created in Active Directory, Windows Server Update Services (WSUS), and SCCM, which can then be used in your queries.
To get to the Advanced settings page, follow the instructions to get the Log Analytics Overview page in the previous section and instead of selecting Azure virtual machines (VMs), select Windows, Linux and other sources. This will open a new page as shown in the following screenshot:
As you can see, there are various menu options that will allow you to connect to various servers, determine what data to ingest, and help with querying the data that these servers provide. Each one will be discussed in the next section.
This area allows you to attach non-Azure Windows and Linux servers, Azure Storage, and System Center Operations Manager:
Note
While this can be used to connect Azure VMs, it is far easier to use the steps in the previous section to do so.
The Connected Sources area shows you how to connect on-premises servers as well as Azure Storage and System Center Operations Manager management groups. Next, we will look at the Data menu, which tells Azure Sentinel what information from those servers to ingest.
This area allows you to determine which data from connected servers will be imported. Selecting the Data option will show you the following page:
Let’s take a look at the different fields under the Data option:
As you can see, there are a lot of ways you can configure the data to import. This will always be a tradeoff between what data you need or want and the cost of ingesting and storing the data. In the next section, we will look at Computer Groups, which can help you with your queries.
This section will show all the custom computer groups that have been created and provide a way to create your own. These groups can then be used in queries to reference a specific set of servers, and the group membership can be changed without having to change the query itself.
Selecting the Computer Groups option will show you the following screen:
Let’s discuss the different fields under Computer Groups:
There are various ways to create computer groups to help you with your queries. Each of these will be discussed in more detail in the following sections.
Adding a computer group using a query involves running a query that returns a list of computers and then saving that result as a computer group:
Heartbeat
| where TimeGenerated > ago(30m)
| distinct Computer
Don’t worry about what the query means; it will be explained in Chapter 5, Using the Kusto Query Language (KQL). For now, know that it returns a list of all the computers that have sent a heartbeat to the workspace in the last 30 minutes. Note that you will need to have a server connected to the workspace (see the Obtaining information from Azure Virtual Machines and Connected Sources sections) to get any results from this query.
This will bring up a new blade where you can enter the query information.
When you go back to the Saved Groups page, you will see your saved group, which will look similar to what is shown in the following screenshot:
To use a saved group, enter a query like this:
Perf
| where Computer in (BookQuery)
Replace BookQuery with the name of the group you just created. Again, do not worry about what this query means; it is just an example of how to use a saved group. It will make more sense after reading Chapter 5, Using the Kusto Query Language (KQL).
In this chapter, we explored the Azure Monitor Log Analytics solution, including how to create a new workspace using the Azure portal, PowerShell, or CLI, and how to configure the security options to ensure each user has the appropriate level of access. We also looked at how to connect a data source and configure some of the advanced settings. This information is very useful when you need to first configure Azure Sentinel, and in future if you need to make any changes to the Log Analytics platform supporting your operations and business needs.
In the next chapter, we will look at how to select data that is most useful for security threat hunting, which connectors to use to gather the data from any system, and the options available to enable long-term data retention while keeping costs under control.
Answer these questions to test your knowledge of this chapter: