15. Building Scripts

_______________________________

In This Chapter

How To Begin

Documentation of SCC

PowerShell and Change

Script Building Summary

_______________________________

The previous two chapters covered quite a few PowerShell topics, with some Exchange Server topics sprinkled into the mix. With those basics in place, let's see how they can be combined in a PowerShell script and begin building your own scripts. This chapter covers what to start with, how to add to and enhance a script, how to test it thoroughly, and finally how to transition it into production against your SCC tenant.

How to Begin

When building a script, it is advisable to have a clear goal of what is to be accomplished. Programmers have used many methods to build scripts over the years; the key to the method used here is that we need a beginning and an end. It requires a seed, or first cmdlet, to start with, and from there we aim for the end goal, what could be called the purpose of the script. Let's start with a real-life example.

To build a complete script, keep the process ordered and complete the task at hand, a series of steps can provide a useful guide. Provided below is a series of suggested steps for creating a PowerShell script, with a skeleton example after the list showing how the steps fit together.

  • Seed to start the script - usually a core concept with a corresponding PowerShell cmdlet
  • Look for samples on the Internet to save time – code blocks, one-liners, routines and usage
  • Loops if needed to perform iterations (Foreach, Do...While, etc.) over objects in the SCC
  • Define arrays if needed for the loops or other parts of the scripts
  • Functions if a process is repeatable or needs to be called on from multiple parts of a script
  • Export the results
  • Build in some error checking or fail safes
  • Commenting - top of the script - detailed description
  • Commenting - document the script
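As a rough illustration of how these steps fit together, here is a minimal skeleton; the cmdlet, file name and logic are placeholders rather than a finished script:

# <Script name> - <detailed description of purpose, author, date>

# Seed: output location plus the core cmdlet to start from
$Path = (Get-Item -Path ".").FullName
$Destination = $Path+'\'+'Report.txt'

# Function: a repeatable routine that can be called from anywhere in the script
Function Write-Log ($Text) {
    $Text | Out-File $Destination -Append
}

# Loop: iterate over the objects returned by the seed cmdlet
Foreach ($RoleGroup in (Get-RoleGroup)) {
    # Fail safe: skip anything unexpected
    If ($Null -eq $RoleGroup.Name) { Continue }
    Write-Log $RoleGroup.Name
}

# Results were exported (appended) to $Destination as the loop ran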

    TIP

On the first run of any new script, either run the script in a test tenant or comment out all PowerShell cmdlets that make changes so they cannot execute. Alternatively, you can leverage the WhatIf switch to see what a cmdlet would do. Keep in mind, however, that trailing code can react as if the cmdlet failed, since the cmdlet did not actually run.
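For example, a change-making cmdlet can be dry-run with WhatIf before it is trusted; the label name and parameter below are purely illustrative, assuming the cmdlet supports the standard WhatIf switch:

# Dry run - reports what would happen without making the change
Set-Label -Identity 'Confidential' -Comment 'Updated by script' -WhatIf

# During testing, the real change can stay commented out:
# Set-Label -Identity 'Confidential' -Comment 'Updated by script'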

Documentation of SCC

This book is built on practical ways to use PowerShell, and building scripts in a practical manner is part of that overall goal. Imagine for this scenario that you are the administrator of a parent company working to document the Office 365 environment of a recently acquired company. You want a script that produces a report of settings, compliance-related items and more, to be used in assessing next steps for the acquisition. Important items to consider include the Labels in use, any custom Sensitive Information Types and more. The reports will be used by the IT department as well as company executives.

Some of this information is confidential, or at least compliance-related, so remember to store the exported data on a secure drive. Additionally, not every setting or report is exportable via PowerShell, or at least not solely through the SCC PowerShell module. Secure Score reports, for example, are accessed either at https://protection.office.com or via the Microsoft Graph PowerShell module - https://blogs.technet.microsoft.com/cloudlojik/2017/09/05/using-powershell-to-connect-to-microsoft-graph-api/.

Coding the Script

In the Security and Compliance Center there are a lot of items that can be configured for an Office 365 tenant. From Labels, to Policies, to Keyword Dictionaries and assigned Role Groups, a documentation script has quite some territory to cover. To accomplish this we will need a few things. First, we need to connect to the Security and Compliance Center with an account that has appropriate access to the various features. At a minimum we want a Global Admin for the tenant, but we may also want the eDiscovery Administrator role for access to Compliance Cases.
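The connection itself uses the same remote session approach shown later in this chapter; a minimal sketch, assuming Basic authentication is available for the tenant and the account prompted for is a Global Admin, looks like this:

# Prompt for a Global Admin credential
$LiveCred = Get-Credential

# Connect to the Security and Compliance Center endpoint
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.compliance.protection.outlook.com/powershell-liveid/ -Credential $LiveCred -Authentication Basic -AllowRedirection
Import-PSSession $Session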

Where to Start

We need a log file and a directory to export files to as part of this process. To help with automation, we'll use the current path as the location for all files and store it in a $Path variable. For the general log file, a variable called $File will store the filename:

$Path = (Get-Item -Path "." -Verbose).FullName

$File = 'SecurityComplianceCenter-Documentation.txt'

$Destination = $Path+'\'+$File

In addition to the general log file, we will create exports of configuration data as needed, so the $Path variable will be valuable later in the script for specifying a destination for exported files. With the variables set, we can move on to documenting actual configuration data in the SCC, starting with a security export that shows who has been assigned to which roles. How do we do this? As covered in the Security chapter, we need to look at Role Groups, each of which can have members assigned to it in the SCC. One exception is the eDiscovery Case Administrator, as this role has its own dedicated PowerShell cmdlets; we will gather that in an additional code section.

$RoleGroups = (Get-RoleGroup).Name

Foreach ($RoleGroup in $RoleGroups){

$Members = Get-RoleGroupMember -Identity $RoleGroup

$Line = $RoleGroup | Out-File $Destination -Append

$Line = '-----------------------------' | Out-File $Destination -Append

$Members | Out-File $Destination -Append

$Line = '' | Out-File $Destination -Append

}

Sample output from this section:

We need to add the eDiscovery Case Admin (these lines are added after the above loop):

$Line = ' ' | Out-File $Destination -Append

$Line = 'eDiscovery Case Admin' | Out-File $Destination -Append

$Line = '---------------------------' | Out-File $Destination -Append

Get-eDiscoveryCaseAdmin | Out-File $Destination -Append

$Line = ' ' | Out-File $Destination -Append

Note that every line of PowerShell that exports to the $Destination file ends with '-Append' so that existing content in the file is not overwritten. Also note that blank lines are deliberately written out for formatting, as we want the destination log file to be readable.
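Because every line appends, re-running the script keeps adding to the same file. An optional guard near the top of the script (not part of the original walkthrough) keeps each run clean:

# Optional: start each run with a fresh log file
If (Test-Path $Destination) {
    Remove-Item $Destination
}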

Next we can tackle Labels, Label Policies and Sensitive Information Types. As we get into output that could be wider than a single screen, we should consider a way to widen the output placed in the $Destination file. We can do this by adjusting our buffers for the PowerShell window like so:

$Host.UI.RawUI.BufferSize = New-Object Management.Automation.Host.Size (500, 9999)

This widens the buffer to 500 columns and 9,999 lines, so wide table output is not truncated when written to the file. Now we can go back to exporting Labels and Label Policies:

$Line = 'Labels on the Security and Compliance Center' | Out-File $Destination -Append

$Labels = Get-Label | Ft Name,Workload,Settings,LocalSettings,ToolTip,Comment -Auto | Out-File $Destination -Append

$Line = ' ' | Out-File $Destination -Append

We will take a similar approach to Label Policies:

$Line = 'Label Policies in the Security and Compliance Center' | Out-File $Destination -Append
$LabelPolicies = Get-LabelPolicy | Ft Name,Type,Settings,Labels,WorkLoad,Comment -Auto | Out-File $Destination -Append

$Line = ' ' | Out-File $Destination -Append

Next, for Sensitive Information Types, there are a few things to process. First, export any custom types, meaning those not published by Microsoft:

Get-DlpSensitiveInformationType | Where {$_.Publisher -ne 'Microsoft Corporation'} | Ft Name, Publisher, ID, Description, RecommendedConfidence, RulePackID | Out-File $Destination -Append

We can also export the Rules Packages that are associated with these Sensitive Information Types:

$DLPRulePackIDs = (Get-DlpSensitiveInformationType | Where {$_.Publisher -ne 'Microsoft Corporation'}).RulePackID

Foreach ($DLPRulePackID in $DLPRulePackIDs) {

# Define Output File:

$File = 'DlpSensitiveInformationTypeRulePackage-'+$DLPRulePackID+'.xml'

$XMLDestination = $Path+'\'+$File

# Pull current Rules Package:

$RulesPackage = Get-DlpSensitiveInformationTypeRulePackage $DLPRulePackID

# Export Rules Collections to XML Files:

Set-Content -Path $XMLDestination -Encoding Byte -Value $RulesPackage.SerializedClassificationRuleCollection

}

Sample XML File Result:

In addition to the normal Sensitive Information Types, we also need to gather information on Exact Data Match (EDM) schemas and Keyword Dictionaries. In each section we export our findings to a destination file for examination or documentation:

$Line = 'DLP EDM Schemas' | Out-File $Destination -Append

$Line = '---------------' | Out-File $Destination -Append

$DLPEDMSchemas = Get-DlpEdmSchema

$DLPEDMSchemas | Select Name, DataStoreName, Description, GUID, IsValid | Out-File $Destination -Append

Foreach ($DLPEdmSchema in $DLPEdmSchemas) {

# Variables

$EdmSchemaXML = $DLPEdmSchema.EdmSchemaXML

$Name = $DLPEdmSchema.Name

# Define Output File:

$File = "DlpEDMSchema-$Name.xml"

$XMLDestination = $Path+'\'+$File

# Output to XML File:

$EdmSchemaXML | Out-File $XMLDestination

}

Next, documenting Keyword Dictionaries:

$Line = 'DLP Keyword Dictionaries' | Out-File $Destination -Append

$Line = '------------------------' | Out-File $Destination -Append

$Line = Get-DlpKeywordDictionary | Select Name, IsValid, Description, Identity, KeywordDictionary | Out-File $Destination -Append

Next we will cover Information Barriers, as they are only accessible via PowerShell. First we document the general application status of Information Barrier policies in the Security and Compliance Center:

$Line = 'Information Barrier Policy Application Status' | Out-File $Destination -Append

$Line = '-----------------------------------------------' | Out-File $Destination -Append

$Line = Get-InformationBarrierPoliciesApplicationStatus | Out-File $Destination -Append

$Line = 'Information Barrier Policies' | Out-File $Destination -Append

$Line = '----------------------------' | Out-File $Destination -Append

$Line = Get-InformationBarrierPolicy | Ft Name,Type,AssignedSegment,SegmentsAllowed,SegmentsBlocked,SegmentsAllowedFilter,BlockVisibility,BlockCommunication,State,CreatedBy,CreationTimeUTC -AutoSize | Out-File $Destination -Append

$Line = ' ' | Out-File $Destination -Append

Lastly, we will also document the Organization Segments in use:

$Line = 'Organization Segment(s)' | Out-File $Destination -Append

$Line = '----------------------------' | Out-File $Destination -Append

$Line = Get-OrganizationSegment | Ft Name, Type, UserGroupFilter, ObjectClass, CreatedBy | Out-File $Destination -Append

Next we can move on to Data Loss Prevention (DLP) items in the Security and Compliance Center:

$Line = 'DLP Compliance Policies' | Out-File $Destination -Append

$Line = '-----------------------' | Out-File $Destination -Append

$DLPCompliancePolicies = Get-DlpCompliancePolicy | Select Name, Type, Mode, Enabled, Comment, Workload, ExchangeLocation, SharePointLocation, SharePointLocationException, OneDriveLocation, OneDriveLocationException, ExchangeOnPremisesLocation, SharePointOnPremisesLocation, SharePointOnPremisesLocationException, TeamsLocation, TeamsLocationException, ExchangeSender, ExchangeSenderMemberOf, ExchangeSenderException, ExchangeSenderMemberOfException

Foreach ($DLPCompliancePolicy in $DLPCompliancePolicies) {

$DLPCompliancePolicy | Ft Name,Type,Mode,Enabled,Comment,Workload | Out-File $Destination -Append

$DLPCompliancePolicy | Fl ExchangeLocation, SharePointLocation, SharePointLocationException, OneDriveLocation, OneDriveLocationException, ExchangeOnPremisesLocation, SharePointOnPremisesLocation, SharePointOnPremisesLocationException, TeamsLocation, TeamsLocationException, ExchangeSender, ExchangeSenderMemberOf, ExchangeSenderException, ExchangeSenderMemberOfException | Out-File $Destination -Append

}

For DLP Compliance Rules, we will create a summary table as well as a full details export to the same file:

$Line = 'DLP Compliance Rules' | Out-File $Destination -Append

$Line = '--------------------' | Out-File $Destination -Append

$DLPComplianceRules = Get-DLPComplianceRule

Foreach ($DLPComplianceRule in $DLPComplianceRules) {

$DLPComplianceRule | Ft Name, Mode,Disabled,Workload,Policy,AccessScope -Auto |Out-File $Destination -Append

$DLPComplianceRule | Fl | Out-File $Destination -Append

}

For the last part of the script we will pull some more Compliance information like cases, holds and more.

Compliance Cases:

$Line = 'Compliance Cases' | Out-File $Destination -Append

$Line = '----------------' | Out-File $Destination -Append

$ComplianceCases = Get-ComplianceCase

$ComplianceCases | Ft Name,Identity,CaseType,Status,Description -Auto | Out-File $Destination -Append

$Line = '' | Out-File $Destination -Append

Foreach ($ComplianceCase in $ComplianceCases){

$Name = $ComplianceCase.Name

$Line = "Compliance Case Members [ $Name ]" | Out-File $Destination -Append

$Line = Get-ComplianceCaseMember -Case $Name | Out-File $Destination -Append

$Line = '' | Out-File $Destination -Append

}

Compliance Searches:

$Line = 'Compliance Searches' | Out-File $Destination -Append

$Line = '-------------------' | Out-File $Destination -Append

$ComplianceSearches = Get-ComplianceSearch

$ComplianceSearches | Ft Name, SearchType, Description, ContentMatchQuery -Auto | Out-File $Destination -Append

$Line = '' | Out-File $Destination -Append

$Line = 'Detailed Compliance Search Info' | Out-File $Destination -Append

$ComplianceSearches | Fl | Out-File $Destination -Append

Compliance Search Actions:

$Line = 'Compliance Search Action' | Out-File $Destination -Append

$Line = '------------------------' | Out-File $Destination -Append

$ComplianceSearchActions = Get-ComplianceSearchAction

$ComplianceSearchActions | Ft Name,Status,Action,SearchName,Results -Auto | Out-File $Destination -Append

$Line = '' | Out-File $Destination -Append

$Line = 'Detailed Compliance Search Action Info' | Out-File $Destination -Append

$ComplianceSearchActions | Fl | Out-File $Destination -Append

Retention Compliance Policies:

$Line = 'Retention Compliance Policies' | Out-File $Destination -Append

$Line = '-----------------------------' | Out-File $Destination -Append

$RetentionCompliancePolicies = Get-RetentionCompliancePolicy

$RetentionCompliancePolicies | Ft Name,Enabled,DistributionStatus,TeamsPolicy,Comment -Auto | Out-File $Destination -Append

$Line = '' | Out-File $Destination -Append

$Line = 'Detailed Retention Compliance Policies' | Out-File $Destination -Append

$RetentionCompliancePolicies | Fl | Out-File $Destination -Append

Retention Compliance Rules:

$Line = 'Retention Compliance Rules' | Out-File $Destination -Append

$Line = '--------------------------' | Out-File $Destination -Append

$RetentionComplianceRules = Get-RetentionComplianceRule

$RetentionComplianceRules | Ft Name, Mode, Disabled, ContentMatchQuery, RetentionComplianceAction, Workload -Auto | Out-File $Destination -Append

$Line = '' | Out-File $Destination -Append

$Line = 'Detailed Retention Compliance Rules' | Out-File $Destination -Append

$RetentionComplianceRules | Fl | Out-File $Destination -Append

Last, the Teams-specific Retention Compliance Policies and Rules:

$Line = 'Detailed Teams Retention Compliance Policies' | Out-File $Destination -Append

$Line = '--------------------------------------------' | Out-File $Destination -Append

$TeamsRetentionCompliancePolicies = Get-TeamsRetentionCompliancePolicy | Fl | Out-File $Destination -Append

$Line = 'Detailed Teams Retention Compliance Rules' | Out-File $Destination -Append

$Line = '-----------------------------------------' | Out-File $Destination -Append

$TeamsRetentionComplianceRules = Get-TeamsRetentionComplianceRule | Fl | Out-File $Destination -Append

PowerShell and Change

Before ending this chapter on the basics of building a script, some thought should be given to the longevity of your scripts. PowerShell cmdlets change as features are added, removed and deprecated.

Change is a constant at Microsoft. By the time you read this, Microsoft will probably have already added new cmdlets to the Security and Compliance Center. Other workloads in Office 365 change every month, week and day. PowerShell changes less often, but the result is no different: new additions are made, and old cmdlets are deprecated and eventually removed.

Microsoft has also enhanced newer options like Cloud Shell to include ways to work with objects in Azure and Exchange Online, and it is within the realm of possibility that Cloud Shell will eventually work with all Office 365 workloads, giving an administrator a single avenue for managing a tenant. As scripts are built, some effort may be required to make sure the cmdlets being used are not being deprecated. Deprecated cmdlets can be spotted in a few ways; the simplest is to run the cmdlet in PowerShell, where a deprecation notice appears as a yellow warning message.
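Since those notices arrive on the warning stream, they can also be captured programmatically; a small sketch, where Get-RoleGroup is just a stand-in for whatever cmdlet is being checked and the output path is arbitrary:

# Capture any warning text (including deprecation notices) into a variable
Get-RoleGroup -WarningVariable DeprecationNote -WarningAction SilentlyContinue | Out-Null

If ($DeprecationNote) { $DeprecationNote | Out-File '.\DeprecationNotes.txt' -Append }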

In every Office 365 workload, change comes quickly. PowerShell cmdlets are added and removed, often with no notification at all, usually alongside features being added or removed. How do we know when the cmdlets change? One way to keep track is to connect to the Security and Compliance Center, log the number of available cmdlets and export a list of the cmdlets to a text file. Such a script could be run manually each day or on a schedule.

Example: Historical charts that map out the number of PowerShell cmdlets available for the SCC and Skype Online. Notice that the SCC count increased from 125 cmdlets in 2017 to almost 250 in 2019.

Coding the Script

Just like the first script in this chapter, we will need to initiate a connection to the Security and Compliance Center. Again, we create the credentials (stored in a secure password file) to be used for the connection:

$Username = "[email protected]"

$Password = Cat C:\SecureString-MyTenant.txt | ConvertTo-SecureString

$LiveCred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $Username, $Password
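The secure password file referenced above can be created once, ahead of time; a sketch using the same path as the script (the encrypted string can only be decrypted by the same user account on the same machine):

# One-time setup: store the admin password as an encrypted string
Read-Host -Prompt 'Enter password' -AsSecureString | ConvertFrom-SecureString | Out-File C:\SecureString-MyTenant.txt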

Since we want to keep track of the cmdlets over time, we need a place to drop the results files. For the sake of this exercise we create a file structure like so, to keep the results organized:

C:\CmdletCheck - Root directory to store lists of cmdlets

C:\CmdletCheck\Historical - Store historical data

The first step is to read in the existing CSV file that holds previous results. If this is the first time the script has executed, we can seed the data with the current date and a '0' to reflect the fact that there are no known cmdlets yet.

# Read in CSV file for comparison

$CSV = Import-CSV 'C:\CmdletCheck\Historical\CurrentChart.csv'
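If the file does not exist yet, a guard placed just before the Import-CSV line can seed it as described above; a sketch using the file structure defined earlier:

# First run only: seed the comparison chart and the dated history file
If (-not (Test-Path 'C:\CmdletCheck\Historical\CurrentChart.csv')) {
    'SecurityAndCompliance' | Out-File 'C:\CmdletCheck\Historical\CurrentChart.csv'
    '0' | Add-Content 'C:\CmdletCheck\Historical\CurrentChart.csv'
    "$(Get-Date -Format 'MM.dd.yyyy-hh.mm-tt'),0" | Add-Content 'C:\CmdletCheck\Historical\FullChart.csv'
}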

Next, we store the previously recorded number of cmdlets in a variable called $SCCNum, populated from the SecurityAndCompliance column of the CSV file:

Foreach ($Line in $CSV) {

$SCCNum = $Line.SecurityAndCompliance

}

The CSV being utilized for the step above has the following format:

Now that we have that out of the way, it’s time to connect to Office 365 and the SCC specifically. To do so, we need to use the New-PSSession cmdlet. What can we do with this cmdlet? Let’s review the Get-Help:

Get-Help New-PsSession -Examples

One of the included examples comes close to what we need; in fact, we need a couple of additional parameters:

Authentication - Specifies the mechanism used to authenticate the user's credentials. Office 365 uses 'Basic' as the default authentication for this endpoint.

AllowRedirection - Allows redirection of this connection to an alternate Uniform Resource Identifier (URI). This is required for connecting to Office 365 because the session needs to be redirected to the appropriate resource in the cloud.

We will store the PowerShell session in a variable called $Session so it can be passed to the 'Import-PSSession' cmdlet, which brings the remote cmdlets into our local session:

Putting this together we get these two cmdlets to initiate the session:

$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.compliance.protection.outlook.com/powershell-liveid/ -Credential $LiveCred -Authentication Basic -AllowRedirection

Import-PSSession $Session

After the session is connected we need to query for cmdlets that are related to the Security and Compliance Center. How can we filter for these cmdlets? Well, let’s see what cmdlets are available once we make our connection:

Get-Command

When that is typed in, a seemingly endless stream of cmdlets is displayed. At the time of this writing the total was 5,338 cmdlets. How do we get an accurate count?

(Get-Command).Count

The parentheses and the '.Count' property allow PowerShell to count the number of items that would be displayed if the entire list were shown. The same method works against almost any variable or cmdlet that returns a list, as the quick example below shows.
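A couple of illustrations of the same counting pattern, using objects already seen in this chapter:

# Count the results of a cmdlet directly
(Get-RoleGroup).Count

# Count the contents of a variable
($RoleGroups).Count

Looking at the full Get-Command output, we also see that a number of modules are represented: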

Get-Command | Select-Object ModuleName -Unique

Well, that's great, but that provides 148 module names, and none of them includes 'Security and Compliance Center' in the name. We need to take another approach. We know that a cmdlet like 'Get-DLP*' is an SCC cmdlet, so maybe we can get the module name from that cmdlet:

Get-Command Get-DLPCompliancePolicy

As can be seen from the Source column, the cmdlet lives in a temporary module whose name is generated for the session. This means that every time we connect a PowerShell session to the Security and Compliance Center, the module name for its cmdlets will be different. The first thing to do, then, is query all modules:

$ModuleName = (Get-Module).Name

Populate a few variables needed for the file name and for the notification email:

$Date = Get-Date -Format "MM.dd.yyyy-hh.mm-tt"

$Service = 'SecurityCompliance'

$EmailDate = Get-Date -Format "MM.dd.yyyy"

$SMTPServer = <mail server ip address>

To pinpoint the module for the Security and Compliance Center Cmdlets we will check each Module until we find the Module beginning with ‘tmp’:

Foreach ($Name in $ModuleName) {

If ($Name -Like "tmp*") {

If the Module matches, then we can get the number of cmdlets found in the Security and Compliance Center now - notice the ‘.Count’ at the end:

$NewSCCNum = (Get-Command | Where {$_.ModuleName -eq $Name}).Count

Lastly, we will create a text file containing all of the cmdlets from this module. The file will be named 'SecurityAndCompliance-<date>.txt':

$Service = "SecurityAndCompliance"

Get-Command | Where {$_.ModuleName -eq $Name} | Select-Object Name > "C:\CmdletCheck\$Service-$Date.txt"
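Pulled together, the module-detection section looks roughly like this; a sketch assembled from the fragments above, with the closing braces added since the walkthrough shows the loop in pieces:

Foreach ($Name in $ModuleName) {
    If ($Name -Like "tmp*") {
        # Count the cmdlets exposed by the temporary SCC module
        $NewSCCNum = (Get-Command | Where {$_.ModuleName -eq $Name}).Count

        # Export the full cmdlet list for this run
        $Service = "SecurityAndCompliance"
        Get-Command | Where {$_.ModuleName -eq $Name} | Select-Object Name > "C:\CmdletCheck\$Service-$Date.txt"
    }
}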

Now that we've gathered the cmdlet names and a count of the cmdlets, we should clean up our PowerShell session to the Security and Compliance Center. Why clear the session? First, it closes the remote connection and returns us to running cmdlets locally. Second, it closes a potential security hole, since the session is a direct connection to the tenant with administrative permissions.

** Note ** Maximum number of PowerShell connections per user is three.

How can we close the session? Remove-PSSession:

Using the example above we can add this code:

# Cleanup - Main Connection

Get-PsSession | Remove-PSSession
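If only the session created by this script should be closed, a more targeted alternative (assuming the session object is still in the $Session variable set earlier) is:

# Targeted cleanup: remove just the session created by this script
Remove-PSSession $Session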

Note that Get-PSSession | Remove-PSSession closes ALL active sessions in the current PowerShell window. For a scheduled script this is acceptable, because the only sessions in that window are the ones opened by the script itself; sessions in other PowerShell windows are not affected. Next we need to begin constructing our CSV file for storing results. We start with the header row of the chart:

# ReWrite the CSV File

$HeaderRow = "SecurityAndCompliance"

$HeaderRow > 'C:\CmdletCheck\Historical\CurrentChart.csv'

Add an additional row to the CurrentChart.csv file:

# New numbers

$NewRow = "$NewSCCNum"

Add-Content 'C:\CmdletCheck\Historical\CurrentChart.csv' $NewRow

Add a row to the FullChart.csv file - note that we tag this row with the current date as well:

# Add row to historical data

$NewRow = "$Date,"+"$NewSCCNum"

Add-Content 'C:\CmdletCheck\Historical\FullChart.csv' $NewRow

Once the CSV files are updated, we need to notify someone of the changes. First, we check to see if the new number of cmdlets ($NewSCCNum) has changed from the previous number of cmdlets ($SCCNum):

If ($NewSCCNum -ne $SCCNum) {

If the changes have occurred, then:

$Change = "Security and Compliance Center cmdlets changed from $SCCNum to $NewSCCNum."

Add-Content 'C:\CmdletCheck\Historical\CurrentChanges.csv' $Change

This next variable is set because we want to send an email out:

$MailRequired = 1

If the cmdlet numbers have not changed, we handle that in an '} Else {' block, where we record that there was no change:

} Else {

$NoChange = "Security and Compliance Center Cmdlets did not change in number."

Add-Content 'C:\CmdletCheck\Historical\CurrentChanges.csv' $NoChange

}

Once those checks complete, we can check to see if the $MailRequired variable is set to ‘1’.

If ($MailRequired -eq 1) {

We then set the '$Subject' and '$Body' variables for the email:

$Subject = "Some Office 365 PowerShell Cmdlets Changed on $EmailDate"

$Body = (Get-Content 'C:\CmdletCheck\Historical\CurrentChanges.csv') -join '<BR>'

Then we can send the email out with the changes that occurred.

Send-MailMessage -To $To -From $From -Subject $Subject -BodyAsHtml -Body $Body -SmtpServer $SMTPServer

If there were no changes and $MailRequired is not set to 1, then we fall into an '} Else {' section of code:

} Else {

We then set the '$Subject' and '$Body' variables for the email, this time indicating that nothing changed:

$Subject = "No Office 365 PowerShell Cmdlets Changed on $EmailDate"

$Body = "No Office 365 PowerShell Cmdlets Changed on $EmailDate."

Then we send the email noting that no changes happened:

Send-MailMessage -To $To -From $From -Subject $Subject -BodyAsHtml -Body $Body -SmtpServer $SmtpServer

}
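Assembled into one piece, the notification logic reads like this; a sketch built from the fragments above, where $To and $From are assumed to be defined earlier in the full script:

If ($NewSCCNum -ne $SCCNum) {
    $Change = "Security and Compliance Center cmdlets changed from $SCCNum to $NewSCCNum."
    Add-Content 'C:\CmdletCheck\Historical\CurrentChanges.csv' $Change
    $MailRequired = 1
} Else {
    $NoChange = "Security and Compliance Center Cmdlets did not change in number."
    Add-Content 'C:\CmdletCheck\Historical\CurrentChanges.csv' $NoChange
}

If ($MailRequired -eq 1) {
    # Changes found - mail the contents of the change log
    $Subject = "Some Office 365 PowerShell Cmdlets Changed on $EmailDate"
    $Body = (Get-Content 'C:\CmdletCheck\Historical\CurrentChanges.csv') -join '<BR>'
    Send-MailMessage -To $To -From $From -Subject $Subject -BodyAsHtml -Body $Body -SmtpServer $SMTPServer
} Else {
    # No changes - send a simple confirmation instead
    $Subject = "No Office 365 PowerShell Cmdlets Changed on $EmailDate"
    $Body = "No Office 365 PowerShell Cmdlets Changed on $EmailDate."
    Send-MailMessage -To $To -From $From -Subject $Subject -BodyAsHtml -Body $Body -SmtpServer $SMTPServer
}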

Once that’s complete, we can close the PS Session....

Get-PSSession | Remove-PSSession

... and remove the CSV file we no longer need:

Remove-Item "C:\CmdletCheck\Historical\CurrentChanges.csv"

Script Summary

In this sample we used Send-MailMessage, If...Else, the PSSession cmdlets and more to make this happen. The key thing to remember is to add comments once the script works; this makes the script easier to share and easier to troubleshoot if there are any issues.

Script Building Summary

Building a script in PowerShell takes some planning and certainly takes some experimentation. An ideal method is to have some sort of test environment in which to prove out a script, and then put it into production once it has been vetted. Ideally the script starts with some sort of seed, like mailbox or group information, then expands out to cull data points and perform some action in response to the data found. Scripts might be scheduled or run manually depending on their end purpose.

When building your scripts, make sure to take advantage of all the tools and resources that are out there.

Don't ignore any help you can get from these sources. Experience will teach you that, until you understand the underlying way PowerShell operates, as well as how data is stored in Exchange and Active Directory, it will take experimentation to get the most out of it.
