Category Archives: Automation

How to use PowerShell script for setting Azure AD Password Reset Writeback On-premises Permissions

When you configure the Azure AD Premium Self Service Password Reset solution on your Azure AD tenant together with the Azure AD Connect Password Writeback feature, you will need to add permissions in your local Active Directory that permit the Azure AD Connect account to actually change and reset passwords for your users, as detailed here: https://docs.microsoft.com/en-us/azure/active-directory/active-directory-passwords-getting-started#step-4-set-up-the-appropriate-active-directory-permissions.

I wrote this PowerShell script that helps you configure this correctly in your domain/forest. Some notes:

  • You can use it in a single-domain forest or in a multi-domain forest; just remember to specify a Domain Controller for the domain you want to set permissions in, and for the domain the Azure AD Connect account belongs to.
  • You have to find the Azure AD Connect synchronization account; it will be MSOL_xxxx if you used Express settings, or a dedicated account otherwise. Look at the current Azure AD Connect configuration for details.
  • You can specify an OU for your users, and if inheritance is enabled all subordinate users and OUs will inherit the permissions. If not, please run the script once for each OU you want the permissions to be applied to.

Here is the script:
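In short, what the script automates is granting the Azure AD Connect AD DS connector account the Reset Password and Change Password extended rights, plus write access to the lockoutTime and pwdLastSet attributes, on the OU you choose. A minimal sketch of that core (not the full script), using dsacls and placeholder account and OU names, could look like this:

# Placeholders: replace with your own Azure AD Connect connector account and users OU
$aadConnectAccount = "CONTOSO\MSOL_xxxxxxxxxxxx"
$usersOU = "OU=Users,DC=contoso,DC=com"

# Grant the extended rights and attribute write permissions, inherited to user objects below the OU
dsacls "$usersOU" /I:S /G "$($aadConnectAccount):CA;Reset Password;user"
dsacls "$usersOU" /I:S /G "$($aadConnectAccount):CA;Change Password;user"
dsacls "$usersOU" /I:S /G "$($aadConnectAccount):WP;lockoutTime;user"
dsacls "$usersOU" /I:S /G "$($aadConnectAccount):WP;pwdLastSet;user"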

Hope the script will be helpful!

Schedule Tesla Heating with Microsoft Flow from Calendar @microsoftflow

My Skill colleague Pål-Andre Kjøniksen wrote a blog post on how he controls the heating in his Tesla Model S based on his Office 365 calendar appointments, using Microsoft Flow.

https://www.skill.no/blogg/samhandlingsbloggen/2017-01-25-microsoft-flow-styrer-varmen-i-teslaen/.

The blog post is in Norwegian, but the steps should be fairly simple to follow. The idea is that if he creates a calendar appointment whose subject starts with “Tesla:”, the heating starts 20 minutes before the appointment, making sure that the car is warm when he drives off.

Experts and Community unite at last ever #SCU_Europe 2016! #ExpertsLive next

This year’s SCU Europe 2016, held for the first time outside Switzerland in its fourth year running, took place in Berlin at the BCC (Berlin Congress Center) close to Alexander Platz in the eastern part of “Berlin Mitte”.


The intro video introducing the Experts:

Let’s begin with the end: at the closing note, SCUE general Marcel Zehner announced, with a little bit of emotion, that this was the last ever SCU Europe to be held. You and your organization should be proud of what you have achieved, Marcel; it is one of the best community conferences around, and I have been fortunate to be able to visit all four, starting with Bern in 2013, Basel in 2014 and 2015, and now Berlin in 2016. It’s only cities starting with B, isn’t it? In fact, you never know what twists and turns your career takes, but looking back I’m not sure I would be where I am now in terms of being a presenter, MVP and community influencer myself if I had not travelled alone to Bern 4 years ago; that’s where I really started working with and for the Community (with a capital C)!

Luckily SCU Europe will continue as Experts Live Europe next year! Same place at the BCC, same organization and format, and the same dates, only next year it will be the 23rd – 25th of August 2017. A new web page was launched, www.expertslive.eu, and the Twitter (@ExpertsLiveEU) and Facebook accounts have been changed to reflect that. The hashtag #SCU_Europe will eventually be retired, and you should now use #ExpertsLive.

image

I think this is a very good decision; there has already been discussion that the name “System Center Universe” does not really reflect the content and focus of the conference, which now embraces the cloud, with content areas for Management, Productivity, Security, DevOps, Automation, Data Platform and more. Experts Live, originally a one-day community conference in the Netherlands running each year since 2009 with up to 1200 participants, will now be a network of conferences, ranging from region-based events (Experts Live Europe, and next year SCU APAC and SCU Australia will become Experts Live APAC and Experts Live Australia) to local, country-based Experts Live events like the one in the Netherlands, with more to come.

image

The closing note video announcing Experts Live Europe:

This year at SCU Europe I was one of the Experts and presented two sessions, “Premium Identity Management and Protection with Azure AD” and “Deep Dive: Publishing Applications with Azure AD”. I also took part in an “Ask the Experts” area together with Cameron Fuller and Kevin Greene, where we took questions on System Center 2016. I participated in a discussion panel on Friday morning with Markus Wilhelm from Microsoft Germany on the subject of defense strategies and security, and of course we had the meet and greet with the Experts at the networking party. It was a really great experience speaking at this conference, thanks for having me!


The content of the conference this year was great, and for the first time there were 5 tracks, with over 70 sessions presented! All presentations and session recordings will be on Channel 9 in a few weeks’ time, so make sure you look at anything you missed or want to see again if you were there, or if you weren’t at the conference this year you can look at the sessions of interest to you.

I was travelling with a group this year, both from my company and some of our customers; in total we were 7 in the group, and we also had 3 cancellations in the last week before the conference from some customers that could not make it after all. Moving the conference to Berlin is a big part of why it was easier to attract more Nordic attendance this year, I think. We stayed at the Park Inn by Radisson right by Alexander Platz and the BCC, so it was really central and nice.


In good tradition there is a lot of partying and social networking going on. On the first night there was the Sponsors and Speakers Party, which was held at Mio right by the TV Tower at Alexander Platz, and on Thursday we had the attendee Networking Party at the conference center. Later that night our group and some more partners/customers of Squared Up went on to another party at Cosmic Kaspar. It was really hot, so basically the party was on the pavement! On the last day we had the Closing Drinks, sponsored by Cireson and itnetX, at Club Carambar, also close to Alexander Platz. In addition, there were a lot of unofficial gatherings going on, with lots of laughs and new and old friends having a good time.


See you next year at Experts Live Europe in Berlin, 23rd – 25th of August 2017!

Trigger Azure AD Connect Sync Scheduler with Azure Automation

In this blog post I will show how you can trigger the Azure AD Connect Sync Scheduler with an Azure Automation Runbook PowerShell script. Since Azure AD Connect build 1.1.105.0, released in February 2016, a new scheduler is built in that by default syncs every 30 minutes (previously 3 hours). For more detail on the Azure AD Connect Sync Scheduler, see https://azure.microsoft.com/en-us/documentation/articles/active-directory-aadconnectsync-feature-scheduler/.

Normally a sync schedule of 30 minutes is sufficient for most uses, but sometimes you will need to do an immediate sync. So I thought it was a good idea to create a PowerShell script that creates a remote session to the Azure AD Connect server and then triggers a delta sync.

Now, this PowerShell script can of course be used with any of your favorite automation solutions, for example Orchestrator or SMA on-premises. But why not just use Azure Automation and a Hybrid Worker to run this script? This way you can trigger the script in a number of ways, including from the Azure Portal, via webhooks, by remediating alerts in OMS, and more.
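As an example of the webhook option, once you have added a webhook to the published runbook it can be started with a simple HTTP POST. A minimal sketch, where the webhook URI is a placeholder for the URI you get when creating the webhook:

$webhookUri = "https://s2events.azure-automation.net/webhooks?token=<your-webhook-token>"
Invoke-RestMethod -Method Post -Uri $webhookUri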

Requirements

Let’s first take a look at the requirements for this solution:

  • You will have to have an Azure subscription, so that an Azure Automation account can be created (or you can use an existing account), and so that a runbook script with the related assets can be created.
  • You will need to have an OMS workspace for the Azure subscription, and have a Hybrid Worker set up that can communicate with the Azure AD Connect server. The Hybrid Worker will use the credential asset and variable asset created in the first part.

In the following two parts I will look at these two requirements and how you can set things up to start triggering the Azure AD Connect Scheduler with your Azure Automation Runbook.

Part 1 – Set up the Azure Automation Runbook and Assets

To set up the Azure Automation part of the solution, I have created a GitHub repository from which you can deploy the solution directly: https://github.com/skillriver/Trigger-AzureADSyncScheduler. This repository contains the Azure Resource Manager deployment template and PowerShell script that you need to get started.

You can also use the “Deploy to Azure” button in the repository to deploy directly.


Let’s step through what you experience when you click “Deploy to Azure”. Please make sure that you are logged in to the correct Azure subscription first.

Deploying with the Template

I will not go through how I created the ARM-based JSON templates, but I will quickly show the user experience when doing the deployment.

The custom deployment will ask you for some parameter values:

  • AUTOMATIONACCOUNTNAME. If you specify an existing Automation account, this will be used; otherwise a new one will be created with the Free pricing tier.
  • AAHYBRIDWORKERCREDENTIALNAME. There is a default value here; this will be used as a credential asset in the PowerShell script. You can change the value, but then you must remember to change it in the script as well.
  • AAHYBRIDWORKERDOMAIN. The NetBIOS domain name of the domain the Azure AD Connect server belongs to.
  • AAHYBRIDWORKERUSERNAME. The user name for the service account or other user account that has permission to connect to the Azure AD Connect server and trigger the sync schedule.
  • AAHYBRIDWORKERPASSWORD. The password for the user above.
  • AADSSERVERNAME. The server name of the Azure AD Connect server.

You will have to select the correct subscription, and either create a new Resource Group or use an existing one (please note that Azure Automation is not available in every region).

image

After saving the parameters, and reviewing and accepting the legal terms, you are ready to create the custom deployment.
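If you prefer PowerShell over the portal, the same deployment can be done with the AzureRM module. A minimal sketch, where the resource group name and values are placeholders and the template parameter names are assumptions based on the list above (check the template in the repository for the exact parameter names and the template file URI):

# Assumes you are already logged in with Login-AzureRmAccount and have selected the right subscription
$params = @{
    automationAccountName        = "MyAutomationAccount"
    aaHybridWorkerCredentialName = "AAHybridWorkerCredential"
    aaHybridWorkerDomain         = "CONTOSO"
    aaHybridWorkerUserName       = "svc-aahybridworker"
    aaHybridWorkerPassword       = (ConvertTo-SecureString "<password>" -AsPlainText -Force)
    aadsServerName               = "AADCONNECT01"
}

New-AzureRmResourceGroup -Name "MyAutomationRG" -Location "West Europe"

New-AzureRmResourceGroupDeployment -ResourceGroupName "MyAutomationRG" `
    -TemplateUri "<URI to the deployment template in the repository>" `
    -TemplateParameterObject $params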

If everything went OK, you should see a confirmation:

image

You will have an Automation Account:

image

You will have a PowerShell Script Runbook with the name Trigger-AzureADSync:

image

The script can be viewed as shown below. The script is short and simple: it gets the variable asset for the Azure AD Connect server name, and gets the credential asset for the Hybrid Worker account. It then creates a remote session and runs the delta sync cycle:

image
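A minimal sketch of such a runbook, where the asset names are placeholders (use the names from your own deployment), could look like this:

# Get the Azure AD Connect server name and the Hybrid Worker credential from the Automation assets
$aadConnectServer = Get-AutomationVariable -Name "AADSServerName"
$credential = Get-AutomationPSCredential -Name "AAHybridWorkerCredential"

# Create a remote session to the Azure AD Connect server and start a delta sync cycle
$session = New-PSSession -ComputerName $aadConnectServer -Credential $credential
Invoke-Command -Session $session -ScriptBlock {
    Import-Module ADSync
    Start-ADSyncSyncCycle -PolicyType Delta
}
Remove-PSSession $session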

Let’s take a look at the Assets created with the deployment as well. First, the Variable:

image

The Credential:

image

That’s the whole solution for this first part. If you for any reason cannot or do not want to deploy the template directly, and would prefer to create this manually, you should be fine just following the images above. Just follow these steps:

  1. Create an Azure Automation account (Free tier, and in your chosen supported Azure region).
  2. Create a Variable asset containing the name of the Azure AD Connect server.
  3. Create a Credential asset with the DOMAIN\UserName of the account you will use for the remote session to the Azure AD Connect server.
  4. Create a new PowerShell Script Runbook, typing in the cmdlets from above and using your own asset names.

By now you should be ready for the next step, because you cannot run this Automation Runbook just yet: you have to have OMS and a Hybrid Worker in place first, and that will be shown in the next part.

Part 2 – Set up the Hybrid Worker and Remote session permission

To be able to run Azure Automation Runbooks in your own datacenter, you will need to have an OMS workspace and at least one Hybrid Worker configured that will be able to execute the Runbook locally and connect to the Azure AD Connect Server.

Hybrid Runbook Worker Components

I will not go through the details here on how to set up an OMS workspace and a Hybrid Worker if you don’t have this already; you can just follow the documentation here: https://azure.microsoft.com/en-us/documentation/articles/automation-hybrid-runbook-worker/.
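In short, the registration itself boils down to running something like this on the worker server after the Microsoft Monitoring Agent is installed and connected to the workspace (the module path version, group name, endpoint and token below are placeholders taken from your own Automation account):

cd "C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomation\<version>\HybridRegistration"
Import-Module .\HybridRegistration.psd1
Add-HybridRunbookWorker -GroupName "OnPremHybridWorkers" -EndPoint "<Automation account URL>" -Token "<Automation account primary key>"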

After setting up and registering your Hybrid Worker, you will have a Hybrid Worker Group with at least one Hybrid Worker.

image

Now, running the Runbook with the right security is going to be essential here; after all, the Runbook is going to connect to the Azure AD Connect server and initiate the sync cycle. Let’s first check the settings of the Hybrid Worker group. We can either select a Default Run As account, as I have here:

image

Or you can select a Custom Run As, specifying a credential Asset to use for all Runbooks running on this Hybrid Worker Group:

image

In my example here, I will use the Default Run As Account, because I specify my own credentials in the PowerShell Runbook, as shown earlier in Part 1 of this blog post:

image

Next, I will have to create a domain account in my local Active Directory. I have created a service account to be used for Azure Automation Hybrid Workers. This is the same account you specified when creating the credential asset in Part 1 in Azure Automation:

image

This account will need permission to open a remote PowerShell session to the Azure AD Connect server. In Computer Management, under Local Users and Groups on the Azure AD Connect server, add this service account to the Remote Management Users group:

image

And add the account to the ADSyncOperators group, so that the user has permission to perform Azure AD Connect operations:

image
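If you prefer the command line, the same group memberships can be added from an elevated prompt on the Azure AD Connect server (the account name is a placeholder):

net localgroup "Remote Management Users" CONTOSO\svc-aahybridworker /add
net localgroup "ADSyncOperators" CONTOSO\svc-aahybridworker /add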

That should be it, we are now ready to start the Runbook and verify that it works.

Starting the Runbook

From the Automation account and the Trigger-AzureADSync Runbook, select Start, and under Run Settings select Hybrid Worker and your Hybrid Worker group:

image
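You can also start the runbook on the Hybrid Worker group from PowerShell with the AzureRM module; a minimal sketch, where the resource group, account and group names are placeholders:

Start-AzureRmAutomationRunbook -ResourceGroupName "MyAutomationRG" -AutomationAccountName "MyAutomationAccount" `
    -Name "Trigger-AzureADSync" -RunOn "OnPremHybridWorkers"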

You can verify that the job completed with no errors:

image

Looking into the Synchronization Service on the Azure AD Connect Server, I can verify that the sync cycle has been running:

image

That concludes this blog article; I hope it has been helpful!

Displaying Azure Automation Runbook Stats in OMS via Performance Collection and Operations Manager

Wouldn’t it be great to get some more information about your Azure Automation Runbooks in the Operations Management Suite Portal? That’s a rhetorical question; of course the answer will be yes!

While Azure Automation is a part of the suite of components in OMS, today you only get the following information from the Azure Automation blade:

The blade shows the number of runbooks and jobs from the one Automation account you have configured. You can only configure one Automation account at a time, and for more details you are directed to the Azure Portal.

I wanted to use my OMS-connected Operations Manager Management Group and a PowerShell script rule to get some more statistics for Azure Automation and display them in OMS Log Analytics as performance data. I will do this using the “Sample Management Pack – Wizard to Create PowerShell script Collection Rules” described in this blog article: http://blogs.msdn.com/b/wei_out_there_with_system_center/archive/2015/09/29/oms-collecting-nrt-performance-data-from-an-opsmgr-powershell-script-collection-rule-created-from-a-wizard.aspx.

I will use the AzureRM PowerShell Module for the PowerShell commands that will connect to my Azure subscription and get the Azure Automation Runbooks data.

Getting Ready

Before I can create the PowerShell script rule for getting the Azure Automation data, I have to do some preparations first. This includes:

  1. Importing the “Sample Management Pack – Wizard to Create PowerShell script Collection Rules” into my Operations Manager environment.
    1. This can be downloaded from the TechNet Gallery at https://gallery.technet.microsoft.com/Sample-Management-Pack-e48040f7.
  2. Install the AzureRM PowerShell module (on the chosen target server for the PowerShell Script Rule).
    1. I chose to install it from the PowerShell Gallery using the instructions here: https://azure.microsoft.com/en-us/documentation/articles/powershell-install-configure/
    2. If you are running Windows Server 2012 R2, which I am, follow the instructions here to support the PowerShellGet module: https://www.powershellgallery.com/GettingStarted?section=Get%20Started.
  3. Choose a target for where to run the AzureRM commands from.
    1. Regarding AzureRM and where to install it, I decided to use the SCOM Root Management Server Emulator. This server will then run the AzureRM commands against my Azure subscription.
  4. Choose an account for the Run As Profile.
    1. I also needed to think about the Run As account the AzureRM commands will run under. As we will see later, the PowerShell Script Rules will be set up with a Default Run As Profile.
    2. The Default Run As Profile for the RMS Emulator will be the Management Server Action Account; if I had chosen another rule target, the Default Run As Profile would be the Local System Account.
    3. Alternatively, I could have created a custom Run As Profile with a user account that has permissions to execute the AzureRM cmdlets and to connect to and read the required data from the Azure subscription, and configured the PowerShell Script rules to use that.
    4. I decided to go with the Management Server Action Account, in my case SKILL\scom_msaa. This account will execute the AzureRM PowerShell cmdlets, so I need to make sure that I can log in to my Azure subscription using that account.
  5. Next, I started PowerShell ISE with “Run as different user”, specifying my scom_msaa account. I ran the commands below, as I wanted to save the password for the user I’m going to use to connect to the Azure subscription and get the Automation data. I also did a test import of the AzureRM modules I will need in the main script.

These commands are here in full:


# Prepare to save encrypted password

# Verify that logged on as scom_msaa
whoami

# Get the password
$securepassword = Read-Host -AsSecureString -Prompt "Enter Azure AD account password"

# File path for the encrypted password file
$filepath = "C:\users\scom_msaa\AppData\encryptedazureadpassword.txt"

# Save password encrypted to file
ConvertFrom-SecureString -SecureString $securepassword | Out-File -FilePath $filepath

Import-Module "C:\Program Files\WindowsPowerShell\Modules\AzureRM"
Import-Module "C:\Program Files\WindowsPowerShell\Modules\AzureRM.Profile"
Import-Module "C:\Program Files\WindowsPowerShell\Modules\AzureRM.Automation"

At this point I’m ready for the next step, which is to create some PowerShell commands for the Script Rule in SCOM.

Creating the PowerShell Command Script for getting Azure Automation data

First I needed to think about what kind of Azure Automation and Runbook data I wanted to get from my Azure Subscription. I decided to get the following values:

  • Job Count Last Day
  • Job Count Last Month
  • Job Count This Month
  • Job Minutes This Month
  • Runbooks in New State
  • Runbooks in Published State
  • Runbooks in Edit State
  • PowerShell Workflow Runbooks
  • Graphical Runbooks
  • PowerShell Script Runbooks

I wanted to have the statistics for Runbook Jobs to see the activity of the Runbooks. As I’m running the Free plan of Azure Automation, I’m restricted to 500 minutes a month, so it makes sense to count the accumulated job minutes for the month as well.

In addition to this I want some statistics for the number of Runbooks in the environment, separated into New, Published and Edit state, and the Runbook types separated into Workflow, Graphical and PowerShell Script.

The PowerShell Script Rule for getting this data will use the AzureRM PowerShell module, specifically the cmdlets in AzureRM.Profile and AzureRM.Automation.

To log in and authenticate to Azure, I will use the encrypted password saved earlier and create a credential object for the login.

The script is initialized with date filters and default values for the variables. I decided to create the script so that I can get data from all the Resource Groups I have Automation accounts in. This way, if I have multiple Automation accounts, I can get statistics combined for each of them.

Then, the script loops through each Resource Group, running the different commands to get the variable data. Since it will potentially loop through multiple Resource Groups and Automation accounts, the variables use += to add to the value from the previous loop iteration.

After setting each variable and exiting the loop, the $PropertyBag is filled with the values for the different counters.

The complete script for getting the Azure Automation data to OMS via a SCOM PowerShell Script Rule is shown below:


# Debug file
$debuglog = $env:TEMP+"\powershell_perf_collect_AA_stats_debug.log"

Get-Date | Out-File $debuglog

"Who Am I:" | Out-File $debuglog -Append
whoami | Out-File $debuglog -Append

$ErrorActionPreference = "Stop"

Try {

If (!(Get-Module -Name AzureRM)) { Import-Module "C:\Program Files\WindowsPowerShell\Modules\AzureRM" }
If (!(Get-Module -Name AzureRM.Profile)) { Import-Module "C:\Program Files\WindowsPowerShell\Modules\AzureRM.Profile" }
If (!(Get-Module -Name AzureRM.Automation)) { Import-Module "C:\Program Files\WindowsPowerShell\Modules\AzureRM.Automation" }

# Get Cred for ARM
$filepath = "C:\users\scom_msaa\AppData\encryptedazureadpassword.txt"
$userName = "myAzureADAdminAccount"
$securePassword = ConvertTo-SecureString (Get-Content -Path $FilePath)
$cred = New-Object -TypeName System.Management.Automation.PSCredential ($userName, $securePassword)

# Log in and set active subscription
Login-AzureRmAccount -Credential $cred

$subscriptionid = "mysubscriptionID"

Set-AzureRmContext -SubscriptionId $subscriptionid

$API = New-Object -ComObject "MOM.ScriptAPI"

# Date filters: last hour, last day, last 30 days and the start of the current month
$aftertime = $(Get-Date).AddHours(-1)
$afterdate_lastday = $(Get-Date).AddDays(-1)
$afterdate_lastmonth = $(Get-Date).AddDays(-30)
$afterdate_thismonth = $(Get-Date).AddDays(-($(Get-Date).Day)+1)

$AutomationRGs = @("MyResourceGroupName1","MyResourceGroupName2")

$PropertyBags=@()

$newrunbooks = 0
$publishedrunbooks = 0
$editrunbooks = 0
$scriptrunbooks = 0
$graphrunbooks = 0
$powershellrunbooks = 0
$jobcountlastday = 0
$jobcountlastmonth = 0
$jobcountthismonth = 0
$jobminutesthismonth = 0

ForEach ($AutomationRG in $AutomationRGs) {

$rmautomationacct = Get-AzureRmAutomationAccount -ResourceGroupName $AutomationRG

$newrunbooks += (Get-AzureRmAutomationRunbook -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG `
| Where {$_.State -eq "New"}).Count

$publishedrunbooks += (Get-AzureRmAutomationRunbook -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG `
| Where {$_.State -eq "Published"}).Count

$editrunbooks += (Get-AzureRmAutomationRunbook -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG `
| Where {$_.State -eq "Edit"}).Count

$scriptrunbooks += (Get-AzureRmAutomationRunbook -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG `
| Where {$_.RunbookType -eq "Script"}).Count

$graphrunbooks += (Get-AzureRmAutomationRunbook -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG `
| Where {$_.RunbookType -eq "Graph"}).Count

$powershellrunbooks += (Get-AzureRmAutomationRunbook -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG `
| Where {$_.RunbookType -eq "PowerShell"}).Count

$jobcountlastday += (Get-AzureRmAutomationJob -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG `
-StartTime $afterdate_lastday).Count

$jobcountlastmonth += (Get-AzureRmAutomationJob -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG `
-StartTime $afterdate_lastmonth).Count

$jobcountthismonth += (Get-AzureRmAutomationJob -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG `
-StartTime $afterdate_thismonth.ToLongDateString()).Count

$jobsthismonth = Get-AzureRmAutomationJob -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG `
-StartTime $afterdate_thismonth.ToLongDateString() | Select-Object RunbookName, StartTime, EndTime, CreationTime, LastModifiedTime, `
@{Name="RunningTime";Expression={[TimeSpan]::Parse($_.EndTime - $_.StartTime).TotalMinutes}}, @{Name="Month";Expression={($_.EndTime).Month}}

$jobminutesthismonth += [int][Math]::Ceiling(($jobsthismonth | Measure-Object -Property RunningTime -Sum).Sum)

}

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Job Count Last Day")
$PropertyBag.AddValue("Value", [UInt32]$jobcountlastday)
$PropertyBags += $PropertyBag

"Job Count Last Day:" | Out-File $debuglog -Append
$jobcountlastday | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Job Count Last Month")
$PropertyBag.AddValue("Value", [UInt32]$jobcountlastmonth)
$PropertyBags += $PropertyBag

"Job Count Last Month:" | Out-File $debuglog -Append
$jobcountlastmonth | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Job Count This Month")
$PropertyBag.AddValue("Value", [UInt32]$jobcountthismonth)
$PropertyBags += $PropertyBag

"Job Count This Month:" | Out-File $debuglog -Append
$jobcountthismonth | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Job Minutes This Month")
$PropertyBag.AddValue("Value", [UInt32]$jobminutesthismonth)
$PropertyBags += $PropertyBag

"Job Minutes This Month:" | Out-File $debuglog -Append
$jobminutesthismonth | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Runbooks in New State")
$PropertyBag.AddValue("Value", [UInt32]$newrunbooks)
$PropertyBags += $PropertyBag

"Runbooks in New State:" | Out-File $debuglog -Append
$newrunbooks | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Runbooks in Published State")
$PropertyBag.AddValue("Value", [UInt32]$publishedrunbooks)
$PropertyBags += $PropertyBag

"Runbooks in Published State:" | Out-File $debuglog -Append
$publishedrunbooks | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Runbooks in Edit State")
$PropertyBag.AddValue("Value", [UInt32]$editrunbooks)
$PropertyBags += $PropertyBag

"Runbooks in Edit State:" | Out-File $debuglog -Append
$editrunbooks | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "PowerShell Workflow Runbooks")
$PropertyBag.AddValue("Value", [UInt32]$scriptrunbooks)
$PropertyBags += $PropertyBag

"PowerShell Workflow Runbooks:" | Out-File $debuglog -Append
$scriptrunbooks | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Graphical Runbooks")
$PropertyBag.AddValue("Value", [UInt32]$graphrunbooks)
$PropertyBags += $PropertyBag

"Graphical Runbooks:" | Out-File $debuglog -Append
$graphrunbooks | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "PowerShell Script Runbooks")
$PropertyBag.AddValue("Value", [UInt32]$powershellrunbooks)
$PropertyBags += $PropertyBag

"PowerShell Script Runbooks:" | Out-File $debuglog -Append
$powershellrunbooks | Out-File $debuglog -Append

$PropertyBags

} Catch {

"Error Caught:" | Out-File $debuglog -Append
$($_.Exception.GetType().FullName) | Out-File $debuglog -Append
$($_.Exception.Message) | Out-File $debuglog -Append

}

PS! I have included debugging and logging in the script. Be aware, though, that setting $ErrorActionPreference = "Stop" will end the script on any error, for example with the logging, so it might be an idea to remove the debug logging once you have confirmed that everything works.

In the next part I’m ready to create the PowerShell Script Rule.

Creating the PowerShell Script Rule

In the Operations Console, under Authoring, create a new PowerShell Script Rule as shown below:

  1. Select the PowerShell Script (Performance – OMS Bound) Rule. I have created a custom destination management pack for this script.
  2. Specify a Rule name and the Rule Category: Performance Collection. As mentioned earlier in this article, the Rule target will be the Root Management Server Emulator:
  3. Select to run the script every 30 minutes, and the time the interval will start from:
  4. Select a name for the script file and a timeout, and enter the complete script as shown earlier:
  5. For the Performance Mapping information, the Object name must be in the \\FQDN\YourObjectName format. For the FQDN I used the Target variable for PrincipalName, and for the Object Name AzureAutomationRunbookStats, adding “\\” at the start and “\” in between: \\$Target/Host/Property[Type=”MicrosoftWindowsLibrary7585010!Microsoft.Windows.Computer”]/PrincipalName$\AzureAutomationRunbookStats. I specified the Counter name as “Azure Automation Runbook Stats”, and the Instance and Value are specified as $Data/Property[@Name=’Instance’]$ and $Data/Property[@Name=’Value’]$. These reflect the PropertyBag instance and value created in the PowerShell script:
  6. After finishing the Create Rule Wizard, two new rules are created, which you can find by scoping to the Root Management Server Emulator I chose as the target. Both rules must be enabled, as they are not enabled by default:

At this point we are finished configuring the SCOM side, and can wait some hours to see that data is actually coming into my OMS workspace.

Looking at Azure Automation Runbook Stats Performance Data in OMS

After a while I will start seeing Performance Data coming into OMS with the specified Object and Counter Name, and for the different instances and values.

In Log Search, I can specify Type=Perf ObjectName=AzureAutomationRunbookStats, and I will find the Results and Metrics for the specified time frame.

In the example above I’m highlighting the Job Minutes This Month counter, which will steadily increase during each month; as we can see, the highest value was 107 minutes, and when the month changed to March we were back at 0 minutes. After a while, when the number of job minutes increases, it will be interesting to follow whether this counter gets close to 500 minutes.

This way, I can now look at Azure Automation Runbook stats as performance data, showing different scenarios like how many jobs and runbook job minutes there are over a time period. I can also look at what types of runbooks I have and what state they are in.

I can also create saved searches and alerts for my search criteria.

Creating OMS Alerts for Azure Automation Runbook Counters

There is one specific scenario for Alerts I’m interested in, and that is when I’m approaching my monthly limit on 500 job minutes.

“Job Minutes This Month” is a counter that will get the sum of all job minutes for all runbook jobs across all automation accounts. In the classic Azure portal, you will have a usage overview like this:

With OMS I would get this information over a time period like this:

The search query for Job Minutes This Month as I have defined it via the PowerShell Script Rule in OMS is:

Type=Perf ObjectName=AzureAutomationRunbookStats InstanceName=”Job Minutes This Month”

This would give me all results for the defined time period, but to generate an alert I would want to look at the most recent results, for example for the last hour. In addition, I want to filter the results for my alert to when the number of job minutes is over a threshold of 450, which means I’m getting close to the limit of 500 free minutes per month. My query for this would be:

Type=Perf ObjectName=AzureAutomationRunbookStats InstanceName=”Job Minutes This Month” AND TimeGenerated>NOW-1HOUR AND CounterValue > 450

Now, in my test environment, this will give me 0 results, because I’m currently at 37 minutes:

Let’s say, for the sake of testing an alert, I add criteria to include 37 minutes as well, like this:
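Type=Perf ObjectName=AzureAutomationRunbookStats InstanceName=”Job Minutes This Month” AND TimeGenerated>NOW-1HOUR AND (CounterValue > 450 OR CounterValue = 37)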

This time I have 2 results. Let’s create an alert for this by pressing the ALERT button:

For the alert I give it a name and base it on the search query. I want to check every 60 minutes, and generate an alert when the number of results is greater than 1, so that I make sure the passing of the threshold is consistent and not just temporary.

For actions I want an email notification, so I type in a Subject and my recipients.

I save the alert rule, and verify that it was successfully created.

Soon I get my first alert by email:

Now that it works, I can remove the alert and create a new one without the OR CounterValue=37 part; this I leave to you 😉

With that, this blog post is concluded. Thanks for reading; I hope this post on how to get more insight into your Azure Automation Runbook stats in OMS, getting data via NRT Performance Collection, has been useful 😉

Creating SCSM Incidents from OMS Alerts using Azure Automation – Part 2

This is the second part of a 2-part blog article that shows how you can create a new Service Manager Incident from an Azure Automation Runbook using a Hybrid Worker group, and how an OMS Alert can search for a condition and, when it triggers, call this Azure Automation Runbook via a webhook, passing along some contextual data from the alert to create the Incident in Service Manager.

In Part 1 of the blog I prepared my Service Manager environment, and created the Azure Automation Runbook and Assets to run via a Hybrid Worker for generating incidents in Service Manager. In this second part of the blog I will configure my Operations Management Suite environment for OMS Alerting and Alert Remediation, and create an OMS Alert that will trigger this PowerShell Runbook.

Configuring OMS Alerting and Remediation

If you haven’t already done so for your OMS workspace, you will need to enable OMS Alerting and Alert Remediation under Settings and Preview Features.

Creating the OMS Alert

The next step is to create the OMS Alert. To do this I will need to do a Log Search with the criteria I want. For my example in this article, I will use an event log search where I have previously added the Azure AD Application Proxy Connector event log to OMS, and where I have also created a custom field for events where “The Connector was unable to connect to the service due to networking issues”.

The result of this Log Search is shown below, where I have 7 results in the last 7 days:

Since I enabled OMS Alerting and Remediation under Settings, I can now see that I have a new Alert button at the bottom of the screen. I click on that to create my new OMS Alert.

I give the OMS Alert a descriptive name, use my current search query, and check every 15 minutes for this alert. I can also specify a threshold over a specified time window; in this case I want the alert to trigger if there are more than 0 occurrences. If I want to, I can also send an email notification to specified recipient(s).

Since I want to generate a SCSM Incident when this OMS Alert triggers, I select to Enable Remediation and select my Create-SCSMIncident Runbook.

After saving the OMS Alert I get a successful confirmation, and a link to where I can see my configured Alerts:

While in Preview, I can only create up to 10 Alerts, and I can also remove them but not edit existing ones for now:

That is all I need to configure in Operations Management Suite to get the OMS Alert to trigger. Now I need to go back to the Azure Portal and configure some changes for my PowerShell Runbook!

Configuring Azure Automation PowerShell Runbook for Webhook and Hybrid Worker Group

In the Azure Portal, under my Automation Account and the PowerShell Runbook I created for Create-SCSMIncident (see Part 1), a webhook for OMS Alert Remediation will now have been created automatically. This webhook has an expiry date one year after creation.

I now need to specify the Parameters for the Webhook, so that it runs on my Hybrid Worker group:

After I have specified the Hybrid Worker group, any OMS Alerts will now trigger this Runbook and run it in my local environment, and in this case create the SCSM incident as specified in the PowerShell Runbook. But I also want to have some contextual data in the Incident, so I need to look at the webhook data in the next step.

Configuring and using Webhook for contextual data in Runbook

Whenever the OMS Alert triggers the remediation Azure Automation Runbook via the webhook, event information is submitted from OMS to the Runbook via the WebhookData input parameter.

An example of this is shown in the image below, where the WebhookData Input Parameter contains event information formatted as JSON (JavaScript Object Notation):

So, I need to configure my PowerShell Runbook to process this WebhookData, and to use that information when creating the Incident.

Let’s first take a look at the WebhookData. If I copy the input from above to, for example, Visual Studio Code, I can see more clearly that the WebhookData consists of a WebhookName, RequestBody and RequestHeader. The values I’m looking for are in the RequestBody and its SearchResults:

I update my PowerShell Runbook so that I can process the WebhookData and get the WebhookName, WebhookHeaders and WebhookBody. When I have the WebhookBody, I can get the SearchResults and, by using ConvertFrom-Json, loop through the value array to get the fields I’m looking for, like this:

In this case I want the Source, EventID and RenderedDescription, which also correspond to the values from the Alert in OMS, as shown below. I then use these values for the Incident Title and Description in the PowerShell Runbook.

The complete Azure Automation PowerShell Runbook is shown below:

param (
    [object]$WebhookData
)

if ($WebhookData -ne $null) {

    # Get Webhook Data
    $WebhookName = $WebhookData.WebhookName
    $WebhookHeaders = $WebhookData.RequestHeader
    $WebhookBody = $WebhookData.RequestBody

    # Writing Webhook Data to verbose output
    Write-Verbose "Webhook name: '$WebhookName'"
    Write-Verbose "Webhook header:"
    Write-Verbose $WebhookHeaders
    Write-Verbose "Webhook body:"
    Write-Verbose $WebhookBody

    # Searching Webhook Data for Value Results
    $SearchResults = (ConvertFrom-Json $WebhookBody).SearchResults
    $SearchResultsValue = $SearchResults.value
    Foreach ($item in $SearchResultsValue)
    {
        # Getting Alert Source, EventID and RenderedDescription
        $AlertSource = $item.Source
        Write-Verbose "Alert Name: '$AlertSource'"
        $AlertEventId = $item.EventID
        Write-Verbose "Alert EventID: '$AlertEventId'"
        $AlertDescription = $item.RenderedDescription
        Write-Verbose "Alert Description: '$AlertDescription'"
    }

    # Setting Incident Title and Description based on OMS Alert
    $incident_title = "OMS Alert: " + $AlertSource
    $incident_desc = $AlertDescription
}
else
{
    # Setting Generic Incident Title and Description
    $incident_title = "Azure Automation Generated Alert"
    $incident_desc = "This Incident is generated from an Azure Automation Runbook via Hybrid Worker"
}

# Getting Assets for SCSM Management Server Name and Credentials
$scsm_mgmtserver = Get-AutomationVariable -Name "SCSMServerName"
$credential = Get-AutomationPSCredential -Name "SCSMAASvcAccount"

# Create Remote Session to SCSM Management Server
# (Specified credential must be in Remote Management Users local group and SCSM operator)
$session = New-PSSession -ComputerName $scsm_mgmtserver -Credential $credential

# Import module for Service Manager PowerShell CmdLets
$SMDIR = Invoke-Command -ScriptBlock {(Get-ItemProperty "hklm:/software/microsoft/System Center/2010/Service Manager/Setup").InstallDirectory} -Session $session
Invoke-Command -ScriptBlock { param($SMDIR) Set-Location -Path $SMDIR } -Args $SMDIR -Session $session
Import-Module .\Powershell\System.Center.Service.Manager.psd1 -PSSession $session

# Create Incident
Invoke-Command -ScriptBlock { param ($incident_title, $incident_desc)

    # Get Incident Class
    $IncidentClass = Get-SCSMClass -Name System.WorkItem.Incident

    # Get Prefix for Incident IDs
    $IncidentPrefix = (Get-SCSMClassInstance -Class (Get-SCSMClass -Name System.WorkItem.Incident.GeneralSetting)).PrefixForId

    # Set Incident Properties
    $Property = @{
        Id = "$IncidentPrefix{0}"
        Title = $incident_title
        Description = $incident_desc
        Urgency = "System.WorkItem.TroubleTicket.UrgencyEnum.Medium"
        Source = "SkillSCSM.IncidentSourceEnum.OMS"
        Impact = "System.WorkItem.TroubleTicket.ImpactEnum.Medium"
        Status = "IncidentStatusEnum.Active"
    }

    # Create the Incident
    New-SCSMClassInstance -Class $IncidentClass -Property $Property -PassThru

} -Args $incident_title, $incident_desc -Session $session

Remove-PSSession $session

After publishing the updated Runbook I’m ready for the OMS Alert to trigger.

When the OMS Alert triggers

The next time this OMS Alert triggers, I can verify that the Runbook has started and that an Incident is created. Since I also wanted an email notification, I received that as well:

In Operations Management Suite, I search for any OMS Alerts generated by using the query “Type=Alert SourceSystem=OMS”:

In Azure Automation, I can see that the Runbook has launched a job:

And most importantly, I can see that the Incident is created in Service Manager with the info I specified:

That concludes this two-part blog article on how to create SCSM Incidents from OMS Alerts. OMS Automation rocks!

Creating SCSM Incidents from OMS Alerts using Azure Automation – Part 1

There have been some great announcements recently for OMS Alerts in Public Preview (http://blogs.technet.com/b/momteam/archive/2015/12/02/announcing-the-oms-alerting-public-preview.aspx) and Webhooks support for Hybrid Worker Runbooks (https://azure.microsoft.com/en-us/updates/hybrid-worker-runbooks-support-webhooks/). This opens up some scenarios I have been thinking about.

This 2-part blog will show how you can create a new Service Manager Incident from an Azure Automation Runbook using a Hybrid Worker group, and how an OMS Alert can search for a condition and, when it triggers, call this Azure Automation Runbook via a webhook, passing along some contextual data from the alert to create the Incident in Service Manager.

This is the first part of the blog post, so I will start by preparing the Service Manager environment, creating the Azure Automation Runbook, and testing the Incident creation via the Hybrid Worker.

Prepare the Service Manager Environment

First I want to prepare my Service Manager environment for Incident creation via Azure Automation PowerShell Runbooks. I decided to create a new Incident Source enumeration for ‘Operations Management Suite’, and also to create a new account with permissions to create incidents in Service Manager, to be used in the Runbooks.

To create the Source I edited the Library List for Incident Source like this:

To make it easier to refer to this Enumeration Value in PowerShell scripts, I define my own ID in the corresponding Management Pack XML:

And I specify the DisplayString for the ElementID for the languages I want:

The next step is to prepare the account for the Runbook. As Azure Automation Runbooks on Hybrid Workers run as Local System, I need to be able to run my commands as an account with permissions to Service Manager and to create incidents.

I elected to create a new local Active Directory account, and give that account permission to my Service Manager Management Server.

With the new account created, I added it to the Remote Management Users local group on the Service Manager Management Server:

Next I added this account to the Advanced Operators Role Group in Service Manager:

Adding the account to the Advanced Operators group is more permission than I need for this scenario, but it will let me use the same account for other work item scenarios in the future.

With the Service Manager environment prepared, I can go to the next step, which is the PowerShell Runbook in Azure Automation.

Create an Azure Automation Runbook for creating SCSM Incidents

I created a new PowerShell Script based Runbook in Azure Automation for creating incidents. This Runbook uses a Credential asset to run remote PowerShell session commands on my Service Manager Management Server. The Credential asset is the local Active Directory account I created in the previous step:

I have also created a variable asset for the SCSM Management Server name to be used in the Runbook.
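As a sketch, these two assets could also be created with the AzureRM module; the asset names below match what the Runbook expects, while the resource group, Automation account and server names are placeholders:

# The local AD service account created above, e.g. DOMAIN\svc-scsm-aa
$cred = Get-Credential

New-AzureRmAutomationCredential -ResourceGroupName "MyAutomationRG" -AutomationAccountName "MyAutomationAccount" `
    -Name "SCSMAASvcAccount" -Value $cred

New-AzureRmAutomationVariable -ResourceGroupName "MyAutomationRG" -AutomationAccountName "MyAutomationAccount" `
    -Name "SCSMServerName" -Value "SCSMMS01" -Encrypted $false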

The PowerShell Runbook can then be created in Azure Automation, using my Automation Assets, and connecting to Service Manager for creating a new Incident as specified:

The complete PowerShell Runbook is shown below:

# Setting Generic Incident Title and Description
$incident_title = "Azure Automation Generated Alert"
$incident_desc = "This Incident is generated from an Azure Automation Runbook via Hybrid Worker"

# Getting Assets for SCSM Management Server Name and Credentials
$scsm_mgmtserver = Get-AutomationVariable -Name "SCSMServerName"
$credential = Get-AutomationPSCredential -Name "SCSMAASvcAccount"

# Create Remote Session to SCSM Management Server
# (Specified credential must be in Remote Management Users local group and SCSM operator)
$session = New-PSSession -ComputerName $scsm_mgmtserver -Credential $credential

# Import module for Service Manager PowerShell CmdLets
$SMDIR = Invoke-Command -ScriptBlock {(Get-ItemProperty "hklm:/software/microsoft/System Center/2010/Service Manager/Setup").InstallDirectory} -Session $session
Invoke-Command -ScriptBlock { param($SMDIR) Set-Location -Path $SMDIR } -Args $SMDIR -Session $session
Import-Module .\Powershell\System.Center.Service.Manager.psd1 -PSSession $session

# Create Incident
Invoke-Command -ScriptBlock { param ($incident_title, $incident_desc)

    # Get Incident Class
    $IncidentClass = Get-SCSMClass -Name System.WorkItem.Incident

    # Get Prefix for Incident IDs
    $IncidentPrefix = (Get-SCSMClassInstance -Class (Get-SCSMClass -Name System.WorkItem.Incident.GeneralSetting)).PrefixForId

    # Set Incident Properties
    $Property = @{
        Id = "$IncidentPrefix{0}"
        Title = $incident_title
        Description = $incident_desc
        Urgency = "System.WorkItem.TroubleTicket.UrgencyEnum.Medium"
        Source = "SkillSCSM.IncidentSourceEnum.OMS"
        Impact = "System.WorkItem.TroubleTicket.ImpactEnum.Medium"
        Status = "IncidentStatusEnum.Active"
    }

    # Create the Incident
    New-SCSMClassInstance -Class $IncidentClass -Property $Property -PassThru

} -Args $incident_title, $incident_desc -Session $session

Remove-PSSession $session

The script should be pretty straightforward to interpret. The most important part is that it must be run on a Hybrid Worker group with servers that can connect via PowerShell remoting to the specified Service Manager Management Server. The Incident that is created uses a couple of variables for the incident title and description (these will be updated with contextual data from OMS Alerts in Part 2), and some fixed data for Urgency, Impact and Status, along with my custom Source for Operations Management Suite (ref. the enumeration value created in the first step).

After publishing this Runbook I’m ready to run it with a Hybrid Worker.

Testing the PowerShell Runbook with a Hybrid Worker

Now I can run my Azure Automation PowerShell Runbook. I select to run it on my previously defined Hybrid Worker Group.

The Runbook completes successfully, and the output shows the new incident details:

I can also see the Incident created in Service Manager:

That concludes this first part of this blog post. Stay tuned for how to create an OMS Alert and trigger this Runbook in part 2!