Category Archives: Automation

Connect to Microsoft Graph in Azure DevOps Pipelines using Workload Identity Federation

Microsoft recently announced that Workload Identity Federation for Azure Pipelines is now in Public Preview: https://devblogs.microsoft.com/devops/public-preview-of-workload-identity-federation-for-azure-pipelines/.

This opens up a lot of scenarios for Azure service connections, without the need to manage secrets for service principals, and improves security as there are no secrets that can be exposed or exfiltrated.

As I work a lot with Microsoft Graph and automation, I wanted to see if and how I could use Workload Identity Federation to connect to and send queries to Microsoft Graph using Azure Pipelines.

Create the Workload Identity Federation Service Connection

First of all, I need to create a service connection in my Azure DevOps project that will use the new Workload Identity Federation. To be able to do this, you need to have access to the preview functionality; see details in this Learn article: Create an Azure Resource Manager service connection using workload identity federation.

When you have access to the feature, you can create a new Workload Identity federation service connection using either manual or automatic configuration:

I will now choose the Azure Subscription, and optionally a Resource Group. Choosing a resource group is a good idea, as the service connection will be given Contributor access only to that Resource Group, and not the whole subscription. But it also depends on what you want to use your Service Connection for, in my case it is a demo scenario for Microsoft Graph Access, so it makes sense to scope the permissions down:

After creating the Service Connection, I can find it in my Entra ID tenant. Let's look at the role assignments for the Resource Group first:

The service principal has been given a name in the form <DevOps Org>-<DevOps Project>-<guid>, and has been assigned Contributor access to that Resource Group.

Next, let's find the App Registration for the Service Connection. As you can see below, no (0) secret or certificate credentials have been created, but a federated credential has been created:

If we look at the details for the federated credential, we can see the issuer, subject and audience, and confirm that this service principal can only be accessed by the service connection in Azure DevOps:

Next, go to API permissions. Here I will add a Microsoft Graph permission, so that we can use that for queries in the pipeline later. In my case I add the Application permission User.Read.All, so I can look up user information:

We are now ready to set up an Azure Pipeline to use this service connection.

Create the Azure Pipeline to access Microsoft Graph API

In your DevOps project, if this is a new project, make sure that you initialize the Repository, and that you have at least a Basic or Visual Studio access level, then head to Pipelines and create a “New Pipeline”. For my environment I will just choose the following steps:

  1. Select Azure Repos Git (YAML)
  2. Select my repository
  3. Use a starter pipeline (or you can choose an existing one if you have it)

This is sample YAML code that will use the service connection (see the picture below) to get an access token for Microsoft Graph, and then use that access token to connect with the Graph PowerShell SDK. In my example I'm just showing how to get some simple user information:

There are different ways you can go about this; in my case I was just using Azure CLI in one task to get the access token for the Graph resource type. (You can also use an Az PowerShell task for this, as sketched after the YAML below.) I also set and secure the variable for use in later steps in the pipeline job.

In the next task I use PowerShell Core to convert the token to a secure string, and then install the required Microsoft Graph PowerShell modules. I can then connect to Graph and get user information. Here is the complete YAML code:

# Pipeline for accessing Microsoft Graph using Federated Workload Identity Credential
# Created by Jan Vidar Elven, Evidi, 15.09.2023

trigger:
- none

pool:
  vmImage: windows-latest

steps:
- task: AzureCLI@2
  displayName: 'Get Graph Token for Workload Federated Credential'
  inputs:
    azureSubscription: 'wi-fed-sconn-ado-to-msgraph'
    scriptType: 'pscore'
    scriptLocation: 'inlineScript'
    inlineScript: |
      $token = az account get-access-token --resource-type ms-graph
      $accessToken = ($token | ConvertFrom-Json).accessToken
      Write-Host "##vso[task.setvariable variable=secretToken;issecret=true]$accessToken"
- task: PowerShell@2
  displayName: 'Connect to Graph PowerShell with Token'
  inputs:
    targetType: 'inline'
    script: |
      # Convert the secure variable to a secure string
      $secureToken = ConvertTo-SecureString -String $(secretToken) -AsPlainText

      # Install Microsoft Graph Modules required
      Install-Module Microsoft.Graph.Authentication -Force
      Install-Module Microsoft.Graph.Users -Force

      # Connect to MS Graph
      Connect-MgGraph -AccessToken $secureToken

      # Get User Info
      Get-MgUser -UserId "[email protected]"
      
    pwsh: true
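
As a side note, the Az PowerShell alternative mentioned above would replace the Azure CLI step. A minimal sketch of the inline script for an AzurePowerShell task could look like this (assuming the Az.Accounts module is available on the agent, which it is on the Microsoft-hosted images):

# Sketch of the inline script for an AzurePowerShell task (not part of the original pipeline)
# Get-AzAccessToken returns a token object; the Token property holds the bearer token
$token = Get-AzAccessToken -ResourceTypeName MSGraph
Write-Host "##vso[task.setvariable variable=secretToken;issecret=true]$($token.Token)"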

I can now try to run the pipeline. On the first run you will have to validate and permit access to the service connection from the pipeline:

And then I can verify that it can indeed connect to Graph via the PowerShell SDK and get my resources via the Workload Identity Federation service connection:

Summary and Usage Scenarios

Most will use the new Workload Identity Federation for Azure Pipelines that access Azure subscriptions and resources, but I have shown that as long as this uses the Entra ID authentication platform and OIDC, it is possible to get access tokens for other APIs as well, in this case the Microsoft Graph API.

You can use the Graph API to get information about your tenant, to enrich and complement your existing CI/CD pipelines, or in some cases automate consistent deployments also for Graph resources, like for example important settings and policies.

Speaking at Nordic Virtual Summit 2021!

I’m excited to announce that I’m speaking at the inaugural Nordic Virtual Summit 2021, from 10th to 11th February. Nordic Virtual Summit is a Microsoft IT Pro Community Event, organized by the people behind #SGUCSE #SCUGDK #SCUGFI #MMUGNO and #MSEndpointMgr communities!

Sessions will be delivered in 3 tracks:

  • Endpoint Management
  • Security & Compliance
  • Azure & Automation

Each day will start with a pre-conference talk, and then the sessions kick off at each hour mark, 3 sessions before lunch and 3 after. There will be a 15-minute break and Q&A between sessions, so you can catch your breath, fill up your coffee or just wait in excitement for the next session 🙂

My session will be about how, while Azure serverless automation solutions like Azure Functions & Logic Apps can be great for your automation scenarios, you can secure access to requests and protect your serverless automation using Azure AD authentication and authorization.

Session details:

I’m hearing the number of registered attendees is now closing in on 2000, so make sure you register and secure your FREE ticket today:

Register – Nordic Virtual Summit

Hope to see you there!

Remote Authentication and Controlling Philips Hue API using Postman

A while back I bought my first starter kit of Philips Hue lights and bridge, and since then I’ve bought several more bulbs, lamps, lights strips and more. One of the things I quickly got interested in was how to use the very well developed Hue API (https://developers.meethue.com/develop/hue-api/). You need to register and login with a Hue Developer Account, which is free, to be able to explore the API documentation, and for following the steps I describe in this blog post.

If you want to have a very quick look at the Hue API, you can start with this link, which does not require a Hue Developer Account: https://developers.meethue.com/develop/get-started-2/. If you are familiar with Postman for exploring APIs, which may have brought you to this blog post to begin with, you should read on as I will describe and share some of the work I have been doing.

When I started exploring the API, I did what most people do: I searched online for tips and how-to's. I found the following useful Postman Collection on GitHub: https://github.com/moldedbits/philips-hue-postman. So I downloaded that collection, and after configuring my Postman environment with the IP address of the bridge, and pushing the link button to create a username, all described in the get started link above, I was good to go!

However, this lets me explore the API only locally, requiring me to run the commands from a local device. This is where I should let you know my real reason for wanting to use the Hue API, as I work with Microsoft and Cloud solutions daily. I wanted to look into scenarios where I can manipulate my Hue lights depending on events from the Microsoft platform. It could be updating the lights to reflect my presence in Teams, it could be blinking the lights if some alert happened, and so on. I want to be able to use Azure Functions, Power Platform, PowerShell etc. for controlling my Hue API.

To be able to do that, I need to be able to run Hue API commands remotely, and luckily the Hue API supports that as well: https://developers.meethue.com/develop/hue-api/remote-api-quick-start-guide/.

Before I go into the details of this blog post on how I set up remote authentication and control of my Hue API using Postman, I first wanted to share a proof and sneak peek of a coming blog post, showing that I have succeeded in what I wanted: I can now authenticate and control my lights using for example Power Platform:

It was really the foundation of what I learned using Postman to do remote authentication to the Hue API that enabled me to create the solution above, so let's get into the details.

Create a New Remote Hue API app

The first thing you need to do after you have created a Hue Developer account is to create a new Remote Hue API app here: https://developers.meethue.com/my-apps/.

You need to specify the following required fields:

  • App name: A display name for your remote app
  • Callback URL: This is needed for the Oauth2 consent and returning an authorization code. Specify https://app.getpostman.com/oauth2/callback here.
  • Application description: A short description for your remote app.
  • Optionally you can specify contact details.

After submitting, your new app is ready, and an AppId, ClientId and ClientSecret will be provided, for example:

You are now ready to start setting up your Postman client. If you don’t have Postman installed you can get it at https://www.postman.com/downloads/.

Create Postman Environment and Collection

In the Postman client I have saved all my variables and requests against the Hue API in an environment and collection. At the end of this blog post, I will share a link to a GitHub repository where you can download these samples to start with yourself.

As I will be using several variables with my Postman queries against the Hue API, I’ve created an environment like the following:

Most of these variables are empty to begin with, but you can start by filling out:

  • ClientId: From the app registration at your Hue Developer Account.
  • ClientSecret: From the app registration.
  • AppId: From the app registration.
  • DeviceId: This is the id of the client you will be running queries from. It could be anything like the name of the computer, or the environment you are running from. I just chose my surname and postman.
  • DeviceName: Any display name for the device you run the queries from.

The rest of the variables will be empty for now, but they will be calculated and updated based on responses later.

Also, since collections and variables can be synced and shared with other users or devices, I keep the sensitive variables like client id and secret only in the current value column.

After creating my environment, I’ve created a collection with folders for my requests, for example like the following:

We can now proceed to start with Remote Authentication to Hue API.

Remote Authentication to Hue API

The Hue Remote API uses the Oauth2 authorization code flow to get an access token that has to be included in the HTTP Authorization header for every request to the API. We will use the app registration we did earlier to authenticate with the developer account and grant application access, and with the resulting authorization code we can request the access token to be used in the requests.

The documentation for Remote API is here: https://developers.meethue.com/develop/hue-api/remote-authentication/, and I will walk through the following requests in my Postman Collection:

Get Authorization Code

To get an authorization code you need to construct a URL like the following:

https://api.meethue.com/oauth2/auth?clientid=ClientId&response_type=code&state=anystring&appid=AppId&deviceid=DeviceId&devicename=DeviceName

You can use Postman to construct the URL like the following, but you cannot really run the query from here, as the Postman browser does not support the response:

You can use the code view on the right to get a preview of the request, then copy it from there and paste it into your browser:

When you execute the authorization code url in your browser, you will be asked to log in with your Hue Developer account, if you aren’t already, and then you need to grant permission to your app registration:

After successfully granting permission, you will be redirected to the Callback Url you specified earlier, and from there you can get the authorization code:

Copy that code and paste it in the environment variable for AuthorizationCode as shown below:

Now that we have the Authorization Code, we can proceed requesting the Access Token. There are two methods for getting an access token:

  • Basic Authentication, which sends your client id and secret as base64 encoded strings in the authorization header.
  • Digest Authentication, which uses a challenge-response handshake that handles the credentials more securely.

See documentation for more details on whether to use basic or digest. I will show both in the following.

Get Access Token with Basic Authentication

To get an access token with basic authentication, create a POST request to the token endpoint, specifying the following query parameters:

  • code=authorization code you received in the previous step, using the environment variable
  • grant_type=authorization_code

In addition to that you need to set the Authorization header to type Basic Auth, and specify the ClientId and ClientSecret as shown below:

Postman has a great feature called tests that you can use to run JavaScript commands before (pre-request) or after queries. You can use this to save response details and update your environment variables. In my example I'm saving the response, which includes the access and refresh tokens, and updating my environment variables with that:

If I now run this request (and you haven't waited too long for the authorization code to expire), you should get a successful response:

These tokens expire after the specified number of seconds. For the Access Token this is 168 hours = 7 days. The Refresh Token expires after 112 days.

This means that after 7 days, you will need to use the refresh token to get a new access token. And when you do that, you will get a new access token valid for 7 more days, and a new refresh token again valid for another 112 days. This means in theory that as long as you refresh the access token regularly, this can last “forever” for your client.

You can also verify that the environment variables for access token and refresh token have been updated based on my JavaScript from above:
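
As a side note, the same token request can also be scripted outside Postman. Here is a minimal PowerShell sketch of the Basic Authentication call described above; the endpoint and query parameters are the ones shown earlier, the variable values are placeholders, and I am assuming the response fields follow standard OAuth2 naming (access_token, refresh_token):

# Sketch: get an access token with Basic Authentication (placeholder values)
$clientId     = "<ClientId>"
$clientSecret = "<ClientSecret>"
$authCode     = "<AuthorizationCode>"

# Base64 encode clientid:clientsecret for the Basic Authorization header
$basicAuth = [Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes("${clientId}:${clientSecret}"))

$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://api.meethue.com/oauth2/token?code=$authCode&grant_type=authorization_code" `
    -Headers @{ Authorization = "Basic $basicAuth" }

$accessToken  = $tokenResponse.access_token
$refreshToken = $tokenResponse.refresh_token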

Refresh Token with Basic Authentication

If your access token has expired, you will get a response like the following:

To refresh the access token, you need to use the oauth2/refresh endpoint with the query parameter grant_type=refresh_token:

As when getting the access token with basic authentication, when refreshing you also need to specify the authorization header to use basic auth, with client id and secret as username and password:

In Headers, specify that the Content-Type is application/x-www-form-urlencoded:

And in the Body of the request, specify the environment variable for refresh token as form parameter like this:

Under Tests I have the same javascript as shown earlier, which lets me save the refreshed access token and refresh token to my environment again. When I run the request to refresh the access token, I should get a successful response like the following:
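
The refresh call can be scripted the same way; here is a minimal PowerShell sketch, reusing the Basic Authorization header from the previous sketch and again assuming standard OAuth2 response field names:

# Sketch: refresh the access token with Basic Authentication
$refreshResponse = Invoke-RestMethod -Method Post `
    -Uri "https://api.meethue.com/oauth2/refresh?grant_type=refresh_token" `
    -Headers @{ Authorization = "Basic $basicAuth" } `
    -ContentType "application/x-www-form-urlencoded" `
    -Body @{ refresh_token = $refreshToken }

$accessToken  = $refreshResponse.access_token
$refreshToken = $refreshResponse.refresh_token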

Now that I have an access token and refresh token, I’m all set for doing the necessary remote config for getting a username to use in the requests, but first I will show how to do Digest Authentication. If you want you can skip right to the Remote Configure Link Button and Create User section below.

Get Access Token with Digest Authentication

When you want to get an access token using Digest Authentication, you will need to start over again with getting an authorization code as described earlier. After that I will just request an access token with a POST request to the oauth2/token endpoint with the query parameters of code=authorization code & grant_type=authorization_code. This is basically the same request as I used with basic authentication, but with one important difference: I will not specify the Authorization Header to be basic auth.

When I run this request without an authorization header, I will get a 401 Unauthorized response like the following. In the response header, the WWW-Authenticate key gives me the first part of the challenge-response needed for Digest Authentication, the value called Nonce:

As with previous requests, I run a JavaScript test that extracts the response; in this case it is a little more complicated as I need to examine the response header and split it so that I can get the Nonce value. This is then saved to the Nonce environment variable like this:

Now that I have the Nonce I can proceed requesting an access token with Digest Authentication. This time I go to the request Headers and add an Authorization Header with a value as shown below:

This requires a little more explanation. The Authorization header consists of the following:

Digest username=”{{ClientId}}”, realm=”[email protected]”, nonce=”{{Nonce}}”, uri=”/oauth2/token”, response=”{{ResponseMD5Hash}}”

, where Nonce is the value I retrieved from the WWW-Authenticate response header in the first request, and the response is a calculated MD5 hash as explained in the Hue Remote API documentation linked earlier. But how did I get this value calculated? This time around I'm using a pre-request script to calculate it, based on the formula from the documentation (marked in the green comment below):

So this little pre-request script calculates and sets the ResponseMD5Hash environment variable, so that I have that available when constructing the Authorization Header for Digest authentication.
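
For reference, the same calculation can be done in PowerShell. This is only a sketch of the digest response formula as I read the Hue Remote API documentation (HASH1 = MD5(clientid:realm:clientsecret), HASH2 = MD5(verb:path), response = MD5(HASH1:nonce:HASH2)), with the realm and nonce values taken from the WWW-Authenticate response header:

# Sketch: calculate the digest response hash (formula as described in the Hue Remote API docs)
function Get-MD5Hash([string]$Text) {
    $md5   = [System.Security.Cryptography.MD5]::Create()
    $bytes = $md5.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($Text))
    return -join ($bytes | ForEach-Object { $_.ToString("x2") })
}

$realm = "<realm from the WWW-Authenticate header>"
$nonce = "<nonce from the WWW-Authenticate header>"

$hash1        = Get-MD5Hash "${clientId}:${realm}:${clientSecret}"
$hash2        = Get-MD5Hash "POST:/oauth2/token"            # use /oauth2/refresh when refreshing
$responseHash = Get-MD5Hash "${hash1}:${nonce}:${hash2}"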

Before I proceed with requesting an access token using Digest Authentication, I need to put together a Test script that does 2 tests:

  1. Check if there is a WWW-Authenticate response header, and if so save the Nonce to environment variable.
  2. Check if there is a successful json response and if so save the access and refresh tokens to environment variables.

My script now looks like this:

This means that I will need to run the request twice in succession; the first time I will get a 401 Unauthorized response, in which I will get the Nonce:

The second time I will get the 200 OK and a JSON response:

I now have an access token using Digest Authentication:

Refresh Token with Digest Authentication

When the Access Token expires, I will need to use the Refresh Token to get a new Access Token with Digest Authentication. Refreshing using Digest is based on the same challenge-response flow as getting the access token.

First I need to construct a POST request to the oauth2/refresh endpoint like below:

Then I have to add the Content-Type: application/x-www-form-urlencoded header, and the Authorization using Digest as shown below:

This Digest Authorization header is the same as when requesting the access token, with one important difference: the uri is "/oauth2/refresh".

Next, in the Body I need to specify the refresh token:

In the Pre-request Script, make sure to use “/oauth2/refresh” when calculating the md5 hash:

And the Tests scripts are the same as with getting an access token:

Remember that refreshing the access token using Digest follows the same pattern as getting the access token: you will need to run the refresh request twice in succession, getting a 401 Unauthorized and then a 200 OK as shown above.

Get Access Token with Postman Oauth2 Authorization

With the collection of requests shown above, you should now be able to use the access token, and refresh it when needed, for as long as you need to. Before I proceed to configuration and accessing my lights, I will add one more method of getting an access token with Postman: using the built-in Oauth2 Authorization method.

Let's start by creating the following GET request; this is just a simple request to get configuration from my Hue bridge:

Next, go to Authorization, select Oauth 2.0 and add Authorization data to Request Headers as shown below. Then click on the Get New Access Token button:

In the following dialog, give a name for your token, and make sure that the grant type is Authorization Code. Select Authorize using browser, and note the Callback URL; you need to copy this URL. Construct an Auth URL using your environment variables like I have shown below, specify the Access Token URL and the ClientId and ClientSecret variables, type any string for state, and make sure that Client Authentication is Basic Auth header:

Now, before you click on Request Token, you need to go back to your Hue Developer Account and change the Callback URL to the one above:

When you have done that, you can click on Request Token, which will launch your default browser:

You need to be logged in with your Hue Developer account and grant permission:

After that you will get the following message when redirected:

You might have some issues with pop up blocker or redirects:

If everything works as expected, your Postman client will receive the callback and complete the authentication:

And you should receive your Access Token:

Note the little warning about partial support:

Click the Use Token button, which should fill in the Access Token field. Now click Send on the request:

This should now respond with some config details of your Hue Bridge, like the name and versions. If you go to Headers and show the hidden headers, you will see that the Authorization header has been set with Bearer and your Access Token.

So going forward, and for the rest of this blog post, you now have two options for using Access Token:

  1. Use the Access Token from environment variables that I have shown how you can retrieve with Basic or Digest Authentication, and Refresh when needed.
  2. Use the Oauth2 Authorization for every request (or set it at the parent level of your collection of requests). When the Access Token expires, it won't refresh automatically, so you need to request a new one.

Remote Configure Link Button and Create User

Now for the next part we will have to remote configure the link button, so that the bridge will accept creating a new user for us to be able to control the lights.

Enable Link Button

Users of Hue will remember that from time to time you need to push the link button on your bridge when adding new users. With the Remote API we can do this without having to run around the house 😉

Create a PUT request like the following. On Headers specify Authorization and value to be Bearer {{AccessToken}}, which will get your environment variable from the previously retrieved access token. PS! For all the following requests: Always include this Authorization: Bearer {{AccessToken}} header!

Specify Content-Type: application/json:

Next, set the Body to {“linkbutton”: true}:

And then Send the Request, you should get a success response like below:

You now have 30 seconds to add your user..

Add Whitelist Identifier

Next, create a POST request to the Bridge endpoint that specifies devicetype in the Body, as shown below, where devicetype is a named description of your client and app, for example:

Run the request, but if 30 seconds have passed, you will get this response:

In that case, just run the PUT request again to enable the link button, and try again. When successful you should get a response like the following:

This username has to be included in every Hue Remote API request later, so make sure to save it in your environment variables:
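
For completeness, these two steps could also be scripted. The sketch below assumes the remote config endpoint is /bridge/0/config, as used in the Hue Remote API documentation, and the devicetype value is just an example description:

# Sketch: enable the link button and create a whitelist user (endpoint paths assumed from the Hue docs)
$headers = @{ Authorization = "Bearer $accessToken" }

# Enable the link button (gives you a 30 second window)
Invoke-RestMethod -Method Put -Uri "https://api.meethue.com/bridge/0/config" `
    -Headers $headers -ContentType "application/json" -Body '{"linkbutton": true}'

# Create the whitelist user within the 30 second window
$result = Invoke-RestMethod -Method Post -Uri "https://api.meethue.com/bridge/" `
    -Headers $headers -ContentType "application/json" -Body '{"devicetype": "my_hue_app#postman"}'

# Save the returned username for all later requests
$username = $result.success.username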

Get All Configuration

Your starting point for all requests is now https://api.meethue.com/bridge/{{username}}/ (and remember to always include the Authorization: Bearer {{AccessToken}} header).

If I want to list all configuration, I can run this GET request:

Which will provide me with all the configuration details of my Bridge:

We are now ready to start controlling the Lights and more!

Control Lights Remotely

I will now show you some sample requests for controlling lights via the Hue Remote API:

List All Lights

If I want to list all my lights, I can just run a GET request: https://api.meethue.com/bridge/{{username}}/lights/:

The response will show all lights, numbered from 1 up to the count of lights connected to your bridge. I've collapsed some details in the response above, but you can see from my light number 12 that the state is "on", and you can see the brightness, color and more.

PS! Every light is a new typed object, and not an array, which I find a bit unfortunate, so you need to be aware of that if you want to loop through your lights, for example (see the sketch below).

If you want to get a specific light, just add the light number at the end of the GET request URI, like this: https://api.meethue.com/bridge/{{username}}/lights/12
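
If you want to do the same from PowerShell, here is a minimal sketch that lists the lights and loops through them, keeping in mind that the response is an object with one property per light, not an array:

# Sketch: list all lights and loop through the typed object response
$headers = @{ Authorization = "Bearer $accessToken" }
$baseUri = "https://api.meethue.com/bridge/$username"

$lights = Invoke-RestMethod -Method Get -Uri "$baseUri/lights/" -Headers $headers

# Each light is a property named by its number, so enumerate the properties to loop through them
$lights.PSObject.Properties | ForEach-Object {
    "{0}: {1} (on: {2})" -f $_.Name, $_.Value.name, $_.Value.state.on
}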

Turn a Light On/Off

You can control the state of a specific light by doing a PUT request to your specified light number, for example light number 12: https://api.meethue.com/bridge/{{username}}/lights/12/state

You need a Request Body that tells what the light state should be. For example, to turn the light off, just include a body with {"on":false}; the response will confirm success (and you should be able to visually confirm as well ;):

Likewise, to turn on again:
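
The same on/off requests could look like this in PowerShell, reusing the headers and base URI from the previous sketch (light number 12 is just the example from above):

# Sketch: turn light 12 off, then on again
Invoke-RestMethod -Method Put -Uri "$baseUri/lights/12/state" -Headers $headers `
    -ContentType "application/json" -Body '{"on": false}'

Invoke-RestMethod -Method Put -Uri "$baseUri/lights/12/state" -Headers $headers `
    -ContentType "application/json" -Body '{"on": true}'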

Controlling Light State and Colors

Now you can really start playing with your lights with custom states, you can change brightness, go from cold white to warm white, or maybe set a color scene for the millions of colors available if your light supports color. More info can be found here:

https://developers.meethue.com/develop/get-started-2/core-concepts/#controlling-light

Light colors are primarily controlled by the CIE color space, where you specify an array of two values between 0 and 1, expressed as x and y.

Source: https://developers.meethue.com/develop/get-started-2/core-concepts/#controlling-light

Here is an example where I both control brightness (1-254) and a custom color:
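
That screenshot is not reproduced here, but a state request along those lines could look like the following sketch (reusing $headers and $baseUri from the sketches above), where bri (1-254) and the xy color array are state keys from the Hue API core concepts documentation and the values are just illustrative:

# Sketch: set brightness and a custom CIE xy color on light 12 (illustrative values)
$body = '{"on": true, "bri": 200, "xy": [0.31, 0.32]}'
Invoke-RestMethod -Method Put -Uri "$baseUri/lights/12/state" -Headers $headers `
    -ContentType "application/json" -Body $body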

Now, you can just experiment and learn for yourself. A good tip is to use your Hue App on your mobile phone to set the color you like, and then use GET to get the light state color settings.

Summary and Github Repo

I hope you have learned a lot about the Hue Remote API while reading this; I certainly learned a lot exploring and documenting it via Postman. This understanding has provided me with the knowledge and tools to look into automation and integration scenarios with the Microsoft solutions I use daily in my job. I've already been doing this with PowerShell, Azure Functions, Logic Apps and Power Automate, to name a few. There is a blog post coming soon on Power Platform and how I used it to control my lights based on Teams presence and more.

So whether you are a developer that loves to write code, or maybe you are more into the low-code, no-code solutions, I hope this has been helpful.

I promised I will share my Postman Collection and Requests for Hue Remote API, so here it is: https://github.com/JanVidarElven/HueRemoteAPI-Postman-Collections

Knock yourself (or your lights) out, and please feel free to comment or contribute 🙂

Azure MFA Report Dashboard in Azure Portal–The Good, The Bad and The Ugly

If you are working with EMS and implementing Azure AD, Intune, MDM, MAM, Information Protection and more, you can build yourself some great dashboards in the Azure Portal using tiles and pin blades to your customized dashboard. This is an example from my own workplace:

image

Often when I work with projects implementing Identity & Access, Conditional Access and Azure MFA, I wish I could have a dashboard to report on MFA registration, and be able to pin that to my EMS dashboard as shown above.

It might be that in the future Azure MFA registrations and methods will be natively available in the portal, but for now this information has to be retrieved in another way. In this blog post I will show you how you can set up a solution for showing this information. I will use the Markdown Tile from the gallery for displaying this information, and in the end it will look like this:

I referred in the title of this blog post to the good, the bad and the ugly, and by that I mean the process of setting this up, because it starts easy enough in the beginning but it will get more “ugly” in the end 😉

The Good – Setting up the Markdown Tile

I will use the Markdown Tile for the content in my customized dashboard in my Azure Portal. The first part is easy to set up, just click Edit and find the Markdown tile from the gallery, as shown below:

image

Drag the tile to the place you want it on your dashboard, and then edit the title, subtitle and content as prompted:

image

There is sample content provided; other than that you can write your own markdown. I will not get into details on the markdown format here, as there are a lot of good guides for learning the format, for example this: https://guides.github.com/features/mastering-markdown/. I will however provide you with a sample for reporting MFA registrations and default methods. This is how I set up my markdown tile:

image

And here is a link to my github repository where you can get the complete MFAReport.md file sample:

https://github.com/skillriver/AzureMFADashboard/blob/master/MFAReport.md

Now we need to fill that markdown tile with some real Azure AD MFA report data to report on.

The Bad – PowerShell Script for getting MFA registration and methods to Markdown

So the "bad" news is that we are reliant on running some Azure AD PowerShell commands to get user details for MFA registration and methods. For now we are also reliant on the Azure AD v1 PowerShell (MSOnline) module, as the new v2 AzureAD module does not yet have any methods to get MFA authentication data. We cannot use the Microsoft Graph API either to get MFA user data, but I expect that to change in the future.

So let's look at the script I use. After authenticating and connecting to Azure AD in my tenant with Connect-MsolService, I will run the following commands to get details from each user that has one or more StrongAuthenticationMethods configured, group on those methods, and save the results to a hash table. The results are stored in the $authMethodsRegistered object variable. Similarly, I run the command once more, filtering on only the methods that are set as default for each user, and save the results to the $authMethodsDefault variable.

# Connect to MSOnline PowerShell (Azure AD v1)
Connect-MsolService

# Get MFA Methods Registered as Hash Table
$authMethodsRegistered = Get-MsolUser -All | Where-Object {$_.StrongAuthenticationMethods -ne $null} | Select-Object -Property UserPrincipalName -ExpandProperty StrongAuthenticationMethods `
| Group-Object MethodType -AsHashTable -AsString

# Get Default MFA Methods as Hash Table
$authMethodsDefault = Get-MsolUser -All | Where-Object {$_.StrongAuthenticationMethods -ne $null} | Select-Object -Property UserPrincipalName -ExpandProperty StrongAuthenticationMethods `
| Where-Object {$_.IsDefault -eq $true} | Group-Object MethodType -AsHashTable -AsString

# Create a Custom Object for MFA Data
$authMethodsData = New-Object PSObject
$authMethodsData | Add-Member -MemberType NoteProperty -Name AuthPhoneRegistered -Value $authMethodsRegistered.TwoWayVoiceMobile.Count
$authMethodsData | Add-Member -MemberType NoteProperty -Name AuthPhoneAppRegistered -Value $authMethodsRegistered.PhoneAppOTP.Count
$authMethodsData | Add-Member -MemberType NoteProperty -Name OfficePhoneRegistered -Value $authMethodsRegistered.TwoWayVoiceOffice.Count
$authMethodsData | Add-Member -MemberType NoteProperty -Name AlternatePhoneRegistered -Value $authMethodsRegistered.TwoWayVoiceAlternateMobile.Count
$authMethodsData | Add-Member -MemberType NoteProperty -Name OneWaySMSDefault -Value $authMethodsDefault.OneWaySMS.Count
$authMethodsData | Add-Member -MemberType NoteProperty -Name PhoneAppNotificationDefault -Value $authMethodsDefault.PhoneAppNotification.Count
$authMethodsData | Add-Member -MemberType NoteProperty -Name PhoneAppOTPDefault -Value $authMethodsDefault.PhoneAppOTP.Count
$authMethodsData | Add-Member -MemberType NoteProperty -Name TwoWayVoiceMobileDefault -Value $authMethodsDefault.TwoWayVoiceMobile.Count
$authMethodsData | Add-Member -MemberType NoteProperty -Name TwoWayVoiceOfficeDefault -Value $authMethodsDefault.TwoWayVoiceOffice.Count

# Write to Markdown file
"## MFA Authentication Methods`n" | Set-Content .\MFAReport.md -Force -Encoding UTF8
"### Registered`n" | Add-Content .\MFAReport.md -Encoding UTF8
"The following methods has been registered by users:`n" | Add-Content .\MFAReport.md -Encoding UTF8
"| Method | Count |" | Add-Content .\MFAReport.md -Encoding UTF8
"|:-----------|:-----------|" | Add-Content .\MFAReport.md -Encoding UTF8
"| Authentication Phone | " + [string]$authMethodsData.AuthPhoneRegistered + " |" | Add-Content .\MFAReport.md -Encoding UTF8
"| Phone App | " + [string]$authMethodsData.AuthPhoneAppRegistered + " |" | Add-Content .\MFAReport.md -Encoding UTF8
"| Alternate Phone | " + [string]$authMethodsData.AlternatePhoneRegistered + " |" | Add-Content .\MFAReport.md -Encoding UTF8
"| Office Phone | " + [string]$authMethodsData.OfficePhoneRegistered + " |" | Add-Content .\MFAReport.md -Encoding UTF8
"" | Add-Content .\MFAReport.md -Encoding UTF8
"### Default Method" | Add-Content .\MFAReport.md -Encoding UTF8
"The following methods has been configured as default by users:" | Add-Content .\MFAReport.md -Encoding UTF8
"" | Add-Content .\MFAReport.md -Encoding UTF8
"| Method | Count |" | Add-Content .\MFAReport.md -Encoding UTF8
"|:-----------|:-----------|" | Add-Content .\MFAReport.md -Encoding UTF8
"| OneWay SMS | " + [string]$authMethodsData.OneWaySMSDefault + " |" | Add-Content .\MFAReport.md -Encoding UTF8
"| Phone App Notification | " + [string]$authMethodsData.PhoneAppNotificationDefault + " |" | Add-Content .\MFAReport.md -Encoding UTF8
"| Phone App OTP | " + [string]$authMethodsData.PhoneAppOTPDefault + " |" | Add-Content .\MFAReport.md -Encoding UTF8
"| TwoWay Voice Mobile | " + [string]$authMethodsData.TwoWayVoiceMobileDefault + " |" | Add-Content .\MFAReport.md -Encoding UTF8
"| TwoWay Voice Office Phone | " + [string]$authMethodsData.TwoWayVoiceOfficeDefault + " |`n" | Add-Content .\MFAReport.md -Encoding UTF8
"Last reported " + [string](Get-Date) | Add-Content .\MFAReport.md -Encoding UTF8

"" | Add-Content .\MFAReport.md

The complete PowerShell script can be found at my GitHub repository here:

https://github.com/skillriver/AzureMFADashboard/blob/master/MFAStrongAuthenticationUserReport.ps1

So now we have a script where we can get MFA authentication details for each user and create a markdown file that we can use in the tile in the Azure Portal custom dashboard. But it is all a manual process now, and it works fine for an ad hoc update. If we want to automate however, we have to get into the “ugly” stuff 😉

The Ugly – Automating Markdown creation and update Dashboard

This part requires multiple steps. First we need to schedule and run the PowerShell commands from above. Then we need to find a way to update the customized dashboard tile with the updated markdown file. To summarize, this is what we need now:

  • Schedule the PowerShell script to run automatically. We can use Azure Automation for that.
  • Programmatically change the markdown tile in the customized dashboard. We can use Azure Resource Manager Rest API for that.

Let's get into the Azure Automation solution first. To run a PowerShell script I will need to create a Runbook, and in that Runbook I need to authenticate to Azure AD. I could define a Credential Asset with a username and password for a global admin user, but I like to use the least privilege possible, and besides, all my global admins are protected by Azure AD PIM and/or MFA, so that won't work. I prefer to use a service principal wherever possible, but after testing extensively with Connect-MsolService, that is not supported either.

So I tested with a dedicated Azure AD credential account, first by only adding the user to the Directory Readers role. I was able to list all users with Get-MsolUser, but not get any StrongAuthentication info. Neither did it work with Security Readers. In the end I added the user account to the User Administrator role in Azure AD, and was then successful in getting StrongAuthentication methods.

So, in my automation account I will add or reuse my credentials:

image

Next, I will create a new PowerShell script based Runbook. Basically I will use the PowerShell script from earlier in the blog, but with a couple of added parameters, getting the credential using the Get-AutomationPSCredential cmdlet. This is how it looks; you will get a link to the complete script later:

image
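
The screenshot is not reproduced here, but the start of such a runbook could look like this sketch, where the parameter and asset names are placeholders:

# Sketch of the runbook start: get the credential asset and connect to Azure AD (MSOnline)
param (
    [Parameter(Mandatory=$true)]
    [string]$CredentialAssetName
)

$credential = Get-AutomationPSCredential -Name $CredentialAssetName
Connect-MsolService -Credential $credential

# ...followed by the same Get-MsolUser reporting commands as in the script shown earlier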

And after testing, I can see that I successfully get the MFAReport.md content (I added a Get-Content .\MFAReport.md at the end to display the output):

image

Now that we have a solution for running the PowerShell script and generating the markdown file, the next part is how to update that data in the custom dashboard. For that we need to look into programmatically changing Azure Portal dashboards. There is a good resource and starting point for that in this article: https://docs.microsoft.com/en-us/azure/azure-portal/azure-portal-dashboards-create-programmatically.

First you need to share the custom dashboard; remember to include the markdown tile we set up in the first part of this blog post. At the top of the portal dashboard, select the Share button:

image

By sharing, the dashboard will be published as an Azure resource. Specify a name, select the subscription, and either use the default dashboard resource group or select an existing one:

image

Go to Resource Explorer in the Portal:

image

Navigate to Subscriptions, Resource Groups, and find the resource group and resource containing the custom dashboard. From there you will be able to see the JSON definition of the dashboard and specifically the markdown tile containing the content we want to update:

image

For the next step we need to copy this complete JSON definition containing all your tiles, including the markdown tile. Locally on your computer, create a .json file in your favorite JSON editor (I use Visual Studio Code for this) and paste in the content. I have named my file DeploymentTemplateMFAReport.json.

Now we need to change this template to be ready for deployment, and for that we need to add or change a couple of things. First, at the start of the JSON file, add the schema and versioning, and the parameters, variables and resources sections, like I have shown below in lines 1-17:

image

I have chosen to use 3 parameters: the markdown content itself, the name of the dashboard, and the title of the dashboard.

Next, find the tile for the markdown content, and change the content value to the value of the parameter, like I have done at line 113 here:

image

And last, at the end of the json template, add and change the following settings, I have used my parameters for the dashboard name and the dashboard title here in the lines 401-411:

image

My deployment template for the customized dashboard is now completely generic and can be used in any environment. In fact you are welcome to borrow and use my template from above; I have included it in my github repository:

https://github.com/skillriver/AzureMFADashboard/blob/master/DeploymentTemplateMFAReport.json

Working locally in my Visual Studio Code editor, I can now test the deployment using Azure PowerShell, as shown below and described with these simple steps:

  1. Connect to Azure Resource Manager and select the Subscription
  2. Specify a variable for the Resource Group you want to deploy to
  3. The MFAReport.md file (which we created earlier) needs some converting to JSON format; I'm removing all escape characters and any unneeded special characters
  4. Specify variable names for your environment for the name and title of the dashboard
  5. Deploy the custom dashboard to the resource group

image
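
The screenshot is not reproduced here, but a rough sketch of those steps with the AzureRM module could look like the following; note that the template parameter names are placeholders and must match the names defined in your deployment template:

# Sketch of a local test deployment (placeholder names, AzureRM module)
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionName "<your subscription>"

$resourceGroupName = "<your dashboard resource group>"

# Read the markdown file created earlier (the original script also strips escape and special characters here)
$markdownContent = Get-Content .\MFAReport.md -Raw

New-AzureRmResourceGroupDeployment -ResourceGroupName $resourceGroupName `
    -TemplateFile .\DeploymentTemplateMFAReport.json `
    -TemplateParameterObject @{
        markdownContent = $markdownContent
        dashboardName   = "<dashboard name>"
        dashboardTitle  = "<dashboard title>"
    }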

However, now that we can test the deployment, I want to schedule a deployment using Azure Automation, and I will continue on my previous runbook from before. But first we need to set up some connections for authenticating to Azure and some variables.

In Azure Automation we can create an Azure Run As Account, which will also create a service principal. If you navigate to your Automation Account in the Azure Portal, and go to the section called Run as accounts, you can create an Azure Run As Account automatically, as I have done here:

image

If I look more closely at this generated Run As Account, I can see details for Azure AD App Registration, Service Principal, Certificate and more. This account will also automatically be assigned Contributor role for my Azure Subscription. If you want more control over Azure Run As Accounts, you can create your own as described in the following article: https://docs.microsoft.com/en-us/azure/automation/automation-create-runas-account

image

I will use this Azure Run As account in my environment to deploy the dashboard resource; I'll just need to make sure the account has contributor access to the resource group. Next I will set up a few variables under the Variables section for my Automation Account; I will use these variables when I deploy the resource:

image

Now we are ready to finally put together the complete Runbook and test it. You will have the complete link later in the blog post, but I will share some screenshots first:

After I've connected with Connect-MsolService I'm creating a variable for the markdown content, so I've changed from earlier, when I saved a .md file temporarily; now I'm just adding lines using the newline special character (`n):

image
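
The screenshot is not reproduced here, but based on the file-writing script shown earlier, building the same content as a string variable could look something like this sketch:

# Sketch: build the markdown content in a variable instead of writing a file
$markdown  = "## MFA Authentication Methods`n"
$markdown += "### Registered`n"
$markdown += "| Method | Count |`n"
$markdown += "|:-----------|:-----------|`n"
$markdown += "| Authentication Phone | " + [string]$authMethodsData.AuthPhoneRegistered + " |`n"
$markdown += "| Phone App | " + [string]$authMethodsData.AuthPhoneAppRegistered + " |`n"
# ...and so on for the remaining rows, as in the file-based script above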

The next part is for logging in to Azure (using the Azure Run As Account mentioned above), and then getting my variables ready for deployment:

image

Then I convert the markdown content to JSON format, removing any escape characters that I don't need:

image

And then deploy the dashboard resource with parameters for markdown content and dashboard name & title. Note that I’m using my deployment template as a source from my github repository via the TemplateUri property:

image

You can use any TemplateUri you want, for example from a more private source like a storage account blob etc.

Testing the Runbook gives the following output, which shows it was successful:

image

When I now go and refresh the dashboard in the portal, I can see that the markdown tile has been updated:

image

That leaves me with just publishing and scheduling the runbook:

image

When creating a new schedule specify a name and recurrence:

image

Link the schedule to the runbook and any needed parameters; I have to specify my credential that is allowed to run Connect-MsolService:

image

That concludes this lengthy blog post. The script will now run regularly and update my custom markdown tile with MFA report data.

Here is the link to the PowerShell script used in my Azure Automation runbook, enjoy your MFA Reporting!

https://github.com/skillriver/AzureMFADashboard/blob/master/MFAStrongAuthenticationUserReportAutomation.ps1

 

How to use PowerShell script for setting Azure AD Password Reset Writeback On-premises Permissions

When you configure the Azure AD Premium Self Service Password Reset solution on your Azure AD tenant and then the Azure AD Connect Password Writeback feature, you will need to add permissions in your local Active Directory that permit the Azure AD Connect account to actually change and reset passwords for your users, as detailed here: https://docs.microsoft.com/en-us/azure/active-directory/active-directory-passwords-getting-started#step-4-set-up-the-appropriate-active-directory-permissions.

I wrote this PowerShell script that helps you configure this correctly in your domain/forest. Some notes:

  • You can use it in a single-domain forest or in a multi-domain forest; just remember to specify a Domain Controller for the wanted domain, and for the domain the Azure AD Connect account is in.
  • You have to find the Azure AD Connect synchronization account; it would be MSOL_xxxx.. if you have used Express settings, or a dedicated account. Look at the current configuration for details.
  • You can specify an OU for your users, and if inheritance is enabled all subordinate users and OUs will inherit the permissions. If not, please run the script once for each OU you want the permissions to be applied for.

Here is the script:


# Description: Sets Azure AD Connect Password Write Back AD Permissions
# Created by: Jan Vidar Elven, Enterprise Mobility MVP, Skill AS
# Last Modified: 01.06.2016
# Run this on-premises for your domain/forest
Import-Module ActiveDirectory
#region Initial Parameters/Variables
# Domain Controller in wanted domain, leave blank if using current domain
$dcserver = "mydc.domain.local"
# Azure AD Connect Synchronization Account
$aadcaccount = "MSOL_xxxxx"
# Azure AD Connect Account Domain Controller
$aadcserver = "mydc.domain.local"
# Organizational Unit Distinguished Name, starting point for delegation
$oudn = "OU=<UsersOU>,DC=domain,DC=local"
#endregion
# Check if default current domain or specified domain should be used
If ([string]::IsNullOrEmpty($dcserver)) {
    # Get a reference to the RootDSE of the current domain
    $rootdse = Get-ADRootDSE
    # Get a reference to the current domain
    $domain = Get-ADDomain
}
Else {
    # Get a reference to the RootDSE of the domain for the specified server
    $rootdse = Get-ADRootDSE -Server $dcserver
    # Get a reference to the domain for the specified server
    $domain = Get-ADDomain -Server $dcserver
}
# Refer to my Active Directory, either current or specified server from above
New-PSDrive -Name "myAD" -Root "" -PsProvider ActiveDirectory -Server $rootdse.dnsHostName
# Change to My Active Directory Command Prompt
cd myAD:
# Create a hashtable to store the GUID value of each schema class and attribute
$guidmap = @{}
Get-ADObject -SearchBase ($rootdse.SchemaNamingContext) -LDAPFilter `
"(schemaidguid=*)" -Properties lDAPDisplayName,schemaIDGUID |
% {$guidmap[$_.lDAPDisplayName]=[System.GUID]$_.schemaIDGUID}
# Create a hashtable to store the GUID value of each extended right in the forest
$extendedrightsmap = @{}
Get-ADObject -SearchBase ($rootdse.ConfigurationNamingContext) -LDAPFilter `
"(&(objectclass=controlAccessRight)(rightsguid=*))" -Properties displayName,rightsGuid |
% {$extendedrightsmap[$_.displayName]=[System.GUID]$_.rightsGuid}
# Get a reference to the OU we want to delegate
$ou = Get-ADOrganizationalUnit -Identity $oudn
# Get the SID value of the Azure AD Connect Sync Account we wish to delegate access to
$a = New-Object System.Security.Principal.SecurityIdentifier (Get-ADUser $aadcaccount -Server $aadcserver).SID
# Get a copy of the current DACL on the OU
$acl = Get-ACL -Path ($ou.DistinguishedName)
# Create an Access Control Entry for new permission we wish to add
# Allow the Azure AD Account to reset passwords on all descendent user objects
$acl.AddAccessRule((New-Object System.DirectoryServices.ActiveDirectoryAccessRule `
$a,"ExtendedRight","Allow",$extendedrightsmap["Reset Password"],"Descendents",$guidmap["user"]))
# Allow the Azure AD Account to change passwords on all descendent user objects
$acl.AddAccessRule((New-Object System.DirectoryServices.ActiveDirectoryAccessRule `
$a,"ExtendedRight","Allow",$extendedrightsmap["Change Password"],"Descendents",$guidmap["user"]))
# Allow the Azure AD Account to write lockoutTime extended property
$acl.AddAccessRule((New-Object System.DirectoryServices.ActiveDirectoryAccessRule `
$a,"WriteProperty","Allow",$guidmap["lockoutTime"],"Descendents",$guidmap["user"]))
# Allow the Azure AD Account to write pwdLastSet extended property
$acl.AddAccessRule((New-Object System.DirectoryServices.ActiveDirectoryAccessRule `
$a,"WriteProperty","Allow",$guidmap["pwdLastSet"],"Descendents",$guidmap["user"]))
# Re-apply the modified DACL to the OU
Set-ACL -ACLObject $acl -Path ("myAD:\"+($ou.DistinguishedName))

Hope the script will be helpful!

Schedule Tesla Heating with Microsoft Flow from Calendar @microsoftflow

My Skill colleague Pål-Andre Kjøniksen wrote a blog post on how he can control the heating in his Tesla Model S based on his Office 365 Calendar appointments and Microsoft Flow.

https://www.skill.no/blogg/samhandlingsbloggen/2017-01-25-microsoft-flow-styrer-varmen-i-teslaen/.

The blog post is in Norwegian, but it should be fairly simple to follow the steps. The idea is that if he creates a calendar appointment that starts with "Tesla:", the heating will start 20 minutes before the appointment, making sure that the car is warm when he drives off.

Experts and Community unite at last ever #SCU_Europe 2016! #ExpertsLive next

This year's SCU Europe 2016, for the first time outside Switzerland in its 4th year running, was held in Berlin at the BCC (Berlin Congress Center), close to Alexanderplatz in the eastern part of "Berlin Mitte".

 

 

The intro video introducing the Experts:

Let's begin with the end: at the closing note, SCUE general Marcel Zehner announced, with a little bit of emotion, that this was the last ever SCU Europe to be held. You and your organization should be proud of what you have achieved, Marcel; it is one of the best community conferences around, and I have been fortunate to be able to visit all 4, starting with Bern in 2013, Basel in 2014 and 2015, and now Berlin in 2016. It's only cities with B's, isn't it? In fact, you never know what twists and turns your career takes, but looking back I'm not sure I would be where I am now in terms of being a presenter, MVP and community influencer myself if I had not travelled alone to Bern 4 years ago; that's where I really started working with and for the Community (with a capital C)!

Luckily SCU Europe will continue as Experts Live Europe next year! Same place at BCC, same organization and format, and the same dates only next year it will be: 23rd – 25th of August 2017. A new web page was launched, www.expertslive.eu, and Twitter (@ExpertsLiveEU) and Facebook have been changed to reflect that. The hash tag #SCU_Europe will eventually be inactive and you should now use #ExpertsLive.

image

I think this is a very good decision; there had already been discussion that the name "System Center Universe" does not really reflect the content and focus of the conference, which now embraces the Cloud, with content areas for Management, Productivity, Security, DevOps, Automation, Data Platform and more. ExpertsLive, originally a 1-day community conference in the Netherlands running each year since 2009 with up to 1200 participants, will now be a network of conferences, ranging from region based (ExpertsLive Europe, but also SCU APAC and SCU Australia will be ExpertsLive APAC and Australia next year) to local, country based ExpertsLive events like the one in the Netherlands, and more will come.

image

The closing note video announcing Experts Live Europe:

This year at SCU Europe I was one of the Experts and presented two sessions, on "Premium Identity Management and Protection with Azure AD" and "Deep Dive: Publishing Applications with Azure AD". I also took part in an "Ask-the-Experts" area together with Cameron Fuller and Kevin Greene, where we took questions on System Center 2016. I participated in a discussion panel on Friday morning with Markus Wilhelm from Microsoft Germany on the subject of defense strategies and security, and of course we had the meet and greet with the Experts at the networking party. It was a really great experience speaking at this conference, thanks for having me!

 

 

 

 

The content of the conference this year was great, and for the first time there were 5 tracks, with over 70 sessions presented! All presentations and session recordings will be on Channel 9 in a few weeks' time, so make sure you look at anything you missed or want to see again if you were there, or if you weren't at the conference this year you can look at the sessions of interest.

I was travelling with a group this year, both from my company and some of our customers; in total we were 7 in the group, and we also had 3 cancellations in the last week before the conference from some customers that could not make it after all. Moving the conference to Berlin is a big part of why it was now easier to attract more Nordic attendance, I think. We stayed at the Park Inn by Radisson right by Alexanderplatz and the BCC, so it was really central and nice.

 

 

 

 

In good tradition there are a lot of parties and social networking going on. On the first night there was the Sponsors and Speakers Party, which was held at Mio right by the TV Tower by Alexanderplatz; on Thursday we had the attendee Networking Party at the conference center. Later that night our group and some more partners/customers of Squared Up went on to another party at Cosmic Kaspar. It was really hot, so basically the party was on the pavement! On the last day we had the Closing Drinks, sponsored by Cireson and itnetX, at Club Carambar, also close to Alexanderplatz. In addition, there are a lot of unofficial gatherings going on, lots of laughs, and new and old friends having a good time.

 

 

 

 

 

 

See you next year at Experts Live Europe in Berlin 23-25th August, 2017!

Trigger Azure AD Connect Sync Scheduler with Azure Automation

In this blog post I will show how you can trigger the Azure AD Connect Sync Scheduler with an Azure Automation Runbook PowerShell script. Since Azure AD Connect build 1.1.105.0 was released in February 2016, a new scheduler is built in that by default syncs every 30 minutes (previously 3 hours). For more detail on the Azure AD Connect Sync Scheduler, see https://azure.microsoft.com/en-us/documentation/articles/active-directory-aadconnectsync-feature-scheduler/.

Normally a sync schedule of 30 minutes is sufficient for most use, but sometimes you will need to do an immediate sync. So I thought it was a good idea to create a PowerShell script that creates a remote session to the Azure AD Connect Server, and then triggers a delta sync.

Now, this PowerShell script can of course be used with any of your favorite automation solutions, for example Orchestrator or SMA on-premises. But why not just use Azure Automation and a Hybrid Worker to run this script. This way you can trigger the script in a number of ways including in the Azure Portal, via Webhooks, remediating alerts in OMS and more.

Requirements

Let's first take a look at the requirements for this solution:

  • You will have to have an Azure Subscription, so that an Azure Automation Account can be created (or use your existing account), and that a runbook script with the related assets can be created.
  • You will need to have an OMS Workspace for the Azure Subscription, and have a Hybrid Worker set up that can communicate with the Azure AD Connect Server. The Hybrid Worker will use a credential asset and variable asset created in the first part.

In the following two parts I will look at these two requirements and how you can set it up to start triggering Azure AD Connect Scheduler with your Azure Automation Runbook.

Part 1 – Set up the Azure Automation Runbook and Assets

To set up the Azure Automation part of the solution, I have created a GitHub repository from which you can deploy the solution directly: https://github.com/skillriver/Trigger-AzureADSyncScheduler. This repository contains the Azure Resource Manager deployment template and PowerShell script that you need to get started.

You can also click this deploy button directly:

 

Let's step through what you experience when you click "Deploy to Azure". Please make sure that you are logged in to the correct Azure subscription first.

Deploying with the Template

I will not go through how I created the ARM based JSON templates, but I will quickly show the user experience when doing the deployment.

The custom deployment will ask you for some parameter values:

  • AUTOMATIONACCOUNTNAME. If you specify an existing Automation Account, this will be used, or else a new one will be created with the Free pricing tier.
  • AAHYBRIDWORKERCREDENTIALNAME. There is a default value here; this will be used as the Credential Asset name in the PowerShell script. You can change the value, but then you must remember to change it in the script as well.
  • AAHYBRIDWORKERDOMAIN. The NetBIOS domain name of the domain the Azure AD Connect Server belongs to.
  • AAHYBRIDWORKERUSERNAME. This is the user name for the service account or other user account that has permission to connect to the Azure AD Connect Server and trigger the sync schedule.
  • AAHYBRIDWORKERPASSWORD. The password for the user above.
  • AADSSERVERNAME. The server name of the Azure AD Connect Server.

You will have to select the correct subscription, and either create a new Resource Group or use an existing one (please note that Azure Automation is not available in every region).

image

After saving the parameters, and reviewing and accepting legal terms, you are ready to create the custom deployment.

If everything went OK, you should see a confirmation:

image

You will have an Automation Account:

image

You will have a PowerShell Script Runbook with the name Trigger-AzureADSync:

image

The script can be viewed as shown below. This script is short and simple: it gets the Variable Asset for the Azure AD Connect Server name and the Credential Asset for the Hybrid Worker account. It then creates a remote session and runs the delta sync cycle:

image
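
In case the screenshot is hard to read, here is a minimal sketch of what the Trigger-AzureADSync runbook does. The asset names below are examples; use the names created by your own deployment:

# Minimal sketch of the Trigger-AzureADSync runbook (asset names are examples)
$aadcServer = Get-AutomationVariable -Name 'AADSServerName'
$credential = Get-AutomationPSCredential -Name 'AAHybridWorkerCredential'

# Create a remote session to the Azure AD Connect server
$session = New-PSSession -ComputerName $aadcServer -Credential $credential

# Trigger a delta sync cycle with the ADSync module on the remote server
Invoke-Command -Session $session -ScriptBlock {
    Import-Module ADSync
    Start-ADSyncSyncCycle -PolicyType Delta
}

Remove-PSSession $session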

Let's take a look at the Assets created with the deployment as well. First the Variable:

image

The Credential:

image

That’s the whole solution for this first part. If you for any reason could not or would not deploy the template directly, and would prefer to create this manually, you should be fine just following the images above. Just follow these steps:

  1. Create an Azure Automation Account (Free tier, and in your chosen supported Azure region).
  2. Create a Variable Asset, with the name of the Azure AD Connect Server.
  3. Create a Credential Asset, with the DOMAIN\UserName of the account you will use to remote session to the Azure AD Connect Server.
  4. Create a new PowerShell Script Runbook, typing in the cmdlets from the script above and using your own asset names.

By now you should be ready for the next step, because you cannot run this Automation Runbook just yet. You first need OMS and a Hybrid Worker in place, and that will be shown in the next part.

Part 2 – Set up the Hybrid Worker and Remote session permission

To be able to run Azure Automation Runbooks in your own datacenter, you will need to have an OMS workspace and at least one Hybrid Worker configured that will be able to execute the Runbook locally and connect to the Azure AD Connect Server.

Hybrid Runbook Worker Components

I will not go through the details here on how to set up an OMS workspace and a Hybrid Worker if you don't have this already; you can just follow the documentation here: https://azure.microsoft.com/en-us/documentation/articles/automation-hybrid-runbook-worker/.

After setting up and registering your Hybrid Worker, you will have a Hybrid Worker Group with at least one Hybrid Worker.

image

Now, running the Runbook with the right security is going to be essential here; after all, the Runbook is going to connect to the Azure AD Connect Server and initiate the sync cycle. Let's first check the settings of the Hybrid Worker Group. We can either select a Default Run As account, as I have here:

image

Or you can select a Custom Run As, specifying a credential Asset to use for all Runbooks running on this Hybrid Worker Group:

image

In my example here, I will use the Default Run As Account, because I specify my own credentials in the PowerShell Runbook, as shown earlier in Part 1 of this blog post:

image

Next, I will have to create a domain account in my local Active Directory. I have created a service account to be used for Azure Automation Hybrid Workers. This is the same account you specified when creating the credential asset in Part 1 in Azure Automation:

image

This account will need permission to remote PowerShell to the Azure AD Connect Server. In Computer Management and Local Users and Groups on the Azure AD Connect Server, add this service account to the Remote Management Users group:

image

And add the account to the ADSyncOperators group, so that the user has permission to run Azure AD Connect sync operations:

image
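
If you prefer to script these group memberships instead of using the GUI, a couple of commands run on the Azure AD Connect server will do the same. The account name here is just an example; use your own service account:

# Run on the Azure AD Connect server; replace the account with your own service account
net localgroup "Remote Management Users" SKILL\svc_aahybridworker /add
net localgroup "ADSyncOperators" SKILL\svc_aahybridworker /add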

That should be it, we are now ready to start the Runbook and verify that it works.

Starting the Runbook

From the Automation Account and the Trigger-AzureADSync Runbook, select Start and under Run Settings select Hybrid Worker and your Hybrid Worker Group:

image
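
As an alternative to the portal, you should also be able to start the runbook on the Hybrid Worker Group from PowerShell with the AzureRM module, along these lines (account, resource group and worker group names are examples):

# Start the runbook on a Hybrid Worker Group instead of in Azure (names are examples)
Start-AzureRmAutomationRunbook -AutomationAccountName "MyAutomationAccount" `
    -ResourceGroupName "MyResourceGroup" `
    -Name "Trigger-AzureADSync" `
    -RunOn "MyHybridWorkerGroup"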

You can verify that the job completed and with no errors:

image

Looking into the Synchronization Service on the Azure AD Connect Server, I can verify that the sync cycle has been running:

image
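
If you want to do the same verification from PowerShell instead of the Synchronization Service Manager GUI, the ADSync module can be used on the Azure AD Connect server, for example:

# Run on the Azure AD Connect server
Import-Module ADSync
Get-ADSyncConnectorRunStatus   # lists connectors that are currently running a sync
Get-ADSyncScheduler            # shows whether a sync cycle is in progress and when the next one starts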

That concludes this blog article, hope it has been helpful!

Displaying Azure Automation Runbook Stats in OMS via Performance Collection and Operations Manager

Wouldn’t it be great to get some more information about your Azure Automation Runbooks in the Operations Management Suite Portal? That’s a rhetorical question; of course the answer is yes!

While Azure Automation is a part of the suite of components in OMS, today you only get the following information from the Azure Automation blade:

The blade shows the number of runbooks and jobs from the one Automation Account you have configured. You can only configure one Automation Account at a time, and for getting more details you are directed to the Azure Portal.

I wanted to use my OMS-connected Operations Manager Management Group, and use a PowerShell script rule to get some more statistics for Azure Automation and display them in OMS Log Analytics as Performance Data. I will do this using the “Sample Management Pack – Wizard to Create PowerShell script Collection Rules” described in this blog article http://blogs.msdn.com/b/wei_out_there_with_system_center/archive/2015/09/29/oms-collecting-nrt-performance-data-from-an-opsmgr-powershell-script-collection-rule-created-from-a-wizard.aspx.

I will use the AzureRM PowerShell Module for the PowerShell commands that will connect to my Azure subscription and get the Azure Automation Runbooks data.

Getting Ready

Before I can create the PowerShell script rule for getting the Azure Automation data, I have to do some preparations first. This includes:

  1. Importing the “Sample Management Pack – Wizard to Create PowerShell script Collection Rules” to my Operations Manager environment.
    1. This can be downloaded from Technet Gallery at https://gallery.technet.microsoft.com/Sample-Management-Pack-e48040f7.
  2. Install the AzureRM PowerShell Module (at the chosen Target server for the PowerShell Script Rule).
    1. I chose to install it from the PowerShell Gallery using the instructions here: https://azure.microsoft.com/en-us/documentation/articles/powershell-install-configure/
    2. If you are running Windows Server 2012 R2, which I am, follow the instructions here to support the PowerShellGet module, https://www.powershellgallery.com/GettingStarted?section=Get%20Started.
  3. Choose Target for where to run the AzureRM commands from
    1. Regarding the AzureRM and where to install, I decided to use the SCOM Root Management Server Emulator. This server will then run the AzureRM commands against my Azure Subscription.
  4. Choose account for Run As Profile
    1. I also needed to think about the run as account the AzureRM commands will run under. As we will see later the PowerShell Script Rules will be set up with a Default Run As Profile.
    2. The Default Run As Profile for the RMS Emulator will be the Management Server Action Account, if I had chosen another Rule Target the Default Run As Profile would be the Local System Account.
    3. Alternatively, I could have created a custom Run As Profile with a user account that has permissions to execute the AzureRM cmdlets and to connect to and read the required data from the Azure subscription, and configured the PowerShell Script rules to use that.
    4. I decided to go with the Management Server Action Account, in my case SKILL\scom_msaa. This account will execute the AzureRM PowerShell cmdlets, so I need to make sure that I can login to my Azure subscription using that account.
  5. Next, I started PowerShell ISE with “Run as different user”, specifying my scom_msaa account. I ran the commands shown below, as I wanted to save the password for the user I’m going to use to connect to the Azure subscription and get the Automation data. I also did a test import of the AzureRM modules I will need in the main script.

The commands are here in full:


# Prepare to save encrypted password

# Verify that logged on as scom_msaa
whoami

# Get the password
$securepassword = Read-Host -AsSecureString -Prompt "Enter Azure AD account password:"

# Filepath for encrypted password file
$filepath = "C:\users\scom_msaa\AppData\encryptedazureadpassword.txt"

# Save password encrypted to file
ConvertFrom-SecureString -SecureString $securepassword | Out-File -FilePath $filepath

# Test import of the AzureRM modules needed in the main script
Import-Module "C:\Program Files\WindowsPowerShell\Modules\AzureRM"
Import-Module "C:\Program Files\WindowsPowerShell\Modules\AzureRM.Profile"
Import-Module "C:\Program Files\WindowsPowerShell\Modules\AzureRM.Automation"

At this point I’m ready for the next step, which is to create some PowerShell commands for the Script Rule in SCOM.

Creating the PowerShell Command Script for getting Azure Automation data

First I needed to think about what kind of Azure Automation and Runbook data I wanted to get from my Azure Subscription. I decided to get the following values:

  • Job Count Last Day
  • Job Count Last Month
  • Job Count This Month
  • Job Minutes This Month
  • Runbooks in New State
  • Runbooks in Published State
  • Runbooks in Edit State
  • PowerShell Workflow Runbooks
  • Graphical Runbooks
  • PowerShell Script Runbooks

I wanted to have the statistics for Runbooks Jobs to see the activity of the Runbooks. As I’m running the Free plan of Azure Automation, I’m restricted to 500 minutes a month, so it makes sense to count the accumulated job minutes for the month as well.

In addition to this I want some statistics for the number of Runbooks in the environment, separated on New, Published and Edit Runbooks, and the Runbook type separated on Workflow, Graphical and PowerShell Script.

The PowerShell Script Rule for getting these data will be using the AzureRM PowerShell Module, and specifically the cmdlets in AzureRM.Profile and AzureRM.Automation:

To log in and authenticate to Azure, I will use the encrypted password saved earlier, and create a Credential object for the login:

I then initialize the script with date filters and set default values for the variables. I decided to create the script so that I can get data from all the Resource Groups I have Automation Accounts in. This way, if I have multiple Automation Accounts, I can get combined statistics for all of them:

Then, looping through each Resource Group, I run the different commands to get the variable data. Since I will potentially loop through multiple Resource Groups and Automation Accounts, the variables use += to add to the value from the previous loop iteration:

After setting each variable and exiting the loop, the $PropertyBag can be filled with the values for the different counters:

The complete script for getting the Azure Automation data via a SCOM PowerShell Script Rule to OMS is shown below:


# Debug file
$debuglog = $env:TEMP + "\powershell_perf_collect_AA_stats_debug.log"

"Date: " + (Get-Date) | Out-File $debuglog

"Who Am I: " | Out-File $debuglog -Append
whoami | Out-File $debuglog -Append

$ErrorActionPreference = "Stop"

Try {

If (!(Get-Module -Name AzureRM)) { Import-Module "C:\Program Files\WindowsPowerShell\Modules\AzureRM" }
If (!(Get-Module -Name AzureRM.Profile)) { Import-Module "C:\Program Files\WindowsPowerShell\Modules\AzureRM.Profile" }
If (!(Get-Module -Name AzureRM.Automation)) { Import-Module "C:\Program Files\WindowsPowerShell\Modules\AzureRM.Automation" }

# Get Cred for ARM
$filepath = "C:\users\scom_msaa\AppData\encryptedazureadpassword.txt"
$userName = "myAzureADAdminAccount"
$securePassword = ConvertTo-SecureString (Get-Content -Path $FilePath)
$cred = New-Object -TypeName System.Management.Automation.PSCredential ($userName, $securePassword)

# Log in and set active subscription
Login-AzureRmAccount -Credential $cred

$subscriptionid = "mysubscriptionID"

Set-AzureRmContext -SubscriptionId $subscriptionid

$API = New-Object -ComObject MOM.ScriptAPI

# Date filters (negative offsets to look back in time)
$aftertime = $(Get-Date).AddHours(-1)
$afterdate_lastday = $(Get-Date).AddDays(-1)
$afterdate_lastmonth = $(Get-Date).AddDays(-30)
$afterdate_thismonth = $(Get-Date).AddDays(-($(Get-Date).Day)+1)

$AutomationRGs = @("MyResourceGroupName1","MyResourceGroupName2")

$PropertyBags=@()

$newrunbooks = 0
$publishedrunbooks = 0
$editrunbooks = 0
$scriptrunbooks = 0
$graphrunbooks = 0
$powershellrunbooks = 0
$jobcountlastday = 0
$jobcountlastmonth = 0
$jobcountthismonth = 0
$jobminutesthismonth = 0

ForEach ($AutomationRG in $AutomationRGs) {

$rmautomationacct = Get-AzureRmAutomationAccount -ResourceGroupName $AutomationRG

$newrunbooks += (Get-AzureRmAutomationRunbook -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG |
Where {$_.State -eq "New"}).Count

$publishedrunbooks += (Get-AzureRmAutomationRunbook -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG |
Where {$_.State -eq "Published"}).Count

$editrunbooks += (Get-AzureRmAutomationRunbook -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG |
Where {$_.State -eq "Edit"}).Count

$scriptrunbooks += (Get-AzureRmAutomationRunbook -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG |
Where {$_.RunbookType -eq "Script"}).Count

$graphrunbooks += (Get-AzureRmAutomationRunbook -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG |
Where {$_.RunbookType -eq "Graph"}).Count

$powershellrunbooks += (Get-AzureRmAutomationRunbook -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG |
Where {$_.RunbookType -eq "PowerShell"}).Count

$jobcountlastday += (Get-AzureRmAutomationJob -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG `
-StartTime $afterdate_lastday).Count

$jobcountlastmonth += (Get-AzureRmAutomationJob -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG `
-StartTime $afterdate_lastmonth).Count

$jobcountthismonth += (Get-AzureRmAutomationJob -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG `
-StartTime $afterdate_thismonth.ToLongDateString()).Count

$jobsthismonth = Get-AzureRmAutomationJob -AutomationAccountName $rmautomationacct.AutomationAccountName -ResourceGroupName $AutomationRG `
-StartTime $afterdate_thismonth.ToLongDateString() |
Select-Object RunbookName, StartTime, EndTime, CreationTime, LastModifiedTime, @{Name="RunningTime";Expression={($_.EndTime - $_.StartTime).TotalMinutes}}, @{Name="Month";Expression={($_.EndTime).Month}}

$jobminutesthismonth += [int][Math]::Ceiling(($jobsthismonth | Measure-Object -Property RunningTime -Sum).Sum)

}

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Job Count Last Day")
$PropertyBag.AddValue("Value", [UInt32]$jobcountlastday)
$PropertyBags += $PropertyBag

"Job Count Last Day: " | Out-File $debuglog -Append
$jobcountlastday | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Job Count Last Month")
$PropertyBag.AddValue("Value", [UInt32]$jobcountlastmonth)
$PropertyBags += $PropertyBag

"Job Count Last Month: " | Out-File $debuglog -Append
$jobcountlastmonth | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Job Count This Month")
$PropertyBag.AddValue("Value", [UInt32]$jobcountthismonth)
$PropertyBags += $PropertyBag

"Job Count This Month: " | Out-File $debuglog -Append
$jobcountthismonth | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Job Minutes This Month")
$PropertyBag.AddValue("Value", [UInt32]$jobminutesthismonth)
$PropertyBags += $PropertyBag

"Job Minutes This Month: " | Out-File $debuglog -Append
$jobminutesthismonth | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Runbooks in New State")
$PropertyBag.AddValue("Value", [UInt32]$newrunbooks)
$PropertyBags += $PropertyBag

"Runbooks in New State: " | Out-File $debuglog -Append
$newrunbooks | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Runbooks in Published State")
$PropertyBag.AddValue("Value", [UInt32]$publishedrunbooks)
$PropertyBags += $PropertyBag

"Runbooks in Published State: " | Out-File $debuglog -Append
$publishedrunbooks | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Runbooks in Edit State")
$PropertyBag.AddValue("Value", [UInt32]$editrunbooks)
$PropertyBags += $PropertyBag

"Runbooks in Edit State: " | Out-File $debuglog -Append
$editrunbooks | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "PowerShell Workflow Runbooks")
$PropertyBag.AddValue("Value", [UInt32]$scriptrunbooks)
$PropertyBags += $PropertyBag

"PowerShell Workflow Runbooks: " | Out-File $debuglog -Append
$scriptrunbooks | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "Graphical Runbooks")
$PropertyBag.AddValue("Value", [UInt32]$graphrunbooks)
$PropertyBags += $PropertyBag

"Graphical Runbooks: " | Out-File $debuglog -Append
$graphrunbooks | Out-File $debuglog -Append

$PropertyBag = $API.CreatePropertyBag()
$PropertyBag.AddValue("Instance", "PowerShell Script Runbooks")
$PropertyBag.AddValue("Value", [UInt32]$powershellrunbooks)
$PropertyBags += $PropertyBag

"PowerShell Script Runbooks: " | Out-File $debuglog -Append
$powershellrunbooks | Out-File $debuglog -Append

$PropertyBags

} Catch {

"Error Caught: " | Out-File $debuglog -Append
$($_.Exception.GetType().FullName) | Out-File $debuglog -Append
$($_.Exception.Message) | Out-File $debuglog -Append

}

PS! I have included debugging and logging in the script. Be aware, though, that $ErrorActionPreference = "Stop" will end the script on any error, for example with the logging, so it might be an idea to remove the debug logging once you have confirmed that everything works.

In the next part I’m ready to create the PowerShell Script Rule.

Creating the PowerShell Script Rule

In the Operations Console, under Authoring, create a new PowerShell Script Rule as shown below:

  1. Select the PowerShell Script (Performance – OMS Bound) Rule. I have created a custom destination management pack for this script.
  2. Specify a Rule name and the Rule Category Performance Collection. As mentioned earlier in this article, the Rule target will be the Root Management Server Emulator:
  3. Select how often to run the script, in my case every 30 minutes, and at which time the interval will start:
  4. Select a name for the script file and a timeout value, and enter the complete script as shown earlier:
  5. For the Performance Mapping information, the Object name must be in the \\FQDN\YourObjectName format. For FQDN I used the Target variable for PrincipalName, and for the Object Name AzureAutomationRunbookStats, adding "\\" at the start and "\" in between: \\$Target/Host/Property[Type="MicrosoftWindowsLibrary7585010!Microsoft.Windows.Computer"]/PrincipalName$\AzureAutomationRunbookStats. I specified the Counter name as "Azure Automation Runbook Stats", and the Instance and Value are specified as $Data/Property(@Name='Instance')$ and $Data/Property(@Name='Value')$. These reflect the PropertyBag instance and value created in the PowerShell script:
  6. After finishing the Create Rule Wizard, two new rules are created, which you can find by scoping to the Root Management Server Emulator I chose as target. Both rules must be enabled, as they are not enabled by default:

At this point we are finished configuring the SCOM side, and can wait for some hours to see that data are actually coming into my OMS workspace.

Looking at Azure Automation Runbook Stats Performance Data in OMS

After a while I will start seeing Performance Data coming into OMS with the specified Object and Counter Name, and for the different instances and values.

In Log Search, I can specify Type=Perf ObjectName=AzureAutomationRunbookStats, and I will find the Results and Metrics for the specified time frame.

In the example above I’m highlighting the Job Minutes This Month counter, which steadily increases during each month; as we can see the highest value was 107 minutes, and when the month changed to March we were back at 0 minutes. After a while, when the number of job minutes increases, it will be interesting to follow whether this counter gets close to 500 minutes.

This way, I can now look at Azure Automation Runbook stats as performance data, showing different scenarios like how many jobs and runbook job minutes there are over a time period. I can also look at what types of runbooks I have and what state they are in.

I can also create saved searches and alerts for my search criteria.

Creating OMS Alerts for Azure Automation Runbook Counters

There is one specific scenario for Alerts I’m interested in, and that is when I’m approaching my monthly limit of 500 job minutes.

“Job Minutes This Month” is a counter that will get the sum of all job minutes for all runbook jobs across all automation accounts. In the classic Azure portal, you will have a usage overview like this:

With OMS I would get this information over a time period like this:

The search query for Job Minutes This Month as I have defined it via the PowerShell Script Rule in OMS is:

Type=Perf ObjectName=AzureAutomationRunbookStats InstanceName="Job Minutes This Month"

This would give me all results for the defined time period, but to generate an alert I would want to look at the most recent results, for example for the last hour. In addition, I want to filter the results for my alert when the number of job minutes are over the threshold of 450 which means I’m getting close to the limit of 500 free minutes per month. My query for this would be:

Type=Perf ObjectName=AzureAutomationRunbookStats InstanceName="Job Minutes This Month" AND TimeGenerated>NOW-1HOUR AND CounterValue > 450

Now, in my test environment, this will give me 0 results, because I’m currently at 37 minutes:

Let’s say, for the sake of testing an Alert, I add criteria to include 37 minutes as well, like this:
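
For reference, based on the alert query above, the temporary test query would then look something like this:

Type=Perf ObjectName=AzureAutomationRunbookStats InstanceName="Job Minutes This Month" AND TimeGenerated>NOW-1HOUR AND (CounterValue > 450 OR CounterValue = 37)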

This time I have 2 results. Let’s create an alert for this by pressing the ALERT button:

For the alert I give it a name and base it on the search query. I want to check every 60 minutes, and generate an alert when the number of results is greater than 1, so that I make sure the passing of the threshold is consistent and not just temporary.

For actions I want an email notification, so I type in a Subject and my recipients.

I save the alert rule, and verify that it was successfully created.

Soon I get my first alert by email:

Now that it works, I can remove the Alert and create a new one without the OR CounterValue=37 criteria; this I leave to you 😉

With that, this blog post is concluded. Thanks for reading; I hope this post on how to get more insight into your Azure Automation Runbook stats in OMS, getting the data via NRT Performance Collection, has been useful 😉

Creating SCSM Incidents from OMS Alerts using Azure Automation – Part 2

This is the second part of a two-part blog article that shows how you can create a new Service Manager Incident from an Azure Automation Runbook using a Hybrid Worker Group, and how OMS Alerts can search for a condition and generate an alert which triggers this Azure Automation Runbook via a Webhook, creating an Incident in Service Manager with some contextual data from the Alert.

In Part 1 of the blog I prepared my Service Manager environment, and created the Azure Automation Runbook and Assets to run via a Hybrid Worker for generating incidents in Service Manager. In this second part I will configure my Operations Management Suite environment for OMS Alerting and Alert Remediation, and create an OMS Alert that will trigger this PowerShell Runbook.

Configuring OMS Alerting and Remediation

If you haven’t already done so for your OMS Workspace, you will need to enable OMS Alerting and Alert Remediation under Settings and Preview Features. This is shown in the picture below:

Creating the OMS Alert

The next step is to create the OMS Alert. To do this I will need to do a Log Search with the criteria I want. For my example in this article, I will use an EventLog Search where I have previously added Azure AD Application Proxy Connector Event Log to OMS, and where I also have created a custom field for events where “The Connector was unable to connect to the service due to networking issues”.

The result of this Log Search is shown below, where I have 7 results in the last 7 days:

Since I enabled OMS Alerting and Remediation under Settings, I now have a new Alert button at the bottom of the screen. I click on that to create my new OMS Alert.

I give the OMS Alert a descriptive name, use my current search query, and check every 15 minutes for this alert. I can also specify a threshold over a specified time window; in this case I want the Alert to trigger if there are more than 0 occurrences. If I want to, I can also send an email notification to specified recipient(s).

Since I want to generate a SCSM Incident when this OMS Alert triggers, I select to Enable Remediation and select my Create-SCSMIncident Runbook.

After saving the OMS Alert I get a successful confirmation, and a link to where I can see my configured Alerts:

While in Preview I can only create up to 10 Alerts, and I can remove them but not edit existing ones for now:

That is all I need to configure in Operations Management Suite to get the OMS Alert to trigger. Now I need to go back to the Azure Portal and configure some changes for my PowerShell Runbook!

Configuring Azure Automation PowerShell Runbook for Webhook and Hybrid Worker Group

In the Azure Portal, under my Automation Account and the PowerShell Runbook I created for Create-SCSMIncident (see Part 1), a Webhook for OMS Alert Remediation has now automatically been created. This Webhook has an expiry date one year after creation.

I now need to specify the Parameters for the Webhook, so that it runs on my Hybrid Worker group:

After I have specified the Hybrid Worker group, any OMS Alerts will now trigger this Runbook and run it in my local environment, in this case creating the SCSM incident as specified in the PowerShell Runbook. But I also want to have some contextual data in the Incident, so I need to look at the Webhook data in the next step.

Configuring and using Webhook for contextual data in Runbook

Whenever the OMS Alert triggers the remediation Azure Automation Runbook via the Webhook, event information is submitted from OMS to the Runbook via the WebhookData input parameter.

An example of this is shown in the image below, where the WebhookData Input Parameter contains event information formatted as JSON (JavaScript Object Notation):

So, I need to configure my PowerShell Runbook to process this WebhookData, and to use that information when creating the Incident.

Let’s first take a look at the WebhookData. If I copy the input from above to, for example, Visual Studio Code, I can see more clearly that the WebhookData consists of a WebhookName, RequestBody and RequestHeader. The values I’m looking for are in the RequestBody and its SearchResults:

I update my PowerShell Runbook so that I can process the WebhookData, and get the WebhookName, WebhookHeaders and WebhookBody. When I have the WebhookBody, I can get the SearchResults and, by using ConvertFrom-Json, loop through the value array to get the fields I’m looking for, like this:

In this case I want the Source, EventID and RenderedDescription, which also corresponds to the values from the Alert in OMS, as shown below. I then use these values for the Incident Title and Description in the PowerShell Runbook.

The complete Azure Automation PowerShell Runbook is shown below:

param (
    [object]$WebhookData
)

if ($WebhookData -ne $null) {

# Get Webhook Data
$WebhookName = $WebhookData.WebhookName
$WebhookHeaders = $WebhookData.RequestHeader
$WebhookBody = $WebhookData.RequestBody

# Writing Webhook Data to verbose output
Write-Verbose "Webhook name: '$WebhookName'"
Write-Verbose "Webhook header:"
Write-Verbose $WebhookHeaders
Write-Verbose "Webhook body:"
Write-Verbose $WebhookBody

# Searching Webhook Data for Value Results
$SearchResults = (ConvertFrom-Json $WebhookBody).SearchResults
$SearchResultsValue = $SearchResults.value
Foreach ($item in $SearchResultsValue)
{
# Getting Alert Source, EventID and RenderedDescription
$AlertSource = $item.Source
Write-Verbose "Alert Name: '$AlertSource'"
$AlertEventId = $item.EventID
Write-Verbose "Alert EventID: '$AlertEventId'"
$AlertDescription = $item.RenderedDescription
Write-Verbose "Alert Description: '$AlertDescription'"
}

# Setting Incident Title and Description based on OMS Alert
$incident_title = "OMS Alert: " + $AlertSource
$incident_desc = $AlertDescription
}
else
{
# Setting Generic Incident Title and Description
$incident_title = "Azure Automation Generated Alert"
$incident_desc = "This Incident is generated from an Azure Automation Runbook via Hybrid Worker"
}

# Getting Assets for SCSM Management Server Name and Credentials
$scsm_mgmtserver = Get-AutomationVariable -Name SCSMServerName
$credential = Get-AutomationPSCredential -Name SCSMAASvcAccount

# Create Remote Session to SCSM Management Server
# (Specified credential must be in the Remote Management Users local group and be a SCSM operator)
$session = New-PSSession -ComputerName $scsm_mgmtserver -Credential $credential

# Import module for Service Manager PowerShell CmdLets
$SMDIR = Invoke-Command -ScriptBlock {(Get-ItemProperty "hklm:/software/microsoft/System Center/2010/Service Manager/Setup").InstallDirectory} -Session $session
Invoke-Command -ScriptBlock { param($SMDIR) Set-Location -Path $SMDIR } -Args $SMDIR -Session $session
Import-Module .\Powershell\System.Center.Service.Manager.psd1 -PSSession $session

# Create Incident
Invoke-Command -ScriptBlock { param ($incident_title, $incident_desc)

# Get Incident Class
$IncidentClass = Get-SCSMClass -Name System.WorkItem.Incident

# Get Prefix for Incident IDs
$IncidentPrefix = (Get-SCSMClassInstance -Class (Get-SCSMClass -Name System.WorkItem.Incident.GeneralSetting)).PrefixForId

# Set Incident Properties
$Property = @{
    Id = "$IncidentPrefix{0}"
    Title = $incident_title
    Description = $incident_desc
    Urgency = "System.WorkItem.TroubleTicket.UrgencyEnum.Medium"
    Source = "SkillSCSM.IncidentSourceEnum.OMS"
    Impact = "System.WorkItem.TroubleTicket.ImpactEnum.Medium"
    Status = "IncidentStatusEnum.Active"
}

# Create the Incident
New-SCSMClassInstance -Class $IncidentClass -Property $Property -PassThru

} -Args $incident_title, $incident_desc -Session $session

Remove-PSSession $session

After publishing the updated Runbook I’m ready for the OMS Alert to trigger.

When the OMS Alert triggers

The next time this OMS Alert triggers, I can verify that the Runbook is started and an Incident is created. Since I also wanted an email notification, I received that too:

In Operations Management Suite, I search for any OMS Alerts generated by using the query “Type=Alert SourceSystem=OMS”:

In Azure Automation, I can see that the Runbook has launched a job:
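
If you prefer PowerShell over the portal for this check, something like the following should list the most recent jobs for the runbook (account and resource group names are examples):

# List the most recent jobs for the Create-SCSMIncident runbook (names are examples)
Get-AzureRmAutomationJob -AutomationAccountName "MyAutomationAccount" -ResourceGroupName "MyResourceGroup" -RunbookName "Create-SCSMIncident" |
    Sort-Object StartTime -Descending |
    Select-Object -First 5 Status, StartTime, EndTime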

And most importantly, I can see that the Incident is created in Service Manager with the info I specified:

That concludes this two-part blog article on how to create SCSM Incidents from OMS Alerts. OMS Automation rocks!