
How to Send Requests to GitHub API from Power Platform using Custom Connector

Recently I came across a personal scenario where I use Hugo and GitHub Pages as a team site for a soccer team I’m coaching, and wanted to automate some updates to the website. I’ve written a blog post previously on how I organized trainings at home using Power Platform: How I as a Soccer Coach…. | GoToGuy Blog, and I am now using GitHub Pages and Hugo for publishing some statistics and more for that scenario.

In this blog post I will show how I:

  1. Created an OAuth Application for Github API.
  2. Created a Custom Connector in Power Platform for connections to that OAuth Application.
  3. Created Operations for getting content, updating content and triggering workflows for Github Actions.
  4. Connected to Github API using my Azure AD account and user impersonation.
  5. Created a Power Automate Cloud Flow for using the Custom Connector and the defined operations.

Let’s get started!

Create OAuth Application for Github API

Start by logging in to your GitHub account and go to Settings. Under Settings you will find Developer Settings where you can access OAuth Apps. You can also go directly to the following URL https://github.com/settings/developers.

Click to Register a new application, and fill in something like the following:

As the above image shows, give the application a descriptive name for your scenario. You can type any homepage URL, as this is not important in this scenario. The authorization callback URL is important though, as this will be the callback to the Custom Connector we will create later. We can verify the URL later, but use https://global.consent.azure-apim.net/redirect.

Register the application. Next you can change the settings for the registered app. You will have to copy the Client ID, as we will need that later. You also need to create a Client Secret, and make sure to copy that as well, as you will only be able to see it once. You can also change some settings like name, logo and branding if you like. This is how my GitHub App registration looks now:

We can now proceed to Power Platform to create the Custom Connector.

Create Custom Connector to Github API in Power Platform

Log in to your Power Platform environment, and go to Custom Connectors under Data. Click to create a New custom connector. You can select to create from blank if you want to follow along with the steps in my blog post here, or you can select to import an OpenAPI file, as I will provide the swagger file at the end of this blog post.

Give the connector a name of your choice and continue:

Next you need to specify “api.github.com” as host. You can also optionally upload a connector icon, as I have done here:

(You can grab the mark logo used above from here, GitHub Logos and Usage, note the usage requirements).

Next, go to Security. Select OAuth 2.0 as authentication type, and then select GitHub as Identity Provider.

(PS! You can select Generic OAuth 2 as well, but it will eventually fall back to GitHub as the Identity Provider anyway).

Add your Client ID and Secret from the Github OAuth application registration:

It is important to configure the correct scope (or scopes) as this will authorize the client for accessing the API. If you leave the scope blank, you will only get public read only access. You can read more on available scopes here: Scopes for OAuth Apps – GitHub Docs

In my case I want to have full read and write access to public repositories, as well as read write to user profile, so I set the scope to “public_repo user” (use space delimiter for multiple scopes):

I can now click “Create connector”. After creating, the security details are hidden/disabled, and I can verify that the Redirect URL is the same as the Callback URL from the GitHub OAuth app registration:

We can now start defining the operations for the actions I want to do against the GitHub API.

Create Operations for sending requests to GitHub API

When querying and sending request to the GitHub API you need to know the API details and required parameters for what you want. The following link is for the official GitHub Rest API reference: Reference – GitHub Docs.

In my example I want to define the following 3 operations in my Custom Connector:

  • Get Repository Content
  • Update File Contents
  • Dispatch Workflow Event

Under 3. Definition, select to create a New action, and call it something like “Get Repository Content” with the Operation ID set to “GetRepositoryContent”:

Then, under Request, click Import from sample. Select the Verb GET, and under URL type https://api.github.com. The rest of the query we will get from the GitHub API docs. Copy the following path from the REST API reference docs: /repos/{owner}/{repo}/contents/{path}

So that your sample request now looks like this. Remember to add the recommended Accept header (application/vnd.github.v3+json):

Click Import. The request will now ask for owner, repo and path as parameters:

Next, click the default response. Here you can copy the sample response from the REST API docs, I’ve copied the sample response for getting file contents:

After that click “Update connector” and we have the first action operation defined.
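For reference, outside of the connector this operation corresponds to a plain GET request against the GitHub REST API. Here is a minimal PowerShell sketch of the same call (not part of the original setup; the token and repository values are placeholders):

# Placeholder values for illustration only
$owner = "<github-account>"
$repo = "<repository>"
$path = "README.md"
$headers = @{
    Accept        = "application/vnd.github.v3+json"
    Authorization = "token <personal-access-token-or-oauth-token>"
}

# Get file or folder content from the repository
Invoke-RestMethod -Method Get -Uri "https://api.github.com/repos/$owner/$repo/contents/$path" -Headers $headers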

Click New action again, this time for updating file contents:

For the sample request the Verb is PUT, the URL is the same as when getting file content, but now we need to specify a request body as well:

I’ve created the sample request body based on the docs reference, with just empty placeholder values for the parameters needed. Some of these can be omitted, but message, content, sha and branch are required for updating an existing file:

{
 "message": "",
 "content": "",
 "sha": "",
 "branch": "",
 "committer": {
  "name": "",
  "email": "",
  "author": {
   "name": "",
   "email": ""
  }
 }
}

After importing the sample request, you can click into the body parameter and set the body itself to required, as well as any of the payload parameters below that you always want to include:

Add a sample default response as well, I’ve copied the example response for updating a file from the docs.

Click “Update connector” again and we are ready to add the third action:

This will be a POST request to https://api.github.com/repos/{owner}/{repo}/actions/workflows/{workflow_id}/dispatches, with a request body containing “ref” and optionally “inputs”:

Note from above that “ref” needs to reference a branch or tag name, and is a required parameter. “inputs” is an object for any input parameters defined in your GitHub Actions workflow, so in many cases this can be empty.

You can leave the default response as it is, as the API will return 204 No Content if the request is successful.
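To illustrate what this operation will send, here is a minimal PowerShell sketch of the equivalent request sent directly to the GitHub API (the token is a placeholder; blank.yml and main are the workflow file and branch I use later in this post):

# Placeholder values for illustration only
$owner = "<github-account>"
$repo = "<repository>"
$workflowId = "blank.yml"
$headers = @{
    Accept        = "application/vnd.github.v3+json"
    Authorization = "token <personal-access-token-or-oauth-token>"
}
$body = @{ ref = "main"; inputs = @{} } | ConvertTo-Json

# Returns no content (HTTP 204) when the dispatch is accepted
Invoke-RestMethod -Method Post -Uri "https://api.github.com/repos/$owner/$repo/actions/workflows/$workflowId/dispatches" -Headers $headers -Body $body -ContentType "application/json"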

Click on “Update connector” again, and you should now have 3 actions successfully configured.

We can now proceed to create a connection and authenticate to GitHub API using this custom connector.

Connect to Github API using my Azure AD account and user impersonation

Go to “4. Test”, and click to create a “New connection”. This will create a new authentication popup, and if you’re not already logged in to GitHub you must log in first. Note the correct reference and branding to the “Elven Power Platform OAuth App”:

After logging in I’m prompted to authorize the OAuth app to access data in my account. Note that the scopes “public_repo” and “user” are shown in the authorization request:

If you own other organizations you can grant access to those as well. Click Authorize “OwnerName” as shown below:

After authorizing you will be redirected back to the Connections, and you should be able to successfully get a new connection object.

Let’s take a look at GitHub settings again, under https://github.com/settings/applications. You should see the OAuth App and the correct permissions configured if you click into details. You can also revoke the access if you need to remove it or reconfigure the scopes for example:

Let’s do a test from the Custom Connector and see what we get. Click on GetRepositoryContent, and provide the parameters for “owner” (your GitHub account name), “repo” (any repository, I’m using my GitHub Pages repo here), and a “path” to an existing file in that repo (I’m just testing against my README.md at root, but this can be any subfolder/file as well). Click Test operation and see:

This should be successful. Note that the response contains a couple of important values for later: the “sha” for the existing file, and the “content”, which is a base64 representation of the current contents of the README.md file.
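If you want to inspect that content value outside of the connector test, you can decode the Base64 string, for example with a quick PowerShell snippet like this (assuming the test response is stored in $response):

# Decode the base64 encoded "content" value from the response to readable text
[System.Text.Encoding]::UTF8.GetString([System.Convert]::FromBase64String($response.content))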

Click on the Request tab, and you will see a preview of how the request was constructed. You will also see the Authorization Header with the Bearer Token:

A couple of important things to note:

  • The request uses an API gateway in Azure APIM, not GitHub directly.
  • The Bearer Token in the Authorization Header is for the Azure API GW audience, so it cannot be used directly against GitHub API.

Copy the entire token value, from after “Bearer <token……>”, and paste it into a JWT debugger like jwt.io. From there we can look at the decoded payload:

From that payload it’s clear that the Token has been issued by my Azure AD tenant and for my logged on user in Power Platform. The scope is user_impersonation, so this will be used in an on-behalf-of flow scenario via the audience defined as apihub.azure.com, which in turn will request GitHub API resources on my behalf via the APIM gateway used by Power Platform.

You can also look up the appid from the Token in the Azure AD tenant, and you will find the following Enterprise Application, from where you can enable or disable it on an organization level, or examine the sign-in logs:

We can test the other operations as well, but let’s create a Flow for that scenario.

Create a Power Automate Cloud Flow for using the Custom Connector to Get and Update File Content

Create a new Cloud Flow, using an instant trigger for manually triggering a flow. Add some text inputs for the new file content, as well as owner, repo and path, like shown below:

Next, add a new action and from under Custom find the GitHub Custom Connector:

Add the “Get Repository Content” action and then fill in the inputs like below:

Next, add a Compose action, with the following dynamic expression:

base64ToString(outputs('Get_Repository_Content')?['body/content'])

This is just for checking what the existing file contents are:

We can do a quick Save and then Test the Flow so far. From the Run history I should get the correct inputs, and when finding the existing file the outputs will include the sha value of the existing file, as well as the base64 encoded value of the content:

And when looking at the decoding of the content I can see that the readme.md file content is shown correctly:


Go into Flow edit mode again, and add another Compose action, this time we need to base64 encode the new content I want to update the file with:

Note that the base64 function takes the trigger input as its parameter, base64(triggerBody()?['text']), as this is the first text parameter of the trigger.

Add a new action, this time from the Custom Connector again, selecting Update File Contents. Specify the owner, repo and path from the previous input values, type a custom message for the commit message, select the outputs from the “Base64 Updated Content” action for the content, and use the sha value from “Get Repository Content”. The rest of the values (the committer and author objects) are optional:
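For comparison, this is roughly what the Update File Contents operation sends to the GitHub API. A minimal PowerShell sketch with placeholder values (the commit message and branch are just examples):

# Placeholder values mirroring the Flow inputs
$owner = "<github-account>"
$repo = "<repository>"
$path = "README.md"
$headers = @{
    Accept        = "application/vnd.github.v3+json"
    Authorization = "token <personal-access-token-or-oauth-token>"
}
$body = @{
    message = "Updated file from Power Automate"
    content = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("<new file content>"))
    sha     = "<sha-from-Get-Repository-Content>"
    branch  = "main"
} | ConvertTo-Json

Invoke-RestMethod -Method Put -Uri "https://api.github.com/repos/$owner/$repo/contents/$path" -Headers $headers -Body $body -ContentType "application/json"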

Save and then do another test, for example like the following to update the README.md file:

And the test should be successful:

I can also verify this at my repository and check the file has been updated. Note also the commit message:

Triggering a GitHub Actions Workflow

The last thing I wanted to go through in this blog post is using the Power Platform Custom Connector to trigger a GitHub Actions workflow. My use case for this is to start a Hugo build when I have dynamically updated files for my static website, but for now I will keep it simple.

I have via a basic template created a simple workflow like this:

This workflow can also be triggered manually using workflow_dispatch, so let’s use that to verify that I can call it from Power Platform.

Add a new action at the end of the Flow, adding the Custom Connector action for Dispatching Workflow event:

Specify Owner and Repo from inputs, and for workflow id either specify ID or the name of the workflow file, in this case blank.yml. The ref parameter is either a branch or tag name, so in my case I use main branch. I leave the other parameters blank as I don’t have any inputs to supply, and use the default Accept header.

Save and Test the Flow again, supplying an updated file content, owner, repo and path similar to what we did previously. When the Flow runs it should complete successfully:

If I go to my GitHub repository, and under Actions, I can see that this workflow has been triggered:

Actually it has been triggered twice, as the first trigger is automatic for the push commit on the file update, and the other (named “CI” in results) is the actual workflow dispatch from the Flow.

Basically this means that I can choose different logic for when my workflows will trigger, either as a push or pull trigger, or as a trigger event based on my Flows. But of course I won’t normally run both triggers 😉

I now have what I need for working further with my personal Hugo and GitHub Pages project, my plan is to update data and assets files from my Power Platform environment, and then trigger a Hugo build for my website. I might blog more on that process later.

Summary and some last thoughts

In this blog post I wanted to show how you can work with the GitHub REST API via a Power Platform Custom Connector. This way you can basically achieve anything that the GitHub API has available, provided the correct scope or scopes have been authorized.

I do want to mention however that there is a GitHub Connector you can use directly in Power Automate, Logic Apps or Power Apps as well: GitHub – Connectors | Microsoft Docs, where you can create a direct connection to your GitHub account. You should take a look at that if it can serve your needs.

In my case I needed the API to get or update file contents directly. In addition, when using impersonation, people in my organization can use their own Azure AD accounts if I share the Custom Connector with them; they don’t need their own GitHub accounts as long as the OAuth App has been authorized on my behalf.

If you want a quickstart on creating the Custom Connector yourself, below is the Swagger definition. Thanks for reading, hope it has been useful!

swagger: '2.0'
info: {title: JanVidarElven Github Connector, description: GitHub API Connector for JanVidarElven, version: '1.0'}
host: api.github.com
basePath: /
schemes: [https]
consumes: []
produces: []
paths:
  /repos/{owner}/{repo}/contents/{path}:
    get:
      responses:
        default:
          description: default
          schema:
            type: object
            properties:
              type: {type: string, description: type}
              encoding: {type: string, description: encoding}
              size: {type: integer, format: int32, description: size}
              name: {type: string, description: name}
              path: {type: string, description: path}
              content: {type: string, description: content}
              sha: {type: string, description: sha}
              url: {type: string, description: url}
              git_url: {type: string, description: git_url}
              html_url: {type: string, description: html_url}
              download_url: {type: string, description: download_url}
              _links:
                type: object
                properties:
                  git: {type: string, description: git}
                  self: {type: string, description: self}
                  html: {type: string, description: html}
                description: _links
      summary: Get Repository Content
      operationId: GetRepositoryContent
      description: Get File or Folder Content from Repository
      parameters:
        - {name: owner, in: path, required: true, type: string}
        - {name: repo, in: path, required: true, type: string}
        - {name: path, in: path, required: true, type: string}
        - {name: Accept, in: header, required: false, type: string}
    put:
      responses:
        default:
          description: default
          schema:
            type: object
            properties:
              content:
                type: object
                properties:
                  name: {type: string, description: name}
                  path: {type: string, description: path}
                  sha: {type: string, description: sha}
                  size: {type: integer, format: int32, description: size}
                  url: {type: string, description: url}
                  html_url: {type: string, description: html_url}
                  git_url: {type: string, description: git_url}
                  download_url: {type: string, description: download_url}
                  type: {type: string, description: type}
                  _links:
                    type: object
                    properties:
                      self: {type: string, description: self}
                      git: {type: string, description: git}
                      html: {type: string, description: html}
                    description: _links
                description: content
              commit:
                type: object
                properties:
                  sha: {type: string, description: sha}
                  node_id: {type: string, description: node_id}
                  url: {type: string, description: url}
                  html_url: {type: string, description: html_url}
                  author:
                    type: object
                    properties:
                      date: {type: string, description: date}
                      name: {type: string, description: name}
                      email: {type: string, description: email}
                    description: author
                  committer:
                    type: object
                    properties:
                      date: {type: string, description: date}
                      name: {type: string, description: name}
                      email: {type: string, description: email}
                    description: committer
                  message: {type: string, description: message}
                  tree:
                    type: object
                    properties:
                      url: {type: string, description: url}
                      sha: {type: string, description: sha}
                    description: tree
                  parents:
                    type: array
                    items:
                      type: object
                      properties:
                        url: {type: string, description: url}
                        html_url: {type: string, description: html_url}
                        sha: {type: string, description: sha}
                    description: parents
                  verification:
                    type: object
                    properties:
                      verified: {type: boolean, description: verified}
                      reason: {type: string, description: reason}
                      signature: {type: string, description: signature}
                      payload: {type: string, description: payload}
                    description: verification
                description: commit
      summary: Update File Contents
      description: Update existing file in repository
      operationId: UpdateFileContents
      parameters:
        - {name: owner, in: path, required: true, type: string}
        - {name: repo, in: path, required: true, type: string}
        - {name: path, in: path, required: true, type: string}
        - {name: Accept, in: header, required: false, type: string}
        - name: body
          in: body
          required: true
          schema:
            type: object
            properties:
              message: {type: string, description: message, title: ''}
              content: {type: string, description: content, title: ''}
              sha: {type: string, description: sha, title: ''}
              branch: {type: string, description: branch, title: ''}
              committer:
                type: object
                properties:
                  name: {type: string, description: name}
                  email: {type: string, description: email}
                  author:
                    type: object
                    properties:
                      name: {type: string, description: name}
                      email: {type: string, description: email}
                    description: author
                description: committer
            required: [branch, content, message, sha]
  /repos/{owner}/{repo}/actions/workflows/{workflow_id}/dispatches:
    post:
      responses:
        default:
          description: default
          schema: {}
      summary: Dispatch Workflow Event
      operationId: DispatchWorkflowEvent
      description: Trigger a GitHub Actions Workflow by ID
      parameters:
        - {name: owner, in: path, required: true, type: string}
        - {name: repo, in: path, required: true, type: string}
        - {name: workflow_id, in: path, required: true, type: string}
        - {name: Accept, in: header, required: false, type: string}
        - name: body
          in: body
          required: true
          schema:
            type: object
            properties:
              ref: {type: string, description: ref, title: ''}
              inputs:
                type: object
                properties: {}
                description: inputs
            required: [ref]
definitions: {}
parameters: {}
responses: {}
securityDefinitions:
  oauth2_auth:
    type: oauth2
    flow: accessCode
    authorizationUrl: https://github.com/login/oauth/authorize
    tokenUrl: https://login.windows.net/common/oauth2/authorize
    scopes: {public_repo user: public_repo user}
security:
  - oauth2_auth: [public_repo user]
tags: []

Protect Logic Apps with Azure AD OAuth – Part 2 Expose Logic App as API

This blog article will build on the previous blog post published, Protect Logic Apps with Azure AD OAuth – Part 1 Management Access | GoToGuy Blog, which provided some basic understanding around authorizing to Logic Apps request triggers using OAuth and Access Tokens.

In this blog I will build on that, creating a scenario where a Logic App will be exposed as an API to end users. In this API, I will call another popular API: Microsoft Graph.

My scenario will use a case where end users do not have access themselves to certain Microsoft Graph requests, but where the Logic App does. Exposing the Logic App as an API will let users authenticate and authorize, requesting and consenting to the custom Logic App API permissions I choose. Some of these permissions users can consent to themselves, while others must be admin consented. This way I can do some authorization inside the Logic App, and only let the end users request what they are permitted to.

I will also look into assigning users and groups, and using scopes and roles for additional fine-grained end user and principal access to the Logic App.

A lot of topics to cover, so let’s get started by first creating the scenario for the Logic App.

Logic App calling Microsoft Graph API

A Logic App can run requests against the Microsoft Graph API using the HTTP action and specifying the method (GET, POST, etc) and resource URI. For authentication against Graph from the Logic App you can use either:

  • Using Azure Active Directory OAuth and Client Credentials Flow with Client Id and Secret.
  • Using System or User Assigned Managed Identity.

Permissions for Microsoft Graph API are either “delegated” (in the context of a logged in user) or “application” (in the context of an application/daemon service). This Logic App scenario will use application permissions for Microsoft Graph.

PS! Using Logic Apps Custom Connectors (Custom connectors overview | Microsoft Docs) you can also use delegated permissions by creating a connection with a logged in user, but this is outside the scope of this article.
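To illustrate the first option above, this is a minimal PowerShell sketch of a Client Credentials token request against Microsoft Graph (not used further in this post, and all values are placeholders for an app registration with application permissions):

# Placeholder values for a client credentials token request
$tenantId = "<your-tenant-id>"
$body = @{
    grant_type    = "client_credentials"
    client_id     = "<app-client-id>"
    client_secret = "<app-client-secret>"
    scope         = "https://graph.microsoft.com/.default"
}

# Request a token for Microsoft Graph and pick out the access token
$graphToken = (Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" -Body $body).access_token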

Scenario for using Microsoft Graph in Logic App

There are a variety of usage scenarios for Microsoft Graph, so for the purpose of this Logic App I will focus on one of the most popular: Device Management (Intune API) resources. This is what I want the Logic App to do in this first phase:

  • Listing a particular user’s managed devices.
  • Listing all of the organization’s managed devices.
  • Filtering managed devices based on operating system and version.

In addition to the above, I want to implement the custom API such that any assigned user can list their own devices through end-user consent, but to be able to list all devices, or devices for any user other than yourself, you will need an admin consented permission for the custom API.

Creating the Logic App

In your Azure subscription, add a new Logic App to your chosen resource group and name it according to your naming standard. After the Logic App is created, you will need to add the trigger. As this will be a custom API, you will need it to use HTTP as the trigger, and you will also need a response back to the caller, so the easiest way is to use the template for HTTP Request-Response as shown below:

Your Logic App will now look like this:

Save the Logic App before proceeding.

Create a Managed Identity for the Logic App

Exit the designer and go to the Identity section of the Logic App. We need a managed identity, either system assigned or user assigned, to let the Logic App authenticate against Microsoft Graph.

A system assigned managed identity will follow the lifecycle of this Logic App, while a user assigned managed identity will have its own lifecycle, and can be used by other resources as well. I want that flexibility, so I will create a user assigned managed identity for this scenario. In the Azure Portal, select to create a new resource and find User Assigned Managed Identity:

Create a new User Assigned Managed Identity in your selected resource group and give it a name based on your naming convention:

After creating the managed identity, go back to your Logic App, and then under Identity section, add the newly created managed identity under User Assigned Managed Identity:

Before we proceed with the Logic App, we need to give the Managed Identity the appropriate Microsoft Graph permissions.

Adding Microsoft Graph Permissions to the Managed Identity

Now, if we wanted the Logic App to have permissions to the Azure Rest API, we could have easily added Azure role assignments to the managed identity directly:

But, as we need permissions to Microsoft Graph, there is no GUI to do this for now. The permissions needed for listing managed devices are documented here: List managedDevices – Microsoft Graph v1.0 | Microsoft Docs.

So we need a minimum of: DeviceManagementManagedDevices.Read.All.

To add these permissions we need to run some PowerShell commands using the AzureAD module. If you have that installed locally, you can connect and proceed with the following commands. For ease of access you can also use the Cloud Shell in the Azure Portal, just run Connect-AzureAD first:

PS! You need to be a Global Admin to add Graph Permissions.

You can run each of these lines separately, or run it as a script:

# Microsoft Graph App Well Known App Id
$msGraphAppId = "00000003-0000-0000-c000-000000000000"

# Display Name of Managed Identity
$msiDisplayName="msi-ops-manageddevices"

# Microsoft Graph Permission required
$msGraphPermission = "DeviceManagementManagedDevices.Read.All"

# Get Managed Identity Service Principal Name
$msiSpn = (Get-AzureADServicePrincipal -Filter "displayName eq '$msiDisplayName'")

# Get Microsoft Graph Service Principal
$msGraphSpn = Get-AzureADServicePrincipal -Filter "appId eq '$msGraphAppId'"

# Get the Application Role for the Graph Permission
$appRole = $msGraphSpn.AppRoles | Where-Object {$_.Value -eq $msGraphPermission -and $_.AllowedMemberTypes -contains "Application"}

# Assign the Application Role to the Managed Identity
New-AzureAdServiceAppRoleAssignment -ObjectId $msiSpn.ObjectId -PrincipalId $msiSpn.ObjectId -ResourceId $msGraphSpn.ObjectId -Id $appRole.Id

Verify that it runs as expected:

As mentioned earlier, adding these permissions has to be done using script commands, but there is a way to verify the permissions by doing the following:

  1. Find the Managed Identity, and copy the Client ID:
  2. Under Azure Active Directory and Enterprise Applications, make sure you are in the Legacy Search Experience and paste in the Client ID:
  3. Which you then can click into, and under permissions you will see the admin has consented to Graph permissions:

The Logic App can now get Intune Managed Devices from Microsoft Graph API using the Managed Identity.

Calling Microsoft Graph from the Logic App

Let’s start by adding some inputs to the Logic App. I’m planning to trigger the Logic App using an http request body like the following:

{
 "userUpn": "someuser@elven.no",
 "operatingSystem": "Windows",
 "osVersion": "10"
}

In the Logic App request trigger, paste as a sample JSON payload:

The request body schema will be updated accordingly, and the Logic App is prepared to receive inputs:

Next, add a Condition action, where we will check if we should get a specific user’s managed devices, or all. Use an expression with the empty function to check the userUpn input, empty(triggerBody()?['userUpn']), and compare it to the expression true, like below:

We will add more logic and conditions later for the filtering of the operating system and version, but for now add an HTTP action under True like the following:

Note the use of the Managed Identity and Audience, which will have permission for querying for managed devices.

Under False, we will get the managed devices for a specific user. So add the following, using the userUpn input in the URI:
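To make the two branches easier to follow, here is a PowerShell sketch of the Graph requests they correspond to (a sketch with a placeholder token; inside the Logic App the Managed Identity and Audience setting handle authentication, and the per-user URI is one common way of doing this):

# Placeholder token for illustration; in the Logic App the Managed Identity handles this
$headers = @{ Authorization = "Bearer $graphToken" }

# True branch: list all managed devices in the organization
Invoke-RestMethod -Method Get -Uri "https://graph.microsoft.com/v1.0/deviceManagement/managedDevices" -Headers $headers

# False branch: list managed devices for a specific user
$userUpn = "someuser@elven.no"
Invoke-RestMethod -Method Get -Uri "https://graph.microsoft.com/v1.0/users/$userUpn/managedDevices" -Headers $headers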

Both these actions should be able to run successfully now, but we will leave the testing for a bit later. First I want to return the managed devices found via the Response action.

Add an Initialize Variable action before the Condition action. Set the Name (I’m calling mine ManagedDevices) and the Type to Array as shown below; the value can be empty for now:

Next, under True and Get All Managed Devices, add a Parse JSON action, adding the output body from the http action and using either the sample response from the Microsoft Graph documentation, or your own to create the schema.

PS! Note that if you have over 1000 managed devices, Graph will page the output, so you should test for odata.nextLink to be present as well. You can use the following anonymized sample response for schema which should work in most cases:

{
     "@odata.context": "https://graph.microsoft.com/v1.0/$metadata#deviceManagement/managedDevices",
     "@odata.count": 1000,
     "@odata.nextLink": "https://graph.microsoft.com/v1.0/deviceManagement/managedDevices?$skiptoken=",
     "value": [
         {
             "id": "id Value",
             "userId": "User Id value",
             "deviceName": "Device Name value",
             "managedDeviceOwnerType": "company",
             "operatingSystem": "Operating System value",
             "complianceState": "compliant",
             "managementAgent": "mdm",
             "osVersion": "Os Version value",
             "azureADRegistered": true,
             "deviceEnrollmentType": "userEnrollment",
             "azureADDeviceId": "Azure ADDevice Id value",
             "deviceRegistrationState": "registered",
             "isEncrypted": true,
             "userPrincipalName": "User Principal Name Value ",
             "model": "Model Value",
             "manufacturer": "Manufacturer Value",
             "userDisplayName": "User Display Name Value",
             "managedDeviceName": "Managed Device Name Value"
         }
     ]
 }

PS! Remove any sample response output from schema if values will be null or missing from your output. For example I needed to remove the configurationManagerClientEnabledFeatures from my schema, as this is null in many cases.

Add another Parse JSON action under the get user managed devices action as well:

Now we will take that output and do a For Each loop for each value. On both sides of the condition, add a For Each action, using the value from the previous HTTP action:

Inside that For Each loop, add an Append to Array variable action. In this action we will build a JSON object, returning our chosen attributes (you can change to whatever you want), and selecting the properties from the value that was parsed:

Do the exact same thing for the user devices:

Now, on each side of the condition, add a Response action that will return the ManagedDevices array variable. This will be returned as JSON, so set the Content-Type to application/json:

Finally, remove the default response action that is no longer needed:

The complete Logic App should look like the following now:

As I mentioned earlier, we’ll get to the filtering parts later, but now it’s time for some testing.

Testing the Logic App from Postman

In the first part of this blog post article series, Protect Logic Apps with Azure AD OAuth – Part 1 Management Access | GoToGuy Blog, I described how you could use Postman, PowerShell or Azure CLI to test against REST API’s.

Let’s test this Logic App now with Postman. Copy the HTTP POST URL:

And paste it to Postman, remember to change method to POST:

You can now click Send, and the Logic App will trigger, and should return all your managed devices.

If you want a specific user’s managed devices, then you need to go to the Body parameter, and add a JSON body like { "userUpn": "someuser@elven.no" } with an existing user principal name in your organization:

You should then be able to get this user’s managed devices, for example for my test user this was just a virtual machine with Windows 10:

And I can verify a successful run from the Logic App history:

Summary so far

We’ve built a Logic App that uses its own identity (User Assigned Managed Identity) to access the Microsoft Graph API using Application Permissions to get managed devices for all users or a selected user by UPN. Now it’s time to expose this Logic App as an API so end users can call it securely using Azure AD OAuth.

Building the Logic App API

When exposing the Logic App as an API, this will be the resource that end users will access and call as a REST API. Consider the following diagram showing the flow for OpenID Connect and OAuth, where Azure AD will be the Authorization Server from where end users can request access tokens where the audience will be the Logic App resource:

Our next step will be to create Azure AD App Registrations, and we will start with the App Registration for the resource API.

Creating App Registration for Logic App API

In your Azure AD tenant, create a new App Registration, and call it something like (YourName) LogicApp API:

I will use single tenant for this scenario, leave the other settings as it is and create.

Next, go to Expose an API:

Click on Set right next to Application ID URI, and set the App ID URI of your choice. You can keep the GUID if you want, but you can also type any URI value you like here (using api:// or https://). I chose to set the API URI to api://elven-logicapp-api:

Next we need to add scopes that will be the permissions that delegated end users can consent to. This will be the basis of the authorization checks we can do in the Logic App later.

Add a scope named ManagedDevices.Read with the details shown below. This will be a scope end users can consent to themselves, but it will only allow them to read their own managed devices:

Next, add another scope, ManagedDevices.Read.All, with the following details. This will be a scope that only Admins can consent to, and it will authorize reading all devices:

You should now have the following scopes defined:

Next, go to the Manifest and change accessTokenAcceptedVersion from null to 2. This configures the API so that Tokens will use the OAuth2 v2.0 endpoints:

That should be sufficient for now. In the next section we will prepare for the OAuth client.

Create App Registration for the Logic App Client

I choose to create a separate App Registration in Azure AD for the Logic App Client. This will represent the OAuth client that end users will use for OAuth authentication flows and requesting permissions for the Logic App API. I could have configured this in the same App Registration as the API created in the previous section, but this will provide better flexibility and security if I want to share the API with other clients also later, or if I want to separate the permission grants between clients.

Go to App Registrations in Azure AD, and create a new registration calling it something like (yourname) LogicApp Client:

Choose single tenant and leave the other settings for now.

After registering, go to API permissions, and click on Add a permission. From there you can browse to “My APIs” and you should be able to locate the (yourname) LogicApp API. Select to add the delegated permissions as shown below:

These delegated permissions reflect the scopes we defined in the API earlier. Your App registration and API permission should now look like below. NB! Do NOT click to Grant admin consent for your Azure AD! This will grant consent on behalf of all your users, which will work against our intended scenario later.

Next, we need to provide a way for clients to authenticate using Oauth flows, so go to the Certificates & secrets section. Click to create a Client secret, I will name my secret after where I want to use it for testing later (Postman):

Make sure you copy the secret value for later:

(Don’t worry, I’ve already invalidated the secret above and created a new one).

Next, go to Authentication. We need to add a platform for authentication flows, so click Add a platform and choose Web. For using Postman later for testing, add the following as Redirect URI: https://oauth.pstmn.io/v1/callback

Next, we will also provide another test scenario using PowerShell or Azure CLI client, so click on Add a platform one more time, this time adding Mobile desktop and apps as platform and use the following redirect URI: urn:ietf:wg:oauth:2.0:oob

Your platform configuration should now look like this:

Finally, go to Advanced settings and set Allow public client flows to Yes, as this will aid in testing from PowerShell or Azure CLI clients later:

Now that we have configured the necessary App registrations, we can set up the Azure AD OAuth Authorization Policy for the Logic App.

Configuring Azure AD OAuth Authorization Policy for Logic App

Back in the Logic App, create an Azure AD Authorization Policy with issuer and audience as shown below:

Note the Claims values:

We are using the v2.0 issuer (https://login.microsoftonline.com/{tenantId}/v2.0), as we configured accessTokenAcceptedVersion to 2 in the manifest of the App Registration (as opposed to the v1.0 issuer, which would be in the format https://sts.windows.net/{tenantId}/). And the Audience claim will be our configured API App ID (for v1.0 the audience would instead be the App ID URI, like api://elven-logicapp-api).

Save the Logic App, and we can now start to do some testing where we will use the client app registration to get an access token for the Logic App API resource.

Testing with Postman Client

The first test scenario we will explore is using Postman Client and the Authorization Code flow for getting the correct v2.0 Token.

A recommended practice when using Postman and reusing variable values is to create an Environment. I’ve created this Environment for storing my Tenant ID, Client ID (App ID for the Client App Registration) and Client Secret (the secret I created for using Postman):

Previously in this blog article, we tested the Logic App using Postman. On that request, select the Authorization tab, and set type to OAuth 2.0:

Next, under Token configuration add the values like the following. Give the Token a recognizable name, this is just for Postman’s internal reference. Make sure that the Grant Type is Authorization Code. Note the Callback URL, which is the URL we configured as the Redirect URI for the App registration. In the Auth URL and Access Token URL, configure the use of the v2.0 endpoints, using the TenantID from the environment variables (make sure to select the current environment at the top right). The Client ID and Client Secret will also refer to the environment variables:

One important step remains, and that is to correctly set the scope for the access token. Using something like user.read here will only produce an Access Token with Microsoft Graph as audience. So we need to change the scope to the Logic App API, in this case api://elven-logicapp-api/ManagedDevices.Read:

Let’s get the Access Token, click on the Get New Access Token button:

A browser window launches, and if you are not already logged in, you must log in first. Then you will be prompted to consent to the permission as shown below. The end user is prompted to consent for the LogicApp API, as well as basic OpenID Connect consents:

After accepting, a popup will try to redirect you to Postman, so make sure you don’t block that:

Back in Postman, you will see that we have got a new Access Token:

Copy that Access Token, and paste it into a JWT debugger like jwt.ms or jwt.io. You should see in the data payload that the claims for audience and issuer are the same values we configured in the Logic App Azure AD OAuth policy:

Note also the token version is 2.0.

Click to use the Token in the Postman request, it should populate this field:

Before testing the request, remember to remove the SAS query parameters from the request, so that sv, sp and sig are not used with the query for the Logic App:

Now, we can test. Click Send on the Request. It should complete successfully with a status of 200 OK, and return the managed device details:

Let’s add to the permission scopes, by adding the ManagedDevices.Read.All:

Remember just to have a blank space between the scopes, and then click Get New Access Token:

If I’m logged on with a normal end user, I will get the prompt above that I need admin privileges. If I log in with an admin account, this will be shown:

Note that I can now do one of two actions:

  1. I can consent only on behalf of myself (the logged in admin user), OR..
  2. I can consent on behalf of the organization, by selecting the check box. This way all users will get that permission as well.

Be very conscious when granting consents on behalf of your organization.

At this point the Logic App will authorize if the Token is from the correct issuer and for the correct audience, but the calling user can still request any managed device or all devices. Before we get to that, I will show another test scenario using a public client like PowerShell.

Testing with PowerShell and MSAL.PS

MSAL.PS is a perfect companion for using MSAL (Microsoft Authentication Library) to get Access Tokens in PowerShell. You can install MSAL.PS from PowerShellGallery using Install-Module MSAL.PS.

The following commands show how you can get an Access Token using MSAL.PS:

# Set Client and Tenant ID
$clientID = "cd5283d0-8613-446f-bfd7-8eb1c6c9ac19"
$tenantID = "104742fb-6225-439f-9540-60da5f0317dc"

# Get Access Token using Interactive Authentication for Specified Scope and Redirect URI (Windows PowerShell)
$tokenResponse = Get-MsalToken -ClientId $clientID -TenantId $tenantID -Interactive -Scope 'api://elven-logicapp-api/ManagedDevices.Read' -RedirectUri 'urn:ietf:wg:oauth:2.0:oob'

# Get Access Token using Interactive Authentication for Specified Scope and Redirect URI (PowerShell Core)
$tokenResponse = Get-MsalToken -ClientId $clientID -TenantId $tenantID -Interactive -Scope 'api://elven-logicapp-api/ManagedDevices.Read' -RedirectUri 'http://localhost'


MSAL.PS can be used both with Windows PowerShell and with PowerShell Core, so in the above commands I show both. Note that the redirect URI for MSAL.PS on PowerShell Core needs to be http://localhost. You also need to add that redirect URI to the App Registration:

Running the above command will prompt an interactive logon, and should return a successful response saved in the $tokenResponse variable.

We can verify the response, for example checking scopes or copying the Access Token to the clipboard so that we can check the token in a JWT debugger:

# Check Token Scopes
$tokenResponse.Scopes

# Copy Access Token to Clipboard
$tokenResponse.AccessToken | Clip

In the first blog post of this article series I covered how you can use Windows PowerShell and Core to use Invoke-RestMethod for calling the Logic App, here is an example where I call my Logic App using the Access Token (in PowerShell Core):

# Set variable for Logic App URL
$logicAppUrl = "https://prod-05.westeurope.logic.azure.com:443/workflows/d429c07002b44d63a388a698c2cee4ec/triggers/request/paths/invoke?api-version=2016-10-01"

# Convert Access Token to a Secure String for Bearer Token
$bearerToken = ConvertTo-SecureString ($tokenResponse.AccessToken) -AsPlainText -Force

# Invoke Logic App using Bearer Token
Invoke-RestMethod -Method Post -Uri $logicAppUrl -Authentication OAuth -Token $bearerToken

And I can verify that it works:
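To request a specific user’s managed devices from PowerShell as well, add a JSON body to the request, reusing the variables from above:

# Request managed devices for a specific user by adding a request body
$body = @{ userUpn = "someuser@elven.no" } | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri $logicAppUrl -Authentication OAuth -Token $bearerToken -Body $body -ContentType "application/json"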

Great. I now have a couple of alternatives for calling my Logic App securely using Azure AD OAuth. In the next section we will get into how we can do authorization checks inside the Logic App.

Authorization inside Logic App

While the Logic App can have an authorization policy that verifies any claims like issuer and audience, or other custom claims, we cannot use that if we want to authorize inside the logic app based on scopes, roles etc.

In this section we will look into how we can do that.

Include Authorization Header in Logic Apps

First we need to include the Authorization header from the OAuth access token in the Logic App. To do this, open the Logic App in code view, and add the operationOptions to IncludeAuthorizationHeadersInOutputs for the trigger like this:

        "triggers": {
            "manual": {
                "inputs": {
                    "schema": {}
                },
                "kind": "Http",
                "type": "Request",
                "operationOptions": "IncludeAuthorizationHeadersInOutputs"
            }
        }

This will make the Bearer Token accessible inside the Logic App, as explained in detail in my previous post: Protect Logic Apps with Azure AD OAuth – Part 1 Management Access | GoToGuy Blog. There I also showed how to decode the token to get the readable JSON payload, so I need to apply the same steps here:
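For readers who don’t have part 1 at hand, the decoding the Logic App does can be illustrated with this PowerShell sketch, which performs the same split-and-Base64-decode on a raw token (assuming the token is stored in $accessToken):

# Take the middle (payload) part of the JWT and convert from Base64Url to Base64
$payload = ($accessToken -split '\.')[1].Replace('-', '+').Replace('_', '/')

# Pad to a length divisible by 4 before decoding
switch ($payload.Length % 4) { 2 { $payload += '==' } 3 { $payload += '=' } }

# Decode to readable JSON with the claims (scp, roles, preferred_username, etc.)
[System.Text.Encoding]::UTF8.GetString([System.Convert]::FromBase64String($payload)) | ConvertFrom-Json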

After applying the above steps, I can test the Logic App again, and get the details of the decoded JWT token, for example of interest will be to check the scopes:

Implement Logic to check the Scopes

When I created the LogicApp API app registration, I added two scopes: ManagedDevices.Read and ManagedDevices.Read.All. The authorization logic I want to implement now is to only let users calling the Logic App who have the scope ManagedDevices.Read.All be able to get ALL managed devices, or to get managed devices other than their own.

The first step will be to check if the JWT payload for scope “scp” contains the ManagedDevices.Read.All. Add a Compose action with the following expression:

contains(outputs('Base64_to_String_Json').scp,'ManagedDevices.Read.All')

This expression will return either true or false depending on the scp value.

Next after this action, add a Condition action, where we will do some authorization checks. I have created two groups of checks, where one OR the other needs to be true.

Here are the details for these two groups:

  • Group 1 (checks if scp does not contain ManagedDevices.Read.All and calling user tries to get All managed devices):
    • Outputs('Check_Scopes') = false
    • empty(triggerBody()?['userUpn']) = true
  • Group 2 (checks if scp does not contain ManagedDevices.Read.All, and the caller tries to get managed devices for a user other than their own UPN):
    • Outputs('Check_Scopes') = false
    • triggerBody()?['userUpn'] != Outputs('Base64_to_String_Json')['preferred_username']

If either of those two groups is True, then we know that the calling user tries to do something the user is not authorized to do. This is something we need to give a customized response for. So inside the True condition, add a new Response action with something like the following:

I’m using a status code of 403, meaning that the request was successfully authenticated but was missing the required authorization for the resource.

Next, add a Terminate action, so that the Logic App stops with a successful status. Note also that on the False side of the condition, I leave it blank because I want it to proceed with the next steps in the Logic App.

Test the Authorization Scope Logic

We can now test the authorization scopes logic implemented above. In Postman, either use an existing Access Token or get a new Token that only includes the ManagedDevices.Read scope.

Then, send a request with an empty request body. You should get the following response:

Then, try another test, this time specifying another user principal name than your own, which also should fail:

And then test with your own user principal name, which will match the ‘preferred_username’ claim in the Access Token, this should be successful and return your devices:

Perfect! It works as intended, normal end users can now only request their own managed devices.

Let’s test with an admin account and the ManagedDevices.Read.All scope. In Postman, add that scope, and get a new Access Token:

When logging in with a user that has admin privileges you will now get a Token that has the scope for getting all devices, and your testing should return 200 OK for all devices or any user’s devices:

Adding Custom Claims to Access Token

In addition to the default claims and scopes in the Access Token, you can customize a select set of additional claims to be included in the JWT data payload. Since the Access Token is for the resource, you will need to customize this on the App Registration for the LogicApp API.

In Azure AD, select the App Registration for the API, and go to API permissions first. You need to add the OpenID scopes first. Add the following OpenID permissions:

Your API App Registration should look like this:

Next, go to Token configuration. Click Add optional claim, and select Access Token. For example you can add the ipaddr and upn claims as I have done below:

Note the optional claims listed for the resource API registration:

Next time I get a new access token, I can see that the claims are there:

Summary of User Authorization so far

What we have accomplished now is that users can get an Access Token for the Logic App API resource. This is the first requirement for users to be able to call the Logic App, that they indeed have a Bearer Token in the Authorization Header that includes the configured issuer and audience.

In my demos I have shown how to get an access token using Postman (Authorization Code Flow) and a Public Client using MSAL.PS. But you can use any kind of Web application, browser/SPA or Client App, using any programming library that supports either MSAL or OpenID Connect and OAuth2. Your solution, your choice 😉

After that I showed how you can use scopes for delegated permissions, and how you can do internal authorization logic in the Logic App depending on what scope the user has consented to/allowed to.

We will now build on this, by looking into controlling access and using application roles for principals.

Assigning Users and Restricting Access

One of the most powerful aspects of exposing your API using Microsoft Identity Platform and Azure AD is that you now can control who can access your solution, in this case call the Logic App.

Better yet, you can use Azure AD Conditional Access to apply policies for requiring MFA, devices to be compliant, require locations or that sign-ins are under a certain risk level, to name a few.

Let’s see a couple of examples of that.

Require User Assignment

The first thing we need to do is to change the settings for the Enterprise Application. We created an App registration for the LogicApp Client, for users to be able to authenticate and access the API. From that LogicApp Client, you can get to the Enterprise Application by clicking on the link for Managed application:

In the Enterprise App, go to Properties, and select User assignment required:

We can now control which users, or groups, that can authenticate to and get access to the Logic App API via the Client:

If I try to log in with a user that is not listed under Users and groups, I will get an error message that the “signed in user is not assigned to a role for the application”:

PS! The above error will show itself a little differently based on how you authenticate. The above image is using a public client; if you use Postman, the error will be in the Postman console log, and if you use a web application you will get the error in the browser, etc.

Configuring Conditional Access for the Logic App

In addition to controlling which users and groups that can access the Logic App, I can configure a Conditional Access policy in Azure AD for more fine grained access and security controls.

In your Azure AD blade, go to Security and Conditional Access. If you already have a CA policy that affects all Applications and Users, for example requiring MFA, your LogicApp API would already be affected by that.

Note that as we are protecting the resource here, your Conditional Access policy must be targeted to the LogicApp API Enterprise App.

Click to create a new policy specific for the Logic App API, as shown below:

For example I can require that my Logic App API only can be called from a managed and compliant device, or a Hybrid Azure AD Joined device as shown below:

If I create that policy, and then try to get an access token using a device that is not registered or compliant with my organization, I will get this error:

Summary of Restricting Access for Users and Groups

With the above steps we can see that by adding an Azure AD OAuth authorization policy to the Logic App, we can control which users and groups that can authenticate to and get an Access Token required for calling the Logic App, and we can use Conditional Access for applying additional fine grained access control and security policies.

So far we have tested with interactive users and delegated permission access scenarios. In the next section we will dive into using application access and roles for authorization scenarios.

Adding Application Access and Roles

Sometimes you will have scenarios that let an application run as itself, like a daemon or service, without requiring an interactive user present.

Comparing that to the OIDC and OAuth flow from earlier, the Client will access the Resource directly, by using an Access Token acquired from Azure AD using the Client Credentials Flow:

Using the Client Credentials Flow from Postman

Back in the Postman client, under the Authorization tab, just change the Grant Type to Client Credentials like the following. NB! When using application access, there are no specific delegated scopes, so you need to change the scope so that it refers to .default after the scope URI, in this case api://elven-logicapp-api/.default:

Click Get New Access Token, and after successfully authenticating click to Use Token. Copy the Token to the Clipboard, and paste to a JWT debugger. Let’s examine the JWT payload:

Note that the audience and issuer are the same as when we got an access token for an end user, but also that the JWT payload does not contain any scopes (scp) or any other user identifiable claims.

Using the Client Credentials Flow from MSAL.PS

To get an Access Token for an application client in MSAL.PS, run the following commands:

# Set Client and Tenant ID
$clientID = "cd5283d0-8613-446f-bfd7-8eb1c6c9ac19"
$tenantID = "104742fb-6225-439f-9540-60da5f0317dc"
# Set Client Secret as Secure String (keep private)
$clientSecret = ConvertTo-SecureString ("<your secret in plain text>") -AsPlainText -Force

# Get Access Token using Client Credentials Flow and Default Scope
$tokenResponse = Get-MsalToken -ClientId $clientID -ClientSecret $clientSecret -TenantId $tenantID -Scopes 'api://elven-logicapp-api/.default'

You can then validate this Token and copy it to a JWT debugger:

# Copy Access Token to Clipboard
$tokenResponse.AccessToken | Clip

Calling the Logic App using Client Application

We can send requests to the Logic App using an Access Token in an application by including it as a Bearer Token in the Authorization Header exactly the same way we did previously, however it might fail internally if the Logic App processing of the access token fails because it now contains a different payload with claims:

Looking into the run history of the Logic App I can see that the reason it fails is that it is missing scp (scopes) in the token.

This is expected when authenticating as an application, so we will fix that a little later.

A few words on Scopes vs. Roles

In delegated user scenarios, permissions are defined as Scopes. When using application permissions, we will be using Roles. Role permissions will always be granted by an admin, and every role permission granted to the application will be included in the token; they are provided via the .default scope for the API.

Adding Application Roles for Applications

Now, let’s look into adding Roles to our LogicApp API. Locate the App registration for the API, and go to the App roles | Preview blade. (This new preview lets us define roles in the GUI, where until recently you had to edit the manifest.)

Next, click on Create app role. Give the app role a display name and value. PS! The value must be unique, so if you already have that value as a scope name, then you need to distinguish it eg. by using Role in the value as I have here:

The allowed member types give you a choice over who/what can be assigned the role. You can select either application or user/groups, or both.

Add another App Role as shown below:

You should now have the following two roles, ManagedDevices.Role.Read and ManagedDevices.Role.Read.All:

Assigning Roles to Application

I recommend that you create a new App Registration for application access scenarios. This way you avoid mixing delegated and application permissions in the same app registration, it becomes easier to differentiate user and admin consents, secret credentials are easier to separate, and you can use different settings for restricting access using Azure AD Users/Groups and Conditional Access.

So create a new App registration, call it something like (Yourname) LogicApp Application Client:

Choose single tenant and leave the other settings as default. Click Register and copy the Application (Client) ID and store it for later:

Next, go to Certificates & secrets, and create a new Client secret:

Copy the secret and store it for later.

Go to API permissions, click Add a permission, and from My APIs, find the LogicApp API. Add the Application permissions as shown below, these are the App Roles we added to the API earlier:

Under API permissions you can remove the Microsoft Graph user.read permission, as it won’t be needed here. The two remaining permissions should be:

These you NEED to grant admin consent for, as no interactive user will be involved in a consent prompt:

The admin consent is granted as shown below:

Now we can test getting an access token via this new app registration, using either Postman or MSAL.PS. Remember to use the new app (client) id and app (client) secret. I chose to add the two values to my Postman environment like this:

Next, change the token settings for Client Credentials flow so that the Client ID and Secret use the new variable names. Click to Get New Access Token:

After successfully getting the access token, click Use Token and copy it to the clipboard so we can analyze it in the JWT debugger. From there we can indeed see that the roles claim has been added:

We will look for these roles claims in the Logic App later. But first we will take a look at how we can add these roles to users as well.
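
If you prefer MSAL.PS over Postman for this test, a quick sketch like the following should also return a token containing the roles claims (the client id and secret placeholders refer to the new application client registration created above):

# Get Access Token using the new application client registration
$appClientID = "<your new application client id>"
$appClientSecret = ConvertTo-SecureString ("<your new application client secret>") -AsPlainText -Force
$appTokenResponse = Get-MsalToken -ClientId $appClientID -ClientSecret $appClientSecret -TenantId $tenantID -Scopes 'api://elven-logicapp-api/.default'
# Copy to clipboard and inspect in a JWT debugger, the roles claim should now be present
$appTokenResponse.AccessToken | Clip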

Assigning Roles to Users/Groups

Adding roles to users or groups can be used for authorizing access based on the roles claim. Go to the Enterprise App for the Logic App API registration, you can get to the Enterprise App by clicking on the Managed application link:

In the Enterprise App, under Users and Groups, you will already see the service principal for the LogicApp Application Client with the Roles assigned. This is because these role permissions were granted by admin consent:

Click on Add user/group, and assign the selected role to a user in your organization:

You can add more users or groups to assigned roles:

Let's do a test for this user scenario. We need to do an interactive user login again, so change to using Authorization Code Flow in Postman, and change back to the original ClientID and ClientSecret:

Click to Get New Access Token, authenticate with your user in the browser (the user you assigned a role to), and then use the token and copy it to the clipboard. If we now examine that token and look at the JWT data payload, we can see that the user now has a roles claim, as well as the scope claim:

We can now proceed to adjust the authorization checks in the Logic App.

Customizing Logic App to handle Roles Claims

Previously in the Logic App we did checks against the scopes (scp claim). We need to make some adjustments to this step, as it will fail if there is no scp claim in the token:

Change to the following expression, with an if test that returns false if there is no scp claim, in addition to the original check that the scope contains ManagedDevices.Read.All:

This is the expression used above:

if(empty(outputs('Base64_to_String_Json')?['scp']),false,contains(outputs('Base64_to_String_Json').scp,'ManagedDevices.Read.All'))

Similarly, add a new Compose action, where we will check for any roles claim.

This expression will also return false if either the roles claim is empty, or does not contain the ManagedDevices.Role.Read.All:

if(empty(outputs('Base64_to_String_Json')?['roles']),false,contains(outputs('Base64_to_String_Json').roles,'ManagedDevices.Role.Read.All'))

Next we need to add more checks to the authorization logic. Add a new line to the first group, where we also check that the output of the Check Roles action is false:

In the above image I’ve also updated the action name and comment to reflect new checks.

To the second group, add two more lines, where line number 3 checks that the output from Check Roles is false (same as above), and line 4 checks whether the roles claim contains the role ManagedDevices.Role.Read:

The complete authorization checks logic should now be:

And this is the summary of conditions:

  • Group 1 (checks if scp does not contain ManagedDevices.Read.All and roles does not contain ManagedDevices.Role.Read.All and calling user tries to get All managed devices):
    • Outputs('Check_Scopes') = false
    • empty(triggerBody()?['userUpn']) = true
    • Outputs('Check_Roles') = false
  • Group 2 (checks if scp does not contain ManagedDevices.Read.All and roles does not contain ManagedDevices.Role.Read.All, and tries to get managed devices for another user than users’ own upn, and roles does not contain ManagedDevices.Role.Read):
    • Outputs('Check_Scopes') = false
    • triggerBody()?['userUpn'] != Outputs('Base64_to_String_Json')['preferred_username']
    • Outputs('Check_Roles') = false
    • contains(outputs('Base64_to_String_Json')?['roles'],'ManagedDevices.Role.Read') = false
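
For reference, the same two groups can also be combined into a single boolean expression. This is just a sketch using the action names above, the grouped conditions in the designer achieve exactly the same result:

or(
    and(
        and(equals(outputs('Check_Scopes'), false), empty(triggerBody()?['userUpn'])),
        equals(outputs('Check_Roles'), false)
    ),
    and(
        and(equals(outputs('Check_Scopes'), false), not(equals(triggerBody()?['userUpn'], outputs('Base64_to_String_Json')?['preferred_username']))),
        and(equals(outputs('Check_Roles'), false), not(contains(outputs('Base64_to_String_Json')?['roles'], 'ManagedDevices.Role.Read')))
    )
)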

If any of the two groups of checks above returns true, then it means that the request was not authorized. To give a more customized response, change the response action like the following:

In the above action I have changed the response so that it is returned as a JSON object, and changed the body so that it returns JSON data. I have also listed the values from the token that the user/application used when calling the Logic App. The dynamic expression for getting the roles claim (which will be an array if there are any roles claims) is:
if(empty(outputs('Base64_to_String_Json')?['roles']),'',join(outputs('Base64_to_String_Json')?['roles'],' '))
And for getting any scopes claim, which will be a text string or null:
outputs('Base64_to_String_Json')?['scp']

Test Scenario Summary

I’ll leave the testing over to you, but if you have followed along and customized the Logic App as I described above, you should now be able to verify the following test scenarios:

  • User with scope ManagedDevices.Read (no roles): Can get own managed devices. Not authorized to get all devices or other users' managed devices.
  • User (Admin) with scope ManagedDevices.Read.All: Can get any or all devices.
  • User with scope ManagedDevices.Read and role ManagedDevices.Role.Read: Can get own managed devices. Can get other users' managed devices by userUpn. Not authorized to get all devices.
  • User with scope ManagedDevices.Read and role ManagedDevices.Role.Read.All: Can get any or all devices.
  • Application with role ManagedDevices.Role.Read: Can get any user's managed devices by userUpn. Not authorized to get all devices.
  • Application with role ManagedDevices.Role.Read.All: Can get any or all devices.

When testing the above scenarios, you need a new access token using either authorization code flow (user) or client credentials (application). For testing roles in user scenarios, you can change the role assignments for the user at the Enterprise Application for the LogicApp API. For testing roles in application scenarios, make sure that you only grant admin consent for the applicable roles you want to test.

Final Steps and Summary

This has been quite the long read. The goal of this blog post was to show how your Logic App workflows can be exposed as an API, and how Azure AD OAuth Authorization Policies can control who can send requests to the Logic App, as well as how you can use scopes and roles in the Access Token to make authorization decisions inside the Logic App. And even more importantly, integrating with Azure AD lets you control user/group access, as well as adding an additional security layer with Conditional Access policies!

My demo scenario was to let the Logic App call Microsoft Graph and return managed devices, which requires privileged access to the Graph API. By exposing the Logic App as an API I can now let end users/principals call that Logic App as long as they are authorized to do so using my defined scopes and/or roles. I can easily see several other Microsoft Graph API (or Azure Management API, etc.) scenarios using Logic Apps where I can control user access similarly.

Note also that any callers that now try to call the Logic App using the SAS access scheme will fail, as a Bearer Token is expected in the Authorization Header by the custom authorization actions that have been implemented. You might want to implement some better error handling if you like.

There's an added bonus at the end of this article, where I add the filters for getting managed devices. But for now I want to thank you for reading, and more articles in this series will come later, including:

  • Calling Logic Apps protected by Azure AD from Power Platform
  • Protecting Logic App APIs using Azure API Management (APIM)

Bonus read

To complete the filtering of Managed Devices from Microsoft Graph, the Logic App prepared inputs of operatingSystem and osVersion in addition to userUpn. Let's see how we can implement that support as well.

After the initialize variable ManagedDevices action, add a Compose action. In this action, which I rename to operatingSystemFilter, I add a long dynamic expression:

This expression checks if the request trigger has an operatingSystem value. If not, the output will be an empty string; if it is not empty, I start building the filter string using the concat function. There are some complexities here, among others escaping single apostrophes by adding another single apostrophe, but this expression works:

if(empty(triggerBody()?['operatingSystem']),'',concat('/?$filter=operatingSystem eq ''',triggerBody()?['operatingSystem'],''''))

Next, add another Compose action and name it operatingSystemVersionFilter. This expression is even longer, checking the request trigger for osVersion. If empty, it just returns the operatingSystemFilter from the previous action, but if present it does another concat where I 'and' the osVersion filter with the previous filter:

The expression from above image:

if(empty(triggerBody()?['osVersion']),outputs('operatingSystemFilter'),concat(outputs('operatingSystemFilter'),' and startswith(osVersion,''',triggerBody()?['osVersion'],''')'))
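
As an illustration, if the request body contains operatingSystem "Windows" and osVersion "10.0" (example values only), the combined output of these two Compose actions would be a filter string like this:

/?$filter=operatingSystem eq 'Windows' and startswith(osVersion,'10.0')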

We can now add that output to the Graph queries, both when getting all devices and when getting a specific user's devices:

I can now add operatingSystem and osVersion to the request body when calling the Logic App:

And if I check the run history when testing the Logic App, I can see that the filter has been appended to the Graph query:

If you want, you can also build more error handling logic for when users specify a wrong user principal name, or for any other filtering errors that may occur because of syntax etc.

That concludes the bonus tip, thanks again for reading πŸ™‚

Protect Logic Apps with Azure AD OAuth – Part 1 Management Access

Azure Logic Apps are great for creating workflows for your IT automation scenarios. Logic App workflows can be triggered using a variety of sources and events, including schedules, but a popular trigger is using a HTTP trigger for starting the Logic App workflow interactively or on-demand from outside the Logic App.

To trigger a Logic App using a HTTP trigger, you need to know the endpoint URL, for example:
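
(The URL below is just an illustration with placeholder values; your own will contain the region, workflow id and signature of your Logic App.)

https://prod-00.westeurope.logic.azure.com:443/workflows/<workflow-id>/triggers/manual/paths/invoke?api-version=2016-10-01&sp=<permissions>&sv=1.0&sig=<shared-access-signature>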

This URL consists of the endpoint address of the Logic App workflow trigger, with the following query parameters:

  • api-version
  • sp (specifies permissions for permitted HTTP methods to use)
  • sv (SAS version to use)
  • sig (shared access signature)

Anyone with access to this URL and query parameters can trigger the Logic App, so it's very important to protect it from unauthorized access and use.

In this multi part blog post series we will look into how Logic Apps can be protected by Azure AD.

Scenarios this multi-part blog series will cover:

  • Provide Management Access Tokens and Restrict Issuer and Audience via OAuth
  • Restrict External Guest User Access
  • Expose Logic App as API
  • Restrict permitted Enterprise Application Users and Groups and Conditional Access policies.
  • Scopes and Roles Authorization in Logic Apps.
  • Logic Apps and APIM (Azure API Management).

Let's first look at the other methods for protecting Logic Apps you should be aware of.

Protect Logic Apps Keys and URLs

Before we move on to protecting Logic Apps with Azure AD Open Authentication (OAuth), let's take a quick summary of other protections you should be aware of:

  • Regenerate access keys. If you have reason to think SAS keys are shared outside your control, you can regenerate them, thus making previous SAS keys invalid.
  • Create expiring Callback URLs. If you need to share URLs with people outside your organization or team, you can limit exposure by creating a Callback URL that expire on a certain date and time.
  • Create Callback URL with primary or secondary key. You can select to create the Callback URL with the specified primary or secondary SAS key.

The above methods should be part of any governance and security strategy for protecting Logic Apps that perform privileged actions or might return sensitive data.

Protect Logic Apps via restricting inbound IP address

Another way to protect Logic Apps is to restrict from where the Logic App can be triggered via inbound IP address restrictions.

This opens up scenarios where you can specify your datacenter IP ranges, or only let other Logic Apps' outbound IP addresses call nested Logic Apps, or only allow Azure API Management to call the Logic App.

Protect Logic Apps with Azure AD OAuth

By creating an Authorization Policy for your Logic App you can use an Authorization header with a Bearer Token and require that the token contains the specified issuer, audience or other claims. Showing how that works in detail, and usable scenarios, will be the main focus of this blog post.

Let's start by building a basic Logic App we can use for demo purposes.

Creating a basic Logic App with HTTP Trigger and Response

In your Azure Subscription, create a new Logic App, specifying to use a HTTP trigger. In my example below I have named my Logic App "logicapp-test-auth":

Next, add a Parse JSON action, where the Content is set to the trigger headers, as shown below. I’ve just specified a simple schema output, this can be customized later if needed:

After that activity, add a Response action to return data to the caller. In my example below I return a Status Code of 200 (OK), and set the Content-Type to application/json, and return a simple JSON body of UserAgent where the value is set to the parsed header output from the trigger, using dynamic expression:
body('Parse_JSON_Headers')?['User-Agent']
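
If you wonder what that simple schema could look like, something along these lines is enough for this demo to surface the User-Agent header (an assumption; you can always generate a schema from a sample payload later):

{
    "type": "object",
    "properties": {
        "User-Agent": {
            "type": "string"
        }
    }
}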

Testing Logic App with Postman

A great way to test and explore HTTP and REST API calls from your client is to use Postman (Download Postman | Try Postman for Free). When testing the above Logic App, paste in the HTTP POST URL for your trigger, and set the method to POST as shown below:

From the above image, you can see the URL, and also the query parameters listed (api-version, sp, sv, and sig; remember that these should not be shared publicly).

When I send the request, it will trigger the Logic App, and it should respond back:

We can also verify the run history for the Logic App:

We have now successfully tested the Logic App using SAS authentication scheme, and can proceed to adding Azure AD OAuth. First we need to create an Authorization Policy.

Creating an Azure AD Authorization Policy

Under Settings and Authorization for your Logic App, add a new Authorization Policy with your name, and add the Issuer claim for your tenant. Issuer will be either https://sts.windows.net/{your-tenant-id}/ or https://login.microsoftonline.com/{your-tenant-id}/ depending on the version of the Access Token:

We will add more Claims later, but for now we will just test against the Issuer. Before we can test however, we need to get an Access Token. There are several ways to easily get an access token; basically we can consider one of the following two scenarios:

  1. Acquire an Access Token for well known Azure Management Resource Endpoints.
  2. Create an App Registration in Azure AD exposing an API.

We’ll cover App Registration and more advanced scenarios later, but for now we will get an Access Token using well known resource endpoints for Azure management.

PS! Just a quick note on Access Tokens acquired for Microsoft Graph resources: these cannot be used for Logic Apps Azure AD OAuth authorization policies, because Graph access tokens do not allow for signature validation.

Management Access Tokens

The following examples require that you either have installed Azure CLI (Install the Azure CLI | Microsoft Docs) or Az PowerShell (Install Azure PowerShell with PowerShellGet | Microsoft Docs).

Azure CLI

If you haven't already, you will first need to log in to Azure using az login. You can log in interactively using the default browser, which supports modern authentication including MFA, but if you are running multiple browsers and profiles it might be easier to use the device code flow:

az login --use-device-code

You will be prompted to open the microsoft.com/devicelogin page and enter the supplied device code, and after authentication with your Azure AD account, you will get a list of all subscriptions you have access to.

PS! If your account has access to subscriptions in multiple tenants, you can also specify which tenant to log into using:

az login --tenant elven.onmicrosoft.com --use-device-code

To get an Access Token you can just run az account get-access-token, like the following:

Let’s save that into a variable and get the token:

$accessToken = az account get-access-token | ConvertFrom-Json
$accessToken.accessToken | Clip

The above command copies the Access Token to the Clipboard. Let’s take a look at the token. Open the website jwt.ms or jwt.io, and paste in the token. From the debugger you can look at the decoded token payload. The most interesting part for now is the issuer (iss) and audience (aud), which tells us where the token has been issued from, and to which audience:

As we can see from above, the audience for the token is “management.core.windows.net”. You can also get an access token for a specific resource endpoint using:

$accessToken = az account get-access-token --resource-type arm | ConvertFrom-Json

To show all available resource endpoints use:

az cloud show --query endpoints

Now that we have the method to get the access token using Az CLI, let's take a look at Az PowerShell as well. For reference for the az account command and parameters, see docs here: az account | Microsoft Docs

Azure PowerShell

First you need to login to your Azure Subscription by using:

Connect-AzAccount

If your account has access to multiple subscriptions in multiple tenants, you can use the following command to specify tenant:

Connect-AzAccount -Tenant elven.onmicrosoft.com

If there are multiple subscriptions, you might need to specify which subscription to access using Set-AzContext -Subscription <Subscription>. Tip, use Get-AzContext -ListAvailable for listing available subscriptions.

To get an access token using Az PowerShell, use the following command to save to variable and copy to clipboard:

$accessToken = Get-AzAccessToken
$accessToken.Token | Clip

We can once again look at the JWT debugger to verify the token:

As with Az CLI, you can also specify resource endpoint by using the following command in Az PowerShell specifying the resource Url:

$accessToken = Get-AzAccessToken -ResourceUrl 'https://management.core.windows.net'

Now that we have the Access Token for an Azure Management resource endpoint, let’s see how we can use that against the Logic App.

Use Bearer Token in Postman

From the previous test using Postman earlier in this article, go to the Authorization section, specify Bearer Token, and then paste the management access token you should still have in your clipboard, like the following:

When clicking Send request, observe the following error:

We cannot combine both SAS (Shared Access Signature) and Bearer Token, so we need to adjust the POST URL to the Logic App. In Postman, this can easily be done under Params. Deselect the sp, sv and sig query parameters like the following, which will remove these from the POST URL:

When you now click Send request, you should get a successful response again, provided that the access token is valid:

Perfect! We have now authorized triggering the Logic App using Azure AD OAuth, based on the Authorization policy:

And the Access Token that match that Issuer:

I will now add the audience to the Authorization Policy as well, so that only Access Tokens for the management endpoint resource can be used:

Any HTTP request to the Logic App with a Bearer Token that does not comply with the above Authorization Policy will receive this 403 – Forbidden error:

Test using Bearer Token in Azure CLI

You can trigger HTTP REST methods in Azure CLI using az rest --method .. --url ...

When using az rest, an authorization header with a bearer token will be automatically added, using the url as resource endpoint (if the url is one of the well known resource endpoints). As we will be triggering the Logic App endpoint as url, we need to specify the resource endpoint as well. In my example, I will run the following command for my Logic App:

az rest --method POST --resource 'https://management.core.windows.net/' --url 'https://prod-72.westeurope.logic.azure.com:443/workflows/2fa8c6d0ed894b50b8aa5af7abc0f08b/triggers/manual/paths/invoke?api-version=2016-10-01'

From above, I specify the resource endpoint of management.core.windows.net, for which the access token will be acquired, and for url I specify my Logic App endpoint url, without the sp, sv and sig query parameters. This results in the following response:

So the Logic App triggered successfully, this time returning my console (Windows Terminal using Azure CLI).

Test using Bearer Token in Azure PowerShell

I will also show you how you can do a test using Az PowerShell. Make sure that you get an access token and save the bearer token to a variable using this command first:

$accessToken = Get-AzAccessToken
$bearerToken = $accessToken.Token

I will also set the Logic App url to a variable for easier access:

$logicAppUrl = 'https://prod-72.westeurope.logic.azure.com:443/workflows/2fa8c6d0ed894b50b8aa5af7abc0f08b/triggers/manual/paths/invoke?api-version=2016-10-01'

There are 2 ways you can use Az PowerShell, either using Windows PowerShell or PowerShell Core.

For Windows PowerShell, use Invoke-RestMethod and add a Headers parameter specifying the Authorization header to use Bearer token:

Invoke-RestMethod -Method Post -Uri $logicAppUrl -Headers @{"Authorization"="Bearer $bearerToken"}

Running this should return this successfully, specifying my Windows PowerShell version (5.1) as User Agent:

For PowerShell Core, Invoke-RestMethod now has support for using OAuth as authentication, but I first need to convert the Bearer Token to a Secure String:

$accessToken = Get-AzAccessToken
$bearerToken = ConvertTo-SecureString ($accessToken.Token) -AsPlainText -Force

Then I can call the Logic App url using:

Invoke-RestMethod -Method Post -Uri $logicAppUrl -Authentication OAuth -Token $bearerToken

This should successfully return the following response, this time the User Agent is my PowerShell Core version (7.1):

Summary so far of Management Access Tokens and Logic Apps

At this point we can summarize the following:

  1. You can trigger your Logic App either by using SAS URL or using Bearer Token in Authorization Header, but not both at the same time.
  2. You can add an Azure AD Authorization Policy to your Logic App that specifies the Issuer and Audience, so that calling clients only can use Bearer Tokens from the specified issuer (tenant id), and audience (resource endpoint).
  3. While you cannot disable use of SAS signatures altogether, you can keep them secret, periodically roll them over, and only share the Logic App url endpoint and trigger path with clients.
  4. This is especially great for automation scenarios where users can use CLI or Azure PowerShell and call your Logic Apps securely using OAuth and Access Tokens.

In the next part we will look more into how we can customize the Logic App to get the details of the Access Token so we can use that in the actions.

Include Authorization Header in Logic Apps

You can include the Authorization header from the OAuth access token in the Logic App. To do this, open the Logic App in code view, and add the operationOptions setting IncludeAuthorizationHeadersInOutputs for the trigger like this:

        "triggers": {
            "manual": {
                "inputs": {
                    "schema": {}
                },
                "kind": "Http",
                "type": "Request",
                "operationOptions": "IncludeAuthorizationHeadersInOutputs"
            }
        }

For subsequent runs of the Logic App, we can now see that the Authorization Header has been included:

And if we parse the headers output, we can access the Bearer Token:

If we want to decode that Bearer token to get a look into the payload claims, we can achieve that with some custom expression magic. Add a Compose action to the Logic App and use the following custom expression:

Let’s break it down:

  • The Replace function replaces the string ‘Bearer ‘ with blank (including the space after)
  • The Split function splits the token into the header, payload data, and signature parts of the JWT. We are interested in the payload, so we refer to that with index [1]:
split(replace(body('Parse_JSON_Headers')?['Authorization'], 'Bearer ',''), '.')[1]

Now it becomes a little tricky. We will have to use the base64ToString function to get the payload into a readable string, but if the length of the payload isn't divisible by 4 we will get an error. Therefore we need to check if we need to add padding (=), as explained here: (Base64 – Wikipedia).

First I get the length of the payload data, and then use the modulo function to see if there is any remainder after dividing by 4:

mod(length(outputs('Get_JWT_Payload')),4)
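
For example, if the payload part is 342 characters long, mod(342,4) returns 2, so two padding characters (==) are needed to make the length 344.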

Then I can add conditional logic, where I use concat to add padding (=) to make the length divisible by 4:

if(equals(outputs('Length_and_Modulo'),1),concat(outputs('Get_JWT_Payload'),'==='),if(equals(outputs('Length_and_Modulo'),2),concat(outputs('Get_JWT_Payload'),'=='),if(equals(outputs('Length_and_Modulo'),3),concat(outputs('Get_JWT_Payload'),'='),outputs('Get_JWT_Payload'))))

After this I can use base64ToString to convert it to a readable string and format it as a JSON object:

json(base64ToString(outputs('Padded_JWT_Payload')))

Now that we have access to the claims, we will later be able to do some authorization in the Logic App, for example based on roles or scopes, but we can also get some information on which user has called the Logic App.

In the Response action, add the following to return the Name claim:
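
The exact expression depends on what you named the Compose action that formats the payload as JSON. Assuming it is named Base64_to_String_Json, something like this should work:

outputs('Base64_to_String_Json')?['name']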

And if I test the Logic App http request again, I can see that it indeed returns my name based on the claims from the access token:

We now have a way to identify users or principals calling the Logic App. The Authorization Policy for the Logic App now permits users and principals from my own organization (based on the issuer claim) and as long as the audience is for management.core.windows.net. But what if I want external access as well? Let’s add to the authorization policies next.

Add OAuth Authorization Policy for Guests

A logic app can have several Azure AD Authorization Policies, so if I want to allow external guest users to trigger the logic app, I will need to create another authorization policy that allows that issuer:

Let's also add the upn claim to the response, so it is easier to see which user from which tenant triggered the Logic App:

When I now test with different users, internal to my tenant and external, I can see that it works and the response output is as expected:

Summary of Management Access Scenarios

The purpose of the above steps has been to provide a way for management scenarios, where users and principals can get an Access Token using one of the well known Azure Management endpoints, and provide that when calling the Logic App. The access token is validated using OAuth Authorization Policies, requiring a specific issuer (tenant id) and audience (management endpoint). This way we don't have to share the SAS details, which would let anyone with access to that URL call the Logic App without authentication.

In the next parts of this blog post series, we will look into more advanced scenarios where we will expose the Logic App as an API and more.

Blog Series – Power’ing up your Home Office Lights: Part 10 – Subscribe to Graph and Teams Presence to automatically set Hue Lights based on my Teams Presence!

This blog post is part of the Blog Series: Power’ing up your Home Office Lights with Power Platform. See introduction post for links to the other articles in the series:
https://gotoguy.blog/2020/12/02/blog-series—powering-up-your-home-office-lights-using-power-platform—introduction/

In this last part of the blog series, we will build on the previous blog post where we could get the presence status from Teams to the Hue Power App. Now we will use Microsoft Graph and subscribe to change notifications for presence, so that both the Power App and my Hue Lights will automatically change status based on the presence.

In this blog post I will reuse the Custom Connector I created for managing Microsoft Graph Subscriptions in this previous blog post: Managing Microsoft Graph Change Notifications Subscriptions with Power Platform | GoToGuy Blog

To follow along, you will need at least to create the App Registration and the Custom Connector referred to in that blog post. You can also import the Custom Connector OpenAPI Swagger from this link: <MY GITHUB URL TO HERE>

Adding the Custom Connector to the PowerApp

In the Hue Power App, go to the View menu, and under Data click to “+ Add data source”. Add the MSGraph Subscription Connector as shown below. This way we can refer to this in the Hue Power App.

We are going to add the logic to creating the subscription based on this toggle button and the selected light source:

But before that we have some things to prepare first. When creating a Graph Subscription, we will need to prepare a Webhook Url where Graph will send its notifications when the Teams presence changes. This will be handled in a Power Automate Flow, so let's create that first.

Creating the Flow for Graph Notifications and Hue Lights changes

Since I have built this all before, ref. https://gotoguy.blog/2020/10/24/managing-microsoft-graph-change-notifications-subscriptions-with-power-platform/, I will make a copy of the “Graph Notification – Presence Change” flow for this use case.

If you want to follow along, you will need to follow the instructions in that blog post first. After saving a copy, or if you created a new flow from blank using HTTP request webhook, you will need to copy the HTTP POST URL shown below, as that will be the “notification url” we will refer to later:

Next set some value that only you know for the secret client state:

The next part of the Flow is used to do a first-time validation when the change subscription is created, with Content-Type text/plain and a request query containing a validation token. Microsoft Graph expects a response with status code 200 and the validation token returned. If that is returned, Microsoft Graph will successfully create the subscription.
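
In that validation branch, the Response action returns status code 200 with Content-Type text/plain and the validation token read from the request query as body, for example with an expression along these lines (an assumption of how the branch can be built, adjust to your own trigger and action names):

triggerOutputs()?['queries']?['validationToken']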

For subsequent requests, we must return a 202 Accepted response, and in the next step I parse the notification request body, so that we can look into what change we have been notified for:

Following the change notification, we can start looking into the change value. Firstly, I have added a verification of the secret client state I specified earlier; this is to prevent misuse of the notification Url if it becomes known by others or used in the wrong context. After doing a simple test of the client state, where I do nothing if the client state doesn't match, I can start building the logic behind the changes in presence status:

Inside the Switch Presence Status action, I will, based on the availability status change, add a different case for each of the possible Teams Presence status values (see blog series post part 9 for an explanation of the different presence availability values):

Inside each of these cases I will define the light settings and colors, and after that call the Remote Hue API for setting the light. As you saw in part 8 of this blog series, in order to access the Hue API remotely I will need the following:

  • An Access Token to be included as a Bearer token in the Authorization Header
  • A username/whitelist identifier
  • The actual lightnumber to set the state for
  • A request body containing the light state, for example colors, brightness, on/off etc.

Remember, this Flow will be triggered from Microsoft Graph whenever there is a presence state change in Teams. So I need to be able to access/retrieve the access token, the username and the lightnumber I created the subscription for. This is how I will get it:

  • Access Token will be retrieved via the Logic App I created in part 4 of the blog series.
  • Username/whitelist identifier will be retrieved via the SharePoint List I created in part 6, see below image.
  • However, I do need to store the lightnumber I will create the change notification for, and for this I will add a couple more columns to this list:

Customize the Configuration List for storing Subscription Details

I add the following two single-line-of-text columns to my List:

  • Subscribe Presence LightNumber. This will be the chosen Hue light I want to change when change notifications occur for Teams presence status.
  • Change Notification Subscription Id. This will be the Id I can refer back to when adding and removing the subscription when needed.

Customize the Flow for getting User Configuration and preparing Light States

Back to the Flow again, we need to add some actions. First, add an Initialize variable action after the Initialize SecretClientState action like here:

I set the type to Object and use the json function to create an empty json object. This variable will later be used for changing the light state.

Next, in the Flow after the "Parse Notification Body" action, add a "Get Items" action from the SharePoint connector, and configure it with your site, list name and the following Filter Query:

My filter query:
first(body('Parse_Notification_Body')?['value'])?['subscriptionId']
will be used to find what username and light have been set up for presence changes.

Next, let's set the variable for LightState. Inside each of the Switch cases, add a Set variable action, and set the variable to the chosen xy color code, as a json object like the following. This is for the color green:

Do the same for all of the other cases, these are the values I have been using:

  • Away (Yellow): json('{ "on": true, "bri": 254, "xy": [ 0.517102, 0.474840 ], "transitiontime": 0 }')
  • Available (Green): json('{ "on": true, "bri": 254, "xy": [ 0.358189, 0.556853 ], "transitiontime": 0 }')
  • AvailableIdle (Green, 10% bright): json('{ "on": true, "bri": 25, "xy": [ 0.358189, 0.556853 ], "transitiontime": 0 }')
  • Busy (Red): json('{ "on": true, "bri": 254, "xy": [ 0.626564, 0.256591 ], "transitiontime": 0 }')
  • BusyIdle (Red, 10% bright): json('{ "on": true, "bri": 25, "xy": [ 0.626564, 0.256591 ], "transitiontime": 0 }')
  • BeRightBack (Yellow): json('{ "on": true, "bri": 254, "xy": [ 0.517102, 0.474840 ], "transitiontime": 0 }')
  • DoNotDisturb (Red): json('{ "on": true, "bri": 254, "xy": [ 0.626564, 0.256591 ], "transitiontime": 0 }')
  • Offline (Grey, 10% bright): json('{ "on": true, "bri": 25, "xy": [ 0.3146, 0.3303 ], "transitiontime": 0 }')
  • PresenceUnknown (White): json('{ "on": true, "bri": 254, "xy": [ 0.3146, 0.3303 ], "transitiontime": 0 }')

PS! The transitiontime of 0 is to get as close to realtime updates as possible. Add the colors similarly for the rest of the states:

Get the Access Token and call Hue Remote API

Now we are ready to get the Access Token, and call the Hue Remote API with the selected light state.

First, add a HTTP action below the Switch Presence Status action, calling the Logic App previously created in part 4 of the blog series:

Next, add another HTTP request after, and this will call the Hue Remote API to set the lights:

When constructing the Hue API URI above, I retrieve the whitelist identifier for username with the following expression:

first(body('Get_My_Hue_User_Subscription')?['Value'])?['WhitelistIdentifier']

And then the light number with this expression:

first(body('Get_My_Hue_User_Subscription')?['Value'])?['SubscribePresenceLightNumber']

That should be the completion of this Flow. Remember to turn it on if needed. We are now ready for the last step.

Adding the Flow for Creating the Graph Subscription

In the beginning of this blog post I referred to the toggle "Sync Hue Lights with Teams" in the Power App. Now that we have prepared the Flow handling the notification Url, we need to add a new Flow that will handle the creation or deletion of Graph Subscriptions. As this is a toggle control, setting it to "On" should create a Graph subscription for the selected lights, and setting it to "Off" should remove that subscription again.

Create a new instant Flow with PowerApps as trigger:

Next, add an Initialize variable action for getting the UserDisplayName:

Use the following expression for getting the user display name:

triggerOutputs()['headers']['x-ms-user-name']

Next, add another Initialize variable action for getting the light number. After renaming the action, click on "Ask in PowerApps":

Add another Initialize variable action, this time with a boolean value, for getting the value of the toggle sync control:

Next, we will retrieve our user config source from the SharePoint list again, this time filtered by the UserDisplayName variable:

We now need to add some logic to the next steps of the flow. Let's start with the toggle button, which will be either true or false:

If yes, that means we should either create a new graph subscription, or update any existing one.

Under Yes, add a new action. This action will call the Custom Connector I created in another blog post (https://gotoguy.blog/2020/10/24/managing-microsoft-graph-change-notifications-subscriptions-with-power-platform/). This Custom Connector should have the following actions:

Click on the Get Subscription action to add that. For the subscriptionId parameter I will refer to the first returned instance of any existing Graph Subscriptions I have in the SharePoint list returned earlier.

Here is the expression for your reference:

first(body('Get_My_Hue_User')?['Value'])?['ChangeNotificationSubscriptionId']

When running the Flow this action will either:

  • Return a 200 OK if the subscription is found, or if the SharePoint List has a blank value for ChangeNotificationSubscriptionId.
  • Return a 404 ResourceNotFound if the subscription is not found, this will happen if the subscription is expired. This is an error that will halt the Flow, so we need to handle it.

So let's start by getting the Status Code. I'll do that by adding a compose action with the following expression:

outputs('Get_Graph_Subscription')['statusCode']

It's also important to configure this action to run even if the Get Graph Subscription action fails:

Let's add another condition, where the output of the status code should be 200 (OK):

A status code of 200 will indicate either that we found an existing Graph Subscription matching the configured subscription id from the SharePoint List, or that the list value is blank and Get Graph Subscription returned an empty array or other Graph Subscriptions you might have. In the first case we will just update the existing subscription, in the latter case we will create a new subscription. Let's start by adding another compose action:

Adding the Flow to the Hue Power App

Back in the Hue Power App, we can now link this Flow to the toggle control. With the sync toggle control selected, go to the Action menu, and then click on Power Automate. From there you should see the "Hue – Create Update Remove Graph Subscription" Flow, click to add it. On the OnChange event, add the Run parameters, where the lightnumber value is read from the dropdown list's selected item and the toggle button value is passed like this:

Managing Microsoft Graph Change Notifications Subscriptions with Power Platform

In a recent blog post, https://gotoguy.blog/2020/07/12/subscribing-to-teams-presence-with-graph-api-using-power-platform/, I described how you could create a subscription for change notifications for the Teams Presence API. In that example, I created a subscription using Graph Explorer. As every subscription has an expiry lifetime, depending on the resource type, you need to have some management in place for renewing subscriptions before they expire, and re-creating them if they have expired. In that scenario I also include deleting subscriptions that are no longer needed.

In this blog post I'm going to use the powers of Power Platform to set this up. I'm going to do this from an interactive user's perspective, using delegated permissions. Many Microsoft Graph API resources support application permissions, but not all, especially those that are in the beta endpoint, so it makes sense for me now to concentrate on delegated permissions.

Let's get started, and as in many scenarios with Microsoft Graph, I will start by creating an App Registration in Azure AD.

Register an App for Managing Graph Subscriptions

Login to your Azure AD Portal (https://aad.portal.azure.com) and go to App Registrations to create a new one.

Depending on the settings in your tenant, users can be allowed to create their own application registrations:

If your admin has set this to "No", you will need to either be a Global Admin, or have been added to one of these roles for creating app registrations:

Create the following App registration, just give it a name and leave the other settings at their defaults:

Add to the permissions so that you have the following Microsoft Graph delegated permissions:

  • Presence.Read
  • Presence.Read.All
  • User.ReadBasic.All

Note that none of these permissions require admin consent. If you later want to manage subscriptions to Graph resources that require admin consent, you will need assistance from your Global Admin to consent to any permissions that require this.

Next, on the Overview page, copy the Application (client) ID, you will need this for later. Next go to Certificates & Secrets, and create a new secret:

Copy this secret for later.

We will now use this app registration as our API to Microsoft Graph, using a Custom Connector in Power Platform.

Create a Custom Connector for Microsoft Graph

Initial creation and authentication

If you haven't already, log in to flow.microsoft.com, and go to Data and Custom Connectors. The following procedure includes a lot of manual steps; if you want to import this connector and its actions from my swagger definition file, you will find a link at the end of this section.

Select to create a new Custom Connector, from blank. Give it a name, for example:

Provide an optional description, and specify graph.microsoft.com as Host:

Next, under Security, select OAuth 2.0 for Authentication and select Azure Active Directory for Identity Provider. Add the Application Id for the App Registration as Client id, and the Secret you created earlier for Client secret. Type https://graph.microsoft.com as resource, and add under Scope the permissions you added so that the user can consent to these permissions. Type each permission with a space between:

Now, click on Create Connector and verify that it has been successfully created. By now you will see that the Redirect URL field has been updated; copy this value:

Next, go back to your Azure AD App Registration, and under Authentication add a Web platform with the Redirect URI set to the copied value from the Custom Connector.

Let's verify the Custom Connector before we go to the operations. Go to 4. Test, and select to create a New connection:

Next, you will be asked to authenticate with your Azure AD user, note that you will have to consent to the permission scopes we defined for the connector:

And you should successfully have a connection to Microsoft Graph, via the Custom Connector and our App Registration:

Now, we need to add operations via 3. Definition, this is where we will call the Microsoft Graph API for queries regarding creating subscriptions and more.

Define Operations for the Connector

Basically this Connector will be doing the same queries as I did using Graph Explorer in the preceding blog post, plus a few more. A pro tip is to use Graph Explorer for sample requests and responses when defining operations. In the following I will set up operations for:

  • GET /users/{userUpn}/?$select=id: Get a specific user and id
  • GET /communications/presences/{userId}: Get Presence for a specific user by id
  • POST /subscriptions: Create a new subscription
  • GET /subscriptions: List all my active subscriptions
  • GET /subscriptions/{subscriptionId}: Get a specific subscription
  • PATCH /subscriptions/{subscriptionId}: Update an existing subscription to renew
  • DELETE /subscriptions/{subscriptionId}: Delete a subscription

Get a specific user and id

We will start with the first operation. Go to Definitions and add a new Action. Specify a summary and operation id:

Next under Request, click on Import from sample. Specify a GET query that has the format: https://graph.microsoft.com/v1.0/users/{userUpn}?$select=id, like this and click import:

This will interpret the request URL and recognize that userUpn is a Path parameter variable and $select is a query parameter:

Next, go over to Graph Explorer, and run the same query specifying your own user principal name, copy the response:

Back in the Custom Connector, under Response, click on the default response and click Import from sample. Paste the response you copied from Graph Explorer and click Import:

Click on Update Connector to save this action. You can now go to 4. Test. You can use the connection created earlier, and select which operation you want to test, in this case specify your Upn and select id to be returned. Testing the operation should be successful:

With the first operation added, let's add the remaining. I will skip the screenshots in the following, just follow the same steps using the information below where I summarize the details.

Get Presence for a specific user by id

Create a new subscription

  • Action Summary: Create Subscription
  • Action Operation Id: CreateSubscription
  • Request Verb: POST
  • Request URL: https://graph.microsoft.com/beta/subscriptions
  • Request Body:
    {
    "changeType": "updated",
    "clientState": "{MySecretClientState}",
    "notificationUrl": "{WebhookUrl}",
    "resource": "/communications/presences/{UserId}",
    "expirationDateTime": "{ExpiryDateTime}"
    }
  • Request Response: (copy example response from running this in Graph Explorer)

List all my active subscriptions

Get a specific subscription

  • Action Summary: Get Subscription
  • Action Operation Id: GetSubscription
  • Request Verb: GET
  • Request URL: https://graph.microsoft.com/beta/subscriptions/{subscriptionId}
  • Request Response: (copy example response from running this in Graph Explorer)

Update an existing subscription to renew

  • Action Summary: Update Subscription
  • Action Operation Id: UpdateSubscription
  • Request Verb: PATCH
  • Request URL: https://graph.microsoft.com/beta/subscriptions/{subscriptionId}
  • Request Body:
    { "expirationDateTime": "2020-07-14T00:00Z" }
  • Request Response: (copy example response from running this in Graph Explorer)

Delete a subscription

Custom Connector Summary

By now you should have 7 actions defined, remember to test to verify that they work as expected:

NB! Currently I get an error when running GET /subscriptions/{id}, even though it is a correct subscription id and follows the official docs: https://docs.microsoft.com/en-us/graph/api/subscription-get?view=graph-rest-beta&tabs=http. I'll leave the action in the connector, but testing it gives me an "internal storage error" both in Graph Explorer and when testing with this Custom Connector.

As mentioned earlier, I’ll provide you a link to the complete swagger definition of this connector and its actions here on my GitHub repository:

github.com/JanVidarElven

The Custom Connector can now be used in Power Automate Flows and Power Apps, so in the next section I will start looking into some logic for managing subscriptions.

Choosing a Configuration Source

First we need to look into some kind of configuration source for your Subscriptions to Graph Change Notifications. Consider the following:

  • When you create a subscription, you need somewhere to store the subscription id if you want to be able to update/renew or delete an existing subscription.
  • If you want to continuously renew a subscription, you might want to set an end date for how long you want a subscription to exist.
  • You will want to store the webhook url for where to send the change notifications, and possibly being able to change that if needed.
  • You will want to store the resource/resources you want receive change notifications for and for what type of change.
  • It makes sense to store the clientState secret so that it can be re-used when creating/re-creating subscriptions.

The next question is, where should I store this configuration info? There are some options. If you have access to an Azure subscription, you could of course store this in a SQL Database or Cosmos DB to name a few common data stores, or maybe Azure Key Vault for some of the secret info. I want to focus on options typically available for end users though, so it makes sense to go more in the direction of Microsoft 365 services. You could use CDS, Common Data Service, part of the Power Platform, using solutions and custom entities, but that would require admin assistance in either configuring this for you, or giving your end user permission roles to do this yourself. In addition, CDS will be available for all users with access to the environment, so creating a dedicated environment should also be part of that.

In this blog post I will go for the simplest option: I will store it in a list, using SharePoint Online lists. Another alternative could be to use Microsoft Lists. I will do this by creating a dedicated Team, going to the SharePoint Online team site, and creating the list from there.

So I have prepared this now, and my list contains the following columns: subscriptionId, changeType, clientState, notificationUrl and resource, all with field type single line of text, and an endDateTime with field type Date with Time:

This will be my configuration store for the following Flows and PowerApps that will manage my subscriptions using the Custom Connector created earlier.

Creating the PowerApp for Managing Subscriptions

My main user interface for managing Graph Subscriptions will be a PowerApp. From this PowerApp I will call actions from the Custom Connector directly, or call Power Automate Flows for where I will have more steps and logic.

Note that this PowerApp can be used to manage all sorts of Microsoft Graph Subscriptions you might have, not just for Team Presence API which is the main focus in this example but for other resources also.

Log in to make.powerapps.com to get started, and follow the general steps in the sections below.

Create a blank canvas app and main screen

Select to create a new Canvas app from blank, give it a name like below, and select a format. I will use Tablet format because I will mainly use this from my PC:

Next in the PowerApps creator studio you can set a Theme for your app, or use your own custom colors and backgrounds. Then you can start adding controls to the canvas. In the below image I have added a custom background, my own logo, some labels for headings and some buttons I will add actions to later. I have also added an icon for later being able to manually refresh.

This main screen will list my active Graph subscriptions on the left part, this will be queried from GET https://graph.microsoft.com/beta/subscriptions. The right part will be from my configuration source, which is the SharePoint list.

It is also a good practice to name the controls using your own naming convention, this is mine for the screen above:

Add Data sources to the Main screen

Next we will start adding the data sources. Go to the View menu, select Data sources, and click “Add data”. Search for “SharePoint” and add the following data source:

Connect with your user, or create a new connection if needed:

Next paste in the SharePoint URL for the SharePoint site where you created the list earlier, and click Connect:

Choose the list:

We now have a data source we can use for this list in the PowerApp:

Next, add a new control to your app, click Insert and under Layout find the “Data table (preview)” control. This will then be placed on your canvas, and you are prompted to select a data source. This is where I select the SharePoint list:

The data table control should now be automatically configured with the fields you created in the list:

To make the data table smaller I will remove some of the fields to display, I’ll remove changeType, clientState and notificationUrl. This leaves me with the following data table which I place under the configured subscriptions part. The list is now empty, but that will be filled in later when I create subscriptions:

Now lets proceed to get the actual Graph subscriptions. Add a new data source, this time search for the name of the Custom Connector we added earlier:

Add that source and choose a connection, and the data source should be added alongside the SharePoint data source:

We will add another Data table control to the Canvas, select Insert and Data table, this time selecting the Custom Connector as data source.

The Items property for the Data table needs to be set as: (MSGraphSubscriptionConnector.GetSubscriptions()).value, which will return a data table for every registered Graph subscription. This should look like this:

After that you can configure the fields to display for the data table, I have added the id, resource and expirationDateTime fields, and placed this data table under the “My Active Graph Subscriptions” part:

I now have the data connections I need for the main screen. The next part is to add actions to the buttons, but before that I will add another screen for where to specify more details for creating a subscription.

Before you proceed, select the main screen from the Tree view and duplicate screen:

I will use this copied screen as starting point for adding the next set of controls.

Creating a Screen for adding new subscriptions

In this screen I have added controls on the left side where you can type in a user principal name, get the user id and get the presence for that user. In this way I can check that I have the correct permissions required for subscribing to change notifications for the specified user.

On the right part of the screen I have added controls for adding a new Graph subscription.

Most of the controls above are label and textbox controls. For User Id and Presence I have changed the Display mode to View, and for Notification URL I have changed from single line to multi line.

The change type is a combo box, where I have added to the Items property: ["updated","created","deleted"]. The expire date time is a date picker control showing date and time in short format.

At the bottom left I have added an icon with a home symbol, and on the OnSelect event for that icon I have added a navigation back to the home screen:

Similarly, on the Home screen I have added a navigation on the “Create Subscription” button to the create subscription screen:

In the next section we will add some actions to the buttons.

Getting User Id and Presence

We will start on the left side of the screen, getting user id and presence for that user. When we created the Custom Connector earlier we created actions just for this scenario.

I will use the UpdateContext function in PowerApps to save the output of the called actions to a variable, and then use that variable for the text input property.

Select the “Get User Id” button, and type the following on the OnSelect event:
UpdateContext({UserId: MSGraphSubscriptionConnector.GetUserIdByUpn(txtUserUpn.Text)})

Like the following:

This will create/update my variable "UserId", by calling the Custom Connector action GetUserIdByUpn with the input parameter being the text value from the txtUserUpn text input box. When I click on this button, it will run a query against Microsoft Graph that returns a user record object from which I can retrieve the .id property.

Next, set the Default property of the txtUserId text input to UserId.Id:

We can test this right away: click on Preview the App (F5), type in an existing upn, and press the Get User Id button; it should return the id for your user. You can try several different user upn's also:

We will do something similar on getting presence. On the OnSelect event for the “Get Presence” button:
UpdateContext({UserPresence: MSGraphSubscriptionConnector.GetPresenceForUser(txtUserId.Text)})

Then set the Default property for the Presence text input to:
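Based on the UserPresence variable set above, and assuming the connector returns the Graph presence object as-is, a likely value is the availability property:

UserPresence.availability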

When testing you should be able to get the specified users presence by user id.

I’ll add one more thing, on the right side of the screen, for the resource text input, add the following to the Default property:

This will pre-populate the resource based on the user I looked up on the left side, which will come in handy when I create the subscription later:
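As a sketch, assuming the presence subscription resource format /communications/presences/{id} from the Graph documentation, and that txtUserId is the text input holding the looked-up user id, the Default property could be built like this:

"/communications/presences/" & txtUserId.Text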

In the next part I will create the logic behind adding a new subscription, and this will be done using a Flow, as I will have several steps in this logic.

Creating Flow for adding new Graph Subscriptions

This Flow will be triggered from the PowerApp above, and use the Custom Connector action for adding a new Graph Subscription. In the same Flow I will also save the configuration of the subscription to my SharePoint list.

Start by going to flow.microsoft.com, select Create, Start from blank and select Instant flow. Type a name for the Flow, for example “Create Graph Subscription”, and select PowerApps as trigger as shown below:

Next, add the action Initialize variable right after the PowerApps trigger. Set the name of the variable and rename the action to the same. Do this 5 times, as we need input of changeType, resource, clientState, expirationDateTime and notificationUrl. Leave the Default value blank at first.

Pro tip: Make sure that you rename the actions before you do the next step: For each variable, for value select Ask in PowerApps:

The input parameters will now be named nice and recognizable, which will make it easier when calling it from PowerApps:

Before we proceed, we need to convert the date time input from PowerApps to ISO 8601 format and the UTC timezone, which is expected by the Graph API. Additionally, the subscription expiration can only be a maximum of 4230 minutes into the future from creation. The thinking behind my PowerApp for adding new graph subscriptions is that the expire date time is sometime in the future, including any renewals. So for now I need to create a new variable that takes today's date, adds the maximum of 4230 minutes, and converts it to ISO 8601 UTC format.

Add a new Initialize variable action. Give it a descriptive name, set the type to String, and use the following expression to take the current UTC time and add 4230 minutes: addMinutes(utcNow(),4230), like this:

For the next action, select the MSGraph Subscription Connector, and the Create Subscription action:

You can now add the variables as input parameters to the create subscription action, remember to use the calculated variable for graph expire date time as shown above:

The next step will be to save this configuration to the SharePoint list. Add an action called “Create item” from the SharePoint connector like below. Select the SharePoint site address and the list name for the configuration source created earlier:

Title is required when adding a list item, so let’s use Ask in PowerApps for this one also:

For subscriptionId, select the “id” output from the Create Subscription action:

For the rest of the list values, select the initial variables:

The Flow is now ready to be called. I could have added more error handling and custom response if input data is missing, but I will proceed for now with this.

Calling the Flow from PowerApps

Back in the PowerApp, select the “Add Subscription” button, and on the Action menu select “Power Automate”:

Select the “Create Graph Subscriptions” Flow, which then will be added to the PowerApp, and you are prompted to provide the parameters we defined as “Ask in PowerApps” in the Flow:

Fill in the parameters like below:

Now you can press F5 to preview the app. Type in a UPN, and click to get the User Id. On the right side, make sure that the change type is set to “updated” (if you want to subscribe to presence change) and that the resource is filled in with the correct user id. From the date picker, select any future date, we will later use this date for how long the subscription should be auto-renewed. Last, add the notification URL for the change notifications. Then click on “Add Subscription”, which now should trigger the Flow:

Visually, nothing changes in the PowerApp, so we need to look at the run history for the Flow to see if it ran successfully:

We can also look at the steps to verify that indeed a Graph subscription was created and a list item was added to SharePoint:

While both the Graph subscription and the list have been updated, we need a way to refresh that into the PowerApp. We can use the Refresh(data source) method for the SharePoint list, but that won’t work for the Custom Connector. We will get to that later, let’s start with the SharePoint list. On the refresh icon I added for the main screen, on the OnSelect event, add the following: Refresh('Change Notification Subscriptions');, like this:

If you now preview the app and click on the refresh button, the right side data table should now show the SP list item that has been created (PS! if you have a dark background theme, make sure that the data table control has a light background so you can see the text):

Next, for refreshing the Graph subscriptions via the connector, I will change to use a Collection and the ClearCollect method. On the refresh icon, add a new line like this: ClearCollect(GraphSubscriptions,(MSGraphSubscriptionConnector.GetSubscriptions()).value), as shown below:

Next, change the Items property for the left data table for Graph subscriptions to use this Collection:

This should now show the active Graph subscriptions like below:

Now that we have a way to manually refresh these two data tables, using the refresh icon, we can add the same logic to the “Add Subscriptions” button:

Here is the above in clear text; the Set(wait, ..) pattern is a nice way to show a busy status in PowerApps during potentially long-running tasks:

Set(wait,true);
CreateGraphSubscription.Run(dropDownChangeType.SelectedText.Value,txtResource.Text,txtClientState.Text,dateExpireDateTime.SelectedDate,txtNotificationUrl.Text,(txtUserUpn.Text & " Presence"));
Refresh('Change Notification Subscriptions');
ClearCollect(GraphSubscriptions,(MSGraphSubscriptionConnector.GetSubscriptions()).value);
Set(wait,!true)

At this point we have the solution ready for creating Graph Subscriptions and refreshing that in the data tables. Next we need to create Flows for updating and deleting subscriptions.

Blog Series – Power’ing up your Home Office Lights: Part 9 – Using Microsoft Graph to get Teams Presence and show state in PowerApp.

This blog post is part of the Blog Series: Power’ing up your Home Office Lights with Power Platform. See introduction post for links to the other articles in the series:
https://gotoguy.blog/2020/12/02/blog-series—powering-up-your-home-office-lights-using-power-platform—introduction/

In this part 9 we will use Microsoft Graph to get the logged-in user's Teams Presence, and show that state in the PowerApp.

I have previously written another post on Teams Presence, Microsoft Graph and requirements here: Subscribing to Teams Presence with Graph API using Power Platform | GoToGuy Blog. If you want to dig deeper into that I would recommend that you read that post, but for now in this article I will show how you can get your Teams Presence into the Hue Power App.

Teams Presence is currently available in the beta endpoint of Microsoft Graph here: https://graph.microsoft.com/beta/me/presence

If you quickly want to check your own Teams Presence via the Microsoft Graph you can try the following. Just click this link that will launch in Graph Explorer: https://developer.microsoft.com/en-us/graph/graph-explorer?request=me%2Fpresence&method=GET&version=beta&GraphUrl=https://graph.microsoft.com

Just remember to consent to the Presence.Read permission as shown below:

As always when calling Microsoft Graph, we need to authenticate to Azure AD and be authorized to the Graph API to get an access token for querying resources. And if we want to do that from Power Platform, we need to create an app registration for that in Azure AD.

App Registration in Azure AD

This step might depend on whether your tenant administrator has restricted the users’ right to create app registrations. If so, you will need to log into your tenant as a Global Administrator or Application Administrator, or get help from your IT admin to create the following App Registration in Azure AD.

If not, the following operations don’t require admin consent or permissions, so you can go ahead and create the App Registration. At the Azure AD Portal, go to https://aad.portal.azure.com, App Registrations, and add a new registration like below:

Just leave the Redirect URI blank for now and click register.

Next, click on API Permissions, and click add a permission and select Microsoft Graph at the top, click on Delegated permissions, and add the Presence.Read permission as shown below:

You should now have the following permissions:

Next, go to Certificates & secrets, add a new client secret with a description, and select your chosen expiry:

Click Add and copy the secret value, which will be shown only this once. Save this secret for now, we will need it later. Also, go back to Overview and copy the Application (Client) Id for later. We will need that as well.

There is just one thing left in this app registration, but for now we need to switch over to Power Platform for creating the Custom Connector.

Custom Connector in Power Platform for Microsoft Graph

We will now create a custom connector in Power Platform to reference this App Registration and get the Presence. Log in to either make.powerapps.com or flow.microsoft.com for this next step.

Under the Data menu, select Custom Connectors. Select to add new connector from blank, and give it a name:

Select Continue, and on the General page, type graph.microsoft.com as host. You can also upload an icon and a description:

On the Security page, select OAuth 2.0 as type, and Azure Active Directory for Identity Provider. Client Id and Secret are the App Id and Secret from the App Registration earlier. Resource Url is https://graph.microsoft.com, and specify the scope to be Presence.Read:

After that, click on “Create Connector”, and the “Redirect URL” will be populated:

Copy this URL and add it as a Web platform Redirect URI back in the Azure AD App Registration:

Back in the Custom Connector, go to Step 3. Definition, and click New Action. Type in a Summary “Get Presence” and Operation ID “GetPresence”, and under Request click Import from sample. Specify Get as verb, and URL to https://graph.microsoft.com/beta/me/presence, like below, and click Import:

Go to the Response section, and click on the Default response. Click on Import from sample and specify Content-Type application/json for Header response, and for Body, paste in the response you got when you tried the presence query in Graph Explorer in the beginning of this blog post:
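For reference, a minimal sample body for the Default response could look like the following (the values are placeholders, use the actual response you got from Graph Explorer):

{
    "id": "00000000-0000-0000-0000-000000000000",
    "availability": "Available",
    "activity": "Available"
}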

The action should now look like this:

We can now proceed to Test. Click on Update Connector and under 4. Test click on “New connection”, and then Create:

Sign in and then accept the application to read your presence information and profile as shown below:

I can now test the GetPresence action with the signed in connection, and verify a successful response. In my case my availability just now is “Away”:

With the Custom Connector now ready, I can proceed to add this status to my PowerApp.

Customizing the Hue Power App to get Presence

Back in the Power App I created in earlier parts of this blog series, I want this icon to reflect my Teams Presence status. I will start simple by adding an OnSelect event to this icon that will get my Presence status using the Custom Connector.

Under View menu, and Data, select to add the custom connector as a new connection to the PowerApp:

On the OnSelect event for the presence icon, I will use Set function and a variable called MyPresence, where I run the Custom connector and GetPresence operation like below:

Set(MyPresence,MSGraphPresenceConnector.GetPresence())

This is how it looks:

Holding down the ALT key, I can now click on the Icon to run the OnSelect event. After that I can go to the View menu again, and under Variables I will find the MyPresence variable. When looking into that record, I can verify that I indeed have received my presence status:

The next part is to update the color of the Icon to reflect the status. For now at least, I also want an extra label that shows the status as a text value. Let’s start with that. I add a label next to the Icon and set the Text property to MyPresence.availability, as shown below:

You should now be able to change the Teams Presence and then click on the Icon in the Hue Power App to update presence status text:

From the Graph Documentation, presence resource type – Microsoft Graph beta | Microsoft Docs, the following values are possible for presence availability, and I have added the suggested colors for these statuses:

  • Away (Yellow)
  • Available (Green)
  • AvailableIdle (Green)
  • Busy (Red)
  • BusyIdle (Red)
  • BeRightBack (Yellow)
  • DoNotDisturb (Red)
  • Offline (Light Grey)
  • PresenceUnknown (White)

What remains is to also update the color of the Teams Presence Icon to reflect the status. For this I chose to use the Switch function, where I evaluate MyPresence.availability and return different results:

Switch( MyPresence.availability, "Away", "Result1", "Available", "Result2", "AvailableIdle", "Result3", "Busy", "Result4", "BusyIdle", "Result5", "BeRightBack", "Result6", "DoNotDisturb", "Result7", "Offline", "Result8", "PresenceUnknown", "Result9", "DefaultResult" )

I will use that Switch formula to set the Fill property of the Icon, which now is manually set to Red like this:

So after picking the colors, I end up with this formula:

Switch( MyPresence.availability, "Away", RGBA(253, 185, 19, 1), "Available", RGBA(146, 195, 83, 1), "AvailableIdle", RGBA(146, 195, 83, 1), "Busy", RGBA(196, 49, 75, 1), "BusyIdle", RGBA(196, 49, 75, 1), "BeRightBack", RGBA(253, 185, 19, 1), "DoNotDisturb", RGBA(196, 49, 75, 1), "Offline", RGBA(128, 130, 133, 1), "PresenceUnknown", RGBA(255, 255, 255, 1), RGBA(0, 0, 0, 0) )

Adding this to the Fill property of the Icon:

After this you should be able to change your Teams Presence status, and then click on the Icon to update the status in the PowerApp:

One last thing remains before I conclude this blog post: I want to update the presence status every time I navigate to this screen in my PowerApp. I’ll just add the following line to the OnSelect event for the Control Lights button on the main screen:
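As a sketch, combined with the existing OnSelect formula for that button from part 8, the result would be along these lines (exactly where the presence line is placed is my choice):

Navigate(screenPresenceLights);
Set(wait,true);
Set(MyPresence,MSGraphPresenceConnector.GetPresence());
ClearCollect(MyHueLights,'Hue-GetLightsandState'.Run(JSON(HueResponse)));
Set(wait,!true)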

Summary & Next Steps

In this blog post I have shown how you can get the Teams Presence status into the Hue Power App, and for now the status is manually updated either by clicking on the status Icon, or when navigating to the lights screen.

In the next, and last part, of this blog series, I will show how you can subscribe to Microsoft Graph changes, so that you can automatically get status updates.

Thanks for reading so far, see you in the last part 10 of this blog series!

Blog Series – Power’ing up your Home Office Lights: Part 8 – Using Power Automate Flows to Get and Set Lights State

This blog post is part of the Blog Series: Power’ing up your Home Office Lights with Power Platform. See introduction post for links to the other articles in the series:
https://gotoguy.blog/2020/12/02/blog-series—powering-up-your-home-office-lights-using-power-platform—introduction/

In Part 7 we built the main screen of the PowerApp, the topic for today is to build Flows and the PowerApp screen for controlling the Hue Lights:

If you want a quick summary of how this screen works, take a look at this video:

<YOUTUBE VIDEO PROCESSING, AVAILABLE SOON>

Building the Lights Control Screen

Start by adding another screen to the Hue PowerApp. If you have used a custom background color, logo and other graphical elements like I have, you can do the same for this screen also. In addition to the label controls I’ve added for texts, I’ve added the following controls to my Hue PowerApp:

  • Small circle icons/shapes to reflect color states.
  • Toggle controls to set Light state On/Off and sync with Teams Presence On/Off.
  • Dropdown list for listing the Hue Lights.
  • Slider control for setting Brightness.
  • I’ve also added a Timer control and set it to not visible.

After adding and customizing the controls, and naming them according to your chosen naming convention, your Hue PowerApp might look like the following:

Now we need to create a couple of Flows (as of today these are named Cloud Flows) for getting and setting Light State.

Creating Flow for Getting Lights and State

Create a new Instant Flow with PowerApps as Trigger. Name the Flow “Hue – Get Lights and State”. First add a Compose action, name the action “Access Token and User Name”, and select Ask in PowerApps under Dynamic Content:

Next, add a Parse JSON action below:

You can use the following schema:

{
    "type": "object",
    "properties": {
        "access_token": {
            "type": "string"
        },
        "username": {
            "type": "string"
        }
    }
}
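For context, the content this schema parses is the JSON string the PowerApp will send as input, something like the following (both values are placeholders):

{
    "access_token": "<bearer token retrieved from the Logic App>",
    "username": "<your whitelist identifier>"
}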

We are now ready to query for the Lights for my Hue Remote API. But first it is helpful to understand a little about how the Hue Remote API returns lights. Earlier this year I published this blog post about exploring the Hue Remote API using Postman: Remote Authentication and Controlling Philips Hue API using Postman | GoToGuy Blog. For example when I query for all lights, https://api.meethue.com/bridge/{{username}}/lights/, I get a response similar to this:

The special thing to note here is that Hue returns every light as a named object identified by a light number. This is not an Array, so you cannot loop through it as you would expect. So I needed to think a little differently in my solution.

I decided to create my own Array, and get the Lights one by one. For this I needed to start at light number “1”, and then do until some maximum value. I currently have 13 lights, so I created a variable set to “13”. It makes it a little static, but at least it works with as little hassle as possible.

First add an Initialize variable action of type Array named arrayLights, using the expression json('[]') as the value for an empty JSON array:

Next, add two more Initialize variables actions, both of type Integer and named LightNumber with value 1, and NumberOfLights with value 13 (or whatever number of lights you have!).

Now, add a “Do until” action, setting LightNumber is greater than NumberOfLights as loop control:

Inside the Do until loop, add an HTTP action, where we will run a GET request against https://api.meethue.com/bridge/<whitelist identifier>/lights/<lightnumber>, using the access_token as a Bearer token in the Authorization Header:
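A sketch of that request, assuming the Parse JSON action above is named Parse_JSON and using the loop counter variable for the light number:

GET https://api.meethue.com/bridge/@{body('Parse_JSON')?['username']}/lights/@{variables('LightNumber')}
Authorization: Bearer @{body('Parse_JSON')?['access_token']}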

This will return the first light state. Add an Append to array variable action, selecting “arrayLights”, and add the value like the following:

This will add the Light number, the name of the Light source (body('Get_Light')?['name']) and whether the state on is true or false (body('Get_Light')?['state/on']).
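Put together, the appended value could be a small JSON object along these lines (the property names are my assumption, the expressions are the ones referenced above):

{
    "LightNumber": @{variables('LightNumber')},
    "Name": "@{body('Get_Light')?['name']}",
    "State": @{body('Get_Light')?['state/on']}
}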

Next action is to add an Increment variable action to increase the LightNumber by 1:

And last, outside the Do until, add a Response action so that we can return the data to the PowerApp. The important part here is to specify status code 200 and content-type application/json, and return the arrayLights variable as shown below:

Getting the Lights and State to the PowerApp

Now that we have the Flow for getting Lights and State, we can get that data into the PowerApp. Back in the PowerApp, select the Button control on the Main Screen named Control Lights. Click on the Action menu, then Power Automate, to link the “Hue – Get Lights and State” Flow, and add the following lines to the OnSelect event:

Navigate(screenPresenceLights);
Set(wait,true);
ClearCollect(MyHueLights,'Hue-GetLightsandState'.Run(JSON(HueResponse)));
Set(wait,!true)

To explain, the Navigate(<screen>) is of course for changing to the other screen. I also use Set(wait,true) and Set(wait,!true) on either side of the Flow run to make the PowerApp appear busy. And then I save all the Lights and State from the Flow response into a Collection, using ClearCollect and the Collection name “MyHueLights”. The Flow run expects that I supply the access_token and username, which I already have as a record variable in the shape of “HueResponse”. So I’ll just wrap that in a JSON(..) function.

We can now test this. Hold down the ALT key on your keyboard and click on the “Control Lights” button. After this, go to the View menu and select Collections. You should see the “MyHueLights” collection, and a preview of the first 5 items:

Now we can get that data in to the PowerApp controls. Select the Drop Down list control, and set the Items property to “MyHueLights” and the Value to “Name”:

This should fill the Drop Down with Light names. Next, for the Drop Down list OnChange event, add the following:

Set(SelectedLight,(ddlMyLights.SelectedText));
Set(CheckStatus,false);
If(SelectedLight.State="True",Set(CheckStatus,true);Set(LightState,true),
Set(CheckStatus,true);Set(LightState,false)
)

So in the above expression for the OnChange event, I set a variable “SelectedLight” to the selected text from the Drop Down, and then I set two more variables, “CheckStatus” and “LightState”, depending on whether the state on is true or false.

Proceed to select the toggleLightState control, and set the Default property to the variable “LightState” and the Reset property to “CheckStatus”:

We now have what we need for getting the Lights and State into the PowerApp. The next thing we need to build is to actually set Light states and colors back to the Hue Remote API.

Creating Flow for Setting Lights and State

Create a new Instant Flow with PowerApps as trigger, and name it “Hue – Set Light and State”. Start by adding the same two initial actions, for the access token and username, as in the “Hue – Get Lights and State” Flow:

Next, add an Initialize variable action, with the name “Initialize LightNumber”, and select “Ask in PowerApps” under Dynamic content so that this input will be submitted from the PowerApp:

After that, add a Compose action. Name it “Body State”, and select “Ask in PowerApps” for input:

This input parameter is where we will supply the light state, colors etc.

Next add a Parse JSON action, using the outputs of the previous Body State input:

You can use the following schema:

{
    "type": "object",
    "properties": {
        "on": {
            "type": "boolean"
        },
        "xy": {
            "type": "array",
            "items": {
                "type": "number"
            }
        },
        "bri": {
            "type": "integer"
        }
    }
}

After this, add an HTTP action, using method PUT, and the address https://api.meethue.com/bridge/<whitelist identifier>/lights/<lightnumber>/state, and including the access_token as a Bearer token in the Authorization Header. For Body, construct the following JSON body:
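A minimal sketch: since the Body State input from the PowerApp already is the JSON state document, one option is to simply pass the parsed content through as the request body (assuming the Parse JSON action is named Parse_JSON):

@{body('Parse_JSON')}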

And last, add a Response action to return status code and body to the PowerApp:

We now have a Flow we can call from the PowerApp to set the light states.

Control Light States from PowerApp

Let’s start by turning the selected Light on and off. Select the Toggle control for Light State, and under the Action menu add the Power Automate Flow “Hue – Set Light and State”. For the OnCheck event, add the following expression:

Set(MyLightState, "{'on':true }");
'Hue-SetLightandState'.Run(JSON(HueResponse), SelectedLight.LightNumber , MyLightState)

And for the OnUncheck event:

Set(MyLightState, "{'on':false }");
'Hue-SetLightandState'.Run(JSON(HueResponse), SelectedLight.LightNumber , MyLightState)

So as you can see above, I’m using a variable named “MyLightState” for dynamically storing the different light states I want to set and submit to the Flow. The 'Hue-SetLightandState'.Run call takes three inputs: the access_token and username (via the HueResponse variable), the selected LightNumber, and the MyLightState variable.

Next, let’s go to the Slider control for setting Brightness. On the OnChange event, add the following expression:

Set(MyLightState, "{'bri': " & sliderBrightness.Value & " }");
'Hue-SetLightandState'.Run(JSON(HueResponse), SelectedLight.LightNumber , MyLightState)

Here I’m changing the state via the ‘bri’ value, and the sliderBrightness.Value. Btw, the Slider is set to minimum 2 and max 254, to support the values expected by the Hue API for ‘bri’.

And then finally we can set the color states for the three icons I have prepared. I have created pre-defined colors reflecting my presence status, green for available, red for busy and yellow for away.

For each of these, change the “OnSelect” event to the following:

Green (Available):

Set(MyLightState, "{'on':true, 'xy': [ 0.358189, 0.556853 ], 'bri':" & sliderBrightness.Value & " }");
 'Hue-SetLightandState'.Run(JSON(HueResponse), SelectedLight.LightNumber , MyLightState)

Red (Busy):

Set(MyLightState, "{'on':true, 'xy': [ 0.626564, 0.256591 ], 'bri':" & sliderBrightness.Value & " }");
 'Hue-SetLightandState'.Run(JSON(HueResponse), SelectedLight.LightNumber , MyLightState)

Yellow (Away):

Set(MyLightState, "{'on':true, 'xy': [ 0.517102, 0.474840 ], 'bri':" & sliderBrightness.Value & " }");
 'Hue-SetLightandState'.Run(JSON(HueResponse), SelectedLight.LightNumber , MyLightState)

A few words about the colors: this is something that can be a little difficult to grasp. Hue has an explanation of the CIE color space and the “xy” resource here: Core Concepts – Philips Hue Developer Program (meethue.com).

You can also see some conversion functions here: Color Conversion Formulas RGB to XY and back – Philips Hue Developer Program (meethue.com)

Basically I’ve tested and learned. A good tip is to set the color you like using the official Hue Mobile App, and then read the state for the light.

Summary and Next Steps

The Hue PowerApp now has a working solution for getting Lights and State, as well as manually controlling colors, toggling lights on and off, and setting brightness.

In the next part of this blog post series, we will look into getting the presence status from Teams and show that in the Power App.

Thanks for reading!

Blog Series – Power’ing up your Home Office Lights: Part 7 – Building the PowerApp for Hue to Get Config and Link user

This blog post is part of the Blog Series: Power’ing up your Home Office Lights with Power Platform. See introduction post for links to the other articles in the series:
https://gotoguy.blog/2020/12/02/blog-series—powering-up-your-home-office-lights-using-power-platform—introduction/

With the Power Automate Flows we’ve built in the previous parts, we should now be able to link and whitelist the user and get the Hue Bridge configuration details. It is time to build the main screen of the “Hue PowerApp”!

Here is a short video where I talk about the basics of the main screen of the PowerApp we are going to build:

Building the PowerApp and Main Screen

In my solution I wanted to build a canvas app with a phone layout, to be able to use it when on my mobile as well. Start by logging in to make.powerapps.com, and creating a new app from Blank, and either phone or tablet layout by your preference:

This next step is up to your preference and personal choice, but what I did was the following:

  • Added a custom background color from your palette (if you have a branding profile) or you could choose one of the built-in themes:
  • Add a Header logo
  • Add elements like frames and icons. I often use a Label control and set the border for it to create a frame like figure.
  • Add label controls for your text and placeholders for where you will update values later. Set font colors for labels and labels where you will have values.
  • Add some Images for where you want to add an action to the OnSelect event.
  • Add Button controls or Icons for navigating between screens.
  • Use a naming convention for your controls.

In the end, after adding and formatting all controls and before adding any data to the PowerApp, my Hue PowerApp looks like this:

I’ve uploaded images for the Authorization and Linking; for your convenience I’ve attached those here:

After finishing the PowerApp main screen design, we can proceed to adding actions and getting data.

Connecting the PowerApp to Power Automate Flows

Start by selecting the Refresh Icon, on the Action menu, click the On Select button to change to the OnSelect event, and click the Power Automate button:

Under Data, select to associate the “Hue – Get Access Token and Config” Flow:

This will start populating the OnSelect event field, which you then edit so that you use the Set function and save the response from calling the Flow in the variable HueResponse, like this: Set(HueResponse,'Hue-GetAccessTokenandConfig'.Run())

Let’s test this action. Before this I removed my user from the Microsoft List “Elven Hue Users”, so the list is empty now:

Hold down the ALT key on your keyboard and click on the Refresh icon. The Flow will now run and you will see the small dots flying over the screen, but you won’t see any data yet. You can, however, check the contents of the “HueResponse” variable. Do this by going to the View menu and clicking the “Variables” button. There you should see the HueResponse variable; it is of type “Record” and you can click on that Record icon:

You should now see something like the following values, if I hadn’t deleted the username from my List earlier I would see values for all these fields:

If I compare this with the response output from the Flow I triggered with the refresh icon above, I can see that the output really reflects the contents of the “HueResponse” variable:

Let’s add these values to the labels I prepared in the PowerApp.

For the label containing the Hue name value, add the following to the Text property: If(HueResponse.access_token="","Hue not Connected!",If(HueResponse.username="","Connection to Hue OK, but User not linked!",HueResponse.name))

This should return something like this:

Proceed to add the following to the Text property for each of the remaining configuration value labels:

HueResponse.ipaddress
HueResponse.apiversion
HueResponse.internet
HueResponse.remoteaccess
HueResponse.devicetype

They won’t show any value in the PowerApp yet though. First we need to get the user registered at the Hue Remote API, which is the next step. Select the following image:

On the Action menu, for the OnSelect event, add the Power Automate Flow for Link and Whitelist User. Change the OnSelect event so that this also uses the Set function and stores the response from the Flow in the same HueResponse variable. You also need to supply an input to this flow; for this we will use HueResponse.access_token, so your OnSelect event should look like this:

Set(HueResponse, 'Hue-LinkandWhitelistUser'.Run(HueResponse.access_token))

Let’s test this button. Hold down ALT on your keyboard and click on the image. The Flow should now run, register a user at the Hue Remote API, create a new List item and return the configuration to the PowerApp:

Checking the HueResponse record variable now:

A couple more things remain on the main screen. First, on the App’s OnStart event, add the same formula as on the refresh icon; this gets the config automatically at start:
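That formula, carried over from the refresh icon earlier, is:

Set(HueResponse,'Hue-GetAccessTokenandConfig'.Run())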

Next, select this Image:

On the OnSelect event, add the following:

Launch("https://api.meethue.com/oauth2/auth?clientid=<your_client_id>&response_type=code&state=<youranystring>&appid=<your_app_id>&deviceid=<your_device_id>&devicename=<your device name>")

Replace the <your_…> values with the client id and app id from the Hue Remote API app registration, and your values for device id and name.

Clicking this image will now launch the Hue Developers portal, asking you to Grant permission to the App, and return to the Logic App that retrieves the Bearer Token and store that in the Key Vault as we have seen in previous parts of this blog series.

Summary and Next Steps

We’ve now built the foundation and first part of the PowerApp to retrieve the configuration, create and link the username, and, if needed, authorize and get a new Bearer Token via the Hue Remote API.

In the next part we will build the screen for getting lights and setting lights state and color.

Thanks for reading, see you in the next part!

Blog Series – Power’ing up your Home Office Lights: Part 6 – Using Power Automate Flow to Link Button and Whitelist user

This blog post is part of the Blog Series: Power’ing up your Home Office Lights with Power Platform. See introduction post for links to the other articles in the series:
https://gotoguy.blog/2020/12/02/blog-series—powering-up-your-home-office-lights-using-power-platform—introduction/

In the previous part 5 we created the first Power Automate Flow of the solution, for retrieving the Access Token and getting the configuration of the Hue Bridge via the Remote API. To get all the configuration details of the Bridge, we were dependent on the user having a Whitelist Identifier in the Microsoft List, and that is the Flow we will be working on in this blog post.

Let’s start with a quick video where I talk about this Flow and what it does:

Create the Flow for Linking and Whitelisting User

Create a new instant Flow, with PowerApps as Trigger. In my case I have named this Flow “Hue – Link and Whitelist User”.

As the first action in the Flow after the PowerApps trigger, add a Compose action:

Tip: Make sure that you set a name for the action, in my case “Access Token”, before you select “Ask in PowerApps” under Dynamic content. This way the input parameter will get a more descriptive name, like “AccessToken_Inputs”, when we later call the Flow from the PowerApp.

Next, add two Initialize variable actions, called UserDisplayName and UserEmail, both of type String. For the values, use custom expressions as shown (see the comment for each expression):
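As a sketch, assuming both values come from the PowerApps trigger headers (the x-ms-user-name header is also referenced in part 5; x-ms-user-email is my assumption for the email variant), the expressions would be along these lines:

triggerOutputs()['headers']['x-ms-user-name']
triggerOutputs()['headers']['x-ms-user-email']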

Next, add a Get Items action from SharePoint, and specify your Site and List. For Filter Query, add Title eq ‘<UserDisplayName variable>’:

Add a Condition action, where we check whether an empty() test on the Get Items result evaluates to false:

If the result is not empty, meaning that the user already has a configuration in the List, then under “If yes” add a Get Item action. Specify the Site and List Name, and for Id add the following expression to return the Id of the first result: first(body('Check_if_User_Already_Linked')?['Value'])?['Id']

Next, add an HTTP action where we will query the Hue Remote API for the Bridge configuration details. Specify the URI to be https://api.meethue.com/bridge/<whitelist identifier>/config, and add an Authorization Header with Bearer <Token Outputs>:

This action should return all the details we want from the Hue Bridge, and we can add a Response action to return that back to the PowerApp:

In the other case, when a linked user was not found in the SharePoint List, we need to add that user and get the Whitelist Identifier. Under “If no”, add an HTTP action. In this action we will “remotely” push the Hue Bridge link button via a PUT method. This is basically the same procedure as when you add new lights or equipment, where you would otherwise walk over and physically press the button. But here we do it via the API, like below:

PS! Note that above I’ve used “Raw” Authentication and for Value selected Bearer “AccessToken Outputs”. This is just another option to show; I could have used an Authorization Header instead.
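A minimal sketch of this request, assuming the link button endpoint documented for the Hue Remote API and using the Authorization Header variant mentioned above (the Compose action is assumed to be named Access Token):

PUT https://api.meethue.com/bridge/0/config
Authorization: Bearer @{outputs('Access_Token')}
Content-Type: application/json

{ "linkbutton": true }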

After the Link Button is enabled, we can add another HTTP action. This will register the username via a POST method and a request body containing the “devicetype” value. The device type is there so that you can identify the registered usernames on your bridge:
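A sketch of that request, where the devicetype value is a placeholder following the app_name#device_name convention from the Hue documentation:

POST https://api.meethue.com/bridge/
Authorization: Bearer @{outputs('Access_Token')}
Content-Type: application/json

{ "devicetype": "elven_hue_app#powerplatform" }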

After this action, add a Parse JSON action so that we can more easily reuse the outputs from adding the username:

For Schema, select “Generate from sample”, and paste the sample output provided by the Hue API documentation here, https://developers.meethue.com/develop/hue-api/7-configuration-api/#create-user, under 7.1.5. Sample Response.
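The sample response there is an array with a success object, roughly like this (the username value is a placeholder), which is why the output is indexed with [0] below:

[
    {
        "success": {
            "username": "<generated whitelist identifier>"
        }
    }
]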

Next, add a “Create Item” action. Specify the Site and List Name, and add the following values for List columns:

Note that for Whitelist Identifier, use the expression body('Parse_JSON')?[0]?['success/username'], this is because the output from Hue API returns an array, so the [0] is to specify the first instance.

Now, using that newly created username, we can query for the Bridge config using the “Whitelist Identifier” from above:

And lastly, add a Response action that returns this back to the PowerApp:

Verify and Test the Flow

That should complete the Flow. We will link that into a PowerApp later, but if you want to you can test the Flow by performing the Trigger action yourself. Then you need to specify a valid Access Token, and the Flow should run successfully, creating a Linked User if you haven’t already:

If you check the List a new item should now represent your user:

Summary and Next Steps

We are now ready to start working on the PowerApp, linking the Flows we have created in this and the previous blog posts. That will come in the next part!

Thanks for reading so far 🙂

Blog Series – Power’ing up your Home Office Lights: Part 5 – Using Power Automate Flow to Get Access Token and Config

This blog post is part of the Blog Series: Power’ing up your Home Office Lights with Power Platform. See introduction post for links to the other articles in the series:
https://gotoguy.blog/2020/12/02/blog-series—powering-up-your-home-office-lights-using-power-platform—introduction/

In previous parts we have built the logic for authorizing and getting the Bearer Token for the Hue Remote API, and storing that as a Secret in Azure Key Vault. Now it’s time to move over to the user side of things. In this part we will build a Power Automate Flow that retrieves the Access Token and checks if the user has been set up for configuration. Here is a short video where I walk through that Flow:

Setting up a User State & Config Source

As the PowerApp and Flows we will build are stateless, in the sense that they get data from configured variables and data connections, we need to store some user state and configuration somewhere. The Hue Remote API requires that we register a so-called “whitelist identifier”, a username to be used when sending requests to the Hue Remote API, for example: https://api.meethue.com/bridge/<whitelist_identifier>/lights

The way I have built the solution, the authorization part (getting, retrieving and, if needed, refreshing the Bearer Token) is done in the Logic Apps layer and is common for every user of the Power Platform solution. On the user side of things, I want every user that shares the solution to have their own whitelist identifier.

This means that the first time a user uses the solution, the user must register their device and retrieve the username to be used as the whitelist identifier. Subsequently, users will use their own identifier when calling the Hue Remote API.

So we need something set up to store this information about users’ states and configuration, and I have chosen to use a SharePoint List/Microsoft List to do this. This List has been created in a Team where the users of the solution are members.

These are the steps I have done to set it up:

  1. Created a new Team in Microsoft Teams. I’ve named my Team “Elven Hue Lights”
  2. Created a new SharePoint List/Microsoft List in that Team. You can either do this directly in the Team by adding Microsoft Lists to a Tab and selecting to create a new List from Blank, or open Lists from the Office 365 launcher. I created a List like this:
  3. Then, in addition to the Title column, add the following columns, single line of text, for storing “Whitelist Identifier” and “Device Type”:

This list will be used by the following Power Automate Flow, so that is the next step to set up.

Create the Flow for Hue Access Token and Config

Create the following Power Automate Flow, of type Instant and using PowerApps as trigger, and using the name “Hue – Get Access Token and Config”:

After the trigger action for PowerApps, add an HTTP action. This action will send a GET request to the Logic App we created in the previous part, “logicapp-hue-get-accesstoken”. So paste the Request Url for that Logic App in the URI field below:

After getting the Access Token from the Logic App, add an Initialize variable action to get the calling user's display name via the triggerOutputs headers and the x-ms-user-name value:

Next, add a Get items action from SharePoint; this will retrieve any items matching the user display name from the List we created for Hue Users. Specify the correct Site Address and List Name, and add a Filter Query where Title equals the User Display Name variable we got in the previous step:

Next, add a Condition action, where we will evaluate the returned statusCode from the Logic App. This will be either 204 (No Content) or 200 (OK), as we configured back in the Logic App:
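A sketch of that condition, assuming the HTTP action is named “Get Hue Access Token” as in the expressions listed later in this post:

equals(outputs('Get_Hue_Access_Token')['statusCode'], 204)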

If Yes, add a Response action, this will return a JSON response object back to the PowerApp, but in this case it will be empty:

A quick comment on the above Response body, which will be clearer later in the Flow. I’ve prepared a structured response that will possibly return not only the access_token, but also the name (Hue Bridge), and the Bridge IP address, API version etc. But for now, this is empty data.

Let’s move over to the No side of the Condition. Inside “If no”, add another Condition, and name it “If Username is Empty”. This condition should apply if the Get Item action from the List returns no matching user for the display name:

I’m using an expression empty(body('Get_My_Hue_User')?['value']) and if that is equal to the expression true.

Under “If yes”, meaning that the Get Item returns empty results, add another Response action like the following:

In the above case, as we don’t have a matching user configuration stored, we can return the JSON object with only the access_token.

Next, under “If no”, meaning that a matching user was found in the List, add an HTTP action. This action will call https://api.meethue.com/bridge/<whitelistidentifier>/config, getting the configuration of the Hue Bridge and using the access_token as a Bearer Token in the Authorization Header:

Note! Even though the Get items action, filtered on the user display name, will in reality return only one item (or none), the result is still returned as an array. So I’m using the “first” function to return the first item as a single item. To get the Whitelist Identifier, the expression to be used is:

first(body('Get_My_Hue_User')?['Value'])?['WhitelistIdentifier']

After this action, add another Response action, this time returning values for all config values in my custom JSON object:

Ok, quite a few custom expressions used above, so for your convenience I’ll list them here:

body('Get_Hue_Access_Token')?['access_token']
body('Get_Config_Existing')?['name']
body('Get_Config_Existing')?['ipaddress']
body('Get_Config_Existing')?['apiversion']
body('Get_Config_Existing')?['internetservices/internet']
body('Get_Config_Existing')?['internetservices/remoteaccess']
first(body('Get_My_Hue_User')?['Value'])?['WhitelistIdentifier']
first(body('Get_My_Hue_User')?['Value'])?['DeviceType']

That completes this Flow.

Test and Validate Flow

We can now validate the Flow using a simple test run. Save the Flow and click on the Test button and “I’ll perform the trigger action”. This should now complete successfully:

As expected, the username is returned as empty since we haven’t yet configured the user for the Hue Remote API, so the Flow returns the access_token only:

Summary and Next Steps

This Flow will be central later and used on every PowerApp launch to retrieve the access_token and configuration from the Hue Bridge. But first we need to build the Flow for linking User configuration and Whitelist Identifier. That will be in the next part!

Thanks for reading this far, see you in the next part of this blog series.