Category Archives: Microsoft Graph

Build your own Security Copilot using Azure Open AI and your data!

In this contribution I will show you how you can build your own Security Copilot by using Azure OpenAI, the AI Search service and your own security data sources, in a creative way that lets users ask about their security status in natural language!

This is part of my contribution to the Festive Tech Calendar 2023, and I’m proud to share my learnings again for this year, hope you will find it useful 🎅🏻🎄

Let’s start by looking at what capabilities will be available in Microsoft Security Copilot.

Microsoft Security Copilot

One of the main features of AI and Copilot solutions is processing natural language prompts, so I asked Bing Chat Enterprise to provide me with a summary of what Microsoft Security Copilot is, its use cases and data insights:

Microsoft Security Copilot is a generative AI-powered security solution designed to increase the efficiency and capabilities of defenders. It combines a specialized language model with security-specific capabilities from Microsoft. These capabilities incorporate a growing set of security-specific skills informed by Microsoft’s unique global threat intelligence and more than 65 trillion daily signals.

Security Copilot can be used in various scenarios such as:

The data insights provided by Security Copilot include:

In summary, Microsoft Security Copilot is a powerful tool that helps security teams defend their organizations more effectively and efficiently.

Bing Chat Enterprise
A security analyst dressed as a copilot in a Christmas-decorated office, following up on security operations in Microsoft Security Copilot

Microsoft Security Copilot is not generally available yet, and requires that your organization is part of the invitation-only Early Access Program for Security Copilot.

Security Copilot works with other Microsoft Security products—including but not limited to Microsoft Defender XDR, Microsoft Sentinel, Microsoft Intune, Microsoft Entra, Microsoft Purview, Microsoft Defender for Cloud, and Microsoft Defender for External Attack Surface Management.

Security Copilot uses the data and signals from these products to generate customized guidance based on user prompts, processed by an LLM (Large Language Model) and Azure OpenAI, grounded in your organization’s data via connected plugins.

Building Your Own Security Copilot

We can customize and build our own AI solution and Copilot, while waiting for access to the upcoming Microsoft Security Copilot, by following these high-level steps:

  • Create an Azure OpenAI instance in your Azure Subscription.
  • Bring your own data to OpenAI and AI Search Service.
  • Create a deployment and connect a web app, bot or any other client interface that can process prompts.

In this blog article I will show a couple of different options and guides for doing so yourself.

Prerequisites

To be able to build your own Copilot solution, you will need to have access to an Azure Subscription and create some Azure resources for Azure OpenAI and AI Search.
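If you prefer scripting the prerequisites, a minimal sketch using Az PowerShell could look like the following. The resource names, region and SKUs are placeholders, and your subscription must be enabled for Azure OpenAI:

# Requires the Az.CognitiveServices and Az.Search modules
# Resource names, region and SKUs below are illustrative placeholders
New-AzResourceGroup -Name "rg-security-copilot" -Location "eastus"

# Azure OpenAI account (a Cognitive Services account of kind OpenAI)
New-AzCognitiveServicesAccount -ResourceGroupName "rg-security-copilot" `
    -Name "oai-security-copilot" -Type "OpenAI" -SkuName "S0" -Location "eastus"

# Azure AI Search service (formerly Cognitive Search) for indexing your own data
New-AzSearchService -ResourceGroupName "rg-security-copilot" `
    -Name "srch-security-copilot" -Sku "Basic" -Location "eastus"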

When this is set up, and a deployment has been created in OpenAI Studio for, for example, the gpt-35-turbo or gpt-4 models, you are ready to add your own data.

For reference documentation for the scenarios below, you can check this handy documentation: Using your data with Azure OpenAI Service – Azure OpenAI | Microsoft Learn

Scenario A: Create your own Microsoft Sentinel Cyber Security AI assistant

This solution is inspired by Jeroen Niesen’s vlog about how to add alerts from Microsoft Sentinel to a Storage Account as markdown documents, and add that storage account to Azure OpenAI and the Search service to index the security alerts. From there you can ask questions like “are there any new incidents” and follow up with details. This short video shows how to set it up:

I’ve used that demo recently in a couple of presentations, where I have added a Power Platform app with a custom connector that queries the OpenAI instance via REST API, basically building on the same scenario as Jeroen over here.

Scenario B: Add Security Information from Microsoft Graph to Azure OpenAI

In this scenario I will explore how you can add security information from Microsoft Graph to Azure OpenAI. There is a lot of security information you can retrieve using the Microsoft Graph API, so I will scope this scenario to getting reports about users’ authentication methods. The goal is that we can use our own Copilot to get insights by using prompts like:

  • “How many users are registered for MFA?”
  • “What is the most used authentication method used for MFA?”
  • “How many users have at least two methods registered?”
  • “How many users are capable of passwordless?”
  • .. and so on..

To be able to answer these questions using Azure OpenAI, I need to find a way to add the report data from Microsoft Graph as my own data; in the prerequisites section above I linked to a Learn article that details how to use your own data. Note that there is a list of supported file types, which currently includes .txt, .md (markdown), .html, as well as Word documents, PowerPoint presentations and PDF documents.

I will start by querying Microsoft Graph API and get reports of Authentication Methods registered for my users, and then export that data into markdown files that I will place on a Storage Container that will be indexed by Azure AI Search service.

Let’s get to work 💪🏻!

Create and Configure a Logic App for getting security information

I will use a Logic App for querying Microsoft Graph for the authentication methods reports, and place this info on a storage account blob container. Follow these steps:

  1. Create a Logic App in your Azure subscription with a http request trigger and a http response.
  2. Enable a System Assigned Managed Identity for the Logic App.
  3. If you don’t already have a suitable Storage Account for the reports, create a Storage Account and a blob container for the markdown reports to be placed in.
  4. You will now need to add role assignments so that the Logic App can access the storage account container:
    • Add “Reader and Data Access” role to the Logic App system assigned managed identity.
    • Add “Storage Blob Data Contributor” role to the Logic App system assigned managed identity.
  5. Add an action in the Logic App for Initialize Variable, of type string and for initializing a markdown file with some generic headings for now.
  6. Add another action for Create Blob (V2), where you use the managed identity to connect to the Storage Account and container, and place the markdown file initialized by the previous variable.
  7. Your Logic App can now look similar to this, make sure to test and verify a successful run:

Send Requests to Microsoft Graph from Logic App

Next, we will need to prepare the Logic App to send requests to Microsoft Graph to get Authentication Methods Reports. In my scenario I want to start by querying these resources:
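Judging from the userRegistrationFeatureCounts and userRegistrationMethodCounts arrays referenced later in this post, the two reports in question are most likely the following (use the beta endpoint if your tenant does not expose them under v1.0):

GET https://graph.microsoft.com/v1.0/reports/authenticationMethods/usersRegisteredByFeature

GET https://graph.microsoft.com/v1.0/reports/authenticationMethods/usersRegisteredByMethod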

I can verify these queries in Graph Explorer (https://aka.ms/ge). Note that you need to consent to AuditLog.Read.All to be able to run this, and you also need to be a member of one of the following roles:

  • Reports Reader
  • Security Reader
  • Security Administrator
  • Global Reader
  • Global Administrator

Next, as I want to run these requests using my Logic App, I will need to add the application permission “AuditLog.Read.All” to the system-assigned managed identity for the Logic App. Use this guide https://gotoguy.blog/2022/03/15/add-graph-application-permissions-to-managed-identity-using-graph-explorer/ for adding the Graph Permission.

In the Logic App, add two HTTP actions after the trigger, like this:

Then configure the respective HTTP actions to run queries against Microsoft Graph using the managed identity, like this:

Also add a Parse JSON action after each HTTP request, using a sample schema from the response you got when you tested the queries in Graph Explorer. This will make it easier to use the values in our report later. When testing now, you should get something like this before you proceed to the next section:
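For reference, here is a trimmed illustration of the response shape you can expect from the feature report, which the Parse JSON schema is generated from (the counts are made up):

{
  "totalUserCount": 100,
  "userRegistrationFeatureCounts": [
    { "feature": "mfaCapable", "userCount": 87 },
    { "feature": "passwordlessCapable", "userCount": 12 }
  ]
}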

Start building the Markdown Report for Authentication Methods

We now have an output from Microsoft Graph, which we can use to populate the markdown report that will be placed in the Storage Account for later consumption by OpenAI.

There are several ways you can do this; I will focus on keywords and values to be presented in the report. As you might have seen, the response from Microsoft Graph for the authentication methods reports is a combination of a parent object and an array of either “userRegistrationFeatureCounts” or “userRegistrationMethodCounts”, so I’ll include several Filter Array actions to get the user counts I want, for example like the following:

I repeat that for every user count value I want to include in my report.

In the variable action I use for initializing the markdown report, I can now refer to these userCount values. PS! As the Filter Array action returns an array, even with a single instance, you need to use a function like first() or last() to get the value, for example: first(outputs('Filter_mfaCapable')['body'])?['userCount']

So my report definition now looks like this; I have also added a timestamp and the total number of users from the feature report:
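The exact layout of the markdown report is up to you; a minimal sketch along these lines (headings and placeholders are illustrative) gives the search index something to ground answers on:

# Authentication Methods Report
Generated: <timestamp>

Total users: <totalUserCount>
Users capable of MFA: <mfaCapable userCount>
Users capable of passwordless sign-in: <passwordlessCapable userCount>
Users registered for SSPR: <ssprRegistered userCount>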

If I run the Logic App again, I can verify that it gets the report values and creates the report as I wanted. Here is a section of the report:

This report, with values, has now been placed in the Storage Account blob container, and we can continue into Azure OpenAI to add this as our custom data!

Add your own data to Azure OpenAI

This section requires that you have access to Azure OpenAI and have deployed an instance. Then you can go to OpenAI Studio, https://oai.azure.com/, and follow the steps from there.

In my environment from below, I can go to the Chat playground, and under assistant setup go to the add your data tab:

From the different options of data sources, I select Blob storage, and navigate to my subscription, storage account resource and the storage container where I placed the security report. I also need to select (or create, if I don’t have one yet) an Azure AI Search resource; this was previously known as Cognitive Search. Enter an index name of choice and select a schedule:

I select keyword search in my scenario:

I confirm and complete the wizard, and we can now wait for the processing and indexing of my data:

Finally I will add a system message that will help the assistant in answering prompts from users:

Our assistant in Azure OpenAI is now ready to answer our questions. Let’s do some testing in the playground chat:

As we can see, the assistant is now capable of answering prompts about the report data. Note that I cannot ask about individual users’ methods, as naturally I haven’t included that in the report data. But I plan to add that in a follow up article to this blog post, so stay tuned.

Share your Security Copilot with users in the Organization

You can share this with your users directly from OpenAI Studio, by either deploying to a Web App or as a Power Virtual Agent bot. Different requirements and prerequisites apply to each scenario:

For my demo I published to a web app as an Azure App Service, which is automatically configured with Entra ID authentication for users in your organization. Here is a demo screenshot of how the web app looks:

If I want to use my own application platform, for example a PowerApp like I showed earlier in this post, I can use the details from the code sample in the Chat Playground, and integrate as I like:
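As an illustration, a request against the chat completions endpoint with your data attached could look something like the sketch below. The endpoint, deployment name, API version and keys are placeholders, so check the code sample in the Chat playground for the exact values to use:

# All names, keys and the API version below are illustrative placeholders
$openAiEndpoint = "https://<your-openai-resource>.openai.azure.com"
$deployment     = "gpt-35-turbo"
$apiVersion     = "2023-08-01-preview"

$body = @{
    dataSources = @(
        @{
            type       = "AzureCognitiveSearch"
            parameters = @{
                endpoint  = "https://<your-search-resource>.search.windows.net"
                key       = "<your-search-key>"
                indexName = "<your-index-name>"
            }
        }
    )
    messages = @(
        @{ role = "system"; content = "You are a security assistant answering questions about the authentication methods report." }
        @{ role = "user";   content = "How many users are registered for MFA?" }
    )
} | ConvertTo-Json -Depth 10

Invoke-RestMethod -Method Post `
    -Uri "$openAiEndpoint/openai/deployments/$deployment/extensions/chat/completions?api-version=$apiVersion" `
    -Headers @{ "api-key" = "<your-openai-key>" } `
    -ContentType "application/json" -Body $body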

I’ll leave the rest of the exploring and playing to you!

Summary & Next Steps

Let there be no doubt: the upcoming Microsoft Security Copilot will be the most comprehensive AI-based security assistant to date, but not everyone will have access to, or can afford the pricing for, the official Microsoft solution.

The point of this blog post is that you can use basically the same data sources, utilize Azure OpenAI, and build your own custom scenarios. From there you can explore many different ways to bring a copilot experience to your users.

Thanks for reading, happy AI’ing! 🤖

Connect to Microsoft Graph in Azure DevOps Pipelines using Workload Identity Federation

Microsoft recently announced that Workload Identity Federation for Azure Pipelines is now in Public Preview: https://devblogs.microsoft.com/devops/public-preview-of-workload-identity-federation-for-azure-pipelines/.

This opens up a lot of scenarios for Azure service connections, without the need to manage secrets for service principals, and improves security as there are no secrets that can be exposed or exfiltrated.

As I work a lot with Microsoft Graph and automation, I wanted to see if and how I could use Workload Identity Federation to connect to and send queries to Microsoft Graph using Azure Pipelines.

Create the Workload Identity Federation Service Connection

First of all, I need to create a service connection in my Azure DevOps project that will use the new Workload Identity Federation. To be able to do this, you need to have access to the preview functionality, see details here in this learn article: Create an Azure Resource Manager service connection using workload identity federation.

When you have access to the feature, you can create a new Workload Identity federation either by manual or automatic configuration:

I will now choose the Azure Subscription, and optionally a Resource Group. Choosing a resource group is a good idea, as the service connection will be given Contributor access only to that Resource Group, and not the whole subscription. But it also depends on what you want to use your Service Connection for; in my case it is a demo scenario for Microsoft Graph access, so it makes sense to scope the permissions down:

After creating the Service Connection, I can find it in my Entra ID tenant. Let’s look at the role assignments for the Resource Group first:

The service principal has been given the name of <DevOps Org>-<DevOps Project>-<guid>, and has been assigned Contributor access to that resource group.

Next, let’s find the App Registration for the Service Connection. As you can see below, no (0) secret or certificate credentials have been created, but a federated credential has:

If we look at the details of the federated credential, we can see the issuer, subject and audience, and confirm that this service principal can only be accessed by the service connection in Azure DevOps:
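For reference, the federated credential values for an Azure DevOps service connection typically follow this pattern (organization ID, organization, project and service connection name are placeholders):

Issuer: https://vstoken.dev.azure.com/<organization-id>
Subject: sc://<organization>/<project>/<service-connection-name>
Audience: api://AzureADTokenExchange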

Next, go to API permissions. Here I will add a Microsoft Graph permission, so that we can use that for queries in the pipeline later. In my case I add the Application permission User.Read.All, so I can look up user information:

We are now ready to set up an Azure Pipeline to use this service connection.

Create the Azure Pipeline to access Microsoft Graph API

In your DevOps project, if this is a new project, make sure that you initialize the Repository, and that you have at least a Basic or Visual Studio access level, then head to Pipelines and create a “New Pipeline”. For my environment I will just choose the following steps:

  1. Select Azure Repos Git (YAML)
  2. Select my repository
  3. Use a starter pipeline (or you can choose an existing if you have)

This is sample YAML code that uses the service connection (see picture below) to get an access token for Microsoft Graph, and then uses that access token to connect with the Graph PowerShell SDK. In my example I’m just showing how to get some simple user information:

There are different ways you can go about this; in my case I am just using Azure CLI in one task to get an access token for the Graph resource type. (You can also use the Az PowerShell task for this, by the way.) I also set and secure the variable for use in later steps in the pipeline job.

In the next task I use PowerShell Core to convert the token to a secure string, and then install the required Microsoft Graph PowerShell modules. I can then connect to Graph and get user information. Here is the complete YAML code:

# Pipeline for accessing Microsoft Graph using Federated Workload Identity Credential
# Created by Jan Vidar Elven, Evidi, 15.09.2023

trigger:
- none

pool:
  vmImage: windows-latest

steps:
- task: AzureCLI@2
  displayName: 'Get Graph Token for Workload Federated Credential'
  inputs:
    azureSubscription: 'wi-fed-sconn-ado-to-msgraph'
    scriptType: 'pscore'
    scriptLocation: 'inlineScript'
    inlineScript: |
      $token = az account get-access-token --resource-type ms-graph
      $accessToken = ($token | ConvertFrom-Json).accessToken
      Write-Host "##vso[task.setvariable variable=secretToken;issecret=true]$accessToken"
- task: PowerShell@2
  displayName: 'Connect to Graph PowerShell with Token'
  inputs:
    targetType: 'inline'
    script: |
      # Convert the secure variable to a secure string
      $secureToken = ConvertTo-SecureString -String $(secretToken) -AsPlainText

      # Install Microsoft Graph Modules required
      Install-Module Microsoft.Graph.Authentication -Force
      Install-Module Microsoft.Graph.Users -Force

      # Connect to MS Graph
      Connect-MgGraph -AccessToken $secureToken

      # Get User Info
      Get-MgUser -UserId "[email protected]"
      
    pwsh: true

I can now try to run the pipeline. At first run you will have to validate and permit access to the service connection from the pipeline:

And then I can verify that it indeed can connect to the Graph via PowerShell SDK and get my resources via the Workload Identity Federation service connection:

Summary and Usage Scenarios

Most will use the new Workload Identity Federation for Azure Pipelines to access Azure subscriptions and resources, but I have shown that as long as this uses the Entra ID authentication platform and OIDC, it is possible to get access tokens for other APIs as well, in this case the Microsoft Graph API.

You can use the Graph API to get information about your tenant, to enrich and complement your existing CI/CD pipelines, or in some cases automate consistent deployments of Graph resources as well, for example important settings and policies.

Connect Power Platform to Azure AD Protected APIs using built-in HTTP connectors

There are several ways you can access Azure AD protected APIs in Power Platform flows and apps. Without creating Custom Connectors, which can connect to basically any available REST-based API, it is useful to know which built-in HTTP connectors are available and can be used for delegated authentication to Azure AD protected APIs like Microsoft Graph or other APIs.

This and more will be answered and demoed in this blog post.

What is an Azure AD Protected API?

First, let’s quickly explain what an Azure AD protected API is.

The best known Microsoft API is the Microsoft Graph API. Another well-known API is the Azure REST API. These APIs are protected by, and accessed with, identities and applications from Azure Active Directory organizations.

Beyond that, any API can be protected by Azure AD using industry standard authentication and authorization protocols like OpenID Connect (OIDC) and OAuth 2.0. This includes APIs you build yourself or APIs from third-party services.

Delegated vs. Application Access

When you want to access Azure AD protected APIs from Power Platform, you also need to know which access scenario you need. There are two main access scenarios, delegated access and application-only access, and as shown in the picture below, the distinct difference is that the delegated access scenario always includes the user.

This means that if you are running a Power Automate flow, or a PowerApp, you will usually run that interactively as your user account, and any connections you use will be running via the app under your user context and permissions to access the data resources the app might be using. The data resource in this example can be represented by an Azure AD Protected API.

If you share the PowerApp or Power Automate flow with other users in your organization, then those users will use the connections to the API resource under their own user context.

If, on the other hand, you want to run a Power Automate flow on a schedule or in any other way without user interaction or a user context, then you will typically use the app-only access scenario.

In this blog post I will focus on delegated access scenarios, where you can use built-in HTTP connectors whose connections run in the context of the signed-in user.

Built-in HTTP Connectors in Power Platform

Let’s start with an overview of the built-in HTTP connectors and actions for Azure AD protected APIs available in Power Platform. These are HTTP connector actions you can use in Power Automate, PowerApps and in most cases also Logic Apps; the following list is current as of December 2022:

  • “Send an HTTP request (preview)”, Office 365 Outlook connector.
  • “Send an HTTP request (preview)”, Office 365 Users connector.
  • “Send an HTTP request”, Office 365 Groups connector.
  • “Send an HTTP request V2 (preview)”, Office 365 Groups connector.
  • “Send an HTTP request (preview)”, Office 365 Groups Mail connector.
  • “Send an HTTP request to SharePoint”, SharePoint connector.
  • “Send an HTTP request (preview)”, LMS365 connector.

These are all Standard connectors that you can freely use if you are using the Power Apps/Automate plan for Microsoft 365.

There is also an interesting HTTP action with Azure AD that is Premium and has the following action:

  • “Invoke an HTTP request”, HTTP with Azure AD connector.

Premium means you need to acquire a standalone Power Apps and Power Automate licensing plan.

There is also a “Send an HTTP request to Azure DevOps” in the Azure DevOps connector that is also Premium.

While these connectors are for querying resources within the scope of the data accessible via the connector, the most important part is that they all depend on the user connection and use the delegated access scenario.

This makes them especially useful for calling Microsoft Graph API in the context of your own user.

If you want to send HTTP requests in an app-only access scenario, then you can use the built-in HTTP connector, which is Premium. But that is not the scope of this blog post, so let’s continue looking into some scenarios and examples for the above delegated access connectors.

Send HTTP Request via Office 365 connectors

Let’s do a couple of scenarios with the Office 365 Outlook/Users/Groups connectors from above and their HTTP request actions. I have created an instant cloud flow with a manual trigger for now.

First I will initialize a variable for the base URL; this will be the Microsoft Graph API:

Then I add each of the HTTP actions from the Office 365 connectors, using the baseUrl and the “me” resource to start with. In the below image I’ve renamed the HTTP actions so that you can see from which connector they are from, and I’m running a simple GET method:

I’ve also configured the run after setting for each of the actions, so that I can verify the results of the others if anyone fails:

When I save and test the flow I can verify that I’ve signed in to the connectors as my own user, and any permissions that these have:

When I try to run this Flow, it fails on several of the actions, which is expected:

The reason it fails is that there are restrictions on the resources and objects the actions are allowed to query. For example, for the first one we get info that this connector can only use the “me” or “users” resource, and only for the listed objects like messages, calendar etc.:

One of the actions is successful however, and that is from the Office 365 Groups connector and the first version of the Send an HTTP request, which is allowed to get the /me resource:

Let’s make some adjustments to the queries for the different actions:

In the above image I’ve added some supported objects for the different actions, and all these should return a valid Graph response:

So this means that as long as you use one of the following:

  • /me/{object}
  • /users/{userid-or-userprincipalname}/{object}
  • /groups or /groups/{groupid}/{object}

with an object from the supported list (messages, events, calendar, etc.), you can run Microsoft Graph API queries including GET, POST, PUT, PATCH and DELETE.
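As an illustration, relative queries along these lines (appended to the baseUrl variable) should be within the supported objects, though the exact objects allowed vary per connector action:

  • /me/messages?$top=5
  • /users/{userid-or-userprincipalname}/events
  • /groups/{groupid}/members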

From earlier we saw that one of the connector actions had broader support than the others, and that was the Office 365 Groups connector and the original version of “Send an HTTP request”. A V2 version of the same action has since been released, and that version only allows queries against the /groups resource.

Let’s do a quick test to see if the first action supports getting the user’s registered authentication methods. If we run that same query using Graph Explorer, we will get a list of the user’s authentication methods, including Authenticator, Phone, Windows Hello etc. In the action I will type /me/authentication/methods, but when I run it, it fails with request authorization failed:

This means that even though I have permission as myself to query my authentication methods (see Graph Explorer for example), the connector does not have the correct delegated permissions to act on my behalf. From Graph Explorer I can verify that I would need to consent to permissions for authentication methods:

So to summarize the Office 365 connectors and HTTP actions, they can be valuable for many Graph API requests for your users, but only for permitted objects and permissions.

This is where the “Invoke an HTTP request”, HTTP with Azure AD connector, can be useful, and we will look into that next.

Send Request via HTTP with Azure AD Connector

First of all, this is a Premium connector, so you must make sure you are licensed with a separate Power Apps / Power Automate plan for this, but a trial should also be available for exploring the connector.

According to the documentation, the HTTP with Azure AD connector can be used to fetch resources from various web services that are authenticated by Azure AD. It can also be used to query an on-premises web service.

Note that there are known issues and limitations: if you get “Forbidden”, “Authorization Request Denied” or “Insufficient privileges to complete the operation”, it could be because this connector has a limited set of scopes.

The big advantage of this connector is that you can use it for several of Microsoft APIs, not just the Graph API.

Let’s try it out. I will create another instant cloud flow with a manual trigger, and then add the HTTP with Azure AD connector and the Invoke an HTTP request action:

If this is the first time you have added that connector action and you don’t have any existing connections, then you need to configure that and sign in first. If we want to use the Microsoft Graph API, then you need to fill in this and then sign in:

Then, we can start with a simple Graph request for getting my profile info:

When we run that, it should successfully return the user profile:

Ok, let’s try another request again, and see if this connector lets us query for authentication methods:

So here we can see that the limitations in scope also apply to this connector, as we get an access denied and request authorization failed.

But this connector action supports more scenarios than the Office 365 connectors I described in the previous section. For example, I can get my group memberships; note the use of the $count parameter and that it requires ConsistencyLevel=eventual in the request header:
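A request like the following illustrates this (the ConsistencyLevel header is required when using the $count parameter):

GET https://graph.microsoft.com/v1.0/me/memberOf?$count=true
ConsistencyLevel: eventual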

There isn’t really any good documentation on the HTTP with Azure AD connector saying which resources and objects you can send requests to, but you can assume you can do a lot of what a normal member of the directory can do, though not necessarily operations that require Graph permission consent beyond reading user information.

Note that you can use the HTTP with Azure AD connector with other APIs than Microsoft Graph; this is just a list of examples:

  • management.azure.com
  • vault.azure.net
  • <tenant>.sharepoint.com
  • api.loganalytics.io

I thought I should wrap up the blog post by looking at how the HTTP with Azure AD connector can be used for invoking requests to APIs that you have built yourself, for example Azure Functions or Logic Apps.

Invoke Requests for your Serverless APIs with HTTP for Azure AD

Protected Logic Apps

I have previously written a blog post series on how you can protect Logic Apps with Azure AD: https://gotoguy.blog/2020/12/31/protect-logic-apps-with-azure-ad-oauth-part-1-management-access/

Let’s use that knowledge to see if the HTTP with Azure AD connector can invoke requests to a Logic App that is protected with an Azure AD authorization policy.

I have created a new simple Logic App with an HTTP request trigger:

I’ve added a response action, and made sure that the response is returning JSON content like this:

Then, I have added an Azure AD Authorization Policy like the following, requiring that the issuer claim in the authorization header is from my tenant:
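As an illustration, such an authorization policy contains a claim check along these lines, where the tenant ID is a placeholder for your own:

Claim: Issuer
Value: https://sts.windows.net/<your-tenant-id>/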

I can now go to my Power Automate flow and add an HTTP with Azure AD connector action, but I need to set up a new connection like the following. Note that the base resource URL will be the Logic App instance, and the Azure AD Resource URI will be one of the well-known Azure APIs like below. (I haven’t been able to use graph.microsoft.com here, because I suspect Graph API tokens cannot be validated outside Microsoft Graph itself.)

When I have signed in and set that connection active, I can configure the Invoke an HTTP request action like the following. Note that I have left out the parameters that belong to the SAS (shared access signature) scheme, like sig, sv and sp, as you can only use one of SAS or OAuth2:

When I run this I get a successful response:

This means that the HTTP with Azure AD connector was now able to invoke a request for a protected API that I have built myself. Of course this was a simple request, but if you look at the aforementioned blog post series on protecting Logic Apps, you can see how you can include the Authorization header in the Logic App trigger, and then get the claims and do authorization logic inside the Logic App.

In this example I didn’t include my own API definition, so let’s now get even more advanced and look at an Azure Functions API example!

Protected Azure Functions with Custom API

Last year for Festive Tech Calendar in 2021, I created this solution for my contribution: https://gotoguy.blog/2021/12/22/creating-an-azure-ad-protected-api-in-azure-in-a-school-hour/

I will now use this as an example on how I can invoke requests for this API using Power Automate and the HTTP for Azure AD connector!

In another instant cloud flow, I’ve added the Invoke an HTTP request action from HTTP with Azure AD, and I need to create another connection, which will be configured as follows:

The base URL is the Function App URL, and for Resource URI I’ve used the Application ID URI I created when I defined the custom API (see the blog post for details).

However, when I sign in I will first get an error like the following:

Note the client app ID I’ve marked with a yellow shape above; this is a well-known global confidential application ID for Power Platform, so this will be the same ID for you as it is for me. This is also described in this article: https://learn.microsoft.com/en-us/power-query/connectorauthentication.

This means that I need to go to my custom API app registration and add this ID as an authorized client application, specifying which scopes it is authorized to use:

After adding that I can now sign in to create the connection, and next configure the invoke HTTP request as follows, using one of the API methods I’ve defined (again see my blog post from last year for details :):

When I run the Flow I see that the request is successful, and I indeed get a response from the Protected Functions API:

Summary

In this blog post I have shown how you can use and connect Power Platform to Azure AD Protected APIs using built-in HTTP connectors that use the delegated access scenario.

While the Office 365 connectors are great for limited scenarios around Microsoft Graph, we could verify that the HTTP with Azure AD connector (Premium) has more usage scenarios, including the ability to invoke requests against both Microsoft APIs and your own APIs.

Hope this has been useful, thanks for reading!

Speaking at Oslo Power Platform & Beyond!

I’m excited and very much looking forward to speaking at the upcoming Oslo Power Platform & Beyond community event, which will happen in person on May 21st 2022 at the Microsoft Norway offices in Oslo.

Oslo Power Platform & Beyond is a community event hosted by the Dynamics User Group Norway, and will this upcoming Saturday feature 21 sessions delivered by 23 international speakers and rockstars, MVPs and community leaders!

My session will be about how you can Connect Power Platform to any Azure AD protected API using OAuth2 and Custom Connectors. While there are hundreds of built-in connectors you can use in your Power Automate Flows or Power Apps, there are many scenarios where you would want to access API’s like Microsoft Graph, or any other API that is protected by Azure AD. In this session I will show how you can access this using Custom Connectors and OAuth2, and my demo will show a self-built API using Azure Serverless solutions like Azure Functions and Logic Apps!

Session details:

The event starts in a few days, but there is still time to register for FREE:

For a full list of session program and speakers, see here: https://oslo-power-platform-and-beyond.sessionize.com/

Hope to see you there!

Add Graph Application Permissions to Managed Identity using Graph Explorer

I use Managed Identities in Azure for a lot of different automation scenarios, for example if I want to run a Logic App or an Azure Function that should securely call an API like Microsoft Graph.

In such a scenario, the Managed Identity, represented by its Service Principal, needs to be granted application permissions to the API. Let’s say, for example, that you want to list Intune managed devices in your organization using the Microsoft Graph API from a Function App or Logic App, connecting to the Graph API using a Managed Identity.

Then you would need to give that Managed Identity an App Role assignment for the application permission in Graph that is called DeviceManagementManagedDevices.Read.All. If you use that Managed Identity for example in a Function App or Logic App, they could then call the Microsoft Graph API as illustrated below:

Currently there is no way to manage these application role assignments in the Azure Portal GUI. You can verify the permissions, but not add or remove them.

For reference, usually you would do this using cmdlets in Azure AD PowerShell:

See Gist from my GitHub on how to create App Role Assignment for Managed Identity using PowerShell
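For reference, a minimal sketch along the same lines using the AzureAD PowerShell module, reusing the Intune permission example from above (the object ID is a placeholder you must replace):

# Requires the AzureAD module; run Connect-AzureAD first
# Object ID of the Managed Identity's service principal (placeholder)
$managedIdentityObjectId = "<managed-identity-service-principal-object-id>"

# Find the Microsoft Graph service principal in the tenant
$graphSp = Get-AzureADServicePrincipal -Filter "appId eq '00000003-0000-0000-c000-000000000000'"

# Find the app role (application permission) to assign
$appRole = $graphSp.AppRoles | Where-Object { $_.Value -eq "DeviceManagementManagedDevices.Read.All" }

# Create the app role assignment for the Managed Identity
New-AzureADServiceAppRoleAssignment -ObjectId $managedIdentityObjectId `
    -PrincipalId $managedIdentityObjectId -ResourceId $graphSp.ObjectId -Id $appRole.Id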

Today I would like to show how you can do this with a “GUI” after all, by using the Microsoft Graph Explorer!

Prerequisite – Sign in and Connect to Graph Explorer

Many of you might be familiar with Graph Explorer, but if you aren’t you can find it at the Microsoft Graph documentation site, or just use this short url: https://aka.ms/ge.

In Graph Explorer you need to sign in (and consent to permissions) so that you can access your organization’s data via the Graph API. Note that your organization might have restrictions in place for users consenting to permissions for APIs, and in any case, if you want to use my example here for adding Microsoft Graph API application permissions, you will need to be a Global Administrator anyway.

Part 1 – Find the Service Principal of the Managed Identity

The first thing we need to do is to find the Service Principal that represents your Managed Identity. I will assume that you already have a Managed Identity created and are familiar with the concept; if not, you can read more about it and how to create one at this link: https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/overview.

I will now search for the Service Principal in Graph Explorer. You can do this by running something similar to this query:

GET https://graph.microsoft.com/v1.0/servicePrincipals?$search="displayName:msi"&$count=true

PS! You must add ConsistencyLevel = eventual in the Request Header to be able to use the search and count parameters.

In my example I’m searching for a User Assigned Managed Identity that I know I have prefixed with the name “msi”, but you can also search for System Assigned Managed Identities, these will have the name of the resource you have assigned it to (name of the Function App, Logic App, etc).

When I run this example query currently in my tenant, I get a count of 5 service principals, one of which is the Managed Identity I’m looking for. The thing of interest for me here is the Id of the Service Principal, in the green box below. I have also a yellow box around the name of the Managed Identity, which you can see is a User Assigned Identity created in a Resource Group where I connect it to Logic Apps.

Take a note of that Id, you will need it later. You can now also get the Service Principal directly by Id:

GET https://graph.microsoft.com/v1.0/servicePrincipals/{your-managed-identity-service-principal-id}

Part 2 – Find the Service Principal of the Microsoft Graph (or other) API

Next, we also need to find the service principal in your organization that represents your instance of the multi-tenant app that is “Microsoft Graph”.

Microsoft Graph API always has the appId that is: 00000003-0000-0000-c000-000000000000

In every Azure AD organization the Microsoft Graph API application is represented by a Service Principal; you can find this with the following query:

GET https://graph.microsoft.com/v1.0/servicePrincipals?$filter=appId eq '00000003-0000-0000-c000-000000000000'

You will now need to note the “id” of that Service Principal, that will be the “resourceId” to be used later, and this resource “id” is different for every Azure AD organization/tenant:

Part 3 – Find the Application Role that will be the Permission you want to assign

Now that we have the Service Principal Id of the Managed Identity, and the Service Principal Id of the Microsoft Graph Resource in your Azure AD Organization, we need to find the Id of the actual application role permission you want to assign.

You can now use the Service Principal Id you retrieved in part 2 to list all available app roles by appending /appRoles to the query:

GET https://graph.microsoft.com/v1.0/servicePrincipals/{your-graph-serviceprincipal-id}/appRoles/

Now, there are a lot of application role permissions for Microsoft Graph, so we need to do a search. Unfortunately, not all Graph resources support all OData filter queries, and not everything is documented, but as far as I can see, I cannot use $search or $filter inside a specific service principal resource. For example, I would want to do something similar to this query:

GET https://graph.microsoft.com/v1.0/servicePrincipals/{your-graph-serviceprincipal-id}/appRoles?$search="displayName:Intune"

But this will return an error like this:

This might be fixed at a later time, but for now we can just use the browser’s search function (CTRL+F), so when I query for all /appRoles, I will get them all listed, and then just search for the application permission I want:

Not so elegant, I know, but at least I will get that specific application role id I need for the next step.

(PS! The Graph API will usually return a maximum of 100 values, and in this case there are fewer than that. If there were more than 100 results, the response would be paged with an @odata.nextLink for the next page of results.)

Anyway, I now have the 3 parts I need to create the application role assignment for the Managed Identity.

Part 4 – Assign Application Role to Managed Identity

We can now assign the application role to the service principal, and as documented here, we will need the following 3 IDs:

  • principalId: The id of the user, group or client servicePrincipal to which you are assigning the app role. This will be the id of the Managed Identity service principal we found in part 1.
  • resourceId: The id of the resource servicePrincipal which has defined the app role. This will be the id of the Microsoft Graph service principal we found in part 2.
  • appRoleId: The id of the appRole (defined on the resource service principal) to assign to a user, group, or service principal. This is the app role id we found by searching the appRoles of the resource in part 3.

To create this assignment we need to do a POST query in Graph Explorer, with Content-Type application/json in the header, and the following request body:

POST https://graph.microsoft.com/v1.0/servicePrincipals/{your-graph-serviceprincipal-id}/appRoleAssignedTo

Content-Type: application/json

{
  "principalId": "{your-managed-identity-service-principal-id}",
  "resourceId": "{your-graph-serviceprincipal-id}",
  "appRoleId": "{your-app-role-id}"
}

After you run this query, you should get a status of 201 Created and a response like the following:

You can now also verify this assignment in the Azure AD Portal. If you go to Enterprise Applications, and search for {your-managed-identity-service-principal-id}, you should find your Managed Identity. From there you can click on Permissions under Security, and you will see the application permissions that you have granted. PS! I had already added another for writes as well:

Part 5 – Managing Application Role Assignments

After adding application permissions for the Managed Identity, you can also use Graph Explorer for viewing current application role assignments, as well as removing existing role assignments.

To get App Role Assignments for the Service Principal that is your Managed Identity, use the following query:

GET https://graph.microsoft.com/v1.0/servicePrincipals/{your-managed-identity-service-principal-id}/appRoleAssignments

This will return all the application permissions assigned to this Managed Identity Service Principal:

And then if you want to delete an application role assignment, you need to run a DELETE query as following:

DELETE https://graph.microsoft.com/v1.0/servicePrincipals/{your-graph-serviceprincipal-id}/appRoleAssignedTo/{appRoleAssignment-id}

The {appRoleAssignment-id} would be the “id” from the GET /appRoleAssignments shown above. When run successfully, you should receive a status of 204 – No Content:

Summary

In this blog post I have shown how you can use Graph Explorer to add Graph API application role permissions to your Managed Identity. Similar steps can be used for any other Azure AD protected API that you want your Managed Identity to access.

Thanks for reading!

Speaking at Scottish Summit 2021!

I’m very much looking forward to speaking this Saturday 27th of February at the Virtual Scottish Summit 2021! This amazing Community conference will host 365 (!) virtual sessions from experts all over the world, ranging from first-time speakers to experienced community leaders.

Sessions will be delivered in a range of tracks covering Microsoft technologies and community topics, something to choose from for everyone.

My session will be about why it is so important to have Zero Trust admins with least privilege access, and why you should start using Azure AD PIM (Privileged Identity Management) today!

I learned about the stereotype of cheap Scots from reading about Scrooge McDuck. I’ve no idea if this is true or not, probably not ;), but the thing that is true is that you should be really cheap when handing out admin privileges.

Today, Microsoft 365 Global Administrators and Azure Subscription Owners are the new Domain/Enterprise Admins, and in many organizations too many users have these roles. In the session I will show how, by implementing just-in-time and just-enough-access (JIT/JEA) policies, we can reduce vulnerability and attack surface, and that the right tool for the job is Azure AD Privileged Identity Management (PIM).

I have been using Azure AD PIM for years, and in this session I will share my best practices and how to implement and use it the right way.

Session details:

There is still time to register, but time and available tickets are running out fast. Go and register your free ticket for an awesome day of free community content here:

Scottish Summit 2021 | Scottish Summit

Have a great conference, and I hope you visit my session if the topic is of interest to you!

Protect Logic Apps with Azure AD OAuth – Part 3 Connect to API from Power Platform

In this article I’m going to build on my previous blog posts in this series, where I have written about how to add Azure AD OAuth authentication and authorization to your Logic Apps and expose them as an API. For reference, the links to these blog posts are here:

If you want to connect to API’s using Power Platform (Power Automate Flows, PowerApps etc.), you can do this in two different ways:

  • Using HTTP action and send requests that use Azure AD OAuth authentication. This will use the “Client Credentials” OAuth flow, and is suitable for calling the API using application permissions and roles.
  • Setting up a Custom Connector for the API, and using the HTTP logic app trigger as operation. This will use the “Authorization Code” OAuth flow, and is suitable for using delegated permissions and scopes for the logged on user via connections.

So it depends on how you want your Power Platform users to be able to send requests to your Logic App API. Should they do this as themselves with their logged-on user, or should they use an application identity? There are use cases for both, so I will show both in this article.

Connect to Logic App API using Custom Connector

Using Custom Connectors is a great way to use your own identity for sending requests to an API. This way you can also securely share Custom Connectors, and the Flows/PowerApps using them, in your organization, without needing to share sensitive credentials like client IDs and client secrets.

If you want to create a Custom Connector in Power Platform that triggers an HTTP request to a Logic App, you can currently do this in one of the following ways:

  1. Creating Custom Connector using Azure services and Logic App.
  2. Exporting the Logic App to a Power Platform environment.
  3. Creating the Custom Connector using an OpenAPI swagger definition file/url.
  4. Creating the Custom Connector from blank.

Let’s take a quick look at each of these, but first we need to take care of some permissions in Azure for creating the Custom Connector automatically.

Azure Permissions for Logic Apps and listing swagger

There are some minimum permissions your user needs to be able to create a Custom Connector automatically by browsing the Azure Service.

A good starting point is using one of the built-in Azure roles for Logic Apps:

But even these do not have the necessary permissions; if you try, you will get an error similar to the following:

So we need to add the ../listSwagger/action permission for the scope. You could give the user full Contributor access, but that seems rather excessive, so let’s create a custom role instead. I will do this using Azure PowerShell; for reference please see the docs: Create or update Azure custom roles using Azure PowerShell – Azure RBAC | Microsoft Docs.

After connecting to Azure using an Azure account that can create custom roles for the scope (Owner or User Access Administrator), start by exporting an existing role as a starting point:

# 1. Export a JSON template as a reference based on an existing role
Get-AzRoleDefinition -Name "Logic App Operator" | ConvertTo-Json | Out-File .\LogicAppAPIOperator.json

Then edit this JSON file, by removing the Id parameter, defining a Name, setting the IsCustom to true, and Description to something descriptive like below. I have also set my Azure subscription id under assignable scopes, and added the required ../listSwagger/action:

{
  "Name": "Logic App API Operator",
  "IsCustom": true,
  "Description": "Lets you read, enable and disable logic app, and list swagger actions for API.",
  "Actions": [
    "Microsoft.Authorization/*/read",
    "Microsoft.Insights/alertRules/*/read",
    "Microsoft.Insights/metricAlerts/*/read",
    "Microsoft.Insights/diagnosticSettings/*/read",
    "Microsoft.Insights/metricDefinitions/*/read",
    "Microsoft.Logic/*/read",
    "Microsoft.Logic/workflows/disable/action",
    "Microsoft.Logic/workflows/enable/action",
    "Microsoft.Logic/workflows/validate/action",
    "Microsoft.Resources/deployments/operations/read",
    "Microsoft.Resources/subscriptions/operationresults/read",
    "Microsoft.Resources/subscriptions/resourceGroups/read",
    "Microsoft.Support/*",
    "Microsoft.Web/connectionGateways/*/read",
    "Microsoft.Web/connections/*/read",
    "Microsoft.Web/customApis/*/read",
    "Microsoft.Web/serverFarms/read",
    "Microsoft.Logic/workflows/listSwagger/action"
  ],
  "NotActions": [],
  "DataActions": [],
  "NotDataActions": [],
  "AssignableScopes": [
    "/subscriptions/<my azure sub id>"
  ]
}

After that you can create the custom role:

# 3. Create the new custom role:
New-AzRoleDefinition -InputFile .\LogicAppAPIOperator.json

This role can now be assigned to the Power Platform user(s) that need it, using the scope of your Logic Apps, for example the Resource Group. You can either add the role assignment to the user directly, or preferably using Azure PIM:

With the correct permissions now in place, you can proceed to the next step.

Creating Custom Connector using Azure services and Logic App

Log in to Power Apps or Power Automate using your Power Platform user. Under Data and Custom Connectors, select to create a New custom connector. From there select “Create from Azure Service” (in preview as of now):

Next, type a name for your Custom Connector, select the Azure subscription, and the Azure service, which in this case is Logic Apps. Note that you can create from Azure Functions and Azure API Management as well. Then select the Logic App name from the list:

If you don’t see your Logic App here, or you get errors, verify your permissions.

Click Continue, and you will see something like the following, where the Host and Base URL have automatically been set correctly for your Logic App HTTP trigger. If you want, you can upload an icon, background color and description as well:

Click Next to go to Security. Here we need to change the Authentication to OAuth 2.0, as this is what we have implemented for authorizing requests to the Logic App. To authenticate and get the correct Access Token, we will reuse the LogicApp Client app registration that we created in the previous blog post. Copy the Application Client ID and Tenant ID:

And then create and copy a new secret for using in the Power Platform Custom Connector:

Fill in the rest of the OAuth 2.0 details for your environment like below:

Note from above that we need to specify the correct resource (the backend API) and scope. This is all very well described in the previous blog post.

Click Create Connector to save the Connector details. Make sure that you copy the Redirect URL:

And add that to the App Registration redirect URLs:

Next, under the Custom Connector proceed to step 3. Definition. This is where the POST request trigger will be, and it should already be populated with an action:

We need to specify a value for Summary, in my case I will type “Get Managed Devices”:

Next, under Request Query, remove the “sp”, “sv” and “sig” parameters, as these will not be needed as long as we are using the OAuth2 authorization scheme:

The Body parameter should be correctly specified, expecting operatingSystem, osVersion and userUpn request body parameters:

Last, let’s check the Responses from the Logic App. They have been successfully configured with status 200 (OK) and 403 (Not Authorized), as these two responses were defined in the Logic App.

If the Response body is empty like below, we would need to import from sample output from the Logic App (response body should have automatically been configured if the response action in Logic App had a response schema defined):

For response status 200, the sample import is:

[
  {
    "deviceName": "",
    "manufacturerer": "",
    "model": "",
    "operatingSystem": "",
    "osVersion": "",
    "userDisplayName": "",
    "userPrincipalName": ""
  }
]

Giving this response definition:

The sample response from the 403 not authorized should be:

{
  "Message": "",
  "Roles Required": "",
  "Roles in Token": "",
  "Scopes Required": "",
  "Scopes in Token": ""
}

Giving this response definition:

NB! It’s important to have correct responses defined like above; it will make it easier to consume those responses later in Power Automate Flows and Power Apps.

Click on Update Connector, and then go to next section 4. Test.

We can now test the Logic App trigger via the Custom Connector; first we need to create a new connection:

After logging in, and if needed consenting to the permission scopes (see the previous blog post for details), we should have a connection. We can now test the trigger by supplying the api-version (2016-10-01) and specifying the operatingSystem, osVersion and userUpn parameters:

Click Test operation and verify a successful response like below:

Let’s try another test, this time leaving the userUpn blank (from the previous blog post this means that the Logic App tries to return all managed devices, if the user has the correct scope and/or roles). This time I get a 403 not authorized, which is expected as I don’t have the correct scope/role:

Checking the Logic App run history I can see that my Power Platform user triggered the Logic App and I can see the expected scp and roles claim:

Perfect so far! At the end of this blog post I will show how we can get this response data into a Power App via a Flow and the Custom Connector, but first let’s look into the other ways of creating a Custom Connector.

Exporting the Logic App to a Power Platform environment

In the previous example I created the Custom Connector from my Power Platform environment; in this example I will do an export from the Logic App. The user I do this with needs to be a Power Platform licensed user and have access to the environments, or else I will get this:

To Export, click this button:

Then fill in the name of the Custom Connector to create, I will call this ..v2, and select the environment:

You might get another permission error:

If so, we need to update the Custom Role created earlier with this permission. Do the following:

# 3b. Update the custom role
$roleLogicAppAPIOperator = Get-AzRoleDefinition -Name "Logic App API Operator"
$roleLogicAppAPIOperator.Actions.Add("Microsoft.Logic/workflows/triggers/listCallbackUrl/action")
Set-AzRoleDefinition -Role $roleLogicAppAPIOperator

The role and the assignment should now be updated, so we can try again. You might need to refresh or log out and in again for the permission to be updated. After this the Export should be successful:

We can now find the Custom Connector right below the first one we created:

We still need to edit the Custom Connector with the authentication details, adding the app/client ID, secret etc. The export has left out the SAS query parameters (sv, sp, sig), but also the required api-version. This must be fixed; the easiest way is to switch to the Swagger Editor and add lines 15 and 16 as shown below:

After this you should be able to update the Connector, and then test it, create a connection and verify successful results.

Creating the Custom Connector using an OpenAPI swagger definition file/url

Both examples above, either importing a Custom Connector from an Azure service or exporting the Logic App to a Custom Connector in a Power Platform environment, require that the user doing this has both:

  • Azure RBAC role assignment and permissions as detailed above.
  • Access to Power Platform environment and licensed for using Power Platform.

What if you as Azure administrator don’t want your Power Platform users to have access to Azure, but you still want to help them with creating Custom Connectors that send requests to selected Logic App workflow APIs?

Then you can provide them with an OpenAPI swagger definition file or url. You can get the swagger OpenAPI definition by running this Azure REST request: Workflows – List Swagger (Azure Logic Apps) | Microsoft Docs.

To get your swagger you can look at the first blog post in this series, where I showed how you could use Az PowerShell to get management access tokens using Get-AzAccessToken and running Invoke-RestMethod. In this example I'm just going to use the Try it button from the Docs link above, then authenticate to my Azure subscription, and fill in the required parameters:

Running this request should produce your requested swagger OpenAPI definition. You can now copy this to a file:

Before you share this OpenAPI file with your Power Platform developers, you should edit and remove the following request query parameters, as these are not needed when running the OAuth2 authorization scheme:
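For reference, in the JSON definition returned by the List Swagger request, the SAS query parameters to remove will look roughly like this (your exported definition may format them slightly differently):

{ "name": "sv", "in": "query", "required": true, "type": "string" },
{ "name": "sp", "in": "query", "required": true, "type": "string" },
{ "name": "sig", "in": "query", "required": true, "type": "string" }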

After this you can create a new Custom Connector, by specifying an OpenAPI file / url, depending on where you made the file available to your Power Platform Developers.

Browse to the filename and type a Connector name:

After this you will have the basis of the Connector defined, where you can customize general settings etc:

You will now need to add the authentication for Azure AD OAuth2 with client id, secret etc. under Security, as well as creating the connector, and under Test create a connection and test the operations. This is the same as I showed earlier, so I don't need to show the details here.

Creating the Custom Connector from blank

You can of course create Power Platform Custom Connectors from blank as well. This should be easy enough based on the details I have provided above, but basically you will need to make sure to set the correct Host and Base URL path for your Logic App here:

After adding the authentication details for Azure AD OAuth2 (same as before), you will need to manually add actions, provide a request from a sample, and define the default responses for the 200 and 403 statuses, as shown in the earlier steps.

With the Custom Connector now in place for sending Requests to the Logic App using delegated authentication, we can now start using this Connector in our Flows/Power Apps.

Lets build a quick sample of that.

Creating a Power Automate Flow that will trigger Logic App API

I’ll just assume readers of this blog post knows a thing or two about Power Automate and Cloud Flows, so I’ll try to keep this high level.

I’ve created a new instant Cloud Flow, using PowerApps as a trigger. Then I add three initialize variables actions, giving the variables and actions name like below, before I set “Ask in PowerApps” for values:

Next, add a Custom Connector action, selecting the Custom Connector we created earlier (I have 3 versions here, as shown above with the alternative ways to set up a Custom Connector from Logic Apps):

Next, select the action from the chosen Custom Connector, and fill in the parameters like below:

Next, we need to check the response status code we get back from the Custom Connector. Add a Switch control action, where we will check against the outputs from the Get Managed Devices and statusCode:

Make sure that the Switch action is configured to run whether the Get Managed Devices action has succeeded or failed:

For each case of statusCode we will check against, we need a Response action to return data back to the PowerApp. For status code 200 OK, I’ll return the Status Code as shown below, adding Headers to be Content-Type application/json, and using the Body output from the Get Managed Devices custom connector action. The Response Body JSON schema is based on the sample output from the Logic App API.

PS! To get the Body output, you can use the following custom expression: outputs('Get_Managed_Devices')?['body']:

For status code 403 I have added the following Case and Response action, using the Body output again, but this time the schema is based on the 403 response from the Logic App API:

Last, as every case should have a Response, I'll add the following Default Case:

The whole Flow visualized:

PS! Instead of using the Response action, I could also have used the “Respond to PowerApps” action. However, that action only lets me return text strings, numbers, booleans etc., and I wanted to return a native JSON response.

Make sure that you test and verify the Flow before you proceed.

Creating the Power App to connect to the Flow

With the Flow ready, let's quickly build a PowerApp. My PowerApp is a Canvas App, and I have been using the Phone layout.

You can build this any way you want, but I used a dark theme, added an icon at the top of the screen, and the 3 labels and text inputs for the parameters needed for the Flow. I then added a button for triggering the Flow, and under the button I have added a (now hidden) text label for showing any error messages from the Flow. Below that I have added a Gallery control for showing the resulting devices:

For the Button, select it, click on the Action menu and then Power Automate to connect your Flow. Next, change the “OnSelect” event for the Button to the following command:

Set(wait,true);
Clear(MyManagedDevices);Set(MyErrorMessage,Blank());
Set(MyDeviceResponse,GetManagedDevices.Run(textOperatingSystem.Text,textOSVersion.Text,textUserUpn.Text));
If(IsBlank(MyDeviceResponse),Set(MyErrorMessage,"Authorization Error: Check Flow for details on missing scope or roles claims for querying organization devices."),ClearCollect(MyManagedDevices,MyDeviceResponse));
Set(wait,!true)

A quick explanation of the commands above:

  • Set(wait,true) and Set(wait,!true) are used to make the PowerApp appear “busy” while the Flow runs.
  • I then Clear the Collection and Variable used.
  • I then use Set to get a “MyDeviceResponse”; this will return a collection of items (devices) returned as a JSON array from the Flow, or, if I'm not authorized, a failed response (based on the 403) and a blank MyDeviceResponse.
  • Next I do an If test: if MyDeviceResponse is Blank, I set the MyErrorMessage variable; if it's not blank, I run ClearCollect and fill the Collection with the returned devices.

I fully appreciate that there might be other ways to do this fail checking and error handling, please let me know in the comments if you have other suggestions 🙂

For the Gallery I set the Data source to the MyManagedDevices collection, and I have selected the layout of “Title, subtitle, and body”. You can change the device data that gets filled in for these items in the Gallery, for example Manufacturer, Version, Name etc.

And last I set the Text property of my error message label to the MyErrorMessage variable:

Let’s Save, Publish and test this PowerApp!

First, I’ll try to add parameters for getting all Windows 10 devices, leaving user principal name blank. This will via the Custom Connector send a request to the Logic App API to return all devices, and in this case I’m not authorized to do so, so I’ll get an authorization error:

This is something I can verify in the Flow run history also:

Next, I’ll try to return my test users devices only, and this is successful and will fill the gallery:

We now have a working Flow and PowerApp connected to the Logic App API using the signed-in user's delegated permissions. If I want, I can now share the PowerApp and Flow including the Custom Connector with other users in my organization, and they can use their own user identity for connections to the Logic App API.

In the next part of this blog post I will show how you can access the Logic App API using HTTP action and application permissions.

Connect to Logic App API using HTTP action

Sometimes you will have scenarios where you want to use an application identity to call an API like the Logic App I have used in this blog post article series. This is especially useful if you want to run a Power Automate Flow without a logged in user’s permissions.

In the previous blog post (part 2), where the Logic App was exposed as an API, I created this App Registration to represent the application client scenarios and application permissions:

In that App Registration, create a new Client Secret for using in Power Automate, and copy this to your clipboard:

Make sure to copy the Application (Client) ID and Tenant ID also:

Now let’s create a new Power Automate Flow to test this scenario. This type of Flow could use a range of different triggers based on your needs, but I’ll just use a Instant Cloud Flow as trigger where I have configured the following inputs:

Note that I have configured userUpn as an optional input.

Next add a “Compose” action for the Client Secret, give the action a name and paste in the Client Secret you created earlier. Note the Lock symbol:

Click on settings and select to Secure Inputs:

Next add an HTTP action, setting the Method to POST and the URI to the Logic App API URL; remember not to include the sv, sp and sig query parameters. Set the Content-Type header to application/json, and under queries add the api-version. For the body, build the JSON request body using the inputs. We need to build a dynamic expression for userUpn, as this input is optional. I have used the following expression:

if(not(empty(triggerBody()?['text_2'])),triggerBody()?['text_2'],'')
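Putting it together, the HTTP action body could look something like this (a sketch, assuming the trigger text inputs are internally named text, text_1 and text_2 for operatingSystem, osVersion and userUpn respectively, as in my Flow):

{
  "operatingSystem": "@{triggerBody()?['text']}",
  "osVersion": "@{triggerBody()?['text_1']}",
  "userUpn": "@{if(not(empty(triggerBody()?['text_2'])),triggerBody()?['text_2'],'')}"
}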

Click to show advanced settings, and choose Authentication to use Azure AD OAuth. Add the authority, tenant id and set audience to the custom Logic App API URI. Then paste in the Application (Client) Id, and use the Outputs from the Compose Client Secret action:

This authentication above will use the Client Credentials Flow to get an access token that will be accepted by the Logic App API.

The remaining parts of this Flow can be exactly the same as the previous Flow we built: a Switch control that continues on success/failure of the HTTP action:

And then returning Response objects from the HTTP action body for each case:

When testing the Flow now, I can see that the client secret is hidden from all relevant actions:

Summary and Next Steps

We are at the end of another extensive blog post. The focus of this article has been to show how you can use Power Automate to connect to the custom API we built in the previous blog post by exposing the Logic App as an API.

The community is increasingly creating Power Platform custom connectors and HTTP actions that send requests directly to APIs like Microsoft Graph, and that is great, but it might result in overly broad permissions being granted to users and application clients. My focus has been to show how you can control authentication and authorization using on-behalf-of flows hidden behind a Logic App API, where users and clients are only allowed to send requests based on allowed permission scopes and/or roles, using the powers of Azure Active Directory and OAuth2.

There will be a later blog post in this series also, where I look into how Azure API Management can be used in these scenarios as well.

In the meantime, thanks for reading, hope it has been helpful!

Protect Logic Apps with Azure AD OAuth – Part 2 Expose Logic App as API

This blog article will build on the previous blog post published, Protect Logic Apps with Azure AD OAuth – Part 1 Management Access | GoToGuy Blog, which provided some basic understanding around authorizing to Logic Apps request triggers using OAuth and Access Tokens.

In this blog I will build on that, creating a scenario where a Logic App will be exposed as an API to end users. In this API, I will call another popular API: Microsoft Graph.

My scenario will use a case where end users do not have access themselves to certain Microsoft Graph requests, but where the Logic App does. Exposing the Logic App as an API will let users authenticate and authorize, requesting and consenting to the custom Logic App API permissions I choose. Some of these permissions users can consent to themselves, while others must be admin consented. This way I can do some authorization inside the Logic App, and only let the end users request what they are permitted to.

I will also look into assigning users and groups, and using scopes and roles, for additional fine-grained control of end user and principal access to the Logic App.

A lot of topics to cover, so let’s get started by first creating the scenario for the Logic App.

Logic App calling Microsoft Graph API

A Logic App can run requests against the Microsoft Graph API using the HTTP action and specifying the method (GET, POST, etc) and resource URI. For authentication against Graph from the Logic App you can use either:

  • Using Azure Active Directory OAuth and Client Credentials Flow with Client Id and Secret.
  • Using System or User Assigned Managed Identity.

Permissions for the Microsoft Graph API are either “delegated” (in the context of the logged-in user) or “application” (in the context of an application/daemon service). These Logic App scenarios will use application permissions for Microsoft Graph.

PS! Using Logic Apps Custom Connectors (Custom connectors overview | Microsoft Docs) you can also use delegated permissions by creating a connection with a logged-in user, but this is outside the scope of this article.

Scenario for using Microsoft Graph in Logic App

There are a variety of usage scenarios for Microsoft Graph, so for the purpose of this Logic App I will focus on one of the most popular: Device Management (Intune API) resources. This is what I want the Logic App to do in this first phase:

  • Listing a particular user’s managed devices.
  • Listing all of the organization’s managed devices.
  • Filtering managed devices based on operating system and version.

In addition to the above, I want to implement the custom API such that any assigned user can list their own devices through end-user consent, but to be able to list all devices, or devices for any user other than yourself, an admin-consented permission for the custom API is required.

Creating the Logic App

In your Azure subscription, add a new Logic App to your chosen resource group and name it according to your naming standard. After the Logic App is created, you will need to add the trigger. As this will be a custom API, you will need it to use HTTP as the trigger, and you will also need a response back to the caller, so the easiest way is to use the template for HTTP Request-Response as shown below:

Your Logic App will now look like this:

Save the Logic App before proceeding.

Create a Managed Identity for the Logic App

Exit the designer and go to the Identity section of the Logic App. We need a managed identity, either system assigned or user assigned, to let the Logic App authenticate against Microsoft Graph.

A system assigned managed identity will follow the lifecycle of this Logic App, while a user assigned managed identity will have its own lifecycle and can be used by other resources as well. I want that flexibility, so I will create a user assigned managed identity for this scenario. In the Azure Portal, select to create a new resource and find User Assigned Managed Identity:

Create a new User Assigned Managed Identity in your selected resource group and give it a name based on your naming convention:

After creating the managed identity, go back to your Logic App, and then under Identity section, add the newly created managed identity under User Assigned Managed Identity:

Before we proceed with the Logic App, we need to give the Managed Identity the appropriate Microsoft Graph permissions.

Adding Microsoft Graph Permissions to the Managed Identity

Now, if we wanted the Logic App to have permissions to the Azure Rest API, we could have easily added Azure role assignments to the managed identity directly:

But, as we need permissions to Microsoft Graph, there is no GUI to do this for now. The permissions needed for listing managed devices are documented here: List managedDevices – Microsoft Graph v1.0 | Microsoft Docs.

So we need a minimum of: DeviceManagementManagedDevices.Read.All.

To add these permissions we need to run some PowerShell commands using the AzureAD module. If you have that installed locally, you can connect and proceed with the following commands; for ease of access you can also use the Cloud Shell in the Azure Portal, just run Connect-AzureAD first:

PS! You need to be a Global Admin to add Graph Permissions.

You can run each of these lines separately, or run it as a script:

# Microsoft Graph App Well Known App Id
$msGraphAppId = "00000003-0000-0000-c000-000000000000"

# Display Name of Managed Identity
$msiDisplayName="msi-ops-manageddevices" 

# Microsoft Graph Permission required
$msGraphPermission = "DeviceManagementManagedDevices.Read.All" 

# Get Managed Identity Service Principal Name
$msiSpn = (Get-AzureADServicePrincipal -Filter "displayName eq '$msiDisplayName'")

# Get Microsoft Graph Service Principal
$msGraphSpn = Get-AzureADServicePrincipal -Filter "appId eq '$msGraphAppId'"

# Get the Application Role for the Graph Permission
$appRole = $msGraphSpn.AppRoles | Where-Object {$_.Value -eq $msGraphPermission -and $_.AllowedMemberTypes -contains "Application"}

# Assign the Application Role to the Managed Identity
New-AzureAdServiceAppRoleAssignment -ObjectId $msiSpn.ObjectId -PrincipalId $msiSpn.ObjectId -ResourceId $msGraphSpn.ObjectId -Id $appRole.Id

Verify that it runs as expected:

As mentioned earlier, adding these permissions has to be done using script commands, but there is a way to verify the permissions by doing the following:

  1. Find the Managed Identity, and copy the Client ID:
  2. Under Azure Active Directory and Enterprise Applications, make sure you are in the Legacy Search Experience and paste in the Client ID:
  3. Which you then can click into, and under permissions you will see the admin has consented to Graph permissions:

The Logic App can now get Intune Managed Devices from Microsoft Graph API using the Managed Identity.

Calling Microsoft Graph from the Logic App

Let’s start by adding some inputs to the Logic App. I’m planning to trigger the Logic App using an http request body like the following:

{
 "userUpn": "[email protected]",
 "operatingSystem": "Windows",
 "osVersion": "10"
}

In the Logic App request trigger, paste as a sample JSON payload:

The request body schema will be updated accordingly, and the Logic App is prepared to receive inputs:

Next, add a Condition action, where we will check whether we should get a specific user's managed devices, or all devices. Use an expression with the empty function to check for userUpn, and another expression for the true value, like below:
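A minimal sketch of that condition check (left side expression, operator, right side expression):

empty(triggerBody()?['userUpn'])   is equal to   true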

We will add more logic and conditions later for the filtering of the operating system and version, but for now add an HTTP action under True like the following:

Note the use of the Managed Identity and Audience; the identity has permission for querying managed devices.

Under False, we will get the managed devices for a specific user. So add the following, using the userUpn input in the URI:
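As a sketch, the two HTTP actions could use the following Microsoft Graph URIs, with Authentication set to Managed Identity (the user assigned identity added earlier) and Audience https://graph.microsoft.com:

# True (all devices):
GET https://graph.microsoft.com/v1.0/deviceManagement/managedDevices

# False (a specific user's devices):
GET https://graph.microsoft.com/v1.0/users/@{triggerBody()?['userUpn']}/managedDevices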

Both these actions should be able to run successfully now, but we will leave the testing for a bit later. First I want to return the managed devices found via the Response action.

Add an Initialize Variable action before the Condition action. Set the Name and Type to Array as shown below, but the value can be empty for now:

Next, under True and Get All Managed Devices, add a Parse JSON action, adding the output body from the http action and using either the sample response from the Microsoft Graph documentation, or your own to create the schema.

PS! Note that if you have over 1000 managed devices, Graph will page the output, so you should test for odata.nextLink to be present as well. You can use the following anonymized sample response for schema which should work in most cases:

{
     "@odata.context": "https://graph.microsoft.com/v1.0/$metadata#deviceManagement/managedDevices",
     "@odata.count": 1000,
     "@odata.nextLink": "https://graph.microsoft.com/v1.0/deviceManagement/managedDevices?$skiptoken=",
     "value": [
         {
             "id": "id Value",
             "userId": "User Id value",
             "deviceName": "Device Name value",
             "managedDeviceOwnerType": "company",
             "operatingSystem": "Operating System value",
             "complianceState": "compliant",
             "managementAgent": "mdm",
             "osVersion": "Os Version value",
             "azureADRegistered": true,
             "deviceEnrollmentType": "userEnrollment",
             "azureADDeviceId": "Azure ADDevice Id value",
             "deviceRegistrationState": "registered",
             "isEncrypted": true,
             "userPrincipalName": "User Principal Name Value ",
             "model": "Model Value",
             "manufacturer": "Manufacturer Value",
             "userDisplayName": "User Display Name Value",
             "managedDeviceName": "Managed Device Name Value"
         }
     ]
 }

PS! Remove any sample response output from schema if values will be null or missing from your output. For example I needed to remove the configurationManagerClientEnabledFeatures from my schema, as this is null in many cases.

Add another Parse JSON action under the get user managed devices action as well:

Now we will take that output and do a For Each loop for each value. On both sides of the condition, add a For Each action, using the value from the previous HTTP action:

Inside that For Each loop, add an Append to Array variable action. In this action we will build a JSON object, returning our chosen attributes (you can change to whatever you want), and selecting the properties from the value that was parsed:
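As an illustration, the object appended for each device could look something like this in code view (assuming the loop action is named 'For_each'; pick whichever properties you want to return):

{
  "deviceName": "@{items('For_each')?['deviceName']}",
  "operatingSystem": "@{items('For_each')?['operatingSystem']}",
  "osVersion": "@{items('For_each')?['osVersion']}",
  "model": "@{items('For_each')?['model']}",
  "manufacturer": "@{items('For_each')?['manufacturer']}",
  "userPrincipalName": "@{items('For_each')?['userPrincipalName']}"
}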

Do the exact same thing for the user devices:

Now, on each side of the condition, add a Response action that will return the ManagedDevices array variable. This will be returned as JSON, so set the Content-Type to application/json:

Finally, remove the default response action that is no longer needed:

The complete Logic App should look like the following now:

As I mentioned earlier, we’ll get to the filtering parts later, but now it’s time for some testing.

Testing the Logic App from Postman

In the first part of this blog post article series, Protect Logic Apps with Azure AD OAuth – Part 1 Management Access | GoToGuy Blog, I described how you could use Postman, PowerShell or Azure CLI to test against REST API’s.

Let’s test this Logic App now with Postman. Copy the HTTP POST URL:

And paste it to Postman, remember to change method to POST:

You can now click Send, and the Logic App will trigger, and should return all your managed devices.

If you want a specific users’ managed devices, then you need to go to the Body parameter, and add like the following with an existing user principal name in your organization:

You should then be able to get this user's managed devices; for example, for my test user this was just a virtual machine with Windows 10:

And I can verify a successful run from the Logic App history:

Summary so far

We’ve built a Logic App that uses it’s own identity (User Assigned Managed Identity) to access the Microsoft Graph API using Application Permissions to get managed devices for all users or a selected user by UPN. Now it’s time to exposing this Logic App as an API son end users can call this securely using Azure AD OAuth.

Building the Logic App API

When exposing the Logic App as an API, it will be the resource that end users access and call as a REST API. Consider the following diagram showing the flow for OpenID Connect and OAuth, where Azure AD is the Authorization Server from which end users request access tokens whose audience is the Logic App resource:

Our next step will be to create Azure AD App Registrations, and we will start with the App Registration for the resource API.

Creating App Registration for Logic App API

In your Azure AD tenant, create a new App Registration, and call it something like (YourName) LogicApp API:

I will use single tenant for this scenario; leave the other settings as they are and create.

Next, go to Expose an API:

Click on Set right next to Application ID URI, and set the App ID URI to a value of your choice. You can keep the GUID if you want, but you can also type any URI value you like here (using api:// or https://). I chose to set the API URI to this:

Next we need to add scopes that will be the permissions that delegated end users can consent to. This will be the basis of the authorization checks we can do in the Logic App later.

Add a scope with the details shown below. This will be a scope end users can consent to themselves, but it will only allow them to read their own managed devices:

Next, add another Scope, with the following details. This will be a scope that only Admins can consent to, and will be authorized to read all devices:

You should now have the following scopes defined:

Next, go to the Manifest and change the accessTokenAcceptedVersion from null to 2; this configures tokens to use the OAuth2 v2.0 endpoints:
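The relevant line in the manifest should then read:

"accessTokenAcceptedVersion": 2,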

That should be sufficient for now. In the next section we will prepare for the OAuth client.

Create App Registration for the Logic App Client

I choose to create a separate App Registration in Azure AD for the Logic App Client. This will represent the OAuth client that end users will use for OAuth authentication flows and requesting permissions for the Logic App API. I could have configured this in the same App Registration as the API created in the previous section, but this will provide better flexibility and security if I want to share the API with other clients also later, or if I want to separate the permission grants between clients.

Go to App Registrations in Azure AD, and create a new registration calling it something like (yourname) LogicApp Client:

Choose single tenant and leave the other settings for now.

After registering, go to API permissions, and click on Add a permission. From there you can browse to “My APIs” and you should be able to locate the (yourname) LogicApp API. Select to add the delegated permissions as shown below:

These delegated permissions reflect the scopes we defined in the API earlier. Your App registration and API permission should now look like below. NB! Do NOT click to Grant admin consent for your Azure AD! This will grant consent on behalf of all your users, which will work against our intended scenario later.

Next, we need to provide a way for clients to authenticate using Oauth flows, so go to the Certificates & secrets section. Click to create a Client secret, I will name my secret after where I want to use it for testing later (Postman):

Make sure you copy the secret value for later:

(Don’t worry, I’ve already invalidated the secret above and created a new one).

Next, go to Authentication. We need to add a platform for authentication flows, so click Add a platform and choose Web. For using Postman later for testing, add the following as Redirect URI: https://oauth.pstmn.io/v1/callback

Next, we will also provide another test scenario using a PowerShell or Azure CLI client, so click on Add a platform one more time, this time adding Mobile and desktop applications as the platform, and use the following redirect URI: urn:ietf:wg:oauth:2.0:oob

Your platform configuration should now look like this:

Finally, go to advanced and set yes to allow public client flows, as this will aid in testing from PowerShell or Azure CLI clients later:

Now that we have configured the necessary App registrations, we can set up the Azure AD OAuth Authorization Policy for the Logic App.

Configuring Azure AD OAuth Authorization Policy for Logic App

Back in the Logic App, create an Azure AD Authorization Policy with issuer and audience as shown below:

Note the Claims values:

We are using the v2.0 issuer, since we configured accessTokenAcceptedVersion to 2 in the manifest of the App Registration (as opposed to the v1.0 issuer, which would be in the format https://sts.windows.net/{tenantId}/). The Audience claim will be our configured API App ID (for v1.0 the audience would be the App ID URI, like api://elven-logicapp-api).
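As a sketch, the authorization policy claims would be along these lines (substitute your own tenant ID and the API's Application (client) ID):

Issuer:   https://login.microsoftonline.com/{tenantId}/v2.0
Audience: <Application (client) ID of the LogicApp API app registration>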

Save the Logic App, and we can now start to do some testing where we will use the client app registration to get an access token for the Logic App API resource.

Testing with Postman Client

The first test scenario we will explore is using Postman Client and the Authorization Code flow for getting the correct v2.0 Token.

A recommended practice when using Postman and reusing variable values is to create an Environment. I’ve created this Environment for storing my Tenant ID, Client ID (App ID for the Client App Registration) and Client Secret (the secret I created for using Postman):

Previously in this blog article, we tested the Logic App using Postman. On that request, select the Authorization tab, and set type to OAuth 2.0:

Next, under Token configuration add the values like the following. Give the Token a recognizable name, this is just for Postman internal reference. Make sure that the Grant Type is Authorization Code. Note the Callback URL, this is the URL we configured for the App registration. In the Auth and Access Token URLs, configure the use of the v2.0 endpoints, using the TenantID from the environment variables (make sure to select the correct environment at the top right). The Client ID and Client Secret also refer to the environment variables:

One important step remains, and that is to correctly set the scope for the access token. Using something like user.read here will only produce an Access Token with Microsoft Graph as the audience. So we need to change to the Logic App API scope, ManagedDevices.Read in this case:
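With the App ID URI I set earlier, the full scope value would be something like:

api://elven-logicapp-api/ManagedDevices.Read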

Let’s get the Access Token, click on the Get New Access Token button:

A browser window launches, and if you are not already logged in, you must log in first. Then you will be prompted to consent to the permission as shown below. The end user is prompted to consent for the LogicApp API, as well as basic OpenID Connect consents:

After accepting, a popup will try to redirect you to Postman, so make sure you don’t block that:

Back in Postman, you will see that we have got a new Access Token:

Copy that Access Token, and paste it into a JWT debugger like jwt.ms or jwt.io. You should see in the data payload that the audience and issuer claims are the same values we configured in the Logic App Azure AD OAuth policy:

Note also the token version is 2.0.

Click to use the Token in the Postman request, it should populate this field:

Before testing the request, remember to remove the SAS query parameters from the request, so that sv, sp and sig are not used with the query for the Logic App:

Now, we can test. Click Send on the Request. It should complete successfully with a status of 200 OK, and return the managed device details:

Let’s add to the permission scopes, by adding the ManagedDevices.Read.All:

Remember just to have a blank space between the scopes, and then click Get New Access Token:

If I’m logged on with a normal end user, I will get the prompt above that I need admin privileges. If I log in with an admin account, this will be shown:

Note that I can now do one of two actions:

  1. I can consent only on behalf of myself (the logged in admin user), OR..
  2. I can consent on behalf of the organization, by selecting the check box. This way all users will get that permission as well.

Be very conscious when granting consents on behalf of your organization.

At this point the Logic App will authorize if the Token is from the correct issuer and for the correct audience, but the calling user can still request any managed device or all devices. Before we get to that, I will show another test scenario using a public client like PowerShell.

Testing with PowerShell and MSAL.PS

MSAL.PS is a perfect companion for using MSAL (Microsoft Authentication Library) to get Access Tokens in PowerShell. You can install MSAL.PS from PowerShellGallery using Install-Module MSAL.PS.

The following commands show how you can get an Access Token using MSAL.PS:

# Set Client and Tenant ID
$clientID = "cd5283d0-8613-446f-bfd7-8eb1c6c9ac19"
$tenantID = "104742fb-6225-439f-9540-60da5f0317dc"

# Get Access Token using Interactive Authentication for Specified Scope and Redirect URI (Windows PowerShell)
$tokenResponse = Get-MsalToken -ClientId $clientID -TenantId $tenantID -Interactive -Scope 'api://elven-logicapp-api/ManagedDevices.Read' -RedirectUri 'urn:ietf:wg:oauth:2.0:oob'

# Get Access Token using Interactive Authentication for Specified Scope and Redirect URI (PowerShell Core)
$tokenResponse = Get-MsalToken -ClientId $clientID -TenantId $tenantID -Interactive -Scope 'api://elven-logicapp-api/ManagedDevices.Read' -RedirectUri 'http://localhost'


MSAL.PS can be used both for Windows PowerShell and for PowerShell Core, so in the above commands I show both. Note that the redirect URI for MSAL.PS on PowerShell Core needs to be http://localhost. You also need to add that redirect URI to the App Registration:

Running the above command will prompt an interactive logon, and should return a successful response saved in the $tokenResponse variable.

We can verify the response, for example checking scopes or copying the Access Token to the clipboard so that we can check the token in a JWT debugger:

# Check Token Scopes
$tokenResponse.Scopes

# Copy Access Token to Clipboard
$tokenResponse.AccessToken | Clip

In the first blog post of this article series I covered how you can use Windows PowerShell and Core to use Invoke-RestMethod for calling the Logic App, here is an example where I call my Logic App using the Access Token (in PowerShell Core):

# Set variable for Logic App URL
$logicAppUrl = "https://prod-05.westeurope.logic.azure.com:443/workflows/d429c07002b44d63a388a698c2cee4ec/triggers/request/paths/invoke?api-version=2016-10-01"

# Convert Access Token to a Secure String for Bearer Token
$bearerToken = ConvertTo-SecureString ($tokenResponse.AccessToken) -AsPlainText -Force

# Invoke Logic App using Bearer Token
Invoke-RestMethod -Method Post -Uri $logicAppUrl -Authentication OAuth -Token $bearerToken

And I can verify that it works:

Great. I now have a couple of alternatives for calling my Logic App securely using Azure AD OAuth. In the next section we will get into how we can do authorization checks inside the Logic App.

Authorization inside Logic App

While the Logic App authorization policy can verify claims like issuer and audience, or other custom claims, it cannot be used to authorize inside the Logic App based on scopes, roles, etc.

In this section we will look into how we can do that.

Include Authorization Header in Logic Apps

First we need to include the Authorization header from the OAuth access token in the Logic App. To do this, open the Logic App in code view, and add the operationOptions to IncludeAuthorizationHeadersInOutputs for the trigger like this:

        "triggers": {
            "manual": {
                "inputs": {
                    "schema": {}
                },
                "kind": "Http",
                "type": "Request",
                "operationOptions": "IncludeAuthorizationHeadersInOutputs"
            }
        }

This will make the Bearer Token accessible inside the Logic App, as explained in detail in my previous post: Protect Logic Apps with Azure AD OAuth – Part 1 Management Access | GoToGuy Blog. There I also showed how to decode the token to get the readable JSON payload, so I need to apply the same steps here:

After applying the above steps, I can test the Logic App again and get the details of the decoded JWT token; of particular interest is checking the scopes:

Implement Logic to check the Scopes

When I created the LogicApp API app registration, I added two scopes: ManagedDevices.Read and ManagedDevices.Read.All. The authorization logic I want to implement now is to only let users calling the Logic App with the ManagedDevices.Read.All scope get ALL managed devices, or get managed devices for users other than themselves.

The first step will be to check if the JWT payload for scope “scp” contains the ManagedDevices.Read.All. Add a Compose action with the following expression:

contains(outputs('Base64_to_String_Json').scp,'ManagedDevices.Read.All')

This expression will return either true or false depending on the scp value.

Next after this action, add a Condition action, where we will do some authorization checks. I have created two groups of checks, where one OR the other needs to be true.

Here are the details for these two groups:

  • Group 1 (checks if scp does not contain ManagedDevices.Read.All and calling user tries to get All managed devices):
    • Outputs('Check_Scopes') = false
    • empty(triggerBody()?['userUpn']) = true
  • Group 2 (checks if scp does not contain ManagedDevices.Read.All, and the caller tries to get managed devices for another user than the user's own UPN):
    • Outputs('Check_Scopes') = false
    • triggerBody()?['userUpn'] != Outputs('Base64_to_String_Json')['preferred_username']

If either of those two groups is True, then we know that the calling user tries to do something the user is not authorized to do. This is something we need to give a customized response for. So inside the True condition, add a new Response action with something like the following:

I’m using a status code of 403, meaning that the request was successfully authenticated but was missing the required authorization for the resource.

Next, add a Terminate action, so that the Logic App stops with a successful status. Note also that on the False side of the condition, I leave it blank because I want it to proceed with the next steps in the Logic App.

Test the Authorization Scope Logic

We can now test the authorization scopes logic implemented above. In Postman, either use an existing Access Token or get a new Token that only includes the ManagedDevices.Read scope.

Then, send a request with an empty request body. You should get the following response:

Then, try another test, this time specifying another user principal name than your own, which also should fail:

And then test with your own user principal name, which will match the ‘preferred_username’ claim in the Access Token, this should be successful and return your devices:

Perfect! It works as intended, normal end users can now only request their own managed devices.

Let’s test with an admin account and the ManagedDevices.Read.All scope. In Postman, add that scope, and get a new Access Token:

When logging in with a user that has admin privileges you will now get a Token that has the scope for getting all devices, for which your testing should return 200 OK for all or any user's devices:

Adding Custom Claims to Access Token

In addition to the default claims and scopes in the Access Token, you can customize a select set of additional claims to be included in the JWT data payload. Since the Access Token is for the resource, you will need to customize this on the App Registration for the LogicApp API.

In Azure AD, select the App Registration for the API and go to API permissions. You need to add the OpenID scopes first; add the following OpenID permissions:

Your API App Registration should look like this:

Next, go to Token configuration. Click Add optional claim, and select Access Token. For example you can add the ipaddr and upn claims as I have done below:

Note the optional claims listed for the resource API registration:

Next time I get a new access token, I can see that the claims are there:

Summary of User Authorization so far

What we have accomplished now is that users can get an Access Token for the Logic App API resource. This is the first requirement for users to be able to call the Logic App, that they indeed have a Bearer Token in the Authorization Header that includes the configured issuer and audience.

In my demos I have shown how to get an access token using Postman (Authorization Code Flow) and a public client using MSAL.PS. But you can use any kind of web application, browser/SPA, or client app, using any programming libraries that support either MSAL or OpenID Connect and OAuth2. Your solution, your choice 😉

After that I showed how you can use scopes for delegated permissions, and how you can do internal authorization logic in the Logic App depending on what scopes the user has consented to or been allowed.

We will now build on this, by looking into controlling access and using application roles for principals.

Assigning Users and Restricting Access

One of the most powerful aspects of exposing your API using Microsoft Identity Platform and Azure AD is that you now can control who can access your solution, in this case call the Logic App.

Better yet, you can use Azure AD Conditional Access to apply policies requiring MFA, compliant devices, specific locations, or sign-ins under a certain risk level, to name a few.

Let’s see a couple of examples of that.

Require User Assignment

The first thing we need to do is to change the settings for the Enterprise Application. We created an App registration for the LogicApp Client, for users to be able to authenticate and access the API. From that LogicApp Client, you can get to the Enterprise Application by clicking on the link for Managed application:

In the Enterprise App, go to Properties, and select User assignment required:

We can now control which users or groups can authenticate to and get access to the Logic App API via the Client:

If I try to log in with a user that is not listed under Users and groups, I will get an error message that the “signed in user is not assigned to a role for the application”:

PS! The above error will present itself a little differently depending on how you authenticate: the above image is from a public client; if you use Postman, the error will be in the Postman console log; if you use a web application, you will get the error in the browser, etc.

Configuring Conditional Access for the Logic App

In addition to controlling which users and groups can access the Logic App, I can configure a Conditional Access policy in Azure AD for more fine-grained access and security controls.

In your Azure AD blade, go to Security and Conditional Access. If you already have a CA policy that affects all Applications and Users, for example requiring MFA, your LogicApp API would already be affected by that.

Note that as we are protecting the resource here, your Conditional Access policy must be targeted to the LogicApp API Enterprise App.

Click to create a new policy specific for the Logic App API, as shown below:

For example, I can require that my Logic App API can only be called from a managed and compliant device, or a Hybrid Azure AD Joined device, as shown below:

If I create that policy, and then try to get an access token using a device that is not registered or compliant with my organization, I will get this error:

Summary of Restricting Access for Users and Groups

With the above steps we can see that by adding an Azure AD OAuth authorization policy to the Logic App, we can control which users and groups can authenticate to and get the Access Token required for calling the Logic App, and we can use Conditional Access to apply additional fine-grained access control and security policies.

So far we have tested with interactive users and delegated permission access scenarios. In the next section we will dive into using application access and roles for authorization scenarios.

Adding Application Access and Roles

Sometimes you will have scenarios that let an application run as itself, like a daemon or service, without requiring an interactive user to be present.

Compared to the OIDC and OAuth flow from earlier, the Client will access the Resource directly, using an Access Token acquired from Azure AD via the Client Credentials Flow:

Using the Client Credentials Flow from Postman

Back in the Postman client, under the Authorization tab, just change the Grant Type to Client Credentials like the following. NB! When using application access, there are no specific delegated scopes, so you need to change the scope so that it refers to .default after the scope URI:
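For my API the scope would then be:

api://elven-logicapp-api/.default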

Click Get New Access Token, and after successfully authenticating click to Use Token. Copy the Token to the Clipboard, and paste to a JWT debugger. Let’s examine the JWT payload:

Note that the audience and issuer are the same as when we got an access token for an end user, but also that the JWT payload does not contain any scopes (scp) or any other user-identifiable claims.

Using the Client Credentials Flow from MSAL.PS

To get an Access Token for an application client in MSAL.PS, run the following commands:

# Set Client and Tenant ID
$clientID = "cd5283d0-8613-446f-bfd7-8eb1c6c9ac19"
$tenantID = "104742fb-6225-439f-9540-60da5f0317dc"
# Set Client Secret as Secure String (keep private)
$clientSecret = ConvertTo-SecureString ("<your secret in plain text>") -AsPlainText -Force

# Get Access Token using Client Credentials Flow and Default Scope
$tokenResponse = Get-MsalToken -ClientId $clientID -ClientSecret $clientSecret -TenantId $tenantID -Scopes 'api://elven-logicapp-api/.default'

You can then validate this Token and copy it to a JWT debugger:

# Copy Access Token to Clipboard
$tokenResponse.AccessToken | Clip

Calling the Logic App using Client Application

We can send requests to the Logic App using an Access Token in an application by including it as a Bearer Token in the Authorization Header, exactly the same way we did previously. However, the Logic App might now fail internally when processing the access token, because it contains a different claims payload:

Looking into the run history of the Logic App I can see that the reason it fails is that it is missing scp (scopes) in the token.

This is expected when authenticating as an application, so we will fix that a little later.

A few words on Scopes vs. Roles

In delegated user scenarios, permissions are defined as Scopes. When using application permissions, we use Roles instead. Role permissions are always granted by an admin, and every role permission granted to the application will be included in the token, provided via the .default scope for the API.

Adding Application Roles for Applications

Now, let’s look into adding Roles to our LogicApp API. Locate the App registration for the API, and go to the App roles | Preview blade. (this new preview let us define roles in the GUI, where until recently you had to go to the manifest to edit).

Next, click on Create app role. Give the app role a display name and value. PS! The value must be unique, so if you already have that value as a scope name, you need to distinguish it, e.g. by using Role in the value as I have here:

The allowed member types give you a choice over who/what can be assigned the role. You can select either application or user/groups, or both.

Add another App Role as shown below:

You should now have the following two roles:
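If you prefer the manifest view, the two app roles end up as entries in the appRoles collection, roughly like this (IDs are generated, and the display names and descriptions are assumptions for illustration):

"appRoles": [
    {
        "allowedMemberTypes": [ "User", "Application" ],
        "description": "Allows reading all managed devices",
        "displayName": "ManagedDevices.Role.Read.All",
        "id": "<generated guid>",
        "isEnabled": true,
        "value": "ManagedDevices.Role.Read.All"
    },
    {
        "allowedMemberTypes": [ "User", "Application" ],
        "description": "Allows reading a given user's managed devices",
        "displayName": "ManagedDevices.Role.Read",
        "id": "<generated guid>",
        "isEnabled": true,
        "value": "ManagedDevices.Role.Read"
    }
]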

Assigning Roles to Application

I recommend that you create a new App Registration for application access scenarios. This way you avoid mixing delegated and application permissions in the same app registration, it is easier to differentiate user and admin consents, secret credentials are easier to separate, and you can use different settings for restricting access using Azure AD Users/Groups and Conditional Access.

So create a new App registration, call it something like (Yourname) LogicApp Application Client:

Choose single tenant and leave the other settings as default. Click Register and copy the Application (Client ID) and store it for later:

Next, go to Certificates & secrets, and create a new Client secret:

Copy the secret and store it for later.

Go to API permissions, click Add a permission, and from My APIs, find the LogicApp API. Add the Application permissions as shown below, these are the App Roles we added to the API earlier:

Under API permissions you can remove the Microsoft Graph user.read permission, as it won't be needed here; the two remaining permissions should be:

These you NEED to grant admin consent for, as no interactive user will be involved in a consent prompt:

The admin consent is granted as shown below:

Now we can test getting an access token via this new app registration, using either Postman or MSAL.PS; remember to use the new app (client) id and app (client) secret. I chose to add the two values to my Postman environment like this:

Next, change the token settings for Client Credentials flow so that the Client ID and Secret use the new variable names. Click to Get New Access Token:

After successfully getting the access token, click Use Token and copy it to the clipboard so we can analyze it in the JWT debugger. From there we can indeed see that the roles claims have been added:

We will look for these roles claims in the Logic App later. But first we will take a look at how we can add these roles to users as well.

Assigning Roles to Users/Groups

Adding roles to users or groups can be used for authorizing access based on the roles claim. Go to the Enterprise App for the Logic App API registration; you can get to the Enterprise App by clicking on the Managed application link:

In the Enterprise App, under Users and Groups, you will already see the service principal for the LogicApp Application Client with the Roles assigned. This is because these role permissions were granted by admin consent:

Click on Add user/group, and assign the selected role to a user in your organization:

You can add more users or groups to assigned roles:

Let's do a test for this user scenario. We need to do an interactive user login again, so change to using Authorization Code Flow in Postman, and change back to the original ClientID and ClientSecret:

Click to Get New Access Token, authenticate with your user in the browser (the user you assigned a role to), then use the token and copy it to the clipboard. If we now examine that token and look at the JWT data payload, we can see that the user now has a roles claim, as well as the scope claim:

We can now proceed to adjust the authorization checks in the Logic App.

Customizing Logic App to handle Roles Claims

Previously in the Logic App we did checks against the scopes (scp claim). We need to make some adjustments to these steps, as they will fail if there is no scp claim in the Token:

Change to the following expression, with an if test that returns false if there is no scp claim, in addition to the original check for the scope to be ManagedDevices.Read.All:

This is the expression used above:

if(empty(outputs('Base64_to_String_Json')?['scp']),false,contains(outputs('Base64_to_String_Json').scp,'ManagedDevices.Read.All'))

Similarly, add a new Compose action, where we will check for any roles claim.

This expression will also return false if either the roles claim is empty, or does not contain the ManagedDevices.Role.Read.All:

if(empty(outputs('Base64_to_String_Json')?['roles']),false,contains(outputs('Base64_to_String_Json').roles,'ManagedDevices.Role.Read.All'))

Next we need to add more checks to the authorization logic. Add a new line to the first group, where we also check the output of the Check Roles action to be false:

In the above image I’ve also updated the action name and comment to reflect new checks.

To the second group, add two more lines, where line 3 checks the outputs from Check Roles to be false (same as above), and line 4 checks whether the roles claim contains the role ManagedDevices.Role.Read:

The complete authorization checks logic should now be:

And this is the summary of conditions:

  • Group 1 (checks if scp does not contain ManagedDevices.Read.All and roles does not contain ManagedDevices.Role.Read.All and calling user tries to get All managed devices):
    • Outputs('Check_Scopes') = false
    • empty(triggerBody()?['userUpn']) = true
    • Outputs('Check_Roles') = false
  • Group 2 (checks if scp does not contain ManagedDevices.Read.All and roles does not contain ManagedDevices.Role.Read.All, and the caller tries to get managed devices for another user than the user's own UPN, and roles does not contain ManagedDevices.Role.Read):
    • Outputs('Check_Scopes') = false
    • triggerBody()?['userUpn'] != Outputs('Base64_to_String_Json')['preferred_username']
    • Outputs('Check_Roles') = false
    • contains(outputs('Base64_to_String_Json')?['roles'],'ManagedDevices.Role.Read') = false

If either of the two groups of checks above returns true, it means that the request was not authorized. To give a more customized response, change the Response action like the following:

In the above action I have changed the response so that it is returned as a JSON object, and changed the body so that it returns JSON data. I have also listed the values from the token that the user/application used when calling the Logic App. The dynamic expression for getting the roles claim (which will be an array if any roles claims are present) is:

if(empty(outputs('Base64_to_String_Json')?['roles']),'',join(outputs('Base64_to_String_Json')?['roles'],' '))

And for getting any scopes claim, which will be a text string or null:

outputs('Base64_to_String_Json')?['scp']
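For illustration, a sketch of such a customized 403 response body using those expressions (pick whichever details you want to surface):

{
  "Message": "Not authorized to perform this request.",
  "User": "@{outputs('Base64_to_String_Json')?['preferred_username']}",
  "Scopes": "@{outputs('Base64_to_String_Json')?['scp']}",
  "Roles": "@{if(empty(outputs('Base64_to_String_Json')?['roles']),'',join(outputs('Base64_to_String_Json')?['roles'],' '))}"
}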

Test Scenario Summary

I’ll leave the testing over to you, but if you have followed along and customized the Logic App as I described above, you should now be able to verify the following test scenarios:

User/App      | Scope                   | Roles                        | Result
User          | ManagedDevices.Read     | (none)                       | Can get own managed devices. Not authorized to get all devices or other users' managed devices.
User (Admin)  | ManagedDevices.Read.All | (none)                       | Can get any or all devices.
User          | ManagedDevices.Read     | ManagedDevices.Role.Read     | Can get own managed devices. Can get other users' managed devices by userUpn. Not authorized to get all devices.
User          | ManagedDevices.Read     | ManagedDevices.Role.Read.All | Can get any or all devices.
Application   | (none)                  | ManagedDevices.Role.Read     | Can get any user's managed devices by userUpn. Not authorized to get all devices.
Application   | (none)                  | ManagedDevices.Role.Read.All | Can get any or all devices.

When testing the above scenarios, you need a new access token using either the authorization code flow (user) or client credentials (application). For testing roles with user scenarios, you can change the role assignments for the user at the Enterprise Application for the LogicApp API. For testing roles with application scenarios, make sure that you only grant admin consent for the applicable roles you want to test.

Final Steps and Summary

This has been quite the long read. The goal of this blog post was to show how your Logic App workflows can be exposed as an API, and how Azure AD OAuth Authorization Policies can control who can send requests to the Logic App, as well as how you can use scopes and roles in the Access Token to make authorization decisions inside the Logic App. And even more importantly, integrating with Azure AD lets you control user/group access, as well as adding an additional security layer with Conditional Access policies!

My demo scenario was to let the Logic App call Microsoft Graph and return managed devices, which requires privileged access to the Graph API; by exposing the Logic App as an API I can now let end users/principals call that Logic App as long as they are authorized to do so using my defined scopes and/or roles. I can easily see several other Microsoft Graph API (or Azure Management API, etc.) scenarios using Logic Apps where I can control user access similarly.

Note also that any callers that now try to call the Logic App using the SAS access scheme will fail, as a Bearer Token is expected in the Authorization Header by the custom authorization actions that have been implemented. You might want to implement some better error handling for this if you like.

There’s an added bonus at the end of this article, where I add the filters for getting managed devices. But for now I want to thank you for reading and more article in this series will come later, including:

  • Calling Logic Apps protected by Azure AD from Power Platform
  • Protecting Logic App APIs using Azure API Management (APIM)

Bonus read

To complete the filtering of Managed Devices from Microsoft Graph, the Logic App prepared inputs for operatingSystem and osVersion in addition to userUpn. Let's see how we can implement that support as well.

After the initialize variable ManagedDevices action, add a Compose action. In this action, which I rename to operatingSystemFilter, I add a long dynamic expression:

This expression checks if the request trigger has an operatingSystem value; if not, the output will be an empty string, but if it is not empty I start building the filter string using the concat function. There are some complexities here, among others escaping a single apostrophe by adding another single apostrophe, etc. But this expression works:

if(empty(triggerBody()?['operatingSystem']),'',concat('/?$filter=operatingSystem eq ''',triggerBody()?['operatingSystem'],''''))

Next, add another Compose action and name it operatingSystemVersionFilter. This expression is even longer: it checks the request trigger for osVersion, and if empty, it just returns the operatingSystemFilter from the previous action; if a value is present, another concat appends an ‘and’ clause to the previous filter:

The expression from above image:

if(empty(triggerBody()?['osVersion']),outputs('operatingSystemFilter'),concat(outputs('operatingSystemFilter'),' and startswith(osVersion,''',triggerBody()?['osVersion'],''')'))
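To illustrate what the two Compose actions produce, here is a sketch with example input values (operatingSystem = Windows, osVersion = 10.0, purely illustrative):

/?$filter=operatingSystem eq 'Windows' and startswith(osVersion,'10.0')

This string is what gets appended to the managed devices Graph query in the next step.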

We can now add that output to the Graph queries, both when getting all or a specific user’s devices:

I can now add operatingSystem and osVersion to the request body when calling the Logic App:

And if I check the run history when testing the Logic App, I can see that the filter has been appended to the Graph query:

If you want, you can also build more error handling logic for cases where users specify a wrong user principal name, or for other filtering errors that may occur because of syntax, etc.

That concludes the bonus tip, thanks again for reading 🙂

Blog Series – Power’ing up your Home Office Lights: Part 10 – Subscribe to Graph and Teams Presence to automatically set Hue Lights based on my Teams Presence!

This blog post is part of the Blog Series: Power’ing up your Home Office Lights with Power Platform. See introduction post for links to the other articles in the series:
https://gotoguy.blog/2020/12/02/blog-series—powering-up-your-home-office-lights-using-power-platform—introduction/

In this last part of the blog series, we will build on the previous blog post, where we got the presence status from Teams into the Hue Power App. Now we will use Microsoft Graph and subscribe to change notifications for presence, so that both the Power App and my Hue lights will automatically change status based on the presence.

In this blog post I will reuse the Custom Connector I created for managing Microsoft Graph Subscriptions in this previous blog post: Managing Microsoft Graph Change Notifications Subscriptions with Power Platform | GoToGuy Blog

To follow along, you will need at least to create the App Registration and the Custom Connector referred to in that blog post. You can also import the Custom Connector OpenAPI Swagger from this link: <MY GITHUB URL TO HERE>

Adding the Custom Connector to the PowerApp

In the Hue Power App, go to the View menu, and under Data click “+ Add data source”. Add the MSGraph Subscription Connector as shown below, so that we can refer to it in the Hue Power App.

We are going to add the logic for creating the subscription based on this toggle button and the selected light source:

But before that, we have some things to prepare. When creating a Graph subscription, we need a webhook URL where Graph will send its notifications when the Teams presence changes. This will be handled in a Power Automate Flow, so let’s create that first.

Creating the Flow for Graph Notifications and Hue Lights changes

Since I have built this all before, ref. https://gotoguy.blog/2020/10/24/managing-microsoft-graph-change-notifications-subscriptions-with-power-platform/, I will make a copy of the “Graph Notification – Presence Change” flow for this use case.

If you want to follow along, you will need to follow the instructions in that blog post first. After saving a copy, or if you created a new flow from blank using HTTP request webhook, you will need to copy the HTTP POST URL shown below, as that will be the “notification url” we will refer to later:

Next set some value that only you know for the secret client state:

The next part of the Flow handles the first-time validation when the change subscription is created: Microsoft Graph sends a request with Content-Type text/plain and a validation token in the request query, and expects a response with status code 200 that returns the validation token. If that is returned, Microsoft Graph will successfully create the subscription.
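A minimal sketch of that validation handling, assuming the standard HTTP request trigger: in the validation branch, add a Response action with status code 200, a Content-Type header of text/plain, and the following expression as the body, which echoes the validation token back to Microsoft Graph:

triggerOutputs()['queries']['validationToken']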

For subsequent requests, we must return a 202 Accepted response, and in the next step I parse the notification request body, so that we can look into what change we have been notified for:

Following the change notification, we can start looking into the change value. First, I have added a verification of the secret client state I specified earlier; this is to prevent misuse of the notification URL if it becomes known by others or is used in the wrong context. After a simple test of the client state, where I do nothing if it doesn’t match, I can start building the logic behind the changes in presence status:

Inside the Switch Presence Status action I will, based on the availability status change, run a different case for each of the possible Teams presence status values (see part 9 of the blog series for an explanation of the different presence availability values):

Inside each of these cases I will define the light settings and colors, and after that call the Remote Hue API for setting the light. As you saw in part 8 of this blog series, in order to access the Hue API remotely I will need the following:

  • An Access Token to be included as a Bearer token in the Authorization Header
  • A username/whitelist identifier
  • The actual lightnumber to set the state for
  • A request body containing the light state, for example colors, brightness, on/off etc.

Remember, this Flow will be triggered from Microsoft Graph whenever there is a presence state change in Teams. So I need to be able to retrieve the access token, the username, and which light number I created the subscription for. This is how I will get them:

  • Access Token will be retrieved via the Logic App I created in part 4 of the blog series.
  • Username/whitelist identifier will be retrieved via the SharePoint List I created in part 6, see below image.
  • However, I do need to store the light number I will create the change notification for, and for this I will add a couple more columns to this list:

Customize the Configuration List for storing Subscription Details

I add the following two single-line-of-text columns to my List:

  • Subscribe Presence LightNumber. This will be the chosen Hue light I want to change when change notifications occur for Teams presence status.
  • Change Notification Subscription Id. This will be the Id I can refer back to when adding and removing the subscription when needed.

Customize the Flow for getting User Configuration and preparing Light States

Back to the Flow again, we need to add some actions. First, add an Initialize variable action after the initialize SecretClientState action, like here:

I set the type to Object and use the json function to create an empty JSON object (for example json('{}')). This variable will later be used for changing the light state.

Next, in the Flow after the “Parse Notification Body” action, add a “Get Items” action from SharePoint connector, and configure it to your site, list name and the following Filter Query:

My filter query uses the subscription id from the notification, retrieved with this expression:

first(body('Parse_Notification_Body')?['value'])?['subscriptionId']

This will be used to find which username and light have been set up for presence changes.
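As a sketch, the complete OData filter query on the Get Items action could then look like this, assuming the internal column name in my list is ChangeNotificationSubscriptionId (adjust to your own column name):

ChangeNotificationSubscriptionId eq '@{first(body('Parse_Notification_Body')?['value'])?['subscriptionId']}'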

Next, let’s set the variable for LightState. Inside each of the Switch cases, add a Set variable action, and set the variable to the chosen xy color code as a JSON object like the following. This is for the color green:

Do the same for all of the other cases; these are the values I have been using:

State / Color / JSON variable:

Away (Yellow):
json('{ "on": true, "bri": 254, "xy": [ 0.517102, 0.474840 ], "transitiontime": 0 }')

Available (Green):
json('{ "on": true, "bri": 254, "xy": [ 0.358189, 0.556853 ], "transitiontime": 0 }')

AvailableIdle (Green, 10% bright):
json('{ "on": true, "bri": 25, "xy": [ 0.358189, 0.556853 ], "transitiontime": 0 }')

Busy (Red):
json('{ "on": true, "bri": 254, "xy": [ 0.626564, 0.256591 ], "transitiontime": 0 }')

BusyIdle (Red, 10% bright):
json('{ "on": true, "bri": 25, "xy": [ 0.626564, 0.256591 ], "transitiontime": 0 }')

BeRightBack (Yellow):
json('{ "on": true, "bri": 254, "xy": [ 0.517102, 0.474840 ], "transitiontime": 0 }')

DoNotDisturb (Red):
json('{ "on": true, "bri": 254, "xy": [ 0.626564, 0.256591 ], "transitiontime": 0 }')

Offline (Grey, 10% bright):
json('{ "on": true, "bri": 25, "xy": [ 0.3146, 0.3303 ], "transitiontime": 0 }')

PresenceUnknown (White):
json('{ "on": true, "bri": 254, "xy": [ 0.3146, 0.3303 ], "transitiontime": 0 }')

PS! The transitiontime of 0 is to get as close to real-time updates as possible. Add the colors similarly for the rest of the states:

Get the Access Token and call Hue Remote API

Now we are ready to get the Access Token, and call the Hue Remote API with the selected light state.

First, add a HTTP action below the Switch Presence Status action, calling the Logic App previously created in part 4 of the blog series:

Next, add another HTTP request after, and this will call the Hue Remote API to set the lights:

When constructing the Hue API URI above, I retrieve the whitelist identifier (username) with the following expression:

first(body('Get_My_Hue_User_Subscription')?['Value'])?['WhitelistIdentifier']

And then which light number with this expression:

first(body('Get_My_Hue_User_Subscription')?['Value'])?['SubscribePresenceLightNumber']
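Putting it together, the HTTP action calling the Hue Remote API could be sketched like this. The endpoint format follows the Hue Remote API pattern used earlier in the series, and the LightState variable is the one initialized above, so adjust these to your own setup:

Method: PUT
URI: https://api.meethue.com/bridge/@{first(body('Get_My_Hue_User_Subscription')?['Value'])?['WhitelistIdentifier']}/lights/@{first(body('Get_My_Hue_User_Subscription')?['Value'])?['SubscribePresenceLightNumber']}/state
Headers: Authorization: Bearer <access token returned from the Logic App call in the previous step>
Body: @{variables('LightState')}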

That should be the completion of this Flow. Remember to turn it on if needed. We are now ready for the last step.

Adding the Flow for Creating the Graph Subscription

In the beginning of this blog post I referred to the toggle “Sync Hue Lights with Teams” in the Power App. Now that we have prepared the Flow handling the notification URL, we need to add a new Flow that handles creating or deleting Graph subscriptions. As this is a toggle control, setting it to “On” should create a Graph subscription for the selected lights, and setting it to “Off” should remove that subscription again.

Create a new instant Flow with PowerApps as trigger:

Next, add an action for initialize variable for getting the UserDisplayName:

Use the following expression for getting the user display name:

triggerOutputs()['headers']['x-ms-user-name']

Next, add another initialize variable for getting the light number. After renaming the action, click on “Ask in PowerApps”:

Add another Initialize variable action, this time with a boolean value, for getting the value of the sync toggle control:

Next, we will retrieve our user config source from the SharePoint list again, this time filtered by the UserDisplayName variable:

We now need some logic for the next steps of the Flow; let’s start with the toggle value, which will be either true or false:

If yes, that means we should either create a new Graph subscription or update any existing ones.

Under Yes, add a new action. This action will call the Custom Connector I created in another blog post (https://gotoguy.blog/2020/10/24/managing-microsoft-graph-change-notifications-subscriptions-with-power-platform/). This Custom Connector should have the following actions:

Click on the Get Subscription action to add that. For the subscriptionId parameter I will refer to the first returned instance of any existing Graph subscriptions I have in the SharePoint list returned earlier.

Here is the expression for your reference:

first(body('Get_My_Hue_User')?['Value'])?['ChangeNotificationSubscriptionId']

When running the Flow this action will either:

  • Return a 200 OK if the subscription is found, or if the SharePoint List has a blank value for ChangeNotificationSubscriptionId.
  • Return a 404 ResourceNotFound if the subscription is not found; this will happen if the subscription has expired. This is an error that will halt the Flow, so we need to handle it.

So let’s start by getting the status code. I’ll do that by adding a Compose action with the following expression:

outputs('Get_Graph_Subscription')['statusCode']

It’s also important to configure this action to run even if the Get Graph Subscription action fails (using “Configure run after”):

Lets add another condition, where the outputs of the status code should be 200 (OK):

A status code of 200 indicates either that we found an existing Graph subscription matching the configured subscription id from the SharePoint list, or that the list value is blank and Get Graph Subscription returned an empty array or any other Graph subscriptions you might have. In the first case we will just update the existing subscription; in the latter case we will create a new subscription. Let’s start by adding another Compose action:

Adding the Flow to the Hue Power App

Back in the Hue Power App, we can now link this Flow to the toggle control. With the sync toggle control selected, go to the Action menu, and then click on Power Automate. From there you should see the “Hue – Create Update Remove Graph Subscription” Flow; click to add it. On the OnChange event, add the Run call with its parameters, where the light number value is read from the dropdown list’s selected item and the toggle value is passed along, like this:
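As a sketch, the OnChange formula could look something like the following. The control names ddSelectLight and tglSyncPresence are hypothetical placeholders for your own dropdown and toggle controls, and the Flow reference name will match how Power Apps imports the Flow name, so verify it in the formula bar:

HueCreateUpdateRemoveGraphSubscription.Run(ddSelectLight.Selected.lightnumber, tglSyncPresence.Value)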

Managing Microsoft Graph Change Notifications Subscriptions with Power Platform

In a recent blog post, https://gotoguy.blog/2020/07/12/subscribing-to-teams-presence-with-graph-api-using-power-platform/, I described how you could create a subscription for change notifications for the Teams Presence API. In that example, I created a subscription using Graph Explorer. As every subscription has an expiry lifetime, depending on the resource type, you need some management for renewing subscriptions before they expire, and re-creating them if they have expired. I also cover deleting subscriptions that are no longer needed in that scenario.

In this blog post I’m going to use the powers of Power Platform to set this up. I’m going to do this from an interactive user’s perspective, using delegated permissions. Many Microsoft Graph API resources support application permissions, but not all do, especially those in the beta endpoint, so it makes sense for me to concentrate on delegated permissions for now.

Let’s get started. As in many scenarios with Microsoft Graph, I will start by creating an App Registration in Azure AD.

Register an App for Managing Graph Subscriptions

Log in to your Azure AD Portal (https://aad.portal.azure.com) and go to App Registrations to create a new one.

Depending on the settings in your tenant, users can be allowed to create their own application registrations:

If your admin has set this to “No”, you will need to either be a Global Admin or have been added to one of the roles that allow creating app registrations:

Create the following App registration; just give it a name and leave the other settings at their defaults:

Add to the permissions so that you have the following Microsoft Graph delegated permissions:

  • Presence.Read
  • Presence.Read.All
  • User.ReadBasic.All

Note that none of these permissions require admin consent. If you later want to manage subscriptions to Graph resources that require admin consent, you will need assistance from your Global Admin to consent to those permissions.

Next, on the Overview page, copy the Application (client) ID; you will need this later. Then go to Certificates & Secrets and create a new secret:

Copy this secret for later.

We will now use this app registration as our API to Microsoft Graph, using a Custom Connector in Power Platform.

Create a Custom Connector for Microsoft Graph

Initial creation and authentication

If you haven’t already, log in to flow.microsoft.com, and go to Data and Custom Connectors. The following procedure includes a lot of manual steps; if you want to import this connector and its actions from my swagger definition file, you will find a link at the end of this section.

Select to create a new Custom Connector, from blank. Give it a name, for example:

Provide an optional description, and specify graph.microsoft.com as Host:

Next, under Security, select OAuth 2.0 for Authentication and select Azure Active Directory as Identity Provider. Add the Application (client) ID from the App Registration as Client id, and the secret you created earlier as Client secret. Type https://graph.microsoft.com as the Resource URL, and under Scope add the permissions you granted so that the user can consent to them. Type each permission separated by a space:
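To summarize, the Security page would contain roughly the following values (the client id and secret are of course your own values from the app registration):

  • Authentication type: OAuth 2.0
  • Identity Provider: Azure Active Directory
  • Client id: the Application (client) ID from the app registration
  • Client secret: the secret created earlier
  • Resource URL: https://graph.microsoft.com
  • Scope: Presence.Read Presence.Read.All User.ReadBasic.All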

Now, click on Create connector and verify that it has been successfully created. You will now see that the Redirect URL field has been populated; copy this value:

Next, go back to your Azure AD App Registration, and under Authentication add a Web platform with the Redirect URI set to the copied value from the Custom Connector.

Let’s verify the Custom Connector before we get to the operations. Go to 4. Test, and select to create a New connection:

Next, you will be asked to authenticate with your Azure AD user, note that you will have to consent to the permission scopes we defined for the connector:

And you should successfully have a connection to Microsoft Graph, via the Custom Connector and our App Registration:

Now we need to add operations via 3. Definition; this is where we define the Microsoft Graph API queries for creating subscriptions and more.

Define Operations for the Connector

Basically this connector will be doing the same queries as I did with Graph Explorer in the preceding blog post, plus a few more. A pro tip is to use Graph Explorer for sample requests and responses when defining operations. In the following I will set up operations for:

  • GET /users/{userUpn}/?$select=id: Get a specific user and id
  • GET /communications/presences/{userId}: Get Presence for a specific user by id
  • POST /subscriptions: Create a new subscription
  • GET /subscriptions: List all my active subscriptions
  • GET /subscriptions/{subscriptionId}: Get a specific subscription
  • PATCH /subscriptions/{subscriptionId}: Update an existing subscription to renew
  • DELETE /subscriptions/{subscriptionId}: Delete a subscription

Get a specific user and id

We will start with the first operation. Go to Definitions and add a new Action. Specify a summary and operation id:

Next under Request, click on Import from sample. Specify a GET query that has the format: https://graph.microsoft.com/v1.0/users/{userUpn}?$select=id, like this and click import:

This will interpret the request URL and recognize that userUpn is a Path parameter variable and $select is a query parameter:

Next, go over to Graph Explorer, and run the same query specifying your own user principal name, copy the response:

Back in the Custom Connector, under Response, click on the default response and click Import from sample. Paste the response you copied from Graph Explorer and click Import:
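The pasted response will look something like this, where the id value below is just a placeholder guid:

{
  "@odata.context": "https://graph.microsoft.com/v1.0/$metadata#users(id)/$entity",
  "id": "00000000-0000-0000-0000-000000000000"
}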

Click on Update connector to save this action. You can now go to 4. Test. You can use the connection created earlier and select which operation you want to test; in this case specify your UPN and select id to be returned. Testing the operation should be successful:

With the first operation added, let’s add the remaining ones. I will skip the screenshots in the following; just follow the same steps using the details I summarize below.

Get Presence for a specific user by id
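A likely definition for this operation, mirroring the pattern above (the operation id GetPresenceForUser is the one I call from the PowerApp later, but verify against your own connector):

  • Action Summary: Get Presence For User
  • Action Operation Id: GetPresenceForUser
  • Request Verb: GET
  • Request URL: https://graph.microsoft.com/beta/communications/presences/{userId}
  • Request Response: (copy example response from running this in Graph Explorer)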

Create a new subscription

  • Action Summary: Create Subscription
  • Action Operation Id: CreateSubscription
  • Request Verb: POST
  • Request URL: https://graph.microsoft.com/beta/subscriptions
  • Request Body:
    {
    "changeType": "updated",
    "clientState": "{MySecretClientState}",
    "notificationUrl": "{WebhookUrl}",
    "resource": "/communications/presences/{UserId}",
    "expirationDateTime": "{ExpiryDateTime}"
    }
  • Request Response: (copy example response from running this in Graph Explorer)

List all my active subscriptions
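Mirroring the same pattern, this operation could be defined as follows (the operation id GetSubscriptions matches how I call it from the PowerApp later):

  • Action Summary: Get Subscriptions
  • Action Operation Id: GetSubscriptions
  • Request Verb: GET
  • Request URL: https://graph.microsoft.com/beta/subscriptions
  • Request Response: (copy example response from running this in Graph Explorer)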

Get a specific subscription

  • Action Summary: Get Subscription
  • Action Operation Id: GetSubscription
  • Request Verb: GET
  • Request URL: https://graph.microsoft.com/beta/subscriptions/{subscriptionId}
  • Request Response: (copy example response from running this in Graph Explorer)

Update an existing subscription to renew

  • Action Summary: Update Subscription
  • Action Operation Id: UpdateSubscription
  • Request Verb: PATCH
  • Request URL: https://graph.microsoft.com/beta/subscriptions/{subscriptionId}
  • Request Body:
    { "expirationDateTime": "2020-07-14T00:00Z" }
  • Request Response: (copy example response from running this in Graph Explorer)

Delete a subscription
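And a likely definition for the delete operation, following the same convention (the names here are assumptions):

  • Action Summary: Delete Subscription
  • Action Operation Id: DeleteSubscription
  • Request Verb: DELETE
  • Request URL: https://graph.microsoft.com/beta/subscriptions/{subscriptionId}
  • Request Response: a successful delete returns an empty 204 No Content response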

Custom Connector Summary

By now you should have 7 actions defined; remember to test them to verify that they work as expected:

NB! Currently I get an error when running GET /subscriptions/{id}, even though it is a correct subscription id and follows the official docs: https://docs.microsoft.com/en-us/graph/api/subscription-get?view=graph-rest-beta&tabs=http. I’ll leave the action in the connector, but testing it gives me an “internal storage error” both in Graph Explorer and when testing with this Custom Connector.

As mentioned earlier, I’ll provide you a link to the complete swagger definition of this connector and its actions here on my GitHub repository:

github.com/JanVidarElven

The Custom Connector can now be used in Power Automate Flows and Power Apps, so in the next section I will start looking into some logic for managing subscriptions.

Choosing a Configuration Source

First we need to look into some kind of configuration source for your Subscriptions to Graph Change Notifications. Consider the following:

  • When you create a subscription, you need somewhere to store the subscription id if you want to be able to update/renew or delete an existing subscription.
  • If you want to continuously renew a subscription, you might want to set an end date for how long you want a subscription to exist.
  • You will want to store the webhook url for where to send the change notifications, and possibly being able to change that if needed.
  • You will want to store the resource/resources you want to receive change notifications for, and for what type of change.
  • It makes sense to store the clientState secret so that it can be re-used when creating/re-creating subscriptions.

The next question is, where should I store this configuration info? There are some options. If you have access to an Azure subscription, you could of course store this in a SQL Database or Cosmos DB, to name a few common data stores, or maybe Azure Key Vault for some of the secret info. I want to focus on options typically available to end users though, so it makes sense to go more in the direction of Microsoft 365 services. You could use CDS, Common Data Service, part of the Power Platform, with solutions and custom entities, but that would require admin assistance in either configuring this for you, or giving your end user the permission roles to do it yourself. In addition, CDS will be available to all users with access to the environment, so creating a dedicated environment should also be part of that.

In this blog post I will go for the simplest option: I will store it in a list, using SharePoint Online lists. Another alternative could be Microsoft Lists. I will do this by creating a dedicated Team, going to the SharePoint Online team site, and creating the list from there.

So I have prepared this now, and my list contains the following columns: subscriptionId, changeType, clientState, notificationUrl and resource, all with field type single line of text, and an endDateTime with field type Date with Time:

This will be my configuration store for the following Flows and PowerApps that will manage my subscriptions using the Custom Connector created earlier.

Creating the PowerApp for Managing Subscriptions

My main user interface for managing Graph Subscriptions will be a PowerApp. From this PowerApp I will call actions from the Custom Connector directly, or call Power Automate Flows for where I will have more steps and logic.

Note that this PowerApp can be used to manage all sorts of Microsoft Graph Subscriptions you might have, not just for Team Presence API which is the main focus in this example but for other resources also.

Log in to make.powerapps.com to get started, and follow the general steps in the sections below.

Create a blank canvas app and main screen

Select to create a new Canvas app from blank, give it a name like below, and select the format. I will use Tablet format because I will mainly use this from my PC:

Next in the PowerApps creator studio you can set a Theme for your app, or use your own custom colors and backgrounds. Then you can start adding controls to the canvas. In the below image I have added a custom background, my own logo, some labels for headings and some buttons I will add actions to later. I have also added an icon for later being able to manually refresh.

This main screen will list my active Graph subscriptions on the left part, this will be queried from GET https://graph.microsoft.com/beta/subscriptions. The right part will be from my configuration source, which is the SharePoint list.

It is also a good practice to name the controls using your own naming convention, this is mine for the screen above:

Add Data sources to the Main screen

Next we will start adding the data sources. Go to the View menu, select Data sources, and click “Add data”. Search for “SharePoint” and add the following data source:

Connect with your user, or create a new connection if needed:

Next paste in the SharePoint URL for the SharePoint site where you created the list earlier, and click Connect:

Choose the list:

We now have a data source we can use for this list in the PowerApp:

Next, add a new control to your app, click Insert and under Layout find the “Data table (preview)” control. This will then be placed on your canvas, and you are prompted to select a data source. This is where I select the SharePoint list:

The data table control should now be automatically configured with the fields you created in the list:

To make the data table smaller I will remove some of the displayed fields: changeType, clientState and notificationUrl. This leaves me with the following data table, which I place under the configured subscriptions part. The list is empty now, but it will be filled in later when I create subscriptions:

Now let’s proceed to get the actual Graph subscriptions. Add a new data source, this time searching for the name of the Custom Connector we added earlier:

Add that source and choose a connection, and the data source should be added alongside the SharePoint data source:

We will add another Data table control to the Canvas, select Insert and Data table, this time selecting the Custom Connector as data source.

The Items property for the Data table needs to be set to: (MSGraphSubscriptionConnector.GetSubscriptions()).value, which will return a table with a row for every registered Graph subscription. This should look like this:

After that you can configure the fields to display for the data table, I have added the id, resource and expirationDateTime fields, and placed this data table under the “My Active Graph Subscriptions” part:

I now have the data connections I need for the main screen. The next part is to add actions to the buttons, but before that I will add another screen for where to specify more details for creating a subscription.

Before you proceed, select the main screen from the Tree view and duplicate screen:

I will use this copied screen as starting point for adding the next set of controls.

Creating a Screen for adding new subscriptions

In this screen I have added controls on the left side where you can type in a user principal name, get the user id and get the presence for that user. In this way I can check that I have the correct permissions required for subscribing to change notifications for the specified user.

On the right part of the screen I have added controls for adding a new Graph subscription.

Most of the controls above are label and text input controls. For User Id and Presence I have changed the Display mode to View, and for Notification URL I have changed from single line to multi-line.

The change type is a combo box, where I have added to the Items property: ["updated","created","deleted"]. The expire date time is a date picker control showing date and time in short format.

At the bottom left I have added an icon with a home symbol, and on the OnSelect event for that icon I have added a navigation back to the home screen:

Similarly, on the Home screen I have added a navigation on the “Create Subscription” button to the create subscription screen:

In the next section we will add some actions to the buttons.

Getting User Id and Presence

We will start on the left side of the screen, getting user id and presence for that user. When we created the Custom Connector earlier we created actions just for this scenario.

I will use the UpdateContext function in PowerApps to save the output of the called actions to a variable, and then use that variable for the text input property.

Select the “Get User Id” button, and type the following on the OnSelect event:
UpdateContext({UserId: MSGraphSubscriptionConnector.GetUserIdByUpn(txtUserUpn.Text)})

Like the following:

This will create/update my variable “UserId” by calling the Custom Connector action GetUserIdByUpn, using as input parameter the text value from the txtUserUpn text input box. When I click this button, it runs a query against Microsoft Graph that returns a user record object from which I can retrieve the .id property.

Next, set the Default property of the txtUserId text input to UserId.Id:

We can test this right away: click Preview the app (F5), type in an existing UPN, and press the Get User Id button; it should return the id for that user. You can also try several different user UPNs:

We will do something similar on getting presence. On the OnSelect event for the “Get Presence” button:
UpdateContext({UserPresence: MSGraphSubscriptionConnector.GetPresenceForUser(txtUserId.Text)})

Then set the Default property for the Presence text input to the returned presence value, for example UserPresence.availability:

When testing, you should be able to get the specified user’s presence by user id.

I’ll add one more thing: on the right side of the screen, for the resource text input, add the following to the Default property:
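A sketch of what that Default property could be, assuming the same resource format that is used when creating the subscription (/communications/presences/{id}):

"/communications/presences/" & txtUserId.Text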

This will pre-populate the resource based on the user I looked up on the left side, which will come in handy when I create the subscription later:

In the next part I will create the logic behind adding a new subscription, and this will be done using a Flow, as I will have several steps in this logic.

Creating Flow for adding new Graph Subscriptions

This Flow will be triggered from the PowerApp above, and use the Custom Connector action for adding a new Graph Subscription. In the same Flow I will also save the configuration of the subscription to my SharePoint list.

Start by going to flow.microsoft.com, select Create, Start from blank, and select Instant flow. Type a name for the Flow, for example “Create Graph Subscription”, and select PowerApps as the trigger as shown below:

Next, add the action Initialize variable right after the PowerApps trigger. Set the name of the variable and rename the action to the same. Do this 5 times, as we need input of changeType, resource, clientState, expirationDateTime and notificationUrl. Leave the Default value blank at first.

Pro tip: Make sure that you rename the actions before you do the next step: For each variable, for value select Ask in PowerApps:

The input parameters will now be named nice and recognizable, which will make it easier when calling it from PowerApps:

Before we proceed, we need to convert the date-time input from PowerApps to ISO 8601 format and UTC time zone, which is what the Graph API expects. Additionally, the subscription expiration can only be up to 4230 minutes in the future from creation. The thinking behind my PowerApp for adding new Graph subscriptions is that the expire date time is some time in the future, including any renewals. So for now I need to create a new variable that takes today’s date, adds the maximum of 4230 minutes, and converts it to ISO 8601 UTC format.

Add a new initialize variable action. Give it the following name, set type to string and use the following expression for using now as time and adding 4230 minutes in UTC format: addMinutes(utcNow(),4230), like this:

For the next action, select the MSGraph Subscription Connector, and the Create Subscription action:

You can now add the variables as input parameters to the create subscription action, remember to use the calculated variable for graph expire date time as shown above:

The next step will be to save this configuration to the SharePoint list. Add an action called “Create item” from the SharePoint connector like below. Select the SharePoint site address and the list name for the configuration source created earlier:

Title is required when adding a list item, so let’s use Ask in PowerApps for this one also:

For subscriptionId, select the “id” output from the Create Subscription action:

For the rest of the list values, select the initial variables:

The Flow is now ready to be called. I could have added more error handling and custom response if input data is missing, but I will proceed for now with this.

Calling the Flow from PowerApps

Back in the PowerApp, select the “Add Subscription” button, and on the Action menu select “Power Automate”:

Select the “Create Graph Subscriptions” Flow, which then will be added to the PowerApp, and you are prompted to provide the parameters we defined as “Ask in PowerApps” in the Flow:

Fill in the parameters like below:

Now you can press F5 to preview the app. Type in a UPN, and click to get the User Id. On the right side, make sure that the change type is set to “updated” (if you want to subscribe to presence change) and that the resource is filled in with the correct user id. From the date picker, select any future date, we will later use this date for how long the subscription should be auto-renewed. Last, add the notification URL for the change notifications. Then click on “Add Subscription”, which now should trigger the Flow:

Visually, nothing changes in the PowerApp, so we need to look at the run history for the Flow to see if it ran successfully:

We can also look at the steps that indeed a Graph subscription was created and a list item was added to SharePoint:

While both the Graph subscription and the list have been updated, we need a way to refresh that in the PowerApp. We can use the Refresh(data source) function for the SharePoint list, but that won’t work for the Custom Connector; we will get to that later. Let’s start with the SharePoint list. On the refresh icon I added to the main screen, on the OnSelect event, add the following: Refresh('Change Notification Subscriptions');, like this:

If you now preview the app and click the refresh button, the right data table should now show the SharePoint list item that has been created (PS! If you have a dark background theme, make sure that the data table control has a light background so you can see the text):

Next, for refreshing the Graph subscriptions via the connector, I will change to use a Collection and the ClearCollect method. On the refresh icon, add a new line like this: ClearCollect(GraphSubscriptions,(MSGraphSubscriptionConnector.GetSubscriptions()).value), as shown below:

Next, change the Items property for the left data table for Graph subscriptions to use this Collection:

This should now show the active Graph subscriptions like below:

Now that we have a way to manually refresh these two data tables, using the refresh icon, we can add the same logic to the “Add Subscriptions” button:

Here is the above in clear text; the Set(wait, ...) pattern is a nice way to show a busy status in PowerApps for potentially long-running tasks:

Set(wait,true);
CreateGraphSubcription.Run(dropDownChangeType.SelectedText.Value,txtResource.Text,txtClientState.Text,dateExpireDateTime.SelectedDate,txtNotificationUrl.Text,(txtUserUpn.Text & " Presence"));
Refresh('Change Notification Subscriptions');
ClearCollect(GraphSubscriptions,(MSGraphSubscriptionConnector.GetSubscriptions()).value);
Set(wait,!true)

At this point we have the solution ready for creating Graph Subscriptions and refreshing that in the data tables. Next we need to create Flows for updating and deleting subscriptions.