Category Archives: DevOps

Automatic Provisioning and Deprovisioning of Copilot for Security Capacity Unit

At the time of writing this blog post, it’s only the first week after Microsoft Copilot for Security went GA on April 1st, and already the #Security #Community is creating scripts, automations and tools for provisioning or deprovisioning the Secure Capacity Units (SCU) that are required for running Copilot for Security in your environment.

A potential cost of $4 per hour per SCU can amount to $2,920 per month if you count a standard 730 hours per month, so the motivation is clear: save cost and use the service only when you need it.

I took inspiration from https://thoor.tech/Copilot-for-Security-deploy-and-destroy/, and decided to create my own solution based on Bicep, Deployment Stacks, and Azure DevOps Pipelines to automate creating an SCU on weekday mornings and destroying it again in the afternoon.

Disclaimer: This is a concept for saving cost in my own sponsored development and demo subscription, and the effects of removing and recreating Secure Compute Units in a Production environment must be carefully evaluated.

Bicep and Deployment Stacks

I wanted to use Bicep, and when I deploy Bicep I often use a main.bicep file that deploys at subscription level, creating resource group(s) as well as the contained resources, which I place in modules.

Deployment Stacks, which are currently in Public Preview, let you deploy a set of resources grouped together, and I highly recommend reading this blog post by Freek Berson to understand more of the value: https://fberson.medium.com/deployment-stacks-for-bicep-are-awesome-a-full-walkthrough-sneak-peek-and-of-whats-coming-soon-4461078ec56a

I decided to use Deployment Stacks because I wanted a declarative approach, not only for the actual deployment of the Secure Compute Unit, but also for removing the resource from the resource group when it becomes unmanaged.

I think the code referenced at the end of this blog post is mostly self-explanatory, but I will highlight a couple of elements. The first is using conditional deployment of the module in Bicep like this:

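The relevant excerpt from main.bicep (the full file is in the Gist reference at the end of this post) is the if condition on the module declaration:

// Conditionally deploy the Secure Compute Unit module based on the deploySecureCapacityUnit parameter
module scu 'secure-capacity/secure-compute-unit.bicep' = if (deploySecureCapacityUnit) {
  name: capacityName
  scope: resourceGroup(rg.name)
  params: {
    capacityName: capacityName
    geo: capacityGeo
    crossGeoCompute: 'NotAllowed'
    numberOfUnits: defaultNumberOfUnits
    resourceTags: defaultTags
  }
}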
From the snippet above I can control whether the module will be managed in the deployment or not when I deploy the Bicep.

When running Bicep with Deployment Stacks, you can control what happens to resources that become ‘unmanaged’. This is how I run the deployment when I do NOT want the Secure Compute Unit for Copilot for Security capacity deployed, basically controlling the CREATE operation by setting the parameter to true, and the DELETE operation by setting it to false:

az stack sub create --location WestEurope --name "stack-scu-elven-we" --template-file .\main.bicep --parameters deploySecureCapacityUnit=false --deny-settings-mode none --action-on-unmanage deleteResources

The magic parameter here is --action-on-unmanage, which I set to deleteResources; this will delete all resources that fall out of the Bicep template (either by condition or by removing the resource from the file). You can also specify detachResources, which will keep the resources, or deleteAll, which will also remove the Resource Group (which I do not want in my scenario).
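For example, if I instead wanted unmanaged resources to be kept rather than deleted, the same command could be run with detachResources (shown here only as a variation of the command above, not something I use in this setup):

az stack sub create --location WestEurope --name "stack-scu-elven-we" --template-file .\main.bicep --parameters deploySecureCapacityUnit=false --deny-settings-mode none --action-on-unmanage detachResources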

PS! Note that I run az stack sub create, which lets me run a Bicep deployment that creates resource groups as needed (yes, I might be a bit colored by my much longer Terraform experience, where I also create resource groups).

In the Azure Portal, when I run a deploy that creates a SCU, it will look like this in the Deployment Stack:

Note the action on unmanage:

And when I set the parameter to false, to delete the SCU, the result is this:

Azure DevOps Pipelines

With the Bicep deployment using Az CLI verified successfully, it should be straightforward to create pipelines for deploying and destroying. The full YAML files are referenced below and should be easy to understand. The main components and requirements are:

  • Using a CRON expression, and not a trigger, to schedule when to run the pipeline. Note the use of always: true for the schedule, because I want the schedule to run even if the files in the repo have not changed.
  • Remember that you need to set up a Service Connection (Workload Identity Federation highly recommended), with appropriate permissions to create resources in your target Azure subscription.

If you look closer, you can see that the two pipelines, one for deploy and one for destroy, are almost exactly the same, with the exception of the default value (true or false) for deploySecureCapacityUnit, and of course the different morning and afternoon schedules.

FEATURE WISH: I would have liked to be able to use multiple schedules in the same YAML pipeline, specifying a different default value for the deploySecureCapacityUnit parameter based on the schedule. Then I would be able to have just one pipeline instead of two.

I’ll leave the rest to you; see the files I have published as Gists on my GitHub account https://github.com/janvidarelven.

Gist Reference

The complete Gist is linked below and is public, so feel free to clone, fork, or submit a pull request if you want to contribute.

Gist consists of 4 files:

  • main.bicep – Parameters, variables, resource group and a module for Secure Compute Unit
  • secure-compute-unit.bicep – Module definition for deploying a Secure Compute Unit for Copilot for Security
  • deploy-security-copilot.yml – Pipeline with schedule for deploying Bicep from above files using Deployment Stack, creating a Secure Compute Unit.
  • destroy-security-copilot.yml – Pipeline with schedule for deploying Bicep from above files using Deployment Stack, deleting the Secure Compute Unit.

name: CD-$(rev:r)-Deploy-Security-Copilot-$(Date:dd.MM.yyyy) # build numbering format

trigger: none

schedules:
- cron: "0 7 * * 1-5"
  displayName: Morning weekdays
  branches:
    include:
    - main
  always: true

parameters:
- name: azureServiceConnection
  default: serviceconn-<your-wif-connection>
- name: azureSubscriptionTarget
  default: '<your-sub-name-or-id>'
- name: deploySecureCapacityUnit
  type: boolean
  default: true

pool:
  vmImage: windows-latest

variables:
- name: deploymentDefaultLocation
  value: westeurope
- name: deploymentBicepTemplate
  value: .\SecurityCopilot-Bicep\main.bicep

jobs:
- job:
  steps:
  - task: AzureCLI@2
    displayName: 'Deploy Security Copilot Compute Unit'
    inputs:
      azureSubscription: '${{ parameters.azureServiceConnection }}'
      scriptType: pscore
      scriptLocation: inlineScript
      inlineScript: |
        az --version
        az account set --subscription '${{ parameters.azureSubscriptionTarget }}'
        az stack sub create `
          --location $(deploymentDefaultLocation) `
          --name "stack-scu-yourorg-we" `
          --template-file $(deploymentBicepTemplate) `
          --parameters deploySecureCapacityUnit=${{ parameters.deploySecureCapacityUnit }} `
          --deny-settings-mode none `
          --action-on-unmanage deleteResources

name: CD-$(rev:r)-Destroy-Security-Copilot-$(Date:dd.MM.yyyy) # build numbering format

trigger: none

schedules:
- cron: "0 14 * * 1-5"
  displayName: Afternoon weekdays
  branches:
    include:
    - main
  always: true

parameters:
- name: azureServiceConnection
  default: serviceconn-<your-wif-connection>
- name: azureSubscriptionTarget
  default: '<your-sub-name-or-id>'
- name: deploySecureCapacityUnit
  type: boolean
  default: false

pool:
  vmImage: windows-latest

variables:
- name: deploymentDefaultLocation
  value: westeurope
- name: deploymentBicepTemplate
  value: .\SecurityCopilot-Bicep\main.bicep

jobs:
- job:
  steps:
  - task: AzureCLI@2
    displayName: 'Deploy Security Copilot Compute Unit'
    inputs:
      azureSubscription: '${{ parameters.azureServiceConnection }}'
      scriptType: pscore
      scriptLocation: inlineScript
      inlineScript: |
        az --version
        az account set --subscription '${{ parameters.azureSubscriptionTarget }}'
        az stack sub create `
          --location $(deploymentDefaultLocation) `
          --name "stack-scu-elven-we" `
          --template-file $(deploymentBicepTemplate) `
          --parameters deploySecureCapacityUnit=${{ parameters.deploySecureCapacityUnit }} `
          --deny-settings-mode none `
          --action-on-unmanage deleteResources

targetScope = 'subscription'

// If an environment is set up (dev, test, prod…), it is used in the application name
param environment string = 'dev'
param applicationName string = 'security-copilot'
param location string = 'westeurope'
param resourceGroupName string = 'rg-sec-copilot-scu-we'
param capacityName string = 'scu-<yourorg>-we'
param capacityGeo string = 'EU'

// Some params for provisioning the secure capacity unit, and if it should be deployed or not
param defaultNumberOfUnits int = 1
param deploySecureCapacityUnit bool = true

var defaultTags = {
  Environment: environment
  Application: '${applicationName}-${environment}'
  Dataclassification: 'Confidential'
  Costcenter: 'AI'
  Criticality: 'Normal'
  Service: 'Security Copilot'
  Deploymenttype: 'Bicep'
  Owner: 'Jan Vidar Elven'
  Business: 'Elven'
}

resource rg 'Microsoft.Resources/resourceGroups@2021-04-01' = {
  name: resourceGroupName
  location: location
  tags: defaultTags
}

// Deploy the secure capacity unit module dependent on the deploySecureCapacityUnit parameter
module scu 'secure-capacity/secure-compute-unit.bicep' = if (deploySecureCapacityUnit) {
  name: capacityName
  scope: resourceGroup(rg.name)
  params: {
    capacityName: capacityName
    geo: capacityGeo
    crossGeoCompute: 'NotAllowed'
    numberOfUnits: defaultNumberOfUnits
    resourceTags: defaultTags
  }
}

// Secure Compute Unit – Bicep module
// Created by – Jan Vidar Elven

@description('The name of the Security Copilot Capacity. It has to be unique.')
param capacityName string

@description('A list of tags to apply to the resources')
param resourceTags object

@description('Number of Secure Compute Units.')
@allowed([
  1
  2
  3
])
param numberOfUnits int

@description('If Prompts are allowed to cross default region for performance reasons.')
@allowed([
  'NotAllowed'
  'Allowed'
])
param crossGeoCompute string

@description('Prompt evaluation region. Allowed values are EU, ANZ, US, UK.')
@allowed([
  'EU'
  'ANZ'
  'US'
  'UK'
])
param geo string

var locationMap = {
  EU: 'westeurope'
  ANZ: 'australiaeast'
  US: 'eastus'
  UK: 'uksouth'
}
var location = contains(locationMap, geo) ? locationMap[geo] : 'defaultlocation'

resource Copilot 'Microsoft.SecurityCopilot/capacities@2023-12-01-preview' = {
  name: capacityName
  location: location
  properties: {
    numberOfUnits: numberOfUnits
    crossGeoCompute: crossGeoCompute
    geo: geo
  }
  tags: resourceTags
}

Connect to Microsoft Graph in Azure DevOps Pipelines using Workload Identity Federation

Microsoft recently announced that Workload Identity Federation for Azure Pipelines now is in Public Preview: https://devblogs.microsoft.com/devops/public-preview-of-workload-identity-federation-for-azure-pipelines/.

This opens up a lot of scenarios for Azure service connections, without the need to manage secrets for service principals, and provides better security as there are no secrets that can be exposed or exfiltrated.

As I work a lot with Microsoft Graph and automation, I wanted to see if and how I could use Workload Identity Federation to connect to and send queries to Microsoft Graph using Azure Pipelines.

Create the Workload Identity Federation Service Connection

First of all, I need to create a service connection in my Azure DevOps project that will use the new Workload Identity Federation. To be able to do this, you need to have access to the preview functionality, see details here in this learn article: Create an Azure Resource Manager service connection using workload identity federation.

When you have access to the feature, you can create a new Workload Identity federation service connection using either manual or automatic configuration:

I will now choose the Azure Subscription, and optionally a Resource Group. Choosing a resource group is a good idea, as the service connection will be given Contributor access only to that Resource Group, and not the whole subscription. But it also depends on what you want to use your Service Connection for, in my case it is a demo scenario for Microsoft Graph Access, so it makes sense to scope the permissions down:

After creating the Service Connection, I can find it in my Entra ID tenant. Let’s look at the role assignments for the Resource Group first:

The service principal has been given the name <DevOps Org>-<DevOps Project>-<guid>, and has been assigned Contributor access to that RG.

Next, let’s find the App Registration for the Service Connection. As you can see below, no (0) secret or certificate credentials have been created, but a Federated credential has been created:

If we look at the details for the federated credential, we can see the issuer, subject and audience, and confirm that this service principal can only be accessed by the service connection in Azure DevOps:

Next, go to API permissions. Here I will add a Microsoft Graph permission, so that we can use that for queries in the pipeline later. In my case I add the Application permission User.Read.All, so I can look up user information:
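As an alternative to clicking through the portal, a minimal Azure CLI sketch of adding and consenting to the same permission could look like the following. Note that this is an illustrative assumption on my part, not part of the walkthrough above: replace <app-id> with the appId of the service connection’s app registration, and the GUID used for --api-permissions is the well-known ID of the Microsoft Graph User.Read.All application role.

# Add the Microsoft Graph User.Read.All application permission to the app registration (illustrative sketch)
az ad app permission add --id <app-id> `
  --api 00000003-0000-0000-c000-000000000000 `
  --api-permissions df021288-bdef-4463-88db-98f22de89214=Role

# Grant admin consent for the added permission
az ad app permission admin-consent --id <app-id>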

We are now ready to set up an Azure Pipeline to use this service connection.

Create the Azure Pipeline to access Microsoft Graph API

In your DevOps project, if this is a new project, make sure that you initialize the Repository, and that you have at least a Basic or Visual Studio access level, then head to Pipelines and create a “New Pipeline”. For my environment I will just choose the following steps:

  1. Select Azure Repos Git (YAML)
  2. Select my repository
  3. Use a starter pipeline (or you can choose an existing if you have)

This is a sample YAML pipeline that will use the service connection (see picture below) to get an access token for Microsoft Graph, and then use that access token to connect with the Graph PowerShell SDK. In my example I’m just showing how to get some simple user information:

There are different ways you can go about this; in my case I was just using Azure CLI in one task to get the access token for the Graph resource type. (You can also use the Az PowerShell task for this, by the way.) I also set and secure the variable for use in later steps in the pipeline job.

In the next task I use PowerShell Core to convert the token to a secure string, and then install the required Microsoft Graph PowerShell modules. I can then connect to Graph and get user information. Here is the complete YAML code:

# Pipeline for accessing Microsoft Graph using Federated Workload Identity Credential
# Created by Jan Vidar Elven, Evidi, 15.09.2023

trigger:
- none

pool:
  vmImage: windows-latest

steps:
- task: AzureCLI@2
  displayName: 'Get Graph Token for Workload Federated Credential'
  inputs:
    azureSubscription: 'wi-fed-sconn-ado-to-msgraph'
    scriptType: 'pscore'
    scriptLocation: 'inlineScript'
    inlineScript: |
      $token = az account get-access-token --resource-type ms-graph
      $accessToken = ($token | ConvertFrom-Json).accessToken
      Write-Host "##vso[task.setvariable variable=secretToken;issecret=true]$accessToken"
- task: PowerShell@2
  displayName: 'Connect to Graph PowerShell with Token'
  inputs:
    targetType: 'inline'
    script: |
      # Convert the secure variable to a secure string
      $secureToken = ConvertTo-SecureString -String $(secretToken) -AsPlainText

      # Install Microsoft Graph Modules required
      Install-Module Microsoft.Graph.Authentication -Force
      Install-Module Microsoft.Graph.Users -Force

      # Connect to MS Graph
      Connect-MgGraph -AccessToken $secureToken

      # Get User Info
      Get-MgUser -UserId "[email protected]"
      
    pwsh: true

I can now try to run the pipeline. At first run you will have to validate and permit access to the service connection from the pipeline:

And then I can verify that it indeed can connect to the Graph via PowerShell SDK and get my resources via the Workload Identity Federation service connection:

Summary and Usage Scenarios

Most will use the new Workload Identity Federation for Azure Pipelines to access Azure subscriptions and resources, but I have shown that as long as it uses the Entra ID authentication platform and OIDC, it is possible to get access tokens for other APIs as well, in this case the Microsoft Graph API.

You can use the Graph API to get information about your tenant, to enrich and complement your existing CI/CD pipelines, or in some cases to automate consistent deployments for Graph resources as well, for example important settings and policies.

How to Send Requests to GitHub API from Power Platform using Custom Connector

Recently I came across a personal scenario where I use Hugo and GitHub Pages as a team site for a soccer team I’m coaching, and I wanted to automate some updates to the web site. I’ve previously written a blog post on how I organized trainings at home using Power Platform: How I as a Soccer Coach…. | GoToGuy Blog, and I am now using GitHub Pages and Hugo for publishing some statistics and more for that scenario.

In this blog post I will show how I:

  1. Created an OAuth Application for Github API.
  2. Created a Custom Connector in Power Platform for connections to that OAuth Application.
  3. Created Operations for getting content, updating content and triggering workflows for Github Actions.
  4. Connected to Github API using my Azure AD account and user impersonation.
  5. Created a Power Automate Cloud Flow for using the Custom Connector and the defined operations.

Let’s get started!

Create OAuth Application for Github API

Start by logging in to your GitHub account and go to Settings. Under Settings you will find Developer Settings where you can access OAuth Apps. You can also go directly to the following URL https://github.com/settings/developers.

Click to Register a new application, and fill in something like the following:

As the above image shows, give the application a descriptive name for your scenario. You can type any homepage URL; this is not important in this scenario. The authorization callback URL is important though, as this will be the callback to the Custom Connector we will create later. We can verify the URL later, but use https://global.consent.azure-apim.net/redirect.

Register the application. Next you can change the settings for the registered app. You will have to copy the Client ID, as we will need that later. You also need to create a Client Secret; make sure to copy that as well, as you will only be able to see it once. You can also change some settings like name, logo and branding if you like. This is how my GitHub App registration looks now:

We can now proceed to Power Platform to create the Custom Connector.

Create Custom Connector to Github API in Power Platform

Log in to your Power Platform environment, and go to Custom Connectors under Data. Click to create a New custom connector. You can select to create from blank if you want to follow along the steps in my blog post here, or you can select to import an OpenAPI for URL, as I will provide the swagger file at the end of this blog post.

Give the connector a name of your choice and continue:

Next you need to specify “api.github.com” as host. You can also optionally upload a connector icon, as I have done here:

(You can grab the mark logo used above from here, GitHub Logos and Usage, note the usage requirements).

Next, go to Security. Select OAuth 2.0 as authentication type, and then select GitHub as Identity Provider.

(PS! You can select Generic OAuth 2 as well, but it will eventually fall back to GitHub as Identity Provider anyway.)

Add your Client ID and Secret from the Github OAuth application registration:

It is important to configure the correct scope (or scopes), as this will authorize the client for accessing the API. If you leave the scope blank, you will only get public read-only access. You can read more on available scopes here: Scopes for OAuth Apps – GitHub Docs

In my case I want to have full read and write access to public repositories, as well as read write to user profile, so I set the scope to “public_repo user” (use space delimiter for multiple scopes):

I can now click “Create connector”. After creating the connector, the security details are hidden/disabled, and I can verify that the Redirect URL is the same as the Callback URL from the GitHub OAuth app registration:

We can now start defining the operations for the actions I want to do against the GitHub API.

Create Operations for sending requests to GitHub API

When querying and sending requests to the GitHub API, you need to know the API details and required parameters for what you want to do. The following link is the official GitHub REST API reference: Reference – GitHub Docs.

In my example I want to define the following 3 operations in my Custom Connector:

  1. Get Repository Content
  2. Update File Contents
  3. Dispatch Workflow Event

Under 3. Definition, select to create a New action, and call it something like “Get Repository Content” with the Operation ID set to “GetRepositoryContent”:

Then, under Request, click Import from sample. Select the Verb GET, and under URL type https://api.github.com. The rest of the query we will get from the GitHub API docs. Copy the following from the REST API reference docs:

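Based on the connector definition at the end of this post, the sample to copy is the GET request for repository contents; the Accept header value shown here is the one GitHub recommended at the time of writing, so verify it against the current docs:

GET https://api.github.com/repos/{owner}/{repo}/contents/{path}
Accept: application/vnd.github.v3+json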
So that your sample request now looks like this, remember to add the recommended Accept header:

Click Import. The request will now ask for owner, repo and path as parameters:

Next, click the default response. Here you can copy the sample response from the REST API docs, I’ve copied the sample response for getting file contents:

After that click “Update connector” and we have the first action operation defined.

Click New action again, this time for updating file contents:

For the sample request the Verb is PUT, the URL is the same as when getting file content, but now we need to specify a request body as well:

I’ve created the sample request body based on the docs reference, with just empty placeholder values for the parameters needed. Some of these can be omitted, but message, content, sha and branch are required for updating an existing file:

{
 "message": "",
 "content": "",
 "sha": "",
 "branch": "",
 "committer": {
  "name": "",
  "email": "",
  "author": {
   "name": "",
   "email": ""
  }
 }
}

After importing the sample request, you can click into the body parameter and set the body itself to required, as well as the payload parameters that you always want to include, as shown below:

Add a sample default response as well, I’ve copied the example response for updating a file from the docs.

Click “Update connector” again and we are ready to add the third action:

This will be a POST request, with the following URL and request body:

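Reconstructed from the connector definition at the end of this post, the sample URL and request body for the dispatch operation look like this:

POST https://api.github.com/repos/{owner}/{repo}/actions/workflows/{workflow_id}/dispatches
Accept: application/vnd.github.v3+json

{
 "ref": "",
 "inputs": {}
}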
Note from above that “ref” needs to reference a branch or tag name, as it is a required parameter. “inputs” is an object whose contents depend on whether your GitHub Actions workflow defines input parameters, so in many cases it can be empty.

You can leave the default response as it is, as the API will return 204 No Content if the request is successful.

Click on “Update connector” again, and you should now have 3 actions successfully configured.

We can now proceed to create a connection and authenticate to GitHub API using this custom connector.

Connect to Github API using my Azure AD account and user impersonation

Go to “4. Test”, and click to create a “New connection”. This will create a new authentication popup, and if you’re not already logged in to GitHub you must log in first. Note the correct reference and branding to the “Elven Power Platform OAuth App”:

After logging in I’m prompted to authorize the OAuth app to access data in my account. Note that the scopes “public_repo” and “user” are shown in the authorization request:

If you own other organizations you can grant access to those as well. Click Authorize “OwnerName” as shown below:

After authorizing you will be redirected back to the Connections, and you should be able to successfully get a new connection object.

Let’s take a look at GitHub settings again, under https://github.com/settings/applications. You should see the OAuth App and the correct permissions configured if you click into details. You can also revoke the access if you need to remove it or reconfigure the scopes for example:

Let’s do a test from the Custom Connector and see what we get. Click on GetRepositoryContent and provide the parameters for “owner” (your GitHub account name), “repo” (any repository, I’m using my GitHub Pages repo here), and a “path” to an existing file in that repo (I’m just testing against my README.md at root, but this can be any subfolder\file as well). Click Test operation and see:

This should be successful, note that the response contains a couple of important values for later, the “sha” for the existing file, and the “content” which is a base64 representation of the current contents of the README.md file.
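An abbreviated, illustrative view of such a response (placeholder values, not my actual repository data) could look like this:

{
 "name": "README.md",
 "path": "README.md",
 "sha": "<sha-of-the-existing-file>",
 "size": 123,
 "content": "<base64-encoded-file-content>",
 "encoding": "base64"
}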

Click on the Request tab, and you will see a preview of how the request was constructed. You will also see the Authorization Header with the Bearer Token:

A couple of important things to note:

  • The request uses an API gateway in Azure APIM, not GitHub directly.
  • The Bearer Token in the Authorization Header is for the Azure API GW audience, so it cannot be used directly against GitHub API.

Copy the entire token value, from after “Bearer <token……>”, and paste it into a JWT debugger like jwt.io. From there we can look at the decoded payload:

From that payload it’s clear that the token has been issued by my Azure AD tenant and for my logged-on user in Power Platform. The scope is user_impersonation, so this will be used in an on-behalf-of flow scenario via the audience defined as apihub.azure.com, which in turn will request resources from the GitHub API on my behalf via the APIM gateway used by Power Platform.

You can also look up the appid from the token in the Azure AD tenant, where you will find the following Enterprise Application, from which you can enable or disable it on an organization level, or examine the sign-in logs:

We can test the other operations as well, but let’s create a Flow for that scenario.

Create a Power Automate Cloud Flow for using the Custom Connector to Get and Update File Content

Create a new Cloud Flow, using an instant trigger for manually triggering a flow. Add some inputs like shown below:

Next, add a new action and from under Custom find the GitHub Custom Connector:

Add the “Get Repository Content” action and then fill in the inputs like below:

Next, add a Compose action, with the following dynamic expression:

base64ToString(outputs('Get_Repository_Content')?['body/content'])

This is just for checking what the existing file content is:

We can do a quick Save and then Test the Flow so far. From the Run history I should get the correct inputs, and when the existing file is found, the outputs will include the sha value of the existing file as well as the base64 encoded value of the content:

And when looking at the decoding of the content I can see that the readme.md file content is shown correctly:


Go into Flow edit mode again, and add another Compose action, this time we need to base64 encode the new content I want to update the file with:

Note that the base64 function takes the input trigger as its parameter, base64(triggerBody()?['text']), as this is the first text parameter of the trigger.

Add a new action, this time from the Custom Connector again: Update File Contents. Specify the owner, repo and path from the previous input values, type a custom message for the commit message, select the output from the “Base64 Updated Content” action as the content, and use the sha value from “Get Repository Content”. The rest of the values (committer, author objects) are optional:

Save and then do another test, for example like the following to update the README.md file:

And the test should be successful:

I can also verify this in my repository and check that the file has been updated. Note also the commit message:

Triggering a GitHub Actions Workflow

The last thing I wanted to go through in this blog post is using the Power Platform Custom Connector to trigger a GitHub Actions workflow. My use case for this is to start a Hugo build when I have dynamically updated files for my static website, but for now I will keep it simple.

I have via a basic template created a simple workflow like this:

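For reference, a minimal sketch of such a workflow, based on the standard GitHub starter template (here named blank.yml; the actual steps in my workflow may differ):

# blank.yml - simple starter workflow with both push and manual triggers
name: CI

on:
  # Run automatically on push to the main branch
  push:
    branches: [ main ]
  # Allow the workflow to be triggered manually or via the API
  workflow_dispatch:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Run a one-line script
        run: echo "Hello, world!"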
This workflow can also be triggered manually using workflow_dispatch, so let’s use that to verify that I can call it from Power Platform.

Add a new action at the end of the Flow, adding the Custom Connector action for Dispatching Workflow event:

Specify owner and repo from the inputs, and for workflow id either specify the ID or the name of the workflow file, in this case blank.yml. The ref parameter is either a branch or tag name, so in my case I use the main branch. I leave the other parameters blank, as I don’t have any inputs to supply, and use the default Accept header.

Save and Test the Flow again, supplying an updated file content, owner, repo and path similar to what we did previously. When the Flow runs it should complete successfully:

If I go to my GitHub repository, and under Actions, I can see that this workflow has been triggered:

Actually it has been triggered twice, as the first trigger is the automatic one for the push commit on the file update, and the other (named “CI” in the results) is the actual workflow dispatch from the Flow.

Basically this means that I can choose different logic for when my workflows will trigger, either as a push or pull trigger, or as a trigger event based on my Flows. But of course I won’t normally run both triggers 😉

I now have what I need for working further with my personal Hugo and GitHub Pages project; my plan is to update data and asset files from my Power Platform environment, and then trigger a Hugo build for my website. I might blog more on that process later.

Summary and some last thoughts

In this blog post I wanted to show how you can work with the GitHub REST API via a Power Platform Custom Connector. This way you can basically achieve anything the GitHub API makes available, provided the correct scope(s) have been authorized.

I do want to mention however that there is a GitHub Connector you can use directly in Power Automate, Logic Apps, or Power Apps: GitHub – Connectors | Microsoft Docs, where you can create a direct connection to your GitHub account. You should take a look at that if it can serve your needs.

In my case I needed the API to get or update file contents directly. In addition, when using impersonation, people in my organization can use their own Azure AD accounts if I share the Custom Connector with them; they don’t need their own GitHub accounts as long as the OAuth App has been authorized on my behalf.

If you want a quickstart on creating the Custom Connector yourself, the Swagger definition is below. Thanks for reading, hope it has been useful!

swagger: '2.0'
info: {title: JanVidarElven Github Connector, description: GitHub API Connector for JanVidarElven, version: '1.0'}
host: api.github.com
basePath: /
schemes: [https]
consumes: []
produces: []
paths:
  /repos/{owner}/{repo}/contents/{path}:
    get:
      responses:
        default:
          description: default
          schema:
            type: object
            properties:
              type: {type: string, description: type}
              encoding: {type: string, description: encoding}
              size: {type: integer, format: int32, description: size}
              name: {type: string, description: name}
              path: {type: string, description: path}
              content: {type: string, description: content}
              sha: {type: string, description: sha}
              url: {type: string, description: url}
              git_url: {type: string, description: git_url}
              html_url: {type: string, description: html_url}
              download_url: {type: string, description: download_url}
              _links:
                type: object
                properties:
                  git: {type: string, description: git}
                  self: {type: string, description: self}
                  html: {type: string, description: html}
                description: _links
      summary: Get Repository Content
      operationId: GetRepositoryContent
      description: Get File or Folder Content from Repository
      parameters:
      - {name: owner, in: path, required: true, type: string}
      - {name: repo, in: path, required: true, type: string}
      - {name: path, in: path, required: true, type: string}
      - {name: Accept, in: header, required: false, type: string}
    put:
      responses:
        default:
          description: default
          schema:
            type: object
            properties:
              content:
                type: object
                properties:
                  name: {type: string, description: name}
                  path: {type: string, description: path}
                  sha: {type: string, description: sha}
                  size: {type: integer, format: int32, description: size}
                  url: {type: string, description: url}
                  html_url: {type: string, description: html_url}
                  git_url: {type: string, description: git_url}
                  download_url: {type: string, description: download_url}
                  type: {type: string, description: type}
                  _links:
                    type: object
                    properties:
                      self: {type: string, description: self}
                      git: {type: string, description: git}
                      html: {type: string, description: html}
                    description: _links
                description: content
              commit:
                type: object
                properties:
                  sha: {type: string, description: sha}
                  node_id: {type: string, description: node_id}
                  url: {type: string, description: url}
                  html_url: {type: string, description: html_url}
                  author:
                    type: object
                    properties:
                      date: {type: string, description: date}
                      name: {type: string, description: name}
                      email: {type: string, description: email}
                    description: author
                  committer:
                    type: object
                    properties:
                      date: {type: string, description: date}
                      name: {type: string, description: name}
                      email: {type: string, description: email}
                    description: committer
                  message: {type: string, description: message}
                  tree:
                    type: object
                    properties:
                      url: {type: string, description: url}
                      sha: {type: string, description: sha}
                    description: tree
                  parents:
                    type: array
                    items:
                      type: object
                      properties:
                        url: {type: string, description: url}
                        html_url: {type: string, description: html_url}
                        sha: {type: string, description: sha}
                    description: parents
                  verification:
                    type: object
                    properties:
                      verified: {type: boolean, description: verified}
                      reason: {type: string, description: reason}
                      signature: {type: string, description: signature}
                      payload: {type: string, description: payload}
                    description: verification
                description: commit
      summary: Update File Contents
      description: Update existing file in repository
      operationId: UpdateFileContents
      parameters:
      - {name: owner, in: path, required: true, type: string}
      - {name: repo, in: path, required: true, type: string}
      - {name: path, in: path, required: true, type: string}
      - {name: Accept, in: header, required: false, type: string}
      - name: body
        in: body
        required: true
        schema:
          type: object
          properties:
            message: {type: string, description: message, title: ''}
            content: {type: string, description: content, title: ''}
            sha: {type: string, description: sha, title: ''}
            branch: {type: string, description: branch, title: ''}
            committer:
              type: object
              properties:
                name: {type: string, description: name}
                email: {type: string, description: email}
                author:
                  type: object
                  properties:
                    name: {type: string, description: name}
                    email: {type: string, description: email}
                  description: author
              description: committer
          required: [branch, content, message, sha]
  /repos/{owner}/{repo}/actions/workflows/{workflow_id}/dispatches:
    post:
      responses:
        default:
          description: default
          schema: {}
      summary: Dispatch Workflow Event
      operationId: DispatchWorkflowEvent
      description: Trigger a GitHub Actions Workflow by ID
      parameters:
      - {name: owner, in: path, required: true, type: string}
      - {name: repo, in: path, required: true, type: string}
      - {name: workflow_id, in: path, required: true, type: string}
      - {name: Accept, in: header, required: false, type: string}
      - name: body
        in: body
        required: true
        schema:
          type: object
          properties:
            ref: {type: string, description: ref, title: ''}
            inputs:
              type: object
              properties: {}
              description: inputs
          required: [ref]
definitions: {}
parameters: {}
responses: {}
securityDefinitions:
  oauth2_auth:
    type: oauth2
    flow: accessCode
    authorizationUrl: https://github.com/login/oauth/authorize
    tokenUrl: https://login.windows.net/common/oauth2/authorize
    scopes: {public_repo user: public_repo user}
security:
- oauth2_auth: [public_repo user]
tags: []