Featured

Azure Blueprints – Create custom connector for Power Automate

Within Azure, we have a free service called Azure Blueprints. Azure Blueprints helps us simplify large-scale Azure deployments by packaging key environment artifacts, such as Azure Resource Manager templates, role-based access controls, and policies, in a single blueprint definition. Easily apply the blueprint to new subscriptions and environments, and fine-tune control and management through versioning. The Blueprints REST APIs are available to manage the different API groups.

In the last months, I have worked on a proof of concept, using “Low Coding” to assign Blueprint definitions in Azure. This includes a Power Apps app and a Power Automate flow that calls the Azure Blueprints REST API. Within Power Automate, REST APIs are mostly called by using the HTTP connector.

API Call to assign Blueprint definition

Using the HTTP connector gives us some overhead, because each time you need extra steps to get the authorisation token.

Get authorisation token

Creating a custom connector in Power Automate will make life much easier. Let us have a look at the REST API to create or update Assignments.

PUT https://management.azure.com/{scope}/providers/Microsoft.Blueprint/blueprintAssignments/{assignmentName}?api-version=2018-11-01-preview

More information about the Azure Blueprints REST API can be found here.
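To illustrate the overhead the custom connector removes, below is a minimal PowerShell sketch of the “manual” way: acquire a token with a service principal and issue the PUT yourself. The tenant ID, client ID, secret, scope, management group and assignment body are placeholder assumptions, not values from this post.

# Hedged sketch: token request plus a direct call to the Blueprint assignment API.
$tenantId  = "<tenant id>"
$tokenBody = @{
    grant_type    = "client_credentials"
    client_id     = "<app registration application id>"
    client_secret = "<app registration secret>"
    scope         = "https://management.azure.com/.default"
}
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
    -Body $tokenBody).access_token

$scope          = "subscriptions/<subscription id>"        # assignment scope
$assignmentName = "assignSimpleBlueprint"
$assignmentBody = @{
    identity   = @{ type = "SystemAssigned" }
    location   = "westeurope"
    properties = @{ blueprintId = "/providers/Microsoft.Management/managementGroups/<management group>/providers/Microsoft.Blueprint/blueprints/simpleBlueprint" }
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Put `
    -Uri "https://management.azure.com/$scope/providers/Microsoft.Blueprint/blueprintAssignments/$assignmentName?api-version=2018-11-01-preview" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType "application/json" `
    -Body $assignmentBody

The custom connector described below takes care of the token handling for us, so none of this plumbing is needed inside the flow.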

Let’s have a look at how I did it and which issues I ran into during the creation of the custom connector.

Open Power Automate, select “Data” and click “Custom connectors”.

Click “New custom connector” and choose “Create from blank” in the drop-down.

Fill in the connector name and click “Continue”.

The first step of the setup is adding the “General” settings, like uploading an icon and setting the background colour. This colour setting also influences the top bar of the connector; see the examples below.

Detail of the first step (General)

  1. Image has been uploaded; use PNG or JPG, less than 1 MB.
  2. Used background color: #007EE5
  3. Adding a description for the custom connector
  4. Scheme: HTTPS
  5. Host: This information is available in the REST API documentation. https://management.azure.com/{scope}/providers/Microsoft.Blueprint/blueprintAssignments/{assignmentName}?api-version=2018-11-01-preview
    Value: “management.azure.com”
  6. Base URL: because variables are not allowed here, the base URL value is “/”

Step 2: Security

This consists of two parts: Azure and the configuration of the custom connector. In Azure, we will create a new app registration and make a note of the application ID and secret. Add Azure Service Management as an API permission and push “Grant admin consent for <Name>” to activate the permissions.

Let’s go back to the first screenshot:

  1. OAuth2.0 is used to authenticate to the Azure service.
  2. Identity provider: Azure Active Directory
  3. Client id: <APP registration application id >
  4. Client Secret: <APP registration Secret >
  5. Login URL: https://login.windows.net
  6. Tenant ID: Common
  7. Resource URL: https://management.azure.com/ (note that the trailing slash is included)
  8. Scope: <n/a>
  9. Redirect URL: will be created when saving the custom connector for the first time. Copy this URL and go back to the app registration in Azure.

For the app registration that we created in the previous step, we will add this copied URL under “Redirect URIs”.

Step 3: Definition

In the last step, we will create actions based on the REST API list. In this example we will add the action to get assignments in Azure Blueprints.

Adding actions: click “New action”.

General

  1. Summary: give a name for the action; this is displayed as the action name in the custom connector.
  2. Description: shown in the custom connector when selecting the information icon.
  3. Operation ID: a unique string used to identify the operation; it is used internally by the connector.
Information view of the action in the custom connector.

Adding the request for the action and the response of the result.

Click “Import from sample” and complete the fields on the right side: GET https://management.azure.com/{scope}/providers/Microsoft.Blueprint/blueprintAssignments/{assignmentName}?api-version=2018-11-01-preview

  1. Verb: GET
  2. URL: https://management.azure.com/{scope}/providers/Microsoft.Blueprint/blueprintAssignments/{assignmentName}
  3. Headers: n/a
  4. Body: add a body example from the REST API documentation. Copy the example and paste it in the body.
{
  "identity": {
    "type": "SystemAssigned",
    "tenantId": "00000000-0000-0000-0000-000000000000",
    "principalId": "00000000-0000-0000-0000-000000000000"
  },
  "location": "eastus",
  "properties": {
    "description": "enforce pre-defined simpleBlueprint to this XXXXXXXX subscription.",
    "provisioningState": "succeed",
    "blueprintId": "/providers/Microsoft.Management/managementGroups/ContosoOnlineGroup/providers/Microsoft.Blueprint/blueprints/simpleBlueprint",
    "parameters": {
      "storageAccountType": {
        "value": "Standard_LRS"
      },
      "costCenter": {
        "value": "Contoso/Online/Shopping/Production"
      },
      "owners": {
        "value": [
          "johnDoe@contoso.com",
          "johnsteam@contoso.com"
        ]
      }
    },
    "resourceGroups": {
      "storageRG": {
        "name": "defaultRG",
        "location": "eastus"
      }
    }
  },
  "id": "/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.Blueprint/blueprintAssignments/assignSimpleBlueprint",
  "type": "Microsoft.Blueprint/Assignment",
  "name": "assignSimpleBlueprint"
}

Click “Import” to save your input. The request will be filled in based on the input. Some small changes need to be made before we can say that this has been completed.

Click “scope” and adapt the following settings:

Is required: will add a red star in front of the name value.
Visibility: important; when selected, this parameter is visible in the connector.

Assignment Name:

API Version:

Set “Is required?” to “Yes” and “Visibility” to “Internal”. This parameter will not be shown in the connector action overview; the setting is used internally by the custom connector as a default value for this parameter.

Fill in the “Default value” and select “Is required?” to “Yes” and “Visibility” to “Internal”.

Continue to add the response: copy an example from the REST API documentation and add it in the “Import from sample” window.

Change the default value to 200.

This is the first action defined in the custom connector:

We can test the connector by clicking “Test” or by adding the connector to a Flow. First create a new connection to get the token:

Create new connection to get token
Enter values and click “Test operation”

The result of the API request is displayed in the response tab.

Creating a Flow to get the assignments of an Azure Blueprint.

Result after running the flow:

Thanks for reading and please like and share!

Featured

Enable formula bar result view

I want to talk about this experimental feature that enables you to understand the result of your formula and/or debug a formula.

I am sharing my experience with this feature and showing where it helped me to define and debug my formulas.

Where to enable the feature?

Open a Power Apps project and click “File” >> “App settings”.

Go to the Power Apps project by clicking “Edit”.
Go to App settings.
Select and click “Advanced settings”.

Go to the experimental features and set the switch to ON.

What is the difference?
FEATURE OFF
Code with no extra variable information

In the code above, selecting “varGalAppsVisible” changes nothing and gives no information about this variable’s value.

FEATURE ON

The feature has been turned on. By selecting “varGalAppsVisible”, you will see the current value of the variable and the result of the If statement.

Conclusion?
Enabling this feature in Power Apps helped me in …
  • building complex code/queries
  • checking code/query results directly in the Power Apps designer
  • faster development of code/queries
  • easy and quick troubleshooting
Featured

Get your custom logs from a Power App into Azure Log Analytics

Azure Log Analytics is a service that can collect logs from any resource within Azure. The possibility to send custom logs leads us seamlessly to the next point.

Sometimes there’s a need to check what’s happening during a process; logging to a text file or keeping historical logs in a SQL database is what is mostly used. But what if alerts need to be triggered or a report in Power BI needs to be created?

Azure Log Analytics

Log Analytics is part of Azure Monitor and is used to collect logs from Azure resources. For more information about Azure Log Analytics, click this LINK.

Microsoft Flow provides us the following connectors, which can be used to send logs to and get logs out of Log Analytics:

  • Azure Log Analytics: Get output via Kusto query
    More information can be found here
  • Azure Log Analytics Data Collector: Send custom events to the log analytics workspace.
    More information can be found here

How to send a custom log from a Power App to Log Analytics?

First of all, you need to have an Azure Log Analytics workspace. You will end up with this:

Create a file that can be used to define the custom log.
This example gives us the possibility to create a custom log:

Time,EventCode,EventType,EventComment
2019-08-27 01:34:36,100,SUCCESS,Collection has been populated
2019-08-27 01:33:33,101,ERROR,Collection has not been populated
2019-08-27 01:33:33,102,INFO,Collection has been cleared

Creating the custom log can be done as follows. Open the Azure portal and select your Azure Log Analytics workspace. Select Advanced settings, Data, Custom logs and click “Add +”.

Choose the file that includes the sample log data.

Click “Next >”

Click “Next >”

Define a log collection path, press the [+] button and click “Next >”.

Add a name and description for the Power App custom log and click “Done”.

The data is visible in the Azure Log Analytics workspace after data has been inserted. Now the custom logging can be created in the Power App. In this example there is a button that retrieves user information via the Graph API. When the collection is empty, it means that the user information has not been found.

Depending on the result a Microsoft Flow will be triggered that will send the custom log to Log Analytics. Let’s have a look at the flow.

The flow is triggered by a run from the Power App, and in the following steps the variables are populated with values from the Power App. The variables are varEventCode, varEventType and varEventDetails. Getting the current time leads us to the final step, which is sending the data to the custom logs in the Log Analytics workspace.

The connection to the workspace is built by providing the workspace ID and key, which can be found at the following location:

Advanced settings >> Connected Sources >> Windows Servers >> here the workspace ID and primary key can be copied. Save the flow and get the user information in the Power App by clicking the button.
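For context, the “Azure Log Analytics Data Collector” action wraps the HTTP Data Collector API. A minimal PowerShell sketch of that call, assuming the workspace ID and primary key copied above and a hypothetical PowerAppLog custom log type, could look like this:

# Hedged sketch of what the Data Collector action does under the hood.
$workspaceId = "<workspace id>"
$sharedKey   = "<primary key>"
$logType     = "PowerAppLog"            # shows up as PowerAppLog_CL in the workspace

$body = @{
    EventCode    = 100
    EventType    = "SUCCESS"
    EventComment = "Collection has been populated"
} | ConvertTo-Json

$date          = [DateTime]::UtcNow.ToString("r")
$contentLength = [Text.Encoding]::UTF8.GetBytes($body).Length
$stringToSign  = "POST`n$contentLength`napplication/json`nx-ms-date:$date`n/api/logs"
$hmac          = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key      = [Convert]::FromBase64String($sharedKey)
$signature     = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToSign)))

Invoke-RestMethod -Method Post `
    -Uri "https://$workspaceId.ods.opinsights.azure.com/api/logs?api-version=2016-04-01" `
    -ContentType "application/json" `
    -Headers @{ "Authorization" = "SharedKey ${workspaceId}:$signature"; "Log-Type" = $logType; "x-ms-date" = $date } `
    -Body $body

In the flow itself the connector handles this signing for us; the sketch only shows why the workspace ID and primary key are needed.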

When data has been found, it means that an OK event has been created in Azure Log Analytics. Checking this in Log Analytics gives us back this event.

This procedure gives us the opportunity to collect process logs in Log Analytics.

EXTRA: Idea has been posted at the PowerApps community: VOTE now!!

Featured

Azure Automation connector – Content conversion to array

Study Question?

In which format is the output of a PowerShell runbook, in Azure Automation, returned to a Logic App/Flow?

Let us quickly describe the Azure Automation PowerShell script and the connection towards the Logic App/Flow. The PowerShell script transfers the results via an array variable by sending it to the output of the runbook, using the “return” command.

#example of part of the script
$Sessions = Get-RDSessionHost -CollectionName $CollectionName -ConnectionBroker $ConnectionBroker
[array]$SessionsArray = @()
foreach($Session in $Sessions){
    $SessionsArray += $Session.SessionHost
}
return $SessionsArray #array will be returned to the output of the runbook.

Looking at the Logic App/Flow, the Azure Automation runbook is called by using the action “Create job” (Create job to run on the hybrid worker).

Calling the runbook and, as a second step, getting the output of the runbook.

In the second step, the output returned by the PowerShell script is the array. In the screenshot, the output shows us three items. It looks easy now to add these lines into a Logic App/Flow variable or a data operation action. But it seems that the format type isn’t an array. Adding the content into a Compose action, the output of the Compose looks like this:

"SRV.XXXXXXXXX.LOCAL\r\n\r\nSRV.XXXXXXXXX.LOCAL\r\n\r\nSRV.XXXXXXXXX.LOCAL\r\n\r\n"

Carriage Return and Line Feed are visible in the outcome text.


SRV.XXXXXXXXX.LOCAL

SRV.XXXXXXXXX.LOCAL

SRV.XXXXXXXXX.LOCAL


Converting this to a value that can be used to create an array is resolved by using the “HTML to Text” action. This gets rid of the carriage returns and line feeds.

HTML to text – action
Output result is a text string

What if the outcome text is larger than the display window? Again, a line feed is added to the text string. An extra action is needed to resolve this.

\n is added for each new line in the display window.

Removing line feed.
Output string without \n

By splitting the string, we can create an array of all the items in the text string. This is how we do it:

split(outputs('Remove-Return'),' ')

The split is done on the space between two values in the text string.

Final result: Array with all text items

The outcome is an array with all the discovered objects from the runbook. Furthermore, the array can be used in a “For Each” or control action.
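As an aside, the same clean-up can be reproduced in PowerShell to see what the “HTML to Text”, remove and split steps effectively do. The sample string below is an assumption based on the output shown earlier.

# Hedged illustration of the clean-up performed by the Logic App/Flow actions.
$raw = "SRV01.CONTOSO.LOCAL`r`n`r`nSRV02.CONTOSO.LOCAL`r`n`r`nSRV03.CONTOSO.LOCAL`r`n`r`n"
$servers = $raw -split "`r`n" | Where-Object { $_ -ne "" }   # drop the empty entries
$servers                                                      # array of three host names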

Featured

Share your canvas apps with guest users.

Microsoft has put the possibility of sharing Power Apps with guest users in preview. Okay, time to share created canvas apps with guest users. This gives us the possibility to create PowerApps that are built with the intention to share data with external persons.

Let’s start by adding a guest user into the Azure Active Directory.

Don’t forget this …

Change the “USAGE LOCATION” of the guest user.

By default, the usage location of a guest user is empty. When it is not filled in and the guest user has been added to the Power App, the guest user cannot open the Power App via the received link in the e-mail. So don’t forget to change this to the correct usage location!

As administrator go to the canvas app that you would like to share with your guest user. Click on Share.

Search for the guest user and click share. The guest user will be notified by e-mail that access is given to a specific canvas app.

E-mail that will be sent to the guest user.

E-mail from Microsoft.

Guest user can click “Open the app >” and use the canvas app.

Which license do you use for a guest user?

During the test, the guest user had an Office 365 E3 license, including the Office 365 PowerApps/Flow apps. But I also did the test with a Microsoft account without an Office 365 license and with no license assigned in Azure Active Directory. Even this guest user could access the canvas app after he was added to the shared user list.

License

A guest user does not need to have a license assigned, but only until October 2019. Guest users will need a license as from October, which means that the guest user already has one in his own tenant or you need to deliver one.

Conclusion:

It’s a great opportunity to be able to create canvas apps in your organisation that can be used by people outside your organisation. I have already shared my first canvas app with our first customer. How cool is that!

Update 01/10/2019:

  • Share apps with users outside your tenant using AAD B2B or B2C
  • Share an app with a guest as easily as you do with a co-worker
  • Guests can run canvas apps that are embedded in SharePoint lists
  • Guests can access apps in both browser and mobile clients

License – update 01/10/2019:

The PowerApps and Flow services can only be accessed if you have a PowerApps or Flow license. Similarly, PowerApps and Flow licenses are required for users that input data into, query, or view data from the PowerApps and Flow apps through a pooling device. Pooled connections use a non-interactive user account in PowerApps and Flow that can access the system but only via the web service layer. Internal users accessing the PowerApps and Flow service indirectly through a portal or via an API to a separate service such as Microsoft Outlook must also be properly licensed, regardless of whether they are set up as a PowerApps and Flow user in the app, for example:

  • Internal users that access the PowerApps and Flow service must still be properly licensed for PowerApps and Flow.
  • Any user that accesses PowerApps and Flow service that is made available through an automated process requires a PowerApps and Flow license.
  • The number of tiers of hardware or software between the PowerApps and Flow apps and the user or devices that ultimately use PowerApps and Flow service indirectly does not affect the number of SLs required.


Featured

Trigger a MS Flow from PowerShell script

Lately I was wondering how I can connect from a PowerShell script to a MS Flow. Is this possible, and how can I transfer variable information from the script towards the Flow and use these values? In this simple blog example, I will transfer variable values towards a MS Flow.

What is needed?

Type – Name – Info – Licensing:
  • PowerShell – Invoke-WebRequest – Link – n/a
  • MS Flow – Trigger: Request – When a HTTP request is received (PREMIUM); used for incoming API calls that could use actions in a Logic App or other API to trigger this flow – Link – Plan 1 (5€/month)

PowerShell script.

<#
    PowerShell code above, or below can be anything. 
    In this example, the values are manual set and variables 
    have been random chosen.
#>
# Variables
$RandomID = "1285"
$Hostname = "SRV-DEMO-01"
$HostStatus = "Running"

<#
    Define Job Parameters.
    ----------------------
#>
$JobUriParameters = @(
    @{ Name = 'ItemID'; Value = $RandomID},
    @{ Name = 'HostName'; Value = $Hostname},
    @{ Name = 'HostStatus'; Value = $HostStatus}
)
#Convert to JSON parameters
$MSFlowParam = ConvertTo-Json -InputObject $JobUriParameters
PowerShell code containing the variable values and the first step to create the JSON output that will be sent to the MS Flow via an API call, using a POST operation.

Building the Flow?

We will start the MS Flow with the trigger “When a HTTP request is received”. This is a PREMIUM connector and therefore we need at minimum Plan 1. We will receive the HTTP POST URL when the Flow is saved for the first time; currently the link is not yet available.

Adding a JSON schema based on the information that will be sent from PowerShell to MS Flow.

<#
    PowerShell code above, or below can be anything.
    In this example, the values are manual set and variables
    have been random chosen.
#>
# Variables
$RandomID = "1285"
$Hostname = "SRV-DEMO-01"
$HostStatus = "Running"

<#
    Define Job Parameters.
    ----------------------
#>
$JobUriParameters = @(
    @{ Name = 'ItemID'; Value = $RandomID},
    @{ Name = 'HostName'; Value = $Hostname},
    @{ Name = 'HostStatus'; Value = $HostStatus}
)
#Convert to JSON parameters
$MSFlowParam = ConvertTo-Json -InputObject $JobUriParameters
$MSFlowParam
#Result output
[
    {
        "Value":  "1285",
        "Name":  "ItemID"
    },
    {
        "Value":  "SRV-DEMO-01",
        "Name":  "HostName"
    },
    {
        "Value":  "Running",
        "Name":  "HostStatus"
    }
]

Copy the result output and paste it in as a sample JSON payload. This will generate the required JSON schema.

Result JSON schema after adding the sample payload
JSON schema generated based on the sample payload
Set the method to POST.

As a second step, we are going to filter the array to get the values of each variable that has been propagated by the PowerShell script.
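To make that “Filter array” step concrete, here is a hedged PowerShell equivalent of picking one Name/Value pair out of the posted payload; the values are the example ones from the script above.

# Hedged illustration of what the Filter array data operation does in the flow.
$payload = @(
    @{ Name = 'ItemID';     Value = '1285' },
    @{ Name = 'HostName';   Value = 'SRV-DEMO-01' },
    @{ Name = 'HostStatus'; Value = 'Running' }
)
($payload | Where-Object { $_.Name -eq 'HostName' }).Value   # -> SRV-DEMO-01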

Completing the PowerShell script by adding the Invoke-WebRequest will make an API call to MS Flow and send the information as JSON.

<#
    PowerShell code above, or below can be anything. 
    In this example, the values are manual set and variables 
    have been random chosen.
#>
# Variables
$RandomID = "1285"
$Hostname = "SRV-DEMO-01"
$HostStatus = "Running"

<#
    Define Job Parameters.
    ----------------------
#>
$JobUriParameters = @(
    @{ Name = 'ItemID'; Value = $RandomID},
    @{ Name = 'HostName'; Value = $Hostname},
    @{ Name = 'HostStatus'; Value = $HostStatus}
)
#Convert to JSON parameters
$MSFlowParam = ConvertTo-Json -InputObject $JobUriParameters

<#
Gets content from a web page on the Internet via 
Default,Delete,Get,Head,Merge,Options,Patch,Post,Put,Trace
#>
Invoke-WebRequest `
    -Uri 'https://xxxx-xx.westeurope.logic.azure.com:443/workflows/axxxx34xxxxe45xxxxx258cxxxxc487e/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=xxxxxxxiVmqoHHuON0xxxxhX89ei3ybFSlsxxxxsyAY' `
    -ContentType "application/json" `
    -Method POST `
    -Body $MSFlowParam

Result of the data operations when filtering the array on the Name value. This gives us the possibility to split the array into smaller pieces.

So the answer is YES. We can transfer data from a PowerShell script towards Microsoft Flow. How cool is that!

Featured

Azure Key Vault life cycle management – Part 1

The What?

Azure Key Vault is a resource for storing and accessing secrets, keys and certificates. But what if a company needs rotation for these credentials? Azure Key Vault has the possibility to enable key rotation and auditing, but this needs to be configured and is not a default feature. For those credentials, some specific value items can be used to build a life cycle process.

In this first part, a concept solution will be provided to detect the expiration date of a secret or key and to inform the IT department or owner of this key.

Azure Key Vault (Preview)

Service – Class – Regions:
  • Logic Apps – Standard – All Logic Apps regions except Azure Government regions and Azure China regions
  • Flow – Premium – All Flow regions except US Government (GCC)
  • PowerApps – Premium – All PowerApps regions except US Government (GCC)

Throttling Limits

  • API calls per connection: 1000 per 60 seconds

The How-to:

The Flow will connect to the Azure Key Vault via the connector and collect the necessary information to calculate the time to the expiration date that has been set on the secret.

Trigger

The trigger for this flow is a schedule that will run every day at midnight. Let’s start building the flow:

Select ‘Schedule’ as a trigger and fill in the following fields:

1-2. Interval – Frequency: based on the selected frequency type, the interval can be set. In this example, a daily schedule is created by selecting the day type and with an interval of 1.
3. Timezone – in this example the timezone UTC +01:00 is used for Belgium.
4. At these hours – the flow will be triggered at midnight, which is the 0 for this field.  

Actions

Following actions will be used in the flow:

  •  Connection to Azure Key Vault to get the information about the secrets in the Key Vault.
  •  Actions to calculate the days left before the expiration date.
  •  Send a notification on the number of days left.

Action – 0.2.Get Secrets

Before continuing with the flow, an app registration needs to be completed in the Azure portal. Go to ‘Azure Active Directory’, ‘App registrations’, ‘New registration’.

Click add new registration
API Permissions

Register the application and create a secret: go to ‘Certificates & secrets’ and create a client secret. After storing the client secret in a safe place, building the flow can be continued. Search for the ‘Azure Key Vault’ connector and select the ‘List secrets’ action.

Select ‘Connect with service principal’

1. Enter a connection name for this connector
2. Enter the name of the Key Vault in Azure. In this example, ‘Cloud02KeyVault’ has been used.
3 – 4 – 5. The client ID, client secret and tenant ID can be found in the app registration overview for this connection:

When the connection has been established with the Key Vault in Azure, the connector will be shown as follows in the flow:

Action – 0.3.Check Days

In this apply to each action, the days left before the expiration date will be calculated for every secret that has been found in the Key Vault. The input value is the result of the step ‘0.2.Get Secrets’, which contains all the information about the secrets.

Action – 0.3.1.EndTime

A Compose action that collects the secret end time (in this example, we assume that there is always an expiration time defined for each secret).

Action – 0.3.2.Today

Getting the current time and date by using the Date Time action.

Action – 0.3.3.TicksToday

In the next two steps, a conversion is needed to determine the difference between the current time and the expiration time. This can only be accomplished by converting both times to a number of ticks, so that we can subtract both values.

Expression: ticks(body('0.3.2.Today'))

Action – 0.3.4.TicksEndTime

Expression: ticks(outputs('0.3.1.Endtime'))

Action – 0.3.5.DivDays

In the Compose action, a calculation will be done to get the number of days between the current date and the expiration date.
Expression: div(sub(outputs('0.3.4.TicksEndTime'),outputs('0.3.3.TicksToday')),864000000000)

This result shows the number of days left between the current day and the expiration time.
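The same tick arithmetic can be sketched in PowerShell to sanity-check the expression; 864,000,000,000 ticks equal one day, and the expiration date below is an assumed example value.

# Hedged sketch of the flow's day calculation using ticks.
$endTime  = Get-Date "2020-06-30T00:00:00Z"        # assumed secret expiration date
$today    = (Get-Date).ToUniversalTime()
$daysLeft = [math]::Floor(($endTime.Ticks - $today.Ticks) / 864000000000)
$daysLeft                                          # number of days before expiration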

Action – 0.3.6.Check WARNING Lvl

In this example, a WARNING message will be sent via email when the day difference is between 16 and 30 days. If it is lower than 16 days, a CRITICAL message will be sent via email.

The result of this concept is a kind of monitoring for secrets in the Azure Key Vault, letting you build life cycle management for your secrets.

Upcoming parts:

  •  Adding an expiration date (Azure Automation) when there is none defined
  •  Approval process to check if a secret is still in use.

Did you like this post? Share it on Twitter and give it a thumbs up; all feedback is welcome.

Featured

Azure Log Analytics to an alert dashboard in Power Apps

After reviewing the Azure Log Analytics connector and working a lot with Azure Log Analytics, I have chosen to create a concept that uses Kusto queries and displays the results on a dashboard in a Power App. This concept has not been implemented in production and is merely an example of how to combine Azure connectors with Flow and PowerApps.

When looking at the design, three big components are used:

>> PowerApps – dashboard – Trigger for the flows

>> Microsoft Flow – Connection between Azure Log Analytics workspace and the Power App.

>> The “Log Analytics” workspace that contains logs of Azure resources.

In this example, a virtual machine points to a Log Analytics workspace that collects all of the performance and security parameters of the virtual machine.

Before building the Power App and flows, homework needs to be done in Azure. In this case, all the connectors in Flow will use a service principal to connect to the Azure tenant, following least-privilege access rules.

In the Azure portal, go to Azure AD and select “App registrations”; in this blade, click “New registration”. We are creating an app registration for the Log Analytics access in Azure.

App registrations

Give a name for the application (service principal).

When the app registration has been performed, a secret needs to be created. Click “Certificates & secrets” and then “New client secret”. Make a note of the password that has been created.

The client ID, client secret, and tenant ID will be used to authenticate the Azure Log Analytics connector in Flow to the Azure tenant. When this has been completed, development proceeds towards the Power App.
For the Log Analytics API, permissions need to be set on the tenant.

For the Log Analytics API, admin consent is required and needs to be enabled. This also requires a Log Analytics workspace and Azure VMs that have diagnostics settings enabled.

The Power App contains a simple gallery that displays the result of each Kusto query. A timer control functions as the trigger for the Flow to get the results from the Log Analytics workspace. The flow is triggered as the timer starts; the timer starts automatically and restarts every time the refresh time runs out.

Two screens have been created to display alerts for high CPU levels and Windows Updates for the virtual machine. This is how the screens are looking in the power app editor:

Alerts dashboard
Windows updates dashboard

Within the timer property “OnTimerStart”, the following code has been added:

  • ClearCollect(Alerts,LogAnalyticsCPU.Run()) >> The gallery will be connected to the collection “Alerts”
  • ClearCollect(WindowsUpdates,'LogAnalyticsWU'.Run()) >> The gallery will be connected to the collection “WindowsUpdates”

One of the Flows that is triggered from the Power App is “Log Analytics CPU”. When the Flow is triggered by the Power App, the action “Run query and list results” from the Azure Log Analytics connector runs the Kusto query.

Authentication of the Azure Log Analytics connector is done with the application service principal that has been created in one of the previous steps:

Entering the correct client ID, tenant ID and client secret and clicking “Create” will connect the action to the given tenant; after that, the subscription, resource group and Log Analytics workspace can be selected.
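For background, the connector is effectively running a Kusto query against the workspace on behalf of that service principal. A hedged PowerShell sketch of the same idea via the Log Analytics REST API is shown below; the IDs, secret and the sample CPU query are placeholders/assumptions, not the exact query used in this flow.

# Hedged sketch: query a Log Analytics workspace with a service principal token.
$tenantId    = "<tenant id>"
$workspaceId = "<log analytics workspace id>"
$tokenBody = @{
    grant_type    = "client_credentials"
    client_id     = "<client id>"
    client_secret = "<client secret>"
    scope         = "https://api.loganalytics.io/.default"
}
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
    -Body $tokenBody).access_token

$query = 'Perf | where ObjectName == "Processor" and CounterName == "% Processor Time" | summarize AvgCPU = avg(CounterValue) by Computer'
$result = Invoke-RestMethod -Method Post `
    -Uri "https://api.loganalytics.io/v1/workspaces/$workspaceId/query" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType "application/json" `
    -Body (@{ query = $query } | ConvertTo-Json)
$result.tables[0].rows    # rows in the HostName/AvgCPU style that is sent back to the app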

In the next two steps, a filter selects only the information that we need to send back to the Power App.
Because the result is an array, an HTTP “Response” action is used to send the information back towards the Power App.

This is the JSON schema used to send the information:

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "HostName": {
                "type": "string"
            },
            "AvgCPU": {
                "type": "number"
            }
        },
        "required": [
            "HostName",
            "AvgCPU"
        ]
    }
} 

I did not manage to send the information back with the action “Respond to PowerApps”. But using the HTTP response did the trick. 

The same flow has been used for Windows Updates, but with a different Kusto query. See flow below:

Screenshot of the Windows update result for the virtual machine:

This will be the same for alerts when the CPU is higher than 75%.
Hope you like this (like/share), and if you have questions, put them below.

Featured

Timesheet check for Project Service Automation (PSA – Microsoft Dynamics 365)

A lot of time is lost checking everyone’s timesheet to see if the work week has been completely registered in the PSA timesheet portal. In Belgium, most employees have a 40-hour working week. In this MS Flow, a check is performed every Friday evening around 8 PM.

Set time schedule

Define some variables that will later be used during the calculation of the timesheet total hours and to send a notification to the employees.

Variables

In the example, the work week starts on Monday at 8.30 AM and ends on Friday at 5 PM. To define the date of Friday, “Date Time” actions are used to get the current date and time. This reflects the date of the Friday on which the Flow is running.

Date Time – Action (Current Time)
Getting the date and time at which the Flow has been started.

Getting the date and time just before Monday morning, so that when filtering, the time registrations of the complete week will be selected.

Set begin date and time
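A hedged PowerShell sketch of this “begin date” calculation (the actual flow uses Date Time actions; this only illustrates the logic):

# Step back from the Friday run date to just before Monday morning of the same week.
$runDate   = Get-Date                                              # Friday evening run
$monday    = $runDate.Date.AddDays(-(([int]$runDate.DayOfWeek + 6) % 7))
$beginDate = $monday.AddMinutes(-1)                                # Sunday 23:59
$beginDate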

To get all the employees that need to be checked, the “Common Data Service” connector will be used.
The action “List records” will be used to get the record of each employee.

PSA Users – getting employee records

After collecting the first information, the check of the timesheet for an employee can be started. Process “05 – Check timesheet employee” will handle the work. Six steps in the “apply to each” loop will calculate the total registered hours for an employee and send a notification when needed. Let’s look deeper into the process.

Process calculate registered work hours and notify as needed.

The first step of the process is getting the employee name (because in PSA the employee/user is a unique number). The name of the employee can be found in the entity “PSA Users”, under the field “psa_name”. This value will be set into the variable “EmployeeName”.

Set variable EmployeeName

The second and third steps will get the employee e-mail address, which can be found in the entity “Users”. This entity contains detailed employee/user information.

The output value will contain the primary e-mail address of the employee/user (field “internalemailaddress” in the entity Users); this value will be set to the variable “EmployeeEmail”. The fourth step in the process filters the weekly registered hours out of the entity “Hours”.

Filtering registered hours

The fifth step of the process will be calculating the total amount of registered hours in the week.

Calculate Total Hours

Getting the hours results in output as a string (for example 1.5). This needs to be converted to a float value before adding it to the number variable that has been defined upfront; see the screenshots below.

The calculation in detail:

Process calculation details

In the last two steps, the total amount of hours is checked against the minimum of 40 registered hours; when it is below that, an email notification will be sent to the employee/user to review his weekly time registration. The total hours variable is then reset to zero for the next check of another employee/user.
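A hedged PowerShell sketch of this accumulation and 40-hour check; the hour values are assumed example strings, as returned by the “Hours” entity.

# Accumulate string hour values and notify when the weekly total is below 40.
$totalHours = 0.0
foreach ($hours in @("8", "8.5", "7.5", "8", "7")) {
    $totalHours += [double]$hours        # cast the string value before adding
}
if ($totalHours -lt 40) {
    "Notify employee: only $totalHours hours registered this week"
}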

Total overview of the Flow, see below.

If you have questions or remarks, please feel free to contact me, and like my blog if you think it’s great.

Featured

Don’t miss any #callforspeakers on Twitter

Get notified about #callforspeakers Twitter messages and don’t miss any request for speakers. Get notifications about Twitter messages including certain hashtags, instead of searching on Twitter for them.

The Twitter connector in Microsoft Flow can help us with this. In this example, the following process will be automated.

Search in new tweets that include the following hashtags:
* #CallForSpeakers, #MicrosoftFlow, #PowerApps
This triggers the Flow, which will send a notification towards the mobile Microsoft Flow app, and a second action that will create a task in Microsoft To-Do with a reminder and due date.

Below is an overview of a few limits and restrictions when using the Twitter connector:


– Maximum number of connections per user: 2
– Frequency of trigger polls: 60 seconds
– Maximum size of image upload: 5 MB
– Maximum size of video upload: 15 MB
– Maximum number of search results: 100

The trigger checks for new tweets containing the hashtags and words in the tweet text. This triggers the flow.

A variable is created to record when the tweet was detected; this variable will be used to define the reminder and due date of the task in Microsoft To-Do.

Set the time to Central European Time (this is my time zone; it will be different for you).

Notification is sent to the Microsoft Flow app on the mobile phone.

The second action will create a to-do task about the Twitter search result including a reminder and due date.

The expression addDays() is used to add days to the variable “FormatTime”. In Microsoft To-Do, a list has been created and assigned to the “02.2. Schedule to-do” action.

This easy and straightforward flow will save a lot of search time on Twitter. Like this blog post; comments or suggestions are always welcome.

Total view of the Flow
Featured

Environments and Data Protection – How-to!

Intro

Microsoft’s Power Platform (Power Apps, Microsoft Flow and Power BI) provides tools that will help you with your digital transformation 4.0.

Microsoft Power Platform

In the coming years, we will look at digital transformation 4.0 to optimize our business processes, production environments, healthcare, …

Currently I hear from a lot of customers that really have the need to start working with these tools and to eliminate time-consuming daily processes in their company. But they have questions about how to control and protect the company data when empowering employees with these powerful tools. Laying this trust in the hands of your employees also means protecting yourself against persons that do not have good intentions towards your company. The ability to send data to third-party applications is one of the biggest concerns that some companies have.

Not allowing end users to use these tools can slow down or even bring the digital transformation in your company to a stop. In the coming years, we will see a gap arising between businesses that embrace these tools and those that don’t. Don’t you want to be a winner?!

Protecting your data and controlling the environments of your Flows?

This process will be the hardest part of your digital transformation: making decisions about governance and data loss prevention. There are several steps, like:

  • Team: defining key users within your company (who can become those citizen developers)
  • Control: creating Microsoft Flow environments for personal usage, test and production, but also controlling environments created by non-key users (deletion of environments that are not supported by the organization)

Below is an answer to the question: how can we protect our data?
By using Data Loss Prevention (DLP) policies to protect your data when using PowerApps and Microsoft Flow. It’s possible.

More detailed information about environments: 
https://docs.microsoft.com/en-us/flow/environments-overview-admin

Short note about Environments

Environments can be used because of:
  • Geographic location: an environment can be assigned to a geographic location (region). For example, users in Belgium can use an environment with the region defined as Europe, resulting in a better performance outcome.
  • Data Loss Prevention: DLP policies can be assigned to an environment. In such a policy you define which applications can be used to handle business data.
  • Isolation boundary: any resource of a flow in one environment doesn’t exist in another environment.
  • Common Data Service

How to create a new environment.

Environments can be created as follows:

Go to the Admin Center

Click «New environment»

New environment

Give a proper name to the environment and select a region and type (the type depends on your Microsoft Flow licenses; more about licenses).

In this example we skipped the creation of the database. Next, we will define a Data Loss Prevention policy for the environment to protect business data.
Go to Data policies.

Create a new policy by selecting the “New policy” item.

Choose the environment on which a data loss prevention policy needs to be activated.

Define which applications are allowed to handle business data.
Business data only: this section contains the apps that are allowed to be used with business data. By default this is the default data group; Microsoft recommends leaving this group as the default group.
No business data allowed: this section contains the apps that are not allowed to be used with business data. Users that try to use these apps will get a notification that the selected app is not allowed in the flow because of a data loss prevention policy.

Click “Save Policy” to save and activate your DLP policy.

Creating an example Flow with a connection to Google Drive will have the following result: the user gets a notification that saving the created flow is not possible because of a Data Loss Prevention policy. At this moment we are protecting the business data from being moved outside of the company.

But as an example, we add the Google Drive connector to the data loss prevention policy and add it to the default group. What will happen?

When the connector has been added to the “business data only” default group, users will have the possibility to save flows with a Google Drive connector from the moment the policy has been saved.

Removing selected connectors that are allowed to use business data is very straightforward: add the connector back to the “No business data allowed” group. Flows with a connector that has been moved to this group will be disabled in the list of flows.

The following notification will be displayed in the user’s notification box.

More about environments and Data Loss Policy within Microsoft Flow can be found here:

https://docs.microsoft.com/en-us/flow/environments-overview-maker
https://docs.microsoft.com/en-us/flow/environments-overview-admin
https://docs.microsoft.com/en-us/flow/prevent-data-loss

Conclusion.

We don’t need to be afraid of tools that can help us optimize business processes. Controlling their usage with data loss prevention policies for each environment adds an extra layer of control and protects your data from being exposed to the outside.

Tip: Create app in TEAMS

Building a canvas app in Teams (earlier code name Project Oakdale) gives you the possibility to create a canvas app in Teams for Teams. More information here.

A lot of citizen developers are taking their first steps in developing apps in Teams and they are also using known tools, like SharePoint. But indeed, there is a but. Last week, I had a citizen developer who was trying to add the SharePoint connector to a canvas app in the editor in Teams and he got stuck in a loop: he could enter his credentials but could not select a SharePoint site.

Let us go through the steps where the citizen developer got stuck when adding the SharePoint connector:

Click “Add Data”

Search for “SharePoint”, select and click. This opens a web browser page towards the connectors list:

Click the plus sign next to the “SharePoint” connector.

Clicking “Create”

Login

The connection has been added but isn’t visible in the canvas app; currently the only way to fix this is the following:

Open make.powerapps.com by pressing the link icon.

Go to the Power Apps admin portal, https://admin.powerplatform.microsoft.com/, Environments and search for an environment of the type Microsoft Teams.

Select it and open by clicking the app link in Microsoft Teams association.

This opens the Teams website; click “use the web app instead”. Do the same as in the Teams desktop client by opening Power Apps, selecting the “Build” tab and opening the app.

Search for the SharePoint connector; now it will show the already-created connection and the possibility to select a site.

Select SharePoint Site
Select SharePoint list
Select list

In the browser, the connection is visible under Data. Save the app and open it again in the Microsoft Teams desktop version.

SharePoint Connection in Data
Open app in MS Teams Desktop pop-up and click “Allow”
Connection also visible in MS Teams Desktop

The permission pop-up will ask to allow the connector, and it will then be available in the data view. The data in the SharePoint list will be available for the canvas app from the Microsoft Teams desktop embedded editor.

Hope this can help you in your citizen developer journey!

Deliver an Enhanced User Experience by Combining Azure, Teams and the Microsoft Power Platform – Global Power Platform Bootcamp – slide deck

Event: Global Power Platform Bootcamp 2020
Date: February 15 2020

SESSION OVERVIEW

Does your company have several teams demanding test environments in Azure? Well, then I will tell you how you can combine Azure Blueprints (Custom Connector), PowerApps, Power Automate and Microsoft Teams to get in total control of the environment.
The app will be used to send requests, gather feedback and manage the environment. Teams will be used to send approvals to the Team Manager and inform the requester once the environment has been provisioned. And of course, all of this will be automated with Low Coding.

GITHUB Custom Connector and Adaptive Cards https://github.com/frederikbisback

Deliver an Enhanced User Experience by Combining Azure, Teams and the Microsoft Power Platform – Super Power Saturday London 2020 – slide deck

Event: Super Power Saturday 2020
Date: February 8 2020

SESSION OVERVIEW

Does your company have several teams demanding test environments in Azure? Well, then I will tell you how you can combine Azure Blueprints (Custom Connector), PowerApps, Power Automate and Microsoft Teams to get in total control of the environment.
The app will be used to send requests, gather feedback and manage the environment. Teams will be used to send approvals to the Team Manager and inform the requester once the environment has been provisioned. And of course, all of this will be automated with Low Coding.

Events 2019

07.05.19 – AZURE CONNECT BELGIUM – (Event CANCELLED)



Meetup (Free Event): http://bit.ly/2PYoglU
Session: The Power is coming …
Representing the Belgium PowerApps and Flow User Group #BEPAFUG
New date will be announced soon …

08.08.19 – Belgium PowerApps and Flow User Group


Meetup (Free Event): https://www.meetup.com/Official-Belgium-PowerApps-Flow-User-Group/events/262750368/
Session: The Power is coming …

14.09.19 – Power User Days Belgium


Meetup (Free Event): More information about this event soon.
Session: From Azure to Microsoft Flow and PowerApps and back …

16.11.19 – SharePoint Saturday – Leicester



Session:
The power is coming … make a connection between Azure, PowerApps, Microsoft Flow and more…
Microsoft Flow vs Logic Apps: do we always need to use Logic Apps to create low-coding automation between Azure and other applications? In this session I would like to guide you through the differences between Microsoft Flow and Logic Apps (like when to decide to switch from Flow to Logic Apps, pricing and functional differences), ending with the power of Azure Integration Services …

19.11.2019 – Milan Power Platform World Tour (Italy)



Session:
The Power is coming … make a connection between Azure, PowerApps, Microsoft Flow and more…
Microsoft Flow vs Logic Apps: do we always need to use Logic Apps to create low-coding automation between Azure and other applications? In this session I would like to guide you through the differences between Microsoft Flow and Logic Apps (like when to decide to switch from Flow to Logic Apps, pricing and functional differences), ending with the power of Azure Integration Services …

PPWT 2019 Milan – The power is coming

04.12.2019 – Brussels Power Platform World Tour (Belgium)



Session:
Deliver an Enhanced User Experience by Combining Azure, Teams and the Microsoft Power Platform. (Technical session together with Clifton Lenne)