

Connect to Application Insights and Log Analytics with Direct Query in Power BI

Application Insights (AI) and Log Analytics (LA) from Microsoft Azure provide easy and inexpensive ways to instrument applications. Using just an instrumentation key, any application can send operational data to AI which can then provide a rich array of tools to monitor the operation of the application. In fact, the blog that you are reading uses an Application Insights plugin for WordPress that registers each view of a page into an instance of AI in my Azure tenant.

Application Insights data can be queried directly in the Azure portal to provide rich insights. In addition, the data can be exported to Excel for further analysis, or it can be queried using Power Query in either Excel or Power BI. The procedure for using Power Query can be found in this article. That approach uses the Web connector in Power Query, which can be refreshed automatically on a regular basis. The Web connector does not, however, support Direct Query, so the latency of the data in this scenario is limited by the refresh schedule configured in Power BI. Any features that depend on Direct Query (Aggregations, Automatic Page Refresh) will also not work.

If you've worked with AI or LA and dropped down to the Query editor, you've been exposed to KQL, the Kusto Query Language. This is the language used by Azure Data Explorer (ADX), code-named "Kusto". That is of course not a coincidence, as the Kusto engine powers both AI and LA.

Power BI contains a native connector for ADX, and with it you can configure an ADX cluster for yourself, populate it, and work with it in Power BI in both imported and Direct Query datasets. Given that ADX is what powers AI and LA, it should be possible to use this connector to query the data in AI and LA. It turns out that the introduction of a new feature known as the ADX proxy allows us to do just that.

The ADX proxy is designed to allow the ADX user interface to connect to instances of AI and LA and run queries from the same screens as native ADX clusters. The entire process is described in the document Query data in Azure Monitor using Azure Data Explorer. The document explains the process, but what we are particularly interested in is the syntax used to express an AI or LA instance as an ADX cluster. Multiple variations are described in the document, but the ones that we are most interested in are here:

For LA: https://ade.loganalytics.io/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.operationalinsights/workspaces/<workspace-name>

For AI: https://ade.applicationinsights.io/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.insights/components/<ai-app-name>

By substituting in your subscription ID, resource group name, and resource name, you can treat these resources as if they were ADX clusters, and query them in Power BI using Direct Query. As an example, a simple query on this blog can be formed using the ADX connector:

And the result will appear as:

The precise query is provided in the query section of the connector above.
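For reference, a minimal KQL query along the following lines returns a daily count of page views. This is only a sketch: the pageViews table is part of the standard Application Insights schema, but the tables and columns available to you will depend on your AI or LA instance. The query goes in the query section of the connector, with the proxy URL above supplied as the cluster address.

pageViews
| summarize Views = count() by bin(timestamp, 1d)
| order by timestamp asc

Because the query is re-evaluated by the Kusto engine whenever visuals refresh in Direct Query mode, it pays to keep it summarized and filtered rather than returning raw rows.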

Once the report is built, it can be deployed to the Power BI service and refreshed using AAD credentials.

It is important to note that this method does NOT require you to configure an ADX cluster of your own. We are simply utilizing the cluster provided to all instances of AI and LA. We therefore do not have any control over performance levels, as we would in a full ADX cluster. However, if the performance is adequate (and the queries are designed appropriately), this can be a good approach for working with AI and LA data that has low-latency (near real time) requirements.


Creating Data Driven Subscriptions for Power BI Reports

One of the features that has never made the leap from SQL Server Reporting Services (SSRS) on-premises to the cloud is data-driven subscriptions. Users can subscribe to reports, but a data-driven subscription allows individual subscriptions to be stored in a central location and parameterized, while delivering the reports to multiple locations. This article will describe a pattern for accomplishing this using SharePoint lists as the subscription store, and Power Automate as the automation tool, for a no-code solution to this requirement.

**Updated – Sept 24 2020** The new Power Automate "Export to File" Power BI actions completely remove the need to create custom connectors (outlined below). I am leaving the steps in this post because the approach can be used for other things, but these new actions make this whole process significantly easier and cheaper. The Export to File actions are NOT premium actions in Power Automate.

Requirements

In order to implement this pattern, it is necessary to have access to Power Automate and to SharePoint, both of which are available in Office 365. The custom connector described below uses the Power BI REST API and its ExportTo function, which requires a dedicated capacity (Premium) in Power BI to work. This pattern works with both interactive (pbix) and paginated reports; paginated reports also require the use of a dedicated capacity. Data-driven subscriptions in SSRS were always an Enterprise feature on-premises, so this requirement should come as no surprise.

Custom Connector

Currently, there are a number of actions available for Power BI within Power Automate. Unfortunately, none of these actions is capable of rendering and saving a report, but that is something that the Power BI REST API can do. It is possible, however, to call this API using a custom connector in Power Automate.

Chris Webb recently put together a series of articles on using the Export function in the Power BI REST API with Power Automate. The first article outlines the process of creating the connector, as well as a downloadable Swagger (Open API) definition file that this pattern is based on. The second describes using it within Power Automate.

I won't re-invent the wheel with custom connector creation instructions here; just follow the blogs above to create the connector. Once the custom connector is created, it will be possible to implement data-driven subscriptions.

Subscriptions

Subscriptions can be stored just about anywhere, but for the purposes of this example, we're going to use a SharePoint list. What we want is the ability to specify the title of a report, the format we want it rendered in, and the destination. The custom connector will require the workspace ID and the report ID of the report in Power BI, in addition to the output format. We also want to be able to take advantage of parameters in paginated reports, so our subscription definition needs to contain a parameter/value pair as well.

The following SharePoint Columns will be used in a custom list:

| Column Name | Column Type |
| --- | --- |
| Title | Single line of text |
| Workspace GUID | Single line of text |
| Report GUID | Single line of text |
| File Format | Choice |
| Destination Type | Choice |
| Destination | Single line of text |
| ParameterName | Single line of text |
| ParameterValue | Single line of text |

The choices for File Format are the different output formats supported by the Export API: CSV, DOCX, IMAGE, MHTML, PDF, PNG, PPTX, XLSX, and XML. In my case I set the default to PDF, as that is the most common format, but setting a default is optional.

Power Automate supports a wide variety of file storage mechanisms, so the choices for Destination Type really depend on which destinations you want to support. In my case, I chose OneDrive for Business, SharePoint libraries, and email recipients; therefore, one subscription could save to SharePoint while another delivers a file to an email user. These destinations will be reflected in the Power Automate flow created below.

Once the list is created, it can be populated with a few entries. In my example below, I am rendering reports from tyGraph for Twitter. The first three are paginated reports going to each of the above destinations, and the last is an interactive (pbix) report being delivered to a SharePoint library.

The first three in the list are passing in a different parameter value to each report. Report parameters are not available to interactive reports, so these values are left empty for the interactive report.

The workspace GUID and the report GUID can be obtained by opening the report in a browser, and then inspecting the URL. This is true for both paginated and interactive reports.

Power Automate

Chris Webb’s post referenced above describes a pattern for rendering an export file from a Power Automate flow. We will use this within the pattern here.

The flow will iterate through the subscription list, and for each item found will render the report and save it to the desired output location. It can be created with any trigger, and for our purposes we are using the Recurrence trigger.

The first action in the flow is the SharePoint Get items action. Configure it to get all of the items from the subscription list created above.

We will need a name for the output file in multiple saving steps. It’s a good idea to create a variable for the output file name for ease of maintainability. We therefore initialize “Output File Name” as a String variable next.

We then create an “Apply to Each” Action from the control group and apply it to the “value” output from the “Get items” step above. This will iterate through each of our subscriptions.

Within the loop, we next apply the “Export to File” action from the custom connector created above. Instead of hardcoding the values however, we supply the values saved in the subscription. In addition, we pass in the parameter values taken from the subscription.

The same action can be used for both interactive and paginated reports; interactive reports will simply ignore the paginated-specific options. Many options are available here, but we are only utilizing a few of them. It should also be noted that this pattern only supports a single parameter/value pair. This is for simplicity's sake, as the action itself supports multiple pairs.

It is also important to note that the settings of each of these custom actions must be changed to turn off the “Asynchronous Pattern” for the action. Without doing this, the action will fail at run time, even though it may test successfully when creating the custom connector.

In the next step, we set the value of the output file name variable that we initialized above. This name will be used when we send the file to the destination.

In this case, we use the title, the current time, and the file format extension to create the file name. The exact formula is completely optional, but it’s a good idea to make the names unique to avoid overwriting past reports.
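As a sketch only (the "Apply to each" action name and the column names here are assumptions; in practice you would use the dynamic content picker to insert the actual fields), the expression behind the variable value could look something like this:

concat(items('Apply_to_each')?['Title'], '-', formatDateTime(utcNow(), 'yyyyMMdd-HHmmss'), '.', toLower(items('Apply_to_each')?['FileFormat']))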

In the next step, we wait. Rendering takes some time, and one of the outputs from the export action above gives us an indication of how long we need to wait. To do this, we use the built-in "Delay" action in Power Automate.

For the value of “Count” we select the “retry-after” output from the Export to file action above. It returns the number of seconds that the service estimates for the rendering of the report. This is just an estimate, and no guarantee, so it is possible that when we check on the status of the report, it will not be complete. Therefore, we need to repeat checks until it is. For that, we use a “Do Until” Action, available from the “Controls” section of the flow.

We check on the status of the export using the "Export Status" action of our custom connector. We therefore add this action into our loop, configure it appropriately, and turn off the "Asynchronous Pattern" option as above. The "Export Status" action takes three arguments: the workspace and report GUIDs (which we get from the SharePoint list item) and the exportId, which can be retrieved from the output of our "Export to File" action above as the "id" field.

The status reported as an output of this action will have one of four possible values: Succeeded, Failed, Running, or NotStarted. We want to continue checking as long as the status is neither "Succeeded" nor "Failed". This is an advanced condition for the loop, so the Advanced option must be selected and the following code added:

@or(equals(body('Export_Status')?['status'], 'Succeeded'),equals(body('Export_Status')?['status'], 'Failed'))

Here, Export_Status is the name of the action (Power Automate replaces spaces in action names with underscores when they are referenced in expressions). Keep in mind that the language here is case sensitive.

The next action added is a condition in which we inspect the value of the "status" output from the "Export Status" action. The two values that we look for are Running and NotStarted. If either of these is found, we need to wait for another estimated time interval. The entire loop will appear as below when configured.
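If you prefer to configure this check in advanced mode rather than with the basic condition cards, an equivalent expression (assuming the same action name as above) would be:

@or(equals(body('Export_Status')?['status'], 'Running'),equals(body('Export_Status')?['status'], 'NotStarted'))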

Once the loop completes, we need to inspect the status field to see whether the export succeeded or failed. If it failed, we do nothing; if it succeeded, we need to retrieve the report for storage in our destination. For this, we add another condition AFTER the "Do Until" loop to inspect the status output.

Along the No branch, we add nothing, but if the export was successful, we retrieve the contents of the report with the "Get Export File" action of our custom connector. "Get Export File" accepts the same arguments as the "Export Status" action and has a single output, Body, which will contain the body of the report.

Once the body of the report has been retrieved, we need to send it to the destination. The destination is determined from the "Destination Type" and "Destination" values in our subscription. For this, we use the "Switch" action from the Control section. In our case we have case branches for OneDrive for Business, SharePoint, and email. Fully configured, these branches appear as below.

Of course, your branches will reflect your own possible destinations. The number of possible destinations is large and constantly evolving. In this way, this approach is much less constrained than the classic data-driven subscription feature in SSRS, which supported a fixed set of outputs.

Final Thoughts

While the classic data-driven subscriptions feature from SSRS Enterprise will likely not be returning, it is possible to recreate the capability with this approach. Its decoupled nature means that it is more flexible, allowing designers to add their own logic and destinations into the process.
