
Tag: Azure

Connect Power BI to Azure Monitor data using Direct Query with Azure Data Explorer

Man – that’s a boring title. But it’s accurate.

A few months ago, I posted an article outlining how to connect Power BI to Azure Application Insights and Azure Log Analytics (jointly referred to as Azure Monitor) with Direct Query. That article describes an approach that allows you to use a native Kusto connector to connect to the Azure Monitor instance as if it were an ADX cluster, which allows for Direct Query to be used, among other things. The connection option for Power BI available through the Azure Monitor UI uses a web (HTML) connector to query the respective APIs, and that connector doesn’t support Direct Query.

The problem with that approach is that it’s a bit of a hack. At the time the article was written, you needed to use the old Power BI driver for Kusto to make it work, and that approach isn’t simple. Over time, it stopped working altogether for Application Insights. The ADX connector has since been updated to support connecting to Azure Log Analytics (but not Application Insights) and is therefore still valid.

There is, however, another way to achieve this by using your own ADX cluster. ADX clusters allow for “cross-cluster queries” that permit tables in a database in one cluster to be joined or unioned with tables in a completely different cluster. The same proxy addresses mentioned above can be used in one of these cross-cluster queries, and in this way, the ADX cluster simply acts as an intermediary.

Everything that you need to know about this approach can be found in the support article “Query data in Azure Monitor using Azure Data Explorer”.

To create a Power BI report that queries Azure Monitor data using Direct Query, first create a new report, and connect to data using the “Azure Data Explorer (Kusto)” connector. Enter the address of your own ADX cluster, and the name of a database within that cluster. The database itself doesn’t matter; it simply provides a scope for the query. Finally, you need to specify the query, and this is where the cross-cluster query comes into the picture. The query takes the following form:

cluster('ProxyURL').database('ResourceName').TableName

The Proxy URLs differ between Log Analytics and Application Insights. The two take the following forms:

Log Analytics:

https://ade.loganalytics.io/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.operationalinsights/workspaces/<workspace-name>

Application Insights:

https://ade.applicationinsights.io/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.insights/components/<ai-app-name>

The cross-cluster query for the table named “pageViews” in an Application Insights instance named “WhitePagesLogs”, in a Resource group named “MyResourceGroup”, in the subscription “71a90792-474e-5e49-ab4e-da54baa26d5d” is therefore:

cluster('https://ade.applicationinsights.io/subscriptions/71a90792-474e-5e49-ab4e-da54baa26d5d/resourcegroups/MyResourceGroup/providers/microsoft.insights/components/WhitePagesLogs').database('WhitePagesLogs').pageViews

It is worth explicitly noting that the resource name appears twice in the query – once in the cluster address, and again as the database name.
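
For reference, once these values are entered, the M code that the connector generates for this source looks roughly like the sketch below. This is an illustrative sketch only; the cluster address (“myadxcluster”) and database name (“MyADXDatabase”) are hypothetical placeholders for your own ADX cluster, and the query is the cross-cluster query from the example above.

let
    // Your own ADX cluster address, and any database within it – the database
    // only provides scope for the query (both values here are placeholders).
    Source = AzureDataExplorer.Contents(
        "https://myadxcluster.westus2.kusto.windows.net",
        "MyADXDatabase",
        "cluster('https://ade.applicationinsights.io/subscriptions/71a90792-474e-5e49-ab4e-da54baa26d5d/resourcegroups/MyResourceGroup/providers/microsoft.insights/components/WhitePagesLogs').database('WhitePagesLogs').pageViews",
        [MaxRows = null, MaxSize = null, NoTruncate = null, AdditionalSetStatements = null]
    )
in
    Source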

When ready, the Get data dialog box should appear as follows:

If you want to use Direct Query, don’t forget to open the Advanced Options section, and select it here.

At this point, the report can be built, and it will behave as if it were a normal ADX cluster. You can of course build more complex queries, etc., but you cannot build things like functions or materialized views, since you do not have administrative access to the engine behind Azure Monitor.

Compared to using the Power BI ADX connector directly, this approach has the advantage of being explicitly supported, and it also works with both Application Insights and Log Analytics. On the downside, there is a cost to running your own ADX cluster, although it is minimal. This cluster is simply acting as a gateway in this case, and therefore a bare minimum of resources will suffice.


Exceed the 500,000 row limit in Application Insights and Log Analytics with Power BI

The combination of Power BI and Application Insights (AI)/Log Analytics (LA) is a powerful one. These tools provide a quick, convenient, and relatively cheap way to collect and analyze telemetry on a wide variety of applications. One drawback of AI/LA is that any data query will return a maximum of 500,000 rows, which can be quite constraining in some cases. This article describes a way to work around this limit.

In this example, we’ll be working with an Application Insights instance that is being populated by the WordPress Application Insights plugin – in fact, it’s the one used on this very blog. There are a couple of ways to connect Power BI Desktop to AI data. The Power Query code is downloadable directly from Application Insights, and you can also use the Azure Data Explorer proxy address as outlined in my post on the topic here. This approach will work for both methods, and for our purposes, we’ll be using the generated Power Query code approach.

To begin, access your Application Insights instance, and open the Logs window. If necessary, dismiss the “Queries” window that pops up. Next, form your query using Kusto Query Language (KQL). In our case, we want a simple dump of all rows in the “pageViews” table, so the query is simple – just pageViews.

Once we have the query the way that we want it, we select the Export button, and choose “Export to Power BI (M query)”. M is the name of the language that Power Query uses. Once chosen, a text file will be downloaded that contains the Power Query code that we will need in Power BI Desktop.

At this point, we launch Power BI Desktop, and choose “Get Data”. Since we already have the query that we need, we will choose “Blank Query”.

Next, we name our query “Page Views” and select the Advanced Editor. This is where we paste in the query generated by Application Insights: open the file that was downloaded above, copy the contents, and paste them into this window (the top comments can be excluded).

Of note here is the value that will be automatically set for timespan. By default, this will be set to P1D, which means data will be retrieved only for the previous day. In our example, we have changed it to P365D to retrieve data for the past 365 days.
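
For reference, the pasted code has roughly the shape shown below. This is a heavily simplified sketch; the real export contains your actual application id (the <app-id> below is a placeholder) and a number of additional generated steps that shape the JSON response into a table, but it shows where the timespan value lives.

let
    // Simplified sketch only – the real exported query contains your actual
    // application id (<app-id> is a placeholder) and additional steps that
    // shape the JSON response into a table.
    Source = Json.Document(
        Web.Contents(
            "https://api.applicationinsights.io/v1/apps/<app-id>/query",
            [Query = [#"query" = "pageViews", #"timespan" = "P365D"]]   // timespan changed from the default P1D
        )
    )
in
    Source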

Selecting “Done” will load a preview of our data into Power Query. However, if we then load it into the data model, it will do so in a single pull, and we will be subject to the 500,000 row limit. What we need to do is break up our query into multiple queries, and Power Query lets us do this through the use of functions.

The first thing that we’ll need to do is to decide on how to segment the AI data. In our case, it is unlikely that we will have more than 500,000 page views per month, so if we performed one query per month, we should be able to retrieve all of our data. In order to do this, we need to go back to Application Insights, and form up a query that will return a list of year and month for our data. In our case, this query is:

pageViews
| where timestamp > now(-365d)
| summarize by 
    Year = datetime_part('Year',timestamp), 
    Month = datetime_part('Month',timestamp)

Note that the number of days in the where clause above should match the number of days used in the larger query above. Next, export this query to Power BI, and create another query in Power Query. Leave the name as default for now. Selecting Done should return a list of years and months for your data. These values are all numbers, and Power Query recognizes them as such. However, we need to work with them as text later on, so we change their types to text.
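
The type change can be done from the ribbon, but for reference it corresponds to a Power Query step along the lines of the sketch below. This is illustrative only – the inline #table simply stands in for the table returned by the exported query; in the real query the TransformColumnTypes step is appended after the last generated step.

let
    // "YearsAndMonths" stands in for the table returned by the exported query.
    YearsAndMonths = #table({"Year", "Month"}, {{2020, 10}, {2020, 11}}),
    // Convert the numeric Year and Month columns to text so that they can be
    // passed to the function created below.
    ChangedType = Table.TransformColumnTypes(YearsAndMonths, {{"Year", type text}, {"Month", type text}})
in
    ChangedType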

Now we will return to our original query, and modify it so that it only returns data for a single month. Reopen the advanced editor and replace the query “pageViews” with:

pageViews | where datetime_part('Month',timestamp) == 10 and datetime_part('Year',timestamp) == 2020

The values chosen don’t matter, but they should return data. In the end, the edited code should look as follows:

Selecting Done, we verify that we have data restricted to the specified month. This is where the fun begins. We are now going to turn this query into a function. To do so, we right click on our pageViews query, and select “Create Function”.

We are then presented with a dialog box that asks if we want to create the function without parameters. We can go ahead and select “Create”. We are then prompted to name the function, and we’ll call it “GetViewsByMonthAndYear”. We now need to edit the function. To do so, with the function selected in the query pane, we select the Advanced Editor once again. We then dismiss the warning that appears, and edit the function in two places. First, we need to define two parameters to be passed to the function, Month and Year, and then we add them to our query.

In the function declaration we add “Month as text” and “Year as text”. We then replace the explicit month and year that we originally queried for with these new variables, Month and Year. Our function code now appears as below:
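
What follows is a simplified sketch rather than the exact exported code – the real body is the full query exported from Application Insights, with the hard-coded month and year replaced by the Month and Year parameters, and <app-id> standing in for your application id.

(Month as text, Year as text) =>
let
    // Simplified sketch – the real body is the exported query, with the
    // hard-coded month and year replaced by the Month and Year parameters.
    Source = Json.Document(
        Web.Contents(
            "https://api.applicationinsights.io/v1/apps/<app-id>/query",   // <app-id> is a placeholder
            [Query = [
                #"query" = "pageViews | where datetime_part('Month',timestamp) == " & Month
                         & " and datetime_part('Year',timestamp) == " & Year,
                #"timespan" = "P365D"
            ]]
        )
    )
in
    Source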

Now we are ready to use our function. We select our query that contains the list of years and months, select the “Add Column” tab from the ribbon, and choose “Invoke Custom Function”. We give the new column a name “Views”, select our function from the dropdown, and then we select our column containing years and the column containing months to be passed to the function.
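
Behind the scenes, this dialog adds a step roughly like the one in the sketch below. Again, this is illustrative only – the inline #table and the stub function stand in for the year/month query and for the GetViewsByMonthAndYear function created above.

let
    // Stand-ins for the year/month query and the real function created above.
    YearsAndMonths = #table({"Year", "Month"}, {{"2020", "10"}, {"2020", "11"}}),
    GetViewsByMonthAndYear = (Month as text, Year as text) => #table({"name"}, {{"pageViews placeholder"}}),
    // Adds a "Views" column containing, for each row, the table returned by
    // calling the function with that row's Month and Year values.
    Invoked = Table.AddColumn(YearsAndMonths, "Views", each GetViewsByMonthAndYear([Month], [Year]))
in
    Invoked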

At this point, selecting “OK” will cause the function to be executed for each of the listed months. These are individual queries to AI, not one large one. Each query is still subject to the 500,000 row limit, but provided that no specific month exceeds that limit, all of the data will be returned.

Initially, the data is returned as a single table per month, but selecting the expand icon at the right of the column header allows us to retrieve the row values. It’s also a good idea to turn off the “Use original column name” option.

Selecting OK at this point displays all of the appropriate column values. We can then remove the “Year” and “Month” columns, as well as the original Page Views table that we used to create the function. We also need to set the data types for all of our columns because Power Query is unable to detect them using this approach.

Renaming our combined query to Views gives us the following result:

We still have a single table, but there is no longer a 500,000 row limit. At this point, we can load the data into the model and build our report.


A low cost approach for paginated report subscriptions in Power BI

Paginated reports in Power BI offer a rich set of capabilities for printing, and the generation of report documents. Paginated reports can be exported to multiple document formats, and those exports can be scheduled and delivered via email. Unfortunately, for the moment at least, paginated reports require the use of a dedicated capacity, which can be cost prohibitive for some. This article describes a pattern that will help minimize the cost of using paginated reports for subscriptions.

Dedicated capacities

The term “Premium” is often used to describe the resource (as opposed to user) based licensing option in Power BI. More accurately, this option is called “dedicated capacity”, and there are a number of different ways to use dedicated capacity. Premium is certainly one of those ways, but the Azure (A) SKU and the Embedding (EM) SKU are two others.

The different SKUs have different features and capabilities, but for the purposes of this article, they are functionally equivalent. For a detailed discussion of the different dedicated capacity SKUs, see my older post Understanding the Power BI capacity based SKUs.

The “A” SKU is of particular interest, because unlike the other two SKU types, it is billed in hourly increments as opposed to monthly. If the capacity can be started, perform a task, and then be stopped, you will only be billed for the time used to perform the task in question. Matt Allington has a post that outlines this concept in great detail: Affordable Power BI Premium for Small Business.

The process

In our scenario, we have a paginated report that uses a published Power BI dataset as a data source. As of this writing, there is no API call available to render a paginated report on demand, so we will rely on the scheduled subscription capability. In order to minimize the cost of the solution, we want the dedicated capacity to run as little as possible.

The solution will consist of an Azure Logic app and Power BI paginated report scheduling. An Azure Logic app uses the same set of actions that a flow in Power Automate does, but is a little more flexible in its permissions model.

The overall process flow is as follows:

  1. The Logic App runs on a schedule, and starts the dedicated capacity by calling the Power BI API from an HTTP action
  2. The Logic app uses the “Refresh a dataset” Power BI action to initiate a dataset refresh. Note that this is only necessary if the report’s data source is a Power BI dataset.
  3. On a schedule that allows for the dataset above to be refreshed, the report schedule runs and delivers the report to the destination addresses.
  4. Another Logic app runs on a schedule that allows time for both the data refresh in step 2 and the report delivery in step 3 to complete, and pauses the capacity. Ideally this should be less than an hour after the capacity was started in step 1, given the hourly billing increments.

Optionally, further automation can run on the destination inbox to deliver the report to alternate locations, such as a SharePoint library, etc.

Create the capacity

Prior to doing any of this, you will need to create a Power BI dedicated capacity in Azure. In Azure, the service is called “Power BI Embedded” and detailed instructions for creating it can be found in this document.

Although there are 6 possible sizes to choose from, paginated reports in the Power BI service require an A4 capacity at a minimum. It is therefore important to select the “Change size” link and choose a size that is A4 or greater.

Once created, any relevant workspaces can be assigned to this capacity. It should be noted that the capacity must be created in the same tenant as the Power BI service itself.

Starting and Refreshing

Creating an Azure Logic app is relatively straightforward. The steps involved in doing so can be found in this introductory document. Once created, we will add a schedule trigger, an HTTP action, and a “Refresh a dataset” action. The complete Logic app is below.

Logic app to start the capacity and refresh a dataset

The Recurrence step will kick off the run on a scheduled basis. You can set the recurrence to a number of time periods, and the run time will be based on the specified start date/time. In the example above, the process runs every day at 2:50 AM.

The HTTP step calls into Azure to start the capacity. The request is a POST request, and all that it requires is a single URL. You must provide 3 configuration values within that URL for it to work. The URL itself is:

https://management.azure.com/subscriptions/{SubscriptionGUID}/resourceGroups/{ResourceGroup}/providers/Microsoft.PowerBIDedicated/capacities/{CapacityName}/resume?api-version=2017-10-01

Where:

  • {SubscriptionGUID} is the GUID of the Azure subscription for the capacity
  • {ResourceGroup} is the name of the Resource Group for the capacity
  • {CapacityName} is the name of the capacity

The other important item here is the Authentication setting. There are a number of options here, and your choice will match your requirements. For this example, we are using the relatively new “Managed Identity” in Azure. Managed Identity allows one service (in this case, our Logic app) to be granted direct access to another (the Power BI capacity). In this way, the Logic app is granted permission to start and stop the capacity.

More information on managed identities can be found in this Microsoft document.

The final action in this app is the “Refresh a dataset” action. Once added, you provide your credentials, select a workspace, and then the dataset to be refreshed. The Logic app waits for the HTTP action to finish before it tries to refresh the dataset, so you do not need to worry about adding pauses.

Subscription

The subscription for the paginated report needs to be set to run some time after the dataset above has refreshed if using a dataset as a data source, or after the capacity has started if not. There is nothing unique about subscriptions for this process, but you need to be aware of the timing. The subscription must run and complete in the window between the capacity starting (and any datasets being refreshed), and the companion Logic app firing to pause the capacity.

As of this writing, subscriptions can only be delivered to mailboxes within the same tenant as the Power BI service itself.

To create a subscription, select the “Subscribe” button in the upper right of the report toolbar.

Next, complete the subscription options. Note that in this example, the subscription is set to run at 3:15 AM, which is 25 minutes after the start capacity Logic app fires, allowing for plenty of time to start the capacity, and to refresh the dataset that this report connects to.

Pausing the capacity

A second Logic app is required to pause the capacity. It needs to be scheduled to allow for the subscription to be sent, ideally within an hour of the capacity starting so that only one hour of usage is billed.

The Logic app to pause the capacity is very similar to the one to start it, with a different URL being called.

In our example, the Recurrence trigger is set to run daily at 3:45 AM, which is 55 minutes after the start app.

The HTTP action settings are identical to those in the startup logic app, with the exception that the suspend method is called instead of the resume method. The full URL is below.

https://management.azure.com/subscriptions/{SubscriptionGUID}/resourceGroups/{ResourceGroup}/providers/Microsoft.PowerBIDedicated/capacities/{CapacityName}/suspend?api-version=2017-10-01

Where:

  • {SubscriptionGUID} is the GUID of the Azure subscription for the capacity
  • {ResourceGroup} is the name of the Resource Group for the capacity
  • {CapacityName} is the name of the capacity

Summary

While paginated reports are currently a “Premium” offering, it is possible to use automation techniques, along with the Power BI Embedded Azure service, to only run the service for an hour a day, turning a cost of hundreds of dollars per day into several dollars per day.

By taking advantage of the report rendering capabilities that paginated reports afford, and by building the reports on top of pre-existing Power BI datasets, paginated reports can become a print engine for analytical reports.


How to Migrate a WordPress site to Azure Using In-App MySQL

Did this site load a little faster than it normally does? You may not have a basis of comparison, but I have noticed that pages load between 2x and 5x faster since I moved the site to Azure hosted WordPress using an In-App MySQL database. Previously I was running it on Azure as well, but it was using the 3rd party ClearDB database server. The performance increase is therefore entirely due to the difference in the database engines.

I have been running this blog as a web app on Azure for the last couple of years, ever since it became available. In fact, I wrote about how to enable hosting for multiple users on the same database when I first set it up. At the time, setting up a WordPress web app involved also provisioning a MySQL database through a third-party hosting provider, ClearDB. The initial offering was free, but as I quickly found out, the initial offering also doesn’t provide much, and I needed to upgrade it through the third party. This arrangement was fraught with difficulty. Aside from the unwelcome additional costs, managing the billing cycle was difficult. In addition, all my WordPress sites were anywhere from a little to very sluggish, and increasing Azure resources didn’t seem to help much.

Over time, I learned that I wasn’t the only one, and the performance problems seemed to be with latency between Azure and the third-party provider. However, I didn’t want to start messing around with standing up my own, and it was usable if a tad expensive. However, a month or so ago I was listening to my friends Andrew Connell and Chris Johnson on the Microsoft Cloud Show, and they mentioned that Azure had put out a preview of a native MySQL implementation. This was of course music to my ears, so I tried it out, and it appears to work quite well.

I have since moved all our WordPress properties over to this new architecture, and documented the procedure. The approach that I took should work for any WordPress site, whether it is hosted on Azure or not, but the examples I use will of course be going from Azure to Azure. I essentially create a new WordPress site, migrate the site assets to it, configure the new site to match the old one, then point the address to the new web app. There are quite likely third-party add-ons that facilitate this process, but this process is manual. I am in no way saying that this approach is a best practice, only that it worked for me. Finally, as noted above, the In-App MySQL is in Preview, not production, so if your WordPress site is critical, it would likely be a good idea to hang on for a bit. I however like to live dangerously, and if my blog goes offline for a few hours, it’s not the end of the world.

Here are the steps required to accomplish this.

1. Upgrade the existing site

The new site that will be created will use the latest version of WordPress, and any plugins that get installed will also be the most recent. To avoid any version mismatches, it’s a good idea to make sure that your WordPress version, and all your plugins are up to date.

2. Retrieve the WordPress Assets from the existing Site

You can use the built-in export feature in WordPress to retrieve all the database assets. Open the tools section, select “Export”, and choose “All Content”.

The types of content will vary depending on your WordPress installation, plugins, etc., but make sure that you select all of them. When ready, select “Download Export File”. You’ll get prompted to download an XML file – put it somewhere safe – you’ll need it later.

Next up, you’ll want to retrieve your file system based assets. These will be all your uploaded files (unless you currently use an externally hosted provider), your WordPress themes, and your plugins. Strictly speaking, this step isn’t necessary. You should be able to re-download your themes and plugins, but I have found that they aren’t always available, and that this process is faster. However, if you don’t have access to the file system of the existing site, you may not be able to do this. The uploaded files can be gathered through the import process later as well, but this approach provides an added level of safety.

For Azure, you’ll use FTP to connect to the file system and copy the files locally. For Azure hosted sites, you can set the FTP credentials by logging into the “new” Azure portal, selecting the web app for your site, then navigating to “Deployment Credentials”. You then enter a user name and a password, and save them.

Next, scroll down to “Properties” for the web app, and take note of the “FTP Host Name” and the “FTP/Deployment user”. You will use these values to connect to the file system.

Now open Explorer on a Windows PC, right click in the “This PC” node, and select “Add a network location”.

Follow the prompts entering the FTP host and the user name when prompted. Do not attempt to log in anonymously. Also, take note – the user name has the form web app name\username. When the node opens, enter the password, and you should see 4 folders. Open “site”, then “wwwroot”, and finally “wp-content”. The folders that you need are here.

Specifically, you are looking for the plugins, themes and uploads folders. Copy these folders locally and keep them handy.

3. Create the new WordPress Site

From the Azure admin portal, select “Create New”, and search for “WordPress”. There are several WordPress options to choose from, but the one we’re pursuing is published by WordPress.

Once selected, you will be prompted to fill out the details. Give the new app a name, select the Resource Group, and most importantly, the Database Provider. ClearDB is the one that we are moving away from, so “MySQL In App” is the one that we want to select.

Once you click OK, the App will be created, and WordPress will be deployed to it. The App creation happens almost immediately, but it takes a few minutes for WordPress to be fully deployed. Don’t be alarmed if there’s nothing there for a little while.

Once deployment is complete, you can simply click on the URL of the app in the “Overview” section. The URL will take the form of http://webappname.azurewebsites.net.

A browser will open and you will be prompted to complete the initial WordPress installation. Once that is complete, you will be able to log in to the WordPress administration portal.

4. Upload the Older Assets to the new WordPress Site

The next thing that we want to do is to upload the assets that we downloaded in step #2 to the new site. To do this, simply connect to the new file system via FTP by following the same steps that were used to connect to the old site in step #2. Once connected, upload the 3 folders to the wp-content folder of the new site. If there are folders that already exist, or that you don’t want to use in the new site, simply omit them from the upload. Once uploaded, we can activate the features.

5. Activate Assets in the New Site

It is important to activate and configure the plugins before the content from the existing site is imported. This is because some plugins extend the schema of the WordPress database, and any content that depends on those schema extensions will fail to import if they are not present.

Login to the administrative portal in the new site, and activate all the required plugins. If you don’t know which plugins should be activated, simply login to the administrative portal of the old site for reference. It’s a good idea to have these portals open side by side as you complete the next few actions. Once the plugins are active, go to the Appearance section, and select the same theme as the original. Once the theme is selected, it needs to be configured. Walk through all the configuration options for your theme, matching them with the original site. Any options that depend on content will need to be set after the content is imported. Once the theme is configured, the plugins should all be configured. This is a very manual process of going through all the configuration screens and comparing the settings to those of the existing site.

Finally, recreate all necessary users from the old system. You will need to match blog posts with authors during the import step. The import step will offer another opportunity to add new users, but it’s a good idea to do this prior so that complete information is added for each user.

6. Import Content from the Existing System

From the administration portal of the new WordPress site, navigate to the Tools section, and select Import. A number of options will be presented; the option that you’re interested in is “WordPress”. If you don’t already have the WordPress Import plugin, you’ll need to select “Install Now” and allow the plugin to install and activate. Once activated, select “Run Importer”, and the Import dialog will appear. Select the export file that was downloaded in Step #2 above, and then click the “Upload file and import” button to see the Import WordPress dialog.

WordPress Import is author aware, and will automatically assign posts to users that exist in the new environment based on who they were in the old one; you simply need to map them at this point. If you forgot to add a user in Step #5, you can do so here as well. Once authorship is assigned, the only other decision is whether to select “Download and import file attachments”. Strictly speaking, if all assets were brought across in step #2, this shouldn’t be necessary. What this option does is download all referenced assets from the existing blog during the import process. This doesn’t always work, particularly on larger blogs, which is why step #2 is so important.

In addition, if the content of the site results in a particularly large export file (as was the case with this one), you’ll need to increase the upload limit for your WordPress site. This can be done by creating a “.user.ini” file in the root of your WordPress installation as described here. Additionally, you may also need to increase some of the application timeout values.

7. Test

Test the new site to ensure that it works. This cannot be stressed enough. Open all the sections to ensure that everything looks right. Ideally, use browser windows open side by side with the new and the existing sites.

8. Make URL Changes to the Existing WordPress Site

It is important to follow these steps to avoid being locked out of the existing site. There are ways to correct it if it happens, but the situation is best avoided.

Open the administration portal of the existing site, and navigate to “Settings”, “General”. If the “WordPress Address (URL)” and the “Site Address (URL)” values do not match the default URL for the Azure Web App, they will need to be changed to that value here.

The address will take the form http://azurewebappname.azurewebsites.net. It’s also a good idea to navigate to that URL to ensure that it works before saving.

9. Make URL Changes to Azure

If your existing site isn’t running on the default Azure address, you’ll need to repoint it to the new site. This will cause your site to be unavailable for a few moments. To begin, you need to remove your custom domain from the existing (now “old”) site. Navigate to the Web App for the old site in the Azure portal, and select “Custom domains”. Your custom domain should appear there along with the default address (that was used in step 8).

Click on the ellipsis beside the domain, and select “Unassign”. This will remove the custom address from the old site, freeing it up to be used by the new site.

At this point, you will need to make changes to your domain with your domain registrar. You will need to change any references (A records and/or CNAME records) that you currently have for your custom address to point to the new Azure Web App. Details for those settings can be found under “Custom domains” for your new Azure Web App. Once complete, navigate to “Custom domains” in the new Web App and click on the “+” button beside “Add hostname”. Enter your custom address and the click the “Validate” button. The custom address will be tested, and if there are any issues with it, remediation steps will be provided. The Azure portal is quite good at helping to manage this step.

Once the new URL has been registered, it should be tested to ensure that it is accessible from the outside environment. Prior to testing, the old site should be stopped (but not deleted!) to ensure that it is not responding to any requests.

If SSL certificates were used on the old site, at this point they should be brought into the new Web App and bound to the site.

10. Make URL Changes to the New WordPress Site

If the custom domain is working, follow the same steps as in step 8, but on the new WordPress site, and use the custom address for the URL values. Save, and login again.

11. Final Testing

At this point the site is live, but it is worthwhile to do another round of testing with the old Web App in a stopped state. This will identify any URLs hardcoded with the old Web App default URL, and missing assets, etc.

At this point, the new WordPress site is set up and working with the In-App MySQL database. I would recommend waiting a week or so before going back and deleting the old site and its assets, just in case.


Ignite 2015 Impressions

I don’t normally do conference summaries, but Ignite was just so big, and there was so much information, that I felt the need to record my thoughts around it, and decided to share. Ignite was very much cross product, which is in line with where Microsoft seems to be headed – a focus on the function, not the tooling. With around 24,000 people in attendance, the conference, and the logistical issues that it imposed, was too big for my taste, but the amount of information was excellent, and I imagine that I’ll be digesting it for some time to come. For now, here’s how I interpreted it all.

Azure and Office 365

Cloud services are killing it.

Between Azure’s Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) offerings and Office 365’s Software as a Service (SaaS), Azure Active Directory is already sporting over 450 million active users. Azure Active Directory is what is used by Office 365, and the accounts within are otherwise known as Organizational Accounts. It’s an important metric because I believe that the Microsoft strategy is to own identity online. It makes sense when you look at what they seem to be doing.

For years, they absolutely dominated operating systems. Nothing to this day has ever really touched them on the desktop, but Apple changed the base with mobile, and developers flocked there. Google tried to do the same thing to Apple, and has been quite successful, but not fully so. While Android is in the majority in the mobile space, iOS is still quite strong, and shows no signs of diminishing. Windows isn’t really a factor in mobile, but still dominates the desktop which remains significant (about 300 million units/year), and is a factor on tablets. Microsoft got flanked by Apple and Android, and is holding the fort, but not conquering any new territory.

Microsoft now seems to be focusing on cloud services, and they don’t care what platform is being used to consume them. I think that at the core of this strategy is cloud identity – whether it is consumer (Microsoft Account) or enterprise (Azure Active Directory). With this identity strategy, Microsoft is attempting to again change the base – to outflank both Apple and Google and make the operating system almost irrelevant. Every app they’re putting out now is usually for iOS first, then Android, then Windows Phone. The new Universal app platform likely means that they will come out for Windows (desktop, phone, whatever) at initial launch with iOS, but the bottom line is that an awful lot of effort is going into supporting all platforms all the time. If the apps work well across platforms, then the choice of operating system simply becomes one of personal preference, not of features. It gets marginalized, and Microsoft owns the back end service. That’s why I think that so much effort has gone into this strategy.

Another thing that I sensed at the show was that in the past, all of the talk around identity and federation (ADFS) was about bringing your on-premises identities into the cloud to support a few new services. Now, there seems to have been a real shift, and the reason for adopting ADFS is to bring the Azure Active Directory identities back down on-premises to where legacy applications can use them. It’s a subtle shift, but discernable.

One of the more interesting products introduced into Azure recently is Logic Apps. As far as I can tell, Logic Apps are the cloud manifestation of BizTalk, which is an excellent product with a steep learning curve. Logic Apps remove the learning curve and allow you to quickly connect and flow data through multiple systems. The session on Logic Apps can be seen here:

SharePoint 2016

In the past, SharePoint announcements would warrant their own post, but now SharePoint is probably best seen as part of a greater whole. Details on SharePoint 2016 were first announced at Ignite, and I feel that the most informative session was Bill Baer’s on Wednesday morning where he outlined the major architectural changes:

Not surprisingly, this release will be very much about hybrid SharePoint/Office 365 scenarios. Some of the notable items from the talk are:

  • SharePoint Server 2016 will require 64-bit Windows Server 2012 or Windows Server 10, and SQL Server 2014 SP1 as a minimum
  • Standalone installations are no longer supported. It will be possible to install SharePoint and SQL Server on the same machine, but full SQL Server will be required, and SQL Express will no longer be supported. This obviously raises questions about whether or not there will be a free SharePoint Foundation SKU with the next release.
  • PerformancePoint will in fact be included with SharePoint 2016. I doubt very much that there will be any investments in it at all, but it will at least be there. I’d view this as legacy support.
  • SharePoint 2016 will support SAML claims as a first class citizen. That means that it will be possible to login with Azure Active Directory credentials, and is an example of bringing cloud identities on prem. However, don’t trash that domain controller just yet, I’m sure that service accounts will still need to be NTLM – SQL Server needs it.
  • There will be a new Roles Based installation. It will be much simpler to install and maintain servers with specific roles such as web front end, search, etc. BI will be one of the roles.
  • There will be new boundaries. Content databases up to Terabyte sizes, 10 GB file size limit, list thresholds of much greater than 5000 items (although how much greater was not specified)
  • No more FIM. The user profile engine that we’ve all grown to….. deal with from SharePoint 2010 and 2013 is no longer embedded. The full Forefront Identity Manager can be used, but the default profile import mechanism will be the good ol’ user import from SharePoint 2007.
  • Durable resource based links. Every object in SharePoint will receive its own resource based URL. That means that it can be moved around in the farm, and reference URLs will still work. This is like permalinks in WordPress.
  • While not final, a preview was shown of some operational reporting. This is primarily “speeds and feeds” type information that would interest a farm administrator, although simple usage reporting could be seen.
  • Integration with the Office Graph – see below section on Delve.

SQL Server 2016

The next release of SQL Server was announced at Ignite. It’s chock full of new things, focused primarily on hybrid operation and analytics. One of the more interesting concepts in this version is the ability to “stretch” a database into the cloud. With this, you can take an on-premises database, and extend it into Azure SQL, specifying rules to determine which data goes where. Given that online storage is significantly cheaper than on-premises, this makes total sense, and they’ve figured out a way to make it work reliably. The overall SQL Server keynote can be found here:

I’m very interested in the analytics capabilities, and the session outlining the improvements to SQL Server BI is found here:

I found the following items particularly notable:

  • A comment was made during the BI session that Microsoft is “Super Committed” to SQL Server Reporting Services (SSRS). Hopefully this helps quell the naysayers. SSRS is receiving a major facelift in this version, bringing a modern design experience. In addition, the parameters pane has received a great deal of attention, adding, among other things, support for cascaded dropdowns.
  • Datazen is a visualization company based in Toronto that was recently acquired by Microsoft. There is a good demo of Datazen in the session, and I highly recommend watching it. It will be included with SQL Server 2016.
  • Datazen has KPIs. It also has “sub-KPIs”. I’m not sure about you, but that sounds a lot like a scorecard to me. This may sound the eventual (see the SharePoint section) death knell for PerformancePoint, given that that’s about all that it uniquely provides to the BI stack.
  • Tabular models in SSAS (and presumably PowerPivot) will support many-to-many relationships and a host of other new features.
  • Tabular models in SSAS and PowerPivot will have time intelligence built in. No longer will separate time intelligence tables be required. It’s an open question however as to how extensible they will be and when.
  • SharePoint will allow browser editing on PowerPivot embedded workbooks. Currently, you need to launch Excel to edit a PowerPivot embedded workbook.

Office 365 Groups

I attended the roadmap on Office 365 Groups:

(video unavailable as of posting – should be shortly)

During this session, the light really went on for me. Groups was (were? Not sure about the grammar on this…it’s a name) introduced last year and appeared to be a glorified distribution list with SharePoint artifacts. However, it’s about to become the center of the Office 365 collaborative experience. It ties together Azure Active Directory objects, a SharePoint site collection, OneNote, Skype, and OneDrive into a single cohesive, non-customizable experience. It currently uses Exchange exclusively for social conversations, but full Yammer integration is promised. No date was given for the integration, but my guess is that the target is early 2016.

The current user interface is limited – too limited for my own use at the moment, but during the demonstration, a rather useful interface was shown that is coming soon. You can access Groups presently through the Outlook web client in Office 365. I’m running the Office 2016 preview on my laptop, and there is a very nice interface contained there. There was chatter, particularly in the Yammer community, about confusion as to what tool should be used when, but I think that the coming deep integration of Yammer into Groups will render this point moot.

The next UI, demonstrated in the above session looks really good, and offers a lot of benefits. There is also a mobile app coming very shortly for, you guessed it, iOS and Windows Universal, then Android.

One unanswered question from the show is whether Groups would be available on-premises.

Power BI

Power BI content was sort of sprinkled throughout the conference, without specific focus. There was a session on the new DAX features available in Power BI Designer that is worth a watch from a modeling perspective:

One talk that really impressed me was by Lukasz Pawlowski and Josh Caplan entitled Power BI for Developers:

They cover content packs, real-time analytics, and an in-depth analysis of the “how old” app that went viral during Build.

It was also announced in the SQL BI session that SSRS will in fact be included in Power BI shortly, although little detail was provided. Finally, for development, the best place to get started is http://dev.powerbi.com.

It should also be noted that Power BI was at the center of almost any analytics discussion during the conference. This is by no means a little side project.

Delve/Office Graph

Delve is a newer product in Office 365 that provides insights around what content is relevant in an organization, and how people interact with it. It’s available directly from the app launcher in Office 365, and recently, user profiles have moved to the Delve application. It’s powered by the Office Graph, which is, in essence, an advanced index that contains content from Exchange and SharePoint, and will very shortly be extensible for multiple content types. The roadmap session for Delve/Graph is available here:

During the session, it was stated that “Delve is the evolution of Enterprise Search”. Given that all of the work on Delve and the Graph is coming from Oslo and the former team from FAST search, this just makes sense. One of the major announcements around SharePoint 2016 was that SharePoint 2016 content can be crawled by the Office Graph to provide both search results and Delve results in Office Graph. The reverse will also be true in that the on-premises crawler will be able to index Office 365 content for search results, but Delve and the Graph will remain in Office 365. The surprise here was that later this year, it will be possible to do the same thing with SharePoint 2013 through a coming enhancement.

Much of this Graph goodness can also now be accessed through the new Office 365 Universal API:

tyGraph

tyGraph is our product that provides advanced analytics for Yammer. It had something of a coming out party at Ignite, and while we didn’t have a booth or any launch sessions, we were fortunate enough to have several folks, customers and thought leaders present talks that at least in part featured tyGraph. If you’re interested in analytics for your Yammer network, I recommend that you watch some or all of these sessions:

Enterprise Social, from “Ooh, Shiny” to Business Success – Melanie Hohertz, Cargill

The Microsoft Enterprise Social Journey: How We Did It – Chris Slemp, Microsoft

Gain Organizational Insights with Yammer Data Mining and Analytics – Steve Nguyen, Microsoft and Tammy Young Heck, EY

Yammer Mining: Dig in and “Listen” to What Your Big *Social* Data Is Saying – Richard diZerega, Microsoft
