

Something Went Wrong When Connecting Power BI to SSAS

One of the more powerful features in the new Power BI preview is the ability to connect your Power BI dashboard to an on-premises instance of SQL Server Analysis Services. This is done by installing a connector on premises that connects to both the SSAS server(s) and the Power BI service. Installation is straightforward, but quite often the first attempt to connect to the data results in the “oh-so-helpful” error message, “something went wrong”.

[Screenshot: the “something went wrong” error message]

The message isn’t very helpful, unless you’re speaking with a support representative.

I have found that in most cases the problem is that the user connecting to the data does not have sufficient rights on the SSAS server, or that the server does not understand who the user is. This may very well be the same user that was used to register the SSAS Connector with the service, but that doesn’t matter – the problem is that SSAS does not know about that identity.

To explain, we first need to consider how the connector is registered.

[Diagram: registering the SSAS Connector]

When the connector is registered, two sets of credentials are provided. The first credential is used to connect the connector to the SSAS server. This is a Windows credential (typically in the form DOMAIN\username), and it must be an administrator on the SSAS server. The reason for this requirement is that all connector queries to the SSAS server are funnelled through this account using the EFFECTIVEUSERNAME feature in SSAS, and EFFECTIVEUSERNAME requires admin-level access.

The second credential is used to connect the connector to the Power BI service. It is used to register the connector with the service so that it can be used by dashboard authors, and isn’t used extensively afterward. This credential will be an Organizational Account (i.e. an Office 365/Azure Active Directory identity) and needs to have enough rights to register a data source with Power BI.

Once registered, it works as follows.

[Diagram: query flow from the dashboard through the connector to SSAS]

When a dashboard user interacts with the dashboard, or accesses the data source, a request is sent to the connector with the credentials of the user making the request. The connector then establishes a connection with the SSAS server using the admin credentials registered with the connector, and issues the query with the EFFECTIVEUSERNAME parameter, which basically means “run this query using the provided user’s credentials”. The user provided is the one making the request. This allows for per-user security in Power BI, but unfortunately, it is also what typically causes the error above.
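To make that mechanism a little more concrete, here is a minimal sketch (in Python, with entirely hypothetical server, database, and user names) of the kind of connection string the connector ends up issuing. EffectiveUserName is the real SSAS connection-string property behind this; everything else in the snippet is purely illustrative.

```python
# Sketch only - illustrates how the per-user identity rides on the admin connection.
# Server, database, and user names below are hypothetical.
ADMIN_CONNECTION = {
    "Data Source": "ssas01.contoso.local",   # the connector authenticates as the registered admin
    "Initial Catalog": "SalesModel",
}

def connection_string_for(dashboard_user: str) -> str:
    """Build the connection string used to run a query on behalf of a dashboard user."""
    props = dict(ADMIN_CONNECTION)
    # SSAS evaluates all security as this user - which is only honoured
    # because the connecting account is an SSAS administrator.
    props["EffectiveUserName"] = dashboard_user
    return ";".join(f"{key}={value}" for key, value in props.items())

print(connection_string_for("jdoe@domainxxx.com"))
# Data Source=ssas01.contoso.local;Initial Catalog=SalesModel;EffectiveUserName=jdoe@domainxxx.com
```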

The issue is that SSAS only understands Windows (NTLM and Kerberos) credentials. Without doing anything else, it has no idea what an Organizational ID is. So how can it work at all? There are two ways.

The first is that your domain can be federated with Azure Active Directory – specifically, with the Azure Active Directory that your Power BI (Office 365) tenant is using. Once federation is complete, your AD domain (domainxxx.com) will be registered and trusted with your internal NTLM/Kerberos domain, and your users will be mapped to their Azure AD identities. SSAS will then understand who they are, and if granted permission, they will be able to access SSAS data via the Power BI dashboard. This is the only supported method, and it is what should be used in a production environment. There is, however, another way.

If AD federation is not an option in the short term, or you simply need to get a development or demonstration environment spun up, it is possible to “hack” your Active Directory so that SSAS can understand the organizational IDs. First, the Azure AD domain is registered as a UPN suffix in the on-premises Active Directory, and then the Active Directory users are set to use that suffix. The key part is that the user name in the internal AD (i.e. DOMAIN\xxx) must match the user portion of the Azure AD account (i.e. xxx@domainxxx.com). The entire procedure is outlined very well by Greg Galloway in this article, and I won’t repeat it here.
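As a quick illustration of that matching rule (the account names here are hypothetical), the check effectively boils down to this:

```python
# Hypothetical account names - the point is only the matching rule itself.
def accounts_match(onprem_account: str, azure_upn: str) -> bool:
    """True when DOMAIN\\user lines up with the user portion of an Azure AD UPN."""
    _, _, sam_name = onprem_account.partition("\\")   # "CONTOSO\\jdoe" -> "jdoe"
    upn_user, _, _ = azure_upn.partition("@")         # "jdoe@domainxxx.com" -> "jdoe"
    return sam_name.lower() == upn_user.lower()

assert accounts_match("CONTOSO\\jdoe", "jdoe@domainxxx.com")
assert not accounts_match("CONTOSO\\john.doe", "jdoe@domainxxx.com")
```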

Going back to the original error, it would be nice if it could be a little more descriptive. I’d be happy with “Access Denied”. These are early days, and the product is still in preview – I expect this will change. Security also may not be the only cause of this error, but it’s the only one that I’ve seen thus far. 


The New Power BI – Now With Enterprise!

Yesterday Microsoft announced the next step in the evolution of Power BI. It’s getting quite a bit of attention, and rightly so for its aim of bringing Business Intelligence closer to users. Democratizing BI has always proved a challenge – it’s been the realm of the gurus in white coats who hold the keys to the data. Microsoft is aiming to accomplish this democratization through a combination of user focus and, as of yesterday, a drastic change in its pricing model. Power BI just went from about $40 per user per month to free, or $9.99/user/month for advanced capabilities. That’s quite a drop, and arguably the biggest announcement from yesterday – it will have a massive impact. The detailed price breakdown can be found here.

However, all of the focus around personal BI is, in my opinion, missing a key component. Power BI and its components have always focused squarely on personal and team BI solutions – that is to say, the ability for a power user to model data, visualize it quickly and easily, and share it out with fellow team members. While that capability is certainly retained in the new Power BI, this new version marks the first appearance of enterprise-grade BI in the cloud from Microsoft.

To fully understand this, it’s necessary to touch on the Microsoft BI stack as it stands today.

Microsoft BI On Premises

The on-premises BI story from Microsoft may be confusing and occasionally difficult to navigate, but it is very powerful and relatively complete. In a nutshell, the story is good from a personal, team, and enterprise perspective.

On the enterprise side, there are products from both the SQL Server team and the Office team. Data warehousing is served by SQL Server, and ETL duties fall to SQL Server Integration Services (SSIS). Multidimensional analysis storage is served by SQL Server Analysis Services in both OLAP and Tabular modes, and reporting is performed by SQL Server Reporting Services (SSRS). The SQL product line doesn’t have much on the client side for analysis apart from SSRS, but this slack is taken up by the analysis tools available in Excel, and through PerformancePoint Services in SharePoint.

Indeed, SharePoint also provides a platform for SSRS via SSRS SharePoint mode, and for Excel based analytical workbooks connected to SQL Server and to SSAS through Excel Services.

On the personal BI side, that role has traditionally fallen to Excel. The pitfalls of importing data into Excel workbooks for analysis are well documented and don’t need to be discussed here, but the bulk of those issues were addressed with the introduction of PowerPivot several years ago. PowerPivot allows massive amounts of data to be cached within the Excel file for analysis without any data integrity concerns. The addition in recent years of analytic visuals (Power View, Power Map) and ETL capabilities (Power Query) has further rounded out the offering.

Taking that Excel workbook and sharing it brings us into the realm of Team BI. This is to say that the analyses are relatively modest in size, and of interest to a targeted group. These models may not require the rigour or reliability associated with enterprise BI models. Once again, the technology involved here is SharePoint. A user can take a workbook with an embedded PowerPivot model, share it through a SharePoint library, and other users can interact with that embedded model using only a browser. This capability requires PowerPivot for SharePoint, which is really a specialized version of SSAS, along with a SharePoint service application.

One thing to note about these seemingly disparate approaches is that a power user can build a Power Pivot data model with Excel, share it with a team via SharePoint, and, when it requires more rigour or management, have it “upgraded” into SSAS in tabular mode. This common model approach is powerful, and is key to understanding Microsoft’s entire BI strategy. You can also see here that SharePoint straddles the two worlds of team and enterprise BI.

Moving to the cloud

The BI workload is one of the last Microsoft workloads to move to the cloud, and with good reason. Massive amounts of data present problems of scale, and security or data sovereignty concerns tend to keep data on premises. However, there is a very real need to provide BI to users outside of the firewall.

SharePoint is the hub of BI on prem, so it’s logical to assume that with SharePoint Online, it could continue to perform that function in the cloud. The big catch here is that on-prem, SharePoint is simply the display platform. In the enterprise scenario, users connect through SharePoint to the back end servers. This isn’t an option in the cloud, so enterprise BI was left off the table.

With the personal and team BI scenarios, data is cached in a Power Pivot data model, which could be supported in the cloud. When Office 365 moved to the SharePoint 2013 code base for SharePoint online, rudimentary support for embedded Power Pivot models was indeed added. Essentially PowerPivot for SharePoint “light” was added. I call it light for two major reasons. Firstly, data models could be no larger than 10 MB. Secondly, there was no way to update the data contained within the Power Pivot cache, outside of re-uploading the Excel workbook. This is still true without a Power BI license. The inability to refresh the data renders team BI almost useless, except in static data scenarios.

The first generation of Power BI changed all of that. With a Power BI license, it was possible to install a Data Management Gateway on premises that would connect to team BI workbooks in Office 365 and update them on a scheduled basis. Yes, the gateway had many limitations (many of which have been removed over time), but finally, the on-prem refresh story was solved. In addition, the model size limit was increased to 250 MB. However, we were still left with a number of problems or limitations.

  1. Daily data refresh schedule. Automatic data refreshes could be daily at their most frequent; manual refreshes could be done anytime.
  2. Capacity. The maximum size of a data model was increased to 250 MB, which is relatively small for enterprise scenarios. In addition, refreshes aren’t differential, which means that the entire model is re-uploaded on every refresh.
  3. Data sensitivity/sovereignty. The refresh problem was solved, but because the data is still cached in the workbooks, there can be reluctance to send it outside of the corporate firewall.
  4. Per-user security. Power Pivot data models have no concept of user security within a workbook (tabular models in SSAS do); security is at the workbook level.
  5. Cost. The initial cost of Power BI was $40 per user per month. A Power BI license was required to interact with any workbook that had a data model larger than 10 MB. Considering that a full Office 365 E3 license was around $25 per user per month, this price tended to limit the audience for sharing.

All of this is to say that Power BI in its first (and as yet current) incarnation is suitable for personal and team BI only. There has been no enterprise cloud BI story.

Power BI V2

The announcements yesterday outlined the next generation of Power BI. Going forward, Power BI will be available as a standalone offering, at the price points mentioned above. Office 365 users will continue to be able to use it from Office 365, but Office 365 will no longer be required. In its early days, Power BI was a SharePoint app, but a careful examination of URLs in the current offering quickly reveals that it’s actually two apps at the moment, both running on Azure (not in SharePoint).

If you’ve signed up for the new Power BI preview, you may notice that the URL is http://app.powerbi.com/…… so this move isn’t a big surprise.

With the new model, Excel is no longer the central container. Users connect to data and publish it directly to Power BI. Behind the scenes, the service is doing something very similar to what it does with Power Pivot models – it’s storing them in SSAS. In fact, the same limits still apply – 250 MB per model (at least for now). Excel can still be used, but now as a data source.

Visualizations are performed through Power View, and data is acquired through Power Query. These are no longer add-ons, but are available on their own through the Power BI Designer. This decoupling is good for those who have not made an investment in SharePoint Online or Excel.

These changes to the architecture and the cost are great news for adoption, but don’t address the needs of the enterprise. Except for one thing – The SSAS Connector.


One of the data sources available to the new Power BI is the SSAS data connector. This connector is a piece of code that runs on premises (it actually includes the Data Management Gateway), and it acts as a bridge between the Power BI service and an on-prem SSAS server.

The biggest distinction worth noting is that with the gateway, data is NOT uploaded to the service – it remains on prem. When a user interacts with a visualization from the cloud, a query is sent to the SSAS server through the gateway. That query is run, its results are sent back to the user’s visualization, and the data is not persisted in the cloud.

In addition, when the query is sent to the SSAS server, it is run with the permissions of the user making the request. This is accomplished through the EFFECTIVEUSERNAME feature in SSAS. This provides full user-level security, and since tabular models in SSAS support per-user security, we no longer need to rely on proxy accounts or document-level security.
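To illustrate what per-user security buys you, here is a toy Python model of the behaviour – this is not SSAS code; in a real tabular model the equivalent is a DAX row filter defined on a role, which SSAS evaluates against the EFFECTIVEUSERNAME on the connection. The role assignments and data below are invented for illustration.

```python
# Toy model of row-level security: which rows each effective user is allowed to see.
ROLE_FILTERS = {
    "CONTOSO\\jdoe": "East",
    "CONTOSO\\asmith": "West",
}

SALES = [
    {"Region": "East", "Amount": 100},
    {"Region": "West", "Amount": 250},
]

def visible_rows(effective_user: str):
    """Return only the rows the effective user's role permits."""
    allowed_region = ROLE_FILTERS.get(effective_user)
    return [row for row in SALES if row["Region"] == allowed_region]

print(visible_rows("CONTOSO\\jdoe"))    # [{'Region': 'East', 'Amount': 100}]
print(visible_rows("CONTOSO\\nobody"))  # [] - no role, no data
```

The important point is that this trimming happens on the SSAS side, per query, rather than in the workbook or dashboard.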

Finally, because the data is being stored in an on prem SSAS server, it can be refreshed automatically as often as desired. For the same reason, we have no capacity limits – you can grow your own SSAS servers as large as you like.

The SSAS connector removes most of the limitations that prevent cloud based enterprise Business Intelligence, and the new pricing model removes the rest. Certainly there are going to be feature limits in the near term, but it appears to me at least that the back of this thorny problem has finally been cracked.


Power BI Data Management Gateway 1.4 – Where is it heading?

I received a notice from my main Power BI tenant last night that a new version of the Data Management Gateway was available. The previous (1.2) version contained some very significant changes so I was understandably eager to have a look. I installed it, observed a relatively attractive setup interface, then opened the release notes to find out what else was new. Only four items were listed (from the release notes).

  • Unified binary that supports both Microsoft Azure Data Factory and Office 365 Power BI services
  • Refine the Configuration UI and registration process
  • Azure Data Factory – Azure Ingress and Egress support for SQL Server data source
  • Office 365 Power BI – Bug fixes

I had already observed number two, the new setup experience. Bug fixes, while absolutely necessary, aren’t necessarily something to write about, but I think that the other two items are. While they may not have immediate impact, my bet is that they will in very short order.

The key point here is that the gateway now supports Azure Data Factory. There are many, many things that Data Factory enables (Hadoop, anyone?), but the one that I feel is most relevant to Power BI today is the ability to connect directly to on-premises data sources. That’s not quite how it’s been done until now.

Power BI for Office 365

In the context of Power BI as we’ve come to know it today, on-prem data refreshes are handled by the Data Management Gateway. On a periodic basis (daily at most) the service contacts the gateway, which in turn reruns all relevant queries. The resultant data is then uploaded to the service.

The service in turn packages the data and updates the host Excel workbook, and the model is transferred into a back end analysis server. Every transaction goes through the host Excel workbook.

Power BI Dashboards

If you’ve had a chance to see the preview of Power BI Dashboards, you may have noticed that it is not dependent on Office 365 or Excel at all. When you add a data source, the data is added directly to a cloud-based data model (presumably backed by SQL Server Analysis Services). All visualizations work against these models, with one very important exception. If you connect to a SQL Server Analysis Services data source, you are actually connecting directly to a model hosted on an on-prem SSAS server in real time.


How is this done? The connection is made through the “Analysis Services Connector”, which is a separate bit of software installed on prem to facilitate connection between the Power BI Dashboards service and the On-Prem SSAS server. It’s available directly from the dashboards portal.


Installation is a process that establishes both the dashboard (Power BI) and SSAS credentials; once installed, the connector can be reconfigured by running the “Power BI Analysis Services Connector” tool.


However, installation also adds another piece of software to the host machine: the Microsoft Data Management Gateway. This version of the DMG establishes the connection between the SSAS server and the Power BI service in real time. Up until now, the DMG didn’t work this way, so which version is it?

[Screenshot: the Data Management Gateway version installed by the Dashboards preview]

Until now, the most recent version of the DMG was 1.2, so this Dashboards preview contained a glimpse into the next generation Data Management Gateway that provided some intriguing new capabilities.

Coming Together

Checking into the latest version of the Data Management Gateway from Office 365, we see:

[Screenshot: the Data Management Gateway 1.4 version information]

This version is newer than the one included in the Dashboards preview, and presumably includes everything from it. The key phrase in the release notes, to me, is therefore “Unified Binary”. One gateway to rule them all, if you will. Does this mean that we’ll be able to connect to on-prem data in real time from Office 365 as well as from the Power BI preview? I don’t know how, but I bet that the building blocks are now there.

The latest version may not include support for any new data sources, or any new bells and whistles, but it’s likely worth setting up for new capabilities that will hopefully show up sooner rather than later.


Limitations for Power Query OData Feeds in Power BI

One of the features available in Power BI is the ability to take any defined data source and expose it as an OData feed. This is a very simple and quick way to make your existing data available through OData, as it involves little more than a check box. Complete instructions on setting this up can be found here. There are, however, a few limitations that you should be aware of before you head down this path.

Intranet Only

The OData feed feature works through the Data Management Gateway, which is normally used to keep data models stored in the cloud updated regularly with new on-premises data. When a data source is registered, an “enable OData feed” option is made available which when checked, creates an OData feed URL.

When this feed is used, a connection is made from the OData client to the Power BI service, which then redirects communication to the Data Management Gateway. This matters because the actual data connection does not go through the Power BI service: the client machine needs to be able to communicate directly with the machine hosting the Data Management Gateway. This means that the OData feed only works on the intranet – it can’t be shared publicly. For now, at least.
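For what it’s worth, consuming the feed from inside the intranet looks like any other OData call. Here’s a minimal sketch in Python – the feed URL and entity name are hypothetical (the real URL is generated when the OData feed is enabled), and authentication will depend on your environment.

```python
import requests

# Hypothetical values - substitute the feed URL generated for your data source.
FEED_URL = "https://example-powerbi-feed/odata/"
ENTITY = "DistrictSales"

# Standard OData query options; this only works from a machine that can reach
# the Data Management Gateway host, per the intranet-only limitation above.
response = requests.get(
    FEED_URL + ENTITY,
    params={"$top": 10},
    headers={"Accept": "application/json"},
)
response.raise_for_status()

for row in response.json().get("value", []):
    print(row)
```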

Data Types

The Data Management Gateway, and therefore the Power BI service, don’t support all of the data types supported by SQL Server or Oracle. If your table or view uses any field with an unsupported data type, the entire table will be unavailable for use in an OData feed. The table will appear greyed out when the list of tables to expose through OData is being configured.

[Screenshot: the DistrictMaps table greyed out in the table selection list]

In the above case, the DistrictMaps table contained a geography field, which is unsupported. A complete list of supported data types can be found here. If you are using unsupported data types, you may want to consider creating views that do not contain these fields, and exposing those.
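If you want to check a database ahead of time, a rough sketch like the following can flag columns likely to disqualify a table from the feed. This is Python with pyodbc; the server and database names, and the list of suspect types, are assumptions on my part – check the official supported data types list for the authoritative answer.

```python
import pyodbc

# Illustrative list only - consult the supported data types documentation.
SUSPECT_TYPES = {"geography", "geometry", "hierarchyid", "sql_variant"}

# Hypothetical server and database names.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=sql01;DATABASE=Sales;Trusted_Connection=yes"
)
cursor = conn.cursor()
cursor.execute(
    "SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE FROM INFORMATION_SCHEMA.COLUMNS"
)

for table, column, data_type in cursor.fetchall():
    if data_type.lower() in SUSPECT_TYPES:
        print(f"{table}.{column} uses {data_type} - consider a view that excludes it")
```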

Data Sources

Up until recently (version 1.2), the Data Management Gateway only supported performing data refreshes from two on-premises data sources – SQL Server and Oracle, which constrained its value somewhat. Version 1.2 brought support for a wide variety of Power Query data sources, which really changed the game. Now, since OData feeds utilize the Data Management Gateway, we should be able to expose all sorts of data sources as OData feeds, right?

Wrong. Well, not quite, at least. I received a question from Hrvoje Kusulja, who was trying to expose DB2 data as an OData feed through Power Query, but the OData feed option was disabled. After some testing, and communication with Microsoft, I was able to determine conclusively that while Power Query queries are supported for OData feeds, the underlying Power Query queries MUST draw from either SQL Server or Oracle. This is identical to the Power Query refresh support in version 1.1 of the Data Management Gateway. Unfortunately, we couldn’t find documentation on this anywhere.

One potential workaround if you need OData support and your data source isn’t supported would be to use an ETL system (Integration Services) to pump data into SQL or Oracle, and create the query from there.

The OData feed is a great little feature, and a nice side benefit of using Power BI and the Data Management Gateway. As with any new product, it has limits that will undoubtedly be reduced in the future, but it’s important to know where they are.


Power BI as a Product Today

Recently, I have come across several situations where people are confused about where Power BI fits in a solution scenario. There is a fair bit of confusion as to precisely what the product is and what it does. The problem is that Power BI isn’t really a product at all, but instead a collection of different products and services. Adding to the confusion is the fact that some of these products require a Power BI license, while others do not. In fact some of these products are actually embedded in other products.

Power BI is Microsoft’s cloud-based Business Intelligence solution, billed as “Self service analytics for all of your data”. In reality, it’s a little more than self-service; it is also a great solution for team BI, as it’s based on Office 365. That’s all well and good, but what is it really? What does it consist of, and how does it work? If you look at the main product site for Power BI, it’s not immediately obvious what you get when you purchase it, or what you need to run it. This post is an attempt to demystify the product.

To start, let’s break it down by its constituent components. Today Power BI consists of the following parts.

[Diagram: the constituent components of Power BI, on-premises and in the cloud]

Unfortunately, this can be rather confusing from a product perspective. Looking first at the on-premises components, Power Query, Power View, and Power Map are all Excel add-ins; Excel is therefore a prerequisite for Power BI. All of these add-ins also require (or in the case of Power Query, support) the embedded xVelocity data model, and therefore Power Pivot is a prerequisite. Power Pivot is included in Excel 2013 (Professional Plus), but it can also be downloaded for free for Excel 2010.

Also included in Excel 2013 is Power View and, with Office 2013 SP1, Power Map. Power Query is downloaded separately, but is free. This is where much of the confusion arises. Because these three add-ins are included in the product definition of Power BI, it is often assumed that a Power BI license is required to use them. It is not. These products have a life of their own, and can be fully (or almost fully) used within Excel without any association with a Power BI license.

Power Query contains a few features that will only work with a Power BI tenant, mostly around the creation and maintenance of shared queries. Since this is part of the cloud service, that makes complete sense, but none of the other features of the product are in any way reduced in the absence of a license. Power View is enhanced through a Power BI license, but only because this makes Power View reports available within the mobile client(s). Indeed, Power Map has no use whatsoever for a Power BI license. Power Map visualizations cannot be viewed at all within a browser – they are a client-side feature only. In my opinion, they shouldn’t even be included under the Power BI umbrella, but that’s just my opinion.

Thus far, I have been talking about the modelling and visualization creation aspects of the tools, but what about pure consumption clients? The whole idea of Power BI is that designers can create these models and users can interact with them. The workbooks containing these models are stored within Office 365, so do casual users need a license?

The answer is, of course, maybe. If these users are going to take advantage of any of the services specifically offered by Power BI, then the answer is yes. For example, any user can open a workbook in a browser in Office 365. However, if they want to interact with that model by using a slicer, pivot table, etc., and that model is larger than 10 MB, then the answer is yes. Obviously, if the user wants to use the Power Q&A features, then the answer is also yes.

For the record, I don’t like this answer. To my mind, designers and content creators should require a license, but consumers should not. This would greatly encourage adoption of the product, so I do hope for some changes in this area.

So, precisely what do you get when you purchase a Power BI license? These are the things that you will absolutely need a Power BI license for.

  • Opening workbooks in a browser with models larger than 30 MB on Office 365
  • Interacting with (slicers, pivot tables, etc.) workbooks in a browser with models larger than 10 MB on Office 365
  • Automatic refresh of on premises data
  • Sharing of Power Query queries
  • Refresh of Power Query queries
  • Power Q&A – Natural language queries
  • Power BI mobile application

and that’s it.

In fact, if you check out my earlier article “Whither Power Pivot for SharePoint”, you’ll see that many of the features of Power BI are already available in Power Pivot for SharePoint.

To my mind, the product “Power BI” should not include the Excel add-ins, but only list them as a requirement, much like Excel itself is a requirement. This would help to reduce confusion. The next version of Power BI will support their inclusion. If you’re interested in this new version, you can sign up for the preview when it’s ready here. I’ll be writing more about that shortly.
