
Month: January 2015

The New Power BI – Now With Enterprise!

Yesterday Microsoft announced the next step in the evolution of Power BI. It’s getting quite a bit of attention, and rightly so for its aim of bringing Business Intelligence closer to users. Democratizing BI has always proved a challenge – it’s been the realm of the gurus in white coats who hold the keys to the data. Microsoft is aiming to accomplish this democratization through a combination of user focus and, as of yesterday, a drastic change in its pricing model. Power BI just went from about $40 per user per month to free, or $9.99 per user per month for advanced capabilities. That’s quite a drop, and arguably the biggest announcement from yesterday – it will have a massive impact. The detailed price breakdown can be found here.

However, all of the focus on personal BI is, in my opinion, missing a key component. Power BI and its components have always focused squarely on personal and team BI solutions – that is, the ability for a power user to model data, visualize it quickly and easily, and share it with fellow team members. While that capability is certainly retained in the new Power BI, this new version contains the first appearance of enterprise-grade BI in the cloud from Microsoft.

To fully understand this, it’s necessary to touch on the Microsoft BI stack as it stands today.

Microsoft BI On Premises

The on-premises BI story from Microsoft can be confusing and occasionally difficult to navigate, but it is very powerful and relatively complete. In a nutshell, the story is good from a personal, team and enterprise perspective.

On the enterprise side, there are products from both the SQL Server team and the Office team. Data warehousing is served by SQL Server, and ETL duties fall to SQL Server Integration Services (SSIS). Analytical storage is served by SQL Server Analysis Services (SSAS) in both multidimensional (OLAP) and Tabular modes, and reporting is performed by SQL Server Reporting Services (SSRS). The SQL product line doesn’t have much on the client side for analysis apart from SSRS, but this slack is taken up by the analysis tools available in Excel, and through PerformancePoint Services in SharePoint.

Indeed, SharePoint also provides a platform for SSRS via SSRS SharePoint mode, and, through Excel Services, for Excel-based analytical workbooks connected to SQL Server and SSAS.

On the personal BI side, that role has traditionally fallen to Excel. The pitfalls of importing data into Excel workbooks for analysis are well documented and don’t need to be discussed here, but the bulk of those issues were addressed with the introduction of PowerPivot several years ago. PowerPivot allows massive amounts of data to be cached within the Excel file for analysis without any data integrity concerns. The addition in recent years of analytic visuals (Power View, Power Map) and ETL capabilities (Power Query) has further rounded out the offering.

Taking that Excel workbook and sharing it brings us into the realm of Team BI. This is to say that the analyses are relatively modest in size, and of interest to a targeted group. These models may not require the rigour or reliability associated with enterprise BI models. Once again, the technology involved here is SharePoint. A user can take a workbook with an embedded PowerPivot model, share it through a SharePoint library, and other users can interact with that embedded model using only a browser. This capability requires PowerPivot for SharePoint, which is really a specialized version of SSAS, along with a SharePoint service application.

One thing to note about these seemingly disparate approaches is that a power user can build a Power Pivot data model with Excel, share it with a team via SharePoint, and when it comes to require more rigour or management, it can be “upgraded” into SSAS in Tabular mode. This common model approach is powerful, and is key to understanding Microsoft’s entire BI strategy. You can also see here that SharePoint straddles the two worlds of team and enterprise BI.

Moving to the cloud

The BI workload is one of the last Microsoft workloads to move to the cloud, and with good reason. Massive amounts of data present problems of scale, and security or data sovereignty concerns tend to keep data on premises. However, there is a very real need to provide BI to users outside of the firewall.

SharePoint is the hub of BI on prem, so it’s logical to assume that with SharePoint Online, it could continue to perform that function in the cloud. The big catch here is that on-prem, SharePoint is simply the display platform. In the enterprise scenario, users connect through SharePoint to the back end servers. This isn’t an option in the cloud, so enterprise BI was left off the table.

With the personal and team BI scenarios, data is cached in a Power Pivot data model, which could be supported in the cloud. When Office 365 moved to the SharePoint 2013 code base for SharePoint online, rudimentary support for embedded Power Pivot models was indeed added. Essentially PowerPivot for SharePoint “light” was added. I call it light for two major reasons. Firstly, data models could be no larger than 10 MB. Secondly, there was no way to update the data contained within the Power Pivot cache, outside of re-uploading the Excel workbook. This is still true without a Power BI license. The inability to refresh the data renders team BI almost useless, except in static data scenarios.

The first generation of Power BI changed all of that. With a Power BI license, it was possible to install a Data Management Gateway on premises that would connect to team BI workbooks in Office 365 and update them on a scheduled basis. Yes, the gateway had many limitations (many of which have been removed over time), but finally, the on-prem refresh story was solved. In addition, the model size limit was increased to 250 MB. However, we were still left with a number of problems or limitations.

  1. Daily data refresh schedule. Automatic data refreshes could be daily at their most frequent; manual refreshes could be done at any time.
  2. Capacity. The maximum size of a data model was increased to 250 MB, which is relatively small for enterprise scenarios. In addition, refreshes aren’t differential, which means that the entire model is re-uploaded on every refresh.
  3. Data sensitivity/sovereignty. The refresh problem was solved, but because the data is still cached in the workbooks, there can be reluctance to send it outside of the corporate firewall.
  4. Per-user security. Power Pivot data models have no concept of user security within a workbook (Tabular models in SSAS do); security is at the workbook level.
  5. Cost. The initial cost of Power BI was $40 per user per month. A Power BI license was required to interact with any workbook that had a data model larger than 10 MB. Considering that a full Office 365 E3 license was around $25 per user per month, this price tended to limit the audience for sharing.

All of this is to say that Power BI in its first (and as yet current) incarnation is suitable for personal and team BI only. There has been no enterprise cloud BI story.

Power BI V2

The announcements yesterday outlined the next generation of Power BI. Going forward, Power BI will be available as a standalone offering, at the price points mentioned above. Office 365 users will continue to be able to use it from Office 365, but Office 365 will no longer be required to use it. In its early days, Power BI was a SharePoint app, but a careful examination of URLs in the current offering quickly reveals that it’s actually two apps, both running on Azure (not in SharePoint).

If you’ve signed up for the new Power BI preview, you may notice that the URL is http://app.powerbi.com/…, so this move isn’t a big surprise.

With the new model, Excel is no longer the central container. Users connect to data and publish it directly to Power BI. Behind the scenes, the service is doing something very similar to what it does with Power Pivot models – it’s storing them in SSAS. In fact, the same limits still apply – 250 MB per model (at least for now). Excel can still be used, but now as a data source.

Visualizations are performed through Power View, and data is acquired through Power Query. These are no longer add-ons, but are available on their own through the Power BI Designer. This decoupling is good for those who have not made an investment in SharePoint Online or Excel.

These changes to the architecture and the cost are great news for adoption, but don’t address the needs of the enterprise. Except for one thing – The SSAS Connector.


One of the data sources available to the new Power BI is the SSAS connector. This connector is a piece of code that runs on premises (it actually includes the Data Management Gateway) and acts as a bridge between the Power BI service and an on-prem SSAS server.

The biggest distinction worth noting is that with this connector, data is NOT uploaded to the service – it remains on premises. When a user interacts with a visualization from the cloud, a query is sent to the SSAS server through the gateway. The query is run, its results are sent back to the user’s visualization, and the data is never persisted in the service.

In addition, when the query is sent to SSAS, it is run with the permissions of the user making the request. This is accomplished through the EffectiveUserName connection string property in SSAS. This provides full user-level security, and since Tabular models in SSAS can utilize per-user security, we no longer need to rely on proxy accounts or document-level security.
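To illustrate the mechanism, here is a minimal sketch (not the connector’s actual code) of how a query can be issued against SSAS under a specific user’s identity using the EffectiveUserName connection string property. The server, database and user names below are hypothetical, and the account opening the connection must be an SSAS administrator for the property to be honoured.

    # Sketch only: shows how EffectiveUserName conveys the requesting user's
    # identity to SSAS. Server, database and user values are hypothetical.
    def build_ssas_connection_string(server, database, requesting_user):
        # The connecting account needs SSAS admin rights; the query is then
        # evaluated with the permissions of requesting_user.
        return (
            f"Provider=MSOLAP;"
            f"Data Source={server};"
            f"Initial Catalog={database};"
            f"EffectiveUserName={requesting_user};"
        )

    print(build_ssas_connection_string(
        "ssas01.contoso.local", "SalesTabular", "jsmith@contoso.com"))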

Finally, because the data is being stored in an on prem SSAS server, it can be refreshed automatically as often as desired. For the same reason, we have no capacity limits – you can grow your own SSAS servers as large as you like.

The SSAS connector removes most of the limitations that prevent cloud based enterprise Business Intelligence, and the new pricing model removes the rest. Certainly there are going to be feature limits in the near term, but it appears to me at least that the back of this thorny problem has finally been cracked.


Power BI Data Management Gateway 1.4 – Where is it heading?

I received a notice from my main Power BI tenant last night that a new version of the Data Management Gateway was available. The previous (1.2) version contained some very significant changes, so I was understandably eager to have a look. I installed it, observed a relatively attractive setup interface, then opened the release notes to find out what else was new. Only four items were listed (from the release notes):

  • Unified binary that supports both Microsoft Azure Data Factory and Office 365 Power BI services
  • Refine the Configuration UI and registration process
  • Azure Data Factory – Azure Ingress and Egress support for SQL Server data source
  • Office 365 Power BI – Bug fixes

I had already observed number two, the new setup experience. Bug fixes, while absolutely necessary, aren’t necessarily something to write about, but I think that the other two items are. While they may not have immediate impact, my bet is that they will in very short order.

The key point here is that the gateway now supports Azure Data Factory. There are many, many things that the Data Factory enables (Hadoop, anyone?), but the one that I feel is most relevant to Power BI today is the ability to connect directly to on-premises data sources. That’s not quite how it’s been done until now.

Power BI for Office 365

In the context of Power BI as we’ve come to know it today, on-prem data refreshes are handled by the Data Management Gateway. On a periodic basis (at most once per day), the service contacts the gateway, which in turn reruns all relevant queries. The resultant data is then uploaded to the service.

The service in turn packages the data and updates the host Excel workbook, and the model is transferred into a back end analysis server. Every transaction goes through the host Excel workbook.

Power BI Dashboards

If you’ve had a chance to see the preview of Power BI Dashboards, you may have noticed that it is not dependent on Office 365 or Excel at all. When you add a data source, you take the data and add it to a cloud-based data model directly (presumably backed by SQL Server Analysis Services). All visualizations work against these models, with one very important exception. If you connect to a SQL Server Analysis Services data source, you are actually connecting directly to a model hosted on an on-prem SSAS server in real time.


How is this done? The connection is made through the “Analysis Services Connector”, a separate bit of software installed on prem to facilitate the connection between the Power BI Dashboards service and the on-prem SSAS server. It’s available directly from the dashboards portal.


Installation is a process that establishes credentials for both the dashboards service and SSAS. Afterward, the connector can be reconfigured by running the “Power BI Analysis Services Connector” tool.


However, installation also adds another piece of software to the host machine: the Microsoft Data Management Gateway. This version of the DMG establishes the connection between the SSAS server and the Power BI service in real time. Up until now, the DMG didn’t work this way, so which version is it?


Until now, the most recent version of the DMG was 1.2, so this Dashboards preview contained a glimpse into a next-generation Data Management Gateway that provides some intriguing new capabilities.

Coming Together

Checking the version of the latest Data Management Gateway from Office 365 shows that it is newer than the one included in the Dashboards Preview, and presumably includes everything from it.


The key phrase in the release notes to me is therefore “Unified binary” – one gateway to rule them all, if you will. Does this mean that we’ll be able to connect to on-prem data in real time from Office 365 as well as from the Power BI preview? I don’t know how, but I bet that the building blocks are now there.

The latest version may not include support for any new data sources or any new bells and whistles, but it’s likely worth setting up for the new capabilities that will hopefully show up sooner rather than later.


Calculate a Due Date Based on Business Hours With Nintex Workflow

More than a few times, I have come across the requirement to calculate a due date based on business hours. Recently I needed such a capability for a help desk application that had been built in Nintex Workflow. It’s not quite as simple as it sounds – it’s the business hours part that makes it relatively tricky. After a cursory search, while there were a few solutions that would calculate based on days, I couldn’t find any that would suit this requirement, so I decided to build my own.

The help desk workflow looks up the type of issue from a list, retrieves the expected close time from a Service Level Agreement (SLA) field, and uses that to calculate the due date (along with alert times). SLAs are in business hours, and the business is measured on overdue times, so it is important that this calculation is accurate.

The workflow itself had originally been built with Nintex Workflow 2007, and a web service had been used for this purpose, but that option was no longer available in the customer’s new SharePoint 2013 environment due to authentication issues. The good news is that Nintex Workflow 2013 contains a feature called User Defined Actions (UDAs) that would allow me to get the job done.

In a nutshell, a UDA allows a workflow developer to encapsulate a workflow, have it accept input parameters, and have it output results as parameters. That UDA can then be used in other workflows as required – it is essentially a workflow function. I was able to successfully build a UDA to do this, and I am providing it here:

Nintex UDA to calculate a due date based on business hours

The basic logic of the UDA goes something like this. Starting with a date and a number of days, hours and minutes, it first calculates the total number of business minutes that the issue is to be open. The number of days is multiplied by the number of business minutes in a day (derived from the workflow constants), the hours are multiplied by 60, and both are added to the minutes parameter.

Next, we test whether the date is a business day (determined from a list of holidays and whether the day is a Saturday or Sunday). We then check whether the date is today and whether we are within business hours, and if so, we subtract the remaining minutes in the business day from the minutes outstanding. We then increment the day and repeat the loop until there are more business minutes available in the current day than are outstanding. Once that happens, we add the outstanding minutes to the current date and output it as the due date/time.
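For readers who prefer to follow the logic in code, here is a rough Python sketch of the same business-hours calculation. It is an illustration only, not what the UDA actually contains (the UDA is built from Nintex actions); the function name, the 8:00 AM–5:00 PM defaults and the holiday set are assumptions for the example.

    from datetime import datetime, timedelta

    def calculate_due_date(start, days, hours, minutes,
                           workday_start=480, workday_end=1020,
                           holidays=frozenset()):
        # workday_start/workday_end mirror the WorkdayStart/WorkdayEnd
        # constants: minutes from midnight (480 = 8:00 AM, 1020 = 5:00 PM).
        minutes_per_day = workday_end - workday_start
        # Convert the SLA into a total number of business minutes.
        outstanding = days * minutes_per_day + hours * 60 + minutes

        current = start
        while True:
            is_business_day = (current.weekday() < 5 and
                               current.date() not in holidays)
            if is_business_day:
                # Business minutes left in this day, given the current time
                # of day (relevant on the first iteration).
                now = current.hour * 60 + current.minute
                effective_start = max(now, workday_start)
                remaining_today = max(workday_end - effective_start, 0)
                if remaining_today >= outstanding:
                    # The due date falls within this business day.
                    due = effective_start + outstanding
                    return current.replace(hour=due // 60, minute=due % 60,
                                           second=0, microsecond=0)
                outstanding -= remaining_today
            # Otherwise move on to the start of the next calendar day.
            current = (current + timedelta(days=1)).replace(
                hour=0, minute=0, second=0, microsecond=0)

    # Example: an issue opened Friday, January 30 2015 at 4:00 PM with a
    # two business-hour SLA becomes due Monday at 9:00 AM.
    print(calculate_due_date(datetime(2015, 1, 30, 16, 0),
                             days=0, hours=2, minutes=0))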

The package above contains two files – the UDA itself, and a list template. In order to get the UDA working, you first need to create the Holidays list and two Nintex Workflow constants. The package contains a list template, Holidays.stp. I’m going to assume that adding a list template to the list template gallery is a well-known procedure and not step through it here, but it needs to be done. Once added, create a new list using the Holidays template and populate it as appropriate. It comes with three annual holidays already defined: Christmas, Boxing Day, and New Year’s Day. If a holiday falls on the same date every year, it only needs to be added once, with the “Annual” option checked (the year can be ignored). Otherwise, each holiday must be explicitly added.

Next, two workflow constants need to be created – WorkdayEnd and WorkdayStart. They use the Number type and represent the number of minutes from midnight at which the work day ends and starts respectively. For example, an 8:00 AM start is 8 × 60 minutes from midnight, so WorkdayStart would be 480; a 5:00 PM end would likewise be 1020.


We then need to import our UDA. To do so, go to Settings – Nintex Workflow – Manage User Actions.


Next, import the UDA from the file extracted from the package above (Calculate_Due_Date.uda). Once imported, it will likely be necessary to edit a couple of the UDA actions to connect them to the list and constants. These actions are easy to identify, as they will display a yellow alert flag.


You will not be able to publish the UDA until all of these connections are made.

Once published, you can use the UDA in your workflows. To do so, open the User Defined Actions section in your Nintex Workflow Designer, and drag the “Calculate Due Date” action onto the workflow design surface. Next, configure the action.


Supply a starting date/time, the number of days, hours, and minutes to leave it open, and then assign the Due Date parameter to your workflow variable. That’s all there is to it.

There is nothing compiled in the UDA, and you can modify it to meet your needs. Hopefully this helps out a few people looking for this capability.
