
Category: Cloud Computing

OneDrive, TwoDrive, ThreeDrive

I’m calling it ThreeDrive now.

The much ballyhooed “Next Generation Sync Client” for OneDrive for Business rolled out with the Windows 10 November update. You’d be excused for not noticing, because it looks pretty much the same as the old OneDrive consumer client. In fact, it IS the new OneDrive consumer client as well, but it supports OneDrive for Business too. That support isn’t obvious because, as of this writing, it requires a registry key entry. If you sign up for the OneDrive for Business preview, you’ll get the new sync client and the instructions, but for convenience, the key is:

[HKEY_CURRENT_USER\SOFTWARE\Microsoft\OneDrive] – “EnableAddAccounts”=dword:00000001

Presumably, at some point an update will switch that registry key on for everyone. Once it has been added, the OneDrive settings will show an option to add a business account.
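If you’d rather script the registry change than edit it by hand, here’s a minimal sketch using Python’s standard winreg module (Windows only, and entirely my own illustration rather than anything from the official instructions):

```python
# Create/open HKCU\SOFTWARE\Microsoft\OneDrive and set EnableAddAccounts=1,
# the same change as the .reg entry above. Run it as the user in question,
# since the value lives under HKEY_CURRENT_USER.
import winreg

key_path = r"SOFTWARE\Microsoft\OneDrive"

with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, key_path, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "EnableAddAccounts", 0, winreg.REG_DWORD, 1)
```

You may need to restart the OneDrive client before the new option shows up in its settings.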

Once added, your Office 365 OneDrive will be synchronized using the same (more robust) engine as the consumer client, and you’ll be able to do selective sync of folders, etc. You’ll also see two sync “clouds” in your system tray, one blue and one white. White is your consumer OneDrive, and blue is your OneDrive for Business.

I’m pretty sure that this brings us to TwoDrive. You’ll also get (at least) two entries in Windows File Explorer, one for Personal, and one for your corporate OneDrive. If you connect multiple Office 365 accounts, you’ll get multiple blue cloud icons, and multiple Explorer entries.

I don’t know why the consumer icon shows up blue in File Explorer when its tray icon is white, but it is what it is. To me, using white consistently would help with understanding. We do, however, have a single sync client! Well, not so fast. If I need to sync SharePoint libraries, from either on-prem or Office 365, I will still need the older OneDrive for Business sync client, with all of the same limitations. The same is true for the OneDrives contained in Office 365 Groups.

Once that’s installed and you sync a library, you’ll get another blue cloud icon that is indistinguishable from the icons created by the new sync client, along with an entry in File Explorer for SharePoint.

Everything should be working at this point. However, although I have gotten my head around this, I find it pretty confusing, and I work in this environment for a living. I know that I’m not alone; I recently spent about an hour with my friend and fellow MVP Marc Anderson helping him get his head around it, so it’s certainly not simple.

I am quite happy to see the new sync client, and the harmonization that it brings. I also know that the need for the old OneDrive for Business sync client will go away as the new client gains the ability to sync SharePoint libraries and Group-based OneDrives. I personally use cloud-based storage solutions from a number of vendors, and they all have strengths and weaknesses. OneDrive is still the best deal out there, and it’s also the best solution for corporate sharing. I am, however, concerned about the complexity. I can imagine the following future conversation with a customer.

Me: You should really look at OneDrive for Business for offline access to your content.

Customer: Oh, I have OneDrive already. Didn’t Microsoft just limit its storage capacity?

M: No – they had only turned on unlimited storage for a small test group. They just decided not to move forward with it as earlier announced. Besides, that’s only for the consumer OneDrive, not OneDrive for Business.

C: So they’re not the same thing?

M: No – OneDrive is a consumer product, and you log into it with a Microsoft account. OneDrive for Business is a business product, and you get it with an Office 365 business account. You need a corporate account to use it. It gives each person that uses it 1 TB of storage.

C: So if they’re different things, why are they both called OneDrive?

M: I know. Never mind.

C: OK, so how do I get them both working?

M: Well, you have the November update for Windows 10, right? All that you need to do is to go into your OneDrive settings, and add your business account.

C: Oh, so they use the same sync client?

M: Yes. That’s fairly new. There used to be an exclusive OneDrive for Business client, but you don’t need it now.

C: OK. (adds the business account) So how do I work with it?

M: You see those two cloud icons in your tray? The white one is your consumer account, and the blue one is your business account.

C: I thought that it was one sync client. Why are there two icons?

M: That’s so you can see the two different repositories. They are both driven by the same sync engine. It makes sense.

C: OK, cool. And how do I access my files?

M: Just open up File Explorer. The one that says OneDrive – Personal is your consumer account. The one that says OneDrive – your company name is your business OneDrive.

C: So the business one is my personal OneDrive in Office 365?

M: Yes. I know… the term “Personal” is a bit confusing, but it is what it is.

C: That’s fine. Why is the personal cloud icon blue in File Explorer but not in the tray?

M: I have no idea.

C: OK – so how do I sync my Office 365 Group OneDrives?

M: Oh. Remember when I mentioned the old sync client? You’ll need that to sync those. The new client doesn’t support them yet, but it will.

C: I thought the old one didn’t work very well?

M: It’s not as good, but it’ll do for this purpose.

C: OK, how do I install that?

M: You have Office installed, so you already have it. Just open up the OneDrive in your browser, and click on the sync icon.

C: OK (does it). So how do I know it’s working?

M: Open up your tray. See that you have another blue cloud icon? That’s the older sync client.

C: How do I tell the difference between this and the other one?

M: Just hover over the icon. The one that’s just called OneDrive for Business is the older engine. The one that contains your company name is the new one.

C: Am I going to get another icon for every OneDrive that I sync?

M: No – in this case, they all use the same icon. If you hover over it and open the folder, you’ll see what’s syncing.

C: OK. So where do I find my files?

M: Open up File Explorer. Under your two OneDrives, you’ll see a new entry for SharePoint. Click on that, and you’ll see your content.

C: What’s SharePoint?

M: *sigh*

I really quite like OneDrive, and what it can do. It’s even more valuable to me when the people that I interact with use it too, and after walking through this explanation over the past few days, I can see a few barriers to entry. I’d love to see this whole thing simplified.


Diving with Sway (or What I Did on my Autumn Vacation)

Although I threaten to do so in the description of this blog, I rarely talk about diving. I think that the last time that I did so was about 3 years ago. However, I recently encountered an intersection of my vocation and my avocation that I thought was worth sharing.

In September/October this year, my wife and I went on a liveaboard dive trip down the entire Sea of Cortez (or the Gulf of California to you folks in the USA). A liveaboard is a cruise of sorts where you live on the boat and go diving multiple times per day. You get to see some amazing things that would otherwise be unavailable.

I like to document my dives using mostly photos and sometimes video. The GoPro given to the Office MVPs last year (thanks, Microsoft!!!) at the MVP Summit has increased my use of video. Ultimately I return from these dive trips with a large collection of pictures and video, which I share out via Flickr, Facebook, OneDrive and YouTube. The problem is, these collections are disconnected, and while I try to tag and title the items, none of these collections really tell the story.

That’s where Sway comes in.

I’ve had access to Sway for a while, and while I did dabble in it a bit originally, I didn’t really “get it”. I had a lot of video this time, and the trip warranted more “narration” than a simple collection of pictures could provide. I decided to give Sway another shot, with (I think) good results. You can see for yourself below. Click on the “Made with Sway” icon at the bottom left to open it in full screen.

I found that Sway was exactly the right tool for this task. I was able to bring together diverse media elements from the trip, and organize them in ways that suited the narrative (not always chronologically). Adding the narrative itself completes the picture, and the presentation can stand on its own. To sum it up, I think that Sway adds context to content.

Sway is an online tool. There are native clients available for Android, iOS and Windows 10, but you are always working with online content – there are no files stored locally. Creation is simple. There are a number of preconfigured layouts which can be tweaked to some degree, and content addition is a simple matter of dragging and dropping from a set of cloud-based repositories. You can share (and collaborate) at any time, and when ready, you can publish it on docs.com, where it will be available in galleries, search engines, etc.

In addition to travelogues, I can see this having great usefulness in the education space. I don’t think it will be replacing PowerPoint anytime soon – they serve different purposes. I would say, though, that in the absence of a presenter, I would likely rather have access to a Sway than a PowerPoint deck – the Sway can do so much more on its own for explanations.

Finally, Sway is free. You’ll need a Microsoft account (either consumer or organizational) to use it, but you can simply point your browser at sway.com and get going. Give it a shot! I think you’ll like it. I do, and I think I’m going to dip into my back catalog to create more. When I do, I’ll be posting them in my diving collection on docs.com.


Using Excel With External Data – What’s the Right Tool?

Excel has been used with external data for… well, as long as I’ve been using Excel. So why would anyone bother to write a blog post about this, given that the capability is so mature? In recent years, Excel has adopted a number of new, and frankly better, mechanisms for working with external data, while retaining the old. Given that there are now multiple tools in Excel for working with external data, it’s not always clear which one is best, and unfortunately no single tool wins across the board, although I believe one soon will.

The answer, as always, is “it depends”. When it depends, the important thing is to understand the strengths and weaknesses of each approach. With that said, let’s have a look at all of the options.

ODC Connections

ODC (Office Data Connections) are the traditional method of accessing data in Excel. You can create or reuse an ODC connection from the Data tab in the Excel ribbon.

When using an ODC connection, you establish a connection with a data source, form some sort of query and import the resultant data directly into the Excel workbook. From there, the data can be manipulated and shaped in order to support whatever the end user is trying to do. The one exception to this behaviour is the connection to SQL Server Analysis Services (SSAS). When a connection is made to SSAS, only the connection is created. No data is returned until an analysis is performed (through a pivot table, chart etc), and then only the query results are retrieved.

When the workbook using an ODC connection is saved, the data is saved within it. In the case of an SSAS-connected workbook, the results of the last analysis are saved along with it. For small amounts of data, this is just fine, but any large analysis is bound to quickly run into the data limits of Excel, which are 1,048,576 rows by 16,384 columns in Excel 2013. In addition, such a file is very large and extremely cumbersome to work with, but even so, Excel has been the primary tool of choice for business analysts for years.

Data loaded into the workbook can be refreshed on demand, but it can also be altered, shaped, mashed up, and as is too often the case, grow stale. Workbooks such as these have become known as “spreadmarts” and are the scourge of IT and business alike. With these spreadmarts, we have multiple versions of the same data being proliferated, and it becomes harder to discern which data is most accurate/current, not to mention the governance implications.

SharePoint has provided a way to mitigate some of the concerns with these connections. SharePoint itself supports ODC connections, so users can access these workbooks stored within SharePoint and refresh data from the source either on demand or on open. A single point of storage, along with a measure of oversight and browser access, helps to restore a modicum of sanity to an out-of-control spreadmart environment, but the core issues remain.

In order to help with the core issues, Microsoft introduced PowerPivot in 2009.

PowerPivot Connections

Created in PowerPivot

PowerPivot was originally (and still is) an add-in to Excel 2010, and is a built-in add-in to Excel 2013. PowerPivot allows for the analysis of massive amounts of data within Excel, limited only by the memory available to the user’s machine (assuming a 64-bit version). It does this by highly compressing data in memory using columnar compression. The end result is that literally hundreds of millions of rows of data can be analyzed efficiently from within Excel.

You can see that compression at work by comparing the same data imported directly into an Excel workbook and into a PowerPivot model within a workbook. The two files I compared contain election data, and represent the maximum number of rows that Excel can handle directly (1,048,576) and 25 columns.

Getting data into the model was originally (and still can be) a completely separate process from bringing it into Excel. PowerPivot has its own data import mechanism, accessed from the PowerPivot window itself. First, click on the PowerPivot tab in Excel and then click Manage. If you don’t have a PowerPivot tab, you will need to enable the add-in. If you don’t have the add-in, you have an earlier version of Excel – you’ll need to download it.

Once the PowerPivot window opens, the “Get External Data” option is on the ribbon.

Once the appropriate data source is selected and configured, data will be loaded directly into the data model – there is no option to import that data into a worksheet. Once the data is in, pivot tables and pivot charts that connect to the data model can be added to the workbook, much like when creating an ODC connection to Analysis Services. In fact, it’s pretty much exactly like connecting to Analysis Services, except that the AS process is running on the workstation.

Created in Excel

PowerPivot, and more importantly the tabular data model, was included in Excel 2013. With that addition, Microsoft added a few features to make the process of getting data into the data model a little easier for users who are a little less tech savvy, and may be uncomfortable working with a separate PowerPivot window. That’s actually part of the thinking behind leaving the PowerPivot add-in turned off by default.

When a user creates an ODC connection as outlined above, there are a couple of new options in Excel 2013. First, the “Select Table” dialog has a new checkbox – “Enable selection of multiple tables”.

When this option is selected, more than one table from the data source can be selected simultaneously, but more importantly, the data will automatically be sent to the data model in addition to any other import destinations.

Even if the multiple selection option wasn’t chosen, the next dialog in the import process, “Import Data” also has a new check box – “Add this data to the Data Model”.

Its purpose is pretty self-explanatory. It should be noted that if you choose this option, and also choose “Only Create Connection”, the data will ONLY be added to the model, nowhere else in the workbook. This is functionally equivalent to doing the import from the PowerPivot window, without enabling the add-in.

Power Query Connections

When Power BI was originally announced, Power Query was also announced and included as a component. This was very much a marketing distinction, as Power Query exists in its own right, and does not require a Power BI license to use. It is available as an add-in to both Excel 2010 and 2013, and will be included with Excel 2016.

Power Query brings some Extract, Transform and Load (ETL) muscle to the Excel data acquisition story. Data can be not only imported and filtered, but also transformed with Power Query and its powerful M language. Power Query brings many features to the table, but this article is focused on its use as a data acquisition tool.

To use Power Query, it must first be downloaded and installed. Once installed, it is available from the Power Query tab (Excel 2010 and 2013).

Or from the data tab, New Query (Excel 2016)

Once the desired data source is selected, the query can be edited, or loaded into either the workbook, the data model, or both simultaneously. To load without editing the query, the load option at the bottom of the import dialog is selected.

Selecting “Load To” will allow you to select the destination for the data – the workbook, the model or both. Selecting Load will import the data to the default destination, which is by default the workbook. Given that the workbook is an inefficient destination for data, I always recommend that you change the default settings for Power Query.

To do so, select Options from the Power Query tab (2010 and 2013) or the New Query button (2016), click the Data Load section, and then specify your default settings.

Data Refresh Options

In almost every case when external data is analyzed, it will need to be refreshed on a periodic basis. Within the Excel client, this is simple enough – click on the Data tab, and then the Refresh All button, or refresh a specific connection. This works no matter what method was used to import the data in the first place. Excel data connections can also be configured to refresh automatically every time the workbook is opened, or on a periodic basis in the background.
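That manual “Refresh All” can also be scripted. As a rough illustration (my own sketch, not something from the product documentation), the following Python snippet drives the Excel COM object model through pywin32; the workbook path is made up, and Excel must be installed on the machine running it:

```python
# Open a workbook, refresh every connection in it (ODC, PowerPivot or
# Power Query alike), wait for background queries, then save and quit.
import win32com.client

excel = win32com.client.Dispatch("Excel.Application")
excel.Visible = False
try:
    wb = excel.Workbooks.Open(r"C:\Reports\Sales.xlsx")  # hypothetical path
    wb.RefreshAll()
    excel.CalculateUntilAsyncQueriesDone()  # block until async queries finish
    wb.Save()
    wb.Close(SaveChanges=False)
finally:
    excel.Quit()
```

This is essentially what the scheduled server-side refreshes described below do on your behalf, which is why the connection types each of them understands matter so much.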

However, workbooks can also be used in a browser through Office Web Apps and Excel Services (SharePoint and Office 365), or as a data source for Power BI dashboards. In these cases the workbooks need to be refreshed automatically so that consuming users see the most up-to-date data when the workbooks are opened. The tricky part is that not all of the connection types listed above are supported by all of the servers or services. Let’s dive into what works with what.

SharePoint with Excel Services

Excel Services first shipped with SharePoint 2007, is a part of SharePoint 2010 and 2013, and will be included with 2016. From the beginning, Excel Services allowed browser users to view and interact with Excel workbooks, including workbooks that were connected to back end data. The connection type supported by Excel Services is ODC, and ODC only.

Excel Services has no mechanism for maintaining data refresh. However, the data connection refresh options are supported, which means that the workbook can be automatically refreshed when opened, or on a scheduled basis (every so many minutes, in the background). Unfortunately, this can come with a significant performance penalty, and once refreshed, the data exists only in memory. The workbook in the library is not updated. The data in the workbook can only be changed by editing the workbook in the client, refreshing it, and re-saving it.

Workbooks with embedded data models (PowerPivot) can be opened in the browser, but any attempt to interact with the model (selecting a filter, slicer, etc) will result in an error unless PowerPivot for SharePoint has been configured.

SharePoint with Excel Services and PowerPivot for SharePoint

PowerPivot for SharePoint is a combination of a SharePoint service application and Analysis Services SharePoint mode. When installed, it allows workbooks that have embedded PowerPivot data models to be interacted with through a browser. The way that it works is that when such a workbook is first interacted with, the embedded model is automatically “promoted” to the Analysis Services instance, and a connection is made with it, thus allowing the consuming user to work with it in the same manner as with an SSAS-connected workbook.

The PowerPivot for SharePoint service application runs on a SharePoint server and allows for individual workbooks to be automatically refreshed on a scheduled basis. The schedule can be no more granular than once per day, but the actual data within the model on disk is updated, along with any Excel visualizations connected to it.

When the refresh process runs, it is the functional equivalent of editing the file in the client, selecting refresh all, and saving it back to the library. However, there is one significant difference. The Excel client will refresh all connection types, but the PowerPivot for SharePoint process does not understand Power Query connections. It can only handle those created through the Excel or PowerPivot interfaces.

Power Pivot for SharePoint ships on SQL Server media, and this limitation is still true as of SQL Server 2014. At the Ignite 2015 conference in Chicago, one of the promised enhancements was Power Query support in the SharePoint 2016 timeframe.

Office 365

Office 365, or more precisely, SharePoint Online supports Excel workbooks with ODC connections and PowerPivot embedded models in a browser. These workbooks can even be refreshed if the data source is online (SQL Azure), but they cannot be refreshed automatically. In addition, only ODC and PowerPivot connections are supported for manual refresh. Power Query connections require Power BI for Office 365. In addition, Office 365 imposes a 30 MB model size limit – beyond that, the Excel client must be used. In short, the Office 365 data refresh options are very limited.

Power BI for Office 365

Power BI for Office 365 brings the ability to automatically refresh workbooks with embedded data models. Data sources can be on premises or in the cloud. On-premises refresh is achieved through the use of the Data Management Gateway. It also raises Office 365’s model size limit from 30 MB to 250 MB. With Power BI for Office 365, both manual and automatic refreshes can be performed for both PowerPivot and Power Query connections; however, PowerPivot connections are currently restricted to SQL Server and Oracle data sources only.

The automatic refresh of ODC connections is not supported. A workbook must contain a data model in order to be enabled for Power BI.

Power BI Dashboards

Power BI Dashboards is a new service that allows users to design dashboards without necessarily having Office 365 or even Excel. It is currently in preview form, so anything said here is subject to change. It is fundamentally based on the data model; it currently works with Excel files as a data source, and Excel has been promised as a report source as well. The service has the ability to automatically refresh the underlying Excel files on a periodic basis, more frequently than daily.

In order for a workbook to be refreshed by Power BI, it must (at present) be stored in a OneDrive or OneDrive for Business container. It also must utilize either a PowerPivot or a Power Query connection. At present, the data source must also be cloud-based (i.e. SQL Azure), but on-premises connectivity has been promised.

SQL Server Analysis Services

Another consideration, while not a platform for workbooks, is SQL Server Analysis Services (SSAS). Excel can be used to design and build a data model, and that data model can at any time be imported into SSAS. As of version 2014, SSAS fully supports all connection types for import – ODC, PowerPivot and Power Query. Once a data model has been imported into SSAS, it can be refreshed on a schedule as often as desired, and you can connect to it with Excel, and share it in SharePoint. You can also connect to it in Power BI Dashboards through the SSAS connector. From both a flexibility and power standpoint, this is the best option, but it does require additional resources and complexity.

Refresh Compatibility Summary

For convenience, the table below summarizes the refresh options for the different connection types.

 

|                                     | ODC | PowerPivot | Power Query |
|-------------------------------------|-----|------------|-------------|
| Excel Client                        | M   | M          | M           |
| SharePoint/Excel Services           | M   |            |             |
| SharePoint/Excel Services/PP4SP     | M   | A          |             |
| SQL Server Analysis Services Import | A   | A          | A           |
| Office 365                          | M   | M          |             |
| Office 365 with Power BI            |     | A*         | A           |
| Power BI Dashboards                 |     | A          | A           |

M – Manual refresh

A – Both manual and automatic refresh

* Limited data sources only

 

The Right Tool

I started out above by saying that the selection of import tool would depend on circumstances, and that is certainly true. However, based on the capabilities and the restrictions of each, I believe that a few rules of thumb can be derived. As always, these will change over time as technology evolves.

  1. Always use the internal Data Model (PowerPivot) when importing data for analysis.

     

  2. Power Query is the future – use it wherever possible

    All of Microsoft’s energies around ETL and data import are going into Power Query. Power Query is core to Power BI, and announcements at the Ignite Conference indicate that Power Query is being added to both SQL Server Integration Services and to SQL Server Reporting Services. Keep in mind that we have been discussing only the data retrieval side of Power Query – it has a full set of ETL capabilities as well, which should also be considered.

  3. PowerPivot or ODC Connections must be used on premises

    PowerPivot for SharePoint does not support Power Query for refresh. This means that you MUST use PowerPivot connections for workbooks with embedded models. If you are already using SSAS, use an ODC connection within Excel.

  4. Power Query or PowerPivot must be used for cloud BI.

    PowerPivot connections will work for a few limited cases, but more Power Query support is being added constantly. Where possible, invest in Power Query

  5. If on-premises, consider importing your models into SSAS

    SSAS already supports Power Query. If, instead of using PowerPivot for SharePoint, Analysts build their models using Excel and Power Query, they can be “promoted” into SSAS. All that is then required is to connect a new workbook to the SSAS server with an ODC connection for end users. The Power Query workbooks can be used in the cloud, and the SSAS connector in Power BI Dashboard can directly use the SSAS models created.

  6. Choose wisely. Changing the connection type often requires rebuilding the data model, which in many cases is no small feat.

In summary, when importing data into Excel, the preferred destination is the tabular model, and to import data into that model, Power Query is the preferred choice. The only exception is on-premises deployments. In these environments, consideration should be given to connecting to an SSAS server, and failing that, PowerPivot imports are the best option.


The New Power BI – Now With Enterprise!

Yesterday Microsoft announced the next step in the evolution of Power BI. It’s getting quite a bit of attention, and rightly so for its aim of bringing Business Intelligence closer to users. Democratizing BI has always proved a challenge – it’s the realm of the gurus in the white coats that hold the keys to the data. Microsoft is aiming to accomplish this democratization through a combination of user focus, and as of yesterday, a drastic change in its pricing model. Power BI just went from about $40 per user per month, to free, or $9.99/user/month for advanced capabilities. That’s quite a drop, and arguably the biggest announcement from yesterday – it will have a massive impact. The detailed price breakdown can be found here.

However, all of the focus around personal BI is, in my opinion, missing a key component. Power BI and its components have always focused squarely on both personal and team BI solutions. That is to say the ability for a power user to model data, visualize it quickly and easily and to share it out with fellow team members. While that capability is certainly retained in the new Power BI, this new version contains the first appearance of enterprise grade BI in the cloud for Microsoft.

To fully understand this, it’s necessary to touch on the Microsoft BI stack as it stands today.

Microsoft BI On Premises

The On-Premises BI story from Microsoft may be confusing, and occasionally difficult to understand, but it is very powerful, and relatively complete. In a nutshell, the story is good from a personal, team and enterprise perspective.

On the enterprise side, there are products from both the SQL Server team and the Office team. Data warehousing is served by SQL Server, and ETL duties fall to SQL Server Integration Services (SSIS). Multidimensional analysis storage is served by SQL Server Analysis Services in both OLAP and Tabular modes, and reporting is performed by SQL Server Reporting Services (SSRS). The SQL product line doesn’t have much on the client side for analysis apart from SSRS, but this slack is taken up by the analysis tools available in Excel, and through PerformancePoint Services in SharePoint.

Indeed, SharePoint also provides a platform for SSRS via SSRS SharePoint mode, and for Excel based analytical workbooks connected to SQL Server and to SSAS through Excel Services.

On the personal BI side, that role has traditionally fallen to Excel. The pitfalls of importing data into Excel workbooks for analysis are well documented and don’t need to be discussed here, but the bulk of those issues were addressed with the introduction of PowerPivot several years ago. PowerPivot allows for massive amounts of data to be cached within the Excel file for analysis without any data integrity concerns. The addition in recent years of  analytic visuals (Power View, Power Map) and ETL capabilities (Power Query) have further rounded out the offering.

Taking that Excel workbook and sharing it brings us into the realm of Team BI. This is to say that the analyses are relatively modest in size, and of interest to a targeted group. These models may not require the rigour or reliability associated with enterprise BI models. Once again, the technology involved here is SharePoint. A user can take a workbook with an embedded PowerPivot model, share it through a SharePoint library, and other users can interact with that embedded model using only a browser. This capability requires PowerPivot for SharePoint, which is really a specialized version of SSAS, along with a SharePoint service application.

One thing to note about these seemingly disparate approaches is that a power user can build a Power Pivot data model with Excel, share it to a team via SharePoint, and when it requires sufficient rigour or management, it can be “upgraded” into SSAS in tabular mode. This common model approach is powerful, and is key to understanding Microsoft’s entire BI strategy. You can also see here that SharePoint straddles the two worlds of team and enterprise BI.

Moving to the cloud

The BI workload is one of the last Microsoft workloads to move to the cloud, and with good reason. Massive amounts of data present problems of scale, and security or data sovereignty concerns tend to keep data on premises. However, there is a very real need to provide BI to users outside of the firewall.

SharePoint is the hub of BI on prem, so it’s logical to assume that with SharePoint Online, it could continue to perform that function in the cloud. The big catch here is that on-prem, SharePoint is simply the display platform. In the enterprise scenario, users connect through SharePoint to the back end servers. This isn’t an option in the cloud, so enterprise BI was left off the table.

With the personal and team BI scenarios, data is cached in a Power Pivot data model, which could be supported in the cloud. When Office 365 moved to the SharePoint 2013 code base for SharePoint online, rudimentary support for embedded Power Pivot models was indeed added. Essentially PowerPivot for SharePoint “light” was added. I call it light for two major reasons. Firstly, data models could be no larger than 10 MB. Secondly, there was no way to update the data contained within the Power Pivot cache, outside of re-uploading the Excel workbook. This is still true without a Power BI license. The inability to refresh the data renders team BI almost useless, except in static data scenarios.

The first generation of Power BI changed all of that. With a Power BI license, it was possible to install a Data Management Gateway on premises that would connect to team BI workbooks in Office 365 and update them on a scheduled basis. Yes, the gateway had many limitations (many of which have been removed over time), but finally, the on-prem refresh story was solved. In addition, the model size limit was increased to 250 MB. However, we were still left with a number of problems or limitations.

  1. Daily data refresh schedule. Automatic data refreshes could be daily at their most frequent. Manual refreshes could be done anytime.
  2. Capacity. The maximum size of a data model was increased to 250 MB, which is relatively small for enterprise scenarios. In addition, refreshes aren’t differential, which means that the entire model is re-uploaded on every refresh.
  3. Data sensitivity/sovereignty. The refresh problem was solved, but because the data is still cached in the workbooks, there can be reluctance to send it outside of the corporate firewall.
  4. Per-user security. PowerPivot data models have no concept of user security within a workbook (tabular models in SSAS do). Security is at the workbook level.
  5. Cost. The initial cost of Power BI was $40 per user per month. A Power BI license was required to interact with any workbook that had a data model larger than 10 MB. Considering that a full Office 365 E3 license was around $25 per user per month, this price tended to limit the audience for sharing.

All of this is to say that Power BI in its first (and as yet current) incarnation is suitable for personal and team BI only. There has been no enterprise cloud BI story.

Power BI V2

The announcements yesterday outlined the next generation of Power BI. Going forward, Power BI will be available as a standalone offering, at the price points mentioned above. Office 365 users will continue to be able to use it from Office 365, but Office 365 will no longer be required to use it. In its early days, Power BI was a SharePoint app, but a careful examination of URLs in the current offering quickly reveals that it’s actually two apps currently, both running on Azure (not in SharePoint).

If you’ve signed up for the new Power BI preview, you may notice that the URL is http://app.powerbi.com/…… so this move isn’t a big surprise.

With the new model, Excel is no longer the central container. Users connect to data and publish it directly to Power BI. Behind the scenes, the service is doing a very similar thing to what it does with PowerPivot models – it’s storing them in SSAS. In fact, the same limits still apply – 250 MB per model (at least for now). Excel can still be used, but now as a data source.

Visualizations are performed through Power View, and data is acquired through Power Query. These are no longer add-ins, but are available on their own through the Power BI Designer. This decoupling is good for those that have not made an investment in SharePoint Online, or Excel.

These changes to the architecture and the cost are great news for adoption, but don’t address the needs of the enterprise. Except for one thing – The SSAS Connector.


One of the data sources available to the new Power BI is the SSAS data connector. This connector is a piece of code that runs on premises (it actually includes the Data Management Gateway). It acts as a bridge between the Power BI service, and an on prem SSAS server.

The biggest distinction worth noting is that with the gateway, data is NOT being uploaded to the service; it remains on prem. The way that it works is that when a user interacts with a visualization from the cloud, a query is sent to the SSAS server through the gateway. That query is run, its results are sent back to the user’s visualization, and the data is not persisted.

In addition, when the query is sent to the SSAS server, it is run with the permissions of the user making the request. This is accomplished through the EFFECTIVEUSERNAME feature in SSAS. This provides for full user-level security, and since tabular models in SSAS can utilize per-user security, we no longer need to rely on proxy accounts or document-level security.
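To make that concrete, here’s a small illustration (mine, not Microsoft’s) of the kind of connection string a middle tier builds when it impersonates the requesting user; the server, database and user names are hypothetical, and the account actually opening the connection needs administrative rights on the SSAS instance for EffectiveUserName to be honoured:

```python
# Build an Analysis Services connection string that impersonates the
# requesting user via the EffectiveUserName connection string property.
def build_ssas_connection_string(server: str, database: str, user_upn: str) -> str:
    return (
        f"Provider=MSOLAP;Data Source={server};"
        f"Initial Catalog={database};"
        f"EffectiveUserName={user_upn}"
    )

print(build_ssas_connection_string("ssastabular01", "SalesModel", "jane@contoso.com"))
```

Because the query runs as that effective user, any per-user (row-level) security defined in the tabular model applies automatically.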

Finally, because the data is being stored in an on prem SSAS server, it can be refreshed automatically as often as desired. For the same reason, we have no capacity limits – you can grow your own SSAS servers as large as you like.

The SSAS connector removes most of the limitations that prevent cloud based enterprise Business Intelligence, and the new pricing model removes the rest. Certainly there are going to be feature limits in the near term, but it appears to me at least that the back of this thorny problem has finally been cracked.


Power BI Data Management Gateway 1.4 – Where is it heading?

I received a notice from my main Power BI tenant last night that a new version of the Data Management Gateway was available. The previous (1.2) version contained some very significant changes so I was understandably eager to have a look. I installed it, observed a relatively attractive setup interface, then opened the release notes to find out what else was new. Only four items were listed (from the release notes).

  • Unified binary that supports both Microsoft Azure Data Factory and Office 365 Power BI services
  • Refine the Configuration UI and registration process
  • Azure Data Factory – Azure Ingress and Egress support for SQL Server data source
  • Office 365 Power BI – Bug fixes

I had already observed number two, the new setup experience. Bug fixes, while absolutely necessary, aren’t necessarily something to write about, but I think that the other two items are. While they may not have immediate impact, my bet is that they will in very short order.

The key point here is that the gateway now supports the Azure Data Factory. There are many, many things that the Data Factory enables (Hadoop, anyone?), but the one that I feel is most relevant to Power BI today is the ability to connect directly to on-premises data sources. That’s not quite how it’s been done until now.

Power BI for Office 365

In the context of Power BI as we’ve come to know it today, on-prem data refreshes are handled by the Data Management Gateway. On a periodic basis (daily at most) the service contacts the gateway, which in turn reruns all relevant queries. The resultant data is then uploaded to the service.

The service in turn packages the data and updates the host Excel workbook, and the model is transferred into a back end analysis server. Every transaction goes through the host Excel workbook.

Power BI Dashboards

If you’ve had a chance to see the preview of Power BI Dashboards, you may have noticed that it is not dependent on Office 365 or Excel at all. When you add a data source, you take the data and add it to a cloud-based data model directly (presumably backed by SQL Server Analysis Services). All visualizations work against these models, with one very important exception. If you connect to a SQL Server Analysis Services data source, you are actually connecting directly to a model hosted on an on-prem SSAS server in real time.


How is this done? The connection is made through the “Analysis Services Connector”, which is a separate bit of software installed on prem to facilitate connection between the Power BI Dashboards service and the On-Prem SSAS server. It’s available directly from the dashboards portal.


After installing it (a process that establishes credentials for both the dashboard service and SSAS), it can be reconfigured by running the “Power BI Analysis Services Connector” tool.


However, installation also adds another piece of software to the host machine: the Microsoft Data Management Gateway. This version of the DMG establishes the connection between the SSAS server and the Power BI service in real time. Up until now, the DMG didn’t work this way, so which version is it?


Until now, the most recent version of the DMG was 1.2, so this Dashboards preview contained a glimpse into the next generation Data Management Gateway that provided some intriguing new capabilities.

Coming Together

Checking the version of the Data Management Gateway just installed from Office 365, we see that it is 1.4.


This version is newer than the one included in the Dashboards preview, and presumably includes everything from it. The key phrase in the release notes, to me, is therefore “Unified binary”. One gateway to rule them all, if you will. Does this mean that we’ll be able to connect to on-prem data in real time from Office 365 as well as from the Power BI preview? I don’t know how, but I bet that the building blocks are now there.

The latest version may not include support for any new data sources, or any new bells and whistles, but it’s likely worth setting up for new capabilities that will hopefully show up sooner rather than later.
