

Using Excel Files with Power BI Desktop and SharePoint

As discussed in a previous post, Working with Excel Files in the Power BI Service, Excel and Power BI have a rich, complex relationship. Power BI Desktop is the primary design tool for Power BI, and it has many feature overlaps with Excel as an analytic tool. Excel can be used both as an analytic tool and a data source, and the structure of the Excel file will dictate the way that Power BI Desktop can be used with it. If Excel is being used as an analytic tool (i.e. connected to data), the appropriate items in the file can be imported into Power BI Desktop. If it is being used as a data source (data in worksheets), Power BI Desktop will connect to it, and use its data to build a model. This post attempts to articulate the nuances of both scenarios.

Importing from Excel

Power BI Desktop has an unfortunate name, in my opinion. It is a design tool and is not meant to replicate the capabilities of the Power BI service on the desktop, as the name might suggest. A better name for the product would, I believe, be Power BI Designer. Its purpose is to connect to and transform data (Power Query), build data models for analysis (Power Pivot), and build reports (the report designer).

Used as an analysis tool, Excel has all these capabilities as well. In fact, the first two (Power Query and Power Pivot) are identical to what is already in Power BI Desktop. Excel also has Power View for analytic reporting. Power View is very similar to the type of reporting in Power BI Desktop, but uses a different technology and has been deprecated for some time. As a result, Excel charts and pivot tables are the primary means of visualizing data in Excel.

So why would you need to use Power BI Desktop if you are using Excel? As explained in my previous post, the Power BI service can fully interact with Excel as an analysis tool, right from the browser. If Excel is meeting all your analytics needs, then there may be no need to introduce Power BI Desktop at all. However, if you wish to take advantage of Power BI's analytic reporting capabilities, and you have existing Excel assets, you may wish to convert them to the native Power BI format. Whatever the reason, moving from Excel to Power BI is relatively straightforward with Power BI Desktop.

From the File menu in Power BI Desktop, select Import, and then Excel Workbook Contents.

Importing Excel Files contents

You are then prompted to select an Excel file. Once selected, you are then presented with a warning dialog.

Excel files contents warning dialog

The dialog does a very good job of explaining what will happen, specifically the fact that data in worksheets will not be brought into the new file. Any Power Pivot data models or Power Queries will be brought in. If the workbook contains legacy Power View sheets, they will be converted to native Power BI visuals. In addition, any legacy (non-Power Query) data connections used by the source file's Power Pivot data model will be converted to Power Query and imported.


Legacy Power View Sheet converted to Power BI visuals

A complete list of workbook content and what is/isn’t converted is below:

Excel Content Import to PBI Desktop Support
Data in sheets Not imported
Data model (Power Pivot) Imported
Data connections Converted to Power Query and imported
Power Queries Imported
Power View Sheets Converted to PBI visuals and imported
Pivot charts/tables Not imported
Excel charts Not imported
Macros Not imported

Once imported, the new Power BI file (PBIX) lives on its own and contains no connection or any other type of relationship to the original source Excel file. If the source Excel file is changed, there is no way to update the PBIX file. Any imported data connections are between the PBIX file and the original data source. The new PBIX file can be published to the Power BI service like any other.

Connecting to Excel

Connecting to Excel as a data source is a very different thing than importing from it. In this scenario, the data in the worksheets, and only the data in the worksheets, is brought into the data model. This is very different behaviour from connecting Excel files to the Power BI service, where both the model and the worksheet data are brought in.

Using the Excel Connector

The easiest, and most obvious way to connect to Excel worksheet data is by using the Excel connector. From the ribbon in Power BI Desktop, select Get Data. The Excel connector is right at the top of the list.

Connecting to Excel files on the file system

Selecting it allows you to select your source file, then the worksheets and tables within it, and build out the data model.
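
For illustration, the query that the Excel connector generates looks roughly like the sketch below, written in Power Query's M language. The file path, sheet name, and step names are placeholders rather than anything from a real workbook:

let
    // Hard-coded local path; this is what ties the query to the local file system
    Source = Excel.Workbook(File.Contents("C:\Data\Sales.xlsx"), null, true),
    SalesSheet = Source{[Item = "Sales", Kind = "Sheet"]}[Data],
    PromotedHeaders = Table.PromoteHeaders(SalesSheet, [PromoteAllScalars = true])
in
    PromotedHeaders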

This approach works well, but it carries an important limitation: the new queries are connected to the file through a local file system path. This means that an on-premises data gateway is required for refresh. To eliminate the gateway requirement, you can connect to the file in SharePoint using the SharePoint folder connector.

Using the SharePoint Folder Connector

The SharePoint folder connector connects to all the files stored in the libraries of a SharePoint site. It allows you to report on file metadata, but it also allows you to drill into file contents.

From Power BI Desktop, select Get Data, but instead of selecting Excel, search for SharePoint and select SharePoint folder.

Using the SharePoint folder connector

Once selected, enter the URL of the SharePoint site (NOT the URL of a library or folder) in the dialog box.

Next, you will be presented with a preview of all the files in your site. Unless you are only interested in file metadata, click on the Edit button to bring up the Power Query editor.

The initial view will contain all the files in the site, but we are interested in the content of just one of those files. Every file in this view will contain the hyperlinked value “Binary” in the Content column. Clicking that link for the file that you want to connect to will drill down into the contents.

Site contents using the SharePoint folder connector

From this level, you can build your Power Query, data model, and report as needed just as if you had used the Excel connector. The difference is that now when you publish your report to Power BI, it will know the file is stored in SharePoint and will connect directly to it. It will not require a gateway for refresh purposes. Once credentials are registered, the report will refresh itself directly from the workbook stored in SharePoint.
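
Put together, the finished query ends up looking something like the sketch below. The site URL, file name, and sheet name are hypothetical placeholders; the point is that the source is the SharePoint site URL rather than a local path:

let
    // Connect to the site URL (not a library or folder)
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/finance", [ApiVersion = 15]),
    // Drill into the Binary content of the one file of interest
    SalesFile = Source{[Name = "Sales.xlsx", #"Folder Path" = "https://contoso.sharepoint.com/sites/finance/Shared Documents/"]}[Content],
    Workbook = Excel.Workbook(SalesFile, null, true),
    SalesSheet = Workbook{[Item = "Sales", Kind = "Sheet"]}[Data],
    PromotedHeaders = Table.PromoteHeaders(SalesSheet, [PromoteAllScalars = true])
in
    PromotedHeaders

Because no local file system path appears anywhere in the query, the service can refresh it without an on-premises data gateway.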

XLS vs XLSX

A note of caution: the above SharePoint folder approach only works for XLSX files. Power BI Desktop and the Power BI service both support both Excel file formats (XLS and XLSX), but service refresh does not. If the source file format is XLS and a refresh is attempted, you will receive the classic “microsoft.ace.oledb.12.0 provider” error in the Power BI service.

Excel files refresh error with XLS file types

The older Excel file format (XLS) requires an Access driver to refresh, which is not a part of the Power BI service. The newer XLSX file does not require this driver. As a result, if the source file is XLS, refreshing it requires going through an On-Premises Data gateway, and that gateway machine must also have the ACE components installed.

To recap, you can bring Excel assets into Power BI Desktop by using the import function, and you can load data from Excel files through Power Query. The two operations have very different results, and they can be combined if a source workbook contains both analyses and data.


Working with Excel files in the Power BI Service

Power BI has been able to work with Excel files since it was first introduced. Indeed, it was born from the analytic capabilities in Excel. Users can connect directly to Excel files by using the Power BI service and nothing but a browser. However, depending on the content of the Excel file, and the method of connecting, the resulting products can be very different. In this post I will attempt to clarify this behavior. A subsequent post will detail the options available when working with Excel files in Power BI Desktop.

File Structure

Excel is a multi-purpose tool. It contains all the building blocks of Power BI, and as such, it is an excellent Business Intelligence client. Excel files are also often used (much to my chagrin) as a data storage container, or as a data transport medium. Understanding how the file is structured, and what you want to do with it is key to making the right choice when combining it with Power BI.

Originally, Excel files (workbooks) were collections of worksheets. Analysts could import data into those worksheets and then analyze it with the tools that Excel provided. Although Excel was never intended to be a database, its ease of use and familiarity led many people to use it that way, and “spreadmarts” (spreadsheet data marts) quickly became a problem. The problems arose because data became stale the instant it was extracted from its source, and because storing it in worksheets meant that it could be edited (changing history) and was subject to the data size limitations of a worksheet.

To take advantage of Excel's analytic capabilities without being subject to the issues involved in worksheet data storage, the data model was introduced, initially through PowerPivot. The data model is a “miniaturized” version of the SQL Server Analysis Services tabular engine that runs inside Excel. This data model is read only, refreshable, and highly compressed, which means that its only practical data limitation is the amount of memory available on the machine running it. Importantly, this is the same engine that is used by Power BI, the advantages of which we'll explore shortly.

Excel of course still needs to be able to use worksheets and be Excel, so the worksheet capability can't simply be removed (incidentally, Power BI Desktop is effectively Excel without worksheets). Therefore, today, from a data perspective, Excel files can have data in the data model, in worksheets, or in both. From the Power BI service's perspective, the important thing is whether the file contains a data model, as it treats the two cases differently.

Getting Excel Data

From the Power BI service, you click the Get Data button, and then the Get button in the Files tile. You are then presented with one of two dialogs depending on whether you are using a personal workspace, or an app workspace.

Personal workspace


Connecting file-based data to a personal workspace

When importing into a personal workspace, there are 4 possible data sources.

A local file is one that is stored on a file system local to the machine being used. Selecting this option will allow you to work with the Excel file stored in that location, but if the file is being used as a data source (data is in the worksheets), then a data gateway will be required for any data refreshes. Power BI will also connect to a file stored in OneDrive, either Personal or Business (through Office 365). Finally, the service can work with files stored in any accessible SharePoint site (not simply Team sites, as the name would indicate).

App workspace


Connecting file-based data to an App workspace

When importing into an App workspace, there are 3 possible data sources. The Local File and SharePoint – Team Sites options are precisely the same as when importing into a personal workspace. The difference is that the OneDrive – Workspace name option replaces the two other OneDrive options. Choosing this option allows you to work with files stored in the “Group OneDrive”. Since every App workspace is backed by an Office 365 or “Modern” group, it also has access to the SharePoint site for that group. The “Group OneDrive” is the Documents library within that SharePoint site. Therefore, choosing SharePoint – Team Sites and navigating to the Documents library will render the same results in a few more mouse clicks, and it also gives access to all other document libraries within that site.

Connect vs Import

Once you navigate to the Excel file that you want to work with, you select it, and click connect. You will then be presented with two options for the file, Import or Connect.

This choice dictates how the file is brought into the Power BI service. The structure of the file determines exactly what is brought in to the service in both cases.

Connect

Clicking the Connect button allows Power BI to connect to and work with the Excel file in place. The workbook is displayed as an Excel workbook in full fidelity in the Power BI interface using Excel Online. The file itself is shown in the Workbooks section in the Power BI interface, and it stands alone from other Power BI elements (except that regions of it can be pinned to a dashboard). Connecting to an Excel report will not create a Power BI Dataset, Report, or Dashboard. All operations, including refresh (see below) are controlled through the workbook.

At no point is the file moved, or “brought in” to the Power BI service. If the file is being stored in SharePoint, or OneDrive, anything done to the file in the Power BI service will be visible to anyone with access to the file itself, whether they are a Power BI user or not. This includes refresh, which will be discussed further below, but the important part to remember here is that if the data in the connected file is refreshed through the Power BI service, and it is being stored in SharePoint (or OneDrive), all users will be able to see updated data the next time that they open the file.

Connecting to an Excel file behaves the same way whether the file contains a data model or not, but the file must contain a data model in order to be refreshed by the Power BI service.

Connected Excel file within Power BI


Import

Importing an Excel file behaves totally differently from connecting to it. When an Excel file is imported, it is treated as a data source to Power BI, and the assets within that file are brought into the Power BI service. Subsequent changes to the source file are not immediately reflected within the Power BI service, but are retrieved through the refresh process.

The way that the assets are brought into the service depends very much on the structure of the file, specifically whether or not it contains a data model. If the file does not contain a data model, then Power BI will use the data contained in the Excel worksheets to construct a new one. This is similar to what happens when a CSV file is imported into the service. If the file does contain a data model, then the worksheet data is imported, and the data model is brought into the service as-is. One important exception: if worksheet data uses the same query as the existing model, the worksheet data is ignored and the data model is brought in as-is. This is important because Excel's Power Pivot editor can be used to edit the model, creating calculated columns, calculated measures and relationships prior to import, whereas the model that is automatically created when the file does not contain one has no editing capabilities.

When an Excel file is imported, the data model (whether imported or newly created) is added to datasets, and a link to the dataset is added to the default dashboard for the workspace. If no default dashboard exists, one will be created. A report can then be authored in the service. If the workbook contains any Power View reports, these will be converted to native Power BI reports and added to the service as well. Any embedded 3D maps are not brought in.

Imported Excel File showing calculated measures


Refresh

Data refresh options and behavior depend on both the Get Data choice (Connect or Import) and the structure of the Excel file.

Connected Workbooks

If the workbook is connected to the service, and it does not contain a data model, it cannot be refreshed. This is true even if the worksheets in the workbook contain data from Power Query queries. This is the only scenario that does not support refresh in any way.

If the workbook contains a data model, refresh is supported. The interesting part is that refresh will be triggered not only for the data model itself, but also for any worksheets that have Power Queries as a data source. Therefore, a workaround to the lack of refresh support for a workbook with no data model is to add a blank data model.

For refresh to work, the data source must be available to the Power BI service. This means that the source must be available in the cloud or registered on an available On-Premises Data Gateway.

The important thing to note about connected workbooks is that refresh operations performed on them are permanent: refreshed data is stored with the workbook. This means that if the connected workbook is stored in SharePoint, or shared through OneDrive, updated data is available to all users with access, regardless of whether they are Power BI users.

Imported Workbooks

Refresh options for imported workbooks are slightly more complicated. As mentioned above, either data is imported from the worksheets, a data model is imported into the service, or both.

If data was imported from worksheets, then the Excel file is the data source from the standpoint of Power BI. If the file is stored in SharePoint or OneDrive, it will automatically be refreshed every hour by default. This means that changes to the underlying Excel file will be reflected in the Power BI service within an hour. This feature can be disabled, but it is not possible to change the hourly schedule, or to control precisely when the refresh will occur.

Refresh options for workbooks in OneDrive/SharePoint


If the file is stored on a file system, it can be scheduled more granularly, but you will need to connect to it through an On-Premises Data Gateway.

If the file contained a data model that was imported into the service, then the original source of data for that data model (the query) is what the Power BI service will refresh from (NOT the Excel file itself). In this case, the refresh options are the same as with most other Power BI data sources: Excel is taken out of the picture completely, and any changes to the source Excel file will not be reflected in the service. The exception to this is a file that had both a data model and worksheet data that was imported.

In the case of an Excel file with both a data model and worksheet data, both cases above apply. The workbook is used as a data source for the table that was created by Power BI on import, and the original data model's source is updated independently. This means that changes to the worksheet data are reflected in the Power BI service when refreshed, but any model changes in the original Excel file are not. Both the OneDrive and regular refresh schedules are used for imported files of this type.

Refresh options for a combined data source


The following summarizes the refresh options available for each combination of file structure and Get Data option.

Worksheet data only: Connect = no refresh; Import = refresh from the worksheet
Data model only: Connect = refresh from the model source; Import = refresh from the model source
Data model plus worksheet data: Connect = refresh from the model source and the worksheet source; Import = refresh from the model source and the worksheet

Summary

Both Excel and Power BI are powerful tools in their own right, and the decision to use one does not preclude using the other; in fact, there are many good reasons for using both. Bringing refreshability to Excel files stored in SharePoint is just one of them. It is, however, important to understand how it all works in order to get the maximum benefit.


Power BI Analyze in Excel – The beginning of a beautiful thing?

One of the announcements made at the Microsoft Data Summit in March 2016 was the availability of Analyze in Excel. This feature allows an Excel workbook to connect to a data model that is stored in the Power BI service and use it to analyze the data contained within. With this approach, Excel is not importing data, or at least it is not importing any more data than the query results. It is exactly like connecting Excel to SQL Server Analysis Services data sources, something that Excel users have been doing for years. Well, to be completely accurate, it's not LIKE connecting to SSAS, it IS connecting to SSAS. The only difference is that in this case SSAS is in the cloud. This feature significantly enhances the utility of the Power BI service, and it is important for several reasons that may not be all that obvious. I'd like to walk through a few of them, but let's start with the obvious.

Excel is a very powerful analytical tool

As nice and as attractive as Power BI visuals are, Excel still rules the roost when it comes to advanced analytics. Excel has been doing this for years and is very mature. It supports features such as pivot tables, pivot charts, and drill-through to data, where Power BI reports still do not. The lack of these features can be a blocker for Power BI on its own, but if the data models in Power BI can be analyzed with Excel, a move to Power BI is no longer an either/or decision. You can deploy models and reports into Power BI and take advantage of all the goodness there, but you can also connect with Excel when deep analysis is needed. With Analyze in Excel, you can have it both ways.

A wider audience for your data models

Very often, the person that builds the data model is the same person that does the analysis. This is the nature of self-service analytics. In the past, when the only multidimensional analysis tools were OLAP cubes and connected Excel workbooks, cube design was a specialized skill. The cubes were published, and users would use them as-is. The advent of the data model (Power Pivot) and self-service analytics lowered the skills bar so that analysts could acquire data, model it, and analyze it, making the entire process much quicker and more responsive. However, this still puts the model out of reach for those with no data modeling skills or interest.

Analyze in Excel provides the best of both worlds. Analysts can build models and reports in Power BI, and users that need more in-depth analysis can connect to them with Excel without having to reinvent the wheel. This in effect provides the same capability that PowerPivot for SharePoint does on premises. One model can now reach a much wider audience of users. This has many of the benefits of an on-premises SSAS deployment without all of the organizational overhead of getting one up and running.

Uses the Analysis Services OLEDB Driver

The connection that is made from Excel to the Power BI service uses the latest version of the classic Analysis Services OLEDB driver. This is the driver that has always been used to allow Excel (and other tools) to communicate with Analysis Services, and this new version has been updated to work with the cloud-based SSAS service. In fact, in order to use the feature, you must first download and install the updated driver. Therefore, in theory, any tool that uses this driver should be able to communicate with Power BI models as if Power BI were one great big SQL Server Analysis Services server (because it is).

It really is Analysis Services in the cloud

The Power BI service itself is backed by tabular mode SSAS. Until now, it was necessary to go through the service to access it. Analyze in Excel is the first case that I know of where a client application communicates directly with that SSAS instance. While this connection really uses the Power BI API, it does raise the question: can a fully Platform as a Service version of Analysis Services be very far away?

Claims based authentication and Power BI API

None of the products in the SQL Server suite currently supports claims authentication. This is true even for the as-yet-unreleased SQL Server 2016. Even SQL Azure, a cloud-based version of SQL Server, supports SQL authentication only (although Azure Active Directory authenticated databases are currently in preview). However, looking at the connection string contained in the ODC file used by the Analyze in Excel feature reveals some interesting things. Here's one connection string:

<odc:ConnectionString>Provider=MSOLAP.7;Integrated Security=ClaimsToken;Identity Provider=AAD;Data Source=https://analysis.windows.net/powerbi/api;;Initial Catalog=xxxxxxx; ……..

The value MSOLAP.7 for the provider indicates that this is the next version of the SSAS OLEDB driver. No surprises there, but this does hint at future compatibility (see SharePoint below). The values for Integrated Security and Identity Provider (ClaimsToken and AAD) indicate that it is leveraging Azure Active Directory claims authentication. We therefore have a version of SSAS that can use claims-based authentication. This isn't available to on-premises installations, but given that the capability has been built, I imagine that it is not all that far away.

Finally, the Data Source value indicates that the Power BI API is being used to marshal all communication with the back-end service. I think that it is reasonable to conclude that any API for a PaaS-based version of SSAS would be based on, or strongly resemble, the Power BI API. They may even be one and the same.

Excel Online in SharePoint

As anyone that has set up PowerPivot for SharePoint can tell you, SharePoint supports the configuration of new OLEDB drivers. This support carries forward into Office Online Server in the world of SharePoint 2016. Given that both SharePoint and OOS utilize claims-based authentication, it should theoretically be possible to create a workbook that uses the Analyze in Excel feature, store it in SharePoint, and have it work for multiple users from within a browser. I imagine that more plumbing is needed at this point, but it would be an interesting way of integrating Power BI in the cloud with SharePoint both on premises and Online.

Reusing the Excel Files, and Limitations

In the same vein as discussed with SharePoint, Power BI itself allows Excel files to be interacted with in the service in exactly the same manner that Excel Online does. Theoretically, one should be able to use Analyze in Excel to build a workbook, then connect it to Power BI and have it work for interaction. While it is possible to connect it, all interactions fail at the moment. It appears that the Power BI service (or the backing Office Online service) does not yet support the new OLEDB driver.

Another current limitation of this feature is that data sources using Direct Query (this includes SSAS sources) or sources created by the Power BI API cannot be used with Analyze in Excel. At least not yet.

Analyze in Excel is another useful tool in the Power BI arsenal, but as outlined above, I think that it’s a harbinger of even greater things to come.


Analyze in Excel – Setup and initial issues

Analyze in Excel is an extremely important new Power BI feature, for reasons that I'll outline in more depth in an upcoming post. If you want to try this feature for yourself, there's a good possibility that you'll hit one of the errors that I ran into. This post will hopefully help other intrepid pioneers get past the hurdles and start working with this fantastic new feature.

Initialization of the Data Source failed

In order to enable the Analyze in Excel feature, Excel must use the Microsoft AS OLE DB Provider from SQL Server 2016. This driver supports claims-based authentication for SSAS, which is what Power BI uses. Observant readers will notice that I just used “claims-based authentication” and “SSAS” in the same sentence, but I digress. What's that you say? SQL Server 2016 has not yet reached RTM? That's OK, because you can use the driver from the release candidates, which can be obtained directly from the SQL Server 2016 Feature Pack. More commonly though, you'll get it from the Power BI user interface. There it can be acquired in one of two ways.

You can proactively download the updates from the Power BI “Download Center”, which is the little down arrow in the upper right of the toolbar:

Clicking the “Analyze in Excel updates” button will download the installer for the driver, which you can run right away.

Running the file takes you through the installation of the driver, which is simple and wizard driven.

Once installed, you may then navigate to a data source and launch “Analyze in Excel”.

At this point you will be prompted for two things. First, you will be prompted to download and launch an ODC file. This is the connection file that will be used to connect your Excel client to the SSAS service that your data source is housed in. If you've already installed the new driver, you can just go ahead and run it, and Excel will launch. If you haven't, you can take advantage of the second (simultaneous) prompt, which is to download the driver. This prompt will appear whether or not you have already installed it.

This approach is a bit different. The dialog has a big yellow box that just begs to be clicked, and it will install the 32-bit version of the driver. You can also take the high road and install the 64-bit version, which of course you'll need if your Excel is 64-bit. After all, anybody serious about doing data in Excel is using 64-bit Excel, right? In any event, that's the difference between the second approach and the first: the first approach does not give you the option, it just goes ahead and uses the 32-bit version.

I of course originally opted for the first option, and when the ODC launched and Excel opened, I received the error “Initialization of the Data Source failed”, which is hardly intuitive.

The issue, of course, is a mismatch between the bit level of the driver and the bit level of Excel. The solution to this problem is quite simple: the 32-bit version needs to be removed, and the 64-bit version installed. You'll find the offending package listed in Programs and Features as “Microsoft AS OLE DB Provider for SQL Server 2016”, but it makes no mention of the bit level.

If you happen to have both installed (you can), the only way to tell the difference between the two is that the 64-bit version is about twice as large. This distinction also holds for the installer files; they are named the same, but the 32-bit version is approximately 29 MB, and the 64-bit version weighs in at around 62 MB.

Excel cannot find OLAP cube Model

About a week ago I completed a small IoT project that takes data from several weather stations and pumps it into a number of sources, one of them being Power BI. Writing data directly to a Power BI data model through the API (which is what Azure Stream Analytics does) introduces a number of idiosyncrasies into the mix, one of them being that the data can only be updated from the API. Apparently another is that you can't use the Analyze in Excel feature with it; trying to do so produces the “Excel cannot find OLAP cube Model” error above.

You also cannot use Analyze in Excel with any Direct Query data sources or on-premises SSAS through a gateway. The solution to this is apparently patience – it’s currently not supported. I can only presume that it is coming soon, but for now, you can stop beating your head against the wall, it won’t work.

The HTTP server returned the following error: Forbidden

This one sounds pretty ominous, doesn’t it? It also doesn’t make sense. You needed to be logged in in order to find the “Analyze in Excel” button in the first place, but this error is indicating that you don’t have access. This problem occurs when you have multiple AAD (Organizational) or Microsoft accounts, and you are maintaining a connection to one that does not have access to the data source. It’s particularly galling, and difficult to remedy if you have an AAD account and a Microsoft account that use exactly the same email address. I know, because I do. In order to fix this, you need to force the connection to logout, but the only way to do that currently is to modify the ODC file.

An ODC file is simply an XML file that can be edited with any text editor. Once open, search for the connection string element (<odc:ConnectionString>) and add the desired user ID to it by appending “User ID=account”, as in the example below:
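
For example, with the user ID added, the connection string inside the ODC might look something like this (the account name is just a placeholder, and the remainder of the string is abbreviated):

<odc:ConnectionString>Provider=MSOLAP.7;Integrated Security=ClaimsToken;Identity Provider=AAD;User ID=someone@contoso.com;Data Source=https://analysis.windows.net/powerbi/api; …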

Launching it after the edit will force the logout of the previous connection, and you should be presented with a login screen for the correct one. Unfortunately, in a case like mine where the two accounts are named identically, this does not work. What you must do is use a different account first. This will force the logout. Once that is done, you can add the correct account, or remove the User ID section altogether. Subsequent launches will prompt for login, where you can choose the correct login type, and the feature will work.

These few tips should help you get up and running with Analyze in Excel, and shortly I’ll be discussing the reasons that I think this feature is such a big deal.


Connecting Excel to Data with SharePoint 2016 and Office Online Server 2016

If you're in the SharePoint community and you've been living under a rock since August 2015, you may not have heard that Excel Services is no longer present in SharePoint 2016. This is not as big a deal as it initially sounds, because all (OK... most) of the features available in Excel Services have been moved over to Office Online Server, streamlining the architecture. It does, however, lead to many changes, and this is certainly true for external data access and authentication. This article hopes to answer the question of how these changes impact data access and the Excel Services authentication settings.

My previous article “Adding Excel Services Capabilities to a SharePoint 2016 Farm” dealt with getting the farm enabled with Office Online Server and Excel Services capabilities. This article drills down into using Excel workbooks connected to back end data sources. It does not cover PowerPivot enabled workbooks, or workbooks with an embedded data model – that is the topic of the next article in this series.

The Excel Services authentication settings are used when an Excel workbook is opened in a browser and the workbook connects directly to data on another server. Note that they do NOT pertain to PowerPivot enabled workbooks (at least the portion of those workbooks that uses the data model), as those utilize a different mechanism entirely.

The settings are accessed from the data tab in Excel. First, click on “Connections” to view the available connections in the workbook. Next select the connection and click “Properties”. Select the “Definition” tab, and at the bottom of the dialog box, click on “Authentication Settings”. It’s a well hidden option.


Before continuing, it should be mentioned that these settings affect Windows credentials only. If the data source uses SQL authentication, or something proprietary, the credentials will be stored in the Secure Store Service, or embedded in the connection string, and they will be used as proxy accounts; fine-grained permissions on the destination server will not be possible.

The issue with Windows credentials is the standard “double hop” problem encountered in Windows authentication. This problem existed in prior versions of Excel Services, with SharePoint being the middle-tier server. Office Online Server (OOS) must now reside on a separate server, and this adds an extra “hop”, exacerbating the problem. In a nutshell, the problem is how to allow the server (in this case, OOS) to provide an identity to the source server. These settings allow for a number of solutions to this problem.

The two major data sources that support Windows authentication are SQL Server, and SQL Server Analysis Services (SSAS), and they are unique enough to be considered separately. Any other data source that supports Windows authentication can be treated in the same manner as SQL Server.

1. Use the authenticated user’s account

This is the option that you'll choose if you want the identity of the person opening the workbook to be passed back to the source server when the data is refreshed or, in the case of SSAS, interacted with. It's also the default option. If you never knew about the Excel Services Authentication Settings button, this is the one that you'd be using.

Before you go rushing to declare that this is the best option, keep in mind that it means that the user permissions will need to be maintained on the data source for all users accessing it. Sometimes proxy credentials are a better choice. In order to get either option working however, additional configuration will be required.

SQL Server Analysis Services Data Sources

If your data source is SSAS you have two options. You can set up Kerberos Constrained Delegation (KCD) between the OOS Server and the SSAS Server, or you can use EffectiveUserName. Given that Kerberos will be discussed in the SQL Server section, and that my recommendation is to use EffectiveUserName whenever possible, we will cover EffectiveUserName here.

To begin with, EffectiveUserName is turned off by default in OOS. In Excel Services this could be turned on through the user interface in the Excel Services Service Application, but with OOS, this is done through PowerShell. As with the other OOS parameters, it’s a single line on the OOS Server:

Set-OfficeWebAppsFarm -ExcelUseEffectiveUserName:$true

With EffectiveUserName, a connection is made between the OOS server and the SSAS server using the machine account of the OOS server itself. The OOS server has the identity, but not the password or token, of the requesting user, and that identity is provided to the SSAS server to use when executing queries. This impersonation process requires a high level of trust, and the machine account of the OOS server must be an administrator in the destination SSAS instance.

Adding this account requires the use of SQL Server Management Studio 2016 (SSMS); in prior versions it was not possible to add machine accounts to SSAS. To add it, connect SSMS to the destination SSAS instance, then right-click on the server node and select “Properties”. Once the dialog opens, select the “Security” node, and click the “Add” button.


Next (and order is important here), select the “Locations” button, and choose “Entire Directory” (or a container where your server accounts live). Click “OK” and then click the “Object Types” button. Computer accounts are excluded by default, and this is where you select them.


Once selected, you can enter the machine account in the form DOMAIN\MachineName. After clicking OK, it should appear in the administrators list with a “$” appended to it. In our case, the machine name is NAUTILUS2016OOS and the domain is NAUTILUS. This needs to be done for each OOS server in your farm.


At this point, any user interacting with the workbook in a browser will interact with the SSAS server using their own identity, which of course means that the SSAS server needs to know who they are. The end users will need to be granted access to the data sources in SSAS, in the same way they would if they were connecting to it in an Excel client.

SQL Server Data Sources

SQL Server has an equivalent feature to EffectiveUserName, the SetUser command. This feature has been supported for years in SQL Server Reporting Services, but it was never supported by Excel Services, and unfortunately it is still not supported by OOS. SharePoint and OOS both support claims-based identities, but SQL Server does not, and this leaves us with only Kerberos Constrained Delegation to provide the user's identity to the source system.

In order for this to work, Kerberos must be configured so that the OOS server(s) are trusted to delegate credentials to the account running SQL Server on the destination server. However, KCD delegates Windows credentials, and as mentioned above, both SharePoint 2016 and OOS use claims-based authentication. It is necessary to first retrieve the user's Windows token in order to delegate it, and this needs to happen on the OOS server, as it is now the final “hop” before connecting to the source system.

In the past, SharePoint provided this capability through the Claims to Windows Token Service (C2WTS). This was perfect for Excel Services, as it ran on top of SharePoint and could leverage the service directly. With OOS that is no longer an option, because OOS runs on a separate machine, but C2WTS can be made available there as part of the Windows Identity Foundation (WIF).

WIF is not a prerequisite for OOS, so it must be added first. In order to do so,  launch the Add Roles and Features wizard on each OOS server, proceed to the features section, and select “Windows Identity Foundation”. Allow the wizard to complete.


(Note – if you will be supporting ADFS, you will also need to install the Microsoft Identity Extensions.)

Next, the C2WTS needs to be configured to allow access for the Network Service account. To do so, edit the file “C:\Program Files\Windows Identity Foundation\v3.5\c2wtshost.exe.config” and remove the comments in the <allowedCallers> section for Network Service. When complete, it should appear as below:

<allowedCallers>
<clear/>
<add value="NT AUTHORITY\Network Service" />
<!-- <add value="NT AUTHORITY\Local Service" /> -->
<!-- <add value="NT AUTHORITY\System" /> -->
<!-- <add value="NT AUTHORITY\Authenticated Users" /> -->
</allowedCallers>

Once edited, run the following Powershell to start the service and set it to automatic start:

Set-Service -Name C2WTS -startuptype “automatic”
Start-Service -Name C2WTS

At this point, we are ready to start delegating credentials.

The following Kerberos settings need to be performed on a directory server, or a server that can run the SetSPN command and the Active Directory Users And Computers tool.

The first step is to add the SQL Server service and its associated account as a Service Principal Name (SPN). We need to do this for both the machine name of the server and its Fully Qualified Domain Name(s). This can be done from either PowerShell or a command prompt.

SetSPN -S MSSQLSvc/NAUTILUS2016SQL.nautilus.local:1433 NAUTILUS\SQLService
SetSPN -S MSSQLSvc/NAUTILUS2016SQL:1433 NAUTILUS\SQLService

In this example, NAUTILUS2016SQL is the name of the SQL server, nautilus.local is its domain, and NAUTILUS\SQLService is the account that SQL service is running as.

Once the SPNs have been added, open the Active Directory Users and Computers tool. Navigate to the Computers node, select the OOS server(s), right-click and select Properties. Click the Delegation tab, select “Trust this computer for delegation to specified services only” (this is the constrained part…), and “Use any authentication protocol”. Next, click the “Add” button, and then “Users and computers”. Find the SQL service account (in our case, NAUTILUS\SQLService), and save it. You should see its corresponding service appear in the dialog box.


Click OK, save everything and that’s it. KCD should be set up to allow delegation from OOS to SQL server, and any direct connected Excel workbooks should work.

This process will need to be repeated for any data source that supports Windows Authentication.

2. Use a Stored Account

A stored account is a proxy account, which is to say that all users using the workbook in a browser will connect to the data using the same credential – the one that is stored. This option uses the Secure Store Service in SharePoint, which is therefore a requirement. Usage is relatively straightforward. To set up a new Stored Account, navigate to the Manage Service Applications section in Central Admin on the SharePoint farm, and select the Secure Store Service.

Once in the service, select the “New” button in the ribbon to create a new “Target Application” (stored account). Enter an ID, Display Name, and contact email (it's required for some reason). Then, ensure that the Target Application Type is “Group”.


Once complete, click the “Next” button.

The next screen prompts for the type of credentials, and the default is for Windows credentials, which is what we’re using. Click the “Next” button.

The administrators are those users that can modify the target application (stored account) and set the credentials for it. Add administrators as appropriate. The members are those users that can use the stored account, so this should be set to anyone that can access the workbook.

Now, in addition to any regular users, there are one or more accounts that MUST be in the members list: the computer accounts for all of the OOS servers. This makes sense, as the Office Online service itself runs as the Network Service account, and it needs permission to consume the credentials from the Secure Store Service. The account should be entered in the form DOMAIN\MachineName$, so in our example it's NAUTILUS\Nautilus2016OOS$.


Finally, click OK to save it. The last step is to set the credentials. Select the target application, and click on the “Set Credentials” button in the ribbon. Enter the account that will be used as the proxy here. This will be the stored account that gets used to access the source data, so it obviously should have access to that data.


If your environment is HTTP based, there is one further configuration task to perform on the OOS server. By default, OOS requires HTTPS to access the Secure Store Service. This is controlled by a farm parameter, and you can change it with the following PowerShell:

Set-OfficeWebAppsFarm -AllowHttpSecureStoreConnections:$true

Now that this is complete, go to the Excel Services Authentication dialog in an Excel workbook, select “Use a Stored Account” and enter the name of the Secure Store Service Target application that was created (in the example above, this is SSSExcelTest). Save the workbook to SharePoint, open it in a browser and refresh data. All should work.

This procedure is the same for both Analysis Services and for SQL Server. This approach does not require Kerberos, EffectiveUserName, the C2WTS, or S2S trust for it to work. It is likely the easiest option to get up and running quickly.

As with any proxy account, these accounts should only have enough access to  the source data to serve the worksheet(s) they are used for. These should always be least privilege accounts.

3. None

This option was always poorly named. None does not mean no credentials, or anonymous; it means default, or the Unattended Access Account. Both Excel Services and PerformancePoint had the concept of an Unattended Access Account. This account was really just a default entry in the Secure Store Service. It's useful when the entire organization has access to a certain body of data. Giving the Unattended Access Account access to that data means that nobody needs to configure anything; it should “just work”.

One of the casualties of the move from Excel Services to OOS was the Unattended Access Account. There is no configuration area or PowerShell where it can be set. It would seem that the “None” option cannot be used at all. However, it can, in a very limited way.

Selecting this option will cause the OOS machine account(s) to be used when attempting to access the source data. Therefore, if we grant access in the source data to the machine account, we are treating it like an Unattended Access Account. If this is acceptable from a security standpoint, then it can be used in this manner.

We have already seen how to do this for SSAS, but there is no option in SSMS to add a machine account as a security user in SQL Server, even with SSMS 2016. It is however possible. The way to do this is to use SSMS to open a new query, and run the following T-SQL command:

create login [NAUTILUS\NAUTILUS2016OOS$] from windows

where NAUTILUS\NAUTILUS2016OOS$ is the machine account. The square brackets are required. This will add the machine account as a user in the security section in SQL Server Management Studio.  Once that is done, the account can be added to individual databases in the same manner as any SQL account by navigating to the database Security\Users folder and adding a new user.  From here, select “SQL User with login” as the User type, click the ellipsis beside Login name, enter the machine account and select “Check Names”. Once added click “OK”, and “OK”, and then that account can be used just like any other account.

SSAS cannot use the “None” parameter in this way due to the way that the EffectiveUserName option works. If EffectiveUserName is enabled in OOS, it will simply use the credentials of the logged in user and be indistinguishable from the “Use the authenticated user’s account” option. If it is not enabled, it will try to connect to SSAS as anonymous and fail. The alternative to failing would be to open up access to all data on the server – failure is the best option here.  The bottom line is that the “None” should never be used with SSAS.

Summary

Below is a Summary of the various authentication options that can be used with Excel and OOS, and the requirements to make them work.

Use authenticated user’s account with EffectiveUserName (SSAS Only)

  • ExcelUseEffectiveUserName = true on OOS Server
  • OOS Computer account added to SSAS Server Administrator list

Use authenticated user’s account with Kerberos (SSAS or SQL Server)

  • C2WTS running on OOS Server
  • C2WTS configured to allow Network Service on OOS
  • KCD delegation from OOS Server to Source Server/account

Use a stored account (SSAS or SQL Server)

  • SSS Set up on SharePoint Farm
  • SSS Target account(s) created
  • OOS Computer account added to Members for Target account(s)
  • Target account configured for Least Privilege access to data

None

  • OOS Server account configured with Least privilege access to data (SQL Server Only)
  • ExcelUseEffectiveUserName = true on OOS Server (SSAS)

Acknowledgements

I wanted to acknowledge the help and guidance I received while putting this together and testing. Both Kay Unkroth and Jason Haak at Microsoft were quick to answer some of the thornier configuration questions, and fellow MVP Trevor Seward was instrumental in getting me over the Kerberos hump. Thanks guys.
