Category: SharePoint

Using Power BI with Microsoft Lists

Microsoft Lists debuted in 2020, and it is (yet another) great way to organize list-based data. Now, when someone says data, I think Power BI. Obviously, we’ll want to report on this data, but how do we do that? There is no Power BI connector for Microsoft Lists. The answer is quite simple, if not completely obvious: you need to use the SharePoint Online list connector in Power BI.

Microsoft Lists are the same thing as SharePoint lists. In fact, they ARE SharePoint lists. The Microsoft Lists service is just a new user interface for interacting with them. You can use Lists all you want and never see SharePoint at all, unless you need to change list settings, but I digress. Given this fact, as far as Power BI is concerned, everything that applies to SharePoint lists applies to Microsoft Lists lists (the grammar here gets awfully awkward).

For reference, I wrote a series of articles some time ago about the idiosyncrasies of working with SharePoint data in Power BI, and these articles are still valid today (2021). The most recent of these articles can be found here and includes links to the others.

There is one thing worth mentioning about Microsoft Lists. When a new list is created using the Lists interface, the user can save it to any of their SharePoint sites, but another option is to save it to “My lists”.

When you use the SharePoint Online list connector in Power BI, it prompts you to enter the URL for the SharePoint site that contains the list that you want to report on. That is straightforward when your list is stored in a SharePoint site, but what if your list is stored in “My lists”? Where are “My lists” stored?

They are stored in a “personal” SharePoint site. We SharePoint old timers would know it as the MySite, and while usage of MySite has been de-emphasized in Microsoft 365, it is very much still there. Each user has one. In fact, this is where the “personal” OneDrive for Business content is stored – in the Documents library of the very same MySite. By storing personal lists in the MySite, Microsoft Lists is just following the same pattern used by OneDrive for Business, which makes perfect sense.

Given this, what URL should you use in Power BI to connect to your lists stored in “My lists”? You’ll find it in the Microsoft Lists web interface in the URL bar. It’s the portion of the URL up to “/Lists/”.

In most cases it will take the following form:

https://TenantName-my.sharepoint.com/personal/LoginID/Lists/ListName

Where:

  • TenantName = the name of your Microsoft 365 tenant, i.e. Contoso
  • LoginID = the email address used to log in to Microsoft 365, with the “@” and the “.” replaced with underscores, i.e. jpw_contoso_com
  • ListName = the name of your list

Once you enter this URL, you’ll have access to any of the lists stored in “My lists”. At this point, your personal lists will behave like any list in any other SharePoint site.
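If you prefer to work in Power Query directly, a minimal M sketch of that connection might look like the following. The tenant, login ID, and list name (“Issues”) are placeholders, and the code that the connector generates may navigate to the list by its GUID rather than by its title.

let
    // Site URL only – the portion of the URL before "/Lists/" (placeholder tenant and login ID)
    Source = SharePoint.Tables("https://contoso-my.sharepoint.com/personal/jpw_contoso_com", [ApiVersion = 15]),
    // Navigate to the personal list by its title
    Issues = Source{[Title = "Issues"]}[Items]
in
    Issues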


Keep your Power BI SharePoint reports current with Microsoft Flow

Power BI is without question the best way to report on data in SharePoint lists. The query tools available in Power Query make working with SharePoint data relatively painless, and the cached dataset means that reports are run against an optimized copy of the list data, not the data itself.

This latter distinction, while removing the performance issues of systems that query lists directly, also introduces problems with data latency. The report will never be fully “up to date”, as it needs to be refreshed on a periodic basis.

Consider the following scenario. A Power BI report has been built that uses data from a SharePoint list. That report has been embedded on a SharePoint page in the same site. A user adds an item to the list, and then navigates to the page to see the updated report. Unfortunately, that report won’t get updated until the next scheduled refresh.

This has been a significant problem, until the recent release of the new “Refresh a dataset” action in Microsoft Flow.

It is a relatively simple procedure to add a one-step flow to any SharePoint list that is triggered when an item is created, updated, or deleted. The flow needs only the “Refresh a dataset” action, configured for the relevant dataset, and any embedded reports will be updated shortly after the data is modified.

Alternatively, the flow can be triggered by a timer, allowing you to create your own schedule (every 5 minutes, etc) that is not hardwired to run at the top or bottom of any given hour.

A few caveats should be kept in mind when using this action, however.

While this action gives us much finer-grained control over when refreshes happen, all of the current license restrictions remain in place. For datasets located in the shared capacity, only 8 refreshes per day are allowed.

For datasets in dedicated capacities (Premium), there are no limits to the number of refreshes. The limit of 48 per day is a UI restriction, not a licensing restriction. However, refresh can utilize significant resources, particularly memory, so you’ll want to ensure that you have sufficient resources to support the update frequency.

Finally, the load on the source data system should be considered. Refresh will pull a significant amount of data every time it is run.

Caveats aside, this new flow action is a welcome relief to those that need greater control of how their reports are updated.


Using Power BI to Report on Location Columns in SharePoint

At the end of 2018, SharePoint received something that we haven’t seen for a long time – a new column type, Location. Location columns will look up an address and geocode it as it is being entered in a form. They will also separate all the constituent parts of the address, as well as the latitude and longitude, into separate display-only columns. These columns are used primarily in views but can also be used in reports. Given that I recently put together a series of posts on using Power BI to work with complex SharePoint column types, I was interested in how to report on this new column type. As it turns out, it is relatively straightforward.

This post will delve into the nuances involved with reporting on this new SharePoint Location column in Power BI.

The Location Column

To begin with, the Location column is a “modern” SharePoint column. This means that it can be added to a list via the Add column button in the list view, but NOT through the list settings page as other column types are.

List view creation

List settings creation

If the Add column button does not appear for you, you may be using a “classic” SharePoint site, or you may be using one or more column types that are not supported in “modern”, which causes a classic view to be used. Removing these columns from the view is often enough to light up the add button.

Once created, you will have the option to add any or all of the address components to the view. These are display elements only and will be available to reports (or other views) whether or not they are added to the view at creation time.

Once the column is created, entering data is as simple as typing an address, or the name of a location, into the column. The typeahead feature will attempt to find the location and fill in the details.

Once selected, the full address will be filled in, and all the constituent address properties will be populated. If they are on the view, the list can be sorted, filtered, etc. by these elements.

Reporting on the Location Column

Internally, the location is saved as a BLOB of JSON content within a column. When the column itself is used in the view, its friendly display format is displayed. When constituent items are displayed (City, for example), their values are extracted from the column and displayed as discrete elements. For other SharePoint column types, this can introduce complications, but the developers of the location feature seem to have had reporting in mind when it was built. Consider the following list that contains a Location column named Location:

Loading the Data

We first launch Power BI Desktop, select “Get Data” and then choose SharePoint Online list. We are then prompted for the URL of the SharePoint Site. The dialog is titled SharePoint lists, but the value is the URL of the site, NOT the list itself. Once this is entered, we are prompted for credentials if we haven’t connected to this site before. After entering credentials, we can select the list that we want to report on. In our case, it’s named Properties. We select it, and then click on the Edit button.


Once the data loads in, one of the first things that you’ll notice is that there are a lot of columns to choose from, and it’s a good idea to remove the columns that you don’t need. We can do this by right-clicking on the unneeded column titles and selecting Remove.
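As a rough sketch of what the query might look like in the Advanced Editor at this point (the site URL, list name, and the exact set of kept columns are illustrative, not prescriptive):

let
    Source = SharePoint.Tables("https://contoso.sharepoint.com/sites/RealEstate", [ApiVersion = 15]),
    Properties = Source{[Title = "Properties"]}[Items],
    // Keep only the columns needed for the report; everything else is dropped
    Trimmed = Table.SelectColumns(Properties, {"Title", "Location", "FieldValuesAsText", "DispName", "GeoLoc", "City", "State", "PostalCode", "CountryOrRegion", "Street"})
in
    Trimmed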

Using FieldValuesAsText

With all other complex SharePoint column types, the FieldValuesAsText column will retrieve the textual representation of the required column values. This is the way that the column value appears in a view. However, the Location column type appears to be an exception to this rule. When the Location column is used, the JSON value itself is returned, which renders FieldValuesAsText relatively useless. This value is also available using the Location column value itself. The steps for extracting FieldValuesAsText are covered in previous posts in this series. Given that ultimately this will not be a good approach for the Location column, we won’t go into it further here.

Field value and value extracted from FieldValuesAsText

Using DispName

The text value of the location column is instead available through the derived DispName column.

With Power BI, it is possible to transform the JSON data contained in the original column or in the extracted FieldValuesAsText column, but all of the extracted properties are available through more efficient means. The FieldValuesAsText column can therefore be ignored for the purposes of reporting on Location columns. In addition, in most cases the original column (Location in this case) can be removed, and the DispName column renamed to take its place.

This behaviour is inconsistent with the behaviour of other complex SharePoint fields. It does not affect capability, but in the interests of consistency, my strong suggestion would be for the SharePoint team to eliminate the DispName field, and leverage FieldValuesAsText for the text conversion in the data feed.
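A minimal sketch of that clean-up in M, assuming the column names from the example list, might look like this:

let
    Source = SharePoint.Tables("https://contoso.sharepoint.com/sites/RealEstate", [ApiVersion = 15]),
    Properties = Source{[Title = "Properties"]}[Items],
    // Drop the JSON-bearing columns and promote DispName to take the original column's place
    Cleaned = Table.RemoveColumns(Properties, {"Location", "FieldValuesAsText"}),
    Renamed = Table.RenameColumns(Cleaned, {{"DispName", "Location"}})
in
    Renamed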

Using Location Components

All the text components of the location column are separated out automatically as columns in Power Query. They can be used as any other column, and no additional action is necessary.

Automatically extracted location components

Using GeoLoc

Power BI will automatically geocode data at the time the report is rendered. The text components can therefore be used by the reporting engine to place data on a map. However, geocoding is a relatively computationally expensive operation, especially if there is a lot of data or a poor internet connection. In addition, some visuals may require the use of specific latitude and longitude co-ordinates. These co-ordinates are available through the GeoLoc column if they are needed, but they do need to be extracted.

Within Power Query, locate the GeoLoc column, and click on the Expand icon in the right of the column header.

Select both the Latitude and Longitude columns and deselect Use original column name as prefix. In my testing, neither Altitude nor Measure returns any meaningful data, so they can be safely ignored; however, this could change in the future.
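Expressed in M, the expansion and the subsequent type change described below might look roughly like this (the site URL, list name, and step names are placeholders):

let
    Source = SharePoint.Tables("https://contoso.sharepoint.com/sites/RealEstate", [ApiVersion = 15]),
    Properties = Source{[Title = "Properties"]}[Items],
    // Expand Latitude and Longitude out of the GeoLoc record, without the original column name as a prefix
    Expanded = Table.ExpandRecordColumn(Properties, "GeoLoc", {"Latitude", "Longitude"}, {"Latitude", "Longitude"}),
    // Latitude and Longitude need to be decimal numbers for mapping visuals
    Typed = Table.TransformColumnTypes(Expanded, {{"Latitude", type number}, {"Longitude", type number}})
in
    Typed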

At this point, we are almost ready to do some reporting. Once all the required columns have been shaped, and their data types set, select the Close and Apply button from the ribbon.

Reporting

Before using the location data on a map, it is important to categorize each of the components so that Power BI knows how to use them. To categorize a data field, select it from the fields list. Then select the Modeling tab from the ribbon and click the Data Category dropdown.

The category for most of the fields is obvious, but below is a table of recommended choices. In addition, both the longitude and latitude fields need to be set to the Decimal Number type.

Field – Category
City – City
CountryOrRegion – Country/Region
Latitude – Latitude
Location – Place
Longitude – Longitude
PostalCode – Postal Code
State – State or Province
Street – Address
Once categorized, the data can be placed on a map according to any desired parameters. In this case, the map below shows listings colour-coded by asking price range.

The resulting report can then be published to the Power BI service, and then embedded into a SharePoint page through either the Power BI web part, or secure embedding if so desired.

Happy reporting!


Use Power BI dataflows to warehouse SharePoint list data

Reporting on SharePoint data has been a requirement for a long time, and there have been many approaches to fulfill this need. Custom web parts, Data View web parts and SSRS direct connected reports have historically been some of the solutions, but they all suffer from the same problem. If you have any serious amount of SharePoint data, you’ll quickly begin to bump into capacity limits and performance limitations, and in some cases, you can impact the performance of the overall system. In order to avoid this problem, it is necessary to warehouse SharePoint data first, as I argued in this post from 2012.

Once your list-based data is in a relational database, the performance issue is taken care of. However, the means of getting it moved there have traditionally been problematic. For a long time, there was a CodePlex project called the SharePoint List Source and Destination. This solution provided read and write access to SharePoint lists from SQL Server Integration Services (SSIS). Unfortunately, it was last updated in 2012, it was unsupported by Microsoft, and it did not support authentication for Office 365. This of course rendered it useless for use with SharePoint Online. In 2015, SQL Server Integration Services got an OData source, and given that SharePoint lists have OData endpoints, this became a viable option, particularly given that it did support Office 365 authentication. The OData connection from SharePoint did however have some limitations as well.

For cloud scenarios, Power BI has emerged as a very competent way of reporting against SharePoint data. It has native connectors for SharePoint list data, both on premises and in the cloud, and Power BI reports can be hosted in the cloud through the SharePoint Power BI web part. On premises, the same can be done with Power BI Report Server. The structure of Power BI reports means that the data is cached in a data model, so reports are not run directly against the list data source. This avoids the performance issues listed earlier.

Earlier this year I published a series of articles detailing how to do exactly this. The only issue with this approach is that the data shaping and preparation is always specific to a single report. If I have 5 different reports that use one list, I must query and shape that data 5 different times – once for each report. This is where Power BI dataflows come in.

In this context, dataflows are essentially a data warehousing layer with transformation capability. Instead of each report connecting back to a source list, the dataflow connects to the list, shapes the data with Power Query online and stores it in a data lake. The Power BI reports then connect to the dataflow as their data source. Transformation and storage only need to happen once.

As of this writing, dataflows are in public preview, so be warned – some things could change.

Creating a dataflow

Creating a dataflow from a SharePoint list is relatively straightforward. In our examples below, we will work with the same sample list from the series of articles on SharePoint data earlier this year. To begin, open Power BI and navigate to a workspace (your personal workspace will not have dataflows). Click on the workspace name in the navigation pane and the dataflows tab should be available.

To create a new dataflow, select the Create button, and click dataflow.

Select the Add new entities button and the data source selection will appear. SharePoint list and SharePoint online list are both options. SharePoint list is for on-premises list data, which works with the On-Premises Data Gateway. In our case we are working with SharePoint Online, so we select the SharePoint Online source.

At this point, you enter the URL for the site that you want to connect to (NOT the URL for the list) and select the Next button. Power BI will connect to the site, and you can then select which list you want to work with. In our case, we need our Listings data, so we select that list and click Next.

Finally, we’re in the Power Query editing screen. This should be quite familiar to those used to working with Power Query in either Power BI Desktop or in Excel. From here you can select the columns that you want to include in the dataflow.

Although this experience is similar to building queries in Power BI Desktop, there are a few noticeable differences. Queries in a PBIX file are referred to as queries, but within a dataflow they are referred to as entities. These entities can be custom, or they can be mapped to Common Data Model object types. The Power Query web editor also does not include the full-featured editing ribbon found in Power BI Desktop, but instead has a button bar. Many of the editing options available in Power BI Desktop are not available in the Power Query web experience.

If you have read through some of my earlier articles on working with SharePoint data in Power BI, you will notice that there are fewer columns available than we see in the Desktop Power Query editor. Most notably for those of us working with SharePoint data, the FieldValuesAsText column – the convenient way of retrieving the text representation of complex SharePoint list column types – is not available. At first glance, this would appear to be quite limiting.

However, by right-clicking on the entity name, we can access the Advanced Editor.

This Advanced editor allows you to write queries by hand using the M language. A side benefit of the Advanced editor is that it makes queries portable between platforms – Desktop, Excel, and now dataflows. You can therefore build your queries in Power BI Desktop using its fully functional editor and then copy and paste them into a new blank query in the dataflow editor. Using this approach allows you to take advantage of the SharePoint helpers built into Power BI Desktop, as the FieldValuesAsText column and other columns are available. Using this technique, the Listings example can be transformed into several normalized tables in the dataflow.
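As an illustration, a query of roughly the following shape can be built in Power BI Desktop and pasted into a blank dataflow entity as-is. The site URL, list name, and the fields expanded out of FieldValuesAsText are placeholders:

let
    Source = SharePoint.Tables("https://contoso.sharepoint.com/sites/Sales", [ApiVersion = 15]),
    Listings = Source{[Title = "Listings"]}[Items],
    // Pull the text representations of a couple of complex columns out of FieldValuesAsText
    Expanded = Table.ExpandRecordColumn(Listings, "FieldValuesAsText", {"Author", "Editor"}, {"Created By", "Modified By"}),
    // Keep only the columns that the entity needs
    Trimmed = Table.SelectColumns(Expanded, {"Title", "Created By", "Modified By"})
in
    Trimmed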

Click on Done to save your entities, and then the Save button to save your dataflow. You will be prompted to Refresh Now, which is a good idea because, by default, the dataflow contains no data. To keep the data up to date, you need to set a refresh schedule by clicking the schedule refresh icon under actions for the dataflow in question. From here, you schedule data refresh in the same manner as you would with a Power BI report.

Using the dataflow

Once data is loaded into the dataflow, it becomes a source for a Power BI report. You must use Power BI Desktop to create this report; there is no way to connect a report to a dataflow in the web interface alone. Start Power BI Desktop and select “Get Data”. Choose the Power BI blade and then Power BI dataflows.

After clicking Connect, you will be presented with a set of Power BI workspaces that contain dataflows. Opening the workspace will allow you to open the dataflow and select the desired entities.
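For reference, the query that Power BI Desktop generates for a dataflow entity looks roughly like the sketch below. The workspace and dataflow GUIDs are placeholders, and the generated navigation steps may include additional fields:

let
    Source = PowerBI.Dataflows(null),
    // Navigate to the workspace, dataflow, and entity selected in the dialog
    Workspace = Source{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
    Dataflow = Workspace{[dataflowId = "11111111-1111-1111-1111-111111111111"]}[Data],
    Listings = Dataflow{[entity = "Listings"]}[Data]
in
    Listings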

Once loaded, the report can be built just like any other. When it is refreshed, it will be refreshed from the data stored in the dataflow, NOT directly from the SharePoint list. It is therefore important to keep the dataflow itself up to date.

Any number of reports can be created from the dataflow. Instead of having all the transformation logic tied up within a single report, dataflows allow it to be centralized and consistent. With a little work, these transformations allow you to work with your SharePoint data just as though it were relational. Power BI dataflows really are the best way to perform data warehousing with your SharePoint data, whether your SharePoint is online or on-premises.


Business Intelligence in SharePoint 2019

The recent availability of the SharePoint 2019 public preview, and the supporting information that accompanies it has clarified the status of Business Intelligence features in SharePoint 2019. This release, with one exception, is the culmination of the process of decoupling BI from SharePoint which began in SharePoint 2016 through the removal of Excel Services. This decoupling strategy was initially articulated in the fall of 2015 with the document Microsoft Business Intelligence – our reporting roadmap which stated that SQL Server Reporting Services was to be the cornerstone of their on-premises BI investment (and not SharePoint).

The embedded BI features now run with SharePoint as opposed to on SharePoint. These changes do, however, require some planning and effort on the part of those who have already invested in the current platform and wish to move forward on-premises. With this in mind, and given that concise information about these changes is a bit difficult to find, I wanted to put this reference together. This post does not get into migration strategies, only the changes themselves.

The source for much of the below comes from discussions with the relevant product teams, and official information is found (today) in two primary places: the document What’s deprecated or removed from SharePoint Server 2019 Public Preview, which was published concurrently with the SharePoint 2019 public preview, and Christopher Finlan’s presentation at the Microsoft Business Application Summit 2019 entitled Self-service BI and enterprise reporting on-premises with Power BI Report Server.

A summary of the changes to BI features, and a brief discussion of each is below.

Feature – Status
SQL Server Reporting Services Integrated Mode – Removed
Power View – Removed
BISM file connections – Removed
PerformancePoint – Deprecated
PerformancePoint Decomposition Trees – Removed
Power Pivot for SharePoint – Removed
Scheduled workbook data refresh – Removed
Workbook as a data source – Removed
PowerPivot management dashboard – Removed
PowerPivot Gallery – Removed

SQL Server Reporting Services Integrated Mode

SSRS Integrated mode was deprecated in November 2016 and was not a part of SQL Server 2017. However, organizations could continue to use SSRS versions from 2016 and prior in SharePoint 2016. This is not supported in SharePoint 2019, which means that integrated mode isn’t an option at all with SharePoint 2019. The good news is that the recent Report Viewer web part fully replicates the capabilities of the SSRS Integrated mode web part.

Power View

Power View was a feature of SSRS Integrated mode and is available in Excel. When Excel Services was removed in 2016, Power View in Excel required SSRS Integrated mode to work. Both supporting platforms are now gone, and thus Power View is not supported in SharePoint 2019.

BISM file connections

The BISM file connection type was used by Excel and SSRS to connect Power View reports to SQL Server Analysis Services data sources. This connection type has been removed along with Power View.

PerformancePoint Services

PerformancePoint is a combination of capabilities that includes dashboarding, scorecards, and analytic reports. Very few new features have been added to PerformancePoint in the last few versions, and this one even loses a few. Many of these features are also available in Power BI and Power BI Report Server, and Microsoft has made the decision to deprecate this product. This gives customers with a PerformancePoint investment time to migrate their assets but is a clear indication that it will also be removed in a subsequent release.

PerformancePoint – Decomposition Trees

The Decomposition Tree feature in PerformancePoint came originally from ProClarity – one of the three products that made up the original PerformancePoint product. These visuals are based on Silverlight, and have been removed from the product accordingly.

PowerPivot for SharePoint

PowerPivot for SharePoint is not supported in SharePoint 2019. PP4SP was originally a combination of two technologies – a specialized version of SQL Server Analysis Services, and a SharePoint service application. In the 2016 version, these two parts were separated: the SSAS component became a part of the SQL Server installation media as Analysis Services in PowerPivot mode, and the service application continued under the name PowerPivot for SharePoint. To be clear, it is the second of the two that has been removed. SSAS PowerPivot mode continues to be an important component and is used by Office Online Server for working with Excel files that have embedded models.

Scheduled workbook data refresh

This feature allowed for the automatic refresh of the data stored within Excel workbooks in SharePoint. It requires a PowerPivot data model to work, but the refresh operation would refresh all connected data in the workbook on a scheduled basis. This was a component of PowerPivot for SharePoint. It has recently been announced that this capability will soon be available in Power BI Report Server.

Workbook as a data source

With PowerPivot for SharePoint deployed, it is possible to use the data model in a published Excel workbook as the data source for another workbook. This feature will no longer be available, and there are no plans at present to reintroduce it.

PowerPivot Management Dashboard

Originally a part of SharePoint Central Administration, the management dashboard provided status updates on all PowerPivot for SharePoint operations. Being a part of PowerPivot for SharePoint, this has been removed accordingly.

PowerPivot Gallery

The PowerPivot Gallery is a modified SharePoint document library view that displays thumbnails of the worksheets contained in published Excel workbooks. This component is Silverlight-based and part of PowerPivot for SharePoint, and it has been removed accordingly.

Power View, Decomposition trees, and the PowerPivot gallery were the last remaining features that carried a Silverlight dependency. SharePoint 2019 no longer has any Silverlight dependencies.

These changes are significant for anyone with an existing Business Intelligence investment that plans to move to SharePoint 2019. I intend to write more about migration strategies and will be addressing these topics at various conferences in the future.
