

Using Power BI with Microsoft Lists

Microsoft Lists debuted in 2020, and they are (yet another) great way to organize list-based data. Now, when someone says data, I think Power BI. Obviously, we’ll want to report on this data, but how do we do that? There is no Power BI connector for Microsoft Lists. The answer is quite simple, if not completely obvious: you need to use the SharePoint Online list connector in Power BI.

Microsoft Lists are the same thing as SharePoint lists. In fact, they ARE SharePoint lists. The Microsoft Lists service is just a new user interface for interacting with them. You can use Lists all you want and never see SharePoint at all, unless you need to change list settings, but I digress. Given this fact, as far as Power BI is concerned, everything that applies to SharePoint lists applies to Microsoft Lists lists (the grammar here gets awfully awkward).

For reference, I wrote a series of articles some time ago about the idiosyncrasies of working with SharePoint data in Power BI, and these articles are still valid today (2021). The most recent of these articles can be found here and includes links to the others.

There is one thing that is worth mentioning about Microsoft Lists. When a new list is created using the Lists interface, the user can save it to any of their SharePoint sites, but another option is to save it to “My lists”.

When you use the SharePoint Online list connector in Power BI, it prompts you to enter the URL for the SharePoint site that contains the list that you want to report on. That is straightforward when your list is stored in a SharePoint site, but what if your list is stored in “My lists”? Where are “My lists” stored?

They are stored in a “personal” SharePoint site. We SharePoint old timers would know it as the MySite, and while usage of MySite has been de-emphasized in Microsoft 365, it is very much still there. Each user has one. In fact, this is where the “personal” OneDrive for Business content is stored – in the Documents library of the very same MySite. By storing personal lists in the MySite, Microsoft Lists is just following the same pattern used by OneDrive for Business, which makes perfect sense.

Given this, what URL should you use in Power BI to connect to your lists stored in “My lists”? You’ll find it in the Microsoft Lists web interface in the URL bar. It’s the portion of the URL up to “/Lists/”.

In most cases it will take the following form:

https://TenantName-my.sharepoint.com/personal/LoginID/Lists/ListName

Where:

  • TenantName = the name of your Microsoft 365 tenant, e.g. Contoso
  • LoginID = the email address used to log in to Microsoft 365, with the “@” and the “.” replaced with underscores, e.g. jpw_contoso_com
  • ListName = the name of your list

Once you enter this URL, you’ll have access to any of the lists stored in “My lists”. At this point, your personal lists will behave like any list in any other SharePoint site.
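
For illustration, here is a minimal Power Query (M) sketch of the kind of query this produces. The tenant name, login ID, and list name are the placeholder values from above, and the exact navigation step that Power BI generates may differ:

let
    // Site URL only: everything up to "/Lists/" (placeholder values)
    Source = SharePoint.Tables(
        "https://contoso-my.sharepoint.com/personal/jpw_contoso_com",
        [ApiVersion = 15]
    ),
    // Navigate to the list by its title (placeholder name)
    MyList = Source{[Title = "ListName"]}[Items]
in
    MyList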


Increase your OneDrive Storage Limit beyond 5 TB

Note – 2021-01-26 – This article is still totally valid, but check the comment section below for an alternate method for getting your tenant enabled for > 5 TB storage.

OneDrive for Business offers “unlimited” storage. You would be excused if you were sure that the limit was 1 TB, because that is what it is set to by default; Microsoft would prefer it if you didn’t exercise this particular option. Individual users can’t change their limits, and administrators can only raise that limit to 5 TB. Increasing it beyond that requires extra steps. I have just gone through those steps for my own OneDrive for Business and thought that I would share the experience.

The stratification of the different OneDrive for Business storage options has been outlined effectively by Joel Oleson in his article Three Tiers to Increase to Unlimited Storage in OneDrive for Business.

The bottom line is that you must increase the quota limit for a OneDrive in stages. These stages are:

  1. Increase quota from 1 TB to 5 TB
  2. Reach 90% of the OneDrive storage limit (4.5 TB)
  3. Open a support ticket to turn on “Boost Quota” for the tenant
  4. Increase quota from 5 TB to 25 TB
  5. Reach 90% of the OneDrive storage limit (22.5 TB)
  6. Open a support ticket to create a new site collection with a 25 TB quota
  7. Repeat steps 5 & 6 as necessary

I have just completed step 4, so that’s as far as this post will go. It likely goes without saying, but all of the operations below require tenant admin permissions.

Increase from the default limit to 5 TB

To increase a user’s OneDrive limit from the default of 1 TB to 5 TB, it is a simple matter of running a PowerShell command, as documented in Change a specific user’s OneDrive storage space.

Set-SPOSite -Identity <user's OneDrive URL> -StorageQuota 5242880

The storage quota is specified in MB, so the number 5,242,880 (5 TB) must be entered exactly.

The default value for the tenant itself can be changed as well, so that this first step isn’t necessary for new users. See Set the default storage space for OneDrive users for details. It should be noted that the maximum value that you can set as a default is 5120 GB (5 TB). If it’s set any higher, it won’t be saved.
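
For reference, a hedged PowerShell sketch of changing the tenant default follows, assuming the SharePoint Online Management Shell module and a placeholder tenant name. Note that Set-SPOTenant takes the value in MB, whereas the admin center setting referenced above is entered in GB:

# Connect to the tenant admin site first (placeholder tenant name)
Connect-SPOService -Url https://contoso-admin.sharepoint.com

# Set the tenant-wide default OneDrive quota to 5 TB.
# Set-SPOTenant takes the value in MB (5 TB = 5,242,880 MB);
# the admin center expresses the same setting in GB (5120).
Set-SPOTenant -OneDriveStorageQuota 5242880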

Increasing storage limit to 25 TB

Before you can set the limit any higher, you must first fill the OneDrive to 90% of its capacity, or 4.5 TB. This happened to me a few weeks ago and I started getting weekly “approaching your capacity” messages. At this point I opened a support ticket, and this is where the fun started.

I’ll spare you the back and forth email exchange, but a little snippet of the conversation went something like the following. Each line is an action or an email:

Me: fills out form explicitly stating that I have hit 90% and need quota increased from 5 TB to 25 TB
Support: Please reply with a screenshot of the problem, and any troubleshooting steps
Me: I need my OneDrive quota increased from 5 to 25 TB
Support: You need OneDrive Plan 2 for that, and your OneDrive must be at 98% (both incorrect)
Me: I have E5, which includes unlimited storage (I ignored the 98% comment)
Support: What capacity is it at now?
Me: 91%
Support: Send the OneDrive URL
Me: sends OneDrive URL
Support: Your tenant may not have the Boost storage option enabled. Let me ask my supervisor to get that enabled.
Support (one day later): Would you like instructions?
Me: Yes please
Support: sends publicly available url listed above that increases quota with Powershell
Support: First change it to 10 TB, then change it to 25 TB (no idea where that came from)
Me: tries it, doesn't work for 10 or 25
Me: It didn't work
Support: Did you connect to the SharePoint Online module first? (seriously)
Me: Yes. This isn't my first rodeo. (I'm paraphrasing)
Support: Can you send a screenshot of your error message?
Me: There is no error message, the value simply does not save.
Me: This approach works up to 5 TB but not beyond
Support: Let me look further into this and get back to you.
Support: Your request has now been passed to our escalations team
Support (5 days later, different rep): We enabled boost storage. Can you try and let us know?
Me: Successful. Thank you.

I include the above partly because you might want the chuckle, but mostly to let you know not to give up on this. It’s advertised, and you paid for it.

The command that I used, once support had turned on the boost storage feature, was:

Set-SPOSite -Identity <user's OneDrive URL> -StorageQuota 26214400

As before, the quota is specified in MB, so the number 26,214,400 (25 TB) must be entered exactly.

In any event, after running the above command, my storage limit is now at 25 TB.

The particularly interesting thing to note here is that because the “Boost storage” feature is set at the tenant level, any other OneDrives in the tenant can have their limits increased without contacting support. All that is necessary is the PowerShell script above. However, the drive must still reach 90% capacity before it can be increased.
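
As a sketch, checking a drive’s usage before raising its limit might look like the following. The URL is a placeholder, and both StorageUsageCurrent and StorageQuota are reported in MB:

# Placeholder URL for the user's OneDrive site
$oneDriveUrl = "https://contoso-my.sharepoint.com/personal/jpw_contoso_com"

# Both properties are reported in MB
$site = Get-SPOSite -Identity $oneDriveUrl
$percentUsed = $site.StorageUsageCurrent / $site.StorageQuota * 100

# The drive must reach 90% of its current quota before the increase will stick
if ($percentUsed -ge 90) {
    Set-SPOSite -Identity $oneDriveUrl -StorageQuota 26214400  # 25 TB in MB
}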

Should I hit the next limit, I’ll report back here.


Keep your Power BI SharePoint reports current with Microsoft Flow

Power BI is without question the best way to report on data in SharePoint lists. The query tools available in Power Query make working with SharePoint data relatively painless, and the cached dataset means that reports are run against an optimized copy of the list data, not the list itself.

This latter distinction, while removing the performance issues of systems that query lists directly, also introduces problems with data latency. The report will never be fully “up to date”, as it needs to be refreshed on a periodic basis.

Consider the following scenario. A Power BI report has been built that uses data from a SharePoint list. That report has been embedded on a SharePoint page in the same site. A user adds an item to the list, and then navigates to the page to see the updated report. Unfortunately, that report won’t get updated until the next scheduled refresh.

This has been a significant problem, until the recent release of the new “Refresh a dataset” action in Microsoft Flow.

It is a relatively simple procedure to add a one-step flow to any SharePoint list that is triggered when an item is created, updated, or deleted. This flow simply needs the “Refresh a dataset” action, configured for the relevant dataset, and the embedded reports will be updated shortly after the data is modified.

Alternatively, the flow can be triggered by a timer, allowing you to create your own schedule (every 5 minutes, etc.) that is not hardwired to run at the top or bottom of any given hour.

A few caveats should be kept in mind when using this action, however.

While this action gives us much finer-grained control over when refreshes happen, all of the current license restrictions remain in place. For datasets located in shared capacity, only 8 refreshes per day are allowed.

For datasets in dedicated capacities (Premium), there is no limit to the number of refreshes. The limit of 48 per day is a UI restriction, not a licensing restriction. However, refresh can consume significant resources, particularly memory, so you’ll want to ensure that your capacity can support the update frequency.

Finally, the load on the source data system should be considered. Refresh will pull a significant amount of data every time it is run.

Caveats aside, this new flow action is a welcome relief to those who need greater control over how their reports are updated.


Using Power BI to Report on Location Columns in SharePoint

At the end of 2018, SharePoint received something that we haven’t seen for a long time: a new column type, Location. A Location column will look up an address and geocode it as it is being entered in a form. It will also separate all the constituent parts of the address, as well as the latitude and longitude, into separate display-only columns. These columns are used primarily in views but can also be used in reports. Given that I recently put together a series of posts on using Power BI to work with complex SharePoint column types, I was interested in how to report on this new column type. As it turns out, it is relatively straightforward.

This post will delve into the nuances involved with reporting on this new SharePoint Location column in Power BI.

The Location Column

To begin with, the Location column is a “modern” SharePoint column. This means that it can be added to a list via the Add column button in the list view, but NOT through the list settings page as other column types are.

List view creation

List settings creation

If the Add column does not appear for you, you may be using a “classic” SharePoint site, or you may be using one or more column types that are not supported in “modern” which causes a classic view to be used. Removing these columns from the view is often enough to light up the add button.

Once created, you will have the option to add any or all of the address components to the view. These are display elements only and will be available to reports (or other views) whether or not they are added to the view at creation time.

Once created, entering data is as simple as typing in an address, or the name of a location into the column. The typeahead feature will attempt to find the location and fill in the details.

Once selected, the full address will be filled in, and all the constituent address properties will be populated. If they are on the view, the list can be sorted, filtered, etc. by these elements.

Reporting on the Location Column

Internally, the location is saved as a BLOB of JSON content within a column. When the column itself is used in the view, its friendly display format is displayed. When constituent items are displayed (City, for example), their values are extracted from the column and displayed as discrete elements. For other SharePoint column types, this can introduce complications, but the developers of the Location feature seem to have had reporting in mind when it was built. Consider the following list that contains a Location column named Location:

Loading the Data

We first launch Power BI Desktop, select “Get Data” and then choose SharePoint Online list. We are then prompted for the URL of the SharePoint site. The dialog is titled SharePoint lists, but the value is the URL of the site, NOT the list itself. Once this is entered, we are prompted for credentials if we haven’t connected to this site before. After entering credentials, we can select the list that we want to report on. In our case, it’s named Properties. We select it, and then click on the Edit button.


Once the data loads, one of the first things that you’ll notice is that there are a lot of columns to choose from, and it’s a good idea to remove the columns that you don’t need. We can do this by right-clicking on the desired column titles and selecting Remove.
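
Expressed in M, the query to this point might look roughly like the following sketch. The site URL is hypothetical, and the retained columns are just examples:

let
    // Connect to the site (NOT the list); placeholder URL
    Source = SharePoint.Tables(
        "https://contoso.sharepoint.com/sites/realestate",
        [ApiVersion = 15]
    ),
    // Navigate to the Properties list
    Properties = Source{[Title = "Properties"]}[Items],
    // Keep only the columns needed for the report (example selection)
    Trimmed = Table.SelectColumns(
        Properties,
        {"Title", "Location", "DispName", "GeoLoc", "FieldValuesAsText"}
    )
in
    Trimmed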

Using FieldValuesAsText

With all other complex SharePoint column types, the FieldValuesAsText column will retrieve the textual representation of the required column values. This is the way that the column value appears in a view. However, it appears that the Location column type is an exception to this rule. When the Location column is used, the JSON value itself is returned, which renders FieldValuesAsText relatively useless. This value is also available using the Location column value itself. The steps for extracting FieldValuesAsText are covered in previous posts in this series. Given that ultimately this will not be a good approach for the Location column, we won’t go into it further here.

Field value and value extracted from FieldValuesAsText

Using DispName

The text value of the location column is instead available through the derived DispName column.

While it is possible in Power BI to transform the JSON data contained in the original column (or in the extracted FieldValuesAsText column), all of the extracted properties are available through more efficient means. The FieldValuesAsText column can therefore be ignored for the purposes of reporting on Location columns. In addition, in most cases, the original column (Location in this case) can be removed, and the DispName column renamed in its place.
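
In M, that cleanup might look like the following sketch (the first two steps repeat the load from the earlier example; step names are illustrative):

let
    Source = SharePoint.Tables(
        "https://contoso.sharepoint.com/sites/realestate",
        [ApiVersion = 15]
    ),
    Properties = Source{[Title = "Properties"]}[Items],
    // Drop the original JSON-bearing column...
    Removed = Table.RemoveColumns(Properties, {"Location"}),
    // ...and promote DispName in its place
    Renamed = Table.RenameColumns(Removed, {{"DispName", "Location"}})
in
    Renamed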

This behaviour is inconsistent with the behaviour of other complex SharePoint fields. It does not affect capability, but in the interests of consistency, my strong suggestion would be for the SharePoint team to eliminate the DispName field, and leverage FieldValuesAsText for the text conversion in the data feed.

Using Location Components

All the text components of the location column are separated out automatically as columns in Power Query. They can be used as any other column, and no additional action is necessary.

Automatically extracted location components

Using GeoLoc

Power BI will automatically geocode data at the time the report is rendered. The text components can therefore be used by the reporting engine to place data on a map. However, geocoding is a relatively computationally expensive operation, especially with a lot of data or a poor internet connection. In addition, some visuals may require specific latitude and longitude co-ordinates. These co-ordinates are available through the GeoLoc column if they are needed, but they do need to be extracted.

Within Power Query, locate the GeoLoc column, and click on the Expand icon in the right of the column header.

Select both the Latitude and Longitude columns and deselect Use original column name as prefix. In my testing, neither Altitude nor Measure returned any meaningful data, so they can be safely ignored; however, this could change in the future.
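
In M, the expansion and subsequent typing might look like this sketch (load steps as before; step names illustrative):

let
    Source = SharePoint.Tables(
        "https://contoso.sharepoint.com/sites/realestate",
        [ApiVersion = 15]
    ),
    Properties = Source{[Title = "Properties"]}[Items],
    // Expand Latitude and Longitude from the GeoLoc record,
    // keeping the plain names (no "GeoLoc." prefix)
    Expanded = Table.ExpandRecordColumn(
        Properties, "GeoLoc",
        {"Latitude", "Longitude"}, {"Latitude", "Longitude"}
    ),
    // Latitude and Longitude need to be decimal numbers for mapping
    Typed = Table.TransformColumnTypes(
        Expanded, {{"Latitude", type number}, {"Longitude", type number}}
    )
in
    Typed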

At this point, we are almost ready to do some reporting. Once all the required columns have been shaped, and their data types set, select the Close and Apply button from the ribbon.

Reporting

Before using the location data on a map, it is important to categorize each of the components so that Power BI knows how to use them. To categorize a data field, select it from the fields list. Then select the Modeling tab from the ribbon and click the Data Category dropdown.

The category for most of the fields is obvious, but below is a table of recommended choices. In addition, both the longitude and latitude fields need to be set to the Decimal Number type.

Field            Category
-----            --------
City             City
CountryOrRegion  Country/Region
Latitude         Latitude
Location         Place
Longitude        Longitude
PostalCode       Postal Code
State            State or Province
Street           Address

Once categorized, the data can be placed on a map according to any desired parameters. In this case, the image below shows a map of listings colour-coded by asking price range.

The resulting report can then be published to the Power BI service, and then embedded into a SharePoint page through either the Power BI web part, or secure embedding if so desired.

Happy reporting!


Use Power BI dataflows to warehouse SharePoint list data


Reporting on SharePoint data has been a requirement for a long time, and there have been many approaches to fulfill this need. Custom web parts, Data View web parts and SSRS direct connected reports have historically been some of the solutions, but they all suffer from the same problem. If you have any serious amount of SharePoint data, you’ll quickly begin to bump into capacity limits and performance limitations, and in some cases, you can impact the performance of the overall system. In order to avoid this problem, it is necessary to warehouse SharePoint data first, as I argued in this post from 2012.

Once your list-based data is in a relational database, the performance issue is taken care of. However, the means of getting it there have traditionally been problematic. For a long time, there was a CodePlex project called the SharePoint List Source and Destination. This solution provided read and write access to SharePoint lists from SQL Server Integration Services (SSIS). Unfortunately, it was last updated in 2012, it was unsupported by Microsoft, and it did not support authentication for Office 365. This of course rendered it useless for use with SharePoint Online. In 2015, SQL Server Integration Services got an OData source, and given that SharePoint lists have OData endpoints, this became a viable option, particularly given that it did support Office 365 authentication. The OData connection from SharePoint did, however, have some limitations as well.

For cloud scenarios, Power BI has emerged as a very competent way of reporting on SharePoint data. It has native connectors for SharePoint list data, both on premises and in the cloud, and Power BI reports can be hosted in the cloud through the SharePoint Power BI web part. On premises, the same can be done with Power BI Report Server. The structure of Power BI reports means that the data is cached in a data model, so reports are not run directly against the list data source. This avoids the performance issues listed earlier.

Earlier this year I published a series of articles detailing how to do exactly this. The only issue with this approach is that the data shaping and preparation is always specific to a single report. If I have 5 different reports that use one list, I must query and shape that data 5 different times, once for each report. This is where Power BI dataflows come in.

In this context, dataflows are essentially a data warehousing layer with transformation capability. Instead of each report connecting back to a source list, the dataflow connects to the list, shapes the data with Power Query online and stores it in a data lake. The Power BI reports then connect to the dataflow as their data source. Transformation and storage only need to happen once.

As of this writing, dataflows are in public preview, so be warned – some things could change.

Creating a dataflow

Creating a dataflow from a SharePoint list is relatively straightforward. In our examples below, we will work with the same sample list from the series of articles on SharePoint data earlier this year. To begin, open Power BI and navigate to a workspace (your personal workspace will not have dataflows). Click on the workspace name in the navigation pane and the dataflows tab should be available.

To create a new dataflow, Select the Create button, and click dataflow.

Select the Add new entities button and the data source selection will appear. SharePoint list and SharePoint Online list are both options. SharePoint list is for on-premises list data, which works with the On-Premises Data Gateway. In our case we are working with SharePoint Online, so we select the SharePoint Online source.

At this point, you enter the URL for the site that you want to connect to (NOT the URL for the list) and select the Next button. Power BI will connect to the site and you can then select the list you want to work with. In our case, we need our Listings data, so we select that list and click Next.

Finally, we’re in the Power Query editing screen. This should be quite familiar to those used to working with Power Query in either Power BI Desktop or in Excel. From here you can select the columns that you want to include in the dataflow.

Although this experience is similar to building queries in Power BI Desktop, there are a few noticeable differences. Queries in a PBIX file are referred to as queries, but within a dataflow they are referred to as entities. These entities can be custom, or they can be mapped to Common Data Model object types. The Power Query web editor also does not include the full-featured editing ribbon found in Power BI Desktop, but instead has a button bar. Many of the editing options available in Power BI Desktop are not available in the Power Query web experience.

If you have read some of my earlier articles on working with SharePoint data in Power BI, you will notice that there are fewer columns available than in the Desktop Power Query editor. Most notable for those of us working with SharePoint data is the absence of the FieldValuesAsText column, which is the convenient way of retrieving the text representation of complex SharePoint list column types. At first glance, this would appear to be quite limiting.

However, by right-clicking on the entity name, we can access the Advanced Editor.

This Advanced editor allows you to write queries by hand using the M language. A side benefit of the Advanced editor is that it makes queries portable between platforms: Desktop, Excel, and now dataflows. You can therefore build your queries in Power BI Desktop using its fully functional editor and then copy and paste them into a new blank query in the dataflow editor. This approach allows you to take advantage of the SharePoint helpers built into Power BI Desktop, as the FieldValuesAsText column and others become available. Using this technique, the Listings example can be transformed into several normalized tables in the dataflow.
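
As an example, a query like the following sketch, built and tested in Power BI Desktop, can be pasted into a blank dataflow entity unchanged. The site URL is hypothetical, and the expanded field names are just examples of what FieldValuesAsText exposes:

let
    Source = SharePoint.Tables(
        "https://contoso.sharepoint.com/sites/sales",
        [ApiVersion = 15]
    ),
    Listings = Source{[Title = "Listings"]}[Items],
    // FieldValuesAsText holds the text rendering of complex column types;
    // expand only the fields this entity needs (example field names)
    Expanded = Table.ExpandRecordColumn(
        Listings, "FieldValuesAsText",
        {"Author", "Editor"}, {"CreatedBy", "ModifiedBy"}
    )
in
    Expanded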

Click Done to save your entities, and then the Save button to save your dataflow. You will be prompted to Refresh Now, which is a good idea because, by default, the dataflow contains no data. To keep the data up to date, you need to set a refresh schedule by clicking the schedule refresh icon under actions for the dataflow in question. From here, you schedule data refresh in the same manner as you would for a Power BI report.

Using the dataflow

Once data is loaded into the dataflow, it becomes a source for a Power BI report. You must use Power BI Desktop to create this report; there is no way to connect a report to a dataflow in the pure web interface. Start Power BI Desktop and select “Get Data”. Choose the Power BI blade and then Power BI dataflows.

After clicking Connect, you will be presented with a set of Power BI workspaces that contain dataflows. Opening the workspace will allow you to open the dataflow and select the desired entities.

Once loaded, the report can be built just like any other. When it is refreshed, it will be refreshed from the data stored in the dataflow, NOT directly from the SharePoint list. It is therefore important to keep the dataflow itself up to date.

Any number of reports can be created from the dataflow. Instead of having all the transformation logic tied up within a single report, dataflows allow it to be centralized and consistent. With a little work, these transformations allow you to work with your SharePoint data just as though it were relational. Power BI dataflows really are the best way to perform data warehousing with your SharePoint data, whether your SharePoint is online or on-premises.
