
Tag: Power Query

Power BI – Working With the Data Management Gateway

 

Update August 2014 – Version 1.2 of the DMG offers significant changes, which I’ve written about here.

While it isn’t the flashiest, the Data Management Gateway is arguably the most important part of Power BI. Most of the other features of Power BI have either been available already, or could be achieved through alternate means. What was next to impossible was to have a workbook stored in Office 365 automatically update its data from an on premises data source in a manner similar to Power Pivot for SharePoint. The Data Management Gateway does this, and a fair bit more. This article will attempt to explore some of its intricacies, tricks, and things to look out for. This article is being written while using the invitation-only preview of Power BI, and some of the specifics could well change by public preview, and certainly by General Availability.

The Data Management Gateway is a Windows service, and can run on any Windows PC. Well, not just any PC, it needs Windows 7 or above, but it’s a relatively light service. In my case for the preview, I’m running it on my main workstation. It’s also relatively small, and you can run multiple gateways within your organization. If you want to have a look at it, you can download it here whether or not you have a Power BI enabled Office 365 tenant, but if you don’t, you won’t be able to do much with it.

Installing the gateway is pretty straightforward, and is well documented here, so I won’t go into details on that. Once it is installed, it establishes a communication channel with your Power BI service in the cloud (no firewall holes required), and essentially waits for requests. Once it gets one, it acts as a broker between your cloud service and your on premises data.

Most configuration is done in the cloud, in the Power BI Admin Center. If you’re interested in monitoring its activity on the gateway machine, you can do so using Resource Monitor or Task Manager. The process that you’re looking for is diawp.exe or diahost.exe. I have no idea where those names come from, but I’m going to guess Data Integration Agent Worker Process and host.
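If you’d rather script that check than watch Task Manager, a quick sketch along these lines will do it. This assumes Python 3 with the third-party psutil package is available on the gateway machine; it simply reports whether the gateway processes are running and how much memory they’re using.

```python
# Quick check for the Data Management Gateway worker processes.
# Assumes Python 3 with the third-party psutil package installed on the gateway machine.
import psutil

GATEWAY_PROCESSES = {"diawp.exe", "diahost.exe"}

found = False
for proc in psutil.process_iter(["name", "memory_info"]):
    name = (proc.info["name"] or "").lower()
    if name in GATEWAY_PROCESSES:
        found = True
        mem_mb = proc.info["memory_info"].rss / (1024 * 1024)
        print(f"{proc.info['name']}: pid={proc.pid}, working set ~{mem_mb:.1f} MB")

if not found:
    print("No gateway processes found - is the Data Management Gateway service running?")
```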

image

Once installed, the gateway performs two major, semi-related functions. The first is the aforementioned on premises data refresh capability. In addition to this, the gateway also provides the ability to publish your on-premises data sources as an OData feed that will be accessible in the cloud. In its current version, the gateway only supports a limited set of data sources – all of them SQL Server. The official list can be found at the bottom of this document. Although it’s not listed specifically, SQL Azure databases are also supported.

I’m going to drill down a little on these two main functions, and share some of my experiences. We’ll start with the OData feed.

Publishing an OData Feed

An OData feed is published by creating a data source. This is done from within the Admin Center by navigating to the data sources tab, and selecting “new data source”.

image

This starts a data source “wizard”. The first question to be answered is what this data source will be used for.

image

It isn’t necessary to create an OData feed in order to refresh on-premises data, but it is necessary to create a data connection (more on this later). The “enable cloud access” option essentially tells the gateway that it’s OK to allow on demand and automatic data refreshes from the cloud. The “Enable OData Feed” option is pretty self explanatory – if you don’t enable it, the only thing that the connection can be used for is data refresh. These two can be selected independently of each other. After selecting next, you are presented with the connection info screen.

image

To start, give the connection a name. Since the connection will typically be to a specific database, the name should reflect that. You may also wish to add something about the database’s location if you deal with similar data sets. The next selection is that of the gateway. You MUST have a gateway in order to create a data connection. You can have multiple gateways registered in your tenant, and the selection of the gateway will dictate the connection provider choices. It’s an easy thing to forget to choose your gateway, and then think that the page is broken because there are no Provider choices.

You can choose one of two methods to create your connection – connection string, or (if you’re lazy like me) you can choose Connection Properties, and let the tool build your string for you. When it comes to troubleshooting data refresh, you may find that the connection string method is more helpful, but either way should be equivalent. Once you have completed the bulk of this form, you need to enter credentials. First, select the method of authentication – your choices are Windows Authentication or SQL Authentication. Then select the credentials button which launches the Data Source Manager.
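If you want to sanity-check the server, database and credentials before registering them, a quick test from any machine that can reach the SQL Server can save some back and forth. The sketch below assumes Python with the pyodbc package and a SQL Server ODBC driver installed; note that the ODBC-style string it uses isn’t the same format as the string Power Pivot and the gateway typically show, so it only verifies connectivity and credentials. All server, database and account values are placeholders.

```python
# Minimal connectivity check for the SQL Server behind a planned data source.
# All server, database and credential values below are placeholders.
import pyodbc

conn_str = (
    "Driver={SQL Server Native Client 11.0};"  # or another installed SQL Server ODBC driver
    "Server=MYSERVER;"
    "Database=AdventureWorks;"
    "Uid=report_reader;"
    "Pwd=placeholder_password;"
)

with pyodbc.connect(conn_str, timeout=5) as conn:
    print(conn.cursor().execute("SELECT @@VERSION").fetchone()[0])
```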

image

The Data Source Manager is a click-once application that communicates directly with the gateway and is used to register the connection’s credentials directly with it – they are not stored in the Office 365 tenant or the BI Sites app. Because of this direct connection, you need to be on a network that is local to the gateway. This will not work from a remote location. If, like me, you are not joined to a domain, you will also need to be on the gateway machine itself.

On launch, it will go through a series of checks (this can take a while), verifying the gateway and the tenant. When it’s ready, it will show “Data Source Manager initialized successfully”, and you can go ahead and enter the credentials. Once you do, be sure to use the “test connection” button to verify that everything is working. When ready, click OK to register the credentials, then click save to save the data source. You will then be taken to the data settings page.

image

From here you can choose to not only index the metadata in the data catalogue, but also the data itself, and you can choose how frequently it is updated. The indexing option is currently disabled in the preview, so I have little to say about it at this point. Its purpose is to improve discoverability of the data via Power Query. The second section is the selection of tables to expose to the OData feed. You can choose from any tables or views, but as you can see from the example above, if your table/view contains any unsupported types (in this case geography fields), the entire object will be unavailable for publishing.

Clicking OK brings you to the users and groups page. From here, you can select those that will be able to use these data sources in Power Query, or manually refresh workbooks in the browser. As with all things SharePoint, it’s a good idea to use a group here.

Once done, your data connection is ready, and your OData feed is available. To use it, you’ll need to discover its address. You can do this by clicking on the ellipsis beside its name in the list.

image

image

As you can observe from the URL, this is a cloud service. You should be able to connect to the service from anywhere, and it will connect through the gateway to serve up the data. While this is great from a mobility standpoint, if you happen to be on premises, this would be quite inefficient, as the data would first need to be transferred to the endpoint in the cloud, and then back to the source network.

The good news is that the gateway is able to detect when you are accessing the feed locally, and it will redirect you to the source without sending the data up to the cloud and back. The bad news for us preview users is that this is the only thing working at the moment. Therefore, for the preview period at least, in order to access the OData feed, you must be on a local network. Specifically, you must be able to resolve the server name defined in the connection string.
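A quick way to confirm that last condition is to try resolving the server name yourself. A trivial sketch (Python; the server name is a placeholder) follows:

```python
# Confirm that the server named in the connection string resolves on this network.
import socket

server = "MYSERVER"  # placeholder - the Data Source / Server value from the connection string

try:
    print(f"{server} resolves to {socket.gethostbyname(server)}")
except socket.gaierror as err:
    print(f"Cannot resolve {server} from here: {err}")
```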

If you meet these conditions, you can test the feed using Power Query. In Excel, go to the Power Query tab, and select “From OData Feed” in the “From Other Data Sources” dropdown.

image

You will then be prompted to log in. You need to use an Organizational Account (Office 365) to do this. This is an account associated with the Power BI license. However on this screen it’s referred to as a “Microsoft Online Services ID”.

image

I personally feel this could easily be confused with a Microsoft Account (Live ID). What’s worse is that Microsoft Accounts can be used with Office 365, so it’s certainly less than clear as to which credentials should be used here.

Clicking on the Sign In button takes you to a standard authentication dialog. In the preview, there is a small bug that requires you to enter your account completely in lower case. Failing to do so will cache the wrong credentials, and you’ll be denied access moving forward. If you encounter this problem, the solution is to close Excel, clear the browser cache, and restart.

Once you have authenticated successfully, you can save the connection (Power Query will save it in its cache), and work with it as with any other data source.

Thus far, I’ve only been able to use Power Query to connect to the OData feed. I have tried using Power Pivot directly, but although it is supposed to support OAuth, I can’t seem to save my Office 365 credentials. I’ve only tried via these two mechanisms – if anyone has tried others, I would love to know about it.
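For the curious, a third mechanism would be to call the feed directly over HTTP. I haven’t verified this against the preview service, so treat the following purely as a sketch: it assumes you already have a valid access token for your organizational account (acquiring one isn’t shown), and both the feed URL and the token are placeholders.

```python
# Rough sketch of reading the published OData feed outside of Excel.
# The feed URL and bearer token are placeholders; acquiring an organizational
# account token is not shown, and this has not been verified against the preview.
import requests

feed_url = "https://<your-odata-feed-endpoint>/Tables"  # placeholder
token = "<organizational account access token>"          # placeholder

resp = requests.get(
    feed_url,
    headers={"Authorization": f"Bearer {token}", "Accept": "application/json"},
    timeout=30,
)
resp.raise_for_status()

# OData JSON feeds typically return their rows in a "value" array.
for entry in resp.json().get("value", []):
    print(entry)
```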

Refreshing Data in a Power BI Enabled Workbook

To start with, at the time of this writing (Power BI preview) quite a number of features that will be available in GA have not been enabled and/or are not supported. Data refresh scheduling is as yet unavailable, and the data sources that can be refreshed are restricted to direct SQL connections. Models created with Power Query cannot yet be refreshed. As these features become available, I will update this article with any relevant findings.

Given the fact that the Gateway is required both for an OData feed and for data refresh, you might think that you must use the OData feed in your data models in order for them to be refreshable. This is not the case. When a refresh is requested, the model is interrogated for its data connections. The data catalogue is then interrogated for a data source with a matching connection string, and if found, is used. The Gateway is then called to retrieve the data if it is on premises. If the data source is SQL Azure, the Gateway is still used, but the data is loaded directly from SQL Azure – it does not need to be sent to the Gateway first.

As mentioned above, Power Query queries cannot yet be refreshed by the gateway. The only type of connection that I’ve been able to successfully refresh thus far is one created directly in Power Pivot. When creating a connection in Power Pivot, pay close attention to the connection string. You may need it later if you have refresh issues.

In addition to the data sources supported by the gateway, two other data sources can be refreshed. Project Online has a number of OData feeds that can be refreshed directly, and public OData feeds (not requiring authentication) can also be refreshed. I don’t currently have an instance of Project Online, so I don’t have much to add apart from the fact that it is supposed to work. I have tested refresh with public feeds and they do in fact work well. The interesting thing I noticed was that while it worked in my Power BI tenant, it also worked in my regular Office 365 tenant. Apparently this feature has been there for some time.

As I mentioned, scheduled refresh is not yet available. When it is, this will be done from the BI Sites app, the same way that you “enable” a workbook. For now, it must be done manually. This is done the same way that it is with an on premises workbook. First open the workbook in the browser, and then, from the data tab, select “Refresh All Connections”.

image

The workbook will go dim, and you’ll see “Working on it…” or “Still working on it…” for a bit – it does take some time to refresh, depending on the data set. Using the methods that I mentioned at the beginning of this article, you can monitor the progress of the refresh, and the impact on the gateway machine. Also, for the moment at least, if your data source is SQL Azure, prepare for a long wait – refresh takes an exceptionally long time (in my case, about 10 min for a 1 million row x 20 column set of simple data types). The Azure refresh time should be addressed by GA.

Monitoring The Gateway

There are already a number of tools in the preview to help with troubleshooting Power BI, chiefly aimed at the Data Management Gateway. They can be found on the “system health” tab in the Admin Center.

image

The default screen shows the CPU utilization of your gateway machines (in this case, I have all of 1…) at the top, and the availability of your gateways at the bottom. At first glance, you would think that while my gateway is good from an availability standpoint, it’s taking a lot of my CPU. The reality is that the top chart shows total CPU utilization on the machine. If you want to see the utilization of the Gateway itself, you need to choose the specific machine from the Display dropdown.

image

Here you can see that while the machine utilization hovers around 40% (blue), the gateway utilization is barely noticeable (red).

Finally, what is likely the most useful part of monitoring, the logs, is well hidden. The logs are useful for troubleshooting data refresh issues, and can be accessed by clicking on “logs” which is at the top of this screen beside “dashboards”.

image

In the browser, you can see basic event information and basic error information if you have errors. However, if you download the logs as a CSV, you will see much more detailed information. If you are having problems with data refresh, particularly in the preview, I strongly recommend downloading it. One of the important pieces of information that it contains is the connection string that is being used:

image
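If you’d rather not hunt through the CSV by hand, a small script can pull out just the connection strings. The sketch below (Python; the file name and the “Data Source=” marker are my own assumptions about the export, not a documented format) simply scans every field for that fragment:

```python
# Scan a downloaded gateway log export (CSV) for connection strings.
# The column layout of the export isn't documented here, so this just searches
# every field for a "Data Source=" fragment. The file name is a placeholder.
import csv

with open("gateway_log.csv", newline="", encoding="utf-8-sig") as f:
    for row in csv.reader(f):
        for field in row:
            if "Data Source=" in field:
                print(field)
```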

You can compare that to the connection string that is being used by Power Pivot in the workbook. You find the string in Power Pivot by first clicking on “Manage” in the Power Pivot tab, and then in the Power Pivot window, choose “Existing Connections” in the ribbon. You should see your connection under “PowerPivot Data Connections”. Select it, and click the Edit button.

image

In the next screen, click the “Advanced” button. You should then be presented with the data connection property dialog.

image

Here, at the bottom, you will find the connection string being used by Power Pivot, and this is what the Gateway uses to look for one of its registered data connections. If you find any discrepancies, the chances are that they are at the source of your refresh problems, and that they should be addressed.
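When comparing the two strings, casing and the order of the key/value pairs can make them look more different than they really are. A small sketch like the following (illustrative only – this is not how the gateway itself performs the match, and both strings are placeholders) breaks each string into pairs and reports only the keys that actually differ:

```python
# Compare the workbook's connection string with the one registered in the gateway
# by breaking each into key/value pairs, so ordering and casing differences don't
# hide the real discrepancy. Both strings below are placeholders.
def parse_conn_str(conn_str: str) -> dict:
    pairs = (p for p in conn_str.split(";") if "=" in p)
    return {k.strip().lower(): v.strip() for k, v in (p.split("=", 1) for p in pairs)}

workbook_str = "Provider=SQLOLEDB;Data Source=MYSERVER;Initial Catalog=AdventureWorks"  # placeholder
gateway_str = "Data Source=myserver;Initial Catalog=AdventureWorks"                     # placeholder

wb, gw = parse_conn_str(workbook_str), parse_conn_str(gateway_str)
for key in sorted(set(wb) | set(gw)):
    if wb.get(key, "").lower() != gw.get(key, "").lower():
        print(f"{key}: workbook={wb.get(key)!r} vs gateway={gw.get(key)!r}")
```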

This is an early, first glance walkthrough of some experiences using the Data Management Gateway. Hopefully it can be of some help for the early adopters. I will try to keep this updated as Power BI moves from preview to General Availability.


How to Hide Objects From the Power BI Mobile Apps

If you’ve worked at all with the Power BI mobile app, you’ll find that it’s quite straightforward to use. What it does in essence is to render out objects from Excel workbooks in a manner that is easy to consume from a mobile device. It uses Excel Services to render the content, but it doesn’t do so in a manner that may be familiar to Excel Services users, which is to essentially replicate the spreadsheet editing environment in a browser. Instead, each named object is presented as a distinct object, and in favourites, the workbook is represented as a section.

image

Clicking on any of the objects opens it in full screen mode. Swiping down from the top, or right clicking on the report opens a bar at the top that allows you to navigate to any of the other objects.

By default, any named object will appear, as well as Power View reports. (It should be noted that as of right now at least, Power Map objects are not rendered by the Mobile app). However, what happens if we don’t want an object to be rendered in the application? If you use Power Query, and you follow my recommendations of loading the data directly into the data model, you’ll likely bump into the need to do this fairly quickly. Queries are named objects in the workbook, and are rendered by the mobile app.

image

In the screen above, there are actually 7 objects in the workbook, but the favourites view can only display 6. Three query stubs are blocking a Power View report. Not an ideal situation. What we want to do is to hide these queries, but how to do so is not immediately obvious. The BI app uses Excel Services to render the objects, and since the early days of Excel Services, Excel has had a mechanism to control what gets rendered.

If we open the source workbook in Excel, we will see that all of the named objects are contained in worksheets named “Pivots” and “Pie”; another worksheet, “Power View 1”, contains the Power View report. The queries are all stored in different worksheets. In order to control what gets rendered, we first click on the “File” tab in the ribbon, and click the “Browser view options” button.

image

You are then presented with a dialog that allows you to determine what is rendered by Excel Services. The default is “Entire Workbook”, but you can select specific “Sheets”, or for very fine grained control, “Items in the workbook”.

image

In our case, we just want to turn off items in the unnamed sheets, so we deselect them. Once this is done, all that is necessary is to save the workbook back into its library, and the results become immediately apparent on a refresh of the app.

image

This is a bit of a brute force approach. It affects not only the mobile app, but also anyone using the workbook from a browser – those sheets will be invisible to any users using the Excel Web App. I can foresee a need to hide things from the mobile app but not from the browser. I can also see a need to control the layout of the objects, but these are early days. For now, I’ll take this approach.


How To Load Data Directly Into the Excel (Power Pivot) Data Model

In a recent post, I discussed that while Power BI sites allow for data models up to 250 MB, the size of the worksheets portion of any workbook cannot exceed 10 MB. This is because of the fact that while the data model is passed to Analysis Services for processing, the worksheet itself is still subject to the hard 10 MB limit imposed by Excel Services. It is therefore vital to keep a minimum amount of data in the workbook.

The best way to minimize the workbook size is by storing the data exclusively in the data model. There are a number of data import methods in Excel/Power Pivot, including the new Power Query. Keeping the data out of the worksheet is more obvious with some of these methods than with others.

Power Pivot Import

Prior to Excel 2013 and/or Power Query, importing into the model exclusively was almost the default behaviour (yes, it was possible to create models from Excel data, but that wasn’t the most common use case). You would invoke the Power Pivot add-in by clicking on the tab, click the manage button, and you would be taken to the Power Pivot management window, where you could import data from a wide variety of sources.

image

This approach still works just fine. Data imported using this method will be brought directly into the model, which is what we want to achieve. However, Excel 2013 now supports the data model by default, and allows data to be imported to it through several approaches. The Power Pivot editor is included, but is not enabled by default, so it is not immediately obvious that it is an option.

Excel Import

With Excel 2013, you can now use native Excel functionality to get data into the model. The most familiar way to do this is through the regular Excel data import interface. You click on the Data tab in Excel, choose “From Other Sources”, and select your data source.

image

Once done, and the connection is selected, you are presented with the Data Connection Wizard. There is a subtle change to the wizard in Excel 2013 when compared to previous versions: the addition of “Enable selection of multiple tables”.

image

Selecting this option will of course allow you to import multiple tables simultaneously, but what it will also do is automatically add those tables to the data model for you. However, if you don’t select this option, there is still an opportunity to add to the data model. After clicking next and creating a connection, you are presented with the Import Data dialog.

image

At this point, if you only selected one table, you can select the option to add it to the model here. If you had chosen the multi table option earlier, this option would be greyed out and selected – you have no choice. However, the most important part of this dialog is the “Select how you want to view this data” section. By default, Table is selected. If it is left selected, data will be imported directly into the model, but it will also be imported as a table into the worksheet. This is a BAD THING. Yes, it’s what users are accustomed to, but instead of just storing the data in a nice tiny compressed data model, you’re also storing it in a big fat worksheet, and it won’t take much data to exceed your 10 MB Office 365 cap. In addition to that issue, by doing this, you also limit yourself to about a million rows of data (Excel limit) instead of the hundreds of millions of rows that the model can handle.

Any of the other options will allow you to import directly into the data model, bypassing the worksheet altogether. These options are your friends. Use them.

Power Query Import

The latest, and most impressive method of importing data into Excel is Power Query. A description of Power Query is beyond the scope of this article, but it allows for relational drill down, complex transformations, nested queries and a host of other options in addition to a wide variety of new data sources (Facebook, Hadoop, etc.). It’s also the ONLY way to consume data served up by the on-premises Data Management Gateway component of Power BI. It provides a very wide array of data import options to the end user.

However, one of the problems with the tool is that it tends to promote the importing of data into worksheets, which is a practice that in my opinion should be strongly discouraged. It is certainly possible to avoid this behaviour, as I’m going to demonstrate shortly, but it does require that the user be aware of its importance. Importing to the worksheets should be an option, not a default as it currently is. My concern is that far too many people will build a large, fancy report that winds up being over the 10 MB workbook limit, publish it to Office 365, only to have Power BI fail because it’s too large, and then give up in frustration.

To import data directly into the model with Power Query, first click on the Power Query tab in Excel, and then select your data source. After entering the server/database specifics and your credentials if necessary, you are presented with the Power Query Editor dialog.

image

 

After performing any necessary transformations, you are ready to import your data. To do so, simply click the Done button. You will then be returned to Excel, which will immediately import your data into a worksheet, and open up the Query Settings window. So – what happens when the data source exceeds 1,048,576 rows of data? Pretty much what you would expect – you receive an error. A pretty explicit one at that.

 

image

In this case, the data is not brought into the worksheet, but the query is still defined, so the model can still be successfully populated by clicking the “Load to data model” link. However, if the source data does fall within Excel’s parameters, it will be brought in to the worksheet.

image

In order to load it into only the data model, we must first turn off the “Load to worksheet” slider, and then click the “Load to data model” link. (Note: I have no idea why these are two different control styles.) The slider can be turned off while the data is loading, so you don’t have to wait for the load to complete. If you turn off the worksheet load and don’t click the data model link, the data is not loaded anywhere. This is a perfectly valid situation. With Power Query, you can base queries on other queries, or append/merge queries. By doing this, you can load only the end result, and not the intermediate queries.

Once done, the worksheet will display the “Load to worksheet disabled” message in place of the result set.

image

However, opening the Power Pivot management window will display the imported data, and you can work with the model.

In summary, Power Query brings many new capabilities to the data loading and transformation process. However, with great power comes great responsibility. Unless Microsoft makes a change to the default behaviour of the Power Query import process, I’ll be telling anyone that listens to make sure that they always turn off the “Load to Worksheet” option if they’ll be publishing to Office 365.


Power Query Preview vs Power Query Release – How Can I Share a Query?

You learn something new every day. Hopefully this helps a few confused souls, it would have helped me.

Microsoft first announced the release version of Power Query in July 2013, and last week (August 2013) it put out a stability update for it. Not wanting to be left behind, I gleefully installed the update, and when I got my invitation to try the Power BI preview, I went ahead and started working with it. One of the first things that I wanted to try was to work with shared queries. I couldn’t figure out how to create one. After a little research, I knew that I was missing something. There is a section (the Organization section) in the Power Query ribbon where you can sign in and access shared queries.

image

The problem was, I didn’t have it. Initially, I thought that it was left out of the preview, but then quickly realized that others had it. Then perhaps it was a license issue, or an installation issue. However, after installing it on three different systems with two different bit levels, I gave up and posted my dilemma in the community forums. It didn’t take long for someone from Microsoft to chime in with the answer.

There are two different versions of Power Query. The released version (the one from July and the stability update) is meant for standalone use, and does not have the ability to connect to a Power BI site. The preview version is for corporate use, and it does have that capability. The release version is 1.5.3296.2082, and the preview version is 2.6.3387.121.

Power Query (for use with Power BI sites) version 2.6.3387.121
http://www.microsoft.com/en-us/download/details.aspx?id=39933

Power Query (standalone) version 1.5.3296.2082
http://www.microsoft.com/en-us/download/details.aspx?id=39379

Once I installed 2.x, everything was good. Luckily, the main Power BI download site now points to version 2.x, so not too many people should be caught by this. If you have been, hopefully this helps.


Working With Power BI, Office 365 and File Size Limits

Note – this article was written during the preview cycle of Power BI, and certain behaviours and screens may change by final release.

Quick, what’s the biggest file that you can upload into SharePoint? As with anything SharePoint, of course it depends. This article explains some of the file size boundaries in SharePoint and Office 365, how they impact Power BI, how to take advantage of the storage changes that Power BI provides, and some of the intricacies of using the xVelocity model with Power BI.

The maximum file upload size is 250 MB by default with SharePoint 2013, and 50 MB with prior versions. This setting can be changed on an application by application basis within Central Administration, and once set, is a hard limit. If your file exceeds it, you won’t be able to upload it into SharePoint.

If you use Excel Services, you may also note that the default maximum size for an Excel workbook is 10 MB. This too can be changed in Central Administration (it’s a property of trusted file locations). If you upload a workbook that exceeds this limit, Excel Services won’t be able to render it, and you’ll need to use the Excel client. Depending on performance limitations, I often like to adjust this setting to match the maximum file upload size to avoid end user confusion.

SharePoint Online in Office 365 works a little bit differently. Until recently, the default upload limit was unchangeable. Recently however, SharePoint Online removed the upload limit altogether, so now the maximum file upload size is 2 GB, which represents the maximum file size that SharePoint can store. For Excel Services Online there are settings that can be adjusted, but the maximum workbook size isn’t one of them. With Excel Services in Office 365, the maximum workbook size is 10 MB. Period.

If you use Power Pivot to do analysis, data is brought into the data model that is embedded within the workbook. That workbook is then stored in SharePoint, and on first use of the model (clicking a slicer, expanding a dimension, or anything requiring the retrieval of data from the model), an instance of the model is created in a backing Analysis Services engine by Excel Services.

Given that all of this is wrapped up into a single workbook file (xlsx), 10 MB would seem to be pretty constraining for data storage, and it is. The data model is very efficient, and data is highly compressed, but 10 MB is simply too small for most involved analyses. This isn’t a problem on premises, where additional resources can be allocated and the limit increased, but in Office 365, you’re simply stuck with the 10 MB limit.

Enter BI Sites, the recently announced Office 365 based Business Intelligence app that is part of the Power BI application suite. BI Sites is a SharePoint app that provides additional capability to workbooks with embedded models stored in SharePoint Online libraries. BI Sites allows for data models as large as 250 MB. The BI Sites app doesn’t actually store content, it just renders it, and provides additional capability such as automatic data refresh. BI Sites also relies on Excel Services to render the content, so does a Power BI subscription increase that workbook limit to 250 MB? Nope – that limit is still firmly in place. So how does it get around this limit?

As mentioned above, when a model is accessed by a user by forcing a data retrieval from the model, Excel Services creates an instance of the model in the backing Analysis Services instance, if it hasn’t already been created. Subsequent calls just utilize the backing version. What Power BI does is preload this model, and then drop the model from the workbook (it is reconstituted on download). Given that the model is the large part of the workbook, this drops the file size considerably, allowing it to work within the limits imposed by Excel Services.

Notice that the limit of 250 MB above is specified for the model, NOT for the workbook. The workbook limit is still 10 MB, and this is quite visible if you do things in the wrong order, at least in the Power BI preview. To demonstrate, we will work with a model that is similar to the one that I created in this article, which is a rudimentary budget analysis with Great Plains data. There are three versions of the analysis file for demonstration purposes.

image 

In the first version, the data was first imported into Excel worksheets using Power Query, and then loaded into the model before the model was edited. It is obviously the largest of the three, as it contains both the original data, and the more optimized model. In the second file, the data was loaded directly into the model with Power Query. After model edits, the analysis was created using a simple pivot table and pivot chart. It is the smallest of the three, as the data is contained exclusively within the optimized model. In the last version of the file, the data was imported into Excel worksheets using Power Query. Take note of the fact that the file is 12 MB, 50% larger than the model-only version, all of which counts when considering the Excel Services limit.

After uploading these three files to an Office 365 site, only the EnbGPDataModelOnly file can be accessed via Excel Services directly. This makes sense because the other two are larger than the 10 MB limit, and Excel Services can’t do anything with them at all, resulting in the error below:

Sorry, we can't open your workbook in Excel Web App because it exceeds the file size limit. You'll need to open this in Excel.

If we now navigate into the Power BI application, we will see a warning symbol on the tiles for our workbooks. This is because they have not yet been optimized for viewing on the web. What does that mean? It means that the model has not yet been extracted from the workbook, and attached to the Analysis Services engine.

image

To do this, click the ellipsis on the tile, and then select Enable. After confirming, the model will be extracted, and you receive a confirmation. In our case, the only one that will be successfully enabled is our EnbGPDataModelOnly file, but the reason is different for the other two files. In the case of EnbGPExcelOnly, the data model was never populated, which results in “This Workbook could not be enabled for web viewing because it does not contain a data model”.

image

This makes perfect sense, but it does mean that all of the features available through the BI Sites feature are unavailable to workbooks that don’t use data models. There is one exception to this however. The Power BI app for Windows 8 and iOS can render workbooks without models, provided that they are within the 10MB limit.

If we try to enable the EnbGPDataModelandExcel file, which does contain a data model, we get the error “This workbook could not be enabled in Power BI for Office 365 because the worksheet contents (the contents of the workbook that are not contained in the data model) exceed the allowed size”.

image

If we look at the file size with only the model, it’s about 8.7 MB. The file without the model is 12 MB, so even with the model extracted, the limit is exceeded, and the enablement process detects this.

On a side note, I think that these error messages have some interesting language. They make reference to “Power BI for Office 365”. This implies, to my mind at least, that there may be a version coming that isn’t for Office 365. No on premises version of Power BI has ever been mentioned by Microsoft, but this may hint at it.

When complete, the failed and successful enablements are easy to spot.

image

Clicking on the middle tile will successfully render the workbook.

Next, let’s work with a file that can benefit from these new features. I created a model that’s identical to the “model only” version, with the only difference being that I imported all of the data from the three tables, not just the selected columns. The resultant file (named EnbGPBigModel.xlsx) is 54 MB on disk – well above the 10 MB Excel Services limit, but below the 250 MB Power BI limit. However, once uploaded, clicking on it directly in the library gives the “file size exceeded” error message. What’s up with that?

The reason is that Excel Services just sees that it is too big, and gives up. In order to extract the model, we must first enable it in Power BI before we can work with it in Excel Services. To do that, we simply repeat the above process by navigating to the Power BI app, and enabling it.

image

Once this has been done, the workbook can be accessed by clicking on it within the Power BI app here, in the Power BI mobile app, or by navigating back to the source library and using it directly with Excel Services.

image

image

Therefore, when building models, it is vitally important to distinguish between the size of the model, and the size of the rest of the workbook. While the model can grow to 250 MB, the sum of the rest of the content cannot exceed 10 MB. Note to Microsoft – we could really use a good term here, as opposed to “rest of the workbook”. Let’s call it the spreadsheets for our purposes right now.

So, how do we know the size of the model vs the size of the spreadsheets? Well, an Excel file (xlsx) is really just a zip file in disguise. In fact, if we rename the file to end in a .zip extension, we can crack it open with a ZIP viewer (or as a folder in Windows). If we do so with our file that contains both the spreadsheets and the model, open it up, and then drill down to the xl\model folder, we’ll see the file item.data.

image

This file is your data model. In this case, it is 7.8 MB in size. Subtract that value from the size of the overall xlsx file, and you have the size of your spreadsheets, which is the part that must fit within the 10 MB limit. When done, just rename the extension back to xlsx.
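If you don’t want to rename files back and forth, the same arithmetic can be scripted, since an xlsx is just a ZIP package. The sketch below assumes the model lives under the xl/model folder (as above) and uses the compressed sizes stored in the package, so the split is approximate; the file name is a placeholder.

```python
# Report the approximate size of the embedded data model versus the rest of an
# .xlsx file, treating the workbook as the ZIP package it really is.
import os
import zipfile

path = "EnbGPDataModelandExcel.xlsx"  # placeholder - use your own workbook

total = os.path.getsize(path)
with zipfile.ZipFile(path) as z:
    # Sizes as stored (compressed) in the package, for everything under xl/model.
    model = sum(i.compress_size for i in z.infolist() if i.filename.startswith("xl/model/"))

print(f"Workbook on disk:      {total / 2**20:.1f} MB")
print(f"Data model (xl/model): {model / 2**20:.1f} MB")
print(f"Everything else:       {(total - model) / 2**20:.1f} MB")
```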

If we continue to have a problem, or as a matter of good practice, an excellent tool to use is the Workbook Size Optimizer tool from Microsoft – available here. This is (yet another) Excel add-in that will help to further optimize your model and reduce the file size. Just open your workbook, run the add-in, and follow the prompts.

We can see that although the 250 MB model size in Power BI helps to make Office 365 a viable BI platform, it does still require a certain awareness on the part of users. The most important lesson to learn from this is to try to avoid importing data directly into Excel. Whenever possible, bring the data directly into the model, bypassing any Excel storage. This is a good idea in any event, but Power BI further underscores the need for it. When using Power Pivot, it’s fairly straightforward, but the data acquisition interface available in Power Query tends to prefer importing into the worksheet. When using Power Query, care must be taken to avoid Excel import, and I’ll be posting another article on how to do this shortly.
