
Tag: Power Query

Using Excel With External Data – What’s the Right Tool?

Excel has been used with external data for… well, as long as I’ve been using Excel. So why would anyone bother to write a blog post about a capability this mature? In recent years, Excel has adopted a number of new, and frankly better, mechanisms for working with external data, while retaining the old ones. Given that there are now multiple tools in Excel for working with external data, it’s not always clear which one is best, and unfortunately no single tool wins in all cases, although I believe that one soon will.

The answer, as always, is “it depends”. When it depends, the important thing is to understand the strengths and weaknesses of each approach. With that said, let’s have a look at all of the options.

ODC Connections

ODC connections (Office Data Connections) are the traditional method of accessing external data in Excel. You can create or reuse an ODC connection from the Data tab on the Excel ribbon.

When using an ODC connection, you establish a connection with a data source, form some sort of query, and import the resultant data directly into the Excel workbook. From there, the data can be manipulated and shaped to support whatever the end user is trying to do. The one exception to this behaviour is the connection to SQL Server Analysis Services (SSAS). When a connection is made to SSAS, only the connection is created. No data is returned until an analysis is performed (through a pivot table, chart, etc.), and then only the query results are retrieved.

When the workbook using an ODC connection is saved, the data is saved within it. In the case of an SSAS-connected workbook, the results of the last analysis are saved along with it. For small amounts of data, this is just fine, but any large analysis is bound to quickly run into Excel’s data limit, which is 1,048,576 rows by 16,384 columns in Excel 2013. In addition, such a file is very large and extremely cumbersome to work with. Even so, Excel has been the primary tool of choice for business analysts for years.

Data loaded into the workbook can be refreshed on demand, but it can also be altered, shaped, mashed up, and as is too often the case, grow stale. Workbooks such as these have become known as “spreadmarts” and are the scourge of IT and business alike. With these spreadmarts, we have multiple versions of the same data being proliferated, and it becomes harder to discern which data is most accurate/current, not to mention the governance implications.

SharePoint has provided a way to mitigate some of the concerns with these connections. SharePoint itself supports ODC connections, so users can access these workbooks stored within SharePoint and refresh their data from the source, either on demand or on open. A single point of storage, along with a measure of oversight and browser access, helps to restore a modicum of sanity to an out-of-control spreadmart environment, but the core issues remain.

In order to help with the core issues, Microsoft introduced PowerPivot in 2009.

PowerPivot Connections

Created in PowerPivot

PowerPivot was originally (and still is) an add-in to Excel 2010, and is a built-in add-in to Excel 2013. PowerPivot allows for the analysis of massive amounts of data within Excel, limited only by the memory available on the user’s machine (assuming a 64-bit version). It does this by highly compressing data in memory using columnar compression. The end result is that hundreds of millions of rows of data can be analyzed efficiently from within Excel.

You can see that compression at work by comparing the same data imported directly into an Excel workbook and into a PowerPivot model within a workbook. The two files being compared contain election data, and represent the maximum number of rows that Excel can handle directly (1,048,576), across 25 columns.

Getting data into the model was originally (and still can be) a completely separate process from bringing it into Excel. PowerPivot has its own data import mechanism, accessed from the PowerPivot window itself. First, click on the PowerPivot tab in Excel, and then click Manage. If you don’t have a PowerPivot tab, you will need to enable the add-in. If you don’t have the add-in, you have an earlier version of Excel – you’ll need to download it.

Once the PowerPivot window opens, the “Get External Data” option is on the ribbon.

Once the appropriate data source is selected and configured, data will be loaded directly into the data model – there is no option to import that data into a worksheet. Once the data is in, pivot tables and pivot charts that connect to the data model can be added to the workbook, much like when creating an ODC connection to Analysis Services. In fact, it’s pretty much exactly like connecting to Analysis Services, except that the AS process is running on the workstation.

Created in Excel

PowerPivot, and more importantly the tabular data model, was included in Excel 2013. With that addition, Microsoft added a few features to make the process of getting data into the data model a little easier for users who are less tech savvy and may be uncomfortable working with a separate PowerPivot window. That’s actually part of the thinking behind leaving the PowerPivot add-in turned off by default.

When a user creates an ODC connection as outlined above, there are a couple of new options in Excel 2013. First, the “Select Table” dialog has a new checkbox – “Enable selection of multiple tables”.

When this option is selected, more than one table from the data source can be selected simultaneously, but more importantly, the data will automatically be sent to the data model in addition to any other import destinations.

Even if the multiple selection option wasn’t chosen, the next dialog in the import process, “Import Data” also has a new check box – “Add this data to the Data Model”.

Its purpose is pretty self-explanatory. It should be noted that if you choose this option and also choose “Only Create Connection”, the data will ONLY be added to the model – nowhere else in the workbook. This is functionally equivalent to doing the import from the PowerPivot window, without enabling the add-in.

Power Query Connections

When Power BI was originally announced, Power Query was also announced and included as a component. This was very much a marketing distinction, as Power Query exists in its own right and does not require a Power BI license to use. It is available as an add-in to both Excel 2010 and 2013, and will be included with Excel 2016.

Power Query brings some Extract, Transform and Load (ETL) muscle to the Excel data acquisition story. Data can be not only imported and filtered, but also transformed with Power Query and its powerful M language. Power Query brings many features to the table, but this article is focused on its use as a data acquisition tool.
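
To give a flavour of what Power Query records as you work, here is a minimal M sketch of an acquisition query; the server, database, table, and column names are all hypothetical:

    let
        // Connect to a hypothetical SQL Server database
        Source = Sql.Database("myserver", "Sales"),
        // Navigate to the dbo.Orders table
        Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
        // Filter and shape before loading - the "T" in ETL
        Recent = Table.SelectRows(Orders, each Date.Year([OrderDate]) >= 2014),
        Trimmed = Table.SelectColumns(Recent, {"OrderID", "OrderDate", "Amount"})
    in
        Trimmed

Every transformation applied in the query editor simply appends another step of this sort to the query.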

To use Power Query in Excel 2010 or 2013, it must first be downloaded and installed. Once installed, it is available from the Power Query tab.

In Excel 2016, it is available from the Data tab, via the New Query button.

Once the desired data source is selected, the query can be edited, or loaded into either the workbook, the data model, or both simultaneously. To load without editing the query, the load option at the bottom of the import dialog is selected.

Selecting “Load To” will allow you to select the destination for the data – the workbook, the model, or both. Selecting “Load” will import the data to the default destination, which is, out of the box, the workbook. Given that the workbook is an inefficient destination for data, I always recommend changing the default settings for Power Query.

To do so, select Options from the Power Query tab (2010 and 2013) or the New Query button (2016), click the Data Load section, and then specify your default settings.

Data Refresh Options

In almost every case when external data is analyzed, it will need to be refreshed on a periodic basis. Within the Excel Client, this is simple enough – click on the data tab, and then the Refresh All button, or refresh a specific connection. This works no matter what method was used to import the data in the first place. Excel data connections can also be configured to refresh automatically every time the workbook is opened, or on a periodic basis in the background.

However, workbooks can also be used in a browser through Office Web Apps and Excel Services (SharePoint and Office 365), or as a data source for Power BI dashboards. In these cases the workbooks need to be refreshed automatically, so that consuming users see the most up-to-date data when the workbooks are opened. The tricky part is that not all of the connection types listed above are supported by all of the servers or services. Let’s dive into what works with what.

SharePoint with Excel Services

Excel Services first shipped with SharePoint 2007, is part of SharePoint 2010 and 2013, and will be included with SharePoint 2016. From the beginning, Excel Services has allowed browser users to view and interact with Excel workbooks, including workbooks connected to back-end data. The connection type supported by Excel Services is ODC, and ODC only.

Excel Services has no mechanism of its own for maintaining data refresh. However, the data connection refresh options are supported, which means that the workbook can be automatically refreshed when opened, or on a scheduled basis (every xxx minutes, in the background). Unfortunately, this can come with a significant performance penalty, and once refreshed, the data exists only in memory – the workbook in the library is not updated. The data in the workbook can only be changed by editing the workbook in the client, refreshing it, and re-saving it.

Workbooks with embedded data models (PowerPivot) can be opened in the browser, but any attempt to interact with the model (selecting a filter, slicer, etc.) will result in an error unless PowerPivot for SharePoint has been configured.

SharePoint with Excel Services and PowerPivot for SharePoint

PowerPivot for SharePoint is a combination of a SharePoint service application and Analysis Services in SharePoint mode. When installed, it allows workbooks that have embedded PowerPivot data models to be interacted with through a browser. The way it works is that when such a workbook is first interacted with, the embedded model is automatically “promoted” to the Analysis Services instance, and a connection is made to it, allowing the consuming user to work with it in the same manner as with an SSAS-connected workbook.

The PowerPivot for SharePoint service application runs on a SharePoint server and allows for individual workbooks to be automatically refreshed on a scheduled basis. The schedule can be no more granular than once per day, but the actual data within the model on disk is updated, along with any Excel visualizations connected to it.

When the refresh process runs, it is the functional equivalent of editing the file in the client, selecting refresh all, and saving it back to the library. However, there is one significant difference. The Excel client will refresh all connection types, but the PowerPivot for SharePoint process does not understand Power Query connections. It can only handle those created through the Excel or PowerPivot interfaces.

Power Pivot for SharePoint ships on SQL Server media, and this limitation is still true as of SQL Server 2014. At the Ignite 2015 conference in Chicago, one of the promised enhancements was Power Query support in the SharePoint 2016 timeframe.

Office 365

Office 365, or more precisely SharePoint Online, supports Excel workbooks with ODC connections and embedded PowerPivot models in a browser. These workbooks can even be refreshed if the data source is online (SQL Azure), but they cannot be refreshed automatically, and only ODC and PowerPivot connections are supported for manual refresh; Power Query connections require Power BI for Office 365. In addition, Office 365 imposes a 30 MB model size limit – beyond that, the Excel client must be used. In short, the Office 365 data refresh options are very limited.

Power BI for Office 365

Power BI for Office 365 brings the ability to automatically refresh workbooks with embedded data models. Data sources can be on premises or in the cloud; on-premises refresh is achieved through the use of the Data Management Gateway. It also raises Office 365’s model size limit from 30 MB to 250 MB. With Power BI for Office 365, both manual and automatic refreshes can be performed for both PowerPivot and Power Query connections; however, PowerPivot connections are currently restricted to SQL Server and Oracle sources only.

The automatic refresh of ODC connections is not supported. A workbook must contain a data model in order to be enabled for Power BI.

Power BI Dashboards

Power BI Dashboards is a new service that allows users to design dashboards without necessarily having Office 365, or even Excel. It is currently in preview form, so anything said here is subject to change. It is fundamentally based on the data model; it currently works with Excel files as a data source, and Excel is promised as a report source as well. The service has the ability to automatically refresh the underlying Excel files on a periodic basis, more frequently than daily.

In order for a workbook to be refreshed by Power BI, it must (at present) be stored in a OneDrive or OneDrive for Business container. It must also utilize either a PowerPivot or a Power Query connection. At present, the data source must also be cloud based (i.e. SQL Azure), but on-premises connectivity has been promised.

SQL Server Analysis Services

Another consideration, while not a platform for workbooks, is SQL Server Analysis Services (SSAS). Excel can be used to design and build a data model, and that data model can at any time be imported into SSAS. As of version 2014, SSAS fully supports all connection types for import – ODC, PowerPivot, and Power Query. Once a data model has been imported into SSAS, it can be refreshed on a schedule as often as desired, and you can connect to it with Excel and share it in SharePoint. You can also connect to it in Power BI Dashboards through the SSAS connector. From both a flexibility and power standpoint, this is the best option, but it does require additional resources and complexity.

Refresh Compatibility Summary

For convenience, the table below summarizes the refresh options for the different connection types.

 

                                      ODC    PowerPivot    Power Query

Excel Client                           M         M              M
SharePoint/Excel Services              M         –              –
SharePoint/Excel Services/PP4SP        M         A              –
SQL Server Analysis Services Import    A         A              A
Office 365                             M         M              –
Office 365 with Power BI               –         A*             A
Power BI Dashboards                    –         A              A

M – Manual refresh

A – Both manual and automatic refresh

* Only limited data sources (PowerPivot refresh is currently restricted to SQL Server and Oracle)

 

The Right Tool

I started out above by saying that the selection of import tool would depend on circumstances, and that is certainly true. However, based on the capabilities and the restrictions of each, I believe that a few rules of thumb can be derived. As always, these will change over time as technology evolves.

  1. Always use the internal Data Model (PowerPivot) when importing data for analysis.

    The workbook is an inefficient destination for imported data, and worksheet-bound data is prone to growing stale. The data model is compressed, scalable, and a far better destination for analysis.

  2. Power Query is the future – use it wherever possible

    All of Microsoft’s energies around ETL and data import are going into Power Query. Power Query is core to Power BI, and announcements at the Ignite Conference indicate that Power Query is being added to both SQL Server Integration Services and to SQL Server Reporting Services. Keep in mind that we have been discussing only the data retrieval side of Power Query – it has a full set of ETL capabilities as well, which should also be considered.

  3. PowerPivot or ODC Connections must be used on premises

    PowerPivot for SharePoint does not support Power Query for refresh. This means that you MUST use PowerPivot connections for workbooks with embedded models. If you are already using SSAS, use an ODC connection within Excel.

  4. Power Query or PowerPivot must be used for cloud BI.

    PowerPivot connections will work for a few limited cases, but more Power Query support is being added constantly. Where possible, invest in Power Query.

  5. If on-premises, consider importing your models into SSAS

    SSAS already supports Power Query. If, instead of using PowerPivot for SharePoint, Analysts build their models using Excel and Power Query, they can be “promoted” into SSAS. All that is then required is to connect a new workbook to the SSAS server with an ODC connection for end users. The Power Query workbooks can be used in the cloud, and the SSAS connector in Power BI Dashboard can directly use the SSAS models created.

  6. Choose wisely. Changing the connection type often requires rebuilding the data model, which in many cases is no small feat.

In summary, when importing data into Excel, the preferred destination is the tabular model, and to get data into that model, Power Query is the preferred choice. The only exception is on-premises deployments. In those environments, consideration should be given to connecting to an SSAS server, and failing that, PowerPivot imports are the best option.


The New Power BI – Now With Enterprise!

Yesterday, Microsoft announced the next step in the evolution of Power BI. It’s getting quite a bit of attention, and rightly so, for its aim of bringing Business Intelligence closer to users. Democratizing BI has always proved a challenge – it has been the realm of the gurus in the white coats who hold the keys to the data. Microsoft is aiming to accomplish this democratization through a combination of user focus and, as of yesterday, a drastic change in its pricing model. Power BI just went from about $40 per user per month to free, or $9.99 per user per month for advanced capabilities. That’s quite a drop, and it is arguably the biggest announcement from yesterday – it will have a massive impact. The detailed price breakdown can be found here.

However, all of the focus on personal BI is, in my opinion, missing a key component. Power BI and its components have always focused squarely on personal and team BI solutions – that is to say, the ability for a power user to model data, visualize it quickly and easily, and share it out with fellow team members. While that capability is certainly retained in the new Power BI, this new version contains the first appearance of enterprise-grade BI in the cloud for Microsoft.

To fully understand this, it’s necessary to touch on the Microsoft BI stack as it stands today.

Microsoft BI On Premises

The on-premises BI story from Microsoft may be confusing and occasionally difficult to understand, but it is very powerful and relatively complete. In a nutshell, the story is good from personal, team, and enterprise perspectives.

On the enterprise side, there are products from both the SQL Server team and the Office team. Data warehousing is served by SQL Server, and ETL duties fall to SQL Server Integration Services (SSIS). Multidimensional analysis storage is served by SQL Server Analysis Services in both OLAP and Tabular modes, and reporting is performed by SQL Server Reporting Services (SSRS). The SQL product line doesn’t have much on the client side for analysis apart from SSRS, but this slack is taken up by the analysis tools available in Excel, and through PerformancePoint Services in SharePoint.

Indeed, SharePoint also provides a platform for SSRS via SSRS SharePoint mode, and for Excel based analytical workbooks connected to SQL Server and to SSAS through Excel Services.

On the personal BI side, that role has traditionally fallen to Excel. The pitfalls of importing data into Excel workbooks for analysis are well documented and don’t need to be discussed here, but the bulk of those issues were addressed with the introduction of PowerPivot several years ago. PowerPivot allows massive amounts of data to be cached within the Excel file for analysis without any data integrity concerns. The addition in recent years of analytic visuals (Power View, Power Map) and ETL capabilities (Power Query) has further rounded out the offering.

Taking that Excel workbook and sharing it brings us into the realm of Team BI. This is to say that the analyses are relatively modest in size, and of interest to a targeted group. These models may not require the rigour or reliability associated with enterprise BI models. Once again, the technology involved here is SharePoint. A user can take a workbook with an embedded PowerPivot model, share it through a SharePoint library, and other users can interact with that embedded model using only a browser. This capability requires PowerPivot for SharePoint, which is really a specialized version of SSAS, along with a SharePoint service application.

One thing to note about these seemingly disparate approaches is that a power user can build a Power Pivot data model with Excel, share it to a team via SharePoint, and when it requires sufficient rigour or management, it can be “upgraded” into SSAS in tabular mode. This common model approach is powerful, and is key to understanding Microsoft’s entire BI strategy. You can also see here that SharePoint straddles the two worlds of team and enterprise BI.

Moving to the cloud

The BI workload is one of the last Microsoft workloads to move to the cloud, and with good reason. Massive amounts of data present problems of scale, and security or data sovereignty concerns tend to keep data on premises. However, there is a very real need to provide BI to users outside of the firewall.

SharePoint is the hub of BI on prem, so it’s logical to assume that with SharePoint Online, it could continue to perform that function in the cloud. The big catch here is that on-prem, SharePoint is simply the display platform. In the enterprise scenario, users connect through SharePoint to the back end servers. This isn’t an option in the cloud, so enterprise BI was left off the table.

With the personal and team BI scenarios, data is cached in a Power Pivot data model, which could be supported in the cloud. When Office 365 moved to the SharePoint 2013 code base for SharePoint Online, rudimentary support for embedded Power Pivot models was indeed added – essentially, PowerPivot for SharePoint “light”. I call it light for two major reasons. Firstly, data models could be no larger than 10 MB. Secondly, there was no way to update the data contained within the Power Pivot cache, outside of re-uploading the Excel workbook. This is still true without a Power BI license. The inability to refresh the data renders team BI almost useless, except in static data scenarios.

The first generation of Power BI changed all of that. With a Power BI license, it was possible to install a Data Management Gateway on premises that would connect to team BI workbooks in Office 365 and update them on a scheduled basis. Yes, the gateway had many limitations (many of which have been removed over time), but finally, the on-prem refresh story was solved. In addition, the model size limit was increased to 250 MB. However, we were still left with a number of problems or limitations.

  1. Daily data refresh schedule. Automatic data refreshes could be daily at their most frequent; manual refreshes could be done anytime.
  2. Capacity. The maximum size of a data model was increased to 250 MB, which is relatively small for enterprise scenarios. In addition, refreshes aren’t differential, which means that the entire model is re-uploaded on every refresh.
  3. Data sensitivity/sovereignty. The refresh problem was solved, but because the data is still cached in the workbooks, there can be reluctance to send it outside of the corporate firewall.
  4. Per-user security. Power Pivot data models have no concept of user security within a workbook (tabular models in SSAS do); security is at the workbook level.
  5. Cost. The initial cost of Power BI was $40 per user per month, and a Power BI license was required to interact with any workbook that had a data model larger than 10 MB. Considering that a full Office 365 E3 license was around $25 per user per month, this price tended to limit the audience for sharing.

All of this is to say that Power BI in its first (and as yet current) incarnation is suitable for personal and team BI only. There has been no enterprise cloud BI story.

Power BI V2

The announcements yesterday outlined the next generation of Power BI. Going forward, Power BI will be available as a standalone offering, at the price points noted above. Office 365 users will continue to be able to use it from Office 365, but Office 365 will no longer be required. In its early days, Power BI was a SharePoint app, but a careful examination of URLs in the current offering quickly reveals that it is actually two apps at present, both running on Azure (not in SharePoint).

If you’ve signed up for the new Power BI preview, you may notice that the URL is http://app.powerbi.com/…… so this move isn’t a big surprise.

With the new model, Excel is no longer the central container. Users connect to data and publish it directly to Power BI. Behind the scenes, the service is doing something very similar to what it does with Power Pivot models – it’s storing them in SSAS. In fact, the same limits still apply – 250 MB per model (at least for now). Excel can still be used, but now as a data source.

Visualizations are performed through Power View, and data is acquired through Power Query. These are no longer add-ins, but are available on their own through the Power BI Designer. This decoupling is good for those who have not made an investment in SharePoint Online or Excel.

These changes to the architecture and the cost are great news for adoption, but don’t address the needs of the enterprise. Except for one thing – The SSAS Connector.


One of the data sources available to the new Power BI is the SSAS data connector. This connector is a piece of code that runs on premises (it actually includes the Data Management Gateway). It acts as a bridge between the Power BI service, and an on prem SSAS server.

The biggest distinction worth noting is that with the gateway, data is NOT being uploaded to the service; it remains on prem. The way that it works is that when a user interacts with a visualization from the cloud, a query is sent to the SSAS server through the gateway. That query is run, its results are sent back to the user’s visualization, and the data is not persisted.

In addition, the query sent to the SSAS server is run with the permissions of the user making the request. This is accomplished through the EffectiveUserName feature in SSAS. This provides for full user-level security, and since tabular models in SSAS can utilize per-user security, we no longer need to rely on proxy accounts or document-level security.
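
For reference, EffectiveUserName is a standard Analysis Services connection string property; a connection that impersonates a given user looks something like the following sketch (server, catalog, and user names are hypothetical):

    Provider=MSOLAP;Data Source=ssas01;Initial Catalog=SalesModel;EffectiveUserName=CONTOSO\jsmith

The account opening the connection needs administrative rights on the SSAS instance for the impersonation to be honoured.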

Finally, because the data is being stored in an on prem SSAS server, it can be refreshed automatically as often as desired. For the same reason, we have no capacity limits – you can grow your own SSAS servers as large as you like.

The SSAS connector removes most of the limitations that prevent cloud based enterprise Business Intelligence, and the new pricing model removes the rest. Certainly there are going to be feature limits in the near term, but it appears to me at least that the back of this thorny problem has finally been cracked.


Check-In Tracking with Power Query and Power View

For me, 2014 was a particularly busy year. I was fortunate enough to have been invited to speak at a number of international conferences, and took a few wonderful personal trips as well. When travelling, I have gotten into the habit of doing Facebook check-ins as I go. It occurred to me after describing my year to a friend that, since Power Query supports Facebook, I should be able to visualize my travel year easily enough. It wasn’t only easy, it was also very fast.

Acquiring the data

Getting data from your Facebook feed is really quite easy. Once you have Power Query installed, open a new Excel workbook and select the Power Query tab. Then, from the “Other Data Sources” button, select Facebook.


Unless they’ve already been saved, you’ll be prompted to enter Facebook credentials. Once you’ve done so, you’ll be prompted for a Facebook object, and a Facebook connection (or feed).

There is a wealth of data that can be retrieved, but in our case we’re interested in check-ins, which are expressed in Facebook as posts. Therefore we select “me” (the currently logged-in user) and “Posts”, and click OK. The Power Query editor loads with all of our Facebook posts. It should be noted that Power Query is using the Facebook Graph API to perform the query, as can be seen in the function bar.

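The generated query reduces to a one-line M expression along these lines (the exact URL may carry a Graph API version):

    let
        // Query the posts feed of the logged-in user via the Graph API
        Source = Facebook.Graph("https://graph.facebook.com/me/posts")
    in
        Source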

That import function can be edited, so any valid Facebook Graph expression can be used to query data, not just those automatically generated by the Power Query UI.

Unfortunately, “Posts” contains a lot more data than just check-ins, so we need to filter it down. We do that by applying a few transformations. All check-ins are status posts, so we start by filtering the type column, selecting “status” as the filter value.
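
Expressed in M, that filter is a single step; continuing the illustrative naming from the sketch above:

    // Keep only the status posts
    FilteredStatus = Table.SelectRows(Source, each [type] = "status")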


Not all statuses are check-ins. However, all statuses with a value for location are. Where is location? It’s in a related record in the “place” column. Since the place column values are records, we first need to expand the place column. To do so, click the small “expand” icon next to the column title.


Next, select the columns that you want to expand. In this case, all we need is location, so that’s all we select. When OK is clicked, the location data is expanded, and the original place column is replaced with a new place.location column. However, the values for location are themselves records, so they also need to be expanded. We follow the same procedure that we did for “place” to expand them.
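
In M, both expansions come down to Table.ExpandRecordColumn. A sketch, omitting the optional new-name argument that the UI uses to generate prefixed names such as place.location.city:

    // Expand the related "place" record to get at its location field
    ExpandedPlace = Table.ExpandRecordColumn(FilteredStatus, "place", {"location"}),
    // The location values are records too, so expand those as well
    ExpandedLocation = Table.ExpandRecordColumn(ExpandedPlace, "location",
        {"city", "state", "country", "latitude", "longitude"})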


Once the columns are expanded, it is fairly obvious that only some entries have values for longitude and/or latitude. These are our check-ins. We need to filter out all of the other records, and we can do this by filtering latitude or longitude using the same approach that we used on the type field earlier. This time, we deselect the null value.
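
As a step, this is just another Table.SelectRows (illustrative naming again):

    // Only genuine check-ins carry coordinates; drop everything else
    CheckIns = Table.SelectRows(ExpandedLocation, each [latitude] <> null)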


We are left with nothing but check-in records.

Now it’s time to clean up the data – we don’t want to load any more data than we need to perform the analysis, so we delete any unnecessary columns. Also, the expanded column names are cumbersome, so we rename those as well. Finally, the two date fields don’t get auto-detected as dates, so we need to change their data types to Date/Time.
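
Under the same illustrative naming, the cleanup might reduce to steps like these:

    // Keep only the columns needed for the analysis
    Trimmed = Table.SelectColumns(CheckIns,
        {"id", "created_time", "updated_time", "city", "state", "country"}),
    // The date fields are not auto-detected, so type them explicitly
    Typed = Table.TransformColumnTypes(Trimmed,
        {{"created_time", type datetime}, {"updated_time", type datetime}})

Had the UI’s prefixed column names been kept, a Table.RenameColumns step would also appear here.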


Visualizing the data

We are now ready to visualize the data, so we select Close & Load. In our case, Power Query has been set to load only into the model, but your settings may vary. Since we’ll be using Power View, we need the data in the data model. After the data has loaded, we select the Insert tab, and select Power View from the ribbon.


The first thing that we want to see is a map of our check-ins, so we select Country from our Power View fields, which loads a list of the countries. Then we select Map from the ribbon, which loads the map. Next, we resize the map to fit our requirements. Finally, we drag the id field to the size field (visually showing the number of check-ins per location), and then we drag the city field under country in the locations field. At this point we can double-click on a data point to drill down to a city.


We want to restrict the data to 2014, so we drag the created_time field into the filters column, select the advanced filters option, and select greater than Jan 1 2014 and less than Jan 1 2015.


Finally, we want to show a list of all of the locations, so we click just below the map visualization and select created_time, Country, state, and city from the fields list.

Normally, I’d finish the post up with a screenshot of the finished product, but this post on the Power BI blog yesterday inspired me to try embedding the actual product. Play around and have a look!

 


Limitations for Power Query OData Feeds in Power BI

One of the features available in Power BI is the ability to take any defined data source and expose it as an OData feed. This is a very simple and quick way to get your existing data available through OData, as it involves a simple check box selection. Complete instructions on setting this up can be found here. There are however a few limitations to what you are able to do that you should be aware of before you head down this path.

Intranet Only

The OData feed feature works through the Data Management Gateway, which is normally used to keep data models stored in the cloud updated regularly with new on-premises data. When a data source is registered, an “enable OData feed” option is made available which when checked, creates an OData feed URL.

When this feed is used, a connection is made directly from the OData client to the Power BI service, which then redirects communication to the Data Management Gateway. This is important because the actual data connection does not go through the Power BI service: the client machine needs to be able to communicate directly with the machine hosting the Data Management Gateway. This means that the OData feed only works on the intranet – it can’t be shared publicly. For now, at least.
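
Consuming such a feed from an intranet client is then a one-liner; for example, from Power Query itself (the feed URL below is hypothetical):

    let
        // Connect to the OData feed generated for the registered data source
        Source = OData.Feed("https://gateway.contoso.local:8050/ODataService/MyDataSource")
    in
        Source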

Data Types

The Data Management Gateway, and therefore the Power BI service, don’t support all of the data types supported by SQL Server or Oracle. If your table or view uses any field of an unsupported data type, the entire table will be unavailable for use in an OData feed. The table will appear greyed out when the list of tables to expose over OData is being configured.


In this example, the DistrictMaps table contained a geography field, which is unsupported. A complete list of supported data types can be found here. If you are using unsupported data types, you may want to consider creating views that do not contain these fields, and exposing those.

Data Sources

Up until recently (version 1.2), the Data Management Gateway only supported performing data refreshes from two on-premises data sources – SQL Server and Oracle, which constrained its value somewhat. Version 1.2 brought support for a wide variety of Power Query data sources, which really changed the game. Now, since OData feeds utilize the Data Management Gateway, we should be able to expose all sorts of data sources as OData feeds, right?

Wrong. Well, not quite, at least. I received a question from Hrvoje Kusulja, who was trying to expose DB2 data as an OData feed through Power Query, but the OData feed option was disabled. After some testing, and communication with Microsoft, I was able to determine conclusively that while Power Query queries are supported for OData feeds, the underlying Power Query queries MUST come from either SQL Server or Oracle. This is identical to the Power Query refresh support in version 1.1 of the Data Management Gateway. Unfortunately, we couldn’t find documentation on this anywhere.

One potential workaround if you need OData support and your data source isn’t supported would be to use an ETL system (Integration Services) to pump data into SQL or Oracle, and create the query from there.

The OData feed capability is a great little feature, and a nice side benefit of using Power BI and the Data Management Gateway. As with any new product, it has limits that will undoubtedly be reduced in the future, but it’s important to know where they are.


Power BI as a Product Today

Recently, I have come across several situations where people are confused about where Power BI fits in a solution scenario. There is a fair bit of confusion as to precisely what the product is and what it does. The problem is that Power BI isn’t really a product at all, but instead a collection of different products and services. Adding to the confusion is the fact that some of these products require a Power BI license, while others do not. In fact some of these products are actually embedded in other products.

Power BI is Microsoft’s cloud-based Business Intelligence solution, billed as “self-service analytics for all of your data”. In reality, it’s a little more than self-service; it is also a great solution for team BI, as it’s based on Office 365. That’s all well and good, but what is it really? What does it consist of, and how does it work? If you look at the main product site for Power BI, it’s not immediately obvious what you get when you purchase it, or what you need to run it. This post is an attempt to demystify the product.

To start, let’s break it down by its constituent components. Today Power BI consists of the following parts.


Unfortunately, this can be rather confusing from a product perspective. Looking first at the on-premises components: Power Query, Power View, and Power Map are all Excel add-ins, so Excel is a prerequisite for Power BI. All of these add-ins also require (or in the case of Power Query, support) the embedded xVelocity data model, and therefore PowerPivot is a prerequisite as well. PowerPivot is included in Excel 2013 (Professional Plus), but it can also be downloaded for free for Excel 2010.

Also included in Excel 2013 is Power View and, with Office 2013 SP1, Power Map. Power Query is downloaded separately, but is free. This is where much of the confusion arises. Because these three add-ins are included in the product definition of Power BI, it is often assumed that a Power BI license is required to use them. It is not. These products have a life of their own, and can be fully (or almost fully) used within Excel without any association with a Power BI license.

Power Query contains a few features that will only work with a Power BI tenant, mostly involving the creation and maintenance of shared queries. Since this is part of the cloud service, that makes complete sense, but none of the other features of the product are in any way reduced in the absence of a license. Power View is enhanced by a Power BI license, but only because a license makes Power View reports available within the mobile client(s). Power Map, meanwhile, makes no use whatsoever of a Power BI license. Power Maps cannot be viewed at all within a browser – they are a client-side feature only. In my opinion, they shouldn’t even be included under the Power BI umbrella, but that’s just my opinion.

Thus far, I have been talking about the modelling and visualization creation aspects of the tools, but what about pure consumption clients? The whole idea of Power BI is that designers can create these models and users can interact with them. The workbooks containing these models are stored within Office 365, so do casual users need a license?

The answer is, of course, maybe. If these users are going to take advantage of any of the services specifically offered by Power BI, then the answer is yes. For example, any user can open a workbook in a browser in Office 365, but if they want to interact with that model, by using a slicer, pivot table, etc., and that model is larger than 10 MB, then the answer is yes. Obviously, if the user wants to use the Power Q&A features, then the answer is also yes.

For the record, I don’t like this answer. To my mind, designers and content creators should require a license, but consumers should not. This would greatly encourage adoption of the product, so I do hope for some changes in this area.

So, precisely what do you get when you purchase a Power BI license? These are the things that you will absolutely need a Power BI license for.

  • Opening workbooks in a browser with models larger than 30 MB on Office 365
  • Interacting with workbooks (slicers, pivot tables, etc.) in a browser with models larger than 10 MB on Office 365
  • Automatic refresh of on premises data
  • Sharing of Power Query queries
  • Refresh of Power Query queries
  • Power Q&A – Natural language queries
  • Power BI mobile application

and that’s it.

In fact, if you check out my earlier article “Whither Power Pivot for SharePoint”, you’ll see that many of the features of Power BI are already available in Power Pivot for SharePoint.

To my mind, the product “Power BI” should not include the Excel add-ins, but only list them as requirements, much as Excel itself is a requirement. This would help to reduce confusion. The next version of Power BI will support their inclusion. If you’re interested in this new version, you can sign up for the preview when it’s ready here. I’ll be writing more about that shortly.
