Tag Archives: Office 365

The New Power BI – Now With Enterprise!

Yesterday Microsoft announced the next step in the evolution of Power BI. It’s getting quite a bit of attention, and rightly so, for its aim of bringing Business Intelligence closer to users. Democratizing BI has always proved a challenge; it has traditionally been the realm of the gurus in white coats who hold the keys to the data. Microsoft is aiming to accomplish this democratization through a combination of user focus and, as of yesterday, a drastic change in its pricing model. Power BI just went from about $40 per user per month to free, or $9.99/user/month for advanced capabilities. That’s quite a drop, and arguably the biggest announcement from yesterday – it will have a massive impact. The detailed price breakdown can be found here.

However, all of the focus on personal BI is, in my opinion, missing a key component. Power BI and its components have always focused squarely on personal and team BI solutions; that is to say, the ability for a power user to model data, visualize it quickly and easily, and share it with fellow team members. While that capability is certainly retained in the new Power BI, this new version contains the first appearance of enterprise-grade BI in the cloud for Microsoft.

To fully understand this, it’s necessary to touch on the Microsoft BI stack as it stands today.

Microsoft BI On Premises

The on-premises BI story from Microsoft can be confusing, but it is very powerful and relatively complete. In a nutshell, the story is good from a personal, team and enterprise perspective.

On the enterprise side, there are products from both the SQL Server team and the Office team. Data warehousing is served by SQL Server, and ETL duties fall to SQL Server Integration Services (SSIS). Analytical storage is served by SQL Server Analysis Services (SSAS) in both multidimensional (OLAP) and tabular modes, and reporting is performed by SQL Server Reporting Services (SSRS). The SQL product line doesn’t have much on the client side for analysis apart from SSRS, but this slack is taken up by the analysis tools available in Excel, and through PerformancePoint Services in SharePoint.

Indeed, SharePoint also provides a platform for SSRS via SSRS SharePoint mode, and for Excel based analytical workbooks connected to SQL Server and to SSAS through Excel Services.

On the personal BI side, that role has traditionally fallen to Excel. The pitfalls of importing data into Excel workbooks for analysis are well documented and don’t need to be discussed here, but the bulk of those issues were addressed with the introduction of PowerPivot several years ago. PowerPivot allows massive amounts of data to be cached within the Excel file for analysis without any data integrity concerns. The addition in recent years of analytic visuals (Power View, Power Map) and ETL capabilities (Power Query) has further rounded out the offering.

Taking that Excel workbook and sharing it brings us into the realm of Team BI. This is to say that the analyses are relatively modest in size, and of interest to a targeted group. These models may not require the rigour or reliability associated with enterprise BI models. Once again, the technology involved here is SharePoint. A user can take a workbook with an embedded PowerPivot model, share it through a SharePoint library, and other users can interact with that embedded model using only a browser. This capability requires PowerPivot for SharePoint, which is really a specialized version of SSAS, along with a SharePoint service application.

One thing to note about these seemingly disparate approaches is that a power user can build a Power Pivot data model in Excel, share it with a team via SharePoint, and, when it requires greater rigour or central management, “upgrade” it into SSAS running in tabular mode. This common model approach is powerful, and is key to understanding Microsoft’s entire BI strategy. You can also see here that SharePoint straddles the two worlds of team and enterprise BI.

Moving to the cloud

The BI workload is one of the last Microsoft workloads to move to the cloud, and with good reason. Massive amounts of data present problems of scale, and security or data sovereignty concerns tend to keep data on premises. However, there is a very real need to provide BI to users outside of the firewall.

SharePoint is the hub of BI on prem, so it’s logical to assume that with SharePoint Online, it could continue to perform that function in the cloud. The big catch here is that on-prem, SharePoint is simply the display platform. In the enterprise scenario, users connect through SharePoint to the back end servers. This isn’t an option in the cloud, so enterprise BI was left off the table.

With the personal and team BI scenarios, data is cached in a Power Pivot data model, which could be supported in the cloud. When Office 365 moved SharePoint Online to the SharePoint 2013 code base, rudimentary support for embedded Power Pivot models was indeed added: essentially, a “light” version of PowerPivot for SharePoint. I call it light for two major reasons. Firstly, data models could be no larger than 10 MB. Secondly, there was no way to update the data contained within the Power Pivot cache, short of re-uploading the Excel workbook. This is still true without a Power BI license. The inability to refresh the data renders team BI almost useless, except in static data scenarios.

The first generation of Power BI changed all of that. With a Power BI license, it was possible to install a Data Management Gateway on premises that would connect to team BI workbooks in Office 365 and update them on a scheduled basis. Yes, the gateway had many limitations (many of which have been removed over time), but finally, the on-prem refresh story was solved. In addition, the model size limit was increased to 250 MB. However, we were still left with a number of problems or limitations.

  1. Daily data refresh schedule. Automatic data refreshes could be daily at their most frequent; manual refreshes could be done at any time.
  2. Capacity. The maximum size of a data model was increased to 250 MB, which is relatively small for enterprise scenarios. In addition, refreshes aren’t differential, which means that the entire model is re-uploaded on every refresh.
  3. Data sensitivity/sovereignty. The refresh problem was solved, but because the data is still cached in the workbooks, there can be reluctance to send it outside of the corporate firewall.
  4. Per-user security. Power Pivot data models have no concept of user security within a workbook (tabular models in SSAS do); security is at the workbook level.
  5. Cost. The initial cost of Power BI was $40 per user per month, and a Power BI license was required to interact with any workbook that had a data model larger than 10 MB. Considering that a full Office 365 E3 license was around $25 per user per month, this price tended to limit the audience for sharing.

All of this is to say that Power BI in its first (and as yet current) incarnation is suitable for personal and team BI only. There has been no enterprise cloud BI story.

Power BI V2

The announcements yesterday outlined the next generation of Power BI. Going forward, Power BI will be available as a standalone offering, at the price points mentioned above. Office 365 users will continue to be able to use it from Office 365, but Office 365 will no longer be required. In its early days, Power BI was a SharePoint app, but a careful examination of URLs in the current offering quickly reveals that it is actually two apps, both running on Azure (not in SharePoint).

If you’ve signed up for the new Power BI preview, you may notice that the URL is http://app.powerbi.com/…… so this move isn’t a big surprise.

With the new model, Excel is no longer the central container. Users connect to data and publish it directly to Power BI. Behind the scenes, the service is doing something very similar to what it does with Power Pivot models – it’s storing them in SSAS. In fact, the same limit still applies – 250 MB per model (at least for now). Excel can still be used, but now as a data source.

Visualizations are performed through Power View, and data is acquired through Power Query. These are no longer Excel add-ins, but are available on their own through the Power BI Designer. This decoupling is good news for those who have not made an investment in SharePoint Online or Excel.

These changes to the architecture and the cost are great news for adoption, but they don’t address the needs of the enterprise. Except for one thing: the SSAS connector.

One of the data sources available to the new Power BI is the SSAS connector. This connector is a piece of code that runs on premises (it actually includes the Data Management Gateway) and acts as a bridge between the Power BI service and an on-prem SSAS server.

The biggest distinction worth noting is that with the gateway, data is NOT uploaded to the service; it remains on premises. When a user interacts with a visualization from the cloud, a query is sent to the SSAS server through the gateway. That query is run, its results are sent back to the user’s visualization, and the data is not persisted in the cloud.

In addition, when the query is sent to the SSAS server, it is run with the permissions of the user making the request. This is accomplished through the EffectiveUserName feature in SSAS. This provides full user-level security, and since tabular models in SSAS can utilize per-user security, we no longer need to rely on proxy accounts or document-level security.

Finally, because the data is being stored in an on prem SSAS server, it can be refreshed automatically as often as desired. For the same reason, we have no capacity limits – you can grow your own SSAS servers as large as you like.

The SSAS connector removes most of the limitations that prevent cloud based enterprise Business Intelligence, and the new pricing model removes the rest. Certainly there are going to be feature limits in the near term, but it appears to me at least that the back of this thorny problem has finally been cracked.

Using Power Query with SharePoint Lists and Lookup Fields

As I’ve explained many times before, querying SharePoint data directly is a bad idea. The SharePoint data storage mechanisms simply aren’t designed for querying of any scale, hence the lookup limitations that have been imposed upon it. The best approach to querying SharePoint list data is to first load it into a data warehouse or data mart of some sort. However, both Reporting Services (SSRS) and Power Query support direct access to SharePoint lists. While I try to strongly dissuade people from doing this with Reporting Services, properly used, Power Query is a totally viable means of querying SharePoint list data.

Why is this? With SSRS, every query goes back to the data source for retrieval. Power Query is different – it’s analogous to SQL Server Integration Services, which is an ETL product. It loads source data into a repository, in this case an embedded xVelocity (Power Pivot) model, which can be considered a “personal data warehouse”. Queries against this mini data warehouse are fast, don’t rely on SharePoint retrieval mechanisms, and can be used quite effectively in reports.

There are a couple of subtleties to querying SharePoint list items with Power Query, and I will briefly walk through the process below.

With Excel open, click the Power Query tab, select “From Other Sources”, and then select “From SharePoint List”.

Next, enter the URL for the SharePoint site (or subsite) that contains the list you wish to query.

If it is the first time accessing this site, you will be prompted for credentials. If your site is in Office 365, be sure to enter organizational credentials; if it is on premises, use Windows credentials.

Once entered, you will be presented with the site’s SharePoint lists in the Power Query Navigator window. Select the list that you wish to query (in our case, Announcements), and then click the Edit button to edit the query.
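
Behind the scenes, Power Query expresses all of these steps in its “M” formula language, which you can inspect through the Advanced Editor. A minimal sketch of what the query looks like at this point is shown below; the site URL is illustrative, and the navigation step is an assumption (the generated code may select the list by its internal ID rather than by title).

    let
        // Connect to the SharePoint site (URL is illustrative)
        Source = SharePoint.Tables("https://contoso.sharepoint.com/sites/sales"),
        // Navigate to the Announcements list (your generated step may
        // select the list by its internal ID instead of its title)
        Announcements = Source{[Title="Announcements"]}[Items]
    in
        Announcements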

The data, or a subset of it, will load into the query editor window. You will see all of the list item fields expressed as columns, for the most part with the correct data types. At this point you can remove any unnecessary columns, or filter out any undesired rows. There are a couple of SharePoint field types that bear special mention.

Lookup fields are a lookup into another SharePoint list. Internally, the SharePoint item stores this as an ID and display value, but Power Query gives you access to all of the properties of the related item as a one-to-one relationship. Essentially, what you can do is to flatten that relationship by incorporating the related item’s attributes.

If you scroll to a column of this type, you will see the value expressed as a hyperlink with the value “Record”. Clicking on it will drill down to one related record, but that’s not what we want to do; we want to expand the properties for all items in the list. The way to do this is to click the expand icon in the column header. In our case, we want to expand the “CreatedBy” field. CreatedBy is a standard list field of the Person type, and Person fields are actually a special case of lookup fields, so it exhibits this behaviour.

Here, we are interested in retrieving the user’s name and mobile phone number, so we deselect all of the other fields. A new column will be created for every expanded field, in the format sourcefieldname.attributename.
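
In the underlying M query, this expansion appears as a Table.ExpandRecordColumn step. The sketch below continues the illustrative query from above; the field names “Name” and “MobilePhone” are assumptions, so check the expand dialog for the exact names your list exposes.

    let
        Source = SharePoint.Tables("https://contoso.sharepoint.com/sites/sales"),
        Announcements = Source{[Title="Announcements"]}[Items],
        // Flatten the CreatedBy lookup record into two new columns
        // ("Name" and "MobilePhone" are assumed field names)
        Expanded = Table.ExpandRecordColumn(
            Announcements,
            "CreatedBy",
            {"Name", "MobilePhone"},
            {"CreatedBy.Name", "CreatedBy.MobilePhone"}
        )
    in
        Expanded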

Attachments are another special case. There can be multiple attachments for a single list item, a one-to-many relationship, so the hyperlink here reads “Table”. Clicking the expand icon in this column’s header looks similar, but with an important difference: options are available to either expand or aggregate the related items.

Selecting expand will create a new source record for each related item, and the only columns that will differ will be the ones selected from the related table (Name in our case). Aggregate will not create any new records, but will summarize the related fields: numeric fields can be totalled or averaged, and text fields can be counted.
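
In M terms, these two choices correspond to Table.ExpandTableColumn and Table.AggregateTableColumn. A minimal sketch of both, again using the illustrative site, list and column names from above:

    let
        Source = SharePoint.Tables("https://contoso.sharepoint.com/sites/sales"),
        Announcements = Source{[Title="Announcements"]}[Items],

        // Option 1: expand, creating one output row per attachment
        ExpandedAttachments = Table.ExpandTableColumn(
            Announcements, "Attachments", {"Name"}, {"Attachments.Name"}),

        // Option 2: aggregate, keeping one row per list item and
        // counting the related attachments instead
        // (in practice you would keep only one of these two steps)
        AggregatedAttachments = Table.AggregateTableColumn(
            Announcements, "Attachments", {{"Name", List.Count, "AttachmentCount"}})
    in
        AggregatedAttachments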

Once ready, click “Close and Load” from the Query Editor ribbon, and the list data will load to either your model or your workbook, depending on your preferences. Of course, I always recommend loading to the model only.

Once loaded, any visualizations and queries will work against the model. The data can be refreshed at any point, either manually or automatically if using the Data Management Gateway. Keep in mind, however, that refreshes will operate against the source list.

Using the SSIS OData Source Connector With SharePoint Online Authentication

Last week, Microsoft released the OData Source for Microsoft SQL Server 2012. What is it? It allows SQL Server Integration Services (SSIS) to use an OData feed as a first-class data source, in the same manner as SQL Server, Oracle, etc. Until now, it was necessary to code OData connections using a script component.

This matters to those of us in the SharePoint world because any SharePoint list data can be expressed as OData.
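
As a quick illustration of what such a feed looks like, you can point any OData client at a site’s listdata.svc endpoint. The sketch below uses Power Query’s M (discussed in the previous post) purely as an example; the site URL is made up.

    let
        // Every list in the site is exposed as an OData entity set by
        // the listdata.svc endpoint (site URL is illustrative)
        Source = OData.Feed("https://contoso.sharepoint.com/sites/sales/_vti_bin/listdata.svc")
    in
        // Source is a navigation table; select the list (e.g. Announcements)
        // that you want to work with from it
        Source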

I’ve written before about how SharePoint data can be extracted into a data warehouse using SSIS and the SharePoint List Source and Destination Adapters, available from CodePlex. These adapters plug into SSIS and wrap the SharePoint SOAP web services, and therefore do not need to be installed on a SharePoint server. We have used them for years and they work very well; however, they are a CodePlex project, and therefore not fully supported.

These CodePlex adapters have more recently been bumping into another limitation. While SOAP web services are supported in Office 365, the adapters don’t support the Office 365 authentication mechanism, which effectively renders them useless there. The OData services require the same authentication, but the new OData Source supports it. It’s also an official Microsoft product, and is fully supported.

There is, however, a trick to getting it working. Once you install the OData Source, open SQL Server Data Tools, open an SSIS project, and add or edit a data flow task. In the SSIS Toolbox, you should see the OData Source.

Drag the tool on to the design surface, and double click to configure it. You’ll first need to configure an OData Connection Manager, and you’ll do that by clicking the New button.

Give the connection a name. The connection will be common to all lists and libraries within a site, so something based on the name of the site is likely appropriate. The Service document location is the OData endpoint; it takes the form of the site URL with the suffix /_vti_bin/listdata.svc appended (for example, https://contoso.sharepoint.com/sites/sales/_vti_bin/listdata.svc). If the connection is on premises, you can use Windows Authentication, but if it is Office 365, you must use a stored name and password.

If you are using Office 365, and you click Test Connection at this point, you’ll receive an error “Test connection failed –> The remote server returned an error: (400) Bad Request.”

This is due to Office 365’s “unique” authentication mechanism. In order to authenticate to Office 365, you must first select the “All” button in the toolbar, and set the value of “Microsoft Online Services Authentication” to true.

This option may not be available to you. If the Online Services Authentication option is disabled, or greyed out, as it was for me when I first tried to use it, it’s because a prerequisite is missing. In order to authenticate to Office 365, the machine must have the SharePoint Server 2013 Client Components SDK installed on it.

Once the client components are in place and the option is selected, the data source should be able to connect, and the connection manager can be closed. Lists are exposed as Collections, so if you want to work with list data, select the desired list from the list of Collections.

At this point, the data source will act like any other SSIS data source; you can select and transform columns at will. More importantly, this will help you get SharePoint data, both on-prem and in the cloud, into a central data warehouse.

How to Refresh Data Models in Office 365 from On Premises Using Nintex Workflow – A Hybrid Approach

One of the viable use cases for Power BI is to provide reporting to a mobile workforce, even when the organization has invested in BI on premises. In this scenario, there may be a SharePoint farm using PowerPivot for SharePoint with multiple workbooks connected to a myriad of data sources, including SQL Server Analysis Services cubes and models.

The Problem

In this scenario, the workbooks required by the mobile users can be copied up to the Office 365 tenant, and consumed from there. However, how is the data kept current? One way is to set up the Power BI Data Management Gateway (DMG). The DMG is a great solution, but it is relatively new, and has a few inherent limitations at the moment. Chief among them is that data can only be refreshed from SQL Server or Oracle data sources; for the moment at least, Analysis Services is left out in the cold. PowerPivot for SharePoint, on the other hand, doesn’t share these limitations, and has no problem refreshing data from a wide variety of sources. Unfortunately, if you’re using Power BI and need to refresh from an unsupported data source, you’re out of luck for the moment.

Keeping a cloud copy of a local workbook carries the inherent problems of managing two copies of the same workbook. Every modification must be performed twice, and there is always the chance that something will get missed. Ideally, for this scenario, we would have a publishing mechanism, and we can use Nintex Workflow (on premises) to provide one. Whenever the model is refreshed, the workflow will fire and copy the file up to Office 365, updating the model there.

In order to implement this solution, you’ll need a copy of Nintex Workflow. The low end (Team) edition is fine, but what we’re after is the “Copy document to Office 365” action, and that is available through the Nintex Live action set.

The Solution

To start, we’ll work with an Excel workbook that is using an embedded model created from the standard Contoso Data Warehouse. In this example, we’ve built a simple pivot table to interact with the model. The model has been published to a PowerPivot gallery, and the refresh schedule has been set. Once we’ve tested the refresh to ensure that it’s working, we can create the workflow. To do so, we go to the workflow options section in the library ribbon and create a new Nintex workflow.

We then select the Blank template. Once presented with the design canvas, we select Workflow Settings, give it a good name, and set the workflow to run on create and on edit.

Once the settings have been saved, we can start to build the actual workflow. The first action will be a “Pause for” action. Find it in the Workflow Actions toolbar, and drag it onto the design surface. Select configure from its drop down menu, and set it to pause for a period of time, in this case, 5 minutes.

The amount of time that we need to pause depends on how long it takes the model to refresh. When the refresh operation begins, it “edits” the document, which kicks off the workflow. We don’t want to upload the workbook until the refresh operation is complete, so we need to pause. In this case, we are working with a data model that is about 100 MB and comprises about 3 million rows of data. In this environment, it takes about 2.5 minutes to refresh, so a 5 minute pause is sufficient.

Next, we need to use the “Office 365 upload file” action. This action is one of the Nintex Live actions that can be added at no extra charge to the Nintex environment. If it hasn’t already been added, we need to do so. Assuming that Nintex Live has been enabled for the farm, we browse the available actions through the Catalog, which is available through the designer ribbon.

Once in the catalog, we can browse the available actions, and click the Add button to add them to the toolbox. The action that we need is named “Office 365 upload file”. Once at least one Live action has been added, a new section named “Nintex Live” will appear in the Workflow Actions toolbar. We then open that section, drag the Office 365 upload file action onto the design surface after the pause action, and configure it.

Walking through the options:

  • File to upload: We want the current file to be uploaded to Office 365, so we select the current item.
  • Destination site URL: This is the URL of the destination site itself, everything up to, but not including, the document library.
  • Folder path: The name of the destination document library and, if appropriate, any subfolders.
  • File name: We can create or derive a new name for the file, but in this case we will use the same name as the source file.
  • Overwrite existing file: We select this option, as we will be updating existing content.
  • Fields: We can add additional metadata values if we wish, but in this case we leave it empty.
  • SharePoint Online URL: This is the URL of the root of the destination Office 365 tenant.
  • Username and password: These are for a proxy account that will be used to upload the file; secure workflow constants can be used.

Once we have configured and saved the action, we are ready to publish the workflow. To do so, select the Publish button, then close the workflow designer.

We are now ready to test our workflow. We can force the workflow to run on demand for our workbook, but a better test is to let the refresh trigger it. The most frequently that PowerPivot for SharePoint refreshes can be scheduled is once per day, but there is a way to force a refresh on demand.

To force a refresh, we need to edit the refresh schedule for the workbook. Open the refresh history screen by selecting Manage Data Refresh for the workbook.

From the Refresh history screen, select the Configure Schedule link to edit the schedule. This will allow us to edit the existing schedule, but more importantly, it will also let us force a refresh by selecting the “Also refresh as soon as possible” option.

If we are using a specific account as a credential, we will also need to re-enter that. Once complete, selecting OK will queue the refresh job for execution. The refresh job will happen the next time that the “PowerPivot Data Refresh Timer Job” executes, and it runs every 5 minutes.

If you’re really impatient, and you have access to central administration, you can edit this timer job and force it to run immediately on a case by case basis.

Moving back to the refresh history screen, we will notice that the refresh is currently running.

And in the workflow history screen for the workbook, we can see that the workflow is also running.

Once both operations have completed, we can navigate to our Office 365 site, and we will find the file in the destination folder. If our model is greater than 10 MB in size, we will require Power BI in order to interact with it in the browser, and we will need to enable the workbook for use with Power BI through the Power BI application. If it has already been enabled, then no additional actions are required. If the workbook is less than 10 MB, it will work directly in any Office 365 library that is enabled for Excel Services.

More Possibilities

If you’ve worked with the Power BI Data Management Gateway at all, you’ll know that data refreshes can be scheduled at most daily. This limitation is also shared by PowerPivot for SharePoint, but it is possible to change its behaviour. Using the techniques outlined in this article by Ian Smith, and this article by Tim Laqua, refreshes can be made as granular as 5 minutes. These articles are written for SharePoint 2010, but the principles work with 2013 (hint: the stored procedure that forces a refresh is [DataRefresh].[AddRefreshSchedule]). Be warned, this technique is a hack, and not supported by Microsoft. However, it can be a way to increase the frequency of refreshes on premises, and with the above technique, in Office 365 too.

Another way to refresh the data more often is by using SQL Server Integration Services (SSIS). SSIS can update an embedded data model in an Excel workbook through a script action (thanks to Gobán Saor).  SSIS packages can be scheduled with SQL agent jobs, which means that they can be run as frequently as every minute. Another advantage of this approach is that with Excel 2013, this method should be fully supported.

Finally…

By relying on our workflow to publish the refreshed model, we work around the current limitations of the Data Management Gateway. We do incur the cost of a Nintex license, but a few days of development would easily exceed that cost. In fact, if our data model doesn’t exceed 10 MB, we don’t even need Power BI for this solution to work.

Business Intelligence in SharePoint and Office 365

At the recent SharePoint Conference 2014, I had the opportunity to be interviewed by Karuana Gatimu of Microsoft IT and Channel 9. The original video can be found on Channel 9 here, and I include it below as well. In it, we discuss the Microsoft Business Intelligence stack, the impact of the new Power BI products, and how to get started.