License Requirements for SQL Server Reporting Services in SharePoint Integrated Mode

As I am keen to point out at every opportunity, when installing SQL Server Reporting Services in SharePoint Integrated mode, it is important to ensure that it is installed on a SharePoint server, and not on a SQL Server. It’s a bit counter-intuitive because the installation files are on the SQL Server media, and not the SharePoint media. This causes confusion in a number of areas, but one question that I hear a lot is in the realm of licensing.

Technically, using any SQL Server component on any server requires a full SQL Server license for that component on each server. Running SQL Server Analysis Services and the SQL Server database engine on two different machines requires two different licenses. Seen that way, putting the SSRS service on a SharePoint server would require a second SQL Server license, which can be an expensive proposition. It also doesn’t make much sense, because it promotes bad design. Luckily, the SSRS service application is specifically exempted from additional licensing requirements. The following is taken directly from the Microsoft SQL Server 2012 SP1 license document. Section 2.5 states:

2.5    Running Instances of the Additional Software.

You may run or otherwise use any number of instances of additional software listed below in physical or virtual operating system environments on any number of devices. You may use additional software only with the server software directly, or indirectly through other additional software.

  • Business Intelligence Development Studio
  • Client Tools Backward Compatibility
  • Client Tools Connectivity
  • Client Tools SDK
  • Data Quality Client
  • Data Quality Services
  • Distributed Replay Client
  • Distributed Replay Controller
  • Management Tools – Basic
  • Management Tools – Complete
  • Reporting Services – SharePoint
  • Reporting Services Add-in for SharePoint Products
  • Master Data Services
  • Sync Framework
  • SQL Client Connectivity SDK
  • SQL Server 2012 Books Online

Notice that little bullet point “Reporting Services – SharePoint”? That’s the service application. Put simply, this means that in order to use SSRS in SharePoint Integrated mode, you’ll need at least one licensed SQL Server, but you don’t need to run SSRS on the SQL Server machine. In fact, given the phrase “any number of instances”, you can run it on as many SharePoint servers as you wish to take advantage of load balancing, without incurring any additional licensing cost for SQL Server.

Schedule Data Refresh for SSAS Connected Excel Workbooks with PowerPivot for SharePoint

Using Excel Services, SharePoint users have been able to share workbooks that are connected to back-end data since SharePoint 2007. Typically, the connection is made to SQL Server or to Analysis Services, although a wide variety of sources are available. It’s also possible to publish individual components from these workbooks anywhere within the site collection through the Excel Web Access web part. Users can navigate to a dashboard page that contains all sorts of elements, including an Excel chart that is connected to back-end data. Well, to be precise, it was connected to back-end data the last time the workbook was saved. The workbook itself can be refreshed, but only manually.

When you open an Excel workbook in a browser through Excel Services, by default you’ll see the visualizations and any stored data precisely as they were when the workbook was last saved. If you need to see more up-to-date data, you can select “Refresh Connections”. If (and sometimes that’s a big if) the server and connections are set up properly, the server will fetch updated data and update the workbook.

This works well enough, but the problem is that when you, or anyone else, opens the workbook again, you’ll still see the old version of the workbook and will need to manually refresh the data again. In addition, any visualizations published elsewhere on a dashboard will also continue to show old data unless manually refreshed. If the amount of data is significant, this poses a serious performance issue for the server(s). There’s also a significant usability impact, in that it’s a pretty big ask of an end user to have them constantly hitting a refresh button.

To get around this issue, one option is to set the refresh options in the data connections of the workbook. Excel Services respects these options. There are two settings that we need to be aware of: periodic refresh and refresh on open. Connection properties can be accessed within the Excel client by selecting the Data tab, choosing Connections, then highlighting the connection in question and selecting Properties.

Periodic refresh allows the workbook to be automatically refreshed in the background while it is open in the browser. This can be useful when the source data is changing frequently. Refresh on open will have the greatest impact in our scenario, as it automatically refreshes the data in the workbook whenever the file is opened. This also works with published objects (Excel Web Access web parts) – every time that the web part is opened, the data will be automatically refreshed. This solves the usability problem above, because the user no longer needs to manually update the data. However, it does not affect the server load problem.
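For those curious where these settings live, they are persisted in the workbook package itself. The following is a minimal sketch (assuming a standard .xlsx file; the file name is hypothetical, and the refreshOnLoad and interval attributes are per the SpreadsheetML connection schema) that inspects a workbook’s stored refresh options with Python:

```python
# Minimal sketch: peek at a workbook's stored refresh settings.
# An .xlsx file is a zip package; connection settings live in
# xl/connections.xml. refreshOnLoad maps to "refresh on open" and
# interval to periodic refresh (in minutes). File name is hypothetical.

import zipfile
import xml.etree.ElementTree as ET

NS = "{http://schemas.openxmlformats.org/spreadsheetml/2006/main}"

with zipfile.ZipFile("SalesDashboard.xlsx") as xlsx:
    root = ET.fromstring(xlsx.read("xl/connections.xml"))

for conn in root.iter(f"{NS}connection"):
    print(
        conn.get("name"),
        "| refresh on open:", conn.get("refreshOnLoad") == "1",
        "| refresh every:", conn.get("interval", "n/a"), "min",
    )
```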

Because the data and visualizations retain the state they had when the workbook was last saved, this also affects search. When the search indexer runs, it will only index the data that is saved in the workbook; it has no means of refreshing the data. Finally, in addition to the load imposed on the servers by constant refreshes, if the quantity of data being refreshed is large, users can experience significant lags when loading the file. This obviously introduces another usability problem. While the refresh options in Excel are helpful, they don’t fully solve the problem. What is needed is a way to automatically open the file for editing, refresh the data, and resave it to SharePoint.

If you have ever used PowerPivot for SharePoint, you know that it can do exactly that. PowerPivot for SharePoint contains two primary elements – a specialized instance of SQL Server Analysis Services that allows users to interact with workbooks that contain embedded PowerPivot models, and a SharePoint service application that, among other things, keeps those embedded models refreshed. Using the PowerPivot Gallery (enabled when PowerPivot for SharePoint is installed), you can configure a workbook’s refresh options by clicking on the icon in the Gallery view, or by selecting “Manage PowerPivot Data Refresh” in the simple All Documents view.

Data Refresh options in PowerPivot Gallery View

Data Refresh options in All Documents View

Once configured, the PowerPivot for SharePoint service will refresh the data model in the workbook on a periodic basis (no more than once per day). The service essentially opens the workbook in edit mode, refreshes all of the data connections, and saves the workbook back to the library. If versioning is enabled, it will be saved as a new version. Unfortunately, if you’re not using a PowerPivot data model, these options are unavailable: in Gallery view, the icons are disabled, and while the menu option still appears in the All Documents view, selecting it results in an error.

On the surface, it would seem that using workbooks with PowerPivot is the only option for keeping large volumes of back-end data up to date in Excel visualizations. However, there is a small loophole that you can take advantage of.

The refresh function in PowerPivot for SharePoint refreshes all of the connections in a workbook. While the option is unavailable if the workbook has no embedded PowerPivot model, when it does, it refreshes ALL of the data connections in the workbook, whether they connect to the embedded model, a back-end SSAS server, SQL Server, or any other source. Therefore, if you want to keep your connected data refreshed, the solution is to add a dummy PowerPivot model to your workbook.

Simply open up the PowerPivot window, import some small amount of data from an external source, and save it. Once saved, the PowerPivot refresh options will appear, and you’ll be able to schedule data refresh for your workbook. You can even deselect the refresh of the source data for your dummy model, and the other connections will work just fine.

Once your workbooks are being updated automatically, your users will be presented with up-to-date data on load with no delays, all dashboard visualizations will be up to date and quick to render, and the visible data will be picked up by your search crawler. All will be well with the world.

Something Went Wrong When Connecting Power BI to SSAS

One of the more powerful features in the new Power BI preview is the ability to connect your Power BI dashboard to an on-premises instance of SQL Server Analysis Services. This is done by installing a connector on-premises that connects to both the SSAS server(s) and the Power BI service. Installation is straightforward, but quite often the first attempt to connect to the data results in the “oh-so-helpful” error message, “something went wrong”.

The message isn’t very helpful, unless you’re speaking with a support representative.

I have found that the problem in most cases is that the user connecting to the data does not have sufficient rights on the SSAS server, or the server does not understand who the user is. This may very well be the same user that was used to connect from the SSAS Connector back into the service, but that doesn’t matter. The problem is that SSAS does not know about that identity.

To explain, we first need to consider how the connector is registered.

When the connector is registered, two sets of credentials are provided. One credential is used to connect the connector to the SSAS server. This is a Windows credential (typically in the form DOMAIN\username), and it must be an administrator on the SSAS server. The reason for this requirement is that this credential will be used to funnel all connector queries to the SSAS server, and it uses the EFFECTIVEUSERNAME feature in SSAS. EFFECTIVEUSERNAME requires admin-level access.

The second credential is used to connect the connector to the Power BI service. This one is used to register the connector with the service so that it can be used by dashboard authors, and isn’t used extensively afterward. This credential will be an Organizational Account (i.e. an Office 365/Azure Active Directory identity) and needs to have enough rights to register a data source with Power BI.

Once registered, it works as follows.

When a dashboard user interacts with the dashboard, or accesses the data source, a request is sent to the connector with the credentials of the user making the request. The connector then establishes a connection with the SSAS server using the admin credentials registered with the connector, and issues the query using the EFFECTIVEUSERNAME parameter, which basically means “run this query using the provided user’s credentials”. The user provided is the one making the request. This allows for per-user security in Power BI, but unfortunately, it is also what typically causes the error above.
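To make the mechanism concrete, the following is a minimal sketch (not the connector’s actual code; the server and database names are hypothetical) of how an intermediary can impersonate the requesting user through the EffectiveUserName connection-string property:

```python
# Illustrative sketch only: builds an SSAS connection string that
# impersonates the requesting user via EffectiveUserName. The account
# opening the connection must be an SSAS administrator, which is why
# connector registration demands admin credentials. Names hypothetical.

ADMIN_CONNECTION = (
    "Provider=MSOLAP;"
    "Data Source=ssas01.contoso.local;"  # on-prem SSAS instance (hypothetical)
    "Initial Catalog=SalesTabular;"      # tabular database (hypothetical)
    "Integrated Security=SSPI;"          # connects as the registered admin
)

def connection_string_for(requesting_user: str) -> str:
    """Append EffectiveUserName so the query runs as the end user."""
    return ADMIN_CONNECTION + f"EffectiveUserName={requesting_user};"

# A dashboard request from CONTOSO\alice would be issued over:
print(connection_string_for(r"CONTOSO\alice"))
```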

The issue is that SSAS only understands Windows (NTLM and Kerberos) credentials. Without doing anything else, it has no idea what an Organizational ID is. So how can it work at all? There are two ways.

The first is that your domain can be federated with Azure Active Directory, specifically with the Azure Active Directory that your Power BI (Office 365) tenant is using. Once federation is complete, your AD domain (domainxxx.com) will be registered and trusted with your internal NTLM/Kerberos domain, and your users will be mapped to their Azure AD identities. SSAS will then understand who they are, and if granted permission, they will be able to access SSAS data via the Power BI dashboard. This is the only supported method, and is what should be used in a production environment. There is, however, another way.

If AD federation is not an option in the short term, or you simply need to get a development or demonstration environment spun up, it is possible to “hack” your Active Directory to allow SSAS to understand the organizational IDs. First, the Azure AD domain is registered directly with the on-premises Active Directory, then the Active Directory users can be set to use that domain suffix. The key part is that the user name in the internal AD (i.e. DOMAIN\xxx) must match the user portion of the Azure AD account (i.e. xxx@domainxxx.com). The entire procedure is outlined very well by Greg Galloway in this article, and I won’t repeat it here.
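As a small illustration of that matching requirement, here is a hypothetical check (the account names are invented) showing which pairs of identities would line up:

```python
# Illustration of the name-matching rule described above: the user
# portion of the internal account (DOMAIN\xxx) must equal the user
# portion of the Azure AD UPN (xxx@domainxxx.com). Names are invented.

def accounts_line_up(windows_account: str, azure_upn: str) -> bool:
    """Return True if DOMAIN\\user matches user@tenantdomain."""
    _, _, sam_account = windows_account.partition("\\")  # drop DOMAIN\
    upn_user, _, _ = azure_upn.partition("@")            # drop @domain
    return sam_account.lower() == upn_user.lower()

print(accounts_line_up(r"CONTOSO\alice", "alice@domainxxx.com"))   # True
print(accounts_line_up(r"CONTOSO\bob", "robert@domainxxx.com"))    # False
```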

Going back to the original error, it would be nice if it could be a little more descriptive. I’d be happy with “Access Denied”. These are early days, and the product is still in preview – I expect this will change. Security also may not be the only cause of this error, but it’s the only one that I’ve seen thus far. 

The New Power BI – Now With Enterprise!

Yesterday Microsoft announced the next step in the evolution of Power BI. It’s getting quite a bit of attention, and rightly so, for its aim of bringing Business Intelligence closer to users. Democratizing BI has always proved a challenge – it’s been the realm of the gurus in white coats who hold the keys to the data. Microsoft is aiming to accomplish this democratization through a combination of user focus and, as of yesterday, a drastic change in its pricing model. Power BI just went from about $40 per user per month to free, or $9.99/user/month for advanced capabilities. That’s quite a drop, and arguably the biggest announcement from yesterday – it will have a massive impact. The detailed price breakdown can be found here.

However, all of the focus around personal BI is, in my opinion, missing a key component. Power BI and its components have always focused squarely on both personal and team BI solutions. That is to say the ability for a power user to model data, visualize it quickly and easily and to share it out with fellow team members. While that capability is certainly retained in the new Power BI, this new version contains the first appearance of enterprise grade BI in the cloud for Microsoft.

To fully understand this, it’s necessary to touch on the Microsoft BI stack as it stands today.

Microsoft BI On Premises

The on-premises BI story from Microsoft may be confusing at times, but it is very powerful and relatively complete. In a nutshell, the story is good from a personal, team and enterprise perspective.

On the enterprise side, there are products from both the SQL Server team and the Office team. Data warehousing is served by SQL Server, and ETL duties fall to SQL Server Integration Services (SSIS). Multidimensional analysis storage is served by SQL Server Analysis Services in both OLAP and Tabular modes, and reporting is performed by SQL Server Reporting Services (SSRS). The SQL product line doesn’t have much on the client side for analysis apart from SSRS, but this slack is taken up by the analysis tools available in Excel, and through PerformancePoint Services in SharePoint.

Indeed, SharePoint also provides a platform for SSRS via SSRS SharePoint mode, and for Excel based analytical workbooks connected to SQL Server and to SSAS through Excel Services.

On the personal BI side, that role has traditionally fallen to Excel. The pitfalls of importing data into Excel workbooks for analysis are well documented and don’t need to be discussed here, but the bulk of those issues were addressed with the introduction of PowerPivot several years ago. PowerPivot allows massive amounts of data to be cached within the Excel file for analysis without any data integrity concerns. The addition in recent years of analytic visuals (Power View, Power Map) and ETL capabilities (Power Query) has further rounded out the offering.

Taking that Excel workbook and sharing it brings us into the realm of Team BI. This is to say that the analyses are relatively modest in size, and of interest to a targeted group. These models may not require the rigour or reliability associated with enterprise BI models. Once again, the technology involved here is SharePoint. A user can take a workbook with an embedded PowerPivot model, share it through a SharePoint library, and other users can interact with that embedded model using only a browser. This capability requires PowerPivot for SharePoint, which is really a specialized version of SSAS, along with a SharePoint service application.

One thing to note about these seemingly disparate approaches is that a power user can build a Power Pivot data model with Excel, share it to a team via SharePoint, and when it requires sufficient rigour or management, it can be “upgraded” into SSAS in tabular mode. This common model approach is powerful, and is key to understanding Microsoft’s entire BI strategy. You can also see here that SharePoint straddles the two worlds of team and enterprise BI.

Moving to the cloud

The BI workload is one of the last Microsoft workloads to move to the cloud, and with good reason. Massive amounts of data present problems of scale, and security or data sovereignty concerns tend to keep data on premises. However, there is a very real need to provide BI to users outside of the firewall.

SharePoint is the hub of BI on prem, so it’s logical to assume that with SharePoint Online, it could continue to perform that function in the cloud. The big catch here is that on-prem, SharePoint is simply the display platform. In the enterprise scenario, users connect through SharePoint to the back end servers. This isn’t an option in the cloud, so enterprise BI was left off the table.

With the personal and team BI scenarios, data is cached in a Power Pivot data model, which could be supported in the cloud. When Office 365 moved to the SharePoint 2013 code base for SharePoint online, rudimentary support for embedded Power Pivot models was indeed added. Essentially PowerPivot for SharePoint “light” was added. I call it light for two major reasons. Firstly, data models could be no larger than 10 MB. Secondly, there was no way to update the data contained within the Power Pivot cache, outside of re-uploading the Excel workbook. This is still true without a Power BI license. The inability to refresh the data renders team BI almost useless, except in static data scenarios.

The first generation of Power BI changed all of that. With a Power BI license, it was possible to install a Data Management Gateway on premises that would connect to team BI workbooks in Office 365 and update them on a scheduled basis. Yes, the gateway had many limitations (many of which have been removed over time), but finally, the on-prem refresh story was solved. In addition, the model size limit was increased to 250 MB. However, we were still left with a number of problems or limitations.

  1. Daily data refresh schedule. Automatic data refreshes could be daily at their most frequent. Manual refreshes could be done at any time.
  2. Capacity. The maximum size of a data model was increased to 250 MB, which is relatively small for enterprise scenarios. In addition, refreshes aren’t differential, which means that the entire model is re-uploaded on every refresh.
  3. Data sensitivity/sovereignty. The refresh problem was solved, but because the data is still cached in the workbooks, there can be reluctance to send it outside of the corporate firewall.
  4. Per-user security. Power Pivot data models have no concept of per-user security in a workbook (tabular models in SSAS do). Security is at the workbook level.
  5. Cost. The initial cost of Power BI was $40 per user per month. A Power BI license was required to interact with any workbook that had a data model larger than 10 MB. Considering that a full Office 365 E3 license was around $25 per user per month, this price tended to limit the audience for sharing.

All of this is to say that Power BI in its first (and as yet current) incarnation is suitable for personal and team BI only. There has been no enterprise cloud BI story.

Power BI V2

The announcements yesterday outlined the next generation of Power BI. Going forward, Power BI will be available as a standalone offering, at the price points noted above. Office 365 users will continue to be able to use it from Office 365, but Office 365 will no longer be required to use it. In its early days, Power BI was a SharePoint app, but a careful examination of URLs in the current offering quickly reveals that it’s actually two apps currently, both running on Azure (not in SharePoint).

If you’ve signed up for the new Power BI preview, you may notice that the URL is http://app.powerbi.com/…… so this move isn’t a big surprise.

With the new model, Excel is no longer the central container. Users connect to data and publish it directly to Power BI. Behind the scenes, the service is doing something very similar to what it does with Power Pivot models – it’s storing them in SSAS. In fact, the same limits still apply – 250 MB per model (at least for now). Excel can still be used, but now as a data source.

Visualizations are performed through Power View, and data is acquired through Power Query. These are no longer add-ons, but available on their own through the Power BI Designer. This decoupling is good for those that have not made an investment in SharePoint Online, or Excel.

These changes to the architecture and the cost are great news for adoption, but don’t address the needs of the enterprise. Except for one thing: the SSAS Connector.

One of the data sources available to the new Power BI is the SSAS connector. This connector is a piece of code that runs on-premises (it actually includes the Data Management Gateway). It acts as a bridge between the Power BI service and an on-prem SSAS server.

The biggest distinction worth noting is that with the connector, data is NOT being uploaded to the service; it remains on-prem. The way that it works is that when a user interacts with a visualization from the cloud, a query is sent to the SSAS server through the gateway. That query is run, its results are sent back to the user’s visualization, and the data is not persisted.

In addition, when the query is sent to the SSAS server, it is run with the permissions of the user making the request. This is accomplished through the EFFECTIVEUSERNAME feature in SSAS. This provides for full user-level security, and since tabular models in SSAS can utilize per-user security, we no longer need to rely on proxy accounts or document-level security.

Finally, because the data is being stored in an on-prem SSAS server, it can be refreshed automatically as often as desired. For the same reason, we have no capacity limits – you can grow your own SSAS servers as large as you like.

The SSAS connector removes most of the limitations that prevent cloud based enterprise Business Intelligence, and the new pricing model removes the rest. Certainly there are going to be feature limits in the near term, but it appears to me at least that the back of this thorny problem has finally been cracked.

Power BI Data Management Gateway 1.4 – Where is it heading?

I received a notice from my main Power BI tenant last night that a new version of the Data Management Gateway was available. The previous (1.2) version contained some very significant changes, so I was understandably eager to have a look. I installed it, observed a relatively attractive setup interface, then opened the release notes to find out what else was new. Only four items were listed (from the release notes).

  • Unified binary that supports both Microsoft Azure Data Factory and Office 365 Power BI services
  • Refine the Configuration UI and registration process
  • Azure Data Factory – Azure Ingress and Egress support for SQL Server data source
  • Office 365 Power BI – Bug fixes

I had already observed number two, the new setup experience. Bug fixes, while absolutely necessary, aren’t necessarily something to write about, but I think that the other two items are. While they may not have immediate impact, my bet is that they will in very short order.

The key point here is that the gateway now supports the Azure Data Factory. There are many, many things that the Data Factory enables (Hadoop, anyone?), but the one that I feel is most relevant to Power BI today is the ability to connect directly to on-premises data sources. That’s not quite how it’s been done until now.

Power BI for Office 365

In the context of Power BI as we’ve come to know it today, on-prem data refreshes are handled by the Data Management Gateway. On a periodic basis (daily at most) the service contacts the gateway, which in turn reruns all relevant queries. The resultant data is then uploaded to the service.

The service in turn packages the data and updates the host Excel workbook, and the model is transferred into a back-end analysis server. Every transaction goes through the host Excel workbook.
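To contrast this push-style flow with the live connector described below, here is a tiny runnable sketch (every name in it is invented for illustration; this is not the gateway’s actual code):

```python
# Conceptual contrast only: the Office 365 gateway flow is push-based.
# Data is rerun on-prem and uploaded on a schedule, unlike the SSAS
# connector, which forwards each query live and persists nothing.
# All names below are invented for illustration.

def run_on_prem(query: str) -> list[dict]:
    """Stand-in for the gateway rerunning a query against a local source."""
    return [{"region": "West", "sales": 1200}]  # fake result set

def upload_to_service(workbook: str, query: str, rows: list[dict]) -> None:
    """Stand-in for pushing refreshed data into the hosted workbook/model."""
    print(f"{workbook}: uploaded {len(rows)} rows for {query!r}")

def scheduled_refresh(workbook: str, queries: list[str]) -> None:
    # Runs at most daily: every relevant query is rerun and its full
    # result set is re-uploaded (refreshes are not differential).
    for query in queries:
        upload_to_service(workbook, query, run_on_prem(query))

scheduled_refresh("SalesDashboard.xlsx", ["SELECT region, sales FROM Sales"])
```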

Power BI Dashboards

If you’ve had a chance to see the preview of Power BI Dashboards, you may have noticed that it is not dependent on Office 365 or Excel at all. When you add a data source, you take the data and add it to a cloud-based data model directly (presumably backed by SQL Server Analysis Services). All visualizations work against these models, with one very important exception. If you connect to a SQL Server Analysis Services data source, you are actually connecting directly to a model hosted on an on-prem SSAS server in real time.

How is this done? The connection is made through the “Analysis Services Connector”, which is a separate bit of software installed on-prem to facilitate the connection between the Power BI Dashboards service and the on-prem SSAS server. It’s available directly from the dashboards portal.

After installation (a process that establishes both the dashboard and SSAS credentials), the connector can be reconfigured by running the “Power BI Analysis Services Connector” tool.

However, installation also adds another piece of software to the host machine: the Microsoft Data Management Gateway. This version of the DMG establishes the connection between the SSAS server and the Power BI service in real time. Up until now, the DMG didn’t work this way, so which version is it?

Until now, the most recent version of the DMG was 1.2, so this Dashboards preview contained a glimpse into the next generation Data Management Gateway that provided some intriguing new capabilities.

Coming Together

Checking the latest version of the Data Management Gateway from Office 365, we see that it is newer than the version included in the Dashboards Preview, and presumably includes everything from it. The key phrase in the release notes to me is therefore “Unified Binary”. One gateway to rule them all, if you will. Does this mean that we’ll be able to connect to on-prem data in real time from Office 365 as well as from the Power BI preview? I don’t know how, but I bet that the building blocks are now there.

The latest version may not include support for any new data sources, or any new bells and whistles, but it’s likely worth setting up for new capabilities that will hopefully show up sooner rather than later.