
Category: Business Intelligence

Working With Power BI, Office 365 and File Size Limits

Note – this article was written during the preview cycle of Power BI, and certain behaviours and screens may change by final release.

Quick, what’s the biggest file that you can upload into SharePoint? As with anything SharePoint, of course it depends. This article explains some of the file size boundaries in SharePoint and Office 365, how they impact Power BI, how to take advantage of the storage changes that Power BI provides, and some of the intricacies of using the xVelocity model with Power BI.

The maximum file upload size is 250 MB by default with SharePoint 2013, and 50 MB with prior versions. This setting can be changed on a per web application basis within Central Administration, and once set, it is a hard limit. If your file exceeds it, you won’t be able to upload it into SharePoint.
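For the on premises case, a quick PowerShell sketch of the change looks something like the following (the web application URL is a placeholder, and the MaximumFileSize property is expressed in MB):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Raise the maximum upload size (in MB) for a single web application
$webApp = Get-SPWebApplication "http://intranet.contoso.com"   # placeholder URL
$webApp.MaximumFileSize = 250
$webApp.Update()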

If you use Excel Services, you may also note that the default maximum size for an Excel workbook is 10 MB. This too can be changed in Central Administration (it’s a property of trusted file locations). If you upload a workbook that exceeds this limit, Excel Services won’t be able to render it, and you’ll need to use the Excel client. Performance considerations permitting, I like to adjust this setting to match the maximum file upload size to avoid end user confusion.
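That adjustment can also be scripted. The sketch below assumes the Excel Services cmdlets are available and that the trusted location limit is set via the WorkbookSizeMax parameter (in MB) – worth verifying against Get-Help Set-SPExcelFileLocation in your own environment:

# Match every trusted file location's workbook limit to the 250 MB upload limit
Get-SPExcelServiceApplication | ForEach-Object {
    $ecs = $_
    Get-SPExcelFileLocation -ExcelServiceApplication $ecs | ForEach-Object {
        Set-SPExcelFileLocation -Identity $_ -ExcelServiceApplication $ecs -WorkbookSizeMax 250
    }
}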

SharePoint Online in Office 365 works a little bit differently. Until recently, the upload limit could not be changed at all. SharePoint Online has since removed the configurable upload limit altogether, so the maximum file upload size is now 2 GB, which represents the maximum file size that SharePoint can store. For Excel Services Online there are settings that can be adjusted, but the maximum workbook size isn’t one of them. With Excel Services in Office 365, the maximum workbook size is 10 MB. Period.

If you use Power Pivot to do analysis, data is brought into the data model that is embedded within the workbook. That workbook is then stored in SharePoint, and on first use of the model (clicking a slicer, expanding a dimension, or anything requiring the retrieval of data from the model), an instance of the model is created in a backing Analysis Services engine by Excel Services.

Given that all of this is wrapped up into a single workbook file (xlsx), 10 MB would seem to be pretty constraining for data storage, and it is. The data model is very efficient, and data is highly compressed, but 10MB is simply too small for most involved analyses. This isn’t a problem on premises, where additional resources can be allocated and the limit increased, but in Office 365, you’re simply stuck with the 10 MB limit.

Enter BI Sites, the recently announced Office 365 based Business Intelligence app that is part of the Power BI application suite. BI Sites is a SharePoint app that provides additional capability to workbooks with embedded models stored in SharePoint Online libraries. BI Sites allows for data models as large as 250 MB. The BI Sites app doesn’t actually store content, it just renders it, and provides additional capability such as automatic data refresh. BI Sites also relies on Excel Services to render the content, so does a Power BI subscription increase that workbook limit to 250 MB? Nope – that limit is still firmly in place. So how does it get around it?

As mentioned above, when a user action forces a data retrieval from the model, Excel Services creates an instance of the model in the backing Analysis Services engine, if it hasn’t already been created. Subsequent calls just utilize the backing version. What Power BI does is preload this model and then drop it from the workbook (it is reconstituted on download). Given that the model is the largest part of the workbook, this drops the file size considerably, allowing it to work within the limits imposed by Excel Services.

Notice that the 250 MB limit above is specified for the model, NOT for the workbook. The workbook limit is still 10 MB, and this is quite visible if you do things in the wrong order, at least in the Power BI preview. To demonstrate, we will work with a model that is similar to the one that I created in this article, which is a rudimentary budget analysis with Great Plains data. There are three versions of the analysis file for demonstration purposes.

image 

In the first version, the data was first imported into Excel worksheets using Power Query, and then loaded into the model before the model was edited. It is obviously the largest of the three, as it contains both the original data and the more optimized model. In the second file, the data was loaded directly into the model with Power Query. After model edits, the analysis was created using a simple pivot table and pivot chart. It is the smallest of the three, as the data is contained exclusively within the optimized model. In the last version of the file, the data was imported only into Excel worksheets using Power Query, with no data model. Take note of the fact that this file is 12 MB, 50% larger than the model-only version, and all of it counts toward the Excel Services limit.

After uploading these three files to an Office 365 site, only the EnbGPDataModelOnly file can be accessed directly via Excel Services. This makes sense because the other two are larger than the 10 MB limit, and Excel Services can’t do anything with them at all, resulting in the error below:

Sorry, we can't open your workbook in Excel Web App because it exceeds the file size limit. You'll need to open this in Excel.

If we now navigate into the Power BI application, we will see a warning symbol on the tiles for our workbooks. This is because they have not yet been optimized for viewing on the web. What does that mean? It means that the model has not yet been extracted from the workbook and attached to the Analysis Services engine.

image

To do this, click the ellipsis on the tile, and then select Enable. After confirming, the model will be extracted, and you receive a confirmation. In our case, the only one that will be successfully enabled is our EnbGPDataModelOnly file, and the other two fail for different reasons. In the case of EnbGPExcelOnly, the data model was never populated, which results in “This Workbook could not be enabled for web viewing because it does not contain a data model”.

image

This makes perfect sense, but it does mean that all of the features available through BI Sites are unavailable to workbooks that don’t use data models. There is one exception to this, however. The Power BI app for Windows 8 and iOS can render workbooks without models, provided that they are within the 10 MB limit.

If we try to enable the EnbGPDataModelandExcel file, which does contain a data model, we get the error “This workbook could not be enabled in Power BI for Office 365 because the worksheet contents (the contents of the workbook that are not contained in the data model) exceed the allowed size”.

image

If we look at the file with only the model, it’s about 8.7 MB. The file without the model is 12 MB, so even with the model extracted, the limit is exceeded, and the enablement process detects this.

On a side note, I think that these error messages have some interesting language. They make reference to “Power BI for Office 365”. This implies, to my mind at least, that there may be a version coming that isn’t for Office 365. No on premises version of Power BI has ever been mentioned by Microsoft, but this may hint at it.

When complete, the failed and successful enablements are easy to spot.

image

Clicking on the middle tile will successfully render the workbook.

Next, let’s work with a file that can benefit from these new features. I created a model that’s identical to the “model only” version, the only difference being that I imported all of the data from the three tables, not just the selected columns. The resultant file (named EnbGPBigModel.xlsx) is 54 MB on disk – well above the 10 MB Excel Services limit, but below the 250 MB Power BI limit. However, once uploaded, clicking on it directly in the library gives the “file size exceeded” error message. What’s up with that?

The reason is that Excel Services just sees that the file is too big, and gives up. In order to extract the model, we must first enable the workbook in Power BI before we can work with it in Excel Services. To do that, we simply repeat the above process by navigating to the Power BI app and enabling it.

image

Once this has been done, the workbook can be accessed by clicking on it within the Power BI app here, in the Power BI mobile app, or by navigating back to the source library and using it directly with Excel Services.

image

image

Therefore, when building models, it is vitally important to distinguish between the size of the model, and the size of the rest of the workbook. While the model can grow to 250 MB, the sum of the rest of the content cannot exceed 10 MB. Note to Microsoft – we could really use a good term here, as opposed to “rest of the workbook”. Let’s call it the spreadsheets for our purposes right now.

So, how do we know the size of the model vs the size of the spreadsheets? Well, an Excel file (xlsx) is really just a zip file in disguise. In fact, if we rename the file to end in a .zip extension, we can crack it open with a ZIP viewer (or as a folder in Windows). If we do so with our file that contains both the spreadsheets and the model, open it up, and then drill down to the xl\model folder, we’ll see the file item.data.

image

This file is your data model. In this case, it is 7.8 MB in size. Subtract that value from the size of the overall xlsx file, and you have the size of your spreadsheets, which is the part that must fit within the 10 MB limit. When done, just rename the extension back to .xlsx.
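If you would rather not rename anything, the same arithmetic can be done in a few lines of PowerShell. This is only a sketch: it assumes .NET 4.5 for the System.IO.Compression classes, uses a made-up file path, and assumes the model is stored in the package at xl/model/item.data.

Add-Type -AssemblyName System.IO.Compression.FileSystem

$path = "C:\Demo\EnbGPDataModelandExcel.xlsx"   # placeholder path
$zip  = [System.IO.Compression.ZipFile]::OpenRead($path)

# The embedded data model - use the compressed size, since that is what counts toward the file size on disk
$model   = $zip.Entries | Where-Object { $_.FullName -like "*model/item.data" }
$modelMB = [Math]::Round(($model | Measure-Object CompressedLength -Sum).Sum / 1MB, 1)
$totalMB = [Math]::Round((Get-Item $path).Length / 1MB, 1)
$zip.Dispose()

"Model: $modelMB MB; everything else (must stay under 10 MB): $([Math]::Round($totalMB - $modelMB, 1)) MB"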

If we continue to have a problem, or as a matter of good practice, an excellent tool to use is the Workbook Size Optimizer from Microsoft – available here. This is (yet another) Excel add-in that will help to further optimize your model and reduce the file size. Just open your workbook, run the add-in, and follow the prompts.

We can see that although the 250 MB model size in Power BI helps to make Office 365 a viable BI platform, it does still require a certain awareness on the part of users. The most important lesson to learn from this is to try to avoid importing data directly into Excel worksheets. Whenever possible, bring the data directly into the model, bypassing any worksheet storage. This is a good idea in any event, but Power BI further underscores the need for it. When using Power Pivot, it’s fairly straightforward, but the data acquisition interface in Power Query tends to favour importing into the worksheet. When using Power Query, care must be taken to avoid worksheet import, and I’ll be posting another article on how to do this shortly.


Power Query Navigator Can Only View 2,000 Items, and Other Limits

I was preparing a demonstration using Great Plains data when I came across a curious limitation. Great Plains isn’t exactly efficient with its use of tables, and this database had well over 1,000 tables in it. Unfortunately, Power Query wasn’t showing the tables that I was interested in. What I did notice was the number 1000 beside my table name.

image

That seemed like a pretty suspiciously round number, so I decided to dig a little and found that with the current version of Power Query, the navigation pane does indeed have a limit of 1000 items per data source. Indeed, there are a number of limitations to be aware of, and Microsoft has published a complete list of Power Query limitations that can be found here.

UPDATE – Nov 2013 – In the Nov 2013 update of Power Query, this limit has been increased to 2,000. I have updated the title of this post accordingly, but everything else here remains valid.

There is a workaround for this limit, however. Instead of selecting the table that you are after, select any one of the tables that are displayed. The preview data will fill the preview pane. In this case, I need data from the RM00101 table, but I first select the DS10100 table. Once the preview has loaded, click on the Query Editor button.

image

Once the button is pressed, you will be presented with an editor screen. Simply replace the name of the table that you don’t want with the name of the table that you do.

image

Once complete, clicking on the Done button should fill the preview screen with the desired data, and the navigation pane will show the correct table name.

image

You can now continue on to select additional tables, or continue your analysis. Feedback from customer support indicated that this limitation will be addressed in an upcoming update.


Power BI – What Is It, And Why Does It Matter?

Power BI for Office 365 was first shown to the public last week at the Microsoft Worldwide Partner Conference. Power BI is, in my opinion, the most significant advance that Microsoft has made in the area of cloud Business Intelligence, and it considerably strengthens their offering in personal Business Intelligence. I’ve been evangelizing Microsoft’s recent personal and team BI features for some time now, and this announcement builds on that momentum. The demonstration raised a lot of eyebrows, and showed off some of the great strides that Microsoft has made in this space. One little gem in there that was easy to miss was the fact that we got our first glimpse at the HTML 5 version of Power View – at least the renderer. It also raised a number of questions as to how it works and what will be necessary to get it working optimally. I had the opportunity to speak with a project manager and managed to answer a few of these questions. This post is an attempt to explain what this product is all about.

Overview

Firstly, if you haven’t already seen it, you need to see the demonstration that was done by Amir Netz during the Day 1 keynote. It’s really one of the most compelling demos that I’ve ever seen. I include it below.

Amir Netz Announces Power BI at WPC 2013

 

Microsoft has provided a web page to describe the product and let you sign up for a preview. It also contains links to download some of the constituent parts, but it doesn’t necessarily do a great job of explaining exactly what you’re signing up for, or what these downloads are for.

To start with, Power BI requires Excel 2013. This is because all of the constituent components rely on the xVelocity (PowerPivot) engine that is included with Excel 2013. Make no mistake, as far as Microsoft is concerned, the future of multi-dimensional analysis is with the xVelocity in-memory engine. This engine isn’t new – it’s what powers Power Pivot (note the space between the words Power and Pivot… that’s also new).

The “Power” branding is encouraging as well, I think. The marketing of the business intelligence offerings has been confusing to say the least. Check out this handy “quick reference guide” if you don’t quite understand what I mean. There’s a very large assortment of tools that come from two different product groups (even after the reorg) and it can be difficult and confusing to navigate. Marketing hasn’t made this any easier in the past, but the consistent “Power” prefix goes some way to remedying this.

Power BI itself is the culmination of several different projects – Data Explorer, GeoFlow, BI Sites, and work in the mobility space. The first two of these have been in public preview for a little while now, and were given their release names this week – Power Query and Power Map respectively. In addition, Power Query reached its General Availability milestone, making it acceptable for production use. The BI Sites project has been a very well-kept secret, and last week was the first time that it had been seen in public. Finally, apps were shown for both Windows 8 and iOS that should go a long way toward addressing the deficiencies in the mobile space. Unfortunately, nothing was said about Android or Windows Phone, but I have to think that they’re not far off.

Given that there are several components to Power BI, I think that it’s worth outlining each one and how it will be used. To start with, Power Query can be used to discover, acquire and transform source data and load it into an xVelocity based data model in Excel. It can also register the data connections through a “data steward” service that will reside in Office 365 – this will allow for tenant wide discoverability of the data. Power Pivot will then be used as necessary to modify the model, create calculated measures, format columns, etc. Data analysis can then be performed in Excel, using pivot tables, charts and/or Power View. In addition, Power Map can be used to visualize geospatial data on a rotatable, zoomable 3D map. Once ready, the workbook will be published to an Office 365 document library, where a BI site will find it. BI Sites can then be used to perform natural (English) language queries on the data, add reports to the mobile applications, and schedule data refreshes whether the data is public or private. Finally, the mobile apps can be used to consume the reports from the BI Site.

Power Query

Power Query went into general release last week along with the Power BI announcement, and you can download it here. It had been available in preview form as project “Data Explorer”. It’s an add-in to Excel 2010 or 2013 that gives it Extract, Transform and Load (ETL) capabilities. I like to call it the personal version of SQL Server Integration Services, in the same way that Power Pivot is the personal version of Analysis Services, and Power View (to a lesser extent) is the personal version of Reporting Services.

As you might expect, it can pull data from a wide variety of sources. Web pages can be parsed, and tables scraped to provide tabular data. The usual suspects are all supported (SQL Server, Oracle, DB2, etc.), as well as any OData feed, the Azure Data Market, SharePoint lists, Hadoop and HDInsight, Active Directory, and Facebook. One of the more interesting data sources is a folder (not a file). If you point Power Query at a folder, it will merge the data from all of the files in that folder into a single result. Refreshes will pick up any new files as new data. I can see some real uses here for log files. Yes, the files do all need to share the same schema.

The killer feature for discovery here is the online search. In preview, online search gave you access to all of the Wikipedia data that has been expressed in tables. Now however, you not only have access to more public data sources, but you can publish your own data sources (on premises or otherwise) to a catalogue that resides in Office 365. Below is a screen shot from Amir’s demo.

image

Clicking on Online Search in the ribbon brings up the window on the right, where you can enter a search term, in this case “Microsoft Azure Datacenters”. You are then presented with a list of candidates, both public and private. Hovering over one of them will give you a preview of the data to be returned, and when satisfied, you can select Add to Worksheet, or if you would like to transform the data, click Filter and shape, where you can modify data types, flatten relational data, merge or split fields, etc. Despite the “Add to Worksheet” button name, the data can be loaded into the worksheet itself, the back end data model, or both. This distinction is very important when working with very large data sets (i.e. more than 1 million rows).

You can also see from the result set that the data is coming not only from Wikipedia sources, but also from the Azure Data Market and from on premises data sources. What the demo didn’t make clear was how those data sources get into the catalogue. Power Query is the answer here as well. When BI Sites are implemented in Office 365, Power Query will expose some additional UI elements that will let you publish the query and transformations that you have built into the enterprise catalogue. In fact, when you look closely at Amir’s demo screen, you will see these elements.

image

These don’t show up on a standard Power Query install.

image

The transformation capabilities of Power Query are really quite impressive, but a detailed description is beyond the scope of this post. They can be seen in the version available for download. As an aside, models built with Power Pivot can be moved directly into Analysis Services. Given the relationship that I mentioned above between Power Query and Integration Services, I wonder if it’s on the roadmap to allow Power Queries to be brought into SSIS. On the surface, it would seem to make sense, at least from where I’m standing.

Power Map

Power Map is the other component that’s available today, although as of this writing, it’s still in preview form. You can download it from here. Unlike Power Query, Power Map requires Excel 2013 – it won’t run on 2010 at all. Power Map has been available since last fall in preview form under the code name “GeoFlow”. It didn’t make it to the main stage in the announcement, but it’s a pretty impressive piece of technology.

What Power Map does is allow you to take a data model built with Excel and/or Power Pivot, and plot the data on the surface of a 3D map. The map can then be rotated and zoomed, and animations created around different views or time dimensions of the data. It wasn’t shown at the announcement last week, but below is a Power Map visualization of Hurricane Sandy data.

image

This was put together simply by plotting the location of the measurements, with the pressure on one layer as a heat map and the wind speed on another layer as columns. Storm category was used to color the columns appropriately.

There are a number of limitations to Power Map currently, the biggest being that it doesn’t yet work at all in SharePoint – it’s Excel only (and not the Excel Web App). Given that fact, it’s really just a one-off visualization/presentation tool, no matter how cool it is. However, we did just see our first glimpse of an HTML 5 Power View viewer… could an HTML 5 Power Map viewer be far behind?

BI Sites

To my mind, BI Sites is the most exciting part of Power BI. Here is where we find the most significant advances yet in both Microsoft’s mobile and cloud BI strategies. Until now, the cloud strategy was hampered by the inability to keep data refreshed automatically, and mobile devices had to settle for mobile web approaches. With BI Sites, we see not only a great first step into these areas, but a solid infrastructure to expand upon. Unfortunately, at the time of this writing, it’s not yet available. You can however sign up for the preview program here.

BI Sites is a SharePoint App, pure and simple. This fact was not even mentioned during the announcement, but I feel that it’s vitally important to understanding how it works, and what its limitations are. Once created, the app reads Excel content from your Office 365 tenant, and provides a great number of features. Let’s quickly run through a few of these features.

Data Refresh

The Achilles heel of using Office 365, at least for me, has been the inability to schedule data refreshes. The latest 2013 wave brought support for interacting with PowerPivot enabled workbooks, which was great, but data refreshes had to be done manually, and the results re-uploaded. On premises, PowerPivot for SharePoint can be used to perform this function, but PowerPivot for SharePoint is not available in Office 365. Since the data can live either on premises or in the public cloud, how does a cloud based service refresh data from inside your firewall?

The answer is that you will install a service, a sort of “data gateway” somewhere in your environment. This gateway will manage the refresh of all the on premises data sources and push updates out to Office 365, NOT the other way around. This will not require any firewall configuration, and will not depend on App Fabric, or any other relatively complex infrastructure.

Once the gateway is in place, automatic refresh can be configured separately for each individual workbook, as can be seen below.

image

 

Report Gallery

One of the things that you notice when you enable PowerPivot for SharePoint is a new library template, the “PowerPivot Gallery”. When you use this template, you get a completely different, Silverlight based UI for the library that gives you previews and access to data refresh capabilities. This is again absent in Office 365 due to the lack of PowerPivot for SharePoint. However, BI Sites gives you the same functionality in a (presumably) HTML 5 based experience.

image

HTML 5

Power View is a self-service analytical reporting tool that was first introduced in SQL Server 2012 Reporting Services, and subsequently added to Excel 2013. It allows users to quickly analyze data from an embedded model, and is supported within SharePoint. It’s a great tool for getting answers very quickly, but one of the concerns with it is the fact that it is built with Silverlight. This limits its use, particularly in the mobile space, as Silverlight is not supported on iOS, Android, or (surprisingly) Windows Phone.

BI Sites shows us the first version of a Power View renderer based on HTML 5, which should work on all devices. When the workbooks are opened through the report gallery, you will see the Power View reports, but they are rendered in HTML 5. I suspect that we’ll continue to see Silverlight used in the designer in Excel, but the rendering through BI Sites is done with HTML 5, which opens up the world of mobile.

Data Catalogue

I hesitate to include this in this section, as it could just as easily be included in the Power Query section, but since it requires the BI Sites capability in Office 365, I include it here. The Data Catalogue is the service that allows you to register data sources from Power Query. The catalogue lives in Office 365, and is managed from there. Data sources in the catalogue can be discovered by Power Query users through the “Online Search” button.

The catalogue also supports capabilities that feed the natural language features of BI Sites. These capabilities are unclear to me at this point, but I will be sure to expand upon them once I get my hands on the product.

Finally, to be sure, “Data Catalogue” is what I’m calling it in the absence of any official nomenclature from Microsoft.

Natural (English) Language Query

Natural language query was a very well kept secret. I had no idea that it was being developed until I saw it on stage. Natural language query allows a user to interrogate a data source almost as if they were performing a full text search. Based on their query, Power BI selects an appropriate visualization to display the results.

image

As you can see above, 1 shows that we’re interrogating a data model contained in the “Sales Reporting and …” workbook, 2 shows the query entered by the user in relatively plain English (no SELECT statements!), and 3 shows how Power BI has interpreted the natural language query. Finally, it determined that a tabular display was the best visualization to use for this result. However, many others are available, and it uses the visualizations available to HTML 5 Power View to display them. Number 4 shows other visualizations that may be relevant to the result set, and all you need to do is select them to see them.

A few more results can be seen below.

image

image

image

New Visualizations

Any user of Power View will be familiar with the visualizations shown above. However, the one that brought the biggest applause at the announcement was the “king of the hill” visualization. This shows you a “winner” and “contenders” over a period of time, based upon defined criteria. When applied to pop music over time, it looks something like this.

image

To get the full effect of this, I highly recommend watching the video starting at about 9 minutes in. Of course, this is a highly specific case, and I have to think that more interesting visualizations are on the way.

Limitations

As with most products, Microsoft giveth and Microsoft taketh away. Without having my hands on the full product, I have no hope of coming up with anything like a comprehensive list of limitations, but there are two that jump to mind immediately.

Power BI is limited to Office 365 only. That’s right, there will be no on premises version of this, at least not at launch. From a technical standpoint, I can’t see any reason why this must be, apart from either resourcing limitations within Microsoft, or strategic direction (or a combination of both). It remains to be seen whether Power BI is a “cloud first” product, or a “cloud only” product. Time will tell. In the meantime, those organizations that are still leery of cloud computing may miss out, but there is still a lot of goodness in the on premises offerings.

BI Sites is a SharePoint App, not a container. The workbooks themselves are still stored within document libraries in SharePoint. The default file size limit for Excel Services is 10 MB, and for SharePoint itself it is 50 MB. It’s unclear to me if BI Sites relies on Excel Services (I assume that it does), and if so, your data models have a hard constraint of 10 MB. Even if Excel Services is not the bottleneck, they won’t be able to be any larger than 50 MB. That is simply too small for some models, and will leave them out in the cold. I was told at WPC by a Product Manager that this is changing, but specifics are currently unavailable.

Mobile Apps

The final component (that is still unavailable) of Power BI is a collection of Mobile BI apps. Reports are selected in various BI sites and flagged for mobile use. Once flagged, they appear in the mobile apps and can be interacted with directly.

image

As an aside, I borrowed the above image from the Office 365 technology blog. Anyone that has attended my BI presentations in the past year will be familiar with the first data set in the app, a topic near and dear to my heart.

Apps were initially announced for Windows 8 and for iOS. No mention was made of Android or Windows Phone (?!?!), but I have to think that they aren’t far behind. Blackberry? I have no idea, but I wouldn’t hold my breath.

To wrap up, I’m very excited about several of the new capabilities that Power BI brings, and the doors that it opens. It’s another big step in moving Business Intelligence out of the realm of the data gurus and into the hands of those who can actually benefit from it. Once it is available to play with, I intend to cover it more thoroughly.


Troubleshooting and Removing a Failed Installation or Upgrade of PowerPivot for SharePoint

I recently had the opportunity to perform an upgrade of PowerPivot from SQL Server 2008 R2 to SQL Server 2012 for a customer with a relatively large SharePoint farm. Their farm consisted of 4 SharePoint front end servers, 2 SharePoint application servers, and a SQL Server 2008 R2 cluster. The PowerPivot for SharePoint 2008 R2 service was installed on the 2 application servers. The upgrade didn’t go quite as smoothly as planned and in the process it was necessary to manually remove all traces of the PowerPivot 2008 R2 service. This was a little tricky (to say the least), so I thought that I’d share the experience here in hopes that it helps someone.

The original plan was to remove the PowerPivot service from the second server, then perform a SQL server upgrade on the remaining server, upgrading the PowerPivot for SharePoint instance. The PowerPivot configuration tool would then be run on the single server, upgrading the SharePoint elements. Finally, PowerPivot for SharePoint 2012 would be reinstalled on the second application server.

This is an approach that has worked quite well in the past, but in this particular environment, the configuration tool could not run the upgrade process. Given that no work had yet been done on scheduling data refreshes, it was decided that a complete removal and install from scratch was in order. This is where things got particularly tricky. The configuration tool was unable to run its uninstall process, so I elected to remove everything manually, which consisted of deactivating the PowerPivot features in the destination site collections and in Central Admin, then retracting the PowerPivot solutions from Farm Solutions. Finally, SQL Server setup was run, and the PowerPivot instance was removed from the server completely.

Removing PowerPivot Breaks Central Administration

At this point, Central Administration became unavailable. A quick search through the ULS logs for the correlation ID reported that there were errors loading the Microsoft.AnalysisServices.SharePoint.Integration assembly. As a next step, I ran the SharePoint Configuration Wizard (psconfigui) to make sure that everything was OK. As it turns out, it wasn’t, and the wizard repeatedly failed.

image

After some searching, I located a TechNet article that pointed out the problem. Apparently with 2008 R2, PowerPivot has a little trouble uninstalling itself, and leaves a few artefacts. Two of the artefacts are the registry keys listed in the article, and when the configuration wizard sees them, it tries to load the assembly, which of course no longer exists, and then it fails out. Removing the two keys fixes this particular problem, and I’m reproducing the steps for doing so below for convenience.

Open the registry editor. To do this, type regedit in the Run command.
Expand HKEY_LOCAL_MACHINE
Expand SOFTWARE
Expand Microsoft
Expand Shared Tools
Expand Web Server Extensions
Expand 14.0
Expand WSS
Expand ServiceProxies
Right-click Microsoft.AnalysisServices.Sharepoint.Integration.MidTierServiceProxy and select Delete.
Go back a step in the hierarchy. Under WSS, expand Services
Right-click Microsoft.AnalysisServices.Sharepoint.Integration.MidTierService and select Delete.
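If you would rather script the cleanup than click through the registry editor, a rough PowerShell equivalent (assuming the same two keys as above, run as an administrator on each affected server) is:

# Remove the orphaned PowerPivot 2008 R2 service and proxy registrations
$wss = "HKLM:\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\14.0\WSS"
Remove-Item -Path "$wss\ServiceProxies\Microsoft.AnalysisServices.Sharepoint.Integration.MidTierServiceProxy" -Recurse -ErrorAction SilentlyContinue
Remove-Item -Path "$wss\Services\Microsoft.AnalysisServices.Sharepoint.Integration.MidTierService" -Recurse -ErrorAction SilentlyContinue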

After getting rid of the registry entries, the configuration wizard completed successfully, and more importantly, Central Administration loaded properly. In addition to this, I also removed the same registry keys from the second application server in anticipation of a reinstall.

It was therefore time to reinstall PowerPivot. Installation from the SQL server media went fine (don’t forget to add all necessary accounts as administrators, particularly the account that is running the Analysis Services Windows service).

Parent Service Does Not Exist

However, re-running the PowerPivot configuration Tool resulted in the error “Cannot create the service instance because the parent Service does not exist”.  

image

This turned out to be a particularly vexing problem. The only guidance that I could find in the forums was incorrect (things like “the Database engine must be installed with the PowerPivot instance”, which is patently false). I went to the source PowerShell scripts that register the service, and was able to run them and register the service. The PowerPivot service then appeared under Services on Server, and I was even able to create a new PowerPivot service application through the Central Administration UI. However, attempting to access it resulted in an error (which I no longer have access to). Examining the ULS logs showed “access denied” errors when attempting to connect to the Analysis Services instance.

In addition, subsequent attempts to run the PowerPivot configuration tool resulted in a perplexing error stating that all servers in the farm must run the same service account, and that they should all be changed to run as LOCAL SERVICE. This is particularly odd given that PowerPivot MUST be installed with a domain account, so this isn’t even possible. As it turned out, this was a red herring, and occurred because I had incorrectly specified the credentials of the service account in my PowerShell scripts from above, and the service had been provisioned using the LOCAL SERVICE account. I was able to change the identity of the service through Central Administration in the Security section, by configuring the service accounts, and then the PowerPivot configuration tool was able to run to completion. However, the service application and its proxy appeared as stopped in the Service Applications list.

Manual Uninstallation

Something was still amiss, and for the life of me I couldn’t figure out what it was. At this point, I engaged the PowerPivot support team. After eliminating a few other potential issues, we decided to manually uninstall PowerPivot and use a few clean-up tools afterward. I again deleted service applications, deactivated features, retracted and removed solutions, and uninstalled PowerPivot for SharePoint with the SQL Setup application. Once everything had been theoretically removed, I ran this PowerShell script (repeated below) provided by the Support team to force the removal of all remaining PowerPivot items from the farm.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction:SilentlyContinue
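# Helper functions: delete a service (and its instances), its service applications, and its application proxies from the farm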

Function DeleteServiceAndInstances
{
    param($service)
    if($service)
    {
        foreach($instance in $service.Instances)
        {
            $instance.Delete()
        }
 
        $service.Delete()
    }
}

Function DeleteServiceApplications
{
    param($service)
    if($service)
    {
        foreach($instance in $service.Applications)
        {
            $instance.Delete()
        }
    }
}

Function DeleteServiceApplicationProxies
{
    param($proxy)
    if($proxy)
    {
        foreach($instance in $proxy.ApplicationProxies)
        {
            $instance.Delete()
        }
    }
}

Function DeletePowerPivotEngineServiceAndInstances
{
    $farm = Get-SPFarm
    $service = $farm.Services | where {$_.GetType().GUID -eq "324A283C-6525-43c8-806C-31D8C92D15B7"}
    DeleteServiceAndInstances $service
}

Function DeletePowerPivotMidTierServiceAndInstances
{
    $farm = Get-SPFarm
    $service = $farm.Services | where {$_.GetType().GUID -eq "35F084BE-5ED5-4735-ADAA-6DB08C03EF26"}
    foreach($job in $service.JobDefinitions)
    {
        $job.Delete()
    }

    DeleteServiceApplications $service    
    DeleteServiceAndInstances $service
}

Function DeletePowerPivotEngineServiceProxy
{
    $farm = Get-SPFarm
    $proxy = $farm.ServiceProxies | where {$_.TypeName -eq "Microsoft.AnalysisServices.SharePoint.Integration.EngineServiceProxy"}
    if ($proxy)
    {
        $proxy.Delete()
    }
}

Function DeletePowerPivotMidTierServiceProxy
{
    $farm = Get-SPFarm
    $proxy = $farm.ServiceProxies | where {$_.TypeName -eq "Microsoft.AnalysisServices.SharePoint.Integration.MidTierServiceProxy"}
    if ($proxy)
    {
        DeleteServiceApplicationProxies $proxy
        $proxy.Delete()
    }
}


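# Remove the proxies first, then the PowerPivot engine and mid-tier services themselves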
DeletePowerPivotEngineServiceProxy
DeletePowerPivotMidTierServiceProxy
DeletePowerPivotEngineServiceAndInstances
DeletePowerPivotMidTierServiceAndInstances

This script ran successfully, and in theory should have removed all of the PowerPivot elements from the SharePoint farm. However, just to be sure, we ran the following SQL script (also provided by MS support) to look for any orphaned PowerPivot elements in the farm configuration database:

   SELECT Id, classid, parentid, name, status, version, properties
   FROM objects
   WHERE name like '%PowerPivot%'
   ORDER BY name

Lo and behold, there was still a record in the configuration database pointing to an instance that obviously no longer existed. Now, direct editing of the configuration databases is not something that anyone should do lightly, and it’s not generally supported. However, in a few cases, it’s the only option, and according to support, this was one of those cases. I therefore took a backup (ALWAYS take a backup first!) of the config database, and then ran the following SQL query to determine if the instance had any configuration dependencies.

   SELECT * FROM Objects (NOLOCK) WHERE Properties LIKE '%GUID-found-in-earlier-query%'

The GUID is the value of the ID field found in the previous query for the orphaned item. In my case, there were no items returned, but if there were, the dependencies would need to be removed. If you find dependencies, you will need to contact support yourself, as there are additional complicating factors involved that will need to be evaluated by the product group.

Since I had no dependencies, I was safely able to delete the offending record.

   DELETE FROM objects WHERE name like '%PowerPivot%'

My Configuration Database was now clean.

The Return of the Parent Service Error

At this point, I reinstalled PowerPivot for SharePoint, and once again ran the PowerPivot configuration tool. Unfortunately, I ran right into the “Parent Service Does Not Exist” error discussed above. This was frustrating, to say the least. After several choice words, and some sleep, I decided to do yet another uninstall/reinstall. The difference this time was that instead of retracting solutions manually, I would allow the configuration tool to do the uninstall for me. When I did, it generated an error when it attempted to uninstall a feature:

The feature with ID ‘1a33a234-b4a4-4fc6-96c2-8bdb56388bd5’ is still activated within this farm or stand alone installation. Deactivate this feature in the various locations where it is activated or use -force to force uninstallation of this feature.

This GUID is the PowerPivotSite feature, which was still activated in one of the site collections. I then used stsadm to force the uninstallation of the feature (I used stsadm simply because I’m a dinosaur and know the switches off the top of my head – obviously PowerShell would work just as well, if not better).

stsadm -o uninstallfeature -id 1a33a234-b4a4-4fc6-96c2-8bdb56388bd5 -force
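For the PowerShell-inclined, a rough equivalent (assuming the feature definition should be forcibly removed farm-wide) would be:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Force removal of the orphaned PowerPivotSite feature definition, even though it is still activated somewhere
Uninstall-SPFeature -Identity "1a33a234-b4a4-4fc6-96c2-8bdb56388bd5" -Force -Confirm:$false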

After forcibly removing the feature, I used the PowerPivot configuration tool to remove PowerPivot. Once removed, I ran it again to configure the PowerPivot instance, and this time, it completed successfully.

I did note one thing that I thought was odd, and haven’t been able to explain. On this last run of the configuration tool, the step that provisioned the PowerPivot service application took an inordinately long amount of time, about an hour. Not content to leave well enough alone, I deleted the service application, and re-ran the tool. This time it completed in under a minute. I only mention this in case someone tries this and gives up because they think the system has hung up.

Once I got the first server up and running, it was relatively simple to install PowerPivot for SharePoint 2012 on the second application server, and run the configuration tool, which did everything necessary to add the second application server back into the PowerPivot rotation.

After cleaning up the remaining configuration items common to PowerPivot for SharePoint, and clearing the Health Analyzer errors, PowerPivot now appears to be running smoothly on this farm.


Installing and/or Upgrading a Multi-Server SQL Server Reporting Services 2012 SharePoint Mode Farm

A few months ago I posted an article discussing how to upgrade integrated mode Reporting Services from 2008R2 to 2012. That article pretty well assumed a small SharePoint farm, with a single SharePoint server, a single SQL server, and with Reporting Services running on the SharePoint server. In this article, I’d like to address upgrading and/or installing on a medium or large farm, but to do so, I need to first discuss the nuances involved in scaling out the Reporting Services service application of a SharePoint 2010 farm.

As I discussed previously, and as the Service Application architecture of Reporting Services 2012 makes plain, the Reporting Services bits from the SQL Server installation media must be installed on a server that has the SharePoint bits installed and is joined to the farm. In the simple farm scenario mentioned above, that’s fairly straightforward – it’s the SharePoint server (NOT the SQL server!). However, if your SharePoint farm consists of multiple servers, you need to decide where you want your report processing to be done. With 2012, Reporting Services is a full-fledged SharePoint service application, which means that it is relatively straightforward to load balance this processing.

A recent project that I worked on had an architecture very similar to the diagram found on the MSDN article explaining how to scale out Reporting Services:

image

The only difference in my case was that there were 4 servers in the front end role (1). Of course the servers in the application role served up more than just Reporting Services, but the diagram is essentially accurate. In our case, the SQL Server cluster was running SQL Server 2008 R2, but that was inconsequential because only the Reporting Services instances on the two Application Servers were being upgraded.

If you are upgrading, there are a number of additional steps required first. No matter what, the encryption key should be backed up first – it will be needed after the upgrade unless you want to recreate all of your data connections. Once that is backed up, you should of course back up your two Reporting Services databases. Next, if you have a load balanced (Reporting Services) environment, I recommend removing all of the load balanced RS servers from rotation, leaving only the server that will be used for the primary upgrade. To keep things very simple, I also recommend completely uninstalling Reporting Services from all but that one server.

Once complete, the main server can be upgraded according to the instructions laid out in my previous article. However, you may find, as I have, that SSRS doesn’t always want to be upgraded cleanly. The good news is that this is relatively easy to recover from. The SSRS service application works like other service applications in that when it is created and you specify an existing database, that database will be upgraded automatically. Therefore, if you find yourself with an uncooperative SSRS installation, simply uninstall it, and install the 2012 components from scratch, making sure to use the name of your existing RS database when the service application is created. Once complete, restore your encryption key, and you should be good to go.
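As a rough PowerShell sketch of that recovery – the cmdlet names are, to the best of my knowledge, the ones that ship with the SSRS 2012 bits, but the application pool, account and database names below are placeholders, and the database name must match your existing Report Server database so that it is upgraded rather than recreated:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Register the Reporting Services service and proxy with the farm
Install-SPRSService
Install-SPRSServiceProxy

# Create the service application against the EXISTING database so that it is upgraded in place
$pool = New-SPServiceApplicationPool -Name "SSRS App Pool" -Account "DOMAIN\svcSSRS"   # must be a registered managed account
$app  = New-SPRSServiceApplication -Name "SSRS Service Application" -ApplicationPool $pool -DatabaseName "ReportingService"
New-SPRSServiceApplicationProxy -Name "SSRS Service Application Proxy" -ServiceApplication $app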

Whether you are upgrading or installing fresh, there are a number of differences when installing to a multi server farm compared to a single server installation.

Firstly, when installing SSRS 2012, you will be presented with a screen where you may choose the SQL features to be installed.

image

There are two components to Reporting Services in SharePoint mode: Reporting Services – SharePoint, and the Reporting Services Add-in for SharePoint Products. A lot of the guidance simply says to select both options and continue. However, in a multi server farm it is important to understand the difference between the two.

Feature #1 is the core of Reporting Services – what in the past would have been the instance, but is now the service application itself. Feature #2 is the add-in, which has been around since the first integrated mode SSRS. It is used by SharePoint to connect to SSRS. In the past, that was a connection to the SSRS web services, but it is now how the SharePoint front end servers connect to the service application. Therefore, in a multi server farm, feature #1 should be installed on every application server that will process SSRS reports, and feature #2 must be installed on every server participating in the front end role. Since application servers often perform a dual role, at least for administrators, I recommend installing both features on application servers.

I’m not going to walk through the steps required to create the service application and light up the features in this article, as there is quite a bit of good guidance on that available. I also wrote one up for installing on SharePoint 2013 which is pretty much identical to 2010 for SSRS.

Once the initial installation is complete, it should be repeated on every server that will participate in the application server role. Obviously, the service application only needs to be created once. Once all of the bits are installed on all of the relevant servers, simply navigate to Services on Server under System Settings in Central Admin, and start the “SQL Server Reporting Services” service on every application server. Once that’s done, you’ll have a load balanced, multi-server Reporting Services service.
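If you prefer to script that last step, a hedged sketch follows – the server names are placeholders, and the TypeName match uses a wildcard because the exact display name can vary:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Start the Reporting Services service instance on each application server
$appServers = "APP01", "APP02"   # placeholder server names
foreach ($server in $appServers) {
    Get-SPServiceInstance -Server $server |
        Where-Object { $_.TypeName -like "*Reporting Services*" -and $_.Status -ne "Online" } |
        Start-SPServiceInstance
}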

It is worth calling out a common error encountered in the multi-server farm scenario. You may find that after your upgrade or installation has completed, attempts to access a report from the front end servers result in a connection error, “The attempt to connect to the report server failed. Check your connection information and that the report server is a compatible version”.

image

It’s the last bit of the error that’s relevant. This problem arises when the add-in hasn’t been installed on the front end servers in the farm. Now, you may remember that the Reporting Services Add-in is one of the prerequisites that the prerequisite installer installs on your SharePoint boxes. In the case of an upgrade, you may also remember that you never had to do this before to get SSRS working. So why is it that we need to do this now? It’s because the add-in included with the prerequisite installer is for SSRS 2008 R2, and we’ve just added SSRS 2012. The add-ins are not forward compatible, and therefore the 2012 add-in needs to be installed on every front end server in the farm.

So to recap, in order to scale out Reporting Services: install the service on one application server and get it working in the farm. Then install the service on the remaining application servers, start the service on each server, and install the RS add-in for 2012 on all of the front end servers.
