

Quick PowerPivot/PowerView Demonstration on SharePoint Shop Talk

On a recent SharePoint Shop Talk, I presented a quick overview of PowerPivot and PowerView working in SharePoint. The demonstration uses some real-world data and covers a number of the important features of the products. It's not terribly long, and if you're interested in these products, you may want to check it out. I've embedded it below.

Also – if you’re so inclined, SharePoint Shop Talk runs live every Thursday from 12:30 to 1:30 PM Eastern time. Come out and be a part of the conversation – you never know what you might learn.


The Health Analyzer and PowerPivot for SharePoint

*Note – Updated 2012-09-07

The Health Analyzer that first appeared in SharePoint 2010 is a fantastic idea. Timer jobs run periodically (on a schedule you define) to check predefined health rules, and when a rule fails, you are alerted in Central Administration with a yellow or red error bar, depending on the severity. This can bring problems to an administrator's attention before their effects are felt. Unfortunately, some of the default rules are either useless or buggy, with the result that almost any farm is constantly in an alerted state. This erodes confidence in the Health Analyzer and causes administrators to ignore its alerts, which is a pity, because when properly configured, they can be quite useful.

The trick is to spend a few minutes deactivating the rules that don't work properly, or that are simply unnecessary. My personal favourite is the alert "Database has large amounts of unused space" – which is actually a perfectly healthy condition.

All that you really need to do for these buggy rules is to open them up, and disable them. It’s relatively straightforward to do so. While in Central Administration, just open up any of the alerts, and select the “View” link in the Rule Settings section.

[screenshot]

From the resulting screen, just click Edit in the ribbon, and deselect the “Enabled” option.

[screenshot]

Taking the time to either do this, or to remedy the underlying condition, for each of the alerts that you're receiving on your farm will turn your Health Analyzer from a nuisance into a valuable monitoring tool.
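If you have a number of rules to deal with, they can also be disabled with PowerShell. The one-liner below is a sketch only – it matches on the rule's summary text as it appears in Central Administration, so substitute the text of the rule that you actually want to disable:

Get-SPHealthAnalysisRule | Where-Object {$_.Summary -like "Database has large amounts of unused space*"} | Disable-SPHealthAnalysisRule -Confirm:$false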

In order to test whether or not your changes have worked, you need to run all of the Health Analyzer jobs. You can do this manually, through Central Administration, or you can run the following PowerShell (by Gary Lapointe, lifted shamelessly from Matthew McDermott’s blog):

Get-SPTimerJob | where {$_.Title -like "Health Analysis Job*"} | Start-SPTimerJob

Installing PowerPivot for SharePoint introduces a group of new rules to the mix, and like their out-of-the-box cousins, they are a mixed blessing. Some are important, and some are downright wrong. I had occasion recently to do a clean install of PowerPivot for SharePoint 2012, and decided to document the Health Analyzer results, and their required actions, here.

Upon completion of the PowerPivot for SharePoint install, I was initially taken aback to find so many errors:

[screenshot]

OK, one of these has nothing to do with PowerPivot, but we’ll tackle the rest. In reality, they’re not so bad taken one at a time. We’ll start with one that’s potentially very important:

1. Built-in accounts are used as application pool or service identities.

It's good practice to always use domain accounts for any service in SharePoint, which is why this alert exists. Opening it up points to the offending culprit, the Claims to Windows Token Service (C2WTS). By default, it is set up to run as the Local System account, and there's a good reason for that.

[screenshot]

The C2WTS service is used by PowerPivot (OK… by Excel Services) to convert claims-based identities to Windows identities. Services that aren't claims aware (like Excel Services) need it to work, and it is therefore required for PowerPivot to work properly.

It's relatively straightforward to change the identity for this service, but whatever you do, change it using Central Administration, not through Windows Services. SharePoint "knows" about the identity and will eventually reset it if you use the Windows interface. To change the identity, simply navigate to Security – Configure Service Accounts and select the appropriate managed account. Be warned though – if you do, you have more work to do, and in my opinion, it's best to leave it as Local System.

The problem is that the C2WTS must run either as Local System, or with the identity of an account that is a member of the local Administrators group on the server. If you do decide to use a domain account, you will need to add it to that group (see this discussion for more detail). However, if you do, you'll simply trigger another rule warning you about process identities being in the local Administrators group. You just can't win.

By the way, if you do decide to switch the identity to a domain user and then want to switch it back, it's not so simple. The service account UI only allows the selection of managed accounts, and Local System isn't one of them. If you do need to switch back, just run the PowerShell scripts listed in this TechNet article.
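For reference, the approach boils down to something like the sketch below. Treat it as an outline only and verify it against the TechNet article before running it – the type name match assumes an English-language install, and it should be run from the SharePoint Management Shell:

$c2wts = Get-SPServiceInstance | Where-Object {$_.TypeName -eq "Claims to Windows Token Service"} | Select-Object -First 1
$c2wts.Service.ProcessIdentity.CurrentIdentityType = "LocalSystem"
$c2wts.Service.ProcessIdentity.Update()
$c2wts.Service.ProcessIdentity.Deploy()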

Either way you go, you’ll simply want to deactivate the offending rule.

2. PowerPivot: The Analysis Services instance runs in tabular mode, but the configuration setting that specifies this mode is turned off.

Sounds frightening. However, opening up the alert reveals the true problem.

[screenshot]

The issue is actually that the rule can't see that the tabular mode setting is set properly, so it assumes the worst and alerts you. The reason it can't see it is that the account used by the Health Analyzer doesn't have access to the configuration file on the file system. That account is typically the farm account, and granting that account read access to the file will prevent the alert from reappearing.

Or you can simply turn the rule off – your choice. Given that there's little chance of something sneaking into the PowerPivot instance and switching the configuration away from tabular mode, you're probably pretty safe.

3. PowerPivot: MidTier process account should have ‘Full Read’ permission on all associated SPWebApplications

This one matters, especially if you need to do data refresh. Don’t turn it off. The detail on this alert doesn’t really add anything, so I’m not showing it here.

By "MidTier" it means the process account running the PowerPivot service application. If you used the (very good) PowerPivot Configuration Tool that ships with SQL Server 2012 and selected the defaults, chances are that the account is your farm account. If that works for you, and your farm account has access to all of your content, you likely aren't getting this alert – but in my case, I don't typically give the farm account access to the site collections.

In my recent case, I needed to do two things – change the account used by PowerPivot to a different account, and then grant that account access to the farm content.

Firstly, if you're unsure of which account PowerPivot is using, navigate to Security – Configure Service Accounts. Your list of pools/services will vary, but once you select the one used by your service application, you'll see it in the list.

[screenshot]

Changing the Application Pool for PowerPivot

If you need to change the account, you’ll need to navigate to your PowerPivot Service Application. From Central Administration, select Application Management-Manage Service Applications, and then select (but don’t click through!) your PPSA.

[screenshot]

With the PPSA selected, click on the Properties button. Then, in the resulting dialog, select or create the application pool that you wish to use for PowerPivot.

[screenshot]

Take note of the managed account that you will be using for the application pool, as that is the account that will need full read access in the next step.

Setting Permissions for the PowerPivot Application Pool

You could go through your site collection and add the service account to any relevant groups, etc, but the easiest way to accomplish what is necessary is to treat it the same way that Search does the default content access account, and add it to the User Policy for the relevant SharePoint web applications. To do this, from Central Admin, navigate to Application Management-Manage Web Applications. Next, select the web application in question (again, don’t click through), and click the “User Policy” button in the ribbon.

[screenshot]

If your account is not already listed, click "Add Users" and follow the prompts. When presented with the option, select "Full Read" for the permission level. When complete, you should see an entry similar to the following:

[screenshot]

Repeat this process for all applications in the farm. Once this is complete, PowerPivot should have the permissions that it needs, and you should receive no further alerts of this type.
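If you have several web applications, this step can also be scripted. The following is a minimal PowerShell sketch, assuming a classic-mode web application and a hypothetical service account name – adjust both for your environment (claims-based web applications expect the account in claims format):

# Grant the PowerPivot application pool account Full Read on every web application
$account = "CONTOSO\svcPowerPivot"   # hypothetical PowerPivot service account
foreach ($wa in Get-SPWebApplication) {
    $policy = $wa.Policies.Add($account, "PowerPivot Service Account")
    $policy.PolicyRoleBindings.Add($wa.PolicyRoles.GetSpecialRole([Microsoft.SharePoint.Administration.SPPolicyRoleType]::FullRead))
    $wa.Update()
}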

4. PowerPivot: Usage data is not getting updated at the expected frequency.

[screenshot]

This alert may or may not be important to you, but I do recommend getting it fixed. PowerPivot maintains usage data in its service application database, and the service application can surface that data to administrators in Central Administration. To see this, simply navigate to the PowerPivot service application in Central Administration, and click through it to manage.

It actually uses PowerPivot workbooks to surface the information, so if PowerPivot isn’t set up correctly, you will see errors. In addition, these workbooks need to be refreshed from the original data source. This refresh is performed by a timer job, which updates the underlying workbook. This Health Analyzer job looks at the most recently updated date of the workbook, and fires the alert if it hasn’t been updated recently. This is good, because in a default installation, it won’t be.

There is a bug in the setup procedure that prevents the correct permissions from being applied to the service application database (see this discussion for more detail – the bug is still there in SQL 2012 RTM). Fortunately, the fix is relatively straightforward.

You'll need to run SQL Server Management Studio and connect to the SQL Server instance hosting your PowerPivot service application database. Open the database and navigate to Security – Users. You should see your service account already there (if not, add it), then right-click on it and select Properties. In the resulting dialog box, select Membership, and check both the db_datareader and db_datawriter roles if they are not already selected.
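If you prefer to script the change, the equivalent can be done from PowerShell using the SQL Server 2012 PowerShell module. This is only a sketch – the server, database, and account names are placeholders for whatever your configuration created, and the account must already exist as a user in that database:

Import-Module sqlps -DisableNameChecking
$db = "DefaultPowerPivotServiceApplicationDB"   # your PowerPivot service application database
$acct = "CONTOSO\svcPowerPivot"                 # the PowerPivot service account
Invoke-Sqlcmd -ServerInstance "SQLSERVER" -Database $db -Query @"
EXEC sp_addrolemember 'db_datareader', '$acct';
EXEC sp_addrolemember 'db_datawriter', '$acct';
"@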

Once that’s done, the service application will be able to use the usage data for PowerPivot.

However, this will not update the relevant workbook immediately. If you want to do that, you need to navigate to the "PowerPivot Management Dashboard Processing Timer Job" (available in Central Administration via Monitoring – Review job definitions), open it up and click the "Run Now" button. This should refresh the workbook, the dashboard will then work properly, and the alert should stop firing (at least for now…).
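Alternatively, the job can be started with PowerShell, in the same way as the Health Analysis jobs earlier (the title match below assumes the job name shown above):

Get-SPTimerJob | where {$_.Title -like "PowerPivot Management Dashboard Processing*"} | Start-SPTimerJob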

5. PowerPivot: ADOMD.NET is not installed on a standalone WFE that is configured for central admin

You will see this alert if you're using PowerPivot for SharePoint from SQL Server 2012. ADOMD.NET is a dependency for PowerPivot, and the Health Analyzer is alerting you to the fact that it is not installed. The problem is that it IS installed. What's happening? The health rule is looking for the version of ADOMD.NET that ships with SQL Server 2008 R2, but the version that ships with SQL Server 2012 is newer (11.0.2100.60). The check therefore fails, and the alert fires. (For more info, check out Trevor Seward's article.)

What’s the solution? Disable the rule. You run the risk of not noticing if the file gets removed, but I bet you’ll notice that when PowerPivot stops working.

The points below were added 2012-09-07.

6.  PowerPivot: Registry settings for Microsoft.AnalysisServices.ChannelTransport.dll are not valid after you install the SQL Server 2008 R2 version of the MSOLAP provider on this computer

This alert will show up most commonly in larger farms – farms that have separated the shared services roles from the front end server roles – and it will show up on the front end servers. The problem occurs because although the PowerPivot features get deployed to all servers, the deployed assemblies do not get registered properly in the Global Assembly Cache on the front end servers. The PowerPivot install registers them on the application server(s), so the alert shouldn't appear there.

The solution is to register them manually on all of the affected front end servers.

To do so, open a command prompt in administrator mode. Once open, navigate to:

C:\Windows\assembly\GAC_MSIL\Microsoft.AnalysisServices.ChannelTransport

Once here, there will be another subdirectory that will depend on the version of PowerPivot for SharePoint installed. The SQL 2012 version will start with 11, the 2008 version with 10, etc. Navigate into that folder. Once in that folder, run the following command:

C:\Windows\Microsoft.NET\Framework64\v2.0.50727\regasm.exe Microsoft.AnalysisServices.ChannelTransport.dll

That should be all that you need to do to fix this one. It will also let PowerPivot work correctly, which is of course a good thing.

7. PowerPivot: Secondary Logon service (seclogon) is disabled

The Secondary Logon service is a built-in Windows service, and it is used by PowerPivot to generate the thumbnails that appear in the PowerPivot Gallery. Its startup type in Windows is Manual by default. According to the PowerPivot health rules reference, this should be sufficient for correct operation. Unfortunately, in some cases it isn't: the service never starts, the Health Analyzer thinks that the service is disabled, and the thumbnails don't get built.

Fortunately, the fix for this is simple. Log in to each of the servers in the farm, start the service, and set its startup type to Automatic. That's it.
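Or, from an elevated PowerShell prompt on each server, something like this one-liner should do it (a sketch, using the built-in seclogon service name):

Set-Service -Name seclogon -StartupType Automatic; Start-Service -Name seclogon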

 

This represents the bulk of the Health Analyzer issues with PowerPivot that I’ve come across. If you have others, or I’m out to lunch on any of my analysis, please drop me a comment – I’m keen to hear.


Fixing Access Errors When Creating a BISM Connection for PowerView

PowerView is one of the shiny new features that become available to SharePoint users with the implementation of SQL Server 2012. To be sure, you don't need to convert your database engine over to SQL Server 2012, but you do need to install PowerPivot for SharePoint, and/or the new Reporting Services service application for SharePoint, both of which are only available in SQL Server 2012.

In order to function, PowerView requires a connection to an Analysis Services engine running in tabular mode (the xVelocity in-memory engine, formerly known as VertiPaq), or to an Excel workbook with an embedded PowerPivot model. When connecting to the Analysis Services engine, there are a couple of permissions that need to be set on that engine that you should be aware of.

If you wish to connect to the Analysis Services engine, you have two choices. Firstly, you can use a Reporting Services connection (.rds) and select “Microsoft BI Semantic Model for PowerView” as the Data Source type.

[screenshot]

Once this is done, it can be used to create PowerView reports. This approach works without any additional configuration. The reason is that Reporting Services takes advantage of the EFFECTIVEUSERNAME connection property for user impersonation, and BISM connections do the same. When using an RS connection, the stored credential (in this case) is used to establish a connection with the data source, and then all queries are executed with the identity of the requesting end user (provided that the "Set execution context…" option is not selected).

For further reading on how Reporting Services uses the EFFECTIVEUSERNAME property, check this article. Since BISM uses the same property, it’s highly relevant to PowerView.
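To illustrate, the property can be set directly on an Analysis Services connection string – the server, database, and user names below are placeholders only:

Data Source=ASSERVER\POWERPIVOT;Initial Catalog=SalesTabularModel;EffectiveUserName=CONTOSO\enduser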

However, a problem arises if we choose the second option and want to work with a BISM file directly. If we want to use a basic BISM file, we first add it by creating a document with the BI Semantic Model content type.

[screenshot]

However, the connection form is far simpler than the one used to configure the Reporting Services connection. You’ll notice immediately that there is no option for storing credentials. For a server connection, all that is necessary is a name for the connection, the name of the server, and the name of the database.

Unfortunately, if you haven’t gotten everything configured just so and you click ok, you’ll receive the error “Cannot connect to the server or database”.

[screenshot]

This is due to the fact that when the server performs the validation check against the Analysis Services server, it uses a system account to first connect to the AS server. This is documented in this Connect forum entry. The corrective action is to add the account to the Analysis Services Server Administrator role:

[screenshot]

What account needs to be added? The forum entry that I referenced above indicates the PowerPivot service account, which is now listed in SharePoint as the "SQL Server Analysis Services" account. However, my testing has indicated that the farm account is the one that is used to verify connections with Analysis Services, so that's what I'm recommending here. The account is only used for validation purposes, and to get around this problem without adding it to the administrators role, you can simply select the "Save the link file without connection validation" option. This may, however, be unsettling to users.

Once you've created your BISM connection, you should then be able to use it to launch Excel, or to create a PowerView report. When you launch Excel, the BISM is referenced by an Excel connection, which then connects directly to the data source with the credentials of the end user. No problem. However, if you attempt to create a PowerView report, you may find yourself faced with "An error occurred while loading the model for the item or data source".

[screenshot]

The body of the message misleadingly leads you to believe that it's the end user (btidwell in this case) that doesn't have access. What should happen is that, as explained above, the Reporting Services service makes a connection with the AS server, and then uses the EffectiveUserName property to impersonate the user. For more detail on how BISM authenticates to back-end data sources, see this MSDN article.

If you’re seeing this error, the chances are that the service account that is being used for Reporting Services doesn’t have access to the Analysis Services database.

As in the verification step above, granting the account access to the specific database doesn't do it – you need to add the service account that is used by Reporting Services to the Server Administrators role. Once you've done that, PowerView should behave smoothly.


Connecting to Cubes and External Data with Excel in SharePoint

While many people are still unaware of it, Excel 2010 (and even previous versions) is a very powerful business intelligence client. That’s right, I said Excel. And I don’t mean the classic grab some data, do some charts and email it around sort of Excel, I mean connecting it to Analysis Services cubes and performing fast, useful data analytics on known sets of data. It can also go off and do some pretty amazing things with the Data Mining add-in, or PowerPivot, but for now I’m going to restrict myself to using core capabilities, and getting them published to SharePoint.

To start with, it's important to understand how SharePoint interacts with Excel. Of course, at its core, you can store Excel files in a SharePoint document library and open them in Excel. However, starting with SharePoint 2007, SharePoint included Excel Services, which allows you to open a spreadsheet directly in the browser (without having Excel installed or using any ActiveX trickery). It also provides an Excel calculation engine that is programmatically callable. Why would that matter? A power user could develop a complex model with Excel and store it in SharePoint, where it could then be used as a calculation "black box" for other things. The model could also be tweaked by the power user as necessary without them needing to worry about coding. Finally, you can expose part of the spreadsheet (any named range or object) through the Excel Services web part, which truly allows you to incorporate Excel content into a dashboard.

SharePoint 2010 brings more to the table, including not only the browser consumption of spreadsheet content, but editing through the Excel Web Application.

However, all of this power does not come without its risks, and when interacting with external data, risks abound. Microsoft has done an excellent job of providing us with a highly secure infrastructure that allows us to do what we want here, but the only trouble is that it’s not always clear which approach to take. My aim in this article is to navigate through the process of connecting to an external data source (an analysis services cube, but the principle applies to others) in a particular scenario in a simple fashion. Our scenario is a fairly common one:

  • One Front End SharePoint Server
  • One SQL back end Server hosting the SharePoint databases
  • One SQL server hosting the Data Warehouse and the Analysis Services cubes
  • Kerberos not installed in the environment

In this environment, we have an authentication problem, the famous double hop problem. If I as a user ask the server to process a spreadsheet that connects to external data, it can’t just take my authentication token and pass it along to the external data source. What we need to do is to set up a proxy account (this has its own set of implications that I won’t get into here) and we’ll do that via the secure store service. If you’re using Kerberos, then you don’t have the double hop problem and the Secure Store part of this won’t apply, but then you’re probably busy enough dealing with Kerberos issues anyway….

If you’ve ever connected to external data, and then sent the file to someone else, you’ll know that they get prompted to verify that they want to connect to the data source and that they trust the connection. In this case, the author has created an embedded data connection. Embedded data connections are not allowed (or at least are strongly discouraged) on the server, because it has no way of verifying the authenticity of the connection.

The way that we deal with this in SharePoint is by creating a connection file, and then publishing it to a trusted location in SharePoint. Administrators need to take heed, in that by default, Excel Services trusts all SharePoint locations, which makes life easy, but not necessarily secure. The library that the connections are stored in should also utilize approval features, but this is not required. This walkthrough isn’t concerning itself with the security aspects, but they need to be considered in any real world situation.

Most of the steps below only need to be performed once, and connection documents can be reused, etc. However, this walkthrough is aimed at getting everything all set up from scratch.

We also don't want to store our credentials in our connection string, so we will utilize the Secure Store Service in SharePoint, which allows us to use a set of credentials without necessarily knowing what they are. This allows us to work around the double hop problem in a secure way. We will start with the setup of the Secure Store Service.

1. Set up the Secure Store Service ID

From Central Administration, navigate to Manage Service Applications, and click on your Secure Store application. If you don’t have one already, you’ll need to create one. You should know that SharePoint Foundation does NOT come with the Secure Store service. However, Search Server Express 2010 does come with it (and a few other things). Given that it’s free, it provides a nice option.

The Secure Store Service Application relies on both the Secure Store Service (duh) and the Claims to Windows Token Service. You'll need to make sure that they're both started in the "Services on Server" section of Central Administration's System Settings.
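If you prefer PowerShell, something like the following should start both of them on the local server – a sketch only, and the type names assume an English-language install:

Get-SPServiceInstance -Server $env:COMPUTERNAME | Where-Object {($_.TypeName -eq "Secure Store Service" -or $_.TypeName -eq "Claims to Windows Token Service") -and $_.Status -ne "Online"} | Start-SPServiceInstance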

The Secure Store application requires an encryption key. If one has not already been created, you'll see a message indicating that you need to do so. The process is simple – just click the "Generate New Key" button in the ribbon.

[screenshot]

Once a key has been created, we need to create a new target application, which is essentially a set of credentials. This is the application that our connection strings will reference when they need to connect to a back end data source. You create a new application by clicking the “New” button in the ribbon. The New application screen then appears.

[screenshot]

There are a couple of things to note on this screen. Firstly, the Target Application ID is the ID that you will be using when you set up your connection below. You can name it what you like, but you’ll need to remember what it is. The Display Name and the Contact E-Mail need to be filled in, but the important thing to note is the Target Application Type field. If this is to be used by more than one person, you need to make sure that it is set to Group. By default, it is set to Individual, which doesn’t work so well in a shared environment. Take it from me – I found out the hard way. When this is filled in, click Next, and you’re presented with the following screen.

[screenshot]

The Administrators are just that – the people who will set the properties of this set of credentials. The Members are the people that will be allowed to use this credential set in connections, External Lists etc. In the example above it’s set to anyone authenticated, which again, I wouldn’t recommend in production…..

When done, click OK, and we’re done right? Not so fast. We never actually set the credentials for this application, just everything around it. To do that, we select the application, and click the “Set (credentials)” button in the ribbon, or hover over the ID in the list and select “Set Credentials” from the dropdown.

[screenshot]

In the subsequent screen, you enter the account and the password (twice) of the credentials that are to be used as a proxy. Click OK and you're done. A couple of things to note: if using a Windows account, the name should be in DOMAIN\ACCOUNT format. The proxy account is NOT a managed account, and if the password changes, you'll need to come back here to update it. The proxy account should be a least-privileges account, with access only to the things that are absolutely necessary to retrieve the appropriate data, or whatever its purpose is. Don't add it to the Domain Administrators group.

OK, now step 1 is done. Whew.
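Incidentally, if you need to repeat this setup in other environments, the whole target application can be created with PowerShell using the Secure Store cmdlets. The sketch below is an outline only – the application ID, accounts, URL, and password are all hypothetical placeholders:

# Define the two credential fields and the target application (group type)
$usernameField = New-SPSecureStoreApplicationField -Name "Windows User Name" -Type WindowsUserName -Masked:$false
$passwordField = New-SPSecureStoreApplicationField -Name "Windows Password" -Type WindowsPassword -Masked:$true
$targetApp = New-SPSecureStoreTargetApplication -Name "ExcelProxy" -FriendlyName "Excel Services Proxy" -ContactEmail "admin@contoso.com" -ApplicationType Group
# Administrators of the credential set, and the members allowed to use it
$admin = New-SPClaimsPrincipal -Identity "CONTOSO\spadmin" -IdentityType WindowsSamAccountName
$members = New-SPClaimsPrincipal -Identity "CONTOSO\Domain Users" -IdentityType WindowsSamAccountName
$app = New-SPSecureStoreApplication -ServiceContext "http://portal" -TargetApplication $targetApp -Fields $usernameField,$passwordField -Administrator $admin -CredentialsOwnerGroup $members
# Set the proxy credentials themselves
$user = ConvertTo-SecureString "CONTOSO\svcExcelProxy" -AsPlainText -Force
$pass = ConvertTo-SecureString "P@ssw0rd" -AsPlainText -Force
Update-SPSecureStoreGroupCredentialMapping -Identity $app -Values $user,$pass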

2. Set up a Data Connection Library

The next thing we need to do is to set up a library to store our data connections. If you've set up a BI site already (to use PerformancePoint), you can use the libraries there. Depending on your requirements, I find that it's often a good idea to centralize this library and use it from different locations, but your requirements may vary. What we want to create is a Data Connection Library. This library will be used to store connections for both the Office applications (ODC) and InfoPath (UDC).

From the Site Actions menu on the site where it is to be located, select More Options, then, in the subsequent dialog box, filter by Library and select Data Connection Library. Give it a name (don't use spaces here – you can always come back and add them later, but we don't want spaces in the internal name), and click Create.

[screenshot]

What makes this library special is that it uses the “Office Data Connection File” and the “Universal Data Connection File” content types. You could always add these content types to any old document library and achieve the same effect. You could also include Reporting Services connections if you’re using Reporting Services in integrated mode.

This library also needs to be registered as trusted in the Excel Services Application. This is done through the Manage Service Applications section of Central Administration. Simply click on your Excel Services application, click Trusted Data Connection Libraries, and add the URL of your library, if not already there.

[screenshot]
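The same registration can also be done with PowerShell using the Excel Services cmdlets – a minimal sketch, with a placeholder library URL and service application name:

New-SPExcelDataConnectionLibrary -Address "http://portal/DataConnections" -ExcelServiceApplication "Excel Services Application"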

3. Set up a Library to house the Excel Reports

You can store the Excel Report that we’ll be creating below into any document library in the Site Collection. If you have set up a BI Center, then you already have a “Reports” library, whose purpose is to do just that. Oddly, by default, this library isn’t set up to contain Reporting Services reports (.rdl files), but that isn’t relevant to my task here.

You simply need to create a new document library, and for the purposes of this example, I’ll call mine Excel Reports (original, huh?)

We're now ready to create our connection, and we'll use Excel to do that.

4. Create And Store The Connection File

Open a new Spreadsheet in Excel. Click on the Data tab, then click the “From Other Sources” button and choose “From Analysis Services” (obviously, if you’re using something else, choose that).

[screenshot]

Enter the name of your server and click Next. Select the database and cube that you want to connect to, and then click Next again. We'll need to do a couple of things in the last screen that appears.

[screenshot]

First, select “Always attempt to use this file to refresh data”. Then, click the “Authentication Settings” button. This is where it gets interesting.

[screenshot]

Windows Authentication is what you would use if the data resides on the same machine as the SharePoint front end, or if your organization is using Kerberos. With this setting, the credentials of the user are used to connect to the data source. If you select None, then the credentials identified as the "Unattended Service Account" (if configured) in the Excel Services configuration will be used. You can only use one of these accounts, and when configuring it, it too will use the Secure Store Service. We want to select our own credential set, and we do so by selecting SSS and entering the ID of the credential set that we created in step 1 above.

Next, we need to save this connection into the SharePoint data connection library that we created in step 2. Click the Browse button, enter the URL of the library in the address bar, choose a name, and click Save.

[screenshot]

Finally, when you're done, click Finish, and fill out the metadata form that pops up with any comments or keywords that you might want to use later to find the connection document. Now we have a connection document in the connection library.

5. Create The Excel Content

When you’re done creating the connection document, Excel will prompt you to insert a pivot table or chart. Choose the location, and Excel will insert it for you, and put you in pivot edit mode. You can select your dimensions and measures, and build your chart accordingly. When you’re ready, your worksheet should look something like this.

[screenshot]

Later, we'll want to show just the chart on a dashboard page, and to do that, we need to make sure that our chart has a logical name, as highlighted above. Any named range can be published out to an Excel Services web part. When ready, it's time to publish to SharePoint, but we also want to set some file properties first. These properties are very well hidden…

Click on the “File” tab to go to the backstage. Once in the backstage, click on the “Save and Send” tab on the left hand side. Then click on “Save to SharePoint”. Your screen will appear as follows:

[screenshot]

Finally, we need to click on the Publish Options button. This allows you to control what gets used by Excel Services, and what appears when the spreadsheet is opened in the browser. It also lets us define parameters which can be used by the Excel Services web part, but I won't be using parameters here. I will, however, choose to publish only my chart and my pivot table to Excel Services, so I click on the dropdown option in the Show tab, change it to "Items in the Workbook", and check my two items.

[screenshot]

Finally I can click OK, then click on “Save to SharePoint” and save the spreadsheet in the library created in step 3 above.

6. Test the File and Publish the Chart in a Dashboard

Navigate to the library where you stored the report and click on it. The file should open in the browser, and you should see the first item (alphabetically) that you chose to publish above.

[screenshot]

You can switch between published items, using the view dropdown highlighted above. Once you’ve gotten this far, everything is working, and we’re ready to put our chart into a dashboard page.

All that we need to know is the name of the item and the address of the spreadsheet. In our case, the item is "Chart 2" and the URL of the spreadsheet is http://uvspdev2010/ExcelReports/MySpreadsheet.xlsx. Navigate to anywhere in the site collection that you'd like to show the chart – I'm going to use a team site. From the destination, select Site Actions – Edit Page (or select Edit from the ribbon). Place your cursor where you want the chart to appear, click the Insert tab on the ribbon, and click the Web Part button. Select the Business Data category, choose the "Excel Web Access" part, and click Add.

[screenshot]

Once added, click on the “Click here to open the tool pane” link, and enter all desired parameters. The two crucial parameters are Workbook and Named Item. When adding content to a dashboard, I find that it is better to remove all of the buttons and web part chrome, but that’s totally dependent on the use case. When ready, my web part parameters look something like this:

[screenshot]

At this point, you can click OK, and you should see your item on the page in context.

[screenshot]

Congratulations – you’re done!!

This was meant to be a "Hello World" walkthrough, and thus I've left out quite a bit. You can add slicers and all kinds of cool BI functions to your sheet, and they'll work in the browser. This chart is showing data in a cube, and therefore it will be updated when the cube is. You can also use parameters in Excel and drive them through filter web parts and connections on the display pages.

There ARE a lot of moving parts here, but once the infrastructure is set up, power users can do some pretty spectacular things using just Excel and SharePoint, and they can do so in a fully managed environment. Oddly enough, chances are that your power users probably already know both of these tools.
