

Installing and/or Upgrading a Multi-Server SQL Server Reporting Services 2012 SharePoint Mode Farm

A few months ago I posted an article discussing how to upgrade integrated mode Reporting Services from 2008 R2 to 2012. That article pretty much assumed a small SharePoint farm, with a single SharePoint server, a single SQL server, and Reporting Services running on the SharePoint server. In this article, I’d like to address upgrading and/or installing on a medium or large farm, but to do so, I first need to discuss the nuances involved in scaling out the Reporting Services service application of a SharePoint 2010 farm.

As I discussed previously, and as the service application architecture of Reporting Services 2012 makes plain, the Reporting Services bits from the SQL Server installation media must be installed on a server that has the SharePoint bits installed and is joined to the farm. In the simple farm scenario mentioned above, that’s fairly straightforward – it’s the SharePoint server (NOT the SQL server!). However, if your SharePoint farm consists of multiple servers, you need to decide where you want your report processing to be done. With 2012, Reporting Services is a full-fledged SharePoint service application, which means that it is relatively straightforward to load balance this processing.

A recent project that I worked on had an architecture very similar to the diagram found on the MSDN article explaining how to scale out Reporting Services:

[Diagram: Reporting Services scale-out architecture, from the MSDN article]

The only difference in my case was that there were four servers in the front end role. Of course, the servers in the application role served up more than just Reporting Services, but the diagram is essentially accurate. In our case, the SQL Server cluster was running SQL Server 2008 R2, but that was inconsequential, because only the Reporting Services instances on the two application servers were being upgraded.

If you are upgrading, there are a number of additional steps required first. No matter what, the encryption key should be backed up; it will be needed after the upgrade unless you want to recreate all of your data connections. Once that is done, you should of course back up your two Reporting Services databases. Next, if you have a load balanced Reporting Services environment, I recommend removing all of the load balanced RS servers from rotation, leaving only the server that will be used for the primary upgrade. To make things very simple, I also recommend completely uninstalling Reporting Services from all but that one server.
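
For the database backups, a plain T-SQL script along these lines will do. This is only a sketch: ReportServer and ReportServerTempDB are the default database names, and the backup path is just an example, so substitute whatever is used in your environment.

-- Back up both Reporting Services databases before the upgrade.
-- "ReportServer" and "ReportServerTempDB" are the default names; yours may differ,
-- and the backup path below is only an example.
BACKUP DATABASE [ReportServer]
    TO DISK = N'D:\Backups\ReportServer_PreUpgrade.bak'
    WITH COPY_ONLY, INIT;

BACKUP DATABASE [ReportServerTempDB]
    TO DISK = N'D:\Backups\ReportServerTempDB_PreUpgrade.bak'
    WITH COPY_ONLY, INIT;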

Once complete, the main server can be upgraded according to the instructions laid out in my previous article. However, you may find, as I have, that SSRS doesn’t always want to be upgraded cleanly. The good news is that this is relatively easy to recover from. The SSRS service application works like other service applications in that when it is created and you specify an existing database, that database will be upgraded automatically. Therefore, if you find yourself with an uncooperative SSRS installation, simply uninstall it, and install the 2012 components from scratch, making sure to use the name of your existing RS database when the service application is created. Once complete, restore your encryption key, and you should be good to go.

Whether you are upgrading or installing fresh, there are a number of differences when installing to a multi-server farm compared to a single server installation.

Firstly, when installing SSRS 2012, you will be presented with a screen where you may choose the SQL features to be installed.


There are two components to Reporting Services in SharePoint mode: Reporting Services – SharePoint, and the Reporting Services Add-in for SharePoint Products. A lot of the guidance simply says to select both options and continue. However, in a multi-server farm it is important to understand the difference between the two.

Feature #1 is the core of Reporting Services – what in the past would have been the instance, but is now the service application itself. Feature #2 is the add-in, which has been around since the first integrated mode SSRS. It is used by SharePoint to connect to SSRS. In the past, that was a connection to the SSRS web services, but it is now how the SharePoint front end servers connect to the service application. Therefore, in a multi-server farm, feature #1 should be installed on every application server that will process SSRS reports, and feature #2 must be installed on every server participating in the front end role. Since application servers often perform a dual role, at least for administrators, I recommend installing both features on application servers.

I’m not going to walk through the steps required to create the service application and light up the features in this article, as there is quite a bit of good guidance on that available. I also wrote one up for installing on SharePoint 2013, which for SSRS is pretty much identical to 2010.

Once the initial installation is complete, it should be repeated on every server that will participate in the Application server role. Obviously the service application only needs to be created once. Once all of the bits are installed on all of the relevant servers, simply navigate to Services on Server under System Settings in Central Admin, and start the “SQL Server Reporting Services” service on every application server. Once that’s done, you’ll have a load balanced, multi-server Reporting Services service.

It is worth calling out a common error encountered in the multi-server farm scenario. You may find that after your upgrade or installation has completed, attempts to access a report from the front end servers result in a connection error, “The attempt to connect to the report server failed. Check your connection information and that the report server is a compatible version”.


It’s the last bit of the error that’s relevant. This problem arises when the add-in hasn’t been installed on the front end servers in the farm. Now, you may remember that the Reporting Services Add-in is one of the prerequisites that the prerequisite installer installs on your SharePoint boxes. In the case of an upgrade, you may also remember that you never had to do this before to get SSRS working. So why is it that we need to do this now? It’s because the add-in included with the prerequisite installer is for SSRS 2008 R2, and we’ve just added SSRS 2012. The add-ins are not forward compatible, and therefore the 2012 add-in needs to be installed on every front end server in the farm.

So to recap, in order to scale out Reporting Services, install the service on one application server, and get it working in the farm. Then, install the service on the remaining application servers, start the service on each server, and install the RS add-in for 2012 on all of the front end servers.


Using Per-User Identity with Reporting Services and SharePoint

Almost anyone that has dealt with business intelligence in the Microsoft space is aware of the double hop problem. Simply put, when a user wants a server to execute a process on their behalf, that process will have a different identity (if any at all) from the originating user, and therefore user focused operations such as per-user security trimming can’t be performed. The traditional solution to this problem in the Windows world has been to use Kerberos authentication, which allows servers to pass secure user tokens on to other servers on behalf of the originating users.

Unfortunately, Kerberos requires additional infrastructure, and is not adopted in all organizations. Small to mid-sized organizations tend to stay away from adopting it, primarily due to its perceived complexity, which leaves them with the double hop problem. In the long term, claims based authentication promises to solve this problem, but it is still in its infancy, and is not yet adopted at all in the SQL Server suite of products, which is where double hop pain is felt most keenly.

The new BISM tabular model with Power View works around this problem by taking advantage of the EFFECTIVEUSERNAME property available in Analysis Services (I’ll be writing more about that later), as do Excel Services and PerformancePoint Services in SharePoint 2013. With EFFECTIVEUSERNAME, the server makes its connection with the data source using its service credentials, but every query identifies the originating user, and those queries are executed in the context of that user. It’s a clever workaround, and takes care of much of the BI stack, but what about Reporting Services and relational data?

As it turns out, Reporting Services has been able to do this sort of thing for a long time. It’s just been rather cleverly concealed, or misunderstood. SQL Server itself has a function, SetUser(), that can be used by someone with sufficient privileges to impersonate another user. Originally implemented for testing purposes, Reporting Services takes advantage of it in much the same way that EFFECTIVEUSERNAME works in Analysis Services. The place to turn on this function is within a data source editor.

In SharePoint integrated mode, there are three main tools that can be used to define a data source. A data source can be created from any library that is configured to use the Report Data Source content type. In addition, SQL Server Report Builder can be used to edit a data source, or to create an embedded data source in a report. Finally, Business Intelligence Development Studio (BIDS) or SQL Server Data Tools (depending on your version of SQL) can be used to create shared data sources. And the best news is, the interfaces are all different!

[Screenshot: Setting connection credentials in SQL Server Data Tools]

[Screenshot: Setting connection credentials in SharePoint 2013]

[Screenshot: Setting connection credentials in Report Builder]

The option that we’re interested in, in all three cases is “Use this user name and password” (stored credentials). In the case of Data Tools or BIDS, the impersonation option isn’t available. Normally what would happen is that the connection would be deployed to the server, and then edited afterwards. We can therefore ignore this interface. However, examine the description of the impersonation option in both SharePoint and Report Builder. Although worded completely differently, they mean exactly the same thing. And in my opinion, Report Builder has it right. In fact it’s a stretch to say that the SharePoint screen isn’t dead wrong.

The SharePoint form can only be considered correct if the word “this” applies to the currently logged in user. To my mind, when I read that option, I am led to believe that the impersonated identity will be the account that is being used to make the connection (the stored credentials), when it is in fact the opposite. The setting in Report Builder is far clearer and more accurate.

When the option is checked, the SetUser() SQL function will be used in advance of any queries performed by the report, and the value will be set to the currently logged in user, NOT to the credentials specified under “Use this user name and password”. The credentials specified here are used to make the connection to the SQL server, and because of the security restrictions of SetUser(), they MUST have at least the db_owner role on the queried database, or the sysadmin fixed server role. My recommendation is the latter if possible. This requirement also makes it very important to restrict the editing of the data source.
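
To make that concrete, here is a rough sketch of what the report server effectively issues against SQL Server. The user, database and table names are hypothetical, and the exact statements SSRS sends under the covers may differ slightly:

-- The connection itself is made with the stored credential (db_owner or sysadmin).
-- Before the report queries run, the browsing user is impersonated:
SETUSER 'CONTOSO\ofranz';   -- hypothetical Windows user viewing the report

-- Dataset queries now execute in that user's security context, so any
-- per-user permissions in the database apply to CONTOSO\ofranz:
SELECT ExpenseTotal, Employee, ExpenseType, SubmittedDate
FROM dbo.Expenses;          -- hypothetical table

-- Calling SETUSER with no argument reverts to the stored credential's identity.
SETUSER;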

I imagine that most of the confusion around this feature comes from the fact that in most cases, when you are using impersonation, you are saying “I want to execute this function as this account”. However, in this case, with stored credentials, it’s always the stored credential executing the function, and what we’re asking it to do is to impersonate the currently logged in user. It’s sort of backwards to the way we normally think about impersonation. It’s not immediately intuitive why a logged in user would appear to be impersonating themselves, but that’s basically what’s happening.

Using this option therefore allows you to utilize per user permissions through Reporting Services on SharePoint without having to use Kerberos. This is a good thing, but you should be aware that in many cases, using a proxy account (by not selecting the impersonate option) may be a better option. There are several drawbacks to using per-user authentication. Impersonation should only be used when it is absolutely required.

If you open up the report processing screen for a report that uses a data source defined for impersonation, and select the “use cached data” option, you will receive the warning “This report can not be cached because one or more of the data sources credentials are not stored”.


The reality is that although the credentials are in fact stored, the report could be different for different users, and caching depends on a consistent result. Therefore, to prevent confusion, it’s not allowed. If the report is particularly large, this removes an important performance optimization technique.

From the same screen, it’s also made very clear that snapshotting is not an option when impersonation is being used, from the error “This report can not run from a snapshot because one or more of the data sources credentials are not stored”.


The reasons are the same as for caching, and the implication is that snapshots cannot be created, preventing us from storing a version history for the reports.

Finally, subscriptions are also rendered useless through the use of impersonation. An attempt to create a subscription to a report using an impersonated data connection results in the following error:

[Screenshot: the subscription creation error]

Per-user identities also require management on the SQL server side. Because we are impersonating the currently logged in user, that user must be managed from within SQL itself, whereas a proxy account only requires that account to have access to the data in question.
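
As a rough illustration of that overhead, each report consumer needs to exist as a principal with the right permissions in the source database. The login, database and table names below are hypothetical:

-- Each user who will run impersonated reports must be granted access directly.
CREATE LOGIN [CONTOSO\ofranz] FROM WINDOWS;

USE ExpensesDB;
CREATE USER [CONTOSO\ofranz] FOR LOGIN [CONTOSO\ofranz];
GRANT SELECT ON dbo.Expenses TO [CONTOSO\ofranz];

In practice, granting access to a Windows group rather than to individual logins keeps this burden manageable.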

To summarize, Kerberos is not required to do per-user queries in Reporting Services for relational data. The SetUser() function in SQL Server, combined with the Execution Context option in data connections, allows this to be performed quite effectively. However, just because it can be done doesn’t mean that it should be. In many cases a simple proxy account can do the job, and should likely be allowed to, unless the requirements absolutely demand otherwise.


Using SharePoint Filters with Reporting Services Parameters for Personalized Reports

Using Reporting Services integrated with SharePoint has some tremendous advantages for report re-use. In many cases, a single report can be designed, and then placed on contextual pages for customized display by setting the report parameters through the Reporting Services web part. Filter web parts can also be used to drive the Reporting Services parameters, and finally, users can set the parameters directly if allowed.

However, what happens when we want users to be able to set some of  the parameters, but not all of them? This can be achieved through a combination of filters and web part settings.

Consider the following report:

[Screenshot: the sample expense report]

Our requirement is to place a Reporting Services web part on a page that will display the report for the currently logged in user. In addition, the user should be able to select the time period to display, but the default for the time period should be the current year. Finally, users should not be able to use this  report to view the submissions of other employees.

We can accomplish this firstly by adding three parameters to the above report – Start Date, End Date, and Employee. We will then restrict the results returned to the report to those meeting our parameters. Our main dataset query is as follows:

SELECT ExpenseTotal, Employee, ExpenseType, SubmittedDate
FROM Expenses
WHERE SubmittedDate BETWEEN @StartDate AND @EndDate
  AND Employee = @Employee
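
If you want to try the walkthrough yourself, a minimal sketch of the assumed Expenses table follows. The column types and sample rows are my own guesses, not taken from the original environment:

-- Hypothetical structure and sample data for the Expenses table used above.
CREATE TABLE dbo.Expenses (
    ExpenseTotal   money          NOT NULL,
    Employee       nvarchar(100)  NOT NULL,  -- stored as "First Name Last Name"
    ExpenseType    nvarchar(50)   NOT NULL,
    SubmittedDate  date           NOT NULL
);

INSERT INTO dbo.Expenses (ExpenseTotal, Employee, ExpenseType, SubmittedDate)
VALUES (125.50, N'Olaf Franz', N'Travel', '20120315'),
       (89.99,  N'Olaf Franz', N'Meals',  '20120602');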

Ideally, we could use SharePoint filters to set the value of all three parameters (details on this below…), and then completely hide the Reporting Services parameter pane from the users. Unfortunately, the date filter in SharePoint doesn’t quite meet our requirement.


We can explicitly set the default value, or make it a function of the current date, but what we can’t do is to set it to the first day of the current year, last day of the current year, etc. True, on Jan 1 we could go to the page and edit the parameters of the filter, but we’re looking for something a little more elegant than that.

We can get these sorts of values through SQL queries, so what we’re going to do is set the default values of the parameters with a Reporting Services dataset. The dataset query is:

SELECT
    DATENAME(Year, GETDATE()) + '0101' AS StartDate,
    DATENAME(Year, GETDATE()) + '1231' AS EndDate

This basically retrieves the current year and then appends Jan 01 and Dec 31 respectively to return the values in YYYYMMDD format. These values are then set as the defaults for the start and end date parameters respectively:
 
[Screenshot: setting the parameter default values]
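
As an aside, if the underlying database is SQL Server 2012 or later, the same defaults can be produced as proper date values rather than YYYYMMDD strings. This is just an optional variation, not what the report above uses:

-- Optional variation (requires SQL Server 2012 or later): return real date values
-- for the StartDate and EndDate parameter defaults instead of YYYYMMDD strings.
SELECT
    DATEFROMPARTS(YEAR(GETDATE()), 1, 1)   AS StartDate,
    DATEFROMPARTS(YEAR(GETDATE()), 12, 31) AS EndDate
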
We will be setting the value for employee with a filter, so we don’t need to set its available or default values.
 
Once this is done, and the report is saved, we’re ready to add the report to a SharePoint page. To do this, simply navigate to the destination page and edit it, then insert a Reporting Services web part on the page. Open the web part tool pane, select the report, and set all of the View/Display options that you want to use. You will need to make sure that the prompt area is displayed, as you will need it for your users to change the date ranges. However, you can start it collapsed.
 
image
 
Finally, you’ll want to load up the report parameters section:
 
[Screenshot: the web part’s report parameters section]

You should note that all of the parameters need a value for the report to render – hence the “Enter A User” value, which will return no records. At this point, we can save the web part and test the form by entering an employee name in the Employee parameter field.


The next step is to add the Current User Filter Web Part to the page (in the Filters section of the web parts list), and configure it. It’s a context filter, and it won’t display to the end user, so you can put it anywhere on the page that you wish.


You can send either the user’s Active Directory account name, or the value of any of the User Profile fields (as an aside, this is one reason that the user profile is really important…). In our case, the back end data is in a “First Name Last Name” format, and this corresponds with the Name property of the User Profile, and is therefore what we’ll use.

Once configured, we save the web part, and then we need to connect it to the report. We do this by accessing the web part edit menu, hovering over Connections, and selecting “Send Filter Values To”.


We then select the report, and the parameter to set:

[Screenshot: choosing the report parameter to receive the filter value]

​Once connected, the report should be using the name of the currently logged in user, in our case, Olaf Franz.

[Screenshot: the report rendered for the current user]

In the above screen I have opened the parameters pane (I have it closed by default) to show that users can change the dates, but by navigating to this single page, a user will see their own entries for the current year. However, the important thing to note here is that although “Employee” is a parameter, it no longer appears in the parameters pane. This is due to the fact that its value is being set by a SharePoint filter, and it has the added benefit of allowing us to prevent users from seeing other users’ reports.

This is a perfect example of where the personalization capabilities of SharePoint can be combined with the analytic power of Reporting Services for a result that’s truly greater than the sum of the parts.


File Formats for the SQL Server 2012 (Denali) Reporting Services Web Services Renderer

One of the major new features in SQL Server Reporting Services (SSRS) 2012 is the ability to render Excel files and Word files in the XML based 2007 file format (.xlsx and .docx). Previous versions of SSRS rendered Excel and Word files into the old 2003 formats (.xls and .doc). Creating one of these output files can be done through the reporting UI by using the Export button. The new formats are used automatically, and you need to perform server configuration in order to use the older formats.

However, if you are using the Reporting Services web service, nothing changes. The Render method of the ReportExecution2005 web service would appear something like below:

report = rs.Render("EXCEL", deviceinfo, _
extension, mimeType, encoding, warnings, streamIDs)

The first argument is the file format to render into. Even with SSRS 2012, the above code will render into the 2003 file format (.xls); the web service defaults to the old format.

I was unable to find any documentation at all on the correct argument to get the new file formats, so I poked around a little with my debugger and the ListRenderingExtensions method. As a result, I thought I’d share what I think is the comprehensive list of rendering formats for Reporting Services in SQL Server 2012. These are taken straight from the code.

Name            Description
XML             XML file with report data
NULL            Null Renderer
CSV             CSV (comma delimited)
ATOM            Data Feed
PDF             PDF
RGDI            Remote GDI+ file
HTML4.0         HTML 4.0
MHTML           MHTML (web archive)
EXCEL           Excel 2003
EXCELOPENXML    Excel
RPL             RPL Renderer
IMAGE           TIFF file
WORD            Word 2003
WORDOPENXML     Word

Therefore, if I want to render to the description “Excel” the argument needs to be “EXCELOPENXML”, and to render to the description “Excel 2003” the argument is “EXCEL”. That’s not at all confusing.

In any event, hopefully this helps someone (likely me in the future..)


How to Batch Download Videos and Presentations from SharePoint Conference 2011

UPDATE 26/10/2011 – The Files Have Been Protected

Since originally posting this, the content provider has added the cookie requirement for download of the files. This is in line with Microsoft’s wishes that the videos be downloaded by conference attendees only. In order to batch download the videos, you will need to follow the procedure outlined for the PPTX files below. The addresses of the video files are still the same – only the requirement for an authentication cookie has changed. Thanks to George Winters for pointing this out.

UPDATE 11/10/2011 – The Files Have Moved.

I have updated the post below accordingly. Also note the file names are also different. The new source is significantly faster – thank you MSFT!

I’ve seen a number of questions about this, and I’m in the process of collecting all of the presentations and slide decks from the conference, so I thought that I’d share my process. It’s relatively straightforward to batch these up.

The videos are all in one location, and the presentations are in another. The base folder for the videos is (now) http://cdn.tri-digital.com/spc/videos/ and the naming format for the files is SPCsessionnumber.wmv.zip, so for session 301, the filename is SPC301.wmv.zip. You can simply punch the full URL into a browser and the files will start downloading. In order to batch download a bunch, I am using the open source WGET utility, recommended to me by Ziad Wakim. I simply created a batch file that calls WGET for all of the presentations that I wanted. Each line in the batch file looks like:

"C:Program Files (x86)GnuWin32binwget.exe" http://cdn.tri-digital.com/spc/videos/SPC261.wmv.zip

The process is the same for the slide decks, but the site base is http://www.mssharepointconference.com/SessionPresentations and the file name format is identical to the videos, with the .wmv.zip replaced with .pptx. However, the conference site expects a cookie, so you’ll first need to log into the myspc site. Once done, export the cookie for the site to a text file (in IE use File – Import and Export – Export to a file) that’s available to the WGET command. Then it’s a simple command switch on the end of the WGET command to load in the cookie for each request. If the cookie is exported to a file named cookies.txt, then the command will look something like:

wget http://www.mssharepointconference.com/SessionPresentations/SPC413.pptx --load-cookies cookies.txt

Another batch file with a line for each slide deck you want, and you can run it and download everything at once. One note – you’ll need to be patient. Given the speeds, I don’t think that I’m the only one who has thought of this….

Hope it helps.
