

How To Replace A Custom Field In SharePoint 2010

One of the benefits of SharePoint is its extensibility. It comes with a cornucopia of tools and features that provide fantastic business value, but inevitably, you’ll come across a requirement that the out-of-the-box features don’t address. The good news is that its extensible nature allows for custom development, so these features can be added.

Customization is not without its drawbacks. Just because you can build on SharePoint doesn’t mean that you should. Custom code brings all of the costs of owning and running that code, along with the upgrade difficulties that come with it. If you can possibly use a no-code solution, you probably should. A recent customer challenge of ours demonstrates this well, and provides a good example of a highly customized solution being replaced with a no-code one.

Back in the days of SharePoint 2007 (!) we ran across a number of requirements for a cascaded combo box control. When filling out a form, we wanted the options available to one combo box to be driven by the selection in another. Unfortunately, there was no real way to do this with SharePoint forms. It could be done with InfoPath through filtering, but that was only available in InfoPath form libraries. We needed to provide this capability with standard lists. Our solution was to implement a custom column that provided it.

This worked perfectly well for a few years, but when it came time to upgrade to SharePoint 2010, we found that the custom control didn’t quite work. One option would of course be to fire up Visual Studio and make the necessary modifications. However, since SharePoint 2010 now allows you to use InfoPath to edit standard SharePoint lists, that was determined to be the better choice, and we could ditch the custom controls. Another drawback of the custom columns is that InfoPath simply won’t work with them, making solid form design difficult.

However, given that this was an upgrade, we needed to retain all of the existing data, so we couldn’t just trash the list and start fresh. We needed to somehow convert the old column data to standard SharePoint data. We were able to do this through the PowerShell export/import functions, with a few tweaks.

Step 1 – Upgrade the Content Database

Before you can use this method, the content databases need to be in SharePoint 2010 format. If you’re doing an in-place upgrade, this happens automatically, but if you’re using the DB attach method, you’ll need to install the custom solutions on the new farm first, or the database mount procedure could fail.
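As a rough sketch (the WSP file name, database name, SQL server, and web application URL below are all hypothetical), installing the old solution and then mounting the content database might look something like this:

# Add and deploy the custom solution to the new 2010 farm first
Add-SPSolution -LiteralPath "C:\Solutions\CascadingDropDown.wsp"
Install-SPSolution -Identity "CascadingDropDown.wsp" -GACDeployment

# Then attach the content database; this kicks off the upgrade to 2010 format
Mount-SPContentDatabase -Name "WSS_Content_Accounts" -DatabaseServer "SQLSERVER1" -WebApplication "http://server1"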

Step 2 – Export the List Data

Most SharePoint administrators are familiar with the STSADM import and export commands that allow you to export and import either single sites or entire site collections. This capability still exists in SharePoint 2010 with both STSADM and with PowerShell, but the PowerShell commands also allow you to export individual lists.

When ready, export your list with the following PowerShell command:

Export-SPWeb -Identity siteurl -Path outputpath -ItemUrl listurl -IncludeUserSecurity -IncludeVersions All -NoFileCompression -Verbose

where:
siteurl is the URL of the containing site (or web), e.g. http://server1/accounts/2010
outputpath is the file system path for the output files
listurl is the site-relative URL of the list (e.g. Lists/customers)


The -NoFileCompression switch is important because we need to edit one of the output files after the export is complete.
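For example, with the hypothetical site and list used above, the command might look something like this:

# Sample invocation - the output path is hypothetical
Export-SPWeb -Identity "http://server1/accounts/2010" -Path "C:\Exports\Customers" -ItemUrl "Lists/customers" -IncludeUserSecurity -IncludeVersions All -NoFileCompression -Verbose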

Step 3 – Modify The List

Now that the data has been saved outside of the list, you can go ahead and remove the custom columns from the list. Once you’ve done so, be sure to replace them with identically named columns. If possible, you should also use the SharePoint column type that the custom columns were originally derived from. In my example below, I replace a Cascaded Lookup (custom) column with a Text column.

Once this is done, you’ll also want to delete all of the current items in the list. You could simply delete the list itself, but if you have any workflows, or custom forms, they’ll get deleted too. Also, a new list will have a new internal GUID, which may not work for you.
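If the list holds a large number of items, a quick PowerShell sketch like the following (the site URL and list name are hypothetical) can clear them out:

# Delete every item in the list prior to re-import
$web  = Get-SPWeb "http://server1/accounts/2010"
$list = $web.Lists["customers"]
# Copy the collection first so deletions don't invalidate the enumerator
foreach ($item in @($list.Items)) {
    $list.GetItemById($item.ID).Delete()
}
$web.Dispose()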

Another option is to use the Datasheet view to copy the old field values to Excel, so that you can copy them back after the fields are changed. This approach will update the last edited date and the last edited by values, which may not be acceptable. If it is, you can omit steps 2, 4, and 5.

Step 4 – Edit the Manifest File

In the path that was used in outputpath in step 2, there will be a number of files. Find any named manifest.xml or manifestx.xml (where x is a number) and edit them. These files contain the metadata for your list. Simply search for the name of your custom field type (in my case, CascadingDropDown) and replace it with the standard type name (like Text). Once all occurrences have been modified, save the file(s).
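Here is a rough PowerShell sketch of that search and replace (the export path and field type names match my example; adjust to suit, and note that it rewrites the files in place):

# Replace the custom field type name with the standard one in every manifest file
$exportPath = "C:\Exports\Customers"
Get-ChildItem -Path $exportPath -Filter "manifest*.xml" | ForEach-Object {
    (Get-Content $_.FullName) -replace "CascadingDropDown", "Text" | Set-Content $_.FullName
}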

Step 5 – Import the List

Once the manifest has been modified to match the new list schema, you can bring in the exported list, with essentially the reverse of the PowerShell used in Step 2:

Import-SPWeb -Identity siteurl -Path inputpath -IncludeUserSecurity -UpdateVersions Overwrite -NoFileCompression -Verbose

where:
siteurl is the URL of the containing site (or web), e.g. http://server1/accounts/2010
inputpath is the file system path for the input files
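Continuing the hypothetical example from step 2, the import would look something like this:

# Sample invocation - imports the edited export package back into the same site
Import-SPWeb -Identity "http://server1/accounts/2010" -Path "C:\Exports\Customers" -IncludeUserSecurity -UpdateVersions Overwrite -NoFileCompression -Verbose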


Once the PowerShell completes, you should have all of your data in a list with no custom columns. From there, you can use InfoPath to modify your forms, or anything else that is possible with the standard column types.

One caveat, though: we had customized the standard New/Edit/View .aspx forms with SharePoint Designer. Before we could move toward our InfoPath goal, we needed to recreate these forms as standard forms.


Upgrade Failure – The statistics ‘xxxx’ is dependent on column ‘tp_ContentType’


I wrote recently about overcoming an issue with orphaned documents when doing a database attach upgrade from SharePoint 2007 to SharePoint 2010. During that same database upgrade, I managed to run into a second nasty error. After running the PowerShell Mount-SPContentDatabase command, the progress would go to approximately 15% complete, and then the process would fail. Investigation of the upgrade log revealed the following errors:

[powershell] [SPContentDatabaseSequence] [ERROR] [8/26/2011 8:56:01 AM]: Action 4.0.2.0 of Microsoft.SharePoint.Upgrade.SPContentDatabaseSequence failed.
[powershell] [SPContentDatabaseSequence] [ERROR] [8/26/2011 8:56:01 AM]: Exception: The statistics '_dta_stat_1365579903_2_3_4_188_190_27_28_187_192' is dependent on column 'tp_ContentType'.
The statistics '_dta_stat_1365579903_28_192_190_4_27_187_188_3_26_2' is dependent on column 'tp_ContentType'.
The statistics '_dta_stat_1365579903_3_4_188_190_27_28_187' is dependent on column 'tp_ContentType'.
ALTER TABLE DROP COLUMN tp_ContentType failed because one or more objects access this column.

This was happening when the upgrade process was altering the database schema and removing the tp_ContentType column from a table. I know SQL quite well from a BI perspective, and well enough from a SharePoint perspective, but I had never before bumped into a problem with SQL statistics.

Statistics are created in conjunction with indices to help optimize performance. Kim Tripp talks about them in a good article here, and Idera has a nice in-depth discussion on them here. However nice they may be, they were preventing my upgrade, so how important were they?

Well, as it turns out, a well-meaning SQL DBA had decided that it would be a good idea to optimize the SharePoint databases, and had run a process that created a number of these statistics. Unfortunately, these created dependencies on the tp_ContentType column that prevented the schema from being changed, causing the upgrade to fail.

Luckily, these statistics followed a consistent naming convention. I was able to go through each table in the content database fairly easily and remove the offending statistics (in my case, all of the ones that began with “_dta”). There were other statistics in there, and they were left alone.
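If there are a lot of them, a script can help. The following is a rough sketch only (it assumes the SQL Server PowerShell snap-in that provides Invoke-Sqlcmd is loaded, and the server and content database names are hypothetical); it builds and runs a DROP STATISTICS statement for every statistic whose name begins with “_dta”:

$server = "SQLSERVER1"
$db     = "WSS_Content_Accounts"

# Build one DROP STATISTICS statement per "_dta" statistic in the content database
$query = @"
SELECT 'DROP STATISTICS [' + SCHEMA_NAME(o.schema_id) + '].[' + o.name + '].[' + s.name + ']' AS DropStmt
FROM sys.stats s
JOIN sys.objects o ON s.object_id = o.object_id
WHERE s.name LIKE '[_]dta%' AND o.is_ms_shipped = 0
"@

$statements = Invoke-Sqlcmd -ServerInstance $server -Database $db -Query $query

foreach ($row in $statements) {
    Write-Host $row.DropStmt
    Invoke-Sqlcmd -ServerInstance $server -Database $db -Query $row.DropStmt
}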

Once the statistics were removed, the upgrade proceeded smoothly. Once again, I was forced to manipulate the content database directly to solve a problem, which is not at all a good idea generally. However, in this case, it was done to undo some other changes previously implemented.

Hopefully this can save someone else some grief in the future.


SharePoint 2010 Service Pack 1, the June 2011 CU, and What to Install

Without wasting too much space on this: I’ve already gotten a number of questions on SharePoint 2010 Service Pack 1, how to install it, and in what order to install all of the bits. Last week, fellow MVP Spencer Harbar put out what I consider to be the definitive “unofficial” guidance on this, and it’s the approach that I recommend.

Spencer’s post can be found here. In short form, install the bits in the following order:

  1. SharePoint Foundation Service Pack 1
  2. SharePoint Foundation Language Packs (all) Service Pack 1
  3. SharePoint Server Service Pack 1
  4. SharePoint Server Language Packs (all) Service Pack 1
  5. Office Web Applications Service Pack 1

When downloading the language pack service packs, don’t forget to select the appropriate language(s) from the dropdown list.

Do NOT run the Products Configuration Wizard until the last component has been installed. Also, don’t install the June CU unless you know that you need it (or unless it’s a new install). As to the why behind these recommendations, check out Spencer’s post.


Connecting to Cubes and External Data with Excel in SharePoint

While many people are still unaware of it, Excel 2010 (and even previous versions) is a very powerful business intelligence client. That’s right, I said Excel. And I don’t mean the classic grab some data, do some charts and email it around sort of Excel, I mean connecting it to Analysis Services cubes and performing fast, useful data analytics on known sets of data. It can also go off and do some pretty amazing things with the Data Mining add-in, or PowerPivot, but for now I’m going to restrict myself to using core capabilities, and getting them published to SharePoint.

To start with, it’s important to understand how SharePoint interacts with Excel. Of course, at its core, you can store Excel files in a SharePoint document library, and open them in Excel. However, starting with SharePoint 2007, SharePoint included Excel Services, which allowed you to open a spreadsheet directly in the browser (without having Excel installed or using any ActiveX trickery). It also provided an Excel calculation engine that was programmatically callable. Why would that matter? A power user could develop a complex model with Excel and store it in SharePoint, which could then be used as a calculation “black box” for other things. The model could also be tweaked by the power user as necessary without them needing to worry about coding. Finally, you could expose part of the spreadsheet (any named range or object) through the Excel Services web part, which would truly allow you to incorporate Excel content into a dashboard.

SharePoint 2010 brings more to the table, including not only the browser consumption of spreadsheet content, but editing through the Excel Web Application.

However, all of this power does not come without its risks, and when interacting with external data, risks abound. Microsoft has done an excellent job of providing us with a highly secure infrastructure that allows us to do what we want here, but the only trouble is that it’s not always clear which approach to take. My aim in this article is to navigate through the process of connecting to an external data source (an analysis services cube, but the principle applies to others) in a particular scenario in a simple fashion. Our scenario is a fairly common one:

  • One Front End SharePoint Server
  • One SQL back end Server hosting the SharePoint databases
  • One SQL server hosting the Data Warehouse and the Analysis Services cubes
  • Kerberos not installed in the environment

In this environment, we have an authentication problem, the famous double hop problem. If I as a user ask the server to process a spreadsheet that connects to external data, it can’t just take my authentication token and pass it along to the external data source. What we need to do is to set up a proxy account (this has its own set of implications that I won’t get into here) and we’ll do that via the secure store service. If you’re using Kerberos, then you don’t have the double hop problem and the Secure Store part of this won’t apply, but then you’re probably busy enough dealing with Kerberos issues anyway….

If you’ve ever connected to external data, and then sent the file to someone else, you’ll know that they get prompted to verify that they want to connect to the data source and that they trust the connection. In this case, the author has created an embedded data connection. Embedded data connections are not allowed (or at least are strongly discouraged) on the server, because it has no way of verifying the authenticity of the connection.

The way that we deal with this in SharePoint is by creating a connection file, and then publishing it to a trusted location in SharePoint. Administrators need to take heed, in that by default, Excel Services trusts all SharePoint locations, which makes life easy, but not necessarily secure. The library that the connections are stored in should also utilize approval features, but this is not required. This walkthrough isn’t concerning itself with the security aspects, but they need to be considered in any real world situation.

Most of the steps below only need to be performed once, and connection documents can be reused, etc. However, this walkthrough is aimed at getting everything all set up from scratch.

We also don’t want to store our credentials in our connection string, so we will utilize the Secure Store service in SharePoint, which will allow us to use a set of credentials without necessarily knowing them. This will allow us to work around the double hop problem in a secure way. We will start with the setup of the Secure Store Service.

1. Set up the Secure Store Service ID

From Central Administration, navigate to Manage Service Applications, and click on your Secure Store application. If you don’t have one already, you’ll need to create one. You should know that SharePoint Foundation does NOT come with the Secure Store service. However, Search Server Express 2010 does come with it (and a few other things). Given that it’s free, it provides a nice option.

The Secure Store Service Application relies on both the Secure Store Service (duh) and the Claims to Windows Token Service. You’ll need to make sure that they’re both started in the “Services on Server” section of Central Administration’s System Settings.
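If you prefer PowerShell, a rough sketch (it assumes the default English service type names and targets the local server) looks like this:

# Start the two service instances that the Secure Store Service Application depends on
Get-SPServiceInstance -Server $env:COMPUTERNAME |
    Where-Object { $_.TypeName -eq "Secure Store Service" -or $_.TypeName -eq "Claims to Windows Token Service" } |
    Start-SPServiceInstance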

The Secure Store application requires an encryption key. If one has not already been created, you’ll see a message indicating that you need to do so. The process is simple: just click the “Generate New Key” button in the ribbon.


Once a key has been created, we need to create a new target application, which is essentially a set of credentials. This is the application that our connection strings will reference when they need to connect to a back end data source. You create a new application by clicking the “New” button in the ribbon. The New application screen then appears.


There are a couple of things to note on this screen. Firstly, the Target Application ID is the ID that you will be using when you set up your connection below. You can name it what you like, but you’ll need to remember what it is. The Display Name and the Contact E-Mail need to be filled in, but the important thing to note is the Target Application Type field. If this is to be used by more than one person, you need to make sure that it is set to Group. By default, it is set to Individual, which doesn’t work so well in a shared environment. Take it from me – I found out the hard way. When this is filled in, click Next, and you’re presented with the following screen.


The Administrators are just that – the people who will set the properties of this set of credentials. The Members are the people that will be allowed to use this credential set in connections, external lists, etc. In the example above it’s set to all authenticated users, which, again, I wouldn’t recommend in production…

When done, click OK, and we’re done, right? Not so fast. We never actually set the credentials for this application, just everything around it. To do that, select the application and click the “Set (credentials)” button in the ribbon, or hover over the ID in the list and select “Set Credentials” from the dropdown.


In the subsequent screen, you enter the account and the password (twice) of the credentials that are to be used as a proxy. Click OK and you’re done. A couple of things to note: if using a Windows account, the name should be in DOMAIN\ACCOUNT format. The proxy account is NOT a managed account, and if the password changes, you’ll need to come back here to update it. The proxy account should be a least-privilege account, with access only to the things that are absolutely necessary to retrieve the appropriate data, or whatever its purpose is. Don’t add it to the Domain Administrators group.

OK, now step 1 is done. Whew.

2. Set up a Data Connection Library

The next thing we need to do is to set up a library to store our data connections. If you’ve set up a BI site already (to use PerformancePoint), you can use the libraries there. I find that it’s often a good idea to centralize this library and use it from different locations, but your requirements may vary. What we want to create is a Data Connection Library. This library will be used to store connections for both the Office applications (ODC) and InfoPath (UDC).

From the Site Actions menu on the site where it is to be located, select More Options, then, in the subsequent dialog box, filter by Library and select Data Connection Library. Give it a name (don’t use spaces here; you can always come back and add them later, but we don’t want spaces in the internal name), and click Create.


What makes this library special is that it uses the “Office Data Connection File” and the “Universal Data Connection File” content types. You could always add these content types to any old document library and achieve the same effect. You could also include Reporting Services connections if you’re using Reporting Services in integrated mode.

This library also needs to be registered as trusted in the Excel Services Application. This is done through the Manage Service Applications section of Central Administration. Simply click on your Excel Services application, click Trusted Data Connection Libraries, and add the URL of your library, if not already there.
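This can also be scripted. A minimal sketch (it assumes a single Excel Services service application and a hypothetical library URL):

# Register the data connection library as trusted in Excel Services
$excelApp = Get-SPExcelServiceApplication
New-SPExcelDataConnectionLibrary -ExcelServiceApplication $excelApp -Address "http://uvspdev2010/DataConnections" -Description "Trusted data connection library"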


3. Set up a Library to house the Excel Reports

You can store the Excel Report that we’ll be creating below into any document library in the Site Collection. If you have set up a BI Center, then you already have a “Reports” library, whose purpose is to do just that. Oddly, by default, this library isn’t set up to contain Reporting Services reports (.rdl files), but that isn’t relevant to my task here.

You simply need to create a new document library, and for the purposes of this example, I’ll call mine Excel Reports (original, huh?)

We’re now ready to create our connection. We’ll use Excel to do that.

4. Create And Store The Connection File

Open a new Spreadsheet in Excel. Click on the Data tab, then click the “From Other Sources” button and choose “From Analysis Services” (obviously, if you’re using something else, choose that).


Enter the name of your server and click Next. Select the database and cube that you want to connect to, and then click Next again. We’ll need to do a couple of things in the last screen that appears.


First, select “Always attempt to use this file to refresh data”. Then, click the “Authentication Settings” button. This is where it gets interesting.


Windows Authentication is what you would use if the data resides on the same machine as the SharePoint front end, or if your organization is using Kerberos. With this setting, the credentials of the user are used to connect to the data source. If you select None, then the credentials identified as the “Unattended Service Account” (if configured) in the Excel Services configuration will be used. You can only use one of these accounts, and when configuring it, it too will use the Secure Store Service. We want to select our own credential set, and we do so by selecting SSS and entering the ID of the credential set that we created in step 1 above.

Next, we need to save this connection into the SharePoint data connection library that we created in step 2. Click the Browse button, enter the URL of the library in the Address bar, choose a name, and click Save.


Finally, when you’re done, click Finish, and fill out the metadata form that pops up with any comments or keywords that you might want to use later to find the connection document. Now we have a connection document in the connection library.

5. Create The Excel Content

When you’re done creating the connection document, Excel will prompt you to insert a pivot table or chart. Choose the location, and Excel will insert it for you, and put you in pivot edit mode. You can select your dimensions and measures, and build your chart accordingly. When you’re ready, your worksheet should look something like this.


Later, we’ll want to show just the chart on a dashboard page, and to do that, we need to make sure that our chart has a logical name, as highlighted above. Any named range can be published out to an Excel Services web part. When ready, it’s time to publish to SharePoint, but we also want to set some file properties first. These properties are very well hidden…

Click on the “File” tab to go to the backstage. Once in the backstage, click on the “Save and Send” tab on the left hand side. Then click on “Save to SharePoint”. Your screen will appear as follows:


Finally, we need to click on the Publish Options button. This allows you to control what gets used by Excel Services, and what appears when the spreadsheet is opened in the browser. It also lets us define parameters which can be used by the Excel Services web part, but I won’t be using parameters here. I will, however, choose to publish only my chart and my pivot table to Excel Services, so I click on the dropdown in the Show tab, change it to “Items in the Workbook”, and check my two items.


I can then click OK, click on “Save to SharePoint”, and save the spreadsheet in the library created in step 3 above.

6. Test the File and Publish the Chart in a Dashboard

Navigate to the library where you stored the report and click on it. The file should open in the browser, and you should see the first item (alphabetically) that you set to publish above.


You can switch between published items using the View dropdown highlighted above. Once you’ve gotten this far, everything is working, and we’re ready to put our chart into a dashboard page.

All that we’ll need to know is the name of the item and the address of the spreadsheet. In our case, the item is “Chart 2” and the URL of the spreadsheet is http://uvspdev2010/ExcelReports/MySpreadsheet.xlsx. Navigate to anywhere in the site collection where you’d like to show the chart; I’m going to use a team site. From the destination, select Site Actions – Edit Page (or select Edit from the ribbon). Place your cursor where you want the chart to appear, click the Insert tab on the ribbon, and click the Web Part button. Select the Business Data category, choose the “Excel Web Access” part, and click Insert.


Once added, click on the “Click here to open the tool pane” link, and enter all desired parameters. The two crucial parameters are Workbook and Named Item. When adding content to a dashboard, I find that it is better to remove all of the buttons and web part chrome, but that’s totally dependent on the use case. When ready, my web part parameters look something like this:


At this point, you can click OK, and you should see your item on the page in context.


Congratulations – you’re done!!

This was meant to be a “Hello World” walkthrough, and thus I’ve left out quite a bit. You can add slicers and all kinds of cool BI functions to your sheet, and they’ll work in the browser. This is showing data from a cube, and therefore the chart will be updated when the cube is. You can also use parameters in Excel and drive them through filter web parts and connections on the display pages.

There ARE a lot of moving parts here, but once the infrastructure is set up, power users can do some pretty spectacular things using just Excel and SharePoint, and they can do so in a fully managed environment. Oddly enough, chances are that your power users probably already know both of these tools.


Observations from AIIM SharePoint Summit

I just returned from the recent AIIM Expo show in Philadelphia. I hadn’t been to AIIM since 2002, when we were building out our imaging product. I had a couple of reasons to go this time. I’ll keep the first one to myself for now, but the other was because Microsoft was making a big splash with the SharePoint Summit. I was very interested in what their messaging would be like to the “hard core” ECM market. All of the sessions that I attended were in the SharePoint Summit track.

The keynote was delivered by Eric Swift, the new GM of the division, and by Ryan Duguid, the “ECM guy” at Microsoft. Eric gave a very good talk, and I was extremely impressed by Ryan, who spoke at several sessions throughout the show.

Usability

A couple of great quotes from Ryan. The first: “ECM works when it’s invisible to the end user”. I couldn’t agree more. People will use a system when they can see value in it for themselves, and when it won’t cause them much disruption. Far too often, systems are “imposed” upon end users, and one thing that’s certainly true about information workers is that if they can find a way around a difficult system, they’ll take it.

Ryan also said, “If you can’t show users their personal payback, they’ll never adopt your system”, which is likely why, according to Doculabs, 50% of all ECM projects fail. Doculabs attributes this to an exclusive focus on one specific area of functionality required by one specific area of the business, without taking into account the needs of the wider user community. All of which is saying the same thing.

The final keynote session was presented by Ryan and Bert Sandie from Electronic Arts. Their talk was on how to provide an excellent user experience, partly by using gaming principles. On the surface, that sounds odd, but it makes a ton of sense. If you make tasks more interesting, people will be more likely to perform them.

As an example, Ryan demonstrated Ribbon Hero, which is an add-in to the Office suite. It installs a button in the ribbon and presents you with a set of challenges. These challenges are application-related tasks; it helps you to varying degrees as you perform them, and you gain skill points by doing so. It allows you to compete with others, increasing your motivation. If you really want to drive use, hand out weekly rewards for “top scores”. A perfect example of applying the gaming concept.

Another concept that came out of this session, and one that I’ve been preaching for years, is that you should always include and understand the end users in any application design. Look at what people do; don’t tell them that it’s wrong – adapt it into your solution, and ideally improve it. If you don’t provide users with a means of doing what they need to do inside the organization, they will find a way to do it outside.

Bert presented an interesting case study in usability. If you are familiar with the default search page in SharePoint, you’ll know that it is even simpler than Google’s. It’s essentially a white page with a search box and a go button. EA took that page and decorated it to look almost exactly like Google’s. Of course it said Electronic Arts instead of Google, but the letters were even alternately coloured. What was interesting is that by doing that one little thing, usage of the search engine increased 30%.

Bert also demonstrated that the creation of a single document paid for their entire system, and made a final point that the right user experience combines functionality, usability, and aesthetics.

I really liked this focus on usability and community, which seemed to be a theme throughout the SharePoint summit, and was really refreshing to see at an AIIM show. I think that it’s safe to say that the large ECM players have not historically been particularly interested in usability.

Records Management

Microsoft waded into the records management area with the Records Center in 2007. It didn’t exactly meet with glowing reviews, but they’ve really hit it out of the park in 2010. Through the new Records Center, SharePoint supports all of the traditional records management requirements, with file plans etc., but at the same time, it brings RM to the end user through in-place records management. Users no longer need to go through many steps and secret rituals to get documents under management; a document (and any other piece of content!) can be declared a record with a simple click of a button. Document routing makes sure that, if necessary, the content moves to the Records Center while leaving behind a stub.

Ryan Duguid showed a slide which indicated that, if left unchecked, an organization that currently manages 2 TB of data will be managing 45 TB of data in 5 years’ time. However, if disposition policies were put in place that disposed of 10, 20, or 30% of content annually, that future number would shrink to 25, 10, or 4 TB respectively. The RM features in SharePoint 2010 can help bring this about.

Interestingly, the next day Cyrus Mistry gave a talk on the way that Google manages their content. In essence, they don’t. The mantra is to keep absolutely everything forever, open it up to everyone, and rely on search to find it. I actually agree with the opening-up concept, but I think it’s impractical, not to mention legally dangerous, to leave stuff lying around forever.

Cyrus also pointed out a couple of policies that I might consider implementing. One is that every Google employee writes a small blurb (very short) on their past week’s activities, and what their plans for the next week are. That is visible to everyone. I sort of like it from a few angles. Another is that users can contribute ideas to a central “idea pool”. Ideas are then voted upon, and if an idea gets enough votes, it becomes a project.

CMIS Connector Announced

At the show, it was announced that Microsoft will be shipping a connector for the Content Management Interoperability Services (CMIS) standard. This will allow SharePoint to act as a “front end” for external content management systems, and vice versa, allowing for easy integration with legacy document management systems and giving the users of those systems a better experience without sacrificing capability.
