Month: December 2014

This Column Name is not What You Think – Be Careful With Column Names in SharePoint 2013 / Office 365

I’ve been involved with SharePoint for a very, very long time – since 2001, in fact. One of the basic truths that we old-timers learned very early on was that an internal list column name isn’t necessarily the same as its display name. For the most part, when a column is created, the two names are the same, but any subsequent changes to the name only affect the way that it is displayed – the internal name stays the same. In fact, even on creation, if a space is used in the name, the space is encoded as “_x0020_” in the internal name when saved. This is why it is a best practice to create all entities with no spaces in their names, and then edit them afterward to add any desired spaces. It just makes life easier for people working with internal names.
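On an on-premises farm, this behaviour is easy to see from PowerShell. Here is a minimal sketch – the site URL and list name are hypothetical:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web  = Get-SPWeb "http://intranet/sites/demo"    # hypothetical site
$list = $web.Lists["Banners"]                     # hypothetical list

# Create a column with a space in its name; Add returns the internal name
$internal = $list.Fields.Add("Destination URL", [Microsoft.SharePoint.SPFieldType]::Text, $false)
$internal                                         # -> Destination_x0020_URL

# Rename it; only the display name changes
$field = $list.Fields.GetFieldByInternalName($internal)
$field.Title = "Link"
$field.Update()
$field.InternalName                               # -> still Destination_x0020_URL

$web.Dispose()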

Who works with internal names? Anyone who needs to extend SharePoint, really. Any explicit reference to a SharePoint column normally needs its internal name, which makes sense, given that display names are editable. Because of this, I’ve long been in the habit of being careful with column names when creating them, so that it’s easy to reference them later.

Recently, I ran into a very odd behavioural change with SharePoint 2013 (and with Office 365). My situation was that we needed to display a rotating banner on a page. I decided to use Marc Anderson’s SPServices to do the heavy lifting. Also, since I’m not that great at jQuery (or JavaScript, for that matter), I turned to Mark Rackley’s blog, which had an excellent example of an image slider using SPServices.

The way that the slider works is by reading entries from a custom list. This list contains a couple of columns named HTML and Picture. I added a third named DestinationURL to make the image clickable. I created this new list, and then edited it using the “traditional” method of accessing List settings. The HTML and DestinationURL fields were simple text fields, and the Picture field was a Picture column. Once created and populated, everything worked great.

Next up, I had to repeat this on a different site. I couldn’t save the custom list as a template because the destination site was in a different language. I therefore decided to just go ahead and create another list manually, but this time, I added the fields with the new SharePoint 2013 “Quick List Editor”. If you haven’t seen it before, it shows up as a “+” symbol in the list view.

[Screenshot: the Quick List Editor’s “+” (add column) button]

You simply click the “+” symbol, choose the column type, type the name of your column, and keep going. If you choose the “More Column Types…” option, you are presented with the traditional column creation dialog box. I needed to use this for my “Picture” column.

[Screenshot: the Quick List Editor’s column type menu]

Once complete, everything looked great.

[Screenshot: the completed list columns]

I then went ahead and populated the list with content, and implemented the slider code on the home page. The trouble was that it wouldn’t work. After tearing my hair out a fair bit, I found that while the Picture column was returning data, the HTML and DestinationURL columns were not. I found this very odd, as I had been careful with the names, and this was a new list. Additionally, the Picture column is the more complex type – if any column was going to give me grief, I’d have thought it would be that one.

Finally, I decided to confirm the column names, because I don’t trust myself. The easiest way to do that is to go into List Settings and hover the mouse over the column definition – the destination URL that pops up contains “Field=fieldname”. I then did so for my HTML field:

[Screenshot: the field edit URL for the HTML column, ending in “Field=vtwo”]

What? The field name is “vtwo”. Upon checking, the DestinationURL field had an equally random name (“vn1m”). However, the Picture field was “Picture”, as expected. How did this happen? And why did it happen to only two of the columns?

As it turns out, the Quick List Editor is the culprit. When you add columns with it, it has no mechanism to check your column name against existing columns, so it doesn’t even try – it just assigns a random internal name. However, when you choose “More Column Types…”, it loads the traditional column editor, which does perform that check, which is why my Picture field worked as expected.
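A quick way to spot these rogue internal names on an on-premises farm is from PowerShell. A minimal sketch, again with a hypothetical site URL and list name:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web  = Get-SPWeb "http://intranet/sites/demo"    # hypothetical site
$list = $web.Lists["Banners"]                     # hypothetical list
$list.Fields |
    Where-Object { -not $_.Hidden } |
    Sort-Object Title |
    Select-Object Title, InternalName |
    Format-Table -AutoSize
$web.Dispose()

Any field whose InternalName looks nothing like its Title is a candidate for recreation.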

After going back, deleting and recreating my two columns (and repopulating the content), everything worked as expected.

Lesson learned – when changing a list schema, stick to the list settings interface. Don’t use the Quick List Editor for adding new columns.


Check-In Tracking with Power Query and Power View

For me, 2014 was a particularly busy year. I was fortunate enough to have been invited to speak at a number of international conferences, and took a few wonderful personal trips as well. When travelling, I have gotten into the habit of doing Facebook check-ins as I go. It occurred to me, after describing my year to a friend, that since Power Query supports Facebook, I should be able to visualize my travel year easily enough. It wasn’t only easy, it was also very fast.

Acquiring the data

Getting data from your Facebook feed is really quite easy. Once you have Power Query installed, open a new Excel workbook and select the Power Query tab. Then, from the “Other Data Sources” button, select Facebook.

[Screenshot: selecting Facebook from the Other Data Sources menu]

Unless they’ve already been saved, you’ll be prompted to enter Facebook credentials. Once you’ve done so, you’ll be prompted for a Facebook object and a Facebook connection (or feed).

[Screenshot: the Facebook connection dialog]

There is a wealth of data that can be retrieved, but in our case, we’re interested in check-ins, which are expressed in Facebook as posts. Therefore, we select “me” (the currently logged-in user) and “Posts”, and click OK. The Power Query editor loads with all of our Facebook posts. It should be noted that Power Query is using Facebook Graph to perform the query, as can be seen in the function bar:

[Screenshot: the Facebook.Graph function in the Power Query function bar]

That import function can be edited, so any valid Facebook graph expression can be used to query data, not just those automatically generated by the Power Query UI.
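To illustrate, the same query can be issued outside of Power Query with any HTTP client. A minimal PowerShell sketch – the access token is assumed, and v2.2 reflects the Graph API version of the era:

$token    = "<a valid Facebook access token>"
$response = Invoke-RestMethod -Uri "https://graph.facebook.com/v2.2/me/posts?access_token=$token"
$response.data | Select-Object -First 5 type, created_time

Note that Graph pages its results; a real query would need to follow the paging links in the response.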

Unfortunately, “Posts” contains a lot more data than just check-ins, so we need to filter it down. We do that by applying a few transformations. All check-ins are status posts, so we start by filtering the type column, selecting “status” as the filter value.

[Screenshot: filtering the type column to “status”]

Not all statuses are check-ins; however, all statuses with a value for location are. Where is location? It’s inside the records in the “place” column, so first, we need to expand the place column. To do so, click the small “expand” icon next to the column title.

[Screenshot: the expand icon next to the place column title]

Next, select the columns that you want to expand. In this case, all we need is location, so that’s all we select. When OK is clicked, the location data is expanded, and the original place column is replaced with a new place.location column. However, the values for location are themselves records, so they also need to be expanded. We follow the same procedure that we did for “place” to expand them.

[Screenshot: expanding the location record’s columns]

Once the columns are expanded, it is fairly obvious that only some entries have values for longitude and/or latitude. These are our check-ins. We need to filter out all of the other records, and we can do this by filtering latitude (or longitude) using the same approach that we used on the “type” field earlier. This time, we deselect the null value.

[Screenshot: deselecting null in the latitude column filter]

We are left with nothing but check-ins as records.

Now it’s time to clean up the data – we don’t want to load any more data than we need to perform the analysis, so we delete any unnecessary columns. Also, the expanded column names are cumbersome, so we rename those as well. Finally, the two date fields don’t get auto-detected as dates, so we need to change their data types to Date/Time. In our case, the completed query appears as below.

[Screenshot: the completed query in the Power Query editor]
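For reference, the same shaping steps can be sketched in PowerShell against the raw Graph response from the earlier snippet – statuses only, a location present, renamed columns, and dates cast to Date/Time:

# Keep statuses only, and only those with a location
$checkIns = $response.data |
    Where-Object { $_.type -eq 'status' } |
    Where-Object { $_.place.location.latitude -ne $null } |
    ForEach-Object {
        [PSCustomObject]@{
            created_time = [datetime]$_.created_time   # cast to Date/Time
            country      = $_.place.location.country
            state        = $_.place.location.state
            city         = $_.place.location.city
            latitude     = $_.place.location.latitude
            longitude    = $_.place.location.longitude
        }
    }
$checkIns | Format-Table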

Visualizing the data

We are now ready to visualize the data, so we select Close and Load. In our case, Power Query has been set to load only into the data model, but your settings may vary. Since we’ll be using Power View, we need the data in the data model. After the data has loaded, we select the Insert tab, and then select Power View from the ribbon.

[Screenshot: inserting a Power View report from the ribbon]

The first thing that we want to see is a map of our check-ins, so we select Country from the Power View fields list, which loads a list of the countries. Then we select Map from the ribbon, which loads the map. Next, we resize the map to fit our requirements. Finally, we drag the id field into the SIZE area (visually showing the number of check-ins per location), and then we drag the city field under country in the LOCATIONS area. At this point, we can double-click on a data point to drill down to a city.

[Screenshot: the check-in map in Power View]

We want to restrict the data to 2014, so we drag the created_time field into the filters column, select the advanced filters option, and select greater than Jan 1 2014 and less than Jan 1 2015.

[Screenshot: the advanced date filter on created_time]

Finally, we want to show a list of all of the locations, so we click just below the map visualization and select created_time, Country, state, and city from the fields list.

Normally, I’d finish the post up with a screenshot of the finished report, but this post on the Power BI blog yesterday inspired me to try embedding the actual report. Play around and have a look!


Limitations for Power Query OData Feeds in Power BI

One of the features available in Power BI is the ability to take any defined data source and expose it as an OData feed. This is a very simple and quick way to make your existing data available through OData, as it involves a simple check box selection. Complete instructions on setting this up can be found here. There are, however, a few limitations to what you can do that you should be aware of before you head down this path.

Intranet Only

The OData feed feature works through the Data Management Gateway, which is normally used to keep data models stored in the cloud updated regularly with new on-premises data. When a data source is registered, an “enable OData feed” option is made available which, when checked, creates an OData feed URL.

When this feed is used, a connection is made directly from the OData client to the Power BI service, which then redirects communication to the Data Management Gateway. This is important because the actual data connection does not go through the Power BI service – the client machine needs to be able to communicate directly with the machine hosting the Data Management Gateway. This means that the OData feed only works on the intranet; it can’t be shared publicly. For now, at least.

Data Types

The Data Management Gateway, and therefore the Power BI service, doesn’t support all of the data types supported by SQL Server or Oracle. If your table or view uses any field with an unsupported data type, the entire table will be unavailable for use in an OData feed. The table will appear greyed out when the list of tables to expose over OData is being configured.

[Screenshot: the DistrictMaps table greyed out in the OData table list]

In the above case, the DistrictMaps table contained a geography field, which is unsupported. A complete list of supported data types can be found here. If you are using unsupported data types, you may want to consider creating views that do not contain these fields, and exposing those.
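One way to do that is with Invoke-Sqlcmd from the SQL Server PowerShell module. A sketch – the server, database, and column names (other than the excluded geography field) are hypothetical:

$viewSql = @"
CREATE VIEW dbo.DistrictMaps_ForOData AS
SELECT DistrictID, DistrictName      -- every column except the geography one
FROM   dbo.DistrictMaps;
"@
Invoke-Sqlcmd -ServerInstance "SQLSERVER01" -Database "Districts" -Query $viewSql

The view can then be registered as the data source in place of the underlying table.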

Data Sources

Up until recently (version 1.2), the Data Management Gateway only supported performing data refreshes from two on-premises data sources – SQL Server and Oracle, which constrained its value somewhat. Version 1.2 brought support for a wide variety of Power Query data sources, which really changed the game. Now, since OData feeds utilize the Data Management Gateway, we should be able to expose all sorts of data sources as OData feeds, right?

Wrong. Well, not quite, at least. I received a question from Hrvoje Kusulja, who was trying to expose DB2 data as an OData feed through Power Query, but the OData feed option was disabled. After some testing, and communication with Microsoft, I was able to determine conclusively that while Power Query queries are supported for OData feeds, the underlying data sources for those queries MUST be either SQL Server or Oracle. This is identical to the Power Query refresh support in version 1.1 of the Data Management Gateway. Unfortunately, we couldn’t find documentation on this anywhere.

One potential workaround, if you need OData support and your data source isn’t supported, would be to use an ETL tool (such as Integration Services) to pump the data into SQL Server or Oracle, and create the query from there.

The OData feed is a great little feature, and a nice side benefit of using Power BI and the Data Management Gateway. As with any new product, it has limits that will undoubtedly be reduced in the future, but it’s important to know where they are.


Reporting Services Web Part Error After SharePoint Upgrade

I recently completed an upgrade of SharePoint 2010 to 2013 for a customer that was using Reporting Services in SharePoint integrated mode fairly heavily. After the initial upgrade, however, I got the following error whenever I tried to access a page that contained a Reporting Services report viewer web part:

Web Part Error: A Web Part or Web Form Control on this Page cannot be displayed or imported. The type Microsoft.ReportingServices.SharePoint.UI.WebParts.ReportViewerWebPart,Microsoft.ReportingServices.SharePoint.UI.WebParts, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91 could not be found or it is not registered as safe.

The version of Reporting Services on the originating SharePoint 2010 farm was from SQL Server 2008 R2 (version 10), and the version in the new farm was SQL Server 2012 SP1 (version 11). The message pretty clearly indicates that it was having trouble loading a version 10 instance of the web part assembly, so I opened up the web.config file for my application, and sure enough, while there was a SafeControls entry for version 11 of the assembly, there was none for version 10. There was a binding redirect for the assembly itself (redirecting from version 10 to 11), so this was pretty clearly a bug in the installation process (SQL Server install team, take note!).

This only affects content brought forward from an older SSRS system, which is what I had. One solution would be to re-add all of the web parts to the relevant pages. That would put new assembly references in place explicitly, but would also take a great deal of time. Another would be to edit the web.config files of all of the farm’s applications directly. While not generally considered a best practice, this solves the problem, especially since it corrects an oversight in the installer.

Hunting around, I came across my friend Chris O’Connor’s blog post on this very topic. Chris had this exact issue, and a rather large farm with multiple applications to deal with, so he built a PowerShell script to add these entries back in. The complete script is in Chris’ post.
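The supported way to script such a change is through SPWebConfigModification, which reapplies the entry consistently across the farm. A minimal sketch along those lines (not Chris’ actual script – test before using it in production):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$assembly = "Microsoft.ReportingServices.SharePoint.UI.WebParts, " +
            "Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91"
$ns       = "Microsoft.ReportingServices.SharePoint.UI.WebParts"

foreach ($webApp in Get-SPWebApplication) {
    $mod = New-Object Microsoft.SharePoint.Administration.SPWebConfigModification
    $mod.Path  = "configuration/SharePoint/SafeControls"
    $mod.Name  = "SafeControl[@Assembly='$assembly']"
    $mod.Owner = "SSRS-v10-SafeControl-Fix"
    $mod.Type  = [Microsoft.SharePoint.Administration.SPWebConfigModificationType]::EnsureChildNode
    $mod.Value = "<SafeControl Assembly=`"$assembly`" Namespace=`"$ns`" TypeName=`"*`" Safe=`"True`" />"
    $webApp.WebConfigModifications.Add($mod)
    $webApp.Update()
}

# Push the change into the web.config on every server in the farm
[Microsoft.SharePoint.Administration.SPWebService]::ContentService.ApplyWebConfigModifications()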

In any event, regardless of the method used, adding the SafeControl entry to the web.config file(s) fixes this particular issue. On to the next problem…


Error 1603 Installing Reporting Services Add-In onto a New SharePoint Server

I had an interesting (annoying) error this week at a customer site. I was adding a new server to an existing SharePoint farm. This was a relatively new farm, installed a couple of months previously. This farm has SQL Server Reporting Services (SSRS) installed. The new server wasn’t running the SSRS service application, but as with any WFE server, it needed the SSRS SharePoint add-in installed.

Although the SQL Server Reporting Services add-in is included as part of the SharePoint Server prerequisites install, it’s an older version. Since in this case I was using SSRS 2012 SP1, I needed to install it manually. However, attempting to do so resulted in a rather nasty failure (error 1603), and the add-in simply refused to install. I hadn’t seen this one before.

Poking around a bit, I found that you can get this error when the version of the bits installed on a SharePoint server doesn’t match the version that the farm has been configured to. This is the state of a server after a Cumulative Update or Service Pack has been installed, but PSConfig (the Products Configuration Wizard) has not yet been run. I then ran the configuration wizard on the new server, and it did in fact indicate that an upgrade was needed. This was a bit confusing, as I had only just installed the SharePoint bits on this machine. How did this happen?

Well, as has been my practice for some time, whenever I install SharePoint bits on a server, I immediately run Windows Update to make sure that everything is nice and new. It’s a new server – what’s the worst that could happen, right? Wrong. As I was aware (but had forgotten), in September Microsoft started delivering SharePoint updates along with regular Windows updates. This, in my opinion, is not a good thing, and is why, as pointed out by Todd Klindt, you should no longer enable automatic updates on your SharePoint servers. Apparently, you should be careful with manual updates as well.

My little Windows Update run had snuck in some SharePoint updates, causing the upgrade requirement and the problem with the SSRS add-in. In addition, since this was a multi-server environment, it put this server out of step with the other servers. After bringing the rest of the servers up to the latest update level, and running PSConfig on all of them, I was able to install the SSRS add-in on the new server. All was right with the world again.
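If you hit the same situation, checking for and performing the pending upgrade from PowerShell looks roughly like this – a sketch assuming SharePoint 2013’s default install path:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# True means the installed binaries are ahead of the farm's configuration
(Get-SPServer $env:COMPUTERNAME).NeedsUpgrade

# The command-line equivalent of the Products Configuration Wizard
& "$env:CommonProgramFiles\microsoft shared\Web Server Extensions\15\BIN\PSCONFIG.EXE" `
    -cmd upgrade -inplace b2b -wait `
    -cmd applicationcontent -install `
    -cmd installfeatures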
