

Configuring Profile Import in SharePoint 2010 – A Way Around the Minefields

Author’s Note – July 17, 2012. When I originally wrote this in early 2010 it was based on some (at the time) sketchy Technet documentation and experience. It was meant to be an easy-to-understand guide to setting up the User Profile Service. I want to point out that there is a much more comprehensive guide out there on the topic, the Rational Guide to the User Profile Service by Spencer Harbar. It’s the reference I use when I get into trouble, and if this article doesn’t do it for you, I recommend going there.

Profile synchronization has changed drastically in SharePoint 2010 compared to 2007. The 2010 profile synchronization is built on the Forefront Identity Manager (FIM) services. There are a lot of good things about this, one of which is that it provides for bi-directional synchronization. Changes that users make to their profiles can be synchronized back to the identity store (Active Directory, etc.). This of course can be tightly controlled, on a field-by-field basis. There’s a great post on how to do so here.

Of course, all of this added power is not without its cost in complexity. I had a great deal of trouble even getting profile imports to work at all with some of the pre-release builds, and the final release is still a little rough around the edges. There is a Technet document available here that details precisely how to configure profile imports. It’s completely accurate, but doesn’t necessarily answer all of your questions. This post is my attempt to help guide around the worst of the thorns, and get it working with Active Directory.

First, you’ll need to have a Profile application running. If you’ve done a migration, you already do. If you’ve run the setup wizard and selected the User Profile Service Application, you also already have one. Synchronization, however, will not be happening, even if you’ve done a migration. It will need to be configured.

If you don’t already have a Profile Service application, you’ll need to create one. You do that from the Manage Service Applications screen by choosing the New button in the upper left and selecting “User Profile Service Application”.
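If you prefer scripting to the GUI, the same application can be created with PowerShell. This is only a minimal sketch; the names, application pool account and database names below are placeholders, and the databases and MySite host it asks for are the same ones described in the dialog below:

    # Create the User Profile Service Application (all names here are examples only)
    $pool = New-SPServiceApplicationPool -Name "UPA App Pool" -Account "CONTOSO\spservices"
    $upa = New-SPProfileServiceApplication -Name "User Profile Service Application" `
        -ApplicationPool $pool `
        -ProfileDBName "UPA_Profile" `
        -ProfileSyncDBName "UPA_Sync" `
        -SocialDBName "UPA_Social" `
        -MySiteHostLocation "http://mysites.contoso.com"

    # Publish it to your web applications through a proxy
    New-SPProfileServiceApplicationProxy -Name "User Profile Service Application Proxy" `
        -ServiceApplication $upa -DefaultProxyGroup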

New User Profile Service

There is a lot of configuration here; make sure that you scroll to the bottom and fill out all of the relevant options. You’ll be choosing a database for storing user profiles, a database for synchronization data, a database for social tagging, and the URL of your MySite host. If you haven’t already created a MySite site collection, the configuration screen here will allow you to do so. Once you have successfully created the application, you may then proceed to starting the service. This, of course, is done in a completely different area, Services on Server, where the services on each server are controlled. Once there, start the User Profile Synchronization Service.

User Profile Synchronization Service

When you start most of the services, they either start immediately or give you a configuration screen and start quickly thereafter. Clicking this one gives you a configuration screen where you’re prompted to associate the service with the application created above, and to supply the credentials that the two Windows services will run under. However, once completed, you’ll notice that the service sits in a “Starting” state. This is normally a bad sign with SharePoint, indicating that something is hung. Not so in this case. It takes a very long time for this service to start. When it does, you should see the following two services started:

New Profile Import Services
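If you’d rather watch the progress from PowerShell than keep refreshing the Services on Server page, a couple of read-only checks will show the provisioning state and the two FIM Windows services (nothing here starts or stops anything):

    # Provisioning state of the User Profile Synchronization Service instance
    Get-SPServiceInstance | Where-Object { $_.TypeName -like "*Profile Synchronization*" } |
        Select-Object TypeName, Status

    # The two Forefront Identity Manager Windows services that it provisions
    Get-Service FIMService, FIMSynchronizationService | Select-Object Name, Status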

DO NOT attempt to start these services manually, as that will confuse the system in a very big way. Just have patience, and all should be well. You should now be ready to go ahead and perform an import. However, there’s a very important step that likely needs to be performed first. The service account that was specified above to run the synchronization service (the one that the two Forefront services are running as) needs to be granted the “Replicating Directory Changes” permission in Active Directory. There is another Technet article on how to do this, so I won’t go into detail here.
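For reference, the permission is usually granted either through Active Directory Users and Computers or with a single dsacls command run by a domain administrator; the domain and account names below are placeholders:

    # Grant the sync account Replicating Directory Changes on the domain (placeholder names)
    dsacls "DC=contoso,DC=com" /G "CONTOSO\svc-upsync:CA;Replicating Directory Changes"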

If you do not perform this step, your import will fail, and you’ll have very little idea as to why. The error message is far from clear. Below I’ll talk a little about how you can troubleshoot import issues.

The next step is to set up the import itself. To do this, open up or manage your Profile service application, and select “Configure Synchronization Connections”

Create New Profile Connection

Once here, you either edit your existing connection(s) or create a new one. The options for the import source are greatly increased from 2007 and now include Active Directory, Active Directory Resource (ADAM), Active Directory Logon Data, BDC (they’ve forgotten to rename it to BCS), IBM Tivoli, Novell eDirectory, and Sun Java System Directory Server. Another improvement in this version is that the import can now use Forms authentication credentials, and can also make use of Claims-based authentication if available.

Give the connection a name, select an authentication type, and supply the appropriate credentials. Once you have done this, press the “Populate Containers” button. If everything is OK, you should see your import source appear below:

Profile Import Container Selection

This dialog is NOT particularly responsive; be prepared for long waits between mouse clicks. The beauty of it is that you can now very easily cherry-pick which containers, and even which users, get imported. This is not something that was straightforward in 2007. Once completed, select OK, and your connection should be created. Note – subsequent edits of the connection will not retain the credential information. This is normal.
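There is also an Add-SPProfileSyncConnection cmdlet that can create an Active Directory connection from PowerShell, although the UI remains the simpler route. Treat the following as a hedged sketch; the forest, OU and account names are placeholders:

    # Create an AD synchronization connection (placeholder names throughout)
    $upa = Get-SPServiceApplication | Where-Object { $_.TypeName -like "*User Profile*" }
    Add-SPProfileSyncConnection -ProfileServiceApplication $upa `
        -ConnectionForestName "contoso.com" `
        -ConnectionDomain "CONTOSO" `
        -ConnectionUserName "svc-upsync" `
        -ConnectionPassword (ConvertTo-SecureString "PasswordHere" -AsPlainText -Force) `
        -ConnectionSynchronizationOU "OU=Employees,DC=contoso,DC=com"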

At this point you are ready to perform your first import. Return to the Profile Service Application main page, and in the Synchronization section, select “Start Profile Synchronization”. You then have the option to choose full or incremental sync. Once selected, synchronization runs, and you can keep track of its status on the main profile application screen:

User Profile Service Application with import job running
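If you want to kick synchronization off from PowerShell instead of the UI, the object model exposes a method for it. Treat this as a sketch rather than gospel – the type and method names are from the Microsoft.Office.Server.UserProfiles assembly as I recall them, and the Central Administration URL is a placeholder:

    # Kick off a profile synchronization run from the SharePoint 2010 Management Shell
    $site = Get-SPSite "http://centraladmin.contoso.com:2010"   # placeholder URL
    $context = Get-SPServiceContext $site
    $upcm = New-Object Microsoft.Office.Server.UserProfiles.UserProfileConfigManager($context)
    $upcm.StartSynchronization($true)    # $true = full, $false = incremental
    $upcm.IsSynchronizationRunning()     # poll this to watch progress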

The import job will take a very long time compared to import jobs in 2007. The good news is that the status messages are relatively verbose. If, however, you feel that the job is taking too long, or that there is a problem you can’t locate, an excellent tool to use is the Forefront Identity Manager synchronization client UI. This will be installed on your server at:

<install drive>:\Program Files\Microsoft Office Servers\14.0\Synchronization Service\UIShell, and the application to run is miisclient.exe. This application is installed by default, but there is no entry in the Start menu. Running this file will give you a screen that looks like the following:
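If you don’t feel like browsing to it, a one-liner from a PowerShell prompt on the server will open it (assuming the default installation drive). Note that your account will generally need to be a member of the local FIMSyncAdmins group on that server for the client to open:

    # Launch the synchronization client UI from its default install location
    & "C:\Program Files\Microsoft Office Servers\14.0\Synchronization Service\UIShell\miisclient.exe"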

Profile Synchronization Manager Client for SharePoint 2010

This screen shows no problems, but if there are any, they will be displayed for each step in a great amount of detail. I’ve used it to troubleshoot a few connection issues already.

In conclusion, the new profile synchronization system in 2010 has quite a few more moving parts, is a little rough around the edges (at the moment), and can be a bit of a bear to get going. However, its new capabilities make it well worth the effort, and they lay the groundwork for what I expect will be some great new features down the road.


Cabo Pulmo 2009 Dive Pictures

After the SharePoint 2009 Conference last October, a few of our crew took a detour to a very tiny town at the end of the Baja called Cabo Pulmo. It was my third trip there (more pictures to come) and obviously I highly recommend it. This time we stayed at Reinhard’s Rentals and completely loved it. I’ve stayed there previously. We dove with the Cabo Pulmo Beach Resort and they were excellent once again. Every time I’ve returned they’ve had a different staff, and each time they were great. I think that speaks to an excellent dive operation, not unlike my friends at Abyss Dive Shop in Playa Del Carmen. You can see the pictures on Flickr here or view the slideshow below.

[flickrslideshow acct_name=”wpages” id=”72157623920434822″]

 


Observations from AIIM SharePoint Summit

I just returned from the recent AIIM Expo show in Philadelphia. I haven’t been to AIIM since 2002, when we were building out our imaging product. I had a couple of reasons to go this time. I’ll keep the first one to myself for now, but the other was because Microsoft was making a big splash with the SharePoint Summit. I was very interested in what their messaging would be like to the “hard core” ECM market. All of the sessions that I attended were in the SharePoint Summit track.

The keynote was delivered by Eric Swift, the new GM of the division, and by Ryan Duguid, the “ECM guy” at Microsoft. Eric gave a very good talk, and I was extremely impressed by Ryan, who spoke at several sessions throughout the show.

Usability

A couple of great quotes from Ryan – “ECM works when it’s invisible to the end user”. I couldn’t agree more. People will use a system when they can see a value for themselves, and when it won’t cause them much disruption. Far too often systems are “imposed” upon end users, and one thing that’s certainly true about information workers is that if they can find a way around a difficult system, they’ll take it.

Ryan also said “If you can’t show users their personal payback, they’ll never adopt your system”, which is likely why, according to Doculabs, 50% of all ECM projects fail. Doculabs attributes this to an exclusive focus on one specific area of functionality required by one specific part of the business, without taking into account the needs of the wider user community. All of which is saying the same thing.

The final keynote session was presented by Ryan and Bert Sandie from Electronic Arts. Their talk was on how to provide an excellent user experience, partly by using gaming principles. On the surface, that sounds odd, but it makes a ton of sense. If you make tasks more interesting, people will be more likely to perform them.

As an example, Ryan demonstrated Ribbon Hero, which is an add-in to the Office suite. It installs a button in the ribbon and presents you with a set of challenges. The challenges are application-related tasks; it helps you to varying degrees as you perform them, and you gain skill points by doing so. It allows you to compete with others, increasing your motivation. If you really want to drive use, hand out weekly rewards for “top scores”. A perfect example of applying the gaming concept.

Another concept that came out of this session, and one that I’ve been preaching for years, is that you should always include and understand the end users in any application design. Look at what people do; don’t tell them that it’s wrong – adapt it into your solution, and ideally improve it. If you don’t provide users a means of doing what they need to do inside the organization, they will find a way to do it outside.

Bert presented an interesting case study in usability. If you are familiar with the default search page in SharePoint, you’ll know that it is even simpler than Google’s. It’s essentially a white page with a search box and a go button. EA took that page and decorated it to look almost exactly like Google’s. Of course it said Electronic Arts instead of Google, but the letters were even alternately coloured. What was interesting is that by doing that one little thing, usage of the search engine increased 30%.

Bert also demonstrated that he could show that the creation of a single document paid for their entire system, and made a final point that the right user experience combines functionality, usability and aesthetics.

I really liked this focus on usability and community, which seemed to be a theme throughout the SharePoint summit, and was really refreshing to see at an AIIM show. I think that it’s safe to say that the large ECM players have not historically been particularly interested in usability.

Records Management

Microsoft waded into the records management area with the Records Center in 2007. It didn’t exactly meet with glowing reviews, but they’ve really hit it out of the park in 2010. Through the new Records Center, SharePoint supports all of the traditional records management requirements, file plans and so on, but at the same time it brings RM to the end user through in-place records management. Users no longer need to go through many steps and secret rituals to get documents under management; a document (or any other piece of content!) can be declared a record with a simple click of a button. Document routing makes sure that, if necessary, the content moves to the Records Center while leaving behind a stub.

Ryan Duguid showed a slide which indicated that, if left unchecked, an organization that currently manages 2 TB of data will be managing 45 TB of data in 5 years’ time. However, if disposition policies were put in place that disposed of 10, 20 and 30% of content annually, that future growth number would shrink to 25, 10 and 4 TB respectively. The RM features in SharePoint 2010 can help bring this reality about.

Interestingly, the next day Cyrus Mistry gave a talk on the way that Google manages their content. In essence, they don’t. The mantra is to keep absolutely everything forever, open it up to everyone and rely on search to find it. I actually agree with the opening up concept, but I think it’s impractical, not to mention legally dangerous to leave stuff lying around forever.

Cyrus also pointed out a couple of policies that I might consider implementing. One is that every Google employee writes a small blurb (very short) on their past week’s activities, and what their plans for the next week are. That is visible to everyone. I sort of like it from a few angles. Another is that users can contribute ideas to a central “idea pool”. Ideas are then voted upon, and if an idea gets enough votes, it becomes a project.

CMIS Connector Announced

At the show, it was announced that Microsoft will be shipping a connector for the Content Management Interoperability Services (CMIS) standard. This will allow SharePoint to act as a “front end” for external content management systems, and vice versa. This will allow for easy integration with legacy document management systems, and give the users of these systems a better experience without sacrificing capability.


Mapping in SQL Server 2008 R2 Reporting Services

The latest version of Reporting Services, due out any day now, has built-in support for mapping. This is a very welcome addition and adds significantly to the “cool” factor. I recently completed a quick demo project using public data available from Elections Canada, and I will be sharing some observations over the next few days. I’m a newbie to mapping, so I’m sure that the GIS folks out there will think this is quaint, but this really does take a technology that was previously available to a select few and makes it available to a much broader audience, something that Microsoft is particularly good at.

Firstly, I have to mention that this was built using the November CTP of SQL Server, and was implemented on SharePoint 2010 beta. Technology Preview on Beta would typically be a recipe for frustration, and while there were bugs, it went together surprisingly well.

There are essentially two major data components to a map report: the spatial data and the analytic data. The spatial data contains the map elements themselves, along with fields that will be used to relate to the analytic data, while the analytic data contains the values that are to be mapped. In my first example, I have data that describes all of the riding boundaries in Canada, along with their metadata (riding number, province, etc.). I also have election result data that can be grouped by riding number, so a relationship can be created between them. I’ll demonstrate below.

You can create a map in any Reporting Services report, either by using the Report Designer in Business Intelligence Development Studio (BIDS), or by using the new Report Builder end-user application. Using BIDS, you drop a map control onto the design surface to start the map wizard, while Report Builder gives you the option when the report is created.

[Screenshots: starting the map wizard in BIDS and in Report Builder]

The wizard then prompts you for your spatial data. You have three choices, as shown above. The map gallery lets you choose from the full list of pre-packaged maps available in the product, consisting of a grand total of one country, the USA. To be fair, this is by design. There are so many border disputes in the world that the US is the only safe bet. In fact, as a Canadian, we share a border with four different countries, and unless I’m mistaken, we have border disputes with three of them. This leads us to the next choice, the ESRI shapefile. These files are the de facto standard for spatial data, and are readily and freely available in the public domain. The shapefiles used in this example came originally from Elections Canada. Finally, SQL Server, starting with 2008, supports a spatial data type. This is essentially a series of polygons. It’s a little-known feature, and spatial data is supported directly within Management Studio, as can be seen below:

[Screenshot: spatial results displayed in SQL Server Management Studio]

Whenever a query result contains spatial data, a little “Spatial results” tab appears which allows you to visualize the results.

Which one to choose? Well, importing the shapefile directly is fast, but using SQL data gives you much more flexibility from a querying standpoint. Also, if the gallery or a shapefile is chosen, the data is embedded directly in the report, which is not the case with SQL.

Luckily, there’s an excellent little utility available from SharpGIS that makes it very easy to move your shapefiles up to SQL Server.
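Once the shapes are in a table, it’s worth running a quick sanity check before pointing a report at them. The server, database, table and column names below are only examples of what a shapefile import typically produces:

    # Peek at the imported boundaries (example names only); STAsText() renders the geometry as WKT
    # Run from a SQL Server PowerShell (sqlps) prompt, or anywhere the SQL cmdlets are loaded
    Invoke-Sqlcmd -ServerInstance "localhost" -Database "Elections" `
        -Query "SELECT TOP 5 FED_NUM, geom.STAsText() AS BoundaryWKT FROM dbo.RidingBoundaries"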

The next screen you see will depend on your choice, but in the case of SQL you can either choose a shared dataset that has the spatial data, or you can create an embedded dataset for this purpose. Once the data has been defined, you can configure the map’s viewport.

[Screenshot: configuring the map viewport]

Here you set the zoom level, and can set the panning parameters. You can also choose the type of data that the map is to use – in this case polygon, but other options are point and line (for single point and route type data). Also of note are the two options at the bottom. You can choose to embed map data in the report (removing reliance on the back end data for the map data, effectively creating a snapshot). The second option is to add a Bing Maps layer – allowing you to add the richness of Bing Maps to your visualizations. However, this can only be done if your map data is geographic. Geographic data maps directly to real geographic positions (latitude and longitude) whereas planar data is simply a set of shapes that fit together.

The next window allows you to choose your visualization.

[Screenshot: choosing a map visualization]

The basic map will only show the data in the map itself, and requires no further data. The color analytical map essentially fills in the regions according to analytical data, and the bubble map positions analytical data on the map. Additional layers can be added to any map after the fact. In our example, we choose the color analytical map.

The next screen prompts us to either choose a shared dataset that contains our analytical data, or to create a new dataset. Once we have selected the data to use, the next dialog box allows us to establish the relationship between the map data and the analytical data. In our case, the field FED_NUM in the map data corresponds to the EDANum field in our election results data.

[Screenshot: relating the FED_NUM spatial field to the EDANum analytical field]
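For the analytic side, the dataset only needs the EDANum field plus whatever values you want to visualize. The table and column names below are purely illustrative, since the Elections Canada results can be loaded in any number of shapes:

    # Example analytic dataset: total votes per riding and party (illustrative schema only)
    Invoke-Sqlcmd -ServerInstance "localhost" -Database "Elections" `
        -Query "SELECT EDANum, PartyName, SUM(Votes) AS TotalVotes
                FROM dbo.CandidateResults
                GROUP BY EDANum, PartyName"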

Finally, we choose a theme (which defines the “chrome” of the report), the field to use to determine our colouring, the patterns to use to visualize the data, and what to display as a label, if anything.

[Screenshot: choosing the theme, colour rules and labels]

And that’s all there is to creating a basic map! At this point, you can click Run in Report Builder, or Preview in BIDS, and you should see the results of your analysis as below, which in our case are the results of the 2008 Canadian general election on a riding-by-riding basis.

[Screenshot: the finished map showing 2008 Canadian general election results by riding]

Many things need to be adjusted further (zoom, which color to use for which party, and a number of other things), and I’ll cover some of these in a future blog post, but as you can see, the map wizard is quite capable of delivering immediate results.

Bring on the RTM of SQL Server 2008 R2!
