
SharePoint 2016 Team Sites and Groups – It All Comes Together

SharePoint is back. With a vengeance.

For the past few years, SharePoint has been relegated to a supporting role within Office 365. It even lost its identity a few years back, with the name “SharePoint” being replaced by the bland “Sites”. This has been exacerbated recently by the rise of Groups (referred to either as Outlook Groups or Office 365 Groups). If Groups are the way forward, what value can SharePoint provide?

A lot, as it turns out.

Office 365 Groups

I refer to them as Office 365 Groups, because they incorporate elements from multiple Office 365 products. However, they are also referred to as Outlook Groups, which is the name of the mobile app. The interesting thing is that if you scratch the surface of the Groups user interface in either OneDrive or OneNote, you can see pretty quickly that it’s really a SharePoint site, or more specifically a site collection. Just look at the URL.


In SharePoint, a Group is a specific type of site collection with a single document library for files (the OneDrive), and a library for other supporting files (including the Group OneNote). The difference is that a Group is what it is – it can’t be extended or modified to any significant degree. You can’t even access the Site Contents page or site settings by adding “/_layouts/15/viewlsts.aspx” to the site URL (if you try, you get redirected to the main OneDrive view of the Group). SharePoint is really just there as the container. This can be frustrating, because as anyone who has worked with SharePoint knows, it can be much, much more.

The introduction of Groups initially caused confusion, particularly for users of SharePoint team sites, or Yammer. Was OneDrive replacing SharePoint (which is kind of silly… OneDrive IS SharePoint)? Did the new Exchange-based conversations mean that Yammer was dead? Those questions have been hanging out there unanswered for quite some time.

Once I understood them, I came to really like Office 365 Groups. They bring together multiple tools into a single coherent location with a clear security boundary, and they are relatively simple to manage. One of the criticisms of Groups has been that there is no single central UI. Groups are manifested in Exchange, SharePoint, OneDrive, OneNote and Power BI, but there’s really no central starting point for a group. It’s like a city of suburbs in search of a downtown.

There have been more than a few detractors of Groups as well. Most of the criticisms relate to their immaturity. The Outlook conversations provide excellent email integration (obviously), but are not as full featured as Yammer in other ways. There have been several other complaints, but the biggest one seems to me to be the fact that a SharePoint team site provides much more functionality than a simple OneDrive library. These factors have been a significant blocker to the adoption of Office 365 Groups.

That all changed with the Future of SharePoint event on May 4, 2016.

The New Team Sites

Team sites have been the traditional place for groups of people to work in the world of SharePoint. These sites would be decorated with web parts, both out of the box and custom, in order to augment their capabilities and to provide a window into other team-based content structures such as calendars, custom lists, reports, etc. Team sites have always seemed like the logical starting place for group data, and now they are.

Beginning in mid-2016, whenever a new Group is created, a new team site will be created as well. Conversely, a new team site will create an Office 365 Group, with all of its components (OneDrive, OneNote, mail address, Planner, Power BI workspace). To be totally clear, this new style of team site is a SharePoint site collection, and not a subsite (or web), which means that its security is bound to that of the Group.

Yammer users may wonder what this means for the previously announced integration with both Groups and Azure Active Directory. Nothing was announced at the event, so this is pure speculation on my part, but I would have to assume that if there is to be a 1:1 correspondence between Office 365 Groups and Yammer groups, Yammer will be a part of this as well. Given SharePoint’s strengths, I can only assume that this will be the place where all non-conversational Yammer content is stored (files, calendars, etc.).

The new team site will intrinsically integrate many of the things that formerly needed to be added on later, and the new Office 365 connectors mean that many other content sources can be added with a minimum of effort.

External Sharing

While both SharePoint and Yammer have had external sharing for several years now, and Yammer now has external groups (with a lowercase g…), Office 365 groups have been restricted to members of the tenant’s Azure Active Directory. Therefore, if we now have a 1:1 correspondence between Groups and team sites, and we are also able to use Yammer as the conversations provider, Office 365 groups need to accommodate external users.

The good news is that soon, they will. Thanks to Wictor Wilen’s sharp eye, we can see in the Office 365 admin center that as of this writing, the infrastructure to support external access to groups has already rolled out. Coincidentally (or not), Yammer support of external groups also rolled out in the same timeframe.


The new SharePoint team sites, and their integration with Groups, will give Office 365 the entry point that so many have been missing. It is exceedingly easy and fast to get up and running with a usable site that is automatically integrated across the platform. When you create a Group, you not only have the AAD group, but a team site, a calendar, a distribution list, a conversation platform, a Planner plan and a Power BI workspace. At the same time, it brings SharePoint out of the shadows and back into the limelight.

SharePoint is back at the center of Office 365, and it’s better than ever.


Integrating Microsoft Flow with Power BI for Real Time Reporting

In addition to all of the obvious business benefits that Microsoft Flow brings to the table, one of the things that initially struck me about it was how useful it would be for data acquisition purposes. The first thing that I did with it was to create a flow that queries weather stations from Weather Underground, stores the data in SQL Azure, and uses Power BI to analyze the weather patterns.

I may blog about that solution in the future, but with the Future of SharePoint event rapidly approaching, my BI Focal collaborator, Jason Himmelstein, convinced me that there was something more interesting that we could do with this. How about near real-time monitoring of Twitter conversations for the event? All of the pieces were in place.

We rolled up our sleeves, and in relatively short order, had a solution. Jason has written about the experience on his SharePoint Longhorn blog, and he has included the videos that we put together, so I can be a little less detailed in this post.

Three technologies are currently needed to make this happen:

  1. Microsoft Flow
  2. SQL Azure
  3. Power BI

Let’s go through each one.

SQL Azure

We could store our tweets in a variety of locations (CSV, SharePoint, Excel), and there are already a number of examples out there that demonstrate how to do this. The reason that we want to use a SQL Azure database is twofold. Firstly, Flow has actions for connecting and inserting data into it. That takes care of our storage requirement. The second, and most important part is that SQL Azure databases support DirectQuery in Power BI.

With DirectQuery, Power BI does not cache the data – every interaction results in a query back to the source, in our case the SQL Azure database. This has the effect of making the data available for reporting as soon as it has been delivered by Flow. That’s the theory at least. In reality, Power BI caches certain elements temporarily (dashboard tiles, for example), but this is as close to real time as you can get in Power BI without writing data directly to it via the API. Reports are, for the most part, up to the minute.

You need an Azure subscription to create a database, and the process for creating it is documented in the following video.

[embedyt] http://www.youtube.com/watch?v=Qh-7OrSHimE[/embedyt]

We will be using the Twitter trigger with Microsoft flow, and it has several output variables. We want our table to be able to store the values of those variables in a table, so we use the following script to create that table.

CREATE TABLE Twitter
(
    id INT IDENTITY(1,1),
    RetweetCount INT,
    TweetText NVARCHAR(250),
    TweetedBy NVARCHAR(100),
    CreatedAt NVARCHAR(100),
    TweetID NVARCHAR(50),
    SearchTerm NVARCHAR(50)
);
GO

ALTER TABLE Twitter ADD PRIMARY KEY (id);

Once created, we are ready to fill it with tweets.
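If you want to sanity check the table before wiring up Flow, you can insert a test row by hand. This is just a sketch: the values below are made up for illustration, and Flow will supply the real ones from the Twitter trigger outputs.

INSERT INTO Twitter (RetweetCount, TweetText, TweetedBy, CreatedAt, TweetID, SearchTerm)
VALUES (0, 'Test tweet for the FutureOfSharePoint dashboard', 'TestUser', '2016-05-04 13:00:00', '725000000000000000', 'FutureOfSharePoint');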

Microsoft Flow

The recently announced Microsoft Flow is a tool that allows users to automate and integrate processes from different data sources in the cloud. It is based on Azure Logic Apps, and is currently in preview, but it already supports a wide variety of triggers and actions. You can sign up for Flow, or access your existing flows, at http://flow.microsoft.com.

Flows consist of two primary objects: triggers and actions. Most triggers, and at the moment all actions, are tied to a data connection. You can register your connections as you go, but you can also view and register them en masse by clicking your person icon and selecting “My connections”.


Once registered, you can use “Browse” to start from a template, or you can go to “My flows” to start from scratch. That’s what we’ll do. To start, click on “Create new flow”, and you will be presented with the trigger selector.


Most of the available triggers are events, and the first four are special cases. The recurrence trigger allows you to schedule your flow. This is what I use for my weather gatherer – it just calls a web page every 5 minutes and passes the result into the next action. The external content source triggers are listed in alphabetical order, so we just scroll down to the Twitter trigger and select it.


If you have already registered a Twitter account, it will be used by default. If you want to change it, or add a new one, just click on “Change connection”. It’s a good idea to use multiple Twitter accounts if you’re doing multiple queries to avoid running afoul of Twitter’s rate limiting. Finally, just enter the search term in the Query Text box. Any new post of that term on Twitter will launch the flow.

Next, we need to add the “SQL Azure – Insert Row” action. To do so, click on the “+” symbol, click “Add an action”, and then click “Load more” at the bottom. Scroll down and select the action.

Again, if you have a database registered, it will be selected by default. If you have multiple databases registered, or want to add more, click on “Change connection”. Once you have the correct connection selected, you can click on the dropdown and select the correct table (the one created above). Once selected, the fields will load into the action.


Populating the fields is a simple matter of selecting the appropriate output variable from the Twitter trigger. The final field, SearchTerm, is used to distinguish between different Twitter searches. Each flow only triggers on one term, but we want to set up multiple flows. We manually enter the value here (in our case “FutureOfSharePoint”). Later, that will be used as a slicer in Power BI.

Once complete, give the Flow a name, click on “Create Flow”, and then “Done”. At that point, you really are done. That’s it, that’s all there is to it. You can query SQL Azure to check for data, and you can also use the information icon to check on the status of Flow runs.
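For the data check, a couple of simple queries from SQL Server Management Studio (or any tool that can connect to the database) will show the latest rows captured and the tweet volume for each search term. This is a sketch, assuming the table created above:

-- Latest rows captured by Flow
SELECT TOP 10 * FROM Twitter ORDER BY id DESC;

-- Tweet volume per search term (the value entered in the SearchTerm field)
SELECT SearchTerm, COUNT(*) AS TweetCount
FROM Twitter
GROUP BY SearchTerm;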


All of these steps are well documented in Jason’s video below:

[embedyt] http://www.youtube.com/watch?v=kHwcR1sWDiY[/embedyt]

Power BI

We want to surface this data with Power BI. We can do this directly from the web interface, but we have a lot more options if we design the report with Power BI Desktop. The next step is to launch Power BI Desktop, select “Get Data”, select “Microsoft Azure SQL Database”, and press the “Connect” button. At this point, you enter the details about the Azure SQL Server and database, and most importantly, select the DirectQuery option.


The import option will retrieve data from the SQL database and cache it in an embedded model within the report. Once published, the Power BI service can keep it refreshed, but no more than 8 times per day. This is contrasted with DirectQuery, where no data is persisted in the service, and every interaction results in an immediate call back to the data source. For frequent updates, this is what we need.

A word of caution here – we pay a significant penalty from a feature standpoint when using DirectQuery mode. Most of the functions in Power Query and many of the functions in DAX are unavailable to us in this mode. However, with this particular data set, these restrictions are an acceptable tradeoff for the frequent updates.
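One way to soften these restrictions is to do as much of the shaping as possible in the database itself, so that the DirectQuery model only sees clean columns. As a sketch (the view name, the TweetLength column and the date conversion are my own additions, and the conversion assumes the CreatedAt text is in a format SQL Server can parse), something like the following view could be used as the source instead of the raw table:

CREATE VIEW TwitterForReporting AS
SELECT
    id,
    RetweetCount,
    TweetText,
    LEN(TweetText) AS TweetLength,
    TweetedBy,
    TRY_CONVERT(datetime2, CreatedAt) AS CreatedAtDate, -- NULL if the text does not parse
    TweetID,
    SearchTerm
FROM dbo.Twitter;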

Again, Jason has done a great job explaining the steps required to build the reports and dashboards in the video below, so I am not going to repeat them here.

[embedyt] http://www.youtube.com/watch?v=U71stcLZoBc[/embedyt]

Once the report is published, you may want to present it to a wider audience. You can do that through dashboard sharing if your recipients have access to Power BI, or you can publish it anonymously. Given that this is Twitter data, it’s certainly public, and there is no harm in doing so.

To publish the report anonymously, simply open the report in the Power BI service, and select File – Publish to web.


You will then be presented with a dialog box that will give you both a link and an embed code for 3 different possible renditions of the report. Simply select the one you want to use and paste it into the ultimate destination. My report can be seen below, and I will likely update it from time to time to follow current events.

One thing to keep in mind about reports shared anonymously is that even though the report is using DirectQuery, the visuals are only updated approximately every hour. The above report will lag reality by about an hour.

You can see here the power of these tools working together. Flow is an easy-to-use yet powerful integration tool. SQL Azure is a rock solid database available in the cloud to other cloud services, and Power BI allows rapid insights to be built by power users. No code was harmed in the building of this solution, but regardless, it’s still quite powerful.

From here, I can only see it getting better. My ask from the Flow team? A Power BI action that pumps data directly into a Power BI data model, eliminating the need for the Azure database and allowing for self-updating visuals in Power BI. But that’s a topic for another day.
