Category Archives: Cloud Computing

How to Migrate a WordPress site to Azure Using In-App MySQL

Did this site load a little faster than it normally does? You may not have a basis of comparison, but I have noticed that pages load between 2x and 5x faster since I moved the site to Azure hosted WordPress using an In-App MySQL database. Previously I was running it on Azure as well, but it was using the 3rd party ClearDB database server. The performance increase is therefore entirely due to the difference in the database engines.

I have been running this blog as a web app on Azure for the last couple of years, ever since the option became available. In fact, I wrote about how to enable hosting for multiple users on the same database when I first set it up. At the time, setting up a WordPress web app also involved provisioning a MySQL database through a third-party hosting provider, ClearDB. The initial offering was free, but as I quickly found out, it also doesn’t provide much, and I needed to upgrade it through the third party. This arrangement was fraught with difficulty. Aside from the unwelcome additional costs, managing the billing cycle was awkward. In addition, all of my WordPress sites ranged from slightly to very sluggish, and increasing Azure resources didn’t seem to help much.

Over time, I learned that I wasn’t the only one, and the performance problems seemed to stem from latency between Azure and the third-party provider. However, I didn’t want to start messing around with standing up my own MySQL server, and the arrangement was usable, if a tad expensive. Then, a month or so ago, I was listening to my friends Andrew Connell and Chris Johnson on the Microsoft Cloud Show, and they mentioned that Azure had put out a preview of a native MySQL implementation. This was of course music to my ears, so I tried it out, and it appears to work quite well.

I have since moved all our WordPress properties over to this new architecture, and documented the procedure. The approach that I took should work for any WordPress site, whether it is hosted on Azure or not, but the examples I use will of course be going from Azure to Azure. I essentially create a new WordPress site, migrate the site assets to it, configure the new site to match the old one, and then point the address to the new web app. There are quite likely third-party add-ons that facilitate this process, but the procedure described here is manual. I am in no way saying that this approach is a best practice, only that it worked for me. Finally, as noted above, the In-App MySQL is in Preview, not production, so if your WordPress site is critical, it would likely be a good idea to hang on for a bit. I, however, like to live dangerously, and if my blog goes offline for a few hours, it’s not the end of the world.

Here are the steps required to accomplish this.

1. Upgrade the existing site

The new site that will be created will use the latest version of WordPress, and any plugins that get installed will also be the most recent. To avoid any version mismatches, it’s a good idea to make sure that your WordPress version, and all your plugins are up to date.

2. Retrieve the WordPress Assets from the existing Site

You can use the built-in export feature in WordPress to retrieve all the database assets. Open the Tools section, select “Export”, and choose “All Content”.

The types of content will vary depending on your WordPress installation, plugins, etc., but make sure that you select all of them. When ready, select “Download Export File”. You’ll get prompted to download an XML file – put it somewhere safe – you’ll need it later.
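If you want a quick sanity check on the export before moving on, the file is WXR, an RSS-based XML format, so a few lines of Python can count what it contains. This is just a sketch, and the file name is a placeholder for whatever WordPress handed you:

    import xml.etree.ElementTree as ET

    # The WordPress export (WXR) is an RSS document: <rss><channel><item>...</item></channel></rss>
    tree = ET.parse("myblog.wordpress.xml")   # placeholder file name
    channel = tree.getroot().find("channel")

    # Every post, page, attachment, menu item, etc. is an <item> element
    items = channel.findall("item")
    print(f"Export contains {len(items)} items")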

Next up, you’ll want to retrieve your file system based assets. These will be all of your uploaded files (unless you currently use an externally hosted provider), your WordPress themes, and your plugins. Strictly speaking, this step isn’t necessary. You should be able to re-download your themes and plugins, but I have found that they aren’t always available, and that this process is faster. However, if you don’t have access to the file system of the existing site, you may not be able to do this. The uploaded files can be gathered through the import process later as well, but this approach provides an added level of safety.

You’ll use FTP to connect to the file system and copy the files locally. For Azure-hosted sites, you can set the FTP credentials by logging into the “new” Azure portal, selecting the web app for your site, and then navigating to “Deployment Credentials”. Enter a user name and a password, and save them.

Next, scroll down to “Properties” for the web app, and take note of the “FTP Host Name” and the “FTP/Deployment user”. You will use these values to connect to the file system.

Now open Explorer on a Windows PC, right-click on the “This PC” node, and select “Add a network location”.

Follow the prompts, entering the FTP host and the user name when asked. Do not attempt to log in anonymously. Also, take note – the user name has the form web app name\username. When the node opens, enter the password, and you should see 4 folders. Open “site”, then “wwwroot”, and finally “wp-content”. The folders that you need are here.

Specifically, you are looking for the plugins, themes and uploads folders. Copy these folders locally and keep them handy.
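If you would rather script this step than drag folders around in Explorer, the same FTP endpoint can be driven from code. The sketch below uses Python’s ftplib over FTPS; the host name, user name, password and paths are placeholders for the values noted above, and the folder detection is deliberately simple:

    import os
    from ftplib import FTP_TLS, error_perm

    HOST = "waws-prod-xx-000.ftp.azurewebsites.windows.net"   # placeholder - see the app's Properties blade
    USER = "mywebapp\\deployuser"                             # "web app name\username" form
    PASSWORD = "********"

    def download_tree(ftp, remote_dir, local_dir):
        """Recursively copy a remote folder to a local folder."""
        os.makedirs(local_dir, exist_ok=True)
        ftp.cwd(remote_dir)
        for name in ftp.nlst():
            try:
                # If we can change into it, it's a folder - recurse into it
                ftp.cwd(remote_dir + "/" + name)
                download_tree(ftp, remote_dir + "/" + name, os.path.join(local_dir, name))
            except error_perm:
                # Otherwise it's a file - download it
                ftp.cwd(remote_dir)
                with open(os.path.join(local_dir, name), "wb") as f:
                    ftp.retrbinary("RETR " + name, f.write)

    ftps = FTP_TLS(HOST)
    ftps.login(USER, PASSWORD)
    ftps.prot_p()                             # encrypt the data channel
    for folder in ("plugins", "themes", "uploads"):
        download_tree(ftps, "/site/wwwroot/wp-content/" + folder, os.path.join("wp-content", folder))
    ftps.quit()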

3. Create the new WordPress Site

From the Azure admin portal, select “Create New”, and search for “WordPress”. There are several WordPress options to choose from, but the one we’re pursuing is published by WordPress.

Once selected, you will be prompted to fill out the details. Give the new app a name, select the Resource Group, and most importantly, the Database Provider. ClearDB is the one that we are moving away from, so “MySQL In App” is the one that we want to select.

Once you click OK, the App will be created, and WordPress will be deployed to it. The App creation happens almost immediately, but it takes a few minutes for WordPress to be fully deployed. Don’t be alarmed if there’s nothing there for a little while.

Once deployment is complete, you can simply click on the URL of the app in the “Overview” section. The URL will take the form of <web app name>.azurewebsites.net.

A browser will open and you will be prompted to complete the initial WordPress installation. Once that is complete, you will be able to log in to the WordPress administration portal.

4. Upload the Older Assets to the new WordPress Site

The next thing that we want to do is to upload the assets that we downloaded in step #2 to the new site. To do this, simply connect to the new file system via FTP by following the same steps that were used to connect to the old site in step #2. Once connected, upload the 3 folders to the wp-content folder of the new site. If there are folders that already exist, or that you don’t want to use in the new site, simply omit them from the upload. Once uploaded, we can activate the features.
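As with the download, this step can be scripted rather than done by hand in Explorer. Here is a minimal FTPS upload sketch under the same assumptions; the host, credentials and folder names are placeholders for the new web app’s values:

    import os
    from ftplib import FTP_TLS, error_perm

    def upload_tree(ftp, local_dir, remote_dir):
        """Mirror a local folder into the new site's wp-content folder."""
        try:
            ftp.mkd(remote_dir)               # create the remote folder if it doesn't exist yet
        except error_perm:
            pass                              # it probably already exists
        for name in os.listdir(local_dir):
            local_path = os.path.join(local_dir, name)
            if os.path.isdir(local_path):
                upload_tree(ftp, local_path, remote_dir + "/" + name)
            else:
                ftp.cwd(remote_dir)
                with open(local_path, "rb") as f:
                    ftp.storbinary("STOR " + name, f)

    ftps = FTP_TLS("waws-prod-yy-000.ftp.azurewebsites.windows.net")   # the NEW app's FTP host (placeholder)
    ftps.login("newwebapp\\deployuser", "********")
    ftps.prot_p()
    for folder in ("plugins", "themes", "uploads"):
        upload_tree(ftps, os.path.join("wp-content", folder), "/site/wwwroot/wp-content/" + folder)
    ftps.quit()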

5. Activate Assets in the New Site

It is important to activate and configure the plugins before the content from the existing site is imported. This is because some plugins extend the schema of the WordPress database, and any content that depends on those schema extensions will fail to import if they are not present.

Log in to the administrative portal of the new site, and activate all of the required plugins. If you don’t know which plugins should be activated, simply log in to the administrative portal of the old site for reference. It’s a good idea to have these portals open side by side as you complete the next few actions. Once the plugins are active, go to the Appearance section, and select the same theme as the original. Once the theme is selected, it needs to be configured. Walk through all of the configuration options for your theme, matching them with the original site. Any options that depend on content will need to be set after the content is imported. Once the theme is configured, the plugins should all be configured. This is a very manual process of going through all of the configuration screens and comparing the settings to those of the existing site.

Finally, recreate all necessary users from the old system. You will need to match blog posts with authors during the import step. The import step will offer another opportunity to add new users, but it’s a good idea to do this prior so that complete information is added for each user.

6. Import Content from the Existing System

From the administration portal of the new WordPress site, navigate to the Tools section, and select Import. A number of options will be presented; the one that you’re interested in is “WordPress”. If you don’t already have the WordPress Import plugin, you’ll need to select “Install Now” and allow the plugin to install and activate. Once activated, select “Run Importer”, and the Import dialog will appear. Select the export file that was downloaded in Step #2 above, and then click the “Upload file and import” button to see the Import WordPress dialog.

WordPress Import is author aware, and will automatically assign posts to users that exist in the new environment based on who they were in the old; you simply need to map them at this point. If you forgot to add a user in Step #5, you can do so here as well. Once authorship is assigned, the only other decision is whether to select the “Download and import file attachments” option. Strictly speaking, if all assets were brought across in Step #2, this shouldn’t be necessary. What this option does is download all referenced assets from the existing blog during the import process. This doesn’t always work, particularly on larger blogs, which is why Step #2 is so important.

In addition, if the content of the site results in a particularly large export file (as was the case with this one), you’ll need to increase the upload limit for your WordPress site. This can be done by creating a “.user.ini” file in the root of your WordPress installation as described here. Additionally, you may also need to increase some of the application timeout values.
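For illustration, here is one way to drop such a “.user.ini” into the WordPress root over the same FTPS connection used earlier. The settings shown are the usual PHP upload limits, and the values are arbitrary examples – size them to comfortably exceed your export file:

    import io
    from ftplib import FTP_TLS

    user_ini = (
        "; raise the PHP upload limits for the WordPress importer\n"
        "upload_max_filesize = 64M\n"
        "post_max_size = 64M\n"
        "max_execution_time = 300\n"
    )

    ftps = FTP_TLS("waws-prod-yy-000.ftp.azurewebsites.windows.net")   # the new app's FTP host (placeholder)
    ftps.login("newwebapp\\deployuser", "********")
    ftps.prot_p()
    ftps.cwd("/site/wwwroot")                                          # the WordPress root on Azure
    ftps.storbinary("STOR .user.ini", io.BytesIO(user_ini.encode("utf-8")))
    ftps.quit()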

7. Test

Test the new site to ensure that it works. This cannot be stressed enough. Open all of the sections to ensure that everything looks right. Ideally, use browser windows open side by side with the new and the existing sites.
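If you want something a little more systematic than eyeballing two browser windows, a short script can request the same handful of pages from both sites and flag any differences. The URLs and paths below are placeholders:

    import requests

    OLD = "https://oldwordpress.azurewebsites.net"   # placeholder - the existing site
    NEW = "https://newwordpress.azurewebsites.net"   # placeholder - the new site
    PATHS = ["/", "/about/", "/feed/", "/wp-login.php"]

    for path in PATHS:
        old_status = requests.get(OLD + path, timeout=30).status_code
        new_status = requests.get(NEW + path, timeout=30).status_code
        note = "" if old_status == new_status else "  <-- check this one"
        print(f"{path}: old={old_status} new={new_status}{note}")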

8. Make URL Changes to the Existing WordPress Site

It is important to follow these steps to avoid being locked out of the existing site. There are ways to correct it if it happens, but the situation is best avoided.

Open the administration portal of the existing site, and navigate to “Settings”, then “General”. If the “WordPress Address (URL)” and the “Site Address (URL)” values do not match the default URL for the Azure Web App, they will need to be changed to that value here.

The address will take the form <web app name>.azurewebsites.net. It’s also a good idea to navigate to that URL to ensure that it works before saving.

9. Make URL Changes to Azure

If your existing site isn’t running on the default Azure address, you’ll need to repoint it to the new site. This will cause your site to be unavailable for a few moments. To begin, you need to remove your custom domain from the existing (now “old”) site. Navigate to the Web App for the old site in the Azure portal, and select “Custom domains”. Your custom domain should appear there along with the default address (that was used in step 8).

Click on the ellipsis beside the domain, and select “Unassign”. This will remove the custom address from the old site, freeing it up to be used by the new site.

At this point, you will need to make changes to your domain with your domain registrar. You will need to change any references (A records and/or CNAME records) that you currently have for your custom address to point to the new Azure Web App. Details for those settings can be found under “Custom domains” for your new Azure Web App. Once complete, navigate to “Custom domains” in the new Web App and click on the “+” button beside “Add hostname”. Enter your custom address and then click the “Validate” button. The custom address will be tested, and if there are any issues with it, remediation steps will be provided. The Azure portal is quite good at helping to manage this step.

Once the new URL has been registered, it should be tested to ensure that it is accessible from the outside environment. Prior to testing, the old site should be stopped (but not deleted!) to ensure that it is not responding to any requests.
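One quick way to spot-check the cutover is to confirm that the custom host name resolves and that the new app answers on it. A small sketch (the domain is a placeholder for your own):

    import socket
    import requests

    custom_host = "www.example.com"                  # placeholder custom domain

    print("Resolves to:", socket.gethostbyname(custom_host))

    response = requests.get("http://" + custom_host, timeout=30, allow_redirects=True)
    print("Status:", response.status_code)
    print("Final URL:", response.url)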

If SSL was used on the old site, the certificates should at this point be brought in to the new Web App and bound to the site.

10. Make URL Changes to the New WordPress Site

If the custom domain is working, follow the steps in step 8, but on the new WordPress site, and use the custom address for the URL values. Save, and log in again.

11. Final Testing

At this point the site is live, but it is worthwhile to do another round of testing with the old Web App in a stopped state. This will identify any URLs hardcoded with the old Web App default URL, and missing assets, etc.

At this point, the new WordPress site is set up and working with the In-App MySQL database. I would recommend waiting a week or so before going back and deleting the old site and its assets, just in case.

OneDrive, TwoDrive, ThreeDrive

I’m calling it ThreeDrive now.

The much ballyhooed “Next Generation Sync Client” for OneDrive for Business rolled out with the Windows 10 November update. You’d be excused for not noticing, because it looks pretty much the same as the old OneDrive consumer client. In fact, it IS the new OneDrive consumer client as well, but it supports OneDrive for Business too. It’s not obvious that it supports OneDrive for Business because, as of this writing, it requires a registry key entry. If you sign up for the OneDrive for Business preview, you’ll get the new sync client and the instructions, but for convenience, the key is:

[HKEY_CURRENT_USER\SOFTWARE\Microsoft\OneDrive] – “EnableAddAccounts”=dword:00000001

Presumably, at some point that registry key will be switched on for everyone by an update. Once it has been added, when you open the OneDrive settings, you will now see an option to add a business account.
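If you prefer to script the change, the same value can be set with a few lines of Python. This is a convenience sketch only; it writes exactly the key and DWORD value quoted above, for the current user:

    import winreg

    # HKEY_CURRENT_USER\SOFTWARE\Microsoft\OneDrive : EnableAddAccounts = 1 (DWORD)
    key = winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, r"SOFTWARE\Microsoft\OneDrive",
                             0, winreg.KEY_SET_VALUE)
    winreg.SetValueEx(key, "EnableAddAccounts", 0, winreg.REG_DWORD, 1)
    winreg.CloseKey(key)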

Once added, your Office 365 OneDrive will be synchronized using the same (more robust) engine as the consumer client, and you’ll be able to do selective sync of folders, etc. Once you add your business account, you’ll see two sync “clouds” in your system tray, one blue and one white. White is your consumer OneDrive, and blue is your OneDrive for Business.

I’m pretty sure that this brings us to TwoDrive. You’ll also get (at least) two entries in your Windows File explorer, one for Personal, and one for your corporate OneDrive. If you connect multiple Office 365 accounts, you’ll get multiple blue cloud icons, and multiple explorer entries.

I don’t know why the consumer client icon is blue, but it is what it is. To me, white would help with understanding. We do however have a single sync client! Well, not so fast. If I need to sync SharePoint libraries from either on-prem or Office 365, I will still need the older OneDrive for Business sync client, with all of the same limitations. This is also true for the OneDrives contained in an Office 365 group.

Once that’s installed, by syncing a library, you’ll get another blue cloud icon that is indistinguishable from the icons created by the new sync client, and you’ll get an entry in File Explorer for SharePoint.

Everything should be working at this point. However, although I have gotten my head around this, I find it pretty confusing, and I work in this environment for a living. I know that I’m not alone; I recently spent about an hour with my friend and fellow MVP Marc Anderson helping him get his head around it, so it’s certainly not simple.

I am quite happy to see the new sync client, and the harmonization that it brings. I also know that the need for the old OneDrive for Business sync client will go away as the new client gains the ability to sync with SharePoint libraries and Group based OneDrives. I personally use cloud based storage solutions from a number of vendors, and they all have strengths and weaknesses. OneDrive is still the best deal out there, and it’s also the best solution for corporate sharing. I am however concerned about the complexity. I can imagine the following future conversation with a customer.

Me: You should really look at OneDrive for Business for offline access to your content

Customer: Oh, I have OneDrive already. Didn’t Microsoft just limit its storage capacity?

M: No – they had only turned on unlimited storage for a small test group. They just decided not to move forward with it as earlier announced. Besides, that’s only for the consumer OneDrive, not OneDrive for Business.

C: So they’re not the same thing?

M: No – OneDrive is a consumer product, and you log into it with a Microsoft account. OneDrive for Business is a business product, and you get it with an Office 365 business account. You need a corporate account to use it. It gives each person that uses it 1 TB of storage.

C: So if they’re different things, why are they both called OneDrive?

M: I know. Never mind.

C: OK, so how do I get them both working?

M: Well, you have the November update for Windows 10, right? All that you need to do is to go into your OneDrive settings, and add your business account.

C: Oh, so they use the same sync client?

M: Yes. That’s fairly new. There used to be an exclusive OneDrive for Business client, but you don’t need it now.

C: OK. (adds the business account) So how do I work with it?

M: You see those two cloud icons in your tray? The white one is your consumer account, and the blue one is your business account.

C: I thought that it was one sync client. Why are there two icons?

M: That’s so you can see the two different repositories. They are both driven by the same sync engine. It makes sense.

C: OK, cool. And how do I access my files?

M: Just open up File Explorer. The one that says OneDrive – Personal is your consumer account. The one that says OneDrive – your company name is your business OneDrive.

C: So the business one is my personal OneDrive in Office 365?

M: Yes. I know… the term “Personal” is a bit confusing, but it is what it is.

C: That’s fine. Why is the personal cloud icon blue in File Explorer but not in the tray?

M: I have no idea

C: OK – so how do I sync my Office 365 Group OneDrives?

M: Oh. Remember when I mentioned the old sync client? You’ll need that to sync those. The new client doesn’t support them yet, but it will.

C: I thought the old one doesn’t work very well?

M: It’s not as good, but it’ll do for this purpose.

C: OK, how do I install that?

M: You have Office installed, so you already have it. Just open up the OneDrive in your browser, and click on the sync icon.

C: OK (does it). So how do I know it’s working?

M: Open up your tray. See that you have another blue cloud icon? That’s the older sync client.

C: How do I tell the difference between this and the other one?

M: Just hover over the icon. The one that’s just called OneDrive for Business is the older engine. The one that contains your company name is the new one.

C: Am I going to get another icon for every OneDrive that I sync?

M: No – in this case, they all use the same icon. If you hover over and select open the folder, you’ll see what’s syncing.

C: OK. So where do I find my files?

M: Open up File Explorer. Under your two OneDrives, you’ll see a new entry for SharePoint. Click on that, and you’ll see your content.

C: What’s SharePoint?

M: *sigh*

I really quite like OneDrive, and what it can do. It’s even more valuable to me when the people that I interact with use it too, and after walking through this explanation over the past few days, I can see a few barriers to entry. I’d love to see this whole thing simplified.

Diving with Sway (or What I Did on my Autumn Vacation)

Although I threaten to do so in the description of this blog, I rarely talk about diving. I think that the last time that I did so was about 3 years ago. However, I recently encountered an intersection of my vocation and my avocation that I thought was worth sharing.

In September/October this year, my wife and I went on a liveaboard dive trip down the entire Sea of Cortez (or the Gulf of California to you folks in the USA). A liveaboard is a cruise of sorts where you live on the boat and go diving multiple times per day. You get to see some amazing things that would otherwise be unavailable.

I like to document my dives using mostly photos and sometimes video. The GoPro given to the Office MVPs at the MVP Summit last year (thanks, Microsoft!!!) has increased my use of video. Ultimately I return from these dive trips with a large collection of pictures and video, which I share out via Flickr, Facebook, OneDrive and YouTube. The problem is, these collections are disconnected, and while I try to tag and title the items, none of these collections really tells the story.

That’s where Sway comes in.

I’ve had access to Sway for a while, and while I did dabble in it a bit originally, I didn’t really “get it”. I had a lot of video this time, and the trip warranted more “narration” than a simple collection of pictures could provide. I decided to give Sway another shot, with (I think) good results. You can see for yourself below. Click on the “Made with Sway” icon at the bottom left to open it in full screen.

I found that Sway was exactly the right tool for this task. I was able to bring together diverse media elements from the trip, and organize them in ways that suited the narrative (not always chronologically). Adding the narrative itself completes the picture, and the presentation can stand on its own. I think, to sum it up, that Sway adds context to content.

Sway is an online tool. There are native clients available for Android, iOS and Windows 10, but you are always working with online content – there are no files stored locally. Creation is simple. There are a number of preconfigured layouts which can be tweaked to some degree, and content addition is a simple matter of dragging and dropping from a set of cloud-based repositories. You can share (and collaborate) at any time, and when ready, you can publish it, at which point it will be available in galleries, search engines, etc.

In addition to travelogues, I can see this having great usefulness in the education space. I don’t think it will be replacing PowerPoint anytime soon – they serve different purposes. I would say though that in the absence of a presenter, I would likely rather have access to a Sway than a PowerPoint deck – the Sway can do so much more on its own for explanations.

Finally, Sway is free. You’ll need a Microsoft account (either consumer or organizational) to use it, but you can simply point your browser at the Sway site and get going. Give it a shot! I think you’ll like it. I do, and I think I’m going to dip into my back catalog to create more. When I do, I’ll be posting them to my diving collection.

Using Excel With External Data – What’s the Right Tool?

Excel has been used with external data for… well, as long as I’ve been using Excel. So why would anyone bother to write a blog post about this given that the capability is so mature? In recent years, Excel has adopted a number of new, and frankly better mechanisms for working with external data, while retaining the old. Given that there are now multiple tools in Excel for working with external data, it’s not always clear which one is the best, and unfortunately there is no single tool that wins in all cases, although I believe that one soon will.

The answer, as always is, “it depends”. When it depends, the important thing is to understand the strengths and weaknesses of each approach. With that said, let’s have a look at all of the options.

ODC Connections

ODC (Office Data Connections) are the traditional method of accessing data in Excel. You can create or reuse an ODC connection from the Data tab in the Excel ribbon.

When using an ODC connection, you establish a connection with a data source, form some sort of query and import the resultant data directly into the Excel workbook. From there, the data can be manipulated and shaped in order to support whatever the end user is trying to do. The one exception to this behaviour is the connection to SQL Server Analysis Services (SSAS). When a connection is made to SSAS, only the connection is created. No data is returned until an analysis is performed (through a pivot table, chart etc), and then only the query results are retrieved.

When a workbook using an ODC connection is saved, the data is saved within it. In the case of an SSAS-connected workbook, the results of the last analysis are saved along with it. For small amounts of data, this is just fine, but any large analysis is bound to quickly run into the data limits of Excel, which are 1,048,576 rows by 16,384 columns in Excel 2013. In addition, such a file is very large and extremely cumbersome to work with, but even so, Excel has been the primary tool of choice for business analysts for years.

Data loaded into the workbook can be refreshed on demand, but it can also be altered, shaped, mashed up, and as is too often the case, grow stale. Workbooks such as these have become known as “spreadmarts” and are the scourge of IT and business alike. With these spreadmarts, we have multiple versions of the same data being proliferated, and it becomes harder to discern which data is most accurate/current, not to mention the governance implications.

SharePoint has provided a way to mitigate some of the concerns with these connections. SharePoint itself supports ODC connections, so users can access these workbooks stored within SharePoint, and it also allows them to refresh data from the source, either on demand or on open. A single point of storage, along with a measure of oversight and browser access, helps to restore a modicum of sanity to an out-of-control spreadmart environment, but the core issues remain.

In order to help with the core issues, Microsoft introduced PowerPivot in 2009.

PowerPivot Connections

Created in PowerPivot

PowerPivot was originally (and still is) an add-in to Excel 2010, and is a built-in add-in to Excel 2013. PowerPivot allows for the analysis of massive amounts of data within Excel, limited only by the memory available to the user’s machine (assuming a 64-bit version). It does this by highly compressing data in memory using columnar compression. The end result is that literally hundreds of millions of rows of data can be analyzed efficiently from within Excel.

You can see that compression at work by comparing the same data imported into an Excel workbook directly, and into a PowerPivot model with a workbook. The following two files contain election data, and represent the maximum number of rows that Excel can handle directly (1,048,576) and 25 columns.

Getting data into the model was originally (and still can be) a completely separate process from bringing it into Excel. PowerPivot has its own data import mechanism, accessed from the Power Pivot window itself. First, click on the PowerPivot tab in Excel and then click manage. If you don’t have a PowerPivot tab, you will need to enable the add-in. If you don’t have the add-in, you have an earlier version of Excel – you’ll need to download it.

Once the PowerPivot window opens, the “Get External Data” option is on the ribbon.

Once the appropriate data source is selected and configured, data will be loaded directly into the data model – there is no option to import that data into a worksheet. Once the data is in, pivot tables and pivot charts can be added to the workbook that connect to the data model much like when creating an ODC connection to Analysis Services. In fact, it’s pretty much exactly like connecting to Analysis Services, except that the AS process is running on the workstation.

Created in Excel

PowerPivot, and more importantly the tabular data model, was included in Excel 2013. With that addition, Microsoft added a few features to make the process of getting data into the data model a little easier for users who are a little less tech savvy, and who may be uncomfortable working with a separate PowerPivot window. That’s actually part of the thinking behind leaving the PowerPivot add-in turned off by default.

When a user creates an ODC connection as outlined above, there are a couple of new options in Excel 2013. First, the “Select Table” dialog has a new checkbox – “Enable selection of multiple tables”.

When this option is selected, more than one table from the data source can be selected simultaneously, but more importantly, the data will automatically be sent to the data model in addition to any other import destinations.

Even if the multiple selection option wasn’t chosen, the next dialog in the import process, “Import Data” also has a new check box – “Add this data to the Data Model”.

Its purpose is pretty self-explanatory. It should be noted that if you choose this option, and also choose “Only Create Connection”, the data will ONLY be added to the model, nowhere else in the workbook. This is functionally equivalent to doing the import from the PowerPivot window, without enabling the add-in.

Power Query Connections

When Power BI was originally announced, Power Query was also announced and included as a component. This was very much a marketing distinction, as Power Query exists in its own right, and does not require a Power BI license to use. It is available as an add-on to both Excel 2010 and 2013, and will be included with Excel 2016.

Power Query brings some Extract, Transform and Load (ETL) muscle to the Excel data acquisition story. Data can be not only imported and filtered, but also transformed with Power Query and its powerful M language. Power Query brings many features to the table, but this article is focused on its use as a data acquisition tool.

To use Power Query, it must first be downloaded and installed. Once installed, it is available from the Power Query tab (Excel 2010 and 2013).

Or from the data tab, New Query (Excel 2016)

Once the desired data source is selected, the query can be edited, or loaded into either the workbook, the data model, or both simultaneously. To load without editing the query, the load option at the bottom of the import dialog is selected.

Selecting “Load To” will allow you to select the destination for the data – the workbook, the model, or both. Selecting “Load” will import the data to the default destination, which is initially the workbook. Given that the workbook is an inefficient destination for data, I always recommend changing the default settings for Power Query.

To do so, select Options from the Power Query tab (2010 and 2013) or the New Query button (2016), click the Data Load section, and then specify your default settings.

Data Refresh Options

In almost every case when external data is analyzed, it will need to be refreshed on a periodic basis. Within the Excel Client, this is simple enough – click on the data tab, and then the Refresh All button, or refresh a specific connection. This works no matter what method was used to import the data in the first place. Excel data connections can also be configured to refresh automatically every time the workbook is opened, or on a periodic basis in the background.

However, workbooks can also be used in a browser through Office Web Apps and Excel Services (SharePoint and Office 365), or as a data source for Power BI dashboards. In these cases the workbooks need to be refreshed automatically so that consuming users see the most up-to-date data when the workbooks are opened. The tricky part is that not all of the connection types listed above are supported by all of the servers or services. Let’s dive in to what works with what.

SharePoint with Excel Services

Excel Services first shipped with SharePoint 2007, is a part of 2010 and 2013, and will be included with 2016. From the beginning, Excel Services allowed browser users to view and interact with Excel workbooks, including workbooks that were connected to back end data. The connection type supported by Excel Services is ODC, and ODC only.

Excel Services has no mechanism of its own for maintaining data refresh. However, the data connection refresh options are supported, which means that the workbook can be automatically refreshed when opened, or on a scheduled basis (every xxx minutes in the background). Unfortunately, this can come with a significant performance penalty, and once refreshed, the data exists only in memory – the workbook in the library is not updated. The data in the workbook can only be changed by editing the workbook in the client, refreshing it, and re-saving it.

Workbooks with embedded data models (PowerPivot) can be opened in the browser, but any attempt to interact with the model (selecting a filter, slicer, etc) will result in an error unless PowerPivot for SharePoint has been configured.

SharePoint with Excel Services and PowerPivot for SharePoint

PowerPivot for SharePoint is a combination of a SharePoint service application and Analysis Services in SharePoint mode. When installed, it allows workbooks that have embedded PowerPivot data models to be interacted with through a browser. The way that it works is that when such a workbook is first interacted with, the embedded model is automatically “promoted” to the Analysis Services instance, and a connection is made with it, thus allowing the consuming user to work with it in the same manner as with an SSAS-connected workbook.

The PowerPivot for SharePoint service application runs on a SharePoint server and allows for individual workbooks to be automatically refreshed on a scheduled basis. The schedule can be no more granular than once per day, but the actual data within the model on disk is updated, along with any Excel visualizations connected to it.

When the refresh process runs, it is the functional equivalent of editing the file in the client, selecting refresh all, and saving it back to the library. However, there is one significant difference. The Excel client will refresh all connection types, but the PowerPivot for SharePoint process does not understand Power Query connections. It can only handle those created through the Excel or PowerPivot interfaces.

Power Pivot for SharePoint ships on SQL Server media, and this limitation is still true as of SQL Server 2014. At the Ignite 2015 conference in Chicago, one of the promised enhancements was Power Query support in the SharePoint 2016 timeframe.

Office 365

Office 365, or more precisely, SharePoint Online supports Excel workbooks with ODC connections and PowerPivot embedded models in a browser. These workbooks can even be refreshed if the data source is online (SQL Azure), but they cannot be refreshed automatically. In addition, only ODC and PowerPivot connections are supported for manual refresh. Power Query connections require Power BI for Office 365. In addition, Office 365 imposes a 30 MB model size limit – beyond that, the Excel client must be used. In short, the Office 365 data refresh options are very limited.

Power BI for Office 365

Power BI for Office 365 brings the ability to automatically refresh workbooks with embedded data models. Data sources can be on premises or in the cloud. On-premises refresh is achieved through the use of the Data Management Gateway. It also raises Office 365’s model size limit from 30 MB to 250 MB. With Power BI for Office 365, both manual and automatic refreshes can be performed for both PowerPivot and Power Query connections; however, PowerPivot connections are currently restricted to SQL Server and Oracle sources only.

The automatic refresh of ODC connections is not supported. A workbook must contain a data model in order to be enabled for Power BI.

Power BI Dashboards

Power BI Dashboards is a new service that allows users to design dashboards without necessarily having Office 365 or even Excel. It is currently in preview form, so anything said here is subject to change. It is fundamentally based on the data model; it works with Excel files as a data source today, and Excel has been promised as a report source as well. The service has the ability to automatically refresh the underlying Excel files on a periodic basis, more frequently than daily.

In order for a workbook to be refreshed by Power BI, it must (at present) be stored in a OneDrive or OneDrive for Business container. It also must utilize either a PowerPivot, or a Power Query connection. At present, the data source must also be cloud based (ie SQL Azure) but on premises connectivity has been promised.

SQL Server Analysis Services

Another consideration, while not a platform for workbooks is SQL Server Analysis Services (SSAS). Excel can be used to design and build a data model, and that data model can at any time be imported into SSAS. As of version 2014, SSAS fully supports all connection types for import – ODC, PowerPivot and Power Query. Once a data model has been imported into SSAS, it can be refreshed on a schedule as often as desired, and you can connect to it with Excel, and share it in SharePoint. You can also connect to it in Power BI Dashboards through the SSAS connector. From both a flexibility and power standpoint, this is the best option, but it does require additional resources and complexity.

Refresh Compatibility Summary

For convenience, the table below summarizes the refresh options for the different connection types.

                                          ODC     PowerPivot     Power Query

    Excel Client                          A       A              A
    SharePoint/Excel Services             A       -              -
    SharePoint/Excel Services/PP4SP       A       A              -
    SQL Server Analysis Services Import   A       A              A
    Office 365                            M       M              -
    Office 365 with Power BI              M       A*             A
    Power BI Dashboards                   -       A*             A*

M – Manual refresh

A – Both Manual and Automatic Refresh

* only limited data sources

- Not supported


The Right Tool

I started out above by saying that the selection of import tool would depend on circumstances, and that is certainly true. However, based on the capabilities and the restrictions of each, I believe that a few rules of thumb can be derived. As always, these will change over time as technology evolves.

  1. Always use the internal Data Model (PowerPivot) when importing data for analysis.


  2. Power Query is the future – use it wherever possible

    All of Microsoft’s energies around ETL and data import are going into Power Query. Power Query is core to Power BI, and announcements at the Ignite Conference indicate that Power Query is being added to both SQL Server Integration Services and to SQL Server Reporting Services. Keep in mind that we have been discussing only the data retrieval side of Power Query – it has a full set of ETL capabilities as well, which should also be considered.

  3. PowerPivot or ODC Connections must be used on premises

    PowerPivot for SharePoint does not support Power Query for refresh. This means that you MUST use PowerPivot connections for workbooks with embedded models. If you are already using SSAS, use an ODC connection within Excel.

  4. Power Query or PowerPivot must be used for cloud BI.

    PowerPivot connections will work for a few limited cases, but more Power Query support is being added constantly. Where possible, invest in Power Query

  5. If on-premises, consider importing your models into SSAS

    SSAS already supports Power Query. If, instead of using PowerPivot for SharePoint, Analysts build their models using Excel and Power Query, they can be “promoted” into SSAS. All that is then required is to connect a new workbook to the SSAS server with an ODC connection for end users. The Power Query workbooks can be used in the cloud, and the SSAS connector in Power BI Dashboard can directly use the SSAS models created.

  6. Choose wisely. Changing the connection type often requires rebuilding the data model, which in many cases is no small feat.

In summary, when importing data into Excel, the preferred destination is the tabular model, and to import data into that model, Power Query is the preferred choice. The only exception to this is on-premises deployments. In these environments, consideration should be given to connecting to an SSAS server, and failing that, PowerPivot imports are the best option.

The New Power BI – Now With Enterprise!

Yesterday Microsoft announced the next step in the evolution of Power BI. It’s getting quite a bit of attention, and rightly so for its aim of bringing Business Intelligence closer to users. Democratizing BI has always proved a challenge – it’s the realm of the gurus in the white coats that hold the keys to the data. Microsoft is aiming to accomplish this democratization through a combination of user focus, and as of yesterday, a drastic change in its pricing model. Power BI just went from about $40 per user per month, to free, or $9.99/user/month for advanced capabilities. That’s quite a drop, and arguably the biggest announcement from yesterday – it will have a massive impact. The detailed price breakdown can be found here.

However, all of the focus around personal BI is, in my opinion, missing a key component. Power BI and its components have always focused squarely on both personal and team BI solutions. That is to say the ability for a power user to model data, visualize it quickly and easily and to share it out with fellow team members. While that capability is certainly retained in the new Power BI, this new version contains the first appearance of enterprise grade BI in the cloud for Microsoft.

To fully understand this, it’s necessary to touch on the Microsoft BI stack as it stands today.

Microsoft BI On Premises

The On-Premises BI story from Microsoft may be confusing, and occasionally difficult to understand, but it is very powerful, and relatively complete. In a nutshell, the story is good from a personal, team and enterprise perspective.

On the enterprise side, there are products from both the SQL Server team and the Office team. Data warehousing is served by SQL Server, and ETL duties fall to SQL Server Integration Services (SSIS). Analytical storage is served by SQL Server Analysis Services (SSAS) in both Multidimensional (OLAP) and Tabular modes, and reporting is performed by SQL Server Reporting Services (SSRS). The SQL product line doesn’t have much on the client side for analysis apart from SSRS, but this slack is taken up by the analysis tools available in Excel, and through PerformancePoint Services in SharePoint.

Indeed, SharePoint also provides a platform for SSRS via SSRS SharePoint mode, and for Excel based analytical workbooks connected to SQL Server and to SSAS through Excel Services.

On the personal BI side, that role has traditionally fallen to Excel. The pitfalls of importing data into Excel workbooks for analysis are well documented and don’t need to be discussed here, but the bulk of those issues were addressed with the introduction of PowerPivot several years ago. PowerPivot allows for massive amounts of data to be cached within the Excel file for analysis without any data integrity concerns. The addition in recent years of analytic visuals (Power View, Power Map) and ETL capabilities (Power Query) has further rounded out the offering.

Taking that Excel workbook and sharing it brings us into the realm of Team BI. This is to say that the analyses are relatively modest in size, and of interest to a targeted group. These models may not require the rigour or reliability associated with enterprise BI models. Once again, the technology involved here is SharePoint. A user can take a workbook with an embedded PowerPivot model, share it through a SharePoint library, and other users can interact with that embedded model using only a browser. This capability requires PowerPivot for SharePoint, which is really a specialized version of SSAS, along with a SharePoint service application.

One thing to note about these seemingly disparate approaches is that a power user can build a Power Pivot data model with Excel, share it to a team via SharePoint, and when it requires sufficient rigour or management, it can be “upgraded” into SSAS in tabular mode. This common model approach is powerful, and is key to understanding Microsoft’s entire BI strategy. You can also see here that SharePoint straddles the two worlds of team and enterprise BI.

Moving to the cloud

The BI workload is one of the last Microsoft workloads to move to the cloud, and with good reason. Massive amounts of data present problems of scale, and security or data sovereignty concerns tend to keep data on premises. However, there is a very real need to provide BI to users outside of the firewall.

SharePoint is the hub of BI on prem, so it’s logical to assume that with SharePoint Online, it could continue to perform that function in the cloud. The big catch here is that on-prem, SharePoint is simply the display platform. In the enterprise scenario, users connect through SharePoint to the back end servers. This isn’t an option in the cloud, so enterprise BI was left off the table.

With the personal and team BI scenarios, data is cached in a Power Pivot data model, which could be supported in the cloud. When Office 365 moved to the SharePoint 2013 code base for SharePoint online, rudimentary support for embedded Power Pivot models was indeed added. Essentially PowerPivot for SharePoint “light” was added. I call it light for two major reasons. Firstly, data models could be no larger than 10 MB. Secondly, there was no way to update the data contained within the Power Pivot cache, outside of re-uploading the Excel workbook. This is still true without a Power BI license. The inability to refresh the data renders team BI almost useless, except in static data scenarios.

The first generation of Power BI changed all of that. With a Power BI license, it was possible to install a Data Management Gateway on premises that would connect to team BI workbooks in Office 365 and update them on a scheduled basis. Yes, the gateway had many limitations (many of which have been removed over time), but finally, the on-prem refresh story was solved. In addition, the model size limit was increased to 250 MB. However, we were still left with a number of problems or limitations.

  1. Daily data refresh schedule. Automatic data refreshes could be daily at their most frequent. Manual refreshes could be done anytime.
  2. Capacity. The maximum size of a data model was increased to 250 MB, which is relatively small for enterprise scenarios. In addition, refreshes aren’t differential, which means that the entire model is re-uploaded on every refresh.
  3. Data sensitivity/sovereignty. The refresh problem was solved, but because the data is still cached in the workbooks, there can be reluctance to sending it outside of the corporate firewall.
  4. Per-user security. PowerPivot data models have no concept of user security in a workbook (tabular models in SSAS do). Security is at the workbook level.
  5. Cost. The initial cost of Power BI was $40 per user per month. A Power BI license was required to interact with any workbook that had a data model larger than 10 MB. Considering that a full Office 365 E3 license was around $25 per user per month, this price tended to limit the audience for sharing.

All of this is to say that Power BI in its first (and as yet current) incarnation is suitable for personal and team BI only. There has been no enterprise cloud BI story.

Power BI V2

The announcements yesterday outlined the next generation of Power BI. Going forward, Power BI will be available as a standalone offering, at the price points mentioned above. Office 365 users will continue to be able to use it from Office 365, but Office 365 will no longer be required to use it. In its early days, Power BI was a SharePoint app, but a careful examination of URLs in the current offering quickly reveals that it is actually two apps, both running on Azure (not in SharePoint).

If you’ve signed up for the new Power BI preview, you may notice that the URL is…… so this move isn’t a big surprise.

With the new model, Excel is no longer the central container. Users connect to data and publish it directly to Power BI. Behind the scenes, the service is doing a very similar thing to what it does with PowerPivot models – it’s storing them in SSAS. In fact, the same limits still apply – 250 MB per model (at least for now). Excel can still be used, but now it is as a data source.

Visualizations are performed through Power View, and data is acquired through Power Query. These are no longer add-ons, but are available on their own through Power BI Designer. This decoupling is good for those that have not made an investment in SharePoint Online or Excel.

These changes to the architecture and the cost are great news for adoption, but don’t address the needs of the enterprise. Except for one thing – The SSAS Connector.


One of the data sources available to the new Power BI is the SSAS data connector. This connector is a piece of code that runs on premises (it actually includes the Data Management Gateway). It acts as a bridge between the Power BI service, and an on prem SSAS server.

The biggest distinction worth noting is that with the gateway, data is NOT being uploaded to the service; it remains on prem. The way that it works is that when a user interacts with a visualization from the cloud, a query is sent to the SSAS server through the gateway. That query is run, its results are sent back to the user’s visualization, and the data is not persisted.

In addition, when the query is sent to the SSAS server, it is run with the permissions of the user making the request. This is accomplished through the EFFECTIVEUSERNAME feature in SSAS. This provides for full user-level security, and since tabular models in SSAS can utilize per-user security, we no longer need to rely on proxy accounts or document-level security.
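For illustration only – this is not necessarily how the connector is implemented internally – an effective user name is normally passed as a connection string property when a trusted service connects to Analysis Services on a user’s behalf, along these lines (server, database and user are hypothetical):

    Provider=MSOLAP;Data Source=ssasserver01;Initial Catalog=SalesTabular;EffectiveUserName=CONTOSO\jsmith

The connecting account must be an Analysis Services administrator for the property to be honoured, which is why this pattern is reserved for trusted middle tiers such as the connector and the gateway.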

Finally, because the data is being stored in an on prem SSAS server, it can be refreshed automatically as often as desired. For the same reason, we have no capacity limits – you can grow your own SSAS servers as large as you like.

The SSAS connector removes most of the limitations that prevent cloud based enterprise Business Intelligence, and the new pricing model removes the rest. Certainly there are going to be feature limits in the near term, but it appears to me at least that the back of this thorny problem has finally been cracked.