How to Migrate a WordPress site to Azure Using In-App MySQL

Did this site load a little faster than it normally does? You may not have a basis for comparison, but I have noticed that pages load between 2x and 5x faster since I moved the site to Azure-hosted WordPress using an In-App MySQL database. Previously I was running it on Azure as well, but it used the third-party ClearDB database service, so the performance increase is entirely due to the difference in database engines.

I have been running this blog as a web app on Azure for the last couple of years, ever since that option became available. In fact, I wrote about how to enable hosting for multiple users on the same database when I first set it up. At the time, setting up a WordPress web app also involved provisioning a MySQL database through a third-party hosting provider, ClearDB. The initial offering was free, but as I quickly found out, it also doesn't provide much, and I needed to upgrade it through the third party. This arrangement was fraught with difficulty. Aside from the unwelcome additional costs, managing the billing cycle was awkward. In addition, all of my WordPress sites ranged from slightly to very sluggish, and increasing Azure resources didn't seem to help much.

Over time, I learned that I wasn't the only one, and that the performance problems seemed to stem from latency between Azure and the third-party provider. However, I didn't want to start messing around with standing up my own MySQL server, and the arrangement was usable, if a tad expensive. Then, a month or so ago, I was listening to my friends Andrew Connell and Chris Johnson on the Microsoft Cloud Show, and they mentioned that Azure had put out a preview of a native MySQL implementation. This was of course music to my ears, so I tried it out, and it appears to work quite well.

I have since moved all of our WordPress properties over to this new architecture and documented the procedure. The approach that I took should work for any WordPress site, whether it is hosted on Azure or not, but my examples will of course be going from Azure to Azure. Essentially, I create a new WordPress site, migrate the site assets to it, configure the new site to match the old one, and then point the address to the new web app. There are quite likely third-party add-ons that automate parts of this, but the process described here is manual. I am in no way saying that this approach is a best practice, only that it worked for me. Finally, as noted above, In-App MySQL is in preview, not production, so if your WordPress site is critical, it would likely be a good idea to hang on for a bit. I, however, like to live dangerously, and if my blog goes offline for a few hours, it's not the end of the world.

Here are the steps required to accomplish this.

1. Upgrade the existing site

The new site will be created with the latest version of WordPress, and any plugins that get installed will also be the most recent versions. To avoid any version mismatches, it's a good idea to make sure that your WordPress version and all of your plugins are up to date.

2. Retrieve the WordPress Assets from the existing Site

You can use the built-in export feature in WordPress to retrieve all of the database assets. Open the Tools section, select "Export", and choose "All Content".

The types of content will vary depending on your WordPress installation, plugins, etc., but make sure that you select all of them. When ready, select “Download Export File”. You’ll get prompted to download an XML file – put it somewhere safe – you’ll need it later.

Next up, you'll want to retrieve your file-system-based assets. These will be all of your uploaded files (unless you currently use an externally hosted provider), your WordPress themes, and your plugins. Strictly speaking, this step isn't necessary. You should be able to re-download your themes and plugins, but I have found that they aren't always available, and that this process is faster. However, if you don't have access to the file system of the existing site, you may not be able to do this. The uploaded files can also be gathered through the import process later, but this approach provides an added level of safety.

For Azure, you'll use FTP to connect to the file system and copy the files locally. For Azure-hosted sites, you can set the FTP credentials by logging into the "new" Azure portal, selecting the web app for your site, and then navigating to "Deployment Credentials". Enter a user name and a password, and save them.

Next, scroll down to “Properties” for the web app, and take note of the “FTP Host Name” and the “FTP/Deployment user”. You will use these values to connect to the file system.
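
If you prefer scripting to clicking through the portal, the same information can be pulled down with PowerShell. The following is only a minimal sketch assuming the Az module is installed (the older AzureRM cmdlets work similarly); "MyResourceGroup" and "myblogapp" are placeholders for your own values.

# Sign in and download the publishing profile, which contains the FTP host
# name, user name and password for the web app.
# "MyResourceGroup" and "myblogapp" are placeholders.
Connect-AzAccount

Get-AzWebAppPublishingProfile -ResourceGroupName "MyResourceGroup" `
    -Name "myblogapp" `
    -Format "Ftp" `
    -OutputFile "myblogapp.publishsettings"

# The .publishsettings file is XML; the FTP profile's publishUrl, userName
# and userPWD attributes are the values to use when connecting.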

Now open File Explorer on a Windows PC, right-click the "This PC" node, and select "Add a network location".

Follow the prompts, entering the FTP host name and the user name when asked. Do not attempt to log in anonymously. Also take note – the user name has the form webappname\username. When the node opens, enter the password, and you should see 4 folders. Open "site", then "wwwroot", and finally "wp-content". The folders that you need are here.

Specifically, you are looking for the plugins, themes and uploads folders. Copy these folders locally and keep them handy.
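
If you would rather script the copy than use Explorer, PowerShell can talk to the same FTP endpoint. This is only a sketch of the idea – the host, credentials and paths below are placeholders – and for bulk copies of the plugins, themes and uploads folders, the Explorer approach above or an FTP client such as FileZilla is usually easier.

# Download a single file from wp-content over FTP using .NET's WebClient.
# Host, credentials and paths are placeholders; the local target folder
# must already exist.
$ftpHost = "ftp://waws-prod-xx-000.ftp.azurewebsites.windows.net"
$client  = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("myblogapp\deployuser", "deployment-password")

$remote = "$ftpHost/site/wwwroot/wp-content/themes/mytheme/style.css"
$local  = "C:\wp-backup\themes\mytheme\style.css"
$client.DownloadFile($remote, $local)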

3. Create the new WordPress Site

From the Azure admin portal, select “Create New”, and search for “WordPress”. There are several WordPress options to choose from, but the one we’re pursuing is published by WordPress.

Once selected, you will be prompted to fill out the details. Give the new app a name, select the Resource Group, and most importantly, the Database Provider. ClearDB is the one that we are moving away from, so “MySQL In App” is the one that we want to select.

Once you click OK, the App will be created, and WordPress will be deployed to it. The App creation happens almost immediately, but it takes a few minutes for WordPress to be fully deployed. Don’t be alarmed if there’s nothing there for a little while.

Once deployment is complete, you can simply click on the URL of the app in the "Overview" section. The URL will take the form <appname>.azurewebsites.net.

A browser will open and you will be prompted to complete the initial WordPress installation. Once that is complete, you will be able to log in to the WordPress administration portal.

4. Upload the Older Assets to the new WordPress Site

The next thing that we want to do is upload the assets that we downloaded in step #2 to the new site. To do this, connect to the new site's file system via FTP by following the same steps that were used for the old site in step #2. Once connected, upload the 3 folders to the wp-content folder of the new site. If there are folders that already exist, or that you don't want to use in the new site, simply omit them from the upload. Once everything is uploaded, we can activate the features.
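
The WebClient sketch from step #2 works in reverse for pushing files up to the new app. Again, the values below are placeholders, and any nested FTP folders must already exist on the server, so an FTP client remains the practical choice for whole folder trees.

# Upload a single file into the new site's wp-content folder over FTP.
# Host, credentials and paths are placeholders.
$ftpHost = "ftp://waws-prod-yy-000.ftp.azurewebsites.windows.net"
$client  = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("newblogapp\deployuser", "deployment-password")

$local  = "C:\wp-backup\themes\mytheme\style.css"
$remote = "$ftpHost/site/wwwroot/wp-content/themes/mytheme/style.css"
$client.UploadFile($remote, $local)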

5. Activate Assets in the New Site

It is important to activate and configure the plugins before the content from the existing site is imported. This is because some plugins extend the schema of the WordPress database, and any content that depends on those schema extensions will fail to import if they are not present.

Log in to the administrative portal of the new site, and activate all of the required plugins. If you don't know which plugins should be active, simply log in to the administrative portal of the old site for reference. It's a good idea to have these portals open side by side as you complete the next few actions. Once the plugins are active, go to the Appearance section and select the same theme as the original. Once the theme is selected, it needs to be configured. Walk through all of the configuration options for your theme, matching them to the original site. Any options that depend on content will need to be set after the content is imported. Once the theme is configured, configure the plugins. This is a very manual process of going through all of the configuration screens and comparing the settings to those of the existing site.

Finally, recreate all of the necessary users from the old system. You will need to match blog posts with authors during the import step. The import step will offer another opportunity to add new users, but it's a good idea to do this beforehand so that complete information is captured for each user.

6. Import Content from the Existing System

From the administration portal of the new WordPress site, navigate to the Tools section and select Import. A number of options will be presented; the one that you're interested in is "WordPress". If you don't already have the WordPress Importer plugin, you'll need to select "Install Now" and allow the plugin to install and activate. Once activated, select "Run Importer", and the import dialog will appear. Select the export file that was downloaded in step #2 above, and then click the "Upload file and import" button to see the Import WordPress dialog.

WordPress Import is author aware: it will let you assign posts to users that exist in the new environment based on who their authors were in the old one, so you simply need to map them at this point. If you forgot to add a user in step #5, you can do so here as well. Once authorship is assigned, the only other decision is whether to select "Download and import file attachments". Strictly speaking, if all assets were brought across in step #2, this shouldn't be necessary. What this option does is download all referenced assets from the existing blog during the import process. This doesn't always work, particularly on larger blogs, which is why step #2 is so important.

In addition, if the content of the site results in a particularly large export file (as was the case with this one), you'll need to increase the upload limit for your WordPress site. This can be done by creating a ".user.ini" file in the root of your WordPress installation as described here. You may also need to increase some of the application timeout values.
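
For reference, here is a sketch of the kind of .user.ini I ended up with, written out with PowerShell. The directive names are standard PHP settings, but the values are examples only – size them to your export file – and the resulting file gets uploaded to site/wwwroot over FTP.

# Create a .user.ini that raises PHP's upload and post size limits.
# The values below are examples only.
@"
upload_max_filesize = 64M
post_max_size = 64M
max_execution_time = 300
"@ | Set-Content -Path ".\.user.ini" -Encoding Ascii

# Upload .user.ini to site/wwwroot of the new web app, then restart the app
# so PHP picks up the new limits.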

7. Test

Test the new site to ensure that it works. This cannot be stressed enough. Open all of the sections to ensure that everything looks right. Ideally, use browser windows open side by side with the new and the existing sites.

8. Make URL Changes to the Existing WordPress Site

It is important to follow these steps in order to avoid being locked out of the existing site. There are ways to recover if that happens, but the situation is best avoided.

Open the administration portal of the existing site, and navigate to "Settings", then "General". If the "WordPress Address (URL)" and "Site Address (URL)" values do not match the default URL for the Azure Web App, they will need to be changed to that value here.

The address will take the form <appname>.azurewebsites.net. It's also a good idea to navigate to that URL to ensure that it works before saving.

9. Make URL Changes to Azure

If your existing site isn’t running on the default Azure address, you’ll need to repoint it to the new site. This will cause your site to be unavailable for a few moments. To begin, you need to remove your custom domain from the existing (now “old”) site. Navigate to the Web App for the old site in the Azure portal, and select “Custom domains”. Your custom domain should appear there along with the default address (that was used in step 8).

Click on the ellipsis beside the domain, and select “Unassign”. This will remove the custom address from the old site, freeing it up to be used by the new site.

At this point, you will need to make changes to your domain with your domain registrar. You will need to change any references (A records and/or CNAME records) that you currently have for your custom address to point to the new Azure Web App. Details for those settings can be found under "Custom domains" for your new Azure Web App. Once complete, navigate to "Custom domains" in the new Web App and click the "+" button beside "Add hostname". Enter your custom address and then click the "Validate" button. The custom address will be tested, and if there are any issues with it, remediation steps will be provided. The Azure portal is quite good at helping to manage this step.
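
The hostname assignment can also be scripted. The following is only a sketch assuming the Az module – the resource group, app and domain names are placeholders, the DNS records above must already be in place or validation will fail, and -HostNames replaces the entire list, so the default azurewebsites.net address needs to be included as well.

# Assign the custom domain to the new web app.
# "MyResourceGroup", "newblogapp" and "www.example.com" are placeholders.
Set-AzWebApp -ResourceGroupName "MyResourceGroup" `
    -Name "newblogapp" `
    -HostNames @("www.example.com", "newblogapp.azurewebsites.net")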

Once the new URL has been registered, it should be tested to ensure that it is accessible from the outside environment. Prior to testing, the old site should be stopped (but not deleted!) to ensure that it is not responding to any requests.
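
Stopping (and later restarting) the old app is a one-liner if you have the Az module handy; the names below are placeholders.

# Stop the old web app so it no longer answers requests; it is not deleted
# and can be started again at any time if something was missed.
# "MyResourceGroup" and "oldblogapp" are placeholders.
Stop-AzWebApp -ResourceGroupName "MyResourceGroup" -Name "oldblogapp"

# Start-AzWebApp -ResourceGroupName "MyResourceGroup" -Name "oldblogapp"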

If SSL certificates were used on the old site, they should be brought into the new Web App at this point and bound to the site.

10. Make URL Changes to the New WordPress Site

If the custom domain is working, follow the steps from step 8, but on the new WordPress site, and use the custom address for the URL values. Save, and log in again.

11. Final Testing

At this point the site is live, but it is worthwhile to do another round of testing with the old Web App in a stopped state. This will identify any URLs hardcoded to the old Web App's default address, missing assets, and so on.

At this point, the new WordPress site is set up and working with the In-App MySQL database. I would recommend waiting a week or so before going back and deleting the old site and its assets, just in case.

SharePoint and Power BI – Better Together

Ever since 2007, SharePoint has included Business Intelligence amongst its core workloads. There have been a variety of approaches to the workload over the years, but today those core workloads include Excel Services/Excel Online, PerformancePoint, SQL Server Reporting Services Integrated Mode, and Power Pivot for SharePoint.

Power Pivot for SharePoint and Excel Services go hand in hand and can really be considered a single pillar, leaving us with three. If we quickly examine these three pillars in SharePoint 2016, it's pretty easy to spot an emerging trend. Excel Services is gone from SharePoint 2016, its capabilities having been added to Excel Online. Excel Online connects to, but does not run on, SharePoint. PerformancePoint still exists in 2016, but it has received precisely zero new features – it is identical to the version in SharePoint 2013, and remains part of the product for legacy reasons. For all intents and purposes, I consider PerformancePoint to be deprecated. SSRS Integrated mode has been greatly improved in 2016, but contains nowhere near the improvements that the Native mode version of SSRS received in 2016.

At the same time, the past year has witnessed the spectacular rise of Power BI. Power BI is clearly Microsoft's focus for cloud-based Business Intelligence delivery. Last fall, the SQL team announced that on-premises customers were not being ignored, and that SSRS was the platform for on-premises BI delivery. They also sketched out a roadmap that showed both platforms being able to deliver the same types of reports. In June 2016, the team delivered on a portion of this vision with SQL Server 2016 Reporting Services.

So where does this leave SharePoint in the Business Intelligence ecosystem?

In my opinion, it leaves it right where it should be – as an integrating platform, and NOT as a runtime platform as it has been in the past. SharePoint provides in-context BI by connecting content to reports and providing dashboards connected to multiple sources. In 2016, SharePoint connects to Excel Online to deliver analytical reports. Excel runs with SharePoint now, not on it. SSRS Integrated mode still runs on SharePoint, but the investments in Native mode are a clear indication to me that this will be the direction here as well. Unfortunately, SharePoint has been lacking tight integration with Power BI.

The recent Ignite 2016 conference was the first public appearance of the Power BI web part.

Figure 1: Insert web part dialog with Power BI web part

The Power BI web part works with Modern SharePoint pages and is based on the new SharePoint Framework (SPFx), which means that it is completely client-side. Why does this matter to us? Because it is completely client-side, it will work both in SharePoint Online and on premises. Initially it will only work with SharePoint Online, but that is because the SharePoint Framework is not yet available on premises.

To use the new web part, first create or edit a Modern SharePoint page (Modern pages support the new Modern web parts). Click a "+" icon to open the insert part control (Figure 1). Once the part is inserted, add the report URL and save the page. The report page should immediately render within the context of the SharePoint page.

Figure 2: Power BI Report page rendered within a SharePoint page

Since the web part renders client-side, the consuming user obviously needs to have access to the report. This means that the source report must either have been shared with them through Power BI dashboard sharing, or live in a group of which the consuming user is a member. The latter case makes the most sense given that all Office 365 Groups will have a corresponding Modern Team site. Embedding the report within group pages should "just work".

The devil is of course in the details, and not all of those details are available yet, but given the number of questions that I have received over the past year about SharePoint/Power BI integration, I expect that the web part's existence will come as welcome news. Over time I would expect it to pick up support for parameters and the ability to work with individual report items (this is speculation, but it makes sense). It's also not much of a stretch to see how SSRS could make available a Modern web part that works in the same fashion with on-premises SSRS instances. That web part could conceivably work both on premises and online, bringing SSRS to SharePoint Online for the first time.

SharePoint is still very much a platform for integration and for Business Intelligence content delivery. SSRS and Power BI will be the de facto reporting engines for on-premises and cloud environments respectively, and SharePoint will be the dashboarding/integrating platform for both.

Microsoft Re-Adopts Yammer as a First-Class Citizen

“The reports of Yammer’s death are greatly exaggerated”

  • Ignite Attendee

Shortly after Microsoft purchased Yammer in the summer of 2012, it was all that the Office division could talk about. Yammer was to replace the conversation feed in SharePoint, the entire development team would adopt the quick-shipping Yammer style, and we SharePoint MVPs were told that we were all Yammer MVPs. The Yammer feed did in fact replace SharePoint's conversation feed in Office 365, and hooks were added to allow it to work with on-premises SharePoint. The SharePoint team moved to a quick-shipping, cloud-first approach, but sometime around 2014 the name Yammer was used less and less. At the 2015 Microsoft Ignite conference Yammer had a presence, but it was very muted compared to previous events. At the same time, a new conversations technology, based on Exchange, appeared in Office 365 Groups.

This trend led to a great deal of speculation that Yammer was on the wane. When Microsoft goes silent on a product, it normally means the end of it (ActiveX, Silverlight, SharePoint Designer). There are notable exceptions to this (SSRS), but it's normally the case. However, at the same time Microsoft continued to make significant investments in Yammer, and most of those investments were architectural (moving data centres, integration with Azure Active Directory). This sent a very mixed message to the market – why would they continue to invest (heavily) in a dead product? It was almost as if they weren't sure what to do with it, and were hedging their bets.

The Ignite 2016 conference has removed the mixed part of this messaging. Yammer is quite clearly the social strategy for Microsoft in Office 365. One need only look at the attention that Yammer received at the conference. At Ignite 2015, Yammer had a small pedestal with a single screen on the show floor; its presence was significantly larger at Ignite 2016.

There were a number of freebies being handed out. I hadn't seen a new Yammer T-shirt in years, and they were being handed out by the dozens. That in itself is telling, but I found the iconography to be particularly interesting.

MVP Amy Dolzine

The renewed investment extended to the social events as well. The Yammer team hosted an event.

These investments are a clear sign, but what really matters is the product itself, and this is where the rubber hits the road. Yammer is becoming more and more tightly integrated with the Office 365 suite all the time, and a lot of architectural work has been done to facilitate this. In fact, next year Microsoft will be dropping the standalone version of Yammer, and the Enterprise license along with it, making Yammer a first-class component of Office 365. One need look no further than the embedded Yammer conversation views:

In context Yammer conversations embedded in a SharePoint Publishing page

The above shows threaded discussions happening within the context of the content, in this case a SharePoint publishing page. This is accomplished through the new Yammer web part, which is built with the new SharePoint Framework and delivered in Modern SharePoint pages. This feature is not available yet, but is coming very soon. The above image is not a mock-up. In fact, if you look at a list of modern web parts in a test environment today, only a couple represent integration points, and two of them stand out: Power BI and Yammer.

Yammer is now an integral part of Office 365 Groups – another topic that was well represented at Ignite. I could attempt to articulate how this works, and why it matters, but this has already been done by Naomi Moneypenny here. There is also a Microsoft blog post discussing it available here. In a nutshell, Yammer will leverage all of the other Groups capabilities including SharePoint for document storage and OneNote for Notes capture, replacing its own native storage systems. Office 365 Groups will use Yammer for threaded discussions.

The approach to Yammer is different from the one we've become accustomed to. Yammer is to become an integral part of Office 365 Groups, providing the social component to the excellent content experience of Groups. Yammer becomes part of a greater whole which, in my opinion, is all to the good. Yammer has often been presented and used as a standalone solution. I've often felt that the threaded conversations in Yammer work well, but that trying to use it for content management or event management is frustrating at best. Integration points between it and Office 365 have been poor to non-existent – the Yammer Add-in for SharePoint was recently removed from the store. Yammer groups have been different from Office 365 Groups, leading to a disjointed experience. This is true no longer – now there are only Groups. The same group that backs a SharePoint Team site backs Yammer's social content. Yammer will also share OneDrive, OneNote and calendars, unifying all of the non-social content.

Yammer doesn’t appear to be going anywhere anytime soon.

A Simplified Method of Working with SharePoint Data in Power BI

Although I typically advise against it, there are valid reasons to report on SharePoint list data directly. Power BI Desktop makes this data quite easy to access – you can use the built-in connectors for SharePoint or SharePoint Online, or, because any SharePoint list is available via OData, you can also use the OData connector. Microsoft has recently made improvements to both methods, but the SharePoint connectors bring some significant usability advantages. One of these advantages is what I'm calling "summary columns".

The Problem

Consider the following SharePoint list with different field types:

Connecting to this list with Power BI Desktop and editing the query returns all of the list fields, regardless of their visibility in the user interface. Assuming that we want to work with the above field values for analysis purposes, we can discard the fields that we don't need and reorder the remainder, leaving us with only these fields.

However, you can immediately see that we’ll need to do some work in order to get our data in a usable format.

The Title field is simple enough – its value is immediately available, so no issues there and no changes are necessary. From the Name field, several columns are returned; two of them contain the ID of the user in the site collection's user list. It would be possible to connect to the root of the site collection, retrieve the user list, and establish a relationship between the tables, but clicking the expand icon will let Power Query do a lookup for each ID and return the desired attribute, in this case the name.

The same is true for the lookup field type – the column can be expanded in order to include any attribute of the source item. The field using managed metadata works in a similar fashion; it can be expanded in order to retrieve the text value of the managed metadata item. Link fields work the same way – they can be split into two columns, the link itself and the description. However, the field containing multiple values is a little different. The great news here is that it's now possible to work with it – previous versions of Power Query couldn't handle multi-value fields, but the SharePoint data source now supports them.

Multiple-value fields are returned as a list. With lists, the expand icon will duplicate the entire record for each value in the list. This may or may not be the desired behaviour, but remember, this is Power BI – everything can be aggregated.

Before and after expanding the multiple-value column

The conclusion to be drawn here is that in order to represent SharePoint list data that uses any sort of control more complex than text or number, we need to do some work. However, the good news is that someone (I'm not sure whether it's the Office team or the Power BI team) has added a feature that makes this whole process much simpler.

The Solution

After connecting to your SharePoint list, edit the query. Instead of diving in and performing all of these manual transforms, select your multi-value column(s) if you have any (this will make more sense momentarily). Select any rich text columns as well, and then scroll right to find a column named "FieldValuesAsText". Select this column, then right-click on it and select "Remove Other Columns".

The FieldValuesAsText column is our magic bullet. It automatically converts most of the more complex SharePoint data types (rich text fields being the exception) to simple text that works well within Power BI. Simply click on its expand button and select the columns that you want to include in your analysis. I find it useful to deselect "Use original column name as prefix" as well. We are left with textual representations of our field data.

You will notice that the multiple-value fields here have their values separated by commas. For multiple values, I tend to prefer the "raw" approach, which is why we retained the multiple-value column above. We can still expand it to create a separate line for each value, and remove the column created by "FieldValuesAsText".

Finally, you may have noted that the rich text field isn't automatically converted. In order to extract useful text from it, we still need to use Power Query's transformation functions such as Replace Values, Trim, and Clean.

In a nutshell, if you’ve been frustrated by formatting or data type limitations when using SharePoint data in Power BI, have another look, and check out the FieldValuesAsText column. It will make everything a lot simpler.

How to Enable Unlimited Storage in OneDrive for Business

Last December, it was announced that OneDrive for Business users would indeed be receiving unlimited storage if they had a qualifying subscription. (Details on which subscriptions qualify for unlimited storage can be found in the original announcement here).

Furthermore, I understood from the announcement and the coverage around it that users would initially be enabled with 5 TB, and that if you needed more, you would have to call support and ask for it to be enabled. Presumably this was to discourage users from seeing the infinity symbol for available space, and immediately uploading the contents of their DVD library.

I had been watching my storage stats and checking every month, to no avail, to see if the 5 TB had been enabled for my account. I was stuck at 1 TB. My wife also uses our tenant and is an active photographer with quite a few RAW files that she stores in OneDrive for Business (as an aside, she's very good – you can check out her work on her site). Last week, her storage exceeded 1 TB, and OneDrive for Business started complaining. It was time to do some digging.

As it turns out, my understanding wasn't exactly correct. You are entitled to unlimited storage, but you will only be given the 5 TB cap when you ask for it, and you can ask at any time. In order to get more than 5 TB, you ask for that too, but you can only ask once your storage is in the warning zone – close to 5 TB.

You might think that being Canadian, I’m fine with just asking politely, but patience is not my strong suit. The good news is that you can use the SharePoint Online PowerShell module to connect to your Office 365 tenant, and change the limit yourself. It’s not particularly easy though, so I’ll walk through the required steps, or at least the steps that I required.

1. Install the SharePoint Online PowerShell management shell

The SPO management shell is a PowerShell extension that allows you to connect to SharePoint Online and perform administrative functions. It's not installed by default, but it can be downloaded and installed from the Microsoft Download Center here. The odd thing is that it prompts you to choose from several files – 2 for 64-bit systems and 2 for 32-bit systems.


I’m not sure what the differences are aside from the bit level, but I grabbed the most recent 64 bit version and installed it.

Once installed, you must run the shell as an administrator, otherwise, it will fail to find the extension files. I also had all sorts of trouble running it on Windows 10 machines. After trying on 2 different ones, I gave up and installed it on a Windows 8.1 virtual machine, where it ran correctly.
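
As an alternative to the MSI, newer versions of the module are also published to the PowerShell Gallery, which sidesteps the installer entirely. A sketch, run from an elevated prompt:

# Install the SharePoint Online Management Shell from the PowerShell Gallery.
Install-Module -Name Microsoft.Online.SharePoint.PowerShell

# Confirm the SPO cmdlets are available.
Get-Command -Module Microsoft.Online.SharePoint.PowerShell | Select-Object -First 5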

2. Connect to your tenant with admin credentials.

From within the management shell, the first thing that you need to do is to connect to your tenant. You do so by running the Connect-SPOService cmdlet. The syntax is:

Connect-SPOService -Url <admin site URL> -Credential <admin user email address>

Neither one of the parameters is as simple as it may seem.

The -Url parameter is the administrative URL of your Office 365 SharePoint tenant. Normally, it's the standard SharePoint URL with "-admin" appended to the end of the first identifier. For example, if you normally access SharePoint Online at https://yourtenant.sharepoint.com, your admin URL is https://yourtenant-admin.sharepoint.com.

The -Credential parameter is also not quite what it seems. You need admin access to your tenant to run these commands, and chances are that if you are reading this, you have it. If not, you will have to provide the credentials of an account that does have admin access. The credential is in the form of an email address, and you will be prompted for a password when the command is run. This is where I ran into another difficulty.

If you have admin credentials to your tenant, it's that much more important that your account is secure. One of the best things that you can do in that regard is to use multi-factor authentication, which I have done for some time. Unfortunately, the SharePoint Online Management Shell doesn't support multi-factor authentication when connecting this way.

Normally this isn't a big problem; you can just register and supply an application password (Skype for Business still requires this, as an example). Unfortunately, PowerShell does not accept application passwords, and there is no way around this problem.

I fortunately had access to an administrator account that does not use MFA, and I was able to provide that to connect successfully. If you do not, you’ll need to create one in your Office 365 tenant to do this.
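
Putting it together, the connection sequence looks like the following sketch. The tenant name is a placeholder, and the account supplied to Get-Credential must be one without MFA, as discussed above.

# Connect to the tenant's SharePoint Online administration endpoint.
# "yourtenant" is a placeholder for your own tenant name.
$adminUrl = "https://yourtenant-admin.sharepoint.com"
$cred     = Get-Credential    # the (non-MFA) admin account and its password

Connect-SPOService -Url $adminUrl -Credential $cred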

3. Set the storage quota

The final step is to run the PowerShell command that actually sets your quota. The syntax of this command is:

Set-SPOSite -Identity https://yourmysiteurl -StorageQuota 5242880

The -Identity parameter is the URL of your MySite, which is where OneDrive for Business content is stored. The format usually takes your company's normal SharePoint URL, adds "-my", and appends a slightly altered form of your email address. Therefore, if your company name is "CoolCompany" and your email address is "user@coolcompany.com", then your MySite URL is https://coolcompany-my.sharepoint.com/personal/user_coolcompany_com.

Finally, the -StorageQuota parameter needs to be 5242880, which corresponds to 5 TB (the value is in megabytes: 5 × 1024 × 1024 = 5,242,880). I assume that you double it for 10 TB, but I haven't been able to test that, as I haven't uploaded enough to qualify for the next tier. You can only request storage increases in 5 TB chunks.
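
Putting both commands together, here is a sketch of setting the cap and then verifying it; the MySite URL is a placeholder.

# Set the 5 TB cap (the quota is expressed in MB) and check the result.
# The MySite URL below is a placeholder.
$mySite = "https://coolcompany-my.sharepoint.com/personal/user_coolcompany_com"

Set-SPOSite -Identity $mySite -StorageQuota 5242880

# StorageQuota and StorageUsageCurrent are both reported in MB.
Get-SPOSite -Identity $mySite | Select-Object Url, StorageQuota, StorageUsageCurrent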

Once the quota has been successfully set, you should be able to see your new cap in the OneDrive for Business web UI. Just hover over the OneDrive for Business icon in your system tray, right-click, and select "Manage storage".


The Storage Metrics page will open, and your storage allocation can be found in the upper right.


It’s not easy, but it’s worth it if you have a qualifying account.