
Month: January 2011

Setting up WordPress on Windows and SQL Azure: A Walkthrough

NOTE – July 17 2012 The post below was originally written in early 2011, and represents the effort required to get WordPress working in an Azure Web Role. With the release of the new Azure IAAS features  in June 2012, I wanted to note that I do not recommend this approach. I am leaving the post here as it may have historical value, or value for those using the tools described. WordPress can now be run as an Azure Web Site, or, as this blog is using, within an Azure Virtual Machine

As I mentioned in a post last week, the blog that you are currently reading is now hosted on Windows Azure. Nothing about the blog platform has changed, it’s still running on WordPress, but along the way I did switch the database from MySQL to SQL Azure. The process of getting this up and running was not exactly straightforward, so I thought that I would share my experience here.

To be clear, I am new to Azure. Brand new. What I’m writing below is simply my experience in getting this up and running, which happily I was able to do. This should not be taken as prescriptive guidance – that MVP badge at the top of this blog is for SharePoint – not Azure. If this helps you, then great. However, I would be happy to receive comments about mistakes, better approaches, or just other approaches.

Your mileage may vary – you’ve been warned…..

Since the Windows Azure operating environment is actually a Virtual Machine running Windows Server 2008 or 2008 R2, you can technically run anything on it that you can on either of those operating systems. Getting an ASP.NET service up into the cloud is a snap with Visual Studio, but getting other platforms up there, like PHP, requires a bit more effort. Luckily, Microsoft recently published the Windows Azure Companion, which makes it significantly easier to install PHP and PHP-based applications like WordPress and Drupal on Azure. We’ll be working with this tool extensively.

1. Create the  Storage Account

The Azure Companion installs a series of files into blob storage, so it is necessary to have a storage account available. Log in to the Azure Dashboard, click on “Hosted Services, Storage Accounts, and CDN”, and select  “Storage Accounts”. Once this has loaded, click on “New Storage Account”.

[screenshot]

From the following dialog box, choose your subscription, enter a name (URL) for your account, and choose a region.

[screenshot]

The URL that you enter can be whatever you like, but it MUST be unique across all Azure storage accounts. I also always choose a specific data center. Given that you are charged for bandwidth in and out of the data center, and my WordPress install will be using SQL Azure, I want to make sure that all data moving between my front end server and my SQL Azure server is within the same data center, and this is the only way that I know to do this. The Storage Account should be created fairly quickly.

2. Create the SQL Azure WordPress database

Since we will be using SQL Azure for data storage, it’s necessary to create the database ahead of time. To do so, log into the Azure Portal, select Database, drill into your subscription, select your server, and click Create.

[screenshot]

Depending on your subscription, you may get different options, but you need to select a database name, edition, and maximum size.

[screenshot]

You can select whatever you want for edition and size, but 1 GB should be more than enough for WordPress. Make sure that you remember the name of the database.

Finally, you’ll need to make sure that the Firewall rules are configured to allow access for  Azure services. From the Server information screen, click on Firewall Rules.

[screenshot]

Unless already selected, clicking on “Allow other Windows Azure services…..” will add a rule permitting your Azure services to access the database.

In addition, make note of the following information. You will need it when it comes time to set up WordPress, below:

[screenshot]
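Once the database, firewall rule, and credentials exist, it’s worth seeing how the pieces fit together. The sketch below (Python, with entirely hypothetical server and account names) simply assembles the ODBC-style connection string that the SQL Server drivers expect for SQL Azure – note the username@servername login format, which comes up again during the WordPress setup:

```python
def sql_azure_conn_str(server, database, user, password):
    """Build an ODBC connection string for SQL Azure.

    SQL Azure requires the login in user@server form, TCP port 1433,
    and an encrypted connection."""
    return (
        "Driver={SQL Server Native Client 10.0};"
        f"Server=tcp:{server}.database.windows.net,1433;"
        f"Database={database};"
        f"Uid={user}@{server};"
        f"Pwd={password};"
        "Encrypt=yes;"
    )

# Placeholder values -- substitute the server, database, and login noted above.
print(sql_azure_conn_str("myserver", "wordpress", "wpadmin", "secret"))
```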

3. Install the Windows Azure Companion

You can download the Windows Azure Companion from here. You have three choices – the companion without SSL, with SSL, and the source code. The big difference between the first two is what endpoints are configured, and the source code obviously lets you change the entire solution. Unfortunately, since the solution package has not been configured for remote desktop access, working with the source code is necessary.

Why is remote desktop access necessary? Well, if you are absolutely satisfied that you can get everything configured perfectly in the solution package, then it isn’t, but at this stage in the game, I just don’t have that much confidence. RD access lets me tweak things after deployment. The biggest reason for me, however, was that if you use the application installer in Azure Companion, it will want to install your WordPress instance in a subdirectory off the root (ie http://blogs.cloudapp.net/MyBlogName). I didn’t want that, I wanted to use http://myblogname.mydomainname.com, and getting that going requires a few modifications to IIS afterwards – hence the need for RDP access. However, if the default behaviour is OK for you, download the precompiled solution package, follow the instructions here and skip to the next section.

To edit the solution, you will need Visual Studio 2010 along with the Windows Azure Tools 1.3 SDK installed. You will also need to download the Azure Companion source files. Once done, open the solution file. There are actually two to choose from, one using SSL and one not. We will be working with the non-SSL solution.

The “AdminWebSite” role is what we’ll be working with. This is the central application for Azure Companion, and from there we’ll install PHP and WordPress. For now, we are primarily concerned with configuring the role, and setting up Remote desktop. In addition, we’ll create a certificate to be used both for the application and for management (Remote Desktop).

First we need to configure the role, and we do that by double clicking on the role name in the roles folder. This brings up the Configuration tab.

[screenshot]

The defaults on this page are fine, but the Instance Count is worth noting. According to my testing, Azure Companion can only be used with one instance. This is because multiple instances need to share a common file in the Blob storage, and this file is locked by the first process that accesses it. This will generate an availability warning on deployment.

The settings tab is where most of the configuration is performed.

[screenshot]

There are 5 values here that MUST be configured:

WindowsAzureStorageAccountName – The name of the storage account created in step 1
WindowsAzureStorageAccountKey – The storage account primary key, which can be obtained from the portal after the account has been created
AdminUserName – A new user name that will be used to administer Azure Companion
AdminPassword – The administrator’s password
ProductListXmlFeed – A list of available products to install. Azure Companion is extensible, and you can maintain your own list*.

* There is a publicly available feed at : http://wazstorage.blob.core.windows.net/azurecompanion/default/WindowsAzureCompanionFeed.xml
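These values ultimately live in the cloud project’s ServiceConfiguration.cscfg file. As a sketch, the relevant section ends up looking something like this (the setting names come from the list above; every value shown is a placeholder):

```xml
<ConfigurationSettings>
  <Setting name="WindowsAzureStorageAccountName" value="mystorageaccount" />
  <Setting name="WindowsAzureStorageAccountKey" value="PASTE-PRIMARY-ACCESS-KEY-HERE" />
  <Setting name="AdminUserName" value="companionadmin" />
  <Setting name="AdminPassword" value="a-strong-password" />
  <Setting name="ProductListXmlFeed" value="http://wazstorage.blob.core.windows.net/azurecompanion/default/WindowsAzureCompanionFeed.xml" />
</ConfigurationSettings>
```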

Once these settings are made, we need to modify the endpoints. Azure allows for 5 endpoints, and Remote Desktop, if enabled, occupies one of them. On the surface, the 4 endpoints specified should be fine, but my testing showed that either there is a hidden endpoint somewhere, or the limit is actually 4. Either way, we need to remove one of the endpoints. We have no need for MySQL, so that’s the one that loses.

[screenshot]

The rest of the settings are fine, so we don’t need to explore them. At this point it’s a good idea to save the project. If you get a write protected error, you’ll need to go to the folder where the project files are stored and remove the read only attribute from the project files. Don’t forget to come back and save the project.

While Visual Studio can deploy directly to Azure, for the first deployment, we have a bit of a chicken and egg problem. In order to deploy a solution that has Remote Desktop enabled, you need to have a service certificate and management certificate already available. The service certificates are installed as a node under the hosted service, but the hosted service hasn’t yet been created. Further, to create a new hosted service from the portal , you need to have the two package files available. Therefore, for our first deployment, we will create the service files and deploy manually.

Right click on your cloud project, and click “Publish”.

[screenshot]

In this case, the default value of “Create Service Package Only” is what we want, so click on “OK”. Visual Studio will then create the files that we need to deploy, and open up a Windows Explorer window to the path in which they’re contained. Copy this path to the clipboard, and we are now ready to create our first Service Application.

Open the Windows Azure Portal, click on Hosted Services, and then click “New Hosted Service”. The new Hosted Service dialog appears.

[screenshot]

The Service name only matters for management purposes, but the URL Prefix is the way that your application will be addressed from outside, so choose the name wisely. It must be unique among all Azure applications. For the region, make sure that you choose the same region specified for the SQL Azure database above. Since you likely won’t need separate staging and production environments, just deploy to production. We’ll be changing this right after we create it, so we don’t want to start it; that just takes extra time. Finally, select the package and configuration generated by Visual Studio.

Once this is done, Azure will create the virtual machine to host the instance, and install the instance itself. The process will take a few minutes, but when you’re ready to proceed, the portal window should appear something like the window below.

[screenshot]

The next thing that we need to do is to create our service and management certificates. The two certificates can be based on the same root certificate, but must be in two different formats. The management certificate will be a .cer file (which is what is created when a self-signed certificate is created), and the service certificate must be a .pfx file containing the private key, which is produced by exporting the certificate.

If you already have a certificate, you can skip this creation step, but to create one, the easiest way is to use the Visual Studio tools. Once again, Right click on your cloud project, and click “Publish”. This gets a little tricky.

[screenshot]

Click on the credentials drop down and select <Add>. The Project Management Authentication dialog then appears. Again, select the drop down, and if there are no credentials already stored, choose <create>. Then enter a friendly name for the certificate (in this case sfiWordPress). Follow the instructions in step 2 – note that there is no “Subscription Page”; you’ll be uploading a management certificate into the Management Certificates section. The subscription ID is obtained from the portal, and naming the credentials allows Visual Studio to refer to them at a later date.

[screenshot]

When ready, click OK. VS will connect to Azure to make sure that everything is OK, and load in all of your hosted services. However, we’re still not quite ready to redeploy. First we need to upload a service certificate to the service (we dealt with the management certificate above). This is because we need such a certificate in order to use Remote Desktop.

First, we need to create our certificate. To do so, click the “Configure Remote Desktop connections” link.

[screenshot]

Create a new certificate if necessary, and enter a user name for a local machine account (on the service). When ready, click OK. We’re almost ready to deploy, but first we must upload this new certificate to the service. However, it’s not yet in an importable format, so we need to export our certificate to a .pfx file. To do so, without closing our deployment dialog, run the Certificate Manager snap-in by clicking on the Start orb and entering certmgr.msc into the search box:

[screenshot]

When the certificate manager window opens, open the Personal branch, and click on Certificates. Then, right click on your new certificate, hover over All Tasks, and click on Export.

[screenshot]

The certificate export wizard will then start. Walk through the wizard, make sure that you select the option to export the private key, enter (and Remember!!!) a password, enter a file name and save it. When this is done, we’re ready to add it to our service application.

Go back to the Azure Portal, and navigate to your hosted service. Then click on the certificates node, and press the “Add Certificate” button in the ribbon. Browse to the certificate, and enter its password. When ready, click the create button.

[screenshot]

Now we’re ready to deploy our Remote Desktop enabled service. Go back to the Visual Studio Publish Dialog, make sure that you have the deploy to Azure option selected, and click the OK button.  If all is well you will receive the following prompt:

[screenshot]

This is simply warning you that you have another service deployed into the production slot, and that you will be overwriting it with this deployment. Since this is precisely what we want, go ahead and click the “Delete and Continue” button. The deployment process will take several minutes. I suggest going for coffee, or pursuing another vice that requires 5-10 minutes.

4. Use Azure Companion to Set Up PHP and WordPress

Once started, the Azure Companion management service is running on port 8080. You can access it by navigating to your service URL at that port. In this case, it’s http://sfiwhitepages.cloudapp.net:8080. You should see a screen similar to the following:

[screenshot]

You log in with the ID created in the service definition in Section 3. If you receive errors, chances are that all the services haven’t spun up yet. You will also receive errors here if you configured multiple instances of the service, because the instances are trying to access the same file.

Click on the Applications tab. You will find a number of applications listed, and the one we’re interested in is WordPress. Select it and click Next. The next page will show it selected, along with all of its dependencies, including the PHP runtime and the PHP SQL drivers.

[screenshot]

One parameter above is quite important: the installation path. This path will be used to form the URL to your blog, in the form http://serveraddress/InstallationPath. Unless you’ll be taking the additional steps to install the blog into the root that I describe below, you’ll want to choose this name wisely, as you’ll be handing it out.

After you have carefully read all of the licence terms (tee hee), click the “Accept” button. All of the requisite files will be installed for you. This process should take something less than a minute. When done, you will be returned to the application screen. You are now ready to set up WordPress. To begin the process, click on the launch button in the application window:

[screenshot]

Alternatively, you can enter the URL for WordPress on your service by adding the installation path to the end of your URL, i.e. http://sfiwordpress.cloudapp.net/wordpress. Doing so will begin the WordPress configuration procedure. You should be presented with a page that indicates that the WordPress configuration file needs to be created. Go ahead and do so. After a confirmation page (click “Let’s go!”), you’ll be prompted for the database connection information. Here you’ll enter the information that you recorded at the end of the SQL Azure setup step above (step 2). The format of the user name is a little non-standard (username@machinename), but the diagram below shows how the information from step 2 maps to the WordPress setup form.

[screenshot]
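If you’d rather create the configuration file by hand, the database section of wp-config.php ends up looking something like the sketch below (placeholder values throughout; this assumes the SQL Server-enabled WordPress build uses the standard configuration constants):

```php
<?php
// Values below are placeholders -- substitute the database name, server,
// and login recorded at the end of step 2.
define('DB_NAME',     'wordpress');                      // SQL Azure database name
define('DB_USER',     'wpadmin@myserver');               // note the user@servername format
define('DB_PASSWORD', 'your-password-here');
define('DB_HOST',     'myserver.database.windows.net');  // fully qualified server name
```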

Clicking Submit causes WordPress to check the connections, and if all is well, you are prompted to Run the install. Clicking the install button brings up the standard WordPress configuration screen, looking for the Site Title (for use on pages), the administrator user name, the administrator password, and your email. WordPress will use your email for things like administrator password resets, but the sendmail function will not work, at least without further setup (which I haven’t done). In other words, don’t forget the password.

[screenshot]

Once complete, go ahead and click the “Install WordPress” button. You’ll receive a confirmation message, and clicking on Log In will take you to the login screen. Once logged in (as the administrator), you’ll be taken to the standard WordPress admin screen.

[screenshot]

Of course navigating to the blog’s address will take you to the blog itself.

[screenshot]

If you’re happy with the URL as is, you’re done. However, if you’re like me, and you want to have the blog at the root, and/or you want to use your own domain to host the blog, then read on…..

5. Move the Blog to its own Application

Once installation is complete, WordPress is stored in a subdirectory of the main PHP host site. If we don’t need the site for anything else, then we can change its port bindings, create a new web application on port 80, and move the WordPress files to it.

The first step is to connect to your Azure instance with Remote Desktop. To do this, open up the Windows Azure portal, navigate to the service instance, and click the connect button in the ribbon.

[screenshot]

This will open up the Remote Desktop window. You will log in with the credentials that you created in the Remote Desktop configuration settings in Visual Studio. When entering the user name, make sure to preface it with a backslash (ie \jwhite) to indicate a local user. If you used a self signed certificate, you will receive certificate warnings, but these can safely be ignored.

Once logged in, start the IIS Manager, then open up your virtual server, and open the sites tab. You should see two sites, one is the admin site for Windows Azure Companion, and the other is the PHP Host site. Our first step will be to change the port binding to something other than 80 for the PHP host site, so select it, and click on Bindings.

[screenshot]

Select the current binding, click Edit, and change the port to something other than 80 (I used 81). Keep in mind that because this endpoint is not defined in the service, it will be unavailable outside of the host instance.

[screenshot]

Next, open Windows Explorer and open up the F: drive. Create a new folder to house the WordPress files (in my case F:\WordPress). This will be our new blog folder. The name will only matter to the server. Next, open up the F:\Applications folder. This is the root of the current WordPress site. Copy the two files stored there (phpinfo.php and web.cfg) to the folder that you just created. Next, navigate to the WordPress folder in the F:\Applications folder (in my case, F:\Applications\wordpress) and copy everything there into the same new folder.

Once this is done, go back into IIS Manager, right click on the Sites folder, and select Add New Site.

[screenshot]

Give the site a unique name, and for the Physical Path, use the new blog folder created above. Ensure that the site is explicitly bound to the server’s IP address, and that it is bound to port 80. When finished, the web site should start, but if there is an error, simply restart it.

You should now be able to navigate to the root of your site, but you will likely notice that all of your styles have gone. This is because WordPress still thinks it’s installed in the old folder. To fix this, you have two options. You could connect to your SQL Azure instance with SQL Enterprise Manager and edit the siteurl record in the wp_options table. Alternatively, you can navigate to the new blog folder, delete the wp-config.php file, navigate to the blog root, and set it up again as above. This is likely the easier option, but be aware that before you do this, you will need to drop and recreate your SQL database.
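If you take the SQL route, the change is a one-row update. A sketch (the wp_options table and the siteurl/home option names are standard WordPress; verify your table prefix, and note that the home option usually needs the same change):

```sql
-- Point WordPress at the new root URL.
UPDATE wp_options
SET option_value = 'http://myblogname.mydomainname.com'
WHERE option_name IN ('siteurl', 'home');
```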

We’re almost there…

6. Implement a custom Domain Name

OK. Now we have a WordPress blog running at the root of our application. However, if you’re like me, you likely want to use your own domain, and not cloudapp.net. Unfortunately, Azure uses variable external IP addresses, so there’s no way to use host headers and DNS A records. However, we can create a CNAME (alias) record that maps servername.mydomain.root to xxx.cloudapp.net.

The first step is to log in to your Domain Services provider. I use DynDNS so the example will be from there. Simply add a CNAME record that converts the Azure service name to your desired address. The DynDNS example is below.

[screenshot]
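In zone-file terms, the record amounts to a single line (hypothetical names – substitute your own host and Azure service):

```
blog.mydomainname.com.    IN    CNAME    sfiwhitepages.cloudapp.net.
```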

You should now be able to navigate to your blog at the new address. It may take a few minutes for changes to propagate.

We have one step to go. WordPress is answering on this new address, but it will still form all of its URLs using the old one. We need to tell it to use the new address, and we do that by navigating to the admin app. Until we make this change, we need to use the old address, which in my case is http://sfiwhitepages.cloudapp.net/wp-admin.

Once logged in, click on settings along the left menu, and then select General.

[screenshot]

Once those changes are made, you can use the new domain exclusively.

That’s it…we’re done! Easy right?  It’s worth it. No longer is my blog at the mercy of my local power provider, or any IT maintenance schedules. As I mentioned at the beginning, please feel free to comment with any better approaches, or any egregious errors that I may have made.


The White Pages are now Running on Windows and SQL Azure

Ever since I started this blog, I’ve hosted it internally on our premises. Part of the reason for this was that I wanted to have full control over what was going on with it, and I wanted to work in a familiar environment. For me, that was of course the Microsoft stack. While SharePoint has excellent blogging features, made even better by the Community Kit for SharePoint: Enhanced Blog Edition, my feeling is that its feature set is more applicable to an inside the firewall deployment. Also, if I were to use SharePoint for this purpose, I’d be constantly distracted by the desire to improve upon it.

What I needed was a platform that was focused on blogging, and that I wouldn’t wind up tinkering with too much. I settled on WordPress, which seemed to be very well supported, and quite good at what it did. WordPress had direct integration with Windows Live Writer, and had apps for the iPhone, Blackberry, Android, and now Windows Phone 7.

WordPress natively runs on PHP and MySQL, and typically runs in Linux environments. However, since IIS supports PHP and MySQL runs on Windows, it is possible to get it running in my “familiar environment”. Normally doing this sort of thing is a bear, but by using the Web Platform Installer from Microsoft, the installation was a breeze. All that was necessary was to run it, and select WordPress as a desired application. The installer then took care of downloading PHP, MySQL, WordPress, and integrating them all together. After answering a few account and password questions, I was up and running, and have been ever since.

The one drawback of this approach was that I was hosting it myself, and therefore always concerned with reliability and uptime. More importantly, it has been sharing a server with other applications, and more than once it has gone down because another system needed a reboot, crashed, or something. A hosted environment was the obvious answer, and since I’ve been exploring the Azure platform lately, I thought I’d see what was involved. One of the advantages of the MVP program, which I’m newly a part of, is that you are allocated a certain amount of Azure computing hours, so off I went experimenting.

Happily, one weekend later, this blog has been transitioned to a high speed, and highly available platform, that most importantly, I don’t have to maintain. Not only that, but I’ve been able to take MySQL out of the picture completely, and I’m using a SQL Azure database as my data store. I had several false starts right away, and I’m going to document the approach  that I took and post it here shortly, but for now, I’m pretty happy with the results.

Hello Azure!


Using XSL to get the URL of the Current SharePoint Page for the Content Query Web Part

The Content Query Web Part (CQWP) is the Swiss army knife of SharePoint. I use it in all sorts of situations. Recently, I had a situation where I was displaying content from a central list on a series of decentralized sites. The content in the web part was being filtered by the site it was hosted on. When the users navigated to the site, all they saw were the items from the central list that pertained to the site they were on.

When the users clicked on the item in the result set, it opened the item. The problem was that when they closed the item, they would be returned to the source list, which was not what they expected.

The solution was to add the source URL parameter to the XSL that the CQWP used, but how would it know the site that should be returned? Thankfully, Spyral Out had already come across this requirement, and sorted it out. I repeat it here so that there is another source available. Here’s how to do it:

  1. Add the namespace xmlns:ddwrt="http://schemas.microsoft.com/WebParts/v2/DataView/runtime" to the top of the xsl file (usually ItemStyle.xsl)
  2. Add a parameter "PageUrl" <xsl:param name="PageUrl" /> right below your namespace definitions
[screenshot]

  3. Assign this parameter along with the SafeLink URL to a variable, and then use the variable as the link target

<xsl:variable name="DetailPageLink" select="concat($SafeLinkUrl,'&amp;Source=',$PageUrl)" />
<a href="{$DetailPageLink}" title="{@LinkToolTip}">

Once done, users can click through to an item, edit it, and when saved or closed, they will be returned to the page hosting the web part.
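Putting the three pieces together, the relevant portion of ItemStyle.xsl looks roughly like the sketch below (template and variable names will vary with your item style; $SafeLinkUrl is computed earlier in the stock ItemStyle.xsl):

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:ddwrt="http://schemas.microsoft.com/WebParts/v2/DataView/runtime">

  <!-- Populated by SharePoint with the URL of the page hosting the CQWP -->
  <xsl:param name="PageUrl" />

  <xsl:template name="Default" match="Row" mode="itemstyle">
    <!-- Append the hosting page as the Source, so closing the item returns here -->
    <xsl:variable name="DetailPageLink"
                  select="concat($SafeLinkUrl, '&amp;Source=', $PageUrl)" />
    <a href="{$DetailPageLink}" title="{@LinkToolTip}">
      <xsl:value-of select="@Title" />
    </a>
  </xsl:template>

</xsl:stylesheet>
```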


Windows Live Photo Gallery and Digital Frames–A Match Made In Heaven

This post may be a little late for the holiday season, but there’s always another one coming up. I’ve written before about Windows Live Photo Gallery and its promise when it was in Beta. It’s been out for several months now, and my opinion on it hasn’t changed – it’s an excellent photo organizing tool. It has some light editing capabilities, but I work frequently with RAW images, and I use Adobe Photoshop with Bridge for my picture processing tasks. However, once processing is complete, Live Photo Gallery takes over for tagging and organizing.

One of the things that I really like about the product is its integration with external applications and galleries. I store my photos in Windows Live and Facebook (for sharing with others) and in Flickr for both sharing and full size image storage. Live Photo Gallery makes this very easy. Once the pictures are tagged, you simply select the ones that you want to send (I also like to tag them with a destination/album name like “Flickr – 2009 General” so that I know what I’ve saved) and then click the relevant destination in the Share section on the tab.

[screenshot]

Once configured, for each selection you’ll get a dialog box prompting you for the album and other metadata, and in the case of Facebook, for people. Remember that Windows Live and Facebook are tightly integrated, so people you tag with Photo Gallery will automatically be reflected on Facebook. In the case of Facebook, the dialog looks something like below:

[screenshot]

This makes sharing photos online really easy to do. However, I have always found that digital frames were much more difficult. Being in technology, I of course have given digital frames to my parents, grandparents and in-laws. It’s a great way to get photos to them, but managing the content can be a bit of a nightmare. I have run across two major stumbling blocks doing this.

The first problem is the limited storage capacity of the frame itself. With cameras boasting higher and higher megapixel counts, file sizes are increasing exponentially. Many older frames have storage capacities below 256 MB, which just doesn’t cut it. Even modern frames have a typical capacity of 1 GB, and while that can be increased through expansion cards, it’s really only prolonging the inevitable.

The solution to this is to convert the images. Most frames are relatively low resolution, most being in the 640×480 or 800×600 range. If you’re counting, that’s 0.3 and 0.5 megapixels respectively. Converting the images to the native resolution of the frame will result in drastically lower storage requirements without any loss in displayed quality. The problem with this approach is that conversion software is a little above the heads of most casual users, and it generates yet another group of pictures to manage.
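The arithmetic behind the conversion is straightforward – scale each image down so it fits the frame’s native resolution while preserving aspect ratio. A sketch (Python; the dimensions are examples, and the actual resizing would be done by whatever conversion tool you use):

```python
def fit_within(width, height, max_w, max_h):
    """Return (w, h) scaled to fit inside max_w x max_h, preserving the
    aspect ratio; images already small enough are left alone."""
    scale = min(max_w / width, max_h / height, 1.0)
    return round(width * scale), round(height * scale)

# A 10-megapixel shot shrinks to the frame's ~0.5 megapixels:
print(fit_within(3872, 2592, 800, 600))  # prints (800, 536)
```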

The other problem is randomization. Believe it or not, most frames that I’ve encountered do not automatically randomize image play, leaving you to watch the same sequence over and over again. Since they’re usually sorted on filename, you’re often stuck watching things in chronological order. The way around the order is to get some sort of file renaming utility and rename all of the files before copying them over.

Those are the problems. However, Windows Live Photo Gallery supports plug ins for its destinations (Flickr, SkyDrive, Facebook, and YouTube are all out of the box), and there is an excellent plug in for digital frames written by Leo Lie. Essentially, it treats the frame, or any SD card, as a destination in the same way as Flickr, etc. Once you select the photos you want, you simply press the button, select whether or not you want the files resized, and to what degree, and if you wish, it will randomize your photos for you. This solves the two problems (almost) in one fell swoop. If I tag the photos with the name of the frame, and I continue to be religious about tagging, every time grandma comes over, she can bring her SD card, I can erase it, and reload it. Simple.

You can get the plug in by clicking on my link above, or from within photo gallery, you can check out all of the available plug ins. It’s not obvious how, so I’m including a screenshot below. There is a scrollbar to the right of the Share section on the Home tab of the ribbon. At the bottom of that scrollbar is a drop down. Click it, and click Add a plug in, and you’ll be taken to the plug in gallery. There are several good ones.

[screenshot]

I should also mention that while I use this for all of my relatives, I recently purchased a Kodak Pulse wireless frame for my own use at home. If you have wireless, it’s a very good way to go. With it, you can send pictures directly to it, you can use Kodak’s file share, and you can email pictures directly to it. However, the real value here is that it integrates with Facebook, so that any pictures you post to Facebook (configurable) will show up on the screen. Since Facebook stores low resolution pictures, this is perfect. I simply use the Windows Live Photo Gallery integration to send the pictures to Facebook, and I’m done. I’ll be going on a diving trip by myself (more on that later) in a few weeks, but the family will be able to see my pictures as I post them.

Now I just need to keep up with my tagging.


Joining a Machine to A Domain over VPN with Windows 7

This has probably been blogged about a million times, but I wanted to get this down here for my own reference. I’ve always assumed since XP (you could do this easily with XP) that in order to join a machine to a domain, you needed to be physically at that location.

I was recently faced with the need to join a VM to a customer’s domain, but I didn’t want to travel there, so I tried the approach below, and it worked. Hopefully it can help someone else as well. Here’s how:

  1. Establish a VPN connection with the destination network. I used the built in Microsoft VPN client, but any VPN client should work.
  2. Take note of the machine name and the local user account that you’re currently using
  3. Go through the standard domain joining procedure (note that you need to have an account with permissions to join a machine to the domain)
  4. Do NOT reboot right away. Make sure that you add the domain account that you’ll be using to the local administrators group (if applicable). I often forget to do this and it costs a few extra reboots
  5. Reboot the machine.
  6. Login as the user that you noted in #2. You’ll need to use the format MACHINENAME\USERNAME. You will not yet be able to login as a domain user, because you need to establish a VPN connection in order to see a domain controller to allow the login and set up the domain account.
  7. Once logged in as the local user, establish a VPN connection to the destination network.
  8. Without logging the local user off, use the “switch user” function (as shown below).
[screenshot]
  9. Login with the domain account that you want to use. The account will be set up locally for you.

This works because the VPN connection is shared between the login sessions. Once you’ve done this, you can log off the local account, and all should be well moving forward. If your domain user needs access to corporate resources, then another VPN connection will need to be established from within that session.
