OK, so my company is a Microsoft partner, and we’re supposed to like everything that they throw our way, right? That’s actually not true. I’ll certainly give most things that they do a fair shot. It’s also true that I’m willing to sacrifice a certain amount of capability for ease of use, or for the way that Microsoft products work well together, but as I noted in a previous post, I only gave up my BlackBerry when Microsoft came out with a product that was worth using.
My company is small (currently 6 people) and widely distributed. Cloud solutions make perfect sense to us, and we have been using Exchange Online for over 2 years now. Our requirements for SharePoint went beyond what was possible in BPOS’ offering, but since migrating to Office 365 6 months ago, the new SharePoint Online fits the bill, and more and more of our corporate assets live there now.
UnlimitedViz is currently primarily a SharePoint services company focused on Business Intelligence, and a significant portion of those services involve architecting SharePoint environments at a lower level, which involves sizing servers, making resource decisions, etc. I personally love designing solutions and watching them come to life. We are certainly more than capable of maintaining our own SharePoint infrastructure, so why would we want to use an admittedly more limited version of the product that is maintained by someone else?
Pretty much because it’s maintained by someone else.
As mentioned above we’re small, and we need to be focused on what we do best, which is providing services to our customers, and building product. Maintaining internal systems, no matter how good we are at it, is a distraction, and a significant cost, both capital and operational. The per user cost of Office 365 is pretty simple to justify from just a cost standpoint, but there are many more benefits that are brought to the table.
No matter where a team member is located, they can easily access what they need. Lync brings that down to the voice and IM communication level. No need to mess around with access methods, VPNs, firewalls, reverse proxy servers and the like. We can get to our content easily on site or at home, via whatever device we happen to need. Granted, I could set that stuff up on-premises, but now I don’t have to! I also know that my data is safe, and the performance is going to be good. Two months ago, Exchange Online suffered an outage for about two hours (the only hiccup I’ve experienced so far). My initial reaction was “what can I do to fix this?”, but that was quickly superseded by “it’s not my problem to fix”, so I just sat back and got other work done.
As we bring more customers onto Office 365, supporting them just gets simpler. A simple client request can be acted upon immediately by launching a browser window and connecting to their site, seamlessly. With most onsite installations, I need to start a virtual machine, connect through a VPN client, and then hope that the correct tools are installed on the VM, or at the client site, depending on the access mechanism. I try to keep a VM image available for every type of VPN client used, which is a hopeless but necessary task due to the incompatibilities between clients. In my opinion, the world will be a better place when VPN clients are eliminated (or at least consolidated).
Customers using Office 365 don’t need VPN clients, and it makes it that much easier (and cheaper for them) for us to support them.
There are a whole bunch of great features in Office 365 (shared OneNote files accessed via Windows Phone, browser and client is a good one, not to mention Lync), but the reason that I really like it is that it’s solid, it works, and it lets my business focus on using its tools, not maintaining them.
NOTE – July 17, 2012 – The post below was originally written in early 2011, and represents the effort required to get WordPress working in an Azure Web Role. With the release of the new Azure IaaS features in June 2012, I want to note that I do not recommend this approach. I am leaving the post here as it may have historical value, or value for those using the tools described. WordPress can now be run as an Azure Web Site, or, as this blog does, within an Azure Virtual Machine.
As I mentioned in a post last week, the blog that you are currently reading is now hosted on Windows Azure. Nothing about the blog platform has changed, it’s still running on WordPress, but along the way I did switch the database from MySQL to SQL Azure. The process of getting this up and running was not exactly straightforward, so I thought that I would share my experience here.
To be clear, I am new to Azure. Brand new. What I’m writing below is simply my experience in getting this up and running, which happily I was able to do. This should not be taken as prescriptive guidance – that MVP badge at the top of this blog is for SharePoint – not Azure. If this helps you, then great. However, I would be happy to receive comments about mistakes, better approaches, or just other approaches.
Your mileage may vary – you’ve been warned…..
Since the Windows Azure operating environment is actually a Virtual Machine running Windows Server 2008 or 2008 R2, you can technically run anything on it that you can with either of those environments. Getting an ASP.NET service up into the cloud is a snap with Visual Studio, but getting other platforms up there, like PHP, requires a bit more effort. Luckily, Microsoft recently published the Windows Azure Companion, which makes it significantly easier to install PHP and PHP-based applications like WordPress and Drupal on Azure. We’ll be working with this tool extensively.
1. Create the Storage Account
The Azure Companion installs a series of files into blob storage, so it is necessary to have a storage account available. Log in to the Azure Dashboard, click on “Hosted Services, Storage Accounts, and CDN”, and select “Storage Accounts”. Once this has loaded, click on “New Storage Account”.
From the following dialog box, choose your subscription, enter a name (URL) for your account, and choose a region.
The URL that you enter can be whatever you like, but it MUST be unique across all Azure storage accounts. I also always choose a specific data center. Given that you are charged for bandwidth in and out of the data center, and my WordPress install will be using SQL Azure, I want to make sure that all data moving between my front end server and my SQL Azure server is within the same data center, and this is the only way that I know to do this. The Storage Account should be created fairly quickly.
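The naming rules are worth spelling out, since the portal only tells you a name is taken or invalid after you try it. The sketch below checks the documented format rules (3–24 characters, lowercase letters and digits only); global uniqueness can of course only be verified against the service itself, and the sample names are made up.

```python
import re

def valid_storage_account_name(name):
    """Check the format rules for an Azure storage account name:
    3-24 characters, lowercase letters and digits only.
    (Global uniqueness still has to be checked by the portal.)"""
    return re.fullmatch(r"[a-z0-9]{3,24}", name) is not None

print(valid_storage_account_name("sfiwordpress"))  # True
print(valid_storage_account_name("My-Storage"))    # False: uppercase and hyphen
print(valid_storage_account_name("ab"))            # False: too short
```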
2. Create the SQL Azure WordPress database
Since we will be using SQL Azure for data storage, it’s necessary to create the database ahead of time. To do so, log into the Azure Portal, select Database, drill into your subscription, select your server, and click Create.
Depending on your subscription, you may get different options, but you need to select a database name, edition, and maximum size.
You can select whatever you want for edition and size, but 1 GB should be more than enough for WordPress. Make sure that you remember the name of the database.
Finally, you’ll need to make sure that the Firewall rules are configured to allow access for Azure services. From the Server information screen, click on Firewall Rules.
Unless already selected, clicking on “Allow other Windows Azure services…..” will add a rule permitting your Azure services to access the database.
In addition, make note of the following information. You will need it when it comes time to set up WordPress, below:
3. Install the Windows Azure Companion
You can download the Windows Azure Companion from here. You have three choices – the companion without SSL, with SSL, and the source code. The big difference between the first two is what endpoints are configured, and the source code obviously lets you change the entire solution. Unfortunately, since the solution package has not been configured for remote desktop access, working with the source code is necessary.
Why is remote desktop access necessary? Well, if you are absolutely satisfied that you can get everything configured perfectly in the solution package, then it isn’t, but at this stage in the game, I just don’t have that much confidence. RD access lets me tweak things after deployment. The biggest reason for me, however, was that if you use the application installer in Azure Companion, it will want to install your WordPress instance in a subdirectory off the root (i.e. http://blogs.cloudapp.net/MyBlogName). I didn’t want that; I wanted to use http://myblogname.mydomainname.com, and getting that going requires a few modifications to IIS afterwards – hence the need for RDP access. However, if the default behaviour is OK for you, download the precompiled solution package, follow the instructions here and skip to the next section.
The “AdminWebSite” role is what we’ll be working with. This is the central application for Azure Companion, and from there we’ll install PHP and WordPress. For now, we are primarily concerned with configuring the role, and setting up Remote desktop. In addition, we’ll create a certificate to be used both for the application and for management (Remote Desktop).
First we need to configure the role, and we do that by double clicking on the role name in the Roles folder. This brings up the Configuration tab.
The defaults on this page are fine, but the Instance Count is worth noting. According to my testing, Azure Companion can only be used with one instance. This is because multiple instances need to share a common file in the Blob storage, and this file is locked by the first process that accesses it. This will generate an availability warning on deployment.
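The single-instance limitation comes down to exclusive access to one shared file. The sketch below reproduces that behaviour locally with an exclusive-create on a lock file; it's an analogy for the blob lease, not the Companion's actual mechanism, and the file name is made up.

```python
import os
import tempfile

# The Companion's instances coordinate through a single file in blob
# storage, and the first instance to grab it locks the others out.
# An exclusive-create on a local file is a stand-in for that lease.
lock_path = os.path.join(tempfile.mkdtemp(), "companion.lock")

def try_acquire(path):
    try:
        fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return True   # this "instance" got the file
    except FileExistsError:
        return False  # a second instance is locked out

first = try_acquire(lock_path)
second = try_acquire(lock_path)
print(first, second)  # True False
```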
The settings tab is where most of the configuration is performed.
There are several values here that MUST be configured:
The name of the storage account created in step 1
The storage account primary key that can be obtained from the portal after the account has been created
A new user name that will be used to administer Azure Companion
A list of available products to install. Azure Companion is extensible, and you can maintain your own list*.
Once these settings are made, we need to modify the endpoints. Azure allows for 5 endpoints, and Remote Desktop occupies one of them. On the surface, the 4 endpoints specified should be fine, but my testing showed that either there is a hidden endpoint somewhere, or the limit is actually 4. Either way, we need to remove one of the endpoints. We have no need for MySQL, so that’s the one that loses.
The rest of the settings are fine, so we don’t need to explore them. At this point it’s a good idea to save the project. If you get a write protected error, you’ll need to go to the folder where the project files are stored and remove the read only attribute from the project files. Don’t forget to come back and save the project.
While Visual Studio can deploy directly to Azure, for the first deployment we have a bit of a chicken and egg problem. In order to deploy a solution that has Remote Desktop enabled, you need to have a service certificate and management certificate already available. The service certificates are installed as a node under the hosted service, but the hosted service hasn’t yet been created. Further, to create a new hosted service from the portal, you need to have the two package files available. Therefore, for our first deployment, we will create the service files and deploy manually.
Right click on your cloud project, and click “Publish”.
In this case, the default value of “Create Service Package Only” is what we want, so click on “OK”. Visual Studio will then create the files that we need to deploy, and open up a Windows Explorer window to the path in which they’re contained. Copy this path to the clipboard, and we are now ready to create our first Service Application.
Open the Windows Azure Portal, click on Hosted Services, and then click “New Hosted Service”. The new Hosted Service dialog appears.
The Service name only matters for management purposes, but the URL Prefix is the way that your application will be addressed from outside, so choose the name wisely. It must be unique among all Azure applications. For the region, make sure that you choose the same region specified for the SQL Azure database above. Since you likely won’t need separate staging/production environments, just deploy to production. We’ll be changing this right after we create it, so we don’t want to start it; that just takes extra time. Finally, select the package and configuration generated by Visual Studio.
Once this is done, Azure will create the virtual machine to host the instance, and install the instance itself. The process will take a few minutes, but when you’re ready to proceed, the portal window should appear something like the window below.
The next thing that we need to do is to create our service and management certificates. The two certificates can be based on the same root certificate, but must be in two different formats. The management certificate will be a .cer file (which is what is created when a self-signed certificate is generated), while the service certificate must be an X.509 certificate that includes the private key, which we’ll export as a .pfx file below.
If you already have a certificate, you can skip this creation step, but to create one, the easiest way is to use the Visual Studio tools. Once again, Right click on your cloud project, and click “Publish”. This gets a little tricky.
Click on the credentials drop-down and select <Add>. The Project Management Authentication dialog then appears. Again, select the drop-down, and if there are no credentials already stored, choose <create>. Then enter a friendly name for the certificate (in this case sfiWordPress). Follow the instructions in step 2 – but note that there is no “Subscription Page”; rather, you’ll be uploading a management certificate into the Management Certificates section. The subscription ID is obtained in the portal, and the purpose of naming the credentials is so that Visual Studio can refer to them at a later date.
When ready, click OK. VS will connect to Azure to make sure that everything is OK, and load in all of your hosted services. However, we’re still not quite ready to redeploy. First we need to upload a service certificate to the service (we uploaded the management certificate above). This is because we need such a certificate in order to use Remote Desktop.
First, we need to create our certificate. To do so, click the “Configure Remote Desktop connections” link.
Create a new certificate if necessary, and enter a local machine (for the service) user name. When ready, click OK. We’re almost ready to deploy, but first we must upload this new certificate to the service. However, it’s not yet in an importable format, so we need to export our certificate to a .pfx file. To do so, without closing our deployment dialog, run the Certificate Manager snap-in by clicking on the Start orb and entering certmgr.msc into the search box:
When the certificate manager window opens, open the Personal branch, and click on Certificates. Then, right click on your new certificate, hover over all tasks, and click on Export.
The certificate export wizard will then start. Walk through the wizard, make sure that you select the option to export the private key, enter (and Remember!!!) a password, enter a file name and save it. When this is done, we’re ready to add it to our service application.
Go back to the Azure Portal, and navigate to your hosted service. Then click on the certificates node, and press the “Add Certificate” button in the ribbon. Browse to the certificate, and enter its password. When ready, click the create button.
Now we’re ready to deploy our Remote Desktop enabled service. Go back to the Visual Studio Publish Dialog, make sure that you have the deploy to Azure option selected, and click the OK button. If all is well you will receive the following prompt:
This is simply warning you that you have another service deployed into the production slot, and that you will be overwriting it with this deployment. Since this is precisely what we want, go ahead and click the “Delete and Continue” button. The deployment process will take several minutes. I suggest going for coffee, or pursuing another vice that requires 5-10 minutes.
4. Use Azure Companion to Set Up PHP and WordPress
Once started, the Azure Companion management service is running on port 8080. You can access it by navigating to your service URL at that port. In this case, it’s http://sfiwhitepages.cloudapp.net:8080. You should see a screen similar to the following:
You log in with the ID created in the service definition in Section 3. If you receive errors, chances are that all the services haven’t spun up yet. You will also receive errors here if you configured multiple instances of the service, because both instances are trying to access the same file.
Click on the Applications tab. You will find a number of applications listed; the one we’re interested in is WordPress. Select it and click Next. The next page will show it selected, along with all of its dependencies, including the PHP runtime and the PHP SQL drivers.
One parameter above is quite important: the installation path. This path will be used to form the URL to your blog, in the form http://serveraddress/InstallationPath. Unless you’ll be taking the additional steps that I describe below to install the blog into the root, you’ll want to choose this name wisely, as you’ll be handing it out.
After you have carefully read all of the licence terms (tee hee), click the “Accept” button. All of the requisite files will be installed for you. This process should take something less than a minute. When done, you will be returned to the application screen. You are now ready to set up WordPress. To begin the process, click on the launch button in the application window:
Alternatively, you can enter the URL for WordPress on your service by adding the installation path to the end of your URL, i.e. http://sfiwordpress.cloudapp.net/wordpress. Doing so will begin the WordPress configuration procedure. You should be presented with a page that indicates that the WordPress configuration file needs to be created. Go ahead and do so. After a confirmation page (click “Let’s go!”), you’ll be prompted for the database connection information. Here you’ll enter the information that you recorded at the end of the SQL Azure setup step above (step 2). The format of the user name is a little nonstandard (username@machinename), but the diagram below shows how the information from step 2 maps to the WordPress setup form.
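The mapping can be sketched as a simple transformation of the values recorded in step 2. This is just an illustration of the username@machinename convention; the server name, database, and login below are hypothetical.

```python
# Sketch of how the SQL Azure values from step 2 map onto the
# WordPress database form. All names here are made up.
def wordpress_db_settings(server, database, login):
    return {
        "Database Name": database,
        # SQL Azure expects the login qualified with the server name
        "User Name": f"{login}@{server}",
        "Database Host": f"{server}.database.windows.net",
    }

settings = wordpress_db_settings("abc123xyz", "wordpress", "sqladmin")
print(settings["User Name"])      # sqladmin@abc123xyz
print(settings["Database Host"])  # abc123xyz.database.windows.net
```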
Clicking Submit causes WordPress to check the connections, and if all is well, you are prompted to Run the install. Clicking the install button brings up the standard WordPress configuration screen, looking for the Site Title (for use on pages), the administrator user name, the administrator password, and your email. WordPress will use your email for things like administrator password resets, but the sendmail function will not work, at least without further setup (which I haven’t done). In other words, don’t forget the password.
Once complete, go ahead and click the “Install WordPress” button. You’ll receive a confirmation message, and clicking on Log In will take you to the login screen. Once logged in (as the administrator), you’ll be taken to the standard WordPress admin screen.
Of course navigating to the blog’s address will take you to the blog itself.
If you’re happy with the URL as is, you’re done. However, if you’re like me, and you want to have the blog at the root, and/or you want to use your own domain to host the blog, then read on…..
5. Move the Blog to its own Application
Once installation is complete, WordPress is stored in a subdirectory of the main PHP host site. If we don’t need the site for anything else, then we can change its port bindings, create a new web application on port 80, and move the WordPress files to it.
The first step is to connect to your Azure instance with Remote Desktop. To do this, open up the Windows Azure portal, navigate to the service instance, and click the connect button in the ribbon.
This will open up the Remote Desktop window. You will log in with the credentials that you created in the Remote Desktop configuration settings in Visual Studio. When entering the User Name, make sure to preface it with a backslash (i.e. \jwhite) to indicate a local user. If you used a self-signed certificate, you will receive certificate warnings, but these can safely be ignored.
Once logged in, start the IIS Manager, then open up your virtual server, and open the sites tab. You should see two sites, one is the admin site for Windows Azure Companion, and the other is the PHP Host site. Our first step will be to change the port binding to something other than 80 for the PHP host site, so select it, and click on Bindings.
Select the current binding click edit, and change the port to something other than 80 (I used 81). Keep in mind that because this endpoint is not defined in the service, it will be unavailable outside of the host instance.
Next, open Windows Explorer and open up the F: drive. Create a new folder to house the WordPress files (in my case F:\WordPress). This will be our new blog folder; the name will only matter to the server. Next, open up the F:\Applications folder. This is the root of the current WordPress site. Copy the two files stored there (phpinfo.php and web.cfg) to the folder that you just created. Next, navigate to the WordPress folder under F:\Applications (in my case, F:\Applications\wordpress) and copy everything there into the same new folder.
Once this is done, go back into IIS Manager, right click on the Sites folder, and select Add New Site.
Give the site a unique name, and for the Physical Path, use the new blog folder created above. Ensure that the app is explicitly bound to the server’s IP address, and in addition, ensure that the site is bound to port 80. When finished, the web site should start, but if there is an error, simply restart it.
You should now be able to navigate to the root of your site, but you will likely notice that all of your styles have gone. This is because WordPress still thinks it’s installed in the old folder. To fix this, you have two options. You could connect to your SQL Azure instance with SQL Server Management Studio, and edit the siteurl record in the wp_options table. Alternatively, you can navigate to the new blog folder, delete the wp-config.php file, navigate to the blog root, and set it up again as above. This is likely the easier option, but be aware that before you do this, you will need to drop and recreate your SQL database.
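If you take the SQL route, WordPress actually keeps its base address in two rows of the wp_options table: 'siteurl' and 'home', and both should be updated. The sketch below just builds the UPDATE statements you would run against the database; the URL is an example.

```python
# WordPress stores its base URL in two wp_options rows: 'siteurl'
# and 'home'. This helper builds the UPDATE statements to run
# against the SQL Azure database; the URL passed in is an example.
def siteurl_updates(new_url):
    return [
        f"UPDATE wp_options SET option_value = '{new_url}' "
        f"WHERE option_name = '{name}';"
        for name in ("siteurl", "home")
    ]

for stmt in siteurl_updates("http://sfiwordpress.cloudapp.net"):
    print(stmt)
```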
We’re almost there…
6. Implement a custom Domain Name
OK. Now we have a WordPress blog running at the root of our application. However, if you’re like me, you likely want to use your own domain, and not cloudapp.net. Unfortunately, Azure uses variable external IP addresses, so there’s no way to use host headers and DNS A records. However, we can create a CNAME (alias) record that maps servername.mydomain.root to xxx.cloudapp.net.
The first step is to log in to your domain services provider. I use DynDNS, so the example will be from there. Simply add a CNAME record that maps your desired address to the Azure service name. The DynDNS example is below.
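Whatever the provider's UI looks like, the record it creates boils down to one line in the zone: the alias on the left, the Azure service name on the right. The sketch below builds that zone-file line; the names and TTL are examples, not my actual records.

```python
# A CNAME is just an alias: requests for the name on the left are
# answered with the target on the right. Names here are examples.
def cname_record(alias, target, ttl=3600):
    return f"{alias}. {ttl} IN CNAME {target}."

print(cname_record("blog.mydomain.com", "sfiwhitepages.cloudapp.net"))
# blog.mydomain.com. 3600 IN CNAME sfiwhitepages.cloudapp.net.
```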
You should now be able to navigate to your blog at the new address. It may take a few minutes for changes to propagate.
We have one step to go. WordPress is answering on this new address, but it will still form all of its URLs using the old one. We need to tell it to use the new address, and we do that by navigating to the admin app. Until we make this change, we need to use the old address, which in my case is http://sfiwhitepages.cloudapp.net/wp-admin.
Once logged in, click on settings along the left menu, and then select General.
Once those changes are made, you can use the new domain exclusively.
That’s it…we’re done! Easy right? It’s worth it. No longer is my blog at the mercy of my local power provider, or any IT maintenance schedules. As I mentioned at the beginning, please feel free to comment with any better approaches, or any egregious errors that I may have made.
Ever since I started this blog, I’ve hosted it internally on our premises. Part of the reason for this was that I wanted to have full control over what was going on with it, and I wanted to work in a familiar environment. For me, that was of course the Microsoft stack. While SharePoint has excellent blogging features, made even better by the Community Kit for SharePoint: Enhanced Blog Edition, my feeling is that its feature set is more applicable to an inside the firewall deployment. Also, if I were to use SharePoint for this purpose, I’d be constantly distracted by the desire to improve upon it.
What I needed was a platform that was focused on blogging, and that I wouldn’t wind up tinkering with too much. I settled on WordPress, which seemed to be very well supported, and quite good at what it did. WordPress had direct integration with Windows Live Writer, and had apps for the iPhone, Blackberry, Android, and now Windows Phone 7.
WordPress natively runs on PHP and MySQL, and typically runs in Linux environments. However, since IIS supports PHP and MySQL runs on Windows, it is possible to get it running in my “familiar environment”. Normally doing this sort of thing is a bear, but by using the Web Platform Installer from Microsoft, the installation was a breeze. All that was necessary was to run it, and select WordPress as a desired application. The installer then took care of downloading PHP, MySQL, WordPress, and integrating them all together. After answering a few account and password questions, I was up and running, and have been ever since.
The one drawback of this approach was that I was hosting it myself, and therefore always concerned with reliability and uptime. More importantly, it has been sharing a server with other applications, and more than once has gone down because another system needed a reboot, crashed, or something. A hosted environment was the obvious answer, and since I’ve been exploring the Azure platform lately, I thought I’d see what was involved. One of the advantages of the MVP program, which I’m newly a part of, is that you are allocated a certain amount of Azure computing hours, so off I went experimenting.
Happily, one weekend later, this blog has been transitioned to a high-speed, highly available platform that, most importantly, I don’t have to maintain. Not only that, but I’ve been able to take MySQL out of the picture completely, and I’m using a SQL Azure database as my data store. I had several false starts right away, and I’m going to document the approach that I took and post it here shortly, but for now, I’m pretty happy with the results.
A few weeks ago I posted an article on how to get on premises SharePoint working with BPOS for mail delivery (alerts, etc.). Historically, inbound email is something that is significantly trickier than outbound, but with hosted Exchange, I’d suggest that the two roles are switched in terms of difficulty. There are however still a couple of extra hoops that have to be jumped through, and I’ll try to guide you through them here.
1. Set Up the SMTP Service
For the purposes of this article, I’m going to assume that you’ve already set up the SMTP service when you set up outgoing mail. If not, I’ll refer you to my article linked above, or to SharePoint George’s post, which will walk you through the requisite steps. Once it’s done for outgoing email, you don’t need to touch it for incoming.
2. Configure the SharePoint Farm to Accept Incoming email
First, you’ll need to navigate to Central Administration, and get into the System Settings section. Once there, select “Configure incoming e-mail settings” in the E-Mail and Text Messages section.
There are a number of settings here that will change a bit from what is the typical guidance out there. I’ll try to explain each configuration item, and what it means. Firstly, I’ll show you a completed configuration:
Enable Incoming E-Mail – Well, that’s pretty straightforward, do I turn on incoming email or not? When you turn it on, SharePoint simply monitors an SMTP drop folder for any messages. If it sees one, it will pick it up, and if the destination name matches a list, it will get delivered. It’s really that simple.
The settings mode lets you choose where the drop folder is. The Automatic setting is normally fine, but if you wanted to use a drop folder in a non-default location, or on another server, you would select Advanced and enter the desired folder. When the configuration is saved, SharePoint will also try to set the appropriate file system rights on that folder (see George’s blog for more details). I set it to Advanced just so I can see the path explicitly.
Directory Management Service – This one normally takes a fair bit of configuration to get working, but when we’re using BPOS, it’s easy – we just set it to No. This is a service that sets up contacts and distribution groups in Exchange, and although we’re using Exchange, it’s hosted, and we don’t have access to that feature. We will be creating these manually.
Incoming E-Mail Server Display Address – This is the domain that the list email addresses will use. We’re going to change this. It will default to servername.domain.com. However, even if that address is available externally, we don’t want to be accepting mail from everyone. The IIS SMTP service has no real spam or virus protection, so we want all of our email to go through our hosted Exchange server. The best approach is to use the same domain as your other BPOS users.
E-Mail Drop Folder – As mentioned above, this is the folder that will be monitored for incoming email. If you don’t know if you should change this, then don’t… the default is likely fine.
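The mechanism behind these settings is simpler than it looks: the timer job reads each message out of the drop folder and routes it by its To: address. The sketch below simulates that matching step locally; the folder, list address, and library name are all made up, and the real timer job naturally does far more (permissions, attachments, cleanup).

```python
import email
import os
import tempfile

# SharePoint's incoming-email timer job watches an SMTP drop folder
# and routes each .eml message by its To: address. This simulates
# just that matching step; addresses and list names are made up.
drop_folder = tempfile.mkdtemp()
lists = {"invoices@mydomain.com": "Invoices Library"}

raw = "To: invoices@mydomain.com\r\nFrom: a@b.com\r\nSubject: Test\r\n\r\nHi"
with open(os.path.join(drop_folder, "msg1.eml"), "w") as f:
    f.write(raw)

for name in os.listdir(drop_folder):
    with open(os.path.join(drop_folder, name)) as f:
        msg = email.message_from_file(f)
    target = lists.get(msg["To"])  # which library gets this message?
    print(target)  # Invoices Library
```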
Once you’re done, click OK to save the configuration. SharePoint is now set up to receive incoming email. Steps 3 and 4 will need to be repeated for every list/library that will accept email.
3. Configure Library to Accept Incoming E-Mail
Navigate to a library that you want to have accept incoming email. From the ribbon, select “Library” (or List..), and then select Library Settings.
Next, under the Communications Column, click the “Incoming e-mail settings” link. You should see a screen similar to the following:
Most of the options are self-explanatory, so I won’t go into detail here. The most important ones are of course in the Incoming E-Mail section, which lets you turn it on or off, and lets you specify the address of the list. The address is important, as it will need to match what we do in BPOS in step 4, and it must also be unique across the farm (and of course the domain). That name can’t be repeated, so choose wisely. A naming policy is a good idea here.
Once you have the settings the way you want them, click OK, and your list is ready to go. Now it’s on to BPOS.
4. Configure the Address in BPOS
This is where it gets interesting. What we want to do is to have BPOS accept email from internal (and possibly external) senders, and then turn around and deliver it to our IIS SMTP service. Usually, we could set up a contact in Exchange and use mail forwarding to do this for us, but there is no mail forwarding capability in BPOS. So how do we accomplish this? Instead of using mail forwarding, we’ll set up a distribution list with one member, and let it work its magic that way.
The first thing that we need to do is to log into the admin portal at http://admin.microsoftonline.com. Once in, click on the Service Settings tab, and then click on the Exchange Online subtab. From the right-hand Actions section, click the “Add new contact” link. You then need to add your contact, which in effect is the library that we enabled in step 3 above:
Most of the fields are cosmetic (they will appear in the GAL), but the most important one for our purposes is the E-Mail address. Note that this address is NOT the same as the one that we configured for the list, but includes the server name as well. This is important, as BPOS needs to deliver the mail to that server. It is also important that the server address is available to BPOS (on a public DNS). This represents one half of the equation. In the next step, we’ll configure BPOS to accept the email for the list’s address by using a distribution list.
Once ready, Save your changes, and then click on the Distribution Lists link on the left of the screen. From the Actions section on the right, click “New distribution list”.
The Email Alias used here must match the one used in step 3 above. The display name is relatively unimportant, but again will be visible in the GAL. Once you save this screen, you should be ready to go.
It’s worthwhile to describe the flow of what happens. When an email is sent from a user, external or internal, the originating server will look for an MX record for the address to the right of the @ symbol. That MX record will point to your BPOS server. The BPOS server will accept the name, as it matches the distribution list that you created in step 4. The message will then be distributed to the members of the list, in this case one member at the precise SMTP address of the server farm. BPOS will send the message to the SMTP server running on the farm, where it will be deposited to the drop folder. Finally, the timer process in SharePoint will pick up the message and deposit it into the appropriate library.