
Month: June 2010

Moving To Cloud Based Email–My BPOS Story

When I first struck out on my own (OK… some time before I struck out on my own…), I knew that I was going to need to come up with a good email solution. My requirements extended beyond those of the consumer market, and ultimately I needed the power and control that a commercial email system would offer. I really didn’t know Exchange very well, and I wasn’t about to set up a Domino server (which I knew very well), as it was no longer the direction I was heading in.

I signed up with a hosted Exchange provider. This worked quite well and was very reliable, but I quickly bumped into size limitations and integration problems. I think that at the time the maximum mailbox size was 25 MB. I also wanted to gain experience with Exchange, so I bit the bullet and set up a full domain with Exchange 2003 (including a BlackBerry BES server) in my basement. That setup ran (in various guises) from mid 2006 to this past weekend, evolving from multiple Exchange servers on virtual machines (required for remote Exchange access with 2003) to a single Exchange server without the BES after upgrading to Exchange 2007.

Hosting my own Exchange server was instructive, but ultimately a pain. My home internet connection is a consumer plan, and my service provider implemented multiple approaches to prevent any server hosting. This initially included blocking inbound SMTP traffic and ultimately (at a particularly bad time) blocking outbound SMTP. I quickly found workarounds to these problems (if you’re interested, I’ve used DynDNS for years, and I find their service to be exceptional. I’d recommend them in a heartbeat), but each one of these represented a significant drag on my time, and I’m not getting any younger.

In addition to the active blocking attempts, consumer ISP service isn’t exactly industrial grade. To be fair, they don’t claim that it is. In fact, ISPs typically go out of their way not to promise uptime reliability. Far too frequently after an outage, communication or power, my automatic DNS synchronizer wouldn’t update quickly enough and mail flow would be interrupted. Backup was another maintenance headache – yes, it was getting done, but I had to have the infrastructure to support it, and so on. All of this, and a few other things, prompted me to keep an eye open for alternatives.

My company is a Microsoft Online partner. We initially signed up to this program in the early days because of our extensive work with SharePoint, and recently, we have targeted online services as a significant growth area. One of the packages offered in Online Services is BPOS – The Business Productivity Online Suite. Simply put, this is hosted Exchange, SharePoint, Unified Messaging, and Live Meeting. All of this is offered at a very reasonable rate – $12.50 per user per month.

I decided last week to take my home Exchange system and migrate it to BPOS. The process went incredibly smoothly. The BPOS portal lays out all of the steps, but it can be a little confusing. I’ll quickly summarize them below.

1. Sync the Active Directory with BPOS

This sets up a one-way synchronization between your Active Directory and your BPOS Active Directory. To be sure, these are two different directories, and the sync simply allows for easy user maintenance in the cloud. This step is not required for operation, but it is required for mailbox migration. One annoyance here – the synchronization tool must run on a domain-joined Windows server running a 32 bit (!!!) OS. Since I only have 64 bit servers set up, I had to spin up a new one. Ultimately, I would hope this gets replaced by some sort of claims-based model.

2. Set up your domain records

There are a number of steps here that are well documented in the setup section. These steps will allow your Outlook clients to auto discover your hosted Exchange mailboxes.
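As a rough sketch, the heart of this step is a CNAME record that points Autodiscover at the hosted service. Everything below is a placeholder – the exact target hostname to use is given in the BPOS setup documentation for your account:

```
; illustrative zone-file fragment (example.com stands in for your domain)
autodiscover.example.com.   3600   IN   CNAME   autodiscover.your-bpos-endpoint.example.
```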

3. Migrate mailboxes

There is a tool that sets all of the appropriate user records, migrates mailbox content, and sets up email forwarding for the migrated users. It’s a VERY good idea to clean up all of your old junk before migrating. I, of course, didn’t. That said, my largest mailbox (~2 GB) took only about 6 hours to migrate. During the migration period, mail is still delivered to the on premises server, and it is kept both locally and in the cloud for migrated users. If a migration fails, it can be rerun and will pick up from where it left off. Once a user is migrated and tested to be working, you use the tool to remove the mailbox from the on premises server, which will also remove forwarding. All mail will then be delivered to the hosted mailbox.

3.5. Optionally, set up handheld connections to the hosted mailboxes.

4. Set Domain Records

Once all mailboxes have been migrated, set your domain’s MX record to point to the hosted server, and use the administration portal to set the hosted server as authoritative and to allow incoming mail. Once this is done, there will be a lag while the changes propagate through the internet. Mail will not flow for a period of time, so don’t be alarmed.
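In zone-file terms, the MX change looks something like this (both hostnames are placeholders – the actual mail host to point at is provided in the administration portal):

```
; illustrative zone-file fragment
example.com.   3600   IN   MX   10   mail.your-bpos-endpoint.example.
```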

5. Shut down your on premises Exchange server

…and rest peacefully.

Performance on the BPOS system has been great, and there appear to be no capacity issues. The per user mailbox limit can be set on a per person basis and the maximum is 25 GB. My mailbox is less than 2GB, and I do next to nothing to keep it cleaned out.

The only potential problem I see with it is integration. The Hosted server IS out in the cloud in a different domain, and therefore can’t reach back into the internal systems when necessary. For example, if running in a coexistence mode, free/busy time searches won’t work between the two groups of users. Also, on premises servers that need to send email won’t be able to use the hosted server to do so. Again, I hope that the promise of claims based authentication will help to alleviate these issues going forward.

BPOS is still using the 2007 Suite of products… Exchange 2007 and SharePoint 2007. They are slated to be moved to 2010 this fall, and I’m anxious to see what that will bring. When I know, I’ll certainly be posting back here.

I’m very happy with the results I’ve achieved, and heartily recommend BPOS to any small to medium-sized business. In fact, given the cost savings that can be achieved, I can’t see any reason why you wouldn’t want to go this route.


Overriding SharePoint 2010 CSS Classes – Background Images

I just overcame a tough little problem while branding a SharePoint 2010 site. I was trying to override the s4-title class in my themeable CSS, but it just wouldn’t work. SharePoint Designer thought it was OK, my CSS class was loading last, and IE Developer Tools showed it as the active background image. Still no dice.


As is often the case when you’re overriding a class, the parent styles remain in effect until overridden. I just couldn’t figure out what was interfering – the standard style didn’t appear to set a background image. As it turns out, one was in fact declared in corev4.css (the standard set of classes). It didn’t show because it was positioned way above the page:

background:url("/_layouts/images/bgximg.png") repeat-x -0px -1023px;

I’m not sure why they do this (I suspect that it has to do with the theming engine), but my background was inheriting it. Once I added

background-position:0 0;

to my overridden class, my background appeared just fine.
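Putting the pieces together, the override ends up looking something like this (the image path is a hypothetical branding asset of your own; the s4-title class and the corev4.css offset are as described above):

```css
/* Override the out-of-box s4-title background.
   corev4.css declares a sprite positioned at -1023px, which pushes
   it out of view; without resetting background-position, an
   overriding image inherits that offset and never appears. */
.s4-title {
    background-image: url("/Style Library/MyBranding/titlebg.png"); /* hypothetical path */
    background-position: 0 0; /* cancel the -1023px offset from corev4.css */
}
```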


Hope this helps someone.


SharePoint 2010 Page Layouts – What’s this UIVersionedContent all about?

If you work with the publishing features of SharePoint at all, or you do much branding, you’ve undoubtedly run into the UIVersionedContent control.

So what does this thing do? Simply put, it allows the SharePoint visual upgrade feature to work. When a site collection is upgraded from SharePoint 2007 (depending on the options selected), the sites themselves may wind up looking pretty much the same as they did before the upgrade. That’s because the SharePoint team didn’t want to break any customizations, force users in specific teams to deal with new design elements, or let those concerns hold up any upgrades. They therefore introduced Visual Upgrade, which allows sites (not site collections) to be upgraded one at a time.

However, if my site using the 2010 features shares a master page or page layout with a site that uses the 2007 features, how will that work? That’s where this control comes into play. The control simply contains a ContentTemplate control, which in turn contains the markup to be used. It also possesses an attribute, “UIVersion”, which is set to either 3 or 4. These numbers correspond to the old WSS versioning system (WSS 3, SharePoint Foundation 4), and the control will render the contained markup only if the version of the site matches the attribute.
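A typical pairing in a master page or page layout looks roughly like this (the control, its UIVersion attribute, and the nested ContentTemplate are as described above; the inner comments stand in for real markup):

```
<SharePoint:UIVersionedContent UIVersion="3" runat="server">
    <ContentTemplate>
        <!-- rendered only on sites still using the 2007 (v3) UI -->
    </ContentTemplate>
</SharePoint:UIVersionedContent>
<SharePoint:UIVersionedContent UIVersion="4" runat="server">
    <ContentTemplate>
        <!-- rendered only on sites upgraded to the 2010 (v4) UI -->
    </ContentTemplate>
</SharePoint:UIVersionedContent>
```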

You’ll normally see these controls in pairs, giving an either/or type capability, but there’s no need to restrict them to this.

It’s actually a pretty slick system, but it does add a lot of text to the page layouts. One annoying thing is that all of the V3 supporting code is there even on a brand new install of 2010. This makes sense, because you can introduce a V3 content database into the mix at any time, and you never know when you might need the support. However, if you know that the master pages and page layouts you’ll be working with will only be used by V4 content, you can feel free to go ahead and remove the V3 tags. Before you do, though, make sure that you’re not editing the system default masters/layouts. Always create new ones and do your customization there.

I’ve not seen any other values for the attribute besides 3 and 4 – these are processed by the server accordingly. I’m intrigued by the development possibilities, though. Ideally, this could support an environment where I can register a “version” that my site could select to use. This would be much cleaner than keeping multiple master pages for variations in branding, or to support micro sites. It also might be a better model in the WCM world for multilingual support. I have no idea if that’s the plan, but to me it would make sense.


Storing Data In The Cloud

Last week, my colleague Ed Senez posted a very good article about cloud computing and its benefits. Our company has been making moves toward the cloud for a couple of years now, with both Microsoft’s BPOS offering and our own SharePoint Extranet Accelerator. While companies struggle with the benefits and risks of moving pieces of their business to the cloud, I can see a huge role for the cloud in the consumer space, primarily because it is so cost effective. I have been moving a lot of my personal data to the cloud for the past little while, and I thought that I would share my current observations.

Photos and Videos

Almost any Facebook user is familiar with posting pictures. The social functionality is great – tagging people lets all their friends know that they are in a new picture (maybe not so great if you don’t like the picture, but I digress….). YouTube is of course great for uploading and sharing videos, but both of these services have one drawback – they convert the files on upload resulting in a loss of fidelity. If you care about the quality of your source content, you can’t rely on these services for backup.

This fact led me a few months back to Flickr. At first look, Flickr had a lot of limitations too – a maximum file size, and a maximum upload rate per month – which initially caused me to dismiss it. What I found out was that with the subscriber version there are no limits at all – you can upload to your heart’s content, and it will store the images in their true source format. I have been doing just that when I could for the past few weeks, and currently have over 2000 pictures in my photostream. Just 8000 or so to go.

Flickr also allows you to share your pictures publicly, with family and friends, or just keep them private. However, Flickr doesn’t have Facebook’s ubiquity, so I use it for purely public pictures only, and continue to rely on Facebook primarily for sharing and people tagging. Flickr does allow for videos as well, but it does have some size limits, so I will be relying on YouTube for sharing my videos, along with a separate backup strategy (see below) as I get my videos organized.

So how much does this cost? For $25 per year, I know that all of my personal pictures are backed up. Pictures are quite literally irreplaceable. Documents can be recreated, but you’ll never have a chance to capture those precise moments again. The fact that I can use the service to share pictures (in full source quality) is really just a bonus.

Simple Storage with SkyDrive

Did you know that you have 25 GB of storage in the cloud that you can use free of charge? If you have a Windows Live ID (also free), then you do. It’s called SkyDrive, and it’s extremely handy. Simply upload your files to private, shared, or public folders and they’re safely secured away and accessible from any machine with a web browser. Because SkyDrive also uses WebDAV, you can map your SkyDrive folders directly to folders on your computer.
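As a sketch of that WebDAV mapping on Windows, something like the following maps a folder to a drive letter (the URL and account are placeholders – the real folder address is the one your browser shows when you open the SkyDrive folder):

```
net use Z: "https://docs.live.net/your-cid/Documents" /user:you@example.com
```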

When you are navigating through your SkyDrive, you also have access to the recently released Office Web Applications. These are light, browser-only versions of Microsoft Word, Excel, PowerPoint, and OneNote, and they’re completely free of charge. You can create a new document using these apps, or edit anything that you upload. They are very handy for occasional use, for viewing purposes, or just for accessing an Office document that may have been sent to you when you don’t have the Office applications readily available.

SkyDrive should pretty much eliminate the need for FTP servers, certainly for personal use. Given the cost of the service ($0.00), I really don’t see why someone wouldn’t want to take advantage of it.

Backup

I think that everyone who has used a computer for any amount of time has at some point lost data. Afterwards, there is a mad rush to back up the systems, and then to make sure that there is a system in place to back everything up. Corporations typically have solid backup strategies in place (that aren’t tested frequently enough, in my opinion), but personal users are often too busy to ensure that their data is backed up in a timely fashion. There are a ton of consumer backup products out there, but they all tend to have one fatal flaw: they require the user to actually do something to make them work.

This is where the cloud can be of great help. If we can assume that the machine will typically have a connection to the internet, then for all intents and purposes, our backup destination is always available. All that is needed is a good service to make this painless and automatic for the end user. There are a number of such providers out there, and I’m going to briefly discuss the one that I’ve settled on – Carbonite.

With Carbonite, you download a small application that runs in the background and constantly ensures that your files are being backed up. For most users it is as simple as a next-next install, which will back up all standard data folders. If you want to back up a non-standard folder, just right click on it and choose to add it to the backup. You can always see what the backup status is from the console, but Carbonite also (optionally) places a small indicator over the icon of each file to let you know its backup status. The backed up files are also browser-accessible from any internet connected PC, allowing you to access your files in a pinch. One of the nicest features is that it not only keeps a mirror image of your system off site, it maintains file versioning – so when you make a change to a file and later decide that it wasn’t such a good idea, you can retrieve a previous version.

Given most end users’ bandwidth constraints, the initial backup can take a little while. Mine took two weeks, but that’s me. After the initial backup, it all goes very rapidly. So what’s the cost of all of this storage? You can back up as much as you want from a single machine for $55 US per year. To me, that’s a no-brainer.


I spend about 5-10% of my time inside my company’s firewall. Tools to help with remote connectivity are crucial, and I really see a place for cloud based services to provide a lot of these tools. They’re safe, they’re easy, they’re useful, and they’re highly cost effective. In storage alone, I now back up all of my important personal data (redundantly, I might add) and enhance my convenience in accessing it. All for less than $100/year.

I’m sold.


Now THAT’S Planning Ahead – Next SharePoint Conference Announced

The next SharePoint Conference has been announced…..for fall 2011. I’m guessing the folks at Microsoft got tired of the “when will the next conference be” question. I can’t even get airline tickets that far ahead.

It does seem that they’re not taking the annual approach to the product specific conferences any more. It was 18 months between the last 2 SP Conferences, and the BI Conference is being held at Tech Ed right now – the previous one was October 2008.

Looks like they’re pretty confident of future demand! The precise dates are Oct 3-6, 2011, in Anaheim, California.
