I have no idea whether I’ll make it to entry #2. But the trials and tribulations of virtualizing my desktop system have been too many to chronicle in one blog post, and it only seems fitting to break it up into a series of posts. I’m doing this because, for almost two years now, I’ve been extolling the virtues of moving to a fully virtualized desktop, all along promising to practice what I preach. So, now that I’m practicing it, I find myself questioning whether I should be preaching it.
Let me be clear before I start on the series. I have no regrets about virtualizing my desktop. I’m glad I did it and would do it again. I know that what I’m doing isn’t exactly what the various virtualization solution providers had in mind (in other words, they’re not exactly targeting normal PC users). But I look forward to the day when virtualizing a computer is more like driving a car with an automatic transmission rather than a stick-shift. NOT that there’s anything wrong with driving a stick-shift (thanks Dad for teaching me to drive one, on a hill). It’s just a little more involved than the automatic. The same goes for virtualization.
Why virtualize? Virtualizing desktops is normally the domain of software developers. It’s not uncommon for a developer to use a virtualization solution like VMware, Xen, or Virtual PC to make their computer behave as though it were actually two (or three or four) independent computers (otherwise known as virtual machines) running side by side, without the machines ever really coming into physical contact with each other — unless it’s across the network, the same way any other two systems might talk or connect to each other.
Developers might, for example, run a Linux-based database server in one virtual machine and a Windows-based desktop in another, so that the desktop talks to the server over a network connection. It’s a great way to test client/server solutions.
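To make that client/server picture concrete, here’s a sketch of one common setup (the name and address below are hypothetical): each virtual machine gets an address on the host’s virtual network, and the desktop VM reaches the database VM by name via an entry in its hosts file.

```
# Hypothetical entry in the Windows desktop VM's hosts file
# (C:\Windows\System32\drivers\etc\hosts). The address is whatever
# the virtual network assigned to the Linux database VM.
192.168.80.10    dbserver
```

From there, the client software on the desktop VM connects to `dbserver` exactly as it would to a physical server on a real LAN.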
But for the average Joe user, virtualization is something of an enigma. They may have heard about it from a friend, seen some hype about virtualization from one of the solution providers, or caught wind of the fact that both Intel and AMD have support for virtualization built right into their consumer-targeted chips. Unlike with developers, however, the benefits aren’t exactly clear. One day, they will be. For now, the question is whether or not it’s worth it given the investment in terms of time, aggravation, and the possibility of having to re-license software you’ve already licensed, not to mention the fact that some things just don’t work well in a virtualized environment, if they work at all.