One of the things I've run into on Windows is a web browser plugin or program you're developing assuming that something is installed which, by default, isn't always present on Windows. A perfect example is .NET - a whole lot of people running Windows XP have never installed any version of .NET, so the installer needs to detect that and remedy it if necessary.
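For what it's worth, on the .NET side that detection usually amounts to a registry check. Here's a minimal sketch in C, assuming the documented "NDP" setup key for .NET 2.0 (v2.0.50727); the exact key path would change for other framework versions:

    #include <windows.h>
    #include <stdio.h>

    /* Sketch: check whether .NET Framework 2.0 is installed by looking for
       its setup key under HKLM and reading the "Install" DWORD.
       Links against advapi32 (the default registry import library). */
    static BOOL IsDotNet20Installed(void)
    {
        HKEY hKey;
        DWORD installed = 0, size = sizeof(installed);
        BOOL result = FALSE;

        if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                          "SOFTWARE\\Microsoft\\NET Framework Setup\\NDP\\v2.0.50727",
                          0, KEY_READ, &hKey) == ERROR_SUCCESS)
        {
            if (RegQueryValueExA(hKey, "Install", NULL, NULL,
                                 (LPBYTE)&installed, &size) == ERROR_SUCCESS
                && installed == 1)
            {
                result = TRUE;
            }
            RegCloseKey(hKey);
        }
        return result;
    }

    int main(void)
    {
        printf(".NET 2.0 %s installed\n",
               IsDotNet20Installed() ? "is" : "is NOT");
        return 0;
    }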
The way I've been testing this in Windows is to have a virtual machine with a snapshot of a clean, patched, but otherwise untouched install of XP or Vista or 7 or whatever. When I'm done testing I just discard any changes since the snapshot. Works great.
I'm now developing something for the Macintosh, a platform that is very new to me, and I'm finding that virtualization does not appear to be an option. Virtualizing the consumer version of Mac OS X is explicitly forbidden by its EULA; only Mac OS X Server may be virtualized, which is no use to me since I'm targeting an end-user product. The one program I've found that can virtualize it - VirtualBox - supports only the Server edition and actively nukes any discussion of running the consumer/client version of Mac OS X. And the only instructions I can find anywhere on the topic seem to involve "hacking" tools, which is very much incompatible with the full-time gig I'm trying to do this for.
So it looks like virtualization is out, but at various points I'm going to want or need to simulate what it's like to install and run this software on a "clean" Macintosh. How do people usually do this? Just buy multiple Macintoshes and use Time Machine? Am I thinking about this all wrong and everything Just Works?
To be clear: as I sit here with umpteen versions of various Windows VMs on my work machine, I never really need that on the Mac. The biggest reason is the way applications are deployed in bundles on the Mac. Ideally, installation is a copy and uninstallation is dragging the app to the trash. On Windows there is much more shared state between applications that must be accounted for, state that most apps simply don't have on the Mac. If you are writing a device driver or VPN client or something else that needs to get into those parts of the system, then you don't have that luxury.
Where I really feel the need for virtualization is when I want to target different versions of OS X, or do some sort of regression check to see whether things really did work differently under version X.Y.Z.
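(When behaviour really does differ between releases, the code under test often ends up asking which version it's running on. A minimal sketch, assuming the Carbon-era Gestalt API, which was a common way to read the version on the releases being discussed here:)

    #include <CoreServices/CoreServices.h>
    #include <stdio.h>

    /* Sketch: read the running Mac OS X version at runtime via Gestalt.
       Build with: cc version.c -framework CoreServices */
    int main(void)
    {
        SInt32 major = 0, minor = 0, bugfix = 0;

        if (Gestalt(gestaltSystemVersionMajor,  &major)  != noErr ||
            Gestalt(gestaltSystemVersionMinor,  &minor)  != noErr ||
            Gestalt(gestaltSystemVersionBugFix, &bugfix) != noErr)
        {
            fprintf(stderr, "could not determine OS version\n");
            return 1;
        }

        printf("Running on Mac OS X %d.%d.%d\n",
               (int)major, (int)minor, (int)bugfix);
        return 0;
    }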
So how do you achieve what you are going for? The default answer is: don't worry about it, because app bundles don't carry the same risks. If you do care, buy a bunch of external hard drives to boot from and swap them out as needed. (I only did that once, for two versions of OS X, so I cannot say it really is industry practice.)
You don't need a second Mac; you could just install a second copy of OS X on another partition or hard drive.