I currently use Amazon S3 as a backup location for my local machines, primarily using JungleDisk to back up all my local files to my S3 account nightly.
I have been looking at creating a more intelligent backup solution for remote files: websites on Windows and Linux boxes, along with SQL Server and MySQL databases.
I have been pondering the idea of creating an Amazon EC2 instance that operates in the cloud as my remote backup machine. Ideally, a script or schedule would trigger the machine to start; once running, it would execute a set of tools or scripts to connect to my web servers and back everything up to my Amazon S3 account. When the backups are complete, the instance could shut itself down to avoid paying for idle running time.
Am I dreaming? Is this a possibility? Can anyone point me in the right direction?
Thanks,
GW
I had the exact same thought about using an EC2 instance as my dynamic rsync machine. I just wrote an entry about it on our blog, but basically the solution is indeed to create an EBS volume rather than backing up to S3 directly, dynamically start and stop an EC2 instance, mount the volume, and rsync to it whenever you want to back up.
See my entry here: Using Amazon EC2/EBS/S3 for automated backups
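If you want to script that start/mount/rsync/stop cycle, here is a minimal sketch using the Python boto library. The instance ID, volume ID, region, device name, and paths are placeholders you would replace with your own; it also assumes the instance is EBS-backed (so it can be stopped and started) and that it mounts the attached volume itself on boot.

```python
# Hypothetical start -> attach -> rsync -> stop cycle using boto.
# AWS credentials are read from the environment or ~/.boto.
import subprocess
import time

import boto.ec2

INSTANCE_ID = "i-12345678"   # placeholder backup instance
VOLUME_ID = "vol-12345678"   # placeholder EBS volume for backups

conn = boto.ec2.connect_to_region("us-east-1")

# Start the stopped instance and wait until it is running.
conn.start_instances([INSTANCE_ID])
instance = conn.get_all_instances(instance_ids=[INSTANCE_ID])[0].instances[0]
while instance.state != "running":
    time.sleep(10)
    instance.update()

# Attach the backup volume; the instance is assumed to mount /dev/sdf
# (e.g. via an fstab entry or a boot script).
conn.attach_volume(VOLUME_ID, INSTANCE_ID, "/dev/sdf")

# Push local files to the mounted volume over SSH.
subprocess.check_call([
    "rsync", "-az", "--delete",
    "/data/to/backup/",
    "ec2-user@%s:/mnt/backup/" % instance.public_dns_name,
])

# Detach the volume and stop the instance so it stops accruing hourly charges.
conn.detach_volume(VOLUME_ID)
conn.stop_instances([INSTANCE_ID])
```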
What sort of pointers are you looking for? This sounds like a perfectly workable idea. There are a variety of libraries for accessing both EC2 and S3 from a scripting language such as Python or Ruby. You would create a machine instance that starts up, reads a configuration file to find out which machine(s) to connect to, logs in remotely, fetches any new versions of files or database dumps, uploads them to S3, and finally shuts itself down.
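As a rough illustration of that flow, here is a sketch of such an on-instance script in Python with boto. The config file format, hostnames, bucket name, and the mysqldump-over-SSH step are assumptions made for the example, not a prescription.

```python
# Hypothetical backup runner: read a config file, pull fresh database
# dumps from each server over SSH, push them to S3, then power off.
import json
import os
import subprocess
from datetime import date

import boto
from boto.s3.key import Key

CONFIG_FILE = "/etc/backup-targets.json"   # assumed config location
BUCKET_NAME = "my-backup-bucket"           # assumed S3 bucket name

with open(CONFIG_FILE) as f:
    # e.g. [{"host": "web1.example.com", "db": "blog"}]
    targets = json.load(f)

s3 = boto.connect_s3()
bucket = s3.get_bucket(BUCKET_NAME)

for target in targets:
    dump_file = "/tmp/%s-%s.sql.gz" % (target["host"], date.today())

    # Dump the remote MySQL database over SSH, compressing on the far side.
    with open(dump_file, "wb") as out:
        subprocess.check_call(
            ["ssh", target["host"], "mysqldump %s | gzip" % target["db"]],
            stdout=out)

    # Upload the dump to S3 under a per-host prefix.
    key = Key(bucket, "backups/%s/%s" % (target["host"],
                                         os.path.basename(dump_file)))
    key.set_contents_from_filename(dump_file)

# Finally, shut the machine down so it stops accruing charges.
subprocess.check_call(["sudo", "shutdown", "-h", "now"])
```

A cron job or a small local script using the same EC2 API calls would take care of starting the instance on schedule.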