I want to build a rubygems mirror accessible to some servers at work (that don't have Internet access), so I started the way everybody seems to:
$ cat gemmirror.config
---
- from: http://gems.rubyforge.org
  to: /data/rubygems/mirror
$ gem mirror --config-file=gemmirror.config
The mirror starts syncing well. OK. But wait, it downloads the whole content of http://gems.rubyforge.org! I mean all the existing versions of every single gem. Wow. After a couple of hours I'm still downloading the gems that begin with the letter "L" ...
Not to mention the disk-space the mirror is going to take.
Now my question: is there a way to set up a "mini" rubygems server, like CPAN::Mini allows for Perl, i.e. a mirror that contains only the latest version of every gem? In 99.9% of cases that's what people want, I guess (at least it would be perfectly good enough for me).
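For illustration, what I have in mind could be scripted by hand: rubygems serves a gzipped Marshal index of only the newest version of each gem. The sketch below is my own assumption about how that would work (the latest_specs.4.8.gz endpoint, the gems/ file layout, and the mirror path are assumptions, not a tested tool):

```ruby
require 'net/http'
require 'zlib'
require 'stringio'
require 'fileutils'

MIRROR_ROOT = '/data/rubygems/mirror' # assumption: same path as in my config above
SOURCE      = 'https://rubygems.org'  # rubyforge.org is long gone

# Turn a latest_specs entry ([name, version, platform]) into the .gem
# filename served under /gems/. The platform suffix is only appended
# for non-default (non-'ruby') platforms.
def gem_filename(name, version, platform)
  platform.to_s == 'ruby' ? "#{name}-#{version}.gem" : "#{name}-#{version}-#{platform}.gem"
end

def mirror_latest
  # latest_specs.4.8.gz is (as I understand it) a gzipped Marshal dump
  # with one entry per gem: only its newest version.
  body  = Net::HTTP.get(URI("#{SOURCE}/latest_specs.4.8.gz"))
  specs = Marshal.load(Zlib::GzipReader.new(StringIO.new(body)).read)

  FileUtils.mkdir_p(File.join(MIRROR_ROOT, 'gems'))
  specs.each do |name, version, platform|
    file = gem_filename(name, version, platform)
    dest = File.join(MIRROR_ROOT, 'gems', file)
    next if File.exist?(dest) # resume-friendly: skip gems we already have
    File.binwrite(dest, Net::HTTP.get(URI("#{SOURCE}/gems/#{file}")))
  end
end

# Network fetch only runs when explicitly requested.
mirror_latest if ENV['RUN_MIRROR']
```

One would then serve MIRROR_ROOT over plain HTTP (gem sources are just static files plus the index).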
I've googled a lot and sadly cannot find any reference to this.
If someone can point me the way to do that, I'd appreciate it very much.
Thanks!
If you need a simple application that makes it easy to create your own RubyGems mirror without having to push gems or list every gem you want in a configuration file, try out Gemirro. It does mirroring without any authentication, and you can add your private gems to the gems directory.
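A minimal sketch of getting it running (the command names below are from memory of Gemirro's README and may differ between versions; check `gemirro --help`):

```shell
# Install and scaffold a mirror directory (generates a commented config.rb)
gem install gemirro
gemirro init /data/rubygems/mirror
cd /data/rubygems/mirror

# Fetch/update mirrored gems and build the index
gemirro update
gemirro index

# Serve the mirror; clients then point their gem sources at this host
gemirro server --start
```

Clients would then add the mirror with `gem sources --add http://yourmirror:PORT/`, using whatever host and port config.rb sets.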
For reference, the official tool is rubygems/rubygems-mirror on GitHub: it provides the `gem mirror` command, which creates local mirrors of all gems from a remote gem source.
Keep in mind there are over 44k gems (rubygems.org/stats) at the moment, which would take up over 100GB of space, so I'd recommend against a full mirror unless you really need it. The `gem mirror` command can be aborted and will continue where it left off, though in my experience resuming isn't very reliable.
Not really what you are asking for, but one way to tackle this is to set up a caching HTTP proxy that caches gems as they are requested, so that subsequent requests hit the cache.
Something like Squid should do the job.
Then each client configures gem to use Squid as its proxy, so all gem downloads go through it and are hopefully in the cache 99% of the time...
Although if you use Bundler, it seems to have issues with proxy settings :(
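For completeness, a sketch of that setup (the squid.conf values and the proxy hostname below are placeholders to adapt, not tested settings):

```shell
# /etc/squid/squid.conf excerpt: raise the object-size limit so large
# .gem files are cached, and keep cached gems for a long time (a
# published .gem file never changes for a given version).
#   maximum_object_size 512 MB
#   cache_dir ufs /var/spool/squid 20000 16 256
#   refresh_pattern -i \.gem$ 10080 90% 43200

# On each client, point gem at the proxy via the environment:
export http_proxy=http://squid.internal:3128
gem install rails

# or pass the proxy explicitly per command:
gem install rails --http-proxy http://squid.internal:3128
```

The environment-variable route also covers other HTTP tools on the same hosts; `--http-proxy` only affects that one gem invocation.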