I'm trying to deploy via docker. I'm using the following workflow:
But docker push takes FOREVER. There are like 30 images, and it has to walk through each one and say "Image already exists". Is there any way to speed this up?
Alternatively, should I be using a different process to deploy?
The most likely reason you are pushing large or redundant layers on every deployment is that your Dockerfiles are not optimized for layer caching. Docker only uploads layers the registry doesn't already have, but if an early layer changes, every layer built after it is invalidated and has to be pushed again.
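For example, ordering instructions from least to most frequently changed keeps the expensive layers cached between deployments. A minimal sketch for a Node.js app (the image, file, and command names here are assumptions for illustration, not from the original post):

    FROM node:18

    WORKDIR /app

    # Copy only the dependency manifests first, so the npm install
    # layer below is reused as long as the dependencies are unchanged.
    COPY package.json package-lock.json ./
    RUN npm install

    # Copy the application source last; editing your code now
    # invalidates only this small final layer, not the install above.
    COPY . .

    CMD ["node", "server.js"]

With this ordering, a typical code-only change re-pushes just the last layer instead of the whole image.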
The progress bars shown during docker push report the uncompressed layer size. The data is compressed before it is sent, so the amount actually uploaded is smaller than the progress bars suggest.
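If you want a rough feel for the difference yourself (myimage:latest is a placeholder for your own image):

    # Uncompressed size, which is what the progress bar reflects:
    docker image inspect myimage:latest --format '{{.Size}}'

    # Approximate size on the wire; gzip stands in for the
    # compression applied to each layer during transfer:
    docker save myimage:latest | gzip -c | wc -c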
By default, docker pull pulls a single image from the registry. A repository can contain multiple images. To pull all images from a repository, pass the -a (or --all-tags) option to docker pull.
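For example, using the public ubuntu repository as a stand-in for your own:

    # Pull every tagged image in the repository:
    docker pull --all-tags ubuntu

    # Equivalent short form:
    docker pull -a ubuntu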
If you are pushing to AWS ECR, like I was, it may be that Docker on your local machine needs a restart. See this thread about AWS ECR slowness:
https://forums.aws.amazon.com/thread.jspa?threadID=222834
This may affect other platforms as well. On Mac, at least around Docker 1.12.1, there seem to be slowness issues that go away after restarting Docker.
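One way to do that restart from the terminal on a Mac (assuming Docker for Mac; the application name is an assumption):

    # Quit Docker for Mac cleanly, then relaunch it:
    osascript -e 'quit app "Docker"'
    open -a Docker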
If you're using a local registry, we recently added a Redis cache which has helped speed things up tremendously. Details on how to set this up are on the registry's GitHub page:
https://github.com/docker/docker-registry
Pushing new images still takes time, but pulls are very fast, since all layers are served from the Redis cache.
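Roughly what the cache settings look like in the registry's config.yml (a sketch based on the project's config_sample.yml; the hosts and ports are example values, not from the original answer):

    # Redis-backed cache for the docker-registry project linked above.
    cache:
        host: localhost
        port: 6379
        db: 0

    # LRU cache for small files, which speeds up reads when using
    # a remote storage backend such as S3.
    cache_lru:
        host: localhost
        port: 6379
        db: 0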