bitbucket pipeline error - container 'docker' exceeded memory limit

I am seeing the error container 'docker' exceeded memory limit while running a Bitbucket pipeline. I tried all the service memory limits described in the documentation below, but the issue was not resolved.

Databases and service containers - Service memory limits

Can you help resolve the issue?

asked Mar 05 '20 by ElasticSearchUser

People also ask

How do I increase my memory on bitbucket?

If you'd like to alter the memory usage of your containers, you have two options: configure (and reduce) the memory your service containers are using (see https://confluence.atlassian.com/bitbucket/use-services-and-databases-in-bitbucket-pipelines-874786688.html), or double the total memory of your build from 4GB to 8GB total ...
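For the first option, service memory is set under definitions in bitbucket-pipelines.yml. A minimal sketch (the mysql service here is just an example):

definitions:
  services:
    mysql:
      image: mysql:5.7
      memory: 512  # MB; lowering a service's share leaves more of the build's total for your step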

What happens when Docker container Hits memory limit?

The --memory parameter limits the container's memory usage, and Docker will kill the container if it tries to use more than that limit.
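In docker run terms (the image name is a placeholder):

# Cap the container at 256 MB of RAM; Docker OOM-kills it beyond that
docker run --memory=256m my-image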

Do Docker containers have a memory limit?

It sets the maximum amount of memory the container can use. If you set this option, the minimum allowed value is 6m (6 megabytes).

Does Docker limit memory by default?

By default, Docker does not apply memory limitations to individual containers. Containers can consume all available memory of the host.


3 Answers

This happens because your build takes more memory than is allocated. To resolve it, add the following to your bitbucket-pipelines.yml:

image: .....

options:          # add this: enable Docker and double the build memory
  docker: true
  size: 2x

pipelines:
  branches:
    master:
      - step:
          caches:
            - ....
          services:   # add this: attach the docker service to the step
            - docker

definitions:      # add this: raise the docker service's memory limit
  services:
    docker:
      memory: 4096
answered Oct 20 '22 by Akshay Sharma

I contacted Bitbucket support, and they provided a solution:

  • at the beginning of the pipeline (before pipelines:)

options:
  docker: true
  size: 2x

  • at every large step:

- step:
    name: XXXX
    image: google/cloud-sdk:latest
    size: 2x
    services:
      - docker

  • at the end of the pipeline:

definitions:
  services:
    docker:
      memory: 4096
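
Putting the three pieces together, the full bitbucket-pipelines.yml looks roughly like this (the step name, branch selection, and script are placeholders; with size: 2x already set globally under options, the per-step size: 2x is only strictly needed if you drop the global one):

options:
  docker: true
  size: 2x

pipelines:
  default:
    - step:
        name: XXXX
        image: google/cloud-sdk:latest
        size: 2x
        services:
          - docker
        script:
          - echo "large build commands here"  # placeholder

definitions:
  services:
    docker:
      memory: 4096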

answered Oct 20 '22 by Küzdi Máté


As mentioned above, you can use size: 2x on a single step to increase the memory limit for that step, or set it under options to enable 2x size for all steps automatically.

However, it is worth noting that a 2x step consumes twice the number of build minutes of a regular step, effectively costing twice as much, as described in the Bitbucket Pipelines documentation.
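For example, if only one step actually needs the extra memory, scoping size: 2x to that step keeps the rest billed at the normal rate (the step names and scripts below are placeholders):

pipelines:
  default:
    - step:
        name: Test              # regular 1x step, billed normally
        script:
          - ./run-tests.sh
    - step:
        name: Build image       # only this step runs, and is billed, at 2x
        size: 2x
        services:
          - docker
        script:
          - docker build -t my-app .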

answered Oct 20 '22 by Aman Sanghvi