In this post, we will look at how to build Docker images in Bitbucket Pipelines. First, we will take a look at the limitations of building Docker images in Bitbucket Pipelines, like lack of multi-platform support, limited caching, and the inability to use large portions of BuildKit. Then we will briefly introduce how Depot eliminates those limitations.
Building a Docker image in Bitbucket Pipelines
Bitbucket Pipelines is a CI/CD service built into Bitbucket. To build Docker images in a pipeline, we can add a bitbucket-pipelines.yml file to the root of our repository with the following contents.
pipelines:
  branches:
    master:
      - step:
          name: Build with Docker
          script:
            - docker build .
          services:
            - docker

Here we are building a Docker image inside our pipeline by enabling the Docker service on the individual step. Note that we don't need to define the Docker service ourselves in the definitions section of our config, because it's one of the default services Bitbucket provides; we only need to reference it on the step. Under the hood, this mounts the docker CLI into the container running our pipeline, allowing us to run any docker command we want inside our pipeline.
If we want to leverage more advanced features, like those found inside BuildKit, we can enable that by adding the DOCKER_BUILDKIT=1 environment variable to our pipeline.
pipelines:
  branches:
    master:
      - step:
          name: Build with Docker
          script:
            - export DOCKER_BUILDKIT=1
            - docker build .
          services:
            - docker

With the new environment variable, docker build will use BuildKit. This workflow is functional and will at least build a Docker image inside Bitbucket Pipelines. Unfortunately, it won't be particularly performant, and there are several limitations we should be aware of.
Limitations of building Docker images in Bitbucket Pipelines
Several limitations to building Docker images in Bitbucket Pipelines make it challenging to build images quickly or leverage more advanced tooling like buildx.
No multi-platform or buildx support
Bitbucket Pipelines doesn't support multi-platform builds. So, for example, you can't build a single image for both Intel & Arm simultaneously. BuildKit itself supports multi-platform builds, and they are available in other CI providers like GitHub Actions, Google Cloud Build, and GitLab CI. They aren't necessarily performant in those providers, but they are supported.
In Bitbucket Pipelines, you can't even attempt a multi-platform build. Below is our bitbucket-pipelines.yml updated with a buildx build that targets both Intel & Arm.
pipelines:
  branches:
    master:
      - step:
          name: Build with Docker
          script:
            - export DOCKER_BUILDKIT=1
            - docker buildx build --platform linux/amd64,linux/arm64 .
          services:
            - docker

But, if we attempt to run this multi-architecture image build in Bitbucket Pipelines, we get the following error:
+ docker buildx build --platform linux/amd64,linux/arm64 .
unknown flag: --platform

Why? Because buildx is completely disabled in Bitbucket Pipelines, so buildx and, with it, multi-platform builds are unavailable.
RUN --mount=type=ssh is disabled
When building images with Bitbucket Pipelines, we can't leverage the SSH mount (RUN --mount=type=ssh) inside our Dockerfile, so there is no way to securely forward SSH credentials into a build step, for example to clone a private repository or install private dependencies.
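For example, a Dockerfile step like the following, which forwards the host's SSH agent into a single RUN instruction to clone a private repository (the repository URL here is just an illustrative placeholder), is the kind of build that won't work in Bitbucket Pipelines:

# syntax=docker/dockerfile:1
FROM alpine:3.19
RUN apk add --no-cache git openssh-client \
    && mkdir -p -m 0700 ~/.ssh \
    && ssh-keyscan github.com >> ~/.ssh/known_hosts
# The SSH agent forwarded via `docker build --ssh default` is mounted only for this step
RUN --mount=type=ssh git clone git@github.com:example/private-repo.git /src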
Caching limitations
We can enable a Docker cache in Bitbucket Pipelines by specifying the caches option in our config file.
pipelines:
  branches:
    master:
      - step:
          name: Build with Docker
          script:
            - docker build .
          services:
            - docker
          caches:
            - docker

The docker cache allows us to leverage the Docker layer cache across builds. However, there are several limitations to this cache.
First, the cache is limited to 1 GB in size. A build's Docker layers are often significantly larger than that, so we rarely get to cache all the layers of a build. This limited cache slows builds down because we can't reuse previous build results, as we saw in using Docker layer caching in GitHub Actions.
Second, the docker cache in Bitbucket Pipelines doesn't work when BuildKit is enabled, so the default cache is off the table once we set DOCKER_BUILDKIT=1. We can work around this limitation with a registry cache, as we've seen in faster Docker image builds in Google Cloud Build: during the build, we specify the --cache-from flag to pull the cache from a registry.
pipelines:
  branches:
    master:
      - step:
          name: Build with Docker
          script:
            - export DOCKER_BUILDKIT=1
            - docker build --cache-from $IMAGE:latest .
          services:
            - docker

Registry caches tend to be slower than having a persistent cache that is immediately available across builds.
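Also note that for --cache-from to find usable cache metadata under BuildKit, the previously pushed image generally needs to be built with inline cache metadata. A minimal sketch of what that could look like, assuming $IMAGE is configured as a repository variable and the step is already authenticated to the registry:

pipelines:
  branches:
    master:
      - step:
          name: Build with Docker
          script:
            - export DOCKER_BUILDKIT=1
            # Embed cache metadata in the pushed image so future builds can use --cache-from
            - docker build --build-arg BUILDKIT_INLINE_CACHE=1 --cache-from $IMAGE:latest -t $IMAGE:latest .
            - docker push $IMAGE:latest
          services:
            - docker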
Faster Docker image builds in Bitbucket Pipelines
These limitations don't prevent us from building a Docker image, but they do prevent us from building a Docker image quickly. Depot provides a drop-in replacement for docker build that allows you to work around these limitations.
To add Depot to our Bitbucket Pipelines, we need to install the depot CLI as part of our step. Here is an updated bitbucket-pipelines.yml file that does exactly that.
pipelines:
  branches:
    master:
      - step:
          name: Build Docker image with Depot
          script:
            - curl -L https://depot.dev/install-cli.sh | DEPOT_INSTALL_DIR=/usr/local/bin sh
            - depot build --platform linux/amd64,linux/arm64 .
          services:
            - docker

We can install depot via curl and place it in the /usr/local/bin directory. Then, we can swap out docker build for depot build, as it's a drop-in replacement that takes all of the same parameters.
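One thing the snippet above glosses over is authentication: the depot CLI needs to reach your Depot project from CI. One way to wire that up, as a sketch, is to store a Depot project token in a secured repository variable named DEPOT_TOKEN (which the CLI reads automatically) and point the build at your project ID (the ID below is a placeholder):

pipelines:
  branches:
    master:
      - step:
          name: Build Docker image with Depot
          script:
            - curl -L https://depot.dev/install-cli.sh | DEPOT_INSTALL_DIR=/usr/local/bin sh
            # DEPOT_TOKEN comes from a secured repository variable; replace the project ID with your own
            - depot build --project your-project-id --platform linux/amd64,linux/arm64 .
          services:
            - docker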
Native BuildKit
By swapping docker build for depot build in our Bitbucket Pipeline, we get a complete native BuildKit environment for both Intel & Arm CPUs and built-in persistent caching on fast NVMe SSDs.
For example, before, we couldn't build a multi-platform image for Intel & Arm. Now we can by leveraging Depot and passing the --platform linux/amd64,linux/arm64 flag. All of the flags you would use with docker build or docker buildx build are natively supported with depot build.
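For instance, to build and push a multi-platform image in a single step (assuming $IMAGE is configured as a repository variable and the pipeline is already logged in to the target registry), the build command could look like:

depot build \
  --platform linux/amd64,linux/arm64 \
  -t $IMAGE:latest \
  --push .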
We couldn't use RUN --mount=type=ssh or BuildKit cache mounts like RUN --mount=type=cache,target=/var/cache/apt. But, with Depot, we get all of that functionality out of the box.
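As a quick illustration, here is a minimal Dockerfile sketch of the kind of cache mount that works with depot build, persisting apt's package cache across builds:

# syntax=docker/dockerfile:1
FROM debian:bookworm-slim
# Keep downloaded .deb files around so the cache mount can persist them between builds
RUN rm -f /etc/apt/apt.conf.d/docker-clean
RUN --mount=type=cache,target=/var/cache/apt \
    --mount=type=cache,target=/var/lib/apt \
    apt-get update && apt-get install -y --no-install-recommends curl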
The Docker layer cache is immediately available across builds when using Depot. So we no longer need a registry cache, nor are we constrained to the 1 GB cache Bitbucket Pipelines provides; instead, we get a 50 GB persistent cache on NVMe SSDs.
Conclusion
Building Docker images in Bitbucket Pipelines works for the initial use case but is not always the most performant option. Some limitations ultimately inhibit our ability to build Docker images as quickly as possible or build for other architectures we need to support. However, there are workarounds, and not everyone needs to build multi-platform images, so it works for some use cases.
However, if you'd like to stick with Bitbucket Pipelines but get faster Docker image builds with native multi-platform support and fast persistent caching, Depot may be an option you want to consider. You can sign up for our 60-minute free tier, and we're always available in our Community Discord to answer any questions you may have.

