You've developed an awesome Python package for internal use at your company. You want to publish it so your colleagues can start using it. Since the package is for internal use, publishing it on PyPI (the official Python package registry) isn't an option. Instead, because your company uses GCP, the natural choice is Artifact Registry.
Publishing the package to the registry is straightforward, as the documentation explains.
Packaging the Library
I'm using Poetry to package the library. Here are the commands you'll need:
poetry source add --priority=supplemental gcp_registry https://{LOCATION}-python.pkg.dev/{PROJECT_ID}/{REPOSITORY}/
poetry publish --no-interaction --build --repository gcp_registry
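Before publishing, Poetry needs to be able to authenticate against Artifact Registry. Here's a minimal sketch of the setup I'm assuming is already in place (it assumes Poetry 1.2+ and the gcloud CLI, neither of which is covered above):
# Assumed setup, not shown above: Poetry >= 1.2 and the gcloud CLI installed
poetry self add keyrings.google-artifactregistry-auth   # keyring backend so Poetry can authenticate
gcloud auth application-default login                   # set up Application Default Credentials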
Once you've published your package to Artifact Registry, you can make it available as a dependency for other projects.
Installing the Package
To install the package on your local machine, create a requirements_private.txt file:
--index-url https://{LOCATION}-python.pkg.dev/{PROJECT_ID}/{REPOSITORY}/simple/
--extra-index-url https://pypi.org/simple
{YOUR_PACKAGE_NAME}
Then, install the package with:
pip install keyring
pip install keyrings.google-artifactregistry-auth
pip install -r /opt/requirements_private.txt
The keyring packages handle Artifact Registry authentication. Ensure your Application Default Credentials (ADC) are set before proceeding.
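If you'd rather authenticate with a service account than with your own user credentials (obtained via gcloud auth application-default login), you can also point ADC at the key file; the path below is just a placeholder:
# Alternative ADC setup: point at a service account key file (placeholder path)
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/google_service_account.json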
Docker Challenges
When running the app in Docker, you face additional challenges:
- You don't want to copy sensitive information, such as your service account file, into the Docker image.
- You still need to authenticate with Artifact Registry.
The solution is simple but not well-documented. It took me days to figure this out, so I want to save you time and help you implement it in minutes.
The Solution
- Pass the GOOGLE_APPLICATION_CREDENTIALS environment variable during the Docker build, pointing to the path of the service account file (not the file content itself).
- Mount the service account file as a secret at the path specified by GOOGLE_APPLICATION_CREDENTIALS.
- Perform all operations, including installing the keyring packages and the private dependencies, in the same RUN statement. This matters because the secret mount only exists within that single RUN statement.
- Set the mount's mode so the file has read permissions (see mode=0444 in the example below).
Dockerfile Example
Here's how your Dockerfile should look. I'm assuming a plain Python base image here; adjust the FROM line to whatever your application actually uses:
# syntax=docker/dockerfile:1
# Example base image - replace with whatever your app needs
FROM python:3.11-slim
ARG GOOGLE_APPLICATION_CREDENTIALS
COPY requirements_private.txt /opt/requirements_private.txt
# The secret is only mounted for the duration of this single RUN statement,
# so authentication and installation have to happen together here.
RUN --mount=type=secret,id=creds,target=/opt/mykey.json,mode=0444 \
    pip install keyring && \
    pip install keyrings.google-artifactregistry-auth && \
    pip install -r /opt/requirements_private.txt
COPY requirements.txt /opt/requirements.txt
RUN pip install -r /opt/requirements.txt
The requirements_private.txt file stays the same:
--index-url https://{LOCATION}-python.pkg.dev/{PROJECT_ID}/{REPOSITORY}/simple/
--extra-index-url https://pypi.org/simple
{YOUR_PACKAGE_NAME}
As you can see, you can have multiple requirements files. In my case, requirements.txt is for the packages hosted on the public PyPI registry.
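For example, requirements.txt might look something like this (the packages here are hypothetical examples, not part of my project):
# Hypothetical public dependencies installed straight from PyPI
requests
pandas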
And then, your docker-compose.yml file:
services:
  app:
    build:
      context: .
      args:
        - GOOGLE_APPLICATION_CREDENTIALS=/opt/mykey.json
      secrets:
        - creds

secrets:
  creds:
    file: "C:/your/local/host/path/to/google_service_account.json"
Then, to build, you can simply run:
docker compose build
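If you're not using Compose, the equivalent plain docker build call looks roughly like this (the image tag and the key path are placeholders, and BuildKit must be enabled for --mount=type=secret):
# Same build without Compose; BuildKit provides secret mounts
docker build \
  --build-arg GOOGLE_APPLICATION_CREDENTIALS=/opt/mykey.json \
  --secret id=creds,src=/your/local/host/path/to/google_service_account.json \
  -t my-app .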
I hope this article helps you integrate Artifact Registry with Docker.