Testing a monorepo in a Docker container

I want to share my experience of setting up testing in a monorepo, without touching on the writing of the tests themselves.

Let's say we have a repository with several applications and a common folder that contains functionality shared by these applications:

.
├── apps
│   ├── app1
│   │   ├── cfg
│   │   │   └── __init__.py
│   │   ├── core
│   │   │   └── __init__.py
│   │   ├── docker-compose.yml
│   │   ├── manage.py
│   │   └── tests
│   │       └── __init__.py
│   ├── app2
│   └── app3
├── common
│   └── __init__.py
├── requirements-dev.txt
└── requirements.txt

When setting up testing, I wanted to achieve the following:

  • when a new feature lands in an application, run that application's tests on several versions of Python

  • when new functionality is added to common, run the tests of all applications

  • on any change to requirements.txt or requirements-dev.txt, rebuild the environments and also run the tests of all applications

These tasks can be solved with tox. Add the file ./tox.ini (valid for tox==3.27.1):

[main]
current_python_env_dir = {toxworkdir}{/}current_python_env_dir
next_python_env_dir = {toxworkdir}{/}next_python_env_dir
report_temp_dir = test{/}temp{/}
pytest_flags = --tb=no

[tox]
skipsdist = True
envlist =
    {next_python, current_python}-{app1, app2, app3}

[testenv]
recreate = False
sitepackages = False
passenv =
    FORCE_COLOR
commands =
    next_python-app1: {env:TOXBUILD:py.test apps{/}app1{/}tests --junitxml={[main]report_temp_dir}test_{envname}.xml {[main]pytest_flags}}
    current_python-app1: {env:TOXBUILD:py.test apps{/}app1{/}tests{/}smoke --junitxml={[main]report_temp_dir}test_{envname}.xml {[main]pytest_flags}}

    next_python-app2: {env:TOXBUILD:py.test apps{/}app2{/}tests --junitxml={[main]report_temp_dir}test_{envname}.xml {[main]pytest_flags}}
    current_python-app2: {env:TOXBUILD:py.test apps{/}app2{/}tests{/}smoke --junitxml={[main]report_temp_dir}test_{envname}.xml {[main]pytest_flags}}

    next_python-app3: {env:TOXBUILD:py.test apps{/}app3{/}tests --junitxml={[main]report_temp_dir}test_{envname}.xml {[main]pytest_flags}}
    current_python-app3: {env:TOXBUILD:py.test apps{/}app3{/}tests{/}smoke --junitxml={[main]report_temp_dir}test_{envname}.xml {[main]pytest_flags}}

[testenv:next_python-{app1, app2, app3}]
basepython = python3.7
envdir = {[main]next_python_env_dir}
deps =
    -Urrequirements-dev.txt

[testenv:current_python-{app1, app2, app3}]
basepython = python3.8
envdir = {[main]current_python_env_dir}
deps =
    -Urrequirements-dev.txt
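The key trick in the commands section is the {env:TOXBUILD:...} substitution: if the TOXBUILD environment variable is set, its value replaces the whole command; otherwise the default (the real pytest invocation) is used. A minimal Python sketch of these semantics (the function name is mine, for illustration only):

```python
import os

def resolve_command(default, var="TOXBUILD"):
    """Mimic tox's {env:VAR:default} substitution: if VAR is set,
    its value becomes the command; otherwise the default is used."""
    return os.environ.get(var, default)

# At image build time the Dockerfile exports TOXBUILD=true, so every
# command collapses to the no-op shell builtin `true` and tox only
# creates the virtualenvs:
os.environ["TOXBUILD"] = "true"
print(resolve_command("py.test apps/app1/tests"))  # -> true

# At test time TOXBUILD is unset and the real command runs:
del os.environ["TOXBUILD"]
print(resolve_command("py.test apps/app1/tests"))  # -> py.test apps/app1/tests
```

This is what lets the same tox.ini serve both for pre-building environments inside the image and for actually running the tests.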

Add a Dockerfile for the image we will run the tests in, ./test/test_tox.Dockerfile:

FROM python:3.7

WORKDIR /tox_workdir
COPY requirements*.txt tox.ini /tox_workdir/

#==================================================================================
# Install the required dependencies
#==================================================================================
RUN apt-get update && apt-get -y --no-install-recommends install \
    cabextract \
    curl \
    ......
    && apt-get clean

#==================================================================================
# Add the web user and run everything as that user from here on
#==================================================================================
RUN addgroup --gid 1000 --system web && adduser --system --gid 1000 --uid 1000 web
RUN chown -R web /tox_workdir
USER web

#==================================================================================
# Install pyenv, use it to install the required Python versions,
#  and set up the paths
#==================================================================================
RUN curl https://pyenv.run | bash \
    && echo 'export PYENV_ROOT="/home/web/.pyenv"' >> ~/.bashrc \
    && echo 'export PATH="/home/web/.local/bin:$PATH"' >> ~/.bashrc \
    && echo 'export PATH="$PYENV_ROOT/shims:$PATH"' >> ~/.bashrc \
    && echo 'export PATH="$PYENV_ROOT/bin:$PATH"' >> ~/.bashrc

RUN . ~/.bashrc && pyenv install 3.8
RUN . ~/.bashrc && pyenv global 3.8
RUN . ~/.bashrc && pip3.8 install --upgrade cython
#==================================================================================
# Install tox and build the environments
#==================================================================================

RUN pip3 install tox==3.27.1 && pip3 install --upgrade cython
RUN . ~/.bashrc && bash -c "export TOXBUILD=true && tox"

Add a script to build this image, ./test/build.sh:

#!/bin/bash
# the build context is the repository root (one level up), so that
# requirements*.txt and tox.ini can be copied into the image
docker build -f test_tox.Dockerfile -t test_tox:latest ..

Thus, after building this image, the container will have two versions of Python installed and two environment folders (one per Python version) with all the packages from requirements-dev.txt already installed. When tests are run in the container, the environments will not be rebuilt.

Next, to run the tests, create the file ./test/docker-compose.yaml:

version: "2"

services:
  db:
    image: postgres
    environment:
      - POSTGRES_USER=root
      - POSTGRES_HOST_AUTH_METHOD=trust
    volumes:
      # init script that creates the databases for the applications;
      # it can live next to this file, in ./test/create_databases.sql
      - ./create_databases.sql:/docker-entrypoint-initdb.d/init.sql
  cache:
    image: memcached:alpine
  test_tox:
    image: test_tox
    volumes:
      - ../:/test
      # this is needed so that the local configs are replaced with configs
      # suitable for running in the container; they can also live
      # next to this file, in ./test/container_cfg
      - ./container_cfg:/test/apps/app1/cfg
      - ./container_cfg:/test/apps/app2/cfg
      - ./container_cfg:/test/apps/app3/cfg
volumes:
  media:

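For illustration, the mounted container_cfg/__init__.py could point the applications at the compose service names instead of localhost. This is a hypothetical sketch with Django-style setting names assumed; the original article does not show the actual config:

```python
# Hypothetical ./test/container_cfg/__init__.py: inside the compose
# network the database and cache are reachable by service name.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "app1_test",
        "USER": "root",
        "HOST": "db",  # the `db` service from docker-compose.yaml
        "PORT": 5432,
    }
}

CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.memcached.PyMemcacheCache",
        "LOCATION": "cache:11211",  # the `cache` service
    }
}
```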
The file ./test/create_databases.sql:

CREATE DATABASE app1_test;
CREATE DATABASE app2_test;
CREATE DATABASE app3_test;

Now the tests can be run via docker-compose. For convenience, let's create the file ./test/Makefile:

# You can set these variables from the command line.
TOX_WORK_DIR=/tox_workdir/.tox
APP1_ENV=-e current_python-app1,next_python-app1
APP2_ENV=-e current_python-app2,next_python-app2
APP3_ENV=-e current_python-app3,next_python-app3
ALL_ENVS=$(APP1_ENV) $(APP2_ENV) $(APP3_ENV)

define run-test
	docker-compose rm --all
	docker-compose up -d db
	docker-compose up -d cache
	docker-compose run --rm test_tox /bin/bash -c ". ~/.bashrc && cd /test/ && tox $1 --workdir $(TOX_WORK_DIR) -q --result-json .tox-result.json" || :
	docker-compose down
	docker-compose rm --all
endef

build:
	./build.sh

app1_test:
	$(call run-test, $(APP1_ENV))

app2_test:
	$(call run-test, $(APP2_ENV))

app3_test:
	$(call run-test, $(APP3_ENV))

tests:
	$(call run-test, $(ALL_ENVS))

all: build tests

.DEFAULT_GOAL := all

With this Makefile we can rebuild the test image and run the tests for the applications we are interested in, for example:

cd test
make build
make app1_test
make tests
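The run-test recipe also writes a machine-readable report to .tox-result.json via the --result-json flag. A small sketch for summarizing it; I am assuming the tox 3 report layout in which each entry under testenvs carries a list of executed test commands with integer retcode fields, so check this against your actual file:

```python
import json

def summarize(report: dict) -> dict:
    """Map each tox env name to True/False: True when every test command
    in that env exited with return code 0."""
    summary = {}
    for env, data in report.get("testenvs", {}).items():
        commands = data.get("test", [])
        summary[env] = bool(commands) and all(
            cmd.get("retcode", 1) == 0 for cmd in commands
        )
    return summary

# Usage after a `make tests` run (the report lands in the repo root):
#   summarize(json.load(open(".tox-result.json")))
```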

The project tree looks like this:

.
├── apps
│   ├── app1
│   │   ├── cfg
│   │   │   └── __init__.py
│   │   ├── core
│   │   │   └── __init__.py
│   │   ├── docker-compose.yml
│   │   ├── manage.py
│   │   └── tests
│   │       └── __init__.py
│   ├── app2
│   └── app3
├── common
│   └── __init__.py
├── requirements-dev.txt
├── requirements.txt
└── test
    ├── Makefile
    ├── build.sh
    ├── container_cfg
    │   └── __init__.py
    ├── create_databases.sql
    ├── docker-compose.yaml
    └── test_tox.Dockerfile

The advantages of this implementation:

  • All testing is performed in a separate container, which already contains everything needed to run tests

  • You can very quickly check all applications when changing common, without going into the test folders of individual subsystems

  • Applications are tested in multiple environments (you can also test against different versions of particular packages)

  • Everything in the ./test/ folder can easily be wired into CI/CD to run the tests in a pipeline

Drawbacks:

  • Cramming two versions of Python into one container

  • Rebuilding the image is slow because the second Python version has to be installed

  • No way to run the tests with extra flags (for example --pdb)

P.S. While writing this article I tried to anonymize the project as much as possible, so minor inaccuracies are possible; the main goal was to convey the idea.
