commit 22b632fc8cc692e11afd93614a4705267bf0a263
author:    Gary Martin <gjm@apache.org>  Wed Sep 08 01:13:56 2021 +0100
committer: Gary Martin <gjm@apache.org>  Wed Sep 08 01:13:56 2021 +0100
tree:      7067558c9a9796a461e6f7f5a36ba0b4587cda60
parent:    4de51aca83f4f54cf6cb6db295b6fb6c48be78e2
Update models to match legacy db column names
Note that this document describes a new Apache Bloodhound project that is intended to replace the Trac-based version. If you are interested in that version, the appropriate code is available from here.
The new version of Apache Bloodhound is in the bloodhound-core git repository which is mirrored on GitHub here.
If you do not already have the code, you can clone the repo with the following command:
git clone https://github.com/apache/bloodhound-core.git
which will put the code in the bloodhound-core directory.
This version of Apache Bloodhound requires Python, Poetry and Django.
The versions of Python that can be used are currently 3.6, 3.7, 3.8 and 3.9.
Where convenient, it is sensible to use the newest release of Python that you can.
Modern Linux and BSD distributions have new enough Python 3 packages in their repositories, and Python is often already installed unless the installation is minimal. In these cases it will usually be sensible to take advantage of this.
If this is not the case, you can look for advice from:
The project now uses Poetry for Python environment and dependency management.
If you are installing on Linux, it is possible that Poetry is installable from the repositories for your distro. For example, on recent Fedora releases, the following should work:
sudo dnf install poetry
For anywhere else, you can consider following the instructions from the Poetry documentation.
Once Poetry is installed, you can optionally configure it to use a .venv
directory at the root of each Poetry project. This can be helpful because the virtualenv is easier to find, and removing your copy of the git repo will also clean up these files. If this seems useful:
poetry config virtualenvs.in-project true
As Poetry creates and manages Python virtual environments (virtualenvs) for you, it is useful to be aware of how they are used. For convenience, throughout this document, any command that requires the virtualenv to be ‘active’ will be provided with poetry run
before the command. While this may get repetitive, it is robust: such commands work without needing a reminder each time to make sure the virtualenv is activated.
For a little more completeness, the following lists the options along with example sessions, each including a command to demonstrate exiting the virtualenv if applicable:
Prefixing each command with poetry run:

poetry run python --version
poetry run django-admin help

Activating the virtualenv with poetry shell:

poetry shell
python --version
django-admin help
exit

Activating the virtualenv directly (if the virtualenvs.in-project option was set):

source .venv/bin/activate
python --version
django-admin help
deactivate
It should now be possible to use Poetry to install the rest of the project dependencies.
From the root of the project folder (probably bloodhound-core if the above instructions have been followed) run:
poetry install
The basic setup steps to get running are:
poetry run python manage.py makemigrations trackers
poetry run python manage.py migrate
The above will do the basic database setup.
Note that currently models are in flux and, for the moment, no support should be expected for migrations as models change. This will change when basic models gain stability.
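Until migrations are supported, one pragmatic way to cope with model changes is to discard the development database and regenerate everything. A rough sketch, assuming the default settings store the SQLite database in db.sqlite3 at the project root:

```shell
# WARNING: this destroys all local data. Assumes the default SQLite settings
# (db.sqlite3 at the project root) and that migrations can be regenerated.
rm -f db.sqlite3
poetry run python manage.py makemigrations trackers
poetry run python manage.py migrate
```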
For certain operations it will be useful to have accounts and superusers to work with. There are a few ways to add a superuser. For interactive use, the createsuperuser action is usually straightforward enough:
poetry run python manage.py createsuperuser --email admin@example.com --username admin
Entering the password twice at the prompt is currently required. If the --username and --email options are omitted, the command will prompt for these details first.
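For scripted setups, recent Django versions (3.0 and later) also allow creating a superuser non-interactively by supplying the password through the DJANGO_SUPERUSER_PASSWORD environment variable together with --noinput; a sketch, where the credentials are placeholders only:

```shell
# Non-interactive superuser creation (Django 3.0+).
# The username, email and password here are example values only.
DJANGO_SUPERUSER_PASSWORD='change-me' \
    poetry run python manage.py createsuperuser \
    --noinput --username admin --email admin@example.com
```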
The development server can then be started with:
poetry run python manage.py runserver
Amongst the initial output of that command will be something like:
Starting development server at http://127.0.0.1:8000/
Quit the server with CONTROL-C.
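To confirm that the server is actually responding, a quick check from another terminal might look like the following (assuming the default host and port shown above):

```shell
# Smoke test against the development server; assumes the default 127.0.0.1:8000.
curl -i http://127.0.0.1:8000/
```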
Currently, there is not much to see at the specified location. More work has been done on the core API. The following views may be of interest as you explore:
These paths are subject to change.
Unit tests are currently being written with the standard unittest framework. This may be replaced with pytest.
Unit tests are run with the following command:
poetry run python manage.py test trackers
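As an illustration of the style only, a unit test in the trackers app might look like the following sketch. The Ticket model and its summary field are assumptions made for this example, not a description of the actual models:

```python
# Hypothetical example of a unittest-style Django test for the trackers app.
# The Ticket model and its "summary" field are assumed for illustration only.
from django.test import TestCase

from trackers.models import Ticket


class TicketModelTest(TestCase):
    def test_summary_is_stored(self):
        # Create a row and confirm the field round-trips through the ORM.
        ticket = Ticket.objects.create(summary="Example ticket")
        self.assertEqual(ticket.summary, "Example ticket")
```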
The integration tests are based on Selenium and Firefox. For convenience of setup, these tests currently expect to connect to Selenium at
If you have docker-compose installed, the selenium-firefox container can be brought up from the docker directory with:
docker-compose up selenium-firefox -d
Or, with just docker (from any directory):
docker run -d --network host --privileged --name server \
    docker.io/selenium/standalone-firefox
Running the functional tests directly requires a running server, so run
poetry run python manage.py runserver
in one terminal and in a second:
poetry run python functional_tests.py
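For context, a functional test talking to the Selenium container typically builds a remote Firefox driver. A minimal sketch, assuming Selenium 4 and the standalone container listening on its default port 4444, with the development server on port 8000:

```python
# Minimal sketch of driving the standalone-firefox container via Selenium 4.
# Assumes the container from the docker instructions above is listening on
# localhost:4444 and the Django dev server is running on localhost:8000.
from selenium import webdriver
from selenium.webdriver.firefox.options import Options

options = Options()
driver = webdriver.Remote(
    command_executor="http://127.0.0.1:4444",
    options=options,
)
try:
    driver.get("http://127.0.0.1:8000/")
    print(driver.title)
finally:
    # Always release the remote browser session.
    driver.quit()
```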
There are currently not many tests; those that exist are in place to test the setup above, with the expectation that more useful tests will follow in due course.
Fixtures for tests when required can be generated with:
poetry run python manage.py dumpdata trackers --format=yaml --indent=2 > trackers/fixtures/[fixture-name].yaml
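The generated fixtures can be loaded back into a database with Django's standard loaddata command; a sketch, where example.yaml is a placeholder for whatever fixture name you chose:

```shell
# Load a previously dumped fixture; "example.yaml" is a placeholder name.
poetry run python manage.py loaddata trackers/fixtures/example.yaml
```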
While the SQLite database backend is convenient for reducing the complexity of setting up development environments, Django provides us with options to use a range of database backends.
Initially we will concentrate on making it easier to support the PostgreSQL backend.
There are a number of options available to satisfy the dependencies for PostgreSQL support. For convenience we provide two alternatives through our Poetry setup.
A full, production-like installation should use the following steps:
sudo dnf install gcc python3-devel libpq-devel
poetry install --extras=postgres
Alternatively it is possible to avoid providing the build dependencies and instead follow the simplified steps:
poetry install --extras=postgres-binary
While we recommend the first option, particularly for production deployments, the second option may be pragmatic for setting up for development or testing.
Although at this point we should have the ability to connect to a database through Python, we have not addressed actually running a PostgreSQL database.
For development and testing purposes we are going to use containers (docker/podman) for convenience. Other possibilities exist, including installing and configuring postgresql-server, but those are currently beyond the scope of this document.
There is a docker folder at the base of the repo that, with a suitable docker host environment, can be used to start up a PostgreSQL database container.
The docker/db/scripts directory allows for the provision of valid SQL commands in *.sql files that will be copied into the container and used to initialize the database if required.
If you have docker-compose installed, the db container can be brought up from the docker directory with:
docker-compose up db -d
Finally you will need to specify the database to connect to. At the moment this can be achieved by editing the bh_core/settings.py file to change the DATABASES setting to look something like this, depending on the actual connection details:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'bloodhound',
        'USER': 'bloodhound',
        'PASSWORD': 'postgres',
        'HOST': '127.0.0.1',
        'PORT': '5432',
    }
}
Note that this aspect of the setup should be expected to change to smooth over some of the difficulties around editing a file that is in source control.
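One common way to avoid editing a file under source control is to read the connection details from environment variables, falling back to sensible defaults. A sketch of what such a settings fragment could look like; the BH_DB_* variable names are made up for this example and are not part of the project:

```python
# Hypothetical settings fragment: read database settings from the environment,
# falling back to the SQLite defaults. The BH_DB_* names are illustrative only.
import os

DATABASES = {
    "default": {
        "ENGINE": os.environ.get("BH_DB_ENGINE", "django.db.backends.sqlite3"),
        "NAME": os.environ.get("BH_DB_NAME", "db.sqlite3"),
        "USER": os.environ.get("BH_DB_USER", ""),
        "PASSWORD": os.environ.get("BH_DB_PASSWORD", ""),
        "HOST": os.environ.get("BH_DB_HOST", ""),
        "PORT": os.environ.get("BH_DB_PORT", ""),
    }
}
```

With the environment variables unset, this reproduces the default SQLite configuration; exporting the variables switches the backend without touching the file.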
If you have podman instead of docker, the podman command should work as a drop-in replacement for docker commands where these are used in this README.
It should also now be possible to use docker-compose commands directly with a little preparation. Consult this article for details, but note that, at the time of writing, there is an error in the article and you will need to use

export DOCKER_HOST=unix:///run/user/$UID/podman/podman.sock

to successfully run docker-compose commands.