Develop¶
If you have installed the expertsystem in editable mode, it is easy to tweak the source code and try out new ideas immediately, because the source code is considered the ‘installation’.
Conda and VSCode
The easiest way to contribute is by using Conda and Visual Studio Code. In that case, the complete developer install procedure becomes:
git clone https://github.com/ComPWA/expertsystem.git
cd expertsystem
conda env create
conda activate es
pip install -e .[dev]
code . # open folder in VSCode
For more info, see Visual Studio Code.
Automated style checks¶
When working on the source code of the expertsystem, it is highly recommended to install certain additional Python tools. Assuming you installed the expertsystem in editable mode, these additional tools can be installed into your virtual environment in one go:
pip install -e .[dev]
Most of the tools that are installed with this command use specific configuration files (e.g. pyproject.toml for black, .pylintrc for pylint, and tox.ini for flake8 and pydocstyle). These config files define our convention policies, such as PEP 8. If you run into persistent linting errors, this may mean we need to further specify our conventions. In that case, it’s best to create an issue and propose a policy change that can then be formulated in the config files.
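To give a feel for what these files contain, here is a hypothetical fragment of a tox.ini that configures flake8 and pydocstyle. The specific values are illustrative only; the actual conventions are defined in the repository's own config files.

```ini
# Hypothetical tox.ini fragment; the actual values in this repository differ.
[flake8]
max-line-length = 79     # PEP 8 maximum line length
extend-ignore = E203     # example of a check that conflicts with black

[pydocstyle]
convention = pep257      # base convention for docstring checks
```

If a linter keeps flagging code that the team considers fine, a change to a file like this is where the new policy would be formulated.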
Tip
If you have Node.js (npm) on your system, you can run a few additional checks. Install these packages as follows (possibly with administrator rights):
npm install -g cspell markdownlint-cli pyright
Normally, these packages are only run in the CI, but if you have them installed, they are also run when you run tox (local CI).
Note that pyright requires Node.js v12.x (see install instructions here).
Pre-commit¶
All style checks are enforced through a tool called pre-commit. This tool needs to be activated only once, after you clone the repository:
pre-commit install
Upon committing, pre-commit now runs a set of checks as defined in the file .pre-commit-config.yaml over all staged files. You can also quickly run all checks over all indexed files in the repository with the command:
pre-commit run -a
This command is also run on GitHub Actions whenever you submit a pull request, ensuring that all files in the repository follow the conventions set in the config files of these tools.
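For illustration, a minimal .pre-commit-config.yaml could look like the sketch below. The hooks and versions are examples only; the repository's actual file defines its own set.

```yaml
# Hypothetical sketch; see the repository's .pre-commit-config.yaml for
# the actual hooks and pinned versions.
repos:
  - repo: https://github.com/psf/black
    rev: 20.8b1
    hooks:
      - id: black
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v3.4.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
```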
Testing¶
The fastest way to run all tests is with the command:
pytest -n auto
The flag -n auto causes pytest to run with a distributed strategy (provided by the pytest-xdist plugin), spreading the tests over all available CPU cores.
More thorough checks can be run in one go with the following command:
tox -p
This command will run pytest, build the documentation, and verify cross-references in the documentation and the API. It’s especially recommended to run tox before submitting a pull request!
More specialized tox tests are defined in the tox.ini file, under each testenv section. You can list all environments, along with a description of what they do, by running:
tox -av
Try to keep test coverage high. You can compute the current coverage by running
tox -e cov
and opening htmlcov/index.html in a browser. In VSCode, you can visualize which lines in the code base are covered by tests with the Coverage Gutters extension (for this you need to run pytest with the flag --cov-report=xml).
Organizing unit tests
When unit tests are well-organized, you avoid writing duplicate tests. In addition, it allows you to check for coverage of specific parts of the code.
Therefore, when writing new tests, try to follow the module and class structure of the package. For example, put unit tests that test the functions and methods defined in the expertsystem.particle module into a test file called test_particle.py that is placed directly under the tests/unit folder. Similarly, bundle the tests for ParticleCollection under a TestParticleCollection class.
If possible, also try to order the tests alphabetically (that is, following the order of the import statements).
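The layout described above can be sketched as follows. A plain Python set stands in for the actual API, so the test logic is illustrative only; real tests would import from expertsystem.particle instead.

```python
# Sketch of tests/unit/test_particle.py, mirroring the layout of the
# expertsystem.particle module. A plain set stands in for the actual
# ParticleCollection API; the assertions are illustrative only.


class TestParticleCollection:
    # All ParticleCollection tests are bundled in one class, with the
    # test methods ordered alphabetically.

    def test_contains(self):
        collection = {"pi0", "pi+", "K+"}
        assert "pi0" in collection

    def test_filter(self):
        collection = {"pi0", "pi+", "K+"}
        pions = {name for name in collection if name.startswith("pi")}
        assert pions == {"pi0", "pi+"}
```

With this layout, `pytest tests/unit/test_particle.py` runs exactly the tests for that one module, which makes it easy to check its coverage in isolation.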
Documentation¶
The documentation that you find on expertsystem.rtfd.io is built from the documentation source code folder (docs) with Sphinx. Sphinx also builds the API documentation and therefore checks whether the docstrings in the Python source code are valid and correctly interlinked.
You can quickly build the documentation from the root directory of this repository with the command:
tox -e doc
Alternatively, you can run sphinx-build yourself as follows:
cd docs
make html # or EXECUTE_NB= make html
A nice feature of Read the Docs, where we host our documentation, is that the documentation is also built for each pull request, so that you can preview your changes. For more info, see here, or just click “details” under the RTD check once you submit your PR.
We make use of Markedly Structured Text (MyST), so you can write the documentation in either reStructuredText or Markdown. In addition, it’s easy to write (interactive) code examples in Jupyter notebooks and host them on the website (see MyST-NB)!
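As a small illustration of MyST syntax, a Markdown page in the docs can mix plain Markdown with Sphinx roles (a hypothetical snippet; the referenced targets are made up):

```md
<!-- Hypothetical MyST snippet; the referenced targets are made up -->
# Usage

See {mod}`expertsystem.particle` for the particle definitions and
{doc}`usage/workflow` for a worked example.
```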
Jupyter Notebooks¶
The docs/usage folder contains a few notebooks that illustrate how to use the expertsystem. These notebooks are also rendered on the Usage page and are run and tested whenever you make a pull request. As such, they serve both as up-to-date documentation and as tests of the interface.
If you want to improve those notebooks, we recommend working with Jupyter Lab, which is installed with the dev requirements of the expertsystem. Jupyter Lab offers a nicer developer experience than the default Jupyter notebook editor does. In addition, we recommend installing a few extensions:
jupyter labextension install jupyterlab-execute-time
jupyter labextension install @ijmbarr/jupyterlab_spellchecker
jupyter labextension install @aquirdturtle/collapsible_headings
jupyter labextension install @ryantam626/jupyterlab_code_formatter
jupyter labextension install @jupyter-widgets/jupyterlab-manager
jupyter serverextension enable --py jupyterlab_code_formatter
Now, if you want to test all notebooks in the documentation folder and check how they will look in the documentation, you can do so with:
tox -e docnb
This command takes more time than tox -e doc, but it is good practice to do this before you submit a pull request.
Spelling¶
Throughout this repository, we follow American English (en-us) spelling conventions. As a tool, we use cSpell, because it allows you to check variable names in camel case and snake case. This way, a spelling checker helps you avoid mistakes in the code as well!
Accepted words are tracked through the cspell.json file. As with the other config files, cspell.json formulates our conventions with regard to spelling and can be continuously updated while our code base develops. In the file, the words section lists words that you want to see as suggested corrections, while ignoreWords are just the words that won’t be flagged. Try to be sparse in adding words: if some word is specific to just one file, you can ignore it inline, or you can add the file to the ignorePaths section if you want to ignore it completely.
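Schematically, the structure of cspell.json is as follows. The entries shown are placeholders; the real file contains far longer word lists.

```json
{
  "language": "en-US",
  "words": ["expertsystem", "pydocstyle"],
  "ignoreWords": ["htmlcov"],
  "ignorePaths": ["*.bib", ".vscode/*"]
}
```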
It is easiest to use cSpell in Visual Studio Code, through the Code Spell Checker extension: it provides linting, suggests corrections from the words section, and enables you to quickly add or ignore words through the cspell.json file. Alternatively, you can run cSpell on the entire code base (with cspell $(git ls-files)), but for that your system requires npm.
Git and GitHub¶
The expertsystem source code is maintained with Git and published through GitHub. We keep track of issues with the code, documentation, and developer set-up with GitHub issues (see overview here). This is also the place where you can report bugs.
Issue management¶
We keep track of issue dependencies, time estimates, planning, pipeline statuses, et cetera with ZenHub. You can use your GitHub account to log in there and automatically get access to the expertsystem issue board once you are part of the ComPWA organization.
Publicly available are:
Issue labels: help to categorize issues by type (maintenance, enhancement, bug, etc.).
Milestones: way to bundle issues for upcoming releases.
Commit conventions¶
Please use conventional commit messages: start the commit with a semantic keyword (see e.g. Angular or these examples), followed by a colon, then the message. The message itself should be in imperative mood: just imagine the commit giving a command to the code framework. So for instance:
feat: add coverage report tools
or fix: remove ...
Keep pull requests small. If the issue you try to address is too big, discuss in the team whether the issue can be converted into an Epic and split up into smaller tasks.
Before creating a pull request, run tox (see also Testing).
Also use a conventional commit message style for the PR title. This is because we follow a linear commit history and the PR title will become the eventual commit message. Note that a conventional commit message style is enforced through GitHub Actions, as well as through PR labels.
PRs can only be merged through ‘squash and merge’. There, you will see a summary based on the separate commits that constitute this PR. Leave the relevant commits in as bullet points. See the commit history for examples. This comes in especially handy when drafting a release!
Milestones and releases¶
An overview of the expertsystem package releases can be found on the PyPI history page. More descriptive release notes can be found on the release page.
Release notes are automatically generated from the PRs that were merged into the master branch since the previous tag (see latest draft). The changelog there is generated from the PR titles and categorized by issue label. New releases are automatically published to PyPI when a new tag with such release notes is created (see setuptools-scm).
Continuous Integration¶
All style checks, testing of the documentation and links, and unit tests are performed upon each pull request through GitHub Actions (see status overview here). All checks performed for each PR have to pass before the PR can be merged.
Visual Studio code¶
We recommend using Visual Studio Code, as it’s free, regularly updated, and very flexible through its wide range of user extensions.
If you add or open this repository as a VSCode workspace, the file .vscode/settings.json will ensure that you have the right developer settings for this repository. In addition, VSCode will automatically recommend that you install a number of extensions that we use when working on this code base (they are defined in the .vscode/extensions.json file).
You can still specify your own settings in either the user or encompassing workspace settings, since the VSCode settings that come with this repository are folder settings.