# Contributing
Thank you for your interest in contributing to MQT Qudits! This document outlines how to contribute and the development guidelines.
We use GitHub to host code, to track issues and feature requests, as well as accept pull requests. See https://docs.github.com/en/get-started/quickstart for a general introduction to working with GitHub and contributing to projects.
## Types of Contributions
Pick the path that fits your time and interests:
- 🐛 Report bugs: Use the 🐛 Bug report template at https://github.com/munich-quantum-toolkit/qudits/issues. Include steps to reproduce, expected vs. actual behavior, environment, and a minimal example.
- 🛠️ Fix bugs: Browse issues, especially those labeled “bug”, “help wanted”, or “good first issue”. Open a draft PR early to get feedback.
- 💡 Propose features: Use the ✨ Feature request template at https://github.com/munich-quantum-toolkit/qudits/issues. Describe the motivation, alternatives considered, and (optionally) a small API sketch.
- ✨ Implement features: Pick items labeled “feature” or “enhancement”. Coordinate in the issue first if the change is substantial; start with a draft PR.
- 📝 Improve documentation: Add or refine docstrings, tutorials, and examples; fix typos; clarify explanations. Small documentation-only PRs are very welcome.
- ⚡️ Performance and reliability: Profile hot paths, add benchmarks, reduce allocations, deflake tests, and improve error messages.
- 📦 Packaging and tooling: Improve build configuration, type hints/stubs, CI workflows, and platform wheels. Incremental tooling fixes have a big impact.
- 🙌 Community support: Triage issues, reproduce reports, and answer questions in Discussions: https://github.com/munich-quantum-toolkit/qudits/discussions.
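The bug-report template above asks for environment details alongside a minimal example. A small sketch of collecting that information programmatically (the distribution name `mqt.qudits` used for the version lookup is an assumption; adapt it to however you installed the package):

```python
# Collect environment details for a bug report. The distribution name
# "mqt.qudits" is assumed; adjust it if your installation differs.
import platform
import sys
from importlib import metadata

print("Python:", sys.version.split()[0])
print("Platform:", platform.platform())
try:
    print("mqt.qudits:", metadata.version("mqt.qudits"))
except metadata.PackageNotFoundError:
    print("mqt.qudits: not installed")
```

Pasting this output into the issue saves a round-trip with the maintainers.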
## Guidelines
Please adhere to the following guidelines to help the project grow sustainably.
### Core Guidelines
- Write meaningful commit messages, preferably using gitmoji for additional context.
- Focus on a single feature or bug at a time and only touch relevant files. Split multiple features into separate contributions.
- Add tests for new features to ensure they work as intended.
- Document new features.
- Add tests for bug fixes to demonstrate the fix.
- Document your code thoroughly and ensure it is readable.
- Keep your code clean by removing debug statements, leftover comments, and unrelated code.
- Check your code for style and linting errors before committing.
- Follow the project’s coding standards and conventions.
- Be open to feedback and willing to make necessary changes based on code reviews.
### Pull Request Workflow
- Create PRs early. Work-in-progress PRs are welcome; mark them as drafts on GitHub.
- Use a clear title, reference related issues by number, and describe the changes. Follow the PR template; only omit the issue reference if not applicable.
- CI runs on all supported platforms and Python versions to build, test, format, and lint. All checks must pass before merging.
- When ready, convert the draft to a regular PR and request a review from a maintainer. If unsure, ask in the PR comments. If you are a first-time contributor, mention a maintainer in a comment to request a review.
- If your PR gets a “Changes requested” review, address the feedback and push updates to the same branch; do not close the PR and open a new one. Reply to the comments to signal that you have addressed the feedback, but do not resolve review comments yourself; the reviewer will do so once satisfied.
- Re-request a review after pushing changes that address the feedback.
- Do not squash commits locally; maintainers typically squash on merge. Avoid rebasing or force-pushing before reviews; you may rebase after addressing feedback if desired.
## Get Started 🎉
Ready to contribute? We value contributions from people with all levels of experience. In particular, if this is your first PR, not everything has to be perfect. We will guide you through the process.
### Installation
Check out our installation guide for developers for instructions on how to set up your development environment.
## Working on the C++ Library

Building the project requires a C++20-capable compiler and CMake 3.24 or newer. As of August 2025, our CI pipeline on GitHub continuously tests the library across the following matrix of systems and compilers:

- `ubuntu-24.04`: `Release` and `Debug` builds using `gcc`
- `ubuntu-24.04-arm`: `Release` build using `gcc`
- `macos-14`: `Release` and `Debug` builds using `AppleClang`
- `windows-2022`: `Release` and `Debug` builds using `msvc`
- `windows-11-arm`: `Release` build using `msvc`
To access the latest build logs, visit the GitHub Actions page.
Additionally, we regularly run extensive tests with an even wider matrix of compilers and operating systems. We are not aware of any issues with other compilers or operating systems. If you encounter any problems, please open an issue and let us know.
### Configure and Build
Tip
We recommend using an IDE like CLion or Visual Studio Code for development. Both IDEs have excellent support for CMake projects and provide a convenient way to run CMake and build the project. If you prefer to work on the command line, the following instructions will guide you through the process.
Our projects use CMake as the main build configuration tool. Building a project using CMake is a two-stage process. First, CMake needs to be configured by calling:
$ cmake -S . -B build -DCMAKE_BUILD_TYPE=Release
This tells CMake to

- search the current directory `.` (passed via `-S`) for a `CMakeLists.txt` file,
- process it into a directory `build` (passed via `-B`), and
- configure a `Release` build (passed via `-DCMAKE_BUILD_TYPE`) as opposed to, e.g., a `Debug` build.
After configuring CMake, the project can be built by calling:
$ cmake --build build --config Release
This builds the project in the `build` directory (passed via `--build`).
Some operating systems and development environments explicitly require a configuration to be set, which is why the `--config` flag is also passed to the build command.
The flag `--parallel <NUMBER_OF_THREADS>` may be added to trigger a parallel build.
Building the project this way generates

- the main project libraries in the `build/src` directory and
- some test executables in the `build/test` directory.
Note
This project uses CMake’s `FetchContent` module to download and build its dependencies.
Because of this, the first time you configure the project, you will need an active internet connection to fetch the required libraries.
However, there are several ways to bypass these downloads:

- Use system-installed dependencies: If the dependencies are already installed on your system and Find-modules exist for them, `FetchContent` will use those versions instead of downloading them.
- Provide a local copy: If you have local copies of the dependencies (from a previous build or another project), you can point `FetchContent` to them by passing the `-DFETCHCONTENT_SOURCE_DIR_<uppercaseName>` flag to your CMake configure step. The `<uppercaseName>` should be replaced with the name of the dependency as specified in the project’s CMake files.
- Use project-specific options: Some projects provide specific CMake options to use a system-wide dependency instead of downloading it. Check the project’s documentation or CMake files for these types of flags.
### Running the C++ Tests and Code Coverage
We use the GoogleTest framework for unit testing of the C++ library.
All tests are contained in the `test` directory, which is further divided into subdirectories for different parts of the library.
You are expected to write tests for any new features you implement and ensure that all tests pass.
Our CI pipeline on GitHub will also run the tests and check for any failures.
It will also collect code coverage information and upload it to Codecov.
Our goal is to have new contributions at least maintain the current code coverage level, while striving for covering as much of the code as possible.
Try to write meaningful tests that actually test the correctness of the code and not just exercise the code paths.
Most IDEs like CLion or Visual Studio Code provide a convenient way to run the tests directly from the IDE. If you prefer to run the tests from the command line, you can use CMake’s test runner CTest. To run the tests, run the following command from the main project directory after building the project as described above:
$ ctest -C Release --test-dir build
Tip
If you want to disable configuring and building the C++ tests, you can pass `-DBUILD_MQT_QUDITS_TESTS=OFF` to the CMake configure step.
### C++ Code Formatting and Linting
This project mostly follows the LLVM Coding Standard, which is a set of guidelines for writing C++ code. To ensure the quality of the code and that it conforms to these guidelines, we use:

- `clang-tidy`, a static analysis tool that checks for common mistakes in C++ code, and
- `clang-format`, a tool that automatically formats C++ code according to a given style guide.

Common IDEs like CLion or Visual Studio Code have plugins that can automatically run `clang-tidy` on the code and automatically format it with `clang-format`.
- If you are using CLion, you can configure the project to use the `.clang-tidy` and `.clang-format` files in the project root directory.
- If you are using Visual Studio Code, you can install the clangd extension.

They will automatically execute `clang-tidy` on your code and highlight any issues.
In many cases, they also provide quick-fixes for these issues.
Furthermore, they provide a command to automatically format your code according to the given style.
Note
After configuring CMake, you can run `clang-tidy` on a file by calling the following command:
$ clang-tidy <FILE> -- -I <PATH_TO_INCLUDE_DIRECTORY>
Here, `<FILE>` is the file you want to analyze and `<PATH_TO_INCLUDE_DIRECTORY>` is the path to the `include` directory of the project.
Our `pre-commit` configuration also includes `clang-format`.
If you have installed `pre-commit`, it will automatically run `clang-format` on your code before each commit.
If you do not have `pre-commit` set up, the pre-commit.ci bot will run `clang-format` on your code and automatically format it according to the style guide.
Tip
Remember to pull the changes back into your local repository after the bot has formatted your code to avoid merge conflicts.
Our CI pipeline will also run `clang-tidy` over the changes in your PR and report any issues it finds.
Due to technical limitations, the workflow can only post PR comments if the changes are not coming from a fork.
If you are working on a fork, you can still see the `clang-tidy` results either in the GitHub Actions logs, on the workflow summary page, or in the “Files changed” tab of the PR.
### C++ Documentation
Historically, the C++ part of the code base has not been sufficiently documented. Given the substantial size of the code base, we have set ourselves the goal to improve the documentation over time. We expect any new additions to the code base to be documented using Doxygen comments. When touching existing code, we encourage you to add Doxygen comments to the code you touch or refactor.
For some tips on how to write good Doxygen comments, see the Doxygen Manual.
The C++ API documentation is integrated into the overall documentation that we host on ReadTheDocs using the breathe extension for Sphinx. See Working on the Documentation for more information on how to build the documentation.
## Working on the Python Package
We use `pybind11` to expose large parts of the C++ core library to Python.
This allows us to keep the performance-critical parts of the code in C++ while providing a convenient interface for Python users.
All code related to C++-Python bindings is contained in the `bindings` directory.
Tip
To build only the Python bindings, pass `-DBUILD_MQT_QUDITS_BINDINGS=ON` to the CMake configure step.
CMake will then try to find Python and the necessary dependencies (`pybind11`) on your system and configure the respective targets.

In CLion, you can enable an option to pass the current Python interpreter to CMake.
Go to `Preferences` -> `Build, Execution, Deployment` -> `CMake` -> `Python Integration` and check the box `Pass Python Interpreter to CMake`.
Alternatively, you can pass `-DPython_ROOT_DIR=<PATH_TO_PYTHON>` to the configure step to point CMake to a specific Python installation.
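If you work on the command line instead, one way to obtain a value for `-DPython_ROOT_DIR` is the `sys.prefix` of the currently active interpreter. A sketch (the composed command is illustrative, not prescriptive):

```python
# Build a configure command that points CMake at the currently active Python.
# sys.prefix is the root of the active (virtual) environment.
import sys

configure_cmd = (
    "cmake -S . -B build "
    "-DBUILD_MQT_QUDITS_BINDINGS=ON "
    f"-DPython_ROOT_DIR={sys.prefix}"
)
print(configure_cmd)
```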
The Python package itself lives in the `python/mqt/qudits` directory.
We recommend using `nox` for development.
`nox` is a Python automation tool that allows you to define tasks in a `noxfile.py` file and then run them with a single command.
If you have not installed it yet, see our installation guide for developers.

We define four convenient `nox` sessions in our `noxfile.py`:

- `tests` to run the Python tests
- `minimums` to run the Python tests with the minimum dependencies
- `lint` to run the Python code formatting and linting
- `docs` to build the documentation
These are explained in more detail in the following sections.
### Running the Python Tests
The Python code is tested by unit tests using the `pytest` framework.
The corresponding test files can be found in the `test/python` directory.
A `nox` session is provided to conveniently run the Python tests.
$ nox -s tests
This command automatically builds the project and runs the tests on all supported Python versions.
For each Python version, it will create a virtual environment (in the `.nox` directory) and install the project into it.
We take extra care to install the project without build isolation so that rebuilds are typically very fast.
If you only want to run the tests on a specific Python version, you can pass the desired Python version to the `nox` command.
$ nox -s tests-3.12
Note
If you do not want to use `nox`, you can also run the tests directly using `pytest`.
This requires that you have the project and its test dependencies installed in your virtual environment (e.g., by running `uv sync`).
(.venv) $ pytest
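New Python tests follow the usual `pytest` conventions: a `test_*.py` file containing `test_*` functions with plain `assert` statements. A self-contained sketch of that structure (the helper below is purely hypothetical and not part of the MQT Qudits API):

```python
# Sketch of a pytest-style unit test. hilbert_dimension is a hypothetical
# helper used only to illustrate the layout of tests under test/python.
def hilbert_dimension(qudit_dims: list[int]) -> int:
    """Return the total Hilbert-space dimension of a qudit register."""
    total = 1
    for dim in qudit_dims:
        total *= dim
    return total


def test_hilbert_dimension() -> None:
    # A qutrit-qutrit register has 3 * 3 = 9 basis states.
    assert hilbert_dimension([3, 3]) == 9
    # An empty register has a single (trivial) basis state.
    assert hilbert_dimension([]) == 1
```

`pytest` discovers and runs such functions automatically; no manual invocation is needed.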
We provide an additional `nox` session `minimums` that makes use of `uv`’s `--resolution=lowest-direct` flag to install the lowest possible versions of the direct dependencies.
This ensures that the project can still be built and the tests pass with the minimum required versions of the dependencies.
$ nox -s minimums
### Python Code Formatting and Linting
The Python code is formatted and linted using a collection of `pre-commit` hooks.
This collection includes

- ruff, an extremely fast Python linter and formatter written in Rust, and
- mypy, a static type checker for Python code.
The hooks can be installed by running the following command in the root directory:
$ pre-commit install
This will install the hooks in the `.git/hooks` directory of the repository.
The hooks will be executed whenever you commit changes.
You can also run the `nox` session `lint` to run the hooks manually.
$ nox -s lint
Note
If you do not want to use `nox`, you can also run the hooks manually by using `pre-commit`.
$ pre-commit run --all-files
### Python Documentation
The Python code is documented using Google-style docstrings.
Every public function, class, and module should have a docstring that explains what it does and how to use it.
`ruff` will check for missing docstrings and will explicitly warn you if you forget to add one.
We heavily rely on type hints to document the expected types of function arguments and return values.
For the compiled parts of the code base, we provide type hints in the form of stub files in the `python/mqt/qudits` directory.

The Python API documentation is integrated into the overall documentation that we host on ReadTheDocs using the `sphinx-autoapi` extension for Sphinx.
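As a reference point, a Google-style docstring with type hints looks like this (the function is purely illustrative and not part of the package):

```python
# Illustrative example of a Google-style docstring with type hints.
import math


def normalize(amplitudes: list[complex]) -> list[complex]:
    """Rescale a state vector to unit norm.

    Args:
        amplitudes: The complex amplitudes of the state.

    Returns:
        The amplitudes divided by the Euclidean norm of the vector.

    Raises:
        ValueError: If all amplitudes are zero.
    """
    norm = math.sqrt(sum(abs(a) ** 2 for a in amplitudes))
    if norm == 0.0:
        raise ValueError("cannot normalize the zero vector")
    return [a / norm for a in amplitudes]
```

The `Args`, `Returns`, and `Raises` sections are picked up by Sphinx and rendered in the API documentation.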
## Working on the Documentation
The documentation is written in MyST (a flavor of Markdown) and built using Sphinx.
The documentation source files can be found in the `docs/` directory.
On top of the API documentation, we provide a set of tutorials and examples that demonstrate how to use the library. These are written in Markdown using myst-nb, which allows executing Python code blocks in the documentation. The code blocks are executed during the documentation build process, and the output is included in the documentation. This allows us to provide up-to-date examples and tutorials that are guaranteed to work with the latest version of the library.
You can build the documentation using the `nox` session `docs`.
$ nox -s docs
This will install all dependencies for building the documentation in an isolated environment, build the Python package, and then build the documentation. It will then host the documentation on a local web server for you to view.
Note
If you do not want to use `nox`, you can also build the documentation directly using `sphinx-build`.
This requires that you have the project and its documentation dependencies installed in your virtual environment (e.g., by running `uv sync`).
(.venv) $ sphinx-build -b html docs/ docs/_build
The docs can then be found in the `docs/_build` directory.
## Tips for Development
If something goes wrong, the CI pipeline will notify you. Here are some tips for finding the cause of certain failures:
- If any of the `CI / 🇨 Test` checks fail, this indicates build errors or test failures in the C++ part of the code base. Look through the respective logs on GitHub for any error or failure messages.
- If any of the `CI / 🐍 Test` checks fail, this indicates build errors or test failures in the Python part of the code base. Look through the respective logs on GitHub for any error or failure messages.
- If any of the `codecov/*` checks fail, this means that your changes are not appropriately covered by tests or that the overall project coverage decreased too much. Ensure that you include tests for all your changes in the PR.
- If `cpp-linter` comments on your PR with a list of warnings, these have been raised by `clang-tidy` when checking the C++ part of your changes for warnings or style guideline violations. The individual messages frequently provide helpful suggestions on how to fix the warnings. If you don’t see any messages, but the `🇨 Lint / 🚨 Lint` check is red, click on the `Details` link to see the full log of the check and a step summary.
- If the `pre-commit.ci` check fails, some of the `pre-commit` checks failed and could not be fixed automatically by the pre-commit.ci bot. The individual log messages frequently provide helpful suggestions on how to fix the warnings.
- If the `docs/readthedocs.org:*` check fails, the documentation could not be built properly. Inspect the corresponding log file for any errors.
## Releasing a New Version
Before releasing a new version, check the GitHub release draft generated by the Release Drafter for unlabelled PRs.
Unlabelled PRs would appear at the top of the release draft below the main heading.
Furthermore, check whether the version number in the release draft is correct.
The version number in the release draft is dictated by the presence of certain labels on the PRs involved in a release.
By default, a patch release will be created.
If any PR has the `minor` or `major` label, a minor or major release will be created, respectively.
Note
Sometimes, Dependabot or Renovate will tag a PR updating a dependency with a `minor` or `major` label because the update is a minor or major release of that dependency.
This does not mean that the dependency update is a breaking change for MQT Qudits.
If you are sure that the dependency update does not introduce any breaking changes for MQT Qudits, you can remove the `minor` or `major` label from the PR.
This will ensure that the respective PR does not influence the type of an upcoming release.
Once everything is in order, navigate to the Releases page on GitHub, edit the release draft if necessary, and publish the release.