Requires Python 3.9-3.12.
```bash
# Quote the extras so the command also works in zsh
pip install -e ".[tests,bigquery,docker,datahub,gcs,s3]"
pip install -r requirements-dev.txt
pre-commit install
```

Note: a dbt adapter extra (e.g., `bigquery`, `snowflake`) is required because `dbt-core` is only pulled in as a transitive dependency of the adapter. Any adapter can be used for development.
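After installing, a quick sanity check can confirm that the editable install and the adapter-provided `dbt-core` are importable. This is only a sketch; the module names `data_pipelines_cli` and `dbt` are the usual ones, adjust if your environment differs:

```python
# Sanity-check the dev environment (sketch; module names assumed).
import importlib.util

def is_installed(module: str) -> bool:
    # True if the module can be found on the current interpreter's path.
    return importlib.util.find_spec(module) is not None

for mod in ("data_pipelines_cli", "dbt"):
    status = "ok" if is_installed(mod) else "MISSING - rerun the install steps"
    print(f"{mod}: {status}")
```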
```bash
# Run all tests
pytest --cov data_pipelines_cli --cov-report term-missing --ignore=venv

# Run a specific test
pytest tests/test_dbt_utils.py::test_specific_function
```
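New functionality should ship with unit tests. A minimal pytest-style test might look like the following; the function, its behavior, and the file name are purely illustrative, not part of the project:

```python
# tests/test_example.py (hypothetical) -- a minimal pytest-style unit test.
def slugify(name: str) -> str:
    """Lower-case a name and replace spaces with hyphens."""
    return name.strip().lower().replace(" ", "-")

def test_slugify():
    assert slugify("  My Pipeline ") == "my-pipeline"
```

pytest automatically discovers any `test_*` function in files matching `test_*.py` under `tests/`.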
```bash
# Test all supported Python versions with tox
tox

# Test a specific Python version
tox -e py310
```

- Fork a branch from `develop`.
- Provide unit tests for any new functionality.
- Install the dev requirements (`pip install -r requirements-dev.txt`) and set up the hook (`pre-commit install`).
- Run `tox` to verify that all Python versions pass.
- Update the documentation accordingly.
- Update the changelog according to the "Keep a Changelog" guidelines.
- Squash your changes into a single commit where possible, and give the PR a descriptive name.
- Open a PR against the `develop` branch.
We reserve the right to take over, modify, or close PRs that do not follow this workflow or have been abandoned.
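Following the "Keep a Changelog" convention mentioned above, changes are recorded under an `Unreleased` heading, grouped by change type (the descriptions below are illustrative):

```
## [Unreleased]

### Added
- Short description of the new feature.

### Fixed
- Short description of the bug fix.
```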
- Create the release candidate:
  - Go to the "Prepare release" action.
  - Click "Run workflow".
  - Enter the part of the version to bump, i.e. one of `major`, `minor`, or `patch` (as in `<major>.<minor>.<patch>`); `minor` is the default.
- If the workflow has run successfully:
  - Go to the newly opened PR named `Release candidate <version>`.
  - Check that the changelog and version have been properly updated. If not, pull the branch and apply manual changes where necessary.
  - Merge the PR to `main`.
- Check the Publish workflow to verify that:
  - the package has been uploaded to PyPI successfully,
  - the changes have been merged back to `develop`.
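The version bump selected in the workflow follows the usual semantic-versioning rules. A sketch of that logic, for orientation only (the release workflow uses its own bump tooling):

```python
# Semantic-version bump (illustration only, not the workflow's implementation).
def bump(version: str, part: str) -> str:
    major, minor, patch = map(int, version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":  # the workflow's default
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"

print(bump("0.22.1", "minor"))  # → 0.23.0
```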