Add doctests and acceptance tests to CI pipeline and fix broken doctests #244
Conversation
- Add doctest step to run_unit_tests.yml workflow
- Fix DigitalWaveformFailure doctest: remove column_index field that no longer exists in the API
- Fix _mask_to_column_indices doctest: add missing bitorder parameter

This ensures doctests are run as part of CI to catch future API/documentation mismatches.

Signed-off-by: Kosta Ilic <[email protected]>
With only 6 tests taking ~0.1s to run, the acceptance tests provide valuable integration-test coverage of memory-mapping functionality across all OS/Python combinations with minimal CI overhead. Signed-off-by: Kosta Ilic <[email protected]>
… same effect in a simpler manner that reduces duplication. Co-authored-by: Brad Keryan <[email protected]>
@bkeryan, I committed the change you recommended to see how it works. Looking at one of the logs, several tests get skipped in the PR. When Copilot runs those tests locally, they do NOT get skipped. Upon further investigation, it turns out that there was a discrepancy in pytest versions between the server and my computer. When I upgraded pytest locally from 9.0.1 to 9.0.2, I saw the same four tests getting skipped. The tests are:

I started looking at why one of these tests was failing. My understanding is that pytest-doctestplus 1.6.0 changed its behavior. The changelog states:

The change was intentional: PR #312 was specifically created to make skipped doctests more visible in pytest output. Previously, doctests that couldn't be imported were silently ignored; now they properly show as "SKIPPED" tests. The skip reasons shown in our tests are correct.

However, there's still the version-parsing bug: pytest-doctestplus 1.6.0 incorrectly treats "numpy>=2.0" as a literal module name instead of parsing it as a version requirement. This appears to be a bug that wasn't caught during testing. The real problem isn't that the tests are skipping (that's the intended new behavior), but that the version parsing is broken. Even though we have NumPy 2.3.4 installed, the plugin looks for a module literally named "numpy>=2.0" instead of checking whether the installed NumPy version meets the >=2.0 requirement.

Our options include:
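The failure mode described above can be reproduced with a small stdlib-only sketch. This is not pytest-doctestplus code, only an illustration of what treating the full requirement string as a module name does:

```python
import importlib

# An entry like "numpy>=2.0" is meant to be split into a module name
# ("numpy") plus a version specifier (">=2.0"). Treating the whole string
# as a module name, as described above, is roughly equivalent to:
requirement = "numpy>=2.0"
try:
    importlib.import_module(requirement)
    available = True
except ImportError:
    # No module is literally named "numpy>=2.0", so this branch is always
    # taken, even when NumPy 2.x is installed.
    available = False

print(available)
```

The import check always reports the requirement as unmet, which matches the skips observed in CI.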
[Several days later] I tried and did not like the workaround. I would prefer for us to report and fix the bug in pytest-doctestplus. I wrote that up in #251. I hope we can now make progress on this PR.
- Remove __doctest_requires__ statements that cause pytest-doctestplus 1.6.0 to skip tests
- Update pytest dependencies to match CI environment
- All 22 doctests now pass without skips

Signed-off-by: Kosta Ilic <[email protected]>
These statements are needed because the doctests actually use NumPy 2.0 features like np.long and np.ulong. The pytest-doctestplus 1.6.0 version has a bug where it doesn't properly parse version requirements, but the requirements themselves are legitimate and necessary.
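For context, the declaration in question is a module-level mapping that pytest-doctestplus reads. A sketch of its shape (the key below is a hypothetical object name, not necessarily the one used in this repo):

```python
# Module-level declaration read by pytest-doctestplus. Keys name the objects
# whose doctests carry the requirement; values list the required packages,
# optionally with version specifiers. The object name here is hypothetical.
__doctest_requires__ = {
    "DigitalWaveform.to_array": ["numpy>=2.0"],
}
```

It is the `"numpy>=2.0"` value on the right-hand side that pytest-doctestplus 1.6.0 mishandles.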
This workaround addresses a pytest-doctestplus 1.6.0 bug where it doesn't properly parse version requirements like 'numpy>=2.0'. The inline checks skip doctests that use NumPy 2.0 features (np.long, np.ulong) when running with older NumPy.

- Remove __doctest_requires__ statements that were causing pytest failures
- Add inline version checks in doctests that use NumPy 2.0 features
- Include comprehensive comments explaining the workaround and maintenance notes
- This is a temporary solution until pytest-doctestplus is fixed or NumPy < 2.0 support is dropped
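A minimal sketch of the inline-check pattern described above. The helper name and skip message are illustrative, not the exact code in this PR:

```python
def _version_tuple(version):
    """Parse a dotted version string like '2.3.4' into a comparable
    (major, minor) tuple, ignoring non-numeric suffix parts."""
    return tuple(int(part) for part in version.split(".")[:2] if part.isdigit())

# Inside a docstring, the inline doctest check then looks roughly like:
#
#     >>> import numpy as np
#     >>> if _version_tuple(np.__version__) < (2, 0):
#     ...     import pytest
#     ...     pytest.skip("doctest uses np.long/np.ulong (NumPy >= 2.0 only)")
#
# With NumPy >= 2.0 installed the check is a no-op and the doctest runs;
# with older NumPy the doctest is skipped instead of failing on np.long.
```

Unlike `__doctest_requires__`, this keeps the skip logic entirely inside the doctest, so it does not depend on the plugin's requirement parsing.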
Break long comment lines to stay within 100 character limit while preserving meaning and readability.
- Break long comment lines about formatting differences into multiple lines
- Split long pytest.skip lines using an intermediate variable
- Fix doctest syntax errors from line continuation

All lines now under the 100 character limit while maintaining functionality
Applied black formatting to resolve BLK100 styleguide issue in complex module. Signed-off-by: Kosta Ilic <[email protected]>
- Replace inline doctest version checks with module-level pytest.skip in complex module
- Remove setup code from docstrings to provide clean user-facing documentation
- Maintain doctest functionality while hiding implementation details from generated HTML docs
- All 22 doctests continue to pass with NumPy 2.0 compatibility maintained
- Change single quotes to double quotes in version check for Black compliance
- Revert to inline doctest version checks to avoid interfering with module imports
- Module-level pytest.skip was causing import failures in CI with older NumPy versions
- Maintain clean documentation while preserving import functionality
- Remove trailing spaces from doctest skip comment
- Split long version check line to stay under the 100 character limit
- Replace __doctest_requires__ with inline version checks in _arguments.py and complex module
- pytest-doctestplus 1.6.0: treated 'numpy>=2.0' as a literal module name
- pytest-doctestplus 1.7.0: has execution pipeline bugs causing incorrect skipping
- Inline checks work reliably across the CI matrix with different NumPy versions
- All doctests now pass locally with NumPy 2.3.4 and will skip correctly with NumPy < 2.0
- Break long comment lines to stay under the 100 character limit
- Fixes CI linting errors in _arguments.py
Signed-off-by: Kosta Ilic <[email protected]>
These changes will be moved to a separate PR for focused review.
…to be required for the main purpose of the PR.
@KostaIlic2 I'll update the oldest-deps workflow in a separate PR since it requires adding
What does this Pull Request accomplish?
I encountered unexpected doctest failures and got Copilot to fix them. To prevent this situation in the future, I got Copilot to add doctests to the pipeline. I then tracked down that acceptance tests are also not run in CI, so I had Copilot add those too.
Summary of changes:
This ensures doctests are run as part of CI to catch future API/documentation mismatches. It also ensures acceptance tests are run as part of CI.
Why should this Pull Request be merged?
To reduce the probability of incorrect documentation examples and of surprises with acceptance tests in the future, and to unblock future contributions.
What testing has been done?
I relied on the draft-PR pipeline build for the required automated QA.
Root Cause Analysis
I asked Copilot to figure out which PR broke the doctests. Here is what it found:
PR #222 "Reverse Indices for Digital Waveform Signals" broke the doctests
👤 Author: Mike Prosser (@mikeprosserni)
📅 Date: December 10, 2025 (9 days ago)
🔗 Commit: 0108c35