Adding test for CadenceWith16BitConvActivationsQuantizer #16205
base: main
Conversation
Summary: Add annotation tests for CadenceWith16BitConvActivationsQuantizer covering both conv1d and conv2d operations.

https://www.internalfb.com/code/fbsource/[01c566b03c670b1869136cbb64f25d16d730c8d4]/fbcode/executorch/backends/cadence/aot/quantizer/quantizer.py?lines=384-396

Reviewed By: hsharma35

Differential Revision: D88895865
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/16205
Note: Links to docs will display an error until the docs builds have been completed.
❌ 1 New Failure, 1 Cancelled Job, 1 Unrelated Failure as of commit d6d5a52 with merge base 5033840.
NEW FAILURE - The following job has failed:
UNSTABLE - The following job is marked as unstable, possibly due to flakiness on trunk:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Pull request overview
This PR adds test coverage for CadenceWith16BitConvActivationsQuantizer, which was previously on the excluded-quantizers list. The tests validate that the quantizer correctly annotates both conv1d and conv2d operations with 16-bit activation quantization specs; a hedged sketch of such a test follows the list below.
Key Changes:
- Removed CadenceWith16BitConvActivationsQuantizer from the exclusion list
- Added parameterized test cases for both conv1d and conv2d operations
- Implemented graph builder helper methods following established patterns
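For context, here is a minimal sketch of what such a parameterized annotation test can look like. This is not the PR's actual code: the export step, the inline model definitions, and the assertion that annotations land in node.meta["quantization_annotation"] are assumptions based on the standard torch.ao.quantization PT2E quantizer API, not taken from the executorch test module.

```python
# Hedged sketch only -- not the code from this PR. Assumes the quantizer
# follows the standard torch.ao.quantization Quantizer API (an annotate()
# method on an FX GraphModule) and that annotations are recorded under
# node.meta["quantization_annotation"], as in PT2E quantization.
import unittest

import torch
from parameterized import parameterized

from executorch.backends.cadence.aot.quantizer.quantizer import (
    CadenceWith16BitConvActivationsQuantizer,
)


class TestCadenceWith16BitConvActivationsQuantizer(unittest.TestCase):
    @parameterized.expand(
        [
            ("conv1d", torch.nn.Conv1d(3, 8, kernel_size=3), (1, 3, 16)),
            ("conv2d", torch.nn.Conv2d(3, 8, kernel_size=3), (1, 3, 16, 16)),
        ]
    )
    def test_conv_annotation(self, _name, model, input_shape):
        # Export the eager module to an FX graph that the PT2E quantizer
        # API can annotate.
        gm = torch.export.export_for_training(
            model.eval(), (torch.randn(input_shape),)
        ).module()

        quantizer = CadenceWith16BitConvActivationsQuantizer()
        quantizer.annotate(gm)

        # At least one node (the conv) should now carry a quantization
        # annotation; the real test would additionally check that the
        # activation qspecs use a 16-bit dtype.
        annotated = [
            node
            for node in gm.graph.nodes
            if "quantization_annotation" in node.meta
        ]
        self.assertGreater(len(annotated), 0)
```

Parameterizing over the conv variants keeps one test body for both ops, which matches the PR's approach of covering conv1d and conv2d with shared graph-builder helpers rather than duplicated tests.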