Adding Test for CadenceWithLayerNormQuantizer #16355
base: main
Conversation
Dr. CI status (artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/16355):

As of commit afcf441 with merge base 0ee2f49: 1 new failure and 1 unrelated failure. One job is marked as unstable, possibly due to flakiness on trunk. There is 1 currently active SEV.
Summary: Adding test for CadenceWithLayerNormQuantizer. https://www.internalfb.com/code/fbsource/[72237310dfd9016b5f54a44d994c8a7065eda5d1]/fbcode/executorch/backends/cadence/aot/quantizer/quantizer.py?lines=296-306 Differential Revision: D88898823
Pull request overview
This PR adds comprehensive test coverage for three previously untested Cadence quantizers: CadenceWith16BitConvActivationsQuantizer, CadenceWithSoftmaxQuantizer, and CadenceWithLayerNormQuantizer. The tests verify that these quantizers correctly annotate graph nodes with the expected quantization specifications.
- Removes TODO comments for the three quantizers from the exclusion list
- Adds four new test cases covering conv1d, conv2d, softmax, and layer_norm operations
- Implements corresponding graph builder helper methods following the established pattern
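For orientation, here is a minimal sketch of the shape such an annotation test could take. The method name and assertion are hypothetical; it would live in the same TestCase class as the `_build_layer_norm_graph` helper shown in the review hunk further down, and the PR's actual tests also compare the attached quantization specs against the expected values.

```python
from executorch.backends.cadence.aot.quantizer.quantizer import (
    CadenceWithLayerNormQuantizer,
)


# Hypothetical test method; assumes the _build_layer_norm_graph helper
# added in this PR (see the code hunk below) is available on the class.
def test_layer_norm_annotation(self) -> None:
    gm, layer_norm_node = self._build_layer_norm_graph()

    # Run only the annotation step of the quantizer under test.
    quantizer = CadenceWithLayerNormQuantizer()
    quantizer.annotate(gm)

    # The layer_norm node is expected to carry a quantization annotation
    # after annotate(); the real tests additionally verify the qspecs.
    self.assertIn("quantization_annotation", layer_norm_node.meta)
```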
Summary: Add annotation tests for CadenceWith16BitConvActivationsQuantizer covering both conv1d and conv2d operations. Differential Revision: D88895865
Differential Revision: D88896712
Force-pushed from 6f4fab8 to a983619.
Force-pushed from a983619 to 71aabe7.
Summary: Pull Request resolved: pytorch#16355 Adding test for CadenceWithLayerNormQuantizer. https://www.internalfb.com/code/fbsource/[72237310dfd9016b5f54a44d994c8a7065eda5d1]/fbcode/executorch/backends/cadence/aot/quantizer/quantizer.py?lines=296-306 Reviewed By: hsharma35 Differential Revision: D88898823
Force-pushed from 71aabe7 to afcf441.
Pull request overview
Copilot reviewed 1 out of 1 changed files in this pull request and generated 1 comment.
```python
def _build_layer_norm_graph(self) -> tuple[torch.fx.GraphModule, torch.fx.Node]:
    """Build a simple graph with a layer_norm operation."""
    # Input shape: (batch, features)
    x = torch.randn(1, 10)
    # normalized_shape must match the last dimension(s) of input
    normalized_shape = [10]
    gm = single_op_builder(
        placeholders=(x,),
        op=torch.ops.aten.layer_norm.default,
        args=(x, normalized_shape),
    )

    layer_norm_nodes = gm.graph.find_nodes(
        op="call_function",
        target=torch.ops.aten.layer_norm.default,
    )
    self.assertEqual(
        len(layer_norm_nodes), 1, "Should find exactly one layer_norm node"
    )
    # Add source_fn_stack metadata required by quantizer pattern matching
    layer_norm_nodes[0].meta["source_fn_stack"] = [
        ("layer_norm", torch.ops.aten.layer_norm.default)
    ]
    return gm, layer_norm_nodes[0]
```
Copilot AI commented on Dec 22, 2025:
The _build_layer_norm_graph method uses a different approach than the other graph builders (single_op_builder instead of GraphBuilder), and manually adds source_fn_stack metadata after graph construction. While this works, it creates an inconsistency in the codebase.
Consider refactoring to use GraphBuilder directly like the other methods for consistency. The normalized_shape parameter can be passed directly in args since GraphBuilder's call_operator handles both tensor and non-tensor arguments. This would allow adding the source_fn_stack metadata during construction rather than after the fact.
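A rough sketch of the direction this suggestion points at, assuming the GraphBuilder API used by the other helpers in this test file (placeholder / call_operator / output / get_graph_module). The exact method names and the way node metadata gets attached would need to follow that file's existing helpers; this version still sets source_fn_stack after building, via find_nodes, since attaching it during construction depends on how call_operator exposes the created node.

```python
import torch
from executorch.backends.cadence.aot.graph_builder import GraphBuilder


def _build_layer_norm_graph(self) -> tuple[torch.fx.GraphModule, torch.fx.Node]:
    """Build a layer_norm graph with GraphBuilder, mirroring the other helpers."""
    builder = GraphBuilder()
    x = builder.placeholder("x", torch.randn(1, 10))
    # normalized_shape is passed directly as a non-tensor argument.
    layer_norm = builder.call_operator(
        op=torch.ops.aten.layer_norm.default,
        args=(x, [10]),
    )
    builder.output([layer_norm])
    gm = builder.get_graph_module()

    # Locate the layer_norm node and attach the metadata the quantizer's
    # pattern matcher looks for.
    layer_norm_node = gm.graph.find_nodes(
        op="call_function", target=torch.ops.aten.layer_norm.default
    )[0]
    layer_norm_node.meta["source_fn_stack"] = [
        ("layer_norm", torch.ops.aten.layer_norm.default)
    ]
    return gm, layer_norm_node
```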
Summary:
Adding test for CadenceWithLayerNormQuantizer.
https://www.internalfb.com/code/fbsource/[72237310dfd9016b5f54a44d994c8a7065eda5d1]/fbcode/executorch/backends/cadence/aot/quantizer/quantizer.py?lines=296-306
Differential Revision: D88898823