Refactors for FullyConnected and UNet models
#1330
base: main
Conversation
Greptile Summary
This PR brings the FullyConnected and UNet models into compliance with the new refactor standards. The refactoring maintains backward compatibility while significantly improving code quality, documentation clarity, and developer experience. All changes align with the framework's best practices for model implementation.
…ting. If successful, will propagate.
Blocked by possible fixes in #1332
CharlelieLrt
left a comment
Mostly minor comments and questions. One bigger question though: many of the layers and submodules used in the UNet have similar ones defined in physicsnemo/nn (e.g. UNet block, convolutions, attention, etc.)
- Would it be possible to reuse or consolidate them (maybe a long shot)?
- If the ones defined in the unet.py module are useful enough and sufficiently general, would it make sense to move them to the appropriate module of physicsnemo.nn? (A rough sketch of what such reuse could look like follows below.)
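To make the consolidation idea concrete, here is a hedged sketch. Everything in it is illustrative: the local _LocalConvBlock and the shared physicsnemo.nn.ConvBlock location are assumptions for discussion, not the actual contents of unet.py or physicsnemo.nn.

```python
# Illustrative only: class names and the shared-module location are
# assumptions, not the real unet.py or physicsnemo.nn contents.
import torch.nn as nn


class _LocalConvBlock(nn.Module):
    """Stand-in for a conv -> norm -> activation block defined privately in unet.py."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)


# Consolidated alternative (hypothetical): promote the block to a shared
# module such as physicsnemo/nn/conv_layers.py and import it in unet.py,
# so UNet and other models share a single implementation:
#
#     from physicsnemo.nn import ConvBlock  # hypothetical shared location
#     self.encoder = nn.ModuleList(
#         ConvBlock(c_in, c_out) for c_in, c_out in channel_pairs
#     )
```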
…d adds assertions for in_channels and out_channels
…es for improved clarity and type safety.
CharlelieLrt
left a comment
LGTM!
/blossom-ci
# Torch script, need to save it as separate model since TS model doesn't have meta
if model.meta.jit:
-    fwd_model = torch.jit.script(model)
+    fwd_model = torch.compile(model)
I think this change could be confusing. Why are we replacing torch.jit.script with torch.compile under the same jit metadata flag? If the goal is to deprecate TorchScript support, it might be clearer to rename the metadata (e.g., compile or torch_compile) rather than reusing jit.
Also, not all models that are TorchScript-compatible are necessarily compatible with torch.compile, so I’m curious whether this change fails any tests.
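For reference, a minimal sketch of the renaming being suggested, assuming a dataclass-style metadata object. The ModelMetaData fields shown here are hypothetical and not the actual PhysicsNeMo metadata definition; torch.jit.script and torch.compile are the standard PyTorch entry points.

```python
# Sketch only: the metadata class and field names are assumptions for
# illustration; torch.jit.script and torch.compile are real PyTorch APIs.
from dataclasses import dataclass

import torch


@dataclass
class ModelMetaData:
    # Keep the old flag for TorchScript so existing models are unaffected...
    jit: bool = False
    # ...and add an explicit flag for the torch.compile path.
    torch_compile: bool = False


def wrap_for_inference(model: torch.nn.Module, meta: ModelMetaData) -> torch.nn.Module:
    """Select the acceleration path from explicit metadata flags."""
    if meta.torch_compile:
        # torch.compile does not require TorchScript compatibility,
        # but the compiled model carries no scripted graph to serialize.
        return torch.compile(model)
    if meta.jit:
        # TorchScript path: the model must actually be scriptable.
        return torch.jit.script(model)
    return model
```

Keeping the two flags separate makes the caveat above explicit: a model can opt into TorchScript, torch.compile, both, or neither, and a scriptable model is not assumed to be compile-compatible.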
mnabian
left a comment
Left a comment about jit vs compile. Otherwise, LGTM!
PhysicsNeMo Pull Request
Description
Brings FullyConnected (both model and PN.nn variants) and UNet into compliance with the new refactor standards.
Checklist
Dependencies
Review Process
All PRs are reviewed by the PhysicsNeMo team before merging.
Depending on which files are changed, GitHub may automatically assign a maintainer for review.
We are also testing AI-based code review tools (e.g., Greptile), which may add automated comments with a confidence score.
This score reflects the AI’s assessment of merge readiness and is not a qualitative judgment of your work, nor is it an indication that the PR will be accepted or rejected.
AI-generated feedback should be reviewed critically for usefulness.
You are not required to respond to every AI comment, but these comments are intended to help both authors and reviewers.
Please react to Greptile comments with 👍 or 👎 to provide feedback on their accuracy.