[Utils] Replace preserve_attr with patch_attr #1187


Merged
merged 7 commits into main from kylesayrs/rename-patch_attr
Apr 8, 2025

Conversation

kylesayrs
Collaborator

Purpose

  • Provide explicit attribute-patching functionality, replacing preserve_attr
  • This function is very similar to unittest.mock.patch, except that it does not require importing the unittest library in source code
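To illustrate the behavior described above, here is a minimal sketch of what a `patch_attr`-style context manager can look like. This is an assumption-based illustration of the pattern (set an attribute on entry, restore or delete it on exit), not the actual implementation merged in this PR; the `Config` class is hypothetical.

```python
import contextlib


@contextlib.contextmanager
def patch_attr(obj, attr_name, value):
    """Temporarily set ``obj.attr_name`` to ``value`` (sketch, not the PR's code).

    On exit, the original value is restored; if the attribute did not
    previously exist, it is deleted.
    """
    had_attr = hasattr(obj, attr_name)
    original = getattr(obj, attr_name, None)
    setattr(obj, attr_name, value)
    try:
        yield
    finally:
        if had_attr:
            setattr(obj, attr_name, original)
        else:
            delattr(obj, attr_name)


# Hypothetical usage example
class Config:
    mode = "eager"


with patch_attr(Config, "mode", "compiled"):
    assert Config.mode == "compiled"  # patched inside the block
assert Config.mode == "eager"  # restored afterwards
```

Unlike `unittest.mock.patch`, a helper like this carries no test-framework dependency, which is why it can be used directly in source code such as the sequential pipeline helpers.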

Changes

  • Replace preserve_attr with patch_attr
  • Replace its usage in src/llmcompressor/pipelines/sequential/helpers.py, improving clarity
  • Add unit test mark to utils/helpers pytests

Testing

  • Added tests

Signed-off-by: Kyle Sayers <[email protected]>

👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

Note: this is required to complete the testing suite; please only add the label once the PR is code complete and local testing has been performed.

@kylesayrs kylesayrs changed the title Replace preserve_attr with patch_attr [Utils] Replace preserve_attr with patch_attr Feb 24, 2025
@kylesayrs kylesayrs self-assigned this Feb 26, 2025
@kylesayrs kylesayrs added the ready When a PR is ready for review label Mar 10, 2025
@kylesayrs kylesayrs enabled auto-merge (squash) April 8, 2025 19:11
@kylesayrs kylesayrs merged commit e5780c5 into main Apr 8, 2025
8 checks passed
@kylesayrs kylesayrs deleted the kylesayrs/rename-patch_attr branch April 8, 2025 19:43
kylesayrs added a commit that referenced this pull request Apr 9, 2025
## Purpose ##
* Follow up to #1188
* Add utilities which can be used by developers as well as used during
testing of model architectures

## Prerequisites ##
* #1187

## Changes ##
* Add `skip_weights_download` which allows a model to be initialized and
dispatched without downloading the weights
* Add `patch_transformers_logger_level` which is used by
`skip_weights_download` to reduce warning verbosity
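The `patch_transformers_logger_level` utility mentioned above follows a common pattern: temporarily raise a logger's level so that noisy warnings (e.g. about uninitialized weights) are suppressed, then restore the previous level. A minimal stdlib-only sketch of that pattern, using a generic logger rather than the actual transformers logger and a hypothetical helper name:

```python
import contextlib
import logging


@contextlib.contextmanager
def patch_logger_level(logger_name, level=logging.ERROR):
    """Temporarily raise a logger's level to suppress warnings (sketch).

    Illustrates the patch_transformers_logger_level pattern; not the
    actual utility added in the referenced commit.
    """
    logger = logging.getLogger(logger_name)
    original_level = logger.level
    logger.setLevel(level)
    try:
        yield logger
    finally:
        logger.setLevel(original_level)


# Hypothetical usage: warnings from "demo" are hidden inside the block
log = logging.getLogger("demo")
log.setLevel(logging.INFO)
with patch_logger_level("demo"):
    assert log.level == logging.ERROR
assert log.level == logging.INFO  # restored on exit
```

Composing this with a weight-skipping context is what lets a model be instantiated for architecture testing without the warning spam that normally accompanies missing weights.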

---------

Signed-off-by: Kyle Sayers <[email protected]>