
[Test Fix] Fix Consecutive oneshot #971

Merged
merged 32 commits into from
Jan 23, 2025

Conversation

horheynm
Collaborator

@horheynm horheynm commented Dec 11, 2024

Contingent on merge of huggingface/transformers#34719
~~^ has been merged, not yet released~~
^ has been released

Blocked on
neuralmagic/compressed-tensors#237

SUMMARY:

TEST PLAN:
ran the test using transformers main
Must pass tests/llmcompressor/transformers/obcq/test_consecutive_runs.py


👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

@horheynm
Collaborator Author

/ready

@dsikka dsikka marked this pull request as draft December 12, 2024 17:00
@horheynm horheynm changed the title fix conseq onehsot [Test Fix] Fix Consecutive oneshot Dec 16, 2024
@horheynm horheynm marked this pull request as ready for review December 23, 2024 14:13
@horheynm horheynm marked this pull request as draft December 23, 2024 15:02
@horheynm horheynm marked this pull request as ready for review January 10, 2025 13:48
Collaborator

@rahul-tuli rahul-tuli left a comment


Looks fine; I feel we can make a few functions cleaner by using pathlib, but I'll leave that up to you.
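The pathlib suggestion could look something like the sketch below. This is an illustrative example only: the helper name `get_latest_checkpoint` and its behavior are hypothetical, not taken from `helpers.py`.

```python
from pathlib import Path
from typing import Optional


def get_latest_checkpoint(save_dir: str) -> Optional[Path]:
    """Return the most recently modified 'checkpoint-*' subdirectory, or None.

    Hypothetical helper: Path.glob() and Path.stat() replace manual
    os.listdir / os.path.join string handling.
    """
    checkpoints = [p for p in Path(save_dir).glob("checkpoint-*") if p.is_dir()]
    if not checkpoints:
        return None
    # Newest checkpoint by modification time
    return max(checkpoints, key=lambda p: p.stat().st_mtime)
```

Because `Path` objects carry their own join, glob, and stat methods, this kind of rewrite usually removes several lines of string manipulation per helper.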

src/llmcompressor/transformers/utils/helpers.py (outdated; resolved)
src/llmcompressor/transformers/utils/helpers.py (outdated; resolved)
rahul-tuli
rahul-tuli previously approved these changes Jan 20, 2025
src/llmcompressor/transformers/utils/helpers.py (outdated; resolved)
Collaborator

@rahul-tuli rahul-tuli left a comment


LGTM!

Collaborator

@kylesayrs kylesayrs left a comment


I believe the purpose of this test is to test consecutive runs without model reloading.

There are quite a few helper functions that I'm not entirely convinced are needed as of now; it might be worth revisiting whether they're actually needed for these tests.

@rahul-tuli rahul-tuli added the ready When a PR is ready for review label Jan 22, 2025
kylesayrs
kylesayrs previously approved these changes Jan 22, 2025
@dsikka dsikka merged commit 0b6782a into main Jan 23, 2025
7 checks passed
@dsikka dsikka deleted the fix-test-conseq-oneshot branch January 23, 2025 23:59
rahul-tuli added a commit that referenced this pull request Jan 28, 2025
Contingent on merge of
huggingface/transformers#34719
~~^ has been merged, not yet released~~
^ has been released

Blocked on
neuralmagic/compressed-tensors#237

SUMMARY:
* In multiple optimization tests, automatically decompress the model if
an already-optimized model is provided
* Fix recipe stage length
* Revive old code
* When running multiple optimizations (e.g. oneshot then finetune, or
oneshot then oneshot), the recipe needs to be added to the session using
`initialize_recipe`. Example here:
https://github.com/vllm-project/llm-compressor/pull/971/files#diff-c9ae8b3ad24d13abeea5b649a5fd6d0b0925f5c9cc40220cbfbe21ae81242f8dR63-R65
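The last bullet can be illustrated with a toy model of the session. Everything below is a simplified stand-in: the `Session` class, `initialize_recipe`, and `apply` are hypothetical names sketching the pattern, not the actual llm-compressor session API.

```python
class Session:
    """Toy stand-in for a compression session that accumulates recipe stages."""

    def __init__(self):
        self.stages = []

    def initialize_recipe(self, recipe: str) -> None:
        # Re-register a recipe from a previous optimization run so that
        # later runs see the full history of applied stages.
        self.stages.append(recipe)

    def apply(self, recipe: str) -> list:
        # A new run appends its own recipe on top of any re-initialized ones.
        self.stages.append(recipe)
        return list(self.stages)


# First optimization run (e.g. oneshot sparsification)
session = Session()
first = session.apply("sparsity_stage")

# Second run in a fresh session: without initialize_recipe, the earlier
# stage would be lost; re-initializing restores it before applying more.
session2 = Session()
session2.initialize_recipe("sparsity_stage")
second = session2.apply("quantization_stage")
```

The point of the sketch: the second session starts empty, so the first run's recipe must be explicitly re-initialized for the combined stage history to be correct.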

TEST PLAN:
ran the test using transformers main
Must pass tests/llmcompressor/transformers/obcq/test_consecutive_runs.py

---------

Co-authored-by: Dipika Sikka <[email protected]>
Co-authored-by: Rahul Tuli <[email protected]>
Signed-off-by: Rahul Tuli <[email protected]>
Labels
ready When a PR is ready for review
4 participants