[bugfix] bugfix for npu free memory #9640
Conversation
@sayakpaul I split the PR into two as suggested, thanks for your help!
I am not sure we need this PR to allow importing the NPU attention processor, do we?
@sayakpaul After I added NPU attention, the computing speed (FPS) actually increased.
That is okay, but I don't think we need to change anything to be able to import the NPU attention processor from
@sayakpaul Thanks for your suggestion, I've added the NPU check condition to avoid unnecessary issues.
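For context, a minimal sketch of how an NPU attention processor can be swapped in; this assumes `AttnProcessorNPU` is importable from `diffusers.models.attention_processor`, and the checkpoint id is a placeholder:

```python
import torch
from diffusers import UNet2DConditionModel
from diffusers.models.attention_processor import AttnProcessorNPU
from diffusers.utils import is_torch_npu_available

# Placeholder checkpoint; any UNet-based model is set up the same way.
unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet", torch_dtype=torch.float16
)

# Switch to the NPU fused attention only when torch_npu is actually usable,
# so the same script still runs on CUDA or CPU machines.
if is_torch_npu_available():
    unet.set_attn_processor(AttnProcessorNPU())
```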
Thanks, left some comments. @yiyixuxu what do you think?
```python
    IPAdapterAttnProcessor2_0,
)
if is_torch_npu_available():
    cross_attention_processors = (
```
These attention processors don't depend on Torch NPU, no?
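A hedged sketch of the narrower guard the comment suggests: the backend-agnostic processors stay unconditional, and only the NPU-specific one (here `AttnProcessorNPU`, used for illustration) sits behind the availability check:

```python
from diffusers.models.attention_processor import (
    IPAdapterAttnProcessor,
    IPAdapterAttnProcessor2_0,
)
from diffusers.utils import is_torch_npu_available

# These processors run on any backend, so they need no NPU guard.
cross_attention_processors = (
    IPAdapterAttnProcessor,
    IPAdapterAttnProcessor2_0,
)

# Only the NPU-specific processor is added conditionally.
if is_torch_npu_available():
    from diffusers.models.attention_processor import AttnProcessorNPU

    cross_attention_processors = cross_attention_processors + (AttnProcessorNPU,)
```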
```python
    IPAdapterAttnProcessor2_0,
)

attentionProcessor = Union[
```
We don’t follow camel casing in the diffusers codebase.
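For example, the alias could follow the PascalCase used for type names elsewhere in diffusers (the union members below are illustrative):

```python
from typing import Union

from diffusers.models.attention_processor import (
    AttnProcessor,
    AttnProcessor2_0,
    IPAdapterAttnProcessor,
    IPAdapterAttnProcessor2_0,
)

# PascalCase instead of camelCase (`attentionProcessor`).
AttentionProcessor = Union[
    AttnProcessor,
    AttnProcessor2_0,
    IPAdapterAttnProcessor,
    IPAdapterAttnProcessor2_0,
]
```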
@sayakpaul I've changed the code based on your suggestions, thanks!
Thank you, but I am still struggling to understand #9640 (comment)
@sayakpaul Hello again, I changed the PR to a bug fix because the original version calls the wrong function to empty the cache.
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
@sayakpaul Seems like it needs a review to satisfy the requirement.
Thanks!
* Improve NPU performance
* Improve NPU performance
* Improve NPU performance
* Improve NPU performance
* [bugfix] bugfix for npu free memory
* [bugfix] bugfix for npu free memory
* [bugfix] bugfix for npu free memory

---------

Co-authored-by: 蒋硕 <jiangshuo9@h-partners.com>
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
What does this PR do?
The original code outputs an error saying `torch_npu` has no attribute `empty_cache`.
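A minimal sketch of the kind of fix described, assuming the Ascend plugin exposes cache clearing under its `npu` namespace (mirroring `torch.cuda`) rather than at the top level; the `free_memory` helper name is illustrative:

```python
import gc

import torch

from diffusers.utils import is_torch_npu_available

if is_torch_npu_available():
    import torch_npu  # registers the NPU backend with torch


def free_memory():
    """Release cached device memory on whichever backend is active."""
    gc.collect()
    if torch.cuda.is_available():
        torch.cuda.empty_cache()
    elif is_torch_npu_available():
        # `torch_npu.empty_cache()` does not exist; the call lives under the
        # `npu` submodule, e.g. torch_npu.npu.empty_cache().
        torch_npu.npu.empty_cache()
```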
Before submitting
- See the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.