
remove dependency of flash_attn when use_flash_attn is set to false #20

Merged
sunpengsdu merged 5 commits into InternLM:develop from grj-no-flash-attn on Mar 8, 2024

Conversation

sallyjunjun (Collaborator):

Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and help it get feedback more easily. If you do not understand some items, don't worry: just open the pull request and seek help from the maintainers.

Motivation

Flash attention does not support the V100 GPU. To let V100 users run InternEvo, we remove the dependency on the flash attention module when use_flash_attn is set to false.
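The general pattern (a minimal sketch, not this PR's actual code; build_attention and torch_attention are hypothetical names) is to import flash_attn lazily and fall back to a plain PyTorch implementation when the flag is off:

```python
import torch.nn.functional as F


def build_attention(use_flash_attn: bool):
    """Return an attention function; flash_attn is imported only on demand."""
    if use_flash_attn:
        # Lazy import: machines without flash_attn installed (e.g. V100) never reach this.
        from flash_attn import flash_attn_func
        return flash_attn_func

    def torch_attention(q, k, v, causal=False):
        # q, k, v: (batch, seqlen, nheads, headdim), the layout flash_attn_func expects.
        q, k, v = (t.transpose(1, 2) for t in (q, k, v))
        out = F.scaled_dot_product_attention(q, k, v, is_causal=causal)
        return out.transpose(1, 2)

    return torch_attention
```

With this pattern, the flash_attn package only needs to be installed when use_flash_attn is true.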

Modification

All flash-attention-related code is modified so that the flash_attn module is imported and used only when use_flash_attn is true.

BC-breaking (Optional)

Does the modification introduce changes that break backward compatibility for downstream repositories?
If so, please describe how it breaks compatibility and how downstream projects should modify their code to stay compatible with this PR.

Use cases (Optional)

If this PR introduces a new feature, it is better to list some use cases here and update the documentation.

Checklist

Before PR:

  • Pre-commit or other linting tools are used to fix potential lint issues.
  • Bug fixes are fully covered by unit tests; the case that triggered the bug should be added to the unit tests.
  • The modification is covered by complete unit tests. If not, please add more unit tests to ensure correctness.
  • The documentation has been modified accordingly, such as docstrings or example tutorials.

After PR:

  • If the modification has potential influence on downstream or other related projects, this PR should be tested with those projects.
  • CLA has been signed and all committers have signed the CLA in this PR.

@sallyjunjun force-pushed the grj-no-flash-attn branch 19 times, most recently from 9b67f33 to aa7c10e, on January 25, 2024 08:06
@@ -114,7 +118,47 @@ def forward(self, input): # pylint: disable=W0622
     )


-class ColumnParallelLinearTorch(ColumnParallelLinear):
+class ColumnParallelLinearTorch(nn.Linear):
Collaborator:

Noting this for later: if this is changed, the ISP algorithm's set_parallel_attr_for_param_groups function will need to be updated accordingly.

Collaborator (Author):

This only swaps the parent class and reimplements the __init__ function; the main logic is unchanged.
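For context, a minimal sketch of what such a class can look like (the world_size argument and the divisibility check are illustrative, not the PR's verbatim code): deriving from nn.Linear keeps the module importable without flash_attn, while __init__ allocates only this rank's slice of the output dimension.

```python
import torch.nn as nn


class ColumnParallelLinearTorch(nn.Linear):
    """Column-parallel linear layer without the flash_attn dependency (sketch)."""

    def __init__(self, in_features, out_features, world_size,
                 bias=True, device=None, dtype=None):
        if out_features % world_size != 0:
            raise ValueError(
                f"out_features ({out_features}) must be divisible "
                f"by world_size ({world_size})"
            )
        # Each rank owns a 1/world_size column slice of the weight matrix;
        # the inherited nn.Linear.forward then computes this rank's partition.
        super().__init__(in_features, out_features // world_size,
                         bias=bias, device=device, dtype=dtype)
```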

@huangting4201 marked this pull request as draft on January 26, 2024 03:27
@huangting4201 marked this pull request as ready for review on January 26, 2024 03:40
@sallyjunjun force-pushed the grj-no-flash-attn branch 3 times, most recently from c04d9c7 to 4c786e9, on January 30, 2024 03:48
@sallyjunjun force-pushed the grj-no-flash-attn branch 3 times, most recently from c6b42f9 to 356d4be, on February 2, 2024 11:10
@github-actions bot added the Stale label on Feb 10, 2024
@github-actions bot closed this on Feb 17, 2024
@sallyjunjun reopened this on Feb 19, 2024
@sallyjunjun force-pushed the grj-no-flash-attn branch 3 times, most recently from 407be0d to c22e806, on February 19, 2024 09:04
@github-actions bot removed the Stale label on Feb 20, 2024
@sallyjunjun force-pushed the grj-no-flash-attn branch 7 times, most recently from e0ce40e to 7ebc1c3, on February 26, 2024 03:58
@InternLM deleted a comment from github-actions bot on Feb 28, 2024
@InternLM deleted a comment from github-actions bot on Feb 28, 2024
sunpengsdu (Contributor):

Should this have some test cases?
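For illustration, a hypothetical pytest case (not part of this PR) could exercise the fallback path with flash_attn absent, reusing the build_attention sketch from the Motivation section above:

```python
import sys

import torch

from model import build_attention  # hypothetical module holding the sketch above


def test_attention_without_flash_attn(monkeypatch):
    # Setting the entry to None makes `import flash_attn` raise ImportError,
    # simulating a machine (e.g. V100) without the package installed.
    monkeypatch.setitem(sys.modules, "flash_attn", None)

    attn = build_attention(use_flash_attn=False)
    q = k = v = torch.randn(2, 8, 4, 16)  # (batch, seqlen, nheads, headdim)
    out = attn(q, k, v)
    assert out.shape == q.shape
```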

@sunpengsdu merged commit 0dcc0e9 into InternLM:develop on Mar 8, 2024
15 checks passed