
run demo on colab bug #223

Open
zkailinzhang opened this issue Nov 27, 2024 · 1 comment
@zkailinzhang commented:
/content/BLIP/models/med.py in forward(self, hidden_states, attention_mask, head_mask, encoder_hidden_states, encoder_attention_mask, past_key_value, output_attentions)
176
177 # Take the dot product between "query" and "key" to get the raw attention scores.
--> 178 attention_scores = torch.matmul(query_layer, key_layer.transpose(-1, -2))
179
180 if self.position_embedding_type == "relative_key" or self.position_embedding_type == "relative_key_query":

RuntimeError: The size of tensor a (3) must match the size of tensor b (9) at non-singleton dimension 0
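For context, this error comes from the batch-dimension broadcast check inside `torch.matmul`: leading (batch) dimensions must match or be 1, so a size-3 query batch cannot be multiplied against a size-9 key batch. The sketch below (shapes chosen for illustration only, not taken from BLIP) triggers the same message:

```python
import torch

# Hypothetical shapes: (batch, heads, seq_len, head_dim)
query_layer = torch.randn(3, 12, 5, 64)  # batch of 3
key_layer = torch.randn(9, 12, 5, 64)    # batch of 9, e.g. 3 images x beam width 3

try:
    # Same call as med.py line 178 in the traceback above
    scores = torch.matmul(query_layer, key_layer.transpose(-1, -2))
except RuntimeError as e:
    print(e)  # size of tensor a (3) must match the size of tensor b (9) ...
```

In cross-attention code like `med.py`, a mismatch like this usually means one side of the attention (often the image features passed as `encoder_hidden_states`) was not expanded to the same batch size as the text side (e.g. after beam-search duplication). This is an assumption about the cause, not confirmed in the thread; checking that the installed `transformers` version matches the one pinned by the BLIP repo is a reasonable first step.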

weitong8591 commented Dec 2, 2024

I got the same error.
