/content/BLIP/models/med.py in forward(self, hidden_states, attention_mask, head_mask, encoder_hidden_states, encoder_attention_mask, past_key_value, output_attentions)
176
177 # Take the dot product between "query" and "key" to get the raw attention scores.
--> 178 attention_scores = torch.matmul(query_layer, key_layer.transpose(-1, -2))
179
180 if self.position_embedding_type == "relative_key" or self.position_embedding_type == "relative_key_query":
RuntimeError: The size of tensor a (3) must match the size of tensor b (9) at non-singleton dimension 0
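For context, this error comes from the batched matmul inside the attention layer: `torch.matmul` broadcasts over the leading (batch) dimensions, so the query and key tensors must agree there. In this cross-attention call that usually means `encoder_hidden_states` (the image embeddings) arrives with a different batch size than the text `hidden_states`. Below is a minimal, hypothetical sketch (the shapes are illustrative, not taken from BLIP) that reproduces the same RuntimeError with a query batch of 3 and a key batch of 9, mirroring the sizes in the traceback:

```python
import torch

# Hypothetical shapes for illustration: (batch, num_heads, seq_len, head_dim).
# The query batch (3) and key batch (9) disagree, so matmul cannot broadcast
# the leading dimension.
query_layer = torch.randn(3, 12, 35, 64)   # e.g. text-side queries, batch = 3
key_layer   = torch.randn(9, 12, 577, 64)  # e.g. image-side keys, batch = 9

# Raises:
# RuntimeError: The size of tensor a (3) must match the size of tensor b (9)
# at non-singleton dimension 0
attention_scores = torch.matmul(query_layer, key_layer.transpose(-1, -2))
```

If that is the cause here, the likely fix is to make the image embeddings and the text inputs share the same batch size before calling the text encoder (for example by repeating or tiling one of them), but the exact adjustment depends on how the inputs are being prepared in your script.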