An intriguing phenomenon occurred when I fed the same input object (downloaded here: input) into Qwen2-VL-2B-Instruct twice.
```python
import contextlib
import copy

import torch
import torch.nn.functional as F
from transformers import Qwen2VLForConditionalGeneration

if __name__ == '__main__':
    torch.use_deterministic_algorithms(True, warn_only=True)  # won't help
    torch.backends.cudnn.benchmark = False                    # won't help
    torch.backends.cuda.matmul.allow_tf32 = True              # won't help
    torch.backends.cudnn.allow_tf32 = True                    # won't help
    torch.backends.cudnn.deterministic = True                 # won't help

    model_kwargs = dict(
        revision='main',
        torch_dtype=torch.float16,
    )
    model = Qwen2VLForConditionalGeneration.from_pretrained(
        'Qwen/Qwen2-VL-2B-Instruct',
        **model_kwargs,
    )
    model.eval()

    device = torch.device('cuda:7')
    ctx_manager = contextlib.nullcontext()
    compute_loss_context_manager = contextlib.nullcontext()

    batch = torch.load('./input.pt')
    batch = {k: v if not hasattr(v, 'to') else v.to(device) for k, v in batch.items()}
    model.to(device)

    with torch.no_grad():
        l1 = model(**copy.deepcopy(batch))
    with torch.no_grad():
        l2 = model(**copy.deepcopy(batch))

    print(F.cosine_similarity(l1.logits.float(), l2.logits.float(), dim=-1).mean())  # != 1.0
    print(F.mse_loss(l1.logits.float(), l2.logits.float()).mean())  # != 0.0
```
You may find that `l1` and `l2` are inconsistent. Is this a bug or a feature?
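Whether a discrepancy like this counts as a bug usually depends on its magnitude: tiny drift is expected from floating-point GPU kernels, so outputs are typically compared up to a tolerance rather than bit-exactly. As a minimal plain-Python sketch of the idea (`math.isclose` standing in for a tolerance-based check such as `torch.allclose` on the logits; the two values below are made up for illustration):

```python
import math

# Two hypothetical logit values that differ only by float-level noise.
x, y = 0.123456789, 0.123456791

print(x == y)                                           # bit-exact comparison fails
print(math.isclose(x, y, rel_tol=1e-5, abs_tol=1e-8))   # tolerance-based check passes
```

The same idea applied to the script above would be checking the maximum absolute difference between `l1.logits` and `l2.logits` against a tolerance, rather than expecting an MSE of exactly zero.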
Did you try manually seeding torch and torch.cuda? Like this:
```python
torch.manual_seed(args.seed)
torch.cuda.manual_seed_all(args.seed)
```
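Note that seeding alone may not be enough here: seeds make randomly *generated* values reproducible, but they do not control the order in which a GPU kernel accumulates partial sums, and floating-point addition is not associative. A minimal pure-Python sketch of that non-associativity (no GPU involved; the seed only makes the sample data reproducible):

```python
import random

random.seed(0)  # the seed fixes the data...
vals = [random.uniform(-1, 1) for _ in range(100_000)]

# ...but summing the same numbers in a different order can still change
# the low-order bits of the result, which is the kind of run-to-run drift
# a nondeterministic reduction kernel can produce.
s_forward = sum(vals)
s_reversed = sum(reversed(vals))
print(s_forward, s_reversed, s_forward == s_reversed)
```

The two sums agree to many decimal places but need not be bit-identical, which mirrors the cosine similarity being close to, but not exactly, 1.0 in the original script.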