output strangeness #123
Comments
Hi, we have re-tested locally and asked colleagues with no prior experience with the videollama2 project to build a videollama2 environment from scratch and run the inference demo, but the results are all fine. So we are currently unable to locate the problem. Perhaps you can share more details to help us narrow it down.
Is this your finetuned model? My finetuned model ran into the same problem.
You can re-clone our repository and follow the instructions. Someone else also encountered this problem at first, but it has since been solved (#119).
Hi @babyta @Danielement321, we have internally reproduced this bug and it stems from the version of transformers. Please run
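(The exact command is cut off in the thread above. As a hedged sketch, and not the maintainers' actual instruction, the snippet below only prints the installed transformers version so it can be compared against whatever version the repository pins, e.g. in its requirements.txt if one exists.)

```python
# Hypothetical check: the maintainers attribute the bug to the installed
# `transformers` version, but the specific version/command is not given above.
# This only reports the locally installed version for comparison.
import transformers

print("installed transformers version:", transformers.__version__)
```

If the printed version differs from the one the repository expects, reinstalling the pinned dependencies (for example with `pip install -r requirements.txt`, assuming the repo provides that file) would typically resolve this kind of mismatch.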
I don't know what happened; I tested the a+v mode