Can you display the prompt that is sent to the LLM? If you run |
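One hedged way to surface that prompt, assuming the documented Vanna pattern of combining a vector-store class with an LLM class, is to override the LLM class's submit_prompt hook and print the prompt before forwarding it. The submit_prompt name, the config values, and the example question are assumptions here (recent Vanna versions route every LLM request through that method, but verify against your version):

```python
# A minimal sketch, not an official Vanna feature for inspecting prompts.
# Assumptions: the documented ChromaDB + OpenAI_Chat setup, and that the LLM
# class routes requests through submit_prompt(prompt, **kwargs) as in recent
# Vanna versions. API key, model, and question are placeholders.
from vanna.openai import OpenAI_Chat
from vanna.chromadb import ChromaDB_VectorStore


class MyVanna(ChromaDB_VectorStore, OpenAI_Chat):
    def __init__(self, config=None):
        ChromaDB_VectorStore.__init__(self, config=config)
        OpenAI_Chat.__init__(self, config=config)

    def submit_prompt(self, prompt, **kwargs):
        # `prompt` is the message list Vanna builds from your training data;
        # printing it shows exactly what is sent to the LLM.
        print("=== Prompt sent to the LLM ===")
        print(prompt)
        return super().submit_prompt(prompt, **kwargs)


vn = MyVanna(config={"api_key": "sk-...", "model": "gpt-4"})
vn.ask("Which are the top 10 customers by sales?")
```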
Describe the bug
Vanna gives a worse result than the same LLM used directly.
In Vanna, with the training data below (see screenshot):
Directly in the LLM, with none of the prior info or training data I gave Vanna (see screenshot):
Important note: I'm using the same LLM model in both cases.
To Reproduce
Ask the same question once in Vanna and once directly through the model; a hedged sketch of that comparison follows below.
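For concreteness, here is a minimal sketch of that side-by-side check, assuming the OpenAI-backed MyVanna instance (`vn`) from the earlier snippet and the official openai client. The DDL, question, and model name are placeholder assumptions, not the actual training data from this report:

```python
# A minimal sketch of the comparison, assuming the MyVanna instance (vn)
# from the earlier snippet. The DDL, question, and model are placeholders.
from openai import OpenAI

question = "Which are the top 10 customers by sales?"

# 1) Through Vanna, after training it on some schema.
vn.train(ddl="CREATE TABLE customers (id INT, name TEXT, sales NUMERIC);")
print("Vanna SQL:", vn.generate_sql(question=question))

# 2) Directly through the same model, with no training data or extra context.
client = OpenAI(api_key="sk-...")
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": question}],
)
print("Direct LLM answer:", response.choices[0].message.content)
```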
Expected behavior
An answer at least as good as the one the same LLM gives directly.
Error logs/Screenshots
Already provided.
Additional context
Connection to the database is working fine; the screenshot below shows the result I get when I run the exact correct query myself:
(screenshot)