Can you display the prompt that is sent to the LLM? If you run …
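One generic way to see the prompt is to wrap whatever method submits it to the LLM so the prompt is printed before it is sent. This is a minimal sketch of that pattern only: `FakeVanna`, its `ask` method, and its `submit_prompt` method are hypothetical stand-ins, not Vanna's confirmed API, so you would need to adapt the patched method name to your Vanna class.

```python
# Sketch: intercept and display the prompt a wrapper sends to an LLM
# by monkey-patching the submitting method. FakeVanna and
# submit_prompt are hypothetical stand-ins, NOT Vanna's real API.
class FakeVanna:
    def submit_prompt(self, prompt):
        # Pretend LLM answer; a real class would call the model here.
        return "SELECT 1;"

    def ask(self, question):
        prompt = f"You are a SQL assistant.\nQuestion: {question}"
        return self.submit_prompt(prompt)

captured = []

def patch_prompt_logging(obj):
    """Wrap obj.submit_prompt so every prompt is recorded and printed."""
    original = obj.submit_prompt

    def logged(prompt):
        captured.append(prompt)  # keep a copy for inspection
        print(prompt)            # ...and display it as it goes out
        return original(prompt)

    obj.submit_prompt = logged

vn = FakeVanna()
patch_prompt_logging(vn)
vn.ask("How many users signed up last week?")
```

The same wrapping works on any object whose prompt-submitting method you can identify, without modifying the library itself.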
Describe the bug
Vanna gives a worse result than the LLM alone:
In Vanna:
With the training data below:
Directly in the LLM:
Without any prior info or training data like I gave Vanna:
Important note: I'm using the same LLM model in both cases.
To Reproduce
Just ask the same question in Vanna and directly through the model.
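The comparison above can be sketched as a small script that sends one question down both paths and prints the two answers side by side. Both client functions here (`ask_vanna`, `ask_llm_directly`) are hypothetical placeholders, since the report doesn't include the actual calls; in practice they would wrap the Vanna instance and the raw model client respectively.

```python
# Sketch of the reproduction: same question, two paths, compare output.
# ask_vanna and ask_llm_directly are hypothetical stand-ins for the
# real Vanna call and the direct model call.
def ask_vanna(question: str) -> str:
    return "SELECT COUNT(*) FROM user;"   # placeholder Vanna answer

def ask_llm_directly(question: str) -> str:
    return "SELECT COUNT(*) FROM users;"  # placeholder direct answer

question = "How many users are there?"
vanna_sql = ask_vanna(question)
direct_sql = ask_llm_directly(question)

print("Vanna :", vanna_sql)
print("Direct:", direct_sql)
print("Same answer?", vanna_sql == direct_sql)
```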
Expected behavior
A better answer.
Error logs/Screenshots
Already provided.
Additional context
The connection to the database is working fine; below is the result I get when I run the exact correct query myself:
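A quick way to confirm that the connection itself is healthy, independent of any LLM, is to run a known-correct query by hand and check the rows come back. This sketch uses an in-memory SQLite database with a made-up `users` table and query, since the report does not include the actual schema or SQL.

```python
import sqlite3

# Hypothetical schema and query standing in for the real ones:
# the point is only to verify the connection returns rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER)")
conn.execute("INSERT INTO users VALUES (1), (2)")

rows = conn.execute("SELECT COUNT(*) FROM users").fetchall()
print(rows)  # [(2,)]
```

If this direct query succeeds but Vanna's generated SQL does not, the problem is in the generated query, not the connection.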