Langsmith nested runs do not work correctly with streaming #4223
Comments
This is most likely an issue on the LangSmith integration side. Can you file a report with them? CC @jacoblee93
Thanks for flagging - will dig in!
…avior of user runId / traceable Fixes vercel/ai#4223
…avior of user runId / traceable (#1383) Fixes vercel/ai#4223
Hey @grantfwilkinson and @lgrammel, this should be fixed in 0.2.15!
@lgrammel thank you for tagging. @jacoblee93 I tried 0.2.15 and didn't see any changes on my end. I reviewed the commit, and I'm not sure whether I need to keep traceable in the same format as above or whether anything changed. I used the same Vercel packages as your tests.
Sorry about the mixup - will keep you posted!
Description
I am working with the Vercel AI Chatbot example and using streamText.
I have experimental_telemetry: AISDKExporter.getSettings() set up.
Since I am using a tool that calls a LangChain multi-query retriever, it creates two runs: my chat run and my query run.
If I leave them un-nested, everything works correctly: latency, input/output, etc.
If I nest them with the traceable method from https://sdk.vercel.ai/providers/observability/langsmith, it combines them but removes the input/output, and the latency is always 0.1s. See the run example. When the text is not streamed, it works correctly.
I tried updating the run end time and adding the output in onFinish, which did not work. I tried different tracing through Sentry, which also did not work. I tried moving the traceable call to different levels.
I wasn't sure how to best read the OpenTelemetry debug log, but I didn't see anything obvious there.
Code example
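The original snippet was not captured in this thread. Below is a minimal sketch of the setup the description walks through, not the reporter's actual code: the handler name handleChat, the model choice, and the traceable options are assumptions, and it presumes the AISDKExporter has been registered as an OpenTelemetry span processor elsewhere (e.g. in instrumentation.ts), per the LangSmith docs.

```ts
import { streamText, type CoreMessage } from "ai";
import { openai } from "@ai-sdk/openai";
import { AISDKExporter } from "langsmith/vercel";
import { traceable } from "langsmith/traceable";

// Wrapping the handler in traceable() nests the AI SDK chat run and the
// LangChain retriever run under one parent LangSmith trace. This is the
// configuration where the streamed input/output disappears and the
// reported latency is always ~0.1s.
const handleChat = traceable(
  async (messages: CoreMessage[]) => {
    const result = streamText({
      model: openai("gpt-4o"),
      messages,
      // Forwards the AI SDK's OpenTelemetry spans to LangSmith.
      experimental_telemetry: AISDKExporter.getSettings(),
      // Attempted workaround from the description (did not help):
      // attach the final output when streaming finishes.
      // onFinish: ({ text }) => { /* update the run output here */ },
    });
    return result.toDataStreamResponse();
  },
  { name: "chat", run_type: "chain" }
);
```

Dropping the traceable() wrapper and keeping only experimental_telemetry corresponds to the un-nested configuration that the description says works correctly.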
AI provider
ai:4.0.20
Additional context
No response