docs(langchainjs): Update LangChain.js callback backgrounding recommendations #431

Merged · 1 commit · Sep 16, 2024
3 changes: 2 additions & 1 deletion docs/tracing/faq/langchain_specific_guides.mdx
@@ -187,7 +187,8 @@ This tactic is also useful for when you have multiple chains running in a shared

In LangChain Python, LangSmith's tracing is done in a background thread to avoid obstructing your production application. This means that your process may end before all traces are successfully posted to LangSmith. This is especially prevalent in a serverless environment, where your VM may be terminated immediately once your chain or agent completes.

-In LangChain JS, the default is to block for a short period of time for the trace to finish due to the greater popularity of serverless environments. You can make callbacks asynchronous by setting the `LANGCHAIN_CALLBACKS_BACKGROUND` environment variable to `"true"`.
+In LangChain JS, prior to `@langchain/core` version `0.3.0`, the default was to block for a short period of time for the trace to finish due to the greater popularity of serverless environments. Versions `>=0.3.0` will have the same default as Python.
+You can explicitly make callbacks synchronous by setting the `LANGCHAIN_CALLBACKS_BACKGROUND` environment variable to `"false"` or asynchronous by setting it to `"true"`. You can also check out [this guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more options for awaiting backgrounded callbacks in serverless environments.
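For illustration, here is a minimal TypeScript sketch of the serverless recommendation above: forcing callbacks (and therefore LangSmith trace submission) to run in the foreground by setting `LANGCHAIN_CALLBACKS_BACKGROUND` to `"false"` before invoking a model. The handler shape and model choice are hypothetical, not taken from the docs:

```typescript
import { ChatOpenAI } from "@langchain/openai";

// Keep callback handlers (including LangSmith tracing) in the foreground so
// awaiting the call below also waits for the trace to be sent.
process.env.LANGCHAIN_CALLBACKS_BACKGROUND = "false";

// Hypothetical serverless entry point; adapt to your platform's handler signature.
export async function handler(event: { question: string }) {
  const model = new ChatOpenAI({ model: "gpt-4o-mini" });
  const response = await model.invoke(event.question);
  return { statusCode: 200, body: String(response.content) };
}
```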

For both languages, LangChain exposes methods to wait for traces to be submitted before exiting your application.
Below is an example:
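The example itself is collapsed in this diff view. As a rough sketch of what waiting for trace submission can look like in LangChain.js, assuming the `awaitAllCallbacks` helper exported from `@langchain/core/callbacks/promises` (which may differ from the snippet in the actual docs):

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { awaitAllCallbacks } from "@langchain/core/callbacks/promises";

const model = new ChatOpenAI({ model: "gpt-4o-mini" });

try {
  // Run the traced work with callbacks left in backgrounded mode.
  await model.invoke("What did we discuss in this morning's meeting?");
} finally {
  // Block until all pending callback handlers (including LangSmith trace
  // submissions) have settled before the process exits.
  await awaitAllCallbacks();
}
```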
10 changes: 8 additions & 2 deletions src/components/QuickStart.js
@@ -155,7 +155,7 @@
);
}

export function ConfigureSDKEnvironmentCodeTabs({}) {

[Check warning on line 158 in src/components/QuickStart.js — GitHub Actions / Check linting: Unexpected empty object pattern]
return (
<CodeTabs
tabs={[
@@ -170,7 +170,7 @@
);
}

export function ConfigureEnvironmentCodeTabs({}) {

[Check warning on line 173 in src/components/QuickStart.js — GitHub Actions / Check linting: Unexpected empty object pattern]
return (
<CodeTabs
tabs={[
@@ -185,7 +185,7 @@
);
}

export function LangChainQuickStartCodeTabs({}) {

[Check warning on line 188 in src/components/QuickStart.js — GitHub Actions / Check linting: Unexpected empty object pattern]
const simpleTSBlock = `import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";
@@ -203,7 +203,7 @@
const context = "During this morning's meeting, we solved all world conflict."
await chain.invoke({ question: question, context: context });`;

const alternativeTSBlock = `import { Client } from "langsmith";

[Check warning on line 206 in src/components/QuickStart.js — GitHub Actions / Check linting: 'alternativeTSBlock' is assigned a value but never used]
import { LangChainTracer } from "langchain/callbacks";

const client = new Client({
@@ -258,9 +258,15 @@
export LANGCHAIN_API_KEY=<your-api-key>
# The below examples use the OpenAI API, though it's not necessary in general
export OPENAI_API_KEY=<your-openai-api-key>`;
-const typescriptFootnote = `If you are using LangChain with LangSmith and are not in a serverless environment, we also suggest setting the following to reduce latency:
+const typescriptFootnote = `If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:

-\`export LANGCHAIN_CALLBACKS_BACKGROUND=true\``;
+\`export LANGCHAIN_CALLBACKS_BACKGROUND=true\`
+
+If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
+
+\`export LANGCHAIN_CALLBACKS_BACKGROUND=false\`
+
+See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.`;
return (
<CodeTabs
tabs={[
@@ -311,7 +317,7 @@
print(tok, end="")
# See an example run at: https://smith.langchain.com/public/3e853ad8-77ce-404d-ad4c-05726851ad0f/r`);

export function TraceableQuickStartCodeBlock({}) {

[Check warning on line 320 in src/components/QuickStart.js — GitHub Actions / Check linting: Unexpected empty object pattern]
return (
<CodeBlock
className={TraceableQuickStart.value}
@@ -322,7 +328,7 @@
);
}

export function TraceableThreadingCodeBlock({}) {

[Check warning on line 331 in src/components/QuickStart.js — GitHub Actions / Check linting: Unexpected empty object pattern]
return (
<CodeBlock
className={TraceableQuickStart.value}
@@ -396,7 +402,7 @@
);
}

export function RunTreeQuickStartCodeTabs({}) {

[Check warning on line 405 in src/components/QuickStart.js — GitHub Actions / Check linting: Unexpected empty object pattern]
return (
<CodeTabs
tabs={[