replace all LANGCHAIN_ flags with LANGSMITH_ flags (#616)
isahers1 authored Jan 22, 2025
2 parents cadfd36 + 4d299f6 commit e9ec569
Showing 22 changed files with 73 additions and 75 deletions.
@@ -34,12 +34,7 @@ The API key will be shown only once, so make sure to copy it and store it in a s

## Configure the SDK

- You may set the following environment variables in addition to `LANGCHAIN_API_KEY` (or equivalently `LANGSMITH_API_KEY`).
+ You may set the following environment variables in addition to `LANGSMITH_API_KEY`.
These are only required if using the EU instance.

- :::info
- `LANGCHAIN_HUB_API_URL` is only required if using the legacy langchainhub sdk
- :::

- `LANGCHAIN_ENDPOINT=`<RegionalUrl type='api' link={false} />
- `LANGCHAIN_HUB_API_URL=`<RegionalUrl type='hub' link={false} />
+ `LANGSMITH_ENDPOINT=`<RegionalUrl type='api' link={false} />
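For example, a process targeting the EU instance could set these variables from Python before constructing a client — a minimal sketch; the endpoint value below is an assumption, so substitute the regional URL rendered above:

```python
import os

from langsmith import Client

# Assumed EU endpoint, for illustration only; use the regional URL shown above.
os.environ["LANGSMITH_API_KEY"] = "<your-api-key>"
os.environ["LANGSMITH_ENDPOINT"] = "https://eu.api.smith.langchain.com"

client = Client()  # picks up the LANGSMITH_* variables from the environment
```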
@@ -174,10 +174,10 @@ import requests


def main():
- api_key = os.environ["LANGCHAIN_API_KEY"]
- # LANGCHAIN_ORGANIZATION_ID is not a standard environment variable in the SDK, just used for this example
- organization_id = os.environ["LANGCHAIN_ORGANIZATION_ID"]
- base_url = os.environ.get("LANGCHAIN_ENDPOINT") or "https://api.smith.langchain.com"
+ api_key = os.environ["LANGSMITH_API_KEY"]
+ # LANGSMITH_ORGANIZATION_ID is not a standard environment variable in the SDK, just used for this example
+ organization_id = os.environ["LANGSMITH_ORGANIZATION_ID"]
+ base_url = os.environ.get("LANGSMITH_ENDPOINT") or "https://api.smith.langchain.com"
headers = {
"Content-Type": "application/json",
"X-API-Key": api_key,
4 changes: 2 additions & 2 deletions docs/evaluation/how_to_guides/evaluate_with_attachments.mdx
@@ -48,7 +48,7 @@ png_url = "https://www.w3.org/Graphics/PNG/nurbcup2si.png"\n
pdf_bytes = requests.get(pdf_url).content
wav_bytes = requests.get(wav_url).content
png_bytes = requests.get(png_url).content\n
- # Define the LANGCHAIN_API_KEY environment variable with your API key
+ # Define the LANGSMITH_API_KEY environment variable with your API key
langsmith_client = Client()\n
dataset_name = "attachment-test-dataset:" + str(uuid.uuid4())[0:8]\n
dataset = langsmith_client.create_dataset(
@@ -104,7 +104,7 @@ if (!response.ok) {
const pdfArrayBuffer = await fetchArrayBuffer(pdfUrl);
const wavArrayBuffer = await fetchArrayBuffer(wavUrl);
const pngArrayBuffer = await fetchArrayBuffer(pngUrl);\n
- // Create the LangSmith client (Ensure LANGCHAIN_API_KEY is set in env)
+ // Create the LangSmith client (Ensure LANGSMITH_API_KEY is set in env)
const langsmithClient = new Client();\n
// Create a unique dataset name
const datasetName = "attachment-test-dataset:" + uuid4().substring(0, 8);\n
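Where setting the environment variable is not practical, the API key can also be passed directly when constructing the client — a minimal sketch using the `api_key` parameter of the Python SDK's `Client`:

```python
from langsmith import Client

# Equivalent to defining LANGSMITH_API_KEY in the environment.
langsmith_client = Client(api_key="<your-api-key>")
```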
@@ -195,7 +195,7 @@ body = {
resp = requests.post(
"https://api.smith.langchain.com/api/v1/datasets/upload-experiment",
json=body,
headers={"x-api-key": os.environ["LANGCHAIN_API_KEY"]}
headers={"x-api-key": os.environ["LANGSMITH_API_KEY"]}
)
print(resp.json())
```
4 changes: 2 additions & 2 deletions docs/evaluation/index.mdx
@@ -44,8 +44,8 @@ To create an API key head to the <RegionalUrl text='Settings page' suffix='/sett

<CodeTabs
tabs={[
- ShellBlock(`export LANGCHAIN_TRACING_V2=true
- export LANGCHAIN_API_KEY="<your-langchain-api-key>"
+ ShellBlock(`export LANGSMITH_TRACING=true
+ export LANGSMITH_API_KEY="<your-langchain-api-key>"
# The example uses OpenAI, but it's not necessary in general
export OPENAI_API_KEY="<your-openai-api-key>"`),
]}
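In a notebook, where shell exports are not available, the same configuration can be applied from Python — a minimal sketch:

```python
import getpass
import os

os.environ["LANGSMITH_TRACING"] = "true"
if not os.environ.get("LANGSMITH_API_KEY"):
    os.environ["LANGSMITH_API_KEY"] = getpass.getpass("LangSmith API key: ")

# The example uses OpenAI, but it's not necessary in general.
if not os.environ.get("OPENAI_API_KEY"):
    os.environ["OPENAI_API_KEY"] = getpass.getpass("OpenAI API key: ")
```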
4 changes: 2 additions & 2 deletions docs/evaluation/tutorials/agents.mdx
@@ -35,8 +35,8 @@ def _set_env(var: str) -> None:
if not os.environ.get(var):
os.environ[var] = getpass.getpass(f"Set {var}: ")

os.environ["LANGCHAIN_TRACING_V2"] = "true"
_set_env("LANGCHAIN_API_KEY")
os.environ["LANGSMITH_TRACING"] = "true"
_set_env("LANGSMITH_API_KEY")
_set_env("OPENAI_API_KEY")
#endregion
```
8 changes: 4 additions & 4 deletions docs/evaluation/tutorials/backtesting.mdx
@@ -45,10 +45,10 @@ import os

# Set the project name to whichever project you'd like to be testing against
project_name = "Tweet Writing Task"
os.environ["LANGCHAIN_PROJECT"] = project_name
os.environ["LANGCHAIN_TRACING_V2"] = "true"
if not os.environ.get("LANGCHAIN_API_KEY"):
os.environ["LANGCHAIN_API_KEY"] = getpass.getpass("YOUR API KEY")
os.environ["LANGSMITH_PROJECT"] = project_name
os.environ["LANGSMITH_TRACING"] = "true"
if not os.environ.get("LANGSMITH_API_KEY"):
os.environ["LANGSMITH_API_KEY"] = getpass.getpass("YOUR API KEY")

# Optional. You can swap OpenAI for any other tool-calling chat model.
os.environ["OPENAI_API_KEY"] = "YOUR OPENAI API KEY"
8 changes: 4 additions & 4 deletions docs/evaluation/tutorials/rag.mdx
@@ -47,13 +47,13 @@ First, let's set our environment variables:
python`
import os
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "YOUR LANGCHAIN API KEY"
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_API_KEY"] = "YOUR LANGSMITH API KEY"
os.environ["OPENAI_API_KEY"] = "YOUR OPENAI API KEY"
`,
typescript`
- process.env.LANGCHAIN_TRACING_V2 = "true";
- process.env.LANGCHAIN_API_KEY = "YOUR LANGCHAIN API KEY";
+ process.env.LANGSMITH_TRACING = "true";
+ process.env.LANGSMITH_API_KEY = "YOUR LANGSMITH API KEY";
process.env.OPENAI_API_KEY = "YOUR OPENAI API KEY";
`,
]}
@@ -30,7 +30,7 @@ messages = [
# highlight-next-line
# You can set metadata & tags **statically** when decorating a function
# Use the @traceable decorator with tags and metadata
- # Ensure that the LANGCHAIN_TRACING_V2 environment variables are set for @traceable to work
+ # Ensure that the LANGSMITH_TRACING environment variables are set for @traceable to work
@ls.traceable(
run_type="llm",
name="OpenAI Call Decorator",
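The decorator call above is truncated in this view; a self-contained sketch of the full pattern looks like the following (the tag and metadata values are illustrative, not the ones from the original file):

```python
import langsmith as ls

@ls.traceable(
    run_type="llm",
    name="OpenAI Call Decorator",
    tags=["my-tag"],                  # static tags, attached at decoration time
    metadata={"my-key": "my-value"},  # static metadata, attached at decoration time
)
def call_openai(messages: list) -> str:
    # Call your model here; the return value is recorded as the run output.
    return "fake response"

call_openai([{"role": "user", "content": "Hello!"}])
```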
10 changes: 5 additions & 5 deletions docs/observability/how_to_guides/tracing/annotate_code.mdx
@@ -26,9 +26,9 @@ If you are using LangChain (either Python or JS/TS), you can skip this section a
LangSmith makes it easy to log traces with minimal changes to your existing code with the `@traceable` decorator in Python and `traceable` function in TypeScript.

:::note
- The `LANGCHAIN_TRACING_V2` environment variable must be set to `'true'` in order for traces to be logged to LangSmith, even when using `@traceable` or `traceable`. This allows you to toggle tracing on and off without changing your code.
+ The `LANGSMITH_TRACING` environment variable must be set to `'true'` in order for traces to be logged to LangSmith, even when using `@traceable` or `traceable`. This allows you to toggle tracing on and off without changing your code.

- Additionally, you will need to set the `LANGCHAIN_API_KEY` environment variable to your API key (see [Setup](/) for more information).
+ Additionally, you will need to set the `LANGSMITH_API_KEY` environment variable to your API key (see [Setup](/) for more information).

By default, the traces will be logged to a project named `default`.
To log traces to a different project, see [this section](./log_traces_to_project).
@@ -172,9 +172,9 @@ The wrapper works seamlessly with the `@traceable` decorator or `traceable` func
Tool calls are automatically rendered

:::note
- The `LANGCHAIN_TRACING_V2` environment variable must be set to `'true'` in order for traces to be logged to LangSmith, even when using `wrap_openai` or `wrapOpenAI`. This allows you to toggle tracing on and off without changing your code.
+ The `LANGSMITH_TRACING` environment variable must be set to `'true'` in order for traces to be logged to LangSmith, even when using `wrap_openai` or `wrapOpenAI`. This allows you to toggle tracing on and off without changing your code.

- Additionally, you will need to set the `LANGCHAIN_API_KEY` environment variable to your API key (see [Setup](/) for more information).
+ Additionally, you will need to set the `LANGSMITH_API_KEY` environment variable to your API key (see [Setup](/) for more information).

By default, the traces will be logged to a project named `default`.
To log traces to a different project, see [this section](./log_traces_to_project).
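For reference, a minimal wrapped-client sketch (the model name is illustrative):

```python
from openai import OpenAI
from langsmith.wrappers import wrap_openai

# Wrapping is the only change needed; traces are sent when LANGSMITH_TRACING
# is 'true' and LANGSMITH_API_KEY is set.
client = wrap_openai(OpenAI())
client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
```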
@@ -232,7 +232,7 @@ await chatPipeline("Can you summarize this morning's meetings?");`),
## Use the `RunTree` API

Another, more explicit way to log traces to LangSmith is via the `RunTree` API. This API allows you more control over your tracing - you can manually
- create runs and children runs to assemble your trace. You still need to set your `LANGCHAIN_API_KEY`, but `LANGCHAIN_TRACING_V2` is not
+ create runs and children runs to assemble your trace. You still need to set your `LANGSMITH_API_KEY`, but `LANGSMITH_TRACING` is not
necessary for this method.

This method is not recommended, as it's easier to make mistakes in propagating trace context.
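A minimal sketch of the `RunTree` pattern (run names and payloads here are illustrative):

```python
from langsmith.run_trees import RunTree

# Only LANGSMITH_API_KEY is required for this method.
parent = RunTree(name="My Chain", run_type="chain", inputs={"text": "Hello"})
parent.post()  # create the parent run in LangSmith

child = parent.create_child(name="My LLM Call", run_type="llm", inputs={"prompt": "Hello"})
child.post()
child.end(outputs={"generation": "Hi there"})
child.patch()  # upload the child's outputs

parent.end(outputs={"output": "Hi there"})
parent.patch()
```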
@@ -16,20 +16,24 @@ You can change the destination project of your traces both statically through en

## Set the destination project statically

- As mentioned in the [Tracing Concepts](/observability/concepts#projects) section, LangSmith uses the concept of a `Project` to group traces. If left unspecified, the project is set to `default`. You can set the `LANGCHAIN_PROJECT` environment variable to configure a custom project name for an entire application run. This should be done before executing your application.
+ As mentioned in the [Tracing Concepts](/observability/concepts#projects) section, LangSmith uses the concept of a `Project` to group traces. If left unspecified, the project is set to `default`. You can set the `LANGSMITH_PROJECT` environment variable to configure a custom project name for an entire application run. This should be done before executing your application.

```bash
- export LANGCHAIN_PROJECT=my-custom-project
+ export LANGSMITH_PROJECT=my-custom-project
```

+ :::warning SDK compatibility in JS
+ The `LANGSMITH_PROJECT` flag is only supported in JS SDK versions >= 0.2.16, use `LANGCHAIN_PROJECT` instead if you are using an older version.
+ :::

If the project specified does not exist, it will be created automatically when the first trace is ingested.

## Set the destination project dynamically

You can also set the project name at program runtime in various ways, depending on how you are [annotating your code for tracing](./annotate_code). This is useful when you want to log traces to different projects within the same application.

:::note
- Setting the project name dynamically using one of the below methods overrides the project name set by the `LANGCHAIN_PROJECT` environment variable.
+ Setting the project name dynamically using one of the below methods overrides the project name set by the `LANGSMITH_PROJECT` environment variable.
:::

<CodeTabs
@@ -43,7 +47,7 @@ messages = [
{"role": "user", "content": "Hello!"}
]\n
# Use the @traceable decorator with the 'project_name' parameter to log traces to LangSmith
- # Ensure that the LANGCHAIN_TRACING_V2 environment variables is set for @traceable to work
+ # Ensure that the LANGSMITH_TRACING environment variables is set for @traceable to work
@traceable(
run_type="llm",
name="OpenAI Call Decorator",
@@ -68,7 +72,7 @@ call_openai(
)\n
# The wrapped OpenAI client accepts all the same langsmith_extra parameters
# as @traceable decorated functions, and logs traces to LangSmith automatically.
- # Ensure that the LANGCHAIN_TRACING_V2 environment variables is set for the wrapper to work.
+ # Ensure that the LANGSMITH_TRACING environment variables is set for the wrapper to work.
from langsmith import wrappers
wrapped_client = wrappers.wrap_openai(client)
wrapped_client.chat.completions.create(
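The call above is cut off in this view; a self-contained sketch of a per-invocation project override (the model and project names are illustrative):

```python
from openai import OpenAI
from langsmith import wrappers

wrapped_client = wrappers.wrap_openai(OpenAI())
wrapped_client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    # Routed to "My Project" instead of the project from LANGSMITH_PROJECT.
    langsmith_extra={"project_name": "My Project"},
)
```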
@@ -17,8 +17,8 @@ In some situations, you may need to prevent the inputs and outputs of your trace
If you want to completely hide the inputs and outputs of your traces, you can set the following environment variables when running your application:

```bash
- LANGCHAIN_HIDE_INPUTS=true
- LANGCHAIN_HIDE_OUTPUTS=true
+ LANGSMITH_HIDE_INPUTS=true
+ LANGSMITH_HIDE_OUTPUTS=true
```

This works for both the LangSmith SDK (Python and TypeScript) and LangChain.
@@ -98,7 +98,7 @@ This feature is available in the following LangSmith SDK versions:

To mask specific data in inputs and outputs, you can use the `create_anonymizer` / `createAnonymizer` function and pass the newly created anonymizer when instantiating the client. The anonymizer can be either constructed from a list of regex patterns and the replacement values or from a function that accepts and returns a string value.

- The anonymizer will be skipped for inputs if `LANGCHAIN_HIDE_INPUTS = true`. Same applies for outputs if `LANGCHAIN_HIDE_OUTPUTS = true`.
+ The anonymizer will be skipped for inputs if `LANGSMITH_HIDE_INPUTS = true`. Same applies for outputs if `LANGSMITH_HIDE_OUTPUTS = true`.

However, if inputs or outputs are to be sent to client, the `anonymizer` method will take precedence over functions found in `hide_inputs` and `hide_outputs`. By default, the `create_anonymizer` will only look at maximum of 10 nesting levels deep, which can be configured via the `max_depth` parameter.
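For instance, a client that scrubs email addresses might look like the following sketch — this assumes the rule format of `pattern`/`replace` pairs and that the `Client` constructor accepts an `anonymizer` argument:

```python
from langsmith import Client
from langsmith.anonymizer import create_anonymizer

# Each rule pairs a regex pattern with its replacement text (assumed format).
anonymizer = create_anonymizer(
    [{"pattern": r"[\w.+-]+@[\w-]+\.[\w.]+", "replace": "<email-address>"}]
)
client = Client(anonymizer=anonymizer)
```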

4 changes: 2 additions & 2 deletions docs/observability/how_to_guides/tracing/sample_traces.mdx
@@ -9,12 +9,12 @@ This section is relevant for those using the LangSmith SDK or LangChain, not for
:::

By default, all traces are logged to LangSmith.
- To down-sample the number of traces logged to LangSmith, set the `LANGCHAIN_TRACING_SAMPLING_RATE` environment variable to
+ To down-sample the number of traces logged to LangSmith, set the `LANGSMITH_TRACING_SAMPLING_RATE` environment variable to
any float between `0` (no traces) and `1` (all traces).
For instance, setting the following environment variable will log 75% of the traces.

```bash
- export LANGCHAIN_TRACING_SAMPLING_RATE=0.75
+ export LANGSMITH_TRACING_SAMPLING_RATE=0.75
```

This works for the `traceable` decorator and `RunTree` objects.
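The variable can also be set from Python, provided it is in place before any traced code runs — a minimal sketch:

```python
import os

# Keep roughly 75% of traces; set this before tracing starts.
os.environ["LANGSMITH_TRACING_SAMPLING_RATE"] = "0.75"
os.environ["LANGSMITH_TRACING"] = "true"

from langsmith import traceable

@traceable
def format_prompt(subject: str) -> str:
    return f"Tell me a joke about {subject}"

format_prompt("cats")  # only about 75% of such root runs are kept
```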
@@ -14,5 +14,5 @@ This section is only relevant for users who are

:::

- If you've decided you no longer want to trace your runs, you can unset the `LANGCHAIN_TRACING_V2` environment variable. Traces will no longer be logged to LangSmith.
+ If you've decided you no longer want to trace your runs, you can unset the `LANGSMITH_TRACING` environment variable. Traces will no longer be logged to LangSmith.
Note that this currently does not affect the `RunTree` objects or API users, as these are meant to be low-level and not affected by the tracing toggle.
@@ -9,7 +9,7 @@ We provide a convenient integration with [Instructor](https://jxnl.github.io/ins
In order to use, you first need to set your LangSmith API key.

```shell
- export LANGCHAIN_API_KEY=<your-api-key>
+ export LANGSMITH_API_KEY=<your-api-key>
```

Next, you will need to install the LangSmith SDK:
18 changes: 11 additions & 7 deletions docs/observability/how_to_guides/tracing/trace_with_langchain.mdx
@@ -64,7 +64,7 @@ chain.invoke({"question": "Am I using a callback?", "context": "I'm using a call
from langchain_core.tracers.context import tracing_v2_enabled
with tracing_v2_enabled():
chain.invoke({"question": "Am I using a context manager?", "context": "I'm using a context manager"})\n
- # This will NOT be traced (assuming LANGCHAIN_TRACING_V2 is not set)
+ # This will NOT be traced (assuming LANGSMITH_TRACING is not set)
chain.invoke({"question": "Am I being traced?", "context": "I'm not being traced"})`),
TypeScriptBlock(`// You can configure a LangChainTracer instance to trace a specific invocation.
import { LangChainTracer } from "@langchain/core/tracers/tracer_langchain";\n
@@ -84,12 +84,16 @@ await chain.invoke(

### Statically

- As mentioned in the [tracing conceptual guide](../../concepts) LangSmith uses the concept of a Project to group traces. If left unspecified, the tracer project is set to default. You can set the `LANGCHAIN_PROJECT` environment variable to configure a custom project name for an entire application run. This should be done before executing your application.
+ As mentioned in the [tracing conceptual guide](../../concepts) LangSmith uses the concept of a Project to group traces. If left unspecified, the tracer project is set to default. You can set the `LANGSMITH_PROJECT` environment variable to configure a custom project name for an entire application run. This should be done before executing your application.

```shell
- export LANGCHAIN_PROJECT=my-project
+ export LANGSMITH_PROJECT=my-project
```

+ :::warning SDK compatibility in JS
+ The `LANGSMITH_PROJECT` flag is only supported in JS SDK versions >= 0.2.16, use `LANGCHAIN_PROJECT` instead if you are using an older version.
+ :::

### Dynamically

This largely builds off of the [previous section](#trace-selectively) and allows you to set the project name for a specific `LangChainTracer` instance or as parameters to the `tracing_v2_enabled` context manager in Python.
@@ -317,10 +321,10 @@ try {

As mentioned in other guides, the following environment variables allow you to configure tracing enabled, the api endpoint, the api key, and the tracing project:

- - `LANGCHAIN_TRACING_V2`
- - `LANGCHAIN_API_KEY`
- - `LANGCHAIN_ENDPOINT`
- - `LANGCHAIN_PROJECT`
+ - `LANGSMITH_TRACING`
+ - `LANGSMITH_API_KEY`
+ - `LANGSMITH_ENDPOINT`
+ - `LANGSMITH_PROJECT`

However, in some environments, it is not possible to set environment variables. In these cases, you can set the tracing configuration programmatically.
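For example, a tracer configured entirely in code might look like the following sketch (this assumes `LangChainTracer` accepts `client` and `project_name` keyword arguments; the endpoint and key shown are placeholders):

```python
from langchain_core.tracers import LangChainTracer
from langsmith import Client

tracer = LangChainTracer(
    client=Client(
        api_url="https://api.smith.langchain.com",  # or your regional endpoint
        api_key="<your-api-key>",
    ),
    project_name="my-project",
)

# Pass the tracer per invocation, e.g.:
# chain.invoke({"question": "..."}, config={"callbacks": [tracer]})
```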

@@ -204,7 +204,7 @@ import { generateText } from "ai";

interface Env {
OPENAI_API_KEY: string;
- LANGSMITH_TRACING_V2: string;
+ LANGSMITH_TRACING: string;
LANGSMITH_ENDPOINT: string;
LANGSMITH_API_KEY: string;
}
@@ -218,9 +218,9 @@ const handler = {
model,
prompt: "Tell me a joke",
experimental_telemetry: AISDKExporter.getSettings({
- // As `process.env.LANGSMITH_TRACING_V2` is undefined in Cloudflare Workers,
+ // As `process.env.LANGSMITH_TRACING` is undefined in Cloudflare Workers,
// we need to check the environment variable directly.
- isEnabled: env.LANGSMITH_TRACING_V2 === "true",
+ isEnabled: env.LANGSMITH_TRACING === "true",
}),
});

@@ -14,16 +14,16 @@ import { RegionalUrl } from "@site/src/components/RegionalUrls";

As mentioned in other guides, the following environment variables allow you to configure tracing enabled, the api endpoint, the api key, and the tracing project:

- - `LANGCHAIN_TRACING_V2`
- - `LANGCHAIN_API_KEY`
- - `LANGCHAIN_ENDPOINT`
- - `LANGCHAIN_PROJECT`
+ - `LANGSMITH_TRACING`
+ - `LANGSMITH_API_KEY`
+ - `LANGSMITH_ENDPOINT`
+ - `LANGSMITH_PROJECT`

In some environments, it is not possible to set environment variables. In these cases, you can set the tracing configuration programmatically.

:::caution Recently changed behavior
Due to a number of asks for finer-grained control of tracing using the `trace` context manager,
- **we changed the behavior** of `with trace` to honor the `LANGCHAIN_TRACING_V2` environment variable in version **0.1.95** of the Python SDK. You can find more details in the [release notes](https://github.com/langchain-ai/langsmith-sdk/releases/tag/v0.1.95).
+ **we changed the behavior** of `with trace` to honor the `LANGSMITH_TRACING` environment variable in version **0.1.95** of the Python SDK. You can find more details in the [release notes](https://github.com/langchain-ai/langsmith-sdk/releases/tag/v0.1.95).
The recommended way to disable/enable tracing without setting environment variables is to use the `with tracing_context` context manager, as shown in the example below.
:::
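A minimal sketch of that pattern (assuming `tracing_context` is exported from the top-level `langsmith` package):

```python
import langsmith as ls

with ls.tracing_context(enabled=True):
    # Code here is traced regardless of the LANGSMITH_TRACING variable.
    pass

with ls.tracing_context(enabled=False):
    # Tracing is suppressed for anything executed in this block.
    pass
```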

10 changes: 7 additions & 3 deletions docs/observability/tutorials/observability.mdx
@@ -115,11 +115,15 @@ Next, install the LangSmith SDK:
Finally, set up the appropriate environment variables. This will log traces to the `default` project (though you can easily change that).

```shell
- export LANGCHAIN_TRACING_V2=true
- export LANGCHAIN_API_KEY=<your-api-key>
- export LANGCHAIN_PROJECT=default
+ export LANGSMITH_TRACING=true
+ export LANGSMITH_API_KEY=<your-api-key>
+ export LANGSMITH_PROJECT=default
```

+ :::warning SDK compatibility in JS
+ The `LANGSMITH_PROJECT` flag is only supported in JS SDK versions >= 0.2.16, use `LANGCHAIN_PROJECT` instead if you are using an older version.
+ :::

### Trace your LLM calls

The first thing you might want to trace is all your OpenAI calls.
@@ -39,14 +39,14 @@ In TypeScript, you must use the LangChain npm package for pulling prompts (it al

## Configure environment variables

- If you already have `LANGCHAIN_API_KEY` set to your current workspace's api key from LangSmith, you can skip this step.
+ If you already have `LANGSMITH_API_KEY` set to your current workspace's api key from LangSmith, you can skip this step.

Otherwise, get an API key for your workspace by navigating to `Settings > API Keys > Create API Key` in LangSmith.

Set your environment variable.

```bash
export LANGCHAIN_API_KEY="lsv2_..."
export LANGSMITH_API_KEY="lsv2_..."
```
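With the key in place, prompts can be pulled programmatically — a minimal sketch assuming the Python SDK's `pull_prompt` helper; the prompt name is illustrative:

```python
from langsmith import Client

client = Client()  # reads LANGSMITH_API_KEY from the environment
prompt = client.pull_prompt("joke-generator")
print(prompt)
```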

:::note Terminology