import * as Sentry from '@sentry/node';
import { ChatOpenAI } from '@langchain/openai';
Sentry.init({
  integrations: [Sentry.langChainIntegration()],
  sendDefaultPii: true, // Enable to record inputs/outputs
});
// LangChain calls are automatically instrumented
const model = new ChatOpenAI();
await model.invoke("What is the capital of France?");
You can also manually add the Sentry callback handler alongside other callbacks:
import * as Sentry from '@sentry/node';
import { ChatOpenAI } from '@langchain/openai';
const sentryHandler = Sentry.createLangChainCallbackHandler({
  recordInputs: true,
  recordOutputs: true,
});
const model = new ChatOpenAI();
await model.invoke(
  "What is the capital of France?",
  { callbacks: [sentryHandler, myOtherCallback] }
);
Options:

recordInputs: Whether to record input messages/prompts (default: respects the sendDefaultPii client option)
recordOutputs: Whether to record response text (default: respects the sendDefaultPii client option)

By default, the integration will record inputs and outputs only when sendDefaultPii is set to true in your Sentry client options. You can override this behavior per integration:

// Record inputs and outputs even when sendDefaultPii is false
Sentry.init({
  integrations: [
    Sentry.langChainIntegration({
      recordInputs: true,
      recordOutputs: true,
    }),
  ],
});
// Never record inputs/outputs regardless of sendDefaultPii
Sentry.init({
  sendDefaultPii: true,
  integrations: [
    Sentry.langChainIntegration({
      recordInputs: false,
      recordOutputs: false,
    }),
  ],
});
The integration captures LangChain lifecycle events automatically.
Adds Sentry tracing instrumentation for LangChain.
This integration is enabled by default.
When configured, this integration automatically instruments LangChain runnable instances to capture telemetry data by injecting Sentry callback handlers into all LangChain calls.
Important: To prevent duplicate spans, this integration automatically skips wrapping the OpenAI, Anthropic, and Google GenAI providers when they are used through LangChain; the LangChain integration handles instrumentation for all underlying AI providers.
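As a sketch of what this means in practice (assuming the provider integrations such as openAIIntegration exported by @sentry/node), you can register a provider integration alongside the LangChain integration without worrying about double instrumentation:

```javascript
import * as Sentry from '@sentry/node';

Sentry.init({
  // Both integrations can be registered together: calls routed through
  // LangChain are instrumented once by langChainIntegration, while direct
  // provider calls made outside LangChain are still covered by the
  // provider integration.
  integrations: [
    Sentry.openAIIntegration(),
    Sentry.langChainIntegration(),
  ],
});
```

This is a configuration sketch, not a required setup; if you only ever call providers through LangChain, the LangChain integration alone is sufficient.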