Metadata
Traces and observations (see Langfuse Data Model) can be enriched with metadata to better understand your users, application, and experiments. Metadata can be added to traces in the form of arbitrary JSON.
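Metadata is not limited to flat key-value pairs; any JSON-serializable structure works. A minimal sketch using the low-level Python SDK (the keys shown are illustrative, not a required schema):
from langfuse import Langfuse

langfuse = Langfuse()

# Metadata accepts arbitrary JSON: nested objects, arrays, numbers, booleans.
# All keys below are illustrative examples, not a required schema.
trace = langfuse.trace(
    metadata={
        "environment": "staging",
        "experiment": {"name": "prompt-v2", "variant": 3},
        "feature_flags": ["new-retriever", "rerank"],
    }
)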
When using the @observe() decorator:
from langfuse.decorators import langfuse_context, observe

@observe()
def fn():
    langfuse_context.update_current_trace(
        metadata={"key":"value"}
    )

fn()
When using the low-level SDK:
from langfuse import Langfuse

langfuse = Langfuse()

trace = langfuse.trace(
    metadata={"key":"value"}
)
When using the JS/TS SDK:
import { Langfuse } from "langfuse";

const langfuse = new Langfuse();

const trace = langfuse.trace({
  metadata: { key: "value" },
});
See the JS/TS SDK docs for more details.
When using the OpenAI SDK Integration, pass metadata as an additional argument:
from langfuse.openai import openai

completion = openai.chat.completions.create(
    name="test-chat",
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a calculator."},
        {"role": "user", "content": "1 + 1 = "},
    ],
    temperature=0,
    # add metadata as additional argument
    metadata={"key":"value"},
)
When using the integration with the @observe() decorator (see interop docs), set metadata via the langfuse_context:
from langfuse.decorators import langfuse_context, observe
from langfuse.openai import openai

@observe()
def fn():
    langfuse_context.update_current_trace(
        metadata={"key":"value"}
    )

    completion = openai.chat.completions.create(
        name="test-chat",
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are a calculator."},
            {"role": "user", "content": "1 + 1 = "},
        ],
        temperature=0,
    )

fn()
When using the OpenAI SDK Integration (JS), pass metadata as an additional argument:
import OpenAI from "openai";
import { observeOpenAI } from "langfuse";

const res = await observeOpenAI(new OpenAI(), {
  metadata: { someMetadataKey: "someValue" },
}).chat.completions.create({
  messages: [{ role: "system", content: "Tell me a story about a dog." }],
  model: "gpt-3.5-turbo",
  max_tokens: 300,
});
When using the Langchain CallbackHandler (Python), you can pass metadata as a keyword argument:
from langfuse.callback import CallbackHandler

handler = CallbackHandler(
    metadata={"key":"value"}
)
When using the integration with the @observe() decorator (see interop docs), set metadata via the langfuse_context:
from langfuse.decorators import langfuse_context, observe

@observe()
def fn():
    langfuse_context.update_current_trace(
        metadata={"key":"value"}
    )

    langfuse_handler = langfuse_context.get_current_langchain_handler()

    # Pass handler to invoke of your langchain chain/agent
    chain.invoke({"person": person}, config={"callbacks": [langfuse_handler]})

fn()
When using the Langchain CallbackHandler (JS), you can pass metadata to the constructor:
import { CallbackHandler } from "langfuse-langchain";

const handler = new CallbackHandler({
  metadata: { key: "value" },
});
When using the integration with the JS SDK (see interop docs), set metadata via langfuse.trace():
import { CallbackHandler, Langfuse } from "langfuse-langchain";

const langfuse = new Langfuse();

const trace = langfuse.trace({
  metadata: { key: "value" },
});
const langfuseHandler = new CallbackHandler({ root: trace });

// Add Langfuse handler as callback to your langchain chain/agent
await chain.invoke({ input: "<user_input>" }, { callbacks: [langfuseHandler] });
When using the LlamaIndex Integration, set the metadata via the instrumentor.observe() context manager:
from langfuse.llama_index import LlamaIndexInstrumentor

instrumentor = LlamaIndexInstrumentor()

with instrumentor.observe(metadata={"key":"value"}):
    # ... your LlamaIndex index creation ...
    index.as_query_engine().query("What is the capital of France?")

instrumentor.flush()
When using the integration with the @observe() decorator (see interop docs), set the metadata via the langfuse_context:
from langfuse.decorators import langfuse_context, observe
from langfuse.llama_index import LlamaIndexInstrumentor
from llama_index.core import VectorStoreIndex

instrumentor = LlamaIndexInstrumentor()

@observe()
def llama_index_fn(question: str):
    # Update context
    langfuse_context.update_current_trace(metadata={"key":"value"})

    # Get IDs
    current_trace_id = langfuse_context.get_current_trace_id()
    current_observation_id = langfuse_context.get_current_observation_id()

    # Pass to instrumentor
    with instrumentor.observe(
        trace_id=current_trace_id,
        parent_observation_id=current_observation_id,
        update_parent=False
    ) as trace:
        # Run application (your LlamaIndex index creation)
        index = VectorStoreIndex.from_documents([doc1, doc2])
        response = index.as_query_engine().query(question)

    return response
When using the (deprecated) LlamaIndex Callback Integration, set the metadata via set_trace_params. All LlamaIndex traces created after set_trace_params will include the metadata. Learn more about set_trace_params here.
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager
from langfuse.llama_index import LlamaIndexCallbackHandler

# Instantiate a new LlamaIndexCallbackHandler and register it in the LlamaIndex Settings
langfuse_callback_handler = LlamaIndexCallbackHandler()
Settings.callback_manager = CallbackManager([langfuse_callback_handler])

langfuse_callback_handler.set_trace_params(
    metadata={"key":"value"}
)
When using the integration with the @observe() decorator (see interop docs), set the metadata via the langfuse_context:
from langfuse.decorators import langfuse_context, observe
from llama_index.core import Document, VectorStoreIndex
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager

@observe()
def llama_index_fn(question: str):
    langfuse_context.update_current_trace(
        metadata={"key":"value"}
    )

    # Set callback manager for LlamaIndex, will apply to all LlamaIndex executions in this function
    langfuse_handler = langfuse_context.get_current_llama_index_handler()
    Settings.callback_manager = CallbackManager([langfuse_handler])

    # Run application
    index = VectorStoreIndex.from_documents([doc1, doc2])
    response = index.as_query_engine().query(question)
    return response
When using Flowise, you can set the metadata via the override configs; see the Flowise Integration docs for more details.
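A rough sketch of what this can look like when calling a chatflow through the Flowise prediction API. The URL, chatflow ID, and the exact override key for metadata below are placeholders; check the Flowise Integration docs for the keys your Flowise version expects:
import requests

# Hypothetical values: replace with your Flowise URL and chatflow ID.
FLOWISE_URL = "http://localhost:3000"
CHATFLOW_ID = "<chatflow-id>"

response = requests.post(
    f"{FLOWISE_URL}/api/v1/prediction/{CHATFLOW_ID}",
    json={
        "question": "1 + 1 = ",
        # The override key name for Langfuse metadata may differ by
        # Flowise version; consult the Flowise Integration docs.
        "overrideConfig": {
            "metadata": {"key": "value"},
        },
    },
)
print(response.json())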