release notes
Published 9/2/2025
Safe upgrade

When interacting with the Responses API, langchain-openai now stores response items in message content by default. This behavior was previously opt-in, enabled by specifying output_version="responses/v1" when instantiating ChatOpenAI. The change resolves BadRequestErrors that could arise in some multi-turn contexts.
To restore previous behavior, set the LC_OUTPUT_VERSION environment variable to v0, or specify output_version="v0" when instantiating ChatOpenAI:
import os

os.environ["LC_OUTPUT_VERSION"] = "v0"

# or

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="...", output_version="v0")
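As a sketch of how the override might resolve, here is a small stdlib-only illustration. The helper function and the precedence rule (an explicit constructor argument winning over the LC_OUTPUT_VERSION environment variable) are assumptions for illustration, not langchain-openai internals:

```python
import os


def resolve_output_version(explicit=None):
    """Hypothetical resolver: prefer an explicit output_version argument,
    then the LC_OUTPUT_VERSION environment variable, then the new
    "responses/v1" default."""
    if explicit is not None:
        return explicit
    return os.environ.get("LC_OUTPUT_VERSION", "responses/v1")


# With the environment variable set, the legacy format is restored.
os.environ["LC_OUTPUT_VERSION"] = "v0"
print(resolve_output_version())                  # "v0"

# An explicit argument takes precedence (assumed behavior).
print(resolve_output_version("responses/v1"))    # "responses/v1"
```

Unsetting the variable and passing no argument would yield the new "responses/v1" default under these assumptions.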