lockfiles updated
aabmass committed Feb 8, 2025
1 parent e8759c5 commit 15733aa
Showing 10 changed files with 2,437 additions and 444 deletions.
@@ -1,33 +1,32 @@
-This sample contains part of the LangGraph chatbot demo taken from
-https://python.langchain.com/docs/tutorials/chatbot, running with OTel instrumentation. It
-sends traces and logs to the OTel collector which sends them to GCP. Docker compose wraps
-everything to make it easy to run.
+This sample contains a Streamlit + LangGraph chatbot demo. It sends traces and logs to GCP
+using the OTLP exporter and the opentelemetry-exporter-gcp-logging exporter.

The `run_streamlit.py` script lets you easily run the sample with auto instrumentation
enabled, sending telemetry to GCP. It just sets some environment variables and runs the app
with `opentelemetry-instrument`.
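The defaulting behavior is simple: each OTel variable is only applied when you haven't already set it yourself, so your own environment always wins. A minimal sketch of that pattern (the variable name below is the one this sample uses):

```python
import os

def setenv_default(key: str, value: str) -> None:
    # Only apply the sample's default when the user hasn't configured
    # the variable themselves.
    if key not in os.environ:
        os.environ[key] = value

os.environ.pop("OTEL_SERVICE_NAME", None)  # start clean for the demo
setenv_default("OTEL_SERVICE_NAME", "langgraph-chatbot-demo")
setenv_default("OTEL_SERVICE_NAME", "ignored")  # existing value wins
print(os.environ["OTEL_SERVICE_NAME"])  # → langgraph-chatbot-demo
```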

## Running the example

-I recommend running in Cloud Shell, it's super simple. You will see GenAI spans in trace
-explorer right away. Make sure the Vertex and Trace APIs are enabled in the project.
+First, make sure you have `uv` installed: https://docs.astral.sh/uv/getting-started/installation/.

Optionally, set a project with `export GOOGLE_CLOUD_PROJECT=...`. The app respects Application Default Credentials (ADC).
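If you haven't configured ADC on this machine yet, the usual setup looks like the following (`my-project-id` is a placeholder for your own project):

```shell
# Create Application Default Credentials from your user account
gcloud auth application-default login
# Optionally pin the project the app and exporters should use
export GOOGLE_CLOUD_PROJECT=my-project-id
```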

-### Cloud Shell or GCE
+### Without cloning

```sh
-git clone --branch=vertex-langgraph https://github.com/aabmass/opentelemetry-python-contrib.git
-cd opentelemetry-python-contrib/instrumentation-genai/opentelemetry-instrumentation-vertexai/examples/langgraph-chatbot-demo
-docker compose up --build --abort-on-container-exit
+uv run --upgrade https://raw.githubusercontent.com/aabmass/opentelemetry-python-contrib/refs/heads/vertex-langgraph/instrumentation-genai/opentelemetry-instrumentation-vertexai/examples/langgraph-chatbot-demo/run_streamlit.py
```

-### Locally with Application Default Credentials
+### With cloned repo

```sh
git clone --branch=vertex-langgraph https://github.com/aabmass/opentelemetry-python-contrib.git
cd opentelemetry-python-contrib/instrumentation-genai/opentelemetry-instrumentation-vertexai/examples/langgraph-chatbot-demo
uv run run_streamlit.py
```

### Without auto instrumentation

-# Export the credentials to `GOOGLE_APPLICATION_CREDENTIALS` environment variable so it is
-# available inside the docker containers
-export GOOGLE_APPLICATION_CREDENTIALS=$HOME/.config/gcloud/application_default_credentials.json
-# Lets collector read mounted config
-export USERID="$(id -u)"
-# Specify the project ID
-export GOOGLE_CLOUD_PROJECT=<your project id>
-docker compose up --build --abort-on-container-exit
```sh
uv run streamlit run src/langgraph_chatbot_demo/langchain_history.py
```
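When auto instrumentation is enabled, the wrapper authenticates the OTLP exporter to `telemetry.googleapis.com` by forwarding an ADC access token and the quota project as gRPC headers. The header string it builds has this shape (the token and project values below are placeholders; the real script obtains them from `google.auth.default()`):

```python
# Placeholder values standing in for real ADC credentials
token = "ya29.placeholder-token"
project_id = "my-project"

# Same format the sample passes via OTEL_EXPORTER_OTLP_HEADERS
otlp_headers = f"authorization=Bearer {token},x-goog-user-project={project_id}"
print(otlp_headers)
# → authorization=Bearer ya29.placeholder-token,x-goog-user-project=my-project
```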

This file was deleted.

This file was deleted.

@@ -11,7 +11,7 @@ dependencies = [
"langchain-google-vertexai>=2.0.7",
"langgraph>0.2.27",
"opentelemetry-distro>=0.50b0",
-"opentelemetry-exporter-gcp-logging",
+"opentelemetry-exporter-gcp-logging>=1.9.0a0",
"opentelemetry-exporter-gcp-trace>=1.8.0",
"opentelemetry-exporter-otlp-proto-grpc>=1.29.0",
"opentelemetry-instrumentation-aiohttp-client>=0.50b0",
@@ -24,8 +24,7 @@ dependencies = [
]

[tool.uv.sources]
-opentelemetry-instrumentation-vertexai = { git = "https://github.com/aabmass/opentelemetry-python-contrib.git", subdirectory = "instrumentation-genai/opentelemetry-instrumentation-vertexai", branch = "vertex-langgraph" }
-opentelemetry-exporter-gcp-logging = { git = "https://github.com/DylanRussell/opentelemetry-operations-python.git", subdirectory = "opentelemetry-exporter-gcp-logging", branch = "logging_exporter" }
+opentelemetry-instrumentation-vertexai = { path = "../../" }

[dependency-groups]
dev = ["ipython>=8.18.1", "ruff>=0.9.2"]
@@ -1,55 +1,14 @@
# /// script
# requires-python = ">=3.13"
# dependencies = [
# "langgraph-chatbot-demo",
# ]
#
# [tool.uv.sources]
-# langgraph-chatbot-demo = { path = "." }
+# langgraph-chatbot-demo = { git = "https://github.com/aabmass/opentelemetry-python-contrib.git", subdirectory = "instrumentation-genai/opentelemetry-instrumentation-vertexai/examples/langgraph-chatbot-demo", branch = "vertex-langgraph" }
#
# ///

-import os
-import importlib.util
-import subprocess
-
-import google.auth
-import google.auth.transport
-import google.auth.transport.requests
-
-creds, project_id = google.auth.default()
-creds.refresh(google.auth.transport.requests.Request())
-
-
-def setenv_default(k: str, v: str) -> None:
-    if k not in os.environ:
-        os.environ[k] = v
-
-
-setenv_default(
-    "OTEL_EXPORTER_OTLP_ENDPOINT", "https://telemetry.googleapis.com:443"
-)
-setenv_default("OTEL_SERVICE_NAME", "langgraph-chatbot-demo")
-setenv_default("OTEL_PYTHON_LOGGING_AUTO_INSTRUMENTATION_ENABLED", "true")
-setenv_default("OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT", "true")
-setenv_default("OTEL_LOGS_EXPORTER", "gcp_logging")
-setenv_default("OTEL_RESOURCE_ATTRIBUTES", f"gcp.project_id={project_id}")
-setenv_default(
-    "OTEL_EXPORTER_OTLP_HEADERS",
-    f"authorization=Bearer {creds.token},x-goog-user-project={project_id}",
-)
-
-langchain_app_spec = importlib.util.find_spec(
-    "langgraph_chatbot_demo.langchain_history"
-)
-if not (langchain_app_spec and langchain_app_spec.origin):
-    raise Exception("Could not find langchain_history.py")
+from langgraph_chatbot_demo.run_streamlit import run_streamlit

-subprocess.run(
-    [
-        "opentelemetry-instrument",
-        "streamlit",
-        "run",
-        langchain_app_spec.origin,
-    ],
-    check=True,
-)
+run_streamlit()