Commit b668aeb

gustavocidornelas authored and whoseoyster committed
chore: add Vertex AI example
1 parent 53de019 commit b668aeb

3 files changed: +160 -6 lines changed

examples/tracing/langchain/langchain_callback.ipynb

Lines changed: 3 additions & 3 deletions

@@ -81,7 +81,7 @@
    "id": "76a350b4",
    "metadata": {},
    "source": [
-    "Now, you can pass the `openlayer_handler` as a callback to LLM's or chain invokations."
+    "Now, you can pass the `openlayer_handler` as a callback to LLM's or chain invocations."
    ]
   },
   {
@@ -119,7 +119,7 @@
    "id": "9a702ad1-da68-4757-95a6-4661ddaef251",
    "metadata": {},
    "source": [
-    "That's it! Now your data is being streamed to Openlayer after every invokation."
+    "That's it! Now your data is being streamed to Openlayer after every invocation."
    ]
   },
   {
@@ -147,7 +147,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.9.6"
+   "version": "3.9.18"
   }
  },
  "nbformat": 4,

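The sentence touched by the first hunk above mentions passing the handler to chain invocations as well as to plain LLM calls. The notebook's own chain cells are not part of this diff, so the following is only a sketch of that pattern; the prompt text and the use of `ChatVertexAI` (borrowed from the new notebook added below) are illustrative assumptions, not lines from the changed file.

from langchain_core.prompts import ChatPromptTemplate
from langchain_google_vertexai import ChatVertexAI
from openlayer.lib.integrations import langchain_callback

# Openlayer's LangChain callback handler, as instantiated in the notebooks.
openlayer_handler = langchain_callback.OpenlayerHandler()

# A simple prompt | model chain (hypothetical prompt).
prompt = ChatPromptTemplate.from_template("Answer briefly: {question}")
chain = prompt | ChatVertexAI(model="gemini-1.5-flash-001")

# Pass the handler in the invocation config so the whole chain run is traced.
chain.invoke(
    {"question": "What's the meaning of life?"},
    config={"callbacks": [openlayer_handler]},
)
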
examples/tracing/ollama/ollama_tracing.ipynb

Lines changed: 3 additions & 3 deletions

@@ -74,7 +74,7 @@
    "source": [
     "## 3. Use an Ollama model with LangChain\n",
     "\n",
-    "Now, you can pass the `openlayer_handler` as a callback to LLM's or chain invokations."
+    "Now, you can pass the `openlayer_handler` as a callback to LLM's or chain invocations."
    ]
   },
   {
@@ -115,7 +115,7 @@
    "id": "9a702ad1-da68-4757-95a6-4661ddaef251",
    "metadata": {},
    "source": [
-    "That's it! Now your data is being streamed to Openlayer after every invokation."
+    "That's it! Now your data is being streamed to Openlayer after every invocation."
    ]
   },
   {
@@ -143,7 +143,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.9.19"
+   "version": "3.9.18"
   }
  },
  "nbformat": 4,

examples/tracing/vertex-ai/vertex_ai_tracing.ipynb

Lines changed: 154 additions & 0 deletions

@@ -0,0 +1,154 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "id": "2722b419",
+   "metadata": {},
+   "source": [
+    "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/openlayer-ai/openlayer-python/blob/main/examples/tracing/vertex-ai/vertex_ai_tracing.ipynb)\n",
+    "\n",
+    "\n",
+    "# <a id=\"top\">Vertex AI tracing</a>\n",
+    "\n",
+    "This notebook illustrates how use Openlayer's callback handler to trace calls to Vertex AI Gemini models. \n",
+    "\n",
+    "To use the integration you must:\n",
+    "\n",
+    "- Have your Vertex AI credentials configured for your environment (gcloud, workload identity, etc.)\n",
+    "- Store the path to a service account JSON file as the `GOOGLE_APPLICATION_CREDENTIALS` environment variable."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "020c8f6a",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "!pip install openlayer langchain-google-vertexai"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "75c2a473",
+   "metadata": {},
+   "source": [
+    "## 1. Set the environment variables"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "f3f4fa13",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import os\n",
+    "\n",
+    "# Openlayer env variables\n",
+    "os.environ[\"OPENLAYER_API_KEY\"] = \"YOUR_OPENLAYER_API_KEY_HERE\"\n",
+    "os.environ[\"OPENLAYER_INFERENCE_PIPELINE_ID\"] = \"YOUR_OPENLAYER_INFERENCE_PIPELINE_ID_HERE\""
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "9758533f",
+   "metadata": {},
+   "source": [
+    "## 2. Instantiate the `OpenlayerHandler`"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "e60584fa",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from openlayer.lib.integrations import langchain_callback\n",
+    "\n",
+    "openlayer_handler = langchain_callback.OpenlayerHandler()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "76a350b4",
+   "metadata": {},
+   "source": [
+    "## 3. Use a Vertex AI model with LangChain\n",
+    "\n",
+    "Now, you can pass the `openlayer_handler` as a callback to LLM's or chain invocations."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "e00c1c79",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from langchain_google_vertexai import ChatVertexAI"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "abaf6987-c257-4f0d-96e7-3739b24c7206",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "chat = ChatVertexAI(\n",
+    " model=\"gemini-1.5-flash-001\",\n",
+    " callbacks=[openlayer_handler]\n",
+    ")"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "4123669f-aa28-47b7-8d46-ee898aba99e8",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "chat.invoke(\"What's the meaning of life?\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "9a702ad1-da68-4757-95a6-4661ddaef251",
+   "metadata": {},
+   "source": [
+    "That's it! Now your data is being streamed to Openlayer after every invocation."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "a3092828-3fbd-4f12-bae7-8de7f7319ff0",
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.9.18"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
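
For readers who want the new example outside of a notebook, the cells above collapse into one short script. The following is a sketch, not part of the commit: it also sets the `GOOGLE_APPLICATION_CREDENTIALS` variable that the notebook's intro cell asks for, and the service-account path and API-key values are placeholders.

import os

# Vertex AI credentials, as required by the notebook's intro cell (placeholder path).
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account.json"

# Openlayer env variables (placeholder values, as in the notebook).
os.environ["OPENLAYER_API_KEY"] = "YOUR_OPENLAYER_API_KEY_HERE"
os.environ["OPENLAYER_INFERENCE_PIPELINE_ID"] = "YOUR_OPENLAYER_INFERENCE_PIPELINE_ID_HERE"

from openlayer.lib.integrations import langchain_callback
from langchain_google_vertexai import ChatVertexAI

# Openlayer's LangChain callback handler traces every invocation.
openlayer_handler = langchain_callback.OpenlayerHandler()

# Pass the handler as a callback to the Vertex AI chat model.
chat = ChatVertexAI(
    model="gemini-1.5-flash-001",
    callbacks=[openlayer_handler],
)

# The call is traced and streamed to Openlayer.
print(chat.invoke("What's the meaning of life?"))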
