Commit a83ba44: Harrison/ver0089 (langchain-ai#1144)
1 parent 7b5e160

File tree: 9 files changed, +347 -9 lines

docs/modules/llms/integrations.rst

Lines changed: 6 additions & 0 deletions

@@ -3,6 +3,12 @@ Integrations
 
 The examples here are all "how-to" guides for how to integrate with various LLM providers.
 
+`OpenAI <./integrations/openai.html>`_: Covers how to connect to OpenAI models.
+
+`Cohere <./integrations/cohere.html>`_: Covers how to connect to Cohere models.
+
+`AI21 <./integrations/ai21.html>`_: Covers how to connect to AI21 models.
+
 `Huggingface Hub <./integrations/huggingface_hub.html>`_: Covers how to connect to LLMs hosted on HuggingFace Hub.
 
 `Azure OpenAI <./integrations/azure_openai_example.html>`_: Covers how to connect to Azure-hosted OpenAI Models.
docs/modules/llms/integrations/ai21.ipynb

Lines changed: 99 additions & 0 deletions (new file)

# AI21
This example goes over how to use LangChain to interact with AI21 models

In [1]:
from langchain.llms import AI21
from langchain import PromptTemplate, LLMChain

In [2]:
template = """Question: {question}

Answer: Let's think step by step."""

prompt = PromptTemplate(template=template, input_variables=["question"])

In [3]:
llm = AI21()

In [4]:
llm_chain = LLMChain(prompt=prompt, llm=llm)

In [ ]:
question = "What NFL team won the Super Bowl in the year Justin Beiber was born?"

llm_chain.run(question)

(Notebook metadata: Python 3 (ipykernel), Python 3.9.1, nbformat 4.5.)
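
For readers trying the notebook locally, a minimal sketch of the credential setup it relies on; the AI21 wrapper typically reads the key from the AI21_API_KEY environment variable (it can usually also be passed as ai21_api_key=...). The placeholder value below is illustrative, not part of the commit.

import os

# Illustrative only: expose an AI21 key so that AI21() can authenticate.
os.environ["AI21_API_KEY"] = "<your-ai21-api-key>"

from langchain.llms import AI21

# Default model and settings; the key is picked up from the environment.
llm = AI21()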
docs/modules/llms/integrations/cohere.ipynb

Lines changed: 110 additions & 0 deletions (new file)

# Cohere
This example goes over how to use LangChain to interact with Cohere models

In [1]:
from langchain.llms import Cohere
from langchain import PromptTemplate, LLMChain

In [2]:
template = """Question: {question}

Answer: Let's think step by step."""

prompt = PromptTemplate(template=template, input_variables=["question"])

In [3]:
llm = Cohere()

In [4]:
llm_chain = LLMChain(prompt=prompt, llm=llm)

In [5]:
question = "What NFL team won the Super Bowl in the year Justin Beiber was born?"

llm_chain.run(question)

Out[5]:
" Let's start with the year that Justin Beiber was born. You know that he was born in 1994. We have to go back one year. 1993.\n\n1993 was the year that the Dallas Cowboys won the Super Bowl. They won over the Buffalo Bills in Super Bowl 26.\n\nNow, let's do it backwards. According to our information, the Green Bay Packers last won the Super Bowl in the 2010-2011 season. Now, we can't go back in time, so let's go from 2011 when the Packers won the Super Bowl, back to 1984. That is the year that the Packers won the Super Bowl over the Raiders.\n\nSo, we have the year that Justin Beiber was born, 1994, and the year that the Packers last won the Super Bowl, 2011, and now we have to go in the middle, 1986. That is the year that the New York Giants won the Super Bowl over the Denver Broncos. The Giants won Super Bowl 21.\n\nThe New York Giants won the Super Bowl in 1986. This means that the Green Bay Packers won the Super Bowl in 2011.\n\nDid you get it right? If you are still a bit confused, just try to go back to the question again and review the answer"

(Notebook metadata: Python 3 (ipykernel), Python 3.9.1, nbformat 4.5.)
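
As a variation on the cell above, a hedged sketch of passing the Cohere key and generation settings explicitly instead of relying on the COHERE_API_KEY environment variable; the parameter names are assumed to match the wrapper's fields, and the values are illustrative.

from langchain.llms import Cohere

# Assumed constructor fields: cohere_api_key, temperature, max_tokens.
llm = Cohere(
    cohere_api_key="<your-cohere-api-key>",
    temperature=0.75,
    max_tokens=256,
)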
docs/modules/llms/integrations/openai.ipynb

Lines changed: 110 additions & 0 deletions (new file)

# OpenAI
This example goes over how to use LangChain to interact with OpenAI models

In [1]:
from langchain.llms import OpenAI
from langchain import PromptTemplate, LLMChain

In [2]:
template = """Question: {question}

Answer: Let's think step by step."""

prompt = PromptTemplate(template=template, input_variables=["question"])

In [3]:
llm = OpenAI()

In [4]:
llm_chain = LLMChain(prompt=prompt, llm=llm)

In [5]:
question = "What NFL team won the Super Bowl in the year Justin Beiber was born?"

llm_chain.run(question)

Out[5]:
' Justin Bieber was born in 1994, so the NFL team that won the Super Bowl in that year was the Dallas Cowboys.'

(Notebook metadata: Python 3 (ipykernel), Python 3.9.1, nbformat 4.5.)
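
OpenAI() above uses the wrapper's defaults; a small sketch of choosing the model and sampling temperature explicitly, assuming OPENAI_API_KEY is set in the environment (the model name shown is just an example).

from langchain.llms import OpenAI

# model_name and temperature are standard fields on the OpenAI wrapper;
# the API key is read from the OPENAI_API_KEY environment variable.
llm = OpenAI(model_name="text-davinci-003", temperature=0.9)
llm("Tell me a joke about football")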

langchain/agents/load_tools.py

Lines changed: 2 additions & 2 deletions

@@ -133,7 +133,7 @@ def _get_google_search(**kwargs: Any) -> BaseTool:
 
 def _get_google_serper(**kwargs: Any) -> BaseTool:
     return Tool(
-        name="Search",
+        name="Serper Search",
         func=GoogleSerperAPIWrapper(**kwargs).run,
         description="A low-cost Google Search API. Useful for when you need to answer questions about current events. Input should be a search query.",
     )
@@ -154,7 +154,7 @@ def _get_serpapi(**kwargs: Any) -> BaseTool:
 
 def _get_searx_search(**kwargs: Any) -> BaseTool:
     return Tool(
-        name="Search",
+        name="SearX Search",
         description="A meta search engine. Useful for when you need to answer questions about current events. Input should be a search query.",
         func=SearxSearchWrapper(**kwargs).run,
     )
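
The renames matter when several search tools are loaded together: each one now presents a distinct name to the agent instead of all being called "Search". A hedged sketch, assuming the registry keys are "google-serper" and "searx-search", that Serper reads SERPER_API_KEY from the environment, and that SearX needs a searx_host.

from langchain.agents import initialize_agent, load_tools
from langchain.llms import OpenAI

# After this change the two tools are "Serper Search" and "SearX Search",
# so an agent can tell them apart when deciding which one to call.
tools = load_tools(["google-serper", "searx-search"], searx_host="http://localhost:8888")
print([tool.name for tool in tools])  # ['Serper Search', 'SearX Search']

agent = initialize_agent(tools, OpenAI(temperature=0), agent="zero-shot-react-description")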

langchain/tools/bing_search/tool.py

Lines changed: 6 additions & 2 deletions

@@ -7,8 +7,12 @@
 class BingSearchRun(BaseTool):
     """Tool that adds the capability to query the Bing search API."""
 
-    name = "bing_search"
-    description = "Execute the Bing search API."
+    name = "Bing Search"
+    description = (
+        "A wrapper around Bing Search. "
+        "Useful for when you need to answer questions about current events. "
+        "Input should be a search query."
+    )
     api_wrapper: BingSearchAPIWrapper
 
     def _run(self, query: str) -> str:
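
A hedged sketch of how the richer metadata surfaces on the tool object, assuming the wrapper lives at langchain.utilities.bing_search and reads BING_SUBSCRIPTION_KEY / BING_SEARCH_URL from the environment.

from langchain.tools.bing_search.tool import BingSearchRun
from langchain.utilities.bing_search import BingSearchAPIWrapper

search = BingSearchRun(api_wrapper=BingSearchAPIWrapper())
print(search.name)         # "Bing Search"
print(search.description)  # the agent-facing description added in this commit
search.run("latest langchain release")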

langchain/tools/google_search/tool.py

Lines changed: 6 additions & 2 deletions

@@ -7,8 +7,12 @@
 class GoogleSearchRun(BaseTool):
     """Tool that adds the capability to query the Google search API."""
 
-    name = "google_search"
-    description = "Execute the Google search API."
+    name = "Google Search"
+    description = (
+        "A wrapper around Google Search. "
+        "Useful for when you need to answer questions about current events. "
+        "Input should be a search query."
+    )
     api_wrapper: GoogleSearchAPIWrapper
 
     def _run(self, query: str) -> str:
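
The same pattern applies here; a sketch assuming GOOGLE_API_KEY and GOOGLE_CSE_ID are set for the underlying GoogleSearchAPIWrapper.

from langchain.tools.google_search.tool import GoogleSearchRun
from langchain.utilities.google_search import GoogleSearchAPIWrapper

google = GoogleSearchRun(api_wrapper=GoogleSearchAPIWrapper())
google.run("current weather in San Francisco")  # presented to the agent as "Google Search"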

langchain/tools/wolfram_alpha/tool.py

Lines changed: 7 additions & 2 deletions

@@ -7,8 +7,13 @@
 class WolframAlphaQueryRun(BaseTool):
     """Tool that adds the capability to query using the Wolfram Alpha SDK."""
 
-    name = "query_wolfram_alpha"
-    description = "Query Wolfram Alpha with the given query."
+    name = "Wolfram Alpha"
+    description = (
+        "A wrapper around Wolfram Alpha. "
+        "Useful for when you need to answer questions about Math, "
+        "Science, Technology, Culture, Society and Everyday Life. "
+        "Input should be a search query."
+    )
     api_wrapper: WolframAlphaAPIWrapper
 
     def _run(self, query: str) -> str:
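
And the Wolfram Alpha tool, as a sketch assuming an app id is provided via WOLFRAM_ALPHA_APPID and the wolframalpha client package is installed.

from langchain.tools.wolfram_alpha.tool import WolframAlphaQueryRun
from langchain.utilities.wolfram_alpha import WolframAlphaAPIWrapper

wolfram = WolframAlphaQueryRun(api_wrapper=WolframAlphaAPIWrapper())
wolfram.run("What is the derivative of x**3 + 2*x?")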
