Add API chain (langchain-ai#1400)
* adding initial API chain code

* lint fix

* add docs and example

* Update file structure, move around parameters and TS types

* Slightly shorten APIChain example for consistency

---------

Co-authored-by: Larry Anderson <larryboymi@hotmail.com>
jacoblee93 and larryboymi authored May 24, 2023
1 parent 052664c commit 1f787a6
Showing 10 changed files with 335 additions and 1 deletion.
8 changes: 8 additions & 0 deletions docs/docs/modules/chains/other_chains/api_chain.mdx
@@ -0,0 +1,8 @@
import CodeBlock from "@theme/CodeBlock";
import APIExample from "@examples/chains/api_chain.ts";

# `APIChain`

APIChain lets you use an LLM to interact with an API and retrieve information relevant to a question. Construct the chain from a language model and the API's documentation, then call it with a question that can be answered from that API.

<CodeBlock language="typescript">{APIExample}</CodeBlock>
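
For readers skimming the diff, a minimal self-contained sketch of the same flow. It assumes an OpenAI API key in the environment and abbreviates the documentation string; the imported example above is the full version.

```typescript
import { OpenAI } from "langchain/llms/openai";
import { APIChain } from "langchain/chains";

// Abbreviated stand-in for the full Open-Meteo documentation string used in the example.
const DOCS = `BASE URL: https://api.open-meteo.com/
The /v1/forecast endpoint accepts latitude, longitude and current_weather=true
and returns a JSON forecast that includes the current temperature.`;

export async function run() {
  const model = new OpenAI({ modelName: "text-davinci-003" });
  const chain = APIChain.fromLLMAndAPIDocs(model, DOCS);
  const res = await chain.call({
    question: "What is the current temperature in Munich, Germany?",
  });
  // The answer is returned under the chain's default output key, `output`.
  console.log(res.output);
}
```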
2 changes: 1 addition & 1 deletion docs/docs/use_cases/api.mdx
@@ -18,7 +18,7 @@ If you are just getting started and you have relatively simple APIs, you should
Chains are a sequence of predetermined steps, so they are a good place to start: they give you more control and make it
easier to understand what is happening.

- TODO: add an API chain and then add an example here.
+ - [API Chain](../modules/chains/other_chains/api_chain.mdx)

## Agents

43 changes: 43 additions & 0 deletions examples/src/chains/api_chain.ts
@@ -0,0 +1,43 @@
import { OpenAI } from "langchain/llms/openai";
import { APIChain } from "langchain/chains";

const OPEN_METEO_DOCS = `BASE URL: https://api.open-meteo.com/
API Documentation
The API endpoint /v1/forecast accepts a geographical coordinate, a list of weather variables and responds with a JSON hourly weather forecast for 7 days. Time always starts at 0:00 today and contains 168 hours. All URL parameters are listed below:
Parameter Format Required Default Description
latitude, longitude Floating point Yes Geographical WGS84 coordinate of the location
hourly String array No A list of weather variables which should be returned. Values can be comma separated, or multiple &hourly= parameter in the URL can be used.
daily String array No A list of daily weather variable aggregations which should be returned. Values can be comma separated, or multiple &daily= parameter in the URL can be used. If daily weather variables are specified, parameter timezone is required.
current_weather Bool No false Include current weather conditions in the JSON output.
temperature_unit String No celsius If fahrenheit is set, all temperature values are converted to Fahrenheit.
windspeed_unit String No kmh Other wind speed speed units: ms, mph and kn
precipitation_unit String No mm Other precipitation amount units: inch
timeformat String No iso8601 If format unixtime is selected, all time values are returned in UNIX epoch time in seconds. Please note that all timestamp are in GMT+0! For daily values with unix timestamps, please apply utc_offset_seconds again to get the correct date.
timezone String No GMT If timezone is set, all timestamps are returned as local-time and data is returned starting at 00:00 local-time. Any time zone name from the time zone database is supported. If auto is set as a time zone, the coordinates will be automatically resolved to the local time zone.
past_days Integer (0-2) No 0 If past_days is set, yesterday or the day before yesterday data are also returned.
start_date
end_date String (yyyy-mm-dd) No The time interval to get weather data. A day must be specified as an ISO8601 date (e.g. 2022-06-30).
models String array No auto Manually select one or more weather models. Per default, the best suitable weather models will be combined.
Variable Valid time Unit Description
temperature_2m Instant °C (°F) Air temperature at 2 meters above ground
snowfall Preceding hour sum cm (inch) Snowfall amount of the preceding hour in centimeters. For the water equivalent in millimeter, divide by 7. E.g. 7 cm snow = 10 mm precipitation water equivalent
rain Preceding hour sum mm (inch) Rain from large scale weather systems of the preceding hour in millimeter
showers Preceding hour sum mm (inch) Showers from convective precipitation in millimeters from the preceding hour
weathercode Instant WMO code Weather condition as a numeric code. Follow WMO weather interpretation codes. See table below for details.
snow_depth Instant meters Snow depth on the ground
freezinglevel_height Instant meters Altitude above sea level of the 0°C level
visibility Instant meters Viewing distance in meters. Influenced by low clouds, humidity and aerosols. Maximum visibility is approximately 24 km.`;

export async function run() {
const model = new OpenAI({ modelName: "text-davinci-003" });
const chain = APIChain.fromLLMAndAPIDocs(model, OPEN_METEO_DOCS);

const res = await chain.call({
question:
"What is the weather like right now in Munich, Germany in degrees Farenheit?",
});
console.log({ res });
}
136 changes: 136 additions & 0 deletions langchain/src/chains/api/api_chain.ts
@@ -0,0 +1,136 @@
import { BaseChain, ChainInputs } from "../base.js";
import { SerializedAPIChain } from "../serde.js";
import { LLMChain } from "../llm_chain.js";
import { BaseLanguageModel } from "../../base_language/index.js";
import { CallbackManagerForChainRun } from "../../callbacks/manager.js";
import { ChainValues } from "../../schema/index.js";
import {
API_URL_PROMPT_TEMPLATE,
API_RESPONSE_PROMPT_TEMPLATE,
} from "./prompts.js";
import { BasePromptTemplate } from "../../index.js";

export interface APIChainInput extends Omit<ChainInputs, "memory"> {
apiAnswerChain: LLMChain;
apiRequestChain: LLMChain;
apiDocs: string;
inputKey?: string;
headers?: Record<string, string>;
/** Key to use for output, defaults to `output` */
outputKey?: string;
}

export type APIChainOptions = {
headers?: Record<string, string>;
apiUrlPrompt?: BasePromptTemplate;
apiResponsePrompt?: BasePromptTemplate;
};

export class APIChain extends BaseChain implements APIChainInput {
apiAnswerChain: LLMChain;

apiRequestChain: LLMChain;

apiDocs: string;

headers = {};

inputKey = "question";

outputKey = "output";

get inputKeys() {
return [this.inputKey];
}

get outputKeys() {
return [this.outputKey];
}

constructor(fields: APIChainInput) {
super(fields);
this.apiRequestChain = fields.apiRequestChain;
this.apiAnswerChain = fields.apiAnswerChain;
this.apiDocs = fields.apiDocs;
this.inputKey = fields.inputKey ?? this.inputKey;
this.outputKey = fields.outputKey ?? this.outputKey;
this.headers = fields.headers ?? this.headers;
}

/** @ignore */
async _call(
values: ChainValues,
runManager?: CallbackManagerForChainRun
): Promise<ChainValues> {
const question: string = values[this.inputKey];

const api_url = await this.apiRequestChain.predict(
{ question, api_docs: this.apiDocs },
runManager?.getChild()
);

const res = await fetch(api_url, { headers: this.headers });
const api_response = await res.text();

const answer = await this.apiAnswerChain.predict(
{ question, api_docs: this.apiDocs, api_url, api_response },
runManager?.getChild()
);

return { [this.outputKey]: answer };
}

_chainType() {
return "api_chain" as const;
}

static async deserialize(data: SerializedAPIChain) {
const { api_request_chain, api_answer_chain, api_docs } = data;

if (!api_request_chain) {
throw new Error("LLMChain must have api_request_chain");
}
if (!api_answer_chain) {
throw new Error("LLMChain must have api_answer_chain");
}

if (!api_docs) {
throw new Error("LLMChain must have api_docs");
}

return new APIChain({
apiAnswerChain: await LLMChain.deserialize(api_answer_chain),
apiRequestChain: await LLMChain.deserialize(api_request_chain),
apiDocs: api_docs,
});
}

serialize(): SerializedAPIChain {
return {
_type: this._chainType(),
api_answer_chain: this.apiAnswerChain.serialize(),
api_request_chain: this.apiRequestChain.serialize(),
api_docs: this.apiDocs,
};
}

static fromLLMAndAPIDocs(
llm: BaseLanguageModel,
apiDocs: string,
options: APIChainOptions &
Omit<APIChainInput, "apiAnswerChain" | "apiRequestChain" | "apiDocs"> = {}
): APIChain {
const {
apiUrlPrompt = API_URL_PROMPT_TEMPLATE,
apiResponsePrompt = API_RESPONSE_PROMPT_TEMPLATE,
} = options;
const apiRequestChain = new LLMChain({ prompt: apiUrlPrompt, llm });
const apiAnswerChain = new LLMChain({ prompt: apiResponsePrompt, llm });
return new this({
apiAnswerChain,
apiRequestChain,
apiDocs,
...options,
});
}
}
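
Not obvious from the diff alone: because `fromLLMAndAPIDocs` spreads its `options` into the constructor, a `headers` object passes straight through to the `fetch` call in `_call`. A rough sketch under assumed values (the endpoint, docs string, and token below are placeholders, not a real API):

```typescript
import { OpenAI } from "langchain/llms/openai";
import { APIChain } from "langchain/chains";

// Placeholder documentation for a hypothetical authenticated API.
const STATUS_API_DOCS = `BASE URL: https://api.example.com/
GET /v1/status returns the current service status as JSON.`;

export async function run() {
  const model = new OpenAI({ modelName: "text-davinci-003" });
  const chain = APIChain.fromLLMAndAPIDocs(model, STATUS_API_DOCS, {
    // Forwarded verbatim as fetch() headers when the chain calls the generated URL.
    headers: { Authorization: "Bearer YOUR_TOKEN" },
  });
  const res = await chain.call({ question: "Is the service up right now?" });
  console.log(res.output);
}
```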
29 changes: 29 additions & 0 deletions langchain/src/chains/api/prompts.ts
@@ -0,0 +1,29 @@
/* eslint-disable spaced-comment */
import { PromptTemplate } from "../../prompts/prompt.js";

export const API_URL_RAW_PROMPT_TEMPLATE = `You are given the below API Documentation:
{api_docs}
Using this documentation, generate the full API url to call for answering the user question.
You should build the API url in order to get a response that is as short as possible, while still getting the necessary information to answer the question. Pay attention to deliberately exclude any unnecessary pieces of data in the API call.
Question:{question}
API url:`;

export const API_URL_PROMPT_TEMPLATE = /* #__PURE__ */ new PromptTemplate({
inputVariables: ["api_docs", "question"],
template: API_URL_RAW_PROMPT_TEMPLATE,
});

export const API_RESPONSE_RAW_PROMPT_TEMPLATE = `${API_URL_RAW_PROMPT_TEMPLATE} {api_url}
Here is the response from the API:
{api_response}
Summarize this response to answer the original question.
Summary:`;
export const API_RESPONSE_PROMPT_TEMPLATE = /* #__PURE__ */ new PromptTemplate({
inputVariables: ["api_docs", "question", "api_url", "api_response"],
template: API_RESPONSE_RAW_PROMPT_TEMPLATE,
});
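
For intuition on what the request chain actually sees, a small sketch that formats the URL prompt directly. The relative import mirrors how the package's own tests reach these templates, and the substituted values are placeholders:

```typescript
import { API_URL_PROMPT_TEMPLATE } from "./prompts.js";

export async function run() {
  // format() substitutes the two input variables and resolves to the final prompt string.
  const prompt = await API_URL_PROMPT_TEMPLATE.format({
    api_docs: "BASE URL: https://api.open-meteo.com/ ...",
    question: "What is the current temperature in Munich?",
  });
  console.log(prompt);
}
```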
4 changes: 4 additions & 0 deletions langchain/src/chains/base.ts
@@ -189,6 +189,10 @@ export abstract class BaseChain extends BaseLangChain implements ChainInputs {
const { VectorDBQAChain } = await import("./vector_db_qa.js");
return VectorDBQAChain.deserialize(data, values);
}
case "api_chain": {
const { APIChain } = await import("./api/api_chain.js");
return APIChain.deserialize(data);
}
default:
throw new Error(
`Invalid prompt type in config: ${
2 changes: 2 additions & 0 deletions langchain/src/chains/index.ts
@@ -1,5 +1,6 @@
export { BaseChain, ChainInputs } from "./base.js";
export { LLMChain, LLMChainInput } from "./llm_chain.js";
export { APIChain, APIChainInput } from "./api/api_chain.js";
export { ConversationChain } from "./conversation.js";
export {
SequentialChain,
@@ -61,6 +62,7 @@ export {
SerializedSimpleSequentialChain,
SerializedSqlDatabaseChain,
SerializedAnalyzeDocumentChain,
SerializedAPIChain,
SerializedBaseChain,
SerializedChatVectorDBQAChain,
SerializedMapReduceDocumentsChain,
8 changes: 8 additions & 0 deletions langchain/src/chains/serde.ts
@@ -32,6 +32,13 @@ export type SerializedVectorDBQAChain = {
combine_documents_chain: SerializedBaseChain;
};

export type SerializedAPIChain = {
_type: "api_chain";
api_request_chain: SerializedLLMChain;
api_answer_chain: SerializedLLMChain;
api_docs: string;
};

export type SerializedStuffDocumentsChain = {
_type: "stuff_documents_chain";
llm_chain?: SerializedLLMChain;
@@ -81,6 +88,7 @@ export type SerializedBaseChain =
| SerializedSequentialChain
| SerializedSimpleSequentialChain
| SerializedVectorDBQAChain
| SerializedAPIChain
| SerializedStuffDocumentsChain
| SerializedSqlDatabaseChain
| SerializedChatVectorDBQAChain
75 changes: 75 additions & 0 deletions langchain/src/chains/tests/api_chain.int.test.ts
@@ -0,0 +1,75 @@
import { test } from "@jest/globals";
import { OpenAI } from "../../llms/openai.js";
import { LLMChain } from "../llm_chain.js";
import { loadChain } from "../load.js";
import { APIChain, APIChainInput } from "../api/api_chain.js";
import {
API_URL_PROMPT_TEMPLATE,
API_RESPONSE_PROMPT_TEMPLATE,
} from "../api/prompts.js";
import { OPEN_METEO_DOCS } from "./example_data/open_meteo_docs.js";

const test_api_docs = `
This API endpoint will search the notes for a user.
Endpoint: https://httpbin.org
GET /get
Query parameters:
q | string | The search term for notes
`;

const testApiData = {
api_docs: test_api_docs,
question: "Search for notes containing langchain",
api_url: "https://httpbin.com/api/notes?q=langchain",
api_response: JSON.stringify({
success: true,
results: [{ id: 1, content: "Langchain is awesome!" }],
}),
api_summary: "There is 1 note about langchain.",
};

test("Test APIChain", async () => {
const model = new OpenAI({ modelName: "text-davinci-003" });
const apiRequestChain = new LLMChain({
prompt: API_URL_PROMPT_TEMPLATE,
llm: model,
});
const apiAnswerChain = new LLMChain({
prompt: API_RESPONSE_PROMPT_TEMPLATE,
llm: model,
});

const apiChainInput: APIChainInput = {
apiAnswerChain,
apiRequestChain,
apiDocs: testApiData.api_docs,
};

const chain = new APIChain(apiChainInput);
const res = await chain.call({
question: "Search for notes containing langchain",
});
console.log({ res });
});

test("Test APIChain fromLLMAndApiDocs", async () => {
// This test doesn't work as well with earlier models
const model = new OpenAI({ modelName: "text-davinci-003" });
const chain = APIChain.fromLLMAndAPIDocs(model, OPEN_METEO_DOCS);
const res = await chain.call({
question:
"What is the weather like right now in Munich, Germany in degrees Farenheit?",
});
console.log({ res });
});

test("Load APIChain from hub", async () => {
const chain = await loadChain("lc://chains/api/meteo/chain.json");
const res = await chain.call({
question:
"What is the weather like right now in Munich, Germany in degrees Farenheit?",
});
console.log({ res });
});
29 changes: 29 additions & 0 deletions langchain/src/chains/tests/example_data/open_meteo_docs.ts
@@ -0,0 +1,29 @@
export const OPEN_METEO_DOCS = `BASE URL: https://api.open-meteo.com/
API Documentation
The API endpoint /v1/forecast accepts a geographical coordinate, a list of weather variables and responds with a JSON hourly weather forecast for 7 days. Time always starts at 0:00 today and contains 168 hours. All URL parameters are listed below:
Parameter Format Required Default Description
latitude, longitude Floating point Yes Geographical WGS84 coordinate of the location
hourly String array No A list of weather variables which should be returned. Values can be comma separated, or multiple &hourly= parameter in the URL can be used.
daily String array No A list of daily weather variable aggregations which should be returned. Values can be comma separated, or multiple &daily= parameter in the URL can be used. If daily weather variables are specified, parameter timezone is required.
current_weather Bool No false Include current weather conditions in the JSON output.
temperature_unit String No celsius If fahrenheit is set, all temperature values are converted to Fahrenheit.
windspeed_unit String No kmh Other wind speed speed units: ms, mph and kn
precipitation_unit String No mm Other precipitation amount units: inch
timeformat String No iso8601 If format unixtime is selected, all time values are returned in UNIX epoch time in seconds. Please note that all timestamp are in GMT+0! For daily values with unix timestamps, please apply utc_offset_seconds again to get the correct date.
timezone String No GMT If timezone is set, all timestamps are returned as local-time and data is returned starting at 00:00 local-time. Any time zone name from the time zone database is supported. If auto is set as a time zone, the coordinates will be automatically resolved to the local time zone.
past_days Integer (0-2) No 0 If past_days is set, yesterday or the day before yesterday data are also returned.
start_date
end_date String (yyyy-mm-dd) No The time interval to get weather data. A day must be specified as an ISO8601 date (e.g. 2022-06-30).
models String array No auto Manually select one or more weather models. Per default, the best suitable weather models will be combined.
Variable Valid time Unit Description
temperature_2m Instant °C (°F) Air temperature at 2 meters above ground
snowfall Preceding hour sum cm (inch) Snowfall amount of the preceding hour in centimeters. For the water equivalent in millimeter, divide by 7. E.g. 7 cm snow = 10 mm precipitation water equivalent
rain Preceding hour sum mm (inch) Rain from large scale weather systems of the preceding hour in millimeter
showers Preceding hour sum mm (inch) Showers from convective precipitation in millimeters from the preceding hour
weathercode Instant WMO code Weather condition as a numeric code. Follow WMO weather interpretation codes. See table below for details.
snow_depth Instant meters Snow depth on the ground
freezinglevel_height Instant meters Altitude above sea level of the 0°C level
visibility Instant meters Viewing distance in meters. Influenced by low clouds, humidity and aerosols. Maximum visibility is approximately 24 km.`;
