
llm-claude


Plugin for LLM adding support for Anthropic's Claude models.

Installation

Install this plugin in the same environment as LLM.

llm install llm-claude
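
Once installed, the plugin and its models should appear in LLM's standard listings (a quick sanity check, not specific to this plugin):

llm plugins
llm models list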

Configuration

You will need an API key from Anthropic. Request access at https://www.anthropic.com/earlyaccess, then go to https://console.anthropic.com/account/keys to generate a key.

You can set that as an environment variable called ANTHROPIC_API_KEY, or add it to the set of keys saved by LLM using:

llm keys set claude
Enter key: <paste key here>
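
If you prefer the environment variable route, a minimal shell sketch looks like this (the key value is a placeholder):

export ANTHROPIC_API_KEY='<paste key here>'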

Usage

This plugin adds models called claude and claude-instant.

Anthropic describes them as:

two families of models, both of which support 100,000 token context windows:

  • Claude Instant: low-latency, high throughput
  • Claude: superior performance on tasks that require complex reasoning

You can query them like this:

llm -m claude-instant "Ten great names for a new space station"
llm -m claude "Compare and contrast the leadership styles of Abraham Lincoln and Boris Johnson."
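
The models can also be used from Python through LLM's programmatic API. A minimal sketch, assuming the plugin is installed and your key is configured as above (the prompt text is just an example):

import llm

# Look up the Claude model registered by this plugin
model = llm.get_model("claude")

# Run a prompt and print the completion text
response = model.prompt("Ten great names for a new space station")
print(response.text())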

Options

  • max_tokens_to_sample, default 10_000: The maximum number of tokens to generate before stopping

Use like this:

llm -m claude -o max_tokens_to_sample 20 "Sing me the alphabet"
 Here is the alphabet song:

A B C D E F G
H I J
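
In the Python API, options such as this are passed as keyword arguments to prompt(). A short sketch under the same assumptions as the Usage example above:

import llm

model = llm.get_model("claude")

# Cap the response at 20 tokens, mirroring the CLI example above
response = model.prompt("Sing me the alphabet", max_tokens_to_sample=20)
print(response.text())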

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-claude
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest
