
[FEATURE] Integrate with Anthropic API #176

@badmonster0

Description


CocoIndex 🥥 currently supports OpenAI and Ollama ✨ for using LLMs as part of the data pipeline. https://cocoindex.io/docs/ai/llm

Here is an example of how CocoIndex uses Ollama to extract structured information from PDFs:
https://cocoindex.io/blogs/cocoindex-ollama-structured-extraction-from-pdf/

We would like to add support for Anthropic API - https://docs.anthropic.com/en/api/getting-started
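For orientation, Anthropic's getting-started docs describe a Messages API at the /v1/messages endpoint, authenticated with an x-api-key header plus an anthropic-version header. A minimal Python sketch of that request shape (the model name and key are placeholders; nothing here is existing CocoIndex code):

```python
import json
import urllib.request


def build_anthropic_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a request for Anthropic's /v1/messages endpoint."""
    payload = {
        "model": model,
        "max_tokens": 1024,  # required by the Messages API
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        "https://api.anthropic.com/v1/messages",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "x-api-key": api_key,           # placeholder key below
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
        method="POST",
    )


req = build_anthropic_request("sk-ant-placeholder", "claude-3-5-sonnet-20241022", "Hello")
```

The Rust client added for this issue would send the same headers and JSON body, analogous to what the existing openai module does for OpenAI.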

Related code to support OpenAI

Steps:

  1. Update Rust code:

     cocoindex/src/llm/mod.rs

     Lines 52 to 62 in 801ae8f

     pub async fn new_llm_generation_client(spec: LlmSpec) -> Result<Box<dyn LlmGenerationClient>> {
         let client = match spec.api_type {
             LlmApiType::Ollama => {
                 Box::new(ollama::Client::new(spec).await?) as Box<dyn LlmGenerationClient>
             }
             LlmApiType::OpenAi => {
                 Box::new(openai::Client::new(spec).await?) as Box<dyn LlmGenerationClient>
             }
         };
         Ok(client)
     }

  2. Add an enum value to LlmApiType in Python.
  3. Test with the existing manuals_llm_extraction example. You can add a few lines similar to the OpenAI ones:

     # Replace with the spec below to use the OpenAI API model instead of Ollama:
     # llm_spec=cocoindex.LlmSpec(
     #     api_type=cocoindex.LlmApiType.OPENAI, model="gpt-4o"),
  4. Update documentation: https://cocoindex.io/docs/ai/llm
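Taken together, the steps amount to adding a third arm to the match in new_llm_generation_client and exposing it through the Python enum. A rough Python model of the resulting dispatch (LlmApiType.ANTHROPIC, AnthropicClient, and the model name are assumptions illustrating the pattern, not existing cocoindex code):

```python
import enum
from dataclasses import dataclass


class LlmApiType(enum.Enum):
    OLLAMA = "ollama"
    OPENAI = "openai"
    ANTHROPIC = "anthropic"  # the new variant this issue proposes


@dataclass
class LlmSpec:
    api_type: LlmApiType
    model: str


class AnthropicClient:
    """Stand-in for the Rust anthropic::Client this issue asks for."""

    def __init__(self, spec: LlmSpec):
        self.spec = spec


def new_llm_generation_client(spec: LlmSpec):
    # Mirrors the match in cocoindex/src/llm/mod.rs, with the new arm added.
    if spec.api_type is LlmApiType.ANTHROPIC:
        return AnthropicClient(spec)
    raise NotImplementedError(f"unsupported api_type: {spec.api_type}")


client = new_llm_generation_client(
    LlmSpec(api_type=LlmApiType.ANTHROPIC, model="claude-3-5-sonnet-20241022")
)
```

In the actual Rust code the new arm would construct anthropic::Client the same way the Ollama and OpenAI arms construct theirs, boxed as Box<dyn LlmGenerationClient>.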
