Enable GoogleGenAI extension (#111)
svilupp authored Mar 26, 2024
1 parent 38474f8 commit 3dffa33
Showing 12 changed files with 117 additions and 79 deletions.
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -8,6 +8,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

### Added
- Added support for `aigenerate` with Anthropic API. Preset model aliases are `claudeo`, `claudes`, and `claudeh`, for Claude 3 Opus, Sonnet, and Haiku, respectively.
- Enabled the GoogleGenAI extension since `GoogleGenAI.jl` is now officially registered. You can use `aigenerate` by setting the model to `gemini` and providing the `GOOGLE_API_KEY` environment variable.
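Based on this changelog entry, basic usage might look like the following sketch (a non-authoritative illustration: it assumes `GoogleGenAI.jl` is installed and `GOOGLE_API_KEY` is exported, and it cannot run without a valid key):

````julia
# Sketch only: requires GoogleGenAI.jl and a valid GOOGLE_API_KEY in the environment.
using PromptingTools
import GoogleGenAI  # loading the weak dependency activates the extension

# "gemini" is the model alias mentioned in the changelog entry above
msg = aigenerate("What is the capital of France?"; model = "gemini")
println(msg.content)
````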

### Fixed

4 changes: 3 additions & 1 deletion Project.toml
@@ -1,7 +1,7 @@
name = "PromptingTools"
uuid = "670122d1-24a8-4d70-bfce-740807c42192"
authors = ["J S @svilupp and contributors"]
version = "0.16.1"
version = "0.17.0"

[deps]
AbstractTrees = "1520ce14-60c1-5f80-bbc7-55ef81b5835c"
@@ -17,11 +17,13 @@ Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

[weakdeps]
GoogleGenAI = "903d41d1-eaca-47dd-943b-fee3930375ab"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
Markdown = "d6f4376e-aef5-505a-96c1-9c027394607a"
SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"

[extensions]
GoogleGenAIPromptingToolsExt = ["GoogleGenAI"]
MarkdownPromptingToolsExt = ["Markdown"]
RAGToolsExperimentalExt = ["SparseArrays", "LinearAlgebra"]

8 changes: 8 additions & 0 deletions README.md
@@ -99,6 +99,7 @@ For more practical examples, see the `examples/` folder and the [Advanced Exampl
- [Package Interface](#package-interface)
- [Frequently Asked Questions](#frequently-asked-questions)
- [Why OpenAI](#why-openai)
- [What if I cannot access OpenAI?](#what-if-i-cannot-access-openai)
- [Data Privacy and OpenAI](#data-privacy-and-openai)
- [Creating OpenAI API Key](#creating-openai-api-key)
- [Setting OpenAI Spending Limits](#setting-openai-spending-limits)
@@ -673,6 +674,13 @@ There will be situations when you cannot or do not want to use it (e.g., privacy, cost, etc.). In that

Note: To get started with [Ollama.ai](https://ollama.ai/), see the [Setup Guide for Ollama](#setup-guide-for-ollama) section below.

### What if I cannot access OpenAI?

There are many alternatives:

- **Other APIs**: MistralAI, Anthropic, Google, Together, Fireworks, Voyager (the latter ones tend to give free credits upon joining!)
- **Locally-hosted models**: Llama.cpp/Llama.jl, Ollama, vLLM (see the examples and the corresponding docs)
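For instance, the Anthropic aliases added in this release (see the CHANGELOG entry in this commit) can be tried with a sketch like the one below (an illustration only: it assumes `ANTHROPIC_API_KEY` is set in your environment and will not run without it):

````julia
using PromptingTools

# "claudeh" is the preset alias for Claude 3 Haiku (per the CHANGELOG above);
# swap in "claudes" (Sonnet) or "claudeo" (Opus) as needed.
msg = aigenerate("Summarize Julia's multiple dispatch in one sentence."; model = "claudeh")
println(msg.content)
````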

### Data Privacy and OpenAI

At the time of writing, OpenAI does NOT use the API calls for training their models.
1 change: 1 addition & 0 deletions docs/Project.toml
@@ -2,6 +2,7 @@
DataFramesMeta = "1313f7d8-7da2-5740-9ea0-a2ca25f37964"
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
DocumenterVitepress = "4710194d-e776-4893-9690-8d956a29c365"
GoogleGenAI = "903d41d1-eaca-47dd-943b-fee3930375ab"
HTTP = "cd3eb016-35fb-5094-929b-558a96fad6f3"
JSON3 = "0f8b85d8-7281-11e9-16c2-39a750bddbf1"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
12 changes: 2 additions & 10 deletions docs/src/examples/working_with_google_ai_studio.md
@@ -6,20 +6,12 @@ Get an API key from [here](https://ai.google.dev/). If you see a documentation p

Save the API key in your environment as `GOOGLE_API_KEY`.

We'll need `GoogleGenAI.jl` package:
We'll need the `GoogleGenAI` package:

````julia
using Pkg; Pkg.add(url="https://github.com/tylerjthomas9/GoogleGenAI.jl/")
using Pkg; Pkg.add("GoogleGenAI")
````

> [!WARNING]
> This tutorial is DISABLED FOR NOW, because GoogleGenAI.jl is NOT a registered package yet and, hence, we cannot have an extension for it.
>
> If you want to use Google models, you need to install GoogleGenAI and add the following file to `[extensions]` section in Project.toml:
> `GoogleGenAIPromptingToolsExt = ["GoogleGenAI"]`
>
> Save the Project.toml changes and restart Julia. You can now use GoogleGenAI models with PromptingTools as shown below.
You can now use the Gemini-1.0-Pro model like any other model in PromptingTools. We **only support `aigenerate`** at the moment.

Let's import PromptingTools:
7 changes: 7 additions & 0 deletions docs/src/frequently_asked_questions.md
@@ -8,6 +8,13 @@ There will be situations when you cannot or do not want to use it (e.g., privacy, cost, etc.). In that

Note: To get started with [Ollama.ai](https://ollama.ai/), see the [Setup Guide for Ollama](#setup-guide-for-ollama) section below.

### What if I cannot access OpenAI?

There are many alternatives:

- **Other APIs**: MistralAI, Anthropic, Google, Together, Fireworks, Voyager (the latter ones tend to give free credits upon joining!)
- **Locally-hosted models**: Llama.cpp/Llama.jl, Ollama, vLLM (see the examples and the corresponding docs)

## Data Privacy and OpenAI

At the time of writing, OpenAI does NOT use the API calls for training their models.
20 changes: 3 additions & 17 deletions ext/GoogleGenAIPromptingToolsExt.jl
@@ -9,23 +9,9 @@ const PT = PromptingTools
function PromptingTools.ggi_generate_content(prompt_schema::PT.AbstractGoogleSchema,
api_key::AbstractString, model_name::AbstractString,
conversation; http_kwargs, api_kwargs...)
## Build the provider
provider = GoogleGenAI.GoogleProvider(; api_key)
url = "$(provider.base_url)/models/$model_name:generateContent?key=$(provider.api_key)"
generation_config = Dict{String, Any}()
for (key, value) in api_kwargs
generation_config[string(key)] = value
end

body = Dict("contents" => conversation,
"generationConfig" => generation_config)
response = HTTP.post(url; headers = Dict("Content-Type" => "application/json"),
body = JSON3.write(body), http_kwargs...)
if response.status >= 200 && response.status < 300
return GoogleGenAI._parse_response(response)
else
error("Request failed with status $(response.status): $(String(response.body))")
end
## TODO: Ignores http_kwargs for now, needs upstream change
r = GoogleGenAI.generate_content(api_key, model_name, conversation; api_kwargs...)
return r
end

end # end of module
4 changes: 4 additions & 0 deletions src/llm_anthropic.jl
@@ -260,3 +260,7 @@ function aiscan(prompt_schema::AbstractAnthropicSchema, prompt::ALLOWED_PROMPT_T
kwargs...)
error("Anthropic schema does not yet support aiscan. Please use OpenAISchema instead.")
end
function aiimage(prompt_schema::AbstractAnthropicSchema, prompt::ALLOWED_PROMPT_TYPE;
kwargs...)
error("Anthropic schema does not yet support aiimage. Please use OpenAISchema instead.")
end
37 changes: 29 additions & 8 deletions src/llm_google.jl
@@ -20,7 +20,7 @@ function render(schema::AbstractGoogleSchema,
messages_replaced = render(NoSchema(), messages; conversation, kwargs...)

## Second pass: convert to the OpenAI schema
conversation = Dict{String, Any}[]
conversation = Dict{Symbol, Any}[]

# replace any handlebar variables in the messages
for msg in messages_replaced
@@ -32,21 +32,21 @@
elseif msg isa AIMessage
"model"
end
push!(conversation, Dict("role" => role, "parts" => [Dict("text" => msg.content)]))
push!(conversation, Dict(:role => role, :parts => [Dict("text" => msg.content)]))
end
## Merge any subsequent UserMessages
merged_conversation = Dict{String, Any}[]
merged_conversation = Dict{Symbol, Any}[]
# run n-1 times, look at the current item and the next one
i = 1
while i <= (length(conversation) - 1)
next_i = i + 1
if conversation[i]["role"] == "user" && conversation[next_i]["role"] == "user"
if conversation[i][:role] == "user" && conversation[next_i][:role] == "user"
            ## Concatenate the user messages together, separated by two newlines
txt1 = conversation[i]["parts"][1]["text"]
txt2 = conversation[next_i]["parts"][1]["text"]
txt1 = conversation[i][:parts][1]["text"]
txt2 = conversation[next_i][:parts][1]["text"]
merged_text = isempty(txt1) || isempty(txt2) ? txt1 * txt2 :
txt1 * "\n\n" * txt2
new_msg = Dict("role" => "user", "parts" => [Dict("text" => merged_text)])
new_msg = Dict(:role => "user", :parts => [Dict("text" => merged_text)])
push!(merged_conversation, new_msg)
i += 2
else
@@ -178,7 +178,7 @@ function aigenerate(prompt_schema::AbstractGoogleSchema, prompt::ALLOWED_PROMPT_
output_token_estimate = length(r.text)
msg = AIMessage(;
content = r.text |> strip,
status = 200,
status = convert(Int, r.response_status),
## for google it's CHARACTERS, not tokens
tokens = (input_token_estimate, output_token_estimate),
elapsed = time)
@@ -198,3 +198,24 @@

return output
end

function aiembed(prompt_schema::AbstractGoogleSchema, prompt::ALLOWED_PROMPT_TYPE;
kwargs...)
error("Google schema does not yet support aiembed. Please use OpenAISchema instead.")
end
function aiclassify(prompt_schema::AbstractGoogleSchema, prompt::ALLOWED_PROMPT_TYPE;
kwargs...)
error("Google schema does not yet support aiclassify. Please use OpenAISchema instead.")
end
function aiextract(prompt_schema::AbstractGoogleSchema, prompt::ALLOWED_PROMPT_TYPE;
kwargs...)
error("Google schema does not yet support aiextract. Please use OpenAISchema instead.")
end
function aiscan(prompt_schema::AbstractGoogleSchema, prompt::ALLOWED_PROMPT_TYPE;
kwargs...)
error("Google schema does not yet support aiscan. Please use OpenAISchema instead.")
end
function aiimage(prompt_schema::AbstractGoogleSchema, prompt::ALLOWED_PROMPT_TYPE;
kwargs...)
error("Google schema does not yet support aiimage. Please use OpenAISchema instead.")
end
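As an aside, the consecutive-user-message merging that `render` performs for the Google schema (see the `llm_google.jl` hunks above) can be sketched as a standalone function. This is illustrative only, not the package's API; the function name `merge_user_messages` is invented here, and the tail handling is slightly simplified relative to the original loop:

````julia
# Illustrative sketch of merging consecutive "user" messages, since the Google
# API expects alternating roles. Mirrors the logic shown in `render` above.
function merge_user_messages(conversation::Vector{Dict{Symbol, Any}})
    merged = Dict{Symbol, Any}[]
    i = 1
    while i <= length(conversation)
        if i < length(conversation) && conversation[i][:role] == "user" &&
           conversation[i + 1][:role] == "user"
            txt1 = conversation[i][:parts][1]["text"]
            txt2 = conversation[i + 1][:parts][1]["text"]
            # join with two newlines unless one side is empty
            merged_text = isempty(txt1) || isempty(txt2) ? txt1 * txt2 :
                          txt1 * "\n\n" * txt2
            push!(merged, Dict(:role => "user", :parts => [Dict("text" => merged_text)]))
            i += 2  # consumed a pair of user messages
        else
            push!(merged, conversation[i])
            i += 1
        end
    end
    return merged
end

conv = Dict{Symbol, Any}[
    Dict(:role => "user", :parts => [Dict("text" => "Hello")]),
    Dict(:role => "user", :parts => [Dict("text" => "World")]),
    Dict(:role => "model", :parts => [Dict("text" => "Hi!")]),
]
merged = merge_user_messages(conv)
# merged has two entries: one combined user message ("Hello\n\nWorld") and the model reply
````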
2 changes: 1 addition & 1 deletion src/llm_interface.jl
@@ -246,7 +246,7 @@ struct GoogleSchema <: AbstractGoogleSchema end
"Echoes the user's input back to them. Used for testing the implementation"
@kwdef mutable struct TestEchoGoogleSchema <: AbstractGoogleSchema
text::Any
status::Integer
response_status::Integer
model_id::String = ""
inputs::Any = nothing
end
10 changes: 9 additions & 1 deletion test/llm_anthropic.jl
@@ -153,4 +153,12 @@ end
@test schema2.inputs.system == "Act as a helpful AI assistant"
@test schema2.inputs.messages == [Dict("role" => "user", "content" => "Hello World")]
@test schema2.model_id == "claude-3-sonnet-20240229"
end
end

@testset "not implemented ai* functions" begin
@test_throws ErrorException aiembed(AnthropicSchema(), "prompt")
@test_throws ErrorException aiextract(AnthropicSchema(), "prompt")
@test_throws ErrorException aiclassify(AnthropicSchema(), "prompt")
@test_throws ErrorException aiscan(AnthropicSchema(), "prompt")
@test_throws ErrorException aiimage(AnthropicSchema(), "prompt")
end