Commit a392727

Merge branch 'master' into how-to-use-openrouter-to-access-multiple-ai-models-in-one-python-script

2 parents e3e3653 + 1558b79

File tree

9 files changed: +1297 −0 lines changed

geopandas-basics/README.md

Lines changed: 3 additions & 0 deletions

```markdown
# GeoPandas Basics: Maps, Projections, and Spatial Joins

This folder provides the code examples for the tutorial [GeoPandas Basics: Maps, Projections, and Spatial Joins](https://realpython.com/geopandas/).
```
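The notebook itself relies on GeoPandas, but the core idea of a spatial join — matching records by location rather than by a shared key — can be sketched in plain Python with bounding boxes. This is a simplification (real spatial joins test actual geometries and use spatial indexes), and all the data below is made up for illustration:

```python
# Minimal sketch of a spatial join: assign each point to the first
# region whose bounding box contains it. GeoPandas' sjoin() does the
# real version against true geometries; this only illustrates the idea.
regions = {
    "north": (0.0, 5.0, 10.0, 10.0),  # (min_x, min_y, max_x, max_y)
    "south": (0.0, 0.0, 10.0, 5.0),
}

points = [("a", 2.0, 7.5), ("b", 9.0, 1.0), ("c", 20.0, 20.0)]

def spatial_join(points, regions):
    joined = []
    for name, x, y in points:
        match = next(
            (
                region
                for region, (min_x, min_y, max_x, max_y) in regions.items()
                if min_x <= x <= max_x and min_y <= y <= max_y
            ),
            None,  # No region contains the point
        )
        joined.append((name, match))
    return joined

print(spatial_join(points, regions))
```

Point `c` falls outside every region, so it joins to `None` — the analogue of an unmatched row in a left spatial join.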

geopandas-basics/geopandas-basics.ipynb

Lines changed: 1165 additions & 0 deletions
Large diffs are not rendered by default.

ollama-python-sdk/README.md

Lines changed: 3 additions & 0 deletions

```markdown
# How to Integrate Local LLMs With Ollama and Python

This folder provides the code examples for the Real Python tutorial [How to Integrate Local LLMs With Ollama and Python](https://realpython.com/ollama-python/).
```

ollama-python-sdk/chat.py

Lines changed: 11 additions & 0 deletions

```python
from ollama import chat

messages = [
    {
        "role": "user",
        "content": "Explain what Python is in one sentence.",
    },
]

response = chat(model="llama3.2:latest", messages=messages)
print(response.message.content)
```

ollama-python-sdk/chat_context.py

Lines changed: 24 additions & 0 deletions

```python
from ollama import chat

messages = [
    {
        "role": "system",
        "content": "You are an expert Python tutor.",
    },
    {
        "role": "user",
        "content": "Define list comprehensions in a sentence.",
    },
]
response = chat(model="llama3.2:latest", messages=messages)
print(response.message.content)

messages.append(response.message)  # Keep context
messages.append(
    {
        "role": "user",
        "content": "Provide a short, practical example.",
    }
)
response = chat(model="llama3.2:latest", messages=messages)
print(response.message.content)
```
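The append pattern above is what gives the model memory: each reply is added to `messages` before the next call, so every request carries the full history. The pattern can be sketched without a running Ollama server by swapping in a stub — `fake_chat` below is a hypothetical stand-in, not part of the SDK:

```python
# Sketch of the context-keeping pattern. fake_chat stands in for
# ollama.chat; the real call returns a response object whose
# .message can be appended to the history directly.
def fake_chat(model, messages):
    return {"role": "assistant", "content": f"reply #{len(messages)}"}

messages = [{"role": "user", "content": "Define list comprehensions."}]

reply = fake_chat("llama3.2:latest", messages)
messages.append(reply)  # Keep context
messages.append({"role": "user", "content": "Provide an example."})

reply = fake_chat("llama3.2:latest", messages)
messages.append(reply)

for m in messages:
    print(m["role"], "->", m["content"])
```

After two rounds the history alternates user/assistant entries, which is exactly what the second `chat()` call in the script receives.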

ollama-python-sdk/generate_code.py

Lines changed: 17 additions & 0 deletions

```python
from ollama import generate

prompt = """
Write a Python function fizzbuzz(n: int) -> List[str] that:

- Returns a list of strings for the numbers 1..n
- Uses "Fizz" for multiples of 3
- Uses "Buzz" for multiples of 5
- Uses "FizzBuzz" for multiples of both 3 and 5
- Uses the number itself (as a string) otherwise
- Raises ValueError if n < 1

Include type hints compatible with Python 3.8.
"""

response = generate(model="codellama:latest", prompt=prompt)
print(response.response)
```
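The prompt above pins down an exact spec, which makes the model's output easy to check. A hand-written reference implementation of that spec (not model output) might look like:

```python
from typing import List

def fizzbuzz(n: int) -> List[str]:
    """Reference implementation of the spec in the prompt above."""
    if n < 1:
        raise ValueError("n must be at least 1")
    result = []
    for i in range(1, n + 1):
        if i % 15 == 0:  # Multiple of both 3 and 5
            result.append("FizzBuzz")
        elif i % 3 == 0:
            result.append("Fizz")
        elif i % 5 == 0:
            result.append("Buzz")
        else:
            result.append(str(i))
    return result

print(fizzbuzz(15))
```

Comparing CodeLlama's answer against a known-good version like this is a quick way to spot hallucinated edge cases, such as a missing `ValueError`.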

ollama-python-sdk/generate_text.py

Lines changed: 8 additions & 0 deletions

```python
from ollama import generate

response = generate(
    model="llama3.2:latest",
    prompt="Explain what Python is in one sentence.",
)

print(response.response)
```

ollama-python-sdk/streams.py

Lines changed: 15 additions & 0 deletions

```python
from ollama import chat

stream = chat(
    model="llama3.2:latest",
    messages=[
        {
            "role": "user",
            "content": "Explain Python dataclasses with a quick example.",
        }
    ],
    stream=True,
)

for chunk in stream:
    print(chunk.message.content, end="", flush=True)
```
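With `stream=True`, `chat()` returns an iterator of chunks rather than one finished response, so tokens can be printed as they arrive. The consumption pattern can be sketched with a plain generator standing in for the stream — `fake_stream` below is hypothetical, not part of the SDK:

```python
# Sketch of consuming a token stream; fake_stream stands in for
# the iterator that chat(..., stream=True) returns.
def fake_stream(text):
    for word in text.split():
        yield word + " "  # One small chunk at a time

pieces = []
for chunk in fake_stream("Dataclasses reduce boilerplate in Python"):
    pieces.append(chunk)
    print(chunk, end="", flush=True)  # Appears incrementally

print()
full_text = "".join(pieces)
```

Collecting the chunks while printing them, as here, is handy when you need both the live display and the complete reply afterwards (for example, to append it to `messages`).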

ollama-python-sdk/tool_calling.py

Lines changed: 51 additions & 0 deletions

```python
import math

from ollama import chat


# Define a tool as a Python function
def square_root(number: float) -> float:
    """Calculate the square root of a number.

    Args:
        number: The number to calculate the square root for.

    Returns:
        The square root of the number.
    """
    return math.sqrt(number)


messages = [
    {
        "role": "user",
        "content": "What is the square root of 36?",
    }
]

response = chat(
    model="llama3.2:latest",  # You may want to try this model: llama3.1:8b
    messages=messages,
    tools=[square_root],  # Pass the tools along with the prompt
)

# Append the response for context
messages.append(response.message)

if response.message.tool_calls:
    tool = response.message.tool_calls[0]
    # Call the tool
    result = square_root(float(tool.function.arguments["number"]))

    # Append the tool result
    messages.append(
        {
            "role": "tool",
            "tool_name": tool.function.name,
            "content": str(result),
        }
    )

# Obtain the final answer
final_response = chat(model="llama3.2:latest", messages=messages)
print(final_response.message.content)
```
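The script above hard-codes the call to `square_root`. Once several tools are registered, the usual pattern is to dispatch on the reported function name via a lookup table. A stubbed sketch of that pattern follows — `FunctionCall` is a hypothetical stand-in for the SDK's tool-call object, with attribute names mirroring the `tool.function.name` / `tool.function.arguments` access used above:

```python
import math
from dataclasses import dataclass

# Stand-in for the SDK's tool-call object (hypothetical).
@dataclass
class FunctionCall:
    name: str
    arguments: dict

def square_root(number: float) -> float:
    return math.sqrt(number)

def cube(number: float) -> float:
    return number ** 3

# Registry mapping tool names to callables
TOOLS = {"square_root": square_root, "cube": cube}

def dispatch(call: FunctionCall) -> str:
    func = TOOLS[call.name]  # Look the tool up by name
    result = func(float(call.arguments["number"]))
    return str(result)  # Tool results go back to the model as strings

print(dispatch(FunctionCall("square_root", {"number": "36"})))
```

With a registry like this, adding a tool means adding one entry to `TOOLS` instead of another `if` branch around the model response.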
