Commit 9be941e

Merge pull request #43 from code-dot-org/aws-bedrock-llama
add test script for aws bedrock access
2 parents: 3a8e24f + 730343f

File tree: 5 files changed (+40 −3 lines)

.gitignore

Lines changed: 0 additions & 3 deletions
@@ -1,9 +1,6 @@
 config.txt
 __pycache__

-# Ignore local Ruby config necessary for our custom AWS auth solution
-.ruby-version
-
 # pyenv config
 .python-version


.ruby-version

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
+3.0.5

README.md

Lines changed: 6 additions & 0 deletions
@@ -63,6 +63,12 @@ set python 3.11 for the aiproxy repo:
 create a python virtual environment at the top of the directory:
 * `python -m venv .venv`

+ensure aws access for accessing aws bedrock models:
+* from the code-dot-org repo root, run:
+  * `bin/aws_access`
+* from this repo's root, run:
+  * `gem install aws-google`
+
 #### run

 Activate the virtual environment:
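
A quick way to sanity-check this setup, beyond the steps the README adds: once `bin/aws_access` has populated credentials, any boto3 client should pick them up from the environment. The sketch below is hypothetical and not part of this commit; it assumes credentials are already in place and simply lists the Bedrock foundation models visible to the account.

    # Hypothetical smoke test, not part of this commit. Assumes
    # bin/aws_access has already populated AWS credentials.
    import boto3

    # 'bedrock' is the control-plane service; the test script added in
    # this commit uses 'bedrock-runtime' for actual inference calls.
    client = boto3.client(service_name='bedrock')

    # Listing foundation models exercises both the credentials and
    # Bedrock service access in the configured region.
    for summary in client.list_foundation_models()['modelSummaries']:
        print(summary['modelId'])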

bin/aws_llama_test.py

Lines changed: 32 additions & 0 deletions
@@ -0,0 +1,32 @@
+#!/usr/bin/env python
+
+import subprocess
+import boto3
+import json
+
+# check aws access
+try:
+    result = subprocess.run('aws sts get-caller-identity', shell=True, check=True, text=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+    print(f"AWS access configured: {result.stdout}")
+except subprocess.CalledProcessError as e:
+    print(f"AWS access not configured: {e} {e.stderr} Please see README.md and make sure you ran `gem install aws-google` and `bin/aws_access`")
+    exit(1)
+
+bedrock = boto3.client(service_name='bedrock-runtime')
+
+body = json.dumps({
+    "prompt": "\n\nHuman: explain black holes to 8th graders\n\nAssistant:",
+    "max_gen_len": 128,
+    "temperature": 0.1,
+    "top_p": 0.9,
+})
+
+modelId = 'meta.llama2-13b-chat-v1'
+accept = 'application/json'
+contentType = 'application/json'
+
+response = bedrock.invoke_model(body=body, modelId=modelId, accept=accept, contentType=contentType)
+
+response_body = json.loads(response.get('body').read())
+
+print(response_body.get('generation'))
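
The test script blocks until the whole completion is ready, then prints the `generation` field from the response body. Bedrock also supports streaming responses; the variant below is a hypothetical sketch, not part of this commit, using `invoke_model_with_response_stream` with the same request body and model ID so partial output prints as it arrives.

    # Hypothetical streaming variant of the test script, not part of
    # this commit. Same request body and model ID as above.
    import json
    import boto3

    bedrock = boto3.client(service_name='bedrock-runtime')

    body = json.dumps({
        "prompt": "\n\nHuman: explain black holes to 8th graders\n\nAssistant:",
        "max_gen_len": 128,
        "temperature": 0.1,
        "top_p": 0.9,
    })

    response = bedrock.invoke_model_with_response_stream(
        body=body,
        modelId='meta.llama2-13b-chat-v1',
        accept='application/json',
        contentType='application/json',
    )

    # Each stream event carries a JSON chunk; for Llama 2 the partial
    # text is under the "generation" key.
    for event in response.get('body'):
        chunk = event.get('chunk')
        if chunk:
            print(json.loads(chunk['bytes']).get('generation', ''), end='', flush=True)
    print()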

requirements.txt

Lines changed: 1 addition & 0 deletions
@@ -8,3 +8,4 @@ requests-mock==1.11.0
 coverage==7.3.2
 scikit-learn~=1.3.2
 gdown~=4.7.1
+boto3==1.34.30
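
One inference about the new pin, not stated in the commit: boto3 has to be recent enough to know about the `bedrock-runtime` service, and 1.34.30 is. On an older install the client constructor itself fails, which makes a fail-fast check cheap:

    # Hypothetical fail-fast check, not part of this commit: an outdated
    # boto3/botocore raises UnknownServiceError for 'bedrock-runtime'.
    import boto3
    import botocore.exceptions

    try:
        boto3.client(service_name='bedrock-runtime')
    except botocore.exceptions.UnknownServiceError:
        raise SystemExit("boto3 too old for Bedrock; run: pip install -r requirements.txt")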
