
Commit 85bf9ca

Stainless Bot (stainless-app[bot]) authored and committed

chore: update SDK settings (#221)

1 parent dc8ff33, commit 85bf9ca

79 files changed: +1168 -1219 lines

CONTRIBUTING.md: 1 addition & 1 deletion

@@ -32,7 +32,7 @@ $ pip install -r requirements-dev.lock
 ## Modifying/Adding code
 
 Most of the SDK is generated code, and any modified code will be overridden on the next generation. The
-`src/openlayer-test/lib/` and `examples/` directories are exceptions and will never be overridden.
+`src/openlayer/lib/` and `examples/` directories are exceptions and will never be overridden.
 
 ## Adding and running examples
 
README.md: 95 additions & 78 deletions

@@ -1,6 +1,6 @@
 # Openlayer Python API library
 
-[![PyPI version](https://img.shields.io/pypi/v/openlayer-test.svg)](https://pypi.org/project/openlayer-test/)
+[![PyPI version](https://img.shields.io/pypi/v/openlayer.svg)](https://pypi.org/project/openlayer/)
 
 The Openlayer Python library provides convenient access to the Openlayer REST API from any Python 3.7+
 application. The library includes type definitions for all request params and response fields,

@@ -16,7 +16,7 @@ The REST API documentation can be found [on openlayer.com](https://openlayer.com
 
 ```sh
 # install from PyPI
-pip install --pre openlayer-test
+pip install --pre openlayer
 ```
 
 ## Usage

@@ -25,7 +25,7 @@ The full API of this library can be found in [api.md](api.md).
 
 ```python
 import os
-from openlayer-test import Openlayer
+from openlayer import Openlayer
 
 client = Openlayer(
     # This is the default and can be omitted

@@ -41,13 +41,15 @@ data_stream_response = client.inference_pipelines.data.stream(
         "cost_column_name": "cost",
         "timestamp_column_name": "timestamp",
     },
-    rows=[{
-        "user_query": "what's the meaning of life?",
-        "output": "42",
-        "tokens": 7,
-        "cost": 0.02,
-        "timestamp": 1620000000,
-    }],
+    rows=[
+        {
+            "user_query": "what's the meaning of life?",
+            "output": "42",
+            "tokens": 7,
+            "cost": 0.02,
+            "timestamp": 1620000000,
+        }
+    ],
 )
 print(data_stream_response.success)
 ```
@@ -64,32 +66,36 @@ Simply import `AsyncOpenlayer` instead of `Openlayer` and use `await` with each
 ```python
 import os
 import asyncio
-from openlayer-test import AsyncOpenlayer
+from openlayer import AsyncOpenlayer
 
 client = AsyncOpenlayer(
     # This is the default and can be omitted
     api_key=os.environ.get("OPENLAYER_API_KEY"),
 )
 
+
 async def main() -> None:
-    data_stream_response = await client.inference_pipelines.data.stream(
-        "182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
-        config={
-            "input_variable_names": ["user_query"],
-            "output_column_name": "output",
-            "num_of_token_column_name": "tokens",
-            "cost_column_name": "cost",
-            "timestamp_column_name": "timestamp",
-        },
-        rows=[{
-            "user_query": "what's the meaning of life?",
-            "output": "42",
-            "tokens": 7,
-            "cost": 0.02,
-            "timestamp": 1620000000,
-        }],
-    )
-    print(data_stream_response.success)
+    data_stream_response = await client.inference_pipelines.data.stream(
+        "182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
+        config={
+            "input_variable_names": ["user_query"],
+            "output_column_name": "output",
+            "num_of_token_column_name": "tokens",
+            "cost_column_name": "cost",
+            "timestamp_column_name": "timestamp",
+        },
+        rows=[
+            {
+                "user_query": "what's the meaning of life?",
+                "output": "42",
+                "tokens": 7,
+                "cost": 0.02,
+                "timestamp": 1620000000,
+            }
+        ],
+    )
+    print(data_stream_response.success)
+
 
 asyncio.run(main())
 ```
@@ -107,16 +113,16 @@ Typed requests and responses provide autocomplete and documentation within your
 
 ## Handling errors
 
-When the library is unable to connect to the API (for example, due to network connection problems or a timeout), a subclass of `openlayer-test.APIConnectionError` is raised.
+When the library is unable to connect to the API (for example, due to network connection problems or a timeout), a subclass of `openlayer.APIConnectionError` is raised.
 
 When the API returns a non-success status code (that is, 4xx or 5xx
-response), a subclass of `openlayer-test.APIStatusError` is raised, containing `status_code` and `response` properties.
+response), a subclass of `openlayer.APIStatusError` is raised, containing `status_code` and `response` properties.
 
-All errors inherit from `openlayer-test.APIError`.
+All errors inherit from `openlayer.APIError`.
 
 ```python
-import openlayer-test
-from openlayer-test import Openlayer
+import openlayer
+from openlayer import Openlayer
 
 client = Openlayer()
 

@@ -130,20 +136,22 @@ try:
             "cost_column_name": "cost",
             "timestamp_column_name": "timestamp",
         },
-        rows=[{
-            "user_query": "what's the meaning of life?",
-            "output": "42",
-            "tokens": 7,
-            "cost": 0.02,
-            "timestamp": 1620000000,
-        }],
+        rows=[
+            {
+                "user_query": "what's the meaning of life?",
+                "output": "42",
+                "tokens": 7,
+                "cost": 0.02,
+                "timestamp": 1620000000,
+            }
+        ],
     )
-except openlayer-test.APIConnectionError as e:
+except openlayer.APIConnectionError as e:
     print("The server could not be reached")
-    print(e.__cause__) # an underlying Exception, likely raised within httpx.
-except openlayer-test.RateLimitError as e:
+    print(e.__cause__)  # an underlying Exception, likely raised within httpx.
+except openlayer.RateLimitError as e:
     print("A 429 status code was received; we should back off a bit.")
-except openlayer-test.APIStatusError as e:
+except openlayer.APIStatusError as e:
     print("Another non-200-range status code was received")
     print(e.status_code)
     print(e.response)
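A note on the `try`/`except` example above: the ordering of the `except` clauses matters. In SDKs of this style, `RateLimitError` is typically a subclass of `APIStatusError` (and everything inherits from `APIError`), so the more specific class must be caught first or the base-class handler would swallow it. A minimal standalone sketch with toy exception classes, not the SDK's actual definitions:

```python
# Toy error hierarchy mirroring the one described in the README diff above.
# These are illustrative stand-ins, not the SDK's real classes.
class APIError(Exception):
    pass

class APIConnectionError(APIError):
    pass

class APIStatusError(APIError):
    def __init__(self, status_code: int) -> None:
        super().__init__(f"status {status_code}")
        self.status_code = status_code

class RateLimitError(APIStatusError):
    def __init__(self) -> None:
        super().__init__(429)

def classify(exc: APIError) -> str:
    # Order matters: a subclass must be caught before its base class,
    # otherwise `except APIStatusError` would also swallow RateLimitError.
    try:
        raise exc
    except APIConnectionError:
        return "connection"
    except RateLimitError:
        return "rate_limit"
    except APIStatusError:
        return "status"

print(classify(RateLimitError()))      # rate_limit
print(classify(APIStatusError(500)))   # status
print(classify(APIConnectionError()))  # connection
```

Swapping the `RateLimitError` and `APIStatusError` clauses would silently reclassify 429s as generic status errors, which is why the README keeps the specific handler first.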
@@ -171,7 +179,7 @@ Connection errors (for example, due to a network connectivity problem), 408 Requ
 You can use the `max_retries` option to configure or disable retry settings:
 
 ```python
-from openlayer-test import Openlayer
+from openlayer import Openlayer
 
 # Configure the default for all requests:
 client = Openlayer(

@@ -180,7 +188,7 @@ client = Openlayer(
 )
 
 # Or, configure per-request:
-client.with_options(max_retries = 5).inference_pipelines.data.stream(
+client.with_options(max_retries=5).inference_pipelines.data.stream(
     "182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
     config={
         "input_variable_names": ["user_query"],

@@ -189,13 +197,15 @@ client.with_options(max_retries = 5).inference_pipelines.data.stream(
         "cost_column_name": "cost",
         "timestamp_column_name": "timestamp",
     },
-    rows=[{
-        "user_query": "what's the meaning of life?",
-        "output": "42",
-        "tokens": 7,
-        "cost": 0.02,
-        "timestamp": 1620000000,
-    }],
+    rows=[
+        {
+            "user_query": "what's the meaning of life?",
+            "output": "42",
+            "tokens": 7,
+            "cost": 0.02,
+            "timestamp": 1620000000,
+        }
+    ],
 )
 ```
 
@@ -205,7 +215,7 @@ By default requests time out after 1 minute. You can configure this with a `time
 which accepts a float or an [`httpx.Timeout`](https://www.python-httpx.org/advanced/#fine-tuning-the-configuration) object:
 
 ```python
-from openlayer-test import Openlayer
+from openlayer import Openlayer
 
 # Configure the default for all requests:
 client = Openlayer(

@@ -219,7 +229,7 @@ client = Openlayer(
 )
 
 # Override per-request:
-client.with_options(timeout = 5.0).inference_pipelines.data.stream(
+client.with_options(timeout=5.0).inference_pipelines.data.stream(
     "182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
     config={
         "input_variable_names": ["user_query"],

@@ -228,13 +238,15 @@ client.with_options(timeout = 5.0).inference_pipelines.data.stream(
         "cost_column_name": "cost",
         "timestamp_column_name": "timestamp",
     },
-    rows=[{
-        "user_query": "what's the meaning of life?",
-        "output": "42",
-        "tokens": 7,
-        "cost": 0.02,
-        "timestamp": 1620000000,
-    }],
+    rows=[
+        {
+            "user_query": "what's the meaning of life?",
+            "output": "42",
+            "tokens": 7,
+            "cost": 0.02,
+            "timestamp": 1620000000,
+        }
+    ],
 )
 ```
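Both per-request overrides shown in the diffs above (`max_retries=5` and `timeout=5.0`) go through `with_options`, which in clients of this style returns a copy of the client with the given options replaced rather than mutating the original. A standalone sketch of that copy-with-overrides pattern, using a hypothetical `ToyClient` rather than the real SDK class:

```python
import copy

class ToyClient:
    # Hypothetical stand-in for the SDK client, keeping only the two
    # options relevant to the README examples above.
    def __init__(self, max_retries: int = 2, timeout: float = 60.0) -> None:
        self.max_retries = max_retries
        self.timeout = timeout

    def with_options(self, **overrides) -> "ToyClient":
        # Shallow-copy the client and apply the overrides, so the
        # original keeps its defaults for subsequent requests.
        clone = copy.copy(self)
        for name, value in overrides.items():
            if not hasattr(clone, name):
                raise TypeError(f"unknown option: {name}")
            setattr(clone, name, value)
        return clone

base = ToyClient()
per_request = base.with_options(max_retries=5, timeout=5.0)
print(per_request.max_retries, per_request.timeout)  # 5 5.0
print(base.max_retries, base.timeout)                # 2 60.0
```

This also explains the `max_retries = 5` to `max_retries=5` change in the diff: it is purely stylistic (PEP 8 omits spaces around `=` for keyword arguments), with no behavioral effect.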

@@ -271,7 +283,7 @@ if response.my_field is None:
 The "raw" Response object can be accessed by prefixing `.with_raw_response.` to any HTTP method call, e.g.,
 
 ```py
-from openlayer-test import Openlayer
+from openlayer import Openlayer
 
 client = Openlayer()
 response = client.inference_pipelines.data.with_raw_response.stream(

@@ -297,9 +309,9 @@ data = response.parse() # get the object that `inference_pipelines.data.stream(
 print(data.success)
 ```
 
-These methods return an [`APIResponse`](https://github.com/openlayer-ai/openlayer-python/tree/main/src/openlayer-test/_response.py) object.
+These methods return an [`APIResponse`](https://github.com/openlayer-ai/openlayer-python/tree/main/src/openlayer/_response.py) object.
 
-The async client returns an [`AsyncAPIResponse`](https://github.com/openlayer-ai/openlayer-python/tree/main/src/openlayer-test/_response.py) with the same structure, the only difference being `await`able methods for reading the response content.
+The async client returns an [`AsyncAPIResponse`](https://github.com/openlayer-ai/openlayer-python/tree/main/src/openlayer/_response.py) with the same structure, the only difference being `await`able methods for reading the response content.
 
 #### `.with_streaming_response`
 
@@ -317,18 +329,20 @@ with client.inference_pipelines.data.with_streaming_response.stream(
         "cost_column_name": "cost",
         "timestamp_column_name": "timestamp",
     },
-    rows=[{
-        "user_query": "what's the meaning of life?",
-        "output": "42",
-        "tokens": 7,
-        "cost": 0.02,
-        "timestamp": 1620000000,
-    }],
-) as response :
-    print(response.headers.get('X-My-Header'))
+    rows=[
+        {
+            "user_query": "what's the meaning of life?",
+            "output": "42",
+            "tokens": 7,
+            "cost": 0.02,
+            "timestamp": 1620000000,
+        }
+    ],
+) as response:
+    print(response.headers.get("X-My-Header"))
 
     for line in response.iter_lines():
-    print(line)
+        print(line)
 ```
 
 The context manager is required so that the response will reliably be closed.
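A standalone illustration of why the context manager gives that guarantee, using a hypothetical `ToyStreamingResponse` rather than the SDK's actual class: `__exit__` runs whether the `with` block finishes normally or raises, so the response is always closed.

```python
class ToyStreamingResponse:
    # Hypothetical stand-in for a streaming response object; not the SDK's class.
    def __init__(self, lines):
        self._lines = lines
        self.closed = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        # Runs on normal exit *and* when an exception escapes the block.
        self.close()
        return False  # do not suppress exceptions

    def iter_lines(self):
        yield from self._lines

    def close(self):
        self.closed = True

resp = ToyStreamingResponse(["line 1", "line 2"])
with resp as r:
    for line in r.iter_lines():
        print(line)
print(resp.closed)  # True
```

Without the `with` statement, an exception raised while iterating would leave the underlying connection open until garbage collection, which is exactly what the README's note warns against.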
@@ -377,12 +391,15 @@ You can directly override the [httpx client](https://www.python-httpx.org/api/#c
 - Additional [advanced](https://www.python-httpx.org/advanced/#client-instances) functionality
 
 ```python
-from openlayer-test import Openlayer, DefaultHttpxClient
+from openlayer import Openlayer, DefaultHttpxClient
 
 client = Openlayer(
     # Or use the `OPENLAYER_BASE_URL` env var
     base_url="http://my.test.server.example.com:8083",
-    http_client=DefaultHttpxClient(proxies="http://my.test.proxy.example.com", transport=httpx.HTTPTransport(local_address="0.0.0.0")),
+    http_client=DefaultHttpxClient(
+        proxies="http://my.test.proxy.example.com",
+        transport=httpx.HTTPTransport(local_address="0.0.0.0"),
+    ),
 )
 ```
