data = response.parse()  # get the object that `inference_pipelines.data.stream()` would have returned
print(data.success)
```

These methods return an [`APIResponse`](https://github.com/openlayer-ai/openlayer-python/tree/main/src/openlayer/_response.py) object.
The above interface eagerly reads the full response body when you make the request, which may not always be what you want.

To stream the response body, use `.with_streaming_response` instead, which requires a context manager and only reads the response body once you call `.read()`, `.text()`, `.json()`, `.iter_bytes()`, `.iter_text()`, `.iter_lines()` or `.parse()`. In the async client, these are async methods.
```python
with client.inference_pipelines.data.with_streaming_response.stream(
    "182bd5e5-6e1a-4fe4-a799-aa6d9a6ab26e",
    config={
        "input_variable_names": ["user_query"],
        "output_column_name": "output",
        "num_of_token_column_name": "tokens",
        "cost_column_name": "cost",
        "timestamp_column_name": "timestamp",
    },
    rows=[
        {
            "user_query": "what's the meaning of life?",
            "output": "42",
            "tokens": 7,
            "cost": 0.02,
            "timestamp": 1620000000,
        }
    ],
) as response:
    print(response.headers.get("X-My-Header"))

    for line in response.iter_lines():
        print(line)
```
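To see why the context-manager form defers work, here is a minimal, self-contained sketch of the lazy-read pattern described above. This is a hypothetical illustration, not the SDK's actual implementation: `StreamedResponse` and `with_streaming_response` are stand-in names, and a real client would hold an open HTTP connection instead of an in-memory list of chunks.

```python
from contextlib import contextmanager


class StreamedResponse:
    """Toy stand-in for a streamed response: the body is not
    consumed until an accessor such as iter_lines() is called."""

    def __init__(self, chunks):
        self._chunks = chunks
        self.consumed = False

    def iter_lines(self):
        self.consumed = True  # the body is only read here, on demand
        yield from self._chunks


@contextmanager
def with_streaming_response(chunks):
    # Entering the context opens the "connection" but reads nothing.
    resp = StreamedResponse(chunks)
    try:
        yield resp
    finally:
        pass  # a real client would close the HTTP connection here


with with_streaming_response(["line1", "line2"]) as response:
    assert response.consumed is False  # nothing has been read yet
    lines = list(response.iter_lines())  # body is consumed only now

print(lines)  # -> ['line1', 'line2']
```

The same structure explains why the SDK requires a context manager: the body stays unread (and the connection open) until one of the read accessors runs, so the `with` block guarantees the connection is released even if you never read the body.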