> Once this package is [published to PyPI](https://app.stainlessapi.com/docs/guides/publish), this will become: `pip install --pre contextual-sdk`
## Usage

The full API of this library can be found in [api.md](api.md).
```python
import os

from contextual_sdk import ContextualAI

client = ContextualAI(
    api_key=os.environ.get("CONTEXTUAL_API_KEY"),  # This is the default and can be omitted
)

# Example request; the full API is documented in api.md.
datastore = client.datastores.create(name="name")
print(datastore.id)
```
## Async usage

Simply import `AsyncContextualAI` instead of `ContextualAI` and use `await` with each API call:
```python
import os
import asyncio

from contextual_sdk import AsyncContextualAI

client = AsyncContextualAI(
    api_key=os.environ.get("CONTEXTUAL_API_KEY"),  # This is the default and can be omitted
)


async def main() -> None:
    datastore = await client.datastores.create(name="name")
    print(datastore.id)


asyncio.run(main())
```
## Using types

Nested request parameters are [TypedDicts](https://docs.python.org/3/library/typing.html#typing.TypedDict).

Typed requests and responses provide autocomplete and documentation within your editor. If you would like to see type errors in VS Code to help catch bugs earlier, set `python.analysis.typeCheckingMode` to `basic`.
## Pagination

List methods in the Contextual AI API are paginated.

This library provides auto-paginating iterators with each list response, so you do not have to request successive pages manually:

```python
from contextual_sdk import ContextualAI

client = ContextualAI()

all_datastores = []
# Automatically fetches more pages as needed.
for datastore in client.datastores.list():
    # Do something with datastore here
    all_datastores.append(datastore)
print(all_datastores)
```
Or, asynchronously:

```python
import asyncio

from contextual_sdk import AsyncContextualAI

client = AsyncContextualAI()


async def main() -> None:
    all_datastores = []
    # Iterate through items across all pages, issuing requests as needed.
    async for datastore in client.datastores.list():
        all_datastores.append(datastore)
    print(all_datastores)


asyncio.run(main())
```
Alternatively, you can use the `.has_next_page()`, `.next_page_info()`, or `.get_next_page()` methods for more granular control when working with pages:

```python
first_page = await client.datastores.list()
if first_page.has_next_page():
    print(f"will fetch next page using these details: {first_page.next_page_info()}")
    next_page = await first_page.get_next_page()
    print(f"number of items we just fetched: {len(next_page.datastores)}")
```
## Handling errors

When the library is unable to connect to the API (for example, due to network connection problems or a timeout), a subclass of `contextual_sdk.APIConnectionError` is raised.

When the API returns a non-success status code (that is, a 4xx or 5xx response), a subclass of `contextual_sdk.APIStatusError` is raised, containing `status_code` and `response` properties.

All errors inherit from `contextual_sdk.APIError`.
```python
import contextual_sdk
from contextual_sdk import ContextualAI

client = ContextualAI()

try:
    client.datastores.create(
        name="name",
    )
except contextual_sdk.APIConnectionError as e:
    print("The server could not be reached")
    print(e.__cause__)  # an underlying Exception, likely raised within httpx.
except contextual_sdk.RateLimitError as e:
    print("A 429 status code was received; we should back off a bit.")
except contextual_sdk.APIStatusError as e:
    print("Another non-200-range status code was received")
    print(e.status_code)
    print(e.response)
```
### Retries

Connection errors (for example, due to a network connectivity problem) and 408 Request Timeout responses are retried by default.

You can use the `max_retries` option to configure or disable retry settings:

```python
from contextual_sdk import ContextualAI

# Configure the default for all requests:
client = ContextualAI(
    max_retries=0,  # disable retries for all requests
)
```
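If per-request configuration is supported, it is often exposed through something like a `with_options()` helper; that helper is an assumption here, not confirmed by the text above:

```python
# Assumption: a `with_options()` helper exists for per-request settings.
client.with_options(max_retries=5).datastores.create(
    name="name",
)
```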
### Timeouts

By default requests time out after 1 minute. You can configure this with a `timeout` option, which accepts a float or an [`httpx.Timeout`](https://www.python-httpx.org/advanced/#fine-tuning-the-configuration) object:

```python
from contextual_sdk import ContextualAI

# Configure the default for all requests:
client = ContextualAI(
    timeout=20.0,  # 20 seconds
)
```
### Accessing raw response data (e.g. headers)

The "raw" Response object can be accessed by prefixing `.with_raw_response.` to any HTTP method call, e.g.,
```python
response = client.datastores.with_raw_response.create(
    name="name",
)

datastore = response.parse()  # get the object that `datastores.create()` would have returned
print(datastore.id)
```
These methods return an [`APIResponse`](https://github.com/stainless-sdks/sunrise-python/tree/main/src/contextual_sdk/_response.py) object.

The async client returns an [`AsyncAPIResponse`](https://github.com/stainless-sdks/sunrise-python/tree/main/src/contextual_sdk/_response.py) with the same structure, the only difference being `await`able methods for reading the response content.
#### `.with_streaming_response`
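A minimal sketch of how such a prefix is typically used, assuming `.with_streaming_response` mirrors `.with_raw_response` but returns a context manager that streams the body rather than reading it eagerly (the `iter_lines()` helper shown here is an assumption):

```python
# Sketch only: assumes `.with_streaming_response` returns a context manager
# and that the streamed response exposes an `iter_lines()` helper.
with client.datastores.with_streaming_response.create(
    name="name",
) as response:
    for line in response.iter_lines():
        print(line)
```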
### Configuring the HTTP client

You can directly override the [httpx client](https://www.python-httpx.org/api/#client) to customize it for your use case.
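A rough sketch of what that can look like, assuming the constructor accepts a pre-configured `httpx.Client` via an `http_client` parameter (the parameter name is an assumption here, not confirmed above):

```python
import httpx
from contextual_sdk import ContextualAI

# Assumption: the constructor accepts a customized httpx.Client via
# an `http_client` parameter; adjust to the client's actual signature.
client = ContextualAI(
    http_client=httpx.Client(
        proxy="http://my.test.proxy.example.com",
        transport=httpx.HTTPTransport(local_address="0.0.0.0"),
    ),
)
```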
### Managing HTTP resources

By default the library closes underlying HTTP connections whenever the client is [garbage collected](https://docs.python.org/3/reference/datamodel.html#object.__del__). You can manually close the client using the `.close()` method if desired, or with a context manager that closes when exiting.

```py
from contextual_sdk import ContextualAI

with ContextualAI() as client:
    # make requests here
    ...

# HTTP client is now closed
```
### Determining the installed version

If you've upgraded to the latest version but aren't seeing any new features you were expecting, your Python environment may still be using an older version.

You can determine the version that is being used at runtime with:
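A minimal sketch, assuming the package follows the common convention of exposing a `__version__` attribute:

```python
import contextual_sdk

# Assumption: the SDK exposes its version via the conventional __version__ attribute.
print(contextual_sdk.__version__)
```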