Commit 3c4ff11

feat(api): update via SDK Studio

1 parent d0ba657

136 files changed (+712, -724)


.stats.yml (+1, -1)

@@ -1,2 +1,2 @@
 configured_endpoints: 30
-openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/contextual-ai%2Fsunrise-798cfd0b1503f37e06a9fe9d67f60dda0c3bad6c008db4da87701935c7237f06.yml
+openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/contextual-ai%2Fsunrise-e75bdcdab9ab956493c7bf6fd9cafbc2e6f0e40fc69e22c8c10b26a0bcb68032.yml

CONTRIBUTING.md (+1, -1)

@@ -37,7 +37,7 @@ $ pip install -r requirements-dev.lock
 
 Most of the SDK is generated code. Modifications to code will be persisted between generations, but may
 result in merge conflicts between manual patches and changes from the generator. The generator will never
-modify the contents of the `src/sunrise/lib/` and `examples/` directories.
+modify the contents of the `src/contextual/lib/` and `examples/` directories.
 
 ## Adding and running examples
 

LICENSE (+1, -1)

@@ -186,7 +186,7 @@
       same "printed page" as the copyright notice for easier
       identification within third-party archives.
 
-   Copyright 2024 Sunrise
+   Copyright 2024 Contextual AI
 
    Licensed under the Apache License, Version 2.0 (the "License");
    you may not use this file except in compliance with the License.

README.md (+40, -40)

@@ -1,16 +1,16 @@
-# Sunrise Python API library
+# Contextual AI Python API library
 
-[![PyPI version](https://img.shields.io/pypi/v/sunrise.svg)](https://pypi.org/project/sunrise/)
+[![PyPI version](https://img.shields.io/pypi/v/contextual.svg)](https://pypi.org/project/contextual/)
 
-The Sunrise Python library provides convenient access to the Sunrise REST API from any Python 3.8+
+The Contextual AI Python library provides convenient access to the Contextual AI REST API from any Python 3.8+
 application. The library includes type definitions for all request params and response fields,
 and offers both synchronous and asynchronous clients powered by [httpx](https://github.com/encode/httpx).
 
 It is generated with [Stainless](https://www.stainlessapi.com/).
 
 ## Documentation
 
-The REST API documentation can be found on [docs.sunrise.com](https://docs.sunrise.com). The full API of this library can be found in [api.md](api.md).
+The REST API documentation can be found on [docs.contextual.ai](https://docs.contextual.ai). The full API of this library can be found in [api.md](api.md).
 
 ## Installation
 
@@ -20,18 +20,18 @@ pip install git+ssh://git@github.com/stainless-sdks/sunrise-python.git
 ```
 
 > [!NOTE]
-> Once this package is [published to PyPI](https://app.stainlessapi.com/docs/guides/publish), this will become: `pip install --pre sunrise`
+> Once this package is [published to PyPI](https://app.stainlessapi.com/docs/guides/publish), this will become: `pip install --pre contextual`
 
 ## Usage
 
 The full API of this library can be found in [api.md](api.md).
 
 ```python
 import os
-from sunrise import Sunrise
+from contextual import ContextualAI
 
-client = Sunrise(
-    bearer_token=os.environ.get("BEARER_TOKEN"),  # This is the default and can be omitted
+client = ContextualAI(
+    api_key=os.environ.get("CONTEXTUAL_API_KEY"),  # This is the default and can be omitted
 )
 
 create_datastore_output = client.datastores.create(
@@ -40,22 +40,22 @@ create_datastore_output = client.datastores.create(
 print(create_datastore_output.id)
 ```
 
-While you can provide a `bearer_token` keyword argument,
+While you can provide an `api_key` keyword argument,
 we recommend using [python-dotenv](https://pypi.org/project/python-dotenv/)
-to add `BEARER_TOKEN="My Bearer Token"` to your `.env` file
-so that your Bearer Token is not stored in source control.
+to add `CONTEXTUAL_API_KEY="My API Key"` to your `.env` file
+so that your API Key is not stored in source control.
 
 ## Async usage
 
-Simply import `AsyncSunrise` instead of `Sunrise` and use `await` with each API call:
+Simply import `AsyncContextualAI` instead of `ContextualAI` and use `await` with each API call:
 
 ```python
 import os
 import asyncio
-from sunrise import AsyncSunrise
+from contextual import AsyncContextualAI
 
-client = AsyncSunrise(
-    bearer_token=os.environ.get("BEARER_TOKEN"),  # This is the default and can be omitted
+client = AsyncContextualAI(
+    api_key=os.environ.get("CONTEXTUAL_API_KEY"),  # This is the default and can be omitted
 )
 
 
@@ -82,29 +82,29 @@ Typed requests and responses provide autocomplete and documentation within your
 
 ## Handling errors
 
-When the library is unable to connect to the API (for example, due to network connection problems or a timeout), a subclass of `sunrise.APIConnectionError` is raised.
+When the library is unable to connect to the API (for example, due to network connection problems or a timeout), a subclass of `contextual.APIConnectionError` is raised.
 
 When the API returns a non-success status code (that is, 4xx or 5xx
-response), a subclass of `sunrise.APIStatusError` is raised, containing `status_code` and `response` properties.
+response), a subclass of `contextual.APIStatusError` is raised, containing `status_code` and `response` properties.
 
-All errors inherit from `sunrise.APIError`.
+All errors inherit from `contextual.APIError`.
 
 ```python
-import sunrise
-from sunrise import Sunrise
+import contextual
+from contextual import ContextualAI
 
-client = Sunrise()
+client = ContextualAI()
 
 try:
     client.datastores.create(
         name="name",
     )
-except sunrise.APIConnectionError as e:
+except contextual.APIConnectionError as e:
     print("The server could not be reached")
     print(e.__cause__)  # an underlying Exception, likely raised within httpx.
-except sunrise.RateLimitError as e:
+except contextual.RateLimitError as e:
     print("A 429 status code was received; we should back off a bit.")
-except sunrise.APIStatusError as e:
+except contextual.APIStatusError as e:
     print("Another non-200-range status code was received")
     print(e.status_code)
     print(e.response)
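The error hierarchy this hunk renames (a base `APIError`, with `APIConnectionError` for transport failures and `APIStatusError` subclasses such as `RateLimitError` for specific status codes) can be sketched with plain Python classes. This is an illustrative model of the hierarchy the README describes, not the SDK's actual implementation:

```python
# Sketch of the exception hierarchy described in the README (illustrative only).
class APIError(Exception):
    """Base class for all SDK errors."""


class APIConnectionError(APIError):
    """Raised when the client cannot reach the API at all."""


class APIStatusError(APIError):
    """Raised for non-success (4xx/5xx) HTTP responses."""

    def __init__(self, message: str, *, status_code: int, response: object = None) -> None:
        super().__init__(message)
        self.status_code = status_code
        self.response = response


class RateLimitError(APIStatusError):
    """Subclass for HTTP 429 responses."""


# Catching APIStatusError also catches RateLimitError, and catching APIError
# catches everything -- which is why the README's except clauses list the most
# specific classes first.
err = RateLimitError("rate limited", status_code=429)
print(isinstance(err, APIStatusError), isinstance(err, APIError), err.status_code)
```

Ordering `except` clauses from most to least specific, as the README example does, ensures a 429 is handled by the rate-limit branch rather than the generic status branch.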
@@ -132,10 +132,10 @@ Connection errors (for example, due to a network connectivity problem), 408 Requ
 You can use the `max_retries` option to configure or disable retry settings:
 
 ```python
-from sunrise import Sunrise
+from contextual import ContextualAI
 
 # Configure the default for all requests:
-client = Sunrise(
+client = ContextualAI(
    # default is 2
    max_retries=0,
 )
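The README's retry behavior (certain errors retried up to `max_retries` times, 2 by default, with short exponential backoff) can be sketched generically. The delays, jitter factor, and retryability predicate below are assumptions for illustration, not the SDK's exact values:

```python
import random
import time
from typing import Callable, TypeVar

T = TypeVar("T")


def with_retries(call: Callable[[], T], max_retries: int = 2, base_delay: float = 0.5) -> T:
    """Retry `call` up to `max_retries` times with jittered exponential backoff.

    Sketch of the behavior the README describes; the real client decides
    retryability per error type (connection errors, 408, 409, 429, >=500).
    """
    for attempt in range(max_retries + 1):
        try:
            return call()
        except ConnectionError:
            if attempt == max_retries:
                raise  # out of retries: surface the original error
            # back off: base, 2*base, 4*base, ... plus a little jitter
            time.sleep(base_delay * (2 ** attempt) * (1 + 0.1 * random.random()))
    raise AssertionError("unreachable")


calls = {"n": 0}


def flaky() -> str:
    """Hypothetical request that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network failure")
    return "ok"


result = with_retries(flaky, base_delay=0.01)
print(result)  # succeeds on the third attempt
```

With the default `max_retries=2`, one original attempt plus two retries gives three tries total, matching "retried twice by default" in the README.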
@@ -152,16 +152,16 @@ By default requests time out after 1 minute. You can configure this with a `time
 which accepts a float or an [`httpx.Timeout`](https://www.python-httpx.org/advanced/#fine-tuning-the-configuration) object:
 
 ```python
-from sunrise import Sunrise
+from contextual import ContextualAI
 
 # Configure the default for all requests:
-client = Sunrise(
+client = ContextualAI(
    # 20 seconds (default is 1 minute)
    timeout=20.0,
 )
 
 # More granular control:
-client = Sunrise(
+client = ContextualAI(
    timeout=httpx.Timeout(60.0, read=5.0, write=10.0, connect=2.0),
 )
 
@@ -181,10 +181,10 @@ Note that requests that time out are [retried twice by default](#retries).
 
 We use the standard library [`logging`](https://docs.python.org/3/library/logging.html) module.
 
-You can enable logging by setting the environment variable `SUNRISE_LOG` to `info`.
+You can enable logging by setting the environment variable `CONTEXTUAL_AI_LOG` to `info`.
 
 ```shell
-$ export SUNRISE_LOG=info
+$ export CONTEXTUAL_AI_LOG=info
 ```
 
 Or to `debug` for more verbose logging.
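The renamed `CONTEXTUAL_AI_LOG` switch maps naturally onto the stdlib `logging` module. A hedged sketch of how such a variable could translate to a log level (the mapping and helper below are assumptions for illustration, not the SDK's internal code):

```python
import logging
import os


def level_from_env(env_var: str = "CONTEXTUAL_AI_LOG") -> int:
    """Map an env value like 'info' or 'debug' to a stdlib logging level.

    Illustrative only: the SDK reads its variable internally; this shows the
    general pattern. Unset or unknown values fall back to WARNING.
    """
    levels = {"debug": logging.DEBUG, "info": logging.INFO}
    return levels.get(os.environ.get(env_var, "").lower(), logging.WARNING)


os.environ["CONTEXTUAL_AI_LOG"] = "info"
chosen = level_from_env()
logging.basicConfig(level=chosen)
print(chosen == logging.INFO)
```

`debug` would select `logging.DEBUG`, matching the README's note that `debug` yields more verbose logging than `info`.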
@@ -206,9 +206,9 @@ if response.my_field is None:
 The "raw" Response object can be accessed by prefixing `.with_raw_response.` to any HTTP method call, e.g.,
 
 ```py
-from sunrise import Sunrise
+from contextual import ContextualAI
 
-client = Sunrise()
+client = ContextualAI()
 response = client.datastores.with_raw_response.create(
    name="name",
 )
@@ -218,9 +218,9 @@ datastore = response.parse()  # get the object that `datastores.create()` would
 print(datastore.id)
 ```
 
-These methods return an [`APIResponse`](https://github.com/stainless-sdks/sunrise-python/tree/main/src/sunrise/_response.py) object.
+These methods return an [`APIResponse`](https://github.com/stainless-sdks/sunrise-python/tree/main/src/contextual/_response.py) object.
 
-The async client returns an [`AsyncAPIResponse`](https://github.com/stainless-sdks/sunrise-python/tree/main/src/sunrise/_response.py) with the same structure, the only difference being `await`able methods for reading the response content.
+The async client returns an [`AsyncAPIResponse`](https://github.com/stainless-sdks/sunrise-python/tree/main/src/contextual/_response.py) with the same structure, the only difference being `await`able methods for reading the response content.
 
 #### `.with_streaming_response`
 
@@ -284,10 +284,10 @@ You can directly override the [httpx client](https://www.python-httpx.org/api/#c
 - Additional [advanced](https://www.python-httpx.org/advanced/clients/) functionality
 
 ```python
-from sunrise import Sunrise, DefaultHttpxClient
+from contextual import ContextualAI, DefaultHttpxClient
 
-client = Sunrise(
-    # Or use the `SUNRISE_BASE_URL` env var
+client = ContextualAI(
+    # Or use the `CONTEXTUAL_AI_BASE_URL` env var
    base_url="http://my.test.server.example.com:8083",
    http_client=DefaultHttpxClient(
        proxies="http://my.test.proxy.example.com",
@@ -325,8 +325,8 @@ If you've upgraded to the latest version but aren't seeing any new features you
 You can determine the version that is being used at runtime with:
 
 ```py
-import sunrise
-print(sunrise.__version__)
+import contextual
+print(contextual.__version__)
 ```
 
 ## Requirements

SECURITY.md (+3, -3)

@@ -16,11 +16,11 @@ before making any information public.
 ## Reporting Non-SDK Related Security Issues
 
 If you encounter security issues that are not directly related to SDKs but pertain to the services
-or products provided by Sunrise please follow the respective company's security reporting guidelines.
+or products provided by Contextual AI please follow the respective company's security reporting guidelines.
 
-### Sunrise Terms and Policies
+### Contextual AI Terms and Policies
 
-Please contact dev-feedback@sunrise.com for any questions or concerns regarding security of our services.
+Please contact support@contextual.ai for any questions or concerns regarding security of our services.
 
 ---
 