Merged
6 changes: 4 additions & 2 deletions CLAUDE.md
Original file line number Diff line number Diff line change
@@ -4,7 +4,7 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co

## Project Overview

OpenGradient Python SDK - A decentralized model management and inference platform SDK. The SDK enables programmatic access to model repositories and decentralized AI infrastructure, including end-to-end verified AI execution.
OpenGradient Python SDK - A decentralized model management and inference platform SDK. The SDK enables programmatic access to model repositories and decentralized AI infrastructure, including end-to-end verified AI execution. Use a virtualenv for local dependency management (in the `venv` folder).

## Development Commands

@@ -108,7 +108,9 @@ User configuration stored via `opengradient config init` wizard.

## Documentation (pdoc)

Docs are generated with `pdoc3` using a custom Mako template at `templates/text.mako`. Run `make docs` to regenerate into `docs/`.
Docs are generated with `pdoc3` using a custom Mako template at `templates/text.mako`. Run `make docs` to regenerate into `docs/`. Do not edit generated documentation files in `docs/` by hand.

Concrete example scripts in the `examples/` folder demonstrate how to use the SDK and provide a starting point for developers.

### Cross-referencing in docstrings

6 changes: 4 additions & 2 deletions docs/opengradient/client/client.md
@@ -21,14 +21,15 @@ blockchain private key and optional Model Hub credentials.
#### Constructor

```python
def __init__(private_key: str, email: Optional[str] = None, password: Optional[str] = None, rpc_url: str = 'https://ogevmdevnet.opengradient.ai', api_url: str = 'https://sdk-devnet.opengradient.ai', contract_address: str = '0x8383C9bD7462F12Eb996DD02F78234C0421A6FaE', og_llm_server_url: Optional[str] = 'https://llmogevm.opengradient.ai', og_llm_streaming_server_url: Optional[str] = 'https://llmogevm.opengradient.ai')
def __init__(private_key: str, email: Optional[str] = None, password: Optional[str] = None, twins_api_key: Optional[str] = None, rpc_url: str = 'https://ogevmdevnet.opengradient.ai', api_url: str = 'https://sdk-devnet.opengradient.ai', contract_address: str = '0x8383C9bD7462F12Eb996DD02F78234C0421A6FaE', og_llm_server_url: Optional[str] = 'https://llmogevm.opengradient.ai', og_llm_streaming_server_url: Optional[str] = 'https://llmogevm.opengradient.ai')
```

**Arguments**

* **`private_key`**: Private key for OpenGradient transactions.
* **`email`**: Email for Model Hub authentication. Optional.
* **`password`**: Password for Model Hub authentication. Optional.
* **`twins_api_key`**: API key for digital twins chat (twin.fun). Optional.
* **`rpc_url`**: RPC URL for the blockchain network.
* **`api_url`**: API URL for the OpenGradient API.
* **`contract_address`**: Inference contract address.
@@ -39,4 +40,5 @@ def __init__(private_key: str, email: Optional[str] = None, password: Optio

* [**`alpha`**](./alpha): Alpha Testnet features including on-chain inference, workflow management, and ML model execution.
* [**`llm`**](./llm): LLM chat and completion via TEE-verified execution.
* [**`model_hub`**](./model_hub): Model Hub for creating, versioning, and uploading ML models.
* [**`model_hub`**](./model_hub): Model Hub for creating, versioning, and uploading ML models.
* [**`twins`**](./twins): Digital twins chat via OpenGradient verifiable inference.
10 changes: 7 additions & 3 deletions docs/opengradient/client/index.md
@@ -10,11 +10,12 @@ OpenGradient Client -- the central entry point to all SDK services.

## Overview

The [Client](./client) class provides unified access to three service namespaces:
The [Client](./client) class provides unified access to four service namespaces:

- **[llm](./llm)** -- LLM chat and text completion with TEE-verified execution and x402 payment settlement
- **[model_hub](./model_hub)** -- Model repository management: create, version, and upload ML models
- **[alpha](./alpha)** -- Alpha Testnet features: on-chain ONNX model inference (VANILLA, TEE, ZKML modes), workflow deployment, and scheduled ML model execution
- **[twins](./twins)** -- Digital twins chat via OpenGradient verifiable inference

## Usage

@@ -52,6 +53,7 @@ repo = client.model_hub.create_model("my-model", "A price prediction model")
* [exceptions](./exceptions): Exception types for OpenGradient SDK errors.
* [llm](./llm): LLM chat and completion via TEE-verified execution with x402 payments.
* [model_hub](./model_hub): Model Hub for creating, versioning, and uploading ML models.
* [twins](./twins): Digital twins chat via OpenGradient verifiable inference.

## Classes

@@ -66,14 +68,15 @@ blockchain private key and optional Model Hub credentials.
#### Constructor

```python
def __init__(private_key: str, email: Optional[str] = None, password: Optional[str] = None, rpc_url: str = 'https://ogevmdevnet.opengradient.ai', api_url: str = 'https://sdk-devnet.opengradient.ai', contract_address: str = '0x8383C9bD7462F12Eb996DD02F78234C0421A6FaE', og_llm_server_url: Optional[str] = 'https://llmogevm.opengradient.ai', og_llm_streaming_server_url: Optional[str] = 'https://llmogevm.opengradient.ai')
def __init__(private_key: str, email: Optional[str] = None, password: Optional[str] = None, twins_api_key: Optional[str] = None, rpc_url: str = 'https://ogevmdevnet.opengradient.ai', api_url: str = 'https://sdk-devnet.opengradient.ai', contract_address: str = '0x8383C9bD7462F12Eb996DD02F78234C0421A6FaE', og_llm_server_url: Optional[str] = 'https://llmogevm.opengradient.ai', og_llm_streaming_server_url: Optional[str] = 'https://llmogevm.opengradient.ai')
```

**Arguments**

* **`private_key`**: Private key for OpenGradient transactions.
* **`email`**: Email for Model Hub authentication. Optional.
* **`password`**: Password for Model Hub authentication. Optional.
* **`twins_api_key`**: API key for digital twins chat (twin.fun). Optional.
* **`rpc_url`**: RPC URL for the blockchain network.
* **`api_url`**: API URL for the OpenGradient API.
* **`contract_address`**: Inference contract address.
@@ -84,4 +87,5 @@ def __init__(private_key: str, email: Optional[str] = None, password: Optio

* [**`alpha`**](./alpha): Alpha Testnet features including on-chain inference, workflow management, and ML model execution.
* [**`llm`**](./llm): LLM chat and completion via TEE-verified execution.
* [**`model_hub`**](./model_hub): Model Hub for creating, versioning, and uploading ML models.
* [**`model_hub`**](./model_hub): Model Hub for creating, versioning, and uploading ML models.
* [**`twins`**](./twins): Digital twins chat via OpenGradient verifiable inference.
51 changes: 51 additions & 0 deletions docs/opengradient/client/twins.md
@@ -0,0 +1,51 @@
---
outline: [2,3]
---

[opengradient](../index) / [client](./index) / twins

# Package opengradient.client.twins

Digital twins chat via OpenGradient verifiable inference.

## Classes

### `Twins`

Digital twins chat namespace.

Provides access to digital twin conversations from twin.fun,
backed by OpenGradient verifiable inference.

#### Constructor

```python
def __init__(api_key: str)
```

#### Methods

---

#### `chat()`

```python
def chat(self, twin_id: str, model: TEE_LLM, messages: List[Dict], temperature: Optional[float] = None, max_tokens: Optional[int] = None) -> TextGenerationOutput
```
Chat with a digital twin.

**Arguments**

* **`twin_id`**: The unique identifier of the digital twin.
* **`model`**: The model to use for inference (e.g., TEE_LLM.GROK_4_1_FAST_NON_REASONING).
* **`messages`**: The conversation messages to send.
* **`temperature`**: Sampling temperature. Optional.
* **`max_tokens`**: Maximum number of tokens for the response. Optional.

**Returns**

TextGenerationOutput: Generated text results including chat_output and finish_reason.

**Raises**

* **`OpenGradientError`**: If the request fails.
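The optional arguments above are forwarded to the twins API only when set. A minimal sketch of how such a request body can be assembled (the helper name and the model string are hypothetical, for illustration; field names follow the documented parameters):

```python
from typing import Dict, List, Optional


def build_twins_payload(
    model_value: str,
    messages: List[Dict],
    temperature: Optional[float] = None,
    max_tokens: Optional[int] = None,
) -> Dict:
    """Sketch of twins chat payload assembly; not the canonical SDK code."""
    payload: Dict = {"model": model_value, "messages": messages}
    # Optional sampling controls are included only when explicitly provided.
    if temperature is not None:
        payload["temperature"] = temperature
    if max_tokens is not None:
        payload["max_tokens"] = max_tokens
    return payload


payload = build_twins_payload(
    "grok-4.1-fast-non-reasoning",  # hypothetical model.value string
    [{"role": "user", "content": "What do you think about AI?"}],
    max_tokens=1000,
)
```

Omitting `temperature` keeps it out of the payload entirely, so the server applies its own default rather than receiving an explicit null.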
7 changes: 5 additions & 2 deletions docs/opengradient/index.md
@@ -57,11 +57,12 @@ print(result.model_output)

## Client Namespaces

The [Client](./client/index) object exposes three namespaces:
The [Client](./client/index) object exposes four namespaces:

- **[llm](./client/llm)** -- Verifiable LLM chat and completion via TEE-verified execution with x402 payments
- **[alpha](./client/alpha)** -- On-chain ONNX model inference, workflow deployment, and scheduled ML model execution (only available on the Alpha Testnet)
- **[model_hub](./client/model_hub)** -- Model repository management
- **[twins](./client/twins)** -- Digital twins chat via OpenGradient verifiable inference (requires twins API key)

## Model Hub (requires email auth)

@@ -126,14 +127,15 @@ blockchain private key and optional Model Hub credentials.
#### Constructor

```python
def __init__(private_key: str, email: Optional[str] = None, password: Optional[str] = None, rpc_url: str = 'https://ogevmdevnet.opengradient.ai', api_url: str = 'https://sdk-devnet.opengradient.ai', contract_address: str = '0x8383C9bD7462F12Eb996DD02F78234C0421A6FaE', og_llm_server_url: Optional[str] = 'https://llmogevm.opengradient.ai', og_llm_streaming_server_url: Optional[str] = 'https://llmogevm.opengradient.ai')
def __init__(private_key: str, email: Optional[str] = None, password: Optional[str] = None, twins_api_key: Optional[str] = None, rpc_url: str = 'https://ogevmdevnet.opengradient.ai', api_url: str = 'https://sdk-devnet.opengradient.ai', contract_address: str = '0x8383C9bD7462F12Eb996DD02F78234C0421A6FaE', og_llm_server_url: Optional[str] = 'https://llmogevm.opengradient.ai', og_llm_streaming_server_url: Optional[str] = 'https://llmogevm.opengradient.ai')
```

**Arguments**

* **`private_key`**: Private key for OpenGradient transactions.
* **`email`**: Email for Model Hub authentication. Optional.
* **`password`**: Password for Model Hub authentication. Optional.
* **`twins_api_key`**: API key for digital twins chat (twin.fun). Optional.
* **`rpc_url`**: RPC URL for the blockchain network.
* **`api_url`**: API URL for the OpenGradient API.
* **`contract_address`**: Inference contract address.
@@ -145,6 +147,7 @@ def __init__(private_key: str, email: Optional[str] = None, password: Optio
* [**`alpha`**](./client/alpha): Alpha Testnet features including on-chain inference, workflow management, and ML model execution.
* [**`llm`**](./client/llm): LLM chat and completion via TEE-verified execution.
* [**`model_hub`**](./client/model_hub): Model Hub for creating, versioning, and uploading ML models.
* [**`twins`**](./client/twins): Digital twins chat via OpenGradient verifiable inference.

### `InferenceMode`

29 changes: 29 additions & 0 deletions examples/twins_chat.py
@@ -0,0 +1,29 @@
## Chat with digital twins from twin.fun via OpenGradient verifiable inference
# Browse available twins at https://twin.fun

import os

import opengradient as og

client = og.init(
private_key=os.environ.get("OG_PRIVATE_KEY"),
twins_api_key=os.environ.get("TWINS_API_KEY"),
)

# Chat with Elon Musk
elon = client.twins.chat(
twin_id="0x1abd463fd6244be4a1dc0f69e0b70cd5",
model=og.TEE_LLM.GROK_4_1_FAST_NON_REASONING,
messages=[{"role": "user", "content": "What do you think about AI?"}],
max_tokens=1000,
)
print(f"Elon: {elon.chat_output['content']}")

# Chat with Donald Trump
trump = client.twins.chat(
twin_id="0x66ae99aae4324ed580b2787ac5e811f6",
model=og.TEE_LLM.GROK_4_1_FAST_NON_REASONING,
messages=[{"role": "user", "content": "What's your plan for America?"}],
max_tokens=1000,
)
print(f"Trump: {trump.chat_output['content']}")
3 changes: 2 additions & 1 deletion src/opengradient/__init__.py
@@ -48,11 +48,12 @@

## Client Namespaces

The `opengradient.client.Client` object exposes three namespaces:
The `opengradient.client.Client` object exposes four namespaces:

- **`opengradient.client.llm`** -- Verifiable LLM chat and completion via TEE-verified execution with x402 payments
- **`opengradient.client.alpha`** -- On-chain ONNX model inference, workflow deployment, and scheduled ML model execution (only available on the Alpha Testnet)
- **`opengradient.client.model_hub`** -- Model repository management
- **`opengradient.client.twins`** -- Digital twins chat via OpenGradient verifiable inference (requires twins API key)

## Model Hub (requires email auth)

4 changes: 1 addition & 3 deletions src/opengradient/cli.py
@@ -324,9 +324,7 @@ def infer(ctx, model_cid: str, inference_mode: str, input_data, input_file: Path
model_input = json.load(file)

click.echo(f'Running {inference_mode} inference for model "{model_cid}"')
inference_result = client.alpha.infer(
model_cid=model_cid, inference_mode=InferenceModes[inference_mode], model_input=model_input
)
inference_result = client.alpha.infer(model_cid=model_cid, inference_mode=InferenceModes[inference_mode], model_input=model_input)

click.echo() # Add a newline for better spacing
click.secho("✅ Transaction successful", fg="green", bold=True)
3 changes: 2 additions & 1 deletion src/opengradient/client/__init__.py
@@ -3,11 +3,12 @@

## Overview

The `opengradient.client.client.Client` class provides unified access to three service namespaces:
The `opengradient.client.client.Client` class provides unified access to four service namespaces:

- **`opengradient.client.llm`** -- LLM chat and text completion with TEE-verified execution and x402 payment settlement
- **`opengradient.client.model_hub`** -- Model repository management: create, version, and upload ML models
- **`opengradient.client.alpha`** -- Alpha Testnet features: on-chain ONNX model inference (VANILLA, TEE, ZKML modes), workflow deployment, and scheduled ML model execution
- **`opengradient.client.twins`** -- Digital twins chat via OpenGradient verifiable inference

## Usage

8 changes: 8 additions & 0 deletions src/opengradient/client/client.py
@@ -14,6 +14,7 @@
from .alpha import Alpha
from .llm import LLM
from .model_hub import ModelHub
from .twins import Twins


class Client:
@@ -40,11 +41,15 @@ class Client:
alpha: Alpha
"""Alpha Testnet features including on-chain inference, workflow management, and ML model execution."""

twins: Twins
"""Digital twins chat via OpenGradient verifiable inference."""

def __init__(
self,
private_key: str,
email: Optional[str] = None,
password: Optional[str] = None,
twins_api_key: Optional[str] = None,
rpc_url: str = DEFAULT_RPC_URL,
api_url: str = DEFAULT_API_URL,
contract_address: str = DEFAULT_INFERENCE_CONTRACT_ADDRESS,
@@ -58,6 +63,7 @@ def __init__(
private_key: Private key for OpenGradient transactions.
email: Email for Model Hub authentication. Optional.
password: Password for Model Hub authentication. Optional.
twins_api_key: API key for digital twins chat (twin.fun). Optional.
rpc_url: RPC URL for the blockchain network.
api_url: API URL for the OpenGradient API.
contract_address: Inference contract address.
@@ -87,3 +93,5 @@ def __init__(
api_url=api_url,
)

if twins_api_key is not None:
self.twins = Twins(api_key=twins_api_key)
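Note that `self.twins` is assigned only when `twins_api_key` is provided, so reading `client.twins` on a key-less client raises `AttributeError` rather than a friendlier SDK error. A minimal sketch of that conditional wiring and a defensive check callers can use (the `_Client`/`_Twins` stand-ins below are hypothetical, not the SDK classes):

```python
from typing import Optional


class _Twins:
    def __init__(self, api_key: str):
        self._api_key = api_key


class _Client:
    """Stand-in mirroring the conditional namespace wiring in Client.__init__."""

    def __init__(self, twins_api_key: Optional[str] = None):
        # The attribute simply never exists when no key is supplied.
        if twins_api_key is not None:
            self.twins = _Twins(api_key=twins_api_key)


with_key = _Client(twins_api_key="demo-key")
without_key = _Client()

# Guard access instead of catching AttributeError:
has_twins = hasattr(without_key, "twins")
```

Because the class body annotates `twins: Twins` without a default, type checkers see the attribute as always present; at runtime it is optional, which is worth keeping in mind when calling `client.twins.chat(...)`.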
96 changes: 96 additions & 0 deletions src/opengradient/client/twins.py
@@ -0,0 +1,96 @@
"""Digital twins chat via OpenGradient verifiable inference."""

from typing import Dict, List, Optional

import httpx

from ..types import TEE_LLM, TextGenerationOutput
from .exceptions import OpenGradientError

TWINS_API_BASE_URL = "https://chat-api.memchat.io"


class Twins:
"""
Digital twins chat namespace.

Provides access to digital twin conversations backed by OpenGradient
verifiable inference. Browse available twins at https://twin.fun.

Usage:
client = og.init(private_key="0x...", twins_api_key="your-api-key")
response = client.twins.chat(
twin_id="0x1abd463fd6244be4a1dc0f69e0b70cd5",
model=og.TEE_LLM.GROK_4_1_FAST_NON_REASONING,
messages=[{"role": "user", "content": "What do you think about AI?"}],
max_tokens=1000,
)
print(response.chat_output["content"])
"""

def __init__(self, api_key: str):
self._api_key = api_key

def chat(
self,
twin_id: str,
model: TEE_LLM,
messages: List[Dict],
temperature: Optional[float] = None,
max_tokens: Optional[int] = None,
) -> TextGenerationOutput:
"""
Chat with a digital twin.

Args:
twin_id: The unique identifier of the digital twin.
model: The model to use for inference (e.g., TEE_LLM.GROK_4_1_FAST_NON_REASONING).
messages: The conversation messages to send.
temperature: Sampling temperature. Optional.
max_tokens: Maximum number of tokens for the response. Optional.

Returns:
TextGenerationOutput: Generated text results including chat_output and finish_reason.

Raises:
OpenGradientError: If the request fails.
"""
url = f"{TWINS_API_BASE_URL}/api/v1/twins/{twin_id}/chat"
headers = {
"Content-Type": "application/json",
"X-API-Key": self._api_key,
}

payload: Dict = {
"model": model.value,
"messages": messages,
}
if temperature is not None:
payload["temperature"] = temperature
if max_tokens is not None:
payload["max_tokens"] = max_tokens

try:
response = httpx.post(url, json=payload, headers=headers, timeout=60)
response.raise_for_status()
result = response.json()

choices = result.get("choices")
if not choices:
raise OpenGradientError(f"Invalid response: 'choices' missing or empty in {result}")

return TextGenerationOutput(
transaction_hash="",
finish_reason=choices[0].get("finish_reason"),
chat_output=choices[0].get("message"),
payment_hash=None,
)
except OpenGradientError:
raise
except httpx.HTTPStatusError as e:
raise OpenGradientError(
f"Twins chat request failed: {e.response.status_code} {e.response.text}",
status_code=e.response.status_code,
)
except Exception as e:
raise OpenGradientError(f"Twins chat request failed: {str(e)}")
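The response handling in `chat()` reduces to validating `choices` and lifting `message` and `finish_reason` out of the first choice. A self-contained sketch of that parsing step (the helper, the use of `ValueError` in place of `OpenGradientError`, and the sample response body are illustrative, not a captured server reply):

```python
from typing import Dict


def parse_twins_response(result: Dict) -> Dict:
    """Sketch of the choices extraction performed in Twins.chat()."""
    choices = result.get("choices")
    if not choices:
        # Mirrors the SDK's validation: missing or empty choices is an error.
        raise ValueError(f"Invalid response: 'choices' missing or empty in {result}")
    first = choices[0]
    return {
        "finish_reason": first.get("finish_reason"),
        "chat_output": first.get("message"),
    }


sample = {
    "choices": [
        {
            "finish_reason": "stop",
            "message": {"role": "assistant", "content": "Hello from the twin."},
        }
    ]
}
parsed = parse_twins_response(sample)
```

Using `.get()` on the choice keeps the parser tolerant of a missing `finish_reason` or `message` (they come back as `None`), while an absent or empty `choices` list fails fast.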