---
name: eternal-memory
description: "Cloud persistence for agent memories, data, and embeddings — survives sandbox lifecycle events, enables cross-agent knowledge sharing"
compatibility: "Requires Python 3.10+ and Node.js 18+"
metadata:
  platform: python
  version: "0.3.0"
---

## Eternal Memory

Cloud persistence API for AI agents. Durable storage that outlives your local environment — plus cross-agent knowledge sharing via lineage.

**Base URL:** `https://api.eternal-memory.xyz`

**First 100 operations free (up to 10 MB total) — no USDC required to start.**

### Endpoints & Pricing

| Method | Endpoint | Cost (USDC) | Description |
|--------|----------|-------------|-------------|
| `POST` | `/v1/store` | $0.001 + $0.001/KB (+$0.0005 for server-side embed) | Store key-value, blob, or vector |
| `GET` | `/v1/retrieve/:key` | $0.0005 | Retrieve by key |
| `POST` | `/v1/query` | $0.001 (+$0.0005 for server-side embed) | Semantic vector search |
| `GET` | `/v1/list` | $0.0005 | List entries (filterable, `?include=values` for bulk restore) |
| `DELETE` | `/v1/delete/:key` | Free | Delete an entry |
| `POST` | `/v1/share` | $0.001 | Grant access to another agent |
| `GET` | `/v1/export` | $0.01 | Export all data (always requires payment — not free-tier eligible) |
| `GET` | `/v1/status` | Free | Usage summary |

Payment is automatic via x402 (HTTP 402 flow). First 100 paid operations are free per wallet (10 MB total).
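
Before storing large payloads it can help to estimate spend up front. A minimal sketch, using only the pricing constants from the table above; `estimate_store_cost` is a hypothetical helper, not part of the API, and approximates payload size via JSON serialization (the server's own accounting may differ slightly):

```python
import json

# Pricing constants from the table above (USDC) — assumptions, verify against /v1/status.
STORE_BASE = 0.001        # flat fee per /v1/store call
STORE_PER_KB = 0.001      # per KB of payload
EMBED_SURCHARGE = 0.0005  # server-side embedding add-on

def estimate_store_cost(value, server_embed=False):
    """Rough pre-flight cost estimate for a /v1/store call."""
    size_kb = len(json.dumps(value).encode()) / 1024
    cost = STORE_BASE + STORE_PER_KB * size_kb
    if server_embed:
        cost += EMBED_SURCHARGE
    return round(cost, 6)

# A small record costs the flat fee plus a fraction of a tenth of a cent per KB.
print(estimate_store_cost({"pattern": "retry with backoff"}, server_embed=True))
```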

### Setup

Download the SDK bridge (one-time):

```bash
curl -o sdk-bridge.mjs https://api.eternal-memory.xyz/sdk-bridge.mjs
```

Requires `node >= 18` in PATH (already available in most agent runtimes).
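
Before spawning the bridge you can sanity-check the runtime. A small sketch; `node_available` is a hypothetical helper, not part of the SDK:

```python
import shutil, subprocess

def node_available(min_major=18):
    """Return True if a `node` binary of at least `min_major` is on PATH."""
    path = shutil.which("node")
    if path is None:
        return False
    # `node --version` prints e.g. "v18.19.0"
    out = subprocess.run([path, "--version"], capture_output=True, text=True).stdout
    try:
        return int(out.lstrip("v").split(".")[0]) >= min_major
    except ValueError:
        return False
```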

### Python Wrapper

The bridge delegates all auth, signing, and payment handling to the JS SDK — zero crypto dependencies in Python.

```python
import subprocess, json, threading, queue

class EternalMemory:
    def __init__(self, private_key, base_url="https://api.eternal-memory.xyz"):
        self._id = 0
        self._lock = threading.Lock()  # guards _id and _pending across threads
        self._pending = {}
        self._proc = subprocess.Popen(
            ["node", "sdk-bridge.mjs", "--base-url", base_url, "--private-key", private_key],
            stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
        self._reader = threading.Thread(target=self._read_loop, daemon=True)
        self._reader.start()

    def _read_loop(self):
        # Route each response line from the bridge to its waiting caller.
        for line in self._proc.stdout:
            if not line.strip():
                continue
            msg = json.loads(line)
            with self._lock:
                q = self._pending.pop(msg.get("id"), None)
            if q: q.put(msg)

    def _call(self, method, *args):
        with self._lock:
            self._id += 1
            call_id = self._id
            q = queue.Queue()
            self._pending[call_id] = q
        self._proc.stdin.write(json.dumps({"id": call_id, "method": method, "args": args}) + "\n")
        self._proc.stdin.flush()
        try:
            resp = q.get(timeout=30)
        except queue.Empty:
            with self._lock:
                self._pending.pop(call_id, None)  # drop the stale waiter
            raise TimeoutError(f"bridge call {method!r} timed out")
        if "error" in resp: raise RuntimeError(resp["error"])
        return resp["result"]

    def store(self, key, value, opts=None):  return self._call("store", key, value, opts or {})
    def retrieve(self, key, ns="default"):   return self._call("retrieve", key, ns)
    def query(self, embedding=None, opts=None): return self._call("query", embedding, opts or {})
    def list(self, opts=None):               return self._call("list", opts or {})
    def delete(self, key, ns="default"):     return self._call("delete", key, ns)
    def share(self, opts=None):              return self._call("share", opts or {})
    def status(self):                        return self._call("status")
    def close(self):                         self._proc.stdin.close(); self._proc.wait()
```

### Usage

```python
KEY = "0xYOUR_AGENT_PRIVATE_KEY"
mem = EternalMemory(KEY)

# Store
mem.store("learned-pattern", {"pattern": "retry with backoff"}, {"namespace": "procedures"})

# Retrieve
data = mem.retrieve("learned-pattern", "procedures")
print(data["data"])

# List (with values for bulk restore)
entries = mem.list({"namespace": "soul", "include": "values"})

# Semantic search — server-side embedding (recommended)
results = mem.query(None, {"queryText": "how to handle API failures", "threshold": 0.7})

# Semantic search — client-side embedding (requires embeddedWith)
# results = mem.query([...], {"embeddedWith": "openai/text-embedding-3-small", "threshold": 0.7})

# Delete (free)
mem.delete("old-key")

# Share with another agent
mem.share({"granteeId": "0xChild...", "accessLevel": "inherit"})

# Status (free)
print(mem.status())

mem.close()
```

### Platform Callouts

**CrewAI** — wrap the client in a `BaseTool`:
```python
from crewai.tools import BaseTool

mem = EternalMemory("0xYOUR_KEY")

class StoreMemoryTool(BaseTool):
    name: str = "store_memory"
    description: str = "Store data in persistent Eternal Memory"
    def _run(self, key: str, value: str, namespace: str = "facts") -> str:
        mem.store(key, {"text": value}, {"namespace": namespace})
        return f"Stored '{key}'"
```

**Fetch.ai** — derive key from agent seed:
```python
import hashlib
key = "0x" + hashlib.sha256(b"my-agent-seed").hexdigest()
mem = EternalMemory(key)
```

**Olas** — load key from keys.json:
```python
import json

with open("keys.json") as f:
    key = json.load(f)[0]["private_key"]
mem = EternalMemory(key)
```

### Vector Search

**Server-side embedding (recommended):** Send `embeddingText` on store and `queryText` on query; the server embeds with 384-dimensional all-MiniLM-L6-v2. Surcharge: +$0.0005 per operation, included in the free tier.

**Client-side embedding:** Send `embedding` plus `embeddedWith` (model name). Model consistency is enforced: queries against entries embedded with a different model return empty results.
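
The two modes are mutually exclusive per entry. A minimal sketch of building the store options either way, using the field names above; `store_opts` is a hypothetical helper, not part of the SDK:

```python
def store_opts(namespace="default", embedding_text=None, embedding=None, model=None):
    """Build the opts dict for store(), choosing exactly one embedding mode.

    Server-side: pass embedding_text and the API embeds it (+$0.0005).
    Client-side: pass a precomputed vector plus the model name used to make it.
    """
    if embedding_text is not None and embedding is not None:
        raise ValueError("choose server-side OR client-side embedding, not both")
    opts = {"namespace": namespace}
    if embedding_text is not None:
        opts["embeddingText"] = embedding_text          # server embeds on store
    elif embedding is not None:
        if model is None:
            raise ValueError("client-side embedding requires an embeddedWith model name")
        opts["embedding"] = embedding                   # precomputed vector
        opts["embeddedWith"] = model                    # enforced at query time
    return opts

# Server-side: the API computes the 384-dim vector from the text.
opts = store_opts("procedures", embedding_text="retry failed API calls with backoff")
```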

### Namespaces

Organize data by purpose:
- `soul` — identity, SOUL.md backup, core values
- `procedures` — learned techniques and skills
- `goals` — active tasks and objectives
- `facts` — observations, API patterns, environmental constants
- `sessions` — continuity state, checkpoints
- `episodic` — high-importance events worth preserving long-term
- `relationships` — trust scores, interaction history across agents
- `reputation` — social layer feedback scores, agent discovery
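
A common pattern with the `sessions` namespace is checkpoint-and-restore across sandbox restarts. A minimal sketch, assuming the `list` response shape `[{"key": ..., "value": ...}, ...]` when `include=values` is set; `checkpoint_entry` and `latest_checkpoint` are hypothetical helpers:

```python
import time

def checkpoint_entry(state, agent_id="agent-0"):
    """Build a (key, value, opts) triple for saving continuity state.

    Keys are timestamped so successive checkpoints don't overwrite each
    other; the `sessions` namespace keeps them apart from long-term data.
    """
    now = time.time()
    key = f"checkpoint-{agent_id}-{int(now)}"
    value = {"state": state, "saved_at": now}
    return key, value, {"namespace": "sessions"}

def latest_checkpoint(entries):
    """Pick the newest checkpoint state from a list() response with values."""
    if not entries:
        return None
    return max(entries, key=lambda e: e["value"]["saved_at"])["value"]["state"]

# key, value, opts = checkpoint_entry({"task": "index repo", "step": 3})
# mem.store(key, value, opts)                                   # persist before shutdown
# entries = mem.list({"namespace": "sessions", "include": "values"})
# state = latest_checkpoint(entries)                            # restore on next boot
```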
