---
license: mit
model_name: Shadowclaw v1.3 (Testing)
binary_size: 100-200 KiB
llm_integration: Ollama (localhost:11434)
build: make
dependencies: gcc, libcurl, bc, cJSON
persistence: shadowclaw.bin
---
# Shadowclaw v1.3 (Testing)
<div style="max-width: 100%; overflow-x: auto; background: #f6f8fa; padding: 4px; border-radius: 4px;">
<pre style="font-family: 'Courier New', monospace; font-size: clamp(4px, 2vw, 10px); line-height: 1.2; margin: 0;">
____ _ _ _
/ ___|| |__ __ _ __| | _____ _____| | __ ___ __
\___ \| '_ \ / _` |/ _` |/ _ \ \ /\ / / __| |/ _` \ \ /\ / /
___) | | | | (_| | (_| | (_) \ V V / (__| | (_| |\ V V /
|____/|_| |_|\__,_|\__,_|\___/ \_/\_/ \___|_|\__,_| \_/\_/
</pre>
</div>
**Shadowclaw** is a minimal, single‑binary agent harness written in C. It follows the *OpenClaw* philosophy: self‑hosted, tool‑using, persistent memory, and minimal dependencies. The core memory management uses **Tsoding's "shadow header" trick** (like `stb_ds` but for a growable arena). All data (conversation history, tool definitions, results) lives inside a single `realloc`‑ed memory block with a hidden header. The agent communicates with a local LLM (Ollama) via curl, can execute shell commands, read/write files, perform HTTP GET, and evaluate simple math expressions. State is automatically saved to disk after every interaction.
**Niche edge use cases:**
- RPi Zero / IoT: offline sensor scripts (shell + persistent `shadowclaw.bin`)
- Air-gapped systems: USB-stick local LLM agent (file / HTTP / math)
- Embedded routers: 100–200 KB network automation (low-memory Linux)
- Low-power edge nodes: self-hosted persistent AI, no cloud.
---
## Repository Structure
```
shadowclaw/
├── shadowclaw.c # main program: arena, tools, LLM glue, event loop
├── Makefile # simple build file
├── cJSON.c         # lightweight JSON parser (from cJSON library)
└── cJSON.h # cJSON header
```
---
## Features
- **Growable shadow arena** – all objects stored in one contiguous block with hidden headers.
- **Persistent memory** – saves and loads full arena to/from `shadowclaw.bin`.
- **Tool use** – shell, file I/O, HTTP GET, math (via `bc`).
- **Ollama integration** – talk to any model running locally.
- **Blob storage** – system prompts, user messages, assistant replies, tool calls, results.
- **Tiny footprint** – stripped binary ≈ 100–200 KiB.
---
## Requirements
- Linux (developed on Debian, should work on most Unix‑likes)
- `gcc`, `make`, `libcurl4-openssl-dev`, `bc`
- [Ollama](https://ollama.com/) running locally (default `http://localhost:11434`) with a model pulled (e.g. `llama3.2`)
Install build dependencies on Debian/Ubuntu:
```bash
sudo apt update
sudo apt install build-essential libcurl4-openssl-dev bc
```
---
## Define your model:
- **Change Ollama endpoint/model**: edit `ollama_endpoint` and `ollama_model` in `shadowclaw.c`.
Find this block (around line 507) in `shadowclaw.c`:
```c
// --------------------------------------------------------------------
// Main (with slash commands from v1.2.2)
// --------------------------------------------------------------------
int main(int argc, char **argv) {
const char *state_file = "shadowclaw.bin";
const char *ollama_endpoint = "http://localhost:11434";
const char *ollama_model = "qwen2.5:0.5b"; // change as needed
```
Change the value of `ollama_model` (here `"qwen2.5:0.5b"`) to whichever model you have pulled in Ollama.
- **Add new tools**: extend the `tools` array in `shadowclaw.c` with a name and a function that takes a `const char*` argument and returns a `char*` (must be `malloc`ed, the caller will `free` it).
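A new tool matching the described contract might look like the following sketch. The function name and behaviour here are hypothetical examples, not code from `shadowclaw.c`; the contract (take a `const char*`, return a `malloc`'ed `char*` the caller frees) is from the description above.

```c
#include <stdlib.h>
#include <string.h>

// Hypothetical example tool: uppercases its argument.
// Contract per the README: takes a const char* argument and returns
// a malloc'ed char* that the caller will free.
static char *tool_echo_upper(const char *arg) {
    size_t n = strlen(arg);
    char *out = malloc(n + 1);
    if (!out) return NULL;
    for (size_t i = 0; i < n; i++)
        out[i] = (arg[i] >= 'a' && arg[i] <= 'z') ? (char)(arg[i] - 32) : arg[i];
    out[n] = '\0';
    return out;
}
```

Registering it would then be a matter of adding an entry such as `{ "echo_upper", tool_echo_upper }` to the `tools` array, assuming the array pairs a name with a function pointer.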
---
## How It Works
### Shadow Arena
All dynamic data lives in a single `realloc`‑ed block.
A `ShadowHeader` is stored **immediately before** the user‑visible pointer.
This header holds capacity, used length, a magic number, and a dirty flag.
```
+-------------------+---------------------+
| ShadowHeader | payload (blobs) |
+-------------------+---------------------+
^
`- pointer returned to user
```
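The trick can be sketched in portable C as follows. Field names and growth policy here are illustrative assumptions, not the actual `shadowclaw.c` definitions:

```c
#include <stdlib.h>
#include <string.h>
#include <stdint.h>

// Illustrative header-before-data arena (stb_ds-style shadow header).
// Field names are assumptions, not the real shadowclaw.c layout.
typedef struct {
    uint32_t magic;     // sanity check when loading from disk
    uint32_t dirty;     // set when the arena has unsaved changes
    size_t   capacity;  // total payload bytes allocated
    size_t   used;      // payload bytes currently in use
} ShadowHeader;

#define ARENA_MAGIC 0x53484357u  // "SHCW" (illustrative)

// Allocate header + payload in one block; return the payload pointer.
static void *arena_new(size_t capacity) {
    ShadowHeader *h = malloc(sizeof(ShadowHeader) + capacity);
    if (!h) return NULL;
    h->magic = ARENA_MAGIC;
    h->dirty = 0;
    h->capacity = capacity;
    h->used = 0;
    return h + 1;  // user-visible pointer starts just after the header
}

// Recover the hidden header from the user-visible pointer.
static ShadowHeader *arena_header(void *payload) {
    return (ShadowHeader *)payload - 1;
}

// Append bytes, growing via realloc; the payload pointer may move.
static void *arena_push(void *payload, const void *data, size_t len) {
    ShadowHeader *h = arena_header(payload);
    if (h->used + len > h->capacity) {
        size_t cap = h->capacity * 2 + len;
        h = realloc(h, sizeof(ShadowHeader) + cap);  // sketch: old block leaks on failure
        if (!h) return NULL;
        h->capacity = cap;
    }
    memcpy((char *)(h + 1) + h->used, data, len);
    h->used += len;
    h->dirty = 1;
    return h + 1;
}
```

Because the header travels with the block, a single `fwrite` of `header + used` bytes is all persistence needs.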
### Blob Format
Each item (system prompt, user message, tool result, etc.) is stored as:
```
+------------+------+------+-----------------+
| BlobHeader | kind | id | payload (bytes) |
+------------+------+------+-----------------+
```
- `BlobHeader` contains the payload size, kind, and a 64‑bit ID.
- The payload is arbitrary data (null‑terminated strings for most kinds).
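A minimal sketch of the record layout, with assumed field widths (only the kind-5 tool-result value is taken from this README; the other kind numbers are placeholders):

```c
#include <stdint.h>
#include <stddef.h>

// Illustrative blob record; the real shadowclaw.c layout may differ.
typedef struct {
    uint32_t size;  // payload size in bytes
    uint32_t kind;  // e.g. tool results are kind 5 (per this README)
    uint64_t id;    // 64-bit blob ID
} BlobHeader;

// Blobs sit back-to-back in the arena payload, so walking them is
// pointer arithmetic: a header, then `size` payload bytes.
static size_t blob_total_size(const BlobHeader *b) {
    return sizeof(BlobHeader) + b->size;
}
```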
### Persistence
- The whole arena (header + payload) is written to `shadowclaw.bin` with `fwrite`.
- On startup, if the file exists and has a valid magic number, it is loaded back.
- All conversations and tool results are automatically saved to `shadowclaw.bin` and reloaded on restart.
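Save and load can be sketched like this, assuming the header layout from the arena section above (field names and the magic value are illustrative, not the real `shadowclaw.c` code):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdint.h>

// Illustrative arena header; names are assumptions.
typedef struct { uint32_t magic; uint32_t dirty; size_t capacity; size_t used; } Hdr;
#define MAGIC 0x53484357u

// One fwrite covers the header and the used payload bytes.
static int arena_save(const char *path, const Hdr *h) {
    FILE *f = fopen(path, "wb");
    if (!f) return -1;
    size_t n = fwrite(h, 1, sizeof(Hdr) + h->used, f);
    fclose(f);
    return n == sizeof(Hdr) + h->used ? 0 : -1;
}

static Hdr *arena_load(const char *path) {
    FILE *f = fopen(path, "rb");
    if (!f) return NULL;
    Hdr tmp;
    if (fread(&tmp, sizeof(Hdr), 1, f) != 1 || tmp.magic != MAGIC) {
        fclose(f);              // missing or corrupt state file: start fresh
        return NULL;
    }
    Hdr *h = malloc(sizeof(Hdr) + tmp.used);
    if (h) {
        *h = tmp;
        h->capacity = tmp.used; // shrink-to-fit on load
        if (fread(h + 1, 1, tmp.used, f) != tmp.used) { free(h); h = NULL; }
    }
    fclose(f);
    return h;
}
```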
---
# Updated 3/3/2026: Shadowclaw v1.3
### Built‑in Slash Commands
Type any of these commands directly at the `>` prompt – they are handled without invoking the LLM.
| Command | Description |
|-----------|-------------|
| `/help` | Show this help message. |
| `/tools` | List all available tools (shell, file I/O, HTTP, math, `list_dir`). |
| `/state` | Display current arena memory statistics (capacity, used bytes, dirty flag). |
| `/clear` | Clear the conversation history while retaining the system prompt. |
| `/chat` | Remind you that you are already in chat mode (the default behaviour). |
| `/exit` | Quit Shadowclaw. |
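Since these commands are handled before the LLM is invoked, the dispatch can be as simple as a `strcmp` chain; the sketch below is an assumption about the shape of that logic, not the actual `shadowclaw.c` code:

```c
#include <string.h>
#include <stdio.h>

// Minimal slash-command dispatch sketch.
// Returns 1 if the line was handled as a built-in, 0 if it should
// be forwarded to the model.
static int handle_slash(const char *line) {
    if (line[0] != '/') return 0;
    if (strcmp(line, "/help") == 0) { puts("Shadowclaw commands: /help /tools /state /clear /chat /exit"); return 1; }
    if (strcmp(line, "/exit") == 0) { return 1; }
    // Unknown slash commands are still intercepted here rather than
    // sent to the LLM (an assumption of this sketch).
    printf("Unknown command: %s\n", line);
    return 1;
}
```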
## Build
```bash
cd ~/shadowclaw   # change to the directory containing the source files
```
```bash
make clean && make
```
## Run
```bash
./shadowclaw
```
Use the slash commands above to get started – even without Ollama running, you can explore the built‑in features.
When your agent is running you should be able to use /help and see (example):
```bash
┌──(kali㉿user)-[~/shadowclaw]
└─$ ./shadowclaw
ShadowClaw ready. Type your message (Ctrl-D to exit)
/help
Shadowclaw commands:
/help Show this help
/tools List available tools
/state Show arena memory stats
/clear Clear conversation history (keeps system prompt)
/chat Remind you that chat mode is active
/exit Exit Shadowclaw
```
## Notes
- If Ollama is not running, you’ll see `LLM call failed`.
- Tool arguments can be a JSON array – they will be joined with spaces (useful if your model outputs `"args":["arg1","arg2"]`).
- All conversations and tool results are saved in `shadowclaw.bin` and reloaded on restart.
---
## How Tool Arguments Work
Shadowclaw processes a tool call by parsing the JSON block the model emits, passing the `args` value to the tool function as a single string, executing it, and appending the result as a kind-5 blob so it is visible in the next prompt. As a fallback, the model may output `args` as an array (e.g. `["arg1","arg2"]`), which keeps input handling flexible when plain string encoding fails.
- **Tool call flow:** the agent parses the JSON block, executes the tool, and appends the result.
- **Result visibility:** the tool output is visible to the model in the next prompt.
- **Fallback mechanism:** if `args` is an array, Shadowclaw joins its elements instead of failing.
- **Alternative tooling:** in some scenarios, tool calls can be handled as a single string to avoid parsing errors.
This tolerance for malformed or variant JSON, which occasionally occurs with certain models, lets the tool call succeed anyway. For example, given:
```json
{"tool":"write_file","args":["notes.txt","Hello world"]}
```
Shadowclaw automatically **joins the array elements with spaces**, so the tool receives the single string `"notes.txt Hello world"`. This makes the agent robust to different model behaviours while preserving:
- **Simplicity** – most tools only need a single string argument (a command, a filename, a URL).
- **Flexibility** – if a tool needs multiple pieces of information (like `write_file` needing a filename **and** content), we use a simple delimiter (newline). The LLM can learn this pattern from the system prompt.
- **Lightweight Design** – no complex argument schemas, no extra JSON nesting. The entire tool call is tiny.
- **Args are always a single string** to keep things simple.
- **Multiple values are encoded with delimiters** (like newline) – the tool and LLM agree on the format.
- **Shadowclaw’s niche** is minimalism: a single binary with persistent memory, running anywhere, with just enough tools to be genuinely useful.
You can easily add new tools by editing the `tools` array – each new function opens up more automation possibilities for your unique environment.
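The join described above can be sketched as follows. To keep the example dependency-free it operates on a plain string array rather than a parsed cJSON array; the function name is illustrative:

```c
#include <stdlib.h>
#include <string.h>

// Join array-style args into one space-separated string, mirroring
// the fallback behaviour described above.
static char *join_args(const char **items, size_t count) {
    size_t total = 1;  // terminating NUL
    for (size_t i = 0; i < count; i++)
        total += strlen(items[i]) + (i > 0 ? 1 : 0);  // +1 for each separator
    char *out = malloc(total);
    if (!out) return NULL;
    out[0] = '\0';
    for (size_t i = 0; i < count; i++) {
        if (i > 0) strcat(out, " ");
        strcat(out, items[i]);
    }
    return out;
}
```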
---
## Niche Use Cases
Shadowclaw is designed for **minimal, self‑contained AI agents** running on resource‑constrained or air‑gapped systems.
Its toolset is deliberately small but powerful enough to automate many tasks without spawning heavy processes.
| Tool | What it does | Example args | Unique Niche Use |
|------|--------------|--------------|------------------|
| `shell` | Executes any shell command | `"ls -la /home"` | Automate system maintenance, run scripts, control services on a headless Raspberry Pi or embedded Linux device. |
| `read_file` | Reads a file from disk | `"/etc/passwd"` | Inspect configuration files, read logs, retrieve data for the LLM to analyse – all without a web interface. |
| `write_file` | Writes content to a file (args: `filename\ncontent`) | `"notes.txt\nBuy milk"` | Create notes, write configuration files, save results – perfect for offline data logging. |
| `http_get` | Fetches a URL (via libcurl) | `"https://api.example.com/data"` | Retrieve weather, fetch RSS feeds, call local REST APIs – even on devices with no browser, just a network stack. |
| `math` | Evaluates an expression using `bc` | `"2 + 2 * 5"` | Let the LLM do arithmetic, unit conversions, or simple calculations without relying on external tools. |
| `list_dir` | Lists directory contents | `"/home/user"` | Explore filesystem, find documents, check available storage – all natively, without forking a shell. |
---
## Niche Use Case Examples:
### 1. Raspberry Pi Zero / IoT Sensor Node
- **Tool used:** `shell` + `write_file`
- **Flow:** LLM asks to read a temperature sensor via a shell script, then logs the value to a file.
- **Why Shadowclaw?** Runs in <200KB RAM, no Python, no bloat. Persistent memory keeps the last readings.
### 2. Air‑Gapped Engineering Workstation
- **Tool used:** `http_get` + `math` + `read_file`
- **Flow:** Engineer asks for a component value calculation; LLM fetches a local datasheet via HTTP, reads a config file, does the math, and writes a report.
- **Why Shadowclaw?** No cloud, no internet required. All data stays on the machine.
### 3. Embedded Router Automation
- **Tool used:** `shell` + `list_dir`
- **Flow:** Network admin asks for a list of active interfaces; LLM runs `ifconfig` and parses the output, then suggests a config change.
- **Why Shadowclaw?** Tiny binary fits in router storage (often just a few MB). Uses curl for local API calls.
### 4. Low‑Power Field Logger
- **Tool used:** `write_file` + `math`
- **Flow:** A solar‑powered device collects environmental data; Shadowclaw can process and store it locally, then answer queries about trends.
- **Why Shadowclaw?** No database needed – just a binary and a single file (`shadowclaw.bin`) for persistence.
---
## Credits
- **Tsoding** – for the “insane shadow data trick” (the header‑before‑data arena idea).
- **cJSON** – Dave Gamble and contributors – the minimal JSON parser used.
- **Ollama** – local LLM serving; Shadowclaw talks to it over HTTP via libcurl.
- **Vibe Code** - xAI Grok 4.20 and Deepseek AI
- **Openclaw**
- **webXOS**
---
## License
This project is released under the MIT License.
cJSON is also MIT licensed.