Merge branch 'main' into pr-746

commit 7d7d6bcadc

README.md (29 lines changed)
@@ -16,10 +16,12 @@

 ⚡️ Delivers core agent functionality in just **~4,000** lines of code — **99% smaller** than Clawdbot's 430k+ lines.

-📏 Real-time line count: **3,668 lines** (run `bash core_agent_lines.sh` to verify anytime)
+📏 Real-time line count: **3,689 lines** (run `bash core_agent_lines.sh` to verify anytime)

 ## 📢 News

+- **2026-02-16** 🦞 nanobot now integrates a [ClawHub](https://clawhub.ai) skill — search and install public agent skills.
+- **2026-02-15** 🔑 nanobot now supports OpenAI Codex provider with OAuth login support.
 - **2026-02-14** 🔌 nanobot now supports MCP! See [MCP section](#mcp-model-context-protocol) for details.
 - **2026-02-13** 🎉 Released v0.1.3.post7 — includes security hardening and multiple improvements. All users are recommended to upgrade to the latest version. See [release notes](https://github.com/HKUDS/nanobot/releases/tag/v0.1.3.post7) for more details.
 - **2026-02-12** 🧠 Redesigned memory system — Less code, more reliable. Join the [discussion](https://github.com/HKUDS/nanobot/discussions/566) about it!
@@ -143,19 +145,19 @@ That's it! You have a working AI assistant in 2 minutes.

 ## 💬 Chat Apps

-Talk to your nanobot through Telegram, Discord, WhatsApp, Feishu, Mochat, DingTalk, Slack, Email, or QQ — anytime, anywhere.
+Connect nanobot to your favorite chat platform.

-| Channel | Setup |
-|---------|-------|
-| **Telegram** | Easy (just a token) |
-| **Discord** | Easy (bot token + intents) |
-| **WhatsApp** | Medium (scan QR) |
-| **Feishu** | Medium (app credentials) |
-| **Mochat** | Medium (claw token + websocket) |
-| **DingTalk** | Medium (app credentials) |
-| **Slack** | Medium (bot + app tokens) |
-| **Email** | Medium (IMAP/SMTP credentials) |
-| **QQ** | Easy (app credentials) |
+| Channel | What you need |
+|---------|---------------|
+| **Telegram** | Bot token from @BotFather |
+| **Discord** | Bot token + Message Content intent |
+| **WhatsApp** | QR code scan |
+| **Feishu** | App ID + App Secret |
+| **Mochat** | Claw token (auto-setup available) |
+| **DingTalk** | App Key + App Secret |
+| **Slack** | Bot token + App-Level token |
+| **Email** | IMAP/SMTP credentials |
+| **QQ** | App ID + App Secret |

 <details>
 <summary><b>Telegram</b> (Recommended)</summary>
@@ -586,6 +588,7 @@ Config file: `~/.nanobot/config.json`
 | `zhipu` | LLM (Zhipu GLM) | [open.bigmodel.cn](https://open.bigmodel.cn) |
 | `vllm` | LLM (local, any OpenAI-compatible server) | — |
 | `openai_codex` | LLM (Codex, OAuth) | `nanobot provider login openai-codex` |
+| `github_copilot` | LLM (GitHub Copilot, OAuth) | Requires [GitHub Copilot](https://github.com/features/copilot) subscription |

 <details>
 <summary><b>OpenAI Codex (OAuth)</b></summary>
@@ -225,12 +225,16 @@ To recall past events, grep {workspace_path}/memory/HISTORY.md"""
     Returns:
         Updated message list.
     """
-    msg: dict[str, Any] = {"role": "assistant", "content": content or ""}
+    msg: dict[str, Any] = {"role": "assistant"}

+    # Omit empty content — some backends reject empty text blocks
+    if content:
+        msg["content"] = content
+
     if tool_calls:
         msg["tool_calls"] = tool_calls

-    # Thinking models reject history without this
+    # Include reasoning content when provided (required by some thinking models)
     if reasoning_content:
         msg["reasoning_content"] = reasoning_content

@@ -50,6 +50,10 @@ class CronTool(Tool):
                 "type": "string",
                 "description": "Cron expression like '0 9 * * *' (for scheduled tasks)"
             },
+            "tz": {
+                "type": "string",
+                "description": "IANA timezone for cron expressions (e.g. 'America/Vancouver')"
+            },
             "at": {
                 "type": "string",
                 "description": "ISO datetime for one-time execution (e.g. '2026-02-12T10:30:00')"
@@ -68,30 +72,46 @@ class CronTool(Tool):
         message: str = "",
         every_seconds: int | None = None,
         cron_expr: str | None = None,
+        tz: str | None = None,
         at: str | None = None,
         job_id: str | None = None,
         **kwargs: Any
     ) -> str:
         if action == "add":
-            return self._add_job(message, every_seconds, cron_expr, at)
+            return self._add_job(message, every_seconds, cron_expr, tz, at)
         elif action == "list":
             return self._list_jobs()
         elif action == "remove":
             return self._remove_job(job_id)
         return f"Unknown action: {action}"

-    def _add_job(self, message: str, every_seconds: int | None, cron_expr: str | None, at: str | None) -> str:
+    def _add_job(
+        self,
+        message: str,
+        every_seconds: int | None,
+        cron_expr: str | None,
+        tz: str | None,
+        at: str | None,
+    ) -> str:
         if not message:
             return "Error: message is required for add"
         if not self._channel or not self._chat_id:
             return "Error: no session context (channel/chat_id)"
+        if tz and not cron_expr:
+            return "Error: tz can only be used with cron_expr"
+        if tz:
+            from zoneinfo import ZoneInfo
+            try:
+                ZoneInfo(tz)
+            except Exception:
+                return f"Error: unknown timezone '{tz}'"

         # Build schedule
         delete_after = False
         if every_seconds:
             schedule = CronSchedule(kind="every", every_ms=every_seconds * 1000)
         elif cron_expr:
-            schedule = CronSchedule(kind="cron", expr=cron_expr)
+            schedule = CronSchedule(kind="cron", expr=cron_expr, tz=tz)
         elif at:
             from datetime import datetime
             dt = datetime.fromisoformat(at)
@@ -52,6 +52,11 @@ class MessageTool(Tool):
             "chat_id": {
                 "type": "string",
                 "description": "Optional: target chat/user ID"
+            },
+            "media": {
+                "type": "array",
+                "items": {"type": "string"},
+                "description": "Optional: list of file paths to attach (images, audio, documents)"
             }
         },
         "required": ["content"]
@@ -62,6 +67,7 @@ class MessageTool(Tool):
         content: str,
         channel: str | None = None,
         chat_id: str | None = None,
+        media: list[str] | None = None,
         **kwargs: Any
     ) -> str:
         channel = channel or self._default_channel
@@ -76,11 +82,13 @@ class MessageTool(Tool):
         msg = OutboundMessage(
             channel=channel,
             chat_id=chat_id,
-            content=content
+            content=content,
+            media=media or []
         )

         try:
             await self._send_callback(msg)
-            return f"Message sent to {channel}:{chat_id}"
+            media_info = f" with {len(media)} attachments" if media else ""
+            return f"Message sent to {channel}:{chat_id}{media_info}"
         except Exception as e:
             return f"Error sending message: {str(e)}"
@@ -198,6 +198,18 @@ class TelegramChannel(BaseChannel):
             await self._app.shutdown()
             self._app = None

+    @staticmethod
+    def _get_media_type(path: str) -> str:
+        """Guess media type from file extension."""
+        ext = path.rsplit(".", 1)[-1].lower() if "." in path else ""
+        if ext in ("jpg", "jpeg", "png", "gif", "webp"):
+            return "photo"
+        if ext == "ogg":
+            return "voice"
+        if ext in ("mp3", "m4a", "wav", "aac"):
+            return "audio"
+        return "document"
+
     async def send(self, msg: OutboundMessage) -> None:
         """Send a message through Telegram."""
         if not self._app:
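The new `_get_media_type` helper is a pure function, so its mapping is easy to check standalone; this copy assumes the same extension sets as the diff:

```python
def get_media_type(path: str) -> str:
    """Guess a Telegram media type from the file extension."""
    ext = path.rsplit(".", 1)[-1].lower() if "." in path else ""
    if ext in ("jpg", "jpeg", "png", "gif", "webp"):
        return "photo"
    if ext == "ogg":
        return "voice"  # OGG is delivered as a voice note
    if ext in ("mp3", "m4a", "wav", "aac"):
        return "audio"
    return "document"  # fallback for everything else


print(get_media_type("chart.PNG"))  # photo
print(get_media_type("notes.pdf"))  # document
```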
@@ -212,16 +224,35 @@ class TelegramChannel(BaseChannel):
             logger.error(f"Invalid chat_id: {msg.chat_id}")
             return

-        for chunk in _split_message(msg.content):
+        # Send media files
+        for media_path in (msg.media or []):
             try:
-                html = _markdown_to_telegram_html(chunk)
-                await self._app.bot.send_message(chat_id=chat_id, text=html, parse_mode="HTML")
+                media_type = self._get_media_type(media_path)
+                sender = {
+                    "photo": self._app.bot.send_photo,
+                    "voice": self._app.bot.send_voice,
+                    "audio": self._app.bot.send_audio,
+                }.get(media_type, self._app.bot.send_document)
+                param = "photo" if media_type == "photo" else media_type if media_type in ("voice", "audio") else "document"
+                with open(media_path, 'rb') as f:
+                    await sender(chat_id=chat_id, **{param: f})
             except Exception as e:
-                logger.warning(f"HTML parse failed, falling back to plain text: {e}")
-                try:
-                    await self._app.bot.send_message(chat_id=chat_id, text=chunk)
-                except Exception as e2:
-                    logger.error(f"Error sending Telegram message: {e2}")
+                filename = media_path.rsplit("/", 1)[-1]
+                logger.error(f"Failed to send media {media_path}: {e}")
+                await self._app.bot.send_message(chat_id=chat_id, text=f"[Failed to send: {filename}]")
+
+        # Send text content
+        if msg.content and msg.content != "[empty message]":
+            for chunk in _split_message(msg.content):
+                try:
+                    html = _markdown_to_telegram_html(chunk)
+                    await self._app.bot.send_message(chat_id=chat_id, text=html, parse_mode="HTML")
+                except Exception as e:
+                    logger.warning(f"HTML parse failed, falling back to plain text: {e}")
+                    try:
+                        await self._app.bot.send_message(chat_id=chat_id, text=chunk)
+                    except Exception as e2:
+                        logger.error(f"Error sending Telegram message: {e2}")

     async def _on_start(self, update: Update, context: ContextTypes.DEFAULT_TYPE) -> None:
         """Handle /start command."""
@@ -292,7 +292,9 @@ def _make_provider(config: Config):
     if provider_name == "openai_codex" or model.startswith("openai-codex/"):
         return OpenAICodexProvider(default_model=model)

-    if not model.startswith("bedrock/") and not (p and p.api_key):
+    from nanobot.providers.registry import find_by_name
+    spec = find_by_name(provider_name)
+    if not model.startswith("bedrock/") and not (p and p.api_key) and not (spec and spec.is_oauth):
         console.print("[red]Error: No API key configured.[/red]")
         console.print("Set one in ~/.nanobot/config.json under providers section")
         raise typer.Exit(1)
@@ -726,20 +728,26 @@ def cron_list(
     table.add_column("Next Run")

     import time
+    from datetime import datetime as _dt
+    from zoneinfo import ZoneInfo
     for job in jobs:
         # Format schedule
         if job.schedule.kind == "every":
             sched = f"every {(job.schedule.every_ms or 0) // 1000}s"
         elif job.schedule.kind == "cron":
-            sched = job.schedule.expr or ""
+            sched = f"{job.schedule.expr or ''} ({job.schedule.tz})" if job.schedule.tz else (job.schedule.expr or "")
         else:
             sched = "one-time"

         # Format next run
         next_run = ""
         if job.state.next_run_at_ms:
-            next_time = time.strftime("%Y-%m-%d %H:%M", time.localtime(job.state.next_run_at_ms / 1000))
-            next_run = next_time
+            ts = job.state.next_run_at_ms / 1000
+            try:
+                tz = ZoneInfo(job.schedule.tz) if job.schedule.tz else None
+                next_run = _dt.fromtimestamp(ts, tz).strftime("%Y-%m-%d %H:%M")
+            except Exception:
+                next_run = time.strftime("%Y-%m-%d %H:%M", time.localtime(ts))

         status = "[green]enabled[/green]" if job.enabled else "[dim]disabled[/dim]"

@@ -754,6 +762,7 @@ def cron_add(
     message: str = typer.Option(..., "--message", "-m", help="Message for agent"),
     every: int = typer.Option(None, "--every", "-e", help="Run every N seconds"),
     cron_expr: str = typer.Option(None, "--cron", "-c", help="Cron expression (e.g. '0 9 * * *')"),
+    tz: str | None = typer.Option(None, "--tz", help="IANA timezone for cron (e.g. 'America/Vancouver')"),
     at: str = typer.Option(None, "--at", help="Run once at time (ISO format)"),
     deliver: bool = typer.Option(False, "--deliver", "-d", help="Deliver response to channel"),
     to: str = typer.Option(None, "--to", help="Recipient for delivery"),
@@ -764,11 +773,15 @@ def cron_add(
     from nanobot.cron.service import CronService
     from nanobot.cron.types import CronSchedule

+    if tz and not cron_expr:
+        console.print("[red]Error: --tz can only be used with --cron[/red]")
+        raise typer.Exit(1)
+
     # Determine schedule type
     if every:
         schedule = CronSchedule(kind="every", every_ms=every * 1000)
     elif cron_expr:
-        schedule = CronSchedule(kind="cron", expr=cron_expr)
+        schedule = CronSchedule(kind="cron", expr=cron_expr, tz=tz)
     elif at:
         import datetime
         dt = datetime.datetime.fromisoformat(at)
@@ -2,7 +2,6 @@

 import json
 from pathlib import Path
-from typing import Any

 from nanobot.config.schema import Config

@@ -35,7 +34,7 @@ def load_config(config_path: Path | None = None) -> Config:
         with open(path) as f:
             data = json.load(f)
             data = _migrate_config(data)
-            return Config.model_validate(convert_keys(data))
+            return Config.model_validate(data)
     except (json.JSONDecodeError, ValueError) as e:
         print(f"Warning: Failed to load config from {path}: {e}")
         print("Using default configuration.")
@@ -54,9 +53,7 @@ def save_config(config: Config, config_path: Path | None = None) -> None:
     path = config_path or get_config_path()
     path.parent.mkdir(parents=True, exist_ok=True)

-    # Convert to camelCase format
-    data = config.model_dump()
-    data = convert_to_camel(data)
+    data = config.model_dump(by_alias=True)

     with open(path, "w") as f:
         json.dump(data, f, indent=2)
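With camelCase aliases generated on the models, `model_dump(by_alias=True)` emits camelCase keys directly, which is what the removed `convert_to_camel` pass used to do by hand. A sketch under that assumption (the class and field here are illustrative, not the project's actual schema):

```python
from pydantic import BaseModel, ConfigDict
from pydantic.alias_generators import to_camel


class ToolsConfig(BaseModel):
    # Generate a camelCase alias for every field; accept field names on input too
    model_config = ConfigDict(alias_generator=to_camel, populate_by_name=True)
    restrict_to_workspace: bool = True


print(ToolsConfig().model_dump(by_alias=True))  # {'restrictToWorkspace': True}
```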
@@ -70,37 +67,3 @@ def _migrate_config(data: dict) -> dict:
     if "restrictToWorkspace" in exec_cfg and "restrictToWorkspace" not in tools:
         tools["restrictToWorkspace"] = exec_cfg.pop("restrictToWorkspace")
     return data
-
-
-def convert_keys(data: Any) -> Any:
-    """Convert camelCase keys to snake_case for Pydantic."""
-    if isinstance(data, dict):
-        return {camel_to_snake(k): convert_keys(v) for k, v in data.items()}
-    if isinstance(data, list):
-        return [convert_keys(item) for item in data]
-    return data
-
-
-def convert_to_camel(data: Any) -> Any:
-    """Convert snake_case keys to camelCase."""
-    if isinstance(data, dict):
-        return {snake_to_camel(k): convert_to_camel(v) for k, v in data.items()}
-    if isinstance(data, list):
-        return [convert_to_camel(item) for item in data]
-    return data
-
-
-def camel_to_snake(name: str) -> str:
-    """Convert camelCase to snake_case."""
-    result = []
-    for i, char in enumerate(name):
-        if char.isupper() and i > 0:
-            result.append("_")
-        result.append(char.lower())
-    return "".join(result)
-
-
-def snake_to_camel(name: str) -> str:
-    """Convert snake_case to camelCase."""
-    components = name.split("_")
-    return components[0] + "".join(x.title() for x in components[1:])
@@ -2,27 +2,37 @@

 from pathlib import Path
 from pydantic import BaseModel, Field, ConfigDict
+from pydantic.alias_generators import to_camel
 from pydantic_settings import BaseSettings


-class WhatsAppConfig(BaseModel):
+class Base(BaseModel):
+    """Base model that accepts both camelCase and snake_case keys."""
+
+    model_config = ConfigDict(alias_generator=to_camel, populate_by_name=True)
+
+
+class WhatsAppConfig(Base):
     """WhatsApp channel configuration."""

     enabled: bool = False
     bridge_url: str = "ws://localhost:3001"
     bridge_token: str = ""  # Shared token for bridge auth (optional, recommended)
     allow_from: list[str] = Field(default_factory=list)  # Allowed phone numbers


-class TelegramConfig(BaseModel):
+class TelegramConfig(Base):
     """Telegram channel configuration."""

     enabled: bool = False
     token: str = ""  # Bot token from @BotFather
     allow_from: list[str] = Field(default_factory=list)  # Allowed user IDs or usernames
     proxy: str | None = None  # HTTP/SOCKS5 proxy URL, e.g. "http://127.0.0.1:7890" or "socks5://127.0.0.1:1080"


-class FeishuConfig(BaseModel):
+class FeishuConfig(Base):
     """Feishu/Lark channel configuration using WebSocket long connection."""

     enabled: bool = False
     app_id: str = ""  # App ID from Feishu Open Platform
     app_secret: str = ""  # App Secret from Feishu Open Platform
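The new `Base` class delegates key conversion to pydantic's alias generator, so every config model accepts both key spellings without the hand-rolled converters. A minimal sketch of the pattern:

```python
from pydantic import BaseModel, ConfigDict, Field
from pydantic.alias_generators import to_camel


class Base(BaseModel):
    """Accept camelCase (generated alias) and snake_case (field name) keys."""
    model_config = ConfigDict(alias_generator=to_camel, populate_by_name=True)


class TelegramConfig(Base):
    allow_from: list[str] = Field(default_factory=list)


# Both spellings validate into the same field
print(TelegramConfig.model_validate({"allowFrom": ["42"]}).allow_from)   # ['42']
print(TelegramConfig.model_validate({"allow_from": ["42"]}).allow_from)  # ['42']
```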
@@ -31,24 +41,28 @@ class FeishuConfig(BaseModel):
     allow_from: list[str] = Field(default_factory=list)  # Allowed user open_ids


-class DingTalkConfig(BaseModel):
+class DingTalkConfig(Base):
     """DingTalk channel configuration using Stream mode."""

     enabled: bool = False
     client_id: str = ""  # AppKey
     client_secret: str = ""  # AppSecret
     allow_from: list[str] = Field(default_factory=list)  # Allowed staff_ids


-class DiscordConfig(BaseModel):
+class DiscordConfig(Base):
     """Discord channel configuration."""

     enabled: bool = False
     token: str = ""  # Bot token from Discord Developer Portal
     allow_from: list[str] = Field(default_factory=list)  # Allowed user IDs
     gateway_url: str = "wss://gateway.discord.gg/?v=10&encoding=json"
     intents: int = 37377  # GUILDS + GUILD_MESSAGES + DIRECT_MESSAGES + MESSAGE_CONTENT

-class EmailConfig(BaseModel):
+
+class EmailConfig(Base):
     """Email channel configuration (IMAP inbound + SMTP outbound)."""

     enabled: bool = False
     consent_granted: bool = False  # Explicit owner permission to access mailbox data
@@ -78,18 +92,21 @@ class EmailConfig(BaseModel):
     allow_from: list[str] = Field(default_factory=list)  # Allowed sender email addresses


-class MochatMentionConfig(BaseModel):
+class MochatMentionConfig(Base):
     """Mochat mention behavior configuration."""

     require_in_groups: bool = False


-class MochatGroupRule(BaseModel):
+class MochatGroupRule(Base):
     """Mochat per-group mention requirement."""

     require_mention: bool = False


-class MochatConfig(BaseModel):
+class MochatConfig(Base):
     """Mochat channel configuration."""

     enabled: bool = False
     base_url: str = "https://mochat.io"
     socket_url: str = ""
@@ -114,15 +131,17 @@ class MochatConfig(BaseModel):
     reply_delay_ms: int = 120000


-class SlackDMConfig(BaseModel):
+class SlackDMConfig(Base):
     """Slack DM policy configuration."""

     enabled: bool = True
     policy: str = "open"  # "open" or "allowlist"
     allow_from: list[str] = Field(default_factory=list)  # Allowed Slack user IDs


-class SlackConfig(BaseModel):
+class SlackConfig(Base):
     """Slack channel configuration."""

     enabled: bool = False
     mode: str = "socket"  # "socket" supported
     webhook_path: str = "/slack/events"
@@ -134,16 +153,18 @@ class SlackConfig(BaseModel):
     dm: SlackDMConfig = Field(default_factory=SlackDMConfig)


-class QQConfig(BaseModel):
+class QQConfig(Base):
     """QQ channel configuration using botpy SDK."""

     enabled: bool = False
     app_id: str = ""  # Bot AppID from q.qq.com
     secret: str = ""  # Bot AppSecret from q.qq.com
     allow_from: list[str] = Field(default_factory=list)  # Allowed user openids (empty = public access)


-class ChannelsConfig(BaseModel):
+class ChannelsConfig(Base):
     """Configuration for chat channels."""

     whatsapp: WhatsAppConfig = Field(default_factory=WhatsAppConfig)
     telegram: TelegramConfig = Field(default_factory=TelegramConfig)
     discord: DiscordConfig = Field(default_factory=DiscordConfig)
@@ -155,8 +176,9 @@ class ChannelsConfig(BaseModel):
     qq: QQConfig = Field(default_factory=QQConfig)


-class AgentDefaults(BaseModel):
+class AgentDefaults(Base):
     """Default agent configuration."""

     workspace: str = "~/.nanobot/workspace"
     model: str = "anthropic/claude-opus-4-5"
     max_tokens: int = 8192
@@ -165,20 +187,23 @@ class AgentDefaults(BaseModel):
     memory_window: int = 50


-class AgentsConfig(BaseModel):
+class AgentsConfig(Base):
     """Agent configuration."""

     defaults: AgentDefaults = Field(default_factory=AgentDefaults)


-class ProviderConfig(BaseModel):
+class ProviderConfig(Base):
     """LLM provider configuration."""

     api_key: str = ""
     api_base: str | None = None
     extra_headers: dict[str, str] | None = None  # Custom headers (e.g. APP-Code for AiHubMix)


-class ProvidersConfig(BaseModel):
+class ProvidersConfig(Base):
     """Configuration for LLM providers."""

     custom: ProviderConfig = Field(default_factory=ProviderConfig)  # Any OpenAI-compatible endpoint
     anthropic: ProviderConfig = Field(default_factory=ProviderConfig)
     openai: ProviderConfig = Field(default_factory=ProviderConfig)
@ -193,40 +218,47 @@ class ProvidersConfig(BaseModel):
|
|||||||
minimax: ProviderConfig = Field(default_factory=ProviderConfig)
|
minimax: ProviderConfig = Field(default_factory=ProviderConfig)
|
||||||
aihubmix: ProviderConfig = Field(default_factory=ProviderConfig) # AiHubMix API gateway
|
aihubmix: ProviderConfig = Field(default_factory=ProviderConfig) # AiHubMix API gateway
|
||||||
openai_codex: ProviderConfig = Field(default_factory=ProviderConfig) # OpenAI Codex (OAuth)
|
openai_codex: ProviderConfig = Field(default_factory=ProviderConfig) # OpenAI Codex (OAuth)
|
||||||
|
github_copilot: ProviderConfig = Field(default_factory=ProviderConfig) # Github Copilot (OAuth)
|
||||||
|
|
||||||
|
|
||||||
class GatewayConfig(BaseModel):
|
class GatewayConfig(Base):
|
||||||
"""Gateway/server configuration."""
|
"""Gateway/server configuration."""
|
||||||
|
|
||||||
host: str = "0.0.0.0"
|
host: str = "0.0.0.0"
|
||||||
port: int = 18790
|
port: int = 18790
|
||||||
|
|
||||||
|
|
||||||
class WebSearchConfig(BaseModel):
|
class WebSearchConfig(Base):
|
||||||
"""Web search tool configuration."""
|
"""Web search tool configuration."""
|
||||||
|
|
||||||
api_key: str = "" # Brave Search API key
|
api_key: str = "" # Brave Search API key
|
||||||
max_results: int = 5
|
max_results: int = 5
|
||||||
|
|
||||||
|
|
||||||
class WebToolsConfig(BaseModel):
|
class WebToolsConfig(Base):
|
||||||
"""Web tools configuration."""
|
"""Web tools configuration."""
|
||||||
|
|
||||||
search: WebSearchConfig = Field(default_factory=WebSearchConfig)
|
search: WebSearchConfig = Field(default_factory=WebSearchConfig)
|
||||||
|
|
||||||
|
|
||||||
class ExecToolConfig(BaseModel):
|
class ExecToolConfig(Base):
|
||||||
"""Shell exec tool configuration."""
|
"""Shell exec tool configuration."""
|
||||||
|
|
||||||
timeout: int = 60
|
timeout: int = 60
|
||||||
|
|
||||||
|
|
||||||
class MCPServerConfig(BaseModel):
|
class MCPServerConfig(Base):
|
||||||
"""MCP server connection configuration (stdio or HTTP)."""
|
"""MCP server connection configuration (stdio or HTTP)."""
|
||||||
|
|
||||||
command: str = "" # Stdio: command to run (e.g. "npx")
|
command: str = "" # Stdio: command to run (e.g. "npx")
|
||||||
args: list[str] = Field(default_factory=list) # Stdio: command arguments
|
args: list[str] = Field(default_factory=list) # Stdio: command arguments
|
||||||
env: dict[str, str] = Field(default_factory=dict) # Stdio: extra env vars
|
env: dict[str, str] = Field(default_factory=dict) # Stdio: extra env vars
|
||||||
url: str = "" # HTTP: streamable HTTP endpoint URL
|
url: str = "" # HTTP: streamable HTTP endpoint URL
|
||||||
|
|
||||||
|
|
||||||
class ToolsConfig(BaseModel):
|
class ToolsConfig(Base):
|
||||||
"""Tools configuration."""
|
"""Tools configuration."""
|
||||||
|
|
||||||
web: WebToolsConfig = Field(default_factory=WebToolsConfig)
|
web: WebToolsConfig = Field(default_factory=WebToolsConfig)
|
||||||
exec: ExecToolConfig = Field(default_factory=ExecToolConfig)
|
exec: ExecToolConfig = Field(default_factory=ExecToolConfig)
|
||||||
restrict_to_workspace: bool = False # If true, restrict all tool access to workspace directory
|
restrict_to_workspace: bool = False # If true, restrict all tool access to workspace directory
|
||||||
@ -235,6 +267,7 @@ class ToolsConfig(BaseModel):
|
|||||||
|
|
||||||
class Config(BaseSettings):
|
class Config(BaseSettings):
|
||||||
"""Root configuration for nanobot."""
|
"""Root configuration for nanobot."""
|
||||||
|
|
||||||
agents: AgentsConfig = Field(default_factory=AgentsConfig)
|
agents: AgentsConfig = Field(default_factory=AgentsConfig)
|
||||||
channels: ChannelsConfig = Field(default_factory=ChannelsConfig)
|
channels: ChannelsConfig = Field(default_factory=ChannelsConfig)
|
||||||
providers: ProvidersConfig = Field(default_factory=ProvidersConfig)
|
providers: ProvidersConfig = Field(default_factory=ProvidersConfig)
|
||||||
@ -249,6 +282,7 @@ class Config(BaseSettings):
|
|||||||
def _match_provider(self, model: str | None = None) -> tuple["ProviderConfig | None", str | None]:
|
def _match_provider(self, model: str | None = None) -> tuple["ProviderConfig | None", str | None]:
|
||||||
"""Match provider config and its registry name. Returns (config, spec_name)."""
|
"""Match provider config and its registry name. Returns (config, spec_name)."""
|
||||||
from nanobot.providers.registry import PROVIDERS
|
from nanobot.providers.registry import PROVIDERS
|
||||||
|
|
||||||
model_lower = (model or self.agents.defaults.model).lower()
|
model_lower = (model or self.agents.defaults.model).lower()
|
||||||
|
|
||||||
# Match by keyword (order follows PROVIDERS registry)
|
# Match by keyword (order follows PROVIDERS registry)
|
||||||
@ -286,6 +320,7 @@ class Config(BaseSettings):
|
|||||||
def get_api_base(self, model: str | None = None) -> str | None:
|
def get_api_base(self, model: str | None = None) -> str | None:
|
||||||
"""Get API base URL for the given model. Applies default URLs for known gateways."""
|
"""Get API base URL for the given model. Applies default URLs for known gateways."""
|
||||||
from nanobot.providers.registry import find_by_name
|
from nanobot.providers.registry import find_by_name
|
||||||
|
|
||||||
p, name = self._match_provider(model)
|
p, name = self._match_provider(model)
|
||||||
if p and p.api_base:
|
if p and p.api_base:
|
||||||
return p.api_base
|
return p.api_base
|
||||||
@ -298,7 +333,4 @@ class Config(BaseSettings):
|
|||||||
return spec.default_api_base
|
return spec.default_api_base
|
||||||
return None
|
return None
|
||||||
|
|
||||||
model_config = ConfigDict(
|
model_config = ConfigDict(env_prefix="NANOBOT_", env_nested_delimiter="__")
|
||||||
env_prefix="NANOBOT_",
|
|
||||||
env_nested_delimiter="__"
|
|
||||||
)
|
|
||||||
|
|||||||
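The `model_config` hunk only collapses the `ConfigDict(...)` call onto one line; the settings are unchanged. As a rough illustration of what `env_prefix="NANOBOT_"` and `env_nested_delimiter="__"` mean, here is a hand-rolled sketch of the env-var-to-nested-config mapping (this mimics pydantic-settings behavior; it is not nanobot's actual loader, and `env_to_nested` is a hypothetical helper name):

```python
# Sketch: how pydantic-settings-style env_prefix="NANOBOT_" and
# env_nested_delimiter="__" map flat environment variables onto nested
# config fields (hypothetical helper, not nanobot's real code).
def env_to_nested(environ: dict[str, str], prefix: str = "NANOBOT_",
                  delimiter: str = "__") -> dict:
    nested: dict = {}
    for key, value in environ.items():
        if not key.startswith(prefix):
            continue  # only prefixed vars participate
        path = key[len(prefix):].lower().split(delimiter)
        node = nested
        for part in path[:-1]:
            node = node.setdefault(part, {})  # descend/create nested dicts
        node[path[-1]] = value
    return nested

overrides = env_to_nested({
    "NANOBOT_PROVIDERS__OPENAI__API_KEY": "sk-test",
    "NANOBOT_AGENTS__DEFAULTS__MAX_TOKENS": "4096",
    "UNRELATED_VAR": "ignored",
})
print(overrides["providers"]["openai"]["api_key"])  # → sk-test
```

So `NANOBOT_PROVIDERS__OPENAI__API_KEY` would override `Config.providers.openai.api_key` without touching the config file.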
@@ -32,7 +32,8 @@ def _compute_next_run(schedule: CronSchedule, now_ms: int) -> int | None:
     try:
         from croniter import croniter
         from zoneinfo import ZoneInfo
-        base_time = time.time()
+        # Use caller-provided reference time for deterministic scheduling
+        base_time = now_ms / 1000
         tz = ZoneInfo(schedule.tz) if schedule.tz else datetime.now().astimezone().tzinfo
         base_dt = datetime.fromtimestamp(base_time, tz=tz)
         cron = croniter(schedule.expr, base_dt)
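The scheduler hunk swaps `time.time()` for `now_ms / 1000`, so the next run is derived from the caller-supplied reference time instead of the wall clock. A minimal sketch of why that matters, using a simplified fixed-hour daily schedule rather than nanobot's croniter-based logic (`next_daily_run_ms` is a hypothetical stand-in):

```python
from datetime import datetime, timedelta, timezone

# Simplified stand-in for _compute_next_run: because the reference time comes
# from the caller's now_ms, the result is deterministic and unit-testable.
def next_daily_run_ms(now_ms: int, hour: int = 9) -> int:
    base = datetime.fromtimestamp(now_ms / 1000, tz=timezone.utc)
    candidate = base.replace(hour=hour, minute=0, second=0, microsecond=0)
    if candidate <= base:
        candidate += timedelta(days=1)  # today's slot already passed
    return int(candidate.timestamp() * 1000)

# 2026-02-16 08:00 UTC → next 09:00 run is one hour away
now_ms = int(datetime(2026, 2, 16, 8, 0, tzinfo=timezone.utc).timestamp() * 1000)
print(next_daily_run_ms(now_ms) - now_ms)  # → 3600000 (one hour in ms)
```

With `time.time()` inside the function, the same call at a different wall-clock moment would return a different answer; feeding `now_ms` through makes the computation a pure function of its inputs.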
@@ -177,6 +177,25 @@ PROVIDERS: tuple[ProviderSpec, ...] = (
         is_oauth=True,  # OAuth-based authentication
     ),

+    # Github Copilot: uses OAuth, not API key.
+    ProviderSpec(
+        name="github_copilot",
+        keywords=("github_copilot", "copilot"),
+        env_key="",  # OAuth-based, no API key
+        display_name="Github Copilot",
+        litellm_prefix="github_copilot",  # github_copilot/model → github_copilot/model
+        skip_prefixes=("github_copilot/",),
+        env_extras=(),
+        is_gateway=False,
+        is_local=False,
+        detect_by_key_prefix="",
+        detect_by_base_keyword="",
+        default_api_base="",
+        strip_model_prefix=False,
+        model_overrides=(),
+        is_oauth=True,  # OAuth-based authentication
+    ),
+
     # DeepSeek: needs "deepseek/" prefix for LiteLLM routing.
     ProviderSpec(
         name="deepseek",
@@ -21,4 +21,5 @@ The skill format and metadata structure follow OpenClaw's conventions to maintai
 | `weather` | Get weather info using wttr.in and Open-Meteo |
 | `summarize` | Summarize URLs, files, and YouTube videos |
 | `tmux` | Remote-control tmux sessions |
+| `clawhub` | Search and install skills from ClawHub registry |
 | `skill-creator` | Create new skills |
53  nanobot/skills/clawhub/SKILL.md  Normal file
@@ -0,0 +1,53 @@
+---
+name: clawhub
+description: Search and install agent skills from ClawHub, the public skill registry.
+homepage: https://clawhub.ai
+metadata: {"nanobot":{"emoji":"🦞"}}
+---
+
+# ClawHub
+
+Public skill registry for AI agents. Search by natural language (vector search).
+
+## When to use
+
+Use this skill when the user asks any of:
+- "find a skill for …"
+- "search for skills"
+- "install a skill"
+- "what skills are available?"
+- "update my skills"
+
+## Search
+
+```bash
+npx --yes clawhub@latest search "web scraping" --limit 5
+```
+
+## Install
+
+```bash
+npx --yes clawhub@latest install <slug> --workdir ~/.nanobot/workspace
+```
+
+Replace `<slug>` with the skill name from search results. This places the skill into `~/.nanobot/workspace/skills/`, where nanobot loads workspace skills from. Always include `--workdir`.
+
+## Update
+
+```bash
+npx --yes clawhub@latest update --all --workdir ~/.nanobot/workspace
+```
+
+## List installed
+
+```bash
+npx --yes clawhub@latest list --workdir ~/.nanobot/workspace
+```
+
+## Notes
+
+- Requires Node.js (`npx` comes with it).
+- No API key needed for search and install.
+- Login (`npx --yes clawhub@latest login`) is only required for publishing.
+- `--workdir ~/.nanobot/workspace` is critical — without it, skills install to the current directory instead of the nanobot workspace.
+- After install, remind the user to start a new session to load the skill.
@@ -30,6 +30,11 @@ One-time scheduled task (compute ISO datetime from current time):
 cron(action="add", message="Remind me about the meeting", at="<ISO datetime>")
 ```
+
+Timezone-aware cron:
+```
+cron(action="add", message="Morning standup", cron_expr="0 9 * * 1-5", tz="America/Vancouver")
+```

 List/remove:
 ```
 cron(action="list")
@@ -44,4 +49,9 @@ cron(action="remove", job_id="abc123")
 | every hour | every_seconds: 3600 |
 | every day at 8am | cron_expr: "0 8 * * *" |
 | weekdays at 5pm | cron_expr: "0 17 * * 1-5" |
+| 9am Vancouver time daily | cron_expr: "0 9 * * *", tz: "America/Vancouver" |
 | at a specific time | at: ISO datetime string (compute from current time) |
+
+## Timezone
+
+Use `tz` with `cron_expr` to schedule in a specific IANA timezone. Without `tz`, the server's local timezone is used.
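The new `tz` field pins a schedule to an IANA timezone instead of the server's local zone. A small illustration of why that matters across daylight-saving transitions, using plain `zoneinfo` (independent of the cron tool itself):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# 09:00 in America/Vancouver is 17:00 UTC in winter (PST, UTC-8) but
# 16:00 UTC in summer (PDT, UTC-7); the IANA zone tracks DST for you,
# while a fixed server-local cron expression would drift by an hour.
jan = datetime(2026, 1, 15, 9, 0, tzinfo=ZoneInfo("America/Vancouver"))
jul = datetime(2026, 7, 15, 9, 0, tzinfo=ZoneInfo("America/Vancouver"))
print(jan.astimezone(ZoneInfo("UTC")).hour)  # → 17
print(jul.astimezone(ZoneInfo("UTC")).hour)  # → 16
```

This is why "9am Vancouver time daily" in the table above is expressed as `cron_expr: "0 9 * * *"` plus `tz`, rather than a UTC-shifted hour.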