Merge branch 'main' into pr-107
commit 980c5992f4
.gitignore (vendored, 4 changes)

@@ -12,4 +12,6 @@ docs/
 *.pyw
 *.pyz
 *.pywz
 *.pyzz
+.venv/
+__pycache__/
README.md (69 changes)

@@ -18,7 +18,9 @@

 ## 📢 News

-- **2026-02-01** 🎉 nanobot launched! Welcome to try 🐈 nanobot!
+- **2026-02-05** ✨ Added Feishu channel, DeepSeek provider, and better scheduled tasks support!
+- **2026-02-04** 🚀 v0.1.3.post4 released with multi-provider & Docker support! Check [release notes](https://github.com/HKUDS/nanobot/releases/tag/v0.1.3.post4) for details.
+- **2026-02-02** 🎉 nanobot launched! Welcome to try 🐈 nanobot!

 ## Key Features of nanobot:

@@ -28,7 +30,7 @@

 ⚡️ **Lightning Fast**: Minimal footprint means faster startup, lower resource usage, and quicker iterations.

-💎 **Easy-to-Use**: One-click to depoly and you're ready to go.
+💎 **Easy-to-Use**: One-click to deploy and you're ready to go.

 ## 🏗️ Architecture

@@ -166,12 +168,13 @@ nanobot agent -m "Hello from my local LLM!"

 ## 💬 Chat Apps

-Talk to your nanobot through Telegram or WhatsApp — anytime, anywhere.
+Talk to your nanobot through Telegram, WhatsApp, or Feishu — anytime, anywhere.

 | Channel | Setup |
 |---------|-------|
 | **Telegram** | Easy (just a token) |
 | **WhatsApp** | Medium (scan QR) |
+| **Feishu** | Medium (app credentials) |

 <details>
 <summary><b>Telegram</b> (Recommended)</summary>
@@ -242,6 +245,55 @@ nanobot gateway

 </details>

+<details>
+<summary><b>Feishu (飞书)</b></summary>
+
+Uses **WebSocket** long connection — no public IP required.
+
+```bash
+pip install nanobot-ai[feishu]
+```
+
+**1. Create a Feishu bot**
+- Visit [Feishu Open Platform](https://open.feishu.cn/app)
+- Create a new app → Enable **Bot** capability
+- **Permissions**: Add `im:message` (send messages)
+- **Events**: Add `im.message.receive_v1` (receive messages)
+- Select **Long Connection** mode (requires running nanobot first to establish connection)
+- Get **App ID** and **App Secret** from "Credentials & Basic Info"
+- Publish the app
+
+**2. Configure**
+
+```json
+{
+  "channels": {
+    "feishu": {
+      "enabled": true,
+      "appId": "cli_xxx",
+      "appSecret": "xxx",
+      "encryptKey": "",
+      "verificationToken": "",
+      "allowFrom": []
+    }
+  }
+}
+```
+
+> `encryptKey` and `verificationToken` are optional for Long Connection mode.
+> `allowFrom`: Leave empty to allow all users, or add `["ou_xxx"]` to restrict access.
+
+**3. Run**
+
+```bash
+nanobot gateway
+```
+
+> [!TIP]
+> Feishu uses WebSocket to receive messages — no webhook or public IP needed!
+
+</details>
+
 ## ⚙️ Configuration

 Config file: `~/.nanobot/config.json`
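The `allowFrom` behavior described above (empty list means open to all, otherwise an allow-list of open_ids) can be sketched as a standalone check. This is an illustrative helper, not code from the diff:

```python
def allowed(sender_open_id: str, allow_from: list[str]) -> bool:
    """Hypothetical gate for the `allowFrom` setting:
    an empty list admits everyone, otherwise only listed open_ids pass."""
    return not allow_from or sender_open_id in allow_from

assert allowed("ou_abc", []) is True            # empty list: open to all
assert allowed("ou_abc", ["ou_abc"]) is True    # explicitly allowed
assert allowed("ou_zzz", ["ou_abc"]) is False   # everyone else rejected
```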
@@ -256,6 +308,7 @@ Config file: `~/.nanobot/config.json`
 | `openrouter` | LLM (recommended, access to all models) | [openrouter.ai](https://openrouter.ai) |
 | `anthropic` | LLM (Claude direct) | [console.anthropic.com](https://console.anthropic.com) |
 | `openai` | LLM (GPT direct) | [platform.openai.com](https://platform.openai.com) |
+| `deepseek` | LLM (DeepSeek direct) | [platform.deepseek.com](https://platform.deepseek.com) |
 | `groq` | LLM + **Voice transcription** (Whisper) | [console.groq.com](https://console.groq.com) |
 | `gemini` | LLM (Gemini direct) | [aistudio.google.com](https://aistudio.google.com) |

@@ -286,6 +339,14 @@ Config file: `~/.nanobot/config.json`
     },
     "whatsapp": {
       "enabled": false
+    },
+    "feishu": {
+      "enabled": false,
+      "appId": "cli_xxx",
+      "appSecret": "xxx",
+      "encryptKey": "",
+      "verificationToken": "",
+      "allowFrom": []
     }
   },
   "tools": {
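A config file shaped like the example above is plain JSON, so a channel's block can be read with the standard library. A minimal sketch (hypothetical helper, not part of this diff); the fallback to an empty dict is what lets a missing block mean "use defaults":

```python
import json
import tempfile
from pathlib import Path

def load_feishu_config(path: Path) -> dict:
    """Read the "feishu" block from a nanobot-style config file.

    Returns an empty dict when the file or the block is missing, so a
    caller can fall back to defaults (enabled=False and so on).
    """
    if not path.exists():
        return {}
    data = json.loads(path.read_text(encoding="utf-8"))
    return data.get("channels", {}).get("feishu", {})

# Round-trip a config shaped like the example above
tmp = Path(tempfile.mkdtemp()) / "config.json"
tmp.write_text(json.dumps({"channels": {"feishu": {"enabled": False, "appId": "cli_xxx"}}}))
feishu = load_feishu_config(tmp)
print(feishu)  # {'enabled': False, 'appId': 'cli_xxx'}
```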
@@ -392,7 +453,7 @@ PRs welcome! The codebase is intentionally small and readable. 🤗
 ### Contributors

 <a href="https://github.com/HKUDS/nanobot/graphs/contributors">
-  <img src="https://contrib.rocks/image?repo=HKUDS/nanobot" />
+  <img src="https://contrib.rocks/image?repo=HKUDS/nanobot&max=100&columns=12" />
 </a>

@@ -137,6 +137,8 @@ When remembering something, write to {workspace_path}/memory/MEMORY.md"""
         current_message: str,
         skill_names: list[str] | None = None,
         media: list[str] | None = None,
+        channel: str | None = None,
+        chat_id: str | None = None,
     ) -> list[dict[str, Any]]:
         """
         Build the complete message list for an LLM call.

@@ -146,6 +148,8 @@ When remembering something, write to {workspace_path}/memory/MEMORY.md"""
             current_message: The new user message.
             skill_names: Optional skills to include.
             media: Optional list of local file paths for images/media.
+            channel: Current channel (telegram, feishu, etc.).
+            chat_id: Current chat/user ID.

         Returns:
             List of messages including system prompt.

@@ -154,6 +158,8 @@ When remembering something, write to {workspace_path}/memory/MEMORY.md"""

         # System prompt
         system_prompt = self.build_system_prompt(skill_names)
+        if channel and chat_id:
+            system_prompt += f"\n\n## Current Session\nChannel: {channel}\nChat ID: {chat_id}"
         messages.append({"role": "system", "content": system_prompt})

         # History
@@ -17,6 +17,7 @@ from nanobot.agent.tools.shell import ExecTool
 from nanobot.agent.tools.web import WebSearchTool, WebFetchTool
 from nanobot.agent.tools.message import MessageTool
 from nanobot.agent.tools.spawn import SpawnTool
+from nanobot.agent.tools.cron import CronTool
 from nanobot.agent.subagent import SubagentManager
 from nanobot.session.manager import SessionManager

@@ -42,8 +43,10 @@ class AgentLoop:
         max_iterations: int = 20,
         brave_api_key: str | None = None,
         exec_config: "ExecToolConfig | None" = None,
+        cron_service: "CronService | None" = None,
     ):
         from nanobot.config.schema import ExecToolConfig
+        from nanobot.cron.service import CronService
         self.bus = bus
         self.provider = provider
         self.workspace = workspace

@@ -51,6 +54,7 @@ class AgentLoop:
         self.max_iterations = max_iterations
         self.brave_api_key = brave_api_key
         self.exec_config = exec_config or ExecToolConfig()
+        self.cron_service = cron_service

         self.context = ContextBuilder(workspace)
         self.sessions = SessionManager(workspace)

@@ -93,6 +97,10 @@ class AgentLoop:
         # Spawn tool (for subagents)
         spawn_tool = SpawnTool(manager=self.subagents)
         self.tools.register(spawn_tool)

+        # Cron tool (for scheduling)
+        if self.cron_service:
+            self.tools.register(CronTool(self.cron_service))
+
     async def run(self) -> None:
         """Run the agent loop, processing messages from the bus."""

@@ -157,11 +165,17 @@
         if isinstance(spawn_tool, SpawnTool):
             spawn_tool.set_context(msg.channel, msg.chat_id)

+        cron_tool = self.tools.get("cron")
+        if isinstance(cron_tool, CronTool):
+            cron_tool.set_context(msg.channel, msg.chat_id)
+
         # Build initial messages (use get_history for LLM-formatted messages)
         messages = self.context.build_messages(
             history=session.get_history(),
             current_message=msg.content,
             media=msg.media if msg.media else None,
+            channel=msg.channel,
+            chat_id=msg.chat_id,
         )

         # Agent loop

@@ -255,10 +269,16 @@
         if isinstance(spawn_tool, SpawnTool):
             spawn_tool.set_context(origin_channel, origin_chat_id)

+        cron_tool = self.tools.get("cron")
+        if isinstance(cron_tool, CronTool):
+            cron_tool.set_context(origin_channel, origin_chat_id)
+
         # Build messages with the announce content
         messages = self.context.build_messages(
             history=session.get_history(),
-            current_message=msg.content
+            current_message=msg.content,
+            channel=origin_channel,
+            chat_id=origin_chat_id,
         )

         # Agent loop (limited for announce handling)

@@ -315,21 +335,29 @@
             content=final_content
         )

-    async def process_direct(self, content: str, session_key: str = "cli:direct") -> str:
+    async def process_direct(
+        self,
+        content: str,
+        session_key: str = "cli:direct",
+        channel: str = "cli",
+        chat_id: str = "direct",
+    ) -> str:
         """
-        Process a message directly (for CLI usage).
+        Process a message directly (for CLI or cron usage).

         Args:
             content: The message content.
             session_key: Session identifier.
+            channel: Source channel (for context).
+            chat_id: Source chat ID (for context).

         Returns:
             The agent's response.
         """
         msg = InboundMessage(
-            channel="cli",
+            channel=channel,
             sender_id="user",
-            chat_id="direct",
+            chat_id=chat_id,
             content=content
         )

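The hunks above repeat a lookup-then-`set_context` pattern: fetch a tool from the registry by name, and narrow with `isinstance` so the call is a no-op when the tool was never registered (e.g. cron disabled). A toy sketch of that pattern with stand-in classes (names here are illustrative, not the repo's):

```python
class ToolRegistry:
    """Toy registry mirroring the lookup-then-set_context pattern above."""
    def __init__(self) -> None:
        self._tools: dict = {}

    def register(self, name: str, tool) -> None:
        self._tools[name] = tool

    def get(self, name: str):
        return self._tools.get(name)  # None when the tool was never registered

class CronToolStub:
    def __init__(self) -> None:
        self.channel = None
        self.chat_id = None

    def set_context(self, channel: str, chat_id: str) -> None:
        self.channel, self.chat_id = channel, chat_id

tools = ToolRegistry()
tools.register("cron", CronToolStub())

# The isinstance guard keeps this a no-op when cron was not enabled.
cron_tool = tools.get("cron")
if isinstance(cron_tool, CronToolStub):
    cron_tool.set_context("telegram", "12345")
```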
@@ -149,7 +149,8 @@ class SubagentManager:

         # Execute tools
         for tool_call in response.tool_calls:
-            logger.debug(f"Subagent [{task_id}] executing: {tool_call.name}")
+            args_str = json.dumps(tool_call.arguments)
+            logger.debug(f"Subagent [{task_id}] executing: {tool_call.name} with arguments: {args_str}")
             result = await tools.execute(tool_call.name, tool_call.arguments)
             messages.append({
                 "role": "tool",
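The logging change above serializes the tool arguments with `json.dumps` so the debug line shows a stable, copy-pastable rendering instead of a Python repr. A small illustrative sketch of the message it produces (`describe_tool_call` is a hypothetical helper, not the repo's):

```python
import json

def describe_tool_call(task_id: str, name: str, arguments: dict) -> str:
    # json.dumps gives a stable, copy-pastable rendering of the arguments,
    # which is what the logging change above is after.
    args_str = json.dumps(arguments)
    return f"Subagent [{task_id}] executing: {name} with arguments: {args_str}"

print(describe_tool_call("t1", "web_search", {"query": "feishu sdk"}))
```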
nanobot/agent/tools/cron.py (new file, 114 lines)

@@ -0,0 +1,114 @@
+"""Cron tool for scheduling reminders and tasks."""
+
+from typing import Any
+
+from nanobot.agent.tools.base import Tool
+from nanobot.cron.service import CronService
+from nanobot.cron.types import CronSchedule
+
+
+class CronTool(Tool):
+    """Tool to schedule reminders and recurring tasks."""
+
+    def __init__(self, cron_service: CronService):
+        self._cron = cron_service
+        self._channel = ""
+        self._chat_id = ""
+
+    def set_context(self, channel: str, chat_id: str) -> None:
+        """Set the current session context for delivery."""
+        self._channel = channel
+        self._chat_id = chat_id
+
+    @property
+    def name(self) -> str:
+        return "cron"
+
+    @property
+    def description(self) -> str:
+        return "Schedule reminders and recurring tasks. Actions: add, list, remove."
+
+    @property
+    def parameters(self) -> dict[str, Any]:
+        return {
+            "type": "object",
+            "properties": {
+                "action": {
+                    "type": "string",
+                    "enum": ["add", "list", "remove"],
+                    "description": "Action to perform"
+                },
+                "message": {
+                    "type": "string",
+                    "description": "Reminder message (for add)"
+                },
+                "every_seconds": {
+                    "type": "integer",
+                    "description": "Interval in seconds (for recurring tasks)"
+                },
+                "cron_expr": {
+                    "type": "string",
+                    "description": "Cron expression like '0 9 * * *' (for scheduled tasks)"
+                },
+                "job_id": {
+                    "type": "string",
+                    "description": "Job ID (for remove)"
+                }
+            },
+            "required": ["action"]
+        }
+
+    async def execute(
+        self,
+        action: str,
+        message: str = "",
+        every_seconds: int | None = None,
+        cron_expr: str | None = None,
+        job_id: str | None = None,
+        **kwargs: Any
+    ) -> str:
+        if action == "add":
+            return self._add_job(message, every_seconds, cron_expr)
+        elif action == "list":
+            return self._list_jobs()
+        elif action == "remove":
+            return self._remove_job(job_id)
+        return f"Unknown action: {action}"
+
+    def _add_job(self, message: str, every_seconds: int | None, cron_expr: str | None) -> str:
+        if not message:
+            return "Error: message is required for add"
+        if not self._channel or not self._chat_id:
+            return "Error: no session context (channel/chat_id)"
+
+        # Build schedule
+        if every_seconds:
+            schedule = CronSchedule(kind="every", every_ms=every_seconds * 1000)
+        elif cron_expr:
+            schedule = CronSchedule(kind="cron", expr=cron_expr)
+        else:
+            return "Error: either every_seconds or cron_expr is required"
+
+        job = self._cron.add_job(
+            name=message[:30],
+            schedule=schedule,
+            message=message,
+            deliver=True,
+            channel=self._channel,
+            to=self._chat_id,
+        )
+        return f"Created job '{job.name}' (id: {job.id})"
+
+    def _list_jobs(self) -> str:
+        jobs = self._cron.list_jobs()
+        if not jobs:
+            return "No scheduled jobs."
+        lines = [f"- {j.name} (id: {j.id}, {j.schedule.kind})" for j in jobs]
+        return "Scheduled jobs:\n" + "\n".join(lines)
+
+    def _remove_job(self, job_id: str | None) -> str:
+        if not job_id:
+            return "Error: job_id is required for remove"
+        if self._cron.remove_job(job_id):
+            return f"Removed job {job_id}"
+        return f"Job {job_id} not found"
nanobot/channels/feishu.py (new file, 263 lines)

@@ -0,0 +1,263 @@
+"""Feishu/Lark channel implementation using lark-oapi SDK with WebSocket long connection."""
+
+import asyncio
+import json
+import threading
+from collections import OrderedDict
+from typing import Any
+
+from loguru import logger
+
+from nanobot.bus.events import OutboundMessage
+from nanobot.bus.queue import MessageBus
+from nanobot.channels.base import BaseChannel
+from nanobot.config.schema import FeishuConfig
+
+try:
+    import lark_oapi as lark
+    from lark_oapi.api.im.v1 import (
+        CreateMessageRequest,
+        CreateMessageRequestBody,
+        CreateMessageReactionRequest,
+        CreateMessageReactionRequestBody,
+        Emoji,
+        P2ImMessageReceiveV1,
+    )
+    FEISHU_AVAILABLE = True
+except ImportError:
+    FEISHU_AVAILABLE = False
+    lark = None
+    Emoji = None
+
+# Message type display mapping
+MSG_TYPE_MAP = {
+    "image": "[image]",
+    "audio": "[audio]",
+    "file": "[file]",
+    "sticker": "[sticker]",
+}
+
+
+class FeishuChannel(BaseChannel):
+    """
+    Feishu/Lark channel using WebSocket long connection.
+
+    Uses WebSocket to receive events - no public IP or webhook required.
+
+    Requires:
+    - App ID and App Secret from Feishu Open Platform
+    - Bot capability enabled
+    - Event subscription enabled (im.message.receive_v1)
+    """
+
+    name = "feishu"
+
+    def __init__(self, config: FeishuConfig, bus: MessageBus):
+        super().__init__(config, bus)
+        self.config: FeishuConfig = config
+        self._client: Any = None
+        self._ws_client: Any = None
+        self._ws_thread: threading.Thread | None = None
+        self._processed_message_ids: OrderedDict[str, None] = OrderedDict()  # Ordered dedup cache
+        self._loop: asyncio.AbstractEventLoop | None = None
+
+    async def start(self) -> None:
+        """Start the Feishu bot with WebSocket long connection."""
+        if not FEISHU_AVAILABLE:
+            logger.error("Feishu SDK not installed. Run: pip install lark-oapi")
+            return
+
+        if not self.config.app_id or not self.config.app_secret:
+            logger.error("Feishu app_id and app_secret not configured")
+            return
+
+        self._running = True
+        self._loop = asyncio.get_running_loop()
+
+        # Create Lark client for sending messages
+        self._client = lark.Client.builder() \
+            .app_id(self.config.app_id) \
+            .app_secret(self.config.app_secret) \
+            .log_level(lark.LogLevel.INFO) \
+            .build()
+
+        # Create event handler (only register message receive, ignore other events)
+        event_handler = lark.EventDispatcherHandler.builder(
+            self.config.encrypt_key or "",
+            self.config.verification_token or "",
+        ).register_p2_im_message_receive_v1(
+            self._on_message_sync
+        ).build()
+
+        # Create WebSocket client for long connection
+        self._ws_client = lark.ws.Client(
+            self.config.app_id,
+            self.config.app_secret,
+            event_handler=event_handler,
+            log_level=lark.LogLevel.INFO
+        )
+
+        # Start WebSocket client in a separate thread
+        def run_ws():
+            try:
+                self._ws_client.start()
+            except Exception as e:
+                logger.error(f"Feishu WebSocket error: {e}")
+
+        self._ws_thread = threading.Thread(target=run_ws, daemon=True)
+        self._ws_thread.start()
+
+        logger.info("Feishu bot started with WebSocket long connection")
+        logger.info("No public IP required - using WebSocket to receive events")
+
+        # Keep running until stopped
+        while self._running:
+            await asyncio.sleep(1)
+
+    async def stop(self) -> None:
+        """Stop the Feishu bot."""
+        self._running = False
+        if self._ws_client:
+            try:
+                self._ws_client.stop()
+            except Exception as e:
+                logger.warning(f"Error stopping WebSocket client: {e}")
+        logger.info("Feishu bot stopped")
+
+    def _add_reaction_sync(self, message_id: str, emoji_type: str) -> None:
+        """Sync helper for adding reaction (runs in thread pool)."""
+        try:
+            request = CreateMessageReactionRequest.builder() \
+                .message_id(message_id) \
+                .request_body(
+                    CreateMessageReactionRequestBody.builder()
+                    .reaction_type(Emoji.builder().emoji_type(emoji_type).build())
+                    .build()
+                ).build()
+
+            response = self._client.im.v1.message_reaction.create(request)
+
+            if not response.success():
+                logger.warning(f"Failed to add reaction: code={response.code}, msg={response.msg}")
+            else:
+                logger.debug(f"Added {emoji_type} reaction to message {message_id}")
+        except Exception as e:
+            logger.warning(f"Error adding reaction: {e}")
+
+    async def _add_reaction(self, message_id: str, emoji_type: str = "THUMBSUP") -> None:
+        """
+        Add a reaction emoji to a message (non-blocking).
+
+        Common emoji types: THUMBSUP, OK, EYES, DONE, OnIt, HEART
+        """
+        if not self._client or not Emoji:
+            return
+
+        loop = asyncio.get_running_loop()
+        await loop.run_in_executor(None, self._add_reaction_sync, message_id, emoji_type)
+
+    async def send(self, msg: OutboundMessage) -> None:
+        """Send a message through Feishu."""
+        if not self._client:
+            logger.warning("Feishu client not initialized")
+            return
+
+        try:
+            # Determine receive_id_type based on chat_id format
+            # open_id starts with "ou_", chat_id starts with "oc_"
+            if msg.chat_id.startswith("oc_"):
+                receive_id_type = "chat_id"
+            else:
+                receive_id_type = "open_id"
+
+            # Build text message content
+            content = json.dumps({"text": msg.content})
+
+            request = CreateMessageRequest.builder() \
+                .receive_id_type(receive_id_type) \
+                .request_body(
+                    CreateMessageRequestBody.builder()
+                    .receive_id(msg.chat_id)
+                    .msg_type("text")
+                    .content(content)
+                    .build()
+                ).build()
+
+            response = self._client.im.v1.message.create(request)
+
+            if not response.success():
+                logger.error(
+                    f"Failed to send Feishu message: code={response.code}, "
+                    f"msg={response.msg}, log_id={response.get_log_id()}"
+                )
+            else:
+                logger.debug(f"Feishu message sent to {msg.chat_id}")
+
+        except Exception as e:
+            logger.error(f"Error sending Feishu message: {e}")
+
+    def _on_message_sync(self, data: "P2ImMessageReceiveV1") -> None:
+        """
+        Sync handler for incoming messages (called from WebSocket thread).
+        Schedules async handling in the main event loop.
+        """
+        if self._loop and self._loop.is_running():
+            asyncio.run_coroutine_threadsafe(self._on_message(data), self._loop)
+
+    async def _on_message(self, data: "P2ImMessageReceiveV1") -> None:
+        """Handle incoming message from Feishu."""
+        try:
+            event = data.event
+            message = event.message
+            sender = event.sender
+
+            # Deduplication check
+            message_id = message.message_id
+            if message_id in self._processed_message_ids:
+                return
+            self._processed_message_ids[message_id] = None
+
+            # Trim cache: evict oldest entries once the size exceeds 1000
+            while len(self._processed_message_ids) > 1000:
+                self._processed_message_ids.popitem(last=False)
+
+            # Skip bot messages
+            sender_type = sender.sender_type
+            if sender_type == "bot":
+                return
+
+            sender_id = sender.sender_id.open_id if sender.sender_id else "unknown"
+            chat_id = message.chat_id
+            chat_type = message.chat_type  # "p2p" or "group"
+            msg_type = message.message_type
+
+            # Add reaction to indicate "seen"
+            await self._add_reaction(message_id, "THUMBSUP")
+
+            # Parse message content
+            if msg_type == "text":
+                try:
+                    content = json.loads(message.content).get("text", "")
+                except json.JSONDecodeError:
+                    content = message.content or ""
+            else:
+                content = MSG_TYPE_MAP.get(msg_type, f"[{msg_type}]")
+
+            if not content:
+                return
+
+            # Forward to message bus
+            reply_to = chat_id if chat_type == "group" else sender_id
+            await self._handle_message(
+                sender_id=sender_id,
+                chat_id=reply_to,
+                content=content,
+                metadata={
+                    "message_id": message_id,
+                    "chat_type": chat_type,
+                    "msg_type": msg_type,
+                }
+            )

+        except Exception as e:
+            logger.error(f"Error processing Feishu message: {e}")
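The `_processed_message_ids` cache above is a bounded seen-set: an `OrderedDict` used as insertion-ordered keys, with the oldest entries evicted once the size cap is exceeded. The same pattern, isolated and shrunk to a window of 3 so eviction is visible (a sketch, not the repo's class):

```python
from collections import OrderedDict

class DedupCache:
    """Bounded seen-set in insertion order, like _processed_message_ids above."""
    def __init__(self, max_size: int = 1000):
        self._seen: OrderedDict[str, None] = OrderedDict()
        self._max_size = max_size

    def seen_before(self, message_id: str) -> bool:
        if message_id in self._seen:
            return True
        self._seen[message_id] = None
        # Evict oldest entries first once the bound is exceeded
        while len(self._seen) > self._max_size:
            self._seen.popitem(last=False)
        return False

cache = DedupCache(max_size=3)
assert cache.seen_before("m1") is False
assert cache.seen_before("m1") is True   # duplicate detected
for mid in ("m2", "m3", "m4"):
    cache.seen_before(mid)               # pushes "m1" out of the window
assert cache.seen_before("m1") is False  # old IDs can recur after eviction
```

The trade-off is the usual one for dedup windows: very old duplicates can slip through after eviction, in exchange for bounded memory on a long-lived connection.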
@@ -55,6 +55,17 @@ class ChannelManager:
             logger.info("WhatsApp channel enabled")
         except ImportError as e:
             logger.warning(f"WhatsApp channel not available: {e}")

+        # Feishu channel
+        if self.config.channels.feishu.enabled:
+            try:
+                from nanobot.channels.feishu import FeishuChannel
+                self.channels["feishu"] = FeishuChannel(
+                    self.config.channels.feishu, self.bus
+                )
+                logger.info("Feishu channel enabled")
+            except ImportError as e:
+                logger.warning(f"Feishu channel not available: {e}")
+
     async def start_all(self) -> None:
         """Start WhatsApp channel and the outbound dispatcher."""
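Each channel above is registered with the same try/except-ImportError shape, so a missing optional dependency only logs a warning instead of crashing the gateway. A generic sketch of that pattern (`try_enable` and `make_feishu` are hypothetical names for illustration):

```python
import logging

logger = logging.getLogger("channels")

def try_enable(channels: dict, name: str, importer) -> None:
    """Register a channel only when its optional dependency imports cleanly,
    mirroring the try/except ImportError pattern above."""
    try:
        channels[name] = importer()
        logger.info("%s channel enabled", name)
    except ImportError as e:
        logger.warning("%s channel not available: %s", name, e)

channels: dict = {}

def make_feishu():
    import lark_oapi  # optional dependency; missing on a bare install
    return object()

try_enable(channels, "feishu", make_feishu)
print("feishu" in channels)  # False unless lark-oapi happens to be installed
```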
@ -195,7 +195,11 @@ def gateway(
|
|||||||
default_model=config.agents.defaults.model
|
default_model=config.agents.defaults.model
|
||||||
)
|
)
|
||||||
|
|
||||||
# Create agent
|
# Create cron service first (callback set after agent creation)
|
||||||
|
cron_store_path = get_data_dir() / "cron" / "jobs.json"
|
||||||
|
cron = CronService(cron_store_path)
|
||||||
|
|
||||||
|
# Create agent with cron service
|
||||||
agent = AgentLoop(
|
agent = AgentLoop(
|
||||||
bus=bus,
|
bus=bus,
|
||||||
provider=provider,
|
provider=provider,
|
||||||
@@ -204,27 +208,27 @@ def gateway(
         max_iterations=config.agents.defaults.max_tool_iterations,
         brave_api_key=config.tools.web.search.api_key or None,
         exec_config=config.tools.exec,
+        cron_service=cron,
     )
 
-    # Create cron service
+    # Set cron callback (needs agent)
     async def on_cron_job(job: CronJob) -> str | None:
         """Execute a cron job through the agent."""
         response = await agent.process_direct(
             job.payload.message,
-            session_key=f"cron:{job.id}"
+            session_key=f"cron:{job.id}",
+            channel=job.payload.channel or "cli",
+            chat_id=job.payload.to or "direct",
         )
-        # Optionally deliver to channel
         if job.payload.deliver and job.payload.to:
             from nanobot.bus.events import OutboundMessage
             await bus.publish_outbound(OutboundMessage(
-                channel=job.payload.channel or "whatsapp",
+                channel=job.payload.channel or "cli",
                 chat_id=job.payload.to,
                 content=response or ""
             ))
         return response
+    cron.on_job = on_cron_job
-    cron_store_path = get_data_dir() / "cron" / "jobs.json"
-    cron = CronService(cron_store_path, on_job=on_cron_job)
 
     # Create heartbeat service
     async def on_heartbeat(prompt: str) -> str:
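The hunk above resolves a creation-order cycle: the agent now receives `cron_service=cron` at construction time, so the cron service must exist before the agent, and the job callback (which needs the agent) is attached afterwards via `cron.on_job = on_cron_job`. A minimal sketch of that wiring pattern, using hypothetical stand-in classes rather than nanobot's real `CronService`/`AgentLoop`:

```python
from typing import Callable, Optional

class CronService:
    """Stand-in: holds a job callback that can be attached after construction."""
    def __init__(self) -> None:
        self.on_job: Optional[Callable[[str], str]] = None

    def fire(self, message: str) -> Optional[str]:
        # Invoke the callback if one has been wired up.
        return self.on_job(message) if self.on_job else None

class Agent:
    """Stand-in: needs the cron service at construction time."""
    def __init__(self, cron_service: CronService) -> None:
        self.cron_service = cron_service

    def process_direct(self, message: str) -> str:
        return f"handled: {message}"

cron = CronService()                 # 1. create the service first
agent = Agent(cron_service=cron)     # 2. the agent can now reference it
cron.on_job = agent.process_direct   # 3. wire the agent-dependent callback last
print(cron.fire("check stars"))
```

Attaching the callback after both objects exist avoids passing a half-initialized agent into the service constructor.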
@@ -17,12 +17,24 @@ class TelegramConfig(BaseModel):
     enabled: bool = False
     token: str = ""  # Bot token from @BotFather
     allow_from: list[str] = Field(default_factory=list)  # Allowed user IDs or usernames
+    proxy: str | None = None  # HTTP/SOCKS5 proxy URL, e.g. "http://127.0.0.1:7890" or "socks5://127.0.0.1:1080"
+
+
+class FeishuConfig(BaseModel):
+    """Feishu/Lark channel configuration using WebSocket long connection."""
+    enabled: bool = False
+    app_id: str = ""  # App ID from Feishu Open Platform
+    app_secret: str = ""  # App Secret from Feishu Open Platform
+    encrypt_key: str = ""  # Encrypt Key for event subscription (optional)
+    verification_token: str = ""  # Verification Token for event subscription (optional)
+    allow_from: list[str] = Field(default_factory=list)  # Allowed user open_ids
 
 
 class ChannelsConfig(BaseModel):
     """Configuration for chat channels."""
     whatsapp: WhatsAppConfig = Field(default_factory=WhatsAppConfig)
     telegram: TelegramConfig = Field(default_factory=TelegramConfig)
+    feishu: FeishuConfig = Field(default_factory=FeishuConfig)
 
 
 class AgentDefaults(BaseModel):
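The config hunk above leans on `Field(default_factory=...)` so every `ChannelsConfig` gets its own fresh sub-config and list, rather than all instances sharing one mutable default. The same semantics can be sketched with stdlib dataclasses (hypothetical minimal fields, not the full pydantic models):

```python
from dataclasses import dataclass, field

@dataclass
class FeishuConfig:
    enabled: bool = False
    # default_factory: each instance gets its own empty list
    allow_from: list[str] = field(default_factory=list)

@dataclass
class ChannelsConfig:
    # default_factory: each instance gets its own FeishuConfig
    feishu: FeishuConfig = field(default_factory=FeishuConfig)

a, b = ChannelsConfig(), ChannelsConfig()
a.feishu.allow_from.append("ou_123")
print(b.feishu.allow_from)  # [] -- b's list is untouched by the mutation of a's
```

If the default were a shared instance instead of a factory, mutating `a` would leak into `b` (dataclasses and pydantic both reject bare mutable defaults for exactly this reason).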
@@ -50,6 +62,7 @@ class ProvidersConfig(BaseModel):
     anthropic: ProviderConfig = Field(default_factory=ProviderConfig)
     openai: ProviderConfig = Field(default_factory=ProviderConfig)
     openrouter: ProviderConfig = Field(default_factory=ProviderConfig)
+    deepseek: ProviderConfig = Field(default_factory=ProviderConfig)
     groq: ProviderConfig = Field(default_factory=ProviderConfig)
     zhipu: ProviderConfig = Field(default_factory=ProviderConfig)
     vllm: ProviderConfig = Field(default_factory=ProviderConfig)
@@ -99,9 +112,10 @@ class Config(BaseSettings):
         return Path(self.agents.defaults.workspace).expanduser()
 
     def get_api_key(self) -> str | None:
-        """Get API key in priority order: OpenRouter > Anthropic > OpenAI > Gemini > Zhipu > Groq > vLLM."""
+        """Get API key in priority order: OpenRouter > DeepSeek > Anthropic > OpenAI > Gemini > Zhipu > Groq > vLLM."""
         return (
             self.providers.openrouter.api_key or
+            self.providers.deepseek.api_key or
             self.providers.anthropic.api_key or
             self.providers.openai.api_key or
             self.providers.gemini.api_key or
@@ -43,6 +43,8 @@ class LiteLLMProvider(LLMProvider):
         elif self.is_vllm:
             # vLLM/custom endpoint - uses OpenAI-compatible API
             os.environ["OPENAI_API_KEY"] = api_key
+        elif "deepseek" in default_model:
+            os.environ.setdefault("DEEPSEEK_API_KEY", api_key)
         elif "anthropic" in default_model:
             os.environ.setdefault("ANTHROPIC_API_KEY", api_key)
         elif "openai" in default_model or "gpt" in default_model:
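The provider hunk above routes the configured key to a provider-specific environment variable by substring-matching the default model name, and uses `setdefault` so a variable the user already exported is left alone. A hedged sketch of that dispatch (the function name and fallback branch are illustrative, not nanobot's exact code):

```python
import os

def route_api_key(default_model: str, api_key: str) -> str:
    """Pick the env var for the key from a substring of the model name."""
    if "deepseek" in default_model:
        var = "DEEPSEEK_API_KEY"
    elif "anthropic" in default_model:
        var = "ANTHROPIC_API_KEY"
    else:
        # Illustrative fallback for OpenAI-compatible endpoints.
        var = "OPENAI_API_KEY"
    # setdefault: only set the variable if the user hasn't exported it already.
    os.environ.setdefault(var, api_key)
    return var

print(route_api_key("deepseek/deepseek-chat", "sk-test"))  # DEEPSEEK_API_KEY
```

Note the branch order matters with substring matching: a model string containing two provider names would be claimed by whichever branch is checked first.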
nanobot/skills/cron/SKILL.md (new file, 40 lines)
@@ -0,0 +1,40 @@
+---
+name: cron
+description: Schedule reminders and recurring tasks.
+---
+
+# Cron
+
+Use the `cron` tool to schedule reminders or recurring tasks.
+
+## Two Modes
+
+1. **Reminder** - message is sent directly to user
+2. **Task** - message is a task description, agent executes and sends result
+
+## Examples
+
+Fixed reminder:
+```
+cron(action="add", message="Time to take a break!", every_seconds=1200)
+```
+
+Dynamic task (agent executes each time):
+```
+cron(action="add", message="Check HKUDS/nanobot GitHub stars and report", every_seconds=600)
+```
+
+List/remove:
+```
+cron(action="list")
+cron(action="remove", job_id="abc123")
+```
+
+## Time Expressions
+
+| User says | Parameters |
+|-----------|------------|
+| every 20 minutes | every_seconds: 1200 |
+| every hour | every_seconds: 3600 |
+| every day at 8am | cron_expr: "0 8 * * *" |
+| weekdays at 5pm | cron_expr: "0 17 * * 1-5" |
@@ -32,6 +32,9 @@ dependencies = [
 ]
 
 [project.optional-dependencies]
+feishu = [
+    "lark-oapi>=1.0.0",
+]
 dev = [
     "pytest>=7.0.0",
     "pytest-asyncio>=0.21.0",
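Because the Feishu SDK is declared as an optional extra above (installed via something like `pip install "nanobot[feishu]"`), code that touches the Feishu channel may run without it present. One common pattern is to probe for the module before importing; the import name `lark_oapi` is assumed from the `lark-oapi` distribution name:

```python
import importlib.util

def feishu_available() -> bool:
    """Return True if the optional Feishu SDK (lark_oapi) is importable."""
    return importlib.util.find_spec("lark_oapi") is not None

# A channel loader could use this to fail with a clear message:
if not feishu_available():
    print("Feishu channel disabled: install with `pip install nanobot[feishu]`")
```

`find_spec` checks importability without actually importing the package, so the probe is cheap and side-effect free.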