31 Commits

Author SHA1 Message Date
Cheng Wang
3ba0191cef fix: correct heartbeat token matching logic
The HEARTBEAT_OK_TOKEN comparison was broken because the token
itself ("HEARTBEAT_OK" with underscore) was being compared against
a response string that had underscores removed. This made the
condition always fail, preventing the heartbeat service from
recognizing "no tasks" responses.

Now both sides of the comparison remove underscores consistently,
allowing proper matching of the HEARTBEAT_OK token.
2026-02-02 19:47:42 +08:00
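The fix described above can be sketched as follows. This is a minimal illustration of the normalize-both-sides comparison, not the project's actual code; `_strip_underscores` and `looks_like_heartbeat_ok` are hypothetical names.

```python
HEARTBEAT_OK_TOKEN = "HEARTBEAT_OK"


def _strip_underscores(text: str) -> str:
    return text.replace("_", "")


def looks_like_heartbeat_ok(response: str) -> bool:
    # Normalize BOTH sides before comparing. Previously only the response
    # had its underscores removed, so it could never equal the literal
    # "HEARTBEAT_OK" token and "no tasks" replies went unrecognized.
    token = _strip_underscores(HEARTBEAT_OK_TOKEN)
    return _strip_underscores(response.strip()) == token
```

With consistent normalization, both `"HEARTBEAT_OK"` and `"HEARTBEATOK"` match the token, while ordinary task responses do not.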
Cheng Wang
ea849650ef feat: improve web_fetch URL validation and security
Add URL validation and redirect limits to web_fetch tool to prevent potential security issues:

- Add _validate_url() function to validate URLs before fetching
  - Only allow http:// and https:// schemes (prevent file://, ftp://, etc.)
  - Verify URL has valid scheme and domain
  - Return descriptive error messages for invalid URLs

- Limit HTTP redirects to 5 (down from default 20) to prevent DoS attacks
  - Add MAX_REDIRECTS constant for easy configuration
  - Explicitly configure httpx.AsyncClient with max_redirects parameter

- Improve error handling with JSON error responses for validation failures

This addresses security concerns identified in code review where web_fetch
had no URL validation or redirect limits, potentially allowing:
- Unsafe URL schemes (file://, etc.)
- Redirect-based DoS attacks
- Invalid URL formats causing unclear errors
2026-02-02 19:34:22 +08:00
Xubin Ren
229fde021a Merge pull request #8 from Neutralmilkzzz/fix-port-conflict
229fde021a Merge pull request #8 from Neutralmilkzzz/fix-port-conflict
Change default gateway port to 18790
2026-02-02 14:16:01 +08:00
Neutral Milk
c400786b17 chore: change default gateway port to 18790 to avoid conflict with OpenClaw 2026-02-02 13:35:44 +08:00
chaohuang-ai
4ba8cc0f8f Update README.md 2026-02-02 12:59:36 +08:00
Xubin Ren
2049e1a826 Merge pull request #4 from ZhihaoZhang97/feature/vllm-support
feat: add vLLM/local LLM support
2026-02-02 11:15:49 +08:00
ZhihaoZhang97
2b19dcf9fd feat: add vLLM/local LLM support
- Add vllm provider configuration in config schema
- Auto-detect vLLM endpoints and use hosted_vllm/ prefix for LiteLLM
- Pass api_base directly to acompletion for custom endpoints
- Add vLLM status display in CLI status command
- Add vLLM setup documentation in README
2026-02-02 11:23:04 +11:00
Re-bin
959c4dadf8 release 0.1.3.post3 2026-02-01 18:53:49 +00:00
Re-bin
ac527d40d7 fix: unify skill metadata format 2026-02-01 18:45:42 +00:00
Re-bin
d888e51d1c feat(telegram): markdown support 2026-02-01 18:35:27 +00:00
Re-bin
dbb070e5fd add pypi download count 2026-02-01 18:18:40 +00:00
Re-bin
1662208183 add feishu & wechat group 2026-02-01 18:17:56 +00:00
Re-bin
0c9aa3ed6c update readme 2026-02-01 16:45:51 +00:00
Re-bin
76df1bc795 release 0.1.3.post2 2026-02-01 16:42:27 +00:00
Re-bin
a3ed0c817e update readme 2026-02-01 16:38:13 +00:00
Re-bin
bbca63ddd5 update readme 2026-02-01 16:35:59 +00:00
Re-bin
051a97fa4e feat: add sub-agent system 2026-02-01 16:28:45 +00:00
Re-bin
c8a1190064 update readme 2026-02-01 15:33:43 +00:00
Re-bin
5437d072ad update logo 2026-02-01 15:32:59 +00:00
chaohuang-ai
f18238e19d Update README.md 2026-02-01 23:24:09 +08:00
chaohuang-ai
d9b70f7fb4 Update README.md 2026-02-01 21:50:35 +08:00
chaohuang-ai
e095dccbce Update README.md 2026-02-01 21:27:10 +08:00
chaohuang-ai
aec54b9043 Update README.md 2026-02-01 21:17:31 +08:00
chaohuang-ai
969dde083b Update README.md 2026-02-01 21:16:40 +08:00
chaohuang-ai
de25511873 Update README.md 2026-02-01 21:12:39 +08:00
chaohuang-ai
a046413fac Update README.md 2026-02-01 21:03:05 +08:00
chaohuang-ai
31b47cedfb Update README.md 2026-02-01 20:55:10 +08:00
chaohuang-ai
f42b65b203 Update README.md 2026-02-01 20:54:22 +08:00
Re-bin
662a5d6f01 update readme 2026-02-01 07:39:23 +00:00
Re-bin
d4cc48afd5 🐈nanobot: hello world! 2026-02-01 07:36:42 +00:00
Xubin Ren
086d65ace5 Initial commit 2026-02-01 15:16:16 +08:00