Home Voice Agent

Main mono-repo for the Atlas voice agent system.

Project Structure

home-voice-agent/
├── llm-servers/          # LLM inference servers
│   ├── 4080/             # Work agent (Llama 3.1 70B Q4)
│   └── 1050/             # Family agent (Phi-3 Mini 3.8B Q4)
├── mcp-server/           # MCP tool server (JSON-RPC 2.0)
├── wake-word/            # Wake-word detection node
├── asr/                  # ASR service (faster-whisper)
├── tts/                  # TTS service
├── clients/              # Front-end applications
│   ├── phone/            # Phone PWA
│   └── web-dashboard/    # Web dashboard
├── routing/              # LLM routing layer
├── conversation/         # Conversation management
├── memory/               # Long-term memory
├── safety/               # Safety and boundary enforcement
├── admin/                # Admin tools
└── infrastructure/       # Deployment scripts, Dockerfiles

Quick Start

1. MCP Server

cd mcp-server
pip install -r requirements.txt
python server/mcp_server.py
# Server runs on http://localhost:8000
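The MCP server speaks JSON-RPC 2.0 over HTTP (see Project Structure above). A minimal client sketch follows; the root endpoint path and the `tools/list` method name are assumptions drawn from the general MCP convention, not confirmed against this repo's server code:

```python
import json
import urllib.request


def jsonrpc_request(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 request envelope."""
    req = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        req["params"] = params
    return req


def call_mcp(method, params=None, url="http://localhost:8000"):
    """POST a JSON-RPC request to the MCP server (endpoint path assumed)."""
    body = json.dumps(jsonrpc_request(method, params)).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    # Print the envelope only; run call_mcp(...) once the server is up.
    print(json.dumps(jsonrpc_request("tools/list"), indent=2))
```

With the server running, `call_mcp("tools/list")` should return a JSON-RPC response object whose `result` lists the registered tools.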

2. LLM Servers

4080 Server (Work Agent):

cd llm-servers/4080
./setup.sh
ollama serve

1050 Server (Family Agent):

cd llm-servers/1050
./setup.sh
OLLAMA_HOST=0.0.0.0 ollama serve   # bind address is set via OLLAMA_HOST; ollama serve has no --host flag
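Once `ollama serve` is up, both agents are reachable through Ollama's standard HTTP API on port 11434. A hedged sketch of a non-streaming generation call; the model tag `phi3:mini` is a guess at what the setup scripts pull, so substitute whatever `ollama list` reports:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_payload(prompt, model="phi3:mini"):
    """Assemble a one-shot (non-streaming) /api/generate request body."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt, model="phi3:mini", url=OLLAMA_URL):
    """Send the prompt to the local Ollama server and return its text reply."""
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Print the request body only; call generate(...) with a server running.
    print(json.dumps(build_payload("Say hello."), indent=2))
```

The routing layer (`routing/`) can sit in front of calls like this, pointing `url` at the 4080 host for work queries and the 1050 host for family queries.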

Status

  • MCP Server: Implemented (TICKET-029)
  • 🔄 LLM Servers: Setup scripts ready (TICKET-021, TICKET-022)
  • Voice I/O: Pending (TICKET-006, TICKET-010, TICKET-014)
  • Clients: Pending (TICKET-039, TICKET-040)

Documentation

See parent atlas/ repo for:

  • Architecture documentation
  • Technology evaluations
  • Implementation guides
  • Ticket tracking