**CI status:** some checks failed
- CI / backend-test (push): successful in 4m9s
- CI / frontend-test (push): failing after 3m48s
- CI / lint-python (push): successful in 1m41s
- CI / secret-scanning (push): successful in 1m20s
- CI / dependency-scan (push): successful in 10m50s
- CI / workflow-summary (push): successful in 1m11s
## Features Added
### Document Reference System
- Implemented numbered document references (@1, @2, etc.) with autocomplete dropdown
- Added fuzzy filename matching for @filename references
- Document filtering now prioritizes numeric refs over filename refs, falling back to all documents when no references are present
- Autocomplete dropdown appears when typing @, with keyboard navigation (Up/Down, Enter/Tab, Escape)
- Document numbers displayed in UI for easy reference
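As a minimal sketch of the priority order described above (the names and regexes here are illustrative, not the actual `docs_context.py` implementation):

```python
import re

def resolve_references(text: str, documents: list[dict]) -> list[dict]:
    """Resolve @N and @filename references against uploaded documents.

    documents: list of {"number": int, "filename": str} dicts.
    Numeric refs win over filename refs; with no refs, all documents apply.
    """
    # 1. Numeric references (@1, @2, ...) take top priority
    numeric = {int(n) for n in re.findall(r"@(\d+)\b", text)}
    if numeric:
        return [d for d in documents if d["number"] in numeric]

    # 2. Filename references (@report, @notes.txt, ...) come next
    names = re.findall(r"@([\w.\-]+)", text)
    if names:
        # Loose match: a reference hits if it is a substring of the filename
        return [d for d in documents
                if any(n.lower() in d["filename"].lower() for n in names)]

    # 3. No references: every document applies
    return documents
```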
### Conversation Management
- Added conversation rename functionality with inline editing
- Implemented conversation search (by title and content)
- Search box always visible, even when no conversations exist
- Export reports now replace @N references with actual filenames
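The title-and-content search can be sketched roughly like this (a simplified stand-in for the real endpoint logic, operating on the stored conversation dicts):

```python
def search_conversations(conversations: list[dict], query: str) -> list[dict]:
    """Return conversations whose title or any message content contains query."""
    q = query.lower()

    def matches(conv: dict) -> bool:
        if q in conv.get("title", "").lower():
            return True
        # Assistant messages may store stages rather than "content";
        # str() keeps the check safe for any message shape.
        return any(q in str(m.get("content", "")).lower()
                   for m in conv.get("messages", []))

    return [c for c in conversations if matches(c)]
```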
### UI/UX Improvements
- Removed debug toggle button
- Improved text contrast in dark mode (better visibility)
- Made input textarea expand to full available width
- Fixed file text color for better readability
- Enhanced document display with numbered badges
### Configuration & Timeouts
- Made HTTP client timeouts configurable (connect, write, pool)
- Added .env.example with all configuration options
- Updated timeout documentation
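As a sketch, the configurable timeouts might be read from the environment like this (the variable names and defaults are illustrative; see `.env.example` for the real ones):

```python
import os

def http_timeouts() -> dict:
    """Read HTTP client timeouts (seconds) from the environment, with defaults."""
    return {
        "connect": float(os.getenv("HTTP_CONNECT_TIMEOUT", "10")),
        "write": float(os.getenv("HTTP_WRITE_TIMEOUT", "30")),
        "pool": float(os.getenv("HTTP_POOL_TIMEOUT", "10")),
    }
```

If the backend's HTTP client is httpx, the dict can be splatted into `httpx.Timeout(read=..., **http_timeouts())`.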
### Developer Experience
- Added `make test-setup` target for automated test conversation creation
- Test setup script supports TEST_MESSAGE and TEST_DOCS env vars
- Improved Makefile with dev and test-setup targets
### Documentation
- Updated ARCHITECTURE.md with all new features
- Created comprehensive deployment documentation
- Added GPU VM setup guides
- Removed unnecessary markdown files (CLAUDE.md, CONTRIBUTING.md, header.jpg)
- Organized documentation in docs/ directory
### GPU VM / Ollama (Stability + GPU Offload)
- Updated GPU VM docs to reflect the working systemd environment for remote Ollama
- Standardized remote Ollama port to 11434 (and added /v1/models verification)
- Documented required env for GPU offload on this VM:
  - `OLLAMA_MODELS=/mnt/data/ollama`, `HOME=/mnt/data/ollama/home`
  - `OLLAMA_LLM_LIBRARY=cuda_v12` (not `cuda`)
  - `LD_LIBRARY_PATH=/usr/local/lib/ollama:/usr/local/lib/ollama/cuda_v12`
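For reference, the documented variables map onto a systemd drop-in along these lines (the path and layout are a sketch; consult the GPU VM docs for the exact unit):

```ini
# /etc/systemd/system/ollama.service.d/override.conf (illustrative path)
[Service]
Environment="OLLAMA_MODELS=/mnt/data/ollama"
Environment="HOME=/mnt/data/ollama/home"
Environment="OLLAMA_LLM_LIBRARY=cuda_v12"
Environment="LD_LIBRARY_PATH=/usr/local/lib/ollama:/usr/local/lib/ollama/cuda_v12"
```

After editing, run `systemctl daemon-reload && systemctl restart ollama`, then verify with `curl http://<vm-host>:11434/v1/models`.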
## Technical Changes
### Backend
- Enhanced `docs_context.py` with reference parsing (numeric and filename)
- Added `update_conversation_title` to storage.py
- New endpoints: PATCH /api/conversations/{id}/title, GET /api/conversations/search
- Improved report generation with filename substitution
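The @N-to-filename substitution in exported reports can be sketched as follows (the regex and helper name are illustrative, not the actual report-generation code):

```python
import re

def substitute_refs(text: str, documents: list[dict]) -> str:
    """Replace @N references with the corresponding document filenames."""
    by_number = {str(d["number"]): d["filename"] for d in documents}
    return re.sub(
        r"@(\d+)\b",
        lambda m: by_number.get(m.group(1), m.group(0)),  # leave unknown refs as-is
        text,
    )
```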
### Frontend
- Removed debugMode state and related code
- Added autocomplete dropdown component
- Implemented search functionality in Sidebar
- Enhanced ChatInterface with autocomplete and improved textarea sizing
- Updated CSS for better contrast and responsive design
## Files Changed
- Backend: config.py, council.py, docs_context.py, main.py, storage.py
- Frontend: App.jsx, ChatInterface.jsx, Sidebar.jsx, and related CSS files
- Documentation: README.md, ARCHITECTURE.md, new docs/ directory
- Configuration: .env.example, Makefile
- Scripts: scripts/test_setup.py
## Breaking Changes
None; all changes are backward compatible.
## Testing
- All existing backend tests pass (per the CI summary above, frontend-test is failing in the current run)
- New test-setup script validates conversation creation workflow
- Manual testing of autocomplete, search, and rename features
## storage.py (225 lines, 5.9 KiB, Python)
"""JSON-based storage for conversations."""
|
|
|
|
import json
|
|
import os
|
|
from datetime import datetime
|
|
from typing import List, Dict, Any, Optional
|
|
from pathlib import Path
|
|
from .config import DATA_DIR
|
|
|
|
|
|
def ensure_data_dir():
|
|
"""Ensure the data directory exists."""
|
|
Path(DATA_DIR).mkdir(parents=True, exist_ok=True)
|
|
|
|
|
|
def get_conversation_path(conversation_id: str) -> str:
|
|
"""Get the file path for a conversation."""
|
|
return os.path.join(DATA_DIR, f"{conversation_id}.json")
|
|
|
|
|
|
def create_conversation(conversation_id: str) -> Dict[str, Any]:
|
|
"""
|
|
Create a new conversation.
|
|
|
|
Args:
|
|
conversation_id: Unique identifier for the conversation
|
|
|
|
Returns:
|
|
New conversation dict
|
|
"""
|
|
ensure_data_dir()
|
|
|
|
conversation = {
|
|
"id": conversation_id,
|
|
"created_at": datetime.utcnow().isoformat(),
|
|
"title": "New Conversation",
|
|
"messages": []
|
|
}
|
|
|
|
# Save to file
|
|
path = get_conversation_path(conversation_id)
|
|
with open(path, 'w') as f:
|
|
json.dump(conversation, f, indent=2)
|
|
|
|
return conversation
|
|
|
|
|
|
def get_conversation(conversation_id: str) -> Optional[Dict[str, Any]]:
|
|
"""
|
|
Load a conversation from storage.
|
|
|
|
Args:
|
|
conversation_id: Unique identifier for the conversation
|
|
|
|
Returns:
|
|
Conversation dict or None if not found
|
|
"""
|
|
path = get_conversation_path(conversation_id)
|
|
|
|
if not os.path.exists(path):
|
|
return None
|
|
|
|
with open(path, 'r') as f:
|
|
return json.load(f)
|
|
|
|
|
|
def save_conversation(conversation: Dict[str, Any]):
|
|
"""
|
|
Save a conversation to storage.
|
|
|
|
Args:
|
|
conversation: Conversation dict to save
|
|
"""
|
|
ensure_data_dir()
|
|
|
|
path = get_conversation_path(conversation['id'])
|
|
with open(path, 'w') as f:
|
|
json.dump(conversation, f, indent=2)
|
|
|
|
|
|
def list_conversations(include_archived: bool = False) -> List[Dict[str, Any]]:
|
|
"""
|
|
List all conversations (metadata only).
|
|
|
|
Args:
|
|
include_archived: If True, include archived conversations
|
|
|
|
Returns:
|
|
List of conversation metadata dicts
|
|
"""
|
|
ensure_data_dir()
|
|
|
|
conversations = []
|
|
for filename in os.listdir(DATA_DIR):
|
|
if filename.endswith('.json'):
|
|
path = os.path.join(DATA_DIR, filename)
|
|
with open(path, 'r') as f:
|
|
data = json.load(f)
|
|
# Return metadata only
|
|
conversations.append({
|
|
"id": data["id"],
|
|
"created_at": data["created_at"],
|
|
"title": data.get("title", "New Conversation"),
|
|
"message_count": len(data["messages"])
|
|
})
|
|
|
|
# Sort by creation time, newest first
|
|
conversations.sort(key=lambda x: x["created_at"], reverse=True)
|
|
|
|
return conversations
|
|
|
|
|
|
def add_user_message(conversation_id: str, content: str):
|
|
"""
|
|
Add a user message to a conversation.
|
|
|
|
Args:
|
|
conversation_id: Conversation identifier
|
|
content: User message content
|
|
"""
|
|
conversation = get_conversation(conversation_id)
|
|
if conversation is None:
|
|
raise ValueError(f"Conversation {conversation_id} not found")
|
|
|
|
conversation["messages"].append({
|
|
"role": "user",
|
|
"content": content
|
|
})
|
|
|
|
save_conversation(conversation)
|
|
|
|
|
|
def add_assistant_message(
|
|
conversation_id: str,
|
|
stage1: List[Dict[str, Any]],
|
|
stage2: List[Dict[str, Any]],
|
|
stage3: Dict[str, Any],
|
|
metadata: Optional[Dict[str, Any]] = None
|
|
):
|
|
"""
|
|
Add an assistant message with all 3 stages to a conversation.
|
|
|
|
Args:
|
|
conversation_id: Conversation identifier
|
|
stage1: List of individual model responses
|
|
stage2: List of model rankings
|
|
stage3: Final synthesized response
|
|
metadata: Optional metadata dict with timing and other info
|
|
"""
|
|
conversation = get_conversation(conversation_id)
|
|
if conversation is None:
|
|
raise ValueError(f"Conversation {conversation_id} not found")
|
|
|
|
message = {
|
|
"role": "assistant",
|
|
"stage1": stage1,
|
|
"stage2": stage2,
|
|
"stage3": stage3
|
|
}
|
|
if metadata:
|
|
message["metadata"] = metadata
|
|
|
|
conversation["messages"].append(message)
|
|
|
|
save_conversation(conversation)
|
|
|
|
|
|
def update_conversation_title(conversation_id: str, title: str):
|
|
"""
|
|
Update the title of a conversation.
|
|
|
|
Args:
|
|
conversation_id: Conversation identifier
|
|
title: New title for the conversation
|
|
"""
|
|
conversation = get_conversation(conversation_id)
|
|
if conversation is None:
|
|
raise ValueError(f"Conversation {conversation_id} not found")
|
|
|
|
conversation["title"] = title
|
|
save_conversation(conversation)
|
|
|
|
|
|
def delete_conversation(conversation_id: str):
|
|
"""
|
|
Delete a conversation (and its associated documents).
|
|
|
|
Args:
|
|
conversation_id: Conversation identifier
|
|
"""
|
|
path = get_conversation_path(conversation_id)
|
|
if not os.path.exists(path):
|
|
raise ValueError(f"Conversation {conversation_id} not found")
|
|
|
|
# Delete the conversation file
|
|
os.remove(path)
|
|
|
|
# Also delete associated documents directory
|
|
from .documents import _conversation_dir
|
|
docs_dir = _conversation_dir(conversation_id)
|
|
if docs_dir.exists():
|
|
import shutil
|
|
shutil.rmtree(docs_dir, ignore_errors=True)
|
|
|
|
|
|
def archive_conversation(conversation_id: str, archived: bool = True):
|
|
"""
|
|
Archive or unarchive a conversation.
|
|
|
|
Args:
|
|
conversation_id: Conversation identifier
|
|
archived: True to archive, False to unarchive
|
|
"""
|
|
conversation = get_conversation(conversation_id)
|
|
if conversation is None:
|
|
raise ValueError(f"Conversation {conversation_id} not found")
|
|
|
|
conversation["archived"] = archived
|
|
if archived:
|
|
conversation["archived_at"] = datetime.utcnow().isoformat()
|
|
else:
|
|
conversation.pop("archived_at", None)
|
|
|
|
save_conversation(conversation)
|