Merge pull request #459 from willmiao/codex/complete-refactor-with-tests-and-documentation

Add recipe route integration tests and update architecture docs
This commit is contained in:
pixelpaws
2025-09-22 14:19:41 +08:00
committed by GitHub
3 changed files with 427 additions and 32 deletions

View File

@@ -233,6 +233,32 @@ You can now run LoRA Manager independently from ComfyUI:
This standalone mode provides a lightweight option for managing your model and recipe collection without needing to run the full ComfyUI environment, making it useful even for users who primarily use other stable diffusion interfaces.
## Developer notes
The REST layer is split into modular registrars, controllers, and handler sets
to simplify maintenance:
* `py/routes/recipe_route_registrar.py` holds the declarative endpoint list.
* `py/routes/base_recipe_routes.py` wires shared services/templates and returns
the handler mapping consumed by `RecipeRouteRegistrar`.
* `py/routes/handlers/recipe_handlers.py` groups HTTP adapters by concern (page
rendering, listings, queries, mutations, sharing) and delegates business rules
to services in `py/services/recipes/`.
To add a new recipe endpoint:
1. Declare the route in `ROUTE_DEFINITIONS` with a unique handler name.
2. Implement the coroutine on the appropriate handler class or introduce a new
handler when the concern does not fit existing ones.
3. Inject additional collaborators in
`BaseRecipeRoutes._create_handler_set` (for example a new service or factory)
so the handler can access its dependencies.
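The three steps above can be sketched in miniature. The `RouteDefinition` shape and `bind_routes` helper below are illustrative assumptions (the real declarative list lives in `py/routes/recipe_route_registrar.py` and binds to an `aiohttp` application), but they show the lookup contract between `ROUTE_DEFINITIONS` and the controller's handler mapping:

```python
from dataclasses import dataclass
from typing import Any, Awaitable, Callable, Dict, List, Tuple

# Hypothetical shapes; the project's actual definitions may differ in detail.
@dataclass(frozen=True)
class RouteDefinition:
    method: str        # "GET", "POST", ...
    path: str          # e.g. "/api/lm/recipes"
    handler_name: str  # key into the controller's {handler_name: coroutine} mapping

ROUTE_DEFINITIONS = [
    RouteDefinition("GET", "/api/lm/recipes", "list_recipes"),
    RouteDefinition("POST", "/api/lm/recipes/save", "save_recipe"),
]

def bind_routes(
    definitions: List[RouteDefinition],
    handlers: Dict[str, Callable[..., Awaitable[Any]]],
) -> List[Tuple[str, str, Callable[..., Awaitable[Any]]]]:
    """Resolve each declared handler name, failing fast on a typo."""
    bound = []
    for definition in definitions:
        try:
            coroutine = handlers[definition.handler_name]
        except KeyError as exc:
            raise LookupError(
                f"no handler registered for {definition.handler_name!r}"
            ) from exc
        bound.append((definition.method, definition.path, coroutine))
    return bound
```

Declaring the route first and resolving the coroutine by name keeps the registrar free of imports from the handler modules, which is what lets new handler classes slot in without touching the binding code.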
The end-to-end wiring is documented in
[`docs/architecture/recipe_routes.md`](docs/architecture/recipe_routes.md), and
the integration suite in `tests/routes/test_recipe_routes.py` smoke-tests the
primary endpoints.
---
## Contributing

View File

@@ -1,50 +1,89 @@
# Recipe route architecture
The recipe routing stack now mirrors the modular model route design. HTTP
bindings, controller wiring, handler orchestration, and business rules live in
separate layers so new behaviours can be added without re-threading the entire
feature. The diagram below outlines the flow for a typical request:
```mermaid
graph TD
subgraph HTTP
A[RecipeRouteRegistrar] -->|binds| B[RecipeRoutes controller]
end
subgraph Application
B --> C[RecipeHandlerSet]
C --> D1[Handlers]
D1 --> E1[Use cases]
E1 --> F1[Services / scanners]
end
subgraph Side Effects
F1 --> G1[Cache & fingerprint index]
F1 --> G2[Metadata files]
F1 --> G3[Temporary shares]
end
```
## Layer responsibilities
| Layer | Module(s) | Responsibility |
| --- | --- | --- |
| Registrar | `py/routes/recipe_route_registrar.py` | Declarative list of every recipe endpoint and helper methods that bind them to an `aiohttp` application. |
| Controller | `py/routes/base_recipe_routes.py`, `py/routes/recipe_routes.py` | Lazily resolves scanners/clients from the service registry, wires shared templates/i18n, instantiates `RecipeHandlerSet`, and exposes a `{handler_name: coroutine}` mapping for the registrar. |
| Handler set | `py/routes/handlers/recipe_handlers.py` | Thin HTTP adapters grouped by concern (page view, listings, queries, mutations, sharing). They normalise responses and translate service exceptions into HTTP status codes. |
| Services & scanners | `py/services/recipes/*.py`, `py/services/recipe_scanner.py`, `py/services/service_registry.py` | Concrete business logic: metadata parsing, persistence, sharing, fingerprint/index maintenance, and cache refresh. |
## Handler responsibilities & invariants
`RecipeHandlerSet` flattens purpose-built handler objects into the callables the
registrar binds. Each handler is responsible for a narrow concern and enforces a
set of invariants before returning:
| Handler | Key endpoints | Collaborators | Contracts |
| --- | --- | --- | --- |
| `RecipePageView` | `/loras/recipes` | `SettingsManager`, `server_i18n`, Jinja environment, recipe scanner getter | Template rendered with `is_initializing` flag when caches are still warming; i18n filter registered exactly once per environment instance. |
| `RecipeListingHandler` | `/api/lm/recipes`, `/api/lm/recipe/{id}` | `recipe_scanner.get_paginated_data`, `recipe_scanner.get_recipe_by_id` | Listings respect pagination and search filters; every item receives a `file_url` fallback even when metadata is incomplete; missing recipes become HTTP 404. |
| `RecipeQueryHandler` | Tag/base-model stats, syntax, LoRA lookups | Recipe scanner cache, `format_recipe_file_url` helper | Cache snapshots are reused without forcing refresh; duplicate lookups collapse groups by fingerprint; syntax lookups return helpful errors when LoRAs are absent. |
| `RecipeManagementHandler` | Save, update, reconnect, bulk delete, widget ingest | `RecipePersistenceService`, `RecipeAnalysisService`, recipe scanner | Persistence results propagate HTTP status codes; fingerprint/index updates flow through the scanner before returning; validation errors surface as HTTP 400 without touching disk. |
| `RecipeAnalysisHandler` | Uploaded/local/remote analysis | `RecipeAnalysisService`, `civitai_client`, recipe scanner | Unsupported content types map to HTTP 400; download errors (`RecipeDownloadError`) are not retried; every response includes a `loras` array for client compatibility. |
| `RecipeSharingHandler` | Share + download | `RecipeSharingService`, recipe scanner | Share responses provide a stable download URL and filename; expired shares surface as HTTP 404; downloads stream via `web.FileResponse` with attachment headers. |
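The "thin adapter" contract the table describes can be sketched independently of `aiohttp`. The exception names mirror those in `py/services/recipes`, but the mapping and the `adapt` helper below are illustrative assumptions rather than the project's actual code:

```python
import asyncio

# Stand-ins for the domain exceptions raised by the recipe services.
class RecipeNotFoundError(Exception): ...
class RecipeValidationError(Exception): ...

# Handlers translate service exceptions into HTTP status codes instead of
# letting them escape; unknown exceptions would still propagate.
STATUS_BY_EXCEPTION = {
    RecipeValidationError: 400,
    RecipeNotFoundError: 404,
}

async def adapt(service_call):
    """Run a service coroutine and normalise the outcome to (status, payload)."""
    try:
        payload = await service_call()
    except tuple(STATUS_BY_EXCEPTION) as exc:
        # Error payloads keep a "loras" array for client compatibility,
        # matching the contract noted for RecipeAnalysisHandler.
        return STATUS_BY_EXCEPTION[type(exc)], {"error": str(exc), "loras": []}
    return 200, payload
```

Run a handler through it with `asyncio.run(adapt(some_service_coroutine))`; the point is that the handler body stays a few lines long while the services own all the failure modes.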
## Use case boundaries
The dedicated services encapsulate long-running work so handlers stay thin.
| Use case | Entry point | Dependencies | Guarantees |
| --- | --- | --- | --- |
| `RecipeAnalysisService` | `analyze_uploaded_image`, `analyze_remote_image`, `analyze_local_image`, `analyze_widget_metadata` | `ExifUtils`, `RecipeParserFactory`, downloader factory, optional metadata collector/processor | Normalises missing/invalid payloads into `RecipeValidationError`; generates consistent fingerprint data to keep duplicate detection stable; temporary files are cleaned up after every analysis path. |
| `RecipePersistenceService` | `save_recipe`, `delete_recipe`, `update_recipe`, `reconnect_lora`, `bulk_delete`, `save_recipe_from_widget` | `ExifUtils`, recipe scanner, card preview sizing constants | Writes images/JSON metadata atomically; updates scanner caches and hash indices before returning; recalculates fingerprints whenever LoRA assignments change. |
| `RecipeSharingService` | `share_recipe`, `prepare_download` | `tempfile`, recipe scanner | Copies originals to TTL-managed temp files; metadata lookups re-use the scanner; expired shares trigger cleanup and `RecipeNotFoundError`. |
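As one concrete illustration of the sharing guarantees, the TTL-managed temp copies could look like the sketch below. `ShareStore` and its method names are hypothetical stand-ins for `RecipeSharingService`'s internals:

```python
import shutil
import tempfile
import time
from pathlib import Path


class ShareStore:
    """Keep TTL-limited temp copies of shared recipe files (illustrative sketch)."""

    def __init__(self, ttl_seconds: float) -> None:
        self.ttl = ttl_seconds
        self._shares: dict[str, tuple[Path, float]] = {}

    def share(self, recipe_id: str, original: Path) -> Path:
        # Copy the original so the download survives later edits or deletes.
        tmp_dir = Path(tempfile.mkdtemp(prefix="recipe-share-"))
        copy = tmp_dir / original.name
        shutil.copy2(original, copy)
        self._shares[recipe_id] = (copy, time.monotonic() + self.ttl)
        return copy

    def resolve(self, recipe_id: str) -> Path:
        copy, expires = self._shares[recipe_id]
        if time.monotonic() > expires:
            # Expired shares are cleaned up and surface as "not found".
            copy.unlink(missing_ok=True)
            del self._shares[recipe_id]
            raise KeyError(recipe_id)
        return copy
```

In the real service the `KeyError` would instead be `RecipeNotFoundError`, which the sharing handler maps to HTTP 404.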
## Maintaining critical invariants
* **Cache updates.** Mutations (`save`, `delete`, `bulk_delete`, `update`) call
back into the recipe scanner to mutate the in-memory cache and fingerprint
index before returning a response. Tests assert that these methods are invoked
even when persistence is stubbed.
* **Fingerprint management.** `RecipePersistenceService` recomputes
fingerprints whenever LoRA metadata changes, and duplicate lookups use those
fingerprints to group recipes. Handlers bubble the resulting IDs up so clients
can merge duplicates without an extra fetch.
* **Metadata synchronisation.** Saving or reconnecting a recipe updates the
JSON sidecar, refreshes embedded metadata via `ExifUtils`, and instructs the
scanner to re-sort its cache. Sharing relies on this metadata to generate
filenames and keep downloads in sync with on-disk state.
## Extending the stack
1. Declare the new endpoint in `ROUTE_DEFINITIONS` with a unique handler name.
2. Implement the coroutine on an existing handler or introduce a new handler
class inside `py/routes/handlers/recipe_handlers.py` when the concern does
not fit existing ones.
3. Wire additional collaborators inside
`BaseRecipeRoutes._create_handler_set` (inject new services or factories) and
expose helper getters on the handler owner if the handler needs to share
utilities.
Integration tests in `tests/routes/test_recipe_routes.py` exercise the listing,
mutation, analysis-error, and sharing paths end-to-end, ensuring the controller
and handler wiring remains valid as new capabilities are added.
These guardrails mirror the expectations in the model route architecture and
provide confidence that future refactors can focus on handlers and use cases
without breaking HTTP wiring.

View File

@@ -0,0 +1,330 @@
"""Integration smoke tests for the recipe route stack."""
from __future__ import annotations
import json
from contextlib import asynccontextmanager
from dataclasses import dataclass
from pathlib import Path
from types import SimpleNamespace
from typing import Any, AsyncIterator, Dict, List, Optional
from aiohttp import FormData, web
from aiohttp.test_utils import TestClient, TestServer
from py.config import config
from py.routes import base_recipe_routes
from py.routes.recipe_routes import RecipeRoutes
from py.services.recipes import RecipeValidationError
from py.services.service_registry import ServiceRegistry
@dataclass
class RecipeRouteHarness:
    """Container exposing the aiohttp client and stubbed collaborators."""

    client: TestClient
    scanner: "StubRecipeScanner"
    analysis: "StubAnalysisService"
    persistence: "StubPersistenceService"
    sharing: "StubSharingService"
    tmp_dir: Path
class StubRecipeScanner:
    """Minimal scanner double with the surface used by the handlers."""

    def __init__(self, base_dir: Path) -> None:
        self.recipes_dir = str(base_dir / "recipes")
        self.listing_items: List[Dict[str, Any]] = []
        self.cached_raw: List[Dict[str, Any]] = []
        self.recipes: Dict[str, Dict[str, Any]] = {}
        self.removed: List[str] = []

        async def _noop_get_cached_data(force_refresh: bool = False) -> None:  # noqa: ARG001 - signature mirrors real scanner
            return None

        self._lora_scanner = SimpleNamespace(  # mimic BaseRecipeRoutes expectations
            get_cached_data=_noop_get_cached_data,
            _hash_index=SimpleNamespace(_hash_to_path={}),
        )

    async def get_cached_data(self, force_refresh: bool = False) -> SimpleNamespace:  # noqa: ARG002 - flag unused by stub
        return SimpleNamespace(raw_data=list(self.cached_raw))

    async def get_paginated_data(self, **params: Any) -> Dict[str, Any]:
        items = [dict(item) for item in self.listing_items]
        page = int(params.get("page", 1))
        page_size = int(params.get("page_size", 20))
        return {
            "items": items,
            "total": len(items),
            "page": page,
            "page_size": page_size,
            "total_pages": max(1, (len(items) + page_size - 1) // max(page_size, 1)),
        }

    async def get_recipe_by_id(self, recipe_id: str) -> Optional[Dict[str, Any]]:
        return self.recipes.get(recipe_id)

    async def remove_recipe(self, recipe_id: str) -> None:
        self.removed.append(recipe_id)
        self.recipes.pop(recipe_id, None)
class StubAnalysisService:
    """Captures calls made by analysis routes while returning canned responses."""

    instances: List["StubAnalysisService"] = []

    def __init__(self, **_: Any) -> None:
        self.raise_for_uploaded: Optional[Exception] = None
        self.raise_for_remote: Optional[Exception] = None
        self.raise_for_local: Optional[Exception] = None
        self.upload_calls: List[bytes] = []
        self.remote_calls: List[Optional[str]] = []
        self.local_calls: List[Optional[str]] = []
        self.result = SimpleNamespace(payload={"loras": []}, status=200)
        StubAnalysisService.instances.append(self)

    async def analyze_uploaded_image(self, *, image_bytes: bytes | None, recipe_scanner) -> SimpleNamespace:  # noqa: D401 - mirrors real signature
        if self.raise_for_uploaded:
            raise self.raise_for_uploaded
        self.upload_calls.append(image_bytes or b"")
        return self.result

    async def analyze_remote_image(self, *, url: Optional[str], recipe_scanner, civitai_client) -> SimpleNamespace:  # noqa: D401
        if self.raise_for_remote:
            raise self.raise_for_remote
        self.remote_calls.append(url)
        return self.result

    async def analyze_local_image(self, *, file_path: Optional[str], recipe_scanner) -> SimpleNamespace:  # noqa: D401
        if self.raise_for_local:
            raise self.raise_for_local
        self.local_calls.append(file_path)
        return self.result

    async def analyze_widget_metadata(self, *, recipe_scanner) -> SimpleNamespace:
        return SimpleNamespace(payload={"metadata": {}, "image_bytes": b""}, status=200)
class StubPersistenceService:
    """Stub for persistence operations to avoid filesystem writes."""

    instances: List["StubPersistenceService"] = []

    def __init__(self, **_: Any) -> None:
        self.save_calls: List[Dict[str, Any]] = []
        self.delete_calls: List[str] = []
        self.save_result = SimpleNamespace(payload={"success": True, "recipe_id": "stub-id"}, status=200)
        self.delete_result = SimpleNamespace(payload={"success": True}, status=200)
        StubPersistenceService.instances.append(self)

    async def save_recipe(self, *, recipe_scanner, image_bytes, image_base64, name, tags, metadata) -> SimpleNamespace:  # noqa: D401
        self.save_calls.append(
            {
                "recipe_scanner": recipe_scanner,
                "image_bytes": image_bytes,
                "image_base64": image_base64,
                "name": name,
                "tags": list(tags),
                "metadata": metadata,
            }
        )
        return self.save_result

    async def delete_recipe(self, *, recipe_scanner, recipe_id: str) -> SimpleNamespace:
        self.delete_calls.append(recipe_id)
        await recipe_scanner.remove_recipe(recipe_id)
        return self.delete_result

    async def update_recipe(self, *, recipe_scanner, recipe_id: str, updates: Dict[str, Any]) -> SimpleNamespace:  # pragma: no cover - unused by smoke tests
        return SimpleNamespace(payload={"success": True, "recipe_id": recipe_id, "updates": updates}, status=200)

    async def reconnect_lora(self, *, recipe_scanner, recipe_id: str, lora_index: int, target_name: str) -> SimpleNamespace:  # pragma: no cover
        return SimpleNamespace(payload={"success": True}, status=200)

    async def bulk_delete(self, *, recipe_scanner, recipe_ids: List[str]) -> SimpleNamespace:  # pragma: no cover
        return SimpleNamespace(payload={"success": True, "deleted": recipe_ids}, status=200)

    async def save_recipe_from_widget(self, *, recipe_scanner, metadata: Dict[str, Any], image_bytes: bytes) -> SimpleNamespace:  # pragma: no cover
        return SimpleNamespace(payload={"success": True}, status=200)
class StubSharingService:
    """Share service stub recording requests and returning canned responses."""

    instances: List["StubSharingService"] = []

    def __init__(self, *, ttl_seconds: int = 300, logger) -> None:  # noqa: ARG002 - ttl unused in stub
        self.share_calls: List[str] = []
        self.download_calls: List[str] = []
        self.share_result = SimpleNamespace(
            payload={"success": True, "download_url": "/share/stub", "filename": "recipe.png"},
            status=200,
        )
        self.download_info = SimpleNamespace(file_path="", download_filename="")
        StubSharingService.instances.append(self)

    async def share_recipe(self, *, recipe_scanner, recipe_id: str) -> SimpleNamespace:
        self.share_calls.append(recipe_id)
        return self.share_result

    async def prepare_download(self, *, recipe_scanner, recipe_id: str) -> SimpleNamespace:
        self.download_calls.append(recipe_id)
        return self.download_info
@asynccontextmanager
async def recipe_harness(monkeypatch, tmp_path: Path) -> AsyncIterator[RecipeRouteHarness]:
    """Context manager that yields a fully wired recipe route harness."""
    StubAnalysisService.instances.clear()
    StubPersistenceService.instances.clear()
    StubSharingService.instances.clear()
    scanner = StubRecipeScanner(tmp_path)

    async def fake_get_recipe_scanner():
        return scanner

    async def fake_get_civitai_client():
        return object()

    monkeypatch.setattr(ServiceRegistry, "get_recipe_scanner", fake_get_recipe_scanner)
    monkeypatch.setattr(ServiceRegistry, "get_civitai_client", fake_get_civitai_client)
    monkeypatch.setattr(base_recipe_routes, "RecipeAnalysisService", StubAnalysisService)
    monkeypatch.setattr(base_recipe_routes, "RecipePersistenceService", StubPersistenceService)
    monkeypatch.setattr(base_recipe_routes, "RecipeSharingService", StubSharingService)
    monkeypatch.setattr(config, "loras_roots", [str(tmp_path)], raising=False)

    app = web.Application()
    RecipeRoutes.setup_routes(app)
    server = TestServer(app)
    client = TestClient(server)
    await client.start_server()

    harness = RecipeRouteHarness(
        client=client,
        scanner=scanner,
        analysis=StubAnalysisService.instances[-1],
        persistence=StubPersistenceService.instances[-1],
        sharing=StubSharingService.instances[-1],
        tmp_dir=tmp_path,
    )
    try:
        yield harness
    finally:
        await client.close()
        StubAnalysisService.instances.clear()
        StubPersistenceService.instances.clear()
        StubSharingService.instances.clear()
async def test_list_recipes_provides_file_urls(monkeypatch, tmp_path: Path) -> None:
    async with recipe_harness(monkeypatch, tmp_path) as harness:
        recipe_path = harness.tmp_dir / "recipes" / "demo.png"
        harness.scanner.listing_items = [
            {
                "id": "recipe-1",
                "file_path": str(recipe_path),
                "title": "Demo",
                "loras": [],
            }
        ]
        harness.scanner.cached_raw = list(harness.scanner.listing_items)

        response = await harness.client.get("/api/lm/recipes")
        payload = await response.json()

        assert response.status == 200
        assert payload["items"][0]["file_url"].endswith("demo.png")
        assert payload["items"][0]["loras"] == []
async def test_save_and_delete_recipe_round_trip(monkeypatch, tmp_path: Path) -> None:
    async with recipe_harness(monkeypatch, tmp_path) as harness:
        form = FormData()
        form.add_field("image", b"stub", filename="sample.png", content_type="image/png")
        form.add_field("name", "Test Recipe")
        form.add_field("tags", json.dumps(["tag-a"]))
        form.add_field("metadata", json.dumps({"loras": []}))
        form.add_field("image_base64", "aW1hZ2U=")
        harness.persistence.save_result = SimpleNamespace(
            payload={"success": True, "recipe_id": "saved-id"},
            status=201,
        )

        save_response = await harness.client.post("/api/lm/recipes/save", data=form)
        save_payload = await save_response.json()

        assert save_response.status == 201
        assert save_payload["recipe_id"] == "saved-id"
        assert harness.persistence.save_calls[-1]["name"] == "Test Recipe"

        harness.persistence.delete_result = SimpleNamespace(payload={"success": True}, status=200)
        delete_response = await harness.client.delete("/api/lm/recipe/saved-id")
        delete_payload = await delete_response.json()

        assert delete_response.status == 200
        assert delete_payload["success"] is True
        assert harness.persistence.delete_calls == ["saved-id"]
async def test_analyze_uploaded_image_error_path(monkeypatch, tmp_path: Path) -> None:
    async with recipe_harness(monkeypatch, tmp_path) as harness:
        harness.analysis.raise_for_uploaded = RecipeValidationError("No image data provided")
        form = FormData()
        form.add_field("image", b"", filename="empty.png", content_type="image/png")

        response = await harness.client.post("/api/lm/recipes/analyze-image", data=form)
        payload = await response.json()

        assert response.status == 400
        assert payload["error"] == "No image data provided"
        assert payload["loras"] == []
async def test_share_and_download_recipe(monkeypatch, tmp_path: Path) -> None:
    async with recipe_harness(monkeypatch, tmp_path) as harness:
        recipe_id = "share-me"
        download_path = harness.tmp_dir / "recipes" / "share.png"
        download_path.parent.mkdir(parents=True, exist_ok=True)
        download_path.write_bytes(b"stub")
        harness.scanner.recipes[recipe_id] = {
            "id": recipe_id,
            "title": "Shared",
            "file_path": str(download_path),
        }
        harness.sharing.share_result = SimpleNamespace(
            payload={"success": True, "download_url": "/api/share", "filename": "share.png"},
            status=200,
        )
        harness.sharing.download_info = SimpleNamespace(
            file_path=str(download_path),
            download_filename="share.png",
        )

        share_response = await harness.client.get(f"/api/lm/recipe/{recipe_id}/share")
        share_payload = await share_response.json()

        assert share_response.status == 200
        assert share_payload["filename"] == "share.png"
        assert harness.sharing.share_calls == [recipe_id]

        download_response = await harness.client.get(f"/api/lm/recipe/{recipe_id}/share/download")
        body = await download_response.read()

        assert download_response.status == 200
        assert download_response.headers["Content-Disposition"] == 'attachment; filename="share.png"'
        assert body == b"stub"

        download_path.unlink(missing_ok=True)